#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of www.feriepartner.de by web crawlers (e.g. Googlebot) in order to avoid
# serving duplicate content, i.e. identical or very similar pages on several
# different URLs.
#
# This file will be ignored unless it is placed in the root of the domain:
# http://www.feriepartner.de/robots.txt
#
# Further information about robots.txt:
# http://www.robotstxt.org/
#
# Last modified 2018-10-31

# All search engine crawlers
User-agent: *

# CMS
Disallow: /episerver/*

# Search results
Disallow: /suchergebnisse/?*
Disallow: /*/suchergebnisse/

# URLs for holiday homes with search parameters
Disallow: */?checkin
Disallow: */?adults
Disallow: */?duration
Disallow: */?houseid
Disallow: */?areas
Disallow: */?subregion
Disallow: */?region
Disallow: */?children
Disallow: */?pets
Disallow: */?count
Disallow: */?mode
Disallow: */?icid
Disallow: */?showerror
Disallow: */page-not-found
Disallow: */fejl-

# Booking flow pages
Disallow: */booking/