# $Id: robots.txt,v 2.0 2012/8/8 12:59:59 goba Exp $
#
# robots.txt
#
# This file prevents the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
#
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html
# http://tool.motoricerca.info/robots-checker.phtml
#
# Do not separate commands belonging to the same block with blank lines.

# Disallow search and adslots for some bots
User-agent: Baiduspider
Crawl-delay: 10
Disallow: */search*
Disallow: /adslots.json

User-agent: *
Disallow: /adslots.json
#
# Users
Disallow: */login/*
Disallow: */account
# Disallow: */zendesk/*
#
# Place actions
Disallow: */edit
Disallow: */p/add
#
# Filter results
Disallow: *isverified=*
Disallow: *opennow=*
Disallow: *price=*

# Point to the sitemap
Sitemap: https://jo.jeeran.com/sitemap.xml