Robots File


What is robots.txt?

robots.txt is a file used to control search engine crawlers. It lets you allow or disallow crawlers from crawling specific pages on your site. If the robots configuration is empty, search engine crawlers will crawl your entire website.

How to use robots.txt in Zoho Wiki?

Steps to follow:
  1. Go to wiki Settings.
  2. Click Robots on the left-hand side.
  3. Enter your new robots configuration in the text area.
  4. Click the Save button.
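
For example, a configuration like the following could be pasted into the text area. This is only an illustrative sketch that combines the directives described below; adjust the paths to suit your own wiki:

```
# Allow only the site map; block everything else
Allow: /sitemap.zhtml
Disallow: /*
```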

How to view your wiki robots.txt?

Just type http://<your wiki domain>/robots.txt in your browser.

Examples

Allow: /*
Allow all the pages in your wiki to be crawled.
Allow: /sitemap.zhtml
Allow only the site map of your wiki to be crawled.
Disallow: /*
Disallow all the pages in your wiki from being crawled.
Disallow: /sitemap.zhtml
Disallow only the site map of your wiki from being crawled.
#
The hash symbol starts a line comment. Crawlers won't see these lines.
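
The effect of such directives can be sanity-checked locally with Python's standard `urllib.robotparser`. This is a sketch, not part of Zoho Wiki itself: the parser expects directives grouped under a `User-agent` line (as real robots.txt files have), it applies the first rule whose path prefix matches, and it treats paths as literal prefixes, so `/` is used below instead of the wildcard `/*` shown in the examples above:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical configuration: allow only the site map, block everything
# else. Since the parser applies the first matching rule, the Allow line
# must come before the blanket Disallow.
rules = """\
User-agent: *
Allow: /sitemap.zhtml
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "/sitemap.zhtml"))  # the site map stays crawlable
print(rp.can_fetch("*", "/HomePage"))       # every other page is blocked
```

Running this prints the crawl decision for each path, which is a quick way to verify a configuration before saving it in your wiki settings.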

Note: By default, crawling of wiki pages is disallowed, as most users prefer. Enable crawling if you need your pages indexed.


