What is robots.txt?
robots.txt is a file used to control search engine crawlers. It lets you allow or disallow crawlers from crawling the pages in your site. If the robots configuration is empty, search engine crawlers will crawl your website.
How to use robots.txt in Zoho Wiki?
Steps to follow:
- Go to wiki Settings.
- Click Robots in the left-hand panel.
- Enter your new robots configuration in the text area.
- Click the Save button.
How to view your wiki robots.txt?
Open http://<your wiki domain>/robots.txt in your browser.
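If you want to verify programmatically which pages your current configuration allows, Python's standard urllib.robotparser module can fetch and evaluate the file. This is a minimal sketch, assuming a placeholder wiki domain (yourwiki.example.com); substitute your own wiki domain.

# Minimal check using Python's standard library (urllib.robotparser).
# "yourwiki.example.com" is a placeholder; replace it with your wiki domain.
from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("http://yourwiki.example.com/robots.txt")
parser.read()  # downloads and parses the robots.txt file

# True if a generic crawler ("*") is allowed to fetch the given page.
print(parser.can_fetch("*", "http://yourwiki.example.com/sitemap.zhtml"))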
Examples

Allow: /*
Allows every page in your wiki to be crawled.

Allow: /sitemap.zhtml
Allows only the site map in your wiki to be crawled.

Disallow: /*
Disallows every page in your wiki from being crawled.

Disallow: /sitemap.zhtml
Disallows only the site map in your wiki from being crawled.

#
The hash symbol starts a line comment; crawlers ignore these lines.
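For instance, a configuration that exposes only the site map while keeping the rest of the wiki hidden from crawlers could look like the sketch below, built only from the directives shown above:

# Expose only the site map to search engine crawlers
Allow: /sitemap.zhtml
Disallow: /*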
Note: In line with most users' requirements, we disallow crawling of wiki pages by default. You can enable it from the Robots settings whenever you need your pages crawled.