What is a robots.txt file?
Answer / nashiinformaticssolutions
A robots.txt file instructs search engine bots on how to crawl and index a website and its pages.
Because the file uses the .txt format, it contains plain text only, with no HTML code. Its purpose is to keep search engine bots from crawling parts of the website needlessly. It is typically the first file a crawler requests when it visits a site.
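As a sketch of how crawlers read these rules, the snippet below parses a hypothetical robots.txt (the domain, paths, and rules are illustrative, not from any real site) using Python's standard-library `urllib.robotparser`:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: Disallow blocks crawlers from the listed
# paths, Allow permits the rest, and Sitemap points to the sitemap file.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /search
Allow: /

Sitemap: https://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved bot checks each URL against the rules before fetching it.
print(parser.can_fetch("*", "https://www.example.com/admin/login"))  # False
print(parser.can_fetch("*", "https://www.example.com/blog/post"))    # True
```

Note that robots.txt is advisory: compliant crawlers such as Googlebot honor it, but it is not an access-control mechanism.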
What is domain extension?
How do you know which pages are indexed by Google?
How will you check the number of backlinks of your competitor's site?
Do you know what is social networking?
Tell me what is SEO & why is it so important?
When do submissions appear on the engines? - SEO
Tell us how can you check if someone is building or redirecting low-quality backlinks to your site?
What is keyword difficulty in SEO?
What is directory and article submission?
What is on page optimization?
What is the main usage of search engine spiders?
How to carry out keyword research?