What is a robots.txt file?
Answer / nashiinformaticssolutions
The robots.txt file tells search engine bots how they should crawl and index a website and its pages.
Because the file is in plain .txt format, it contains only text and no HTML. Its purpose is to stop search engine bots from crawling parts of the site needlessly, and it is typically the first file a crawler looks at on a website.
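As a rough illustration, a simple robots.txt placed at the site root might look like the lines below; the /private/ path and the sitemap URL are only placeholders for this example.

User-agent: *
Disallow: /private/
Allow: /
Sitemap: https://www.example.com/sitemap.xml

Here "User-agent: *" addresses all crawlers, each Disallow line lists a path they are asked not to crawl, and the optional Sitemap line points them to the site's XML sitemap.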
What is the name of the search engine technology due to which a query for the word ‘actor’ will also show search results for related words such as actress, acting or act?
What is link building?
What is an HTML sitemap?
What is the difference between a targeted page and a landing page?
What is a Backlink?
Who is Matt Cutts?
Do you know what the key aspects of the Penguin update are?
What is forum posting?
What is social bookmarking?
Tell us, what is the difference between PR (PageRank) and a SERP (search engine results page)?
What are robots.txt files?
Which of the following are examples of agents? ► A. Internet Explorer ► B. Search engine spiders ► C. Opera ► D. SQL Server database attached to a website