What is a robots.txt file?
Answer / nashiinformaticssolutions
A robots.txt file instructs search engine bots on how to crawl and index a website and its pages.
Because the file uses the .txt format, it contains only plain text and no HTML code. Its purpose is to keep search engine bots from crawling parts of the website unnecessarily. It is typically the first file a crawler requests when it visits a site.
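For illustration, a minimal robots.txt might look like the sketch below; the domain and paths are hypothetical examples, not taken from the original answer:

User-agent: *
Disallow: /admin/
Allow: /admin/help.html
Sitemap: https://www.example.com/sitemap.xml

Here "User-agent: *" addresses all crawlers, "Disallow" blocks a directory from being crawled, "Allow" re-permits a specific page inside it, and "Sitemap" points crawlers to the site's XML sitemap.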
What are the Google Panda, Penguin, EMD, Hummingbird and Payday Loan (spam) algorithms?
What is an internal link?
Define on-page and off-page SEO.
What is keyword density and how is it calculated?
How can I see which pages are indexed in Google?
Where do we use keywords?
What are the character limitations of title and description tags?
What are SEO off-page activities?
What is 'ethical SEO'?
Explain what you understand by frames in HTML.
Why is video SEO different from text-based SEO? How would you approach it?
What is Google Webmaster Tools?