Answer Posted / nashiinformaticssolutions
The robots.txt file tells search engine bots how to crawl and index a website and its pages.
Because the file uses the .txt format, it contains plain text and no HTML. Its purpose is to keep search engine bots from crawling parts of the website needlessly. It is typically the first file a crawler requests on a site.
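As a sketch of how crawlers interpret these rules, Python's standard-library `urllib.robotparser` can parse a robots.txt and answer whether a given URL may be fetched. The rules and the `example.com` domain below are hypothetical, for illustration only.

```python
from urllib import robotparser

# A minimal robots.txt as it might appear at https://example.com/robots.txt
# (hypothetical rules for illustration)
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved bot checks permission before fetching a URL.
print(rp.can_fetch("*", "https://example.com/admin/secret.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

In practice a crawler would fetch the live file with `rp.set_url(...)` and `rp.read()` instead of parsing a hard-coded string.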
What is Google Fred?
What do you understand by crawling?
What is the Google Panda algorithm?
What is an SEO package?
Name some popular SEO blogs to follow.
What will your approach be if your SEO methods don't work?
Which is more important- building backlinks to a website or building great content?
What is the character limit for the Meta Description tag?
What do you know about the Florida update?
Data shows the audience for a client's running shoe store is women ages 35 to 50. How can you optimize this client's display network campaign based on your research?
What is SEO?
What is keyword prominence?
Explain how does ad rank impact cost-per-click?
How would you increase the pagerank of a page?
What are contextual backlinks?