Answer Posted / nashiinformaticssolutions
The robots.txt file tells search engine bots how to crawl and index a website and its pages.
Because it is a plain .txt file, it contains only text and no HTML. Its purpose is to keep search engine bots from crawling parts of the site needlessly, and it is typically the first file a crawler requests when it visits a website.
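For illustration only (the directory names and sitemap URL below are hypothetical, not part of the original answer), a minimal robots.txt placed at the site root might look like this:

    User-agent: *
    Disallow: /admin/
    Disallow: /search/
    Allow: /
    Sitemap: https://www.example.com/sitemap.xml

Here User-agent: * addresses all crawlers, the Disallow lines ask them to skip the listed paths, and the Sitemap line points to the XML sitemap. A well-behaved crawler can check these rules programmatically; a rough sketch using Python's standard urllib.robotparser module (again with a placeholder domain):

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the site's robots.txt (example.com is a placeholder).
    rp = RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()

    # Ask whether any crawler ("*") may fetch a given URL under these rules.
    print(rp.can_fetch("*", "https://www.example.com/admin/secret.html"))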
How can you win a higher ad position in the auction with a lower cost-per-click (CPC) bid?
What are the types of SEO?
What do you know about black hat SEO?
Which is better: meta robots tags or robots.txt?
What is Google PageRank?
Does CTR help in improving Quality Score?
How does a Google ad auction work?
What is image alt text?
What is SEO?
What is keyword stemming?
How is Quality Score calculated?
What tools do you use for SEO?
How do you separate words in a URL?
What is a web page?
What is the character limit for the meta description tag?