Answer Posted / nashiinformaticssolutions
The robots.txt file instructs search engine bots on how to crawl and index a website and its pages.
Because the file is in .txt format, it contains only plain text and no HTML. Its purpose is to keep search engine bots from crawling parts of the site unnecessarily. It is typically the first file a crawler requests when it visits a website.
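For illustration, here is a minimal sketch of how a crawler might check robots.txt rules before fetching a page, using Python's standard urllib.robotparser module. The rules, user agent, and URLs are hypothetical examples, not taken from any real site.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules a site might serve at /robots.txt:
# block all bots from /private/ and allow everything else.
EXAMPLE_ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
# Parse the rules directly; a real crawler would instead call
# parser.set_url("https://example.com/robots.txt") and parser.read().
parser.parse(EXAMPLE_ROBOTS_TXT.splitlines())

# A well-behaved bot checks permission before crawling each URL.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post.html"))     # True
```

Note that compliance is voluntary: robots.txt restricts crawling by well-behaved bots, it does not by itself guarantee that a URL will never appear in an index.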
Related SEO questions:
What happens when an advertiser sets a daily budget lower than the recommended amount while using the standard delivery method?
When do submissions appear on the search engines?
What is press release submission?
Do you know what paid results are?
How do we determine whether to manage search engine marketing activities in-house or outsource them to an SEM vendor?
What is the Google Panda algorithm?
Explain the different types of SEO practices.
What are the different techniques used in off-page SEO?
What will you do if the company website you are working for decides to move all of its content to a new domain?
Which best describes the "Optimize" ad rotation setting in AdWords?
What is a website or what do you understand by a website?
Which of the following items is not a component of Quality Score?
What is mobilegeddon (mobile-friendly update)?
What is Page Authority?
What is crawling?