What is a robots.txt file, and how do you use it?
Answer / nashiinformaticssolutions
A robots.txt file tells search engine crawlers which pages or sections of a website they may crawl and which they should stay out of. It is commonly used to keep duplicate or low-value content from being crawled. Note that robots.txt controls crawling rather than indexing: a blocked URL can still appear in search results if other pages link to it, so truly sensitive content should be protected with a noindex directive or authentication instead.
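To illustrate, here is a minimal robots.txt sketch. The /admin/ and /tmp/ paths and the sitemap URL are hypothetical placeholders, not taken from any real site:

```
# Rules for all crawlers: stay out of the (hypothetical) private areas
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# A separate, looser group for Googlebot; a crawler follows the most
# specific User-agent group that matches it
User-agent: Googlebot
Disallow: /tmp/

# Advertise the XML sitemap location to crawlers
Sitemap: https://www.example.com/sitemap.xml
```

The file must sit at the root of the domain (e.g., https://www.example.com/robots.txt), and it only asks well-behaved crawlers to stay away; it does not password-protect anything.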
Related questions:
Do you know what social networking is?
What do you mean by spider?
What are the limitations of the title and description tags?
What would you do if your website were banned by the search engines for black-hat practices?
Explain link popularity.
Which of the following methods can help you get around the Google Sandbox?
a. Buying an old website and getting it ranked
b. Buying a Google AdWords PPC campaign
c. Placing the website on a subdomain of a ranked website and then 301-redirecting the site after it has been indexed
d. Getting a DMOZ listing
What is anchor text?
If I do business in Europe, do I need to get listed on local search engines and directories?
Does the spider program run through all links on a web page?
What are SEO off-page activities?
What are doorway pages?
What is external link?