robots.txt is used in the context of _____________.
a. Hiding some pages from the search engine crawler
b. Making the crawler look at certain files easily
c. Creating the website sitemap
d. Enabling easier access to content
Answer:
The correct option is (a). The robots.txt file is primarily used to tell spiders or web crawlers which parts of a website they may and may not crawl, which lets site owners hide some pages from search engines. It can also specify different rules for different crawlers.
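For example, a minimal robots.txt placed at the root of a site (the domain and paths below are hypothetical) might look like this:

    # Rules that apply to all crawlers
    User-agent: *
    Disallow: /private/

    # Stricter rule for one specific crawler
    User-agent: Googlebot-Image
    Disallow: /

    # Optionally point crawlers to the sitemap
    Sitemap: https://example.com/sitemap.xml

Here the first block hides the /private/ directory from every crawler, while the second blocks Googlebot-Image from the entire site, illustrating how different rules can be given to different spiders.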