Which of the following are requirements in a robots.txt file?
Disallow: [URL string not to be crawled]
Allow: [URL string to be crawled]
Sitemap: [sitemap URL]
User-agent: [user-agent name]
Answer:
User-agent: [user-agent name] and Disallow: [URL string not to be crawled]. Every rule group in a robots.txt file must begin with a User-agent line identifying which crawler the rules apply to, and a Disallow rule is needed for the group to restrict crawling. Allow and Sitemap are optional directives.
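A minimal robots.txt sketch showing the required directives alongside the optional ones (the paths and sitemap URL here are placeholder examples, not values from the question):

```
# Required: identifies which crawler the following rules apply to.
User-agent: *
# Required for the group to restrict anything: path not to be crawled.
Disallow: /private/
# Optional: carves out an exception within a disallowed path.
Allow: /private/public-page.html
# Optional: points crawlers at the sitemap; stands outside any group.
Sitemap: https://example.com/sitemap.xml
```

The file lives at the site root (e.g. https://example.com/robots.txt); crawlers only look for it there.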