Create SEO-optimized robots.txt files to control search engine crawling.
Robots.txt is a plain-text file placed in your website's root directory that tells search engine bots which URLs they are allowed or not allowed to crawl. A properly configured robots.txt file improves crawl efficiency and SEO health, and helps keep low-value or sensitive URLs out of crawlers' paths. Note that it is not a security mechanism: the file is publicly readable, and disallowing a URL does not prevent direct access to it.
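As a minimal sketch (the /admin/ path and sitemap URL are placeholder examples), a robots.txt that lets all crawlers index the site while blocking an admin area might look like:

```
# Apply these rules to all crawlers
User-agent: *
# Block the (hypothetical) admin area from crawling
Disallow: /admin/
# Everything else is crawlable
Allow: /
# Point crawlers at the sitemap
Sitemap: https://yourdomain.com/sitemap.xml
```

Rules are grouped under a User-agent line; a crawler follows the most specific group that matches its name, falling back to the * group.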
Is robots.txt mandatory? No, but it is highly recommended for crawl optimization.
Where does the file go? It must be placed at the root of your domain, i.e. yourdomain.com/robots.txt; crawlers do not look for it anywhere else.
Can it block pages from being crawled? Yes, but blocking important pages may hurt rankings.