DIGITAL SOP 057: How to Configure the Perfect Robots.txt File

Last Updated / Reviewed: March 25th, 2024

Execution Time: 30 mins

Goal: To properly create or optimize your robots.txt file.

Ideal Outcome: You have a well-configured robots.txt file on your website that lets search engines crawl your website exactly as you want them to.

Pre-requisites or requirements:

  • You need to have access to the Google Search Console property of the website you are working on. If you don’t have a Google Search Console property set up yet, you can do so by following SOP 020.

Why this is important: Your robots.txt file sets the fundamental rules that most search engines read and follow before they crawl your website. It tells search engines which parts of your website you don’t want (or they don’t need) to crawl.
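For reference, a minimal robots.txt follows this shape (the disallowed path and sitemap URL below are illustrative examples, not values you should copy as-is):

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://www.example.com/sitemap.xml

Each "User-agent" line names which crawlers the rules apply to ("*" means all of them), "Disallow" and "Allow" list path prefixes the crawler should skip or may access, and the optional "Sitemap" line points crawlers to your XML sitemap.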

Where this is done: In a text editor, and in Google Search Console. If you are using WordPress, in your WordPress Admin Panel as well.

When this is done: You should create a proper robots.txt whenever you start a new website, and audit it at least every 6 months to make sure it is still current.

Who does this: The person responsible for SEO in your organization.
