WordPress Robots.txt Explained – Where To Put It And How To Use It


A deep understanding of the WordPress robots.txt file will go a long way in helping you improve your website’s SEO. In this guide, you will learn what robots.txt is all about and, most importantly, how to use it.

Basically, robots.txt is made for robots – software programs, such as search engine crawlers, that move through web pages and index them for search results.

It allows website owners to bar search bots from crawling certain pages or content on their website. Used wrongly, robots.txt could ruin your site’s SEO.

As such, it should be used with caution. But not to worry – everything you need to know about this subject is covered in this guide.


What Is the WordPress Robots.txt File?

The robots.txt file, a fundamental part of web development and SEO, serves as a guide for search engine bots, instructing them on which parts of a website they can and cannot crawl. Its origins trace back to the early days of the internet, making it a pivotal component in the relationship between websites and search engines.

The History of Robots.txt

In the mid-1990s, as the internet began to burgeon, the need to organize and index web content became apparent. Search engines deployed bots, also known as crawlers, to navigate and index the vast expanse of online information. However, not all website content was meant for public indexing, leading to the creation of the robots.txt protocol in 1994. This simple text file was designed to provide website owners with the means to communicate with these bots, guiding their crawl and ensuring that sensitive or irrelevant pages remained unindexed.

How Robots.txt Works

At its core, the robots.txt file is straightforward. Placed in the root directory of a website, it uses a simple syntax to instruct bots on which directories or pages to avoid. For example, a line stating “Disallow: /private” tells bots to skip over the ‘private’ directory during their crawl. Conversely, “Allow: /public” would indicate that the ‘public’ directory is open for indexing. The file can also direct bots to a website’s sitemap, further aiding the indexing process by highlighting the site’s structure.
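Putting that syntax together, a minimal robots.txt might look like the sketch below. The directory names and sitemap URL are placeholders – substitute paths that actually exist on your own site:

# Applies to all crawlers
User-agent: *
# Keep bots out of the private area
Disallow: /private/
# The public area is open for crawling
Allow: /public/
# Point crawlers to the sitemap (placeholder URL)
Sitemap: https://mywebsite.com/sitemap.xml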

Evolution and Best Practices

Over the years, the application of robots.txt has evolved. While its fundamental purpose remains the same, the approach to its implementation has become more nuanced. Modern best practices suggest a balanced use of the file, ensuring that it neither overly restricts bots nor leaves too much open for indexing. The goal is to guide bots efficiently through a site, enhancing the site’s SEO while protecting sensitive areas from public view.

Benefits of Creating an Optimized Robots.txt File

The major reason for creating a robots.txt file is to prevent search engine robots from crawling certain content on your website.

For instance, you wouldn’t want bots poking around your theme and admin folders, plugin files, or category pages.

Also, an optimized robots.txt file helps conserve your crawl budget (sometimes called a crawl quota) – the number of pages on your website that search bots will crawl within a given period.

You want to ensure that only useful pages are crawled; otherwise, your crawl budget gets wasted on pages that don’t matter. Doing this will greatly improve your website’s SEO.
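For example, internal search result pages on a WordPress site (URLs containing ?s=) rarely offer any SEO value. Here is a sketch of rules that stop bots from spending crawl budget on them – the /search/ line only matters if your site rewrites search URLs that way:

User-agent: *
# WordPress internal search results use the ?s= query parameter
Disallow: /?s=
# Covers sites that rewrite search URLs to /search/ (if yours does)
Disallow: /search/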

Thirdly, a well-scripted robots.txt file can help you minimize the activities of well-behaved search bots around your website (bad bots typically ignore it). That way, your website’s load speed can improve noticeably.

SEO Best Practices and Common Mistakes with Robots.txt

Optimizing your website’s robots.txt file is a crucial step in enhancing your SEO strategy. However, it’s easy to fall into common pitfalls that can inadvertently harm your site’s search engine visibility. Here, we’ll explore essential SEO best practices for using robots.txt effectively and highlight some common mistakes to avoid.

Best Practices for Robots.txt

  • Be Specific: Use precise paths in your directives. For example, if you want to block a specific directory, use “Disallow: /example-directory/” instead of a broad rule that might unintentionally block more content than intended (see the example after this list).
  • Use with Caution: Remember, the robots.txt file is a powerful tool. A small error can prevent search engines from accessing important content. Always review your rules carefully.
  • Regular Updates: As your website evolves, so should your robots.txt file. Regularly review and update it to reflect new content or structural changes on your site.
  • Keep It Accessible: Place your robots.txt file in the root directory of your site. This makes it easily accessible to search engine bots.
  • Avoid Blocking CSS and JS Files: Search engines like Google need to access these files to render your pages correctly. Blocking them can negatively impact your site’s rendering and indexing.
  • Sitemap Inclusion: Include the path to your sitemap in the robots.txt file. This helps search engines discover and index your content more efficiently.
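To make the “Be Specific” point concrete, here is a sketch contrasting a broad rule with a precise one (the directory name is hypothetical):

# Too broad: also matches /example-directory-old/ and /example-directory.html
Disallow: /example-directory
# Precise: matches only that directory and its contents
Disallow: /example-directory/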

Common Mistakes to Avoid

  • Overuse of Disallow: Over-restricting search engine access can lead to unindexed pages and lost SEO opportunities. Only block content that truly needs to be hidden from search engines – a single stray rule can lock bots out of your entire site (see the sketch after this list).
  • Blocking Important Content: Ensure you’re not inadvertently blocking access to pages or resources that contribute to your site’s SEO value, such as product pages or important articles.
  • Syntax Errors: Even small typos can have significant consequences. A misplaced “/” or “*” can change the scope of a rule, so double-check your syntax.
  • Neglecting the File: An outdated robots.txt file can be as harmful as not having one at all. Make sure it reflects your current site structure and content strategy.
  • Using Robots.txt for Privacy: If you’re trying to keep sensitive information private, robots.txt is not the solution. Blocked URLs can still appear in search results without content. Use password protection or noindex tags for private content.
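As a cautionary sketch of how much damage a single character can do, compare these two rules:

# One bare slash blocks your ENTIRE site from compliant crawlers
Disallow: /
# Adding a path limits the rule to the admin area only
Disallow: /wp-admin/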

Where Is the Robots.txt File Located?

By default, whenever you install a WordPress website, WordPress generates a virtual robots.txt file served from your site’s root directory. To view it, open your website in a browser, then append “/robots.txt” at the end. For instance:

https://mywebsite.com/robots.txt

Here’s what ours at FixRunner looks like:

[Screenshot: the robots.txt file on the FixRunner website]

The default WordPress robots.txt file is virtual – it doesn’t exist as a physical file on your server, so it can’t be edited directly. To edit it, you have to create a real file in its place – and there are several ways to do so. Let’s look at some of them!
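For reference, on a recent WordPress install, the virtual file typically contains something along these lines (newer versions may also append a Sitemap line when core sitemaps are enabled):

User-agent: *
Disallow: /wp-admin/
# admin-ajax.php stays open because some front-end features depend on it
Allow: /wp-admin/admin-ajax.php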

How to Create Robots.txt File in WordPress

Creating a robots.txt file in WordPress is a straightforward process. You can either do it manually or use a plugin. We will cover both methods here; the plugin we’ll be using is Yoast SEO.

Using Yoast SEO Plugin

The Yoast SEO plugin can create a robots.txt file for WordPress on the fly. Of course, it does a whole lot more when it comes to SEO for WordPress.

First off, install and activate the plugin if you haven’t already.

Once you have Yoast up and running on your website, navigate to SEO >> Tools.

[Screenshot: the SEO >> Tools menu in the WordPress dashboard]

Next, click on the File editor link in the Yoast dashboard.

[Screenshot: the File editor link in Yoast SEO Tools]

This will take you to the page where you can create a robots.txt file. Click the Create button.

[Screenshot: the Create robots.txt file button in Yoast SEO]

This will open an editor where you can add and edit rules in your WordPress robots.txt file.

[Screenshot: the default Yoast robots.txt file editor]

Add new rules to the file editor and save changes. Not to worry, we will show you the rules to add shortly.

Adding Robots.txt Manually with FTP in WordPress

This method is quite simple, and just about anybody can do it. To begin, launch Notepad – or any plain-text editor you like, as long as it’s not a word processor like Microsoft Word – on your machine.

For a start, add the following rules to the file you just created.

User-agent: *

Disallow: /wp-admin/

Allow: /wp-admin/admin-ajax.php

Save the file as robots.txt. Next, upload it to your website via an FTP program such as FileZilla.

First off, establish a connection to your website in FileZilla. Then navigate to your public_html folder and upload the robots.txt file you just created into it.

[Screenshot: uploading the robots.txt file via FTP in FileZilla]

Once the upload is complete, you are good to go.

Adding Rules

Basically, there are two main instructions you can give to search bots: Allow and Disallow. Allow grants them access to a folder, and Disallow does the opposite.

To allow access to a folder, add:

User-agent: *

Allow: /wp-content/uploads/

The asterisk (*) tells search bots, “hey, this rule applies to all of you”.
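You can also address one crawler by name instead of all of them. Here is a quick sketch – the path is hypothetical – of a group that applies only to Googlebot:

# This group applies only to Google’s crawler
User-agent: Googlebot
Disallow: /testing/

Other bots skip this group and fall back to the rules under User-agent: *.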

To block access to a folder, add the following rule under your User-agent line:

Disallow: /wp-content/plugins/

In this instance, we are denying search bots access to the plugins folder.

It’s entirely up to you to determine which rule is most applicable to your website. If you run a forum, for instance, you may decide to block off crawlers from your forum page with the following rule:

Disallow: /forum/
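Major crawlers such as Googlebot and Bingbot also understand two pattern characters in paths: the asterisk (*) matches any sequence of characters, and the dollar sign ($) anchors a rule to the end of the URL. A sketch with hypothetical patterns:

# Block any URL containing the replytocom parameter
Disallow: /*?replytocom=
# Block URLs ending in .pdf
Disallow: /*.pdf$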

As a rule of thumb, the fewer the rules, the better. The following set of rules is enough to get the job done:

User-agent: *

Allow: /wp-content/uploads/

Disallow: /wp-content/plugins/

Disallow: /wp-admin/
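Following the sitemap best practice covered earlier, you can also append a Sitemap line to this set. The URL below is a placeholder – point it at your site’s actual sitemap:

Sitemap: https://mywebsite.com/sitemap.xml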

How to Test Your Robots.txt File in Google Search Console

Now that you’ve created a robots.txt file in WordPress, you will want to be sure it’s working as it should. And there’s no better way to do that than with a robots.txt tester tool.

Google Search Console has the right tool for this purpose. So first things first, log into your Google Search Console account. You can always create an account if you don’t have one.

Once in Google Search Console, scroll down and click Go to the old version. (Google has since retired the standalone tester – in current versions of Search Console, you’ll find an equivalent robots.txt report under Settings.)

[Screenshot: the Go to the old version link in Google Search Console]

Once you are in the old version, navigate to Crawl >> robots.txt tester.

[Screenshot: Crawl >> robots.txt Tester in the old Search Console]

In the text editor, paste the rules you added to your robots.txt file, then click Test.

[Screenshot: the Test button in the robots.txt Tester]

If it checks out, then you are done!

Real-World Robots.txt Insights: What Works and What Doesn’t

Let’s dive into some stories from the trenches of SEO, where robots.txt plays a starring role. These tales not only shed light on best practices but also warn of the pitfalls to avoid.

The Case of the Hidden Gems

Imagine “GlobeTrotters,” a travel blog that decided to block its entire gallery using robots.txt, thinking it would protect their bandwidth:

Disallow: /gallery/

What they didn’t foresee was the drop in user engagement. Their travel photos, a huge draw for visitors, vanished from search results. It was a classic case of out of sight, out of mind. The lesson? Your visual content can be a magnet for visitors; don’t hide it away.

The Sitemap Oversight

Next up, “GizmoGeeks,” a gadget review site, revamped their site but forgot one tiny detail in their robots.txt: the sitemap reference. It was an “oops” moment that slowed down the indexing of their fresh content. Once they added:

Sitemap: https://gizmogeeks.com/sitemap.xml

…the search engines were back on track, gobbling up their new reviews. The takeaway? Your sitemap is like a treasure map for search engines; make sure they have it.

The SEO Win with a Simple Allow

Now, let’s talk about “VeggieVibes,” a blog about plant-based diets. They used robots.txt to spotlight their best articles:

Allow: /top-recipes/

This move put their prized content in the limelight, boosting their search presence. The insight here? Directing search engines to your star content can really pay off. One caveat: crawlers treat everything as allowed by default, so an Allow line only has real effect as an exception to a broader Disallow rule.
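Here is a sketch of that exception pattern in practice, with hypothetical paths:

User-agent: *
# Keep bots out of the sprawling archive...
Disallow: /archive/
# ...except for the flagship articles inside it
Allow: /archive/top-recipes/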

A Privacy Misstep

Lastly, there’s “FinanceHub,” a site offering premium financial advice. They tried using robots.txt to hide member-only content:

Disallow: /members-only/

But here’s the catch: those pages still showed up in search results, minus the content. It was a privacy faux pas. The moral? Robots.txt isn’t a cloak of invisibility for sensitive content. For that, you need stronger measures like login barriers.

FAQ: Navigating the World of WordPress Robots.txt

What exactly is a robots.txt file in WordPress?

Think of robots.txt as a doorman for your WordPress site. It’s a simple text file that lives in your site’s root directory, telling search engine bots which pages they can waltz into and which ones are off-limits.

Do I really need a robots.txt file for my WordPress site?

While your site can live without it, having a robots.txt file is like giving search engines a roadmap. It’s especially handy as your site grows, helping ensure that search engines spend their time on the pages that matter most.

Can I create a robots.txt file directly in WordPress?

Absolutely! While WordPress doesn’t have a built-in editor for this, you can use SEO plugins like Yoast SEO or All in One SEO. They offer a user-friendly way to whip up a robots.txt file without diving into the site’s files.

What should I include in my WordPress robots.txt file?

Start with the basics: allow search engines to index your main content and steer them away from administrative areas like /wp-admin/. Don’t forget to include a path to your sitemap to make indexing even smoother.

Can using robots.txt improve my site’s SEO?

Yes, when used wisely. By guiding bots to your site’s prime content and keeping them away from duplicate or irrelevant pages, you can make your site more appealing to search engines.

Is there anything I shouldn’t do with my robots.txt file?

Avoid using it as a privacy tool. If there’s something you don’t want the world to see, blocking it in robots.txt isn’t enough since the file itself is public. Also, be careful not to block essential elements like CSS or JS files that can impact how your site appears in search results.

How can I check if my robots.txt file is doing its job?

Use Google Search Console’s robots.txt Tester tool (or the robots.txt report in newer versions of Search Console). It’ll show you how Googlebot sees your file and point out any issues that might keep your content from being indexed properly.

Conclusion

Search bots can be unruly at times, and the best way to keep their activities on your website in check is to use robots.txt. Even then, some bots will completely ignore whatever rules you have laid out – you just have to live with that.

While it’s true WordPress automatically generates a virtual robots.txt file for you upon installation, creating a physical one yourself is a good idea. A well-optimized robots.txt file will prevent search bots from doing harm to your website’s SEO.

If you found this article helpful, do share. For more WordPress tutorials, follow our WordPress blog.


 This post was written by Sam Mulaim

Hello! I’m Sam - the founder and CEO of FixRunner WordPress support. When I started FixRunner, one of my goals was to help people run a successful website and overcome WordPress issues. I don’t have much time these days to write new posts, but when I do, I enjoy it very much.

Last edited by: FixRunner Team