WordPress Robots.txt Explained – Where To Put It And How To Use It
A good understanding of the WordPress robots.txt file will go a long way in helping you improve your website’s SEO. In this guide, you will learn what robots.txt is all about and, most importantly, how to use it.
Basically, robots.txt is made for robots – pieces of software that crawl web pages and index them for search results.
It allows website owners to bar search bots from crawling certain pages or content on their website. Used wrongly, robots.txt could ruin your site’s SEO.
As such, it should be used with caution. But not to worry – everything you need to know about the subject is covered in this guide.
What is WordPress Robots.txt File?
Robots.txt is a plain text file that lives in your website’s root directory and tells search engine crawlers which parts of your site they may – and may not – visit. The content of a robots.txt file typically looks like this:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow, in this case, tells search bots: “Hey, you are not permitted to crawl the wp-admin folder”. And Allow – well, you guessed it right – grants them access to the admin-ajax.php file inside that folder.
Every day, thousands of new websites are published on the internet. To make it easy for searchers to find these websites, Google and other search engines index each and every one of them.
Considering the sheer amount of work involved, Google relies on its search bots to get the job done quickly.
When a search robot lands on your website, it first studies your XML sitemap to discover all the pages it contains.
Next, the bot proceeds to crawl and index not just the website’s pages but also their assets, including JS and CSS files. If it’s a WordPress website, the bot would also crawl the wp-admin folder.
You certainly don’t want that to happen, and the only way to stop it is to tell the bots not to in your robots.txt file.
While creating a robots.txt file isn’t mandatory – as search bots will still crawl your website whether or not you have one – having one has lots of benefits.
Benefits of Creating an Optimized Robots.txt File
The major reason for creating a robots.txt file is to prevent search engine robots from crawling certain content of your website.
For instance, you wouldn’t want search bots poking around the theme and admin folders, plugin files, or category pages of your website.
Also, an optimized robots.txt file helps conserve what is known as crawl quota (often called crawl budget) – the maximum number of pages on a website that search bots will crawl in a given period.
You want to ensure that only useful pages are crawled; otherwise, your crawl quota is wasted on pages that don’t matter. Getting this right can improve your website’s SEO considerably.
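As a rough sketch of what that looks like in practice, many WordPress site owners keep bots out of internal search result pages, which add nothing to search listings. The rule below assumes your internal search URLs use WordPress’ default ?s= parameter – adjust it if your site’s URL structure is different:
User-agent: *
Disallow: /?s=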
Thirdly, a well-crafted robots.txt file can help you minimize the activity of search bots – including bad bots – around your website. That way, your website’s load speed can improve noticeably.
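For example, if one particular crawler is putting unnecessary strain on your server, you can single it out by its user agent name. “BadBot” below is just a placeholder – swap in the actual bot’s name – and bear in mind that truly malicious bots often ignore robots.txt altogether:
User-agent: BadBot
Disallow: /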
Where Is Robots.txt File Located?
By default, WordPress generates a robots.txt file for your website as soon as it is installed, served from the site’s root directory. To view it, open your website in a browser and append “/robots.txt” to the URL. For instance:
https://mywebsite.com/robots.txt
Here’s how ours at Fixrunner looks:
The default WordPress robots.txt file is virtual, though, meaning there is no actual file on your server to edit. To edit it, you would have to create a physical one – and there are several ways to do so. Let’s see some of them!
How to Create Robots.txt File in WordPress
Creating a robots.txt file in WordPress is a straightforward process. You can either do so manually or use WordPress plugins. We are going to see both processes here, and the plugin we are going to be using is Yoast SEO.
Using Yoast SEO Plugin
The Yoast SEO plugin can create a robots.txt file for WordPress on the fly. Of course, it does a whole lot more when it comes to WordPress SEO.
First off, install and activate the plugin if you haven’t already.
Once you have Yoast up and running on your website, navigate to SEO >> Tools.
Next, click on the File editor link in the Yoast dashboard.
This will take you to the page where you can create a robots.txt file. Click the Create button.
This will take you to an editor where you can add and edit rules in your WordPress robots.txt file.
Add new rules to the file editor and save changes. Not to worry, we will show you the rules to add shortly.
Adding Robots.txt Manually with FTP in WordPress
This method is quite simple, and just about anybody can do it. To begin, launch Notepad – or any plain text editor of your choice, as long as it isn’t a word processor like Microsoft Word – on your machine.
Then add the following rules to the new file.
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Save the file as robots.txt. Next, upload it to your website via an FTP program such as FileZilla.
First, establish a connection to your website in FileZilla, then navigate to your public_html folder and upload the robots.txt file you just created into it.
Once the upload is completed, you are good to go.
Adding Rules
Basically, there are just two main instructions you can give to search bots: Allow and Disallow. Allow grants them access to a folder, and Disallow does the opposite.
To allow access to a folder, add:
User-agent: *
Allow: /wp-content/uploads/
The asterisk (*) in the User-agent line tells search bots, “hey, this rule applies to all of you”.
To block access to a folder, use the following rule:
Disallow: /wp-content/plugins/
In this instance, we are denying search bots access to the plugins folder.
It’s entirely up to you to determine which rule is most applicable to your website. If you run a forum, for instance, you may decide to block off crawlers from your forum page with the following rule:
Disallow: /forum/
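In the same spirit, if you would rather keep WordPress’ category and tag archives out of search engines, you could add something like the rules below. This assumes you haven’t changed the default “category” and “tag” bases in your permalink settings:
Disallow: /category/
Disallow: /tag/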
As a rule of thumb, the fewer the rules, the better. The following set of rules is enough to get the job done:
User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
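You can also point crawlers straight to your XML sitemap from the same file by adding a Sitemap line at the end. The URL below is only a placeholder – replace it with your sitemap’s actual address (Yoast SEO, for example, typically generates one at /sitemap_index.xml):
Sitemap: https://mywebsite.com/sitemap.xml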
How to Test Your Created Robots.txt file in Google Search Console
Now that you’ve created a robots.txt file in WordPress, you will want to be sure it’s working as it should. And there is no better way to do that than with a robots.txt tester tool.
Google Search Console has the right tool for this purpose. So, first things first, log into your Google Search Console account. You can always create an account if you don’t have one.
Once in Google Search Console, scroll down and click Go to the old version.
Once you are in the old version, navigate to Crawl >> robots.txt tester.
In the text editor, paste the rules you added to your robots.txt file, then click Test.
If it checks out, then you are done!
Conclusion
Search bots can be unruly at times, and the best way to keep their activities on your website in check is with robots.txt. Even then, some bots will completely ignore whatever rules you have laid out – you just have to live with that.
While it’s true that WordPress automatically generates a virtual robots.txt file upon installation, creating one yourself is a good idea. A well-optimized robots.txt file keeps search bots away from the parts of your website they have no business crawling.
If you found this article helpful, do share. For more WordPress tutorials, follow our WordPress blog.
More Resources:
- WordPress two factor authentication
- Vary Accept-encoding Header Error: How to Fix in WordPress
- How To Find, Create And Use htaccess File In WordPress
- How to Install and Configure All in One SEO Pack Plugin