Robots.txt File for WordPress
What Is a Robots.txt File?
A robots.txt file tells search engine bots which parts of your WordPress website they are allowed to crawl. It must be placed in your site's root directory.
It lets you keep crawlers out of areas they don't need to visit, such as the wp-admin or wp-includes folders.
This keeps bots away from your backend and helps prevent duplicate or low-value content from showing up in search results. A well-structured robots.txt also improves crawl efficiency.
If search engines spend their crawl budget on unnecessary files, they may reach your important pages less often. Make sure you don't accidentally block content you want to rank.
Here’s a common, default example you can copy:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourwebsite.com/sitemap_index.xml
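To see how these rules interact, here is a minimal Python sketch of the longest-match logic that major crawlers such as Googlebot use to resolve Allow and Disallow conflicts: the longest matching rule wins, and Allow wins ties. It only does simple prefix matching, ignores wildcards, and the test paths are just illustrations, so treat it as a rough model rather than a full parser.

# A simplified model of how crawlers choose between Allow and Disallow:
# the longest matching rule wins, and Allow wins ties. Wildcards are not
# handled; the rules mirror the default example above.
RULES = [
    ("Disallow", "/wp-admin/"),
    ("Disallow", "/wp-includes/"),
    ("Allow", "/wp-admin/admin-ajax.php"),
]

def is_allowed(path: str) -> bool:
    """Return True if the given path may be crawled under RULES."""
    best_length = -1
    best_allowed = True  # paths matching no rule are crawlable
    for directive, rule_path in RULES:
        if path.startswith(rule_path):
            allowed = directive == "Allow"
            longer = len(rule_path) > best_length
            tie_break = len(rule_path) == best_length and allowed
            if longer or tie_break:
                best_length = len(rule_path)
                best_allowed = allowed
    return best_allowed

for path in ("/wp-admin/", "/wp-admin/admin-ajax.php", "/my-post/"):
    print(path, "->", "allowed" if is_allowed(path) else "blocked")

Running this shows /wp-admin/ as blocked while /wp-admin/admin-ajax.php stays allowed, which is why the Allow line matters: many plugins rely on admin-ajax.php for front-end features.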
Here’s a custom robots.txt file you can use instead to help your website get crawled and indexed efficiently by search engines.
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /trackback/
Disallow: /cgi-bin/
Disallow: /xmlrpc.php
Disallow: /feed/
Disallow: /comments/
Disallow: /category/*/feed/
Allow: /wp-content/uploads/
Sitemap: https://yourwebsite.com/sitemap_index.xml
It’s good practice to review your robots.txt regularly. You can easily do that by visiting https://yourwebsite.com/robots.txt.
Ask yourself: Are you hiding the right things while allowing search engines to access your valuable content?
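If you prefer to check programmatically, Python's standard library can fetch the live file and test individual URLs, as in the sketch below. The domain and paths are placeholders for your own site, and note that urllib.robotparser resolves Allow/Disallow conflicts by first match rather than Google's longest-match rule, so use it as a rough sanity check.

from urllib import robotparser

# Fetch and parse the live robots.txt (placeholder domain; use your own).
parser = robotparser.RobotFileParser()
parser.set_url("https://yourwebsite.com/robots.txt")
parser.read()

# Spot-check a few URLs against the rules for any user agent ("*").
for url in (
    "https://yourwebsite.com/wp-admin/",
    "https://yourwebsite.com/wp-content/uploads/logo.png",
    "https://yourwebsite.com/an-important-post/",
):
    print(url, "->", "crawlable" if parser.can_fetch("*", url) else "blocked")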