
Robots.txt Feature in SureRank

The Robots.txt feature in SureRank allows you to easily edit and manage your website’s robots.txt file directly from your WordPress dashboard without needing to access your server files manually.

What is Robots.txt?

The robots.txt file is a text file placed in the root directory of your website that gives instructions to search engine crawlers (like Googlebot, Bingbot, etc.) about which pages or sections of your site they can or cannot crawl.

How does it help your site?

  • Controls search engine access – Prevents search engines from crawling unnecessary or sensitive pages (e.g., admin pages, staging areas).
  • Saves crawl budget – Ensures that search engines focus on crawling your important content instead of wasting time on irrelevant pages.
  • Improves SEO efficiency – Helps search engines spend their crawl time on the pages you actually want discovered and shown in search results.

Exceptional Cases

Case 1 – Search Engine Visibility Warning

If the Search Engine Visibility option is checked in:
WordPress Dashboard → Settings → Reading → Discourage search engines from indexing this site

You will see this warning when editing the robots.txt file:

⚠️ Warning:
Your site’s search engine visibility is currently set to Hidden in Settings > Reading. Any changes made here will not be applied until you set the search engine visibility to Public. This is required to update the robots.txt content.

Solution:

  • Go to WordPress Dashboard → Settings → Reading.
  • Uncheck Discourage search engines from indexing this site.
  • Click Save Changes, then return to SureRank and edit your robots.txt file.

Case 2 – Existing robots.txt File in Root Folder

If your site already has a robots.txt file in its root directory, SureRank will not allow you to edit it from the dashboard.

Why?

  • A physical robots.txt file in the root folder is served directly by the web server, so it overrides the dynamic file generated by SureRank.

Solution:

  • Connect to your hosting via FTP or File Manager.
  • Navigate to your website’s root folder (often /public_html/).
  • Delete the existing robots.txt file.
  • Return to SureRank and edit your robots.txt from the dashboard.
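
If you have script access to your server, you can confirm whether a leftover physical file exists before deleting it. A minimal Python sketch (the root folder path varies by host, so treat /public_html/ as an assumption):

```python
from pathlib import Path

def find_physical_robots(site_root: str) -> bool:
    """Return True if a static robots.txt file exists in the given root folder."""
    return (Path(site_root) / "robots.txt").is_file()

# Hypothetical root folder -- many hosts use /public_html/, but yours may differ.
# if find_physical_robots("/public_html"):
#     print("A static robots.txt exists; delete it so SureRank's dynamic file can take over.")
```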

How to Access and Edit Robots.txt in SureRank

  • Go to SureRank Advanced Settings
    • In your WordPress dashboard, navigate to:
      SureRank → Advanced Settings
  • Open the Robots.txt Editor
    • Scroll to the end of the Advanced Settings page.
    • You will find the Robots.txt Editor section.
  • Edit the File
    • Add or update rules in the editor according to your needs.

Example:

Plaintext
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

  • Save the Settings
    • After making changes, click the Save button to update the file instantly.
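
Before saving, you can sanity-check rules like the example above locally with Python's standard-library urllib.robotparser. This is a rough approximation (Google's own parser resolves Allow/Disallow precedence slightly differently, and example.com is a placeholder), but it catches obvious mistakes:

```python
from urllib.robotparser import RobotFileParser

# The example rules from above.
rules = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# /wp-admin/ is blocked for all crawlers; regular content is not.
print(rp.can_fetch("*", "https://example.com/wp-admin/"))   # False
print(rp.can_fetch("*", "https://example.com/blog/post/"))  # True
```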

How to Check Your Robots.txt File

Once saved, you can verify your robots.txt file by:

  • Visiting your website in a browser.
  • Adding /robots.txt to the end of your site's root URL.
    Example: https://yourwebsite.com/robots.txt
  • You should now see your updated robots.txt file displayed.
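
The file is always served from the root of the domain, no matter which page you start from. A small Python sketch of building that URL with the standard library (yourwebsite.com is a placeholder):

```python
from urllib.parse import urljoin

def robots_url(site: str) -> str:
    """robots.txt always lives at the root of the host, regardless of path."""
    return urljoin(site, "/robots.txt")

print(robots_url("https://yourwebsite.com"))             # https://yourwebsite.com/robots.txt
print(robots_url("https://yourwebsite.com/blog/page/"))  # https://yourwebsite.com/robots.txt
```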

Common Robots.txt Examples

1. Default WordPress-Friendly Setup

Plaintext
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

2. Block a Specific Folder

Plaintext
User-agent: *
Disallow: /private/

3. Block a Specific File

Plaintext
User-agent: *
Disallow: /secret-page.html

4. Allow Only One Crawler

Plaintext
User-agent: Googlebot
Disallow:
User-agent: *
Disallow: /

5. Block All Crawlers (Not Recommended for Live Sites)

Plaintext
User-agent: *
Disallow: /
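
The per-crawler behavior of example 4 can also be verified locally with urllib.robotparser (a rough local check; example.com is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Example 4 from above: allow only Googlebot, block everyone else.
rules = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/page/"))  # True
print(rp.can_fetch("Bingbot", "https://example.com/page/"))    # False
```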

Troubleshooting

If your robots.txt file is not working or not updating, try these steps:

  1. Clear Caches
    • Clear your WordPress cache (if using a caching plugin).
    • Clear your server cache (check with your hosting provider).
    • Clear your browser cache.
  2. Check for Conflicts
    • Some SEO plugins (like Yoast or Rank Math) may also manage the robots.txt file.
    • If using multiple SEO tools, disable the robots.txt feature in the others to avoid conflicts.
  3. File Not Found Error
    • Ensure SureRank’s Robots.txt feature is enabled and settings are saved.
    • Verify your site is not blocking access to robots.txt in server configuration files (like .htaccess).
  4. Google Still Crawling Blocked Pages
    • Search engines may take time to update crawl rules.
    • Use Google Search Console → URL Inspection Tool to check if a URL is blocked by robots.txt.
  5. Still Not Working?
    • Contact your hosting provider to confirm your server allows dynamic robots.txt generation.

Important Tips

  • Always be careful when blocking pages: blocking important content can prevent it from appearing in search results.
  • Remember, robots.txt is a public file. Anyone can view it by visiting yoursite.com/robots.txt.
  • Use the file for crawl control, not for hiding sensitive information. Sensitive pages should be protected with passwords or noindex tags.