
How to Write and Submit a Robots.txt File (Step-by-Step Guide)

  • Ajitesh Agarwal
  • Mar 26
  • 3 min read

To write and submit a robots.txt file, create a text file with crawl rules (Allow/Disallow), save it as “robots.txt,” and upload it to your website’s root directory. Then test it using tools like Google Search Console to ensure it works correctly.


In this guide, you’ll learn how to create, write, test, and submit a robots.txt file step-by-step.


What is the robots.txt file used for?

A robots.txt file is one of the most important technical SEO elements that helps search engines understand how to crawl your website. Whether you're running a blog, eCommerce store, or business site, properly configuring robots.txt can improve crawl efficiency and protect sensitive pages.


Without a robots.txt file, search engines will assume they may crawl your entire website by default.


Why is Robots.txt Important for SEO?

  • Controls crawl budget (especially for large websites)

  • Prevents indexing of sensitive or duplicate pages

  • Helps search engines focus on important content

  • Improves overall site performance in search rankings


You can generate one instantly using the
👉 Marcitors Robots.txt Generator


Basic Robots.txt Example

User-agent: Googlebot
Disallow: /private/
User-agent: *
Allow: /
Sitemap: https://marcitors.com/sitemap.xml

What This Means:

  • Googlebot cannot access the /private/ folder

  • All other bots can crawl the entire website

  • Sitemap location is provided for better indexing
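You can sanity-check rules like these before deploying with Python’s standard `urllib.robotparser` module; a minimal sketch using the example file above (the page URLs are illustrative):

```python
from urllib.robotparser import RobotFileParser

# The example robots.txt above, parsed line by line
rules = [
    "User-agent: Googlebot",
    "Disallow: /private/",
    "User-agent: *",
    "Allow: /",
    "Sitemap: https://marcitors.com/sitemap.xml",
]

rp = RobotFileParser()
rp.parse(rules)

# Googlebot is blocked from /private/; other crawlers fall back to the * rule
print(rp.can_fetch("Googlebot", "https://marcitors.com/private/page.html"))  # False
print(rp.can_fetch("Bingbot", "https://marcitors.com/private/page.html"))    # True
```

This catches typos in directives before they ever reach a live crawler.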


How to Write a robots.txt File?

  • Open a text editor

  • Add rules like:

    User-agent: *

    Disallow: /admin/

  • Save the file as robots.txt
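The steps above can also be scripted. A minimal Python sketch that writes those rules to a UTF-8 robots.txt file (here in the current directory; in practice you would write to your site’s web root):

```python
from pathlib import Path

# The rules from the steps above
rules = "\n".join([
    "User-agent: *",
    "Disallow: /admin/",
]) + "\n"

# Save as robots.txt in UTF-8 encoding, as search engines expect
Path("robots.txt").write_text(rules, encoding="utf-8")
print(Path("robots.txt").read_text(encoding="utf-8"))
```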


How to submit a robots.txt file?


Submitting is simple: once the file is uploaded to your website’s root directory, search engines find it automatically. The steps below walk through the full process, from creating the file to validating it in Google Search Console.


Step-by-Step: How to Create a Robots.txt File

Step 1: Create a file

  • Use a simple text editor like Notepad or VS Code

  • Name the file exactly: robots.txt

  • Save it in UTF-8 encoding


Step 2: Add Rules (Directives)

Here are the main directives:

User-agent

Defines which crawler the rule applies to

User-agent: Googlebot

Disallow

Blocks specific pages or folders

Disallow: /admin/

Allow

Allows specific pages within blocked folders

Allow: /admin/public-page.html

Sitemap

Specifies your sitemap location

Sitemap: https://marcitors.com/sitemap.xml


Common Robots.txt Rules Examples

Block Entire Website

User-agent: *
Disallow: /

Allow Full Access

User-agent: *
Disallow:

Block Specific File Type (e.g., PDFs)

User-agent: *
Disallow: /*.pdf$

Block Internal Search Pages

User-agent: *
Disallow: /search/
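A plain prefix rule like the internal-search block can again be verified with `urllib.robotparser`. One caveat: this stdlib parser matches literal path prefixes only, so the `*` and `$` wildcards in the PDF example above (which Googlebot does understand) are not honored by it; the bot name here is a placeholder.

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /search/",
])

# Everything under /search/ is blocked for every crawler; other paths are allowed
print(rp.can_fetch("AnyBot", "https://example.com/search/?q=shoes"))  # False
print(rp.can_fetch("AnyBot", "https://example.com/blog/post-1"))      # True
```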

Step 3: Save File

  • Name: robots.txt

  • Format: .txt


Important Robots.txt Guidelines

  • File must be named robots.txt

  • Place it in the root directory only

  • Only one robots.txt file per domain

  • Rules are case-sensitive

  • Use / at the beginning of paths

  • Use # to add comments
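Putting these guidelines together, a small commented file might look like this (the paths and domain are placeholders):

```
# Block the admin area for all crawlers
User-agent: *
Disallow: /admin/

# Point crawlers at the sitemap
Sitemap: https://yourdomain.com/sitemap.xml
```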


Step 4: Where to Upload Robots.txt File?

Upload your robots.txt file to your website’s root directory, so that it is reachable at https://yourdomain.com/robots.txt.

Hosting Platform Notes:

  • Wix / Blogger: Use built-in SEO settings

  • WordPress: Use plugins or FTP access

  • Custom Sites: Upload via hosting cPanel or FTP


Step 5: Validate

Check errors using Google Search Console


How to Test Robots.txt File

Method 1: Manual Check

Open https://yourdomain.com/robots.txt in a browser. If the file loads and displays your rules, it’s live.


Method 2: Google Search Console

  • Open the robots.txt report (the replacement for the older robots.txt Tester tool)

  • Check for errors and blocked URLs


Why Write a Robots.txt File?

  • Block admin pages

  • Avoid duplicate content

  • Improve crawl efficiency

  • Guide search engine bots

👉 Generator tools save time and reduce errors.


How to Submit Robots.txt to Google

Good news: 👉 you don’t need to manually submit robots.txt.

Once uploaded, Google automatically detects it.


To speed up updates:

  • Go to Google Search Console

  • Use URL Inspection Tool

  • Request reindexing


Best Practices for robots.txt

  • Use Disallow carefully

  • Add sitemap for better indexing

  • Keep rules minimal

  • Update regularly


Common Mistakes to Avoid

❌ Blocking important pages (like product pages)

❌ Using incorrect syntax or getting the letter case wrong

❌ Forgetting to add a sitemap

❌ Placing the file in a subfolder instead of the root

❌ Blocking CSS/JS files (affects rendering)


Pro Tips for Better SEO

  • Combine robots.txt with meta robots tags

  • Regularly audit your file

  • Keep it clean and minimal

  • Use it to manage crawl budget for large sites


A well-optimized robots.txt file helps search engines crawl your site efficiently while protecting sensitive content. It’s a small file with a big SEO impact.

If you want an easy way to create one, you can use tools like a robots.txt generator to automate the process and avoid errors.
