
robots.txt Generator

Build a robots.txt file for your website — presets, rules, sitemaps

robots.txt Output

# robots.txt generated by OmniWebKit
# Generated: 2026-04-05

User-agent: *
Allow: /

Free Online robots.txt Generator — Build Your File in Seconds

Every website needs a robots.txt file. It tells search engine crawlers which pages they may access and which they should skip. Without one, crawlers will crawl everything they can reach — admin panels, checkout flows, search result pages, duplicate content — wasting crawl budget and cluttering search results. A properly configured robots.txt file focuses crawlers on the pages that matter, reduces server load, and keeps your site clean in search results.

This free robots.txt Generator lets you build the file visually. Add rules for different user agents, specify Allow and Disallow paths, set a crawl delay, add your sitemap URLs, and see the output in real time. Five presets are included: Allow All, Block All, WordPress, Next.js, and E-commerce. Each preset is a one-click starting point that you can customise.

When you are finished, copy the output to your clipboard or download it as a robots.txt file. Upload it to the root directory of your website. All processing runs in your browser — no data is sent to any server.

Understanding robots.txt Directives

User-agent

Specifies which crawler the rules apply to. Use * for all crawlers, or name a specific bot like Googlebot, Bingbot, or Yandex.
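For instance, a file can mix a general group with a bot-specific one (the paths here are illustrative):

```text
# Applies to every crawler
User-agent: *
Disallow: /private/

# Applies only to Google's image crawler
User-agent: Googlebot-Image
Disallow: /photos/
```

A crawler obeys the most specific group that matches its name, so Googlebot-Image follows only the second block.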

Disallow

Tells the crawler NOT to access a specific path. Disallow: /admin/ blocks the entire /admin/ directory. Disallow: / blocks the entire site.
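Disallow matches by path prefix, so the trailing slash matters (paths illustrative):

```text
User-agent: *
Disallow: /admin/     # /admin/ and everything under it
Disallow: /checkout   # any path beginning with /checkout
Disallow:             # empty value disallows nothing
```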

Allow

Overrides a Disallow rule for a specific path. Useful when you block a directory but want to allow a specific file within it.
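For example, to block a directory while leaving one file inside it crawlable (paths illustrative):

```text
User-agent: *
Disallow: /downloads/
Allow: /downloads/catalog.pdf
```

Google and Bing pick the most specific (longest) matching rule, so the Allow wins for that file.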

Sitemap

Points crawlers to your XML sitemap. This helps search engines discover all your pages. Use the full URL: https://example.com/sitemap.xml.
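The Sitemap directive stands outside any User-agent group and can be repeated (URLs illustrative):

```text
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/news-sitemap.xml
```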

Crawl-delay

Requests that crawlers wait a number of seconds between requests. Not all crawlers honour this directive, but it can reduce server load.
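A minimal example asking one bot to pause ten seconds between requests (Google ignores Crawl-delay entirely):

```text
User-agent: Bingbot
Crawl-delay: 10
```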

Host

Specifies the preferred domain for your site. A non-standard directive that was used mainly by Yandex, which has since deprecated it. Most sites do not need this directive.

Five Presets Explained

Allow All

Allows all crawlers to access all pages. This is the most open configuration. Use it if you have nothing to hide and want maximum indexing.

Block All

Blocks all crawlers from accessing any page. Use this for staging sites, development servers, or sites not ready for public indexing.
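The Block All preset boils down to two lines:

```text
User-agent: *
Disallow: /
```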

WordPress

Blocks common WordPress admin directories, plugin files, trackbacks, feeds, and internal search results. Includes a sitemap directive.
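A typical WordPress file looks roughly like this — the generator's preset may differ in the exact paths, and the sitemap URL is a placeholder:

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /trackback/
Disallow: /feed/
Disallow: /?s=

Sitemap: https://example.com/sitemap.xml
```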

Next.js

Blocks Next.js internal routes (_next), API routes, and error pages (404, 500). Includes a sitemap directive.
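A sketch of what such a file might contain (the exact paths are the preset's choice; the sitemap URL is a placeholder):

```text
User-agent: *
Disallow: /_next/
Disallow: /api/
Disallow: /404
Disallow: /500

Sitemap: https://example.com/sitemap.xml
```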

E-commerce

Blocks cart, checkout, account, admin, search queries, wishlist, and URL parameters for sorting and filtering. Keeps product pages indexed.
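An e-commerce file along these lines (paths and parameter names are illustrative; the * wildcard is honoured by Google and Bing but is not part of every crawler's vocabulary):

```text
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /account/
Disallow: /wishlist/
Disallow: /*?sort=
Disallow: /*?filter=
```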

Frequently Asked Questions

Is this robots.txt generator free?
Yes, completely free with no account, no limits, and no data collection.
Where do I put the robots.txt file?
Upload it to the root directory of your website: https://yourdomain.com/robots.txt. Most web hosts let you upload via FTP or your hosting panel.
Does robots.txt block pages from Google?
robots.txt tells crawlers not to crawl certain pages, but it does not prevent indexing — a disallowed URL can still appear in results if other sites link to it. To block indexing, use a noindex meta tag instead, and leave the page crawlable so the tag can be seen.
What is the crawl-delay directive?
It requests that crawlers wait a number of seconds between requests. Google ignores this directive, but Bing and Yandex honour it.
Can I have multiple User-agent blocks?
Yes. You can create separate rules for different crawlers. For example, one block for Googlebot and another for Bingbot.
What is the Sitemap directive?
It points crawlers to your XML sitemap, helping them discover all your pages. Use the full URL.
Do I need a robots.txt file?
Technically no, but it is strongly recommended. Without one, crawlers will access everything, which may include pages you do not want indexed.
Does this tool send my data to a server?
No. Everything runs in your browser. Your configuration is never uploaded anywhere.
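Before uploading, you can sanity-check a file with Python's standard-library urllib.robotparser. A quick sketch, reusing the Disallow/Allow pattern from above — note that Python's parser applies rules first-match-first rather than Google's longest-match rule, so the Allow line must precede the Disallow it overrides:

```python
from urllib import robotparser

# Rules as the generator would emit them (Allow first, see note above).
robots_txt = """\
User-agent: *
Allow: /downloads/catalog.pdf
Disallow: /downloads/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check whether a given crawler may fetch a given URL.
print(rp.can_fetch("*", "https://example.com/"))                       # True
print(rp.can_fetch("*", "https://example.com/downloads/"))             # False
print(rp.can_fetch("*", "https://example.com/downloads/catalog.pdf"))  # True
```

can_fetch takes the crawler's user-agent string and a full URL, so you can also test bot-specific groups by passing "Googlebot", "Bingbot", and so on.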
© 2026 OmniWebKit. All rights reserved.
Made with ❤ for developers and creators