
[Bots] Clarify managed robots.txt page for readability#29039

Merged

patriciasantaana merged 2 commits into production from jun/bots/robots-txt/eli5-demo on Apr 20, 2026

Conversation

Contributor

@Oxyjun Oxyjun commented Mar 17, 2026

Improves clarity and reduces jargon on the managed robots.txt reference page, based on an ELI5 analysis that identified missing context, undefined terms, and buried key information.

  • Rewrite introduction to establish the problem (AI crawlers scraping content) before describing the feature — the original jumped straight to "protect your website" without context
  • Promote voluntary compliance from a buried note to the second paragraph — this is the most important decision-making fact on the page
  • Define "directives" inline on first use with a parenthetical example (Disallow: /) for readers unfamiliar with robots.txt syntax
  • Define Content Signals in prose before the code block — previously only defined inside legal-style code comments that readers had to parse
  • Replace "Free zones" with "domains on the Free plan" to eliminate Cloudflare-internal jargon
  • Clarify robots.txt vs AI Crawl Control — note now explains the two features are complementary, not alternatives
  • Fix grammar — "a HTTP" → "an HTTP"
  • Remove unused Render import
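
To make the terminology in the bullets above concrete, here is an illustrative robots.txt sketch, not the actual managed file Cloudflare serves; the user agent name and the Content-Signal values are example choices:

```txt
# A "directive" is an instruction to a crawler. This pair tells the
# (illustrative) AI crawler "GPTBot" not to fetch any path on the site:
User-agent: GPTBot
Disallow: /

# Content Signals express preferences for how fetched content may be
# used (illustrative syntax and values):
Content-Signal: search=yes, ai-train=no

# Compliance is voluntary: well-behaved crawlers honor these rules,
# but robots.txt cannot technically enforce them.
```

This is why the PR promotes the voluntary-compliance note: a reader deciding between robots.txt and an enforcement feature such as AI Crawl Control needs to know that the file above is a request, not a block.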

Improve introduction, define jargon, and reduce assumptions on the
managed robots.txt reference page:

- Rewrite intro to establish why (AI crawlers scraping content) before
  describing the feature
- State voluntary compliance prominently instead of burying it
- Define 'directives' inline on first use
- Define Content Signals in prose before the code block
- Replace 'Free zones' with 'domains on the Free plan'
- Clarify relationship between robots.txt and AI Crawl Control
- Fix grammar: 'a HTTP' to 'an HTTP'
- Remove unused Render import
@Oxyjun Oxyjun requested review from a team and patriciasantaana as code owners March 17, 2026 10:17
@github-actions github-actions Bot added product:bots Related to Bots product size/s labels Mar 17, 2026
@github-actions

This pull request requires reviews from CODEOWNERS as it changes files that match the following patterns:

Pattern: /src/content/docs/bots/
Owners: @patriciasantaana, @cloudflare/pcx-technical-writing

github-actions Bot commented Mar 17, 2026


github-actions Bot commented Apr 1, 2026

Hey there, we've marked this pull request as stale because there's no recent activity on it. This label helps us identify PRs that might need updates (or to be closed out by our team if no longer relevant).

@github-actions github-actions Bot added the stale label Apr 1, 2026
@patriciasantaana patriciasantaana enabled auto-merge (squash) April 20, 2026 20:47
@patriciasantaana patriciasantaana merged commit 6f9c1db into production Apr 20, 2026
12 checks passed
@patriciasantaana patriciasantaana deleted the jun/bots/robots-txt/eli5-demo branch April 20, 2026 20:47
nojvek pushed a commit to nojvek/cloudflare-docs that referenced this pull request Apr 29, 2026

* [Bots] Clarify managed robots.txt page for readability


* Trigger build

---------

Co-authored-by: Pedro Sousa <680496+pedrosousa@users.noreply.github.com>

Labels

product:bots Related to Bots product size/s stale

3 participants