The problem nobody talks about
Every week, another company launches a page “for agents.” Developer portals. Agent card directories. API marketplaces. Most of them are invisible to the agents they claim to serve.
The reason is almost always the same: the frontend framework made the wrong thing easy. A React SPA ships zero server-rendered content by default. An Angular app hides everything behind client-side routing. A Vue app with Cloudflare Rocket Loader defers all script execution. The developer didn’t choose to be invisible to agents — they chose a popular framework and followed its defaults.
When an agent fetches a URL, it makes an HTTP GET request and parses the HTML response. No browser. No JavaScript engine. No waiting for hydration. If the content isn’t in the initial HTML, it doesn’t exist.
This isn’t a niche concern. It’s the primary way agents interact with the web. LangChain, CrewAI, AutoGPT, Claude’s computer use, OpenAI’s browsing — they all start with an HTTP GET. The framework you choose determines whether agents can read your site before you write a single line of application code.
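That fetch-and-parse step is small enough to sketch. The Python below is a minimal stand-in for an agent's reader, not any framework's actual implementation; the two sample pages are invented for illustration. The point is that nothing here executes JavaScript:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping the contents of <script> and <style>."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

def visible_words(html: str) -> int:
    """Words an agent can read without running any JavaScript."""
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())

# A client-rendered SPA shell: nothing for an agent to read.
spa = '<html><body><div id="root"></div><script src="/app.js"></script></body></html>'
# A server-rendered page: the content is right there in the HTML.
ssr = "<html><body><h1>Pricing</h1><p>Plans start at ten dollars a month.</p></body></html>"

print(visible_words(spa), visible_words(ssr))  # prints: 0 8
```

Swap in the body of any real page (via urllib or curl) and the number tells you immediately which side of the divide you are on.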
How we ranked
We evaluated every major frontend framework against one criterion: what does the default output look like to an agent?
Not “can it be configured to be agent-readable.” Everything can. The question is what happens when a developer follows the getting-started guide, uses the default project template, and ships without thinking about agents at all. That default behavior is the framework’s actual opinion about agent readability.
We also ran six agent-readability checks against a fresh project from each framework:
1. HTTP GET extracts ≥ 100 words of content
2. Semantic HTML (h1, h2, lists, tables, meaningful link text)
3. Meta description present and substantive
4. No noindex on content pages
5. Content in the DOM, not behind API calls
6. Clean URLs (no hash routing)
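Most of these checks reduce to simple string tests on the raw response. Here is a rough Python sketch; the thresholds (100 words, a 50-character description) and the regex-based tag stripping are our simplifications for illustration, not a standard:

```python
import re

def agent_checks(html: str, url: str) -> dict:
    """String-level approximations of the readability checks.
    Tag stripping via regex is naive: good enough for a smoke test,
    not for real content extraction."""
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", text)
    meta = re.search(r'<meta\s+name=["\']description["\']\s+content=["\']([^"\']*)', html, re.I)
    return {
        "enough_words": len(text.split()) >= 100,
        "has_h1": bool(re.search(r"<h1[\s>]", html, re.I)),
        "meta_description": bool(meta and len(meta.group(1)) >= 50),
        "no_noindex": "noindex" not in html.lower(),
        "clean_url": "/#/" not in url,  # hash routing is invisible to HTTP GET
    }

# Two illustrative pages: a server-rendered one and an empty SPA shell.
good = (
    '<html><head><meta name="description" content="'
    + "A substantive description of what this page covers, long enough to be useful."
    + '"></head><body><h1>Docs</h1><p>' + "word " * 120 + "</p></body></html>"
)
shell = '<html><body><div id="root"></div><script src="/app.js"></script></body></html>'

print(agent_checks(good, "https://example.com/docs"))
print(agent_checks(shell, "https://example.com/#/docs"))
```

The first page passes every check; the shell fails on word count, and its hash URL fails the clean-URL test regardless of what the HTML contains.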
Tier 1 — Agent-native by default
Astro
Recommended. Astro is the most agent-native frontend framework available today. Its “islands architecture” ships zero JavaScript by default. Every page is static HTML. When you need interactivity — a search bar, a filter panel, a chart — you explicitly opt in by creating an “island” that hydrates while the rest of the page stays pure HTML.
The key insight: a developer has to actively try to make something agents can’t read. The default path — following the docs, using the starter template — produces a fully agent-readable page every time.
Agent-readability strengths
- ✓ Zero JS shipped by default. Content is in the HTML, period.
- ✓ Islands architecture: interactive components are explicit opt-in, not the default mode.
- ✓ Built-in sitemap generation, RSS feeds, and image optimization.
- ✓ Content Collections: structured content management with type-safe schemas.
- ✓ Supports React, Svelte, Vue, or Solid for interactive islands — no vendor lock-in.
- ✓ View Transitions API for SPA-like page transitions without client-side routing.
Visual design ceiling: As high as any React framework. Tailwind works. Any animation library works. Stripe-level polish is achievable. The constraint is design skill, not the framework.
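The islands model in practice looks like this hypothetical Astro page, where only one component ever hydrates. SearchPanel and its path are invented for illustration; client:visible is Astro's standard directive for deferring hydration until the island scrolls into view:

```astro
---
// SearchPanel is a hypothetical React component; it is the only
// part of this page that ships JavaScript to the browser.
import SearchPanel from "../components/SearchPanel.jsx";
---
<h1>API Directory</h1>
<p>Every word of this page is in the server-rendered HTML an agent fetches.</p>
<!-- The island: hydrates lazily, leaves the rest of the page as static HTML -->
<SearchPanel client:visible />
```

Delete the island and the page still renders completely; that is the property the rest of this ranking is measuring.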
11ty (Eleventy)
The minimalist’s answer. Pure static HTML plus progressive enhancement. 11ty generates HTML files from templates — Markdown, Nunjucks, Liquid, whatever you prefer — and ships exactly what you write. No build step beyond templating. No framework runtime. No hydration.
The tradeoff is a lower ceiling for interactive experiences. If you need complex client-side behavior, you’re writing vanilla JavaScript or bolting on a framework. For content-heavy sites (documentation, blogs, directories), 11ty is bulletproof.
Tier 2 — Agent-friendly with discipline
These frameworks can produce agent-readable output, and their defaults lean in the right direction. But the component ecosystem, community patterns, and developer habits pull toward client-heavy architectures. You need awareness and discipline to stay agent-native.
Next.js (App Router / React Server Components)
React Server Components were a massive step in the right direction. In the App Router, components render on the server by default. You explicitly opt into client-side rendering with 'use client'. This means a default Next.js page ships server-rendered HTML with real content.

The problem is React’s ecosystem gravity. Most React libraries, tutorials, and Stack Overflow answers assume client-side rendering. State management libraries pull you toward 'use client'. Interactive patterns (modals, dropdowns, search) require client components. It’s easy to end up with a component tree where the root is server-rendered but the content-bearing children are all client components — producing a nice SSR shell with loading spinners where the data should be.
The discipline required
- Keep data-fetching in Server Components. Never fetch content in a 'use client' component.
- Push 'use client' boundaries as far down the tree as possible. Interactive leaf, not interactive root.
- Run curl against every page during development. If a human can see content but curl can’t, you have a client-rendering leak.
SvelteKit
SvelteKit server-renders by default, compiles to lean JavaScript (no virtual DOM runtime), and produces some of the smallest bundles in the framework ecosystem. Its +page.server.ts pattern makes it clear what runs on the server versus the client.

The risk is smaller than with Next.js but still present: Svelte components are inherently client-rendered after initial SSR. Data fetched in load() functions is server-rendered, but any reactive state changes after hydration exist only client-side. Agent readability depends on the initial load() returning everything meaningful.
Remix / React Router v7
Remix has the best philosophical alignment with agent readability of any React framework. Progressive enhancement is a core value: forms work without JavaScript, loaders run on the server, and the mental model encourages server-first thinking.
In practice, it’s still React underneath. The same ecosystem gravity applies, though Remix’s conventions resist it more effectively than Next.js. The recent merge into React Router v7 adds some complexity to the story, but the progressive enhancement principles remain.
Tier 3 — Agent-hostile by default
These frameworks produce zero agent-readable content in their default configuration. Building an agent-readable site with them requires bolting on server-side rendering after the fact — fighting the framework rather than working with it.
React SPA (Vite / Create React App)
A fresh create-react-app or npm create vite@latest -- --template react project produces an HTML file with a single <div id="root"></div> and a script tag. An agent sees an empty page. All content loads via JavaScript after the initial render.
This is the pattern behind the majority of agent-invisible sites we’ve audited. It’s not that the developers made bad choices — they made the default choice, and the default is invisible.
```shell
$ curl -s https://example-spa.com | wc -w
12
# Twelve words: the HTML boilerplate and a <noscript> tag. Zero content.
```
It’s not the framework — it’s the default
Every framework on this list can produce agent-readable output. The ranking isn’t about capability — it’s about defaults. The question is: when a new developer follows the getting-started guide, does the result work for agents?
This matters because defaults compound across an industry. If the most popular framework defaults to client-rendered SPAs, then most new sites will be invisible to agents. Not because developers chose invisibility, but because they chose the most popular framework.
The agent-native architectural pattern works regardless of framework:
| Principle | What it means |
|---|---|
| Content in the DOM | All meaningful text is in the server-rendered HTML. Not behind API calls. Not waiting for hydration. |
| Progressive enhancement | The page works without JavaScript. JS adds interactivity on top. Remove it and the content remains. |
| Semantic HTML | Headings, lists, tables, links with meaningful text. Structure that machines can parse without guessing. |
| Structured data | JSON-LD, Open Graph, meta descriptions. The machine-readable layer on top of human-readable content. |
| Discovery files | robots.txt, sitemap.xml, llms.txt, security.txt. The handshake files that tell agents how to navigate your site. |
| Clean URLs | /service/stripe not /#/service/stripe. Hash routing is invisible to HTTP GET. |
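As a concrete example of the discovery-file layer, a minimal robots.txt might look like this (the sitemap URL is a placeholder for your own domain):

```
# Allow all crawlers and agents; point them at the sitemap.
User-agent: *
Allow: /

Sitemap: https://yoursite.com/sitemap.xml
```

llms.txt and security.txt are served the same way: plain-text files at the site root that an agent can fetch with a single GET.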
The five-minute test
Before you ship anything that agents should be able to read, run these commands. They take five minutes and will tell you if your framework choice is working for you or against you.
```shell
# 1. Can an agent read your content?
curl -s https://yoursite.com | python3 -c "import sys; from html.parser import HTMLParser; ..."
# If < 100 words: your content is client-rendered

# 2. Do your discovery files exist?
for f in robots.txt sitemap.xml llms.txt; do curl -s -o /dev/null -w "$f: %{http_code}\n" https://yoursite.com/$f; done

# 3. Is your content in the DOM or behind JS?
curl -s https://yoursite.com | grep -c 'use client'
# High count on content pages = potential problem

# 4. Any noindex on content pages?
curl -s https://yoursite.com | grep -i 'noindex'

# 5. Structured data present?
curl -s https://yoursite.com | grep 'application/ld+json'
```
What we’d choose today
If we were starting a new developer-facing product today — documentation, an API directory, a developer portal — we’d choose Astro with React or Svelte islands for interactive components.
The reasoning is simple: Astro makes the agent-friendly path the default. You don’t need a “make it work for agents” checklist because the framework already did it. Every page is static HTML. Interactive components are explicit islands. Discovery files are built-in. The visual design ceiling is identical to any React framework because you can use React for the parts that need it.
If you’re already on Next.js, don’t rewrite. The App Router with React Server Components is solid when used with discipline. The key rules: keep data-fetching in Server Components, push 'use client' boundaries to leaves, and run curl against every page.
If you’re building a new SPA with Vite + React — stop. Pick a framework that server-renders by default. The cost of adding SSR to an SPA later is always higher than starting with it.
Where this is heading
Agent readability isn’t a nice-to-have anymore. It’s becoming a distribution channel. When an agent recommends tools, evaluates options, or routes between services, it starts with what it can read. Sites that are invisible to agents lose a growing share of discovery traffic.
The frameworks that default to agent readability will win the next decade of the web the same way mobile-responsive frameworks won the last one. We went from “does your site work on phones?” to “every site works on phones” in about five years. Agent readability is on the same curve.
The question for every frontend team isn’t whether to care about agents. It’s whether your framework makes it easy or makes you fight for it.
Use the agent-readable stack to launch one governed lane, not a giant control panel
Picking Astro or a disciplined server-first stack solves discovery. The next useful move is proving one bounded execution path, with clear capability scope and a real operator handoff, before you add more UI chrome.
The frontend stack decides discoverability, not fleet discipline
Astro or a disciplined server-first stack gets the page into the DOM, but the harder operator questions show up later: what breaks when calls become loops, how shared rate limits are governed across many workflows, and how credentials stay narrow after the UI is readable enough to invite real usage.
- What happens after the agent-readable frontend hands real work to an unattended loop.
- How readable frontends still need shared-budget discipline once many agents hit the same provider rails.
- Why a clean server-rendered experience still fails if credential scope, expiry, and rotation widen behind the scenes.
Methodology: All tests conducted via standard HTTP GET requests with readability extraction — the same approach used by LangChain, CrewAI, AutoGPT, and most agent frameworks. No JavaScript execution. Framework evaluations based on default project templates as of March 2026.