We all optimize our websites for Google — but in 2026, there's something else that matters just as much.
How do you make your website readable and understandable by AI Agents like ChatGPT, Claude, and Copilot?
Why Does This Matter?
- More and more people ask AI first before visiting a website
- If the AI doesn't understand your pages, it will give wrong answers about your service
- This silently costs you leads and opportunities without you even noticing
This post is a quick guide on how to prepare your website for AI — using the same concept as SEO.
The examples use Next.js, but the ideas apply to any tech stack.
Step 1: Discovery Files
Make sure these files exist and are accessible:
- /robots.txt
- /sitemap.xml
- /llms.txt
- /sitemap.md
In your robots.txt, explicitly allow the AI crawlers you want to admit (OpenAI's, in this example) and add a Content-Signal line:
User-agent: OAI-SearchBot
Allow: /
User-agent: GPTBot
Allow: /
User-agent: ChatGPT-User
Allow: /
Content-Signal: ai-train=no, search=yes, ai-input=no
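Since the post's examples use Next.js, here is a minimal framework-agnostic sketch of how you might generate that robots.txt body from a route handler. The bot list and the helper name `buildRobotsTxt` are illustrative, not a required API:

```typescript
// Sketch: build a robots.txt body that allows specific AI crawlers
// and appends a Content-Signal line. The bot list is illustrative;
// adjust it to the agents you actually want to admit.
const ALLOWED_AI_BOTS = ["OAI-SearchBot", "GPTBot", "ChatGPT-User"];

function buildRobotsTxt(bots: string[]): string {
  const rules = bots
    .map((bot) => `User-agent: ${bot}\nAllow: /`)
    .join("\n\n");
  // Content Signals: allow search, opt out of training and AI input
  return `${rules}\n\nContent-Signal: ai-train=no, search=yes, ai-input=no\n`;
}

console.log(buildRobotsTxt(ALLOWED_AI_BOTS));
```

Generating the file from a list keeps the bot policy in one place, so adding or removing a crawler is a one-line change.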
Step 2: Markdown Negotiation
This is the single most important step in the whole guide.
When a request comes in with Accept: text/markdown, return a Markdown version of the same page.
When it's a regular browser request, serve normal HTML.
Your response should include:
- Content-Type: text/markdown; charset=utf-8
- Vary: Accept
- A Link header with rel="canonical"
- x-markdown-tokens (if available)
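The negotiation itself boils down to inspecting the Accept header. Below is a small sketch of that decision, framework-agnostic; in Next.js you would call something like this from middleware or a route handler. The helper name `wantsMarkdown` and the simplified parsing are assumptions, not a standard API:

```typescript
// Sketch: decide whether a request prefers a Markdown representation.
// Naive parse: checks whether text/markdown appears before text/html.
// A production version should honor q-values per the HTTP spec.
function wantsMarkdown(acceptHeader: string | null): boolean {
  if (!acceptHeader) return false;
  const types = acceptHeader
    .split(",")
    .map((t) => t.trim().split(";")[0]);
  const md = types.indexOf("text/markdown");
  const html = types.indexOf("text/html");
  return md !== -1 && (html === -1 || md < html);
}
```

When this returns true, serve the Markdown body with the response headers listed above; otherwise fall through to your normal HTML rendering. Always sending `Vary: Accept` keeps caches from serving Markdown to browsers.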
Step 3: Direct Markdown Links
Create Markdown versions of your important pages:
- /pricing.md
- /faq.md
- /about.md
- /contact.md
This makes it much easier for agents to read your content without HTML noise.
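Once those Markdown siblings exist, your llms.txt can point agents straight at them. A minimal sketch of generating that index is below; the `Page` shape, the site URL, and the page descriptions are placeholders, and the output follows the common llms.txt conventions (an H1 title plus a linked list):

```typescript
// Sketch: generate a minimal llms.txt index linking to the Markdown
// versions of key pages. Titles and notes here are placeholders.
interface Page {
  title: string;
  path: string;
  note: string;
}

function buildLlmsTxt(site: string, pages: Page[]): string {
  const links = pages
    .map((p) => `- [${p.title}](${site}${p.path}): ${p.note}`)
    .join("\n");
  return `# ${site}\n\n## Pages\n\n${links}\n`;
}
```

Driving both the .md routes and llms.txt from the same page list keeps the two from drifting apart.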
Step 4: Link Headers
Add Link headers to your HTML pages so agents can quickly discover important resources:
- rel="describedby" → pointing to llms.txt
- rel="sitemap" → pointing to sitemap.xml
- rel="service-doc" → if you have API docs
- rel="api-catalog" → if you have an API catalog
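The Link header value itself is just a comma-separated list of `<url>; rel="…"` entries. A small sketch of assembling it, with placeholder URLs and an assumed helper name:

```typescript
// Sketch: assemble a Link header value from (url, rel) pairs.
// Syntax follows RFC 8288: <url>; rel="value", comma-separated.
function buildLinkHeader(entries: Array<[string, string]>): string {
  return entries.map(([url, rel]) => `<${url}>; rel="${rel}"`).join(", ");
}

const linkHeader = buildLinkHeader([
  ["/llms.txt", "describedby"],
  ["/sitemap.xml", "sitemap"],
]);
// Set it on the response, e.g. res.setHeader("Link", linkHeader)
```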
Step 5: Structured Data + Canonical
On every page make sure you have:
- A clear Canonical URL
- Appropriate JSON-LD schema (WebSite / Organization / Breadcrumb / FAQ)
- Clean heading structure (H1 → H2/H3)
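For the JSON-LD piece, a minimal Organization example looks like this. The field values are placeholders for your own name and domain; embed the serialized object in a `<script type="application/ld+json">` tag in the page head:

```typescript
// Sketch: minimal Organization JSON-LD, serialized for embedding.
// Name and URL are placeholders; extend with logo, sameAs, etc.
const organizationJsonLd = {
  "@context": "https://schema.org",
  "@type": "Organization",
  name: "Example Co",
  url: "https://example.com",
};

const jsonLdScript =
  `<script type="application/ld+json">` +
  JSON.stringify(organizationJsonLd) +
  `</script>`;
```

The same pattern covers the WebSite, Breadcrumb, and FAQ schemas mentioned above; only the `@type` and fields change.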
Step 6: Test It Manually
Test with curl:
# Test discovery files
curl https://yoursite.com/robots.txt
curl https://yoursite.com/llms.txt
curl https://yoursite.com/sitemap.xml
# Test Markdown negotiation
curl -H "Accept: text/markdown" https://yoursite.com/about
# Test normal HTML
curl -H "Accept: text/html" https://yoursite.com/about
You can also use this evaluation tool to see your score and gaps instantly:
AgentSEO Score Checker
Summary
SEO today isn't only for search engines — it's also for AI Agents.
If your website isn't ready for machine reading, AI will return incomplete or incorrect information about you.
And that directly affects your reputation and conversion rates.
Start small: add llms.txt, update robots.txt, and enable Markdown negotiation. Those three steps alone make a big difference.
Originally published on LinkedIn.
