TL;DR: llms.txt is a proposed web standard that sounds revolutionary but currently has zero support from major AI companies. While the hype suggests it could change how AI understands websites, the reality is that no major LLM provider actually uses these files. At Hungry Ram Web Design, we're monitoring the situation and will implement it for clients when (and if) it gains legitimate traction.
The Origin Story: Good Intentions, Limited Impact
In September 2024, Jeremy Howard, co-founder of Answer.AI, proposed something that seemed like a game-changer for the AI-driven web: llms.txt. The idea was simple: add a markdown file at a website's root that gives LLMs brief background information, guidance, and links to more detailed markdown files they can use at inference time.
The problem Howard aimed to solve was real. Large language models increasingly rely on website information, but face a critical limitation: context windows are too small to handle most websites in their entirety. Converting complex HTML pages with navigation, ads, and JavaScript into LLM-friendly plain text is both difficult and imprecise.
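To make the proposal concrete, here is a minimal sketch of what an llms.txt file looks like under Howard's proposed format: an H1 with the site name, a blockquote summary, and H2 sections listing markdown links. The file sits at the site root (for example, example.com/llms.txt), and every name and URL below is a placeholder rather than a real site.

```markdown
# Example Site

> Example Site builds marketing websites for small businesses. This file points
> LLMs to clean markdown versions of our most important pages.

## Docs

- [Services overview](https://example.com/services.md): What we offer and typical pricing
- [Case studies](https://example.com/case-studies.md): Results from past client projects

## Optional

- [Company history](https://example.com/about.md): Background on the team
```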
The Hype Train Gains Steam
What happened next looked like a rapid-adoption success story. Uptake remained niche until November, when Mintlify rolled out support for /llms.txt across all of the documentation sites it hosts. Practically overnight, thousands of docs sites, including Anthropic's and Cursor's, began serving llms.txt files.
Suddenly, SEO tools and marketing platforms were promoting llms.txt as the "next big thing." WordPress plugins emerged, with one reaching over 3,000 downloads in just three months, and industry publications started calling the file essential for AI optimization.
The Reality Check: Nobody's Actually Using It
Here's where the story takes a turn. Despite all the excitement, according to an analysis from Ahrefs, "no major LLM provider currently supports llms.txt. Not OpenAI. Not Anthropic. Not Google."
Even more telling, Google's John Mueller weighed in on llms.txt, saying it reminds him of the old meta keywords tag, the same outdated SEO hack we all left behind in 2010. That's not exactly a ringing endorsement from one of the web's most influential voices.
The adoption numbers tell the real story. A comprehensive scan found only 3 llms.txt implementations among the top 1,000 websites, an adoption rate of 0.3%. For context, that's essentially statistical noise.
Why Major AI Companies Aren't Interested
The lack of adoption by AI companies isn't accidental—it's strategic. Here's why:
1. They Already Have Better Solutions
Major AI companies like OpenAI, Google, and Anthropic have invested billions in sophisticated web crawling and content processing systems. They don't need website owners to manually curate content for them.
2. Control Issues
Website owners already face traffic challenges without an effective protocol for managing AI interactions: recent studies show Google's AI Overviews reduced organic clicks by 34.5%, and roughly 80% of companies actively block AI crawlers. Against that backdrop, AI companies prefer to maintain control over what they access rather than rely on potentially biased, self-reported content.
3. Scale Problems
AI models need to work across the entire web, not just the small percentage of sites that might implement llms.txt. Building systems that depend on optional website cooperation doesn't scale.
The Technical Reality
Even where llms.txt files exist, server log analyses show that no major AI crawler, Googlebot and Gemini included, is actually requesting them. Even sites with fully implemented files see no sign of those files ever being accessed.
This means the entire premise—that AI systems will respect and use these files—is currently false. You might as well be optimizing for a search engine that doesn't exist.
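You don't have to take anyone's word for this. If your server keeps standard access logs, a short script like the sketch below will show whether anything has ever requested your llms.txt file. The log path and the combined-log format it assumes are placeholders; adjust both to match your own server.

```python
import re
from collections import Counter

# Hypothetical path; substitute your own server's access log location.
LOG_PATH = "/var/log/nginx/access.log"

# Matches any request for /llms.txt or /llms-full.txt in a combined-format log line.
LLMS_REQUEST = re.compile(r'"(?:GET|HEAD) /llms(?:-full)?\.txt')

hits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if LLMS_REQUEST.search(line):
            # The user agent is the last quoted field in a combined-format log line.
            quoted_fields = re.findall(r'"([^"]*)"', line)
            hits[quoted_fields[-1] if quoted_fields else "unknown"] += 1

if not hits:
    print("No requests for llms.txt found in this log.")
else:
    for agent, count in hits.most_common():
        print(f"{count:5d}  {agent}")
```

In our experience, runs like this come back empty, which is exactly the point: the file is being published, not fetched.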
Why the Marketing Hype Persists
So why do we keep hearing about llms.txt? Several factors are at play:
1. Tool Vendor Interest
SEO tool companies need new features to sell. llms.txt generators, checkers, and optimization tools have sprouted up everywhere, each promising to future-proof your website.
2. Consultant Revenue
Digital marketing consultants can charge for implementing "cutting-edge AI optimization" strategies, even when those strategies have no proven benefit.
3. Fear of Missing Out
Website owners worry that if they don't implement llms.txt now, they'll be behind when AI companies eventually adopt it. This fear drives adoption despite the lack of evidence.
Our Position at Hungry Ram Web Design
As a web design agency that stays on top of emerging technologies, we've evaluated llms.txt thoroughly. Here's our stance:
We're Not Implementing It Yet
Right now, implementing llms.txt for clients would mean selling a solution to a problem that doesn't exist. We prefer to focus our clients' resources on proven strategies that actually drive results.
We're Monitoring Developments
We've seen this play out before—robots.txt wasn't universally respected on day one either, but now it's a web standard. If major AI companies announce support, we'll be ready to implement immediately.
We'll Adapt When Necessary
The moment we see legitimate adoption signals—like official announcements from OpenAI, Google, or Anthropic—we'll integrate llms.txt into our client workflows. Until then, we're focusing on strategies that work today.
What Actually Matters for AI Optimization
Instead of chasing llms.txt, here's what we recommend for clients who want their content to perform well with AI systems:
1. Clean, Structured Content
Write clear, well-organized content that's easy for both humans and AI to understand. Use proper headings, concise paragraphs, and logical information hierarchy.
2. Fast, Accessible Websites
AI crawlers, like search engines, prefer fast-loading, accessible websites. Focus on Core Web Vitals and technical SEO fundamentals.
3. Comprehensive Coverage
Create thorough, authoritative content on your topics. AI systems favor comprehensive resources over thin content.
4. Regular Updates
Keep your content fresh and accurate. AI systems, like search engines, prefer up-to-date information.
5. Schema Markup That Actually Works
Unlike llms.txt, Schema.org structured data is already being used by AI systems. Implement relevant schema types like Article, FAQ, HowTo, and Organization markup to help AI understand your content context and relationships. This proven standard has been working for years and continues to influence how AI systems interpret web content.
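For instance, a minimal Article block embedded as JSON-LD in a page's head might look like the sketch below. Every headline, name, date, and URL in it is a placeholder, not real data; swap in your own values and validate the result with a structured-data testing tool.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Placeholder article title",
  "author": {
    "@type": "Organization",
    "name": "Example Agency"
  },
  "datePublished": "2025-01-15",
  "dateModified": "2025-06-01",
  "mainEntityOfPage": "https://example.com/blog/placeholder-article"
}
</script>
```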
6. Solid SEO Fundamentals
The foundation of AI optimization is still good old-fashioned SEO. Search engines and AI systems both reward well-optimized websites with clear site architecture, proper internal linking, optimized meta tags, and quality content. Rather than chasing unproven trends like llms.txt, focus on comprehensive SEO strategies that have demonstrated results for over two decades. At Hungry Ram Web Design, we build websites with SEO best practices baked in from day one, ensuring your content performs well across all platforms—current and future.
The Bottom Line
llms.txt represents an interesting idea that suffers from a fundamental flaw: it assumes AI companies want website owners to tell them what content to prioritize. The evidence suggests they don't.
As one industry expert put it, "It's more like 'quietly simmering on the back burner'—it's not taking off right now, but it's not useless either."
At Hungry Ram Web Design, we believe in evidence-based optimization strategies. While we'll continue monitoring llms.txt developments, we won't recommend implementing it until we see concrete evidence of AI platform adoption.
Our clients deserve strategies that work today, not tomorrow's maybes. When llms.txt becomes genuinely useful, you can trust that we'll be among the first to implement it properly. Until then, we're focusing on the fundamentals that actually drive results.
Want to discuss AI optimization strategies that actually work? Contact Hungry Ram Web Design for a consultation on proven digital marketing approaches that deliver real results.