What Is llms.txt?
The SEO File AI Systems Actually Read
By Lesli Rose · April 14, 2026 · 9 min read
Your website has a robots.txt that tells search engines where they can go. It has a sitemap.xml that lists every page you want indexed. Now there is a third file that belongs in your SEO foundation: llms.txt. It tells AI systems what your business actually is. If you are doing SEO in 2026 without it, you are leaving a gap that your competitors will fill.
The Three Files Every Site Needs
For twenty years, the SEO foundation for any website has included two root-level files: robots.txt (access control) and sitemap.xml (page discovery). These files communicate directly with crawlers in a language they understand. They are not for humans. They are infrastructure.
llms.txt is the third file in that stack. It serves the same purpose -- communicating directly with machines -- but for a new type of machine. While robots.txt talks to Google's crawler and sitemap.xml talks to Google's indexer, llms.txt talks to AI systems like ChatGPT, Claude, Perplexity, and Google's AI Overviews.
- robots.txt -- tells crawlers where they can and cannot go. Access control.
- sitemap.xml -- lists all your pages with priority and update frequency. Page discovery.
- llms.txt -- summarizes your business in plain text. Business context for AI.
Each file does a different job. Together, they give every type of machine -- traditional crawlers and AI systems -- everything they need to find, understand, and accurately represent your business.
Why AI Systems Need a Separate File
Google's crawler reads HTML. It follows links. It parses structured data. It has had twenty years to get good at understanding web pages built for humans. AI systems are different. They are trying to answer a question: "What does this business do, and should I recommend it?"
To answer that, an AI system has to piece together information from your homepage, about page, service pages, review profiles, directory listings, and social media. The result is often incomplete. Sometimes it is wrong. An AI might describe your business based on a blog post from three years ago or a directory listing with an old address.
llms.txt solves this by giving AI systems a single, authoritative source of truth. Instead of guessing what your business does based on scattered signals, the AI reads one file and gets the complete picture. Written by you. Updated by you. Accurate because you control it.
How to Create Your llms.txt
Creating an llms.txt file takes about thirty minutes. The format is plain text with markdown-style headers. No HTML, no JSON, no code. Just clean, organized information about your business.
Step 1: Write the summary
Start with a one-paragraph description at the top. This is the most important part. It should answer: Who are you? What do you do? Where are you? Who do you serve?
# Your Business Name
> One-paragraph summary. Business name, what you do, where you are located, who you serve. Keep it factual. No marketing language.
Step 2: Add structured sections
After the summary, add sections for the key details AI systems need. Use ## headers and bullet points.
## Services
- Service one
- Service two
- Service three
## Location
City, Province/State. Areas served.
## Team
- Name, Title -- credentials or experience
## Pricing
- Service: starting at $X
## Links
- About: https://yourdomain.com/about
- Services: https://yourdomain.com/services
- Contact: https://yourdomain.com/contact
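If you already keep your business details in one place (a config file or a CMS export), you can assemble the file programmatically instead of hand-editing it. A minimal sketch in Python -- every business name, service, and URL below is a placeholder, not a real example:

```python
# Minimal sketch: build an llms.txt body from a dictionary of business
# details, mirroring the template above. All values are placeholders.
business = {
    "name": "Acme Plumbing",
    "summary": "Acme Plumbing is a licensed plumbing company in Fredericton, "
               "New Brunswick, serving homeowners across the greater area.",
    "services": ["Emergency plumbing", "Drain cleaning", "Water heater repair"],
    "location": "Fredericton, New Brunswick. Serving Fredericton and Oromocto.",
    "links": {
        "About": "https://yourdomain.com/about",
        "Services": "https://yourdomain.com/services",
        "Contact": "https://yourdomain.com/contact",
    },
}

def build_llms_txt(biz: dict) -> str:
    """Assemble an llms.txt body: # title, > summary, then ## sections."""
    lines = [f"# {biz['name']}", "", f"> {biz['summary']}", ""]
    lines += ["## Services"] + [f"- {s}" for s in biz["services"]] + [""]
    lines += ["## Location", biz["location"], ""]
    lines += ["## Links"] + [f"- {k}: {v}" for k, v in biz["links"].items()]
    return "\n".join(lines) + "\n"

# To produce the file, write the assembled string to your site root:
# with open("llms.txt", "w", encoding="utf-8") as f:
#     f.write(build_llms_txt(business))
```

The payoff is that your llms.txt stays in sync with wherever you already maintain the underlying facts, instead of becoming one more copy to keep updated by hand.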
Step 3: Place the file
Save the file as llms.txt and place it in your website's root directory so it is accessible at yourdomain.com/llms.txt. If you use WordPress, upload it to the same directory as your wp-config.php. If you use a framework like Next.js, put it in the public folder. The server should return it with a Content-Type of text/plain.
Step 4: Test it
Open your browser and go to yourdomain.com/llms.txt. You should see the plain text. If you see a 404 or your homepage, the file is not in the right place. Fix the path and test again.
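The browser check above can also be done in a few lines of Python, which is handy if you manage several sites. A sketch using only the standard library -- the domain in the usage comment is a placeholder for your own:

```python
# Fetch a site's /llms.txt and report what the server actually returns:
# the HTTP status, the declared content type, and the first line of the body.
from urllib.request import urlopen
from urllib.error import HTTPError

def check_llms_txt(url: str) -> dict:
    """Return status, content type, and first line for the file at url."""
    try:
        with urlopen(url, timeout=10) as resp:
            body = resp.read().decode("utf-8", errors="replace")
            return {
                "status": resp.status,
                "content_type": resp.headers.get_content_type(),
                "first_line": body.splitlines()[0] if body else "",
            }
    except HTTPError as err:
        # A 404 here means the file is missing or in the wrong directory.
        return {"status": err.code, "content_type": "", "first_line": ""}

# Usage (placeholder domain):
#   report = check_llms_txt("https://yourdomain.com/llms.txt")
# You want status 200, content_type "text/plain", and a first line
# that starts with "# " followed by your business name.
```

If the status is 200 but the first line looks like HTML, your server is returning the homepage instead of the file, which is the same failure mode as the browser test.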
What to Include (and What to Leave Out)
Include:
- Business name and one-line description
- Location and areas served
- Services or products with brief descriptions
- Team members with titles and credentials
- Pricing (ranges are fine)
- Links to your 5-8 most important pages
- Hours of operation
- Certifications, awards, or notable credentials
Leave out:
- Marketing language ("best in the city," "world-class service")
- Testimonials or review excerpts
- Internal jargon or acronyms
- Promotional offers or temporary discounts
- Anything that changes weekly -- this file should be stable
How llms.txt Fits Into Your SEO Stack
If you think about your SEO foundation as layers, llms.txt sits right alongside robots.txt and sitemap.xml at the infrastructure layer. It is not a content strategy. It is not a ranking factor (yet). It is plumbing -- the kind of work that does not feel exciting but makes everything else work better.
Here is how the full stack works together:
Layer 1: Access & Discovery
robots.txt, sitemap.xml, llms.txt, AI crawler directives
Layer 2: On-Page Structure
Title tags, meta descriptions, heading hierarchy, internal links
Layer 3: Structured Data
Schema markup (LocalBusiness, FAQPage, Article, Service)
Layer 4: Content
Answer-first formatting, topic authority, depth
Layer 5: Off-Site Signals
Reviews, citations, backlinks, directory listings
Most businesses jump straight to Layers 4 and 5 -- content and backlinks. But without Layers 1-3 in place, you are building on a weak foundation. llms.txt is a Layer 1 file. It takes thirty minutes to create and costs nothing. There is no reason not to have one.
Real Example: Before and After
A plumbing company in Fredericton has a solid website, good reviews, and decent local rankings. But when someone asks ChatGPT "who does emergency plumbing in Fredericton?" the AI cobbles together an answer from Yelp, Yellow Pages, and a three-year-old blog post. The business name is right but the services listed are incomplete and the phone number is the old one.
After adding an llms.txt file with current services, the correct phone number, hours, and a direct link to the emergency services page, AI systems have a clean, authoritative reference. The next time someone asks, the AI pulls from the llms.txt and gives an accurate, complete answer. Same business. Same website. Better AI representation because the business gave AI the information directly.
When to Update It
Update your llms.txt whenever you:
- Add or remove a service
- Change your pricing
- Move to a new location
- Add a new team member
- Launch a new section of your website
- Get a new certification or credential
- Change your phone number or email
Set a calendar reminder to review the file quarterly, even if nothing has changed. A quick read-through takes two minutes and catches anything that has drifted out of date.
The Bottom Line
llms.txt is infrastructure. It belongs in your SEO foundation alongside robots.txt and sitemap.xml. It is not glamorous work. Nobody will compliment your llms.txt file. But it gives AI systems the clean, authoritative context they need to accurately describe and recommend your business. And in a world where AI is increasingly the first place people go for recommendations, that matters.
Create the file. Place it at your domain root. Update it when things change. It is one of the highest-value, lowest-effort things you can do for your AI visibility right now.
Frequently Asked Questions
What is llms.txt in simple terms?
A plain-text file at the root of your website that gives AI systems a structured summary of your business. Name, location, services, team, pricing, and key links. Think of it as a resume for your business that AI can read in seconds.
Is llms.txt part of SEO?
Yes. It sits at the infrastructure layer alongside robots.txt and sitemap.xml. Traditional SEO talks to Google crawlers. llms.txt extends that to AI systems. If you maintain a robots.txt and sitemap, adding llms.txt is the natural next step.
How do I create an llms.txt file?
Create a plain-text file named llms.txt with markdown-style headers (## Services, ## Location, ## Links). Write factual, concise descriptions. Place the file in your website root so it loads at yourdomain.com/llms.txt. Takes about thirty minutes.
What is the difference between llms.txt and robots.txt?
robots.txt controls access -- it tells crawlers which pages they can visit. llms.txt provides context -- it tells AI what your business is. robots.txt is a bouncer. llms.txt is a tour guide. You need both.
Does llms.txt replace schema markup?
No. Schema markup gives page-level structured data. llms.txt gives a site-level business summary. Schema is the floor plan for each room. llms.txt is the property listing for the whole building. Use both.
Need Help With Your SEO Foundation?
I'll audit your robots.txt, sitemap, schema, and llms.txt -- then set up everything AI systems need to find and recommend you. Free audit to start.
Get Your AI Visibility Audit