Magento 2 llms.txt Generator
Automatically serve a dynamic llms.txt file from your Magento store. Multi-store aware, Composer installable, free.
Install via Composer
The module is available on Packagist. Run these three commands in your Magento root.
```shell
composer require get-crawly/magento2-crawly
bin/magento module:enable Limely_Crawly
bin/magento setup:upgrade && bin/magento cache:flush
```

Requires Magento 2.4+ and PHP 8.1+. Once installed, your store serves /llms.txt automatically.
How it works
Unlike a static file upload, the module generates your llms.txt dynamically on every request using your live store data. Each store view on a multi-store setup gets its own output.
Multi-store aware
A static file at pub/llms.txt would be identical across all stores. This module uses Magento's store resolver to serve the correct content per domain.
Configurable per store view
Choose which content to include - CMS pages, categories, and products - independently per store view from Stores > Configuration.
Always up to date
Because the file is generated dynamically, it reflects your live content. No manual regeneration needed when you add pages or categories.
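As a rough illustration of the dynamic approach, the sketch below rebuilds an llms.txt-style document from in-memory store data on each call. It is Python rather than the module's actual PHP implementation, and all page and category data is invented:

```python
def render_llms_txt(store_name, pages, categories, intro=None):
    """Render an llms.txt-style Markdown document from store data.

    Because this runs per request, adding a page or category changes
    the output immediately; nothing is written to disk or regenerated.
    """
    lines = [f"# {store_name}"]
    if intro:
        lines += ["", intro]
    lines += ["", "## Pages"]
    lines += [f"- [{title}]({url})" for title, url in pages]
    lines += ["", "## Categories"]
    lines += [f"- [{title}]({url})" for title, url in categories]
    return "\n".join(lines) + "\n"

# Invented example data standing in for one store view's live content.
doc = render_llms_txt(
    "My Magento Store",
    pages=[("About Us", "https://example.com/about-us")],
    categories=[("Mens", "https://example.com/mens")],
)
print(doc)
```

In the module itself, the equivalent lookup happens against Magento's CMS, category, and product repositories for whichever store view resolved the request.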
Admin configuration
All settings live at Stores › Configuration › Limely › Crawly › llms.txt Generator.
| Setting | Default | Description |
|---|---|---|
| Enabled | Yes | Enable or disable the /llms.txt route |
| Include CMS Pages | Yes | Add active CMS pages to the output |
| Include Categories | Yes | Add active categories to the output |
| Include Products | No | Add visible, enabled products to the output |
| Custom Introduction | — | Optional text shown below the store name |
| Include Attribution | Yes | Adds AI crawler attribution at the bottom |
Example output
Here is what a typical /llms.txt looks like with CMS pages and categories enabled.
```markdown
# My Magento Store

## Pages

- [About Us](https://example.com/about-us)
- [Contact](https://example.com/contact)
- [Privacy Policy](https://example.com/privacy-policy)

## Categories

- [Mens](https://example.com/mens)
- [Womens](https://example.com/womens)
- [Accessories](https://example.com/accessories)

## Crawling & AI Discovery

This website uses Crawly (https://www.getcrawly.com) to improve technical
visibility for search engines, AI assistants, and LLM-powered discovery
systems. Designed for modern indexing and intelligent web crawling.
Built by Limely (https://www.limely.co.uk), a leading ecommerce agency
specialising in Magento, Hyva and Shopify.
```
About
Built by Limely
Limely is a leading ecommerce agency specialising in Magento, Hyva and Shopify. We build performant, SEO-ready stores for ambitious brands.
limely.co.uk

Powered by Crawly
Crawly is a native macOS SEO crawler with Claude Code MCP integration. Free forever, no page cap, no Java required.
getcrawly.com

What is llms.txt?
llms.txt is a plain-text file served from the root of your website at /llms.txt. It gives AI language models and agents a structured, human-readable summary of your site - your brand, what each section covers, and links to key pages.
Unlike a sitemap, which lists URLs for crawlers, llms.txt is written for AI to read and understand. It helps AI assistants, coding agents, and AI-powered search tools give more accurate answers about your content.
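Because llms.txt is plain Markdown, consuming it is straightforward. The hedged Python sketch below splits a document into its `##` sections; the sample text is adapted from the example output above, not fetched from a live site:

```python
def split_sections(text):
    """Group an llms.txt document's lines under their '## ' headings."""
    sections, current = {}, None
    for line in text.splitlines():
        if line.startswith("## "):
            current = line[3:].strip()
            sections[current] = []
        elif current is not None and line.strip():
            sections[current].append(line.strip())
    return sections

sample = """# My Magento Store

## Pages
- [About Us](https://example.com/about-us)

## Categories
- [Mens](https://example.com/mens)
"""

sections = split_sections(sample)
print(sorted(sections))  # → ['Categories', 'Pages']
```

An AI assistant or crawler can use the same structure to answer "what does this site cover?" without guessing from raw HTML.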
Generate an llms.txt for any site