CMS Chronicle · March 2026 · 12 min read

The Age of the Agentic CMS Is Here. We're Building For It.

The CMS landscape is undergoing its biggest shift since WordPress. Here's what's happening — and why we built @webhouse/cms from scratch to meet this moment.

Every few years, the CMS world goes through a genuine tectonic shift. The first was the move from static HTML to dynamic CMS platforms in the early 2000s. The second was the rise of headless architecture around 2018. The third is happening right now — and it's bigger than either of those.

We're watching the emergence of the Agentic CMS: content management systems where AI agents don't just assist with writing — they operate as full members of the content team, capable of creating, editing, publishing, and optimizing content autonomously.

We've spent the last year studying this shift, and we built @webhouse/cms specifically because we believe the current generation of CMS platforms — no matter how good they are — weren't designed for what's coming. This post explains what we're seeing in the market and how our architecture addresses it.


What's Actually Happening in the CMS World Right Now

Everyone is racing to bolt AI onto their CMS

The arms race is real. In the last 18 months, virtually every major CMS vendor has scrambled to add AI capabilities:

  • Optimizely launched Opal, an AI agent orchestration platform
  • Sitecore rebranded entirely to Sitecore.ai
  • Kentico introduced AIRA, their Agentic Marketing Suite
  • Storyblok shipped Strata and FlowMotion for content vectorization
  • Kontent.ai is pushing a built-in AI Agent
  • Contentstack now calls itself an "Agentic Experience Platform"
  • Umbraco built an MCP Server for content operations

These are serious platforms with talented teams. But we noticed something: they're all adding AI to architectures that were designed before AI agents existed. The agent has to navigate a system that was built for humans clicking through admin panels.

That's the gap we're building to fill.

MCP has become the universal connector

Model Context Protocol — introduced by Anthropic in late 2024 and now hosted by The Linux Foundation — has gone from proposal to essential infrastructure in barely eighteen months. It standardizes how AI agents discover capabilities and invoke tools, and the CMS world has adopted it fast.

Sanity has an official MCP Server. Umbraco's Developer MCP Server exposes their entire Management API. Strapi, CrafterCMS, and dotCMS all have MCP integrations in production or in the pipeline. WordPress VIP has published enterprise implementation guides.

The implication is simple: if your CMS has a well-designed MCP server, it becomes instantly accessible to every MCP-compatible AI client — Claude, Cursor, Claude Code, and whatever comes next. No custom integrations needed. The agent discovers what the CMS can do and starts working.

We took this seriously. More on our approach below.

llms.txt is making sites AI-discoverable

A quieter but potentially transformative standard is gaining traction: llms.txt. It's a Markdown file at your site's root that tells AI systems which pages matter most — a curated, token-efficient index for language models.

The early evidence is compelling. A German agency submitted their llms.txt to Google Search Console, and within three days it was being cited as the primary source in Google AI Mode responses. Six AI and search bots crawled the file within four days — none invited via sitemap. Yoast has already added one-click generation. Cloudflare, Vercel, and Netlify have published guides.

Adoption is still very early (less than 0.005% of sites), but the trajectory is clear. We believe every site should be AI-discoverable by default — not as an afterthought.
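For reference, a minimal llms.txt follows the proposed format: an H1 site title, a blockquote summary, then H2 sections listing annotated links. The domain and pages below are invented purely for illustration:

```
# Example Co

> Example Co builds warehouse automation software. This index highlights the
> pages most useful to AI systems.

## Docs

- [Product overview](https://example.com/product.md): what the platform does
- [Pricing](https://example.com/pricing.md): current plans and limits

## Optional

- [Blog](https://example.com/blog.md): announcements and deep dives
```

Because it's plain Markdown at a fixed path, any crawler or agent can fetch and parse it without special tooling.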

Headless won. The question now is which kind.

The headless CMS debate is settled. According to WP Engine, 73% of organizations already use headless, and 98% of the rest plan to evaluate it within the next year. The market is projected to grow from $2.38 billion in 2025 to nearly $3 billion in 2026.

The top tier has consolidated around Sanity, Contentful, Strapi, Storyblok, and Hygraph. WordPress still powers 43% of the web, but half of its users report that publishing takes over an hour. Ghost dominates publishing. Webflow owns visual design. Payload CMS is rising as a TypeScript-native alternative. And Astro has become the default frontend for performance-focused static sites.

What nobody has done yet is build a CMS from scratch that is simultaneously AI-native, static-first, MCP-enabled, and distributed as a simple npm package.

That's what we're doing.


Why We Built @webhouse/cms

We didn't set out to build another headless CMS. We set out to build the CMS engine that AI agents would choose for themselves — if they could choose.

The core thesis: the CMS engine should be a reusable, embeddable TypeScript library that any AI coding agent can install and wire into a freshly generated web project. The engine handles everything the AI shouldn't reinvent each time: content modeling, persistence, media pipelines, AI orchestration, and static output generation.

This inverts the traditional relationship. Instead of a human-first system that AI can also use, @webhouse/cms is an AI-first system that humans can also use — complete with a full admin dashboard for visual editing when you need it.

Two modes, one package

We ship two operating modes in a single npm package:

Standalone Mode gives you a full site builder — routing, generative themes, admin dashboard, and built-in hosting. Think of it as a complete WordPress replacement that outputs pure static HTML. Run npx @webhouse/cms init and you have a working CMS in under 60 seconds.

Headless SDK Mode gives you a content API and embeddable editor for existing frameworks. Bring your own Next.js, Astro, or Node.js frontend; the engine provides structured content and AI agents via API.

This means we serve both solo founders who want a working site in minutes and development teams integrating structured content into custom architectures — all from the same codebase.


The Architecture Decisions That Define Us

AI-native, not AI-added

Every CMS vendor in 2026 has AI features. The difference is where in the stack the intelligence lives.

We built a provider-agnostic AI orchestration layer directly into the content pipeline. Four specialized agents handle different domains:

  • Content Agent — generates, rewrites, translates, expands, compresses, and adapts content against the full collection schema. It doesn't just write — it writes content that fits your site's structure and voice.
  • Design Agent — works at the design token layer, generating complete visual systems from brand inputs (logo, colors, industry, tone). Every site looks unique but is structurally sound.
  • SEO Agent — generates metadata, schema markup, internal linking suggestions, and sitemap optimization.
  • Media Agent — handles AI image generation, video generation, infographics, responsive image sets, and alt text generation.

The AI layer uses a Provider Registry supporting Anthropic, OpenAI, local models via Ollama, and specialized providers like Flux and Runway. Automatic fallback, rate limiting, cost tracking, and response caching are built in. If every AI service goes down, your site still serves perfectly — static HTML doesn't need an API to load.
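As a rough illustration of the ordered-fallback pattern described above (the class and method names here are invented for the sketch, not the engine's actual API):

```typescript
// Hypothetical sketch: a registry that tries providers in priority order
// and falls back to the next one when a call fails.

interface AIProvider {
  name: string;
  generate(prompt: string): Promise<string>;
}

class ProviderRegistry {
  constructor(private providers: AIProvider[]) {}

  async generate(prompt: string): Promise<string> {
    let lastError: unknown;
    for (const provider of this.providers) {
      try {
        // First provider that succeeds wins.
        return await provider.generate(prompt);
      } catch (err) {
        lastError = err; // remember the failure, try the next provider
      }
    }
    throw new Error(`All providers failed: ${String(lastError)}`);
  }
}
```

Rate limiting, cost tracking, and response caching would layer on top of the same interface; the key design point is that callers never name a specific vendor.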

Dual MCP: our most opinionated decision

This is the architectural choice we're most proud of. While most CMS platforms offer a single MCP endpoint (if any), we ship two distinct MCP servers because the use cases are fundamentally different:

| | @webhouse/cms-mcp-client | @webhouse/cms-mcp-server |
| --- | --- | --- |
| Who it's for | Any AI agent on the internet | Authenticated owners and editors |
| Auth | None — public content only | Bearer token / OAuth 2.1 + PKCE |
| What it can do | Read published content | Read, write, publish, generate, build |
| Where it runs | Bundled with every built site | Standalone admin service |
| Endpoint | yoursite.com/mcp | cms.yoursite.com/mcp |
| Primary consumers | Perplexity, ChatGPT, Claude web | Claude iOS, Cursor, Claude Code |

The public MCP server (cms-mcp-client) is embedded automatically in every site we build. Any AI agent on the internet can discover it and query your published content — structured, clean, without API keys. Six tools cover site discovery, collection browsing, full-text search, page retrieval, schema introspection, and full export.

This matters because it makes your content a first-class citizen in the AI-mediated internet. When someone asks an AI assistant about your industry, your site can answer with structured data — not scraped HTML fragments.

The authenticated MCP server (cms-mcp-server) is where the real magic happens. It enables trusted AI clients to create, edit, publish, and generate content through natural conversation. Picture this:

You open Claude on your phone and say: "Write a blog post about our new warehouse automation feature and publish it."

The agent calls get_site_summary to understand your site. Calls get_schema("blog") to learn the required fields. Calls generate_with_ai to produce a draft that matches your collection schema and brand voice. Creates the document, presents it for your approval, publishes it, and triggers an incremental build.

Your blog post is live. You never opened a browser.
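The sequence above can be sketched as an ordered tool-call plan. The first three tool names are the ones named in this post; the last three are hypothetical placeholders for the create/publish/build steps:

```typescript
// Illustrative only: the shape of the tool-call sequence an MCP client
// might issue for "write a blog post and publish it".

type ToolCall = { tool: string; args: Record<string, unknown> };

function planPublishFlow(topic: string): ToolCall[] {
  return [
    { tool: "get_site_summary", args: {} },                         // learn the site
    { tool: "get_schema", args: { collection: "blog" } },           // learn required fields
    { tool: "generate_with_ai", args: { collection: "blog", topic } }, // draft content
    { tool: "create_document", args: { collection: "blog", status: "draft" } }, // hypothetical
    { tool: "publish_document", args: { collection: "blog" } },     // hypothetical
    { tool: "trigger_build", args: { incremental: true } },         // hypothetical
  ];
}
```

In practice the agent chooses these calls itself from the server's advertised tool list; the plan here just makes the ordering explicit.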

We haven't seen any other CMS offer this dual-layer approach.

Static-first output

The production artifact is always pre-rendered HTML + CSS + minimal JavaScript. No runtime framework in the browser unless you explicitly opt in. Lighthouse scores of 95+ are a design goal, not a happy accident.

When you need interactivity — contact forms, shopping carts, search — we use Interactive Islands with Preact (3KB vs React's 40KB). Each island hydrates independently. The rest of the page stays static.

This philosophy aligns with the performance direction of the entire industry. Edge caching, sub-100ms page loads, zero JavaScript by default. Your content performs because there's nothing to slow it down.

Built-in AI discoverability

Every site built with @webhouse/cms automatically generates at build time:

  • llms.txt and llms-full.txt for AI agent discovery
  • JSON Feed + RSS for syndication
  • JSON-LD Schema.org injection per collection type
  • ai-index.json as a structured machine-readable manifest
  • /.well-known/ai-plugin.json advertising the MCP endpoint
  • Plain text exports (/slug.txt) for every page

No plugins. No configuration. No separate optimization step. Your site is discoverable by AI agents the moment it's built.
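To make the manifest idea concrete, here is one possible shape for ai-index.json. Every field name below is illustrative; the post does not specify the manifest's actual schema:

```
{
  "name": "Example Co",
  "baseUrl": "https://example.com",
  "mcpEndpoint": "https://example.com/mcp",
  "llms": "/llms.txt",
  "collections": [
    { "name": "blog", "schema": "/schemas/blog.json", "feed": "/blog/feed.json" }
  ]
}
```

The point is that an agent gets one structured entry point instead of crawling HTML to infer what the site contains.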

We believe this will matter enormously in the next two years as AI-mediated search becomes a primary traffic source. The sites that are easiest for AI to understand will be the ones that get cited most. We're making sure every @webhouse/cms site is in that position from day one.


The AI Lock System: Governance Built In

One concern we hear constantly about agentic CMS platforms: "How do I trust AI not to change the wrong things?"

Our answer is the AI Lock system. It provides field-level protection ensuring that AI agents can never modify certain fields — prices, legal text, verified data, anything you designate — unless a human explicitly unlocks them.

This isn't a configuration option. It's an architectural invariant enforced at the engine level. Every write operation passes through a WriteContext that identifies whether the actor is human or AI. Locked fields reject AI writes with a clear error. The principle is absolute: AI agents can never unlock fields — only users can.

We built this because we believe the agentic CMS future requires trust infrastructure, not just capability. The AI should be powerful. It should also have clear boundaries.
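A minimal sketch of how such an invariant can be enforced at write time (the type and function names here are hypothetical, not the engine's real API):

```typescript
// Illustrative sketch of field-level AI locks: every write carries a
// WriteContext identifying the actor, and locked fields reject AI writes.

type Actor = "human" | "ai";

interface WriteContext {
  actor: Actor;
}

interface CmsDocument {
  fields: Record<string, unknown>;
  lockedFields: Set<string>; // fields AI agents may never modify
}

function applyWrite(
  doc: CmsDocument,
  updates: Record<string, unknown>,
  ctx: WriteContext
): void {
  for (const key of Object.keys(updates)) {
    if (ctx.actor === "ai" && doc.lockedFields.has(key)) {
      throw new Error(`Field "${key}" is AI-locked; a human must unlock it first`);
    }
  }
  Object.assign(doc.fields, updates);
}

// The lock set itself is only writable by humans, so an agent cannot
// unlock a field and then write to it.
function setLock(doc: CmsDocument, field: string, locked: boolean, ctx: WriteContext): void {
  if (ctx.actor !== "human") {
    throw new Error("Only human users can change AI locks");
  }
  if (locked) {
    doc.lockedFields.add(field);
  } else {
    doc.lockedFields.delete(field);
  }
}
```

Because both checks live in the write path rather than the UI, they hold for every entry point: admin dashboard, API, or MCP tool call.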


Commerce: The Shop Plugin

We're building a companion package — @webhouse/cms-plugin-shop — that extends the engine with e-commerce capabilities:

  • Physical and digital products
  • Subscriptions and memberships with content gating
  • A course platform with video streaming via Mux
  • Advanced commerce features in later phases

The plugin hooks into our extension system rather than modifying core — maintaining clean architectural boundaries. Stripe is the source of truth for pricing and payment state. The CMS owns content and presentation. AI agents can manage product descriptions, course content, and marketing copy without ever touching payment logic.

This means you get a complete content + commerce platform from a single ecosystem, without the bloat of trying to turn a CMS into Shopify or vice versa.


Where We Fit in the Market

We see the CMS landscape mapping across two axes: architectural modernity and AI integration depth.

| | AI Bolted-On | AI-Native |
| --- | --- | --- |
| Monolithic | WordPress + AI plugins, Wix, Squarespace | (largely empty) |
| Composable | Sanity, Contentful, Storyblok, Strapi, Sitecore.ai | @webhouse/cms |

The enterprise players are composable with bolted-on AI. The indie/developer platforms have great DX but limited AI-native capabilities. The monolithic builders are adding AI features to fundamentally rigid architectures.

We occupy the composable + AI-native quadrant with a combination nobody else offers:

  • Dual-mode (standalone + headless SDK) in one npm package
  • Dual MCP (public read + authenticated read/write) out of the box
  • Static-first output with 95+ Lighthouse as a design target
  • AI discoverability (llms.txt, Schema.org, ai-index.json) generated automatically
  • Shop plugin for integrated commerce
  • AI Lock for field-level governance
  • Schema-driven content that AI agents can reason about programmatically
  • npm distribution — installable by any AI coding agent in seconds

Our biggest competitive differentiator isn't any single feature. It's that we started with a blank TypeScript file and asked: "What would a CMS look like if AI agents were the primary interface?" — and then built exactly that.


What's Next

We're building @webhouse/cms in public, following a six-phase development plan:

  1. Foundation — core engine, schema system, content CRUD, SQLite storage, basic static output, CLI
  2. AI Integration — provider registry, content/SEO agents, media pipeline, CMS manifest
  3. Admin Dashboard — visual editor, media library, AI chat panel, auth system, MCP servers
  4. Framework Adapters — Next.js, Astro, and Node.js integrations
  5. Design System — generative themes, design agent, infographic engine
  6. Enterprise — multi-user roles, approval workflows, i18n, import from WordPress/Ghost/Contentful

We're currently in active development, and we'll be sharing architecture updates, technical deep dives, and early access opportunities as we progress.

If you're interested in following along — or if you're building something that could benefit from an AI-native CMS engine — we'd love to hear from you.


The Bottom Line

The CMS landscape in 2026 is undergoing its most significant transformation since WordPress launched in 2003. The convergence of agentic AI, Model Context Protocol, AI discoverability standards, and composable architecture is creating a moment where the right architectural decisions compound for years.

We believe the CMS of the next decade won't be the one with the best admin dashboard. It will be the one that an AI agent can install, understand, and operate without reading a single line of documentation.

That's what we're building.

The age of the admin panel is ending. The age of the agentic CMS has begun.