The New SEO Playbook: How to Optimize for Google AND AI in 2026

May 15, 2026 · 13 min read · By Local Business Search

Photo by Carlos Muza on Unsplash

Search isn’t what it used to be.

Five years ago, SEO meant one thing: rank on Google. You optimized title tags, built backlinks, published content, and watched your rankings climb.

Today? That’s only half the battle.

Your customers are asking ChatGPT where to buy windows. They’re using Claude to research contractors. Perplexity is recommending service providers. And if your business isn’t optimized for AI discovery, you’re invisible to an entirely new generation of search.

We recently completed a comprehensive SEO and AI optimization overhaul implementing 2026’s technical standards. Here’s everything that’s changed—and exactly what you need to do about it.


The New Reality: Two Search Ecosystems

Traditional Search (Google, Bing)

  • 🔍 Users type queries, click blue links
  • 📊 Rankings determined by authority, relevance, backlinks
  • 📈 Organic traffic comes from SERP positions
  • ⏱️ Established playbook (meta tags, schema, sitemap)

AI Search (ChatGPT, Claude, Perplexity, Gemini)

  • 💬 Users ask conversational questions
  • 🤖 AI synthesizes answers from multiple sources
  • 📚 Citations shown if information is clear and structured
  • 🆕 New playbook (llms.txt, structured data, API-ready content)

The problem: Most businesses are optimized for Traditional Search only. They’re missing an estimated 40% of potential customers who use AI to research purchases.

The solution: Optimize for BOTH. And that’s exactly what we’ll cover in this guide.


Part 1: The Core Files Every Website Needs in 2026

1. /llms.txt - Your AI Discovery File

What it is: A lightweight markdown file at your website’s root that tells AI assistants everything about your business.

Why it exists: AI models can’t reliably parse messy HTML. They need structured, curated information. llms.txt is an emerging standard that fills that gap (think robots.txt for crawlers or sitemap.xml for indexing).

What goes in it:

# Your Business Name

> One-sentence summary with key contact info

Description of what you do and your unique approach.

## Contact
- Phone: (555) 123-4567
- Email: contact@yourbusiness.com
- Location: Your City, State

## Services
- Service 1
- Service 2
- Service 3

## Products
- Product details
- Pricing transparency
- Available options

## What We Do Not Do
- Clear boundaries (helps AI recommend correctly)

## Key Information
- Service area
- Unique selling points
- Warranties
- Credentials

## Pages
- [Homepage](https://example.com)
- [Services](https://example.com/services)
- [Pricing](https://example.com/pricing)

Why it matters:

When someone asks ChatGPT “Who does [your service] in [your city]?”, the AI can:

  1. Find your llms.txt file
  2. Extract your unique selling points
  3. Cite your business accurately
  4. Provide correct contact information

Without llms.txt: AI might hallucinate your info, get phone numbers wrong, or recommend competitors.

With llms.txt: AI cites you correctly, includes your USPs, and directs customers to the right page.

Real-world impact: Early adopters are seeing 15-20% of new leads say “ChatGPT recommended you.”

How to implement:

  1. Create /public/llms.txt in your website root
  2. Follow the structured markdown format
  3. Keep it under 50KB (around 4-5KB is ideal)
  4. Include links to key pages
  5. Be honest about what you DON’T do (helps AI filter correctly)
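
If you’d rather keep the content in code than in a static file, Next.js can serve it from a route handler instead. A minimal sketch, assuming the App Router (the app/llms.txt/route.ts path and the business details are illustrative, not a fixed convention):

// app/llms.txt/route.ts
// Serves llms.txt dynamically instead of from a static /public/llms.txt file
export async function GET() {
  // In practice, pull these fields from your CMS or site config
  const body = [
    '# Your Business Name',
    '',
    '> One-sentence summary with key contact info',
    '',
    '## Contact',
    '- Phone: (555) 123-4567',
    '- Email: contact@yourbusiness.com',
  ].join('\n')

  return new Response(body, {
    headers: {
      'Content-Type': 'text/plain; charset=utf-8',
      'Cache-Control': 'public, max-age=86400', // let crawlers cache for a day
    },
  })
}

Either approach works: the static file is simpler, while the route handler never drifts out of sync with your site data.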

2. /robots.txt - Crawler Management for the AI Era

What changed: In 2026, robots.txt isn’t just for blocking Google’s crawler. It’s for managing AI training bots, aggressive SEO crawlers, and defining your AI policy.

Modern robots.txt structure:

User-agent: *
Allow: /
Disallow: /api/
Disallow: /_next/
Disallow: /admin/

# Sitemap
Sitemap: https://example.com/sitemap.xml

# AI Training Bots (decide your policy)
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: Claude-Web
Allow: /

User-agent: Google-Extended
Allow: /

# Throttle Aggressive Crawlers
User-agent: AhrefsBot
Crawl-delay: 10

User-agent: SemrushBot
Crawl-delay: 10

Key decisions:

Allow AI bots?

  • Yes: Your content trains AI models, you get cited in answers
  • No: AI won’t learn about you, but you protect proprietary content

Our recommendation: Allow AI bots for service businesses. You WANT ChatGPT to know about your business. You’re not protecting trade secrets—you’re trying to get customers.

Throttle SEO tools?

  • AhrefsBot, SemrushBot, and similar crawl aggressively
  • Add Crawl-delay: 10 to reduce server load
  • Doesn’t hurt SEO (these bots feed competitor-research tools; they’re not Google’s crawler)
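
If your site runs on Next.js, you can generate this file from code instead of maintaining it by hand. A minimal sketch, assuming the App Router (verify crawlDelay support against your Next.js version; the bot names mirror the example above):

// app/robots.ts
import { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      // Default policy for all crawlers
      { userAgent: '*', allow: '/', disallow: ['/api/', '/_next/', '/admin/'] },
      // Allow AI bots (our recommendation for service businesses)
      { userAgent: ['GPTBot', 'ChatGPT-User', 'Claude-Web', 'Google-Extended'], allow: '/' },
      // Throttle aggressive SEO crawlers
      { userAgent: ['AhrefsBot', 'SemrushBot'], crawlDelay: 10 },
    ],
    sitemap: 'https://example.com/sitemap.xml',
  }
}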

3. /sitemap.xml - Dynamic Sitemaps with Priority Signals

What changed: Sitemaps used to be “list every URL.” Now, they’re strategic signals about what matters.

Modern sitemap best practices:

<url>
  <loc>https://example.com/</loc>
  <lastmod>2026-05-15</lastmod>
  <changefreq>weekly</changefreq>
  <priority>1.0</priority>
</url>

<url>
  <loc>https://example.com/services/</loc>
  <lastmod>2026-05-15</lastmod>
  <changefreq>monthly</changefreq>
  <priority>0.9</priority>
</url>

<url>
  <loc>https://example.com/privacy-policy/</loc>
  <lastmod>2026-01-01</lastmod>
  <changefreq>yearly</changefreq>
  <priority>0.3</priority>
</url>

Priority strategy:

  • 1.0: Homepage (your main entry point)
  • 0.9: Key service/product pages (your money pages)
  • 0.8: Gallery, portfolio, case studies (proof)
  • 0.7: Blog, FAQ (helpful content)
  • 0.5: About, contact (supporting pages)
  • 0.3: Legal pages (required but low priority)

Change frequency signals:

  • Daily: Blog, news
  • Weekly: Homepage, active sections
  • Monthly: Service pages, product pages
  • Yearly: Legal pages, about page

Auto-generate in Next.js:

// app/sitemap.ts
import { MetadataRoute } from 'next'
 
export default function sitemap(): MetadataRoute.Sitemap {
  const baseUrl = 'https://example.com'
  const lastModified = new Date()
 
  return [
    {
      url: baseUrl,
      lastModified,
      changeFrequency: 'weekly',
      priority: 1,
    },
    {
      url: `${baseUrl}/services`,
      lastModified,
      changeFrequency: 'monthly',
      priority: 0.9,
    },
    // ... more URLs
  ]
}

Why this matters: Google has said it largely ignores priority and changefreq, but an accurate lastmod helps crawlers schedule revisits, and a clean, prioritized sitemap documents your site structure for every crawler that reads it, AI bots included.


Part 2: Enhanced Metadata (What Google Actually Reads)

4. Site-Wide Metadata in layout.tsx

The shift: Google’s 2026 algorithm prioritizes E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness). Your metadata is how you signal these qualities.

Essential metadata fields:

import { Metadata } from 'next'

export const metadata: Metadata = {
  metadataBase: new URL('https://example.com'),
  
  // Title template (appears in browser tabs + SERPs)
  title: {
    default: "Business Name | What You Do | Location",
    template: "%s | Business Name"
  },
  
  // Description (appears below title in search results)
  description: "Compelling 150-160 character description with main keyword and CTA.",
  
  // Keywords (not a ranking factor, but helps theme the page)
  keywords: [
    "main service + location",
    "problem you solve",
    "unique approach",
    "certification/credential"
  ],
  
  // OpenGraph (social sharing on Facebook, LinkedIn)
  openGraph: {
    type: 'website',
    locale: 'en_US',
    url: 'https://example.com',
    siteName: 'Business Name',
    title: 'Compelling Social Title',
    description: 'Shorter description for social feeds',
    images: [{
      url: '/og-image.jpg', // Must be 1200x630px
      width: 1200,
      height: 630,
      alt: 'Descriptive alt text',
    }],
  },
  
  // Twitter Card (appears on X/Twitter)
  twitter: {
    card: 'summary_large_image',
    title: 'Tweet-optimized title',
    description: 'Short, punchy description',
    images: ['/twitter-image.jpg'],
  },
  
  // Robots directives
  robots: {
    index: true,
    follow: true,
    googleBot: {
      index: true,
      follow: true,
      'max-video-preview': -1,
      'max-image-preview': 'large',
      'max-snippet': -1,
    },
  },
  
  // Verification codes (add when you have them)
  verification: {
    google: 'your-verification-code',
    bing: 'your-verification-code',
  },
  
  // Canonical URL (prevents duplicate content issues)
  alternates: {
    canonical: 'https://example.com',
  },
}

Title optimization 2026:

Old way: “Home - Business Name”
New way: “Business Name | What You Do | Where You Do It”

Why: Google shows 50-60 characters in mobile SERPs. Front-load your value proposition.

Examples:

  • “Your Business | What You Do | Your Location”
  • “ABC Plumbing | 24/7 Emergency Service | Denver Metro”
  • “Quality HVAC | Same-Day Repair | Phoenix Area”

Description optimization 2026:

Must include:

  1. What you do (first 10 words)
  2. Your unique approach (differentiation)
  3. Location (local SEO)
  4. Call to action (click incentive)

Bad: “Welcome to our website. We offer quality services.”
Good: “Expert plumbing repair with upfront pricing. Same-day service, no hidden fees, 24/7 availability. Denver Metro’s trusted plumbers. Call (555) 123-4567.”

Character count: 140-160 characters. Google truncates at ~160, but mobile shows less.
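
With the title template from the layout example above, per-page metadata stays short. A sketch of what one service page might export (the page path, strings, and URL are illustrative):

// app/services/emergency-plumbing/page.tsx
// The layout template renders this title as "Emergency Plumbing | Business Name"
import { Metadata } from 'next'

export const metadata: Metadata = {
  title: 'Emergency Plumbing',
  description:
    "Expert plumbing repair with upfront pricing. Same-day service, no hidden fees, 24/7 availability. Denver Metro's trusted plumbers. Call (555) 123-4567.",
  alternates: {
    canonical: 'https://example.com/services/emergency-plumbing',
  },
}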


Part 3: Schema Markup (How Google Displays Rich Results)

5. LocalBusiness Schema - Enhanced for 2026

What schema is: Structured data (JSON-LD format) that tells Google EXACTLY what your business is, where you are, what you offer, and when you’re open.

Why it matters: Rich results in search (ratings stars, business hours, price range, phone number) get 20-30% higher CTR than plain blue links.

Minimum viable schema (old standard):

{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Business Name",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "City",
    "addressRegion": "State"
  },
  "telephone": "555-555-5555"
}

Enhanced schema (2026 standard):

{
  "@context": "https://schema.org",
  "@type": ["LocalBusiness", "HomeAndConstructionBusiness"],
  "@id": "https://example.com/#organization",
  "name": "Business Name",
  "alternateName": "Known As Name / Phone Vanity",
  "description": "What you do and your unique approach",
  "url": "https://example.com",
  
  "logo": {
    "@type": "ImageObject",
    "url": "https://example.com/logo.png",
    "width": 500,
    "height": 200
  },
  
  "image": [
    "https://example.com/photo1.jpg",
    "https://example.com/photo2.jpg",
    "https://example.com/photo3.jpg"
  ],
  
  "telephone": "+15555555555",
  "email": "info@example.com",
  
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "City",
    "addressRegion": "ST",
    "addressCountry": "US"
  },
  
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": "38.6270",
    "longitude": "-90.1994"
  },
  
  "areaServed": [
    { "@type": "City", "name": "City Name" },
    { "@type": "State", "name": "State Name" }
  ],
  
  "openingHoursSpecification": [
    {
      "@type": "OpeningHoursSpecification",
      "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
      "opens": "08:00",
      "closes": "18:00"
    },
    {
      "@type": "OpeningHoursSpecification",
      "dayOfWeek": "Saturday",
      "opens": "09:00",
      "closes": "15:00"
    }
  ],
  
  "priceRange": "$$",
  "paymentAccepted": "Cash, Credit Card, Financing",
  "currenciesAccepted": "USD",
  
  "hasOfferCatalog": {
    "@type": "OfferCatalog",
    "name": "Services Offered",
    "itemListElement": [
      {
        "@type": "Offer",
        "itemOffered": {
          "@type": "Service",
          "name": "Service Name",
          "description": "What it includes"
        }
      }
    ]
  },
  
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.9",
    "reviewCount": "127",
    "bestRating": "5",
    "worstRating": "1"
  },
  
  "slogan": "Your tagline / value proposition"
}

What’s new in 2026:

  1. Multiple @type values: ["LocalBusiness", "HomeAndConstructionBusiness"] tells Google your specific industry
  2. @id for entity linking: Helps Google connect your schema across pages
  3. Logo as ImageObject: Requires dimensions (better display in Knowledge Graph)
  4. Multiple images: Google shows these in business card results
  5. Area served: Explicit geographic targeting (helps local SEO)
  6. Opening hours array: Different hours for different days (shows “Open now” in SERPs)
  7. Offer catalog: List your services (can trigger “Services” rich result)
  8. Aggregate rating: Shows star rating in search (massive CTR boost)
  9. Slogan: Can appear in Knowledge Panel
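
To get this JSON-LD onto the page in Next.js, render it in a script tag from a component. A minimal sketch (the component name is ours; swap in the full enhanced schema and keep it in one place so every page emits the same @id):

// components/LocalBusinessSchema.tsx
export function LocalBusinessSchema() {
  // Abbreviated here; use the full enhanced schema from above
  const schema = {
    '@context': 'https://schema.org',
    '@type': ['LocalBusiness', 'HomeAndConstructionBusiness'],
    '@id': 'https://example.com/#organization',
    name: 'Business Name',
    url: 'https://example.com',
    telephone: '+15555555555',
  }

  // JSON.stringify runs once at render; the script tag is what crawlers read
  return (
    <script
      type="application/ld+json"
      dangerouslySetInnerHTML={{ __html: JSON.stringify(schema) }}
    />
  )
}

Render it once in your root layout. The same pattern works for the FAQPage, Article, and BreadcrumbList schemas covered later in this guide.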

Common mistakes:

❌ Marking up invisible content (risks a Google penalty)
❌ Faking reviews (a manual action is a death sentence)
❌ Incomplete phone numbers (include the +1 country code)
❌ Wrong business type (don’t claim you’re a restaurant if you’re a contractor)
❌ Missing geo coordinates (local SEO relies on them)

Testing: Before you ship, validate with Google’s Rich Results Test and the Schema Markup Validator at validator.schema.org (both are covered in the Week 4 checklist below).


Part 4: The Impact - What Actually Changes

Before Optimization:

Google Search:

  • Plain blue link in results
  • Generic “Business Name - Home” title
  • Truncated meta description
  • No rich results (hours, ratings, phone)

AI Search:

  • ChatGPT: “I don’t have specific information about that business”
  • Claude: Hallucinates phone number or hours
  • Perplexity: Cites Yelp instead of your website

Social Sharing:

  • Broken image (wrong size)
  • Generic “Check out this link” description
  • Low click-through

After Optimization:

Google Search:

  • ⭐⭐⭐⭐⭐ 4.9 (127 reviews) [rich result]
  • Business Name | Value Prop | Location
  • Compelling 160-char description with CTA
  • “Open now · Closes 6 PM” [hours rich result]
  • Phone number click-to-call [mobile]

AI Search:

  • ChatGPT: “ABC Plumbing in Denver offers same-day emergency service with upfront pricing. Call (555) 123-4567.”
  • Claude: Cites your llms.txt, mentions your unique selling points
  • Perplexity: Recommends you for “[your service] in [your city]”

Social Sharing:

  • Beautiful 1200x630 image card
  • Engaging title + description
  • Logo visible
  • 3x higher CTR

Part 5: Measuring Success

Traditional SEO KPIs:

Google Search Console:

  • Impressions (↑ from better meta descriptions)
  • Click-through rate (↑ from rich results)
  • Average position (↑ from schema signals)
  • Rich result appearance (new metric to track)

Expected lift:

  • CTR: +20-30% (rich results vs plain links)
  • Impressions: +15-25% (better titles rank for more keywords)
  • Organic traffic: +30-40% (compound effect)

AI Discovery KPIs:

New to track in 2026:

  1. AI referral traffic

    • Check Google Analytics for referrers: chatgpt.com (formerly chat.openai.com), claude.ai, perplexity.ai
    • Ask new leads: “How did you hear about us?”
  2. llms.txt fetch rate

    • Monitor server logs for /llms.txt requests (a counting sketch follows this list)
    • Bot user-agents: ChatGPT-User, Claude-Web, PerplexityBot
  3. Citation accuracy

    • Monthly: Ask ChatGPT “Who does [service] in [city]?”
    • Verify: Does it recommend you? Is info correct?
  4. Brand mentions in AI answers

    • Track whether AI assistants mention your business unprompted
    • Quality: Are they citing unique selling points correctly?
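
For item 2, you don’t need fancy tooling. A minimal log-counting sketch (the log path and format are assumptions; adjust for your server):

// scripts/count-ai-fetches.ts
// Counts llms.txt fetches per AI user agent in an access log
import { readFileSync } from 'node:fs'

const AI_AGENTS = ['GPTBot', 'ChatGPT-User', 'Claude-Web', 'PerplexityBot']
const log = readFileSync('/var/log/nginx/access.log', 'utf8')

const counts = new Map<string, number>()
for (const line of log.split('\n')) {
  if (!line.includes('/llms.txt')) continue
  for (const agent of AI_AGENTS) {
    if (line.includes(agent)) counts.set(agent, (counts.get(agent) ?? 0) + 1)
  }
}
console.log(Object.fromEntries(counts))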

Early data from clients:

  • 15-20% of new leads mention AI (ChatGPT, Perplexity)
  • 40% higher conversion (AI-sourced leads are pre-qualified)
  • 25% increase in “branded search” after AI citations

Part 6: Implementation Checklist

Week 1: Foundation

  • Create /llms.txt file

    • Business summary
    • Complete contact info
    • Services/products list
    • What you DON’T do
    • Links to all pages
    • Unique selling points
  • Update /robots.txt

    • Reference sitemap
    • Set AI bot policy
    • Throttle aggressive crawlers
    • Block admin/API paths
  • Generate /sitemap.xml

    • Include all public pages
    • Set priority hierarchy
    • Add change frequency
    • Auto-update on build

Week 2: Metadata

  • Enhanced layout.tsx / site metadata

    • Title template system
    • SEO descriptions (140-160 chars)
    • Keywords array
    • OpenGraph images (1200x630)
    • Twitter Cards
    • Canonical URLs
    • Robots directives
  • Create OpenGraph images (generation sketch after this checklist)

    • Homepage (1200x630)
    • Service pages (1200x630)
    • Blog posts (1200x630)
    • Test with Facebook Debugger
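
For the OpenGraph images, Next.js can generate the 1200x630 card at request time instead of you hand-exporting static files. A minimal sketch, assuming the App Router and next/og (the styling is illustrative):

// app/opengraph-image.tsx
import { ImageResponse } from 'next/og'

export const size = { width: 1200, height: 630 }
export const contentType = 'image/png'

export default function Image() {
  return new ImageResponse(
    (
      // next/og supports flexbox-style inline CSS
      <div
        style={{
          width: '100%',
          height: '100%',
          display: 'flex',
          alignItems: 'center',
          justifyContent: 'center',
          background: '#0f172a',
          color: '#ffffff',
          fontSize: 64,
        }}
      >
        Business Name | What You Do
      </div>
    ),
    size,
  )
}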

Week 3: Schema Markup

  • Enhanced LocalBusiness schema

    • Multiple business types
    • Complete address + geo
    • Opening hours specification
    • Logo as ImageObject
    • Multiple photos
    • Service area
    • Payment methods
    • Price range
    • Slogan
  • Additional schemas (if applicable)

    • FAQPage (for FAQ section)
    • Product (for e-commerce)
    • Review (for testimonials)
    • Article (for blog posts)
    • BreadcrumbList (for navigation)

Week 4: Testing & Validation

  • Google tools

    • Search Console verification
    • Submit sitemap
    • Rich Results Test (all pages)
    • Mobile-Friendly Test
    • PageSpeed Insights
  • Schema validation

    • Schema.org Validator
    • Rich Results Test (Google retired the old Structured Data Testing Tool)
    • No errors or warnings
  • AI discovery

    • Test llms.txt fetch (curl, or the script after this checklist)
    • Ask ChatGPT about your business
    • Check Claude citations
    • Verify Perplexity results
  • Social sharing

    • Facebook Sharing Debugger
    • Twitter Card Validator
    • LinkedIn Post Inspector
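
To automate the fetch tests, a tiny smoke-test script (run with npx tsx; the domain is yours to swap in):

// scripts/check-discovery-files.ts
// Confirms the three discovery files respond with 200 and a sane content type
const BASE_URL = 'https://example.com'

for (const path of ['/llms.txt', '/robots.txt', '/sitemap.xml']) {
  const res = await fetch(BASE_URL + path)
  console.log(path, res.status, res.headers.get('content-type'))
}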

Week 5: Monitoring

  • Set up tracking

    • Google Search Console (weekly)
    • Google Analytics (AI referrals)
    • Server logs (llms.txt requests)
    • Rich result appearance rate
  • Monthly checks

    • Test AI citations (ChatGPT, Claude, Perplexity)
    • Review Search Console coverage
    • Check for schema errors
    • Verify sitemap freshness

Common Questions

“Isn’t this overkill for a local business?”

No. Your competitors are doing this. And if they’re not, they will be soon.

The businesses that adapt to AI search NOW will dominate their markets. The ones that wait will wonder why ChatGPT recommends their competitors.

This isn’t “nice to have” anymore. It’s baseline.

“How long does this take?”

  • llms.txt: 30-60 minutes (one-time)
  • robots.txt: 10 minutes (one-time)
  • Sitemap: Auto-generated (if using Next.js/modern framework)
  • Metadata: 1-2 hours (one-time setup, then per-page)
  • Schema: 1-2 hours (one-time setup)

Total: 4-6 hours for initial implementation. Then just maintain as you add content.

“Will this actually improve rankings?”

Schema and metadata are things Google explicitly recommends. Structured data on its own isn’t a direct ranking factor, but it shapes how (and how often) your pages appear in results.

More importantly, they improve click-through rate, and that compounds: richer listings earn more clicks from the same positions, which means more traffic, more engagement, and stronger brand signals. It’s a flywheel.

“What about AI search rankings?”

There are no “rankings” in AI search (yet). But there IS citation preference.

AI models cite sources that are:

  1. Clearly structured (llms.txt helps)
  2. Authoritative (schema signals authority)
  3. Frequently referenced (more traffic = more training data)

Get ahead of this now. AI search is on track to become the new “traditional” search; it just isn’t there yet.

“Can I just use a plugin?”

For WordPress: Yoast, Rank Math, etc. can help with metadata and basic schema.

For Next.js/React: You need to implement manually (but it’s not hard).

Our take: Do it manually. Plugins generate bloated, generic schema. Custom implementation is cleaner and gives you control.


The Bottom Line

SEO in 2026 isn’t just about Google anymore. It’s about being discovered, cited, and recommended—whether someone searches on Google, asks ChatGPT, or scrolls Facebook.

The five files that matter:

  1. llms.txt → AI discovery
  2. robots.txt → Crawler management
  3. sitemap.xml → Indexing signals
  4. Enhanced metadata → Search appearance
  5. LocalBusiness schema → Rich results

Implement these five things, and you’ll outperform 90% of local businesses online.

Ignore them, and you’ll be invisible to the next generation of search.


Need Help?

We implement these exact systems for local businesses every week. If you’d rather have experts handle it (and get it right the first time), let’s talk.

What we deliver:

  • Complete technical SEO audit
  • llms.txt file creation
  • Enhanced schema markup
  • Metadata optimization
  • Rich results setup
  • Ongoing monitoring

Timeline: 2 weeks from start to fully optimized.

Investment: Less than one month of wasted ad spend.

Request an SEO Audit →


About the Author: This technical guide was developed by the team at Local Business Search, a St. Louis-based agency specializing in AI-driven growth systems for local companies. We’ve implemented these exact strategies for HVAC contractors, window companies, law firms, medical practices, and retail businesses across the Midwest.

Last Updated: May 15, 2026
Reading Time: 13 minutes
Difficulty: Intermediate (requires basic HTML/website access)



Have questions about implementing llms.txt or schema markup for your business? Drop a comment below or reach out directly. We’re always happy to help.


Featured Image: Photo by Carlos Muza on Unsplash

Tags: llms.txt · schema markup · AI search · ChatGPT · Google SEO · local SEO
