What You Can Measure
1. Direct Traffic Patterns
When someone asks an AI assistant for a recommendation and then visits your site, the visit typically appears as direct traffic, because most AI apps open links without passing a referrer header.
The signal: Unexplained increases in direct traffic, especially to pages that wouldn't normally get direct visits.
Think about it: nobody bookmarks /services/workflow-optimization. If that page suddenly gets direct traffic, something is sending people there without a trackable link. AI assistants are a likely source.
What to track:
- Direct traffic to non-homepage pages (week-over-week, month-over-month)
- New landing pages appearing in direct traffic reports
- Correlation between optimization work and direct traffic changes
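One way to operationalize the week-over-week check above is a simple baseline comparison: flag pages whose latest week of direct traffic sits far outside their own history. This is a minimal sketch, assuming you can export weekly direct-visit counts per page from your analytics tool; the page paths and numbers are illustrative.

```python
from statistics import mean, stdev

def flag_direct_spikes(weekly_counts, z_threshold=2.0):
    """Flag pages whose latest week of direct traffic is unusually
    high versus their own prior-weeks baseline.

    weekly_counts: {page_path: [week1, week2, ..., latest_week]}
    """
    flagged = {}
    for page, counts in weekly_counts.items():
        baseline, latest = counts[:-1], counts[-1]
        if len(baseline) < 2:
            continue  # not enough history to judge a spike
        mu, sigma = mean(baseline), stdev(baseline)
        # max(sigma, 1.0) guards against flat baselines (sigma == 0)
        if latest > mu + z_threshold * max(sigma, 1.0):
            flagged[page] = {"baseline_mean": round(mu, 1), "latest": latest}
    return flagged

# Hypothetical export: the homepage is steady, the deep page spikes
history = {
    "/": [900, 950, 920, 940],
    "/services/workflow-optimization": [3, 2, 4, 41],
}
print(flag_direct_spikes(history))
```

The point of comparing each page to its own baseline, rather than to a site-wide average, is exactly the insight in the prose: a spike on a deep page that never gets direct visits is the interesting signal, not absolute volume.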
2. AI Platform Referrals
Some AI-assisted traffic does include referrers:
| Source | What It Captures |
|---|---|
| perplexity.ai | Perplexity citations with click-through |
| you.com | You.com AI search results |
| bing.com (via Copilot) | Some Microsoft Copilot referrals |
| chatgpt.com | ChatGPT web browsing mode |
These won't capture everything — most AI interactions don't result in tracked clicks — but they're a real signal when they appear.
What to track:
- Traffic from AI-related domains
- Trends over time (are AI referrals growing?)
- Which pages AI platforms are sending traffic to
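Tracking the domains in the table can be as simple as a lookup over referrer hostnames. The sketch below assumes raw referrer URLs from your logs or analytics export; the domain-to-platform map is based on the table above plus one assumed Copilot hostname (`copilot.microsoft.com`), and should be extended as new platforms appear.

```python
from urllib.parse import urlparse

# Based on the referral table; copilot.microsoft.com is an assumption
AI_REFERRER_DOMAINS = {
    "perplexity.ai": "Perplexity",
    "you.com": "You.com",
    "chatgpt.com": "ChatGPT",
    "copilot.microsoft.com": "Microsoft Copilot",
}

def classify_referrer(referrer_url):
    """Return the AI platform name for a referrer URL, or None."""
    if not referrer_url:
        return None  # no referrer at all: counted as direct traffic
    host = urlparse(referrer_url).hostname or ""
    # Match the bare domain and any subdomain (e.g. www.perplexity.ai)
    for domain, platform in AI_REFERRER_DOMAINS.items():
        if host == domain or host.endswith("." + domain):
            return platform
    return None

print(classify_referrer("https://www.perplexity.ai/search?q=..."))  # Perplexity
print(classify_referrer("https://google.com/"))                     # None
```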
3. Brand Search Lift
If AI assistants are recommending you, people often search your brand name to verify or learn more. This creates a measurable signal in traditional search.
The signal: Increase in branded search queries in Google Search Console.
What to track:
- Impressions and clicks for your brand name and variations
- Week-over-week and month-over-month trends
- Correlation with AI optimization work
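The week-over-week trend above reduces to a percentage-change series over your branded impression counts. A minimal sketch, assuming you've pulled weekly impression totals for your brand queries from Google Search Console (the numbers are illustrative):

```python
def brand_search_lift(weekly_impressions):
    """Week-over-week % change in branded impressions.

    weekly_impressions: ordered list of weekly totals, oldest first.
    Returns one % change per consecutive pair (None if the prior
    week was zero, to avoid dividing by zero).
    """
    lifts = []
    for prev, curr in zip(weekly_impressions, weekly_impressions[1:]):
        lifts.append(round(100 * (curr - prev) / prev, 1) if prev else None)
    return lifts

# Hypothetical weekly impressions for a branded query
print(brand_search_lift([1200, 1260, 1510]))  # [5.0, 19.8]
```

A sustained run of positive values after an optimization push is the correlation signal the bullet list describes; a single noisy week is not.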
4. Structured Data Validation
Not traffic, but a leading indicator that your technical implementation is working.
What to track:
- Schema validation errors (target: zero)
- Number of valid structured data items detected
- Rich results eligibility in Google Search Console
Tools:
- Google Rich Results Test
- Schema.org Validator
- Bing Webmaster Tools structured data report
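Between formal validation runs, you can do a quick local sanity check: extract the JSON-LD blocks from a page and confirm they at least parse. This is a rough sketch, not a substitute for the Rich Results Test or the Schema.org Validator, which check vocabulary and eligibility, not just syntax.

```python
import json
import re

def extract_json_ld(html):
    """Find JSON-LD <script> blocks in a page and report which parse."""
    pattern = re.compile(
        r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
        re.DOTALL | re.IGNORECASE,
    )
    valid_types, parse_errors = [], 0
    for block in pattern.findall(html):
        try:
            data = json.loads(block)
        except json.JSONDecodeError:
            parse_errors += 1
            continue
        # A block may be a single object or a list/@graph of objects
        valid_types.append(data.get("@type", "unknown")
                           if isinstance(data, dict) else "graph")
    return {"valid_types": valid_types, "parse_errors": parse_errors}

page = '''<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Organization", "name": "Acme"}
</script>
</head></html>'''
print(extract_json_ld(page))  # {'valid_types': ['Organization'], 'parse_errors': 0}
```

Wiring this into a crawl of your key pages gives you the "errors: zero" target as a number you can watch over time.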
5. AI Citation Monitoring
The most direct measurement: actually check if AI assistants mention you.
The process:
- Identify 10-20 queries relevant to your business
- Run them across ChatGPT, Claude, Perplexity, Gemini
- Document: Were you mentioned? Cited accurately? Recommended?
- Repeat monthly and track trends
What to track:
- Mention rate (% of relevant queries where you appear)
- Citation accuracy (is the AI saying correct things?)
- Recommendation strength (mentioned vs. recommended vs. top recommendation)
- Competitive position (are you mentioned more or less than competitors?)
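Once you're documenting each query run, the mention and recommendation rates fall out of a simple tally. A sketch, assuming you record one row per query per assistant; the field names (`mentioned`, `recommended`) are illustrative, not from any vendor API:

```python
def citation_metrics(results):
    """Summarize one monthly citation-monitoring run.

    results: list of dicts, one per query per assistant, e.g.
      {"assistant": "ChatGPT", "mentioned": True, "recommended": False}
    """
    total = len(results)
    mentioned = sum(r["mentioned"] for r in results)
    recommended = sum(r.get("recommended", False) for r in results)
    return {
        "mention_rate": round(100 * mentioned / total, 1) if total else 0.0,
        "recommendation_rate": round(100 * recommended / total, 1) if total else 0.0,
    }

# Hypothetical results for one query across four assistants
run = [
    {"assistant": "ChatGPT", "mentioned": True, "recommended": True},
    {"assistant": "Claude", "mentioned": True, "recommended": False},
    {"assistant": "Perplexity", "mentioned": False},
    {"assistant": "Gemini", "mentioned": False},
]
print(citation_metrics(run))  # {'mention_rate': 50.0, 'recommendation_rate': 25.0}
```

Running the same tally on a competitor's name with the same query set gives you the competitive-position comparison from the last bullet.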
6. Self-Reported Attribution
Sometimes the simplest approach works: ask people how they found you.
What to track:
- "How did you hear about us?" form responses
- Post-purchase or post-inquiry surveys
- Sales conversation notes
Important: Add "AI assistant (ChatGPT, Claude, Perplexity)" as an explicit option. People rarely write it in unprompted, but they'll select it if offered.
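Summarizing the form responses is a one-liner with a counter. A minimal sketch, assuming you can export the "How did you hear about us?" answers as a flat list (the response strings are illustrative):

```python
from collections import Counter

def tally_attribution(responses):
    """Count attribution answers and express each as a share of total."""
    counts = Counter(responses)
    total = sum(counts.values())
    return {source: f"{100 * n / total:.0f}%"
            for source, n in counts.most_common()}

# Hypothetical exported form responses
responses = [
    "Google search", "Google search", "Google search",
    "AI assistant (ChatGPT, Claude, Perplexity)",
    "AI assistant (ChatGPT, Claude, Perplexity)",
    "Referral",
]
print(tally_attribution(responses))
```

Watching the AI-assistant share month over month closes the loop: it's the only metric here where the visitor themselves tells you the channel.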