Mention or citation rate? What should you measure in AEO?
With every new discipline comes a new headache: how do you measure the success of your efforts in AI search? Should you track the number of mentions or the citation rate, or are these just vanity metrics?
Here’s what to analyze to make sense of it:

The great fragmentation of traditional metrics
It was just yesterday, yet it already feels far away: tracking keyword rankings has lost much of its importance over the past three years. With zero-click searches in the United States rising sharply, from 56% in 2024 to 69% in 2025 (source: Similarweb), driven mainly by AI-generated answers in standard results, everyone has been looking for alternative ways to track and influence their sites' visibility.
In short, traditional indicators (SERP rankings, CTR, etc.) are no longer enough, hence the exploration of new, more relevant KPIs.
At the same time, other familiar metrics have gained importance, conversion rate being a prime example: value per click has grown sharply. This suggests that even though organic traffic is declining, it is far more qualified. In that sense, Google and OpenAI are doing their job well.
Good riddance to mass traffic that barely serves the visitor and ends up costing more in server expenses.
That said, Google is evolving while still remaining familiar. But what about AI assistants like ChatGPT? While they are becoming less marginal, they still don’t match Google in terms of volume, although Gemini is gaining ground month after month.
Yet the enthusiasm for AI hasn’t slowed (yet). Progress is rapid and, most importantly, conversion quality from LLMs is often much higher due to the granularity of responses.
Mentions vs. citations
In this sea of uncertainty, tracking citation and mention rates in LLMs is gradually taking shape.
These two new types of signals, brand mentions and web citations, focus on distinct indicators:
- A brand mention occurs when an AI assistant references your brand in its response (without linking to your site)
- A citation occurs when the AI explicitly references your content with a link to your site
In practice, these signals remain rare: 85% of AI brand mentions come from third-party sites (source: AirOps), highlighting the importance of external sources for credibility.
To make matters more complex, AIs rarely cite sources: Gemini shows no clickable links in 92% of its responses, and Perplexity cites only 3 to 4 pages out of about ten visited.
In short, mentions and citations are two distinct but complementary metrics of your AI visibility.
This is usually where tools like Profound or Xfunnel come into play. They run batches of prompts against different AIs and report whether you are cited or mentioned. The problem: the prompts are written by you, since no official data on real user queries is shared. This introduces a major bias: you're measuring performance based on what you think users are searching for. And if SEO has taught us anything, it's that real search behavior often differs significantly from our assumptions.
Still, we need to measure the impact of our efforts, even imperfectly, until OpenAI releases its own version of Search Console and Google finally shares data from AI Overviews in GSC.
What should you measure and how?
The first metric won’t surprise anyone: revenue generated from AI sources.
Easy to track in tools like Google Analytics 4, this metric is still incomplete. It fails to capture the many users who turn to traditional search engines or social media after encountering your brand via an LLM.
Next, and again unsurprisingly, traffic from AI assistants. As with revenue, the main issue is attribution, which depends on your setup (first-touch vs. last-touch models).
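At a minimum, you can segment this traffic yourself by classifying referrer hostnames. Here is a minimal Python sketch; the list of AI-assistant domains is an assumption to adjust against what actually appears in your own analytics:

```python
from urllib.parse import urlparse

# Hypothetical list of AI-assistant referrer domains (an assumption,
# not exhaustive); extend it with whatever shows up in your reports.
AI_REFERRER_DOMAINS = {
    "chatgpt.com",
    "chat.openai.com",
    "perplexity.ai",
    "gemini.google.com",
    "copilot.microsoft.com",
}

def is_ai_referral(referrer_url: str) -> bool:
    """True if the referrer hostname belongs to a known AI assistant."""
    host = urlparse(referrer_url).hostname or ""
    # Match the domain itself or any of its subdomains.
    return any(host == d or host.endswith("." + d) for d in AI_REFERRER_DOMAINS)

def ai_traffic_share(referrers: list[str]) -> float:
    """Share of sessions whose referrer is an AI assistant."""
    if not referrers:
        return 0.0
    return sum(is_ai_referral(r) for r in referrers) / len(referrers)
```

Note that this only captures last-touch behavior: a user who discovers you in ChatGPT but arrives later via Google or direct entry is invisible to this filter, which is exactly the attribution gap described above.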
Finally, this is where mentions and citations come into play.
In practice, you shouldn’t focus blindly on their volume. For example, if your brand is frequently mentioned but never cited, it likely lacks authority in the eyes of the system. In other words, the AI may recognize your brand but doesn’t consider it a reliable source. That’s why these metrics must be aligned with real business goals.
There’s also no direct attribution for mentions. Today, it’s impossible to say: +10 mentions in ChatGPT = +X% revenue. The right approach isn’t direct attribution; it’s inference through signals.
That means following this process:
- Identify strategic AI queries
  - Top-of-funnel informational queries (guides, definitions)
  - Comparison queries (“best solution for…”, “X vs Y”)
  - Recommendation queries (“tool for…”, “specialized agency…”)
- Measure visibility on these queries
  - Percentage of responses where your brand is mentioned
  - Percentage of responses where your site is cited
  - Relative position (primary vs. secondary mention)
- Observe indirect effects
  - Increase in branded searches (Search Console, Google Trends)
  - Increase in direct traffic
  - Increase in conversion rate
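The visibility step above boils down to simple rates over a sample of AI answers. A minimal Python sketch, assuming you have already collected the responses for your tracked queries (the `AIResponse` fields, brand name, and domain are hypothetical, and "primary mention" is approximated here as the brand appearing in the first sentence):

```python
from dataclasses import dataclass

@dataclass
class AIResponse:
    """One sampled assistant answer for a tracked query (fields are assumptions)."""
    text: str               # the answer's visible text
    cited_urls: list[str]   # links the assistant attached, if any

BRAND = "Acme"          # hypothetical brand name
SITE = "acme.example"   # hypothetical domain

def visibility_report(responses: list[AIResponse]) -> dict[str, float]:
    """Mention rate, citation rate, and a rough primary-mention rate."""
    n = len(responses)
    if n == 0:
        return {"mention_rate": 0.0, "citation_rate": 0.0, "primary_rate": 0.0}
    mentions = sum(BRAND.lower() in r.text.lower() for r in responses)
    citations = sum(any(SITE in u for u in r.cited_urls) for r in responses)
    # Crude proxy for "primary mention": brand named in the first sentence.
    primary = sum(BRAND.lower() in r.text.lower().split(".")[0] for r in responses)
    return {
        "mention_rate": mentions / n,
        "citation_rate": citations / n,
        "primary_rate": primary / n,
    }
```

Run the same query set at regular intervals and track these three rates over time; the trend matters far more than any single snapshot, given how much answers vary between runs.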
And this is where creativity pays off. Identifying strategic AI queries is one thing, but how do you actually do it?
Many rely on Search Console keywords and the “People Also Ask” box. Others, like the brand Kayak, build tools such as kayak.ai. By offering AI-powered services of their own, they can observe real user prompts and reasonably infer that similar phrasing and intent show up in ChatGPT or Gemini. That's your targeting: jackpot.
Final thoughts
To sum up the current state of AEO:
- AI mentions and citations do not replace business KPIs
- They act as signals of visibility, credibility, and influence
- Their value is measured through delayed effects on brand, traffic, and conversions
Tracking them is essential, but analyzing them in isolation turns them into yet another vanity metric, just like unqualified traffic.
