When someone asks Perplexity, Gemini, Claude, or ChatGPT a question, the AI doesn't return a ranked list of pages. It breaks the query into multiple sub-questions, searches each independently, then synthesises the results into a single answer. Google named this "query fan-out" at Google I/O 2025. A typical query generates 6–20 sub-queries, each returning its own sources.
That changes the game for content. A page optimised for a single keyword is eligible for only one slot in that process, and each slot cites just 2–7 sources. Comprehensive topic coverage is becoming as important as ranking.
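The fan-out process described above can be sketched in a few lines of code. This is a toy illustration only: the sub-queries, the fake search index, and the per-slot source cap are all hypothetical stand-ins, not how any real AI search system is implemented.

```python
# Toy model of query fan-out: one user query is decomposed into
# sub-queries, each is "searched" independently, and only a few
# sources per sub-query reach the synthesised answer.
# All names and data below are invented for illustration.

FAKE_INDEX = {
    "digital ad spend NZ 2024": ["nz-iab-report", "global-overview"],
    "NZ consumer online behaviour": ["nz-consumer-study", "global-overview"],
    "AI search adoption New Zealand": ["nz-ai-survey"],
}

def fan_out(query, max_sources_per_subquery=2):
    """Decompose a query, search each sub-query, collect cited sources."""
    # Stand-in for the LLM's query-decomposition step.
    sub_queries = list(FAKE_INDEX)
    citations = set()
    for sq in sub_queries:
        # Each sub-query contributes at most a few sources to the answer,
        # mirroring the 2-7 citation slots mentioned above.
        citations.update(FAKE_INDEX[sq][:max_sources_per_subquery])
    return sub_queries, sorted(citations)

subs, cited = fan_out("How should NZ brands budget digital advertising?")
print(len(subs), cited)
```

The point of the sketch is the eligibility maths: a page optimised for one keyword can only compete in one of the sub-query searches, while a page with broad, specific topic coverage is a candidate in several.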
This shift is already visible in New Zealand.
AI tools are increasingly part of how people research, compare, and make decisions, and the signals these systems reward are different from traditional SEO.
Research shows that adding statistics to content can improve AI citation rates by up to 41%, while keyword stuffing performed worse than doing nothing at all. What LLMs reward is specificity, structure, and quantified data.
For New Zealand brands, local data is a genuine advantage. New Zealand-specific figures from qualified sources answer sub-queries that generic global content can't. When an AI is synthesising an answer about, say, digital ad spend or consumer behaviour in this market, the source with local data wins the citation, not the global overview.
The Optimisers has published a practical guide to navigating this shift.
The full guide is available to download below, and for more details visit theoptimisers.com/geo-keyword-