AI & SEO: Staying Visible in the Generative Search Era

By Ian Booth

The landscape of search is rapidly evolving, with generative AI applications like ChatGPT, Gemini, DeepSeek and Perplexity reshaping how users seek and consume information. As an SEO specialist, I’m committed to helping my clients navigate this new terrain. In this article, I’ll guide you through optimising your content for generative AI search so that your brand stays visible and relevant.

The approach to ranking in LLMs requires a strategic shift from traditional SEO. The emphasis is no longer solely on links, but also on mentions and establishing your brand as a trusted source within relevant contexts. This guide will help you understand how to achieve this. Before we proceed, I encourage you to read my previous article on Generative Search Optimisation, which discussed making it easy for crawlers to extract meaning and context from your website through enhanced structured data and organised content experiences. This provides foundational knowledge regarding how search engines use connected content and schema markup to understand page context, which is highly relevant to optimising for LLMs.

Understanding Generative Engine Optimisation (GEO)

Generative Engine Optimisation (GEO) is the practice of optimising your online presence to increase visibility within the outputs of generative AI applications. This involves ensuring your brand, products, and website content are prominently featured in AI-generated responses, whether through direct mentions or citations.

How Large Language Models (LLMs) Function

LLMs utilise Natural Language Processing (NLP) to process and interpret data, intending to generate comprehensive answers based on user prompts.

  • LLMs operate through encoding and decoding processes, converting data into tokens and vectors, and then interpreting probabilities to generate natural language.
  • They employ a statistical approach, rather than human-like comprehension, to process information.
  • LLMs can refine initial query results and generate supplementary queries to gather more comprehensive data, enhancing the accuracy and relevance of their answers.
  • Retrieval-augmented generation (RAG) is crucial, providing LLMs with additional topic-specific data to overcome the limitations of basic training. Knowledge graphs and entity nodes can also be used to add to the contextual awareness of LLMs.
  • The context window is expanded using NLP to identify main and secondary entities via grammatical sentence structure.
  • LLMs can process multimedia formats, converting them into text tokens for encoding.
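To make the encoding idea above concrete, here is a deliberately simplified sketch: it splits text into tokens and maps them onto a count vector over a vocabulary. Real LLMs use subword tokenisers (such as BPE) and dense learned embeddings rather than word counts, so treat this purely as an illustration of the text-to-tokens-to-vectors pipeline.

```python
from collections import Counter

def tokenize(text):
    """Naive lowercase whitespace tokeniser; production LLMs use subword schemes such as BPE."""
    return text.lower().split()

def encode(tokens, vocab):
    """Map a token list onto a count vector over a fixed vocabulary,
    a crude stand-in for the dense embeddings LLMs actually use."""
    counts = Counter(tokens)
    return [counts[term] for term in vocab]

text = "LLMs convert data into tokens and vectors"
tokens = tokenize(text)
vocab = sorted(set(tokens))
vector = encode(tokens, vocab)
print(tokens)
print(vector)
```

The decoding side then runs in reverse: the model interprets probabilities over its vocabulary to pick the next token, turning vectors back into natural language.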

Key Strategies for Ranking in AI Search

To improve your visibility in AI-driven search results, consider the following strategies:

  1. Create High-Quality, Authoritative Content: Focus on producing well-researched, informative, and trustworthy content that aligns with Google’s E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) guidelines. This is especially important for YMYL (Your Money, Your Life) topics.
  2. Utilise Citable Sources: Improve your content’s credibility by including citable sources, particularly when presenting factual information.
  3. Incorporate Statistics and Quotes: Enhance your content’s persuasiveness by integrating relevant statistics and quotes to add authenticity and depth.
  4. Optimise for Your Specific Domain: Tailor your optimisation strategies to align with the unique characteristics of your industry, as different platforms prefer different sources based on the topic.
  5. Think Semantics Not Keywords: Traditional keyword stuffing is ineffective for generative search responses; focus on the meaning behind your content.
  6. Focus on User Intent: Ensure your content directly answers user queries and provides genuine value, focusing on context rather than exact keyword matches.
  7. Establish Authority: Position your brand as a trusted and authoritative source within your industry. This includes building domain authority and acquiring credible inbound links.
  8. Engage on Relevant Platforms: Actively participate on community platforms where your target audience is present, such as Reddit.
  9. Monitor Trends: Stay informed on the latest developments in AI search and be ready to adjust your strategies.
  10. Match Embedding Distance: Ensure your content’s language and structure align with the style of AI-generated summaries by using consistent terminology and formatting.

Tactical Approaches for LLMO/GEO

Want your content to stand out in the world of Large Language Models (LLMs) and Generative Search Engines? Here’s how to make your mark:

Establish Your Authority. Think of LLMs and generative engines as highly intelligent research assistants. They prioritise credible and authoritative sources. Enhance your content’s visibility by:

  • Writing with Authority: Demonstrate expertise and trustworthiness in your writing style.
  • Citing Your Sources: Back up your claims with solid evidence from reputable sources.
  • Adding Statistics: Strengthen your arguments with compelling data.
  • Including Relevant Quotes: Incorporate insights from experts and key figures.

Become a Go-To Source. LLMs and generative engines learn by analysing vast amounts of text data. The more frequently your content appears in this data, the more likely it is to be recognised as relevant and valuable.

  • Aim for Frequent Selection: Strive to be among the top sources consistently chosen by these AI systems.

Understand the Power of Context. LLMs and generative engines are incredibly sophisticated at understanding language. They analyse how words and phrases are used together to determine their meaning and relationships.

  • Token Frequency Matters: The more often specific terms appear together, the stronger their contextual connection becomes. This increases the likelihood of those terms being used together in generated content.
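A rough way to see why token frequency matters is to count co-occurring word pairs in a small corpus. The brand name and corpus below are invented for illustration; real training pipelines work at vastly larger scale and with subword tokens, but the principle that repeated proximity strengthens association is the same.

```python
from collections import Counter

def cooccurrence_counts(sentences, window=3):
    """Count how often token pairs appear within `window` positions of each other.
    Pairs that co-occur frequently develop a stronger contextual link."""
    pairs = Counter()
    for sentence in sentences:
        tokens = sentence.lower().split()
        for i, token in enumerate(tokens):
            for other in tokens[i + 1 : i + window]:
                pairs[tuple(sorted((token, other)))] += 1
    return pairs

# Hypothetical brand mentions scattered across web copy
corpus = [
    "acme analytics simplifies seo reporting",
    "seo reporting with acme analytics saves time",
    "acme analytics dashboards for seo teams",
]
counts = cooccurrence_counts(corpus)
print(counts[("acme", "analytics")])
```

Because "acme" and "analytics" appear side by side in every sentence, their pair count dominates, which is loosely analogous to those tokens becoming more likely to be generated together during decoding.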

Optimise for Training Data. Keep in mind that LLMs are trained on massive datasets. To maximise your content’s visibility, consider:

  • Initial Training Data: Align your content with the type of information used to initially train the LLM.
  • RAG Process: Tailor your content to be relevant to the sources added during the Retrieval-Augmented Generation (RAG) process, where the LLM pulls in additional information on a topic-specific basis.

Understanding AI Overviews

Imagine searching for something and instantly getting a concise, informative summary drawn from the best sources on the web. That’s the power of AI Overviews (AIOs).

AIOs are like having a super-smart research assistant that sifts through tons of information and presents you with the most relevant, credible answers. They appear at the top of Google’s search results, giving you a clear and comprehensive understanding of your topic right away.

How AIOs Work

AIOs are generated using cutting-edge AI technology, including Google’s powerful Gemini model. This model can understand the meaning and relationships between words, images, and other types of data, allowing it to identify the most reliable and informative sources.

Here’s how it works:

  1. Understanding Your Question: AIOs are often triggered by informational queries, where you’re seeking knowledge or explanations.
  2. Finding the Best Sources: Google’s system scans its vast index of pre-evaluated, top-ranked documents to find the most relevant and trustworthy sources. It even considers your location to provide geographically relevant results.
  3. Creating a Concise Summary: Gemini uses its advanced language processing capabilities to create a clear and concise summary, drawing on information from multiple sources.
  4. Providing Links for Deeper Exploration: AIOs often include links to the original sources, allowing you to dive deeper into the topic. They may also include links to relevant products in the Google Shopping Graph.

AIOs represent a significant step forward in Google’s mission to organise the world’s information and make it universally accessible and useful. By leveraging the power of AI, AIOs provide a more intelligent and informative search experience, helping you find the answers you need more efficiently than ever before.

Key Factors for AIO Visibility

  • High search positions and click-through rates (CTR) are important for inclusion in AIOs.
  • The average position of pages in AIOs is around the 5th spot.
  • The trustworthiness of the document is assessed based on the author’s credibility, domain reputation, and inbound links.
  • The system prioritises fresh and frequently updated content.
  • Diversity among selected documents is sought, to present a range of perspectives and content types.

The Importance of Mentions

  • For brands, the primary currency of LLMs is mentions across the web: how often your brand name appears near relevant terms in the training data.
  • Identify and engage with online platforms that discuss your area of expertise and ensure your brand is mentioned there.
  • Ask LLMs which websites are likely to contribute to their training data.
  • Utilise tools to identify websites that have a strong affinity with your target audience.

Optimising for AI Overviews

Improve your SERP rankings, as 52% of sources in AIOs rank in the top 10. Perform on-page optimisation, ensuring you make it easy to understand the meaning and context of a page, with no ambiguity.

  • Work on your backlink profile to signal to search engines that your content is valuable, but ensure the backlinks are from relevant content.
  • Format your content by using short paragraphs and elements like lists and bullet points, as AIOs often contain lists.
  • Cite authoritative sources, particularly for YMYL (your money, your life) topics.
  • Ensure your website meets Google’s technical requirements, by checking if your pages are indexable in Google Search Console.
  • Use structured data to help Google understand your content.
  • Improve user experience by ensuring mobile optimisation, page speed, and user-friendly design.
  • Regularly update your content.
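As one illustration of the structured-data point above, a minimal schema.org Article snippet in JSON-LD might look like the following. The author, date, and topic values here are placeholders; adapt them to your own page and validate with Google’s Rich Results Test.

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "AI & SEO: Staying Visible in the Generative Search Era",
  "author": {
    "@type": "Person",
    "name": "Ian Booth"
  },
  "datePublished": "2025-01-01",
  "about": "Generative Engine Optimisation"
}
```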

Optimising for LLM Crawlers

  • LLMs use crawlers to gather information, similar to traditional search engines.
  • You can manage their access via your robots.txt file.
  • Prioritise content that provides detailed information about your brand, products, or services.
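For the robots.txt point above, here is a sketch showing how you might allow some AI crawlers site-wide while keeping another out of a hypothetical section. User-agent tokens such as GPTBot (OpenAI), PerplexityBot (Perplexity) and Google-Extended (Google AI training) do exist, but they change over time, so verify each vendor’s current documentation before relying on these names.

```
# Allow OpenAI's crawler site-wide
User-agent: GPTBot
Allow: /

# Allow Perplexity's crawler site-wide
User-agent: PerplexityBot
Allow: /

# Keep Google's AI-training crawler out of a hypothetical /drafts/ section
User-agent: Google-Extended
Disallow: /drafts/
```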

Managing Brand Entities

  • Ensure your brand entities are represented accurately.
  • Work on your structured data, Wikipedia presence, and broader semantic footprint.
  • LLMs are trained on data from Wikipedia and other sources, making entity management critical.

Getting Involved with LLM Platforms

  • Consider content partnerships with LLMs, creating custom GPTs, and developing Perplexity pages.
  • These steps can help you get your brand noticed by a broader audience.

Optimising for Specific LLMs

Each LLM has unique functionalities, so it’s important to adapt your approach. Some LLMs do not have access to real-time data and rely on their initial training data: tools like Claude, the Microsoft Copilot app assistant, ChatGPT’s free models and, oftentimes, Gemini generate answers from pre-trained datasets without access to live web data.

For models that rely on pre-trained datasets:

  • Monitor training data updates for visibility changes, and check the documentation for the model version your visibility tools use before adjusting your strategy.
  • Provide feedback on response accuracy; even when an LLM isn’t receiving fresh data, it learns from user feedback.

For search-augmented LLMs such as Perplexity and Copilot, your optimisation strategy needs to consider both the LLM and the search engine behind it:

  • Optimise for the relevant search engine.
  • Check your core queries regularly.
  • Prioritise pages that show in the LLM when internal linking.

Pathways to AI Overview Visibility

There are several pathways to visibility in AI Overviews:

  • Direct Match Queries: Aim to rank in the top 10 for your primary targeted query, aligning content with specific, high-intent searches.
  • Target-Related Queries: Cover associated topics to signal contextual relevance and improve inclusion even if you don’t rank for the main query.
  • YouTube: Create optimised, relevant video content.

Sub-pathways to visibility include:

  • Top 2 Ranking: Securing a top 2 ranking significantly increases the likelihood of appearing in an AI Overview for both direct matches and related queries.
  • Unique, High-Quality Information: Enhance your content with unique insights to set it apart.
  • Match Embedding Distance: Use language and structure that align with the style of AI-generated summaries.
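One way to reason about "embedding distance" is cosine similarity between term vectors: the more vocabulary and phrasing two texts share, the closer their vectors sit. Real systems use dense neural embeddings that capture meaning beyond literal word overlap, so the toy comparison below (with invented example sentences) only illustrates the direction of the effect.

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine similarity over word-count vectors: 1.0 means identical vocabularies,
    0.0 means no shared terms. Dense embeddings refine this with semantics, not just overlap."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

summary_style = "structured data helps search engines understand page context"
aligned = "structured data helps search engines understand your page"
off_topic = "our quarterly newsletter covers company events"

print(cosine_similarity(summary_style, aligned))
print(cosine_similarity(summary_style, off_topic))
```

The aligned sentence scores far higher than the off-topic one, which is the intuition behind writing in the terminology and structure that AI summaries themselves tend to use.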

Conclusion

Ranking in AI search requires a different approach compared to traditional SEO. By focusing on quality content, building authority, and understanding how LLMs and AI Overviews function, you can position your brand for success in the evolving world of generative search. Remember to stay adaptable and monitor the latest trends to maintain a competitive advantage. Achieving visibility in AI Overviews means producing content that is authoritative, comprehensive, and contextually relevant.
