The cases for and against AI search optimization (“AIO”)
Website professionals are split on how to approach new AI-powered search features like Google’s AI Overviews (SGE), Gemini, and Bing’s incorporation of GPT-4.
Some, like Tony Stubblebine, CEO of Medium, bristle at the idea of an LLM “stealing” their content and delivering it directly to users (likely without a click) in SGE, Bard, ChatGPT, Bing, or another platform’s AI tool. On Medium last year, he wrote: “AI companies have leached value from writers in order to spam Internet readers.”
With this in mind, Medium has now attempted to block AI crawlers from accessing content published on its platform by adding disallow directives to its robots.txt file. Stubblebine notes that this decision follows similar AI-blocking choices from large publishers including the New York Times, CNN, Reuters, and the Australian Broadcasting Corporation.
But for businesses whose revenue isn’t so closely linked to their editorial output, AI search, if handled correctly, may offer another channel through which to promote their products and build brand awareness.
New AI indexing reports in Lumar
Did you know you can find pages on your site that are blocked from AI indexing using the Lumar platform?
We have created new reports in Lumar called “Google AI Blocked” and “Bing AI Blocked”. These reports show you the pages on your website that have been blocked from the search engines’ AI tools, meaning content from those pages will not appear in Google Bard or Bing Chat results.
Learn more about the new Lumar AI reports with a personalized Lumar platform demo.
For other tech leaders, however, such as Wix’s Avishai Abrahami, speaking on the Decoder podcast late last year, and SEO specialist Sara Moccand, speaking on Lumar’s webinar series, there is an opportunity to be had in learning how to optimize for AI search.
“You still need to be able to tell your story, share your images, your content, your product, your special offers… and if Google’s Bard is able to communicate that and increase conversions for businesses, I think that’s fantastic,” said Abrahami on Decoder.
“You’ll notice, by the way, that almost every demo Google does of Bard ends in a transaction. They’re very focused on it,” host Nilay Patel adds in the podcast conversation.
“Brand awareness is a thing. For some clients, it’s just gonna be about appearing in the SERPs in as many places as you can.”
Chris Spann, Senior Technical SEO at Lumar
“… And that’s the side I tend to fall on. Because you’re right; SGE is going to serve up answers that don’t require a clickthrough response anymore. But at the end of the day, it does also cite its sources. So if you’re creating the most compelling content or subject matter information about a certain topic or about the product, it still has to cite where it comes from. And if you’re selling a paid-for product, a buying action still has to be taken at some point.”
Sara Moccand notes that web users may have different needs when they choose a search engine’s AI chat over traditional search methods, a point Bing’s Fabrice Canel has also made.
Sometimes users have only a vague idea of what they want to find out. AI chatbot tools can cater to these more uncertain, exploratory queries, and Google understandably wants to satisfy these kinds of queries in its search results too, hence the introduction of SGE.
For businesses that may benefit from inclusion in the results for these vaguer queries, or that want to expand their brand awareness in general, AIO, or ‘AI optimization’, may be a worthwhile endeavor in 2024.
Whichever side of the debate you fall on, it’s worth knowing how to handle your website content with regard to AI bots.
Blocking AI bots from crawling your website content
Website managers who decide they don’t want their brand’s content to appear in AI-generated search results can attempt to block these AI crawlers using the robots.txt file on their site.
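For example, a site that wants to opt out of the major AI crawlers entirely might add rules along the lines of the sketch below to its robots.txt. The user agent tokens shown are the providers’ publicly documented ones at the time of writing; check each provider’s documentation for the current list, and remember that robots.txt is a request that compliant crawlers honor rather than an enforcement mechanism.

```
# robots.txt: a minimal sketch of opting AI crawlers out of the whole site

# Google's token covering Bard/Gemini and Vertex AI generative APIs
# (it does not affect Googlebot or normal Search indexing)
User-agent: Google-Extended
Disallow: /

# OpenAI's crawler used to gather training data
User-agent: GPTBot
Disallow: /

# Common Crawl's bot, whose corpus is widely used to train LLMs
User-agent: CCBot
Disallow: /
```

Note that these directives only apply to crawlers that identify themselves and respect robots.txt, and they don’t retroactively remove content that has already been collected.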
Lumar’s technical SEO platform also offers reports to track which pages on your site are currently blocking Google’s — and Bing’s — AI bots. You can find the new reports in the Lumar platform by going to: Indexability > Non-Indexable.
Lumar’s “Google AI Blocked” report shows pages returning a 200 (OK) response that have been disallowed for the Google-Extended user agent token in robots.txt.
The “Bing AI Blocked” report shows pages returning a 200 (OK) response that carry a robots nocache or noarchive directive and will therefore have limited visibility in Bing Chat.
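For reference, the kind of directive that report looks for can be set in a page’s HTML head (or via an equivalent X-Robots-Tag HTTP header). The sketch below reflects Bing’s 2023 guidance on how these tags are treated in Bing Chat; consult Bing’s current documentation for the exact behavior, and note that the generic robots meta name applies to all crawlers (the bingbot meta name can be used to target Bing alone).

```html
<!-- Less restrictive (per Bing's 2023 guidance): only the URL, title,
     and snippet may be surfaced in Bing Chat answers -->
<meta name="robots" content="nocache">

<!-- More restrictive: keeps the page content out of Bing Chat answers -->
<meta name="robots" content="noarchive">
```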
The reports are given a neutral status in the platform because, if AI bots are being blocked on your website, the blocking is assumed to be intentional.
“AIO”: Optimizing for AI search
“I think in big, really competitive spaces, a lot of the players in there aren’t just thinking about the ‘ten blue links’ at the top of Google SERPs anymore. They’re thinking about appearing in image search and in shopping listings and everywhere they can.”
According to SEO specialist Sara Moccand, Google’s AI search tools likely rely, in part, on Google’s Knowledge Graph to generate search responses. One way SEOs and brand managers can help ensure their company is fairly represented in generative search results, then, is to keep their Knowledge Graph information accurate and up to date.
Let’s dig into how SEOs who do want their website content included in AI search results can optimize their content and Knowledge Graph data for more accurate AI-generated answers about their brand.
Brand challenges from AI-generated search results
With generative AI tools being incorporated into search engines like Google and Bing, brands may increasingly be at the mercy of artificial intelligence when users seek out information about their company.
The problem is, AI can often ‘hallucinate’ or provide false information in response to a query. So, how can brand managers and SEOs help ensure that AI tools and search features are providing accurate information about their business?
“There is a big problem, which is that generative AI can invent things — which means they can also potentially invent false information about your brand.”
— Sara Moccand, SEO Specialist & Co-Host of SEOnerdSwitzerland, in her Lumar webinar session
Combatting AI hallucinations with Knowledge Graph optimizations
One way to counter AI search hallucinations is to ensure that Google’s Knowledge Graph (the database that stores factual information about entities, such as people or businesses, and represents it in a quick-to-process way for machines) is supplied with accurate information about your brand. But this is not always easy to do.
SEO specialist Sara Moccand points to a 2022 paper on Google’s LaMDA project (part of its AI development efforts) entitled “LaMDA: Language Models for Dialog Applications.”
“It really, really, really, really… looks like they are combining Google AI search features with the Knowledge Graph,” Moccand says of the paper’s content in her recent Lumar webinar session on AI and SEO.
Assuming Google’s AI search tools are pulling information from its Knowledge Graph database when generating responses for users, Moccand recommends optimizing your business’s content for inclusion in Google’s Knowledge Graph to help combat any potential hallucinations its AI chatbots may generate about your company.
A 3-step process for Knowledge Graph optimization
To ensure Google can offer the richest SERP results about your brand, and to improve its generative AI search results relating to your business, Moccand refers to the “Kalicube process”, a method of branded SERP optimization established by the Kalicube agency to help make sure search engines surface the correct information about your business.
This process, as adapted by Moccand, can help companies feed the correct information to Google’s Knowledge Graph.
There are 3 parts to the process that apply to knowledge graph / knowledge panel optimization, Moccand explains:
1.) Building Understanding
The first part of the knowledge graph optimization process is about building an understanding of your brand within search engines’ knowledge graphs — including information about your business’s identity, its offering, and its target audience. This means becoming an “entity” and feeding the machine with the right data about the company and its ecosystem (one common way to supply that data, structured data markup, is sketched after step 3 below). Having your company’s information included among the ‘trusted sources’ that feed into Google’s knowledge graph can be a key part of this stage.
2.) Building Credibility
As part of their knowledge graph optimization — and wider SEO — efforts, companies need to create content that demonstrates their experience, expertise, authoritativeness, and trustworthiness (E-E-A-T) related to their specific niche/industry. Credibility, or E-E-A-T, can be established across 3 different levels: author, website/domain, and content.
3.) Ensuring Deliverability
Ensuring your content is deliverable to search engines and users alike is a key part of both your SEO and knowledge graph optimization efforts. On the human side, users need to find content that answers the questions they have. On the machine side, if search engines cannot find, parse, and index your content in their databases, then it’s not supporting your SEO or generative AI optimization strategies. (Note: This ties into technical SEO strategies as well.)
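As mentioned under step 1, one common way to “feed the machine” with entity data is schema.org structured data on your own site. The snippet below is a minimal, hypothetical sketch of Organization markup: the company name, URLs, and profile links are placeholders, and structured data is not a guaranteed route into the Knowledge Graph, but it gives Google unambiguous, machine-readable statements about who the entity is and which other profiles describe it.

```html
<!-- Hypothetical example: JSON-LD Organization markup for a fictional company.
     The "sameAs" links point search engines at corroborating profiles
     (Wikipedia, Wikidata, LinkedIn, etc.) that help confirm the entity's identity. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Widgets Ltd",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/assets/logo.png",
  "description": "Example Widgets Ltd makes configurable widgets for small businesses.",
  "sameAs": [
    "https://www.linkedin.com/company/example-widgets",
    "https://www.wikidata.org/wiki/Q00000000",
    "https://en.wikipedia.org/wiki/Example_Widgets"
  ]
}
</script>
```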
Further reading on AI and SEO
- On-Demand Webinar: AI, LLMs & the Future of Search: How SEOs Can Adapt (Lumar webinar)
- SEO Trends: Be Smart About Artificial Intelligence & AI-Generated Content (Lumar blog)
- On-Demand Webinar: AI, Knowledge Graphs, & SEO: Teaching LLMs About Your Business (Lumar webinar)