I have noticed an uptick in client interest in the last six months regarding whether AI chatbots and AI search will negatively impact, or completely change, SEO. Mostly, clients ask whether we are rethinking their strategy in response to the changing landscape, and they want some assurance that their budget and strategy will not be wasted in a time of rapid change.
My answer is absolutely! SEO strategy should be constantly reviewed and adjusted, and this year is no different from any other. We have previously dealt with massive algorithm changes, core updates, the Helpful Content Update, spam updates and changes to other digital platforms. We must always be agile and move quickly, or get left behind.
The truth: Is SEO on the way out?
As happens at the start of every year, there was a slew of clickbait articles claiming that SEO is dead, or at least on life support.
The quick answer is no.
The longer answer is more nuanced. Manipulative SEO practices are definitely on the way out. Small niche publishers are no longer favoured and have suffered under the Helpful Content Update (HCU). Websites that publish content unrelated to their core purpose will lose a significant amount of traffic (see HubSpot).
This aside, high-quality SEO consulting has become even more critical. AI Search has added complexity and has some unique requirements, and traditional search has also raised the bar on requirements and compliance.
Poor quality SEO is dead. Dated SEO tactics are dead. Tricks and gaming search engines are dead. Content scaling with AI is dead. Keyword stuffing is dead.
But SEO is well and truly alive. It’s just different in 2025.
Starting your AI Search optimisation journey
In this post, I will share some introductory concepts and try to avoid anything too techy or complicated. It doesn’t help anyone if you don’t understand what I am talking about. What’s your first step in optimising for AI Search? First, it makes sense to see whether the various chatbots recognise and rate your brand and content. There are a few simple tests you can do.
Does AI Search recognise your website?
ChatGPT
Start with ChatGPT, which is currently sending about half of all AI Search referral visits to websites.
Add your website URL into ChatGPT. Pay attention to what ChatGPT says about your website. Does it summarise your website accurately? What other pages are offered when you ask “tell me more”? If it then digs down into services or specifics and asks if there is anything in particular you would like to find out more about, carry on down that rabbit hole. How comprehensive is ChatGPT’s understanding of your website and business?
Perplexity
Now try Perplexity. Perplexity’s results are very interesting. It starts with an accurate summary and then offers a range of related searches. So, for our domain, we get a Whippet Digital summary. Then we go to what services are provided and how Whippet Digital can help with digital marketing, success stories and industries. Each layer you dig down offers more nuanced related searches, like “What is the S.A.F.E. approach that Whippet Digital uses?”
Both ChatGPT and Perplexity have a very good understanding of our agency.
Andi Search
Now, do the same thing with Andi Search. It will gather information from your website, testimonials, social media profiles, media, agency directories and even your company registration. Andi even finds older content under different brand names if you have rebranded at some stage.
Try these out to see how well these platforms understand your brand and content.
Firecrawl
If you want to see how LLMs render your content, paste your URL into Firecrawl. You can look at both markdown and JSON responses. “Firecrawl is an API service that takes a URL, crawls it, and converts it into clean markdown. We crawl all accessible subpages and give you a clean markdown for each. No sitemap required.”
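If you prefer to script this check, here is a minimal sketch in Python. The endpoint path and request fields are assumptions based on Firecrawl’s v1 API at the time of writing, so verify them against Firecrawl’s current documentation and substitute your own API key and URL.

```python
# Hypothetical sketch: fetch the LLM-ready markdown for one page via Firecrawl.
# The endpoint and field names are assumptions - check Firecrawl's current docs.
import requests

FIRECRAWL_API_KEY = "your-api-key"          # from your Firecrawl account
URL_TO_CHECK = "https://www.example.com/"   # swap in the page you want to test

response = requests.post(
    "https://api.firecrawl.dev/v1/scrape",  # assumed v1 scrape endpoint
    headers={"Authorization": f"Bearer {FIRECRAWL_API_KEY}"},
    json={"url": URL_TO_CHECK, "formats": ["markdown"]},
    timeout=60,
)
response.raise_for_status()

# Print the markdown an LLM would see - is your key content actually in there?
markdown = response.json().get("data", {}).get("markdown", "")
print(markdown[:2000])
```

Reading the raw markdown is a quick way to spot content that only appears after JavaScript runs, or that is buried in boilerplate.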
Enterprise monitoring
If you want to go deeper in monitoring your brand mentions in AI, you can use several new tools, including AI Monitor, Profound, ProQuo AI, Evertune, Otterly AI, Revere and BrandRank AI. These allow brands to optimise across a wide range of current and emerging platforms, test brand performance and creative strategies, produce sentiment analyses and identify vulnerabilities.
As mentioned in the Profound summary, “Google’s first page is no longer enough. Less than half the sources cited by AI answer engines are in the top 10 results of search engines. New factors influence content visibility on AI platforms.”
Technical Requirements
Now that you know if your website is showing up well on AI platforms, we can address the new levels of technical performance necessary for optimisation for both traditional and AI search. Let’s look at some of the primary influences.
- Robots.txt – Allow each AI search crawler and keep traditional search bots enabled, but block AI training bots, as they use up valuable resources. Reference your XML sitemap here too (a sketch follows after this list).
- Speed of loading – important for traditional search engines and even more critical for AI search. Test your pages with Google’s PageSpeed Insights, check Core Web Vitals and work with your developer to get your website as fast as possible.
- Do not rely on JavaScript to render your main content. Many AI crawlers do not execute JavaScript, so plain HTML or markdown works much better.
- Implement all possible structured data. The more information you can give about intent and what the page is about, the better. Titles, descriptions, dates, types of content, Open Graph and more should be used. Use JSON-LD (a minimal example appears below).
- Use Search Console to ensure all pages you want found are crawlable and error-free. Use an SEO tool to audit for canonical or metadata issues, 404 errors, HTTP link issues, broken links, oversized images or CSS, duplicates and orphan pages.
- Create an llms.txt file (sketched after this list).
- Submit your XML sitemap (for example, in Search Console) and reference it in your robots.txt file.
- Structure your site so AI agents can easily understand navigation and content.
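To make the robots.txt and llms.txt items above more concrete, here are rough sketches of both files. The crawler names are the commonly documented ones at the time of writing, the domain is a placeholder, and which bots you allow or block is your own policy decision, so treat these as starting points rather than recommendations.

```
# robots.txt sketch - adjust to your own policy

# Traditional search engines
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

# AI search / answer engine crawlers you want to appear in
User-agent: OAI-SearchBot
Allow: /

User-agent: PerplexityBot
Allow: /

# AI training crawlers you may prefer to block
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Point crawlers at your XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

And a minimal llms.txt, following the llms.txt proposal (a plain markdown file served at your site root):

```
# Example Agency

> One short paragraph on who you are, what you do and who you help.

## Services
- [SEO consulting](https://www.example.com/seo/): what the page covers, in one line
- [Content strategy](https://www.example.com/content/): what the page covers, in one line

## About
- [Case studies](https://www.example.com/case-studies/): results and industries served
```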
AI crawlers are in their infancy, so these requirements will no doubt change in the coming months as refinements and improvements are made to their efficiency.
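Picking up the structured data item from the list above, here is a minimal illustration of a JSON-LD Article block as it might appear in a page’s head. The values are placeholders; the properties shown (headline, description, dates, author, publisher) are standard schema.org Article fields.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Is SEO dead in 2025?",
  "description": "A plain-English look at how AI search changes SEO priorities.",
  "datePublished": "2025-03-01",
  "dateModified": "2025-03-10",
  "author": {
    "@type": "Person",
    "name": "Jane Smith",
    "url": "https://www.example.com/about/jane-smith/"
  },
  "publisher": {
    "@type": "Organization",
    "name": "Example Agency"
  }
}
</script>
```

The same approach extends to FAQPage, Product, LocalBusiness and other types relevant to your pages; validate whatever you add with Google’s Rich Results Test.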
Content for AI Search
The vital balance is to address AI’s needs without compromising your strategy for traditional search engines. Google organic still has a massive market share, which means AI Search is nowhere near challenging it, despite anecdotal increases in the use of AI as a search engine.
Content specifically created for AI Search is often not that great for human users. I have seen pages that feel more like technical FAQs than content that will engage users. And there is often a lot of repetition as creators attempt to answer every possible question permutation.
Although Google’s algorithms have used AI for some time now and some of the nuances apply to both traditional search and AI search, content crawling and citations work slightly differently.
The trick is to create content in a way that makes it easy for AI to select answers from within your content without making it boring for human users. It is very easy to fall into the trap of repetition, duplication and listicle over-use, so ensure you have a robust human editing regime.
With that in mind, here are some content recommendations:
- Research and education topics – “What are…?”, “Explain how…”, “How many…?” and so on.
- Conversational queries – full questions as they would be naturally spoken.
- Comparisons.
- Problem solving – how, what, why, when, who.
- Complex questions requiring multiple sections.
- FAQs.
- Popular brand queries.
- Suggested additional questions from Perplexity.
- Address customer challenges and pain points.
- Up-to-date information.
- Short simple sentences – no waffle.
- Good use of clear headings – heading, answer; heading, answer; and so on (see the example after this list).
- Factual content with citations where needed.
- Creative formats – guides, tutorials, instructional content.
- Personalised queries – “How do I…?”
- Advanced analysis.
- Test content with TextRazor (a sketch follows below).
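To make the heading–answer pattern concrete, here is a rough illustration of how a section might be structured. The business and figures are invented; the point is a direct question as the heading, followed by a short, self-contained answer.

```markdown
## How long does a technical SEO audit take?

For most small business websites, a technical audit takes one to two weeks.
Larger ecommerce sites with thousands of URLs can take four weeks or more.

## What does a technical SEO audit cover?

Crawlability, indexation, site speed, structured data, internal linking and
mobile usability, finishing with a prioritised list of fixes.
```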
If you map some of these to your offerings and your customers’ needs and challenges, you will be able to connect with the right behaviour in AI search. Remember, you are providing value to your prospective customers. You are not hacking a new technology to generate traffic volume. If you approach your content from this perspective, you will do much better.
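If you want to automate the TextRazor check from the list above, here is a minimal sketch using the official Python client. It assumes you have an API key and that the client’s analyze_url, entities and topics methods still match TextRazor’s published examples, so verify against their current documentation.

```python
# Minimal sketch: see which entities and topics TextRazor extracts from a page.
# Assumes the official `textrazor` client (pip install textrazor); method names
# follow TextRazor's published examples - verify against their current docs.
import textrazor

textrazor.api_key = "your-api-key"

client = textrazor.TextRazor(extractors=["entities", "topics"])
response = client.analyze_url("https://www.example.com/services/seo/")

# Are the entities associated with this page the ones you expect to be known for?
for entity in response.entities():
    print(entity.id, round(entity.relevance_score, 2), round(entity.confidence_score, 2))

# Topics give a broader view of what the page appears to be "about".
for topic in response.topics():
    print(topic.label, round(topic.score, 2))
```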
Is AI Search Vulnerable to SEO Spam?
At this stage, unfortunately, the answer is yes. As with any new frontier, there will be plenty of people prepared to push limits and breach protocol. Spam tactics may work in AI Search, but if you also value traffic from traditional search, you will be shooting yourself in the foot. With the relatively small amount of traffic currently coming from AI chatbots, any spam tactics would need to be on a massive scale to generate any income or influence change.
Repeated information patterns influencing LLM text generation are one of the main concerns. LLMs predict token sequences based on patterns in their training data, and those predictions generate the results users see. This opens up a vulnerability where malign actors could flood a particular niche with AI-generated content, keyword stuffing and aggressive topic clusters. Google is focused on eliminating mass-generated AI content from the SERPs, but this can be a game of whack-a-mole.
If Google has to work hard at this through core updates, AI platforms will face significant challenges in combating these tactics. Google has added a section to its Quality Rater Guidelines specifically about scaled content abuse, citing “an abundance of content with little effort or originality with no editing or manual curation”, and gives several examples, including:
- Use of automation to create many pages that offer little or no value for website visitors.
- Scraping feeds or search results to generate multiple pages where little value is provided.
- Stitching or combining content from different pages without adding value.
- Creating multiple sites to hide massive scaling of content.
- Multiple pages with intensive keyword use that make little sense to a human reader.
According to these guidelines, sites that use these tactics should be given the lowest rating by Google’s quality raters.
The challenge for AI chatbots will be interpreting training data without key traditional indicators like domain authority and brand reputation. The low-hanging fruit of e-commerce sales will likely be too good to resist for some less scrupulous SEOs, even if the campaigns are short-lived. We risk returning to the old strategy of creating a site, spamming the hell out of it, getting caught, shutting it down, and starting again.
None of this is good for the web or for AI chatbot users. So far, there is documented evidence of spam tactics influencing local business transactional results and propagating medical misinformation: one project used 50K forum posts to promote debunked vaccine-effect studies and led to a 38% drop in citation accuracy.
As referral traffic grows from these sources, we will likely see more black hat tactics being implemented unless some guardrails are introduced. If there is no reaction, the quality of results from these platforms will be compromised and unreliable.
How much traffic is AI Search actually sending?
A recent study by Ahrefs looked at the traffic of 3,000 websites, recording volume and referral sources, and whether site size has an effect.
They looked at sites with more than 10K visitors, sites with between 1K and 10K, and sites with fewer than 1K.
The AI chatbots analysed were:
- ChatGPT
- Claude
- Copilot
- Gemini
- Perplexity
- Jasper
- Mistral
Here are some of the findings.
- 63% of sites received some AI traffic
- 98% of AI traffic is sent by 3 chatbots
- ChatGPT is the biggest supplier with ~50% of visitors
- Perplexity is number 2 with ~31% and Gemini is next with ~18%
- Smaller websites get a greater % of their traffic from AI
- On average, AI sources account for 0.17% of traffic
This excludes AI visitors that fall under the direct source category.
And it is essential to note that many, if not most, of these sites will not be optimised for AI traffic.
Several other surveys are being published, and most are not far off these results. It is interesting to see the differences based on niche. For example, B2B SaaS has a higher percentage than ecommerce or B2C service.
Copilot shows up better in some of the other surveys, but these can be affected by smaller sample sizes or geographic and language differences. Copilot traffic is also more likely to end up in the direct category than most others. Measurement can also be skewed by people referring to ChatGPT with a space, i.e. “chat gpt”.
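If your analytics tool lets you export referrer URLs, a simple classification like the sketch below helps you bucket AI referrals consistently. The hostnames listed are illustrative, not exhaustive, and will need updating as platforms change domains.

```python
# Rough sketch: bucket referrer URLs into AI platforms vs everything else.
# The hostname list is illustrative - extend and update it as platforms change.
from urllib.parse import urlparse

AI_REFERRER_HOSTS = {
    "chatgpt.com": "ChatGPT",
    "chat.openai.com": "ChatGPT",
    "perplexity.ai": "Perplexity",
    "gemini.google.com": "Gemini",
    "copilot.microsoft.com": "Copilot",
    "claude.ai": "Claude",
}

def classify_referrer(referrer_url: str) -> str:
    """Return the AI platform for a referrer URL, or 'Other / traditional'."""
    host = urlparse(referrer_url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    return AI_REFERRER_HOSTS.get(host, "Other / traditional")

# Example: tally a few exported referrer URLs.
for ref in [
    "https://chatgpt.com/",
    "https://www.perplexity.ai/search?q=example",
    "https://www.google.com/",
]:
    print(ref, "->", classify_referrer(ref))
```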
The expectation is that as this evolves, more prominent brands will begin to dominate AI traffic, but smaller sites are doing better at this stage. This is a very interesting anomaly.
The numbers and percentages are still relatively small and this is to be expected. This is no threat to Google’s massive market share but over the coming months, AI will continue to grow in analytics visibility as more and more users choose it as an alternative to traditional search engines. This behaviour is demographic-driven, so it would be wise to look at your customer base to see if they are likely to use AI or traditional search engines and what the trends are.
Another data impact comes from people using AI to gather information first, then turning to a search engine like Google, better informed, to find the right websites.
We will likely see more and more data coming out in the next few months as the platforms, marketers and analysts get better at tracking these new sources.
Other considerations
There are some other influences on your performance in AI Search.
Your brand and its reputation are among the biggest. Media, PR, social media, industry sites and citations all contribute strong signals.
Domain authority is a clumsy measurement, most useful for SEOs checking whether PR and backlink strategies are working. But it does seem to count with AI, even if most of it is generated by brand work rather than quality backlinks.
FAQs are important. Answer as many of your prospective customers’ questions and challenges as possible. Answer sections within pages are useful.
Content quality is a key consideration so don’t push out poorly edited AI content. It’s not good for AI and terrible for your brand.
Author authority is essential. This is an advantage if the author is known as an expert in a specific niche. Unique insights and experience will also add additional value.
User experience should also be at the top of your list. Your goal is to create a fast-loading site with the right content at the right time and in the right place.
Optimisation tweaks like schema will give AI crawlers a better understanding of your content and page intent.
Fresh content with good authority signals will be favoured. Citing trusted sources should form part of your content strategy.
And finally, have a robust traditional search engine content and SEO strategy. Most of the top results in AI Search feature in the top 10 positions on Google and Bing.
What next?
This post covered some of the basic SEO concepts in 2025, but there is a lot more I haven’t included, which I will share insights on in the coming months. I am posting regularly on LinkedIn, so connect with me here to stay up to date.

About the author
Mike Morgan co-founded Whippet Digital in 2010, gaining international recognition for innovative content marketing and SEO strategies in their first few years of operation. Mike designs digital growth strategies for a range of clients in New Zealand and Australia and utilises the expertise and experience of the specialists in the Whippet Digital team. Connect with him on LinkedIn or Bluesky and visit the Whippet Digital website to book a call.
If online visibility is important to your business and you’re keen to learn more about the (rapidly) changing nature of SEO please get in touch with one of the Proxi team and we’ll help you connect the dots.