A and I. Two letters you’ve heard with increasing frequency over the last few years, and with little sign of a slowdown.
It now seems to be something we can’t avoid - we see AI-generated advertisements, we hear AI-generated voices, we read AI-generated articles and, with every day that passes, it becomes harder to tell what’s AI and what isn’t.
People are becoming more comfortable using it in their daily lives, which raises a lot of questions for businesses. For example:
- How will users' web browsing behaviour change?
- What does this mean for our website?
- Will we even need a website in the AI future?
This article focuses on the final question. As it’s my job to build websites, it's something I ought to have some opinion on. So, let’s jump in and go on this journey together to find some answers.
How do AI tools get their data?
A minor detail to cover off first is that the tools we use (ChatGPT, Claude, Gemini etc.) are actually just interfaces that allow us to interact with the Large Language Models (LLMs) that sit behind them.
These LLMs are what we 'converse with' and they can be thought of as text prediction engines. Every token (a word, or fragment of a word) that an LLM produces in its response is a prediction of the most likely next token, based on the patterns in its training data. Its level of confidence boils down to how much data it has been trained on and the quality of that data.
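To make that concrete, here’s a toy sketch in Python - emphatically not a real LLM, just a word counter - that shows the basic “predict the most likely next token” idea:

```python
# A toy "text prediction engine" (not a real LLM): for each token,
# count which token most often follows it in some training text,
# then generate by repeatedly picking the most likely next token.
from collections import Counter, defaultdict

training_text = "the cat sat on the mat . the cat ate . the dog sat on the rug ."

follow_counts = defaultdict(Counter)
tokens = training_text.split()
for current, nxt in zip(tokens, tokens[1:]):
    follow_counts[current][nxt] += 1

def predict_next(token: str) -> str:
    """Return the most frequent follower of `token` seen in training."""
    followers = follow_counts[token]
    return followers.most_common(1)[0][0] if followers else "."

token = "the"
output = [token]
for _ in range(4):
    token = predict_next(token)
    output.append(token)

print(" ".join(output))  # prints: the cat sat on the
```

A real LLM does something conceptually similar with billions of parameters and unimaginably more training text - which is why the quantity and quality of that data matters so much.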
That data is gathered in a variety of ways, but the most common is scraping: the companies that create these LLMs have an army of bots scouring the web, downloading content and sending it back to feed the machine.
I will admit this is a very high-level overview that skips a lot of steps; however, the key point is that the data these models are trained on is publicly available - something you or I could find if we wanted to.
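To illustrate just how accessible that data is, here’s a minimal, hypothetical sketch of the scraping idea in Python (the URL is a placeholder, and real crawlers are vastly more sophisticated):

```python
# A minimal sketch of the scraping idea: fetch a page, strip the
# markup, and you have text to feed a model. Real crawlers do this
# at enormous scale, but the core mechanic is this simple.
import urllib.robotparser

import requests  # pip install requests beautifulsoup4
from bs4 import BeautifulSoup

URL = "https://example.com/"  # placeholder page

# Polite crawlers check robots.txt before fetching anything.
robots = urllib.robotparser.RobotFileParser("https://example.com/robots.txt")
robots.read()

if robots.can_fetch("*", URL):
    html = requests.get(URL, timeout=10).text
    # Strip the HTML down to the plain text a model could train on.
    text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)
    print(text[:200])
```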
Remember this, we’ll need it for later.
Before we go any further on our journey and explore future user behaviour, we need to take a quick pit stop and look at user behaviour today and, importantly, how we understand it.
User behaviour of today
The way we currently use the internet has evolved over the last couple of decades but, at its core, it’s still the same: we log on, we search for a thing, we click a link, we find or buy a thing, and we move on. Search engines are our keys to the internet and enable us to find what we’re looking for.
As website builders and marketers, we have tools that show us how people are surfacing our content and, importantly, tools that show us what users are searching for, so we can tailor our content to fit.
In theory, this is great: we’ve got all this visibility, so we can find users at the source and bring them in. In reality, however, it’s led to modern-day SEO, which is a bit of a minefield and has left us feeling as though we’re in some “keyword handcuffs”, driven not by what we want to present, but by what we’re told we should present.
Crucially though, that visibility is very important. It gives us insight into how our content is performing and whether we’ve hit the mark. Are we appearing in the right places at the right time?
Today, you can answer this question definitively, which leads us out of the pit stop and onto the next stage of the journey…
User behaviour of tomorrow
With users slowly migrating to AI tools - and in some cases the AI tools migrating into the search engines we’re so used to - what does this mean for our visibility? Can we see what users are asking? Can we see what the tools said about our website and what information they referenced? Most importantly, did the AI display accurate information?
The answer to all the above is: we just don’t know.
The visibility we have today isn’t there in tomorrow’s AI world. The current suite of tools doesn’t tell us how or why our content was surfaced.
For all we know, the user could have been presented with information that’s no longer relevant because it was scraped from old data. Or the AI misinterpreted some information and then misrepresented the business.
This presents us with an interesting conundrum. On one hand, it almost releases us from the aforementioned keyword handcuffs, freeing us to focus on producing high-quality content because we think it is meaningful and important - the kind of content LLMs tend to favour anyway.
On the other hand, we lose crucial insight: are we appearing in the right place at the right time?
The best we can do today is get an indication of how much traffic these AI tools are driving to our websites. Users of our Beacon benchmarking tool can already see this data in their dashboard and it’s providing some eye-opening insights.
For instance, across all our Beacon members, less than 1% of their website traffic is driven by AI tools.
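As a purely illustrative sketch (an assumption on my part - not how Beacon actually works internally), measuring this mostly comes down to checking each visit’s referrer against a list of known AI-tool domains:

```python
# An illustrative sketch (an assumption, not Beacon's internals):
# classify a visit as AI-driven by checking its referrer against
# a list of known AI-tool domains. The list here is a small sample.
from urllib.parse import urlparse

AI_REFERRER_DOMAINS = {
    "chat.openai.com",
    "chatgpt.com",
    "claude.ai",
    "gemini.google.com",
    "perplexity.ai",
}

def is_ai_referral(referrer_url: str) -> bool:
    """True if the visit's referrer hostname is a known AI tool."""
    host = urlparse(referrer_url).netloc.lower()
    return any(host == d or host.endswith("." + d) for d in AI_REFERRER_DOMAINS)

visits = [
    "https://www.google.com/search?q=web+design",
    "https://chatgpt.com/",
    "https://www.perplexity.ai/search",
]
ai_share = sum(is_ai_referral(v) for v in visits) / len(visits)
print(f"AI-driven share of this traffic: {ai_share:.0%}")  # 67%
```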
I can already hear your next question, so let’s swiftly move on.
How can we take back ownership of our content and regain visibility?
Good news. All is not lost here.
We have an opportunity to take back control. We can regain that lost visibility. We can serve users with the best quality content. We can even serve users information that we know these traditional AI tools won’t have from their scraping.
How?
Build your own AI tool.
I know it sounds like an impossible task - but I’m here to assure you that not only is it very doable, it’s one I think will be vital.
Let me state my case.
- You lead the change. User behaviour with AI tools is still evolving so it may well change, but what we do know is that people are using these platforms more and more. People are getting used to searching with natural language, speaking to AI as if they were talking to a person. A key benefit people find with AI is how quickly it can surface information. Tasks that would have taken minutes of cumbersome manual trawling now take seconds. By building your own tool, you give users access to all your content in the way they’ve become accustomed to.
- You control the content. The big drawback with these AI tools is that the data they train on is often outdated by the time we have access to the model, and while they can use online search, it’s down to the model to correctly source and interpret the data it finds. By building your own tool, you can ensure the model only has access to the content you give it (see the sketch after this list). A crucial point here is that you can give a model “exclusive information” that only your model can use and access, meaning users would find the best information about your business by using your tool.
- You regain visibility. By building your own tool, you can see what users are searching for. You see their areas of interest and can drive action from there. Your AI tool can see the user’s intent and provide contextual follow-ups - “Do you want me to find the contact information of the person you should speak to about this?”
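To make “build your own AI tool” less abstract, here’s a minimal sketch of the common retrieval-augmented approach: you keep your content in your own store, retrieve the piece most relevant to the user’s question, and let the model answer only from that. Everything here is an illustrative assumption - the naive keyword retrieval, the gpt-4o-mini model name and the placeholder content would all be swapped for your real stack:

```python
# A minimal retrieval-augmented sketch. Illustrative assumptions:
# the model name, the naive keyword retrieval and the content are
# all placeholders for whatever stack you would actually choose.
import re

from openai import OpenAI  # pip install openai; expects OPENAI_API_KEY

# Your content store - including "exclusive information" that
# public scrapers will never see.
DOCUMENTS = {
    "opening-hours": "Our studio is open Monday to Friday, 9am to 5:30pm.",
    "contact": "For new projects, email hello@example.com (placeholder).",
    "services": "We design and build websites and custom AI tools.",
}

def words(text: str) -> set:
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(question: str) -> str:
    """Naive retrieval: the document sharing the most words with the
    question. A real system would use embeddings, but the principle
    is the same: the model only ever sees content you chose."""
    return max(DOCUMENTS.values(),
               key=lambda doc: len(words(question) & words(doc)))

def answer(question: str) -> str:
    context = retrieve(question)
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {"role": "system",
             "content": "Answer using ONLY this context: " + context},
            {"role": "user", "content": question},
        ],
    )
    # Every question is logged - this is your visibility, regained.
    print(f"User asked: {question!r}")
    return response.choices[0].message.content

print(answer("When are you open?"))
```

The design point that matters is the retrieval step: the model never answers from stale, scraped data, only from content you control and can update the moment it changes - and every question asked is visibility you’ve won back.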
I want to be clear and emphasise that I don’t think your website should just become a home for an AI chatbot and nothing else. On the contrary, it should become a place full of rich, high-quality information that can be surfaced in the traditional way we’re used to today and/or via your very own AI tool that can help drive user action and serve users information in seconds.
So, with all that said, what’s the answer to the original question?
Well, it should be no surprise that I think not only will you need a website, but that your website will be more important than ever to ensure you can take back the control these AI tools have taken from us.
