Hallucinations and the case for content accuracy

AI-generated content has been flooding the internet for some time, and the volume is only going to grow. This is giving search engines quite a headache, especially when it comes to the accuracy of content.

Firstly, many AI tools are stuck in time. ChatGPT’s knowledge stops at September 2021, Anthropic’s Claude has a similar cut-off, and other tools are frozen at their own points in the past. This means they have no new or fresh data to reference in their responses.

Secondly, AI chatbots suffer from hallucinations, which Wikipedia describes as “a confident response by an AI that does not seem to be justified by its training data”. Basically, they make things up! We’ve seen this quite often during our SEO testing over the last few months: the AI will invent fake quotes and fake businesses, and link to resources that don’t exist.

Website owners may be using these tools to generate their own digital marketing content, unaware that they may be publishing misleading or fake information.

Search engines now have the challenge of determining what is true and what isn’t, and they can’t afford to surface incorrect information. Historical facts may not be such an issue, but forward-looking claims are another matter: AI could distort them and paint a virtual future that is, ultimately, just a hallucination. For now, make sure your content is accurate and we’ll worry about the future tomorrow!
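
One practical defence, if you do use these tools for drafting, is to verify every URL the AI includes before you publish. Here’s a minimal Python sketch of that idea; the regex, the find_dead_links function and the example draft are our own illustrative assumptions, not any standard tool:

```python
import re
import requests  # third-party: pip install requests

# Rough pattern for URLs in plain text (illustrative, not exhaustive).
URL_PATTERN = re.compile(r"https?://[^\s\"'<>)\]]+")

def find_dead_links(draft_text: str, timeout: float = 5.0) -> list[str]:
    """Return URLs in the draft that fail to resolve or return an error."""
    dead = []
    # Strip common trailing punctuation the regex may have swallowed.
    for url in {u.rstrip(".,;:!?") for u in URL_PATTERN.findall(draft_text)}:
        try:
            # HEAD is cheap; some servers reject it, so fall back to GET.
            response = requests.head(url, allow_redirects=True, timeout=timeout)
            if response.status_code >= 400:
                response = requests.get(url, allow_redirects=True, timeout=timeout)
            if response.status_code >= 400:
                dead.append(url)
        except requests.RequestException:
            dead.append(url)  # DNS failure, timeout, bad TLS, etc.
    return dead

if __name__ == "__main__":
    # ".invalid" is a reserved TLD that never resolves, so this link gets flagged.
    draft = "A 2023 study (https://research.example.invalid/ai-content) says usage tripled."
    for url in find_dead_links(draft):
        print(f"Check before publishing: {url}")
```

A reachable link isn’t proof the citation is genuine, and a temporary outage isn’t proof it was hallucinated, but a check like this is a cheap first filter before a human fact-check.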
