Justin's SEO Insights
Justin, our Technical Director, shares his thoughts on what is happening in the world of SEO...
Is site engagement a ranking factor?
It’s no secret that the Chrome browser records a significant amount of user interaction data and sends quite a lot of it back to Google. Much of that data is presumably used to help Google understand user profiles so that it can better target ads to them.
But there is also a possibility that the data is used for organic ranking purposes too. In fact, it’s highly likely that it is.
Following Google’s DOJ trial in the US last year, user engagement with the search results was highlighted as one of the main ranking factors. It would make complete sense that, as well as interactions with the search results, interactions with web pages and their elements would also be used for ranking purposes.
The DOJ is threatening to remove Chrome from Google’s control, which could severely disrupt the flow of data from Chrome that is used for ranking purposes. It will be interesting to see how this progresses.
In the meantime, you can see your own website engagement metrics by typing chrome://site-engagement/ in your Chrome browser address bar.
August Core Update Helping Small Websites? No!
Following the Helpful Content Update (HCU) which rolled out in September 2023, a significant number of small websites and small businesses were negatively impacted. It appeared, once again, that Google was favouring large, established websites in search results even more than before.
The Core Update that followed in March of this year was also quite extreme, with no recoveries visible from the HCU update, and it led to probably the biggest response from the SEO community that I’ve seen in years. It ultimately forced Google to create a feedback form when the update was finished to understand what was potentially wrong with the update and the search results.
The response from Google was that for the next core update, they would look to better surface the smaller websites, which often have far superior content to the more authoritative websites, but after two weeks of the update rolling out, there has been no sign of this. Of the impacted websites that we monitor, not one has recovered. If anything, the latest core update has further cemented the authority of the larger websites in the search results.
It could be said that Google has broken the web as it continues to hoard traffic and distribute the rest to select websites. If there was ever a moment to consider diversification of traffic sources…it’s now.
Recovery from the helpful content update IS possible
It’s no secret that the helpful content update (HCU) which rolled out in September 2023 was quite a catastrophic update for many website owners. It appeared to mainly target content websites, i.e., websites particularly focused on publishing content and monetised through ads and affiliate links.
As of today, there are no publicly reported cases of recoveries from this update. However, we have successfully recovered two websites that lost practically ALL of their Google traffic following the HCU update.
I didn’t want to post this too soon, but they have now been recovered for over 3 months. One has returned to traffic levels above those it received before the HCU update, whilst the other is at 60% of its previous traffic, although this website is now much smaller than it was pre-update, so this was the expected traffic level at this stage.
What did we do? The reality is that after the update hit, it was obvious how things had changed in the results and feedback from Google engineers made it clear that the world has changed, the internet has changed and this is how it is now.
It took a deep assessment of what we previously had on the websites and a realignment of that content to what Google now expects to rank. Listening to Google’s advice that “the internet has changed” is probably the best advice I could give to anyone who’s still struggling to recover their website. Recovery is possible. Get in touch with us if you need some help with this!
Google has killed the creator economy with the March updates
There was a time, not so long ago, when if you had some knowledge about a particular subject, or just wanted to have a bit of fun, you could make a living for yourself writing a blog or creating what you deemed to be a useful website. However, the last 12 months, and in particular Google’s March algorithm updates, seem to have all but put an end to the creator economy.
As Google tries to tackle web spam and the influx of AI content, it is quickly destroying what was left of the small, independent websites that provided exceptional value to users searching on their topics. Reading the comments on forums and news sites from website owners who have lost practically all of their traffic with the recent updates, it’s heartbreaking to realise the serious effects that these updates can have on people’s livelihoods.
Many people depend on Google’s traffic to pay the bills, and as always happens with these large updates, good, useful and legitimate websites get caught up in them and suffer unnecessarily. I’ve seen it happen over and over again during the past 20 years since I started out as one of those creators.
The search results are now full of large authority websites, forums and so much other “noise” that it’s difficult for small creators to be seen. It’s not impossible, but Google’s just made it that much harder for everyone. It’s not something that is ever likely to be reversed now, so they have effectively killed an economy which they practically created when they launched AdSense in 2003. However, the party isn’t quite over…it’s just a different sort of party now!
Why “average” backlinks don’t work
There was a time…a very long time ago…when, as SEOs, we used to say that “a link is a link”. You would do anything and everything to get new links to websites, regardless of what sites they were on. Things have changed quite a lot since then, and as Google got better at distinguishing good links from not-so-good ones, the effort shifted to always focus on quality, relevant and helpful links.
This is why, at Artemis, we do all link-building in-house. That way we can control the quality and effectiveness of each link we build for clients. However, it is VERY time-consuming and one of the most challenging things we have to do, and that is why most SEO agencies actually outsource their link-building to link-building companies (I won’t name them here).
As our link quality standards are very high, we started wondering if perhaps they were too high, making things more difficult for ourselves than they should be. So, we decided to test the links built by one of these outsourced link-building agencies to see if their links actually have an impact on the rankings of a website. Our test website was one we built over a year ago and for which we hadn’t done any prior link building. That way, we could be confident that any changes in the rankings or search visibility of the website would be solely down to the links.
The links from the agency wouldn’t even meet our minimum link quality standards and, as expected…nothing happened. The links had no effect. We weren’t surprised. Quality links work; there’s no point in wasting money on low-quality links these days…a link is not always a link.
Do clicks from the search results impact rankings?
In the 20+ years that I’ve been doing SEO, the topic of the influence of clicks on search results, and the subsequent impact on the rankings of pages, has long been discussed and debated at length. The assumption has always been that yes, if a website in the search results seems to get a higher level of engagement, compared to other websites, then it’s likely that that website will continue to rank, or rank better for the associated search query.
Recent documents released by the US Department of Justice, as part of an antitrust trial brought against Google, appear to back up this theory. The documents seem to suggest that this “ranking factor” should be kept confidential outside of Google, due to the ease with which this ranking signal could be manipulated.
And it makes sense. When we are optimising websites, we also analyse how users interact with pages, as this helps us to optimise them for conversions. Google does the same thing to ensure that its results are in line with what users appear to find useful and relevant to their searches. Google analyses clicks, scrolls, pauses, hovers and much more to understand engagement with the results.
The concept of “chasing the click” has always been relevant; it just hasn’t been spoken about in public by Google engineers…but it is a thing and it does matter.
Why are there so many Google algorithm updates?
This morning Google announced another Core Update, and yesterday a Spam Update. This follows on from the Core Update in August and the not-so-helpful Helpful Content Update in September. That’s a lot of big updates in a very short space of time!
Although Google makes many minor updates continuously, which is why the search results are never static, these much larger updates, which it announces publicly, tend to be the ones that have the biggest impact and can really negatively (or positively) affect the rankings of websites.
Over the years, website (business) owners have called on Google not to release significant updates in the lead-up to the peak Christmas shopping period, but the reality is that we will see more key updates even in December too.
But why so many? Current estimates suggest that 250,000 new websites are created EVERY day. Each will have many pages, so there are well over 1 million new pages for Google to crawl and index every day. As the web grows, it gets more and more difficult to filter the good from the bad. Google also has to tackle evolving spam techniques and the amount of AI-generated content flooding the internet.
The Christmas period also tends to encourage spammers to try to make the most of the busy shopping season, which is why we’ll see updates around this time too. The reality is that there are increasing challenges in crawling the web and delivering quality search results, so we’re going to see many more significant updates. Stable search results are well and truly a thing of the past.
The constant need for testing in SEO
I receive a regular email from a large SEO website that shares some findings of SEO tests that they have run for some of their clients. The results are often as expected, although not always.
The fine details of the experiments are very limited, which means the tests can’t simply be repeated for another website with the expectation of similar results.
There are so many factors that determine why a page ranks above another. Changing certain elements, such as titles, headings, content, etc., can help but every search query is different, every website is different and there is no one-size-fits-all approach that can be adopted.
It means that, as SEOs, we have to be constantly testing new ideas, new optimisations, new page layouts, set-ups and content approaches. As Google evolves, we must do the same. We don’t always get it right, but failure is an important learning exercise. Through rigorous testing we can build up a much better picture of what works and what doesn’t, as in the simple sketch below. This is how we can consistently achieve good rankings and traffic growth for our clients.
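To make that concrete, here’s a toy sketch (hypothetical keywords and positions, not real client data) of how a simple before-and-after test can be logged: record baseline rankings, change one element on the page, wait for a settling period, then compare.

```python
# Toy sketch of logging a before/after SEO test (hypothetical
# keywords and positions). One change per page at a time, then
# compare rankings after a settling period.
from statistics import mean

baseline = {"blue widgets": 8, "buy blue widgets": 12, "widget shop": 15}
after = {"blue widgets": 6, "buy blue widgets": 11, "widget shop": 16}

# Positive delta = the page moved up after the change.
deltas = {query: baseline[query] - after[query] for query in baseline}
for query, delta in deltas.items():
    print(f"{query}: {delta:+d} positions")
print(f"Average change: {mean(deltas.values()):+.1f} positions")
```

It’s deliberately simplistic, but even a log this basic, kept consistently across tests, starts to reveal which changes tend to move the needle and which don’t.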
Test, test, test…test some more and keep testing. It’s all about the testing!
The root problem of the SEO industry
So you want to be an SEO? That’s great…just call yourself one! You don’t need any qualifications or accreditations to start offering SEO services. You can start a website, regurgitate what others or Google say, get enough followers on Twitter (sorry, X) and now you’re a fully-fledged SEO.
What I’ve often seen is that the SEOs who are prolifically active on social media on a daily basis are not the best SEOs. Most don’t even have their own websites and don’t practice what they preach. With so much time spent on social media, when do they ever get any work done? In fact, the best SEOs that I know are the ones who never share a single comment about what they do on any platform.
There was a great example the other day of the sort of bad SEO advice that you see on social media all of the time. A user on Twitter, who describes himself as a “Technical SEO”, suggested that if your web pages load too quickly…you should slow them down! His argument was that increasing a user’s time on page by making it load slowly will make Google think that the user is finding the page helpful and therefore increase the rankings of the page.
As bad advice goes, that’s got to be right up there with “grow your website authority with bookmarking links” (yes, this is still offered as a service!).
Nothing beats experience and knowledge backed up by data. We’ve been doing SEO for our own websites and clients’ websites for over 20 years and we continuously invest heavily in R&D. Don’t always believe everything you read online about SEO…unfortunately so much of it is ****.
Should you use AI-generated images for your website?
If you’re a business owner or you run your own website, you may have fallen foul of using an image that you don’t have the rights to use. This may then have been accompanied by a legal letter demanding a serious amount of money in compensation.
The days of scouring the internet for images, or paying monthly subscriptions to download professional photos, may be well and truly over, with the astonishing advancements in AI image-generation tools such as Midjourney making them mostly redundant.
But are they safe to use on your website, and how will they impact your website in search? Google’s advice on the use of AI for content doesn’t just apply to written content, but to images too. As long as the images are helpful, useful and complement the page, then there is no reason whatsoever not to use them. Does an image created in Photoshop have any more value than one created by AI? Not necessarily; they can both be just as good.
For someone like myself with zero artistic ability, AI image-generation tools provide a convenient way to add high-quality and unique visual elements to the pages of my website. In fact, since I have been using them, they have started appearing in the search results. For now, Google isn’t seeing them as a negative. And to close off this insight…here’s an AI-generated image of a robot making images!
Should you delete old content from your website?
This has been a hot topic recently due to an announcement from the large tech website CNET that it had deleted thousands of pages from its website to increase the overall freshness of the site, with a view to improving Google rankings.
Various voices from Google joined the discussions to state that this does not help and is not “a thing”. It seems that CNET has been a little misguided with its approach to freshness.
However, it’s not as bad a move as it’s being made out to be. When the Panda algorithm launched in 2011, one of the main factors that could cause a site-wide ranking penalty was having a significant amount of low-quality content. What is deemed low quality can be argued at length, but the reality is that it is better to have a smaller, higher-quality website than one with many pages that mostly don’t rank.
Understanding the concept of PageRank, and how Google distributes the value of inbound links across internal pages, makes it clear why having fewer pages results in more PageRank being passed to each of them.
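As a rough illustration, here’s a minimal sketch of that dilution effect. It assumes a deliberately simplified model in which a site’s inbound link equity is split evenly across its internal pages; real PageRank is computed iteratively over the whole link graph, so treat this purely as an intuition aid.

```python
# Toy model of internal PageRank dilution. Real PageRank is computed
# iteratively over the full link graph; here we simply split a fixed
# pool of inbound link equity evenly across internal pages to show
# why fewer pages each receive a larger share.

def equity_per_page(total_link_equity: float, page_count: int) -> float:
    """Share of inbound link equity each internal page receives."""
    return total_link_equity / page_count

total_equity = 100.0  # hypothetical units of inbound link value

for pages in (2000, 500, 50):
    share = equity_per_page(total_equity, pages)
    print(f"{pages:>4} pages -> {share:.3f} units per page")
```

The numbers are arbitrary, but the shape of the effect is the point: prune a site from 2,000 pages to 500 and, under this model, each remaining page receives four times the share.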
Google doesn’t want you to remove pages unless they are really bad, as they may one day answer an obscure query, but it does want you to keep them as up to date as possible. This is challenging for large websites. The reality is that less is more, and it’s better to have fewer pages that are very good and up to date than a large website with many pages of outdated content.
Has the novelty of generative AI worn off?
It’s difficult to believe that just over 6 months ago, “ChatGPT” was not something that anyone even knew existed. But since its launch late last year, it has been the hot topic of so many conversations and news articles. The reality is that generative AI was nothing new but ChatGPT made it accessible and somewhat fun.
It even kicked off a search war between Google and Bing, with both launching their AI chatbots so as not to appear to be getting left behind. I can only imagine the fear within Google on seeing these new tools potentially eating into its lucrative search dominance. It may have seemed like the “Nokia/iPhone” moment for Google!
But you may have noticed that things have gone a little quiet. There are fewer news articles about generative AI, there is less chatter, and some of that initial buzz has definitely faded. In fact, an independent source last week estimated that traffic to ChatGPT, having plateaued, has now started to decrease. A recent survey of SEOs also showed that most are no longer interested in Google’s new experimental generative AI results in the US.
Whilst these tools definitely have some very powerful business use cases, for the vast majority of people, on a day-to-day basis, they don’t. We are possibly seeing the initial novelty wearing off, although these tools will evolve and they are here to stay. I suspect quite strongly that Google will re-evaluate how it integrates generative AI into its search results if the early signs are that it’s not actually what people ultimately want from a search engine.
This week we say farewell to Google Analytics
When I was first starting out building websites in 2000, Google Analytics was but a dream. In those days we had very basic analytics software, such as AWStats, although those with some money to spend could opt to pay for an expensive tool called Urchin Analytics, which provided much more website usage data.
Luckily for website owners around the world, Google was on a spending spree and in 2005 spent a reported $30m to acquire Urchin. The product was then rebranded as “Google Analytics” (GA) and it was made available to everyone…for free! Happy days 🙂
For the last 18 years we’ve grown to know, understand and query GA like seasoned data analysts and so it is with some sadness (and frustration) that we bid a fond farewell to a tool that’s been a key part of our business for as long as it’s existed.
GA4 is the new analytics tool replacing GA as of the 1st of July. To say that GA4 has received a very cold reception since it was introduced is putting it mildly. But we will need to adopt it, adapt to it and hopefully in 18 years’ time, learn to love it…at least a little bit. Farewell GA, you will be missed.
What constitutes quality content? It’s more than just the text
Google has always been very clear about one thing…“Produce great quality content that users will love”. Follow this simple guideline and you’ll be rewarded with Google traffic. Actually, it’s not that straightforward; there are many factors at play. But what is “quality content”?
In a recent Google SEO office-hours Q&A session, Google’s Gary Illyes was asked about issues related to the indexing of a website. Gary clarified that whether Google properly indexes a website depends upon the popularity of the website (backlinks) and the quality of the content.
Previously, Google engineers have stated that quality content is not just about the textual part of the page, but also the quality and integration of images, the page layouts and page load speed, to name a few.
One of the ways to look at this is from an intent perspective. If a user is searching for “anniversary gift ideas”, it’s unlikely that they want to read a lot of text; they probably want to see images of actual gifts that they could buy for their partner. A page with just text on it is unlikely to rank for this search term. In many cases, especially for searches where the intent implies an expected visual result, it’s probably better to prioritise the images, even above the text.
There is a new trend now towards the use of AI images on web pages. These can sometimes be misleading to users and Google has announced that they will be identifying AI images in their search results and labelling them accordingly. With the ease now of creating new images, it’s a great time to ensure that web pages are deemed higher quality by search engines.
Users tend to scan text, but will always focus on an image. Use them to your advantage to make pages high quality…you will be rewarded!
Hallucinations and the case for content accuracy
AI content has for some time been flooding the internet, and it’s only going to increase, possibly exponentially, going forward. This is giving search engines quite a headache, especially when it comes to the accuracy of content.
Firstly, many of the AI tools are stuck in time. ChatGPT is stuck in September 2021, Anthropic’s Claude is stuck in early 2020 and other tools have similar cut-off points. It means that they don’t have any new or fresh data to reference and use in their responses.
Secondly, AI chatbots suffer from hallucinations, which Wikipedia describes as “a confident response by an AI that does not seem to be justified by its training data”. Basically, it makes things up! We’ve seen this quite often during our SEO testing over the last few months. The AI will create fake quotes, fake businesses and link to resources that don’t exist.
Website owners may be using these tools to generate their own digital marketing content, unaware that they may be publishing misleading or fake information.
Search engines now have the challenge of determining what is true and what isn’t. They can’t afford to surface incorrect information. Historical facts may not be such an issue, but future facts and information…AI could distort these and create a virtual future which, ultimately, is just a hallucination. For now, just make sure your content is accurate and we’ll worry about the future tomorrow!
Focus on the money (terms)
At Artemis we’ve always had one key philosophy and it’s the essence of the messaging on our website…we help our clients make more money. If clients aren’t getting more leads or generating more revenue, then we aren’t doing our job properly.
For that reason, we’ve always focused on the search terms which drive targeted traffic to their websites. We’ve never been about chasing traffic numbers. It’s easy to just get more traffic, but if it’s not targeted and the intent isn’t commercial, it’s generally quite pointless.
With the relentless arrival of AI directly in the search results, this has now become even more relevant. Many SEO strategies over the years have been focused on increasing traffic through content generation, but even where this has worked, it’s not likely to work for much longer. The AI will be serving up information directly in the search results.
The only search terms that will matter are the money terms, such as searches for local services and commercial queries for products. We’re fortunate to have always focused on the money terms; it’s what works now and will work in the future.
Neeva was neeva going to work
Excuse the pun in the title! This week Neeva announced that it was shutting down its subscription-based, ad-free, privacy-focused search engine. Although this news was greeted with quite some surprise in the search world, it wasn’t really a surprise at all. How many people have actually used Neeva? No one I know has ever heard of it, let alone used it.
You only have to look at the struggle that Bing has had, with all of the mighty resources of Microsoft behind it, trying to compete with Google and failing. Bing has a minuscule share of the search market, probably less than 5%.
So, how was a search engine that you had to pay to use ever going to compete against the free, established options? Neeva stated that the problem wasn’t so much getting users to pay; it was the underestimated difficulty of getting people to change from what they are so familiar with. Once users are used to having something for free, it’s practically impossible to make them pay for it. Even Twitter, with its pointless paid options, has only got a tiny percentage of users to part with their money, and most of those are either Musk fanatics or those who desperately want a little blue tick…which means nothing now.
Search has evolved to the extent that it is highly complex and requires vast amounts of money. It’s also part of our lives. It’s highly likely that people will still “google it” for a long time to come. The competition just isn’t there, and we’re so ingrained in Google’s ecosystem.
Why Google will still be sending traffic to websites in an AI world
With all the recent advancements in AI, and especially Google’s planned integration of its chatbot directly into the search results, one of the biggest concerns for any business or website owner is whether Google will still end up driving traffic to their websites when search becomes “generative AI search”.
The short answer is “Yes”. And the reason is very simple. It’s all about the money. In 2022, Google generated a total of $224.47B in advertising revenue, of which $162.45B (72%) was from ads in search (PPC), $32.78B (15%) from ads on third-party sites (AdSense) and $29.24B (13%) from YouTube.
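For anyone who wants to double-check those proportions, here’s a quick sanity check in a few lines of Python (the figures are just the segment values quoted above, in billions):

```python
# Quick sanity check on the 2022 advertising revenue split quoted
# above (all figures in $B).
segments = {
    "Search ads (PPC)": 162.45,
    "Third-party sites (AdSense)": 32.78,
    "YouTube ads": 29.24,
}

total = sum(segments.values())
for name, revenue in segments.items():
    print(f"{name}: ${revenue}B ({revenue / total:.0%})")
print(f"Total advertising revenue: ${total:.2f}B")
```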
Google can’t afford to jeopardise any of that revenue. On the contrary, it has to show continuous growth. Paid ads send traffic to websites, and the content network, which displays Google ads (AdSense), generates billions in revenue on its own. The content network alone generates more revenue than YouTube, and those sites get a significant amount of their traffic from search.
Website traffic will change, but it’s not going away. It can’t, it’s too valuable for Google.
Google’s new “Generative AI Search” announced
Yesterday was Google’s much-anticipated annual Google I/O conference, and the focus throughout the whole presentation was extensively on the application of AI in Google products. The main area of interest was how Google plans to integrate AI into the search results, in what it called a “Generative AI Search Experience”.
What Google revealed was not a complete surprise. We were anticipating that, unlike Bing, Google’s AI chatbot would be fully integrated and front and centre in the search results. It showed how the AI will give users a more “helpful” search experience by making it easy to expand on topics and to find products to buy based on Google’s huge Shopping database.
It was interesting to note that there are areas where the AI is less likely to appear, and that’s for critical searches: generally anything to do with finance and health, what Google calls “YMYL” (Your Money or Your Life) pages. It will still prefer to show trusted search results as it does now.
The examples Google showed of its new search in action were mainly for long queries, the type you’d ask a chatbot. But we will need to see what happens when it’s used for searches to do with local businesses or finding a particular service.
At Artemis we’ve been embracing and adapting to the influence of AI in search for many years, and we’re excited for this next phase and maximising the opportunities for our clients.
Anticipation for Google I/O conference today
Google I/O is an annual conference where Google reveals its latest news and innovations to the public. Today’s conference is an important one and is being highly anticipated, with many in the SEO community trying to predict what will be announced.
It’s no secret that Google has been caught on the back foot by the onslaught of AI tools that have been released following the success of ChatGPT, and by Bing launching its own AI chatbot back in February. Google’s subsequent release of Bard, its own chatbot, has received a very lukewarm reception. It felt like it was just too little, too late.
But in today’s conference it is expected that Google will announce its plans to integrate AI into its search engine and to make search more personal, interactive and, well, up to date. This will be big news for all website and business owners, as any changes to how Google handles and displays its search results always impact the amount of traffic that Google sends to websites.
AI is going to be front and centre of today’s conference, and we can expect that some major changes are coming, probably launching in the coming weeks and months. We’ve already started seeing some of the possible changes coming to search, but Google is always testing, so it’s difficult to know if these changes are permanent or just part of its regular testing schedule.
You can watch the conference live tonight at 6pm (UK time) and this year, I definitely think it’s worth the time.
Helpful content needs a good page experience
Google has recently updated its helpful content guidelines to include a section about page experience. Last year, Google began rolling out new updates called “Helpful Content Updates”. These were designed to promote websites in the search results which were deemed to have, well, helpful content. It is worth reading what Google deems to be helpful content and why some websites may perform better than others in search when it comes to these criteria.
It is important to note that page experience isn’t necessarily a ranking factor on its own. It’s a combination of factors, such as website security, lack of ad intrusion, good Core Web Vitals, etc. This differs from helpful content, which is more about the content itself.
Including page experience as part of the helpful content criteria is intended to guide website owners: if your website does happen to have the best content, make sure a user can properly interact with it. Don’t make the content difficult to access, don’t plaster the page with too many ads, make sure the page works well on mobile, etc.
Although relevancy to the search query will likely trounce most other ranking factors, if there are two listings with similarly helpful content but one has a better user experience, then that website may just get to appear above the less usable result. It’s always a very good idea to create a great user experience…and it’s good that search engines are starting to take this into account!
Google rushing out new AI features and new search engine
Not a day goes past without some new and urgent news related to AI and search. It currently seems that Google is under some pressure to respond to the influx of AI tools, having been caught on the back foot by Bing launching its AI chat feature at the beginning of the year.
A recent article by the New York Times stated that Google now has over 160 engineers, designers and executives working full time on project “Magi”. Magi is an updated version of Google search incorporating AI answers directly in the search results, offering a much more personalised approach. It seems that this could be launched as soon as May, with internal testing already under way. The initial release will be limited, with a gradual rollout.
As if that wasn’t enough, the article also mentioned that Google is working on a new search engine built on AI. This is intended to be much more conversational and personalised compared to the existing search engine. This project seems a long way off yet, but if anything, it highlights that the future of search is set to change, possibly quite significantly compared to what it is today.
On top of that, news also surfaced that Samsung is looking to change the default search engine on its phones from Google to Bing, possibly because of Bing’s much more integrated AI chat advantage. Being the default search engine brings in billions of dollars in revenue for Google! We wait and see what happens…
Google’s spam fighting and what it means for non-spammers
Google just released its annual webspam report and, as expected, the results show that its webspam-fighting algorithm, terribly named SpamBrain, has been busier than ever fighting an ever-increasing amount of spam on the web. There was a 5-fold increase in the number of spam websites detected in 2022 compared to 2021, and a 50-fold increase in detected link spam.
What’s not clear is whether the increase is attributable to there being a higher level of spam websites and links than the previous year, or to Google just getting better and better at detecting it. I assume it’s a combination of the two.
Webspam has always been an issue, ever since the web was created, and although Google doesn’t always manage to detect it, it does a pretty good job of keeping the worst of it out of the search results. We do see spam websites appear in the results from time to time, but generally it is quite rare that Google will lead you to a website that is pure spam or likely to ruin your day by installing some malicious software on your computer.
The huge amount of spam content and links means that there is a lot of noise on the web, and it’s still challenging for search engines to detect it all. As legitimate content creators, we possibly have to do more to stand out and be heard. Google doesn’t index all of the content it discovers, so the focus on quality and authority is a major consideration for website owners and businesses today. It’s only getting noisier out there; we just need to always do more (good things) to be heard and be rewarded for it.
Authority or relevance – What’s more important for links?
It’s no secret that backlinks matter. Inbound links are a fundamental factor in Google’s ranking algorithms, and although content quality is very important, in order to build up authority online a website needs quality inbound links. If you don’t believe me, try ranking for “car insurance” with a great page, full of the best content but with no inbound links!
However, not all links are created equal and some links will have more of an effect than others. There are many factors that determine what weighting is passed across from a link (which we won’t go into here), if at all, but a common discussion around links is about authority vs relevance. Which one is more important?
The reality is that a link from an authoritative and trusted website, not necessarily in an associated industry, is going to be more impactful than a link from a lower-authority website that may be more relevant to the destination website. Both links, if they are deemed by Google’s algorithm to be “natural”, are likely to help the linked-to page rank better, but the authoritative website is likely to have the greater influence.
It is for this reason that you often see large authority websites, which may not necessarily have the best content, still outrank better, more relevant websites with lower overall authority. Authority (links) matters. Links are still a very significant aspect of a website when it comes to good rankings. Authority first, relevancy second.
Is Bing now driving more traffic after the launch of its AI chatbot?
Microsoft was immensely proud of having been the first major search engine to release an AI chatbot, and rightly so. Despite Google claiming to be an “AI-first company”, it was slow to release its AI chatbot and was well and truly beaten to it by Microsoft, which implemented a chatbot in its Bing search engine in February, forcing Google to release its competing chatbot a month later.
Microsoft had certainly seen this as an opportunity to potentially steal some search market share from Google. Despite Microsoft’s attempts and resources, Bing only has a 6% share of the US search market, and 4% of the UK. Those are very low numbers but Microsoft was anticipating an increase in search share based on its implementation of Bing AI.
However, although Bing usage has reportedly increased, this is not translating into an increase in search traffic to websites. Having spent some time today looking at the evolution of Bing traffic to some of my websites, the picture is very varied: some are up and some are down. In the best case, one of my sites has seen a 16% increase in traffic from Bing over the last month compared to last year. However, although this may sound impressive, Bing still accounts for only 4% of the overall traffic. Most of the rest is from Google.
It’s early days, but I think this reflects the fact that those of us in the industry, or related industries, are constantly playing with these tools and looking for opportunities; we are in a slight bubble, as users in general are not rushing to try out these new AI tools…just yet. We’ll keep monitoring the traffic from Bing, but right now I don’t envisage any change to the status quo, not for some time yet anyway.
AI overload – GPT-4, Bard, Bing, Claude, Midjourney and so much more
The world is going AI crazy and it’s become quite overwhelming. In March, over 1,000 new AI tools were launched, covering all sorts of real-world applications built on the foundations set by ChatGPT, which is now using GPT-4. Google launched its much-anticipated AI chatbot called Bard, Anthropic launched Claude, and Midjourney, the AI image generator, launched V5, which creates unbelievably realistic images…and so much more.
The plethora of tools, and of articles about how to use them and benefit from them, is all becoming a little too much, too fast. The rate of innovation and adoption has been unprecedented, and even Google’s release of Bard hasn’t exactly set the world alight, possibly because it was a little late to the party.
The onslaught of AI tools has also opened up an entire argument about the ethics behind these tools (they use content found online to generate their responses, yet they mostly don’t credit the sources) and the fact that they are mostly open to everyone, regardless of age. Italy has been the first country to acknowledge the risks of these AI tools and has banned ChatGPT and similar tools. This is probably just the beginning.
It’s just all happening way too fast but the pace of innovation is unlikely to slow down any time soon. If you feel overwhelmed by it all, you’re not alone. Personally, I’m avoiding a lot of the social media activity around the tools and taking the time to just understand and master a couple of tools and ignoring the others. We can’t slow down the progress of AI tools, but we can control how we choose to consume them.
March 2023 core update rolling out – Why I like core updates
Google just announced that it has started rolling out its latest core update, imaginatively called the March 2023 Core Update. The last core update was back in September 2022, but that one was quite mild, especially compared to the highly impactful update from May 2022.
Core updates are essentially key improvements made to Google’s algorithms, designed to improve the relevancy and quality of the search results. It may not always seem that core updates have really improved the quality of the results, but this is a continuous, evolving process and essential in dealing with increasing spam and in making sense of an ever-changing web.
I love to see core updates being rolled out. Both personally and as a business, we always focus on creating high-quality websites and content for ourselves and our clients, and sometimes it’s frustrating when you see lower-quality, less helpful results in the higher ranking positions. Core updates help to reshuffle the results and recognise the pages that should be ranking better than they are.
Losing rankings during a core update does not mean that a website has been penalised; it just means that the content on the page may now be deemed less relevant for the search query. Core updates help to maintain focus and to understand how Google now sees the intent behind search queries. It’s a great time to focus on increasing quality and on-page relevancy. SEO never stops!
Don’t automatically redirect users based on their IP
I’ve touched on this before, but it’s worth a post on its own as I still keep seeing it. Not only is automatically redirecting users to the version of a website you think they should be on very annoying, it can also create huge problems for search engines, as they may not be able to access all of your content.
Say I’m in the UK and I access a website that is set up for international traffic. The website automatically redirects me to the UK version of the site, maybe with pricing in GBP. This is what I probably want to see, as I’m in the UK. Now, say someone in the US tries to access the UK website but then gets automatically redirected back to the US version, with pricing in USD. It’s probably the correct version of the website for that user, but here’s where the problems begin.
Google’s crawler, Googlebot, mostly crawls from the US. It means that if Googlebot tries to access the UK website, it will automatically be redirected to the US version instead. In other words, Google cannot crawl, index or rank the UK website as it can’t see it.
It’s best to ask users which version of the website they would like to go to. Don’t assume that just because they are in a certain country they want the local variant, or, even worse, that they speak the local language. So, always ask, never presume!
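As a minimal sketch of that “suggest, don’t force” approach (assuming a hypothetical Flask site with a /uk/ section and a placeholder region guess, none of which reflect any particular real implementation), the idea is to show a banner offering the local version while leaving both the user and Googlebot on the page they actually requested:

```python
# Minimal "suggest, don't force" sketch: offer the local version via
# a banner instead of an automatic geo-IP redirect, so users and
# crawlers can always reach every regional variant directly.
from flask import Flask, request, render_template_string

app = Flask(__name__)

PAGE = """
{% if suggested %}
  <div class="geo-banner">
    It looks like you might prefer our {{ suggested|upper }} site.
    <a href="/{{ suggested }}/">Switch</a> or stay where you are.
  </div>
{% endif %}
<h1>UK site content</h1>
"""

def guess_region() -> str:
    # Placeholder: a real site might use a geo-IP lookup here.
    # Crucially, the guess only drives a suggestion, never a redirect.
    return request.headers.get("X-Guessed-Region", "us")

@app.route("/uk/")
def uk_home():
    region = guess_region()
    suggested = region if region != "uk" else None
    return render_template_string(PAGE, suggested=suggested)
```

The key design choice is that the server returns a 200 for whichever regional URL was requested; localisation is offered, never imposed.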
SEO – It’s not just about traffic. It’s about conversions
It’s interesting when we speak to potential new clients and we get onto the subject of the usability and conversion set-up of their websites. SEO has traditionally been thought of as optimising on-page elements of a website, such as titles, metas and headings, with some off-page work in the form of link building. In fact, talking to potential clients we often hear the phrase “I don’t really understand what you do”, as they see SEO as some sort of black-box, potentially underhand service that they can’t see.
At Artemis we are very much an open book, and we want our clients to understand what we are doing and why. It’s important that they are invested in the process, as it does take time to build up a reputation in order to increase rankings and traffic. However, our initial focus is also very much on usability and website conversions. Making a website generate more leads or sales is a much quicker process than the organic side of SEO, and this can create some early wins, especially if the website is not well set up to convert traffic.
We see CRO (Conversion Rate Optimisation) as an absolutely integral part of SEO. If clients experience an increase in sales without an increase in traffic early on, it starts to justify the investment in SEO and allows the organic side of things to build up over time. Traffic alone is not a good metric of success, as not all traffic has the same value. Our approach is to always focus on targeted traffic whilst simultaneously working to improve conversions across the website. This has the added benefit that as the traffic grows, it will convert at a higher rate. That way everyone is happy!
Pointing expired or other domains to your website – Is it OK?
This is a topic that we get asked about quite often. The SEO world is burdened by a terrible paranoia that the wrong move can destroy rankings and traffic in an instant, whether that’s building the wrong types of links to your website, having low-quality content or Google deciding that your website is not as helpful as you think it is. We’ve seen many types of spam techniques over the years, many of which still work today, but one thing that seems to be considered “spam” may not be spam at all.
Say you come across a competitor that has gone out of business and you think it would be a good idea to buy their domain name and repoint it to yours, because they had some half-decent rankings and some branded traffic. Is it wrong to do this?
The reality is that it’s not, and there is nothing wrong with “merging” two websites. In the real world it’s quite normal for a business to buy another and merge the two into one bigger business. The same is true online. The best approach is to merge the new domain on a page-level basis, redirecting each page to its nearest equivalent on your website, as this will maximise the effect of the merger; see the sketch below. It’s best not to just redirect everything to your home page.
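To make the page-level approach concrete, here’s a minimal sketch of a 301 redirect map served from the old domain (hypothetical URLs throughout, written as a small Python/Flask app standing in for whatever your server or CMS actually provides):

```python
# Minimal page-level 301 map for merging an acquired domain into
# your own site (hypothetical URLs). Each old page points to its
# nearest equivalent; the home page is only a fallback for pages
# with no real counterpart.
from flask import Flask, redirect

app = Flask(__name__)

REDIRECT_MAP = {
    "/services/repairs": "https://www.example.com/repairs/",
    "/about-us": "https://www.example.com/about/",
    "/blog/boiler-tips": "https://www.example.com/guides/boiler-tips/",
}

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def old_site(path: str):
    target = REDIRECT_MAP.get("/" + path, "https://www.example.com/")
    return redirect(target, code=301)  # 301 = permanent redirect
```

The same mapping can usually be expressed directly in your web server’s own redirect rules; the point is the one-to-one, page-to-nearest-equivalent structure rather than the particular tool.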
Where you do need to take care is when you want to repoint multiple domains to your website. This can trip a filter in Google’s algorithm, as it can appear unnatural. In reality, the worst case is that Google may not apply any of the benefit of those domains to your website. Everything is fine in moderation; just don’t overdo it!