Justin's SEO Insights

Justin, our Technical Director, shares his thoughts on what is happening in the world of SEO...

Core Web Vitals – It’s not always worth the effort

I’ve discussed Core Web Vitals a couple of times in my insights, and my advice has always been the same: don’t focus too much on it unless your website is really slow and problematic for users. The reason is that achieving a good CWV score can be very difficult and very time consuming, especially when the gains are usually minimal.

I think what generally happens is that SEOs focus on CWV because it’s a measurable action, albeit one that may ultimately have little or no impact on the performance of the website for users and in search. The reality is that for most small businesses it’s not an issue.

For a typical small, local business, there generally isn’t enough field data in Search Console for Google to report the CWV metrics for the website. This means it can’t take these metrics into account for ranking purposes.
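If you’re curious whether Google even has enough field data for your site, the PageSpeed Insights API will tell you, as it only returns Chrome UX (field) data when there is sufficient real-user traffic. Here’s a minimal sketch; the endpoint is real, but treat the exact response field names as assumptions to verify against the API documentation:

```python
# Minimal sketch: query the PageSpeed Insights API and check whether Google
# returns any Chrome UX (field) data for a URL. Response field names are
# from memory, so verify them against the API docs before relying on this.
import requests  # assumes the `requests` package is installed

def has_cwv_field_data(url: str, api_key: str | None = None) -> bool:
    endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    params = {"url": url, "strategy": "mobile"}
    if api_key:
        params["key"] = api_key
    data = requests.get(endpoint, params=params, timeout=60).json()
    # "loadingExperience" holds real-user (CrUX) metrics; if it's absent or
    # empty, Google simply doesn't have enough data for this page or origin.
    field_data = data.get("loadingExperience", {}).get("metrics")
    return bool(field_data)

if __name__ == "__main__":
    print(has_cwv_field_data("https://www.example.com/"))  # placeholder URL
```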

If you’ve got a website that’s enjoying #1 rankings for all of its target keywords then sure, spend some time fine-tuning the CWV metrics, but otherwise, don’t lose too much sleep over it. It’s a sentiment that was recently shared by John Mueller from Google. There are much better things you could be spending your time on.

“SEO is dead”…it’s that time of year again

I’ve been optimising and marketing websites since 2003, and during that time it’s always been funny hearing the “SEO is dead” comments reverberating around the SEO community. As search evolved and became more complex, those who were unable to stay ahead of the game were inevitably the ones calling an end to SEO. And with the sudden (if expected) increase in AI activity online, those who can’t see a way forward are yet again declaring that this is the end of SEO.

We’ve been through some very significant moments where search has changed radically, from Google learning to understand content rather than treating it as just a bunch of words, to the addition of featured snippets in the results, and so much more. Through each of these phases we’ve been able to embrace the changes, adapt, and continue to grow traffic and sales for our clients.

We’ve spent a significant amount of time and money over the last few years understanding how AI is impacting search, so it’s something we’ve been expecting and working towards. Yes, search is going to change, but it’s not going to happen overnight. AI tools are just not good enough yet; there have even been some interesting cases of Bing’s new AI chat getting a bit nasty with users. Search engines are trusted to deliver reliable results, and until these AI tools can be fully trusted (we are very far away from this), there will be a gradual integration of this functionality into the search results.

SEO is far from dead. Search engines and AI tools need content, they need websites, and they need optimisation to understand what everything is. Those that can’t navigate this new era will exit the space, and maybe the SEO world will end up with a higher-quality pool of SEO professionals. It’s definitely going to get harder, but we love a challenge!

Unconfirmed Google updates…what causes them?

This week, Google celebrated Valentine’s Day with an unconfirmed search algorithm update. Various tools which track changes in the search results reported a high level of volatility, although there was no confirmation from Google that an update was under way. It is quite common to notice an increase in the volatility of the search results when an update is rolling out, as Google only confirms an update once its effects start becoming noticeable.

But what’s happening when an update is never confirmed by Google? The reality is that Google launches hundreds of updates to its search algorithms every year, practically on a daily basis. Many are relatively small and most won’t be noticed by users. A recent comment from Gary Illyes at Google was very interesting: he stated that very often, when public search monitoring tools report big changes in the search results, those changes don’t align with any significant updates Google has made. In other words, they don’t know why these tools would be reporting any major changes in the results. They remain a mystery…even to Google!


Bounce rate as a ranking factor

I came across a typical discussion online yesterday regarding bounce rate. Bounce rate is the percentage of users who land on a page and then leave without clicking through to any other page on the website. There is a common myth in the SEO world that a high bounce rate is a bad thing and a negative ranking factor. Actually, it isn’t. Representatives from Google have said on various occasions that Google does not look at bounce rates. In fact, when GA4, Google’s new analytics software, was launched it didn’t even include bounce rate as a metric.

The simple rule of thumb regarding bounce rate is that it’s likely to be relative, rather than absolute. For example, if you are searching for a gift for your partner’s 20th birthday, it’s likely that you’ll click through to a search result, read the 20th birthday gift ideas and then leave. It’s unlikely that you’re going to be interested in gift ideas for other years. Therefore, for these types of results it’s fine, and expected, that users will bounce back to the search results, causing a high bounce rate figure. If, however, you are looking for a local surveyor, you may spend more time on a website understanding the services provided and the surveyors themselves. In this case you wouldn’t want a high bounce rate, and it shouldn’t be expected.
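To make the “relative, not absolute” point concrete, here’s a toy calculation with entirely made-up numbers:

```python
# Toy illustration with invented figures: the same bounce rate can be
# perfectly healthy on one type of page and a warning sign on another.
sessions = {
    # page: (total landing sessions, sessions that left without a second pageview)
    "/20th-birthday-gift-ideas/": (1_000, 870),
    "/chartered-surveyor-services/": (1_000, 870),
}

for page, (total, single_page) in sessions.items():
    bounce_rate = single_page / total * 100
    print(f"{page}: bounce rate {bounce_rate:.0f}%")

# Both pages show 87%, but for the gift-ideas page that's expected behaviour,
# while for the services page it suggests visitors aren't exploring further.
```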

Therefore, think of bounce rate as relative. If it’s high, it may be absolutely normal.

How often do you check your crawl stats?

Google Search Console is a great tool, full of useful data that helps you understand how your website is performing in search and shows any issues Google finds that may be holding it back, such as errors and broken schema markup. However, there is one feature that is a little tucked away and doesn’t seem to get much coverage: the Crawl Stats report. It sits under the Settings section in Search Console, which is a little odd, as you can’t actually set anything for crawl stats!

We recently had an issue with a website where new content was not being indexed and some of the old pages were being deindexed. After checking that the content was indexable, linked to internally and not blocking search engines, and finding that everything was fine, we discovered that the answer lay in the Crawl Stats report. A recent increase in the “Average Response Time” had decreased the “Total Crawl Requests”. In other words, something was causing Google to take longer to crawl the pages of the website, so it started crawling less often.

We managed to find the cause of the issue and fix it, and crawling and indexing were restored. New content started getting indexed again. It’s good to keep an eye on the Crawl Stats report, as it can highlight an issue which other reports in Search Console will never expose. Google needs to be able to crawl a website efficiently, as fast as possible. It’s a good idea not to make things difficult for Googlebot!
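As far as I’m aware the Crawl Stats report can’t be pulled via an API, but you can approximate the same check from your own server logs. A rough sketch, assuming your access log has been configured to record the request duration (in seconds) as the final field on each line:

```python
# Rough sketch: average response time for Googlebot requests, per day,
# from a server access log. Assumes the request duration is logged as the
# final field and the date appears in the usual [10/Feb/2023:...] position.
from collections import defaultdict

def googlebot_response_times(log_path: str) -> dict[str, float]:
    totals: dict[str, list[float]] = defaultdict(list)
    with open(log_path, encoding="utf-8", errors="ignore") as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            try:
                day = line.split("[", 1)[1].split(":", 1)[0]  # e.g. 10/Feb/2023
                duration = float(line.rsplit(" ", 1)[1])      # final field
            except (IndexError, ValueError):
                continue  # skip lines that don't match the assumed format
            totals[day].append(duration)
    return {day: sum(times) / len(times) for day, times in totals.items()}

if __name__ == "__main__":
    for day, avg in googlebot_response_times("access.log").items():
        print(f"{day}: {avg * 1000:.0f} ms average")
```

If the daily averages creep up at the same time crawl requests drop off in Search Console, you’ve probably found your culprit.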

JavaScript is great…but not always for SEO

We recently received a new enquiry from a company that had experienced a big loss in traffic following the launch of their new website. We see this often, as many website migrations are not handled properly, with not enough consideration given to the effects on rankings in search. However, this recent enquiry involved a key issue which we have seen a few times before and which seems to be becoming more prevalent: the increased use of JavaScript to load the main content on a page.

Whereas a “normal” HTML website has all of the code processed at the server and sent to the browser in its final form, a website that relies on JavaScript only sends the HTML shell to the browser, with the main content then rendered by the browser via JavaScript. This is a two-step process, and search engines handle it in two steps too: they crawl the raw HTML first and render the JavaScript later. However, many JavaScript implementations are not SEO friendly and cause serious crawling and indexing issues.
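A quick sanity check you can run on any page is to fetch the raw HTML, with no JavaScript executed, and see whether the key content is actually in it. A minimal sketch, using a placeholder URL and phrase:

```python
# Minimal check: is the important content present in the server response
# itself, or does it only appear after JavaScript runs in the browser?
# The URL and phrase below are placeholders.
import requests

def content_in_raw_html(url: str, key_phrase: str) -> bool:
    html = requests.get(url, timeout=30, headers={"User-Agent": "raw-html-check"}).text
    return key_phrase.lower() in html.lower()

if __name__ == "__main__":
    ok = content_in_raw_html("https://www.example.com/services/", "chartered surveyors in Sussex")
    print("Content is server-rendered" if ok else "Content likely depends on JavaScript rendering")
```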

With a recent client we replaced the JavaScript with HTML and the rankings recovered. It’s no coincidence. If you are planning on using JavaScript on your web pages, it’s still best to use it only for the dynamic features necessary for your users to carry out certain interactions with the website, not for loading the content.

The search category is about fair use

There was a very interesting interview with Microsoft CEO Satya Nadella, published on The Verge, discussing the integration of AI into Microsoft products, including the Bing search engine and the Edge browser. Bing currently has a very small share of the search market, approximately 4% in the UK and 6% in the US. Microsoft seem quite confident that this could be their moment: they’ve beaten Google to launching a very innovative and well-integrated AI product in their search engine and browser. They really have done a very good job with it.

The interview with Satya Nadella is interesting because in it he discusses the importance of the new AI chat feature in sending traffic back to websites: “The search category is about fair use so that we can generate traffic back to publishers….Our bots are not going to be allowed to crawl search if we are not driving traffic.” According to Satya, they will be monitoring the traffic being sent out to websites and it will be interesting to see how, over time, the level of traffic to websites is impacted by the new chat tools. Ultimately, if publishers don’t get the traffic then they won’t be investing in new content. The AI needs content to learn, publishers need money and Microsoft needs to keep things relevant and display ads. Interesting times ahead!

Google’s view on the use of AI for content

Yesterday Google published a blog post clarifying, to some extent, its position on the use of AI-generated content for websites. The short summary is that, from Google’s point of view, AI is already used in many applications online, such as weather predictions, sports scores and transcripts, so it may be fine to use AI, but not if it’s used for the sole purpose of ranking in Google’s search results. The post goes on to say that Google has had systems in place for some time now to detect content that is “not helpful” and not unique enough, so it’s up to you if you want to use AI tools to generate content. You just may not get any ranking benefit from it.

I think, ultimately, Google is unable to completely rule out AI content in the search results. They use AI in their own algorithms, so it would be hypocritical to say that no one else should use it for their own benefit. What it comes down to is the quality of the content and whether it fits in with their E-E-A-T guidelines. Experience, expertise and knowing who’s written the content are going to become very important. There is a great summary of this here. Use AI to help you, but don’t depend on it. Google has been fighting “low quality content” for a long time and it knows what it’s looking for.

Bing lays down the AI gauntlet

Bing has officially announced that its integration of OpenAI into its products is now available to the public, although currently limited and rolling out gradually. This was expected as they wanted to beat Google to be the first major search engine to integrate AI generated responses into the search results.

This is a major and significant change to the types of search results that we have become accustomed to over the years. Is this going to change how people search and interact with the search results? Of course, but not for all types of search queries.

If you’re looking for a local plumber, you’re unlikely to ask the AI. You’ll want to see options, reviews, images, etc. However, if you want to plan a trip somewhere or want some gift ideas for your anniversary, those are the types of searches that will likely see an impact from the AI results. This could cause websites to receive less organic traffic.

But is this a feature which ultimately suffers the same fate as voice search? We all have a voice-search-enabled device in our pockets, but the take-up has been very slow. We’ll need to see how AI results develop, but Bing have certainly started the ball rolling now…and it’s rolling fast.

Google and Bing AI is coming

There doesn’t seem to be a day that goes by at the moment without some sort of announcement about AI chatbot tools and search engines. Google CEO Sundar Pichai yesterday blogged about how Google will be bringing an AI chatbot, called Bard, into its search results. The example given in the blog post shows that the resulting text does not have any citations, and this could cause content creators to respond angrily, as the AI is using human-created content to generate its responses without any accreditation to the original sources.

A user on Twitter recently found a “chat” option when he went to use Bing, and this appears to be how Bing may ultimately integrate ChatGPT into the search engine. Bing is offering the chat feature as a separate search function, although it does cite its sources, which is a positive over Google’s integration.

This new era of search is still in its infancy and we are going to be seeing a lot more announcements and changes over the coming weeks and months. Stay tuned!


Google’s AI is coming

Google has been using AI to refine the search results, tackle spam, interpret images, etc, for quite some time now, but the applications for users have been limited. With the serious level of interest and usage that ChatGPT has generated over the last couple of months, has this forced Google into releasing their AI chatbot equivalent sooner than was anticipated?

It would seem so. In yesterday’s Q4 earnings call, Google CEO, Sundar Pichai, said that “In the coming weeks and months, we’ll make these language models available, starting with LaMDA, so that people can engage directly with them”. It won’t be long before we see a ChatGPT equivalent in place within Google search, although it’s likely to be a secondary option instead of part of the main search results.

Some initial insider comments seem to suggest that Google may replace the long-standing “I’m feeling lucky” button with a new button to access their chatbot. We will have to wait and see but this is shaping up to be quite the year! Things are moving very fast.

The importance of authorship

If there’s one area of SEO that’s going to become even more important in 2023, it’s authorship. We are entering an era where the internet is going to be flooded with AI-generated content, and this content is not the same as content written by real people. Search engines are more likely to reward content with higher rankings if they can trust the source. They have to know who the author is. If the author is known to Google as someone with expertise in a certain field, then it’s more likely that this content will rank above “anonymous” content.

Google itself has just created an authors page for the Search Central Blog. Note how each author lists their own social media accounts and the posts they have written. This is what Google expects to see going forward for websites. It’s part of its E-E-A-T philosophy: experience, expertise, authoritativeness and trustworthiness. Gone are the days when you could just write an article and expect it to rank. You have to build your own personal profile as an expert in your field and make sure search engines can associate your content with you.
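One practical way to help search engines make that association is author markup in your structured data. Below is a hedged sketch of what the JSON-LD might look like; the schema.org types and properties are standard, but the names and URLs are placeholders to swap for your own:

```python
# Sketch: generate article JSON-LD that explicitly names the author and
# points to their profile page and social accounts (placeholder values).
import json

article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "The importance of authorship",
    "author": {
        "@type": "Person",
        "name": "Justin",                               # placeholder: real author's full name
        "url": "https://www.example.com/about/justin/",  # placeholder profile page
        "sameAs": [
            "https://www.linkedin.com/in/example",       # placeholder social profiles
            "https://twitter.com/example",
        ],
        "jobTitle": "Technical Director",
    },
}

print(f'<script type="application/ld+json">{json.dumps(article_schema, indent=2)}</script>')
```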

Is link building bad?

There was an interesting discussion on Twitter yesterday on the subject of backlinks and disavowing low-quality links to a website. At Artemis we haven’t disavowed links for clients for several years now. Google discourages using the disavow file, as you may unwittingly tell Google to ignore links that are actually helping your website rank. Google is much better now at determining which links should and shouldn’t count towards rankings, so there’s no need to disavow anything anymore unless you’ve been hit with a manual penalty.

In yesterday’s conversation, John Mueller from Google said that “these agencies (both creating and those disavowing) are just making stuff up”. That’s quite a statement. On disavowing I agree with him, but not when it comes to creating links. A fundamental part of Google’s algorithm (PageRank) relies on links to understand the reputation and relevancy of a website. Without links it’s very difficult to get a page to rank, unless it’s for an ultra-low-competition search term. If you don’t make an active effort to get links to your website, it’s never really going to make much of an impact in search.
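For anyone who hasn’t seen it before, the core idea behind PageRank is simple enough to sketch in a few lines. This is the simplified textbook version, not Google’s production system, but it shows why pages with no links pointing at them struggle to accumulate any score:

```python
# Simplified PageRank: each page shares its score across the pages it links
# to, and the scores are iterated until they settle. Toy graph only.
def pagerank(links: dict[str, list[str]], damping: float = 0.85, iterations: int = 50) -> dict[str, float]:
    pages = list(links)
    rank = {page: 1 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its score evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

toy_web = {
    "home": ["services", "blog"],
    "services": ["home"],
    "blog": ["home", "services"],
    "orphan": [],  # nothing links here, so it ends up with the lowest score
}
print(pagerank(toy_web))
```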

Of course, getting low-quality links is pointless, but actively working on gaining high-quality links is absolutely key for SEO. The proof is in the pudding: no links generally means no ranking improvements, regardless of the quality of the content on the pages. Link building isn’t bad; low-quality link building is just pointless.

Rankings “may go nuts” with a site redesign

We get enquiries all of the time from businesses where the migration to a new website has not gone well and they’ve lost rankings, traffic and enquiries. It’s unfortunately all too common. Search engine rankings are delicate things, and when migrating to a new website, the SEO side is often not taken into account. Sometimes important redirects are missed when URLs are changed, sometimes the structure of the menu is changed, and other times the entire content is changed.

All of these changes can be disastrous for rankings if not executed well. We’ve seen it all too often. However, even if the URLs, the content and the crawl paths through the website (how search engines navigate from one page to another) all stay the same, just changing the design of the website, and the associated HTML code, can still impact rankings.

Google engineer Gary Illyes posted on LinkedIn yesterday, saying “when you redesign a site, its rankings in search engines may go nuts”. The HTML markup is what search engines use to make sense of a page, such as the order of the content, the headings, and so on. If you change this, it will change how search engines see the page, and it can affect the rankings.

Migrations need to be handled with care; even small changes can have major consequences. Proceed carefully!

Yandex Code Leak

It’s been an interesting weekend with the news of the Yandex code leak. Many will never have heard of Yandex, but it is like the Google of Russia. Although Google has a presence in Russia, Yandex is bigger and more widely used than Google. The leak, not a hack, according to Yandex, was allegedly carried out by an ex-employee. The code is complete and covers all of Yandex’s products, including their search engine.

SEOs have been very excited over the last couple of days ploughing through the 1,922 ranking factors listed in the leaked documents. It is important to note that many of the factors may be negative ranking factors. Although Yandex is not Google, and the way the two search engines rank websites will differ, there will inevitably be some crossover between the two. It’s worth having a look through the list, but appreciate that how the factors are applied isn’t clear. There are some big surprises in the list, although one shouldn’t surprise anyone: Ukrainian websites automatically have a negative ranking signal applied. Sigh!

The fundamentals still count

It is surprising how the fundamentals of SEO are often overlooked. When we get a new client enquiry and we see that the title tag on their home page says “Home”, we know we can make a big difference to the client’s rankings and traffic just by implementing the very basics of SEO, which, even today, still make a significant difference. The concept is simple…don’t make Google think too much. If it can clearly understand what product or service a website is offering, and in what location, that’s the minimum requirement for getting SEO on the right path for the growth of the website.

It pains me to see an under-optimised title tag; it’s probably the most important signal for search engines to understand what a page is about. This week, for a new client, just adding the client’s location to the title tag took them from nowhere to first in the search results for their service at a local level. Don’t overlook the basics; they still matter.
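If you want to spot this problem quickly across a site, pulling the title tags for a list of pages only takes a few lines. A small sketch, assuming the requests and beautifulsoup4 packages, with placeholder URLs and keywords:

```python
# Quick title-tag audit: flag pages whose titles are missing, generic, or
# don't mention the service and location. URLs and keywords are placeholders.
import requests
from bs4 import BeautifulSoup

GENERIC_TITLES = {"home", "homepage", "welcome", "untitled"}

def audit_titles(urls: list[str], must_mention: list[str]) -> None:
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
        title = (soup.title.string or "").strip() if soup.title else ""
        problems = []
        if not title:
            problems.append("missing title")
        elif title.lower() in GENERIC_TITLES:
            problems.append("generic title")
        if title and not any(term.lower() in title.lower() for term in must_mention):
            problems.append("doesn't mention service or location")
        print(f"{url}: {title!r} {'OK' if not problems else '- ' + ', '.join(problems)}")

audit_titles(
    ["https://www.example.com/", "https://www.example.com/services/"],
    must_mention=["surveyor", "Sussex"],  # placeholder service + location
)
```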

International targeting – Mistakes can be costly

I’ve been building websites targeting international audiences since I started online in 2002, and during that time the methods for effectively optimising for international targeting have changed considerably. The launch of the hreflang tag in 2011 made it much easier to expand into other countries, even in the same language, without causing a problem for the existing local rankings. However, hreflang caused a lot of confusion, and it still does today. Having just completed a website audit for a new client targeting the UK and US, which is nothing overly complicated, it’s fascinating to see the devastating effects of an incorrect implementation of the hreflang tags.
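The most common hreflang mistake we come across is missing return tags: page A declares an alternate for page B, but page B never points back to page A. A minimal sketch of that reciprocity check, assuming the requests and beautifulsoup4 packages and using placeholder URLs:

```python
# Minimal hreflang reciprocity check: every page a URL points to via an
# hreflang alternate should point back to it. URLs are placeholders.
import requests
from bs4 import BeautifulSoup

def hreflang_targets(url: str) -> dict[str, str]:
    """Return {hreflang: href} declared in the <head> of a page."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    return {
        link["hreflang"]: link["href"]
        for link in soup.find_all("link", hreflang=True)
        if link.get("href")
    }

def check_return_tags(url: str) -> None:
    for lang, target in hreflang_targets(url).items():
        # fetch each declared alternate and make sure it references us back
        if url not in hreflang_targets(target).values():
            print(f"Missing return tag: {target} ({lang}) does not reference {url}")

check_return_tags("https://www.example.com/en-gb/")  # e.g. the UK version of a page
```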

The way we approach international targeting today, despite hreflang, has changed, and we don’t necessarily implement it like many websites do, replicating the entire site multiple times to cover every market. To add to that, some websites also introduce IP redirection to automatically redirect users to their supposed language and country variant, which is also bad practice. International targeting is not difficult, but an incorrect set-up can severely impact the performance of a website in search. Get it right and it can work wonders!

Make it personal

When I first started online in 2002, the trend in those days was to make your website look bigger and more popular than it really was. This was done by removing personalisation and always referring to everything internally as “the team”, when in actual fact it may just have been one person in their bedroom. In order to differentiate myself in the early days, I went all in and put my face and name all over my websites. It paid dividends! My websites grew, and grew fast, as users felt they were connecting with me and trusted me.

Things have changed a lot over the last 20 years, but personalisation is possibly more important now than ever. As the internet gets flooded with AI-generated content, search engines are more likely to trust content that has been created by, and attributed to, someone they know about and have built trust in. If you’re the author of the content on your website, make sure you have a solid profile page listing your background, your experience, where you’ve been featured, your social networks and so on. People buy from people, and search engines trust content from identifiable people far more than content from anonymous authors. Personalisation is key, no matter how big the website or business is.

ChatGPT and Google’s future

If there’s been one hot topic in SEO over the last three months, it’s the ChatGPT AI tool and the impact it could have on Google’s business model moving forward. ChatGPT has created a lot of buzz around its ability to generate huge amounts of what appears to be relatively well-written content on just about any subject (as long as it’s not related to things that happened in 2022, as it was only trained on information up to 2021). With a tool that can answer just about any question you throw at it, where does this leave a search engine such as Google? Well, the reality is that ChatGPT is not a tool which people will use in its current form on a daily basis, but the potential is obvious.

It will only get better, and it will happen very quickly. After Microsoft, a large investor in OpenAI (the company that developed ChatGPT), announced that it would be incorporating ChatGPT into its products and search engine, rumours have surfaced that Google will preview its own AI chat alternative, probably integrated into Google search. We can expect that Google’s AI will address the biggest problems that ChatGPT has: it doesn’t know whether the text it produces is correct, and it doesn’t cite its sources. We can assume that Google’s version will provide users with a much more reliable and trusted AI chat tool.

Farewell Google Optimize. It’s been fun

Google announced on Friday last week that Google Optimize will be closing on the 30th of September 2023. This is sad and unexpected news. Google Optimize is a great tool for running A/B tests on web pages to test and measure the effects on conversion when changing elements such as calls to action, buttons, positions on pages, etc. It appears that this is part of Google’s shift from Google Analytics (GA) to GA4, the new analytics package which fully replaces GA as of July this year.

I would assume that within GA4 we will begin to see increased A/B testing capabilities, and that this will become the natural successor to Google Optimize. At Artemis we have used Optimize since its inception to successfully improve the conversion rates of hundreds of websites. There are alternative tools available, but they are all paid tools and very expensive. I do hope that the existing functionality of Optimize is fully integrated into GA4 and then enhanced with further testing capabilities. We shall wait and see what happens.

Careful adding videos to mobile websites

We’ve come across several instances over the last few weeks of websites featuring auto-loading videos on their mobile home pages. This is a very bad idea! Videos should never auto-load on mobile, for three main reasons. Firstly, they significantly slow down the loading of the page, which can be frustrating for users, especially when they are on a slow data connection. Secondly, they can consume a huge amount of data when loading, which is a problem for users who have data limits on their phones. And thirdly, slow page loading can affect the rankings of the page in search.

The only time a video should auto-load on mobile is if the phone is connected via a Wi-Fi connection, as there are no data limits. So, before using videos to enhance pages on mobile, it’s important to consider the impact on users, their data usage and the potential impact on rankings. Videos are great to make pages more interesting and engaging, but they need to be used with a little caution, especially on mobile.

Can you publish too much content?

Does publishing a lot of content every month, or every week, or every day, count as spam? There has often been a misconception that if you publish a huge amount of content, such as several blog posts a day, Google could see this as spam and it could affect the performance of the website in search. However, Google Search Advocate John Mueller stated on Twitter today that it’s fine to publish content on a regular basis, for example a new blog post every day, as long as you have new and relevant things to talk about.

The important thing is not to force it! Producing new content just for the sake of it will likely lead to pages with relatively little value. It is best to focus on writing really good, relevant and insightful content, even if that is every day, or even multiple times a day. John clarified that it’s unlikely that Google sees high-volume content production as a spam signal.

It’s time to get familiar with GA4

If there is one thing putting the fear into SEOs at the moment, it’s GA4. GA4 is Google’s new analytics software and it’s very different from the old (current) Google Analytics that we are all so familiar with. However, time is running out: as of the 1st of July 2023, Google Analytics will stop receiving and processing data, and GA4 will become the default analytics tool going forward. Whereas Google Analytics (GA) relies on cookies to track user activity, GA4 is designed to cope with a world where cookies are often disabled and, because of this, data is missing from traditional analytics reports.

GA4 receives and processes data quite differently to GA, and it uses AI to fill in the missing data as best it can. This means that the figures in GA4 reports differ from those in GA, and the way the software presents them has changed quite considerably. It’s important to really start using GA4 on a daily basis now, before the switchover in July. It’s going to come around very quickly.
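If part of the fear is simply not knowing where the familiar numbers now live, it can help to pull a basic report programmatically. Here’s a sketch using the official GA4 Data API Python client (google-analytics-data); the property ID is a placeholder and it assumes service-account credentials with access to the property are already configured in your environment:

```python
# Sketch: sessions by channel for the last 28 days from a GA4 property,
# using the GA4 Data API client (pip install google-analytics-data).
# Property ID is a placeholder; authentication is assumed to be set up.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import DateRange, Dimension, Metric, RunReportRequest

def sessions_by_channel(property_id: str) -> None:
    client = BetaAnalyticsDataClient()
    request = RunReportRequest(
        property=f"properties/{property_id}",
        dimensions=[Dimension(name="sessionDefaultChannelGroup")],
        metrics=[Metric(name="sessions")],
        date_ranges=[DateRange(start_date="28daysAgo", end_date="yesterday")],
    )
    for row in client.run_report(request).rows:
        print(row.dimension_values[0].value, row.metric_values[0].value)

sessions_by_channel("123456789")  # placeholder GA4 property ID
```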

Is ChatGPT breaking the law?

The hot topic in the SEO world at the moment is very much ChatGPT, and other similar AI tools, which are being used to generate content for websites. Setting aside for a moment the quality of the content these tools produce, the big question is about the ethics of tools that scrape the web for content and then regurgitate it in a different form. Essentially, these tools are plagiarising existing content, written by humans, and presenting it to the user as if it were their own.

ChatGPT, for example, does not cite its sources when outputting a result from a prompt. This is a very grey area and one which is going to intensify over the coming months. Just yesterday, a class-action lawsuit was filed in the US against Stable Diffusion, one of the biggest AI tools for generating images, on account of its scraping of billions of copyrighted images in order to train the AI. This is just the beginning; we can expect to see many more lawsuits being filed. AI needs to be fair and ethical, and currently it isn’t.

It pays to be an expert

Do you review products on your website? If so, have you actually reviewed the product or are you just talking about it? Google has been increasing its focus in this area with the release of the Product Review Updates. The goal is to demote websites that appear to provide product reviews when in actual fact they are just talking about a product, either to generate traffic or to earn commissions as part of an affiliate program.

It’s not enough now to just talk about a product; you also have to show that you have actually tried it out and tested it properly, so that the review is useful to the user. Google Search Advocate Alan Kent stated this week on Twitter that it may be helpful to include links out to more than one product site (affiliate site) to give the user choice and make the review more useful. There is nothing more frustrating than reading an unhelpful product review. Think of the user: what do they need to know, and what will help them make a reasoned buying decision?

Google’s Link Spam Update is finished

Google has confirmed today that the Link Spam Update that began rolling out in December has now finished. It took nearly a month to roll out as it was partially paused over the holidays. The Link Spam Update is focused on removing the effect of spam links pointing to websites, links that may be artificially helping those sites to rank well. The effects of this update may be difficult to determine in isolation, as its rollout overlapped with the release of the new Helpful Content Update.

Therefore, if a website has lost rankings over the last 30 days, it could have been because of either update or could have been affected by both. Google still heavily relies on backlinks for ranking websites and it is continuing to refine how it determines if a link is natural or artificial, i.e., that the website hasn’t gained that backlink organically. We can expect more Link Spam updates to continue rolling out during the year.

Even Google’s updates take a holiday

The Helpful Content and Spam updates which were released by Google in December are still rolling out, which may explain the current turbulence in the search results. These sorts of updates typically take a couple of weeks to fully roll out, but the December updates have not yet finished, and it’s been nearly a month now. Danny Sullivan, Search Liaison at Google, stated that the reason it is taking so long is that Google pauses the release of updates during the holidays.

There was some speculation that the length of an update is relative to its impact, but this seems not to be the case. There is still no news as to when these updates will be completed but the search results have been in a high level of flux recently, so we hope it won’t be too long now before things settle down again.