I receive a regular email from a large SEO website that shares some findings of SEO tests that they have run for some of their clients. The results are often as expected, although not always.
The fine details of the experiments are very limited, which means the tests can’t simply be repeated on another website with the expectation of similar results.
There are so many factors that determine why a page ranks above another. Changing certain elements, such as titles, headings, content, etc., can help but every search query is different, every website is different and there is no one-size-fits-all approach that can be adopted.
It means as SEOs we have to be constantly testing new ideas, new optimisations, new page layouts, set-ups and content approaches. As Google evolves, we must do the same. We don’t always get it right, but failure is an important learning exercise. Through rigorous testing we can build up a much better picture of what works and what doesn’t. This is how we can consistently achieve good rankings and traffic growth for our clients.
Test, test, test…test some more and keep testing. It’s all about the testing!
So you want to be an SEO? That’s great…just call yourself one! You don’t need to have any qualifications or accreditations to start offering SEO services. You can start a website, regurgitate what others or Google says, get enough followers on Twitter (sorry, X) and now you’re a fully-fledged SEO.
What I’ve often seen is that the SEOs who are prolifically active on social media every day are not the best SEOs. Most don’t even have their own websites and don’t practice what they preach. With so much time spent on social media, when do they ever get any work done? In fact, the best SEOs that I know are the ones who never share a single comment about what they do on any platform.
There was a great example the other day of the sort of bad SEO advice that you see on social media all the time. A user on Twitter, who describes himself as a “Technical SEO”, suggested that if your web pages load too quickly…you should slow them down! His argument was that increasing a user’s time on page, by making it load slowly, will make Google think that the user is finding the page helpful and therefore increase the page’s rankings.
As bad advice goes, that’s got to be right up there with “grow your website authority with bookmarking links” (yes, this is still offered as a service!).
Nothing beats experience and knowledge backed up by data. We’ve been doing SEO for our own websites and clients’ websites for over 20 years and we continuously invest heavily in R&D. Don’t believe everything you read online about SEO…unfortunately so much of it is ****.
If you’re a business owner or you run your own website, you may have fallen foul of using an image that you don’t have the rights to use. This may then have been accompanied by a legal letter demanding a serious amount of money in compensation.
The days of scouring the internet for images, or paying monthly subscriptions to download professional photos, may be well and truly over with the astonishing advancements in image generation AI tools such as Midjourney making them mostly redundant.
But are they safe to use on your website, and how will they impact your website in search? Google’s advice on the use of AI for content doesn’t just apply to written content, but to images too. As long as the images are helpful, useful and complement the page, there is no reason whatsoever not to use them. Does an image created in Photoshop have any more value than one created by AI? Not necessarily; they can both be just as good.
For someone like myself with zero artistic ability, AI image generation tools provide a convenient way to add high quality and unique visual elements to the pages of my website. In fact, since I have been using them they have started appearing in the search results. For now Google isn’t seeing them as a negative. And to close off this insight…here’s an AI generated image of a robot making images!
This has been a hot topic recently due to an announcement from the large tech website CNET that it had deleted thousands of pages from its website to increase the overall freshness of the site, with a view to improving Google rankings.
Various voices from Google joined the discussions to state that this does not help and is not “a thing”. It seems that CNET has been a little misguided with their approach to freshness.
However, it’s not such a bad move as it’s being made out to be. When the Panda algorithm launched in 2011, one of the main factors that could cause a site-wide ranking penalty was having a significant amount of low quality content. What is deemed as low quality can be argued at length but the reality is that it is better to have a smaller, higher quality website instead of one that has many pages that mostly don’t rank.
Understanding the concept of PageRank, and how Google distributes the value of inbound links across internal pages, makes it clear that the fewer pages you have, the more PageRank is passed to each of them.
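To make the intuition concrete, here’s a toy PageRank calculation in Python. This is a simplified sketch of the classic algorithm, not Google’s actual implementation, and the fully interlinked “site” is an artificial example: with the same total amount of PageRank to share, each page on a 4-page site ends up with a larger share than each page on a 10-page site.

```python
# Toy PageRank (damping factor 0.85) over a site's internal link graph.
# Each graph maps a page to the list of pages it links to.
def pagerank(graph, damping=0.85, iters=50):
    n = len(graph)
    ranks = {page: 1.0 / n for page in graph}  # start with an even split
    for _ in range(iters):
        new = {}
        for page in graph:
            # PageRank flowing in from every page that links here
            inbound = sum(ranks[p] / len(links)
                          for p, links in graph.items() if page in links)
            new[page] = (1 - damping) / n + damping * inbound
        ranks = new
    return ranks

def fully_linked_site(num_pages):
    """An artificial site where every page links to every other page."""
    pages = [f"page{i}" for i in range(num_pages)]
    return {p: [q for q in pages if q != p] for p in pages}

small_site = pagerank(fully_linked_site(4))
large_site = pagerank(fully_linked_site(10))
print(small_site["page0"], large_site["page0"])  # each page on the smaller site holds a bigger share
```

The same total value split ten ways instead of four leaves each page with less, which is the “fewer pages, more PageRank each” effect described above.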
Google doesn’t want you to remove pages, unless they are really bad, as they may one day answer an obscure query, but it does want you to keep them updated as much as possible. This is challenging for large websites. The reality is that less is more: it’s always better to have fewer pages, which are very good and up-to-date, than a large website with many pages of outdated content.
It’s difficult to believe that just over 6 months ago, “ChatGPT” was not something that anyone even knew existed. But since its launch late last year, it has been the hot topic of so many conversations and news articles. The reality is that generative AI was nothing new but ChatGPT made it accessible and somewhat fun.
It even kicked off a search war between Google and Bing, with both launching their AI chatbots so as not to appear to be getting left behind. I can only imagine the fear within Google at seeing these new tools potentially eating into its lucrative search dominance. It may have seemed like the “Nokia/iPhone” moment for Google!
But you may have noticed that things have gone a little quiet. There are fewer news articles about generative AI, there is less chatter and some of that initial buzz has definitely faded. In fact, an independent source last week estimated that traffic to ChatGPT, having plateaued, has now started to decrease. A recent survey of SEOs also showed that most are no longer interested in Google’s new experimental generative AI results in the US.
Whilst these tools definitely have some very powerful business use cases, for the vast majority of people, on a day-to-day basis, they don’t. We are possibly seeing the initial novelty wearing off, although these tools will evolve and they are here to stay. I suspect quite strongly that Google will re-evaluate how it integrates generative AI into its search results if the early signs are that it’s not actually what people ultimately want from a search engine.
When I was first starting out building websites in 2000, Google Analytics was but a dream. In those days we had very basic analytics software, such as AWStats, although those with some money to spend could pay for an expensive tool called Urchin Analytics, which provided much more website usage data.
Luckily for website owners around the world, Google was on a spending spree and in 2005 reportedly spent $30m to acquire Urchin. The product was then rebranded as “Google Analytics” (GA) and made available to everyone…for free! Happy days 🙂
For the last 18 years we’ve grown to know, understand and query GA like seasoned data analysts and so it is with some sadness (and frustration) that we bid a fond farewell to a tool that’s been a key part of our business for as long as it’s existed.
GA4 is the new analytics tool replacing GA as of the 1st of July. To say that GA4 has received a very cold reception since it was introduced is putting it mildly. But we will need to adopt it, adapt to it and hopefully in 18 years’ time, learn to love it…at least a little bit. Farewell GA, you will be missed.
Google has always been very clear about one thing…”Produce great quality content that users will love”. Follow this simple guideline and you’ll be rewarded with Google traffic. Actually, it’s not that straightforward; there are many factors at play. But what is “quality content”?
In a recent Google SEO office-hours Q&A session, Google’s Gary Illyes was asked about issues related to the indexing of a website. Gary clarified that in order for Google to properly index a website, it depends upon the popularity of the website (backlinks) and the quality of the content.
Previously, Google engineers have stated that quality content is not just about the textual part of the page, but also the quality and integration of images, the page layouts and page load speed, to name a few.
One of the ways to look at this is from an intent perspective. If a user is searching for “anniversary gift ideas”, it’s unlikely that they want to read a lot of text, they probably want to see images of actual gifts that they could buy for their partner. A page with just text on it, is unlikely to rank for this search term. In many cases, especially for searches where the intent is for an expected visual result, it’s probably better to make the images the priority, even above the text.
There is a new trend now towards the use of AI images on web pages. These can sometimes be misleading to users and Google has announced that they will be identifying AI images in their search results and labelling them accordingly. With the ease now of creating new images, it’s a great time to ensure that web pages are deemed higher quality by search engines.
Users tend to scan text, but will always focus on an image. Use them to your advantage to make pages high quality…you will be rewarded!
AI content has for some time been flooding the internet, and it’s only going to increase, possibly exponentially, going forward. This is giving search engines quite a headache, especially when it comes to the accuracy of content.
Firstly, many of the AI tools are stuck in time. ChatGPT is stuck in September 2021, Anthropic’s Claude is stuck in early 2020 and other tools have similar cut-off points. It means that they don’t have any new or fresh data to reference and use in their responses.
Secondly, AI chatbots suffer from hallucinations, which Wikipedia describes as “a confident response by an AI that does not seem to be justified by its training data”. Basically, it makes things up! We’ve seen this quite often during our SEO testing over the last few months. The AI will create fake quotes, fake businesses and link to resources that don’t exist.
Website owners may be using these tools to generate their own digital marketing content, unaware that they may be publishing misleading or fake information.
Search engines now have the challenge of determining what is true and what isn’t. They can’t afford to surface incorrect information. Historical facts may not be such an issue, but future facts and information…AI could distort these and create a virtual future which, ultimately, is just a hallucination. For now, just make sure your content is accurate and we’ll worry about the future tomorrow!
At Artemis we’ve always had one key philosophy and it’s the essence of the messaging on our website…we help our clients make more money. If clients aren’t getting more leads or generating more revenue, then we aren’t doing our job properly.
For that reason, we’ve always focused on the search terms which drive targeted traffic to their websites. We’ve never been about chasing traffic numbers. It’s easy to just get more traffic, but if it’s not targeted and the intent isn’t commercial, it’s generally quite pointless.
With the relentless arrival of AI directly in the search results, this has now become even more relevant. Many SEO strategies over the years have focused on increasing traffic through content generation; even where this has worked, it’s not likely to keep working for long, as the AI will be serving up information directly in the search results.
The only search terms that will matter are the money terms, such as searches for local services and commercial queries for products. We’re fortunate to have always focused on the money terms; it’s what works now and will work in the future.
Excuse the pun in the title! This week Neeva announced that it was shutting down its subscription based, ad-free, privacy focused search engine. Although this news was greeted by quite some surprise in the search world, it wasn’t really a surprise at all. How many people have actually used Neeva? No one I know has ever used it, let alone heard of it.
You only have to look at the struggle that Bing has had, with all of the mighty resources of Microsoft behind it, trying to compete with Google and failing. Bing has a minuscule share of the search market, probably less than 5%.
So, how was a search engine, that you had to pay to use, ever going to compete against the free established options? Neeva stated that the problem wasn’t so much getting users to pay, it was just the underestimated difficulty in getting people to change from what they are so familiar with. Once users are used to having something for free, it’s practically impossible to make them pay for it. Even Twitter now, with its pointless paid options, has only got a tiny percentage of users to part with their money, and most of those are either Musk fanatics or those that desperately want a little blue tick…which means nothing now.
Search has evolved to the extent that it is highly complex and requires vast amounts of money. It’s also part of our lives. It’s highly likely that people will still “google it” for a long time to come. The competition just isn’t there and we’re so ingrained in Google’s ecosystem.
With all the recent advancements in AI, and especially Google’s planned integration of its chatbot directly into the search results, one of the biggest concerns for any business or website owner is whether Google will still end up driving traffic to their websites when search becomes “Generative AI search”.
The short answer is “Yes”. And the reason is very simple: it’s all about the money. In 2022, Google generated a total of $224.47B in advertising revenue, of which $162.45B (72%) was from ads in search (PPC), $32.78B (15%) from ads on third-party sites (AdSense) and $29.24B (13%) from YouTube.
Google can’t afford to jeopardise any of that revenue; on the contrary, it has to show continuous growth. Paid ads send traffic to websites, and the content network of sites displaying Google ads (AdSense) alone generates more revenue than YouTube, with a significant amount of its traffic coming from search.
Website traffic will change, but it’s not going away. It can’t, it’s too valuable for Google.
Yesterday was Google’s much anticipated Google I/O annual conference, and the focus throughout the whole presentation was extensively about the application of AI in Google products. The main area of interest was how Google plans to integrate AI into the search results, what it called a “Generative AI Search Experience”.
What Google revealed was not a complete surprise. We were anticipating that, unlike Bing, Google’s AI chatbot would be fully integrated and front and centre in the search results. It showed how the AI will help users to get a more “helpful” search experience by being able to easily expand on topics, and help users to find products to buy based on Google’s huge Shopping database.
It was interesting to note that there are areas where the AI is less likely to appear, and that’s for critical searches: generally anything to do with finance and health, what Google calls “YMYL” (Your Money or Your Life) pages. It will still prefer to show trusted search results, as it does now.
The examples Google showed of its new search in action were mainly for long queries, the type you’d ask a chatbot. But we will need to see what happens when it’s used for searches relating to local businesses or finding a particular service.
At Artemis we’ve been embracing and adapting to the influence of AI in search for many years, and we’re excited for this next phase and maximising the opportunities for our clients.
Google I/O is an annual conference where Google reveals its latest news and innovations to the public. Today’s conference is an important one and is being highly anticipated, with many in the SEO community trying to predict what will be announced.
It’s no secret that Google has been caught on the back foot with the onslaught of AI tools that have been released following the success of ChatGPT and Bing launching its own AI chatbot back in February. Google’s subsequent release of Bard, its own chatbot, has received a very lukewarm reception. It felt like it was just too little too late.
But in today’s conference, Google is expected to announce its plans to integrate AI into its search engine and to make search more personal, interactive and, well, up to date. This will be big news for all website and business owners, as any change to how Google handles and displays its search results always impacts the amount of traffic that Google sends to websites.
AI is going to be front and centre of today’s conference and we can expect that some major changes are coming, and probably to be launched in the coming weeks and months. We’ve already started seeing some of the possible changes coming to search, but Google is always testing so it’s difficult to know if these changes are permanent or just part of its regular testing schedule.
You can watch the conference live tonight at 6pm (UK time) and this year, I definitely think it’s worth the time.
Google has recently updated its helpful content guidelines to include a section about page experience. Last year, Google began rolling out new updates called “Helpful Content Updates”. These were designed to promote websites in the search results which were deemed to have, well, helpful content. It is worth reading what Google deems to be helpful content and why some websites may perform better than others in search when it comes to these criteria.
It is important to note that page experience isn’t necessarily a ranking factor on its own. It’s a combination of factors, such as website security, lack of ad intrusion, good core web vitals etc. This differs from helpful content which is more about the content itself.
Including page experience as part of the helpful content criteria is to guide website owners in ensuring that if their website does happen to have the best content, then just make sure a user can properly interact with it. Don’t make the content difficult to access, don’t plaster the page with too many ads, make sure the page works well on mobile, etc.
Although relevancy to the search query will likely trounce most other search ranking factors, if there are two listings that have similar helpful content, but one has a better user experience, then that website may just get to appear above the less usable result. It’s always a very good idea to create a great user experience…and it’s good that search engines are starting to take this into account!
Not a day goes past without some new and urgent news related to AI and search. It currently seems that Google is under some pressure to respond to the influx of AI tools, having been caught on the back foot by Bing launching its AI chat feature at the beginning of the year.
A recent article in the New York Times stated that Google now has over 160 engineers, designers and executives working full time on project “Magi”. Magi is an updated version of Google search incorporating AI answers directly in the search results, offering a much more personalised approach. It seems this could be launched as soon as May, with internal testing already under way. The initial release will be limited, with a gradual rollout.
As if that wasn’t enough, the article also mentioned that Google is working on a new search engine built on AI. This is intended to be much more conversational and personalised when compared to the existing search engine. This project seems a long way off yet but if anything, it highlights that the future of search is scheduled to change, possibly quite significantly compared to what it is today.
And finally, news also surfaced that Samsung is looking to change the default search engine on its phones from Google to Bing, possibly because of Bing’s much more integrated AI chat advantage. Being the default search engine brings in billions of dollars in revenue for Google! We’ll wait and see what happens…
Google has just released its annual webspam report and, as expected, its webspam-fighting algorithm, terribly named SpamBrain, has been busier than ever fighting an ever-increasing amount of spam on the web. There was a 5-fold increase in the number of spam websites detected in 2022 compared to 2021, and a 50-fold increase in detected link spam.
What’s not clear is if the increase is attributed to there being a higher level of spam websites and links than the previous year, or that Google is just getting better and better at detecting it. I assume it’s a combination of the two.
Webspam has always been an issue, ever since the web was created, and although Google doesn’t always manage to detect it, it does a pretty good job of keeping the worst of it out of the search results. We do see spam websites appear in the results from time to time, but it is generally quite rare for Google to lead you to a website that is pure spam or likely to ruin your day by installing malicious software on your computer.
The huge amount of spam content and links means there is a lot of noise on the web, and it’s still challenging for search engines to detect it all. As legitimate content creators, we possibly have to do more to stand out and be heard. Google doesn’t index all the content it discovers, so a focus on quality and authority is a major consideration for website owners and businesses today. It’s only getting noisier out there; we just need to keep doing more (good things) to be heard and be rewarded for it.
It’s no secret that backlinks matter. Inbound links are a fundamental ranking factor in Google’s ranking algorithms and although content quality is very important, in order to build up authority online a website needs quality inbound links. If you don’t believe me, try ranking for car insurance with a great page, full of the best content but with no inbound links!
However, not all links are created equal and some links will have more of an effect than others. There are many factors that determine what weighting is passed across from a link (which we won’t go into here), if at all, but a common discussion around links is about authority vs relevance. Which one is more important?
The reality is that a link from an authoritative and trusted website, not necessarily in a related industry, is going to be more impactful than a link from a lower-authority website that may be more relevant to the destination site. Both links, if Google’s algorithm deems them to be “natural” links, are likely to help the linked-to page rank better, but the authoritative website is likely to have the greater influence.
It is for this reason that you often see large authority websites, which may not necessarily have the best content, still outranking better, more relevant websites with lower overall authority. Authority (links) matters. Links are still a very significant factor when it comes to good rankings. Authority first, relevancy second.
Microsoft was immensely proud of being the first major search engine company to release an AI chatbot, and rightly so. Despite Google claiming to be an “AI First Company”, it was slow to release its AI chatbot and was well and truly beaten to it by Microsoft, which implemented a chatbot in its Bing search engine in February, forcing Google to release its competing chatbot a month later.
Microsoft had certainly seen this as an opportunity to potentially steal some search market share from Google. Despite Microsoft’s attempts and resources, Bing only has a 6% share of the US search market, and 4% of the UK. Those are very low numbers but Microsoft was anticipating an increase in search share based on its implementation of Bing AI.
However, although Bing usage has reportedly increased, this is not translating into an increase in search traffic to websites. Having spent some time today looking at the evolution of Bing traffic to some of my websites, it looks very varied: some are up and some are down. In the best case, one of my sites has seen a 16% increase in traffic from Bing over the last month compared to last year. Although that may sound impressive, Bing still accounts for only 4% of the site’s overall traffic; most of the rest is from Google.
It’s early days, but I think this reflects the fact that those of us in the industry, or related industries, are constantly playing with these tools and looking for opportunities; we are in a slight bubble, as users in general are not rushing to try out these new AI tools…just yet. We’ll keep monitoring the traffic from Bing but right now, I don’t envisage any change to the status quo, not for some time yet anyway.
The world is going AI crazy and it’s become quite overwhelming. In March, over 1,000 new AI tools were launched, covering all sorts of real-world applications built on the foundations set by ChatGPT, which is now using GPT-4. Google launched its much anticipated AI chatbot, Bard; Anthropic launched Claude; Midjourney, the AI image generator, launched V5, which creates unbelievably realistic images…and so much more.
The plethora of tools, and of articles about all of the tools, how to use them and how to benefit from them, is all becoming a little too much, too fast. The rate of innovation and adoption has been unprecedented, and even Google’s release of Bard hasn’t exactly set the world alight, arriving possibly a little late to the party.
The onslaught of AI tools has also opened up a whole debate about the ethics behind these tools (they use content found online to generate their responses, yet mostly don’t credit the sources) and the fact that they are mostly open to everyone, regardless of age. Italy has been the first country to acknowledge the risks of these AI tools and has banned ChatGPT from being used. This is probably just the beginning.
It’s just all happening way too fast but the pace of innovation is unlikely to slow down any time soon. If you feel overwhelmed by it all, you’re not alone. Personally, I’m avoiding a lot of the social media activity around the tools and taking the time to just understand and master a couple of tools and ignoring the others. We can’t slow down the progress of AI tools, but we can control how we choose to consume them.
Google just announced that it has started rolling out its latest core update, imaginatively called the March 2023 Core Update. The last core update was back in September 2022, but that one was quite mild, especially compared to the highly impactful update from May 2022.
Core updates are essentially key improvements made to Google’s algorithms, designed to improve the relevancy and quality of the search results. It may not always seem that the results of core updates have really improved the quality of the search results, but this is a continuous and evolving process and essential in dealing with increasing spam and in making sense of an ever changing web.
I personally love to see core updates being rolled out. As a business, we always focus on creating high quality websites and content for ourselves and our clients, and it’s sometimes frustrating to see lower quality, less helpful results in the higher ranking positions. Core updates help to reshuffle the results and recognise the pages that should be ranking better than they are.
Losing rankings during a core update does not mean that a website has been penalised; it just means that the content on the page may now be deemed less relevant for the search query. Core updates help to maintain focus and to understand how Google now sees the intent behind search queries. It’s a great time to focus on increasing quality and on-page relevancy. SEO never stops!
I’ve touched on this before, but it’s worth a post of its own as I still keep seeing it. Not only is automatically redirecting users to the version of a website you think they should be on very annoying, it can also create huge problems for search engines, as they may not be able to access all of your content.
Say I’m in the UK and I access a website that is set up for international traffic. The website automatically redirects me to the UK version of the site, perhaps with pricing in GBP. This is probably what I want to see, as I’m in the UK. Now say someone in the US tries to access the UK website but gets automatically redirected back to the US version, with pricing in USD. It’s probably the correct version of the website for that user, but here’s where the problems begin.
Google’s crawler, Googlebot, mostly crawls from the US. It means that if Googlebot tries to access the UK website, it will automatically be redirected to the US version instead. In other words, Google cannot crawl, index or rank the UK website as it can’t see it.
It’s best to ask users what version of the website they would like to go to. Don’t assume that just because they are in a certain country that they want the local variant, or even worse, assume that they speak the local language. So, always ask, never presume!
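As a concrete sketch of the “suggest, don’t force” approach, here’s some hypothetical server-side logic in Python (the domain, country codes and function names are all made up for illustration). The requested page is always served; a banner merely offers the detected local version, and a redirect only happens when the user explicitly picks a region:

```python
# Hypothetical regional versions of a site (illustrative URLs only).
REGIONAL_SITES = {
    "GB": "https://example.com/uk/",
    "US": "https://example.com/us/",
}

def handle_request(visitor_country, chosen_region=None):
    """Decide how to respond to a request for any regional variant."""
    if chosen_region:
        # The user explicitly picked a region: a redirect is now fine.
        return ("redirect", REGIONAL_SITES[chosen_region])
    suggestion = REGIONAL_SITES.get(visitor_country)
    if suggestion:
        # Serve the page as requested and merely suggest the local version.
        return ("serve_page_with_banner", suggestion)
    return ("serve_page", None)
```

Handled this way, Googlebot requesting the UK pages from a US IP address still receives the UK content and can crawl and index it. Marking up the regional versions with hreflang annotations also helps Google serve the right variant to the right users.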
It’s interesting when we speak to potential new clients and get onto the subject of the usability and conversion set-up of their websites. SEO has traditionally been thought of as optimising a website’s titles, metas and headings, with some off-page work in the form of link building. Talking to potential clients, we often hear the phrase “I don’t really understand what you do”, as they see SEO as some sort of black box, a potentially underhand service they can’t see.
At Artemis we are very much an open book, and we want our clients to understand what we are doing and why. It’s important that they are invested in the process, as it takes time to build up a reputation in order to increase rankings and traffic. However, our initial focus is also very much on usability and website conversions. Making a website generate more leads or sales is a much quicker process than the organic side of SEO, and this can create some early wins, especially if the website is not well set up to convert traffic.
We see CRO, Conversion Rate Optimisation, as an absolutely integral part of SEO. If clients experience an increase in sales without an increase in traffic early on then it starts to justify the investment in SEO and allows the organic side of things to build up over time. Traffic is not a good metric of success as not all traffic has the same value. Our approach is to always focus on targeted traffic whilst simultaneously working to improve conversions across the website. This has the added benefit that as the traffic grows it will convert at a higher rate. That way everyone is happy!
This is a topic we get asked about quite often. The SEO world is burdened by terrible paranoia that the wrong move can destroy rankings and traffic in an instant: building the wrong types of links to your website, having low quality content, or Google deciding that your website is not as helpful as you think it is. We’ve seen many types of spam techniques over the years, many of which still work today, but one thing that is often considered “spam” may not be spam at all.
Say you come across a competitor that has gone out of business and you think it would be a good idea to buy their domain name and repoint it to yours, because they had some half-decent rankings and some branded traffic. Is it wrong to do this?
The reality is that it’s not, and there is nothing wrong with “merging” two websites. In the real world it’s quite normal for one business to buy another and merge the two into one bigger business. The same is true online. The best approach is to merge the acquired domain on a page-level basis, redirecting each of its pages to the nearest equivalent on your website, as this will maximise the effect of the merger. It’s best not to just redirect everything to your home page.
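To illustrate the page-level approach, here is a minimal sketch that generates one redirect rule per old page. The URL pairs and the example.com destination are hypothetical placeholders, and the output format assumes an Apache-style setup; the same mapping idea applies to any server.

```python
# Hypothetical sketch of a page-level domain merge. The old/new URL pairs
# and the destination domain are illustrative placeholders, not real data.
redirect_map = {
    "/services/": "/our-services/",
    "/about-us/": "/about/",
    "/blog/": "/insights/",
}

def build_htaccess_rules(mapping, destination="https://www.example.com"):
    """Emit one Apache-style 301 redirect per old page, so each page
    points to its nearest equivalent rather than the home page."""
    return "\n".join(
        f"Redirect 301 {old_path} {destination}{new_path}"
        for old_path, new_path in mapping.items()
    )

print(build_htaccess_rules(redirect_map))
```

The key design point is the one-to-one mapping: each old page passes its relevance to the closest matching new page, rather than everything being funnelled to the home page.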
Where you do need to take care is when you want to repoint multiple domains to your website. This can trip a filter in Google’s algorithm, as it can appear unnatural. In reality, in the worst case Google may simply not pass any benefit from those domains to your website. Everything is fine in moderation, just don’t overdo it!
Over the weekend I finally got access to the new Bing AI Chat and what an impressive thing it really is. I spent several hours on Saturday playing around with it and came away feeling that this is the future, although it’s not quite the now. Let me explain…
I first accessed the internet as a student in 1992. Back then, the concept of “searching” was very new and the number of available websites was very small. Search engines at the time were not very good either, which meant the results were often poor and lacking. It took a long time after that for the internet, and searching in general, to become what could essentially be described now as a habit. We automatically just go to Google when we want to know something.
Bing AI Chat is impressive, but it’s a very different way of interacting with a search engine and it will take time to adjust. Understanding how to prompt the AI is very important, and that’s a skill in itself which will need to be developed. With reportedly a million people on the wait list for access to Bing’s AI Chat, it shows that this still has some way to go before it is adopted by the masses.
We are entering a very new and exciting time, but this isn’t going to change search overnight…although it will happen as it becomes more normal. We’re creatures of habit and we’ve become accustomed to searching and trusting the results from search engines. That’s not going to change any time soon…but it will change and evolve eventually.
I’ve discussed Core Web Vitals a couple of times in my insights, and my advice has always been the same. Don’t focus too much on it unless your website is really slow and problematic for users. The reason is that trying to achieve a good CWV score can be very difficult and very time-consuming, especially when most gains will be very minimal.
I think what generally happens is that SEOs use CWV as a focus because it’s a measurable action, albeit one that may ultimately have no, or very little, impact on the performance of the website for users and in search. The reality is that for most small businesses it’s not an issue.
For a typical small, local business, in Search Console there generally isn’t enough data for Google to be able to report back the metrics for CWV for the website. This means that it can’t take these metrics into account for ranking purposes.
If you’ve got a website that’s enjoying #1 rankings for all of its target keywords then sure, spend some time trying to fine-tune all of the CWV metrics, but otherwise, don’t lose too much sleep over it. It’s a sentiment that was recently shared by John Mueller from Google. There are much better things you could be spending your time on.
I’ve been optimising and marketing websites since 2003, and during that time it’s always been funny hearing the “SEO is dead” comments reverberating around the SEO community. As search evolved and became more complex, those who were unable to stay ahead of the game were inevitably the ones calling an end to SEO. And with the sudden (but expected) increase in AI activity online, those who can’t see a way forward are yet again declaring the end of SEO.
We’ve been through some very significant moments where search has changed radically, from Google learning to understand content, rather than treating it as just a bunch of words, to the addition of featured snippets in the results, and much more. Through each of these phases we’ve been able to embrace the changes, adapt and continue to grow traffic and sales for our clients.
We’ve spent a significant amount of time and money over the last few years understanding how AI is impacting search, so it’s what we’ve been expecting and working towards. Yes, search is going to change, but it’s not going to happen overnight. AI tools are just not good enough yet; there have even been some interesting cases of Bing’s new AI chat getting a bit nasty with users. Search engines are trusted to deliver reliable results, and until these AI tools can be fully trusted (we are very far away from this), there will only be a gradual integration of this functionality into the search results.
SEO is far from dead. Search engines and AI tools need content, they need websites and they need optimisation to understand what everything is. Those who can’t navigate this new era will exit the space, and maybe the SEO world will end up with a higher-quality level of SEO professionals. It’s definitely going to get harder, but we love a challenge!
This week, Google celebrated Valentine’s Day with an unconfirmed search algorithm update. Various tools which track changes in the search results reported a high level of volatility, although there was no confirmation from Google that an update was under way. It is quite common to notice an increase in the volatility of the search results while an update is rolling out, as Google only confirms an update once its effects start becoming noticeable.
However, what about the updates that Google never confirms? The reality is that Google launches hundreds of updates to its search algorithms every year, practically on a daily basis. Many are relatively small and most won’t be noticed by users. A recent comment from Gary Illyes of Google was very interesting: he stated that very often, when public search monitoring tools report big changes in the search results, those changes don’t align with any significant update that Google has made. In other words, Google doesn’t know why these tools would be reporting any major changes in the results. They remain a mystery…even to Google!
I came across a typical discussion online yesterday regarding bounce rate. Bounce rate is a measure of the proportion of users who land on a page and leave again without clicking through to any other page on the website. There is a common myth in the SEO world that a high bounce rate is a bad thing and a negative ranking factor. Actually, it isn’t. Representatives from Google have commented on various occasions that Google does not look at bounce rates. In fact, when GA4, Google’s new analytics software, was launched, it didn’t even include bounce rate as a metric.
The simple rule of thumb regarding bounce rate is that it’s likely to be relative, rather than absolute. For example, if you are searching for a gift for your partner’s 20th birthday, it’s likely that you’ll click through to a search result, read the 20th birthday gift ideas and then leave. It’s unlikely that you’re going to be interested in gift ideas for other years. Therefore, for these types of results it’s fine, and expected, that users will bounce back to the search results, causing a high bounce rate figure. If, however, you are looking for a local surveyor, you may spend more time on a website understanding the services provided and the surveyors themselves. In this case you wouldn’t want a high bounce rate, and it shouldn’t be expected.
Therefore, think of bounce rate as relative. If it’s high, it may be absolutely normal.
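To make the definition above concrete, here is a small sketch that computes bounce rate per landing page. The session data is entirely made up for illustration, echoing the gift-ideas and surveyor examples; real data would come from your analytics platform.

```python
# Illustrative sketch: bounce rate per landing page, where a "bounce" is a
# session that viewed only one page. The session data below is invented.
from collections import defaultdict

sessions = [
    {"landing_page": "/20th-birthday-gifts/", "pages_viewed": 1},
    {"landing_page": "/20th-birthday-gifts/", "pages_viewed": 1},
    {"landing_page": "/20th-birthday-gifts/", "pages_viewed": 2},
    {"landing_page": "/surveying-services/", "pages_viewed": 4},
    {"landing_page": "/surveying-services/", "pages_viewed": 1},
]

def bounce_rates(sessions):
    """Bounce rate = single-page sessions / total sessions, per landing page."""
    totals = defaultdict(int)
    bounces = defaultdict(int)
    for s in sessions:
        totals[s["landing_page"]] += 1
        if s["pages_viewed"] == 1:
            bounces[s["landing_page"]] += 1
    return {page: bounces[page] / totals[page] for page in totals}

for page, rate in bounce_rates(sessions).items():
    print(f"{page}: {rate:.0%}")
```

The point of the per-page breakdown is exactly the relativity argument: a high figure on the gift-ideas page is normal, while the same figure on a services page might warrant a closer look.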
Google Search Console is a great tool, full of useful data to help you understand how your website is performing in search, as well as any issues Google finds that may be holding it back, such as errors and broken schema markup. However, there is one feature that is a little tucked away and doesn’t seem to get much coverage: the Crawl Stats report. It sits under the Settings section in Search Console, which is a little odd, as you can’t actually set anything for crawl stats!
We recently had an issue with a website where new content was not being indexed and some of the older pages were being deindexed. We went through the process of checking that the content was indexable, internally linked and not blocking search engines, and everything was fine. The answer, it turned out, lay in the Crawl Stats report: a recent increase in the “Average response time” had caused a drop in the “Total crawl requests”. In other words, something was causing Google to take longer to crawl the pages of the website, so it started crawling less often.
We managed to find the cause of the issue and fix it, and crawling and indexing were restored; new content started getting indexed again. It’s good to keep an eye on the Crawl Stats report, as it can highlight issues that no other report in Search Console will expose. Google needs to be able to crawl a website efficiently, as fast as possible, so it’s a good idea not to make things difficult for Googlebot!
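If you want a quick, independent sanity check between Crawl Stats refreshes, a small script can spot-check how long a handful of pages take to fetch. This is a hedged sketch, not a replacement for the report: the URLs are placeholders, and the timing loosely mirrors, rather than reproduces, Google’s “Average response time” metric.

```python
# Illustrative sketch: average fetch time across a few pages, as a rough
# stand-in for the "Average response time" metric in the Crawl Stats report.
import time
import urllib.request

def average_response_ms(urls, fetch):
    """Return the mean time in milliseconds taken by fetch(url) across urls."""
    timings = []
    for url in urls:
        start = time.monotonic()
        fetch(url)  # Fetch the page, as a crawler would.
        timings.append((time.monotonic() - start) * 1000.0)
    return sum(timings) / len(timings)

# Real usage would fetch over the network, with your own page URLs, e.g.:
# pages = ["https://www.example.com/", "https://www.example.com/services/"]
# print(average_response_ms(pages, lambda u: urllib.request.urlopen(u).read()))
```

Passing the fetch function in keeps the sketch testable offline; run it periodically and a sudden jump in the average is the kind of change that, in our case, preceded the drop in crawl requests.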