Justin's Daily SEO Insights
Justin, our Technical Director, shares his daily thoughts on what is happening in the world of SEO...
Neeva was neeva going to work
Excuse the pun in the title! This week Neeva announced that it was shutting down its subscription-based, ad-free, privacy-focused search engine. Although this news was greeted with some surprise in the search world, it wasn’t really a surprise at all. How many people have actually used Neeva? No one I know has ever heard of it, let alone used it.
You only have to look at the struggle that Bing has had, with all of the mighty resources of Microsoft behind it, trying to compete with Google and failing. Bing has a minuscule share of the search market, probably less than 5%.
So how was a search engine that you had to pay to use ever going to compete against the free, established options? Neeva stated that the problem wasn’t so much getting users to pay; it had simply underestimated how difficult it is to get people to change from what they are so familiar with. Once users are used to having something for free, it’s practically impossible to make them pay for it. Even Twitter, with its pointless paid options, has only got a tiny percentage of users to part with their money, and most of those are either Musk fanatics or people who desperately want a little blue tick…which means nothing now.
Search has evolved to the point where it is highly complex and requires vast amounts of money to run. It’s also part of our lives. It’s highly likely that people will still “google it” for a long time to come. The competition just isn’t there, and we’re so ingrained in Google’s ecosystem.
Why Google will still be sending traffic to websites in an AI world
With all the recent advancements in AI, and especially Google’s planned integration of its chatbot directly into the search results, one of the biggest concerns for any business or website owner is whether Google will still end up driving traffic to their websites when search becomes “Generative AI search”.
The short answer is “Yes”. And the reason is very simple. It’s all about the money. In 2022, Google generated a total of $224.47B in advertising revenue, of which $162.45B (72%) was from ads in search (PPC), $32.78B (15%) from ads on third-party sites (AdSense) and $29.24B (13%) from YouTube.
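As a quick sanity check on those shares, here’s a minimal sketch (the figures are just the three segment totals quoted above, in billions of dollars):

```typescript
// Google's 2022 advertising revenue by segment, in $ billions.
const segments: Record<string, number> = {
  "search (PPC)": 162.45,
  "third-party sites (AdSense)": 32.78,
  "YouTube": 29.24,
};

// Total across the three segments, and each segment's share of it.
const total = Object.values(segments).reduce((sum, v) => sum + v, 0);
for (const [name, revenue] of Object.entries(segments)) {
  console.log(`${name}: ${((100 * revenue) / total).toFixed(1)}% of $${total.toFixed(2)}B`);
}
// search (PPC): 72.4%, third-party sites (AdSense): 14.6%, YouTube: 13.0%
```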
Google can’t just jeopardise any of that revenue. On the contrary, it has to show continuous growth. Paid ads send traffic to websites, and the content network, which displays Google ads (AdSense), generates billions in revenue on its own. The content network alone brings in more revenue than YouTube, and a significant amount of its traffic comes from search.
Website traffic will change, but it’s not going away. It can’t, it’s too valuable for Google.
Google’s new “Generative AI Search” announced
Yesterday was Google’s much-anticipated annual I/O conference, and the focus throughout the presentation was firmly on the application of AI across Google products. The main area of interest was how Google plans to integrate AI into the search results, in what it called a “Generative AI Search Experience”.
What Google revealed was not a complete surprise. We were anticipating that, unlike Bing, Google’s AI chatbot would be fully integrated and front and centre in the search results. Google showed how the AI will give users a more “helpful” search experience, letting them easily expand on topics and find products to buy based on Google’s huge Shopping database.
It was interesting to note that there are areas where the AI is less likely to appear, and that’s for critical searches: generally anything to do with finance and health, what Google calls “YMYL” (Your Money or Your Life) pages. Google will still prefer to show trusted search results for these, as it does now.
The examples Google showed of its new search in action were mainly for long queries, the type you’d ask a chatbot. We will still need to see what happens when it’s used for searches to do with local businesses or finding a particular service.
At Artemis we’ve been embracing and adapting to the influence of AI in search for many years, and we’re excited for this next phase and maximising the opportunities for our clients.
Anticipation for Google I/O conference today
Google I/O is an annual conference where Google reveals its latest news and innovations to the public. Today’s conference is an important one and is being highly anticipated, with many in the SEO community trying to predict what will be announced.
It’s no secret that Google has been caught on the back foot by the onslaught of AI tools released following the success of ChatGPT, and by Bing launching its own AI chatbot back in February. Google’s subsequent release of Bard, its own chatbot, has received a very lukewarm reception. It felt like it was just too little, too late.
But in today’s conference it is expected that Google will announce its plans to integrate AI into its search engine and to make search more personal, interactive and, well, up to date. This will be big news for all website and business owners, as any change to how Google handles and displays its search results always impacts the amount of traffic that Google sends to websites.
AI is going to be front and centre of today’s conference and we can expect that some major changes are coming, probably launching in the coming weeks and months. We’ve already started seeing some of the possible changes coming to search, but Google is always testing, so it’s difficult to know if these changes are permanent or just part of its regular testing schedule.
You can watch the conference live tonight at 6pm (UK time) and this year, I definitely think it’s worth the time.
Helpful content needs a good page experience
Google has recently updated its helpful content guidelines to include a section about page experience. Last year, Google began rolling out new updates called “Helpful Content Updates”. These were designed to promote websites in the search results which were deemed to have, well, helpful content. It is worth reading what Google deems to be helpful content and why some websites may perform better than others in search when it comes to these criteria.
It is important to note that page experience isn’t necessarily a ranking factor on its own. It’s a combination of factors, such as website security, a lack of intrusive ads, good Core Web Vitals, etc. This differs from helpful content, which is more about the content itself.
Including page experience as part of the helpful content criteria is there to guide website owners: if your website does happen to have the best content, make sure a user can properly interact with it. Don’t make the content difficult to access, don’t plaster the page with too many ads, make sure the page works well on mobile, etc.
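If you want a quick way to check the Core Web Vitals side of this, Google’s PageSpeed Insights API exposes the real-user field data it holds for a URL. Here’s a minimal sketch, assuming Node 18+ for the built-in fetch (the response fields shown are the commonly returned ones; check the API documentation for your case, and an API key is needed for heavier usage):

```typescript
// Minimal sketch: query the PageSpeed Insights API for the real-user
// Core Web Vitals data Google holds for a URL (where it has enough data).
const PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";

async function coreWebVitals(url: string): Promise<void> {
  const apiUrl = `${PSI_ENDPOINT}?url=${encodeURIComponent(url)}&strategy=mobile`;
  const response = await fetch(apiUrl);
  const data: any = await response.json();

  // loadingExperience holds field (real-user) metrics when available;
  // for low-traffic sites this section is often missing entirely.
  const metrics = data.loadingExperience?.metrics ?? {};
  for (const [name, value] of Object.entries<any>(metrics)) {
    console.log(`${name}: ${value.percentile} (${value.category})`);
  }
}

coreWebVitals("https://www.example.com/").catch(console.error);
```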
Although relevancy to the search query will likely trounce most other search ranking factors, if there are two listings that have similar helpful content, but one has a better user experience, then that website may just get to appear above the less usable result. It’s always a very good idea to create a great user experience…and it’s good that search engines are starting to take this into account!
Google rushing out new AI features and new search engine
Not a day goes past without some new and urgent news related to AI and search. It currently seems that Google is under some pressure to respond to the influx of AI tools, having been caught on the back foot by Bing launching its AI chat feature at the beginning of the year.
A recent article in the New York Times stated that Google now has over 160 engineers, designers and executives working full time on project “Magi”. Magi is an updated version of Google search that incorporates AI answers directly in the search results, offering a much more personalised approach. It seems that this could be launched as soon as May, with internal testing already under way. The initial release will be limited, with a gradual rollout.
As if that wasn’t enough, the article also mentioned that Google is working on a new search engine built on AI. This is intended to be much more conversational and personalised compared to the existing search engine. This project seems a long way off yet, but if anything it highlights that the future of search is set to change, possibly quite significantly compared to what it is today.
On top of that, news also surfaced that Samsung is looking to change the default search engine on its phones from Google to Bing, possibly because of Bing’s much more integrated AI chat advantage. Being the default search engine brings in billions of dollars in revenue for Google! We’ll wait and see what happens…
Google’s spam fighting and what it means for non-spammers
Google just released its annual webspam report and, as expected, the results show that its web spam-fighting algorithm, terribly named SpamBrain, has been busier than ever fighting an ever-increasing amount of spam on the web. There was a 5-fold increase in the number of spam websites detected in 2022 compared to 2021, and a 50-fold increase in the amount of link spam detected.
What’s not clear is if the increase is attributed to there being a higher level of spam websites and links than the previous year, or that Google is just getting better and better at detecting it. I assume it’s a combination of the two.
Webspam has always been an issue, ever since the web was created, and although Google doesn’t always manage to detect it, it does a pretty good job of keeping the worst of it out of the search results. We do see spam websites appear in the results from time to time, but generally it is quite rare that Google will lead you to a website that is pure spam or likely to ruin your day by installing some malicious software on your computer.
The huge amount of spam content and links means that there is a lot of noise on the web, and it’s still challenging for search engines to detect it all. As legitimate content creators, we possibly have to do more to stand out and be heard. Google doesn’t index all the content it discovers, so the focus on quality and authority is a major consideration for website owners and businesses today. It’s only getting noisier out there; we just need to keep doing more (good things) to be heard and be rewarded for it.
Authority or relevance – What’s more important for links?
It’s no secret that backlinks matter. Inbound links are a fundamental factor in Google’s ranking algorithms and, although content quality is very important, in order to build up authority online a website needs quality inbound links. If you don’t believe me, try ranking for “car insurance” with a great page full of the best content but no inbound links!
However, not all links are created equal and some links will have more of an effect than others. There are many factors that determine how much weight, if any, is passed across from a link (which we won’t go into here), but a common discussion around links is authority vs relevance. Which one is more important?
The reality is that a link from an authoritative and trusted website, not necessarily in an associated industry, is going to be more impactful than a link from a lower-authority website which may be more relevant to the destination website. Both links, if they are deemed by Google’s algorithm to be “natural”, are likely to help the linked-to page rank better, but the link from the authoritative website is likely to have the greater influence.
It is for this reason that you often see large, authoritative websites, which may not necessarily have the best content, still outranking more relevant websites with better content but lower overall authority. Authority (links) matters. Links are still a very significant aspect of a website when it comes to good rankings. Authority first, relevancy second.
Is Bing now driving more traffic after the launch of its AI chatbot?
Microsoft was immensely proud of having been the first major search engine provider to release an AI chatbot, and rightly so. Despite Google claiming to be an “AI-first company”, it was slow to release its own AI chatbot and was well and truly beaten to it by Microsoft, which integrated a chatbot into its Bing search engine in February, forcing Google to release its competing chatbot a month later.
Microsoft had certainly seen this as an opportunity to potentially steal some search market share from Google. Despite Microsoft’s attempts and resources, Bing only has a 6% share of the US search market, and 4% of the UK. Those are very low numbers but Microsoft was anticipating an increase in search share based on its implementation of Bing AI.
However, although Bing usage has reportedly increased, this is not translating into an increase in search traffic to websites. Having spent some time today looking at how Bing traffic to some of my websites has evolved, the picture is very mixed: some are up and some are down. In the best case, one of my sites has seen a 16% increase in traffic from Bing over the last month compared to the same period last year. Although this may sound impressive, it’s still only 4% of the overall traffic. Most of the rest is from Google.
It’s early days, but I think this reflects the fact that those of us in the industry, or related industries, are constantly playing with these tools and looking for opportunities; we’re in a slight bubble, and users in general are not rushing to try out these new AI tools…just yet. We’ll keep monitoring the traffic from Bing but right now, I don’t envisage any change to the status quo, not for some time yet anyway.
AI overload – GPT-4, Bard, Bing, Claude, Midjourney and so much more
The world is going AI crazy and it’s become quite overwhelming. In March, over 1,000 new AI tools were launched, covering all sorts of real-world applications built on the foundations set by ChatGPT, which is now using GPT-4. Google launched its much-anticipated AI chatbot, Bard; Anthropic launched Claude; Midjourney, the AI image generator, launched V5, which creates unbelievably realistic images…and so much more.
The plethora of tools, and of articles about them, how to use them and how to benefit from them, is all becoming a little too much, too fast. The rate of innovation and adoption has been unprecedented, and even Google’s release of Bard hasn’t exactly set the world alight, arriving possibly a little late to the party.
The onslaught of AI tools has also opened up a whole debate about the ethics behind these tools (they use content found online to generate their responses, yet they mostly don’t credit the sources) and the fact that they are mostly open to everyone, regardless of age. Italy has been the first country to act on the risks of these AI tools, banning ChatGPT. This is probably just the beginning.
It’s just all happening way too fast but the pace of innovation is unlikely to slow down any time soon. If you feel overwhelmed by it all, you’re not alone. Personally, I’m avoiding a lot of the social media activity around the tools and taking the time to just understand and master a couple of tools and ignoring the others. We can’t slow down the progress of AI tools, but we can control how we choose to consume them.
March 2023 core update rolling out – Why I like core updates
Google just announced that it has started rolling out its latest core update, imaginatively called the March 2023 Core Update. The last core update was back in September 2022, but that one was quite mild, especially compared to the highly impactful update from May 2022.
Core updates are essentially key improvements made to Google’s algorithms, designed to improve the relevancy and quality of the search results. It may not always seem that the results of core updates have really improved the quality of the search results, but this is a continuous and evolving process and essential in dealing with increasing spam and in making sense of an ever changing web.
I love to see core updates being rolled out. Personally, and as a business, we always focus on creating high-quality websites and content for ourselves and our clients, and sometimes it’s frustrating when you see lower-quality, less helpful results in the higher ranking positions. Core updates help to reshuffle the results and recognise the pages that should be ranking better than they are.
Losing rankings during a core update does not mean that a website has been penalised; it just means that the content on the page may now be deemed less relevant for the search query. Core updates help to maintain focus and to understand how Google now sees the intent behind search queries. It’s a great time to focus on increasing quality and on-page relevancy. SEO never stops!
Don’t automatically redirect users based on their IP
I’ve touched on this before, but it’s worth a post on its own as I still keep seeing it. Not only is automatically redirecting users to the version of your site you think they should be on very annoying, it can also create huge problems for search engines, as they may not be able to access all of your content.
For example, say I’m in the UK and I access a website that is set up for international traffic. The website automatically redirects me to the UK version of the site, maybe with pricing in GBP. This is what I probably want to see as I’m in the UK. Now, say someone in the US tries to access the UK website but gets automatically redirected back to the US version, with pricing in USD. It’s probably the correct version of the website for that user, but here’s where the problems begin.
Google’s crawler, Googlebot, mostly crawls from the US. It means that if Googlebot tries to access the UK website, it will automatically be redirected to the US version instead. In other words, Google cannot crawl, index or rank the UK website as it can’t see it.
It’s best to ask users which version of the website they would like to go to. Don’t assume that just because they are in a certain country they want the local variant or, even worse, that they speak the local language. So, always ask, never presume!
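As a rough sketch of the “ask, don’t redirect” approach (the framework and the country header here are illustrative assumptions; use whatever your own stack or geo-IP service provides):

```typescript
import express from "express";

const app = express();

// Sketch: detect the visitor's country but only *suggest* the local version
// via a banner, never force a redirect. That way Googlebot (crawling mostly
// from the US) can still reach every regional version of the site.
app.use((req, res, next) => {
  // "cf-ipcountry" is a country header set by some CDNs; substitute your own.
  const country = (req.headers["cf-ipcountry"] as string | undefined) ?? "unknown";
  res.locals.suggestedCountry = country;
  next();
});

app.get("/", (_req, res) => {
  // A real template would render a dismissible "Looking for our UK site?" link here.
  res.send(`International homepage. Suggested region: ${res.locals.suggestedCountry}`);
});

app.listen(3000);
```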
SEO – It’s not just about traffic. It’s about conversions
It’s interesting when we speak to potential new clients and we get onto the subject of the usability and conversion set-up of their websites. SEO has traditionally been thought of as optimising a website, such as titles, metas and headings, with some off-page work in the form of link building. In fact, talking to potential clients we often hear the phrase “I don’t really understand what you do”, as they see SEO as some sort of black box, a potentially underhand service that they can’t see.
At Artemis we are very much an open book, and we want our clients to understand what we are doing and why. It’s important that they are invested in the process, as it does take time to build up a reputation in order to increase rankings and traffic. However, our initial focus is also very much on usability and website conversions. Making a website generate more leads or sales is a much quicker process than the organic side of SEO, and this can create some early wins, especially if the website is not well set up to convert traffic.
We see CRO, Conversion Rate Optimisation, as an absolutely integral part of SEO. If clients experience an increase in sales without an increase in traffic early on then it starts to justify the investment in SEO and allows the organic side of things to build up over time. Traffic is not a good metric of success as not all traffic has the same value. Our approach is to always focus on targeted traffic whilst simultaneously working to improve conversions across the website. This has the added benefit that as the traffic grows it will convert at a higher rate. That way everyone is happy!
Pointing expired or other domains to your website – Is it OK?
This is a topic that we get asked about quite often. The SEO world is burdened by a terrible paranoia that the wrong move can destroy rankings and traffic in an instant: building the wrong types of links to your website, having low-quality content, or Google deciding that your website is not as helpful as you think it is. We’ve seen many types of spam techniques over the years, many of which still work today, but one that seems to be considered “spam” may not be spam at all.
Say you come across a competitor that has gone out of business and you think it would be a good idea to buy their domain name and repoint it to yours because they had some half decent rankings and some branded traffic. Is it wrong to do this?
The reality is that it’s not, and there is nothing wrong with “merging” two websites. In the real world it’s quite normal for one business to buy another and merge the two into one bigger business. The same is true online. The best approach is to merge the new domain on a page-level basis, so that you redirect each page to its nearest equivalent on your website, as this will maximise the effect of the merger. It’s best not to just redirect everything to your home page.
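As a minimal sketch of what a page-level merge can look like (the paths and destination URLs below are purely illustrative; the same mapping can just as easily be expressed as nginx rewrites or .htaccess rules):

```typescript
import express from "express";

const app = express();

// Hypothetical mapping of old-domain paths to their nearest equivalents on
// the destination site. Anything unmapped falls back to a relevant section
// page rather than dumping every visitor and crawler on the homepage.
const redirectMap: Record<string, string> = {
  "/services/loft-conversions": "https://www.example.com/loft-conversions",
  "/about-us": "https://www.example.com/about",
  "/blog/planning-permission-guide": "https://www.example.com/guides/planning-permission",
};

app.use((req, res) => {
  const target = redirectMap[req.path] ?? "https://www.example.com/services";
  res.redirect(301, target); // a permanent redirect passes the strongest signal
});

app.listen(3000);
```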
Where you do need to take care is when you want to repoint multiple domains to your website. This can trip a filter in Google’s algorithm, as it can appear unnatural. In reality, in the worst case Google may simply not apply any benefit from those domains to your website. Everything is fine in moderation, just don’t overdo it!
Bing AI Chat access and first thoughts
Over the weekend I finally got access to the new Bing AI Chat and what an impressive thing it really is. I spent several hours on Saturday playing around with it and came away feeling that this is the future, although it’s not quite the now. Let me explain…
I first accessed the internet as a student in 1992. Back then, the concept of “searching” was very new and the number of available websites was very small. As search engines back then were not very good, the search results were also very poor. It took a long time after that for the internet and searching in general to become what could essentially now be described as a habit. We automatically just go to Google when we want to know something.
Bing AI Chat is impressive, but it’s a very different way of interacting with a search engine and it will take time to adjust. Understanding how to prompt the AI is very important, and that’s a skill in itself which will need to be developed. With reportedly a million people on the wait list for access to Bing’s AI Chat, it shows that this still has some way to go before it is adopted by the masses.
We are entering a very new and exciting time, but this isn’t going to change search overnight…although it will happen as it becomes more normal. We’re creatures of habit and we’ve become accustomed to searching and trusting the results from search engines. That’s not going to change any time soon…but it will change and evolve eventually.
Core Web Vitals – It’s not always worth the effort
I’ve discussed Core Web Vitals a couple of times in my insights, and my advice has always been the same: don’t focus too much on it unless your website is really slow and problematic for users. The reason is that trying to achieve a good CWV score can be very difficult and very time-consuming, especially when most gains will be minimal.
I think what generally happens is that SEOs use CWV as a focus because it’s a measurable action, albeit one that may ultimately have no, or very little, impact on the performance of the website for users and in search. The reality is that for most small businesses it’s not an issue.
For a typical small, local business, there generally isn’t enough real-user data in Search Console for Google to report the CWV metrics for the website. This means that it can’t take these metrics into account for ranking purposes.
If you’ve got a website that’s enjoying #1 rankings for all of its target keywords then sure, spend some time trying to fine tune all of the CWV metrics, but otherwise, don’t lose too much sleep over it. It’s a sentiment that was recently shared by John Mueller from Google. There are much better things you could be spending your time on.
“SEO is dead”…it’s that time of year again
I’ve been optimising and marketing websites since 2003 and during that time it’s always been funny hearing the “SEO is dead” comments reverberating around the SEO community. As search evolved and became more complex, those who were unable to stay ahead of the game inevitably were the ones calling an end to SEO. And with the sudden and (expected) increase in AI activity online, those who can’t see a way forward are yet again declaring that this is the end of SEO.
We’ve been through some very significant moments where search has changed radically, from Google learning to understand content, rather than treating it as just a bunch of words, to adding featured snippets to the results, and so much more. Through each of these phases we’ve been able to embrace the changes, adapt and continue to grow traffic and sales for our clients.
We’ve spent a significant amount of time and money over the last few years understanding how AI is impacting search, so it’s what we’ve been expecting and working towards. Yes, search is going to change but it’s not going to happen overnight. AI tools are just not good enough yet, there have even been some interesting cases of Bing’s new AI chat getting a bit nasty with users. Search engines are trusted to deliver reliable results, and until these AI tools can be fully trusted (we are very far away from this), there will be a gradual integration of this functionality into the search results.
SEO is far from dead. Search engines and AI tools need content, they need websites and they need optimisation to understand what everything is. Those that can’t navigate this new era will exit the space and maybe the SEO world will end up with a higher quality level of SEO professionals. It’s going to definitely get harder, but we love a challenge!
Unconfirmed Google updates…what causes them?
This week, Google celebrated Valentine’s Day with an unconfirmed search algorithm update. Various tools which track changes in the search results reported a high level of volatility, although there was no confirmation from Google that an update was under way. It is quite common to notice an increase in the volatility of the search results while an update is rolling out, as Google only confirms an update once its effects start becoming noticeable.
However, what’s happening when an update is never confirmed by Google? The reality is that Google launches hundreds of updates to its search algorithms every year, practically on a daily basis. Many are relatively small and most won’t be noticed by users. A recent comment from Gary Illyes at Google was very interesting: he stated that very often, when public search monitoring tools report big changes in the search results, those changes don’t align with any significant updates that Google has made. In other words, Google doesn’t know why these tools would be reporting major changes in the results. They remain a mystery…even to Google!
Bounce rate as a ranking factor
I came across a typical discussion online yesterday regarding bounce rate. Bounce rate is the proportion of users that land on a page and leave again without visiting any other page on the website. There is a common myth in the SEO world that a high bounce rate is a bad thing and a negative ranking factor. Actually, it isn’t. Representatives from Google have commented on various occasions that Google does not look at bounce rates. In fact, when GA4, Google’s new analytics software, was launched, it didn’t even include bounce rate as a metric.
The simple rule of thumb regarding bounce rate is that it’s likely to be relative, rather than absolute. For example, if you are searching for a gift for your partner’s 20th birthday, it’s likely that you’ll click through to a search result, read the 20th birthday gift ideas and then leave. It’s unlikely that you’re going to be interested in gift ideas for other years. Therefore, for these types of results it’s fine, and expected, that users will bounce back to the search results, causing a high bounce rate figure. If, however, you are looking for a local surveyor, you may spend more time on a website understanding the services provided and the surveyors themselves. In this case you wouldn’t want a high bounce rate, and it shouldn’t be expected.
Therefore, think of bounce rate as relative. If it’s high, it may be absolutely normal.
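As a rough sketch of how the metric itself is calculated, and why it only makes sense relative to the type of landing page, here is a small example with entirely made-up session data:

```typescript
// Hypothetical sessions: each records the landing page and how many pages
// were viewed in total during the visit.
interface Session {
  landingPage: string;
  pageViews: number;
}

const sessions: Session[] = [
  { landingPage: "/20th-birthday-gift-ideas", pageViews: 1 },
  { landingPage: "/20th-birthday-gift-ideas", pageViews: 1 },
  { landingPage: "/20th-birthday-gift-ideas", pageViews: 2 },
  { landingPage: "/surveying-services", pageViews: 1 },
  { landingPage: "/surveying-services", pageViews: 4 },
  { landingPage: "/surveying-services", pageViews: 3 },
];

// Bounce rate per landing page: single-page sessions divided by total sessions.
function bounceRates(data: Session[]): Record<string, number> {
  const totals: Record<string, { sessions: number; bounces: number }> = {};
  for (const s of data) {
    const entry = (totals[s.landingPage] ??= { sessions: 0, bounces: 0 });
    entry.sessions += 1;
    if (s.pageViews === 1) entry.bounces += 1;
  }
  return Object.fromEntries(
    Object.entries(totals).map(([page, t]) => [page, t.bounces / t.sessions])
  );
}

console.log(bounceRates(sessions));
// The gift ideas page bounces roughly 67% of the time, the services page
// roughly 33%, and both can be perfectly normal for their page type.
```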
How often do you check your crawl stats?
Google Search Console is a great tool, full of useful data to help you understand how your website is performing in search and any issues Google finds that may be holding it back, such as errors and broken schema markup. However, there is one feature that is a little tucked away and doesn’t seem to get much coverage: the Crawl Stats report. It’s under the Settings section in Search Console, which is a little odd, as you can’t actually set anything for crawl stats!
We recently had an issue with a website where new content was not being indexed and some of the old pages were being deindexed. After checking that the content was indexable, linked to internally and not blocking search engines, and finding that everything was fine, we discovered that the answer lay in the Crawl Stats report. A recent increase in the “Average Response Time” had decreased the “Total Crawl Requests”. In other words, something was causing Google to take longer to crawl the pages of the website, so it started crawling less often.
We managed to find the cause of the issue and fix it, and crawling and indexing were restored. New content started getting indexed again. It’s good to keep an eye on the Crawl Stats report, as it can highlight an issue which other reports in Search Console will never expose. Google needs to be able to crawl a website efficiently, as fast as possible. It’s a good idea not to make things difficult for Googlebot!
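If you want to spot this sort of thing before Search Console surfaces it, the same signal can be approximated from your own access logs. A rough sketch, assuming purely for illustration a log format where the response time in milliseconds is the last field on each line; adjust the parsing to your server’s actual format:

```typescript
import { readFileSync } from "node:fs";

// Average response time for Googlebot requests, assuming the response time
// in milliseconds is the final whitespace-separated field of each log line.
function googlebotAvgResponseTime(logPath: string): number {
  const lines = readFileSync(logPath, "utf8").split("\n");
  let total = 0;
  let count = 0;

  for (const line of lines) {
    if (!line.includes("Googlebot")) continue;
    const fields = line.trim().split(/\s+/);
    const ms = Number(fields[fields.length - 1]);
    if (!Number.isNaN(ms)) {
      total += ms;
      count += 1;
    }
  }
  return count > 0 ? total / count : 0;
}

// A sustained rise here tends to precede a drop in "Total Crawl Requests".
console.log(`Googlebot average response time: ${googlebotAvgResponseTime("access.log").toFixed(0)} ms`);
```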
JavaScript is great…but not always for SEO
We recently received a new enquiry from a company that had experienced a big loss in traffic following the launch of their new website. We often see this anyway, as many website migrations are not handled properly, with not enough consideration given to the effects on rankings in search. However, this recent enquiry had a key issue which we have seen a few times previously and which seems to be becoming more prevalent: the increased use of JavaScript to load the main content on a page.
Whereas a “normal” HTML website has all of the code processed at the server and sent to the browser in its final form, a website that relies on JavaScript only sends the HTML shell to the browser, with the main content then rendered by the browser via JavaScript. This is a two-step process, and search engines handle it in the same way: they process the raw HTML first and render the JavaScript later. However, many JavaScript implementations are not SEO-friendly and cause serious crawling and indexing issues.
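A quick way to see what that first, pre-rendering step looks like is to fetch the raw HTML and check whether your important content is actually in it. A minimal sketch, with a placeholder URL and phrase (Node 18+ for the built-in fetch):

```typescript
// Fetch the raw HTML of a page, i.e. what a crawler sees before any
// JavaScript runs, and check whether a key phrase from the page is present.
async function isContentInRawHtml(url: string, phrase: string): Promise<boolean> {
  const response = await fetch(url);
  const html = await response.text();
  return html.toLowerCase().includes(phrase.toLowerCase());
}

isContentInRawHtml("https://www.example.com/services", "loft conversions in sussex")
  .then((found) =>
    console.log(
      found
        ? "Phrase is in the raw HTML: indexable without rendering."
        : "Phrase only appears after JavaScript runs: rendering is required."
    )
  )
  .catch(console.error);
```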
With a recent client we replaced the JavaScript with HTML and the rankings recovered. It’s no coincidence. If you are planning on using JavaScript on your web pages, it’s still best to use it only for the dynamic features necessary for your users to carry out certain interactions with the website, not for loading the content.
The search category is about fair use
There was a very interesting interview with Microsoft CEO Satya Nadella published on The Verge, discussing the integration of AI into Microsoft products, including the Bing search engine and the Edge browser. Bing currently has a very small share of the search market, approximately 4% in the UK and 6% in the US. Microsoft seem quite confident that this could be their moment: they’ve beaten Google to launching a very innovative and well-integrated AI product in their search engine and browser. They really have done a very good job with it.
The interview with Satya Nadella is interesting because in it he discusses the importance of the new AI chat feature in sending traffic back to websites: “The search category is about fair use so that we can generate traffic back to publishers….Our bots are not going to be allowed to crawl search if we are not driving traffic.” According to Nadella, Microsoft will be monitoring the traffic being sent out to websites, and it will be interesting to see how, over time, the level of traffic to websites is impacted by the new chat tools. Ultimately, if publishers don’t get the traffic then they won’t invest in new content. The AI needs content to learn from, publishers need money, and Microsoft needs to keep things relevant and display ads. Interesting times ahead!
Google’s view on the use of AI for content
Yesterday Google published a blog post clarifying, to some extent, its view on the use of AI-generated content for websites. The short summary is that, from Google’s point of view, AI is used in many applications online, such as weather predictions, sports scores and transcripts, so it may be fine to use AI, but not for the sole purpose of ranking in Google’s search results. The post goes on to say that Google has had systems in place for some time now to detect content that is “not helpful” and not unique enough, so it’s up to you if you want to use AI tools to generate content. You just may not get any ranking benefit from it.
I think, ultimately, Google is unable to completely rule out AI content in the search results. It uses AI in its own algorithms, so it would be hypocritical to say that no one should use it for their own benefit. What it comes down to is the quality of the content and whether it fits in with Google’s E-E-A-T guidelines. Experience, expertise and knowing who’s written the content are going to become very important. There is a great summary of this here. Use AI to help you, but don’t depend on it. Google has been fighting low-quality content for a long time and it knows what it’s looking for.
Bing lays down the AI gauntlet
Bing has officially announced that its integration of OpenAI into its products is now available to the public, although currently limited and rolling out gradually. This was expected as they wanted to beat Google to be the first major search engine to integrate AI generated responses into the search results.
This is a major and significant change to the types of search results that we have become accustomed to over the years. Is this going to change how people search and interact with the search results? Of course, but not for all types of search queries.
If you’re looking for a local plumber, you’re unlikely to ask the AI. You’ll want to see options, reviews, images, etc. However, if you want to plan a trip somewhere or want some gift ideas for your anniversary, those are the types of searches that will likely see an impact from the AI results. This could cause websites to receive less organic traffic.
But, is this a feature which ultimately suffers the same fate as voice search? We all have a voice search enabled device in our pockets, but the take up has been very slow. We’ll need to see how AI results develop but Bing have certainly started the ball rolling now…and it’s rolling fast.
Google and Bing AI is coming
There doesn’t seem to be a day that goes by at the moment without some sort of announcement about AI chatbot tools and search engines. Google CEO Sundar Pichai yesterday blogged about how Google will be bringing an AI chatbot feature, called Bard, to its search results. The example given in the blog post shows that the resulting text does not have any citations, and this could cause content creators to respond angrily, as the AI is using human-created content to generate its responses without any attribution to the original sources.
A user on Twitter recently found a “chat” option when he went to use Bing, and this appears to be how Bing may ultimately integrate ChatGPT into the search engine. Bing is offering the chat feature as a separate search function, although it does cite its sources, which is a positive over Google’s integration.
This new era of search is still in its infancy and we are going to be seeing a lot more announcements and changes over the coming weeks and months. Stay tuned!
Google’s AI is coming
Google has been using AI to refine the search results, tackle spam, interpret images, etc., for quite some time now, but the applications for users have been limited. With the serious level of interest and usage that ChatGPT has generated over the last couple of months, has Google been forced into releasing its AI chatbot equivalent sooner than anticipated?
It would seem so. In yesterday’s Q4 earnings call, Google CEO, Sundar Pichai, said that “In the coming weeks and months, we’ll make these language models available, starting with LaMDA, so that people can engage directly with them”. It won’t be long before we see a ChatGPT equivalent in place within Google search, although it’s likely to be a secondary option instead of part of the main search results.
Some initial insider comments seem to suggest that Google may replace the long-standing “I’m feeling lucky” button with a new button to access their chatbot. We will have to wait and see but this is shaping up to be quite the year! Things are moving very fast.
The importance of authorship
If there’s one area of SEO that’s going to become even more important in 2023, it’s authorship. We are coming into an era where the internet is going to be flooded with AI generated content, and this content is not the same as content written by real people. Search engines are more likely to reward content with higher rankings if they can trust the source. They have to know who the author is. If the author is a known person, known to Google as someone who has expertise in a certain field, then it’s more likely that this content will rank above “anonymous” content.
Google itself has just created an authors page for the Search Central Blog. Note how each author lists their own social media accounts and the posts that they have written. This is what Google is expecting to see going forward for websites. It’s part of its E-E-A-T philosophy: Experience, Expertise, Authoritativeness and Trustworthiness. Gone are the days when you could just write an article and expect it to rank. You have to build your own personal profile as an expert in your field and make sure search engines can associate your content with you.
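One practical way to make that association explicit is structured data. Here’s a minimal sketch of schema.org Article markup with author details (the name, URLs and profiles are placeholders):

```typescript
// Sketch: generate a JSON-LD block tying an article to a named author,
// including "sameAs" links to the author's public profiles.
const articleMarkup = {
  "@context": "https://schema.org",
  "@type": "Article",
  headline: "The importance of authorship",
  author: {
    "@type": "Person",
    name: "Justin Example", // placeholder author name
    url: "https://www.example.com/authors/justin",
    jobTitle: "Technical Director",
    sameAs: [
      "https://www.linkedin.com/in/justin-example",
      "https://twitter.com/justinexample",
    ],
  },
};

// Embed this in the page so search engines can associate the content with its author.
const jsonLdTag = `<script type="application/ld+json">${JSON.stringify(articleMarkup)}</script>`;
console.log(jsonLdTag);
```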
Is link building bad?
There was an interesting discussion on Twitter yesterday on the subject of backlinks and disavowing low quality links to a website. At Artemis we haven’t disavowed links for clients for several years now. Google discourages using the disavow file as you may unwittingly tell Google to ignore links that are actually helping your website rank. Google is much better now at determining which links should and shouldn’t count towards rankings, so there’s no need to disavow anything anymore, not unless you’ve been hit with a manual penalty.
In yesterday’s conversation, John Mueller from Google said that “these agencies (both creating and those disavowing) are just making stuff up”. That’s quite a statement. On disavowing, I agree with him, but not when it comes to creating links. A fundamental part of Google’s algorithm (PageRank) relies on links to understand the reputation and relevancy of a website. Without links it’s very difficult to get a page to rank, unless it’s for an ultra-low competition search term. If you don’t make an active effort to get links to your website, it’s never really going to make much of an impact in search.
Of course, getting low-quality links is pointless, but actively working on gaining high-quality links is absolutely key for SEO. The proof is in the pudding: no links generally means no ranking improvements, regardless of the quality of the content on the pages. Link building isn’t bad, low-quality link building is (pointless).
Rankings “may go nuts” with a site redesign
We get enquiries all of the time from businesses where the migration to a new website has not gone well and they’ve lost rankings, traffic and enquiries. It’s unfortunately all too common. Search engine rankings are delicate things, and when migrating to a new website the SEO aspect is often not taken into account. Sometimes important redirects are missed when URLs are changed, sometimes the structure of the menu is changed and other times the entire content is changed.
All of these changes can be disastrous for rankings if not executed well. We’ve seen it all too often. However, even if the URLs stay the same, the content stays the same and the crawl paths through the website (how search engines navigate from one page to another) also stay the same, just changing the design of the website, and the associated HTML code, can still impact rankings.
Google engineer, Gary Illyes posted on LinkedIn yesterday saying “when you redesign a site, its rankings in search engines may go nuts”. The HTML markup is what search engines use to make sense of the pages, such as the order of the content, the headings, etc. If you change this, it will impact how search engines see the page and it can affect the rankings.
Migrations need to be handled carefully; even small changes can have major consequences. Proceed with care!
Yandex Code Leak
It’s been an interesting weekend with the news of the Yandex code leak. Many will never have heard of Yandex, but it is essentially the Google of Russia. Although Google has a presence in Russia, Yandex is bigger and more widely used there than Google. The leak, not a hack according to Yandex, was allegedly carried out by an ex-employee. The code is complete and covers all of Yandex’s products, including its search engine.
SEOs have been very excited over the last couple of days, ploughing through the 1,922 ranking factors listed in the leaked documents. It is important to note that many of the factors may be negative ranking factors. Although Yandex is not Google, and the way the two search engines rank websites will differ, there will inevitably be some crossover between the two. It’s worth having a look through the list, but appreciate that how the factors are applied isn’t clear. The list contains some big surprises, although one shouldn’t surprise anyone: Ukrainian websites automatically have a negative ranking signal applied. Sigh!