Learnings from Google’s January 2020 Core Algorithm Update

Google had a busy January, making significant changes to its algorithm and search results. As large updates are usually not rolled out in the run-up to Christmas, January is often a month of considerable movement in the rankings. Here are the three main changes:

Core Algorithm Update

Google makes regular updates to its core algorithm but the one in January was quite a large one, resulting in significant changes in search results across the board.

Google never releases details of what a core update is focused on, but this one has benefited Artemis clients' websites very positively. Our initial assessment is that the update focuses on content quality: better rewarding content that deserves to rank higher due to its relevancy, accuracy and presentation, even where the website has lower PageRank authority than others targeting the same key phrases.

New Desktop Search Results Appearance

In 2019 Google changed the look and feel of the mobile search results to include a favicon for each listing and to place the web address (brand) above the page title.

In January Google copied this same set-up across to the desktop search results. The layout may work well on mobile screens, but on desktop it does not work well at all.

 

The new desktop results were so poorly received that the backlash prompted Google to make the unprecedented move of admitting that this may not have been the best update to the interface, and that it would begin to experiment with further alternatives.

We have already seen the favicons removed from the desktop search results, and we can expect further changes over the coming weeks.

Featured Snippets

A featured snippet is presented in the search results when Google’s algorithm believes that an enhanced result may be very useful to the user based on the search query entered.

Ever since the feature was introduced, a website could enjoy two listings on the first page of the search results: the featured snippet and the organic search result itself. For example:

 

This double listing was often deemed unfair, and in January Google changed the set-up so that if a website holds the featured snippet, the same page no longer appears again in the organic search results.

This change has advantages and disadvantages. If a page held both the featured snippet and the first organic position, the update results in a loss of traffic, as there is now only one opportunity to click through to that page. However, for websites that rank low on the first page of the results, winning the featured snippet is a way of getting to the top of the search results without needing to be first organically. In that scenario traffic can only increase.

 

As with all changes there are always winners and losers. The latest changes have been very positive for Artemis’ clients and it is a reflection of our constant attention to detail, optimisation refinement and very high-quality link building activities. We expect the Artemis approach to SEO will continue to have an increasingly positive effect on business for all of our clients throughout 2020.

If you would like to learn more about Google’s updates and how Artemis can help your business stay ahead of them, please get in touch with our experienced team today.


Google introduces new speed report in GSC

Google introduces new Speed report in GSC

We have recently seen a couple of significant algorithm updates which were not publicly announced or confirmed by Google. It is important to note that Google applies updates to its algorithm on a daily basis, and most go relatively unnoticed. Sometimes, however, a particular update can noticeably shift the rankings for certain keywords.

Apart from those unconfirmed updates, the most interesting news last month was the introduction of the new Speed report in Google Search Console (GSC). Google has been testing this new tool for the past six months, and it is now available for all websites.

It is listed under the Enhancements section but it is still in an experimental stage so it will likely change and improve over time.

Google Search Console speed report

 

There are two reports available, one for mobile and one for desktop.

The results are then broken down into three categories, based on metrics such as First Contentful Paint (FCP):

  • Slow (e.g. FCP longer than 3 seconds)
  • Moderate (e.g. FCP longer than 1 second)
  • Fast (below both thresholds)

Each category then lists the specific URLs that fall within it.
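As a rough illustration of how this bucketing works, the sketch below assigns URLs to the three categories using the First Contentful Paint thresholds described above. The function name, field names and sample data are all illustrative, not Google's actual implementation.

```python
# Hypothetical sketch of the Speed report's bucketing, using FCP only.

def classify_speed(fcp_seconds):
    """Bucket a page by First Contentful Paint, per the report's categories."""
    if fcp_seconds >= 3.0:
        return "Slow"
    if fcp_seconds >= 1.0:
        return "Moderate"
    return "Fast"

# Example data: URL -> measured FCP in seconds (invented values).
pages = {"/": 0.8, "/blog/post-1": 1.9, "/contact": 3.4}
buckets = {url: classify_speed(fcp) for url, fcp in pages.items()}
print(buckets)  # {'/': 'Fast', '/blog/post-1': 'Moderate', '/contact': 'Slow'}
```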

GSC speed report example

 

Interestingly, it groups similar URLs together, recognising that similar pages (blog pages, for example) may share a common cause of a speed issue.
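A minimal sketch of that kind of grouping is shown below, clustering URLs by their first path segment. This is our own simplification for illustration; Google's actual grouping logic is not public.

```python
from collections import defaultdict
from urllib.parse import urlparse

# Group URLs by their first path segment, e.g. all /blog/... pages together.
def group_urls(urls):
    groups = defaultdict(list)
    for url in urls:
        path = urlparse(url).path
        segment = path.strip("/").split("/")[0] or "(root)"
        groups[segment].append(url)
    return dict(groups)

urls = [
    "https://example.com/blog/post-1",
    "https://example.com/blog/post-2",
    "https://example.com/contact",
]
print(group_urls(urls))
# {'blog': ['https://example.com/blog/post-1', 'https://example.com/blog/post-2'],
#  'contact': ['https://example.com/contact']}
```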

Why is this important?

Speed metrics have long been available in Google Analytics, but the introduction of these reports in GSC is a further reflection of the growing importance of page load speed for ranking purposes.

However, fast-loading pages are not just beneficial from a ranking perspective: users are less likely to get frustrated with a website if its pages load quickly, which results in an increase in enquiries. Website visitors have little patience and often little loyalty, so improving the user experience is vital for good conversions.

SEO is a continuous process of refining pages, content, optimisation and authority whilst remaining relevant and useful for a given search query. A significant part of that process is now the ongoing task of refining pages to improve load speeds.

We have known for some time now that page load speed is a ranking factor for mobile results and the significance of it will increase over time. Working towards fast loading pages to create a great user experience is a key SEO factor going into 2020.

 

At Artemis, our team of highly experienced SEO professionals keeps up to date with all of the changes Google rolls out so that our clients don't have to. If you are interested in learning more about what we can offer, get in contact with our team today.


Google BERT

How Google’s BERT update is affecting search results

Google BERT update blog image

Google launched its machine learning artificial intelligence system, RankBrain, in April 2015 to further improve the ranking algorithm for its search results, but didn't announce it until October of the same year.

During that time no one had noticed any difference or that anything was refining the search results in the background.

Fast forward four years and we’re seeing the same effect with Google’s latest and most significant update since RankBrain: BERT.

Google seems to have a habit of giving its updates silly names but BERT is a highly significant and advanced addition to Google’s ranking algorithm.

BERT stands for “Bidirectional Encoder Representations from Transformers” and is an AI based natural language processing (NLP) system. The system enables Google to better understand the relationship of all words in a sentence and, therefore, better understand what the searcher is ultimately trying to find.

Understanding the relationship of words in a sentence sounds like a simple task, but it is actually incredibly complex for computer algorithms to process accurately. The technology behind BERT is so advanced that it has also required a change in hardware to handle the complex calculations and processes, which were pushing the existing hardware to its limits.

How BERT improves search

BERT is able to determine the intent behind a search query by understanding the relative significance of the words before and after each word in a search query. Here is an example from Google of BERT in action:

Before and After BERT image

“In the past, a query like this would confuse our systems–we placed too much importance on the word “curb” and ignored the word “no”, not understanding how critical that word was to appropriately responding to this query. So we’d return results for parking on a hill with a curb!”
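A toy example makes it clear why older keyword-based matching struggles here: if "no" is treated as a stop word and discarded, the two opposite queries become indistinguishable. This is our own simplified illustration, not Google's actual pipeline.

```python
# Naive keyword matching that discards common "stop words".
STOP_WORDS = {"a", "on", "with", "no", "the"}

def keywords(query):
    return {w for w in query.lower().split() if w not in STOP_WORDS}

q1 = "parking on a hill with no curb"
q2 = "parking on a hill with a curb"

# Both queries reduce to the same keyword set, so a keyword-based system
# cannot tell them apart; the critical word "no" has been thrown away.
print(keywords(q1) == keywords(q2))  # True
```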

For more examples see Google’s news post about BERT.

When you look at the examples of the change in results with BERT's input, you'll see that the results are now completely different. This isn't like a regular Google update, where results shuffle around a little; with BERT, the results returned for these search queries change entirely.

This is why BERT is the most significant change to Google’s algorithm since the introduction of RankBrain.

BERT isn't refining the search results; it's completely changing them to make them more relevant and accurate.

Key takeaways

BERT uses machine learning, which means it will continue to improve over time, resulting in an increasingly intelligent language system and search results that become ever more accurate and relevant.

However, note that BERT will mainly impact longer tail search phrases where the relationship between the words is vital in order to return the correct results. For short phrases it’s likely to have very little impact.

Google estimates that BERT will have an impact on 10% of all search queries and at the moment it is only impacting English queries in the US. It will begin to roll out to other countries and languages over the coming weeks and months.

BERT is also being applied to featured snippets, as so many of these are generated from longer tail search queries. As a result, they should become much more relevant than they are today.

On a final note, it's important to understand that there really isn't anything you can do to optimise for BERT, apart from having relevant content for a given search query. If a page was previously ranking top for a specific search and with BERT it no longer is, it's probably because Google was previously showing the wrong results.

We welcome BERT and the positive impact it will have on the quality of long tail search results. The impact may not be obvious to most searchers, or even to SEOs, but it's Google's biggest update in over four years and a very exciting one at that.


Man searching on smartphone

Google’s Core Algorithm Update


In March there were a couple of significant announcements made by Google which are worth discussing in more detail.

Core Algorithm Update

Although Google makes daily changes to its algorithm, most of these changes generally have a very low impact on search results. Most people would never even notice the effects of these updates.

However, this month Google confirmed that it has made a significant adjustment to its core algorithm. It’s rare for Google to confirm that a core update has taken place, but the result of the update on the 8th/9th of March had a significant effect on some websites and it seems Google saw the need to confirm that a “broad core update” had been rolled out.

What’s interesting about this update, compared to previous confirmed updates, is this public comment from Google:

Google SearchLiaison March update tweet

And then followed up by this one:

Google SearchLiaison March update tweet follow up

“There’s nothing wrong with pages that may now perform less well” and “benefiting pages that were previously under-rewarded” and “There’s no fix” and “Over time, it may be that your content may rise relative to other pages”.

So what does this mean?

Historically, updates such as Panda and Penguin have always demoted websites with low quality content or toxic backlinks. However, this core update appears to be working the opposite way and is instead rewarding websites that deserve to rank better, hence the “no fix” comment.

However, the sentence of most significance is probably this one:

"Over time, it may be that your content may rise relative to other pages"

Over time! In other words, there is “no quick fix”.

What this update appears to have done is adapt the core algorithm in line with the artificial intelligence (AI) data that Google has been accumulating exponentially over the last couple of years, since the integration of "RankBrain" into its ranking algorithm.

Google's AI is powerfully analysing enormous datasets of user behaviour, and the insights from this data have enabled the core algorithm to be adapted to better focus on useful, relevant, accurate, engaging and valuable content. Rewarding engaging content is what this update appears to be focused on.

The reason there is no quick fix is that, in order to improve rankings, it is necessary to satisfy Google's AI that a page is better, more relevant and more engaging than the others in the search results. It takes time for this data to accumulate and for the benefit to be applied to the page.

Therefore, the focus has to be, as Google puts it:

“to remain focused on building great content”

Apart from that, some additional insights from this update include the following:

• Avoid competing pages on a website
• Avoid poor internal website structure
• Ensure pages are engaging
• Ensure pages are completely relevant to the search query
• Ensure pages have accurate information and data
• Ensure pages have “high value”
• Ensure pages load quickly
• Ensure pages are considerably better than the competition

There may not be a quick fix, but there’s always a fix.

Mobile-First Index Rolling Out

At the end of March, Google announced that after 18 months of testing, the mobile-first index is now slowly being rolled out to all websites.

A notification will appear in Google Search Console when a website is migrated to the new index, as such:

Mobile First Index Google Search Console Notice

The move to a mobile-first index is in line with the majority of searches now being made on smartphone devices.

Google's intention is for this update not to affect rankings, hence the slow rollout, and for most responsive websites that will be the case. However, mobile websites that offer search engine crawlers a different crawl path, or different content compared with the desktop version, may see a negative effect from this change.

We have worked with our clients for many months to ensure there is no impact from this change and we continue to monitor results and the transition to the mobile-first index.

We will notify clients when their websites are added to the mobile-first index and will monitor their subsequent progress in search as a result of this. We don’t anticipate any impact in rankings for our clients.


Mobile First SEO

Top 5 SEO trends in 2017

What an interesting year in SEO 2017 is already proving to be! So far we’ve seen a lot of changes.

From the jokingly named Google Fred update to the increased dominance of local and personalised search, to our faster than ever push into a mobile-only world. Then there’s the speed of voice search adoption.

But there’s much more coming.

Here are my Top 5 trends to watch for the remainder of 2017. All are interconnected and cannot be viewed in isolation. Nothing in SEO operates in its own separate silo.

Mobile First SEO

 

AI and RankBrain

Google’s RankBrain and algorithmic machine learning continues to dominate.

Ever since the Hummingbird update, Google's emphasis on semantic search has been never-ending, evolving at a tremendous pace.

Google even took the unusual step of confirming, in 2016, that RankBrain was the third most important ranking factor after links and content. Its importance has only increased throughout the latter half of 2016 and into 2017.

Having moved on from its days of poetry and reading romantic novels, Google’s AI technology is getting better by the day.

It’s very hard to optimise for RankBrain.

It’s so all-encompassing and fast-moving that only true quality will dominate SERPs (search engine results pages). Which is great.

UX (user experience), CTRs (click-through rates), aiming for the ‘long click’ and the resulting engagement metrics should be high on your watch list.

The increasing importance of personal branding

The web is about people. It’s about us.

So that means having an outstanding About Us page, a description of who you are, and a statement on just what makes you stand out from the competition. These are essential.

You need to build a personal brand as a core strategy for SEO. To establish trust.

Pictures and especially videos will be a central focus for Google for the remainder of 2017 – and well into the future. Having a team video and/or personalised photographs is no longer a choice, it’s a necessity these days.

If you show yourself as an approachable and friendly person, visitors will trust you much more readily. This will drive ever more traffic and conversions to your website.

Even social media platforms such as Facebook have been honing their algorithms in favour of personal posts (as opposed to brand posts). 

In the future more businesses will choose the personal approach to gain success.

User Experience Optimisation (UEO) and Conversion Rate Optimisation (CRO)

To a varying degree, user experience has always been important to SEO. Google ranks sites that are properly set up for mobile devices, that load quickly and where users spend a long time on a page.

This year we will likely see even more focus on user experience, especially on mobile devices. So focus on the traffic you already have to offer people much more than they expect.

Page depth, time-on-site, CTRs, and pogo-sticking are all things to work on.

If you offer true value you will notice the difference and soon know the full benefits of your efforts.

Personal digital assistants will become more sophisticated

Thanks to personal digital assistants the opportunity for new types of search and more advanced forms of conversational queries is huge.

Excellent tools such as Cortana and Siri have enhanced our user experience, made our lives easier and massively increased the number of verbal searches and enquiries.

For the rest of the year, we’ll see these tools become even more smoothly polished and capable of offering even more useful features. And that means excellent new ranking opportunities that have to be brought into play.

Voice search has the potential to really shake up the SEO industry.

The need for speed: a fast-loading user experience

It’s no secret that speed really matters.

Research has shown that slower-loading web pages are associated with higher bounce rates, and that up to 40 per cent of visitors are likely to abandon your site if it takes longer than three seconds to load.

Speed will be of even more importance in the coming year. AMP (Accelerated Mobile Pages) pages help and will be of increasing importance in the future.

There are many other interesting technologies on the horizon as well; HTTP/2 and Google's new open source JPEG encoder, Guetzli, are just two to keep a keen eye on.

In conclusion

Knowing who your customer is and what they want is the big change this year. Not just with SEO, it’s where the entire digital strategy will be directed.

You need to meet, match and exceed searchers' expectations. To achieve this you have to understand your target audience better than ever before.

Google’s aim is to provide the most relevant website to the search entered.

Going big on word count is not working as well as it used to, not when short videos and images can be so much more attractive. Done well, they can deliver what you want to say and what customers want to know much quicker.

So, keep it simple! Give users what they want, let the search engines do their job – and it will all fall into place.

In 2017 it’s time to focus on providing true value.


Panda Penguin Possum

Panda, Penguin, Possum – What’s what?


September was a bit of a manic month for SEO, with the new Google update named Possum affecting the local search results and Penguin going real-time in the algorithm. In this blog, we will go over three of the most important recent updates in the SEO industry and establish what they are doing to your search rankings.

Panda

Panda was first launched in February 2011 (hitting Europe around March 2011). It was one of the biggest and most significant Google updates to hit, with up to 12% of Google search results being affected.

The Panda update was introduced to stop websites with poor quality content from ranking at the top of the Google search results. It forced webmasters to focus on quality rather than quantity.

How to avoid tripping the Panda filter

  • Don’t use duplicate content – Using content that already appears on another website is one of the ways you can get hit by the Panda algorithm. Google will only rank the original source, so this will have no benefit to your website.
  • Don’t write thin content – We often see thin content on websites; this is when a page is only a couple of sentences long. This is unlikely to rank and will only end up causing you issues.
  • Ensure the content is in the website's native language – Outsourcing content can seem like a quick and easy way to get content up on your website, but remember, well written content isn't cheap. There are plenty of websites out there that offer content for £5 or less, but the chances are it will be written by people whose native language isn't English. Look for writers based locally to you who can provide you with a better service.
  • Write quality content – Really it all comes down to this: write good quality content for the user about topics that provide value and information to the people who visit your site.
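The first two points above can be illustrated with a simple content audit sketch. The 100-word threshold and the use of an exact-hash comparison are our own arbitrary simplifications; real quality assessment (and Google's duplicate detection) is far more nuanced.

```python
import hashlib

# Flag two Panda-style issues: "thin" pages (very low word counts)
# and exact duplicate content across pages.
def audit_pages(pages, min_words=100):
    seen = {}    # content hash -> first URL that used it
    issues = {}  # URL -> list of problems found
    for url, text in pages.items():
        problems = []
        if len(text.split()) < min_words:
            problems.append("thin")
        digest = hashlib.sha256(text.encode()).hexdigest()
        if digest in seen:
            problems.append(f"duplicate of {seen[digest]}")
        else:
            seen[digest] = url
        if problems:
            issues[url] = problems
    return issues

pages = {"/a": "short text", "/b": "short text"}
print(audit_pages(pages))
```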

Penguin

Google launched the Penguin update in April 2012. It affected around 3.1% of search queries. There have been multiple updates since then and it is now part of the core algorithm. It was created to lower the rankings of sites that were using tactics such as creating spammy backlinks, buying links and using private blog networks.

Google introduced the disavow tool in October 2012, which allowed webmasters to tell Google which links to ignore. So if you had undertaken any unnatural link building, you could essentially say sorry to Google and upload a list of the bad links. For a site hit by Penguin, this only took effect when the update refreshed, and until recently the last refresh was over two years ago, so many webmasters were waiting a long time for it to take effect.

In September of this year, Google announced that Penguin is now part of the core ranking algorithm and that it wouldn't be announcing any more Penguin updates.

What does this mean for webmasters and SEOs?

Google will now automatically discount low quality and spammy backlinks without a disavow file in place (although they are saying you should still disavow bad links).
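If you do still maintain a disavow file, it is a plain text file with one entry per line: either a full URL or a `domain:` prefix to disavow a whole domain, with `#` lines as comments. The helper below is a small sketch of building one; the example domains and URLs are invented.

```python
# Build a disavow file in the plain-text format Google's disavow tool accepts.
def build_disavow(domains, urls):
    lines = ["# Links identified as spammy during our last audit"]
    lines += [f"domain:{d}" for d in sorted(domains)]   # disavow whole domains
    lines += sorted(urls)                               # disavow individual URLs
    return "\n".join(lines) + "\n"

text = build_disavow(
    domains={"spammy-directory.example"},
    urls={"http://forum.example/profile?id=123"},
)
print(text)
```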

Any new links built to a website will now have a near immediate effect on a website’s ranking; no more waiting a few weeks for the algorithms to refresh!

Possum

Possum is a name given by the SEO community to the update at the beginning of September 2016 that affected local SEO.

The Possum update changed the boundaries, allowing businesses based outside a town or city to rank for local terms there. It also started to limit the number of results shown per address. For example, if you are a solicitor with several different departments verified in Google My Business at the same address, Google may choose to show only one of them in the results.

It is also limiting the way virtual addresses are shown, as many businesses use the same address to try to gain exposure in a location where they don't have a physical office. The good thing about the new update is that there is less need to try to trick Google, as it can now show you in the results for a town close to your office.
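The per-address filtering can be pictured with the toy sketch below: keep only the first (best-ranked) listing for each street address. The ranking order, field names and example businesses are all invented for illustration.

```python
from collections import OrderedDict

# Keep only one listing per street address, assuming the input list is
# already sorted best-ranked first.
def filter_by_address(listings):
    kept = OrderedDict()
    for listing in listings:
        kept.setdefault(listing["address"], listing)  # first one per address wins
    return list(kept.values())

listings = [
    {"name": "Smith & Co Family Law", "address": "1 High St"},
    {"name": "Smith & Co Conveyancing", "address": "1 High St"},
    {"name": "Jones Solicitors", "address": "9 Mill Lane"},
]
print(filter_by_address(listings))  # two results: one per address
```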

If you need any more information, or have been affected by Panda, Penguin or Possum, then give us a call for a chat – 01444 645018


Penguin real time

Penguin Is Finally Here And It’s Real…Time

Penguin real time

After a very long two years, Google has finally released its latest version of the Penguin algorithm but this time it’s a little different, as announced on the Webmaster Central Blog today.

Traditionally, major updates such as Panda and Penguin have been run in isolation, separate from Google's main algorithm, often resulting in quite significant impacts on the search results. Websites affected by these algorithms would remain suppressed in the search results until the algorithms were refreshed and the sites were found to adhere to Google's Webmaster Guidelines.

Last year Panda was integrated into the main algorithm and now Penguin has been too, meaning it runs in real-time. The main benefit of this is that a website affected by Penguin will not need to wait another two years before it can recover; it can recover as soon as the spam is fixed and Google detects that that is the case during its normal crawl cycle.

But, by far the biggest change to Penguin is on its actual impact on a website.

With previous iterations, a website flagged by Penguin as having a spam backlink profile would cause the entire website to lose rankings, even if the links were all pointing to one page or were all focused on a specific key phrase.

Today, Penguin appears to be more granular.  This is how Google expressed it in their blog post:

"Penguin is now more granular. Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site."

That sentence could actually be interpreted in several ways but if they’ve listened to webmasters over the last two years then we can interpret it as follows:

  • A specific page is devalued if it’s deemed to have too many negative spam signals, such as keyword heavy inbound links, links from low quality sites (bookmarking, forum profiles, etc) or links from known private networks. The page may be penalised if it’s deemed that the spam was created intentionally and cannot be recovered until the links are removed or disavowed completely
  • A specific page is devalued by ignoring the links Google deems should not be passing PageRank to it, especially from low quality sites. The page is not penalised, as it could have been subjected to a spam attack, and can be recovered with fresh, good quality links
  • The entire website is devalued if a significant portion of the backlinks are spam and are all pointing to the home page. The whole website loses value because the home page is generally the strongest and passes the weighting down throughout the website.  Removing bad backlinks and disavowing them can go some way to aid recovery
  • The severity of Penguin is now greatly reduced and quicker to recover from. This is of particular importance to any business that has suffered from negative SEO or whose SEO company has used underhand ranking techniques
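One way to picture the "granular" behaviour described in the first interpretation above is a per-page spam ratio: devalue only the pages whose inbound links are mostly spam, rather than the whole site. This is purely speculative; the threshold and data shapes below are invented.

```python
# Speculative sketch of granular devaluation: flag only pages whose share
# of spam-flagged inbound links exceeds a threshold.
def devalued_pages(backlinks_by_page, spam_ratio_threshold=0.5):
    flagged = []
    for page, links in backlinks_by_page.items():
        spam = sum(1 for link in links if link["spam"])
        if links and spam / len(links) > spam_ratio_threshold:
            flagged.append(page)
    return flagged

site = {
    "/": [{"url": "a", "spam": False}, {"url": "b", "spam": False}],
    "/landing": [{"url": "c", "spam": True}, {"url": "d", "spam": True}],
}
print(devalued_pages(site))  # ['/landing'] – only the targeted page
```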

 

Over the last three weeks there have been some significant movements in the search results; no doubt this was Google testing out the new Penguin algorithm.  At least it now means that the fluctuations should settle down and that those businesses that were impacted severely two years ago may now be in a position to rebuild their trust in Google and regain their rankings and traffic again.

A real-time Penguin is a friendly Penguin….finally!


Google Confirms: Penguin Update Imminent!

 


 

John Mueller, a webmaster trends analyst at Google, recently announced that they are working on the announcement for the launch of the latest Penguin algorithm, version 4.0. Yes, you did read that correctly: they are working on the announcement itself, which may or may not mean they have finished work on the actual update. It could also mean the update has already been fully tested, as Google is not known for announcing something before it happens. It has been almost two years since the last Penguin algorithm update rolled out in October 2014. In the interim, there has been a great deal of talk about the functionality of Penguin being written into the main Google search algorithm, negating the need to ever run a specific update again.

Skip to the 44 minute and 30 second mark for the announcement…about the announcement itself.

For those of you who may not know, the Penguin algorithm is designed to identify poor quality and unnatural backlinks. A site hit by a Penguin update can be severely penalised and knocked a long way down the rankings across a wide range of key terms. Many sites penalised by Penguin 3.0 have attempted to clean up their backlink profiles using Google's disavow tool in preparation for the next update, in the hope that they may recover from their penalty, but it is still a point of some debate whether it is even possible to regain old rankings after a penalty.

It remains to be seen what the announcement will bring, and some people are even suggesting Mr Mueller may be playing a slight prank on a lot of over-eager SEOs by describing work on the announcement rather than the update itself. Throughout this current wave of debate, the key fact remains that link building is critical for good rankings, and taking shortcuts leaves sites at great risk. After all, why sit around hoping for an update that might fix a mistake made by you or your SEO agency, when you could have avoided the penalty in the first place by sticking to top quality links?

If you would like to discuss this subject or any other SEO concerns, contact us today.

Keep a look out on our blog or social platforms for further updates as our SEO Team keep up to date with SEO news, day and night.


Google Analytics

Integration of Search Console in Google Analytics

In May, Google announced that Google Search Console could be integrated more deeply with Google Analytics. But what exactly does this mean, what insights will it give, and how do you enable the feature?

Search Console is a free service offered by Google that helps website owners and marketers manage and monitor how they appear in Google organic search results. Google Analytics focuses on the data that the traffic creates once it has reached your website.

Search Console allows you to analyse a website's performance in Google search. It shows data on total impressions, clicks, CTR and average position for the keyword phrases the website ranks for. These phrases may not have been identified as target phrases but could still be driving significant traffic to your website.

Anyone wishing to analyse, understand and improve organic traffic from Google will be interested in this update. Essentially, the Search Engine Optimisation reports in Analytics have been replaced with a Search Console section. The new reports combine Search Console and Analytics metrics, allowing you to see the data for organic search traffic from both in one report.

What do the reports show?

The reports pull in impressions, clicks, CTR and average position from Search Console, and sessions, bounce rate, pages/session, goals/ecommerce, conversion rate, transactions and revenue from Analytics. For the first time, this data appears side by side.
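Conceptually, the reports perform a join of the two data sources on the landing page. The sketch below mimics that join with made-up data and field names, combining the metrics and deriving CTR from clicks and impressions.

```python
# Invented sample data: per-landing-page metrics from each source.
search_console = {
    "/pink-girls-bikes": {"impressions": 1200, "clicks": 96, "position": 4.2},
}
analytics = {
    "/pink-girls-bikes": {"sessions": 90, "bounce_rate": 0.38, "conversions": 7},
}

# Join the two sources on landing page and derive CTR.
def combined_report(gsc, ga):
    report = {}
    for page in gsc.keys() & ga.keys():  # pages present in both sources
        row = {**gsc[page], **ga[page]}
        row["ctr"] = row["clicks"] / row["impressions"]
        report[page] = row
    return report

print(combined_report(search_console, analytics))
```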

Landing Pages report screenshot

There are four new reports – Landing Pages, Countries, Devices and Queries – which are found in Analytics under Acquisition.

Search Console reports in Analytics screenshot

Landing Pages Report

Each landing page appears as a separate row within the report, allowing you to see at a glance how organic search traffic performs for that specific page, how visitors reached the website and what they did once they got there.

What does it all mean?

It means greater actionable insight into a website's organic search performance. The Landing Pages report joins acquisition data with behaviour and conversion data, so at landing page level you can see the clicks, average position, bounce rate and conversion rate that each page achieves.

Let's say, for example, you had an optimised landing page for pink girls' bikes – mymadeupsite.co.uk/pink-girls-bikes – with a form set up as a goal. You would be able to see the keywords that had driven traffic to that landing page and, at a rolled-up level, what happened to the visitors once they were on the site. Did they bounce? Did they navigate further into the website? Did they convert? This creates insights, which create actions to better optimise the landing page.

Devices Report

This report allows you to deep dive into how each device type – desktop, mobile and tablet – arrives at and navigates your website. You can see at a glance a comparison of the Click Through Rates (CTRs) and Goal Conversions for desktop, mobile and tablet, along with the landing pages and search queries behind them. This is incredibly valuable data. Returning to the pink girls’ bikes example, you might see that conversion (remember, a form was set up as a goal) is better on desktop and mobile than on tablet. This might prompt you to review how the form looks or works for a tablet user to help improve that conversion rate. You might also notice that some landing pages perform better on mobile than desktop, and look into why that is.

This all sounds great but how do I enable it?

You will need to link your Search Console and Analytics properties through Analytics.
Step 1: In Analytics, navigate to Acquisition > Search Console, where you will find the four reports – Landing Pages, Countries, Devices and Queries. Select one of them and select “Set up Search Console data sharing”.
Step 2: Select “Property Settings”.
Step 3: Scroll to the bottom of the page and select “Adjust Search Console”.
Step 4: Select the site to be linked, save, and select “add a site to Search Console”.
Step 5: Start gaining valuable insights.

Summary

In summary, integrating Search Console with Analytics will enable a deeper understanding of search data from beginning to end and enable actionable insights such as:

  • Understanding the search queries that rank well for each organic landing page rather than for the website as a whole
  • Examining how desktop, mobile and tablet users find and interact with the website
  • Improving landing pages in two specific ways:
    • Improving landing pages where many users arrive (high impressions and click through rate) but do not spend time on the website (low pages/session), immediately exit (high bounce rate) or do not convert to a goal (e.g. filling in a contact form).
    • Improving the search presence of landing pages where users navigate further through the website and convert, but the click through rate is low.
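The two patterns above can be sketched as a simple triage over the combined report data. The page rows, metric names and thresholds below are all hypothetical illustrations, not values from the actual reports; the point is only to show how the two improvement cases separate.

```python
# Hypothetical combined report rows (Search Console + Analytics metrics per landing page).
pages = [
    {"page": "/pink-girls-bikes", "impressions": 5000, "ctr": 0.040,
     "bounce_rate": 0.85, "conversion_rate": 0.001},
    {"page": "/boys-bmx-bikes",   "impressions": 4000, "ctr": 0.005,
     "bounce_rate": 0.30, "conversion_rate": 0.050},
]

# Pattern 1: traffic is arriving (healthy CTR) but visitors bounce or fail
# to convert -> the landing page itself needs improving.
fix_page = [p["page"] for p in pages
            if p["ctr"] >= 0.02
            and (p["bounce_rate"] > 0.70 or p["conversion_rate"] < 0.01)]

# Pattern 2: visitors who do arrive engage and convert, but few click
# through -> the search presence (title, description) needs improving.
fix_snippet = [p["page"] for p in pages
               if p["ctr"] < 0.01 and p["conversion_rate"] >= 0.02]

print(fix_page)     # ['/pink-girls-bikes']
print(fix_snippet)  # ['/boys-bmx-bikes']
```

In practice the thresholds would be set relative to the site's own averages rather than fixed numbers, but the same split applies: fix the page when the problem starts after the click, fix the snippet when the problem is getting the click.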

All of these insights should help build a better user experience and in Google’s eyes a better search experience too.


Google's Real-Time Penguin Algorithm – Due for 2015

There could well be an extra gift under our Christmas tree this winter from a certain major search engine, with the next Google Penguin update likely to arrive within the next two months. The new real-time Penguin algorithm, version 4.0, is due for release at the end of this year. We’ve been expecting it, going by news from Gary Illyes, Webmaster Trends Analyst at Google, who said it would be released in 2015.

Penguin

What we are expecting of the next update is a real-time version of the algorithm, meaning it will update continuously rather than in discrete refreshes. Previous Penguin updates have rolled out on specific, announced release dates; with Penguin 4.0 there will be no such dates. Instead, any spammy links that are detected will be acted upon by Penguin as soon as they are found.

Once spammy links have been removed and the Google indexer has registered this, the affected sites will stop being impacted by Penguin. The news on Penguin 4.0 is very brief at this stage, but it’s intriguing to learn that it will indeed arrive before the end of 2015. So what is real-time Penguin all about?

Real-Time Penguin

There’s not much information on the real-time update just yet, but what we do know is that as soon as Google discovers a link has been removed, the Penguin algorithm will do exactly what it says on the tin and process this in real time. You would therefore be able to recover from a Penguin penalty pretty quickly, although you could also pick one up just as quickly.

Get in Touch

Here at Artemis we stay up-to-date with all the latest happenings at Google to ensure our clients’ websites benefit and traffic continues to increase. Get in touch with us today to find out more.