PPC versus SEO

SEO Strategy in an Increasingly PPC Dominated World

The layout of Google’s SERPs (search engine results pages) has changed dramatically over the years. With maps, videos, images, featured snippets, the Knowledge Graph, news and various personalised suggestions all claiming their place, the first page of Google’s SERPs is now almost unrecognisable.

As marketers, one of the biggest shakeups we’ve seen this year is the increasing real estate given to paid search.

In February of 2016, Google announced (after years of testing) that paid search ads would no longer appear on the right-hand side of the SERPs and that, for ‘highly commercial’ terms, it would show an additional ad at the top, thereby increasing the space given to paid advertising from three places to four.

Four paid search ads and Local within SERPs

Paid search ads that didn’t make the top four places were moved to the bottom of the page. This meant that for some ‘highly commercial’ queries you could see no fewer than seven paid search ads, which severely limited the number of organic possibilities.

The reasoning behind this shift was fairly sound and understandable. With the explosion of mobile usage and the corresponding local signals, Google wanted to standardise the listings across devices.

For many brick-and-mortar businesses, this meant a renewed interest in, and push for, better local listings. The reward for appearing in the local three-pack was never more attractive: with the local listings sitting just beneath the top paid ads and above the normal organic ones, the three-pack effectively jumped into first place among the organic results.

More recently, and somewhat more controversially, Google has started to offer paid listings in the local finder as well, reached after clicking “More places” from a local three-pack in the main Google search results.

This is yet another piece of prime real estate that has been sold to advertisers.

Paid search ads in local listings

What does this mean for organic listings?

Before answering this question, it’s important to understand the reasoning behind these changes.

Search engines, and Google in particular, have become increasingly good at understanding user intent. If I’m searching to buy a laptop and search for the exact laptop model with the words ‘buy’ or ‘purchase’, Google understands this intention, and it being a strictly commercial one, would most likely present me with a series of paid search ads.

Is this a bad user experience? In most cases, no. I’m looking to buy a laptop, I know what kind of laptop I want to buy, and at this stage in my decision-making process I just want to be presented with options of where to buy it. My intent is purely commercial and transactional. Displaying a series of paid ads is more relevant to me and provides better performance for advertisers. In many ways, a win-win situation.

It’s the informational ‘micro-moments’ that SEO strategy should be attempting to target: the billions of queries that are not highly commercial and offer some scope for branding and connection.

There are over 3.5 billion searches on Google a day. Over 15% of these search queries have never been seen before.

Google’s engineers now feel confident enough in RankBrain’s ability to sift through these unrecognised queries (sorting them into vectors and assigning a ‘probable’ meaning to them) that they’ve recently announced RankBrain is now used in every search. It has become the third most important ranking signal (after links and content).
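
As a toy illustration of the vector idea (and nothing more – Google’s actual models are far larger and not public), here is how two queries, mapped to made-up ‘meaning’ vectors, can be compared with cosine similarity so that a never-before-seen query can inherit the ‘probable’ meaning of its nearest neighbour:

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 3-dimensional "meaning" vectors for two queries.
# Real systems learn hundreds of dimensions from text.
query_seen_before = [0.9, 0.1, 0.3]   # "buy cheap laptop"
query_never_seen  = [0.8, 0.2, 0.4]   # "best value notebook to purchase"

print(cosine_similarity(query_seen_before, query_never_seen))  # ~0.98, very similar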

As we mentioned in our article about semantic search, we’re moving away from strings to things. Towards meaning and providing true value to the user.

A combined PPC and SEO search strategy

Having a combined SEO and paid search strategy is a good way forward for many companies, particularly for e-commerce sites.

You’d have a carefully targeted PPC campaign with different landing pages for the commercial ‘I-want-to-buy-now’ moments, and a strong organic presence for people at the informational and research stages. Rather than competing with each other, these different listings would complement one another, reinforcing your brand and presence at every stage of the decision-making process.

The path to purchase is fragmented and non-linear. More so now than ever before.

The consumer journey has been fractured into hundreds of tiny decision-making moments at every stage of the ‘buyer’s funnel’—from inspiration to purchase.

For SEO to succeed, we need to address these ‘micro-moments’. We need to answer people’s questions, exceed their expectations and meet them at whatever stage of the decision-making process they are currently in.

As a recent study by Google concluded – you need to be there, be useful and be quick. Therein lies the key to success online.


Google Analytics

Integration of Search Console in Google Analytics

In May, Google announced that Google Search Console could be integrated more deeply with Google Analytics. But what exactly does this mean, what insights will it give, and how do you enable this feature?

Search Console is a free service offered by Google that helps website owners and marketers manage and monitor how they appear in Google’s organic search results. Google Analytics, by contrast, focuses on what that traffic does once it has reached your website.

Search Console allows you to analyse a website’s performance in Google search. It shows you data on total impressions, clicks, CTR and average position for the keyword phrases the website ranks for. These phrases may not have been identified as phrases to target, but they could still be driving significant traffic to your website.

Anyone wishing to analyse, understand and improve organic traffic from Google will be interested in this update. Essentially, the Search Engine Optimisation reports in Analytics have been replaced with a Search Console section. The new reports combine Search Console and Analytics metrics, allowing you to see the data for organic search traffic from both in one report.

What do the reports show?

The reports pull in the following data from Search Console – Impressions, Clicks, CTR and Average Position – and the following from Analytics – Sessions, Bounce Rate, Pages/Session, Goals/Ecommerce, Conversion Rate, Transactions and Revenue. For the first time, this data appears side by side.
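
Conceptually, the new reports are a join of the two datasets on the landing page. As a minimal sketch of that idea (the data is made up and the column names are illustrative, not Google’s exact headers), the same combination can be reproduced with pandas:

```python
import pandas as pd

# Hypothetical per-page exports from each tool.
search_console = pd.DataFrame({
    "landing_page": ["/pink-girls-bikes", "/boys-bikes"],
    "impressions":  [12000, 8000],
    "clicks":       [480, 160],
    "avg_position": [4.2, 9.7],
})
analytics = pd.DataFrame({
    "landing_page": ["/pink-girls-bikes", "/boys-bikes"],
    "sessions":         [470, 150],
    "bounce_rate":      [0.35, 0.71],
    "goal_completions": [38, 3],
})

# Join on the landing page so acquisition and behaviour sit side by side.
report = search_console.merge(analytics, on="landing_page")
report["ctr"] = report["clicks"] / report["impressions"]
report["conversion_rate"] = report["goal_completions"] / report["sessions"]
print(report)
```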


There are four new reports – Landing Pages, Countries, Devices and Queries – which are found in Analytics under Acquisition.


Landing Pages Report

Each landing page appears as a separate row within the report, allowing you to see at a glance how organic search traffic performs for that specific page: how visitors reached the website and what they did when they got there.

What does it all mean?

It means greater actionable insight into the performance of a website in organic search. The landing page report joins acquisition data with behaviour and conversion data, so you can see, at landing page level, how many clicks a page gets, its average position, its bounce rate and its conversion rate.

Let’s say, for example, you had an optimised landing page for pink girls bikes – mymadeupsite.co.uk/pink-girls-bikes – with a form set up as a goal. You would be able to see the keywords that drove traffic to that landing page and, at a rolled-up level, what happened to visitors once they were on the site. Did they bounce? Did they navigate further into the website? Did they convert? These insights lead to actions for better optimising the landing page.

Devices Report

This report allows you to deep-dive into devices – desktop, mobile and tablet – and how users on each arrive at and navigate your website. You can see at a glance how click-through rates (CTRs) and goal conversions compare across desktop, mobile and tablet, along with the landing pages and search queries behind them. This is incredibly valuable data. Back to pink girls bikes: you might see that conversion (remember, a form was set up as a goal) is better on desktop and mobile than on tablet. This might prompt you to review how the form looks or works for tablet users, to help improve that conversion rate. You might also notice that some landing pages perform better on mobile than on desktop, and look into why that is.

This all sounds great but how do I enable it?

You will need to link your Search Console and Analytics properties through Analytics.
Step 1: Navigate in Analytics to Acquisition > Search Console, where you’ll find the four reports – Landing Pages, Countries, Devices and Queries. Select one of them and click “Set up Search Console data sharing”.
Step 2: Select “Property Settings”.
Step 3: Scroll to the bottom of the page and select “Adjust Search Console”.
Step 4: Select the site to be linked, save, and select “add a site to Search Console”.
Step 5: Start gaining valuable insights.

Summary

In summary, integrating Search Console with Analytics enables a deeper understanding of search data from beginning to end, along with actionable insights such as:

  • Understanding the search queries that rank well for each organic landing page, rather than for the website as a whole
  • Examining how desktop, mobile and tablet users find and interact with the website
  • Improving landing pages in two specific ways (see the sketch below):
    • Improving pages where many users arrive (high impressions and click-through rate) but don’t engage with the website – navigating no further into the site (pages/session), immediately exiting (bounce rate) or not converting to a goal (e.g. filling in a contact form)
    • Improving the search presence of landing pages where users navigate further through the website and convert, but which have a low click-through rate
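
As promised above, here is a minimal sketch of that two-way triage. The metric names and thresholds are arbitrary assumptions for illustration, not Google guidance:

```python
def triage(page):
    """Crude triage of a landing page using the two cases above."""
    good_ctr = page["ctr"] >= 0.05                 # arbitrary example threshold
    engaged = page["bounce_rate"] < 0.5 and page["conversion_rate"] > 0.01
    if good_ctr and not engaged:
        return "improve the page: traffic arrives but doesn't engage or convert"
    if engaged and not good_ctr:
        return "improve search presence: title tag, meta description, rankings"
    return "no obvious action from these metrics alone"

page = {"ctr": 0.08, "bounce_rate": 0.82, "conversion_rate": 0.002}
print(triage(page))  # improve the page: traffic arrives but doesn't engage...
```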

All of these insights should help build a better user experience and in Google’s eyes a better search experience too.


Quantum Computing and SEO

The Guardian newspaper recently asked its readers the question “has the age of quantum computing arrived?” In the world of search engine technology, the short answer to this question is “yes”. Google has been testing and utilising the power of quantum computing for some time now in an effort to improve its search results. In October 2015, Google announced the existence of the artificial intelligence component of its algorithm, known as RankBrain. Whilst many in the SEO sphere had been anticipating AI developments in search, the announcement still surprised many people, as Google also stated that this new component was one of the most critical factors in determining the ranking of resources on the web. It’s almost certain that RankBrain’s deployment and announcement wouldn’t have been possible without the aid of quantum computing.

There is plenty of evidence to link Google’s use of quantum computing to their RankBrain algorithm. Prior to the launch of RankBrain in 2015, they made the following announcement:

“GOOGLE IS UPGRADING its quantum computer. Known as the D-Wave, Google’s machine is making the leap from 512 qubits—the fundamental building block of a quantum computer—to more than a 1000 qubits. And according to the company that built the system, this leap does not require a significant increase in power, something that could augur well for the progress of quantum machines.” Source.

Certainly, computers with this level of computational power will assist with the ultimate aim of RankBrain, which is to sort through, understand and learn from billions of web pages then deliver the most relevant results. This is the basis of semantic search, which has been at the heart of Google’s development strategy since their inception.

How does quantum computing assist RankBrain?

I’ll readily admit that I’m no expert on quantum mechanics; the subject is notoriously brain-taxing and has perplexed some of the greatest minds in physics. So, I will keep the science brief! In simplest terms, quantum computing provides the computational power required to do extremely complex calculations quickly by borrowing the concepts of superposition and entanglement as theorised in quantum mechanics. In regular computing a bit can be a 0 or a 1; through superposition, however, each qubit in Google’s quantum computer can be 0, or 1, or both 0 and 1. This means that Google can perform two equations at the same time and, as the Guardian article states, “two qubits can perform four equations. And three qubits can perform eight, and so on in an exponential expansion. That leads to some inconceivably large numbers, not to mention some mind-boggling working concepts.”
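
The exponential expansion is easy to illustrate. The snippet below simply counts the basis states an n-qubit register can hold in superposition – back-of-envelope arithmetic, not a quantum simulation:

```python
# The number of basis states a register of n qubits can hold in
# superposition doubles with every qubit added: 2 ** n.
for n in (1, 2, 3, 10, 512, 1000):
    print(f"{n:>4} qubits -> 2^{n} = {2 ** n} simultaneous states")
```

At 512 qubits the count is already a 155-digit number, which is why the jump to more than 1,000 qubits matters so much.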

Whilst increasing Google’s computational power without increasing its use of resources is clearly a key aim in the adoption of quantum technology, the paramount ambition is to make a breakthrough in artificial intelligence and create computers that can “think” like a human. This makes perfect sense for an organisation like Google, whose ultimate concern is truly understanding the intent of its search engine users. Who can better understand what someone is trying to look for than an actual human being who is an expert on the subject matter? By investing heavily in AI development, Google hopes to replicate the reasoning capability of a human, with the extended capability of being able to sort and understand a vast amount of data quickly.

What does this mean for the future of SEO?

Google’s investment in AI is likely to lead to an ever-increasing capability to assess and understand the theme of a web page and the authority of that page on the domain it is served on. For businesses, this means it’s increasingly imperative to focus on and display their expertise via their web properties. This is especially critical for organisations that provide services based on knowledge and experience. Businesses should be thinking about how they can better address the needs and questions of their customer base, so that those searching on a particular topic are more likely to encounter their content. This requires a renewed focus on content strategy and on improving the quality of web pages. Detailed and in-depth pages should result in the “long click”, keeping users on your site longer and helping Google’s RankBrain learn from each user’s actions.


Content Marketing – How to Meet Searchers Expectations


Content is not King. Search engines only care about your content in so far as it answers searchers’ questions. In the last few years especially, the web has become one huge answering machine. People query search engines and search engines attempt to answer those questions by providing results. Simple. Not really.

As we saw in the introduction to semantic search article, a search engine takes our queries, tries to understand the words, and delivers the same results a human would – the same results a friend would give you. And not just any friend, a close friend. A friend who understands you, who knows your current and previous locations, who knows your tastes and preferences and, most importantly, knows your intentions.

These days, your website can’t just target keywords; you need to meet, match and exceed searchers’ expectations. To achieve this you have to understand your target audience better than ever before.

You need to understand the kinds of questions they have and, most importantly, provide them with enough information that they will not have to pogo-stick back to the search engine results pages (SERPs).

This pogo-sticking can have a dramatic effect on rankings.


Pogo-sticking, Long and Short Clicks


A long click is a sign of user satisfaction. It’s a sign of expectations and intentions being met.

Many people confuse a ‘long click’ with a low bounce rate. Although the two metrics have some correlation, they are very different.

Popular resource pages (think Wikipedia and Stack Overflow) and blogs often have high bounce rates. People come in, find what they need and leave again. Or, in the case of blogs, they read the latest post and leave.

They have no need to carry on searching. And that is the key. Their intentions have been met.

They have no need to ‘pogo-stick’ back to the search results and click on other results.

This pogo-sticking is an easy metric for Google to calculate and keep track of. It also provides a very clear indication of user satisfaction.

When a user actively chooses another website from the SERPs to get the information they are looking for, it shows Google that your content is not good enough and doesn’t deserve to rank for those queries. It also shows Google which websites should be ranking above yours.

If this becomes a common occurrence on your landing pages, search engines will notice these short clicks and your rankings will decrease. (This natural voting system is far more transparent in PPC, where it makes up a large part of the Quality Score.)
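
Nobody outside Google knows exactly how these signals are weighted, but as a purely illustrative sketch, a ‘short-click rate’ could be computed from click events like this (the 30-second threshold is an assumption, not a known figure):

```python
from dataclasses import dataclass

@dataclass
class Click:
    url: str
    dwell_seconds: float       # time on the page before any return to the SERP
    returned_to_serp: bool

SHORT_CLICK_SECONDS = 30       # hypothetical cut-off for a "short" click

def short_click_rate(clicks):
    """Share of clicks that pogo-sticked back to the results quickly."""
    short = [c for c in clicks
             if c.returned_to_serp and c.dwell_seconds < SHORT_CLICK_SECONDS]
    return len(short) / len(clicks)

clicks = [
    Click("/pink-girls-bikes", 8, True),     # pogo-stick
    Click("/pink-girls-bikes", 240, False),  # long click
    Click("/pink-girls-bikes", 12, True),    # pogo-stick
]
print(f"{short_click_rate(clicks):.0%}")     # 67%
```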

How do you Match and Exceed Searchers’ Expectations?

In our semantically themed world, you have to understand your clients and their questions and queries; you need to know what goes on in their heads. First and foremost, you should put yourself in their shoes.

What questions would they have before making a decision?

A good exercise is to get a piece of paper and write down the most common questions you hear from your clients. Get everyone involved and ask every member of your team for their feedback.

Break these queries down for each of your products and/or services and then look at your existing content. Are you addressing these questions? Are you addressing them fully? Would users need to go somewhere else – to one of your competitors – to get the information they need?

This ‘completeness’ is so important these days. It’s the difference between a long and short click.

Optimising Existing Content

Cyrus Shepard wrote a brilliant article – one that sums up perfectly what we’ve been doing at Artemis.

When you already have traffic coming to a website and have some information to work with, the place to start is Google’s Search Console.

Search Console query data at the URL level

Pick a few underperforming pages. I tend to pick ones that drive some traffic but could/should drive more – ones that often rank on the second page of the SERPs and have low click-through rates.

Select the individual pages, adjust the timeframe to the maximum 90 days of queries and filter by impressions. These are the queries that are landing people on your page, and they often give you great insight into people’s intentions.
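
If you prefer to work outside the Search Console interface, the same per-page query data is available through the Search Console API (the searchAnalytics.query method). A minimal sketch, assuming you have already completed Google’s OAuth flow for a verified site; the site and page URLs are placeholders:

```python
from googleapiclient.discovery import build

# `creds` is assumed to be OAuth credentials obtained beforehand (not shown).
service = build("webmasters", "v3", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.co.uk/",
    body={
        "startDate": "2016-01-01",
        "endDate": "2016-03-31",       # roughly the 90-day maximum window
        "dimensions": ["query"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "page",
                "operator": "equals",
                "expression": "https://www.example.co.uk/pink-girls-bikes",
            }]
        }],
        "rowLimit": 100,
    },
).execute()

# Sort the returned queries by impressions, highest first.
rows = sorted(response.get("rows", []),
              key=lambda r: r["impressions"], reverse=True)
for row in rows:
    print(row["keys"][0], row["impressions"], row["clicks"], row["position"])
```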

Sometimes it’s not immediately clear and you need to dig around a bit, but usually you’ll find a goldmine of information.

Does the Content on your Page Satisfy User Intent?

There are various ways to address this, and we’ll be covering them in later articles.

A quick fix is to amend title tags and meta descriptions. By including these big traffic-driving queries in your title tag (particularly near the beginning) or within the meta description, you will increase click-through rates and drive more traffic – at least in the short term.

But remember, this traffic needs to be sustainable and you need to aim for long clicks.

It’s very important that title tags and meta descriptions are not deceptive. They need to fairly reflect what a user is going to find on that page. If they don’t, people will quickly bounce off, ‘pogo-sticking’ back to the search results, causing the whole page to lose rankings.

A far better solution is to provide real value to searchers: have the best content, be helpful, and answer all their questions clearly and fully.

Content is no longer King. The new King is the long click.


The End of Referral Spam as We Know it?

Although still unconfirmed by Google, it looks like there have been some big changes in how referral spam is handled in Google Analytics. We have been seeing a steady decline over the last month, and over the last week in particular.

It looks as if Google began aggressively filtering from mid-to-late February.

There are still accounts with some minor issues, but nothing compared to what we saw a few months ago.

Having examined over 100 analytics accounts, it does indeed look like referral spam is coming to an end. At last. We hope.

What is Referral Spam?

Wikipedia defines referral spam as

“Referrer spam (also known as referral spam, log spam or referrer bombing) is a kind of spamdexing (spamming aimed at search engines). The technique involves making repeated web site requests using a fake referrer URL to the site the spammer wishes to advertise.”

What does Referral Spam Do?

Basically, referral spam was an inconvenience: something you had to continually filter, monitor and explain. It wasted a lot of time, and if it wasn’t filtered properly it would also mess up your other metrics and skew averages.

These automated requests would overload servers and slow down load times; with slower speeds and higher bounce rates, this could eventually translate into lower rankings. Many webmasters also feared the security implications, as some of this spam traffic could be probing for WordPress, plugin and server vulnerabilities.
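
Until Google’s filtering proves reliable, you can spot the worst offenders in your raw server logs yourself. A minimal sketch, assuming the common Apache/Nginx ‘combined’ log format; the spam domains listed were widely reported at the time, but treat the list as illustrative, not exhaustive:

```python
import re

# A few referrer domains widely reported as spam; illustrative only.
SPAM_REFERRERS = {"semalt.com", "buttons-for-website.com", "darodar.com"}

# In the "combined" log format, the referrer is the quoted field
# that follows the request, status code and response size.
LOG_LINE = re.compile(r'"[A-Z]+ [^"]+" \d+ \d+ "(?P<referrer>[^"]*)"')

def is_spam(line: str) -> bool:
    match = LOG_LINE.search(line)
    if not match:
        return False
    referrer = match.group("referrer")
    return any(domain in referrer for domain in SPAM_REFERRERS)

line = ('1.2.3.4 - - [01/Mar/2016:10:00:00 +0000] "GET / HTTP/1.1" 200 512 '
        '"http://semalt.com/crawler" "Mozilla/5.0"')
print(is_spam(line))  # True
```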

According to Jennifer Slegg of The SEM Post, it looks as if the spam is being filtered after hitting the site: it is still visible in the real-time reports, but Google applies a filter before it reaches the acquisition reporting.

Hopefully, this spells the end of referral spam as we know it.

For further reading, see How to Stop Spam Bots from Ruining Your Analytics Referral Data and for a free SEO and Analytics Audit contact us today!


Google officially launches AMP results in Mobile SERPs

Google has today started to display Accelerated Mobile Pages (AMP) in SERPs that trigger a news listing. To see the new AMP results, search from a mobile device for anything that triggers the news results box, such as ‘The Brits’ or ‘Pensions’.

You can identify AMP-compatible pages by the green lightning bolt icon with ‘AMP’ next to it.

When you click on an AMP result, it loads a version of the page that Google has cached, which helps increase the speed at which the page loads. You can also swipe left and right to see more results, rather than having to go back to the SERP and click on the next result.


If you run a website that triggers a news result SERP, you should start to implement AMP as soon as you can. There is plenty of documentation out there, but we recommend looking at the official AMP site for more information. If you use WordPress, you can install this plugin to help optimise your pages for AMP.

How does it work?

AMP works like the majority of other HTML pages, except that it uses a reduced set of functionality that will still load in any modern mobile web browser. There are a number of technical and code-based advantages to AMP which help page load time and the user experience.
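
An AMP page is linked from its canonical page via a link element with rel="amphtml", which also gives you a quick way to check whether any given page advertises an AMP version. A small sketch using requests and BeautifulSoup (both assumed installed; the URL is a placeholder):

```python
import requests
from bs4 import BeautifulSoup

def amp_url(page_url: str):
    """Return the AMP version a page advertises, if any.
    AMP versions are discovered via <link rel="amphtml" href="...">."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", rel="amphtml")
    return link["href"] if link else None

print(amp_url("https://www.example.com/news-story"))
```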

Google also caches the AMP pages in the cloud to make the load time even better for users.

As it is a fairly new technology, it will be interesting to see if Google starts rolling this out to non-news SERPs. We are monitoring it closely here at Artemis to see how it progresses.

If you are looking for any help setting up AMP on your website or any help with your Search Engine Marketing, then please feel free to give us a call on 01444 645018 or drop us an email.


Starting at the beginning…

Keyword research is the starting point of a successful online presence and marketing campaign. But what is it? How can you do it better, and what does it actually mean?

What is keyword research?

Keyword research is about discovering what people type into search engines to find what they need. It is essentially the gathering of search terms which are, firstly, relevant to your website and, secondly, actually being used by searchers. For example, people who wish to buy a bicycle may type the word ‘bicycle’ into Google’s search box.

How do I find more information on keywords?

Google has a handy tool called Keyword Planner, which you can access if you set up a Google Ads account. You can use it in different ways to generate keyword ideas and their search volumes. The keyword is the term typed into Google, and the search volume is the average number of times that keyword is typed into Google each month.

Let’s run through an example. If I owned a bicycle store and wanted to research keywords, where would I start? How about the words around ‘bicycle’? Would you target cycle, bicycle or bike? If we look at these words in Google’s Keyword Planner, we can see the results below:

We can see in Keyword Planner that bike and bicycle each have a search volume of 33,100, while cycle has a much lower volume of 6,600. What is interesting is that, searching the plurals as well, bikes is also a top-volume term at 40,500. We might therefore think about targeting the keyword bikes rather than cycles.

Competing for a search term like bike on its own will be extremely difficult. Let’s say our example business actually specialises in children’s bikes, so we need to think around the words children and bike, and research the combinations of children, kids, boys and girls with the bicycle words.

Keyword                  Avg. Monthly Searches (exact match only)
kids bikes               14,800
girls bikes               9,900
boys bikes                5,400
kids bike                 3,600
girls bike                3,600
childrens bikes           2,900
boys bike                 1,600
childrens bike            1,000
kids bicycle                590
girls bicycle               480
boys mountain bikes         390
boys bicycles               260
children bike               210
kids bicycles               170
girls bicycles              140
kid bike                    110
children bicycle             90
child bicycle                70
boys bicycle                 70
kid bicycle                  30

 

It is interesting to see the difference between some of the plurals, and how much more search volume there is around girls than boys. But what about delving deeper into what are known as long-tail keywords, such as searches for a particular colour of girls’ bike?

Keyword                  Avg. Monthly Searches (exact match only)
pink girls bike             140
pink kids bike               40
kids pink bike               40
pink girls bikes             20
pink girl bike               10
pink girl bikes              10
pink girls bicycle           10
pink kids bikes              10
pink childs bike             10
kids pink bikes              10
pink kid bike                10
pink kids bicycle            10
pink childrens bikes         10
kids pink bicycle            10
pink girl bicycle            10
pink kids bicycles           10

 

We can see above that the keyword “pink girls bike” has the most potential traffic. By taking these journeys through the keywords, we can start to build a strategy and target the keywords with the potential to drive the most traffic.

What should I do with this information?

The information can help create the strategy for the website in terms of content: how to describe the service being offered in the most valuable way and, most importantly, how to drive relevant traffic to the website. For example, when creating a website you can group related keywords into categories and create category pages to target each specific market. In our example, we might create a page targeting the words pink, kids, girls and bikes. Creating relevant content with associated terms like ‘pink girls bikes’ will help the website rank for the shorter keyword searches ‘bikes’ and ‘girls bikes’, where the larger search volume is, while also obtaining high rankings for the specific long-tail search. Also, by targeting the long-tail keywords around pink girls bikes, you may well be reaching people who have the intention to buy rather than people who are just researching the topic.
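
As a rough sketch of that grouping step (the volumes are taken from the tables above; the modifier list and the grouping rule itself are deliberately naive assumptions):

```python
from collections import defaultdict

keywords = {
    "pink girls bike": 140, "pink kids bike": 40, "kids pink bike": 40,
    "girls bikes": 9900, "kids bikes": 14800, "boys bikes": 5400,
}

# Naive grouping: bucket each phrase under the first modifier it contains.
MODIFIERS = ("pink", "girls", "boys", "kids")

groups = defaultdict(dict)
for phrase, volume in keywords.items():
    for modifier in MODIFIERS:
        if modifier in phrase.split():
            groups[modifier][phrase] = volume
            break

# Each bucket is a candidate category page, sized by its total demand.
for modifier, phrases in groups.items():
    total = sum(phrases.values())
    print(f"{modifier}: {total} searches/month across {len(phrases)} phrases")
```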

In summary, keyword research essentially helps you create and refine a website to market itself to the people searching for it. If you know how people are searching for what your business provides, you have found your market.


An Introduction to Small Business and Semantic Search

Search engines are changing and they’re evolving faster than ever before. Have you ever felt that Google knows exactly what you are looking for? That you are being prompted with suggestions that are eerily close to what you have in mind? Sometimes even before you’ve finished typing your query. How does Google understand your thoughts so well? How can it possibly know what you’re looking for? Welcome to the world of semantic search.

Semantic search is one of the most exciting developments of our time. It is also one that is levelling the playing field between large and small businesses. Now, even small businesses have a place in the search results – a place where they can attract visitors and triumph.

What is Semantic Search?

The word semantics comes from Ancient Greek and involves the study of meaning. Attempting to find meaning is nothing new on the internet.

Indeed, Tim Berners-Lee, the father of the modern web, originally coined the term semantic web, which is defined as “providing a common framework that allows data to be shared and reused across application, enterprise, and community boundaries.”


Although the theory and concepts behind semantic search are fairly easy to grasp, the actual mechanisms and the mathematics behind them are incredibly complex.

We are moving from a web of things to a web of people. From strings to things. Gone are the days where you can hide behind a large faceless portal and expect to build trust over the Internet.

The Knowledge Graph

The Knowledge Graph is often referred to as the brain behind semantic search.

Amit Singhal, the head of the Google Search team, retired last week. His replacement is John Giannandrea, none other than the man behind the Knowledge Graph, the recent RankBrain update, the so-called Hummingbird algorithm and most of Google’s artificial intelligence initiatives in the last few years. With John around, AI and semantics will undoubtedly remain the major focus for Google.

John, like most techies, is a huge Star Trek fan. Anyone who’s ever seen an episode of Star Trek cannot help but be impressed by the computer onboard the Starship Enterprise – you know, the one that responds to voices, and that gets increasingly intelligent over time. The computer that ‘understands’ what you are asking it.

This is exactly where Google is headed. The search engine takes our queries, tries to understand the words, and delivers the same results a human would – the same results a friend would give you. And not just any friend, a close friend. A friend who understands you, who knows your current and previous locations, who knows your tastes and preferences and most importantly, knows your intentions.

Central to this new understanding is the Knowledge Graph.


The Knowledge Graph uses fuzzy logic, which was first identified in the 1960s by Dr. Lotfi Zadeh, professor emeritus of computer science at the University of California, Berkeley. Fuzzy logic is a way to introduce “degrees of truth” into mathematics: it ascribes a mathematical value to logical variables, rather than a straight binary “yes” or “no”. Unlike traditional Boolean logic, fuzzy logic allows Google to introduce probabilities into its calculations.
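
A toy example of degrees of truth, using Zadeh’s standard fuzzy operators (AND as min, OR as max, NOT as 1 - x). This illustrates the concept only; it is not how the Knowledge Graph is actually implemented:

```python
# Degrees of truth between 0.0 and 1.0 instead of a hard yes/no.
is_restaurant = 0.9   # "how true is it that this place is a restaurant?"
is_nearby     = 0.6   # "how true is it that it is near the user?"

fuzzy_and = min(is_restaurant, is_nearby)   # 0.6 -> a fairly good answer
fuzzy_or  = max(is_restaurant, is_nearby)   # 0.9
fuzzy_not = 1 - is_nearby                   # 0.4

print(fuzzy_and, fuzzy_or, fuzzy_not)
```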

Each time you query Google, the results appear via semantic search, in the form of a list of possible answers. Google attempts to interpret the meaning of every query, by using all the information it has on you (for example, your location, search history, preferences, associations, friendships, your friends’ reviews, shopping history, the content of your emails and much more). It does all of this in order to give you answers based on your intent. Something which Google has become surprisingly good at over the years.

What can small businesses do?

When you think of semantics, you have to think about transparency and understanding. Semantics is all about you and the reasons you started your business in the first place. It’s about putting your passion on display and showing visitors what makes you stand out. What makes you special? And more importantly, why should visitors give you their business? First and foremost it’s about building trust.

Semantic search is so all-encompassing and vast that any attempt to manipulate it is doomed to failure. As a small business with limited time and resources, concentrate on the basics: a carefully optimised website with a strong local presence and valuable content – content that is going to help people, answer their questions and build trust.

It’s of little surprise that one of the most visited pages on any website is usually the ‘about us’ page – even more so for e-commerce sites. People relate to people; they want to know about the people behind the site. A carefully thought-out and well-written ‘about us’ page goes a long way towards building confidence.

Key Points for Small Businesses to keep in mind

  • What makes you special?
  • How do you stand out from your competitors?
  • Which qualities do you have that will make people trust you?
  • Think about your target demographics – the people you are trying to reach, your potential customers. What kinds of questions might they have, and how are you addressing them?
  • Are you addressing these questions using the language your target audience would use?
  • Identify the problems that your business will help solve. How will you go about solving them? What solutions do you offer?

It is now more important than ever to provide real value to the end user. To take full advantage of semantic search, we have to go back to our basic values.

In other words, we have to provide value, answer visitors’ questions and exceed their expectations. The goal is to establish trust and build lasting relationships. As Tim Berners-Lee also stated, “The Web does not just connect machines, it connects people.”


For further reading on semantic search I recommend any books or articles written by David Amerland, his small business easy checklist and for a more technical (Google patent themed) reading list – seobythesea.


Can’t Rank, Won’t Rank? Maybe The Problem Is The Domain

Why won't my website rank in Google?

“I don’t understand why my website doesn’t rank for my main terms, it’s so much better than all those other rubbish websites that Google is ranking instead.”  Of course, everyone believes that their website is the best but far too often we come across websites that just can’t seem to rank at all, no matter what you do to them or how good their content may be.

One of the toughest parts of an SEO’s role is not getting a website to rank, but having to explain to a client why their website isn’t ranking and is unlikely to – the worst part being that sometimes we just don’t know the exact reason.

Google uses over 200 different signals to evaluate where pages should rank for a given search term. It also frequently rolls out further algorithms, such as Panda, Penguin and Payday Loans, to target particular spam or search-quality issues. So many factors come into play that sometimes it’s not easy to tell exactly why one page ranks better than another.

What we often find is that some domains are untrusted, and these are unlikely to be able to rank highly in Google search results in the short term, no matter what you do. Manual and algorithmic penalties are supposed to have an expiry date, but it could be years from now, and most businesses can’t wait months, let alone years, to enjoy the benefits of good Google rankings.

We still have test websites that were penalised four years ago, and today they are still not ranking.

Trust me, I’m not a link spammer

By far the biggest cause of a lack of trust in a domain is a very low-quality backlink profile. This can take the form of hundreds of bookmarking links, links from low-quality directories, links from other untrusted sites or a very keyword-heavy anchor text profile, to cite just a few examples.

When a new client comes to us with a website that has this type of link profile – which is far more common than it should be – we have to decide whether we think the website has been flagged as “untrusted” and whether it would be best to start again with a new domain.

You can run some simple tests such as optimising pages in a certain way and seeing if the corresponding change in the rankings matches expectations, but ultimately the domain may be flagged as untrusted by Google and therefore the effort to get it to rank well may far outweigh the effort required to achieve good results with a new domain.

All is not lost with a new domain

There is always a lot of resistance from businesses when it comes to changing a domain name, for various reasons, but from a search rankings point of view, starting with a new domain doesn’t necessarily mean starting from scratch.

You can’t just redirect an old, untrusted domain to the new one, as you will simply pass the low trust signals across to the new domain. There are ways around this, and it’s important to ensure that the new domain remains completely clean and independent of the old one. Updating the signals that tell Google your new website address – Google+, local directories, business listings, etc. – will help to accelerate the ranking of the new domain.

The clean-up operation

Over the years I’ve spent huge amounts of time cleaning up websites to get them ranking again. Cleaning up bad backlinks is very tough: although you can disavow links with Google, the process is far from guaranteed, and Google can choose to accept or ignore what you put in your disavow file. The Penguin algorithm flags websites as untrusted if they have spam backlinks, and the only way to recover is to clean up the bad backlinks and wait for Penguin to run again. Seeing as it hasn’t run for over a year, that’s a huge amount of time for any business to be struggling in Google search results.
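
For reference, the disavow file itself is plain text: ‘#’ lines for comments, a ‘domain:’ prefix to disavow every link from a domain, or a bare URL for a single page. A minimal sketch that generates one from a reviewed list (the domains and URL are placeholders):

```python
bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["http://blog.example/comment-spam-page.html"]

# One entry per line: "#" comments, "domain:" for whole domains,
# or a full URL to disavow a single link.
lines = ["# Disavow file generated after a manual backlink review"]
lines += [f"domain:{d}" for d in bad_domains]
lines += bad_urls

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```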

If the issue is poor-quality content or too many cookie-cutter pages (pure Panda fodder), this can also take a long time to correct, and the effects can take even longer to show in search results.

Time is money, and going through a clean-up process is generally very time-consuming, with no guaranteed outcome.

Sometimes it’s just best to bite the bullet and start afresh – a solution also suggested by John Mueller (Google engineer) in a Google+ post.



Google's Real-Time Penguin Algorithm – Due for 2015

There could well be an extra gift under our Christmas tree this winter from a certain major search engine, with the next Google Penguin update likely to arrive within the next two months. The new real-time Penguin algorithm, version 4.0, is due for release at the end of this year. We’ve been expecting it, and we’re going by news from Gary Illyes, Webmaster Trends Analyst at Google, who said it would be released in 2015.


What we are expecting of the next update is that it will be a real-time version, which means the algorithm will update continuously. The upcoming release is Penguin 4.0. We’ve been told about such updates on many occasions in the past, but until now we haven’t had a continuous version with no specific release dates. Instead of periodic refreshes, any spammy links that are detected will be acted upon by Penguin as they are found.

Once spammy links are removed and the Google indexer is aware of this, the affected sites will stop being impacted by Penguin. The news on Penguin 4.0 is very brief at this stage, but it’s intriguing to discover that it will indeed arrive before the end of 2015. So what is real-time Penguin all about?

Real-Time Penguin

There’s not much information on the real-time update just yet, but what we do know is that as soon as Google discovers a link has been removed, the Penguin algorithm will do exactly what it says on the tin and process this in real time. You would therefore be able to recover from a Penguin penalty pretty quickly, although you could also end up with a penalty just as quickly.

Get in Touch

Here at Artemis we stay up-to-date with all the latest happenings at Google to ensure our clients’ websites benefit and traffic continues to increase. Get in touch with us today to find out more.