9 simple A/B tests you can run to improve conversions

A lot of website owners think that the most vital part of their online business is getting traffic. While this is, of course, pretty essential, there’s actually something far more important to work towards – converting that traffic into paying customers.

It sounds obvious, but it’s easy to get caught up in metrics like search engine rankings and page traffic and forget about what’s actually making you money!

A/B testing

When it comes to boosting conversions quickly and effectively, A/B testing is the answer.

Also known as split testing, A/B testing is where you make one small change to a web page (or advert, or marketing email) and run both the new and original versions simultaneously, to see which one brings in more customers. Rather than just assuming that a bigger button is better or having “a hunch” that a simpler banner will boost click-throughs, A/B testing provides solid evidence.
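
Most A/B testing tools handle the traffic split for you, but it helps to understand the mechanics. Here's a minimal sketch in Python (the function and experiment names are purely illustrative) of how a tool might deterministically assign each visitor to one of the two versions, so that a returning visitor always sees the same one:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "cta-copy") -> str:
    """Deterministically bucket a visitor into variant A or B.

    Hashing the visitor ID (rather than picking at random on every
    visit) means the same person always sees the same version.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(assign_variant("visitor-42"))  # stable across repeat visits
```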

Here are some examples of really simple A/B tests that can have a surprising impact on your conversions – just remember to only change one thing at a time, so you can clearly see which options are more effective. (For one way to check whether a difference in results is meaningful rather than just noise, see the sketch after the list.)

  1. Rephrasing your Call to Action (CTA) – is “Buy Now” more effective than “Add to Basket”? Now you’ll know for sure.
  2. Moving your CTA button – you might find more people click on a button that’s higher up the page or is slightly bigger than your current one.
  3. Changing the colour of your CTA button – do your customers see the colour green as “go”, or will red instil urgency into their click-through? Only an A/B test will tell you for sure.
  4. Swapping images – will a photograph perform better than an illustration? Perhaps your customers would rather see a young entrepreneur than a picture of your product – or perhaps not. Play around with images to see how they affect your conversion rates.
  5. Refining your headlines – try using your headline text to address customer pain points or explain the benefits you bring them. Does it make a difference?
  6. Compare subject lines – if you’re running an email campaign, see how a playful subject line plays against an informative one, or whether asking a question yields better results than a short, simple statement.
  7. Revise your copy – this might be a bigger job than some of the others, but it can provide excellent returns. Can you make your text longer? Shorter? Punchier? More problem-focused? Chattier?
  8. Simplify your navigation – if your customers are getting distracted or confused by pages within pages, try changing the location of your navigation tools (and possibly the language they use) to make it easier for browsers to hit “buy”.
  9. Shorten your forms – you might think that you need all that data “for marketing”, but if it’s actually putting customers off, is it worth it? Split test long and short forms to see how much business they’re costing you, if any.
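
Once both versions have collected a reasonable amount of traffic, you need to decide whether the difference you're seeing is real or just noise. Any decent testing tool will do this calculation for you, but as a rough sketch of what's happening under the hood, here's a standard two-proportion z-test in Python (the visit and conversion numbers are made up for illustration):

```python
from math import erfc, sqrt

def ab_p_value(conv_a: int, visits_a: int, conv_b: int, visits_b: int) -> float:
    """Two-sided p-value for the difference in conversion rate
    between variant A and variant B (two-proportion z-test)."""
    p_a, p_b = conv_a / visits_a, conv_b / visits_b
    pooled = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    return erfc(abs(z) / sqrt(2))

# "Add to Basket" (A) vs "Buy Now" (B) - illustrative numbers only
p = ab_p_value(conv_a=120, visits_a=2400, conv_b=157, visits_b=2400)
print(f"p-value: {p:.3f}")  # below ~0.05 is conventionally 'significant'
```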

Keep in mind, a single A/B test is unlikely to revolutionise your business. It’s more about incremental gains that slowly but surely help your website to work as hard as it possibly can for you and your customers, making the most out of its excellent ranking and high volume of traffic. For friendly technical advice and more details about different ways in which you can increase your website conversions, give our team a call or leave us a comment below.


Four quick daily Google Analytics checks

Here are four quick daily Google Analytics checks courtesy of Senior SEO Manager Jack Stonehouse. 

Google Analytics is a free service offered by Google that provides insights and data for your website traffic. It has a vast amount of useful information that you can use to ensure your business is on track. It can, however, be quite overwhelming: when you log in, there are hundreds, if not thousands, of reports you can view and create.

Below are a handful of quick and simple checks you can do for your website. I prefer to change my date view to show the last 30 days or so, as this gives me a better trend across the month. I also review each report against the previous 30 days and the same period last year. Comparing year on year will help identify any seasonal trends.

What we are looking out for is any major change that isn’t in line with the normal trend.

1. Channels

The channels report is found by going to Acquisition > All Traffic > Channels. This report covers the main mediums that send you traffic. The key channels are:

  • Organic Search – This is all traffic from search engines (Google, Bing, DuckDuckGo etc.)
  • Paid Search – This is traffic from PPC ads such as Google AdWords
  • Direct – This is traffic from users who type your website domain name straight into the browser; it is also a catch-all for traffic that Google cannot identify, which is simply placed in the Direct bucket
  • Social – This is traffic from social networks such as Facebook, Twitter and Pinterest
  • Referral – This is traffic to your website from other websites
  • Email – This is traffic from Email campaigns

The key stats to check on the channel report are Users, Sessions, Bounce Rate, Conversion Rate and Revenue/Goals (Revenue for E-commerce websites and Goals for all other sites).
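
If you'd rather pull those numbers programmatically than click through the interface, the same comparison can be scripted against the Google Analytics Reporting API (v4). A rough sketch in Python, assuming you have the google-api-python-client and google-auth libraries installed and a service account with read access (the key file path and view ID are placeholders):

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "analytics-key.json",  # placeholder - your service-account key file
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "12345678",  # placeholder - your GA view ID
        # Last 30 days against the previous 30, as described above
        "dateRanges": [
            {"startDate": "30daysAgo", "endDate": "today"},
            {"startDate": "60daysAgo", "endDate": "31daysAgo"},
        ],
        "dimensions": [{"name": "ga:channelGrouping"}],
        "metrics": [
            {"expression": "ga:users"},
            {"expression": "ga:sessions"},
            {"expression": "ga:bounceRate"},
        ],
    }]
}).execute()

# One row per channel; each metric carries a value per date range
for row in response["reports"][0]["data"]["rows"]:
    print(row["dimensions"][0], [m["values"] for m in row["metrics"]])
```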

[Screenshot: the Channels report in Google Analytics]

You can change the data displayed on the graph by using the drop-down menu highlighted in red in the screenshot above.

2. Referrals

The referrals report is found by going to Acquisition > All Traffic > Referrals. This report shows traffic to your website from other websites.

This check is just a quick one to see if there are any new websites driving traffic to your site. If there are, you could potentially contact them to see how you could work together to increase the traffic further. This could be through providing them with content that includes another link back to your website, or, if you have an affiliate system set up, you could ask them to sign up.

3. Landing Pages

The landing pages report is found by going to Behaviour > Site Content > Landing Pages. This report shows which pages users land on when they first come to your website.

It is important to monitor this report for any vast changes in visits. Common causes of an increase in landing page traffic include a social post going viral, a good backlink driving traffic, or a key term starting to rank in a high position in the search engine results pages.

4. Ecommerce Overview

The ecommerce overview report is found by going to Conversions > Ecommerce > Overview. This report is just for sites that sell products and shows key information such as revenue and conversion rate.

You need to enable this report by adding ecommerce tracking to your website. It is also recommended that you add enhanced ecommerce tracking so you get even more in-depth data to review.

Key stats to watch here are any drops in revenue or conversion rate. Common causes include an item going out of stock or an issue with the checkout process.

For more detailed statistics you can also view the following e-commerce reports (once you have enabled enhanced tracking):

  • Shopping Behaviour
  • Checkout Behaviour
  • Product Performance
  • Sales Performance

If you have any questions then please leave a comment below or get in touch with our friendly team here.


Meta description FAQs – 5 things everyone wants to know

We see them every day on Google’s search results pages: meta tags – the title tag that specifies the title of a web page, and the meta description underneath. But what exactly is a meta description? Is it important to have one and how do you write a good one?

These and other questions pop up with regularity among SEO enthusiasts, both beginners and those who should really know better. Let’s take a closer look at the 5 questions everyone seems to be asking to see if we can shed some light.

FAQ 1: What is a meta description?

A meta description is a fundamental part of successful website optimisation. It’s the first snippet of text that you see in search results, below the page title, and should provide a short summary of the page’s content. When you type a search query into Google, the search engine will show the meta description on the results page, including the keyword you used in your search.

Example:

[Image: a meta description shown beneath the page title in Google’s search results]

FAQ 2: Why do you need a meta description?

The short answer is that it’s good SEO practice to have effective meta descriptions on each of your web pages. While Google is adamant that meta descriptions don’t actually affect search engine rankings, they are still an extremely useful tool to help drive traffic to your site.

In fact, the whole point of a meta description is to give the user a good reason to click through to your site. A good description will give an overview of what the page is about and be written in an appealing way so that the user wants to find out more. Given the competitive nature of online marketing, a well written meta description may make all the difference between web traffic going to your page, or to a competitor page.

FAQ 3: How long is a meta description?

In theory, a meta description can be any length but do bear in mind that if it’s too short, the description you give may not be useful enough, and if it’s too long, Google may truncate your snippet. It’s not an exact science and Google likes to increase or decrease the limit now and again while conceding that, in fact, ‘there is no fixed length for snippets. Length varies based on what our systems deem to be most useful.’

At Artemis, our best practice is to write meta descriptions that are just under 155 characters long, making sure the most important messages are communicated with the first 120 characters just in case the displayed snippet does get truncated.
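
If you want to apply that rule of thumb across a whole batch of descriptions, the check is easy to script. A tiny sketch in Python – the 155/120 thresholds simply mirror our in-house practice above, not any official Google limit:

```python
def check_meta_description(description: str,
                           limit: int = 155, safe_zone: int = 120) -> str:
    """Flag descriptions that risk truncation in the search results."""
    n = len(description)
    if n > limit:
        return f"Too long ({n} chars) - likely to be truncated"
    if n > safe_zone:
        return f"OK ({n} chars), but keep key messages in the first {safe_zone}"
    return f"OK ({n} chars)"

print(check_meta_description(
    "We build and optimise websites that turn visitors into customers. "
    "Book a free consultation today."
))
```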

FAQ 4: How do you write a good meta description?

In order to convince someone to click through to your page from organic search results, you only have a short snippet of text to convey the right messages, which is why it’s worth putting in the effort to craft good copy for every unique meta description. Think of it as the equivalent of writing ad copy for a Google AdWords pay-per-click campaign.

While the content of each meta description must accurately reflect the content of the page it points to, it should be written in a compelling fashion to appeal to the reader, generate interest in your page and ultimately increase click through rates.

See if you can tell the difference between a ‘good’ and ‘bad’ meta description in these two examples:

[Images: two example meta descriptions – one ‘good’, one ‘bad’]

FAQ 5: What are some top tips for writing a meta description?

So, how do you put all the above into action when it comes to writing your own meta descriptions? Commercial content writers use these top 5 tips and tricks of the trade to create great copy, including irresistible meta descriptions.

1. Write for people, not for bots – meta descriptions are primarily aimed at the user, not the search engine. While it’s important to include the main keyword(s) in the copy, don’t stuff your description full of them – it looks spammy and will put people off. Instead, make your description informative and easy to read, written by humans and for humans in an effort to get them to engage with the snippet and click through to your site.

2. Include structured content – for product pages in particular, users will be looking for information such as a detailed item description, technical spec, optional extras and, of course, price. It is highly likely that a click through to your website will be triggered by this highly relevant structured content rather than persuasive advertising copy, so make sure it is included in the meta description.

3. Feature rich snippets – you can increase the appeal of your meta descriptions by adding additional information such as star ratings and customer reviews, product details, events and much more, using the latest schema markup code. If you’re not familiar with the concept, ask your SEO consultant to explain how this can enrich the information displayed in search results.

4. Use an active voice – advertising copy is always aimed directly at the readers, with the ultimate intention of getting them to do something (e.g. make a purchase). Rather than providing factual but dull information, write in an active, direct voice using imperatives (‘read this’, ‘click that’), giving clear direction towards clicking on the title tag.

5. Don’t forget the call-to-action – the ultimate aim of your meta description is to drive click-throughs to your site, so the more compelling the reason given to the user to do just that, the more successful your meta description will be. ‘Find out more’, ‘Read our blog’, ‘Book a free consultation’, ‘Shop the sale’, ‘Buy now’ are all important calls to action inviting the user to visit your website for a specific purpose.


Four SEO quick wins every site can implement

In general, SEO is much more about building for the long-term, rather than quick wins. At Artemis we take a measured and engineered approach to optimising sites, but we understand that getting results is what’s truly important. There are some things that almost every website owner with limited time can do that will make a big difference to rankings, performance, and, ultimately, sales. Here are four quick wins that virtually every website can implement today.

1. Optimise your images

Page loading speed is still an important factor in search rankings, and given that there are so many things that anyone can do to improve their page speed, there really is no excuse not to make changes. One of the best things that you can do is to optimise your images and compress their file sizes – you’ll still have stunning images, but they will take a fraction of the time to load.

This is even possible with large banner images, which you might typically think of as being very large file sizes. Take a look at our blog on how to successfully compress large images for a real quick win for your page loading speed.
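
If you'd rather script the compression than use an online tool, the Pillow imaging library in Python can do the same job. A minimal sketch (the filenames are placeholders, and a quality setting of 70–80 is usually visually indistinguishable from the original):

```python
from PIL import Image  # pip install Pillow

img = Image.open("hero-banner.jpg")
# Re-save with a lower JPEG quality; optimize=True squeezes the
# encoding a little further at no visual cost.
img.save("hero-banner-compressed.jpg", "JPEG", quality=75, optimize=True)
```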

2. Make mobile your priority

Mobile search is a big deal. In 2017, mobile accounted for more than 50 per cent of all web traffic generated across the world. And yet the vast majority of marketing and digital staff interact with websites at work through the use of a desktop or laptop. This can give you a skewed perspective on how your site looks and performs.

The important move here is to understand your Analytics data. If your website receives a higher proportion of traffic to its mobile site, then you should be spending more time working on the mobile version, rather than the desktop. To get into the habit, it can be a great idea to have staff spend a whole day of work accessing the site and working exclusively through tablets and smartphones. This can provide a huge insight into how user-friendly the mobile version of your site is, which will give you countless ideas for how to improve the site’s usability.

3. Fix 404s immediately

Site errors can be a huge problem for SEO. For example, a 404 page error is a guaranteed way to increase bounce rate and is a terrible user experience for customers. Unsurprisingly, Google and other search engines see 404 errors as a red flag, and even just a few of them can see your whole site tumbling in the rankings.

Use a site crawling tool such as Screaming Frog regularly to find rogue 404 errors and fix them as soon as possible. You might be surprised at how many you have if you have never taken a look before.
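
A proper crawler is the most thorough option, but you can also spot-check a list of known URLs with a few lines of Python (using the requests library; the URLs here are placeholders):

```python
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/old-blog-post/",
]

for url in urls:
    status = requests.get(url, timeout=10).status_code
    if status == 404:
        print(f"404 found: {url} - fix the link or add a redirect")
```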

4. Refresh your content… even if it performs

We have known for a long time that Google loves to see fresh content going up on a site. But what do you do if you’ve had a page for a while and it’s still doing well, even if the information isn’t quite as good as your competitors’?

It might sound counter-intuitive, but even your top performing content needs to be regularly refreshed. The crucial aspect here is keeping the content as up-to-date and relevant as possible. Google loves to see content that is well received by users and provides answers to their questions. As Google begins to utilise artificial intelligence more and more in its analysis of pages, it will become better at understanding whether your content is actually useful and engaging for your customers. This will increasingly affect rankings. Improving your content now can help to ensure that you maintain or improve your position.


Artificial Intelligence and Search Engines. Part Two: Changing SEO

In Part One of our series on artificial intelligence (AI) and search engines, we looked at how AI is changing the way users search and how search engines function. Welcome to Part Two, where we will be taking a closer look at the ways in which the SEO industry will have to react to AI and what it will mean for your website rankings on Google.

For as long as there have been search engines used by millions of people, there have been advantages for those websites and businesses that have been able to influence the rankings in their favour. As early search engines used relatively crude methods to determine results, it was historically quite easy to optimise a site. However, over the years, algorithms have expanded and become more complex. AI is just the latest factor in this constantly evolving process. And to understand how AI is changing SEO, we first need to understand how SEO has evolved over time.

How SEO has evolved – a brief history

Today, SEO is big business: companies are willing to spend significant portions of their marketing budget to attempt to rank above their competitors in Google’s search results. The earliest recognisable search engines emerged in 1994, and the algorithms they used to rank websites were fairly basic. Factors such as how many times a website used a specific word, and whether that word appeared in the URL and the meta data, were crucial in determining where websites would be placed. This meant that a website owner simply had to ‘stuff’ their pages and their URLs with keywords to rank well – early SEO was simple.

The first major advancement in the intelligence of search came when search engines began to factor backlinks into their algorithms. When Google launched in 1998, it was revolutionary in the way it ranked pages because it looked at the internet as a whole, not just the content on any one website. It was able to see which websites were linking to others, and it recognised these links as an important factor in determining the importance and relevance of a site – akin to the way that a university essay cites sources. However, initially the algorithm worked on a relatively basic premise – the more links a website had pointing towards it, the more valuable and powerful it was deemed to be.


This led to a situation in which websites that wanted to perform well in Google’s search results could continue to utilise keywords, but also boost their site further by building a huge number of links, regardless of where these links came from. For a period of time this was standard practice. However, when Google realised that many businesses were using these sorts of underhand tactics to receive an artificially inflated ranking, it decided to do something about it. This saw the launch of two large-scale updates to Google’s algorithm: Panda, promoting the value of high quality content, and Penguin, punishing sites with large numbers of links coming from poor quality sites.

This is where SEO became a far more complicated and delicate process, and website owners and SEO specialists had to think very carefully about everything they did to a website to ensure it wouldn’t fall foul of the new rules.

A more advanced algorithm

The fallout from Penguin and Panda was enormous, and it indicated that Google was going to be continually refining its algorithm to attempt to make it impossible to manipulate or artificially enhance a website’s position. The next major update, which was known as Hummingbird, focussed on a shift towards natural language.

While webmasters and site owners had become used to using text and content to serve a purpose (to drive sites up the rankings), Hummingbird gave greater preference to sites that used ‘natural’ language. This meant that websites filled with useful and interesting content ranked higher than those that simply contained a good density of relevant keywords.

There is no doubt, then, that Google’s algorithm was evolving and becoming more advanced with each change. But up to this point, the updates all had one thing in common: the ideas behind them were programmed into the algorithm by humans. This changed with the deployment of RankBrain.

The rise of RankBrain

Google began using RankBrain as a factor in its search results in 2015. It is an AI system that is considered to be the third most important ranking factor, behind content and links. RankBrain uses AI to analyse words and phrases that it has never seen before – it can then make a guess at the meaning of a phrase based on similar phrases. This means that it is extremely effective at showing relevant results even if it does not necessarily understand the query.

As search has become more conversational, taking the form of long-tail, complicated questions, this AI is designed to help the algorithm translate those questions into something it can understand and provide search results for. Data from previous search queries is fed into RankBrain, and it uses this data to learn how connections are made between topics. It is also able to spot patterns between searches that might appear unconnected.

This is one of the first examples of AI being used to improve search results, but it raises a question: how should website owners optimise their sites for an algorithm that is learning by itself?

This is good news for SEO!

It might seem as if the addition of AI to Google’s algorithm spells trouble for those in SEO – after all, as AI learns more about websites and what kind of content a user is searching for when they use a particular search query, it becomes much harder to manipulate or influence the system in any way. However, on closer inspection, this is actually excellent news for SEO – or, more specifically, those businesses using ‘white hat’ SEO techniques.

Reputable SEO agencies and experienced professionals already know the steps they need to take to ensure not only that a site will rank well in search results, but also that it won’t fall foul of algorithmic penalties: focus on creating the best possible content and earning the links that the website deserves.

However, it has always been frustrating for white hat SEOs when they can see competitors utilising black hat techniques and getting results without being punished. Not only will AI reward reputable and genuine SEO, it will make it easier for search engines to spot poor practice. AI is definitely bad news for those agencies and companies using underhand methods to artificially inflate their rankings.


What this means for content creation and link building

Let’s take a look at what Google itself describes as the two most important ranking factors in its algorithm: content and links. Both will be affected by the rise of AI.

For example, Google’s AI is becoming better at recognising the difference between genuinely high quality content and merely average, non-duplicated text. This means that those websites that create the best possible content – content that is genuinely useful and interesting to their audience – will see the rewards. This effectively means that the best advice is to carry on with the same plan that Google has been recommending for a long time: create amazing content that answers the questions of your audience and provides value to the reader.

In terms of links, things have moved on dramatically from the early days when a link from any site would do. And yet, links remain a vital aspect of determining the quality of a website. This means that websites that focus on gaining strong, earned links from powerful and relevant sites will continue to see a benefit.

Additionally, as we have seen with RankBrain, Google is getting better at understanding search intent – what the user is trying to achieve with their search term. This comes from the AI being able to more clearly understand what a user means when they type in a query or use voice search. From an SEO perspective, you can take advantage of this by tracking how visitors use your site and drawing conclusions from the behaviour of those who convert.

How to prepare your site for AI

So, what should you do in order to prepare your website for the increasing use of AI in Google’s algorithms? The truth is that AI itself will not make any changes to the way that the algorithm operates – nor will it change Google’s priorities. The use of AI is to make it easier for Google to meet its main goal: providing the best possible search results for its users.

This means that to prepare for AI you simply need to follow the same advice that Google has been suggesting for a long time. Firstly, create the best possible content that is going to be genuinely useful and informative for the user; never has the phrase ‘content is king’ been more relevant.

Remember, additionally, that AI is constantly getting better at understanding natural, conversational language. This means that when you create your content you must always do it with a human reader in mind. Gone are the days when you could trick the search engine with content that was ‘optimised’ – if Google spots content that looks forced or unnatural, it will be able to tell the difference.

You also need to ensure that you are earning your links. With the help of AI, Google is getting better at noticing patterns and trends. So if you are still engaging in the practice of buying links this is something that Google will notice, more so than ever before.

Finally, it is more important than ever to stay up to date with what search engines are looking for from sites. As Google and others increasingly utilise AI, they will become more capable than ever of enforcing their algorithms. So it is vital to stay ahead of the game. Working with experienced SEO professionals is crucial, as the development of AI in search is fast and it can be easy to get left behind without expert advice.


We hope you have enjoyed our series on artificial intelligence and search engines. This is still very much an emerging field and an exciting part of the future of SEO. Please check back to the Artemis blog regularly, as we will be updating our content with further specific developments in AI and search engines, as well as providing insight into all areas of SEO best practice.

And if your business could benefit from our SEO expertise please don’t hesitate to get in contact with us today.



The big picture – what to do with bigger images


In one of our recent blog posts, we took a look at image optimisation and some great online tools to reduce the file size of your images. In that post, we suggested that the ideal file size for images should be 100kb or below. This is because big images with large file sizes can slow down your website, which is a key factor for site optimisation – as well as keeping users from leaving your site to go somewhere else.

However, getting your image to this file size isn’t always achievable. In fact, there are some cases where you might not want to lower your image to 100kb at all. If you’ve got a big hero image that needs adding, then it is more than likely that reducing that larger image to 100kb will lower the overall quality of the picture.

When should images be 100kb or less?

So, you might now be wondering when you should actually aim to get your image to 100kb or less. Imagine you are working on a service page for your website, taking our previous blog post about image compression as an example. If you are putting an image in a content area like this, it doesn’t need to be very big. This includes the actual dimensions of the image, as well as the file size itself.

Say you have taken a great photo of one of your recent projects, and you want to include a preview of that on your service page. The photo itself probably looks fantastic, but you don’t need to squeeze print-quality images from an HD camera into a small section like this. Take a look at the file size and the image dimensions before you upload it to your page. You can check this easily by right-clicking and going to the properties and details section of the image. If it is really big, try to reduce it to fit the area where it’s going to be placed. If you don’t have Photoshop, you can use a free tool like Pixlr.com or GIMP.
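
If you prefer to script that check, here's a small sketch using Python's Pillow library – the filename and the 800-pixel target are purely illustrative:

```python
import os
from PIL import Image  # pip install Pillow

path = "project-photo.jpg"
img = Image.open(path)
print(f"{img.width} x {img.height}px, {os.path.getsize(path) / 1024:.0f}kb")

# Shrink to fit the content area, preserving the aspect ratio
img.thumbnail((800, 800))
img.save("project-photo-web.jpg", quality=80, optimize=True)
```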

When should images be over 100kb?

Again, you should always aim to keep your file sizes low. In some cases however, this is not an option – and as mentioned, reducing the file size can have an impact on the quality of the image. If you are adding a big hero image, or maybe a gallery of photos to show off a recent project, then the image dimensions and file size are naturally going to be bigger. Still, you don’t need to upload images that are several MB in size. This is going to have a notable impact on your site speed if you do.

Try some of the tools from our previous post to see if you can get your file sizes down. There are a lot of great, free tools out there on the internet that can reduce your file sizes. Some of them also include a handy feature that lets you preview the image as you adjust the size. This way, you can see whether the image quality is getting too low for your liking. You should also think about the dimensions of the picture you want to use. Does it fit the section you’re trying to put it in? Does it need to be several thousand pixels wide by several thousand pixels high? Reducing the dimensions can also help to reduce the file size.

Be smart

In summary, you just have to think about what’s best for the image that you’re uploading. Try to get the file size down as much as you can, but keep an eye on the quality of the image. See if you can reduce the image to an appropriate size. WordPress and other content management systems might automatically scale down your images on the page itself, but the browser will still be loading the full-size image. A small image within your text doesn’t need to be several MB in size, and neither does a larger image.

If you would like more advice on image optimisation, or any other SEO concerns that you might have, don’t hesitate to get in touch with us today.



How to perform a content audit

Performing a content audit can be extremely valuable. It can help you to improve your website and plan marketing activity, and it is something that almost every business could benefit from. Here at Artemis we regularly carry out content audits for our clients – if you are interested in having one conducted on your site by professional content and SEO specialists, please get in contact with our team today. In this blog we look at some of the benefits of content audits and how you can carry one out for yourself.

What is a content audit and what is its purpose?

A content audit takes a look at all of the content on a website to assess its strengths, weaknesses and performance. It is an evaluation of data and key performance indicators (KPIs) to help you to understand how well content is doing the job it is intended for, as well as gaining insight into how content could be improved and to guide potential new content creation in the future.

More than just an inventory of the current content on a site, a good content audit establishes the performance of all aspects of content and helps to guide future marketing activity.

Understand your goals

To get as much as possible out of a content audit, it is first important to understand why you are performing it and to establish the goals you are hoping to achieve. There are many different reasons to carry out a content audit:

  • SEO – you may be conducting an audit to help you to identify areas of potential improvement for search engine optimisation (SEO). In this case it would be important to focus closely on aspects such as keywords, image optimisation, word count and current page rankings.
  • Content marketing – it could be that you want to gain insight into the success and failures of your content marketing. Here you could take a look at visit metrics, social shares and user behaviour.
  • Conversion rate – you might be most interested in improving the conversion rates across your site – a content audit can help to achieve this too.

Create a spreadsheet listing your content

The first step in the actual auditing process involves finding all of the content on a website. This is where it can be useful to use a crawling tool such as Screaming Frog, as this will find all of the URLs associated with a site and provide them as a list, along with helpfully listing many of the relevant details about a page – such as its word count, headers and more. Many of these tools allow you to export the list in full, so this can allow you to easily create a spreadsheet with the content details you will be needing.

A more time consuming process could be to manually enter all of the pages and their details into a spreadsheet. Clearly for larger websites this would be impractical, but it might be possible if you are auditing a smaller site.

Analyse your data

Gathering relevant data is also an important aspect of your content audit. You will need to utilise various tools to pull in key facts. As discussed above, this will depend on the goals of your content audit, but you may wish to get data such as the last time the page was updated, how the page ranks on Google and how many conversions or goals the page has achieved over a set period.

Once again, how you analyse the data is based entirely on the goals you are trying to achieve from your audit. But as an example, if you are looking at the conversion rates of your content you might be able to look at key metrics such as average time on page, bounce rate and completion of goals.

You can then see which pages are doing well, and which need improvement. It might be prudent to arrange the pages by those which get the most clicks, so that you can focus your future content work on the areas of the site that are most active but convert at the lowest rate.
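
As an illustration, if your crawl export and analytics data have been combined into a single spreadsheet, a few lines of Python with pandas can surface those busy-but-underperforming pages (the file and column names here are hypothetical):

```python
import pandas as pd

audit = pd.read_csv("content-audit.csv")  # columns: url, sessions, conversion_rate

priorities = (
    audit.sort_values("sessions", ascending=False)
         .query("conversion_rate < 0.01")  # busy pages converting under 1%
         .head(20)
)
print(priorities[["url", "sessions", "conversion_rate"]])
```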

Look at the competition

You can take your audit further than the current content on your site by examining the content of your competitors, as well as the most popular content in your subject area. Tools like Buzzsumo allow you to explore content in a niche to understand which pieces are the most successful. No matter why you are carrying out your content audit, it is always beneficial to understand exactly what your audience is looking for.

If you would like to learn more about content audits or you are interested in having one carried out, please contact our experienced team today.



How to turn visitors into customers with Hotjar

There are many ways that you can optimise your website to improve its position in search rankings. But for some businesses, the real challenge is turning visitors to the site into customers. To do so it is important to find the areas on your site that could potentially be used more effectively and generate more conversions from the traffic you are currently receiving.

Hotjar is an increasingly popular tool that can be used to help websites generate more enquiries or leads. It does this by collecting user data and feedback which enables you to fully understand where the web and mobile traffic is focussed on your site, and how you can benefit from it.

Whether your website is used for ecommerce or for referrals, you need to know which pages or actions can create the most leads. Hotjar uses tools like heatmaps and recordings to help you better understand and manage your website. But optimising a website for an increase in traffic or conversions can be tricky. Here are our quick top tips on how to use Hotjar effectively to increase conversion rates and turn your visitors into customers.

How heatmaps work

Hotjar’s heatmap feature essentially monitors a user’s movements and engagement across a website or on specific pages. Heatmaps are capable of showing users’ behaviour on a page and are useful for understanding where your visitors are clicking and how far they are willing to scroll down a page for more information.

It is important to identify where your users click most often, and how you can turn that behaviour into leads or improvements elsewhere on the page. This can work really well for a ‘Contact’ page you’re looking to improve, whether by looking at how your form is displayed or at whether your buttons and other elements are creating a potential barrier for users.

[Image: Hotjar heatmap showing where users click on the Artemis homepage]
In the image above you will see an example of how users behave on the Artemis homepage. Interestingly, the majority of clicks received on the homepage are on the ‘About Us’ and ‘Contact Us’ tabs. A heatmap can also show the areas that your visitors are clicking on rarely or not at all, giving you insight into what you need to focus on and how you can improve elements such as call-to-action buttons, forms or general content.

[Image: Hotjar heatmap showing mouse movement across the Artemis homepage]
However, heatmaps are capable of more. The image above shows the movements of your visitors across the page and what users are most drawn to. Comparing mouse movements against the clicks from the previous image can again indicate how to gain leads from certain elements such as buttons, forms and opt-ins.

Record targeted pages

One of Hotjar’s most powerful features is recordings. The recording feature can enable you to see how visitors are interacting with your website. It does this by collecting and storing visitor session data and actively records a user’s movements.

It’s not as scary as it sounds… Each event is tracked as a different session and allows you to play it back and watch how visitors are behaving on your website and which pages they are going to. Through analysing the data you have collected to see how users are interacting with the site, you can essentially build or change the site around them.

It is important to understand how a user navigates around a website: you can see your visitors’ journeys and how they digest content along the way, which is especially valuable for an ecommerce site where the focus is on the products you are trying to sell.

How does recording pages help me?

Recordings can help you answer a large number of questions on the user experience and usability of your site, such as:

  • What barriers exist on my website? And how can I fix them?
  • What is driving people to convert on my page?
  • Are users ignoring my CTA buttons? Are they even seeing them?

It’s important to define the changes you need to make to a page, even if they only bring a small increase in clicks – especially on a targeted page such as a contact page. It means you can build or change the website around your visitors and improve the user experience, especially if a barrier is blocking their way to another page. It is great to understand these challenges and how you can fix them.

And even better is the fact that you are not limited: you can record both static and dynamic pages, along with shopping carts and logged-in areas.

Understanding how Hotjar forms work

When you have contact forms on your website, it is important to check whether you are actually receiving any conversions or interactions for what you have featured on a targeted page.

Hotjar Form Reports can provide you with an in-depth view of how each of your forms is collecting data and whether it is converting traffic. Sessions are collected for each form, and you will be able to analyse how long each user spends on a field, or whether they abandon the contact form altogether. It is good to understand how users interact with a form and whether there are any challenges they are facing (e.g. a ‘Submit’ button not working correctly).

[Image: Hotjar form report showing sessions, drop-off and interactions per field]
In the image above you can see the rate of sessions and drop-off, along with interactions for each field, making it easier to understand how you can improve a form and see what barriers your visitors could be facing.

Hotjar is becoming an increasingly popular tool that can help a website generate more enquiries and lead to more conversions. Artemis have many years of digital marketing experience and can provide expert help to enable your business to reach a wide yet targeted audience of new and existing customers. Contact us for a free Hotjar consultation.



Crawlability and the basics of SEO

It’s an ever-elusive acronym: SEO. But what is search engine optimisation? In truth, it’s probably the case that no-one truly understands how search engines evaluate internet content. It’s an ever-evolving game, and the rules are continually being redefined – particularly in our new AI-dominated search world.

Even those at Google sometimes struggle to comprehend how their algorithms index a piece of content, especially when there are more than 200 ranking factors. SEO brings with it new ideas, new knowledge and new concepts. Google’s AI bots decide what content to show for which search query; it’s just a matter of understanding the language used to communicate this content across the internet.

To understand SEO, we need to first understand crawlability.

What is crawlability?

Before Google can index a piece of content, it must first be given access to it so that Google’s crawlers (or spiders) – the bots that scan content on a webpage – can determine its place in the search engine results pages (SERPs). If Google’s algorithms cannot find your content, it cannot list it.

Think about a time before the internet. We had listing services like the Yellow Pages. A person could choose to list their phone number for others to find, or choose not to list a number and remain unknown. It’s the same concept on Google. Your web page (whether that’s a blog post or otherwise) must grant crawlers permission so it can be indexed.

Robots.txt files: how do they work?

The internet uses a text file called robots.txt. It’s the standard that crawlers live by, and it outlines the permissions a crawler has on a website (i.e. what it can and cannot scan). Robots.txt is part of the Robots Exclusion Protocol (REP), a group of web standards that regulate how robot crawlers can access the internet.

Want an example? Type a website URL into your browser’s address bar and add ‘/robots.txt’ at the end. You should find yourself with a text file that outlines the permissions a crawler has on that website. For example, here is Facebook’s robots.txt file:

[Image: an excerpt from Facebook’s robots.txt file]

So what we see here is that Bingbot (the crawler used by Bing) cannot access any URL containing ‘/photo.php’. This means that Bing cannot index users’ Facebook photos on its SERPs, unless those photos are accessible at a different path.

By understanding robots.txt files, you can begin to comprehend the first stage a crawler (also called a spider, or Googlebot in Google’s case) goes through to index your website. So, here’s an exercise for you:

Go to your website, find your robots.txt file and become familiar with what you do and don’t allow crawlers to do. Here’s some terminology so you can follow along, with a short sketch after the list showing how to test these rules programmatically:

  • User-agent: The specific web crawler to which you’re giving crawl instructions (usually a search engine).
  • Disallow: The command used to tell a user-agent not to crawl a particular URL.
  • Allow (Only applicable for Googlebot. Other search engines have different variations of bots and consequently, different commands): The command to tell Googlebot it can access a page or subfolder even though its parent page or subfolder may be disallowed.
  • Crawl-delay: How many seconds a crawler should wait before loading and crawling page content. Note that Googlebot does not acknowledge this command.
  • Sitemap: Used to call out the location of any XML sitemaps associated with this URL.
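
To see these rules in action, Python’s standard library includes a robots.txt parser. Here’s a minimal sketch using the Bingbot example from above (the rules are a simplified excerpt, not Facebook’s full file):

```python
from urllib.robotparser import RobotFileParser

rules = """
User-agent: Bingbot
Disallow: /photo.php
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Bingbot", "/photo.php"))    # False - disallowed
print(rp.can_fetch("Googlebot", "/photo.php"))  # True - no rule applies to it
```

You can also point the parser at a live file with rp.set_url('https://www.example.com/robots.txt') followed by rp.read().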

Crawlers (spiders): what do they look for?

A crawler is looking for specific technical factors on a web page to determine what the content is about and how valuable that content is. When a crawler enters a site, the first thing it does is read the robots.txt file to understand its permissions. Once it has permission to crawl a web page, it then looks at:

  • HTTP headers (Hypertext Transfer Protocol): HTTP headers carry information about the requested page, the server, the viewer’s browser and more.
  • Meta tags: these are snippets of text that describe what a web page is about, much like the synopsis of a book.
  • Page titles and headings: H1 and H2 tags are read before the body copy is, so crawlers get an early sense of what the content is about from these.
  • Images: images come with alt text, a short descriptor telling crawlers what the image is and how it relates to the content.
  • Body text: of course, crawlers will read your body copy to help them understand what a web page is all about.

With this information, a crawler can build a picture about what a piece of content is saying and how valuable it is to a real human reading it.

But here’s the thing…

There are more than 200 ranking factors that a crawler will consider. It’s a complicated process, but so long as your technical checks are in place, you have a great chance of ranking in the SERPs. Backlinks, for example, are extremely important to determine how authoritative a piece of content is, as is the overall domain authority.

At its core, SEO is about ensuring your content has the correct technical checks in place. It’s about making sure you give crawlers permission in the robots.txt file, that a crawler can easily understand your meta tags, that your page headings are clear and relate to the body copy, and that what you provide your readers is valuable and worth reading. And this last point is quite possibly the most important: value is everything. Because let’s face it, if an algorithm isn’t going to read your content, a human certainly won’t.



4 quick tips to speed up your website

Your website might have an amazing design, be well optimised for conversions and get good levels of traffic but there is one issue that could make all your hard work go to waste: your site speed is too slow.

The time it takes for your website to load is now one of the most important factors that affects how well it converts. With the industry focus having shifted firmly onto mobile search by users who have little patience and demand near instant search results, a load time of 5 seconds can result in up to 25% of your search traffic bouncing and going to one of your competitors instead.

And we don’t want that, do we?

The good news is that there are some pretty straightforward things you can do to speed up your website. However, before you start tinkering, it’s a good idea to benchmark your current load time first. There are many tools available to help you do this – we recommend using GTmetrix, Pingdom Tools or Google’s own tool PageSpeed Insights.
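
PageSpeed Insights also has a public API, which is handy if you want to record a benchmark on a schedule rather than by hand. A rough sketch in Python (the site URL is a placeholder, and for regular use Google asks you to attach an API key via the key parameter):

```python
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
result = requests.get(API, params={
    "url": "https://www.example.com/",
    "strategy": "mobile",  # or "desktop"
}).json()

score = result["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Performance score: {score * 100:.0f}/100")
```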

Once you have established your benchmark, take a look at these 4 simple ways of making the necessary improvements to your site. In our advice, we’ve focused mainly on WordPress websites but you should be able to implement these solutions on most websites.

1 – Get the right web hosting service

Let’s start with the basics. If your web hosting is poor, then frankly none of the tips mentioned below are going to make any difference. This is a key area to get right from day one – but what is ‘right’? There are so many different types of web hosting – shared hosting, reseller hosting, VPS hosting, dedicated hosting – with some services costing as little as 99p while others will set you back over £100 a month. Which to choose?

When it comes to hosting, the old adage ‘you get what you pay for’ couldn’t be more appropriate. Cheap 99p-type deals will most likely be on second-hand servers with thousands of other websites hosted alongside yours. While this may be sufficient for, say, a small blog that gets a handful of visits per month, the server won’t cope with higher traffic levels and your website will crash.

As an absolute minimum, a website that is used to advertise a service should be hosted on a VPS (virtual private server) to give you more control over the hosting. With ecommerce websites, it’s important that they’re hosted on a dedicated service designed to deal with large volumes of traffic and that is secure enough to handle payment transactions.

2 – Optimise your images for web

When a website is built, it is best practice to upload any images in the required size, i.e. the size that will actually be displayed on the site. However, this doesn’t always happen. Often, the developer will upload images in whatever size they’ve been supplied, perhaps scaling them to fit using CSS. This is far from ideal since large image files (1MB+) can seriously slow down your page load speed.

WordPress does a pretty good job at resizing, and of course there’s always Photoshop. In addition, there are a number of free online tools you can use to compress images, such as TinyJPG, which allows you to upload up to 20 images at a time and gives you the optimised images as a download, ready to use on your site.

Also available are plug-ins such as wp-smushit but these won’t give you as much control as resizing manually, and if you’re not happy with the image quality you’ll have to restore a back-up and re-upload from scratch, which is neither user friendly nor time efficient.

3 – Implement browser caching

Caching is a way of telling your browser to store certain elements of the website, such as image and CSS files, so they don’t have to be loaded every time. Implementing this is probably one of the quickest ways to improve your site load time.

There are literally hundreds of WordPress plug-ins to help you do this, the most popular being W3 Total Cache. It works straight out of the box, and you can also tweak the settings to enable more advanced caching, such as minifying CSS and JavaScript.

Some servers offer settings such as gzip compression and other caching options, but these vary depending on the server type, operating system and web host. It is certainly worth contacting your host to ask about any additional settings they may be able to activate for you.
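
You can verify what your server is actually sending back with a quick look at the response headers. A minimal sketch in Python using the requests library (the domain is a placeholder):

```python
import requests

r = requests.get("https://www.example.com/")
# A long max-age or far-future Expires means browser caching is set up;
# 'gzip' (or 'br') in Content-Encoding means compression is on.
for header in ("Cache-Control", "Expires", "Content-Encoding"):
    print(f"{header}: {r.headers.get(header, '(not set)')}")
```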

4 – Maintain your WordPress plug-ins

It should go without saying that any WordPress plug-ins that you use should be kept up to date at all times. No doubt you are aware that any failure to do so puts your website at risk of being hacked. But did you also know that old plug-ins using outdated scripts can lead to your site slowing down?

What’s more, unused plug-ins in WordPress will sometimes still load, and may use the database, even if they’ve been disabled. A by-product of this is that your site will take longer to load. Make sure that any plug-ins that you don’t use are completely deleted from the website.

At Artemis, we’ve been helping businesses to get the most out of their websites since 2004. From local campaigns for small companies through to global ecommerce sites for international brands, our capable SEO team is fully focused on achieving tangible, measurable results for each client. Why not get in touch to see what we can do for your website?