Many businesses lack a diverse range of images on their Google local listings, yet photos are an essential element of a well-optimised Google My Business listing. Google's algorithm looks favourably on a business with a range of well-optimised photos. It's so important that GMB Insights will even tell you how your photos perform against competitors in your local area.
Your image dashboard can be found by logging into your GMB profile and clicking on photos in the menu.
You will first be asked to add a Profile, Logo and Cover Photo to your local listing. Your profile photo and logo should be different images, each measuring 250 x 250 pixels. The best dimensions for your cover photo are 2120 x 1192 pixels.
Once you have uploaded these, click the ‘What are these?’ link. This will allow you to select the photo you recommend to show first on Google Maps and Search. Make sure to keep track of which image is being pulled through on Search in your business knowledge box, as Google will typically show the latest image you uploaded.
For best practice, here are Google’s guidelines on what size your images should be:
JPG or PNG
Between 10 KB and 5 MB in size
At a minimum 720px tall and 720px wide
Your photo should represent reality and be well lit. Do not make excessive alterations.
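To make those guidelines concrete, here is a quick sketch of a pre-upload check. The thresholds mirror the list above; the function name and structure are purely our own illustration, not anything Google provides:

```python
# Illustrative pre-upload check against Google's stated photo guidelines.
# The thresholds come from the list above; everything else is a sketch.

MIN_BYTES = 10 * 1024          # 10 KB minimum
MAX_BYTES = 5 * 1024 * 1024    # 5 MB maximum
MIN_EDGE_PX = 720              # minimum width and height
ALLOWED_FORMATS = {"jpg", "jpeg", "png"}

def photo_issues(fmt, size_bytes, width_px, height_px):
    """Return a list of guideline violations (an empty list means it looks fine)."""
    issues = []
    if fmt.lower() not in ALLOWED_FORMATS:
        issues.append("format should be JPG or PNG")
    if size_bytes < MIN_BYTES:
        issues.append("file is under 10 KB")
    if size_bytes > MAX_BYTES:
        issues.append("file is over 5 MB")
    if width_px < MIN_EDGE_PX or height_px < MIN_EDGE_PX:
        issues.append("image should be at least 720 x 720 pixels")
    return issues

# Example: a 2 MB, 1080 x 1080 JPG passes; a tiny 500 x 500 PNG does not.
print(photo_issues("jpg", 2 * 1024 * 1024, 1080, 1080))   # []
print(photo_issues("png", 4096, 500, 500))
```

Running a check like this before uploading saves you finding out later that Google has rejected or downgraded an image.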
At a basic level, depending on the business category you have chosen, you will have 5 image categories which you are encouraged to fill out. These usually include:
Interior Photos (of your office or shop)
Exterior Photos (of your office or shop)
Photos at Work
You should add a minimum of three photos into each segment.
Don’t stop there
Once you have uploaded your well optimised and real life images, you need to ensure you manage and check your photos and insights.
Periodically adding more images to these categories is a great way of keeping your profile active, a necessary strategy for ranking well, as Google suggests ‘businesses with recent photos typically receive more clicks to their websites’.
In December 2016, Google My Business added insights for photos. Keep track of your GMB Photo Insights: you’ll be surprised just how many of your photos have been viewed. Most importantly, keep track of your competition in the ‘photo views’ section. You can also view your photo quantity in comparison with businesses like yours, a great metric for judging whether you should be adding more images to your profile.
As we mentioned before, you should keep track of which image Google is showing on Maps and Search, as it is subject to change to the latest image uploaded.
Beware! Anyone can add photos to a business’s location and those photos are likely to display on Search. If it’s a malicious attack, the images are unlikely to have anything to do with your business. In this case, you will need to flag the images to Google, which can take a few days to resolve. Click to enlarge the image and look for the flag. We recommend flagging the image from at least three different accounts.
In May, Google announced that Google Search Console could be more deeply integrated with Google Analytics. But what exactly does this mean, what insights will it give and how do you enable this feature?
Search Console is a free service offered by Google that helps website owners and marketers manage and monitor how they appear in Google organic search results. Google Analytics focuses on the data that the traffic creates once it has reached your website.
Search Console allows you to analyse a website’s performance in Google search. It shows you data on Total Impressions, Clicks, CTR and Average Position for the keyword phrases that the website is ranking for. These phrases may not have been identified as phrases to target but could still be driving significant traffic to your website.
Anyone wishing to analyse, understand and improve organic traffic from Google will be interested in this update. Essentially, the Search Engine Optimisation reports in Analytics have been replaced with a Search Console section. The new reports combine Search Console and Analytics metrics, allowing you to see the data for organic search traffic from both in one report.
What do the reports show?
The reports pull in the following data from Search Console – Impressions, Clicks, CTR and Average Position and the following from Analytics – Sessions, Bounce Rate, Pages/Sessions, Goals/Ecommerce, Conversion Rate, Transactions and Revenue. For the first time this data appears side by side.
There are 4 new reports – Landing Pages, Countries, Devices and Queries – which are found in Analytics under Acquisition.
Landing Pages Report
Each landing page appears as a separate row within the report, allowing you to see at a glance how organic search traffic performs for that specific page, how visitors reached the website and what they did when they got there.
What does it all mean?
It means greater actionable insight into the performance of a website for organic search results. The landing page report joins acquisition data with behaviour and conversion data. You can therefore see at landing page level how many clicks, the average position, bounce rate and conversion rate that page gets.
Let’s say, for example, you had an optimised landing page for pink girls’ bikes – mymadeupsite.co.uk/pink-girls-bikes – with a form set up as a goal. You would be able to see the keywords that had driven traffic to that landing page and, at a rolled-up level, what happened to the visitors when they were on the site. Did they bounce? Did they navigate further into the website? Did they convert? These insights create actions to better optimise the landing page.
This report allows you to deep dive into the devices – desktop, mobile and tablet – and how users on each arrive at and navigate your website. You can see at a glance the comparison between Click Through Rates (CTRs) and Goal Conversions for desktop, mobile and tablet, and the landing pages and search queries behind them. This is incredibly valuable data. Back to Pink Girls Bikes: you might see that conversion (remember, a form was set up as a goal) is better on desktop and mobile than on tablet. This might mean you review how the form looks or is set up for a tablet user to help improve that conversion rate. You might also notice that some landing pages perform better on mobile than desktop and therefore look at why that is.
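As a rough illustration of that device comparison, the arithmetic behind the report looks something like this. All the figures below are invented for the Pink Girls Bikes example, not real report data:

```python
# Invented figures for the Pink Girls Bikes example - the point is the
# per-device CTR and goal-conversion calculation the report surfaces.

device_stats = {
    "desktop": {"clicks": 400, "impressions": 8000, "goals": 60, "sessions": 500},
    "mobile":  {"clicks": 350, "impressions": 9000, "goals": 45, "sessions": 450},
    "tablet":  {"clicks": 120, "impressions": 3000, "goals": 3,  "sessions": 150},
}

for device, s in device_stats.items():
    ctr = s["clicks"] / s["impressions"] * 100      # Search Console side
    conv = s["goals"] / s["sessions"] * 100         # Analytics side
    print(f"{device}: CTR {ctr:.1f}%, goal conversion {conv:.1f}%")
```

With numbers like these, tablet’s goal conversion stands out as far lower than desktop and mobile, which is exactly the prompt to go and review how the form behaves on a tablet.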
This all sounds great but how do I enable it?
You will need to link your Search Console and Analytics properties through Analytics.
Step 1: Navigate in Analytics to Acquisition > Search Console, where there are 4 reports – Landing Pages, Countries, Devices and Queries. Select one of them and select “Set up Search Console data sharing”.
Step 2: Select “Property Settings”.
Step 3: Scroll to the bottom of the page and select “Adjust Search Console”.
Step 4: Select the site to be linked, Save, and select “Add a site to Search Console”.
Step 5: Start gaining valuable insights.
In summary integrating Search Console with Analytics will enable a deeper understanding of search data from beginning to end and enable actionable insights such as:
Understanding the search queries that are ranking well for each organic landing page rather than for the website as a whole
Examining how desktop, mobile and tablet users find and interact with the website
Improve landing pages in two specific ways:
Improving the landing pages where many users are arriving (high click-through rate and impressions) but not spending time on the website (pages/sessions), immediately exiting (bounce rate) or not converting to a goal (e.g. filling in a contact form).
Improving the search presence of landing pages where the users are navigating further through the website and converting but have a low click through rate.
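Those two buckets can be sketched as a simple rule of thumb. The thresholds below are arbitrary placeholders – in practice you would judge each page against your own site’s averages rather than fixed numbers:

```python
# Sketch of the two improvement buckets described above.
# Thresholds are placeholders, not recommended values.

def improvement_bucket(ctr, impressions, conversion_rate):
    """Classify a landing page by which of the two fixes it needs."""
    high_traffic = ctr > 0.05 and impressions > 1000   # found easily in search
    converts_well = conversion_rate > 0.02             # goal completion is healthy

    if high_traffic and not converts_well:
        return "improve the landing page itself"
    if converts_well and not high_traffic:
        return "improve the page's search presence"
    return "no obvious action from this report alone"

# A page that attracts lots of clicks but rarely converts:
print(improvement_bucket(ctr=0.08, impressions=5000, conversion_rate=0.005))
# A page that converts well but is rarely clicked:
print(improvement_bucket(ctr=0.01, impressions=5000, conversion_rate=0.06))
```

The value of the combined report is precisely that both halves of this decision – the search metrics and the conversion metrics – now sit side by side for each landing page.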
All of these insights should help build a better user experience and in Google’s eyes a better search experience too.
There could well be an extra gift under our Christmas tree this winter from a certain major search engine, with the next Google Penguin update likely to arrive within the next two months. The new real-time Penguin algorithm, version 4.0, is due for release at the end of this year. We’ve been expecting it, going by news from Gary Illyes, the Webmaster Trends Analyst at Google, who said it would be released in 2015.
What we are expecting of the next update is that it will be a real-time version, meaning the algorithm will update continuously. The upcoming release is Penguin 4.0 and, unlike the updates we’ve been told about on many occasions in the past, this one will run continuously rather than on specific release dates. Instead, any spammy links detected will be acted upon by Penguin straight away.
Once spammy links are removed and the Google indexer is aware of this, the affected sites will stop being impacted by Penguin. The news on Penguin 4.0 is very brief at this stage, but it’s intriguing to discover that it will indeed arrive before the end of 2015. So what is real-time Penguin all about?
There’s not much information on the real-time algorithm update just yet but what we do know is that as soon as Google discovers that a link has been removed, the Penguin algorithm will do exactly what it says on the tin – process this in real time. You would therefore be able to recover from any penalty issued by Penguin pretty quickly, although you could also end up with a penalty just as quickly.
Get in Touch
Here at Artemis we stay up-to-date with all the latest happenings at Google to ensure our clients’ websites benefit and traffic continues to increase. Get in touch with us today to find out more.
A technical puzzle posed by a client had me scratching my head for a while. It went a little something like this:
Client: “We would like to show data from our Google Analytics live within the site”
Custom Reporting Dashboards
Monthly reporting to one side, there are a number of ways we can create custom reports within Google Analytics or using third-party tools such as Cyfe (I love Cyfe). The trouble is this either involves providing a public URL, scheduling reports to run periodically and emailing them on set days/dates, or stakeholders logging into Google Analytics and finding the reports you have created.
All very useful yes, but I know very few clients that can actually wrap their heads around Google Analytics never mind dig for data to help them make decisions. So why not create custom dashboards that are part of an internal system, or even nearly live data dashboards that can be accessed at any time?
Then it quickly dawned on me, if it was that hard for me, how hard would it be to try and translate what I have learned to someone else?
It turns out Google had already thought about this and offered the solution to my puzzle at the same time. Enter the “Google Analytics Spreadsheet Add-on”: once you get your head around some of the terminology and are in the swing of configuring the custom reports, you can have a live interactive graph of your data set up within 15 minutes.
Setting up the Google Analytics Spreadsheet Add-on
1 – Login to Google Drive using the same username and password that you use for Google Analytics
2 – Click on New > Google Sheets from the left hand navigation
3 – Click on Add-ons > Get add-ons within the sheet menu
4 – Search for “Google” in the search box and select Google Analytics by clicking the “Free” button
Using the Google Analytics API within Google Sheets
Now that we have the API hooked up to our Google Sheets, we want to start polling it. There is an extensive list of Dimensions and Metrics we can pull from the API, all available here, but for the purposes of this walkthrough we are going to pull some basic stats.
1 – Click on “Add-ons”, select “Google Analytics” and then “Create New Report”
2 – On the right hand side of the sheet a 3 step form will appear with various input options
3 – Give the report a name, select the GA account, property and view you want to pull data from, then select the metric and dimension you want to poll. Clicking within the input fields will reveal option lists to make life a little easier
4 – I have chosen a simple report for Sessions by Source/Medium for this example, click create report when you are ready
5 – You will notice that a new sheet has been created which contains the configuration you just requested. I have also added a sort of “-ga:sessions” and a max results of 3 (top 3, sorted highest to lowest)
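Behind the scenes, that configuration sheet maps onto parameters the add-on sends to the Google Analytics Core Reporting API. As a rough sketch – the view ID below is a placeholder, though the metric, dimension and sort names are genuine API identifiers:

```python
# Rough mapping of the report configured above onto Core Reporting API
# parameters. "ga:12345678" is a placeholder view (profile) ID.

report_config = {
    "ids": "ga:12345678",            # placeholder - your own view ID goes here
    "metrics": "ga:sessions",
    "dimensions": "ga:sourceMedium",
    "sort": "-ga:sessions",          # the minus sign means descending
    "max-results": 3,                # top 3 rows only
    "start-date": "30daysAgo",
    "end-date": "yesterday",
}

for key, value in report_config.items():
    print(f"{key}: {value}")
```

Seen this way, “-ga:sessions” with a max results of 3 simply means “sort the rows from highest to lowest sessions and keep the top three”, i.e. your top three traffic sources.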
Running Reports Using the Google Analytics Add-On
So far so good, pretty pain-free eh? It gets easier again! I am not going to go into too much detail about start and end dates or “Last N Days” here as I wanted to keep it simple, but when you are ready, simply run the report again from the Add-ons menu.
How easy was that! An alert box appears once more to inform you of the status of the report you have just run, and in the background the eagle-eyed will have noticed metrics magically appearing.
Sharing the Report
Now that our data has been generated, we need to present it in a much more readable format and share it. As ever, the inbuilt capabilities of Google Sheets not only make this easy, but the permission-based access also makes it shareable only with the eyes that need to see it. In my case, though, I wanted to make my data publicly available. As with any data in spreadsheets, we look to present it through graphs and charts.
1 – Highlight your data
2 – Within the Sheets menu click “Insert” > “Chart” and select the chart type you want to use to present your data
3 – Click insert
4 – Change the chart title to make sense of your data (click on the title to edit)
5 – Click anywhere on the chart and notice the small drop down arrow, select it and click on “Publish Chart”
6 – From the alert that appears select “Embed” and click publish
Updating the data
There are 2 ways to handle updating the data within the reports you have configured, either manually run the report again and your chart will change, or, and this is the beautiful part, you can schedule the data to automatically update!
Scheduling Updates for Your Interactive Charts
1 – Click “Add-ons” within the Sheets menu, select “Google Analytics” and then “Schedule Reports”
2 – Select the checkbox for “Enable reports to run automatically”
3 – Select when you want your reports to run
4 – Click save
This scheduling flexibility to automatically update our Google Analytics data saves a tonne of time.
So now we have a free way to create custom reporting dashboards from our Google Analytics data, where all we need to do is configure them once and schedule them to update automatically. Other than an iframe snippet there is no scripting involved, no OAuth to worry about, and the inbuilt permissions of Google Sheets take care of the data integrity.
The following videos are more in-depth guides than my simplistic one; what they offer are introductions to working with more dimensions and metrics, custom date ranges and much, much more.
Today’s the day Google introduces a major update that focuses on penalising sites that aren’t mobile friendly. “Mobilegeddon”, “Mobile D-Day”, whatever you want to call it, this is a pretty big deal.
This change to the mobile search algorithm is set to cause some pretty hefty ripples across the web, with websites that are not deemed mobile-friendly likely to witness a dramatic reduction in the number of visits they get.
As always, Artemis are fully aware of all the latest Google updates and algorithm changes and we’re monitoring the situation closely.
Who’s Going To Be Penalised?
It’s being dubbed “Mobilegeddon”, with major organisations including Microsoft, Wikipedia and the European Union likely to be negatively affected by Google’s changes to their search formula. It’s also the first time ever that Sky have reported a Google algorithm change, which further highlights the significance of this particular update.
In truth, anyone with a website will be anxious to find out if they are classed as mobile-friendly and you can do so by entering your web address here. Google’s mobile-friendly test analyses your URL and reports back with the results of whether your site has a mobile-friendly design or not.
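Google’s test weighs many factors, but one quick first-pass check you can run yourself is whether a page declares a mobile viewport. To be clear, this is just a rough heuristic of our own, not how Google decides:

```python
# A very rough first-pass heuristic: mobile-friendly pages almost always
# declare a viewport meta tag. This is NOT Google's test - just a quick
# self-check before running the real mobile-friendly tool.

def has_mobile_viewport(html: str) -> bool:
    html = html.lower()
    return '<meta name="viewport"' in html and "width=device-width" in html

responsive_page = '<head><meta name="viewport" content="width=device-width, initial-scale=1"></head>'
desktop_only_page = "<head><title>Old site</title></head>"

print(has_mobile_viewport(responsive_page))    # True
print(has_mobile_viewport(desktop_only_page))  # False
```

A page can pass this check and still fail Google’s test (tiny fonts, cramped tap targets and so on), so treat it only as a way to spot the most obviously unprepared pages.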
The Outcome For Smaller Businesses
While Google provided site owners and webmasters with a two-month warning about the impending change, it’s highly likely that many small businesses were either unaware of today’s update or unable to finance a mobile-friendly site.
We will certainly see many sites fall away from their original positions in Google’s search rankings, so what does it mean for smaller businesses?
Google have said that “Mobilegeddon” won’t have an impact on “local pack” results.
However, local listings aren’t the only traffic drivers for businesses, which also rely on neighbourhood blog posts and web pages, all of which must now be mobile-friendly.
Both business and non-business websites, which may well have worked extremely hard to brush up on quality content in the past, are likely to suffer.
This begs the question “Is mobile-friendly content more important than highly-trusted content?” It’s certainly up for debate over the coming weeks.
Mobile-Friendliness “One Of Many” Ranking Factors
A Google representative was quoted on the BBC website this morning citing mobile friendliness as “one of many” ranking factors.
“As people increasingly search on their mobile devices, we want to make sure they can find content that’s not only relevant and timely, but also easy to read and interact with on smaller mobile screens” the representative said.
Get In Touch
If you are at all concerned about your site feeling the effects of “Mobilegeddon”, please do not hesitate to get in touch with our team of experts here at Artemis Marketing. We have already helped many of our clients prepare their sites for today’s update and we will do the same for you.
Google’s mobile-friendly algorithm is released on the 21st April, so we’re interested to see what this means as far as SEO is concerned and what else can be expected.
The mobile-friendly algorithm, which Google confirmed will roll out over the course of a week, will be on a page-by-page and real-time basis. So how can you be sure if your page is going to benefit?
This is one hell of an algorithm update as far as Google are concerned, with the impact likely to be much more significant compared to the likes of Panda and Penguin. It’s no surprise that there are plenty of webmasters out there who are sitting anxiously on the edge of their seats in anticipation for the release.
A Google+ hangout that took place on Tuesday brought up the new algorithm update and a number of important questions were answered in the process. Here’s what we found out:
There are no set degrees of mobile friendliness, which means you will be judged on whether you are mobile friendly or you aren’t. It’s as simple as that.
The easiest way to discover whether you are currently mobile friendly is to check live mobile search results and see if your pages have the mobile-friendly label attached. You can also use the mobile-friendly testing tool, which matches live Google search results. Mobile usability reports in Webmaster Tools may be delayed based on crawl time.
The algorithm will take between a few days and a week to roll out on a global scale.
So firstly, the 21st is more of a guess as the algorithm will take a few days or so to roll out. It’s not unheard of for algorithms to take a little longer than expected, so we can’t be too sure on the dates just yet.
You Are or You Aren’t
Now comes the interesting part. This is an on-or-off algorithm that works on a page-by-page basis: you either qualify as mobile friendly and see the benefit, or you don’t. In other words, there are no degrees of mobile friendliness, so it doesn’t matter “how” mobile friendly your pages are.
Some of the criteria mentioned in the Google+ hangout were small font sizes, tap targets being spread out and content being readable within a mobile viewport. However, it was also stated that there were over 200 different factors that will determine whether you are, or indeed aren’t, mobile friendly.
Are You Mobile Friendly?
The easiest way to see if you are mobile friendly is to check for a mobile-friendly label in the live mobile search results. The label confirms Google understands you are mobile-friendly.
When it comes to optimising for keywords, content is widely overlooked despite being the driving factor. It is likely to be the difference between your prospects choosing you ahead of your biggest competitors.
Advertising involves referencing and reminding people that they need to be up to date with the latest products. On the other hand, mass marketing puts us under pressure as we have to appeal to a much wider audience.
It’s impossible to appeal to everyone through short advertising ploys, so I guess that’s why websites with quality content need to exist…
The Wrong Kind of Optimisation – The Dangers and History of Keyword Stuffing
It all comes down to how you utilise your website of course, with many people optimising keywords the wrong way for some years before Google told us about their latest algorithm.
People were insistent on marauding through keyword research data in an attempt to learn every synonym in existence, hoping to use the same fundamental keyword in whatever piece of topical content was being written. This led to problematic, keyword-heavy content.
In some cases written content wouldn’t have anything to do with the keywords being thrown into the mix.
After Google introduced its Freshness algorithm we were seeking the best rewards through consistent uploading of fresh content.
Regular Updates Brought About Results
However, this didn’t necessarily monitor what that content was about, as Google were far more interested in discovering whether or not you were regularly updating your website. This wasn’t the road the SEO industry wanted to pursue, so what was it that changed?
Panda’s Influence on Quality Content
Search marketers went through a prolonged period of frenzied content creation after they realised Google weren’t penalising the quality of their written work whatsoever.
Google made the mistake of presuming search marketers would think it obvious that their content would need to be good even though poor content was working just as well.
Google needed to come forward and clarify the issue and they did so with the introduction of their latest Panda algorithm last year, which, instead of targeting just thin and duplicate content, would target thin and duplicate content that offered little or no value.
Making it to the Top
It’s essential that people realise the importance of detailed, comprehensive and informative content that answers all the questions of their audience, as this could ultimately get them to the top of Google’s search rankings without having to build another link.
Links are still an important part of the marketing process but if you’re consistently in the top 10, you’ve probably got enough links to make it to top spot.
Appearing on other publications that your audience regularly reads or visits is vital, as people won’t do their research on just one site. An example would be having your products reviewed elsewhere by influencers.
Search Is Growing
So when it comes to written content and optimising for keywords, it’s important to remember that your content must be relevant to the users’ intent.
Your internal links will only get you so far if you haven’t yet updated the way you see content, while your external links remain vitally important as, with smartphone ownership going through the roof, more and more people are searching all over the place.
Satisfying as many searchers as possible in one go is what the cleverest SEOs will hope to achieve in 2015, especially if they’re hoping to rank for the biggest terms…
If we take a look back at the good old days when Matt Cutts was still head of web spam at Google, you might remember something he spoke of in March 2011 relating to exception lists, also known as “whitelists”.
Cutts explained that while Google did use whitelists, they existed on a per-algorithm basis. If you caught up with John Mueller’s Google+ webmaster hangout on Tuesday, you might have heard him addressing the issue surrounding exception lists once again.
No Exception Lists for Panda & Penguin
What we found out was that Google uses whitelists for some of its algorithms but Mueller specifically said that there weren’t any exception lists in place for Panda or Penguin.
“For the most part we do not have a whitelist where we can say this web site is okay and we can take it out of the algorithm. For a lot of the general search algorithms, we do not have that ability”.
Why Exception Lists Are Needed
Mueller went on to say that there were some scenarios where they did have whitelists, mostly for the odd individual case. He explained that, with algorithms such as SafeSearch, and for false positives on adult content, they do have that ability.
It’s quite clear that leading sites feel the early effects of any newly introduced algorithm or algorithm change and there’s very little Google can do until the algorithm is refreshed.
If you’re interested to hear what was talked about regarding whitelists, start the video provided below at around the 25-minute mark.
“it wouldn’t be feasible to handle them (search queries) manually”
Matt Cutts was asked by SearchEngineLand’s News Editor Barry Schwartz back in March 2011 whether Google used exception lists and the former head of web spam explained that whitelists were necessary for the odd search algorithms that weren’t 100% perfect.
“Our goal is to provide people with the most relevant answers as quickly as possible, and we do that primarily with the computer algorithms. In our experience, algorithms generate much better results than humans ranking websites page by page. And given the hundreds of millions of queries we get every day, it wouldn’t be feasible to handle them manually anyway”.
Cutts went on to say that ideally, Google didn’t want to maintain their exception lists at all but emphasised the fact that search was “still in its infancy, and our algorithms can’t answer all questions”.
If you’re looking to improve your ranking in Google to help boost the success of your online business, call Artemis today and take advantage of our free consultation service.
I recently came across an article that outlined the impact SEO has on the quality of written content. It was argued that, despite persistent historical criticism stating the opposite, SEO was more than capable of enhancing the calibre of our web-based writing.
The writer sarcastically introduces their article by denouncing the use of SEO within high quality content, citing ‘awkward, keyword-stuffed phrases’ in the deriding opening paragraph.
It seems SEO has received plenty of unmerited condemnation in the past with regards to its effect on quality content. Perhaps it remains an easy target for many an infuriated writer who struggles with the ever-changing demands Google likes to throw at us.
The article I read this week stated that anyone who feels SEO is disrupting their standard of content should be classed as “not a very good writer”, although I find this harsh in the sense that SEO is a rapidly growing and developing industry practically dictating the paths which skilled and ambitious writers must now follow in order to succeed on their website.
We want to know if SEO has changed content writing for the better and whether we should feel despondent about compressing our content with a plethora of regimenting keywords from now on.
As writers, we have to understand that prioritising SEO with keywords is crucial to the success of our website and that Google wants to see as much keyword-ridden content as possible. We spend much of our time acknowledging our own work; concluding that this is exactly what the reader wants and anything else wouldn’t live up to expectations.
What SEO gives us is the opportunity to discover more about our audience and what it is they want to be reading, thus improving our use of vocabulary (as well as keywords) within that specific sector.
We might be using the right words and providing our audience with an excellent source of information that’s plentiful, insightful and appealing to any industry expert but are the writers out there finding it easier to produce high quality content as a result of this?
The point of SEO and Google’s newfound ideology is to have clients, customers, fans and enthusiasts leaving with a superior amount of knowledge and ideally with everything they’d hoped to obtain, but those of us who venture to supermarkets once or twice a week know full well how difficult this is to achieve.
Instead we take something that’s just as appealing, albeit different, or nothing at all. If we can somehow formulate this similarity between websites and supermarkets, there could be a valid reason to support writers who feel oppressed by the often manipulative characteristics of Google algorithms that offer very little in the way of compromise.
Of course, it won’t help a website achieve its potential if we search too vigorously for the opportunity to shun keywords wherever possible. Instead, we need to avoid circumventing and utilise the benefits of meeting SEO requirements. So how do we do this?
SEO Does Generate Ideas
The article I refer to at the start of this post helped me find the origin of what is widely regarded as a seasoned writer’s nightmare.
Generating ideas for content can be debilitating at times and if there’s anything out there that has the potential to blemish a consistent writer’s portfolio, it’s discovering up to the minute topics to write about.
Analytics provides us with a monumental amount of topics covering everything from e-books to on-site content.
It’s now easier than ever to find similar topics using related search results provided by search engines and sufficiently fuel the part of our brains that allows the creative juices to flow.
SEO – Turn It On Its Head
“I’ve just written the best piece of content I think I’ve ever had the pleasure of completing, what with not having all those problematic little keywords to hold me back…”
“Great, how many views does it have?”
It’s important that we don’t embarrass ourselves and forget that anything we do write must have the pleasure of being read. This is probably the clearest reason yet why writers should be embracing SEO instead of uncovering its distinguishable flaws.
We effectively organise content, we learn how to incorporate the right terms and we adopt new styles of writing. We couldn’t just expect someone to stumble across the work we do either.
SEO and Writers Unite
There’s no debate that keywords and SEO feel regimenting from a writer’s point of view. However, what’s taken away from us is given back in the shape of something a whole lot more valuable in the modern world: recognition.
For the traditional writers out there I say this; there’s always something new to learn and written content is fast becoming a huge part of SEO, so embrace it as early as possible and you’ll have forgotten the substance behind your quarrels faster than you can type out a troublesome old keyword.
Google has started work on child-friendly versions of their services, from Chrome to the search site itself. A US-based report states that Google are developing these modifications in an attempt to provide parents with more security when children are surfing the web.
“The big motivator inside the company is everyone is having kids”
Pavni Diwanji, Google’s vice-president of engineering, is leading the project and told USA Today a little more about what to expect from their child-friendly version of the search site, stating that the new modifications would be designed for children up to the age of 12. There is still no news on a release date for these modifications.
Child-Friendly Search Results
One of the examples Ms Diwanji put forward revolved around search results: a term such as “trains” could prioritise child-friendly results such as “Thomas the Tank Engine” when children were using the search site, instead of sending them in the direction of ticket-booking sites.
Improved Online Security for Parents
As well as child-friendly search results, Google are also developing tools that let parents monitor their child’s time surfing the web, including the sites they visit and how much time they are spending online. While there are already Safe Search tools available for parental use, it’s believed Google’s plans will take child safety a whole lot further.
Google may come unstuck at some stage during the development of these site modifications however. This is down to the Children’s Online Privacy Protection Act (Coppa) in the US, which specifies the amount of data that can be collected about children and what it can be used for.