The dust is finally starting to settle on what has been a turbulent six or seven weeks in the SEO world. If you subscribe to AccuRanker’s daily Google status, you will have noticed that barely a day has gone by in the last seven weeks without Google being flagged as ‘grumpy’ or ‘furious’. That basically equates to huge fluctuations in rankings across all markets and all devices, and signals algorithm updates at work.
So, what do we know and what does it mean for SEOs?

Google Possum – a new local algorithm

The first ‘hit’ to the rankings was an update on September 1 which has no official name from Google but which the SEO community has affectionately dubbed Possum. The name comes from the fact that, following the update, many webmasters saw their Google My Business listing disappear from the search results. What was actually happening was that the results were being filtered or hidden, or as the Americans say, playing possum.
The update seemed to impact the snack pack local results – the three results found under the map listing when carrying out a local search query. It appears to have had no direct impact on the organic ‘blue link’ listings, and our data seems to concur with that.
Google displays 3 local search results in a snack pack below a map and above the organic blue links

The impact?
From what we are seeing and from the limited information that has been released or reported on by the SEO community, these seem to be the main impact areas:

1. There has been a boost for businesses that fall outside of the city limits

Traditionally, companies located outside of the city limits have found it near impossible to rank in the local snack pack for related queries, e.g. a hotel in Greenlane ranking for the search query ‘hotels in auckland’. This now seems to have changed, with more and more companies based outside of the main CBD ranking in the snack pack local results.

2. There are greater variances for similar keywords

Within the local snack pack, we are now seeing much more variance for similar keyword searches, e.g. ‘hotels in Auckland’ and ‘Auckland hotels’. Previously, Google would treat these queries in much the same way and rank the same three sites for both. While there are still examples where the same companies rank for a bunch of similar keywords, across the board many SEOs are seeing their sites or clients’ sites ranking for some keywords and not for others in the local snack pack results.
Google now delivers 3 different results for certain queries even with very small variances in the keyword search query

3. Google is now filtering on address and affiliation

This is something we have definitely seen with a number of clients, both positively and negatively. We have seen a lot of businesses filtered out because of a specific address. For example, if there are a lot of similar businesses located within the same area of a city, Google is now filtering out many of these results and showing what it feels is the most relevant at that particular address, rather than multiple companies all based on the same street or even in the same shopping centre. Making your location-based content the most relevant to your audience is the key to ranking well when there are multiple companies in the same sector, fighting for the same keywords, all located in close proximity.

4. The physical location of the searcher is now more important than before

One thing we noticed straight away was a shift in the results we were seeing from our office location north of Auckland when searching for queries such as ‘hotels auckland’ or ‘car rentals auckland’. Previously, we would have been shown what Google felt were the most relevant results from the CBD; now we started to see locations on the North Shore and on the outskirts of the CBD ranking in the local snack pack. We had to carry out physical location testing across the city to find out what was really ranking, and where, when searching in different areas.

Conclusion

Google has been grumpy or furious for nearly two months now

We are still seeing a lot of fluctuations in the keyword rankings that feature in the local snack pack results, which suggests Google may still be tweaking the Possum algorithm and potentially still running some A/B testing. AccuRanker is still reporting ‘grumpy’ updates daily, although this could also be linked to the Google Penguin update that took place later in September (more on that in a second…). The best thing we can recommend is that you focus on your location-specific content – the content you want to rank in that snack pack – and make sure it is better than the content currently ranking in those top three spots below the map. These snack pack rankings have a huge impact on your traffic from local search queries, so getting your own site or your client’s site ranking will see a higher click-through rate and, hopefully, more conversions.

Google Penguin 4.0

Just as webmasters were coming to terms with the Possum update and dealing with any fallout, we got the news that many had been waiting months for: Google was rolling out Penguin 4.0 – or, more accurately, Google rolled Penguin into its core algorithm, meaning no more Penguin versions, just Penguin baked into the main algorithm.
For those who are not aware (where have you been?!), Penguin is Google’s algorithm for tackling spam links, and it has been extremely effective in curing the web of the dodgy link buying tactics that plagued SEO agencies across the world no more than five years ago. The introduction of the Penguin algorithm saw sites that had engaged in dodgy link buying practices removed from the Google search results until they cleaned up their backlink profiles. Historically, a site would only be re-evaluated once the Penguin algorithm itself updated, which could leave you hanging on for months on end. Thankfully, some things have changed now that Penguin is part of the main algorithm.

Let’s take a look at what has changed:

1. No more waiting for Penguin to update

One of the most frustrating things about a Penguin penalty used to be the time taken to recover. Even when you had done your due diligence, removed all the bad links, or disavowed those you couldn’t get removed, you were still beholden to the almighty Google and its timescales. In the case of Penguin 3.0, if you were hit by that particular penalty, you could have been waiting two years for your site to recover, and for many this spelt an abrupt end to their business online.
Now that Penguin is real time, there is no more waiting. If you are hit with a Penguin penalty and manage to resolve your link issues, you can recover quickly. One downside is that, now Penguin is real time, you can’t be 100% sure why your site has been penalised. Previously there was a clear update date to correlate a drop against; now it could be one of a number of issues, including Panda as well as Penguin.

2. Pages rather than whole sites can be impacted

In old Penguin land, if you had dodgy links pointing to a page on your site, the whole site would be penalised and demoted from the rankings. Now however, individual pages can be penalised if there is a particularly high volume of spammy links pointing to that one single page.

3. Links are ‘re-valued’ rather than penalised

This seems to be more hearsay than fact; however, it appears that rather than requiring webmasters to remove all their bad links via manual requests or the disavow file, Google may now simply discount spammy links, reducing the need to manage the disavow file quite so actively. For our part, we will still maintain a thorough auditing process for all clients, as we feel there is no place for these spammy links and it is better to be safe than sorry.
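For anyone still maintaining a disavow file (and we would recommend it), the format Google accepts is a simple plain-text file, uploaded via the Disavow Links tool in Search Console, with one entry per line. The domains below are hypothetical placeholders, purely for illustration:

```text
# Disavow file - uploaded via the Disavow Links tool in Google Search Console
# Lines starting with # are comments and are ignored by Google

# A full URL disavows links from that single page only
http://spam-directory.example.com/listings/paid-link-page.html

# A domain: entry disavows links from every page on that domain
domain:link-farm.example.net
domain:cheap-seo-links.example.org
```

Only use full-page URLs when you are confident the bad links are isolated to specific pages; for the link farms and paid directories that Penguin targets, the `domain:` form is usually the safer choice.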

One thing that could support the final point above is a theory from Moz’s Rand Fishkin, who suggests that throughout all the previous Penguin algo updates, webmasters were encouraged to use the disavow file to identify and eliminate spammy links. This provided Google with a huge database of spammy links, identified by webmasters throughout the world. That data can now be fed back into the core algorithm, where machine learning can determine which links are spammy and which aren’t. This is just Rand’s theory, but it’s certainly one that makes a lot of sense. You can hear more about it in this episode of Whiteboard Friday: https://moz.com/blog/how-real-time-penguin-model-changes-seo-whiteboard-friday

Conclusion

For better or for worse, Penguin is now part of Google’s core algorithm, so there is no going back. In the long run, we think it will be a positive thing for webmasters and for the user experience of the web as a whole, and we have been pleased to see all of our clients receive a very healthy rankings boost following the roll-out of Penguin 4.0, thanks to our ongoing and diligent link auditing.
