Our Best Google Algorithm Update Articles

Another Broad Core Algorithm Update


Hold on to your asses, the SERPs are getting a little bit wild again.

Recently, Google confirmed the roll-out of a broad core algorithm update (which is what the “Medic” update really was) on March 12th. Whether your traffic came crashing down or shot up into space, these broad core algo updates aren’t subtle.

Professional re-enactment of Google releasing a core algo update.

The solution to ranking well again, if your traffic fell? Same as before:

A Glimpse At The Bleak, Post-Apocalyptic Future of SEO

Google Says: Disavowing Bad Links Could Increase Trust


This seems like a pretty important point that most people paying attention to their backlinks should tuck away for the future.

Google’s John Mueller said in a webmaster hangout on Tuesday at the 16:44 mark that in some cases, disavowing or cleaning up bad links to your site may help Google’s algorithm trust other links to your site.

So if you’ve been on the fence about whether or not to disavow those crappy links a disgruntled competitor sent your way, maybe this pushes you over the edge. Here’s the video where you can watch the conversation:

The August 1st Core Algorithm Update
(“Medic” SEO Update)

On August 1st, Google rolled out a core algorithm update, which means they made a change to how the algorithm scores and values the many factors that determine how well a site does or does not rank for a given keyword.

Google is constantly pushing out small updates to its algorithm to try to improve the results it serves to searchers, but this was one of the biggest updates many SEO experts had ever seen.

If you don’t follow SEO-related news closely, chances are you still noticed a change in your site’s traffic sometime between August 1st and August 8th. Whether it increased or declined, the August 1st update had some pretty big impacts for a lot of sites.

According to data gathered by SEO publication sites (and a ton of chatter and first-hand accounts on sites where SEOs hang out), this update seemed to target sites related to the health industry and related keywords. However, health was just one industry of many that was affected.

Here’s a graph from SERoundtable that pulls a bunch of info together to show which industries were most affected:

Expert Speculation and What Google Says

This core algorithm update–dubbed the Medic Update by an industry news site because it largely targeted health-related sites–took a solid week to fully roll out.

Sites that pushed “alternative health” advice saw the biggest initial drop. Examples include DrAxe.com and Prevention.com.

This led several experts to push the idea that Google had tweaked their algorithm to reward sites that are true authorities in the healthcare industry. In Google’s Quality Rater Guidelines (QRG), a document that details what does (and does not) constitute a high-quality site, Google spells out the importance of E-A-T:

Expertise. Authority. Trust.

The initial takes on the update pointed to sites like DrAxe.com losing organic traffic and rankings, while sites with more “traditional” authority in the medical industry, like (the health section of) ScienceDaily.com, gained a significant amount of organic traffic through better rankings:

Marie Haynes pointed to Google’s Quality Rater Guidelines–specifically the Trust part of the E-A-T acronym–as the reason why sites lost ground in their rankings:

If you run a [health related] site, the following are all going to be important factors in how you rank:

  • Is your content written by people who are truly known as authorities in their field?
  • Do your business and your writers have a good reputation?
  • Are you selling products that are potentially either scams, not helpful, or even harmful to people?

If you are lacking business or author reputation or have products that don’t inspire trust, then re-establishing trust and ranking well again may be difficult.

Many were quick to jump on the E-A-T bandwagon to explain the drastically changed search results. However, focusing only on the matter of Trust and Expertise is to ignore many important factors that may impact a site. As Glenn Gabe wrote re: the update:

I highly recommend reading the QRG to see what Google deems high versus low quality, to understand how Google treats [health-related] sites, to understand the importance of E-A-T, to understand the impact of aggressive, disruptive, and deceptive ads, and much more.

But it’s not the only thing you should do. …don’t ignore technical SEO, thin content, performance problems, and other things like that. Think about the site holistically and root out all potential problems.

This is evergreen good advice when it comes to SEO.

Here’s what Google team member Danny Sullivan said about the update:


What the update was targeting (and what to do about it)

Now that you understand the scope and a little bit about what this update targeted (trust, yes, but many other issues), you’re probably wondering what you can do about this update if you lost some ground.

The most important thing to remember when reading the next section is: this was a broad core algorithm update. The key is “broad.” It wasn’t just one thing. It’s not just targeting the medical/health niche, although they were hit particularly hard. It’s not just about query intent or site speed or content. It’s about ALL of them.

The second most important thing to remember when reading these updates from various SEOs: how does their advice relate to their product? Bias is a hell of a lens to view the world through, so just be aware of what’s on offer.

Is a particular ‘expert’ or agency really hammering, say, individual author authority as the biggest thing this update targeted? Do they happen to offer reputation management? If so, take their advice into consideration with the bias in mind.

At Smash Digital, we build links. And we’re really good at it. So just be aware of the bias that we think building powerful links is one of the most important things you can do for your business.

Also, we’re totally right about this, but keep that in mind when you listen to our take–and anyone else’s take–on the update.

With those points covered, let’s dig into what the August 1st update might have targeted and, if possible, what you can do about it.

1. For some queries, in some niches, the intent behind the query changed

For some queries–specifically related to medical niches–Google seemed to do the SERP equivalent of reaching over the table and mixing up your plate of food right as you were about to Instagram it.

I’ve seen several queries that went from being “transactional” in nature (i.e., showing results that assume the searcher is looking to buy something) to informational (i.e., showing results that assume the searcher is looking for information). So if you’re an ecommerce site that used to rank a product page for a particularly valuable medical-related query, and Google (or their algorithm) changed the query intent to informational and stopped showing products, you’ve got to step up your content game.

If this is the case, try to write a “buyer’s guide” or similar educationally-focused post that teaches rather than sells. Flex your authority and trust by showing you’ve got the searcher’s best interest at heart, and are just trying to spread knowledge.

We’ve seen some early promising results where rankings have popped back up after dialing back the sales-talk, turning away from pushing a product and just letting the content teach–and that’s it.

So check your main keywords to see if the Query Intent of the results has changed.
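If you want to make that check a little more systematic, here’s a rough sketch. The result-type labels are something you’d assign yourself while eyeballing the top 10 (none of this comes from Google; it’s just a toy for illustration):

```python
from collections import Counter

def dominant_intent(result_types):
    """Given rough labels for the top-10 results (e.g. 'product' pages vs
    'article'/informational pages), guess which intent Google now favors."""
    counts = Counter(result_types)
    return "transactional" if counts["product"] > counts["article"] else "informational"

# Hypothetical snapshot of a SERP after the update:
top10 = ["article"] * 7 + ["product"] * 3
print(dominant_intent(top10))  # informational -> lead with a buyer's guide, not a product page
```

If your product page is the lone “product” result in a sea of articles, that’s your cue to write the teaching content instead.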

2. What kind of page is ranking, and does your site match up?

This will be brief, as it’s similar to point #1.

Pick a keyword that lost some rankings and look at which URL Google is ranking from your site. Is it a fresh af blog post? Is it the homepage?

Now look at the top 10 and compare. If you notice something like, most of the pages in the top 10 are deeply categorized blog posts, but you’ve been trying to rank the homepage…

3. User experience matters. A lot.

This may come as a shock, but Google doesn’t care about your site’s profit.


So if you need to be aggressive with ads to make enough to pay for all that beautiful epic content they want you to create, maybe say goodbye to your good rankings. If not now, soon.

Chances are, if you’re a big media site that’s loaded with deceptive ads, you probably got slapped in this last update.

image from CanIRank.com

Other obvious things that may have hurt your site (or will, if you don’t get it together):

  • slowly-loading sites
  • pop-ups that block content (especially on mobile)
  • excessive ads
  • autoplaying videos
  • and other terrible experiences.

4. All the obvious things you’ve heard

The update is not even a month old at the time this post is being written. Getting perspective on an update–especially one this big and far-reaching–can take months to years. Of course, SEOs are relentlessly curious and data-driven, so there’s a lot it’s possible to piece together even only a few weeks out.

But the majority of what we understand about this update is ahead of us, not behind.

In the meantime, stick to (and follow!) the SEO advice you constantly hear. It’s your best guard against future updates.

Make sure your on-page SEO is solid.

Links are important; you need good ones.

Produce high-quality content that demonstrates your authority.

Link out to sources; don’t be outbound-link-greedy.

User experience is vital; make it good.

bonus tip

Need some help?

If you think you were impacted by the August 1st update and want to see if we’d be able to help, just hit up the contact page and I’ll get back to you ASAP.

Other reading on the subject that we enjoyed and learned from


An Updated PageRank


First off, yes, PageRank is still a thing. Google uses it within their ranking algorithm. The reason you probably haven’t heard of it for a few years is that they stopped publicly updating it.


Because SEOs used it to help them rank sites or sell services, and Google has no love for SEOs…

So what’s this about an updated PageRank?

SEO By the Sea is covering a recent update to a previously-granted patent pertaining to PageRank. It’s some pretty complicated stuff (or at least, it’s written that way):

One embodiment of the present invention provides a system that ranks pages on the web based on distances between the pages, wherein the pages are interconnected with links to form a link-graph. More specifically, a set of high-quality seed pages are chosen as references for ranking the pages in the link-graph, and shortest distances from the set of seed pages to each given page in the link-graph are computed. Each of the shortest distances is obtained by summing lengths of a set of links which follows the shortest path from a seed page to a given page, wherein the length of a given link is assigned to the link based on properties of the link and properties of the page attached to the link. The computed shortest distances are then used to determine the ranking scores of the associated pages.

Basically, Google is trying to use Seed Pages, and the distance a given site is (via links) from a Seed Page. It’s like six degrees of separation from some really authoritative pages, with links.
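To make the idea concrete, here’s a toy sketch of that distance calculation over a made-up link graph. All the site names and link “lengths” are invented; the patent describes assigning lengths based on link and page properties but doesn’t publish real weights:

```python
import heapq

def seed_distances(graph, seeds):
    """Dijkstra's shortest-path from a set of trusted seed pages.

    graph: {page: {linked_page: link_length, ...}} -- shorter lengths mean
    stronger/more trustworthy links, per the patent's description.
    Returns the shortest distance from ANY seed to each reachable page;
    a lower distance would translate to a better ranking score.
    """
    dist = {s: 0.0 for s in seeds}
    heap = [(0.0, s) for s in seeds]
    heapq.heapify(heap)
    while heap:
        d, page = heapq.heappop(heap)
        if d > dist.get(page, float("inf")):
            continue  # stale queue entry
        for nxt, length in graph.get(page, {}).items():
            nd = d + length
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return dist

# Hypothetical link graph: edge weight = link "length" (lower = better link)
graph = {
    "seed.example": {"blog.example": 1.0, "spam.example": 5.0},
    "blog.example": {"shop.example": 1.0},
}
print(seed_distances(graph, {"seed.example"}))
# blog.example ends up much closer to the seed than spam.example,
# so it would earn the better ranking score
```

That’s the “six degrees of separation” part: shop.example has no direct link from the seed, but it still gets a score through the chain of links.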


Google is Still Serious about HTTPS


Google just launched the new .app gTLD, and here’s a really interesting part of using their new domain ending:

A key benefit of the .app domain is that security is built in—for you and your users. The big difference is that HTTPS is required to connect to all .app websites, helping protect against ad malware and tracking injection by ISPs, in addition to safeguarding against spying on open WiFi networks. Because .app will be the first TLD with enforced security made available for general registration, it’s helping move the web to an HTTPS-everywhere future in a big way.

The Ultimate Black Hat SEO Bug (Now Squashed)


Recently, Tom Anthony discovered a way to rank a brand new site for some crazy-valuable keywords at the top of the SERPs:

I recently discovered an issue with Google that allows an attacker to submit an XML sitemap to Google for a site for which they are not authenticated. As these files can contain indexation directives, such as hreflang, it allows an attacker to utilise these directives to help their own sites rank in the Google search results.

I spent $12 setting up my experiment and was ranking on the first page for highly monetizable search terms, with a newly registered domain that had no inbound links.
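For context, the hreflang directives the attack abused live in ordinary XML sitemaps, which look roughly like this (domains invented for illustration; the bug was that Google would accept such a file for a site the submitter didn’t own):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://victim-site.example/deals</loc>
    <!-- the "alternate" version points at the attacker's fresh domain -->
    <xhtml:link rel="alternate" hreflang="en-GB"
                href="https://attacker-site.example/deals"/>
  </url>
</urlset>
```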

Leading to results like this:

And traffic like this:

Click through and check out the post for all the details–it’s pretty amazing.


A Recent Core Algorithm Update


Did your site’s organic traffic get its ass kicked or start kicking ass lately? It may be due to a recent Google algorithm update. SE Roundtable posted about a bunch of chatter they started seeing on various SEO and webmaster forums about a possible update.

Shortly thereafter, Google actually confirmed the update (which they don’t always do).

So, as this was a core algorithm update, I haven’t seen much in the way of “these sites took a hit because of this reason” type of info. That kind of analysis tends to surface a few weeks after the fact, so I’ll keep my eyes open and post an update in a future This Week in SEO post to let you know.


The Chrome Browser Says: Not Secure


Imagine you’ve put all this hard work and effort into building a site, ranking it well for a bunch of sweet keywords, and… no one stays on your site for more than 10 seconds.

In an upcoming edition of Google’s Chrome browser, all websites without an SSL certificate will be marked “not secure.” Like this:

Starting in July, the latest version of Chrome will show the second notification for sites that don’t have SSL even if someone is not inputting information into a form field.

Why this matters to your SEO:

  • People are going to see “not secure” and think your site will infect their computer.
  • They’ll immediately go back to Google, and Google will think, “damn, people aren’t staying on this page long. I guess it’s not relevant.”
  • Google will push your page further down the rankings because user-experience signals say it’s not relevant.
  • Your rankings go down.

It’s an easy fix, and you’ve got a few months to get it done, so…


Google Removes 96/101 GMB Reviews


Damn, Google!

So here’s a bit of local SEO drama for you (and a very costly lesson for you to learn from).

Someone complained on the Google My Business advertising forum that a competing law firm was incentivizing reviews by offering a free zoo pass as a reward, to be given to a winner chosen from the people who left them a review.

Here is a review from someone complaining about the practice:

Of course, most attorneys in the area have a few reviews which is totally normal. Almost all of their fake reviews came in at the exact same time 25-30 days ago. Although, there is a new push with more reviews popping up today, and there’s a batch that was all left on the same day 6mos ago.

They should have under 10 reviews at the very most.

Eventually, the law firm in question chimes in several times on the thread (with some super lawyer-y lingo like “pursuant” and “to the extent”) to say that they are not offering their services in return for reviews, and so they are not in violation of the guidelines.

I definitely recommend reading the full thread, but here’s how it ends (and this is the part you really should internalize):

Google’s team decided that the reviews WERE, in fact, against their guidelines:

“Reviews are only valuable when they are honest and unbiased. (For example, business owners shouldn’t offer incentives to customers in exchange for reviews.) Read more in our review posting guidelines. If you see a review that’s inappropriate or that violates our policies, you can flag it for removal.”

And here’s how it all shook out:



Another Algorithm Update (Probably!)


Eventually these won’t be newsworthy anymore with the frequency they keep happening…

Okay, probably not. It’s always a big deal when Google drops an algorithm update like a diss track aimed at your website.

I am seeing signs both within the search community and from the automated tracking tools of an update to Google’s search results going on right now. The interesting thing is that about 50% of the tools are reporting on the algorithm update and the other 50% are not. Maybe Google is doing a 50/50 test on a new algorithm?

SERP/algorithm tracking site evidence:

Cognitive SEO:



What the F Has Been Going On in the SERPs Lately?



Since August, we’ve seen a number of updates I would call significant. I actually can’t remember seeing that many substantial updates in such a short period of time.

This has been clear from the thousands of keywords we track here at Supremacy. While we’re used to seeing the occasional turbulence in the SERPs, rankings across the last couple of months have been more like paint on a speaker in slow motion:

So why the recent volatility (not even accounting for the recent mobile stuff)? Glenn Gabe takes a very smart, well-informed stab at it in this post. There are a number of points he makes, and I highly recommend you go through and read them all, but here’s the most likely (and most terrifying) explanation:

Google may be increasing the frequency of refreshing its quality algorithms. And the end goal could be to have that running in near real-time (or actually in real-time). If that happens, then site owners will truly be in a situation where they have no idea what hit them, or why.

Or, you could do something about it:

Based on the volatility this fall, and what I’ve explained above, I’m sure you are wondering what you can do. I’ve said this for a while now, but we are pretty much at the point where sites need to fix everything quality-wise. I hate saying that, but it’s true.


Thoughts on the August Algorithm Update


Google has no chill. I’ve seen more movement in the SERPs this summer than I can remember in any previous quarter. And it’s all, as far as I can tell, been tied to a site’s quality score.

As I mentioned in my post about the May 17, 2017 update, Google seems to be pushing quality updates almost monthly now (refreshing its quality algorithms). That’s great if you are looking to recover, but tough if you’re in the gray area of quality and susceptible to being hit. Over the past several months, we have seen updates on May 17, June 25, July 10, and now August 19. Google has been busy.

There seems to have been another one in early September (around the 7th or 11th).

Lots of rankings doing this:

and this:

And a few killer sites doing this:

The post goes in-depth on some examples of what may have caused these pages to be impacted (the usual suspects like thin content, lots of ads, and this):

It never ceases to amaze me how some sites absolutely hammer their visitors, attempt to trick them into clicking ads, provide horrible UX barriers, and almost give them a seizure with crazy ads running across their pages.

A good post to dig into and follow the advice for your site. It’s only going to get worse for your SEO if visiting your site is a UX disaster.


Auto-playing Video Ads in Google’s SERPs


Google has confirmed with Search Engine Land that they are running a small experiment where they auto-play videos in the search results page. Jennifer Slegg spotted the test this morning after conducting some test searches using Internet Explorer. The video in the knowledge panel will auto-play if you are in this experiment.

And you were mad when they reduced the local pack from 7 to 3…

Big SERP Shakeup

It’s been a damn interesting week.

How’s your organic traffic doing?

It looks like there’s been a very significant Google algorithm update that started rolling out around June 23rd.

Here’s a couple of questions you may be wondering about, and my answers:

Q: What is this? Panda? Penguin? Hummingbird? Pigeon?

A: ¯\_(ツ)_/¯ You see, SEO moves slowly, in general, and it takes time to study the data, to see common site elements that trend down and others that trend up, and really put the pieces together.

Q: Should I panic?

A: No. Probably not. I’ve seen many situations where a site owner has done more damage by trying to disavow a ton of links, 301 redirect a ton of others, and just generally muck up the site’s architecture trying to get some rankings back. It’s better to take action once you understand the situation better.

I’ve also seen sites get knocked down by a penalty and, when the dust settles, end up ranking slightly higher than they did before–after just a week of poor rankings. The algorithm can take a while to roll out, and aftershocks can further change things around. So don’t panic!

Here’s some more info on this algorithm update.


Search Engine Land Covering the Possible Update


Google technically did not confirm it, outside of John Mueller’s typical reply to any questions around an algorithm update. But based on the industry chatter that I track closely and the automated tracking tools from Mozcast, SERPMetrics, Algoroo, Advanced Web Rankings, Accuranker, RankRanger and SEMRush, among other tools, it seems there was a real Google algorithm update.


Rank Ranger’s Early Coverage of the 2017 June Update


Despite the length of the current update, the initial chatter, per Barry Schwartz of SERoundtable, was quite light. This is obviously peculiar, not only in light of the length of the update, but the fluctuation levels themselves as well. The risk levels on our Rank Risk Index have risen above moderate, and show a continuous series of high fluctuation levels.


SERPWoo: A Bump in Volatility


SERPWoo tracks how much a particular niche fluctuates among the top 20, and then aggregates that data across several different verticals like mobile, desktop, search volume, etc.

You can definitely see a bump around the 23rd of June.


Oh Yeah, There Was a Sizable Update in May, Too


Because two algorithm updates are better than one!

After digging into many drops, I saw the usual suspects when it comes to “quality updates”. For example, aggressive advertising, UX barriers, thin content mixed with UX barriers, frustrating user interface problems, deceptive ads, low quality content, and more.


Google’s Patent on Finding Authoritative Sites for the SERPs


No, Siri, I said ‘find authority pictures–you know what? Nevermind’

Apparently Google looks for authoritative pages the same way you probably already do when doing research:

A patent granted to Google this week focuses upon authoritative search results. It describes how Google might surface authoritative results for queries and for query revisions when there might not be results that meet a threshold of authoritativeness for the initial query. Reading through it was like looking at a mirror image of the efforts I usually go through to try to build authoritative results for a search engine to surface.

Very interesting stuff. If you’re feeling particularly perky some morning over coffee, I’d suggest giving this patent a read through. Seems likely that you may gain some insight into how Google frames or approaches authority web pages.

Also, go ahead and click through to read the full text of this article. I’m putting two here, but there are seven “takeaways” from the patent that I recommend becoming familiar with:

1. Google might maintain a “keyword-to-authoritative site database” which it can refer to when someone performs a query.

2. The patent describes “mapping” keywords on pages across the Web as sources of information for that authoritative site database.
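As a minimal sketch of what those two takeaways could look like in practice (the crawl data here is completely invented; the patent doesn’t publish Google’s actual mapping):

```python
def build_keyword_db(pages):
    """Map keywords found on crawled pages to the authoritative sites
    those pages associate them with -- the 'mapping' step feeding a
    keyword-to-authoritative-site database."""
    db = {}
    for page in pages:
        for kw in page["keywords"]:
            db.setdefault(kw, set()).add(page["authoritative_site"])
    return db

# Invented crawl data:
pages = [
    {"keywords": ["flu symptoms", "fever"], "authoritative_site": "cdc.gov"},
    {"keywords": ["fever"], "authoritative_site": "mayoclinic.org"},
]
db = build_keyword_db(pages)
print(db["fever"])  # both authoritative sites could be surfaced for that query
```

At query time, a lookup in a database like this would tell the engine which authoritative sites to favor for the keyword.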

Finally, this is all further proof that the best long-term SEO strategy is becoming an authority in your space.

How? Call me biased, but getting juicy, high-quality backlinks is part of a balanced SEO breakfast. Check out how our RankBOSS service can help.


Google’s New Algorithm Update Targets Fake News


This is a great update on Google’s relationship with, and response to, fake news.

From Bloomberg, on the update:

The Alphabet Inc. company is making a rare, sweeping change to the algorithm behind its powerful search engine to demote misleading, false and offensive articles online. Google is also setting new rules encouraging its “raters” — the 10,000-plus staff that assess search results — to flag web pages that host hoaxes, conspiracy theories and what the company calls “low-quality” content.

It’s always interesting to read about SEO-related issues from non-industry people. In this case, it’s Ben from the (amazing) tech blog Stratechery.

Framing the problem of fake news in relation to Google’s finances:

Google, on the other hand, is less in the business of driving engagement via articles you agree with, than it is in being a primary source of truth. The reason to do a Google search is that you want to know the answer to a question, and for that reason I have long been more concerned about fake news in search results, particularly “featured snippets.”

Google … is not only serving up these snippets as if they are the truth, but serving them up as a direct response to someone explicitly searching for answers. In other words, not only is Google effectively putting its reputation behind these snippets, it is serving said snippets to users in a state where they are primed to believe they are true.

The main criticism here is not in how Google handled the algorithm update, but in how they are changing the quality rater guidelines to now demote pages that it considers “not-authoritative:”

This simply isn’t good enough: Google is going to be making decisions about who is authoritative and who is not, which is another way of saying that Google is going to be making decisions about what is true and what is not, and that demands more transparency, not less.

Dear Google:


Drop Dead Fred: A Google Algorithm Update Analysis


Fred Google update
only 90s kids will get this

Recently, there was a Google algorithm update that clever hilarious SEOs named “Fred” after Gary Illyes said all future updates should be named “Fred.”

As usual, there has been a lot of speculation as to what this update entailed, but nothing super solid.

Recently, though, an SEO tool company called Sistrix has published some interesting findings after studying 300 sites:

Nearly all losers were very advertisement-heavy, especially with banner ads, many of which were AdSense campaigns. Another thing that we often noticed was that those sites offered little or poor-quality content, which had no value for the reader. It seems that many, but not all, of the affected websites tried to grab a large number of visitors from Google with low-quality content, which they then tried to quickly and easily monetize through affiliate programs.

According to this post, sites that look like this got hammered:


a.k.a. a page created to grab search traffic, with thin and/or terrible-quality content and lots of ads.

Hopefully you’ve been taking our advice and creating solid content on the pages you’re trying to rank.




JK. Any time any little thing happens/goes wrong in this industry, it’s all doom all the time.

But seriously, this probably is a pretty big deal, when combined with the mobile-first index and rise of mobile search:

Tappable shortcuts eliminating the need to search (for certain things)…

The shortcuts eliminate the need to search, providing quick answers around sports scores, nearby restaurants, up-to-the minute weather updates and entertainment information, like TV schedules or who won the Oscar for best supporting actress.

I mean, if you’re in any of those industries (sports scores, weather, etc), Google already ate your lunch and made you buy dessert.

Here’s a video on the new feature:


Google’s Biggest Competition


Three years ago, if I were to put money on which multi-gajillion-dollar tech company would pose the biggest threat to Google’s search dominance, smart money would have been on Apple.

But since all Apple has done in the past three years is NOT innovate on Siri and make laptops with features no one asked for (I’m not bitter, YOU’RE bitter), it’s Facebook that’s stepping up its game.

Yesterday TechCrunch wrote about the test of an “enhanced local search feature” on Facebook. It’s an expanded version of Nearby Places: e.g., “coffee nearby.” It’s difficult to tell precisely what’s new here, but TechCrunch describes it as “a list of relevant businesses, along with their ratings on Facebook, a map, as well as which friends of yours have visited or like the places in question.”

Here’s what it looks like:

facebook seo

As the article says, it’s been (and continues to be) a slow, steady ramp up for Facebook, but they’ve got the presence and the data to present a big threat to Google in the near future.


Rapid-Fire SEO Insights


Google is changing its definition of “exact match” keywords in Keyword Planner.

The search query “hotels in new york” will be able to trigger an ad impression for the exact match keyword [new york hotels] because the word order and the term “in” can be ignored and not change the intent of the query.
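A rough sketch of the new matching rule (this is a toy illustration, not Google’s actual implementation; the stopword list is assumed):

```python
# Assumed list of ignorable function words per the new "exact match" rules
STOPWORDS = {"in", "of", "to", "for", "the", "a"}

def normalize(query):
    """Drop function words and sort tokens so word order doesn't matter."""
    return sorted(w for w in query.lower().split() if w not in STOPWORDS)

def exact_match(query, keyword):
    """Would this query trigger the 'exact match' keyword under the new rules?"""
    return normalize(query) == normalize(keyword)

print(exact_match("hotels in new york", "new york hotels"))  # True
```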

Facebook Video Algorithm Update


facebook video seo

Yes, sometimes we talk about not-Google!

Here’s a summary of how the new Facebook video update works:

Simply put, your Facebook videos will be ranked according to how long people watch your video. If they watch it to completion, then your page will be rewarded accordingly. Of course, if a majority of the people who watch your video leave it halfway through then your content will be given the appropriate demerits.

Another case of user-engagement/experience being used as a ranking signal, but this time in Facebook.
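As a back-of-the-napkin sketch of the signal being described (the math here is invented for illustration; Facebook hasn’t published a formula):

```python
def completion_score(watch_times, duration):
    """Average fraction of the video each viewer actually watched.
    Videos watched to completion score near 1.0; early abandons drag it down."""
    if not watch_times:
        return 0.0
    return sum(min(t, duration) for t in watch_times) / (duration * len(watch_times))

# Hypothetical 60-second video:
print(completion_score([60, 60, 30], 60))  # mostly watched -> rewarded
print(completion_score([5, 10, 8], 60))    # abandoned early -> demoted
```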

Expect to see (and probably do this yourself in your videos) Facebook videos starting like this:

Hey, be sure to stick around to the end of this video for…

As we’ve seen, when a search engine gives value to a metric, that metric is exploited mercilessly. 🙂


How Hummingbird Works


A super-optimized (for social sharing), in-depth post by Neil Patel(‘s ghost writer).

Hummingbird doesn’t get a lot of mentions in the day-to-day SEO blog circuit. Everybody is all Panda this and Penguin that.

But Hummingbird was a big deal–and still appears to be.

Here’s the list of eight takeaways from the study (most of which we’ve been talking about here, for forever). Check out the post to see the data behind the summary, and some of the individual content analyzed.

However, if you want to skip those weird ads


Here’s the list:

  1. Select, refine and state your site’s topic using a clear purpose statement, above-the-fold content and specific navigation elements. (Don’t be content with fuzzy or broad statements.)
  2. Create long form content. (Avoid short content.)
  3. Create in-depth content. (Avoid generic content.)
  4. Summarize the purpose and intent of the site with specificity and directness. (Don’t hide your purpose or make it vague.)
  5. Create content that appeals to readers (Don’t create content for search engines.)
  6. Create focused content. (Don’t try to provide comprehensive content on every sub niche in your niche.)
  7. Create a lot of content. (Don’t be happy with a few blog posts or evergreen pages.)
  8. Create content that is entirely relevant to your area of expertise. (Don’t write about off-topic subjects.)


Me, writing that last post:


Penguin 4.0 is Now Completely Rolled Out



Just an FYI — nothing too in-depth here.

Penguin 4.0 is confirmed by Google as having finished rolling out.

It’s like Jay Z sez: If your site’s still penalized I feel bad for you, son



Penguin 4.0 Recovery Case Studies



Mmmm, case studies. The sustenance of SEOs everywhere.

This case study is done by Marie Haynes, and focuses on sites that were previously Google-slapped (Penguin punched?) and have recently recovered.

Lots of examples like this:

Hit by Penguin: This site was suppressed by a manual action for unnatural links several years ago. While they have made some improvements since then, I have always felt that they were still somewhat suppressed and have told them that they likely would see some improvement when Penguin finally updated.

Why? Large number of keyword anchored paid links as well as directory submissions.

What was done to attempt recovery? We did a thorough link audit and disavow. Many links were removed. Ongoing link audit and disavow work was done.

Did the site get new links while suppressed? This site has been working with a good SEO company and has managed to gain a good number of new links and also to continually improve their on-site quality.

Basically, thorough link audits + some disavows + new, strong links = the recipe for recovery (apparently).
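As a point of reference for the disavow piece of that recipe: Google’s disavow tool accepts a plain-text file with one URL or domain per line, where a `domain:` entry covers every link from that domain and lines starting with `#` are comments. A minimal sketch (all the domains below are made up):

```text
# Keyword-anchored paid links
domain:spammy-directory.example
domain:cheap-link-network.example

# Disavow a single bad page instead of the whole domain
https://random-blog.example/sponsored-post.html
```

You upload the file through the disavow links tool in Search Console; each upload replaces any previously submitted file for that property.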


Possum = “Near Me” Update?


While Penguin is still rolling out, just starting to un-kill sites that got slapped by version 3, the beginning-of-September local update (which lots of people are calling Possum) had some very real, very big consequences, both good and bad, for many local sites.

LocalSEOGuide looks at the impact of this update on “near me” queries (such as “Apple store near me,” “pizza near me,” etc.).

The findings?

Sites targeting “near me” searches saw a big boost, and the update seemed to target sites that were using spammy techniques (a relative definition, I know) to rank locally.

While we didn’t see this in every case, strong local search domains that have been using this brand near me strategy appeared to start to be more relevant to Google for these queries. While the site in question is a nationally-known brand, we even saw this kind of activity on some of the smaller, far less well-known local search clients we work with.



The Quality Update; Not Penguin or Panda, but Still Important


There are more things to fear than just Pandas and Penguins.

So, while many still focus on Google Panda, we’ve seen five Google quality updates roll out since May of 2015. And they have been significant. That’s why I think SEOs should be keenly aware of Google’s quality updates, or Phantom for short. Sometimes I feel like Panda might be doing this as Google’s quality updates roll out.

This article focuses on the Phantom/Quality update, and why this algorithm update should be on your radar.

Short answer: because it can F your S up.

[Image: Phantom quality algorithm]

Click through to get a solid foundational understanding of Phantom/Quality updates from Glenn Gabe, one of my favorite SEO authorities.

An Update on Doorway Pages


Google is coming after your crappy-user-experience-created-only-for-SEO doorway pages (again).

From Google’s official site:

Over time, we’ve seen sites try to maximize their “search footprint” without adding clear, unique value. These doorway campaigns manifest themselves as pages on a site, as a number of domains, or a combination thereof. To improve the quality of search results for our users, we’ll soon launch a ranking adjustment to better address these types of pages. Sites with large and well-established doorway campaigns might see a broad impact from this change.

The post has a helpful list of things to check to make sure you are not using doorway pages, so definitely give that a run through, if you’re unsure.

Funny thing, and a call-back to the SEObook post mentioned above: here’s a solid example from the Google Ventures-funded LendUp of exactly what doorway pages look like on a site:

[Image: LendUp doorway pages]

LendUp currently ranks extremely well for all of those pages, if you’re wondering.



Official Google Updates

Guidelines for bloggers who review products they receive for free



Google has caught on to your link-building schemes. No longer can you send beef jerky or facial scrub to review sites in return for that sweet, sweet DoFollow link.

New guidelines now indicate you must:

  1. Use NoFollow links
  2. Disclose the relationship
  3. Write compelling, unique content


As a form of online marketing, some companies today will send bloggers free products to review or give away in return for a mention in a blogpost. Whether you’re the company supplying the product or the blogger writing the post, below are a few best practices to ensure that this content is both useful to users and compliant with Google Webmaster Guidelines.
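In markup terms, the first two guidelines amount to one attribute and one visible sentence. A minimal sketch of a compliant review post (the URL and product name are placeholders):

```html
<!-- Disclose the relationship somewhere visible in the post -->
<p>Disclosure: Acme sent me this jerky for free in exchange for a review.</p>

<!-- rel="nofollow" tells Google not to pass ranking credit through the link -->
<a href="https://example.com/jerky" rel="nofollow">Acme Beef Jerky</a>
```

The third guideline (compelling, unique content) is on you.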


The Webmaster Blog has a new domain name


[Image: the Webmaster Blog’s new URL]

You can find the new site at webmasters.googleblog.com, instead of the old address: googlewebmastercentral.blogspot.com.


…starting today, Google is moving its blogs to a new domain to help people recognize when they’re reading an official blog from Google. These changes will roll out to all of Google’s blogs over time.

Even GOOGLE doesn’t use blogspot anymore…


How Google Prioritizes Spam Reports



Google prioritizes spam reports submitted within Search Console over those submitted elsewhere. It also prioritizes acting on reports from sources whose past submissions have been helpful in cleaning up legitimate spam (instead of just being bitchy to competitors).

If you submit spam reports to Google, especially for spam within your niche, it’s more beneficial to genuinely help clean up the space by submitting valid reports, rather than randomly reporting competitors for tiny, almost non-existent violations.


AI is Transforming Google Search


TL;DR — The head of AI at Google is now the head of search, and with Google’s increased adoption of deep learning to drive search, rather than hand-tuned, rule-based algorithms, maybe we can learn to love penguins and pandas again (and start hating on robots).

AI Search Algorithms

Yes, Google’s search engine was always driven by algorithms that automatically generate a response to each query. But these algorithms amounted to a set of definite rules. Google engineers could readily change and refine these rules. And unlike neural nets, these algorithms didn’t learn on their own. As Lau put it: “Rule-based scoring metrics, while still complex, provide a greater opportunity for engineers to directly tweak weights in specific situations.”

But now, Google has incorporated deep learning into its search engine. And with its head of AI taking over search, the company seems to believe this is the way forward.

Google Manipulates Search Results


A study has come out, sponsored by Yelp, claiming that Google is manipulating its search results to favor its own web properties, presenting users with a poorer end product.

“The easy and widely disseminated argument that Google’s universal search always serves users and merchants is demonstrably false,” the paper reads. “Instead, in the largest category of search (local intent-­based), Google appears to be strategically deploying universal search in a way that degrades the product so as to slow and exclude challengers to its dominant search paradigm.”

And here’s the best quote from the report that was published:

The results demonstrate that consumers vastly prefer the second version of universal search. Stated differently, consumers prefer, in effect, competitive results, as scored by Google’s own search engine, to results chosen by Google. This leads to the conclusion that Google is degrading its own search results by excluding its competitors at the expense of its users. The fact that Google’s own algorithm would provide better results suggests that Google is making a strategic choice to display their own content, rather than choosing results that consumers would prefer.

I don’t have to point out that Google is possibly getting… “penalized” …for manipulating the search results, do I? Because, that’s kind of hilarious…

Amid the antitrust suit in Europe, Google is not having the best time right now.


New Google Algorithm: the “Newsworthy Update.”


Last week saw a new Google update, not related to any of their recurring algorithm updates. This update, unofficially called the “Newsworthy Update,” boosted SERP visibility for many sites covering fresh, news-related content.

[Image: news SERPs]

Check out Search Engine Land for the full story.

Google Wants to Rank Websites Based on Facts not Links


Google is seeking to rank websites based on factual information, rather than skewing toward sites with the highest number of incoming links, as it has in the past. They are working on having their Knowledge-Based Trust score count the incorrect facts on a page, deeming the websites with the fewest incorrect facts most trustworthy. Apps exist today that perform similar assessments, like weeding out spam e-mails from your inbox or pulling rumors from gossip websites to verify or debunk them.

From the article: “A Google research team is adapting that model to measure the trustworthiness of a page, rather than its reputation across the web. Instead of counting incoming links, the system – which is not yet live – counts the number of incorrect facts within a page. “A source that has few false facts is considered to be trustworthy,” says the team. The score they compute for each page is its Knowledge-Based Trust score.”




Started from the bottom now he's here (still at the bottom but he's the only one that knows what he’s doing around here). Speaks in lyrics and The Office memes. Actually understands Snapchat.

Smash your traffic records with quality SEO.


If you’re tired of the empty promises… Tired of the mediocre results…
Tired of SEO companies taking you for a ride… Tired of reading the word tired…

Let Smash Digital help.
