The Best Technical SEO Articles: Our Collection

Another Broad Core Algorithm Update

https://www.seroundtable.com/march-12-13-google-search-ranking-algorithm-update-27249.html

Hold on to your asses, the SERPs are getting a little bit wild again.

Recently, Google confirmed the roll-out of a broad core algorithm update (which is what the “Medic” update really was) on March 12th. Whether your traffic came crashing down or shot up into space, these broad core algo updates aren’t subtle.

Professional re-enactment of Google releasing a core algo update.

The solution to ranking well again, if your traffic fell? Same as before:

A Glimpse At The Bleak, Post-Apocalyptic Future of SEO

Google Says: Disavowing Bad Links Could Increase Trust

https://www.seroundtable.com/google-trust-algorithmic-links-27014.html

This seems like a pretty important point that most people paying attention to their backlinks should tuck away for the future.

Google’s John Mueller said in a webmaster hangout on Tuesday at the 16:44 mark that in some cases, disavowing or cleaning up bad links to your site may help Google’s algorithm trust other links to your site.

So if you’ve been on the fence about whether or not to disavow those crappy links a disgruntled competitor sent your way, maybe this pushes you over the edge. Here’s the video where you can watch the conversation:

The August 1st Core Algorithm Update
(“Medic” SEO Update)


On August 1st, Google rolled out a core algorithm update, which means they made a change to how the algorithm scores and values the many factors that determine how well a site does or does not rank for a given keyword.

Google is constantly pushing out small updates to its algorithm to try and improve the results it serves to searchers, but this was one of the biggest updates many SEO experts had ever seen.

If you don’t follow SEO-related news closely, chances are you noticed a change to your site’s traffic sometime between August 1st and August 8th. Whether it increased or declined, the August 1st update had some pretty big impacts on a lot of sites.

According to data gathered by SEO publication sites (and a ton of chatter and first-hand accounts on sites where SEOs hang out), this update seemed to target sites in the health industry and related keywords. However, health was just one of many industries affected.

Here’s a graph from SERoundtable that pulls a bunch of info together to show which industries were most affected:

Expert Speculation and What Google Says

This core algorithm update–dubbed the Medic Update by an industry news site because it largely targeted health-related sites–took a solid week to fully roll out.

Sites that pushed “alternative health” advice saw the biggest initial drop. Examples include DrAxe.com and Prevention.com.

This led several experts to push the idea that Google had tweaked its algorithm to reward sites that are true authorities in the healthcare industry. In Google’s Quality Rater Guidelines (QRG), a document that spells out what does (and does not) constitute a high-quality site, Google spells out the importance of E-A-T:

Expertise. Authority. Trust.

The initial takes on the update pointed to sites like DrAxe.com losing organic rankings and traffic, while sites with more “traditional” authority in the medical industry, like (the health section of) ScienceDaily.com, gained a significant amount of organic traffic through better rankings:

Marie Haynes pointed to Google’s Quality Rater Guidelines–specifically the Trust part of the E-A-T acronym–as the reason why sites lost ground in their rankings:

If you run a [health related] site, the following are all going to be important factors in how you rank:

  • Is your content written by people who are truly known as authorities in their field?
  • Do your business and your writers have a good reputation?
  • Are you selling products that are potentially either scams, not helpful, or even harmful to people?

If you are lacking business or author reputation or have products that don’t inspire trust, then re-establishing trust and ranking well again may be difficult.

Many were quick to jump on the E-A-T bandwagon to explain the drastically changed search results. However, focusing only on Trust and Expertise ignores many other important factors that may impact a site. As Glenn Gabe wrote re: the update:

I highly recommend reading the QRG to see what Google deems high versus low quality, to understand how Google treats [health-related] sites, to understand the importance of E-A-T, to understand the impact of aggressive, disruptive, and deceptive ads, and much more.

But it’s not the only thing you should do. …don’t ignore technical SEO, thin content, performance problems, and other things like that. Think about the site holistically and root out all potential problems.

This is evergreen advice when it comes to SEO.

Here’s what Google team member Danny Sullivan said about the update:

 

What the update was targeting (and what to do about it)

Now that you understand the scope and a little bit about what this update targeted (trust, yes, but many other issues), you’re probably wondering what you can do about this update if you lost some ground.

The most important thing to remember when reading the next section is: this was a broad core algorithm update. The key word is “broad.” It wasn’t just one thing. It wasn’t just targeting the medical/health niche, although that niche was hit particularly hard. It’s not just about query intent or site speed or content. It’s about ALL of them.

The second most important thing to remember when reading various SEOs’ takes on the update: how does their advice relate to their product? Bias is a hell of a lens to view the world through, so just be aware of what’s on offer.

Is a particular ‘expert’ or agency really hammering, say, individual author authority as the biggest thing this update targeted? Do they happen to offer reputation management? If so, take their advice into consideration with the bias in mind.

At Smash Digital, we build links. And we’re really good at it. So just be aware of our bias: we think building powerful links is one of the most important things you can do for your business.

Also, we’re totally right about this, but keep that in mind when you listen to our take–and anyone else’s take–on the update.

With those points covered, let’s dig into what the August 1st update might have targeted and, if possible, what you can do about it.

1. For some queries, in some niches, the intent behind the query changed

For some queries–specifically related to medical niches–Google seemed to do the SERP equivalent of reaching over the table and mixing up your plate of food right as you were about to Instagram it.

I’ve seen several queries that went from being “transactional” in nature (i.e., showing results that assume the searcher is looking to buy something) to informational (i.e., showing results that assume the searcher is looking for information). So if you’re an ecommerce site that used to rank a product page for a particularly valuable medical-related query, and Google changed the query intent to informational and is no longer showing products, you’ve got to step up your content game.

If this is the case, try to write a “buyer’s guide” or similar educationally-focused post that teaches rather than sells. Flex your authority and trust by showing you’ve got the searcher’s best interest at heart, and are just trying to spread knowledge.

We’ve seen some promising early results where rankings popped back up after dialing back the sales talk–turning away from pushing a product and just letting the content teach.

So check your main keywords to see if the Query Intent of the results has changed.
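One way to do that check at scale: export the top-10 titles for each keyword from whatever rank tracker you already use and run a crude classifier over them. Here’s a minimal sketch of the idea–the trigger-word lists and the example titles are mine, not any standard, so tune them for your niche:

```python
# Rough heuristic for labeling a SERP as transactional vs. informational,
# given the top-10 result titles for a keyword. Word lists are illustrative.
TRANSACTIONAL = {"buy", "price", "cheap", "deal", "sale", "shop", "coupon"}
INFORMATIONAL = {"what", "how", "why", "guide", "benefits", "review", "vs"}

def classify_serp(titles):
    """Return 'transactional', 'informational', or 'mixed' for a list of titles."""
    t_hits = i_hits = 0
    for title in titles:
        words = set(title.lower().split())
        t_hits += bool(words & TRANSACTIONAL)
        i_hits += bool(words & INFORMATIONAL)
    if t_hits > 2 * i_hits:
        return "transactional"
    if i_hits > 2 * t_hits:
        return "informational"
    return "mixed"

# Compare last month's SERP snapshot to today's for one keyword (made-up titles):
before = ["Buy CBD Oil - Free Shipping", "CBD Oil for Sale | Shop Now"]
after = ["What Is CBD Oil? A Beginner's Guide", "CBD Oil: Benefits and Risks"]
print(classify_serp(before), "->", classify_serp(after))  # transactional -> informational
```

If a keyword flips from one bucket to the other between snapshots, that’s your cue to rework the page (or write the buyer’s guide mentioned above).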

2. What kind of page is ranking, and does your site match up?

This will be brief, as it’s similar to point #1.

Choose a page that lost some rankings and look at what URL Google is ranking for your site. Is it a fresh af blog post? Is it the homepage?

Now look at the top 10 and compare. If you notice something like most of the pages in the top 10 being deeply categorized blog posts while you’ve been trying to rank the homepage…

3. User experience matters. A lot.

This may come as a shock, but Google doesn’t care about your site’s profit.

DEAL WITH IT

So if you need to be aggressive with ads to make enough to pay for all that beautiful epic content they want you to create, maybe say goodbye to your good rankings. If not now, soon.

Chances are, if you’re a big media site that’s ugly with deceptive ads, you probably got slapped in this last update.

image from CanIRank.com

Other obvious things that may have hurt your site (or will, if you don’t get it together)–there’s a quick speed-check sketch after this list:

  • slow-loading sites
  • pop-ups that block content (especially on mobile)
  • excessive ads
  • autoplaying videos
  • and other terrible experiences.
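For the speed item specifically, you don’t have to click through PageSpeed Insights one page at a time–the PSI v5 API returns Lighthouse data as JSON. A minimal sketch (the URLs are placeholders, and serious use wants an API key and retry logic):

```python
# Triage slow pages via Google's PageSpeed Insights v5 API.
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def performance_score(url, strategy="mobile"):
    """Return the Lighthouse performance score (0.0-1.0) for a URL."""
    resp = requests.get(PSI, params={"url": url, "strategy": strategy}, timeout=120)
    resp.raise_for_status()
    return resp.json()["lighthouseResult"]["categories"]["performance"]["score"]

for page in ["https://example.com/", "https://example.com/blog/"]:  # placeholders
    score = performance_score(page)
    print(f"{page}: {score:.2f}{'  <-- needs work' if score < 0.5 else ''}")
```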

4. All the obvious things you’ve heard

The update is not even a month old as this post is being written. Getting perspective on an update–especially one this big and far-reaching–can take months or even years. Of course, SEOs are relentlessly curious and data-hungry, so there’s a lot that can be pieced together even a few weeks out.

But the majority of what we understand about this update is ahead of us, not behind.

In the meantime, stick to (and follow!) the SEO advice you constantly hear. It’s your best guard against future updates.

Make sure your on-page SEO is solid.

Links are important; you need good ones.

Produce high-quality content that demonstrates your authority.

Link out to sources; don’t be outbound-link-greedy.

User experience is vital; make it good.

Bonus tip:

Need some help?

If you think you were impacted by the August 1st update and want to see if we’d be able to help, just hit up the contact page and I’ll get back to you ASAP.

Other reading on the subject that we enjoyed and learned from

 

An Updated Page Rank

http://www.seobythesea.com/2018/04/pagerank-updated/

First off: yes, Page Rank is still a thing. Google uses it within its ranking algorithm. The reason you probably haven’t heard about it for a few years is that Google stopped publicly updating it.

Why?

Because SEOs used it to help them rank sites or sell services, and Google has no love for SEOs…

So what’s this about an updated Page Rank?

SEO By the Sea is covering a recent update to a previously-granted patent pertaining to Page Rank. It’s some pretty complicated stuff (or at least, is written that way):

One embodiment of the present invention provides a system that ranks pages on the web based on distances between the pages, wherein the pages are interconnected with links to form a link-graph. More specifically, a set of high-quality seed pages are chosen as references for ranking the pages in the link-graph, and shortest distances from the set of seed pages to each given page in the link-graph are computed. Each of the shortest distances is obtained by summing lengths of a set of links which follows the shortest path from a seed page to a given page, wherein the length of a given link is assigned to the link based on properties of the link and properties of the page attached to the link. The computed shortest distances are then used to determine the ranking scores of the associated pages.

Basically, Google is using a set of Seed Pages and measuring how far away (via links) a given page is from them. It’s like six degrees of separation from some really authoritative pages, with links.
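To make that concrete, here’s a toy version of the seed-distance computation: a plain Dijkstra shortest-path over a weighted link graph. The link “lengths” below are invented–per the patent, Google assigns them from properties of the link and the linking page:

```python
# Rank pages by shortest link distance from a set of trusted seed pages.
import heapq

def seed_distances(graph, seeds):
    """graph: {page: [(neighbor, link_length), ...]}. Distance from nearest seed."""
    dist = {page: float("inf") for page in graph}
    heap = [(0.0, seed) for seed in seeds]
    for seed in seeds:
        dist[seed] = 0.0
    while heap:
        d, page = heapq.heappop(heap)
        if d > dist.get(page, float("inf")):
            continue  # stale heap entry
        for neighbor, length in graph.get(page, []):
            nd = d + length
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist

links = {  # made-up link graph; lengths would come from link/page properties
    "wikipedia.org": [("yoursite.com", 1.0), ("spammy.biz", 5.0)],
    "yoursite.com": [("spammy.biz", 1.0)],
    "spammy.biz": [],
}
print(seed_distances(links, {"wikipedia.org"}))
# Smaller distance from the seeds ~ more trusted in this model.
```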

 

Google is Still Serious about HTTPS

https://www.blog.google/topics/developers/introducing-app-more-secure-home-apps-web/

Google just launched the new .app gTLD, and here’s a really interesting part of using their new domain ending:

A key benefit of the .app domain is that security is built in—for you and your users. The big difference is that HTTPS is required to connect to all .app websites, helping protect against ad malware and tracking injection by ISPs, in addition to safeguarding against spying on open WiFi networks. Because .app will be the first TLD with enforced security made available for general registration, it’s helping move the web to an HTTPS-everywhere future in a big way.

The Ultimate Black Hat SEO Bug (Now Squashed)

http://www.tomanthony.co.uk/blog/google-xml-sitemap-auth-bypass-black-hat-seo-bug-bounty

Recently, Tom Anthony discovered a way to rank a brand new site for some crazy-valuable keywords at the top of the SERPs:

I recently discovered an issue with Google that allows an attacker to submit an XML sitemap to Google for a site for which they are not authenticated. As these files can contain indexation directives, such as hreflang, it allows an attacker to utilise these directives to help their own sites rank in the Google search results.

I spent $12 setting up my experiment and was ranking on the first page for highly monetizable search terms, with a newly registered domain that had no inbound links.

Leading to results like this:

And traffic like this:

Click through and check out the post for all the details–it’s pretty amazing.
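For context, here’s what the hreflang directives in question look like inside a sitemap: each <url> entry can declare an alternate-language version of itself, and that alternate can live on a completely different domain. The domains below are placeholders, obviously:

```python
# A one-URL sitemap with an hreflang alternate -- the kind of cross-domain
# directive the exploit leaned on. Printed as a string for clarity.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://victim-brand.com/</loc>
    <xhtml:link rel="alternate" hreflang="en-GB"
                href="https://attacker-site.com/"/>
  </url>
</urlset>"""
print(SITEMAP)
```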

 

A Recent Core Algorithm Update

https://www.seroundtable.com/google-algorithm-update-25384.html

Did your site’s organic traffic get its ass kicked or start kicking ass lately? It may be due to a recent Google algorithm update. SE Roundtable posted about a bunch of chatter they started seeing on various SEO and webmaster forums about a possible update.

Shortly thereafter, Google actually confirmed the update (which they don’t always do).

So, as this was a core algorithm update, I haven’t seen much in the way of “these sites took a hit for this reason” info yet. That kind of analysis tends to surface a few weeks after the fact, so I’ll keep my eyes open and post an update in a future This Week in SEO post to let you know.

 

The Chrome Browser Says: Not Secure

https://domainnamewire.com/2018/02/09/google-upping-ante-ssl/

Imagine you’ve put all this hard work and effort into building a site, ranking it well for a bunch of sweet keywords, and… no one stays on your site for more than 10 seconds.

In an upcoming edition of Google’s Chrome browser, all websites without an SSL certificate will be marked “not secure.” Like this:

Starting in July, the latest version of Chrome will show the second notification for sites that don’t have SSL even if someone is not inputting information into a form field.

Why this matters to your SEO:

  • People are going to see “not secure” and think your site will infect their computer.
  • They’ll immediately go back to Google, and Google will think “damn, people aren’t staying on this page long. I guess it’s not relevant.”
  • Google will push your page further down the rankings because user experience signals it’s not relevant.
  • Your rankings go down.

It’s an easy fix, and you’ve got a few months to get it done, so…
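Once the certificate is installed, verify that the plain-HTTP version of every page actually lands on HTTPS. A minimal sketch using requests (example.com is a placeholder; an invalid cert will surface as requests.exceptions.SSLError, which is also worth knowing about):

```python
# Confirm HTTP requests redirect to HTTPS.
import requests

def check_https(domain):
    """Follow redirects from http:// and report where you land."""
    resp = requests.get(f"http://{domain}/", allow_redirects=True, timeout=10)
    return resp.url.startswith("https://"), resp.url

for domain in ["example.com"]:  # placeholder; use your own domains
    ok, final = check_https(domain)
    print(f"{domain}: {'OK' if ok else 'NOT SECURE'} -> {final}")
```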

 

Google Removes 96/101 GMB Reviews

https://www.en.advertisercommunity.com/t5/forums/v3_1/forumtopicpage/board-id/Spam_and_Policy/page/1/thread-id/22628#

Damn, Google!

So here’s a bit of local SEO drama for you (and a very costly lesson for you to learn from).

Someone complained on the Google My Business advertising forum that a competing law firm was incentivizing reviews by offering a free zoo pass as a reward, given to someone chosen from among the people who left them a review.

Here’s a comment from someone complaining about the practice:

Of course, most attorneys in the area have a few reviews which is totally normal. Almost all of their fake reviews came in at the exact same time 25-30 days ago. Although, there is a new push with more reviews popping up today, and there’s a batch that was all left on the same day 6mos ago.

They should have under 10 reviews at the very most.

Eventually, the law firm in question chimed in several times on the thread (with some super lawyer-y lingo like “pursuant” and “to the extent”) to say that they were not offering their services in return for reviews, and so were not in violation of the guidelines.

I definitely recommend reading the full thread, but here’s how it ends (and this is the part you really should internalize):

Google’s team decided that the reviews WERE, in fact, against their guidelines:

“Reviews are only valuable when they are honest and unbiased. (For example, business owners shouldn’t offer incentives to customers in exchange for reviews.) Read more in our review posting guidelines. If you see a review that’s inappropriate or that violates our policies, you can flag it for removal.”

And here’s how it all shook out:

Ouch.

 

Another Algorithm Update (Probably!)

https://www.seroundtable.com/november-google-update-24720.html

Eventually, these won’t be newsworthy anymore, given the frequency with which they keep happening…

Okay, probably not. It’s always a big deal when Google drops an algorithm update like a diss track aimed at your website.

I am seeing signs both within the search community and from the automated tracking tools of an update to Google’s search results going on right now. The interesting thing is that about 50% of the tools are reporting on the algorithm update and the other half are not. Maybe Google is doing a 50/50 test on a new algorithm?

SERP/algorithm tracking site evidence:

Cognitive SEO:

RankRanger:

 

What the F Has Been Going on in the SERPs Lately?

https://www.gsqi.com/marketing-blog/the-hornets-nest-fall-2017-google-algorithm-updates/

Basically:

Since August, we’ve seen a number of updates I would call significant. I actually can’t remember seeing that many substantial updates in such a short period of time.

This has been clear from the thousands of keywords we track here at Supremacy. While we’re used to seeing the occasional turbulence in the SERPs, rankings across the last couple of months have been more like paint on a speaker in slow motion:

So why the recent volatility (not even accounting for the recent mobile stuff)? Glenn Gabe takes a very smart—and well-informed—stab at it in this post. There are a number of points he makes, and I highly recommend you go through and read them all, but here’s the most likely (and terrifying) explanation:

Google may be increasing the frequency of refreshing its quality algorithms. And the end goal could be to have that running in near real-time (or actually in real-time). If that happens, then site owners will truly be in a situation where they have no idea what hit them, or why.

Or, you could do something about it:

Based on the volatility this fall, and what I’ve explained above, I’m sure you are wondering what you can do. I’ve said this for a while now, but we are pretty much at the point where sites need to fix everything quality-wise. I hate saying that, but it’s true.

 

Thoughts on the August Algorithm Update

http://www.gsqi.com/marketing-blog/august-19-2017-google-algorithm-update/

Google has no chill. I’ve seen more movement in the SERPs this summer than I can remember in any previous quarter. And as far as I can tell, it’s all been tied to site quality.

As I mentioned in my post about the May 17, 2017 update, Google seems to be pushing quality updates almost monthly now (refreshing its quality algorithms). That’s great if you are looking to recover, but tough if you’re in the gray area of quality and susceptible to being hit. Over the past several months, we have seen updates on May 17, June 25, July 10, and now August 19. Google has been busy.

There seems to have been another one in early September (around the 7th or 11th).

Lots of rankings doing this:

and this:

And a few killer sites doing this:

The post goes in-depth into some examples of what may have caused these pages to be impacted (the usual suspects like thin content, lots of ads, and this):

It never ceases to amaze me how some sites absolutely hammer their visitors, attempt to trick them into clicking ads, provide horrible UX barriers, and almost give them a seizure with crazy ads running across their pages.

A good post to dig into and follow the advice for your site. It’s only going to get worse for your SEO if visiting your site is a UX disaster.

 

Auto-playing Video Ads in Google’s SERPs

http://searchengineland.com/google-confirms-testing-auto-playing-videos-search-results-279533

Google has confirmed with Search Engine Land that they are running a small experiment where they auto-play videos in the search results page. Jennifer Slegg spotted the test this morning after conducting some test searches using Internet Explorer. The video in the knowledge panel will auto-play if you are in this experiment.

And you were mad when they reduced the local pack from 7 to 3…

Big SERP Shakeup

It’s been a damn interesting week.

How’s your organic traffic doing?

It looks like there’s been a very significant Google algorithm update that started rolling out around June 23rd.

Here are a couple of questions you may be wondering about, and my answers:

Q: What is this? Panda? Penguin? Hummingbird? Pigeon?

A: ¯\_(ツ)_/¯ You see, SEO moves slowly, in general, and it takes time to study the data, to see common site elements that trend down, others that trend up, and really put the pieces together.

Q: Should I panic?

A: No. Probably not. I’ve seen many situations where a site owner has done more damage by trying to disavow a ton of links, 301 redirect a ton of others, and just generally muck up the site’s architecture trying to get some rankings back. It’s better to take action once you understand the situation better.

I’ve also seen sites get knocked down by a penalty and, when the dust settles after a week of poor rankings, end up slightly higher than they were before. The algorithm can take a while to roll out, and aftershocks can further change things around. So don’t panic!

Here’s some more info on this algorithm update.

 

Search Engine Land Covering the Possible Update

http://searchengineland.com/google-algorithm-update-rolling-since-june-25th-277942

Google technically did not confirm it, outside of John Mueller’s typical reply to any questions around an algorithm update. But based on the industry chatter that I track closely and the automated tracking tools from Mozcast, SERPMetrics, Algoroo, Advanced Web Rankings, Accuranker, RankRanger and SEMRush, among other tools, it seems there was a real Google algorithm update.

 

Rank Ranger’s Early Coverage of the 2017 June Update

https://www.rankranger.com/blog/google-algorithm-update-june-2017-explained

Despite the length of the current update, the initial chatter, per Barry Schwartz of SERoundtable, was quite light. This is obviously peculiar, not only in light of the length of the update, but the fluctuation levels themselves as well. The risk levels on our Rank Risk Index have risen above moderate, and show a continuous series of high fluctuation levels.

 

SERPWoo: A Bump in Volatility

https://www.serpwoo.com/stats/volatility/

SERPWoo tracks how much a particular niche fluctuates among the top 20, and then aggregates that data across several different verticals like mobile, desktop, search volume, etc.

You can definitely see a bump around the 23rd of June.

 

Oh Yeah, There Was a Sizable Update in May, Too

http://www.gsqi.com/marketing-blog/may-17-2017-google-algorithm-update/

Because two algorithm updates are better than one!

After digging into many drops, I saw the usual suspects when it comes to “quality updates”. For example, aggressive advertising, UX barriers, thin content mixed with UX barriers, frustrating user interface problems, deceptive ads, low quality content, and more.

 

Google’s Patent on Finding Authoritative Sites for the SERPs

http://www.seobythesea.com/2017/05/how-does-google-look-for-authoritative-search-results/

No, Siri, I said ‘find authority pictures–you know what? Nevermind’

Apparently Google looks for authoritative pages the same way you probably already do when doing research:

A patent granted to Google this week focuses upon authoritative search results. It describes how Google might surface authoritative results for queries and for query revisions when there might not be results that meet a threshold of authoritativeness for the initial query. Reading through it was like looking at a mirror image of the efforts I usually go through to try to build authoritative results for a search engine to surface.

Very interesting stuff. If you’re feeling particularly perky some morning over coffee, I’d suggest giving this patent a read through. Seems likely that you may gain some insight into how Google frames or approaches authority web pages.

Also, go ahead and click through to read the full text of this article. I’m putting two here, but there are seven “takeaways” from the patent that I recommend becoming familiar with:

1. Google might maintain a “keyword-to-authoritative site database” which it can refer to when someone performs a query.

2. The patent describes “mapping” keywords on pages on the Web as sources of information for that authoritative site database.

Finally, this is all further proof that the best long-term SEO strategy is becoming an authority in your space.

How? Call me biased, but getting juicy, high-quality backlinks is part of a balanced SEO breakfast. Check out how our RankBOSS service can help.

 

Google’s New Algorithm Update Targets Fake News

https://stratechery.com/2017/not-ok-google/

This is a great update on Google’s relationship with, and response to, fake news.

From Bloomberg, on the update:

The Alphabet Inc. company is making a rare, sweeping change to the algorithm behind its powerful search engine to demote misleading, false and offensive articles online. Google is also setting new rules encouraging its “raters” — the 10,000-plus staff that assess search results — to flag web pages that host hoaxes, conspiracy theories and what the company calls “low-quality” content.

It’s always interesting to read about SEO-related issues from non-industry people. In this case, it’s Ben from the (amazing) tech blog Stratechery.

Framing the problem of fake news in relation to Google’s finances:

Google, on the other hand, is less in the business of driving engagement via articles you agree with, than it is in being a primary source of truth. The reason to do a Google search is that you want to know the answer to a question, and for that reason I have long been more concerned about fake news in search results, particularly “featured snippets.”

Google … is not only serving up these snippets as if they are the truth, but serving them up as a direct response to someone explicitly searching for answers. In other words, not only is Google effectively putting its reputation behind these snippets, it is serving said snippets to users in a state where they are primed to believe they are true.

The main criticism here is not in how Google handled the algorithm update, but in how they are changing the quality rater guidelines to now demote pages that it considers “not-authoritative:”

This simply isn’t good enough: Google is going to be making decisions about who is authoritative and who is not, which is another way of saying that Google is going to be making decisions about what is true and what is not, and that demands more transparency, not less.

Dear Google:

 

Drop Dead Fred: A Google Algorithm Update Analysis

http://www.sistrix.com/blog/googles-fred-update-what-do-all-losers-have-in-common/

only 90s kids will get this

Recently, there was a Google algorithm update that clever, hilarious SEOs named “Fred” after Gary Illyes said all future updates should be named “Fred.”

As usual, there has been a lot of speculation as to what this update entailed, but nothing super solid.

More recently, though, the SEO tool company Sistrix published some interesting findings after studying 300 sites:

Nearly all losers were very advertisement heavy, especially banner ads, many of which were AdSense campaigns. Another thing that we often noticed was that those sites offered little or poor quality content, which had no value for the reader. It seems that many, but not all, of the affected websites tried to grab a large number of visitors from Google with low quality content, which they then tried to quickly and easily monetize through affiliate programs.

According to this post, sites that look like this got hammered:


a.k.a. a page created to grab search traffic, with a small amount of terrible-quality content and lots of ads.

Hopefully you’ve been taking our advice and creating solid content on the pages you’re trying to rank. 🙂

 

SEO IS DEAD!

searchengineland.com/googles-new-tappable-shortcuts-271690

JK. Any time any little thing happens/goes wrong in this industry, it’s all doom all the time.

But seriously, this probably is a pretty big deal, when combined with the mobile-first index and rise of mobile search:

Tappable shortcuts eliminating the need to search (for certain things)…

The shortcuts eliminate the need to search, providing quick answers around sports scores, nearby restaurants, up-to-the minute weather updates and entertainment information, like TV schedules or who won the Oscar for best supporting actress.

I mean, if you’re in any of those industries (sports scores, weather, etc), Google already ate your lunch and made you buy dessert.

Here’s a video on the new feature:

 

Google’s Biggest Competition

http://www.lsainsider.com/facebook-inches-closer-to-something-that-looks-like-local-search

Three years ago, if I were to put money on which multi-gajillion-dollar tech company would pose the biggest threat to Google’s search dominance, smart money would be on Apple.

But since all Apple has done in the past three years is NOT innovate on Siri and make laptops with features no one asked for (I’m not bitter, YOU’RE bitter), it’s Facebook that’s stepping up its game.

Yesterday TechCrunch wrote about the test of an “enhanced local search feature” on Facebook. It’s an expanded version of Nearby Places: e.g., “coffee nearby.” It’s difficult to tell precisely what’s new here. However TechCrunch says the following: it’s “a list of relevant businesses, along with their ratings on Facebook, a map, as well as which friends of yours have visited or like the places in question.”

Here’s what it looks like:


As the article says, it’s been (and continues to be) a slow, steady ramp-up for Facebook, but they’ve got the presence and the data to pose a big threat to Google in the near future.

 

Rapid-Fire SEO Insights

 



https://www.portent.com/blog/ppc/adwords-changing-exact-match-again.htm
Google is changing its definition of “exact match” keywords in AdWords.

The search query “hotels in new york” will be able to trigger an ad impression for the exact match keyword [new york hotels] because the word order and the term “in” can be ignored and not change the intent of the query.
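In code terms, the new matching behaves roughly like the sketch below: drop function words, ignore word order, compare what’s left. This is a caricature–Google has said function words are still respected when they change meaning (think flights “to” vs. “from” a city)–and the stopword list is my guess:

```python
# Toy model of the loosened "exact match": order and function words ignored.
STOPWORDS = {"in", "for", "of", "the", "a", "to"}

def normalize(phrase):
    return sorted(w for w in phrase.lower().split() if w not in STOPWORDS)

def close_variant(query, keyword):
    return normalize(query) == normalize(keyword)

print(close_variant("hotels in new york", "new york hotels"))   # True
print(close_variant("hotels in new york", "new york hostels"))  # False
```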

Facebook Video Algorithm Update

https://seo-hacker.com/facebook-changing-rank-videos-advantage/


Yes, sometimes we talk about not-Google!

Here’s a summary of how the new Facebook video update works:

Simply put, your Facebook videos will be ranked according to how long people watch your video. If they watch it to completion, then your page will be rewarded accordingly. Of course, if a majority of the people who watch your video leave it halfway through, then your content will be given the appropriate demerits.

Another case of user-engagement/experience being used as a ranking signal, but this time on Facebook.

Expect to see (and probably do this yourself in your videos) Facebook videos starting like this:

Hey, be sure to stick around to the end of this video for…

As we’ve seen, when a search engine gives value to a metric, that metric is exploited mercilessly. 🙂
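The metric being rewarded reduces to something like average percent-complete. Facebook hasn’t published a formula, so treat this sketch (with invented numbers) as the cartoon version:

```python
def completion_rate(watch_seconds, video_length):
    """Average fraction of the video that viewers actually watched."""
    capped = (min(w, video_length) for w in watch_seconds)
    return sum(capped) / (len(watch_seconds) * video_length)

# A 60-second video where most viewers bail around the halfway mark:
print(round(completion_rate([30, 28, 60, 5, 31], 60), 2))  # 0.51
```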

 

How Hummingbird Works

http://neilpatel.com/2016/11/10/how-google-hummingbird-really-works-what-we-learned-by-analyzing-9-93-million-words-of-content/

A super-optimized (for social sharing), in-depth post by Neil Patel(‘s ghost writer).

Hummingbird doesn’t get a lot of mentions in the day-to-day SEO blog circuit. Everybody is all Panda this and Penguin that.

But Hummingbird was a big deal–and still appears to be.

Here’s the list of eight takeaways from the study (most of which we’ve been talking about here, for forever). Check out the post to see the data behind the summary, and some of the individual content analyzed.

However, if you want to skip those weird ads…


Here’s the list:

  1. Select, refine and state your site’s topic using a clear purpose statement, above-the-fold content and specific navigation elements. (Don’t be content with fuzzy or broad statements.)
  2. Create long form content. (Avoid short content.)
  3. Create in-depth content. (Avoid generic content.)
  4. Summarize the purpose and intent of the site with specificity and directness. (Don’t hide your purpose or make it vague.)
  5. Create content that appeals to readers. (Don’t create content for search engines.)
  6. Create focused content. (Don’t try to provide comprehensive content on every sub niche in your niche.)
  7. Create a lot of content. (Don’t be happy with a few blog posts or evergreen pages.)
  8. Create content that is entirely relevant to your area of expertise. (Don’t write about off-topic subjects.)

 

Me, writing that last post:


Penguin 4.0 is Now Completely Rolled Out

https://www.seroundtable.com/google-penguin-4-rollout-complete-22831.html


Just an FYI — nothing too in-depth here.

Penguin 4.0 is confirmed by Google as having finished rolling out.

It’s like Jay Z sez: If your site’s still penalized I feel bad for you, son


 

Penguin 4.0 Recovery Case Studies

https://www.mariehaynes.com/penguin-4-0-recovery-case-studies/


Mmmm, case studies. The sustenance of SEOs everywhere.

This case study comes from Marie Haynes and focuses on sites that were previously Google-slapped (Penguin-punched?) and have recently recovered.

Lots of examples like this:

Hit by Penguin: This site was suppressed by a manual action for unnatural links several years ago. While they have made some improvements since then, I have always felt that they were still somewhat suppressed and have told them that they likely would see some improvement when Penguin finally updated.

Why? Large number of keyword anchored paid links as well as directory submissions.

What was done to attempt recovery? We did a thorough link audit and disavow. Many links were removed. Ongoing link audit and disavow work was done.

Did the site get new links while suppressed? This site has been working with a good SEO company and has managed to gain a good number of new links and also to continually improve their on-site quality.

Basically, thorough link audits + some disavows + new, strong links = the recipe for recovery (apparently).

 

Possum = “Near Me” Update?

http://www.localseoguide.com/possum-near-update/

While Penguin is still rolling out, just starting to un-kill sites that got slapped by version 3, the beginning-of-September local update (that lots of people are calling Possum) had some very real, very big consequences (both good and bad) for many local sites.

LocalSEOGuide looks at the impact of this update on “near me” queries (such as “Apple store near me,” “pizza near me,” etc.).

The findings?

Sites targeting “near me” searches saw a big boost, and the update seemed to be targeting sites that were using some SPAMMY techniques (which is a relative definition, I know) to rank locally.

While we didn’t see this in every case, strong local search domains that have been using this brand near me strategy appeared to start to be more relevant to Google for these queries. While the site in question is a nationally-known brand, we even saw this kind of activity on some of the smaller, far less well-known local search clients we work with.


 

The Quality Update; Not Penguin or Panda, but Still Important

http://searchengineland.com/why-googles-quality-updates-should-be-on-your-algorithmic-radar-part-1-257389

There are more things to fear than just Pandas and Penguins.

So, while many still focus on Google Panda, we’ve seen five Google quality updates roll out since May of 2015. And they have been significant. That’s why I think SEOs should be keenly aware of Google’s quality updates, or Phantom for short. Sometimes I feel like Panda might be doing this as Google’s quality updates roll out.

This article focuses on the Phantom/Quality update, and why this algorithm update should be on your radar.

Short answer: because it can F your S up.


Click through to get a solid foundational understanding of Phantom/Quality updates from Glenn Gabe, one of my favorite SEO authorities.

An Update on Doorway Pages

https://webmasters.googleblog.com/2015/03/an-update-on-doorway-pages.html

Google is coming after your crappy-user-experience-created-only-for-SEO doorway pages (again).

From Google’s official site:

Over time, we’ve seen sites try to maximize their “search footprint” without adding clear, unique value. These doorway campaigns manifest themselves as pages on a site, as a number of domains, or a combination thereof. To improve the quality of search results for our users, we’ll soon launch a ranking adjustment to better address these types of pages. Sites with large and well-established doorway campaigns might see a broad impact from this change.

The post has a helpful list of things to check to make sure you are not using doorway pages, so definitely give that a run through, if you’re unsure.

Funny thing, and a call-back to the SEObook post mentioned above: here’s a solid example from the Google Ventures-funded LendUp of exactly what doorway pages look like on a site:


LendUp currently ranks extremely well for all of those pages, if you’re wondering.

¯\_(ツ)_/¯

 

Official Google Updates

Guidelines for bloggers who review products they receive for free

https://webmasters.googleblog.com/2016/03/best-practices-for-bloggers-reviewing.html


Google has caught on to your link-building schemes. No longer can you send beef jerky or facial scrub to review sites in return for that sweet, sweet DoFollow link.

New guidelines now indicate you must:

  1. Use NoFollow links
  2. Disclose the relationship
  3. Write compelling, unique content

 

As a form of online marketing, some companies today will send bloggers free products to review or give away in return for a mention in a blogpost. Whether you’re the company supplying the product or the blogger writing the post, below are a few best practices to ensure that this content is both useful to users and compliant with Google Webmaster Guidelines.

 

The Webmaster Blog has a new domain name

https://webmasters.googleblog.com/2016/03/an-update-on-webmaster-central-blog.html


You can find the new site at webmasters.googleblog.com, instead of the old address: googlewebmastercentral.blogspot.com.

Why?

…starting today, Google is moving its blogs to a new domain to help people recognize when they’re reading an official blog from Google. These changes will roll out to all of Google’s blogs over time.

Even GOOGLE doesn’t use blogspot anymore…

 

How Google Prioritizes Spam Reports

http://www.thesempost.com/google-prioritizes-spam-reports-from-those-with/


Google prioritizes spam reports sent within Search Console (then those submitted elsewhere). Google also prioritizes acting on spam reports that come from a source whose previous reports have been helpful in cleaning up legitimate spam (instead of just being bitchy to competitors).

If you submit spam reports to Google, especially for spam within your niche, it would be more beneficial to you if you really do help clean up the space by submitting valid reports, and not just randomly reporting competitors for tiny, almost non-existent violations.

 

AI is Transforming Google Search

http://www.wired.com/2016/02/ai-is-changing-the-technology-behind-google-searches/

TL;DR — The head of AI at Google is now the head of search, and with their increased adoption of deep learning neural networks to drive search rather than algorithms, maybe we can learn to love penguins and pandas again (and start hating on robots).

AI Search Algorithms

Yes, Google’s search engine was always driven by algorithms that automatically generate a response to each query. But these algorithms amounted to a set of definite rules. Google engineers could readily change and refine these rules. And unlike neural nets, these algorithms didn’t learn on their own. As Lau put it: “Rule-based scoring metrics, while still complex, provide a greater opportunity for engineers to directly tweak weights in specific situations.”

But now, Google has incorporated deep learning into its search engine. And with its head of AI taking over search, the company seems to believe this is the way forward.

Google Manipulates Search Results

http://recode.net/2015/06/29/yelp-teams-with-legal-star-tim-wu-to-trounce-google-in-new-study/

A study has come out, sponsored by Yelp, claiming that Google is manipulating its search results to favor its own web properties, presenting users with a poorer end product.

“The easy and widely disseminated argument that Google’s universal search always serves users and merchants is demonstrably false,” the paper reads. “Instead, in the largest category of search (local intent-­based), Google appears to be strategically deploying universal search in a way that degrades the product so as to slow and exclude challengers to its dominant search paradigm.”

And here’s the best quote from the report that was published:

The results demonstrate that consumers vastly prefer the second version of universal search. Stated differently, consumers prefer, in effect, competitive results, as scored by Google’s own search engine, to results chosen by Google. This leads to the conclusion that Google is degrading its own search results by excluding its competitors at the expense of its users. The fact that Google’s own algorithm would provide better results suggests that Google is making a strategic choice to display their own content, rather than choosing results that consumers would prefer.

I don’t have to point out that Google is possibly getting… “penalized” …for manipulating the search results, do I? Because, that’s kind of hilarious…

Amid the antitrust suit in Europe, Google is not having the best time right now.


New Google Algorithm: the “Newsworthy Update.”

http://searchengineland.com/report-last-weeks-google-update-benefited-news-magazine-web-sites-223717

Last week saw a new Google update, not related to any of their recurring algorithm updates. This update, being unofficially called the “Newsworthy Update,” boosted SERP visibility for many sites that cover fresh, news-related content.


Check out Search Engine Land for the full story.

Google Wants to Rank Websites Based on Facts not Links

http://www.newscientist.com/article/mg22530102.600-google-wants-to-rank-websites-based-on-facts-not-links.html#.VPrBgcZM5NK

Google is seeking to rank websites based on factual information, rather than skewing towards sites with higher numbers of incoming links, as it has in the past. It is working on having its Knowledge-Based Trust score assess the number of incorrect facts on a page, and deem websites with the fewest incorrect facts the most trustworthy. Apps exist today that perform similar assessments, like weeding out spam e-mails from your inbox or pulling rumors from gossip websites to verify or debunk them.

From the article: “A Google research team is adapting that model to measure the trustworthiness of a page, rather than its reputation across the web. Instead of counting incoming links, the system – which is not yet live – counts the number of incorrect facts within a page. “A source that has few false facts is considered to be trustworthy,” says the team. The score they compute for each page is its Knowledge-Based Trust score.”
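Strip away the hard part (extracting facts from a page and checking them against a knowledge base) and the scoring reduces to a simple ratio. A toy sketch of the idea, with made-up facts:

```python
# Knowledge-Based Trust, cartoon version: trust = share of facts that check out.
def kbt_score(page_facts, reference):
    """Fraction of a page's (subject, attribute, value) triples found in the reference."""
    if not page_facts:
        return 0.0
    return sum(fact in reference for fact in page_facts) / len(page_facts)

reference = {("obama", "birthplace", "honolulu"), ("earth", "shape", "oblate spheroid")}
page = [("obama", "birthplace", "honolulu"), ("earth", "shape", "flat")]
print(kbt_score(page, reference))  # 0.5 -- one of the page's two facts checks out
```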

 

Bringing a Site Back from the Dead

https://www.rankxl.com/reviving-a-penalized-site-to-6400-visits-per-month-case-study/

the new Viagra commercials are really pushing the envelope…

Good case study here.

Backstory: some guy bought a website but never did anything with it. Fast forward a few neglected years, and by Q4 2018 the site had been abandoned by visitors and was getting no love from Google.

This post covers what the author did to re-trafficize™ the site, including

  • fixing thin content
  • updating good articles
  • disavowing links

Since this website wasn’t really worth much we decided to experiment and have a plan consisting of 2 parts.

Step 1. Check the website and delete all the content which wasn’t ranking or content that was considered as thin or low quality in the eyes of Google. Same for low quality backlinks.

Step 2. If the first part gave results (fortunately it did), we could move to the second phase of the plan which was to invest some money in it, write more content and build some high quality backlinks to boost the rankings.

The results:

Sweet.
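If you want to run the same kind of triage, a crude word-count pass over your URLs is a reasonable first filter before you judge anything by hand. A sketch (the URLs and the 300-word threshold are placeholders; word count is a weak proxy, so treat the output as a to-review list, not a delete list):

```python
# First-pass thin-content triage: fetch each page and count visible words.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def word_count(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()  # drop non-content elements before counting
    return len(soup.get_text(separator=" ").split())

for url in ["https://example.com/post-1", "https://example.com/post-2"]:
    wc = word_count(url)
    print(f"{url}: {wc} words{'  <-- review' if wc < 300 else ''}")
```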

 

What it Takes to Rank in Higher Education Niches

https://www.serpwoo.com/blog/analysis/seo-for-higher-education-marketing/

This is a good case study–not just for the takeaways, but for the research process. It seems like a lot of people just want to be told what to do (e.g. sign up for our link building service) instead of researching and understanding their industry, their competitors, and what it takes to rank for their best keywords.

Obviously this is easier to do if you have some SEO skills yourself, but not impossible. In this case study, they crunch some data they get by digging through the sites on page one for various keywords to draw some conclusions:

Our findings show that page one URLs have more nofollow backlinks than page 2-10 URLs.

And while some might scream this is an example of page 1 URLs having more total backlinks overall, and thus some of those will be nofollows… it doesn’t account for the fact that URLs on page 3 that have a similar number of total backlinks (but very few nofollows) aren’t on page 1.

If total backlinks for page 1 URLs and page 3 URLs are similar, but nofollows are less on page 3 and more on page 1 URLs, then you have to give the credit that nofollows ARE helping in rankings.

I’m not trying to start a conversation about the value of nofollow links OR endorse them as the thing your backlink profile is missing, so don’t @ me. I’m just pointing out that there’s some interesting data here.

Is it likely that the sites ranking on page one provided some value, and that there are other factors accounting for their high rankings, one consequence of which was earning lots of nofollow links?

YEAH PROBABLY.

But that’s a signal you should pay attention to and dig in further. WHY are they getting more nofollows than the page three sites?

That’s just one tiny piece of one data point you can dig in to. Check out the whole study to get a sense of the process and data a professional SEO looks at when trying to rank a site.

Addressing and Ignoring Technical SEO

https://ftf.agency/technical-seo-case-study/

This is actually a post from 2015 with a few updates and a new published-on date, so I’m going to include it here to a) stick with the case study theme, and b) because it’s genuinely interesting and important content.

Technical SEO.

By fixing some pretty terrible technical SEO issues (and leveraging “traffic leaks”), Nick was able to grow the site from 7–8,000 visits per day to an average of 25,000.

Key takeaways:

  • Leverage your site’s internal link equity and your strongest page’s citation flow. Use this to your full advantage to send link equity to the pages that need it most.
  • Optimize your site’s crawl budget and make sure you’re not wasting Googlebot’s time crawling pages with thin or near-duplicate content.
  • Don’t underestimate the power of properly built and optimized sitemaps, and make sure you’re submitting them regularly through GSC (there’s a minimal generator sketch after this list).
  • Put in the work to engineer a few big traffic pops; each may only be a flash in the pan in terms of growth, but the sum of these parts will create a larger whole.
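On the sitemap point: a sitemap built from the list of URLs you actually want crawled beats whatever your plugin dumps out. A minimal, standard-library-only sketch (the URLs are placeholders):

```python
# Build a clean sitemap.xml from a curated URL list.
import xml.etree.ElementTree as ET

def build_sitemap(urls, path="sitemap.xml"):
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

build_sitemap([
    "https://example.com/",         # canonical, index-worthy pages only
    "https://example.com/guides/",
])
```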

The updated part of this post mostly relates to this graph:

It shows what happened when the site was sold shortly after the work described here: the new owners neglected on-page SEO pretty badly.

 

Why Google Doesn’t Use CTR and E-A-T Like You Think

https://www.seo-theory.com/why-google-does-not-use-ctr-bounce-rate-and-e-a-t-the-way-you-think-it-does/

An SEO think-piece (rant) on how many SEOs are full of shit, how correlation does not imply causation, and how, with cherry-picked data, some SEOs spread misinformation that refuses to die because it gets caught in a viral feedback loop.

Google does NOT use E-A-T as a ranking signal.

It doesn’t matter how often Googlers reject that belief. It doesn’t matter how much they debunk it. The same people come back time and again and proclaim their correctness, asserting they finally have proof that the E-A-T- Fairy is real and bestowing blessings upon everyone.

Wikipedia alone proves there is no E-A-T ranking signal. Wikipedia is not only NOT an expert Website, the news media has published many stories about subject-matter experts being driven away from Wikipedia. Although Wikipedia’s community has responded to these criticisms and attempted to adjust their policies through the years to promote better editing, their fundamental principle of allowing consensus from the unwashed masses to make decisions has led to many potentially good articles being edited into mediocrity. In some cases outright false information persists in Wikipedia articles because their rules against “edit wars” favor the people who are clever enough to revert accurate corrections.

You’ll most likely have a strong reaction to this post–whether for or against–but that’s a good thing. On the one hand, it’s important to try to validate on your own the kinds of things some very visible SEOs claim. On the other, Google has zero interest in having any of us really understand how their algorithms work (outside of making sure our sites can be easily crawled for maximum indexing and scraping, but that’s a story for another day).

So, I mean, you’ve got to accept SOME advice eventually. It’s probably helpful to read something from both sides of the SEO aisle as much as possible. For every Marie Haynes theory, read something from Michael Martinez.

Then start drinking…

Breaking Down Search Intent

https://www.contentharmony.com/blog/classifying-search-intent/

from Supremacy’s Sketchbook Fridays

Search intent has always been important, but it really stepped into the spotlight in Q2/Q3 2018 when the “Medic” update shook up the SERPs in a big way. Now it pays not just to consider search intent (which was best practice pre-Medic), but to constantly evaluate what Google considers the intent to be.

This post breaks down intent even further than the traditional buyer vs. informational and has some really smart things to say overall.

For instance, breaking down search intent classifications even further to pull “answer intent” from the broader “informational:”

Slightly different from research, there are quite a few searches where users don’t generally care about clicking into a result and researching it – they just want a quick answer. Good examples are definition boxes, answer boxes, calculator boxes, sports scores, and other SERPs that feature a non-featured snippet version of an answer box, as well as a very low click-through rate (CTR) on search results.

Highly recommended read–and apparently they have a tool coming out that helps identify search intent by keyword.
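If you want to bucket a keyword list along these lines before that tool ships, a rule-based pass over query modifiers gets you surprisingly far. The bucket names below loosely follow the post; the modifier lists are my own illustration, not Content Harmony’s actual taxonomy:

```python
# Crude intent bucketing by query modifier (first matching bucket wins).
BUCKETS = {
    "transactional": {"buy", "price", "coupon", "cheap", "order"},
    "answer": {"what", "who", "when", "definition", "calculator"},
    "research": {"how", "guide", "tutorial", "review", "vs"},
    "local": {"near", "nearby", "open"},
}

def intent_of(query):
    words = set(query.lower().split())
    for bucket, modifiers in BUCKETS.items():
        if words & modifiers:
            return bucket
    return "navigational/unknown"

for q in ["buy running shoes", "what is a marathon",
          "how to train for a 5k", "pizza near me"]:
    print(f"{q!r}: {intent_of(q)}")
```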

 

Does AMP Give SEO Ranking Benefits?

https://www.stonetemple.com/amp-impact-on-rankings-conversions-engagement/

exclusive look at how to optimize a site for AMP

Google says AMP carries no SEO advantage, but Google says a lot of things…

What does the data say, though?

Stone Temple took a(n admittedly small) sample of AMP sites and ran some numbers:

Overall, 22 of the 26 websites (77%) experienced organic search gains on mobile. Other areas of improvement include SERP impressions and SERP click-through rates. A summary of the results across all 26 sites is as follows:

27.1% increase in organic traffic
33.8% increase in SERP impressions
15.3% higher SERP click-through rates

But were there rankings gains?

Yes, kind of–but the study’s author is reluctant to say it was due to AMP:

what we saw was that 23 of the 26 domains saw an increase in our Search Visibility score. 17 of these domains saw an increase of 15% or more, and the average change in Search Visibility across all 26 sites was a plus 47%.

This data suggests that there were some rankings gains during the time period. So that still leaves us with the question as to whether or not having an AMP implementation is a direct ranking factor. In spite of the above data, my belief is that the answer is that it’s not.

Check out this one for the full scope of the study.

 

Are Web Directories Still Relevant for SEO in 2019?

https://cognitiveseo.com/blog/21291/web-directories-seo/


If they are high quality and relevant, yes.

 

Understanding Query Syntax

http://www.blindfiveyearold.com/query-syntax

This is a very smart post for advanced SEOs only. I mean, go read it if you’re not an SEO but you’ll be like:

 

 

This is another post, essentially, on search intent:

It’s our job as search marketers to determine intent based on an analysis of query syntax. The old grouping of intent as informational, navigational or transactional is still kinda sorta valid but is overly simplistic given Google’s advances in this area.

Knowing that a term is informational only gets you so far. If you miss that the content desired by that query demands a list you could be creating long-form content that won’t satisfy intent and, therefore, is unlikely to rank well.

Query syntax describes intent that drives content composition and format.

I’m not gonna attempt to summarize this post here; instead, I’ll encourage you to go read the whole thing.

Topic Clusters

https://alfredlua.com/seo-topic-clusters/

Lots to focus on with content this week.

This post, by the Growth Editor at Buffer.com, will be most helpful to you if you’re just about to do a big content push and are looking for a little direction.

Similar to silo-ing pages, this post explores how some of the content at Buffer is created and structured on the site.

So far, I have seen two styles for the main page. The first is a pillar page — a long-form guide, often known as an “ultimate guide,” that covers the topic comprehensively. The second is a hub page — something like a table of contents, with a summary of each sub-topic and a call to action to read the respective supporting articles. We have tried both at Buffer (pillar page example and hub page example). To determine which style to use, I like to search for the topic on Google and see which style most of the top 10 articles use.

It also covers some of the mistakes they’ve made, which is helpful to see:

A highly recommended read.

Featured Snippets FTW

https://www.orbitmedia.com/blog/featured-snippets/

If you can’t beat ’em, join ’em. If you can’t get them to send traffic to your high quality, relevant content, win the featured snippet so at least it’s your content they’re displaying.

…not quite as catchy.

Every year there are reports of Google scraping data from websites (against Google’s webmaster guidelines if YOU do it) and displaying it in various feature-rich SERPs, saving the searcher a click. While featured snippets decrease the number of clicks the standard organic-10 results get, there’s nothing you can really do about that except try to win the featured snippet in addition to the best organic spot.

Not sure where to start?

Orbit Media’s post will get you most of the way there:

If your content gets a featured snippet for one search query, the chances to win featured snippets for other similar ones are great enough. As your site already has an appropriate structure to provide a quick answer, it’s more likely Google will consider it to be a great option for the related queries.

 

Expired Domain SEO [Case Study]

https://detailed.com/expired-domain-seo/

Another great post from Glen Allsopp (outing sites, as per usual).

This one is a close-up look at how an entrepreneur in India is making 5 figures per month with an affiliate site–specifically by using expired domain redirects as a way to build links/pass on authority.

Of course, this is nothing new to SEO veterans, but if it’s a new idea to you, give it a read. The post also includes a few reactions from SEO professionals, which is an interesting addition.

The site is a pretty amazing anomaly if you ask me. The design is as basic as it gets, the content is thin – has no tone or legitimacy, and the internal links are partial match at best.

Yet the site has made strategic use of some old school link building techniques that definitely still work and managed to rack up over 2,300 RD’s in Ahrefs and rank for over 28k keywords, with nearly 1,800 ranking on the first page of Google. Most of these terms are commercial intent terms with several thousand searches per month each.

It’s a great way to get some insights into what a successful site owner is doing to generate traffic in an honest way, without an info product attached to it.

 

Just How Big Your Website Actually Is

https://www.gsqi.com/marketing-blog/how-to-find-true-size-of-your-site-index-coverage-reporting/

Size of your site != how many pages are indexed.

This is definitely getting into the nerdier side of things, but understanding this can be really important to your SEO efforts.

…crawl budget might also be a consideration based on the true size of your site. For example, imagine you have 73K pages indexed and believe you don’t need to worry about crawl budget too much. Remember, only very large sites with millions of pages need to worry about crawl budget. But what if your 73K page site actually contains 29.5M pages that need to be crawled and processed? If that’s the case, then you actually do need to worry about crawl budget. This is just another reason to understand the true size of your site.

This is a pretty thorough post, taking you through all the reasons why understanding your site’s size is important, the different ways your site could get out of control, and how to rein it all in.

Good stuff, highly recommended.
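If you want a rough first pass at the “true size” question yourself, here’s a minimal sketch (mine, not from the post) that counts the URLs your sitemaps declare; compare that number against what Search Console reports as indexed and excluded. The sitemap URL is a placeholder:

```python
# Count every URL declared across a sitemap (or sitemap index), recursively.
# Compare the total against the indexed/excluded counts in Search Console's
# Index Coverage report to see how far apart "declared" and "indexed" are.
import xml.etree.ElementTree as ET

import requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_sitemap_urls(sitemap_url):
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    if root.tag.endswith("sitemapindex"):  # a sitemap of sitemaps: recurse
        return sum(count_sitemap_urls(loc.text)
                   for loc in root.findall("sm:sitemap/sm:loc", NS))
    return len(root.findall("sm:url", NS))

# print(count_sitemap_urls("https://example.com/sitemap.xml"))
```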

E-A-T is Not a Search Algorithm

http://www.blindfiveyearold.com/algorithm-analysis-in-the-age-of-embeddings

It’s almost the Thanksgiving holiday here in the US, so I’ll start by saying how thankful I am for this post, which is kind of the culmination of a lot of I’m-just-guessing-but-take-this-as-gospel SEO advice-giving lately.

When the August 1st update hit, many people had never seen anything like it. Even battle-hardened SEOs were kind of shocked at what they were seeing. Like anyone would when the metaphorical rug is yanked out from under them, people whose sites got smashed down in the SERPs looked to whatever confident voice they could find and held on for dear life.

The problem with proclaiming why a Google algo update happened and how to fix it, as this post shows, is that it takes a long time to sort through the data–or even a long time to collect the data.

It can be hard to do nothing when your traffic is down 75% overnight, but it’s even worse to make a bunch of changes to your site before you understand why you lost that traffic in the first place.

I have seen a lot of “experts” talking about how E-A-T was being more heavily weighted with this new update, but E-A-T is not part of an algorithm:

The problem is those guidelines and E-A-T are not algorithm signals. Don’t believe me? Believe Ben Gomes, long-time search quality engineer and new head of search at Google.

“You can view the rater guidelines as where we want the search algorithm to go,” Ben Gomes, Google’s vice president of search, assistant and news, told CNBC. “They don’t tell you how the algorithm is ranking results, but they fundamentally show what the algorithm should do.”

So I am triggered when I hear someone say they “turned up the weight of expertise” in a recent algorithm update. Even if the premise were true, you have to connect that to how the algorithm would reflect that change. How would Google make changes algorithmically to reflect higher expertise?

Google doesn’t have three big knobs in a dark office protected by biometric scanners that allows them to change E-A-T at will.

So what did change with these recent big updates?

To put it as succinctly as possible: Google’s algorithms are reassessing search intent.

There’s no way I can sum up this post in a few paragraphs. This post is what I mean when I say “good SEO sleuthing takes time,” (ok I’ve actually never said the word “sleuthing”). It takes hours and hours to sort through the SERPs and the new rankings–that’s why there’s no “state of the algorithm update” post right when it happens–or at least, there shouldn’t be.

If you read only one post when you’re trippin on tryptophan this Thursday, make it this one.

 

A Different Kind of International SEO

https://www.searchenginejournal.com/google-algorithm-loopholes/278093/

So, a Facebook group had a contest to rank a new site for “Rhinoplasty Plano.”

The runner up (2nd spot) was a site entirely in Latin.

Yes, Latin, the dead language spoken by the Romans and, one assumes, inspiration for Pig Latin.

So how the hell did someone rank a site entirely in Latin when Google is so smart and has invested in neural networks to have their algorithms reward relevant sites and etc. etc. etc.?

Position two is held by a site written almost entirely in Latin, Rhinoplastyplano.co. It mocks everything Google says about authority and quality content. Google ranking a site written in Latin is analogous to the wheels falling off a car.

Basically, they did all the things you’re supposed to do to a local site: map on the website, schema markup, on-page optimization, etc. Just… with Latin content.

They say their intent wasn’t to shame Google, but… it’s pretty bad optics for Google to keep preaching “build a quality site with good content, it’s all that matters” and then have a site written entirely in Latin land near the top of the results…

Pretty entertaining story. Check out both URLs above for the full take.

Search Clicks > Search Volume

https://www.siegemedia.com/seo/search-clicks

Siege Media, content SEO geniuses, have put together a video on why you shouldn’t just focus on searches per month but rather on search clicks (a metric that’s easy to see using Ahrefs).

Really good stuff.

Is Your Site an Endless Horror for Googlebot?

https://www.contentkingapp.com/academy/crawler-traps/

Search engine spiders don’t like a site that plays hard to get. If they have to put too much effort into crawling your site they just… won’t crawl all of it.

I think we’re all on the same page here in wanting the best experience possible for users AND Google on your site, so make sure there aren’t any tricky spots where the crawler could get stuck and give up.

So, how to prevent this?

Definitely dig into the article, but here are the likely culprits:

  • URLs with query parameters: these often lead to infinite unique URLs.
  • Infinite redirect loops: URLs that keep redirecting and never stop.
  • Links to internal searches: links to internal search-result pages to serve content.
  • Dynamically generated content: where the URL is used to insert dynamic content.
  • Infinite calendar pages: where there’s a calendar present that has links to previous and upcoming months.
  • Faulty links: links that point to faulty URLs, generating even more faulty URLs.

If your site possibly has some of the above situations, it’s probably worth digging in to make sure you’re not killing your crawl budget (the linked article has solutions for everything on the list).
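The first trap on that list–parameter-driven infinite URLs–is also the easiest to spot in a crawl export. A quick sketch in Python, assuming you have a flat file of crawled URLs from your crawler or server logs (the filename and threshold are placeholders):

```python
# Group crawled URLs by path and flag paths that generate an outsized number
# of unique query-string variants -- the classic parameter trap.
# Assumes a single-host crawl export.
from collections import defaultdict
from urllib.parse import urlsplit

def find_parameter_traps(urls, threshold=100):
    """Return {path: variant_count} for paths with > threshold query variants."""
    variants = defaultdict(set)
    for url in urls:
        parts = urlsplit(url)
        variants[parts.path].add(parts.query)
    return {path: len(qs) for path, qs in variants.items() if len(qs) > threshold}

# Feed it every URL from a crawl export or your server logs:
# crawled = open("crawled_urls.txt").read().splitlines()
# print(find_parameter_traps(crawled))
```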

 

Backlinko Explains Search Intent

https://backlinko.com/skyscraper-technique-2-0

Brian Dean has created a pretty good post explaining the importance of understanding query intent (which we covered in our “Medic” core algorithm update post). Calling it Skyscraper 2.0 seems like it needlessly confuses the topic, but whatever, hashtag marketing.

On paper my content had everything going for it:

    • 200+ backlinks.
    • Lots of comments (which Google likes).
    • And social shares out the wazoo.

What was missing? User Intent.

User Intent + SEO = Higher Rankings

Remember:

Google’s #1 goal is to make users happy. Which means they need to give people results that match their User Intent.

This post also covered an interesting thing: Google has started to solicit user feedback on their intent:

The bottom line is this:

Understanding a searcher’s query intent is important, but understanding what Google thinks that intent is will get your content ranking in the top spots.

 

Another Featured Snippet Post

https://ahrefs.com/blog/find-featured-snippets/

Ahrefs.com has some pretty compelling data about featured snippets.

TL;DR — you’re taking a 31% traffic hit by having the #1 spot but also NOT the featured snippet.

Ranking 2nd-10th? You can still win the snippet–featured snippets are pulled from page-one results, not just the #1 spot.

So this post is very Ahrefs-centric, in that it shows you how to find featured snippet opportunities using their tool, but still a valuable post.

Check it out if you’re ready to get your featured snippet game on. This post is less about telling you HOW to optimize your content for featured snippet domination (just Google it, there’s about 10,000 posts), and more for how to find the opportunities to go after.

…it’s then a case of trying to understand WHY you don’t own these snippets and doing everything in your power to rectify the issue(s).

Here are a few issues and solutions that, while not guaranteed to help, have apparently “worked” for others:

  • Your competitor’s answer is “better” than yours
  • You have structured markup issues
  • Your content doesn’t adhere to the format searchers want to see
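And if you want to script the opportunity-hunting itself, outside of any particular tool, here’s a hedged sketch. It assumes a hypothetical rank-tracking CSV export with keyword, position, has_snippet, and owns_snippet columns:

```python
# Filter a (hypothetical) rank-tracking export down to keywords where you rank
# on page one, a featured snippet exists, and someone else owns it.
import csv

def snippet_opportunities(path):
    opportunities = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            position = int(row["position"])
            if position <= 10 and row["has_snippet"] == "yes" and row["owns_snippet"] == "no":
                opportunities.append((row["keyword"], position))
    # Highest rankings first: those are the easiest snippets to steal.
    return sorted(opportunities, key=lambda pair: pair[1])

# print(snippet_opportunities("rank_tracking_export.csv"))
```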

 

A Cryptomining Script Study by Ahrefs

https://ahrefs.com/blog/cryptomining-study/

Ahrefs did an interesting study recently about the number of sites using a script to mine cryptocurrency by borrowing (without your consent–“borrowing” used pretty loosely here) some of your computer’s processing power while you’ve got the site open in your browser.

So how many sites did Ahrefs find using a cryptomining script?

We found 23,872 unique domains running cryptocurrency mining scripts.

As a percentage of the total 175M+ in Ahrefs’ database, that’s 0.0136%. (Or 1 in 7,353 websites.)

The study found some interesting things, such as:

– some of the top sites (as measured by SEO authority) that use cryptomining scripts allow user-generated content, such as Medium.com

If you’re interested in alternative monetization methods, cryptocurrency, or studies that use a huge chunk of data, check out the full post.

If you have a site that relies on organic traffic and you’re thinking about using such a script on your site, here’s probably the most important part of the article (even though it’s speculation):

There has even been rumours in the past that Google might block websites with crypto-mining scripts in Chrome (a browser with ~58% market share). Bottomline: installing crypto-mining scripts simply isn’t worth the risk for high-profile websites.

So, tread carefully there.

Real Talk: Keyword Search Volume

https://ahrefs.com/blog/keyword-traffic-estimation/

Nice article from Ahrefs on what searches per month actually means for keyword research.

Obviously no one gets close to the amount of data Google has with regards to keyword traffic, but as these things go, Ahrefs has access to more info than most of us ever will. So when they put out an article on something like the realities of search volume, it’s a good idea to listen up.

I’m sure every professional SEO has noticed that ranking a page for a high-volume keyword doesn’t always result in a huge amount of traffic. And the opposite is true, too—pages that rank for seemingly “unpopular” keywords can often exceed our traffic expectations.

This is something not everyone realizes, so I wanted to highlight it here as something to keep in mind (and be reminded of often).

Just because you rank for a keyword doesn’t mean you will get some high percentage of click-throughs/traffic from ranking for that term. They identified a number of possible causes, from obvious ones like ads taking clicks away from the top spot, to less obvious ones like long-tail keyword variations that have never been searched before.

It’s an interesting article–definitely recommend taking a look.

 

The Rise and Fall of Featured Snippets

https://moz.com/blog/knowledge-graph-eats-featured-snippets

Featured Snippets: 0
Knowledge Graph: 1

The two weeks encompassing the end of October/Beginning of November saw a pretty significant SERP switch-up.

The number of featured snippets for keywords fell:

And the number of knowledge graph results rose:

We’ve highlighted several articles in the past on how to win the featured snippet for your keywords. Many agencies and businesses have put a lot of energy and resources into winning the featured snippet (the REAL #1 ranking). So, you might be asking… WTF, Google?

still salty about the G+ failure?

It’s likely that Google is trying to standardize answers for common terms, and perhaps they were seeing quality or consistency issues in Featured Snippets. In some cases, like “HDMI cables”, Featured Snippets were often coming from top e-commerce sites, which are trying to sell products. These aren’t always a good fit for unbiased definitions. It’s also likely that Google would like to beef up the Knowledge Graph and rely less, where possible, on outside sites for answers.

The real winner here is Wikipedia, as they are generally the source of the data for the knowledge panels (and “winner” is used loosely here, as they are not really compensated for being the engine behind knowledge panels. Maybe Sundar Pichai will donate $3 to Wikipedia during this funding drive).

 

When You Accidentally Block Googlebot

http://www.localseoguide.com/googlebot-may-hitting-bot-blocker-urls/

There are many legitimate reasons you would have some bot-blocking language in your site’s code. The use-case in the article is preventing your site from being scraped. That’s legit.

However, I’ve also done the thing where I unchecked and forgot to re-check the “allow search engines to index this site” box, and it took me way too long to realize…

This article highlights an instance of Googlebot being unintentionally blocked. I definitely recommend the read:

Many sites use bot blockers like Distil Networks to stop scrapers from stealing their data. The challenge for SEOs is that sometimes these bot blockers are not set correctly and can prevent good bots like Googlebot and BingBot from getting to the content, which can cause serious SEO issues. Distil is pretty adamant that their service is SEO-safe, but I am not so certain about others. We recently saw a case where the http version of a big site’s homepage was sending Googlebot to a 404 URL while sending users to the https homepage, all because the bot blocker (not Distil) was not tuned correctly.
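A cheap first diagnostic for this (my own sketch, not from the article): fetch a page with a browser User-Agent and then with Googlebot’s, and compare what comes back. It won’t catch blockers that verify IP addresses, but it will catch crude UA-based blocking:

```python
# Fetch a URL with a browser User-Agent, then with Googlebot's, and compare.
# A 200 for the browser but a 403/404 (or an odd redirect) for the bot UA is a
# red flag that your bot blocker is snaring Googlebot too.
import requests

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def compare_responses(url):
    for label, ua in (("browser", BROWSER_UA), ("googlebot", GOOGLEBOT_UA)):
        resp = requests.get(url, headers={"User-Agent": ua}, allow_redirects=False)
        location = resp.headers.get("Location", "-")
        print(f"{label:>9}: HTTP {resp.status_code}  Location: {location}")

# compare_responses("http://example.com/")
```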

 

Time to Ranking

http://www.serplogic.com/infographics/rank-google

Hey, shout out to 2015, this post presents its information via infographic.

Why?

Links, probably.

Anyway, since it’s hard to quote from an infographic, I took a screenshot of the relevant section.

As you can probably imagine, one of the top 3 questions we get asked by people interested in signing up with us for our amazing link building abilities, is how long will it be until they start ranking on page one.

The answer to this depends on, like, 1,000 different things (age of the page you’re trying to rank, keyword competitiveness, etc.), but it helps to manage expectations by looking at some data:

TL;DR: Older pages dominate the front page of the SERPs for many keywords. Google has about zero incentive to let newer pages (unproven in trust and authority) slip into the top spot with some clever SEO tricks, unless it’s a niche that thrives on time-sensitive content. Unfortunately. 🙂

 

SEO is Not Magic

https://webmarketingschool.com/secret-seo-guaranteed-results/

But like fixing a broken engine or planting a vineyard, there’s a very big difference between being able to do it well and just bullshitting everyone (and yourself) about how easy it is/how good you are.

We’ve always been pretty clear that SEO can be distilled down to a few things, and this post does a good job of summarizing that:

On Site / Technical SEO boils down to making your website easy for Google to crawl, and understand.

Content that appears on your website must be worthwhile, relevant, interesting and adds value to Google’s index.

Authority, which you can call PageRank, backlinks, link equity or any number of other terms, is the primary way Google sorts their results.

While each of those three things may have hundreds of factors to consider, that’s it:
There is no fourth category only revealed to the illuminati.
There is no secret handshake with Google engineers.
There is no “Secret Sauce.”

Of course, there’s still a lot left unsaid under a heading like, say, “authority.”

Are you still building links in forum posts? Or do you have a sophisticated set-up of sites to test SEO theories, or a network of editors and sites where you can publish good content and get good links?

Not to push the hard sell… let’s call it the medium sell? We’re good at what we do here, and while SEO is not magic, neither is a tonsillectomy, but that doesn’t mean you should do it yourself!

Click here to read about how we can help.

 

Why Your GSC Position Data (Probably) Dropped

http://www.blindfiveyearold.com/analyzing-position-in-google-search-console

This is a post about using Search Console to analyze your position data–some really heavy stuff. Mostly I stick to page optimizations and link building strategies. I don’t dig too deeply into Search Console, but this issue seems like a frustrating one, so hopefully this helps.

Have you ever seen something like this in Search Console:

And reacted like:

There are always lots of reports about whether or not there’s a bug in Google Search Console. Is it a bug or an algorithm update? This post shows that understanding the position data–really knowing how to read it–can help you easily distinguish between the two.

There’s a simple explanation of what’s going on, and your site (probably) hasn’t tanked:

Google Search Console position data is only stable when looking at a single query. The position data for a site or page will be accurate but is aggregated by all queries.

In general, be on the look out for query expansion where a site or page receives additional impressions on new terms where they don’t rank well. When the red line goes up and the green goes down that could be a good thing.

And one more from the comments on this post:

And that’s why I think understanding how to read this data is so important. If you look at the data and say “it’s a bug” instead of looking at it and saying “whoa, something big just happened” then you’re bound to fall behind.
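To make the aggregation effect concrete, here’s a toy illustration with made-up numbers. GSC’s average position is impression-weighted across every query a site or page shows for, so new impressions on poorly-ranking queries drag the average down:

```python
# GSC's average position is impression-weighted across every query you show
# for. Add impressions on a new, poorly-ranking query and the average "drops"
# even though no existing ranking moved.
def avg_position(rows):
    """rows: list of (impressions, position) pairs, one per query."""
    total = sum(impressions for impressions, _ in rows)
    return sum(impressions * position for impressions, position in rows) / total

before = [(1000, 2.0)]           # one query, ranking #2
after = before + [(500, 45.0)]   # plus new impressions on a term ranking #45

print(round(avg_position(before), 1))  # 2.0
print(round(avg_position(after), 1))   # 16.3 -- looks like a crash; it isn't
```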

 

Why Link Spam is on the Rise Again: Because It Works

http://www.blindfiveyearold.com/ignoring-link-spam-isnt-working

You guys/gals, this is a good post.

Before Panda and Penguin, you could throw crappy links and thin content at your site, and it might rank fine. But if it didn’t, you just got ignored–not penalized.

Post-Panda/Penguin, this tactic carried a huge risk (and potential cost) for an established business.

Current incarnations of various algorithms are penalizing less and ignoring more.

Here’s a TL;DR for you (but you should still go read. It’s not THAT long):

So when Google says they’re pretty good at ignoring link spam that means some of the link spam is working. They’re not catching 100%. Not by a long shot.

So while Google snickers thinking spammers are wasting money on these links it’s the spammers who are laughing all the way to the bank. Low overhead costs make even inefficient link manipulation profitable in a high demand market.

I’ve advised clients that I see this problem getting worse in the next 12-18 months until it reaches a critical mass that will force Google to revert back to some sort of penalization.

 

Slow and Steady SEO Case Study

http://searchengineland.com/seo-case-study-0-100000-visitors-12-months-277995

Slow and steady wins the race, but it’s not very sexy.

^ White hat SEO, basically.

This case study is pretty great, because it takes you from the beginning (what do you prioritize first, to help build success upon later?) to the middle stages of an SEO strategy.

“Get links” is a pretty solid SEO strategy, I gotta say, but there are things you can do in the beginning to make those links extra effective.

For instance:

Content drives SEO success.

It’s possible to secure a few links to bottom-of-the-funnel pages, but you’ll need middle and top-of-the-funnel content to sustainably capture attention and links.

Creating useful content for your audience is always a sound strategy, but you can take it a step further by being intentional and strategic about the content you publish. We maintained a relentless focus on SEO — creating every page with search, and the opportunities available to us, in mind.

I know I’ve never mentioned the importance of good content on this site before…

Oh wait, I do. Constantly.

This is a great article to read if you are just getting going and looking for some guidance. If you click through, you’ll see that the very first lines talk about building a site without any hacks or tricks. That’s good if you don’t understand the risks involved in all SEO tactics, but keep in mind:

1. It took a year to get there.
2. Hacks and tricks, done right, might get you there a little faster (or get bigger results).

 

Rapid-Fire SEO Insights

Google is testing increased front-page Google My Business (GMB) editing and call to action.
http://blumenthals.com/blog/2017/07/07/google-testing-increased-front-page-gmb-editing-call-to-action/

New to schema? Start here:

https://www.searchenginejournal.com/get-started-schema/204518/

Tangentially related to SEO (and something very important to keep in mind about Google–especially if you practice SEO and use Google products…)

How Google Search Reveals Our Darkest Secrets

https://www.theguardian.com/technology/2017/jul/09/everybody-lies-how-google-reveals-darkest-secrets-seth-stephens-davidowitz

Google, On Snippets

http://webmasters.googleblog.com/2017/06/better-snippets-for-your-users.html

Ever wondered what Google’s take on featured snippets, meta data, and more is?

WELL THIS IS YOUR LUCKY DAY, PARTNER. Google recently put out a blog post that takes on all of these topics. Highly recommend you check it out.

The most interesting part, to me, was this:

Can I prevent Google from using the page contents as snippet?
You can prevent Google from generating snippets altogether by specifying the “nosnippet” robots directive. There’s no way to prevent using page contents as snippet while allowing other sources.

Damn. I hate when that happens. But it’s a relief at least to know these things for certain, so you can stop worrying about them, maybe?

Exceptions to SEO Best Practices

During my research this morning, I came across a very interesting niche, with some very interesting top-ranking sites.

Though I’m not going to reveal niche OR sites (this isn’t intended to be some kind of hit-job), I wanted to share something that I don’t see talked about very often:

SEO best practices are a good starting point, but don’t at all apply to some niches.

The keyword I was looking at gets 301,000 searches per month, is very competitive, and is VERY lucrative.

If you wanted to rank a site for this keyword you’d have to come to terms with two big things:

  1. you aren’t going to crack the top 10 without some seriously aggressive link building
  2. your site is probably going to get penalized at some point.

So, not for the faint-of-heart, and definitely not a long-term business strategy.

Take a look at the top 20 ranking sites to see what that looks like:

To summarize, that’s:

9 .com, 3 .to (actually, the same site has all three top spots!), 1 .video, 1 .es, 1 .co, 1 .ag, 1 .ms, 1 .io, 1 .net, 1 .org.

Since I am not disclosing the keyword/niche let me assure you that:

  • Content is not king, here.
  • This is not about building relationships.
  • User experience is shit.

 

The .coms present here are the usual SERP characters: Twitter, YouTube, Facebook, etc. There’s a reason most of these domains are not .coms: a good domain name would be wasted on this churn/burn style of business.

What do all of these sites have in common?

A ton of links.

Unsafe, built-to-rank (not-necessarily-to-last) links.

The top ranking site (.to TLD) has built 1,390 links (many of which were 301 redirected from other, similar sites).

Similarweb.com estimates 318 million visits/mo:

So the trade-off of having their site be SUPER at risk for a penalty is 300 million visitors per month and a site in the Alexa top 100.

Building links to your site was, is, and will be the most important factor in a site ranking well.

Typical advice tells you to be smart about your link velocity, keep it looking natural, slow and steady.

For this site, however, the link velocity strays pretty far from the warm glow of best practices:

The top ranking site has 1,000+ linking domains.

The other sites have close to 1,000 linking domains.

Exact Match Domains (EMD) work super well (though you’ll need more than JUST an EMD to rank)

For this, I’ll exclude the domains ranking on pure authority that have nothing to do with the niche (Twitter, Facebook, etc.).

That leaves 14 sites (in the top 20).

Of those 14 remaining sites:

  • 5 sites use exact match domains
  • 5 sites use partial match domains

 

I see the same trend of EMD/PMD across the related and equally valuable and competitive keywords.

The Takeaways

Commonly-given SEO advice is usually practical, but it doesn’t apply to every niche, every website, or every SEO strategy.

Exact match domains can still be an asset.

Link building is usually a must to help your site rank well. Excessive link building can rank your site well for ultra-competitive niches, but the trade-off is that it creates a site with a high probability of getting penalized eventually.

Remember, there are all kinds of different strategies for getting the traffic you need. The “SEO Best Practices” preached by many (including us) are generally right for most businesses, but there are countless exceptions if you are able to think outside of the box, and the results can be incredibly rewarding if you succeed.

 

The Problem With (and how to raise) Moz DA

http://www.serplogic.com/seo-news-views/increase-your-domain-authority

There’s a common misconception I hear frequently from prospects and SEO beginners, so I took the opportunity to cover this article in the weekly update to help clear it up.

Moz’s DA (domain authority) is not a metric Google uses to rank sites. The Domain Authority metric is a proprietary algorithm Moz uses to try and calculate a site’s authority by using some combination of various inputs–most likely links.

Just because your site has a high DA does not mean it will rank well.

Personally, I have found the Ahrefs Domain Rating to be more accurate and more up-to-date, but I guess that’s personal preference.

Here’s another fun fact about Moz’s DA that I don’t think is widely known: it’s very easy to manipulate a site’s DA score with tons of low quality links. Just because the DA score goes up from those crappy links doesn’t mean Google is going to be impressed and rank your site.

Still, as the article points out:

So, while I think DA is a totally worthless metric (and it is), increasing it means one thing: you are building the right kind of links, and in the end, that is what has the biggest impact on your Google rankings. Here is a simple, yet guaranteed way to increase your website’s Domain Authority before the next update.

Slightly contrary to the point I made above about low quality links (which still stands): if you are engaging in high quality link building, you should ideally see your Moz score improve, and your rankings improve as well.

The rest of this article talks about how to increase your Moz DA score by doing good, wholesome SEO that will help your overall rankings, so… not a bad way to kill 10 minutes by giving it a read.

 

Getting Traffic From Pinterest

https://ahrefs.com/blog/get-traffic-from-pinterest/

Do you have an interest… in Pinterest?

Good news. Ahrefs have a really long post on generating traffic using Pinterest. Like, if you were looking to kill a few hours learning something new and probably valuable, this isn’t a bad choice.

The article is much too long for a short summary to do it justice, so I will just say that if you are into the idea of using Pinterest as part of your traffic generation strategy, this post literally has everything you’ll need.

Here’s a great tip on connecting with people that are already sharing your content on Pinterest:

I found this tip in Aaron Lee’s post at Postplanner. It helped me connect with some Ahrefs fans, who have strong Pinterest profiles (more about this down the post).

Put the following line into your browser and replace “YOURDOMAIN” at the end with your domain.

https://www.pinterest.com/source/YOURDOMAIN

Good stuff.

Facebook SEO

https://cognitiveseo.com/blog/13884/facebook-seo/

Facebook SEO — a bigger ROI than Bing…

Though some of these 10 points are a bit… obvious, this is still a pretty helpful post about something I don’t read much about: Facebook SEO.

From exact match vanity URLs, to on-page optimization, this post takes you through the most important items to address to make sure you are found organically within the Facebook walled-garden.

There are even some benefits to your organic Google rankings:

You usually post Notes to inform or even create engagement with your fans. They are an easy way of informing fans of important news about your business or even updating customers on a current crisis. Even if you are aware of this or not, Google seems to search and index the Notes that come from Facebook fan pages.

Careful when you click through to check out the full article. They have more pop-ups than a late 90s VH1 video…

 

How Many Keywords Could a Page Rank For

https://ahrefs.com/blog/also-rank-for-study/

…if a woodchuck could… rank… damnit.

Nevermind.

Another cool study out of Ahrefs.com

Here’s an interesting result:

It looks like the average #1 ranking page will also rank in the top10 for nearly 1,000 other relevant keywords (while the median value is more than two times smaller – around 400 keywords).

But how will this help you with your SEO efforts?

There are many helpful takeaways in this post (even if it’s just providing hard data to confirm something you already suspected), such as:

longer content ranks for more total keywords
pages with more backlinks rank for more total keywords

Pretty shocking, right? 🙂

Fun to see what people with access to a lot of data can prove.

 

WordPress Safety

https://www.seo-theory.com/defend-wordpress/

On the heels of one of the biggest ransomware attacks this past weekend, it’s not a bad time to bring up the issue of security.

You know what DOESN’T help you rank well? Having your site get hacked, injected with 1,000 sunglasses or Viagra pages, and then be B-B-B-B-BLASTED with dirty, spammy links.

Not a lot of “sound bites” for this one. I suggest you finish reading this post (and check out our SEO services), then click through and just… do all the things in the post.

Make your site safe from clever haxxors like this:

 

Case Study: Ranking for a High Volume Keyword

https://moz.com/blog/case-study-ranking-high-volume-keyword

This is a case study using the write good content > email outreach > guest post method.

It’s super effective!

But also, it takes a lot of work and time. If you’ve tried to do any SEO in the past few years, you know that a) this is the way SEO is trending, unless b) you are grey hat/black hat, understand the risk, and are on the aggressive side.

Lots of pieces to this post, but this one is probably the most interesting and valuable:

Hypothesis: We hypothesized that dropping the number of “sales management” occurrences from 48 to 20 and replacing it with terms that have high lexical relevance would improve rankings.

Were we right?

YEAH DAWG!!!

The improvement happened pretty quickly, as well:

  • July 18th – Over-optimized keyword recognized.
  • July 25th – Content team finished updating body copy, H2s with relevant topics/synonyms.
  • July 26th – Updated internal anchor text to include relevant terms.
  • July 27th – Flushed cache & re-submitted to Search Console.
  • August 4th – Improved from #4 to #2 for “Sales Management”
  • August 17 – Improved from #2 to #1 for “Sales Management”

Very interesting. I’m frequently a fan of de-optimizing over-optimized pages.

I have some tests planned on this myself. If we see any interesting results, I’ll report back, so stay tuned…
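If you want to run the same kind of over-optimization check on your own pages, counting exact-phrase occurrences is easy to script. A rough sketch (mine, not from the case study); the URL and phrase are placeholders:

```python
# Count how many times an exact phrase appears in a page's visible text.
# Requires: pip install requests beautifulsoup4
import re

import requests
from bs4 import BeautifulSoup

def phrase_count(url, phrase):
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ").lower()
    return len(re.findall(re.escape(phrase.lower()), text))

# If the target phrase shows up ~48 times on one page, that's a candidate for
# the kind of de-optimization described above.
# print(phrase_count("https://example.com/sales-management", "sales management"))
```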

 

Infinite Review Spam

http://blumenthals.com/blog/2017/04/17/the-largest-review-spam-network-ever-or-who-is-shazedur-rahman-and-why-should-you-care/

We’ve discussed before how reviews are a huge factor in determining local rankings.

You also probably know that ranking locally is worth $$$.

And where there’s $$$, there’s going to be people doing whatever they can to take their share of it.

In this case, it is hiring/running a giant network of local-review SPAM.

Mike Blumenthal discovered it and spent the maximum amount of time any reasonable person would in trying to uncover the true size of it.

As is hinted at in the article… it’s strange that Google has been eager to curtail link building spam in the past, but this local-review spam persists. Thrives, even!

Review spam at this scale, unencumbered by any Google enforcement, calls into question every review that Google has. Fake business listings are bad but businesses with 20 or 50 or 150 fake reviews are worse. They deceive the searcher and the buying public and they stain every real review, every honest business and Google.

Mike suggests not just removing the users who create the fake reviews, but actually punishing the businesses that use these services. In which case…

Hello Google Local SEO!

 

The Rankability of New gTLDs

http://domainnamewire.com/2017/04/25/study-shows-new-tlds-can-rank-like-tlds-not-much-else/

Background: The Domain Name Association (DNA) commissioned a study to see if there was any advantage to using a new gTLD (.marketing, .xyz, .tech, etc.) over .com.

DNA’s incentive to show how great new gTLDs are for SEO is pretty clear: the association is made up of individuals who stand to gain, in one way or the other, from the increased acceptance of the new gTLDs.

So, what was their conclusion?

It’s gonna surprise you, watch out!

“SEO Expert Research Reveals Search Advantages of Relevant Domain Name Extension”

Wow! What a headline!

But here is some very fair criticism of this study from Domain Name Wire:

What the research did find was that the so-called “domain authority” of some of the ranked new top level domain names was much lower than .com domain names ranked for the same terms. This would suggest that it’s easier to rank a site on a new TLD than on .com.

What you won’t see in the material is that the sample size was quite small–about 300 “newer” domains (about 2,000 total).

So, a definitely biased study whose conclusion should not be taken to heart.

And here’s a little bonus: John Mueller (JohnMu) weighing in on the study via Twitter:

@DInvesting That looks misleading. New TLDs can rank well, of course: all TLDs can! Also, Google doesn’t use DA for ranking.

— John ☆.o(≧▽≦)o.☆ (@JohnMu) April 25, 2017

 

Rapid-Fire SEO Insights

On HTTPS In the Search Results
https://moz.com/blog/half-page-one-google-results-https

New study shows that about half of all page one search results contain HTTPS (secure) sites.

Voice Search = the Next Big Thing
https://www.highervisibility.com/blog/get-it-while-its-hot-voice-search-is-the-next-big-thing/

Here are some interesting statistics:

  • 55% of teens and 41% of adults use voice search on a daily basis.
  • User behavior is shifting to become increasingly hands-free.
  • Google says 20% of queries on its mobile app and on Android devices are voice searches.

Alexa, rank my site for voice search.

Real Estate SEO Strategies Applicable to Many Niches

http://webris.org/real-estate-seo/

This is a really solid, thorough post on how to be really intentional about your SEO plan. Though it is focused on the real estate niche, this can be applied to many different niches.

For instance: this bit about keyword intention is super critical to getting value from the keywords you target and rank for:

Try and stay away from “luxury” keywords, I understand the appeal but there’s a handful of issues with them:

  • Luxury keywords are brutally competitive. The more competition, the longer and more expensive it is to rank your website.
  • The quality of those keywords aren’t that great. You get a lot of window shoppers who just want to look at pictures – it’s not the best traffic.
  • It’s rare that someone will buy a $10m home from an internet search.

We’ve previously written about this: choosing keywords with buying intent.

I have seen people be very unenthusiastic about targeting a very clear buying keyword with 500 searches/mo, compared to a very generic, not-at-all-buying-intent keyword with 15,000 searches/mo.

Like the wise philosopher STOCK PHOTOS once illustrated:


Follow the money!

 

Is Google AMP Worth It?

http://www.stateofdigital.com/google-amp-case-studies/

There’s been a lot of back and forth about whether or not Google’s AMP feature helps or hurts publishers.

Seems to me like it’s mostly early and they are still fine-tuning everything to try and take care of all the content producers…

jk

Okay, but seriously this post contains three case studies focusing on how being listed in AMP has or has not helped with organic traffic.

News site case study results:

The vast bulk of a news publisher’s search traffic comes from Google News. You can argue for days about whether AMP yields more or less ad revenue than non-AMP, but without traffic your ad slots will go un-monetised. So, if you are a news website AMP is not optional. Failing to adopt AMP means your competitors will scoop up all the mobile traffic and you are left with the scraps.

Cool. Here are the results for a lead gen site:

Implementing AMP on this page has resulted in 27% more traffic from mobile devices. But for a lead gen website, traffic is just part of the equation. The next question is, does the added AMP traffic result in more conversions?

This lead gen site saw 18% improvement in goal conversions from organic search after they implemented AMP on their key landing pages. The AMP traffic doesn’t simply bounce, it really engages with the site and contributes to improved business outcomes.

And finally, an ecommerce site:

The site saw a strong uplift in traffic from mobile devices after they implemented AMP, but this also corresponded to lower engagement metrics: a higher bounce rate and less time spent on the site.

Interestingly enough, the ecommerce conversion rate improved after AMP was implemented, but the average order value dropped significantly.

INTERESTING

 

An Introduction to Video SEO

https://www.portent.com/blog/seo/video-seo-2017-best-practice-guide.htm

I don’t do a lot of video SEO — or any. I’m not super experienced and it doesn’t come up a lot in the scope of our link building service.

Don’t be sad, though! Other people are good at video SEO. In this post, the Portent team give a nice intro and overview of video SEO.


Not a post that lends itself to being summarized–go check out the whole thing (if video SEO would be helpful).

 

Get Your …Sitemap Together

https://moz.com/blog/xml-sitemaps

Basic premise of this article: Sitemaps are important, but you’re probably not using them correctly.

Google engineers certainly understand that every site has a certain number of “utility” pages that are useful to users, but not necessarily content-type pages that should be landing pages from search: pages for sharing content with others, replying to comments, logging in, retrieving a lost password, etc.

If your XML sitemap includes all of these pages, what are you communicating to Google? More or less that you have no clue as to what constitutes good content on your site and what doesn’t.

It’s good to understand the WHY of sitemaps, so you can implement them properly. They aren’t just an item to be checked on your “Is My Site SEO’d” checklist.

This article is a good overview of the how and the why so you can get your site zipped up nice and tight, in an easy-to-read package for Googlebot.

This is a great beginner/refresher post, so definitely give it a look.
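As a concrete version of the takeaway: a sitemap should only declare the canonical, index-worthy landing pages, and leave the utility pages out. A minimal sketch, with placeholder URLs:

```python
# Build a sitemap containing only canonical, index-worthy landing pages --
# no login, password-reset, or other utility URLs.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</urlset>")

landing_pages = [
    "https://example.com/",
    "https://example.com/services/link-building/",
    "https://example.com/blog/technical-seo-checklist/",
]
print(build_sitemap(landing_pages))
```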

Rapid-Fire SEO Insights

The winner of the WIX SEO competition explains his strategy:

– exact match domains are effective
– multi-language site (increased queries/clicks)
– creating buzz (and evangelists) through social media
– differentiating from competitors by using a new gTLD (.tech)

HTTP/2 isn’t the future–it’s the present:

HTTP/2 can be used starting today on your web applications and can only be good (for your users mostly) but also for your application on multiple things: performances, SEO, encryption.

^ A very app-focused, tech-heavy article, but it’s good to keep an eye on these things.

Insane in the RankBrain

https://www.bruceclay.com/blog/rankbrain-real-seo-impact/


This article is about RankBrain and the author’s speculation as to how it works. Unlike the other two big ranking signals (links and content), things are a little vague about how this top-3 ranking signal (according to Google) affects SEO.

Here are the basics, from the post:

1. There’s a lot of speculation (esp when it first came out) about what exactly RankBrain is. Here’s a nice… high-level definition:

RankBrain is a machine-learning artificial intelligence system that came onto the scene in 2015.

2. How RankBrain works:

Essentially, RankBrain can take sets of “training” data created by humans to help establish a baseline, and then can apply machine learning to determine the best search results based on a variety of factors over time.

Feel free to form your own opinions about RankBrain and ranking in Google after reading this post. Bruce Clay is a smart and respected dude, but I don’t quite agree with one of the conclusions he draws in this post:

So in the era of RankBrain, even though the basics of SEO that we know and love are still important, you’ll want to think of creative ways to grab that SERP real estate.

That means if you’re not in the upper echelon of brands online in your space, consider supplementing your search marketing strategy with pay-per-click ads.

Helping run a link building agency, I have access to a lot of data, and see a lot of sites that are not “upper echelon” brands killing it on some very valuable keywords.

Remember kids, follow the money. Bruce Clay’s agency will 1) do PPC advertising for you if you’re NOT a huge brand, and 2) do SEO for you if you ARE a huge brand, so it’s $$$cashmoney$$$ either way.

It’s always good to remember that kind of thing when considering who is telling you what.

PS. Buy a link building subscription from us!

 

Trust in the Time of Google

https://dejanseo.com.au/trust/

I talked above about how links and content are two of the most important ranking factors. Well, trust and authority are two ways that the combination of links and content can be expressed.

Trustworthy sites and authoritative sites rank well in Google.

There wasn’t much of an… introduction on this post, but from what I can tell, the author surveyed a bunch of people about what they think constitutes a trustworthy site. You can take this info and apply it to your own site to make sure it comes off as trustworthy as well (beyond just “write good content and get good links”).

Around 21% of surveyed users check the URL to establish whether they’re on the intended domain name and 19% look for encryption symbols in their browser’s address bar. The two factors already account for 40% of all answers in the latest web trust survey. Brand awareness signal appears to be strongly linked with URL as well. Users tend to check both URL and brand instances on the page such as logos, images and text. URL and brand alone (12%) add up to more than half of the answers in the survey.

Here’s a handy bar graph from the post to help you remember all those valuable trust signals:


 

Case Study #1: PBNs Work

http://diggitymarketing.com/affiliate-seo-case-study-2/

Got three case studies for you to read this week.

First up is this one by Matt Diggity and Josh Kelly from Hammerhead Domains.

How did they raise the roof on an Amazon affiliate site’s traffic by 425%?


Optimized the site, and got some sweet PBN links.

Yeah, there’s a bit more to it, and they go into it in the post, but they basically did, you know… regular SEO things like optimize title tags, fix broken pages, build links…

Still. It’s always helpful to go through someone’s case study and see step by step what they did to get some big wins. Def. check this one out.

Also, if you’re looking to DIY some on-page SEO, Travis and I wrote this awesome guide for you.

 

Case Study #2: An SEO Audit for Beginners

https://yoast.com/yoast-case-study-seo-mom-blog/

In the final case study we’re featuring, Marieke goes through a “mommy blogger” site and explains the what and the why of her recommendations.

If you’re a jaded-ass SEO veteran, best to skip this one. If you’re a bit newer to SEO, you’ll get a lot out of this. Each SEO audit item is explained as if you’re a beginner, and it’ll be easy to apply a lot of this to your own site.

The page speed score of the homepage of One Beautiful Home is very low (17/100 on desktop in Google Page Speed Insights). A low page speed is bad news for your SEO! The images on the homepage are quite heavy and should be optimized. Overall, you could reduce their size by 3.5 MB (76% reduction), which would, most likely, substantially boost your site speed.
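That image fix is scriptable, too. Here’s a rough sketch using Pillow; the filenames and thresholds are placeholders, and you’d want to eyeball the output quality before shipping it:

```python
# Downscale and recompress an oversized image. Requires: pip install Pillow
from pathlib import Path

from PIL import Image

def compress(src, dest, max_width=1200, quality=80):
    im = Image.open(src).convert("RGB")  # flatten alpha for JPEG output
    if im.width > max_width:
        im = im.resize((max_width, int(im.height * max_width / im.width)))
    im.save(dest, "JPEG", quality=quality, optimize=True)
    print(f"{src}: {Path(src).stat().st_size} -> {Path(dest).stat().st_size} bytes")

# compress("hero-original.png", "hero.jpg")
```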

 

Rapid-Fire SEO Insights

https://moz.com/blog/schema-new-restaurant-menu-markup

Trying to rank a restaurant? (Good luck…)

Here are the most recent Schema updates to include menus.

  • A new menu type. Menus officially become entities in Schema.org with their own properties and subtypes.
  • The new Menu type includes a hasMenuItem property. This property would be used to point to the (also new) MenuItem schema type, which is what would be used to mark up individual menu items.
  • Since most restaurants feature a few menus such as one for breakfast, one for lunch and one for dinner, there is a new hasMenuSection property and a MenuSection type that can be used to mark up the various menus. And you can also use it to mark up the different sections of each particular menu such as the appetizers, salads, main courses, and desserts on a dinner menu.
  • For each MenuItem, we’re able to mark up the name, description, price, and nutritional information. And while it’s not new to schema, you can also use the suitableForDiet property to denote if the menu item is low calorie, low fat, low salt, vegan, gluten-free, or suitable for various other restricted diets.
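To see how those pieces fit together, here’s an illustrative JSON-LD snippet emitted from Python. The types and properties (Menu, MenuSection, MenuItem, hasMenuSection, hasMenuItem, suitableForDiet) are the schema.org terms listed above; the restaurant, dish, and price are invented:

```python
# Emit restaurant menu markup as JSON-LD. The types and properties are the
# schema.org terms described above; the bistro, dish, and price are made up.
import json

menu_markup = {
    "@context": "http://schema.org",
    "@type": "Restaurant",
    "name": "Example Bistro",
    "hasMenu": {
        "@type": "Menu",
        "name": "Dinner",
        "hasMenuSection": [{
            "@type": "MenuSection",
            "name": "Appetizers",
            "hasMenuItem": [{
                "@type": "MenuItem",
                "name": "Garlic Bread",
                "description": "Toasted sourdough with roasted garlic butter.",
                "offers": {"@type": "Offer", "price": "6.00", "priceCurrency": "USD"},
                "suitableForDiet": "http://schema.org/VegetarianDiet",
            }],
        }],
    },
}

print('<script type="application/ld+json">')
print(json.dumps(menu_markup, indent=2))
print("</script>")
```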

 

Are We There Yet: On the Time it Takes to Rank

http://www.johnfdoherty.com/how-long-does-seo-take-to-work/

It’s not a huge surprise that the most common (and most quickly-asked) question between any client and SEO agency is:

How long will it take to rank for my keywords?

A perfectly reasonable question! Unfortunately, the answer is complicated and something between ¯\_(ツ)_/¯ and (ಥ_ಥ).

This blog post does a good job of breaking down why:

You need to allow time to do the audit
You need to allow time to get the work implemented
You need to allow time for Google to recrawl everything
Building links takes a lot of time if you’re doing it scalably as you build the rest of your business

But, basically:

It takes time.


 

Exact Match Domains: Not Inherently Bad

via a tweet from Gary Illyes:

Did you think exact match domains were bad?

A lot of people do, in my experience.

Why?

Quick history lesson:

Back in 2012, Google hit us with the EMD update.

The purpose was to prevent low quality and spammy sites from using the power of an exact match domain name to outrank better sites.

But, exact match domains really do help sites rank, in my experience. The site has to be quality to take advantage of this (or sometimes, it seems, the filter Google applies just takes a while to really kick in).

I wanted to share this with you to clear up a lot of the incorrect assumptions I often hear.

K?

K.

 

Rapid-Fire SEO Insights

This week’s theme: content… again!

Important stuff, content. There have been a few really great articles published on content (esp as it relates to ranking), so I wanted to share them with you all at once. Should keep you busy for a while…

10x Content Ideation & Process
http://kaiserthesage.com/content-creation-process/

There are two ways to succeed in content marketing:

Be the first one to break a story. Or be the best one to tell it.

This article is about the second way.

The Ultimate Content Creation Guide
http://www.nichepursuits.com/content-creation-guide/

5300 words of content strategy–especially useful if you have the money to hire a team.

Content Brainstorming
http://ipullrank.com/content-exercise-using-link-building-prospecting-content-brainstorming/

Power up your outreach efforts by combining link building prospect outreach with content-idea brainstorming. Sounds complicated? It is! Read all about it.

Analyzing Google Organic Results

http://www.localvisibilitysystem.com/2017/01/19/breakdown-of-page-1-of-googles-local-organic-search-results-who-dominates/


Interesting breakdown of what types of results dominate the front page of local organic results.

  • Business Homepages – 37.2%
  • Directories: Category Search – 36.62%
  • Business Sub-pages – 12.68%
  • Directories: Business Pages – 7.58%
  • Other – 4.86%
  • News – 0.64%

Click through to the article for an explanation of what each entails (and the way this data was compiled).

So now that you have this data, what can you do with it?

Consider what type of page you’re trying to rank (for instance: homepage vs. sub page). You’ve got a better chance of ranking your homepage vs. a sub page, according to this data.

Would it be worth trying to rank your homepage AND getting listed in the top directories to try and capture more overall real estate?

Yes, probably.

The best thing you can do is take the local business page you’re trying to rank, Google the keyword(s) you’re trying to rank for, and see what the results look like. If every ranking result is a business sub-page and you’re trying to rank your homepage, it probably makes sense to evaluate WHY sub-pages are ranking, and then take action on your own site.

 

Googlebot Ignoring Canonical Tag?

http://www.gsqi.com/marketing-blog/google-ignore-rel-canonical-different-content/


John Mueller discussed the rel=canonical tag in a recent webmaster hangout and said something interesting about how Googlebot handles it.

Basically, if you used a rel=canonical tag to tie two pages together that are NOT equivalent (aka, using the tag incorrectly), Googlebot can ignore the directive if it thinks you made a mistake.

The consequence? Your pages could be indexed incorrectly–either with more pages indexed than there should be, or with pages that should be indexed left out.

From a case study tackling this issue:

Don’t simply canonicalize mass amounts of urls to other urls with greatly different content. That’s not really the intent of rel canonical anyway. It was introduced to help cut down on duplicate content and help webmasters point Google in the right direction about which urls to index and surface in the search results. As demonstrated above, Google’s algorithms can think the canonical setup is a mistake and simply ignore the canonical url tag.
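If you want to audit your own setup before Google starts second-guessing it, here’s a small sketch (mine, not from the article) that flags pages whose canonical points somewhere else, so you can confirm those pairs really are near-duplicates:

```python
# Flag pages whose rel=canonical points at a *different* URL, so you can
# manually confirm those pairs really are duplicates.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def canonical_mismatches(urls):
    mismatches = []
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        tag = soup.find("link", rel="canonical")
        if tag and tag.get("href") and tag["href"].rstrip("/") != url.rstrip("/"):
            mismatches.append((url, tag["href"]))
    return mismatches

# for page, target in canonical_mismatches(["https://example.com/page-a"]):
#     print(f"{page} -> {target}: are these really equivalent pages?")
```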

 

Google’s Bottom Ads

http://www.thesempost.com/google-testing-4-adwords-ads-duplicated-ads-bottom-search-results/

Always
Be
Testing

Google’s quest to make more money from ads is never ending (understandably). The newest iteration of this has appeared in the form of testing ads at the bottom of the SERPs. Like this:


This is significant to SEOs, obviously, because while it may not be taking real estate away from organic results (yet), it is further decreasing the percentage of the page dedicated to organic results. Google is not trying to make it easier for you to grow your organic traffic–if that wasn’t already obvious…

 

What the Hell is a Crawl Budget and Why Should You Care

http://webmasters.googleblog.com/2017/01/what-crawl-budget-means-for-googlebot.html

Go, Googlebot, Go!

Google crawls a website to index and catalogue the site, so it can display relevant results to searchers.

There’s been a lot of talk about a crawl budget for your site–that is, Google will only crawl so much of a website, so make it count–but we’ve not really heard anything very official on the matter from Google.

Until now.

Check out this post for the full scoop, but here are some highlights:

Googlebot is designed to be a good citizen of the web. Crawling is its main priority, while making sure it doesn’t degrade the experience of users visiting the site. We call this the “crawl rate limit,” which limits the maximum fetching rate for a given site.

Simply put, this represents the number of simultaneous parallel connections Googlebot may use to crawl the site, as well as the time it has to wait between the fetches. The crawl rate can go up and down based on a couple of factors

What can affect crawl rate and reduce your “crawl budget?”

Faceted navigation and session identifiers
On-site duplicate content
Soft error pages
Hacked pages
Infinite spaces and proxies
Low quality and spam content

So make sure your site is very “crawlable” (by having really awesome on-page SEO–check out our guide if you want to rock it).
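One factor on that list–soft error pages–is easy to self-test: request a URL that can’t possibly exist and see what comes back. A healthy server answers 404 (or 410); a 200 means error pages are soaking up crawl budget. My own quick check, with a placeholder domain:

```python
# Request a URL that can't exist. A healthy site answers 404 or 410; a 200
# means "soft 404" pages are soaking up crawl budget.
import uuid

import requests

def soft_404_check(base_url):
    probe = f"{base_url.rstrip('/')}/{uuid.uuid4().hex}"
    status = requests.get(probe, timeout=10).status_code
    verdict = "OK" if status in (404, 410) else "possible soft 404"
    print(f"{probe} -> HTTP {status} ({verdict})")

# soft_404_check("https://example.com")
```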

 

Website Migrations or How to Ruin Everything

http://www.stateofdigital.com/stop-and-step-away-from-the-migration/

There are a lot of good reasons to do a site migration, but there are even more ways in which things could (and probably will) go terribly wrong and tank your site’s visibility, like this:


So how do you do a redesign/migration and not destroy everything?

This post is a good starting point… but I would say the #1 most important thing: hire a developer who knows WTF they are doing (and understands SEO).

So go through this guide if this is something you’re considering and make sure you understand all the little pieces involved.

Working on many migrations, I am often left surprised by the lack of respect that is given to the need to implement redirects. Whether it is key stakeholders, c-suite or developers, there seems to be a reluctance to want to implement them.

Now some of the excuses can be valid, with website speed being one provided by the development team. However, if they are not implemented then there is a serious chance that you could lose a lot of the visibility that you have worked so hard to get over time.

 

After Links, Trust is Everything

https://www.semrush.com/blog/7-ways-to-make-your-website-more-trustworthy/


Overdosing on pop-ups, running bait-and-switch content–doing any one of a number of things that damage your website’s trust–is SEO suicide.

You could have a powerful link profile, but if visitors run back to Google after a few seconds on your site, not even that .gov link you’ve been scheming to get will save you.

This article lists several things that could damage the trustworthiness of your site. Collect all seven!

Be careful with the ad content that displays on your website. Yes, banner and pop-up ads can help you generate extra revenue through your company website… but at what cost? A site that’s overloaded with ads for off-site products is going to come across as unprofessional, if not outright spammy. Ditch the ads. Make sure your whole site is just about you and your customer.

Consider this list a starting point for trust. There’s still a lot you can do to help your site outside of the points listed on SEMrush.

Advanced Content Marketing Strategies (for SEO)

http://authoritywebsiteincome.com/advanced-seo-content-marketing-strategies/

spittin old school fire

Not quite for the beginners out there, this post will help you figure out how to consolidate your current content, how to make your content prop up your SEO, and similar examples from real sites.

This was a really good point (if you know what you’re doing and won’t break a bunch of your pages):

The idea with this strategy is that you reduce the number of posts you are trying to rank for any keywords and combine multiple pages on long tail keywords into 1 MONSTER resource on the given topic. This strategy I believe plays into Google Hummingbird algorithm layer […]

If you’ve been wondering about some options to build up your content into something more powerful for your site, this is a good resource.

 

Google Hates Your WordPress Theme

http://www.elite-strategies.com/google-hates-your-wordpress-theme/

You think your website looks nice, but do you really need a giant parallax image of three people smiling at a computer?

Themeforest wants to sell you a slick video-slider theme that does one of everything, but what do your customers want?

And we know that, for the most part, Googlebot wants what the people want.

[…a particular theme] referenced 23 JavaScript libraries and 20 CSS stylesheets! Kind of nuts. The scariest part is that there are 500+ sales of the theme floating around out there. Most likely a lot of frustrated webmasters.

This really isn’t an isolated situation either. There are 100’s if not 1000’s of these themes out there. They look pretty, they have oodles of “features” but when Google comes to visit, it’s not a good thing. As an SEO, one of our biggest jobs is working with misconfigured websites, old frameworks, and broken themes.

I always like to reference “that last 10%” — your site is optimized, your content is working overtime, and your links are on point, so what do you do next? What are the uncommon tasks that will push your site up that last little bit to #1?

Having a (good) developer go through your ‘out of the box’ WP theme and optimize that thing for a better user (and Spider) experience.
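
If you’re wondering how your own theme stacks up, you can count what a page references. A rough sketch (the URL is a placeholder, and this only sees the raw HTML, not assets injected by JavaScript):

```python
# Rough sketch: count the scripts and stylesheets a page references.
# Only sees the initial HTML -- assets injected by JavaScript won't show up.
import re
import urllib.request

url = "https://example.com/"  # placeholder -- use your own site
html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")

scripts = re.findall(r'<script[^>]+src=', html, flags=re.IGNORECASE)
styles = re.findall(r'<link[^>]+rel=["\']stylesheet', html, flags=re.IGNORECASE)

print(f"External scripts: {len(scripts)}")
print(f"Stylesheets:      {len(styles)}")
```

If those numbers look anything like the 23-and-20 horror story above, it’s time for that developer conversation.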


 

Website Trust Factors

http://www.elite-strategies.com/website-trust-credibility/


Your website’s trust signals are incredibly important in ranking well. We think it’s so important that it is one of the foundations of our link building service.

This post is mostly full of “no duh” advice, but it’s stuff that gets frequently overlooked and can make a really big difference in ranking well.

Check it out and make sure you are not making any of these errors, such as:

We realize that in 2016 a lot of businesses are virtual and might not have an address, but does that raise an eyebrow when a customer is going to make a purchase? All I know is this: we ran a heatmap on about 50 local business websites last year and the “address” portion of the contact us page was the 4th most viewed portion of the website. People definitely still care about an address and phone number, for one reason or another.

 

Technical SEO Checklist

http://www.greenlaneseo.com/blog/the-technical-seo-essentials-checklist/


Speaking of “no duh,” this post is chock full of very helpful things to check your site for–not focused on trust–but on shoring up your technical SEO:

Robots.txt is a file on your site that tells web robots (such as Google’s crawler) where and how to crawl your site. They visit this file everytime they visit your site. It’s pretty easy to accidentally block important things, especially if you’ve had developers working or have just pushed a site from development to live. We see it all the time. Robots.txt has to be pretty much perfectly implemented.

I don’t know how many sites I’ve worked on trying to figure out if it’s a new Google penalty, or a manual action… nope. Robots.txt blocking spiders. True Story.
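
For reference, the classic “oops” is a development robots.txt that ships to production. The paths here are just illustrative:

```
# The robots.txt nobody meant to ship -- this blocks the entire site:
User-agent: *
Disallow: /

# What was probably intended -- block only what shouldn't be crawled:
User-agent: *
Disallow: /wp-admin/
Disallow: /staging/
```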

 

Knowledge Panels in Site Audits

http://www.seobythesea.com/2016/10/knowledge-panels-in-site-audits/

Bill Slawski is a smart dude.

His recent slide deck from Pubcon focuses on the knowledge graph, and how rich the search results pages have become.

Look through this slide show. For real.

Also, the page has some pretty good resources I recommend checking out:

Adding markup vocabulary to your site can result in a knowledge panel showing links to social profiles for a business, as described here:

https://developers.google.com/search/docs/data-types/social-profile-links
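
As a rough sketch of what that markup can look like (the organization name and profile URLs are placeholders; Google’s doc above is the authoritative reference):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://www.example.com",
  "sameAs": [
    "https://www.facebook.com/exampleco",
    "https://twitter.com/exampleco",
    "https://www.linkedin.com/company/exampleco"
  ]
}
</script>
```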

 

Rapid-Fire SEO Insights

This week, domain-related news.

http://www.dnjournal.com/archive/lowdown/2016/dailyposts/20161018.htm

Google has partnered with new gTLD rockstars Donuts to launch Nomulus.

What is Nomulus?

a new open source cloud-based registry platform that, in addition to powering Google’s own new TLDs (including their dotbrand extension – .google – and their generic TLDs like .app), is available to other registry operators as well.

http://dngeek.com/2016/10/popular-domain-extensions-startups-q3-2016/

According to this data (which looks at newly funded startups), .com is still the dominant TLD, but it’s down 5% from last quarter due to the slow but steady rise of the new gTLDs (with .tech being the most popular gTLD).

Learning SEO by Studying Anomalies

http://www.seobythesea.com/2016/10/learning-seo-looking-anomalies/

I try not to do this often, but this article doesn’t really translate well to a summary, so I really recommend you read the whole article.

Basically, this is a story of learning things about SEO by investigating anomalies in the search results.

In this case, someone noticed a date attached to a post in the SERPs, and Bill Slawski did some digging to try and figure out where Google pulled the date from (as it was not written explicitly in the post).

What he found has some pretty interesting implications where Google understanding context is concerned.


Which answers the question that Dan asked about why Google decided that the date in the snippet should be April 10, 2016 instead of the date that it was actually published October 4, 2016. Google may have paid more attention to the context of date formatting than it possibly should have in that post. It does show that Google is paying attention to context, though.

 

Rapid-Fire SEO Insights

http://webmasters.googleblog.com/2016/10/using-amp-try-our-new-webpage-tester.html

Use Google’s new AMP testing tool in your Search Console if you are all about that mobile traffic. Looks like a solid tool.

http://www.localseoguide.com/sometimes-best-seo-strategy-wait/

Interesting (tiny) case study on a client who experienced massive deindexation that coincided with Penguin’s rollout. Turns out it was just a Google algorithm hiccup. Indexation (and rank) quickly went back up to pre-error levels.

Best to observe a lot and act slowly when experiencing a big shake-up in the SERPs. Could be one of a thousand things. Don’t kill your site by panicking.

http://blumenthals.com/blog/2016/10/04/google-testing-advanced-verification-process-for-plumbers-locksmiths/

Google is focusing on businesses that experience a fair bit of fraud (letting someone posing as a plumber or locksmith into your house is serious business), and if you found that person through a Google search, I guess Google feels some responsibility to only send true professionals. This specifically applies to businesses using AdWords.

UTF-8 BOM: A Little Known Robots.txt Problem

http://www.gsqi.com/marketing-blog/utf-8-bom-robots-txt/

First up, WTF is UTF-8 BOM?

BOM stands for byte order mark and it’s used to indicate the byte order for a text stream. It’s an invisible character that’s located at the start of a file

Cool. Why does it matter?

…when your robots.txt file contains the UTF-8 BOM, Google can choke on the file. And that means the first line (often user-agent), will be ignored. And when there’s no user-agent, all the other lines will return as errors (all of your directives). And when they are seen as errors, Google will ignore them.

By the way, this is some very advanced stuff, so I wouldn’t worry too much about this until you’re writing great content, optimizing on-site SEO, etc.

This post goes into how to fix this little problem.
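
If you’d rather check for the BOM yourself, here’s a minimal sketch (the robots.txt URL is a placeholder):

```python
# Minimal sketch: check a robots.txt for a UTF-8 BOM and strip it.
# The URL is a placeholder -- point it at your own robots.txt.
import urllib.request

BOM = b"\xef\xbb\xbf"

with urllib.request.urlopen("https://example.com/robots.txt") as resp:
    data = resp.read()

if data.startswith(BOM):
    print("UTF-8 BOM found -- Google may ignore your first directive")
    with open("robots.txt", "wb") as f:
        f.write(data[len(BOM):])  # cleaned copy; re-upload it yourself
else:
    print("No BOM -- you're fine")
```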


I recommend checking this post out if you’re looking to address the leftover 20% of the 80/20.

 

Yet Another Post About Content Syndication

https://www.searchenginejournal.com/increased-pageviews-34-with-content-syndication/174369/

I’ve mentioned content syndication for SEO wins before.

It’s good stuff! This post digs into where and how brands are syndicating their content. Everyone else is jumping off this syndication bridge, so you might as well…

Buffer syndicates old content from their blog. Typically, the content Buffer syndicates on Medium has also been syndicated on other publisher sites like Huffington Post, NextWeb, etc. An example: a piece that was published in April 2013 on Buffer was then repurposed on Huffington Post in January 2014, then finally posted to Medium in August 2016.

Increased traffic is the end goal of SEO anyway, right?

This is a solid, meaty article that gives you everything you need to push your content to work (more) for you.

 

Rapid-Fire SEO Insights

https://www.searchenginejournal.com/wordpress-com-ushers-set-advanced-seo-tools/175447/

Business users on WordPress.com can now customize meta title/descriptions (only like, a decade or something after that was considered a useful feature?) 🙂

http://seoauv.com/keyword-research/

Google made their keyword planning tool less useful to SEOs (and SEO tools), so doing keyword research in this new era is a bit different than it used to be. Nick has put out a solid guide that should help you through it. This is not a post for the casual SEOer, this is a big guide that offers a ton of value.

Read it. Use it.

https://ahrefs.com/blog/dwell-time/

Dwell time = the amount of time a user spends on a site, originating from the search results and coming BACK to the search results.

Bounce rate = the percentage of single-page visits on a site–less useful (and less important) to Google.

Lots of speculation on this post, but the main point is that dwell time is relevant to SEO.

You can improve dwell time by writing quality content, tying it to a relevant keyword, and not using click-bait.

An Important Note About Penguin and Disavowing Links

http://searchengineland.com/google-penguin-doesnt-penalize-bad-links-259981


Here’s the deal: Google’s Penguin update/ongoing filter will take care of spammy backlinks by simply devaluing them (vs. demoting a site, i.e. “adjusting its rank”).

So you don’t HAVE to disavow links, but Google wants you to Help A Brother out and snitch on those bad links anyway.

Here’s a summary including a Gary “Super Ill” Illyes quote:

So in short, it seems Google Penguin no longer penalizes the site or specific pages but rather ignores/devalues the spammy links and thus the rankings are adjusted. Gary said this should make webmasters “happier,” adding “and that makes me happy.”

 

Get Yourself a Featured Snippet

http://www.bruceclay.com/blog/featured-snippets-with-glenn-gabe/


One thing I probably won’t shut up about any time soon is featured snippets, and how to get that #1 spot.

Glenn Gabe sums it up neatly (there’s a quick HTML sketch after the list):

  • Cover a topic as thoroughly and clearly as you can.
  • Answer the question concisely and provide both the question and answer on the page.
  • Provide a concise section that answers the core query.
  • Use bullets or numbered lists for processes.
  • Provide a strong image near the answer for possible inclusion in the Featured Snippet.
  • Use HTML tables where appropriate.
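
To make those tips concrete, here’s a hedged sketch of the page structure they describe–a question heading, a concise answer right under it, then a list for the steps (all placeholder content):

```html
<h2>How do you make cold brew coffee?</h2>
<p>To make cold brew, steep coarsely ground coffee in cold water for
   12 to 24 hours, then strain and serve over ice.</p>
<ol>
  <li>Coarsely grind 100g of coffee.</li>
  <li>Combine with 1 litre of cold water.</li>
  <li>Steep for 12 to 24 hours in the fridge.</li>
  <li>Strain and serve.</li>
</ol>
```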

Easier said than done, yes, but you should absolutely try and get that featured placement.

Because the only thing better than ranking #1 is being featured…

 

The Power of Linkless Mentions

http://www.bruceclay.com/blog/how-linkless-mentions-help-seo/

Basically, there’s more to life than getting links.

At Supremacy, we’re all about the links. But we also recognize the value of getting your brand mentioned on various websites.

Getting your brand mentioned can beef up your search engine result listing and help build authority. It’s part of a balanced SEO campaign. Beyond those indirect authority-building benefits, the article argues that unlinked mentions also help by getting you to focus on other aspects of your business.

Linkless mentions are a long-term investment that spreads your name far and wide. The engines can see this happening, and they’ll wonder why your brand or product is becoming more important. Then they start testing you in results to see if you please searchers.

That’s the ultimate goal. Pleasing the consumer. The engine wins and so does the business.

I mean… yeah. We’re an SEO site and we <3 SEO, but you should definitely be focusing on the consumer if you’re a business (you know, AFTER you’re ranking in the top 3) 😉

 

Wired’s Slow Switch to HTTPS

https://www.wired.com/2016/08/wired-https-progress

You’ve probably heard that switching your site to HTTPS will offer some kind of ranking boost.

That may be fine for your 10-page affiliate site, but for a massive media site like Wired, it’s a gigantic undertaking. It’s been several months, and they’ve only turned it on for one part of the site.

I think it’s interesting to watch how a big, successful brand takes on an SEO issue we all (probably) had to deal with at some point.

Temporary SEO changes on your site are a possible consequence of transitioning to HTTPS. Although we’ve been working hard to manage SEO for HTTPS migrations according to industry best practices, our initial results for the Security section have left us uncomfortable with turning on sitewide HTTPS so soon


^ looks like it’s getting better, though…

 

A Guide on Competitor Analysis

https://www.gotchseo.com/seo-competitor-analysis/

Ideally, your SEO efforts should include more than just good link building.

Competitor analysis has a lot of great uses. Here are three, according to this article:

There are three reasons why competitor analysis is critical:

1. You can find what they’re doing well
2. You can find strategic advantages
3. You can find link opportunities

This isn’t something you should get super caught-up in and spend all your time on, but at the very least, be aware of what’s going on with the competition.

Knowing is half the battle, yo.


 

WTF Do All Those Metrics Mean — AHREFs Edition

https://ahrefs.com/blog/seo-metrics/

Yeah, so we checked the KD and it was 45. The UR/DR score was medium-low and the live backlinks were reasonable, so I said ‘yeah, we can probably rank that page.’

There are a lot of metrics that get thrown around in the SEO space. You can reference a site’s Domain Authority (Moz), Trust Flow (Majestic), or Domain Rating (Ahrefs), but what do those things actually mean? How are the numbers determined?

Transparency is *usually* a good thing–especially in a case such as this, where understanding the data behind a metric can help you understand what the metrics are telling you.

Ahrefs just released a post where they share the meaning behind their proprietary terms, and the values they consider when determining a specific metric (especially keyword difficulty).

A nice post to get an idea of what’s going on there, or if you’re a rookie, a nice post to understand what a lot of the acronyms being thrown around actually mean.

We often see people explain Ahrefs’ URL Rating as a replacement for Google PageRank metric – but they’re not the same.

We indeed have started out with a PageRank-like formula, but then “UR” underwent quite a few iterations with a goal of creating a metric that would have the highest possible correlation with Google rankings.

And as you can tell from the graph below, URL Rating correlates with Google rankings better than any of our “unprocessed” backlink metrics:

(graph: how Ahrefs’ metrics correlate with Google rankings)
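
“Correlates with rankings” is something you can sanity-check on a small scale yourself. A toy sketch with made-up numbers (requires scipy):

```python
# Toy sketch: rank-correlate a metric with SERP positions (needs scipy).
# All numbers are invented for illustration.
from scipy.stats import spearmanr

positions = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]            # SERP positions
url_rating = [78, 65, 70, 54, 49, 51, 40, 38, 35, 22]   # metric values

rho, p = spearmanr(positions, url_rating)
print(f"Spearman rho: {rho:.2f} (p={p:.3f})")
# Strongly negative rho = higher metric values sit at better (lower) positions.
```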

Youtube Marketing and SEO

http://www.slideshare.net/RyanStewart3/13-tips-to-explode-youtube-channel-growth

We’re not super big into YouTube, as we are pretty focused on link building.

But it’s not a bad idea to double down on YouTube if you’re in the kind of niche where videos work well.

As everyone always says, “it’s the number 2 search engine, might as well rank well on it…”

Ryan Stewart has been impressing me this year by consistently producing good content.

Here’s the slide deck where you can learn how to get your YouTube game leveled up.

*Note: In regards to YouTube, we are big fans of both ranking YouTube videos IN Google, and embedding relevant YouTube videos on the pages you want to rank. A video on your page is one of the best on-page SEO ranking hacks around.

A Massive Guide to Using SEMrush

http://www.matthewwoodward.co.uk/reviews/semrush/


Matthew Woodward blows it up this week with a giant post on how to make the most of SEMrush.

The tool shows you:

Mentions: Where your brand – whether that’s your own name, or your business name – is being mentioned around the internet.
Backlinks: Where your backlinks are coming from, and opportunities for you to act on them, such as editing anchor text or networking with that site owner.

It’s the best of Buzzsumo and Mention in one easy-to-use tool that integrates seamlessly with the rest of your projects.

Good stuff. Give it a read if you a) want to use SEMrush more efficiently, and b) have 11 hours to read it (and 300 more to implement all the advice).
Just in case you were wondering, we usually use Ahrefs as our go-to backlink checker & overall SEO toolkit.

 

Optimizing Your Site for Knowledge Graph Domination

https://moz.com/blog/functional-content-auditing-masterplan

I’ve highlighted stories relating to the knowledge graph before, but never one so useful and instructional as this.

Go ahead and skip the first half of the Moz post, because the good stuff comes after a lot of talk about the author’s agency (Zazzle Media) and a tool the agency has made (the Content Optimization and Auditing Tool).

**It’s probably a great tool, but you have to hand over a bunch of contact/personal info in order to get your “report,” which you don’t really need for the purposes of going after the knowledge graph**

But in highlighting this post, I am specifically pointing to the examples of how websites are capturing this valuable “featured snippet” real estate.

For all the rookies out there, winning the knowledge graph means your content gets pulled into the featured box that sits above the regular organic results.

In the article, scroll down to “the perfect page structure” and go from there. There are some great analyses on the structure of pages winning the featured snippet. It’s a bit long, but it’s worth the read.

Claiming your own Featured Snippet then requires a focus on content structure and on answering key questions in a logical order. This also means paying close attention to on-page HTML structure to ensure that Google can easily and cleanly pick out specific answers.

 

Spend Your Crawl Budget Wisely

Yoast

stock photo, what are you even trying to say?

If you’ve got a big site, Google is only going to spend so much of its resources crawling it.

If your site is small or you’re still working on issues like “writing quality content” or “where do I get backlinks,” this is probably not the post for you.

But if your SEO is dialed in, it’s probably worth taking a look at how to optimize the crawling of your site via sitemaps and robots.txt.

If you have sections of your site that really don’t need to be in Google, block them using robots.txt. Only do this if you know what you’re doing, of course. One of the common problems we see on larger eCommerce sites is when they have a gazillion ways to filter products. Every filter might add new URLs for Google. In cases like these, you really want to make sure that you’re letting Google spider only one or two of those filters and not all of them.

Good stuff.
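
To make the filter example concrete, here’s a hypothetical robots.txt for that kind of eCommerce scenario (the parameter names are invented):

```
User-agent: *
# Keep the faceted-filter URL explosion out of the crawl.
# Parameter names are invented -- match them to your own URLs.
Disallow: /*?*price=
Disallow: /*?*size=
Disallow: /*?*sort=
```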

If you need a little more Smash Digital in your life, be sure to follow us on Facebook, Twitter, and (lol) Google+ for lots more updates and tips.

Using Aggregate Review Schema to Get SERP Stars

http://www.whitespark.ca/blog/post/83-how-to-use-aggregate-review-schema-to-get-stars-in-the-serps


Like the way those stars look beneath all your competitors in the SERPs?

Well relax; now you can have your very own stars.

Here’s how:

Almost any Content Management System supports editing the HTML of a page or a part of a page. Access to the HTML is all you need. Because the JSON-LD code is invisible for the visitor, it’s still indexed as markup by the search engines.
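
Here’s a minimal sketch of that JSON-LD (the business name and numbers are placeholders–the rating data you mark up must reflect real reviews shown on the page):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "38"
  }
}
</script>
```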

Check out the post if you’ve got stars in your eyes (Star Pun!), and learn how to get them for your very own site.

 

Tie All Your Sites Together in Search Console

https://webmasters.googleblog.com/2016/05/tie-your-sites-together-with-property.html

Your mobile site, your AMP traffic, your app–all can now be treated inside Search Console as one site.

Property Sets will treat all URIs from the properties included as a single presence in the Search Analytics feature. This means that Search Analytics metrics aggregated by host will be aggregated across all properties included in the set. For example, at a glance you’ll get the clicks and impressions of any of the sites in the set for all queries.

This new feature will be rolling out over the next few days, so keep an eye out for it if you’ve got some properties that could benefit.

 

De-indexed Pages Blocked in Robots.txt

http://ohgm.co.uk/de-index-pages-blocked-robots-txt

Has this ever happened to you?

You specify pages to block from being crawled/indexed, but there the little bots go, crawling and indexing pages you didn’t want crawled or indexed, making a MESS of everything.


THERE’S GOT TO BE A BETTER WAY!

In this test was a site that hasn’t been accessible to crawl for over two years. I think this makes for a good test – we are removing a particularly stubborn stain here. But because of this I also believe it’s a particularly slow test. The only exposure to those URLs is the ancient memory Google has of them (it can’t access the sitemap or internal links) and the submission to an indexing service.

There’s also the possibility that Google is electing to keep these URLs in the index because of third party signals. Like the links agency types were hitting them with from 2011 to 2013.

So this is a small solution to an infrequent problem, but if you have a URL you are trying to get out of Google’s index, it’s an important solution.

To oversimplify and sum it up: if you disallow a URL in robots.txt, Google can’t crawl it, which means it never sees a noindex tag on the page itself–so the URL can linger in the index.

The solution tested here is to swap out “disallow” for “noindex” in the robots.txt. The starting point of the experiment: a URL that had been “disallowed” via robots.txt was showing up in Google like this:

(screenshot: the URL still indexed, with its description blocked by robots.txt)

Several days later, many of the domains had dropped out of the index.
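
For reference, the directive being tested looks like the sketch below. Worth stressing: robots.txt “Noindex” was never officially supported by Google, so treat this as an experiment, not a guarantee.

```
User-agent: *
# Unofficial, never-guaranteed directive the experiment tested:
Noindex: /page-to-remove/
```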


 

Yoast on Robots.txt

https://yoast.com/ultimate-guide-robots-txt/

Speaking of robots…

Yoast, the most drawn man in SEO, has released a pretty epic post on all things Robots.txt

This is probably more information than you’ll ever need, but it’s always a good thing to know WHERE that information lives, so when you need it, you know where to find it.

The robots.txt file is one of the primary ways of telling a search engine where it can and can’t go on your website. All major search engines support the basic functionality it offers. There are some extra rules that are used by a few search engines which can be useful too.

This guide covers all the uses of robots.txt for your website. While it looks deceivingly simple, making a mistake in your robots.txt can seriously harm your site, so make sure to read and understand this.


As I always like to point out, this is also a good example of some seriously high-quality, relevant, in-depth content. This page is gonna rank super well for many robots.txt-related searches. Learn from it!

The Great CTR/SEO Debate

Stone Temple


I’ve seen a quiet back-and-forth discussion about whether or not Click-Through Rate affects SEO rankings.

On the one hand, there’s not a lot of hard data on this subject (it’s just too difficult to isolate one ranking factor and point to it as definitive).

On the other, it does make sense that Google would rank sites that get more clicks for a given query higher than those that get skipped over, or that searchers quickly bounce back from.

Within the SEO space, there are a few well-known case studies from Rand Fishkin seeking to prove that CTR is, in fact, a ranking factor.

This article (from Stone Temple) seeks to look at both sides of the argument, and come away with something objective. They even explain where the results Rand found might have come from (instead of CTR):

I’ll stand by my guess that I made in the above conversation that some part of the Google algos that are designed to pick up on hot news events is triggering the behavior seen in Rand’s experiments. This would explain the rise in the results and the drop afterwards when the click activity tapered off. But, we can’t know 100% for sure.

Fantastic article here, definitely recommend clicking through and reading the whole thing.

 

On Blog Comments and SEO

http://www.thesempost.com/comments-good-for-google-seo/


I love a good “going against the grain” post. Here, The SEM Post talks theory, and why blog comments are good for SEO.

Token John Mueller quote:

So if these comments bring useful information in addition to the content that you’ve provided also on these pages, then that could be a really good addition to your website. It could really increase the value of your website overall. If the comments show that there’s a really engaged community behind there that encourages new users when they go to these pages to also comment, to go back directly to these pages, to recommend these pages to their friends, that could also be a really good thing.

In the end, the comments argument comes down to:
a) user experience/engagement, and
b) are they contributing quality content to a page (and are they being indexed by Googlebot?)

The article links to several examples of comments helping the SEO efforts of a website (including having a featured snippet pulled from the comments of a web page).

 

Starter SEO Training

http://imfromthefuture.com/digital-marketing-seo-training/


Shout out to the SEO beginners! The agency I’m From the Future has put together a killer resource on all the important SEO basics, plus a helpful glossary of SEO/Marketing terms at the end.

All you SEO #Ballers might want to sit this one out…

You can think of linkbuilding and outreach as sister terms, kind of like SEO and Digital Marketing. The two are practiced in tandem in that outreach is used to supplement linkbuilding.

Linkbuilding is especially important because Google uses links as a means of discovering new websites and then using it’s UNEARTHLY algo to then rank the pages it finds on each site.

Also, this is a fantastic example of well-produced and helpful content–content that gets shared and linked to.

 

Huge SEO for Shopify Resource

https://www.digitaldarts.com.au/the-expert-guide-to-shopify-seo


Continuing with the theme of beginners, here’s an in-depth post on how to rock your Shopify SEO.

101+ tips is a lot, so this guide is best consumed by searching for the specific area you need help with, fixing it, and then moving on to the next.

WordPress gets a lot of love in the SEO tutorial space, so I’m happy to see this rock-solid post from Digital Darts tackling Shopify.

1. Collections Structure
A collection in Shopify is a group of products. It acts like the various sections in a physical store that tells customers where to find a type of product. A well-designed group of collections tell people and Google what each is about to find a product.

Design your collections first for people. Google hates it when store owners attempt to please the search engine at the cost of user-experience. No one wants a drop-down with 100 brands.

Brickell have four primary collections: face, shave, body & hair, and collections. The collections group leads to a drop-down of “Bestsellers”, “Travel”, and “Kits”:

 

Site Redesigns and SEO — How Not to F*** it Up

http://www.seerinteractive.com/blog/seo-website-redesign-checklist

Just a single error when redesigning a site can tank the SEO results that took years to earn.

You’ve probably heard a horror story about some WordPress designer forgetting to uncheck the “discourage search engines from indexing this site” box and wondering where all the organic traffic went.


But that’s the SEO equivalent of the IT desk telling you to troubleshoot your tech issues by making sure the computer is plugged in.

This post is thorough, and takes you through everything that could go wrong when redesigning a website (from an SEO standpoint).

Monitoring the cache date will give you an estimate of when you can expect to see changes being picked up. This will depend on the crawl rate of your site, but once the redesign is cached, you should begin seeing changes in the index in a couple of days.

This is a great resource to give you a framework to work from, instead of just crossing your fingers. Good luck…


 

Mapping 301 Redirects

[When I add the link to this article, it keeps auto-inserting a very large snippet and image, so no link for them. Search “BuiltVisible Mapping 301 Redirects” to read the whole post]

In keeping with the theme of the above story, this post digs into how to map 301 redirects so you don’t wreck your site’s SEO potential.


This is one of those posts that you don’t really care about until you need it, but then you really REALLY care about it. If you have no use for it now, put it in your Evernote or something so you can easily come back to it.

On large sites, both H1 tags and titles are likely to be heavily templated, making them excellent for redirect mapping. Typically, I like to start with H1’s as they tend to be shorter and are more likely to yield a positive match. To perform our mapping, we’re once again going to utilise Excel’s VLOOKUP function.
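
The post does this in Excel; if you’d rather script it, the same VLOOKUP-style join is a few lines of pandas (the URLs and H1s below are made up):

```python
# Hypothetical sketch: VLOOKUP-style redirect mapping in pandas,
# matching old URLs to new URLs on exact H1 text. Data is made up.
import pandas as pd

old = pd.DataFrame({
    "old_url": ["/widgets/blue", "/widgets/red"],
    "h1": ["Blue Widgets", "Red Widgets"],
})
new = pd.DataFrame({
    "new_url": ["/shop/widgets-blue", "/shop/widgets-red"],
    "h1": ["Blue Widgets", "Red Widgets"],
})

mapping = old.merge(new, on="h1", how="left")  # the VLOOKUP equivalent
print(mapping[["old_url", "new_url"]])  # rows with NaN need manual mapping
```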


The Time it Takes for a Link to Influence Rankings

https://moz.com/blog/how-long-does-link-building-take-influence-rankings

If you sign up with a link building service (like our RankBOSS solution) or otherwise invest in SEO, you’ll need to manage your (or your clients’) expectations.

Many an SEO has told clients to expect to start seeing results in months, not days or weeks. It can be frustrating for everyone involved, but it’s the way things work.

This post from Moz takes this oft-quoted rule and backs it up with data.

Gotta love that cold, hard data.


I picked out 76 links pointing to pages which are all similar to each other in content, and we didn’t change that content (significantly) for 6 months. I focused on rankings for target keywords with a 25–35% Keyword Difficulty Rating. I looked at two versions of their target keywords, so I could have a bit more data. The results aren’t super surprising to SEOs, but they’re often questioned by the managers of SEOs, and now you have graphs to prove what you’ve been saying all along.

 

Abandoning a Dedicated Blog for Medium.com

https://medium.freecodecamp.com/we-just-abandoned-our-blog-for-medium-you-probably-should-too-33e742a1d49


Smart advice says to own your own platform, and build your home base there. Putting all your eggs in Facebook’s basket can really come back to haunt you when they change their visibility algorithms, or ban your group for any reason they want.

It’s a way to always have access to the readers or community you build.

But could there be a scenario in which it makes sense to move a fairly popular blog over to someone else’s platform, permanently?

That’s the case this article makes. It depends on what your end-goal is. If it’s to have even MORE people read the popular articles you write, it may make sense.

With Medium, anyone can write an article on their own initiative, then submit it to our Medium publication. After some light editing, I can syndicate their article to the tens of thousands of people who follow Free Code Camp’s publication. And there’s no ambiguity that they wrote the article, and at the bottom of the article, there’s a button readers can click to follow them.

 

The Most Valuable Featured Snippets

https://getstat.com/blog/most-valuable-featured-snippets/

Search Analytics gives us a very interesting look at featured snippets in the latest post from a series about featured snippets.

In case you’re wondering, a featured snippet is the answer box displayed ABOVE the first organic result, below the last ad.

So what’s the big deal?

Featured snippets usually get pulled from a search position outside the top spot, effectively giving sites an opportunity to outrank the first spot and pick up some extra traffic.

This post looks at the value of traffic obtained from featured snippets:

In our study, out of one million high-CPC keywords, 92,832 returned a featured snippet. Within that smaller data set, we found that 1,570 of those had annual traffic with an estimated value of over one million dollars.

We used some pretty simple math to calculate this estimated annual traffic value for featured snippet queries (CPC x annual US search volume).
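
The arithmetic really is that simple. With made-up numbers:

```python
# Made-up numbers, same formula as the quote: CPC x annual US search volume.
cpc = 4.50                 # average cost-per-click, in dollars
monthly_searches = 40_000  # US search volume per month
annual_volume = monthly_searches * 12

estimated_value = cpc * annual_volume
print(f"Estimated annual traffic value: ${estimated_value:,.0f}")  # $2,160,000
```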

This is just one topic from their comprehensive white paper. I definitely suggest going and taking a look at the whole thing!

 

On Redirecting Old, Expired Pages

http://www.hobo-web.co.uk/should-i-301-redirect-old-product-pages-to-existing-category-pages/

yes, there’s actually some stock photography for 301 redirects…

What should you do with old, expired, or no longer relevant pages? Let them 404? Redirect them to the home page?

It’s not as straightforward an answer as you might think.

This post digs into John Mueller’s responses to some questions concerning what to do with these old pages. Here are the most important takeaways:

  • 301 redirects to a category page would be treated as a soft 404.
  • If you have an old/expired/no-longer-relevant page that can be redirected to a similarly relevant page, that is OK.
  • Certain redirects to the home page will be treated as soft 404s as well.

What is a soft 404? Good question:

Returning a code other than 404 or 410 for a non-existent page (or redirecting users to another page, such as the homepage, instead of returning a 404) can be problematic. Firstly, it tells search engines that there’s a real page at that URL. As a result, that URL may be crawled and its content indexed. Because of the time Googlebot spends on non-existent pages, your unique URLs may not be discovered as quickly or visited as frequently and your site’s crawl coverage may be impacted (also, you probably don’t want your site to rank well for the search query…) – source: Google

Lots more interesting stuff in this post. Recommend you read through and decide what applies to your site.

 

Increased Organic Traffic (Case Study)

http://webris.org/get-more-organic-traffic-right-now/


^ Look at that sexy organic traffic growth.

This post, which looks at increasing organic traffic for an ecommerce site through some smart keyword research, covers one of the most important things I think SEOs need to understand.

To crudely sum it up:

  1. You’re probably not going to outrank a brand for their own brand or product
  2. Stop writing content based on what the monthly searches are for a particular keyword.
  3. Be the first to own relevant content: look at the conversations happening NOW to help you determine what to write about. The monthly searches will probably be 0, but that doesn’t really matter here.

In the video walkthrough, Ryan shows you how he is using social media sites like Instagram and Snapchat to get a feel for what people are really searching for, without waiting six months for the Google keyword tool data to catch up.

Smart stuff, pay attention!

Google picks up on the back end of what’s happening on social media. So instead of looking at dated key words, we wanna look at the source and we wanna attack it and capture it, and be the first entrance into that piece of content before other SEOs and content marketers start creating content around it. The best place to do that honestly is Instagram, and Snapchat’s even better because Snapchat is right now in the moment. Instagram is still very hip, people are sharing but Snapchat is right in the moment. I can’t show you Snapchat, it’s just something you gotta pick up on your own. But Instagram’s a great place to start.

 

How Google Works

https://builtvisible.com/how-google-works/

Recently, Google Engineer Paul Haar presented at SMX West and it is very relevant to our interests. The talk was called “How Google Works: A Google Ranking Engineer’s Story.”


So query understanding. First question is do we know any named entities in the query? The San Jose Convention Centre, we know what that is. Matt Cutts, we know what that is. And so we label those. And then, are there useful synonyms? Does General Motors in this context mean…? Does GM mean General Motors? Does GM mean genetically modified? And my point there is just that context matters. We look at the whole query for context.

BuiltVisible transcribed the whole talk, with slides and links for further reading. This post, by the way, is an example of quality content that offers a lot of value.

A small summary is not going to do this long talk much justice, so I’ll end with this:

Make time to read this whole article, or watch the entire presentation. The video is 32 minutes long, and here it is, for your convenience:

How NOT to do SEO

https://www.seroundtable.com/a-local-seo-got-60-clients-suspended-overnight-21828.html

Go Directly to Jail

Look, we’ve all made dumb mistakes and nobody’s perfect. I’m not highlighting this post to kick someone when they’re down, because I’m sure this person had the worst week ever.

But if you don’t learn from the mistakes of others, you’re definitely at risk to make similar mistakes, so here we go…

A consultant had over 60 clients on one account, and they were all suspended overnight. In general, if a single business gets picked up for spam, any other businesses owned by the same Google account can get suspended as well. For those of you working in bulk with many locations on the same account, there might be other rules there, I’d be interested to hear if that’s a danger for larger organizations as well. Either way, for those of you with just a lot of different clients, always make sure to keep them on their own designated account.

All 60 clients had their businesses suspended from Google Local. The SEO consultant had listed himself as the owner of all accounts instead of the manager. Either this in itself (it’s not a best practice) or some disavow/spam report/sketchy link building (it’s unclear) got the account pinged, which affected 66 different businesses.

Ouch

Bonus tip: PLEASE never put multiple sites into the same Google account of any kind (Analytics/Webmaster Tools/etc.). There have just been too many reports of this causing mass slaughterings.

 

A Buffet of SEO Tips

http://www.matthewbarby.com/seo-tips/

Matt Barby is one of the people whose updates I never miss. He publishes helpful, in-depth content, and it is usually worth the read.

This post is no exception; it covers 19 tips across a wide range of topics. Here’s a taste:

From an organic search point of view, video has a few issues, especially if you’re not hosting your video content through YouTube. The biggest problem is that search engines can’t understand the content within video (yet).

One way to maximise the amount of keywords that your video content can rank for is to create full text transcripts to accompany them.

If you’re a hardcore SEO, a lot of these will be good reminders of tactics you already know/have thought of/haven’t used in a while. If you’re in the beginner or intermediate SEO stage, you’ll find a lot of gold in this here post.

 

RankBrain and Google’s Top Ranking Signals

http://searchengineland.com/now-know-googles-top-three-search-ranking-factors-245882

Last year, when Google debuted RankBrain, they claimed it was the 3rd-most-important ranking signal, but declined to say what number one and two were.

This week, Andrey Lipattsev (Search Quality Senior Strategist) said the two most important ranking signals were…

Links and Content.


I’ve also seen a lot of chatter this week about RankBrain not actually being the 3rd most important ranking factor. There’s a lot of conjecture about all this, but links and content have always been the big focus, and, for me, they will continue to be.

 

Technical SEO: A Guide

https://ma.ttias.be/technical-guide-seo/


I’ve talked about this before, but I see so many sites that clearly neglect the technical aspects of their site.

Lucky for you, there are some A+ resources out there to help you diagnose and fix common problems holding your SEO back. This post by Mattias is in-depth and heavily technical. I recommend going through it and, if it’s over your head, having someone who understands these points take the checklist to your site and tighten up the loose screws. You’ll rank better for it.

Use the HTML elements to your advantage: h1 headers for your most important titles, h2 for the next level titles, h3 for the next ones, and so on. Google parses your content and uses those tags as its guideline.

Semantic markup goes a lot further than just h1 tags though. If you’re an event- or booking-site, you can add structured data to your markup that can be parsed and shown directly into Google’s search result pages.
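
For the event-site case the quote mentions, here’s a minimal, placeholder-filled sketch of that structured data:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example SEO Meetup",
  "startDate": "2016-11-01T19:00",
  "location": {
    "@type": "Place",
    "name": "Example Venue",
    "address": "123 Example St, Example City"
  }
}
</script>
```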

 

Trackers: Creepy AF

http://jacquesmattheij.com/trackers


Marketers and online entrepreneurs can become pretty insulated when it comes to all the stuff we do to optimize our conversions, increase traffic through ads, and so on.

This post is a fun, if brutal reminder that your online “personal space” is constantly being violated, sold to the highest bidder, and relentlessly tracked. It’s interesting how the shifting point of view (consumer/marketer) puts its own spin on this behavior. It’s definitely creepy when the process is personified.

But this tracking and advertising system works really, really well. Case in point: Facebook made $2.9 billion in MOBILE AD REVENUE ALONE in just the 2nd quarter of 2015… (source)

A couple of weeks ago I went to the local shopping centre looking for a thermometer. After entering one store upon leaving without buying anything a tracker was assigned to me. I didn’t think much of it at first, but he followed me dutifully around the shopping centre, took careful note of how I walked. Whenever I visited a store he made a note in his little black book (he kept calling it my profile, and he didn’t want to show me what was in it so I assume it was actually his, rather than mine). Each of those stores of course assigned trackers to me as well and soon enough I was followed by my own personal veritable posse of non-descript guys with little black books making notes.

 

Moving to a CDN

https://css-tricks.com/moving-to-a-cdn/


CDNs (content delivery networks) can be a great thing for your site’s SEO. Faster page loads and better resistance to hacks are two benefits that come quickly to mind.

However, moving even a few of your sites to a CDN is not the easiest thing to do. There’s a lot that can go wrong and work against your SEO fortunes. But moving over 1,100 sites? Madness!

In this interview, an SEO interviews his agency’s CTO to get the scoop on the who/what/where/why of moving that many sites. Really interesting; recommended reading.

Was the problem with point A our hosting, or the lack of a CDN?

Both, sort of. Our top cause of downtime was malicious traffic that our hosting environment was not positioned to handle. Also, our server response times consistently hovered at around two seconds. Response times should be under half a second, and two seconds should be nearing a complete page load!

 

16 SEO Experiments

http://seosherpa.com/seo-experiments/

I love SEO experiments. Even research-backed SEO experiments can be easily disproven, because the algorithms that drive search placement can be manually tweaked to muddy the waters.

That’s the case with the first of the 16 experiments presented in the post. Rand Fishkin’s CTR test produced an amazing result, pushing a result halfway down the page up to the #1 spot in a matter of hours.

When SEO Sherpa sought to reproduce the same test, they ran into some very lackluster, inconclusive results.

Rand was concerned that, in response to his public post showing how clicks may influence Google’s results more directly than previously suspected, Google may have tightened their criteria around this particular factor.

It could be that the private tests didn’t use enough clicks for the niche, or that the clicks were not as diverse as when Rand tweeted, or 500 other things that weren’t controlled.

In trying the test again, they were able to reproduce results similar to the first test (shooting a newly published blog post to the top of the page). So while, in the end, they achieved the same result twice, a series of failed tests between the two successful ones is hard to ignore (though the post seems to do just that).

There are 15 more experiments that are really interesting and worth reading. Like everything else related to SEO, take with a grain of salt…

 

The Guide to Online Publicity Campaigns

https://moz.com/blog/advanced-guide-online-publicity-campaigns

Last week we shared an article on PR wins and fails. This week, we’re highlighting an article that takes you through the process of running an online PR campaign for your business. It takes you through how to publicize yourself, how to take advantage of media tie-ins, how to land great interviews, and more.

It’s a meaty post that deserves your attention if you’re using content to generate traffic (and get links, of course).

“Content marketers” pitch me:

1.) To share or link to some random article, and they do so often when
2.) I have no connection to or interest in the topic at all

Publicists pitch me:

1.) To write about an idea because
2.) They already know that I have a connection to or interest in that topic

I ignore or delete the pitches from “content marketers.” Following the pitches from publicists, I may choose to include their source, study, or idea in some future piece in the publications to which I contribute. Most “link earning” methods are poor imitations of traditional publicity practices.

 

SEO Audit Checklist

https://www.gotchseo.com/seo-audit/


It’s always a good idea to give your site(s) an audit if it’s been a while. Like going to the dentist twice a year to make sure all’s well, giving your site a once-over every now and then lets you fix any stray SEO issues that could be holding back your organic traffic.

This guide takes you through several steps for making sure everything is okay, from competitor analysis, to keyword cannibalization:

One of the most important factors to look for in an audit is keyword cannibalization.

“Keyword cannibalization” is when two pages are competing for the same keyword.

This can confuse Google and force it to make a decision on what page is “best” for the search query.

 

Ecommerce SEO

https://ahrefs.com/blog/ecommerce-seo/


Ahrefs’ content has been consistently strong lately. This one is a guest post by David McSweeny, and it’s a meaty guide to ecommerce SEO–not just “do this, don’t do that,” but everything illustrated in the context of two case studies.

Toy Universe were able to double their search traffic in a short period by putting in the time and effort to create unique content for their products, and by implementing solid, sensible SEO best practices across the site.

By adopting a ‘top down’ approach and working on the most important products and categories first, they were able to maximize the SEO benefit and see some quick results from their efforts

 

A Field Guide to Spider Traps

https://www.portent.com/blog/seo/field-guide-to-spider-traps-an-seo-companion.htm


Despite the scary title, this post is actually about how an un-optimized site can keep search engine spiders from discovering all of its important pages.

Now THAT’S scary.

This is a meaty post with a lot of data, examples, and most importantly, info on how to fix this problem. I recommend you at least skim through the post enough to see if your own site has this problem.

E-commerce sites are particularly good at creating spider traps. They often have product category pages you can sort and filter using multiple criteria such as price, color, style and product type. These pages often have URLs like “www.site.com/category?pricerange=1020&color=blue,red&style=long&type=pencils.”

 

Keep Your Knowledge Graph Info Up-to-date

https://www.searchenginejournal.com/update-google-knowledge-graph/156179/


SEOs spend a lot of time and energy trying to get their site to rank at the top of the search results, but how many keep an eye on how their site actually looks ON the search engine page?

Google wants you to keep an eye on your site’s knowledge graph and make sure it’s correct and up-to-date.

In order to request a change to a Knowledge Graph card, you have to:

  • Own an online presence that represents the entity in the Knowledge Graph card.
  • Ensure that the online presence — such as a website, YouTube channel, or Google+ page — is included in the Knowledge Graph card.
  • Be signed in to the Google account which owns that online presence.

 

Do 404 Errors Hurt SEO?

http://www.blindfiveyearold.com/do-404-errors-hurt-seo


A thoughtful piece on the impact of internal 404 pages, along with a step by step process to identify and fix up the errors. Worth a read and a quick site audit to clean up this potential loose end.

404 errors themselves may not directly hurt SEO, but they can indirectly. In particular, internal 404s can quietly tank your efforts, creating a poor user experience that leads to a low-quality perception and pogosticking behavior.

 

How To Find Website Optimization Opportunities

http://www.matthewbarby.com/website-optimization-opportunities/


One of the first things I do is use Ahref’s Position Explorer tool to show me any keywords that a URL is currently ranking for – I tend to look for keywords on page 2 or 3 that could be bumped up to page one with some on-page tweaks. You can also use SEMrush and Search Console data for this.

Matthew Barby brings a solid process for going through a website and taking care of optimization opportunities. So much emphasis in SEO is placed on link building and penalties that the health of the website itself sometimes gets lost in the hype. Take a stroll through the steps in the post to make sure your site is working for you, not against you.

Ranking In “Other” (Social) Search Engines

http://www.quicksprout.com/2015/09/28/social-seo-simplified-how-to-optimize-for-the-other-search-engines/


Google is the king of search — no argument there. But aside from Bing (and occasionally YouTube), most people don’t really think of big social sites as search engines. But there are some interesting statistics to consider:

Obviously, Google is the largest search engine.

The misconception, however, is that social search engines aren’t large themselves.

Take Twitter, for instance, which gets an impressive 2.1 billion queries per day. That’s not far behind Google.

Consider that Facebook reached 1 billion searches per day back in 2012, which has only grown since then.

And finally, YouTube—the largest video site—gets over 3 billion searches per month. It may not be as big as the others, but 3 billion searches is still a lot.

This article takes you through how to capitalize on each social site so your content gets found as part of these billions of daily searches. A lot of this advice is kind of… standard and well known, but there are probably some things you’re neglecting that would make a difference…

 

Google Ranking Factors

https://northcutt.com/wr/google-ranking-factors/

So you want to rank a website? Here are over 200 ranking factors to consider as you’re making plans.


Northcutt.com does a great job of going over each factor, and providing a rating of “myth” or “concrete.” The interactive post contains a bunch of links from sources for each “concrete” ranking factor. This one is a *must* for SEOs.

The Most Commonly Used Words in High-Ranking Title-tags

http://www.siegemedia.com/the-most-common-words-in-high-ranking-title-tags

I’m a sucker for data you can actually use.

Siege Media did some number crunching and got a list of the most-used keywords in high-ranking pages (pulled from the title tags). One thing that’s interesting is the split between “buying intent” keywords and “informational.”

In the end, we were left with a list of seven keywords that occurred enough in the pages’ title tags to glean insights from. For the purposes of giving you the full picture/allowing you to make your own conclusions, we’ve also included the number of times the word occurred on the original keyword list. The higher the number, the more likely that number influenced the total number of occurrences in the titles of our dataset.

Read the whole article; they really break down each keyword and how the information can be used. A solid post; a must-read!

Topical Trustflow: Five Use Cases

https://blog.majestic.com/general/5-ways-successful-seos-topical-trust-flow/


Majestic makes the case for replacing the (publicly) dead PageRank metric with Topical Trust Flow.

Topical Trust Flow (not tropical) is how authoritative and trustworthy a domain or URL is within its niche and what the topic of the content is about. The content that links to a page helps determine its Topical Trust Flow, so the more referring domains the more accurate Topical Trust Flow is. It also isn’t easily faked out like me and 16,996 other LinkedIn users, which you’ll learn about next. This is just one reason why other digital marketers had this to say when Topical Trust Flow was first launched. Yes, I even picked users with good Topical Trust Flow for our niche.

The post outlines five ways that SEOs are using topical trust flow to do better in the SERPs. If you’re a Majestic user, this is a solid post on how to get more from the tool.

How to Leverage Social Media for SEO

http://marketingland.com/leverage-social-media-seo-link-building-137962

This was an interesting look not at how to rank better by getting more likes and shares, but at how to reach out to link prospects to try and get a link (if that’s something you’re into). With SEO, it’s good to try several different things, so I think this is worth a read.

One of the great advantages of doing link building with Google+ is its versatile built-in outreach engine.

Once you have an outreach Circle, you can choose to share an update specifically with the individuals contained in that Circle.

If people in the Circle have circled you back, then you can also opt to send them an email notification with that update. Check the “Also send email from you to [Circle Name]” box, and they will receive the update in their inbox.

 

The Biggest SEO Mistakes

The Biggest SEO Mistakes: 6 Search Engine Disasters To Avoid (and Tips for Fixing Them Fast)

I like posts like this because, no matter how much of an SEO #boss you are, it’s easy to get caught up in the link building glamour and overlook some fundamental things when working on a site.

This post goes through the most-overlooked mistakes in SEO and how to address them. Good stuff.

sitelinks

For most sites, a significant chunk of their visitors come from Google searches for the brand’s name. So a lot of people see the sitelinks and they become important entry points into the website.

But few marketers check these or manage them actively. A lot of brands have sitelinks that are irrelevant to marketing (such as “login”) or redundant (several pages that are so closely related that they look the same).

 

Keyword Research for Content

https://blog.ahrefs.com/keyword-research-for-content/

We’ve linked to some great posts on keyword research before, but this one is a great guide to using keyword research specifically to plan the content you write. There’s no reason to just guess at what kind of content might help you rank better when you can put in a little work and get a solid road map.

While the search volume per month can easily be found with the Google AdWords Keyword Planner or the SEMrush keyword overview, determining if the user intent is valuable for you will require knowledge of your website and what you would like your visitors to do.

So, it is for you to decide if you want only readers for your content or buyers for your service.

The use of modifiers like Cheap, Best, Buy, or Find in combination with the head term will generate a keyword focused on buyer intent.
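
To make that concrete, here’s a tiny Python sketch that stitches buyer-intent modifiers onto a head term; the modifier list is just the one from the quote above:

```python
# Generate buyer-intent keyword variations from a head term.
# The modifiers come straight from the article's examples.
BUYER_MODIFIERS = ["cheap", "best", "buy", "find"]

def buyer_intent_keywords(head_term):
    """Combine a head term with buyer-intent modifiers."""
    return [f"{modifier} {head_term}" for modifier in BUYER_MODIFIERS]

print(buyer_intent_keywords("running shoes"))
# ['cheap running shoes', 'best running shoes',
#  'buy running shoes', 'find running shoes']
```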

This is a good tactical post, and it could help you write content that ranks better in the SERPs.

Getting Into the Knowledge Graph 101

https://www.serpwoo.com/blog/experts/knowledge-graph-101/

knowledge-graph-example

This week’s required reading comes from SERPwoo, one of my favorite SEO tools. The Knowledge Graph is used for any person, place, or thing with enough authority to justify it. This post is all about lending yourself/your business enough authority to have it triggered for a related search.

Google has been experimenting with various features of the Knowledge Graph for the past few years now. Just remember, as recently as 3 years ago there was no Knowledge Graph whatsoever. Google is still in the experimental phase of things. For instance a few months ago Google decided to start adding social network icons in certain SERPs. In the beginning they only gave it to musicians and a-listers, but now the floodgates have been opened to everyone. New features come and go almost on a daily basis.
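
The post digs into the tactics, but one common way SEOs try to feed the Knowledge Graph (no snippet can guarantee a panel, to be clear) is Organization markup with sameAs links to your profiles. A minimal sketch, with placeholder details:

```python
import json

# Organization markup (schema.org) connecting a brand to its profiles.
# Every value below is a hypothetical placeholder.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Widgets Co.",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
    "sameAs": [
        "https://twitter.com/example",
        "https://www.facebook.com/example",
    ],
}

# Embed the output in your pages as a JSON-LD <script> block.
print('<script type="application/ld+json">')
print(json.dumps(organization, indent=2))
print("</script>")
```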

 

The Unfortunate Consequences of Ignoring Links

http://searchengineland.com/unfortunate-consequences-ignoring-links-226259

I love posts with research that read like case studies. Search Engine Land looks at a few cases where a popular site/service doesn’t focus on link building and suffers for it in terms of visibility. A very interesting read!

netflix speed index

The Netflix ISP Speed Index is great, but it’s underperforming in terms of links and visibility. (I didn’t even know it existed prior to writing this post.) If Netflix started thinking strategically about links, they could build their way up to the same level as Ookla. There’s no reason a brand as large and visible as Netflix shouldn’t be raking in links for such a useful tool, especially considering all the exposure that has come from the net neutrality debate.

Netflix has already done the hard part and created an excellent tool; now it’s just time for some intelligent promotion to acquire the links that tool deserves.

 

How to do Competitor Research

https://moz.com/blog/illustrated-seo-competitive-analysis-workflow

competitors
How great is stock photography?

Competitor research is one of the core services any SEO should be proficient with. Understanding what competitors are doing to rank well, and how they are generating (or earning) links is part of a balanced breakfast SEO analysis.

This Moz post is money: it takes you through the workflow of analyzing the competition, showing a thorough process step by step. If SEO is important to what you do, you might want to take some notes here…

In order to facilitate this process (and make it easy to replicate, control, and document), I’ve created a step-by-step workflow with the different activities and factors to take into consideration, including identifying SEO competitors, gathering the potential keywords to target, assessing their level of difficulty, and selecting them based on defined criteria:

This post is short and to the point, featuring a graphic that says a lot more than a bunch of text could.
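
The workflow itself lives in that graphic, but to give you a flavor of the first step (identifying SEO competitors), here’s a toy Python sketch using made-up keyword sets:

```python
# Score candidate competitors by how many of your target keywords they
# also rank for. All data below is invented for illustration.
my_keywords = {"link building", "keyword research", "technical seo"}

candidate_sites = {
    "competitor-a.com": {"link building", "technical seo", "local seo"},
    "competitor-b.com": {"keyword research"},
    "unrelated.com": {"cat photos"},
}

overlap = {
    site: len(my_keywords & keywords)
    for site, keywords in candidate_sites.items()
}

# Sites sharing the most keywords are your closest SEO competitors.
for site, shared in sorted(overlap.items(), key=lambda kv: -kv[1]):
    print(site, shared)
```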

4 Unnoticed Technical SEO Issues

http://www.stateofdigital.com/4-unnoticed-technical-seo-issues/

Technical SEO is one of those easy-to-overlook areas of search engine optimization that should really get more attention. It’s not as sexy as link building, I know, but it can really hold you back if you mess it up.

This post brings it with four technical SEO issues that may be messing up your rankings (a quick redirect-chain checker sketch follows the list). Check it to learn about:

  • Redirect Chains
  • Layered Navigation
  • Robots.txt
  • Schema Mark-up
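
To give you a taste of the first one, here’s a quick redirect-chain checker in Python. It leans on the third-party requests library, and the URL is a made-up example:

```python
import requests

# Print every hop in a redirect chain. Long chains waste crawl budget
# and dilute link signals, so you generally want a single 301.
def print_redirect_chain(url):
    response = requests.get(url, allow_redirects=True, timeout=10)
    # response.history holds the intermediate redirect responses, in order.
    for hop in response.history:
        print(f"{hop.status_code}  {hop.url}")
    print(f"{response.status_code}  {response.url}  (final)")
    if len(response.history) > 1:
        print(f"Chain of {len(response.history)} hops; consider one 301.")

print_redirect_chain("http://example.com/old-page")
```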

For eCommerce it is essential, and I cannot recommend it enough to any of my clients. Not just because we have entered the world of structured data and we need to provide the search engines with context about what we are trying to say, but at present it still differentiates your website in the SERPs.

There are a range of schema markups that are available, so you do not have the excuse of saying ‘I don’t work on an eCommerce store’. To find out more information then take a look here – http://www.schema.org/ and if you are looking for help to create your schema then here is another handy tool – http://schema-creator.org/.

If you don’t know a lot about schema, follow those links and get to learnin.

schemaorg
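
And if you want to see what the markup actually looks like, here’s a minimal Product example with placeholder values; validate anything real with Google’s structured data tools before shipping it:

```python
import json

# Minimal schema.org Product markup for an eCommerce page.
# Every value below is a made-up placeholder.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "image": "https://www.example.com/widget.jpg",
    "description": "A widget that does example things.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "19.99",
        "availability": "https://schema.org/InStock",
    },
}

print('<script type="application/ld+json">')
print(json.dumps(product, indent=2))
print("</script>")
```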

Building Signals of Trust and Authority

http://www.stateofdigital.com/third-pillar-of-seo-authority

Barry Adams writes about a topic near and dear to my heart: building authority. He touches on the importance of good, in-depth, LONG content:

serpIQ-content-length-vs-rankings

He also writes at length about social shares. YES, they are important, but are you killing your users’ experience with social share buttons that clutter and distract from your content and make your page slower to load?

I agree with many UX designers that social share buttons can be clunky, ugly, and intrusive. But UX has a very different set of goals and objectives than SEO (no matter how loudly some people shout that ‘UX is the new SEO’, that doesn’t make it true).

Sometimes you have to be intrusive and ‘in your face’ to get the right result. In the context of SEO, the right result is to acquire links to aid your website’s rankings. Social sharing is an intrinsic part of achieving that result, and that means you need social sharing buttons on your site.

As the article says, it’s a balance between UX and SEO. It’s not a black-and-white rule that says do or do not use social sharing; it’s about testing different options and using the one that achieves harmony between the two.

What buttons/plugins do you use (if any) for your site’s social sharing strategy?

Is Guest Blogging Dead?

Is Guest Blogging Dead

 

Hardly…
You just have to do it right

In a blog post yesterday, the one and only Matt Cutts struck lightning bolts of fear into the hearts of thousands of SEOs across the world with a quick peck of his keyboard.

Matt stated (and I quote):

“So stick a fork in it: guest blogging is done”

Oh Noes. The Skyz are falling

The SEO sky is falling

 

What it really means:

It’s important to always think about the motivations behind these pleasant updates from dear Mr. Cutts. His job is not only to help coordinate algorithmic anti-SEO changes; it’s also to head the anti-SEO propaganda machine.

What Matt Cutts really meant:

“Guest posting is currently working really well, and our algorithm is having a tough time distinguishing what is real and what is not.”

Think about this: what, on average, is the best type of backlink you can get?

  1. Niche relevant of course
  2. Contextual links within well written content
  3. On real sites with real traffic

 

Ok, well what kind of links come from guest posts?

  1. Niche relevant of course
  2. Contextual links within well written content
  3. On real sites with real traffic

See what I’m getting at here?

How To Avoid Guest Posting Problems

Easy. Make it look real.
The problem with a lot of the guest posts people are being offered is the ridiculously low quality being produced. These are really just posts on a network, except the SEO providers convince you “these are real guest posts.” They aren’t. They are just simple network posts that are a little better written, and they cost five times as much.

Now don’t get me wrong, I love using some grey-hat links from time to time for all kinds of SEO projects. But at least I’m upfront about what they are. You would be better off just getting high-PR network links instead of these crappy “guest posts,” which carry the same possibility of a manual penalty slap without the same algorithmic power.

Ok Ok, How to make guest posts work

Checklist:

  1. Use 4-5 other authority outbound links within the article
  2. DO NOT put your link in the “author section”. Put it in the article
  3. Write an actual quality article (brain explosion)
  4. Use multiple images and/or video
  5. Do actual outreach for guest posts (or hire it out)
  6. Use natural anchor text; do not force a keyword (co-occurrence works anyway)

 

This is more hassle, but if you’re going to go the guest blogging route then you might as well take the extra effort and do it properly. If it’s done properly, it’s very difficult to even tell it’s a guest post, and if you go up against the big bad manual review team, your chances of success will be much higher.

Have an opinion, thought, or anything else? Leave a comment below!
Edit: After a few personal emails from you guys, I want to reiterate something: if you’re doing guest posts specifically for SEO purposes, they don’t have to actually BE legitimate, they just need to LOOK legitimate. Real guest posts are awesome in that they will provide traffic if done properly, but for SEO benefits you just want them to look the part. Avoid the “author bio” type of backlinks, use lots of other natural outbound links, and you should be safe.




Karl Kangur

When being a chess prodigy turned out to be too demanding, Karl converted to being a marketing nerd. He loves to theorycraft and when he starts talking about SEO, he can't stop.