This Week In SEO 93
Knowledge Graph, Local SEO, Meta Descriptions & More!
The Rise and Fall of Featured Snippets
https://moz.com/blog/knowledge-graph-eats-featured-snippets
Featured Snippets: 0
Knowledge Graph: 1
The two weeks spanning the end of October and the beginning of November saw a pretty significant SERP switch-up.
The number of featured snippets for tracked keywords fell, and the number of knowledge graph results rose (see the charts in the Moz post).
We’ve highlighted several articles in the past on how to win the featured snippet for your keywords. Many agencies and businesses have put a lot of energy and resources into winning the featured snippet (the REAL #1 ranking). So, you might be asking… WTF, Google?
It’s likely that Google is trying to standardize answers for common terms, and perhaps they were seeing quality or consistency issues in Featured Snippets. In some cases, like “HDMI cables”, Featured Snippets were often coming from top e-commerce sites, which are trying to sell products. These aren’t always a good fit for unbiased definitions. It’s also likely that Google would like to beef up the Knowledge Graph and rely less, where possible, on outside sites for answers.
The real winner here is Wikipedia, as it is generally the source of the data for the knowledge panels (and “winner” is used loosely here: Wikipedia isn’t really compensated for being the engine behind knowledge panels. Maybe Sundar Pichai will donate $3 to Wikipedia during this funding drive).
41 Things to do to Rank Better for Local SEO
https://www.bruceclay.com/blog/local-seo-search-ranking-factors/
How do you rank higher on Google Maps and in Google local search results? Improving your local search rankings is possible, and the results are very real. A Google study found that:
- 4 in 5 consumers use search engines to find local information.
- 50 percent of local smartphone searches lead to a store visit in less than a day.
- 18 percent of local searches on a smartphone result in a sale within a day.
Useful if you’ve got a local biz and you’re looking to up your game (or if you’ve got a VA with extra time).
Go to the full post for commentary on each item; the slide deck (from Clay’s Pubcon presentation) is embedded there as well.
Meta Description Gets More Characters in SERPs
https://searchengineland.com/google-officially-increases-length-snippets-search-results-287596
Updated Yoast plugins coming to a WordPress install near you…
SERP snippets (the bit under the title/website) are getting the Twitter treatment: a character-limit upgrade to just under 230.
Some webmasters and SEOs may consider updating their meta descriptions, but I don’t believe Google would recommend doing so. Snippets are more often dynamically generated based on the user query and on content found in both the meta description and the visible page copy. If Google is going to go with a longer snippet, it will likely pull that content from the page.
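If you do decide to audit your meta descriptions against the longer limit, a quick check is easy to script. A minimal sketch, assuming a ~230-character display budget (the figure above; the real cutoff varies by query and device, and the function name is mine, not anything official):

```python
# Hypothetical audit helper: classify a meta description against a
# display-length budget. ~230 chars is an assumption from the article,
# not a hard limit published by Google.

NEW_LIMIT = 230  # rough post-update display length

def snippet_status(meta_description: str, limit: int = NEW_LIMIT) -> str:
    """Return a rough verdict on whether a description fits the snippet."""
    length = len(meta_description.strip())
    if length == 0:
        return "missing"            # Google will generate one from page content
    if length > limit:
        return "likely truncated"   # expect a "..." cutoff, or a rewrite
    return "fits"
```

Run it over a crawl export of your descriptions and you have a prioritized rewrite list; but per the caveat above, a "likely truncated" result doesn't mean Google will actually show your description at all.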
Alternative Link Building Methods
https://www.gotchseo.com/expired-domains/
This is a good post on four link building methods using expired domain names (and only one of them is “build a PBN”).
- Find a relevant and high-quality expired domain
- Extract its backlink profile
- Find contact information for all the quality link opportunities
- Reach out and let the linker know that they are linking to a dead resource/website
- If they respond, pitch the idea of them replacing the dead link with a link to your website
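Steps 2–4 above lend themselves to scripting. Here is a minimal sketch of one piece: given the HTML of a page that links out, find the anchors pointing at the expired domain so you can cite the exact dead links in your outreach email. Class and function names are illustrative, not from the post:

```python
# Sketch: extract links on a page that point at an expired (dead) domain.
from html.parser import HTMLParser
from urllib.parse import urlparse

class DeadLinkFinder(HTMLParser):
    """Collects hrefs whose host is the dead domain (or a subdomain of it)."""

    def __init__(self, dead_domain: str):
        super().__init__()
        self.dead_domain = dead_domain.lower()
        self.dead_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc.lower()
        if host == self.dead_domain or host.endswith("." + self.dead_domain):
            self.dead_links.append(href)

def find_dead_links(html: str, dead_domain: str) -> list:
    parser = DeadLinkFinder(dead_domain)
    parser.feed(html)
    return parser.dead_links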
Time-consuming, but less risky than a regular ol’ PBN (assuming you wouldn’t do a good job with your PBN). Click through for the other three methods.
Interesting stuff.
When You Accidentally Block Googlebot
http://www.localseoguide.com/googlebot-may-hitting-bot-blocker-urls/
There are many legitimate reasons to have some bot-blocking logic in your site’s code. The use case in the article is preventing your site from being scraped. That’s legit.
However, I’ve also done the thing where I unchecked, and forgot to re-check, the “allow search engines to index this site” box, and it took me way too long to realize…
This article highlights an instance of Googlebot being unintentionally blocked. I definitely recommend the read:
Many sites use bot blockers like Distil Networks to stop scrapers from stealing their data. The challenge for SEOs is that sometimes these bot blockers are not set correctly and can prevent good bots like Googlebot and BingBot from getting to the content, which can cause serious SEO issues. Distil is pretty adamant that their service is SEO-safe, but I am not so certain about others. We recently saw a case where the http version of a big site’s homepage was sending Googlebot to a 404 URL while sending users to the https homepage, all because the bot blocker (not Distil) was not tuned correctly.
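To make the failure mode concrete, here is a deliberately naive user-agent blocker, which is my illustration of how a mistuned rule catches Googlebot, not how Distil or any real vendor actually works:

```python
# A deliberately naive bot blocker -- NOT a real vendor's logic.
# A substring rule written for scrapers also matches the "good bots".
SCRAPER_SIGNATURES = ["bot", "crawler", "spider", "scrapy"]

def naive_should_block(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(sig in ua for sig in SCRAPER_SIGNATURES)

# It catches the scraper it was written for...
assert naive_should_block("Scrapy/2.0 (+https://scrapy.org)")

# ...but "bot" also matches Googlebot, which is the SEO disaster:
googlebot = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
assert naive_should_block(googlebot)  # blocked!
```

User-agent strings are also trivially spoofed, which is why Google’s own guidance is to verify Googlebot with a reverse DNS lookup (the host should resolve to googlebot.com or google.com, confirmed with a forward lookup) rather than trusting the UA string alone.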