This Week In SEO 40
HTTP/2 and SEO, Deceptive Download Buttons, and More!
An Introduction to HTTP/2
Did you even know there was a SECOND HTTP? According to GitHub,
HTTP/2 is a replacement for how HTTP is expressed “on the wire.” It is not a ground-up rewrite of the protocol; HTTP methods, status codes and semantics are the same, and it should be possible to use the same APIs as HTTP/1.x (possibly with some small additions) to represent the protocol.
The focus of the protocol is on performance; specifically, end-user perceived latency, network and server resource usage. One major goal is to allow the use of a single connection from browsers to a Web site.
Search Engine Journal’s got a write-up answering questions about HTTP/2 you didn’t even know you had!
The biggest takeaway from all this is that HTTP/2 is more modern and faster, letting servers communicate more quickly with browsers, thus making sites faster *coughRankingSignalcough*. The biggest question, though, is whether Googlebot will be able to deal with your modern HTTP/2 site:
Google’s announcement of support for HTTP/2 shows that they will likely be adding more user experience indicators to the algorithms or, at the very least, adding HTTP/2 as a ranking signal this year. Even if there isn’t a significant boost to your rankings, remember that you are better serving your users by having a faster website.
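Curious whether a server is already speaking HTTP/2? You can check without any third-party tools. Here's a minimal sketch using Python's standard `ssl` module: during the TLS handshake, client and server use ALPN to agree on a protocol, and "h2" means HTTP/2. The hostname below is just an example, not an endorsement of any particular test target.

```python
import socket
import ssl

def negotiates_http2(host, port=443, timeout=5):
    """Return True if the server selects HTTP/2 ("h2") via ALPN,
    False if it falls back to HTTP/1.x, or None if the check fails
    (e.g. no network, bad hostname)."""
    ctx = ssl.create_default_context()
    # Offer both protocols; the server picks one during the handshake.
    ctx.set_alpn_protocols(["h2", "http/1.1"])
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.selected_alpn_protocol() == "h2"
    except OSError:
        return None

if __name__ == "__main__":
    print(negotiates_http2("www.google.com"))
```

If this prints `False` for your own site, enabling HTTP/2 is usually a server-config change (e.g. in nginx or Apache) rather than anything you touch in your HTML.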
No More Deceptive Download Buttons
Google isn’t putting up with your malicious “Flash is out of date, click here to update” ads served on your website anymore. Google calls these “social engineering ads.”
Tricky/deceptive download buttons will now get the OMFG DON’T VISIT THIS SITE treatment, like this:
Consistent with the social engineering policy we announced in November, embedded content (like ads) on a web page will be considered social engineering when they either:
Pretend to act, or look and feel, like a trusted entity — like your own device or browser, or the website itself.
Try to trick you into doing something you’d only do for a trusted entity — like sharing a password or calling tech support.
Now’s a good time to make sure your site isn’t hosting/embedding any risky “social engineering ads.”
AI is Transforming Google Search
TL;DR — The head of AI at Google is now the head of search, and with their increased adoption of deep learning neural networks to drive search rather than hand-tuned algorithms, maybe we can learn to love penguins and pandas again (and start hating on robots).
Yes, Google’s search engine was always driven by algorithms that automatically generate a response to each query. But these algorithms amounted to a set of definite rules. Google engineers could readily change and refine these rules. And unlike neural nets, these algorithms didn’t learn on their own. As Lau put it: “Rule-based scoring metrics, while still complex, provide a greater opportunity for engineers to directly tweak weights in specific situations.”
But now, Google has incorporated deep learning into its search engine. And with its head of AI taking over search, the company seems to believe this is the way forward.
A Field Guide to Spider Traps
Despite the scary title, this post is actually about how an unoptimized site can trap search engine spiders in an endless maze of low-value URLs, burning crawl budget and preventing them from discovering all the important pages on your site.
Now THAT’S scary.
This is a meaty post with a lot of data, examples, and most importantly, info on how to fix this problem. I recommend you at least skim through the post enough to see if your own site has this problem.
E-commerce sites are particularly good at creating spider traps. They often have product category pages you can sort and filter using multiple criteria such as price, color, style and product type. These pages often have URLs like “www.site.com/category?pricerange=1020&color=blue,red&style=long&type=pencils.”
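To see how quickly those filterable category pages multiply into a spider trap, here's a back-of-the-envelope sketch in Python. The filter values are made up for illustration; real sites that also allow multi-select values like `color=blue,red` fare far worse, since the count then grows with every *combination* of values.

```python
from itertools import product

# Hypothetical filter values for a single e-commerce category page.
price_ranges = ["0-10", "10-20", "20-50"]
colors = ["blue", "red", "green", "black"]
styles = ["long", "short"]
types = ["pencils", "pens", "markers"]

# Every combination of filters yields a distinct crawlable URL.
urls = [
    f"www.site.com/category?pricerange={p}&color={c}&style={s}&type={t}"
    for p, c, s, t in product(price_ranges, colors, styles, types)
]

print(len(urls))  # 3 * 4 * 2 * 3 = 72 URLs for one category
```

Seventy-two URLs for a single category of pencils, all showing near-duplicate content. Multiply that across hundreds of categories and add sort orders, and a spider can spend its entire crawl budget before ever reaching your important pages. Common fixes the post covers include canonical tags, `robots.txt` rules, and parameter handling.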
Keep Your Knowledge Graph Info Up-to-date
SEOs spend a lot of time and energy trying to get their site to rank at the top of the search results, but how many keep an eye on how their site actually looks ON the search results page?
Google wants you to keep an eye on your site’s knowledge graph and make sure it’s correct and up-to-date.
In order to request a change to a Knowledge Graph card, you have to:
Own an online presence that represents the entity in the Knowledge Graph card.
Ensure that an online presence — such as a website, YouTube channel, or Google+ page — is included in the Knowledge Graph card.
Be signed in to the Google account which owns that online presence.