Google has been investing in a method of indexing information that is distinct from simply crawling web pages for keywords; instead, it looks at structured data about entities (aka the Knowledge Graph). To quote Search Engine Land:
It contains information about entities and their relationships to one another, meaning that Google is increasingly able to recognize a search query as a distinct entity rather than just a string of keywords.
This ties into a larger push by the World Wide Web Consortium (the web's standards body) to move the web away from keyword-driven search and toward something called the semantic web. With the semantic web, search engines return more accurate and more relevant results when queries are phrased in natural language rather than as individual keywords.
Put differently: it's the difference between asking a friend a factual question and looking it up in the index of an encyclopedia.
In search, this shows up as a box of information appearing directly in the results. Sometimes the box sits in the sidebar where ads usually reside; other times it appears above the fold in the SERP.
Pictured above are images of the Google Knowledge Graph responses. However, Bing has implemented something similar, called Snapshot. Search engines like Hakia and Powerset were built on semantic search from the ground up.
To optimize for the Knowledge Graph, you need to bear in mind the explicit and implicit entities on your website, which means first understanding what explicit and implicit entities are.
Explicit entities are pieces of information that are structured for easy identification and cataloguing by search engines and web crawlers. This is achieved in a number of ways, but the most common are:
rel="author", which identifies the author of a blog post. The attribute has other uses, but its application is more limited.
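To see why this counts as an explicit entity, consider how easily a crawler can pull it out of a page. The sketch below uses Python's standard-library HTML parser on a hypothetical blog-post snippet (the author name and URL are invented for illustration):

```python
from html.parser import HTMLParser


class AuthorLinkParser(HTMLParser):
    """Collects href values from <a> or <link> tags marked rel="author"."""

    def __init__(self):
        super().__init__()
        self.authors = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("a", "link") and attrs.get("rel") == "author":
            self.authors.append(attrs.get("href"))


# Hypothetical blog-post markup using rel="author".
snippet = """
<article>
  <h1>Optimizing for the Knowledge Graph</h1>
  <p>Written by <a rel="author" href="/authors/jane-doe">Jane Doe</a>.</p>
</article>
"""

parser = AuthorLinkParser()
parser.feed(snippet)
print(parser.authors)  # ['/authors/jane-doe']
```

Because the attribute is machine-readable, no guesswork is involved: the crawler knows this link names the author, not just another page.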
By contrast, implicit entities refer to information gained from context. This might be a prose paragraph that discusses opening and closing hours, a block of text at the bottom of the page with the store address in a single line, or a restaurant description that includes categories and keyword phrases. The technology used to suss out this information is called Natural Language Processing, and it represents one of the most cutting-edge developments in programming and search. (All of my research indicates that this folds into content marketing.)
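To make the contrast concrete, here is a toy sketch of the kind of extraction an NLP pipeline attempts on unstructured prose. Real systems use full natural-language processing; this regex only catches one hypothetical phrasing of opening hours, and the restaurant description is invented:

```python
import re

# Invented restaurant description: the hours are implicit in the prose,
# not marked up for machines the way an explicit entity would be.
description = (
    "Mario's Trattoria is a family-run Italian restaurant. "
    "We are open from 11am to 10pm, seven days a week."
)

# A real NLP system handles many phrasings; this pattern matches just one.
match = re.search(r"open from (\d{1,2}[ap]m) to (\d{1,2}[ap]m)", description)
if match:
    opens, closes = match.groups()
    print({"opens": opens, "closes": closes})  # {'opens': '11am', 'closes': '10pm'}
```

The fragility of the pattern is the point: because implicit entities can be phrased countless ways, recovering them reliably requires NLP rather than simple matching.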
When you optimize for the Knowledge Graph, you need to tier your keywords: you're targeting user intent in search, not simply keywords and keyword density. As semantic search improves and gains more traction, you'll need to look beyond traditional PPC strategies of buying keywords. SEMrush suggests looking at keywords in three tiers:
Most importantly, good SEO practices never change, regardless of technological shifts. As long as your strategy rests on solid fundamentals, your business will continue to reap benefits through algorithm shakeups and emerging technologies.