There has been some tremendous content published discussing Google’s Hummingbird update, the rebuild that allows the search engine to move beyond just keywords and begin better understanding the meaning behind searches. A.J. Kohn, Aaron Bradley, Danny Sullivan, Gianluca Fiorelli, and others have all written strong articles discussing Hummingbird’s impact on search, and I’ve learned quite a bit from reading each resource.
While I see content-centered SEO strategies remaining strong in light of Hummingbird, I believe there are a few areas where site owners can adjust strategy to capitalize on the changes. However, adjusting is only possible once the basics of Hummingbird and semantic search are grasped. Much of the Hummingbird content written thus far has taken on a slightly academic bent. I hope this post will contribute to a better understanding of how Hummingbird could improve Google’s search results, and motivate site owners to create more “beautiful” web pages (more on that later).
The needs of users must still be addressed
The goal is still to create relevant, high-quality content for users and make that information accessible through strong site architecture and outreach campaigns. Penguin put many black-hat SEOs out of business. Hummingbird doesn’t change the fundamental paradigms of the SEO game.
Danny Sullivan said as much in a Hummingbird basics post he published on Search Engine Land. Addressing the “is SEO dead” question in the Hummingbird context, Sullivan had this to say:
No, SEO is not yet again dead. In fact, Google’s saying there’s nothing new or different SEOs or publishers need to worry about. Guidance remains the same, it says: have original, high-quality content. Signals that have been important in the past remain important; Hummingbird just allows Google to process them in new and hopefully better ways.
The difference between Hummingbird and previous algorithm updates
Recent updates, like Panda and Penguin, have focused on the behavior of site owners and SEOs. Panda bakes in website quality assessments gleaned from surveying college students about the characteristics of sites they consider high quality, and those they consider spammy or undesirable.
Penguin, and its subsequent refreshes, help Google better identify web spam, especially low-quality links and link schemes. These updates bring to fruition Google’s webmaster guidelines that were at one time only aspirational: Google told site owners what to do, but couldn’t police violations.
Google is now much better at rewarding signals it associates with quality and punishing spam, which has caused quite a bit of disruption in the search results.
Hummingbird is different. It focuses less on webmaster behavior and more on how Google itself processes information. Hummingbird is Google’s attempt to bring the semantic web to life by building a search engine that understands not just words, but the meaning behind them.
How should this impact SEO strategy?
Beautiful pages become more important
Hummingbird doesn’t fundamentally change SEO; it does, however, make adhering to certain best practices all the more important. Useful content, actually written for users, must still be a point of emphasis. Beautiful pages, defined as pages that address a topic in depth, pay attention to detail, and are the digital equivalent of a well-manicured garden, should thrive like never before.
Content publishers should be aware of Google’s increased sophistication as they sit down to put pen to paper (or finger to keyboard, as it were). Writing granular, keyword-driven articles that overlap in meaning and substance is not a good investment of time. If Hummingbird works properly, it will throw searches that Google once viewed as distinct, because of their different keywords, into the same arena of competition, because of their shared meaning.
On a keyword-centric web, Google takes content at face value based on keywords. This leaves the door open to publish many similar pages that all refer to the same thing, each ranking for its specific keyword slot.
I envision the concept as a horse race. On a keyword-driven web, articles that match very specific keywords all line up at the starting gate to compete against each other. On the semantic web, a stampede of horses breaks down the segmented race gates, and all charge forward together, with many different keywords competing in much larger, meaning-driven races.
For example, a travel site publishes different, but similar, pages for “Motown,” “Detroit,” and the “Motor City.” In a keyword-focused world, each page may be returned separately, depending on the exact keyword the searcher used. On the semantic web, Google understands that all three phrases can refer to Detroit, Michigan (Motown is also a genre of music). Understanding that all of these searches refer to a big city in Michigan, Google will want to return the best Detroit page, not the page with the most relevant keywords. In other words, the Detroit Wikipedia page can appear for “the Motor City” searches, as it does in my screenshot below.
The best page may very well be determined by links and other traditional SEO signals; however, in light of semantic search, one page can now be visible for more queries than it was before. Once keyword limitations are removed, quite a few “SEO focused” pages become redundant and less relevant.
As Aaron Bradley said in a recent post:
The relationships between entities facilitated by the ability to uniquely and unambiguously identify them, and to provide unambiguous data about them. And if you’re successful in this, your presence in search will be extended, and you’ll be connected to searchers looking for very specific things. You’ll appear not just for “blender,” but for “blender recommendations,” “good blenders under $200,” and “blender under 18 inches tall,” along with implicit queries that the search engines are increasingly able to work out from the query context and information about the user, like “blenders recommended by my friends” or “machine for crushed ice margaritas” or “compare blenders and juicers.”
The practical effect is a keyword stampede, in which races that were once neatly partitioned by keyword slot are now open to bold new contestants.
Of course, keywords still matter
The keyword web existed in a vacuum. Words triggered a series of events that displayed information related to those words, but not all the connotations of their meaning. The semantic web understands concepts, synonyms, and entities. But all of these are still described with words (keywords, if you will). Those words trigger all the connotations of meaning that give rise to fully developed semantic search.
As I’ve described above, the takeaway is not that keywords are irrelevant, but that they’re no longer as segmented as they used to be. It’s the meaning race now, not the keyword race.