
In a previous post, we covered Google’s shift toward making artificial intelligence an integral part of its search platform. This shift has taken place over the last few years and has changed both how Google interprets websites and how it ranks them in search. Rather than relying solely on the 200+ historical signals, like backlinks, keywords within a page’s content, and internal linking, Google now also uses user behavior to rank a page.

In hindsight, this methodology makes complete sense. If a site is ‘optimized’ really well but provides a bad user experience and generates low engagement, it shouldn’t rank on the first page, regardless of those optimizations. The reason this approach wasn’t widely adopted until the last few years is the sheer amount of computational power and testing it required.

When RankBrain launched, it was used on only a small percentage of queries. Now, it affects almost every single search in Google. With this much data, Google can fine-tune the artificial intelligence piece of its ranking algorithm and make it even smarter. This will make ranking harder for some sites, but sites that follow the Webmaster Guidelines and provide useful information will perform well, potentially even in their infancy, without a large backlink profile.

There is some trouble in the waters, though.

How Google is Using AI for Knowledge Graph Results

As a search engine, Google’s goal is to provide results that best fit your query. For some queries, Google has found that it’s a better experience to serve the result directly in the SERPs through its Knowledge Graph rather than sending you to a website that has the answer. This Knowledge Graph result can take the form of a snippet pulled from a website, or it can be a single-line answer with no link to the website the data was pulled from.

Google does not include links to sites when the answer is widely known information: celebrities’ birthdays, the heights of buildings, the weather, or other standardized data. Google parses all of this data as it crawls the internet, and it has learned to make associations between a search query and those deep reservoirs of information. Over time it has learned that for a query like ‘What is the population of Denver?’ it can pull information from multiple sources and display it directly in the SERPs. Sending you to a website with that information would take extra time and is not as seamless an experience as providing the data right away.
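To make that cross-source idea concrete, here is a minimal, purely illustrative sketch in Python. Google’s actual entity-resolution pipeline is not public; the function, tolerance, and population figures below are all invented for illustration.

```python
# Hypothetical sketch: picking a consensus answer for a factual query
# ("What is the population of Denver?") from values found on several
# crawled pages. This only illustrates the idea of cross-source agreement;
# it is not Google's actual method.
from collections import Counter

def consensus_answer(values, tolerance=0.02):
    """Return a value a majority of sources agree on (within `tolerance`
    as a fraction), or None if the sources disagree too much."""
    votes = Counter()
    for v in values:
        # Treat a value as a vote for the first seen value it is close to.
        match = next((seen for seen in votes
                      if abs(v - seen) / seen <= tolerance), None)
        votes[match if match is not None else v] += 1
    value, count = votes.most_common(1)[0]
    return value if count > len(values) / 2 else None

# Population figures scraped from five made-up sources.
sources = [715_522, 716_000, 715_522, 680_000, 715_900]
print(consensus_answer(sources))  # -> 715522 (the majority cluster)
```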

[Screenshot: Knowledge Graph result for ‘denver population’]
There is no direct link to another website and clicking ‘Explore More’ only takes the visitor to another Google property.

Google’s Use of Artificial Intelligence in the SERPs

Looking beyond Knowledge Graph, let’s dig in a little deeper to understand how Google uses artificial intelligence in the SERPs.

Essentially, Google is running blind taste tests to figure out which flavor people like most. If one flavor continually gets a negative reaction, Google won’t keep subjecting searchers to something that doesn’t taste very good.

This taste-testing process is then layered on top of a powerful relevance engine, a platform that can understand queries based on their relationship to previously seen topics in real time, producing near-instantaneous changes to the SERPs.

While his test wasn’t comprehensive, Rand Fishkin of Moz fame published an often-referenced case study in which he had his social media following click on a specific listing. It originally ranked 7th, but over the course of the following three hours, the listing moved into the first position.

Others have repeated this study more recently with similar results. It is important to remember that personalization will always influence the rankings each individual sees, but there is something to be said for the influence user behavior has over rankings.

In a more abstract example, imagine three sites in top positions where the average click-through rate hovers around 10%, but one receives a 17% click-through rate. Google will notice this. Based on the higher click-through rate, it can infer that the site is more valuable and that its metadata or content is a better match for certain queries. So the next time a similar query is made, that page may rank higher.

These ranking changes are always in flux, though. While the page with a 17% click-through rate may rank higher on a Tuesday, if it can’t maintain that higher click-through rate over time, its rank will slip the following Wednesday.
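To illustrate the idea, and only the idea, since RankBrain’s internals are not public, here is a toy Python sketch in which a result’s score gets nudged by how far its observed click-through rate sits above or below the CTR expected at its position. The baseline CTRs, the weight, and the blending formula are all made up for this example.

```python
# Illustrative toy model only; not Google's actual ranking math.
# A result's score is nudged by how its observed click-through rate
# compares to the CTR expected at its position.

# Assumed baseline CTRs by position (invented numbers).
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10}

def behavior_adjusted_score(base_score, position, observed_ctr, weight=0.5):
    """Blend a relevance score with a user-behavior signal: results that
    out-perform the CTR expected at their position get a boost."""
    expected = EXPECTED_CTR.get(position, 0.05)
    ctr_lift = (observed_ctr - expected) / expected  # 0.17 vs 0.10 -> +0.7
    return base_score * (1 + weight * ctr_lift)

# Three results with similar relevance; one earns 17% CTR in position 3.
results = [
    ("site-a", 1, 0.30),  # meets expectations
    ("site-b", 2, 0.15),  # meets expectations
    ("site-c", 3, 0.17),  # beats the ~10% expected at position 3
]
ranked = sorted(results,
                key=lambda r: behavior_adjusted_score(1.0, r[1], r[2]),
                reverse=True)
print([name for name, _, _ in ranked])  # site-c overtakes the others
```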

Other Behaviors Considered for Ranking

Besides click-through rate, other behaviors have been identified as potential ranking factors. This list includes dwell time, bounce rate, and on-site click behavior.

A website could have the most enticing, attractive metadata and garner high click-through rates, but if it can’t keep users on the site and they return to the SERPs, Google can infer that a poor experience drove them away. Conversely, if someone engages with your site, it indicates a high degree of relevance between the searcher’s query and your website.

There’s a bit of a gray area here, though, and it’s potentially one of the reasons RankBrain is not the top ranking signal. Not every search needs high levels of engagement to indicate it was successful. There are many times I’m searching for a quick fact, only to find it and leave the site within seconds. Or, even when I find an answer, I may check multiple sites to confirm that the first thing I read is correct. This is the internet, after all.
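Here’s a hedged sketch of how those signals might be rolled into a single engagement score while accounting for that gray area. The signal names, weights, and the quick-answer carve-out are all assumptions for illustration, not anything Google has confirmed.

```python
# Hypothetical sketch: combine behavior signals into one engagement score.
# A short visit is only treated as a bad sign when the query wasn't a
# quick-fact lookup. All weights and thresholds are invented.

def engagement_score(dwell_seconds, bounced, quick_answer_query):
    """Score a visit from 0 (poor) to 1 (strong)."""
    if quick_answer_query:
        # For "what is X" style queries, a fast exit can still mean
        # success, so short dwell time isn't punished.
        return 0.7 if bounced else 0.9
    dwell = min(dwell_seconds / 120, 1.0)      # saturate at two minutes
    bounce_penalty = 0.4 if bounced else 0.0   # returning to the SERP hurts
    return max(dwell - bounce_penalty, 0.0)

print(engagement_score(8, bounced=True, quick_answer_query=True))     # 0.7
print(engagement_score(8, bounced=True, quick_answer_query=False))    # 0.0
print(engagement_score(150, bounced=False, quick_answer_query=False)) # 1.0
```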

Other Ways Google is Using Artificial Intelligence

In another display of its use of massive amounts of data, Google is integrating search behavior across multiple platforms. If you use Google Now, you may find it a useful tool for answering questions and showing you pertinent information based on upcoming events or relevant attractions nearby.

Besides this functionality, Google is also adding information to the cards it displays. In the first post on AI, we mentioned that the Google Now team is part of the Search team. In the screenshot below, you can see how these two datasets come together.

[Screenshot: Google Now card showing a price drop on a previously viewed book]

In a previous search, this user looked for a particular book. He didn’t purchase it (understanding of site behavior), but Google was able to ‘remember’ the interaction and present the book at a later date, when the price was lower (understanding of product information).

This is interesting for two reasons. Google is merging two previously separate datasets (user behavior and pricing information), and it is attempting to understand the intent behind the original search. Did he skip the book because it was too expensive, or for some other reason? If it turns out price wasn’t the issue, Google can potentially learn from this and serve better results not only to this user but to anyone else who searches for a book on this topic. If a lower price still isn’t enough to get someone to buy, maybe it’s time to show a different book to begin with!
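As a purely speculative sketch, the card logic might look something like the following. Google hasn’t published how Google Now decides to re-surface a product, so the condition, the 15% threshold, and every name below are invented.

```python
# Speculative sketch of the price-drop card decision described above;
# all names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class ViewedProduct:
    title: str
    price_when_viewed: float
    purchased: bool

def should_resurface(product: ViewedProduct, current_price: float,
                     min_drop: float = 0.15) -> bool:
    """Show a price-drop card if the user viewed but didn't buy, and the
    price has since fallen by at least `min_drop` (15% by default)."""
    if product.purchased:
        return False
    drop = (product.price_when_viewed - current_price) / product.price_when_viewed
    return drop >= min_drop

book = ViewedProduct("example-book", price_when_viewed=14.99, purchased=False)
print(should_resurface(book, current_price=10.49))  # True: ~30% cheaper
```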

What’s Next for Google and Artificial Intelligence

Google’s use of artificial intelligence will only grow, and its ability to influence rankings will continue to increase. How powerful will RankBrain become? Only time will tell, but at least for now, some engineers at Google think it’s funny to joke that they don’t even know what it is doing. Well, hopefully that’s a joke.

Regardless, you can take advantage of this shift if you haven’t already. RankBrain and artificial intelligence are only going to position well-crafted, relevant sites higher in the SERPs. In the next part of this series, we’ll cover what you can do to become that relevant site and how to prepare for the ever-growing expanse of unique, first-time long-tail queries.