Algorithm updates are a constant in the SEO world. Google releases smaller updates daily and several large-scale “broad core updates” each year.
In the years of Penguin and Panda, Google gave SEOs and webmasters hints as to why they may have seen a drastic drop in traffic following a broad core update.
These days, however, Google isn’t as helpful. Although the Google Search Liaison Twitter account was created in 2017 to interact with the search community, it rarely shares action-oriented information.
The March 2019 and June 2019 Core Updates were no different.
As such, we’re left to focus on the core principles of SEO:
- Create great content that appeals to users and gets shared/linked to
- Make sure your website is fast, well structured, and can be efficiently crawled
Our Experience with Recent Algorithm Updates
Many winners and losers were reported following the recent algorithm updates, with much of the focus on health and publishing sites. Among Blue Moon Digital’s clients, only one brand saw notable declines in organic presence: not a health site, not a publisher, but a retailer. From mid-March to early April, we saw a 40% decline in Page 1 Keywords.
What’s Wrong with this Site?
Following the declines, we reviewed the major SEO issues we believed the retailer was facing. In broad strokes, we could break it down into two simple problems:
- The site had a shortage of reader-friendly content, including style guides, editorial articles, fashion advice, and company news.
- From a previous technical audit, we knew the brand’s CMS was creating a massive number of duplicate pages. A significant development backlog had caused the fix to fall out of the queue.
Content creation was a long-term plan the brand wasn’t ready to take on (much to our chagrin). So, we were left to rein in the brand’s duplicate content issue.
What is Duplicate Content and Why Does it Matter? AKA What the Heck is Crawl Budget?
Google’s crawlers spend a limited amount of time crawling each domain. The time and resources allotted to each domain can be described as “crawl budget.” When a CMS creates thousands of near-identical pages that serve no value to users or search engines, those pages are duplicate content. Google may end up crawling the duplicate pages (in addition to user-facing pages), effectively wasting crawl budget on useless pages when it could be spent on valuable ones (products, categories, blog posts). Wasted crawl budget can negatively affect a domain’s overall organic value and cause traffic/revenue declines.
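To make this concrete, here is a hypothetical illustration (these are not our client’s actual URLs) of how a CMS can spin a single product page into many crawlable duplicates via URL parameters:

```
# One product, four crawlable URLs, all serving identical content
https://example-retailer.com/dresses/blue-midi-dress
https://example-retailer.com/dresses/blue-midi-dress?color=blue
https://example-retailer.com/dresses/blue-midi-dress?sort=price&view=grid
https://example-retailer.com/dresses/blue-midi-dress?sessionid=8f3a2c
```

Multiply that across every product and every parameter combination, and Googlebot can burn most of its budget on pages that should never be indexed in the first place.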
More on crawl budget for the curious:
- https://www.contentkingapp.com/academy/crawl-budget/
- https://webmasters.googleblog.com/2017/01/what-crawl-budget-means-for-googlebot.html
How We Optimized Crawl Budget and How We Know It Worked
To keep these duplicated pages from wasting valuable crawl budget, we leveraged the noindex and nofollow directives within the robots.txt file. For good measure, we also adjusted the URL Parameters settings in the old (original) version of Google Search Console.
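As a rough sketch (with hypothetical URL patterns, not the client’s actual rules), the robots.txt entries looked something like this. Keep in mind that Noindex and Nofollow were never officially documented robots.txt directives, though Googlebot honored them at the time (more on that at the end of this post):

```
User-agent: Googlebot
# Keep parameter-generated duplicates out of the index
Noindex: /*?sessionid=
Noindex: /*?sort=
# And keep their links from passing signals
Nofollow: /*?sessionid=
Nofollow: /*?sort=
```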
After these changes, the number of indexed pages steadily declined (as planned). Over a span of several weeks, we decreased the number of indexed pages by 41%.
However, it’d be another few weeks before we hit pay dirt.
Cue the June 2019 Core Update.
Beginning in early June, we saw our organic rankings skyrocket. MoM Page 1 Rankings rose 87%.
You might be saying to yourself…
Everyone knows that Google’s algorithm considers over 200 ranking factors. How do you know that optimizing crawl budget is what actually worked?
Our retail brand has two sister brands that aren’t active clients. All three brands run on the same CMS and serve similar niches, and all three were equally affected by the March 2019 Core Update. Yet our brand was the only one to take action on its duplicate content, and it was the only site to see the bounce-back in June.
Conclusion
Comparative ranking data suggests that duplicate content was a major cause of our retail client’s significant organic declines following the March 2019 Core Update. Our crawl budget optimization work was rewarded during the June 2019 Core Update. While subject matter authority may have been a factor in other niches (health, publishing), our work suggests that optimizing crawl budget can show positive results in the wake of algorithm changes.
Bonus: News About Robots.txt Noindex/Nofollow Directives
In early July, Google announced that, beginning in September 2019, it would no longer support several useful features of the robots.txt file, including the noindex and nofollow directives. To be fair, Google never officially supported these features in the past, so the announcement isn’t a big surprise. Instead of calling out these directives in the robots.txt file, SEOs and webmasters will have to add noindex/nofollow robots meta tags to page template source code.
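For reference, the page-level equivalent is a robots meta tag in each template’s head section. A minimal sketch:

```
<head>
  <!-- Page-level replacement for the retired robots.txt directives -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

The trade-off: unlike a single robots.txt rule, the tag has to be output on every affected page, so template-level changes (and development resources) are required.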
Are you putting SEO on the shelf? Let us help you navigate the ever-changing world of SEO, SEM, and constant updates to Google algorithms by dropping us a line.