Search engines are notoriously hard to analyze or report on. Market share statistics vary implausibly, and there are fierce disagreements about the merits of privacy-oriented engines like DuckDuckGo. Despite the claims of digital marketing agencies, nobody outside Google and Microsoft fully knows how search engines order their results. There is even greater secrecy surrounding the crawler bots that interrogate live websites, studying their contents before they are ranked against common search terms.
We know a lot about the Google algorithm, which lists web pages in descending order of relevance across search engine results pages, but we don’t know everything. Market leader Google is reticent about the intricacies of its algorithm because marketing agencies would instantly look for cheats and workarounds; the shark-infested waters of black hat SEO agencies are testament to that. Algorithm changes are designed to improve the user experience, closing loopholes to ensure more accurate and relevant results are returned.
Our knowledge of how the Google algorithm works is based on a mixture of official announcements, expert observations and educated guesswork. The algorithm is tweaked around 600 times a year, including a couple of major annual revisions which often receive distinguishing names like Possum or Penguin. The most recent major update took place in mid-August, downgrading sites with deceptive or aggressive advertising. Simultaneously, it boosted original content providers and mobile-friendly redesigns of existing platforms.
Every revision is designed to optimize the accuracy of ranking results. In early March, the Google algorithm was revised to penalize websites found to be in violation of Google’s guidelines for webmasters. This was aimed squarely at low-grade blogs and websites intended to generate advertising revenue, rather than provide a valuable experience to audiences. And this is a key fact about modern search engines – they know when companies are writing for algorithms rather than consumers. The days of keyword stuffing and affiliate link pages are long over, and any pages containing these elements should be edited immediately.
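Google doesn’t publish a keyword-stuffing threshold, but the signal itself is easy to picture: how large a share of a page’s words is a single target keyword? The Python sketch below computes that density; the function name, sample text and 5% cut-off are all illustrative assumptions, not figures Google has confirmed.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of all words in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A deliberately stuffed example page.
page = ("Cheap hosting deals. Cheap hosting offers cheap hosting "
        "for everyone who wants cheap hosting today.")

density = keyword_density(page, "hosting")
# A density far above a few percent is a classic stuffing signal;
# the 5% threshold here is purely illustrative.
if density > 0.05:
    print(f"Possible keyword stuffing: {density:.0%}")
```

Writing naturally for readers keeps densities low without any counting; a check like this is only useful as a quick sanity test on legacy pages.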
As its machine learning algorithms become more sophisticated, Google has been able to evaluate sites more effectively. On top of accurately gauging readability, it can crawl pages in near real time, so changes are reflected in results almost instantly.
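Google has never disclosed how it gauges readability, but classic public formulas such as the Flesch reading-ease score give a flavor of how software can estimate it from sentence and word lengths. The sketch below uses a crude vowel-group syllable counter and is purely illustrative, not Google’s method:

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count runs of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch reading ease: higher scores mean easier text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

print(flesch_reading_ease("The cat sat on the mat. It was happy."))
```

Short sentences and short words push the score up; dense jargon pushes it down, which is roughly the behavior you want from any automated readability check.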
Key Factors Evaluated by the Google Algorithm
- How long the site has been live.
- How many unique visits it receives.
- The length of time spent on each page by visitors.
- The frequency of site updates.
- Volumes of inbound links, and the overall quality of the third-party websites providing them.
- Volumes of keywords and long-tail phrases used on each page.
- The site’s location relative to people searching for it.
- The site’s domain authority and top-level domain.
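Nobody outside Google knows how these factors are weighted, so the sketch below is purely hypothetical: every factor name, weight and normalization choice is invented to illustrate how a weighted score could combine signals like those above.

```python
# Hypothetical ranking sketch: the factor names mirror the list above,
# but the weights and 0..1 normalization are invented for illustration.
WEIGHTS = {
    "site_age": 0.10,
    "unique_visits": 0.15,
    "time_on_page": 0.15,
    "update_frequency": 0.10,
    "inbound_link_quality": 0.25,
    "keyword_relevance": 0.15,
    "local_proximity": 0.05,
    "domain_authority": 0.05,
}

def rank_score(factors: dict) -> float:
    """Weighted sum of factor scores, each clamped to the 0..1 range."""
    return sum(WEIGHTS[name] * min(max(value, 0.0), 1.0)
               for name, value in factors.items())

site = {
    "site_age": 0.8, "unique_visits": 0.6, "time_on_page": 0.7,
    "update_frequency": 0.5, "inbound_link_quality": 0.9,
    "keyword_relevance": 0.6, "local_proximity": 0.3,
    "domain_authority": 0.7,
}
print(f"Hypothetical score: {rank_score(site):.2f}")
```

In reality Google evaluates hundreds of signals with machine-learned weights, so a fixed linear model like this is a toy at best, but it shows why improving one strong factor (such as inbound link quality) can outweigh several weak ones.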
Although some attributes of the Google algorithm remain closely guarded secrets, others are widely publicized. Webmasters were given a five-month warning before interstitials were treated as homepage content, giving them time to remove annoying pop-ups that damage the mobile experience. Indeed, mobile optimization has been crucial since Google announced that desktop-oriented sites wouldn’t rank as highly as responsive templates, which automatically adjust to each device’s screen size.
Finally, there are plenty of useful resources that detail Google revisions and updates. The Google Webmaster Blog often highlights major alterations, and suggests which sites might be downgraded by new revisions. Search Engine Land and Moz also carry regular updates, with analysis of the impact of potential revisions. And as this article makes clear, Midphase knows a thing or two about search engine algorithms, too. Bookmark our Blog page for news of future Google updates and how they might affect your own site…
This article was brought to you by Midphase. For shared hosting, cloud servers and 24/7 support, visit www.midphase.com