For all of the negative attention black (and gray) hat search engine optimization (SEO) practitioners get for being scam artists, sleazy salespeople, and peddlers of risky two-bit wares and services (which, admittedly, they are), one nice thing can be said for those who make their living tricking Google’s search ranking algorithm: they have to be pretty clever to keep discovering and exploiting loopholes in such a complex and secretive system in the first place.

The trouble for black hat SEOs, however, is that the engineers at Google are cleverer still. And thanks to advances like machine learning (techniques that allow software to improve on its own from data) and semantic search (the ability of a search engine to interpret what a query means from its context rather than simply matching keywords), search engines will only get better at detecting and demoting sites that lean on black hat SEO tactics for their online visibility.


Google Is Always Improving Its Ability to Combat Webspam

Google updates its algorithm hundreds, perhaps even thousands, of times a year. The speed with which Google can detect and target webspam practices as they arise is only likely to increase, so the work performed by black hat SEOs will grow ever more difficult, and ever more risky, as time goes on.

At a 1999 conference, Google co-founder Sergey Brin famously remarked that he didn’t believe the then up-and-coming search engine needed to worry about things like webspam. He was very quickly proven wrong.

The methods that black hat SEOs found to undercut the search engine’s algorithm were at times ingenious. And those methods evolved right along with the search engine itself, continually exposing weaknesses in the algorithm that Google could then repair in a sustained effort to reward sites whose content provided a positive experience for search engine users.

Today, as we recently covered on our blog, there are very few tactics attorneys can employ to artificially inflate their websites’ rankings on search engine results pages (SERPs). The black hat practices that still work are unlikely to work for long, and once Google finds a way to shut them down, the stress, frustration, and effort required to recover a website’s organic rankings will rarely justify having dabbled in those practices in the first place (and we’re not the only ones who feel this way, either).

But the question remains: how will black hat practitioners manage to game the system in the future? The most probable answer is that, at some point, they will simply have to quit trying to cheat Google’s algorithm if they want to stay relevant in the world of SEO. Failing that, their best option will probably be to start thinking about another line of work sooner rather than later. Here’s why:


Black Hat SEO in the Short Term

No doubt black hat SEOs will continue their cycle of scrambling to recover and then refining their tactics after each successive Google update. At least for the time being.

Though nobody can actually predict the future, it’s not hard to imagine link-trading schemes growing increasingly complex (as they have since Google’s inception) until the search engine no longer needs to use backlinks as a measure of a website’s authority.

Purchases of fake reviews may see a temporary uptick in popularity, and negative SEO (aiming black hat tactics at a competitor’s site in the hope of getting it deranked on SERPs), though certainly unethical and potentially illegal, might become more common once conventional black hat tactics stop producing even short-term results.

But, as Google continues to improve its indexing and search capabilities, even these tactics are likely to fall by the wayside. The more Google’s algorithm learns and self-corrects with the help of AI, the more efficient it will become at detecting and demoting black hat SEO practices while simultaneously rewarding sites that stick to white hat tactics.


Artificial Intelligence (AI) and Deep Learning

Though not yet a perfect process (one SEO expert says that “AI can behave brilliantly at some times while behaving more like a drunk toddler on a sugar high at other times”), AI is going to have major SEO implications and will likely change SEO practices as we know them. And if Google CEO Sundar Pichai’s enthusiasm for AI is any indication, many of those changes are likely on the not-so-distant horizon.

In 2015, Google added RankBrain to its algorithm to help it better understand searchers’ intentions from the linguistic context clues in their queries. That AI system is still working today to make the exact keywords and phrases used in website content less and less decisive, and Google has said that RankBrain is the third most important piece of its algorithm, behind the quality of a page’s content and the external links pointing to it from elsewhere on the web.

But Google has employed deep learning in many other ways as well: building a system modeled on the human brain that learned to identify images of cats, creating a program (DeepMind’s AlphaGo) that taught itself to become perhaps the strongest player of the board game Go in history, and training a digital assistant so thoroughly in the patterns and nuances of human speech that it tricked actual people into thinking it was human. Google also uses AI to learn which videos users are most likely to watch, keeping viewers glued to YouTube as long as possible in order to maximize video ad revenue.

Today, researchers are working to make AI capable of “imagining” future scenarios based on a current set of circumstances, including predicting actions by human beings in popular TV shows. Thus, it’s not hard to imagine a system that could one day predict how black hat SEOs might react to an algorithmic change, thereby enabling that system to close potential loopholes before they can ever be exploited.


Natural Language Processing and Semantic Search

When Google rolled out its Hummingbird update in 2013, the company’s then search chief, Amit Singhal, described it as perhaps the most dramatic rewrite of Google’s code he’d seen since he began working at the company in 2001.

Hummingbird laid the foundation for the search engine’s efforts to better understand conversational language. It also leaned heavily on the Knowledge Graph (introduced the year before), which answers specific questions for users without requiring them to click a link to get the information they are looking for. The search engine became so confident in these results that it even began bolding the answer to a user’s question rather than the keywords they’d used in the search, as in the example below.

[Image: the knowledge graph panel that appears for the search “how many gold medals does michael phelps have”]

Thanks to deep learning techniques like RankBrain, Google is only getting better at interpreting a searcher’s intent (semantic search). And though it may not automatically know exactly what a user is looking for based on the specific terms used in a search, the search engine can accurately predict within a fairly narrow range what kind of information a user hopes to find.

A search for the Vietnamese rice noodle called “pho” — which is also a word used as a shorthand for soup that uses these noodles — offers places to eat the Vietnamese dish near where the search was conducted, as well as a definition of the word itself. Those results are followed by a list of links primarily covering top-rated local pho restaurants, along with a carousel of videos a little further down the page about how to eat and make the popular soup.

[Image: search results for the word “pho,” offering restaurants and definitions in knowledge graph panels]

SEOs and website owners were aggravated that the addition of the knowledge graph reduced the need for users to scan a list of blue links to find the information they were looking for, hurting click-through rates even for top-ranked organic results on certain queries. Users, however, were delighted by a search experience that was instantly more efficient and easier to use.

As Google gets better at understanding conversational language and the semantics of specific searches, the use of specific keywords and search phrases will continue to lose SEO value, thus rendering black hat tactics like keyword stuffing entirely obsolete.
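The contrast between exact keyword matching and meaning-aware matching can be sketched with a toy comparison. This is purely illustrative: Google’s actual models are learned language systems far more sophisticated than this, and the synonym table and scoring below are invented for the example.

```python
# Toy contrast between exact keyword matching and a crude "semantic"
# match that treats known synonyms as equivalent. Purely illustrative --
# real semantic search uses learned language models, not lookup tables.

SYNONYMS = {
    "attorney": "lawyer",
    "lawyer": "lawyer",
    "dui": "dui",
    "drunk-driving": "dui",
}

def normalize(word):
    """Map a word onto its canonical synonym-group label (hypothetical table)."""
    w = word.lower().strip(".,?")
    return SYNONYMS.get(w, w)

def keyword_score(query, page_text):
    """Exact-match scoring: counts query words that appear verbatim on the page."""
    page = {w.lower().strip(".,?") for w in page_text.split()}
    return sum(1 for w in query.lower().split() if w in page)

def semantic_score(query, page_text):
    """Synonym-aware scoring: counts query words whose *meaning* appears."""
    page = {normalize(w) for w in page_text.split()}
    return sum(1 for w in query.split() if normalize(w) in page)

page = "Our lawyer handles drunk-driving cases across Arizona."
query = "DUI attorney Arizona"

print(keyword_score(query, page))   # exact matching finds only "Arizona" -> 1
print(semantic_score(query, page))  # meaning-aware matching finds all three -> 3
```

A page that never uses the literal words “DUI attorney” still satisfies the query under the meaning-aware scorer, which is why stuffing exact keywords into content buys less and less.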


Voice Search and the Internet of Things

Imagine the following interaction with a hypothetical digital assistant:

User: OK Google, what is the penalty for a first-time DUI in Arizona?

Google: According to the Jane Johnson DUI Law Firm, the penalty for a first-time DUI offense in Arizona includes a minimum 24-hour jail sentence, with a maximum of 6 months in jail, as well as between $250 and $2500 in fines.

User: Is it possible to avoid jail time entirely?

Google: Though the minimum penalty for DUI actually mandates ten days in jail, nine of those may be suspended, but it is not possible to avoid jail time entirely for a first-time DUI conviction.

User: Cast the top ten DUI lawyers in Arizona to my TV.

Google: Three of the top-ten DUI lawyers in the state practice in the city where you live. Would you like me to limit results to those lawyers?

User: Yes…actually, can you just schedule consultations with all three of them?

Google: Would you like me to add those appointments to your calendar once they are scheduled?

User: Yes, please…now, can you place my usual pizza order?

Google: Your delivery will arrive in 30-45 minutes and your legal consultations have all been scheduled for next week.

Though this is not yet reality, the constant improvement of search engine language capabilities and the increasing interconnectivity of devices in homes (as well as advancements in wearable tech) means a future in which almost everything we use on a daily basis — from televisions, to thermostats, to toasters — can be connected via the internet and controlled almost completely with the help of a digital assistant.

The steady increase in the use of digital assistants and voice search points toward an SEO culture in which screens are not even necessary for many searches, meaning that organic real estate will become increasingly scarce and competitive.

It will therefore be even more important for attorneys to ensure that the content on their law firm websites is optimized to provide value to users, and not just value, but the best value in its market. After all, some branding in a voice search result is better than no impression on a user at all.
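One concrete way site owners help knowledge graphs and digital assistants surface their content is schema.org structured data embedded as JSON-LD. A minimal sketch of what that markup might look like for a firm, generated in Python (the firm details below are the invented example from the dialogue above, and the exact properties a site should use depend on its content):

```python
import json

# Build a minimal schema.org "Attorney" JSON-LD block that a law firm
# could embed in a page. All details are illustrative examples only.
markup = {
    "@context": "https://schema.org",
    "@type": "Attorney",
    "name": "Jane Johnson DUI Law Firm",
    "areaServed": "Arizona",
    "knowsAbout": "DUI defense",
    "telephone": "+1-555-0100",
}

# Print the <script> wrapper a page would include in its HTML.
print('<script type="application/ld+json">')
print(json.dumps(markup, indent=2))
print("</script>")
```

Structured data like this doesn’t game the algorithm; it simply makes a page’s facts machine-readable, which is exactly the kind of white hat optimization a voice-first search world rewards.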

As Google and its users come to rely more and more on knowledge graphs and single-answer results, it will become increasingly difficult for black hat SEOs to game their way to that top organic spot. Attorneys, then, will need to focus ever more on the quality of their law firm websites’ content, rather than on whether to engage in black hat tactics at all.



Google’s capabilities with regard to battling black hat SEO are constantly improving. And, thanks to AI and other technological advances, its detection is becoming faster and more precise, and will only continue to improve as time goes on.

Thus, it is only a matter of time before Google’s algorithm leaves black hat practitioners in the dust. The only remaining question is whether your law firm website will stay on the right side of Google’s updates by keeping within its webmaster guidelines, or be left behind as well for disregarding or subverting them.