I’m always fascinated to read coverage of SEO from outsiders. We constantly talk about traffic, rankings, ramping up content production, building links, etc., and winning the battle for search visibility for our employers and clients. It seems like end users – the actual Google searchers – get little attention in most SEO spaces.
Current Affairs recently published an article called “How SEO is Gentrifying the Internet.” It’s some pretty harsh coverage of SEO as a practice, particularly around how good SEO can exclude what’s good for the audience.
I think this sort of criticism is good for the industry, and while I certainly don’t agree with all of it, I think the writer makes some points that SEOs, content strategists, and developers should hear.
I’m sure there are more capable defenders of SEO than I could ever be. My goal here isn’t to provide a comprehensive takedown of this article, or a full-fledged defense of SEO. This is just some of what came to my mind as I read the article.
Can Cats Eat Blueberries?
I recommend reading Slater’s article in its entirety, but if you really don’t feel like it, here’s the gist:
Slater starts by defining SEO as “tricks marketers use to make certain articles appear at the top of Google’s search results.” His thesis: SEO makes the internet worse. As marketers optimize their pages for rankings and traffic, they fail to optimize for quality.
Slater wanted to know if he could feed his cat blueberries. As most of us would, he entered “can cats eat blueberries?” into a Google search. The featured snippet was from Purina, a Nestlé brand whose pet food often gets recalled. He doesn’t trust the content here.
He skips the first result in favor of the second-ranking page. He reads the page and finds that it has some interesting facts about blueberries in general, but the content never actually answers the question: Can cats eat blueberries?
All of this content satisfied Google, but not the reader. It was optimized for words like blueberries and cats, but never actually provided an answer to the original question.
Slater then zooms out and points out that most topics have undergone what he calls “SEO bloat.” Searchers with simple questions that should yield simple answers have to comb through lengthy SEO-optimized pages to maybe find what they need. Think of recipe blogs that write novel-length stories about every recipe.
His ultimate conclusion is that the SEO arms race has made the internet less interesting and less usable. People who want to learn something have to sift through page after page of content that all essentially says the same thing.
As SEO marketers try to optimize their pages, they essentially play a game of copycat, lifting the structure and content of the top ranking pages and putting it into their own, with a little dash of something extra to one-up those competitors. There’s no creativity or original thinking. As we’ll discuss in the next section, that’s what Slater means by gentrification of the search results.
As Slater argues, this isn’t just an annoyance; the implications are far-reaching. Google’s algorithm is a black box. Google has dodged outrage about how it surfaces information because our political leadership and the general public don’t really understand how Google controls what we see. Google is effectively a “liege lord” of the internet and can dictate its terms to everyone else.
Slater then goes into a discussion of the possibility of breaking up Google via antitrust lawsuits. The EU has levied substantial fines against Google, though Google’s lobbying has mostly softened the regulators’ blows. Slater closes by imploring us to forget our self-interest and pressure our elected officials to lessen the power of Google as it homogenizes the internet.
Why This Criticism is Good for Anyone in SEO and Content
A full 88 percent of internet searches take place on Google. If your work falls somewhere under the SEO umbrella (content, UX, technical SEO, link outreach, etc.), your efforts directly impact what millions of people see when they search.
When we make the effort to have our content rank for a given topic, all of the people who turn to Google to find information for their lives must rely on the quality and accuracy of the information we publish.
There’s no doubt that some in SEO care more about churning out mediocre-at-best content for the sake of traffic than about serving readers. I’ve seen firsthand the wave of tools that have popped up claiming to give you the magic sauce for making your content more appealing to search engines, even though all they do is give you a recipe for copycat content, rather than a way to say something new and useful.
For all the talk about E-A-T, many SEOs are still looking for shortcuts. That likely leads millions of searchers every day to find content like Slater did – fluffy, useless, and even factually incorrect.
When SEOs see this kind of content ranking highly for valuable keywords and topics, they copy it, make incremental improvements on it, and hope it ranks higher.
This is the gentrification process Slater is referring to. SERPs are increasingly crowded out by homogenized content produced by companies trying to outdo each other by making incremental improvements on mediocre content.
Their success is the searchers’ loss, since each page on a topic more or less says the same thing. Just like with a gentrifying neighborhood, the content on a SERP loses its character, and exists only for the profit of a select few.
In the world of SEO, success is often measured in traffic, leads, and sales. There’s nothing inherently wrong with that – everyone has to put food on the table.
For all the metrics and results that drive SEOs, though, we rarely consider how our content makes the internet a better place. Are we contributing to the homogenization of Google’s results pages, or are we bringing something new and useful to the table?
Are we actually serving people like Slater who don’t care about traffic and rankings but genuinely want to find answers to their questions?
I think it’s good for all of us in the SEO and content spaces to hear these kinds of criticisms, if only because it can help us refocus on what matters. Copycat content might still get you results in the short-term, but there’s a massive opportunity for people who can produce content that’s actually helpful.
Douchebags Per Capita and the Future of SEO
“Since the 1990s, SEO marketing has been a lucrative pursuit for the world’s most scruple-free douchebags,” Slater writes.
Ouch. Harsh, and only partially warranted.
I’ve been around SEO long enough to know it has its fair share of douchebags, sketchballs, hucksters, and snake oil salesmen.
But what I’d like to know is, what’s the douchebags per capita in the SEO field compared to, say, op-ed writers? Journalists in general? Politicians, investment bankers, car mechanics, bakers, etc.? Is it really higher in SEO?
I know Slater was being hyperbolic. But the line stood out to me because that still seems to be SEO’s reputation, even as it continues to work its way into boardrooms and strategic roadmaps of some of the biggest companies in the world. That’s to say nothing of the countless small and midsized businesses that have made use of SEO as a growth strategy.
I can’t shake the feeling that Slater is largely picking on the SEO industry of the 1990s and 2000s. The shady forums where people traded hacks, the PBNs, the link farms. The fact that he directly calls out Neil Patel as a standard bearer for SEOs in general makes me wonder how deeply he dug into the industry.
Saying Neil Patel represents all of SEO is like saying Martin Shkreli represents everyone in finance.
It’s easy to pick out the most visible jerk in an industry and say he’s typical of the whole group. Are there plenty of mini-Patels doing SEO? Sure. Are there other people in finance with no conscience outside of their own selfish motives? Of course.
But as I’ve come to see, SEO is a big tent. Some of the world’s leading experts in fields such as natural language generation, UX, web design, accessibility, and content production are actively working on projects that, while they’re currently applied to SEO, can have far-ranging impacts beyond marketing.
The image of the creepy weirdo buying up spam links and stuffing pages with keywords to steal search traffic is just tired and played out. Slater’s personal frustration with the quality of Google’s results shouldn’t have prevented him from getting a broader look into the industry to see the good work being done.
Google – An Information Boutique or Convenience Store?
Obviously, Google itself doesn’t come out of this discussion looking clean.
Google is really good at surfacing basic facts and delivering them directly in the SERP. Want to know who the 33rd U.S. President was? How many wins the ‘86 Celtics had? What the temperature is in Presque Isle, Maine? Google’s got you.
Essentially, Google is great at making commodified information abundant.
If you want something more in-depth on a given topic, however, there’s no guarantee you’ll find what you’re looking for very easily in Google’s index. It goes back to the problem of some SEOs optimizing for marginal improvements in search rankings over creating content that’s useful for readers.
I know Google is actively working to solve these problems. But as Slater, among many others, has pointed out, the problems still persist despite years of innovation and resources spent on machine learning, user intent, E-A-T, personalized results, etc.
This is an interesting problem for Google. Never before in the history of humankind has information been so abundant – a condition largely due to Google’s work – and yet so little trusted. The concepts of abundance and scarcity are important here.
Alex Danco defines abundance as “the condition reached as the friction involved in consumption decisions approaches zero.”
“We’re entering this world where supply side scarcity is increasingly information, which is scarce only so long as it’s proprietary. (And information has a way of freeing itself with time.),” Danco writes. “Most of the interesting scarce resources are accumulating on the demand side: consumer preference, consumer intent, consumer loyalty, consumer impulse, consumer aggregation.”
I think it’s worth adding consumer trust to these scarce demand-side resources. Google has no problem making information abundant. Getting consumers to trust the information and find it useful is a whole new challenge. Clearly, it’s one that hasn’t been solved yet.
Google made information abundant by providing an irresistible incentive for organizations, businesses, and individuals to free it. That incentive is massive amounts of essentially free, ongoing traffic.
But as Danco points out, going from scarcity to abundance means that new forms of scarcity manifest themselves.
What’s scarce in an age of information abundance?
Expertise, authority, and trust. Not only that, but relevance to the user’s context – the intent behind their search. This is what Google is trying to solve for, with admittedly uneven results.
This makes me wonder about two possible futures for Google.
One is where Google can deliver on the second half of its promise to make the world’s information accessible and useful. It’s able to reliably serve users factually correct, useful information related to their queries, even if those queries are complex and super niche.
The other possible outcome is where Google ultimately cannot deliver, and it effectively becomes the information convenience store of the world. A place to get commodified information and basic facts, but not a place to go when you need something specific and useful to you in your particular context.
The latter would signal Google’s failure to escape what David Perell calls “the algorithmic trap.” Algorithms, like Google’s, are great at giving people what they like, but mostly fail to give people what they love.
The more high-quality information and expert access moves to what Venkatesh Rao calls the “cozyweb,” or private Slack/Discord channels, courses, Dropboxes, email lists, Substacks, and other walled-off groups, the more Google Search will essentially be a starting point for learning where else to go to actually find what you want.
It may well be that high-quality, relevant information doesn’t really want to be free. Maybe it never did. Instead, it wants a good lead gen program via open platforms like Google before it gives itself up. If that’s the case, searchers may need to be willing to pay up or join these walled-off digital spaces to get the answers and information they crave.
To get back to Slater’s main point, I think there’s a good case to be made that Google is gentrifying part of the internet – the surface layer. The stuff that is Googleable is, with some exceptions, mostly homogenized, watered down, and in some cases, borderline useless to a reader.
Smart content strategists, SEOs, editors, and writers are actively seeking ways to write for information gain, rather than trying to copy what ranks on page 1. They’re looking for what’s not being served up, and finding ways to give readers what they need beyond structured data and basic facts.
Just because Google’s SERPs are gentrifying doesn’t mean the whole internet is. Google, for all its reach, is still just one way to find content on the internet. It seems now that if you want to find the “real” neighborhoods on the internet, you’ll have to do a little more exploring than a simple Google search.