Search Engine Optimization is Like Rolling The Dice
Are you willing to roll the dice with your site's success?

If you could measure, in dollars, my level of totally-over-it regarding search engine optimization, I’d be like Jeff Bezos meets Voltron.

I’d be so rich it’d be beyond ridiculous. I’d buy every acre of rain forest in the world and turn it all into a park. Then I’d pay off everyone’s student loans, buy everyone a house, and build a fleet of personal spaceships that anyone could use.

That’s how rich I’d be. Because that’s how sick I am of hearing about search engine optimization.

So let’s get it out in the open – SEO is a scam. It’s pushed by “experts” with dubious training and experience, has never been shown to help in the long run, and is always squarely in Google’s sights as it continually refines its search algorithms.

Instead of chasing some goofy scheme that might work for the next three months, you should start focusing on things that are actually important – value, quality, and compliance with web standards.

Have you been taken in by the scam? Is there a way to fix it? Come along and we’ll work it all out.

Google vs The World

About five years ago, Google started following a really disruptive path.

Instead of focusing on core search, they started doing things like integrating ads and search results so cleanly that it’s hard to tell the difference between the two. They started adding content “snippets” – excerpts of content pulled from non-Google sites – that discourage people from leaving Google to visit other sites. Worst of all, they began the overall trend of “curating” content from non-Google sources into search results and then slapping monetization on it.

The search results for “search engine optimization” include a “snippet” of content that users can read without ever leaving Google. The content does not belong to Google, it’s pulled from a third party website that ends up getting no visitors.

It’s a big change from how Google started out. And it’s kind of a bummer. It increasingly lets Google bolster their fortune and empire by standing on the back of other people’s content.

When you type a question into a search engine, the search engine should be really good at decoding your question, understanding the context, and then pointing you to a solution that best matches your question’s intent.

Increasingly, that’s not what Google does. Instead, Google decodes your question, understands the context, and then gives you the answer directly. Except, it doesn’t come up with the answer, it just grabs the appropriate content from someone else’s site and shows it to you. You don’t even have to visit that site in order to see it.

That’s wack.

Google is becoming more of a pay-to-play version of a web portal than a search engine.

But, tough luck, Charlie Brown. That’s life.

Until some new group of grad students comes forward with their own venture-capital-backed, search-ecosystem-disrupting platform, Google is what we have.

Google’s traffic – and its ability to send traffic to your site – completely destroys every other search engine combined. If you ranked #1 on Bing, Yahoo, and DuckDuckGo – but didn’t show up on Google – you’d get minuscule amounts of traffic compared to what Google can send you.

This is the core issue driving the entire “industry” of search engine optimization.

Because ranking well on Google can send so much traffic to a website, everyone wants to rank well on Google. Accordingly, there’s an entire industry of people hawking products and services to site owners hoping to rank better in Google.

How SEO (claims it) Works & Why That’s Dumb

Search engine optimization is based on an exceedingly simple idea – figure out what Google thinks is important and then do that. A lot.

After all, if you can crack the code Google uses to decide which sites rank high and which rank low then you can get all of your sites ranked high for whatever keywords you choose. Which means money. Lots and lots of money.

But there are some issues with this.

First, nobody knows what Google likes except for Google. Moz and Backlinko have done an outstanding job of pulling lots and lots of data and then using that data to reverse engineer some insights into Google’s algorithms. The quality of their work is very high. Moz has even taken things a step further and harnessed the wisdom of the crowd (that’s a pretty good article right there about crowd wisdom, give it a look) to further refine their estimations of how Google number-crunches its way to displaying search results.

So yes, we know some of the things Google likes. Some things we know because people have experimented. We know other things because the people at Google have explicitly made some information public. But what we don’t know – and never will – is how the algorithms actually work. The full list of ranking factors – and how those factors are weighted against one another, which probably matters more than the factors themselves – is a proprietary secret that’s never been made public.

I’d bet every last penny I have that no comprehensive list of all of Google’s various ranking factors is in existence anywhere outside of Google.

Shall We Bake A Cake?

Pretend you want to bake a cake. Here, this is for you: a list of the ingredients a great cake contains – and nothing else.

Assuming this is the only information you have, how do you think your cake is going to turn out? Spoiler alert: your cake is going to be terrible.

Your cake will be terrible because you don’t actually have any information about how to bake a cake. All you have is a list of ingredients that a great cake contains.

All of the context – how to combine the ingredients, the order in which things should be added – is missing. On a deeper level, there is no subtlety in this raw ingredient list. Adding flour a little bit at a time is likely to work better than dumping it all in at once. Beating the egg whites slightly will produce a better texture than beating them a lot or not beating them at all. But you don’t know any of that. All you know is what things a cake contains.

That’s what lists of ranking factors are like.

Let’s make this cake example even more fun.

Pretend you do have a full recipe – ingredients, context, actual instructions, everything. Oh, except, the recipe is from the 1400s and it calls for suet (kidney fat), chopped up mutton, elderberries, sheep milk, and it has to be baked over an oak wood fire.

Assuming you could actually gather all of the required ingredients and follow all of the required steps (do you have an oak-burning cake pit in your yard?), the cake you end up with is going to be gross. Not because you did anything wrong, just because meaty, sheepy, smoke flavored cake is icky.

It’s icky because we don’t have the same culinary palate as people from the 1400s. It’s weird to us to think about a dessert item having a meaty, smoky flavor. In the context of our modern sensibilities that’s kind of gross.

Search engine optimization is exactly like baking a cake. Wherever you look online, you only have two choices – you either have a modern list of accurate ingredients without any type of context or you have full instructions for assembly of an outdated product.

You can’t use either of those things to build a successful strategy.

You Can’t Optimize Without Context

There are some very good lists of ranking factors (ingredients) available online. Moz has a great list, which I’ve included above. But it’s probably not going to help you, because it lacks any kind of context.

To understand the importance of context, let’s talk about the big, data-driven posts that pop up from time to time.

I’ve seen in-depth articles where people will round up millions of blog posts, put them into a sophisticated tool like Ahrefs, and then write about the various correlations they discover.

Here’s one example from The Hook Agency (an SEO company):

A graph from The Hook Agency showing the correlation between average content length and SERP positioning

This graph shows us that longer articles tend to rank higher on Google. The Hook Agency has done their homework on this issue and they discuss it quite thoroughly – referencing external data, comparing it to examples they know are successful, and even developing a custom formula to try and estimate how long blog posts should be.

This content length issue is a good example because it gets a lot of dedicated discussion in SEO forums and online marketing blogs. The problem, of course, is the old adage that correlation does not mean causation. There’s no context in which we can understand the raw data.

Thinking about our cake example, I’d suggest that graphs like this are even worse than having an ingredient list with no context. In fact, graphs like this aren’t even talking about ingredients. What they’re doing is taking a slice of a popular, finished cake, biting into it, and saying it has a spongey texture.

Well, being told that popular cakes have a spongey texture gives you zero information about how to achieve that texture, or whether the texture is the cause of the popularity or just the result of some other thing that’s making the cake great. At best, graphs like this just tell you that spongey texture appears to be a characteristic of popular cakes.

What if you took some kind of industrial solvent used to make plastic spongey and dumped that into your cake? You’d end up with a spongey cake that is both terrible and possibly poisonous. Yum.

The Right Way to Think About Ranking Factors

Even though “spongey texture” may be a characteristic of good cakes, achieving that texture for its own sake – or by any means necessary – is obviously not a worthwhile goal.

This is the problem with even great lists of accurate ranking factors.

Think about that “how long should my posts be” graph:

Is it true that top ranking posts tend to be fairly long, on average? Yes. If you accept the validity of the data and the methods used to gather it, then that statement is factually correct.

Does that mean you should arbitrarily try to make all of your articles really long? No. Even though that’s precisely what the post from which I excerpted the graph suggests you do, it’s a bad idea.

To understand why it’s a bad idea, stop and think about why the top ranking articles tend to be a certain length. Is it because the authors sat down with a word count in mind and randomly plugged away until they met that goal?

Of course not.

Top-ranking articles tend to be long because answering the question a reader types into a search bar takes a certain amount of depth and discussion to do correctly. And that is something Google cares a lot about.

In fact, while the average length of posts on this site is about 2,000 words, some of our top-ranking pages come from our Smarter in 60 Seconds series and are much shorter. Why? Because they do a good job satisfying user intent – they answer the questions people are asking Google.

To take things to a whole new level of nerdy detail, I could also point out that the baseline statistical methods used to produce graphs like “how long should my posts be” are inherently flawed.

Averages are only a fair summary when the contributing values are distributed somewhat symmetrically, because they are highly sensitive to the influence of outliers. Just a few very long or very short pieces of content can skew the average so much that it gives an incorrect representation of the actual data. Graphs like the one we’re discussing should probably be built using the median instead, which is far less sensitive to aberrant data points.
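To make the point concrete, here’s a minimal Python sketch using made-up word counts (hypothetical numbers, not data from any real study or from the graph above). A couple of outlier mega-posts drag the mean way up, while the median stays close to what a typical top-ranking post actually looks like:

```python
# Hypothetical word counts for ten top-ranking posts: most are moderate,
# two are extreme "ultimate guide" outliers.
from statistics import mean, median

word_counts = [900, 1100, 1200, 1300, 1400, 1500, 1600, 1700, 9000, 12000]

print(f"mean:   {mean(word_counts):.0f} words")    # 3170 - pulled up by the two outliers
print(f"median: {median(word_counts):.0f} words")  # 1450 - much closer to a typical post
```

If you built a “how long should my posts be” target from the mean of a sample like that, you’d be chasing a number that almost none of the actual top-ranking posts hit.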

Bottom Line: Whenever you see a list of ranking factors, you should look beyond the letter-of-the-law and think about the spirit of the idea being conveyed. You have to infer the context in which the ranking factors make sense.

In the example we’re using, it’s not really article length that matters, but completeness. Length is just a side effect of writing a complete article that satisfies user intent.

“Optimization” Gimmicks Don’t Work…For Long

When all of the various lists of ranking factors get passed around to the internet at large, people (and agencies) start popping out of the woodwork, eager to apply them to the process of getting your site to rank better – or GET YOUR SITE TO RANK #1!!!!!

And honestly, a lot of times it works. The methods that search engine optimization firms use to get your pages to pop in the search results are effective.

Until they’re not.

To illustrate, let’s look at a graph from a case study we did of a blog called Shoutmeloud. Shoutmeloud is a good site: it’s very successful (to the tune of $40k per month) and it doesn’t use gimmicky methods to boost its organic rankings.

Shoutmeloud traffic and organic keyword metrics over more than three years, showing bumps that coincide with updates to Google’s search algorithm

Do you see the three bumps in the blue graph? The first one happens in the middle of 2016, the second one towards the end of 2016, and the third in the middle of 2017. You can read our full article for more discussion, but those bumps happened because of updates to Google’s search algorithm.

Read More: The Shoutmeloud Success Story – The Path to 500k Visitors & $40,000 Every Month

Shoutmeloud didn’t do anything differently during these times, but the updated Google algorithm judged the existing content to be worthy of more search impressions and higher rankings.

A traffic graph from an “optimized” site showing a huge traffic fall after a Google algorithm update put the smack down on some of the spammy techniques the site was using to rank higher in the SERPs.

On the other hand, this graph shows the strong and sudden decrease in search traffic experienced by another site following a similar update to the Google ranking algorithm.

In a twist I find particularly piquant, I pulled this graph from a search engine optimization blog and it was posted in an article that describes how the site in question used a whole new set of “let’s try to trick Google” techniques to recover the lost traffic. Amazing.

This is a pattern that repeats over and over again in the search engine optimization game.

Because there actually are individual metrics that Google tends to weight heavily, those metrics can be exploited for temporary gains. But the story always ends the same – the algorithm changes and the search rankings crash. It’s a never-ending cycle of “optimize” (i.e., play tricks), crash, optimize, crash.

One great example of an exploitable metric is the backlink. It’s true that Google likes sites with lots of backlinks (links from other sites). In order to exploit this, it used to be common practice in the search engine optimization community to build something called a Private Blog Network.

Basically, people would go to places like Blogger or Weebly and create a large number of free blogs. They would put a handful of low-quality scraped or machine-generated posts on each blog and include lots of links back to the site they were trying to rank. As an added bonus, they would keyword-stuff the anchor text of these links so it precisely matched the words for which they were trying to rank.

And it worked pretty well. The target site got a good boost in the search results for the target word(s).

Until it didn’t. Pro Tip: Google isn’t dumb.

They figured out what was going on. Then all the little violins came out and started playing sad songs because “My business was destroyed overnight by the evil Google.”

Naturally, the community moved on and stopped this crazy, spammy practice, right?

Nope.

It’s still prevalent because Google still highly values backlinks. The only thing that’s changed is that the process of building the networks has gotten more sophisticated.

Instead of using blogs created on free blogging sites, the “optimizers” are actually registering domains, hosting them with legitimate hosting companies, writing decent quality articles to populate them, and then cloaking the outgoing links – they organize the link structure in a tiered system that’s one or two steps removed from the site they’re actually trying to rank.

A lot of times, when you see ads offering links on “high domain authority sites” for sale, the sites in question are members of networks like this – usually sitting in the first (T1) or second (T2) tier of the network.

The scheme is more complicated, but the idea is exactly the same.

For now, this practice still works and is fairly widely used. Pat Flynn – a pretty respectable guy – actually talked openly about it with his guest during one episode of the Smart Passive Income Podcast.

Because this process closely mimics real backlinks from real websites, it’s harder to detect than the old scheme. But it’s still a scheme, and Google still isn’t dumb. It won’t last forever.

At some point this tactic will fail, the violins will come out, and then the scheming will begin anew.

Meanwhile, the algorithm – as a whole – continues to evolve beyond reliance on easily exploitable metrics and towards a system that’s increasingly sophisticated at teasing out subtle indicators of quality.

Search Optimization That Actually Matters

It’s easy to say you shouldn’t care about any of these ranking factors but I know it’s hard not to. I spend a lot of time doing outreach and building backlinks just like anyone else. (Feel like linking to us? Drop me an email). But I stay away from schemes.

In the search engine optimization world, optimizing techniques are routinely classified as either black hat, gray hat, or white hat.

Black hat techniques are straight-up cheating – using fake entry pages, showing different versions of a website to Google and to real visitors, stealing links, etc. This is the internet equivalent of just plain lying about how much money you made last year when you fill out your taxes.

Gray hat techniques are talked about like they’re different than black hat strategies – somehow more legit – but they’re really not. They use the same types of strategies, they’re just less blatant and in-your-face about it. Building a big fake network of disposable blog sites to boost the rankings of one target site? Black hat. Building a big fake network of non-disposable blog sites that are carefully curated and walled off from one another to boost the rankings of one target site? Gray hat.

If black hat techniques are like lying about your income on the 1040, then gray hat techniques are like claiming a $500 donation from that one blank Goodwill receipt you have. Did you really donate $500 worth of stuff? Who knows. There’s no way to really prove it either way and the letter of the law allows you a $500 donation.

Almost all the time, when people talk about search engine optimization, they’re talking about gray hat techniques (with a little black hat thrown in behind the scenes).

The third category – white hat – isn’t really search engine optimization at all, it’s just a collection of best practices for properly building and maintaining a web site. It’s like filling out your taxes correctly and honestly and being super precise about how you do it.

A search engine optimization type article about this would have a headline like “Boost Your Chances of Successfully Winning A Tax Audit by 500% With This One Simple Trick!” and then the article itself would tell you to be honest when reporting your income.

Is that a strategy? No, it’s just the way things are meant to be done. Clearly there’s a difference between purposely doing an unnatural amount of things Google likes and just doing things the way they’re supposed to be done.

Doing things right isn’t search engine optimization, it’s just building a good site.

Good sites are properly coded, well constructed, and use the various structural elements of the web environment correctly. To make sure your site matches this profile, you absolutely should pay attention to core SEO recommendations.

But you should also understand that they’re not really SEO tactics – they’re just the normal, courteous, community standards of the civilized internet. Like keeping your grass cut and your yard free of garbage.

  • Correctly use your title (your H1 tag) so that it reflects the main purpose and intent of your content. Clever titles are tempting (I love them) but should be avoided. Make your titles clear and purpose-driven. (There’s a small audit sketch after this list showing one way to check a few of these basics.)
  • Correctly use header tags so they reflect the main points you’re trying to convey.
  • Make good use of images, graphs, tables, and videos. An SEO-minded person would say that these things tend to keep people on your pages longer, which Google likes. That’s true, but it misses the point that these elements ultimately create a fuller, higher quality user experience, which is why Google cares.
  • Build your site to look good and perform well on mobile devices. SEO people point out that Google will penalize your site if you don’t. True, but once again this misses the real point – lots of people browse with mobile devices. If your site looks bad on mobile, then it’s of lower value than a similar site that works well on mobile. Quality is what Google wants.
  • Choose a good host that can serve pages quickly. Yeah, I get it, good hosting is expensive compared to $2/mo budget hosting. But waiting for pages to load is a frustrating waste of time and everybody hates it. Google hates it, too, because it creates a lower quality experience. Would you rather pay $20/mo and rank well or $2/mo and never get any organic traffic?
  • Use ALT tags and give your images appropriate file names. Again, this is just using the various elements of HTML according to their design specifications. These elements are present in the codebase for a reason and sites that use the elements correctly are sending an indicator that they care about community standards and are investing in creating a quality experience for users.
  • Make sure your articles contain vetted, high-quality links to both other pages on your domain and to pages on third party domains. Content on the web was never designed to be presented as a series of silos. The web was conceived to be an interconnected series of content pieces that were woven together to create a rich, textural experience that readers could experience to whatever level of depth they wanted to pursue. Your goal should be to create a deep and engaging experience and linking to related content is a way to create that experience.

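None of this requires fancy tools. As a purely illustrative sketch – not an official checklist or a product recommendation – here’s a small Python script that checks a page for a few of the basics above. It assumes the third-party requests and beautifulsoup4 packages are installed, and the URL it fetches is just a placeholder.

```python
# A rough audit sketch, assuming the third-party `requests` and
# `beautifulsoup4` packages are installed. The URL below is a placeholder.
import requests
from bs4 import BeautifulSoup

def audit_page(url: str) -> None:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # A clear, purpose-driven title and exactly one H1 per page.
    title = soup.title.string.strip() if soup.title and soup.title.string else "(missing)"
    print(f"<title>: {title}")
    print(f"H1 count: {len(soup.find_all('h1'))} (expected exactly 1)")

    # Header tags that outline the main points.
    print(f"H2/H3 count: {len(soup.find_all(['h2', 'h3']))}")

    # Images should carry descriptive alt text.
    images = soup.find_all("img")
    missing_alt = [img for img in images if not img.get("alt")]
    print(f"Images missing alt text: {len(missing_alt)} of {len(images)}")

    # Internal and outbound links woven into the content.
    print(f"Links on page: {len(soup.find_all('a', href=True))}")

if __name__ == "__main__":
    audit_page("https://example.com/some-post")  # placeholder URL
```

If a page fails checks like these, that’s a signal about basic construction quality – not a secret lever for gaming any particular ranking factor.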
If you want to check out a pretty comprehensive list of these types of factors, Backlinko has a good one.

At the end of the day, the benchmark you’re trying to achieve is quality. That’s ultimately what Google cares about. They’re getting better and better at training their algorithms to recognize quality – despite the various tricks that people try to pull.

By ignoring the prevailing SEO gimmicks of the day and focusing instead on building a quality user experience, you’ll future-proof your website and grow a resource that Google views with increasing authority and trust.

Remember, friends don’t let friends SEO.
