Anyone with even a passing interest in SEO or digital marketing will be only too well aware of the current drive to focus on content rather than links.

While it’s a welcome if long overdue change, any resulting gains could be short-lived if we fail to take this opportunity to examine old practices and think about what prompted the shift.

The ends are the same, the means have changed

What strikes me about much of the current debate is that Google’s desire to see quality content is often framed, implicitly or explicitly, as a change: they’re de-emphasising backlinks in favour of something else, namely high-quality content, as the basis of ranking. This is only half right.

Google (and other search engines) were always interested in high-quality content. Backlinks were only ever a proxy for measuring that quality. As many explanations of the PageRank model put it, links can be likened to votes: they’re an indication that the linked page is of value. More links, and particularly more links from pages that are themselves deemed valuable, suggest greater value.
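To make the vote analogy concrete, here is a minimal sketch of the kind of iterative calculation PageRank describes. The toy graph, damping factor and iteration count are illustrative assumptions for the example, not anything Google actually uses:

    # Minimal PageRank-style sketch: links as votes, iterated until
    # scores settle. Purely illustrative, not Google's implementation.

    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {page: 1.0 / n for page in pages}

        for _ in range(iterations):
            new_rank = {page: (1.0 - damping) / n for page in pages}
            for page, outlinks in links.items():
                if not outlinks:
                    # A page with no outgoing links spreads its vote evenly.
                    share = rank[page] / n
                    for p in pages:
                        new_rank[p] += damping * share
                else:
                    # Each outgoing link carries an equal share of the page's vote.
                    share = rank[page] / len(outlinks)
                    for target in outlinks:
                        new_rank[target] += damping * share
            rank = new_rank
        return rank

    # Three pages: A and B both link to C, C links back to A.
    graph = {"A": ["C"], "B": ["C"], "C": ["A"]}
    print(pagerank(graph))

Run against the toy graph, C, which collects votes from both of the other pages, ends up with the highest score; B, which nobody links to, ends up with the lowest.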

The now all-too-obvious problem with that model is the ease with which it can be exploited: as with rigging an election, you simply go out and find ways to manufacture votes.

So there is in fact no real change; Google are just reaching the point where their original model is no longer fit for purpose, and the technology is now available to provide much more sophisticated – and much less easily manipulated – measures of quality.

Pedantic? Maybe. But getting that straight is, I think, important if we’re to avoid ways of thinking and working that have failed the industry before.

Quantifying quality content

This line of thinking invites us to consider how else we’ll determine quality. Links were easy to score: more links were better, and more from authoritative, well-ranked sites better still.

Things are now significantly less clear, but the SEO industry and its customers will, by force of habit more than real necessity, want to find a way to quantify quality. They’ll want to know precisely what measurements indicate content that will succeed in natural search. How long should a piece of content be? How often should new content be posted? What Flesch-Kincaid reading level should it target?

I’m already seeing questions like this in forums, and in a certain sense they’re valid. It is important for us to understand how search algorithms arrive at their decisions. But equally, such questions can betray a failure to understand what really happened to links, and that failure will lead inexorably to a repeat of past mistakes.
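To illustrate just how blunt such measurements are, here is a rough sketch of the standard Flesch-Kincaid grade-level formula. The syllable counter is a crude heuristic I’ve assumed purely for the example, not part of any published tool:

    # Rough sketch of the Flesch-Kincaid grade-level formula.
    # The syllable count is a crude vowel-group heuristic for illustration.
    import re

    def count_syllables(word):
        # Approximate syllables as runs of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text):
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return (0.39 * (len(words) / len(sentences))
                + 11.8 * (syllables / len(words))
                - 15.59)

    print(flesch_kincaid_grade("The cat sat on the mat. It was happy."))

A score this easy to compute is, by the same token, easy to write for, which is exactly the trap that made links such a poor proxy in the first place.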