Expert publishing blog opinions are solely those of the blogger and not necessarily endorsed by DBW.
…Your content is the problem.
Content is what drives the Internet. And relevant content is what drives search results. Having superbly written, deep, descriptive, and most importantly, relevant content is the key to creating a thriving web presence.
Duplicate content, however, will shoot all of your high-flying search engine optimization (SEO) efforts right out of the sky.
Okay, that was a bad metaphor, but as I write this I’m stuck on a much smaller plane than what I usually like to fly on.
What do I mean by duplicate content? Let’s go back to the two sides to SEO that I outlined in last week’s post: the mechanicals (the things you can control on your website) and the content envelope (all the content that surrounds your site, which you have less control over), and let’s focus a little bit on the content envelope side of things.
First, a definition (courtesy of Google) of what duplicate content is:
Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar (my emphasis).
Think about that for a second. It doesn’t need to be an exact replica—it just needs to be close enough that Google tags it as also existing somewhere else on the web. If there are two pieces of content that Google deems either the same or close enough, and if they’re both the right relevant search result for Google to serve in the search engine results pages (SERPs), which one does it serve?
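Google doesn't publish its exact criteria for "appreciably similar," but the basic idea — that two blocks of text can count as duplicates without matching word for word — can be illustrated with a simple similarity ratio. This is a minimal sketch using Python's standard `difflib`; the book descriptions and the 0.8 threshold are invented for illustration, not Google's actual method:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0-to-1 similarity ratio between two blocks of text."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Two hypothetical book descriptions: lightly reworded, not identical.
original = "A sweeping historical novel about a lighthouse keeper in 1920s Maine."
reworded = "A sweeping historical novel about a Maine lighthouse keeper in the 1920s."

score = similarity(original, reworded)
print(f"similarity: {score:.2f}")

# A made-up cutoff: text this close would likely read as "appreciably
# similar" even though it isn't an exact copy.
THRESHOLD = 0.8
print("near-duplicate" if score >= THRESHOLD else "distinct")
```

The point of the sketch: a small shuffle of words barely moves the score, which is why lightly tweaking a description doesn't make it "unique" content.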
Relevance, after all, is at the heart of what Google and the other search engines look for when serving up search results. If it serves both up, which does it put higher? The truth is, these are important questions that we as publishers, generating online content all the time, don’t necessarily have answers for.
And that’s just if there are two duplicates. Imagine if there are ten, fifty or even a hundred pages with almost exactly the same text. Someone runs a search and Google finds the most relevant content, but finds it in 100 different places across the web. At this point, relevance has been taken out of the equation altogether. To figure out what it will serve in the search results, Google is possibly going to weight some other factors higher than it normally would.
Other factors Google uses are links to the content, both from outside the site and within it, whether there’s a mobile responsive version of the page (and this is growing in importance – stay tuned), if there is structured data on the page, the page’s and site’s “authority,” how long the content has been indexed and many other criteria.
It doesn’t matter whether you understand what all of those things are (I’ll explain a lot of them in future posts). The point is that if you have a piece of content that’s replicated in many different places across the web (your site included), it really doesn’t matter how good or how relevant it is—because the chance of your version on your page on your site showing up higher than everyone else’s is mostly determined by factors far outside your control.
So what does this have to do with publishers in particular? Think about the web content surrounding your books and authors. Chances are you create book descriptions at a certain point in the production process, which serve in that role for the lifecycle of each title they’re associated with. When a book is released, its description is probably part of the feed—including metadata, cover art, pricing information and so on—that goes out everywhere. Your book descriptions are the same on the big sites that sell your books, like Amazon, Barnes & Noble, and Books-a-Million, as they are on the smaller ones—as they are, too, on your authors’ sites and probably 100 other sites that sell or talk about your books, like Goodreads or Jellybooks.
As they are—last but not least—on your site.
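One consequence of that feed-driven distribution is that exact copies are easy to detect mechanically. As a hedged sketch (the descriptions and the normalization rule here are my own invention, not any retailer's actual pipeline), hashing a normalized version of each description shows how identical text on a hundred pages collapses to a single fingerprint:

```python
import hashlib

def content_fingerprint(text: str) -> str:
    """Hash a block of text after normalizing case and whitespace,
    so trivial formatting differences don't hide an exact duplicate."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical copies of one feed description as two retailer pages render it.
feed_copy = "A sweeping historical novel about a lighthouse keeper."
retailer_copy = "A  sweeping historical\nnovel about a lighthouse keeper."

print(content_fingerprint(feed_copy) == content_fingerprint(retailer_copy))
```

Every site that republishes the feed verbatim produces the same fingerprint, which is the machine's-eye view of why none of those pages stands out from the others.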
The same is probably true of your authors’ bios, especially if the authors wrote them themselves. Chances are they’re using the same bio everywhere they write, speak or present. And if your authors actively promote their books and spread their bios all over the place, there are more than likely duplicate content issues on that front as well.
And if all of this content is the same as what you have on your site—and if what will determine how high a specific page shows up in the search results is everything but the actual content (because it’s duplicated everywhere else)—is it any wonder your book pages and author pages aren’t showing up very well in search results? Your site would have to be superbly optimized even to stand a chance of doing so.
Imagine, instead, that you had a site full of unique, well-written, descriptive, deep and relevant content about your books and authors—book descriptions and author bios that weren’t available anywhere else. You wouldn’t be competing with all of the other sites that had the same content. You’d probably have a pretty good chance of showing up higher in the SERPs.
And now imagine having all of that unique and relevant content on a site that has all of the SEO mechanicals done right. It’s like sending a fully fueled, mechanically sound plane down the runway, one that you know to be engineered to operate at top condition before leaving the ground. Your only other concern, at that point, is flying above the radar in order (paradoxically, in this metaphor) to be found.