Chasing 100% Unique Content…or else?

If you’ve spent any time at all looking into content creation for your business, or delving into how SEO works, or exploring the services offered by different providers, you’ll have bumped into a number of concepts and phrases that repeat. Some ideas recur frequently and in different contexts, whether or not they’re clearly explained, and whether or not they have any strictly proven basis in reality. Many of those ideas can be supported by evidence; others are taken on faith. And some lack rigorous empirical support but are nonetheless true.

One of those ideas you’ve likely come across is that all content on your site, including blog posts, should be 100% unique. Putting aside any complaints that “100% unique” is a redundant phrase, let’s discuss the true and the false about this idea: both why it’s a good idea and why it isn’t necessarily as big a deal as you might think.

First, let’s look at the idea of content uniqueness. Virtually every piece of writing on this subject will recommend that your content always be unique. The price for repeating content, many of these sources will say, is a penalty in search ranking (and, in some cases, blocking). Penalties can happen, but in practice they’re not a major problem.

Most of that caution, however, is about literal duplicate pages: it’s not an issue for the kind of writing we do, which is the continuous creation of new and timely content. That said, if a client uses the same page repeatedly (on multiple sites, as a mobile-ready or printable page, etc.), those duplicate pages can have some effect.

A bigger risk might be that keywords are repeated too often or in awkward, artificial ways, because the writing isn’t very good. There is no magic number or formula for how often to repeat a keyword or phrase (“keyword density”) on a specific page. SEO experts know this—in part because Google has stated it. Best practice, as far as we’re concerned, is to work each critical keyword in two to three times. Does that mean that using a keyword only once is a failure? Absolutely not. It also doesn’t mean that using a keyword five or six times in an article is a problem, as long as it’s effective and meaningful and used in proper context (in other words: it’s well written). Will those repetitions do anything to improve the ranking? Probably not.

Regarding duplication of content from “scraping” (when others copy your pages): You shouldn’t worry too much about others stealing your content. For one thing, the kind of updated content that most businesses should be putting on their sites isn’t often attractive to content thieves.

Will it get stolen anyway? Absolutely, to some extent. Will that theft matter? Not really. For one thing, Google’s algorithms (and those of other search engines) recognize later copies of pages as copies, and rank them lower accordingly. Beyond that, your content—if it’s produced by a focused operation like Waltham WordWorks—should be specific enough to your business, in both overall coverage and unique details, that it will be of little value to anyone else without significant rewriting. If someone has to put that level of effort into stolen content to get any value from reposting it, chances are they won’t bother. It’s a simple question of return on investment, and that equation doesn’t work out.

Let’s also look at the problem of “100%.” We’ve had clients who were very concerned about this. Is each post 100% unique? Is there any duplication from a previous post on a similar issue—perhaps from three months ago? Are keywords used with excessive caution, to prevent any appearance of duplicate content?

This approach is, to put it kindly, impractical. Strictly speaking, no two pages in the same language will be 100% unique: some words will always be duplicates. In the SEO analysis sense, we’re talking less about strict repetition and more about the re-use of keywords and the appearance of repeated (key or non-key) phrases. When it comes to this type of content, it’s not easy to even define what 100% means. For example, if two posts share keywords, then at the very least those strings can be expected to appear in the text: 100% has already been taken off the table. A better approach is not to focus on keyword duplication (from post to post) but keep attention where it belongs: on the actual writing.

When a writer knows what she’s doing, it’s not difficult to create content with very little duplication. Other than the keys themselves, there might be no phrase that two articles on the same topic have in common. Using synonyms and related ideas—as well as an entirely different approach—the same ideas can often be covered well with little or no overlap.

In fact, it’s not unusual in our writing process for an author to submit a piece with, for example, more than one opening paragraph, with a note for the editor or account manager to choose one. We even had a situation some time back where the same current news topic was requested by seven different accounts over a three-week period. Since we were writing for clients in different regions with different local news sources, it was easy enough to create each article independently. Some had similar structures and there was keyword overlap, but none had duplicate text or duplicate external links.

Now, let’s look at the worries of what happens when your content isn’t unique—and whether or not they’re justified.

The concern is that duplicate content leads to ranking penalties and even, in extreme cases, to a site being blocked from search results. Should you spend time worrying about this?

In a word: No. The concerns here are usually overblown. Should you be copying and pasting content—even your own? No. But if you already have a solid strategy for new content creation, and if the material you post is high quality, then there is little to worry about. If you are continually adding content, and each page of that content is written from the ground up by competent writers, it won’t cause any problems. At Waltham WordWorks, for instance, our writers have standing instructions to review previous posts on client sites so that they can avoid taking a too-similar approach when writing about recurring topics.

It’s worth noting that the kind of “duplicate” content most likely to cause trouble is an entire page being duplicated. With our service, the chance of that ever happening is zero (although we can’t prevent a client from duplicating a page if they choose).

This article by Andy Crestodina on Neil Patel’s blog covers some of these points well. We don’t know Andy or Neil personally, but Neil—and his guests—have a long history of providing good insights into topics like this.

When you need quality content for your business website, get in touch with Waltham WordWorks. We’ll never let you down.
