A few years back I had a really nice set of Amazon affiliate stores that used Amazon's XML datafeed. That was before Google cracked down on thin affiliates: sites built from the affiliate's content with no added value, AKA duplicate content or thin affiliate content. My very-easy-to-make millions of indexed Amazon affiliate product pages (created from duplicate content) went from pulling in 25,000+ unique visitors a day to a fraction of that traffic (probably about 500 visitors now, little of it from Google).
I went from making around $50,000 a year just from those Amazon affiliate stores to a hell of a lot less: I made just $117 from Amazon last month and $75 the month before. Taking into account AdSense income from the stores' pages, I'm probably making less than $1,500 a year from those pages.
The same is true of duplicated articles (article content is no different to product content as far as Google is concerned), so if all you do is copy articles from article directories, like every second webmaster trying to make easy money online, you are highly unlikely to do well in Google long term.
You can get away with some duplicate content, but use too much on a page and it will not rank well in Google no matter what you do. That's not to say duplicate content can't pull any Google traffic; you might still get some really easy SERPs, but in comparison to what you would get if the content were unique, it's a trickle of traffic. Like I said, 25K compared to 500 visitors a day, and that's from about a million indexed pages (Google will still index your duplicated content)!
My advice is to avoid using a significant amount of duplicated content on any page you consider important. I hate to give precise percentages since every page is different; relatively speaking, a small page can get away with more duplicate content than a large one. If a page has 500 words and 200 are duplicated, that's not that much duplicate content. In comparison, if a page has 3,000 words and 1,000 are duplicated (a lower percentage than the first example, roughly 33% versus 40%), that's a LOT of duplicate content for a page that size. Likewise, 200 duplicated words out of 3,000 isn't a lot; that could be considered a quote from another page, which isn't going to trip any duplicate content filters in Google.
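To put rough numbers on those examples, here's a minimal sketch in Python that works out the duplicated share of a page. This is just an illustration of the arithmetic above using the hypothetical word counts from this post; it's not anything Google runs, and real duplicate content filters will be far more sophisticated than a simple word ratio.

```python
# Rough illustration of the duplicate-content ratios discussed above.
# The word counts are the hypothetical examples from this post, not real data.

def duplicate_share(total_words: int, duplicated_words: int) -> float:
    """Return the duplicated portion of a page as a percentage."""
    return duplicated_words / total_words * 100

examples = [
    ("small page", 500, 200),       # 40% duplicated, but a small absolute amount
    ("large page", 3000, 1000),     # ~33% duplicated: a lower share, far more words
    ("quoted passage", 3000, 200),  # ~7% duplicated, unlikely to trip a filter
]

for label, total, duplicated in examples:
    print(f"{label}: {duplicate_share(total, duplicated):.0f}% of {total} words duplicated")
```

The point the numbers make is the one from my experience: on a big page, the absolute volume of duplicated text can matter as much as the percentage.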
In general I've found you can add duplicate content to a site without it negatively affecting your unique content, but if, for example, you're adding 10 duplicated articles for every 1 unique article, you are wasting the vast majority of your SEO benefit (from links) on pages that are going to be penalised. Basically, using duplicate content is a waste of resources.
I have a site I'm not really using right now, with a PR3 home page. It was a waste to have a site with PR and no content, so I added an automated tool that pulls articles from an article directory, meaning the site uses duplicate article content only.
Over 1,000 articles have been published, many covering relatively high-traffic niches. With a PR3 home page I didn't expect thousands of visitors a day, but if all this content were unique I'd expect to see at least 500 visitors a day, and probably 1,000.
Actual traffic is more like 10 visitors a day.
Avoid using duplicate content. I've tried all sorts of ways to get past the automated duplicate content penalties, but within a week or two of adding the duplicate content it's clearly been downgraded in Google's SERPs.