Q: I have an eCommerce site and want to know the best way to handle user-generated & manufacturer-required duplicate content that is found across different URLs.
A: If content is found in exactly the same form on several URLs across the web, Google tends to penalize it. eCommerce websites and some travel sites face this problem most often. So how do we deal with product descriptions, which many manufacturers require to remain identical across the websites selling their products?
Here are three ways to include such content while minimizing the risk of a penalty.
1. You can create unique content by publishing your own test results, comparing products, or adding specifications the manufacturer missed. An editorial review of the product, or reviews from satisfied customers, can also work well to differentiate the page.
At times, you don’t need that much unique content to get out of a duplicate content issue, so I suggest you don’t go overboard with this. Some SEOs hire cheap, low-quality writers just to fight the issue, but if the content isn’t useful, it won’t serve the purpose.
2. Suppose you have some unique content, but the page also carries a large amount of duplicate content that is useful to the user and that you want to keep on the same page. In that case, I recommend using iframes to keep the duplicate content out of the search engine’s index, or at least to avoid associating it with that particular URL.
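The iframe approach above can be sketched as follows. This is a minimal illustration, not a guaranteed recipe: the `/embed/` path and the `sku` parameter are hypothetical names, and you would adapt them to your own site.

```html
<!-- Product page: the unique content (reviews, comparisons, specs) lives
     directly in this page's HTML. The manufacturer's boilerplate description
     is loaded in an iframe from a separate URL, so crawlers do not treat it
     as part of this product URL's own content. -->
<iframe src="/embed/manufacturer-description?sku=12345"
        title="Manufacturer product description"
        width="100%" height="400"></iframe>
```

To keep the embedded description itself out of the index, the `/embed/` path could then be blocked for crawlers, for example with a `Disallow: /embed/` rule in robots.txt.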
3. My last suggestion is to apply visualization, aggregation, or modification to the duplicate content to create something unique, valuable, and able to rank well. Movie review sites such as Rotten Tomatoes aggregate review data, snippets, and quotes from different places on the web. They end up with a lot of duplicate content, but it comes from many different sites, and the pages are useful because they add their own elements, such as the Rotten Tomatoes rating or an editorial review. Google recognizes that value and wants to keep those pages in its index.
Finally, if you are working through a large amount of content, start with the pages that are most important. You can identify them by going through the list of popular items in your database or by looking at what people search for most often. Also, if you are adding a large amount of content to these pages all at once, keep the duplicative pages out of the index until the modification process is done. When the content addition is complete, make them indexable again.
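The "keep them out of the index until you are done" step can be handled with a robots meta tag. A sketch, assuming your platform lets you edit the page head; this is not tied to any specific CMS:

```html
<!-- Placed in the <head> of a page that is still mostly duplicate content.
     "noindex" asks search engines not to index the page; "follow" still lets
     them crawl its links. Remove this tag once the rewrite is finished so
     the page can be indexed with its new unique content. -->
<meta name="robots" content="noindex, follow">
```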
I hope this answers your question.