NIXSolutions: How to Solve the Problem with Duplicate Content on the Site

Google's John Mueller recently left an interesting comment on reducing duplicate content across a site. The remark was picked up by Barry Schwartz of Search Engine Roundtable.


Mueller noted that if a site has crawling problems, fixing individual pages will not help much. The goal should be solutions that reduce duplication by a factor of 10. In other words, do not focus on individual pages; work on a site-wide basis.

According to SearchEngines, duplicate content is a technical, site-wide issue rather than something that can be resolved through isolated, page-level actions.

For example, if a site contains 100 thousand products and each of them is reachable at 50 URLs, then cutting the total from 5 million URLs to 500 thousand (5 URLs per product) is well worth the effort and will have a positive effect on crawling. Fewer duplicates also make it easier for Google to aggregate ranking signals on the main product or category page, which helps avoid the wrong pages being shown in the SERP.
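As a rough illustration of how such duplicates arise and can be collapsed, here is a minimal sketch in Python. The domain, paths, and parameter names are hypothetical, and the set of tracking parameters is an assumption you would replace with whatever your own analytics setup actually appends:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical tracking parameters that spawn duplicate URLs;
# adjust this set to match your own analytics configuration.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "ref"}

def canonicalize(url: str) -> str:
    """Strip tracking parameters and normalize the path so that all
    variants of a product URL collapse to one canonical form."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    kept = sorted(
        (k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS
    )
    return urlunsplit((scheme, netloc, path.rstrip("/") or "/", urlencode(kept), ""))

variants = [
    "https://example.com/product/42?utm_source=newsletter",
    "https://example.com/product/42/?ref=homepage",
    "https://example.com/product/42?sessionid=abc123",
]
# All three variants map to a single canonical URL.
print({canonicalize(u) for u in variants})
# {'https://example.com/product/42'}
```

Applied site-wide, a normalization step like this is what turns 50 crawlable variants per product into a handful.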

Therefore, if your site has a duplicate content problem, it is worth talking to your developers about how to ensure the canonical URL is the one served to both users and Google. It may also be worth considering tracking methods that do not create duplicate URLs, notes NIXSolutions, and checking whether any bugs are generating unnecessary duplicates.
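One common way to declare the canonical URL to Google is a rel="canonical" link in the page markup. The sketch below assumes a Flask application (the framework choice is ours, not something the article specifies), with a hypothetical example.com domain:

```python
from flask import Flask, render_template_string

app = Flask(__name__)

# Minimal page template: the rel="canonical" link tells Google which
# of the many parameterized variants is the primary URL.
PAGE = """<!doctype html>
<html>
<head><link rel="canonical" href="{{ canonical }}"></head>
<body><h1>Product {{ pid }}</h1></body>
</html>"""

@app.route("/product/<int:product_id>")
def product(product_id: int):
    # However the request arrived (tracking parameters, session IDs, etc.),
    # the canonical link always points at the clean, parameter-free URL.
    canonical = f"https://example.com/product/{product_id}"
    return render_template_string(PAGE, canonical=canonical, pid=product_id)
```

Because the canonical link is identical across all URL variants of a product, Google can consolidate their signals onto the one page you actually want ranked.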