NIX Solutions: Google Says 60% of Content on the Web is Duplicated

If you are worried that you have a duplicate content problem, take some comfort in the fact that more than half of the sites on the Internet share it. According to Google, about 60% of the content on the web is duplicated.


This figure comes from a presentation by Google search engineer Gary Illyes at Google Search Central Live in Singapore. The corresponding slide was published on Twitter by a specialist who attended the event.

To reduce the share of duplicate content on the Internet and avoid falling into a so-called “cluster of duplicates”, Gary Illyes recommended that site owners (see the sketch after this list):

  1. consolidate duplicate protocols, favoring HTTPS over HTTP
  2. settle on either the www or the non-www version of the domain
  3. remove URLs with useless parameters
  4. pick one form for URLs with and without a trailing slash
  5. eliminate other duplicates detected via content checksums
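The first four points amount to URL canonicalization: collapsing every variant of a page's address onto a single form. The Python sketch below illustrates one way this could be done; the preference for the non-www host, the list of ignored parameters, and the trailing-slash rule are illustrative assumptions that depend on the specific site, not rules stated by Google.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical set of query parameters that do not change the content;
# the real list depends on the site (tracking tags, session IDs, etc.).
IGNORED_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "ref"}

def canonicalize(url: str) -> str:
    """Map common duplicate URL variants (http/https, www/non-www,
    useless parameters, trailing slash) onto one canonical form."""
    scheme, netloc, path, query, _fragment = urlsplit(url)

    # 1. Prefer HTTPS over HTTP.
    scheme = "https"

    # 2. Pick one host variant; the non-www form is chosen here arbitrarily.
    netloc = netloc.lower()
    if netloc.startswith("www."):
        netloc = netloc[4:]

    # 3. Drop query parameters that do not affect the content.
    kept = [(k, v) for k, v in parse_qsl(query, keep_blank_values=True)
            if k not in IGNORED_PARAMS]
    query = urlencode(kept)

    # 4. Normalize the trailing slash (keep "/" for the site root only).
    if path.endswith("/") and path != "/":
        path = path.rstrip("/")

    return urlunsplit((scheme, netloc, path or "/", query, ""))

print(canonicalize("http://www.example.com/page/?utm_source=x"))
# -> https://example.com/page
```

In practice, the canonical URL produced by such a rule would be enforced with redirects, rel="canonical" tags, and the sitemap, so that crawlers only ever see one variant of each page.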

Regarding point 5: Google generates checksums from the main content of pages, and if two pages produce the same checksum, Google treats them as duplicates, according to SearchEngines.
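The exact checksum algorithm Google uses is not public. The following minimal sketch just illustrates the general idea, using SHA-256 over lightly normalized main content and assuming the main content has already been extracted from each page.

```python
import hashlib
from collections import defaultdict

def content_checksum(main_content: str) -> str:
    """Checksum of a page's main content after light normalization
    (whitespace collapsed, lowercased), so near-identical copies collide."""
    normalized = " ".join(main_content.split()).lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def group_duplicates(pages: dict[str, str]) -> list[list[str]]:
    """Group URLs whose main content yields the same checksum.
    `pages` maps URL -> extracted main content (extraction not shown)."""
    clusters = defaultdict(list)
    for url, text in pages.items():
        clusters[content_checksum(text)].append(url)
    return [urls for urls in clusters.values() if len(urls) > 1]

pages = {
    "https://example.com/a": "Same article text.",
    "https://example.com/a?ref=promo": "Same  article text.",
    "https://example.com/b": "A different article.",
}
print(group_duplicates(pages))
# -> [['https://example.com/a', 'https://example.com/a?ref=promo']]
```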

In any case, Google has plenty to choose from when ranking responses to any query, and the 60% figure only illustrates how wide that choice is. This means webmasters need to create content that is more unique and useful than most of what is already on the web.

NIX Solutions reminds readers that Google recently added improved visual product search and dynamic filters for refining results to the desktop version of its search.