To protect the quality of its search results, Google demotes low-quality sites in the rankings. However, what counts as a “low-quality site” is not entirely clear. Webmasters often wonder which site Google will deem poor quality sooner: one with many technical problems, or one with content and usability problems.
According to Google spokesperson John Mueller, “Site quality issues are things that users find problematic.”
Can users notice a bad rel=canonical tag that causes an alternate URL to be indexed? Hardly. But poor navigation, a large number of links leading nowhere, meaningless text, and a pile of dead pages naturally cause irritation. Just because a website is technically flawless doesn’t mean it’s a useful and relevant search result for users.
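For illustration, a misconfigured rel=canonical of the kind mentioned above might look like the following sketch (the URLs are hypothetical, not from the source):

```html
<!-- On https://example.com/product?color=red -->

<!-- Intended: point the parameterized duplicate at the main product page -->
<link rel="canonical" href="https://example.com/product">

<!-- Misconfigured: the canonical points at the homepage instead,
     hinting to Google that the homepage, not this product page,
     is the version to index -->
<link rel="canonical" href="https://example.com/">
```

A visitor browsing the page sees nothing wrong in either case; only crawlers act on the hint, which is why such technical errors rarely register as a “quality” problem for users.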
Mueller pointed out that it’s common to see both quality and technical issues on the same site, but that doesn’t mean they’re equal, and Google understands that.
“Both technical issues and quality issues are extremely broad areas with a lot of overlap (e.g. is site speed a technical issue or a site quality issue?). If the site has egregious problems in one of the areas, then the ideal settings in another area will not have any effect.”
In general, every site is assessed individually, and there is no single quality indicator, notes SearchEngines. Google evaluates sites comprehensively, and it can take months to determine their relevance and quality.
NIXSolutions notes that at one of last year’s meetings with webmasters, Mueller said that site quality is not quantifiable. He also said he had repeatedly raised the idea of adding a quality score to Search Console, but that this would be very difficult:
“Google could create such a metric, but it won’t be the same “metric” that is actually used in search, and Google would not like to mislead site owners.”