Webmasters and SEO specialists continue to struggle with slow web page indexing, or its complete absence. The issue has been raised for several years, and each time Google maintains that the problem lies not with the search engine but with the quality of the site.
Google's John Mueller has repeatedly said that if a site owner constantly has to use the URL inspection tool to submit crawl requests, this may point to shortcomings in how the resource operates. Rather than relying on the tool regularly, it is better to identify and fix the underlying issues that cause indexing problems. Even so, the problem remains relevant for many webmasters, according to SearchEngines.
Responding to a question on Twitter about indexing new content, Mueller said that a large number of webmasters and SEO specialists produce terrible content that is not even worth indexing.
“Just because it exists doesn’t mean it’s useful to users,” a Google spokesperson said.
Does this mean that the fact Google takes the time to index content is itself a sign the content isn't terrible?
Obviously, it should work the other way around: webmasters create the highest-quality content their ability and skills allow, and Google then evaluates and indexes it. If the content (or part of it) is not indexed, it simply does not meet Google's quality thresholds for indexing.
NIX Solutions adds that many webmasters felt calling all non-indexed content terrible was an overstatement. They also recalled Mueller's recent advice to those who constantly write to him asking for SEO tips:
“Maybe you should stop reading SEO blogs and instead do something useful for your site and its users?”