In the latest video in the #AskGooglebot series, Google's John Mueller answered several questions from webmasters at once, mainly covering basic aspects of search engine optimization.
Could blocking CSS files in robots.txt cause any problems?
Yes, it can, which is why it is advisable not to do this. Being able to see a page in its entirety allows Google to better understand what it is about and to evaluate whether it is mobile-friendly, reports SearchEngines.
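To illustrate, here is a hedged sketch of the kind of robots.txt rule that causes this problem, and a safer alternative. The `/assets/` path is a hypothetical example, not taken from any specific site:

```
# NOT recommended: this hides stylesheets and scripts from Googlebot,
# so Google cannot fully render the page.
User-agent: *
Disallow: /assets/

# Better: explicitly allow CSS and JS so pages can be rendered in full.
User-agent: *
Allow: /*.css$
Allow: /*.js$
```

If your CMS stores CSS in a directory that is otherwise blocked, an `Allow` rule for that directory (or for the `.css` extension, as above) lets Googlebot fetch it while the rest stays disallowed.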
How do I update the sitemap? Is there a step-by-step guide?
Check the documentation for your site's platform to learn how to configure your sitemap, or use a dedicated plugin. It is often enough to simply enable this feature, notes NIXSolutions.
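For reference, a minimal XML sitemap follows the sitemaps.org protocol; plugins generate this file for you, but a hand-written one looks like the sketch below (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Canonical address of the page -->
    <loc>https://example.com/sample-page</loc>
    <!-- Optional: date the page was last modified, W3C format -->
    <lastmod>2021-01-15</lastmod>
  </url>
</urlset>
```

Once the file is live, you can point Google to it via a `Sitemap:` line in robots.txt or by submitting it in Search Console.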
Can I reset indexing for a site?
No, there is no way to do that. However, if you have made changes to your site, search engines will automatically focus on the new version and drop the old one over time. You can help them along by setting up redirects from the old URLs to the new ones.
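The redirects mentioned above are usually permanent (301) redirects. As a hedged sketch, on an Apache server with `mod_alias` enabled this can be done in an `.htaccess` file; the paths here are hypothetical:

```apache
# Permanently redirect a moved page to its new address (301 = moved permanently).
Redirect 301 /old-page https://example.com/new-page

# Redirect an entire retired directory to its replacement.
Redirect 301 /old-blog https://example.com/blog
```

A 301 tells search engines the move is permanent, so over time they transfer indexing signals from the old URL to the new one.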
Should you delete RSS feeds to make it easier for Googlebot to crawl pages?
Google's systems try to balance crawling of a site automatically. This sometimes results in some pages being crawled more often than others, but it usually happens only after Google has already looked at the important pages.