
Google’s John Mueller responds to four rapid-fire questions about common technical SEO issues that almost everyone faces at some point.
Mueller responds to questions from readers about:
- Blocking CSS files
- Updating sitemaps
- Re-uploading a site to the web
- Googlebot’s crawl budget
These questions are addressed in the most recent installment of the Ask Googlebot video series on YouTube.
Traditionally, those videos have focused on answering a single question in depth.
However, not every SEO question requires an entire video to answer. Some can be answered in a sentence or two.
Here are some quick answers to frequently asked questions by people who are just getting started in SEO.
Can CSS File Blocking in Robots.txt Affect Rankings?
Yes, blocking CSS can cause problems, and Mueller advises against doing so.
When CSS is blocked in robots.txt, Googlebot is unable to render a page as it would appear to visitors.
Being able to see a page in its entirety aids Google in better understanding it and confirming that it is mobile-friendly.
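To see whether your own robots.txt accidentally blocks stylesheets, you can test it with Python's standard-library robots.txt parser. This is a minimal sketch; the `Disallow` rule and the example.com URLs are hypothetical, standing in for whatever your site actually serves.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt with the kind of rule Mueller advises against:
# it blocks the directory that holds the site's CSS files.
robots_txt = """
User-agent: *
Disallow: /assets/css/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot can fetch the page itself...
page_ok = parser.can_fetch("Googlebot", "https://example.com/index.html")
# ...but not the stylesheet, so it cannot fully render the page.
css_ok = parser.can_fetch("Googlebot", "https://example.com/assets/css/site.css")

print(page_ok, css_ok)  # True False
```

If the stylesheet check comes back `False` for Googlebot, the fix is simply to remove (or narrow) the `Disallow` rule so CSS files are crawlable.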
How Should I Update My Website’s Sitemap?
According to Mueller, there is no universally applicable simple solution for updating sitemaps that works across all websites.
However, most website setups include their own built-in solutions.
Consult the documentation for your CMS or hosting platform to find a sitemap setting, or a compatible plugin that generates sitemap files.
It’s usually just a matter of turning on a setting and you’re good to go.
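For context, the file those plugins produce is a simple XML document listing your URLs. The sketch below generates one with Python's standard library; the example.com paths are placeholders, and in practice a CMS plugin writes and refreshes this file for you.

```python
import xml.etree.ElementTree as ET
from datetime import date

# Namespace required by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

# Placeholder paths standing in for a site's real URLs.
paths = ["/", "/about", "/blog/latest-post"]

urlset = ET.Element("urlset", xmlns=NS)
for path in paths:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = f"https://example.com{path}"
    # lastmod tells crawlers when the page last changed.
    ET.SubElement(url, "lastmod").text = date.today().isoformat()

sitemap_xml = ET.tostring(urlset, encoding="unicode", xml_declaration=True)
print(sitemap_xml)
```

Whether hand-rolled or plugin-generated, the resulting file is typically served at `/sitemap.xml` and referenced from robots.txt or submitted in Search Console.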
What Is The Proper Way To Reintroduce A Website To Google?
Deleting a website's files and re-uploading them will not force it to be re-indexed.
Google will automatically prioritize the most recent version of a site and gradually phase out the older version.
You can speed up this process by redirecting any old URLs to the new ones.
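In practice that means mapping each old URL to its new counterpart and answering with a permanent (301) redirect. The sketch below shows the core logic in Python; the paths are illustrative, and a real site would configure this in its web server or CMS rather than in application code.

```python
# Hypothetical mapping from a relaunched site's old URLs to new ones.
REDIRECTS = {
    "/old-about.html": "/about",
    "/blog/2020/my-post.html": "/blog/my-post",
}

def resolve(path):
    """Return (status, location). A 301 tells Google the move is permanent,
    so ranking signals consolidate on the new URL."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/old-about.html"))  # (301, '/about')
print(resolve("/about"))           # (200, '/about')
```

The same one-to-one mapping can be expressed as `Redirect 301` lines in Apache or `return 301` blocks in nginx; what matters for Google is that each old URL leads directly to exactly one new URL.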
Do RSS Feed URLs Waste Googlebot’s Crawl Budget?
A reader tells Mueller that the RSS feed URLs linked in the head of every page consume 25% of Googlebot’s crawl budget.
They inquire whether removing the RSS feeds would improve crawling.
Mueller says RSS feeds are not a problem, and that Google’s systems automatically balance crawling across a website.
This sometimes results in Google crawling specific pages more frequently, but pages are only re-crawled after Googlebot has seen all of the important pages at least once.