Is Site Dependence On JavaScript A Problem For Googlebot?

Google's John Mueller responds to a question about JavaScript's impact on Googlebot, recommending testing to figure out what's really going on.

During a recent Google Search Central SEO office-hours hangout, Google’s Search Advocate John Mueller was asked whether it’s bad for a website to depend on JavaScript for basic functionality.

Is it possible that this will have a negative impact on Googlebot’s crawling and indexing?

Mueller said it’s probably fine, but he also suggested some steps to take to ensure that neither Google nor users have any issues with the site.

Without JavaScript, the site is not user-friendly

The person who asked the question mentioned that JavaScript was used for a lot of the site’s functionality and was concerned about the impact on both user and SEO friendliness.

This is the question:

“Our website is not very user friendly if JavaScript is turned off.

Most of the images are not loaded. Our flyout menu can’t be opened.

However the Chrome Inspect feature, in there all menu links are there in the source code.

Might our dependence on JavaScript still be a problem for Googlebot?”

The “Chrome Inspect feature” the person is referring to is most likely Chrome’s built-in View Page Source tool.

So they’re saying that even with JavaScript disabled in the browser, the menu links are still present in the HTML source code.
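To illustrate why that matters, here is a minimal sketch of a flyout menu built that way. The markup and class names below are hypothetical, not taken from the asker’s site: the links exist in the server-rendered HTML, so they appear in View Page Source and stay crawlable even with scripting off, while JavaScript only adds the open-and-close behavior.

```typescript
// Hypothetical markup (visible in View Page Source even with JavaScript off):
//
// <nav id="main-nav">
//   <button class="flyout-toggle" aria-expanded="false">Menu</button>
//   <ul class="flyout" hidden>
//     <li><a href="/products">Products</a></li>
//     <li><a href="/contact">Contact</a></li>
//   </ul>
// </nav>

// The script only toggles visibility; the <a> links above do not depend on it.
const toggle = document.querySelector<HTMLButtonElement>(".flyout-toggle");
const flyout = document.querySelector<HTMLUListElement>(".flyout");

if (toggle && flyout) {
  toggle.addEventListener("click", () => {
    const opening = flyout.hidden;          // list is currently hidden, so this click opens it
    flyout.hidden = !opening;               // show or hide the list
    toggle.setAttribute("aria-expanded", String(opening));
  });
}
```

A menu built this way degrades gracefully: with JavaScript off the flyout simply can’t be opened, but the links themselves remain in the page source for crawlers to find.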

Mueller advises conducting site inspections

Mueller acknowledged in his response that Google could most likely handle the site.

But what went unsaid is that many websites require JavaScript for basic functionality, so the experience the person who asked the question described is fairly typical.

Many images will not load, the layout may become broken, and some menus will not work if JavaScript is disabled in your browser.

A screenshot of SearchEngineJournal with JavaScript disabled illustrates the point:

[Screenshot: SearchEngineJournal.com rendered with JavaScript disabled]

While Mueller alluded to this in his response, it bears emphasizing: most websites are barely usable without JavaScript enabled in the browser, so the situation described in the question is common rather than exceptional.

Mueller affirmed that everything would most likely be fine.

He said:

“And, from my point of view …I would test it.

So probably everything will be okay.

And probably, I would assume if you’re using JavaScript in a reasonable way, if you’re not doing anything special to block the JavaScript on your pages, then probably it will just work.”

Test Your Site’s Performance

Mueller then encouraged the person to run tests to ensure that the site is working properly, saying that “we” have tools but not specifying which ones.

He’s probably referring to the Google Search Console tools that can tell you if Google can crawl your pages and images.

Mueller continued his answer:

“But you’re much better off not just believing me, but rather using a testing tool to try it out.

And the testing tools that we have available are quite well documented.

There are lots of …variations on things that we recommend with regards to improving things if you run into problems.

So I would double-check our guides on JavaScript and SEO and think about maybe, …trying things out, making sure that they actually work the way that you want and then taking that to improve your website overall.”
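Google’s documented tools, such as the URL Inspection feature in Search Console, are the natural starting point for this kind of test. As a rough supplement, the sketch below (not one of Google’s tools; Puppeteer and the example URL are assumptions for illustration) compares the links present in the plain HTML with the links that only appear after JavaScript runs, which is a quick way to spot navigation that depends entirely on rendering.

```typescript
// Sketch: compare links in the raw HTML vs. the JavaScript-rendered page.
// Assumes Puppeteer is installed (npm install puppeteer); the URL is a placeholder.
import puppeteer from "puppeteer";

async function compareLinks(url: string): Promise<void> {
  const browser = await puppeteer.launch();

  // Pass 1: JavaScript disabled, roughly what a plain HTML fetch would contain.
  const plainPage = await browser.newPage();
  await plainPage.setJavaScriptEnabled(false);
  await plainPage.goto(url, { waitUntil: "domcontentloaded" });
  const plainLinks = await plainPage.$$eval("a[href]", (anchors) =>
    anchors.map((a) => a.getAttribute("href"))
  );

  // Pass 2: JavaScript enabled, closer to what a rendering crawler would see.
  const renderedPage = await browser.newPage();
  await renderedPage.goto(url, { waitUntil: "networkidle0" });
  const renderedLinks = await renderedPage.$$eval("a[href]", (anchors) =>
    anchors.map((a) => a.getAttribute("href"))
  );

  // Links that only exist after rendering are the ones worth double-checking.
  const jsOnlyLinks = renderedLinks.filter((href) => !plainLinks.includes(href));
  console.log(`Links only present after JavaScript rendering: ${jsOnlyLinks.length}`);
  console.log(jsOnlyLinks);

  await browser.close();
}

compareLinks("https://example.com/").catch(console.error);
```

If important navigation links only show up in the second pass, that is exactly the kind of situation where double-checking Google’s JavaScript SEO guides, as Mueller suggests, is worthwhile.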

Sites that are easy to use

Mueller then moved on to the topic of user-friendliness, as the person who asked the question mentioned that the site is unusable without JavaScript enabled.

The vast majority of websites use JavaScript; according to W3Techs, 97.9% of them do.

In its annual report on JavaScript use, HTTP Archive, which draws on real-world Chrome data from opted-in users, notes that the median mobile page loads 20 JavaScript files, rising to as many as 33 first-party and 34 third-party scripts at the 90th percentile of websites.

According to HTTP Archive, 36.2 percent of the JavaScript forced onto a site visitor’s browser goes unused, resulting in wasted bandwidth.

As the numbers show, the problem isn’t visitors who browse with JavaScript disabled, which is what the person who asked the question was concerned about; that worry is largely unfounded.

The real issue is that users are encountering a site that is forcing too much JavaScript on them, resulting in a poor user experience.

Mueller didn’t go into detail about how the individual’s concerns were misplaced. He did, however, suggest some useful methods for determining whether users are having a bad experience due to JavaScript issues.

Mueller continued his answer:

“And you mentioned user-friendly with regards to JavaScript, so from our point of view, the guidance that we have is essentially very technical in the sense that we need to make sure that Googlebot can see the content from a technical point of view, and that it can see the links on your pages from a technical point of view.

It doesn’t primarily care about user-friendliness.

But of course your users care about user-friendliness.

And that’s something where maybe it makes sense to do a little bit more so that your users are really for sure having a good experience on your pages.

And this is often something that isn’t just a matter of a simple testing tool.

But rather something where maybe you have to do a small user study or kind of interview some users or at least do a survey on your website to understand where do they get stuck, what kind of problems are they facing.

Is it because of these …you mentioned the fly-out menus. Or is it something maybe completely different where they’re seeing problems, that maybe the text is too small, or they can’t click the buttons properly, those kinds of things which don’t really align with technical problems but are more, kind of, user-side things that if you can improve those and if you can make your users happier, they’ll stick around and they’ll come back and they’ll invite more people to visit your website as well.”

User Testing

Furthermore, Mueller didn’t explicitly name any of the recommended tests. Search Console is self-evidently the best tool for diagnosing Google crawling issues; for example, it notifies publishers when a large number of URLs are discovered.

One of the best user experience tools is Microsoft Clarity, a free, GDPR-compliant analytics tool that shows how users interact with your site and can even tell you when they’re having a bad experience.

As a result, it could be very useful for diagnosing the site issues that John Mueller mentioned.

Citation

Watch John Mueller answer the question at the 10:23 minute mark of the Google Search Central SEO office-hours hangout video.
