If you want your content and pages to be ranked and displayed in Google’s search results, you have to make sure that Google’s spiders have access to your CSS and JS files.
You have to understand that Google’s main focus is to rank user-friendly websites higher. By user-friendly, we mean the website loads quickly, the navigation is clear, and, of course, Google has access to the CSS and JS files.
If the spiders Google sends to crawl and index your website are blocked from those files, Google can’t render your pages the way a visitor sees them, and it has no reliable way of knowing what your website is about.
Needless to say, if Google doesn’t know what your website is about, it also doesn’t know how it should rank your website in the search results.
In a nutshell, if Googlebot is blocked from accessing your CSS and JS files, your website’s SEO performance will likely suffer.
Your rankings in the search results will go down, or worse, your website could disappear from the search results altogether. You don’t want that to happen.
When you receive the warning message from Google, it means something is blocking the search engine’s bots, and you should fix it as soon as possible. The longer you put off addressing the problem, the more it will hurt your website.
If you haven’t received the warning yet but your site is blocking Googlebot, you will likely get one sooner or later.
That said, even if you didn’t get the notification, you should still check to make sure that Google has access to your CSS and JS files.
Fixing the problem is not as complicated as you might think.
It can be frustrating at first, but it gets easier as you learn more about how the bots and files work.
How to Give Google Access to Your CSS and JS Files
First of all, you need to know exactly which files on your website are inaccessible to Google’s bots. To do this, log into your Google Search Console account and go to your dashboard.
Find the Crawl tab and click on the option that says “Fetch as Google”, then click the “Fetch and Render” button.
You should do this for both desktop and mobile. The results will show up in a row in the lower half of your dashboard.
When you click on a result, you will be shown two windows, which are essentially screenshots of your website: one shows what Google sees, the other what a regular visitor sees. Look for differences between the two screenshots. If there’s a difference, it means Googlebot is being blocked from some of your resources.
The results should also include a list of links to the CSS and JS files that Google wasn’t able to access. The list not only shows the links, it also identifies what type of file has been blocked, whether it’s a script, an image, or a video.
You can also identify which resources have been blocked by going to Google Index and choosing Blocked Resources. The results show how many pages have blocked resources, and you can download all the information if you want to keep a separate copy in case you ever run into trouble with your Google Search Console account.
If you click on each blocked resource, you will see the actual links that cannot be accessed by the Google bots.
How to Fix the Warnings by Editing robots.txt
Before you proceed, you need to know what a robots.txt file is. In a nutshell, robots.txt is a plain text file that webmasters use to give crawling and indexing instructions to web crawlers and spiders. You can use it to tell crawlers which parts of your website to crawl and index and which parts to stay out of.
When a Google bot reaches your website to index it, it follows the directives in your robots.txt file.
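To make this concrete, here is a minimal example of what a robots.txt file might contain (the /private/ path here is just a placeholder for illustration, not something your site necessarily has):

User-agent: *
Disallow: /private/

The User-agent line states which crawlers the rules apply to (the asterisk means all of them), and each Disallow line names a path that crawlers should stay out of.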
At this point, you should already see why robots.txt files are very important when it comes to fixing CSS and JS warnings on your website.
If your website runs on the WordPress platform, here’s how you can fix the CSS and JS warnings through your site’s robots.txt files.
You can easily edit your robots.txt file over FTP with a client like FileZilla, or you can use the cPanel file manager. Last but not least, you can use the file editor feature of Yoast SEO, a very popular WordPress plugin.
Using Yoast SEO, you can access and edit both your .htaccess and robots.txt files directly from your WordPress dashboard. From the dashboard, click on SEO, then on Tools, then on File Editor. From this page, you can view and edit your robots.txt file. But how exactly should you edit it? In most cases, all you need to do is remove the following line:
Disallow: /wp-includes/
This should fix the warnings, but it depends on how you initially configured your robots.txt file. It’s quite possible that your site was configured to disallow access to several WordPress directories, such as the following:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
You need to fix that if you want to stop receiving the CSS and JS warnings from Google.
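As a rough sketch, assuming you only want to keep crawlers out of the admin area, removing the other Disallow lines would leave something like this:

User-agent: *
Disallow: /wp-admin/

This keeps the admin area off-limits while letting Googlebot fetch everything under wp-includes and wp-content, which is where most of your CSS and JS files live.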
If you are confused at this point and you don’t know how to proceed, you should review your robots.txt file or consult with someone who is more knowledgeable about the issue.
The general rule here is that you must get rid of the lines of code that are preventing Google bots from accessing your CSS and JS files.
The files in question are the ones your website’s front end loads; in most cases, they live in the themes and plugins folders. It might also be necessary to remove the rules blocking wp-includes and your WordPress themes, because certain themes and plugins call scripts that are located in your wp-includes folder.
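If you’d rather not delete a Disallow rule outright, keep in mind that a more specific Allow rule can override a broader Disallow, since Google follows the most specific matching rule. A sketch of that approach (adjust the paths to whatever your blocked-resources report actually lists) might look like this:

User-agent: *
Disallow: /wp-includes/
Allow: /wp-includes/js/
Allow: /wp-includes/css/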
Review all of these entries to see whether your robots.txt is what’s causing the warnings you are getting from Google.
What If You Don’t Have a Robots.txt File?
Logic would suggest that if your website doesn’t have a robots.txt file, the Google bot would crawl and index all of your files.
This means that you shouldn’t be receiving a CSS and JS warning from Google, right?
Well, it’s still possible for you to receive the warning even if your website doesn’t have a robots.txt file.
This is because some companies that provide WordPress hosting automatically block bots from accessing the default WordPress folders.
This means that even if your robots.txt file is empty, it’s still possible for you to get a CSS and JS warning from Google.
To fix this problem, you must override the block by explicitly allowing access to the blocked folders. You can use the following lines of code:
User-agent: *
Allow: /wp-includes/js/
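Your host may block more than just the wp-includes scripts. If your blocked-resources report also lists files under wp-content, a broader sketch (again, match the paths to what the report actually shows) would be:

User-agent: *
Allow: /wp-includes/js/
Allow: /wp-content/themes/
Allow: /wp-content/plugins/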
The next step is to make sure you save the modified robots.txt file. You can confirm the change is live by loading yourdomain.com/robots.txt in a browser (substituting your own domain).
Now go back to your Google Search Console dashboard and repeat the process we discussed earlier for identifying blocked resources on your website. Using the Fetch and Render option, you will again be shown the list of blocked resources; if you’ve followed the instructions above, the list should now be empty.
So that’s how you fix CSS and JS warnings from Google.
The bottom line here is that you should understand how your robots.txt file affects what Google can and cannot access on your site. We hope this article helped you get rid of the problem once and for all.