I am going through our Google Search Console settings and noticed that Google cannot render the CSS and JS files when they are served from the cache.
My robots.txt is set to disallow the /cache/ folder.
So how can I serve compressed, cached CSS and JS to improve page speed while still allowing Google to render the site correctly?
Am I correct in assuming that if I ALLOW the /cache/ folder in robots.txt, it will open me up to tons of crawl errors after the cache is cleared or expires?
Is there a way to get both to work? I want to run cached CSS and JS and also allow the robots to read those CSS and JS files.
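One idea I'm considering (assuming Googlebot honors the wildcard `*` and end-of-URL `$` patterns, which Google's robots.txt documentation says it does) is to keep /cache/ disallowed in general but explicitly allow just the CSS and JS files inside it:

```
User-agent: *
Disallow: /cache/
Allow: /cache/*.css$
Allow: /cache/*.js$
```

My understanding is that Google picks the most specific (longest) matching rule, so the Allow lines should win for the stylesheets and scripts while everything else under /cache/ stays blocked. Does that sound right, or would expired cache files still generate crawl errors?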
Any ideas?