Litespeed Cache Miss on first page load despite cache being populated





Last Updated on: Wed, 15 Apr 2026 00:00:02
Hi, I am new to LiteSpeed and trying to configure it, but I am hitting some issues when testing the caching before I put a new site live.

1. I use Screaming Frog to generate the cache manually after a purge.
2. I check the filesystem to verify the cache exists.
3. I open a new incognito browser and load the page. I see the LiteSpeed cache report a MISS, yet the combined and minified CSS and JS are being loaded.
4. I refresh the page and get a LiteSpeed cache HIT, with the same files being loaded.

I am a bit at a loss.

Edit: not sure if it is relevant or not, but I did also have a cache variation for cookie consent. Although, as I have to add this in the .htaccess file, it seems to get overwritten whenever I change any settings. Is that right?

LiteSpeed Report: VXKNTMME

This topic was modified 3 weeks, 3 days ago by cgreen177.

The page I need help with: http://staging.pfccumbria.co.uk

Edit: I realised I had delayed JavaScript enabled in an attempt to solve another problem, so I have switched this off, but I am still seeing the same issue on first load. New LiteSpeed Report: MCFRFUZT

Hi,

1/2) If you are using a 3rd-party crawler service, make sure you simulate the cookie (since you seem to have a cookie vary rule?), the proper user-agent, and also the encoding. Missing any of these will lead to a cache miss.

3) It is always there, regardless of miss/hit.

Best regards,

Thanks. OK, let's simplify things and remove the cache vary.

Step 1: purge the cache.
Step 2: crawl the site via Screaming Frog to populate the cache: https://imgur.com/NksKt0j
Step 3: load the home page.

Observations: the home page generates a cache miss (https://imgur.com/GepiJKV), and the JS request is for the file that is in the cache (https://imgur.com/y8XVqv7). The next request HITs the LiteSpeed cache and requests the same file, with the same versions. You can try it yourself. My client may be looking at the site, but I don't think they will go anywhere near the privacy page, so you should see the behaviour for yourself there.
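The helper's point above about simulating the encoding and user-agent can be checked from the command line. A minimal sketch, assuming a curl-based check; `cache_status` is a hypothetical helper (not part of LiteSpeed), and the page path and header values are illustrative:

```shell
# cache_status: read raw HTTP response headers on stdin and print
# "hit" or "miss" based on the x-litespeed-cache header.
cache_status() {
  grep -ioE '^x-litespeed-cache: *(hit|miss)' | awk '{print tolower($2)}'
}

# Usage (URL illustrative): request the page with the same Accept-Encoding
# and User-Agent a browser sends. A mismatch on either means the request
# looks up a different cache copy and will report a miss even when the
# cache was just warmed by a crawler.
#   curl -s -o /dev/null -D - \
#     -H 'Accept-Encoding: gzip, deflate, br' \
#     -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64)' \
#     'http://staging.pfccumbria.co.uk/privacy/' | cache_status
```

Running the commented request twice, with identical headers, should print "miss" then "hit" if the cache is behaving.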
So, why, when I have a pre-populated cache (valid for a week), does the first page hit always generate a miss? What have I done wrong in the setup? This can't be right, so I assume the error is mine somewhere. Is it the encoding? Does the LiteSpeed crawler take care of that? In which case, are we saying it is pointless to even try to use LiteSpeed if the host won't allow access to the crawler? (I have two hosting companies: one does allow the crawler, the other demands you move onto private servers before it allows this.) For completeness, please also see report: BGKFNJMY. Thanks.

And if it was to do with Screaming Frog using the WRONG encoding, wouldn't I then expect to see a new cache file generated on first page load (which I don't)?

As you have enabled guest mode, you will also need to simulate the guest mode cookie; otherwise you warm the guest pages' cache instead of the normal pages' cache. Any reason why you don't use the plugin's built-in crawler?

For these JS files, it is not exactly as you expected. Say I have two requests: one comes with encoding, one doesn't. They will both be served the same JS/CSS files, because the page content is still the same but the encoding is different. So even on a cache miss, no new JS files are generated.

Aha! OK, I have this working now that guest mode is off, thank you, and Screaming Frog does now populate the cache correctly (I knew it would be a noob error). Next hurdle: how to handle GDPR cookie-consent cookies without a cache vary? I can't use the crawler because the host won't let me.

Marking resolved.

Sorry, rather than raise a new ticket I'm reopening, as I am still seeing issues. Here is what I have observed today:

1. Tested some pages, but 404s were being returned for the CSS and JS files. I suspect that might have been something to do with my purge settings (which were all unchecked, so I changed that).
2. Flushed the cache and waited for a cron job to run, which wgets each page to populate the cache (c. 9:17 am).
3. Observed the cache populate successfully and the LiteSpeed cache being hit.
4. At around 1:30 pm, tried to access the site from a different device and suspect the cache wasn't hit first time. Difficult to tell, as I was mobile with a poor signal.
5. At 16:00, accessed the site again from a private browser and I can see a LiteSpeed cache miss again on the first page hit (check the privacy page to see this).
6. Waited until after 16:20 (after the cron job below) to try again, and can still see a cache miss on first page view.

I have a cron job which runs the following command for each page, once an hour at 17 minutes past the hour:

wget -O - <page URL> >/dev/null 2>&1

But even though I can confirm this is running, it doesn't seem to keep the cache warm. My cache expiry is set to 1 week, so should I be seeing this? Again, apologies, it is probably something in my settings (the cookie is not being varied; instead I am excluding the relevant CSS and JS from the cache, which seems to work fine). Report reference: QEKXPMKR

Also now seeing 404s on the JS and CSS when I get a HIT on page load, as the cache files seem to have disappeared and are not getting repopulated by my cron job. Is this because I don't have access to the LiteSpeed crawler? And why would the cache files disappear in under 24 hours when the TTL is 1 week?

Any help appreciated, but I am rapidly coming to the conclusion that I cannot use LiteSpeed for my caching solution here.
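One likely gap in the wget cron job above, given the earlier advice in this thread, is that a bare wget sends no Accept-Encoding and a non-browser user-agent, so it warms a cache copy real visitors never request. A warm-up sketch under that assumption; the header values and the `urls.txt` file are illustrative, not from the thread:

```shell
#!/bin/sh
# warm_page: fetch one URL with browser-like headers and discard the body.
# Sending Accept-Encoding matters because the server keeps separate cache
# copies per encoding; warming without it primes a copy that browser
# requests (which do send it) will never match, so they still see a MISS.
warm_page() {
  curl -s -o /dev/null \
    -H 'Accept-Encoding: gzip, deflate, br' \
    -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64)' \
    "$1"
}

# Usage (e.g. invoked hourly by cron at 17 minutes past the hour);
# urls.txt is a hypothetical file with one page URL per line:
#   while read -r u; do warm_page "$u"; done < urls.txt
```

This mirrors the existing hourly schedule; only the request headers change.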





