WordPress LSCache Plugin: Crawler deactivated by host

Last Updated on: Wed, 15 Apr 2026 00:00:02
My host doesn't allow the LiteSpeed crawler and has disabled it on the server. My issue is that when my pages are cached, the speed is great; when they aren't cached, the site is slow to respond, going from an F at GTmetrix to an A when cached. If I can't crawl the pages to preload the cache, how can I make sure my pages are preloaded and have cached versions ready?

Hi, sadly, if your hosting provider doesn't allow the crawler, then you can't. If your site is pretty static, you can try to make the cache stay alive as long as possible, but if it's a very dynamic site, I don't really have any good solution for it. Best regards,

Hi, happily, I found a solution. It's not perfect, but it will do: a free site analyser. I configured it to crawl slowly through the site, and it acts the same way. My site is huge, so trying to keep the cache alive is tricky. I'm wondering if I can just remove all the triggers that clear the cache and do a purge all when I've made changes, or just purge the page I've edited, etc. Anyway, all is not lost with this site analyser, which of course has the massive advantage of giving me SEO information as well. Best wishes

@audiomonk, I tested the SiteAnalyzer, and for me it didn't generate a cache. That would also have surprised me, since Lisa Clark, on May 14, 2021, writes: "Unfortunately we are not aware of any third-party crawlers that work with LSCache." Here is the article: Best wishes

Then that's strange, because before running this SiteAnalyzer, pages were slow to load, all of them, once the cache had been cleared. Visiting them once, then visiting them again manually, as I said, made the huge difference. Tested with GTmetrix: it gave a poor score the first time, then re-testing (after the page had been visited) gave an A score, and the page loaded like lightning. So, after crawling via SiteAnalyzer, every page I checked loaded really quickly. Obviously you know more about the caching than I do, but this was my experience, and still is; I tried it again today. Doesn't LiteSpeed cache a page when it's visited?
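The before/after GTmetrix test above can be reproduced more directly: when a response passes through LSCache, LiteSpeed can expose an `X-LiteSpeed-Cache` header whose value is `hit` or `miss` (the plugin has an option to include this header). A minimal sketch of checking it, assuming a reachable URL; the `cache_status` and `check` helper names are mine:

```python
import urllib.request

def cache_status(headers):
    """Classify a response by its X-LiteSpeed-Cache header.

    Returns 'hit', 'miss', or 'uncached' when the header is absent
    (e.g. the header option is disabled, or the page is excluded).
    """
    value = headers.get("X-LiteSpeed-Cache", "").lower()
    return value if value in ("hit", "miss") else "uncached"

def check(url):
    # A HEAD request is enough: we only need the response headers.
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return cache_status(resp.headers)
```

Calling `check(url)` twice in a row mirrors the manual test in the thread: the first request is typically a `miss` (and primes the cache), the second a `hit`.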
How does it work? I noticed Lisa Clark talking about a lighter version of the LS crawler that won't need server permissions, etc. This would be a godsend. There's talk on there of these crawlers working, but not heating the cache?

Well, there is something like an online crawler that sends requests from a remote server, but it's still a long way down the to-do list.

Make an HTML sitemap on your site (many plugins can do this). In your browser, install a plugin like DownThemAll (https://addons.mozilla.org/en-US/firefox/addon/downthemall/) or similar. Open the HTML sitemap in your browser and let DownThemAll download all the HTML files to your computer. Cached! (Be aware that this can make your server bleed if you download too many files simultaneously.)
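The sitemap-plus-DownThemAll trick above can also be scripted. A minimal sketch, assuming the site exposes a standard XML sitemap; the function names and the 2-second default delay are my own choices, and the delay is what keeps the server from "bleeding" under simultaneous requests:

```python
import time
import urllib.request
import xml.etree.ElementTree as ET

# Standard sitemap namespace, per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(xml_text):
    """Extract all <loc> URLs from a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

def warm_cache(sitemap_url, delay=2.0):
    """Fetch every page in the sitemap, slowly, so each response
    gets stored by LSCache on its way through the server."""
    with urllib.request.urlopen(sitemap_url) as resp:
        urls = parse_sitemap(resp.read())
    for url in urls:
        try:
            urllib.request.urlopen(url).read()
        except OSError:
            pass  # skip unreachable pages, keep warming the rest
        time.sleep(delay)  # be gentle: one request per `delay` seconds
```

Run from any machine (or a cron job) after a purge, this does the same job as visiting each page once by hand: the first visit is slow, every later visitor gets the cached copy.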





