WordPress LSCache Plugin: Why does the crawler keep blacklisting URLs?
Last Updated on: Wed, 15 Apr 2026 00:00:02

Question: Why does the crawler keep adding URLs to the blacklist? It works correctly for 100-200 links, then stops and puts all remaining links into the blacklist. I have a Plesk server and followed the instructions to activate the crawler on the server. LiteSpeed now creates 4 crawlers, and I don't know why: 2 named "Guest WebP" and 2 named "Guest" (see the crawler, whitelist, and blocklist screenshots). I've tried starting it both via cron and manually. I have the QUIC.cloud CDN active, and Cloudflare with the orange cloud enabled only on the www CNAME, so the connection between the server and QUIC.cloud bypasses the orange cloud.

Answer: I can only answer the crawler part; plugin support will have to address your blacklist issue. Whenever Guest Mode or WebP replacement is enabled, the generated output code is different. The cache plugin is a full-page HTTP cache: it generates static files from dynamically generated URLs. A static file cannot change once generated, so each variation of the output code needs its own cached file for that URL. Each crawler crawls with a different combination of settings, which is why there is not just one crawler but one crawler per combination. A URL is therefore requested as many times as there are setting combinations that produce different output code.

Follow-up: OK, so what is your suggestion? Can I delete the crawlers to start from scratch? I'm using a translator for languages, which can change URLs, so does each language need another crawler with different settings? So I should not delete the crawlers, but which are the ones to switch ON, the lowest in the list? Should the others be switched OFF and never used again? Do you have a link to a guide explaining how this works? I've read the LiteSpeed guide but I don't understand the logic of the crawlers: how and when to use them, and whether I need to delete the blacklist each time or keep it. Is there a clear guide on how to do this?
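The "one crawler per setting combination" idea can be sketched as a cartesian product of the cache "vary" dimensions. This is a hypothetical illustration, not the plugin's actual code: the function and names below are invented, but the arithmetic shows why two binary options (Guest Mode on/off, WebP on/off) yield exactly the 4 crawlers seen in the dashboard.

```python
from itertools import product

def crawler_variants(vary_dimensions):
    """Return one (label, settings) pair per combination of vary values.

    Each dimension that changes the generated HTML needs its own cached
    copy of every URL, and therefore its own crawler pass.
    """
    names = list(vary_dimensions)
    variants = []
    for values in product(*(vary_dimensions[n] for n in names)):
        settings = dict(zip(names, values))
        label = " + ".join(f"{k}={v}" for k, v in settings.items())
        variants.append((label, settings))
    return variants

# Guest Mode on/off x WebP on/off -> 4 distinct crawler passes.
for label, _ in crawler_variants({"guest": [True, False],
                                  "webp": [True, False]}):
    print(label)
```

Adding a third vary dimension (for example, one per translated language that changes URLs or markup) multiplies the count again, which is why deleting crawlers does not help: the plugin recreates one per combination.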
Thanks. Please provide the report number; you can get it in Toolbox -> Report -> click "Send to LiteSpeed". MJJPLSFH, this is the report number. You should keep the crawlers as they are; each crawler has a purpose. You will have different scenarios, for example: a crawler for non-WebP, a crawler for WebP, a crawler for a Guest Mode user, a crawler for a Guest Mode user with WebP, etc. As for the blacklist: if I read the screenshot correctly and didn't make any typo, it picked up 2 URLs, e.g. /da/produkt-tag/iphone-dock-station/ and /product-tag/flip-book-case/, and they are returning 404 errors.
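The reply above suggests the blacklist is behaving as designed: URLs that do not return a normal 200 response get blacklisted so the crawler stops wasting requests on them. A minimal sketch of that rule, assuming that behavior (the `should_blacklist` helper and the `/shop/` path are hypothetical; the two 404 paths are the ones from the screenshot):

```python
def should_blacklist(status_code: int) -> bool:
    """Assumed rule: any non-200 response is not cacheable, so the
    crawler blacklists the URL instead of retrying it every run."""
    return status_code != 200

# Observed statuses; /shop/ is a made-up healthy URL for contrast.
statuses = {
    "/da/produkt-tag/iphone-dock-station/": 404,
    "/product-tag/flip-book-case/": 404,
    "/shop/": 200,
}

blacklisted = [url for url, code in statuses.items()
               if should_blacklist(code)]
print(blacklisted)
```

Under this reading, the fix is not to clear the blacklist repeatedly but to repair or redirect the 404 URLs (or remove them from the sitemap the crawler uses); once they return 200 again, they can be removed from the blacklist and will stay off it.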