WordPress LSCache Plugin: Crawler Status Explained
Last Updated on: Wed, 15 Apr 2026 00:00:02

Hi Support,

I currently have my crawl interval set to refresh every hour. Under the crawler cron summary, I see two cron jobs. The first one is "Guest Webp" and the second one is just named "Guest". What are these two cron jobs? Does the first one prefetch a cached copy of the pages with WebP images onto the server, and the second one prefetch the HTML content of all pages found in the sitemap?

I am using 3 threads, and the first cron job takes 50 minutes to complete. The second one takes about 50 minutes as well. Does that mean I need to set my crawler interval to at least 2 hours? Is there any way to make the crawler run the second cron job before it crawls the WebP images? I find the site runs much faster when the pages are cached rather than having the WebP images cached.

Hi,

The first one, "Guest Webp", means the crawler crawls the pages with WebP support for modern browsers like Chrome, the new Edge, Safari 14 and above, etc. The other one, "Guest", is for browsers that don't support WebP images, like older versions of Safari. And actually, considering the market share of Chrome, and how often people update Apple devices, I'd say there are more WebP-supported browsers than non-WebP browsers.

If the crawler is still running when the interval elapses, it will simply skip that trigger, wait for the current run to finish, and then start at the next interval time.

Sadly no, you cannot change the order of the list.

Best regards,

Re: "If the crawler is still running when the interval elapses, it will simply skip that trigger, wait for the current run to finish, and then start at the next interval time."

I'm sorry, but I didn't quite understand what your reply meant. So should I set it to one hour or two hours?

Hi,

Well, actually it doesn't make much difference whether you set it to 1 or 2 hours in this case.

Best regards,

Sorry to bother you again. I want to make sure I understand this correctly.
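The two crawler lists above exist because the cached copy a visitor receives depends on whether their browser advertises WebP support. A minimal sketch of that idea (this is an illustration, not the plugin's actual code; the function name and variant labels are assumptions):

```python
# Illustrative sketch of cache-variant selection: browsers that include
# "image/webp" in their Accept header get the WebP copy, others get the
# plain copy. The crawler therefore has to warm both variants of every
# page, which is why two cron jobs appear in the summary.

def cache_variant(accept_header: str) -> str:
    """Pick which cached copy of a page to serve or warm."""
    if "image/webp" in accept_header:
        return "guest_webp"   # modern Chrome, new Edge, Safari 14+
    return "guest"            # older browsers without WebP support

# The crawler would visit every sitemap URL once per variant:
print(cache_variant("text/html,image/webp,*/*"))  # guest_webp
print(cache_variant("text/html,*/*"))             # guest
```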
If a cron job requires 2 hours to complete and I set the interval to one hour, surely that is not enough time for the crawler to complete the job. The first hour will probably only complete 50% of the job. Since I have set my crawl interval to 1 hour, on the second hour will the crawler pick up from its last position or reset back to position 1?

When it detects that the crawler is already running, it will skip that trigger. For example, say your crawl interval is set to 1 hour, each of your crawler lists takes 50 minutes to complete, and the interval between runs is set to 5 minutes. Starting at 00:00:

00:00 - first round starts
00:50 - first round finishes
00:55 - second round starts (after the 5-minute interval between runs)
01:00 - cron triggers the crawler again, but it sees there is a running task already, so it skips
01:45 - second round finishes
02:00 - cron triggers again, but the last run finished only 15 minutes ago, which does not meet the 1-hour interval, so it skips again
03:00 - cron triggers again; since it has been 1 hour 15 minutes, the crawler starts
03:50 - first round finishes
03:55 - second round starts
04:00 - cron triggers, but the crawler is running, so it skips
04:45 - second round finishes
05:00 - cron triggers; the last run finished only 15 minutes ago, which does not meet the 1-hour condition, so it skips
06:00 - first round starts again... and so on

If my crawl interval is set to 1 hour and it takes 2 hours to complete the job, the crawling would not stop even though 1 hour has passed? It will continue to crawl, with the interval between runs, until the entire site is complete? I had the impression that the 1-hour crawl interval would stop the crawling after 1 hour even if the crawling process is not complete, and then it would restart from position 1, so the cron job would never finish.

It is the interval between two crawler runs, not a maximum crawler running time.

I see where this is going then. Because the crawler cannot automatically start crawling as soon as a purge occurs, the crawl interval sets when to fire a cron to start checking again.
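The timeline above can be reproduced with a toy simulation (this is a sketch of the skip logic as described in the thread, not LiteSpeed's actual code; all names and the hourly cron assumption are mine):

```python
# Toy simulation of the skip logic: a cron fires every hour, but a new
# crawl cycle only starts if no crawl is running AND the previous cycle
# finished at least `CRAWL_INTERVAL` minutes ago.

CRAWL_INTERVAL = 60      # "Crawl Interval" setting, in minutes
RUN_DURATION = 50        # minutes each crawler list takes (from the thread)
BETWEEN_RUNS = 5         # "Interval Between Runs" setting, in minutes
LISTS = 2                # the Guest Webp list plus the Guest list

def simulate(hours: int = 7) -> list[int]:
    """Return the minutes-since-midnight at which a full crawl cycle starts."""
    starts = []
    busy_until = -1          # minute the current cycle ends
    last_finished = None     # minute the last full cycle finished
    for t in range(0, hours * 60, 60):   # cron fires every hour at :00
        if t < busy_until:
            continue                     # a cycle is still running -> skip
        if last_finished is not None and t - last_finished < CRAWL_INTERVAL:
            continue                     # finished too recently -> skip
        starts.append(t)
        # one full cycle = LISTS runs, with a gap between consecutive runs
        busy_until = t + LISTS * RUN_DURATION + (LISTS - 1) * BETWEEN_RUNS
        last_finished = busy_until
    return starts

print(simulate())  # [0, 180, 360] -> cycles begin at 00:00, 03:00, 06:00
```

This matches the support rep's walkthrough: with two 50-minute lists, a 5-minute gap between them, and a 1-hour interval, full cycles start three hours apart.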
Does an auto purge occur when a new purchase order is received? If that's the case, shouldn't it be beneficial to set the crawl interval to half an hour or shorter, so it can start crawling the site again to cache all the missing pages?

It will purge the product page (and its related pages, like the category or tag pages) upon a stock change. A more aggressive crawler, yes, could be more beneficial, but it also creates more load on the server, so it is not a very practical choice for shared hosting users.