magento-full-page-cache-crawler
Magento Full Page Cache Crawler
Is it possible to exclude pages from crawling? For example:  
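A common way to implement this kind of exclusion, independent of this extension's actual configuration, is to filter the crawl queue against a list of URL patterns before fetching. The sketch below is a hypothetical illustration in Python; the pattern list and function names are assumptions, not the extension's API.

```python
# Hypothetical sketch of URL exclusion for a cache-warming crawler.
# EXCLUDE_PATTERNS and should_crawl() are illustrative names, not part
# of the Maverick Crawler extension itself.
import re

# Assumed example patterns: pages that should never be cache-warmed.
EXCLUDE_PATTERNS = [r"/checkout/", r"/customer/", r"\?.*sort="]

def should_crawl(url):
    """Return True if the URL matches none of the exclusion patterns."""
    return not any(re.search(pattern, url) for pattern in EXCLUDE_PATTERNS)

urls = ["/category.html", "/checkout/cart/", "/customer/account/login/"]
print([u for u in urls if should_crawl(u)])  # ['/category.html']
```

Filtering before enqueueing (rather than after fetching) keeps excluded pages from consuming crawl time at all.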
Hi, Magento 1.9.2.4 installed without any noted errors, but I get this every day: Curious as to why you need the AND (`1` = 1) in your SQL? Next...
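The `AND (1 = 1)` seen in the generated SQL is a common query-builder idiom rather than a bug: starting a WHERE clause with a tautology lets every subsequent condition be appended with a uniform `AND` prefix, with no special case for the first one. A minimal sketch of the idea (the function name is illustrative, not from this extension):

```python
# Hypothetical sketch of why generated SQL often contains AND (1 = 1):
# seeding the WHERE clause with a tautology means every condition can be
# appended the same way, without branching on "is this the first one?".

def build_where(conditions):
    """Join condition fragments onto 'WHERE 1 = 1' with uniform ANDs."""
    clause = "WHERE 1 = 1"
    for cond in conditions:
        clause += f" AND ({cond})"
    return clause

print(build_where([]))
# WHERE 1 = 1
print(build_where(["status = 'enabled'", "store_id = 1"]))
# WHERE 1 = 1 AND (status = 'enabled') AND (store_id = 1)
```

The database optimizer evaluates `1 = 1` as a constant, so the extra condition has no practical performance cost.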
Hi, I installed this extension through Composer and it installed perfectly fine. I can see the extension in my backend, and I then set up a crawler & cron job...
I have installed the module without Composer, and I get this error when running the crawler: 'Fatal error: require_once(): Failed opening required 'C:\xampp\htdocs\magento\app\code\local\Maverick\Crawler\Helper/../../../../../../../../autoload.php' (include_path='C:\xampp\htdocs\magento\app\code\local;C:\xampp\htdocs\magento\app\code\community;C:\xampp\htdocs\magento\app\code\core;C:\xampp\htdocs\magento\lib;.;C:\xampp\php\PEAR') in C:\xampp\htdocs\magento\app\code\local\Maverick\Crawler\Helper\Crawler.php on line 28' Is needed...
 