magento2-CronjobManager
CData section too big when loading timeline
Timeline won't load on our production servers, it generates:
Warning: DOMDocumentFragment::appendXML(): Entity: line 1: parser error : CData section too big found in /vendor/magento/framework/View/TemplateEngine/Xhtml/Template.php on line 60
@JeroenVanLeusden I think this is a Magento bug: https://github.com/magento/magento2/issues/8084#issuecomment-335239220 BUT they say it's fixed in v2.2.x (I'm very skeptical)...
What version of Magento are you running?
We're on 2.2.2; it should be fixed in 2.2.3, so maybe the issue will be gone once we upgrade the webshop.
@JeroenVanLeusden Not ready to give up investigating just yet!
I set up my local environment on version 2.2.2 and ran a large SQL query to load a little over 10,000 crons into cron_schedule, then loaded up the Timeline with no errors.
I tried this in both production mode and developer mode; I'm using CJM v1.5.0, testing on Ubuntu 16.04 (I doubt the OS is the issue, but I figure it's worth mentioning). Still no luck.
BUT, I grepped for the error you posted above and found it in var/log/system.log with a timestamp of April 29th, though I have no idea how to trigger it again.
Do you have any other information to help me reproduce? Maybe some steps? Can you consistently reproduce this error?
We consistently see this error, which makes me believe it might be a PHP configuration issue somewhere. We run the following:
- Ubuntu 16.04.1 LTS
- PHP 7.0.28-1
- max_execution_time => 18000
- max_file_uploads => 50
- max_input_nesting_level => 64
- max_input_time => 60
- max_input_vars => 10000
- memory_limit => 2048M
Just checked my cron_schedule table and saw a whopping 170k records. After flushing them, there are ~13k records left.
The warning message disappeared, but now my browser window crashes trying to render the timeline.
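For reference, "flushing" amounts to deleting finished rows from `cron_schedule`. A sketch, assuming the stock Magento table name; the retention window and statuses below are assumptions, so review before running against your own database:

```shell
# Hypothetical cleanup: drop completed/missed/errored crons older than a day.
# The 1-day window is an assumption — pick what fits your store.
SQL="DELETE FROM cron_schedule WHERE status IN ('success','missed','error') AND scheduled_at < NOW() - INTERVAL 1 DAY;"

# Review the statement first, then pipe it into your DB, e.g.:
#   echo "$SQL" | mysql -u magento -p magento_db
echo "$SQL"
```

Running crons (`status = 'running'`) are deliberately left untouched here.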
I don't think it's a PHP configuration issue. See this comment: https://github.com/magento/magento2/issues/8084#issuecomment-306568076. I think that by pulling 170k records through the uiComponent we're going way over the 10MB limit set by libxml.
I didn't design the timeline to support over 100k records. This is almost certainly down to a Magento core bug: https://github.com/magento/magento2/issues/11002
There is a PR that was merged that (supposedly) fixes this here: https://github.com/magento/magento2/pull/12497 (I personally haven't seen this issue happen on v2.2.3, so maybe it does work).
But while the timeline may not need to support 100k records, this does bring to light a performance issue with the timeline.
That's something we can fix. I've noticed performance starts to degrade at around 2k records.
I'm going to close this issue and blame Magento for this one, but I'll open a separate ticket for the performance work.
Thanks for helping troubleshoot this issue! If you still have concerns, feel free to comment even if the issue is closed, or create a new issue if another arises.
@Ethan3600 Is this related to issue #65?
I have the same problem on version 2.2.5.

Yep. Is that still happening on the latest version? I hardcoded a limit on the number of crons, hoping it's low enough that we don't see this error.
@Ethan3600 The problem isn't really in the module; it's that the cron history wasn't being cleared. The store had over 100k records. Thank you very much for your attention.
Hmm. I was able to reproduce this by loading over 100k crons into the timeline. Are you sure it has to do with clearing the cron history? I was under the impression that we're just loading too many crons into the timeline (hence the limit).
Yes, exactly, but Magento core shouldn't accumulate more than 100,000 crons without cleaning up automatically. And on the other hand, CronjobManager, even with the latest patches, couldn't limit the rows, I guess.
@amelojunior agreed; this behaviour is down to a Magento core bug. However, we do have control over the number of records that are loaded into the timeline: we're simply querying the database with a limit.
This, unfortunately, was a lazy solution on my part. The real problem is described in this comment.
We're not allowed to load more than 10MB of JSON (within the contents of a single element). As a quick remedy, I forced our query to use a page size (a.k.a. limit) of 35,000. You can see the construction of this query here.
To my knowledge, there's no way to limit a query by the size of the data (in MB). I'll need to think hard about a solution for this. I could drop the limit further, or make it configurable via a setting, but I'd rather avoid that.
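As a rough sanity check on that number: given libxml's default 10MB cap on a single text node and an assumed average serialized row size, a back-of-envelope page size lands near the hardcoded limit (the 300-byte figure is an assumption for illustration, not measured from the module):

```shell
# libxml's default maximum size for a single text/CData node is 10MB.
cap=$((10 * 1024 * 1024))

# Assumed average serialized bytes per cron_schedule row; measure on real data.
avg_row_bytes=300

# Largest page size that stays under the cap at that row size.
echo $(( cap / avg_row_bytes ))   # prints 34952 — in the ballpark of the 35,000 limit
```

If your rows serialize larger (long messages in the `messages` column, for example), the safe limit shrinks proportionally.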
Another, less elegant, way around this (apart from clearing the cron history) is to patch the Magento core file /var/www/html/vendor/magento/framework/View/TemplateEngine/Xhtml/Template.php, amending `public function append($content)` to:

```php
public function append($content)
{
    $target = $this->templateNode->ownerDocument;
    $source = new \DOMDocument();
    // LIBXML_PARSEHUGE lifts libxml's default 10MB limit on a single node
    $source->loadXML($content, LIBXML_PARSEHUGE);
    $this->templateNode->appendChild(
        $target->importNode($source->documentElement, true)
    );
}
```
This then allows you to load the timeline even with your full cron history. Not sure about the performance, though. For now I've simply changed the cron history lifetime to 600 seconds and will monitor; this was on Magento 2.3.4 EE.
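For anyone wanting to tighten cron history retention as described above, it can also be done from the CLI. A sketch, assuming the `default` cron group and Magento's standard `system/cron/*` config paths; note these values are in minutes, so double-check how a "600 second" target maps for you:

```shell
# Assumed setup: run from the Magento root. "default" is the cron group name;
# repeat for other groups (e.g. "index") as needed. Values are in minutes.
bin/magento config:set system/cron/default/history_success_lifetime 60
bin/magento config:set system/cron/default/history_failure_lifetime 600
bin/magento config:set system/cron/default/history_cleanup_every 10
bin/magento cache:flush
```

With shorter lifetimes, Magento's own cleanup job keeps `cron_schedule` small enough that the timeline never approaches the 10MB limit.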
I'm pretty sure this has been fixed by now. We had a similar issue in https://github.com/elgentos/LargeConfigProducts but haven't seen it for years.