core
Generating DCAT feed eats memory
I now get this error on demo.thedatatank:
[Tue May 10 14:56:04.860052 2016] [:error] [pid 6551] [client 78.21.55.178:55568] PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 78 bytes) in /home/tdt/demo/core/bootstrap/compiled.php on line 6199
How much memory is assigned? (check php.ini)
I'll check if we're caching this properly.
As the error message says: 134217728 bytes ;)
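For context, 134217728 bytes is exactly 128 MB, PHP's common default `memory_limit`. If a higher ceiling is needed while debugging, it can be raised in php.ini (the file location and the appropriate value vary per setup; 256M below is just an illustration):

```ini
; php.ini — raise the per-request memory ceiling
; 134217728 bytes = 128M, the limit shown in the error above
memory_limit = 256M
```

The same can be done per request with `ini_set('memory_limit', '256M');`, though raising the limit only hides a leak rather than fixing it.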
But it really should not take that much memory to get a DCAT feed with only a few datasets, right?
It depends; we're working on more metadata right now, but normally it should not.
Wasn't due to the DCAT generation; it was because our demo server needed some maintenance :).
Reopening this: it seems EasyRDF is a horrible library in terms of performance and memory. I'm soon launching the Hardf library, which also supports quads: https://github.com/pietercolpaert/hardf/pull/5
I get about 200 times faster parsing with Hardf.