Loading large execution history runs fails
Bug Description
On flows that handle files and have failed, trying to view the run loads for a while and then an error appears.
All execution save settings are set to save.
To Reproduce
Create an example flow:
- receive a DOCX file via webhook (it can contain images; let's say 25MB+)
- convert the DOCX to HTML
- convert the HTML back to DOCX
Make the flow fail to run, then try to view the execution (a sketch of the conversion steps follows).
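For reference, the two conversion steps look roughly like this as a standalone Node script (a minimal sketch: in the real flow this runs inside n8n function nodes, the file names here are illustrative, and I'm using mammoth and html-to-docx):

```js
// Standalone sketch of the flow's two conversion steps
// (inside n8n these run in Code/Function nodes; file names are illustrative)
const fs = require('fs');
const mammoth = require('mammoth');         // DOCX -> HTML
const HTMLtoDOCX = require('html-to-docx'); // HTML -> DOCX

(async () => {
  // Step 1: DOCX -> HTML (mammoth accepts a Node Buffer)
  const { value: html, messages } = await mammoth.convertToHtml({
    buffer: fs.readFileSync('input.docx'),
  });
  messages.forEach((m) => console.warn(m.type, m.message)); // conversion warnings

  // Step 2: HTML -> DOCX (html-to-docx resolves to a Buffer in Node)
  const docx = await HTMLtoDOCX(html, null, {});
  fs.writeFileSync('output.docx', docx);
})();
```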
Expected behavior
The failed run can be displayed in the execution log.
Operating System
ubuntu 22.04 4 dedicated VCPU, 16GB RAM, 8GB swap
n8n Version
1.72.1
Node.js Version
docker
Database
PostgreSQL
Execution mode
main (default)
Hey @nirkons,
We have created an internal ticket to look into this which we will be tracking as "N8N-7970"
Hey @nirkons
Do any errors appear in either the browser console or in the n8n logs?
I didn't see anything in the Docker logs. Do I need to set a parameter in the docker compose file to see debug logs?
I didn't check the console; I'll check and update. Thanks
index-fCEqBD8K.js:14301 GET https://example.com/rest/executions/837404 net::ERR_CACHE_WRITE_FAILURE 304 (Not Modified)
This is the only thing in the console, and nothing in the Docker logs, but I'm unsure whether I need to increase the logging level in Docker.
Thanks
Hi @nirkons,
Could you please clear the browser's cache data? If the problem persists, it would also help us reproduce the issue if you could provide a sample file to test with.
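In the meantime, one quick test: the error you posted points at a cache write failure, so you could re-request the execution from the browser console with the HTTP cache bypassed (a sketch; the execution id is taken from your log above, and it assumes your logged-in n8n session authenticates the call):

```js
// Run in the browser console on your n8n tab
const res = await fetch('/rest/executions/837404', {
  cache: 'no-store',      // skip the HTTP cache entirely (no read, no write)
  credentials: 'include', // send the n8n session cookie
});
console.log(res.status, (await res.text()).length); // status + body size
```

If that returns a 200 with a large body, the data itself is intact and the problem is in the UI/cache layer.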
Thanks
Hey @nirkons!
We haven’t heard back for a while so we’re going to close this for now.
If you’re still running into the issue, or have more details to share, feel free to reopen it or create a new one.
Thanks!
Hey, I can confirm this exact error on my instance too; happy to provide any info you need.
I'm still having this issue. I couldn't get any logs; nothing shows up even with the logging level increased.
The only notable thing is that the run itself is about 250MB in the DB when I query it via SQL.
Maybe you can reproduce it with a similar scenario:
Download a DOCX file from Google Drive, convert it to HTML with mammoth, add a function node that makes some change to the HTML (doesn't matter what), add another function node that converts it back to DOCX with html-to-docx, then upload the result to Google Drive.
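For what it's worth, base64 inflation alone would account for a chunk of that 250MB row: as far as I understand, n8n's default binary data mode embeds binaries base64-encoded in the execution JSON (unless N8N_DEFAULT_BINARY_DATA_MODE=filesystem is set). A quick standalone check (file path illustrative):

```js
// How much a large DOCX grows once base64-encoded into the execution JSON
const fs = require('fs');

const buf = fs.readFileSync('input.docx'); // hypothetical 25MB+ file
console.log('raw bytes:   ', buf.length);
console.log('base64 chars:', buf.toString('base64').length); // ~33% larger
```

And since the flow keeps the original DOCX, the intermediate HTML, and the regenerated DOCX, those sizes stack within a single run.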