blackbox-log-viewer
Unable to export large files (>= 10 MB) to CSV
I have also run into this issue: the Blackbox Explorer just crashes. In my case, I can export up to a 12 MB file, but if I try with an 18 MB file it crashes. I can include my logs (both the one that works and the one that fails), if it helps to diagnose this.
Attach one that fails. Maybe we can see something.
Here you go. ESC_SENSOR_RPM.zip
This issue / pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs within a week.
I've been looking at this. It seems to be a memory problem: it generates different variables with all the content, and maybe something overflows.
The strange thing is that sometimes it works and sometimes not, so it's difficult to fix something that doesn't always crash :(
That has been my experience as well. It seems a function of both the length of flight, as well as what options are enabled. For the log that I included here, I have a 32/16khz Gyro/PID loop, as well as the ESC RPM logging option turned on.
Yes, I used your log. With your log, sometimes I can export to CSV, sometimes not.
Ah ok. Then this is more insidious than it appears. I appreciate you looking into it.
I have a .BBL file (~6 MB). The Export CSV button is not working in the Chrome app version of Blackbox Explorer.
I tried to load and export the above file in BBE 3.4.0; in my Mac environment it works several times without problems.
BUT if the user clicks Export, it seems nothing happens (no output on screen). In the background, a csv-export worker creates the export file and returns.
I took a look into the csv_export dump function and into the worker.js file. I assume the problem is that the export is built completely in memory (in a map, I'm not sure yet). For the above file, a CSV of around 160 MB is created, so this will affect performance and maybe stability.
Maybe a solution could be to re-engineer the worker with another npm lib, writing row by row and flushing several times, and in the meantime show a progress bar on screen to keep the user updated.
It took approx. 3-4 minutes to export the above file, without any user feedback, so to the user it seems it will not work.
Hope that helps
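The row-by-row flush idea above could be sketched roughly like this. This is a minimal sketch, not the actual csv_export internals; the `rows` shape and chunk size are illustrative assumptions:

```javascript
// Minimal sketch of serializing the CSV in chunks instead of building one
// giant string in memory. The generator yields after every `flushEvery`
// rows, so the worker could post a progress message between chunks.
function* csvChunks(rows, flushEvery) {
    let buffer = [];
    for (const row of rows) {
        buffer.push(row.join(','));
        if (buffer.length >= flushEvery) {
            yield buffer.join('\n') + '\n';
            buffer = [];
        }
    }
    if (buffer.length > 0) {
        yield buffer.join('\n') + '\n';
    }
}
```

Inside the worker, each yielded chunk could be accompanied by a `postMessage` with the percentage done, which is what would let the GUI show a progress bar.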
@mrRobot62: Good find! The use of workers, while efficient, is problematic in this sense, as it makes it hard to keep the user informed of what is going on (or even that something is going on).
I know, that's the reason I recommended rewriting the export logic with another lib, to have the chance to set up a progress bar (or something else).
Like csv-writer
@mrRobot62: There is definitely potential in that. At the moment our choice of libraries is hampered by the fact that they have to be web-ready to be supported in the version we release as a Chrome web app. Feel free to open a pull request for this, happy to provide support.
Hi @mikeller, I spoke (wrote) with @McGiverGim: maybe an approach to solve this is to implement the following logic in csv_export: split the complete map into several baskets (e.g. 10) and loop over these baskets. On every loop, show a user message in the GUI and append the loop's result to the output map; at the end, save the map to a CSV file.
Advantage: we don't have to exchange the lib and can show a user message while saving. Disadvantage: it's not "only" a bug fix, it's a re-engineering of csv_export, and saving may cost a little bit more time.
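The "baskets" idea could look something like this. All names here (`frames`, `serializeRow`, `onProgress`) are hypothetical, chosen for illustration rather than taken from csv_export:

```javascript
// Hypothetical sketch: split the data into `numBaskets` baskets, serialize
// one basket per loop iteration, and call a progress callback each time so
// the GUI can show a message while the export runs.
function exportInBaskets(frames, serializeRow, numBaskets, onProgress) {
    const basketSize = Math.ceil(frames.length / numBaskets);
    const parts = [];
    for (let i = 0; i < numBaskets; i++) {
        const basket = frames.slice(i * basketSize, (i + 1) * basketSize);
        if (basket.length === 0) {
            break;
        }
        // Serialize one basket at a time instead of one huge string at once.
        parts.push(basket.map(serializeRow).join('\n'));
        // Report progress (in percent) after each basket.
        onProgress(Math.min(100, Math.round(((i + 1) / numBaskets) * 100)));
    }
    return parts.join('\n');
}
```

The final `join` still happens in memory, so this alone does not remove the memory pressure; it mainly buys the chance to keep the user informed.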
I looked into it yesterday and agree with the solution. One problem I see is that we cannot write the file in chunks, since we are saving it with a web download and not with direct file access, so the complete data will have to be in the browser. On my desktop with 64 GB of RAM it takes 8 seconds to export the file above, so it's memory related; but in any case user feedback is required, and processing in chunks is the only way to solve it.
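To make the web-download constraint concrete: with a browser download the whole CSV has to exist in memory as one Blob before saving. A sketch, assuming the data arrives as an array of already-serialized chunk strings:

```javascript
// Sketch: build one Blob from an array of CSV chunk strings. Passing the
// chunks directly to the Blob constructor at least avoids concatenating
// them into one additional giant JavaScript string first.
function buildCsvBlob(chunks) {
    return new Blob(chunks, { type: 'text/csv' });
}

// Browser-only usage (not runnable outside a page, so left as a comment):
// const url = URL.createObjectURL(buildCsvBlob(chunks));
// const a = document.createElement('a');
// a.href = url;
// a.download = 'log.csv';
// a.click();
// URL.revokeObjectURL(url);
```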
Hmm, good argument about the browser. BTW: 64 GB of RAM is not "common" for most users ;-) Is it possible to use the Chrome FileSystem API?
Sorry I'm not an expert in web development
I'm having the same issue with 50 MB files. Usually, when the log file is large, the user doesn't need the whole time span at once. So a quick fix would be to let the user choose which time chunk to export.
Also experiencing this issue with 30-40 MB files. @Mahdi-Hosseinali's solution would work for most people, I suspect; using the In/Out points to define the range would seem a natural fit.
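The In/Out-range idea could be sketched like this. The frame layout and the `time` field name are assumptions for illustration, not the viewer's actual data model:

```javascript
// Hypothetical sketch: export only the frames between the In/Out markers,
// so a large log never has to be serialized in full.
function framesInRange(frames, inTime, outTime) {
    return frames.filter(f => f.time >= inTime && f.time <= outTime);
}
```

Feeding only the filtered frames into the existing export path would keep the change small while avoiding the worst memory spikes.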