Behaviour/performance over 1m points?
Hi
I'm wondering if anyone knows whether the library can handle more than 1 million points? E.g. would it render 10 million points? I'm not too concerned with the initial load time, but if it can keep subsequent updates under 2 s, that would be great.
Hi. It depends on the updates. If your dataset doesn't change much between updates, meaning it stays close to sorted and can be re-sorted with an insertion sort, then it might be possible to stay under 2 s. But if the dataset changes a lot, the library falls back to the browser's native sort over the full dataset, and that is too slow.
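For context (this is a sketch of the general technique, not the library's actual implementation): insertion sort runs in near-linear time on nearly sorted input, which is why small incremental updates can stay fast while a full re-sort cannot.

```javascript
// Sketch: insertion sort. On nearly sorted input the inner loop
// runs only a few times per element, so the whole pass is close
// to O(n); on heavily shuffled input it degrades to O(n^2).
function insertionSort(arr) {
  for (let i = 1; i < arr.length; i++) {
    const value = arr[i];
    let j = i - 1;
    // Shift larger elements one slot to the right.
    while (j >= 0 && arr[j] > value) {
      arr[j + 1] = arr[j];
      j--;
    }
    arr[j + 1] = value;
  }
  return arr;
}
```

So if an update only appends or nudges a handful of points, re-sorting with something like this is cheap even for millions of points; a full native sort is O(n log n) over the whole dataset every time, regardless of how little changed.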
I've found that it performs linearly (and very well) up to 2m-2.5m points, but beyond that it tends to crash. I'm not sure why; it doesn't appear to be a RAM issue.
You could use a profiler (e.g. the Performance panel in Chrome's DevTools) to see which function breaks. The library is not designed for this amount of data, but if it's something obvious I can try to fix it.