Question: Does the accuracy-filter try to maximize accuracy or does it simply log as soon as possible?
A question: how does the accuracy filter work?
I have it set to 30m. I do not want to miss dozens of points because I happen to have a poor GPS signal for a while.
My question: Does this mean that when the accuracy is 28m, GPSLogger will immediately log the point, or will it still try to get a better reading? I would prefer if it did this:
DO
    Get GPS reading
WHILE accuracy is better than previous reading

# we now have the most accurate reading possible
IF accuracy is better than accuracy-filter setting THEN
    Log the point
So even though the accuracy criterion is already met, it still tries to do better. Does it work like that?
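For clarity, here is the "keep improving" strategy I am asking about, as a small Python sketch. All names here are mine, not GPSLogger's: keep sampling while accuracy keeps improving, and only then log the best reading if it passes the filter.

```python
from dataclasses import dataclass


@dataclass
class Reading:
    accuracy_m: float  # reported GPS accuracy in metres (smaller = better)


def log_with_best_accuracy(get_reading, accuracy_filter_m):
    """Hypothetical strategy: sample readings while accuracy keeps
    improving, then log the final (best) one if it meets the filter.
    Returns the logged Reading, or None if it never met the filter."""
    best = get_reading()
    while True:
        nxt = get_reading()
        if nxt.accuracy_m < best.accuracy_m:
            best = nxt  # still improving, keep going
        else:
            break  # accuracy stopped improving
    if best.accuracy_m <= accuracy_filter_m:
        return best  # this is the point we would log
    return None
```

Note that as written this could sample forever if accuracy keeps improving by tiny amounts, which hints at why a real implementation would want a time or iteration cap.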
Or is it more like this:

GPS logger: Give me a reading.
GPS device: OK, it's not accurate yet, but here it is.
GPS logger: Woohoo! Done!
GPS device: Wait, I got a much better one!
GPS logger: No, enough is enough! I'm done!
Sorry for the late reply. The answer is: it logs as soon as possible. There is another setting in there called 'Duration to match accuracy' (60 seconds) - the code will keep trying for a maximum of 60 seconds until it finds a matching point, then stops. That's done to save battery; otherwise each location fix becomes very expensive.
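The behaviour described in the answer can be sketched like this (again a rough illustration with my own names, not GPSLogger's actual code): accept the first fix that meets the accuracy filter, and give up after the 'Duration to match accuracy' window expires.

```python
from dataclasses import dataclass


@dataclass
class Fix:
    accuracy_m: float  # reported GPS accuracy in metres (smaller = better)


def log_first_match(get_reading, accuracy_filter_m, retry_window_s, now):
    """Sketch of 'log as soon as possible': return the FIRST fix whose
    accuracy meets the filter. Keep trying until retry_window_s seconds
    have elapsed (per the 'Duration to match accuracy' setting), then
    give up and return None to save battery."""
    deadline = now() + retry_window_s
    while now() < deadline:
        fix = get_reading()
        if fix.accuracy_m <= accuracy_filter_m:
            return fix  # logged immediately; no waiting for a better one
    return None  # window expired without a matching fix
```

With a 30m filter, a 28m fix is logged the moment it arrives, even if a 10m fix would have come along a second later.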