Improve offline location caching for autosend
This is related to issue #1061
I am uploading to a dawarich server on my LAN. Your upload feature currently only caches uploads when it detects the phone is offline; otherwise the points don't get uploaded. It would be ideal for it to only upload when it can contact my LAN. You mention battery concerns.
The most obvious ideas I can think of, in order of preference:
- Add back-off logic (keep doubling the retry interval until it succeeds)
- Add a connectivity constraint (check if connected to a specific Wi-Fi SSID; one of my other apps does this)
- Give the user a warning about battery and let them decide between battery and data integrity (I could just set it to once a day or once a week)
- Tell the user to do it (show a notification when cached points exist with an option to retry now, and let the user decide)
All these ideas assume you back off if an upload fails. I don't know if you already do this. (If there are 1000 cached points and the first one fails to upload, don't retry the next cached point, since that will probably fail too; wait until the next retry interval. If it succeeds, then try the next point, and so on.) It's probably better for battery life to upload in intervals (every X cached points, every hour, etc.) than every time a point is recorded anyway.
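A rough sketch of the back-off-per-batch idea (purely illustrative; the class and method names are made up, not from GPSLogger):

```java
import java.util.List;

public class CachedPointSender {
    private long retryIntervalMs = 60_000;                        // start at 1 minute
    private static final long MAX_INTERVAL_MS = 6 * 60 * 60_000L; // cap at 6 hours

    /** Sends cached points in order; returns the delay before the next attempt. */
    public long sendBatch(List<String> cachedPoints, PointUploader uploader) {
        for (String point : cachedPoints) {
            if (!uploader.send(point)) {
                // First failure: assume the server is unreachable and stop,
                // instead of failing every remaining point individually.
                retryIntervalMs = Math.min(retryIntervalMs * 2, MAX_INTERVAL_MS);
                return retryIntervalMs;
            }
        }
        retryIntervalMs = 60_000; // success: reset the back-off
        return retryIntervalMs;
    }

    /** Stand-in for however the app actually performs one request per point. */
    public interface PointUploader {
        boolean send(String point);
    }
}
```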
In the latest code, which will be released in the next version, I've added a check for NET_CAPABILITY_VALIDATED, which should improve network detection, although the network detection logic lives in the WorkManager library. There's already back-off logic, which is also part of the same library.
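For reference, a validated-network check on Android looks roughly like this (simplified for illustration; the actual code in the app may differ):

```java
import android.content.Context;
import android.net.ConnectivityManager;
import android.net.Network;
import android.net.NetworkCapabilities;

public final class ConnectivityCheck {
    /** True only if the system has confirmed the active network actually has internet access. */
    public static boolean isNetworkValidated(Context context) {
        ConnectivityManager cm =
                (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
        if (cm == null) return false;
        Network network = cm.getActiveNetwork();
        if (network == null) return false;
        NetworkCapabilities caps = cm.getNetworkCapabilities(network);
        // VALIDATED means connectivity was actually verified, not just that an interface is up.
        return caps != null
                && caps.hasCapability(NetworkCapabilities.NET_CAPABILITY_INTERNET)
                && caps.hasCapability(NetworkCapabilities.NET_CAPABILITY_VALIDATED);
    }
}
```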
I'm not sure which feature dawarich makes use of, whether it's a file upload (GPX, CSV, etc.) or Custom URL logging. For long periods of being offline, Custom URL is not a suitable logging mechanism, since the WorkManager only takes a maximum of 20 'jobs', which also means points will get lost. The next best thing is to use the auto send feature with Custom URL, which will read points from a file and send them to the server one by one.
I guess my use case is different because it's hosted only on my LAN, so I want to upload only when connected to my home LAN via my Wi-Fi SSID (or a VPN, but that's secondary).
I'd rather not expose dawarich to the internet.
dawarich is just Custom URL, I believe.
I guess I'm not sure what the difference is between Custom URL with and without auto-send.
Oh I should explain that.
Normal Custom URL logging: when a point is received, it will make an HTTP request to the target URL.
The "Allow auto sending" Custom URL setting: it actually records a CSV file. Then every 'auto send interval', which is about 60 minutes, the app will send each line from the file to the target URL. This is the bulk-sending approach. This would probably work better with the wifi setting, assuming the dawarich server is on the wifi network you connect to.
I have Custom URL enabled. On the autosend screen, I have "Allow auto sending", "Send on wifi only", and "Custom URL" checked. In "Custom URL" I have "Log to custom URL" and "Allow auto sending" checked.
In the log, I see it trying to send whenever a point is recorded.
When I look in dawarich, it seems to only reflect the points from when I'm at home (connected to wifi).
In the log when it's trying to send, is it succeeding or showing any errors?
You could also try recording a debug log, and having a look in it to see if any errors are being produced when it tries to do the normal logging. I wasn't sure if that's what you meant by the log.
When it autosends the CSV and fails after 3 tries, what happens? Does it try again later?
Yes, it'll retry at the next interval, which is 60 minutes by default, assuming the file name hasn't changed.
File name change? Does it use its own file (i.e. queue.csv) and get bigger as it fills and smaller as it sends, OR does it use the "Log CSV" file (default daily)? If it's the latter, that could explain why I was losing points. But how does it track what's been sent and what hasn't?
If you use the normal file names, like the daily file, e.g. 20251006.gpx, then it'll simply keep retrying at the next interval, usually 60 minutes.
But if you have a dynamic file name where the file name is generated, then an upload is attempted whenever the file name changes. For example, if you have %HOUR_%MINUTE, then every minute the file name changes from 18_51.gpx to 18_52.gpx, and that change triggers an upload attempt. There's no extra tracking involved; it's simply that the new file name doesn't match the old file name, so an upload attempt is made.
Anyway, it's best to stick to longer-lived file names like the daily one, so that the same file keeps being recorded to and re-uploaded regularly.
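The "file name changed" trigger really is that simple; conceptually it's just this (hypothetical names, not the literal code):

```java
public class FileNameChangeDetector {
    private String lastFileName;

    /** Call whenever a point is written; returns true when an auto-send should be attempted. */
    public boolean shouldAutoSend(String currentFileName) {
        boolean changed = lastFileName != null && !lastFileName.equals(currentFileName);
        lastFileName = currentFileName;
        return changed;
    }
}
```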
OK, it seems to be working except for two things.
- Does it only send a certain number each time? I reduced it down to every 1 minute, and after a few hours it seems like it's still sending a lot.
- I switched to a monthly file, but it will still miss something if I happen to be away at the end/beginning of a month. Is there a way to just have a "queue/buffer" file that saves what hasn't been sent yet? I don't really have a need to keep anything on the actual phone other than what hasn't been sent.
Yes, each 'upload' or 'send' of the CSV to Custom URL should resend every single line in the CSV to the target server. The target server needs to be able to handle deduplication.
There's no queue/buffer type setup; it simply sends the files as-is. It should be sending the files whenever the names change.
So it always sends the whole file? For a monthly file:
- Day 1: sends day 1 data
- Day 2: sends day 1 and 2 data
- Day 31: sends day 1 - 31 data
But let's say on the 27th I go out of town and come back the next month, and thus am not connected to the Wi-Fi with my server. When I come back and it reconnects, it will be on the next month's file, and day 27-31's data is lost.
I hope you understand my goal/request: to send all the data to the server, and to "buffer/queue" it if I'm not connected to the server.
I'd like to know this too. I log to OwnTracks using Custom URL, and I like the ability to log offline (plane, vacation without data, just wifi, so back at the hotel), so I can have a few points offline, or hundreds. Now how should I log?
Enabling CSV (or whatever internal format) and letting GPSLogger do its magic would be best: send point after point once online (or more precisely, once the server is reachable; the internet might work but home could be offline for some reason), wait for each point's confirmation, and if it doesn't come, wait again? Is this not how it works?
What about logging to CSV, but having autosend delete lines from the CSV after each successful send? Or better, update some pointer so next time it just continues from the next line? (Thinking about flash wear here, probably silly.)
Thanks for your hard work!
> So it always sends the whole file? For a monthly file:
> - Day 1: sends day 1 data
> - Day 2: sends day 1 and 2 data
> - Day 31: sends day 1 - 31 data
>
> But let's say on the 27th I go out of town and come back the next month, and thus am not connected to the Wi-Fi with my server. When I come back and it reconnects, it will be on the next month's file, and day 27-31's data is lost.
>
> I hope you understand my goal/request: to send all the data to the server, and to "buffer/queue" it if I'm not connected to the server.
I missed this comment. Yes, in a monthly file it'll end up sending every single line from day 1 to 31. In cases where there are 'missed connections', like the 27-31 example, you can use the upload menu to send to Custom URL, and it'll let you pick a CSV file to re-send.
The nature of time, networks, and locations means there are always going to be edge cases and missed connections. The automation in the app is quite rudimentary, and those missed connections will need some manual intervention.
> I'd like to know this too. I log to OwnTracks using Custom URL, and I like the ability to log offline (plane, vacation without data, just wifi, so back at the hotel), so I can have a few points offline, or hundreds. Now how should I log?
>
> Enabling CSV (or whatever internal format) and letting GPSLogger do its magic would be best: send point after point once online (or more precisely, once the server is reachable), wait for each point's confirmation, and if it doesn't come, wait again? Is this not how it works?
>
> What about logging to CSV, but having autosend delete lines from the CSV after each successful send? Or better, update some pointer so next time it just continues from the next line?
The app will, if it's been told an internet connection exists, try sending the lines 3-5 times to the server. If all attempts fail, it simply gives up. In the case of auto send (aka auto upload), if you've set it to go every 60 minutes, it'll keep trying all the lines every 60 minutes.
But suppose all of that fails anyway. What you ought to do, the next time you have a stable connection, is upload manually from the upload menu at the bottom. You can 'upload to' the Custom URL option, pick your CSV files, and let it issue the requests one by one.
OK, I think you've explained the existing behavior well enough. You choose a date-based naming convention for a file, so it can be daily, monthly, etc. It then uploads the entire file at whatever interval you set.
I think I've explained my feature request and the gaps: upload all GPS recordings to a server and keep trying if it can't connect. Don't have gaps and miss uploading some data. Don't lose data.
The current method has issues:
- I don't want to waste battery, so I'm not going to set it to upload every minute; I might do it 1-2 times a day.
- If I had a daily file, I'd be missing all the data recorded after the last attempt each day, assuming the last upload is successful.
- So I chose the longest option, monthly, to try to reduce this gap, but since it doesn't track what it has uploaded, it gets more and more inefficient as the file grows. It's possibly also more likely to lose data, as new data is always at the end.
There are many ways to solve this, but my suggestion would be an option to save to a queue file where new entries are added and successfully uploaded entries are removed. This prevents loss of any data, and it is more efficient because you aren't re-uploading data that has already been uploaded.
I think I have almost the same use case here: I log using only Custom URL to dawarich. I have a VPN server set up on my home network that my phone is always connected to (even on cell data), but the connection isn't perfect.
Anyway, adding to eng3's proposal in their last message (rough sketch at the end of this comment):
- When logging to Custom URL, always use a hidden log/queue file (don't give the user the option to turn it on or off).
- On a new point, if sending it to the server succeeds, parse the queue file into a list of points, try to send them all one by one, and if any fail, write a new queue file with the unsent points.
- On a new point, if sending it to the server fails, append the point to the queue file.
- Other checks like the wifi connection can easily be added in that code.
This imo should be the standard behavior for how log to custom URL works.
If you tell me where to look in the code to implement it, I'll take a look (no promises though).
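Here's a rough, hypothetical sketch of what I have in mind (all names are made up; this isn't how the existing code works):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.ArrayList;
import java.util.List;

public class PointQueueFile {
    private final Path queueFile;

    public PointQueueFile(Path queueFile) {
        this.queueFile = queueFile;
    }

    /** Called for every new point: try to send it, queue it if that fails. */
    public void onNewPoint(String csvLine, PointSender sender) throws IOException {
        if (sender.send(csvLine)) {
            flushQueue(sender); // the server is reachable, so drain any backlog too
        } else {
            Files.write(queueFile, List.of(csvLine), StandardCharsets.UTF_8,
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        }
    }

    /** Send queued points one by one; keep whatever still fails for next time. */
    private void flushQueue(PointSender sender) throws IOException {
        if (!Files.exists(queueFile)) return;
        List<String> pending = Files.readAllLines(queueFile, StandardCharsets.UTF_8);
        List<String> stillPending = new ArrayList<>();
        for (String line : pending) {
            if (!line.isBlank() && !sender.send(line)) {
                stillPending.add(line);
            }
        }
        // Rewrite the queue with only the unsent lines (empty when everything went through).
        Files.write(queueFile, stillPending, StandardCharsets.UTF_8,
                StandardOpenOption.CREATE, StandardOpenOption.TRUNCATE_EXISTING);
    }

    /** Stand-in for the per-point Custom URL request. */
    public interface PointSender {
        boolean send(String csvLine);
    }
}
```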