
Importing old Google Latitude export fails

Open mbirth opened this issue 10 months ago • 4 comments

OS & Hardware Raspberry Pi 5 (ARM64) via Docker

Version 0.23.5

Describe the bug Similar to #728 - I get the same error message when trying to import a 190MB Google Latitude JSON (exported in 2014):

To Reproduce Steps to reproduce the behavior:

  1. Go to Imports
  2. Click on New import
  3. Select "Google Semantic History"
  4. Click on "Choose files" and select "LocationHistory.json"
  5. Click "Create Import"
  6. Wait for a minute
  7. Check notifications for error message

Expected behavior The points in that file should be imported.

Screenshots n/a

Logs

[Import failed](https://example.org/notifications/1)
less than a minute ago
Import "LocationHistory.json" failed: undefined method `flat_map' for nil, stacktrace:
/var/app/app/services/google_maps/semantic_history_parser.rb:42:in `parse_json'
/var/app/app/services/google_maps/semantic_history_parser.rb:14:in `call'
/var/app/app/services/imports/create.rb:12:in `call'
/var/app/app/models/import.rb:17:in `process!'
/var/app/app/jobs/import_job.rb:10:in `perform'
/var/app/vendor/bundle/ruby/3.3.0/gems/activejob-8.0.1/lib/active_job/execution.rb:68:in `block in _perform_job'
/var/app/vendor/bundle/ruby/3.3.0/gems/activesupport-8.0.1/lib/active_support/callbacks.rb:120:in `block in run_callbacks'
...

There's no further helpful message in any of the other outputs, though. (Also see #728 - I seem to be hitting the same issue, which might be related to an unexpected JSON structure.)
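Given the "undefined method `flat_map' for nil" error, my guess is that the Semantic History parser looks for a top-level key (presumably something like "timelineObjects", though that key name is an assumption on my part) that simply isn't present in this old export, which only has a top-level "locations" array. A rough way to check which format a file is in before uploading:

#!/usr/bin/env python3
# Rough format check for Google location exports.
# Assumption: Semantic History files carry a top-level "timelineObjects" key,
# while the old Latitude / "Records" export carries a top-level "locations" key.
import json
import sys

with open(sys.argv[1], "rt", encoding="utf-8") as f:
    data = json.load(f)

if "timelineObjects" in data:
    print("Looks like Google Semantic History")
elif "locations" in data:
    print("Looks like the old Latitude / Records export")
else:
    print("Unknown structure, top-level keys:", sorted(data.keys()))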

Additional context The JSON looks like this (190MB of it):

{
  "somePointsTruncated" : true,
  "locations" : [ {
    "timestampMs" : "1412844278374",
    "latitudeE7" : 515065039,
    "longitudeE7" : 123727043,
    "accuracy" : 56
  }, {
    "timestampMs" : "1412843978435",
    "latitudeE7" : 515064980,
    "longitudeE7" : 123727441,
    "accuracy" : 59
  }, {
    "timestampMs" : "1412843679375",
    "latitudeE7" : 515064751,
    "longitudeE7" : 123727390,
    "accuracy" : 58
  }, {
    "timestampMs" : "1412843366312",
    "latitudeE7" : 515065026,
    "longitudeE7" : 123726772,
    "accuracy" : 55
  }, {
    "timestampMs" : "1412843066604",
    "latitudeE7" : 515064962,
    "longitudeE7" : 123727435,
    "accuracy" : 59
  }, {
    "timestampMs" : "1412842743707",
    "latitudeE7" : 515064971,
    "longitudeE7" : 123727126,
    "accuracy" : 57
  }, {
    "timestampMs" : "1412842432766",
    "latitudeE7" : 515064929,
    "longitudeE7" : 123728006,
    "accuracy" : 62
  }, {
    "timestampMs" : "1412842143710",
    "latitudeE7" : 515064936,
    "longitudeE7" : 123727126,
    "accuracy" : 57
  }, {
    "timestampMs" : "1412841903249",
    "latitudeE7" : 515064907,
    "longitudeE7" : 123727310,
    "accuracy" : 58
  }, {
    "timestampMs" : "1412841606819",
    "latitudeE7" : 515064841,
    "longitudeE7" : 123726896,
    "accuracy" : 55
  }, {
    "timestampMs" : "1412841543530",
    "latitudeE7" : 515064867,
    "longitudeE7" : 123726746,
    "accuracy" : 55,
    "activitys" : [ {
      "timestampMs" : "1412841543964",
      "activities" : [ {
        "type" : "still",
        "confidence" : 100
      } ]
    } ]
  }, {
    "timestampMs" : "1412841485466",
    "latitudeE7" : 515064888,
    "longitudeE7" : 123727097,
    "accuracy" : 57
  }, {
    "timestampMs" : "1412841425430",
    "latitudeE7" : 515064858,
    "longitudeE7" : 123727334,
    "accuracy" : 58
  } ]
}
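The fields themselves decode easily: "latitudeE7" / "longitudeE7" are degrees multiplied by 1e7, and "timestampMs" is a Unix timestamp in milliseconds. A minimal sketch reading the sample above:

import json
from datetime import datetime, timezone

with open("LocationHistory.json", "rt", encoding="utf-8") as f:
    data = json.load(f)

for loc in data["locations"][:5]:   # first few points only
    lat = loc["latitudeE7"] / 1e7   # 515065039 -> 51.5065039
    lon = loc["longitudeE7"] / 1e7  # 123727043 -> 12.3727043
    ts = datetime.fromtimestamp(int(loc["timestampMs"]) / 1000, tz=timezone.utc)
    print(ts.isoformat(), lat, lon, loc.get("accuracy"))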

Trying to import it as "Google Phone Takeout" doesn't throw any error; the import finishes "successfully", but it also doesn't import any points.

mbirth avatar Feb 02 '25 18:02 mbirth

Any solutions here? Got the same problem.

Edit: I stand corrected. Importing it as Phone Takeout does seem to load the points; it just took a while to even start.

HWiese1980 avatar Apr 29 '25 15:04 HWiese1980

@mbirth the location history file belongs to the "Google Phone Takeout" data source. You chose the wrong source.

Freika avatar May 02 '25 18:05 Freika

@Freika I've tried that:

Trying to import it as "Google Phone Takeout" doesn't throw any error; the import finishes "successfully", but it also doesn't import any points.

My export is from 2014 - I guess the format was a bit different back then.

mbirth avatar May 02 '25 19:05 mbirth

@mbirth if you're comfortable, you can send your file to [email protected] and I'll debug the issue

Freika avatar May 03 '25 21:05 Freika

@mbirth if you're comfortable, you can send your file to [email protected] and I'll debug the issue

Having this issue as well. The upload seems to fail silently. Wasn't sure if you still needed something to debug.

sugarfunk avatar May 30 '25 17:05 sugarfunk

In the end I exported everything from Traccar (which is what I had used before, and which already had all my Latitude points) and wrote a small Python script to convert the data into SQL INSERTs for DaWarIch.

mbirth avatar Jun 05 '25 22:06 mbirth

In the end I exported everything from Traccar (which is what I had used before, and which already had all my Latitude points) and wrote a small Python script to convert the data into SQL INSERTs for DaWarIch.

Could you post that Python script if you still have it? I am in the same boat and want to convert from Traccar to Dawarich.

Thank you @mbirth !

boldgear avatar Jun 11 '25 20:06 boldgear

Could you post that Python script if you still have it? I am in the same boat and want to convert from Traccar to Dawarich.

Sure, no problem:

#!/usr/bin/env python3
 
# Export traccar tc_positions table as CSV (e.g. using Adminer)
FILE="traccar-export.csv"
 
 
import csv
import json
import sys
from datetime import datetime
 
# traccar_id --> topic
tcid_map = {
    1: 'owntracks/mb/iPhone 13 Pro',
    2: 'owntracks/mb/iPhone XS',
    3: 'owntracks/mb/iPhone 6S',
    4: 'Google Latitude',
    5: 'owntracks/mb/iPhone 16 Pro'
}
 
with open(FILE, "rt", encoding="utf-8-sig") as f:
    csvreader = csv.DictReader(f)
    i = 0
    for row in csvreader:
        attrs = json.loads(row["attributes"])
        
        # OwnTracks format: https://owntracks.org/booklet/tech/json/#_typelocation
        owntr = {
            "_type": "location",
            "acc": int(row["accuracy"]),
            "alt": int(row["altitude"]),
            "bs": 1,   # 0=unknown, 1=unplugged, 2=charging, 3=full
            "lat": float(row["latitude"]),
            "lon": float(row["longitude"]),
            "t": "p",   # p=ping, u=manual, t=timer, etc.
            "tid": "MB",
            "tst": int(datetime.strptime(row["fixtime"], "%Y-%m-%d %H:%M:%S").timestamp()),
            "topic": tcid_map[int(row["deviceid"])],
        }
        if "batteryLevel" in attrs:
            owntr["batt"] = int(attrs["batteryLevel"])
            
        if float(row["speed"]) > 0:
            owntr["vel"] = float(row["speed"])
            owntr["cog"] = int(row["course"])
 
        #print("### Source data:")
        #print(repr(row))
 
        #print("### Output:")
        #print(json.dumps(owntr))
        
        dwirec = {
            "battery_status": 1,
            "ping": None,
            "battery": owntr["batt"] if "batt" in owntr else None,
            "tracker_id": owntr["tid"],
            "topic": owntr["topic"],
            "altitude": owntr["alt"],
            "longitude": owntr["lon"],
            "velocity": str(owntr["vel"]) if "vel" in owntr else None,
            "trigger": 0,
            "bssid": None,
            "ssid": None,
            "connection": 0,
            "vertical_accuracy": None,
            "accuracy": owntr["acc"],
            "timestamp": owntr["tst"],
            "latitude": owntr["lat"],
            "mode": None,
            "inrids": None,
            "in_regions": None,
            "raw_data": json.dumps(owntr),
            "import_id": None,
            "city": None,
            "country" :None,
            "created_at": row["fixtime"],
            "updated_at": row["fixtime"],
            "user_id": 1,
            "geodata": "{}",
            "visit_id": None,
            "reverse_geocoded_at": None,
            "course": owntr["cog"] if "cog" in owntr else None,
            "course_accuracy": None,
            "external_track_id": None,
        }
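        # Start a new INSERT statement every 10 rows; rows within a batch are joined with commas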
        
        if i%10 == 0:
            if i>0:
                print(";")
            keys = dwirec.keys()
            keystring = '"' + '", "'.join(keys) + '"'
            print(f'INSERT INTO "points" ({keystring}) VALUES ')
        elif i>0:
            print(",")
        
        values = []
        for k in keys:
            v = dwirec[k]
            if type(v) is str:
                if "'" in v:
                    v = v.replace("'", "''")  # double the quote (standard SQL escaping, works with PostgreSQL)
                values.append("'" + v + "'")
            elif v is None:
                values.append("NULL")
            else:
                values.append(str(v))
        
        valuestring = ", ".join(values)
        print(f"({valuestring})", end="")
        
        i += 1
        #if i > 10:
        #    break
 
print(";")
print(f"{i} records converted.", file=sys.stderr)

Make sure to adjust the traccar_id-to-OwnTracks-topic map (tcid_map) for your own devices.

(I had first generated OwnTracks records to push via the Dawarich API, but that took ages, so I switched to generating SQL output instead.)
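For reference, the API-based push looked roughly like this. This is a sketch from memory: the endpoint path, the api_key parameter and the URL are assumptions based on the OwnTracks integration and my setup, so double-check them against your instance:

import requests

DAWARICH_URL = "http://localhost:3000"   # adjust to your instance
API_KEY = "your-api-key"                 # taken from the account settings page

# One OwnTracks-style record, shaped like the owntr dict in the loop above
owntr = {
    "_type": "location",
    "lat": 51.5065039,
    "lon": 12.3727043,
    "tst": 1412844278,
    "acc": 56,
    "tid": "MB",
    "topic": "Google Latitude",
}

# Endpoint path is an assumption from the OwnTracks integration; verify it for your version
resp = requests.post(
    f"{DAWARICH_URL}/api/v1/owntracks/points",
    params={"api_key": API_KEY},
    json=owntr,
    timeout=30,
)
resp.raise_for_status()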

The conversion script writes the SQL to STDOUT, so you'll want to redirect it into a file, e.g.:

./convert.py > dawarich.sql
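I then loaded that file into the Postgres database. The container name, user and database below are from my docker-compose setup and are only assumptions for yours, so adjust them accordingly:

docker compose exec -T dawarich_db psql -U postgres -d dawarich_development < dawarich.sql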

mbirth avatar Jun 11 '25 23:06 mbirth