Migrate from dump.json to redis?
There doesn't seem to be a way to migrate a `dump.json` file into a new ethercalc instance. Maybe I'm missing something? I'm moving my ethercalc instance to a server with Docker, and it was pretty easy to set up a Redis server there, so I did. But I couldn't see a way to import the data from the old server (which was using just a JSON file as its database).
A script that takes all `snapshot-*` and `audit-*` keys from `dump.json`, then connects to Redis and `SET`s them, would do the trick.

Would someone like to help write it? Node.js preferred, but another language is fine too. We can put it into `misc/` or include it as an option in ethercalc.
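The idea can be sketched in Python. This is only an illustration of the key-filtering step: the sample dump contents are made up, and a plain dict stands in for a real Redis client (a real run would call `redis.StrictRedis(...).set` instead):

```python
import json
import re

# Only snapshot-* and audit-* keys should be migrated.
KEY_RE = re.compile(r"^(snapshot|audit)-")

def migratable_items(dump):
    """Yield the (key, value) pairs that should be copied into Redis."""
    for key, value in dump.items():
        if KEY_RE.match(key):
            yield key, value

# Made-up example of what a dump.json might contain:
dump = json.loads('''{
    "snapshot-budget": "socialcalc:version:1.0...",
    "audit-budget": "...",
    "something-else": "not migrated"
}''')

# A plain dict stands in for the Redis client's set() here:
store = {}
for key, value in migratable_items(dump):
    store[key] = value

print(sorted(store))  # ['audit-budget', 'snapshot-budget']
```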
I don't have my environment set up properly to test it yet, but I have a tentative script written in LiveScript.
```livescript
require! <[ fs redis ]>

dump2redis = module.exports = (filePath, client) ->
  dump = JSON.parse do
    fs.readFileSync filePath, \utf8
  for key, val of dump when /^(snapshot|audit)-/.test key
    client.set key, val

if require.main == module
  [dumpFile, redisHost, redisPort, redisPass] = process.argv.slice(2)
  dumpFile ?= process.cwd! + '/dump.json'
  redisHost ?= \localhost
  redisPort ?= 6379
  client = redis.createClient redisPort, redisHost
  if redisPass
    client.auth redisPass
  dump2redis dumpFile, client
```
I'll submit an actual pull request once I can test it, but if you want to try using it feel free.
For the record, after searching for such a tool in vain, I wrote the same thing in Python 2. It has been tested and worked as expected.
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import redis
import json
import re

DUMP = '/root/dump.json'

fd = open(DUMP, 'r')
JS = json.loads('\n'.join(fd.readlines()))
fd.close()

snap_regex = re.compile('snapshot-.*')
audit_regex = re.compile('audit-.*')

redisClient = redis.StrictRedis(host='localhost', port=6379, db=0)

for key in JS:
    if snap_regex.match(key) or audit_regex.match(key):
        print "[+] Inserting %s..." % key
        redisClient.set(key, JS[key])
```
I am interested too, but I don't seem to have a `dump.json` file; instead I have a bunch of files in a `dump` directory. I am migrating to a new server which runs Docker and thus uses Redis instead of local storage. I am not sure how to proceed to migrate the data.
OK, I have found a solution to my own problem. I'm posting it here in case someone else needs to do the same. I used the REST API to send all sheets to the new server with a simple Python script. I don't know how general it is.
```python
from pathlib import Path

import requests

ethercalc_dir = Path("/home/calc/ethercalc")
dump_dir = ethercalc_dir / "dump"
remote_server = "https://newserveur.tld"

for f in dump_dir.glob("snapshot*_formdata.txt"):
    sheetname = f.name[9:-13]
    # Get SocialCalc data from local server
    r = requests.get(f"http://127.0.0.1:8000/_/{sheetname}")
    data = r.text
    # Create spreadsheet in remote server
    r = requests.post(f"{remote_server}/_", json={"room": sheetname, "snapshot": data})
    print(sheetname, r)
```
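The `f.name[9:-13]` slice relies on the per-sheet files being named `snapshot-<sheet>_formdata.txt` (at least that is how they are named in my `dump` directory); written out with the lengths made explicit, that step is:

```python
from pathlib import Path

PREFIX = "snapshot-"        # 9 characters
SUFFIX = "_formdata.txt"    # 13 characters

def sheet_name(path):
    """Recover the sheet (room) name from a per-sheet dump file name."""
    name = Path(path).name
    assert name.startswith(PREFIX) and name.endswith(SUFFIX)
    return name[len(PREFIX):-len(SUFFIX)]

print(sheet_name("snapshot-budget2024_formdata.txt"))  # budget2024
```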
`dump.json` was split into one file per sheet, maybe 2 years ago. There is no JSON file any more, just a file for each sheet.