User data export and import with relays.
Hello there, I am thinking about a NIP that would allow users to download all of their events stored on one relay to their local machine, so that later they can upload those events to another relay. This should be quite useful when one wants to migrate their data from one relay to another.
Let me know your thoughts!
Why not just ask the relay for all the data using a normal request?
Yeah, I think you can do that on the client side with some queries, but I haven't seen any client that offers this function yet. It would be quite convenient if this were implemented at the protocol level: every relay would offer this simple function, and a client could request the event data for one user with a single GET request. Plus, it could also offer a quick and simple way for a relay to import a user's data in one shot (maybe just one zip file containing all of a user's events?).
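For what it's worth, a client can already approximate the export half with a standard NIP-01 subscription. Here is a minimal TypeScript sketch (assuming a global WebSocket, e.g. a browser or a recent Node; the relay URL and pubkey are whatever you pass in). Note that many relays cap the number of results per filter, so a real exporter would likely have to page with `until`:

```typescript
// Sketch: export all of one user's events from a relay using a plain
// NIP-01 REQ, collecting everything until the relay signals EOSE.
type NostrEvent = {
  id: string;
  pubkey: string;
  created_at: number;
  kind: number;
  tags: string[][];
  content: string;
  sig: string;
};

function exportUserEvents(relayUrl: string, pubkeyHex: string): Promise<NostrEvent[]> {
  return new Promise((resolve, reject) => {
    const ws = new WebSocket(relayUrl);
    const events: NostrEvent[] = [];
    const subId = "export";

    ws.onopen = () => {
      // Ask for everything this author has stored on the relay.
      ws.send(JSON.stringify(["REQ", subId, { authors: [pubkeyHex] }]));
    };

    ws.onmessage = (msg) => {
      const data = JSON.parse(msg.data.toString());
      if (data[0] === "EVENT" && data[1] === subId) {
        events.push(data[2] as NostrEvent);
      } else if (data[0] === "EOSE" && data[1] === subId) {
        // End of stored events: the relay has sent everything it keeps.
        ws.send(JSON.stringify(["CLOSE", subId]));
        ws.close();
        resolve(events);
      }
    };

    ws.onerror = () => reject(new Error(`websocket error talking to ${relayUrl}`));
  });
}
```

What a NIP would add on top of this is the import direction and a guarantee that every relay exposes the full stored history in one predictable way.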
Clients should store their own data, back-up and export should be part of the client.
@fabianfabian And that's exactly why I am working on https://github.com/evoluhq/evolu
What happens if one uses several clients? How should the merge of data from different sources take place? Plus, the data is not just for myself: I want it to be available on a new relay I migrate to. How would that happen?
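On the merge question specifically: since events are immutable and identified by their id (a hash over the serialized event), merging exports from several clients or relays reduces to a union de-duplicated by id. A sketch, reusing the NostrEvent type from above:

```typescript
// Sketch: merge event exports from several sources. Because events are
// content-addressed by id, the union keyed by id loses nothing.
function mergeEvents(...sources: NostrEvent[][]): NostrEvent[] {
  const byId = new Map<string, NostrEvent>();
  for (const source of sources) {
    for (const ev of source) byId.set(ev.id, ev);
  }
  return [...byId.values()];
}
```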
Added Download my data feature request to Damus https://github.com/damus-io/damus/issues/527
Should be export data, as download implies the client is downloading your data from the relays. Ideally the client stores all your events before they are published.
OK, changed from download to export to be consistent with the #214 title.
There is an issue I picked up where I would like to add more detail but don't have all the background: Local Caching: https://github.com/damus-io/damus/issues/409. I don't think Damus has this yet, and I'm not sure about other clients. Is local caching a way to implement your suggestion that the client store all events prior to publication?
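It would cover it as long as the cache is written before the publish, not after a round-trip. A minimal sketch of that ordering (Node-only; `publishAndCache` and the default path are made-up names for illustration):

```typescript
// Sketch of "store before publish": persist each signed event locally
// (here appended to a JSONL file) before it goes to any relay, so a
// later export never depends on what relays happened to retain.
import { appendFileSync } from "node:fs";

function publishAndCache(ws: WebSocket, ev: NostrEvent, cachePath = "events.jsonl"): void {
  appendFileSync(cachePath, JSON.stringify(ev) + "\n"); // cache first...
  ws.send(JSON.stringify(["EVENT", ev]));               // ...then publish
}
```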
I made https://github.com/patrkris/migstr to pull and push events from/to relays
So glad to see such a simple and elegant solution!
+1 for a NIP for this. Since there are something like 10 different relay implementations and new users/admins can't decide during early testing, trying different relays would lead to split/missing data.
Also, a relay operator/admin can't know all of their users, nor contact them, to handle user-data backup and restore on a new relay.
Since data retention is an essential part of running a nostr relay, there should be a generic NIP; otherwise every relay implementation will do backup/restore/export/import its own way, and that will lead to incompatible data formats...
If there is a NIP for that, compatibility is guaranteed across all relays that implement it, and it gives server/relay operators the freedom to migrate easily between relay software...
For example, right now there are 3 major relay implementations (based on the relay version list here: https://nostr.petrkr.net/relays.htm), but who knows which will win and which will stop maintaining their server...
It's like the plain Electrum server written in Python being replaced by Fulcrum for big instances or electrs for personal usage... But an Electrum server can resync from blockchain data. Relay data cannot be restored, because nostr does not sync data across all nodes (as a blockchain does).
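One candidate format such a NIP could standardize (just an assumption here, not something settled in this thread) is plain NIP-01 events, one JSON object per line (JSON Lines), since every relay already produces and consumes exactly this structure:

```json
{"id":"<32-byte hex event id>","pubkey":"<32-byte hex>","created_at":1670000000,"kind":1,"tags":[],"content":"hello nostr","sig":"<64-byte hex schnorr signature>"}
```

An import is then just replaying each line as an ["EVENT", ...] message, and since every event carries its own signature, the receiving relay can verify the data on the way in without any extra metadata.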
nostril-query | nostcat --stream <from> | jq -c '[.[0], .[2]]' | nostcat <to>
This seems to take only current (and future) events, not historical ones. So it could be used to sync two nodes, but not to copy old, saved events.
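To copy the old events as well, one option in the spirit of the sketches above is to pull the stored history with the export function and replay each event to the target relay, checking the ["OK", ...] acknowledgements (a sketch, not a hardened tool; some relays may refuse events with old timestamps):

```typescript
// Sketch: replay previously exported (historical) events to a new relay.
// Each ["EVENT", ...] should get back an ["OK", <id>, <accepted>, <message>]
// acknowledgement from the relay.
function importEvents(relayUrl: string, events: NostrEvent[]): Promise<void> {
  return new Promise((resolve, reject) => {
    const ws = new WebSocket(relayUrl);
    let acked = 0;

    ws.onopen = () => {
      for (const ev of events) ws.send(JSON.stringify(["EVENT", ev]));
    };

    ws.onmessage = (msg) => {
      const data = JSON.parse(msg.data.toString());
      if (data[0] === "OK") {
        if (!data[2]) console.warn(`relay rejected ${data[1]}: ${data[3]}`);
        if (++acked === events.length) {
          ws.close();
          resolve();
        }
      }
    };

    ws.onerror = () => reject(new Error(`websocket error talking to ${relayUrl}`));
  });
}
```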