Sharing files one-off via http
Hi y'all; brilliant work you're doing here! :wave:
I mentioned to @mafintosh on IRC that a common use case I have is to want to share a single file (or disparate set of files) with a friend. Sometimes that friend won't have dat installed, so it's nice when I can share it over HTTP. Here are two ways I go about doing this:
1. oldweb
ssh eight45.net 'mkdir /tmp/fun'
scp file1.png ~/dev/file2.js eight45.net:/tmp/fun
ssh eight45.net 'ecstatic /tmp/fun -p 9111'
From here I can tell my friend to go to http://eight45.net:9111/ to see the files I shared.
2. ipfs
ipfs add -w file1.png ~/dev/file2.js
added QmSJ1Upq9RXzi167LwVpTqdBwXoemf5YDv3XDARqbRLPKE file1.png
added QmcGA3wQ2uBMQcEK1ae7rfY1T62v35qVrHaGPTJUQbhyzD file2.js
added QmNyNKUsiGpHcXjRjQjE49nfN4ayMAqac7YwkHwCf8x5te
From here I can tell my friend to visit https://ipfs.io/ipfs/QmNyNKUsiGpHcXjRjQjE49nfN4ayMAqac7YwkHwCf8x5te and they can access the files.
Maybe there are already ways to do this with dat that I don't know about, but here are the challenges I see:
- `dat share foo.png` will create a dat for everything in the current directory. Having some way to make a transparent, ephemeral `.dat` directory for sharing a few ad-hoc files would be useful.
- Some way for others to access the dat over HTTP. IPFS makes everything accessible through its gateway on ipfs.io. Hashbase looks promising -- maybe permit anonymous use for small files?
Hey @noffle! 👋
A free public gateway wouldn't be too hard to build. Dat provides filesizes in the metadata prior to downloading the content, so you can easily apply a size limit and give it a short-lived cache.
Maybe we should build one into hashbase?
The IPFS folks have been running a public gateway for years now, and it's been a huge asset to the community & I think contributed a lot to its growth. Would be very jazzed to see this in datland.
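To make that size-limit idea concrete, here is a rough sketch of a size-capped gateway handler. It assumes hyperdrive 9-style APIs (`archive.stat`, `archive.createReadStream`) and made-up names like `MAX_BYTES`; it's an illustration, not an existing tool, and it leaves out the swarming/replication a real gateway would need.

```js
// Rough sketch of a size-capped gateway (illustrative only).
// archive.stat reads metadata, so oversized files can be rejected
// before any content is downloaded.
const http = require('http')
const hyperdrive = require('hyperdrive')
const ram = require('random-access-memory')

const MAX_BYTES = 10 * 1024 * 1024 // example cap: 10 MB per file

function createHandler (archive) {
  return function (req, res) {
    const name = decodeURIComponent(req.url)
    archive.stat(name, function (err, stat) {
      if (err) {
        res.statusCode = 404
        return res.end('not found')
      }
      if (stat.size > MAX_BYTES) {
        res.statusCode = 413
        return res.end('file too large for this gateway')
      }
      res.setHeader('Cache-Control', 'public, max-age=60') // short-lived cache
      archive.createReadStream(name).pipe(res)
    })
  }
}

// Example wiring: serve a single sparse archive by key
// (swarming/replication omitted for brevity).
const archive = hyperdrive(ram, process.argv[2], { sparse: true })
archive.on('ready', function () {
  http.createServer(createHandler(archive)).listen(8080)
})
```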
If I can, I'll throw together the idea I have in my head tonight:
- A new CLI tool, `dat-share` maybe?
- Same tool is used for both running the service and uploading to the service, and P2P transfers.
- Has a local config file (see the sketch after this list) containing:
  - Auth codes and addresses for remote servers.
  - Auth codes for clients that the local server allows uploads from.
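Purely as an illustration of that config idea, the file might look something like this (every field name here is made up, nothing like this exists in released dat tooling yet):

```js
// Hypothetical local config for the proposed tool; all field names
// are placeholders for illustration only.
module.exports = {
  // remote servers this client may upload to, with their auth codes
  remotes: [
    { address: 'share.example.com:3282', authCode: 'REPLACE_ME' }
  ],
  // auth codes of clients this server accepts uploads from
  allowedClients: [
    'some-other-auth-code'
  ],
  // port the local HTTP side listens on
  httpPort: 8080
}
```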
We already have this on datproject.org/
I wrote a quick tool that has two parts.
The server listens on a TCP port for clients to connect. The protocol is simply the 32-byte key followed by the hyperdrive replication protocol.
The server also has an HTTP server that handles requests on-demand from HTTP clients using @joehand's hyperdrive-http.
I want the server to create a sparse hyperdrive that replicates from the client over TCP on demand. When the client disconnects, I deregister the key from the http server.
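For readers following along, here is a stripped-down sketch of that flow. It is not drop-dat's actual code (see the link below for that); it assumes hyperdrive 9-style APIs, @joehand's hyperdrive-http, and in-memory storage, and skips error handling.

```js
// Stripped-down sketch of the server described above (not drop-dat's
// actual code). A client connects over TCP, sends the 32-byte archive
// key, then speaks the hyperdrive replication protocol; files are then
// served to browsers on demand over HTTP.
const net = require('net')
const http = require('http')
const hyperdrive = require('hyperdrive')
const hyperdriveHttp = require('hyperdrive-http')
const ram = require('random-access-memory')

const archives = new Map() // hex key -> archive, registered while a client is connected

const tcpServer = net.createServer(function (socket) {
  socket.once('readable', function onKey () {
    const key = socket.read(32)
    if (!key) return socket.once('readable', onKey) // wait until all 32 bytes arrive
    const archive = hyperdrive(ram, key, { sparse: true })
    archive.on('ready', function () {
      archives.set(key.toString('hex'), archive)
      // everything after the key is hyperdrive replication
      const stream = archive.replicate({ live: true })
      socket.pipe(stream).pipe(socket)
      // deregister the key when the uploading client disconnects
      socket.on('close', function () {
        archives.delete(key.toString('hex'))
      })
    })
  })
})
tcpServer.listen(3282)

// HTTP side: requests look like /<hex-key>/<path> and are answered
// on demand via hyperdrive-http.
http.createServer(function (req, res) {
  const parts = req.url.split('/')
  const archive = archives.get(parts[1])
  if (!archive) {
    res.statusCode = 404
    return res.end('unknown or disconnected dat')
  }
  req.url = '/' + parts.slice(2).join('/')
  hyperdriveHttp(archive)(req, res)
}).listen(8080)
```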
The problem I'm getting is something is wrong with sparse mode and only the first file requested by a browser will download.
https://github.com/creationix/drop-dat/blob/master/drop-dat.js
@creationix that is excellent (drop-dat, I meant). That kind of simplicity is exactly what I was after, as I want to run it myself without relying on third-party servers. Would love to see a systemd init script to run a server permanently!
I agree with @noffle that an ipfs.io-style public gateway would be a great asset for the community. datbase.org only seems to give you a preview, not the raw data.
What are the barriers to setting one up?
There are dat gateways on GitHub; you just have to shoulder the cost of running one. They also have problems with URLs not working right.
What if there was a configurable size cap on the gateways? Most times I just want to share small files, not big datasets.
Could you use nginx rate-limiting too?
It's up to whoever runs one. We decided to do the hashbase model where it's always connected to your personal resource usage, and that's how we manage it. (We should make it a little easier/faster to add dats on hashbase so that you could get the benefit of a gateway.)
I've been working with https://github.com/DaMaHub to try to get a gateway set up.
It works for a time, but then we get this error:
node: ../deps/uv/src/unix/udp.c:67: uv__udp_finish_close: Assertion `!uv__io_active(&handle->io_watcher, 0x001 | 0x004)' failed. Aborted (core dumped)
We're running Ubuntu 16.04.4 LTS with node v8.11.1
Has anyone run into this before?
Sorry, should mention that we're using https://github.com/pfrazee/dat-gateway
@edsilv could be a node bug (here is the line that's throwing). I'd assume that if it were a bug in how something is using node, it wouldn't core dump; it'd emit an 'error' event on the object or throw a JS exception. Perhaps not, though. @mafintosh, could that be an interaction with the utp lib?
Hey, just to let you know that updating to node v10.9.0 and rebooting seems to have fixed it.