Slow upload speed for HTTP server
I tested upload speed for various file sizes (10M, 100M, 1000M) and compared against Node.js. Deno is about 2x-3x slower, and on a 1G file Deno is 10x slower and uses almost all CPU resources (my Ubuntu 20.04 VPS has 2 CPU cores and 2G of RAM).
What am I doing wrong, or how can I improve upload speed?
I send test data to the server (localhost) via curl:
time curl -H "Content-Type:application/octet-stream" --data-binary @/dir/dev/foo1000M.bin http://127.0.0.1:8080/upload/foo.bin
The test file for uploading was created with this command:
truncate -s 10M foo10M.bin
import { serve } from "https://deno.land/[email protected]/http/server.ts";
import { config } from "./modules/config.js";
import * as path from "https://deno.land/std/path/mod.ts";
import { writableStreamFromWriter } from "https://deno.land/std/streams/mod.ts";

const PORT = 8080;

async function handler(req) {
  const url = new URL(req.url);
  console.log(`>>>new request: ${req.method}, ${req.url}, path: ${url.pathname} params: ${url.searchParams}`);
  console.log("HEADERS:", req.headers);
  if (req.body) {
    console.log("==>hasBody!");
    // Open the destination file and stream the request body into it.
    const file = await Deno.open(path.join(config.upload_dir, "native_deno.bin"), { create: true, write: true });
    const writableStream = writableStreamFromWriter(file);
    await req.body.pipeTo(writableStream);
  }
  console.log(">>upload complete!");
  return new Response("upload complete!");
}

serve(handler, { port: PORT });
console.log(">>>server listen...", PORT);
Results:
for 10M file: Deno: 60-90 ms, Node: 35-40 ms
for 100M file: Deno: 500-600 ms, Node: 260 ms
for 1000M file: Deno: 20 s-5 min, Node: 1000-1700 ms
I repeated the upload test multiple times, and with the 1G file the Deno server becomes slower and slower every time (for the 10M and 100M test files I don't see any performance reduction), while the Node server's upload time stays the same every time.
Please include your node code too.
This is my simple Node.js test code:
const express = require('express')
const morgan = require('morgan')
const fs = require('fs')
const path = require('path')
const stream = require('stream/promises')

const app = express()
const port = 8081

app.use(morgan('tiny'))

app.get('/', (req, res) => {
  res.send('Hello World! ' + new Date())
})

app.post("/upload/:filename", async (req, res) => {
  // check file name
  if (!req?.params?.filename || !/^[0-9a-zA-Z\.\-\_]+$/.test(req?.params?.filename)) {
    return res.send("Invalid file name!")
  }
  const fs_stream = fs.createWriteStream(path.join("/dir/www/upload", req.params?.filename), { flags: "w" })
  // stream request body to file
  await stream.pipeline(req, fs_stream)
  res.send("File successfully uploaded to node server!")
  fs_stream.close()
})

app.listen(port, () => {
  console.log(`==>listening on port ${port}`)
})
When I test uploading a 1G file multiple times, I see performance degrade.
Duplicate of #10157?
@kitsonk Maybe - I have some stuff to do on Monday, but I'll try to investigate this on Tuesday.
@devalexqt So I can reproduce Deno being slower than Node here, but not catastrophically slower like your benchmark suggests. I measured 1GB uploads to the Node server at 1.24 secs, and the Deno upload at 3.40 secs. I am using your exact script and commands, except that I am writing to /tmp instead of cwd.
Is there anything else I should know to be able to reproduce your results? What Deno version are you on?
Try uploading a 1G file multiple times in a row; every time it gets slower and slower...
2x-3x slower than Node looks catastrophic in my eyes; something is wrong in Deno's code under the hood.
I'm using the latest Deno version, and Node 16.
Also, I have a server with 2 CPUs and 2G of RAM.
I'll try that, thanks.
2x-3x slower than Node looks catastrophic in my eyes; something is wrong in Deno's code under the hood.
I agree it isn't great, but 5s is much faster than 5 minutes (60x faster) 😅
Also, I have a server with 2 CPUs and 2G of RAM.
Ah, I think this may be related. I'll take this into account.
Deno also uses almost all CPU resources while uploading a 1G file.
Any news?
No.
We've recently added the ability to use Web Streams directly with file system files.
https://deno.com/blog/v1.19#files-network-sockets-and-stdio-are-now-native-web-streams
@devalexqt I'd be interested to see your benchmark again with this new interface:
async function handler(req) {
  const url = new URL(req.url);
  console.log(`>>>new request: ${req.method}, ${req.url}, path: ${url.pathname} params: ${url.searchParams}`);
  console.log("HEADERS:", req.headers);
  if (req.body) {
    console.log("==>hasBody!");
    // Create the destination file and pipe the request body straight into its writable stream.
    const file = await Deno.create(path.join(config.upload_dir, "native_deno.bin"));
    await req.body.pipeTo(file.writable);
  }
  console.log(">>upload complete!");
  return new Response("upload complete!");
}
Thanks, will try today.
Hi! Tested new version:
deno --version
deno 1.19.0 (release, x86_64-unknown-linux-gnu)
v8 9.9.115.7
typescript 4.5.2
I tested 1G file uploading, and with the new version the total upload time drops from 16.5 to 5.3 seconds (a 3x improvement, and the result is more stable when I repeat the test multiple times). But Node.js takes 2.3 seconds (2x faster) and uses half the CPU on my VPS server (2 CPUs & 2G RAM).
I also want to try running the test on a 4-8 core CPU.
Hi!
Running the tests on another VPS with 4 CPUs and 8 GB of RAM gives exactly the same result.
Basically, Deno gets 1G / 5 s = 200 MByte/second throughput, but Node gets about 450 MByte/s.
So Deno is 2x slower, and that's a huge difference.
How can we improve it?
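For reference, one quick way to sanity-check those throughput numbers from the server side is to count bytes as they arrive and skip the disk write entirely; this is only a sketch, and the counting handler and its name are mine, not part of the benchmark above:

// Sketch: measure raw receive throughput by counting bytes and discarding them.
// "timedHandler" is illustrative; it is not the handler used in the benchmark.
async function timedHandler(req) {
  const start = performance.now();
  let bytes = 0;
  const counter = new WritableStream({
    write(chunk) { bytes += chunk.byteLength; },
  });
  await req.body?.pipeTo(counter);
  const secs = (performance.now() - start) / 1000;
  console.log(`received ${(bytes / secs / (1024 * 1024)).toFixed(0)} MB/s`);
  return new Response("ok");
}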
Tested again with new Deno and std versions:
deno 1.19.2
[email protected]
Deno is still 2x slower than Node: 5.5 s (Deno) vs 2.3 s (Node).
Tested the new Deno release:
deno 1.21.1
[email protected]
Deno is slightly faster than the previous version (4.5 s), but still 2x slower than Node (2.3 s).
We are actively working on improving the performance of the HTTP server.
I tested the same server with the new Deno.serve(); the result is the same: 2x slower than Node.
deno 1.25.0 (release, x86_64-unknown-linux-gnu)
v8 10.6.194.5
typescript 4.7.4
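For context, here is a minimal sketch of the same upload handler wired to Deno.serve; it uses the current options-first signature and a placeholder output path, and the unstable server in 1.25 took its options in a slightly different form:

// Sketch: the upload handler on Deno.serve. The port and output path are
// placeholders, and the 1.25 unstable API differed slightly from this form.
Deno.serve({ port: 8080 }, async (req) => {
  if (req.body) {
    const file = await Deno.create("/tmp/native_deno.bin");
    await req.body.pipeTo(file.writable);
  }
  return new Response("upload complete!");
});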
Same slow results.
deno 1.25.1 (release, x86_64-unknown-linux-gnu)
v8 10.6.194.5
typescript 4.7.4
ReadableStreams are quite slow compared to the old Reader/Writer interface because the buffer is not reused. Allocating a new Uint8Array for every chunk is one of the reasons for the slow performance on big uploads.
https://github.com/denoland/deno/blob/d7b27ed63bf74c19b0bc961a52665960c199c53a/ext/http/01_http.js#L383-L390
By reusing the buffer (not using a ReadableStream), uploading a 2 GB file takes around 3 seconds; the same upload using a ReadableStream takes 4.8 seconds on my computer.
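Roughly, the difference looks like this; a sketch assuming the old Deno.Reader and Deno.FsFile interfaces available at the time (the helper name and the 64 KiB buffer size are mine):

// Sketch: copy with one reused buffer via the old Reader interface,
// instead of receiving a freshly allocated Uint8Array per chunk.
const buf = new Uint8Array(64 * 1024); // reused for every read
async function copyWithReusedBuffer(src: Deno.Reader, dst: Deno.FsFile) {
  while (true) {
    const n = await src.read(buf);   // fills the same buffer each time
    if (n === null) break;           // EOF
    let off = 0;
    while (off < n) {                // handle partial writes
      off += await dst.write(buf.subarray(off, n));
    }
  }
}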
I love having spec-compliant Response and Request, but when dealing with a lot of data it is nice to have some lower-level APIs. I think we should expose lower-level APIs for users who really need that speed.
The same bottleneck happens in fetch when downloading large files.
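A minimal sketch of that fetch case, with a placeholder URL and output path: every chunk delivered through res.body is a newly allocated Uint8Array, so large downloads pay the same per-chunk allocation cost.

// Sketch: downloading a large file with fetch. Each chunk surfaced by
// res.body is a fresh Uint8Array; the URL and path are placeholders.
const res = await fetch("http://127.0.0.1:8080/big.bin");
const out = await Deno.create("/tmp/big.bin");
if (res.body) {
  await res.body.pipeTo(out.writable);
}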
I'm working on this, I'll try to have a PR by tomorrow.
Xref #16046
Same slow results.
deno 1.26.1 (release, x86_64-unknown-linux-gnu)
v8 10.7.193.3
typescript 4.8.3