fileUpload directive does not respect Content-Transfer-Encoding header
Consider the example:
(post & path("upload")) {
  fileUpload("myFile") { case (FileInfo(fieldName, fileName, contentType), fileSource) =>
    // handle this request somehow
  }
}
and this curl request:
curl -F 'myFile=@path/to/some/file.dat;encoder=base64' http://localhost/upload
It creates an HTTP request which looks like the following:
POST /upload HTTP/1.1
Host: localhost
Content-Length: ???
Content-Type: multipart/form-data; boundary=------------------------abcdefghijk

--------------------------abcdefghijk
Content-Disposition: form-data; name="myFile"; filename="file.dat"
Content-Transfer-Encoding: base64

GuCpMEOL1qXwbDU9e4gqdz6EO9czFKzGV2b+i7dOzZZeAmQpyQBkM7hFXzPmZx4o
es0i0I5jWjReuvAWq25AKeK4XkgRIv7pOWPMFrX8wE4QyVPBLVoNz9uRBW81GSUb
NMnzYWQ9QdCcLOvzqwGdHWbCn64/h2ASX2QGoSP6KQ0=
--------------------------abcdefghijk--
The fileUpload directive ignores the Content-Transfer-Encoding header and delivers the body part data as the raw, still base64-encoded bytes. It also gives the handling code no way to find out which encoding scheme was actually used in the request. If clients may use different encoding schemes against a single route backed by the fileUpload directive, there is no way to handle the uploaded files correctly.
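For reference, a possible workaround (not something the fileUpload directive itself offers) is to drop down to the Multipart.FormData model, inspect each part's raw headers, and decode base64 manually. The sketch below assumes the multipart parser preserves Content-Transfer-Encoding in BodyPart.headers, which is exactly what is in question in this issue; the route path and field name are taken from the example above, and uploadRoute is a hypothetical name.

import java.util.Base64

import akka.http.scaladsl.model.Multipart
import akka.http.scaladsl.server.Directives._
import akka.stream.scaladsl.Sink
import akka.util.ByteString

// Sketch: read each part's raw headers and decode base64 ourselves,
// assuming Content-Transfer-Encoding survives multipart parsing.
val uploadRoute =
  (post & path("upload")) {
    entity(as[Multipart.FormData]) { formData =>
      extractRequestContext { ctx =>
        implicit val materializer = ctx.materializer
        implicit val ec = ctx.executionContext

        val done = formData.parts
          .filter(_.name == "myFile")
          .mapAsync(1) { part =>
            // the transfer encoding announced by the client, if any
            val transferEncoding = part.headers
              .find(_.is("content-transfer-encoding"))
              .map(_.value.toLowerCase)

            part.entity.dataBytes.runFold(ByteString.empty)(_ ++ _).map { raw =>
              val decoded = transferEncoding match {
                case Some("base64") => Base64.getMimeDecoder.decode(raw.toArray)
                case _              => raw.toArray // assume identity/binary
              }
              // handle the decoded bytes somehow
              decoded.length
            }
          }
          .runWith(Sink.ignore)

        onSuccess(done) { _ => complete("uploaded") }
      }
    }
  }

This only pushes the problem onto every route author, though; the point of this issue is that fileUpload itself should either decode according to Content-Transfer-Encoding or at least expose the header to the handler.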