
`maxRequestDepth` does not work if `depth` is not set

Open gera2ld opened this issue 5 years ago • 8 comments

If the `depth` header is not set or is not a number, `maxRequestDepth` does not work as expected, because in those cases `depth` will never reach 0:

https://github.com/OpenMarshal/npm-WebDAV-Server/blob/5f237622e81886f0305293810725c07b54a2598f/src/server/v2/commands/Propfind.ts#L443
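The failure mode can be shown with plain JavaScript arithmetic: a non-numeric Depth header parses to `NaN`, a missing one is treated as `Infinity`, and decrementing either value never reaches 0. A minimal sketch (`nextDepth` is a hypothetical stand-in for the depth countdown at the linked line in Propfind.ts):

```javascript
// Hypothetical stand-in for the recursion's depth countdown in Propfind.ts.
function nextDepth(depth) {
    return depth - 1;
}

let depth = parseInt('every', 10);          // NaN — header was not a number
for (let i = 0; i < 3; ++i) depth = nextDepth(depth);
console.log(Number.isNaN(depth));           // → true: NaN never equals 0

depth = Infinity;                           // header missing → treated as infinity
console.log(nextDepth(depth) === Infinity); // → true: Infinity - 1 is still Infinity
```

So any termination check of the form `depth === 0` is unreachable for both inputs, and the traversal recurses without bound.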

gera2ld avatar Aug 01 '20 14:08 gera2ld

@gera2ld

I also found this problem

curl http://foo.bar:1900/ -X PROPFIND -H depth:every --user root:root
curl http://foo.bar:1900/ -X PROPFIND -H depth:null --user root:root
curl http://foo.bar:1900/ -X PROPFIND --user root:root

masx200 avatar Sep 02 '20 03:09 masx200

@OpenMarshal

I found that as long as the "depth" request header is not set or is not a number, the server treats it as "infinity", so maxRequestDepth does not work.

masx200 avatar Sep 02 '20 03:09 masx200

@mholt I found that this bug can cause a fatal server memory overflow (out-of-memory crash) and sustained high CPU usage.

masx200 avatar Sep 02 '20 03:09 masx200

https://datatracker.ietf.org/doc/rfc4918/?include_text=1

9.1. PROPFIND Method

The PROPFIND method retrieves properties defined on the resource identified by the Request-URI, if the resource does not have any internal members, or on the resource identified by the Request-URI and potentially its member resources, if the resource is a collection that has internal member URLs. All DAV-compliant resources MUST support the PROPFIND method and the propfind XML element (Section 14.20) along with all XML elements defined for use with that element.

A client MUST submit a Depth header with a value of "0", "1", or "infinity" with a PROPFIND request. Servers MUST support "0" and "1" depth requests on WebDAV-compliant resources and SHOULD support "infinity" requests. In practice, support for infinite-depth requests MAY be disabled, due to the performance and security concerns associated with this behavior. Servers SHOULD treat a request without a Depth header as if a "Depth: infinity" header was included.

9.1.1.  PROPFIND Status Codes

This section, as with similar sections for other methods, provides some guidance on error codes and preconditions or postconditions (defined in Section 16) that might be particularly useful with PROPFIND.

403 Forbidden - A server MAY reject PROPFIND requests on collections with depth header of "Infinity", in which case it SHOULD use this error with the precondition code 'propfind-finite-depth' inside the error body.
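For a server that rejects infinite-depth PROPFIND as the RFC permits, §9.1.1 (together with the precondition elements in §16) also describes the error body: a `DAV:error` element carrying the `propfind-finite-depth` precondition. A sketch of that body as a plain string (the variable name is illustrative):

```javascript
// Sketch per RFC 4918 §9.1.1: the 403 response body for a rejected
// infinite-depth PROPFIND, carrying the 'propfind-finite-depth' precondition.
const propfindFiniteDepthBody =
    '<?xml version="1.0" encoding="utf-8"?>\n' +
    '<D:error xmlns:D="DAV:">\n' +
    '    <D:propfind-finite-depth/>\n' +
    '</D:error>\n';

console.log(propfindFiniteDepthBody);
```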

masx200 avatar Sep 02 '20 03:09 masx200

The following are the steps to reproduce the bug

For example, suppose the following looping folder symlinks exist:



$ pwd
/data/data/com.termux/files/home/test
$ ls -l
total 7
drwx------ 2 u0_a261 u0_a261 3488 Sep  2 00:44 a
drwx------ 2 u0_a261 u0_a261 3488 Sep  2 00:44 b
$ ls -l b
total 0
lrwxrwxrwx 1 u0_a261 u0_a261 42 Sep  2 00:38 a -> /data/data/com.termux/files/home/test/a
-rw------- 1 u0_a261 u0_a261  0 Sep  2 00:44 c.txt
$ ls -l a
total 0
lrwxrwxrwx 1 u0_a261 u0_a261 42 Sep  2 00:39 b -> /data/data/com.termux/files/home/test/b
-rw------- 1 u0_a261 u0_a261  0 Sep  2 00:44 d.txt
$

masx200 avatar Sep 02 '20 03:09 masx200

Then I start the webdav service

npx webdav-cli --username root --password root --port=1900 --path=/data/data/com.termux/files/home

masx200 avatar Sep 02 '20 03:09 masx200

Then initiate a request to the server; the server enters an endless loop, memory overflows, and the server goes down:

curl http://127.0.0.1:1900 -X PROPFIND  --user root:root

<--- Last few GCs --->

[8419:0x745ce87a00]   797885 ms: Mark-sweep 920.1 (936.4) -> 916.2 (936.6) MB, 1737.1 / 0.8 ms  (average mu = 0.123, current mu = 0.014) allocation failure scavenge might not succeed
[8419:0x745ce87a00]   799605 ms: Mark-sweep 920.3 (936.9) -> 916.5 (936.4) MB, 1698.2 / 0.8 ms  (average mu = 0.071, current mu = 0.013) allocation failure scavenge might not succeed


<--- JS stacktrace --->

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
error Command failed with signal "SIGABRT".
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
$
curl: (52) Empty reply from server

masx200 avatar Sep 02 '20 03:09 masx200

@AdrienCastex

The solution is as follows:

server.beforeRequest((arg, next) => {
    const { headers, method } = arg.request;
    const { depth } = headers;
    // Reject any PROPFIND whose Depth header is not exactly '0' or '1';
    // this also covers a missing header (depth === undefined).
    if (method === 'PROPFIND' && depth !== '0' && depth !== '1') {
        arg.setCode(403);
        arg.exit();
    } else {
        next();
    }
});
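An alternative to rejecting the request outright is to clamp an invalid or missing Depth header to '1' before the command runs. A sketch assuming the same `server.beforeRequest` API as above, with the clamping factored into a pure helper (`clampDepth` is a hypothetical name, and mutating `arg.request.headers` is an assumption about the library's internals):

```javascript
// Hypothetical helper: map any Depth value that is not exactly '0' or '1'
// (including undefined, 'infinity', or garbage) to '1'.
function clampDepth(depth) {
    return depth === '0' || depth === '1' ? depth : '1';
}

// Sketch of the hook (assumes the command reads the mutated header):
// server.beforeRequest((arg, next) => {
//     if (arg.request.method === 'PROPFIND') {
//         arg.request.headers['depth'] = clampDepth(arg.request.headers['depth']);
//     }
//     next();
// });

console.log(clampDepth(undefined)); // → '1'
console.log(clampDepth('0'));       // → '0'
```

This keeps clients that omit the header working (they get a depth-1 listing instead of a 403), at the cost of silently deviating from the RFC's "treat missing as infinity" recommendation.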

masx200 avatar Sep 02 '20 03:09 masx200