node-spdyproxy

crashing (possible memory leak)

Open · cannotcode opened this issue on Nov 13, 2014 · 19 comments

After using the proxy for some hours, it randomly stops processing new connections. Going through the log, I found errors like the following:

(node) warning: possible EventEmitter memory leak detected. 101 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at ServerResponse.addListener (events.js:160:15)
    at ServerResponse.once (events.js:185:8)
    at IncomingMessage.Readable.pipe (_stream_readable.js:538:8)
    at ClientRequest.<anonymous> (/usr/local/lib/node_modules/spdyproxy/lib/server.js:80:12)
    at ClientRequest.g (events.js:180:16)
    at ClientRequest.emit (events.js:95:17)
    at HTTPParser.parserOnIncomingClient [as onIncoming] (http.js:1688:21)
    at HTTPParser.parserOnHeadersComplete [as onHeadersComplete] (http.js:121:23)
    at Socket.socketOnData [as ondata] (http.js:1583:20)
    at TCP.onread (net.js:527:27)
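
For context, this warning is Node's built-in guard against unbounded listener growth rather than a hard failure; if the extra listeners were genuinely expected, the limit itself can be raised. A minimal sketch of doing that (illustrative only, not a fix for an actual leak):

```js
// Minimal sketch: raising the EventEmitter listener limit.
// This only silences the warning; it does not fix a real leak.
var EventEmitter = require('events').EventEmitter;

var emitter = new EventEmitter();
emitter.setMaxListeners(200); // default is 10; 0 removes the limit entirely
```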

Any tips? Thanks in advance.

cannotcode avatar Nov 13 '14 10:11 cannotcode

I also have this question!

canghai908 avatar Nov 13 '14 12:11 canghai908

Please give the latest version from master a try, just updated it to use latest node-spdy.

igrigorik avatar Nov 13 '14 15:11 igrigorik

Please explain to a newbie how to start the one from master. I was using the previous version after installing it with npm install -g spdyproxy.
Now, after I unzip the master branch and try to run it from bin/ using node ./spdyproxy ..., I get errors like:

module.js:340
    throw err;
          ^
Error: Cannot find module 'spdy'
    at Function.Module._resolveFilename (module.js:338:15)
    at Function.Module._load (module.js:280:25)
    at Module.require (module.js:364:17)
    at require (module.js:380:17)
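
That error usually just means the checkout's local dependencies were never installed (a global npm install -g spdyproxy pulls them in, but an unzipped copy of master does not). A quick, hypothetical way to check from inside the checkout:

```js
// Hypothetical check, run from the root of the unzipped checkout:
// if require.resolve() throws, node_modules/ has not been populated
// and "npm install" needs to be run there before bin/spdyproxy will start.
try {
  require.resolve('spdy');
  console.log('spdy dependency is installed locally');
} catch (e) {
  console.log('spdy not found -- run "npm install" in the checkout root first');
}
```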

Thanks again.

cannotcode avatar Nov 13 '14 16:11 cannotcode

Hello

I tried the version from the master branch and it still stops working. After going through the log file, this is the only thing I found that could be relevant:

(node) warning: possible EventEmitter memory leak detected. 101 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at EventEmitter.addListener (events.js:160:15)
    at RADIUSHelper.authUser (/root/node-spdyproxy/lib/radiushelper.js:36:29)
    at handleRequest (/root/node-spdyproxy/lib/server.js:170:25)
    at emit (events.js:98:17)
    at Connection.onconnect (/root/node-spdyproxy/node_modules/spdy/lib/spdy/server.js:263:10)
    at Connection.emit (events.js:98:17)
    at Stream.start [as _start] (/root/node-spdyproxy/node_modules/spdy/lib/spdy/stream.js:261:23)
    at Connection.handleSynStream [as _handleSynStream] (/root/node-spdyproxy/node_modules/spdy/lib/spdy/connection.js:310:10)
    at Connection.handleFrame [as _handleFrame] (/root/node-spdyproxy/node_modules/spdy/lib/spdy/connection.js:193:19)
    at Parser.emit (events.js:95:17)
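
The trace suggests a listener being registered inside authUser on every request without ever being removed. As a purely hypothetical illustration of that pattern (the names below are not spdyproxy's actual API):

```js
// Hypothetical illustration of the leak pattern the trace points at.
// An auth helper that attaches a listener to a long-lived emitter on
// every request, and never removes it, will trip the 10-listener warning.
var EventEmitter = require('events').EventEmitter;
var radiusEvents = new EventEmitter();

function authUserLeaky(user, callback) {
  // leaks: one extra listener per call, never removed
  radiusEvents.on('response:' + user, callback);
}

function authUserBounded(user, callback) {
  // bounded: the listener removes itself after firing once
  radiusEvents.once('response:' + user, callback);
}
```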

There are 16 GB of RAM on the server, so I don't think that could be the problem. Killing/restarting the daemon seems to be the only workaround. Any advice?

cannotcode avatar Dec 10 '14 08:12 cannotcode

By randomly stops processing connections, do you mean it hangs indefinitely and stops responding to all requests? Are the existing connections still operational?

igrigorik avatar Dec 10 '14 16:12 igrigorik

It stops for all requests (I was streaming some online radio and that hung too). I can still reach it on the port it listens on and the process is active; if I run tail -f on the log.txt I can still see requests coming in, yet no replies are being sent.

It happens on 3 out of 3 different servers, two being KVM instances with 768 MB and 4 GB of RAM, the third being a physical server with 16 GB of RAM (Xeon CPU). Usage through the proxy is very low (one, at most two, users, since the proxies are used privately). The crash happens roughly once a day.

cannotcode avatar Dec 10 '14 16:12 cannotcode

Are you sure the servers are not reaching their max socket limit, or some such?

igrigorik avatar Dec 10 '14 16:12 igrigorik

I don't know. How can I check/set the max socket limit for Node.js / spdyproxy?

The servers themselves are idle and not used for anything else.

cannotcode avatar Dec 10 '14 16:12 cannotcode

Check: http://stackoverflow.com/questions/410616/increasing-the-maximum-number-of-tcp-ip-connections-in-linux ... lots of articles on this topic.

igrigorik avatar Dec 10 '14 16:12 igrigorik

Thanks, but since it happens all of a sudden and there are only a few connections on the server, I doubt it is related to that. I suspect it happens when spdyproxy is not even being used, e.g. overnight, because I have found it crashed a few times then. It is running on Debian 7. Do you recommend trying a different distro?

cannotcode avatar Dec 10 '14 16:12 cannotcode

A typical page connects to dozens of different hostnames and often requests hundreds of resources. Hence just a few clients can easily generate thousands of open sockets -- check that you're not exceeding your limits. The fact that the proxy continues to serve existing connections tells me that something else might be limiting your connections.

Aside from that, it's hard for me to diagnose what's going wrong. Poking at it with a debugger, or strace, etc, might help.
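
One low-effort way to see whether socket counts are the issue is to log the server's open-connection count over time and compare it against the system limits discussed in the article above. A minimal, standalone sketch (not spdyproxy code; the port is arbitrary):

```js
// Standalone sketch: periodically log how many connections a Node
// server is holding open, to compare against system socket/fd limits.
var net = require('net');

var server = net.createServer(function (socket) {
  socket.pipe(socket); // trivial echo handler, just to accept connections
});
server.listen(44300);  // arbitrary port for the sketch

setInterval(function () {
  server.getConnections(function (err, count) {
    if (!err) console.log(new Date().toISOString(), 'open connections:', count);
  });
}, 60 * 1000);
```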

igrigorik avatar Dec 10 '14 16:12 igrigorik

"The fact that the proxy continues to serve existing connections tells me that something else might be limiting your connections."

It doesn't. I had some online radio streaming and it stopped, so I had to restart the proxy. Perhaps it is crashing on some external requests coming from internet scanners and such? It is very random, and it is frustrating that it can't be replicated. It doesn't crash while loading big pages; I tried everything: reloading portal-like pages, downloading big files, having multiple tabs open on websites that use keep-alive. It seems to crash when nobody expects it.

What Linux distro do you recommend I run it on? One that has been stable for you :)

cannotcode avatar Dec 10 '14 16:12 cannotcode

I doubt it has anything to do with the distro, but I've used Ubuntu in the past. When you catch the proxy in a bad state, try running strace on it to see what it's up to.

igrigorik avatar Dec 10 '14 16:12 igrigorik

@Ifxx Don't compile it, just run it. I set up the directory like this:

[screenshot: test]

I made a test file as "test.pac", but I couldn't successfully configure it following these instructions, customizing the path to mine: $> "/Applications/Google Chrome.app/Contents/MacOS/Google Chrome" --proxy-pac-url=file:///path/to/config.pac --use-npn

So it ended up like this:

[screenshot: azureresmgr]

:( :(
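
For reference, a PAC file like the test.pac mentioned above is itself a small piece of JavaScript; a minimal sketch (the host and port below are placeholders for your own proxy) looks like:

```js
// Minimal proxy auto-config (PAC) sketch; replace host/port with your proxy.
// "HTTPS" tells Chrome to talk to the proxy over TLS (which is what the
// --use-npn flag above negotiates SPDY over); DIRECT is the fallback.
function FindProxyForURL(url, host) {
  return "HTTPS proxy.example.com:44300; DIRECT";
}
```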

ashumeow avatar Dec 10 '14 20:12 ashumeow

@Ifxx Sorry, mine is not Ubuntu. But I can say that the "throw err" problem occurs when you just compile it with node. For Ubuntu the commands would be a bit different; just browse through it. =)

ashumeow avatar Dec 10 '14 20:12 ashumeow

I think I am making progress here in discovering what could be wrong. I forgot to mention that I am using RADIUS. Today I found the server unresponsive again, so I checked the log: it was receiving the HTTP requests and trying to forward them. Along with the HTTP GET requests in the log file, I also got this:

{ state: 'fetching' }
# incomplete cache, waiting...

The leak warning was the same one I got previously:

(node) warning: possible EventEmitter memory leak detected. 101 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at EventEmitter.addListener (events.js:160:15)
    at RADIUSHelper.authUser (/usr/local/lib/node_modules/spdyproxy/lib/radiushelper.js:36:29)
    at handleRequest (/usr/local/lib/node_modules/spdyproxy/lib/server.js:170:25)
    at emit (events.js:98:17)
    at Connection.onconnect (/usr/local/lib/node_modules/spdyproxy/node_modules/spdy/lib/spdy/server.js:263:10)
    at Connection.emit (events.js:98:17)
    at Stream.start [as _start] (/usr/local/lib/node_modules/spdyproxy/node_modules/spdy/lib/spdy/stream.js:260:23)
    at Connection.handleSynStream [as _handleSynStream] (/usr/local/lib/node_modules/spdyproxy/node_modules/spdy/lib/spdy/connection.js:299:10)
    at Connection.handleFrame [as _handleFrame] (/usr/local/lib/node_modules/spdyproxy/node_modules/spdy/lib/spdy/connection.js:182:19)
    at Parser.emit (events.js:95:17)

And here comes the interesting part! Checking netstat, I found many node processes listening on UDP (67 processes, to be exact). See screenshot.
So in my opinion this is related to RADIUS; otherwise I don't see why it would be listening on UDP. The RADIUS server is not on localhost and nobody was using the SPDY proxy. I had just started the browser after some long idle time, tried to open a website, and noticed it had stopped working since I last used it (a few hours back).

cannotcode avatar Dec 12 '14 16:12 cannotcode

Hmm, not sure if that's the actual culprit behind the behavior you're seeing. But on a quick scan of the current RADIUS code, it does look like we weren't listening for errors. Just pushed a small update to cover this case -- can you try running the latest master and see if it makes any difference?
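
For anyone following along, the kind of change described here would look roughly like the sketch below: a RADIUS client talks over a UDP socket, and a socket whose 'error' event is never handled (or that is never closed) is one way to end up with stray UDP listeners like the 67 seen in netstat. The surrounding flow is hypothetical; only the dgram API is Node's.

```js
// Hypothetical sketch of handling errors on a RADIUS UDP exchange.
// Without the 'error' handler, a failure either crashes the process
// or leaves the socket (and its listeners) dangling.
var dgram = require('dgram');

function sendRadiusPacket(packet, host, port, callback) {
  var socket = dgram.createSocket('udp4');

  socket.on('error', function (err) {
    socket.close();          // always release the UDP socket
    callback(err);
  });

  socket.on('message', function (msg) {
    socket.close();          // one reply is enough; close immediately
    callback(null, msg);
  });

  socket.send(packet, 0, packet.length, port, host);
}
```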

igrigorik avatar Dec 13 '14 06:12 igrigorik

Done, updated. Will let you know when I get anything.

cannotcode avatar Dec 13 '14 06:12 cannotcode

I had a similar issue, with memory usage growing to over 1 GB in a couple of hours.

I'm now trying to use the HEAD version to see if it is solved.

But in general, how would one debug this?
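
One low-tech starting point is to log the process's memory usage over time and see whether it only ever grows; a heap profiler (e.g. taking heap snapshots) is the next step. A minimal sketch:

```js
// Minimal sketch: log RSS and heap usage once a minute to confirm
// whether memory growth is monotonic. Not a substitute for a profiler.
setInterval(function () {
  var m = process.memoryUsage();
  console.log('rss:', (m.rss / 1048576).toFixed(1), 'MB,',
              'heapUsed:', (m.heapUsed / 1048576).toFixed(1), 'MB');
}, 60 * 1000);
```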

thefallentree avatar Jan 31 '15 07:01 thefallentree