
🐛 BUG: Wrangler hangs when recording profile in Chrome inspector

Open ericmatthys opened this issue 2 years ago • 11 comments

Which Cloudflare product(s) does this pertain to?

Pages

What version(s) of the tool(s) are you using?

wrangler 3.15.0

What version of Node are you using?

18.16.0

What operating system are you using?

macOS Sonoma 14.1

Describe the Bug

Using the Chrome inspector, I have tried recording a profile through both the Performance tab and the deprecated JavaScript Profiler tab (https://developer.chrome.com/blog/new-in-devtools-114/#js-profiler). After I click to start recording, I make a request to a Worker / Pages Function, but the request hangs and never responds. If I try to stop recording in the Chrome inspector, that also seems to hang forever.

  • I have tried this in Chrome 120 (Beta) and 121 (Canary). For some reason the stable channel wasn't discovering wrangler in chrome://inspect.
  • If I start recording and stop without ever making a request to the worker / pages function, then it behaves normally.
  • I do not have any debugger; statements that might be interfering.
  • If I make a request with the inspector open but never try to record, I do see logs from the Pages Function in the console and it responds correctly.

Let me know what I can do to help track down this issue; right now it seems impossible to record a CPU profile to diagnose some performance issues.

[Screenshot attached: 2023-11-08, 4:23 PM]

Please provide a link to a minimal reproduction

No response

Please provide any relevant error logs

No response

ericmatthys avatar Nov 08 '23 23:11 ericmatthys

Have you tried this with Wrangler's hosted devtools? (Run wrangler dev and press d)

penalosa avatar Nov 09 '23 00:11 penalosa

I see the same behavior with the hosted devtools (https://devtools.devprod.cloudflare.dev/js_app?theme=systemPreferred&ws=localhost%3A9229%2Fws&debugger=true)

ericmatthys avatar Nov 09 '23 11:11 ericmatthys

Hey! 👋 I wasn't able to reproduce this locally, using a worker that made 100 crypto.pbkdf2Sync('secret', 'salt', 100000, 64, 'sha512'); calls. Would you be able to try clearing site data in the hosted devtools? Open devtools on the hosted devtools, click the Application tab, then Clear site data.
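For reference, the test worker was roughly along these lines (a sketch, assuming the nodejs_compat compatibility flag so that node:crypto is available):

```ts
// Sketch of the CPU-bound test Worker described above.
// Assumes the nodejs_compat compatibility flag so node:crypto resolves.
import crypto from "node:crypto";

export default {
  async fetch(): Promise<Response> {
    // Plenty of synchronous CPU work for the profiler to sample.
    for (let i = 0; i < 100; i++) {
      crypto.pbkdf2Sync("secret", "salt", 100000, 64, "sha512");
    }
    return new Response("done");
  },
};
```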

mrbbot avatar Nov 27 '23 16:11 mrbbot

Sorry for the late reply. In the hosted devtools (https://devtools.devprod.cloudflare.dev/js_app?theme=systemPreferred&ws=127.0.0.1%3A9229%2Fws&debugger=true), I don't see an Application tab at all.

[Screenshot attached: 2023-12-07, 9:46 AM]

ericmatthys avatar Dec 07 '23 16:12 ericmatthys

Ah apologies, you'll need to open Chrome's regular DevTools on our hosted DevTools... 😅 The Application tab should be visible there.

mrbbot avatar Dec 11 '23 16:12 mrbbot

Ah thanks. After trying that and testing some more, I can profile some routes, but some specific ones seem to hang forever as described originally. I'll try to narrow down what's causing that. There's nothing obvious about those routes that stands out right now.

For the routes that do work, the profile seems incomplete. I feel like I'm holding something wrong. 😅

[Three screenshots attached: 2023-12-11, 9:51 AM]

ericmatthys avatar Dec 11 '23 16:12 ericmatthys

Hey again! Were you able to make any progress narrowing down what caused the hanging? For the incomplete profiles, could you try sending lots of requests to the routes? That should help build a better picture for those bottom two tables.
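For example, a quick script like this (run as an ES module with Node 18+; the URL and counts are just placeholders for your local dev server) should generate plenty of samples:

```ts
// Hammer a route so the profiler collects more samples.
// The URL and counts below are placeholders; point it at your local dev server.
const url = "http://127.0.0.1:8788/your-route";

for (let batch = 0; batch < 20; batch++) {
  // 10 concurrent requests per batch, 200 requests in total.
  await Promise.all(
    Array.from({ length: 10 }, () =>
      fetch(url).then((res) => res.arrayBuffer())
    )
  );
}
```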

mrbbot avatar Jan 02 '24 16:01 mrbbot

Hey there, I was getting this exact same issue. I would try to profile my application spin-up, and after one or two refreshes it locks up. I moved over to my Windows PC and it works with no problem. I'm only seeing this issue on my Mac with the following specs:

Which Cloudflare product(s) does this pertain to? Pages

What version(s) of the tool(s) are you using? wrangler 3.22.4

What version of Node are you using? 18.17.1

What operating system are you using? macOS Sonoma 14.1

One question I had: the self time is 0ms for each function, and all the time is added up beside (program). Is this how it is supposed to read? It is similar to @ericmatthys's profile screenshots. I was expecting to see self time for each function called.

ssollows avatar Jan 12 '24 19:01 ssollows

@ssollows - it would be great if you were able to provide a reproduction that we could debug. @ericmatthys - did you manage to make some progress?

petebacondarwin avatar Jan 29 '24 15:01 petebacondarwin

Sorry for dropping off here. I finally got some time to come back to this.


For the incomplete profiles, it seems like the generic (program) entry still shows 100% of the time with everything else at 0%, regardless of how many times I spam the request. The (program) time is simply however long the profiler was recording.

[Three screenshots attached: 2024-02-09, ~3:00 PM]

Some of the routes that were hanging when the profiler was recording seem to be fixed. I can still reproduce hangs, but not on all the routes I could before.

Interestingly, I'm seeing this behavior...

  1. Start the dev server
  2. Make a request to one of the safe routes
  3. Open debugger
  4. Observe that the Start button to start recording a profile is enabled
  5. Close the debugger
  6. Make a request to one of the problem routes
  7. Open the debugger again
  8. Observe that the Start button is now disabled, without ever having started a recording

I can simplify that to just:

  1. Start the dev server
  2. Make a request to one of the problem routes
  3. Open the debugger
  4. Observe that the Start button is now disabled, without ever having started a recording

There's nothing obvious that differentiates the safe routes from the problem routes. I'm using Pages Functions so it all gets compiled into one worker. All the routes have more or less similar responsibilities (make subrequests, transform the data, return a response).
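For illustration, the routes are roughly this shape (the endpoint URLs and names below are made up, not my actual code):

```ts
// Illustrative only: a typical route fans out subrequests, transforms the
// results, and returns JSON. Endpoints and names are placeholders.
export const onRequestGet: PagesFunction = async () => {
  const endpoints = [
    "https://api.example.com/items",
    "https://api.example.com/details",
  ];

  // Fan out subrequests and parse the JSON bodies.
  const responses = await Promise.all(endpoints.map((url) => fetch(url)));
  const payloads = await Promise.all(responses.map((res) => res.json()));

  // Transform the data into a single combined payload.
  const combined = payloads.flatMap((p) => (Array.isArray(p) ? p : [p]));

  return Response.json(combined);
};
```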

ericmatthys avatar Feb 09 '24 22:02 ericmatthys

Hey! 👋 Thanks for the additional information. I was able to reproduce the (program) issue with just workerd and have opened a new issue to track this over there: cloudflare/workerd#1754. 👍 I don't think there's much we can do on the wrangler side to fix this.

mrbbot avatar Mar 01 '24 14:03 mrbbot

I'm seeing both of these issues (the hanging as well as the (program) issue). It seems relatively easy to reproduce with a Next.js app that's making concurrent requests, especially from prefetching.

CPU Profiling seems completely broken atm.

mhart avatar May 13 '24 10:05 mhart

The fix for this should be released in the next version of Wrangler.

penalosa avatar Aug 22 '24 11:08 penalosa