Random object gets printed to the console and Python freezes
The problem
Console randomly prints an object
The object in question:
{"r": 23, "c": "jsi", "p": 1, "action": "call", "ffid": 11, "key": "", "args": [12, {"ffid": 34}, {"ffid": 35}]}
This object can vary each run; sometimes the problem doesn't even happen!
After this is printed, Python gets held up, and after 100000ms pythonia throws an error:
node:internal/event_target:1011
process.nextTick(() => { throw err; });
^
Error: Attempt to access '__call__' failed. Python didn't respond in time (100000ms), look above for any Python errors. If no errors, the API call hung.
at C:\Users\AirplaneGobrr\Documents\GitHub\ai-gen\node_modules\.pnpm\[email protected][email protected]\node_modules\pythonia\src\pythonia\Bridge.js:248:13
at waitFor (C:\Users\AirplaneGobrr\Documents\GitHub\ai-gen\node_modules\.pnpm\[email protected][email protected]\node_modules\pythonia\src\pythonia\Bridge.js:123:26)
at async Bridge.call (C:\Users\AirplaneGobrr\Documents\GitHub\ai-gen\node_modules\.pnpm\[email protected][email protected]\node_modules\pythonia\src\pythonia\Bridge.js:233:18)
at async t2i (C:\Users\AirplaneGobrr\Documents\GitHub\ai-gen\aiGen\worker.js:81:29)
Emitted 'error' event on Worker instance at:
at Worker.[kOnErrorMessage] (node:internal/worker:298:10)
at Worker.[kOnMessage] (node:internal/worker:309:37)
at MessagePort.<anonymous> (node:internal/worker:205:57)
at MessagePort.[nodejs.internal.kHybridDispatch] (node:internal/event_target:736:20)
at MessagePort.exports.emitMessage (node:internal/per_context/messageport:23:28)

I think this issue might also be related to #68? I tried to debug it on my own but have no clue what the problem is or where to even begin looking.
Code
A bit of my code:
var images = await (await main.__call__$(prompt, {
negative_prompt: negPrompt,
callback: progress,
guidance_scale: scale,
num_inference_steps: steps,
num_images_per_prompt: images,
height: height,
width: width
})).images
"main" comes from a pretrained StableDiffusionPipeline, i.e.:
const pipe = await dif.StableDiffusionPipeline.from_pretrained$(moduleToLoad, { torch_dtype: await torch.float16 })
const main = await pipe.to("cuda")
dif being the diffusers module and torch being torch.
After doing some debugging, it looks like callback: progress is causing the issue. The fix is to remove it, but it would be nice to have the callback!
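For context, diffusers invokes the callback parameter during denoising with the step index, the scheduler timestep, and the current latents (the exact signature may differ across diffusers versions). A Python-side sketch of the shape progress is expected to have:

```python
# Sketch of the callback shape diffusers calls during denoising
# (signature may vary across diffusers versions):
def progress(step, timestep, latents):
    # step:     0-based denoising step index
    # timestep: scheduler timestep for this step
    # latents:  current latent tensor (a torch.FloatTensor in diffusers)
    print(f"step {step} (timestep {timestep})")
```

In the reported setup this callback lives on the JavaScript side and is proxied across the bridge, which is where the trouble starts.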
@AirplanegoBrr
After doing some debugging, it looks like callback: progress is causing the issue. The fix is to remove it, but it would be nice to have the callback!
If it's the same as #68, then keeping a reference to your callback somewhere on the Python side should help, as it then won't be removed by Python's garbage collector. I guess that would be a bug in the bridge, but such a hack might work around it until it's fixed.