Glenn 'devalias' Grant
> With overriding to latest webcrack, I still see this

@VivaLaPanda Thanks for checking.

> If not, but it works upstream, then providing an ideally minimal example of the code...
> this seems to fix it:
>
> [main...fosteman:humanify:main](https://github.com/jehna/humanify/compare/main...fosteman:humanify:main)

This seems to create a plugin that logs a `console.error`, and then skips the entire `Program`:

```ts
const syntaxErrorPlugin: PluginItem =...
```
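For reference, here's a minimal sketch of a plugin with that shape — assuming the fork's `syntaxErrorPlugin` roughly follows this pattern; the actual detection logic and log message aren't visible in the excerpt above:

```ts
import type { PluginItem } from "@babel/core";

// Hedged sketch only: log an error and skip the whole Program node so no
// further visitors run on that file. The real fork's code may well differ.
const syntaxErrorPlugin: PluginItem = {
  visitor: {
    Program(path) {
      console.error("Skipping file: could not process Program"); // placeholder message
      path.skip(); // skips traversal of the entire Program
    },
  },
};

export default syntaxErrorPlugin;
```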
@TheGreyRaven The surrounding context in your error is less clear than what I discovered above in https://github.com/jehna/humanify/issues/285#issuecomment-2641827428, but I suspect it's the same root cause: that there is an `await`...
Imported here:

https://github.com/jehna/humanify/blob/88b51bc689d23717212b035eb287a0d3c07d3f1b/src/plugins/babel/babel.ts#L4-L4

Used here:

https://github.com/jehna/humanify/blob/88b51bc689d23717212b035eb287a0d3c07d3f1b/src/plugins/babel/babel.ts#L72-L78

> I found the root cause of why your fork got rid of the error: the bug is in `babel-plugin-transform-beautifier`, which removes the async keyword....
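To illustrate why stripping `async` produces this class of error: once the keyword is gone, `await` inside the function body is no longer valid, so re-parsing the transformed output fails. A rough sketch with `@babel/parser` (not humanify's actual code path):

```ts
import { parse } from "@babel/parser";

// Parses fine: `await` is allowed inside an async function.
parse("async function f() { await g(); }", { sourceType: "module" });

// Throws a SyntaxError: with the `async` keyword removed by the beautifier,
// `await` is no longer valid in this position.
try {
  parse("function f() { await g(); }", { sourceType: "module" });
} catch (err) {
  console.error((err as Error).message);
}
```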
> Upstream bug issue:
>
> - https://github.com/gzzhanghao/babel-plugin-transform-beautifier/issues/3

> Fixed in v0.1.1
>
> _Originally posted by @gzzhanghao in https://github.com/gzzhanghao/babel-plugin-transform-beautifier/issues/3#issuecomment-2661448752_

- https://github.com/gzzhanghao/babel-plugin-transform-beautifier/pull/5
- https://github.com/gzzhanghao/babel-plugin-transform-beautifier/blob/main/CHANGELOG.md#011
- https://www.npmjs.com/package/babel-plugin-transform-beautifier/v/0.1.1

**Edit:** PR to bump...
> node v23 isn't supported by isolated-vm

- https://github.com/laverdet/isolated-vm#security
  - > Additionally, it is wise to keep nodejs up to date through point releases which affect v8. You can find...
> After several hours, the progress still shows 0%, which leads me to suspect that the tool might be running on CPU instead of utilizing CUDA acceleration.

@GermanKousal / @KyleSau...
Another thing worth noting, which may or may not be relevant here, relates to the `node-llama-cpp` build options and whether it was built for CUDA/etc. I'm thinking maybe for some...
There are some notes in here that might also be relevant:

- https://node-llama-cpp.withcat.ai/guide/CUDA
  - > `node-llama-cpp` ships with pre-built binaries with CUDA support for Windows and Linux, and these are...
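As a quick way to confirm which backend was actually picked up (rather than trusting that the CUDA binaries loaded), something along these lines should work with the `node-llama-cpp` API described in that guide — treat it as a sketch, since I haven't verified it against the exact version in use here:

```ts
import { getLlama } from "node-llama-cpp";

// Ask for CUDA explicitly instead of relying on auto-detection; if the CUDA
// build can't be loaded this should fail loudly rather than silently falling
// back to CPU (which would match the "stuck at 0%" symptom).
const llama = await getLlama({ gpu: "cuda" });

// Reports the backend actually in use (e.g. "cuda", or false for CPU-only).
console.log("GPU backend:", llama.gpu);
```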
These commands might also be useful in debugging if/how well certain models might run on your system:

- https://node-llama-cpp.withcat.ai/cli/inspect/estimate
  - > `inspect estimate` command
    >
    > Estimate the compatibility of a...