Nicolas Patry

977 comments by Nicolas Patry

`load(open(filename, "r").read())`. This has been extensively discussed in many places, so I won't repeat everything. You're accessing the file through some kind of network disk behind a mount point. The OS uses...
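
For reference, a minimal sketch of the two loading paths (assuming the `safetensors.torch` bindings; the filename is hypothetical):

```python
from safetensors.torch import load, load_file

# The pattern quoted above: pull the entire file into memory, then parse the
# bytes. On a network mount this forces a full transfer before anything runs.
with open("model.safetensors", "rb") as f:   # "rb": the format is binary
    tensors = load(f.read())

# The usual path: let safetensors open the file itself (memory-mapped),
# so only the bytes that are actually touched need to be read.
tensors = load_file("model.safetensors")
```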

What version of tokenizers are you referring to? We haven't uploaded `tokenizers.js` to NPM in a loooong while (we did rewrite everything with napi, but frankly it seems the...

Currently we're not really keen on doing that. We already have an image covering all CUDA devices, but merging every path, including CPU (and most likely several CPU backends), would mean adding...

I'll merge this and fix it in a follow-up; stub.py is right even if it's just a newline.

Oh, you may have forgotten to rebuild tokenizers. stub.py looks at the built binary and extracts the .pyi from it, so if your binary is outdated you may not...
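
To illustrate the mechanism (a minimal sketch of the idea, not the actual stub.py): the .pyi content is derived from whatever binary is importable, which is why an outdated build produces outdated stubs.

```python
import inspect

def generate_stub(module) -> str:
    """Derive rough .pyi entries from an installed, built extension module."""
    lines = []
    for name, obj in sorted(vars(module).items()):
        if name.startswith("_"):
            continue
        if inspect.isclass(obj):
            lines.append(f"class {name}: ...")
        elif callable(obj):
            try:
                sig = str(inspect.signature(obj))
            except (TypeError, ValueError):   # many compiled functions expose no signature
                sig = "(*args, **kwargs)"
            lines.append(f"def {name}{sig}: ...")
    return "\n".join(lines)

# Usage: rebuild the extension first, then regenerate, e.g.
#   import tokenizers; print(generate_stub(tokenizers))
```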

100% on this. Having implemented a few servers, panicking is not great!

I understand the concern, however I do not think this should be fixed. `with torch.device(x)` changes the default allocation device, which messes with a lot of internal logic in `safetensors` (it...
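
A small sketch of the distinction (the filename is hypothetical):

```python
import torch
from safetensors.torch import load_file

# `with torch.device(...)` changes the default allocation device for the
# block: plain constructors inside it land on that device.
with torch.device("cuda:0"):
    x = torch.zeros(2)   # allocated on cuda:0 because of the context

# Rather than relying on that implicit default, pass the target device
# explicitly to safetensors.
tensors = load_file("model.safetensors", device="cuda:0")
```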

For `torch` we are using torch's internals to load tensors, which afaik is hard to beat without completely sidestepping the torch engine (which will cause issues with the torch allocator...
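
For illustration only, not safetensors' actual code path: a sketch of building a tensor over a memory-mapped file while staying inside torch, so the memory remains under torch's control. The offsets, dtype, and shape here are made up; in the real format they come from the header.

```python
import mmap
import torch

with open("model.safetensors", "rb") as f:
    # Copy-on-write mapping: pages are read lazily, and the buffer stays
    # writable so torch.frombuffer accepts it without complaint.
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_COPY)

payload = memoryview(mm)[128:128 + 4 * 10]        # hypothetical byte range
t = torch.frombuffer(payload, dtype=torch.float32).reshape(10)
```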

Could you link to the problematic model so we can have a reference of what's expected and check that it's valid? The assertion exists because it was supposed to be...

I moved the issue; it wasn't on the proper repo. > Yes, annoying sadly. I also have to explain to them how to disable it - and they are not really...