37 comments by guillaume-be

Hello @yijunyu, I believe the weight conversion should work fine. Can you please post a stack trace if that is not the case? With this conversion working, you will...

Hello @yijunyu, Could you please share the Python code that was used to serialize the model?

Can you try saving the model state dict instead of the model itself, i.e.

```python
torch.save(model.state_dict(), 'codeBERT_pl.bin')
```

Hello @yijunyu, I believe you can still use the first way and this should be an easy fix. For `SequenceClassification`, the models expect a mapping from label id to...
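For illustration, here is a minimal sketch of adding such a mapping to the exported `config.json`, assuming the mapping referred to is the standard Hugging Face `id2label`/`label2id` fields; the path and label names below are placeholders:

```rust
use serde_json::{json, Value};
use std::fs;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical path to the config exported alongside the converted weights.
    let path = "config.json";
    let mut config: Value = serde_json::from_str(&fs::read_to_string(path)?)?;

    // Map label ids to label names (the names here are placeholders).
    config["id2label"] = json!({ "0": "NEGATIVE", "1": "POSITIVE" });
    config["label2id"] = json!({ "NEGATIVE": 0, "POSITIVE": 1 });

    fs::write(path, serde_json::to_string_pretty(&config)?)?;
    Ok(())
}
```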

Hello @SorooshMortazavi, Which version of the library are you using, and which version of `libtorch` have you downloaded?

Hello @SorooshMortazavi, Libtorch comes packaged with its own CUDA version, so your global installation should not impact this library's installation. The version currently published on crates.io is still based on a `tch` release that uses Libtorch...
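As a quick sanity check, a minimal sketch using the `tch` crate (which this library builds on) can confirm what the bundled Libtorch sees, independently of any system-wide CUDA toolkit:

```rust
use tch::{Cuda, Device};

fn main() {
    // These report what the Libtorch linked by tch-rs detects,
    // regardless of any globally installed CUDA version.
    println!("CUDA available:    {}", Cuda::is_available());
    println!("CUDA device count: {}", Cuda::device_count());
    println!("Selected device:   {:?}", Device::cuda_if_available());
}
```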

Hello @SorooshMortazavi, The issue is indeed an out-of-memory error on your CUDA device. Unfortunately, it has 4GB available, which may be limiting for several large models available...
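One possible workaround is to run the pipeline on CPU instead of the 4GB GPU; I believe the rust-bert pipeline configuration structs expose a `device` field that accepts a `tch::Device`. Below is a sketch with a hypothetical `select_device` helper, for illustration only:

```rust
use tch::Device;

/// Hypothetical helper: fall back to CPU when the GPU cannot hold the model.
fn select_device(gpu_has_enough_memory: bool) -> Device {
    if gpu_has_enough_memory {
        // Picks the first CUDA device if one is available, CPU otherwise.
        Device::cuda_if_available()
    } else {
        // Running on CPU avoids the CUDA out-of-memory error, at the cost of speed.
        Device::Cpu
    }
}

fn main() {
    let device = select_device(false);
    println!("Running inference on {:?}", device);
}
```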

Hello @alleonhardt, The local version (latest version of this repository) requires libtorch 1.11.x. The latest version published on `crates.io` requires libtorch 1.10.x.

Hello @felippemr, The example linked in this article leverages `KeyBERT`, which is not yet supported in this crate. I am planning on implementing sentence embedding transformers over the next...

Hello @genderev, Regarding **1.**, I unfortunately lack the WASM experience to support you in this matter. Please note, however, that this crate relies on `tch-rs` bindings to the C++...