Results 143 comments of Peter

@MNLubov Are you looking for a specific model from HuggingFace? I'm trying to fix the huggingface module this month, so if everything goes well, it should be working again before...

@MNLubov Yes. I haven't investigated the sentence-transformers implementation, but it seems it can also be done with the normal huggingface interface. For example, https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2 is a `bert` model, so...
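A minimal sketch of loading that checkpoint through the plain huggingface interface, assuming the `hgf` string macro from `Transformers.HuggingFace` and that the checkpoint downloads cleanly (not a tested, definitive snippet):

```julia
using Transformers
using Transformers.HuggingFace

# fetch the sentence-transformers checkpoint as a regular BERT model
textenc = hgf"sentence-transformers/all-MiniLM-L12-v2:tokenizer"
model   = hgf"sentence-transformers/all-MiniLM-L12-v2:model"
```

Sentence embeddings would then still need the pooling step (e.g. mean pooling over token states) that sentence-transformers applies on top of the base model.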

You can find some simple usages in the [toy example](https://github.com/chengchingwen/Transformers.jl/tree/master/example/AttentionIsAllYouNeed). Basically:

```julia
using Transformers
using Transformers.Datasets # utilities for dataset
using Transformers.Datasets: IWSLT # IWSLT datasets
# available language for...
```

Looks like they no longer provide file links for specific translation pairs, so we would need to rewrite the datadeps based on that.

@maj0e moved issue to #85

Simply `BSON.@save` and `BSON.@load`. I guess the error is probably because you forgot to `using Transformers` before loading. And yes, it's better to call `cpu(model)` before saving.
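A minimal sketch of that save/load round trip (assuming a Flux-style model already bound to `model`; the variable name and file path are just placeholders):

```julia
using BSON, Flux, Transformers

# saving: move parameters off the GPU first
model = cpu(model)
BSON.@save "model.bson" model

# loading (in a fresh session): `using Transformers` must come before
# `BSON.@load`, so BSON can reconstruct the package's custom types
using Transformers, BSON
BSON.@load "model.bson" model
```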

Several points:
1. You don't need to use `load_bert_pretrain`; you can just use `BSON.@load`.
2. `BertTextEncoder` contains both the `tokenizer` and `wordpiece`, so you don't need to store all of them....

Hi @ViralBShah, I'm fine with this idea in general, but I'm not sure it's good timing. I still need to revisit/rewrite some (probably most) parts of...

> Are you interested in pulling more examples like this?

@Oblynx Sure! That would be nice.

I'll keep the issue here for now because we get more attention here (pun intended)