
Main repository for "CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters"

Results: 6 character-bert issues (sorted by recently updated)

Hi, I just found that the input embeddings of the character-bert encoder are not tied to the output MLM weights. I would appreciate it if you could also release the pre-trained...
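For reference, a minimal sketch of what weight tying looks like in a standard BERT-style MLM head (not the repository's code, with hypothetical sizes); note that CharacterBERT builds its input representations with a character CNN rather than a word-level embedding matrix, so there is no such matrix to tie to the MLM output weights:

```python
# Minimal weight-tying sketch for a vanilla BERT-style MLM head (illustration only).
import torch.nn as nn

hidden_size, vocab_size = 768, 30522  # assumed BERT-base dimensions
word_embeddings = nn.Embedding(vocab_size, hidden_size)
mlm_decoder = nn.Linear(hidden_size, vocab_size, bias=False)
mlm_decoder.weight = word_embeddings.weight  # input and output layers now share one matrix
```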

Hi @helboukkouri, the maximum number of letters in a word is set to 50, so a word with 5 characters gets padded to 50. For padding, a value...
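To illustrate the padding being described, a minimal sketch that pads each word's character codes to a fixed length of 50, assuming a hypothetical pad id of 0 (the repository's actual CharacterIndexer may use a different value and vocabulary):

```python
# Illustration only: fixed-length character padding, not the repository's indexer.
MAX_CHARS_PER_WORD = 50
PAD_ID = 0  # assumption for illustration

def pad_word(word: str) -> list[int]:
    char_ids = [ord(c) for c in word[:MAX_CHARS_PER_WORD]]
    return char_ids + [PAD_ID] * (MAX_CHARS_PER_WORD - len(char_ids))

print(len(pad_word("hello")))  # -> 50: 5 character codes followed by 45 padding ids
```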

Hi, you're printing words and their embeddings using:

```python
for token, embedding in zip(x, embeddings_for_x):
    print(token, embedding)
```

How can I see each letter's vector?
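For context, `x` and `embeddings_for_x` appear to come from a README-style usage example; a hedged sketch of that pattern, where the module paths, model paths, and exact call signatures are assumptions:

```python
# Hedged sketch of the usage the snippet above refers to (assumed interfaces).
from transformers import BertTokenizer
from modeling.character_bert import CharacterBertModel
from utils.character_cnn import CharacterIndexer

tokenizer = BertTokenizer.from_pretrained('./pretrained-models/bert-base-uncased/')
x = ['[CLS]', *tokenizer.basic_tokenizer.tokenize("Hello World!"), '[SEP]']

indexer = CharacterIndexer()            # maps each word to fixed-length character ids
batch = indexer.as_padded_tensor([x])   # batch containing the single sequence x

model = CharacterBertModel.from_pretrained('./pretrained-models/general_character_bert/')
embeddings_for_batch, _ = model(batch)  # token-level (not character-level) embeddings
embeddings_for_x = embeddings_for_batch[0]
```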

Great stuff! I saw you have an active Hugging Face implementation: https://huggingface.co/helboukkouri/character-bert/tree/main. Is this live, or when do you expect it to be? Thanks!


Hi, do you have a script for fine-tuning? I need to know how to prepare the data for fine-tuning. Thanks.

Hi, this library is such a great idea! I am trying to download the general_character_bert and medical_character_bert pretrained models. When I run download.py, I get some errors. When I...