ELMoForManyLangs
Pre-trained ELMo Representations for Many Languages
Hi everyone, I need help. I'm doing research on contextual topic models for my thesis, and I'm about to try using ELMo for Indonesian. Unfortunately, my data was...
It seems like every download link is invalid now.
bow_id, eow_id, oov_id, pad_id = [char2id.get(key, None) for key in ('<bow>', '<eow>', oov, pad)]
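ELMo-style character vocabularies typically reserve special tokens for begin-of-word, end-of-word, out-of-vocabulary, and padding; the angle-bracket names appear to have been stripped by HTML rendering above. A minimal runnable sketch, assuming the conventional `<bow>`/`<eow>`/`<oov>`/`<pad>` names (the vocabulary contents here are illustrative, not from any real model):

```python
# Hypothetical char vocabulary; real models build char2id from training data.
char2id = {'<bow>': 0, '<eow>': 1, '<oov>': 2, '<pad>': 3, 'a': 4}
oov, pad = '<oov>', '<pad>'

# dict.get(key, None) yields None for any special token missing from the vocab,
# so downstream code can detect and handle an incomplete vocabulary.
bow_id, eow_id, oov_id, pad_id = [
    char2id.get(key, None) for key in ('<bow>', '<eow>', oov, pad)
]
```

If a token is absent (e.g. `char2id` lacks `'<pad>'`), the corresponding id is `None` rather than a `KeyError`, which is why `.get` is used instead of indexing.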
I tried to train a new ELMo model with this command: CUDA_VISIBLE_DEVICES=1 python -m elmoformanylangs.biLM train --train_path data/korean.txt --config_path configs/cnn_50_100_512_4096_sample.json --model output/ko --optimizer adam --lr 0.001...
Is it possible to fine-tune the existing pre-trained models? I've seen some examples of biLM fine-tuning, but I don't know whether that is possible with ELMoForManyLangs.
RuntimeError: Error(s) in loading state_dict for ConvTokenEmbedder: size mismatch for word_emb_layer.embedding.weight: copying a param with shape torch.Size([140384, 100]) from checkpoint, the shape in current model is torch.Size([71222, 100]). size mismatch...
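The error above indicates that the checkpoint's word-embedding table has a different vocabulary size (140384 rows) than the model being loaded (71222 rows), which is what `load_state_dict` rejects. A hedged pure-Python sketch of that shape comparison (the helper and shape dicts are hypothetical; the numbers come from the error message):

```python
def check_param_shapes(checkpoint_shapes, model_shapes):
    """Return human-readable mismatches between checkpoint and model shapes,
    mimicking the check that produces the RuntimeError above."""
    mismatches = []
    for name, ckpt_shape in checkpoint_shapes.items():
        model_shape = model_shapes.get(name)
        if model_shape is not None and model_shape != ckpt_shape:
            mismatches.append(
                f"size mismatch for {name}: copying a param with shape "
                f"{ckpt_shape} from checkpoint, the shape in current model "
                f"is {model_shape}"
            )
    return mismatches

errors = check_param_shapes(
    {"word_emb_layer.embedding.weight": (140384, 100)},
    {"word_emb_layer.embedding.weight": (71222, 100)},
)
```

In practice this mismatch usually means the model was constructed from a different vocabulary file (or config) than the one the checkpoint was trained with, so the fix is to load the checkpoint's own vocab/config rather than to resize the embedding.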
When I installed it and ran `from elmoformanylangs import Embedder` (allennlp 2.9.3, python=3.8), an error occurred: File E:\Anaconda3\envs\Elmol\lib\site-packages\elmoformanylangs-0.0.4.post2-py3.8.egg\elmoformanylangs\elmo.py:12, in 10 from .modules.embedding_layer import EmbeddingLayer 11 from .utils import dict2namedtuple...
The error is as follows: RuntimeError: Error(s) in loading state_dict for ConvTokenEmbedder: size mismatch for word_emb_layer.embedding.weight: copying a param with shape torch.Size([71222, 100]) from checkpoint, the shape in current model...
Hi. In the TF Hub version of ELMo (https://tfhub.dev/google/elmo/2) there is an output like the one you provide: elmo, the weighted sum of the 3 layers, where the weights are trainable. This tensor has...
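The "weighted sum of the 3 layers with trainable weights" refers to ELMo's scalar mix: the per-layer weights are softmax-normalized and the result is scaled by a single scalar. A minimal sketch in plain Python; the layer vectors, weights, and the `scalar_mix` helper are illustrative assumptions, not any library's API:

```python
import math

def scalar_mix(layers, weights, gamma=1.0):
    """Weighted sum of biLM layers: softmax over `weights`, scaled by gamma.
    `layers` is a list of equal-length vectors (one per biLM layer)."""
    exps = [math.exp(w) for w in weights]
    total = sum(exps)
    norm = [e / total for e in exps]  # softmax-normalized layer weights
    dim = len(layers[0])
    return [gamma * sum(norm[k] * layers[k][i] for k in range(len(layers)))
            for i in range(dim)]

# Three toy "layers" of a 2-dimensional representation.
layers = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
mixed = scalar_mix(layers, weights=[0.0, 0.0, 0.0])  # equal weights -> mean
```

With all weights equal, the softmax assigns 1/3 to each layer, so the mix is just the element-wise mean of the three layers; during training the weights (and gamma) would be learned per downstream task.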
I have installed the package in my anaconda env without problems. When I use it in my Python 3.8 code, at the line "from elmofromanylangs import Embedder" I get the...