En Ouyang
Hi @ziyin-dl, I tried the example on text8. With the default config, vocab == 10000, the dimension I got is 123; when I changed vocab to 15000, the program uses all the...
OK, in my understanding the best way is: once `min_count` is set, you can get the number of words appearing more than `min_count` times, and then that number should be...
OK, I understand the logic now: the real size of the vocabulary is decided by the interplay of `min_count` and `vocab_size`. For example, if I set `min_count` to 100 and `vocab_size` to 10000,...
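For anyone following this thread, here is a minimal sketch of how that interplay between `min_count` and `vocab_size` could work; the file path and helper name are my own assumptions for illustration, not code from the repo.

```python
from collections import Counter

def build_vocab(tokens, min_count=100, vocab_size=10000):
    """Illustrative only: effective vocabulary is bounded by both settings."""
    counts = Counter(tokens)
    # Keep only words that occur at least `min_count` times
    # (whether the real implementation uses > or >= may differ).
    frequent = [w for w, c in counts.items() if c >= min_count]
    # Then cap the list at `vocab_size`, keeping the most frequent words first.
    frequent.sort(key=lambda w: counts[w], reverse=True)
    return frequent[:vocab_size]

# Example with a local copy of the text8 corpus (path assumed):
with open("text8") as f:
    tokens = f.read().split()
vocab = build_vocab(tokens, min_count=100, vocab_size=10000)
# The effective size is min(#words passing min_count, vocab_size).
print(len(vocab))
```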
Ok, thanks! 😃
Thanks for your reply, so the complete process for embedding training is:
1. prepare all sentences, then break them into characters (or words)
2. create a mapping dictionary like...
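To make steps 1 and 2 concrete, here is a small sketch of the tokenization and mapping dictionary they describe; the function and variable names are made up for the example and are not taken from the project.

```python
def tokenize(sentence, by_char=False):
    # Break a sentence into characters or whitespace-separated words.
    return list(sentence) if by_char else sentence.split()

def build_mapping(sentences, by_char=False):
    # Map each distinct token to an integer id, in order of first appearance.
    token2id = {}
    for sentence in sentences:
        for token in tokenize(sentence, by_char=by_char):
            if token not in token2id:
                token2id[token] = len(token2id)
    return token2id

sentences = ["the quick brown fox", "the lazy dog"]
mapping = build_mapping(sentences)
print(mapping)  # {'the': 0, 'quick': 1, 'brown': 2, 'fox': 3, 'lazy': 4, 'dog': 5}
```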
Ok, thanks!