TalSchuster

8 comments by TalSchuster

Thanks for your help. It is with Ubuntu 16.04:
```
Distributor ID: Ubuntu
Description:    Ubuntu 16.04.4 LTS
Release:        16.04
Codename:       xenial
```
and it's on a server, but I think...

Hi @scofield7419, that's great! I'm sure people will find the models and alignments for more languages useful. The supervised alignment computation was done with the [MUSE](https://github.com/facebookresearch/MUSE) repository. Their repo is...
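For context, the supervised mode of MUSE solves an orthogonal Procrustes problem over a bilingual seed dictionary. Below is a minimal sketch of that step, not MUSE's actual code; the function and variable names are illustrative, and it assumes you already have paired source/target vectors for the dictionary entries.

```python
import numpy as np

def procrustes_align(src_vecs, tgt_vecs):
    """Orthogonal Procrustes: find an orthogonal W minimizing ||W @ x_src - x_tgt||
    over the paired dictionary vectors.
    src_vecs, tgt_vecs: arrays of shape (n_pairs, dim)."""
    # SVD of the cross-covariance between target and source vectors.
    u, _, vt = np.linalg.svd(tgt_vecs.T @ src_vecs)
    return u @ vt  # (dim, dim) mapping from source space to target space

# Hypothetical usage with paired anchor/dictionary vectors:
# W = procrustes_align(src_dict_vecs, tgt_dict_vecs)
# aligned_src = src_vecs @ W.T  # rows mapped into the target space
```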

Yes, that sounds correct. I've uploaded the anchors for the provided English model, so it will save you the time of extracting the English anchors. There's a link now in...

For joint training, you can check [this paper](https://arxiv.org/pdf/1909.08744.pdf). In short, on average it can provide better results, but the effectiveness varies across languages. Still, it is mostly worth learning and...

Hi, thanks for your feedback. The visualizations were done using dimensionality reduction via PCA. The PCA was computed over the anchor space and then applied to the displayed embeddings...
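A minimal sketch of that procedure, using scikit-learn and placeholder arrays (the anchor and embedding matrices here are illustrative, not the ones from the repo): fit the projection on the anchors only, then project the embeddings you want to plot.

```python
import numpy as np
from sklearn.decomposition import PCA

anchors = np.random.randn(1000, 768)    # placeholder anchor matrix (n_anchors, dim)
embeddings = np.random.randn(50, 768)   # placeholder embeddings to visualize

pca = PCA(n_components=2)
pca.fit(anchors)                        # PCA computed over the anchor space
points_2d = pca.transform(embeddings)   # then applied to the displayed embeddings
```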

Hi, for English we used the identity matrix (divided by the empirical norm). You can check `gen_anchors_bert.py` for computing it for BERT. Actually, I ran it for multilingual BERT for...
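As a rough illustration of that idea, and assuming "empirical norm" refers to the average L2 norm of the anchor vectors (the exact computation is in `gen_anchors_bert.py`), the English mapping would look something like:

```python
import numpy as np

def english_mapping(anchors):
    """Sketch: identity mapping scaled by the empirical norm of the anchors.
    Assumption: 'empirical norm' = mean L2 norm of the anchor vectors;
    see gen_anchors_bert.py in the repo for the exact computation."""
    dim = anchors.shape[1]
    empirical_norm = np.linalg.norm(anchors, axis=1).mean()
    return np.eye(dim) / empirical_norm
```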

Sounds great. [This](https://www.dropbox.com/s/qhxdwx7gxumxl9i/bert_en_es_-1.pth?dl=0) is a mapping from Spanish to English for the last layer. Actually, for multilingual BERT the norm doesn't seem to vary (probably because it was jointly trained), so...
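If the linked `.pth` file stores the alignment matrix as a plain PyTorch tensor (an assumption; check the repo's loading code for the actual format), applying it to last-layer Spanish embeddings would look roughly like:

```python
import torch

# Assumption: the .pth file contains a (dim, dim) alignment matrix as a tensor.
mapping = torch.load("bert_en_es_-1.pth", map_location="cpu")

# Hypothetical Spanish last-layer embeddings of shape (n_tokens, dim).
es_embeddings = torch.randn(10, 768)

# Map Spanish embeddings into the English space (convention assumed: W @ x).
en_aligned = es_embeddings @ mapping.t()
```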

Reveal solution
```python
def sol():
    # Concatenates four '7's with five '1's ("777711111") and casts to int.
    return int("7" * int("4") + "1" * (len(str(123456789)) - int("4")))
```