udify
A single model that parses Universal Dependencies across 75 languages. Given a sentence, it jointly predicts part-of-speech tags, morphological features, lemmas, and dependency trees.
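For anyone who wants to try the model programmatically rather than through `predict.py`, here is a minimal sketch using AllenNLP's archive-loading API. The archive path, the `udify_predictor` name, and the `{"sentence": ...}` input format are assumptions, not documented interfaces.

```python
# Minimal sketch: load a released UDify archive and predict one sentence.
# The archive filename and the predictor name "udify_predictor" are assumptions;
# predict.py in the repo is the supported entry point.
from allennlp.models.archival import load_archive
from allennlp.predictors.predictor import Predictor

import udify  # assumed to register UDify's dataset readers, model, and predictor

archive = load_archive("udify-model.tar.gz")  # hypothetical path to the release archive
predictor = Predictor.from_archive(archive, "udify_predictor")  # predictor name assumed

# Input format is an assumption; the result should carry the jointly predicted
# POS tags, morphological features, lemmas, and dependency arcs per token.
output = predictor.predict_json({"sentence": "The quick brown fox jumps over the lazy dog."})
print(output)
```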
Hi, really impressed by your work. I was trying to use the pretrained multilingual BERT model as a tagger for a downstream task, but was not able to download the pre-trained model...
Hi, there seems to be a bug in the calculation of `final_window_start`: https://github.com/Hyperparticle/udify/blob/cbabef684b524b912fbe9fcfe1509de5c79b08a0/udify/modules/bert_pretrained.py#L488-L509 On the test case from your comment, `final_window_start` is greater than `full_seq_len`: ```python full_seq_len = 16 max_pieces...
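For context on the failure mode, here is an illustrative sketch (not the code in `bert_pretrained.py`) of sliding-window starts over a wordpiece sequence, with the final start clamped so it can never land beyond the sequence; the window length and stride here are arbitrary assumptions.

```python
from typing import List

def window_starts(full_seq_len: int, window_len: int, stride: int) -> List[int]:
    """Start offsets of sliding windows over a wordpiece sequence.

    Illustrative only: clamps the final start so no window begins past the
    end of the sequence, which is the failure mode reported above.
    """
    starts = list(range(0, full_seq_len, stride))
    # The last window should end exactly at the sequence boundary rather than
    # start beyond it.
    final_window_start = max(full_seq_len - window_len, 0)
    starts = [min(s, final_window_start) for s in starts]
    # Clamping may collapse trailing windows onto the same start; deduplicate.
    return sorted(set(starts))

# Example with a short sequence (16 pieces), window of 8, stride of 6:
print(window_starts(16, 8, 6))  # [0, 6, 8]
```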
Hello, and thanks for making this parser available! I wrote a shell script that calls `predict.py` from an external directory, but I encountered the following error: ``` Traceback (most recent...
To my knowledge, the pretrained models don't currently have support for the SIGMORPHON 2019 shared task; I was wondering if it would be possible to release the pretrained models for...
Fixes Issue #31. In the future we could probably update the allennlp version and remove the explicit `overrides` version pin.
Is there an example config somewhere showing how to fine-tune on a specific treebank using BERT weights saved from fine-tuning on all UD treebanks combined (using the saved pretrained models)?...
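I don't know of an official recipe, but one plausible route (a sketch only, assuming AllenNLP 0.9's `fine-tune` subcommand and hypothetical file paths) is to start from the released multilingual archive and override the data paths to point at a single treebank:

```python
# Hedged sketch: fine-tune the released multilingual archive on one treebank.
# All paths below are hypothetical; the flags follow AllenNLP 0.9's CLI.
import json
import subprocess

overrides = json.dumps({
    "train_data_path": "data/ud/UD_English-EWT/en_ewt-ud-train.conllu",
    "validation_data_path": "data/ud/UD_English-EWT/en_ewt-ud-dev.conllu",
})

subprocess.run(
    [
        "allennlp", "fine-tune",
        "-m", "udify-model.tar.gz",       # saved multilingual UDify archive (hypothetical name)
        "-c", "config/udify_base.json",   # base training config (hypothetical path)
        "-s", "logs/en_ewt_finetune",
        "-o", overrides,
        "--include-package", "udify",
    ],
    check=True,
)
```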