
Error while loading model

Open LeonHammerla opened this issue 2 years ago • 2 comments

When I try the example, or my own model, I get the following error while loading the model:

predictor = predictors.SrlTransformersPredictor.from_path("/path/to/model/srl_bert_base_conll2012.tar.gz", "transformer_srl")

==> this is for my dependency model; for the example model the second argument changes to transformer_srl_span

allennlp.common.checks.ConfigurationError: transformer_srl_dependency not in acceptable choices for dataset_reader.type: ['babi', 'conll2003', 'interleaving', 'multitask', 'sequence_tagging', 'sharded', 'text_classification_json', 'multitask_shim', 'ptb_trees', 'semantic_dependencies', 'srl', 'universal_dependencies', 'sst_tokens', 'coref', 'preco', 'winobias', 'masked_language_modeling', 'next_token_lm', 'simple_language_modeling', 'copynet_seq2seq', 'seq2seq', 'cnn_dm', 'swag', 'commonsenseqa', 'piqa', 'fake', 'quora_paraphrase', 'snli', 'drop', 'qangaroo', 'quac', 'squad', 'squad1', 'squad2', 'transformer_squad', 'triviaqa', 'ccgbank', 'conll2000', 'ontonotes_ner', 'gqa', 'vqav2', 'visual-entailment']. You should either use the --include-package flag to make sure the correct module is loaded, or use a fully qualified class name in your config file like {"model": "my_module.models.MyModel"} to have it imported automatically.

LeonHammerla avatar May 03 '22 19:05 LeonHammerla

I have the same issue. Did you manage to fix this? @LeonHammerla

Lisa-aa avatar Nov 29 '23 14:11 Lisa-aa

I found out that this issue occurs when you do not use the full import: from transformer_srl import dataset_readers, models, predictors. Your editor or linter will flag dataset_readers and models as unused, but they are necessary, because importing them is what registers those classes. @LeonHammerla

Lisa-aa avatar Nov 29 '23 15:11 Lisa-aa
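For anyone wondering why "unused" imports fix this: AllenNLP resolves the `dataset_reader.type` string in the config against a registry that is populated as a side effect of importing the modules that define those classes. The snippet below is a minimal, self-contained sketch of that registration pattern; the class and method names mirror AllenNLP's `Registrable` API, but this is a toy illustration, not the real library.

```python
# Minimal sketch of the import-side-effect registry pattern AllenNLP uses.
# This is a self-contained toy, not the real allennlp.common.Registrable.

class Registrable:
    _registry = {}

    @classmethod
    def register(cls, name):
        """Decorator that adds a subclass to the registry under `name`."""
        def decorator(subclass):
            cls._registry[name] = subclass
            return subclass
        return decorator

    @classmethod
    def by_name(cls, name):
        """Look up a registered class; this is what config loading does."""
        if name not in cls._registry:
            raise KeyError(
                f"{name!r} not in acceptable choices: {sorted(cls._registry)}"
            )
        return cls._registry[name]

# Simulates `from transformer_srl import dataset_readers`: merely importing
# the defining module runs the decorator and populates the registry. Without
# that import, by_name() fails with a "not in acceptable choices" error just
# like the one in this issue.
@Registrable.register("transformer_srl_dependency")
class TransformerSrlDependencyReader:
    pass

reader_cls = Registrable.by_name("transformer_srl_dependency")
```

So even though `dataset_readers` and `models` are never referenced by name in your script, importing them is what makes `"transformer_srl_dependency"` (or `"transformer_srl_span"`) an acceptable choice when `from_path` parses the archived config.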