whisere

Results: 38 comments of whisere

If the TARGET_ERROR_RATE can't be reached after training for a long time, is it right to kill the training process and then run:

```shell
lstmtraining \
  --stop_training \
  --continue_from data/eeboecco/checkpoints/eeboecco_checkpoint \
  --traineddata...
```
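For reference, stopping the trainer and converting the latest checkpoint into a final model is the usual way to end a plateaued run. A sketch of the full finalization command, where the output path and starter traineddata file are assumptions based on the `data/eeboecco` layout above, not paths from the original comment:

```shell
# Finalize a plateaued training run: convert the best checkpoint
# into a .traineddata file usable by tesseract.
# Paths are illustrative, following the data/eeboecco layout above --
# adjust them to your own training directory.
lstmtraining \
  --stop_training \
  --continue_from data/eeboecco/checkpoints/eeboecco_checkpoint \
  --traineddata data/eeboecco/eeboecco.traineddata \
  --model_output data/eeboecco/eeboecco.traineddata
```

`--traineddata` points at the starter traineddata the run began from, while `--model_output` names the finished model file.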

You need to run this for **adapter_sent**:

```shell
pip install adapter-transformers
```

I used this step to set up the layers for both classifiers: https://inception-project.github.io/releases/0.19.0/docs/user-guide.html#sect_projects_layers and added "neutral" to the labels (labels=["negative", "neutral", "positive"]) in the AdapterSentenceClassifier above...
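For context, the registration in wsgi.py then looks roughly like the sketch below, following the `server.add_classifier` pattern quoted from the repository further down. The three-class label list is an assumption of this comment thread; the sst-2 adapter itself only predicts two classes, so adding "neutral" only makes sense for an adapter actually trained with three labels.

```python
# Sketch of registering the adapter-based recommender in wsgi.py.
# Imports are as in the repository's own wsgi.py (Server and
# AdapterSentenceClassifier come from the inception-external-recommender
# package); they are omitted here rather than guessed.

server.add_classifier(
    "adapter_sent",
    AdapterSentenceClassifier(
        base_model_name="bert-base-uncased",
        adapter_name="sentiment/sst-2@ukp",
        # Add a third label only if the adapter was trained with one:
        labels=["negative", "positive"],
    ),
)
```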

Thanks for the reply. Yes, I am trying to use the recommenders in INCEpTION. All the instructions I found are for spacy_ner and spacy_pos; I have successfully set those up,...

Thanks, we can use the fixed tagset. So is it right to use these settings: https://inception-project.github.io/releases/0.19.0/docs/user-guide.html#sect_projects_layers in INCEpTION for using sklearn_sentence (SklearnSentenceClassifier) and adapter_sent (AdapterSentenceClassifier) that come with this git...

How do we find the adapter information, such as the one below?

```python
server.add_classifier(
    "adapter_sent",
    AdapterSentenceClassifier(
        base_model_name="bert-base-uncased",
        adapter_name="sentiment/sst-2@ukp",
        labels=["negative", "positive"]
    ),
)
```

Thanks. Is it okay to add "neutral" to the...

The above is from https://github.com/inception-project/inception-external-recommender/blob/master/wsgi.py

Thank you very much, I will try that.

Any idea why it reported "No adapter with name 'sentiment/hinglish-twitter-sentiment@nirantk' was found in the adapter index." when using https://adapterhub.ml/adapters/nirantk/bert-base-multilingual-uncased-hinglish-sentiment/ with:

```python
server.add_classifier(
    "adapter_sent1",
    AdapterSentenceClassifier(
        base_model_name="bert-base-multilingual-uncased",
        adapter_name="sentiment/hinglish-twitter-sentiment@nirantk",
        labels=["negative", "neutral", "positive"]
    ),
)
```

...
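As a side note, the identifiers used above follow a `task/subtask@organization` pattern, and the lookup fails when no entry with exactly that string exists in the hub index. A small helper that splits such an identifier into its parts, purely illustrative and not part of the adapter-transformers API:

```python
def parse_adapter_id(adapter_id: str):
    """Split an AdapterHub-style identifier such as
    'sentiment/sst-2@ukp' into (task, subtask, org).

    Purely illustrative -- not an adapter-transformers function.
    """
    name, _, org = adapter_id.partition("@")
    task, _, subtask = name.partition("/")
    return task, subtask, org or None

print(parse_adapter_id("sentiment/hinglish-twitter-sentiment@nirantk"))
# → ('sentiment', 'hinglish-twitter-sentiment', 'nirantk')
```

All three parts must match an index entry exactly; a typo in any of them produces the "was found in the adapter index" error above.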

Please see https://github.com/Adapter-Hub/Hub/issues/27