bert-sklearn
A sklearn wrapper for Google's BERT model
This PR adds two models: bert-base-portuguese-cased and bert-large-portuguese-cased, from https://huggingface.co/neuralmind/bert-base-portuguese-cased and https://huggingface.co/neuralmind/bert-large-portuguese-cased.
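For anyone trying the new checkpoints, a minimal sketch of how they would be selected through the usual `bert_model` argument; the exact registered names are an assumption based on this PR, and the data is toy data:

```python
from bert_sklearn import BertClassifier

# Toy data for illustration only.
X_train = ["um exemplo positivo", "um exemplo negativo"]
y_train = ["pos", "neg"]

# Assumes the PR registers the checkpoint under this name.
model = BertClassifier(bert_model="bert-base-portuguese-cased")
model.fit(X_train, y_train)
```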
I trained this model on Google Colab on a GPU. I am trying to load the model on my local machine (CPU) using load_model(), but I am unable to load it. I am...
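One common cause is that a checkpoint saved with CUDA tensors cannot be deserialized on a CPU-only machine without remapping. A minimal workaround sketch, assuming the saved file is an ordinary `torch.save` checkpoint (an assumption about bert-sklearn's on-disk format; filenames are placeholders):

```python
import torch
from bert_sklearn import load_model

# Sketch: remap CUDA tensors to CPU and re-save, then reload through
# bert-sklearn. Assumes the file is a plain torch.save checkpoint.
state = torch.load("bert_model.bin", map_location="cpu")
torch.save(state, "bert_model_cpu.bin")

model = load_model("bert_model_cpu.bin")
```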
According to the subject.
How do I change the "hidden_dropout_prob" or "attention_probs_dropout_prob" values? I used the code below. Thank you! from bert_sklearn import BertClassifier from bert_sklearn import BertRegressor from bert_sklearn import load_model # define model model =...
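Whether those dropout probabilities are exposed as estimator hyperparameters depends on the bert-sklearn version. Since BertClassifier follows the scikit-learn estimator API, `get_params()`/`set_params()` is one way to check; the dropout parameter name in the commented line is hypothetical:

```python
from bert_sklearn import BertClassifier

model = BertClassifier()

# List the hyperparameters this estimator actually exposes; whether the
# dropout probabilities appear here depends on the installed version.
print(sorted(model.get_params().keys()))

# If they are exposed, they can be set the sklearn way:
# model.set_params(hidden_dropout_prob=0.2)  # hypothetical parameter name
```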
Your work, `bert-sklearn`, is a really great piece of code! Thank you! :) It seems to me that it does not work with `torch>1.4.0`, so the `torch` version needs to be...
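If the incompatibility is confirmed, pinning the dependency is the usual stopgap until the code is updated; for example, in a requirements file:

```
# requirements.txt sketch: pin torch until compatibility with newer releases is confirmed
torch<=1.4.0
```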
Some parameters, such as "epochs", should be arguments to the fit() method rather than to the model constructor.
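For context, a sketch of the current constructor-level API next to the proposed per-call form; the `fit(..., epochs=...)` signature is hypothetical, not something the library currently provides:

```python
from bert_sklearn import BertClassifier

# Toy data for illustration only.
X_train = ["good example", "bad example"]
y_train = [1, 0]

# Current API: the training budget is fixed when the estimator is built.
model = BertClassifier(epochs=4)
model.fit(X_train, y_train)

# Proposed API (hypothetical): choose the budget per call, so the same
# estimator can be refit with a different number of epochs.
# model.fit(X_train, y_train, epochs=8)
```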
Hi! I'm using this model for a multiclass text classification problem, and the results I get are not as good as I expected. I suspect that one problem might be...
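Without the full issue text it is hard to say, but for a multiclass setup the knobs most often worth revisiting are sequence length, learning rate, and the number of epochs. A sketch with toy data; the hyperparameter names follow the bert-sklearn README, so treat them as assumptions for your installed version:

```python
from bert_sklearn import BertClassifier

# Toy multiclass data for illustration only.
X_train = ["great movie", "loved it", "terrible plot",
           "awful pacing", "average acting", "it was fine"]
y_train = ["pos", "pos", "neg", "neg", "neutral", "neutral"]

# Values are examples, not recommendations.
model = BertClassifier(
    bert_model="bert-base-uncased",
    max_seq_length=128,
    learning_rate=2e-5,
    epochs=4,
)
model.fit(X_train, y_train)
print(model.predict(["a surprisingly good film"]))
```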
I use bert-sklearn in a benchmark scenario, so I repeatedly construct and use BertClassifiers, like this: m1 = BertClassifier(bert_model="biobert-base-cased") m1.fit(..) m1.predict(..) m1.save(..) .... m2 = BertClassifier() m2.fit(..) m2.predict(..)...
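If the problem is that repeated construction eventually exhausts GPU memory, one mitigation sketch is to release each estimator between runs; the cleanup step is an assumption about the cause, not a documented fix, and the data is toy data:

```python
import gc
import torch
from bert_sklearn import BertClassifier

# Toy data for illustration only.
X_train = ["protein binding observed", "no interaction detected"]
y_train = [1, 0]

for name in ["biobert-base-cased", "bert-base-uncased"]:
    model = BertClassifier(bert_model=name)
    model.fit(X_train, y_train)
    model.save(f"{name}.bin")

    # Assumption: freeing the estimator and the CUDA cache between runs
    # keeps repeated constructions from accumulating GPU memory.
    del model
    gc.collect()
    if torch.cuda.is_available():
        torch.cuda.empty_cache()
```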

Thank you, Charles, for providing this awesome package. I run into an error when reloading the model using `load_model()`. How should I fix it?
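Since the traceback is not shown, here is a minimal save/reload round trip for comparison; if this works but the real checkpoint does not, a bert-sklearn or torch version mismatch between the saving and loading environments is a likely suspect (an assumption). Data and filename are placeholders:

```python
from bert_sklearn import BertClassifier, load_model

# Toy data for illustration only.
X = ["good", "bad"]
y = [1, 0]

model = BertClassifier()
model.fit(X, y)

# Save and reload with the same library versions that produced the file.
model.save("my_model.bin")
restored = load_model("my_model.bin")
print(restored.predict(X))
```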