Retrain `bert_tiny_uncased_en_sst2` BertClassifier to reflect the dropout change
We added a dropout layer to `keras_nlp.models.BertClassifier`, so we need to update the presets accordingly.
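For context, a minimal sketch of how the new dropout option surfaces on the classifier, assuming the current `keras_nlp` API (the `DROPOUT` constant and `build_classifier` helper are my own illustration, not from this thread; running it requires `keras-nlp` installed):

```python
# Hypothetical sketch: the classifier head now applies dropout to the pooled
# output before the final Dense layer, exposed via a `dropout` argument.
DROPOUT = 0.1  # assumed default rate; check the keras_nlp docs for your version

def build_classifier(dropout: float = DROPOUT):
    """Build a BertClassifier whose head applies dropout before the Dense layer."""
    import keras_nlp  # deferred so this sketch loads without keras-nlp installed

    return keras_nlp.models.BertClassifier.from_preset(
        "bert_tiny_en_uncased",
        num_classes=2,  # SST2 is binary sentiment
        dropout=dropout,
    )
```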
@jbischof Jon, I believe you still have the training script?
Yes it's in our repo (link).
My goal was to have a task preset primarily for API development and tutorial writing. It's OK if these models get better over time.
Hi @chenmoneygithub, I would like to solve this issue. Could you please guide me a little? As I understand it, I need to train `BertClassifier` twice with different dropout values to see how this change affects the final trained model... is that right?
cc: @jbischof
@susnato Thanks for your interest!
To clarify: we finetuned BERT on SST2 earlier to make `bert_tiny_uncased_en_sst2`, but at that time our `BertClassifier` did not have the dropout layer. So what we need to do now is finetune `BertClassifier` on SST2 again. Your work will include:
- Write a colab that finetunes `keras_nlp.models.BertClassifier` on SST2.
- Report the evaluation score on the validation set.
- Share the colab with us by opening a PR to
Then we will run your colab to generate the checkpoint and upload it to our Google Cloud Storage. Since no code will be checked in, we will explicitly credit you in the code and in our documentation on keras-io.
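The steps above could be sketched roughly as follows, assuming the current `keras_nlp` and `tensorflow_datasets` APIs. The hyperparameters and the `finetune_on_sst2` helper are my own choices for illustration, not prescribed by this thread:

```python
# Hedged sketch of the requested colab: finetune BertClassifier on SST2 and
# report the validation score. Requires keras-nlp, tensorflow, and
# tensorflow-datasets; expects a GPU for reasonable training time.
PRESET = "bert_tiny_en_uncased"  # assumed base preset name
NUM_CLASSES = 2                  # SST2 is binary sentiment
BATCH_SIZE = 32
EPOCHS = 2

def finetune_on_sst2():
    # Deferred imports so the sketch loads without the heavy dependencies.
    import keras_nlp
    import tensorflow as tf
    import tensorflow_datasets as tfds

    # GLUE/SST2 batches are dicts with "sentence" and "label" features;
    # map them to (text, label) pairs.
    train_ds, val_ds = tfds.load(
        "glue/sst2", split=["train", "validation"], batch_size=BATCH_SIZE
    )
    to_pair = lambda x: (x["sentence"], x["label"])
    train_ds = train_ds.map(to_pair)
    val_ds = val_ds.map(to_pair)

    # The preset bundles a preprocessor, so raw strings can be fed directly.
    classifier = keras_nlp.models.BertClassifier.from_preset(
        PRESET, num_classes=NUM_CLASSES
    )
    classifier.compile(
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        optimizer=tf.keras.optimizers.Adam(5e-5),
        metrics=["accuracy"],
    )
    classifier.fit(train_ds, validation_data=val_ds, epochs=EPOCHS)

    # Report the evaluation score on the validation set.
    loss, accuracy = classifier.evaluate(val_ds)
    return accuracy
```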
@chenmoneygithub Thanks for replying! I will train it for 2 epochs, since 5 epochs was taking a lot of time.
Hey, I would like to take this up.
Hi @jayam30, as you can see, I have already submitted a PR for this issue and we are working on it. You can choose other issues that need an immediate fix from this list.
This issue is stale because it has been open for 180 days with no activity. It will be closed if no further activity occurs. Thank you.