
ALBERT model Pretraining and Fine Tuning using TF2.0

24 ALBERT-TF2.0 issues

Bumps [tensorflow-gpu](https://github.com/tensorflow/tensorflow) from 2.0.0 to 2.7.2. Release notes, sourced from tensorflow-gpu's releases: TensorFlow 2.7.2. This release introduces several vulnerability fixes: Fixes a code injection in saved_model_cli (CVE-2022-29216) Fixes...

dependencies

Is it possible to do this and could you please, if possible, provide some general instructions? Thanks in anticipation.

I am doing pre-training from scratch. It seems that training has started, since the GPUs are being used, but nothing is shown on the terminal except this: ``` ***** Number of cores used...
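If no training log appears, one common cause is that the TF2 and absl loggers default to WARNING level. A minimal sketch (not from this repo) of raising their verbosity so progress messages reach the terminal:

```python
# A minimal sketch, not from this repo: raise the TF and absl logging
# verbosity so that training progress is printed to the terminal.
import logging

import tensorflow as tf
from absl import logging as absl_logging

tf.get_logger().setLevel(logging.INFO)          # TensorFlow's own Python logger
absl_logging.set_verbosity(absl_logging.INFO)   # absl logging used by much TF model code
```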

Hi there, I am having some issues getting the model to fine-tune. I'm somewhat confused and could use some help. Is there a forum where I could ask for help?...

In the README, performance on the CoLA task is reported as accuracy. But this task is conventionally measured by Matthews correlation. Does it mean the same thing, i.e., is Matthew_corr simply being reported under the name accuracy?...
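For reference, the two metrics are different quantities. A hedged sketch with toy labels (illustration only), using sklearn's `matthews_corrcoef` for the standard CoLA metric:

```python
# Hedged sketch with toy labels: Matthews correlation (the standard CoLA
# metric) and plain accuracy are different quantities.
from sklearn.metrics import accuracy_score, matthews_corrcoef

y_true = [1, 1, 0, 1, 0, 1, 1, 0]   # toy gold labels, illustration only
y_pred = [1, 1, 1, 1, 0, 1, 1, 1]   # toy predictions

print("accuracy:", accuracy_score(y_true, y_pred))              # 0.75
print("matthews_corrcoef:", matthews_corrcoef(y_true, y_pred))  # ~0.49, range [-1, 1]
```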

I can't get past this error with run_classifier.py: `AssertionError: Nothing except the root object matched a checkpointed value.` Typically this means that the checkpoint does not match the Python program....
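One way to debug this kind of mismatch, offered as a hedged sketch rather than this repo's own procedure, is to list what the checkpoint actually contains and compare it with the variables the model builds; the checkpoint path below is a placeholder.

```python
# Hedged debugging sketch, not this repo's procedure: list the variables
# stored in the checkpoint and compare them with what the model builds.
import tensorflow as tf

ckpt_prefix = "path/to/albert_model.ckpt"   # placeholder checkpoint prefix

for name, shape in tf.train.list_variables(ckpt_prefix):
    print(name, shape)
```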

Hi, can you please share the ALBERT xxlarge model fine-tuned on SQuAD 2 and, if possible, a REST API like your previous BERT-SQuAD (https://github.com/kamalkraj/BERT-SQuAD), or at least the inference code...
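In the meantime, a hedged sketch of what such an inference endpoint could look like, using the Hugging Face `transformers` question-answering pipeline as a stand-in for this repo's own inference code; the Flask route and model identifier are assumptions, not this repo's API.

```python
# Hedged sketch, not this repo's API: a minimal question-answering REST
# endpoint built on the Hugging Face transformers pipeline. The model
# identifier below is a placeholder; point it at an ALBERT checkpoint
# fine-tuned on SQuAD 2.0.
from flask import Flask, jsonify, request
from transformers import pipeline

app = Flask(__name__)
qa = pipeline("question-answering", model="path-or-hub-id-of-albert-squad2")  # placeholder

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()
    answer = qa(question=payload["question"], context=payload["context"])
    return jsonify(answer)  # {"answer": ..., "score": ..., "start": ..., "end": ...}

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```

A client would POST JSON with `question` and `context` fields to `/predict` and get the extracted answer span and score back.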

I have generated pretraining data using the steps given in this repo. I am doing this for Hindi with 22 GB of data. Generating the pretraining data itself took 1...
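A common way to speed this up is to split the corpus into shards and process them in parallel. The sketch below assumes the data-generation script follows the upstream ALBERT `create_pretraining_data.py` CLI; the script name, flags, and paths are assumptions, not this repo's documented interface.

```python
# Hedged sketch: shard the corpus and run the data-generation script on each
# shard in parallel. Script name and flags follow the upstream ALBERT
# create_pretraining_data.py conventions and are assumptions about this repo.
import glob
import subprocess
from concurrent.futures import ProcessPoolExecutor

def build_shard(shard_path):
    subprocess.run(
        [
            "python", "create_pretraining_data.py",       # assumed script name
            "--input_file", shard_path,
            "--output_file", shard_path + ".tfrecord",
            "--spm_model_file", "vocab/hindi_sp.model",    # hypothetical SentencePiece model
            "--max_seq_length", "512",
        ],
        check=True,
    )

if __name__ == "__main__":
    shards = sorted(glob.glob("corpus_shards/part-*.txt"))  # corpus pre-split into shards
    with ProcessPoolExecutor(max_workers=8) as pool:
        list(pool.map(build_shard, shards))
```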