Pretraining on own dataset
I would like to pretrain LayoutLMv2 on my own dataset, rather than simply finetuning the existing model on a downstream task. Will the pretraining code be released?
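In the meantime, here is a rough sketch of what I imagine the masked visual-language modeling part of the objective could look like, built on the LayoutLMv2Model class from Hugging Face transformers. This is only an assumption on my side, not the official recipe: the paper's pretraining also includes text-image alignment and text-image matching tasks, which are not shown, and the data pipeline producing the input_ids, bbox, and image tensors is assumed to exist.

```python
# Sketch only: a custom masked-LM head on top of LayoutLMv2Model from
# Hugging Face transformers (requires detectron2 for the visual backbone).
# This is NOT the released pretraining code; names and the training loop
# around it are placeholders.
import torch
import torch.nn as nn
from transformers import LayoutLMv2Config, LayoutLMv2Model


class LayoutLMv2ForMaskedVLM(nn.Module):
    """LayoutLMv2 encoder plus a linear LM head for masked token prediction."""

    def __init__(self, config: LayoutLMv2Config):
        super().__init__()
        self.layoutlmv2 = LayoutLMv2Model(config)
        self.lm_head = nn.Linear(config.hidden_size, config.vocab_size)
        self.loss_fct = nn.CrossEntropyLoss()  # positions labeled -100 are ignored

    def forward(self, input_ids, bbox, image, attention_mask=None, labels=None):
        outputs = self.layoutlmv2(
            input_ids=input_ids,
            bbox=bbox,
            image=image,
            attention_mask=attention_mask,
        )
        # the encoder output concatenates text tokens and visual tokens;
        # keep only the text positions for the masked-LM loss
        seq_len = input_ids.size(1)
        text_hidden = outputs.last_hidden_state[:, :seq_len, :]
        logits = self.lm_head(text_hidden)

        loss = None
        if labels is not None:
            loss = self.loss_fct(
                logits.view(-1, logits.size(-1)), labels.view(-1)
            )
        return loss, logits


# Usage sketch: labels would come from the usual BERT-style masking of
# input_ids (masked positions keep their original ids, everything else -100).
config = LayoutLMv2Config()
model = LayoutLMv2ForMaskedVLM(config)
```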
This would be highly valuable. A sample of the pretraining dataset would be helpful as well.
I am also hoping for the pretraining code for LayoutLMv2.