private-transformers
Cannot reproduce paper results!
I ran your code to reproduce the results in your paper and hit several issues:
- The file ./data/original/GLUE-SST-2/cached_dev_RobertaTokenizer_256_sst-2.lock is required, but there is no explanation of how to obtain it.
- I am running on a single-GPU device, yet the code in ./examples/classification/src/trainer still raises: ValueError("Multi-gpu and distributed training is currently not supported.")
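
One thing I tried for the second issue (a guess at a workaround, not confirmed against this repo): explicitly expose only one GPU to the process, so any device-count check sees a single device.

```shell
# Pin the run to GPU 0 only (hypothetical workaround; the repo's own
# device-count logic may or may not respect this).
export CUDA_VISIBLE_DEVICES=0   # expose only GPU 0 to the process
python -c "import os; print(os.environ['CUDA_VISIBLE_DEVICES'])"   # sanity check
```

If the error persists even with this set, the multi-GPU check may be triggered by something other than the visible device count.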