Josh
I had to make the following small changes to get the code to run: https://github.com/jsrozner/crl/commit/70e59d4e866cb1bc662ca763928ca8490f512941 I can open a pull request if you'd like. But please make sure they don't seem to...
It was the Hugging Face default (as were the choice of epsilon at 1e-8 and learning rates in the 1e-4 to 1e-5 range). It could be worth trying other optimizers. Have...
I'd been meaning to read through that post and tune over the optimizer as well! I think the transformers finetune.py script defaults to Adam (and a lot of the notebooks also seem...
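For reference, the update rule those defaults feed into can be sketched in plain Python. This is a minimal single-parameter sketch of the Adam step, not the transformers implementation; the `lr` and `eps` values just mirror the defaults mentioned above:

```python
import math

def adam_step(theta, grad, m, v, t, lr=1e-5, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single scalar parameter.

    eps (the 1e-8 above) guards the division when the second-moment
    estimate v is close to zero, e.g. early in training.
    """
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction for step t
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v
```

Because the step size is roughly `lr * sign(grad)` on the first step, the choice of learning rate matters much more than the raw gradient scale, which is part of why the 1e-4 to 1e-5 range comes up so often for finetuning.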
Cool, collate was actually on my feature list! And I'm glad you've found it useful! I've also been making a lot of changes - I've made it considerably more modular...
Also, Hugging Face's transformers tokenizers offer a batch_encode_plus method that should take care of uniform padding and length.
I wrote the following, using the Hugging Face tokenizer to handle the batch encoding. It will pad to the max length in a batch. This also substantially reduces the memory footprint from...
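The per-batch padding idea can be sketched without the tokenizer itself. This is a hypothetical `pad_batch` helper, not the code from the commit; `pad_id=0` is an assumption (real tokenizers expose `tokenizer.pad_token_id`):

```python
def pad_batch(token_id_seqs, pad_id=0):
    """Pad every sequence to the longest length in this batch only.

    Padding per batch (rather than to a global max length) is what
    shrinks the memory footprint: batches of short sequences stay short.
    Returns padded ids plus an attention mask (1 = real token, 0 = pad).
    """
    max_len = max(len(s) for s in token_id_seqs)
    padded = [s + [pad_id] * (max_len - len(s)) for s in token_id_seqs]
    mask = [[1] * len(s) + [0] * (max_len - len(s)) for s in token_id_seqs]
    return padded, mask
```

With an actual Hugging Face tokenizer, calling `tokenizer(texts, padding=True, return_tensors="pt")` behaves the same way: `padding=True` means "longest in this batch", not a fixed model max length.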
@FL33TW00D what do you think about the new implementation?
So I think one of my issues is that this repo uses npm (the .npmrc is configured to omit the 'v' prefix on versions). The instructions in my previous comment...
Also, to the extent that you guys are re-implementing the Google Docs interface: in the comment window, pressing Tab selects the wrong button (it selects "Cancel" instead of "Comment"). Steps: 1....