Vipula Rawte

10 comments by Vipula Rawte

How do you deal with sequences longer than 512 tokens? Thanks!
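For context, a minimal sketch of one common workaround for the 512-token limit: split the tokenized input into overlapping windows. The function name, window size, and stride below are illustrative assumptions, not code from this repository.

```python
# Hedged sketch: split a long token-ID sequence into overlapping 512-token
# windows so each window fits BERT's positional-embedding limit.
# The function name and the stride value are illustrative, not from the repo.
from typing import List

def chunk_long_sequence(token_ids: List[int], max_len: int = 512, stride: int = 256) -> List[List[int]]:
    """Return overlapping windows of at most `max_len` tokens."""
    if len(token_ids) <= max_len:
        return [token_ids]
    chunks = []
    start = 0
    while start < len(token_ids):
        chunks.append(token_ids[start:start + max_len])
        if start + max_len >= len(token_ids):
            break
        start += stride
    return chunks

# Example: a 1300-token document becomes 5 overlapping windows.
print(len(chunk_long_sequence(list(range(1300)))))  # -> 5
```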

Also, since it uses BinaryClassificationProcessor, it creates the logits tensor with size [10, 2] (say my input file has 10 texts). After view(-1), the logits become size 20, but the labels are...
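To illustrate the shape bookkeeping described above with toy tensors (hypothetical, not the repository's code): for classification, the cross-entropy view keeps the class dimension, so the shapes are [10, 2] against [10]; flattening the logits with view(-1) yields 20 values that no longer line up with 10 labels. Regression would instead use a single-output head with float targets.

```python
# Hedged sketch of the shape mismatch described above (toy tensors, not the repo's code).
import torch
import torch.nn as nn

batch_size, num_labels = 10, 2
logits = torch.randn(batch_size, num_labels)           # [10, 2] from a classification head
labels = torch.randint(0, num_labels, (batch_size,))   # [10] integer class labels

# Classification: keep the class dimension, so shapes are [10, 2] vs [10].
ce = nn.CrossEntropyLoss()
loss_cls = ce(logits.view(-1, num_labels), labels.view(-1))

# logits.view(-1) would flatten to 20 values, which no longer lines up with 10 labels.
# Regression instead uses a single-output head and float targets:
reg_logits = torch.randn(batch_size, 1)                # [10, 1] from a regression head
reg_labels = torch.rand(batch_size)                    # [10] float targets
mse = nn.MSELoss()
loss_reg = mse(reg_logits.view(-1), reg_labels.view(-1))
print(loss_cls.item(), loss_reg.item())
```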

On setting `'output_mode': 'regression'`, it still gives the following error: ![image](https://user-images.githubusercontent.com/22553367/69432206-3bf3ff80-0d07-11ea-91c5-105a02433dd1.png)

Hi, thanks for your suggestion, but it did not work. `from apex import amp` works only when run from inside the apex directory. I am wondering how to make it available globally because...

So in your case, it would be 20 x 768 (since the max number of chunks is 20)?
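For reference, a hedged sketch of how a 20 x 768 matrix of per-chunk embeddings could be pooled into a single document vector; mean pooling is an assumption here, not necessarily what this repository does.

```python
# Hedged sketch: reduce a [num_chunks, hidden] matrix of per-chunk embeddings
# to one document vector by mean pooling (an assumption; other approaches use
# max pooling or a recurrent layer over chunks).
import torch

num_chunks, hidden_size = 20, 768
chunk_embeddings = torch.randn(num_chunks, hidden_size)  # e.g. 20 x 768 as discussed above
doc_embedding = chunk_embeddings.mean(dim=0)             # -> [768]
print(doc_embedding.shape)
```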

Could you explain more about how the loss is calculated for every chunk separately? I mean, the entire document has a target label, and so AFAIU the loss would be calculated for this...
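My reading of per-chunk loss, sketched with hypothetical tensors: the single document-level label is repeated for every chunk, each chunk is scored against it, and the chunk losses are averaged into one document loss (the averaging is an assumption).

```python
# Hedged sketch: compute a loss for every chunk against one document-level label.
import torch
import torch.nn as nn

num_chunks, num_classes = 20, 2
chunk_logits = torch.randn(num_chunks, num_classes)   # one prediction per chunk
doc_label = torch.tensor(1)                           # single label for the whole document

# Repeat the document label so each chunk is scored against it,
# then average the per-chunk losses into one document loss.
labels_per_chunk = doc_label.repeat(num_chunks)       # [20]
criterion = nn.CrossEntropyLoss(reduction="mean")
doc_loss = criterion(chunk_logits, labels_per_chunk)
print(doc_loss.item())
```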

I am not clear what `sequences` is in the following code for finding attention weights: `test_seq = pad_sequences([sequences[index]], maxlen=MAX_SEQUENCE_LENGTH)`. Thank you!
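For what it's worth, `sequences` in that snippet is usually the list of integer-encoded token sequences produced by a fitted Keras Tokenizer, so `sequences[index]` is one example that pad_sequences pads or truncates to MAX_SEQUENCE_LENGTH. A hedged sketch with hypothetical sample texts:

```python
# Hedged sketch of the usual meaning of `sequences` in this call: the list of
# integer-encoded token sequences from a fitted Keras Tokenizer.
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

MAX_SEQUENCE_LENGTH = 128
texts = ["the market rallied today", "earnings missed expectations"]  # hypothetical samples

tokenizer = Tokenizer()
tokenizer.fit_on_texts(texts)
sequences = tokenizer.texts_to_sequences(texts)   # list of lists of word indices

index = 0
test_seq = pad_sequences([sequences[index]], maxlen=MAX_SEQUENCE_LENGTH)
print(test_seq.shape)  # (1, 128): one example, padded to MAX_SEQUENCE_LENGTH
```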

This [issue](https://github.com/psnonis/FinBERT/issues/3) might be helpful.

@nehatj, @franz101 Thank you for pointing this out. I'll fix this in the next version. All the updates will be pushed [here](https://github.com/vr25/fin_RoBERTa). Thanks!