Akarsh
Hi @mbertani, sorry for the late reply. If it is possible, I would certainly like to give it a go. As for my experience with GPUs, I have worked on...
Hi @Nico995, I was able to reproduce the error (although I had to use two files, since for one file the shape comes out to be (256, 256)), and I...
Hi @Akhilesh64, thanks a lot for your request. I am preparing the end-to-end pre-training and fine-tuning code here: https://github.com/uakarsh/docformer/tree/master/examples/docformer_pl/docformer_etoe_pretraining_finetuning
It was an old script. Although I believe it is correct, the link I shared is the one we are planning to work on for pre-training DocFormer.
Since DocFormer is an encoder, I think you can certainly use it for your question-answering task. You would need to add code for encoding the question and attach a...
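A minimal sketch of what I mean, assuming an extractive (span-prediction) QA setup; `DocFormerForQA`, the encoder argument, and the hidden size here are placeholders I made up, not the actual classes in the repo:

```python
import torch
import torch.nn as nn

class DocFormerForQA(nn.Module):
    """Hypothetical wrapper: encoder + span-prediction QA head."""
    def __init__(self, encoder, hidden_size=768):
        super().__init__()
        self.encoder = encoder                    # pretrained encoder (assumed)
        self.qa_head = nn.Linear(hidden_size, 2)  # start/end logits per token

    def forward(self, batch):
        # batch is assumed to already contain the encoded question
        # concatenated in front of the document tokens.
        hidden = self.encoder(batch)              # (B, seq_len, hidden_size)
        logits = self.qa_head(hidden)             # (B, seq_len, 2)
        start_logits, end_logits = logits.split(1, dim=-1)
        return start_logits.squeeze(-1), end_logits.squeeze(-1)

# quick check with a dummy encoder standing in for the pretrained one
dummy_encoder = nn.Sequential(nn.Linear(768, 768))
model = DocFormerForQA(dummy_encoder)
start, end = model(torch.randn(2, 128, 768))      # each (2, 128)
```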
Hi @tmquan, from the question, I guess it is not directly available as of now, in `lightning-transformer`, but from the colab notebook that you shared and the code base of...
Hi @maxzvyagin, what I can understand from your question is `performing the enable_` on the custom LightningModule. For that, I think one simple and straightforward strategy would be inheriting...
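Since the comment above is truncated, this sketch only shows the inheritance pattern I am referring to; `BaseModule` and `ExtendedModule` are hypothetical names standing in for the actual custom LightningModule:

```python
import torch
import pytorch_lightning as pl

class BaseModule(pl.LightningModule):
    """Stand-in for the existing custom LightningModule."""
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(10, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())

class ExtendedModule(BaseModule):
    """Inherit and override only the hook that needs the extra behavior."""
    def training_step(self, batch, batch_idx):
        loss = super().training_step(batch, batch_idx)
        self.log("train_loss", loss)  # e.g. add logging or other hooks here
        return loss
```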
Hi @NielsRogge, can I help with this implementation?
Hi @jmandivarapu1, although I didn't write the entire code, I did write up to the part where the PyTorch dataset object could be made and one iteration/batch's forward and backward...
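Roughly the part I am describing, as a self-contained sketch with dummy data; `ToyDataset` and the linear model are stand-ins for the real dataset object and model:

```python
import torch
from torch.utils.data import Dataset, DataLoader

class ToyDataset(Dataset):
    """Dummy stand-in for the real dataset object."""
    def __init__(self, n=32):
        self.x = torch.randn(n, 10)
        self.y = torch.randn(n, 1)

    def __len__(self):
        return len(self.x)

    def __getitem__(self, idx):
        return self.x[idx], self.y[idx]

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loader = DataLoader(ToyDataset(), batch_size=4)

# one iteration: forward and backward pass on a single batch
x, y = next(iter(loader))
loss = torch.nn.functional.mse_loss(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```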