Faith Oyedemi
Hi @artitw, I would like to continue from where John stopped.
Noted. I have reviewed John's work and experimented with the notebooks he reported on. It seems that my assignments are the following, in order: 1. Get sufficient (> 10k) training data....
Hi @artitw, After trying different options that did not work out, I opted for Amazon SageMaker. - I loaded the datasets (JSON) to AWS S3 - I dockerized the fine-tuning...
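The upload step described above can be sketched roughly as follows. This is only an illustration: the bucket name, key layout, and `upload_dataset` helper are assumptions, not details from the thread, and running it requires AWS credentials to be configured.

```python
import pathlib


def to_s3_key(prefix: str, path: str) -> str:
    # Build the object key under a prefix; the flat layout is an assumption
    return f"{prefix.rstrip('/')}/{pathlib.Path(path).name}"


def upload_dataset(path: str, bucket: str, prefix: str = "datasets") -> str:
    # boto3 is imported lazily; needs AWS credentials in the environment
    import boto3

    key = to_s3_key(prefix, path)
    boto3.client("s3").upload_file(path, bucket, key)
    return key


# Hypothetical usage (bucket name is made up):
# upload_dataset("train.json", "my-finetune-bucket")
```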
Hi @artitw, I used a small dataset to test my setup as you suggested and it worked fine. But the larger dataset took too long to run. I set the...
Hi @artitw, Hope you have had a good day. Two things. ## 1. Before going far, I want to let you know that I am fine-tuning using --- ```...
## 2. I dug into the codebase and figured out a way to use the GPU. --- By editing the [Translator](https://github.com/artitw/text2text/blob/master/text2text/translator.py) and doing this: ``` import text2text as t2t import...
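The exact change to translator.py is truncated above, so as a hedged sketch: the standard PyTorch pattern behind "using the GPU" is to pick a device and move the model and inputs onto it. The helper below is illustrative only (torch is imported lazily so the sketch stands on its own), not the author's actual edit.

```python
def pick_device() -> str:
    # Prefer CUDA when PyTorch can see a GPU; fall back to CPU otherwise.
    # torch is imported lazily so this sketch is importable without it.
    try:
        import torch
        return "cuda" if torch.cuda.is_available() else "cpu"
    except ImportError:
        return "cpu"


# Typical usage inside a transformers-style model wrapper (illustrative):
# device = pick_device()
# model = model.to(device)
# batch = {k: v.to(device) for k, v in batch.items()}
```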
Hi @artitw, The dataset we are using for fine-tuning has multiple questions attached to each context. Do you think that this might be affecting the algorithm's learning? As against...
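To make the one-context-many-questions structure concrete, here is a small sketch. The field names (`context`, `question`) are assumed, not the thread's actual schema; it groups questions by context, and flattening back to one (context, question) pair per training example is the usual alternative being weighed.

```python
from collections import defaultdict


def group_by_context(examples):
    # examples: list of {"context": ..., "question": ...} dicts (assumed schema)
    grouped = defaultdict(list)
    for ex in examples:
        grouped[ex["context"]].append(ex["question"])
    return dict(grouped)


def flatten(grouped):
    # One (context, question) pair per training example
    return [{"context": c, "question": q} for c, qs in grouped.items() for q in qs]


data = [
    {"context": "c1", "question": "q1"},
    {"context": "c1", "question": "q2"},
    {"context": "c2", "question": "q3"},
]
grouped = group_by_context(data)
assert grouped == {"c1": ["q1", "q2"], "c2": ["q3"]}
assert len(flatten(grouped)) == 3
```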
@artitw This looks interesting. Can I begin to look into this?