Tanmay Laud
> ```
> thread '<unnamed>' panicked at 'called `Result::unwrap()` on an `Err` value: Internal', /__w/tokenizers/tokenizers/tokenizers/src/models/unigram/trainer.rs:203:53
> note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
> Preparing data...
> Training tokenizer...
> Traceback (most recent...
> ```
# 🌟 New adapter setup
## Model description
Big Bird is a new model available in Hugging Face (an efficient Transformer).
## Open source status
* [x] the model implementation...
# 🌟 New adapter setup
## Open source status
* [x] the model implementation is available
* [x] the model weights are available
This PR adds a notebook demo that uses the OmniXAI library to generate per-query valid explanations for ml4ir models.
The inference demo is not working. Also, the usage demo given [here](https://huggingface.co/transformers/usage.html) does not work with BioBERT (it requires changing the output types).
## Adding a Dataset
- **Name:** *HealthQA*
- **Description:** *Healthcare question-answering dataset*
- **Task:** *QnA*
- **Paper:** *https://dl.acm.org/doi/abs/10.1145/3308558.3313699*
- **Data:** *asked the owner for dataset link*
- **License:** *to be...
Hi, I am confused: what was the dataset size for the English-only model?
Getting error: `RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cuda:1!` I am running basaran with default params and llama...
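A common workaround for this kind of cross-device error (a general sketch, not a basaran-specific fix) is to restrict the process to a single GPU using the standard `CUDA_VISIBLE_DEVICES` environment variable, so that every tensor is allocated on the same device:

```shell
# Expose only GPU 0 to the process; from the process's point of view it
# becomes cuda:0, so model weights and inputs cannot end up split across
# two devices. The inline python here just shows the variable is set;
# in practice you would prefix your actual launch command the same way.
CUDA_VISIBLE_DEVICES=0 python3 -c 'import os; print(os.environ["CUDA_VISIBLE_DEVICES"])'
# prints: 0
```

If you do want to shard the model across both GPUs instead, the placement has to be done consistently by whatever loads the model; pinning to one GPU is simply the quickest way to confirm the error is a device-placement issue.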
Hi, I noticed a degradation in Large v2 quality compared to Medium v2. I want to make sure I am using the config and settings correctly.
```
from encoder.utils...
```