
Tqdm for `tokenizer.encode_batch`

Open sksq96 opened this issue 4 years ago • 2 comments

Hi HF team,

Thanks for this awesome library!

I'm trying to encode a file with ~50M lines using `tokenizer.encode_batch`. It would be great to have tqdm support, so I can see how fast the library is running and how long to wait for the results.

Thanks

sksq96 avatar Sep 29 '20 22:09 sksq96

Hi,

I'd like to second this request. Is there any workaround right now?

Thank you!

HaritzPuerto avatar Oct 06 '21 12:10 HaritzPuerto
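
Until progress reporting is built into `encode_batch`, one possible workaround is to split the corpus into chunks and wrap the chunk loop in tqdm, calling `encode_batch` once per chunk. Below is a minimal sketch under that assumption; the tokenizer file path and chunk size are placeholders, not something specified in this thread.

```python
# Workaround sketch: chunk the input and report progress per chunk with tqdm.
# "tokenizer.json" and the chunk size are hypothetical placeholders.
from tokenizers import Tokenizer
from tqdm import tqdm

tokenizer = Tokenizer.from_file("tokenizer.json")  # assumed pretrained tokenizer file

def encode_with_progress(lines, chunk_size=10_000):
    """Encode `lines` in chunks so tqdm can show progress per chunk."""
    encodings = []
    for start in tqdm(range(0, len(lines), chunk_size), desc="encode_batch"):
        chunk = lines[start:start + chunk_size]
        encodings.extend(tokenizer.encode_batch(chunk))
    return encodings
```

Smaller chunks give smoother progress updates at the cost of some batching efficiency, so the chunk size is a trade-off to tune for your corpus.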

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] avatar May 11 '24 01:05 github-actions[bot]