
DISTIL: Deep dIverSified inTeractIve Learning. An active/interactive learning library built on PyTorch for reducing labeling costs.

12 distil issues

Hi, thank you very much for the toolkit. I want to plot the experimental comparison results, but I don't know how to reproduce the plots from the paper, so...

Is there any code for plotting the label efficiency?
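A minimal sketch of the kind of "label efficiency" curve the question refers to: test accuracy plotted against the number of labeled points, one line per selection strategy. This is not DISTIL's own plotting code; the strategy names and accuracy values below are illustrative placeholders.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Hypothetical results: labeled-set sizes and corresponding test accuracies.
budgets = [100, 200, 300, 400, 500]
results = {
    "random": [0.55, 0.62, 0.67, 0.70, 0.72],
    "badge":  [0.58, 0.68, 0.73, 0.76, 0.78],
}

fig, ax = plt.subplots()
for name, accs in results.items():
    ax.plot(budgets, accs, marker="o", label=name)
ax.set_xlabel("Number of labeled points")
ax.set_ylabel("Test accuracy")
ax.set_title("Label efficiency")
ax.legend()
fig.savefig("label_efficiency.png")
```

Swapping in the accuracies logged after each selection round reproduces the usual active-learning comparison figure.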

Can you include examples of NLP applications, including the use of HuggingFace transformers?

If yes, please provide a tutorial
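Pending such a tutorial, here is a hedged sketch (plain NumPy, not DISTIL's or HuggingFace's API) of entropy-based uncertainty sampling, the core selection step an NLP example would build on. The `entropy_select` helper and the toy probabilities are assumptions for illustration; `probs` would come from the softmax outputs of a fine-tuned transformer classifier.

```python
import numpy as np

def entropy_select(probs: np.ndarray, budget: int) -> np.ndarray:
    """Return indices of the `budget` highest-entropy (most uncertain) samples."""
    eps = 1e-12  # guard against log(0)
    entropy = -(probs * np.log(probs + eps)).sum(axis=1)
    return np.argsort(-entropy)[:budget]

# Toy predicted class probabilities for 4 unlabeled examples.
probs = np.array([
    [0.98, 0.01, 0.01],  # confident prediction
    [0.34, 0.33, 0.33],  # nearly uniform: most uncertain
    [0.70, 0.20, 0.10],
    [0.50, 0.49, 0.01],
])
print(entropy_select(probs, budget=2))  # picks the two most uncertain rows
```

The selected indices are the examples to send for labeling in the next round.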

Hello! Is it possible to perform active learning with a combination of SSL methods such as Virtual Adversarial Training (VAT), Entropy Minimization (EntMin), etc.? I believe that this would be the...

Add medical imaging benchmarks

Benchmark notebooks require a call to delete_checkpoints() at the end of train_one

Hello, could you provide some active semi-supervised learning algorithms? Also, could you please approve my application to the Decile_DISTIL_Dev group?

Add seed argument to stats.rv_discrete in L33 of badge.py
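For context on the fix being requested: `scipy.stats.rv_discrete` accepts a `seed` argument, and passing one makes its sampling reproducible. A minimal sketch (the distribution values here are illustrative, not taken from badge.py):

```python
import numpy as np
from scipy import stats

# A toy discrete distribution over {0, 1, 2}.
xk = np.arange(3)
pk = np.array([0.2, 0.3, 0.5])

# Passing `seed` (an int or a NumPy Generator) fixes the sampling stream.
dist_a = stats.rv_discrete(name="toy", values=(xk, pk), seed=42)
dist_b = stats.rv_discrete(name="toy", values=(xk, pk), seed=42)

sample_a = dist_a.rvs(size=5)
sample_b = dist_b.rvs(size=5)
# Identical seeds yield identical draws.
assert (sample_a == sample_b).all()
```

Without `seed`, each run draws from a global random state, which is presumably why the issue asks for the argument to be threaded through.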

Outdated based on the current GitHub repo code.