Daan van Stigt

5 issues by Daan van Stigt

[Adapters](https://arxiv.org/pdf/1902.00751.pdf) are pretty cool. Let's see if we can implement them for our Transformer encoder models by integrating with [Adapter Transformers](https://github.com/Adapter-Hub/adapter-transformers). A particularly cool extension would be [AdapterFusion](https://arxiv.org/pdf/2005.00247.pdf), which would...
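For context, a minimal sketch of what the adapter-transformers side might look like (the `qe_adapter` name and the XLM-R checkpoint are placeholders, and this assumes a recent adapter-transformers release; OpenKiwi's own encoder wiring is not shown):

```python
# Rough sketch, assuming the adapter-transformers fork of Hugging Face
# transformers is installed (pip install adapter-transformers).
from transformers import AutoModelWithHeads, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelWithHeads.from_pretrained("xlm-roberta-base")

# Add a new task adapter and a matching prediction head (names are placeholders).
model.add_adapter("qe_adapter")
model.add_classification_head("qe_adapter", num_labels=2)

# Freeze the pretrained weights and train only the adapter + head.
model.train_adapter("qe_adapter")
model.set_active_adapters("qe_adapter")

batch = tokenizer("a sample sentence", return_tensors="pt")
outputs = model(**batch)
```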

enhancement

[Baal](https://baal.readthedocs.io/en/latest/index.html) looks pretty cool. Let's see if we can use it to select data for further training (active learning) or for model calibration (so the predicted probabilities are more meaningful).
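A rough sketch of the kind of acquisition loop Baal enables, following its docs; the model, dataset, and hyperparameters below are illustrative placeholders, not the OpenKiwi setup:

```python
# Rough sketch of an active-learning loop with Baal (BALD + MC-Dropout).
import torch
from torch import nn, optim
from torchvision import datasets, transforms
from baal.active import ActiveLearningDataset
from baal.active.heuristics import BALD
from baal.bayesian.dropout import patch_module
from baal.modelwrapper import ModelWrapper

train_set = datasets.MNIST(".", train=True, download=True, transform=transforms.ToTensor())
active_set = ActiveLearningDataset(train_set)
active_set.label_randomly(100)  # small labelled seed set

model = nn.Sequential(nn.Flatten(), nn.Linear(784, 128), nn.ReLU(),
                      nn.Dropout(0.5), nn.Linear(128, 10))
model = patch_module(model)  # keep dropout active at inference (MC-Dropout)
wrapper = ModelWrapper(model, nn.CrossEntropyLoss())
heuristic = BALD()

for step in range(5):
    optimizer = optim.SGD(model.parameters(), lr=0.01)
    wrapper.train_on_dataset(active_set, optimizer, batch_size=32, epoch=1, use_cuda=False)
    # Multiple stochastic forward passes over the unlabelled pool.
    predictions = wrapper.predict_on_dataset(active_set.pool, batch_size=32,
                                             iterations=20, use_cuda=False)
    ranks = heuristic(predictions)   # most informative pool items first
    active_set.label(ranks[:50])     # move the top-ranked items into the labelled set
```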

enhancement

### Why?

[SimAlign](https://github.com/cisnlp/simalign) is an amazingly simple and effective way of obtaining word alignments from multilingual Transformer encoders. OpenKiwi is built on top of multilingual Transformers. Hence OpenKiwi can produce...
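For reference, a minimal sketch of SimAlign's own API, following its README (swapping in OpenKiwi's encoder would be the actual work here):

```python
# Rough sketch following SimAlign's README; model and matching methods are illustrative.
from simalign import SentenceAligner

# "mai" selects the MWMF, ArgMax and IterMax matching methods.
aligner = SentenceAligner(model="bert", token_type="bpe", matching_methods="mai")

src_sentence = ["This", "is", "a", "test", "."]
trg_sentence = ["Das", "ist", "ein", "Test", "."]

alignments = aligner.get_word_aligns(src_sentence, trg_sentence)
for method, pairs in alignments.items():
    print(method, ":", pairs)  # e.g. itermax : [(0, 1), (1, 2), ...]
```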

enhancement

# 🚀 Feature request

Example code to produce the (supercool!) Adapter Fusion inter-Adapter attention plots in figure 5 from the paper [AdapterFusion: Non-Destructive Task Composition for Transfer Learning](https://arxiv.org/pdf/2005.00247.pdf). ![Screenshot 2020-11-17...
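In the meantime, a rough sketch of the plotting half only: given a per-layer, per-adapter matrix of fusion attention scores averaged over a dataset (how to extract those scores from adapter-transformers is exactly the open question in this request), a figure-5-style heatmap is straightforward with matplotlib. Adapter names and values below are placeholders.

```python
# Rough sketch of the plot only; `fusion_attention` stands in for averaged
# AdapterFusion attention scores, which this issue asks how to extract.
import numpy as np
import matplotlib.pyplot as plt

adapter_names = ["mnli", "qqp", "sst", "imdb", "sick", "rte", "cb"]  # placeholder task set
n_layers = 12
fusion_attention = np.random.rand(n_layers, len(adapter_names))  # placeholder data

fig, ax = plt.subplots(figsize=(6, 4))
im = ax.imshow(fusion_attention, aspect="auto", cmap="viridis")
ax.set_xticks(range(len(adapter_names)))
ax.set_xticklabels(adapter_names, rotation=45, ha="right")
ax.set_yticks(range(n_layers))
ax.set_yticklabels([f"layer {i + 1}" for i in range(n_layers)])
ax.set_xlabel("adapter")
ax.set_ylabel("fusion layer")
fig.colorbar(im, ax=ax, label="avg. fusion attention")
fig.tight_layout()
plt.show()
```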

enhancement