kwrobel.eth
For the wikicode `[[link]]s`, `filter_wikilinks` returns `[[link]]` - it should return `[[link]]s` or, equivalently, `[[link|links]]`.
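A minimal sketch of the expected behavior, using a hypothetical regex-based extractor (not the library's actual implementation) that captures the MediaWiki "linktrail" letters after the closing brackets:

```python
import re

# Hypothetical sketch: capture wikilinks together with any trailing
# "linktrail" letters, so that "[[link]]s" is treated as one link.
WIKILINK = re.compile(r"\[\[([^\]|]+)(?:\|([^\]]+))?\]\]([a-z]*)")

def filter_wikilinks_with_trail(text):
    links = []
    for target, label, trail in WIKILINK.findall(text):
        if trail:
            # Fold the trail into the label: [[link]]s -> [[link|links]]
            links.append(f"[[{target}|{(label or target) + trail}]]")
        elif label:
            links.append(f"[[{target}|{label}]]")
        else:
            links.append(f"[[{target}]]")
    return links
```

With this behavior, `filter_wikilinks_with_trail("For wikicode [[link]]s")` yields `["[[link|links]]"]`, while links without a trail are returned unchanged.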
Do the torch versions in the benchmark https://github.com/Tencent/TurboTransformers/blob/master/docs/bert.md use `.half()` (FP16)?
I am trying to run the model on multiple GPUs. Probably `SplitCrossEntropyLoss` causes some trouble - any hints?

```
File "main.py", line 209, in train
    raw_loss = criterion(model.module.decoder.weight, model.module.decoder.bias, output, targets)
...
```
In the readme:

> Combo Pre-trained continued from original BERT on 2017, 2018, 2019 SEC 10K dataset

but in the paper:

> train a Combo Model on top of the...
It would be helpful to be able to click on a relation arrow or its label and have that relation selected in the Relations window for editing or deletion.
If the text has a lot of relations, their arrows overlap. One solution is to draw arcs, or to detect overlaps and reroute the arrow paths.
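The overlap-avoidance idea can be sketched as a greedy layout: given relations as (start, end) token spans, assign each arc the lowest height at which it does not cross an already-placed arc. This is a hypothetical helper, not tied to any particular annotation tool:

```python
def assign_arc_heights(relations):
    """Greedily assign a height level to each relation arc so that
    arcs over overlapping token ranges get different heights.

    relations: list of (start, end) index pairs with start < end.
    Returns a list of integer heights, one per relation (1 = lowest).
    """
    placed = []  # (start, end, height) of arcs already laid out
    heights = [0] * len(relations)
    # Lay out shorter arcs first so they sit closer to the text.
    order = sorted(range(len(relations)),
                   key=lambda i: relations[i][1] - relations[i][0])
    for i in order:
        s, e = relations[i]
        h = 1
        # Bump the height until no placed arc at this level overlaps.
        while any(ph == h and s < pe and ps < e for ps, pe, ph in placed):
            h += 1
        placed.append((s, e, h))
        heights[i] = h
    return heights
```

For example, spans `[(0, 5), (2, 7), (6, 9)]` get heights `[1, 2, 1]`: the middle arc is lifted because it crosses both neighbors.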
To reproduce, run: https://colab.research.google.com/drive/1SgLlLBI16ZQkjMAjJhzfNhAO0D6zgUwh?usp=sharing Probably it is a bug in transformers.
In the clean-up process, the response time for each proxy could be calculated.
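A hedged sketch of the idea: wrap whatever health check the clean-up already performs in a timer, so each proxy also gets a measured response time. The `check` callable here is an assumption for illustration, not an API from the project:

```python
import time

def timed_check(check, proxy):
    """Run a health check against a proxy and return
    (is_alive, response_time_seconds).

    `check` is any callable that returns truthy for a working proxy;
    it is a hypothetical stand-in for the project's existing check.
    """
    start = time.perf_counter()
    try:
        alive = bool(check(proxy))
    except Exception:
        # A failing or erroring proxy is treated as dead.
        alive = False
    return alive, time.perf_counter() - start
```

The clean-up could then keep the timing alongside each surviving proxy, e.g. to sort proxies by responsiveness.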
Is it possible to add tasks without editing the library code (dynamically, from Python)?
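One common way libraries support this (a generic sketch, not this library's actual API) is a task registry that user code extends at runtime via a decorator:

```python
# Hypothetical task registry: user code registers tasks at runtime
# instead of editing the library's source.
TASK_REGISTRY = {}

def register_task(name):
    """Decorator that adds a task callable to the registry under `name`."""
    def wrap(obj):
        TASK_REGISTRY[name] = obj
        return obj
    return wrap

@register_task("my_custom_task")
def my_custom_task(data):
    # User-defined logic, supplied without touching library code.
    return [d.upper() for d in data]
```

If the library exposes something like this, a custom task is just an import away; if not, this is the kind of hook the feature request is asking for.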
Fix infinite loop when the number of sentences is smaller than the number of sections (30).
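A sketch of the likely fix: clamp the section count to the number of sentences before chunking, so every section stays non-empty. This is a hypothetical helper; the hard-coded 30 comes from the issue:

```python
def split_into_sections(sentences, n_sections=30):
    """Split sentences into at most n_sections contiguous chunks.

    Clamping n_sections to len(sentences) avoids the degenerate case
    where fewer sentences than sections would otherwise force empty
    chunks (and, in the reported code, an infinite loop).
    """
    n = min(n_sections, len(sentences))
    if n == 0:
        return []
    size, rem = divmod(len(sentences), n)
    sections, start = [], 0
    for i in range(n):
        # Distribute the remainder over the first `rem` sections.
        end = start + size + (1 if i < rem else 0)
        sections.append(sentences[start:end])
        start = end
    return sections
```

With 3 sentences and 30 requested sections this yields 3 one-sentence sections instead of looping forever.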