
A very simple framework for state-of-the-art Natural Language Processing (NLP)

253 flair issues, sorted by recently updated

### Question Hi Flair Community, I've got an annotated dataset in BIO format that I'm attempting to use with Flair for annotating 7 PDFs. Unfortunately, all my metric results are...

question
Awaiting Response
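For a BIO-format dataset like the one in the question above, loading usually goes through Flair's `ColumnCorpus`. The sketch below is minimal and assumes a two-column layout (token, NER tag) and placeholder file names; adjust both to the actual data.

```python
from flair.data import Corpus
from flair.datasets import ColumnCorpus

# Assumed column layout: token in column 0, BIO tag in column 1
columns = {0: "text", 1: "ner"}

# File names are placeholders for the reporter's own splits
corpus: Corpus = ColumnCorpus(
    "data/",
    columns,
    train_file="train.txt",
    dev_file="dev.txt",
    test_file="test.txt",
)
print(corpus)  # prints the number of train/dev/test sentences actually loaded
```

If the printed corpus sizes look wrong, the metric problems usually trace back to the column mapping or the sentence separators in the BIO files rather than the model.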

### Describe the bug I'm trying to use this model, but even the example from the site doesn't work. It fails with an unknown error that I do not know how to...

bug
Awaiting Response
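A minimal sketch for isolating this kind of error is to load the model and tag a single sentence with nothing else in the script; the model name below is a placeholder for the one in the report.

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Placeholder model name; substitute the model from the bug report
tagger = SequenceTagger.load("flair/ner-english-large")

sentence = Sentence("George Washington went to Washington.")
tagger.predict(sentence)

for entity in sentence.get_spans("ner"):
    print(entity)
```

If this snippet already fails, the full traceback from it is usually enough to tell whether the problem is the model download, the environment, or the input.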

### Problem statement Hi, there's a new [EMNLP 2023 paper](https://arxiv.org/abs/2310.13213) that introduces version 2 of the MultiCoNER dataset. MultiCoNER v2 should also be supported in Flair :hugs: ### Solution The dataset...

feature

Closes gh-3243. A suggestion for how we could support pickle and deepcopy without having to create multiple spans, etc. The idea is to use `__getitem__` to create spans and relations by...
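A rough illustration of the idea only, not the PR's actual code: the class names and fields below are a toy stand-in showing how a `__getitem__` that builds spans lazily keeps pickling and deepcopying cheap, because no span objects are stored on the sentence.

```python
import copy


class ToySentence:
    """Toy stand-in: spans are created on demand, never stored."""

    def __init__(self, tokens):
        self.tokens = list(tokens)

    def __getitem__(self, key):
        # A slice returns a lightweight Span view instead of a cached object
        if isinstance(key, slice):
            return ToySpan(self, key.start or 0, key.stop or len(self.tokens))
        return self.tokens[key]


class ToySpan:
    def __init__(self, sentence, start, stop):
        self.sentence, self.start, self.stop = sentence, start, stop

    @property
    def text(self):
        return " ".join(self.sentence.tokens[self.start:self.stop])


s = ToySentence(["George", "Washington", "went", "to", "Washington", "."])
print(s[0:2].text)                   # "George Washington"
print(copy.deepcopy(s)[0:2].text)    # deepcopy stays simple: only tokens are copied
```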

### Describe the bug Trying to import and run some experiments in Colab with Flair, this error is raised: AttributeError Traceback (most recent call last) /usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py in _get_module(self, module_name)...

bug
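An `AttributeError` raised from `transformers/utils/import_utils.py` is often a version mismatch between flair, transformers, and torch in the Colab runtime rather than a bug in Flair itself. A quick diagnostic sketch:

```python
# Print the versions actually installed in the Colab runtime,
# so they can be compared against the versions flair was tested with
import torch
import transformers
import flair

print("torch:", torch.__version__)
print("transformers:", transformers.__version__)
print("flair:", flair.__version__)
```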

### Question While trying to deploy the NER model, the length of the generated token sequence often cannot be controlled, even though the number of characters can be limited. - With...

question
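One common workaround for uncontrollable token-sequence lengths is to split the raw text into sentences before tagging, so no single input becomes arbitrarily long. A minimal sketch using Flair's sentence splitter; the model name and the input text are placeholders.

```python
from flair.models import SequenceTagger
from flair.splitter import SegtokSentenceSplitter

# Placeholder model name for the deployed NER model
tagger = SequenceTagger.load("flair/ner-english")

long_text = "Some very long raw input text ..."  # placeholder for the real request body

# Split the raw text into sentences so each tagged unit stays short
splitter = SegtokSentenceSplitter()
sentences = splitter.split(long_text)

tagger.predict(sentences, mini_batch_size=32)
for sentence in sentences:
    print(sentence.get_spans("ner"))
```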

### Question After a while, I am now using a new version, flair 0.12.2, for the script that I used before (before that, I guess I used 0.10) to train...

question
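For reference, a minimal NER training script as it typically looks in flair 0.12.x is sketched below; paths, the embedding model, and hyperparameters are placeholders, not the reporter's actual settings.

```python
from flair.datasets import ColumnCorpus
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Placeholder data folder and column layout
corpus = ColumnCorpus("data/", {0: "text", 1: "ner"})
label_dict = corpus.make_label_dictionary(label_type="ner")

embeddings = TransformerWordEmbeddings("xlm-roberta-base", fine_tune=True)

tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune("resources/taggers/ner", learning_rate=5e-5, mini_batch_size=16)
```

Comparing a script written for 0.10 against this shape usually surfaces the renamed arguments (for example, `make_label_dictionary` now takes `label_type`).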

This issue tracks the progress of releasing the next version of Flair, i.e. Flair 0.13. Planned release is **16.10.23**. One major new documentation feature is **publicly available API docs**, generated...

### Describe the bug I'm using a large model on long text input (each text is more than 2000 characters); after a few requests to this model, it takes up almost all GPU...

bug
Awaiting Response
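For growing GPU memory across requests, a common mitigation is to predict in small mini-batches and keep embeddings out of GPU memory between requests. A hedged sketch, with a placeholder model name:

```python
import torch
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load("flair/ner-english-large")  # placeholder model name


def tag(texts):
    sentences = [Sentence(t) for t in texts]
    # Small mini-batches, and do not store embeddings after prediction;
    # retained embeddings are a common cause of creeping GPU memory usage
    tagger.predict(sentences, mini_batch_size=8, embedding_storage_mode="none")
    torch.cuda.empty_cache()  # release cached blocks back to the driver between requests
    return [s.get_spans("ner") for s in sentences]
```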