
An Open-Source Framework for Prompt-Learning.

Results: 108 OpenPrompt issues

For example, I am supposed to construct a sampler using ``` sampler10 = FewShotSampler(num_examples_per_label = 5, also_sample_dev = True, num_examples_per_label_dev = 5) ``` I should then feed the data ```...
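A minimal, framework-free sketch of what such a per-label sampler does. The function name and arguments mirror the `FewShotSampler` call in the excerpt above, but the implementation here is an illustrative assumption, not OpenPrompt's actual code:

```python
import random
from collections import defaultdict

def few_shot_sample(dataset, num_examples_per_label, also_sample_dev=False,
                    num_examples_per_label_dev=0, seed=42):
    """Stratified few-shot sampling: keep a fixed number of examples per label.

    `dataset` is a list of dicts with a "label" key. Returns (train, dev) when
    `also_sample_dev` is True, otherwise just the train split.
    """
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for example in dataset:
        by_label[example["label"]].append(example)

    train, dev = [], []
    for label, examples in by_label.items():
        rng.shuffle(examples)
        train.extend(examples[:num_examples_per_label])
        if also_sample_dev:
            # Dev examples are drawn from the remainder, so the splits are disjoint.
            dev.extend(examples[num_examples_per_label:
                                num_examples_per_label + num_examples_per_label_dev])
    return (train, dev) if also_sample_dev else train
```

With `num_examples_per_label=5` and `num_examples_per_label_dev=5`, as in the excerpt, a two-label dataset yields 10 train and 10 dev examples.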

I am replicating the tutorial [4.1_all_tasks_are_generation.py ](https://github.com/thunlp/OpenPrompt/blob/4f17f4fa814f03f8253891b8a80a41dfe3e33bb1/tutorial/4.1_all_tasks_are_generation.py#L395) and trying to apply it to a binary classification task. In the model's `generate` output, the output_sentence contains only the label, but I...

Hello, despite my best efforts, I could not succeed in NER tagging using `OpenPrompt`. The docs show an example of a prompt for NER tagging, but I haven't managed to...

Fix the incorrect type annotation ``max_seq_length: Optional[str]`` in ``PromptDataLoader`` and ``ErniePromptDataLoader``; in both it should be ``Optional[int]``.
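A sketch of the fix described above. The surrounding signature is hypothetical; only the `max_seq_length` parameter name comes from the issue, and the point is that a sequence length is a token count, hence `int`:

```python
from typing import Optional

def build_dataloader(dataset, max_seq_length: Optional[int] = 256):
    """Hypothetical signature illustrating the corrected annotation:
    Optional[int] rather than Optional[str], since the value counts tokens."""
    return [example for example in dataset]  # placeholder body
```

Static checkers such as mypy flag callers passing an `int` against the old `Optional[str]` annotation, which is how this kind of typo usually surfaces.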

When using all-MiniLM-L6-v2 as the PLM, label predictions are unstable across runs. What could be the cause? Adjusting the promptTemplate has not helped.
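Run-to-run instability like this often comes from unseeded randomness (shuffling, dropout, initialization) rather than the template. A hedged, dependency-free sketch of fixing every RNG before building the pipeline; the numpy/torch lines are commented out so the snippet runs without those packages installed:

```python
import random

def set_seed(seed: int = 42):
    """Seed the RNGs used in a typical prompt-learning run for reproducibility."""
    random.seed(seed)
    # import numpy as np; np.random.seed(seed)
    # import torch; torch.manual_seed(seed); torch.cuda.manual_seed_all(seed)

# Reseeding before each run makes random draws (and thus shuffles) repeatable.
set_seed(0)
first = [random.random() for _ in range(3)]
set_seed(0)
second = [random.random() for _ in range(3)]
assert first == second
```

If results still vary with all seeds fixed, the remaining nondeterminism usually sits in GPU kernels or data-loader worker ordering.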

I want to do prompt tuning for a masked-fill-based T5 model, which has the input format like this: ```python test_dataset = [ InputExample(text_a="The quick fox over the lazy dog", tgt_text="...

Here is my code of trying to use ```PrefixTuningTemplate```: ``` import torch from openprompt.data_utils.conditional_generation_dataset import WebNLGProcessor from openprompt.plms import load_plm from openprompt.prompts.prefix_tuning_template import PrefixTuningTemplate plm, tokenizer, model_config, WrapperClass = load_plm('opt',...

Is there a way to use OpenPrompt in an in-context learning setting (i.e., adding labelled examples to the prompt)?
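Independent of any framework, in-context learning boils down to prepending labelled demonstrations to the query. A plain-string sketch of the idea asked about above; the template format and field names are illustrative assumptions, not an OpenPrompt API:

```python
def build_icl_prompt(demos, query,
                     template="Review: {text}\nSentiment: {label}"):
    """Build an in-context prompt: demonstrations first, then the unlabelled query.

    `demos` is a list of dicts with "text" and "label" keys; the final block
    leaves the label slot empty for the model to complete.
    """
    parts = [template.format(**demo) for demo in demos]
    # The query reuses the same template with an empty label, so the model
    # continues the established pattern.
    parts.append(template.format(text=query, label="").rstrip())
    return "\n\n".join(parts)
```

For example, two sentiment demonstrations plus a query produce a prompt ending in `Sentiment:`, ready for the PLM to fill in.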