
ICML'2022: Black-Box Tuning for Language-Model-as-a-Service & EMNLP'2022: BBTv2: Towards a Gradient-Free Future with Large Language Models

9 Black-Box-Tuning issues

Hello, there is a bug in the RoBERTa-large model output when I run deepbbt: the output does not include hidden states even though config.output_hidden_states = True (line 439). Then I found the...
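
For context, a stock transformers RoBERTa model only returns hidden states when they are explicitly requested; below is a minimal sketch of the expected behavior (the model name and usage here are illustrative, not the repo's exact code at line 439):

```python
import torch
from transformers import RobertaModel, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("roberta-large")
# output_hidden_states=True is stored on the config, as in the issue
model = RobertaModel.from_pretrained("roberta-large", output_hidden_states=True)

inputs = tokenizer("Hello world", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, return_dict=True)

# hidden_states should be a tuple of (num_layers + 1) tensors; if it is
# None despite the config flag, a custom forward pass in the repo may be
# dropping it.
assert outputs.hidden_states is not None
print(len(outputs.hidden_states))
```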

`TypeError: add_code_sample_docstrings() got an unexpected keyword argument 'tokenizer_class'`. How can I solve this?
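
This keyword was dropped from `add_code_sample_docstrings()` in later transformers releases, so the error usually means the installed transformers is newer than the one the repo was developed against. A minimal check (the exact version to install should be taken from the repo's requirements, not assumed):

```python
# Print the installed transformers version; if it is newer than the
# version pinned by this repo, reinstalling the pinned one, e.g.
#   pip install transformers==<version from requirements.txt>
# is the likely fix.
import transformers

print(transformers.__version__)
```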

Hi, congrats on the acceptance of Black-Box-Tuning at ICML 2022. We are having an event on Hugging Face for ICML 2022, where you can submit Spaces (web demos), models, and datasets for...

Hi, I tried your BBTv2 code but failed to get results comparable to those reported in your paper. In my case, using the command `python deepbbt.py --model_name "roberta-large" --task_name "snli"`...

I found that this optimization operates on a vector after the user obtains the results from the server. I wonder whether there is any way to get the final prompt text, to help other people...
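
A continuous prompt has no exact text form, but a rough, human-readable approximation can be obtained by projecting each learned prompt vector onto its nearest vocabulary embedding. A minimal sketch, where `prompt_embeds` is a hypothetical placeholder for the tuned prompt rather than anything produced by this repo:

```python
import torch
import torch.nn.functional as F
from transformers import RobertaModel, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("roberta-large")
model = RobertaModel.from_pretrained("roberta-large")

emb = model.get_input_embeddings().weight      # (vocab_size, hidden_dim)
prompt_embeds = torch.randn(5, emb.size(1))    # placeholder for the tuned prompt

# cosine similarity between each prompt vector and every vocab embedding
sims = F.normalize(prompt_embeds, dim=-1) @ F.normalize(emb, dim=-1).T
nearest_ids = sims.argmax(dim=-1)

# a nearest-neighbor approximation only, not the vector sent to the server
print(tokenizer.decode(nearest_ids))
```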

Hello, I see that BBTv2 supports a couple of T5 models. Would it be easily extendable to support Flan-T5 models as well?
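
For what it's worth, Flan-T5 checkpoints use the same T5 architecture in transformers, so they load with the classes the repo already handles for T5; whether BBTv2's prompt-injection code paths work unchanged would still need to be verified. A minimal loading sketch:

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Flan-T5 shares the T5 architecture, so the standard T5 classes load it
tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-base")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-base")

inputs = tokenizer("translate English to German: Hello", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```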

It would be very useful to have a comparison with the performance of gradient-based methods (LoRA, p-tuning, prompt tuning, etc.) on the same datasets and using the same models (e.g., t5-xxl) commonly used in the literature...

In the paper, all tasks seem to be NLU tasks; what about NLG tasks?

I am at Huawei, and I am the same age as you. I am very interested in MOSS. Can I get your contact information, or could you contact me via @1697540432qq.com or WeChat@19949292778...