safeguards-shield
Build accurate and secure AI applications to unlock value faster
Just reporting that Colab Pro with High-RAM (24 GB) and a Premium GPU (Tesla V100-SXM2-16GB) crashes with an 'out of memory' error when using `databricks/dolly-v2-12b`, as indicated in the comment in `Dolly 2.0 HuggingFace v1.1.ipynb`.
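For reference, a minimal sketch of one way to avoid the crash, assuming the standard `transformers` loading path rather than the notebook's exact code: load the smaller `databricks/dolly-v2-3b` variant in half precision so it fits in 16 GB of GPU memory.

```python
# Sketch only: smaller Dolly variant in half precision to reduce GPU memory use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "databricks/dolly-v2-3b"  # dolly-v2-12b needs far more than 16 GB

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # half precision roughly halves GPU memory
    device_map="auto",          # requires `accelerate`; places layers on available devices
)
```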
Support local GPU
Hey, is there any way to early-stop `create_dataset()`? In my case it can run for ages. Regards
@kw2828 For example, if I want to train on a specific dataset, something like a PDF, how can I approach that? I have tried using embeddings and connecting with OpenAI using LangChain,...
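One possible approach, sketched under the assumption that plain text extraction is enough (the `pypdf` calls are real, but the file names and prompt template are placeholders): convert the PDF into instruction-style JSONL records before fine-tuning.

```python
# Hypothetical sketch: turn a PDF into instruction-style JSONL records for fine-tuning.
import json
from pypdf import PdfReader

reader = PdfReader("document.pdf")
records = []
for page in reader.pages:
    text = page.extract_text() or ""
    # Split each page into fixed-size chunks so prompts stay short.
    for chunk in (text[i:i + 1000] for i in range(0, len(text), 1000)):
        records.append({
            "instruction": "Summarize the following passage.",  # placeholder template
            "context": chunk,
            "response": "",  # fill in manually or with a teacher model
        })

with open("pdf_dataset.jsonl", "w") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")
```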
In your finetuning code at https://colab.research.google.com/drive/1n5U13L0Bzhs32QO_bls5jwuZR62GPSwE?usp=sharing: as per the Dolly code, special tokens need to be loaded into the tokenizer:

```
tokenizer.pad_token = tokenizer.eos_token
tokenizer.add_special_tokens(
    {"additional_special_tokens": [END_KEY, INSTRUCTION_KEY, RESPONSE_KEY_NL]}
)
```

Also,...
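For context, a self-contained sketch of that token setup; the literal key strings mirror the Dolly training code (worth verifying against the repo), and the model name is only an example. Note the embedding resize after new tokens are added.

```python
# Sketch of wiring the Dolly special tokens into the tokenizer before fine-tuning.
from transformers import AutoModelForCausalLM, AutoTokenizer

INSTRUCTION_KEY = "### Instruction:"
RESPONSE_KEY = "### Response:"
END_KEY = "### End"
RESPONSE_KEY_NL = f"{RESPONSE_KEY}\n"

model_name = "databricks/dolly-v2-3b"  # example model only
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

tokenizer.pad_token = tokenizer.eos_token
tokenizer.add_special_tokens(
    {"additional_special_tokens": [END_KEY, INSTRUCTION_KEY, RESPONSE_KEY_NL]}
)
# Grow the embedding matrix so the newly added token ids have rows to look up.
model.resize_token_embeddings(len(tokenizer))
```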
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Cell In[16], line 3
      1 generate_text = InstructionTextGenerationPipeline(model=model, tokenizer=tokenizer)
----> 3 generate_text("Look up the boiling point of water.")

File /opt/conda/envs/textgen/lib/python3.10/site-packages/transformers/pipelines/base.py:1109, in Pipeline.__call__(self, inputs,...
Hi, thanks for sharing the code. I'm wondering whether you have already trained Dolly 2 with LoRA successfully using this code? How long did it take to train? What was the performance if the...
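Not the poster's code, but a rough sketch of attaching LoRA adapters to a Dolly v2 model with `peft`; `target_modules` follows GPT-NeoX's fused attention projection naming, and the hyperparameters are illustrative guesses.

```python
# Rough sketch: LoRA adapters on a Dolly v2 (GPT-NeoX) model via peft.
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained(
    "databricks/dolly-v2-3b", torch_dtype=torch.float16, device_map="auto"
)

lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["query_key_value"],  # GPT-NeoX fuses q/k/v into one projection
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of the full model
```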