FastChat
Colab notebook for demo
Thanks a ton, team, for the release of the model. Is there any notebook for a demo? Some of the steps about specifying model paths are a bit confusing. Thanks!
me too
+1
Contributions are welcome.
I found working code here, along with a Colab notebook - Vicuna
Hey @merrymercy, if you assign it to me, I can do a notebook.
@EnesGumuskaynak Please go ahead
@markwsac We cannot distribute merged weights because of the LLaMA model license. We encourage you to use our recommended way to run Vicuna.
@EnesGumuskaynak any updates?
I've just started
@EnesGumuskaynak I hope you make one using vicuna-7b in 8-bit, with a chat interface.
I'm working on it, but the free Colab tier does not offer much RAM and does not allow creating a swap file. Three steps need more memory than Colab offers:
- Converting the LLaMA weights to the Hugging Face format
- Applying the delta to the Hugging Face weights to produce the Vicuna weights
- Starting the Vicuna model (the "Loading checkpoint" part)

All three steps first load the full model into RAM before the actual work begins; see the sketch after this list.
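For reference, here is a minimal Python sketch of what the delta-application step does, to show why it is memory-heavy. It is not the official tool: the paths and the delta repo id are placeholders, and the documented way is `python3 -m fastchat.model.apply_delta ...`, which also handles tokenizer details.

```python
# Rough sketch of the delta-application step (assumptions noted below);
# the official command is `python3 -m fastchat.model.apply_delta ...`.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

BASE = "/path/to/llama-7b-hf"          # step 1 output: LLaMA converted to the Hugging Face format
DELTA = "lmsys/vicuna-7b-delta-v1.1"   # assumed delta repo id; check the FastChat README for the current one
TARGET = "/path/to/vicuna-7b"          # merged weights are written here

# Both loads keep a full fp16 copy of the model in RAM, which is what
# exceeds the free Colab tier's memory.
base = AutoModelForCausalLM.from_pretrained(
    BASE, torch_dtype=torch.float16, low_cpu_mem_usage=True
)
delta = AutoModelForCausalLM.from_pretrained(
    DELTA, torch_dtype=torch.float16, low_cpu_mem_usage=True
)

delta_state = delta.state_dict()
for name, param in base.state_dict().items():
    # The published weights are deltas, so adding them onto the LLaMA base
    # reconstructs the Vicuna weights.
    param.data += delta_state[name]

base.save_pretrained(TARGET)
AutoTokenizer.from_pretrained(DELTA, use_fast=False).save_pretrained(TARGET)
```

After that, the third step is just launching the chat interface on the merged weights, e.g. `python3 -m fastchat.serve.cli --model-path /path/to/vicuna-7b` (the CLI also has a `--load-8bit` option to reduce memory use), per the FastChat README.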
Since we cannot share the model weights, people would need to upload the original LLaMA weights or the Hugging Face version to Colab. This further complicates running the demo on Google Colab. It seems more reasonable to create notebooks with documentation for local use. I am open to other suggestions as well.
Yeah, you can use the Vicuna models already available on the Hugging Face model hub; people have converted the delta weights into usable weights.
@GeorvityLabs It is not recommended to use those weights, for the following reasons:
- It does not respect the LLaMA license.
- We did not verify the correctness of any of those weights.
- We will publish future updates, but those weights won't be updated because they are not released by us.
thanks for the clarification.
I created PR #427; you can review it. Unfortunately, I couldn't make a Colab-specific version, as I mentioned before.
Perhaps one could do a notebook with a less restrictive model, say, Mistral?
A Colab demo for FastChat would be great. We welcome contributions.
Any updates on FastChat working with the free version of Colab?
Hi @merrymercy !
I have a notebook in which I am able to run the FastChat API on the Google Colab free tier. As examples, I've included the code snippets for accessing embeddings contributed by @andy-yang-1; these code examples are part of PR #663.
The notebook is placed in the playground directory, but here is the link:
https://colab.research.google.com/drive/1aosMOb0vltjdlAha2-_y2tKgZ1srBFxL?usp=sharing
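In case it helps readers of this thread, below is a minimal sketch of calling the embeddings endpoint of FastChat's OpenAI-compatible API with plain `requests`. It assumes the FastChat server stack (controller, model worker, and `fastchat.serve.openai_api_server`) is already running on `localhost:8000`, that the response follows the OpenAI embeddings format, and that the model name is a placeholder for whatever worker you launched.

```python
# Minimal sketch: query the embeddings endpoint of FastChat's
# OpenAI-compatible API server. Host, port, and model name are assumptions;
# start the controller, a model worker, and openai_api_server first.
import requests

API_BASE = "http://localhost:8000/v1"   # default port used in the FastChat docs
MODEL = "vicuna-7b-v1.1"                # placeholder model name

resp = requests.post(
    f"{API_BASE}/embeddings",
    json={"model": MODEL, "input": "Hello, FastChat!"},
    timeout=60,
)
resp.raise_for_status()

# Assuming an OpenAI-style response: {"data": [{"embedding": [...], ...}], ...}
embedding = resp.json()["data"][0]["embedding"]
print(len(embedding), embedding[:5])
```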