
How can we use Vicuna for information retrieval from a bunch of docs?

alan-ai-learner opened this issue

Hi @yantao0527 @infwinston @Mearman @zhisbug @jegonzal, I've got a question. Say I have a book on NLP, and I want to use Vicuna so that when a user asks a question, the model answers only from the book. Is this possible with Vicuna or other LLMs?

Any suggestion would be very helpful! Thanks!

alan-ai-learner · Apr 27 '23

I recommend a YouTube video that shows how to use OpenAI's new GPT-4 API to 'chat' with and analyze multiple PDF files.

yantao0527 · Apr 27 '23
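
For context, that approach boils down to extracting the PDF text and sending it to GPT-4 as part of the prompt. A minimal sketch, assuming a local `book.pdf`, the `pypdf` package, and the pre-1.0 `openai` Python SDK:

```python
# Minimal sketch: extract text from a PDF and ask GPT-4 about it.
# Assumes: a local "book.pdf", the pypdf package, and the openai SDK < 1.0.
# A long book must be chunked to fit the context window; only the first
# part of the text is sent here for brevity.
import openai
from pypdf import PdfReader

openai.api_key = "sk-..."  # your OpenAI key

reader = PdfReader("book.pdf")
book_text = "\n".join(page.extract_text() or "" for page in reader.pages)

question = "What is tokenization?"
response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[
        {"role": "system",
         "content": "Answer only from the provided book excerpt. "
                    "If the answer is not in the excerpt, say so."},
        {"role": "user",
         "content": f"Book excerpt:\n{book_text[:6000]}\n\nQuestion: {question}"},
    ],
)
print(response["choices"][0]["message"]["content"])
```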

@yantao0527, I will look into it. That is my last (and most expensive) option, but before going there I was wondering whether it is possible by fine-tuning an open-source model like Vicuna.

Thanks!

alan-ai-learner · Apr 27 '23

You can combine other tools (e.g. LangChain) and Vicuna's OpenAI-compatible API (https://github.com/lm-sys/FastChat#openai-compatible-restful-apis--sdk).

I think it does not work out of the box yet. Contributions are welcome.

merrymercy · Apr 29 '23
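
To make that concrete: a minimal sketch of retrieval-augmented QA against FastChat's OpenAI-compatible server, using the pre-1.0 `openai` Python SDK and a naive word-overlap retriever standing in for LangChain. The book file name, model path, and port are assumptions.

```python
# Minimal retrieval-augmented QA sketch against FastChat's OpenAI-compatible
# server. Assumes the server is already running, e.g.:
#   python3 -m fastchat.serve.controller
#   python3 -m fastchat.serve.model_worker --model-path lmsys/vicuna-7b-v1.5
#   python3 -m fastchat.serve.openai_api_server --host localhost --port 8000
# The word-overlap retriever below is a stand-in; LangChain or a vector store
# would normally do that step.
import openai

openai.api_key = "EMPTY"                      # FastChat ignores the key
openai.api_base = "http://localhost:8000/v1"  # FastChat's OpenAI-compatible endpoint

def split_chunks(text, size=1500):
    """Split the book into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def best_chunk(chunks, question):
    """Pick the chunk with the largest word overlap with the question."""
    q_words = set(question.lower().split())
    return max(chunks, key=lambda c: len(q_words & set(c.lower().split())))

book_text = open("book.txt", encoding="utf-8").read()  # hypothetical plain-text book
question = "What is a transformer?"
context = best_chunk(split_chunks(book_text), question)

response = openai.ChatCompletion.create(
    model="vicuna-7b-v1.5",  # model name as registered with the worker
    messages=[
        {"role": "system",
         "content": "Answer using only the provided context. "
                    "If the context does not contain the answer, say you don't know."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(response["choices"][0]["message"]["content"])
```

LangChain's OpenAI wrappers can be pointed at the same base URL, so the naive retriever and prompt assembly above can be swapped for a LangChain chain without changing anything on the server side.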

Could fine-tuning on the book be done easily? How would you need to structure the data?

Marenz · May 28 '23

I would also be interested to know how one could fine-tune Vicuna with books or any other large bodies of information.

hitchclimber · Jul 05 '23

While this is a good question, it's not really a FastChat issue, right?

I would recommend you have a look at this colab on how to fine-tune llama2 as an example: https://colab.research.google.com/drive/1Ly01S--kUwkKQalE-75skalp-ftwl0fE?usp=sharing

surak · Oct 21 '23
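
On the earlier question of how to structure fine-tuning data: FastChat's training scripts consume ShareGPT-style conversation records (see `data/dummy_conversation.json` in the repo for the reference format). A minimal sketch that writes such a file from question/answer pairs about the book; the pairs themselves are hypothetical and would have to be authored or generated separately:

```python
# Sketch: turn question/answer pairs about a book into the conversation-style
# JSON that FastChat's training scripts expect (compare data/dummy_conversation.json
# in the FastChat repo). The Q/A pairs here are placeholders.
import json

qa_pairs = [
    ("What is tokenization?",
     "Tokenization is the process of splitting text into smaller units called tokens."),
    ("What does TF-IDF stand for?",
     "TF-IDF stands for term frequency-inverse document frequency."),
]

records = [
    {
        "id": f"book_qa_{i}",
        "conversations": [
            {"from": "human", "value": question},
            {"from": "gpt", "value": answer},
        ],
    }
    for i, (question, answer) in enumerate(qa_pairs)
]

with open("book_qa.json", "w", encoding="utf-8") as f:
    json.dump(records, f, indent=2, ensure_ascii=False)
```

Note that fine-tuning alone does not guarantee the model answers only from the book; retrieval at inference time, as sketched earlier in the thread, is usually the more direct fit for that requirement.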