TypeChat
Docs on using TypeChat with other LLMs
What I understand is that TypeChat can be an alternative to the LangChain library, but in the whole documentation I only see it being used with OpenAI's GPT models. So, is there any option to use it with an open-source model like LLaMA or Falcon?
You can implement any TypeChatLanguageModel that satisfies the interface.
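For illustration, here is a minimal sketch of such an implementation. The `Result` and `TypeChatLanguageModel` shapes are declared inline so the snippet is self-contained (in a real project, import them from `typechat`), and the HTTP endpoint plus its request/response shape are assumptions standing in for whatever local server (e.g. GPT4All or llama.cpp) you run:

```typescript
// Minimal inline mirrors of typechat's Result and TypeChatLanguageModel
// shapes; in a real project, import these from "typechat" instead.
type Result<T> = { success: true; data: T } | { success: false; message: string };

interface TypeChatLanguageModel {
  complete(prompt: string): Promise<Result<string>>;
}

// Hypothetical local endpoint: any HTTP server exposing a text-completion
// route could back this. The JSON body shape ({ prompt } in, { text } out)
// is an assumption -- adapt it to the API your model server actually exposes.
function createLocalModel(endpoint: string): TypeChatLanguageModel {
  return {
    async complete(prompt: string): Promise<Result<string>> {
      try {
        const response = await fetch(endpoint, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ prompt }),
        });
        if (!response.ok) {
          return { success: false, message: `HTTP ${response.status}` };
        }
        const json = await response.json();
        return { success: true, data: json.text };
      } catch (e) {
        // Network errors surface as a failed Result rather than a throw,
        // which lets TypeChat's retry logic handle them.
        return { success: false, message: String(e) };
      }
    },
  };
}
```

Anything satisfying that one `complete` method can then be passed wherever TypeChat expects a model.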
It is not documented aside from the source code, but if you need an example you can use my early Quivr implementation of TypeChat here as a reference:
https://github.com/DKFN/quivr-typechat/blob/e1c68c2c5e9d9479386cf66f6b6438dcac32e05b/src/main.ts
This is how I run typechat on my Quivr instance using GPT4All
I hope this helps !
EDIT: Referenced the exact commit SHA and file. For now, the code is simple enough to serve as an example, but it will become more complex in the future and could confuse people searching for examples.
Why not write some documentation for these steps? I see it as a valuable resource!
If the complexity grows, the documentation can be extended too!
Thanks @Underewarrr, that is a great suggestion!
I will start working on documentation for it after work.
Hey all, we'll probably pursue some specific docs on plugging in a specific model.
Until then, I've also put together a doc over at https://github.com/microsoft/TypeChat/pull/65 which specifically points out TypeChatLanguageModel
as something you can implement to make TypeChat work with any language model.
Speaking of which, there are at least two PRs (#34 and #55) which are entirely about adding a proxy to the built-in create*LanguageModel functions. Maybe we should include proxies in our documentation alongside plugging into specific language models.
As an aside, over time I don't know how many external language models we will have "out-of-the-box" support for. We might opt to keep things as-is, add more support, or remove all the built-in helpers. Others on the team might feel differently.
@DKFN thanks for your example. It seems to work when adapted to Text Generation Inference (TGI) serving a Llama model.
Awesome! I am happy it worked for you :) I have dropped the documentation effort since Daniel mentioned work was already under way for it. In the meantime, I am glad you found my little code useful :)