berkecanrizai
Hi, I want to fine-tune the 7b model. Am I supposed to download the provided checkpoint and fine-tune it as shown in this repo: https://github.com/EleutherAI/gpt-neox#using-custom-data ? Would they be compatible...
### Steps to reproduce

If you have a schema that has an optional parameter and you create a subclass of that schema, the optional parameter becomes non-optional. Example: ```python class...
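The snippet above is truncated, so the exact schema library isn't visible. The sketch below only reconstructs the shape of the report with pydantic-style models; the class names and the `tag` field are hypothetical, and the comments mark where the reported regression would show up.

```python
from typing import Optional

from pydantic import BaseModel


class Parent(BaseModel):
    tag: Optional[str] = None  # optional in the parent schema


class Child(Parent):
    value: int  # subclass adds its own required field


# Expected: `tag` stays optional in the subclass, so this succeeds.
# The report describes the opposite in the affected library: after
# subclassing, the optional parameter is treated as required.
print(Child(value=1))
```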
There is already a set of prompts such as `prompt_query_rewrite` and `prompt_query_rewrite_hyde`; allow setting the query transformation behaviour in the `BaseRAGQA` initialization. It can default to `None` to skip it.
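A minimal sketch of what the proposed initialization could look like; `BaseRAGQA` and the prompt names come from the request, but the constructor signature and the `_maybe_transform` helper are illustrative assumptions, not the library's actual API.

```python
from typing import Optional


class BaseRAGQA:
    """Illustrative sketch of the proposed interface, not the real class."""

    def __init__(self, query_transform_prompt: Optional[str] = None):
        # None (the default) skips query transformation entirely;
        # otherwise a prompt such as prompt_query_rewrite or
        # prompt_query_rewrite_hyde is applied before retrieval.
        self.query_transform_prompt = query_transform_prompt

    def _maybe_transform(self, query: str) -> str:
        if self.query_transform_prompt is None:
            return query  # default: no rewriting
        # Placeholder: the real implementation would call the LLM with the
        # chosen rewrite prompt and return its output.
        return f"{self.query_transform_prompt}\n{query}"
```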
**Is your feature request related to a problem? Please describe.**

Parsers should check whether a given LLM instance supports vision inputs; with LiteLLM this can be checked via `litellm.supports_vision`.

**Describe the solution you'd...
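`litellm.supports_vision` is the check named in the request; the wrapper function below and the place a parser would call it are only an illustrative sketch.

```python
import litellm


def ensure_vision_support(model: str) -> None:
    """Raise early if the configured LLM cannot accept image inputs."""
    # litellm.supports_vision looks up litellm's model metadata and
    # returns True/False for the given model name.
    if not litellm.supports_vision(model=model):
        raise ValueError(
            f"Model '{model}' does not support vision inputs; "
            "choose a vision-capable model for image parsing."
        )


# Example: a parser could run this check before sending image content.
ensure_vision_support("gpt-4o")
```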