One
We picked the 3rd epoch out of 5. You can find the full hyperparameters [here](https://huggingface.co/openchat/openchat-3.5-0106/blob/main/openchat.json).
Thanks! Sorry, the dataset contains some private data, but we are considering releasing its open subsets.
> Similar question. Did V3.5 still use the same strategy described in the _openchat_ paper, or were there other changes, and will you publish a new-version report?

Yes, it's mainly...
Use the `--enable-sys-prompt` flag. We will update the README to include instructions for system prompts.
If no tokens are specified in the server command arguments via `--api-keys`, you can use an arbitrary token or no token at all. If specified, you should use any of the...
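For illustration, a minimal client-side sketch, assuming the server follows the OpenAI-style bearer-token convention and listens on the default port 18888 (both are assumptions here; the token string is a placeholder and would be one of the values passed to `--api-keys`):

```python
from openai import OpenAI  # assumes openai-python >= 1.0

# Point the client at the local OpenChat server instead of the OpenAI API.
client = OpenAI(
    base_url="http://localhost:18888/v1",  # assumed default host/port
    api_key="YOUR_TOKEN_HERE",             # any string if --api-keys was not set
)

response = client.chat.completions.create(
    model="openchat_3.5",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```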
See https://github.com/imoneoi/openchat#request-example for request examples. The `model` field should be `openchat_3.5`.
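As a rough sketch of what such a request looks like with plain `requests` (port 18888 is assumed to match the README example; adjust if your server is configured differently):

```python
import requests

# Send a chat completion request to the locally running OpenChat server.
resp = requests.post(
    "http://localhost:18888/v1/chat/completions",
    json={
        "model": "openchat_3.5",
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```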
Because vLLM pre-allocates GPU memory for the KV cache. You can run `python3 -m ochat.serving.openai_api_server --help` to see the arguments that control this preallocation behavior.
Yes. We are actively working on it 👀
@sileod Thanks for the detailed response! I'm using the FLAN 2022 dataset (https://huggingface.co/datasets/Open-Orca/FLAN). What is FLAN with SNI? Also, are the listed tasks not present in FLAN 2022 and Bigbench...
@domeccleston +1 👍