openchat
OpenChat: Advancing Open-source Language Models with Imperfect Data
I conducted concurrent-processing tests on RTX 3090s, and the results showed that it takes longer for 8 GPUs to process the same concurrent requests than for 2 GPUs, and the...
Maybe I'm wrong, but I think the memory usage can be estimated as 7.3 * 4 ≈ 30 GB; however, nvidia-smi shows 72 GB after running `python3 -m ochat.serving.openai_api_server --model berkeley-nest/Starling-LM-7B-alpha`. Can we control this?
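One likely explanation is that vLLM, which backs the server, pre-allocates most of each GPU (the default `gpu_memory_utilization` is 0.9) for the KV cache, so nvidia-smi reports the reserved pool rather than the weight footprint alone. Below is a minimal sketch of capping that pool by loading the model with vLLM directly; whether the ochat server forwards this knob is not confirmed here.

```python
# Minimal sketch: limit vLLM's pre-allocated memory pool when loading the model.
# gpu_memory_utilization controls the fraction of each GPU vLLM reserves (default ~0.9).
from vllm import LLM, SamplingParams

llm = LLM(
    model="berkeley-nest/Starling-LM-7B-alpha",
    gpu_memory_utilization=0.5,  # reserve roughly half the card instead of ~90%
)
print(llm.generate(["Hello"], SamplingParams(max_tokens=16)))
```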
Is it possible to further do instruction tuning on OpenChat with domain-specific data? If so, is there any boilerplate that can be used as a starting point? I had...
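As a starting point, here is a minimal sketch of continued instruction tuning with a standard Hugging Face causal-LM fine-tune, not the project's official training pipeline. The dataset file, hyperparameters, and assumption that the hardware can hold the full model are all placeholders.

```python
# Minimal sketch: continue fine-tuning an OpenChat checkpoint on a domain dataset
# (a JSONL file with a single "text" column is assumed).
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

model_name = "openchat/openchat_3.5"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # causal-LM tokenizers often lack a pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

dataset = load_dataset("json", data_files="domain_data.jsonl")["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=2048)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="openchat-domain-sft",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=16,
        num_train_epochs=1,
        bf16=True,
        logging_steps=10,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```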
Hello OpenChat Team, First and foremost, I would like to express my sincere appreciation for your work on OpenChat 3.5. It's been a go-to model for my projects, and I'm...
Hello @imoneoi, I would like to ask for your guidance on whether OpenChat 3.5 supports using **"role": "system"** when calling it with the OpenAI API format.
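For reference, here is a minimal sketch of the kind of request in question, sent through the OpenAI-compatible endpoint. The port (18888) and served model name ("openchat_3.5") are assumptions based on the default local setup; whether the chat template actually honors the system role is exactly what is being asked.

```python
# Minimal sketch: send a chat request containing a "system" message to a local
# OpenAI-compatible OpenChat server.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:18888/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="openchat_3.5",
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize ZeRO-3 in one sentence."},
    ],
)
print(response.choices[0].message.content)
```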
Congrats on the v3.5 release! May I ask if there are plans to release your fine-tuning data, as you have always done with your previous releases?
Does anyone have a Colab notebook?
Hello, I know a lot of people want to train the OpenChat model, but accessing modern GPUs like A100s or H100s seems difficult. So I tried using ZeRO-3 to train...
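For anyone in a similar situation, below is a minimal sketch of a ZeRO-3 configuration passed to the Hugging Face Trainer with CPU offload, which is one way to fit a 7B model on smaller GPUs. The values are illustrative, not the project's official training settings.

```python
# Minimal sketch: ZeRO-3 with CPU offload via the Hugging Face Trainer's deepspeed hook.
from transformers import TrainingArguments

ds_config = {
    "zero_optimization": {
        "stage": 3,
        "offload_optimizer": {"device": "cpu"},  # push optimizer states to CPU RAM
        "offload_param": {"device": "cpu"},      # push idle parameters to CPU RAM
        "overlap_comm": True,
    },
    "bf16": {"enabled": True},
    "gradient_accumulation_steps": "auto",
    "train_micro_batch_size_per_gpu": "auto",
}

args = TrainingArguments(
    output_dir="openchat-zero3",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,
    bf16=True,
    deepspeed=ds_config,  # Trainer accepts a config dict or a path to a JSON file
)
```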
Most models can be accessed and used through the Ollama interface, which makes the process convenient and straightforward. Testing and deployment are also simplified this way. It...
# Issue: Difficulty in Downloading Large Files During openchat Installation
## Problem Description
Hello, I've encountered some difficulties while installing `openchat`, mainly due to the size of the file `pytorch_model-00001-of-00002.bin`....
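One workaround worth trying is fetching the weights separately with `huggingface_hub`, which retries and resumes interrupted downloads, and then pointing the server at the local directory. The repo id below is an assumption.

```python
# Minimal sketch: download the model weights ahead of time instead of relying on the
# automatic download at first server start.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="openchat/openchat_3.5",
    local_dir="models/openchat_3.5",
)
```

The resulting local path can then be passed to the server's `--model` argument.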