
Results: 104 OpenChatKit issues

To better support private models, pass `use_auth_token` to HuggingFace transformers if either the `OCK_USE_AUTH_TOKEN` environment variable is set or `--use-token` is passed on the command line.
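The resolution logic described above can be sketched as follows. This is a minimal sketch, not the project's actual implementation; `resolve_use_auth_token` is a hypothetical helper name, while `OCK_USE_AUTH_TOKEN` and `--use-token` come from the issue text.

```python
import argparse
import os

def resolve_use_auth_token(argv):
    """Hypothetical helper: decide whether to pass use_auth_token=True
    to HuggingFace transformers. Returns True if the OCK_USE_AUTH_TOKEN
    environment variable is set, or if --use-token is on the command line."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--use-token", action="store_true")
    args, _ = parser.parse_known_args(argv)
    return bool(os.environ.get("OCK_USE_AUTH_TOKEN")) or args.use_token

# The result would then be forwarded to transformers, e.g.:
#   AutoModelForCausalLM.from_pretrained(model_name,
#       use_auth_token=resolve_use_auth_token(sys.argv[1:]))
```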

### My question: More wins than losses ### Bot response: Gains aplenty ### Ideal bot response: Point-to-point ### Bot response was: - [X] Factually incorrect - [X] Not helpful - [X] Harmful, inappropriate or...

feedback report

What’s the roadmap for the project becoming a true open alternative to ChatGPT? While its capabilities are impressive on their own, stacked against ChatGPT there’s a lot lacking. For example… *...

Hi, it would be great if you provided LoRA training to reduce the computational cost, because 8x A100 80GB is too expensive.
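The cost argument behind this request can be made concrete with a parameter count: LoRA freezes each weight matrix and trains only two low-rank factors. The numbers below are illustrative assumptions (a 4096x4096 projection, rank 16), not OpenChatKit's actual configuration.

```python
def trainable_params(d_in, d_out, rank=None):
    """Trainable parameters for one weight matrix.
    rank=None: full fine-tuning (the whole d_out x d_in matrix).
    rank=r:    LoRA, which freezes W and trains only the low-rank
               factors A (r x d_in) and B (d_out x r)."""
    if rank is None:
        return d_in * d_out
    return rank * d_in + d_out * rank

# Illustrative numbers (assumed): one 4096x4096 projection, LoRA rank 16.
full = trainable_params(4096, 4096)           # 16,777,216
lora = trainable_params(4096, 4096, rank=16)  # 131,072
print(f"LoRA trains {lora / full:.2%} of the full matrix's parameters")
```

With these assumptions LoRA trains well under 1% of the matrix's parameters, which is why it also shrinks the optimizer-state memory that dominates multi-GPU fine-tuning.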

I started a training process with 4x V100S (32 GB VRAM each) at 18:00, and I got a "training starts..." prompt. With nvidia-smi, I can see that 3 GPUs are running with utils...

It is recommended to describe the operating environment required for installation in the readme.md file: supported OS (for example, that macOS is not recommended), CPU, memory, storage, and other requirements.

**Describe the bug** A clear and concise description of what the bug is. **To Reproduce** Steps to reproduce the behavior: 1. Go to '...' 2. Click on '....' 3. Scroll...

Hi, what is the minimum resource requirement (disk, memory, and VRAM) to fine-tune the Llama-2-7B-32K-beta model with the finetune_llama-2-7b-32k-mqa.sh script? Looking forward to your response. Best regards.
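A back-of-envelope answer to questions like this uses the standard rule of thumb for full fine-tuning with Adam: per parameter, bf16 weights (2 B) + bf16 gradients (2 B) + two fp32 optimizer moments (8 B). This is a rough lower bound under those assumptions, not a measured figure for Llama-2-7B-32K.

```python
def finetune_memory_gb(n_params, weight_bytes=2, grad_bytes=2, optim_bytes=8):
    """Rough lower bound on total GPU memory for full fine-tuning, in GB.
    Rule of thumb: bf16 weights + bf16 grads + Adam's two fp32 moment
    buffers per parameter. Ignores activations, which add a large,
    batch- and sequence-length-dependent amount on top."""
    return n_params * (weight_bytes + grad_bytes + optim_bytes) / 1e9

# A 7B-parameter model needs roughly 84 GB before activations.
print(f"~{finetune_memory_gb(7e9):.0f} GB before activations")
```

At 32K context the activation term is substantial, so the real requirement is well above this floor; sharding the optimizer state across GPUs (e.g. ZeRO-style) changes the per-device number.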

### Error Following the steps in the readme to run `finetune_llama-2-7b-32k-mqa.sh`, got the error below:

```
Traceback (most recent call last):
  File "/home/ubuntu/training/OpenChatKit/training/dist_clm_train.py", line 478, in <module>
    main()
  File "/home/ubuntu/training/OpenChatKit/training/dist_clm_train.py", line 443, in main
    ...
```