
104 OpenChatKit issues

This PR:
- Adds the argument `--load-in-8bit` for inference
- Adds an example Jupyter/Colab notebook that can run `bot.py` inference (quantized) on a free Colab account (would have crashed after...
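A minimal sketch of how such a flag might be wired up, assuming the usual `argparse` plus Hugging Face `transformers` path (the flag and model names are taken from the PR text; the exact wiring inside `bot.py` may differ):

```python
import argparse

from transformers import AutoModelForCausalLM, AutoTokenizer

parser = argparse.ArgumentParser()
parser.add_argument("--model", default="togethercomputer/Pythia-Chat-Base-7B")
parser.add_argument("--load-in-8bit", action="store_true",
                    help="quantize weights to int8 (via bitsandbytes) so inference fits on a small GPU")
args = parser.parse_args()

tokenizer = AutoTokenizer.from_pretrained(args.model)
model = AutoModelForCausalLM.from_pretrained(
    args.model,
    device_map="auto",               # let accelerate place layers on the available GPU(s)
    load_in_8bit=args.load_in_8bit,  # argparse maps --load-in-8bit to args.load_in_8bit
)
```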

**Describe the bug** When running `python inference/bot.py --model togethercomputer/Pythia-Chat-Base-7B --retrieval`, it reports a RuntimeError: The size of tensor a (2048) must match the size of tensor b (2131) at non-singleton...

$ python inference/bot.py --model togethercomputer/Pythia-Chat-Base-7B --max-tokens 128 Loading togethercomputer/Pythia-Chat-Base-7B to cuda:0... Loading checkpoint shards: 100%| 2/2 [00:06] >>> hi Traceback (most recent call last): File "Togethercomputer-GPT-NeoXT-Chat-Base-20B/OpenChatKit-main/inference/bot.py", line 269, in main() File "Togethercomputer-GPT-NeoXT-Chat-Base-20B/OpenChatKit-main/inference/bot.py",...

I want to use OpenChatKit to chat with a PDF and ask questions about it. I suppose I can use pypdf to extract the text from the PDF and then...
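One way this could look, as a rough sketch: extract the pages with `pypdf` and prepend the text as context before the question (the file name and prompt wording below are illustrative, not from the issue):

```python
from pypdf import PdfReader

def pdf_to_text(path: str) -> str:
    # Concatenate the extracted text of every page; extract_text() can return None.
    reader = PdfReader(path)
    return "\n".join(page.extract_text() or "" for page in reader.pages)

context = pdf_to_text("manual.pdf")  # illustrative file name
question = "What does the introduction say?"
prompt = f"{context}\n\nQuestion: {question}\nAnswer:"
# `prompt` can then be fed to bot.py / the model, keeping the model's context-length limit in mind.
```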

Hey, Karo from [Aim](https://github.com/aimhubio/aim) here! :wave: This is an awesome project! We would be more than happy to submit an integration PR. Aim is an open-source and supercharged experiment tracker....
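For context, a minimal sketch of what logging training metrics with Aim typically looks like (the experiment name and metric values are placeholders, not part of any existing OpenChatKit integration):

```python
from aim import Run

run = Run(experiment="openchatkit-finetune")      # placeholder experiment name
run["hparams"] = {"lr": 1e-5, "batch_size": 32}   # placeholder hyperparameters

for step, loss in enumerate([2.31, 1.97, 1.64]):  # placeholder loss values
    run.track(loss, name="train_loss", step=step)
```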

I learned that Vicuna can train a large language model on a very low-end configuration, but only for research purposes. Can we learn from Vicuna to train and fine-tune the...

My environment is GPU: V100-32G, torch: 1.13.1+cu116, python: 3.7.13. I load the model in int8: `tokenizer = AutoTokenizer.from_pretrained("togethercomputer/GPT-NeoXT-Chat-Base-20B")` `model = AutoModelForCausalLM.from_pretrained("togethercomputer/GPT-NeoXT-Chat-Base-20B", device_map="auto", load_in_8bit=True)` And when I run `model.generate`, the...
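For reference, a self-contained version of that int8 loading path (a sketch, assuming `bitsandbytes` and `accelerate` are installed; the prompt string follows the model card's `<human>:`/`<bot>:` format):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "togethercomputer/GPT-NeoXT-Chat-Base-20B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",   # place layers automatically across GPU/CPU
    load_in_8bit=True,   # int8 quantization via bitsandbytes
)

inputs = tokenizer("<human>: Hello!\n<bot>:", return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```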

**Is your feature request related to a problem? Please describe.** Hi, is there a reason or motivation behind the probabilities in the data files? I am curious about making a...
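If those probabilities are per-file sampling weights (an assumption, not confirmed by the issue), their effect would be roughly a weighted draw over datasets, as in this sketch:

```python
import random

# Assumption: each data file carries a probability used as a sampling weight,
# so higher-weight files contribute proportionally more examples to the mixture.
weights = {
    "unified_chip2.jsonl": 0.5,              # illustrative file names and weights
    "unified_grade_school_math.jsonl": 0.3,
    "unified_poetry.jsonl": 0.2,
}

names, probs = zip(*weights.items())
sampled_files = random.choices(names, weights=probs, k=10)
print(sampled_files)
```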

**Describe the bug** A clear and concise description of what the bug is. **To Reproduce** Steps to reproduce the behavior: 1. Go to '...' 2. Click on '....' 3. Scroll...

This PR does several things:
- Adds feedback data in `data/OIG/prepare.py`
- Adds a fine-tuning script, `training/finetune_Pythia-Chat-Base-7B-feedback.sh`, which further fine-tunes on the checkpoint produced by `training/finetune_Pythia-Chat-Base-7B.sh`.
- Some trivial...