OpenChatKit
I am trying to run it, but it gives me this error:

```
$ mamba env create -f environment.yml
pytorch/linux-64   Using cache
pytorch/noarch     Using cache
nvidia/linux-64    Using cache
nvidia/noarch      Using cache
...
```
Add a script to merge the LoRA weights into the base model; the layout of the resulting merged model is shown in the linked PR. Thanks to @orangetin for the solution. See more: https://github.com/togethercomputer/OpenChatKit/pull/113#issuecomment-1581085627 https://github.com/togethercomputer/OpenChatKit/issues/127#issuecomment-1584961110
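A minimal sketch of how such a merge can be done with the PEFT library. The model name, adapter path, and output path below are placeholders, not the exact values used in the linked PR, which remains the authoritative version:

```python
# Sketch: fold trained LoRA adapter weights into the base model with PEFT.
# Names and paths are placeholders for illustration only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_name = "togethercomputer/GPT-NeoXT-Chat-Base-20B"  # placeholder base model
lora_adapter_path = "./lora-checkpoint"                        # placeholder adapter directory
output_path = "./merged-model"                                 # where the merged weights go

# Load the base model and attach the trained LoRA adapter.
base_model = AutoModelForCausalLM.from_pretrained(base_model_name, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base_model, lora_adapter_path)

# Fold the LoRA deltas into the base weights and drop the adapter wrappers.
merged_model = model.merge_and_unload()

# Save a plain transformers checkpoint that no longer needs PEFT at inference time.
merged_model.save_pretrained(output_path)
AutoTokenizer.from_pretrained(base_model_name).save_pretrained(output_path)
```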
**Is your feature request related to a problem? Please describe.** I'm always frustrated when OpenChatKit cannot code; if it could, it would really be an alternative to ChatGPT. **Describe...
**Describe the bug** When trying `python3 inference/bot.py`, the following error occurs:

```
Traceback (most recent call last):
  File "/home/darrin/miniconda3/envs/OpenChatKit/lib/python3.9/site-packages/transformers/configuration_utils.py", line 616, in _get_config_dict
    resolved_config_file = cached_path(
  File "/home/darrin/miniconda3/envs/OpenChatKit/lib/python3.9/site-packages/transformers/utils/hub.py", line ...
```
I want to continue pre-training the t5-base model. Who can help me?
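A minimal sketch, not OpenChatKit code, of continuing to train t5-base with the Hugging Face Trainer. Real T5 pre-training uses a span-corruption objective; this sketch assumes the corpus has already been converted into (input_text, target_text) pairs with sentinel tokens, and the dataset, hyperparameters, and output directory are placeholders:

```python
# Sketch: continued training of t5-base on pre-processed seq2seq pairs.
from datasets import Dataset
from transformers import (
    T5ForConditionalGeneration,
    T5TokenizerFast,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
    DataCollatorForSeq2Seq,
)

tokenizer = T5TokenizerFast.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# Placeholder corpus: replace with your own pre-processed span-corruption pairs.
raw = Dataset.from_dict({
    "input_text": ["The quick brown <extra_id_0> jumps over the <extra_id_1> dog."],
    "target_text": ["<extra_id_0> fox <extra_id_1> lazy <extra_id_2>"],
})

def tokenize(batch):
    model_inputs = tokenizer(batch["input_text"], truncation=True, max_length=512)
    labels = tokenizer(text_target=batch["target_text"], truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_ds = raw.map(tokenize, batched=True, remove_columns=raw.column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="./t5-continued",          # placeholder output directory
        num_train_epochs=1,
        per_device_train_batch_size=8,
        learning_rate=1e-4,
    ),
    train_dataset=train_ds,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```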
Create an FAQ to answer questions that people might have that are difficult to find in the README. Generated by Claude based on the README file.