h2ogpt
Private chat with local GPT with document, images, video, etc. 100% private, Apache 2.0. Supports oLLaMa, Mixtral, llama.cpp, and more. Demo: https://gpt.h2o.ai/ https://gpt-docs.h2o.ai/
gradio_client==0.1.3 causes the gradio app to fail with a recursion error when using client_test.py
Running `python generate.py --base_model=gpt2` causes the gradio server to fail with a recursion error: [clientout.log](https://github.com/h2oai/h2ogpt/files/11264049/clientout.log) It still fails with the same requirements when rolled back to ebaedb74599aefb6cc0361dcbff92cee5bd6a29c, which previously worked fine. Changing the gradio version doesn't help,...
https://github.com/Digitous/GPTQ-for-GPT-NeoX
https://platform.openai.com/docs/guides/chat/introduction https://github.com/openai/openai-cookbook/blob/main/examples/How_to_format_inputs_to_ChatGPT_models.ipynb Then our non-chat and chat cases would also be covered, but as a Python-level API instead of bare-bones API calls with api_name strings. fastchat did this,...
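A minimal sketch of the OpenAI-style chat format the cookbook above describes: the role/content message schema is the OpenAI Chat API's, but the `build_messages` helper itself is hypothetical, just to show how both the non-chat and chat cases collapse into one message list.

```python
def build_messages(history, new_user_input, system_prompt=None):
    """Turn (user, assistant) history pairs plus a new user input into
    an OpenAI-style chat message list. Hypothetical helper for illustration."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    for user_msg, bot_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": bot_msg})
    messages.append({"role": "user", "content": new_user_input})
    return messages

# Non-chat case: empty history reduces to a single user message.
print(build_messages([], "Summarize this document."))
```

The same structure covers a plain instruction (empty history) and a multi-turn chat, which is why a Python-level API along these lines subsumes both cases.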
See also: https://github.com/h2oai/h2o-llmstudio/blob/main/prompts/eval_template.txt
https://twitter.com/omarsar0/status/1641792530667675648/photo/1
gradio error for certain inputs:
```
Downloading pytorch_model.bin: 100%|██████████| 1.74G/1.74G [00:25
```
E.g. like https://github.com/poloniki/quint/blob/master/notebooks/Chunking%20text%20into%20paragraphs.ipynb [Chunking text into paragraphs.ipynb.zip](https://github.com/h2oai/h2ogpt/files/11305565/Chunking.text.into.paragraphs.ipynb.zip)
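A rough sketch of the chunking idea from that notebook, assuming simple blank-line paragraph splitting plus a maximum chunk size; the notebook itself uses sentence embeddings to find topic boundaries, so this stdlib-only version is only an approximation.

```python
import re

def chunk_paragraphs(text, max_chars=500):
    """Split text on blank lines, then greedily merge paragraphs into
    chunks no longer than max_chars. Illustrative sketch, not the
    embedding-based method from the linked notebook."""
    paragraphs = [p.strip() for p in re.split(r"\n\s*\n", text) if p.strip()]
    chunks, current = [], ""
    for para in paragraphs:
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks
```

Keeping paragraph boundaries intact like this tends to give the retriever more coherent chunks than fixed-width character splits.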
When I ran `sudo docker-compose up -d --build` and used `docker-compose logs -f` to check, I got the following errors. My system has 32 GB DRAM and a TitanX GPU, 12GB...
Hi Team, nice work, and I appreciate your efforts on this project 🫡 I am trying to run the Docker container and ran into the following issue when executing the command...
## Step 1: Get best open-source model

Model: `togethercomputer/GPT-NeoXT-Chat-Base-20B` https://huggingface.co/togethercomputer/GPT-NeoXT-Chat-Base-20B

## Step 2: Get good open-source instruct data

Inspired by https://bair.berkeley.edu/blog/2023/04/03/koala/ Note: GPT-NeoXT-Chat-Base-20B was already trained on OIG data, so...
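To make Step 1 concrete, a sketch of building a prompt for GPT-NeoXT-Chat-Base-20B, assuming the `<human>:`/`<bot>:` turn markers described on the OpenChatKit model card; the helper function name is ours and the exact template should be checked against the model card before use.

```python
def format_neoxt_prompt(history, user_input):
    """Format a conversation for GPT-NeoXT-Chat-Base-20B, assuming the
    <human>:/<bot>: turn markers from the model card (unverified sketch).
    history: list of (human, bot) turn pairs."""
    prompt = ""
    for human, bot in history:
        prompt += f"<human>: {human}\n<bot>: {bot}\n"
    # Leave the final <bot>: open so the model completes the reply.
    prompt += f"<human>: {user_input}\n<bot>:"
    return prompt
```

Matching the prompt template the base model was tuned on matters: instruct data collected in Step 2 would be rendered into this same format before fine-tuning.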