h2o-llmstudio
H2O LLM Studio - a framework and no-code GUI for fine-tuning LLMs. Documentation: https://docs.h2o.ai/h2o-llmstudio/
Explore the following things: - Is it possible to specify the device of the tensors when pushing? - Is it always the case that CPU loading has float32 and double...
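As a rough sketch of the dtype question (the backbone id and Hub repo name below are placeholders, not part of the issue), loading on CPU without an explicit `torch_dtype` keeps the weights in float32, and casting before `push_to_hub` controls what ends up in the pushed checkpoint:

```python
import torch
from transformers import AutoModelForCausalLM

# With no torch_dtype argument, transformers loads the weights in float32 on CPU.
model = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder backbone
print(next(model.parameters()).dtype)  # torch.float32

# Casting before pushing controls the dtype stored in the Hub checkpoint.
model = model.to(dtype=torch.float16)
model.push_to_hub("my-org/my-finetuned-model")  # hypothetical repo id
```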
This PR adds official documentation for LLM Studio using Makersaurus (the documentation platform used to document H2O products). The content is drawn from various sources including, but not limited to...
Here are a few things to improve in the model card: - Some models fail if the input contains unexpected keys, such as token type IDs. We should delete them after the...
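A minimal sketch of what such filtering could look like (a hypothetical helper, not the project's actual code): prune the tokenizer output down to the arguments the backbone's `forward` accepts.

```python
import inspect

def filter_model_inputs(model, batch: dict) -> dict:
    """Keep only the keys the backbone's forward() accepts,
    dropping e.g. token_type_ids for models that do not use them."""
    accepted = set(inspect.signature(model.forward).parameters)
    return {k: v for k, v in batch.items() if k in accepted}

# Usage:
# batch = tokenizer("Hello", return_tensors="pt")
# outputs = model(**filter_model_inputs(model, batch))
```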
### 🐛 Bug Training on a custom dataset using the docker release of LLM Studio. Selected the `h2oai/h2ogpt-gm-oasst1-en-2048-falcon-7b-v2` backbone. At the start of training I receive a `ValueError: Please specify...
Currently, we rely on `peft` for the default settings of LoRA layers. We should try setting the defaults ourselves and improve the experience for new models where the layers are also...
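A rough sketch of what explicit defaults could look like with `peft` (the values and `target_modules` below are illustrative assumptions, not LLM Studio's actual configuration):

```python
from peft import LoraConfig, get_peft_model

# Explicit settings instead of relying on peft's built-in per-architecture defaults.
lora_config = LoraConfig(
    r=4,
    lora_alpha=16,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["query_key_value"],  # e.g. for Falcon-style attention blocks
)

# model = get_peft_model(backbone, lora_config)
# model.print_trainable_parameters()
```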
- [x] Set up Makersaurus + templates in the repo - [x] "What is LLM Studio?" page - [x] Key terms - [x] Concepts - [x] Set up LLM Studio...
### 🚀 Feature Ability to add arbitrary custom tokens, special or normal, via the GUI using [tokenizer.add_tokens](https://huggingface.co/docs/transformers/internal/tokenization_utils#transformers.SpecialTokensMixin.add_tokens). ### Motivation I want to add custom tokens to the tokenizer vocabulary. Similar to...
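For reference, the underlying Hugging Face calls look roughly like this (the backbone and token strings are placeholders); the embedding matrix must be resized after adding tokens so the model can represent the new vocabulary entries:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder backbone
model = AutoModelForCausalLM.from_pretrained("gpt2")

# special_tokens=True marks the new tokens so the tokenizer never splits them.
num_added = tokenizer.add_tokens(["<ACTION>", "<OBSERVATION>"], special_tokens=True)

# Grow the embedding matrix to cover the new vocabulary entries.
if num_added > 0:
    model.resize_token_embeddings(len(tokenizer))
```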
### 🚀 Feature One-click installers for Windows and Linux ### Motivation As a newbie to LLMs, I prefer focusing on my LLM rather than on installation. As of...
Hi team, just wanted to check whether there is any plan to provide support for [MPT (MosaicML Pretrained Transformer)] base models so that we can fine-tune them with...