Open-Assistant
OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
As the title says. Resolves https://github.com/LAION-AI/Open-Assistant/issues/2515
It is not clear to me whether we still need the "LLaMA worker" Dockerfile. I am also not sure whether we are still using, or will use, the `text-generation-inference` worker...
Is it possible to run the worker with models other than distilgpt2 on a non-GPU system? After successfully launching the services (profiles ci + inference) with the distilgpt2 model, I...
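For a first check on a CPU-only machine, it can help to confirm that the candidate model loads and generates with plain `transformers`, independent of the inference worker itself. This is only a sketch; the model name and prompt below are placeholders.

```python
# Quick CPU-only smoke test with plain transformers, independent of the
# inference worker. Model name and prompt are placeholders.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="distilgpt2",  # swap in the model you want to try
    device=-1,           # -1 forces CPU
)

print(generator("Hello, how are you?", max_new_tokens=20)[0]["generated_text"])
```

If this runs acceptably on CPU, the remaining question is whether the worker's own model configuration and memory limits allow the same model.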
I chose to use the existing `ExportMessageNode` and `ExportMessageTree` models and simply add new event types to allow inference chats to fit these models, as this seemed simpler than creating...
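As a rough illustration of that design choice (reusing the export models rather than adding a parallel schema), a minimal pydantic sketch might look like the following. The field names and the extra event type are assumptions for illustration, not the actual Open-Assistant export schema.

```python
# Illustrative sketch only: reusing export models for inference chats by
# adding new event types. Names here are assumptions, not the real schema.
from typing import Optional
from pydantic import BaseModel


class ExportMessageEvent(BaseModel):
    """Base type for events attached to an exported message."""
    type: str


class ExportMessageEventChatVote(ExportMessageEvent):
    """Hypothetical new event type carrying an inference-chat vote."""
    type: str = "chat_vote"
    score: int = 0


class ExportMessageNode(BaseModel):
    message_id: str
    text: str
    role: str  # "prompter" or "assistant"
    events: Optional[dict[str, list[ExportMessageEvent]]] = None


class ExportMessageTree(BaseModel):
    message_tree_id: str
    prompt: Optional[ExportMessageNode] = None
```

The appeal of this approach is that inference chats can be exported and consumed by the same tooling that already handles message trees, with the chat-specific signals carried as additional event types.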
Add new data sources (cAdvisor for Docker containers, dcgm-exporter for NVIDIA GPUs). Add dashboards for Docker / GPUs. Add a mono-board (WIP) with variables for Datasource and Job.
Closes #2579. This runs a bit faster on my local machine. Unfortunately, I think the main holdup is the PyTorch/CUDA install (which also slows down the inference worker image build), which...
Hey everyone, this dataset might be useful for training OA: https://github.com/tatsu-lab/stanford_alpaca. However, it is legally a bit dubious, as the dataset was created using `text-davinci-003`. OpenAI states that it is...
Add a system tag for each answer in a back-and-forth conversation. So we have to convert `[Q1, A1, Q2, A2]` to `<prompter>q1<system>attrib1<assistant>a1<prompter>q2<system>attrib2<assistant>a2`. This also includes changing the prompter and...
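A minimal sketch of the serialization described above, assuming a list of (question, system attributes, answer) turns; the tag strings are placeholders, and the actual special tokens used for training may differ.

```python
# Minimal sketch of the conversion described above. The tag strings are
# placeholders; the real special tokens used for training may differ.

def format_conversation(turns: list[tuple[str, str, str]]) -> str:
    """Serialize [(question, system_attributes, answer), ...] into one
    training string, inserting a system tag before every answer."""
    parts = []
    for question, attributes, answer in turns:
        parts.append(f"<prompter>{question}")
        parts.append(f"<system>{attributes}")
        parts.append(f"<assistant>{answer}")
    return "".join(parts)


# [Q1, A1, Q2, A2] with per-answer attributes becomes
# <prompter>q1<system>attrib1<assistant>a1<prompter>q2<system>attrib2<assistant>a2
print(format_conversation([("q1", "attrib1", "a1"), ("q2", "attrib2", "a2")]))
```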
This is a feature suggestion aimed at expanding the amount of data in smaller languages. "Translation" & "Translation Verification" tasks: Users can specify languages they're fluent in, as well as...
Closes #2972.