llm-archetype-batch-use-case

General solution to archetype LLM batch use case

2 llm-archetype-batch-use-case issues

Is there a way to work with a local LLM that is compatible with the [OpenAI ChatCompletion API specification](https://platform.openai.com/docs/api-reference/chat), e.g. [openchat](https://github.com/imoneoi/openchat)? Thanks.
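
One possible approach, not confirmed as supported by this repository: since servers such as openchat expose an OpenAI-compatible endpoint, the standard `openai` Python client can usually be pointed at the local server via `base_url`. The host, port, placeholder API key, and model name below are assumptions for illustration only, not settings taken from this project.

```python
# Minimal sketch: calling a local OpenAI-compatible chat endpoint.
# The URL, api_key placeholder, and model name are assumptions; adjust them
# to whatever your local server (e.g. an openchat deployment) actually exposes.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:18888/v1",  # local endpoint instead of api.openai.com
    api_key="not-needed",                  # many local servers ignore the key
)

response = client.chat.completions.create(
    model="openchat_3.5",  # hypothetical model name served locally
    messages=[{"role": "user", "content": "Summarize this document in one sentence."}],
)
print(response.choices[0].message.content)
```

Whether this drops cleanly into the batch pipeline depends on whether the project reads the API base URL from configuration rather than hard-coding the OpenAI endpoint.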

Hi, I am working on something similar, and there is no "preprocess.py" script in the main scripts folder.