Python: Verify local models in Ollama and LM Studio are compatible with the OpenAI connector
Motivation and Context
Related to https://github.com/microsoft/semantic-kernel/issues/6498
Local models offer developers a twofold benefit: increased flexibility and reduced costs. Ollama and LM Studio are two well-known platforms for hosting models locally, and both expose OpenAI-compatible endpoints. Our OpenAI connector must therefore work correctly when users run models on these platforms.
Description
- Verify that our OpenAI connector works as expected with models hosted locally using Ollama and LM Studio.
- Create three new samples (Ollama/chat, LM Studio/chat, LM Studio/Embedding) under `/concepts/local_models` showing how to use local models with the OpenAI connector (see the sketches after this list).
- Fix a bug in `test_sample_utils.py` where, if a test case was retried, the input was never reset.
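For reference, the pattern the new samples demonstrate is to reuse the standard OpenAI connector but hand it an `AsyncOpenAI` client whose `base_url` points at the local server. Below is a minimal sketch, not the exact sample code: it assumes Ollama is serving its OpenAI-compatible endpoint on the default port (11434), and the model name `phi3` is purely illustrative.

```python
import asyncio

from openai import AsyncOpenAI

from semantic_kernel.connectors.ai.open_ai import (
    OpenAIChatCompletion,
    OpenAIChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory


async def main() -> None:
    # Local servers typically ignore the API key, but the OpenAI client
    # requires a non-empty value, so pass a placeholder.
    client = AsyncOpenAI(api_key="fake-key", base_url="http://localhost:11434/v1")

    # Reuse the standard OpenAI connector, pointed at the local model.
    # "phi3" is an illustrative model name; use whatever model you pulled.
    service = OpenAIChatCompletion(ai_model_id="phi3", async_client=client)

    history = ChatHistory()
    history.add_user_message("Why is the sky blue?")

    settings = OpenAIChatPromptExecutionSettings(max_tokens=256)
    results = await service.get_chat_message_contents(chat_history=history, settings=settings)
    print(results[0].content)


asyncio.run(main())
```

The LM Studio embedding sample follows the same shape. This sketch assumes LM Studio's server on its default port (1234); the embedding model name is again illustrative.

```python
import asyncio

from openai import AsyncOpenAI

from semantic_kernel.connectors.ai.open_ai import OpenAITextEmbedding


async def embed() -> None:
    client = AsyncOpenAI(api_key="fake-key", base_url="http://localhost:1234/v1")
    # "nomic-embed-text" is an illustrative name; use the embedding model
    # LM Studio is actually serving.
    service = OpenAITextEmbedding(ai_model_id="nomic-embed-text", async_client=client)
    embeddings = await service.generate_embeddings(["Hello, world!"])
    print(embeddings.shape)  # one row of floats per input text


asyncio.run(embed())
```

As for the `test_sample_utils.py` fix, the bug class is easiest to see in a hypothetical retry helper (the real utility may be structured differently): if the patched input sequence is created once outside the retry loop, a retried test case sees an already-exhausted iterator, so the inputs must be reset on every attempt.

```python
# Hypothetical sketch of the retry/input-reset bug class; the real
# test_sample_utils.py helper may differ.
def run_with_retries(sample, inputs: list[str], retries: int = 3) -> None:
    for attempt in range(retries):
        # Reset the inputs on every attempt. Creating the iterator once,
        # outside the loop, would leave retried attempts with no input.
        input_iter = iter(inputs)
        try:
            sample(lambda prompt="": next(input_iter))
            return
        except Exception:
            if attempt == retries - 1:
                raise
```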
Contribution Checklist
- [ ] The code builds clean without any errors or warnings
- [ ] The PR follows the SK Contribution Guidelines and the pre-submission formatting script raises no violations
- [ ] All unit tests pass, and I have added new tests where possible
- [ ] I didn't break anyone :smile:
Python 3.10 Test Coverage Report

| File | Stmts | Miss | Cover | Missing |
|---|---|---|---|---|
| semantic_kernel/connectors/ai/open_ai/services/open_ai_chat_completion.py | 23 | 1 | 96% | 61 |
| TOTAL | 6814 | 767 | 89% | |
Python 3.10 Unit Test Overview
| Tests | Skipped | Failures | Errors | Time |
|---|---|---|---|---|
| 1598 | 1 :zzz: | 0 :x: | 0 :fire: | 26.351s :stopwatch: |