
Python: Verify local models in Ollama and LM Studio are compatible with the OpenAI connector

Open TaoChenOSU opened this issue 1 year ago • 1 comments

Motivation and Context

Related to https://github.com/microsoft/semantic-kernel/issues/6498

The use of local models presents a twofold benefit for developers: increased flexibility and reduced costs. Ollama and LM Studio are two well-known platforms that facilitate the hosting of models locally, both of which offer compatibility with OpenAI endpoints. As such, it is imperative that our OpenAI connector functions correctly when users are operating models on these platforms.

Description

  1. Verify that our OpenAI connector works as expected with models hosted locally using Ollama and LM Studio.
  2. Create three new samples (Ollama/chat, LM Studio/chat, LM Studio/Embedding) under /concepts/local_models to show how to use local models with the OpenAI connector.
  3. Fix a bug in test_sample_utils.py where the input was never reset when a test case was retried.

Contribution Checklist

TaoChenOSU avatar Jun 26 '24 20:06 TaoChenOSU

Py3.10 Test Coverage

Python 3.10 Test Coverage Report •

| File | Stmts | Miss | Cover | Missing |
|------|-------|------|-------|---------|
| semantic_kernel/connectors/ai/open_ai/services/open_ai_chat_completion.py | 23 | 1 | 96% | 61 |
| TOTAL | 6814 | 767 | 89% | |

Python 3.10 Unit Test Overview

| Tests | Skipped | Failures | Errors | Time |
|-------|---------|----------|--------|------|
| 1598 | 1 :zzz: | 0 :x: | 0 :fire: | 26.351s :stopwatch: |

markwallace-microsoft avatar Jun 26 '24 21:06 markwallace-microsoft