
Python : #6499 Initial Commit for Mistral Connector

Open nmoeller opened this issue 1 year ago • 3 comments

Motivation and Context

  1. Why is this change required? To enable Mistral models with Semantic Kernel; issue #6499 in the backlog asked for a MistralAI connector.
  2. What problem does it solve? It solves the problem that Semantic Kernel is not yet integrated with MistralAI.
  3. What scenario does it contribute to? Using a connector other than HF, OpenAI, or AzureOpenAI. When users want to use Mistral, they can now integrate it easily.
  4. If it fixes an open issue, please link to the issue here. #6499

Description

The changes are modeled on the open_ai connector; I tried to stay as close as possible to its structure. For the integration I added the Mistral Python package to the repository.

I added the following Classes :

  • MistralAIChatPromptExecutionSettings --> Responsible for managing prompt execution against MistralAI
  • MistralAIChatCompletion --> Coordinates the classes and handles content parsing
  • MistralAISettings --> Basic settings for working with the Mistral client
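As an illustration only, the three classes above might fit together roughly as follows. This is a hypothetical sketch with stub implementations; the field names, defaults, and method signature are assumptions, not the connector's real API.

```python
from dataclasses import dataclass


@dataclass
class MistralAISettings:
    """Basic settings for the Mistral client (fields are assumptions)."""
    api_key: str
    ai_model_id: str = "mistral-small-latest"  # hypothetical default


@dataclass
class MistralAIChatPromptExecutionSettings:
    """Per-request execution settings (fields are assumptions)."""
    temperature: float = 0.7
    max_tokens: int = 256


class MistralAIChatCompletion:
    """Coordinates settings and content parsing (stubbed here)."""

    def __init__(self, settings: MistralAISettings):
        self.settings = settings

    def get_chat_message_content(
        self, prompt: str, exec_settings: MistralAIChatPromptExecutionSettings
    ) -> str:
        # A real implementation would call the Mistral client here
        # and parse the response into Semantic Kernel content types.
        return f"[{self.settings.ai_model_id}] echo: {prompt}"
```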

To run the tests, please add MISTRALAI_API_KEY as an environment variable.
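For example, a test setup might resolve the key like this. The helper below is hypothetical (only the variable name MISTRALAI_API_KEY comes from the PR description); it just shows the read-from-environment pattern.

```python
import os


def resolve_api_key(env_var: str = "MISTRALAI_API_KEY") -> str:
    """Hypothetical helper: read the API key from the environment."""
    key = os.environ.get(env_var)
    if not key:
        # Integration tests would typically be skipped in this case.
        raise RuntimeError(f"{env_var} is not set")
    return key
```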

As a design decision, I moved the processing of functions from the connectors into the ChatCompletionClientBase class.
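A minimal sketch of that design decision (not the actual Semantic Kernel code): shared function-call handling lives in the base class, while each connector implements only the provider-specific request. All method and field names here are illustrative assumptions.

```python
from abc import ABC, abstractmethod


class ChatCompletionClientBase(ABC):
    """Base class owning the shared function-calling flow (sketch)."""

    def complete(self, messages: list[dict]) -> dict:
        response = self._send_request(messages)
        # Shared post-processing: every connector gets the same
        # function-call handling for free instead of duplicating it.
        if response.get("tool_calls"):
            response = self._process_function_calls(response)
        return response

    @abstractmethod
    def _send_request(self, messages: list[dict]) -> dict:
        """Provider-specific request, implemented per connector."""

    def _process_function_calls(self, response: dict) -> dict:
        # Placeholder for the shared invocation/parsing logic.
        response["tool_calls_processed"] = True
        return response


class MistralAIChatCompletion(ChatCompletionClientBase):
    def _send_request(self, messages: list[dict]) -> dict:
        # A real connector would call the Mistral client here.
        return {"content": "stub reply", "tool_calls": [{"name": "demo"}]}
```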

What is integrated so far:

  • [x] Integrate Mistral AI chat models without streaming
  • [x] Integrate Mistral AI chat models with streaming
  • [x] Simple integration test covering streaming and non-streaming
  • [x] Integrate MistralAI models with embeddings
  • [x] Integrate MistralAI function calling
  • [x] Extended testing including unit testing & more integration tests

Contribution Checklist

nmoeller avatar Jun 24 '24 13:06 nmoeller

Py3.10 Test Coverage

Python 3.10 Test Coverage Report

| File | Stmts | Miss | Cover | Missing |
|------|------:|-----:|------:|---------|
| semantic_kernel/connectors/ai/open_ai/services/open_ai_chat_completion_base.py | 168 | 70 | 58% | 100, 104, 124, 149–153, 178, 186, 188, 192, 210–215, 233–261, 264–275, 290–297, 308–316, 332–339, 360, 368, 374–377, 389–392, 423 |
| semantic_kernel/functions/kernel_function_from_prompt.py | 154 | 7 | 95% | 165–166, 180, 200, 218, 238, 321 |
| **TOTAL** | 6686 | 781 | 88% | |

Python 3.10 Unit Test Overview

| Tests | Skipped | Failures | Errors | Time |
|------:|--------:|---------:|-------:|-----:|
| 1567 | 1 :zzz: | 0 :x: | 0 :fire: | 18.502s :stopwatch: |

markwallace-microsoft avatar Jun 24 '24 13:06 markwallace-microsoft

@microsoft-github-policy-service agree

nmoeller avatar Jun 24 '24 14:06 nmoeller

Can you please comment on the following work item so that I can assign it to you? Thank you.

moonbox3 avatar Jun 24 '24 18:06 moonbox3

Looking at #7035, it seems @TaoChenOSU is also working on an implementation of function calling inside the connector. If we both try to implement shared function calling code, it will probably result in merge conflicts.

I think I will close this and submit a new, lightweight, easy-to-understand PR for the Mistral connector. Then we can open a new issue if we want to unify the function calling flow across all connectors.

nmoeller avatar Jul 02 '24 12:07 nmoeller