
llmInferenceSession leaks context between different sessions

Open · Gemeto opened this issue 5 months ago · 0 comments

Have I written custom code (as opposed to using a stock example script provided in MediaPipe)

Yes

OS Platform and Distribution

Android

Mobile device if the issue happens on mobile device

Poco F6

Browser and version if the issue happens on browser

No response

Programming Language and version

Kotlin

MediaPipe version

0.10.24

Bazel version

No response

Solution

ChatBot

Android Studio, NDK, SDK versions (if issue is related to building in Android environment)

No response

Xcode & Tulsi version (if issue is related to building for iOS)

No response

Describe the actual behavior

In my custom app, when two different sessions are created, let's call them session1 and session2, if I send a message to session1 saying "I'm Peter" and then, after closing session1, I send a message to session2 asking "Who am I?", the response is something like "You are Peter".

Describe the expected behaviour

When two different sessions are created, let's call them session1 and session2, if I send a message to session1 saying "I'm Peter" and then, after closing session1, send a message to session2 asking "Who am I?", session2 must not remember that I told session1 I'm Peter.

Standalone code/steps you may have used to try to get what you need

Code involved with this issue is in the class ChatBotActivity in this repo: https://github.com/Gemeto/Food-Alert-App

Other info / Complete Logs

Minimal code for testing; I'm actually using Gemma 3n:


// Create a single inference engine shared by both sessions.
LlmInference llmInference =
    LlmInference.createFromOptions(ApplicationProvider.getApplicationContext(), options);

LlmInferenceSession.LlmInferenceSessionOptions sessionOptions1 =
    LlmInferenceSession.LlmInferenceSessionOptions.builder()
        .setTopK(10)
        .setTemperature(0.4f)
        .setGraphOptions(GraphOptions.builder().setEnableVisionModality(false).build())
        .build();

LlmInferenceSession session1 =
    LlmInferenceSession.createFromOptions(llmInference, sessionOptions1);

LlmInferenceSession.LlmInferenceSessionOptions sessionOptions2 =
    LlmInferenceSession.LlmInferenceSessionOptions.builder()
        .setTopK(10)
        .setTemperature(1.0f)
        .setGraphOptions(GraphOptions.builder().setEnableVisionModality(true).build())
        .build();

LlmInferenceSession session2 =
    LlmInferenceSession.createFromOptions(llmInference, sessionOptions2);

// Tell session1 a fact, then close that session.
session1.addQueryChunk("Hi, my name is Peter, remember that");
session1.generateResponse();
session1.close();

// session2 should not know the fact, yet its response mentions "Peter".
session2.addQueryChunk("What is my name?");
session2.generateResponse();
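
As a possible workaround until this is resolved, tearing down the whole engine between conversations may avoid the carried-over context. This is only a sketch under the assumption that the leaked state lives in engine-level inference state shared by sessions (not confirmed by the MediaPipe docs); it trades session reuse for isolation, since reloading the model per conversation is noticeably slower:

```java
// Hypothetical workaround (untested): create one LlmInference engine per
// conversation and close it together with the session, so no inference
// state can survive into the next conversation. Assumes the leak lives in
// engine-level state shared by sessions created from the same engine.
LlmInference engine1 =
    LlmInference.createFromOptions(ApplicationProvider.getApplicationContext(), options);
LlmInferenceSession session1 =
    LlmInferenceSession.createFromOptions(engine1, sessionOptions1);
session1.addQueryChunk("Hi, my name is Peter, remember that");
session1.generateResponse();
session1.close();
engine1.close(); // tear down the engine, not just the session

// Fresh engine for the second conversation.
LlmInference engine2 =
    LlmInference.createFromOptions(ApplicationProvider.getApplicationContext(), options);
LlmInferenceSession session2 =
    LlmInferenceSession.createFromOptions(engine2, sessionOptions2);
session2.addQueryChunk("What is my name?"); // should no longer answer "Peter"
session2.generateResponse();
session2.close();
engine2.close();
```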

Gemeto · Jun 06 '25 20:06