mediapipe
LLM example --> Phi-2 model implementation: _uiState function error
Hi, thanks for providing these great examples. They are amazing indeed.
While deploying the test app for LLM chat, I tried two models: Gemma and Phi-2. Gemma worked perfectly, but for Phi-2 I changed the code following the guidance in this comment:
// `GemmaUiState()` is optimized for the Gemma model.
// Replace `GemmaUiState` with `ChatUiState()` if you're using a different model
private val _uiState: MutableStateFlow<ChatUiState> = MutableStateFlow(ChatUiState())
After changing the code as above, an error occurs in the part below:
try {
    val fullPrompt = _uiState.value.fullPrompt
    inferenceModel.generateResponseAsync(fullPrompt)
    inferenceModel.partialResults
        .collectIndexed { index, (partialResult, done) ->
            currentMessageId?.let {
                if (index == 0) {
                    _uiState.value.appendFirstMessage(it, partialResult) // <-- this is the error part
                } else {
                    _uiState.value.appendMessage(it, partialResult, done)
                }
                if (done) {
                    currentMessageId = null
                    // Re-enable text input
                    setInputEnabled(true)
                }
            }
        }
} catch (e: Exception) {
    _uiState.value.addMessage(e.localizedMessage ?: "Unknown Error", MODEL_PREFIX)
    setInputEnabled(true)
}
The error message is:
Unresolved reference: appendFirstMessage
Could you give me some feedback on this issue?
Thanks!
Hi,
We had a new release of MP and pushed a fix to the sample app. I actually think you would be OK using GemmaUiState with other models (so the naming is not entirely accurate). However, for Phi-2, please see https://github.com/google-ai-edge/mediapipe-samples/issues/380 to make sure the Phi-2 task model is set up correctly. That is more likely to be the cause of any underlying issues.
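For readers hitting the same "Unresolved reference" error: it simply means the concrete state class you swapped in does not declare appendFirstMessage. One way to make the two state classes interchangeable is to put the streaming-append methods behind a common interface. The sketch below is a hypothetical illustration, not the sample's actual code; only appendFirstMessage, appendMessage, and addMessage appear in the thread, and everything else (the interface name, the StringBuilder-backed storage) is an assumption.

```kotlin
// Hypothetical sketch: a shared interface lets the ViewModel call the same
// methods whether the UI state is Gemma-specific or generic.
interface UiState {
    fun appendFirstMessage(id: String, text: String)
    fun appendMessage(id: String, text: String, done: Boolean)
    fun addMessage(text: String, author: String): String
}

// Assumed minimal implementation for a non-Gemma model.
class ChatUiState : UiState {
    private val messages = mutableMapOf<String, StringBuilder>()

    override fun appendFirstMessage(id: String, text: String) {
        // First streamed chunk: start (or reset) this message's buffer.
        messages[id] = StringBuilder(text)
    }

    override fun appendMessage(id: String, text: String, done: Boolean) {
        // Subsequent chunks: append to the existing buffer, if any.
        messages[id]?.append(text)
    }

    override fun addMessage(text: String, author: String): String {
        // Create a new message and return its id.
        val id = messages.size.toString()
        messages[id] = StringBuilder(text)
        return id
    }
}
```

With both GemmaUiState and ChatUiState implementing such an interface, the _uiState declaration can be typed as MutableStateFlow<UiState> and the collectIndexed block compiles unchanged for either model.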