
Phi 2 model has weird output when running on GPU

Open · SubatomicPlanets opened this issue 9 months ago · 0 comments

Describe the bug

When using the Phi 2 model with the standard settings of the LLM component, everything works fine. However, when I increase Num GPU Layers, I get weird output.

Num GPU Layers == 0: normal
Num GPU Layers == 1: normal
Num GPU Layers >= 2: weird

I have an RTX 3060 with CUDA installed properly. The Mistral-7b-instruct model works fine no matter what value I use (above 30 my game starts to lag, but I think that's normal since more layers just use more GPU).

I set the prompt of the LLM component to You can only say "hello, this is a test" and these are its responses:

normal:

hello, this is a test

weird:

hello, this- " S M D B P TThe
-------------------------------------------------------- " S M D B P TThe
-------------------------------------------------------- " S M D B P TThe
-------------------------------------------------------- " S M D B P TThe
------------------------------------------------

As you can see, it's very strange. Sometimes it also outputs symbols like ( and &, which makes it look even stranger.

Steps to reproduce

Open the chatbot sample and use the Phi 2 model. Adjust Num GPU Layers to see the difference.
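
Here is roughly what my setup looks like if the same settings are applied from code instead of the inspector. I'm guessing the field names numGPULayers and prompt from the "Num GPU Layers" and "Prompt" inspector labels, so treat this as a sketch rather than the exact 1.2.7 API:

```csharp
// Rough repro sketch: the LLMUnity namespace and LLM component come from the
// package; the field names `numGPULayers` and `prompt` are assumed from the
// inspector labels and may differ in LLMUnity 1.2.7.
using UnityEngine;
using LLMUnity;

public class PhiGpuLayerRepro : MonoBehaviour
{
    public LLM llm;  // assign the LLM component from the chatbot sample in the inspector

    void Awake()
    {
        // Same constrained prompt used for the outputs above (assumed field name).
        llm.prompt = "You can only say \"hello, this is a test\"";

        // 0 or 1 layers -> normal output; 2 or more -> garbled output with Phi 2 (assumed field name).
        llm.numGPULayers = 2;
    }
}
```

With this attached, sending any message through the sample's chat UI produces the garbled output shown above.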

LLMUnity version

1.2.7

Operating System

Windows

SubatomicPlanets · May 15 '24 15:05