How to set the system message for InternVL-Chat-V1-5?
How do I set the system message for InternVL-Chat-V1-5? I gave the model the system message "Your name is Sam" and then asked "What is your name?", but it answered "My name is AI". Can anyone help? I'm deploying the model with lmdeploy, but I could also use transformers instead.
```python
class Model:
    @modal.enter()
    def start_engine(self):
        from lmdeploy import serve, ChatTemplateConfig

        print(MODEL_NAME)
        # Launch an OpenAI-compatible API server for the model.
        self.server = serve(
            "OpenGVLab/InternVL-Chat-V1-5",
            chat_template_config=ChatTemplateConfig(model_name='internvl-internlm2'),
            server_name='0.0.0.0',
            server_port=23333,
        )

    @modal.method()
    async def generate(self, messages):
        from lmdeploy import client

        handle = client(api_server_url='http://0.0.0.0:23333')
        model_name = handle.available_models[0]
        print(model_name)
        outputs = handle.chat_completions_v1(model=model_name, messages=messages)
        print(outputs)
        # chat_completions_v1 yields a generator; return the first result.
        for out in outputs:
            return out
```
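For reference, the request that reproduces the behavior is sketched below. The message contents come from the description above; the format is the OpenAI-style chat schema that `chat_completions_v1` accepts:

```python
# Messages sent to generate(); the model answers "My name is AI",
# ignoring the system message.
messages = [
    {"role": "system", "content": "Your name is Sam."},
    {"role": "user", "content": "What is your name?"},
]
```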
Any ideas about this, @czczup?
Hello, I suspect that replacing the system message won't work well with this model, because it was trained with a single fixed system message rather than with varying ones.
@czczup So how can I prompt this model better? Any examples?
In most cases you don't need to change the system message; just adjust the user query, as you would for GPT-4V.
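One way to follow this advice programmatically is to fold any system message into the first user turn before sending the request, so existing code that builds OpenAI-style message lists keeps working. A minimal sketch (the helper name and merging format are assumptions, not part of lmdeploy; the example messages are the ones from this thread):

```python
def fold_system_into_user(messages):
    """Merge system messages into the first user message, since
    InternVL-Chat-V1-5 was trained with a fixed system prompt and
    tends to ignore custom ones."""
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    if system_parts and rest and rest[0]["role"] == "user":
        prefix = "\n".join(system_parts)
        rest[0] = {"role": "user",
                   "content": f"{prefix}\n\n{rest[0]['content']}"}
    return rest

messages = [
    {"role": "system", "content": "Your name is Sam."},
    {"role": "user", "content": "What is your name?"},
]
print(fold_system_into_user(messages))
```

The transformed list can then be passed to `chat_completions_v1` unchanged, since it contains only `user` (and `assistant`) turns.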