
[FR] Ability to set system prompts for prompt experimentation

Open dgormly opened this issue 1 year ago • 7 comments

Willingness to contribute

No. I cannot contribute this feature at this time.

Proposal Summary

During prompt experimentation you often need to set a system prompt for OpenAI, Azure, and open-source models such as Llama 2. The system prompt controls the model's tone and choice of language, and it can carry your application's guardrails.

It is sent before your question or prompt template.

Can this feature be added to the UI? It is fundamental to prompt engineering.
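To illustrate the request, here is a minimal sketch (no API call) of how a system prompt precedes the user prompt in an OpenAI-style chat request; the function name and prompt text are illustrative, not part of any MLflow API:

```python
def build_messages(system_prompt: str, user_prompt: str) -> list:
    """Place the system prompt before the user's question, as chat APIs expect.

    The system message sets tone, language, and guardrails; the user
    message carries the actual question or filled-in prompt template.
    """
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages(
    "You are a support bot. Answer only in formal English.",
    "How do I reset my password?",
)
```

This is the payload shape the Prompt Engineering UI currently cannot produce, since there is no field for the system message.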

Motivation

What is the use case for this feature?

Chat bots and products use custom system prompts. A system prompt changes how models such as Llama 2 and OpenAI's models respond and affects their choice of language.

Why is this use case valuable to support for MLflow users in general?

Without system prompts, prompt engineering offers very limited control over how the model responds.

Why is this use case valuable to support for your project(s) or organization?

Our organization expects a particular system behaviour and language style from our prompts, defined outside of the prompt template / QA.

Why is it currently difficult to achieve this use case?

Neither the user interface nor the openai pyfunc flavor supports this, which makes the experimentation and evaluation components unusable for us.

Details

Please see the links below for examples of how system prompts affect experimentation:

  • https://platform.openai.com/playground?mode=chat
  • https://developer.ibm.com/tutorials/awb-prompt-engineering-llama-2/

What component(s) does this bug affect?

  • [X] area/artifacts: Artifact stores and artifact logging
  • [ ] area/build: Build and test infrastructure for MLflow
  • [X] area/deployments: MLflow Deployments client APIs, server, and third-party Deployments integrations
  • [ ] area/docs: MLflow documentation pages
  • [ ] area/examples: Example code
  • [X] area/model-registry: Model Registry service, APIs, and the fluent client calls for Model Registry
  • [X] area/models: MLmodel format, model serialization/deserialization, flavors
  • [ ] area/recipes: Recipes, Recipe APIs, Recipe configs, Recipe Templates
  • [ ] area/projects: MLproject format, project running backends
  • [ ] area/scoring: MLflow Model server, model deployment tools, Spark UDFs
  • [x] area/server-infra: MLflow Tracking server backend
  • [X] area/tracking: Tracking Service, tracking client APIs, autologging

What interface(s) does this bug affect?

  • [X] area/uiux: Front-end, user experience, plotting, JavaScript, JavaScript dev server
  • [ ] area/docker: Docker use across MLflow's components, such as MLflow Projects and MLflow Models
  • [ ] area/sqlalchemy: Use of SQLAlchemy in the Tracking Service or Model Registry
  • [ ] area/windows: Windows support

What language(s) does this bug affect?

  • [ ] language/r: R APIs and clients
  • [ ] language/java: Java APIs and clients
  • [ ] language/new: Proposals for new client languages

What integration(s) does this bug affect?

  • [X] integrations/azure: Azure and Azure ML integrations
  • [X] integrations/sagemaker: SageMaker integrations
  • [X] integrations/databricks: Databricks integrations

dgormly avatar Feb 12 '24 04:02 dgormly

This issue is also preventing us from using the Prompt Engineering UI with our workflow and model management. It would be great if this feature could be added with urgency.

trungngv avatar Feb 12 '24 05:02 trungngv

Custom system prompts are super important for tweaking how things work, and without the ability to set them, many of us are gonna miss out on using this awesome feature.

avaassadi avatar Feb 12 '24 06:02 avaassadi

I can imagine this feature would be of significant value to many people 👍

Hazious avatar Feb 12 '24 07:02 Hazious

cc @prithvikannan - thoughts on this?

BenWilson2 avatar Feb 14 '24 01:02 BenWilson2

@BenWilson2 totally agree we should add this! We would need to change a few places:

  • UI: add a field for this, pass system prompt when making query
  • backend: store system prompt in the run
  • promptlab model flavor: pass system prompt when constructing query
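For the third bullet, a rough sketch of how query construction might thread an optional system prompt through; the function, parameter, and payload field names here are assumptions for illustration, not MLflow's actual promptlab internals:

```python
from typing import Optional

def construct_query(
    prompt_template: str,
    inputs: dict,
    system_prompt: Optional[str] = None,
) -> dict:
    """Build a chat-style payload, prepending the system prompt when one is set.

    Hypothetical sketch: the system message (if any) goes first, followed by
    the user message produced by filling the prompt template with the inputs.
    """
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": prompt_template.format(**inputs)})
    return {"messages": messages}
```

Stored alongside the run as proposed in the second bullet, the same system prompt could then be replayed during evaluation.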

prithvikannan avatar Feb 14 '24 01:02 prithvikannan

@mlflow/mlflow-team Please assign a maintainer and start triaging this issue.

github-actions[bot] avatar Feb 20 '24 00:02 github-actions[bot]

Thank you everyone! This is very much appreciated

dgormly avatar Feb 20 '24 00:02 dgormly