
Feature request: setting seed parameter of OpenAI's chat completions API

Open · haukelicht opened this issue 1 year ago · 1 comment

Thank you for creating and maintaining this awesome project!

OpenAI recently introduced the `seed` parameter to make their models' text generation and chat completion behavior (more) reproducible (see https://cookbook.openai.com/examples/reproducible_outputs_with_the_seed_parameter).

I think it would be great if you could let users of your package control this parameter when using OpenAI models as a backend (i.e., in the files here: https://github.com/iryna-kondr/scikit-llm/tree/main/skllm/models/gpt).

The `seed` parameter could be hard-coded here: https://github.com/iryna-kondr/scikit-llm/blob/0bdea940fd369cdd5c5a0e625d3eea8f2b512208/skllm/llm/gpt/clients/openai/completion.py#L50, similar to how `temperature=0.0` is already set.

Alternatively, users could pass `seed=<SEED>` via `**kwargs`.
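To illustrate the `**kwargs` variant, here is a minimal sketch of how a user-supplied seed could be merged into the parameters forwarded to `client.chat.completions.create`. The function name `build_completion_kwargs` and the defaults dict are hypothetical, not scikit-llm's actual internals:

```python
# Hypothetical helper: merge user-supplied options (e.g. seed=42) with the
# library's defaults before calling the OpenAI chat completions endpoint.
DEFAULT_PARAMS = {"temperature": 0.0}

def build_completion_kwargs(model, messages, **kwargs):
    """Return the kwargs for client.chat.completions.create.

    Any user-supplied option (such as seed) overrides nothing here,
    it is simply added alongside the defaults.
    """
    params = {**DEFAULT_PARAMS, **kwargs}
    return {"model": model, "messages": messages, **params}

# A user could then request reproducible outputs like:
#   client.chat.completions.create(
#       **build_completion_kwargs("gpt-3.5-turbo", messages, seed=42)
#   )
```

The actual API call is left commented out since it requires credentials; the point is only that `seed` is a top-level parameter of the chat completions endpoint and can be threaded through like any other keyword argument.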

haukelicht avatar Feb 14 '24 13:02 haukelicht

Hello @haukelicht,

Thank you for your suggestion. Do you know whether the `seed` parameter plays any role if the temperature is set to 0? I have to admit that I did not dig deeper into this topic, but I always had the impression that the seed is only relevant when the temperature is positive; otherwise the model is already (almost) deterministic.

But again, I might be completely wrong here.

OKUA1 avatar Feb 14 '24 15:02 OKUA1