
Enable batch prompts

Open polarblau opened this issue 1 year ago • 0 comments

The OpenAI API accepts either a single string or an array of strings as the prompt, which allows multiple completions to be batched into a single request:

Here’s the canonical example from the docs using Python:

```python
import openai  # for making OpenAI API requests

num_stories = 10
prompts = ["Once upon a time,"] * num_stories

# batched example, with 10 story completions per request
response = openai.Completion.create(
    model="curie",
    prompt=prompts,
    max_tokens=20,
)

# match completions to prompts by index
stories = [""] * len(prompts)
for choice in response.choices:
    stories[choice.index] = prompts[choice.index] + choice.text

# print stories
for story in stories:
    print(story)
```

The inline documentation in the source suggests that this functionality has also been considered for this library …

```swift
struct CompletionsQuery: Codable {
    // ...
    /// The prompt(s) to generate completions for, encoded as a string, array of strings, array of tokens, or array of token arrays.
    public let prompt: String
    // ...
}
```

… but the implementation itself only accepts a single `String`, not an array of prompts.
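One possible way to support both forms without breaking JSON compatibility would be an enum that encodes as either a bare string or a string array, mirroring what the API accepts. This is just a sketch of the idea, not the library's actual API; the `Prompt` type name and its cases are my own invention:

```swift
import Foundation

/// Hypothetical type: encodes as a plain JSON string for a single prompt,
/// or as a JSON array of strings for a batched request.
enum Prompt: Codable {
    case string(String)
    case strings([String])

    func encode(to encoder: Encoder) throws {
        var container = encoder.singleValueContainer()
        switch self {
        case .string(let value):
            try container.encode(value)
        case .strings(let values):
            try container.encode(values)
        }
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.singleValueContainer()
        // Try the single-string form first, fall back to the array form.
        if let value = try? container.decode(String.self) {
            self = .string(value)
        } else {
            self = .strings(try container.decode([String].self))
        }
    }
}
```

`CompletionsQuery.prompt` could then be declared as this type (or the library could simply overload the initializer), so existing single-string callers keep working while batched callers pass an array.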

In my experimentation, batching meaningfully reduces total load time, and I'd love to see support for it in this library as well.

Thanks for all the amazing work!

Disclaimer: I'm 100% new to Swift.

polarblau · Jun 06 '23 20:06