edgedb-js
EdgeDB AI provider
I think we need a call to discuss what we want to support with the provider and the extension (or maybe I'm just not aligned with the goals, and the provider should simply be built to support the extension's current capabilities).
Examples of how this can be used with the Vercel SDK: the completion route and the chat route.
This PR adds basic support for completion and chat, matching what the edgedb-js AI lib already supports. I built it against the AI extension's current capabilities, so it is usable as-is, but it definitely needs improvements before its interfaces are fully compatible with Vercel's LanguageModelV1 provider.
TODO: add a README and update some identifiers and interface names throughout the provider, since we are not really a language model.
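To make the compatibility gap concrete, here is a minimal sketch of a LanguageModelV1-shaped wrapper around the extension. The interface is a trimmed-down local stand-in, and the endpoint path, request body, and `response` field are assumptions for illustration, not the actual edgedb-js or Vercel SDK APIs:

```typescript
// Trimmed-down stand-in for Vercel's LanguageModelV1 interface (assumption:
// only the fields relevant to this sketch are included).
interface MinimalLanguageModel {
  readonly specificationVersion: "v1";
  readonly provider: string;
  readonly modelId: string;
  doGenerate(options: { prompt: string }): Promise<{
    text: string;
    usage: { promptTokens: number; completionTokens: number };
  }>;
}

type FetchLike = (
  url: string,
  init: { method: string; body: string },
) => Promise<{ json(): Promise<any> }>;

class EdgeDBChatModel implements MinimalLanguageModel {
  readonly specificationVersion = "v1" as const;
  readonly provider = "edgedb";

  constructor(
    readonly modelId: string,
    private baseUrl: string,
    // Injected so the sketch can be exercised without a running instance.
    private fetchImpl: FetchLike,
  ) {}

  async doGenerate(options: { prompt: string }) {
    // "/ai/rag" is a placeholder path for the extension's HTTP endpoint.
    const res = await this.fetchImpl(`${this.baseUrl}/ai/rag`, {
      method: "POST",
      body: JSON.stringify({ model: this.modelId, query: options.prompt }),
    });
    const data = await res.json();
    return {
      text: data.response ?? "",
      // The extension does not report token counts, so zeros are returned
      // (see the usage question below).
      usage: { promptTokens: 0, completionTokens: 0 },
    };
  }
}
```

Injecting the fetch implementation is just a convenience for testing the sketch offline; a real provider would use the client's HTTP layer.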
QUESTIONS:

- We don't support any of the LLM settings (maxTokens, temperature, topP, topK, frequencyPenalty, presencePenalty, stopSequences, responseFormat, seed, etc.). Do we want to add support for these, or should the provider return an error if the user tries to set any of them?
- Do we want to support a setting for safe_prompt?
- Do we support structured outputs?
- Do we have support for tools?
- The output our RAG returns does not include:

  ```
  usage: {
    promptTokens: ...,
    completionTokens: ...,
  },
  ```

  or some of the other fields that the Vercel LanguageModelV1 provider returns, so for now the provider returns 0 for these two values.
- Do we want to support different settings for different LLMs, or do we steer toward supporting only the basic features and settings that all LLMs support? Some of them even support images and audio.
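For the first question, one alternative to erroring is to collect warnings the way Vercel call results surface "unsupported-setting" warnings. A sketch of that option, where the warning shape is a simplified local type rather than the real SDK type:

```typescript
// Simplified local warning type (assumption: mirrors the spirit of Vercel's
// unsupported-setting call warnings, not their exact shape).
interface UnsupportedSettingWarning {
  type: "unsupported-setting";
  setting: string;
}

// The settings from the question above that the extension does not support.
const UNSUPPORTED_SETTINGS = [
  "maxTokens", "temperature", "topP", "topK",
  "frequencyPenalty", "presencePenalty", "stopSequences",
  "responseFormat", "seed",
] as const;

// Returns one warning per unsupported setting the caller actually set,
// instead of throwing on the first one.
function collectWarnings(
  options: Record<string, unknown>,
): UnsupportedSettingWarning[] {
  const warnings: UnsupportedSettingWarning[] = [];
  for (const setting of UNSUPPORTED_SETTINGS) {
    if (options[setting] !== undefined) {
      warnings.push({ type: "unsupported-setting", setting });
    }
  }
  return warnings;
}
```

This keeps the provider permissive (requests still go through) while making it explicit to the caller which knobs were ignored.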