Require agents to reference an AI Service Provider configuration
Challenge
Currently it is assumed that only one AI service provider configuration will be set (hugging-face-configuration, openai-configuration, etc.). The ai-chat-completions and compute-ai-embeddings agents then use that single configuration to complete their tasks.
This creates ambiguity when building a pipeline: if I provide configurations for both Hugging Face and OpenAI, how would the agents know which one to use?
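For illustration, a configuration that declares both providers might look roughly like this (the resource names, keys, and exact layout here are placeholders, not the real schema):

```yaml
configuration:
  resources:
    # Two AI service provider resources declared side by side
    - type: "openai-configuration"
      name: "OpenAI"
      configuration:
        access-key: "${secrets.openai.access-key}"
    - type: "hugging-face-configuration"
      name: "HuggingFace"
      configuration:
        access-key: "${secrets.hugging-face.access-key}"
```

With both resources present, an ai-chat-completions or compute-ai-embeddings agent has nothing in its own configuration that says which provider it should call.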
Solution
Add an additional "mapping" field to the agents where I can specify the configuration to use. The query agent is a good precedent: it has a "datasource" field where I provide the resource name of the database I want to use. A sketch of what this could look like follows.
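A rough sketch of how this could read in a pipeline definition; the field name ai-service and the resource names are hypothetical, chosen only to mirror how the query agent's existing datasource field references a resource by name:

```yaml
pipeline:
  - name: "lookup"
    type: "query"
    configuration:
      # Existing precedent: the query agent names the database resource it uses
      datasource: "JdbcDatasource"
      query: "SELECT description FROM products WHERE id = ?"

  - name: "chat"
    type: "ai-chat-completions"
    configuration:
      # Hypothetical field: reference the AI service provider resource by name,
      # the same way the query agent does with "datasource"
      ai-service: "OpenAI"
      model: "gpt-3.5-turbo"

  - name: "embeddings"
    type: "compute-ai-embeddings"
    configuration:
      ai-service: "HuggingFace"   # hypothetical; points at the other declared resource
      model: "sentence-transformers/all-MiniLM-L6-v2"
```

With an explicit reference like this, declaring multiple provider resources is no longer ambiguous, and each agent states which configuration it depends on.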