Prompty ignores the OPENAI_API_KEY environment variable
I didn't see a similar issue, so apologies if this is a duplicate.
Issue
In a .prompty file, if `api_key` is added to the YAML `configuration` block for an OpenAI-compatible endpoint, the Prompty syntax highlighting flags it as a spec violation and marks it as an error.
Running a request against the OpenAI-compatible endpoint works fine despite the error. However, it took me 20 minutes of digging to realize that the environment variables in the .env file were fine; rather, I think Prompty isn't passing the environment variables through to the OpenAI API request library.
Running in a fresh dev container on Windows 11 Enterprise. The environment variables are definitely set.
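For anyone hitting the same thing, a quick way to confirm the variables really are visible inside the dev container (a minimal sketch; the variable names match the YAML below but are otherwise just examples):

```python
import os

def check_env(names):
    """Return a dict mapping each variable name to True if it is set and non-empty."""
    return {n: bool(os.environ.get(n)) for n in names}

status = check_env(["OPENAI_API_KEY", "OPENAI_BASE_URL"])
for name, ok in status.items():
    print(f"{name}: {'set' if ok else 'MISSING'}")
```

If both print `set`, the problem is on the Prompty side, not your .env file.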
Workaround
If you are receiving an error about OPENAI_API_KEY missing in Prompty, you can work around it temporarily by referencing the environment variables explicitly in the Prompty YAML, like so:
```yaml
configuration:
  name: qwen2.5-7b-instruct-1m
  type: openai
  base_url: ${env:OPENAI_BASE_URL}
  api_key: ${env:OPENAI_API_KEY}
```
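For context, the `${env:NAME}` syntax tells Prompty to substitute the value of an environment variable at run time. A rough sketch of that kind of substitution (a hypothetical helper for illustration, not Prompty's actual implementation) looks like:

```python
import os
import re

# Matches placeholders of the form ${env:VARIABLE_NAME}
ENV_PATTERN = re.compile(r"\$\{env:([A-Za-z_][A-Za-z0-9_]*)\}")

def resolve_env_refs(value: str) -> str:
    """Replace every ${env:NAME} placeholder with the value from os.environ."""
    def substitute(match):
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"environment variable {name} is not set")
        return os.environ[name]
    return ENV_PATTERN.sub(substitute, value)

os.environ["OPENAI_API_KEY"] = "sk-test"  # demo value only
print(resolve_env_refs("${env:OPENAI_API_KEY}"))  # → sk-test
```

The bug described above suggests this resolved value isn't reaching the OpenAI client unless `api_key` is spelled out in the YAML explicitly.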
The first screenshot shows a run without `api_key` set in the .prompty file: the file validates, but the prompt fails.
The second screenshot shows a run with `api_key` set in the .prompty file: the file fails validation, but the prompt and the subsequent LLM call succeed.
Hope this is helpful. This happens on both the release and prerelease versions of the extension.
Oof - this looks like an error in the VS Code validation schema. We are working on an updated schema as we speak. In the interim, even if there are squiggles, the configuration is actually correct. Stay tuned!