Foundry Local models from cache
I have Foundry Local running, and through the command line you can see that the GPT-OSS model is in my cache and currently running.
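For reference, roughly the commands used to verify this (subcommand names per the Foundry Local CLI; confirm with `foundry --help` on your version):

```shell
# List the models already downloaded to the local cache
# (GPT-OSS appears here; the exact model alias may differ):
foundry cache list

# Confirm the service is running and see which endpoint/port it listens on:
foundry service status
```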
When I use GitHub Copilot and select models from Foundry Local (via the AI Toolkit extension), I pick the same GPT-OSS model, but it wants to start downloading it again, even though it is already on my machine, as the cached-model lists show.

The Azure AI Foundry extension behaves the same way: when I select a model from my Foundry Local installation, it keeps asking to download the model again. It seems those extensions either look at a different cache directory, or do not call the Foundry Local API to find out which models are already cached.

Something else I noticed: in VS Code I have to select the specific model variant to download, instead of relying on Foundry Local's hardware detection to pick the right variant for my machine.
As you can see in the Postman call to Foundry Local, the models are available in the cache.
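The equivalent of that Postman call as a hedged curl sketch: Foundry Local exposes an OpenAI-compatible API, and its port is assigned dynamically, so the port below is only an example (use whatever `foundry service status` reports):

```shell
# Ask the running Foundry Local service which models it knows about.
# /v1/models is the standard OpenAI-compatible model-listing route;
# replace 5273 with your actual service port.
curl http://localhost:5273/v1/models
```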
Is this a bug in Foundry Local, or in the extensions not using the same cache folder?
Hi @pts1989, AI Toolkit (AITK) in VS Code and Foundry Local use different default model cache folders. Both allow you to change the default cache folder if you want to share models between them.
For Foundry Local, you can use the `foundry cache` command to check the cache location and change it.
For AITK, you can go to the extension settings and change the cache location there; a sketch of both steps follows.
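A minimal sketch of pointing both tools at one folder, assuming the `foundry cache` subcommands behave as documented (confirm with `foundry cache --help` on your version); the AITK side is configured through the extension's settings UI, and the path below is only an example placeholder:

```shell
# Show where Foundry Local currently keeps its cached models:
foundry cache location

# Point Foundry Local at the folder AITK uses (example path; set the same
# folder in the AI Toolkit extension settings so both tools share downloads):
foundry cache cd "C:\Users\<you>\.aitk\models"
```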