"Unable to find package Microsoft.AI.Foundry.Local.Core" when adding package Microsoft.AI.Foundry.Local

Open · lebakken opened this issue 1 month ago • 9 comments

When executing the following command to install the latest Microsoft.AI.Foundry.Local NuGet package:

dotnet add package Microsoft.AI.Foundry.Local --version 0.8.0.1

The following error occurs:

info : GET https://api.nuget.org/v3-flatcontainer/microsoft.ai.foundry.local.core/index.json
info : NotFound https://api.nuget.org/v3-flatcontainer/microsoft.ai.foundry.local.core/index.json 200ms
error: NU1101: Unable to find package Microsoft.AI.Foundry.Local.Core. No packages exist with this id in source(s): Microsoft Visual Studio Offline Packages, nuget.org

The Microsoft.AI.Foundry.Local.Core package is missing or not public.

lebakken · Nov 19 '25

@lebakken - thanks for raising the issue! The Core package is currently on the Azure Public feed:

https://aiinfra.visualstudio.com/PublicPackages/_artifacts/feed/ORT/NuGet/Microsoft.AI.Foundry.Local.Core/overview/0.8.1

So if you install that first from the feed, it should resolve your issue.
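
For example, something like this should work (the feed URL below follows the standard Azure DevOps NuGet v3 pattern for that feed, so please double-check it against the feed page before relying on it):

dotnet nuget add source "https://pkgs.dev.azure.com/aiinfra/PublicPackages/_packaging/ORT/nuget/v3/index.json" --name aiinfra-ort
dotnet add package Microsoft.AI.Foundry.Local.Core --version 0.8.1
dotnet add package Microsoft.AI.Foundry.Local --version 0.8.0.1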

We are working on getting this added to Nuget.org in the next day or two.

samuel100 · Nov 19 '25

I tried to install that version as well as the newest released version, 0.8.2.2, but now get this error:

Unable to find package Microsoft.ML.OnnxRuntime.Foundry.

chuckbeasley · Nov 20 '25

https://aiinfra.visualstudio.com/PublicPackages/_artifacts/feed/ORT/NuGet/Microsoft.ML.OnnxRuntime.Foundry/overview/1.23.2

skottmckay · Nov 20 '25

The problem is that, if I add this Azure DevOps package feed manually, restore throws an error like this:

Assets file 'C:\dev\<project-directory-name>\src\<project-name>\obj\project.assets.json' doesn't have a target for 'net9.0/win-x64'. Ensure that restore has run and that you have included 'net9.0' in the TargetFrameworks for your project. You may also need to include 'win-x64' in your project's RuntimeIdentifiers

If I change my build machine to macOS, it reports net9.0/osx-arm64 instead of net9.0/win-x64.

When do you think this Microsoft.ML.OnnxRuntime.Foundry package will be available on the main NuGet feed?

justinyoo · Nov 24 '25

The samples include a nuget.config file (https://github.com/microsoft/Foundry-Local/blob/main/samples/cs/GettingStarted/nuget.config) that pulls the packages from the right feed.
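
In case it helps, the gist of such a nuget.config is something like this (the feed URL follows the standard Azure DevOps v3 pattern, so compare it against the file linked above before relying on it):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- public packages come from nuget.org as usual -->
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
    <!-- ORT feed hosting Microsoft.AI.Foundry.Local.Core and Microsoft.ML.OnnxRuntime.Foundry;
         URL assumed from the standard Azure DevOps pattern -->
    <add key="ort" value="https://pkgs.dev.azure.com/aiinfra/PublicPackages/_packaging/ORT/nuget/v3/index.json" />
  </packageSources>
</configuration>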

You also need to run with the runtime identifier, e.g.:

cd samples/cs/GettingStarted/cross-platform/
dotnet run --project HelloFoundryLocalSdk/HelloFoundryLocalSdk.csproj -r:osx-arm64
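
If you'd rather not pass -r on every run, declaring the runtime identifiers in the project file should give restore the same targets (a sketch only, assuming a plain net9.0 console project; the RID list is illustrative):

<!-- HelloFoundryLocalSdk.csproj, relevant properties only -->
<PropertyGroup>
  <OutputType>Exe</OutputType>
  <TargetFramework>net9.0</TargetFramework>
  <!-- generate restore targets for each platform you build on -->
  <RuntimeIdentifiers>win-x64;osx-arm64</RuntimeIdentifiers>
</PropertyGroup>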

samuel100 · Nov 24 '25

Thanks for the info @samuel100! I managed to get the feed working. I now have four follow-up questions:

  1. The codebase for version 0.8.2.1 doesn't exist in the main branch of this repository. When will it be included here?
  2. Is specifying the runtime identifier temporary or permanent?
  3. The Foundry Local C# SDK uses Betalgo.Ranul.OpenAI instead of the official OpenAI SDK. Is that a deliberate choice, or will it be replaced with the official one?
  4. According to the sample code you mentioned above, the model.GetChatClientAsync() method returns OpenAIChatClient from Betalgo.Ranul.OpenAI, but it doesn't seem to be compatible with IChatClient from Microsoft.Extensions.AI. How can I convert an OpenAIChatClient to an IChatClient?

justinyoo · Nov 25 '25

Regarding Q4, should I follow this REST API approach for now? https://learn.microsoft.com/en-gb/azure/ai-foundry/foundry-local/how-to/how-to-integrate-with-inference-sdks?view=foundry-classic&pivots=programming-language-csharp
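
For context, my reading of that doc is roughly the following; names like FoundryLocalManager come from the doc itself, while the AsIChatClient() call from Microsoft.Extensions.AI.OpenAI is my own assumption for bridging back to IChatClient:

// Sketch based on the linked doc, not on the Foundry Local C# SDK's Betalgo-based client.
using Microsoft.AI.Foundry.Local;
using Microsoft.Extensions.AI;
using OpenAI;
using System.ClientModel;

var alias = "phi-3.5-mini";

// Start the local model and discover the local OpenAI-compatible endpoint.
var manager = await FoundryLocalManager.StartModelAsync(aliasOrModelId: alias);
var model = await manager.GetModelInfoAsync(aliasOrModelId: alias);

// Point the official OpenAI SDK at the local endpoint.
var client = new OpenAIClient(
    new ApiKeyCredential(manager.ApiKey),
    new OpenAIClientOptions { Endpoint = manager.Endpoint });

// AsIChatClient() (Microsoft.Extensions.AI.OpenAI) adapts OpenAI.Chat.ChatClient to IChatClient;
// this part is my assumption, not something the doc shows.
IChatClient chatClient = client.GetChatClient(model?.ModelId).AsIChatClient();

var response = await chatClient.GetResponseAsync("Why is the sky blue?");
Console.WriteLine(response.Text);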

justinyoo · Nov 25 '25

> 4. According to the sample code you mentioned above, the model.GetChatClientAsync() method returns OpenAIChatClient from Betalgo.Ranul.OpenAI, but it doesn't seem to be compatible with IChatClient from Microsoft.Extensions.AI. How can I convert an OpenAIChatClient to an IChatClient?

I ran into the same issue and wrote an adapter class. Let me know if you want to see the code.

chuckbeasley · Nov 25 '25

> I ran into the same issue and wrote an adapter class. Let me know if you want to see the code.

Oh, yeah. I also had to implement the adapter pattern to go from OpenAIChatClient to IChatClient, which I don't think we should have to do ourselves.
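
For reference, a minimal sketch of the adapter shape I mean (assuming the current Microsoft.Extensions.AI IChatClient surface, and hiding the SDK-specific completion call behind a delegate because the OpenAIChatClient API is the part that varies):

using System.Runtime.CompilerServices;
using Microsoft.Extensions.AI;

// Adapter pattern sketch: wrap whatever completion call the underlying client exposes
// (abstracted here as a delegate) behind Microsoft.Extensions.AI.IChatClient.
public sealed class ChatClientAdapter(
    Func<IEnumerable<ChatMessage>, CancellationToken, Task<string>> completeAsync) : IChatClient
{
    public async Task<ChatResponse> GetResponseAsync(
        IEnumerable<ChatMessage> messages,
        ChatOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        // Forward the conversation to the wrapped client and repackage the reply.
        var text = await completeAsync(messages, cancellationToken);
        return new ChatResponse(new ChatMessage(ChatRole.Assistant, text));
    }

    public async IAsyncEnumerable<ChatResponseUpdate> GetStreamingResponseAsync(
        IEnumerable<ChatMessage> messages,
        ChatOptions? options = null,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        // Non-streaming fallback: yield the whole completion as a single update.
        var text = await completeAsync(messages, cancellationToken);
        yield return new ChatResponseUpdate(ChatRole.Assistant, text);
    }

    public object? GetService(Type serviceType, object? serviceKey = null) =>
        serviceType.IsInstanceOfType(this) ? this : null;

    public void Dispose() { }
}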

Since my use case is Aspire, and the Aspire.Hosting.Azure.AIFoundry NuGet package currently supports v0.3.0, I might have to stick with that version until this gets resolved.

justinyoo · Nov 25 '25