"Unable to find package Microsoft.AI.Foundry.Local.Core" when adding package Microsoft.AI.Foundry.Local
When executing the following command to install the latest Microsoft.AI.Foundry.Local NuGet package:
dotnet add package Microsoft.AI.Foundry.Local --version 0.8.0.1
The following error occurs:
info : GET https://api.nuget.org/v3-flatcontainer/microsoft.ai.foundry.local.core/index.json
info : NotFound https://api.nuget.org/v3-flatcontainer/microsoft.ai.foundry.local.core/index.json 200ms
error: NU1101: Unable to find package Microsoft.AI.Foundry.Local.Core. No packages exist with this id in source(s): Microsoft Visual Studio Offline Packages, nuget.org
The Microsoft.AI.Foundry.Local.Core package is missing or not public.
@lebakken - thanks for raising the issue! The Core package is currently on the Azure Public feed:
https://aiinfra.visualstudio.com/PublicPackages/_artifacts/feed/ORT/NuGet/Microsoft.AI.Foundry.Local.Core/overview/0.8.1
So if you install that first from the feed, it should resolve your issue.
We are working on getting this added to NuGet.org in the next day or two.
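For example, something along these lines should work (the feed's NuGet v3 endpoint is shown on the feed's "Connect to feed" page; the URL below is just a placeholder):

```sh
# Register the Azure DevOps feed as an extra NuGet source
# (replace the placeholder with the real v3 endpoint from the feed page).
dotnet nuget add source "https://example.invalid/PublicPackages/ORT/nuget/v3/index.json" --name ort-feed

# Install the Core package from that feed first, then the main package.
dotnet add package Microsoft.AI.Foundry.Local.Core --version 0.8.1
dotnet add package Microsoft.AI.Foundry.Local --version 0.8.0.1
```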
I tried to install that version as well as the newest release, 0.8.2.2, but now I get this error:
Unable to find package Microsoft.ML.OnnxRuntime.Foundry.
https://aiinfra.visualstudio.com/PublicPackages/_artifacts/feed/ORT/NuGet/Microsoft.ML.OnnxRuntime.Foundry/overview/1.23.2
The problem is that, if I add this Azure DevOps package feed manually, it throws an error like:
Assets file 'C:\dev\<project-directory-name>\src\<project-name>\obj\project.assets.json' doesn't have a target for 'net9.0/win-x64'. Ensure that restore has run and that you have included 'net9.0' in the TargetFrameworks for your project. You may also need to include 'win-x64' in your project's RuntimeIdentifiers
If I switch my build machine to macOS, it throws net9.0/osx-arm64 instead of net9.0/win-x64.
When do you think this Microsoft.ML.OnnxRuntime.Foundry package will be available on the main NuGet feed?
The samples include a nuget.config file (https://github.com/microsoft/Foundry-Local/blob/main/samples/cs/GettingStarted/nuget.config) that will get the packages from the right feed.
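For reference, that nuget.config just registers the Azure DevOps feed alongside nuget.org, roughly like this (the exact feed URL is in the linked file; the value below is a placeholder):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Placeholder: copy the real feed URL from the linked nuget.config -->
    <add key="ort-feed" value="https://example.invalid/PublicPackages/ORT/nuget/v3/index.json" />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>
```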
You need to run with the runtime identifier, i.e.
cd samples/cs/GettingStarted/cross-platform/
dotnet run --project HelloFoundryLocalSdk/HelloFoundryLocalSdk.csproj -r:osx-arm64
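Alternatively, you can pin the runtime identifiers in the project file so restore produces those targets without passing -r every time; a sketch for a plain net9.0 project:

```xml
<PropertyGroup>
  <TargetFramework>net9.0</TargetFramework>
  <!-- The RIDs the restore error asks for; keep only the ones you actually build on. -->
  <RuntimeIdentifiers>win-x64;osx-arm64</RuntimeIdentifiers>
</PropertyGroup>
```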
Thanks for the info @samuel100! I managed to get the feed working. Now I've got four follow-up questions:
- The `0.8.2.1` version's related codebase doesn't exist in the main branch of this repository. When will it be included here?
- Is specifying the runtime identifier temporary or permanent?
- The Foundry Local C# SDK uses Betalgo.Ranul.OpenAI instead of the official OpenAI SDK. Is this a deliberate choice, or will it be replaced with the official one?
- According to the sample code you mentioned above, the `model.GetChatClientAsync()` method returns `OpenAIChatClient` from Betalgo.Ranul.OpenAI, but it doesn't seem to be compatible with `IChatClient` from Microsoft.Extensions.AI. How can I cast the type from `OpenAIChatClient` to `IChatClient`?
To answer Q4, should I follow this REST API approach for now? https://learn.microsoft.com/en-gb/azure/ai-foundry/foundry-local/how-to/how-to-integrate-with-inference-sdks?view=foundry-classic&pivots=programming-language-csharp
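I.e. something along these lines, paraphrasing the doc's C# sample (FoundryLocalManager from Microsoft.AI.Foundry.Local plus the official OpenAI SDK; exact member names may differ between versions):

```csharp
using System.ClientModel;
using Microsoft.AI.Foundry.Local;
using OpenAI;

// Paraphrased from the linked doc: start the model via Foundry Local,
// then point the official OpenAI client at the local endpoint.
var alias = "phi-3.5-mini";
var manager = await FoundryLocalManager.StartModelAsync(aliasOrModelId: alias);
var model = await manager.GetModelInfoAsync(aliasOrModelId: alias);

var client = new OpenAIClient(
    new ApiKeyCredential(manager.ApiKey),
    new OpenAIClientOptions { Endpoint = manager.Endpoint });

var chatClient = client.GetChatClient(model?.ModelId);
var completion = await chatClient.CompleteChatAsync("Why is the sky blue?");
Console.WriteLine(completion.Value.Content[0].Text);
```

If I understand correctly, Microsoft.Extensions.AI.OpenAI also has an extension that wraps the official ChatClient as an IChatClient (AsIChatClient() in recent versions), which might avoid writing an adapter at all.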
I ran into the same issue and wrote an adapter class. Let me know if you want to see the code.
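Roughly, the shape is something like this (a minimal sketch; the delegate is a hypothetical stand-in for whatever call the Betalgo-based client actually exposes, and the member names assume a recent Microsoft.Extensions.AI.Abstractions where IChatClient uses GetResponseAsync/GetStreamingResponseAsync):

```csharp
using System.Runtime.CompilerServices;
using Microsoft.Extensions.AI;

// Hypothetical stand-in: any function that sends a prompt and returns the completion text.
// Wire this up to whatever model.GetChatClientAsync() actually returns.
public delegate Task<string> CompletePromptAsync(string prompt, CancellationToken cancellationToken);

public sealed class FoundryChatClientAdapter(CompletePromptAsync complete) : IChatClient
{
    public async Task<ChatResponse> GetResponseAsync(
        IEnumerable<ChatMessage> messages,
        ChatOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        // Naive flattening of the chat history into one prompt; a real adapter would map roles and options.
        var prompt = string.Join("\n", messages.Select(m => $"{m.Role}: {m.Text}"));
        var text = await complete(prompt, cancellationToken);
        return new ChatResponse(new ChatMessage(ChatRole.Assistant, text));
    }

    public async IAsyncEnumerable<ChatResponseUpdate> GetStreamingResponseAsync(
        IEnumerable<ChatMessage> messages,
        ChatOptions? options = null,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        // Non-streaming fallback: return the whole completion as a single update.
        var response = await GetResponseAsync(messages, options, cancellationToken);
        yield return new ChatResponseUpdate(ChatRole.Assistant, response.Text);
    }

    public object? GetService(Type serviceType, object? serviceKey = null) =>
        serviceType.IsInstanceOfType(this) ? this : null;

    public void Dispose() { }
}
```

It only handles text in / text out and fakes streaming, but it's enough to plug into Microsoft.Extensions.AI-based code until the SDK exposes an IChatClient directly.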
Oh, yeah. I agree; I had to implement the adapter pattern for `OpenAIChatClient` to `IChatClient`, which I don't think we should have to do ourselves.
Since my use case is Aspire, and the Aspire.Hosting.Azure.AIFoundry NuGet package currently supports v0.3.0, I might have to stick with that version until this gets resolved.