[BUG] `foundry model run deepseek-r1-1.5b-cpu` fails on macOS
### Environment
- Device: MacBook Pro M1
- OS: macOS 14.0
### Preparation
- [x] Neutron Server (`Inference.Service.Agent`) - Use `chmod +x Inference.Service.Agent` to make it executable
- [x] Foundry Client (`foundry`) - Use `chmod +x foundry` to make it executable, and add it to `PATH`
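A minimal shell sketch of the preparation checklist above (the unpack directory `~/foundry_local` and the placeholder files are assumptions for illustration, not from the original report):

```shell
# Sketch of the preparation steps, assuming the release binaries were
# unpacked into ~/foundry_local (directory name is an assumption).
mkdir -p "$HOME/foundry_local"
cd "$HOME/foundry_local"
touch Inference.Service.Agent foundry    # placeholders standing in for the real binaries

chmod +x Inference.Service.Agent         # make the Neutron server executable
chmod +x foundry                         # make the Foundry client executable
export PATH="$HOME/foundry_local:$PATH"  # add the client to PATH for this shell
```

After this, `command -v foundry` should resolve to the client in `~/foundry_local`.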
### Execution
```shell
foundry service start
```

```text
info: Microsoft.Neutron.OpenAI.Provider.OpenAIServiceProviderOnnx[0]
OpenAIService for type:OpenAIServiceProviderOnnx
info: Microsoft.Neutron.Rpc.Service.JsonRpcService[0]
json-rpc-service running
info: Microsoft.Neutron.Rpc.Service.JsonRpcService[2305]
Accepting pipe connections pipeName:inference_agent
info: Microsoft.Hosting.Lifetime[14]
Now listening on: http://localhost:5272
info: Microsoft.Hosting.Lifetime[0]
Application started. Press Ctrl+C to shut down.
info: Microsoft.Hosting.Lifetime[0]
Hosting environment: Production
info: Microsoft.Hosting.Lifetime[0]
Content root path: /foundry_local/Neutron.Server/release_osx-arm64/
Permission denied
Failed setting /foundry_local/Neutron.Server/release_osx-arm64/Inference.Service.Agent priority to AboveNormal
🟢 Service is Started on http://localhost:5272, PID 55684!
```
### Result
```shell
foundry model run deepseek-r1-1.5b-cpu
```

```text
fail: Microsoft.AspNetCore.Server.Kestrel[13]
Connection id "0HNC00TRB4C3H", Request id "0HNC00TRB4C3H:00000002": An unhandled exception was thrown by the application.
System.InvalidOperationException: No service for type 'Microsoft.Neutron.OpenAI.OpenAIServiceComposite' has been registered.
at Microsoft.Extensions.DependencyInjection.ServiceProviderServiceExtensions.GetRequiredService(IServiceProvider, Type) + 0xa8
at Microsoft.Extensions.DependencyInjection.ServiceProviderServiceExtensions.GetRequiredService[T](IServiceProvider) + 0x30
at Microsoft.AspNetCore.Http.Generated.<GeneratedRouteBuilderExtensions_g>F16C589DE9EC82483AA705851D2FE201CB4CB4AAF6561E8DE71B6A1891AD8D67F__GeneratedRouteBuilderExtensionsCore.<>c__DisplayClass8_0.<<MapPost4>g__RequestHandler|5>d.MoveNext() + 0xe8
--- End of stack trace from previous location ---
at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.HttpProtocol.<ProcessRequests>d__238`1.MoveNext() + 0x288
[20:32:50 ERR] Exception parsing ProblemDetails response.
System.Text.Json.JsonException: The input does not contain any JSON tokens. Expected the input to start with a valid JSON token, when isFinalBlock is true. Path: $ | LineNumber: 0 | BytePositionInLine: 0.
---> System.Text.Json.JsonReaderException: The input does not contain any JSON tokens. Expected the input to start with a valid JSON token, when isFinalBlock is true. LineNumber: 0 | BytePositionInLine: 0.
at System.Text.Json.ThrowHelper.ThrowJsonReaderException(Utf8JsonReader&, ExceptionResource, Byte, ReadOnlySpan`1) + 0x24
at System.Text.Json.Utf8JsonReader.Read() + 0x70
at System.Text.Json.Serialization.JsonConverter`1.ReadCore(Utf8JsonReader&, T&, JsonSerializerOptions, ReadStack&) + 0x40
--- End of inner exception stack trace ---
at System.Text.Json.ThrowHelper.ReThrowWithPath(ReadStack&, JsonReaderException) + 0x338
at System.Text.Json.Serialization.JsonConverter`1.ReadCore(Utf8JsonReader&, T&, JsonSerializerOptions, ReadStack&) + 0x164
at System.Text.Json.Serialization.Metadata.JsonTypeInfo`1.Deserialize(Utf8JsonReader&, ReadStack&) + 0x28
at System.Text.Json.JsonSerializer.ReadFromSpan[TValue](ReadOnlySpan`1, JsonTypeInfo`1, Nullable`1) + 0xb0
at System.Text.Json.JsonSerializer.ReadFromSpan[TValue](ReadOnlySpan`1, JsonTypeInfo`1) + 0x120
at System.Text.Json.JsonSerializer.Deserialize[TValue](String, JsonTypeInfo`1) + 0x54
at Microsoft.AI.Foundry.Local.Common.ModelManagement.GetErrorFromResponse(HttpResponseMessage, CancellationToken) + 0x7c
Exception: Failed: Downloading deepseek-r1-distill-qwen-1.5b-cpu-int4-rtn-block-32-acc-level-4
Response status code does not indicate success: 500 (Internal Server Error).
```
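The secondary `JsonException` in the log is consistent with the server returning its 500 response with an empty body: any strict JSON parser rejects empty input. A minimal shell illustration (using python3's `json.tool` as a stand-in for `System.Text.Json`; this is an analogy, not the actual client code path):

```shell
# An empty response body is not valid JSON, so attempting to parse it as
# ProblemDetails throws - here python3's json.tool fails the same way.
printf '' | python3 -m json.tool \
  || echo "empty body is not valid JSON -> parser throws, as in the log above"
```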
Is this unexpected? Only Windows is said to be supported currently, and there is even a branch called `remove_mac_linux`.
Hey @vldv, we've provided a limited-access Mac/Linux build to some of the community, and we're still testing it - but thank you for the comment :)
@BeanHsiang, we've got a new build in the works; we'll test this with it and get back to you soon!
Please add me
Hey @HaoHoo - we're very close to a nice build. Rest assured you'll be first on the list once we finalize it. Expect it sometime this week or the next!
@HaoHoo did you try the latest version on Mac? Instructions here: https://review.learn.microsoft.com/en-us/azure/ai-foundry/foundry-local/get-started?branch=main#option-1-quick-cli-setup