InnerEye-Gateway
The InnerEye-Gateway is a Windows service that acts as a DICOM end point to run inference on https://github.com/microsoft/InnerEye-DeepLearning models.
Bumps [Markdig](https://github.com/xoofx/markdig) from 0.15.0 to 0.33.0.
Bumps [Namotion.Reflection](https://github.com/RicoSuter/Namotion.Reflection) from 1.0.11 to 3.0.0.
Bumps [Microsoft.EntityFrameworkCore.Sqlite](https://github.com/dotnet/efcore) from 3.1.9 to 7.0.10.
Bumps [fo-dicom](https://github.com/fo-dicom/fo-dicom) from 4.0.8 to 5.1.1.
Bumps [Microsoft.Extensions.Options.ConfigurationExtensions](https://github.com/dotnet/runtime) from 6.0.0 to 7.0.0.
After following the set-up steps and attempting to run the tests, all of them pass except the `GenerateAndTestDeAnonymisedStructureSetFile` test, which fails with the following output:

```text
Message: Assert.IsFalse failed.
Stack...
```
During the end-to-end tests, if the inference service fails to complete the model run then, even when the Gateway receives HTTP error codes from the inference service, it continues with the tests,...
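The behaviour described above could be avoided by failing fast on HTTP error responses. A minimal sketch of that idea, in Python for illustration only (the real Gateway is a C# Windows service, and all names below are hypothetical):

```python
# Illustrative sketch: abort an end-to-end test run as soon as the
# inference service reports an HTTP error, rather than continuing.

def check_inference_response(status_code: int) -> None:
    """Raise immediately on any non-2xx status from the inference service."""
    if not 200 <= status_code < 300:
        raise RuntimeError(
            f"Inference service returned HTTP {status_code}; aborting test run"
        )

def run_end_to_end(status_codes) -> int:
    """Process a sequence of inference responses, stopping at the first error.

    Returns the number of responses processed successfully.
    """
    completed = 0
    for code in status_codes:
        check_inference_response(code)  # raises on the first error code
        completed += 1
    return completed
```

With this structure, a single 5xx response from the inference service stops the run instead of letting later tests proceed against a failed model run.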
Documentation does not clearly explain the difference between development and production