DacFx
".NET Core should not be using a file backed model" — SqlPackage should simply default to the correct storage method instead of erroring.
Possible solutions:
- Set the default and remove the SqlPackage option on .NET Core
- Use the proper default and throw an exception if anything else is specified
- Update the exception message to include information about proper usage
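For context, a minimal sketch of how the unsupported path gets hit, assuming the documented `/p:Storage=File` SqlPackage property; the action, file name, and connection string here are placeholders:

```shell
# Explicitly requesting a file-backed schema model; on a .NET Core/5+
# build of SqlPackage this currently throws instead of falling back.
sqlpackage /Action:Import \
  /SourceFile:"huge.bacpac" \
  /TargetConnectionString:"<connection string>" \
  /p:Storage=File
```

The options above differ in what happens to this invocation: either the property disappears on .NET Core, or it keeps failing but with a clearer message.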
It would also be good if https://github.com/microsoft/azuredatastudio/issues/12754 could be fixed at the same time.
IMO, a better solution is to support the file-backed model when using a .NET Core/5+ build of sqlpackage. Otherwise, how are we supposed to import huge BACPACs using those builds?
Any thoughts on this, anyone?
If I recall correctly, it was a platform limitation handed to us by .NET Core, but it's possible support has been added in more recent versions. @llali?
@Benjin @llali Any update on this please? Struggling to import any database over around 7GB and file backed model appears to be the solution: https://techcommunity.microsoft.com/t5/azure-database-support-blog/import-bacpac-fails-if-schema-has-a-large-number-of-tables/ba-p/2451229
We are facing the same issue. On .NET Framework we could use a file-backed model and not run out of resources, but .NET Core is preventing us from importing our DB. An update would be appreciated, please.