Implement stored procedure update mapping
Here's a WIP for some early feedback, to make sure I'm going in the right direction.
Some comments:
- ResultSetMapping is now a flags enum, with a bit that specifies whether the command has output parameters (I've also redone the "positional mapping" optimization for SQL Server as another bit); see the rough sketch after this list.
- Rows affected is unimplemented (waiting for metadata), same with fancier mapping scenarios (need to know which table is being updated by the sproc).
- Positional vs. named argument style
- All databases support positional arguments to stored procedures, but not all support named (see comparison below); we should do positional by default.
- Supporting named seems like it needs some metadata changes... StoredProcedureParameterBuilder has HasName (StoreStoredProcedureParameter extends IColumnBase, which has Name). From a user perspective this looks like it should set the parameter name in the sproc call (e.g. EXEC MySproc @id = 3, @name = 'foo'). However, this is all non-nullable, so we have no way to make the named/positional distinction. Making IColumnBase.Name nullable seems like it would have a huge impact; another option would be to use some other property specific to StoreStoredProcedureParameter, but that would make Name dead for that type. We can also scope this out and say that named parameters aren't supported (and remove the HasName API for now).
- Some assumptions in this implementation:
- We call sprocs from regular commands (CommandType.Text with EXEC), even with output parameters. The alternative is to use CommandType.StoredProcedure, but those can't be batched - one call per command (until the new ADO.NET batching API is implemented). This works well for SQL Server, but I'm not 100% sure of the portability here.
- We consume output parameters as we're traversing the DbCommand's result set. If some database only populates output parameters at the end (e.g. when the reader is disposed), this will fail.
- No seeding support right now. I think this would require propagating the relational model through migrations - probably not trivial. Consider scoping out.
- Sprocs don't have to return rows affected (but they can). This means that the concurrency check is optional - not possible without sprocs at the moment (see #10443).
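As a rough sketch, the flags enum mentioned in the first bullet could look something like this (member names and values here are illustrative, not necessarily what's in the PR):

using System;

// Illustrative sketch only; the actual members and values in the PR may differ.
[Flags]
public enum ResultSetMapping
{
    // The command maps to no resultset at all.
    NoResults = 0,

    // The command maps to a resultset, and whether that resultset is the last one in the batch.
    NotLastInResultSet = 1,
    LastInResultSet = 2,

    // New bit: the command has output parameters that need to be propagated back.
    HasOutputParameters = 4,

    // New bit: the SQL Server "positional mapping" optimization, where result columns
    // are consumed by position instead of being located by name.
    IsPositionalResultColumnValue = 8
}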
Some cross-database comparison:
| Database | Syntax | Named/Positional | Docs |
|---|---|---|---|
| SQL Server | EXEC MySproc 'foo', 'bar' | Both | docs |
| PostgreSQL | CALL MySproc('foo', 'bar') | Both | docs |
| MySQL | CALL MySproc('foo', 'bar') | Positional only | docs |
| MariaDB | CALL MySproc('foo', 'bar') | Positional only | docs |
| Oracle | CALL MySproc('foo', 'bar') | Positional only (in SQL) | docs |
| SQLite | No sproc support | | |
Closes #245 Closes #28435
Crap, seems like on SQL Server output parameters aren't populated until NextResult is called. Will change the logic to take this into account.
@roji One for the book? 🤣
Oh I dunno, sprocs are such a quirky area that I'm hardly even surprised here...
Yeah, I don't suppose you have any lack of material. No need to find filler.
Pretty much :rofl:
@roji works fine for me without NextResult. Will dig out a sample...
@ErikEJ this is specifically for a call that also returns a resultset (so both resultset and output parameter). If there's only an output parameter without a resultset, the previous NextResult already "went past" the row and processed its output parameter.
Here's a code sample:
Code sample
using Microsoft.Data.SqlClient;
using System.Data;

await using var conn = new SqlConnection("Server=localhost;Database=test;User=SA;Password=Abcd5678;Connect Timeout=60;ConnectRetryCount=0;Encrypt=false");
await conn.OpenAsync();
var cmd = conn.CreateCommand();
cmd.CommandText = @"
DROP TABLE IF EXISTS Blogs;
CREATE TABLE Blogs
(
Id int IDENTITY PRIMARY KEY,
Title VARCHAR(MAX)
)";
cmd.ExecuteNonQuery();
cmd.CommandText = @"
CREATE OR ALTER PROCEDURE Blogs_Insert(@id int OUT, @title varchar(max))
AS BEGIN
INSERT INTO [Blogs] ([Title]) VALUES (@title);
SELECT @id = SCOPE_IDENTITY();
SELECT 8;
END;";
cmd.ExecuteNonQuery();
cmd.CommandText = "EXEC Blogs_Insert @Id OUTPUT, 'foo'";
var idParam = new SqlParameter("Id", SqlDbType.Int)
{
Direction = ParameterDirection.Output
};
cmd.Parameters.Add(idParam);
cmd.Parameters.AddWithValue("title", "foo");
using var reader = cmd.ExecuteReader();
reader.Read();
Console.WriteLine("Result column: " + reader[0]);
Console.WriteLine("ID: " + idParam.Value);
Pushed a logic change to handle this, @AndriySvyryd this is probably a good time to give this a 1st review.
@ErikEJ note that I'm also supporting multiple sproc calls in a single command (batching), by concatenating multiple EXECs into the same CommandText. If you have any info/insights/possible pitfalls, they'd be very welcome :grimacing:
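For reference, a rough sketch of what such a batched command could look like at the ADO.NET level, reusing the Blogs_Insert sproc from the sample above (the @p0..@p3 parameter names are made up for illustration; the SQL the PR actually generates may differ):

using Microsoft.Data.SqlClient;
using System.Data;

// Sketch of the batching idea: two calls to Blogs_Insert concatenated into a single
// CommandType.Text command, sent to the server in one round-trip.
await using var conn = new SqlConnection("Server=localhost;Database=test;User=SA;Password=Abcd5678;Connect Timeout=60;ConnectRetryCount=0;Encrypt=false");
await conn.OpenAsync();

var cmd = conn.CreateCommand();
cmd.CommandText = @"
EXEC Blogs_Insert @p0 OUTPUT, @p1;
EXEC Blogs_Insert @p2 OUTPUT, @p3;";

cmd.Parameters.Add(new SqlParameter("p0", SqlDbType.Int) { Direction = ParameterDirection.Output });
cmd.Parameters.AddWithValue("p1", "first blog");
cmd.Parameters.Add(new SqlParameter("p2", SqlDbType.Int) { Direction = ParameterDirection.Output });
cmd.Parameters.AddWithValue("p3", "second blog");

// Consume all resultsets before reading the output parameters (see the discussion further down).
await using (var reader = await cmd.ExecuteReaderAsync())
{
    do
    {
        while (await reader.ReadAsync()) { }
    }
    while (await reader.NextResultAsync());
}

Console.WriteLine($"First id: {cmd.Parameters["p0"].Value}, second id: {cmd.Parameters["p2"].Value}");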
I have this sproc and get values back without special effort:
CREATE OR ALTER PROCEDURE OutputScenarios (
@Year SMALLINT,
@ProductCount INT OUTPUT,
@Description NVARCHAR(50) OUTPUT
) AS
BEGIN
SELECT * FROM Customers
SELECT @ProductCount = 42;
SELECT @Description = 'It works'
END;
GO
public virtual async Task<List<OutputScenariosResult>> OutputScenariosAsync(short? Year, OutputParameter<int?> ProductCount, OutputParameter<string> Description, OutputParameter<int> returnValue = null, CancellationToken cancellationToken = default)
{
var parameterProductCount = new SqlParameter
{
ParameterName = "ProductCount",
Direction = System.Data.ParameterDirection.InputOutput,
Value = ProductCount?._value ?? Convert.DBNull,
SqlDbType = System.Data.SqlDbType.Int,
};
var parameterDescription = new SqlParameter
{
ParameterName = "Description",
Size = 100,
Direction = System.Data.ParameterDirection.InputOutput,
Value = Description?._value ?? Convert.DBNull,
SqlDbType = System.Data.SqlDbType.NVarChar,
};
var parameterreturnValue = new SqlParameter
{
ParameterName = "returnValue",
Direction = System.Data.ParameterDirection.Output,
SqlDbType = System.Data.SqlDbType.Int,
};
var sqlParameters = new []
{
new SqlParameter
{
ParameterName = "Year",
Value = Year ?? Convert.DBNull,
SqlDbType = System.Data.SqlDbType.SmallInt,
},
parameterProductCount,
parameterDescription,
parameterreturnValue,
};
var _ = await _context.SqlQueryAsync<OutputScenariosResult>("EXEC @returnValue = [dbo].[OutputScenarios] @Year, @ProductCount OUTPUT, @Description OUTPUT", sqlParameters, cancellationToken);
ProductCount.SetValue(parameterProductCount.Value);
Description.SetValue(parameterDescription.Value);
returnValue?.SetValue(parameterreturnValue.Value);
return _;
}
SqlQueryAsync is:
return await db.Set<T>().FromSqlRaw(sql, parameters).ToListAsync(cancellationToken);
Compared to your sample code, I see named parameters, direction = InputOutput and setting a value up front...
@ErikEJ that's a high-level code sample using EF, which may be doing various stuff under the hood. I need to use ADO.NET directly - can you try to produce an ADO.NET-only code sample?
Sorry, yes, too much EF Core magic there - this works fine on my PC! (Notice explicit using statement!)
using Microsoft.Data.SqlClient;
using System.Data;
await using var conn = new SqlConnection("Server=.\\SQLEXPRESS;Database=Northwind;Integrated Security=true;Trust Server Certificate=true");
await conn.OpenAsync();
var parameterProductCount = new SqlParameter
{
ParameterName = "ProductCount",
Direction = ParameterDirection.InputOutput,
Value = Convert.DBNull,
SqlDbType = SqlDbType.Int,
};
var parameterDescription = new SqlParameter
{
ParameterName = "Description",
Size = 100,
Direction = ParameterDirection.InputOutput,
Value = Convert.DBNull,
SqlDbType = SqlDbType.NVarChar,
};
var parameterreturnValue = new SqlParameter
{
ParameterName = "returnValue",
Direction = ParameterDirection.Output,
SqlDbType = SqlDbType.Int,
};
var sqlParameters = new[]
{
new SqlParameter
{
ParameterName = "Year",
Value = Convert.DBNull,
SqlDbType = SqlDbType.SmallInt,
},
parameterProductCount,
parameterDescription,
parameterreturnValue,
};
var cmd = conn.CreateCommand();
cmd.CommandText = "EXEC @returnValue = [dbo].[OutputScenarios] @Year, @ProductCount OUTPUT, @Description OUTPUT";
cmd.Parameters.AddRange(sqlParameters);
using (var reader = cmd.ExecuteReader())
{
while (reader.Read())
{
Console.WriteLine("Result column: " + reader[0]);
}
}
Console.WriteLine("ProductCount: " + parameterProductCount.Value);
Console.WriteLine("Description: " + parameterDescription.Value);
Console.WriteLine("ReturnValue: " + parameterreturnValue.Value);
@ErikEJ oh sure, if you dispose the reader, that consumes the resultset and populates the output parameters.
I'm specifically trying to process output parameters as the reader's resultsets are being processed, rather than at the very end; the problem is that when processing a row (after calling Read), the output parameters for that row aren't yet populated.
I'm doing this "inline" processing style to have a single pass over all the commands, rather than having two passes (one over the resultset, another over the output parameters). It could also be important for raising the proper concurrency exception at the right time (i.e. as rows are being processed). But I may be over-complicating things; I'll take another look once concurrency detection is implemented.
Supporting named seems like it needs some metadata changes... StoredProcedureParameterBuilder has HasName (StoreStoredProcedureParameter extends IColumnBase, which has Name). From a user perspective this looks like it should set the parameter name in the sproc call (e.g. EXEC MySproc @id = 3, @name = 'foo'). However, this is all non-nullable, so we have no way to make the named/positional distinction. Making IColumnBase.Name nullable seems like it would have a huge impact; another option would be to use some other property specific to StoreStoredProcedureParameter, but that would make Name dead for that type. We can also scope this out and say that named parameters aren't supported (and remove the HasName API for now).
We can use IColumnBase.Name, but use named argument style for all parameters or add a flag or some other configuration which would determine when a particular argument should be passed by name (tracked in https://github.com/dotnet/efcore/issues/28439)
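For illustration, named argument style on SQL Server would look roughly like this (a sketch reusing the Blogs_Insert sproc from the sample above; @p0/@p1 stand in for the ADO.NET parameters sent with the command):

// Positional: arguments are matched to the sproc's parameters by position.
var positionalSql = "EXEC Blogs_Insert @p0 OUTPUT, @p1;";

// Named: arguments are matched to the sproc's parameters by name, so ordering no longer matters.
var namedSql = "EXEC Blogs_Insert @title = @p1, @id = @p0 OUTPUT;";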
No seeding support right now. I think this would require propagating the relational model through migrations - probably not trivial. Consider scoping out.
We currently get IColumnMapping when available and pass through the IProperty.
https://github.com/dotnet/efcore/blob/129baccb9e136831b8a79be916460ca5fef9dcba/src/EFCore.Relational/Migrations/MigrationsSqlGenerator.cs#L909
It shouldn't be too different for sprocs. If sprocs are to be used and the model is not available we should throw. But this can be done in a separate PR or post 7.0
I'm doing this "inline" processing style to have a single pass over all the commands, rather than having two passes (one over the resultset, another over the output parameters). It could also be important for raising the proper concurrency exception at the right time (i.e. as rows are being processed). But I may be over-complicating things; I'll take another look once concurrency detection is implemented
OK, I'm abandoning inline processing of output parameters; it seems that SqlClient/SQL Server are too quirky here. Specifically, if a sproc with an output param and no result column is executed in a batch before a regular select, the output parameter is only populated after NextResult is called (although there's only one resultset, and it comes after the sproc call with the output parameter). I'll do the safer thing and only access output parameters after the reader has been completely consumed.
Quirky code sample
using Microsoft.Data.SqlClient;
using System.Data;

await using var conn = new SqlConnection("Server=localhost;Database=test;User=SA;Password=Abcd5678;Connect Timeout=60;ConnectRetryCount=0;Encrypt=false");
await conn.OpenAsync();
var cmd = conn.CreateCommand();
cmd.CommandText = @"
DROP TABLE IF EXISTS Blogs;
CREATE TABLE Blogs
(
Id int IDENTITY PRIMARY KEY,
Title VARCHAR(MAX)
)";
cmd.ExecuteNonQuery();
cmd.CommandText = @"
CREATE OR ALTER PROCEDURE Blogs_Insert(@id int OUT, @title varchar(max))
AS BEGIN
INSERT INTO [Blogs] ([Title]) VALUES (@title);
SELECT @id = 8;
END;";
cmd.ExecuteNonQuery();
cmd.CommandText = "EXEC Blogs_Insert @Id OUTPUT, 'foo'; SELECT 20";
var idParam = new SqlParameter("Id", SqlDbType.Int)
{
Direction = ParameterDirection.Output
};
cmd.Parameters.Add(idParam);
cmd.Parameters.AddWithValue("title", "foo");
using var reader = cmd.ExecuteReader();
// We should now be on the resultset of the 2nd command
Console.WriteLine($"First read: {reader.Read()}, value: {reader[0]}"); // true, 20
Console.WriteLine("ID: " + idParam.Value); // null (?!)
Console.WriteLine($"NextResult: {reader.NextResult()}"); // false
Console.WriteLine($"Second read: {reader.Read()}"); // false
Console.WriteLine("ID: " + idParam.Value); // 8...
/cc @ErikEJ
:facepalm: it's even documented that you need to close SqlDataReader before accessing output parameters:
While the SqlDataReader is being used, the associated SqlConnection is busy serving the SqlDataReader, and no other operations can be performed on the SqlConnection other than closing it. This is the case until the Close method of the SqlDataReader is called. For example, you cannot retrieve output parameters until after you call Close.
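So the safe pattern is roughly the following - a sketch that reuses cmd and idParam from the quirky sample above, changing only the reader handling:

// Fully consume and dispose the reader before touching any output parameters.
using (var reader = cmd.ExecuteReader())
{
    do
    {
        while (reader.Read())
        {
            // Process result columns here; output parameters are NOT yet populated.
        }
    }
    while (reader.NextResult());
} // Closing/disposing the reader is what makes the output parameters available.

Console.WriteLine("ID: " + idParam.Value); // Reliably 8 at this point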
Nice!
Is it? :rofl:
- Do we want to validate that HasOriginalValueParameter doesn't get used for wrong properties (i.e. not concurrency token/pk)?
No, there are valid reasons to use it for non-conditional properties
Hey, I don't want to bug you if this isn't ready for primetime yet, but thought I'd check. I've got 7.0.0-rc.1.22410.9. IntelliSense helps me build the Insert/Update/Delete mappings and the compiler is happy, but procedures are not being used. Just localdb on the same machine as VS. It's enough for me to write my article knowing that it WILL work, though sorry I can't say "and here's the SQL ... look, it's calling the sprocs!". However, if this SHOULD work with this build, I can share my sln in a separate issue for you to see if a) I'm doing something wrong (?) or b) I'm doing something differently (super simple class/dbcontext). Thanks
@julielerman
It's enough for me to write my article knowing that it WILL work
Never trust us untrustworthy devs!
Unfortunately, AFAIK our CI doesn't produce daily builds of PRs which haven't been merged yet, so that version you're using only contains the metadata APIs (this PR does the actual implementation).
You can still build this PR yourself and run against that - it shouldn't be too hard. Or if you have a couple of days to wait, this should be merged and you'll have a daily build nupkg.
In the meantime, here's a console program which works for me when executed against the PR; it outputs:
EXEC [Blog_Insert] @p0, @p1 OUTPUT;
Code sample
await using var ctx = new BlogContext();
await ctx.Database.EnsureDeletedAsync();
await ctx.Database.EnsureCreatedAsync();
ctx.Database.ExecuteSqlInterpolated(
$"""
CREATE PROCEDURE Blog_Insert(@Name varchar(max), @Id int OUT)
AS BEGIN
INSERT INTO [Blogs] ([Name]) VALUES (@Name);
SET @Id = SCOPE_IDENTITY();
END;
""");
ctx.Database.ExecuteSqlInterpolated(
$"""
CREATE PROCEDURE Blog_Update(@Id int, @Name varchar(max))
AS UPDATE [Blogs] SET [Name] = @Name WHERE [Id] = @id;
""");
ctx.Database.ExecuteSqlInterpolated(
$"""
CREATE PROCEDURE Blog_Delete(@Id int)
AS DELETE FROM [Blogs] WHERE [Id] = @Id;
""");
var blog = new Blog { Name = "Foo" };
ctx.Blogs.Add(blog);
await ctx.SaveChangesAsync();
public class BlogContext : DbContext
{
public DbSet<Blog> Blogs { get; set; }
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
=> optionsBuilder
.UseSqlServer(@"Server=localhost;Database=test;User=SA;Password=Abcd5678;Connect Timeout=60;ConnectRetryCount=0;Encrypt=false")
.LogTo(Console.WriteLine, LogLevel.Information)
.EnableSensitiveDataLogging();
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
modelBuilder.Entity<Blog>()
.InsertUsingStoredProcedure(
"Blog_Insert",
spb => spb
.HasParameter(w => w.Name)
.HasParameter(w => w.Id, pb => pb.IsOutput()))
.UpdateUsingStoredProcedure(
"Blog_Update",
spb => spb
.HasParameter(w => w.Id)
.HasParameter(w => w.Name))
.DeleteUsingStoredProcedure(
"Blog_Delete",
spb => spb.HasParameter(w => w.Id));
}
}
public class Blog
{
public int Id { get; set; }
public string Name { get; set; }
}
thanks @roji ... I will wait! Hope this wasn't the absolutely wrong place to ask. Carry on! :)
Hello @roji!
Because this pull request has the auto-merge label, I will be glad to assist with helping to merge this pull request once all check-in policies pass.
p.s. you can customize the way I help with merging this pull request, such as holding this pull request until a specific person approves. Simply @mention me (@msftbot) and give me an instruction to get started! Learn more here.
Apologies, while this PR appears ready to be merged, I've been configured to only merge when all checks have explicitly passed. The following integrations have not reported any progress on their checks and are blocking auto-merge:
- Azure Pipelines
These integrations are possibly never going to report a check, and unblocking auto-merge likely requires a human being to update my configuration to exempt these integrations from requiring a passing check.