
What is "efficient" execution of multiple commands?

AnthonyMastrean opened this issue 7 years ago • 4 comments

In the README, in the section "Execute a Command multiple times", you say (emphasis mine):

The same signature also allows you to conveniently and efficiently execute a command multiple times (for example to bulk-load data)

What do you mean by "efficiently"? Are these statements concatenated and sent as one command? Are they sent in one transaction?

I suspect the answer might be in this area, but I'm not savvy enough to tell for sure (I'm also looking at my database stats now to try to tell from the backend).

AnthonyMastrean avatar Jul 11 '18 17:07 AnthonyMastrean

Bump

It's been my experience that Dapper takes the IEnumerable, enumerates it, builds the parameterized SQL, and executes it once per item. That is to say, if you have an array of 100 rows, it will do 100 inserts for you.

A more efficient way would be to utilize the VALUES (),(),().. syntax and insert 100 rows with just a single SQL statement. I am not sure of the exact reason why Dapper doesn't do this (I assume compatibility), but the NuGet package Faithlife.Utility.Dapper (see https://faithlife.github.io/DapperUtility/BulkInsert.html) supports this method.
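For illustration, the multi-row VALUES approach can also be hand-rolled on top of plain Dapper. This is only a sketch: the table/column names (`MyTable`, `colA`, `colB`) and the `Foo` type are made up for the example, and real code would need to batch because providers cap the parameter count (e.g. SQL Server's 2100-parameter limit) — which is part of what Faithlife's BulkInsert handles for you.

```csharp
using System.Collections.Generic;
using Dapper;

public record Foo(int A, int B);

public static class MultiRowInsertSketch
{
    public static int InsertMany(System.Data.IDbConnection connection, IReadOnlyList<Foo> foos)
    {
        // Build "(@a0, @b0), (@a1, @b1), ..." and bind one parameter pair per row,
        // so the whole list goes to the server as a single statement.
        var parameters = new DynamicParameters();
        var valueClauses = new List<string>(foos.Count);
        for (var i = 0; i < foos.Count; i++)
        {
            valueClauses.Add($"(@a{i}, @b{i})");
            parameters.Add($"a{i}", foos[i].A);
            parameters.Add($"b{i}", foos[i].B);
        }

        var sql = $"insert MyTable(colA, colB) values {string.Join(", ", valueClauses)}";
        return connection.Execute(sql, parameters); // one round trip, one INSERT
    }
}
```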

BHare1985 avatar Jan 28 '19 16:01 BHare1985

Bump

@BHare1985 I can confirm that it is still the case that an INSERT of a collection with ExecuteAsync will trigger a separate INSERT for each object in the collection.

Is there any reason why Dapper is not capable of creating an INSERT command with multiple values, which would be a lot faster? Even Dapper.Contrib's Insert does not expand the VALUES clause.

paaaz avatar Jun 30 '21 09:06 paaaz

+1 to this concern. My expectation from this section in the docs:

var foos = new List<Foo>
{
    new Foo { A = 1, B = 1 },
    new Foo { A = 2, B = 2 },
    new Foo { A = 3, B = 3 }
};

var count = connection.Execute(@"insert MyTable(colA, colB) values (@a, @b)", foos);
Assert.Equal(foos.Count, count);

would have been that a single SQL statement was executed, looking something like this:

DECLARE @a1 int = 1,
        @b1 int = 1,
        @a2 int = 2,
        @b2 int = 2,
        @a3 int = 3,
        @b3 int = 3;

insert MyTable(colA, colB)
values
  (@a1, @b1),
  (@a2, @b2),
  (@a3, @b3);

But instead, I'm seeing three separate SQL executions, one for each row:

DECLARE @a int = 1,
        @b int = 1;

insert MyTable(colA, colB) values (@a, @b);
DECLARE @a int = 2,
        @b int = 2;

insert MyTable(colA, colB) values (@a, @b);
DECLARE @a int = 3,
        @b int = 3;

insert MyTable(colA, colB) values (@a, @b);

If we're talking about inserting 1k+ rows at a time, is there a recommended approach to passing that much data all in one go?

CC: @mgravell

KyleMit avatar Mar 20 '25 16:03 KyleMit

There are two answers to that. The fastest is bulk insert: FastMember and DapperAOT both have tools to help with that, assuming you're using something like SqlBulkCopy. I can advise more if that is useful. The second is DbBatch; again, DapperAOT has experimental code to turn this exact scenario into DbBatch usage. How the server handles DbBatch is down to the provider, however — it might choose to unroll and execute one-by-one. I'll try to test it soon.
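To make the first option concrete, here is a sketch (not from this thread) of the SqlBulkCopy + FastMember route. The table name `MyTable`, columns `colA`/`colB`, and the `Foo` type are assumptions carried over from the earlier example; FastMember's `ObjectReader` exposes the objects as an `IDataReader` that SqlBulkCopy can stream.

```csharp
using System.Collections.Generic;
using Microsoft.Data.SqlClient; // or System.Data.SqlClient
using FastMember;               // NuGet: FastMember

public record Foo(int A, int B);

public static class BulkCopySketch
{
    public static void BulkInsert(string connectionString, IEnumerable<Foo> foos)
    {
        using var connection = new SqlConnection(connectionString);
        connection.Open();

        using var bcp = new SqlBulkCopy(connection) { DestinationTableName = "MyTable" };
        // Map object properties to table columns explicitly, since the names differ here.
        bcp.ColumnMappings.Add("A", "colA");
        bcp.ColumnMappings.Add("B", "colB");

        // Stream the collection as an IDataReader; no per-row SQL statements are issued.
        using var reader = ObjectReader.Create(foos, "A", "B");
        bcp.WriteToServer(reader);
    }
}
```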

mgravell avatar Mar 20 '25 22:03 mgravell