aiopg
Bulk update values with SQLAlchemy
aiopg does not support bulk insert (https://github.com/aio-libs/aiopg/issues/112), so I use this to insert everything in a single query:
await conn.execute(
    sa_foo_table
    .insert()
    .values([
        dict(name='name1', x=1),
        dict(name='name2', x=2),
        dict(name='name3', x=3),
    ])
)
Is there any such thing for bulk updating? Because if I update one by one, it might take quite some time (there are thousands of rows).
GitMate.io thinks possibly related issues are https://github.com/aio-libs/aiopg/issues/43 (Error with sqlalchemy), https://github.com/aio-libs/aiopg/issues/345 (SQLAlchemy Dialects), https://github.com/aio-libs/aiopg/issues/140 (Using cursors with SQLAlchemy), https://github.com/aio-libs/aiopg/issues/478 (Update all secret: values in .travis.yml), and https://github.com/aio-libs/aiopg/issues/45 (Possibly use SQLAlchemy Strategies).
@FadedCoder hi.
You probably mean queries of this type:
async def test_bulk(connect):
    query = tbl.update().where(
        tbl.c.id == sa.bindparam('id')
    ).values(name=sa.bindparam('name'))
    await connect.execute(query, [
        dict(id='1', name='test_update1'),
        dict(id='2', name='test_update2'),
    ])
It's impossible: asynchronous mode does not support executemany(). But note that according to the psycopg2 documentation (http://initd.org/psycopg/docs/cursor.html#cursor.executemany): "Warning: In its current implementation this method is not faster than executing execute() in a loop."
But isn't there a way to compile many updates/inserts into a single SQL query and execute it faster than executing single queries sequentially? It would only require value sanitization, and that should be fine.
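One way to do this is PostgreSQL's UPDATE ... FROM (VALUES ...) form: all rows go into one statement, and sanitization is left to the driver via %s placeholders. A minimal sketch (the helper name build_bulk_update and the table name foo are hypothetical; depending on your column types you may need explicit casts such as %s::int in the VALUES list):

```python
def build_bulk_update(table, rows):
    """Compile many (id, name) updates into one statement.

    rows: list of dicts with 'id' and 'name' keys.
    Returns (sql, params) suitable for `await conn.execute(sql, params)`.
    """
    # One "(%s, %s)" pair per row; the driver sanitizes the values.
    placeholders = ", ".join(["(%s, %s)"] * len(rows))
    sql = (
        f"UPDATE {table} AS t "
        f"SET name = v.name "
        f"FROM (VALUES {placeholders}) AS v(id, name) "
        f"WHERE t.id = v.id"
    )
    # Flatten the rows into a single positional parameter list.
    params = []
    for row in rows:
        params.extend([row["id"], row["name"]])
    return sql, params

sql, params = build_bulk_update("foo", [
    {"id": 1, "name": "test_update1"},
    {"id": 2, "name": "test_update2"},
])
# With aiopg: await conn.execute(sql, params)
```

This trades one round trip for a slightly longer query string, which is usually a large win over thousands of sequential UPDATEs.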
@vir-mir is it somehow possible to use execute_batch() or execute_values()?
According to https://github.com/psycopg/psycopg2/issues/491#issuecomment-276551038:
cur.executemany(): 761.322767019 sec
execute_batch(): 14.6529989243 sec
execute_values(): 12.4037430286 sec
pgcopy: 2.85529208183 sec