CLI feature idea: pass a block of Python code that yields rows to be inserted
The CLI equivalent of this Python code:
```python
def rows():
    yield {"id": 1, "name": "Cleo"}
    yield {"id": 2, "name": "Suna"}

db["creatures"].insert_all(rows())
```
Similar to the --functions mechanism, recently extended in:
- #681
I'm torn on what to call the option, since --functions already means "these Python functions will be turned into SQL custom functions".
Maybe this:
```
sqlite-utils insert data.db creatures --code '
def rows():
    yield {"id": 1, "name": "Cleo"}
    yield {"id": 2, "name": "Suna"}
'
```
It could look for a rows() function, or it could use the first callable in the block.
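A minimal sketch of how that resolution could work, assuming the `--code` block is exec'd into a scratch namespace (`resolve_rows_function` is a hypothetical helper, not existing sqlite-utils code):

```python
import types


def resolve_rows_function(code):
    # Hypothetical sketch: exec the --code block in a scratch namespace,
    # prefer a function named rows(), otherwise fall back to the first
    # function defined in the block (dicts preserve insertion order).
    namespace = {}
    exec(code, namespace)
    if callable(namespace.get("rows")):
        return namespace["rows"]
    for value in namespace.values():
        if isinstance(value, types.FunctionType):
            return value
    raise ValueError("--code block did not define any functions")
```

The CLI would then call `db[table].insert_all(fn())` on whatever function this returns.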
But... what if you could insert multiple tables at once?
```
sqlite-utils insert data.db --code '
def creatures():
    yield {"id": 1, "name": "Cleo"}
    yield {"id": 2, "name": "Suna"}

def foods():
    yield "id", "name"
    yield 1, "Pasta"
    yield 2, "Salad"
'
```
In this case the table name is no longer a required option for sqlite-utils insert. Is that a bit too confusing though? Maybe have a sqlite-utils insert-multi? That seems messy too.
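If the multi-table variant did go ahead, the table names could come straight from the names of the functions defined in the block. A rough sketch under that assumption (`insert_from_code` is hypothetical, and it assumes every function yields dicts, unlike the tuple-yielding `foods()` example above):

```python
import types

import sqlite_utils


def insert_from_code(db_path, code):
    # Hypothetical: every top-level function defined in the block becomes
    # a table named after the function, populated by the rows it yields.
    # insert_all() expects each yielded row to be a dictionary.
    db = sqlite_utils.Database(db_path)
    namespace = {}
    exec(code, namespace)
    for name, value in namespace.items():
        if isinstance(value, types.FunctionType):
            db[name].insert_all(value())
```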
Could call it --python instead of --code. Might be a little bit more clear.
The problem with this one:

```
sqlite-utils insert data.db --code '
def creatures():
    yield {"id": 1, "name": "Cleo"}
    yield {"id": 2, "name": "Suna"}

def foods():
    yield "id", "name"
    yield 1, "Pasta"
    yield 2, "Salad"
'
```

…is that there's no clear way to set things like primary keys for the resulting tables.
On that basis I think sticking to just a single table at a time makes more sense. If you really want to run Python code to create multiple tables you can use Python and import sqlite_utils!
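For example, using the documented `Database` and `insert_all(pk=...)` API directly gives you per-table options like a primary key, something like this (the `foods` rows are reshaped into dicts here for illustration):

```python
import sqlite_utils

db = sqlite_utils.Database("data.db")


def creatures():
    yield {"id": 1, "name": "Cleo"}
    yield {"id": 2, "name": "Suna"}


def foods():
    yield {"id": 1, "name": "Pasta"}
    yield {"id": 2, "name": "Salad"}


# Calling the library directly means each table can set its own options,
# such as a primary key, which the --code idea has no clean way to express.
db["creatures"].insert_all(creatures(), pk="id")
db["foods"].insert_all(foods(), pk="id")
```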