extract-pg-schema
TODO
Issues that I want to add:
- [x] ❌ BUG: duplicate columns reported when a column has multiple foreign key constraints
- [ ] Routine support (procedures, functions, aggregates)
- [ ] Class: index
- [ ] Class: sequence
- [x] Information about row-level security policies added to tables (what about views?) (see the catalog query sketch after this list)
- [ ] Experiment with https://github.com/electric-sql/pglite instead of test-containers
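For reference, here is a minimal sketch (TypeScript, using the `pg` client) of how row-level security policy information can be read straight from the `pg_policies` system view. It is only an illustration of the underlying catalog data, not necessarily how extract-pg-schema gathers it.

```ts
import { Client } from "pg";

// Illustrative only: read row-level security policies from the pg_policies
// system view. Not necessarily how extract-pg-schema implements this.
async function listPolicies(connectionString: string) {
  const client = new Client({ connectionString });
  await client.connect();
  try {
    const { rows } = await client.query(`
      SELECT schemaname, tablename, policyname, permissive, roles, cmd, qual, with_check
      FROM pg_catalog.pg_policies
      ORDER BY schemaname, tablename, policyname
    `);
    return rows;
  } finally {
    await client.end();
  }
}
```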
I would really love to have (non-catalog) functions listed. I'm working on a pg query generation tool and this is the next step, along with using RLS to refine what to generate.
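As a rough illustration of what routine extraction could look like, here is a hedged sketch that lists non-catalog functions, procedures, and aggregates from `pg_proc`. This is not part of extract-pg-schema's current API; it only shows the catalog query such a feature would likely build on.

```ts
import { Client } from "pg";

// Illustrative only: list non-catalog routines.
// prokind: 'f' = function, 'p' = procedure, 'a' = aggregate, 'w' = window.
async function listRoutines(connectionString: string) {
  const client = new Client({ connectionString });
  await client.connect();
  try {
    const { rows } = await client.query(`
      SELECT n.nspname AS schema,
             p.proname AS name,
             p.prokind AS kind,
             pg_get_function_identity_arguments(p.oid) AS arguments,
             pg_get_function_result(p.oid) AS result_type
      FROM pg_catalog.pg_proc p
      JOIN pg_catalog.pg_namespace n ON n.oid = p.pronamespace
      WHERE n.nspname NOT IN ('pg_catalog', 'information_schema')
      ORDER BY 1, 2
    `);
    return rows;
  } finally {
    await client.end();
  }
}
```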
Are there any known blockers?
Yeah, I have been wanting to do that for a long time (as you can see!). The only blocker is my lack of time :-) I will get around to it at some point, but I can't make any promises as to when.
Being able to pass in a ready-migrated, in-memory PGlite instance would be great! However, knex doesn't support PGlite at this stage: https://github.com/knex/knex/issues/6078.
In the meantime, doing something like this would work: https://github.com/ben-pr-p/pglite-pool
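For anyone who wants to try this before official support lands, here is a minimal sketch of the idea: wrap a PGlite instance in an object that can run queries, roughly the way a `pg` client would. It assumes PGlite's `query()`/`exec()`/`close()` API; the pglite-pool package linked above may expose a different interface.

```ts
import { PGlite } from "@electric-sql/pglite";

// Illustrative only: a tiny query runner around an in-memory PGlite instance.
// This is NOT the pglite-pool package; it only shows the general approach.
function createPgliteRunner(db: PGlite) {
  return {
    async query(text: string, params?: any[]) {
      // PGlite's query() resolves to { rows, fields, ... }, much like pg does.
      return db.query(text, params);
    },
    async end() {
      await db.close();
    },
  };
}

async function example() {
  const db = new PGlite(); // in-memory database
  await db.exec("CREATE TABLE account (id serial PRIMARY KEY, name text)");
  const runner = createPgliteRunner(db);
  const { rows } = await runner.query(
    "SELECT tablename FROM pg_catalog.pg_tables WHERE schemaname = 'public'"
  );
  console.log(rows); // [ { tablename: 'account' } ]
  await runner.end();
}
```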
Yeah, it's annoying because there are some places where I rely quite a bit on Knex and then there are other places where I just went with .raw. So it would still be a bit of a rewrite to remove Knex entirely.
I created a pglite dialect for knex: https://www.npmjs.com/package/knex-pglite
I guess you could configure it in extractSchemas.ts? Any suggestions on the best way to configure it? e.g. just use pglite if the provided connection string is a file path?
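Purely as an illustration of that heuristic (both the helper and the rule are hypothetical, not part of extract-pg-schema):

```ts
// Hypothetical heuristic, not part of extract-pg-schema: treat postgres:// and
// postgresql:// connection strings as a server connection, and anything that
// looks like a path (or PGlite's memory:// scheme) as a PGlite data directory.
function shouldUsePglite(connection: string): boolean {
  if (/^postgres(ql)?:\/\//i.test(connection)) {
    return false;
  }
  return (
    connection === "memory://" ||
    connection.startsWith("/") ||
    connection.startsWith("./")
  );
}
```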
I don't think it's easy to configure a knex dialect to take an existing PGlite instance, e.g. to pass a readily migrated PGlite instance to kanel. However, I think using a temp directory is good enough: you migrate your DB to a temp dir, generate types with kanel, and then delete the temp dir.
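A rough sketch of that temp-dir workflow, assuming PGlite's file-backed data directory; the type-generation step is left as a hypothetical placeholder, since it still depends on kanel/extract-pg-schema being able to read from a PGlite connection (e.g. via knex-pglite):

```ts
import { mkdtemp, rm } from "node:fs/promises";
import { tmpdir } from "node:os";
import { join } from "node:path";
import { PGlite } from "@electric-sql/pglite";

async function generateFromTempDir(migrationSql: string) {
  // 1. Migrate a file-backed PGlite database into a throwaway directory.
  const dataDir = await mkdtemp(join(tmpdir(), "pglite-"));
  try {
    const db = new PGlite(dataDir);
    await db.exec(migrationSql);
    await db.close();

    // 2. Point the type generator at the data directory. `generateTypes` is a
    //    hypothetical placeholder for kanel/extract-pg-schema once a PGlite
    //    connection is supported.
    // await generateTypes({ connection: dataDir });
  } finally {
    // 3. Clean up the temp directory.
    await rm(dataDir, { recursive: true, force: true });
  }
}
```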
That's awesome! I will look into it as soon as I have some time. I don't think this will be too difficult.