Enable dynamically loaded Postgres extensions
Hi all, we at Lantern wanted to enable dynamically loading our Lantern extension into a WASM Postgres instance. We were happy to find PGlite, where you had done the heavy lifting of bringing Postgres to the browser, so we built our demo on top of PGlite. These two PRs implement the changes we needed to enable dynamic extension loading, and we think they would be useful upstream as well.
postgres-wasm PR: https://github.com/electric-sql/postgres-wasm/pull/13
Our blog post and the demo can be found here.
- Add the `MAIN_MODULE` linker flag, which emits the code necessary for dynamically loading side modules. It also exports all symbols from Postgres, so loaded extensions can access Postgres symbols.
- Remove the `EMULATE_FUNCTION_POINTER_CASTS` flag (see the postgres-wasm PR).
- Add the `ASYNCIFY` flag, which wraps the code in an async runtime, allowing `dlopen` to work in browsers by resolving WASM files with `fetch`. Changed the `ExecProtocolMsg` function call to use `ccall` so the response can be properly `await`ed.
- Add a `pathOrUrl` param for extensions. If it is passed, on startup we try to fetch the extension control file, then fetch the SQL file corresponding to the extension version, which should also be located under the same path or URL. `module_pathname` in the control file should be the full path or URL to the WASM file. The control file should exist under the path or URL like this example, the SQL script for the corresponding version should exist under the same URL like this, and the control file should contain the full URL to the WASM file, which will be fetched when we run the `CREATE EXTENSION` command.
- Change the `locateFile` function so that if a URL starts with `http:/` or `https:/` it appends the missing second slash and returns the URL. This is needed because in the `dlopenInternal` function here, when the URL passes through `PATH.normalize`, `http://` becomes `http:/`.
- Add an `addExtension(name: string, extension: Extension)` function, which can also be used after db initialization to register extension control and SQL files; `CREATE EXTENSION` can then be used to load the extension.
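The `pathOrUrl` startup flow above (fetch the control file, read its version and module path, then fetch the matching SQL file) hinges on parsing the Postgres extension control file. A minimal parser sketch, assuming only quoted `key = 'value'` lines (the helper name and URLs are hypothetical, not the actual PGlite code):

```javascript
// Sketch of parsing a Postgres extension control file, as fetched from
// pathOrUrl at startup. Handles the common quoted "key = 'value'" form;
// real control files may also contain unquoted boolean/numeric values.
function parseControlFile(text) {
  const entries = {};
  for (const line of text.split('\n')) {
    const match = line.match(/^\s*(\w+)\s*=\s*'([^']*)'/);
    if (match) {
      entries[match[1]] = match[2];
    }
  }
  return entries;
}

const control = parseControlFile(`
comment = 'Lantern: embedding vector search'
default_version = '0.0.4'
module_pathname = 'https://example.com/lantern.wasm'
`);

// The SQL script lives under the same base URL, named <ext>--<version>.sql.
const sqlUrl = `https://example.com/lantern--${control.default_version}.sql`;
console.log(control.module_pathname); // https://example.com/lantern.wasm
console.log(sqlUrl);                  // https://example.com/lantern--0.0.4.sql
```

`module_pathname` here is what `CREATE EXTENSION` would later fetch as the WASM side module.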
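The `locateFile` change above amounts to restoring the slash that `PATH.normalize` strips from URL protocols. A small sketch of that repair (the function name is illustrative, not the actual patch):

```javascript
// Sketch of the locateFile fix described above. Emscripten's PATH.normalize
// collapses "http://" to "http:/", so we restore the missing slash before
// the URL is fetched; correct URLs and plain paths pass through unchanged.
function fixProtocolSlash(path) {
  // Matches "http:/" or "https:/" NOT already followed by a second slash.
  const match = path.match(/^(https?):\/(?!\/)/);
  if (match) {
    return `${match[1]}://${path.slice(match[0].length)}`;
  }
  return path;
}

console.log(fixProtocolSlash('http:/example.com/lantern.wasm'));   // http://example.com/lantern.wasm
console.log(fixProtocolSlash('https://example.com/lantern.wasm')); // unchanged
console.log(fixProtocolSlash('/tmp/lantern.wasm'));                // unchanged
```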
We have compiled and tested the following extensions:
I'm pretty sure this is already supported, but the team, for some reason that completely baffles me, insists on doing dev in non-publicised repos and not visibly discussing what they are doing. They were supposed to do a release weeks ago, but it has been held up by adding lots and lots of extra little features and nice-to-haves. I think this might actually be the second time someone has implemented this feature, after it was already implemented outside the main repo by the maintainers...
Definitely check the Discord...
Hey @var77
This is incredible, thank you!
We have a new version of PGlite (based on PG v16.3) in development here: https://github.com/electric-sql/pglite/pull/112 that includes a standard API for adding Postgres extensions and PGlite plugins. I would love to see your work on lantern and pgcrypto extensions ported to it.
We will be doing a big release of it in a few weeks, and adding those to the list would be amazing.
Do reach out on discord if you want to collaborate on anything!
I notice you have turned asyncify back on. Was that only to enable dlopen, or did you need it for either of the extensions?
The `next` branch supports loading extensions without asyncify, as it significantly degrades performance and inflates the binary size.
> I notice you have turned asyncify back on. Was that only to enable dlopen, or did you need it for either of the extensions? The `next` branch supports loading extensions without asyncify, as it significantly degrades performance and inflates the binary size.
Hi @samwillis, yes, I turned on asyncify because dlopen was using fetch to load the WASM files in the browser, and fetch was not working without asyncify.
It's awesome that you managed to make it work without asyncify; I will check it out on the `next` branch.
Hey @var77
As we now have built-in support for dynamically loading extensions, I'm going to close this PR. We are working on docs + example repos to demonstrate how to build custom extensions for PGlite. We hope to have this out soon. Many thanks for your work on this!
Sam