Locally cached Context for OAv3 docs
We're currently running OA as a serverless (Lambda) app, which breaks context pre-caching. That is, every call to wrapDocument() fetches all of the named JSON-LD contexts. This makes for a lot of wasted time.
An obvious solution is to pull in these contexts at build time, and only fetch unusual contexts.
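Roughly what I mean, as a minimal sketch: the file paths, names, and loader shape below are illustrative assumptions (the loader follows the common jsonld.js documentLoader convention), not the existing OA internals.

```ts
// Rough sketch only -- file paths and names are illustrative, not OA API.
import credentialsV1 from "./contexts/credentials-v1.json"; // bundled at build time

type LoadedDocument = { documentUrl: string; document: unknown; contextUrl: null };

// Module-scope cache: bundled contexts are always available, and anything fetched
// once stays cached for the lifetime of a warm Lambda container.
const contextCache = new Map<string, unknown>([
  ["https://www.w3.org/2018/credentials/v1", credentialsV1],
  // ...other contexts known at build time would be added here
]);

export const cachedDocumentLoader = async (url: string): Promise<LoadedDocument> => {
  if (!contextCache.has(url)) {
    // Only "unusual" contexts ever hit the network (assumes Node 18+ global fetch).
    const response = await fetch(url);
    if (!response.ok) {
      throw new Error(`Could not load context ${url}: ${response.status}`);
    }
    contextCache.set(url, await response.json());
  }
  return { documentUrl: url, document: contextCache.get(url), contextUrl: null };
};
```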
If this is something that makes sense to others, I would be happy to raise a PR. I think the most flexible approach is to allow the user to pass a custom documentLoader function into the call to validateW3C. In the current code, this would probably have to be exposed as an option on wrapDocument.
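To be concrete about the API shape I have in mind: the documentLoader option below does not exist today, so this is purely a sketch of what the call site could look like.

```ts
// Hypothetical call site -- the documentLoader option is the proposed addition,
// not something wrapDocument currently accepts.
import { wrapDocument, SchemaId } from "@govtechsg/open-attestation";
import { cachedDocumentLoader } from "./cachedDocumentLoader";

const wrapWithCachedContexts = async (rawDocument: any) =>
  wrapDocument(rawDocument, {
    version: SchemaId.v3,
    documentLoader: cachedDocumentLoader, // forwarded internally to validateW3C
  });
```

Passing the loader through options would keep the default behaviour (fetch everything) unchanged for existing users.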
An alternative, and easier, option would be no dynamic loading at all: just use what's included in the build. I note this is the approach taken by DIDKit, where currently all contexts are baked in at compile time. This would be particularly simple, but of course it couldn't support new credential types without a rebuild.
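For comparison, the fully static version would just be the same loader with the network fallback removed (reusing the contextCache from the sketch above; again purely illustrative):

```ts
// Hypothetical "baked in at build time only" loader: no network access at all,
// so an unknown context (e.g. a new credential type) fails until the next build.
export const staticDocumentLoader = async (url: string): Promise<LoadedDocument> => {
  const document = contextCache.get(url);
  if (document === undefined) {
    throw new Error(`Context ${url} is not bundled; a rebuild is needed to support it`);
  }
  return { documentUrl: url, document, contextUrl: null };
};
```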
Any thoughts?
(edited for clarity)
On a side note: this is related to two other points of interest, which I'll flag here (I can open new issues if there's any interest):
- I see that others, e.g. Transmute, use their documentLoader in their examples for loading DIDs via DIF's universal resolver. This seems like an approach with some merit (a rough sketch of the idea follows this list).
- There doesn't seem to be any schema validation or JSON-LD validity checking on the verification end: documents are instead just checked against a TypeScript type. This surprises me a bit, but perhaps this is deliberate...
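On the first point, the idea is roughly that the documentLoader branches on did: URLs and delegates them to a universal resolver instance. A loose sketch follows; the endpoint path and response shape are based on the public DIF dev instance and are assumptions, not anything in OA's or Transmute's code.

```ts
// Hypothetical sketch: resolve did: URLs through a universal resolver, and hand
// everything else to the ordinary context loader from the earlier sketch.
const UNIVERSAL_RESOLVER = "https://dev.uniresolver.io/1.0/identifiers";

export const resolvingDocumentLoader = async (url: string): Promise<LoadedDocument> => {
  if (url.startsWith("did:")) {
    const response = await fetch(`${UNIVERSAL_RESOLVER}/${encodeURIComponent(url)}`);
    if (!response.ok) {
      throw new Error(`Could not resolve ${url}: ${response.status}`);
    }
    // The dev instance wraps the DID document in a resolution result.
    const { didDocument } = await response.json();
    return { documentUrl: url, document: didDocument, contextUrl: null };
  }
  return cachedDocumentLoader(url);
};
```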
When we say the "context", can I presume it's referring to the URLs in the @context section of a document? If that's what it refers to, would it make sense and probably be easier for apps to cache them locally on users' clients instead of caching on the server side through a Lambda? For e.g., localStorage?
Yep, I do mean the JSON-LD files referred to in the @context section.
> would it make sense and probably be easier for apps to cache them locally on users' clients instead of caching on the server side through a Lambda?
I realise now I haven't been clear above, sorry!
I'm not thinking about using it in the browser here. In this case we are using OpenAttestation server-side, behind the VC-API. We're currently using AWS Lambda for this.
Does the issue make more sense now?