Support for async `Cache`
The compilation step is usually very taxing on machine resources, so we would like to avoid it whenever possible. Currently, a `Cache` interface is defined, and a filesystem-based implementation is provided when running under Node.js:
https://github.com/o1-labs/o1js/blob/fd7bd4b02f4f7cc1057cd423261d70c22275763a/src/lib/proof-system/cache.ts#L31-L59
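For context, this is roughly how a cache gets wired into compilation under Node.js (a minimal sketch; `MyContract` is a placeholder for an actual contract, and we're assuming the `compile({ cache })` option together with `Cache.FileSystem` from o1js):

import { Cache } from 'o1js';
import { MyContract } from './my-contract.js';

// The first run compiles and writes the artifacts to ./cache;
// later runs read them back from disk instead of recompiling.
const cache = Cache.FileSystem('./cache');
await MyContract.compile({ cache });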
Currently, there is no `Cache` implementation that works in the browser (the environment where we expect to have end users), so we defined our own:
export const RemoteCache = (url: string): Cache => ({
  async read({ persistentId, uniqueId, dataType }) {
    // read current uniqueId, return data if it matches
    let currentId = await fetch(`${url}/${persistentId}.header`).then(r => r.text());
    if (currentId !== uniqueId) return undefined;
    if (dataType === 'string') {
      let string = await fetch(`${url}/${persistentId}`).then(r => r.text());
      return new TextEncoder().encode(string);
    } else {
      let buffer = await fetch(`${url}/${persistentId}`).then(r => r.arrayBuffer());
      return new Uint8Array(buffer);
    }
  },
  async write() {
    throw Error('not available');
  },
  canWrite: false,
  debug: true,
});
This implementation relies on a server where compilation of the proofs was already performed and the resulting artifacts were stored in the filesystem. In the browser, we make a network request for the compiled artifacts on `read` and reject all `write` operations (we don't want to alter the server filesystem).
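For reference, the server side can be as simple as statically serving the directory produced by the filesystem cache, so that `RemoteCache` can fetch the `<persistentId>` and `<persistentId>.header` files by name (a sketch using Express, which is just one option):

import express from 'express';

// './cache' is the directory previously populated by Cache.FileSystem('./cache')
// during a server-side compile; it is exposed over HTTP for reads only.
const app = express();
app.use(express.static('./cache'));
app.listen(3000);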
Security aside (can we actually trust the server's cached files?), this implementation does not compile because `Cache` is synchronous, which makes it impossible to use `async`/`await`:
https://github.com/o1-labs/o1js/blob/fd7bd4b02f4f7cc1057cd423261d70c22275763a/src/lib/proof-system/cache.ts#L37
Note how the current filesystem implementation uses sync operations:
https://github.com/o1-labs/o1js/blob/fd7bd4b02f4f7cc1057cd423261d70c22275763a/src/lib/proof-system/cache.ts#L176
We would like the `Cache` interface to be async by default:
type Cache = {
  /**
   * Read a value from the cache.
   *
   * @param header A small header to identify what is read from the cache.
   */
  read(header: CacheHeader): Promise<Uint8Array | undefined>;
  /**
   * Write a value to the cache.
   *
   * @param header A small header to identify what is written to the cache. This will be used by `read()` to retrieve the data.
   * @param value The value to write to the cache, as a byte array.
   */
  write(header: CacheHeader, value: Uint8Array): Promise<void>;
  /**
   * Indicates whether the cache is writable.
   */
  canWrite: boolean;
  /**
   * If `debug` is toggled, `read()` and `write()` errors are logged to the console.
   *
   * By default, cache errors are silent, because they don't necessarily represent an error condition,
   * but could just be a cache miss, or file system permissions incompatible with writing data.
   */
  debug?: boolean;
};
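With an async interface like this, Promise-only browser storage could be used directly. As an illustration, here is a minimal sketch of a cache backed by the browser's Cache Storage API (`BrowserCache` is a made-up name, and it only type-checks against the proposed async interface, not the current synchronous one):

// `caches` below is the DOM CacheStorage, unrelated to the o1js `Cache` type.
const BrowserCache = (name: string): Cache => ({
  async read({ persistentId, uniqueId }) {
    let store = await caches.open(name);
    // uniqueId is part of the key, so a stale entry is simply a cache miss
    let response = await store.match(`/${persistentId}/${uniqueId}`);
    if (response === undefined) return undefined;
    return new Uint8Array(await response.arrayBuffer());
  },
  async write({ persistentId, uniqueId }, value) {
    let store = await caches.open(name);
    await store.put(`/${persistentId}/${uniqueId}`, new Response(value));
  },
  canWrite: true,
});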
We are not quite sure if this is even possible, considering that this cache is eventually used in a synchronous context:
https://github.com/o1-labs/o1js-bindings/blob/177fb399d85ef4fab10d1ff26670da5a7de59450/crypto/bindings/srs.ts#L100
If this is not possible, we would appreciate any hint on how to use `async` code inside a `Cache` implementation.
Note that we tried to force the `async` code to block by wrapping all promises (e.g. `fetch`) in a busy-wait loop:
function block<T>(p: Promise<T>): T {
  let value: T;
  p.then(v => value = v);
  while (value! === undefined) { /* Unlucky busy wait =( */ }
  return value;
}
Besides blocking the main thread (thus making the UI unresponsive), we did not get any output from the compilation process, which makes us think that we're doing something wrong.
How about fetching the cached values before running `compile()`, and then your `Cache` just provides those values synchronously?
> Note that we tried to force the `async` code to block by wrapping all promises (e.g. `fetch`) in a busy-wait loop:

Busy waiting like that doesn't work, because your busy loop prevents the next microtask from starting, so the promise callback is never triggered. Promises in JS can't be made synchronous.
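For illustration, a minimal sketch of why this deadlocks (plain JS event-loop behavior, nothing o1js-specific):

let done = false;
Promise.resolve().then(() => { done = true; });
// The callback above is queued as a microtask, and microtasks only run once the
// currently executing synchronous code returns, so this loop never exits.
while (!done) { /* spins forever */ }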
> We are not quite sure if this is even possible
It's possible but annoying 😅 Needs another Pickles refactor
> Busy waiting like that doesn't work, because your busy loop prevents the next microtask from starting, so the promise callback is never triggered. Promises in JS can't be made synchronous.

I figured that this would probably not work, but it was worth a try.
> How about fetching the cached values before running `compile()`, and then your `Cache` just provides those values synchronously?
We're exploring this approach, but the cached files total ~2.1 GB, so I don't think we can store them in a `Map` or similar, which means that we need to leverage some kind of storage mechanism:
- Cookies are not supposed to be used like this.
- Web Storage (`localStorage` and `sessionStorage`) only supports up to ~10 MB at best.
This leaves us with `IndexedDB`, the Cache API, and the Origin Private File System. The first two only provide `Promise`-based APIs, which we've already ruled out, and while OPFS does provide a "sync" API (see: https://developer.mozilla.org/en-US/docs/Web/API/File_System_API/Origin_private_file_system#manipulating_the_opfs_from_a_web_worker), it is only available from a Web Worker, and a worker does not expose a "sync" API to the main thread (it relies on message passing).
> We're exploring this approach, but the cached files total ~2.1 GB, so I don't think we can store them in a `Map` or similar
As a first iteration, I'd just store them in memory
> As a first iteration, I'd just store them in memory

To my surprise, Firefox does not complain. It seems like this approach works, so we'll go with it for the time being.
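For anyone landing here, this is roughly the shape of what we ended up with: prefetch every artifact over the network, keep it in memory, and serve it synchronously (a sketch; `files` stands in for the list of `persistentId`s that our compilation actually produces):

async function fetchCache(url: string, files: string[]): Promise<Cache> {
  const data = new Map<string, { uniqueId: string; bytes: Uint8Array }>();
  await Promise.all(
    files.map(async (id) => {
      const uniqueId = await fetch(`${url}/${id}.header`).then((r) => r.text());
      const buffer = await fetch(`${url}/${id}`).then((r) => r.arrayBuffer());
      data.set(id, { uniqueId, bytes: new Uint8Array(buffer) });
    })
  );
  // The returned object is synchronous, so it fits the current o1js `Cache` interface.
  return {
    read({ persistentId, uniqueId }) {
      const entry = data.get(persistentId);
      if (entry === undefined || entry.uniqueId !== uniqueId) return undefined;
      return entry.bytes;
    },
    write() {
      throw Error('not available');
    },
    canWrite: false,
  };
}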
@emlautarom1
See https://discord.com/channels/484437221055922177/1171938451193593856/1172278215637733407 https://github.com/o1-labs/o1js/issues/1252
Thanks @dfstio, we ended up with something quite similar to what was shared in Discord, except that we have 63 files in the cache folder instead of just 10.
The following discussion may also be of interest to you: you can first sign the tx on the web without compiling or proving, and then do the compilation, proving, and sending the tx on your server, subject to Auro Wallet adding a new API method: https://discord.com/channels/484437221055922177/1228326948078489642/1228403957752397905
The changes required in Auro Wallet for this to work are much smaller than a Pickles refactoring.
> Thanks @dfstio, we ended up with something quite similar to what was shared in Discord, except that we have 63 files in the cache folder instead of just 10.
You don't need all the files on the web: some files are created very quickly on the fly, so you can download only a subset of the files generated during compilation. https://discord.com/channels/484437221055922177/1171938451193593856/1174766167982886952
> do the compilation, proving, and sending the tx on your server
For our use case we want to keep as many steps as possible in the browser.