Jessie meets lua/go?
The Agoric platform runs contracts on XS, a JavaScript engine designed for embedded systems. It was chosen for its support of deterministic execution and its affinity for Hardened JS, not for its speed. Contracts spend significant time marshalling data into messages and back out, as well as validating data against patterns.
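A minimal sketch of that hot path, using @endo/marshal and @endo/patterns (method names like `toCapData` vary across Endo versions, and the `offer` record here is invented):

```js
import 'ses';
import { makeMarshal } from '@endo/marshal';
import { M, mustMatch } from '@endo/patterns';

lockdown(); // freeze primordials; makes the global `harden` usable

const marshaller = makeMarshal(); // pure-data use: no slot converters needed
const offer = harden({ give: 100n, want: 'BLD' });

// Marshal into a message and back out...
const capData = marshaller.toCapData(offer);
const roundTripped = marshaller.fromCapData(capData);

// ...then validate the result against a pattern.
const OfferShape = M.splitRecord({ give: M.nat(), want: M.string() });
mustMatch(roundTripped, OfferShape);
```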
JavaScript (especially the Jessie dialect) has a lot in common with Lua.
"Why Go Rocks for Building a Lua Interpreter" explains a number of performance techniques that seem compatible with deterministic execution.
I took note of this nugget:
> Freezing prevents unintentional global state and permits sharing Lua data values among concurrent lua.State interpreters without copying.
How many 10x speedups might that lend to the Agoric platform?
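Hardened JS leans on the same trick: one deep freeze, then share without copying. A tiny sketch, assuming an SES environment where `harden` is available:

```js
// One harden() call deep-freezes the whole structure; after that it can be
// shared across compartments or vats without defensive copies.
const rates = harden({ base: 5n, tiers: [1n, 10n, 100n] });

// In strict mode, mutation attempts throw rather than fail silently:
// rates.tiers.push(1000n); // TypeError: object is not extensible
```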
Tantek's polyglot CASSIS makes serious use of the intersection of JS and PHP. I wonder to what extent the JS/Lua overlap could be used to bootstrap testing of a port of the zb Lua interpreter to JavaScript (see the sketch after this list):
- start with a handful of cases: some JSON data, a few Justin expressions
  a. stay within the subset where evaluation semantics agree
  b. expand out to a few cases where they differ; have a bot fix the code to match JS semantics
- have a bot generate a few hundred cases and separate them into those where the semantics agree and those where they differ; have the bot continue to port the code
- grab cases from test262; progress from the JSON subset to Justin and Jessie
- full fuzzing of Jessie
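A rough sketch of the harness for the first step, where `evalJustin` and `evalLua` are hypothetical stand-ins for the two evaluators:

```js
// Differential testing: evaluate each source text in both languages and
// bucket the cases by whether the results agree.
const bucketCases = (sources, evalJustin, evalLua) => {
  const agree = [];
  const differ = [];
  for (const src of sources) {
    const js = evalJustin(src);
    const lua = evalLua(src);
    (Object.is(js, lua) ? agree : differ).push({ src, js, lua });
  }
  return { agree, differ };
};

// Seed cases: the first two agree; the third differs, since JS
// concatenates '"1" + 1' to "11" while Lua (5.3+) coerces it to 2.
const seeds = ['1 + 2', '0.1 + 0.2', '"1" + 1'];
```

The `differ` bucket is exactly where the bot would get asked to patch the port toward JS semantics.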
in-memory representation: protobuf? cap'n proto?
Protobuf interop would be really nice, for Cosmos explorers and such.
But I suspect we'd grind our gears trying to fit object references in there. Cap'n Proto, on the other hand, does this for breakfast.
And there are industrial-strength applications of zero-copy protobuf RPC. (Darn! I can't find that thing now.)
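For contrast, here's roughly how Endo's CapData keeps object references out of the data: the body is pure text and `slots` carries the reference identifiers, much like a Cap'n Proto capability table. The particular encoding and slot name below are illustrative, not canonical:

```js
// CapData separates pure data (body) from object references (slots); a
// reference in the body is an index into the slots array, so the data can
// be stored or validated without dereferencing any capability.
const capData = {
  body: '#{"brand":"$0.Alleged: BLD brand","value":"+100"}',
  slots: ['board0371'], // slot 0: external identifier for the brand
};
```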
See also:
- https://github.com/endojs/endo/discussions/2090
Spritely / Oaken connection?
I wonder to what extent the parsing-to-instructions approach applies in Guile Scheme.
see also:
verified parse a la proof-carrying code?
To spare consumers the cost of parsing without asking them to extend more trust, I wonder if a proof-of-correct parse would be economical.
using Idris? Lean?
There's that Lean checker in Rust...
hm... rust and wasm get along nicely... another story altogether?
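Whatever the prover, the artifact a consumer checks might be shaped like this Lean sketch, where `Ast` and `parse` are hypothetical stand-ins for a reference grammar and parser specification:

```lean
axiom Ast : Type
axiom parse : String → Option Ast

-- A certificate pairs the AST with a proof that it is exactly what the
-- reference parser yields, so the consumer checks the proof rather than
-- re-running the parser.
structure ParseCert (s : String) where
  ast : Ast
  ok : parse s = some ast
```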
Patterns and Schemas
Protobuf and Cap'n Proto tend to use structs with numbered fields; the field names, if any, are not carried with the data.
In Endo/Agoric, JSON-style objects, i.e. records with named fields, are much more common. As noted, we use patterns to validate them. We've discussed using pattern-based compression:
I wonder about using M.tagged(...) patterns to connect patterns with schemas... or something.
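A speculative sketch of that "or something", assuming `M.tagged(tagName, payloadPattern)` works the way I imagine, with `makeTagged` from @endo/pass-style and an invented 'Offer:v1' naming convention:

```js
import { makeTagged } from '@endo/pass-style';
import { M, mustMatch } from '@endo/patterns';

// The record carries its schema name as the tag of a CopyTagged...
const offer = makeTagged('Offer:v1', harden({ give: 100n, want: 'BLD' }));

// ...and the pattern pins both the tag and the payload shape, so the tag
// could double as a key into a numbered-field schema registry.
const OfferShape = M.tagged(
  'Offer:v1',
  M.splitRecord({ give: M.nat(), want: M.string() }),
);
mustMatch(offer, OfferShape);
```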