feat: JSON parsing cheatcodes
Component
Forge
Describe the feature you would like
The ability to easily parse JSON files within Solidity is very useful for things like reading deploy script outputs, config files, etc. Here is a proposed spec on how this should be implemented:
Cheatcodes
The `readFile` cheatcode already exists, and we just add a `parseJson` cheatcode which takes a string of JSON and a key, specified using the same syntax as jq. It returns the data of each key as ABI-encoded `bytes`.
This implies `parseJson` will need to infer data types from the JSON to ABI-encode them appropriately. In particular, we need to distinguish between a hex string that's `bytes` (bytes get right-padded) vs. a hex string that's a number (numbers get left-padded). I think ethers.js has a convention for distinguishing these; we should check that convention, use the same one, document it, and make sure it's followed when the JSON output files from scripts are written. I think the convention is `0x1234` for a number and `[0x12, 0x34]` for `bytes`, but am not certain.
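To illustrate why the distinction matters (a sketch, not part of the proposal), the same two bytes ABI-encode very differently depending on the inferred type:

```solidity
// Sketch: the same hex digits encode differently depending on the inferred type.
bytes memory asNumber = abi.encode(uint256(0x1234));
// -> 0x00...001234 (value left-padded to 32 bytes)
bytes memory asBytes = abi.encode(hex"1234");
// -> offset (0x20), length (2), then 0x1234 right-padded to 32 bytes
```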
```solidity
interface Vm {
    // Reads the entire content of file to string, (path) => (data)
    function readFile(string calldata) external returns (string memory);
    // Given a string of JSON, find the provided key, (stringified json) => (ABI-encoded data)
    function parseJson(string calldata json, string calldata key) external view returns (bytes memory);
}
```
forge-std
We'll also add new forge-std helpers to `Test.sol`, as shown below.
```solidity
// Everything shown here is new and not yet present in Test
contract Test {
    // Reading in deployment logs will be common, so let's include helpers for
    // them in forge-std.
    // NOTE: Not all of the below data types are correct, we need to verify them,
    // e.g. I think nonce is really a uint64.
    struct Transaction {
        uint256 txType;
        address from;
        address to;
        uint256 gas;
        uint256 value;
        bytes data;
        uint256 nonce;
    }

    struct TransactionDetail {
        bytes32 hash;
        // e.g. F0 for CREATE, this is called `type` in the output but that's a
        // solidity keyword. We should consider changing that for consistency.
        bytes1 opcode;
        string contractName;
        address contractAddress;
        Transaction transaction;
    }

    struct Receipt {
        bytes32 transactionHash;
        // --- snip, you get the idea ---
    }

    // Read in all deployment transactions.
    function readTransactions(string memory path) internal view returns (TransactionDetail[] memory) {
        string memory deployData = vm.readFile(path);
        bytes memory parsedDeployData = vm.parseJson(deployData, ".transactions[]");
        return abi.decode(parsedDeployData, (TransactionDetail[]));
    }

    // Analogous to readTransactions, but for receipts.
    function readReceipts(string memory path) internal view returns (Receipt[] memory) {
        // --- snip, you get the idea ---
    }

    // Helpers for parsing keys into types. We'd include these for all value types
    // as well as `bytes`, `string`, `uint256[]`, and `int256[]`. Only two are shown below.
    function readUint256(string memory json, string memory key) internal view returns (uint256) {
        return abi.decode(vm.parseJson(json, key), (uint256));
    }

    function readBytes32(string memory json, string memory key) internal view returns (bytes32) {
        return abi.decode(vm.parseJson(json, key), (bytes32));
    }
}
```
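The remaining helpers named in the comment above would follow the same parse-then-decode pattern; for example (helper names hypothetical):

```solidity
// Hypothetical sketches of the other helpers mentioned above, following the
// same pattern as readUint256 and readBytes32.
function readString(string memory json, string memory key) internal view returns (string memory) {
    return abi.decode(vm.parseJson(json, key), (string));
}

function readUint256Array(string memory json, string memory key) internal view returns (uint256[] memory) {
    return abi.decode(vm.parseJson(json, key), (uint256[]));
}
```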
Example Usage
The above would result in the following sample usage.
```solidity
contract MyTest is Test {
    string internal constant deployFile = "broadcast/Deploy.s.sol/10/run-latest.json";

    function myFunction() public {
        // Get all deployment transactions.
        TransactionDetail[] memory transactions = readTransactions(deployFile);

        // Get the name of the first contract deployed.
        string memory deployData = vm.readFile(deployFile);
        string memory contractName = abi.decode(vm.parseJson(deployData, ".transactions[0].contractName"), (string));

        // Get the nonce and transaction hash of the first contract deployed.
        uint256 nonce = readUint256(deployData, ".transactions[0].tx.nonce");
        bytes32 txHash = readBytes32(deployData, ".transactions[0].hash");
    }
}
```
Additional context
No response
Returning it as ABI-encoded bytes is quite clever, and probably the simplest solution to the problem that we do not know how to encode the JSON value.
Also, providing types for transactions (receipts) etc. out of the box is probably a good idea.
A more advanced solution would be to apply some preprocessing, akin to Rust macros, that generates the necessary Solidity glue code for custom data types first.
Where I think this would be interesting/helpful is in working with deployment artifacts. We could then write integration tests(!) and interact with both L1 and L2 nodes using the scripting functionality (AFAIUI).
Bonus if the ABI-coder stuff for JSON can be re-used for https://github.com/foundry-rs/foundry/issues/858 as well (similar thoughts on ABI encoding and passing that)
For implementation, I found this crate that uses C bindings from jq to parse the syntax natively. Is it a dependency we are ok with or do we prefer to implement the parsing natively?
cc @onbjerg @mattsse
There are a number of JSON path crates in Rust as well, e.g. https://docs.rs/jsonpath-rust/latest/jsonpath_rust/
I'd prefer we don't use C bindings as it might complicate our cross platform builds
Note that part of the scope of this issue involves changing broadcast artifacts such that txs/receipts are saved with the correct types to facilitate type inference when reading JSON. For example, gas and nonce should be numbers instead of hex strings, etc.
from @mattsse in https://github.com/foundry-rs/foundry/pull/2217#issuecomment-1175354247
we'd need to do `Value -> Abi types` anyway, I guess
That's a good note @mds1.
I used the crate mentioned by @onbjerg and it seems to be working well. I want to see how to deal with ABI-encoding complex structures, e.g. an array of arrays.
Had some personal matters, but should resume dev from tomorrow.
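For the array-of-arrays case mentioned above, the decode path would presumably look like this (a sketch: it assumes `parseJson` infers and ABI-encodes the nested type, and the file name and `.matrix` key are hypothetical):

```solidity
// Sketch: decoding a JSON array of arrays, assuming parseJson ABI-encodes it
// as uint256[][]. "config.json" and ".matrix" are hypothetical.
string memory json = vm.readFile("config.json");
uint256[][] memory matrix = abi.decode(vm.parseJson(json, ".matrix"), (uint256[][]));
```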
I think that this would be very useful for parsing addresses out of deployment artifacts for chainops. A network could be configured and then the corresponding addresses could be read from json files and then pulled into a script. This is one of the nicer features of hardhat scripts.
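With the proposed cheatcodes, that pattern might look like this (a sketch; the file path and `.MyToken` key are hypothetical):

```solidity
// Sketch: pulling a deployed address from a per-network JSON file into a
// script. The path and ".MyToken" key are hypothetical.
string memory config = vm.readFile("deployments/10.json");
address myToken = abi.decode(vm.parseJson(config, ".MyToken"), (address));
```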
As part of this workstream, I'm also considering the ability to write JSON files.
I am considering the following API:
- Define `string` path to file
- Define `string[]` array of keys
- Define `string[]` array of values (user can use `vm.toString()`)
- Define `bool` overwrite to select append (false) or overwrite (true) if the filename exists
- Pass them to `vm.writeJson(path, keys, values, overwrite)`
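A minimal usage sketch of this proposed API (nothing here is implemented; the signature is the one listed above, `vm.toString` is the existing cheatcode, and the path and values are hypothetical):

```solidity
// Sketch of the proposed vm.writeJson; someAddress and someAmount are
// hypothetical values the user wants to persist.
string[] memory keys = new string[](2);
keys[0] = "receiver";
keys[1] = "amount";
string[] memory values = new string[](2);
values[0] = vm.toString(someAddress);
values[1] = vm.toString(someAmount);
vm.writeJson("output/deploy-config.json", keys, values, true); // true = overwrite
```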
Another way to go about it would be to use `vm.writeFile()` and make a forge-std library to do that. Could be tricky on how to append though.
Thoughts? @mds1 @onbjerg
So keys would be something like `[".foo.bar"]` and values would be `["baz"]`, resulting in `{ foo: { bar: "baz" } }`? Seems ok, but not entirely sure how that would work w/ things like arrays and numbers :thinking:
@onbjerg you are right. It wouldn't work for arbitrary objects and paths, only if you want to add a value at the top level of the json object.
Another idea is the following:
- Append to existing json object
  - read object with `vm.readFile` and `vm.parseJson` into a struct
  - modify struct
  - Define `string[]` array of key names, ordered the same way as the values are ordered in the struct
  - Define `string[]` array of types, ordered the same way as the values are ordered in the struct
- Write new json object
  - Same as above, without first reading a json file into the struct
Example:
```solidity
struct StoreToJson {
    address receiver;
    uint256 amount;
    Transaction transaction; // transaction object from a forge script deployment
}

struct Transaction {
    string sender;
    uint256 nonce;
}

// Key names and types, ordered the same way as the fields of StoreToJson.
string[] memory keys = ["receiver", "amount", "transaction"];
string[] memory types = ["address", "uint256", "tuple(string,uint256)"];
vm.writeJson(abi.encode(store), keys, types); // `store` is a StoreToJson instance
```
I don't love it, but I can't think of something better
The core problem is that we need to find a way to map fields to (name + type); there's no way around this.
One way to solve this would be with helper methods. For example, you'd need one function per type:
```solidity
// One serialize function per type (serde-style pseudocode).
function serialize(MyStruct m, Serializer s) returns (bytes memory) {
    SerializeMap map = s.serialize_map();
    map.serialize_entry("value", m.value);
    // ...
    return map.end();
}
```
With a custom JSON serializer we can then simplify this to
```solidity
function toJson(MyStruct m) returns (bytes memory) {
    return serialize(m, cheats.jsonSerializer());
}
```
where `jsonSerializer` returns a cheatcode contract that has bindings for all the serializer calls and creates the JSON object.
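A hypothetical shape for those bindings (purely illustrative; none of these names exist in the cheatcode interface):

```solidity
// Purely illustrative interfaces for the serializer cheatcode described above;
// nothing here is an implemented API.
interface SerializeMap {
    function serialize_entry(string calldata key, uint256 value) external;
    function serialize_entry(string calldata key, address value) external;
    // ...one overload per supported type...
    function end() external returns (bytes memory);
}

interface Serializer {
    function serialize_map() external returns (SerializeMap);
}
```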
Last week was quite eventful, so I didn't have the time to work on this. @mattsse thanks for the spec. I will riff on that in code and report back.
I recently wrote a small Solidity library (quabi) using jq with `vm.ffi` to parse specific data from contract ABI files. I think the approach here is superior. However, additional helper functions in forge-std would be useful to abstract common parses such as the transaction data or, for example, lists of function selectors from a contract ABI.