
Improve `PPViewHashesDontMatch` to include the data for the expected hash

Open · michele-nuzzi opened this issue 1 year ago · 7 comments

Since the introduction of PlutusV3, the script_data_hash calculation broke for some downstream tools, including plu-ts.

I am querying the V3 cost models directly from a fully synced cardano-node on SanchoNet, as follows:

cardano-cli query protocol-parameters --$sancho | jq .costModels.PlutusV3

which gives me back the following array:

[100788,420,1,1,1000,173,0,1,1000,59957,4,1,11183,32,201305,8356,4,16000,100,16000,100,16000,100,16000,100,16000,100,16000,100,100,100,16000,100,94375,32,132994,32,61462,4,72010,178,0,1,22151,32,91189,769,4,2,85848,123203,7305,-900,1716,549,57,85848,0,1,1,1000,42921,4,2,24548,29498,38,1,898148,27279,1,51775,558,1,39184,1000,60594,1,141895,32,83150,32,15299,32,76049,1,13169,4,22100,10,28999,74,1,28999,74,1,43285,552,1,44749,541,1,33852,32,68246,32,72362,32,7243,32,7391,32,11546,32,85848,123203,7305,-900,1716,549,57,85848,0,1,90434,519,0,1,74433,32,85848,123203,7305,-900,1716,549,57,85848,0,1,1,85848,123203,7305,-900,1716,549,57,85848,0,1,955506,213312,0,2,270652,22588,4,1457325,64566,4,20467,1,4,0,141992,32,100788,420,1,1,81663,32,59498,32,20142,32,24588,32,20744,32,25933,32,24623,32,43053543,10,53384111,14333,10,43574283,26308,10,16000,100,16000,100,962335,18,2780678,6,442008,1,52538055,3756,18,267929,18,76433006,8868,18,52948122,18,1995836,36,3227919,12,901022,1,166917843,4307,36,284546,36,158221314,26549,36,74698472,36,333849714,1,254006273,72,2174038,72,2261318,64571,4,207616,8310,4,1293828,28716,63,0,1,1006041,43623,251,0,1]
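One thing worth keeping in mind when working with this array: the cost model parameters are positional, so the ledger pairs each value with a builtin purely by index. Any tool that reorders, renames, or drops entries will produce a different language-views encoding and therefore a different hash. A minimal sanity check on the jq output (a hypothetical sketch; `check_cost_model` is not a real API):

```python
import json

def check_cost_model(raw_json: str) -> list[int]:
    """Parse a cost model array (as printed by `jq .costModels.PlutusV3`)
    and confirm it is a flat, order-sensitive list of integers."""
    params = json.loads(raw_json)
    assert isinstance(params, list), "cost model must be a JSON array"
    assert all(isinstance(p, int) for p in params), "all parameters must be integers"
    # NOTE: order matters -- the ledger pairs these values with builtins by
    # position, so sorting or shuffling them silently changes the hash.
    return params

# A tiny illustrative slice (the real PlutusV3 array has a few hundred entries).
sample = check_cost_model("[100788, 420, 1, 1, 1000]")
```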

After checking with the Plutus team, I'm confident in saying that these parameters are preserved.

I have a simple transaction with a single redeemer and no datums.

The only redeemer is the following:

{
    "tag": "Spend",
    "index": 0,
    "execUnits": {
        "steps": "1922100",
        "memory": "10500"
    },
    "data": {
        "int": "0"
    }
}

With Conway allowing multiple representations of the redeemer witnesses, I'm testing both, even though I'm aware that the intended representation should be "as-is" from the CBOR; the tool used to build the transaction is the same one used for testing.
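For context on why the two representations can never agree: in the Conway CDDL, redeemer witnesses may be serialized either as an array of `[tag, index, data, ex_units]` entries or as a map from `[tag, index]` to `[data, ex_units]`, and the two forms produce different bytes, so they hash differently. A sketch of both encodings for the single redeemer above, using a hand-rolled minimal CBOR encoder (stdlib only; helper names are illustrative):

```python
def cbor_uint(n: int, major: int = 0) -> bytes:
    """Encode an unsigned integer, or a header for the given CBOR major type."""
    mt = major << 5
    if n < 24:
        return bytes([mt | n])
    if n < 0x100:
        return bytes([mt | 24, n])
    if n < 0x10000:
        return bytes([mt | 25]) + n.to_bytes(2, "big")
    return bytes([mt | 26]) + n.to_bytes(4, "big")

def arr(n): return cbor_uint(n, major=4)   # array header
def mp(n):  return cbor_uint(n, major=5)   # map header

tag, index, data, mem, steps = 0, 0, 0, 10500, 1922100
ex_units = arr(2) + cbor_uint(mem) + cbor_uint(steps)

# Map form: { [tag, index] => [data, ex_units] }
as_map = mp(1) + (arr(2) + cbor_uint(tag) + cbor_uint(index)) \
               + (arr(2) + cbor_uint(data) + ex_units)

# Array form: [ [tag, index, data, ex_units] ]
as_array = arr(1) + (arr(4) + cbor_uint(tag) + cbor_uint(index)
                            + cbor_uint(data) + ex_units)

print(as_map.hex())    # a18200008200821929041a001d5434
print(as_array.hex())  # 8184000000821929041a001d5434
```

Note that these two byte strings are exactly the leading bytes of exhibits 1 and 2 below, which confirms both exhibits encode the same redeemer, just in the two different forms.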

The expected hash, as reported in the error message, is:

9a0726be60d68003c541953ded2489526eedc607cd07622c68366a64c264038b

yet nothing I try produces the same result:

exhibit 1: redeemers as map

input data:

a18200008200821929041a001d5434a10298fb1a000189b41901a401011903e818ad00011903e819ea350401192baf18201a000312591920a404193e801864193e801864193e801864193e801864193e801864193e80186418641864193e8018641a000170a718201a00020782182019f016041a0001194a18b2000119568718201a0001643519030104021a00014f581a0001e143191c893903831906b419022518391a00014f580001011903e819a7a90402195fe419733a1826011a000db464196a8f0119ca3f19022e011999101903e819ecb2011a00022a4718201a000144ce1820193bc318201a0001291101193371041956540a197147184a01197147184a0119a9151902280119aecd19021d0119843c18201a00010a9618201a00011aaa1820191c4b1820191cdf1820192d1a18201a00014f581a0001e143191c893903831906b419022518391a00014f5800011a0001614219020700011a000122c118201a00014f581a0001e143191c893903831906b419022518391a00014f580001011a00014f581a0001e143191c893903831906b419022518391a00014f5800011a000e94721a0003414000021a0004213c19583c041a00163cad19fc3604194ff30104001a00022aa818201a000189b41901a401011a00013eff182019e86a1820194eae182019600c1820195108182019654d182019602f18201a0290f1e70a1a032e93af1937fd0a1a0298e40b1966c40a193e801864193e8018641a000eaf1f121a002a6e06061a0006be98011a0321aac7190eac121a00041699121a048e466e1922a4121a0327ec9a121a001e743c18241a0031410f0c1a000dbf9e011a09f2f6d31910d318241a0004578218241a096e44021967b518241a0473cee818241a13e62472011a0f23d40118481a00212c5618481a0022814619fc3b041a00032b00192076041a0013be0419702c183f00011a000f59d919aa6718fb0001

resulting hash:

3faebd87a3f87898331d155f146aeaa5c5ce1815285f47d3fdb57d73f36bd861

exhibit 2: redeemers as array

8184000000821929041a001d5434a10298fb1a000189b41901a401011903e818ad00011903e819ea350401192baf18201a000312591920a404193e801864193e801864193e801864193e801864193e801864193e80186418641864193e8018641a000170a718201a00020782182019f016041a0001194a18b2000119568718201a0001643519030104021a00014f581a0001e143191c893903831906b419022518391a00014f580001011903e819a7a90402195fe419733a1826011a000db464196a8f0119ca3f19022e011999101903e819ecb2011a00022a4718201a000144ce1820193bc318201a0001291101193371041956540a197147184a01197147184a0119a9151902280119aecd19021d0119843c18201a00010a9618201a00011aaa1820191c4b1820191cdf1820192d1a18201a00014f581a0001e143191c893903831906b419022518391a00014f5800011a0001614219020700011a000122c118201a00014f581a0001e143191c893903831906b419022518391a00014f580001011a00014f581a0001e143191c893903831906b419022518391a00014f5800011a000e94721a0003414000021a0004213c19583c041a00163cad19fc3604194ff30104001a00022aa818201a000189b41901a401011a00013eff182019e86a1820194eae182019600c1820195108182019654d182019602f18201a0290f1e70a1a032e93af1937fd0a1a0298e40b1966c40a193e801864193e8018641a000eaf1f121a002a6e06061a0006be98011a0321aac7190eac121a00041699121a048e466e1922a4121a0327ec9a121a001e743c18241a0031410f0c1a000dbf9e011a09f2f6d31910d318241a0004578218241a096e44021967b518241a0473cee818241a13e62472011a0f23d40118481a00212c5618481a0022814619fc3b041a00032b00192076041a0013be0419702c183f00011a000f59d919aa6718fb0001

resulting hash:

06c087dd394976edf9806e981ddec70f6f6aba24b63db9b5061cd440b7f3bd82
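For anyone reproducing these numbers: the script integrity hash itself is just a 32-byte BLAKE2b digest over the concatenated preimage bytes (redeemers, datums if present, and the language views), so given a candidate preimage the hashing step is trivial; the hard part is assembling the preimage bytes correctly. A minimal sketch (the function name is mine, not a ledger API):

```python
import hashlib

def script_integrity_hash(preimage: bytes) -> str:
    """BLAKE2b-256 digest of the script-data preimage
    (redeemers ++ datums ++ language views), hex-encoded."""
    return hashlib.blake2b(preimage, digest_size=32).hexdigest()

# Hashing the hex "input data" from exhibit 1 above would then be:
#   script_integrity_hash(bytes.fromhex("a1820000...fb0001"))
```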

michele-nuzzi avatar Jul 30 '24 18:07 michele-nuzzi

In addition to golden tests, since this issue is likely to reappear each time new builtins are added or costs are modified, it would be great if the error message could report the input data used to arrive at the final, expected hash.

michele-nuzzi avatar Jul 30 '24 18:07 michele-nuzzi

@lehins tagging you to get this on your radar.

This is a major blocker on my side, but of course, I understand there is a lot of stuff going on other than this.

michele-nuzzi avatar Aug 01 '24 05:08 michele-nuzzi

The script hash mechanism has not changed in Conway, so I am not sure what exactly the problem you are experiencing is. It is worth noting that how the redeemers are represented is not relevant: we use the original bytes that were submitted over the wire for the script integrity hash computation.

In addition to golden tests, since this issue is likely to reappear each time new builtins are added or costs are modified

Golden tests are not gonna help in this case, since, as I mentioned, the algorithm has not changed, while cost models can change at any point. Are you sure you are using the correct cost model for computing the hash?

It would be great if the error message could report the input data used to get to the final, expected hash.

Reporting the original bytes that were used to compute the correct script integrity hash is definitely a good idea, since that would make debugging issues like this much easier. Unfortunately, we won't be able to add this feature until the next era, or at the earliest the next intra-era hard fork, since we can't change predicate failures at an arbitrary point.

If you include the offending transaction as hex-encoded CBOR, I could assist a little better; until then, I can't really tell what is giving you trouble.

lehins avatar Aug 03 '24 00:08 lehins

Without fail, PPViewHashes is what causes the most headaches in every hard fork going back multiple years, whether "the script hash mechanism" officially changes or not. Specifically, the mechanism might not change, but the serialization formats for the relevant pieces of data keep changing: switching from arrays to maps, adding useless 258 CBOR wrappers, or changing the conventional ordering of cost model parameters.

There are just too many opaque encoding subtleties, so IMO extra effort should be put into surfacing what's going wrong. If the predicate failures themselves can't be changed, couldn't something in the mini-protocol code catch this error and augment it with the appropriate data to return as part of the error string?

I just spent two hours, for example, debugging an issue because someone decided that the new cost models shouldn't be sorted alphabetically 😅. This might be well known to most, but it was very surprising and confusing for me without access to the cost models the node was expecting to use.
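To illustrate the kind of subtlety being described here: wrapping a collection in CBOR tag 258 (the "set" tag) prepends three bytes to the encoding, so two semantically identical structures serialize, and therefore hash, differently. A small stdlib-only sketch:

```python
import hashlib

# Plain CBOR array [1, 2, 3]: header 0x83 followed by the three small ints.
plain = bytes([0x83, 0x01, 0x02, 0x03])

# The same array wrapped in tag 258: major type 6 with a 2-byte argument
# encodes as 0xd9 0x01 0x02, then the array bytes follow unchanged.
tagged = bytes([0xD9, 0x01, 0x02]) + plain

print(plain.hex())   # 83010203
print(tagged.hex())  # d9010283010203

# Same logical set, different bytes => different BLAKE2b-256 digests.
h = lambda b: hashlib.blake2b(b, digest_size=32).hexdigest()
assert h(plain) != h(tagged)
```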

Quantumplation avatar Jan 17 '25 21:01 Quantumplation

@Quantumplation We could add a cardano-cli command that produces the expected data in CBOR format. All that is necessary for this to work is the offending transaction and the current protocol parameters, which cardano-cli could automatically query from a running node or accept as an argument to the CLI command.

This way it could serve as a useful debugging tool whenever PPViewHashesDontMatch is encountered, until we actually take care of this ticket, which will be a while since it needs a hard fork.

What do you think?

lehins avatar Jan 18 '25 02:01 lehins

That could be a suitable compromise; mostly I'm just hoping we don't change the serialization formats again, but if I can't get that, then the cli tool would be useful!

Quantumplation avatar Jan 18 '25 02:01 Quantumplation

I'm just hoping we don't change the serialization formats again

That's the only reason why we can't change PPViewHashesDontMatch without a hard fork: it will require a change to serialization.

So, the correct solution is to introduce a new predicate failure that, compared to PPViewHashesDontMatch, has an extra field with the original data, and to produce that new predicate failure starting with protocol version 11. If that protocol version turns out to be the next era, then we will redefine all predicate failures from scratch anyway.

If we did not do that and changed the predicate failure without a hard fork, it would break the node-to-client protocol, and lots of people would become unhappy 😄

lehins avatar Jan 18 '25 07:01 lehins

Looks like we might be able to get this done for the next intra-era hard fork.

lehins avatar Jul 15 '25 20:07 lehins