ion-rust
Update conformance testing DSL to support unknown symbols
Issue #, if available: n/a
Description of changes: This PR adds missing symbol notation for Toplevel and Produces clauses.
This turned out to be much simpler than I had originally thought. My first few passes through the spec left me with the understanding that the notation could define symbol mappings, which would have to be created while evaluating the fragments and supported when evaluating expectations. Unfortunately, I didn't realize my misunderstanding until after I had implemented the symbol-mapping creation.
The functionality now allows symbol IDs to be specified as quoted symbols, using the `#$` notation to identify such symbol IDs.
Details
When the Toplevel fragment is triggered to encode its contents into a new document fragment, it uses ProxyElement to wrap each of the Ion elements defined within its body. ProxyElement implements WriteAsIon in order to intercept any symbols being written and detect any `#$`-prefixed symbols, so that it can parse out the symbol table name and offset. Once parsed, it writes the symbol ID to the document fragment.
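As a rough illustration of the parsing step described above, the sketch below splits a `#$`-prefixed symbol into either an absolute symbol ID or a symbol-table name plus offset. The `SymbolRef` enum, `parse_symbol` function, and the exact accepted syntax are illustrative assumptions, not ion-rust's actual API.

```rust
/// Hypothetical result of parsing a symbol from the conformance DSL.
#[derive(Debug, PartialEq)]
enum SymbolRef {
    /// Plain symbol text, no `#$` prefix.
    Text(String),
    /// Absolute symbol ID, e.g. `#$10`.
    Sid(usize),
    /// Symbol-table-relative reference, e.g. `#$ion::3` (assumed syntax).
    Relative { symtab: String, offset: usize },
}

fn parse_symbol(text: &str) -> Result<SymbolRef, String> {
    // Symbols without the prefix pass through unchanged.
    let Some(rest) = text.strip_prefix("#$") else {
        return Ok(SymbolRef::Text(text.to_string()));
    };
    if let Some((symtab, offset)) = rest.split_once("::") {
        // `#$<symtab>::<offset>`: resolve against a named symbol table.
        let offset = offset
            .parse::<usize>()
            .map_err(|e| format!("invalid offset: {e}"))?;
        Ok(SymbolRef::Relative { symtab: symtab.to_string(), offset })
    } else {
        // `#$<n>`: an absolute symbol ID.
        let sid = rest.parse::<usize>().map_err(|e| format!("invalid SID: {e}"))?;
        Ok(SymbolRef::Sid(sid))
    }
}

fn main() {
    assert_eq!(parse_symbol("name"), Ok(SymbolRef::Text("name".to_string())));
    assert_eq!(parse_symbol("#$10"), Ok(SymbolRef::Sid(10)));
    assert_eq!(
        parse_symbol("#$ion::3"),
        Ok(SymbolRef::Relative { symtab: "ion".to_string(), offset: 3 })
    );
}
```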
When a Produces clause is evaluated, it walks through its body of Ion elements while reading the concatenated Ion document. As each element is evaluated for equality, any symbols found in the Produces body are also tested for the `#$` prefix, and their offsets and symbol tables are extracted.
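The equality step above might look roughly like the following self-contained sketch: a `#$`-prefixed symbol in the expectation is matched against the symbol ID read from the document, while plain symbols are matched by text. `ReadSymbol` and `symbols_match` are illustrative names, not ion-rust's actual types.

```rust
/// A symbol as read from the encoded document: its resolved text (if any,
/// since the PR is about symbols with unknown text) plus its raw symbol ID.
struct ReadSymbol {
    text: Option<String>,
    sid: usize,
}

/// Compare an expected symbol from the Produces body against a read symbol.
fn symbols_match(expected: &str, actual: &ReadSymbol) -> bool {
    if let Some(sid_text) = expected.strip_prefix("#$") {
        // `#$N` in the expectation means "match by symbol ID".
        sid_text.parse::<usize>().map_or(false, |sid| sid == actual.sid)
    } else {
        // Otherwise match by resolved symbol text.
        actual.text.as_deref() == Some(expected)
    }
}

fn main() {
    let unknown = ReadSymbol { text: None, sid: 10 };
    let named = ReadSymbol { text: Some("name".to_string()), sid: 4 };
    assert!(symbols_match("#$10", &unknown));
    assert!(symbols_match("name", &named));
    assert!(!symbols_match("#$10", &named));
}
```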
In addition to the Toplevel and Produces support, the Denotes clause was updated to use a reader rather than the Element API, so that it can access the symbol IDs needed to complete its data-model tests.
Additionally
I also used this PR to follow up on #828, rolling the typed-null handling into a `read_null` function and resetting the `ion_type` back to its original value for consistency. I'll follow up with a PR to return the underlying type from `ion_type` for all value implementations.
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.