Migrating generic parser test to 0.10.1
Hi all,
Previously I was able to use a generic test function which executed all my parsing tests. Here's the rough idea:
fn execute_parse_test<O>(
    input: impl AsRef<str>,
    parser: impl Parser<Token, O, Error = Simple<Token>>,
    expected: O,
) where
    O: Debug + Display + PartialEq + Hash,
{
    let (range, tokens) = create_tokens(input.as_ref());
    let input_ast = parser
        .parse(Stream::from_iter(range, tokens.into_iter()))
        .expect("Failed to parse input");
    // Perform tests...
}
In 0.10.1, I'm struggling to get this function to type-check. Here's my current attempt:
fn execute_parse_test<'a, I, O>(
    input: &'a str,
    parser: impl Parser<'a, I, O, extra::Err<Rich<'a, Token>>>,
    expected: O,
) where
    O: Debug + Display + PartialEq + Hash,
    I: ValueInput<'a, Token = Token, Span = SimpleSpan>,
{
    let tokens = lex(input);
    let token_stream =
        Stream::from_iter(tokens).map((0..input.len()).into(), |(t, s): (_, _)| (t, s.into()));
    let input_ast: O = parser
        .parse(token_stream)
        .into_result()
        .expect("Failed to parse input");
    // ...
}
Calling my parsers on a stream built like this works in my library code, but not in this test code.
The error I'm getting is:
error[E0308]: mismatched types
--> lib\spthy-parser\src\parse\parse_tests.rs:66:16
|
54 | fn execute_parse_test<'a, I, O>(
| - expected this type parameter
...
64 | Stream::from_iter(tokens).map((0..input.len()).into(), |(t, s): (_, _)| (t, s.into()));
| ---------------- the found closure
65 | let input_ast: O = parser
66 | .parse(token_stream)
| ----- ^^^^^^^^^^^^ expected type parameter `I`, found `MappedInput<Token, _, Stream<...>, ...>`
| |
| arguments to this method are incorrect
|
::: lib\spthy-parser\src\lexer.rs:167:36
|
167 | pub(crate) fn lex(source: &str) -> impl Iterator<Item = (Token, Range<usize>)> + '_ {
| ------------------------------------------------ the found opaque type
|
= note: expected type parameter `I`
found struct `MappedInput<lexer::Token, _, Stream<impl Iterator<Item = (lexer::Token, std::ops::Range<usize>)> + '_>, {closure@lib\spthy-parser\src\parse\parse_tests.rs:64:64: 64:80}>`
help: the return type of this call is `MappedInput<lexer::Token, _, Stream<impl Iterator<Item = (lexer::Token, std::ops::Range<usize>)> + '_>, {closure@lib\spthy-parser\src\parse\parse_tests.rs:64:64: 64:80}>` due to the type of the argument passed
--> lib\spthy-parser\src\parse\parse_tests.rs:65:24
|
65 | let input_ast: O = parser
| ________________________^
66 | | .parse(token_stream)
| |________________------------^
| |
| this argument influences the return type of `parse`
note: method defined here
--> C:\Users\nikme\.cargo\registry\src\index.crates.io-1949cf8c6b5b557f\chumsky-0.10.1\src\lib.rs:343:8
|
343 | fn parse(&self, input: I) -> ParseResult<O, E::Error>
| ^^^^^
At this point I'm stuck. As far as I understand the documentation and chumsky's source, MappedInput<_, _, _, _> should implement ValueInput<...> here. I've looked at #767, but I don't think it's the same problem.
I've tried turning the function into a macro, which works, but compile times then blow up and my tests become unusable.
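For completeness, the macro amounts to the function body above stamped out at every call site, roughly like this (a sketch, not my exact code):
macro_rules! execute_parse_test {
    ($input:expr, $parser:expr, $expected:expr) => {{
        // The body is re-expanded, and the input type re-inferred, at every
        // call site; each expansion is type-checked from scratch, which is
        // also why the builds get so slow.
        let src: &str = $input;
        let token_stream = Stream::from_iter(lex(src))
            .map((0..src.len()).into(), |(t, s): (_, _)| (t, s.into()));
        let input_ast = $parser
            .parse(token_stream)
            .into_result()
            .expect("Failed to parse input");
        assert_eq!(input_ast, $expected);
        // Perform tests...
    }};
}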
For reference, here is how I call my top-level parser in the actual library code:
pub(crate) fn parse<'a>(src: &'a str) -> Result<..., Vec<Rich<'a, Token>>> {
    let token_iter = lex(src);
    let token_stream =
        Stream::from_iter(token_iter).map((0..src.len()).into(), |(t, s): (_, _)| (t, s.into()));
    my_parser().parse(token_stream).into_result()
}
Happy for any input on how to solve this!
Thanks a lot in advance!
The problem is that I is a type parameter: the parser you're handed expects whatever specific I its caller picked when instantiating it, but inside the function you're feeding it an input type that you construct yourself, and the compiler can't unify the two. Have you tried getting rid of the I type parameter and making the input type explicit instead?
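In case a concrete shape helps: the MappedInput returned by .map is awkward to name explicitly because its last type parameter is a closure type, so one way to act on that suggestion is to build the input outside the test function, hide it behind impl ValueInput, and pass it in; then I is fixed at the call site, where the parser is instantiated too. A minimal sketch, assuming your lex, Token, and my_parser; the helper name make_input is made up:
use std::fmt::{Debug, Display};
use std::hash::Hash;

use chumsky::input::{Stream, ValueInput};
use chumsky::prelude::*;

// Build the input once, hiding the unnameable MappedInput type behind
// impl ValueInput. (make_input is a hypothetical helper, not chumsky API.)
fn make_input<'src>(src: &'src str) -> impl ValueInput<'src, Token = Token, Span = SimpleSpan> {
    Stream::from_iter(lex(src)).map((0..src.len()).into(), |(t, s): (_, _)| (t, s.into()))
}

// The test function now receives the input instead of constructing it, so
// the caller chooses I and instantiates the parser against that same choice.
fn execute_parse_test<'src, I, O>(
    input: I,
    parser: impl Parser<'src, I, O, extra::Err<Rich<'src, Token>>>,
    expected: O,
) where
    I: ValueInput<'src, Token = Token, Span = SimpleSpan>,
    O: Debug + Display + PartialEq + Hash,
{
    let input_ast: O = parser
        .parse(input)
        .into_result()
        .expect("Failed to parse input");
    assert_eq!(input_ast, expected);
    // Perform further tests...
}
A call then looks like execute_parse_test(make_input(src), my_parser(), expected). Both arguments are instantiated at the same call site, so the compiler can unify I with the concrete (opaque) input type; inside your old function body there was no caller left to make that choice, which is exactly what E0308 was complaining about.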