Parse returns error instead of terminating with log.Fatalf
This is my attempt at solving #18. Feedback is welcome.
(I used the golang.org/x/tools/cmd/goyacc version of goyacc)
Sorry for being unresponsive, I was on vacation.
I introduced the nested select because I noticed that TestUnexpectedAtSign would sometimes fail when both channels emitted an error: which error message gets returned then depends on a race between the two channels. To my understanding, the errors in l.Errors are all hand-crafted and thus more user-friendly, so they should take priority.
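A minimal sketch of what that prioritisation could look like, assuming the lexer exposes two `chan error` values (the hand-crafted errors and the parser errors); the function and parameter names are illustrative, not the actual fields:

```go
package bibtex

// pickError sketches the nested select: prefer a hand-crafted lexer error
// over a parser error when both channels may have fired, and return nil when
// neither has reported anything.
func pickError(lexErrs, parseErrs chan error) error {
	select {
	case err := <-lexErrs:
		return err // a hand-crafted error is available, prefer it
	default:
		select {
		case err := <-lexErrs:
			return err
		case err := <-parseErrs:
			return err
		default:
			return nil // neither channel reported an error
		}
	}
}
```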
While using this PR in a project, I noticed two remaining issues that need to be addressed before this can get merged:
- Sometimes more than one error is added to either l.ParseErrors or l.Errors, leading to a deadlock, as each channel only has a capacity of one. Is there any reason why we use a channel instead of, e.g., a slice in the first place? Otherwise we could just use a slice and either return all errors or only the first one (something like this: https://gist.github.com/its-luca/0f90ea071e8ce74c2f1005205c71fbda ; see also the slice-based sketch at the end of this comment). The following test reproduces the deadlock:
func TestParser_NoDeadlockOnMultipleErrors(t *testing.T) {
	// this entry raises multiple errors, which leads to a deadlock on capacity-one channels
	badEntry := `@article{badEntry,
	title={Bad Entry Title},
	journal=should Be Quoted,
	}`
	_, err := Parse(strings.NewReader(badEntry))
	if err == nil {
		t.Fatal("Expected parser error, got none")
	}
}
- This is the more serious problem. Somehow, state persists between two parse runs: parsing a malformed entry can cause a subsequent parse run on well-formed input to fail. E.g. this test fails:
// TestParser_ParseGoodEntryAfterBadEntry checks that parsing a bad entry does not influence further calls to Parse.
func TestParser_ParseGoodEntryAfterBadEntry(t *testing.T) {
	goodEntry := `@article{goodEntry,
	title={Good Entry Title},
	}`
	badEntry := `@article{badEntry,
	title={Bad Entry Title},
	journal=should Be Quoted,
	}`
	_, err := Parse(strings.NewReader(badEntry))
	if err == nil {
		t.Fatal("Expected parser error, got none")
	}
	t.Logf("Done with badEntry")
	_, err = Parse(strings.NewReader(goodEntry))
	if err != nil {
		t.Fatalf("Did not expect parser error but got %v", err)
	}
}
(Note that this requires you to either increase the capacity of the ParseErrors and Errors channels in lexer.go or use the patch referenced in the gist; otherwise you run into the deadlock described above.)
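For reference, here is a rough sketch of the slice-based alternative from the first point, in the spirit of the gist linked above (the type and method names are made up for illustration, not the package's actual API):

```go
package bibtex

import "fmt"

// errorList collects parse errors in a slice instead of capacity-one
// channels. Appending never blocks, so reporting several errors cannot
// deadlock the parser.
type errorList struct {
	errs []error
}

// Add records an error without blocking.
func (l *errorList) Add(err error) {
	l.errs = append(l.errs, err)
}

// First returns the first recorded error (usually the most user-friendly
// one), or nil if parsing succeeded.
func (l *errorList) First() error {
	if len(l.errs) == 0 {
		return nil
	}
	return l.errs[0]
}

// All folds every recorded error into one, if returning all of them is
// preferred over only the first.
func (l *errorList) All() error {
	if len(l.errs) == 0 {
		return nil
	}
	return fmt.Errorf("%d parse error(s), first one: %w", len(l.errs), l.errs[0])
}
```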
> Is there any reason why we use a channel instead of, e.g., a slice in the first place?
The yacc-generated parser doesn't return the idiomatic (token, error), which is why we set up a separate channel as a back-channel for the lexer to return errors. I'm not against putting them in a slice though, it makes sense.
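For context, this is roughly the constraint goyacc imposes. The yyLexer interface below mirrors what goyacc generates for the default yy prefix; the lexer struct, its ParseErrors field, and the placeholder yySymType are only stand-ins for this package's real (partly generated) types:

```go
package bibtex

import "fmt"

// yySymType is normally generated by goyacc; an empty placeholder stands in
// here so the sketch compiles on its own.
type yySymType struct{}

// yyLexer mirrors the interface the generated parser expects. Neither method
// can return an error value, which is why errors leave through a back-channel
// (or, going forward, a slice).
type yyLexer interface {
	Lex(lval *yySymType) int // produce the next token, 0 at end of input
	Error(s string)          // called by the generated parser on a syntax error
}

// lexer is an illustrative stand-in for the package's real lexer.
type lexer struct {
	// A too-small buffer here is what causes the deadlock discussed above.
	ParseErrors chan error
}

// Lex is stubbed out; the real lexer tokenises the bibtex input.
func (l *lexer) Lex(lval *yySymType) int { return 0 }

// Error forwards the parser's message out of band so Parse can return it.
func (l *lexer) Error(s string) {
	l.ParseErrors <- fmt.Errorf("parse failure: %s", s)
}
```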
I've updated the error reporting and handling -- if you wouldn't mind merging those changes here and seeing if that works with your change to return an error :pray:
Thanks. I merged your changes. I still need to figure out why the error state persists between parser runs, though.
I don't necessarily think it's state persisting between parser runs. Even if you ignore the errors, as in `v, _ := bib.GetStringVar($3)`, you still get the same issue of failing to parse valid bibtex. I think the function signature returning (v, error) is interfering with yacc.
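If the (v, error) signature really is what trips up the generated code, one option might be to keep the signature but report the error through the lexer's Error method inside the grammar action, since (if I remember goyacc correctly) yylex is in scope in action code. This is only a hypothetical sketch: errorReporter and reportErr don't exist in the package, and GetStringVar's exact return type is an assumption.

```go
package bibtex

// errorReporter mirrors the Error method goyacc requires of its lexer.
type errorReporter interface {
	Error(s string)
}

// reportErr is a hypothetical helper for grammar actions: instead of
// discarding the error as in `v, _ := bib.GetStringVar($3)`, an action could
// write `$$ = reportErr(yylex, bib.GetStringVar($3))` and let the error
// travel through the lexer, which the generated parser already understands.
// The generic return type is a guess at whatever GetStringVar returns.
func reportErr[T any](l errorReporter, v T, err error) T {
	if err != nil {
		l.Error(err.Error())
	}
	return v
}
```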