happy
Mapping multiple tokens into one
Is there a reason why different tokens cannot be mapped to a single one using the %token
directive? This would be particularly useful for spelling variants of keywords when the lexer is not sophisticated, and it would eliminate the need to handle them in the grammar. Here's an example:
%token
  centered { TIdentifier "centered" }
  centered { TIdentifier "centred" }
Have you tried it? If it works, it's a feature!
I did. It complains that 'centered' is listed twice:
src/Camfort/Specification/Stencils/Grammar.y: 60: multiple use of 'centered'
OK, thanks for the suggestion then. I don't see any reason why it couldn't be done.
Alright, I will see if it is an easy change and maybe do a PR soon.