ckok | 1 year ago
When it's really one of the simplest things if you divide it into parts and look at it as a tokenizer (string to list of tokens) with a parser on top. The tokenizer can usually be very simple: a loop with a large switch on the current character, where a choice is made about "what can this be", turning it into a proper token or an error. On top of that, a simple recursive-descent parser can be almost a 1-to-1 copy of the (E)BNF.
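A minimal sketch of what the comment describes (my own illustration, not code from the thread): a character-switch tokenizer plus a recursive-descent parser whose methods mirror a small EBNF grammar for arithmetic one-to-one.

```python
# Grammar (EBNF):
#   expr   = term   { ("+" | "-") term } ;
#   term   = factor { ("*" | "/") factor } ;
#   factor = NUMBER | "(" expr ")" ;

def tokenize(src):
    """Loop + switch on the current character -> list of (kind, text) tokens."""
    tokens, i = [], 0
    while i < len(src):
        ch = src[i]
        if ch.isspace():
            i += 1
        elif ch.isdigit():
            j = i
            while j < len(src) and src[j].isdigit():
                j += 1
            tokens.append(("NUMBER", src[i:j]))
            i = j
        elif ch in "+-*/()":
            tokens.append((ch, ch))
            i += 1
        else:
            raise SyntaxError(f"unexpected character {ch!r} at offset {i}")
    tokens.append(("EOF", ""))
    return tokens

class Parser:
    """One method per grammar rule; each method reads like its EBNF line."""

    def __init__(self, tokens):
        self.tokens, self.pos = tokens, 0

    def peek(self):
        return self.tokens[self.pos][0]

    def eat(self, kind):
        if self.peek() != kind:
            raise SyntaxError(f"expected {kind}, got {self.peek()}")
        tok = self.tokens[self.pos]
        self.pos += 1
        return tok

    def expr(self):  # expr = term { ("+" | "-") term }
        value = self.term()
        while self.peek() in ("+", "-"):
            op = self.eat(self.peek())[0]
            rhs = self.term()
            value = value + rhs if op == "+" else value - rhs
        return value

    def term(self):  # term = factor { ("*" | "/") factor }
        value = self.factor()
        while self.peek() in ("*", "/"):
            op = self.eat(self.peek())[0]
            rhs = self.factor()
            value = value * rhs if op == "*" else value / rhs
        return value

    def factor(self):  # factor = NUMBER | "(" expr ")"
        if self.peek() == "NUMBER":
            return int(self.eat("NUMBER")[1])
        self.eat("(")
        value = self.expr()
        self.eat(")")
        return value

print(Parser(tokenize("2 + 3 * (4 - 1)")).expr())  # 11
```

This one evaluates as it parses; a real compiler would build an AST node in each rule method instead, but the structure stays the same.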
detourdog | 1 year ago
I felt that they were pointing out how the cognitive load of understanding is affected by word choice.
antononcube | 1 year ago
(BTW, there is a "Parser combinators" section in the featured post/article.)
ckok | 1 year ago
It has been years since I've written a proper parser, but before that, every time I had to write one I tried the latest and greatest first: ANTLR, Coco/R, parser combinators. All the generated ones seemed to have a fatal flaw that hand-writing didn't have. For example, good error handling seemed almost impossible, they were very slow due to infinite lookahead, or they were almost impossible to debug when trying to find an error in the input schema.
In the end, hand-crafting seems to be faster and simpler. YMMV.
My point about the article was mostly that all the formal theory is nice, but all it does is scare people away, while parsing is probably the simplest part of writing a compiler.
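To illustrate the error-handling point (my own sketch, not the commenter's code): a hand-written tokenizer can trivially track line and column and point at the exact offending character, the kind of precise diagnostic that generated parsers often make awkward to produce.

```python
def tokenize(src):
    """Hand-written tokenizer that carries line/column for precise errors."""
    tokens, i, line, col = [], 0, 1, 1
    while i < len(src):
        ch = src[i]
        if ch == "\n":
            line, col = line + 1, 1
            i += 1
        elif ch.isspace():
            col += 1
            i += 1
        elif ch.isalnum() or ch == "_":
            j = i
            while j < len(src) and (src[j].isalnum() or src[j] == "_"):
                j += 1
            tokens.append(("IDENT", src[i:j], line, col))
            col += j - i
            i = j
        else:
            # The error message can say exactly where and what went wrong.
            raise SyntaxError(f"line {line}, col {col}: unexpected {ch!r}")
    return tokens

print(tokenize("foo bar"))
```

Because the loop owns the position state, adding recovery (skip to the next newline and continue collecting errors) is a few extra lines, rather than a fight with a generator's error hooks.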
marcosdumay | 1 year ago
The good news is that this happens at the grammar-definition stage. So once you have defined your language well, you don't have to watch for it anymore.