Rewriting the limits IO to use a tokenizer rather than regexes. (#136)
Summary:

These regexes are getting completely out of hand; I think it would be simpler to just use a tokenizer, especially since we have many common "building blocks" like parsing vectors.

This new version creates a Tokenizer class that simply feeds back tokens, and you build the parser on top of it. The advantages I see are:
1. Much easier to extend, rather than trying to reverse engineer regexes that match any given number format.
2. More useful error messages that point to the exact spot where parsing failed.

Differential Revision: D66404695
Chris Twigg authored and facebook-github-bot committed Nov 25, 2024
1 parent df6cb7e commit 12eda6b
Showing 1 changed file with 317 additions and 193 deletions.
