Rewriting the limits IO to use a tokenizer rather than regexes. (#136)
Summary:

These regexes are getting completely out of hand; I think it would be simpler to just use a tokenizer, especially since we have many common building blocks, like parsing vectors.

This new version creates a Tokenizer class that feeds back tokens one at a time, and you can build your parser on top of it. The advantages I see are:
1. It is much simpler to extend than trying to reverse-engineer regexes that match any given number format.
2. It produces more useful error messages that point to the exact spot where parsing failed.

Reviewed By: jeongseok-meta

Differential Revision: D66404695
Chris Twigg authored and facebook-github-bot committed Nov 26, 2024
1 parent 1649e57 commit 793bcbf
Showing 1 changed file with 317 additions and 195 deletions.
