Unexpected value for max(block_number) in logs table at some point #199

gdevanla opened this issue Jan 5, 2024 · 0 comments
gdevanla commented Jan 5, 2024

We are currently using the Polygon dataset available in BigQuery. Every hour, we query the logs table for the max block number. This query normally returns the expected result, but in one instance it returned an implausibly large value: 7277816997830857785.
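
Roughly, the hourly check looks like the sketch below (using the Python BigQuery client; the dataset path shown is illustrative and may differ from the exact table we query):

```python
from google.cloud import bigquery

# Illustrative hourly check: fetch the highest block number seen in the logs table.
# The dataset path below is an assumption about the public Polygon dataset; adjust as needed.
client = bigquery.Client()

query = """
    SELECT MAX(block_number) AS max_block
    FROM `bigquery-public-data.crypto_polygon.logs`
"""

row = next(iter(client.query(query).result()))
print(row.max_block)  # expected: a plausible Polygon block height, not 7277816997830857785
```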

This has only happened once, but it caused us a lot of headaches because a runaway process tried to catch up to this number.

It would be very helpful if anyone familiar with the ETL process could provide insight into why this happened. Was the data in some temporary state when this query was run?

I tried looking into the implementation in this repo but did not make any meaningful progress.

This occurred around 10 pm US Pacific Standard Time on Dec 26th and has not recurred since then.
