We are currently using the Polygon dataset available in BigQuery. As part of this process, every hour we query the logs table for the max block number. This query almost always returns the expected result, but in one instance we got back an anomalous number: 7277816997830857785.
This only happened once, but it caused us a lot of headaches due to a runaway process that was trying to catch up to this number.
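For context, here is a minimal sketch of the kind of hourly check we run, plus the plausibility guard we are adding as a stopgap. The table path, column name, and drift threshold below are assumptions for illustration, not our exact production code:

```python
# Assumed shape of the hourly query (table/column names may differ):
MAX_BLOCK_QUERY = """
SELECT MAX(block_number) AS max_block
FROM `public-data-finance.crypto_polygon.logs`
"""

# Assumed upper bound on how many blocks can appear in one hour.
MAX_PLAUSIBLE_ADVANCE = 10_000


def is_plausible_max_block(candidate: int, last_known: int) -> bool:
    """Reject values that jump implausibly far past the last known block."""
    return last_known <= candidate <= last_known + MAX_PLAUSIBLE_ADVANCE


# The anomalous value fails this check against any realistic Polygon
# block height, so a guard like this would have stopped the runaway
# catch-up process.
anomalous = 7277816997830857785
print(is_plausible_max_block(anomalous, last_known=50_000_000))  # False
```

This guard only masks the symptom on our side; we would still like to understand how the value got into the table in the first place.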
It would be very helpful if anyone familiar with the ETL process could provide insight into why this would have been the case. Was the data in some temporary state when this query ran?
I tried looking into the implementation in this repo but did not make any meaningful progress.
This occurred around 10 pm US Pacific Standard Time on Dec 26th but has not been reproducible since then.