REGR: fix read_parquet with column of large strings (avoid overflow from concat) #55691
Conversation
pandas/tests/io/test_parquet.py (Outdated)
def test_string_column_above_2GB(self, tmp_path, pa):
    # https://github.com/pandas-dev/pandas/issues/55606
    # above 2GB of string data
    v1 = b"x" * 100000000
    v2 = b"x" * 147483646
    df = pd.DataFrame({"strings": [v1] * 20 + [v2] + ["x"] * 20}, dtype="string")
    df.to_parquet(tmp_path / "test.parquet")
    result = read_parquet(tmp_path / "test.parquet")
    assert result["strings"].dtype == "string"
This test is quite slow (around 20s for me) and uses a lot of memory (> 5 GB), so I am not sure we should add it ... (our "slow" tests are still run by default, so this would be annoying when running the tests locally)
I'm in favor of not adding this test given the potential CI load. Maybe add an ASV benchmark, since this is "performance" related too given the memory involved, if you think that makes sense.
At a minimum, it would be good to add a comment in pandas/core/arrays/string_.py explaining why the modification was made.
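For readers following the thread, here is a simplified sketch of what the modification amounts to (illustrative only; the function name is hypothetical and this is not the exact code in pandas/core/arrays/string_.py): convert each pyarrow chunk to a numpy object array separately and concatenate the numpy results, instead of first concatenating the pyarrow chunks into a single string array whose 32-bit offsets can overflow above 2 GiB.

import numpy as np
import pyarrow as pa

def chunked_strings_to_object_array(arr: pa.ChunkedArray) -> np.ndarray:
    # Illustrative sketch: converting chunk by chunk means pyarrow never has
    # to build one string Array holding all of the data, so the 2**31 - 1 byte
    # limit of its 32-bit offsets is never hit.
    chunks = [chunk.to_numpy(zero_copy_only=False) for chunk in arr.chunks]
    if not chunks:
        return np.array([], dtype=object)
    return np.concatenate(chunks)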
Added a comment about it, and "removed" the test: I left the code here to make it easier to run the test in the future by just uncommenting it (or if we enable some high_memory mark that would be disabled by default).
Adding an ASV sounds useful, but it wouldn't help catch a regression like this, since for ASV we would also use a smaller dataset. So I'm leaving that out of this PR.
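As an aside, one hypothetical way such a disabled-by-default high_memory mark could be wired up (a sketch only, not something that exists in the pandas test suite) is an opt-in command line flag in conftest.py:

# conftest.py (hypothetical sketch)
import pytest

def pytest_addoption(parser):
    parser.addoption("--run-high-memory", action="store_true", default=False,
                     help="run tests marked as high_memory")

def pytest_configure(config):
    config.addinivalue_line("markers", "high_memory: test needs several GB of memory")

def pytest_collection_modifyitems(config, items):
    # Skip high_memory tests unless the opt-in flag was passed.
    if config.getoption("--run-high-memory"):
        return
    skip = pytest.mark.skip(reason="needs --run-high-memory option to run")
    for item in items:
        if "high_memory" in item.keywords:
            item.add_marker(skip)

The test above could then carry @pytest.mark.high_memory and would only run when --run-high-memory is passed.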
thanks @jorisvandenbossche.
…arge strings (avoid overflow from concat)
…n of large strings (avoid overflow from concat)) (#55706) Backport PR #55691: REGR: fix read_parquet with column of large strings (avoid overflow from concat) Co-authored-by: Joris Van den Bossche <[email protected]>
pyarrow.lib.ArrowInvalid: offset overflow while concatenating arrays
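For context, this error comes from pyarrow's default string type, whose 32-bit offsets cap a single array at 2**31 - 1 bytes of character data. A minimal standalone reproduction sketch (not taken from the PR, and needing a few GB of RAM) could look like:

import pyarrow as pa

chunk = pa.array(["x" * (2**30)])   # ~1 GiB of string data in a single chunk
try:
    pa.concat_arrays([chunk] * 3)   # ~3 GiB total overflows the 32-bit offsets
except pa.lib.ArrowInvalid as err:
    print(err)                      # offset overflow while concatenating arrays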
Closes #55606
Added an entry in the latest doc/source/whatsnew/vX.X.X.rst file if fixing a bug or adding a new feature.