Better error message for large elements. #30639

Merged
merged 4 commits into from
Mar 15, 2024
9 changes: 9 additions & 0 deletions sdks/python/apache_beam/runners/worker/data_plane.py
@@ -63,6 +63,7 @@

_DEFAULT_SIZE_FLUSH_THRESHOLD = 10 << 20 # 10MB
_DEFAULT_TIME_FLUSH_THRESHOLD_MS = 0 # disable time-based flush by default
_FLUSH_MAX_SIZE = (2 << 30) - 100 # 2GB less some overhead, protobuf/grpc limit

# Keep a set of completed instructions to discard late received data. The set
# can have up to _MAX_CLEANED_INSTRUCTIONS items. See _GrpcDataChannel.
@@ -147,6 +148,14 @@ def maybe_flush(self):
def flush(self):
# type: () -> None
if self._flush_callback:
if self.size() > _FLUSH_MAX_SIZE:
raise ValueError(
f'Buffer size {self.size()} exceeds GRPC limit {_FLUSH_MAX_SIZE}. '
'This is likely due to a single element that is too large. '
'To resolve, prefer multiple small elements over single large '
'elements in PCollections. If needed, store large blobs in '
'external storage systems, and use PCollections to pass their '
'metadata, or use a custom coder that reduces the element\'s size.')
self._flush_callback(self.get())
self._clear()

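The remediation the new error message suggests can be illustrated with a short sketch. This is not part of the PR: the WriteBlobToStorage DoFn and the base path are hypothetical, and the snippet only shows the general pattern of offloading large payloads to external storage so that the elements flowing between the runner and the SDK worker stay well below the ~2 GB flush limit added above.

import apache_beam as beam
from apache_beam.io.filesystems import FileSystems


class WriteBlobToStorage(beam.DoFn):
  # Hypothetical DoFn: writes each (key, large_bytes) pair to external
  # storage and emits only a small metadata dict downstream.
  def __init__(self, base_path):
    self._base_path = base_path  # e.g. 'gs://my-bucket/blobs' (assumed)

  def process(self, element):
    key, blob = element
    path = FileSystems.join(self._base_path, '%s.bin' % key)
    handle = FileSystems.create(path)
    try:
      handle.write(blob)
    finally:
      handle.close()
    # Downstream elements are tiny, so no single element approaches the
    # ~2 GB gRPC message limit that flush() now checks for.
    yield {'key': key, 'path': path, 'size': len(blob)}


def run(pairs, base_path):
  with beam.Pipeline() as p:
    _ = (
        p
        | 'CreatePairs' >> beam.Create(pairs)
        | 'OffloadBlobs' >> beam.ParDo(WriteBlobToStorage(base_path))
        | 'UseMetadata' >> beam.Map(print))

The other option mentioned in the error message, a custom coder that reduces the encoded element's size (for example by compressing it), keeps the data inside the pipeline but still has to stay under the same per-message limit.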