In a batch job on Dataflow that reads payload and metadata from a BigQuery table and publishes them to Pub/Sub via PubsubIO, I sometimes experience errors:
```
com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
"message" : "Request payload size exceeds the limit: 10485760 bytes.",
```
The PubsubIO Javadoc says it uses the global limit of 10 MiB by default, but that doesn't seem to hold in all circumstances. I'm handling relatively large records here, up to 600 KiB per message.
Adding `.withMaxBatchBytesSize(5242880)` after `PubsubIO.writeMessages().to(topic)` fixes this issue.
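For reference, a minimal sketch of where the setting goes in a pipeline like the one described. The project, table, and topic names and the `payload` column are hypothetical, and the row-to-message conversion is only illustrative:

```java
import java.nio.charset.StandardCharsets;
import java.util.Collections;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class BigQueryToPubsub {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply("ReadFromBigQuery",
            BigQueryIO.readTableRows().from("my-project:my_dataset.my_table"))
        // Convert each row's payload column into a PubsubMessage;
        // the column name and schema here are made up for illustration.
        .apply("ToPubsubMessage",
            MapElements.into(TypeDescriptor.of(PubsubMessage.class))
                .via(row -> new PubsubMessage(
                    ((String) row.get("payload")).getBytes(StandardCharsets.UTF_8),
                    Collections.emptyMap())))
        .setCoder(PubsubMessageWithAttributesCoder.of())
        .apply("WriteToPubsub",
            PubsubIO.writeMessages()
                .to("projects/my-project/topics/my-topic")
                // Cap each publish batch at 5 MiB, half the 10 MiB request
                // limit, leaving headroom for per-message overhead.
                .withMaxBatchBytesSize(5242880));

    p.run().waitUntilFinish();
  }
}
```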
Imported from Jira BEAM-7107. Original Jira may contain additional context.
Reported by: MadEgg.