
Populate labels and custom metadata on VMs that serve Dataflow #19453

Open

kennknowles opened this issue Jun 4, 2022 · 4 comments

Comments

@kennknowles
Member

Currently, Apache Beam on Google Dataflow does not provide a way to pass custom labels and metadata to the VM instances that serve a Dataflow job. Only labels on the job itself are supported.

In fact, com.google.api.services.dataflow.model.WorkerPool already has a metadata field, but the setMetadata method is never used.

Functionality should be added to set custom labels and metadata on the VM instances when running a Dataflow job on Google Cloud.
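
To illustrate the gap, here is a minimal sketch of what populating that existing field could look like; the metadata keys and values are hypothetical, and this is not something the runner wires up today:

```java
import com.google.api.services.dataflow.model.WorkerPool;

import java.util.HashMap;
import java.util.Map;

public class WorkerPoolMetadataSketch {
  public static void main(String[] args) {
    // The generated Dataflow API model already exposes a metadata map on WorkerPool.
    WorkerPool workerPool = new WorkerPool();

    // Hypothetical custom metadata a user might want forwarded to the worker VMs.
    Map<String, String> vmMetadata = new HashMap<>();
    vmMetadata.put("team", "data-platform");
    vmMetadata.put("cost-center", "1234");

    // The setter exists on the generated model; the Dataflow runner just never
    // calls it when translating pipeline options into a job request.
    workerPool.setMetadata(vmMetadata);
  }
}
```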

Imported from Jira BEAM-6832. Original Jira may contain additional context.
Reported by: alex3.14.

@prasrvenkat

@kennknowles @damccorm Would you know if this is being worked on? If not, can I take it? Let me know.

@kennknowles
Member Author

You can use DataflowPipelineOptions.setLabels today.
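
For example (a minimal sketch; the label keys and values are illustrative, and the pipeline itself is omitted):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class LabeledDataflowJob {
  public static void main(String[] args) {
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);

    // Labels applied at the job level (not to the worker VMs).
    Map<String, String> labels = new HashMap<>();
    labels.put("team", "data-platform");
    labels.put("environment", "prod");
    options.setLabels(labels);

    Pipeline pipeline = Pipeline.create(options);
    // ... apply transforms here ...
    pipeline.run();
  }
}
```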

@prasrvenkat

> You can use DataflowPipelineOptions.setLabels today.

Ok awesome thank you

@tahsib-optimizely

How to add metadata?
