Description
One of the default arguments to all our Spark jobs is the --overwrite flag. It is mainly used in the few places where we have jobs that do incremental loading, to fully reload all historical data (e.g. https://github.com/SneaksAndData/shrek/blob/93dc85842f55434bd82029360284148cc7980513/shrek/kits/generalized/copy_data_merge_by_key.py#L161).
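For reference, a minimal sketch of how such a flag is typically wired into a copy/merge job is shown below. The argument name matches the flag above, but the structure and write logic are illustrative assumptions, not the actual shrek implementation:

```python
# Illustrative sketch only -- structure and write logic are assumptions,
# not copied from shrek's copy_data_merge_by_key.py.
import argparse

from pyspark.sql import DataFrame


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Hypothetical copy/merge job")
    parser.add_argument(
        "--overwrite",
        action="store_true",
        help="Fully reload all historical data instead of loading incrementally",
    )
    return parser.parse_args()


def write_output(df: DataFrame, path: str, overwrite: bool) -> None:
    # With --overwrite the target is replaced wholesale; otherwise the job
    # only appends the new increment (merge-by-key logic omitted here).
    mode = "overwrite" if overwrite else "append"
    df.write.mode(mode).parquet(path)
```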
However, it is currently not possible to specify this argument when submitting a beast job through the CLI via --overrides. Instead, you have to manually change the CRD or create a new one.
Possible solution
Allow one to specify the --overwrite flag in the --overrides argument.

Alternatives
No response
Context
No response