This repository has been archived by the owner on Jul 24, 2024. It is now read-only.
I have been running this package in Composer Airflow using a separate PyPI release I made for my team: airflow-dbt-cta. I hope this will be only a temporary solution, since I'd rather not clutter PyPI with confusing rogue packages, but it has been over a year since the latest PyPI release of airflow-dbt, and over six months since another user requested a new release (issue 65).
The new functionality that supports passing environment variables (e.g. to use variables in sources.yaml; see PR 60 from May 2020) was causing errors when I tried to run it in Google Composer. This seems to be because the new env argument causes dbt_hook.py to spin up a subprocess with an empty environment, when the behavior should instead be to use the existing environment and only export additional variables if the user specifies them. At least, once I made the change suggested in PR 75, the operators began working as intended.
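For illustration, here is a minimal sketch of the merge behavior described above: start from the current process environment and layer any user-supplied variables on top before launching the subprocess. The function and variable names are hypothetical, not the actual dbt_hook.py code:

    import os
    import subprocess

    def run_dbt(dbt_bin="dbt", command="run", env=None):
        # Start from the parent process environment so PATH, credentials,
        # etc. are preserved, then overlay any user-supplied variables.
        merged_env = {**os.environ, **(env or {})}
        subprocess.run([dbt_bin, command], env=merged_env, check=True)

Passing env=None (or an empty dict) then just inherits the worker's environment, instead of wiping it out.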
My team is trying to run this package in Composer Airflow, which might introduce quirks not present in other Airflow deployments... I'm not sure. Composer do be weird sometimes.
It would be amazing if we could get this fix merged and a new PyPI release cut. Pretty please?
Thanks for this. It's quite frustrating when packages get abandoned... Personally, I've switched to BashOperator, which supports env and append_env.
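As a rough sketch (assuming Airflow 2.3+, where BashOperator gained the append_env flag; the project path and variable names are just placeholders), that looks something like:

    from airflow.operators.bash import BashOperator

    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt",
        env={"DBT_TARGET": "prod"},  # extra variables for this task only
        append_env=True,             # merge into the existing environment instead of replacing it
    )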
In my case, I'm installing dbt in my Airflow image using pipx to keep it isolated from Airflow, so it lives somewhere else on my PATH. As a workaround (or perhaps you'd even prefer this method), you could reconstruct PATH in the env you pass manually, something like:
    env = {
        "PATH": os.getenv("PATH"),
        ...
    }
Or to do a full append:
    env = {
        **os.environ,
        "YOUR_EXTRA_ARG": "",
        ...
    }
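Purely as an illustrative sketch of how that dict would then be wired into a task; I'm assuming here that the airflow-dbt operators forward an env keyword (the functionality added in PR 60), and the variable names are placeholders:

    import os
    from airflow_dbt import DbtRunOperator

    dbt_run = DbtRunOperator(
        task_id="dbt_run",
        # Full append: keep the worker's environment and add task-specific vars.
        env={
            **os.environ,
            "YOUR_EXTRA_ARG": "some_value",  # placeholder
        },
    )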