This repository has been archived by the owner on Jul 24, 2024. It is now read-only.
Apache Airflow version
2.7.2
What happened
Since migrating from the local environment to the develop environment, where Airflow runs on Kubernetes via the Helm chart, DbtRunOperator (and every other dbt-related operator) stopped working in the develop branch. dbt is not recognized as a command, even though the dependency packages are installed. I have a dbt project already initialized, and I am also specifying the path to the dbt directory.
[2024-07-18, 10:06:13 UTC] {dbt_hook.py:117} INFO - dbt run --profiles-dir /opt/airflow/dags/repo/dags/dbt/mysql_dbt/profiles --target dev
[2024-07-18, 10:06:13 UTC] {dbt_hook.py:126} INFO - Output:
[2024-07-18, 10:06:17 UTC] {dbt_hook.py:132} INFO - Command exited with return code 2
[2024-07-18, 10:06:17 UTC] {taskinstance.py:1937} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.9/site-packages/airflow_dbt/operators/dbt_operator.py", line 98, in execute
self.create_hook().run_cli('run')
File "/home/airflow/.local/lib/python3.9/site-packages/airflow_dbt/hooks/dbt_hook.py", line 138, in run_cli
raise AirflowException("dbt command failed")
airflow.exceptions.AirflowException: dbt command failed
[2024-07-18, 10:06:17 UTC] {taskinstance.py:1400} INFO - Marking task as FAILED. dag_id=airflow_dbt, task_id=dbt_run, execution_date=20240718T100611, start_date=20240718T100613, end_date=20240718T100617
[2024-07-18, 10:06:17 UTC] {standard_task_runner.py:104} ERROR - Failed to execute job 27286 for task dbt_run (dbt command failed; 18254)
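Return code 2 with empty output usually means the `dbt` executable itself could not be run inside the worker pod, rather than the dbt project failing. A minimal parse-time check, using only the standard library (the helper names here are illustrative, not part of airflow-dbt), could look like this:

```python
import shutil
import subprocess

def find_dbt(dbt_bin="dbt"):
    """Return the absolute path of dbt_bin if it is on PATH, else None."""
    return shutil.which(dbt_bin)

def dbt_version(dbt_bin="dbt"):
    """Return the output of `dbt --version`, or None when the binary is missing."""
    path = find_dbt(dbt_bin)
    if path is None:
        return None
    result = subprocess.run([path, "--version"], capture_output=True,
                            text=True, check=False)
    return result.stdout

if find_dbt() is None:
    print("dbt is not on PATH inside this container")
```

If this prints inside the worker pod, dbt was never installed into (or is not on the PATH of) the image the Helm chart deploys, which would explain why the same DAG worked locally.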
import os

# __file__ (with double underscores), not file; the latter raises NameError.
CurrentDagDirectory = os.path.dirname(os.path.abspath(__file__))
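Since the PATH seen inside the worker pod can differ from the local environment, one workaround is to resolve an absolute path to dbt at DAG-parse time and pass it via the operators' `dbt_bin` argument (which defaults to `"dbt"`). A sketch, assuming the official apache/airflow image's pip `--user` install location as a fallback:

```python
import shutil

# Resolve dbt from PATH if possible; the fallback path is an assumption about
# where `pip install --user dbt-core` lands in the official Airflow image.
DBT_BIN = shutil.which("dbt") or "/home/airflow/.local/bin/dbt"

# Pass it explicitly, e.g.:
#   DbtRunOperator(task_id="dbt_run", dbt_bin=DBT_BIN,
#                  dir=..., profiles_dir=..., target="dev")
```

If the fallback path also does not exist in the pod, dbt needs to be baked into the image (or installed via the Helm chart's extra pip packages) rather than referenced from the DAG.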