Fix error when setting a large number of properties
**Bugfix.** Fixes #269. This change greatly reduces the likelihood of an error when specifying a large number of `property_ids` in `ga4.combine_property_data()`.

* Fixed the bug by changing the macro to copy a table per `property_id`.

dbt_project.yml:

```yml
vars:
  ga4:
    source_project: source-project-id
    property_ids: [000000001, 000000002, ..., 000000040]
    start_date: 20210101
    static_incremental_days: 3
    combined_dataset: combined_dataset_name
```

Before this change, a full refresh failed because the generated query exceeded BigQuery's maximum query length:

```shell
$ dbt run -s base_ga4__events --full-refresh
06:51:19  Running with dbt=1.5.0
06:52:05  Found 999 models, 999 tests, 999 snapshots, 999 analyses, 999 macros, 999 operations, 999 seed files, 999 sources, 999 exposures, 999 metrics, 999 groups
06:52:06
06:52:14  Concurrency: 4 threads (target='dev')
06:52:14
06:52:14  1 of 1 START sql view model dataset_name.base_ga4__events ......... [RUN]
06:56:17  BigQuery adapter: https://console.cloud.google.com/bigquery?project=project-id&j=bq:asia-northeast1:????????-????-????-????-????????????&page=queryresults
06:56:17  1 of 1 ERROR creating sql view model dataset_name.base_ga4__events  [ERROR in 243.80s]
06:56:18
06:56:18  Finished running 1 view model in 0 hours 4 minutes and 11.62 seconds (251.62s).
06:56:22
06:56:22  Completed with 1 error and 0 warnings:
06:56:22
06:56:23  Database Error in model base_ga4__events (models/staging/base/base_ga4__events.sql)
06:56:23    The query is too large. The maximum standard SQL query length is 1024.00K characters, including comments and white space characters.
06:56:23
06:56:23  Done. PASS=0 WARN=0 ERROR=1 SKIP=0 TOTAL=1
```

Merging this pull request will enable execution.
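To illustrate why splitting by `property_id` avoids this error, the sketch below contrasts one monolithic `UNION ALL` query over every daily shard of every property with one small statement per shard. This is hypothetical Python, not the package's actual Jinja macro; the project and dataset names are copied from the example config above purely for illustration.

```python
from datetime import date, timedelta

# BigQuery's documented limit on standard SQL query length ("1024.00K characters").
BQ_MAX_QUERY_CHARS = 1024 * 1000

def shard_suffixes(start, end):
    """YYYYMMDD suffixes for every daily events_* shard between start and end."""
    return [(start + timedelta(n)).strftime("%Y%m%d")
            for n in range((end - start).days + 1)]

def monolithic_query(property_ids, suffixes):
    """Old behaviour (sketch): one UNION ALL over every shard of every property."""
    selects = [
        f"SELECT * FROM `source-project-id.analytics_{pid}.events_{s}`"
        for pid in property_ids for s in suffixes
    ]
    return "\nUNION ALL\n".join(selects)

def per_shard_statements(property_ids, suffixes):
    """New behaviour (sketch): one short copy statement per property shard."""
    return [
        f"CREATE OR REPLACE TABLE `project-id.combined_dataset_name.events_{s}{pid}` "
        f"CLONE `source-project-id.analytics_{pid}.events_{s}`"
        for pid in property_ids for s in suffixes
    ]

if __name__ == "__main__":
    pids = [f"{n:09d}" for n in range(1, 41)]                 # 40 properties
    sufs = shard_suffixes(date(2021, 1, 1), date(2024, 3, 24))
    # The single query blows past the limit; every per-shard statement is tiny.
    print(len(monolithic_query(pids, sufs)) > BQ_MAX_QUERY_CHARS)   # True
    print(all(len(s) < 500 for s in per_shard_statements(pids, sufs)))  # True
```

With 40 properties and roughly three years of daily shards, the monolithic query is several million characters long, while no individual copy statement comes anywhere near the limit.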
After this change, the same command succeeds:

```shell
$ dbt run -s base_ga4__events --full-refresh
HH:mm:ss  Running with dbt=1.5.0
HH:mm:ss  Found 999 models, 999 tests, 999 snapshots, 999 analyses, 999 macros, 999 operations, 999 seed files, 999 sources, 999 exposures, 999 metrics, 999 groups
HH:mm:ss
HH:mm:ss  Concurrency: 4 threads (target='dev')
HH:mm:ss
HH:mm:ss  1 of 1 START sql incremental model dataset_name.base_ga4__events ... [RUN]
HH:mm:ss  Cloned from `source-project-id.analytics_000000001.events_*[20210101-20240324]` to `project-id.combined_dataset_name.events_YYYYMMDD000000001`.
HH:mm:ss  Cloned from `source-project-id.analytics_000000002.events_*[20210101-20240324]` to `project-id.combined_dataset_name.events_YYYYMMDD000000002`.
....
HH:mm:ss  Cloned from `source-project-id.analytics_000000040.events_*[20210101-20240324]` to `project-id.combined_dataset_name.events_YYYYMMDD000000040`.
HH:mm:ss  1 of 1 OK created sql incremental model dataset_name.base_ga4__events  [CREATE TABLE (? rows, ? processed) in ?]
HH:mm:ss
HH:mm:ss  Finished running 1 incremental model in ? (?).
HH:mm:ss
HH:mm:ss  Completed successfully
HH:mm:ss
HH:mm:ss  Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1
```

---

## Fixed timeout in clone operation

Timeouts during the clone operation are now very unlikely, because the clone is issued separately for each `property_id`.

* Removed https://github.com/Velir/dbt-ga4/blame/6.0.1/README.md#L323-L332 from README.md
* Resolved the workaround recommended there:

> Jobs that run a large number of clone operations are prone to timing out. As a result, it is recommended that you increase the query timeout if you need to backfill or full-refresh the table, when first setting up or when the base model gets modified. Otherwise, it is best to prevent the base model from rebuilding on full refreshes unless needed to minimize timeouts.
>
> ```
> models:
>   ga4:
>     staging:
>       base:
>         base_ga4__events:
>           +full_refresh: false
> ```
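The clone-per-property behaviour described above can be sketched as follows. This is an illustrative Python outline, not the real `combine_property_data` Jinja macro; the `execute` callback and both function signatures are assumptions standing in for whatever submits a statement to BigQuery.

```python
# Hypothetical sketch: each property's shards are cloned as their own short
# jobs, so no single job runs long enough to hit the query timeout that the
# old README workaround (`+full_refresh: false`) guarded against.

def clone_property_shards(property_id, suffixes, execute,
                          source_project="source-project-id",
                          combined_dataset="project-id.combined_dataset_name"):
    """Clone every daily shard of one property as a separate small job."""
    for s in suffixes:
        execute(
            f"CREATE OR REPLACE TABLE `{combined_dataset}.events_{s}{property_id}` "
            f"CLONE `{source_project}.analytics_{property_id}.events_{s}`"
        )

def combine_property_data(property_ids, suffixes, execute):
    # One batch of short jobs per property, instead of one long-running job.
    for pid in property_ids:
        clone_property_shards(pid, suffixes, execute)

if __name__ == "__main__":
    issued = []  # collect statements instead of submitting them to BigQuery
    combine_property_data(["000000001", "000000002"],
                          ["20210101", "20210102"],
                          issued.append)
    print(len(issued))  # 4: 2 properties x 2 shards, each its own job
```

Because each statement is an independent job, a failure or slow clone affects only one shard of one property rather than the whole backfill.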