This GitHub Action deploys table and view schemas to BigQuery.
```yaml
name: "Deploy to BigQuery"

on:
  pull_request: {}
  push:
    branches: ["main"]

jobs:
  deploy_schemas:
    runs-on: ubuntu-latest
    name: Deploy to BigQuery
    steps:
      # To use this repository's private action,
      # you must check out the repository
      - name: Checkout
        uses: actions/checkout@v2
      - name: Deploy schemas to BigQuery
        uses: jashparekh/bigquery-action@v3
        env:
          gcp_project: 'gcp-us-project'
          dataset_schema_directory: 'gcp-us-project/dataset_name'
          credentials: ${{ secrets.GCP_SERVICE_ACCOUNT }}
```
| Variable | Description | Example |
| --- | --- | --- |
| `gcp_project` | The full name of the GCP project you want to deploy to. | `gcp-us-project` |
| `dataset_schema_directory` | The directory in your repository where you store the schemas for your tables and views. | `gcp-us-project/dataset_name` |
| `credentials` | A Google service account with permission to create objects in the specified project. Can be stored as a repository secret. | `${{ secrets.GCP_SERVICE_ACCOUNT }}` |
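For orientation, `dataset_schema_directory` points at a directory tree inside your repository. A plausible layout, assuming one JSON schema file per table or view (the file names here are illustrative, not taken from this document), might look like:

```
gcp-us-project/
└── dataset_name/
    ├── my_table.json   # schema for a table (name assumed)
    └── my_view.json    # schema for a view (name assumed)
```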
This action uses GBQ to deploy to Google BigQuery. GBQ also supports specifying partitioning alongside the schema. To use this, nest your JSON table schema in a dictionary, as in the example below. The library supports time- and range-based partitioning along with clustering.

All the configuration options can be found here.
```json
{
  "partition": {
    "type": "range",
    "definition": {
      "field": "ID",
      "range": {
        "start": 1,
        "end": 100000,
        "interval": 10
      }
    }
  },
  "clustering": [
    "ID"
  ],
  "schema": [
    {
      "name": "ID",
      "type": "INTEGER",
      "mode": "REQUIRED"
    }
  ]
}
```
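To illustrate the nesting convention above, the following plain-Python sketch (not part of the action itself) loads the example configuration and pulls apart its three sections: the partition settings, the clustering columns, and the BigQuery column schema.

```python
import json

# The example configuration from above: the table schema nested in a
# dictionary together with partition and clustering settings.
config_text = """
{
  "partition": {
    "type": "range",
    "definition": {
      "field": "ID",
      "range": {"start": 1, "end": 100000, "interval": 10}
    }
  },
  "clustering": ["ID"],
  "schema": [
    {"name": "ID", "type": "INTEGER", "mode": "REQUIRED"}
  ]
}
"""

config = json.loads(config_text)

# Partitioning: range-based on the ID column, in steps of 10.
partition = config["partition"]
assert partition["type"] == "range"
assert partition["definition"]["field"] == "ID"

# Clustering columns, and a name -> type map of the column schema.
clustering = config["clustering"]
columns = {c["name"]: c["type"] for c in config["schema"]}
print(clustering, columns)  # ['ID'] {'ID': 'INTEGER'}
```

The top-level keys keep the partition and clustering metadata separate from the column list, so a plain (unpartitioned) table schema is simply the value under `"schema"`.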
See the Contributing Guide for additional information.

To execute tests locally (requires that `docker` and `docker-compose` are installed):

```shell
docker-compose run test
```
This GitHub Action was originally written by Jash Parekh.