This library follows Semantic Versioning.
This repository uses Release Please to create GitHub and PyPI releases. It does so by parsing your git history, looking for Conventional Commit messages, and creating release PRs.
Learn more by reading How should I write my commits?
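For quick reference, Conventional Commit messages follow a `type: description` shape; a couple of illustrative (hypothetical) examples:

```
feat: add similarity search filters to the vector store
fix: close database connections on engine shutdown
chore(deps): update dependency pg8000
```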
- Set environment variables for `INSTANCE_ID`, `DATABASE_ID`, `REGION`, `DB_USER`, `DB_PASSWORD`, and `IAM_ACCOUNT`.
- Run `pytest` to automatically run all tests:

  ```
  pytest
  ```
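As a sketch, a local run might export the variables first; all values below are placeholders, not real project settings:

```shell
# Placeholder values -- substitute your own instance details before running.
export INSTANCE_ID="my-instance"
export DATABASE_ID="my-database"
export REGION="us-central1"
export DB_USER="postgres"
export DB_PASSWORD="my-password"
export IAM_ACCOUNT="[email protected]"

# Then run the test suite:
#   pytest
```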
Notes:
- Tests use both IAM and built-in authentication.
- Learn how to set up a built-in database user at Cloud SQL built-in database authentication.
- Local tests will run against your `gcloud` credentials. Use `gcloud` to log in with either your personal account or a service account; this account will be used to run the IAM tests. Learn how to set up access to the database at Manage users with IAM database authentication. The `IAM_ACCOUNT` environment variable is also used to test overriding the local account for authentication; either a personal account or a service account can be used for this test.
- You may need to grant access to the public schema for your new database user:

  ```
  GRANT ALL ON SCHEMA public TO "[email protected]";
  ```
Cloud Build is used to run tests against Google Cloud resources in the test project `langchain-cloud-sql-testing`.
Each test has a corresponding Cloud Build trigger; see all triggers.
These tests are registered as required tests in `.github/sync-repo-settings.yaml`.
Cloud Build triggers (for Python versions 3.9 to 3.11) were created with the following specs:
```
name: pg-integration-test-pr-py39
description: Run integration tests on PR for Python 3.9
filename: integration.cloudbuild.yaml
github:
  name: langchain-google-cloud-sql-pg-python
  owner: googleapis
  pullRequest:
    branch: .*
    commentControl: COMMENTS_ENABLED_FOR_EXTERNAL_CONTRIBUTORS_ONLY
ignoredFiles:
  - docs/**
  - .kokoro/**
  - .github/**
  - "*.md"
substitutions:
  _CLUSTER_ID: <ADD_VALUE>
  _DATABASE_ID: <ADD_VALUE>
  _INSTANCE_ID: <ADD_VALUE>
  _REGION: us-central1
  _VERSION: "3.9"
```
Use `gcloud builds triggers import --source=trigger.yaml` to create triggers via the command line.
- Create a Cloud SQL for PostgreSQL instance and database.
- Set up Cloud Build triggers (above).
- Run an integration test:

  ```
  gcloud builds submit --config integration.cloudbuild.yaml --region us-central1 --substitutions=_INSTANCE_ID=$INSTANCE_ID,_DATABASE_ID=$DATABASE_ID,_REGION=$REGION
  ```
To run Cloud Build tests on GitHub from external contributors, e.g. RenovateBot, comment `/gcbrun` on the PR.
Please make sure your code is fully tested. The Cloud Build integration tests are run with the `pytest-cov` code coverage plugin. They fail for PRs with code coverage less than the threshold specified in `.coveragerc`. If your file is inside the main module and should be ignored by the code coverage check, add it to the `omit` section of `.coveragerc`.
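For illustration, a `.coveragerc` using standard coverage.py options might look like the fragment below; the threshold and omitted path are hypothetical, not the project's actual values:

```
[run]
branch = True
# Files listed under omit are ignored by the coverage check.
omit =
    */experimental/*

[report]
# Hypothetical threshold; the real value lives in the repo's .coveragerc.
fail_under = 90
show_missing = True
```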
Check the code coverage report in any Cloud Build integration test log. Here is a breakdown of the report:

- `Stmts`: lines of executable code (statements).
- `Miss`: number of lines not covered by tests.
- `Branch`: branches of executable code (e.g. an if-else clause may count as 1 statement but 2 branches; test for both conditions to have both branches covered).
- `BrPart`: number of branches not covered by tests.
- `Cover`: coverage percentage of the file.
- `Missing`: lines that are not covered by tests.
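To make the `Branch` and `BrPart` columns concrete, here is a small hypothetical function: its if-else is one conditional statement but two branches, so a test suite that only ever triggers one side leaves a partial branch in the report:

```python
def clamp(value: int, limit: int) -> int:
    """Return value, capped at limit: one if statement, two branches."""
    if value > limit:    # branch taken when value exceeds the limit
        return limit
    return value         # branch taken otherwise

# Exercising only clamp(5, 3) covers one branch (the other shows up under BrPart);
# calling clamp(2, 3) as well covers both branches.
print(clamp(5, 3))  # -> 3
print(clamp(2, 3))  # -> 2
```

These columns are populated when branch coverage is enabled (e.g. `branch = True` in the coverage configuration).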
Google hosts documentation on LangChain's site via the Google Provider page and individual integration pages: Vector Stores, Document Loaders, and Memory.
Currently, manual PRs are made to the LangChain GitHub repo.
API docs are templated in the `docs/` directory.

To test locally, run: `nox -s docs`

The nox session `docs` is used to create HTML to publish to googleapis.dev. The nox session `docfx` is used to create YAML to publish to CGC.

The kokoro docs pipeline runs when a new release is created. See `.kokoro/` for the release pipeline.