
Commit

Cleanup some bits
samuelhwilliams committed Nov 15, 2024
1 parent 533fb13 commit 07b1f9b
Showing 3 changed files with 3 additions and 37 deletions.
1 change: 0 additions & 1 deletion .pre-commit-config.yaml
@@ -25,4 +25,3 @@ repos:
"--disable-plugin",
"Base64HighEntropyString",
]
exclude: .env.development
36 changes: 3 additions & 33 deletions README.md
@@ -23,7 +23,6 @@ To update requirements, use `uv add` or `uv remove`.

This service depends on:
- A postgres database
- No other microservices

### IDE Setup
[Python IDE Setup](https://github.com/communitiesuk/funding-service-design-workflows/blob/main/readmes/python-repos-ide-setup.md)
@@ -40,30 +39,6 @@ This service depends on:
### Local DB Setup
General instructions for local db development are available here: [Local database development](https://github.com/communitiesuk/funding-service-design-workflows/blob/main/readmes/python-repos-db-development.md)

### DB Helper Scripts
This repository uses `invoke` to provide scripts for dropping and recreating the local database in [tasks.py](./tasks.py)
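For example (a minimal sketch, assuming `invoke` is installed in your local environment; the exact task names are defined in tasks.py, so list them first):
```bash
# List the invoke tasks defined in tasks.py
inv --list

# Example: truncate local data (task name taken from the docker example below)
inv truncate-data
```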

### Running in-container
To run the tasks inside the docker container used by docker compose, first bash into the container:
```bash
docker exec -it $(docker ps -qf "name=pre-award-stores") bash
```
Then execute the required tasks using `inv` as below.
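For instance (a sketch reusing the `truncate-data` task from the combined command below):
```bash
inv truncate-data
```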

Or to combine the two into one command:
```bash
docker exec -it $(docker ps -qf "name=pre-award-stores") inv truncate-data
```

## Running the tests

We use `pytest` to run all of our tests. We have a selection of unit, integration, and end-to-end (browser) tests. By default,
the unit and integration tests will run with a basic invocation of `pytest`.

The majority of our tests will expect to access a database. This will be available if you're using the docker runner and have started all of the services.
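For example (a rough sketch; the `accessibility` marker comes from pytest.ini in this repository, and the docker invocation mirrors the pattern above):
```bash
# Run the unit and integration tests locally
pytest

# Run only the accessibility-marked tests
pytest -m accessibility

# Or run the suite inside the docker compose container
docker exec -it $(docker ps -qf "name=pre-award-stores") pytest
```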

[Testing in Python repos](https://github.com/communitiesuk/funding-service-design-workflows/blob/main/readmes/python-repos-db-development.md)

## Updating database migrations

Whenever you make changes to database models, please run:
@@ -72,19 +47,14 @@ Whenever you make changes to database models, please run:

The `message` should be a short description of the DB changes made. Don't specify a revision id (using `--rev-id`) - it will be generated automatically.

The migration file for your changes will be created in ./db/migrations. Please then commit and push these to github
The migration file for your changes will be created in ./db/migrations/versions. Please then commit and push these to github
so that the migrations will be run in the pipelines to correctly upgrade the deployed db instances with your changes.
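As an illustration only (the actual command sits in the collapsed part of this diff; this sketch assumes the standard Flask-Migrate CLI, which the `--rev-id` reference suggests):
```bash
# Generate a migration after changing the models; the message describes the change
flask db migrate -m "add example column to an example table"
```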


## Builds and Deploys
Details on how our pipelines work and the release process are available [here](https://dluhcdigital.atlassian.net/wiki/spaces/FS/pages/73695505/How+do+we+deploy+our+code+to+prod)
### Paketo
Paketo is used to build the docker image which gets deployed to our test and production environments. Details available [here](https://github.com/communitiesuk/funding-service-design-workflows/blob/main/readmes/python-repos-paketo.md)

### Copilot
Copilot is used for infrastructure deployment. Instructions are available [here](https://github.com/communitiesuk/funding-service-design-workflows/blob/main/readmes/python-repos-copilot.md), with the following values for the fund store:
- service-name: pre-award-stores
- image-name: ???

## Accessing on AWS

TODO: Some AWS access info
- service-name: fsd-pre-award-stores
3 changes: 0 additions & 3 deletions pytest.ini
@@ -3,6 +3,3 @@ env =
FLASK_ENV=unit_test
FLASK_DEBUG=1
GITHUB_SHA=123123

markers =
accessibility: accessibility test
