diff --git a/README.md b/README.md index 4e8241331..c7ed080a9 100644 --- a/README.md +++ b/README.md @@ -4,7 +4,7 @@ Welcome to the project for the New TANF Data Portal, which will replace the lega Our vision is to build a new, secure, web-based data reporting system to improve the federal reporting experience for TANF grantees and federal staff. The new system will allow grantees to easily submit accurate data and be confident that they have fulfilled their reporting requirements. This will reduce the burden on all users, improve data quality, lead to better policy and program decision-making, and ultimately help low-income families. ---- +--- ## Current Build @@ -15,7 +15,7 @@ Our vision is to build a new, secure, web-based data reporting system to improve |**Frontend Coverage**| [![Codecov-Frontend-Dev](https://codecov.io/gh/raft-tech/TANF-app/branch/develop/graph/badge.svg?flag=dev-frontend)](https://codecov.io/gh/raft-tech/TANF-app?flag=dev-frontend) | [![Codeco-Frontend-HHS](https://codecov.io/gh/HHS/TANF-app/branch/main/graph/badge.svg?flag=main-frontend)](https://codecov.io/gh/HHS/TANF-app?flag=main-frontend) | [![Codeco-Frontend-HHS](https://codecov.io/gh/HHS/TANF-app/branch/master/graph/badge.svg?flag=master-frontend)](https://codecov.io/gh/HHS/TANF-app?flag=master-frontend) |**Backend Coverage**| [![Codecov-Backend-Dev](https://codecov.io/gh/raft-tech/TANF-app/branch/develop/graph/badge.svg?flag=dev-backend)](https://codecov.io/gh/raft-tech/TANF-app/branch/develop?flag=dev-backend)| [![Codecov-Backend-HHS]( https://codecov.io/gh/HHS/TANF-app/branch/main/graph/badge.svg?flag=main-backend)](https://codecov.io/gh/HHS/TANF-app/branch/main?flag=main-backend) | [![Codecov-Backend-HHS]( https://codecov.io/gh/HHS/TANF-app/branch/master/graph/badge.svg?flag=master-backend)](https://codecov.io/gh/HHS/TANF-app/branch/master?flag=master-backend) -[Link to Current Development Deployments](https://github.com/raft-tech/TANF-app/blob/feat/1860/docs/Technical-Documentation/TDP-environments-README.md#development) +[Link to Current Development Deployments](./docs/Technical-Documentation/TDP-environments-README.md) *Due to limitations imposed by Github and occasional slow server response times, some badges may require a page refresh to load.* diff --git a/docs/How-We-Work/team-meetings.md b/docs/How-We-Work/team-meetings.md index 90d283829..8b78dcc59 100644 --- a/docs/How-We-Work/team-meetings.md +++ b/docs/How-We-Work/team-meetings.md @@ -44,14 +44,14 @@ A typical sprint schedule is described in the table below. - **Attendees:** Core team - **Facilitator:** Raft PM/Scrum Master - **When:** Daily 12pm-12:15pm EST and asynchronously on Thursdays and every other Tuesday when the team has sprint ceremonies -- **Format:** Each team member gives a brief and intentional update to answer these questions - - Facilitator shares their screen and pulls up the tickets for each team member in the current sprint. - - *What did you do since the last standup that advances the sprint goals?* - - *What are you doing today that advances the sprint goals?* - - *Are you being held up by any inner-team dependencies i.e. 
Reviews, Tabletops, Pairings?*
-  - *Are you encountering any blockers?*
-  - Use the time directly after Standup to meet with a targetted group of poeple to resolve any issues that came up duirng the meeting
-  - If you can’t make Standup in real-time, please post a short async update to the [OFA TDP General Mattermost Channel](https://mattermost.goraft.tech/goraft/channels/guest-ofa-tdp-general)
+- **Format:** Each team member gives a brief and intentional update to answer these questions
+  - Facilitator shares their screen and pulls up the tickets for each team member in the current sprint.
+  - *What did you do since the last standup that advances the sprint goals?*
+  - *What are you doing today that advances the sprint goals?*
+  - *Are you being held up by any inner-team dependencies, i.e. Reviews, Tabletops, Pairings?*
+  - *Are you encountering any blockers?*
+  - Use the time directly after Standup to meet with a targeted group of people to resolve any issues that came up during the meeting
+  - If you can’t make Standup in real-time, please post a short async update to the [OFA TDP General Mattermost Channel](https://mattermost.goraft.tech/goraft/channels/guest-ofa-tdp-general)
 
 ### Backlog Refinement
 
@@ -60,7 +60,24 @@ A typical sprint schedule is described in the table below.
 - **Facilitator:** Raft PM/Scrum Master
 - **When:** Every Tuesday on non-sprint planning weeks 11:00am-12:00pm EST / Every Tuesday on sprint planning weeks 3:00pm-4:00pm EST
 - **Format:** The product owner and raft product manager will collaborate with the leads to review the backlog and prioritize issues in support of the next release and make sure items are ready for the upcoming Sprint Planning session. By the end of the session, there will be a common understanding of the upcoming priorities and acceptance criteria of refined issues. The refined set of issues should be sent to the sprint board before Sprint Planning. If there are outstanding questions on a particular issue they will be noted within the unrefined issue and remain in the backlog. The agenda and notes can be found in the [Product Notebook](https://hhsgov.sharepoint.com/sites/TANFDataPortalOFA/_layouts/15/Doc.aspx?sourcedoc={cbce2e75-17b2-4e70-b422-60d034fcd4af}&action=edit&wd=target%28Product.one%7Ccfbcc7fb-4b00-4c43-9e29-70bdedd83b98%2FBacklog%20Refinement%7C4ef1b64b-327d-4628-823a-0d1fc5fce6ea%2F%29) within the TDP OneNote.
-
+- **Column Definitions**
+  - New Issues to be Sorted: New issues/tickets that need to be introduced to the core team. The author has the ticket 90% drafted and can either get to 100% with some input from the team during backlog, or the ticket needs significant scoping work but the author's portion is complete for the time being.
+  - Unrefined: Scoping is still needed, but the author has the ticket drafted for all the "known knowns" (not WIP)
+    - Author needs significant information, or developments from other work will significantly influence this ticket
+    - Ticket can be introduced
+    - External factors outside of the author spending time building the ticket (i.e. need an external team's input, see how a feature develops, etc.)
+    - Ex. Waiting on ticket X to finish the scope of the unrefined ticket; a problem was found but we're unsure how big it is and know other work will unearth it
+    - If we know the ACs but not the tasks, then it's unrefined
+  - Refined: Ticket is complete and is ready to be executed.
+  - Refined & Ready to Go (Next Sprint)
+    - "Earmarked" work for the upcoming sprint. 
+- **Labelling:**
+  - WIP
+    - Author knows the 5 W's or darn near (90%)
+    - Drafted ticket – either still on the author to finish their part or a short team conversation is needed.
+  - Administrative in nature
+    - Ex. Stub; a ticket that doesn't feel like there's enough to warrant an introduction
+
 ### Sprint Review
 - **Goal:** To review the work that was completed in the last two weeks and identify work that will roll over into the next sprint cycle.
 - **Attendees:** Core team
@@ -104,7 +121,7 @@ A typical sprint schedule is described in the table below.
 - **Facilitator:** Raft PM/Scrum Master
 - **When:** Every other Tuesday 2:00pm-2:30pm EST, as needed
 - **Format:** This meeting is a formal version of internal demo with an emphasis on demonstrating work to stakeholders outside the product team. This demo should emphasize completed work that has a direct impact to the end user, it should not be a status of the work that has been done. The author of the feature will demo new work and features to the attendees. Attendees can give feedback during the meeting or in async follow-up.
-  - *All demos will be presented by the author of the issue unless the author is unavailable, gives explicit permission, and there is an immediate need to perform the demo.*
+  - *All demos will be presented by the author of the issue unless the author is unavailable, gives explicit permission, and there is an immediate need to perform the demo.*
 
 ### UX Sync
 * **Goal:** A weekly discussion between UX and Product for UX updates on research findings, spec, research strategy, or other pressing priorities.
diff --git a/docs/Sprint-Review/sprint-102-summary.md b/docs/Sprint-Review/sprint-102-summary.md
new file mode 100644
index 000000000..44d135296
--- /dev/null
+++ b/docs/Sprint-Review/sprint-102-summary.md
@@ -0,0 +1,86 @@
+# sprint-102-summary
+
+6/19/2024 - 7/2/2024
+
+**Dev:**
+
+_**Prioritized DAC and Notifications Work**_
+
+* \#2978 — As sys admin, I want to be able to reparse datafile sets
+* \#3002 — \[BUG] Django admin filter not working properly
+* \#1620 — \[SPIKE] As tech lead, I need to know the real-time branches deployed in Cloud.gov spaces
+* \#2687 — As sys admin, I need the Access Request emails to admins to resume
+* \#2792 — \[Error Audit] Category 3 error messages clean-up
+* \#2996 — Add dynamic field name to cat4 error message
+
+**DevOps:**
+
+_**Successful deployments across environments and pipeline stability investments**_
+
+* \#831 — Application health monitoring
+
+**Design:**
+
+_**Support reviews, planning for simplified quarter selection in TDP, email template delivery**_
+
+* \#3014 — Blanked-out values in Submission History (Refinement)
+  * Slated for delivery 6/18/2024 backlog
+* KC Release Notes & FAQ addition is going through final QASP review
+  * Associated A11y review sync w/ Thomas
+  * Walk-on Dear Colleague letter link update to this PR (or spin up a separate ticket if deployment of the letter to OFA's website doesn't align with this)
+* \#3017 — Spike for simplified quarter selection for STTs
+* \#2985 — \[Design deliverable] Email template for stuck file notifications
+
+## Tickets
+
+### Completed/Merged
+
+* [#3021 \[Design Deliverable\] Updated KC Release Notes & Update Indicator FAQ](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3021)
+* [#3008 As a software engineer, I want to be able to test django-admin-508](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3008)
+* 
[#2795 As tech lead, I need TDP to detect duplicate records within a file and not store them in the db. ](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2795)
+* [#2133 \[Dev\] Enhancement for Request Access form (Tribe discoverability) ](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2133)
+* [#831 \[Spike\] As a Tech Lead, I want to get alerts when there is a backend or frontend error that affects an STT user ](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/831)
+* [#3023 as STT approved user, I need my IP address whitelisted so i can access TDP](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3023)
+* [#2491 Create root-level docker-compose configuration file(s)](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2491)
+
+### Submitted (QASP Review, OCIO Review)
+
+* [#2473 As a data analyst I want to be notified of approaching data deadlines](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2473)
+* [#2693 \[Error Audit\] Category 2 error messages clean-up ](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2693)
+* [#2801 Friendly name cleanup](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2801)
+* [#2883 Pre-Made Reporting Dashboards on Kibana](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2883)
+* [#2896 TDRS Parity Tracker](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2896)
+* [#2950 As tech lead, I need the STT filter for search\_indexes to be updated](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2950)
+* [#2954 Extend SESSION\_COOKIE\_AGE](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2954)
+
+### Ready to Merge
+
+*
+
+### Closed (Not Merged)
+
+* [#3000 \[Design Deliverable\] TDP Poster for summer 2024 conferences](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3000)
+
+### Moved to Next Sprint
+
+**In Progress**
+
+* [#1620 \[SPIKE\] As tech lead, I need to know the real-time branches deployed in Cloud.gov spaces](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/1620)
+* [#2792 \[Error Audit\] Category 3 error messages clean-up](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2792)
+* [#3004 Implement (small) data lifecycle (backup/archive ES)](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3004)
+* [#3022 Spike - Continue Zap Sleep Investigation](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3022)
+
+#### Blocked
+
+*
+
+**Raft Review**
+
+* [#2687 As sys admin, I need the Access Request emails to admins to 
resume](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2687)
+* [#2985 \[Design Deliverable\] Email spec for Admin Notification for stuck files](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2985)
+* [#2996 Add dynamic field name to cat4 error messages](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2996)
+* [#3002 \[BUG\] Django admin filter not working properly](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3002)
+* [#3016 Spike - Cat2 Validator Improvement](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3016)
+* [#3017 Spike - As an STT user I need better guidance on selecting the appropriate fiscal period to submit my quarterly files](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3017)
+* [#3025 As an STT user, I need an accurate error report when I space-fill `COUNTY_FIPS_CODE`](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3025)
diff --git a/docs/Sprint-Review/sprint-103-summary.md b/docs/Sprint-Review/sprint-103-summary.md
new file mode 100644
index 000000000..ae45411e0
--- /dev/null
+++ b/docs/Sprint-Review/sprint-103-summary.md
@@ -0,0 +1,81 @@
+# sprint-103-summary
+
+7/03/2024 - 7/16/2024
+
+**Dev:**
+
+_**Prioritized DAC, improved dev tooling, and bug fixes**_
+
+* \#1621 — As a TDP user, I'd like to see a descriptive error message page if authentication source is unavailable
+* \#2687 — As sys admin, I need the Access Request emails to admins to resume
+* \#2792 — \[Error Audit] Category 3 error messages clean-up
+* \#3027 — \[Bug] Investigate codecov failure in build-and-test workflow
+* \#3004 — Implement (small) data lifecycle (backup/archive ES)
+
+**DevOps:**
+
+_**Successful deployments across environments and pipeline stability investments**_
+
+* \#1620 — \[SPIKE] As tech lead, I need to know the real-time branches deployed in Cloud.gov spaces
+
+**Design:**
+
+_**Support reviews, Django Admin Experience epic research, email template delivery**_
+
+* \#2910 — Django Admin Experience Improvements Research Session
+* \#3057 — \[Design Deliverable] Spec for light-lift fiscal quarter / calendar quarter explainer in TDP
+* \#3058 (stretch) — \[Design Deliverable] Release notes email template
+* \#2968 (stretch pending Cat 3 completion) — Update Error Audit for Cat 4 / QA
+
+## Tickets
+
+### Completed/Merged
+
+* [#2693 \[Error Audit\] Category 2 error messages clean-up ](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2693)
+* [#3025 As an STT user, I need an accurate error report when I space-fill `COUNTY_FIPS_CODE`](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3025)
+* [#2857 Upgrade Postgres DB from version 12 to version 15](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2857)
+* [#2950 As tech lead, I need the STT filter for search\_indexes to be updated](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2950)
+* [#3002 \[BUG\] Django admin filter not working properly](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3002)
+* 
[#3027 Bug - investigate codecov failure in build-and-test workflow](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3027)
+* [#3017 Spike - As an STT user I need better guidance on selecting the appropriate fiscal period to submit my quarterly files](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3017)
+* [#3022 Spike - Continue Zap Sleep Investigation](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3022)
+* [#2473 As a data analyst I want to be notified of approaching data deadlines](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2473)
+* [#2687 As sys admin, I need the Access Request emails to admins to resume](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2687)
+* [#2801 Friendly name cleanup](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2801)
+
+### Submitted (QASP Review, OCIO Review)
+
+* [#1620 \[SPIKE\] As tech lead, I need to know the real-time branches deployed in Cloud.gov spaces](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/1620)
+* [#3058 \[Design Deliverable\] Release notes email template](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3058)
+* [#2985 \[Design Deliverable\] Email spec for Admin Notification for stuck files](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2985)
+* [#2883 Pre-Made Reporting Dashboards on Kibana](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2883)
+* [#3004 Implement (small) data lifecycle (backup/archive ES)](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3004)
+* [#2954 Extend SESSION\_COOKIE\_AGE](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2954)
+* [#3016 Spike - Cat2 Validator Improvement](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3016)
+  * approved during sprint 103, but included in the sprint 102 release
+
+### Ready to Merge
+
+*
+
+### Closed (Not Merged)
+
+*
+
+### Moved to Next Sprint
+
+**In Progress**
+
+* [#3059 Bug: file stuck in pending state when DOB or SSN field is space-filled](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3059)
+* [#3055 Service timeout blocks parsing completion](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/3055)
+* [#2792 \[Error Audit\] Category 3 error messages clean-up](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2792)
+* [#2910 \[Research Facilitation\] Admin Experience Improvements](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2910)
+
+#### Blocked
+
+*
+
+**Raft Review**
+
+* [#1621 As a TDP user, I'd like to see a descriptive error message page if authentication source is unavailable.](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/1621)
+* [#2996 Add dynamic field name to cat4 error 
messages](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2996) diff --git a/docs/Technical-Documentation/TDP-environments-README.md b/docs/Technical-Documentation/TDP-environments-README.md index 214c5183a..01b7fe477 100644 --- a/docs/Technical-Documentation/TDP-environments-README.md +++ b/docs/Technical-Documentation/TDP-environments-README.md @@ -6,8 +6,7 @@ | -------- | -------- | -------- | -------- | -------- | | A11y | https://tdp-frontend-a11y.app.cloud.gov | https://tdp-frontend-a11y.app.cloud.gov/admin/ | | Space for accessibility testing | | QASP | https://tdp-frontend-qasp.app.cloud.gov | https://tdp-frontend-qasp.app.cloud.gov/admin/ | | Space for QASP review | -| raft | https://tdp-frontend-raft.app.cloud.gov | https://tdp-frontend-raft.app.cloud.gov/admin/ | - | Space for raft review | +| raft | https://tdp-frontend-raft.app.cloud.gov | https://tdp-frontend-raft.app.cloud.gov/admin/ | | Space for raft review | ![badge](https://img.shields.io/endpoint?url=https://gist.githubusercontent.com/andrew-jameson/ded3a260ed8245a5b231ba726b3039df/raw/Live-Environments-raft.json) diff --git a/scripts/deploy-backend.sh b/scripts/deploy-backend.sh index 24bef90d9..ebbce8243 100755 --- a/scripts/deploy-backend.sh +++ b/scripts/deploy-backend.sh @@ -1,7 +1,7 @@ #!/bin/bash ############################## -# Global Variable Decls +# Global Variable Decls ############################## # The deployment strategy you wish to employ ( rolling update or setting up a new environment) @@ -77,7 +77,7 @@ set_cf_envs() else cf_cmd="cf set-env $CGAPPNAME_BACKEND $var_name ${!var_name}" fi - + echo "Setting var : $var_name" $cf_cmd done @@ -85,7 +85,7 @@ set_cf_envs() } # Helper method to generate JWT cert and keys for new environment -generate_jwt_cert() +generate_jwt_cert() { echo "regenerating JWT cert/key" yes 'XX' | openssl req -x509 -newkey rsa:4096 -keyout key.pem -out cert.pem -days 365 -nodes -sha256 @@ -94,7 +94,7 @@ generate_jwt_cert() } update_kibana() -{ +{ # Add network policy allowing Kibana to talk to the proxy and to allow the backend to talk to Kibana cf add-network-policy "$CGAPPNAME_BACKEND" "$CGAPPNAME_KIBANA" --protocol tcp --port 5601 cf add-network-policy "$CGAPPNAME_FRONTEND" "$CGAPPNAME_KIBANA" --protocol tcp --port 5601 @@ -105,12 +105,16 @@ update_backend() { cd tdrs-backend || exit cf unset-env "$CGAPPNAME_BACKEND" "AV_SCAN_URL" - + if [ "$CF_SPACE" = "tanf-prod" ]; then cf set-env "$CGAPPNAME_BACKEND" AV_SCAN_URL "http://tanf-prod-clamav-rest.apps.internal:9000/scan" else # Add environment varilables for clamav cf set-env "$CGAPPNAME_BACKEND" AV_SCAN_URL "http://tdp-clamav-nginx-$env.apps.internal:9000/scan" + + # Add variable for dev/staging apps to know their DB name. Prod uses default AWS name. 
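+        # ($backend_app_name is assumed to be set elsewhere in this script; as an
+        # illustrative example, a value of "tdp-backend-develop" would yield
+        # APP_DB_NAME="tdp_db_tdp-backend-develop". Unsetting first clears any
+        # previously set value before re-setting it.)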
+ cf unset-env "$CGAPPNAME_BACKEND" "APP_DB_NAME" + cf set-env "$CGAPPNAME_BACKEND" "APP_DB_NAME" "tdp_db_$backend_app_name" fi if [ "$1" = "rolling" ] ; then @@ -129,12 +133,12 @@ update_backend() fi set_cf_envs - + cf map-route "$CGAPPNAME_BACKEND" apps.internal --hostname "$CGAPPNAME_BACKEND" # Add network policy to allow frontend to access backend cf add-network-policy "$CGAPPNAME_FRONTEND" "$CGAPPNAME_BACKEND" --protocol tcp --port 8080 - + if [ "$CF_SPACE" = "tanf-prod" ]; then # Add network policy to allow backend to access tanf-prod services cf add-network-policy "$CGAPPNAME_BACKEND" clamav-rest --protocol tcp --port 9000 @@ -149,7 +153,7 @@ bind_backend_to_services() { echo "Binding services to app: $CGAPPNAME_BACKEND" if [ "$CGAPPNAME_BACKEND" = "tdp-backend-develop" ]; then - # TODO: this is technical debt, we should either make staging mimic tanf-dev + # TODO: this is technical debt, we should either make staging mimic tanf-dev # or make unique services for all apps but we have a services limit # Introducing technical debt for release 3.0.0 specifically. env="develop" @@ -158,10 +162,10 @@ bind_backend_to_services() { cf bind-service "$CGAPPNAME_BACKEND" "tdp-staticfiles-${env}" cf bind-service "$CGAPPNAME_BACKEND" "tdp-datafiles-${env}" cf bind-service "$CGAPPNAME_BACKEND" "tdp-db-${env}" - + # Setting up the ElasticSearch service cf bind-service "$CGAPPNAME_BACKEND" "es-${env}" - + set_cf_envs echo "Restarting app: $CGAPPNAME_BACKEND" diff --git a/tdrs-backend/Dockerfile b/tdrs-backend/Dockerfile index 9543f4b8d..e8233528d 100644 --- a/tdrs-backend/Dockerfile +++ b/tdrs-backend/Dockerfile @@ -20,7 +20,7 @@ curl -o /usr/share/postgresql-common/pgdg/apt.postgresql.org.asc --fail https:// sh -c 'echo "deb [signed-by=/usr/share/postgresql-common/pgdg/apt.postgresql.org.asc] https://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list' && \ apt -y update && apt install postgresql-client-15 -y # Install packages: -RUN apt install -y gcc graphviz graphviz-dev libpq-dev python3-dev +RUN apt install -y gcc graphviz graphviz-dev libpq-dev python3-dev vim # Install pipenv RUN pip install --upgrade pip pipenv RUN pipenv install --dev --system --deploy diff --git a/tdrs-backend/Pipfile b/tdrs-backend/Pipfile index 51a998b7e..6e3775877 100644 --- a/tdrs-backend/Pipfile +++ b/tdrs-backend/Pipfile @@ -26,13 +26,12 @@ boto3 = "==1.28.4" cryptography = "==3.4.7" dj-database-url = "==0.5.0" django = "==3.2.15" -django-admin-508 = "==0.2.2" +django-admin-508 = "==1.0.1" django-admin-logs = "==1.0.2" django-configurations = "==2.2" django-cors-headers = "==3.12.0" django-extensions = "==3.1.3" django-filter = "==21.1" -django-more-admin-filters = "==1.8" django-model-utils = "==4.1.1" django-storages = "==1.12.3" django-unique-upload = "==0.2.1" diff --git a/tdrs-backend/Pipfile.lock b/tdrs-backend/Pipfile.lock index 0ca355085..7b054c8b7 100644 --- a/tdrs-backend/Pipfile.lock +++ b/tdrs-backend/Pipfile.lock @@ -1,7 +1,7 @@ { "_meta": { "hash": { - "sha256": "2dd2adca467bcb7a6281923765737b5b0b52101a30efc80401e5552109874674" + "sha256": "80bf15489b1a4a07f3711904a66fe19188e49eaa58dbd920d20bf4432dcd5518" }, "pipfile-spec": 6, "requires": { @@ -83,11 +83,11 @@ }, "certifi": { "hashes": [ - "sha256:3cd43f1c6fa7dedc5899d69d3ad0398fd018ad1a17fba83ddaf78aa46c747516", - "sha256:ddc6c8ce995e6987e7faf5e3f1b02b302836a0e5d98ece18392cb1a36c72ad56" + "sha256:5a1e7645bc0ec61a09e26c36f6106dd4cf40c6db3a1fb6352b0244e7fb057c7b", + 
"sha256:c198e21b1289c2ab85ee4e67bb4b4ef3ead0892059901a8d5b622f24a1101e90" ], "markers": "python_version >= '3.6'", - "version": "==2024.6.2" + "version": "==2024.7.4" }, "cffi": { "hashes": [ @@ -256,11 +256,11 @@ }, "django-admin-508": { "hashes": [ - "sha256:6488ce76cbccecb1667ee21d49e87a259d43f7a619b18e7035c9e6bdf1c79bb3", - "sha256:fd7ed03e27efaa5b33aa47c4d82ae540a7c42957504061854fc76c046bca8607" + "sha256:419d017eab16c264b771c8c7ef1815c1c181cf4a1603b7e45cf78a3bbecb1d4a", + "sha256:fbc7bb8bc37f4c2089efceda9818a97898881ab80273919248f85cd3d6f01215" ], "index": "pypi", - "version": "==0.2.2" + "version": "==1.0.1" }, "django-admin-logs": { "hashes": [ @@ -366,14 +366,6 @@ "index": "pypi", "version": "==4.1.1" }, - "django-more-admin-filters": { - "hashes": [ - "sha256:2d5dd9e8b55d85638d5e260dfb694b1903288b61c37e655b9443b70a5f36833f", - "sha256:fc4d3a3bf0367763a887dceca4b469e467ad062a9e8da1c29b6d6137c5b0e3cd" - ], - "index": "pypi", - "version": "==1.8" - }, "django-nine": { "hashes": [ "sha256:304e0f83cea5a35359375fc919d00f9917b655c1d388244cbfc7363f59489177", @@ -451,11 +443,11 @@ }, "exceptiongroup": { "hashes": [ - "sha256:5258b9ed329c5bbdd31a309f53cbfb0b155341807f6ff7606a1e801a891b29ad", - "sha256:a4785e48b045528f5bfe627b6ad554ff32def154f42372786903b7abcfe1aa16" + "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b", + "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc" ], "markers": "python_version < '3.11'", - "version": "==1.2.1" + "version": "==1.2.2" }, "executing": { "hashes": [ @@ -484,11 +476,11 @@ }, "humanize": { "hashes": [ - "sha256:582a265c931c683a7e9b8ed9559089dea7edcf6cc95be39a3cbc2c5d5ac2bcfa", - "sha256:ce284a76d5b1377fd8836733b983bfb0b76f1aa1c090de2566fcf008d7f6ab16" + "sha256:06b6eb0293e4b85e8d385397c5868926820db32b9b654b932f57fa41c23c9978", + "sha256:39e7ccb96923e732b5c2e27aeaa3b10a8dfeeba3eb965ba7b74a3eb0e30040a6" ], "markers": "python_version >= '3.8'", - "version": "==4.9.0" + "version": "==4.10.0" }, "idna": { "hashes": [ @@ -516,11 +508,11 @@ }, "ipython": { "hashes": [ - "sha256:53eee7ad44df903a06655871cbab66d156a051fd86f3ec6750470ac9604ac1ab", - "sha256:c6ed726a140b6e725b911528f80439c534fac915246af3efc39440a6b0f9d716" + "sha256:1cec0fbba8404af13facebe83d04436a7434c7400e59f47acf467c64abd0956c", + "sha256:e6b347c27bdf9c32ee9d31ae85defc525755a1869f14057e900675b9e8d6e6ff" ], "markers": "python_version >= '3.7'", - "version": "==8.25.0" + "version": "==8.26.0" }, "itypes": { "hashes": [ @@ -687,78 +679,89 @@ }, "pillow": { "hashes": [ - "sha256:048ad577748b9fa4a99a0548c64f2cb8d672d5bf2e643a739ac8faff1164238c", - "sha256:048eeade4c33fdf7e08da40ef402e748df113fd0b4584e32c4af74fe78baaeb2", - "sha256:0ba26351b137ca4e0db0342d5d00d2e355eb29372c05afd544ebf47c0956ffeb", - "sha256:0ea2a783a2bdf2a561808fe4a7a12e9aa3799b701ba305de596bc48b8bdfce9d", - "sha256:1530e8f3a4b965eb6a7785cf17a426c779333eb62c9a7d1bbcf3ffd5bf77a4aa", - "sha256:16563993329b79513f59142a6b02055e10514c1a8e86dca8b48a893e33cf91e3", - "sha256:19aeb96d43902f0a783946a0a87dbdad5c84c936025b8419da0a0cd7724356b1", - "sha256:1a1d1915db1a4fdb2754b9de292642a39a7fb28f1736699527bb649484fb966a", - "sha256:1b87bd9d81d179bd8ab871603bd80d8645729939f90b71e62914e816a76fc6bd", - "sha256:1dfc94946bc60ea375cc39cff0b8da6c7e5f8fcdc1d946beb8da5c216156ddd8", - "sha256:2034f6759a722da3a3dbd91a81148cf884e91d1b747992ca288ab88c1de15999", - "sha256:261ddb7ca91fcf71757979534fb4c128448b5b4c55cb6152d280312062f69599", - "sha256:2ed854e716a89b1afcedea551cd85f2eb2a807613752ab997b9974aaa0d56936", - 
"sha256:3102045a10945173d38336f6e71a8dc71bcaeed55c3123ad4af82c52807b9375", - "sha256:339894035d0ede518b16073bdc2feef4c991ee991a29774b33e515f1d308e08d", - "sha256:412444afb8c4c7a6cc11a47dade32982439925537e483be7c0ae0cf96c4f6a0b", - "sha256:4203efca580f0dd6f882ca211f923168548f7ba334c189e9eab1178ab840bf60", - "sha256:45ebc7b45406febf07fef35d856f0293a92e7417ae7933207e90bf9090b70572", - "sha256:4b5ec25d8b17217d635f8935dbc1b9aa5907962fae29dff220f2659487891cd3", - "sha256:4c8e73e99da7db1b4cad7f8d682cf6abad7844da39834c288fbfa394a47bbced", - "sha256:4e6f7d1c414191c1199f8996d3f2282b9ebea0945693fb67392c75a3a320941f", - "sha256:4eaa22f0d22b1a7e93ff0a596d57fdede2e550aecffb5a1ef1106aaece48e96b", - "sha256:50b8eae8f7334ec826d6eeffaeeb00e36b5e24aa0b9df322c247539714c6df19", - "sha256:50fd3f6b26e3441ae07b7c979309638b72abc1a25da31a81a7fbd9495713ef4f", - "sha256:51243f1ed5161b9945011a7360e997729776f6e5d7005ba0c6879267d4c5139d", - "sha256:5d512aafa1d32efa014fa041d38868fda85028e3f930a96f85d49c7d8ddc0383", - "sha256:5f77cf66e96ae734717d341c145c5949c63180842a545c47a0ce7ae52ca83795", - "sha256:6b02471b72526ab8a18c39cb7967b72d194ec53c1fd0a70b050565a0f366d355", - "sha256:6fb1b30043271ec92dc65f6d9f0b7a830c210b8a96423074b15c7bc999975f57", - "sha256:7161ec49ef0800947dc5570f86568a7bb36fa97dd09e9827dc02b718c5643f09", - "sha256:72d622d262e463dfb7595202d229f5f3ab4b852289a1cd09650362db23b9eb0b", - "sha256:74d28c17412d9caa1066f7a31df8403ec23d5268ba46cd0ad2c50fb82ae40462", - "sha256:78618cdbccaa74d3f88d0ad6cb8ac3007f1a6fa5c6f19af64b55ca170bfa1edf", - "sha256:793b4e24db2e8742ca6423d3fde8396db336698c55cd34b660663ee9e45ed37f", - "sha256:798232c92e7665fe82ac085f9d8e8ca98826f8e27859d9a96b41d519ecd2e49a", - "sha256:81d09caa7b27ef4e61cb7d8fbf1714f5aec1c6b6c5270ee53504981e6e9121ad", - "sha256:8ab74c06ffdab957d7670c2a5a6e1a70181cd10b727cd788c4dd9005b6a8acd9", - "sha256:8eb0908e954d093b02a543dc963984d6e99ad2b5e36503d8a0aaf040505f747d", - "sha256:90b9e29824800e90c84e4022dd5cc16eb2d9605ee13f05d47641eb183cd73d45", - "sha256:9797a6c8fe16f25749b371c02e2ade0efb51155e767a971c61734b1bf6293994", - "sha256:9d2455fbf44c914840c793e89aa82d0e1763a14253a000743719ae5946814b2d", - "sha256:9d3bea1c75f8c53ee4d505c3e67d8c158ad4df0d83170605b50b64025917f338", - "sha256:9e2ec1e921fd07c7cda7962bad283acc2f2a9ccc1b971ee4b216b75fad6f0463", - "sha256:9e91179a242bbc99be65e139e30690e081fe6cb91a8e77faf4c409653de39451", - "sha256:a0eaa93d054751ee9964afa21c06247779b90440ca41d184aeb5d410f20ff591", - "sha256:a2c405445c79c3f5a124573a051062300936b0281fee57637e706453e452746c", - "sha256:aa7e402ce11f0885305bfb6afb3434b3cd8f53b563ac065452d9d5654c7b86fd", - "sha256:aff76a55a8aa8364d25400a210a65ff59d0168e0b4285ba6bf2bd83cf675ba32", - "sha256:b09b86b27a064c9624d0a6c54da01c1beaf5b6cadfa609cf63789b1d08a797b9", - "sha256:b14f16f94cbc61215115b9b1236f9c18403c15dd3c52cf629072afa9d54c1cbf", - "sha256:b50811d664d392f02f7761621303eba9d1b056fb1868c8cdf4231279645c25f5", - "sha256:b7bc2176354defba3edc2b9a777744462da2f8e921fbaf61e52acb95bafa9828", - "sha256:c78e1b00a87ce43bb37642c0812315b411e856a905d58d597750eb79802aaaa3", - "sha256:c83341b89884e2b2e55886e8fbbf37c3fa5efd6c8907124aeb72f285ae5696e5", - "sha256:ca2870d5d10d8726a27396d3ca4cf7976cec0f3cb706debe88e3a5bd4610f7d2", - "sha256:ccce24b7ad89adb5a1e34a6ba96ac2530046763912806ad4c247356a8f33a67b", - "sha256:cd5e14fbf22a87321b24c88669aad3a51ec052eb145315b3da3b7e3cc105b9a2", - "sha256:ce49c67f4ea0609933d01c0731b34b8695a7a748d6c8d186f95e7d085d2fe475", - "sha256:d33891be6df59d93df4d846640f0e46f1a807339f09e79a8040bc887bdcd7ed3", - 
"sha256:d3b2348a78bc939b4fed6552abfd2e7988e0f81443ef3911a4b8498ca084f6eb", - "sha256:d886f5d353333b4771d21267c7ecc75b710f1a73d72d03ca06df49b09015a9ef", - "sha256:d93480005693d247f8346bc8ee28c72a2191bdf1f6b5db469c096c0c867ac015", - "sha256:dc1a390a82755a8c26c9964d457d4c9cbec5405896cba94cf51f36ea0d855002", - "sha256:dd78700f5788ae180b5ee8902c6aea5a5726bac7c364b202b4b3e3ba2d293170", - "sha256:e46f38133e5a060d46bd630faa4d9fa0202377495df1f068a8299fd78c84de84", - "sha256:e4b878386c4bf293578b48fc570b84ecfe477d3b77ba39a6e87150af77f40c57", - "sha256:f0d0591a0aeaefdaf9a5e545e7485f89910c977087e7de2b6c388aec32011e9f", - "sha256:fdcbb4068117dfd9ce0138d068ac512843c52295ed996ae6dd1faf537b6dbc27", - "sha256:ff61bfd9253c3915e6d41c651d5f962da23eda633cf02262990094a18a55371a" + "sha256:02a2be69f9c9b8c1e97cf2713e789d4e398c751ecfd9967c18d0ce304efbf885", + "sha256:030abdbe43ee02e0de642aee345efa443740aa4d828bfe8e2eb11922ea6a21ea", + "sha256:06b2f7898047ae93fad74467ec3d28fe84f7831370e3c258afa533f81ef7f3df", + "sha256:0755ffd4a0c6f267cccbae2e9903d95477ca2f77c4fcf3a3a09570001856c8a5", + "sha256:0a9ec697746f268507404647e531e92889890a087e03681a3606d9b920fbee3c", + "sha256:0ae24a547e8b711ccaaf99c9ae3cd975470e1a30caa80a6aaee9a2f19c05701d", + "sha256:134ace6dc392116566980ee7436477d844520a26a4b1bd4053f6f47d096997fd", + "sha256:166c1cd4d24309b30d61f79f4a9114b7b2313d7450912277855ff5dfd7cd4a06", + "sha256:1b5dea9831a90e9d0721ec417a80d4cbd7022093ac38a568db2dd78363b00908", + "sha256:1d846aea995ad352d4bdcc847535bd56e0fd88d36829d2c90be880ef1ee4668a", + "sha256:1ef61f5dd14c300786318482456481463b9d6b91ebe5ef12f405afbba77ed0be", + "sha256:297e388da6e248c98bc4a02e018966af0c5f92dfacf5a5ca22fa01cb3179bca0", + "sha256:298478fe4f77a4408895605f3482b6cc6222c018b2ce565c2b6b9c354ac3229b", + "sha256:29dbdc4207642ea6aad70fbde1a9338753d33fb23ed6956e706936706f52dd80", + "sha256:2db98790afc70118bd0255c2eeb465e9767ecf1f3c25f9a1abb8ffc8cfd1fe0a", + "sha256:32cda9e3d601a52baccb2856b8ea1fc213c90b340c542dcef77140dfa3278a9e", + "sha256:37fb69d905be665f68f28a8bba3c6d3223c8efe1edf14cc4cfa06c241f8c81d9", + "sha256:416d3a5d0e8cfe4f27f574362435bc9bae57f679a7158e0096ad2beb427b8696", + "sha256:43efea75eb06b95d1631cb784aa40156177bf9dd5b4b03ff38979e048258bc6b", + "sha256:4b35b21b819ac1dbd1233317adeecd63495f6babf21b7b2512d244ff6c6ce309", + "sha256:4d9667937cfa347525b319ae34375c37b9ee6b525440f3ef48542fcf66f2731e", + "sha256:5161eef006d335e46895297f642341111945e2c1c899eb406882a6c61a4357ab", + "sha256:543f3dc61c18dafb755773efc89aae60d06b6596a63914107f75459cf984164d", + "sha256:551d3fd6e9dc15e4c1eb6fc4ba2b39c0c7933fa113b220057a34f4bb3268a060", + "sha256:59291fb29317122398786c2d44427bbd1a6d7ff54017075b22be9d21aa59bd8d", + "sha256:5b001114dd152cfd6b23befeb28d7aee43553e2402c9f159807bf55f33af8a8d", + "sha256:5b4815f2e65b30f5fbae9dfffa8636d992d49705723fe86a3661806e069352d4", + "sha256:5dc6761a6efc781e6a1544206f22c80c3af4c8cf461206d46a1e6006e4429ff3", + "sha256:5e84b6cc6a4a3d76c153a6b19270b3526a5a8ed6b09501d3af891daa2a9de7d6", + "sha256:6209bb41dc692ddfee4942517c19ee81b86c864b626dbfca272ec0f7cff5d9fb", + "sha256:673655af3eadf4df6b5457033f086e90299fdd7a47983a13827acf7459c15d94", + "sha256:6c762a5b0997f5659a5ef2266abc1d8851ad7749ad9a6a5506eb23d314e4f46b", + "sha256:7086cc1d5eebb91ad24ded9f58bec6c688e9f0ed7eb3dbbf1e4800280a896496", + "sha256:73664fe514b34c8f02452ffb73b7a92c6774e39a647087f83d67f010eb9a0cf0", + "sha256:76a911dfe51a36041f2e756b00f96ed84677cdeb75d25c767f296c1c1eda1319", + "sha256:780c072c2e11c9b2c7ca37f9a2ee8ba66f44367ac3e5c7832afcfe5104fd6d1b", + 
"sha256:7928ecbf1ece13956b95d9cbcfc77137652b02763ba384d9ab508099a2eca856", + "sha256:7970285ab628a3779aecc35823296a7869f889b8329c16ad5a71e4901a3dc4ef", + "sha256:7a8d4bade9952ea9a77d0c3e49cbd8b2890a399422258a77f357b9cc9be8d680", + "sha256:7c1ee6f42250df403c5f103cbd2768a28fe1a0ea1f0f03fe151c8741e1469c8b", + "sha256:7dfecdbad5c301d7b5bde160150b4db4c659cee2b69589705b6f8a0c509d9f42", + "sha256:812f7342b0eee081eaec84d91423d1b4650bb9828eb53d8511bcef8ce5aecf1e", + "sha256:866b6942a92f56300012f5fbac71f2d610312ee65e22f1aa2609e491284e5597", + "sha256:86dcb5a1eb778d8b25659d5e4341269e8590ad6b4e8b44d9f4b07f8d136c414a", + "sha256:87dd88ded2e6d74d31e1e0a99a726a6765cda32d00ba72dc37f0651f306daaa8", + "sha256:8bc1a764ed8c957a2e9cacf97c8b2b053b70307cf2996aafd70e91a082e70df3", + "sha256:8d4d5063501b6dd4024b8ac2f04962d661222d120381272deea52e3fc52d3736", + "sha256:8f0aef4ef59694b12cadee839e2ba6afeab89c0f39a3adc02ed51d109117b8da", + "sha256:930044bb7679ab003b14023138b50181899da3f25de50e9dbee23b61b4de2126", + "sha256:950be4d8ba92aca4b2bb0741285a46bfae3ca699ef913ec8416c1b78eadd64cd", + "sha256:961a7293b2457b405967af9c77dcaa43cc1a8cd50d23c532e62d48ab6cdd56f5", + "sha256:9b885f89040bb8c4a1573566bbb2f44f5c505ef6e74cec7ab9068c900047f04b", + "sha256:9f4727572e2918acaa9077c919cbbeb73bd2b3ebcfe033b72f858fc9fbef0026", + "sha256:a02364621fe369e06200d4a16558e056fe2805d3468350df3aef21e00d26214b", + "sha256:a985e028fc183bf12a77a8bbf36318db4238a3ded7fa9df1b9a133f1cb79f8fc", + "sha256:ac1452d2fbe4978c2eec89fb5a23b8387aba707ac72810d9490118817d9c0b46", + "sha256:b15e02e9bb4c21e39876698abf233c8c579127986f8207200bc8a8f6bb27acf2", + "sha256:b2724fdb354a868ddf9a880cb84d102da914e99119211ef7ecbdc613b8c96b3c", + "sha256:bbc527b519bd3aa9d7f429d152fea69f9ad37c95f0b02aebddff592688998abe", + "sha256:bcd5e41a859bf2e84fdc42f4edb7d9aba0a13d29a2abadccafad99de3feff984", + "sha256:bd2880a07482090a3bcb01f4265f1936a903d70bc740bfcb1fd4e8a2ffe5cf5a", + "sha256:bee197b30783295d2eb680b311af15a20a8b24024a19c3a26431ff83eb8d1f70", + "sha256:bf2342ac639c4cf38799a44950bbc2dfcb685f052b9e262f446482afaf4bffca", + "sha256:c76e5786951e72ed3686e122d14c5d7012f16c8303a674d18cdcd6d89557fc5b", + "sha256:cbed61494057c0f83b83eb3a310f0bf774b09513307c434d4366ed64f4128a91", + "sha256:cfdd747216947628af7b259d274771d84db2268ca062dd5faf373639d00113a3", + "sha256:d7480af14364494365e89d6fddc510a13e5a2c3584cb19ef65415ca57252fb84", + "sha256:dbc6ae66518ab3c5847659e9988c3b60dc94ffb48ef9168656e0019a93dbf8a1", + "sha256:dc3e2db6ba09ffd7d02ae9141cfa0ae23393ee7687248d46a7507b75d610f4f5", + "sha256:dfe91cb65544a1321e631e696759491ae04a2ea11d36715eca01ce07284738be", + "sha256:e4d49b85c4348ea0b31ea63bc75a9f3857869174e2bf17e7aba02945cd218e6f", + "sha256:e4db64794ccdf6cb83a59d73405f63adbe2a1887012e308828596100a0b2f6cc", + "sha256:e553cad5179a66ba15bb18b353a19020e73a7921296a7979c4a2b7f6a5cd57f9", + "sha256:e88d5e6ad0d026fba7bdab8c3f225a69f063f116462c49892b0149e21b6c0a0e", + "sha256:ecd85a8d3e79cd7158dec1c9e5808e821feea088e2f69a974db5edf84dc53141", + "sha256:f5b92f4d70791b4a67157321c4e8225d60b119c5cc9aee8ecf153aace4aad4ef", + "sha256:f5f0c3e969c8f12dd2bb7e0b15d5c468b51e5017e01e2e867335c81903046a22", + "sha256:f7baece4ce06bade126fb84b8af1c33439a76d8a6fd818970215e0560ca28c27", + "sha256:ff25afb18123cea58a591ea0244b92eb1e61a1fd497bf6d6384f09bc3262ec3e", + "sha256:ff337c552345e95702c5fde3158acb0625111017d0e5f24bf3acdb9cc16b90d1" ], "markers": "python_version >= '3.8'", - "version": "==10.3.0" + "version": "==10.4.0" }, "prometheus-client": { "hashes": [ @@ -805,10 +808,10 @@ }, "pure-eval": { 
"hashes": [ - "sha256:01eaab343580944bc56080ebe0a674b39ec44a945e6d09ba7db3cb8cec289350", - "sha256:2b45320af6dfaa1750f543d714b6d1c520a1688dec6fd24d339063ce0aaa9ac3" + "sha256:1db8e35b67b3d218d818ae653e27f06c3aa420901fa7b081ca98cbedc874e0d0", + "sha256:5f4e983f40564c576c7c8635ae88db5956bb2229d7e9237d03b3c0b0190eaf42" ], - "version": "==0.2.2" + "version": "==0.2.3" }, "pycparser": { "hashes": [ @@ -837,9 +840,10 @@ }, "python-crontab": { "hashes": [ - "sha256:f4ea1605d24533b67fa7a634ef26cb59a5f2e7954f6e677d2d7a2229959a2fc8" + "sha256:40067d1dd39ade3460b2ad8557c7651514cd3851deffff61c5c60e1227c5c36b", + "sha256:82cb9b6a312d41ff66fd3caf3eed7115c28c195bfb50711bc2b4b9592feb9fe5" ], - "version": "==3.1.0" + "version": "==3.2.0" }, "python-dateutil": { "hashes": [ @@ -982,11 +986,11 @@ }, "setuptools": { "hashes": [ - "sha256:01a1e793faa5bd89abc851fa15d0a0db26f160890c7102cd8dce643e886b47f5", - "sha256:d9b8b771455a97c8a9f3ab3448ebe0b29b5e105f1228bba41028be116985a267" + "sha256:032d42ee9fb536e33087fb66cac5f840eb9391ed05637b3f2a76a7c8fb477936", + "sha256:33874fdc59b3188304b2e7c80d9029097ea31627180896fb549c578ceb8a0855" ], "markers": "python_version >= '3.8'", - "version": "==70.1.0" + "version": "==71.1.0" }, "six": { "hashes": [ @@ -998,11 +1002,11 @@ }, "sqlparse": { "hashes": [ - "sha256:714d0a4932c059d16189f58ef5411ec2287a4360f17cdd0edd2d09d4c5087c93", - "sha256:c204494cd97479d0e39f28c93d46c0b2d5959c7b9ab904762ea6c7af211c8663" + "sha256:773dcbf9a5ab44a090f3441e2180efe2560220203dc2f8c0b0fa141e18b505e4", + "sha256:bb6b4df465655ef332548e24f08e205afc81b9ab86cb1c45657a7ff173a3a00e" ], "markers": "python_version >= '3.8'", - "version": "==0.5.0" + "version": "==0.5.1" }, "stack-data": { "hashes": [ @@ -1240,61 +1244,61 @@ "toml" ], "hashes": [ - "sha256:015eddc5ccd5364dcb902eaecf9515636806fa1e0d5bef5769d06d0f31b54523", - "sha256:04aefca5190d1dc7a53a4c1a5a7f8568811306d7a8ee231c42fb69215571944f", - "sha256:05ac5f60faa0c704c0f7e6a5cbfd6f02101ed05e0aee4d2822637a9e672c998d", - "sha256:0bbddc54bbacfc09b3edaec644d4ac90c08ee8ed4844b0f86227dcda2d428fcb", - "sha256:1d2a830ade66d3563bb61d1e3c77c8def97b30ed91e166c67d0632c018f380f0", - "sha256:239a4e75e09c2b12ea478d28815acf83334d32e722e7433471fbf641c606344c", - "sha256:244f509f126dc71369393ce5fea17c0592c40ee44e607b6d855e9c4ac57aac98", - "sha256:25a5caf742c6195e08002d3b6c2dd6947e50efc5fc2c2205f61ecb47592d2d83", - "sha256:296a7d9bbc598e8744c00f7a6cecf1da9b30ae9ad51c566291ff1314e6cbbed8", - "sha256:2e079c9ec772fedbade9d7ebc36202a1d9ef7291bc9b3a024ca395c4d52853d7", - "sha256:33ca90a0eb29225f195e30684ba4a6db05dbef03c2ccd50b9077714c48153cac", - "sha256:33fc65740267222fc02975c061eb7167185fef4cc8f2770267ee8bf7d6a42f84", - "sha256:341dd8f61c26337c37988345ca5c8ccabeff33093a26953a1ac72e7d0103c4fb", - "sha256:34d6d21d8795a97b14d503dcaf74226ae51eb1f2bd41015d3ef332a24d0a17b3", - "sha256:3538d8fb1ee9bdd2e2692b3b18c22bb1c19ffbefd06880f5ac496e42d7bb3884", - "sha256:38a3b98dae8a7c9057bd91fbf3415c05e700a5114c5f1b5b0ea5f8f429ba6614", - "sha256:3d5a67f0da401e105753d474369ab034c7bae51a4c31c77d94030d59e41df5bd", - "sha256:50084d3516aa263791198913a17354bd1dc627d3c1639209640b9cac3fef5807", - "sha256:55f689f846661e3f26efa535071775d0483388a1ccfab899df72924805e9e7cd", - "sha256:5bc5a8c87714b0c67cfeb4c7caa82b2d71e8864d1a46aa990b5588fa953673b8", - "sha256:62bda40da1e68898186f274f832ef3e759ce929da9a9fd9fcf265956de269dbc", - "sha256:705f3d7c2b098c40f5b81790a5fedb274113373d4d1a69e65f8b68b0cc26f6db", - "sha256:75e3f4e86804023e991096b29e147e635f5e2568f77883a1e6eed74512659ab0", - 
"sha256:7b2a19e13dfb5c8e145c7a6ea959485ee8e2204699903c88c7d25283584bfc08", - "sha256:7cec2af81f9e7569280822be68bd57e51b86d42e59ea30d10ebdbb22d2cb7232", - "sha256:8383a6c8cefba1b7cecc0149415046b6fc38836295bc4c84e820872eb5478b3d", - "sha256:8c836309931839cca658a78a888dab9676b5c988d0dd34ca247f5f3e679f4e7a", - "sha256:8e317953bb4c074c06c798a11dbdd2cf9979dbcaa8ccc0fa4701d80042d4ebf1", - "sha256:923b7b1c717bd0f0f92d862d1ff51d9b2b55dbbd133e05680204465f454bb286", - "sha256:990fb20b32990b2ce2c5f974c3e738c9358b2735bc05075d50a6f36721b8f303", - "sha256:9aad68c3f2566dfae84bf46295a79e79d904e1c21ccfc66de88cd446f8686341", - "sha256:a5812840d1d00eafae6585aba38021f90a705a25b8216ec7f66aebe5b619fb84", - "sha256:a6519d917abb15e12380406d721e37613e2a67d166f9fb7e5a8ce0375744cd45", - "sha256:ab0b028165eea880af12f66086694768f2c3139b2c31ad5e032c8edbafca6ffc", - "sha256:aea7da970f1feccf48be7335f8b2ca64baf9b589d79e05b9397a06696ce1a1ec", - "sha256:b1196e13c45e327d6cd0b6e471530a1882f1017eb83c6229fc613cd1a11b53cd", - "sha256:b368e1aee1b9b75757942d44d7598dcd22a9dbb126affcbba82d15917f0cc155", - "sha256:bde997cac85fcac227b27d4fb2c7608a2c5f6558469b0eb704c5726ae49e1c52", - "sha256:c4c2872b3c91f9baa836147ca33650dc5c172e9273c808c3c3199c75490e709d", - "sha256:c59d2ad092dc0551d9f79d9d44d005c945ba95832a6798f98f9216ede3d5f485", - "sha256:d1da0a2e3b37b745a2b2a678a4c796462cf753aebf94edcc87dcc6b8641eae31", - "sha256:d8b7339180d00de83e930358223c617cc343dd08e1aa5ec7b06c3a121aec4e1d", - "sha256:dd4b3355b01273a56b20c219e74e7549e14370b31a4ffe42706a8cda91f19f6d", - "sha256:e08c470c2eb01977d221fd87495b44867a56d4d594f43739a8028f8646a51e0d", - "sha256:f5102a92855d518b0996eb197772f5ac2a527c0ec617124ad5242a3af5e25f85", - "sha256:f542287b1489c7a860d43a7d8883e27ca62ab84ca53c965d11dac1d3a1fab7ce", - "sha256:f78300789a708ac1f17e134593f577407d52d0417305435b134805c4fb135adb", - "sha256:f81bc26d609bf0fbc622c7122ba6307993c83c795d2d6f6f6fd8c000a770d974", - "sha256:f836c174c3a7f639bded48ec913f348c4761cbf49de4a20a956d3431a7c9cb24", - "sha256:fa21a04112c59ad54f69d80e376f7f9d0f5f9123ab87ecd18fbb9ec3a2beed56", - "sha256:fcf7d1d6f5da887ca04302db8e0e0cf56ce9a5e05f202720e49b3e8157ddb9a9", - "sha256:fd27d8b49e574e50caa65196d908f80e4dff64d7e592d0c59788b45aad7e8b35" + "sha256:0086cd4fc71b7d485ac93ca4239c8f75732c2ae3ba83f6be1c9be59d9e2c6382", + "sha256:01c322ef2bbe15057bc4bf132b525b7e3f7206f071799eb8aa6ad1940bcf5fb1", + "sha256:03cafe82c1b32b770a29fd6de923625ccac3185a54a5e66606da26d105f37dac", + "sha256:044a0985a4f25b335882b0966625270a8d9db3d3409ddc49a4eb00b0ef5e8cee", + "sha256:07ed352205574aad067482e53dd606926afebcb5590653121063fbf4e2175166", + "sha256:0d1b923fc4a40c5832be4f35a5dab0e5ff89cddf83bb4174499e02ea089daf57", + "sha256:0e7b27d04131c46e6894f23a4ae186a6a2207209a05df5b6ad4caee6d54a222c", + "sha256:1fad32ee9b27350687035cb5fdf9145bc9cf0a094a9577d43e909948ebcfa27b", + "sha256:289cc803fa1dc901f84701ac10c9ee873619320f2f9aff38794db4a4a0268d51", + "sha256:3c59105f8d58ce500f348c5b56163a4113a440dad6daa2294b5052a10db866da", + "sha256:46c3d091059ad0b9c59d1034de74a7f36dcfa7f6d3bde782c49deb42438f2450", + "sha256:482855914928c8175735a2a59c8dc5806cf7d8f032e4820d52e845d1f731dca2", + "sha256:49c76cdfa13015c4560702574bad67f0e15ca5a2872c6a125f6327ead2b731dd", + "sha256:4b03741e70fb811d1a9a1d75355cf391f274ed85847f4b78e35459899f57af4d", + "sha256:4bea27c4269234e06f621f3fac3925f56ff34bc14521484b8f66a580aacc2e7d", + "sha256:4d5fae0a22dc86259dee66f2cc6c1d3e490c4a1214d7daa2a93d07491c5c04b6", + "sha256:543ef9179bc55edfd895154a51792b01c017c87af0ebaae092720152e19e42ca", + 
"sha256:54dece71673b3187c86226c3ca793c5f891f9fc3d8aa183f2e3653da18566169", + "sha256:6379688fb4cfa921ae349c76eb1a9ab26b65f32b03d46bb0eed841fd4cb6afb1", + "sha256:65fa405b837060db569a61ec368b74688f429b32fa47a8929a7a2f9b47183713", + "sha256:6616d1c9bf1e3faea78711ee42a8b972367d82ceae233ec0ac61cc7fec09fa6b", + "sha256:6fe885135c8a479d3e37a7aae61cbd3a0fb2deccb4dda3c25f92a49189f766d6", + "sha256:7221f9ac9dad9492cecab6f676b3eaf9185141539d5c9689d13fd6b0d7de840c", + "sha256:76d5f82213aa78098b9b964ea89de4617e70e0d43e97900c2778a50856dac605", + "sha256:7792f0ab20df8071d669d929c75c97fecfa6bcab82c10ee4adb91c7a54055463", + "sha256:831b476d79408ab6ccfadaaf199906c833f02fdb32c9ab907b1d4aa0713cfa3b", + "sha256:9146579352d7b5f6412735d0f203bbd8d00113a680b66565e205bc605ef81bc6", + "sha256:9cc44bf0315268e253bf563f3560e6c004efe38f76db03a1558274a6e04bf5d5", + "sha256:a73d18625f6a8a1cbb11eadc1d03929f9510f4131879288e3f7922097a429f63", + "sha256:a8659fd33ee9e6ca03950cfdcdf271d645cf681609153f218826dd9805ab585c", + "sha256:a94925102c89247530ae1dab7dc02c690942566f22e189cbd53579b0693c0783", + "sha256:ad4567d6c334c46046d1c4c20024de2a1c3abc626817ae21ae3da600f5779b44", + "sha256:b2e16f4cd2bc4d88ba30ca2d3bbf2f21f00f382cf4e1ce3b1ddc96c634bc48ca", + "sha256:bbdf9a72403110a3bdae77948b8011f644571311c2fb35ee15f0f10a8fc082e8", + "sha256:beb08e8508e53a568811016e59f3234d29c2583f6b6e28572f0954a6b4f7e03d", + "sha256:c4cbe651f3904e28f3a55d6f371203049034b4ddbce65a54527a3f189ca3b390", + "sha256:c7b525ab52ce18c57ae232ba6f7010297a87ced82a2383b1afd238849c1ff933", + "sha256:ca5d79cfdae420a1d52bf177de4bc2289c321d6c961ae321503b2ca59c17ae67", + "sha256:cdab02a0a941af190df8782aafc591ef3ad08824f97850b015c8c6a8b3877b0b", + "sha256:d17c6a415d68cfe1091d3296ba5749d3d8696e42c37fca5d4860c5bf7b729f03", + "sha256:d39bd10f0ae453554798b125d2f39884290c480f56e8a02ba7a6ed552005243b", + "sha256:d4b3cd1ca7cd73d229487fa5caca9e4bc1f0bca96526b922d61053ea751fe791", + "sha256:d50a252b23b9b4dfeefc1f663c568a221092cbaded20a05a11665d0dbec9b8fb", + "sha256:da8549d17489cd52f85a9829d0e1d91059359b3c54a26f28bec2c5d369524807", + "sha256:dcd070b5b585b50e6617e8972f3fbbee786afca71b1936ac06257f7e178f00f6", + "sha256:ddaaa91bfc4477d2871442bbf30a125e8fe6b05da8a0015507bfbf4718228ab2", + "sha256:df423f351b162a702c053d5dddc0fc0ef9a9e27ea3f449781ace5f906b664428", + "sha256:dff044f661f59dace805eedb4a7404c573b6ff0cdba4a524141bc63d7be5c7fd", + "sha256:e7e128f85c0b419907d1f38e616c4f1e9f1d1b37a7949f44df9a73d5da5cd53c", + "sha256:ed8d1d1821ba5fc88d4a4f45387b65de52382fa3ef1f0115a4f7a20cdfab0e94", + "sha256:f2501d60d7497fd55e391f423f965bbe9e650e9ffc3c627d5f0ac516026000b8", + "sha256:f7db0b6ae1f96ae41afe626095149ecd1b212b424626175a6633c2999eaad45b" ], "markers": "python_version >= '3.8'", - "version": "==7.5.3" + "version": "==7.6.0" }, "docutils": { "hashes": [ @@ -1306,11 +1310,11 @@ }, "exceptiongroup": { "hashes": [ - "sha256:5258b9ed329c5bbdd31a309f53cbfb0b155341807f6ff7606a1e801a891b29ad", - "sha256:a4785e48b045528f5bfe627b6ad554ff32def154f42372786903b7abcfe1aa16" + "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b", + "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc" ], "markers": "python_version < '3.11'", - "version": "==1.2.1" + "version": "==1.2.2" }, "factory-boy": { "hashes": [ @@ -1323,11 +1327,11 @@ }, "faker": { "hashes": [ - "sha256:4c40b34a9c569018d4f9d6366d71a4da8a883d5ddf2b23197be5370f29b7e1b6", - "sha256:bdec5f2fb057d244ebef6e0ed318fea4dcbdf32c3a1a010766fc45f5d68fc68d" + 
"sha256:0f60978314973de02c00474c2ae899785a42b2cf4f41b7987e93c132a2b8a4a9", + "sha256:886ee28219be96949cd21ecc96c4c742ee1680e77f687b095202c8def1a08f06" ], "markers": "python_version >= '3.8'", - "version": "==25.8.0" + "version": "==26.0.0" }, "flake8": { "hashes": [ diff --git a/tdrs-backend/docs/session-management.md b/tdrs-backend/docs/session-management.md index 78ff6dd05..e4f0c1831 100644 --- a/tdrs-backend/docs/session-management.md +++ b/tdrs-backend/docs/session-management.md @@ -5,9 +5,17 @@ The requirement for this project is that users will be logged out of the system ### Backend The backend will be the ultimate arbiter of session management. When the user logs in they will receive an HttpOnly cookie that is set to expire in 30 minutes. After that, with every interaction between the FE and BE, the BE will refresh the cookie, so it will extend the timeout time to another 30 minutes. -This is managed in `tdrs-backend/tdpservice/settings/common.py` with the following setting: +When the user logs in, they will receive an HttpOnly cookie with no `Expires=` setting. This indicates a [session cookie](https://developer.mozilla.org/en-US/docs/Web/HTTP/Cookies#removal_defining_the_lifetime_of_a_cookie) which will automatically expire upon browser close. This is controlled with the django setting: + +```python +SESSION_EXPIRE_AT_BROWSER_CLOSE=True ``` -SESSION_TIMEOUT = 30 + +The cookie itself contains a `sessionid` reference to a Django-managed session. The session expiration is set to the same expiration of the login.gov-provided jwt, **15 minutes**. + +This is managed in `tdrs-backend/tdpservice/settings/common.py` with the following setting: +```python +SESSION_COOKIE_AGE = 15 * 60 # 15 minutes ``` ### Frontend diff --git a/tdrs-backend/tdpservice/parsers/case_consistency_validator.py b/tdrs-backend/tdpservice/parsers/case_consistency_validator.py index ce0d6a13c..7476e57c7 100644 --- a/tdrs-backend/tdpservice/parsers/case_consistency_validator.py +++ b/tdrs-backend/tdpservice/parsers/case_consistency_validator.py @@ -168,7 +168,7 @@ def __validate_section1(self, num_errors): def __validate_section2(self, num_errors): """Perform TANF Section 2 category four validation on all cached records.""" num_errors += self.__validate_s2_records_are_related() - num_errors += self.__validate_t5_aabd_and_ssi() + num_errors += self.__validate_t5_atd_and_ssi() return num_errors def __validate_family_affiliation(self, num_errors, t1s, t2s, t3s, error_msg): @@ -390,7 +390,7 @@ def __validate_s2_records_are_related(self): num_errors += 1 return num_errors - def __validate_t5_aabd_and_ssi(self): + def __validate_t5_atd_and_ssi(self): num_errors = 0 is_ssp = self.program_type == 'SSP' @@ -403,7 +403,7 @@ def __validate_t5_aabd_and_ssi(self): t5s = self.sorted_cases.get(t5_model, []) for record, schema in t5s: - rec_aabd = getattr(record, 'REC_AID_TOTALLY_DISABLED') + rec_atd = getattr(record, 'REC_AID_TOTALLY_DISABLED') rec_ssi = getattr(record, 'REC_SSI') family_affiliation = getattr(record, 'FAMILY_AFFILIATION') dob = getattr(record, 'DATE_OF_BIRTH') @@ -413,7 +413,7 @@ def __validate_t5_aabd_and_ssi(self): dob_date = datetime.strptime(dob, '%Y%m%d') is_adult = get_years_apart(rpt_date, dob_date) >= 19 - if is_territory and is_adult and (rec_aabd != 1 and rec_aabd != 2): + if is_territory and is_adult and rec_atd not in {1, 2}: self.__generate_and_add_error( schema, record, @@ -424,7 +424,7 @@ def __validate_t5_aabd_and_ssi(self): ) ) num_errors += 1 - elif is_state and rec_aabd != 2: + elif is_state and 
rec_atd == 1: self.__generate_and_add_error( schema, record, @@ -446,7 +446,7 @@ def __validate_t5_aabd_and_ssi(self): ) ) num_errors += 1 - elif is_state and family_affiliation == 1: + elif is_state and family_affiliation == 1 and rec_ssi not in {1, 2}: self.__generate_and_add_error( schema, record, diff --git a/tdrs-backend/tdpservice/parsers/fields.py b/tdrs-backend/tdpservice/parsers/fields.py index 076743096..2431148d0 100644 --- a/tdrs-backend/tdpservice/parsers/fields.py +++ b/tdrs-backend/tdpservice/parsers/fields.py @@ -18,6 +18,7 @@ def __init__( endIndex, required=True, validators=[], + ignore_errors=False, ): self.item = item self.name = name @@ -27,6 +28,7 @@ def __init__( self.endIndex = endIndex self.required = required self.validators = validators + self.ignore_errors = ignore_errors def create(self, item, name, length, start, end, type): """Create a new field.""" @@ -64,7 +66,7 @@ class TransformField(Field): """Represents a field that requires some transformation before serializing.""" def __init__(self, transform_func, item, name, friendly_name, type, startIndex, endIndex, required=True, - validators=[], **kwargs): + validators=[], ignore_errors=False, **kwargs): super().__init__( item=item, name=name, @@ -73,7 +75,8 @@ def __init__(self, transform_func, item, name, friendly_name, type, startIndex, startIndex=startIndex, endIndex=endIndex, required=required, - validators=validators) + validators=validators, + ignore_errors=ignore_errors) self.transform_func = transform_func self.kwargs = kwargs diff --git a/tdrs-backend/tdpservice/parsers/parse.py b/tdrs-backend/tdpservice/parsers/parse.py index f21b7bc7c..61b009616 100644 --- a/tdrs-backend/tdpservice/parsers/parse.py +++ b/tdrs-backend/tdpservice/parsers/parse.py @@ -34,6 +34,10 @@ def parse_datafile(datafile, dfs): bulk_create_errors({1: header_errors}, 1, flush=True) update_meta_model(datafile, dfs) return errors + elif header_is_valid and len(header_errors) > 0: + logger.info(f"Preparser Warning: {len(header_errors)} header warnings encountered.") + errors['header'] = header_errors + bulk_create_errors({1: header_errors}, 1, flush=True) field_values = schema_defs.header.get_field_values_by_names(header_line, {"encryption", "tribe_code", "state_fips"}) diff --git a/tdrs-backend/tdpservice/parsers/row_schema.py b/tdrs-backend/tdpservice/parsers/row_schema.py index 7dd01556f..970f83cbe 100644 --- a/tdrs-backend/tdpservice/parsers/row_schema.py +++ b/tdrs-backend/tdpservice/parsers/row_schema.py @@ -141,7 +141,7 @@ def run_field_validators(self, instance, generate_error): if (field.required and not is_empty) or should_validate: for validator in field.validators: validator_is_valid, validator_error = validator(value, self, field.friendly_name, field.item) - is_valid = False if not validator_is_valid else is_valid + is_valid = False if (not validator_is_valid and not field.ignore_errors) else is_valid if validator_error: errors.append( generate_error( diff --git a/tdrs-backend/tdpservice/parsers/schema_defs/header.py b/tdrs-backend/tdpservice/parsers/schema_defs/header.py index 67475fd5f..2ef98f56e 100644 --- a/tdrs-backend/tdpservice/parsers/schema_defs/header.py +++ b/tdrs-backend/tdpservice/parsers/schema_defs/header.py @@ -122,11 +122,17 @@ startIndex=22, endIndex=23, required=True, - validators=[validators.matches("D", - error_func=lambda eargs: ("HEADER Update Indicator must be set to D " - f"instead of {eargs.value}. 
Please review " - "Exporting Complete Data Using FTANF in the " - "Knowledge Center."))], + validators=[ + validators.matches( + "D", + error_func=lambda eargs: ( + "HEADER Update Indicator must be set to D " + f"instead of {eargs.value}. Please review " + "Exporting Complete Data Using FTANF in the " + "Knowledge Center." + ), + )], + ignore_errors=True, ), ], ) diff --git a/tdrs-backend/tdpservice/parsers/schema_defs/ssp/m6.py b/tdrs-backend/tdpservice/parsers/schema_defs/ssp/m6.py index 43d9ec7f5..5774e485a 100644 --- a/tdrs-backend/tdpservice/parsers/schema_defs/ssp/m6.py +++ b/tdrs-backend/tdpservice/parsers/schema_defs/ssp/m6.py @@ -50,7 +50,7 @@ endIndex=7, required=True, validators=[ - validators.dateYearIsLargerThan(2020), + validators.dateYearIsLargerThan(2019), validators.quarterIsValid() ] ), @@ -215,7 +215,7 @@ endIndex=7, required=True, validators=[ - validators.dateYearIsLargerThan(2020), + validators.dateYearIsLargerThan(2019), validators.quarterIsValid() ] ), @@ -380,7 +380,7 @@ endIndex=7, required=True, validators=[ - validators.dateYearIsLargerThan(2020), + validators.dateYearIsLargerThan(2019), validators.quarterIsValid() ] ), diff --git a/tdrs-backend/tdpservice/parsers/schema_defs/ssp/m7.py b/tdrs-backend/tdpservice/parsers/schema_defs/ssp/m7.py index 5075b12b9..ca01bc43e 100644 --- a/tdrs-backend/tdpservice/parsers/schema_defs/ssp/m7.py +++ b/tdrs-backend/tdpservice/parsers/schema_defs/ssp/m7.py @@ -50,7 +50,7 @@ endIndex=7, required=True, validators=[ - validators.dateYearIsLargerThan(2020), + validators.dateYearIsLargerThan(2019), validators.quarterIsValid(), ], ), diff --git a/tdrs-backend/tdpservice/parsers/schema_defs/tanf/t6.py b/tdrs-backend/tdpservice/parsers/schema_defs/tanf/t6.py index 4b355c4ed..17f40ecba 100644 --- a/tdrs-backend/tdpservice/parsers/schema_defs/tanf/t6.py +++ b/tdrs-backend/tdpservice/parsers/schema_defs/tanf/t6.py @@ -56,7 +56,7 @@ endIndex=7, required=True, validators=[ - validators.dateYearIsLargerThan(2020), + validators.dateYearIsLargerThan(2019), validators.quarterIsValid(), ], ), @@ -276,7 +276,10 @@ startIndex=2, endIndex=7, required=True, - validators=[], + validators=[ + validators.dateYearIsLargerThan(2019), + validators.quarterIsValid(), + ], ), TransformField( calendar_quarter_to_rpt_month_year(1), @@ -491,7 +494,10 @@ startIndex=2, endIndex=7, required=True, - validators=[], + validators=[ + validators.dateYearIsLargerThan(2019), + validators.quarterIsValid(), + ], ), TransformField( calendar_quarter_to_rpt_month_year(2), diff --git a/tdrs-backend/tdpservice/parsers/schema_defs/tanf/t7.py b/tdrs-backend/tdpservice/parsers/schema_defs/tanf/t7.py index 7916a2b8c..995aba897 100644 --- a/tdrs-backend/tdpservice/parsers/schema_defs/tanf/t7.py +++ b/tdrs-backend/tdpservice/parsers/schema_defs/tanf/t7.py @@ -50,7 +50,7 @@ endIndex=7, required=True, validators=[ - validators.dateYearIsLargerThan(2020), + validators.dateYearIsLargerThan(2019), validators.quarterIsValid(), ], ), diff --git a/tdrs-backend/tdpservice/parsers/schema_defs/tribal_tanf/t6.py b/tdrs-backend/tdpservice/parsers/schema_defs/tribal_tanf/t6.py index a85ca325e..77c5a45c6 100644 --- a/tdrs-backend/tdpservice/parsers/schema_defs/tribal_tanf/t6.py +++ b/tdrs-backend/tdpservice/parsers/schema_defs/tribal_tanf/t6.py @@ -44,7 +44,7 @@ endIndex=7, required=True, validators=[ - validators.dateYearIsLargerThan(2020), + validators.dateYearIsLargerThan(2019), validators.quarterIsValid(), ], ), @@ -252,7 +252,10 @@ startIndex=2, endIndex=7, required=True, - validators=[], + 
validators=[ + validators.dateYearIsLargerThan(2019), + validators.quarterIsValid(), + ], ), TransformField( calendar_quarter_to_rpt_month_year(1), @@ -455,7 +458,10 @@ startIndex=2, endIndex=7, required=True, - validators=[], + validators=[ + validators.dateYearIsLargerThan(2019), + validators.quarterIsValid(), + ], ), TransformField( calendar_quarter_to_rpt_month_year(2), diff --git a/tdrs-backend/tdpservice/parsers/schema_defs/tribal_tanf/t7.py b/tdrs-backend/tdpservice/parsers/schema_defs/tribal_tanf/t7.py index dd0e020a2..12b034466 100644 --- a/tdrs-backend/tdpservice/parsers/schema_defs/tribal_tanf/t7.py +++ b/tdrs-backend/tdpservice/parsers/schema_defs/tribal_tanf/t7.py @@ -50,7 +50,7 @@ endIndex=7, required=True, validators=[ - validators.dateYearIsLargerThan(2020), + validators.dateYearIsLargerThan(2019), validators.quarterIsValid(), ], ), diff --git a/tdrs-backend/tdpservice/parsers/schema_defs/util.py b/tdrs-backend/tdpservice/parsers/schema_defs/util.py deleted file mode 100644 index 5e7d3f9d2..000000000 --- a/tdrs-backend/tdpservice/parsers/schema_defs/util.py +++ /dev/null @@ -1,152 +0,0 @@ -"""Utility functions for schema definitions.""" - -from .. import schema_defs -from tdpservice.data_files.models import DataFile - -import logging - -logger = logging.getLogger(__name__) - -def get_schema_options(program, section, query=None, model=None, model_name=None): - """Centralized function to return the appropriate schema for a given program, section, and query. - - TODO: need to rework this docstring as it is outdated hence the weird ';;' for some of them. - - @param program: the abbreviated program type (.e.g, 'TAN') - @param section: the section of the file (.e.g, 'A');; or ACTIVE_CASE_DATA - @param query: the query for section_names (.e.g, 'section', 'models', etc.) 
- @return: the appropriate references (e.g., ACTIVE_CASE_DATA or {t1,t2,t3}) ;; returning 'A' - """ - schema_options = { - 'TAN': { - 'A': { - 'section': DataFile.Section.ACTIVE_CASE_DATA, - 'models': { - 'T1': schema_defs.tanf.t1, - 'T2': schema_defs.tanf.t2, - 'T3': schema_defs.tanf.t3, - } - }, - 'C': { - 'section': DataFile.Section.CLOSED_CASE_DATA, - 'models': { - 'T4': schema_defs.tanf.t4, - 'T5': schema_defs.tanf.t5, - } - }, - 'G': { - 'section': DataFile.Section.AGGREGATE_DATA, - 'models': { - 'T6': schema_defs.tanf.t6, - } - }, - 'S': { - 'section': DataFile.Section.STRATUM_DATA, - 'models': { - 'T7': schema_defs.tanf.t7, - } - } - }, - 'SSP': { - 'A': { - 'section': DataFile.Section.SSP_ACTIVE_CASE_DATA, - 'models': { - 'M1': schema_defs.ssp.m1, - 'M2': schema_defs.ssp.m2, - 'M3': schema_defs.ssp.m3, - } - }, - 'C': { - 'section': DataFile.Section.SSP_CLOSED_CASE_DATA, - 'models': { - 'M4': schema_defs.ssp.m4, - 'M5': schema_defs.ssp.m5, - } - }, - 'G': { - 'section': DataFile.Section.SSP_AGGREGATE_DATA, - 'models': { - 'M6': schema_defs.ssp.m6, - } - }, - 'S': { - 'section': DataFile.Section.SSP_STRATUM_DATA, - 'models': { - 'M7': schema_defs.ssp.m7, - } - } - }, - 'Tribal TAN': { - 'A': { - 'section': DataFile.Section.TRIBAL_ACTIVE_CASE_DATA, - 'models': { - 'T1': schema_defs.tribal_tanf.t1, - 'T2': schema_defs.tribal_tanf.t2, - 'T3': schema_defs.tribal_tanf.t3, - } - }, - 'C': { - 'section': DataFile.Section.TRIBAL_CLOSED_CASE_DATA, - 'models': { - 'T4': schema_defs.tribal_tanf.t4, - 'T5': schema_defs.tribal_tanf.t5, - } - }, - 'G': { - 'section': DataFile.Section.TRIBAL_AGGREGATE_DATA, - 'models': { - 'T6': schema_defs.tribal_tanf.t6, - } - }, - 'S': { - 'section': DataFile.Section.TRIBAL_STRATUM_DATA, - 'models': { - 'T7': schema_defs.tribal_tanf.t7, - } - }, - }, - } - - if query == "text": - for prog_name, prog_dict in schema_options.items(): - for sect, val in prog_dict.items(): - if val['section'] == section: - return {'program_type': prog_name, 'section': sect} - raise ValueError("Model not found in schema_defs") - elif query == "section": - return schema_options.get(program, {}).get(section, None)[query] - elif query == "models": - links = schema_options.get(program, {}).get(section, None) - - # if query is not chosen or wrong input, return all options - # query = 'models', model = 'T1' - models = links.get(query, links) - - if model_name is None: - return models - elif model_name not in models.keys(): - logger.debug(f"Model {model_name} not found in schema_defs") - return [] # intentionally trigger the error_msg for unknown record type - else: - return models.get(model_name, models) - -def get_program_models(str_prog, str_section): - """Return the models dict for a given program and section.""" - return get_schema_options(program=str_prog, section=str_section, query='models') - -def get_program_model(str_prog, str_section, str_model): - """Return singular model for a given program, section, and name.""" - return get_schema_options(program=str_prog, section=str_section, query='models', model_name=str_model) - -def get_section_reference(str_prog, str_section): - """Return the named section reference for a given program and section.""" - return get_schema_options(program=str_prog, section=str_section, query='section') - -def get_text_from_df(df): - """Return the short-hand text for program, section for a given datafile.""" - return get_schema_options("", section=df.section, query='text') - -def get_schema(line, section, program_type): - """Return the appropriate schema 
for the line.""" - line_type = line[0:2] - return get_schema_options(program_type, section, query='models', model_name=line_type) diff --git a/tdrs-backend/tdpservice/parsers/test/test_case_consistency.py b/tdrs-backend/tdpservice/parsers/test/test_case_consistency.py index f65093d4e..d4a42d910 100644 --- a/tdrs-backend/tdpservice/parsers/test/test_case_consistency.py +++ b/tdrs-backend/tdpservice/parsers/test/test_case_consistency.py @@ -1042,7 +1042,7 @@ def test_section2_aabd_ssi_validator_pass_territory_child_aabd(self, small_corre ), ]) @pytest.mark.django_db - def test_section2_aabd_ssi_validator_fail_state_aabd(self, small_correct_file, header, T4Stuff, T5Stuff): + def test_section2_atd_ssi_validator_fail_state_atd(self, small_correct_file, header, T4Stuff, T5Stuff): """Test records are related validator section 2 success case.""" (T4Factory, t4_schema, t4_model_name) = T4Stuff (T5Factory, t5_schema, t5_model_name) = T5Stuff @@ -1198,7 +1198,7 @@ def test_section2_aabd_ssi_validator_fail_territory_ssi(self, small_correct_file ), ]) @pytest.mark.django_db - def test_section2_aabd_ssi_validator_fail_state_ssi(self, small_correct_file, header, T4Stuff, T5Stuff): + def test_section2_atd_ssi_validator_fail_state_ssi(self, small_correct_file, header, T4Stuff, T5Stuff): """Test records are related validator section 2 success case.""" (T4Factory, t4_schema, t4_model_name) = T4Stuff (T5Factory, t5_schema, t5_model_name) = T5Stuff @@ -1228,7 +1228,7 @@ def test_section2_aabd_ssi_validator_fail_state_ssi(self, small_correct_file, he DATE_OF_BIRTH="19970209", FAMILY_AFFILIATION=1, REC_AID_TOTALLY_DISABLED=2, - REC_SSI=2 + REC_SSI=0 ), T5Factory.create( RPT_MONTH_YEAR=202010, @@ -1236,7 +1236,7 @@ def test_section2_aabd_ssi_validator_fail_state_ssi(self, small_correct_file, he DATE_OF_BIRTH="19970209", FAMILY_AFFILIATION=2, # validator only applies to fam_affil = 1; won't generate error REC_AID_TOTALLY_DISABLED=2, - REC_SSI=2 + REC_SSI=0 ), ] for t5 in t5s: diff --git a/tdrs-backend/tdpservice/parsers/test/test_header.py b/tdrs-backend/tdpservice/parsers/test/test_header.py index 18079bc68..ceca0a35e 100644 --- a/tdrs-backend/tdpservice/parsers/test/test_header.py +++ b/tdrs-backend/tdpservice/parsers/test/test_header.py @@ -59,8 +59,8 @@ def test_header_cleanup(test_datafile): # Encryption error ("HEADER20204A06 TAN1AD", False, "HEADER Item 9 (encryption): A is not in [ , E]."), # Update error - ("HEADER20204A06 TAN1EA", False, ("HEADER Update Indicator must be set to D instead of A. Please review " "Exporting Complete Data Using FTANF in the Knowledge Center.")), + ("HEADER20204A06 TAN1EA", True, ("HEADER Update Indicator must be set to D instead of A. 
Please review " + "Exporting Complete Data Using FTANF in the Knowledge Center.")), ]) @pytest.mark.django_db def test_header_fields(test_datafile, header_line, is_valid, error): diff --git a/tdrs-backend/tdpservice/parsers/test/test_parse.py b/tdrs-backend/tdpservice/parsers/test/test_parse.py index 41c02252b..480b6dd46 100644 --- a/tdrs-backend/tdpservice/parsers/test/test_parse.py +++ b/tdrs-backend/tdpservice/parsers/test/test_parse.py @@ -1576,16 +1576,12 @@ def test_parse_tanf_section_1_file_with_bad_update_indicator(tanf_section_1_file parser_errors = ParserError.objects.filter(file=tanf_section_1_file_with_bad_update_indicator) - assert parser_errors.count() == 1 - - error = parser_errors.first() + assert parser_errors.count() == 5 - assert error.error_type == ParserErrorCategoryChoices.FIELD_VALUE - assert error.error_message == ("HEADER Update Indicator must be set to D " - "instead of U. Please review " - "Exporting Complete Data Using FTANF in the " - "Knowledge Center.") + error_messages = [error.error_message for error in parser_errors] + assert "HEADER Update Indicator must be set to D instead of U. Please review" + \ + " Exporting Complete Data Using FTANF in the Knowledge Center." in error_messages @pytest.mark.django_db() def test_parse_tribal_section_4_bad_quarter(tribal_section_4_bad_quarter, dfs): diff --git a/tdrs-backend/tdpservice/scheduling/management/db_backup.py b/tdrs-backend/tdpservice/scheduling/management/db_backup.py index 2ee42c14a..11beceaed 100644 --- a/tdrs-backend/tdpservice/scheduling/management/db_backup.py +++ b/tdrs-backend/tdpservice/scheduling/management/db_backup.py @@ -57,28 +57,16 @@ def get_system_values(): sys_values['S3_SECRET_ACCESS_KEY'] = sys_values['S3_CREDENTIALS']['secret_access_key'] sys_values['S3_BUCKET'] = sys_values['S3_CREDENTIALS']['bucket'] sys_values['S3_REGION'] = sys_values['S3_CREDENTIALS']['region'] - sys_values['DATABASE_URI'] = OS_ENV['DATABASE_URL'] + # Set AWS credentials in env, Boto3 uses the env variables for connection os.environ["AWS_ACCESS_KEY_ID"] = sys_values['S3_ACCESS_KEY_ID'] os.environ["AWS_SECRET_ACCESS_KEY"] = sys_values['S3_SECRET_ACCESS_KEY'] # Set Database connection info - AWS_RDS_SERVICE_JSON = json.loads(OS_ENV['VCAP_SERVICES'])['aws-rds'][0]['credentials'] - sys_values['DATABASE_PORT'] = AWS_RDS_SERVICE_JSON['port'] - sys_values['DATABASE_PASSWORD'] = AWS_RDS_SERVICE_JSON['password'] - sys_values['DATABASE_DB_NAME'] = AWS_RDS_SERVICE_JSON['db_name'] - sys_values['DATABASE_HOST'] = AWS_RDS_SERVICE_JSON['host'] - sys_values['DATABASE_USERNAME'] = AWS_RDS_SERVICE_JSON['username'] - - # write .pgpass - with open('/home/vcap/.pgpass', 'w') as f: - f.write(sys_values['DATABASE_HOST'] + ":" - + sys_values['DATABASE_PORT'] + ":" - + settings.DATABASES['default']['NAME'] + ":" - + sys_values['DATABASE_USERNAME'] + ":" - + sys_values['DATABASE_PASSWORD']) - os.environ['PGPASSFILE'] = '/home/vcap/.pgpass' - os.system('chmod 0600 /home/vcap/.pgpass') + AWS_RDS_SERVICE_JSON = json.loads(OS_ENV['VCAP_SERVICES'])['aws-rds'][0] + sys_values['DATABASE_URI'] = AWS_RDS_SERVICE_JSON['credentials']['uri'].rsplit('/', 1)[0] + sys_values['DATABASE_DB_NAME'] = AWS_RDS_SERVICE_JSON['credentials']['db_name'] + return sys_values @@ -94,19 +82,11 @@ def backup_database(file_name, pg_dump -F c --no-acl --no-owner -f backup.pg postgresql://${USERNAME}:${PASSWORD}@${HOST}:${PORT}/${NAME} """ try: - # TODO: This is a bandaid until the correct logic is determined for the system values with respect to the - # correct database name. 
- # cmd = postgres_client + "pg_dump -Fc --no-acl -f " + file_name + " -d " + database_uri - db_host = settings.DATABASES['default']['HOST'] - db_port = settings.DATABASES['default']['PORT'] - db_name = settings.DATABASES['default']['NAME'] - db_user = settings.DATABASES['default']['USER'] - - export_password = f"export PGPASSWORD={settings.DATABASES['default']['PASSWORD']}" - cmd = (f"{export_password} && {postgres_client}pg_dump -h {db_host} -p {db_port} -d {db_name} -U {db_user} " - f"-F c --no-password --no-acl --no-owner -f {file_name}") + cmd = f"{postgres_client}pg_dump -Fc --no-acl -f {file_name} -d {database_uri}" logger.info(f"Executing backup command: {cmd}") - os.system(cmd) + code = os.system(cmd) + if code != 0: + raise Exception("pg_dump command failed with a non-zero exit code.") msg = "Successfully executed backup. Wrote pg dumpfile to {}".format(file_name) logger.info(msg) LogEntry.objects.log_action( @@ -268,28 +248,47 @@ def get_database_credentials(database_uri): database_name = database_uri return [username, password, host, port, database_name] - -def main(argv, sys_values, system_user): - """Handle commandline args.""" - arg_file = "/tmp/backup.pg" - arg_database = sys_values['DATABASE_URI'] +def get_opts(argv, db_name): + """Parse command line options.""" + arg_file = f"/tmp/{db_name}_backup.pg" arg_to_restore = False arg_to_backup = False + restore_db_name = None + opts, args = getopt.getopt(argv, "hbrf:n:", ["help", "backup", "restore", "file=", "restore_db_name="]) + for opt, arg in opts: + if "backup" in opt or "-b" in opt: + arg_to_backup = True + elif "restore" in opt or "-r" in opt: + arg_to_restore = True + if "file" in opt or "-f" in opt and arg: + arg_file = arg if arg[0] == "/" else "/tmp/" + arg + if "restore_db_name" in opt or "-n" in opt and arg: + restore_db_name = arg + + if arg_to_restore and not restore_db_name: + raise ValueError("You must pass a `-n <db_name>` when trying to restore a DB.") + + return arg_file, arg_to_backup, arg_to_restore, restore_db_name + +def get_db_name(sys_values): + """ + Get the correct database name. - try: - opts, args = getopt.getopt(argv, "hbrf:d:", ["help", "backup", "restore", "file=", "database=", ]) - for opt, arg in opts: - if "backup" in opt or "-b" in opt: - arg_to_backup = True - elif "restore" in opt or "-r" in opt: - arg_to_restore = True - if "file" in opt or "-f" in opt and arg: - arg_file = arg if arg[0] == "/" else "/tmp/" + arg - if "database" in opt or "-d" in opt: - arg_database = arg + In prod we use the default database name that AWS creates. In the Dev and Staging environments the databases are + named based on their app; i.e. tdp_db_raft. The deploy script sets the APP_DB_NAME environment variable for all + apps except prod. 
+ """ + env_db_name = os.getenv("APP_DB_NAME", None) + if env_db_name is None: + return sys_values['DATABASE_DB_NAME'] + return env_db_name - except Exception as e: - raise e +def main(argv, sys_values, system_user): + """Handle commandline args.""" + db_base_uri = sys_values['DATABASE_URI'] + + db_name = get_db_name(sys_values) + arg_file, arg_to_backup, arg_to_restore, restore_db_name = get_opts(argv, db_name) if arg_to_backup: LogEntry.objects.log_action( @@ -303,7 +302,7 @@ def main(argv, sys_values, system_user): # back up database backup_database(file_name=arg_file, postgres_client=sys_values['POSTGRES_CLIENT_DIR'], - database_uri=arg_database, + database_uri=f"{db_base_uri}/{db_name}", system_user=system_user) # upload backup file @@ -348,7 +347,7 @@ def main(argv, sys_values, system_user): # restore database restore_database(file_name=arg_file, postgres_client=sys_values['POSTGRES_CLIENT_DIR'], - database_uri=arg_database, + database_uri=f"{db_base_uri}/{restore_db_name}", system_user=system_user) LogEntry.objects.log_action( diff --git a/tdrs-backend/tdpservice/search_indexes/admin/filters.py b/tdrs-backend/tdpservice/search_indexes/admin/filters.py index 1d8caf0f8..399e45e36 100644 --- a/tdrs-backend/tdpservice/search_indexes/admin/filters.py +++ b/tdrs-backend/tdpservice/search_indexes/admin/filters.py @@ -2,8 +2,8 @@ from django.utils.translation import ugettext_lazy as _ from django.contrib.admin import SimpleListFilter from django.db.models import Q as Query -from more_admin_filters import MultiSelectDropdownFilter from tdpservice.stts.models import STT +from tdpservice.search_indexes.admin.multiselect_filter import MultiSelectDropdownFilter import datetime @@ -49,6 +49,7 @@ class STTFilter(MultiSelectDropdownFilter): def __init__(self, field, request, params, model, model_admin, field_path): super(MultiSelectDropdownFilter, self).__init__(field, request, params, model, model_admin, field_path) self.lookup_choices = self._get_lookup_choices(request) + self.title = _("STT") def _get_lookup_choices(self, request): """Filter queryset to guarantee lookup_choices only has STTs associated with the record type.""" diff --git a/tdrs-backend/tdpservice/search_indexes/admin/multiselect_filter.py b/tdrs-backend/tdpservice/search_indexes/admin/multiselect_filter.py new file mode 100644 index 000000000..071ff985b --- /dev/null +++ b/tdrs-backend/tdpservice/search_indexes/admin/multiselect_filter.py @@ -0,0 +1,191 @@ +"""File containing multiselect filter classes and mixins.""" +import urllib.parse +from django.contrib import admin +from django.db.models import Q +from django.utils.translation import gettext_lazy as _ +from django.contrib.admin.utils import reverse_field_path +from django.core.exceptions import ValidationError +from django.contrib.admin.options import IncorrectLookupParameters + + +def flatten_used_parameters(used_parameters: dict, keep_list: bool = True): + """Flatten length 1 lists in dictionary.""" + # FieldListFilter.__init__ calls prepare_lookup_value, + # which returns a list if lookup_kwarg ends with "__in" + for k, v in used_parameters.items(): + if len(v) == 1 and (isinstance(v[0], list) or not keep_list): + used_parameters[k] = v[0] + +class MultiSelectMixin(object): + """Mixin for multi-select filters.""" + + def queryset(self, request, queryset): + """Build queryset based on choices.""" + params = Q() + for lookup_arg, value in self.used_parameters.items(): + params |= Q(**{lookup_arg: value}) + try: + return queryset.filter(params) + except (ValueError, 
ValidationError) as e: + # Fields may raise a ValueError or ValidationError when converting + # the parameters to the correct type. + raise IncorrectLookupParameters(e) + + def querystring_for_choices(self, val, changelist): + """Build query string based on new val.""" + lookup_vals = self.lookup_vals[:] + if val in self.lookup_vals: + lookup_vals.remove(val) + else: + lookup_vals.append(val) + if lookup_vals: + query_string = changelist.get_query_string({ + self.lookup_kwarg: ','.join(lookup_vals), + }, []) + else: + query_string = changelist.get_query_string({}, [self.lookup_kwarg]) + return query_string + + def querystring_for_isnull(self, changelist): + """Build query string based on a null val.""" + if self.lookup_val_isnull: + query_string = changelist.get_query_string({}, [self.lookup_kwarg_isnull]) + else: + query_string = changelist.get_query_string({ + self.lookup_kwarg_isnull: 'True', + }, []) + return query_string + + def has_output(self): + """Return if there is output.""" + return len(self.lookup_choices) > 1 + + def get_facet_counts(self, pk_attname, filtered_qs): + """Return count of __in facets.""" + if not self.lookup_kwarg.endswith("__in"): + raise NotImplementedError("Facets are only supported for default lookup_kwarg values, ending with '__in' " + "(got '%s')" % self.lookup_kwarg) + + orig_lookup_kwarg = self.lookup_kwarg + self.lookup_kwarg = self.lookup_kwarg.removesuffix("in") + "exact" + counts = super().get_facet_counts(pk_attname, filtered_qs) + self.lookup_kwarg = orig_lookup_kwarg + return counts + + +class MultiSelectFilter(MultiSelectMixin, admin.AllValuesFieldListFilter): + """Multi select filter for all kinds of fields.""" + + def __init__(self, field, request, params, model, model_admin, field_path): + self.lookup_kwarg = '%s__in' % field_path + self.lookup_kwarg_isnull = '%s__isnull' % field_path + lookup_vals = request.GET.get(self.lookup_kwarg) + self.lookup_vals = lookup_vals.split(',') if lookup_vals else list() + self.lookup_val_isnull = request.GET.get(self.lookup_kwarg_isnull) + self.empty_value_display = model_admin.get_empty_value_display() + parent_model, reverse_path = reverse_field_path(model, field_path) + # Obey parent ModelAdmin queryset when deciding which options to show + if model == parent_model: + queryset = model_admin.get_queryset(request) + else: + queryset = parent_model._default_manager.all() + self.lookup_choices = (queryset + .distinct() + .order_by(field.name) + .values_list(field.name, flat=True)) + super(admin.AllValuesFieldListFilter, self).__init__(field, request, params, model, model_admin, field_path) + flatten_used_parameters(self.used_parameters) + self.used_parameters = self.prepare_used_parameters(self.used_parameters) + + def prepare_querystring_value(self, value): + """Prepare the query string value.""" + # mask all commas or these values will be used + # in a comma-separated list as a GET parameter + return str(value).replace(',', '%~') + + def prepare_used_parameters(self, used_parameters): + """Prepare parameters.""" + # remove comma-mask from list-values for __in-lookups + for key, value in used_parameters.items(): + if not key.endswith('__in'): + continue + used_parameters[key] = [v.replace('%~', ',') for v in value] + return used_parameters + + def choices(self, changelist): + """Generate choices.""" + add_facets = getattr(changelist, "add_facets", False) + facet_counts = self.get_facet_queryset(changelist) if add_facets else None + yield { + 'selected': not self.lookup_vals and self.lookup_val_isnull is None, 
'query_string': changelist.get_query_string({}, [self.lookup_kwarg, self.lookup_kwarg_isnull]), + 'display': _('All'), + } + include_none = False + count = None + empty_title = self.empty_value_display + for i, val in enumerate(self.lookup_choices): + if add_facets: + count = facet_counts[f"{i}__c"] + if val is None: + include_none = True + empty_title = f"{empty_title} ({count})" if add_facets else empty_title + continue + val = str(val) + qval = self.prepare_querystring_value(val) + yield { + 'selected': qval in self.lookup_vals, + 'query_string': self.querystring_for_choices(qval, changelist), + "display": f"{val} ({count})" if add_facets else val, + } + if include_none: + yield { + 'selected': bool(self.lookup_val_isnull), + 'query_string': self.querystring_for_isnull(changelist), + 'display': empty_title, + } + + +class MultiSelectDropdownFilter(MultiSelectFilter): + """Multi select dropdown filter for all kinds of fields.""" + + template = 'multiselectdropdownfilter.html' + + def choices(self, changelist): + """Generate choices.""" + add_facets = getattr(changelist, "add_facets", False) + facet_counts = self.get_facet_queryset(changelist) if add_facets else None + query_string = changelist.get_query_string({}, [self.lookup_kwarg, self.lookup_kwarg_isnull]) + yield { + 'selected': not self.lookup_vals and self.lookup_val_isnull is None, + 'query_string': query_string, + 'display': _('All'), + } + include_none = False + count = None + empty_title = self.empty_value_display + for i, val in enumerate(self.lookup_choices): + if add_facets: + count = facet_counts[f"{i}__c"] + if val is None: + include_none = True + empty_title = f"{empty_title} ({count})" if add_facets else empty_title + continue + + val = str(val) + qval = self.prepare_querystring_value(val) + yield { + 'selected': qval in self.lookup_vals, + 'query_string': query_string, + "display": f"{val} ({count})" if add_facets else val, + 'value': urllib.parse.quote_plus(val), + 'key': self.lookup_kwarg, + } + if include_none: + yield { + 'selected': bool(self.lookup_val_isnull), + 'query_string': query_string, + "display": empty_title, + 'value': 'True', + 'key': self.lookup_kwarg_isnull, + } diff --git a/tdrs-backend/tdpservice/search_indexes/admin/reparse_meta.py b/tdrs-backend/tdpservice/search_indexes/admin/reparse_meta.py index 36034bb35..caff376e1 100644 --- a/tdrs-backend/tdpservice/search_indexes/admin/reparse_meta.py +++ b/tdrs-backend/tdpservice/search_indexes/admin/reparse_meta.py @@ -19,5 +19,7 @@ class ReparseMetaAdmin(ReadOnlyAdminMixin): list_filter = [ 'success', - 'finished' + 'finished', + 'fiscal_year', + 'fiscal_quarter', ] diff --git a/tdrs-backend/tdpservice/search_indexes/templates/multiselectdropdownfilter.html b/tdrs-backend/tdpservice/search_indexes/templates/multiselectdropdownfilter.html new file mode 100644 index 000000000..c8a8e9c78 --- /dev/null +++ b/tdrs-backend/tdpservice/search_indexes/templates/multiselectdropdownfilter.html @@ -0,0 +1,48 @@

+{% load i18n admin_urls %}
+{% blocktrans with filter_title=title %} By {{ filter_title }} {% endblocktrans %}
+{% for choice in choices|slice:":1" %} Show {{ choice.display }} {% endfor %}
[Note: the remaining markup of this 48-line template did not survive extraction; only the template tags above were recoverable. The surviving fragments indicate a collapsible "By <filter title>" heading, a "Show all" toggle built from the first choice, and a dropdown listing each remaining choice's `key` and `value` as selectable entries.]
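For orientation, a minimal sketch of how the `MultiSelectDropdownFilter` rendered by this template is typically attached to a `ModelAdmin`. This is not code from this changeset; the `ExampleRecordAdmin` name and the `stt_code` field path are illustrative assumptions, but the `(field_path, FilterClass)` tuple form in `list_filter` is standard Django admin API:

```python
# Hypothetical wiring, not part of this diff: Django accepts a
# (field_path, FilterClass) tuple in list_filter, and the filter then
# renders through multiselectdropdownfilter.html above.
from django.contrib import admin

from tdpservice.search_indexes.admin.multiselect_filter import MultiSelectDropdownFilter


class ExampleRecordAdmin(admin.ModelAdmin):
    """Admin for an illustrative model with an 'stt_code' column."""

    list_filter = [
        ("stt_code", MultiSelectDropdownFilter),  # multi-select dropdown on this field
        "fiscal_year",                            # plain built-in filter, for contrast
    ]
```

This is the same mechanism `STTFilter` in `filters.py` relies on, except that it subclasses the filter to constrain the available choices.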
diff --git a/tdrs-backend/tdpservice/settings/common.py b/tdrs-backend/tdpservice/settings/common.py index 83cc467b7..7a7baad72 100644 --- a/tdrs-backend/tdpservice/settings/common.py +++ b/tdrs-backend/tdpservice/settings/common.py @@ -53,7 +53,6 @@ class Common(Configuration): "storages", "django_elasticsearch_dsl", "django_elasticsearch_dsl_drf", - "more_admin_filters", # Local apps "tdpservice.core.apps.CoreConfig", "tdpservice.users", @@ -162,7 +161,7 @@ class Common(Configuration): TEMPLATES = [ { "BACKEND": "django.template.backends.django.DjangoTemplates", - "DIRS": STATICFILES_DIRS, + "DIRS": [os.path.join(BASE_DIR, "templates")], "APP_DIRS": True, "OPTIONS": { "context_processors": [ @@ -271,9 +270,8 @@ class Common(Configuration): # Sessions SESSION_ENGINE = "django.contrib.sessions.backends.signed_cookies" SESSION_COOKIE_HTTPONLY = True - SESSION_TIMEOUT = 30 SESSION_EXPIRE_AT_BROWSER_CLOSE = True - SESSION_COOKIE_AGE = 30 * 60 # 30 minutes + SESSION_COOKIE_AGE = 15 * 60 # 15 minutes # The CSRF token Cookie holds no security benefits when confined to HttpOnly. # Setting this to false to allow the frontend to include it in the header # of API POST calls to prevent false negative authorization errors.
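Because the settings above keep the `signed_cookies` session backend, the 15-minute limit is enforced by the age of the cookie's signed payload rather than by server-side state. A small self-contained illustration of those expiry mechanics, using Django's real `django.core.signing` API; the key and salt are dummies, and this is not the exact format of the session cookie:

```python
# Demonstrates the expiry behavior behind SESSION_COOKIE_AGE with a
# signed-cookie scheme: payloads older than max_age stop validating.
# Note SESSION_COOKIE_AGE is measured in seconds (15 * 60 here).
from django.conf import settings

settings.configure(SECRET_KEY="dummy-key-for-illustration")

from django.core import signing

payload = signing.dumps({"sessionid": "demo"}, salt="example")

# Inside the age window the payload round-trips cleanly...
assert signing.loads(payload, salt="example", max_age=15 * 60) == {"sessionid": "demo"}

# ...but once it is older than max_age, loads() raises SignatureExpired.
try:
    signing.loads(payload, salt="example", max_age=-1)  # force immediate expiry
except signing.SignatureExpired:
    print("session payload expired")
```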
diff --git a/tdrs-backend/tdpservice/templates/error_pages/500.html b/tdrs-backend/tdpservice/templates/error_pages/500.html new file mode 100644 index 000000000..0060aa69f --- /dev/null +++ b/tdrs-backend/tdpservice/templates/error_pages/500.html @@ -0,0 +1,105 @@
[Note: all 105 lines of this new template are HTML whose markup did not survive extraction. The recoverable text, in order: the page title "Page not found - TANF Data Portal" (apparently carried over from a 404 page, although the template renders a generic error); a "Skip to main content" link; the USWDS banner with "U.S. flag" alt text and the line "An Official website of the United States government"; an "Error" heading; the message "We’re sorry, there was an error in response."; the rendered "{{ error }}." detail from the template context; and a "Contact Us" link, presumably built from the `frontend` context variable.]
\ No newline at end of file
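This template is what the AMS login flow falls back to later in this changeset: `login_redirect_oidc.py` renders it to a string and returns it with a 503 when the AMS configuration fetch fails. Condensed here into a standalone sketch; the helper name is ours, but the template path, context keys, and status code mirror the diff below:

```python
# Condensed from login_redirect_oidc.py below, not a new API: render
# error_pages/500.html with the failure details and return it as a 503.
from django.conf import settings
from django.http import HttpResponse
from django.template.loader import render_to_string


def ams_unavailable_response(exc: Exception) -> HttpResponse:
    """Render error_pages/500.html with the AMS failure details."""
    rendered = render_to_string(
        "error_pages/500.html",
        {
            "error": f"Failed to get AMS configuration: {exc}",
            "frontend": settings.FRONTEND_BASE_URL,  # presumably feeds the Contact Us link
        },
    )
    return HttpResponse(rendered, status=503)
```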
diff --git a/tdrs-backend/tdpservice/users/api/login.py b/tdrs-backend/tdpservice/users/api/login.py index 338508148..5e511963b 100644 --- a/tdrs-backend/tdpservice/users/api/login.py +++ b/tdrs-backend/tdpservice/users/api/login.py @@ -199,40 +199,13 @@ def login_user(request, user, user_status): ) logger.info("%s: %s on %s", user_status, user.username, timezone.now) - def get(self, request, *args, **kwargs): - """Handle decoding auth token and authenticate user.""" - code = request.GET.get("code", None) - state = request.GET.get("state", None) - - if code is None: - logger.info("Redirecting call to main page. No code provided.") - return HttpResponseRedirect(settings.FRONTEND_BASE_URL) - - if state is None: - logger.info("Redirecting call to main page. No state provided.") - return HttpResponseRedirect(settings.FRONTEND_BASE_URL) - - token_endpoint_response = self.get_token_endpoint_response(code) - - if token_endpoint_response.status_code != 200: - return Response( - { - "error": ( - "Invalid Validation Code Or OpenID Connect Authenticator " - "Down!" - ) - }, - status=status.HTTP_400_BAD_REQUEST, - ) - - token_data = token_endpoint_response.json() - id_token = token_data.get("id_token") - + def _get_user_id_token(self, request, state, token_data): + """Get the user and id_token from the request.""" try: decoded_payload = self.validate_and_decode_payload(request, state, token_data) + id_token = token_data.get("id_token") user = self.handle_user(request, id_token, decoded_payload) return response_redirect(user, id_token) - except (InactiveUser, ExpiredToken) as e: logger.exception(e) return Response( @@ -276,6 +249,39 @@ def get(self, request, *args, **kwargs): status=status.HTTP_400_BAD_REQUEST, ) + def get(self, request, *args, **kwargs): + """Handle decoding auth token and authenticate user.""" + code = request.GET.get("code", None) + state = request.GET.get("state", None) + + if code is None or state is None: + logger.info(f"Redirecting call to main page. No {'code' if code is None else 'state'} provided.") + return HttpResponseRedirect(settings.FRONTEND_BASE_URL) + + try: + token_endpoint_response = self.get_token_endpoint_response(code) + except Exception as e: + logger.exception(e) + return Response( + { + "error": str(e) + }, + status=status.HTTP_503_SERVICE_UNAVAILABLE + ) + + if token_endpoint_response.status_code != 200: + return Response( + { + "error": ( + "Invalid Validation Code Or OpenID Connect Authenticator " + "Down!" + ) + }, + status=status.HTTP_400_BAD_REQUEST, + ) + + token_data = token_endpoint_response.json() + return self._get_user_id_token(request, state, token_data) class TokenAuthorizationLoginDotGov(TokenAuthorizationOIDC): """Define methods for handling login request from login.gov.""" @@ -333,7 +339,11 @@ def decode_payload(self, token_data, options=None): id_token = token_data.get("id_token") access_token = token_data.get("access_token") - ams_configuration = LoginRedirectAMS.get_ams_configuration() + try: + ams_configuration = LoginRedirectAMS.get_ams_configuration() + except Exception as e: + logger.error(e) + raise Exception(e) certs_endpoint = ams_configuration["jwks_uri"] cert_str = generate_jwt_from_jwks(certs_endpoint) issuer = ams_configuration["issuer"] @@ -351,7 +361,11 @@ def decode_payload(self, token_data, options=None): def get_token_endpoint_response(self, code): """Build out the query string params and full URL path for token endpoint.""" # First fetch the token endpoint from AMS. - ams_configuration = LoginRedirectAMS.get_ams_configuration() + try: + ams_configuration = LoginRedirectAMS.get_ams_configuration() + except Exception as e: + logger.error(e) + raise Exception(e) options = { "client_id": settings.AMS_CLIENT_ID, "client_secret": settings.AMS_CLIENT_SECRET, @@ -374,7 +388,11 @@ def get_auth_options(self, access_token, sub): auth_options = {} # Fetch userinfo endpoint for AMS to authenticate against hhsid, or # other user claims. - ams_configuration = LoginRedirectAMS.get_ams_configuration() + try: + ams_configuration = LoginRedirectAMS.get_ams_configuration() + except Exception as e: + logger.error(e) + raise Exception(e) userinfo_response = requests.post(ams_configuration["userinfo_endpoint"], {"access_token": access_token}) user_info = userinfo_response.json()
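The refactored `get()` above adds a 503 branch when the token endpoint itself is unreachable. A hypothetical pytest sketch of that branch; the request wiring is illustrative and not the repo's actual test harness:

```python
# Hypothetical test, not from this changeset: force
# get_token_endpoint_response to raise and expect the new 503 response.
from unittest import mock

from rest_framework import status
from rest_framework.test import APIRequestFactory

from tdpservice.users.api.login import TokenAuthorizationOIDC


@mock.patch.object(
    TokenAuthorizationOIDC,
    "get_token_endpoint_response",
    side_effect=Exception("token endpoint unreachable"),
)
def test_token_endpoint_failure_returns_503(mock_endpoint):
    # code and state are present, so the view proceeds to the token call.
    request = APIRequestFactory().get("/v1/login/", {"code": "x", "state": "y"})
    response = TokenAuthorizationOIDC.as_view()(request)
    assert response.status_code == status.HTTP_503_SERVICE_UNAVAILABLE
```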
diff --git a/tdrs-backend/tdpservice/users/api/login_redirect_oidc.py b/tdrs-backend/tdpservice/users/api/login_redirect_oidc.py index 585c0ff47..2a63adc6d 100644 --- a/tdrs-backend/tdpservice/users/api/login_redirect_oidc.py +++ b/tdrs-backend/tdpservice/users/api/login_redirect_oidc.py @@ -4,11 +4,13 @@ import requests import secrets import time +from rest_framework import status from urllib.parse import quote_plus, urlencode from django.conf import settings -from django.http import HttpResponseRedirect +from django.http import HttpResponseRedirect, HttpResponse from django.views.generic.base import RedirectView +from django.template.loader import render_to_string logger = logging.getLogger(__name__) @@ -93,19 +95,32 @@ def get_ams_configuration(): Includes currently published URLs for authorization, token, etc. """ - r = requests.get(settings.AMS_CONFIGURATION_ENDPOINT) - data = r.json() - return data + r = requests.get(settings.AMS_CONFIGURATION_ENDPOINT, timeout=10) + if r.status_code != 200: + logger.error( + f"Failed to get AMS configuration: {r.status_code} - {r.text}" + ) + raise Exception(f"Failed to get AMS configuration: {r.status_code} - {r.text}") + else: + data = r.json() + return data def get(self, request, *args, **kwargs): """Handle login workflow based on request origin.""" # Create state and nonce to track requests state = secrets.token_hex(32) nonce = secrets.token_hex(32) - """Get request and manage login information with AMS OpenID.""" - configuration = self.get_ams_configuration() - + try: + configuration = self.get_ams_configuration() + except Exception as e: + logger.error(f"Failed to get AMS configuration: {e}") + rendered = render_to_string( + 'error_pages/500.html', + {'error': f"Failed to get AMS configuration: {e}", + 'frontend': settings.FRONTEND_BASE_URL}) + return HttpResponse(rendered, + status=status.HTTP_503_SERVICE_UNAVAILABLE,) auth_params = { "client_id": settings.AMS_CLIENT_ID, "nonce": nonce, diff --git a/tdrs-backend/tdpservice/users/api/middleware.py b/tdrs-backend/tdpservice/users/api/middleware.py index 5b82ae93f..7a8922384 100644 --- a/tdrs-backend/tdpservice/users/api/middleware.py +++ b/tdrs-backend/tdpservice/users/api/middleware.py @@ -13,7 +13,7 @@ def __call__(self, request): """Update cookie.""" response = self.get_response(request) now = datetime.datetime.now() - timeout = now + datetime.timedelta(minutes=settings.SESSION_TIMEOUT) + timeout = now + datetime.timedelta(seconds=settings.SESSION_COOKIE_AGE) # if there is no user, the user is currently # in the authentication process so we can't
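One wrinkle in the middleware change above: the old `SESSION_TIMEOUT` was expressed in minutes, while Django's `SESSION_COOKIE_AGE` is expressed in seconds, so the replacement needs `timedelta(seconds=...)` rather than `timedelta(minutes=...)`. A minimal sketch of the sliding-refresh pattern the middleware implements; the class and cookie handling are illustrative, not the repo's exact code:

```python
# Sketch only: recompute the cookie expiry on every response so an active
# session keeps sliding forward by SESSION_COOKIE_AGE. See middleware.py
# above for the real hook.
import datetime

from django.conf import settings


class SlidingSessionMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        # SESSION_COOKIE_AGE is in seconds, not minutes.
        timeout = datetime.datetime.now() + datetime.timedelta(
            seconds=settings.SESSION_COOKIE_AGE
        )
        if getattr(request, "user", None) is not None and request.user.is_authenticated:
            response.set_cookie(
                "id_token",
                value=request.COOKIES.get("id_token", ""),
                expires=timeout,
                secure=True,
                httponly=True,
            )
        return response
```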
diff --git a/tdrs-backend/tdpservice/users/api/utils.py b/tdrs-backend/tdpservice/users/api/utils.py index 910646e05..5f6e348c6 100644 --- a/tdrs-backend/tdpservice/users/api/utils.py +++ b/tdrs-backend/tdpservice/users/api/utils.py @@ -14,14 +14,12 @@ import jwt import requests from jwcrypto import jwk -from rest_framework import status -from rest_framework.response import Response from django.conf import settings logger = logging.getLogger(__name__) now = datetime.datetime.now() -timeout = now + datetime.timedelta(minutes=settings.SESSION_TIMEOUT) +timeout = now + datetime.timedelta(seconds=settings.SESSION_COOKIE_AGE) """ Validate the nonce and state returned by login.gov API calls match those @@ -149,35 +147,6 @@ def get_nonce_and_state(session): return validation_keys -""" -Returns a found users information along with an httpOnly cookie. - -:param self: parameter to permit django python to call a method within its own class -:param user: current user associated with this session -:param status_message: Helper message to note how the user was found -:param id_token: encoded token returned by login.gov/token -""" - - -def response_internal(user, status_message, id_token): - """Respond with an httpOnly cookie to secure the session with the client.""" - response = Response( - {"user_id": user.pk, "email": user.username, "status": status_message}, - status=status.HTTP_200_OK, - ) - response.set_cookie( - "id_token", - value=id_token, - max_age=None, - expires=timeout, - path="/", - domain=None, - secure=True, - httponly=True, - ) - return response - - def response_redirect(self, id_token): """ Redirects to web app with an httpOnly cookie.
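`response_internal` is deleted above because the flow now goes exclusively through `response_redirect`, whose body is not shown in this diff. Judging from its docstring and the deleted function, the pattern is roughly the following sketch; the helper name and parameter handling are assumptions, not copied from the repo:

```python
# Assumed shape of response_redirect, based on its docstring and the
# deleted response_internal above: redirect to the SPA with the id_token
# in an HttpOnly cookie rather than returning a JSON body.
from django.conf import settings
from django.http import HttpResponseRedirect


def redirect_with_token(id_token):
    response = HttpResponseRedirect(settings.FRONTEND_BASE_URL)
    response.set_cookie(
        "id_token",
        value=id_token,
        max_age=None,   # session cookie; the server-side age limit still applies
        path="/",
        secure=True,
        httponly=True,  # mirrors the deleted response_internal call
    )
    return response
```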
diff --git a/tdrs-backend/tdpservice/users/test/test_api/test_login.py b/tdrs-backend/tdpservice/users/test/test_api/test_login.py new file mode 100644 index 000000000..a1e7a9a41 --- /dev/null +++ b/tdrs-backend/tdpservice/users/test/test_api/test_login.py @@ -0,0 +1,46 @@ +"""Test the LoginRedirectAMS class.""" +import pytest + +from unittest import mock +from tdpservice.users.api.login_redirect_oidc import LoginRedirectAMS + +@mock.patch("requests.get") +def test_get_ams_configuration(requests_get_mock): + """Test the LoginRedirectAMS class.""" + requests_get_mock.return_value.status_code = 200 + requests_get_mock.return_value.json.return_value = {"key": "test"} + returned_value = LoginRedirectAMS.get_ams_configuration() + assert returned_value == {'key': 'test'} + + # Test if the configuration is not returned + requests_get_mock.return_value.status_code = 503 + with pytest.raises(Exception): + LoginRedirectAMS.get_ams_configuration() + + +@mock.patch("requests.get") +@mock.patch("secrets.token_hex") +def test_LoginRedirectAMS_get(secrets_token_hex_mock, requests_get_mock): + """Test the LoginRedirectAMS class.""" + class DummyRequest: + session = { + "state_nonce_tracker": "dummy_state_nonce_tracker" + } + + requests_get_mock.return_value.status_code = 200 + requests_get_mock.return_value.json.return_value = {"authorization_endpoint": "dummy_authorization_endpoint"} + + secrets_token_hex_mock.return_value = "dummy_state_nonce" + + login_redirect_ams = LoginRedirectAMS() + + response = login_redirect_ams.get(DummyRequest) + assert response.url is not None + assert "dummy_state_nonce" in response.url + assert "dummy_authorization_endpoint" in response.url + + # Test if the AMS server is down + requests_get_mock.return_value.status_code = 503 + login_redirect_ams = LoginRedirectAMS() + response = login_redirect_ams.get("request") + assert response.status_code == 503 diff --git a/tdrs-backend/tdpservice/users/test/test_auth.py b/tdrs-backend/tdpservice/users/test/test_auth.py index 2ace23305..98ed19487 100644 --- a/tdrs-backend/tdpservice/users/test/test_auth.py +++ b/tdrs-backend/tdpservice/users/test/test_auth.py @@ -8,6 +8,7 @@ from django.core.exceptions import ImproperlyConfigured, SuspiciousOperation from rest_framework import status from rest_framework.test import APIRequestFactory +from unittest import mock import jwt import pytest @@ -18,7 +19,6 @@ generate_client_assertion, generate_jwt_from_jwks, generate_token_endpoint_parameters, - response_internal, ) from tdpservice.users.authentication import CustomAuthentication from tdpservice.users.models import User @@ -278,6 +278,27 @@ def test_auth_user_hhs_id_update(self, user): user_by_id = CustomAuthentication.authenticate(username=user.username, hhs_id=self.test_hhs_id) assert str(user_by_id.hhs_id) == self.test_hhs_id + @mock.patch("requests.get") + def test_bad_AMS_configuration( + self, + ams_states_factory, + req_factory, + user + ): + """Test login fails with a 503 when the AMS configuration cannot be fetched.""" + request = req_factory + request = create_session(request, ams_states_factory) + user.hhs_id = self.test_hhs_id + # test new hash + user.login_gov_uuid = None + user.save() + + view = TokenAuthorizationAMS.as_view() + response = view(request) + assert response.status_code == status.HTTP_503_SERVICE_UNAVAILABLE + assert b'Failed to get AMS configuration' in response.render().content + + def test_login_gov_redirect(api_client): """Test login.gov login url redirects.""" response = api_client.get("/v1/login/dotgov") @@ -428,15 +449,6 @@ def test_login_fails_with_bad_data(api_client): assert response.status_code == status.HTTP_400_BAD_REQUEST -@pytest.mark.django_db -def test_response_internal(user): - """Test response internal works.""" - response = response_internal( - user, status_message="hello", id_token={"fake": "stuff"} - ) - assert response.status_code == status.HTTP_200_OK - - @pytest.mark.django_db def test_generate_jwt_from_jwks(mocker): """Test JWT generation.""" diff --git a/tdrs-frontend/src/components/Header/Header.jsx b/tdrs-frontend/src/components/Header/Header.jsx index 201cd55bf..bcb614267 100644 --- a/tdrs-frontend/src/components/Header/Header.jsx +++ b/tdrs-frontend/src/components/Header/Header.jsx @@ -90,7 +90,7 @@ function Header() {