Commit

Merge branch 'develop' into sentry-adr

raftmsohani authored Oct 1, 2024
2 parents 828d568 + c5f87eb commit cf56075

Showing 16 changed files with 919 additions and 182 deletions.
89 changes: 89 additions & 0 deletions docs/Sprint-Review/sprint-107-summary.md
@@ -0,0 +1,89 @@
# sprint-107-summary

8/28/2024 - 9/10/2024

### Priority Setting

* Re-parsing epic  
* Postgres db access  
* UX research with DIGIT team 
* Continuous communication with STTs about latest TDP features and updates 

### Sprint Goal

**Dev:**

_**Re-parsing, Admin Console Improvements, and Application Health Monitoring work**_

* \#3106 — Re-Parse Django Action 
* \#3137 — \[bug\] OFA unable to export data to csv by record type and fiscal period
* \#3074 — TDP Data Files page permissions for DIGIT & Sys Admin user groups
* \#3044 — Prometheus/Grafana - Local Environment
* \#3042 — Sentry in cloud.gov

**DevOps:**

_**Successful deployments across environments and pipeline stability investments**_

* \#2965 — As tech lead, I want a database seed implemented for testing
* \#2458 — Integrate Nexus into CircleCI

**Design:**

_**Support reviews, In-app banner to support parsed data, Continue Error Audit (Cat 4)**_

* \#3156 — Release Notes Email Template
* \#3100 — \[Design Deliverable] Update stakeholders & personas document
* \#2968 — \[Design Deliverable] Update Error Audit for Cat 4 / QA

## Tickets

### Completed/Merged

* [#2561 As a sys admin, I need TDP to automatically deactivate accounts that are inactive for 180 days](https://github.com/raft-tech/TANF-app/issues/2561)
* [#2792 \[Error Audit\] Category 3 error messages clean-up ](https://github.com/raft-tech/TANF-app/issues/2792)
* [#3043 Sentry: Local environment for Debugging](https://github.com/raft-tech/TANF-app/issues/3043)
* [#3064 Re-parse Meta Model](https://github.com/raft-tech/TANF-app/issues/3064)
* [#3065 Spike - Guarantee Sequential Execution of Re-parse Command](https://github.com/raft-tech/TANF-app/issues/3065)
* [#3074 TDP Data Files page permissions for DIGIT & Sys Admin user groups ](https://github.com/raft-tech/TANF-app/issues/3074)
* [#3076 Admin Filter Enhancements for Data Files Page ](https://github.com/raft-tech/TANF-app/issues/3076)
* [#3078 \[Research Synthesis\] DIGIT Admin Experience Improvements ](https://github.com/raft-tech/TANF-app/issues/3078)
* [#3087 Admin By Newest Filter Enhancements for Data Files Page ](https://github.com/raft-tech/TANF-app/issues/3087)
* [#3114 \[Design Spike\] In-app banner for submission history pages w/ data parsed before May 2024 ](https://github.com/raft-tech/TANF-app/issues/3114)
* [#3142 \[Research Spike\] Get more detail about Yun & DIGIT's data workflow and use cases ](https://github.com/raft-tech/TANF-app/issues/3142)

### Submitted (QASP Review, OCIO Review)

*

### Ready to Merge

* [#2883 Pre-Made Reporting Dashboards on Kibana ](https://github.com/raft-tech/TANF-app/issues/2883)
* [#3102 Admin Exp: Django Implement Multi-Select Fiscal Period Dropdown For Data Export ](https://github.com/raft-tech/TANF-app/issues/3102)

### Closed (Not Merged)

* [#3110 Spike - Investigate Custom Filter Integration ](https://github.com/raft-tech/TANF-app/issues/3110)
* [#3156 Release Notes Knowledge Center and Email Template ](https://github.com/raft-tech/TANF-app/issues/3156)

### Moved to Next Sprint 

**In Progress** 

* [#2968 \[Design Deliverable\] Update Error Audit for Cat 4 / QA ](https://github.com/raft-tech/TANF-app/issues/2968)
* [#3060 As a TDP user, I need to stay logged in when I'm actively using the system ](https://github.com/raft-tech/TANF-app/issues/3060)
* [#3100 \[Design Deliverable\] Update stakeholders & personas document ](https://github.com/raft-tech/TANF-app/issues/3100)
* [#3106 Re-Parse Django Action ](https://github.com/raft-tech/TANF-app/issues/3106)
* [#3137 \[bug\] OFA unable to export data to csv by record type and fiscal period ](https://github.com/raft-tech/TANF-app/issues/3137)
* [#3164 \[Research Synthesis\] Yun & DIGIT's data workflow and use cases ](https://github.com/raft-tech/TANF-app/issues/3164)
* [#3170 Reparse Command Fails when Queryset is Large ](https://github.com/raft-tech/TANF-app/issues/3170)
* [#3179 Spike - How We Work / Hopes & Fears Workshop prep ](https://github.com/raft-tech/TANF-app/issues/3179)

**Blocked**

*

**Raft Review**

* [#2458 Integrate Nexus into CircleCI ](https://github.com/raft-tech/TANF-app/issues/2458)
* [#2965 As tech lead, I want a database seed implemented for testing ](https://github.com/raft-tech/TANF-app/issues/2965)
* [#3044 Prometheus/Grafana - Local Environment ](https://github.com/raft-tech/TANF-app/issues/3044)
4 changes: 2 additions & 2 deletions tdrs-backend/Dockerfile
@@ -1,4 +1,4 @@
-FROM python:3.10.8-slim-buster
+FROM python:3.10.8-slim-bullseye
 ENV PYTHONUNBUFFERED 1

 ARG user=tdpuser
@@ -17,7 +17,7 @@ RUN apt-get -y upgrade
 # Postgres client setup
 RUN apt --purge remove postgresql postgresql-* && apt install -y postgresql-common curl ca-certificates && install -d /usr/share/postgresql-common/pgdg && \
     curl -o /usr/share/postgresql-common/pgdg/apt.postgresql.org.asc --fail https://www.postgresql.org/media/keys/ACCC4CF8.asc && \
-    sh -c 'echo "deb [signed-by=/usr/share/postgresql-common/pgdg/apt.postgresql.org.asc] https://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list' && \
+    sh -c 'echo "deb [signed-by=/usr/share/postgresql-common/pgdg/apt.postgresql.org.asc] https://apt.postgresql.org/pub/repos/apt bullseye-pgdg main" > /etc/apt/sources.list.d/pgdg.list' && \
     apt -y update && apt install postgresql-client-15 -y
 # Install packages:
 RUN apt install -y gcc graphviz graphviz-dev libpq-dev python3-dev vim
6 changes: 6 additions & 0 deletions tdrs-backend/tdpservice/conftest.py
@@ -395,6 +395,12 @@ def test_private_key():
     yield get_private_key(key)


+@pytest.fixture()
+def system_user():
+    """Create system user."""
+    return UserFactory.create(username='system')
+
+
 # Register factories with pytest-factoryboy for automatic dependency injection
 # of model-related fixtures into tests.
 register(OwaspZapScanFactory)
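The `system_user` fixture above is injected into tests by parameter name. A minimal sketch of that mechanism, using a stand-in factory (the real `UserFactory` is a factory_boy factory inside tdpservice, so the class below is an assumption for illustration only):

```python
import pytest


class FakeUserFactory:
    """Stand-in for tdpservice's UserFactory; illustrative only."""

    @staticmethod
    def create(username):
        return {"username": username}


@pytest.fixture()
def system_user():
    """Create system user, mirroring the fixture this commit adds to conftest.py."""
    return FakeUserFactory.create(username="system")


def test_uses_system_user(system_user):
    # pytest resolves the fixture by matching the parameter name "system_user".
    assert system_user["username"] == "system"
```

Run under pytest, the test function receives the fixture's return value automatically; no explicit call is needed.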
7 changes: 3 additions & 4 deletions tdrs-backend/tdpservice/parsers/parse.py
@@ -211,11 +211,10 @@ def rollback_records(unsaved_records, datafile):
                                  f"Encountered error while indexing datafile documents: \n{e}",
                                  "error"
                                  )
-            logger.warn("Encountered an Elastic exception, enforcing DB cleanup.")
+            logger.warning("Encountered an Elastic exception, enforcing DB cleanup.")
             num_deleted, models = qset.delete()
-            logger.info("Succesfully performed DB cleanup after elastic failure.")
             log_parser_exception(datafile,
-                                 "Succesfully performed DB cleanup after elastic failure.",
+                                 "Succesfully performed DB cleanup after elastic failure in rollback_records.",
                                  "info"
                                  )
         except DatabaseError as e:
@@ -310,7 +309,7 @@ def delete_serialized_records(duplicate_manager, dfs):
             total_deleted += num_deleted
             dfs.total_number_of_records_created -= num_deleted
             log_parser_exception(dfs.datafile,
-                                 "Succesfully performed DB cleanup after elastic failure.",
+                                 "Succesfully performed DB cleanup after elastic failure in delete_serialized_records.",
                                  "info"
                                  )
         except DatabaseError as e:
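The `logger.warn` → `logger.warning` change tracks the standard library: `Logger.warn` is a long-deprecated alias of `Logger.warning`. (The pre-existing "Succesfully" typo in the log messages is carried over unchanged by the commit.) A small self-contained sketch of the supported spelling, with an isolated logger so the demo does not touch global logging config:

```python
import io
import logging

# Build an isolated logger writing to an in-memory stream.
logger = logging.getLogger("tdpservice.parsers.demo")
stream = io.StringIO()
logger.addHandler(logging.StreamHandler(stream))
logger.setLevel(logging.INFO)

# Logger.warn is a deprecated alias; warning() is the supported call.
logger.warning("Encountered an Elastic exception, enforcing DB cleanup.")
print(stream.getvalue().strip())
```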
82 changes: 41 additions & 41 deletions tdrs-backend/tdpservice/parsers/test/factories.py
@@ -184,43 +184,43 @@ class Meta:
     EMPLOYMENT_STATUS = 1
     WORK_ELIGIBLE_INDICATOR = "01"
     WORK_PART_STATUS = "01"
-    UNSUB_EMPLOYMENT = 1
-    SUB_PRIVATE_EMPLOYMENT = 1
-    SUB_PUBLIC_EMPLOYMENT = 1
-    WORK_EXPERIENCE_HOP = 1
-    WORK_EXPERIENCE_EA = 1
-    WORK_EXPERIENCE_HOL = 1
-    OJT = 1
-    JOB_SEARCH_HOP = 1
-    JOB_SEARCH_EA = 1
-    JOB_SEARCH_HOL = 1
-    COMM_SERVICES_HOP = 1
-    COMM_SERVICES_EA = 1
-    COMM_SERVICES_HOL = 1
-    VOCATIONAL_ED_TRAINING_HOP = 1
-    VOCATIONAL_ED_TRAINING_EA = 1
-    VOCATIONAL_ED_TRAINING_HOL = 1
-    JOB_SKILLS_TRAINING_HOP = 1
-    JOB_SKILLS_TRAINING_EA = 1
-    JOB_SKILLS_TRAINING_HOL = 1
-    ED_NO_HIGH_SCHOOL_DIPL_HOP = 1
-    ED_NO_HIGH_SCHOOL_DIPL_EA = 1
-    ED_NO_HIGH_SCHOOL_DIPL_HOL = 1
-    SCHOOL_ATTENDENCE_HOP = 1
-    SCHOOL_ATTENDENCE_EA = 1
-    SCHOOL_ATTENDENCE_HOL = 1
-    PROVIDE_CC_HOP = 1
-    PROVIDE_CC_EA = 1
-    PROVIDE_CC_HOL = 1
-    OTHER_WORK_ACTIVITIES = 1
-    DEEMED_HOURS_FOR_OVERALL = 1
-    DEEMED_HOURS_FOR_TWO_PARENT = 1
-    EARNED_INCOME = 1
-    UNEARNED_INCOME_TAX_CREDIT = 1
-    UNEARNED_SOCIAL_SECURITY = 1
-    UNEARNED_SSI = 1
-    UNEARNED_WORKERS_COMP = 1
-    OTHER_UNEARNED_INCOME = 1
+    UNSUB_EMPLOYMENT = "01"
+    SUB_PRIVATE_EMPLOYMENT = "01"
+    SUB_PUBLIC_EMPLOYMENT = "01"
+    WORK_EXPERIENCE_HOP = "01"
+    WORK_EXPERIENCE_EA = "01"
+    WORK_EXPERIENCE_HOL = "01"
+    OJT = "01"
+    JOB_SEARCH_HOP = "01"
+    JOB_SEARCH_EA = "01"
+    JOB_SEARCH_HOL = "01"
+    COMM_SERVICES_HOP = "01"
+    COMM_SERVICES_EA = "01"
+    COMM_SERVICES_HOL = "01"
+    VOCATIONAL_ED_TRAINING_HOP = "01"
+    VOCATIONAL_ED_TRAINING_EA = "01"
+    VOCATIONAL_ED_TRAINING_HOL = "01"
+    JOB_SKILLS_TRAINING_HOP = "01"
+    JOB_SKILLS_TRAINING_EA = "01"
+    JOB_SKILLS_TRAINING_HOL = "01"
+    ED_NO_HIGH_SCHOOL_DIPL_HOP = "01"
+    ED_NO_HIGH_SCHOOL_DIPL_EA = "01"
+    ED_NO_HIGH_SCHOOL_DIPL_HOL = "01"
+    SCHOOL_ATTENDENCE_HOP = "01"
+    SCHOOL_ATTENDENCE_EA = "01"
+    SCHOOL_ATTENDENCE_HOL = "01"
+    PROVIDE_CC_HOP = "01"
+    PROVIDE_CC_EA = "01"
+    PROVIDE_CC_HOL = "01"
+    OTHER_WORK_ACTIVITIES = "01"
+    DEEMED_HOURS_FOR_OVERALL = "01"
+    DEEMED_HOURS_FOR_TWO_PARENT = "01"
+    EARNED_INCOME = "01"
+    UNEARNED_INCOME_TAX_CREDIT = "01"
+    UNEARNED_SOCIAL_SECURITY = "01"
+    UNEARNED_SSI = "01"
+    UNEARNED_WORKERS_COMP = "01"
+    OTHER_UNEARNED_INCOME = "01"


 class TanfT3Factory(factory.django.DjangoModelFactory):
@@ -451,10 +451,10 @@ class Meta:
     CURRENT_MONTH_STATE_EXEMPT = 1
     EMPLOYMENT_STATUS = 1
     WORK_PART_STATUS = "01"
-    UNSUB_EMPLOYMENT = 1
-    SUB_PRIVATE_EMPLOYMENT = 1
-    SUB_PUBLIC_EMPLOYMENT = 1
-    OJT = 1
+    UNSUB_EMPLOYMENT = "01"
+    SUB_PRIVATE_EMPLOYMENT = "01"
+    SUB_PUBLIC_EMPLOYMENT = "01"
+    OJT = "01"
     JOB_SEARCH = '1'
     COMM_SERVICES = '1'
     VOCATIONAL_ED_TRAINING = '1'
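The factories switch these fields from the integer `1` to the zero-padded string `"01"`, presumably because the corresponding TANF work-activity fields are fixed-width two-character codes: an integer serializes as `"1"` and would fail a width or exact-match check. A small illustration of the difference (the validator below is hypothetical, not the project's actual one):

```python
def is_valid_code(value, width=2):
    """Hypothetical fixed-width code check: a string of exactly `width` digits."""
    return isinstance(value, str) and len(value) == width and value.isdigit()


print(is_valid_code(1))    # -> False: an int is not a two-char code
print(is_valid_code("1"))  # -> False: wrong width
print(is_valid_code("01"))  # -> True
print(str(1).zfill(2))     # -> "01": how an int would need normalizing first
```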
9 changes: 9 additions & 0 deletions tdrs-backend/tdpservice/parsers/test/test_parse.py
@@ -2,6 +2,7 @@


 import pytest
+import os
 from django.contrib.admin.models import LogEntry
 from django.conf import settings
 from django.db.models import Q as Query
@@ -1739,6 +1740,9 @@ def test_parse_duplicate(file, batch_size, model, record_type, num_errors, dfs,
     settings.BULK_CREATE_BATCH_SIZE = batch_size

     parse.parse_datafile(datafile, dfs)
+
+    settings.BULK_CREATE_BATCH_SIZE = os.getenv("BULK_CREATE_BATCH_SIZE", 10000)
+
     parser_errors = ParserError.objects.filter(file=datafile,
                                                error_type=ParserErrorCategoryChoices.CASE_CONSISTENCY).order_by('id')
     for e in parser_errors:
@@ -1782,6 +1786,9 @@ def test_parse_partial_duplicate(file, batch_size, model, record_type, num_error
     settings.BULK_CREATE_BATCH_SIZE = batch_size

     parse.parse_datafile(datafile, dfs)
+
+    settings.BULK_CREATE_BATCH_SIZE = os.getenv("BULK_CREATE_BATCH_SIZE", 10000)
+
     parser_errors = ParserError.objects.filter(file=datafile,
                                                error_type=ParserErrorCategoryChoices.CASE_CONSISTENCY).order_by('id')
     for e in parser_errors:
@@ -1806,6 +1813,8 @@ def test_parse_cat_4_edge_case_file(cat4_edge_case_file, dfs):

     parse.parse_datafile(cat4_edge_case_file, dfs)

+    settings.BULK_CREATE_BATCH_SIZE = os.getenv("BULK_CREATE_BATCH_SIZE", 10000)
+
     parser_errors = ParserError.objects.filter(file=cat4_edge_case_file).filter(
         error_type=ParserErrorCategoryChoices.CASE_CONSISTENCY)

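The lines added here restore `settings.BULK_CREATE_BATCH_SIZE` after each test mutates it. One caveat worth noting (an observation about the pattern, not part of the commit): `os.getenv` returns a *string* whenever the environment variable is set, and the integer default only when it is unset, so the setting's type varies unless the value is cast explicitly. A hedged sketch of the safer variant:

```python
import os


def get_batch_size(default=10_000):
    """Read BULK_CREATE_BATCH_SIZE from the environment, coercing to int.

    os.getenv returns the string value when the variable is set and the
    (integer) default only when it is unset, so cast to keep one type.
    """
    return int(os.getenv("BULK_CREATE_BATCH_SIZE", default))


os.environ["BULK_CREATE_BATCH_SIZE"] = "500"
print(get_batch_size())  # -> 500, an int rather than the string "500"
```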
2 changes: 1 addition & 1 deletion tdrs-backend/tdpservice/parsers/validators/category3.py
@@ -386,7 +386,7 @@ def validate(record, row_schema):
                 "Caught exception in validator: validate__WORK_ELIGIBLE_INDICATOR__HOH__AGE. " +
                 f"With field values: {vals}."
             )
-            logger.error(f'Exception: {e}')
+            logger.debug(f'Exception: {e}')
             # Per conversation with Alex on 03/26/2024, returning the true case during exception handling to avoid
             # confusing the STTs.
             return true_case
25 changes: 14 additions & 11 deletions tdrs-backend/tdpservice/scheduling/management/db_backup.py
@@ -20,7 +20,10 @@


 OS_ENV = os.environ
-content_type = ContentType.objects.get_for_model(LogEntry)
+
+def get_content_type():
+    """Get content type for log entry."""
+    return ContentType.objects.get_for_model(LogEntry)

 def get_system_values():
     """Return dict of keys and settings to use whether local or deployed."""
@@ -91,7 +94,7 @@ def backup_database(file_name,
     logger.info(msg)
     LogEntry.objects.log_action(
         user_id=system_user.pk,
-        content_type_id=content_type.pk,
+        content_type_id=get_content_type().pk,
         object_id=None,
         object_repr="Executed Database Backup",
         action_flag=ADDITION,
@@ -123,7 +126,7 @@ def restore_database(file_name, postgres_client, database_uri, system_user):
     msg = "Completed database creation."
     LogEntry.objects.log_action(
         user_id=system_user.pk,
-        content_type_id=content_type.pk,
+        content_type_id=get_content_type().pk,
         object_id=None,
         object_repr="Executed Database create",
         action_flag=ADDITION,
@@ -145,7 +148,7 @@ def restore_database(file_name, postgres_client, database_uri, system_user):
     msg = "Completed database restoration."
     LogEntry.objects.log_action(
         user_id=system_user.pk,
-        content_type_id=content_type.pk,
+        content_type_id=get_content_type().pk,
         object_id=None,
         object_repr="Executed Database restore",
         action_flag=ADDITION,
@@ -177,7 +180,7 @@ def upload_file(file_name, bucket, sys_values, system_user, object_name=None, re
     msg = "Successfully uploaded {} to s3://{}/{}.".format(file_name, bucket, object_name)
     LogEntry.objects.log_action(
         user_id=system_user.pk,
-        content_type_id=content_type.pk,
+        content_type_id=get_content_type().pk,
         object_id=None,
         object_repr="Executed database backup S3 upload",
         action_flag=ADDITION,
@@ -208,7 +211,7 @@ def download_file(bucket,
     msg = "Successfully downloaded s3 file {}/{} to {}.".format(bucket, object_name, file_name)
     LogEntry.objects.log_action(
         user_id=system_user.pk,
-        content_type_id=content_type.pk,
+        content_type_id=get_content_type().pk,
         object_id=None,
         object_repr="Executed database backup S3 download",
         action_flag=ADDITION,
@@ -293,7 +296,7 @@ def main(argv, sys_values, system_user):
     if arg_to_backup:
         LogEntry.objects.log_action(
             user_id=system_user.pk,
-            content_type_id=content_type.pk,
+            content_type_id=get_content_type().pk,
             object_id=None,
             object_repr="Begining Database Backup",
             action_flag=ADDITION,
@@ -316,7 +319,7 @@

         LogEntry.objects.log_action(
             user_id=system_user.pk,
-            content_type_id=content_type.pk,
+            content_type_id=get_content_type().pk,
             object_id=None,
             object_repr="Finished Database Backup",
             action_flag=ADDITION,
@@ -329,7 +332,7 @@ def main(argv, sys_values, system_user):
     elif arg_to_restore:
         LogEntry.objects.log_action(
             user_id=system_user.pk,
-            content_type_id=content_type.pk,
+            content_type_id=get_content_type().pk,
             object_id=None,
             object_repr="Begining Database Restore",
             action_flag=ADDITION,
@@ -352,7 +355,7 @@ def main(argv, sys_values, system_user):

         LogEntry.objects.log_action(
             user_id=system_user.pk,
-            content_type_id=content_type.pk,
+            content_type_id=get_content_type().pk,
             object_id=None,
             object_repr="Finished Database Restore",
             action_flag=ADDITION,
@@ -377,7 +380,7 @@ def run_backup(arg):
     logger.error(f"Caught Exception in run_backup. Exception: {e}.")
     LogEntry.objects.log_action(
         user_id=system_user.pk,
-        content_type_id=content_type.pk,
+        content_type_id=get_content_type().pk,
         object_id=None,
         object_repr="Exception in run_backup",
         action_flag=ADDITION,
(Diff truncated; remaining changed files not shown.)