diff --git a/docs/Sprint-Review/sprint-80-summary.md b/docs/Sprint-Review/sprint-80-summary.md new file mode 100644 index 000000000..908ef6259 --- /dev/null +++ b/docs/Sprint-Review/sprint-80-summary.md @@ -0,0 +1,51 @@ +# Sprint 80 Summary + +08/16/23 - 08/29/23 + +Velocity: Dev (20) + +## Sprint Goal +* Continue parsing engine development for TANF Sections (01-04), complete the decoupling backend application spike, and continue the integration test epic (2282). +* UX to continue regional staff research, service design blueprint (.1 and .2), and bridge onboarding to >85% of total users. +* DevOps to investigate nightly scan issues and resolve utility images for CircleCI and the container registry. + + +## Tickets +### Completed/Merged +* [#2369 As tech lead, we need the parsing engine to run quality checks across TANF section 1](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2369) +* [#1110 TANF (03) Parsing and Validation](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/1110) +* [#2282 As tech lead, I want a file upload integration test](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2282) +* [#1784 - Email Relay](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/1784) + +### Ready to Merge +* N/A + +### Submitted (QASP Review, OCIO Review) +* [#1109 TANF (02) Parsing and Validation](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/1109) + +### Closed (not merged) +* N/A + +## Moved to Next Sprint (Blocked, Raft Review, In Progress) +### In Progress +* [#2116 Container Registry Creation](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2116) +* [#2429 Singular ClamAV scanner](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2429) +* [#1111 TANF (04) Parsing and Validation](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/1111) + + +### Blocked +* N/A + + +### Raft Review +* [#1610 As a user, I need information about the acceptance of my data and a link for the error report](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/1610) +* [#1612 Detailed case level metadata](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/1612) +* [#1613 As a developer, I need parsed file meta data (TANF Section 1)](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/board) +* [#2626 (Spike) improve parsing logging](https://app.zenhub.com/workspaces/sprint-board-5f18ab06dfd91c000f7e682e/issues/gh/raft-tech/tanf-app/2626) + +### Demo +* Internal: + * 2369 / 1110 - TANF Sections (01 and 03) Parsing and Validation +* External: + * 2369 / 1110 - TANF Sections (01 and 03) Parsing and Validation + diff --git a/docs/Technical-Documentation/Zap-Scan-HTML-Report.md b/docs/Technical-Documentation/Zap-Scan-HTML-Report.md index b7fa723bd..6d9b20cfe 100644 --- a/docs/Technical-Documentation/Zap-Scan-HTML-Report.md +++ b/docs/Technical-Documentation/Zap-Scan-HTML-Report.md @@ -24,3 +24,39 @@ link to view the running process at CircleCI 4. Click the `owasp_report.html` link to view the report. 
![image](images/report.png) + +### Configuring Report Output + +We use separate files to configure the ZAP scanner for the frontend and backend applications: +Backend: [tdrs-backend/reports/zap.conf](../../tdrs-backend/reports/zap.conf) +Frontend: [tdrs-frontend/reports/zap.conf](../../tdrs-frontend/reports/zap.conf) + +These files have a list of error codes and what to do with each (FAIL, WARN, or IGNORE). We have some of these set +to IGNORE because they do not apply to our configuration but were returning false positive +test failures. For each of these, we should have a comment explaining why the test is being +ignored. + +Postman can be used to mimic the test parameters and verify that a finding is a false positive before ignoring it. +The [free version of Postman](https://www.postman.com/downloads/), the app or web version, can be used for this. +Examples: +![image](images/postman_example1.png) +![image](images/postman_example2.png) + +### Invoking the OWASP ZAP Scanner + +We build the ZAP scanner invocation in our [zap-scanner](../../scripts/zap-scanner.sh) script. + +As part of that, we pass some additional configuration that includes a list of URLs to exclude from the +scan. +`ZAP_CLI_OPTIONS` contains this list. +It is important not to include the frontend or backend endpoint that we want the scanner to +test. + +For example, do not include something like this in the `-config globalexcludeurl.url_list.url` configuration options: +``` + -config globalexcludeurl.url_list.url\(3\).regex='^https?://.*\.hhs.gov\/.*$' \ + -config globalexcludeurl.url_list.url\(3\).description='Site - acf.hhs.gov' \ + -config globalexcludeurl.url_list.url\(3\).enabled=true \ +``` + +If the endpoint is excluded, the scanner will not be able to find it for the tests, and the output is confusing as to what is happening. diff --git a/docs/Technical-Documentation/images/postman_example1.png b/docs/Technical-Documentation/images/postman_example1.png new file mode 100644 index 000000000..9c79e5400 Binary files /dev/null and b/docs/Technical-Documentation/images/postman_example1.png differ diff --git a/docs/Technical-Documentation/images/postman_example2.png b/docs/Technical-Documentation/images/postman_example2.png new file mode 100644 index 000000000..bfd35224a Binary files /dev/null and b/docs/Technical-Documentation/images/postman_example2.png differ diff --git a/tdrs-backend/docker-compose.yml b/tdrs-backend/docker-compose.yml index 9f41bf68e..d9d10d393 100644 --- a/tdrs-backend/docker-compose.yml +++ b/tdrs-backend/docker-compose.yml @@ -100,6 +100,7 @@ services: ./gunicorn_start.sh && celery -A tdpservice.settings worker -l info" ports: - "5555:5555" + tty: true depends_on: - clamav-rest - localstack diff --git a/tdrs-backend/gunicorn_start.sh b/tdrs-backend/gunicorn_start.sh index 823836149..684e7eb24 100755 --- a/tdrs-backend/gunicorn_start.sh +++ b/tdrs-backend/gunicorn_start.sh @@ -14,7 +14,7 @@ fi # echo "Applying database migrations" -#python manage.py migrate +python manage.py migrate #python manage.py populate_stts #python manage.py collectstatic --noinput diff --git a/tdrs-backend/reports/zap.conf b/tdrs-backend/reports/zap.conf index eaec6ae5c..46748a939 100644 --- a/tdrs-backend/reports/zap.conf +++ b/tdrs-backend/reports/zap.conf @@ -79,7 +79,11 @@ 40014 FAIL (Cross Site Scripting (Persistent) - Active/release) 40016 FAIL (Cross Site Scripting (Persistent) - Prime - Active/release) 40017 FAIL (Cross Site Scripting (Persistent) - Spider - Active/release) -40018 WARN (SQL Injection - Active/release) +##### IGNORE (SQL Injection - Active/release) as it doesn't apply to us and 
is giving +##### false positives because it takes us to a default django page notifying us +##### of the 403 forbidden, instead of just a 403 being returned. The test is +##### treating this as though the SQL injection worked, since a page is returned. +40018 IGNORE (SQL Injection - Active/release) 40019 FAIL (SQL Injection - MySQL - Active/beta) 40020 FAIL (SQL Injection - Hypersonic SQL - Active/beta) 40021 FAIL (SQL Injection - Oracle - Active/beta) @@ -93,7 +97,10 @@ 40029 FAIL (Trace.axd Information Leak - Active/beta) 40032 FAIL (.htaccess Information Leak - Active/release) 40034 FAIL (.env Information Leak - Active/beta) -40035 FAIL (Hidden File Finder - Active/beta) +##### IGNORE (Hidden File Finder - Active/beta) due to false failing similar to SQL +##### Injection false positive above. Replicating parameters of the test +##### result in +40035 IGNORE (Hidden File Finder - Active/beta) 41 FAIL (Source Code Disclosure - Git - Active/beta) 42 FAIL (Source Code Disclosure - SVN - Active/beta) 43 FAIL (Source Code Disclosure - File Inclusion - Active/beta) diff --git a/tdrs-backend/tdpservice/core/logger.py b/tdrs-backend/tdpservice/core/logger.py new file mode 100644 index 000000000..1cd93c50a --- /dev/null +++ b/tdrs-backend/tdpservice/core/logger.py @@ -0,0 +1,38 @@ +"""Contains core logging functionality for TDP.""" + +import logging + +class ColorFormatter(logging.Formatter): + """Simple formatter class to add color to log messages based on log level.""" + + BLACK = '\033[0;30m' + RED = '\033[0;31m' + GREEN = '\033[0;32m' + BROWN = '\033[0;33m' + BLUE = '\033[0;34m' + PURPLE = '\033[0;35m' + CYAN = '\033[0;36m' + GREY = '\033[0;37m' + + DARK_GREY = '\033[1;30m' + LIGHT_RED = '\033[1;31m' + LIGHT_GREEN = '\033[1;32m' + YELLOW = '\033[1;33m' + LIGHT_BLUE = '\033[1;34m' + LIGHT_PURPLE = '\033[1;35m' + LIGHT_CYAN = '\033[1;36m' + WHITE = '\033[1;37m' + + RESET = "\033[0m" + + def __init__(self, *args, **kwargs): + self._colors = {logging.DEBUG: self.CYAN, + logging.INFO: self.GREEN, + logging.WARNING: self.YELLOW, + logging.ERROR: self.LIGHT_RED, + logging.CRITICAL: self.RED} + super(ColorFormatter, self).__init__(*args, **kwargs) + + def format(self, record): + """Format the record to be colored based on the log level.""" + return self._colors.get(record.levelno, self.WHITE) + super().format(record) + self.RESET diff --git a/tdrs-backend/tdpservice/data_files/models.py b/tdrs-backend/tdpservice/data_files/models.py index 9a954122c..b4248a9cd 100644 --- a/tdrs-backend/tdpservice/data_files/models.py +++ b/tdrs-backend/tdpservice/data_files/models.py @@ -221,6 +221,15 @@ def find_latest_version(self, year, quarter, section, stt): version=version, year=year, quarter=quarter, section=section, stt=stt, ).first() + def __repr__(self): + """Return a string representation of the model.""" + return f"{{id: {self.id}, filename: {self.original_filename}, STT: {self.stt}, S3 location: " + \ + f"{self.s3_location}}}" + + def __str__(self): + """Return a string representation of the model.""" + return f"filename: {self.original_filename}" + class LegacyFileTransferManager(models.Manager): """Extends object manager functionality for LegacyFileTransfer model.""" diff --git a/tdrs-backend/tdpservice/data_files/views.py b/tdrs-backend/tdpservice/data_files/views.py index 305985512..f01bb5be6 100644 --- a/tdrs-backend/tdpservice/data_files/views.py +++ b/tdrs-backend/tdpservice/data_files/views.py @@ -66,6 +66,10 @@ def create(self, request, *args, **kwargs): data_file_id = response.data.get('id') 
data_file = DataFile.objects.get(id=data_file_id) + logger.info(f"Preparing parse task: User META -> user: {request.user}, stt: {data_file.stt}. " + + f"Datafile META -> datafile: {data_file_id}, section: {data_file.section}, " + + f"quarter {data_file.quarter}, year {data_file.year}.") + parser_task.parse.delay(data_file_id) logger.info("Submitted parse task to queue for datafile %s.", data_file_id) diff --git a/tdrs-backend/tdpservice/parsers/fields.py b/tdrs-backend/tdpservice/parsers/fields.py index 6fb742031..fa4c73101 100644 --- a/tdrs-backend/tdpservice/parsers/fields.py +++ b/tdrs-backend/tdpservice/parsers/fields.py @@ -1,5 +1,9 @@ """Datafile field representations.""" +import logging + +logger = logging.getLogger(__name__) + def value_is_empty(value, length): """Handle 'empty' values as field inputs.""" empty_values = [ @@ -36,6 +40,7 @@ def parse_value(self, line): value = line[self.startIndex:self.endIndex] if value_is_empty(value, self.endIndex-self.startIndex): + logger.debug(f"Field: '{self.name}' at position: [{self.startIndex}, {self.endIndex}) is empty.") return None match self.type: @@ -44,9 +49,13 @@ def parse_value(self, line): value = int(value) return value except ValueError: + logger.error(f"Error parsing field value: {value} to integer.") return None case 'string': return value + case _: + logger.warn(f"Unknown field type: {self.type}.") + return None class TransformField(Field): """Represents a field that requires some transformation before serializing.""" diff --git a/tdrs-backend/tdpservice/parsers/models.py b/tdrs-backend/tdpservice/parsers/models.py index da95f64e9..4a638e06a 100644 --- a/tdrs-backend/tdpservice/parsers/models.py +++ b/tdrs-backend/tdpservice/parsers/models.py @@ -57,11 +57,12 @@ def rpt_month_name(self): def __repr__(self): """Return a string representation of the model.""" - return f"ParserError {self.id} for file {self.file} and object key {self.object_id}" + return f"{{id: {self.id}, file: {self.file.id}, row: {self.row_number}, column: {self.column_number}, " + \ + f"error message: {self.error_message}}}" def __str__(self): """Return a string representation of the model.""" - return f"ParserError {self.id}" + return f"error_message: {self.error_message}" def _get_error_message(self): """Return the error message.""" diff --git a/tdrs-backend/tdpservice/parsers/parse.py b/tdrs-backend/tdpservice/parsers/parse.py index 84a64f6e1..2c2183c68 100644 --- a/tdrs-backend/tdpservice/parsers/parse.py +++ b/tdrs-backend/tdpservice/parsers/parse.py @@ -23,15 +23,18 @@ def parse_datafile(datafile): util.make_generate_parser_error(datafile, 1) ) if not header_is_valid: + logger.info(f"Preparser Error: {len(header_errors)} header errors encountered.") errors['header'] = header_errors bulk_create_errors({1: header_errors}, 1, flush=True) return errors is_encrypted = util.contains_encrypted_indicator(header_line, schema_defs.header.get_field_by_name("encryption")) + logger.debug(f"Datafile has encrypted fields: {is_encrypted}.") # ensure file section matches upload section program_type = header['program_type'] section = header['type'] + logger.debug(f"Program type: {program_type}, Section: {section}.") section_is_valid, section_error = validators.validate_header_section_matches_submission( datafile, @@ -40,6 +43,7 @@ def parse_datafile(datafile): ) if not section_is_valid: + logger.info(f"Preparser Error -> Section is not valid: {section_error.error_message}") errors['document'] = [section_error] unsaved_parser_errors = {1: [section_error]} 
bulk_create_errors(unsaved_parser_errors, 1, flush=True) @@ -55,12 +59,17 @@ def parse_datafile(datafile): def bulk_create_records(unsaved_records, line_number, header_count, batch_size=10000, flush=False): """Bulk create passed in records.""" if (line_number % batch_size == 0 and header_count > 0) or flush: + logger.debug("Bulk creating records.") try: num_created = 0 num_expected = 0 for model, records in unsaved_records.items(): num_expected += len(records) num_created += len(model.objects.bulk_create(records)) + if num_created != num_expected: + logger.error(f"Bulk create only created {num_created}/{num_expected}!") + else: + logger.info(f"Created {num_created}/{num_expected} records.") return num_created == num_expected, {} except DatabaseError as e: logger.error(f"Encountered error while creating datafile records: {e}") @@ -70,7 +79,10 @@ def bulk_create_records(unsaved_records, line_number, header_count, batch_size=1 def bulk_create_errors(unsaved_parser_errors, num_errors, batch_size=5000, flush=False): """Bulk create all ParserErrors.""" if flush or (unsaved_parser_errors and num_errors >= batch_size): - ParserError.objects.bulk_create(list(itertools.chain.from_iterable(unsaved_parser_errors.values()))) + logger.debug("Bulk creating ParserErrors.") + num_created = len(ParserError.objects.bulk_create(list(itertools.chain.from_iterable( + unsaved_parser_errors.values())))) + logger.info(f"Created {num_created}/{num_errors} ParserErrors.") return {}, 0 return unsaved_parser_errors, num_errors @@ -94,12 +106,16 @@ def evaluate_trailer(datafile, trailer_count, multiple_trailer_errors, is_last_l def rollback_records(unsaved_records, datafile): """Delete created records in the event of a failure.""" + logger.info("Rolling back created records.") for model in unsaved_records: - model.objects.filter(datafile=datafile).delete() + num_deleted, models = model.objects.filter(datafile=datafile).delete() + logger.debug(f"Deleted {num_deleted} records of type: {model}.") def rollback_parser_errors(datafile): """Delete created errors in the event of a failure.""" - ParserError.objects.filter(file=datafile).delete() + logger.info("Rolling back created parser errors.") + num_deleted, models = ParserError.objects.filter(file=datafile).delete() + logger.debug(f"Deleted {num_deleted} {ParserError}.") def parse_datafile_lines(datafile, program_type, section, is_encrypted): """Parse lines with appropriate schema and return errors.""" @@ -135,6 +151,8 @@ def parse_datafile_lines(datafile, program_type, section, is_encrypted): is_last, line, line_number) if trailer_errors is not None: + logger.debug(f"{len(trailer_errors)} trailer error(s) detected for file " + + f"'{datafile.original_filename}' on line {line_number}.") errors['trailer'] = trailer_errors unsaved_parser_errors.update({"trailer": trailer_errors}) num_errors += len(trailer_errors) @@ -142,6 +160,7 @@ def parse_datafile_lines(datafile, program_type, section, is_encrypted): generate_error = util.make_generate_parser_error(datafile, line_number) if header_count > 1: + logger.info(f"Preparser Error -> Multiple headers found for file: {datafile.id} on line: {line_number}.") errors.update({'document': ['Multiple headers found.']}) err_obj = generate_error( schema=None, @@ -173,6 +192,9 @@ def parse_datafile_lines(datafile, program_type, section, is_encrypted): record_number += 1 record, record_is_valid, record_errors = r if not record_is_valid: + logger.debug(f"Record #{i} from line {line_number} is invalid.") + line_errors = 
errors.get(f"{line_number}_{i}", {}) + line_errors.update({record_number: record_errors}) errors.update({f"{line_number}_{i}": record_errors}) unsaved_parser_errors.update({f"{line_number}_{i}": record_errors}) num_errors += len(record_errors) @@ -185,6 +207,7 @@ def parse_datafile_lines(datafile, program_type, section, is_encrypted): unsaved_parser_errors, num_errors = bulk_create_errors(unsaved_parser_errors, num_errors) if header_count == 0: + logger.info(f"Preparser Error -> No headers found for file: {datafile.id}.") errors.update({'document': ['No headers found.']}) err_obj = generate_error( schema=None, @@ -203,6 +226,7 @@ def parse_datafile_lines(datafile, program_type, section, is_encrypted): # successfully create the records. all_created, unsaved_records = bulk_create_records(unsaved_records, line_number, header_count, flush=True) if not all_created: + logger.error(f"Not all parsed records created for file: {datafile.id}!") rollback_records(unsaved_records, datafile) bulk_create_errors(unsaved_parser_errors, num_errors, flush=True) return errors @@ -218,6 +242,7 @@ def manager_parse_line(line, schema_manager, generate_error): records = schema_manager.parse_and_validate(line, generate_error) return records + logger.debug("Record Type is missing from record.") return [(None, False, [ generate_error( schema=None, diff --git a/tdrs-backend/tdpservice/parsers/row_schema.py b/tdrs-backend/tdpservice/parsers/row_schema.py index a15b1a6bc..a4faecdf3 100644 --- a/tdrs-backend/tdpservice/parsers/row_schema.py +++ b/tdrs-backend/tdpservice/parsers/row_schema.py @@ -1,6 +1,9 @@ """Row schema for datafile.""" from .models import ParserErrorCategoryChoices from .fields import Field, value_is_empty +import logging + +logger = logging.getLogger(__name__) class RowSchema: @@ -45,6 +48,7 @@ def parse_and_validate(self, line, generate_error): if not preparsing_is_valid: if self.quiet_preparser_errors: return None, True, [] + logger.info(f"{len(preparsing_errors)} preparser error(s) encountered.") return None, False, preparsing_errors # parse line to model diff --git a/tdrs-backend/tdpservice/scheduling/parser_task.py b/tdrs-backend/tdpservice/scheduling/parser_task.py index 649ae0723..4ffd91277 100644 --- a/tdrs-backend/tdpservice/scheduling/parser_task.py +++ b/tdrs-backend/tdpservice/scheduling/parser_task.py @@ -16,6 +16,6 @@ def parse(data_file_id): # for undetermined amount of time. 
data_file = DataFile.objects.get(id=data_file_id) - logger.info(f"DataFile parsing started for file {data_file.filename}") + logger.info(f"DataFile parsing started for file -> {repr(data_file)}") errors = parse_datafile(data_file) - logger.info(f"DataFile parsing finished with {len(errors)} errors: {errors}") + logger.info(f"DataFile parsing finished with {len(errors)} errors, for file -> {repr(data_file)}.") diff --git a/tdrs-backend/tdpservice/search_indexes/migrations/0015_auto_20230810_1500.py b/tdrs-backend/tdpservice/search_indexes/migrations/0015_auto_20230724_1830.py similarity index 67% rename from tdrs-backend/tdpservice/search_indexes/migrations/0015_auto_20230810_1500.py rename to tdrs-backend/tdpservice/search_indexes/migrations/0015_auto_20230724_1830.py index aed6950c0..3a9db8704 100644 --- a/tdrs-backend/tdpservice/search_indexes/migrations/0015_auto_20230810_1500.py +++ b/tdrs-backend/tdpservice/search_indexes/migrations/0015_auto_20230724_1830.py @@ -1,4 +1,4 @@ -# Generated by Django 3.2.15 on 2023-08-10 15:00 +# Generated by Django 3.2.15 on 2023-07-24 18:30 from django.db import migrations, models @@ -190,7 +190,7 @@ class Migration(migrations.Migration): migrations.AddField( model_name='tanf_t4', name='CLOSURE_REASON', - field=models.CharField(max_length=2, null=True), + field=models.IntegerField(null=True), ), migrations.AddField( model_name='tanf_t4', @@ -235,7 +235,7 @@ class Migration(migrations.Migration): migrations.AddField( model_name='tanf_t4', name='STRATUM', - field=models.CharField(max_length=2, null=True), + field=models.IntegerField(null=True), ), migrations.AddField( model_name='tanf_t4', @@ -245,12 +245,12 @@ class Migration(migrations.Migration): migrations.AddField( model_name='tanf_t5', name='AMOUNT_EARNED_INCOME', - field=models.CharField(max_length=4, null=True), + field=models.IntegerField(null=True), ), migrations.AddField( model_name='tanf_t5', name='AMOUNT_UNEARNED_INCOME', - field=models.CharField(max_length=4, null=True), + field=models.IntegerField(null=True), ), migrations.AddField( model_name='tanf_t5', @@ -265,12 +265,12 @@ class Migration(migrations.Migration): migrations.AddField( model_name='tanf_t5', name='COUNTABLE_MONTHS_STATE_TRIBE', - field=models.CharField(max_length=2, null=True), + field=models.FloatField(null=True), ), migrations.AddField( model_name='tanf_t5', name='COUNTABLE_MONTH_FED_TIME', - field=models.CharField(max_length=3, null=True), + field=models.FloatField(null=True), ), migrations.AddField( model_name='tanf_t5', @@ -280,12 +280,12 @@ class Migration(migrations.Migration): migrations.AddField( model_name='tanf_t5', name='EDUCATION_LEVEL', - field=models.CharField(max_length=2, null=True), + field=models.IntegerField(null=True), ), migrations.AddField( model_name='tanf_t5', name='EMPLOYMENT_STATUS', - field=models.IntegerField(null=True), + field=models.FloatField(null=True), ), migrations.AddField( model_name='tanf_t5', @@ -300,7 +300,7 @@ class Migration(migrations.Migration): migrations.AddField( model_name='tanf_t5', name='MARITAL_STATUS', - field=models.IntegerField(null=True), + field=models.FloatField(null=True), ), migrations.AddField( model_name='tanf_t5', @@ -345,12 +345,12 @@ class Migration(migrations.Migration): migrations.AddField( model_name='tanf_t5', name='REC_AID_AGED_BLIND', - field=models.IntegerField(null=True), + field=models.FloatField(null=True), ), migrations.AddField( model_name='tanf_t5', name='REC_AID_TOTALLY_DISABLED', - field=models.IntegerField(null=True), + 
field=models.FloatField(null=True), ), migrations.AddField( model_name='tanf_t5', @@ -360,7 +360,7 @@ class Migration(migrations.Migration): migrations.AddField( model_name='tanf_t5', name='REC_OASDI_INSURANCE', - field=models.IntegerField(null=True), + field=models.FloatField(null=True), ), migrations.AddField( model_name='tanf_t5', @@ -370,7 +370,7 @@ class Migration(migrations.Migration): migrations.AddField( model_name='tanf_t5', name='RELATIONSHIP_HOH', - field=models.CharField(max_length=2, null=True), + field=models.IntegerField(null=True), ), migrations.AddField( model_name='tanf_t5', @@ -387,154 +387,4 @@ class Migration(migrations.Migration): name='SSN', field=models.CharField(max_length=9, null=True), ), - migrations.AlterField( - model_name='tanf_t2', - name='AID_AGED_BLIND', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='CITIZENSHIP_STATUS', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='COOPERATION_CHILD_SUPPORT', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='CURRENT_MONTH_STATE_EXEMPT', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='DISABLED_TITLE_XIVAPDT', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='EMPLOYMENT_STATUS', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='FED_DISABILITY_STATUS', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='FED_OASDI_PROGRAM', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='MARITAL_STATUS', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='NEEDS_PREGNANT_WOMAN', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='PARENT_WITH_MINOR_CHILD', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='RACE_AMER_INDIAN', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='RACE_ASIAN', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='RACE_BLACK', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='RACE_HAWAIIAN', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='RACE_HISPANIC', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='RACE_WHITE', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='RECEIVE_SSI', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t2', - name='RELATIONSHIP_HOH', - field=models.CharField(max_length=2, null=True), - ), - migrations.AlterField( - model_name='tanf_t3', - name='CITIZENSHIP_STATUS', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t3', - name='PARENT_MINOR_CHILD', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t3', - name='RACE_AMER_INDIAN', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t3', - name='RACE_ASIAN', - field=models.IntegerField(null=True), - ), - migrations.AlterField( 
- model_name='tanf_t3', - name='RACE_BLACK', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t3', - name='RACE_HAWAIIAN', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t3', - name='RACE_HISPANIC', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t3', - name='RACE_WHITE', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t3', - name='RECEIVE_NONSSA_BENEFITS', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t3', - name='RECEIVE_SSI', - field=models.IntegerField(null=True), - ), - migrations.AlterField( - model_name='tanf_t3', - name='RELATIONSHIP_HOH', - field=models.CharField(max_length=2, null=True), - ), ] diff --git a/tdrs-backend/tdpservice/search_indexes/migrations/0016_auto_20230803_1721.py b/tdrs-backend/tdpservice/search_indexes/migrations/0016_auto_20230803_1721.py new file mode 100644 index 000000000..33092e24c --- /dev/null +++ b/tdrs-backend/tdpservice/search_indexes/migrations/0016_auto_20230803_1721.py @@ -0,0 +1,83 @@ +# Generated by Django 3.2.15 on 2023-08-03 17:21 + +from django.db import migrations, models + + +class Migration(migrations.Migration): + + dependencies = [ + ('search_indexes', '0015_auto_20230724_1830'), + ] + + operations = [ + migrations.AlterField( + model_name='tanf_t4', + name='CLOSURE_REASON', + field=models.CharField(max_length=2, null=True), + ), + migrations.AlterField( + model_name='tanf_t4', + name='COUNTY_FIPS_CODE', + field=models.CharField(max_length=3, null=True), + ), + migrations.AlterField( + model_name='tanf_t4', + name='STRATUM', + field=models.CharField(max_length=2, null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='CITIZENSHIP_STATUS', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='EMPLOYMENT_STATUS', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='MARITAL_STATUS', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='NEEDS_OF_PREGNANT_WOMAN', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='PARENT_MINOR_CHILD', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='REC_AID_AGED_BLIND', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='REC_AID_TOTALLY_DISABLED', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='REC_FEDERAL_DISABILITY', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='REC_OASDI_INSURANCE', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='REC_SSI', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='RELATIONSHIP_HOH', + field=models.CharField(max_length=2, null=True), + ), + ] diff --git a/tdrs-backend/tdpservice/search_indexes/migrations/0016_auto_20230817_1635.py b/tdrs-backend/tdpservice/search_indexes/migrations/0016_auto_20230817_1635.py deleted file mode 100644 index a3eef285c..000000000 --- a/tdrs-backend/tdpservice/search_indexes/migrations/0016_auto_20230817_1635.py +++ /dev/null @@ -1,23 +0,0 @@ -# Generated by Django 3.2.15 on 2023-08-17 16:35 - -from django.db import migrations, 
models - - -class Migration(migrations.Migration): - - dependencies = [ - ('search_indexes', '0015_auto_20230810_1500'), - ] - - operations = [ - migrations.AlterField( - model_name='ssp_m1', - name='STRATUM', - field=models.CharField(max_length=2, null=True), - ), - migrations.AlterField( - model_name='tanf_t1', - name='STRATUM', - field=models.CharField(max_length=2, null=True), - ), - ] diff --git a/tdrs-backend/tdpservice/search_indexes/migrations/0017_auto_20230804_1935.py b/tdrs-backend/tdpservice/search_indexes/migrations/0017_auto_20230804_1935.py deleted file mode 100644 index 2db5f8890..000000000 --- a/tdrs-backend/tdpservice/search_indexes/migrations/0017_auto_20230804_1935.py +++ /dev/null @@ -1,179 +0,0 @@ -# Generated by Django 3.2.15 on 2023-08-04 19:35 - -from django.db import migrations, models - - -class Migration(migrations.Migration): - - dependencies = [ - ('search_indexes', '0016_auto_20230817_1635'), - ] - - operations = [ - migrations.RemoveField( - model_name='tanf_t6', - name='adult_recipients', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='applications', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='approved', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='assistance', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='births', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='calendar_quarter', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='child_recipients', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='closed_cases', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='denied', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='families', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='fips_code', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='noncustodials', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='num_1_parents', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='num_2_parents', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='num_no_parents', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='outwedlock_births', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='recipients', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='record', - ), - migrations.RemoveField( - model_name='tanf_t6', - name='rpt_month_year', - ), - migrations.AddField( - model_name='tanf_t6', - name='CALENDAR_QUARTER', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='CALENDAR_YEAR', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_1_PARENTS', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_2_PARENTS', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_ADULT_RECIPIENTS', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_APPLICATIONS', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_APPROVED', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='ASSISTANCE', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_BIRTHS', - 
field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_CHILD_RECIPIENTS', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_CLOSED_CASES', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_DENIED', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_FAMILIES', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_NONCUSTODIALS', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_NO_PARENTS', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_OUTWEDLOCK_BIRTHS', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='NUM_RECIPIENTS', - field=models.IntegerField(blank=True, null=True), - ), - migrations.AddField( - model_name='tanf_t6', - name='RecordType', - field=models.CharField(max_length=156, null=True), - ), - ] diff --git a/tdrs-backend/tdpservice/search_indexes/migrations/0017_auto_20230914_1720.py b/tdrs-backend/tdpservice/search_indexes/migrations/0017_auto_20230914_1720.py new file mode 100644 index 000000000..c4fb4576c --- /dev/null +++ b/tdrs-backend/tdpservice/search_indexes/migrations/0017_auto_20230914_1720.py @@ -0,0 +1,364 @@ +# Generated by Django 3.2.15 on 2023-09-14 17:20 + +from django.db import migrations, models + + +class Migration(migrations.Migration): + + dependencies = [ + ('search_indexes', '0016_auto_20230803_1721'), + ] + + operations = [ + migrations.RemoveField( + model_name='tanf_t6', + name='adult_recipients', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='applications', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='approved', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='assistance', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='births', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='calendar_quarter', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='child_recipients', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='closed_cases', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='denied', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='families', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='fips_code', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='noncustodials', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='num_1_parents', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='num_2_parents', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='num_no_parents', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='outwedlock_births', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='recipients', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='record', + ), + migrations.RemoveField( + model_name='tanf_t6', + name='rpt_month_year', + ), + migrations.AddField( + model_name='tanf_t6', + name='ASSISTANCE', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='CALENDAR_QUARTER', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + 
model_name='tanf_t6', + name='NUM_1_PARENTS', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='NUM_2_PARENTS', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='NUM_ADULT_RECIPIENTS', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='NUM_APPLICATIONS', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='NUM_APPROVED', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='NUM_BIRTHS', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='NUM_CHILD_RECIPIENTS', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='NUM_CLOSED_CASES', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='NUM_DENIED', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='NUM_FAMILIES', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='NUM_NONCUSTODIALS', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='NUM_NO_PARENTS', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='NUM_OUTWEDLOCK_BIRTHS', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='NUM_RECIPIENTS', + field=models.IntegerField(blank=True, null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='RPT_MONTH_YEAR', + field=models.IntegerField(null=True), + ), + migrations.AddField( + model_name='tanf_t6', + name='RecordType', + field=models.CharField(max_length=156, null=True), + ), + migrations.AlterField( + model_name='ssp_m1', + name='STRATUM', + field=models.CharField(max_length=2, null=True), + ), + migrations.AlterField( + model_name='tanf_t1', + name='STRATUM', + field=models.CharField(max_length=2, null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='AID_AGED_BLIND', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='CITIZENSHIP_STATUS', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='COOPERATION_CHILD_SUPPORT', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='CURRENT_MONTH_STATE_EXEMPT', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='DISABLED_TITLE_XIVAPDT', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='EMPLOYMENT_STATUS', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='FED_DISABILITY_STATUS', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='FED_OASDI_PROGRAM', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='MARITAL_STATUS', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='NEEDS_PREGNANT_WOMAN', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + 
model_name='tanf_t2', + name='PARENT_WITH_MINOR_CHILD', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='RACE_AMER_INDIAN', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='RACE_ASIAN', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='RACE_BLACK', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='RACE_HAWAIIAN', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='RACE_HISPANIC', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='RACE_WHITE', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='RECEIVE_SSI', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t2', + name='RELATIONSHIP_HOH', + field=models.CharField(max_length=2, null=True), + ), + migrations.AlterField( + model_name='tanf_t3', + name='CITIZENSHIP_STATUS', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t3', + name='PARENT_MINOR_CHILD', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t3', + name='RACE_AMER_INDIAN', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t3', + name='RACE_ASIAN', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t3', + name='RACE_BLACK', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t3', + name='RACE_HAWAIIAN', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t3', + name='RACE_HISPANIC', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t3', + name='RACE_WHITE', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t3', + name='RECEIVE_NONSSA_BENEFITS', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t3', + name='RECEIVE_SSI', + field=models.IntegerField(null=True), + ), + migrations.AlterField( + model_name='tanf_t3', + name='RELATIONSHIP_HOH', + field=models.CharField(max_length=2, null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='AMOUNT_EARNED_INCOME', + field=models.CharField(max_length=4, null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='AMOUNT_UNEARNED_INCOME', + field=models.CharField(max_length=4, null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='COUNTABLE_MONTHS_STATE_TRIBE', + field=models.CharField(max_length=2, null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='COUNTABLE_MONTH_FED_TIME', + field=models.CharField(max_length=3, null=True), + ), + migrations.AlterField( + model_name='tanf_t5', + name='EDUCATION_LEVEL', + field=models.CharField(max_length=2, null=True), + ), + ] diff --git a/tdrs-backend/tdpservice/search_indexes/migrations/0018_auto_20230816_1917.py b/tdrs-backend/tdpservice/search_indexes/migrations/0018_auto_20230816_1917.py deleted file mode 100644 index 37ebf621c..000000000 --- a/tdrs-backend/tdpservice/search_indexes/migrations/0018_auto_20230816_1917.py +++ /dev/null @@ -1,22 +0,0 @@ -# Generated by Django 3.2.15 on 2023-08-16 19:17 - -from django.db import migrations, models - - -class Migration(migrations.Migration): - - 
dependencies = [ - ('search_indexes', '0017_auto_20230804_1935'), - ] - - operations = [ - migrations.RemoveField( - model_name='tanf_t6', - name='CALENDAR_YEAR', - ), - migrations.AddField( - model_name='tanf_t6', - name='RPT_MONTH_YEAR', - field=models.IntegerField(null=True), - ), - ] diff --git a/tdrs-backend/tdpservice/settings/common.py b/tdrs-backend/tdpservice/settings/common.py index cae90db27..e69a0b8c8 100644 --- a/tdrs-backend/tdpservice/settings/common.py +++ b/tdrs-backend/tdpservice/settings/common.py @@ -205,10 +205,12 @@ class Common(Configuration): }, "verbose": { "format": ( - "[%(asctime)s %(levelname)s %(filename)s::%(funcName)s:L%(lineno)d : %(message)s" + "%(asctime)s %(levelname)s %(filename)s::%(funcName)s:L%(lineno)d : %(message)s" ) }, "simple": {"format": "%(levelname)s %(message)s"}, + "color": {"()": "tdpservice.core.logger.ColorFormatter", + "format": "%(asctime)s %(levelname)s %(filename)s::%(funcName)s:L%(lineno)d : %(message)s"} }, "filters": {"require_debug_true": {"()": "django.utils.log.RequireDebugTrue"}}, "handlers": { @@ -224,7 +226,7 @@ class Common(Configuration): }, "application": { "class": "logging.StreamHandler", - "formatter": "verbose", + "formatter": "color", }, }, "loggers": { @@ -233,6 +235,11 @@ class Common(Configuration): "propagate": True, "level": LOGGING_LEVEL }, + "tdpservice.parsers": { + "handlers": ["application"], + "propagate": False, + "level": LOGGING_LEVEL + }, "django": {"handlers": ["console"], "propagate": True}, "django.server": { "handlers": ["django.server"], diff --git a/tdrs-frontend/reports/zap.conf b/tdrs-frontend/reports/zap.conf index 854c0ea39..763647dc2 100644 --- a/tdrs-frontend/reports/zap.conf +++ b/tdrs-frontend/reports/zap.conf @@ -79,6 +79,12 @@ 40014 FAIL (Cross Site Scripting (Persistent) - Active/release) 40016 FAIL (Cross Site Scripting (Persistent) - Prime - Active/release) 40017 FAIL (Cross Site Scripting (Persistent) - Spider - Active/release) +##### IGNORE (SQL Injection - Active/release, MySQL, Hypersonig SQL, Oracle) +##### as they don't apply to us and is giving false positives because +##### it takes us to a default django page notifying us of the 403 +##### forbidden, instead of just a 403 being returned. The test is +##### treating this as though the SQL injection worked, since a page +##### is returned. 40018 IGNORE (SQL Injection - Active/release) 40019 IGNORE (SQL Injection - MySQL - Active/beta) 40020 IGNORE (SQL Injection - Hypersonic SQL - Active/beta) @@ -93,7 +99,10 @@ 40029 FAIL (Trace.axd Information Leak - Active/beta) 40032 FAIL (.htaccess Information Leak - Active/release) 40034 FAIL (.env Information Leak - Active/beta) -40035 FAIL (Hidden File Finder - Active/beta) +##### IGNORE (Hidden File Finder - Active/beta) due to false failing similar to SQL +##### Injection false positive above. Replicating parameters of the test +##### result in +40035 IGNORE (Hidden File Finder - Active/beta) 41 FAIL (Source Code Disclosure - Git - Active/beta) 42 FAIL (Source Code Disclosure - SVN - Active/beta) 43 FAIL (Source Code Disclosure - File Inclusion - Active/beta)
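
The zap.conf changes above ignore rules 40018 (SQL Injection) and 40035 (Hidden File Finder) because the backend answers scanner probes with Django's rendered 403 page rather than a bare 403, which ZAP reads as the probe having worked. The Zap-Scan-HTML-Report doc recommends replicating the test parameters (for example in Postman) before ignoring a rule; the sketch below shows the same kind of manual check using Python's `requests` library. It is only an illustration: the target URL and probe parameter are placeholder assumptions, not actual project endpoints or ZAP payloads.

```python
# Illustrative sketch only: replay a scanner-style probe outside ZAP so the
# response can be inspected by hand, similar to the Postman verification
# described in Zap-Scan-HTML-Report.md. The URL and parameter below are
# placeholders, not real project endpoints or ZAP test payloads.
import requests

TARGET = "http://localhost:8080/v1/some-endpoint/"  # assumed local backend URL
PROBE = {"query": "' OR '1'='1"}                     # classic SQLi-style probe value

response = requests.get(TARGET, params=PROBE, timeout=10)

# A bare 403 (small body, no HTML error page) suggests the request was simply
# rejected; a rendered Django "403 Forbidden" page is what trips the ZAP rule,
# since a full page comes back for the injected request.
print(response.status_code)
print(response.headers.get("Content-Type"))
print(len(response.text), "bytes in body")
```

If the response is a rendered HTML error page rather than a minimal 403, that matches the false-positive behavior the IGNORE comments describe, and the rule can be ignored with that justification recorded in the conf file.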