
Console recent changes to be merged into master #1274

Open · wants to merge 11 commits into base: master

Conversation


@nitish-egov (Collaborator) commented Dec 12, 2024

Summary by CodeRabbit

  • New Features

    • Introduced a new debugging configuration for enhanced error tracking and debugging capabilities.
    • Added a cache burst functionality for localization after chunk uploads.
    • Enhanced validation for microplan campaigns with additional checks for boundary presence.
    • Improved handling of campaign details and boundary management.
    • Added support for dynamic chunk size configuration during uploads.
    • Implemented Gzip request handling for improved payload processing (see the sketch after this summary).
  • Bug Fixes

    • Improved error handling in various functions to provide clearer messages and prevent runtime errors.
  • Documentation

    • Added comments to the TypeScript configuration file for better clarity on compiler options.
  • Refactor

    • Updated several utility functions for better readability and maintainability, particularly in campaign and boundary management.
    • Enhanced middleware to support Gzip content type processing.
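
The Gzip items above are only summarized here; as a rough illustration (not the actual gzipHandler.ts or requestMiddleware.ts code), an Express middleware that accepts both application/json and application/gzip request bodies could look like the following sketch, which assumes Node's built-in zlib:

import express, { NextFunction, Request, Response } from "express";
import { gunzipSync } from "zlib";

// Hedged sketch only: branch on Content-Type and inflate gzip-compressed bodies
// before JSON parsing. Names and options here are illustrative, not the PR's API.
function gzipAwareBodyParser(req: Request, res: Response, next: NextFunction) {
  if (req.headers["content-type"] !== "application/gzip") {
    return next(); // let the regular JSON body parser handle it
  }
  const chunks: Buffer[] = [];
  req.on("data", (chunk: Buffer) => chunks.push(chunk));
  req.on("end", () => {
    try {
      const decompressed = gunzipSync(Buffer.concat(chunks)).toString("utf-8");
      req.body = JSON.parse(decompressed); // hand the parsed payload downstream
      next();
    } catch {
      res.status(400).json({ error: "Invalid gzip payload" });
    }
  });
}

const app = express();
app.use(gzipAwareBodyParser);
app.use(express.json({ limit: "4mb" })); // payload limit is configurable per this PR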

ashish-egov and others added 8 commits December 10, 2024 18:06
* HLM service request, updated DataTypeEnum (#872)

* Service request changelog 1.5 (#875)

* Added changelog and upgraded the versions for household, individual and service request

* Update core-services/service-request/CHANGELOG.md

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update health-services/individual/CHANGELOG.md

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* HLM fixed merge issues

* HLM fixed merge issues

* HCMPRE-413: updated the changelog as per code review comments

* Update health-services/project/CHANGELOG.md

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

---------

Co-authored-by: kavi_elrey@1993 <[email protected]>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* HCMPRE-424: fixed hrms call from pgr-service

* HCMPRE-424: updated as per code review comments

* Create branch-name-validator (#960)

* Create branch-name-validator

* Update branch-name-validator

* Update .github/workflows/branch-name-validator

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update .github/workflows/branch-name-validator

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

---------

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update branch-name-validator

* Rename branch-name-validator to branch-name-validator.yml

* Added census-service in build-config (#990)

* [HCMPRE-658] Refractor resource-estimation-service to resource-generator (#910)

Co-authored-by: Priyanka-eGov <[email protected]>

* Update package.json

* Update tsconfig.json

* Added configs and env dependencies

* dockerfile update

* Update tsconfig.json

* Update tsconfig.json

* refactored

* HCM Admin Console v0.3 Release code changes (#1082)

* kafka fix for large messages

* Update genericUtils.ts

* Update campaignValidators.ts

* Fixed the mdms search path keys

* fix of migration script

* fix on repeated key

* Update campaignApis.ts

* Update campaignApis.ts

* Update campaignUtils.ts

* Update campaignUtils.ts

* Update campaignUtils.ts

* Fix project target mapping

* refactored migration files for project-factory (#867)

* refactored migration files for project-factory

* updated logic for unique username generation

* updated format and id name  for user name

* removed hash logic for username generation

* added indexing on columns

* updated idgen seq format for user name in index.ts

* Update health-services/project-factory/src/server/api/campaignApis.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* updated logic for regenerate if campaign type differs (#876)

* id generation throw error update

* Enhance generate template for user and facility in update  ongoing campaign flow (#885)

* commit for update-generate-template

* updated campaign  flow generate template enhancement

* just if else changes

* some reformatting

* update index.ts

* added additional validation for parent campaign

* updated logic for validating parent campaign

* refactored as per change requests

* update index.ts

* updated logic for same campaign number when parent is present

* updated the campaign name logic along with handling isfailed status too (#888)

* updating campaign name same as parent name and number too

* updated target template for updating ongoing campaign (#893)

* Microplan bulk user creation (#890)

* Feat : initialised bulk user creation for microplan

* Enhanced user bulk upload for microplan

* Fixed configs

* Merge fix with console

* Feat : added columns in user sheet

* Added userroles sheet for bulk user template in microplan

* Added source microplan while resource creation

* changed logic for isSourceMicroplan

* Update campaignApis.ts

* Update campaignValidators.ts

* changes for campaign update flow

* Update campaignUtils.ts

* Integrated required error messages

* added numeric check in  microplan phone number

* Implemented no data validation

* added logic for creating projects , project facility and project staff on newly added boundaries (#917)

* updated target template for updating ongoing campaign

* update flow campaign mapping

* updated flow campaign mapping

* added logic for project, project facility and project staff  creation on newly added boundaries

* removed one useless func

* removed await from a func

* removed console.logs

* added some minor enhancements

* added one edge case scenario

* changed request limit to 1 mb

* Feat : added locksheet filter for user microplan creation

* updated logic for regenerate if campaign type differs (#876)

* Enhancement for microplan user creation (#940)

* some modifications for edge cases (#930)

* added commit for testing  update campaign flow

* some changes related to type boundary in data create api

* /* MODIFIED FOR LTS UPGRADE */

* Microplan user enhancement

* Some changes regarding microplan user and boundary

* added some null checks

* /* Temporary fix for project creation of LLIN since the structure of delivery rules is getting changed */

* Revert "/* MODIFIED FOR LTS UPGRADE */"

This reverts commit 52ed772.

* added code to add lat long in the project-factory apis

* Changed code based on comments

* removed default campaignid

* added code to add lat long in the project-factory apis (#951)

* added code to add lat long in the project-factory apis

* Changed code based on comments

* removed default campaignid

* Fixed code to manage create

* fixed the build

* added for field protection on sheet data

* Facility microplan validation (#975)

* Microplan facility validation

* Enhancement in microplan validations

* Microplan sheet lock

* Enhanced for multiple sheetErrors in additionalDetails

* Update campaignApis.ts

* fixes for filestore and unfreezing boundary code mandatory columns (#984)

* Update CODEOWNERS

* Update campaignValidators.ts (#987)

* some  correction of error after changes from microplan code merge (#988)

* some  correction of error after changes from microplan code merge

* added question mark

* added localization fix (#993)

Co-authored-by: ansh-egov <[email protected]>

* Update campaignApis.ts (#994)

* Added some fixes for the project transformation

* Update projectTypeUtils.ts

* Update campaignUtils.ts

* Removed date Update projectTypeUtils.ts #1006

* HCEMPRE-809-Boundary-geometry-codes (#1011)

* added localization fix

* added logic for boundaryGeometryManagement

* fixed some things

* fixed campaign search

* update  project facility and staff mappings of exisitng facilities and users  (#998)

* some  correction of error after changes from microplan code merge

* added logic for updating mapping of existing facilities and users

* resolved comments by jagan on the pr for delinking and linking project resources

* fetchProjectsWithBoundaryCodeAndName function update

* Update campaignValidators.ts

---------

Co-authored-by: Jagankumar <[email protected]>

* added code to add lat long in the project-factory apis (#1019)

* Project staff mapping correction from uuid to userserviceuuid (#1022)

* some  correction of error after changes from microplan code merge

* corrected for mapping of project staff

* added changes for project-resource mapping (#1028)

* added changes for project-resource mapping

* changed the variable name to boundaryProjectMappingForProjectResourceCreation from newBoundaryProjectMapping

* renamed the entity

---------

Co-authored-by: Jagankumar <[email protected]>

* Added logic to retry in project campaign create (#1031)

* not needed to update every resource in update flow (#1036)

* not needed to update every resource in update flow

* added changes so that if boundaries are present in update flow all resources are mandatory

* Some checks enhancement (#1042)

* Update genericApis.ts (#1043)

* Update campaignValidators.ts (#1046)

* consolidate resources array in update campaign flow (#1051)

* consolidate resources array in update campaign flow

* spelling correct

* Search criteria  object corrected (#1052)

* consolidate resources array in update campaign flow

* data search criteria  id has to be an array of strings

* spelling

* some more corrections regarding search criteria body

* Boundaries consolidate after creating child campaign (#1056)

* consolidate resources array in update campaign flow

* boundaries consolidate after creating  child campaign

* Boundaries correction (#1058)

* consolidate resources array in update campaign flow

* boundaries consolidate after creating  child campaign

* changes in extracting boundaries from campaign object

* Missing resources in child campaign to be added from parent campaign, logic refactored (#1059)

* consolidate resources array in update campaign flow

* boundaries consolidate after creating  child campaign

* refactored logic for adding missing resources from parent campaign

* Correction datatocreate column from status to userservice uuids (#1061)

* consolidate resources array in update campaign flow

* boundaries consolidate after creating  child campaign

* changed data to create column from user sheet

* Hide Boundary and Target Old Columns (#1062)

* consolidate resources array in update campaign flow

* boundaries consolidate after creating  child campaign

* hide boundary code  old and target old

* Corrected target update flow (#1065)

* consolidate resources array in update campaign flow

* boundaries consolidate after creating  child campaign

* corrected target mapping in update flow

* Total count of Campaigns if only is active true (#1066)

* consolidate resources array in update campaign flow

* boundaries consolidate after creating  child campaign

* count will be only of active campaigns

* HCMPRE-1212:: migrated to point only to MDMS v2 api

* Update index.ts

* User/facility inactive (#1070)

* consolidate resources array in update campaign flow

* boundaries consolidate after creating  child campaign

* added logic for making existing user facility inactive

---------

Co-authored-by: Jagankumar <[email protected]>

* Refactor facility mappings  (#1072)

* consolidate resources array in update campaign flow

* boundaries consolidate after creating  child campaign

* refactored facility mappings

* Created entity for boundary

* updated the boundary relationship function

* Update index.ts

* fixed some localization issue (#1075)

* fixed some localization issue

* fixed

* Target update while campaign update flow (#1078)

* consolidate resources array in update campaign flow

* boundaries consolidate after creating  child campaign

* logic for updating targets

* some refactor for adding logs and index.ts

* updated the boundary localisation name

---------

Co-authored-by: Jagankumar <[email protected]>

* fixed some localization issue (#1079)

* fixed some localization issue

* fixed

* fixed an issue

* reverted failed campaign is active true from false (#1080)

* reverted failed campaign is active true from false

* took constants from index

* refactor

* Merge branch 'project-factory-kafka-fix' into console

* Changed hierarchyFetch to v2 (#1077)

* Changed hierarchyFetch to v2

* Changed messages

* Merge branch 'project-factory-kafka-fix' into hierarchyFetchV2

---------

Co-authored-by: Jagankumar <[email protected]>

* call generate when create completes for type boundary management

* auto generate resource if there is no previous generated history

* Fixed crashloop issue (#1084)

* Fixed crashloop issue

* Update dataManageService.ts

---------

Co-authored-by: Jagankumar <[email protected]>

* changed the master from hierarchyConfig to HierarchySchema (#1086)

* getting boundaries split on logic change (#1088)

* fixed some localization issue (#1090)

* fixed some localization issue

* fixed

* fixed an issue

* integrated microplan with console

* fixed index

* fixed crashloop (#1091)

* added validation for boundary bulk upload (#1092)

Co-authored-by: Jagankumar <[email protected]>

* refactored different tab separation (#1093)

* added timeout (#1095)

* Microplan integration :: set start date to tomorrow (#1096)

* set start date to tomorrow

* updated end date

* Enhance PlanFacility object (#1099)

* validation for update template in create flow (#1100)

* removed await (#1103)

* some correction (#1104)

* logic for updating targets only when present in resources array in update flow (#1105)

* updated (#1106)

* updated

* added fix for the redis error in logs

---------

Co-authored-by: Jagankumar <[email protected]>

* added error responder (#1107)

* refactored download api (#1108)

* Cache issue fix(#1109)

* refactored download api

* refactor

---------

Co-authored-by: Jagankumar <[email protected]>

* made disable of cache always during boundary generate (#1110)

Co-authored-by: Jagankumar <[email protected]>

* Redis cache key deleted (#1113)

* removed await

* delete cache from boundary relationship search

* updated redis delete func

* Revert "removed await"

This reverts commit a5acb54.

* Update redisUtils.ts

---------

Co-authored-by: Jagankumar <[email protected]>

* updated redis delete func (#1114)

* removed cache from boundary relation create (#1115)

* corrected params of auto generate after download api (#1116)

* refactored consolidate (#1119)

* fix on the fetch from microplan Update campaignUtils.ts (#1120)

* Update microplanUtils.ts (#1123)

* added localization function (#1125)

* Update SearchCriteria.ts

* made createandtransfrom localization as await to upsert all localization in boundary management create flow (#1127)

* added logs in handledropdownthings (#1128)

* Fixed district missing issue (#1129)

* Facility Village List For microplan and dropdown fix (#1130)

* Facility Village List For microplan and dropdown fix

* Optional chaining

* Reverted recievedDropdown Changes

* removed localization cache in boundary generate flow for hierarchy module (#1133)

* planFacility create Fix (#1132)

* fixed the localisation cache on multiple data creates in boundary

* Revert "fixed the localisation cache on multiple data creates in boundary"

This reverts commit 94eb970.

* Facility fix generation for microplan (#1134)

* planFacility create Fix

* Fixed Facility Generation for microplan

* added the count info of the localisation upsert (#1144)

Co-authored-by: Jagankumar <[email protected]>

* Microplan integrated with console for facility , user & target (#1151)

* set start date to tomorrow

* updated end date

* added code for target sheet

* fixed

* added mdms call

* microplan integration changes added for target ,facility & user

* undo changes

* Update microplanIntergration.ts

* saving all the progress on the integration

* project facility mapping done

* Update microplanIntergration.ts

* Update microplanIntergration.ts

* Update microplanIntergration.ts

* target & facility integration completed statically

* completed facility & target file created based on microplan

* added static for user

* Added user related changes

* added the user integration

* added target and facility in resources array of campaign from microplan

* added user in resources array in campaign object

* Update microplanIntergration.ts

* revert the others

* Update index.ts

* Cleaned up code

---------

Co-authored-by: ansh-egov <[email protected]>
Co-authored-by: Jagankumar <[email protected]>
Co-authored-by: nitish-egov <[email protected]>

* added missing default tenantid (#1152)

* Boundary locale fix (#1153)

* planFacility create Fix

* Fixed boundary validation for different locales

---------

Co-authored-by: Jagankumar <[email protected]>

* removed duplicate key (#1154)

* Logs for reordering added (#1161)

* added missing default tenantid

* added logs for reordering before project creation

* Update index.ts

---------

Co-authored-by: Jagankumar <[email protected]>

* Readme for microplans (#1160)

* Readme for microplans

* Fixed boolean type

* Optimized isMicroplanRequest

* Optimizing roles for microplan (#1164)

* Improved  some performance with huge campaign object (#1165)

* Trying optimised code by chatgpt

* added 4mb limit

* Update app.ts

* Updated the comments

* Changes for pollutils and reorder

* Update pollUtils.ts

---------

Co-authored-by: Jagankumar <[email protected]>
Co-authored-by: ashish-egov <[email protected]>

* Added fixes for error during processing (#1172)

* added missing default tenantid

* added try catch

* Update index.ts

---------

Co-authored-by: Jagankumar <[email protected]>

* fixed the hard codings in the code (#1170)

* fixed the hard codings in the code

* fixed

* Added filter check for the fetch from microplan if it has already some resources

* fixed hardcoding in target flow (#1175)

* fixed hardcoding in target flow

* fixed

* fixed

* Update campaignApis.ts (#1177)

* Update campaignApis.ts

* Update campaignApis.ts

* Update campaignApis.ts

* Update genericUtils.ts (#1178)

* changed the campaign key to activity (#1180)

* added missing default tenantid

* Update campaignUtils.ts

* Update index.ts

---------

Co-authored-by: Jagankumar <[email protected]>

* handled failed generations in download api (#1185)

* Change for roles name change (#1187)

* added search before update in fetch all datas (#1190)

* added search before update in fetch all datas

* Change for roles name change (#1187)

---------

Co-authored-by: ashish-egov <[email protected]>
Co-authored-by: Jagankumar <[email protected]>

* formatted and used Promise.all to make the calls in parallel

* filtered the plan facility response to have only those facilities which have service boundaries

* added hierarchy filtering from mdms (#1188)

* added hierarchy filtering from mdms

* did some hardcoding

* fixed fetching of headers

* added some logs

* added extra loggers for fetch from microplan activities (#1193)

* added extra loggers for fetch from microplan activities

* Update microplanIntergration.ts

---------

Co-authored-by: Jagankumar <[email protected]>

* added retry in localization upsert (#1194)

* Adding additional loggers to capture more information on microplan integration

* Update health-services/project-factory/src/server/validators/campaignValidators.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update health-services/project-factory/src/server/validators/campaignValidators.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update health-services/project-factory/src/server/controllers/campaignManage/campaignManage.controller.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* fixed integration (#1197)

* Global handler (#1199)

* Change for roles name change

* Global exception handler integrated

* String logger

* Update app.ts

* added heap memory log & created an env variable for incomingRequestPayloadLimit (#1201)

* added heap memory log & created an env variable for incomingRequestPayloadLimit

* Update index.ts

---------

Co-authored-by: Jagankumar <[email protected]>

* Set memory limit and added log of available, max limits (#1202)

* added logs to check current value

* Update app.ts

* Update Dockerfile

* Update Dockerfile

* Update app.ts

---------

Co-authored-by: Jagankumar <[email protected]>

* Handle for google sheet formulas (#1207)

* Update microplanValidators.ts (#1213)

* Update microplanValidators.ts

* Update microplanValidators.ts

* Update microplanValidators.ts

* Localised roles (#1217)

* added change log for admin console version 0.3 (#1224)

* Pvar validation (#1225)

* product variant validation added

* Optimized validations

* Refactored

* Logger error fix

* Refactor

* Refactor

* refactored project resource mapping logic (#1204)

* refactored project resource mapping logic

* added new function to search project after campaign creation time

* added reference id  params in project search

* added logic for adding resources only for newly created projects

* refactored and code clean up for project resource mappings in update and create flow

* refactor getProjectMappingBody func

* some condition check

* correction

* removed project departments

* microplan save topic changes (#1231)

* Update microplanUtils.ts

* Update index.ts

* Update campaignApis.ts (#1232)

* Update campaignApis.ts

* Update campaignApis.ts

* Update campaignApis.ts

* Update campaignValidators.ts

* Revert boundaryProject Mapping

* Cleaned up data configs (#1234)

* Update index.ts

* Update campaignUtils.ts

* Update health-services/project-factory/src/server/utils/onGoingCampaignUpdateUtils.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update app.ts

* Update health-services/project-factory/src/server/utils/microplanUtils.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* added changes for performance testing (#1236)

* added changes for performance testing

* microplan save topic changes (#1231)

* Update microplanUtils.ts

* Update index.ts

* Update campaignApis.ts (#1232)

* Update campaignApis.ts

* Update campaignApis.ts

* Update campaignApis.ts

* Update campaignValidators.ts

* Revert boundaryProject Mapping

* Cleaned up data configs (#1234)

* Update index.ts

* Update campaignUtils.ts

* try catch handling

* Update health-services/project-factory/src/server/service/dataManageService.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

---------

Co-authored-by: ashish-egov <[email protected]>
Co-authored-by: ansh-egov <[email protected]>
Co-authored-by: Jagankumar <[email protected]>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update health-services/project-factory/src/server/utils/onGoingCampaignUpdateUtils.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update app.ts

* Update health-services/project-factory/src/server/utils/microplanUtils.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update campaignValidators.ts (#1244)

* Update campaignValidators.ts

* Update campaignValidators.ts

* add

* added new config values

* Other configs (#1250)

* add

* added new config values

* Update request.ts

---------

Co-authored-by: Jagankumar <[email protected]>

* Updating debug function for error handling (#1243)

* Apply suggestions from code review

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update health-services/project-factory/src/server/service/dataManageService.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update dataManageService.ts

* Applied code rabbit changes

* refactored sheet consolidate logic (#1254)

* refactored sheet consolidate logic

* Applied code rabbit changes

---------

Co-authored-by: Jagankumar <[email protected]>

* refactor sheet consolidate for target (#1255)

* Update package.json

* Update health-services/project-factory/src/server/utils/onGoingCampaignUpdateUtils.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* removed other ui workflows

* Update publishProjectFactory.yml

* Update publishProjectFactory.yml

* Update publishProjectFactory.yml

* Update publishProjectFactory.yml

* Update publishProjectFactory.yml

* Update publishProjectFactory.yml

* consolidate sheet handle logic change (#1256)

* Apply suggestions from code review

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

---------

Co-authored-by: ashish-egov <[email protected]>
Co-authored-by: ashish-egov <[email protected]>
Co-authored-by: nitish-egov <[email protected]>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Co-authored-by: nitish-egov <[email protected]>
Co-authored-by: ansh-egov <[email protected]>
Co-authored-by: ansh-egov <[email protected]>
Co-authored-by: ejagankumar <[email protected]>

* Changed docker file

* package changes

* dockerfile revert

* changed package json

* Update package.json

* reverted other folders

* reverted

* Update settings.json

---------

Co-authored-by: kanishq-egov <[email protected]>
Co-authored-by: kavi_elrey@1993 <[email protected]>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Co-authored-by: kanishq-egov <[email protected]>
Co-authored-by: Sathish P <[email protected]>
Co-authored-by: tanishi-egov <[email protected]>
Co-authored-by: Palak Garg <[email protected]>
Co-authored-by: Priyanka-eGov <[email protected]>
Co-authored-by: Jagankumar <[email protected]>
Co-authored-by: nitish-egov <[email protected]>
Co-authored-by: nitish-egov <[email protected]>
Co-authored-by: ansh-egov <[email protected]>
Co-authored-by: ansh-egov <[email protected]>
Co-authored-by: ejagankumar <[email protected]>
* some boundary bulk and microplan user changes

* Some refactoring
* localisation-cache-fix

* Refactor

* Refactor

* logger added
* some boundary bulk and microplan user changes

* Some refactoring

* Fix for target

* Facility fix

* User Fix

* Target validation for microplan

coderabbitai bot commented Dec 12, 2024

Walkthrough

The changes in this pull request involve several modifications across various files in the project-factory project. Key updates include changes to the Dockerfile to specify the command for running the application using Yarn, enhancements to the package.json scripts for debugging capabilities, and additions to configuration files to support new properties. New methods and error handling improvements were also introduced in several utility and controller files, enhancing functionality and maintainability. A new TypeScript configuration file was added, and existing configurations were updated for clarity.

Changes

File Path Change Summary
health-services/project-factory/Dockerfile Updated command to run application using Yarn in production mode (CMD ["yarn", "prod"]).
health-services/project-factory/package.json Modified serve and prod scripts to include debugging capabilities based on the DEBUG environment variable.
health-services/project-factory/src/server/config/index.ts Added new properties to facility and localisation objects, enhancing configuration options.
health-services/project-factory/src/server/controllers/localisationController/localisation.controller.ts Introduced new method cacheBurst and improved error handling in createLocalisation.
health-services/project-factory/src/server/utils/campaignUtils.ts Refactored functions for campaign and boundary management, improved error handling, and introduced new utility functions.
health-services/project-factory/src/server/utils/genericUtils.ts Cosmetic changes for readability and consistency in formatting and logging.
health-services/project-factory/src/server/utils/microplanUtils.ts Updated validateNationalDuplicacy function to use localizationMap, enhancing phone number validation.
health-services/project-factory/src/server/utils/onGoingCampaignUpdateUtils.ts Updated modifyProcessedSheetData function to include a new schema parameter for improved data processing.
health-services/project-factory/src/server/utils/request.ts Enhanced error handling and logging in the httpRequest function.
health-services/project-factory/src/server/utils/transforms/localisationMessageConstructor.ts Made CHUNK_SIZE dynamic and improved error handling in uploadInChunks (see the sketch after this table).
health-services/project-factory/src/server/validators/campaignValidators.ts Added import for validateExtraBoundariesForMicroplan and integrated it into immediateValidationForTargetSheet.
health-services/project-factory/src/server/validators/microplanValidators.ts Updated validateMicroplanFacility error message and added validateExtraBoundariesForMicroplan function for additional validation checks.
health-services/project-factory/tsconfig.debug.json Introduced new TypeScript configuration file with various compiler options.
health-services/project-factory/tsconfig.json Added comments to clarify compiler options; no changes to actual configuration values.
health-services/project-factory/src/server/api/campaignApis.ts Updated performAndSaveResourceActivity to log accurate creation time and refined processGenericRequest to exclude certain resource types from processing logic.
health-services/project-factory/src/server/app.ts Removed redundant JSON body parser middleware initialization.
health-services/project-factory/src/server/utils/gzipHandler.ts Introduced a new utility for handling Gzip-compressed requests, including functions for processing and decompressing Gzip data.
health-services/project-factory/src/server/utils/middlewares/requestMiddleware.ts Updated to handle both application/json and application/gzip content types, integrating Gzip request handling.
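
The chunk-size and cache-burst rows above are only summarized; the following sketch shows one way the two could fit together. The service object and both method signatures are assumptions made for illustration, not the actual exports of localisationMessageConstructor.ts or localisation.controller.ts:

// Illustrative sketch, not the repository's implementation: the chunk size comes
// from an environment variable with a fallback, each chunk is upserted with a
// simple retry, and the localization cache is burst only after all chunks succeed.
interface LocalizationMessage { code: string; message: string; module: string; locale: string; }

const CHUNK_SIZE = Number.parseInt(process.env.LOCALIZATION_CHUNK_SIZE ?? "10000", 10);

async function uploadInChunks(
  messages: LocalizationMessage[],
  localizationService: {
    upsert: (batch: LocalizationMessage[]) => Promise<void>;
    cacheBurst: () => Promise<void>;
  },
  maxRetries = 3
): Promise<void> {
  for (let start = 0; start < messages.length; start += CHUNK_SIZE) {
    const chunk = messages.slice(start, start + CHUNK_SIZE);
    for (let attempt = 1; ; attempt++) {
      try {
        await localizationService.upsert(chunk);
        break;
      } catch (error) {
        if (attempt >= maxRetries) throw error; // surface the failure after retries
      }
    }
  }
  await localizationService.cacheBurst(); // burst the cache once every chunk is in
}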

Possibly related PRs

  • Admin console, user password logic updated #760: The changes to the performAndSaveResourceActivity function in campaignApis.ts update its logging and processing logic, which may relate to the overall handling of resource activities in the application, much as this PR's Dockerfile and package.json modifications affect how the application runs and is configured.

🐰 In the code, I hop and play,
With scripts and configs, I pave the way.
Yarn runs the show, in production we thrive,
New methods and checks keep our code alive!
Debugging made easy, with just a little tweak,
In the land of code, it's the rabbits we seek! 🐇✨



coderabbitai bot left a comment


Actionable comments posted: 24

🧹 Outside diff range comments (7)
health-services/project-factory/src/server/utils/request.ts (2)

Line range hint 132-196: Enhance error logging for better debugging

While the error handling is comprehensive, the logging can be improved to help with debugging issues.

Consider enhancing the error logging:

     } catch (error: any) {
       const errorResponse = error?.response;
+      const requestDetails = {
+        url: _url,
+        method: _method,
+        params: _params,
+        serviceName: getServiceName(_url)
+      };
       logger.error(
-        "INTER-SERVICE :: FAILURE :: " +
-        getServiceName(_url) +
-        ":: CODE :: " +
-        errorResponse?.status +
-        ":: ERROR :: " +
-        (errorResponse?.data?.Errors?.[0]?.code || error) +
-        ":: DESCRIPTION :: " +
-        errorResponse?.data?.Errors?.[0]?.description
+        JSON.stringify({
+          level: "ERROR",
+          message: "INTER-SERVICE_FAILURE",
+          request: requestDetails,
+          response: {
+            status: errorResponse?.status,
+            error: errorResponse?.data?.Errors?.[0]?.code || error,
+            description: errorResponse?.data?.Errors?.[0]?.description
+          }
+        })
       );

This change:

  1. Adds request details to help with debugging
  2. Structures logs in a consistent JSON format
  3. Makes logs easier to parse and analyze

Line range hint 67-67: Improve caching configuration and strategy

The current caching implementation has several areas for improvement:

  1. Hard-coded TTL value
  2. Non-standardized cache key generation
  3. Missing cache invalidation strategy for error cases

Consider these improvements:

  1. Move cache configuration to config file:
-  const cacheTTL = 300; // TTL in seconds (5 minutes)
+  const cacheTTL = config.cacheValues.ttl || 300;
  2. Standardize cache key generation:
-  const cacheKey = headers && headers.cachekey ? `cache:${headers.cachekey}` : null;
+  const cacheKey = headers?.cachekey ? 
+    `cache:${config.values.env}:${getServiceName(_url)}:${headers.cachekey}` : null;
  3. Add cache invalidation for error responses in the catch block:
if (cacheKey && isRedisConnected) {
  await redis.del(cacheKey);
}

Also applies to: 69-69

health-services/project-factory/tsconfig.json (2)

Line range hint 8-9: Enable source maps for improved debugging experience

Given that this PR involves "console recent changes" and debugging capabilities, consider enabling source maps to improve the debugging experience. Currently, source map generation is commented out.

Apply this diff to enable source maps:

    // "declaration": true,                   /* Generates corresponding '.d.ts' file. */
    // "declarationMap": true,                /* Generates a sourcemap for each corresponding '.d.ts' file. */
-   // "sourceMap": true,                     /* Generates corresponding '.map' file. */
+   "sourceMap": true,                        /* Generates corresponding '.map' file. */
    // "outFile": "./",                       /* Concatenate and emit output to single file. */

Also applies to: 53-57


Update typeRoots configuration to include local type definitions

The verification revealed that there are local TypeScript declaration files (*.d.ts) in the src/server/models directory that define important interfaces for boundary and MDMS functionality. These files are actively used in the codebase through imports. The current typeRoots configuration that only includes ./node_modules/@types would indeed be too restrictive.

  • Add "./src/server/models" to the typeRoots array in tsconfig.json to ensure these local type definitions are properly recognized
  • Alternatively, consider using the paths compiler option to map these specific type modules if you want more granular control
🔗 Analysis chain

Line range hint 49-50: Verify module resolution configuration

The typeRoots setting is configured to only look in ./node_modules/@types. This might be too restrictive if you have local type definitions.

Let's check if there are any local type definitions that might be excluded:

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check for local type definition files
# Look for .d.ts files outside node_modules

echo "Local type definition files:"
fd "\.d\.ts$" --exclude "node_modules"

Length of output: 229


Script:

#!/bin/bash
# Let's check the content of these type definition files to understand their usage
echo "Content of Boundary.d.ts:"
cat health-services/project-factory/src/server/models/Boundary.d.ts

echo -e "\nContent of MDMS.d.ts:"
cat health-services/project-factory/src/server/models/MDMS.d.ts

# Also check if these types are being imported anywhere
echo -e "\nUsage of these type definitions:"
rg -l "from ['\"].*/(Boundary|MDMS)['\"]" --type ts

Length of output: 8706

health-services/project-factory/src/server/utils/campaignUtils.ts (3)

Line range hint 932-932: Use let or const instead of var for variable declarations

The var keyword has function scope and can lead to unexpected behavior. It's recommended to use let or const for block-scoped variables. Please replace var with let or const in the following instances.

Apply the following diffs:

- var columns = {};
+ const columns = {};

- var errorData = request.body.sheetErrorDetails;
+ let errorData = request.body.sheetErrorDetails;

- var errorMap: any = {};
+ let errorMap: any = {};

- var newSheetErrorDetails: any = [];
+ let newSheetErrorDetails: any = [];

- var createResourceDetailsIds: any[] = [];
+ const createResourceDetailsIds: any[] = [];

- var parentCampaign = requestBody?.parentCampaign;
+ const parentCampaign = requestBody?.parentCampaign;

Also applies to: 936-936, 957-957, 970-970, 1071-1071, 1322-1322


Line range hint 2427-2427: Replace console.log with logger.error for error logging

Using console.log is discouraged in production code. Use the provided logging mechanism for consistent error logging.

Apply the following diffs:

-        console.log(error);
+        logger.error(error);

Also applies to: 2450-2450


Line range hint 2458-2461: Handle potential errors when updating the campaign

In the updateCampaignAfterSearch function, ensure that errors returned from updateProjectTypeCampaignService are properly handled to avoid unhandled promise rejections.

Consider adding a try...catch block around the updateProjectTypeCampaignService call:

      try {
          await updateProjectTypeCampaignService(req);
          logger.info("Updated the received campaign object");
      } catch (error: any) {
          logger.error(`Error updating campaign: ${error.message}`);
          // Handle the error appropriately
      }
📜 Review details

Configuration used: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between fd9a00f and 3ed33e6.

📒 Files selected for processing (14)
  • health-services/project-factory/Dockerfile (1 hunks)
  • health-services/project-factory/package.json (1 hunks)
  • health-services/project-factory/src/server/config/index.ts (3 hunks)
  • health-services/project-factory/src/server/controllers/localisationController/localisation.controller.ts (3 hunks)
  • health-services/project-factory/src/server/utils/campaignUtils.ts (4 hunks)
  • health-services/project-factory/src/server/utils/genericUtils.ts (14 hunks)
  • health-services/project-factory/src/server/utils/microplanUtils.ts (2 hunks)
  • health-services/project-factory/src/server/utils/onGoingCampaignUpdateUtils.ts (2 hunks)
  • health-services/project-factory/src/server/utils/request.ts (1 hunks)
  • health-services/project-factory/src/server/utils/transforms/localisationMessageConstructor.ts (3 hunks)
  • health-services/project-factory/src/server/validators/campaignValidators.ts (2 hunks)
  • health-services/project-factory/src/server/validators/microplanValidators.ts (3 hunks)
  • health-services/project-factory/tsconfig.debug.json (1 hunks)
  • health-services/project-factory/tsconfig.json (1 hunks)
🧰 Additional context used
🪛 Biome (1.9.4)
health-services/project-factory/src/server/config/index.ts

[error] 100-100: Use Number.parseInt instead of the equivalent global.

ES2015 moved some globals into the Number namespace for consistency.
Safe fix: Use Number.parseInt instead.

(lint/style/useNumberNamespace)


[error] 101-101: Use Number.parseInt instead of the equivalent global.

ES2015 moved some globals into the Number namespace for consistency.
Safe fix: Use Number.parseInt instead.

(lint/style/useNumberNamespace)
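
For context on the two useNumberNamespace findings above, the suggested fix is a direct substitution; the variable and environment names below are placeholders, not the actual lines 100-101 of config/index.ts:

// Before (flagged by Biome): parseInt(process.env.SOME_LIMIT || "100", 10)
// After (suggested): use the Number namespace introduced in ES2015.
const someLimit = Number.parseInt(process.env.SOME_LIMIT ?? "100", 10);
const anotherLimit = Number.parseInt(process.env.ANOTHER_LIMIT ?? "50", 10);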

health-services/project-factory/src/server/utils/genericUtils.ts

[error] 911-911: Unnecessary use of boolean literals in conditional expression.

Simplify your code by directly assigning the result without using a ternary operator.
If your goal is negation, you may use the logical NOT (!) or double NOT (!!) operator for clearer and concise code.
Check for more details about NOT operator.
Unsafe fix: Remove the conditional expression with

(lint/complexity/noUselessTernary)


[error] 1313-1313: This let declares a variable that is only assigned once.

'd' is never reassigned.

Safe fix: Use const instead.

(lint/style/useConst)

health-services/project-factory/src/server/utils/onGoingCampaignUpdateUtils.ts

[error] 82-82: Unsafe usage of optional chaining.

If it short-circuits with 'undefined' the evaluation will throw TypeError here:

(lint/correctness/noUnsafeOptionalChaining)
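
A generic illustration of what this rule guards against (the object shape is hypothetical, not line 82 of onGoingCampaignUpdateUtils.ts):

// If sheet.rows is undefined, `sheet.rows?.length` evaluates to undefined;
// destructuring or spreading that value would throw a TypeError at runtime,
// so an explicit fallback keeps the expression safe.
const sheet: { rows?: string[] } = {};
const [firstRow] = sheet.rows ?? [];       // safe: falls back to an empty array
const rowCount = sheet.rows?.length ?? 0;  // safe: explicit numeric fallback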

health-services/project-factory/src/server/utils/microplanUtils.ts

[error] 119-119: Template literals are preferred over string concatenation.

Unsafe fix: Use a template literal.

(lint/style/useTemplate)

health-services/project-factory/src/server/utils/campaignUtils.ts

[error] 2417-2417: Avoid the delete operator which can impact performance.

(lint/performance/noDelete)
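
To illustrate the noDelete suggestion in isolation (the object and key names are placeholders, not the code at line 2417 of campaignUtils.ts):

// Instead of `delete campaign.tempField;`, drop the key immutably via rest
// destructuring, or assign undefined if the key only needs to be ignored later.
const campaign: { id: string; tempField?: string } = { id: "c-1", tempField: "x" };
const { tempField: _dropped, ...campaignWithoutTemp } = campaign; // no delete operator
campaign.tempField = undefined; // mutating alternative that still avoids delete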

health-services/project-factory/tsconfig.debug.json

Biome reports two parse diagnostics repeatedly across lines 4-68 of this file: "JSON standard does not allow comments" and "End of file expected. Use an array for a sequence of values: [1, 2]". The strict JSON parser cannot get past the comments in this tsconfig-style file, so the same pair of errors recurs for nearly every commented or multi-property line.

(parse)

🔇 Additional comments (11)
health-services/project-factory/src/server/validators/microplanValidators.ts (1)

273-273: Improved error message clarity

The error message now accurately reflects that the facility capacity validation checks for both empty and zero values.

health-services/project-factory/src/server/validators/campaignValidators.ts (2)

25-25: LGTM!

The import statement is correctly added and properly grouped with related microplan validator imports.


1323-1323: LGTM!

The validateExtraBoundariesForMicroplan function is well-integrated into the validation flow, placed at an appropriate point in the sequence of validations.

health-services/project-factory/Dockerfile (1)

Line range hint 1-1: Verify compatibility with Node.js 20

The base image node:20 is relatively recent. Ensure all dependencies and code are compatible with Node.js 20.

Let's check the package.json for any potential compatibility issues:

✅ Verification successful

Node.js 20 upgrade is compatible with the codebase

The upgrade from Node.js 16 to 20 is safe based on the following evidence:

  • The package.json dependencies are all compatible with Node.js 20, with many packages explicitly supporting Node.js >=18.0.0
  • The codebase already has @types/node v20.11.29 as a dependency
  • The LOCALSETUP.md file already recommends Node.js 20
  • The TypeScript codebase shows modern patterns compatible with Node.js 20
  • Key dependencies like Express, Axios, and Winston are on recent versions that support Node.js 20
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check package.json for engine constraints and dependencies

# Check for engine constraints
echo "Checking engine constraints..."
if [ -f package.json ]; then
    jq '.engines' package.json
fi

# List direct dependencies
echo "Checking dependencies..."
if [ -f package.json ]; then
    jq '.dependencies' package.json
fi

Length of output: 181


Script:

#!/bin/bash
# Let's try to find the package.json in the correct directory
echo "Looking for package.json in health-services/project-factory..."
if [ -f "health-services/project-factory/package.json" ]; then
    echo "Found package.json. Checking contents..."
    jq '.engines, .dependencies' health-services/project-factory/package.json
else
    echo "package.json not found in health-services/project-factory"
fi

# Let's also check for any TypeScript files to understand the codebase better
echo -e "\nChecking for TypeScript files..."
fd -e ts -p health-services/project-factory/

# Check for any existing Node.js version references in the codebase
echo -e "\nChecking for Node.js version references..."
rg -i "node.*[0-9]{2}" health-services/project-factory/

Length of output: 32572

health-services/project-factory/tsconfig.json (1)

Line range hint 77-81: Review test file exclusion pattern

The current configuration excludes test files with **/*.test.ts pattern but includes all source files. This might miss test files with different naming patterns (e.g., .spec.ts).

Let's verify if there are any test files that might be unintentionally included:

health-services/project-factory/tsconfig.debug.json (1)

64-67: Verify test file exclusion for debugging

The configuration excludes all test files (**/*.test.ts). Consider whether you need to debug test files; if so, you might want to:

  1. Remove the test file exclusion
  2. Create a separate debug configuration for tests
✅ Verification successful

Test file exclusion does not affect debugging

The codebase contains multiple TypeScript files with test-related code (describe/it/test patterns) within the source files themselves. Since these test patterns are integrated into the source files rather than separate .test.ts files, excluding **/*.test.ts won't affect debugging capabilities. The current exclusion pattern is safe to keep.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Check for test files that might need debugging
fd -e test.ts

Length of output: 15


Script:

#!/bin/bash
# Let's try a different approach to find test files
rg -g "*.ts" "^.*test.*\.(ts|tsx)$" --files-only

# Also check for any test-related patterns in the codebase
rg -g "*.ts" -l "(describe|it|test)\s*\(" 

# Check tsconfig patterns in the repository
fd tsconfig

Length of output: 1897

🧰 Tools
🪛 Biome (1.9.4)

[error] 64-64: End of file expected

Use an array for a sequence of values: [1, 2]

(parse)


[error] 64-64: End of file expected

Use an array for a sequence of values: [1, 2]

(parse)


[error] 64-67: End of file expected

Use an array for a sequence of values: [1, 2]

(parse)

health-services/project-factory/src/server/utils/campaignUtils.ts (1)

1038-1038: Ensure getRootBoundaryCode function handles empty or undefined boundaries

The getRootBoundaryCode function might return an empty string if no root boundary is found. Ensure that this case is properly handled to avoid potential issues downstream.
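One defensive pattern at the call site, assuming the throwError helper used elsewhere in this service (a sketch only; the variable names are illustrative and the actual handling in campaignUtils.ts may differ):

// Sketch: guard against an empty root boundary code before it is used downstream.
const rootBoundaryCode = getRootBoundaryCode(boundaries);
if (!rootBoundaryCode) {
  throwError("COMMON", 400, "VALIDATION_ERROR", "No root boundary found for the given boundaries; cannot proceed.");
}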

Run the following script to verify if any calls to getRootBoundaryCode could result in an empty string and handle them appropriately:

health-services/project-factory/src/server/utils/onGoingCampaignUpdateUtils.ts (2)

36-38: LGTM!

The code correctly filters the resources by processedType and maps them to their createResourceId. The implementation is concise and correct.


79-79: Ensure all calls to modifyProcessedSheetData are updated

The function signature of modifyProcessedSheetData has been changed. Please verify that all invocations of this function elsewhere in the codebase are updated to match the new parameters (type: any, sheetData: any, schema: any, localizationMap?: any).

Run the following script to find and review all usages of modifyProcessedSheetData:

✅ Verification successful

Let me verify the function calls by checking the actual implementation of these calls to ensure they match the new signature.


Let me get more context around these function calls to verify the parameters being passed.


All calls to modifyProcessedSheetData are correctly updated

Both calls to modifyProcessedSheetData in the codebase pass all the required parameters in the correct order, as sketched after this list:

  1. typeWithoutWith as the type parameter
  2. processedFacilitySheetData/processedUserSheetData as the sheetData parameter
  3. schema as the schema parameter
  4. localizationMap as the optional localizationMap parameter
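A representative call in that order, using the names above (a sketch, not a verbatim excerpt from genericUtils.ts), would look like:

modifyProcessedSheetData(typeWithoutWith, processedFacilitySheetData, schema, localizationMap);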
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Find all calls to `modifyProcessedSheetData` in the codebase.

# Search for the function usage across the repository
rg -A 2 'modifyProcessedSheetData\('

Length of output: 1281


Script:

#!/bin/bash
# Get the implementation context of the function calls
ast-grep --pattern 'modifyProcessedSheetData($$$)' 

Length of output: 3009


Script:

#!/bin/bash
# Get more context around the function calls in genericUtils.ts
rg -B 2 -A 2 'modifyProcessedSheetData\(' health-services/project-factory/src/server/utils/genericUtils.ts

Length of output: 936

health-services/project-factory/src/server/utils/microplanUtils.ts (2)

Line range hint 96-138: Role validation logic updated correctly

The updates to the validateNationalDuplicacy function enhance the role validation logic and effectively incorporate localization. This improves the clarity of error messages and ensures consistent role handling.

🧰 Tools
🪛 Biome (1.9.4)

[error] 119-119: Template literals are preferred over string concatenation.

Unsafe fix: Use a template literal.

(lint/style/useTemplate)


66-66: Verify that all calls to validateNationalDuplicacy are updated

Ensure that all function calls to validateNationalDuplicacy across the codebase have been updated to match the new signature accepting localizationMap instead of phoneNumberKey.

Run the following script to check for any outdated calls to validateNationalDuplicacy still using phoneNumberKey as the third argument:

✅ Verification successful

No outdated calls found - function signature change is consistent

The verification shows that validateNationalDuplicacy is only used once in the codebase, and that single call correctly uses localizationMap as the third argument, matching the function's definition. There are no outdated calls using phoneNumberKey.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Search for calls to 'validateNationalDuplicacy' with 'phoneNumberKey' as the third argument.

# Find all occurrences where 'validateNationalDuplicacy' is called with 'phoneNumberKey'
rg 'validateNationalDuplicacy\([^,]+,[^,]+,\s*phoneNumberKey\s*\)'

Length of output: 68


Script:

#!/bin/bash
# Let's first find all occurrences of validateNationalDuplicacy to see its usage pattern
rg "validateNationalDuplicacy" -A 2

# Also let's look at the function definition to understand the signature change
ast-grep --pattern 'function validateNationalDuplicacy($_) {
  $$$
}'

Length of output: 792

Comment on lines +306 to +323
export async function validateExtraBoundariesForMicroplan(request: any, dataFromSheet: any, localizationMap: any) {
if (await isMicroplanRequest(request)) {
const campaignBoundariesSet = new Set(request?.body?.campaignBoundaries?.map((boundary: any) => boundary.code));
for (const key in dataFromSheet) {
if (key !== getLocalizedName(getBoundaryTabName(), localizationMap) && key !== getLocalizedName(config?.values?.readMeTab, localizationMap)) {
if (Object.prototype.hasOwnProperty.call(dataFromSheet, key)) {
const dataArray = (dataFromSheet as { [key: string]: any[] })[key];
for (const boundaryRow of dataArray) {
const boundaryCode = boundaryRow[getLocalizedName(getBoundaryColumnName(), localizationMap)];
if (!campaignBoundariesSet.has(boundaryCode)) {
throwError("COMMON", 400, "VALIDATION_ERROR", `Some boundaries in uploaded sheet are not present in campaign boundaries. Please upload from downloaded template only.`);
}
}
}
}
}
}
}

🧹 Nitpick (assertive)

Well-implemented boundary validation with room for improvement

The function efficiently validates boundaries using a Set for O(1) lookups. Consider these enhancements:

  1. Localize the error message using the provided localizationMap
  2. Include the specific missing boundary codes in the error message for better debugging (a sketch follows the diff below)
 if (!campaignBoundariesSet.has(boundaryCode)) {
-    throwError("COMMON", 400, "VALIDATION_ERROR", `Some boundaries in uploaded sheet are not present in campaign boundaries. Please upload from downloaded template only.`);
+    const errorMessage = getLocalizedName("BOUNDARY_NOT_FOUND_ERROR", localizationMap) || 
+        `Boundary code '${boundaryCode}' in uploaded sheet is not present in campaign boundaries. Please upload from downloaded template only.`;
+    throwError("COMMON", 400, "VALIDATION_ERROR", errorMessage);
 }
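If the intent is to report every offending code at once rather than failing on the first mismatch, the loop can collect them before throwing (a sketch built from the helpers already used in this function; the aggregated message wording is illustrative):

// Sketch: gather all boundary codes missing from the campaign set, then fail once.
const missingCodes: string[] = [];
for (const boundaryRow of dataArray) {
  const boundaryCode = boundaryRow[getLocalizedName(getBoundaryColumnName(), localizationMap)];
  if (!campaignBoundariesSet.has(boundaryCode)) {
    missingCodes.push(boundaryCode);
  }
}
if (missingCodes.length > 0) {
  throwError("COMMON", 400, "VALIDATION_ERROR", `Boundary codes not present in campaign boundaries: ${missingCodes.join(", ")}. Please upload from the downloaded template only.`);
}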

Committable suggestion skipped: line range outside the PR's diff.

# Replaced by CMD ["yarn", "prod"]


# Replaced by CMD ["yarn", "prod"]

🧹 Nitpick (assertive)

Remove redundant comment

The comment about CMD replacement is redundant as Git history already tracks these changes. Consider removing this line for cleaner code.

-# Replaced by CMD ["yarn", "prod"]

@@ -31,6 +31,4 @@
EXPOSE 3000

CMD ["yarn", "prod"]

🛠️ Refactor suggestion

Consider adding DEBUG environment variable

According to the summary, the prod script's behavior depends on the DEBUG environment variable, but it's not defined in the Dockerfile.

Add the DEBUG environment variable definition:

 EXPOSE 3000
+
+# Set DEBUG environment variable (default to false for production)
+ENV DEBUG=false
 
 CMD ["yarn", "prod"]

Committable suggestion skipped: line range outside the PR's diff.
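If this variable is added, the server entrypoint could branch on it along these lines (a sketch; the flag name matches the suggestion above, but how the prod script actually consumes it is an assumption):

// Sketch only: reading the DEBUG flag set in the Dockerfile at startup.
const isDebugMode = process.env.DEBUG === "true";

if (isDebugMode) {
  // e.g. enable verbose logging or attach an inspector for debug builds
  console.log("Starting project-factory in debug mode");
} else {
  console.log("Starting project-factory in production mode");
}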

Comment on lines +128 to +131
else{
logger.warn(`Error occurred while making request to ${getServiceName(_url)}: with error response ${JSON.stringify(response.data)}`);
return sendStatusCode ? { ...response.data, statusCode: responseStatus } : response.data;
}

⚠️ Potential issue

Critical: Inconsistent error handling for non-successful responses

The new else block treats non-successful HTTP status codes (3xx, 4xx, 5xx) as valid responses, which:

  • Breaks the standard error handling pattern where non-200 responses should trigger error handling
  • May hide errors from upstream code that expects exceptions for non-200 responses
  • Provides less detailed logging compared to the catch block

Consider this alternative implementation that maintains consistent error handling:

-      else{
-        logger.warn(`Error occurred while making request to ${getServiceName(_url)}: with error response ${JSON.stringify(response.data)}`);
-        return sendStatusCode ? { ...response.data, statusCode: responseStatus } : response.data;
-      }
+      else {
+        const error = new Error('Non-successful response status');
+        error.response = response;
+        throw error; // This will be caught by the catch block, ensuring consistent error handling
+      }

This change ensures that:

  1. Non-successful responses are properly handled by the catch block
  2. Detailed error logging is maintained
  3. Upstream code receives consistent error behavior
  4. The retry mechanism works for all error cases
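Note that assigning a response property to a plain Error will not type-check under strict TypeScript; if this route is taken, a small custom error class keeps the same behaviour with type safety (a sketch, not the project's existing error utilities):

// Sketch: a typed error carrying the failed response so the catch block can still
// inspect status and data; the class name and shape are illustrative.
class NonSuccessResponseError extends Error {
  constructor(public readonly response: { status: number; data: unknown }) {
    super(`Non-successful response status: ${response.status}`);
    this.name = "NonSuccessResponseError";
  }
}

// Illustrative usage in the else branch (variable names taken from the diff above):
// throw new NonSuccessResponseError({ status: responseStatus, data: response.data });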

"alwaysStrict": true, /* Parse in strict mode and emit "use strict" for each source file. */
/* Additional Checks */
"noUnusedLocals": true, /* Report errors on unused locals. */
// "noUnusedParameters": true, /* Report errors on unused parameters. */

🧹 Nitpick (assertive)

Enable unused parameter checks for debugging

Enable noUnusedParameters to catch potential issues with unused function parameters during debugging.

-        // "noUnusedParameters": true,            /* Report errors on unused parameters. */
+        "noUnusedParameters": true,            /* Report errors on unused parameters. */
🧰 Tools
🪛 Biome (1.9.4)

[error] 35-36: End of file expected

Use an array for a sequence of values: [1, 2]

(parse)

@@ -1359,7 +1363,7 @@
async function getLocalizedMessagesHandler(request: any, tenantId: any, module = config.localisation.localizationModule, overrideCache = false) {
const localisationcontroller = Localisation.getInstance();
const locale = getLocaleFromRequest(request);
const localizationResponse = await localisationcontroller.getLocalisedData(module, locale, tenantId,overrideCache);
const localizationResponse = await localisationcontroller.getLocalisedData(module, locale, tenantId, overrideCache);

🧹 Nitpick (assertive)

Consider caching optimization for localization.

The localization data fetching could benefit from caching to improve performance.

Consider implementing a caching mechanism for frequently accessed localization data to reduce database load and improve response times. This could be achieved using the existing appCache utility or a dedicated localization cache.
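One possible shape for such a cache, keyed by module, locale, and tenant (the helper below is illustrative and not the project's existing appCache API):

// Sketch of a minimal in-memory localization cache with a TTL; the key layout and TTL value are assumptions.
type LocalizationMessages = Record<string, string>;

const localizationCache = new Map<string, { data: LocalizationMessages; expiresAt: number }>();
const LOCALIZATION_TTL_MS = 5 * 60 * 1000; // 5 minutes

export async function getLocalisedDataCached(
  fetcher: () => Promise<LocalizationMessages>,
  module: string,
  locale: string,
  tenantId: string
): Promise<LocalizationMessages> {
  const key = `${module}:${locale}:${tenantId}`;
  const hit = localizationCache.get(key);
  if (hit && hit.expiresAt > Date.now()) {
    return hit.data; // served from cache, avoiding a repeat localisation call
  }
  const data = await fetcher(); // delegate to the real localisation fetch
  localizationCache.set(key, { data, expiresAt: Date.now() + LOCALIZATION_TTL_MS });
  return data;
}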

Comment on lines +752 to +758
let receivedDropdowns = request.body?.dropdowns;
logger.info("started adding dropdowns in facility", JSON.stringify(receivedDropdowns))

if(!receivedDropdowns||Object.keys(receivedDropdowns)?.length==0){
if (!receivedDropdowns || Object.keys(receivedDropdowns)?.length == 0) {
logger.info("No dropdowns found");
receivedDropdowns= setDropdownFromSchema(request,schema,localizationMap);
logger.info("refetched drodowns",JSON.stringify(receivedDropdowns))
receivedDropdowns = setDropdownFromSchema(request, schema, localizationMap);
logger.info("refetched drodowns", JSON.stringify(receivedDropdowns))

⚠️ Potential issue

Improve error handling for dropdown initialization.

The code lacks proper error handling when setting dropdowns. If setDropdownFromSchema fails, it could lead to undefined behavior.

Add error handling:

  let receivedDropdowns = request.body?.dropdowns;
  logger.info("started adding dropdowns in facility", JSON.stringify(receivedDropdowns))
  if (!receivedDropdowns || Object.keys(receivedDropdowns)?.length == 0) {
    logger.info("No dropdowns found");
+   try {
      receivedDropdowns = setDropdownFromSchema(request, schema, localizationMap);
      logger.info("refetched drodowns", JSON.stringify(receivedDropdowns))
+   } catch (error) {
+     logger.error("Failed to set dropdowns:", error);
+     throwError("COMMON", 500, "DROPDOWN_INITIALIZATION_FAILED", error?.message);
+   }
  }

Comment on lines +1310 to +1315
if (type == "boundaryManagement") {
logger.info("Processing data for boundaryManagement type")
const latLongBoundaryMap = await getLatLongMapForBoundaryCodes(request, boundaryCodeList);
for (let d of data) {
const boundaryCode = d[d.length - 1]; // Assume last element is the boundary code


⚠️ Potential issue

Improve type safety in boundary management processing.

The boundary code access assumes array structure without validation.

Add validation:

  if (type == "boundaryManagement") {
    logger.info("Processing data for boundaryManagement type")
    const latLongBoundaryMap = await getLatLongMapForBoundaryCodes(request, boundaryCodeList);
    for (let d of data) {
+     if (!Array.isArray(d) || d.length === 0) {
+       logger.error("Invalid data structure for boundary management");
+       continue;
+     }
      const boundaryCode = d[d.length - 1];
      if (latLongBoundaryMap[boundaryCode]) {
        const [latitude = null, longitude = null] = latLongBoundaryMap[boundaryCode];
        d.push(latitude);
        d.push(longitude);
      }
    }
  }
🧰 Tools
🪛 Biome (1.9.4)

[error] 1313-1313: This let declares a variable that is only assigned once.

'd' is never reassigned.

Safe fix: Use const instead.

(lint/style/useConst)

@@ -374,7 +374,7 @@
const localizationMapModule = await getLocalizedMessagesHandler(request, request?.query?.tenantId);
const localizationMap = { ...localizationMapHierarchy, ...localizationMapModule };
let fileUrlResponse: any;
if(type != 'boundaryManagement' && request?.query?.campaignId != 'default' && type != 'boundaryGeometryManagement'){
if (type != 'boundaryManagement' && request?.query?.campaignId != 'default' && type != 'boundaryGeometryManagement') {

🧹 Nitpick (assertive)

Improve condition readability with early return pattern.

The nested condition checks can be simplified using early returns for better readability.

-    if (type != 'boundaryManagement' && request?.query?.campaignId != 'default' && type != 'boundaryGeometryManagement') {
+    if (type === 'boundaryManagement' || type === 'boundaryGeometryManagement' || request?.query?.campaignId === 'default') {
+      return;
+    }

Committable suggestion skipped: line range outside the PR's diff.

@@ -815,7 +815,7 @@

async function handleHiddenColumns(sheet: any, hiddenColumns: any) {
// logger.info(sheet)
logger.info("hiddenColumns",hiddenColumns);
logger.info("hiddenColumns", hiddenColumns);

🧹 Nitpick (assertive)

Improve logging with structured format.

The logging statement could be more informative with structured data.

- logger.info("hiddenColumns", hiddenColumns);
+ logger.info("Processing hidden columns:", { count: hiddenColumns?.length, columns: hiddenColumns });

ashish-egov and others added 3 commits December 12, 2024 17:45
* Fix for boundary validation

* Optional Chaning
* added gzip handler in middleware

* added gzip handler specific to request middleware

* refactored gzip handler

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 5

🔭 Outside diff range comments (1)
health-services/project-factory/src/server/app.ts (1)

Line range hint 77-106: Extract memory monitoring logic

Consider extracting the memory monitoring logic into a separate utility for better maintainability.

Create a new file utils/memoryMonitor.ts:

import * as v8 from 'node:v8';
import { logger } from './logger';
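// NOTE: printMemoryInMB is referenced below without an import in this suggestion;
// it is assumed to be an existing helper in the codebase, so import or define it accordingly.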

const MEMORY_CHECK_INTERVAL = 5 * 60 * 1000; // 5 minutes
const MEMORY_ALERT_THRESHOLD = 0.8; // 80%

export const startMemoryMonitoring = () => {
  setInterval(() => {
    const stats = v8.getHeapStatistics();
    const usedHeapSize = stats.used_heap_size;
    const heapLimit = stats.heap_size_limit;

    logger.debug(
      JSON.stringify({
        "Heap Usage": {
          used: printMemoryInMB(usedHeapSize),
          limit: printMemoryInMB(heapLimit),
          percentage: ((usedHeapSize / heapLimit) * 100).toFixed(2),
        },
      })
    );

    if (usedHeapSize / heapLimit > MEMORY_ALERT_THRESHOLD) {
      logger.warn("High memory usage detected");
    }
  }, MEMORY_CHECK_INTERVAL);
};

Then update the listen method to use it:

-      setInterval(() => {
-        const stats = v8.getHeapStatistics();
-        // ... monitoring code ...
-      }, 5 * 60 * 1000);
+      startMemoryMonitoring();
📜 Review details

Configuration used: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b373ac4 and b69ba14.

📒 Files selected for processing (3)
  • health-services/project-factory/src/server/app.ts (1 hunks)
  • health-services/project-factory/src/server/utils/gzipHandler.ts (1 hunks)
  • health-services/project-factory/src/server/utils/middlewares/requestMiddleware.ts (3 hunks)
🧰 Additional context used
🪛 Biome (1.9.4)
health-services/project-factory/src/server/utils/gzipHandler.ts

[error] 2-2: A Node.js builtin module should be imported with the node: protocol.

Using the node: protocol is more explicit and signals that the imported module belongs to Node.js.
Unsafe fix: Add the node: protocol.

(lint/style/useNodejsImportProtocol)

🔇 Additional comments (1)
health-services/project-factory/src/server/app.ts (1)

53-53: Remove commented code

Good catch on removing the duplicate bodyParser.json() middleware. Let's clean up by removing the commented line entirely.

-    // this.app.use(bodyParser.json());

Comment on lines +1 to +2
import { Request } from "express";
import * as zlib from "zlib";

🛠️ Refactor suggestion

Update Node.js import to use the node: protocol

For better clarity and explicit Node.js module identification, update the zlib import.

-import * as zlib from "zlib";
+import * as zlib from "node:zlib";
🧰 Tools
🪛 Biome (1.9.4)

[error] 2-2: A Node.js builtin module should be imported with the node: protocol.

Using the node: protocol is more explicit and signals that the imported module belongs to Node.js.
Unsafe fix: Add the node: protocol.

(lint/style/useNodejsImportProtocol)

Comment on lines +24 to +36
// Helper function to decompress Gzip data
const decompressGzip = (gzipBuffer: Buffer): Promise<any> => {
return new Promise((resolve, reject) => {
zlib.gunzip(gzipBuffer, (err, result) => {
if (err) return reject(err);
try {
resolve(JSON.parse(result.toString()));
} catch (parseErr) {
reject(new Error("Invalid JSON format in decompressed data"));
}
});
});
};

🛠️ Refactor suggestion

Enhance error handling and type safety in decompressGzip

The current implementation could be improved with more specific error handling and type safety.

-const decompressGzip = (gzipBuffer: Buffer): Promise<any> => {
+const decompressGzip = (gzipBuffer: Buffer): Promise<unknown> => {
     return new Promise((resolve, reject) => {
         zlib.gunzip(gzipBuffer, (err, result) => {
-            if (err) return reject(err);
+            if (err) {
+                return reject(new Error(`Gzip decompression failed: ${err.message}`));
+            }
             try {
-                resolve(JSON.parse(result.toString()));
+                const parsed = JSON.parse(result.toString('utf-8'));
+                if (typeof parsed !== 'object' || parsed === null) {
+                    throw new Error('Decompressed content must be a JSON object');
+                }
+                resolve(parsed);
             } catch (parseErr) {
-                reject(new Error("Invalid JSON format in decompressed data"));
+                reject(new Error(`Invalid JSON format: ${parseErr.message}`));
             }
         });
     });
 };

Comment on lines +4 to +22
export const handleGzipRequest = async (req: Request): Promise<void> => {
const buffers: Buffer[] = [];

// Collect data chunks from the request
await new Promise<void>((resolve, reject) => {
req.on("data", (chunk: any) => buffers.push(chunk));
req.on("end", resolve);
req.on("error", reject);
});

// Concatenate and decompress the data
const gzipBuffer = Buffer.concat(buffers);
try {
const decompressedData = await decompressGzip(gzipBuffer);
req.body = decompressedData; // Assign the parsed data to req.body
} catch (err: any) {
throw new Error(`Failed to process Gzip data: ${err.message}`);
}
};

⚠️ Potential issue

Add memory safeguards for buffer collection

The current implementation collects all chunks without size limits, which could lead to memory issues with large requests.

Add a size limit check:

 export const handleGzipRequest = async (req: Request): Promise<void> => {
     const buffers: Buffer[] = [];
+    let totalSize = 0;
+    const MAX_SIZE = 10 * 1024 * 1024; // 10MB limit
 
     // Collect data chunks from the request
     await new Promise<void>((resolve, reject) => {
-        req.on("data", (chunk: any) => buffers.push(chunk));
+        req.on("data", (chunk: Buffer) => {
+            totalSize += chunk.length;
+            if (totalSize > MAX_SIZE) {
+                reject(new Error("Request payload too large"));
+                return;
+            }
+            buffers.push(chunk);
+        });
         req.on("end", resolve);
         req.on("error", reject);
     });

Comment on lines +29 to +31
if (contentType === 'application/gzip') {
await handleGzipRequest(req);
}

🛠️ Refactor suggestion

Improve Gzip request handling

The current implementation assumes content-type exactly matches 'application/gzip'. This should be consistent with the normalized check above.

-    if (contentType === 'application/gzip') {
+    if (normalizedContentType === 'application/gzip') {
       await handleGzipRequest(req);
     }

Comment on lines +22 to 27
if (!contentType || !contentType.split(';').map(part => part.trim()).includes('application/json') && !contentType.split(';').map(part => part.trim()).includes('application/gzip')) {
// If content type is not 'application/json' or 'application/gzip', throw Unsupported Media Type error
let e: any = new Error("Unsupported Media Type: Content-Type should be 'application/json' or 'application/gzip'");
e = Object.assign(e, { status: 415, code: "UNSUPPORTED_MEDIA_TYPE" });
errorResponder(e, req, res, 415)
return;

🛠️ Refactor suggestion

Simplify content-type checking logic

The current content-type checking is complex and hard to read. Consider simplifying it.

-    if (!contentType || !contentType.split(';').map(part => part.trim()).includes('application/json') && !contentType.split(';').map(part => part.trim()).includes('application/gzip')) {
-      // If content type is not 'application/json' or 'application/gzip', throw Unsupported Media Type error
-      let e: any = new Error("Unsupported Media Type: Content-Type should be 'application/json' or 'application/gzip'");
-      e = Object.assign(e, { status: 415, code: "UNSUPPORTED_MEDIA_TYPE" });
-      errorResponder(e, req, res, 415)
+    const normalizedContentType = contentType?.split(';')[0].trim();
+    const allowedTypes = ['application/json', 'application/gzip'];
+    
+    if (!normalizedContentType || !allowedTypes.includes(normalizedContentType)) {
+      const error = Object.assign(
+        new Error(`Unsupported Media Type: Content-Type should be one of: ${allowedTypes.join(', ')}`),
+        { status: 415, code: "UNSUPPORTED_MEDIA_TYPE" }
+      );
+      errorResponder(error, req, res, 415);
       return;
     }
