diff --git a/Outreach/GraceHopperConf/OpenSourceDay2023_Readme b/Outreach/GraceHopperConf/OpenSourceDay2023_Readme new file mode 100644 index 00000000..b851d50b --- /dev/null +++ b/Outreach/GraceHopperConf/OpenSourceDay2023_Readme @@ -0,0 +1,10 @@ +Welcome + +Important Links +[Project Presentation Deck](https://docs.google.com/presentation/d/1CQRv8Hw5KEY1HeH5MZLUo0zfqixx6EAsRFt2hx2QzsE/edit#slide=id.p16) + +About the Project + +Our Goal + +Technology \ No newline at end of file diff --git a/Outreach/Mentorship Sum '23/.DS_Store b/Outreach/Mentorship Sum '23/.DS_Store new file mode 100644 index 00000000..169e8d65 Binary files /dev/null and b/Outreach/Mentorship Sum '23/.DS_Store differ diff --git a/Outreach/Mentorship Sum '23/Alpha Omega Summer Mentorship 2023.md b/Outreach/Mentorship Sum '23/Alpha Omega Summer Mentorship 2023.md new file mode 100644 index 00000000..1bbfc973 --- /dev/null +++ b/Outreach/Mentorship Sum '23/Alpha Omega Summer Mentorship 2023.md @@ -0,0 +1,175 @@ + + +# Alpha Omega Summer Mentorship 2023 + +**Term:** June–August + +**Number of Mentees:** 4 + +**Mentors:** Jonathan Leitschuh and Yesenia Yser + +**Stipend:** $17,500 per mentee + +The Alpha-Omega Summer Mentorship program connects Senior Software Security Engineers on the Alpha-Omega OpenSSF project with newcomers to the open source community, software development, or security research. Mentees joining this program will receive one-on-one education and support over a 12-week period. Mentees will be awarded stipends in two installments: the first paid out at six weeks, the second at the completion of the program. + +Applications for the Summer Mentorship Program opened on April 10, 2023, and have since closed. The program is expected to start on June 1, 2023. + + +# Program Details + +1. [Security Engineer](#a-o-security-engineer-mentee) +2.
[Security Research](#a-o-security-research-mentee) + +--- + + +## A-O Security Engineer Mentee + + +### Relevant Knowledge + +* Languages (in order of priority): Python, Python-Django, CSS, HTML, Redis, Postgres +* Interest in Security, Software Development, Vulnerability Management, DevSecOps, and Security Research + +# Projects + +Below is a list of defined projects to work toward during the Summer 2023 mentorship program. The phases are described at a high level, as the individual tasks depend on the project assigned and on each contributor's experience and interests. + +Overall Goal: Establish communication and connection between the Omega Analyzer and the Triage Portal. +1. Create a method to connect the Omega Analyzer to the Triage Portal (local dev + prod). +2. Engineer an API endpoint on the Triage Portal so the Omega Analyzer can automatically send SARIF files. +3. Implement a vulnerability workflow process within the Triage Portal to assist with the security research triaging lifecycle. + +# Phases + +## All Phases + +General items that are relevant at any point: + +* Attend and participate in relevant OpenSSF working group meetings, including, but not limited to, the Vulnerability Disclosures WG and its SIG Autofix +* Watch recordings of previous years' Black Hat and DEF CON talks for inspiration and continuing education. +* Watch recordings of Linux Foundation training videos on the open source software community, best practices, ethics, and standards. +* Enhance the onboarding, how-to, and offboarding documentation for the tooling and the program. +* Provide a weekly status report to the mentors during 1:1 and team syncs. +* All documentation, visual representations, and GitHub issues are to be shared in the OpenSSF #alpha-omega channel to promote additional collaboration.
+* All demos and recordings will go on the mentee's personal YouTube channel or digital portfolio website. + + +## Phase I: Establish Base Familiarity + +This is a high-level outline of the expected tasks to be completed. This phase will take approximately 6 weeks. + + +* Research + Onboarding (2 weeks) + * Become familiar with project tooling + * Build out tooling on a local workstation, in a GitHub Codespace, or via a cloud provider + * Enhance the onboarding build process and documentation + * Draft initial research on the problem, scope, and areas of research +* Design (2-3 weeks) + * Research the technical details required to solve the problem + * Write a software requirements document for the technical design, including use cases, security requirements, and test requirements + * If applicable, provide a proof of concept or a visual demonstration of the solution + * Provide a status report on a) the problem and the solution; b) what you learned; c) what challenges you faced +* Presentation and Alignment (1 week) + * Gather community feedback via channels outside of the Alpha-Omega team + * Gather feedback from the Alpha-Omega team + * Provide a 30-min presentation on a) the problem and the solution; b) what you learned; c) what challenges you faced; d) what you accomplished + + +## Phase II: Implement Knowledge + +This is a high-level outline of the expected tasks to be completed. This phase will take approximately 6 weeks. The fine-grained details of this phase will become clear during the Design steps of Phase I. + + + +* Implementation (3 weeks) + * Implement the reviewed functionality + * Raise any challenges or blockers to the mentor as soon as possible + * Document how to use the new functionality and update any existing documentation affected by the changes +* Testing (2 weeks) + * Provide test cases based on the software requirements documents, use cases, and security requirements + * Provide documentation on test cases: a)
what is covered; b) areas of testing not covered; c) improvements and future functionality +* Offboarding (1 week) + * Record a demo of the new functionality + * Provide documentation on a) next steps; b) what works; c) what does not work; d) future challenges; e) a fresh perspective + * Provide a 30-min presentation on a) the problem and the solution; b) what you learned; c) what challenges you faced; d) what you accomplished + * Respond to a program feedback survey + * Offboarding call with mentor + + +## A-O Security Research Mentee + +### Relevant Knowledge + +* Languages (in order of priority): Java, Python, Kotlin +* Interest in Security and Security Research + +# Phases + +## All Phases + +General items that are relevant at any point: + +* Attend and participate in relevant OpenSSF working group meetings, including, but not limited to, the Vulnerability Disclosures WG and its SIG Autofix +* Watch recordings of previous years' Black Hat and DEF CON talks for inspiration and continuing education. +* Watch recordings of Linux Foundation training videos on the open source software community, best practices, ethics, and standards. +* Enhance the onboarding, how-to, and offboarding documentation for the tooling and the program. +* Provide a weekly status report to the mentors during 1:1 and team syncs. +* All documentation, visual representations, and GitHub issues are to be shared in the OpenSSF #alpha-omega channel to promote additional collaboration.
+* All demos and recordings will go on the mentee's personal YouTube channel or digital portfolio website. + +## Phase I: Establish Base Familiarity + +Only some of these tasks are expected to be completed. +* OpenRewrite + * Become familiar with OpenRewrite and AST manipulation + * Write a basic OpenRewrite recipe + * Become familiar with OpenRewrite's Control Flow and DataFlow API +* CodeQL + * Set up a CodeQL development environment + * Become familiar with CodeQL's API, including its DataFlow and Control Flow libraries + * Write a simple CodeQL query to find common code patterns +* Omega-Moderne Client + * Become familiar with the codebase +* Documentation + * Enhance the onboarding build process and documentation + * Provide a status report on + * a) the problem and the solution; + * b) what you learned; + * c) what challenges you faced + * Gather community feedback via channels outside of the Alpha-Omega team + * Gather feedback from the Alpha-Omega team + * Provide a 30-min presentation on + * a) the problem and the solution; + * b) what you learned; + * c) what challenges you faced; + * d) what you accomplished + +## Phase II: Implement Knowledge + +Only some of these tasks are expected to be completed. +All of these efforts will need to be more actively collaborative between the mentor and the mentee. + +* CodeQL + * Leverage CodeQL to detect variants of a new or existing vulnerability. + * Contribute the query to the GitHub Security Lab Bug Bounty program +* OpenRewrite + * Expand Control Flow and DataFlow to support multi-file analysis + * Apply DataFlow and Control Flow to write a recipe that detects and remediates a class of vulnerabilities. +* Vulnerability Reporting + * Gain exposure to the vulnerability disclosure process. + * Report a real security vulnerability to an OSS project.
+* Documentation + * Raise any challenges or blockers to the mentor as soon as possible + * Document how to use the new functionality and update any existing documentation affected by the changes + * Record a demo of the new functionality, process, or research + * Provide documentation on + * a) next steps; + * b) what works; + * c) what does not work; + * d) future challenges; + * e) a fresh perspective + * Provide a 30-min presentation on a) the problem and the solution; b) what you learned; c) what challenges you faced; d) what you accomplished + * Respond to a program feedback survey + * Offboarding call with mentor diff --git a/Outreach/Mentorship Sum '23/Meet the Mentors.md b/Outreach/Mentorship Sum '23/Meet the Mentors.md new file mode 100644 index 00000000..b0baf596 --- /dev/null +++ b/Outreach/Mentorship Sum '23/Meet the Mentors.md @@ -0,0 +1,13 @@ + + +# Yesenia Yser +![Yesenia Yser](img/Yesenia.jpg) + + +Yesenia is a Senior Software Security Engineer for the Alpha-Omega OpenSSF project. She holds a Bachelor's degree in Computer Science and a Master's in Digital Forensics. Over the last ten years, her career has been dedicated to servant leadership in the software engineering and cybersecurity realms, pioneering teams and systems for Security Operations, DevSecOps, and Supply Chain Security by executing on the never-been-done. As a cyber big sister, she has founded several Women in Tech organizations, particularly in the South Florida area, and is a co-founder of the Latinas In Cyber organization. In her free time, she is a practitioner and instructor of Brazilian Jiu-Jitsu and is growing the Women's Self-Defense classes at her academy. + + +# Jonathan Leitschuh +![Jonathan Leitschuh](img/Joanthan.jpg) + +Jonathan Leitschuh is a Senior Software Security Researcher currently working for the Open Source Security Foundation (OpenSSF). He is responsible for thousands of automatically generated pull requests fixing vulnerabilities across OSS.
He was the first Dan Kaminsky Fellow and is a former software engineer. Jonathan is best known for his July 2019 bombshell Zoom 0-day vulnerability disclosure. He is among the most prolific OSS researchers on GitHub by advisory credit. He's both a GitHub Star and a GitHub Security Ambassador. In 2019 he championed an industry-wide initiative to get all major artifact servers in the JVM ecosystem to formally decommission support for HTTP in favor of HTTPS only. He has spoken at many conferences, from BSides, ShmooCon, Sec-T, and GitHub Universe to Black Hat and DEF CON. diff --git a/Outreach/Mentorship Sum '23/Mentorship_Overview.md b/Outreach/Mentorship Sum '23/Mentorship_Overview.md new file mode 100644 index 00000000..e884a3d3 --- /dev/null +++ b/Outreach/Mentorship Sum '23/Mentorship_Overview.md @@ -0,0 +1,5 @@ +# Alpha Omega Mentorship Program + +The Alpha Omega Mentorship Program's goal is to accelerate the mission of the Omega half of the Alpha-Omega project. Each term, four entry-level individual contributors (ICs) are requested for the mentorship program. Two of these individuals will assist with software development and research on the Omega toolchain, led by Yesenia, while the remaining two will assist with developing automated remediation of security vulnerabilities and with open source security research, led by Jonathan. The overall goal is to provide a 12-week program for undergraduate and graduate students. The targeted students will be newcomers to the open source community, software development, or security research, with the aim of bringing knowledge and opportunity to underrepresented and DEI groups and increasing the diverse talent pool for future employers.
+ +To apply, please visit the [Linux Foundation Mentorship Dashboard](https://mentorship.lfx.linuxfoundation.org/project/4df8fab8-e11a-4877-8140-c3633099ea24) diff --git a/Outreach/Mentorship Sum '23/Software Requirements - Analyzer.md b/Outreach/Mentorship Sum '23/Software Requirements - Analyzer.md new file mode 100644 index 00000000..6f216c58 --- /dev/null +++ b/Outreach/Mentorship Sum '23/Software Requirements - Analyzer.md @@ -0,0 +1,204 @@ +# Analyzer + +## Objective +- Enhance the analyzer to combine the scan results and assertion report into a single file, while adding functionality to send the file to a local destination or an API instance. + + +## Use Case +1. Support uploading the single SARIF report to the Triage Portal API (local, production, or stdout) +1. Support combining scan results and assertions into a single run and a single SARIF report +1. Design requirements for input improvements to support cadence runs and bulk requests (i.e., scanning 10 OSS projects at once) +1.
Use the analyzer to scan PyPI open source projects and report status + +## Diagram +```mermaid +--- +title: Analyzer Diagram +--- + +stateDiagram-v2 + direction TB + + state AssertFlagCond <<choice>> + state TriagePushCond <<choice>> + + [*] --> Omega_Analyzer: Request bulk package scan + Omega_Analyzer --> AssertFlag + + AssertFlag --> AssertFlagCond + + + AssertFlagCond --> Assertion_Framework: Yes + AssertFlagCond --> TriagePush: No + + Assertion_Framework --> TriagePush + TriagePush --> TriagePushCond + + TriagePushCond --> Triage_Portal: Yes + TriagePushCond --> Local_Computer: No + +``` + + +## Requirements + +### Use Case 1 +- [x] Push 'summary-results.sarif' to the triage portal +- [x] Triage Portal credentials should be supported as environment variables (be sure to change the .env template) +- [x] Triage Portal credentials should be passable as parameters when running the ./runtools.sh command +- Exception handling for the following: + - [x] Triage Portal isn't available + - [x] Perform 3 retry attempts, then default to stdout + - [x] User hasn't supplied enough information to connect to the triage portal +- [x] Errors should be returned to the user via stdout with a standard log message and an HTTP error code, if necessary.
- If an error occurs on the triage portal, the user should get the HTTP code plus the error message +- [x] Should be able to scan all packages (with a focus on compatibility with JavaScript [npm], Go [golang], and Python [PyPI]) + + +### Use Case 2 (goes beyond the scope of the analyzer and is regarded as a completely separate component) +- [ ] Running assertions on a package should be supported via environment variables + - [ ] For frequent and/or cadenced runs, support an assertionReport option to always run the assertion report +- [ ] Running assertions on a package should be supported by passing a parameter when running the ./runtools.sh command +- [ ] The scan and assertion should run at the same time + - [ ] Assume that each will take time (assertion and scan) + - [ ] Prevent "timeouts" as best as possible +- [ ] The assertion report should be included in the final 'summary-results.sarif' file +- [ ] The assertion report schema should have a key-value pair so the triage portal can easily identify the report data required for parsing + - ex. "assertion-results"={} + + + +### Use Case 4 +- [x] Keep results in Google Drive (until the prod triage portal is available) +- [x] Top-level stats on Google Drive +- [x] Scan 10 PyPI packages a week + - [x] Record failures and status of results + - [x] Record time to scan + - [x] Record summary-results.sarif file size + - [x] Record date of scan + +## Bug fixes and additional functionalities implemented +- [./runtools.sh argument and options parsing](https://github.com/ossf/alpha-omega/pull/162) + +- [./runtools.sh dynamic version resolution](https://github.com/ossf/alpha-omega/pull/162) + +- [Add Golang Support](https://github.com/ossf/alpha-omega/pull/210) + +- MD5 collision hash pentesting + +- Analytics on a typical PyPI package and the cost to run + +- [Containerized oaf (Omega Assertion Framework)](https://github.com/ossf/alpha-omega/pull/214) + +- Bulk Scan POC (proof of concept) + +- Bug fixes!
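The Use Case 1 upload behavior specified above (credentials from environment variables, three retry attempts, then defaulting to stdout) can be sketched in Python. This is an illustrative sketch, not the analyzer's actual code: the `TRIAGE_PORTAL_URL`/`TRIAGE_PORTAL_TOKEN` variable names and the bearer-token scheme are assumptions.

```python
import os
import sys
import urllib.request
import urllib.error


def push_sarif(sarif_path, portal_url=None, token=None, retries=3):
    """Upload a SARIF report to the Triage Portal, defaulting to stdout on failure.

    The endpoint URL and credential variable names are hypothetical; the real
    values come from the analyzer's .env template or ./runtools.sh arguments.
    """
    portal_url = portal_url or os.environ.get("TRIAGE_PORTAL_URL")
    token = token or os.environ.get("TRIAGE_PORTAL_TOKEN")

    with open(sarif_path, "rb") as f:
        payload = f.read()

    # Requirement: if the user hasn't supplied enough information to connect,
    # log a standard message and default to stdout.
    if not (portal_url and token):
        print("triage portal credentials missing; writing report to stdout",
              file=sys.stderr)
        sys.stdout.write(payload.decode("utf-8"))
        return False

    request = urllib.request.Request(
        portal_url,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
    # Requirement: perform 3 retry attempts if the portal isn't available.
    for attempt in range(1, retries + 1):
        try:
            with urllib.request.urlopen(request) as response:
                print(f"upload succeeded (HTTP {response.status})")
                return True
        except urllib.error.URLError as exc:
            print(f"attempt {attempt}/{retries} failed: {exc}", file=sys.stderr)

    # All retries exhausted: default to stdout, as required.
    sys.stdout.write(payload.decode("utf-8"))
    return False
```

The same fallback path covers both "portal unavailable" and "insufficient credentials", so the report is never silently dropped.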
+ +## Security Requirements +- [x] Threat: Manipulation of binaries / files + - [x] Remediation: Checksum validation +- [x] Threat: Manipulation of package name / version number + - Remediation: Package and version validation +- :heavy_check_mark: Erroneous data mitigation + - [x] One package per container + + +## Acceptance Criteria +- [x] The analyzer pushes the final SARIF file to the Triage Portal's endpoint +- [x] The analyzer has multiple methods to pass the credentials needed to establish a connection to the Triage Portal + - Environment variables or an argument +- [x] The analyzer should have support for JavaScript [npm] and Python [PyPI] +- [x] The analyzer is able to create a SARIF file with only the scan results +- [x] The analyzer has been tested with the Omega Top 10k list regardless of scan success. +- [x] Testing against the Omega Top 10k list has been documented, covering successes and failures + - Tested on the PyPI packages from the Omega Top 10k, not the complete 10k +- [x] The analyzer is used every week to scan 10 PyPI projects from the Omega Top 10k list +- [x] The results from the scanned PyPI projects are recorded, including scan duration and success +- [x] Send a checksum from the analyzer side + +## Future Improvements (Includes Improvements to the Omega Assertion Framework (oaf)) +- [ ] The analyzer is able to create a SARIF file that includes both the scan results and the assertion report +- [ ] Testing against the full Omega Top 10k list has been documented, covering successes and failures +- Triage push exception handling + - [ ] Handle cases where the Triage Portal does not support the format or there is an issue with the formatting +- [ ] The analyzer has exception handling for the SARIF file upload + - This can be done by adding the `--include` flag to the `curl` command and grepping for successes.
+- [ ] Migrate to a micro-services architecture, with `Omega Analyzer Toolshed (oat)`, `Omega Assertion Framework (oaf)`, and `Omega Fuzzing (ofu)` being separate components that can feed into one place +- [ ] Add documentation on how to run the `oaf` +- [ ] Automatic testing and insight into which language packages do and don't work on the assertion framework + + +## Testing +| Test No | Description | Files | Steps +| :---- | :---- | :---- | :---- +| 1 | Analyzer Build Script | | [Steps](#1) +| 2 | Analyzer Build Script with Flag | | [Steps](#2) +| 3 | Version Resolution | | [Steps](#3) +| 4 | Analyzer Runs Assertion | | [Steps](#4) +| 5 | Failure on Erroneous Version | | [Steps](#5) +| 6 | Failure on Invalid Input Format | | [Steps](#6) +| 7 | .... + + + + + +### 1 +| Steps | Linux Steps | Current Directory +| :----- | :----: | :---- +| Clone Alpha-Omega Repository | `git clone git@github.com:ossf/alpha-omega.git` | . +| Change Directory to omega/analyzer | `cd alpha-omega/omega/analyzer` | ./alpha-omega/omega/analyzer +| Build Container (using build script) | `./build.sh` | ./alpha-omega/omega/analyzer +| Run toolshed container using format | `docker run --rm -it --env-file <.env containing the libraries.io creds> openssf/omega-toolshed pkg:npm/left-pad@latest` | ./alpha-omega/omega/analyzer + + +### 2 +| Steps | Linux Steps | Current Directory +| :----- | :----: | :---- +| Clone Alpha-Omega Repository | `git clone git@github.com:ossf/alpha-omega.git` | .
+| Change Directory to omega/analyzer | `cd alpha-omega/omega/analyzer` | ./alpha-omega/omega/analyzer +| Build Container (using build script with force flag) | `./build.sh -f` | ./alpha-omega/omega/analyzer +| Run toolshed container using format | `docker run --rm -it --env-file <.env containing the libraries.io creds> openssf/omega-toolshed pkg:npm/left-pad@latest` | ./alpha-omega/omega/analyzer + + +### 3 +| Steps | Linux Steps | Current Directory +| :----- | :----: | :---- +| Clone Alpha-Omega Repository | `git clone git@github.com:ossf/alpha-omega.git` | . +| Change Directory to omega/analyzer | `cd alpha-omega/omega/analyzer` | ./alpha-omega/omega/analyzer +| Build Container (using build script) | `./build.sh` | ./alpha-omega/omega/analyzer +| Run toolshed container using format | `docker run --rm -it --env-file <.env containing the libraries.io creds> openssf/omega-toolshed pkg:npm/left-pad@latest` | ./alpha-omega/omega/analyzer +| Verify that @latest resolves to a version (i.e., returns an output directory) | --- | ./alpha-omega/omega/analyzer + + +### 4 +| Steps | Linux Steps | Current Directory +| :----- | :----: | :---- +| Clone Alpha-Omega Repository | `git clone git@github.com:ossf/alpha-omega.git` | . +| Change Directory to omega/analyzer | `cd alpha-omega/omega/analyzer` | ./alpha-omega/omega/analyzer +| Build Container (using build script) | `./build.sh` | ./alpha-omega/omega/analyzer +| Run toolshed container using format | `docker run --rm -it --env-file <.env containing the libraries.io creds> openssf/omega-toolshed pkg:npm/left-pad@latest` | ./alpha-omega/omega/analyzer +| Verify that @latest resolves to a version (i.e., returns an output directory) | --- | ./alpha-omega/omega/analyzer +| Verify that assertion results are within the summary-results.sarif file | `find .
-name 'summary-results.sarif' -exec grep 'assertion-results' {} \;` | ./alpha-omega/omega/analyzer + +- `-v .:/opt/export` is used to mount a volume so content can be copied from the container to the local machine + + +### 5 +| Steps | Linux Steps | Current Directory +| :----- | :----: | :---- +| Clone Alpha-Omega Repository | `git clone git@github.com:ossf/alpha-omega.git` | . +| Change Directory to omega/analyzer | `cd alpha-omega/omega/analyzer` | ./alpha-omega/omega/analyzer +| Build Container (using build script) | `./build.sh` | ./alpha-omega/omega/analyzer +| Run toolshed container using format | `docker run --rm -it --env-file <.env containing the libraries.io creds> openssf/omega-toolshed pkg:npm/left-pad@latest` | ./alpha-omega/omega/analyzer +| Verify that the container fails (outputs: could not find package: Package could not be found, nothing to do) | --- | ./alpha-omega/omega/analyzer + +### 6 +| Steps | Linux Steps | Current Directory +| :----- | :----: | :---- +| Clone Alpha-Omega Repository | `git clone git@github.com:ossf/alpha-omega.git` | .
+| Change Directory to omega/analyzer | `cd alpha-omega/omega/analyzer` | ./alpha-omega/omega/analyzer +| Build Container (using build script) | `./build.sh` | ./alpha-omega/omega/analyzer +| Run toolshed container using format | `docker run --rm -it --env-file <.env containing the libraries.io creds> openssf/omega-toolshed npm/left-pad@latest` | ./alpha-omega/omega/analyzer +| Verify that the container fails execution (Unable to parse Package Url) | --- | ./alpha-omega/omega/analyzer + diff --git a/Outreach/Mentorship Sum '23/Software Requirements - Triage Portal.md b/Outreach/Mentorship Sum '23/Software Requirements - Triage Portal.md new file mode 100644 index 00000000..d48f7337 --- /dev/null +++ b/Outreach/Mentorship Sum '23/Software Requirements - Triage Portal.md @@ -0,0 +1,119 @@ +# Triage Portal +***Objective*** + +Establish functionality that will support the upload of a single SARIF file via an API endpoint and parse an Omega Analyzer SARIF report that contains scan results and the assertion report. + +***Use Cases*** + +1. Support an API endpoint within the Triage Portal that will accept a single SARIF file, subject to the conditions below. + 1. The accepted SARIF file data will be parsed and stored into its respective database tables. + 1. Assertion data must be stored in the assertion table. + 1. Scan results data must be stored in the Findings table. +1. A user must be a trusted party to submit a SARIF file. +1. (Nice to have) Updating the Postgres db must be performed through an API endpoint per table. + +***Diagram*** + +![Triage To Storage Lucid Charts](img/TriageToStorage.png) + + +***Requirements*** + +**Use Case #1** + +- Within the Triage Portal, create a RESTful API (or GraphQL) in the Django framework to accept a SARIF file. + - The user should be trusted via an authentication method (JWT token) + - Return an exception with an HTTP code if the user is not trusted + - "Admin" to enable user checking. +- The endpoint must validate that the upload is a SARIF file. + - Return an exception with an HTTP code if the file is not supported.
+- Limit the size of the SARIF file (to be determined in the design) + - Document the size limitation in the README +- The API should leverage existing Triage Portal functionality to parse the scan results. +- Functionality to parse the assertion report will be required (limit code redundancy and try to reuse the scan parser). +- Check if assertion data is available before parsing. +- Instead of parsing all the data from the assertion file, store only the package name, UUID, the total number of assertions found within the package, and a URL pointing to the assurance assertion webpage that shows that package's assertions in more detail. This prioritizes meaningful, non-repetitive data presentation to researchers. +- If either parsing or storage fails, do not prevent the other from completing, but send an appropriate log/error message to stdout. + - Send a successful HTTP code with an error message notifying "partial success with x failed to store." +- On the UI, add a notification that the SARIF file has been uploaded. +- Design the API to use the upload functionality (like the UI upload); the arguments should be the SARIF file and the package name/version. +- Display the information of the scan result for each individual finding in the Triage Portal Finding page. + - Parse out the key-value pairs for each finding. + - Design the page for accessibility and readability of the scan results data. +- The Triage Portal Wiki allows transitioning a wiki from Closed or Deleted to another status. + - Closed wikis should be shown in the UI. + - Deleted wikis should not be shown in the UI. + - Deleted wikis should be accessible via the URL and their Wiki ID +- In the Wiki page UI, separate the wiki sections into Unresolved Wikis (New, Not specified, Active) and Resolved Wikis (Closed, Resolved) + - The Resolved Wiki section should be collapsible. The default should be collapsed.
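The upload-side checks in the Use Case #1 requirements above (size limit, "is it actually SARIF", and the checksum comparison called for in this document's security requirements) can be sketched as follows. This is a hedged illustration under stated assumptions, not the portal's implementation: the 10 MB limit, function name, and message strings are all hypothetical.

```python
import hashlib
import json

# Illustrative cap only; the real limit is "to be determined in the design".
MAX_SARIF_BYTES = 10 * 1024 * 1024


def validate_upload(raw_bytes, claimed_sha256=None):
    """Validate an uploaded report before parsing; returns (ok, message)."""
    # Size limit, documented in the README per the requirements.
    if len(raw_bytes) > MAX_SARIF_BYTES:
        return False, "413: file exceeds the documented size limit"

    # Integrity: the analyzer sends a checksum field alongside the file;
    # reject the upload if it does not match the calculated checksum.
    if claimed_sha256 is not None:
        actual = hashlib.sha256(raw_bytes).hexdigest()
        if actual != claimed_sha256:
            return False, "400: checksum mismatch; file may be corrupted or tampered with"

    # File-type validation: a SARIF log is a JSON object with top-level
    # "version" and "runs" members.
    try:
        doc = json.loads(raw_bytes)
    except ValueError:
        return False, "415: not valid JSON; only SARIF files are accepted"
    if not isinstance(doc, dict) or "version" not in doc or "runs" not in doc:
        return False, "415: JSON is not a SARIF log (missing 'version'/'runs')"

    return True, "200: accepted"
```

Running the checks in this order (size, checksum, shape) means the cheapest rejection happens first and the JSON parser never sees oversized or corrupted input.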
+ + +***Bug fixes and additional functionalities implemented*** + +- Inside the Tool Defects tab, fixed an error where "assigned to me" and active defects were not being queried (Pull request #72) + - https://github.com/ossf/omega-triage-portal/commit/edf41b2b293391a02a1f2bf31c91d895a14c7e6b#diff-0a4e88ef8789d3ffde7c78b98ade254f8f922d58793ec106838af814a746611e +- Fixed the edit and add wiki state dropdowns in the Wiki tab (Pull request #77) + - https://github.com/ossf/omega-triage-portal/pull/77 +- The server should not crash/exit after a 404 Wiki error (Pull request #77) + - https://github.com/ossf/omega-triage-portal/pull/77 +- Error handling for Closed and Deleted wiki statuses (Pull request #77) + - https://github.com/ossf/omega-triage-portal/pull/77 +- The upload button was not working because OSS Gadget could not download the package; changed to the latest version of OSS Gadget (Pull request #77) + - https://github.com/ossf/omega-triage-portal/pull/77 +- Patched the requirements file due to updates from Dependabot (Pull request #79) + - https://github.com/ossf/omega-triage-portal/pull/79 +- Fixed other package compatibility issues and added saving of the user a finding's tool defect had been assigned to (Pull request #89) + - https://github.com/ossf/omega-triage-portal/pull/89 +- Pull request #103 https://github.com/ossf/omega-triage-portal/pull/103 + - Fixed a Redis connection error + - Implemented the API endpoint for the Triage Portal upload functionality using GraphQL. + - Implemented a UI notification for the status of a SARIF file upload, whether successful or not. + - Added logic for the upload status when uploading files.
+- Changed the azure-core package version for compatibility purposes when doing the build, added a description to a field in the schema, and fixed formatting of other files (Pull request #107) + - https://github.com/ossf/omega-triage-portal/pull/107 +- Pull request #116 https://github.com/ossf/omega-triage-portal/pull/116 + - Created the Assertion model + - Added a migration file + - Made some small changes to the schema + - Inside the SARIF importer, added a check for the assertion_data key + - Inside the SARIF importer, implemented adding assertion details to the database, or updating the fields if the assertion already exists +- Pull request #118 https://github.com/ossf/omega-triage-portal/pull/118 + - Updated the link for putting local environment credentials in the README file + - Added a section about the API documentation to the README file + - Created a new directory called "docs" to store new and future documentation + - Added a database mapping of the Triage Portal database + - Added documentation on how the mapping was done +- Added registration of the AssertionsPerPackage model (Pull request #119) + - https://github.com/ossf/omega-triage-portal/pull/119 + +***Security Requirements*** + +- Make sure that only valid, logged-in users can access the Triage Portal (authentication) +- Users that are part of different groups should have different permissions and access within the Triage Portal (authorization) +- To prevent compromising the data of the SARIF file, the API will support a checksum field and validate that the checksum matches the calculated checksum of the file. + +***Acceptance Criteria*** + +- The Triage Portal has an accessible endpoint to upload SARIF files. +- Exception and error handling has been designed for credential and file validation. +- The Triage Portal uploads and parses the scan results into their own database table. +- The Triage Portal uploads and parses the assertion results into their own database table. +- Exception and error handling has been implemented for parsing errors.
+- The UI displays a modal after the success or failure of a SARIF file upload. +- The Triage Portal should allow any user or group to upload a SARIF file via API or UI. +- The Triage Portal upload endpoint has limitations on the size of SARIF files that can be uploaded. +- The Postgres database has a table for scan results. +- The Postgres database has a table for assertion results. +- The Finding page has been designed for accessibility and readability of the scan results and related data. + +***Future Improvements*** + +- Password management and policy. +- Apply appropriate security measures to protect sensitive data transmitted via the API. +- Validate input received by the API to prevent potential attacks. +- When the portal implements personas, a permissions decorator should be added to the file-upload mutation so that only users with certain permissions can upload a file to the portal. + - https://django-graphql-jwt.domake.io/decorators.html#permission-required +- Connect the package version model with the assertion model. +- Display assertions based on project versions to the user in the UI. +- History tracking can be added for the model when changes are made outside the admin portal. + diff --git a/Outreach/Mentorship Sum '23/Test Cases - Triage Portal.md b/Outreach/Mentorship Sum '23/Test Cases - Triage Portal.md new file mode 100644 index 00000000..55e48185 --- /dev/null +++ b/Outreach/Mentorship Sum '23/Test Cases - Triage Portal.md @@ -0,0 +1,281 @@ +# Functionality Tests: +**1. Sending correct data to the Triage Portal using the POST API endpoint** + +**Preconditions:** + +- Users must have valid credentials to access the API endpoint. +- The Triage Portal API endpoint URL and required payload structure are known. + +**Steps to Test:** + +- Obtain valid authentication credentials for the Triage Portal API. +- Set up the request payload with the necessary data fields for the Triage Portal.
+- Use an API testing tool to send a POST request to the Triage Portal API endpoint, providing the appropriate authentication credentials and payload.
+- Capture and inspect the response from the API endpoint.
+- Validate the response status code to ensure a successful request (200 OK).
+
+**Expected Result:**
+
+- The API request is successful, indicated by a response status code of 200.
+- The Triage Portal processes the sent data correctly, as indicated by the expected values in the response body or headers.
+- Any additional expected behaviors or validations specific to the Triage Portal API are met.
+
+****
+
+**2. Store parsed scan results from SARIF file into the findings table**
+
+**Preconditions:**
+
+- The API endpoint for receiving the SARIF file is available.
+- Access to the database is established.
+
+**Steps to Test:**
+
+- Prepare a SARIF file with sample data to be sent through the API.
+- Send an API request with the necessary data fields for the Triage Portal.
+- Validate that the API successfully receives the SARIF file and extracts the relevant data from it.
+- Verify that the extracted scan results are stored in the findings table of the database, with proper mapping of the data fields.
+- Retrieve the stored data and compare it with the original data from the SARIF file to ensure accurate storage and mapping.
+
+**Expected Result:**
+
+- The API should successfully receive the SARIF file without any errors.
+- The SARIF file should be parsed, and the relevant data should be extracted and stored in the findings table accurately.
+
+****
+
+**3. Checking that the uploaded file is a SARIF file when using the API endpoint**
+
+**Preconditions:**
+
+- The user has valid authentication credentials to access the API endpoint.
+- The Triage Portal API endpoint URL is known.
+
+**Steps to Test:**
+
+- Obtain valid authentication credentials for the Triage Portal API.
+- Prepare a non-SARIF file (e.g., a text file or image file) to be sent through the API for testing.
+- Send a POST request to the Triage Portal API endpoint, including the non-SARIF file in the request payload.
+- Validate that the API endpoint enforces strict file type validation for the SARIF file field.
+- Verify that the API rejects the request and returns an appropriate error response indicating that only SARIF files are allowed.
+- Confirm that the error response contains clear and accurate information about the file type restriction.
+
+**Expected Result:**
+
+- When attempting to send a POST request with a non-SARIF file, the API should reject the request.
+
+# Unit Test Cases:
+**1. Verify that the API returns a 200 response when making a correct request**
+
+    import unittest
+
+    import requests
+
+    class TestPostAPI(unittest.TestCase):
+        def setUp(self):
+            # Set up any necessary data or configurations before each test case
+            self.api_url = "http://example.com/api"  # Replace with actual API endpoint
+
+        def test_post_request_returns_200(self):
+            # Test that the POST request returns a 200 response
+            file_path = "path/to/file.sarif"  # Replace with the actual path to the file
+            payload = {
+                "package_name": "my_package",
+                "checksum": "abcd1234"
+            }
+            # Use a context manager so the file handle is closed after the request
+            with open(file_path, 'rb') as sarif_file:
+                files = {'file': sarif_file}
+                response = requests.post(self.api_url, files=files, data=payload)
+            self.assertEqual(response.status_code, 200)
+
+**2. 
Verify that the API returns an error response when making an invalid request**
+
+    import unittest
+
+    import requests
+
+    class TestPostAPI(unittest.TestCase):
+        def setUp(self):
+            # Set up any necessary data or configurations before each test case
+            self.api_url = "http://example.com/api"  # Replace with actual API endpoint
+
+        def test_post_request_returns_error_for_non_sarif_file(self):
+            # Test that the POST request returns an error response for a non-SARIF file
+            file_path = "path/to/non_sarif_file.txt"  # Replace with the actual path to the file
+            payload = {
+                "package_name": "my_package",
+                "checksum": "abcd1234"
+            }
+            # Use a context manager so the file handle is closed after the request
+            with open(file_path, 'rb') as non_sarif_file:
+                files = {'file': non_sarif_file}
+                response = requests.post(self.api_url, files=files, data=payload)
+            self.assertEqual(response.status_code, 400)
+            self.assertIn("error", response.json())
+
+# User-Interface Tests:
+**1. Successful SARIF File Upload Notification**
+
+**Preconditions:**
+
+- The user is signed into the portal.
+- The user is on the SARIF file upload page.
+- The user has a valid SARIF file ready for upload.
+
+**Steps to Test:**
+
+- Choose the file to upload.
+- Enter the package name in the designated input field.
+- Click on the "Add" button.
+- Wait for the upload process to complete.
+- Observe the page for the presence of a success notification.
+
+**Expected Results:**
+
+- The SARIF file should be uploaded successfully without any errors.
+- A notification should be displayed to inform the user about the successful upload.
+
+****
+
+**2. The Wiki page UI separates the articles into unresolved (New, Not specified, Active) and resolved (Closed, Resolved)**
+
+**Preconditions:**
+
+- The user is logged into the portal.
+- The user has the appropriate permissions to add and view wiki articles.
+
+**Steps to Test:**
+
+- Navigate to the wiki page by clicking on the wiki tab.
+- Fill in the required fields for creating an article, and in the “State” field choose either Closed or Resolved.
+- Click on the "Add" button to add the article.
+- Click on the "View All" button to view the list of wiki articles.
+- Verify that the article created is not listed in the "Unresolved Wiki Articles" table.
+- Locate the "Resolved Wiki Articles" section.
+- Click on the arrow next to the "Resolved Wiki Articles" title to expand the table.
+- Verify that the article created is listed in the "Resolved Wiki Articles" table.
+
+**Expected Result:**
+
+- The resolved or closed article should be displayed under the "Resolved Wiki Articles" table when the corresponding state is selected.
+
+
+# Security Test Case:
+**1. Authorization and Access Control for Sending SARIF File Information**
+
+**Preconditions:**
+
+- The Triage Portal API endpoint URL and required payload structure are known.
+- User authentication and authorization mechanisms are in place.
+
+**Steps to Test:**
+
+- Attempt to send a POST request to the Triage Portal API endpoint without authentication or with invalid credentials, including SARIF file information in the request payload.
+- Verify that the API rejects the request and returns an appropriate unauthorized or forbidden response.
+- Use valid authentication credentials for a non-authorized user (e.g., a user without the proper privileges or role) to send a POST request with SARIF file information to the API endpoint.
+- Verify that the API rejects the request and returns an appropriate unauthorized or forbidden response.
+- Use valid authentication credentials for an authorized user to send a POST request with SARIF file information to the API endpoint.
+- Verify that the API accepts the request and returns a successful response, indicating that the SARIF file information has been sent to the Triage Portal successfully.
+
+**Expected Result:**
+
+- Access control mechanisms should be properly enforced, and unauthorized or non-privileged users should be denied access to the API endpoint.
+
+# Integration Test Cases:
+**1. 
Verify that the Analyzer can successfully upload a SARIF File to the Triage Portal through the API Endpoint**
+
+**Preconditions:**
+
+- The Analyzer and Triage Portal applications are installed and running.
+- The API endpoint for sending data from the Analyzer to the Triage Portal is properly implemented and accessible.
+- The Analyzer application has valid authentication credentials to access the Triage Portal API.
+
+**Steps to Test:**
+
+- Open the Analyzer application and push the necessary information to the Triage Portal.
+- Ensure that the Analyzer application invokes the API endpoint to send the data to the Triage Portal.
+- Monitor the network traffic or API logs to confirm that the data is sent to the correct API endpoint.
+
+**Expected Results:**
+
+- The Analyzer application successfully sends the data to the Triage Portal API endpoint.
+
+****
+
+**2. Verify that the Triage Portal correctly stores the data received from the Analyzer – Assertion Data Available.**
+
+**Preconditions:**
+
+- The Analyzer and Triage Portal applications are installed and running.
+- The API endpoint for sending data from the Analyzer to the Triage Portal is properly implemented and accessible.
+- The Analyzer application has valid authentication credentials to access the Triage Portal API.
+
+**Steps to Test:**
+
+- Open the Analyzer application and push the necessary information, with assertion data available, to the Triage Portal.
+- Access the Triage Portal’s corresponding database.
+- Verify that the parsed assertion report is properly stored in the assertion table, including all relevant details and attributes.
+- Verify that the parsed scan results are properly stored in the findings table, including all relevant details and attributes.
+- Validate the data integrity by comparing key fields between the sent data and the stored data in the Triage Portal.
+
+**Expected Results:**
+
+- The Triage Portal application correctly receives and saves the data sent from the Analyzer.
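
The "validate the data integrity" step above can be sketched as a field-by-field comparison between the submitted SARIF document and the stored row. This is a minimal sketch only: the SARIF paths, the `tool_name`/`result_count` field names, and the stored-row shape are illustrative assumptions, not the portal's actual schema.

```python
def extract_key_fields(sarif: dict) -> dict:
    """Pull the fields the portal is expected to persist from a SARIF document.

    The SARIF paths and field names here are illustrative, not the portal's
    actual schema.
    """
    run = sarif["runs"][0]
    return {
        "tool_name": run["tool"]["driver"]["name"],
        "result_count": len(run.get("results", [])),
    }


def integrity_matches(sent_sarif: dict, stored_row: dict) -> bool:
    """Compare key fields of the sent SARIF data against a stored database row."""
    expected = extract_key_fields(sent_sarif)
    return all(stored_row.get(field) == value for field, value in expected.items())


# Example: a minimal SARIF document and the row the portal hypothetically stored.
sarif_doc = {"runs": [{"tool": {"driver": {"name": "CodeQL"}}, "results": [{}, {}]}]}
stored = {"tool_name": "CodeQL", "result_count": 2}
print(integrity_matches(sarif_doc, stored))  # True when the key fields line up
```

In a real run of this test, `stored` would come from a query against the findings or assertion table rather than a hard-coded dictionary.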
+
+****
+
+**3. Verify that the Triage Portal correctly stores the data received from the Analyzer – No Assertion Data Available.**
+
+**Preconditions:**
+
+- The Analyzer and Triage Portal applications are installed and running.
+- The API endpoint for sending data from the Analyzer to the Triage Portal is properly implemented and accessible.
+- The Analyzer application has valid authentication credentials to access the Triage Portal API.
+
+**Steps to Test:**
+
+- Open the Analyzer application and push the necessary information, with no assertion data available, to the Triage Portal.
+- Access the Triage Portal’s corresponding database.
+- Verify that the data is properly stored in the findings table, including all relevant details and attributes.
+- Validate the data integrity by comparing key fields between the sent data and the stored data in the Triage Portal.
+- Verify that nothing was stored in the assertion table.
+
+**Expected Results:**
+
+- The Triage Portal application correctly receives and saves the data sent from the Analyzer.
+
+****
+
+**4. Analyzer to Triage Portal scalability when processing large data sets or high data volumes.**
+
+**Preconditions:**
+
+- The Analyzer and Triage Portal applications are installed and running.
+- The API endpoint for sending data from the Analyzer to the Triage Portal is properly implemented and accessible.
+- The Analyzer application has valid authentication credentials to access the Triage Portal API.
+
+**Steps to Test:**
+
+- Open the Analyzer application and push the necessary information to the Triage Portal.
+- Repeat the above step multiple times, making sure to include large files, to push data to the Triage Portal.
+- Monitor the performance, response times, or API logs of both the Analyzer and the Triage Portal during the data integration.
+- Verify that the integration remains stable and that the response times are within acceptable limits.
+- Verify that the data is consistently and accurately saved in the Triage Portal. + +**Expected Results:** + +- The integration between the Analyzer and Triage Portal remains stable and performs well, even with large data sets or high data volumes. + +**** + +**5. Unauthenticated User Data Submission Integration Test** + +**Preconditions:** + +- The Analyzer application does not have valid authentication credentials to access the Triage Portal API. +- The Analyzer and Triage Portal applications are installed and running. +- The API endpoint for sending data from the Analyzer to the Triage Portal is properly implemented and accessible. + +**Steps to Test:** + +- Open the Analyzer application and push the necessary information to the Triage Portal. +- Monitor the network traffic or API logs to confirm that the data submission request is made to the Triage Portal API endpoint. +- Verify that the Triage Portal returns an appropriate error response indicating unauthorized access. + +**Expected Results:** + +- The Triage Portal rejects the data submission request and returns an appropriate error response indicating unauthorized access. 
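
The status codes the access-control tests in this document expect can be modeled as a small contract: no credentials yields 401, unrecognized credentials yield 403, and valid credentials yield 200. The token store and function below are hypothetical illustrations of that contract, not the portal's real implementation (the portal itself is Django-based).

```python
from typing import Optional

# Hypothetical credential store; the real portal authenticates through Django.
VALID_TOKENS = {"analyzer-token-123"}


def handle_upload(auth_token: Optional[str]) -> int:
    """Return the HTTP status code the upload endpoint is expected to produce.

    This models the access-control contract the tests above verify, not the
    portal's actual implementation.
    """
    if auth_token is None:
        return 401  # no credentials supplied: unauthenticated
    if auth_token not in VALID_TOKENS:
        return 403  # credentials supplied but not recognized/authorized
    return 200  # authorized: upload accepted


print(handle_upload(None))                  # 401
print(handle_upload("wrong-token"))         # 403
print(handle_upload("analyzer-token-123"))  # 200
```

Keeping the expected codes in one place like this makes it easy for the unauthenticated, non-authorized, and authorized test cases to assert against the same contract.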
+ diff --git a/Outreach/Mentorship Sum '23/img/.DS_Store b/Outreach/Mentorship Sum '23/img/.DS_Store new file mode 100644 index 00000000..c1145c6a Binary files /dev/null and b/Outreach/Mentorship Sum '23/img/.DS_Store differ diff --git a/Outreach/Mentorship Sum '23/img/Joanthan.jpg b/Outreach/Mentorship Sum '23/img/Joanthan.jpg new file mode 100644 index 00000000..08a40956 Binary files /dev/null and b/Outreach/Mentorship Sum '23/img/Joanthan.jpg differ diff --git a/Outreach/Mentorship Sum '23/img/TriageToStorage.png b/Outreach/Mentorship Sum '23/img/TriageToStorage.png new file mode 100644 index 00000000..70e8dbd2 Binary files /dev/null and b/Outreach/Mentorship Sum '23/img/TriageToStorage.png differ diff --git a/Outreach/Mentorship Sum '23/img/Yesenia.jpg b/Outreach/Mentorship Sum '23/img/Yesenia.jpg new file mode 100644 index 00000000..526acf49 Binary files /dev/null and b/Outreach/Mentorship Sum '23/img/Yesenia.jpg differ