From 2e44e04d7bd84695cb68e5cd363cb67abe17af65 Mon Sep 17 00:00:00 2001
From: Assaf Attias <49212512+attiasas@users.noreply.github.com>
Date: Mon, 1 Jan 2024 11:52:51 +0200
Subject: [PATCH 1/4] Initial project structure (#1)
---
.github/ISSUE_TEMPLATE/bug_report.yml | 69 +++++
.github/ISSUE_TEMPLATE/feature_request.md | 20 ++
.github/ISSUE_TEMPLATE/question.md | 8 +
.github/PULL_REQUEST_TEMPLATE.md | 5 +
.github/release.yml | 20 ++
.github/workflows/analysis.yml | 43 +++
.github/workflows/cla.yml | 35 +++
.../workflows/frogbot-scan-pull-request.yml | 47 +++
.github/workflows/frogbot-scan-repository.yml | 36 +++
.github/workflows/test.yml | 69 +++++
.gitignore | 21 ++
LICENSE | 201 ++++++++++++
README.md | 19 ++
cli/cli.go | 15 +
go.mod | 91 ++++++
go.sum | 287 ++++++++++++++++++
jfrogclisecurity.go | 10 +
jfrogclisecurity_test.go | 59 ++++
tests/integration/xray_test.go | 1 +
19 files changed, 1056 insertions(+)
create mode 100644 .github/ISSUE_TEMPLATE/bug_report.yml
create mode 100644 .github/ISSUE_TEMPLATE/feature_request.md
create mode 100644 .github/ISSUE_TEMPLATE/question.md
create mode 100644 .github/PULL_REQUEST_TEMPLATE.md
create mode 100644 .github/release.yml
create mode 100644 .github/workflows/analysis.yml
create mode 100644 .github/workflows/cla.yml
create mode 100644 .github/workflows/frogbot-scan-pull-request.yml
create mode 100644 .github/workflows/frogbot-scan-repository.yml
create mode 100644 .github/workflows/test.yml
create mode 100644 .gitignore
create mode 100644 LICENSE
create mode 100644 cli/cli.go
create mode 100644 go.mod
create mode 100644 go.sum
create mode 100644 jfrogclisecurity.go
create mode 100644 jfrogclisecurity_test.go
create mode 100644 tests/integration/xray_test.go
diff --git a/.github/ISSUE_TEMPLATE/bug_report.yml b/.github/ISSUE_TEMPLATE/bug_report.yml
new file mode 100644
index 00000000..ceb84a6c
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/bug_report.yml
@@ -0,0 +1,69 @@
+---
+name: "🐛 Bug Report"
+description: Create a report to help us improve
+labels: [bug]
+body:
+ - type: textarea
+ id: description
+ attributes:
+ label: Describe the bug
+ description: What is the problem? A clear and concise description of the bug.
+ validations:
+ required: true
+
+ - type: textarea
+ id: current
+ attributes:
+ label: Current behavior
+ description: |
+ Please include full errors, uncaught exceptions, screenshots, and relevant logs.
+ Setting the environment variable JFROG_CLI_LOG_LEVEL="DEBUG" before running the command will provide more log information.
+ validations:
+ required: true
+
+ - type: textarea
+ id: reproduction
+ attributes:
+ label: Reproduction steps
+ description: |
+ Provide steps to reproduce the behavior.
+ validations:
+ required: false
+
+ - type: textarea
+ id: expected
+ attributes:
+ label: Expected behavior
+ description: |
+ What did you expect to happen?
+ validations:
+ required: false
+
+ - type: input
+ id: cli-security-version
+ attributes:
+ label: JFrog CLI-Security version
+ validations:
+ required: true
+
+ - type: input
+ id: cli-version
+ attributes:
+ label: JFrog CLI version (if applicable)
+ description: Can be found by running "jf --version"
+ validations:
+ required: false
+
+ - type: input
+ id: os-version
+ attributes:
+ label: Operating system type and version
+ validations:
+ required: true
+
+ - type: input
+ id: xr-version
+ attributes:
+ label: JFrog Xray version
+ validations:
+ required: false
diff --git a/.github/ISSUE_TEMPLATE/feature_request.md b/.github/ISSUE_TEMPLATE/feature_request.md
new file mode 100644
index 00000000..461ca285
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/feature_request.md
@@ -0,0 +1,20 @@
+---
+name: ⭐️ Feature request
+about: Suggest an idea for this project
+title: ''
+labels: feature request
+assignees: ''
+
+---
+
+**Is your feature request related to a problem? Please describe.**
+A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
+
+**Describe the solution you'd like to see**
+A clear and concise description of the new feature.
+
+**Describe alternatives you've considered**
+If applicable, a clear and concise description of any alternative solutions or features you've considered.
+
+**Additional context**
+Add any other context or screenshots about the feature request here.
diff --git a/.github/ISSUE_TEMPLATE/question.md b/.github/ISSUE_TEMPLATE/question.md
new file mode 100644
index 00000000..960c7f8f
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/question.md
@@ -0,0 +1,8 @@
+---
+name: ❓ Question
+about: Ask a question
+title: ''
+labels: question
+assignees: ''
+
+---
diff --git a/.github/PULL_REQUEST_TEMPLATE.md b/.github/PULL_REQUEST_TEMPLATE.md
new file mode 100644
index 00000000..6082527b
--- /dev/null
+++ b/.github/PULL_REQUEST_TEMPLATE.md
@@ -0,0 +1,5 @@
+- [ ] All [tests](https://github.com/jfrog/jfrog-cli-security#tests) passed. If this feature is not already covered by the tests, I added new tests.
+- [ ] All [static analysis checks](https://github.com/jfrog/jfrog-cli-security/actions/workflows/analysis.yml) passed.
+- [ ] This pull request is on the dev branch.
+- [ ] I used gofmt for formatting the code before submitting the pull request.
+-----
diff --git a/.github/release.yml b/.github/release.yml
new file mode 100644
index 00000000..50210177
--- /dev/null
+++ b/.github/release.yml
@@ -0,0 +1,20 @@
+changelog:
+ exclude:
+ labels:
+ - ignore for release
+ categories:
+ - title: Breaking Changes 🚨
+ labels:
+ - breaking change
+ - title: Exciting New Features 🎉
+ labels:
+ - new feature
+ - title: Improvements 🌱
+ labels:
+ - improvement
+ - title: Bug Fixes 🛠
+ labels:
+ - bug
+ - title: Other Changes 📚
+ labels:
+ - "*"
diff --git a/.github/workflows/analysis.yml b/.github/workflows/analysis.yml
new file mode 100644
index 00000000..1cd20256
--- /dev/null
+++ b/.github/workflows/analysis.yml
@@ -0,0 +1,43 @@
+name: "Static Analysis"
+on:
+ push:
+ branches:
+ - '**'
+ tags-ignore:
+ - '**'
+ pull_request:
+jobs:
+ Static-Check:
+ runs-on: ubuntu-latest
+ steps:
+ - name: Checkout Source
+ uses: actions/checkout@v3
+
+ - name: Install Go
+ uses: actions/setup-go@v3
+ with:
+ go-version: 1.20.x
+
+ - name: Static Code Analysis
+ uses: golangci/golangci-lint-action@v3
+ with:
+ args: |
+ --timeout 5m --out-${NO_FUTURE}format colored-line-number --enable errcheck,gosimple,govet,ineffassign,staticcheck,typecheck,unused,gocritic,asasalint,asciicheck,errchkjson,exportloopref,forcetypeassert,makezero,nilerr,unparam,unconvert,wastedassign,usestdlibvars
+
+
+ Go-Sec:
+ runs-on: ubuntu-latest
+ steps:
+ - name: Checkout Source
+ uses: actions/checkout@v3
+
+ - name: Install Go
+ uses: actions/setup-go@v3
+ with:
+ go-version: 1.20.x
+
+ # Temporarily set version 2.18.0 to workaround https://github.com/securego/gosec/issues/1046
+ - name: Run Gosec Security Scanner
+ uses: securego/gosec@v2.18.0
+ with:
+ args: -exclude G204,G301,G302,G304,G306 -tests -exclude-dir \.*test\.* ./...
diff --git a/.github/workflows/cla.yml b/.github/workflows/cla.yml
new file mode 100644
index 00000000..38183d2d
--- /dev/null
+++ b/.github/workflows/cla.yml
@@ -0,0 +1,35 @@
+name: "CLA Assistant"
+on:
+ # issue_comment triggers this action on each comment on issues and pull requests
+ issue_comment:
+ types: [ created ]
+ pull_request_target:
+ types: [ opened,synchronize ]
+
+jobs:
+ CLAssistant:
+ runs-on: ubuntu-latest
+ steps:
+ - uses: actions-ecosystem/action-regex-match@v2
+ id: sign-or-recheck
+ with:
+ text: ${{ github.event.comment.body }}
+ regex: '\s*(I have read the CLA Document and I hereby sign the CLA)|(recheck)\s*'
+
+ - name: "CLA Assistant"
+ if: ${{ steps.sign-or-recheck.outputs.match != '' || github.event_name == 'pull_request_target' }}
+ # Alpha Release
+ uses: cla-assistant/github-action@v2.3.0
+ env:
+ # Generated and maintained by GitHub
+ GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+ # JFrog organization secret
+ PERSONAL_ACCESS_TOKEN: ${{ secrets.CLA_SIGN_TOKEN }}
+ with:
+ path-to-signatures: 'signed_clas.json'
+ path-to-document: 'https://jfrog.com/cla/'
+ remote-organization-name: 'jfrog'
+ remote-repository-name: 'jfrog-signed-clas'
+ # branch should not be protected
+ branch: 'master'
+ allowlist: bot*
diff --git a/.github/workflows/frogbot-scan-pull-request.yml b/.github/workflows/frogbot-scan-pull-request.yml
new file mode 100644
index 00000000..4afd0486
--- /dev/null
+++ b/.github/workflows/frogbot-scan-pull-request.yml
@@ -0,0 +1,47 @@
+name: "Frogbot Scan Pull Request"
+on:
+ pull_request_target:
+ types: [ opened, synchronize ]
+permissions:
+ pull-requests: write
+ contents: read
+jobs:
+ scan-pull-request:
+ runs-on: ubuntu-latest
+ # A pull request needs to be approved before Frogbot scans it. Any GitHub user who is associated with the
+ # "frogbot" GitHub environment can approve the pull request to be scanned.
+ environment: frogbot
+ steps:
+ - uses: jfrog/frogbot@v2
+ env:
+ JFROG_CLI_LOG_LEVEL: "DEBUG"
+ # [Mandatory]
+ # JFrog platform URL (This functionality requires version 3.29.0 or above of Xray)
+ JF_URL: ${{ secrets.FROGBOT_URL }}
+
+ # [Mandatory if JF_USER and JF_PASSWORD are not provided]
+ # JFrog access token with 'read' permissions on Xray service
+ JF_ACCESS_TOKEN: ${{ secrets.FROGBOT_ACCESS_TOKEN }}
+
+ # [Mandatory]
+ # The GitHub token is automatically generated for the job
+ JF_GIT_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+
+ # [Optional]
+ # Configure the SMTP server to enable Frogbot to send emails with detected secrets in pull request scans.
+ # The SMTP server URL, including the relevant port (Example: smtp.server.com:8080)
+ JF_SMTP_SERVER: ${{ secrets.JF_SMTP_SERVER }}
+
+ # [Mandatory if JF_SMTP_SERVER is set]
+ # The username required for authenticating with the SMTP server.
+ JF_SMTP_USER: ${{ secrets.JF_SMTP_USER }}
+
+ # [Mandatory if JF_SMTP_SERVER is set]
+ # The password associated with the username required for authentication with the SMTP server.
+ JF_SMTP_PASSWORD: ${{ secrets.JF_SMTP_PASSWORD }}
+
+ # [Optional]
+ # List of comma separated email addresses to receive email notifications about secrets
+ # detected during pull request scanning. The notification is also sent to the email set
+ # in the committer git profile regardless of whether this variable is set or not.
+ JF_EMAIL_RECEIVERS: "eco-system@jfrog.com"
\ No newline at end of file
diff --git a/.github/workflows/frogbot-scan-repository.yml b/.github/workflows/frogbot-scan-repository.yml
new file mode 100644
index 00000000..cb41bb51
--- /dev/null
+++ b/.github/workflows/frogbot-scan-repository.yml
@@ -0,0 +1,36 @@
+name: "Frogbot Scan Repository"
+on:
+ workflow_dispatch:
+ schedule:
+ # The repository will be scanned once a day at 00:00 GMT.
+ - cron: "0 0 * * *"
+permissions:
+ contents: write
+ pull-requests: write
+ security-events: write
+jobs:
+ scan-repository:
+ runs-on: ubuntu-latest
+ strategy:
+ matrix:
+ # The repository scanning will be triggered periodically on the following branches.
+ branch: [ "dev" ]
+ steps:
+ - uses: jfrog/frogbot@v2
+ env:
+ JFROG_CLI_LOG_LEVEL: "DEBUG"
+ # [Mandatory]
+ # JFrog platform URL (This functionality requires version 3.29.0 or above of Xray)
+ JF_URL: ${{ secrets.FROGBOT_URL }}
+
+ # [Mandatory if JF_USER and JF_PASSWORD are not provided]
+ # JFrog access token with 'read' permissions on Xray service
+ JF_ACCESS_TOKEN: ${{ secrets.FROGBOT_ACCESS_TOKEN }}
+
+ # [Mandatory]
+ # The GitHub token is automatically generated for the job
+ JF_GIT_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+
+ # [Mandatory]
+ # The name of the branch on which Frogbot will perform the scan
+ JF_GIT_BASE_BRANCH: ${{ matrix.branch }}
\ No newline at end of file
diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml
new file mode 100644
index 00000000..9a9439fa
--- /dev/null
+++ b/.github/workflows/test.yml
@@ -0,0 +1,69 @@
+name: JFrog CLI Security Tests
+on:
+ push:
+ branches:
+ - '**'
+ tags-ignore:
+ - '**'
+ # Triggers the workflow on labeled PRs only.
+ pull_request_target:
+ types: [ labeled ]
+# Ensures that only the latest commit is running for each PR at a time.
+concurrency:
+ group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.sha }}-${{ github.ref }}
+ cancel-in-progress: true
+jobs:
+ test:
+ if: contains(github.event.pull_request.labels.*.name, 'safe to test') || github.event_name == 'push'
+ runs-on: ${{ matrix.os }}-latest
+ strategy:
+ fail-fast: false
+ matrix:
+ os: [ ubuntu, windows, macos ]
+ env:
+ GOPROXY: direct
+ GRADLE_OPTS: -Dorg.gradle.daemon=false
+ JFROG_CLI_LOG_LEVEL: "DEBUG"
+ steps:
+ # Install dependencies
+ - name: Install Go
+ uses: actions/setup-go@v3
+ with:
+ go-version: 1.20.x
+ - name: Install npm
+ uses: actions/setup-node@v3
+ with:
+ node-version: "16"
+ - name: Install Java
+ uses: actions/setup-java@v3
+ with:
+ java-version: "11"
+ distribution: "adopt"
+ - name: Install NuGet
+ uses: nuget/setup-nuget@v1
+ with:
+ nuget-version: 6.x
+ - name: Install dotnet
+ uses: actions/setup-dotnet@v3
+ with:
+ dotnet-version: '6.x'
+ - name: Setup Python3
+ uses: actions/setup-python@v4
+ with:
+ python-version: "3.x"
+ - name: Setup Pipenv
+ run: python -m pip install pipenv
+ - name: Setup Poetry
+ run: python -m pip install poetry
+ - name: Setup Gradle
+ uses: gradle/gradle-build-action@v2
+ with:
+ gradle-version: 7.6
+ # Checkout code
+ - name: Checkout code
+ uses: actions/checkout@v3
+ with:
+ ref: ${{ github.event.pull_request.head.sha }}
+ # Test
+ - name: Run security tests
+ run: go test -v github.com/jfrog/jfrog-cli-security --timeout 0 --race
diff --git a/.gitignore b/.gitignore
new file mode 100644
index 00000000..6ebfa3ed
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1,21 @@
+# IDEs
+.idea
+.vscode
+*.iml
+
+# IOS
+*.DS_Store
+
+# Vim
+*~
+*.swp
+
+# Gradle
+.gradle
+
+# npm build files
+node_modules
+
+# Test files
+tmp
+out
\ No newline at end of file
diff --git a/LICENSE b/LICENSE
new file mode 100644
index 00000000..9c8f3ea0
--- /dev/null
+++ b/LICENSE
@@ -0,0 +1,201 @@
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "{}"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright {yyyy} {name of copyright owner}
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
\ No newline at end of file
diff --git a/README.md b/README.md
index 19863b61..69b10acf 100644
--- a/README.md
+++ b/README.md
@@ -1 +1,20 @@
+
+
# jfrog-cli-security
+[![Scanned by Frogbot](https://raw.github.com/jfrog/frogbot/master/images/frogbot-badge.svg)](https://github.com/jfrog/frogbot#readme)
+
+
+
+| Branch | Status |
+|:------:|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|
+| master | [![Test](https://github.com/jfrog/jfrog-cli-security/actions/workflows/test.yml/badge.svg?branch=master)](https://github.com/jfrog/jfrog-cli-security/actions/workflows/test.yml?query=branch%3Amaster) [![Static Analysis](https://github.com/jfrog/jfrog-cli-security/actions/workflows/analysis.yml/badge.svg?branch=master)](https://github.com/jfrog/jfrog-cli-security/actions/workflows/analysis.yml) |
+| dev | [![Test](https://github.com/jfrog/jfrog-cli-security/actions/workflows/test.yml/badge.svg?branch=dev)](https://github.com/jfrog/jfrog-cli-security/actions/workflows/test.yml?query=branch%3Adev) [![Static Analysis](https://github.com/jfrog/jfrog-cli-security/actions/workflows/analysis.yml/badge.svg?branch=dev)](https://github.com/jfrog/jfrog-cli-security/actions/workflows/analysis.yml) |
+
+## General
+
+**jfrog-cli-security** is a Go module that contains the security code components (Xray, JAS) used by the [JFrog CLI source code](https://github.com/jfrog/jfrog-cli).
+
+## 🫱🏻🫲🏼 Contributions
+
+We welcome pull requests from the community. To help us improve this project, please read
+our [Contribution](./CONTRIBUTING.md) guide.
diff --git a/cli/cli.go b/cli/cli.go
new file mode 100644
index 00000000..bb56ee08
--- /dev/null
+++ b/cli/cli.go
@@ -0,0 +1,15 @@
+package cli
+
+import (
+ "github.com/jfrog/jfrog-cli-core/v2/plugins/components"
+)
+
+func GetJfrogCliSecurityApp() components.App {
+ app := components.CreateApp(
+ "security",
+ "v1.0.0",
+ "JFrog Security CLI embedded plugin",
+ []components.Command{},
+ )
+ return app
+}
diff --git a/go.mod b/go.mod
new file mode 100644
index 00000000..0b286e2a
--- /dev/null
+++ b/go.mod
@@ -0,0 +1,91 @@
+module github.com/jfrog/jfrog-cli-security
+
+go 1.20
+
+require (
+ github.com/jfrog/jfrog-cli-core/v2 v2.46.2
+ github.com/jfrog/jfrog-client-go v1.35.5
+ github.com/stretchr/testify v1.8.4
+)
+
+require (
+ dario.cat/mergo v1.0.0 // indirect
+ github.com/BurntSushi/toml v1.3.2 // indirect
+ github.com/CycloneDX/cyclonedx-go v0.7.2 // indirect
+ github.com/Microsoft/go-winio v0.6.1 // indirect
+ github.com/ProtonMail/go-crypto v0.0.0-20230923063757-afb1ddc0824c // indirect
+ github.com/andybalholm/brotli v1.0.1 // indirect
+ github.com/buger/jsonparser v1.1.1 // indirect
+ github.com/chzyer/readline v1.5.1 // indirect
+ github.com/cloudflare/circl v1.3.3 // indirect
+ github.com/cpuguy83/go-md2man/v2 v2.0.2 // indirect
+ github.com/cyphar/filepath-securejoin v0.2.4 // indirect
+ github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc // indirect
+ github.com/dsnet/compress v0.0.2-0.20210315054119-f66993602bf5 // indirect
+ github.com/emirpasic/gods v1.18.1 // indirect
+ github.com/forPelevin/gomoji v1.1.8 // indirect
+ github.com/fsnotify/fsnotify v1.7.0 // indirect
+ github.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376 // indirect
+ github.com/go-git/go-billy/v5 v5.5.0 // indirect
+ github.com/go-git/go-git/v5 v5.11.0 // indirect
+ github.com/golang-jwt/jwt/v4 v4.5.0 // indirect
+ github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da // indirect
+ github.com/golang/snappy v0.0.2 // indirect
+ github.com/google/uuid v1.5.0 // indirect
+ github.com/gookit/color v1.5.4 // indirect
+ github.com/hashicorp/hcl v1.0.0 // indirect
+ github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99 // indirect
+ github.com/jedib0t/go-pretty/v6 v6.4.0 // indirect
+ github.com/jfrog/build-info-go v1.9.19 // indirect
+ github.com/jfrog/gofrog v1.4.0 // indirect
+ github.com/kevinburke/ssh_config v1.2.0 // indirect
+ github.com/klauspost/compress v1.17.0 // indirect
+ github.com/klauspost/cpuid/v2 v2.2.3 // indirect
+ github.com/klauspost/pgzip v1.2.5 // indirect
+ github.com/magiconair/properties v1.8.7 // indirect
+ github.com/manifoldco/promptui v0.9.0 // indirect
+ github.com/mattn/go-runewidth v0.0.13 // indirect
+ github.com/mholt/archiver/v3 v3.5.1 // indirect
+ github.com/minio/sha256-simd v1.0.1 // indirect
+ github.com/mitchellh/mapstructure v1.5.0 // indirect
+ github.com/nwaples/rardecode v1.1.0 // indirect
+ github.com/owenrumney/go-sarif/v2 v2.3.0 // indirect
+ github.com/pelletier/go-toml/v2 v2.1.0 // indirect
+ github.com/pierrec/lz4/v4 v4.1.2 // indirect
+ github.com/pjbgf/sha1cd v0.3.0 // indirect
+ github.com/pkg/browser v0.0.0-20210911075715-681adbf594b8 // indirect
+ github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2 // indirect
+ github.com/rivo/uniseg v0.4.3 // indirect
+ github.com/russross/blackfriday/v2 v2.1.0 // indirect
+ github.com/sagikazarmark/locafero v0.4.0 // indirect
+ github.com/sagikazarmark/slog-shim v0.1.0 // indirect
+ github.com/sergi/go-diff v1.1.0 // indirect
+ github.com/skeema/knownhosts v1.2.1 // indirect
+ github.com/sourcegraph/conc v0.3.0 // indirect
+ github.com/spf13/afero v1.11.0 // indirect
+ github.com/spf13/cast v1.6.0 // indirect
+ github.com/spf13/pflag v1.0.5 // indirect
+ github.com/spf13/viper v1.18.2 // indirect
+ github.com/subosito/gotenv v1.6.0 // indirect
+ github.com/ulikunitz/xz v0.5.9 // indirect
+ github.com/urfave/cli v1.22.14 // indirect
+ github.com/xanzy/ssh-agent v0.3.3 // indirect
+ github.com/xi2/xz v0.0.0-20171230120015-48954b6210f8 // indirect
+ github.com/xo/terminfo v0.0.0-20210125001918-ca9a967f8778 // indirect
+ go.uber.org/atomic v1.9.0 // indirect
+ go.uber.org/multierr v1.9.0 // indirect
+ golang.org/x/crypto v0.17.0 // indirect
+ golang.org/x/exp v0.0.0-20231226003508-02704c960a9b // indirect
+ golang.org/x/mod v0.14.0 // indirect
+ golang.org/x/net v0.19.0 // indirect
+ golang.org/x/sync v0.5.0 // indirect
+ golang.org/x/sys v0.15.0 // indirect
+ golang.org/x/term v0.15.0 // indirect
+ golang.org/x/text v0.14.0 // indirect
+ golang.org/x/tools v0.16.0 // indirect
+ gopkg.in/ini.v1 v1.67.0 // indirect
+ gopkg.in/warnings.v0 v0.1.2 // indirect
+ gopkg.in/yaml.v3 v3.0.1 // indirect
+)
+
+replace github.com/jfrog/jfrog-cli-core/v2 => github.com/attiasas/jfrog-cli-core/v2 v2.0.0-20231231090348-2b7cc2486048
diff --git a/go.sum b/go.sum
new file mode 100644
index 00000000..3dbe6280
--- /dev/null
+++ b/go.sum
@@ -0,0 +1,287 @@
+dario.cat/mergo v1.0.0 h1:AGCNq9Evsj31mOgNPcLyXc+4PNABt905YmuqPYYpBWk=
+dario.cat/mergo v1.0.0/go.mod h1:uNxQE+84aUszobStD9th8a29P2fMDhsBdgRYvZOxGmk=
+github.com/BurntSushi/toml v1.3.2 h1:o7IhLm0Msx3BaB+n3Ag7L8EVlByGnpq14C4YWiu/gL8=
+github.com/BurntSushi/toml v1.3.2/go.mod h1:CxXYINrC8qIiEnFrOxCa7Jy5BFHlXnUU2pbicEuybxQ=
+github.com/CycloneDX/cyclonedx-go v0.7.2 h1:kKQ0t1dPOlugSIYVOMiMtFqeXI2wp/f5DBIdfux8gnQ=
+github.com/CycloneDX/cyclonedx-go v0.7.2/go.mod h1:K2bA+324+Og0X84fA8HhN2X066K7Bxz4rpMQ4ZhjtSk=
+github.com/Microsoft/go-winio v0.5.2/go.mod h1:WpS1mjBmmwHBEWmogvA2mj8546UReBk4v8QkMxJ6pZY=
+github.com/Microsoft/go-winio v0.6.1 h1:9/kr64B9VUZrLm5YYwbGtUJnMgqWVOdUAXu6Migciow=
+github.com/Microsoft/go-winio v0.6.1/go.mod h1:LRdKpFKfdobln8UmuiYcKPot9D2v6svN5+sAH+4kjUM=
+github.com/ProtonMail/go-crypto v0.0.0-20230923063757-afb1ddc0824c h1:kMFnB0vCcX7IL/m9Y5LO+KQYv+t1CQOiFe6+SV2J7bE=
+github.com/ProtonMail/go-crypto v0.0.0-20230923063757-afb1ddc0824c/go.mod h1:EjAoLdwvbIOoOQr3ihjnSoLZRtE8azugULFRteWMNc0=
+github.com/andybalholm/brotli v1.0.1 h1:KqhlKozYbRtJvsPrrEeXcO+N2l6NYT5A2QAFmSULpEc=
+github.com/andybalholm/brotli v1.0.1/go.mod h1:loMXtMfwqflxFJPmdbJO0a3KNoPuLBgiu3qAvBg8x/Y=
+github.com/anmitsu/go-shlex v0.0.0-20200514113438-38f4b401e2be h1:9AeTilPcZAjCFIImctFaOjnTIavg87rW78vTPkQqLI8=
+github.com/apparentlymart/go-textseg/v13 v13.0.0/go.mod h1:ZK2fH7c4NqDTLtiYLvIkEghdlcqw7yxLeM89kiTRPUo=
+github.com/armon/go-socks5 v0.0.0-20160902184237-e75332964ef5 h1:0CwZNZbxp69SHPdPJAN/hZIm0C4OItdklCFmMRWYpio=
+github.com/attiasas/jfrog-cli-core/v2 v2.0.0-20231231090348-2b7cc2486048 h1:CgwdiO5lMeu9nIa3a4p6FXQ4J2Hw4uRHmjW44mGlirQ=
+github.com/attiasas/jfrog-cli-core/v2 v2.0.0-20231231090348-2b7cc2486048/go.mod h1:l5y34dJhQ0W16o7OrCUjTQdGikoZPKTRI1NKGneoJ0g=
+github.com/bradleyjkemp/cupaloy/v2 v2.8.0 h1:any4BmKE+jGIaMpnU8YgH/I2LPiLBufr6oMMlVBbn9M=
+github.com/buger/jsonparser v1.1.1 h1:2PnMjfWD7wBILjqQbt530v576A/cAbQvEW9gGIpYMUs=
+github.com/buger/jsonparser v1.1.1/go.mod h1:6RYKKt7H4d4+iWqouImQ9R2FZql3VbhNgx27UK13J/0=
+github.com/bwesterb/go-ristretto v1.2.3/go.mod h1:fUIoIZaG73pV5biE2Blr2xEzDoMj7NFEuV9ekS419A0=
+github.com/chzyer/logex v1.1.10/go.mod h1:+Ywpsq7O8HXn0nuIou7OrIPyXbp3wmkHB+jjWRnGsAI=
+github.com/chzyer/logex v1.2.1 h1:XHDu3E6q+gdHgsdTPH6ImJMIp436vR6MPtH8gP05QzM=
+github.com/chzyer/logex v1.2.1/go.mod h1:JLbx6lG2kDbNRFnfkgvh4eRJRPX1QCoOIWomwysCBrQ=
+github.com/chzyer/readline v0.0.0-20180603132655-2972be24d48e/go.mod h1:nSuG5e5PlCu98SY8svDHJxuZscDgtXS6KTTbou5AhLI=
+github.com/chzyer/readline v1.5.1 h1:upd/6fQk4src78LMRzh5vItIt361/o4uq553V8B5sGI=
+github.com/chzyer/readline v1.5.1/go.mod h1:Eh+b79XXUwfKfcPLepksvw2tcLE/Ct21YObkaSkeBlk=
+github.com/chzyer/test v0.0.0-20180213035817-a1ea475d72b1/go.mod h1:Q3SI9o4m/ZMnBNeIyt5eFwwo7qiLfzFZmjNmxjkiQlU=
+github.com/chzyer/test v1.0.0 h1:p3BQDXSxOhOG0P9z6/hGnII4LGiEPOYBhs8asl/fC04=
+github.com/chzyer/test v1.0.0/go.mod h1:2JlltgoNkt4TW/z9V/IzDdFaMTM2JPIi26O1pF38GC8=
+github.com/cloudflare/circl v1.3.3 h1:fE/Qz0QdIGqeWfnwq0RE0R7MI51s0M2E4Ga9kq5AEMs=
+github.com/cloudflare/circl v1.3.3/go.mod h1:5XYMA4rFBvNIrhs50XuiBJ15vF2pZn4nnUKZrLbUZFA=
+github.com/cpuguy83/go-md2man/v2 v2.0.2 h1:p1EgwI/C7NhT0JmVkwCD2ZBK8j4aeHQX2pMHHBfMQ6w=
+github.com/cpuguy83/go-md2man/v2 v2.0.2/go.mod h1:tgQtvFlXSQOSOSIRvRPT7W67SCa46tRHOmNcaadrF8o=
+github.com/cyphar/filepath-securejoin v0.2.4 h1:Ugdm7cg7i6ZK6x3xDF1oEu1nfkyfH53EtKeQYTC3kyg=
+github.com/cyphar/filepath-securejoin v0.2.4/go.mod h1:aPGpWjXOXUn2NCNjFvBE6aRxGGx79pTxQpKOJNYHHl4=
+github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
+github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
+github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc h1:U9qPSI2PIWSS1VwoXQT9A3Wy9MM3WgvqSxFWenqJduM=
+github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
+github.com/dsnet/compress v0.0.2-0.20210315054119-f66993602bf5 h1:iFaUwBSo5Svw6L7HYpRu/0lE3e0BaElwnNO1qkNQxBY=
+github.com/dsnet/compress v0.0.2-0.20210315054119-f66993602bf5/go.mod h1:qssHWj60/X5sZFNxpG4HBPDHVqxNm4DfnCKgrbZOT+s=
+github.com/dsnet/golib v0.0.0-20171103203638-1ea166775780/go.mod h1:Lj+Z9rebOhdfkVLjJ8T6VcRQv3SXugXy999NBtR9aFY=
+github.com/elazarl/goproxy v0.0.0-20230808193330-2592e75ae04a h1:mATvB/9r/3gvcejNsXKSkQ6lcIaNec2nyfOdlTBR2lU=
+github.com/emirpasic/gods v1.18.1 h1:FXtiHYKDGKCW2KzwZKx0iC0PQmdlorYgdFG9jPXJ1Bc=
+github.com/emirpasic/gods v1.18.1/go.mod h1:8tpGGwCnJ5H4r6BWwaV6OrWmMoPhUl5jm/FMNAnJvWQ=
+github.com/forPelevin/gomoji v1.1.8 h1:JElzDdt0TyiUlecy6PfITDL6eGvIaxqYH1V52zrd0qQ=
+github.com/forPelevin/gomoji v1.1.8/go.mod h1:8+Z3KNGkdslmeGZBC3tCrwMrcPy5GRzAD+gL9NAwMXg=
+github.com/frankban/quicktest v1.14.6 h1:7Xjx+VpznH+oBnejlPUj8oUpdxnVs4f8XU8WnHkI4W8=
+github.com/fsnotify/fsnotify v1.7.0 h1:8JEhPFa5W2WU7YfeZzPNqzMP6Lwt7L2715Ggo0nosvA=
+github.com/fsnotify/fsnotify v1.7.0/go.mod h1:40Bi/Hjc2AVfZrqy+aj+yEI+/bRxZnMJyTJwOpGvigM=
+github.com/gliderlabs/ssh v0.3.5 h1:OcaySEmAQJgyYcArR+gGGTHCyE7nvhEMTlYY+Dp8CpY=
+github.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376 h1:+zs/tPmkDkHx3U66DAb0lQFJrpS6731Oaa12ikc+DiI=
+github.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376/go.mod h1:an3vInlBmSxCcxctByoQdvwPiA7DTK7jaaFDBTtu0ic=
+github.com/go-git/go-billy/v5 v5.5.0 h1:yEY4yhzCDuMGSv83oGxiBotRzhwhNr8VZyphhiu+mTU=
+github.com/go-git/go-billy/v5 v5.5.0/go.mod h1:hmexnoNsr2SJU1Ju67OaNz5ASJY3+sHgFRpCtpDCKow=
+github.com/go-git/go-git-fixtures/v4 v4.3.2-0.20231010084843-55a94097c399 h1:eMje31YglSBqCdIqdhKBW8lokaMrL3uTkpGYlE2OOT4=
+github.com/go-git/go-git/v5 v5.11.0 h1:XIZc1p+8YzypNr34itUfSvYJcv+eYdTnTvOZ2vD3cA4=
+github.com/go-git/go-git/v5 v5.11.0/go.mod h1:6GFcX2P3NM7FPBfpePbpLd21XxsgdAt+lKqXmCUiUCY=
+github.com/golang-jwt/jwt/v4 v4.5.0 h1:7cYmW1XlMY7h7ii7UhUyChSgS5wUJEnm9uZVTGqOWzg=
+github.com/golang-jwt/jwt/v4 v4.5.0/go.mod h1:m21LjoU+eqJr34lmDMbreY2eSTRJ1cv77w39/MY0Ch0=
+github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da h1:oI5xCqsCo564l8iNU+DwB5epxmsaqB+rhGL0m5jtYqE=
+github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da/go.mod h1:cIg4eruTrX1D+g88fzRXU5OdNfaM+9IcxsU14FzY7Hc=
+github.com/golang/protobuf v1.3.1/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
+github.com/golang/protobuf v1.3.4/go.mod h1:vzj43D7+SQXF/4pzW/hwtAqwc6iTitCiVSaWz5lYuqw=
+github.com/golang/snappy v0.0.2 h1:aeE13tS0IiQgFjYdoL8qN3K1N2bXXtI6Vi51/y7BpMw=
+github.com/golang/snappy v0.0.2/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
+github.com/google/go-cmp v0.3.1/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
+github.com/google/go-cmp v0.5.5/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
+github.com/google/go-cmp v0.6.0 h1:ofyhxvXcZhMsU5ulbFiLKl/XBFqE1GSq7atu8tAmTRI=
+github.com/google/uuid v1.5.0 h1:1p67kYwdtXjb0gL0BPiP1Av9wiZPo5A8z2cWkTZ+eyU=
+github.com/google/uuid v1.5.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
+github.com/gookit/color v1.5.4 h1:FZmqs7XOyGgCAxmWyPslpiok1k05wmY3SJTytgvYFs0=
+github.com/gookit/color v1.5.4/go.mod h1:pZJOeOS8DM43rXbp4AZo1n9zCU2qjpcRko0b6/QJi9w=
+github.com/hashicorp/hcl v1.0.0 h1:0Anlzjpi4vEasTeNFn2mLJgTSwt0+6sfsiTG8qcWGx4=
+github.com/hashicorp/hcl v1.0.0/go.mod h1:E5yfLk+7swimpb2L/Alb/PJmXilQ/rhwaUYs4T20WEQ=
+github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99 h1:BQSFePA1RWJOlocH6Fxy8MmwDt+yVQYULKfN0RoTN8A=
+github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99/go.mod h1:1lJo3i6rXxKeerYnT8Nvf0QmHCRC1n8sfWVwXF2Frvo=
+github.com/jedib0t/go-pretty/v6 v6.4.0 h1:YlI/2zYDrweA4MThiYMKtGRfT+2qZOO65ulej8GTcVI=
+github.com/jedib0t/go-pretty/v6 v6.4.0/go.mod h1:MgmISkTWDSFu0xOqiZ0mKNntMQ2mDgOcwOkwBEkMDJI=
+github.com/jfrog/build-info-go v1.9.19 h1:tFPR0Je+ETLXcJqa7UrICkSjwc27zeY06AoWaMYPdQI=
+github.com/jfrog/build-info-go v1.9.19/go.mod h1:DBxqvz1N/uI9iI/1gkCfjKjOrlcCzQ3hiKXqtKJUrrY=
+github.com/jfrog/gofrog v1.4.0 h1:s7eysVnmIBfVheMs4LPU43MAlxwPa4K8u2N5h7kwzXA=
+github.com/jfrog/gofrog v1.4.0/go.mod h1:AQo5Fq0G9nDEF6icH7MYQK0iohR4HuEAXl8jaxRuT6Q=
+github.com/jfrog/jfrog-client-go v1.35.5 h1:1QlrXdMhGi099Cs3mVKIpeVre2w1DiYhU7WGSEH2gQU=
+github.com/jfrog/jfrog-client-go v1.35.5/go.mod h1:Leua+MdhCV+M4gl746PcTsHF8dDP7+LLJ/NgHCTl/Fo=
+github.com/kevinburke/ssh_config v1.2.0 h1:x584FjTGwHzMwvHx18PXxbBVzfnxogHaAReU4gf13a4=
+github.com/kevinburke/ssh_config v1.2.0/go.mod h1:CT57kijsi8u/K/BOFA39wgDQJ9CxiF4nAY/ojJ6r6mM=
+github.com/klauspost/compress v1.4.1/go.mod h1:RyIbtBH6LamlWaDj8nUwkbUhJ87Yi3uG0guNDohfE1A=
+github.com/klauspost/compress v1.11.4/go.mod h1:aoV0uJVorq1K+umq18yTdKaF57EivdYsUV+/s2qKfXs=
+github.com/klauspost/compress v1.17.0 h1:Rnbp4K9EjcDuVuHtd0dgA4qNuv9yKDYKK1ulpJwgrqM=
+github.com/klauspost/compress v1.17.0/go.mod h1:ntbaceVETuRiXiv4DpjP66DpAtAGkEQskQzEyD//IeE=
+github.com/klauspost/cpuid v1.2.0/go.mod h1:Pj4uuM528wm8OyEC2QMXAi2YiTZ96dNQPGgoMS4s3ek=
+github.com/klauspost/cpuid/v2 v2.2.3 h1:sxCkb+qR91z4vsqw4vGGZlDgPz3G7gjaLyK3V8y70BU=
+github.com/klauspost/cpuid/v2 v2.2.3/go.mod h1:RVVoqg1df56z8g3pUjL/3lE5UfnlrJX8tyFgg4nqhuY=
+github.com/klauspost/pgzip v1.2.5 h1:qnWYvvKqedOF2ulHpMG72XQol4ILEJ8k2wwRl/Km8oE=
+github.com/klauspost/pgzip v1.2.5/go.mod h1:Ch1tH69qFZu15pkjo5kYi6mth2Zzwzt50oCQKQE9RUs=
+github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
+github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
+github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
+github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
+github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
+github.com/magiconair/properties v1.8.7 h1:IeQXZAiQcpL9mgcAe1Nu6cX9LLw6ExEHKjN0VQdvPDY=
+github.com/magiconair/properties v1.8.7/go.mod h1:Dhd985XPs7jluiymwWYZ0G4Z61jb3vdS329zhj2hYo0=
+github.com/manifoldco/promptui v0.9.0 h1:3V4HzJk1TtXW1MTZMP7mdlwbBpIinw3HztaIlYthEiA=
+github.com/manifoldco/promptui v0.9.0/go.mod h1:ka04sppxSGFAtxX0qhlYQjISsg9mR4GWtQEhdbn6Pgg=
+github.com/mattn/go-runewidth v0.0.13 h1:lTGmDsbAYt5DmK6OnoV7EuIF1wEIFAcxld6ypU4OSgU=
+github.com/mattn/go-runewidth v0.0.13/go.mod h1:Jdepj2loyihRzMpdS35Xk/zdY8IAYHsh153qUoGf23w=
+github.com/mholt/archiver/v3 v3.5.1 h1:rDjOBX9JSF5BvoJGvjqK479aL70qh9DIpZCl+k7Clwo=
+github.com/mholt/archiver/v3 v3.5.1/go.mod h1:e3dqJ7H78uzsRSEACH1joayhuSyhnonssnDhppzS1L4=
+github.com/minio/sha256-simd v1.0.1 h1:6kaan5IFmwTNynnKKpDHe6FWHohJOHhCPchzK49dzMM=
+github.com/minio/sha256-simd v1.0.1/go.mod h1:Pz6AKMiUdngCLpeTL/RJY1M9rUuPMYujV5xJjtbRSN8=
+github.com/mitchellh/mapstructure v1.5.0 h1:jeMsZIYE/09sWLaz43PL7Gy6RuMjD2eJVyuac5Z2hdY=
+github.com/mitchellh/mapstructure v1.5.0/go.mod h1:bFUtVrKA4DC2yAKiSyO/QUcy7e+RRV2QTWOzhPopBRo=
+github.com/nwaples/rardecode v1.1.0 h1:vSxaY8vQhOcVr4mm5e8XllHWTiM4JF507A0Katqw7MQ=
+github.com/nwaples/rardecode v1.1.0/go.mod h1:5DzqNKiOdpKKBH87u8VlvAnPZMXcGRhxWkRpHbbfGS0=
+github.com/onsi/gomega v1.27.10 h1:naR28SdDFlqrG6kScpT8VWpu1xWY5nJRCF3XaYyBjhI=
+github.com/owenrumney/go-sarif v1.1.1/go.mod h1:dNDiPlF04ESR/6fHlPyq7gHKmrM0sHUvAGjsoh8ZH0U=
+github.com/owenrumney/go-sarif/v2 v2.3.0 h1:wP5yEpI53zr0v5cBmagXzLbHZp9Oylyo3AJDpfLBITs=
+github.com/owenrumney/go-sarif/v2 v2.3.0/go.mod h1:MSqMMx9WqlBSY7pXoOZWgEsVB4FDNfhcaXDA1j6Sr+w=
+github.com/pelletier/go-toml/v2 v2.1.0 h1:FnwAJ4oYMvbT/34k9zzHuZNrhlz48GB3/s6at6/MHO4=
+github.com/pelletier/go-toml/v2 v2.1.0/go.mod h1:tJU2Z3ZkXwnxa4DPO899bsyIoywizdUvyaeZurnPPDc=
+github.com/pierrec/lz4/v4 v4.1.2 h1:qvY3YFXRQE/XB8MlLzJH7mSzBs74eA2gg52YTk6jUPM=
+github.com/pierrec/lz4/v4 v4.1.2/go.mod h1:gZWDp/Ze/IJXGXf23ltt2EXimqmTUXEy0GFuRQyBid4=
+github.com/pjbgf/sha1cd v0.3.0 h1:4D5XXmUUBUl/xQ6IjCkEAbqXskkq/4O7LmGn0AqMDs4=
+github.com/pjbgf/sha1cd v0.3.0/go.mod h1:nZ1rrWOcGJ5uZgEEVL1VUM9iRQiZvWdbZjkKyFzPPsI=
+github.com/pkg/browser v0.0.0-20210911075715-681adbf594b8 h1:KoWmjvw+nsYOo29YJK9vDA65RGE3NrOnUtO7a+RF9HU=
+github.com/pkg/browser v0.0.0-20210911075715-681adbf594b8/go.mod h1:HKlIX3XHQyzLZPlr7++PzdhaXEj94dEiJgZDTsxEqUI=
+github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
+github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
+github.com/pkg/profile v1.6.0/go.mod h1:qBsxPvzyUincmltOk6iyRVxHYg4adc0OFOv72ZdLa18=
+github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
+github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2 h1:Jamvg5psRIccs7FGNTlIRMkT8wgtp5eCXdBlqhYGL6U=
+github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
+github.com/rivo/uniseg v0.2.0/go.mod h1:J6wj4VEh+S6ZtnVlnTBMWIodfgj8LQOQFoIToxlJtxc=
+github.com/rivo/uniseg v0.4.3 h1:utMvzDsuh3suAEnhH0RdHmoPbU648o6CvXxTx4SBMOw=
+github.com/rivo/uniseg v0.4.3/go.mod h1:FN3SvrM+Zdj16jyLfmOkMNblXMcoc8DfTHruCPUcx88=
+github.com/rogpeppe/go-internal v1.11.0 h1:cWPaGQEPrBb5/AsnsZesgZZ9yb1OQ+GOISoDNXVBh4M=
+github.com/russross/blackfriday/v2 v2.1.0 h1:JIOH55/0cWyOuilr9/qlrm0BSXldqnqwMsf35Ld67mk=
+github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=
+github.com/sagikazarmark/locafero v0.4.0 h1:HApY1R9zGo4DBgr7dqsTH/JJxLTTsOt7u6keLGt6kNQ=
+github.com/sagikazarmark/locafero v0.4.0/go.mod h1:Pe1W6UlPYUk/+wc/6KFhbORCfqzgYEpgQ3O5fPuL3H4=
+github.com/sagikazarmark/slog-shim v0.1.0 h1:diDBnUNK9N/354PgrxMywXnAwEr1QZcOr6gto+ugjYE=
+github.com/sagikazarmark/slog-shim v0.1.0/go.mod h1:SrcSrq8aKtyuqEI1uvTDTK1arOWRIczQRv+GVI1AkeQ=
+github.com/sergi/go-diff v1.1.0 h1:we8PVUC3FE2uYfodKH/nBHMSetSfHDR6scGdBi+erh0=
+github.com/sergi/go-diff v1.1.0/go.mod h1:STckp+ISIX8hZLjrqAeVduY0gWCT9IjLuqbuNXdaHfM=
+github.com/sirupsen/logrus v1.7.0/go.mod h1:yWOB1SBYBC5VeMP7gHvWumXLIWorT60ONWic61uBYv0=
+github.com/skeema/knownhosts v1.2.1 h1:SHWdIUa82uGZz+F+47k8SY4QhhI291cXCpopT1lK2AQ=
+github.com/skeema/knownhosts v1.2.1/go.mod h1:xYbVRSPxqBZFrdmDyMmsOs+uX1UZC3nTN3ThzgDxUwo=
+github.com/sourcegraph/conc v0.3.0 h1:OQTbbt6P72L20UqAkXXuLOj79LfEanQ+YQFNpLA9ySo=
+github.com/sourcegraph/conc v0.3.0/go.mod h1:Sdozi7LEKbFPqYX2/J+iBAM6HpqSLTASQIKqDmF7Mt0=
+github.com/spf13/afero v1.11.0 h1:WJQKhtpdm3v2IzqG8VMqrr6Rf3UYpEF239Jy9wNepM8=
+github.com/spf13/afero v1.11.0/go.mod h1:GH9Y3pIexgf1MTIWtNGyogA5MwRIDXGUr+hbWNoBjkY=
+github.com/spf13/cast v1.6.0 h1:GEiTHELF+vaR5dhz3VqZfFSzZjYbgeKDpBxQVS4GYJ0=
+github.com/spf13/cast v1.6.0/go.mod h1:ancEpBxwJDODSW/UG4rDrAqiKolqNNh2DX3mk86cAdo=
+github.com/spf13/pflag v1.0.5 h1:iy+VFUOCP1a+8yFto/drg2CJ5u0yRoB7fZw3DKv/JXA=
+github.com/spf13/pflag v1.0.5/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
+github.com/spf13/viper v1.18.2 h1:LUXCnvUvSM6FXAsj6nnfc8Q2tp1dIgUfY9Kc8GsSOiQ=
+github.com/spf13/viper v1.18.2/go.mod h1:EKmWIqdnk5lOcmR72yw6hS+8OPYcwD0jteitLMVB+yk=
+github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
+github.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw=
+github.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo=
+github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
+github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
+github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=
+github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
+github.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
+github.com/stretchr/testify v1.7.4/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=
+github.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=
+github.com/stretchr/testify v1.8.4 h1:CcVxjf3Q8PM0mHUKJCdn+eZZtm5yQwehR5yeSVQQcUk=
+github.com/stretchr/testify v1.8.4/go.mod h1:sz/lmYIOXD/1dqDmKjjqLyZ2RngseejIcXlSw2iwfAo=
+github.com/subosito/gotenv v1.6.0 h1:9NlTDc1FTs4qu0DDq7AEtTPNw6SVm7uBMsUCUjABIf8=
+github.com/subosito/gotenv v1.6.0/go.mod h1:Dk4QP5c2W3ibzajGcXpNraDfq2IrhjMIvMSWPKKo0FU=
+github.com/terminalstatic/go-xsd-validate v0.1.5 h1:RqpJnf6HGE2CB/lZB1A8BYguk8uRtcvYAPLCF15qguo=
+github.com/ulikunitz/xz v0.5.8/go.mod h1:nbz6k7qbPmH4IRqmfOplQw/tblSgqTqBwxkY0oWt/14=
+github.com/ulikunitz/xz v0.5.9 h1:RsKRIA2MO8x56wkkcd3LbtcE/uMszhb6DpRf+3uwa3I=
+github.com/ulikunitz/xz v0.5.9/go.mod h1:nbz6k7qbPmH4IRqmfOplQw/tblSgqTqBwxkY0oWt/14=
+github.com/urfave/cli v1.22.14 h1:ebbhrRiGK2i4naQJr+1Xj92HXZCrK7MsyTS/ob3HnAk=
+github.com/urfave/cli v1.22.14/go.mod h1:X0eDS6pD6Exaclxm99NJ3FiCDRED7vIHpx2mDOHLvkA=
+github.com/vmihailenco/msgpack/v4 v4.3.12/go.mod h1:gborTTJjAo/GWTqqRjrLCn9pgNN+NXzzngzBKDPIqw4=
+github.com/vmihailenco/tagparser v0.1.1/go.mod h1:OeAg3pn3UbLjkWt+rN9oFYB6u/cQgqMEUPoW2WPyhdI=
+github.com/xanzy/ssh-agent v0.3.3 h1:+/15pJfg/RsTxqYcX6fHqOXZwwMP+2VyYWJeWM2qQFM=
+github.com/xanzy/ssh-agent v0.3.3/go.mod h1:6dzNDKs0J9rVPHPhaGCukekBHKqfl+L3KghI1Bc68Uw=
+github.com/xeipuuv/gojsonpointer v0.0.0-20180127040702-4e3ac2762d5f h1:J9EGpcZtP0E/raorCMxlFGSTBrsSlaDGf3jU/qvAE2c=
+github.com/xeipuuv/gojsonreference v0.0.0-20180127040603-bd5ef7bd5415 h1:EzJWgHovont7NscjpAxXsDA8S8BMYve8Y5+7cuRE7R0=
+github.com/xeipuuv/gojsonschema v1.2.0 h1:LhYJRs+L4fBtjZUfuSZIKGeVu0QRy8e5Xi7D17UxZ74=
+github.com/xi2/xz v0.0.0-20171230120015-48954b6210f8 h1:nIPpBwaJSVYIxUFsDv3M8ofmx9yWTog9BfvIu0q41lo=
+github.com/xi2/xz v0.0.0-20171230120015-48954b6210f8/go.mod h1:HUYIGzjTL3rfEspMxjDjgmT5uz5wzYJKVo23qUhYTos=
+github.com/xo/terminfo v0.0.0-20210125001918-ca9a967f8778 h1:QldyIu/L63oPpyvQmHgvgickp1Yw510KJOqX7H24mg8=
+github.com/xo/terminfo v0.0.0-20210125001918-ca9a967f8778/go.mod h1:2MuV+tbUrU1zIOPMxZ5EncGwgmMJsa+9ucAQZXxsObs=
+github.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=
+github.com/zclconf/go-cty v1.10.0/go.mod h1:vVKLxnk3puL4qRAv72AO+W99LUD4da90g3uUAzyuvAk=
+go.uber.org/atomic v1.9.0 h1:ECmE8Bn/WFTYwEW/bpKD3M8VtR/zQVbavAoalC1PYyE=
+go.uber.org/atomic v1.9.0/go.mod h1:fEN4uk6kAWBTFdckzkM89CLk9XfWZrxpCo0nPH17wJc=
+go.uber.org/multierr v1.9.0 h1:7fIwc/ZtS0q++VgcfqFDxSBZVv/Xo49/SYnDFupUwlI=
+go.uber.org/multierr v1.9.0/go.mod h1:X2jQV1h+kxSjClGpnseKVIxpmcjrj7MNnI0bnlfKTVQ=
+golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
+golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
+golang.org/x/crypto v0.0.0-20220622213112-05595931fe9d/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=
+golang.org/x/crypto v0.3.1-0.20221117191849-2c476679df9a/go.mod h1:hebNnKkNXi2UzZN1eVRvBB7co0a+JxK6XbPiWVs/3J4=
+golang.org/x/crypto v0.7.0/go.mod h1:pYwdfH91IfpZVANVyUOhSIPZaFoJGxTFbZhFTx+dXZU=
+golang.org/x/crypto v0.17.0 h1:r8bRNjWL3GshPW3gkd+RpvzWrZAwPS49OmTGZ/uhM4k=
+golang.org/x/crypto v0.17.0/go.mod h1:gCAAfMLgwOJRpTjQ2zCCt2OcSfYMTeZVSRtQlPC7Nq4=
+golang.org/x/exp v0.0.0-20231226003508-02704c960a9b h1:kLiC65FbiHWFAOu+lxwNPujcsl8VYyTYYEZnsOO1WK4=
+golang.org/x/exp v0.0.0-20231226003508-02704c960a9b/go.mod h1:iRJReGqOEeBhDZGkGbynYwcHlctCvnjTYIamk7uXpHI=
+golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=
+golang.org/x/mod v0.8.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
+golang.org/x/mod v0.14.0 h1:dGoOF9QVLYng8IHTm7BAyWqCqSheQ5pYWGhzW00YJr0=
+golang.org/x/mod v0.14.0/go.mod h1:hTbmBsO62+eylJbnUtE2MGJUyE7QWk4xUqPFrRgJ+7c=
+golang.org/x/net v0.0.0-20190603091049-60506f45cf65/go.mod h1:HSz+uSET+XFnRR8LxR5pz3Of3rY3CfYBVs4xY44aLks=
+golang.org/x/net v0.0.0-20190620200207-3b0461eec859/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
+golang.org/x/net v0.0.0-20200301022130-244492dfa37a/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
+golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=
+golang.org/x/net v0.0.0-20211112202133-69e39bad7dc2/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
+golang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=
+golang.org/x/net v0.2.0/go.mod h1:KqCZLdyyvdV855qA2rE3GC2aiw5xGR5TEjj8smXukLY=
+golang.org/x/net v0.6.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
+golang.org/x/net v0.8.0/go.mod h1:QVkue5JL9kW//ek3r6jTKnTFis1tRmNAW2P1shuFdJc=
+golang.org/x/net v0.19.0 h1:zTwKpTd2XuCqf8huc7Fo2iSy+4RHPd10s4KzeTnVr1c=
+golang.org/x/net v0.19.0/go.mod h1:CfAk/cbD4CthTvqiEl8NpboMuiuOYsAr/7NOjZJtv1U=
+golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
+golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
+golang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
+golang.org/x/sync v0.5.0 h1:60k92dhOjHxJkrqnwsfl8KuaHbn/5dl0lUPUklKo3qE=
+golang.org/x/sync v0.5.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
+golang.org/x/sys v0.0.0-20181122145206-62eef0e2fa9b/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
+golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
+golang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20191026070338-33540a1f6037/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20210124154548-22da62e12c0c/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20210423082822-04245dca01da/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.0.0-20210616045830-e2b7044e8c71/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.0.0-20220310020820-b874c991c1a5/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.0.0-20220704084225-05e143d24a9e/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.2.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.3.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.15.0 h1:h48lPFYpsTvQJZF4EKyI4aLHaev3CxivZmv7yZig9pc=
+golang.org/x/sys v0.15.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
+golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
+golang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
+golang.org/x/term v0.2.0/go.mod h1:TVmDHMZPmdnySmBfhjOoOdhjzdE1h4u1VwSiw2l1Nuc=
+golang.org/x/term v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=
+golang.org/x/term v0.6.0/go.mod h1:m6U89DPEgQRMq3DNkDClhWw02AUbt2daBVO4cn4Hv9U=
+golang.org/x/term v0.15.0 h1:y/Oo/a/q3IXu26lQgl04j/gjuBDOBlx7X6Om1j2CPW4=
+golang.org/x/term v0.15.0/go.mod h1:BDl952bC7+uMoWR75FIrCDx79TPU9oHkTZ9yRbYOrX0=
+golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
+golang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=
+golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
+golang.org/x/text v0.3.5/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
+golang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
+golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
+golang.org/x/text v0.4.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=
+golang.org/x/text v0.7.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=
+golang.org/x/text v0.8.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=
+golang.org/x/text v0.14.0 h1:ScX5w1eTa3QqT8oi6+ziP7dTV1S2+ALU0bI+0zXKWiQ=
+golang.org/x/text v0.14.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=
+golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
+golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=
+golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
+golang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=
+golang.org/x/tools v0.16.0 h1:GO788SKMRunPIBCXiQyo2AaexLstOrVhuAL5YwsckQM=
+golang.org/x/tools v0.16.0/go.mod h1:kYVVN6I1mBNoB1OX+noeBjbRk4IUEPa7JJ+TJMEooJ0=
+golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
+golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
+google.golang.org/appengine v1.6.5/go.mod h1:8WjMMxjGQR8xUklV/ARdw2HLXBOI7O7uCIDZVag1xfc=
+gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
+gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
+gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
+gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntNwaWcugrBjAiHlqqRiVk=
+gopkg.in/ini.v1 v1.67.0 h1:Dgnx+6+nfE+IfzjUEISNeydPJh9AXNNsWbGP9KzCsOA=
+gopkg.in/ini.v1 v1.67.0/go.mod h1:pNLf8WUiyNEtQjuu5G5vTm06TEv9tsIgeAvK8hOrP4k=
+gopkg.in/warnings.v0 v0.1.2 h1:wFXVbFY8DY5/xOe1ECiWdKCzZlxgshcYVNkBHstARME=
+gopkg.in/warnings.v0 v0.1.2/go.mod h1:jksf8JmL6Qr/oQM2OXTHunEvvTAsrWBLb6OOjuVWRNI=
+gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
+gopkg.in/yaml.v2 v2.2.4/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
+gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
+gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
+gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
+gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
diff --git a/jfrogclisecurity.go b/jfrogclisecurity.go
new file mode 100644
index 00000000..791f860f
--- /dev/null
+++ b/jfrogclisecurity.go
@@ -0,0 +1,10 @@
+package main
+
+import (
+ "github.com/jfrog/jfrog-cli-core/v2/plugins"
+ "github.com/jfrog/jfrog-cli-security/cli"
+)
+
+func main() {
+ plugins.PluginMain(cli.GetJfrogCliSecurityApp())
+}
diff --git a/jfrogclisecurity_test.go b/jfrogclisecurity_test.go
new file mode 100644
index 00000000..39b14ccd
--- /dev/null
+++ b/jfrogclisecurity_test.go
@@ -0,0 +1,59 @@
+package main
+
+import (
+ "flag"
+ "fmt"
+ "os"
+ "testing"
+
+ "github.com/stretchr/testify/assert"
+
+ coreTests "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
+ clientTests "github.com/jfrog/jfrog-client-go/utils/tests"
+
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/log"
+ clientLog "github.com/jfrog/jfrog-client-go/utils/log"
+)
+
+const (
+ CliIntegrationTests = "github.com/jfrog/jfrog-cli-security/tests/integration"
+)
+
+func TestMain(m *testing.M) {
+ setupTests()
+ result := m.Run()
+
+ os.Exit(result)
+}
+
+func TestUnitTests(t *testing.T) {
+ // Create temp jfrog home
+ cleanUpJfrogHome, err := coreTests.SetJfrogHome()
+ if err != nil {
+ clientLog.Error(err)
+ os.Exit(1)
+ }
+ // Clean from previous tests.
+ defer cleanUpJfrogHome()
+
+ packages := clientTests.GetTestPackages("./...")
+ packages = clientTests.ExcludeTestsPackage(packages, CliIntegrationTests)
+ assert.NoError(t, clientTests.RunTests(packages, false))
+}
+
+func setupTests() {
+ // Disable usage report.
+ if err := os.Setenv(coreutils.ReportUsage, "false"); err != nil {
+ clientLog.Error(fmt.Sprintf("Couldn't set env: %s. Error: %s", coreutils.ReportUsage, err.Error()))
+ os.Exit(1)
+ }
+ // Disable progress bar and confirmation messages.
+ if err := os.Setenv(coreutils.CI, "true"); err != nil {
+ clientLog.Error(fmt.Sprintf("Couldn't set env: %s. Error: %s", coreutils.CI, err.Error()))
+ os.Exit(1)
+ }
+ // General
+ flag.Parse()
+ log.SetDefaultLogger()
+}
diff --git a/tests/integration/xray_test.go b/tests/integration/xray_test.go
new file mode 100644
index 00000000..76ab1b72
--- /dev/null
+++ b/tests/integration/xray_test.go
@@ -0,0 +1 @@
+package integration
From 6e6767706b4472a6c833042c4215206dbd8efb4e Mon Sep 17 00:00:00 2001
From: Assaf Attias <49212512+attiasas@users.noreply.github.com>
Date: Mon, 1 Jan 2024 14:28:49 +0200
Subject: [PATCH 2/4] add removeLabel action for tests (#2)
---
.github/PULL_REQUEST_TEMPLATE.md | 10 ++++++----
.github/workflows/removeLabel.yml | 18 ++++++++++++++++++
.github/workflows/test.yml | 8 ++++++--
3 files changed, 30 insertions(+), 6 deletions(-)
create mode 100644 .github/workflows/removeLabel.yml
diff --git a/.github/PULL_REQUEST_TEMPLATE.md b/.github/PULL_REQUEST_TEMPLATE.md
index 6082527b..24485f50 100644
--- a/.github/PULL_REQUEST_TEMPLATE.md
+++ b/.github/PULL_REQUEST_TEMPLATE.md
@@ -1,5 +1,7 @@
-- [ ] All [tests](https://github.com/jfrog/jfrog-cli-security#tests) passed. If this feature is not already covered by the tests, I added new tests.
+- [ ] The pull request is targeting the `dev` branch.
+- [ ] The code has been validated to compile successfully by running `go vet ./...`.
+- [ ] The code has been formatted properly using `go fmt ./...`.
- [ ] All [static analysis checks](https://github.com/jfrog/jfrog-cli-security/actions/workflows/analysis.yml) passed.
-- [ ] This pull request is on the dev branch.
-- [ ] I used gofmt for formatting the code before submitting the pull request.
------
+- [ ] All [tests](https://github.com/jfrog/jfrog-cli-security/actions/workflows/test.yml) have passed. If this feature is not already covered by the tests, new tests have been added.
+
+-----
\ No newline at end of file
diff --git a/.github/workflows/removeLabel.yml b/.github/workflows/removeLabel.yml
new file mode 100644
index 00000000..67be7e8d
--- /dev/null
+++ b/.github/workflows/removeLabel.yml
@@ -0,0 +1,18 @@
+name: Remove Label
+on:
+ pull_request_target:
+ types: [labeled]
+# Ensure that only the latest commit runs for each PR at a time.
+concurrency:
+ group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.sha }}-${{ github.ref }}
+ cancel-in-progress: true
+jobs:
+ Remove-Label:
+ if: contains(github.event.pull_request.labels.*.name, 'safe to test')
+ name: Remove label
+ runs-on: ubuntu-latest
+ steps:
+ - name: Remove 'safe to test'
+ uses: actions-ecosystem/action-remove-labels@v1
+ with:
+ labels: "safe to test"
diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml
index 9a9439fa..00a3be52 100644
--- a/.github/workflows/test.yml
+++ b/.github/workflows/test.yml
@@ -65,5 +65,9 @@ jobs:
with:
ref: ${{ github.event.pull_request.head.sha }}
# Test
- - name: Run security tests
- run: go test -v github.com/jfrog/jfrog-cli-security --timeout 0 --race
+ - name: Run security tests (without Docker Scan)
+ run: go test -v github.com/jfrog/jfrog-cli-security --timeout 0 --test.security --jfrog.url=${{ secrets.PLATFORM_URL }} --jfrog.adminToken=${{ secrets.PLATFORM_ADMIN_TOKEN }} --jfrog.user=${{ secrets.PLATFORM_USER }} --test.containerRegistry=${{ secrets.CONTAINER_REGISTRY }} --ci.runId=${{ runner.os }}-xray
+ if: ${{ matrix.os != 'ubuntu' }}
+ - name: Run security tests (with Docker Scan, only on Ubuntu)
+ run: go test -v github.com/jfrog/jfrog-cli-security --timeout 0 --test.security --test.dockerScan --jfrog.url=${{ secrets.PLATFORM_URL }} --jfrog.adminToken=${{ secrets.PLATFORM_ADMIN_TOKEN }} --test.containerRegistry=${{ secrets.CONTAINER_REGISTRY }} --ci.runId=${{ runner.os }}-xray
+ if: ${{ matrix.os == 'ubuntu' }}
From d44be676f906c6b536fac7e81c4a8311bd93ad04 Mon Sep 17 00:00:00 2001
From: Assaf Attias <49212512+attiasas@users.noreply.github.com>
Date: Thu, 18 Jan 2024 13:06:00 +0200
Subject: [PATCH 3/4] Add Security content from core and CLI (#3)
---
.github/workflows/test.yml | 1 -
README.md | 5 +-
artifactory_test.go | 244 +
audit_test.go | 492 +
cli/cli.go | 12 +-
cli/docs/auditspecific/help.go | 31 +
cli/docs/flags.go | 226 +
cli/docs/scan/audit/help.go | 5 +
cli/docs/scan/buildscan/help.go | 20 +
cli/docs/scan/curation/help.go | 5 +
cli/docs/scan/dockerscan/help.go | 18 +
cli/docs/scan/scan/help.go | 16 +
cli/docs/xray/curl/help.go | 11 +
cli/docs/xray/offlineupdate/help.go | 5 +
cli/scancommands.go | 474 +
cli/xraycommands.go | 188 +
commands/audit/audit.go | 196 +
commands/audit/auditparams.go | 105 +
.../jas/applicability/applicabilitymanager.go | 195 +
.../applicabilitymanager_test.go | 336 +
commands/audit/jas/common.go | 277 +
commands/audit/jas/common_test.go | 90 +
commands/audit/jas/commons_test.go | 129 +
commands/audit/jas/iac/iacscanner.go | 103 +
commands/audit/jas/iac/iacscanner_test.go | 83 +
commands/audit/jas/sast/sastscanner.go | 147 +
commands/audit/jas/sast/sastscanner_test.go | 156 +
commands/audit/jas/secrets/secretsscanner.go | 121 +
.../audit/jas/secrets/secretsscanner_test.go | 132 +
.../jas/testdata/.jfrog/jfrog-apps-config.yml | 50 +
commands/audit/jasrunner.go | 60 +
commands/audit/jasrunner_test.go | 44 +
commands/audit/sca/common.go | 152 +
commands/audit/sca/common_test.go | 227 +
commands/audit/sca/go/gloang_test.go | 79 +
commands/audit/sca/go/golang.go | 115 +
commands/audit/sca/npm/npm.go | 140 +
commands/audit/sca/npm/npm_test.go | 124 +
commands/audit/sca/nuget/nuget.go | 236 +
commands/audit/sca/nuget/nuget_test.go | 148 +
commands/audit/sca/python/python.go | 282 +
commands/audit/sca/python/python_test.go | 146 +
commands/audit/sca/yarn/yarn.go | 218 +
commands/audit/sca/yarn/yarn_test.go | 101 +
commands/audit/scarunner.go | 298 +
commands/audit/scarunner_test.go | 271 +
commands/curation/curationaudit.go | 573 +
commands/curation/curationaudit_test.go | 582 +
commands/scan/buildscan.go | 168 +
commands/scan/dockerscan.go | 145 +
commands/scan/downloadindexer.go | 193 +
commands/scan/downloadindexer_test.go | 59 +
commands/scan/scan.go | 459 +
commands/xray/curl/curl.go | 17 +
commands/xray/offlineupdate/offlineupdate.go | 463 +
.../xray/offlineupdate/offlineupdate_test.go | 157 +
formats/conversion.go | 196 +
formats/simplejsonapi.go | 123 +
formats/table.go | 137 +
go.mod | 50 +-
go.sum | 104 +-
jfrogclisecurity_test.go | 47 +-
scangraph/params.go | 59 +
scangraph/scangraph.go | 108 +
scangraph/scangraph_test.go | 161 +
scans_test.go | 355 +
tests/config.go | 66 +
tests/consts.go | 182 +
tests/integration/xray_test.go | 1 -
.../docker_local_repository_config.json | 6 +
.../docker_remote_repository_config.json | 7 +
.../docker_virtual_repository_config.json | 8 +
.../go_local_repository_config.json | 5 +
.../go_remote_repository_config.json | 7 +
.../go_virtual_repository_config.json | 6 +
.../gradle_remote_repository_config.json | 6 +
.../maven_remote_repository_config.json | 6 +
.../npm_remote_repository_config.json | 6 +
.../nuget_remote_repository_config.json | 9 +
.../pypi_remote_repository_config.json | 6 +
.../repo1_repository_config.json | 5 +
.../specs_virtual_repository_config.json | 6 +
.../yarn_remote_repository_config.json | 6 +
.../applicable-cve-results.sarif | 84 +
.../applicability-scan/empty-results.sarif | 29 +
.../no-applicable-cves-results.sarif | 121 +
.../contains-iac-violations-working-dir.sarif | 669 +
.../iac-scan/contains-iac-violations.sarif | 129 +
.../other/iac-scan/no-violations.sarif | 30 +
tests/testdata/other/npm/dependencies.json | 447 +
tests/testdata/other/nuget/dependencies.json | 194 +
tests/testdata/other/nuget/expectedTree.json | 92 +
.../sast-scan/contains-sast-violations.sarif | 907 +
.../other/sast-scan/no-violations.sarif | 28 +
.../other/secrets-scan/contain-secrets.sarif | 234 +
.../other/secrets-scan/no-secrets.sarif | 29 +
...uild-info-extractor-maven3-2.20.0-uber.jar | Bin 0 -> 8974067 bytes
.../jas-config/.jfrog/jfrog-apps-config.yml | 11 +
.../jas/jas-config/iac/azure/vpc/module.tf | 116 +
.../jas/jas-config/iac/azure/vpc/outputs.tf | 79 +
.../jas/jas-config/iac/azure/vpc/variables.tf | 39 +
.../jas/jas-config/iac/azure/vpc/versions.tf | 4 +
.../jas/jas-config/iac/azure/vpc_pp/module.tf | 34 +
.../jas-config/iac/azure/vpc_pp/outputs.tf | 62 +
.../jas-config/iac/azure/vpc_pp/variables.tf | 40 +
.../jas-config/iac/azure/vpc_pp/versions.tf | 4 +
.../iac/gcp/k8s-oss/files/chk_k8s_nat | 17 +
.../jas/jas-config/iac/gcp/k8s-oss/module.tf | 158 +
.../jas/jas-config/iac/gcp/k8s-oss/outputs.tf | 54 +
.../jas-config/iac/gcp/k8s-oss/variables.tf | 102 +
.../jas-config/iac/gcp/k8s-oss/versions.tf | 4 +
.../gcp/k8s-pipelines-bp/files/chk_k8s_nat | 17 +
.../iac/gcp/k8s-pipelines-bp/module.tf | 207 +
.../iac/gcp/k8s-pipelines-bp/outputs.tf | 54 +
.../iac/gcp/k8s-pipelines-bp/rbac.tf | 61 +
.../iac/gcp/k8s-pipelines-bp/variables.tf | 137 +
.../iac/gcp/k8s-pipelines-bp/versions.tf | 4 +
.../testdata/projects/jas/jas-config/main.py | 5 +
.../projects/jas/jas-config/requirements.txt | 2 +
.../jas-config/sast/flask_webgoat/__init__.py | 52 +
.../jas/jas-config/sast/flask_webgoat/ui.py | 25 +
.../projects/jas/jas-config/sast/result.sarif | 618 +
.../projects/jas/jas-config/sast/run.py | 15 +
.../jas/jas-config/secrets/more_secrets/key | 7 +
.../jas-config/secrets/more_secrets/sequence | 2 +
.../secrets/secret_generic/blacklist | 9 +
.../secrets/secret_generic/gibberish | 10 +
.../jas/jas-test/iac/azure/vpc/module.tf | 116 +
.../jas/jas-test/iac/azure/vpc/outputs.tf | 79 +
.../jas/jas-test/iac/azure/vpc/variables.tf | 39 +
.../jas/jas-test/iac/azure/vpc/versions.tf | 4 +
.../jas/jas-test/iac/azure/vpc_pp/module.tf | 34 +
.../jas/jas-test/iac/azure/vpc_pp/outputs.tf | 62 +
.../jas-test/iac/azure/vpc_pp/variables.tf | 40 +
.../jas/jas-test/iac/azure/vpc_pp/versions.tf | 4 +
.../iac/gcp/k8s-oss/files/chk_k8s_nat | 17 +
.../jas/jas-test/iac/gcp/k8s-oss/module.tf | 158 +
.../jas/jas-test/iac/gcp/k8s-oss/outputs.tf | 54 +
.../jas/jas-test/iac/gcp/k8s-oss/variables.tf | 102 +
.../jas/jas-test/iac/gcp/k8s-oss/versions.tf | 4 +
.../gcp/k8s-pipelines-bp/files/chk_k8s_nat | 17 +
.../iac/gcp/k8s-pipelines-bp/module.tf | 207 +
.../iac/gcp/k8s-pipelines-bp/outputs.tf | 54 +
.../jas-test/iac/gcp/k8s-pipelines-bp/rbac.tf | 61 +
.../iac/gcp/k8s-pipelines-bp/variables.tf | 137 +
.../iac/gcp/k8s-pipelines-bp/versions.tf | 4 +
tests/testdata/projects/jas/jas-test/main.py | 5 +
.../projects/jas/jas-test/requirements.txt | 2 +
.../jas-test/sast/flask_webgoat/__init__.py | 52 +
.../jas/jas-test/sast/flask_webgoat/ui.py | 25 +
.../projects/jas/jas-test/sast/result.sarif | 618 +
.../projects/jas/jas-test/sast/run.py | 15 +
.../jas/jas-test/secrets/more_secrets/key | 7 +
.../jas-test/secrets/more_secrets/sequence | 2 +
.../jas-test/secrets/secret_generic/blacklist | 9 +
.../jas-test/secrets/secret_generic/gibberish | 10 +
.../ClassLibrary1/ClassLibrary1.csproj | 23 +
.../SharedProject1/SharedProject1.projitems | 13 +
.../SharedProject1/SharedProject1.shproj | 13 +
.../dotnet-multi/TestApp1/TestApp1.csproj | 41 +
.../dotnet/dotnet-multi/TestSolution.sln | 31 +
.../dotnet/dotnet-single/Class1.cs | 6 +
.../dotnet/dotnet-single/dotnet-single.csproj | 16 +
.../package-managers/go/go-project/go.mod.txt | 7 +
.../package-managers/go/go-project/go.sum.txt | 8 +
.../go/go-project/test.go.txt | 10 +
.../package-managers/go/simple-project/go.mod | 9 +
.../package-managers/go/simple-project/go.sum | 6 +
.../go/simple-project/hello.go | 11 +
.../.jfrog/projects/gradle.yaml | 4 +
.../gradle-example-config/api/build.gradle | 15 +
.../main/java/org/gradle/api/PersonList.java | 38 +
.../src/main/java/org/gradle/api/package.html | 19 +
.../main/java/org/gradle/apiImpl/Impl.java | 26 +
.../gradle/gradle-example-config/build.gradle | 131 +
.../gradle-example-config/gradle.properties | 3 +
.../gradle/wrapper/gradle-wrapper.jar | Bin 0 -> 55616 bytes
.../gradle/wrapper/gradle-wrapper.properties | 5 +
.../gradle/gradle-example-config/gradlew | 188 +
.../gradle/gradle-example-config/gradlew.bat | 100 +
.../services/webservice/build.gradle | 7 +
.../java/org/gradle/webservice/TestTest.java | 35 +
.../org/gradle/webservice/TestTestTest.java | 31 +
.../gradle-example-config/settings.gradle | 1 +
.../main/java/org/gradle/shared/Person.java | 42 +
.../java/org/gradle/shared/package-info.java | 20 +
.../org/gradle/shared/main.properties | 17 +
.../gradle/gradle/api/build.gradle | 15 +
.../main/java/org/gradle/api/PersonList.java | 38 +
.../src/main/java/org/gradle/api/package.html | 19 +
.../main/java/org/gradle/apiImpl/Impl.java | 26 +
.../gradle/gradle/build.gradle | 33 +
.../gradle/gradle/wrapper/gradle-wrapper.jar | Bin 0 -> 55616 bytes
.../gradle/wrapper/gradle-wrapper.properties | 5 +
.../package-managers/gradle/gradle/gradlew | 188 +
.../gradle/gradle/gradlew.bat | 100 +
.../gradle/services/webservice/build.gradle | 8 +
.../java/org/gradle/webservice/TestTest.java | 35 +
.../org/gradle/webservice/TestTestTest.java | 31 +
.../gradle/gradle/settings.gradle | 1 +
.../main/java/org/gradle/shared/Person.java | 42 +
.../java/org/gradle/shared/package-info.java | 20 +
.../org/gradle/shared/main.properties | 16 +
.../gradle/gradleproject/build.gradle | 25 +
.../gradle/gradleproject/settings.gradle | 1 +
.../maven/artifactory-maven-plugin/pom.xml | 327 +
.../org/jfrog/buildinfo/ArtifactoryMojo.java | 174 +
.../main/java/org/jfrog/buildinfo/Config.java | 57 +
.../buildinfo/deployment/BuildDeployer.java | 152 +
.../BuildInfoModelPropertyResolver.java | 100 +
.../deployment/BuildInfoRecorder.java | 465 +
.../resolution/RepositoryListener.java | 85 +
.../resolution/ResolutionRepoHelper.java | 122 +
.../buildinfo/types/ModuleArtifacts.java | 51 +
.../utils/ArtifactoryMavenLogger.java | 50 +
.../java/org/jfrog/buildinfo/utils/Utils.java | 223 +
.../resources/META-INF/plexus/components.xml | 14 +
.../.mvn/wrapper/MavenWrapperDownloader.java | 117 +
.../.mvn/wrapper/maven-wrapper.jar | Bin 0 -> 50710 bytes
.../.mvn/wrapper/maven-wrapper.properties | 2 +
.../maven-example-with-wrapper/multi1/pom.xml | 89 +
.../main/java/artifactory/test/Multi1.java | 10 +
.../test/java/artifactory/test/AppTest.java | 38 +
.../classes/artifactory/test/Multi1.class | Bin 0 -> 556 bytes
.../compile/default-compile/createdFiles.lst | 1 +
.../compile/default-compile/inputFiles.lst | 1 +
.../default-testCompile/createdFiles.lst | 1 +
.../default-testCompile/inputFiles.lst | 1 +
.../artifactory/test/AppTest.class | Bin 0 -> 611 bytes
.../maven-example-with-wrapper/multi2/pom.xml | 19 +
.../src/main/java/artifactory/test/App.java | 13 +
.../test/java/artifactory/test/AppTest.java | 38 +
.../target/classes/artifactory/test/App.class | Bin 0 -> 547 bytes
.../compile/default-compile/createdFiles.lst | 1 +
.../compile/default-compile/inputFiles.lst | 1 +
.../default-testCompile/createdFiles.lst | 1 +
.../default-testCompile/inputFiles.lst | 1 +
.../artifactory/test/AppTest.class | Bin 0 -> 611 bytes
.../maven-example-with-wrapper/multi3/pom.xml | 76 +
.../main/java/artifactory/test/Multi3.java | 11 +
.../multi3/src/main/webapp/WEB-INF/web.xml | 9 +
.../test/java/artifactory/test/AppTest.java | 38 +
.../classes/artifactory/test/Multi3.class | Bin 0 -> 602 bytes
.../compile/default-compile/createdFiles.lst | 1 +
.../compile/default-compile/inputFiles.lst | 1 +
.../default-testCompile/createdFiles.lst | 1 +
.../default-testCompile/inputFiles.lst | 1 +
.../artifactory/test/AppTest.class | Bin 0 -> 611 bytes
.../maven/maven-example-with-wrapper/mvnw | 310 +
.../maven/maven-example-with-wrapper/mvnw.cmd | 182 +
.../maven/maven-example-with-wrapper/pom.xml | 73 +
.../maven-example/.jfrog/projects/maven.yaml | 6 +
.../maven/maven-example/multi1/pom.xml | 74 +
.../maven/maven-example/multi2/pom.xml | 19 +
.../maven/maven-example/multi3/pom.xml | 77 +
.../maven/maven-example/pom.xml | 31 +
.../package-managers/maven/maven/pom.xml | 41 +
.../maven/maven/target/build-info.json | 55 +
.../maven/mavenproject/pom.xml | 47 +
.../npm/npm-no-lock/package.json | 17 +
.../npm/npm-project/.jfrog/jfrog-cli.conf.v6 | 12 +
.../npm/npm-project/.jfrog/projects/npm.yaml | 5 +
.../npm/npm-project/package-lock.json | 30 +
.../npm/npm-project/package.json | 15 +
.../npm/npm-scripts/package.json | 7 +
.../npm/npm/package-lock.json | 49 +
.../package-managers/npm/npm/package.json | 17 +
.../multi/ClassLibrary1/ClassLibrary1.csproj | 22 +
.../SharedProject1/SharedProject1.projitems | 13 +
.../SharedProject1/SharedProject1.shproj | 13 +
.../nuget/multi/TestApp1/TestApp1.csproj | 40 +
.../nuget/multi/TestSolution.sln | 31 +
.../nuget/single4.0/core/Multi1.cs | 16 +
.../single4.0/core/Properties/AssemblyInfo.cs | 36 +
.../nuget/single4.0/core/core.csproj | 59 +
.../nuget/single4.0/core/core.nuspec | 13 +
.../nuget/single4.0/core/packages.config | 6 +
.../nuget/single4.0/example.sln | 28 +
.../ClassLibrary1/ClassLibrary1.csproj | 22 +
.../nuget/single5.0/TestSolution.sln | 29 +
.../python/pip/pip-project/requirements.txt | 2 +
.../python/pip/pip-project/setup.py | 14 +
.../pip/requirementsproject/requirements.txt | 1 +
.../python/pip/pip/setuppyproject/setup.py | 9 +
.../python/pipenv/pipenv-project/Pipfile | 14 +
.../python/pipenv/pipenv/pipenv.yaml | 5 +
.../pipenv/pipenv/pipenvproject/Pipfile | 13 +
.../my_poetry_project/__init__.py | 1 +
.../poetry/my-poetry-project/poetry.lock | 169 +
.../poetry/my-poetry-project/pyproject.toml | 16 +
.../my-poetry-project/tests/__init__.py | 0
.../tests/test_my_poetry_project.py | 5 +
.../python/poetry/poetry-project/poetry.lock | 64 +
.../poetry/poetry-project/pyproject.toml | 17 +
.../python/poetry/poetry/hello.py | 4 +
.../python/poetry/poetry/pyproject.toml | 13 +
.../.yarn/releases/yarn-1.22.21.cjs | 147513 +++++++++++++++
.../yarn/yarn-project/.yarnrc | 5 +
.../yarn/yarn-project/.yarnrc.yml | 3 +
.../yarn/yarn-project/package.json | 10 +
.../yarn-v1/.yarn/releases/yarn-1.22.1.cjs | 147386 ++++++++++++++
.../package-managers/yarn/yarn-v1/.yarnrc | 5 +
.../yarn/yarn-v1/package.json | 18 +
.../package-managers/yarn/yarn-v1/yarn.lock | 31 +
.../yarn-v2/.yarn/releases/yarn-2.4.1.cjs | 55 +
.../package-managers/yarn/yarn-v2/.yarnrc.yml | 1 +
.../yarn/yarn-v2/package.json | 17 +
.../yarn-v3/.yarn/releases/yarn-3.2.1.cjs | 54523 ++++++
.../package-managers/yarn/yarn-v3/.yarnrc.yml | 1 +
.../yarn/yarn-v3/package.json | 18 +
.../package-managers/yarn/yarn-v3/yarn.lock | 31 +
tests/utils/test_config.go | 235 +
tests/utils/test_utils.go | 65 +
tests/utils/test_validation.go | 74 +
unit_test.go | 36 +
utils/analyzermanager.go | 268 +
utils/analyzermanager_test.go | 59 +
utils/auditbasicparams.go | 194 +
utils/auditnpmparams.go | 25 +
utils/results.go | 100 +
utils/resultstable.go | 1050 +
utils/resultstable_test.go | 1061 +
utils/resultwriter.go | 571 +
utils/resultwriter_test.go | 379 +
utils/sarifutils.go | 267 +
utils/sarifutils_test.go | 672 +
utils/test_sarifutils.go | 64 +
utils/xraymanager.go | 33 +
xray_test.go | 42 +
329 files changed, 378122 insertions(+), 91 deletions(-)
create mode 100644 artifactory_test.go
create mode 100644 audit_test.go
create mode 100644 cli/docs/auditspecific/help.go
create mode 100644 cli/docs/flags.go
create mode 100644 cli/docs/scan/audit/help.go
create mode 100644 cli/docs/scan/buildscan/help.go
create mode 100644 cli/docs/scan/curation/help.go
create mode 100644 cli/docs/scan/dockerscan/help.go
create mode 100644 cli/docs/scan/scan/help.go
create mode 100644 cli/docs/xray/curl/help.go
create mode 100644 cli/docs/xray/offlineupdate/help.go
create mode 100644 cli/scancommands.go
create mode 100644 cli/xraycommands.go
create mode 100644 commands/audit/audit.go
create mode 100644 commands/audit/auditparams.go
create mode 100644 commands/audit/jas/applicability/applicabilitymanager.go
create mode 100644 commands/audit/jas/applicability/applicabilitymanager_test.go
create mode 100644 commands/audit/jas/common.go
create mode 100644 commands/audit/jas/common_test.go
create mode 100644 commands/audit/jas/commons_test.go
create mode 100644 commands/audit/jas/iac/iacscanner.go
create mode 100644 commands/audit/jas/iac/iacscanner_test.go
create mode 100644 commands/audit/jas/sast/sastscanner.go
create mode 100644 commands/audit/jas/sast/sastscanner_test.go
create mode 100644 commands/audit/jas/secrets/secretsscanner.go
create mode 100644 commands/audit/jas/secrets/secretsscanner_test.go
create mode 100644 commands/audit/jas/testdata/.jfrog/jfrog-apps-config.yml
create mode 100644 commands/audit/jasrunner.go
create mode 100644 commands/audit/jasrunner_test.go
create mode 100644 commands/audit/sca/common.go
create mode 100644 commands/audit/sca/common_test.go
create mode 100644 commands/audit/sca/go/gloang_test.go
create mode 100644 commands/audit/sca/go/golang.go
create mode 100644 commands/audit/sca/npm/npm.go
create mode 100644 commands/audit/sca/npm/npm_test.go
create mode 100644 commands/audit/sca/nuget/nuget.go
create mode 100644 commands/audit/sca/nuget/nuget_test.go
create mode 100644 commands/audit/sca/python/python.go
create mode 100644 commands/audit/sca/python/python_test.go
create mode 100644 commands/audit/sca/yarn/yarn.go
create mode 100644 commands/audit/sca/yarn/yarn_test.go
create mode 100644 commands/audit/scarunner.go
create mode 100644 commands/audit/scarunner_test.go
create mode 100644 commands/curation/curationaudit.go
create mode 100644 commands/curation/curationaudit_test.go
create mode 100644 commands/scan/buildscan.go
create mode 100644 commands/scan/dockerscan.go
create mode 100644 commands/scan/downloadindexer.go
create mode 100644 commands/scan/downloadindexer_test.go
create mode 100644 commands/scan/scan.go
create mode 100644 commands/xray/curl/curl.go
create mode 100644 commands/xray/offlineupdate/offlineupdate.go
create mode 100644 commands/xray/offlineupdate/offlineupdate_test.go
create mode 100644 formats/conversion.go
create mode 100644 formats/simplejsonapi.go
create mode 100644 formats/table.go
create mode 100644 scangraph/params.go
create mode 100644 scangraph/scangraph.go
create mode 100644 scangraph/scangraph_test.go
create mode 100644 scans_test.go
create mode 100644 tests/config.go
create mode 100644 tests/consts.go
delete mode 100644 tests/integration/xray_test.go
create mode 100644 tests/testdata/artifactory-repo-configs/docker_local_repository_config.json
create mode 100644 tests/testdata/artifactory-repo-configs/docker_remote_repository_config.json
create mode 100644 tests/testdata/artifactory-repo-configs/docker_virtual_repository_config.json
create mode 100644 tests/testdata/artifactory-repo-configs/go_local_repository_config.json
create mode 100644 tests/testdata/artifactory-repo-configs/go_remote_repository_config.json
create mode 100644 tests/testdata/artifactory-repo-configs/go_virtual_repository_config.json
create mode 100644 tests/testdata/artifactory-repo-configs/gradle_remote_repository_config.json
create mode 100644 tests/testdata/artifactory-repo-configs/maven_remote_repository_config.json
create mode 100644 tests/testdata/artifactory-repo-configs/npm_remote_repository_config.json
create mode 100644 tests/testdata/artifactory-repo-configs/nuget_remote_repository_config.json
create mode 100644 tests/testdata/artifactory-repo-configs/pypi_remote_repository_config.json
create mode 100644 tests/testdata/artifactory-repo-configs/repo1_repository_config.json
create mode 100644 tests/testdata/artifactory-repo-configs/specs_virtual_repository_config.json
create mode 100644 tests/testdata/artifactory-repo-configs/yarn_remote_repository_config.json
create mode 100644 tests/testdata/other/applicability-scan/applicable-cve-results.sarif
create mode 100644 tests/testdata/other/applicability-scan/empty-results.sarif
create mode 100644 tests/testdata/other/applicability-scan/no-applicable-cves-results.sarif
create mode 100644 tests/testdata/other/iac-scan/contains-iac-violations-working-dir.sarif
create mode 100644 tests/testdata/other/iac-scan/contains-iac-violations.sarif
create mode 100644 tests/testdata/other/iac-scan/no-violations.sarif
create mode 100644 tests/testdata/other/npm/dependencies.json
create mode 100644 tests/testdata/other/nuget/dependencies.json
create mode 100644 tests/testdata/other/nuget/expectedTree.json
create mode 100644 tests/testdata/other/sast-scan/contains-sast-violations.sarif
create mode 100644 tests/testdata/other/sast-scan/no-violations.sarif
create mode 100644 tests/testdata/other/secrets-scan/contain-secrets.sarif
create mode 100644 tests/testdata/other/secrets-scan/no-secrets.sarif
create mode 100644 tests/testdata/projects/binaries/build-info-extractor-maven3-2.20.0-uber.jar
create mode 100644 tests/testdata/projects/jas/jas-config/.jfrog/jfrog-apps-config.yml
create mode 100644 tests/testdata/projects/jas/jas-config/iac/azure/vpc/module.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/azure/vpc/outputs.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/azure/vpc/variables.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/azure/vpc/versions.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/azure/vpc_pp/module.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/azure/vpc_pp/outputs.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/azure/vpc_pp/variables.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/azure/vpc_pp/versions.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/gcp/k8s-oss/files/chk_k8s_nat
create mode 100644 tests/testdata/projects/jas/jas-config/iac/gcp/k8s-oss/module.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/gcp/k8s-oss/outputs.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/gcp/k8s-oss/variables.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/gcp/k8s-oss/versions.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/gcp/k8s-pipelines-bp/files/chk_k8s_nat
create mode 100644 tests/testdata/projects/jas/jas-config/iac/gcp/k8s-pipelines-bp/module.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/gcp/k8s-pipelines-bp/outputs.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/gcp/k8s-pipelines-bp/rbac.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/gcp/k8s-pipelines-bp/variables.tf
create mode 100644 tests/testdata/projects/jas/jas-config/iac/gcp/k8s-pipelines-bp/versions.tf
create mode 100644 tests/testdata/projects/jas/jas-config/main.py
create mode 100644 tests/testdata/projects/jas/jas-config/requirements.txt
create mode 100644 tests/testdata/projects/jas/jas-config/sast/flask_webgoat/__init__.py
create mode 100644 tests/testdata/projects/jas/jas-config/sast/flask_webgoat/ui.py
create mode 100644 tests/testdata/projects/jas/jas-config/sast/result.sarif
create mode 100644 tests/testdata/projects/jas/jas-config/sast/run.py
create mode 100644 tests/testdata/projects/jas/jas-config/secrets/more_secrets/key
create mode 100644 tests/testdata/projects/jas/jas-config/secrets/more_secrets/sequence
create mode 100644 tests/testdata/projects/jas/jas-config/secrets/secret_generic/blacklist
create mode 100644 tests/testdata/projects/jas/jas-config/secrets/secret_generic/gibberish
create mode 100644 tests/testdata/projects/jas/jas-test/iac/azure/vpc/module.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/azure/vpc/outputs.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/azure/vpc/variables.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/azure/vpc/versions.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/azure/vpc_pp/module.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/azure/vpc_pp/outputs.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/azure/vpc_pp/variables.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/azure/vpc_pp/versions.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/gcp/k8s-oss/files/chk_k8s_nat
create mode 100644 tests/testdata/projects/jas/jas-test/iac/gcp/k8s-oss/module.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/gcp/k8s-oss/outputs.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/gcp/k8s-oss/variables.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/gcp/k8s-oss/versions.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/gcp/k8s-pipelines-bp/files/chk_k8s_nat
create mode 100644 tests/testdata/projects/jas/jas-test/iac/gcp/k8s-pipelines-bp/module.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/gcp/k8s-pipelines-bp/outputs.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/gcp/k8s-pipelines-bp/rbac.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/gcp/k8s-pipelines-bp/variables.tf
create mode 100644 tests/testdata/projects/jas/jas-test/iac/gcp/k8s-pipelines-bp/versions.tf
create mode 100644 tests/testdata/projects/jas/jas-test/main.py
create mode 100644 tests/testdata/projects/jas/jas-test/requirements.txt
create mode 100644 tests/testdata/projects/jas/jas-test/sast/flask_webgoat/__init__.py
create mode 100644 tests/testdata/projects/jas/jas-test/sast/flask_webgoat/ui.py
create mode 100644 tests/testdata/projects/jas/jas-test/sast/result.sarif
create mode 100644 tests/testdata/projects/jas/jas-test/sast/run.py
create mode 100644 tests/testdata/projects/jas/jas-test/secrets/more_secrets/key
create mode 100644 tests/testdata/projects/jas/jas-test/secrets/more_secrets/sequence
create mode 100644 tests/testdata/projects/jas/jas-test/secrets/secret_generic/blacklist
create mode 100644 tests/testdata/projects/jas/jas-test/secrets/secret_generic/gibberish
create mode 100644 tests/testdata/projects/package-managers/dotnet/dotnet-multi/ClassLibrary1/ClassLibrary1.csproj
create mode 100644 tests/testdata/projects/package-managers/dotnet/dotnet-multi/SharedProject1/SharedProject1.projitems
create mode 100644 tests/testdata/projects/package-managers/dotnet/dotnet-multi/SharedProject1/SharedProject1.shproj
create mode 100644 tests/testdata/projects/package-managers/dotnet/dotnet-multi/TestApp1/TestApp1.csproj
create mode 100644 tests/testdata/projects/package-managers/dotnet/dotnet-multi/TestSolution.sln
create mode 100644 tests/testdata/projects/package-managers/dotnet/dotnet-single/Class1.cs
create mode 100644 tests/testdata/projects/package-managers/dotnet/dotnet-single/dotnet-single.csproj
create mode 100644 tests/testdata/projects/package-managers/go/go-project/go.mod.txt
create mode 100644 tests/testdata/projects/package-managers/go/go-project/go.sum.txt
create mode 100644 tests/testdata/projects/package-managers/go/go-project/test.go.txt
create mode 100644 tests/testdata/projects/package-managers/go/simple-project/go.mod
create mode 100644 tests/testdata/projects/package-managers/go/simple-project/go.sum
create mode 100644 tests/testdata/projects/package-managers/go/simple-project/hello.go
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/.jfrog/projects/gradle.yaml
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/api/build.gradle
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/api/src/main/java/org/gradle/api/PersonList.java
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/api/src/main/java/org/gradle/api/package.html
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/api/src/main/java/org/gradle/apiImpl/Impl.java
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/build.gradle
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/gradle.properties
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/gradle/wrapper/gradle-wrapper.jar
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/gradle/wrapper/gradle-wrapper.properties
create mode 100755 tests/testdata/projects/package-managers/gradle/gradle-example-config/gradlew
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/gradlew.bat
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/services/webservice/build.gradle
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/services/webservice/src/main/java/org/gradle/webservice/TestTest.java
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/services/webservice/src/test/java/org/gradle/webservice/TestTestTest.java
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/settings.gradle
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/shared/src/main/java/org/gradle/shared/Person.java
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/shared/src/main/java/org/gradle/shared/package-info.java
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle-example-config/shared/src/main/resources/org/gradle/shared/main.properties
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle/api/build.gradle
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle/api/src/main/java/org/gradle/api/PersonList.java
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle/api/src/main/java/org/gradle/api/package.html
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle/api/src/main/java/org/gradle/apiImpl/Impl.java
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle/build.gradle
create mode 100755 tests/testdata/projects/package-managers/gradle/gradle/gradle/wrapper/gradle-wrapper.jar
create mode 100755 tests/testdata/projects/package-managers/gradle/gradle/gradle/wrapper/gradle-wrapper.properties
create mode 100755 tests/testdata/projects/package-managers/gradle/gradle/gradlew
create mode 100755 tests/testdata/projects/package-managers/gradle/gradle/gradlew.bat
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle/services/webservice/build.gradle
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle/services/webservice/src/main/java/org/gradle/webservice/TestTest.java
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle/services/webservice/src/test/java/org/gradle/webservice/TestTestTest.java
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle/settings.gradle
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle/shared/src/main/java/org/gradle/shared/Person.java
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle/shared/src/main/java/org/gradle/shared/package-info.java
create mode 100644 tests/testdata/projects/package-managers/gradle/gradle/shared/src/main/resources/org/gradle/shared/main.properties
create mode 100644 tests/testdata/projects/package-managers/gradle/gradleproject/build.gradle
create mode 100644 tests/testdata/projects/package-managers/gradle/gradleproject/settings.gradle
create mode 100644 tests/testdata/projects/package-managers/maven/artifactory-maven-plugin/pom.xml
create mode 100644 tests/testdata/projects/package-managers/maven/artifactory-maven-plugin/src/main/java/org/jfrog/buildinfo/ArtifactoryMojo.java
create mode 100644 tests/testdata/projects/package-managers/maven/artifactory-maven-plugin/src/main/java/org/jfrog/buildinfo/Config.java
create mode 100644 tests/testdata/projects/package-managers/maven/artifactory-maven-plugin/src/main/java/org/jfrog/buildinfo/deployment/BuildDeployer.java
create mode 100644 tests/testdata/projects/package-managers/maven/artifactory-maven-plugin/src/main/java/org/jfrog/buildinfo/deployment/BuildInfoModelPropertyResolver.java
create mode 100644 tests/testdata/projects/package-managers/maven/artifactory-maven-plugin/src/main/java/org/jfrog/buildinfo/deployment/BuildInfoRecorder.java
create mode 100644 tests/testdata/projects/package-managers/maven/artifactory-maven-plugin/src/main/java/org/jfrog/buildinfo/resolution/RepositoryListener.java
create mode 100644 tests/testdata/projects/package-managers/maven/artifactory-maven-plugin/src/main/java/org/jfrog/buildinfo/resolution/ResolutionRepoHelper.java
create mode 100644 tests/testdata/projects/package-managers/maven/artifactory-maven-plugin/src/main/java/org/jfrog/buildinfo/types/ModuleArtifacts.java
create mode 100644 tests/testdata/projects/package-managers/maven/artifactory-maven-plugin/src/main/java/org/jfrog/buildinfo/utils/ArtifactoryMavenLogger.java
create mode 100644 tests/testdata/projects/package-managers/maven/artifactory-maven-plugin/src/main/java/org/jfrog/buildinfo/utils/Utils.java
create mode 100644 tests/testdata/projects/package-managers/maven/artifactory-maven-plugin/src/main/resources/META-INF/plexus/components.xml
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/.mvn/wrapper/MavenWrapperDownloader.java
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/.mvn/wrapper/maven-wrapper.jar
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/.mvn/wrapper/maven-wrapper.properties
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi1/pom.xml
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi1/src/main/java/artifactory/test/Multi1.java
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi1/src/test/java/artifactory/test/AppTest.java
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi1/target/classes/artifactory/test/Multi1.class
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi1/target/maven-status/maven-compiler-plugin/compile/default-compile/createdFiles.lst
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi1/target/maven-status/maven-compiler-plugin/compile/default-compile/inputFiles.lst
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi1/target/maven-status/maven-compiler-plugin/testCompile/default-testCompile/createdFiles.lst
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi1/target/maven-status/maven-compiler-plugin/testCompile/default-testCompile/inputFiles.lst
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi1/target/test-classes/artifactory/test/AppTest.class
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi2/pom.xml
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi2/src/main/java/artifactory/test/App.java
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi2/src/test/java/artifactory/test/AppTest.java
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi2/target/classes/artifactory/test/App.class
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi2/target/maven-status/maven-compiler-plugin/compile/default-compile/createdFiles.lst
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi2/target/maven-status/maven-compiler-plugin/compile/default-compile/inputFiles.lst
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi2/target/maven-status/maven-compiler-plugin/testCompile/default-testCompile/createdFiles.lst
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi2/target/maven-status/maven-compiler-plugin/testCompile/default-testCompile/inputFiles.lst
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi2/target/test-classes/artifactory/test/AppTest.class
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi3/pom.xml
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi3/src/main/java/artifactory/test/Multi3.java
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi3/src/main/webapp/WEB-INF/web.xml
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi3/src/test/java/artifactory/test/AppTest.java
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi3/target/classes/artifactory/test/Multi3.class
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi3/target/maven-status/maven-compiler-plugin/compile/default-compile/createdFiles.lst
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi3/target/maven-status/maven-compiler-plugin/compile/default-compile/inputFiles.lst
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi3/target/maven-status/maven-compiler-plugin/testCompile/default-testCompile/createdFiles.lst
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi3/target/maven-status/maven-compiler-plugin/testCompile/default-testCompile/inputFiles.lst
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/multi3/target/test-classes/artifactory/test/AppTest.class
create mode 100755 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/mvnw
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/mvnw.cmd
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example-with-wrapper/pom.xml
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example/.jfrog/projects/maven.yaml
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example/multi1/pom.xml
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example/multi2/pom.xml
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example/multi3/pom.xml
create mode 100644 tests/testdata/projects/package-managers/maven/maven-example/pom.xml
create mode 100644 tests/testdata/projects/package-managers/maven/maven/pom.xml
create mode 100644 tests/testdata/projects/package-managers/maven/maven/target/build-info.json
create mode 100644 tests/testdata/projects/package-managers/maven/mavenproject/pom.xml
create mode 100755 tests/testdata/projects/package-managers/npm/npm-no-lock/package.json
create mode 100644 tests/testdata/projects/package-managers/npm/npm-project/.jfrog/jfrog-cli.conf.v6
create mode 100644 tests/testdata/projects/package-managers/npm/npm-project/.jfrog/projects/npm.yaml
create mode 100644 tests/testdata/projects/package-managers/npm/npm-project/package-lock.json
create mode 100644 tests/testdata/projects/package-managers/npm/npm-project/package.json
create mode 100644 tests/testdata/projects/package-managers/npm/npm-scripts/package.json
create mode 100644 tests/testdata/projects/package-managers/npm/npm/package-lock.json
create mode 100644 tests/testdata/projects/package-managers/npm/npm/package.json
create mode 100644 tests/testdata/projects/package-managers/nuget/multi/ClassLibrary1/ClassLibrary1.csproj
create mode 100644 tests/testdata/projects/package-managers/nuget/multi/SharedProject1/SharedProject1.projitems
create mode 100644 tests/testdata/projects/package-managers/nuget/multi/SharedProject1/SharedProject1.shproj
create mode 100644 tests/testdata/projects/package-managers/nuget/multi/TestApp1/TestApp1.csproj
create mode 100644 tests/testdata/projects/package-managers/nuget/multi/TestSolution.sln
create mode 100644 tests/testdata/projects/package-managers/nuget/single4.0/core/Multi1.cs
create mode 100644 tests/testdata/projects/package-managers/nuget/single4.0/core/Properties/AssemblyInfo.cs
create mode 100644 tests/testdata/projects/package-managers/nuget/single4.0/core/core.csproj
create mode 100644 tests/testdata/projects/package-managers/nuget/single4.0/core/core.nuspec
create mode 100644 tests/testdata/projects/package-managers/nuget/single4.0/core/packages.config
create mode 100644 tests/testdata/projects/package-managers/nuget/single4.0/example.sln
create mode 100644 tests/testdata/projects/package-managers/nuget/single5.0/ClassLibrary1/ClassLibrary1.csproj
create mode 100644 tests/testdata/projects/package-managers/nuget/single5.0/TestSolution.sln
create mode 100644 tests/testdata/projects/package-managers/python/pip/pip-project/requirements.txt
create mode 100644 tests/testdata/projects/package-managers/python/pip/pip-project/setup.py
create mode 100644 tests/testdata/projects/package-managers/python/pip/pip/requirementsproject/requirements.txt
create mode 100644 tests/testdata/projects/package-managers/python/pip/pip/setuppyproject/setup.py
create mode 100644 tests/testdata/projects/package-managers/python/pipenv/pipenv-project/Pipfile
create mode 100644 tests/testdata/projects/package-managers/python/pipenv/pipenv/pipenv.yaml
create mode 100644 tests/testdata/projects/package-managers/python/pipenv/pipenv/pipenvproject/Pipfile
create mode 100644 tests/testdata/projects/package-managers/python/poetry/my-poetry-project/my_poetry_project/__init__.py
create mode 100644 tests/testdata/projects/package-managers/python/poetry/my-poetry-project/poetry.lock
create mode 100644 tests/testdata/projects/package-managers/python/poetry/my-poetry-project/pyproject.toml
create mode 100644 tests/testdata/projects/package-managers/python/poetry/my-poetry-project/tests/__init__.py
create mode 100644 tests/testdata/projects/package-managers/python/poetry/my-poetry-project/tests/test_my_poetry_project.py
create mode 100644 tests/testdata/projects/package-managers/python/poetry/poetry-project/poetry.lock
create mode 100644 tests/testdata/projects/package-managers/python/poetry/poetry-project/pyproject.toml
create mode 100644 tests/testdata/projects/package-managers/python/poetry/poetry/hello.py
create mode 100644 tests/testdata/projects/package-managers/python/poetry/poetry/pyproject.toml
create mode 100755 tests/testdata/projects/package-managers/yarn/yarn-project/.yarn/releases/yarn-1.22.21.cjs
create mode 100644 tests/testdata/projects/package-managers/yarn/yarn-project/.yarnrc
create mode 100755 tests/testdata/projects/package-managers/yarn/yarn-project/.yarnrc.yml
create mode 100755 tests/testdata/projects/package-managers/yarn/yarn-project/package.json
create mode 100755 tests/testdata/projects/package-managers/yarn/yarn-v1/.yarn/releases/yarn-1.22.1.cjs
create mode 100644 tests/testdata/projects/package-managers/yarn/yarn-v1/.yarnrc
create mode 100644 tests/testdata/projects/package-managers/yarn/yarn-v1/package.json
create mode 100644 tests/testdata/projects/package-managers/yarn/yarn-v1/yarn.lock
create mode 100755 tests/testdata/projects/package-managers/yarn/yarn-v2/.yarn/releases/yarn-2.4.1.cjs
create mode 100644 tests/testdata/projects/package-managers/yarn/yarn-v2/.yarnrc.yml
create mode 100644 tests/testdata/projects/package-managers/yarn/yarn-v2/package.json
create mode 100755 tests/testdata/projects/package-managers/yarn/yarn-v3/.yarn/releases/yarn-3.2.1.cjs
create mode 100644 tests/testdata/projects/package-managers/yarn/yarn-v3/.yarnrc.yml
create mode 100644 tests/testdata/projects/package-managers/yarn/yarn-v3/package.json
create mode 100644 tests/testdata/projects/package-managers/yarn/yarn-v3/yarn.lock
create mode 100644 tests/utils/test_config.go
create mode 100644 tests/utils/test_utils.go
create mode 100644 tests/utils/test_validation.go
create mode 100644 unit_test.go
create mode 100644 utils/analyzermanager.go
create mode 100644 utils/analyzermanager_test.go
create mode 100644 utils/auditbasicparams.go
create mode 100644 utils/auditnpmparams.go
create mode 100644 utils/results.go
create mode 100644 utils/resultstable.go
create mode 100644 utils/resultstable_test.go
create mode 100644 utils/resultwriter.go
create mode 100644 utils/resultwriter_test.go
create mode 100644 utils/sarifutils.go
create mode 100644 utils/sarifutils_test.go
create mode 100644 utils/test_sarifutils.go
create mode 100644 utils/xraymanager.go
create mode 100644 xray_test.go
diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml
index 00a3be52..c9c7e360 100644
--- a/.github/workflows/test.yml
+++ b/.github/workflows/test.yml
@@ -23,7 +23,6 @@ jobs:
env:
GOPROXY: direct
GRADLE_OPTS: -Dorg.gradle.daemon=false
- JFROG_CLI_LOG_LEVEL: "DEBUG"
steps:
# Install dependencies
- name: Install Go
diff --git a/README.md b/README.md
index 69b10acf..7431da36 100644
--- a/README.md
+++ b/README.md
@@ -12,9 +12,8 @@
## General
-**jfrog-cli-security** is a go module which contains the security code components (Xray, JAS) used by the [JFrog CLI source code](https://github.com/jfrog/jfrog-cli).
+**jfrog-cli-security** is a Go module that encompasses the security commands of [JFrog CLI](https://docs.jfrog-applications.jfrog.io/jfrog-applications/jfrog-cli). This module is an embedded JFrog CLI plugin and is referenced as a Go module within the [JFrog CLI codebase](https://github.com/jfrog/jfrog-cli).
## 🫱🏻🫲🏼 Contributions
-We welcome pull requests from the community. To help us improve this project, please read
-our [Contribution](./CONTRIBUTING.md) guide.
+We welcome contributions from the community through pull requests. To assist in enhancing this project, please review our [Plugin Contribution](https://github.com/jfrog/jfrog-cli-core/blob/dev/plugins/README.md) guide.
diff --git a/artifactory_test.go b/artifactory_test.go
new file mode 100644
index 00000000..26df2789
--- /dev/null
+++ b/artifactory_test.go
@@ -0,0 +1,244 @@
+package main
+
+import (
+ "errors"
+ "github.com/stretchr/testify/require"
+ "os"
+ "os/exec"
+ "path/filepath"
+ "testing"
+
+ "github.com/jfrog/jfrog-cli-core/v2/utils/dependencies"
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+
+ "github.com/stretchr/testify/assert"
+
+ biutils "github.com/jfrog/build-info-go/utils"
+
+ securityTests "github.com/jfrog/jfrog-cli-security/tests"
+ securityTestUtils "github.com/jfrog/jfrog-cli-security/tests/utils"
+ "github.com/jfrog/jfrog-cli-security/utils"
+
+ "github.com/jfrog/jfrog-cli-core/v2/artifactory/commands/generic"
+ commonCommands "github.com/jfrog/jfrog-cli-core/v2/common/commands"
+ "github.com/jfrog/jfrog-cli-core/v2/common/project"
+ "github.com/jfrog/jfrog-cli-core/v2/common/spec"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ coreTests "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
+
+ clientTests "github.com/jfrog/jfrog-client-go/utils/tests"
+)
+
+// Validates dependency resolution from an Artifactory server while the dependency tree is built during the 'audit' flow.
+// This process resolves all dependencies required by the project.
+func TestDependencyResolutionFromArtifactory(t *testing.T) {
+ testCases := []struct {
+ testProjectPath []string
+ resolveRepoName string
+ cacheRepoName string
+ projectType project.ProjectType
+ }{
+ {
+ testProjectPath: []string{"npm", "npm-no-lock"},
+ resolveRepoName: securityTests.NpmRemoteRepo,
+ cacheRepoName: securityTests.NpmRemoteRepo,
+ projectType: project.Npm,
+ },
+ {
+ testProjectPath: []string{"dotnet", "dotnet-single"},
+ resolveRepoName: securityTests.NugetRemoteRepo,
+ cacheRepoName: securityTests.NugetRemoteRepo,
+ projectType: project.Dotnet,
+ },
+ {
+ testProjectPath: []string{"yarn", "yarn-v2"},
+ resolveRepoName: securityTests.YarnRemoteRepo,
+ cacheRepoName: securityTests.YarnRemoteRepo,
+ projectType: project.Yarn,
+ },
+ {
+ testProjectPath: []string{"gradle", "gradleproject"},
+ resolveRepoName: securityTests.GradleRemoteRepo,
+ cacheRepoName: securityTests.GradleRemoteRepo,
+ projectType: project.Gradle,
+ },
+ {
+ testProjectPath: []string{"maven", "mavenproject"},
+ resolveRepoName: securityTests.MvnRemoteRepo,
+ cacheRepoName: securityTests.MvnRemoteRepo,
+ projectType: project.Maven,
+ },
+ {
+ testProjectPath: []string{"go", "simple-project"},
+ resolveRepoName: securityTests.GoVirtualRepo,
+ cacheRepoName: securityTests.GoRemoteRepo,
+ projectType: project.Go,
+ },
+ {
+ testProjectPath: []string{"python", "pipenv", "pipenv", "pipenvproject"},
+ resolveRepoName: securityTests.PypiRemoteRepo,
+ cacheRepoName: securityTests.PypiRemoteRepo,
+ projectType: project.Pipenv,
+ },
+ {
+ testProjectPath: []string{"python", "pip", "pip", "setuppyproject"},
+ resolveRepoName: securityTests.PypiRemoteRepo,
+ cacheRepoName: securityTests.PypiRemoteRepo,
+ projectType: project.Pip,
+ },
+ {
+ testProjectPath: []string{"python", "poetry", "poetry"},
+ resolveRepoName: securityTests.PypiRemoteRepo,
+ cacheRepoName: securityTests.PypiRemoteRepo,
+ projectType: project.Poetry,
+ },
+ }
+ securityTestUtils.CreateJfrogHomeConfig(t, true)
+ defer securityTestUtils.CleanTestsHomeEnv()
+
+ for _, testCase := range testCases {
+ t.Run(testCase.projectType.String(), func(t *testing.T) {
+ testSingleTechDependencyResolution(t, testCase.testProjectPath, testCase.resolveRepoName, testCase.cacheRepoName, testCase.projectType)
+ })
+ }
+}
+
+func testSingleTechDependencyResolution(t *testing.T, testProjectPartialPath []string, resolveRepoName string, cacheRepoName string, projectType project.ProjectType) {
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ testProjectPath := filepath.Join(append([]string{filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "package-managers"}, testProjectPartialPath...)...)
+ assert.NoError(t, biutils.CopyDir(testProjectPath, tempDirPath, true, nil))
+ rootDir, err := os.Getwd()
+ assert.NoError(t, err)
+ assert.NoError(t, os.Chdir(tempDirPath))
+ defer func() {
+ assert.NoError(t, os.Chdir(rootDir))
+ }()
+
+ server := &config.ServerDetails{
+ Url: *securityTests.JfrogUrl,
+ ArtifactoryUrl: *securityTests.JfrogUrl + securityTests.ArtifactoryEndpoint,
+ XrayUrl: *securityTests.JfrogUrl + securityTests.XrayEndpoint,
+ AccessToken: *securityTests.JfrogAccessToken,
+ ServerId: securityTests.ServerId,
+ }
+ configCmd := commonCommands.NewConfigCommand(commonCommands.AddOrEdit, securityTests.ServerId).SetDetails(server).SetUseBasicAuthOnly(true).SetInteractive(false)
+ assert.NoError(t, configCmd.Run())
+ // Create build config
+ assert.NoError(t, commonCommands.CreateBuildConfigWithOptions(false, projectType,
+ commonCommands.WithResolverServerId(server.ServerId),
+ commonCommands.WithResolverRepo(resolveRepoName),
+ ))
+
+ artifactoryPathToSearch := cacheRepoName + "-cache/*"
+ // To ensure a clean state between test cases, clear the remote repository cache, which may be shared across test cases.
+ deleteCmd := generic.NewDeleteCommand()
+ deleteCmd.SetServerDetails(server).SetRetries(3).SetSpec(spec.NewBuilder().Pattern(artifactoryPathToSearch).Recursive(true).BuildSpec())
+ assert.NoError(t, deleteCmd.Run())
+
+ callbackFunc := clearOrRedirectLocalCacheIfNeeded(t, projectType)
+ if callbackFunc != nil {
+ defer func() {
+ callbackFunc()
+ }()
+ }
+
+ // When running the 'audit' command on an uninstalled project, we expect dependencies to be resolved from the configured Artifactory server and repository.
+ assert.NoError(t, securityTests.PlatformCli.WithoutCredentials().Exec("audit"))
+
+ // After resolution from Artifactory, we expect the repository's cache to contain data.
+ output := coreTests.RunCmdWithOutput(t, func() error {
+ searchCmd := generic.NewSearchCommand()
+ searchCmd.SetServerDetails(server).SetRetries(3).SetSpec(spec.NewBuilder().Pattern(artifactoryPathToSearch).Recursive(true).BuildSpec())
+ err := searchCmd.Run()
+ if err != nil {
+ return err
+ }
+ // Verify that the repository's cache is now populated with artifacts.
+ result := searchCmd.Result()
+ require.NotNil(t, result)
+ reader := result.Reader()
+ require.NotNil(t, reader)
+ defer func() {
+ err = errors.Join(err, reader.Close())
+ }()
+ readerLen, e := reader.Length()
+ if err = errors.Join(err, e); err != nil {
+ return err
+ }
+ assert.NotEqual(t, 0, readerLen)
+ return err
+ })
+ assert.NotEqual(t, "[]\n", output)
+}
+
+// To guarantee that dependencies are resolved from Artifactory, certain package managers may need their local cache to be cleared.
+func clearOrRedirectLocalCacheIfNeeded(t *testing.T, projectType project.ProjectType) (callbackFunc func()) {
+ switch projectType {
+ case project.Dotnet:
+ _, err := exec.Command("dotnet", "nuget", "locals", "all", "--clear").CombinedOutput()
+ assert.NoError(t, err)
+ case project.Maven:
+ mavenCacheTempPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ envVarCallbackFunc := clientTests.SetEnvWithCallbackAndAssert(t, securityTests.JvmLaunchEnvVar, securityTests.MavenCacheRedirectionVal+mavenCacheTempPath)
+ callbackFunc = func() {
+ envVarCallbackFunc()
+ createTempDirCallback()
+ }
+ case project.Go:
+ goTempCachePath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ envVarCallbackFunc := clientTests.SetEnvWithCallbackAndAssert(t, securityTests.GoCacheEnvVar, goTempCachePath)
+
+ callbackFunc = func() {
+ envVarCallbackFunc()
+ // The Go module cache is written read-only; restore deletion permissions so the temporary cache and all its contents can be removed.
+ assert.NoError(t, coreutils.SetPermissionsRecursively(goTempCachePath, 0755))
+ createTempDirCallback()
+ }
+ case project.Pip:
+ pipTempCachePath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ envVarCallbackFunc := clientTests.SetEnvWithCallbackAndAssert(t, securityTests.PipCacheEnvVar, pipTempCachePath)
+ callbackFunc = func() {
+ envVarCallbackFunc()
+ createTempDirCallback()
+ }
+ }
+ return
+}
+
+func TestDownloadAnalyzerManagerIfNeeded(t *testing.T) {
+ // Configure a new JFrog CLI home dir.
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ setEnvCallBack := clientTests.SetEnvWithCallbackAndAssert(t, coreutils.HomeDir, tempDirPath)
+ defer setEnvCallBack()
+
+ // Download
+ err := utils.DownloadAnalyzerManagerIfNeeded()
+ assert.NoError(t, err)
+
+ // Validate that the analyzer manager binary and the checksum.sha2 file exist
+ path, err := utils.GetAnalyzerManagerDirAbsolutePath()
+ assert.NoError(t, err)
+ amPath := filepath.Join(path, utils.GetAnalyzerManagerExecutableName())
+ exists, err := fileutils.IsFileExists(amPath, false)
+ assert.NoError(t, err)
+ assert.True(t, exists)
+ checksumPath := filepath.Join(path, dependencies.ChecksumFileName)
+ exists, err = fileutils.IsFileExists(checksumPath, false)
+ assert.NoError(t, err)
+ assert.True(t, exists)
+ checksumFileStat, err := os.Stat(checksumPath)
+ assert.NoError(t, err)
+ assert.True(t, checksumFileStat.Size() > 0)
+
+ // Validate no second download occurred
+ firstFileStat, err := os.Stat(amPath)
+ assert.NoError(t, err)
+ err = utils.DownloadAnalyzerManagerIfNeeded()
+ assert.NoError(t, err)
+ secondFileStat, err := os.Stat(amPath)
+ assert.NoError(t, err)
+ assert.Equal(t, firstFileStat.ModTime(), secondFileStat.ModTime())
+}
diff --git a/audit_test.go b/audit_test.go
new file mode 100644
index 00000000..c83f25c1
--- /dev/null
+++ b/audit_test.go
@@ -0,0 +1,492 @@
+package main
+
+import (
+ "encoding/json"
+ "fmt"
+ "github.com/jfrog/jfrog-cli-security/formats"
+ "os"
+ "os/exec"
+ "path/filepath"
+ "strings"
+ "testing"
+
+ "github.com/stretchr/testify/assert"
+
+ biutils "github.com/jfrog/build-info-go/utils"
+
+ "github.com/jfrog/jfrog-cli-core/v2/common/format"
+ coreTests "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
+
+ "github.com/jfrog/jfrog-cli-security/scangraph"
+ securityTests "github.com/jfrog/jfrog-cli-security/tests"
+ securityTestUtils "github.com/jfrog/jfrog-cli-security/tests/utils"
+ clientTests "github.com/jfrog/jfrog-client-go/utils/tests"
+ "github.com/jfrog/jfrog-client-go/xray/services"
+)
+
+
+func TestXrayAuditNpmJson(t *testing.T) {
+ output := testXrayAuditNpm(t, string(format.Json))
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, 1, 1)
+}
+
+func TestXrayAuditNpmSimpleJson(t *testing.T) {
+ output := testXrayAuditNpm(t, string(format.SimpleJson))
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, 1, 1)
+}
+
+func testXrayAuditNpm(t *testing.T, format string) string {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ npmProjectPath := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "package-managers", "npm", "npm")
+ // Copy the npm project from the testdata to a temp dir
+ assert.NoError(t, biutils.CopyDir(npmProjectPath, tempDirPath, true, nil))
+ prevWd := securityTestUtils.ChangeWD(t, tempDirPath)
+ defer clientTests.ChangeDirAndAssert(t, prevWd)
+ // Run npm install before executing 'jf audit --npm'
+ assert.NoError(t, exec.Command("npm", "install").Run())
+ // Add dummy descriptor file to check that we run only specific audit
+ addDummyPackageDescriptor(t, true)
+ return securityTests.PlatformCli.RunCliCmdWithOutput(t, "audit", "--npm", "--licenses", "--format="+format)
+}
+
+func TestXrayAuditYarnV2Json(t *testing.T) {
+ testXrayAuditYarn(t, "yarn-v2", func() {
+ output := runXrayAuditYarnWithOutput(t, string(format.Json))
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, 1, 1)
+ })
+}
+
+func TestXrayAuditYarnV2SimpleJson(t *testing.T) {
+ testXrayAuditYarn(t, "yarn-v3", func() {
+ output := runXrayAuditYarnWithOutput(t, string(format.SimpleJson))
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, 1, 1)
+ })
+}
+
+func TestXrayAuditYarnV1Json(t *testing.T) {
+ testXrayAuditYarn(t, "yarn-v1", func() {
+ output := runXrayAuditYarnWithOutput(t, string(format.Json))
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, 1, 1)
+ })
+}
+
+func TestXrayAuditYarnV1JsonWithoutDevDependencies(t *testing.T) {
+ unsetEnv := clientTests.SetEnvWithCallbackAndAssert(t, "NODE_ENV", "production")
+ defer unsetEnv()
+ testXrayAuditYarn(t, "yarn-v1", func() {
+ output := runXrayAuditYarnWithOutput(t, string(format.Json))
+ var results []services.ScanResponse
+ err := json.Unmarshal([]byte(output), &results)
+ assert.NoError(t, err)
+ // Guard against an empty results slice before indexing into it.
+ if assert.NotEmpty(t, results) {
+ assert.Len(t, results[0].Vulnerabilities, 0)
+ }
+ })
+}
+
+func TestXrayAuditYarnV1SimpleJson(t *testing.T) {
+ testXrayAuditYarn(t, "yarn-v1", func() {
+ output := runXrayAuditYarnWithOutput(t, string(format.SimpleJson))
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, 1, 1)
+ })
+}
+
+func testXrayAuditYarn(t *testing.T, projectDirName string, yarnCmd func()) {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ yarnProjectPath := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "package-managers", "yarn", projectDirName)
+ // Copy the Yarn project from the testdata to a temp directory
+ assert.NoError(t, biutils.CopyDir(yarnProjectPath, tempDirPath, true, nil))
+ prevWd := securityTestUtils.ChangeWD(t, tempDirPath)
+ defer clientTests.ChangeDirAndAssert(t, prevWd)
+ // Run yarn install before executing 'jf audit --yarn'. Any install error fails the test.
+ assert.NoError(t, exec.Command("yarn").Run())
+ // Add dummy descriptor file to check that we run only specific audit
+ addDummyPackageDescriptor(t, true)
+ yarnCmd()
+}
+
+func runXrayAuditYarnWithOutput(t *testing.T, format string) string {
+ return securityTests.PlatformCli.RunCliCmdWithOutput(t, "audit", "--yarn", "--licenses", "--format="+format)
+}
+
+// Tests NuGet audit with both a single-project and a multi-project NuGet solution, and asserts the expected scan results.
+func TestXrayAuditNugetJson(t *testing.T) {
+ var testdata = []struct {
+ projectName string
+ format string
+ restoreTech string
+ minVulnerabilities int
+ minLicences int
+ }{
+ {
+ projectName: "single4.0",
+ format: string(format.Json),
+ restoreTech: "nuget",
+ minVulnerabilities: 2,
+ minLicences: 0,
+ },
+ {
+ projectName: "single5.0",
+ format: string(format.Json),
+ restoreTech: "dotnet",
+ minVulnerabilities: 3,
+ minLicences: 2,
+ },
+ {
+ projectName: "single5.0",
+ format: string(format.Json),
+ restoreTech: "",
+ minVulnerabilities: 3,
+ minLicences: 2,
+ },
+ {
+ projectName: "multi",
+ format: string(format.Json),
+ restoreTech: "dotnet",
+ minVulnerabilities: 5,
+ minLicences: 3,
+ },
+ {
+ projectName: "multi",
+ format: string(format.Json),
+ restoreTech: "",
+ minVulnerabilities: 5,
+ minLicences: 3,
+ },
+ }
+ for _, test := range testdata {
+ runInstallCommand := test.restoreTech != ""
+ t.Run(fmt.Sprintf("projectName:%s,runInstallCommand:%t", test.projectName, runInstallCommand),
+ func(t *testing.T) {
+ output := testXrayAuditNuget(t, test.projectName, test.format, test.restoreTech)
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, test.minVulnerabilities, test.minLicences)
+ })
+ }
+}
+
+func TestXrayAuditNugetSimpleJson(t *testing.T) {
+ var testdata = []struct {
+ projectName string
+ format string
+ restoreTech string
+ minVulnerabilities int
+ minLicences int
+ }{
+ {
+ projectName: "single4.0",
+ format: string(format.SimpleJson),
+ restoreTech: "nuget",
+ minVulnerabilities: 2,
+ minLicences: 0,
+ },
+ {
+ projectName: "single5.0",
+ format: string(format.SimpleJson),
+ restoreTech: "dotnet",
+ minVulnerabilities: 3,
+ minLicences: 2,
+ },
+ {
+ projectName: "single5.0",
+ format: string(format.SimpleJson),
+ restoreTech: "",
+ minVulnerabilities: 3,
+ minLicences: 2,
+ },
+ }
+ for _, test := range testdata {
+ runInstallCommand := test.restoreTech != ""
+ t.Run(fmt.Sprintf("projectName:%s,runInstallCommand:%t", test.projectName, runInstallCommand),
+ func(t *testing.T) {
+ output := testXrayAuditNuget(t, test.projectName, test.format, test.restoreTech)
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, test.minVulnerabilities, test.minLicences)
+ })
+ }
+}
+
+func testXrayAuditNuget(t *testing.T, projectName, format string, restoreTech string) string {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ projectPath := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "package-managers", "nuget", projectName)
+
+ assert.NoError(t, biutils.CopyDir(projectPath, tempDirPath, true, nil))
+ prevWd := securityTestUtils.ChangeWD(t, tempDirPath)
+ defer clientTests.ChangeDirAndAssert(t, prevWd)
+ // Add dummy descriptor file to check that we run only specific audit
+ addDummyPackageDescriptor(t, false)
+ // Run NuGet/dotnet restore before executing 'jf audit --nuget'
+ if restoreTech != "" {
+ _, err := exec.Command(restoreTech, "restore").CombinedOutput()
+ assert.NoError(t, err)
+ }
+ return securityTests.PlatformCli.RunCliCmdWithOutput(t, "audit", "--nuget", "--format="+format, "--licenses")
+}
+
+func TestXrayAuditGradleJson(t *testing.T) {
+ output := testXrayAuditGradle(t, string(format.Json))
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, 3, 3)
+}
+
+func TestXrayAuditGradleSimpleJson(t *testing.T) {
+ output := testXrayAuditGradle(t, string(format.SimpleJson))
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, 3, 3)
+}
+
+func testXrayAuditGradle(t *testing.T, format string) string {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ gradleProjectPath := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "package-managers", "gradle", "gradle")
+ // Copy the gradle project from the testdata to a temp dir
+ assert.NoError(t, biutils.CopyDir(gradleProjectPath, tempDirPath, true, nil))
+ prevWd := securityTestUtils.ChangeWD(t, tempDirPath)
+ defer clientTests.ChangeDirAndAssert(t, prevWd)
+ // Add a dummy descriptor file to verify that only the requested technology is audited
+ addDummyPackageDescriptor(t, false)
+ return securityTests.PlatformCli.RunCliCmdWithOutput(t, "audit", "--gradle", "--licenses", "--format="+format)
+}
+
+func TestXrayAuditMavenJson(t *testing.T) {
+ output := testXrayAuditMaven(t, string(format.Json))
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, 1, 1)
+}
+
+func TestXrayAuditMavenSimpleJson(t *testing.T) {
+ output := testXrayAuditMaven(t, string(format.SimpleJson))
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, 1, 1)
+}
+
+func testXrayAuditMaven(t *testing.T, format string) string {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ mvnProjectPath := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "package-managers", "maven", "maven")
+ // Copy the maven project from the testdata to a temp dir
+ assert.NoError(t, biutils.CopyDir(mvnProjectPath, tempDirPath, true, nil))
+ prevWd := securityTestUtils.ChangeWD(t, tempDirPath)
+ defer clientTests.ChangeDirAndAssert(t, prevWd)
+ // Add a dummy descriptor file to verify that only the requested technology is audited
+ addDummyPackageDescriptor(t, false)
+ return securityTests.PlatformCli.RunCliCmdWithOutput(t, "audit", "--mvn", "--licenses", "--format="+format)
+}
+
+func TestXrayAuditNoTech(t *testing.T) {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ prevWd := securityTestUtils.ChangeWD(t, tempDirPath)
+ defer clientTests.ChangeDirAndAssert(t, prevWd)
+ // Run audit on an empty folder; no technology is detected, so the command is expected to complete without an error
+ err := securityTests.PlatformCli.Exec("audit")
+ assert.NoError(t, err)
+}
+
+func TestXrayAuditMultiProjects(t *testing.T) {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ multiProject := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects")
+ // Copy the multi project from the testdata to a temp dir
+ assert.NoError(t, biutils.CopyDir(multiProject, tempDirPath, true, nil))
+ prevWd := securityTestUtils.ChangeWD(t, tempDirPath)
+ defer clientTests.ChangeDirAndAssert(t, prevWd)
+ workingDirsFlag := fmt.Sprintf("--working-dirs=%s, %s ,%s, %s",
+ filepath.Join(tempDirPath, "package-managers", "maven", "maven"), filepath.Join(tempDirPath, "package-managers", "nuget", "single4.0"),
+ filepath.Join(tempDirPath, "package-managers", "python", "pip", "pip-project"), filepath.Join(tempDirPath, "jas", "jas-test"))
+ // Configure a new server named "default"
+ securityTestUtils.CreateJfrogHomeConfig(t, true)
+ defer securityTestUtils.CleanTestsHomeEnv()
+ output := securityTests.PlatformCli.WithoutCredentials().RunCliCmdWithOutput(t, "audit", "--format="+string(format.SimpleJson), workingDirsFlag)
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, 35, 0)
+ securityTestUtils.VerifySimpleJsonJasResults(t, output, 1, 9, 7, 3)
+}
+
+func TestXrayAuditPipJson(t *testing.T) {
+ output := testXrayAuditPip(t, string(format.Json), "")
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, 3, 1)
+}
+
+func TestXrayAuditPipSimpleJson(t *testing.T) {
+ output := testXrayAuditPip(t, string(format.SimpleJson), "")
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, 3, 1)
+}
+
+func TestXrayAuditPipJsonWithRequirementsFile(t *testing.T) {
+ output := testXrayAuditPip(t, string(format.Json), "requirements.txt")
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, 2, 0)
+}
+
+func TestXrayAuditPipSimpleJsonWithRequirementsFile(t *testing.T) {
+ output := testXrayAuditPip(t, string(format.SimpleJson), "requirements.txt")
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, 2, 0)
+}
+
+func testXrayAuditPip(t *testing.T, format, requirementsFile string) string {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ pipProjectPath := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "package-managers", "python", "pip", "pip-project")
+ // Copy the pip project from the testdata to a temp dir
+ assert.NoError(t, biutils.CopyDir(pipProjectPath, tempDirPath, true, nil))
+ prevWd := securityTestUtils.ChangeWD(t, tempDirPath)
+ defer clientTests.ChangeDirAndAssert(t, prevWd)
+ // Add a dummy descriptor file to verify that only the requested technology is audited
+ addDummyPackageDescriptor(t, false)
+ args := []string{"audit", "--pip", "--licenses", "--format=" + format}
+ if requirementsFile != "" {
+ args = append(args, "--requirements-file="+requirementsFile)
+ }
+ return securityTests.PlatformCli.RunCliCmdWithOutput(t, args...)
+}
+
+func TestXrayAuditPipenvJson(t *testing.T) {
+ output := testXrayAuditPipenv(t, string(format.Json))
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, 3, 1)
+}
+
+func TestXrayAuditPipenvSimpleJson(t *testing.T) {
+ output := testXrayAuditPipenv(t, string(format.SimpleJson))
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, 3, 1)
+}
+
+func testXrayAuditPipenv(t *testing.T, format string) string {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ pipenvProjectPath := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "package-managers", "python", "pipenv", "pipenv-project")
+ // Copy the pipenv project from the testdata to a temp dir
+ assert.NoError(t, biutils.CopyDir(pipenvProjectPath, tempDirPath, true, nil))
+ prevWd := securityTestUtils.ChangeWD(t, tempDirPath)
+ defer clientTests.ChangeDirAndAssert(t, prevWd)
+ // Add a dummy descriptor file to verify that only the requested technology is audited
+ addDummyPackageDescriptor(t, false)
+ return securityTests.PlatformCli.RunCliCmdWithOutput(t, "audit", "--pipenv", "--licenses", "--format="+format)
+}
+
+func TestXrayAuditPoetryJson(t *testing.T) {
+ output := testXrayAuditPoetry(t, string(format.Json))
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, 3, 1)
+}
+
+func TestXrayAuditPoetrySimpleJson(t *testing.T) {
+ output := testXrayAuditPoetry(t, string(format.SimpleJson))
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, 3, 1)
+}
+
+func testXrayAuditPoetry(t *testing.T, format string) string {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ poetryProjectPath := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "package-managers", "python", "poetry", "poetry-project")
+ // Copy the poetry project from the testdata to a temp dir
+ assert.NoError(t, biutils.CopyDir(poetryProjectPath, tempDirPath, true, nil))
+ prevWd := securityTestUtils.ChangeWD(t, tempDirPath)
+ defer clientTests.ChangeDirAndAssert(t, prevWd)
+ // Add a dummy descriptor file to verify that only the requested technology is audited
+ addDummyPackageDescriptor(t, false)
+ return securityTests.PlatformCli.RunCliCmdWithOutput(t, "audit", "--poetry", "--licenses", "--format="+format)
+}
+
+// addDummyPackageDescriptor creates an empty descriptor file in the working directory,
+// so the tests can verify that only the explicitly requested technology is audited.
+// If the project already contains a package.json, a dummy pom.xml is created instead.
+func addDummyPackageDescriptor(t *testing.T, hasPackageJson bool) {
+ descriptor := "package.json"
+ if hasPackageJson {
+ descriptor = "pom.xml"
+ }
+ dummyFile, err := os.Create(descriptor)
+ assert.NoError(t, err)
+ assert.NoError(t, dummyFile.Close())
+}
+
+// JAS
+
+func TestXrayAuditJasSimpleJson(t *testing.T) {
+ output := testXrayAuditJas(t, string(format.SimpleJson), filepath.Join("jas", "jas-test"))
+ securityTestUtils.VerifySimpleJsonJasResults(t, output, 1, 9, 7, 2)
+}
+
+func TestXrayAuditJasSimpleJsonWithConfig(t *testing.T) {
+ output := testXrayAuditJas(t, string(format.SimpleJson), filepath.Join("jas", "jas-config"))
+ securityTestUtils.VerifySimpleJsonJasResults(t, output, 0, 0, 1, 2)
+}
+
+func TestXrayAuditJasNoViolationsSimpleJson(t *testing.T) {
+ output := testXrayAuditJas(t, string(format.SimpleJson), filepath.Join("package-managers", "npm", "npm"))
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, 1, 0)
+ securityTestUtils.VerifySimpleJsonJasResults(t, output, 0, 0, 0, 0)
+}
+
+func testXrayAuditJas(t *testing.T, format string, project string) string {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ projectDir := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", project)
+ // Copy the test project from the testdata to a temp dir
+ assert.NoError(t, biutils.CopyDir(projectDir, tempDirPath, true, nil))
+ // Configure a new server named "default"
+ securityTestUtils.CreateJfrogHomeConfig(t, true)
+ defer securityTestUtils.CleanTestsHomeEnv()
+ baseWd, err := os.Getwd()
+ assert.NoError(t, err)
+ chdirCallback := clientTests.ChangeDirWithCallback(t, baseWd, tempDirPath)
+ defer chdirCallback()
+ return securityTests.PlatformCli.WithoutCredentials().RunCliCmdWithOutput(t, "audit", "--format="+format)
+}
+
+func TestXrayAuditDetectTech(t *testing.T) {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ mvnProjectPath := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "package-managers", "maven", "maven")
+ // Copy the maven project from the testdata to a temp dir
+ assert.NoError(t, biutils.CopyDir(mvnProjectPath, tempDirPath, true, nil))
+ prevWd := securityTestUtils.ChangeWD(t, tempDirPath)
+ defer clientTests.ChangeDirAndAssert(t, prevWd)
+ // Run generic audit on mvn project with a vulnerable dependency
+ output := securityTests.PlatformCli.RunCliCmdWithOutput(t, "audit", "--licenses", "--format="+string(format.SimpleJson))
+ var results formats.SimpleJsonResults
+ err := json.Unmarshal([]byte(output), &results)
+ assert.NoError(t, err)
+ // Expect the ImpactedDependencyType of the known vulnerability to be maven
+ assert.Equal(t, "maven", strings.ToLower(results.Vulnerabilities[0].ImpactedDependencyType))
+}
+
+func TestXrayRecursiveScan(t *testing.T) {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ projectDir := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "package-managers")
+ // Creating an inner NPM project
+ npmDirPath, err := os.MkdirTemp(tempDirPath, "npm-project")
+ assert.NoError(t, err)
+ npmProjectToCopyPath := filepath.Join(projectDir, "npm", "npm")
+ assert.NoError(t, biutils.CopyDir(npmProjectToCopyPath, npmDirPath, true, nil))
+
+ // Creating an inner .NET project
+ dotnetDirPath, err := os.MkdirTemp(tempDirPath, "dotnet-project")
+ assert.NoError(t, err)
+ dotnetProjectToCopyPath := filepath.Join(projectDir, "dotnet", "dotnet-single")
+ assert.NoError(t, biutils.CopyDir(dotnetProjectToCopyPath, dotnetDirPath, true, nil))
+
+ curWd, err := os.Getwd()
+ assert.NoError(t, err)
+
+ chDirCallback := clientTests.ChangeDirWithCallback(t, curWd, tempDirPath)
+ defer chDirCallback()
+
+ // A recursive scan should cover both the inner npm project and the inner .NET project
+ output := securityTests.PlatformCli.RunCliCmdWithOutput(t, "audit", "--format=json")
+
+ // Expect five vulnerabilities in total: four from the .NET project and one from the npm project
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, 5, 0)
+
+ var results []services.ScanResponse
+ err = json.Unmarshal([]byte(output), &results)
+ assert.NoError(t, err)
+ // Expect two result sets, one for each of the two inner projects
+ assert.Len(t, results, 2)
+}
diff --git a/cli/cli.go b/cli/cli.go
index bb56ee08..eb25dfaf 100644
--- a/cli/cli.go
+++ b/cli/cli.go
@@ -1,15 +1,19 @@
package cli
import (
+ "github.com/jfrog/jfrog-cli-core/v2/common/cliutils"
"github.com/jfrog/jfrog-cli-core/v2/plugins/components"
)
func GetJfrogCliSecurityApp() components.App {
- app := components.CreateApp(
+ app := components.CreateEmbeddedApp(
"security",
- "v1.0.0",
- "Jfrog Security CLI embedded plugin",
- []components.Command{},
+ getAuditAndScansCommands(),
)
+ app.Subcommands = append(app.Subcommands, components.Namespace{
+ Name: string(cliutils.Xr),
+ Description: "Xray commands.",
+ Commands: getXrayNameSpaceCommands(),
+ })
return app
}
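The embedded-app wiring in `GetJfrogCliSecurityApp` can be sketched with pared-down stand-ins for the framework types; the struct shapes below are assumptions for illustration, not the real `components` API from jfrog-cli-core:

```go
package main

import "fmt"

// Command, Namespace, and App are simplified stand-ins for
// components.Command, components.Namespace, and components.App.
type Command struct{ Name string }

type Namespace struct {
	Name     string
	Commands []Command
}

type App struct {
	Name        string
	Commands    []Command
	Subcommands []Namespace
}

// newSecurityApp mirrors the shape of GetJfrogCliSecurityApp: top-level
// audit/scan commands, plus an "xr" namespace for Xray-specific commands.
func newSecurityApp() App {
	app := App{
		Name:     "security",
		Commands: []Command{{Name: "audit"}, {Name: "scan"}},
	}
	app.Subcommands = append(app.Subcommands, Namespace{
		Name:     "xr",
		Commands: []Command{{Name: "curl"}, {Name: "offline-update"}},
	})
	return app
}

func main() {
	app := newSecurityApp()
	fmt.Println(len(app.Commands), len(app.Subcommands))
}
```

Grouping Xray-only commands under a namespace keeps the top-level command list focused on the audit and scan workflows while still exposing `xr curl` and `xr offline-update` style invocations.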
diff --git a/cli/docs/auditspecific/help.go b/cli/docs/auditspecific/help.go
new file mode 100644
index 00000000..9a2e9996
--- /dev/null
+++ b/cli/docs/auditspecific/help.go
@@ -0,0 +1,31 @@
+package auditspecific
+
+import "fmt"
+
+// TODO: Deprecated commands (remove at next CLI major version)
+
+const descFormat = "Execute an audit %s command, using the configured Xray details."
+
+func GetGoDescription() string {
+ return fmt.Sprintf(descFormat, "Go")
+}
+
+func GetGradleDescription() string {
+ return fmt.Sprintf(descFormat, "Gradle")
+}
+
+func GetMvnDescription() string {
+ return fmt.Sprintf(descFormat, "Maven")
+}
+
+func GetNpmDescription() string {
+ return fmt.Sprintf(descFormat, "Npm")
+}
+
+func GetPipDescription() string {
+ return fmt.Sprintf(descFormat, "Pip")
+}
+
+func GetPipenvDescription() string {
+ return fmt.Sprintf(descFormat, "Pipenv")
+}
diff --git a/cli/docs/flags.go b/cli/docs/flags.go
new file mode 100644
index 00000000..f14b53b8
--- /dev/null
+++ b/cli/docs/flags.go
@@ -0,0 +1,226 @@
+package docs
+
+import (
+ "fmt"
+ "strings"
+
+ "github.com/jfrog/jfrog-cli-core/v2/common/cliutils"
+ pluginsCommon "github.com/jfrog/jfrog-cli-core/v2/plugins/common"
+ "github.com/jfrog/jfrog-cli-core/v2/plugins/components"
+ "github.com/jfrog/jfrog-cli-security/commands/audit"
+ "github.com/jfrog/jfrog-cli-security/commands/curation"
+ "github.com/jfrog/jfrog-cli-security/commands/xray/offlineupdate"
+)
+
+const (
+ // Security Commands Keys
+ XrCurl = "xr-curl"
+ OfflineUpdate = "offline-update"
+ XrScan = "xr-scan"
+ BuildScan = "build-scan"
+ DockerScan = "docker scan"
+ Audit = "audit"
+ CurationAudit = "curation-audit"
+
+ // TODO: Deprecated commands (remove at next CLI major version)
+ AuditMvn = "audit-maven"
+ AuditGradle = "audit-gradle"
+ AuditNpm = "audit-npm"
+ AuditGo = "audit-go"
+ AuditPip = "audit-pip"
+ AuditPipenv = "audit-pipenv"
+)
+
+const (
+ Mvn = "mvn"
+ Gradle = "gradle"
+ Npm = "npm"
+ Yarn = "yarn"
+ Nuget = "nuget"
+ Go = "go"
+ Pip = "pip"
+ Pipenv = "pipenv"
+ Poetry = "poetry"
+)
+
+const (
+ // Base flags keys
+ ServerId = "server-id"
+ url = "url"
+ user = "user"
+ password = "password"
+ accessToken = "access-token"
+
+ // Client certification flags
+ InsecureTls = "insecure-tls"
+
+ // Generic command flags
+ SpecFlag = "spec"
+ Threads = "threads"
+ Recursive = "recursive"
+ RegexpFlag = "regexp"
+ AntFlag = "ant"
+ Project = "project"
+ Exclusions = "exclusions"
+ IncludeDirs = "include-dirs"
+ UseWrapper = "use-wrapper"
+)
+
+const (
+ // Unique offline-update flags keys
+ LicenseId = "license-id"
+ From = "from"
+ To = "to"
+ Version = "version"
+ Target = "target"
+ Stream = "stream"
+ Periodic = "periodic"
+
+ // Unique scan and audit flags
+ scanPrefix = "scan-"
+ scanRecursive = scanPrefix + Recursive
+ scanRegexp = scanPrefix + RegexpFlag
+ scanAnt = scanPrefix + AntFlag
+ OutputFormat = "format"
+ BypassArchiveLimits = "bypass-archive-limits"
+ Watches = "watches"
+ RepoPath = "repo-path"
+ Licenses = "licenses"
+ Fail = "fail"
+ ExtendedTable = "extended-table"
+ MinSeverity = "min-severity"
+ FixableOnly = "fixable-only"
+ Rescan = "rescan"
+ Vuln = "vuln"
+
+ // Unique audit flags
+ auditPrefix = "audit-"
+ ExclusionsAudit = auditPrefix + Exclusions
+ useWrapperAudit = auditPrefix + UseWrapper
+ ExcludeTestDeps = "exclude-test-deps"
+ DepType = "dep-type"
+ ThirdPartyContextualAnalysis = "third-party-contextual-analysis"
+ RequirementsFile = "requirements-file"
+ WorkingDirs = "working-dirs"
+
+ // Unique curation flags
+ CurationOutput = "curation-format"
+ CurationThreads = "curation-threads"
+)
+
+// Mapping between security commands (key) and their flags (key).
+var commandFlags = map[string][]string{
+ XrCurl: {ServerId},
+ OfflineUpdate: {LicenseId, From, To, Version, Target, Stream, Periodic},
+ XrScan: {
+ url, user, password, accessToken, ServerId, SpecFlag, Threads, scanRecursive, scanRegexp, scanAnt,
+ Project, Watches, RepoPath, Licenses, OutputFormat, Fail, ExtendedTable, BypassArchiveLimits, MinSeverity, FixableOnly,
+ },
+ BuildScan: {
+ url, user, password, accessToken, ServerId, Project, Vuln, OutputFormat, Fail, ExtendedTable, Rescan,
+ },
+ DockerScan: {
+ ServerId, Project, Watches, RepoPath, Licenses, OutputFormat, Fail, ExtendedTable, BypassArchiveLimits, MinSeverity, FixableOnly,
+ },
+ Audit: {
+ url, user, password, accessToken, ServerId, InsecureTls, Project, Watches, RepoPath, Licenses, OutputFormat, ExcludeTestDeps,
+ useWrapperAudit, DepType, RequirementsFile, Fail, ExtendedTable, WorkingDirs, ExclusionsAudit, Mvn, Gradle, Npm, Yarn, Go, Nuget, Pip, Pipenv, Poetry, MinSeverity, FixableOnly, ThirdPartyContextualAnalysis,
+ },
+ CurationAudit: {
+ CurationOutput, WorkingDirs, CurationThreads,
+ },
+ // TODO: Deprecated commands (remove at next CLI major version)
+ AuditMvn: {
+ url, user, password, accessToken, ServerId, InsecureTls, Project, ExclusionsAudit, Watches, RepoPath, Licenses, OutputFormat, Fail, ExtendedTable, useWrapperAudit,
+ },
+ AuditGradle: {
+ url, user, password, accessToken, ServerId, ExcludeTestDeps, ExclusionsAudit, useWrapperAudit, Project, Watches, RepoPath, Licenses, OutputFormat, Fail, ExtendedTable,
+ },
+ AuditNpm: {
+ url, user, password, accessToken, ServerId, DepType, Project, ExclusionsAudit, Watches, RepoPath, Licenses, OutputFormat, Fail, ExtendedTable,
+ },
+ AuditGo: {
+ url, user, password, accessToken, ServerId, Project, ExclusionsAudit, Watches, RepoPath, Licenses, OutputFormat, Fail, ExtendedTable,
+ },
+ AuditPip: {
+ url, user, password, accessToken, ServerId, RequirementsFile, Project, ExclusionsAudit, Watches, RepoPath, Licenses, OutputFormat, Fail, ExtendedTable,
+ },
+ AuditPipenv: {
+ url, user, password, accessToken, ServerId, Project, ExclusionsAudit, Watches, RepoPath, Licenses, OutputFormat, ExtendedTable,
+ },
+}
+
+// Security Flag keys mapped to their corresponding components.Flag definition.
+var flagsMap = map[string]components.Flag{
+ // Common commands flags
+ ServerId: components.NewStringFlag(ServerId, "Server ID configured using the config command."),
+ url: components.NewStringFlag(url, "JFrog Xray URL."),
+ user: components.NewStringFlag(user, "JFrog username."),
+ password: components.NewStringFlag(password, "JFrog password."),
+ accessToken: components.NewStringFlag(accessToken, "JFrog access token."),
+ Threads: components.NewStringFlag(Threads, "Number of working threads.", components.WithIntDefaultValue(cliutils.Threads)),
+ // Xray flags
+ LicenseId: components.NewStringFlag(LicenseId, "Xray license ID.", components.SetMandatory(), components.WithHelpValue("Xray license ID")),
+ From: components.NewStringFlag(From, "From update date in YYYY-MM-DD format."),
+ To: components.NewStringFlag(To, "To update date in YYYY-MM-DD format."),
+ Version: components.NewStringFlag(Version, "Xray API version."),
+ Target: components.NewStringFlag(Target, "Target directory to download the updates to.", components.WithStrDefaultValue("./")),
+ Stream: components.NewStringFlag(Stream, fmt.Sprintf("Xray DBSync V3 stream. Possible values are: %s.", offlineupdate.NewValidStreams().GetValidStreamsString())),
+ Periodic: components.NewBoolFlag(Periodic, fmt.Sprintf("Set to true to get the Xray DBSync V3 Periodic Package (Use with %s flag).", Stream)),
+ // Scan flags
+ SpecFlag: components.NewStringFlag(SpecFlag, "Path to a File Spec."),
+ scanRecursive: components.NewBoolFlag(Recursive, "Set to false if you do not wish to collect artifacts in sub-folders to be scanned by Xray.", components.WithBoolDefaultValue(true)),
+ scanRegexp: components.NewBoolFlag(RegexpFlag, "Set to true to use a regular expression instead of a wildcard expression to collect files to scan."),
+ scanAnt: components.NewBoolFlag(AntFlag, "Set to true to use an Ant pattern instead of a wildcard expression to collect files to scan."),
+ Project: components.NewStringFlag(Project, "JFrog Artifactory project key."),
+ Watches: components.NewStringFlag(Watches, "A comma-separated list of Xray watches, used to determine which violations Xray should generate."),
+ RepoPath: components.NewStringFlag(RepoPath, "Target repository path, used by Xray to determine the relevant watches."),
+ Licenses: components.NewBoolFlag(Licenses, "Set to true if you'd like to receive licenses from Xray scanning."),
+ OutputFormat: components.NewStringFlag(
+ OutputFormat,
+ "Defines the output format of the command. Acceptable values are: table, json, simple-json and sarif. Note: the json format doesn't include information about scans that are included as part of the Advanced Security package.",
+ components.WithStrDefaultValue("table"),
+ ),
+ Fail: components.NewBoolFlag(Fail, "Set to false if you do not wish the command to return exit code 3, even if the 'Fail Build' rule is matched by Xray.", components.WithBoolDefaultValue(true)),
+ ExtendedTable: components.NewBoolFlag(ExtendedTable, "Set to true if you'd like the table to include extended fields such as 'CVSS' & 'Xray Issue Id'. Ignored if provided 'format' is not 'table'."),
+ BypassArchiveLimits: components.NewBoolFlag(BypassArchiveLimits, "Set to true to bypass the indexer-app archive limits."),
+ MinSeverity: components.NewStringFlag(MinSeverity, "Set the minimum severity of issues to display. The following values are accepted: Low, Medium, High or Critical."),
+ FixableOnly: components.NewBoolFlag(FixableOnly, "Set to true if you wish to display issues that have a fixed version only."),
+ Rescan: components.NewBoolFlag(Rescan, "Set to true when scanning an already successfully scanned build, for example after adding an ignore rule."),
+ Vuln: components.NewBoolFlag(Vuln, "Set to true if you'd like to receive an additional view of all vulnerabilities, regardless of the policy configured in Xray. Ignored if provided 'format' is 'sarif'."),
+ InsecureTls: components.NewBoolFlag(InsecureTls, "Set to true to skip TLS certificates verification."),
+ ExcludeTestDeps: components.NewBoolFlag(ExcludeTestDeps, "[Gradle] Set to true if you'd like to exclude Gradle test dependencies from Xray scanning."),
+ useWrapperAudit: components.NewBoolFlag(
+ UseWrapper,
+ "Set to false if you do not wish to use the Gradle or Maven wrapper.",
+ components.WithBoolDefaultValue(true),
+ ),
+ WorkingDirs: components.NewStringFlag(WorkingDirs, "A comma-separated list of relative working directories, used to determine the audit target locations."),
+ ExclusionsAudit: components.NewStringFlag(
+ Exclusions,
+ "A semicolon-separated list of exclusion patterns, used to skip sub-projects during the audit. The patterns may include the * and ? wildcards.",
+ components.WithStrDefaultValue(strings.Join(audit.DefaultExcludePatterns, ";")),
+ ),
+ Mvn: components.NewBoolFlag(Mvn, "Set to true to request audit for a Maven project."),
+ Gradle: components.NewBoolFlag(Gradle, "Set to true to request audit for a Gradle project."),
+ Npm: components.NewBoolFlag(Npm, "Set to true to request audit for an npm project."),
+ Yarn: components.NewBoolFlag(Yarn, "Set to true to request audit for a Yarn project."),
+ Nuget: components.NewBoolFlag(Nuget, "Set to true to request audit for a .NET project."),
+ Pip: components.NewBoolFlag(Pip, "Set to true to request audit for a Pip project."),
+ Pipenv: components.NewBoolFlag(Pipenv, "Set to true to request audit for a Pipenv project."),
+ Poetry: components.NewBoolFlag(Poetry, "Set to true to request audit for a Poetry project."),
+ Go: components.NewBoolFlag(Go, "Set to true to request audit for a Go project."),
+ DepType: components.NewStringFlag(DepType, "[npm] Defines npm dependencies type. Possible values are: all, devOnly and prodOnly."),
+ ThirdPartyContextualAnalysis: components.NewBoolFlag(
+ ThirdPartyContextualAnalysis,
+ "[npm] When set, the Contextual Analysis scan also uses the code of the project dependencies to determine the applicability of the vulnerability.",
+ components.SetHiddenBoolFlag(),
+ ),
+ RequirementsFile: components.NewStringFlag(RequirementsFile, "[Pip] Defines pip requirements file name. For example: 'requirements.txt'."),
+ CurationThreads: components.NewStringFlag(Threads, "Number of working threads.", components.WithIntDefaultValue(curation.TotalConcurrentRequests)),
+ CurationOutput: components.NewStringFlag(OutputFormat, "Defines the output format of the command. Acceptable values are: table, json.", components.WithStrDefaultValue("table")),
+}
+
+func GetCommandFlags(cmdKey string) []components.Flag {
+ return pluginsCommon.GetCommandFlags(cmdKey, commandFlags, flagsMap)
+}
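The two-map design above (command key to flag keys, flag key to flag definition) lets commands share flag definitions without duplication; `GetCommandFlags` simply joins the two maps. A self-contained sketch of that resolution step, using simplified stand-in types and a small subset of the tables above:

```go
package main

import "fmt"

// Flag is a pared-down stand-in for components.Flag, for illustration only.
type Flag struct {
	Name        string
	Description string
}

// Command keys mapped to the flag keys they accept (a subset of the real tables).
var commandFlags = map[string][]string{
	"xr-curl": {"server-id"},
	"audit":   {"server-id", "licenses", "format"},
}

// Flag keys mapped to their full definitions, shared across commands.
var flagsMap = map[string]Flag{
	"server-id": {Name: "server-id", Description: "Server ID configured using the config command."},
	"licenses":  {Name: "licenses", Description: "Include license information in the results."},
	"format":    {Name: "format", Description: "Output format."},
}

// getCommandFlags resolves a command key to its flag definitions, mirroring
// what pluginsCommon.GetCommandFlags does with the two maps above.
func getCommandFlags(cmdKey string) []Flag {
	var resolved []Flag
	for _, key := range commandFlags[cmdKey] {
		if def, ok := flagsMap[key]; ok {
			resolved = append(resolved, def)
		}
	}
	return resolved
}

func main() {
	for _, f := range getCommandFlags("audit") {
		fmt.Println(f.Name)
	}
}
```

Because each flag is defined once in `flagsMap`, changing a description or default propagates to every command that lists its key.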
diff --git a/cli/docs/scan/audit/help.go b/cli/docs/scan/audit/help.go
new file mode 100644
index 00000000..cf8fae98
--- /dev/null
+++ b/cli/docs/scan/audit/help.go
@@ -0,0 +1,5 @@
+package audit
+
+func GetDescription() string {
+ return "Audit your local project's dependencies by generating a dependency tree and scanning it with Xray."
+}
diff --git a/cli/docs/scan/buildscan/help.go b/cli/docs/scan/buildscan/help.go
new file mode 100644
index 00000000..a8e5a750
--- /dev/null
+++ b/cli/docs/scan/buildscan/help.go
@@ -0,0 +1,20 @@
+package buildscan
+
+import "github.com/jfrog/jfrog-cli-core/v2/plugins/components"
+
+func GetDescription() string {
+ return "Scan a published build-info with Xray."
+}
+
+func GetArguments() []components.Argument {
+ return []components.Argument{
+ {
+ Name: "build name",
+ Description: "Build name.",
+ },
+ {
+ Name: "build number",
+ Description: "Build number.",
+ },
+ }
+}
diff --git a/cli/docs/scan/curation/help.go b/cli/docs/scan/curation/help.go
new file mode 100644
index 00000000..42521f78
--- /dev/null
+++ b/cli/docs/scan/curation/help.go
@@ -0,0 +1,5 @@
+package curation
+
+func GetDescription() string {
+ return "Audit your project dependencies for their curation status."
+}
diff --git a/cli/docs/scan/dockerscan/help.go b/cli/docs/scan/dockerscan/help.go
new file mode 100644
index 00000000..890b2d5f
--- /dev/null
+++ b/cli/docs/scan/dockerscan/help.go
@@ -0,0 +1,18 @@
+package dockerscan
+
+import "github.com/jfrog/jfrog-cli-core/v2/plugins/components"
+
+var Usage = []string{"docker scan <image tag>"}
+
+func GetDescription() string {
+ return "Scan local docker image using the docker client and Xray."
+}
+
+func GetArguments() []components.Argument {
+ return []components.Argument{
+ {
+ Name: "image tag",
+ Description: "The docker image tag to scan.",
+ },
+ }
+}
diff --git a/cli/docs/scan/scan/help.go b/cli/docs/scan/scan/help.go
new file mode 100644
index 00000000..f4de68b0
--- /dev/null
+++ b/cli/docs/scan/scan/help.go
@@ -0,0 +1,16 @@
+package scan
+
+import (
+ "github.com/jfrog/jfrog-cli-core/v2/plugins/components"
+ "github.com/jfrog/jfrog-cli-security/cli/docs"
+)
+
+func GetDescription() string {
+ return "Scan files located on the local file-system with Xray."
+}
+
+func GetArguments() []components.Argument {
+ return []components.Argument{{Name: "source pattern", ReplaceWithFlag: docs.SpecFlag, Description: `Specifies the local file system path of the files to be scanned.
+ You can specify multiple files by using wildcards, an Ant pattern, or a regular expression.
+ If you are using regular expressions, the first one in the argument must be enclosed in parentheses.`}}
+}
diff --git a/cli/docs/xray/curl/help.go b/cli/docs/xray/curl/help.go
new file mode 100644
index 00000000..ebba4806
--- /dev/null
+++ b/cli/docs/xray/curl/help.go
@@ -0,0 +1,11 @@
+package curl
+
+import "github.com/jfrog/jfrog-cli-core/v2/plugins/components"
+
+func GetDescription() string {
+ return "Execute a cURL command, using the configured Xray details."
+}
+
+func GetArguments() []components.Argument {
+ return []components.Argument{{Name: "curl command", Description: "cURL command to run."}}
+}
diff --git a/cli/docs/xray/offlineupdate/help.go b/cli/docs/xray/offlineupdate/help.go
new file mode 100644
index 00000000..f37f8f84
--- /dev/null
+++ b/cli/docs/xray/offlineupdate/help.go
@@ -0,0 +1,5 @@
+package offlineupdate
+
+func GetDescription() string {
+ return "Download Xray offline updates."
+}
diff --git a/cli/scancommands.go b/cli/scancommands.go
new file mode 100644
index 00000000..5493556d
--- /dev/null
+++ b/cli/scancommands.go
@@ -0,0 +1,474 @@
+package cli
+
+import (
+ "os"
+ "strings"
+
+ "github.com/jfrog/jfrog-cli-core/v2/common/cliutils"
+ commandsCommon "github.com/jfrog/jfrog-cli-core/v2/common/commands"
+ outputFormat "github.com/jfrog/jfrog-cli-core/v2/common/format"
+ "github.com/jfrog/jfrog-cli-core/v2/common/progressbar"
+ "github.com/jfrog/jfrog-cli-core/v2/common/spec"
+ pluginsCommon "github.com/jfrog/jfrog-cli-core/v2/plugins/common"
+ "github.com/jfrog/jfrog-cli-core/v2/plugins/components"
+ coreConfig "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-client-go/utils/errorutils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+
+ flags "github.com/jfrog/jfrog-cli-security/cli/docs"
+ auditSpecificDocs "github.com/jfrog/jfrog-cli-security/cli/docs/auditspecific"
+ auditDocs "github.com/jfrog/jfrog-cli-security/cli/docs/scan/audit"
+ buildScanDocs "github.com/jfrog/jfrog-cli-security/cli/docs/scan/buildscan"
+ curationDocs "github.com/jfrog/jfrog-cli-security/cli/docs/scan/curation"
+ dockerScanDocs "github.com/jfrog/jfrog-cli-security/cli/docs/scan/dockerscan"
+ scanDocs "github.com/jfrog/jfrog-cli-security/cli/docs/scan/scan"
+
+ "github.com/jfrog/jfrog-cli-security/commands/audit"
+ "github.com/jfrog/jfrog-cli-security/commands/curation"
+ "github.com/jfrog/jfrog-cli-security/commands/scan"
+ "github.com/jfrog/jfrog-cli-security/utils"
+)
+
+const auditScanCategory = "Audit & Scan"
+
+const dockerScanCmdHiddenName = "dockerscan"
+
+func getAuditAndScansCommands() []components.Command {
+ return []components.Command{
+ {
+ Name: "scan",
+ Aliases: []string{"s"},
+ Flags: flags.GetCommandFlags(flags.XrScan),
+ Description: scanDocs.GetDescription(),
+ Arguments: scanDocs.GetArguments(),
+ Category: auditScanCategory,
+ Action: ScanCmd,
+ },
+ {
+ Name: "build-scan",
+ Aliases: []string{"bs"},
+ Flags: flags.GetCommandFlags(flags.BuildScan),
+ Description: buildScanDocs.GetDescription(),
+ Arguments: buildScanDocs.GetArguments(),
+ Category: auditScanCategory,
+ Action: BuildScan,
+ },
+ {
+ // This command is hidden and has no logic; it runs only to provide 'help' as part of the buildtools CLI 'docker' commands ('jf docker scan').
+ // CLI buildtools will run the command if requested: https://github.com/jfrog/jfrog-cli/blob/v2/buildtools/cli.go
+ Name: dockerScanCmdHiddenName,
+ Flags: flags.GetCommandFlags(flags.DockerScan),
+ Description: dockerScanDocs.GetDescription(),
+ Arguments: dockerScanDocs.GetArguments(),
+ UsageOptions: &components.UsageOptions{
+ Usage: dockerScanDocs.Usage,
+ ReplaceAutoGeneratedUsage: true,
+ },
+ Hidden: true,
+ },
+ {
+ Name: "audit",
+ Aliases: []string{"aud"},
+ Flags: flags.GetCommandFlags(flags.Audit),
+ Description: auditDocs.GetDescription(),
+ Category: auditScanCategory,
+ Action: AuditCmd,
+ },
+ {
+ Name: "curation-audit",
+ Aliases: []string{"ca"},
+ Flags: flags.GetCommandFlags(flags.CurationAudit),
+ Description: curationDocs.GetDescription(),
+ Category: auditScanCategory,
+ Action: CurationCmd,
+ },
+
+ // TODO: Deprecated commands (remove at next CLI major version)
+ {
+ Name: "audit-mvn",
+ Aliases: []string{"am"},
+ Flags: flags.GetCommandFlags(flags.AuditMvn),
+ Description: auditSpecificDocs.GetMvnDescription(),
+ Action: func(c *components.Context) error {
+ return AuditSpecificCmd(c, coreutils.Maven)
+ },
+ Hidden: true,
+ },
+ {
+ Name: "audit-gradle",
+ Aliases: []string{"ag"},
+ Flags: flags.GetCommandFlags(flags.AuditGradle),
+ Description: auditSpecificDocs.GetGradleDescription(),
+ Action: func(c *components.Context) error {
+ return AuditSpecificCmd(c, coreutils.Gradle)
+ },
+ Hidden: true,
+ },
+ {
+ Name: "audit-npm",
+ Aliases: []string{"an"},
+ Flags: flags.GetCommandFlags(flags.AuditNpm),
+ Description: auditSpecificDocs.GetNpmDescription(),
+ Action: func(c *components.Context) error {
+ return AuditSpecificCmd(c, coreutils.Npm)
+ },
+ Hidden: true,
+ },
+ {
+ Name: "audit-go",
+ Aliases: []string{"ago"},
+ Flags: flags.GetCommandFlags(flags.AuditGo),
+ Description: auditSpecificDocs.GetGoDescription(),
+ Action: func(c *components.Context) error {
+ return AuditSpecificCmd(c, coreutils.Go)
+ },
+ Hidden: true,
+ },
+ {
+ Name: "audit-pip",
+ Aliases: []string{"ap"},
+ Flags: flags.GetCommandFlags(flags.AuditPip),
+ Description: auditSpecificDocs.GetPipDescription(),
+ Action: func(c *components.Context) error {
+ return AuditSpecificCmd(c, coreutils.Pip)
+ },
+ Hidden: true,
+ },
+ {
+ Name: "audit-pipenv",
+ Aliases: []string{"ape"},
+ Flags: flags.GetCommandFlags(flags.AuditPipenv),
+ Description: auditSpecificDocs.GetPipenvDescription(),
+ Action: func(c *components.Context) error {
+ return AuditSpecificCmd(c, coreutils.Pipenv)
+ },
+ Hidden: true,
+ },
+ }
+}
+
+func ScanCmd(c *components.Context) error {
+ if len(c.Arguments) == 0 && !c.IsFlagSet(flags.SpecFlag) {
+ return pluginsCommon.PrintHelpAndReturnError("providing either an argument or the 'spec' option is mandatory", c)
+ }
+ serverDetails, err := createServerDetailsWithConfigOffer(c)
+ if err != nil {
+ return err
+ }
+ err = validateXrayContext(c, serverDetails)
+ if err != nil {
+ return err
+ }
+ var specFile *spec.SpecFiles
+ if c.IsFlagSet(flags.SpecFlag) && len(c.GetStringFlagValue(flags.SpecFlag)) > 0 {
+ specFile, err = pluginsCommon.GetFileSystemSpec(c)
+ if err != nil {
+ return err
+ }
+ } else {
+ specFile = createDefaultScanSpec(c, addTrailingSlashToRepoPathIfNeeded(c))
+ }
+ err = spec.ValidateSpec(specFile.Files, false, false)
+ if err != nil {
+ return err
+ }
+ threads, err := pluginsCommon.GetThreadsCount(c)
+ if err != nil {
+ return err
+ }
+ format, err := outputFormat.GetOutputFormat(c.GetStringFlagValue(flags.OutputFormat))
+ if err != nil {
+ return err
+ }
+ pluginsCommon.FixWinPathsForFileSystemSourcedCmds(specFile, c)
+ minSeverity, err := utils.GetSeveritiesFormat(c.GetStringFlagValue(flags.MinSeverity))
+ if err != nil {
+ return err
+ }
+ scanCmd := scan.NewScanCommand().
+ SetServerDetails(serverDetails).
+ SetThreads(threads).
+ SetSpec(specFile).
+ SetOutputFormat(format).
+ SetProject(c.GetStringFlagValue(flags.Project)).
+ SetIncludeVulnerabilities(shouldIncludeVulnerabilities(c)).
+ SetIncludeLicenses(c.GetBoolFlagValue(flags.Licenses)).
+ SetFail(c.GetBoolFlagValue(flags.Fail)).
+ SetPrintExtendedTable(c.GetBoolFlagValue(flags.ExtendedTable)).
+ SetBypassArchiveLimits(c.GetBoolFlagValue(flags.BypassArchiveLimits)).
+ SetFixableOnly(c.GetBoolFlagValue(flags.FixableOnly)).
+ SetMinSeverityFilter(minSeverity)
+ if c.IsFlagSet(flags.Watches) {
+ scanCmd.SetWatches(splitByCommaAndTrim(c.GetStringFlagValue(flags.Watches)))
+ }
+ return commandsCommon.Exec(scanCmd)
+}
+
+func createServerDetailsWithConfigOffer(c *components.Context) (*coreConfig.ServerDetails, error) {
+ return pluginsCommon.CreateServerDetailsWithConfigOffer(c, true, cliutils.Xr)
+}
+
+func validateXrayContext(c *components.Context, serverDetails *coreConfig.ServerDetails) error {
+ if serverDetails.XrayUrl == "" {
+ return errorutils.CheckErrorf("JFrog Xray URL must be provided in order to run this command. Use the 'jf c add' command to set the Xray server details.")
+ }
+ contextFlag := 0
+ if c.GetStringFlagValue(flags.Watches) != "" {
+ contextFlag++
+ }
+ if isProjectProvided(c) {
+ contextFlag++
+ }
+ if c.GetStringFlagValue(flags.RepoPath) != "" {
+ contextFlag++
+ }
+ if contextFlag > 1 {
+ return errorutils.CheckErrorf("only one of the following flags can be supplied: --watches, --project or --repo-path")
+ }
+ return nil
+}
+
+func isProjectProvided(c *components.Context) bool {
+ return c.GetStringFlagValue(flags.Project) != "" || os.Getenv(coreutils.Project) != ""
+}
+
+func addTrailingSlashToRepoPathIfNeeded(c *components.Context) string {
+ repoPath := c.GetStringFlagValue(flags.RepoPath)
+ if repoPath != "" && !strings.Contains(repoPath, "/") {
+ // In case only a repo name was provided (no path), add a trailing slash.
+ repoPath += "/"
+ }
+ return repoPath
+}
+
+func createDefaultScanSpec(c *components.Context, defaultTarget string) *spec.SpecFiles {
+ return spec.NewBuilder().
+ Pattern(c.Arguments[0]).
+ Target(defaultTarget).
+ Recursive(c.GetBoolFlagValue(flags.Recursive)).
+ Exclusions(pluginsCommon.GetStringsArrFlagValue(c, flags.Exclusions)).
+ Regexp(c.GetBoolFlagValue(flags.RegexpFlag)).
+ Ant(c.GetBoolFlagValue(flags.AntFlag)).
+ IncludeDirs(c.GetBoolFlagValue(flags.IncludeDirs)).
+ BuildSpec()
+}
+
+func shouldIncludeVulnerabilities(c *components.Context) bool {
+ // If no context was provided by the user, Xray will not trigger any violations, so include general vulnerabilities in the command output.
+ return c.GetStringFlagValue(flags.Watches) == "" && !isProjectProvided(c) && c.GetStringFlagValue(flags.RepoPath) == ""
+}
+
+func splitByCommaAndTrim(paramValue string) (res []string) {
+ args := strings.Split(paramValue, ",")
+ res = make([]string, len(args))
+ for i, arg := range args {
+ res[i] = strings.TrimSpace(arg)
+ }
+ return
+}
+
+// Scan published builds with Xray
+func BuildScan(c *components.Context) error {
+ if len(c.Arguments) > 2 {
+ return pluginsCommon.WrongNumberOfArgumentsHandler(c)
+ }
+ buildConfiguration := pluginsCommon.CreateBuildConfiguration(c)
+ if err := buildConfiguration.ValidateBuildParams(); err != nil {
+ return err
+ }
+ serverDetails, err := createServerDetailsWithConfigOffer(c)
+ if err != nil {
+ return err
+ }
+ err = validateXrayContext(c, serverDetails)
+ if err != nil {
+ return err
+ }
+ format, err := outputFormat.GetOutputFormat(c.GetStringFlagValue(flags.OutputFormat))
+ if err != nil {
+ return err
+ }
+ buildScanCmd := scan.NewBuildScanCommand().
+ SetServerDetails(serverDetails).
+ SetFailBuild(c.GetBoolFlagValue(flags.Fail)).
+ SetBuildConfiguration(buildConfiguration).
+ SetOutputFormat(format).
+ SetPrintExtendedTable(c.GetBoolFlagValue(flags.ExtendedTable)).
+ SetRescan(c.GetBoolFlagValue(flags.Rescan))
+ if format != outputFormat.Sarif {
+ // SARIF output shouldn't include the additional all-vulnerabilities info that is added by the 'vuln' flag.
+ buildScanCmd.SetIncludeVulnerabilities(c.GetBoolFlagValue(flags.Vuln))
+ }
+ return commandsCommon.Exec(buildScanCmd)
+}
+
+func AuditCmd(c *components.Context) error {
+ auditCmd, err := createAuditCmd(c)
+ if err != nil {
+ return err
+ }
+
+ // Check if the user provided specific technology flags
+ allTechnologies := coreutils.GetAllTechnologiesList()
+ technologies := []string{}
+ for _, tech := range allTechnologies {
+ var techExists bool
+ if tech == coreutils.Maven {
+ // For Maven we use the '--mvn' flag
+ techExists = c.GetBoolFlagValue(flags.Mvn)
+ } else {
+ techExists = c.GetBoolFlagValue(tech.String())
+ }
+ if techExists {
+ technologies = append(technologies, tech.String())
+ }
+ }
+ auditCmd.SetTechnologies(technologies)
+ return progressbar.ExecWithProgress(auditCmd)
+}
+
+func createAuditCmd(c *components.Context) (*audit.AuditCommand, error) {
+ auditCmd := audit.NewGenericAuditCommand()
+ serverDetails, err := createServerDetailsWithConfigOffer(c)
+ if err != nil {
+ return nil, err
+ }
+ err = validateXrayContext(c, serverDetails)
+ if err != nil {
+ return nil, err
+ }
+ format, err := outputFormat.GetOutputFormat(c.GetStringFlagValue(flags.OutputFormat))
+ if err != nil {
+ return nil, err
+ }
+ minSeverity, err := utils.GetSeveritiesFormat(c.GetStringFlagValue(flags.MinSeverity))
+ if err != nil {
+ return nil, err
+ }
+ auditCmd.SetTargetRepoPath(addTrailingSlashToRepoPathIfNeeded(c)).
+ SetProject(c.GetStringFlagValue(flags.Project)).
+ SetIncludeVulnerabilities(shouldIncludeVulnerabilities(c)).
+ SetIncludeLicenses(c.GetBoolFlagValue(flags.Licenses)).
+ SetFail(c.GetBoolFlagValue(flags.Fail)).
+ SetPrintExtendedTable(c.GetBoolFlagValue(flags.ExtendedTable)).
+ SetMinSeverityFilter(minSeverity).
+ SetFixableOnly(c.GetBoolFlagValue(flags.FixableOnly)).
+ SetThirdPartyApplicabilityScan(c.GetBoolFlagValue(flags.ThirdPartyContextualAnalysis)).
+ SetExclusions(pluginsCommon.GetStringsArrFlagValue(c, flags.Exclusions))
+
+ if c.GetStringFlagValue(flags.Watches) != "" {
+ auditCmd.SetWatches(splitByCommaAndTrim(c.GetStringFlagValue(flags.Watches)))
+ }
+
+ if c.GetStringFlagValue(flags.WorkingDirs) != "" {
+ auditCmd.SetWorkingDirs(splitByCommaAndTrim(c.GetStringFlagValue(flags.WorkingDirs)))
+ }
+ auditCmd.SetServerDetails(serverDetails).
+ SetExcludeTestDependencies(c.GetBoolFlagValue(flags.ExcludeTestDeps)).
+ SetOutputFormat(format).
+ SetUseWrapper(c.GetBoolFlagValue(flags.UseWrapper)).
+ SetInsecureTls(c.GetBoolFlagValue(flags.InsecureTls)).
+ SetNpmScope(c.GetStringFlagValue(flags.DepType)).
+ SetPipRequirementsFile(c.GetStringFlagValue(flags.RequirementsFile))
+ return auditCmd, err
+}
+
+func logNonGenericAuditCommandDeprecation(cmdName string) {
+ if cliutils.ShouldLogWarning() {
+ log.Warn(
+ `You are using a deprecated syntax of the command.
+ Instead of:
+ $ ` + coreutils.GetCliExecutableName() + ` ` + cmdName + ` ...
+ Use:
+ $ ` + coreutils.GetCliExecutableName() + ` audit ...`)
+ }
+}
+
+func AuditSpecificCmd(c *components.Context, technology coreutils.Technology) error {
+ logNonGenericAuditCommandDeprecation(c.CommandName)
+ auditCmd, err := createAuditCmd(c)
+ if err != nil {
+ return err
+ }
+ technologies := []string{string(technology)}
+ auditCmd.SetTechnologies(technologies)
+ return progressbar.ExecWithProgress(auditCmd)
+}
+
+func CurationCmd(c *components.Context) error {
+ threadsFlag, err := c.GetIntFlagValue(flags.Threads)
+ if err != nil {
+ return err
+ }
+ threads, err := curation.DetectNumOfThreads(threadsFlag)
+ if err != nil {
+ return err
+ }
+ curationAuditCommand := curation.NewCurationAuditCommand().
+ SetWorkingDirs(splitByCommaAndTrim(c.GetStringFlagValue(flags.WorkingDirs))).
+ SetParallelRequests(threads)
+
+ serverDetails, err := pluginsCommon.CreateServerDetailsWithConfigOffer(c, true, cliutils.Rt)
+ if err != nil {
+ return err
+ }
+ format, err := curation.GetCurationOutputFormat(c.GetStringFlagValue(flags.OutputFormat))
+ if err != nil {
+ return err
+ }
+ curationAuditCommand.SetServerDetails(serverDetails).
+ SetExcludeTestDependencies(c.GetBoolFlagValue(flags.ExcludeTestDeps)).
+ SetOutputFormat(format).
+ SetUseWrapper(c.GetBoolFlagValue(flags.UseWrapper)).
+ SetInsecureTls(c.GetBoolFlagValue(flags.InsecureTls)).
+ SetNpmScope(c.GetStringFlagValue(flags.DepType)).
+ SetPipRequirementsFile(c.GetStringFlagValue(flags.RequirementsFile))
+ return progressbar.ExecWithProgress(curationAuditCommand)
+}
+
+func DockerScan(c *components.Context, image string) error {
+ // Since this command is not registered normally, we need to handle printing 'help' here ourselves.
+ c.CommandName = dockerScanCmdHiddenName
+ printHelp := pluginsCommon.GetPrintCurrentCmdHelp(c)
+ if show, err := cliutils.ShowGenericCmdHelpIfNeeded(c.Arguments, printHelp); show || err != nil {
+ return err
+ }
+ if image == "" {
+ return printHelp()
+ }
+ // Run the command
+ serverDetails, err := createServerDetailsWithConfigOffer(c)
+ if err != nil {
+ return err
+ }
+ err = validateXrayContext(c, serverDetails)
+ if err != nil {
+ return err
+ }
+ containerScanCommand := scan.NewDockerScanCommand()
+ format, err := outputFormat.GetOutputFormat(c.GetStringFlagValue(flags.OutputFormat))
+ if err != nil {
+ return err
+ }
+ minSeverity, err := utils.GetSeveritiesFormat(c.GetStringFlagValue(flags.MinSeverity))
+ if err != nil {
+ return err
+ }
+ containerScanCommand.SetImageTag(image).
+ SetTargetRepoPath(addTrailingSlashToRepoPathIfNeeded(c)).
+ SetServerDetails(serverDetails).
+ SetOutputFormat(format).
+ SetProject(c.GetStringFlagValue(flags.Project)).
+ SetIncludeVulnerabilities(shouldIncludeVulnerabilities(c)).
+ SetIncludeLicenses(c.GetBoolFlagValue(flags.Licenses)).
+ SetFail(c.GetBoolFlagValue(flags.Fail)).
+ SetPrintExtendedTable(c.GetBoolFlagValue(flags.ExtendedTable)).
+ SetBypassArchiveLimits(c.GetBoolFlagValue(flags.BypassArchiveLimits)).
+ SetFixableOnly(c.GetBoolFlagValue(flags.FixableOnly)).
+ SetMinSeverityFilter(minSeverity)
+ if c.GetStringFlagValue(flags.Watches) != "" {
+ containerScanCommand.SetWatches(splitByCommaAndTrim(c.GetStringFlagValue(flags.Watches)))
+ }
+ return progressbar.ExecWithProgress(containerScanCommand)
+}
diff --git a/cli/xraycommands.go b/cli/xraycommands.go
new file mode 100644
index 00000000..da8aecab
--- /dev/null
+++ b/cli/xraycommands.go
@@ -0,0 +1,188 @@
+package cli
+
+import (
+ "time"
+
+ corecommon "github.com/jfrog/jfrog-cli-core/v2/common/commands"
+ pluginsCommon "github.com/jfrog/jfrog-cli-core/v2/plugins/common"
+ "github.com/jfrog/jfrog-cli-core/v2/plugins/components"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-client-go/utils/errorutils"
+
+ "github.com/jfrog/jfrog-cli-security/commands/xray/curl"
+ "github.com/jfrog/jfrog-cli-security/commands/xray/offlineupdate"
+
+ flags "github.com/jfrog/jfrog-cli-security/cli/docs"
+ auditSpecificDocs "github.com/jfrog/jfrog-cli-security/cli/docs/auditspecific"
+ scanDocs "github.com/jfrog/jfrog-cli-security/cli/docs/scan/scan"
+ curlDocs "github.com/jfrog/jfrog-cli-security/cli/docs/xray/curl"
+ offlineupdateDocs "github.com/jfrog/jfrog-cli-security/cli/docs/xray/offlineupdate"
+)
+
+func getXrayNameSpaceCommands() []components.Command {
+ return []components.Command{
+ {
+ Name: "curl",
+ Aliases: []string{"cl"},
+ Flags: flags.GetCommandFlags(flags.XrCurl),
+ Description: curlDocs.GetDescription(),
+ Arguments: curlDocs.GetArguments(),
+ SkipFlagParsing: true,
+ Action: curlCmd,
+ },
+ {
+ Name: "offline-update",
+ Aliases: []string{"ou"},
+ Flags: flags.GetCommandFlags(flags.OfflineUpdate),
+ Description: offlineupdateDocs.GetDescription(),
+ Action: offlineUpdates,
+ },
+
+ // TODO: Deprecated commands (remove at next CLI major version)
+ {
+ Name: "scan",
+ Aliases: []string{"s"},
+ Flags: flags.GetCommandFlags(flags.XrScan),
+ Description: scanDocs.GetDescription(),
+ Arguments: scanDocs.GetArguments(),
+ Action: func(c *components.Context) error {
+ return pluginsCommon.RunCmdWithDeprecationWarning("scan", "xr", c, ScanCmd)
+ },
+ },
+ {
+ Name: "audit-mvn",
+ Aliases: []string{"am"},
+ Flags: flags.GetCommandFlags(flags.AuditMvn),
+ Description: auditSpecificDocs.GetMvnDescription(),
+ Action: func(c *components.Context) error {
+ return AuditSpecificCmd(c, coreutils.Maven)
+ },
+ },
+ {
+ Name: "audit-gradle",
+ Aliases: []string{"ag"},
+ Flags: flags.GetCommandFlags(flags.AuditGradle),
+ Description: auditSpecificDocs.GetGradleDescription(),
+ Action: func(c *components.Context) error {
+ return AuditSpecificCmd(c, coreutils.Gradle)
+ },
+ },
+ {
+ Name: "audit-npm",
+ Aliases: []string{"an"},
+ Flags: flags.GetCommandFlags(flags.AuditNpm),
+ Description: auditSpecificDocs.GetNpmDescription(),
+ Action: func(c *components.Context) error {
+ return AuditSpecificCmd(c, coreutils.Npm)
+ },
+ },
+ {
+ Name: "audit-go",
+ Aliases: []string{"ago"},
+ Flags: flags.GetCommandFlags(flags.AuditGo),
+ Description: auditSpecificDocs.GetGoDescription(),
+ Action: func(c *components.Context) error {
+ return AuditSpecificCmd(c, coreutils.Go)
+ },
+ },
+ {
+ Name: "audit-pip",
+ Aliases: []string{"ap"},
+ Flags: flags.GetCommandFlags(flags.AuditPip),
+ Description: auditSpecificDocs.GetPipDescription(),
+ Action: func(c *components.Context) error {
+ return AuditSpecificCmd(c, coreutils.Pip)
+ },
+ },
+ }
+}
+
+// Based on a given context from the CLI, create the curl command and execute it.
+func curlCmd(c *components.Context) error {
+ // Parse context and validate it for the command.
+ if show, err := pluginsCommon.ShowCmdHelpIfNeeded(c, c.Arguments); show || err != nil {
+ return err
+ }
+ if len(c.Arguments) < 1 {
+ return pluginsCommon.WrongNumberOfArgumentsHandler(c)
+ }
+ // Create and execute the curl command.
+ xrCurlCmd, err := newXrCurlCommand(c)
+ if err != nil {
+ return err
+ }
+ return corecommon.Exec(xrCurlCmd)
+}
+
+func newXrCurlCommand(c *components.Context) (*curl.XrCurlCommand, error) {
+ xrCurlCommand := curl.NewXrCurlCommand(*corecommon.NewCurlCommand().SetArguments(pluginsCommon.ExtractArguments(c)))
+ xrDetails, err := xrCurlCommand.GetServerDetails()
+ if err != nil {
+ return nil, err
+ }
+ if xrDetails.XrayUrl == "" {
+ return nil, errorutils.CheckErrorf("No Xray servers configured. Use the 'jf c add' command to set the Xray server details.")
+ }
+ xrCurlCommand.SetServerDetails(xrDetails)
+ xrCurlCommand.SetUrl(xrDetails.XrayUrl)
+ return xrCurlCommand, err
+}
+
+// Based on a given context from the CLI, create the offline-update command and execute it.
+func offlineUpdates(c *components.Context) error {
+ offlineUpdateFlags, err := getOfflineUpdatesFlag(c)
+ if err != nil {
+ return err
+ }
+ return offlineupdate.OfflineUpdate(offlineUpdateFlags)
+}
+
+func getOfflineUpdatesFlag(c *components.Context) (offlineFlags *offlineupdate.OfflineUpdatesFlags, err error) {
+ offlineFlags = new(offlineupdate.OfflineUpdatesFlags)
+ offlineFlags.License = c.GetStringFlagValue(flags.LicenseId)
+ if len(offlineFlags.License) < 1 {
+ return nil, errorutils.CheckErrorf("the --%s option is mandatory", flags.LicenseId)
+ }
+ offlineFlags.Version = c.GetStringFlagValue(flags.Version)
+ offlineFlags.Target = c.GetStringFlagValue(flags.Target)
+ // Handle V3 flags
+ stream := c.GetStringFlagValue(flags.Stream)
+ offlineFlags.IsPeriodicUpdate = c.GetBoolFlagValue(flags.Periodic)
+ // If a 'stream' flag was provided - validate its value and return.
+ if stream != "" {
+ offlineFlags.Stream, err = offlineupdate.ValidateStream(stream)
+ return
+ }
+ if offlineFlags.IsPeriodicUpdate {
+ return nil, errorutils.CheckErrorf("the --%s option is only valid with the --%s option", flags.Periodic, flags.Stream)
+ }
+ // Handle V1 flags
+ from := c.GetStringFlagValue(flags.From)
+ to := c.GetStringFlagValue(flags.To)
+ if len(to) > 0 && len(from) < 1 {
+ return nil, errorutils.CheckErrorf("the --%s option is mandatory when the --%s option is provided", flags.From, flags.To)
+ }
+ if len(from) > 0 && len(to) < 1 {
+ return nil, errorutils.CheckErrorf("the --%s option is mandatory when the --%s option is provided", flags.To, flags.From)
+ }
+ if len(from) > 0 && len(to) > 0 {
+ offlineFlags.From, err = dateToMilliseconds(from)
+ err = errorutils.CheckError(err)
+ if err != nil {
+ return
+ }
+ offlineFlags.To, err = dateToMilliseconds(to)
+ err = errorutils.CheckError(err)
+ }
+ return
+}
+
+func dateToMilliseconds(date string) (dateInMillisecond int64, err error) {
+ dateFormat := "2006-01-02"
+ t, err := time.Parse(dateFormat, date)
+ if errorutils.CheckError(err) != nil {
+ return
+ }
+ dateInMillisecond = t.UnixNano() / (int64(time.Millisecond) / int64(time.Nanosecond))
+ return
+}
diff --git a/commands/audit/audit.go b/commands/audit/audit.go
new file mode 100644
index 00000000..d2298c50
--- /dev/null
+++ b/commands/audit/audit.go
@@ -0,0 +1,196 @@
+package audit
+
+import (
+ "errors"
+ "os"
+
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-cli-security/scangraph"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ clientutils "github.com/jfrog/jfrog-client-go/utils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ "github.com/jfrog/jfrog-client-go/xray"
+ "github.com/jfrog/jfrog-client-go/xray/services"
+ "golang.org/x/sync/errgroup"
+
+ xrayutils "github.com/jfrog/jfrog-cli-security/utils"
+)
+
+type AuditCommand struct {
+ watches []string
+ projectKey string
+ targetRepoPath string
+ IncludeVulnerabilities bool
+ IncludeLicenses bool
+ Fail bool
+ PrintExtendedTable bool
+ AuditParams
+}
+
+func NewGenericAuditCommand() *AuditCommand {
+ return &AuditCommand{AuditParams: *NewAuditParams()}
+}
+
+func (auditCmd *AuditCommand) SetWatches(watches []string) *AuditCommand {
+ auditCmd.watches = watches
+ return auditCmd
+}
+
+func (auditCmd *AuditCommand) SetProject(project string) *AuditCommand {
+ auditCmd.projectKey = project
+ return auditCmd
+}
+
+func (auditCmd *AuditCommand) SetTargetRepoPath(repoPath string) *AuditCommand {
+ auditCmd.targetRepoPath = repoPath
+ return auditCmd
+}
+
+func (auditCmd *AuditCommand) SetIncludeVulnerabilities(include bool) *AuditCommand {
+ auditCmd.IncludeVulnerabilities = include
+ return auditCmd
+}
+
+func (auditCmd *AuditCommand) SetIncludeLicenses(include bool) *AuditCommand {
+ auditCmd.IncludeLicenses = include
+ return auditCmd
+}
+
+func (auditCmd *AuditCommand) SetFail(fail bool) *AuditCommand {
+ auditCmd.Fail = fail
+ return auditCmd
+}
+
+func (auditCmd *AuditCommand) SetPrintExtendedTable(printExtendedTable bool) *AuditCommand {
+ auditCmd.PrintExtendedTable = printExtendedTable
+ return auditCmd
+}
+
+func (auditCmd *AuditCommand) CreateXrayGraphScanParams() *services.XrayGraphScanParams {
+ params := &services.XrayGraphScanParams{
+ RepoPath: auditCmd.targetRepoPath,
+ Watches: auditCmd.watches,
+ ScanType: services.Dependency,
+ }
+ if auditCmd.projectKey == "" {
+ params.ProjectKey = os.Getenv(coreutils.Project)
+ } else {
+ params.ProjectKey = auditCmd.projectKey
+ }
+ params.IncludeVulnerabilities = auditCmd.IncludeVulnerabilities
+ params.IncludeLicenses = auditCmd.IncludeLicenses
+ return params
+}
+
+func (auditCmd *AuditCommand) Run() (err error) {
+ // If no workingDirs were provided by the user, we apply a recursive scan on the root repository
+ isRecursiveScan := len(auditCmd.workingDirs) == 0
+ workingDirs, err := coreutils.GetFullPathsWorkingDirs(auditCmd.workingDirs)
+ if err != nil {
+ return
+ }
+ auditParams := NewAuditParams().
+ SetXrayGraphScanParams(auditCmd.CreateXrayGraphScanParams()).
+ SetWorkingDirs(workingDirs).
+ SetMinSeverityFilter(auditCmd.minSeverityFilter).
+ SetFixableOnly(auditCmd.fixableOnly).
+ SetGraphBasicParams(auditCmd.AuditBasicParams).
+ SetThirdPartyApplicabilityScan(auditCmd.thirdPartyApplicabilityScan).
+ SetExclusions(auditCmd.exclusions).
+ SetIsRecursiveScan(isRecursiveScan)
+ auditResults, err := RunAudit(auditParams)
+ if err != nil {
+ return
+ }
+ if auditCmd.Progress() != nil {
+ if err = auditCmd.Progress().Quit(); err != nil {
+ return
+ }
+ }
+ var messages []string
+ if !auditResults.ExtendedScanResults.EntitledForJas {
+ messages = []string{coreutils.PrintTitle("The 'jf audit' command also supports JFrog Advanced Security features, such as 'Contextual Analysis', 'Secret Detection', 'IaC Scan' and 'SAST'.\nThis feature isn't enabled on your system. Read more - ") + coreutils.PrintLink("https://jfrog.com/xray/")}
+ }
+ // Print scan results in all cases, except when errors occurred during the SCA scan and no security/license issues were found.
+ printScanResults := !(auditResults.ScaError != nil && !auditResults.IsScaIssuesFound())
+ if printScanResults {
+ if err = xrayutils.NewResultsWriter(auditResults).
+ SetIsMultipleRootProject(auditResults.IsMultipleProject()).
+ SetIncludeVulnerabilities(auditCmd.IncludeVulnerabilities).
+ SetIncludeLicenses(auditCmd.IncludeLicenses).
+ SetOutputFormat(auditCmd.OutputFormat()).
+ SetPrintExtendedTable(auditCmd.PrintExtendedTable).
+ SetExtraMessages(messages).
+ SetScanType(services.Dependency).
+ PrintScanResults(); err != nil {
+ return
+ }
+ }
+ if err = errors.Join(auditResults.ScaError, auditResults.JasError); err != nil {
+ return
+ }
+
+ // Fail the build only if an Xray context was given (!auditCmd.IncludeVulnerabilities) and the user asked to fail the build accordingly.
+ if auditCmd.Fail && !auditCmd.IncludeVulnerabilities && xrayutils.CheckIfFailBuild(auditResults.GetScaScansXrayResults()) {
+ err = xrayutils.NewFailBuildError()
+ }
+ return
+}
+
+func (auditCmd *AuditCommand) CommandName() string {
+ return "generic_audit"
+}
+
+// RunAudit runs an audit scan based on the provided auditParams.
+// Returns an audit Results object containing all the scan results.
+// If the current server is entitled for JAS, the advanced security results will be included in the scan results.
+func RunAudit(auditParams *AuditParams) (results *xrayutils.Results, err error) {
+ // Initialize Results struct
+ results = xrayutils.NewAuditResults()
+
+ serverDetails, err := auditParams.ServerDetails()
+ if err != nil {
+ return
+ }
+ var xrayManager *xray.XrayServicesManager
+ if xrayManager, auditParams.xrayVersion, err = xrayutils.CreateXrayServiceManagerAndGetVersion(serverDetails); err != nil {
+ return
+ }
+ if err = clientutils.ValidateMinimumVersion(clientutils.Xray, auditParams.xrayVersion, scangraph.GraphScanMinXrayVersion); err != nil {
+ return
+ }
+ results.XrayVersion = auditParams.xrayVersion
+ results.ExtendedScanResults.EntitledForJas, err = isEntitledForJas(xrayManager, auditParams.xrayVersion)
+ if err != nil {
+ return
+ }
+
+ errGroup := new(errgroup.Group)
+ if results.ExtendedScanResults.EntitledForJas {
+ // Download (if needed) the analyzer manager in a background routine.
+ errGroup.Go(utils.DownloadAnalyzerManagerIfNeeded)
+ }
+
+ // The SCA scan doesn't require the analyzer manager, so it can run in parallel with the analyzer manager download routine.
+ results.ScaError = runScaScan(auditParams, results)
+
+ // Wait for the analyzer manager download to complete.
+ if err = errGroup.Wait(); err != nil {
+ err = errors.New("failed while trying to get Analyzer Manager: " + err.Error())
+ }
+
+ // Run scanners only if the user is entitled for Advanced Security
+ if results.ExtendedScanResults.EntitledForJas {
+ results.JasError = runJasScannersAndSetResults(results, auditParams.DirectDependencies(), serverDetails, auditParams.workingDirs, auditParams.Progress(), auditParams.xrayGraphScanParams.MultiScanId, auditParams.thirdPartyApplicabilityScan)
+ }
+ return
+}
+
+func isEntitledForJas(xrayManager *xray.XrayServicesManager, xrayVersion string) (entitled bool, err error) {
+ if e := clientutils.ValidateMinimumVersion(clientutils.Xray, xrayVersion, xrayutils.EntitlementsMinVersion); e != nil {
+ log.Debug(e)
+ return
+ }
+ entitled, err = xrayManager.IsEntitled(xrayutils.ApplicabilityFeatureId)
+ return
+}
diff --git a/commands/audit/auditparams.go b/commands/audit/auditparams.go
new file mode 100644
index 00000000..7e944f31
--- /dev/null
+++ b/commands/audit/auditparams.go
@@ -0,0 +1,105 @@
+package audit
+
+import (
+ xrayutils "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/xray/services"
+)
+
+type AuditParams struct {
+ xrayGraphScanParams *services.XrayGraphScanParams
+ workingDirs []string
+ exclusions []string
+ installFunc func(tech string) error
+ fixableOnly bool
+ minSeverityFilter string
+ *xrayutils.AuditBasicParams
+ xrayVersion string
+ // Include third-party dependencies' source code in the applicability scan.
+ thirdPartyApplicabilityScan bool
+ isRecursiveScan bool
+}
+
+func NewAuditParams() *AuditParams {
+ return &AuditParams{
+ xrayGraphScanParams: &services.XrayGraphScanParams{},
+ AuditBasicParams: &xrayutils.AuditBasicParams{},
+ }
+}
+
+func (params *AuditParams) InstallFunc() func(tech string) error {
+ return params.installFunc
+}
+
+func (params *AuditParams) XrayGraphScanParams() *services.XrayGraphScanParams {
+ return params.xrayGraphScanParams
+}
+
+func (params *AuditParams) WorkingDirs() []string {
+ return params.workingDirs
+}
+
+func (params *AuditParams) XrayVersion() string {
+ return params.xrayVersion
+}
+
+func (params *AuditParams) Exclusions() []string {
+ return params.exclusions
+}
+
+func (params *AuditParams) SetExclusions(exclusions []string) *AuditParams {
+ params.exclusions = exclusions
+ return params
+}
+
+func (params *AuditParams) SetIsRecursiveScan(isRecursiveScan bool) *AuditParams {
+ params.isRecursiveScan = isRecursiveScan
+ return params
+}
+
+func (params *AuditParams) SetXrayGraphScanParams(xrayGraphScanParams *services.XrayGraphScanParams) *AuditParams {
+ params.xrayGraphScanParams = xrayGraphScanParams
+ return params
+}
+
+func (params *AuditParams) SetGraphBasicParams(gbp *xrayutils.AuditBasicParams) *AuditParams {
+ params.AuditBasicParams = gbp
+ return params
+}
+
+func (params *AuditParams) SetWorkingDirs(workingDirs []string) *AuditParams {
+ params.workingDirs = workingDirs
+ return params
+}
+
+func (params *AuditParams) SetInstallFunc(installFunc func(tech string) error) *AuditParams {
+ params.installFunc = installFunc
+ return params
+}
+
+func (params *AuditParams) FixableOnly() bool {
+ return params.fixableOnly
+}
+
+func (params *AuditParams) SetFixableOnly(fixable bool) *AuditParams {
+ params.fixableOnly = fixable
+ return params
+}
+
+func (params *AuditParams) MinSeverityFilter() string {
+ return params.minSeverityFilter
+}
+
+func (params *AuditParams) SetMinSeverityFilter(minSeverityFilter string) *AuditParams {
+ params.minSeverityFilter = minSeverityFilter
+ return params
+}
+
+func (params *AuditParams) SetThirdPartyApplicabilityScan(includeThirdPartyDeps bool) *AuditParams {
+ params.thirdPartyApplicabilityScan = includeThirdPartyDeps
+ return params
+}
+
+func (params *AuditParams) SetDepsRepo(depsRepo string) *AuditParams {
+ params.AuditBasicParams.SetDepsRepo(depsRepo)
+ return params
+}
diff --git a/commands/audit/jas/applicability/applicabilitymanager.go b/commands/audit/jas/applicability/applicabilitymanager.go
new file mode 100644
index 00000000..a727f1fd
--- /dev/null
+++ b/commands/audit/jas/applicability/applicabilitymanager.go
@@ -0,0 +1,195 @@
+package applicability
+
+import (
+ "path/filepath"
+
+ jfrogappsconfig "github.com/jfrog/jfrog-apps-config/go"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas"
+
+ "github.com/jfrog/gofrog/datastructures"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ "github.com/jfrog/jfrog-client-go/xray/services"
+ "github.com/owenrumney/go-sarif/v2/sarif"
+ "golang.org/x/exp/maps"
+ "golang.org/x/exp/slices"
+)
+
+const (
+ applicabilityScanType = "analyze-applicability"
+ applicabilityScanCommand = "ca"
+ applicabilityDocsUrlSuffix = "contextual-analysis"
+)
+
+type ApplicabilityScanManager struct {
+ applicabilityScanResults []*sarif.Run
+ directDependenciesCves []string
+ indirectDependenciesCves []string
+ xrayResults []services.ScanResponse
+ scanner *jas.JasScanner
+ thirdPartyScan bool
+}
+
+// RunApplicabilityScan runs the applicability scan flow, which includes the following steps:
+// Creating an ApplicabilityScanManager object.
+// Checking if the scanned project is eligible for an applicability scan.
+// Running the analyzer manager executable.
+// Parsing the analyzer manager results.
+// Return values:
+// results: the applicability scan results, as a list of SARIF runs.
+// error: an error object (if any).
+func RunApplicabilityScan(xrayResults []services.ScanResponse, directDependencies []string,
+ scannedTechnologies []coreutils.Technology, scanner *jas.JasScanner, thirdPartyContextualAnalysis bool) (results []*sarif.Run, err error) {
+ applicabilityScanManager := newApplicabilityScanManager(xrayResults, directDependencies, scanner, thirdPartyContextualAnalysis)
+ if !applicabilityScanManager.shouldRunApplicabilityScan(scannedTechnologies) {
+		log.Debug("The scanned technologies are currently not supported for contextual analysis, or no vulnerable dependencies were found. Skipping...")
+ return
+ }
+ if err = applicabilityScanManager.scanner.Run(applicabilityScanManager); err != nil {
+ err = utils.ParseAnalyzerManagerError(utils.Applicability, err)
+ return
+ }
+ results = applicabilityScanManager.applicabilityScanResults
+ return
+}
+
+func newApplicabilityScanManager(xrayScanResults []services.ScanResponse, directDependencies []string, scanner *jas.JasScanner, thirdPartyScan bool) (manager *ApplicabilityScanManager) {
+ directDependenciesCves, indirectDependenciesCves := extractDependenciesCvesFromScan(xrayScanResults, directDependencies)
+ return &ApplicabilityScanManager{
+ applicabilityScanResults: []*sarif.Run{},
+ directDependenciesCves: directDependenciesCves,
+ indirectDependenciesCves: indirectDependenciesCves,
+ xrayResults: xrayScanResults,
+ scanner: scanner,
+ thirdPartyScan: thirdPartyScan,
+ }
+}
+
+func addCvesToSet(cves []services.Cve, set *datastructures.Set[string]) {
+ for _, cve := range cves {
+ if cve.Id != "" {
+ set.Add(cve.Id)
+ }
+ }
+}
+
+// extractDependenciesCvesFromScan takes Xray scan responses containing direct and indirect vulnerabilities
+// and returns separate, deduplicated lists of the direct and indirect CVE IDs.
+func extractDependenciesCvesFromScan(xrayScanResults []services.ScanResponse, directDependencies []string) (directCves []string, indirectCves []string) {
+ directCvesSet := datastructures.MakeSet[string]()
+ indirectCvesSet := datastructures.MakeSet[string]()
+ for _, scanResult := range xrayScanResults {
+ for _, vulnerability := range scanResult.Vulnerabilities {
+ if isDirectComponents(maps.Keys(vulnerability.Components), directDependencies) {
+ addCvesToSet(vulnerability.Cves, directCvesSet)
+ } else {
+ addCvesToSet(vulnerability.Cves, indirectCvesSet)
+ }
+ }
+ for _, violation := range scanResult.Violations {
+ if isDirectComponents(maps.Keys(violation.Components), directDependencies) {
+ addCvesToSet(violation.Cves, directCvesSet)
+ } else {
+ addCvesToSet(violation.Cves, indirectCvesSet)
+ }
+ }
+ }
+
+ return directCvesSet.ToSlice(), indirectCvesSet.ToSlice()
+}
+
+func isDirectComponents(components []string, directDependencies []string) bool {
+ for _, component := range components {
+ if slices.Contains(directDependencies, component) {
+ return true
+ }
+ }
+ return false
+}
+
+func (asm *ApplicabilityScanManager) Run(module jfrogappsconfig.Module) (err error) {
+ if jas.ShouldSkipScanner(module, utils.Applicability) {
+ return
+ }
+ if len(asm.scanner.JFrogAppsConfig.Modules) > 1 {
+ log.Info("Running applicability scanning in the", module.SourceRoot, "directory...")
+ } else {
+ log.Info("Running applicability scanning...")
+ }
+ if err = asm.createConfigFile(module); err != nil {
+ return
+ }
+ if err = asm.runAnalyzerManager(); err != nil {
+ return
+ }
+ workingDirResults, err := jas.ReadJasScanRunsFromFile(asm.scanner.ResultsFileName, module.SourceRoot, applicabilityDocsUrlSuffix)
+ if err != nil {
+ return
+ }
+ asm.applicabilityScanResults = append(asm.applicabilityScanResults, workingDirResults...)
+ return
+}
+
+func (asm *ApplicabilityScanManager) shouldRunApplicabilityScan(technologies []coreutils.Technology) bool {
+ return asm.cvesExists() && coreutils.ContainsApplicabilityScannableTech(technologies)
+}
+
+func (asm *ApplicabilityScanManager) cvesExists() bool {
+ return len(asm.indirectDependenciesCves) > 0 || len(asm.directDependenciesCves) > 0
+}
+
+type applicabilityScanConfig struct {
+ Scans []scanConfiguration `yaml:"scans"`
+}
+
+type scanConfiguration struct {
+ Roots []string `yaml:"roots"`
+ Output string `yaml:"output"`
+ Type string `yaml:"type"`
+ GrepDisable bool `yaml:"grep-disable"`
+ CveWhitelist []string `yaml:"cve-whitelist"`
+ IndirectCveWhitelist []string `yaml:"indirect-cve-whitelist"`
+ SkippedDirs []string `yaml:"skipped-folders"`
+}
+
+func (asm *ApplicabilityScanManager) createConfigFile(module jfrogappsconfig.Module) error {
+ roots, err := jas.GetSourceRoots(module, nil)
+ if err != nil {
+ return err
+ }
+ excludePatterns := jas.GetExcludePatterns(module, nil)
+ if asm.thirdPartyScan {
+ log.Info("Including node modules folder in applicability scan")
+ excludePatterns = removeElementFromSlice(excludePatterns, jas.NodeModulesPattern)
+ }
+ configFileContent := applicabilityScanConfig{
+ Scans: []scanConfiguration{
+ {
+ Roots: roots,
+ Output: asm.scanner.ResultsFileName,
+ Type: applicabilityScanType,
+ GrepDisable: false,
+ CveWhitelist: asm.directDependenciesCves,
+ IndirectCveWhitelist: asm.indirectDependenciesCves,
+ SkippedDirs: excludePatterns,
+ },
+ },
+ }
+ return jas.CreateScannersConfigFile(asm.scanner.ConfigFileName, configFileContent, utils.Applicability)
+}
+
+// Runs the analyzer manager executable with the applicability scan command
+// on the scanner's config file, using the configured server details.
+func (asm *ApplicabilityScanManager) runAnalyzerManager() error {
+ return asm.scanner.AnalyzerManager.Exec(asm.scanner.ConfigFileName, applicabilityScanCommand, filepath.Dir(asm.scanner.AnalyzerManager.AnalyzerManagerFullPath), asm.scanner.ServerDetails)
+}
+
+func removeElementFromSlice(skipDirs []string, element string) []string {
+ deleteIndex := slices.Index(skipDirs, element)
+ if deleteIndex == -1 {
+ return skipDirs
+ }
+ return slices.Delete(skipDirs, deleteIndex, deleteIndex+1)
+}
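For reference, the YAML document that `createConfigFile` serializes (through the `applicabilityScanConfig` struct and its `yaml` tags) would look roughly like the sketch below. The paths, CVE IDs, and patterns are illustrative placeholders, not values produced by the tool:

```yaml
scans:
  - roots:
      - /path/to/module/source-root      # from jas.GetSourceRoots
    output: /tmp/jfrog-scanner/results.sarif
    type: analyze-applicability
    grep-disable: false
    cve-whitelist:
      - CVE-2023-0001                    # hypothetical direct-dependency CVE
    indirect-cve-whitelist:
      - CVE-2023-0002                    # hypothetical indirect-dependency CVE
    skipped-folders:
      - "**/.git/**"
      - "**/*node_modules*/**"
```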
diff --git a/commands/audit/jas/applicability/applicabilitymanager_test.go b/commands/audit/jas/applicability/applicabilitymanager_test.go
new file mode 100644
index 00000000..672f8623
--- /dev/null
+++ b/commands/audit/jas/applicability/applicabilitymanager_test.go
@@ -0,0 +1,336 @@
+package applicability
+
+import (
+ "os"
+ "path/filepath"
+ "testing"
+
+ jfrogappsconfig "github.com/jfrog/jfrog-apps-config/go"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas"
+ "github.com/jfrog/jfrog-client-go/xray/services"
+ "github.com/stretchr/testify/assert"
+)
+
+var mockDirectDependencies = []string{"issueId_2_direct_dependency", "issueId_1_direct_dependency"}
+var mockMultiRootDirectDependencies = []string{"issueId_2_direct_dependency", "issueId_1_direct_dependency", "issueId_3_direct_dependency", "issueId_4_direct_dependency"}
+
+func TestNewApplicabilityScanManager_InputIsValid(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+ // Act
+ applicabilityManager := newApplicabilityScanManager(jas.FakeBasicXrayResults, mockDirectDependencies, scanner, false)
+
+ // Assert
+ if assert.NotNil(t, applicabilityManager) {
+ assert.NotEmpty(t, applicabilityManager.scanner.ConfigFileName)
+ assert.NotEmpty(t, applicabilityManager.scanner.ResultsFileName)
+ assert.Len(t, applicabilityManager.directDependenciesCves, 5)
+ }
+}
+
+func TestNewApplicabilityScanManager_DependencyTreeDoesntExist(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+ // Act
+ applicabilityManager := newApplicabilityScanManager(jas.FakeBasicXrayResults, nil, scanner, false)
+
+ // Assert
+ if assert.NotNil(t, applicabilityManager) {
+ assert.NotNil(t, applicabilityManager.scanner.ScannerDirCleanupFunc)
+ assert.Len(t, applicabilityManager.scanner.JFrogAppsConfig.Modules, 1)
+ assert.NotEmpty(t, applicabilityManager.scanner.ConfigFileName)
+ assert.NotEmpty(t, applicabilityManager.scanner.ResultsFileName)
+ assert.Empty(t, applicabilityManager.directDependenciesCves)
+ }
+}
+
+func TestNewApplicabilityScanManager_NoDirectDependenciesInScan(t *testing.T) {
+ // Arrange
+ var noDirectDependenciesResults = []services.ScanResponse{
+ {
+ ScanId: "scanId_1",
+ Vulnerabilities: []services.Vulnerability{
+ {IssueId: "issueId_1", Technology: coreutils.Pipenv.String(),
+ Cves: []services.Cve{{Id: "testCve1"}, {Id: "testCve2"}, {Id: "testCve3"}},
+ Components: map[string]services.Component{
+ "issueId_1_non_direct_dependency": {}}},
+ },
+ Violations: []services.Violation{
+ {IssueId: "issueId_2", Technology: coreutils.Pipenv.String(),
+ Cves: []services.Cve{{Id: "testCve4"}, {Id: "testCve5"}},
+ Components: map[string]services.Component{
+ "issueId_2_non_direct_dependency": {}}},
+ },
+ },
+ }
+ jas.FakeBasicXrayResults[0].Vulnerabilities[0].Components["issueId_1_non_direct_dependency"] = services.Component{}
+ jas.FakeBasicXrayResults[0].Violations[0].Components["issueId_2_non_direct_dependency"] = services.Component{}
+
+ // Act
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+ applicabilityManager := newApplicabilityScanManager(noDirectDependenciesResults, mockDirectDependencies, scanner, false)
+ assertApplicabilityScanner(t, applicabilityManager)
+ // ThirdPartyContextual shouldn't change anything here as this is not npm.
+ applicabilityManager = newApplicabilityScanManager(noDirectDependenciesResults, mockDirectDependencies, scanner, true)
+ assertApplicabilityScanner(t, applicabilityManager)
+}
+
+func assertApplicabilityScanner(t *testing.T, applicabilityManager *ApplicabilityScanManager) {
+ if assert.NotNil(t, applicabilityManager) {
+ assert.NotEmpty(t, applicabilityManager.scanner.ConfigFileName)
+ assert.NotEmpty(t, applicabilityManager.scanner.ResultsFileName)
+ // Non-direct dependencies should not be added
+ assert.Empty(t, applicabilityManager.directDependenciesCves)
+ }
+}
+
+func TestNewApplicabilityScanManager_MultipleDependencyTrees(t *testing.T) {
+ // Arrange
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+ // Act
+ applicabilityManager := newApplicabilityScanManager(jas.FakeBasicXrayResults, mockMultiRootDirectDependencies, scanner, false)
+
+ // Assert
+ if assert.NotNil(t, applicabilityManager) {
+ assert.NotEmpty(t, applicabilityManager.scanner.ConfigFileName)
+ assert.NotEmpty(t, applicabilityManager.scanner.ResultsFileName)
+ assert.Len(t, applicabilityManager.directDependenciesCves, 5)
+ }
+}
+
+func TestNewApplicabilityScanManager_ViolationsDontExistInResults(t *testing.T) {
+ // Arrange
+ noViolationScanResponse := []services.ScanResponse{
+ {
+ ScanId: "scanId_1",
+ Vulnerabilities: []services.Vulnerability{
+ {IssueId: "issueId_1", Technology: coreutils.Pipenv.String(),
+ Cves: []services.Cve{{Id: "test_cve_1"}, {Id: "test_cve_2"}, {Id: "test_cve_3"}},
+ Components: map[string]services.Component{"issueId_1_direct_dependency": {}}},
+ },
+ },
+ }
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+
+ // Act
+ applicabilityManager := newApplicabilityScanManager(noViolationScanResponse, mockDirectDependencies, scanner, false)
+
+ // Assert
+ if assert.NotNil(t, applicabilityManager) {
+ assert.NotEmpty(t, applicabilityManager.scanner.ConfigFileName)
+ assert.NotEmpty(t, applicabilityManager.scanner.ResultsFileName)
+ assert.Len(t, applicabilityManager.directDependenciesCves, 3)
+ }
+}
+
+func TestNewApplicabilityScanManager_VulnerabilitiesDontExist(t *testing.T) {
+ // Arrange
+ noVulnerabilitiesScanResponse := []services.ScanResponse{
+ {
+ ScanId: "scanId_1",
+ Violations: []services.Violation{
+ {IssueId: "issueId_2", Technology: coreutils.Pipenv.String(),
+ Cves: []services.Cve{{Id: "test_cve_3"}, {Id: "test_cve_4"}},
+ Components: map[string]services.Component{"issueId_2_direct_dependency": {}}},
+ },
+ },
+ }
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+
+ // Act
+ applicabilityManager := newApplicabilityScanManager(noVulnerabilitiesScanResponse, mockDirectDependencies, scanner, false)
+
+ // Assert
+ if assert.NotNil(t, applicabilityManager) {
+ assert.NotEmpty(t, applicabilityManager.scanner.ConfigFileName)
+ assert.NotEmpty(t, applicabilityManager.scanner.ResultsFileName)
+ assert.Len(t, applicabilityManager.directDependenciesCves, 2)
+ }
+}
+
+func TestApplicabilityScanManager_ShouldRun_TechnologiesNotEligibleForScan(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+
+ results, err := RunApplicabilityScan(jas.FakeBasicXrayResults, mockDirectDependencies, []coreutils.Technology{coreutils.Nuget, coreutils.Go}, scanner, false)
+
+ // Assert
+ assert.Nil(t, results)
+ assert.NoError(t, err)
+}
+
+func TestApplicabilityScanManager_ShouldRun_ScanResultsAreEmpty(t *testing.T) {
+ // Arrange
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+
+ applicabilityManager := newApplicabilityScanManager(nil, mockDirectDependencies, scanner, false)
+
+ // Assert
+ eligible := applicabilityManager.shouldRunApplicabilityScan([]coreutils.Technology{coreutils.Nuget})
+ assert.False(t, eligible)
+}
+
+func TestExtractXrayDirectViolations(t *testing.T) {
+ var xrayResponseForDirectViolationsTest = []services.ScanResponse{
+ {
+ Violations: []services.Violation{
+ {IssueId: "issueId_2", Technology: coreutils.Pipenv.String(),
+ Cves: []services.Cve{{Id: "testCve4"}, {Id: "testCve5"}},
+ Components: map[string]services.Component{"issueId_2_direct_dependency": {}}},
+ },
+ },
+ }
+ tests := []struct {
+ directDependencies []string
+ directCvesCount int
+ indirectCvesCount int
+ }{
+ {directDependencies: []string{"issueId_2_direct_dependency", "issueId_1_direct_dependency"},
+ directCvesCount: 2,
+ indirectCvesCount: 0,
+ },
+		// The violation's component is not in the direct dependencies list, so its CVEs are counted as indirect.
+ {directDependencies: []string{"issueId_1_direct_dependency"},
+ directCvesCount: 0,
+ indirectCvesCount: 2,
+ },
+ {directDependencies: []string{},
+ directCvesCount: 0,
+ indirectCvesCount: 2,
+ },
+ }
+
+ for _, test := range tests {
+ directCves, indirectCves := extractDependenciesCvesFromScan(xrayResponseForDirectViolationsTest, test.directDependencies)
+ assert.Len(t, directCves, test.directCvesCount)
+ assert.Len(t, indirectCves, test.indirectCvesCount)
+ }
+}
+
+func TestExtractXrayDirectVulnerabilities(t *testing.T) {
+ var xrayResponseForDirectVulnerabilitiesTest = []services.ScanResponse{
+ {
+ ScanId: "scanId_1",
+ Vulnerabilities: []services.Vulnerability{
+ {
+ IssueId: "issueId_1", Technology: coreutils.Pipenv.String(),
+ Cves: []services.Cve{{Id: "testCve1"}, {Id: "testCve2"}, {Id: "testCve3"}},
+ Components: map[string]services.Component{"issueId_1_direct_dependency": {}},
+ },
+ {
+ IssueId: "issueId_2", Technology: coreutils.Pipenv.String(),
+ Cves: []services.Cve{{Id: "testCve4"}, {Id: "testCve5"}},
+ Components: map[string]services.Component{"issueId_2_direct_dependency": {}},
+ },
+ },
+ },
+ }
+ tests := []struct {
+ directDependencies []string
+ directCvesCount int
+ indirectCvesCount int
+ }{
+ {
+ directDependencies: []string{"issueId_1_direct_dependency"},
+ directCvesCount: 3,
+ indirectCvesCount: 2,
+ },
+ {
+ directDependencies: []string{"issueId_2_direct_dependency"},
+ directCvesCount: 2,
+ indirectCvesCount: 3,
+ },
+ {directDependencies: []string{},
+ directCvesCount: 0,
+ indirectCvesCount: 5,
+ },
+ }
+
+ for _, test := range tests {
+ directCves, indirectCves := extractDependenciesCvesFromScan(xrayResponseForDirectVulnerabilitiesTest, test.directDependencies)
+ assert.Len(t, directCves, test.directCvesCount)
+ assert.Len(t, indirectCves, test.indirectCvesCount)
+ }
+}
+
+func TestCreateConfigFile_VerifyFileWasCreated(t *testing.T) {
+ // Arrange
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+
+ applicabilityManager := newApplicabilityScanManager(jas.FakeBasicXrayResults, []string{"issueId_1_direct_dependency", "issueId_2_direct_dependency"}, scanner, false)
+
+ currWd, err := coreutils.GetWorkingDirectory()
+ assert.NoError(t, err)
+ err = applicabilityManager.createConfigFile(jfrogappsconfig.Module{SourceRoot: currWd})
+ assert.NoError(t, err)
+
+ defer func() {
+ err = os.Remove(applicabilityManager.scanner.ConfigFileName)
+ assert.NoError(t, err)
+ }()
+
+ _, fileNotExistError := os.Stat(applicabilityManager.scanner.ConfigFileName)
+ assert.NoError(t, fileNotExistError)
+ fileContent, err := os.ReadFile(applicabilityManager.scanner.ConfigFileName)
+ assert.NoError(t, err)
+ assert.True(t, len(fileContent) > 0)
+}
+
+func TestParseResults_EmptyResults_AllCvesShouldGetUnknown(t *testing.T) {
+ // Arrange
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+
+ applicabilityManager := newApplicabilityScanManager(jas.FakeBasicXrayResults, mockDirectDependencies, scanner, false)
+ applicabilityManager.scanner.ResultsFileName = filepath.Join(jas.GetTestDataPath(), "applicability-scan", "empty-results.sarif")
+
+ // Act
+ var err error
+ applicabilityManager.applicabilityScanResults, err = jas.ReadJasScanRunsFromFile(applicabilityManager.scanner.ResultsFileName, scanner.JFrogAppsConfig.Modules[0].SourceRoot, applicabilityDocsUrlSuffix)
+
+ if assert.NoError(t, err) {
+ assert.Len(t, applicabilityManager.applicabilityScanResults, 1)
+ assert.Empty(t, applicabilityManager.applicabilityScanResults[0].Results)
+ }
+}
+
+func TestParseResults_ApplicableCveExist(t *testing.T) {
+ // Arrange
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+ applicabilityManager := newApplicabilityScanManager(jas.FakeBasicXrayResults, mockDirectDependencies, scanner, false)
+ applicabilityManager.scanner.ResultsFileName = filepath.Join(jas.GetTestDataPath(), "applicability-scan", "applicable-cve-results.sarif")
+
+ // Act
+ var err error
+ applicabilityManager.applicabilityScanResults, err = jas.ReadJasScanRunsFromFile(applicabilityManager.scanner.ResultsFileName, scanner.JFrogAppsConfig.Modules[0].SourceRoot, applicabilityDocsUrlSuffix)
+
+ if assert.NoError(t, err) && assert.NotNil(t, applicabilityManager.applicabilityScanResults) {
+ assert.Len(t, applicabilityManager.applicabilityScanResults, 1)
+ assert.NotEmpty(t, applicabilityManager.applicabilityScanResults[0].Results)
+ }
+}
+
+func TestParseResults_AllCvesNotApplicable(t *testing.T) {
+ // Arrange
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+ applicabilityManager := newApplicabilityScanManager(jas.FakeBasicXrayResults, mockDirectDependencies, scanner, false)
+ applicabilityManager.scanner.ResultsFileName = filepath.Join(jas.GetTestDataPath(), "applicability-scan", "no-applicable-cves-results.sarif")
+
+ // Act
+ var err error
+ applicabilityManager.applicabilityScanResults, err = jas.ReadJasScanRunsFromFile(applicabilityManager.scanner.ResultsFileName, scanner.JFrogAppsConfig.Modules[0].SourceRoot, applicabilityDocsUrlSuffix)
+
+ if assert.NoError(t, err) && assert.NotNil(t, applicabilityManager.applicabilityScanResults) {
+ assert.Len(t, applicabilityManager.applicabilityScanResults, 1)
+ assert.NotEmpty(t, applicabilityManager.applicabilityScanResults[0].Results)
+ }
+}
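The direct/indirect CVE partition exercised by the tests above can be sketched as a standalone program. The `finding` struct and `partitionCves` below are simplified stand-ins (plain maps instead of gofrog's `datastructures.Set`) for `extractDependenciesCvesFromScan`, not the actual implementation:

```go
package main

import "fmt"

// finding is a hypothetical, simplified stand-in for an Xray
// vulnerability or violation entry.
type finding struct {
	components []string // component IDs affected by the finding
	cves       []string // CVE IDs attached to the finding
}

// isDirect reports whether any affected component is a direct dependency.
func isDirect(components, direct []string) bool {
	for _, c := range components {
		for _, d := range direct {
			if c == d {
				return true
			}
		}
	}
	return false
}

// partitionCves splits CVE IDs into direct and indirect lists,
// deduplicating with plain maps and skipping empty IDs.
func partitionCves(findings []finding, direct []string) (directCves, indirectCves []string) {
	directSet, indirectSet := map[string]bool{}, map[string]bool{}
	for _, f := range findings {
		target := indirectSet
		if isDirect(f.components, direct) {
			target = directSet
		}
		for _, cve := range f.cves {
			if cve != "" {
				target[cve] = true
			}
		}
	}
	for cve := range directSet {
		directCves = append(directCves, cve)
	}
	for cve := range indirectSet {
		indirectCves = append(indirectCves, cve)
	}
	return directCves, indirectCves
}

func main() {
	findings := []finding{
		{components: []string{"dep-a"}, cves: []string{"CVE-1", "CVE-2"}},
		{components: []string{"dep-b"}, cves: []string{"CVE-3", ""}},
	}
	direct, indirect := partitionCves(findings, []string{"dep-a"})
	fmt.Println(len(direct), len(indirect)) // 2 1
}
```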
diff --git a/commands/audit/jas/common.go b/commands/audit/jas/common.go
new file mode 100644
index 00000000..8e5b63d2
--- /dev/null
+++ b/commands/audit/jas/common.go
@@ -0,0 +1,277 @@
+package jas
+
+import (
+ "errors"
+ "fmt"
+ "os"
+ "path/filepath"
+ "strings"
+ "testing"
+ "unicode"
+
+ jfrogappsconfig "github.com/jfrog/jfrog-apps-config/go"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/utils/errorutils"
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ "github.com/jfrog/jfrog-client-go/xray/services"
+ "github.com/owenrumney/go-sarif/v2/sarif"
+ "github.com/stretchr/testify/assert"
+ "golang.org/x/exp/slices"
+ "gopkg.in/yaml.v3"
+)
+
+const (
+ NodeModulesPattern = "**/*node_modules*/**"
+)
+
+var (
+ DefaultExcludePatterns = []string{"**/.git/**", "**/*test*/**", "**/*venv*/**", NodeModulesPattern, "**/target/**"}
+
+ mapSeverityToScore = map[string]string{
+ "": "0.0",
+ "unknown": "0.0",
+ "low": "3.9",
+ "medium": "6.9",
+ "high": "8.9",
+ "critical": "10",
+ }
+)
+
+type JasScanner struct {
+ ConfigFileName string
+ ResultsFileName string
+ AnalyzerManager utils.AnalyzerManager
+ ServerDetails *config.ServerDetails
+ JFrogAppsConfig *jfrogappsconfig.JFrogAppsConfig
+ ScannerDirCleanupFunc func() error
+}
+
+func NewJasScanner(workingDirs []string, serverDetails *config.ServerDetails, multiScanId string) (scanner *JasScanner, err error) {
+ scanner = &JasScanner{}
+ if scanner.AnalyzerManager.AnalyzerManagerFullPath, err = utils.GetAnalyzerManagerExecutable(); err != nil {
+ return
+ }
+ var tempDir string
+ if tempDir, err = fileutils.CreateTempDir(); err != nil {
+ return
+ }
+ scanner.ScannerDirCleanupFunc = func() error {
+ return fileutils.RemoveTempDir(tempDir)
+ }
+ scanner.ServerDetails = serverDetails
+ scanner.ConfigFileName = filepath.Join(tempDir, "config.yaml")
+ scanner.ResultsFileName = filepath.Join(tempDir, "results.sarif")
+ scanner.JFrogAppsConfig, err = createJFrogAppsConfig(workingDirs)
+ scanner.AnalyzerManager.MultiScanId = multiScanId
+ return
+}
+
+func createJFrogAppsConfig(workingDirs []string) (*jfrogappsconfig.JFrogAppsConfig, error) {
+ if jfrogAppsConfig, err := jfrogappsconfig.LoadConfigIfExist(); err != nil {
+ return nil, errorutils.CheckError(err)
+ } else if jfrogAppsConfig != nil {
+ // jfrog-apps-config.yml exist in the workspace
+ return jfrogAppsConfig, nil
+ }
+
+ // jfrog-apps-config.yml does not exist in the workspace
+ fullPathsWorkingDirs, err := coreutils.GetFullPathsWorkingDirs(workingDirs)
+ if err != nil {
+ return nil, err
+ }
+ jfrogAppsConfig := new(jfrogappsconfig.JFrogAppsConfig)
+ for _, workingDir := range fullPathsWorkingDirs {
+ jfrogAppsConfig.Modules = append(jfrogAppsConfig.Modules, jfrogappsconfig.Module{SourceRoot: workingDir})
+ }
+ return jfrogAppsConfig, nil
+}
+
+type ScannerCmd interface {
+ Run(module jfrogappsconfig.Module) (err error)
+}
+
+func (a *JasScanner) Run(scannerCmd ScannerCmd) (err error) {
+ for _, module := range a.JFrogAppsConfig.Modules {
+ func() {
+ defer func() {
+ err = errors.Join(err, deleteJasProcessFiles(a.ConfigFileName, a.ResultsFileName))
+ }()
+ if err = scannerCmd.Run(module); err != nil {
+ return
+ }
+ }()
+ }
+ return
+}
+
+func deleteJasProcessFiles(configFile string, resultFile string) error {
+ exist, err := fileutils.IsFileExists(configFile, false)
+ if err != nil {
+ return err
+ }
+ if exist {
+ if err = os.Remove(configFile); err != nil {
+ return errorutils.CheckError(err)
+ }
+ }
+ exist, err = fileutils.IsFileExists(resultFile, false)
+ if err != nil {
+ return err
+ }
+ if exist {
+ err = os.Remove(resultFile)
+ }
+ return errorutils.CheckError(err)
+}
+
+func ReadJasScanRunsFromFile(fileName, wd, informationUrlSuffix string) (sarifRuns []*sarif.Run, err error) {
+ if sarifRuns, err = utils.ReadScanRunsFromFile(fileName); err != nil {
+ return
+ }
+ for _, sarifRun := range sarifRuns {
+		// JAS reports contain only one invocation.
+		// Set the actual working directory on the invocation, rather than the analyzer manager directory.
+		// It is also used to calculate relative paths when needed.
+ sarifRun.Invocations[0].WorkingDirectory.WithUri(wd)
+ // Process runs values
+ fillMissingRequiredDriverInformation(utils.BaseDocumentationURL+informationUrlSuffix, utils.GetAnalyzerManagerVersion(), sarifRun)
+ sarifRun.Results = excludeSuppressResults(sarifRun.Results)
+ addScoreToRunRules(sarifRun)
+ }
+ return
+}
+
+func fillMissingRequiredDriverInformation(defaultJasInformationUri, defaultVersion string, run *sarif.Run) {
+ driver := run.Tool.Driver
+ if driver.InformationURI == nil {
+ driver.InformationURI = &defaultJasInformationUri
+ }
+ if driver.Version == nil || !isValidVersion(*driver.Version) {
+ driver.Version = &defaultVersion
+ }
+}
+
+func isValidVersion(version string) bool {
+ if len(version) == 0 {
+ return false
+ }
+ firstChar := rune(version[0])
+ return unicode.IsDigit(firstChar)
+}
+
+func excludeSuppressResults(sarifResults []*sarif.Result) []*sarif.Result {
+ results := []*sarif.Result{}
+ for _, sarifResult := range sarifResults {
+ if len(sarifResult.Suppressions) > 0 {
+			// The result carries a suppression request, i.e. it should be excluded from result lists.
+ continue
+ }
+ results = append(results, sarifResult)
+ }
+ return results
+}
+
+func addScoreToRunRules(sarifRun *sarif.Run) {
+ for _, sarifResult := range sarifRun.Results {
+ if rule, err := sarifRun.GetRuleById(*sarifResult.RuleID); err == nil {
+			// Add a security-severity property to the rule, based on the result's severity.
+ score := convertToScore(utils.GetResultSeverity(sarifResult))
+ if score != utils.MissingCveScore {
+ if rule.Properties == nil {
+ rule.WithProperties(sarif.NewPropertyBag().Properties)
+ }
+ rule.Properties["security-severity"] = score
+ }
+ }
+ }
+}
+
+func convertToScore(severity string) string {
+ if level, ok := mapSeverityToScore[strings.ToLower(severity)]; ok {
+ return level
+ }
+ return ""
+}
+
+func CreateScannersConfigFile(fileName string, fileContent interface{}, scanType utils.JasScanType) error {
+ yamlData, err := yaml.Marshal(&fileContent)
+ if errorutils.CheckError(err) != nil {
+ return err
+ }
+ log.Debug(scanType.String() + " scanner input YAML:\n" + string(yamlData))
+ err = os.WriteFile(fileName, yamlData, 0644)
+ return errorutils.CheckError(err)
+}
+
+var FakeServerDetails = config.ServerDetails{
+ Url: "platformUrl",
+ Password: "password",
+ User: "user",
+}
+
+var FakeBasicXrayResults = []services.ScanResponse{
+ {
+ ScanId: "scanId_1",
+ Vulnerabilities: []services.Vulnerability{
+ {IssueId: "issueId_1", Technology: coreutils.Pipenv.String(),
+ Cves: []services.Cve{{Id: "testCve1"}, {Id: "testCve2"}, {Id: "testCve3"}},
+ Components: map[string]services.Component{"issueId_1_direct_dependency": {}, "issueId_3_direct_dependency": {}}},
+ },
+ Violations: []services.Violation{
+ {IssueId: "issueId_2", Technology: coreutils.Pipenv.String(),
+ Cves: []services.Cve{{Id: "testCve4"}, {Id: "testCve5"}},
+ Components: map[string]services.Component{"issueId_2_direct_dependency": {}, "issueId_4_direct_dependency": {}}},
+ },
+ },
+}
+
+func InitJasTest(t *testing.T, workingDirs ...string) (*JasScanner, func()) {
+ assert.NoError(t, utils.DownloadAnalyzerManagerIfNeeded())
+ scanner, err := NewJasScanner(workingDirs, &FakeServerDetails, "")
+ assert.NoError(t, err)
+ return scanner, func() {
+ assert.NoError(t, scanner.ScannerDirCleanupFunc())
+ }
+}
+
+func GetTestDataPath() string {
+ return filepath.Join("..", "..", "..", "..", "tests", "testdata", "other")
+}
+
+func ShouldSkipScanner(module jfrogappsconfig.Module, scanType utils.JasScanType) bool {
+ lowerScanType := strings.ToLower(string(scanType))
+ if slices.Contains(module.ExcludeScanners, lowerScanType) {
+ log.Info(fmt.Sprintf("Skipping %s scanning", scanType))
+ return true
+ }
+ return false
+}
+
+func GetSourceRoots(module jfrogappsconfig.Module, scanner *jfrogappsconfig.Scanner) ([]string, error) {
+ root, err := filepath.Abs(module.SourceRoot)
+ if err != nil {
+ return []string{}, errorutils.CheckError(err)
+ }
+ if scanner == nil || len(scanner.WorkingDirs) == 0 {
+ return []string{root}, errorutils.CheckError(err)
+ }
+ var roots []string
+ for _, workingDir := range scanner.WorkingDirs {
+ roots = append(roots, filepath.Join(root, workingDir))
+ }
+ return roots, nil
+}
+
+func GetExcludePatterns(module jfrogappsconfig.Module, scanner *jfrogappsconfig.Scanner) []string {
+ excludePatterns := module.ExcludePatterns
+ if scanner != nil {
+ excludePatterns = append(excludePatterns, scanner.ExcludePatterns...)
+ }
+ if len(excludePatterns) == 0 {
+ return DefaultExcludePatterns
+ }
+ return excludePatterns
+}
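As a standalone illustration of the severity handling in `common.go`, the sketch below mirrors `mapSeverityToScore`, `convertToScore`, and `isValidVersion` using only the standard library. It is a simplified rendering under the same mapping values, not the package's code:

```go
package main

import (
	"fmt"
	"strings"
	"unicode"
)

// severityToScore mirrors mapSeverityToScore: a lowercase severity
// label mapped to a CVSS-like score string.
var severityToScore = map[string]string{
	"":         "0.0",
	"unknown":  "0.0",
	"low":      "3.9",
	"medium":   "6.9",
	"high":     "8.9",
	"critical": "10",
}

// convertToScore returns the score for a severity label (case-insensitive),
// or "" when the label is unrecognized.
func convertToScore(severity string) string {
	if score, ok := severityToScore[strings.ToLower(severity)]; ok {
		return score
	}
	return ""
}

// isValidVersion accepts only versions that start with a digit,
// e.g. "1.2.3" but not "dev" or "".
func isValidVersion(version string) bool {
	return len(version) > 0 && unicode.IsDigit(rune(version[0]))
}

func main() {
	fmt.Println(convertToScore("High"))  // 8.9
	fmt.Println(isValidVersion("1.0.2")) // true
	fmt.Println(isValidVersion("dev"))   // false
}
```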
diff --git a/commands/audit/jas/common_test.go b/commands/audit/jas/common_test.go
new file mode 100644
index 00000000..98129bcf
--- /dev/null
+++ b/commands/audit/jas/common_test.go
@@ -0,0 +1,90 @@
+package jas
+
+import (
+ "testing"
+
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/owenrumney/go-sarif/v2/sarif"
+ "github.com/stretchr/testify/assert"
+)
+
+func TestExcludeSuppressResults(t *testing.T) {
+ tests := []struct {
+ name string
+ sarifResults []*sarif.Result
+ expectedOutput []*sarif.Result
+ }{
+ {
+ sarifResults: []*sarif.Result{
+ utils.CreateResultWithOneLocation("", 0, 0, 0, 0, "snippet1", "ruleId1", "level1"),
+ utils.CreateResultWithOneLocation("", 0, 0, 0, 0, "snippet2", "ruleId2", "level2"),
+ },
+ expectedOutput: []*sarif.Result{
+ utils.CreateResultWithOneLocation("", 0, 0, 0, 0, "snippet1", "ruleId1", "level1"),
+ utils.CreateResultWithOneLocation("", 0, 0, 0, 0, "snippet2", "ruleId2", "level2"),
+ },
+ },
+ {
+ sarifResults: []*sarif.Result{
+ utils.CreateResultWithOneLocation("", 0, 0, 0, 0, "snippet1", "ruleId1", "level1").WithSuppression([]*sarif.Suppression{sarif.NewSuppression("")}),
+ utils.CreateResultWithOneLocation("", 0, 0, 0, 0, "snippet2", "ruleId2", "level2"),
+ },
+ expectedOutput: []*sarif.Result{
+ utils.CreateResultWithOneLocation("", 0, 0, 0, 0, "snippet2", "ruleId2", "level2"),
+ },
+ },
+ {
+ sarifResults: []*sarif.Result{
+ utils.CreateResultWithOneLocation("", 0, 0, 0, 0, "snippet1", "ruleId1", "level1").WithSuppression([]*sarif.Suppression{sarif.NewSuppression("")}),
+ utils.CreateResultWithOneLocation("", 0, 0, 0, 0, "snippet2", "ruleId2", "level2").WithSuppression([]*sarif.Suppression{sarif.NewSuppression("")}),
+ },
+ expectedOutput: []*sarif.Result{},
+ },
+ }
+
+ for _, test := range tests {
+ assert.Equal(t, test.expectedOutput, excludeSuppressResults(test.sarifResults))
+ }
+}
+
+func TestAddScoreToRunRules(t *testing.T) {
+
+ tests := []struct {
+ name string
+ sarifRun *sarif.Run
+ expectedOutput []*sarif.ReportingDescriptor
+ }{
+ {
+ sarifRun: utils.CreateRunWithDummyResults(
+ utils.CreateResultWithOneLocation("file1", 0, 0, 0, 0, "snippet", "rule1", "info"),
+ utils.CreateResultWithOneLocation("file2", 0, 0, 0, 0, "snippet", "rule1", "info"),
+ utils.CreateResultWithOneLocation("file", 0, 0, 0, 0, "snippet", "rule2", "warning"),
+ ),
+ expectedOutput: []*sarif.ReportingDescriptor{
+ sarif.NewRule("rule1").WithProperties(sarif.Properties{"security-severity": "6.9"}),
+ sarif.NewRule("rule2").WithProperties(sarif.Properties{"security-severity": "6.9"}),
+ },
+ },
+ {
+ sarifRun: utils.CreateRunWithDummyResults(
+ utils.CreateResultWithOneLocation("file", 0, 0, 0, 0, "snippet", "rule1", "none"),
+ utils.CreateResultWithOneLocation("file", 0, 0, 0, 0, "snippet", "rule2", "note"),
+ utils.CreateResultWithOneLocation("file", 0, 0, 0, 0, "snippet", "rule3", "info"),
+ utils.CreateResultWithOneLocation("file", 0, 0, 0, 0, "snippet", "rule4", "warning"),
+ utils.CreateResultWithOneLocation("file", 0, 0, 0, 0, "snippet", "rule5", "error"),
+ ),
+ expectedOutput: []*sarif.ReportingDescriptor{
+ sarif.NewRule("rule1").WithProperties(sarif.Properties{"security-severity": "0.0"}),
+ sarif.NewRule("rule2").WithProperties(sarif.Properties{"security-severity": "3.9"}),
+ sarif.NewRule("rule3").WithProperties(sarif.Properties{"security-severity": "6.9"}),
+ sarif.NewRule("rule4").WithProperties(sarif.Properties{"security-severity": "6.9"}),
+ sarif.NewRule("rule5").WithProperties(sarif.Properties{"security-severity": "8.9"}),
+ },
+ },
+ }
+
+ for _, test := range tests {
+ addScoreToRunRules(test.sarifRun)
+ assert.Equal(t, test.expectedOutput, test.sarifRun.Tool.Driver.Rules)
+ }
+}
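The suppression filtering covered by `TestExcludeSuppressResults` above can be illustrated without the SARIF types. The `result` struct and `excludeSuppressed` below are hypothetical simplified stand-ins for `sarif.Result` and `excludeSuppressResults`:

```go
package main

import "fmt"

// result is a simplified stand-in for sarif.Result; a non-empty
// suppressions slice means the result was suppressed.
type result struct {
	ruleID       string
	suppressions []string
}

// excludeSuppressed drops any result carrying at least one suppression,
// mirroring the behavior of excludeSuppressResults in common.go.
func excludeSuppressed(results []result) []result {
	kept := []result{}
	for _, r := range results {
		if len(r.suppressions) > 0 {
			continue
		}
		kept = append(kept, r)
	}
	return kept
}

func main() {
	in := []result{
		{ruleID: "rule1", suppressions: []string{"suppressed in code"}},
		{ruleID: "rule2"},
	}
	out := excludeSuppressed(in)
	fmt.Println(len(out), out[0].ruleID) // 1 rule2
}
```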
diff --git a/commands/audit/jas/commons_test.go b/commands/audit/jas/commons_test.go
new file mode 100644
index 00000000..fb7834e5
--- /dev/null
+++ b/commands/audit/jas/commons_test.go
@@ -0,0 +1,129 @@
+package jas
+
+import (
+ "fmt"
+ "os"
+ "path/filepath"
+ "testing"
+
+ jfrogappsconfig "github.com/jfrog/jfrog-apps-config/go"
+ clientTestUtils "github.com/jfrog/jfrog-client-go/utils/tests"
+
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/stretchr/testify/assert"
+)
+
+var createJFrogAppsConfigCases = []struct {
+ workingDirs []string
+}{
+ {workingDirs: []string{}},
+ {workingDirs: []string{"working-dir"}},
+ {workingDirs: []string{"working-dir-1", "working-dir-2"}},
+}
+
+func TestCreateJFrogAppsConfig(t *testing.T) {
+ wd, err := os.Getwd()
+ assert.NoError(t, err)
+
+ for _, testCase := range createJFrogAppsConfigCases {
+ t.Run(fmt.Sprintf("%v", testCase.workingDirs), func(t *testing.T) {
+ jfrogAppsConfig, err := createJFrogAppsConfig(testCase.workingDirs)
+ assert.NoError(t, err)
+ assert.NotNil(t, jfrogAppsConfig)
+ if len(testCase.workingDirs) == 0 {
+ assert.Len(t, jfrogAppsConfig.Modules, 1)
+ assert.Equal(t, wd, jfrogAppsConfig.Modules[0].SourceRoot)
+ return
+ }
+ assert.Len(t, jfrogAppsConfig.Modules, len(testCase.workingDirs))
+ for i, workingDir := range testCase.workingDirs {
+ assert.Equal(t, filepath.Join(wd, workingDir), jfrogAppsConfig.Modules[i].SourceRoot)
+ }
+ })
+ }
+}
+
+func TestCreateJFrogAppsConfigWithConfig(t *testing.T) {
+ wd, err := os.Getwd()
+ assert.NoError(t, err)
+ chdirCallback := clientTestUtils.ChangeDirWithCallback(t, wd, "testdata")
+ defer chdirCallback()
+
+ jfrogAppsConfig, err := createJFrogAppsConfig([]string{})
+ assert.NoError(t, err)
+ assert.NotNil(t, jfrogAppsConfig)
+ assert.Equal(t, "1.0", jfrogAppsConfig.Version)
+ assert.Len(t, jfrogAppsConfig.Modules, 1)
+}
+
+func TestShouldSkipScanner(t *testing.T) {
+ module := jfrogappsconfig.Module{}
+ assert.False(t, ShouldSkipScanner(module, utils.IaC))
+
+ module = jfrogappsconfig.Module{ExcludeScanners: []string{"sast"}}
+ assert.False(t, ShouldSkipScanner(module, utils.IaC))
+ assert.True(t, ShouldSkipScanner(module, utils.Sast))
+}
+
+var getSourceRootsCases = []struct {
+ scanner *jfrogappsconfig.Scanner
+}{
+ {scanner: nil},
+	{scanner: &jfrogappsconfig.Scanner{WorkingDirs: []string{"working-dir"}}},
+	{scanner: &jfrogappsconfig.Scanner{WorkingDirs: []string{"working-dir-1", "working-dir-2"}}},
+}
+
+func TestGetSourceRoots(t *testing.T) {
+ testGetSourceRoots(t, "source-root")
+}
+
+func TestGetSourceRootsEmptySourceRoot(t *testing.T) {
+ testGetSourceRoots(t, "")
+}
+
+func testGetSourceRoots(t *testing.T, sourceRoot string) {
+ sourceRoot, err := filepath.Abs(sourceRoot)
+ assert.NoError(t, err)
+ module := jfrogappsconfig.Module{SourceRoot: sourceRoot}
+ for _, testCase := range getSourceRootsCases {
+ t.Run("", func(t *testing.T) {
+ scanner := testCase.scanner
+ actualSourceRoots, err := GetSourceRoots(module, scanner)
+ assert.NoError(t, err)
+ if scanner == nil {
+ assert.ElementsMatch(t, []string{module.SourceRoot}, actualSourceRoots)
+ return
+ }
+ expectedWorkingDirs := []string{}
+ for _, workingDir := range scanner.WorkingDirs {
+ expectedWorkingDirs = append(expectedWorkingDirs, filepath.Join(module.SourceRoot, workingDir))
+ }
+ assert.ElementsMatch(t, actualSourceRoots, expectedWorkingDirs)
+ })
+ }
+}
+
+var getExcludePatternsCases = []struct {
+ scanner *jfrogappsconfig.Scanner
+}{
+ {scanner: nil},
+	{scanner: &jfrogappsconfig.Scanner{ExcludePatterns: []string{"exclude-dir"}}},
+	{scanner: &jfrogappsconfig.Scanner{ExcludePatterns: []string{"exclude-dir-1", "exclude-dir-2"}}},
+}
+
+func TestGetExcludePatterns(t *testing.T) {
+ module := jfrogappsconfig.Module{ExcludePatterns: []string{"exclude-root"}}
+ for _, testCase := range getExcludePatternsCases {
+ t.Run("", func(t *testing.T) {
+ scanner := testCase.scanner
+ actualExcludePatterns := GetExcludePatterns(module, scanner)
+ if scanner == nil {
+ assert.ElementsMatch(t, module.ExcludePatterns, actualExcludePatterns)
+ return
+ }
+ expectedExcludePatterns := module.ExcludePatterns
+ expectedExcludePatterns = append(expectedExcludePatterns, scanner.ExcludePatterns...)
+ assert.ElementsMatch(t, actualExcludePatterns, expectedExcludePatterns)
+ })
+ }
+}
diff --git a/commands/audit/jas/iac/iacscanner.go b/commands/audit/jas/iac/iacscanner.go
new file mode 100644
index 00000000..312dcf4f
--- /dev/null
+++ b/commands/audit/jas/iac/iacscanner.go
@@ -0,0 +1,103 @@
+package iac
+
+import (
+ "path/filepath"
+
+ jfrogappsconfig "github.com/jfrog/jfrog-apps-config/go"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas"
+
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ "github.com/owenrumney/go-sarif/v2/sarif"
+)
+
+const (
+ iacScannerType = "iac-scan-modules"
+ iacScanCommand = "iac"
+ iacDocsUrlSuffix = "infrastructure-as-code-iac"
+)
+
+type IacScanManager struct {
+ iacScannerResults []*sarif.Run
+ scanner *jas.JasScanner
+}
+
+// RunIacScan runs the IaC scan flow, which includes the following steps:
+// Creating an IacScanManager object.
+// Running the analyzer manager executable.
+// Parsing the analyzer manager results.
+// Return values:
+// []*sarif.Run: the IaC violations that were found.
+// error: an error object (if any).
+func RunIacScan(scanner *jas.JasScanner) (results []*sarif.Run, err error) {
+ iacScanManager := newIacScanManager(scanner)
+ log.Info("Running IaC scanning...")
+ if err = iacScanManager.scanner.Run(iacScanManager); err != nil {
+ err = utils.ParseAnalyzerManagerError(utils.IaC, err)
+ return
+ }
+ if len(iacScanManager.iacScannerResults) > 0 {
+ log.Info("Found", utils.GetResultsLocationCount(iacScanManager.iacScannerResults...), "IaC vulnerabilities")
+ }
+ results = iacScanManager.iacScannerResults
+ return
+}
+
+func newIacScanManager(scanner *jas.JasScanner) (manager *IacScanManager) {
+ return &IacScanManager{
+ iacScannerResults: []*sarif.Run{},
+ scanner: scanner,
+ }
+}
+
+func (iac *IacScanManager) Run(module jfrogappsconfig.Module) (err error) {
+ if jas.ShouldSkipScanner(module, utils.IaC) {
+ return
+ }
+ if err = iac.createConfigFile(module); err != nil {
+ return
+ }
+ if err = iac.runAnalyzerManager(); err != nil {
+ return
+ }
+ workingDirResults, err := jas.ReadJasScanRunsFromFile(iac.scanner.ResultsFileName, module.SourceRoot, iacDocsUrlSuffix)
+ if err != nil {
+ return
+ }
+ iac.iacScannerResults = append(iac.iacScannerResults, workingDirResults...)
+ return
+}
+
+type iacScanConfig struct {
+ Scans []iacScanConfiguration `yaml:"scans"`
+}
+
+type iacScanConfiguration struct {
+ Roots []string `yaml:"roots"`
+ Output string `yaml:"output"`
+ Type string `yaml:"type"`
+ SkippedDirs []string `yaml:"skipped-folders"`
+}
+
+func (iac *IacScanManager) createConfigFile(module jfrogappsconfig.Module) error {
+ roots, err := jas.GetSourceRoots(module, module.Scanners.Iac)
+ if err != nil {
+ return err
+ }
+ configFileContent := iacScanConfig{
+ Scans: []iacScanConfiguration{
+ {
+ Roots: roots,
+ Output: iac.scanner.ResultsFileName,
+ Type: iacScannerType,
+ SkippedDirs: jas.GetExcludePatterns(module, module.Scanners.Iac),
+ },
+ },
+ }
+ return jas.CreateScannersConfigFile(iac.scanner.ConfigFileName, configFileContent, utils.IaC)
+}
+
+func (iac *IacScanManager) runAnalyzerManager() error {
+ return iac.scanner.AnalyzerManager.Exec(iac.scanner.ConfigFileName, iacScanCommand, filepath.Dir(iac.scanner.AnalyzerManager.AnalyzerManagerFullPath), iac.scanner.ServerDetails)
+}
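`createConfigFile` above marshals `iacScanConfig` into a YAML file that the analyzer manager reads. Based on the `yaml` struct tags above, the generated file takes roughly this shape — the paths and exclude patterns here are hypothetical placeholders:

```yaml
scans:
  - roots:
      - /path/to/module/source-root   # from jas.GetSourceRoots
    output: /tmp/jas/results.sarif    # iac.scanner.ResultsFileName
    type: iac-scan-modules            # iacScannerType
    skipped-folders:                  # from jas.GetExcludePatterns
      - "**/docs/**"
```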
diff --git a/commands/audit/jas/iac/iacscanner_test.go b/commands/audit/jas/iac/iacscanner_test.go
new file mode 100644
index 00000000..1a5403f4
--- /dev/null
+++ b/commands/audit/jas/iac/iacscanner_test.go
@@ -0,0 +1,83 @@
+package iac
+
+import (
+ "os"
+ "path/filepath"
+ "testing"
+
+ jfrogappsconfig "github.com/jfrog/jfrog-apps-config/go"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas"
+
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/stretchr/testify/assert"
+)
+
+func TestNewIacScanManager(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t, "currentDir")
+ defer cleanUp()
+ // Act
+ iacScanManager := newIacScanManager(scanner)
+
+ // Assert
+ if assert.NotNil(t, iacScanManager) {
+ assert.NotEmpty(t, iacScanManager.scanner.ConfigFileName)
+ assert.NotEmpty(t, iacScanManager.scanner.ResultsFileName)
+ assert.NotEmpty(t, iacScanManager.scanner.JFrogAppsConfig.Modules[0].SourceRoot)
+ assert.Equal(t, &jas.FakeServerDetails, iacScanManager.scanner.ServerDetails)
+ }
+}
+
+func TestIacScan_CreateConfigFile_VerifyFileWasCreated(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t, "currentDir")
+ defer cleanUp()
+
+ iacScanManager := newIacScanManager(scanner)
+
+ currWd, err := coreutils.GetWorkingDirectory()
+ assert.NoError(t, err)
+	err = iacScanManager.createConfigFile(jfrogappsconfig.Module{SourceRoot: currWd})
+	assert.NoError(t, err)
+
+ defer func() {
+ err = os.Remove(iacScanManager.scanner.ConfigFileName)
+ assert.NoError(t, err)
+ }()
+
+ _, fileNotExistError := os.Stat(iacScanManager.scanner.ConfigFileName)
+ assert.NoError(t, fileNotExistError)
+ fileContent, err := os.ReadFile(iacScanManager.scanner.ConfigFileName)
+ assert.NoError(t, err)
+ assert.True(t, len(fileContent) > 0)
+}
+
+func TestIacParseResults_EmptyResults(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+
+ // Arrange
+ iacScanManager := newIacScanManager(scanner)
+ iacScanManager.scanner.ResultsFileName = filepath.Join(jas.GetTestDataPath(), "iac-scan", "no-violations.sarif")
+
+ // Act
+ var err error
+ iacScanManager.iacScannerResults, err = jas.ReadJasScanRunsFromFile(iacScanManager.scanner.ResultsFileName, scanner.JFrogAppsConfig.Modules[0].SourceRoot, iacDocsUrlSuffix)
+ if assert.NoError(t, err) && assert.NotNil(t, iacScanManager.iacScannerResults) {
+ assert.Len(t, iacScanManager.iacScannerResults, 1)
+ assert.Empty(t, iacScanManager.iacScannerResults[0].Results)
+ }
+}
+
+func TestIacParseResults_ResultsContainIacViolations(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+ // Arrange
+ iacScanManager := newIacScanManager(scanner)
+ iacScanManager.scanner.ResultsFileName = filepath.Join(jas.GetTestDataPath(), "iac-scan", "contains-iac-violations.sarif")
+
+ // Act
+ var err error
+ iacScanManager.iacScannerResults, err = jas.ReadJasScanRunsFromFile(iacScanManager.scanner.ResultsFileName, scanner.JFrogAppsConfig.Modules[0].SourceRoot, iacDocsUrlSuffix)
+ if assert.NoError(t, err) && assert.NotNil(t, iacScanManager.iacScannerResults) {
+ assert.Len(t, iacScanManager.iacScannerResults, 1)
+ assert.Len(t, iacScanManager.iacScannerResults[0].Results, 4)
+ }
+}
diff --git a/commands/audit/jas/sast/sastscanner.go b/commands/audit/jas/sast/sastscanner.go
new file mode 100644
index 00000000..f81d869a
--- /dev/null
+++ b/commands/audit/jas/sast/sastscanner.go
@@ -0,0 +1,147 @@
+package sast
+
+import (
+ "fmt"
+
+ "path/filepath"
+
+ jfrogappsconfig "github.com/jfrog/jfrog-apps-config/go"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ "github.com/owenrumney/go-sarif/v2/sarif"
+ "golang.org/x/exp/maps"
+)
+
+const (
+ sastScannerType = "sast"
+ sastScanCommand = "zd"
+ sastDocsUrlSuffix = "sast"
+)
+
+type SastScanManager struct {
+ sastScannerResults []*sarif.Run
+ scanner *jas.JasScanner
+}
+
+func RunSastScan(scanner *jas.JasScanner) (results []*sarif.Run, err error) {
+ sastScanManager := newSastScanManager(scanner)
+ log.Info("Running SAST scanning...")
+ if err = sastScanManager.scanner.Run(sastScanManager); err != nil {
+ err = utils.ParseAnalyzerManagerError(utils.Sast, err)
+ return
+ }
+ if len(sastScanManager.sastScannerResults) > 0 {
+ log.Info("Found", utils.GetResultsLocationCount(sastScanManager.sastScannerResults...), "SAST vulnerabilities")
+ }
+ results = sastScanManager.sastScannerResults
+ return
+}
+
+func newSastScanManager(scanner *jas.JasScanner) (manager *SastScanManager) {
+ return &SastScanManager{
+ sastScannerResults: []*sarif.Run{},
+ scanner: scanner,
+ }
+}
+
+func (ssm *SastScanManager) Run(module jfrogappsconfig.Module) (err error) {
+ if jas.ShouldSkipScanner(module, utils.Sast) {
+ return
+ }
+ if err = ssm.createConfigFile(module); err != nil {
+ return
+ }
+	scanner := ssm.scanner
+	if err = ssm.runAnalyzerManager(filepath.Dir(scanner.AnalyzerManager.AnalyzerManagerFullPath)); err != nil {
+ return
+ }
+ workingDirRuns, err := jas.ReadJasScanRunsFromFile(scanner.ResultsFileName, module.SourceRoot, sastDocsUrlSuffix)
+ if err != nil {
+ return
+ }
+ groupResultsByLocation(workingDirRuns)
+ ssm.sastScannerResults = append(ssm.sastScannerResults, workingDirRuns...)
+ return
+}
+
+type sastScanConfig struct {
+ Scans []scanConfiguration `yaml:"scans,omitempty"`
+}
+
+type scanConfiguration struct {
+ Roots []string `yaml:"roots,omitempty"`
+ Type string `yaml:"type,omitempty"`
+ Language string `yaml:"language,omitempty"`
+ ExcludePatterns []string `yaml:"exclude_patterns,omitempty"`
+ ExcludedRules []string `yaml:"excluded-rules,omitempty"`
+}
+
+func (ssm *SastScanManager) createConfigFile(module jfrogappsconfig.Module) error {
+ sastScanner := module.Scanners.Sast
+ if sastScanner == nil {
+ sastScanner = &jfrogappsconfig.SastScanner{}
+ }
+ roots, err := jas.GetSourceRoots(module, &sastScanner.Scanner)
+ if err != nil {
+ return err
+ }
+ configFileContent := sastScanConfig{
+ Scans: []scanConfiguration{
+ {
+ Type: sastScannerType,
+ Roots: roots,
+ Language: sastScanner.Language,
+ ExcludedRules: sastScanner.ExcludedRules,
+ ExcludePatterns: jas.GetExcludePatterns(module, &sastScanner.Scanner),
+ },
+ },
+ }
+ return jas.CreateScannersConfigFile(ssm.scanner.ConfigFileName, configFileContent, utils.Sast)
+}
+
+func (ssm *SastScanManager) runAnalyzerManager(wd string) error {
+ return ssm.scanner.AnalyzerManager.ExecWithOutputFile(ssm.scanner.ConfigFileName, sastScanCommand, wd, ssm.scanner.ResultsFileName, ssm.scanner.ServerDetails)
+}
+
+// In the SAST scanner, multiple results may share the same location and
+// differ only in their CodeFlow values. We merge such results into a single
+// result that carries all of the code flows.
+func groupResultsByLocation(sarifRuns []*sarif.Run) {
+ for _, sastRun := range sarifRuns {
+ locationToResult := map[string]*sarif.Result{}
+ for _, sastResult := range sastRun.Results {
+ resultID := getResultId(sastResult)
+ if result, exists := locationToResult[resultID]; exists {
+ result.CodeFlows = append(result.CodeFlows, sastResult.CodeFlows...)
+ } else {
+ locationToResult[resultID] = sastResult
+ }
+ }
+ sastRun.Results = maps.Values(locationToResult)
+ }
+}
+
+func getResultLocationStr(result *sarif.Result) string {
+ if len(result.Locations) == 0 {
+ return ""
+ }
+ location := result.Locations[0]
+	return fmt.Sprintf("%s|%d|%d|%d|%d",
+ utils.GetLocationFileName(location),
+ utils.GetLocationStartLine(location),
+ utils.GetLocationStartColumn(location),
+ utils.GetLocationEndLine(location),
+ utils.GetLocationEndColumn(location))
+}
+
+func getResultRuleId(result *sarif.Result) string {
+ if result.RuleID == nil {
+ return ""
+ }
+ return *result.RuleID
+}
+
+func getResultId(result *sarif.Result) string {
+ return getResultRuleId(result) + utils.GetResultSeverity(result) + utils.GetResultMsgText(result) + getResultLocationStr(result)
+}
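`groupResultsByLocation` above merges SAST results whose rule ID, severity, message, and first location all coincide, concatenating their CodeFlows. The same technique, sketched on simplified stand-in types (the real code operates on `*sarif.Result` and uses `maps.Values`, so its output order is unspecified; this sketch preserves insertion order for readability):

```go
package main

import "fmt"

// result is a simplified stand-in for *sarif.Result.
type result struct {
	ruleID, severity, msg, location string
	codeFlows                       []string
}

// key mirrors getResultId: results agreeing on all identity fields collapse together.
func key(r result) string {
	return r.ruleID + "|" + r.severity + "|" + r.msg + "|" + r.location
}

// groupByLocation merges results sharing the same key, appending their code flows.
func groupByLocation(results []result) []result {
	byKey := map[string]*result{}
	var order []string
	for _, r := range results {
		k := key(r)
		if existing, ok := byKey[k]; ok {
			existing.codeFlows = append(existing.codeFlows, r.codeFlows...)
			continue
		}
		dup := r
		byKey[k] = &dup
		order = append(order, k)
	}
	grouped := make([]result, 0, len(order))
	for _, k := range order {
		grouped = append(grouped, *byKey[k])
	}
	return grouped
}

func main() {
	in := []result{
		{"rule1", "info", "msg", "file:1", []string{"flowA"}},
		{"rule1", "info", "msg", "file:1", []string{"flowB"}},
		{"rule1", "info", "msg", "file:5", []string{"flowC"}},
	}
	out := groupByLocation(in)
	fmt.Println(len(out), out[0].codeFlows) // 2 [flowA flowB]
}
```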
diff --git a/commands/audit/jas/sast/sastscanner_test.go b/commands/audit/jas/sast/sastscanner_test.go
new file mode 100644
index 00000000..5dcd2110
--- /dev/null
+++ b/commands/audit/jas/sast/sastscanner_test.go
@@ -0,0 +1,156 @@
+package sast
+
+import (
+ "path/filepath"
+ "testing"
+
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/owenrumney/go-sarif/v2/sarif"
+
+ "github.com/stretchr/testify/assert"
+)
+
+func TestNewSastScanManager(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t, "currentDir")
+ defer cleanUp()
+ // Act
+ sastScanManager := newSastScanManager(scanner)
+
+ // Assert
+ if assert.NotNil(t, sastScanManager) {
+ assert.NotEmpty(t, sastScanManager.scanner.ConfigFileName)
+ assert.NotEmpty(t, sastScanManager.scanner.ResultsFileName)
+ assert.NotEmpty(t, sastScanManager.scanner.JFrogAppsConfig.Modules[0].SourceRoot)
+ assert.Equal(t, &jas.FakeServerDetails, sastScanManager.scanner.ServerDetails)
+ }
+}
+
+func TestSastParseResults_EmptyResults(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+
+ // Arrange
+ sastScanManager := newSastScanManager(scanner)
+ sastScanManager.scanner.ResultsFileName = filepath.Join(jas.GetTestDataPath(), "sast-scan", "no-violations.sarif")
+
+ // Act
+ var err error
+ sastScanManager.sastScannerResults, err = jas.ReadJasScanRunsFromFile(sastScanManager.scanner.ResultsFileName, scanner.JFrogAppsConfig.Modules[0].SourceRoot, sastDocsUrlSuffix)
+
+ // Assert
+ if assert.NoError(t, err) && assert.NotNil(t, sastScanManager.sastScannerResults) {
+ assert.Len(t, sastScanManager.sastScannerResults, 1)
+ assert.Empty(t, sastScanManager.sastScannerResults[0].Results)
+ groupResultsByLocation(sastScanManager.sastScannerResults)
+ assert.Len(t, sastScanManager.sastScannerResults, 1)
+ assert.Empty(t, sastScanManager.sastScannerResults[0].Results)
+ }
+}
+
+func TestSastParseResults_ResultsContainSastViolations(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+ // Arrange
+ sastScanManager := newSastScanManager(scanner)
+ sastScanManager.scanner.ResultsFileName = filepath.Join(jas.GetTestDataPath(), "sast-scan", "contains-sast-violations.sarif")
+
+ // Act
+ var err error
+ sastScanManager.sastScannerResults, err = jas.ReadJasScanRunsFromFile(sastScanManager.scanner.ResultsFileName, scanner.JFrogAppsConfig.Modules[0].SourceRoot, sastDocsUrlSuffix)
+
+ // Assert
+ if assert.NoError(t, err) && assert.NotNil(t, sastScanManager.sastScannerResults) {
+ assert.Len(t, sastScanManager.sastScannerResults, 1)
+ assert.NotEmpty(t, sastScanManager.sastScannerResults[0].Results)
+ groupResultsByLocation(sastScanManager.sastScannerResults)
+ // File has 4 results, 2 of them at the same location different codeFlow
+ assert.Len(t, sastScanManager.sastScannerResults[0].Results, 3)
+ }
+}
+
+func TestGroupResultsByLocation(t *testing.T) {
+ tests := []struct {
+ run *sarif.Run
+ expectedOutput *sarif.Run
+ }{
+ {
+ run: utils.CreateRunWithDummyResults(),
+ expectedOutput: utils.CreateRunWithDummyResults(),
+ },
+ {
+ // No similar groups at all
+ run: utils.CreateRunWithDummyResults(
+ utils.CreateResultWithOneLocation("file", 1, 2, 3, 4, "snippet", "rule1", "info"),
+ utils.CreateResultWithOneLocation("file", 1, 2, 3, 4, "snippet", "rule1", "note"),
+ utils.CreateResultWithOneLocation("file", 5, 6, 7, 8, "snippet", "rule1", "info"),
+ utils.CreateResultWithOneLocation("file2", 1, 2, 3, 4, "snippet", "rule1", "info").WithCodeFlows([]*sarif.CodeFlow{
+ utils.CreateCodeFlow(utils.CreateThreadFlow(
+ utils.CreateLocation("other", 0, 0, 0, 0, "other-snippet"),
+ utils.CreateLocation("file2", 1, 2, 3, 4, "snippet"),
+ )),
+ }),
+ utils.CreateResultWithOneLocation("file2", 1, 2, 3, 4, "snippet", "rule2", "info").WithCodeFlows([]*sarif.CodeFlow{
+ utils.CreateCodeFlow(utils.CreateThreadFlow(
+ utils.CreateLocation("other2", 1, 1, 1, 1, "other-snippet2"),
+ utils.CreateLocation("file2", 1, 2, 3, 4, "snippet"),
+ )),
+ }),
+ ),
+ expectedOutput: utils.CreateRunWithDummyResults(
+ utils.CreateResultWithOneLocation("file", 1, 2, 3, 4, "snippet", "rule1", "info"),
+ utils.CreateResultWithOneLocation("file", 1, 2, 3, 4, "snippet", "rule1", "note"),
+ utils.CreateResultWithOneLocation("file", 5, 6, 7, 8, "snippet", "rule1", "info"),
+ utils.CreateResultWithOneLocation("file2", 1, 2, 3, 4, "snippet", "rule1", "info").WithCodeFlows([]*sarif.CodeFlow{
+ utils.CreateCodeFlow(utils.CreateThreadFlow(
+ utils.CreateLocation("other", 0, 0, 0, 0, "other-snippet"),
+ utils.CreateLocation("file2", 1, 2, 3, 4, "snippet"),
+ )),
+ }),
+ utils.CreateResultWithOneLocation("file2", 1, 2, 3, 4, "snippet", "rule2", "info").WithCodeFlows([]*sarif.CodeFlow{
+ utils.CreateCodeFlow(utils.CreateThreadFlow(
+ utils.CreateLocation("other2", 1, 1, 1, 1, "other-snippet2"),
+ utils.CreateLocation("file2", 1, 2, 3, 4, "snippet"),
+ )),
+ }),
+ ),
+ },
+ {
+ // With similar groups
+ run: utils.CreateRunWithDummyResults(
+ utils.CreateResultWithOneLocation("file", 1, 2, 3, 4, "snippet", "rule1", "info").WithCodeFlows([]*sarif.CodeFlow{
+ utils.CreateCodeFlow(utils.CreateThreadFlow(
+ utils.CreateLocation("other", 0, 0, 0, 0, "other-snippet"),
+ utils.CreateLocation("file", 1, 2, 3, 4, "snippet"),
+ )),
+ }),
+ utils.CreateResultWithOneLocation("file", 1, 2, 3, 4, "snippet", "rule1", "info").WithCodeFlows([]*sarif.CodeFlow{
+ utils.CreateCodeFlow(utils.CreateThreadFlow(
+ utils.CreateLocation("other2", 1, 1, 1, 1, "other-snippet"),
+ utils.CreateLocation("file", 1, 2, 3, 4, "snippet"),
+ )),
+ }),
+ utils.CreateResultWithOneLocation("file", 5, 6, 7, 8, "snippet", "rule1", "info"),
+ utils.CreateResultWithOneLocation("file", 1, 2, 3, 4, "snippet", "rule1", "info"),
+ ),
+ expectedOutput: utils.CreateRunWithDummyResults(
+ utils.CreateResultWithOneLocation("file", 1, 2, 3, 4, "snippet", "rule1", "info").WithCodeFlows([]*sarif.CodeFlow{
+ utils.CreateCodeFlow(utils.CreateThreadFlow(
+ utils.CreateLocation("other", 0, 0, 0, 0, "other-snippet"),
+ utils.CreateLocation("file", 1, 2, 3, 4, "snippet"),
+ )),
+ utils.CreateCodeFlow(utils.CreateThreadFlow(
+ utils.CreateLocation("other2", 1, 1, 1, 1, "other-snippet"),
+ utils.CreateLocation("file", 1, 2, 3, 4, "snippet"),
+ )),
+ }),
+ utils.CreateResultWithOneLocation("file", 5, 6, 7, 8, "snippet", "rule1", "info"),
+ ),
+ },
+ }
+
+ for _, test := range tests {
+ groupResultsByLocation([]*sarif.Run{test.run})
+ assert.ElementsMatch(t, test.expectedOutput.Results, test.run.Results)
+ }
+}
diff --git a/commands/audit/jas/secrets/secretsscanner.go b/commands/audit/jas/secrets/secretsscanner.go
new file mode 100644
index 00000000..ca9d2ce7
--- /dev/null
+++ b/commands/audit/jas/secrets/secretsscanner.go
@@ -0,0 +1,121 @@
+package secrets
+
+import (
+ "path/filepath"
+ "strings"
+
+ jfrogappsconfig "github.com/jfrog/jfrog-apps-config/go"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ "github.com/owenrumney/go-sarif/v2/sarif"
+)
+
+const (
+ secretsScanCommand = "sec"
+ secretsScannerType = "secrets-scan"
+ secretsDocsUrlSuffix = "secrets"
+)
+
+type SecretScanManager struct {
+ secretsScannerResults []*sarif.Run
+ scanner *jas.JasScanner
+}
+
+// RunSecretsScan runs the secrets scan flow, which includes the following steps:
+// Creating a SecretScanManager object.
+// Running the analyzer manager executable.
+// Parsing the analyzer manager results.
+// Return values:
+// []*sarif.Run: the secrets that were found.
+// error: an error object (if any).
+func RunSecretsScan(scanner *jas.JasScanner) (results []*sarif.Run, err error) {
+ secretScanManager := newSecretsScanManager(scanner)
+ log.Info("Running secrets scanning...")
+ if err = secretScanManager.scanner.Run(secretScanManager); err != nil {
+ err = utils.ParseAnalyzerManagerError(utils.Secrets, err)
+ return
+ }
+ results = secretScanManager.secretsScannerResults
+ if len(results) > 0 {
+ log.Info("Found", utils.GetResultsLocationCount(results...), "secrets")
+ }
+ return
+}
+
+func newSecretsScanManager(scanner *jas.JasScanner) (manager *SecretScanManager) {
+ return &SecretScanManager{
+ secretsScannerResults: []*sarif.Run{},
+ scanner: scanner,
+ }
+}
+
+func (ssm *SecretScanManager) Run(module jfrogappsconfig.Module) (err error) {
+ if jas.ShouldSkipScanner(module, utils.Secrets) {
+ return
+ }
+ if err = ssm.createConfigFile(module); err != nil {
+ return
+ }
+ if err = ssm.runAnalyzerManager(); err != nil {
+ return
+ }
+ workingDirRuns, err := jas.ReadJasScanRunsFromFile(ssm.scanner.ResultsFileName, module.SourceRoot, secretsDocsUrlSuffix)
+ if err != nil {
+ return
+ }
+ ssm.secretsScannerResults = append(ssm.secretsScannerResults, processSecretScanRuns(workingDirRuns)...)
+ return
+}
+
+type secretsScanConfig struct {
+ Scans []secretsScanConfiguration `yaml:"scans"`
+}
+
+type secretsScanConfiguration struct {
+ Roots []string `yaml:"roots"`
+ Output string `yaml:"output"`
+ Type string `yaml:"type"`
+ SkippedDirs []string `yaml:"skipped-folders"`
+}
+
+func (s *SecretScanManager) createConfigFile(module jfrogappsconfig.Module) error {
+ roots, err := jas.GetSourceRoots(module, module.Scanners.Secrets)
+ if err != nil {
+ return err
+ }
+ configFileContent := secretsScanConfig{
+ Scans: []secretsScanConfiguration{
+ {
+ Roots: roots,
+ Output: s.scanner.ResultsFileName,
+ Type: secretsScannerType,
+ SkippedDirs: jas.GetExcludePatterns(module, module.Scanners.Secrets),
+ },
+ },
+ }
+ return jas.CreateScannersConfigFile(s.scanner.ConfigFileName, configFileContent, utils.Secrets)
+}
+
+func (s *SecretScanManager) runAnalyzerManager() error {
+ return s.scanner.AnalyzerManager.Exec(s.scanner.ConfigFileName, secretsScanCommand, filepath.Dir(s.scanner.AnalyzerManager.AnalyzerManagerFullPath), s.scanner.ServerDetails)
+}
+
+func maskSecret(secret string) string {
+ if len(secret) <= 3 {
+ return "***"
+ }
+ return secret[:3] + strings.Repeat("*", 12)
+}
+
+func processSecretScanRuns(sarifRuns []*sarif.Run) []*sarif.Run {
+ for _, secretRun := range sarifRuns {
+ // Hide discovered secrets value
+ for _, secretResult := range secretRun.Results {
+ for _, location := range secretResult.Locations {
+ utils.SetLocationSnippet(location, maskSecret(utils.GetLocationSnippet(location)))
+ }
+ }
+ }
+ return sarifRuns
+}
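`maskSecret` above keeps only a three-character prefix of each detected secret and pads with a fixed run of asterisks; secrets of three characters or fewer are hidden entirely. The same masking rule as a standalone sketch:

```go
package main

import (
	"fmt"
	"strings"
)

// maskSecret mirrors the masking rule above: secrets of three characters or
// fewer are fully hidden; longer ones keep a three-character prefix followed
// by exactly twelve asterisks, so the real secret length is not revealed.
func maskSecret(secret string) string {
	if len(secret) <= 3 {
		return "***"
	}
	return secret[:3] + strings.Repeat("*", 12)
}

func main() {
	fmt.Println(maskSecret("12"))        // ***
	fmt.Println(maskSecret("123456789")) // 123************
}
```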
diff --git a/commands/audit/jas/secrets/secretsscanner_test.go b/commands/audit/jas/secrets/secretsscanner_test.go
new file mode 100644
index 00000000..d10eb7e0
--- /dev/null
+++ b/commands/audit/jas/secrets/secretsscanner_test.go
@@ -0,0 +1,132 @@
+package secrets
+
+import (
+ "os"
+ "path/filepath"
+ "testing"
+
+ jfrogappsconfig "github.com/jfrog/jfrog-apps-config/go"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas"
+
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/stretchr/testify/assert"
+)
+
+func TestNewSecretsScanManager(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+ secretScanManager := newSecretsScanManager(scanner)
+
+ assert.NotEmpty(t, secretScanManager)
+ assert.NotEmpty(t, secretScanManager.scanner.ConfigFileName)
+ assert.NotEmpty(t, secretScanManager.scanner.ResultsFileName)
+ assert.Equal(t, &jas.FakeServerDetails, secretScanManager.scanner.ServerDetails)
+}
+
+func TestSecretsScan_CreateConfigFile_VerifyFileWasCreated(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+ secretScanManager := newSecretsScanManager(scanner)
+
+ currWd, err := coreutils.GetWorkingDirectory()
+ assert.NoError(t, err)
+ err = secretScanManager.createConfigFile(jfrogappsconfig.Module{SourceRoot: currWd})
+ assert.NoError(t, err)
+
+ defer func() {
+ err = os.Remove(secretScanManager.scanner.ConfigFileName)
+ assert.NoError(t, err)
+ }()
+
+ _, fileNotExistError := os.Stat(secretScanManager.scanner.ConfigFileName)
+ assert.NoError(t, fileNotExistError)
+ fileContent, err := os.ReadFile(secretScanManager.scanner.ConfigFileName)
+ assert.NoError(t, err)
+ assert.True(t, len(fileContent) > 0)
+}
+
+func TestRunAnalyzerManager_ReturnsGeneralError(t *testing.T) {
+ defer func() {
+ os.Clearenv()
+ }()
+
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+
+ secretScanManager := newSecretsScanManager(scanner)
+ assert.Error(t, secretScanManager.runAnalyzerManager())
+}
+
+func TestParseResults_EmptyResults(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+ // Arrange
+ secretScanManager := newSecretsScanManager(scanner)
+ secretScanManager.scanner.ResultsFileName = filepath.Join(jas.GetTestDataPath(), "secrets-scan", "no-secrets.sarif")
+
+ // Act
+ var err error
+ secretScanManager.secretsScannerResults, err = jas.ReadJasScanRunsFromFile(secretScanManager.scanner.ResultsFileName, scanner.JFrogAppsConfig.Modules[0].SourceRoot, secretsDocsUrlSuffix)
+
+ // Assert
+ if assert.NoError(t, err) && assert.NotNil(t, secretScanManager.secretsScannerResults) {
+ assert.Len(t, secretScanManager.secretsScannerResults, 1)
+ assert.Empty(t, secretScanManager.secretsScannerResults[0].Results)
+ secretScanManager.secretsScannerResults = processSecretScanRuns(secretScanManager.secretsScannerResults)
+ assert.Len(t, secretScanManager.secretsScannerResults, 1)
+ assert.Empty(t, secretScanManager.secretsScannerResults[0].Results)
+ }
+}
+
+func TestParseResults_ResultsContainSecrets(t *testing.T) {
+ // Arrange
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+
+ secretScanManager := newSecretsScanManager(scanner)
+ secretScanManager.scanner.ResultsFileName = filepath.Join(jas.GetTestDataPath(), "secrets-scan", "contain-secrets.sarif")
+
+ // Act
+ var err error
+ secretScanManager.secretsScannerResults, err = jas.ReadJasScanRunsFromFile(secretScanManager.scanner.ResultsFileName, scanner.JFrogAppsConfig.Modules[0].SourceRoot, secretsDocsUrlSuffix)
+
+ // Assert
+ if assert.NoError(t, err) && assert.NotNil(t, secretScanManager.secretsScannerResults) {
+ assert.Len(t, secretScanManager.secretsScannerResults, 1)
+ assert.NotEmpty(t, secretScanManager.secretsScannerResults[0].Results)
+ secretScanManager.secretsScannerResults = processSecretScanRuns(secretScanManager.secretsScannerResults)
+ assert.Len(t, secretScanManager.secretsScannerResults, 1)
+ assert.Len(t, secretScanManager.secretsScannerResults[0].Results, 7)
+ }
+}
+
+func TestGetSecretsScanResults_AnalyzerManagerReturnsError(t *testing.T) {
+ scanner, cleanUp := jas.InitJasTest(t)
+ defer cleanUp()
+
+ secretsResults, err := RunSecretsScan(scanner)
+
+ assert.Error(t, err)
+ assert.ErrorContains(t, err, "failed to run Secrets scan")
+ assert.Nil(t, secretsResults)
+}
+
+func TestMaskSecret(t *testing.T) {
+ tests := []struct {
+ secret string
+ expectedOutput string
+ }{
+ {secret: "", expectedOutput: "***"},
+ {secret: "12", expectedOutput: "***"},
+ {secret: "123", expectedOutput: "***"},
+ {secret: "123456789", expectedOutput: "123************"},
+ {secret: "3478hfnkjhvd848446gghgfh", expectedOutput: "347************"},
+ }
+
+ for _, test := range tests {
+ assert.Equal(t, test.expectedOutput, maskSecret(test.secret))
+ }
+}
diff --git a/commands/audit/jas/testdata/.jfrog/jfrog-apps-config.yml b/commands/audit/jas/testdata/.jfrog/jfrog-apps-config.yml
new file mode 100644
index 00000000..9357059d
--- /dev/null
+++ b/commands/audit/jas/testdata/.jfrog/jfrog-apps-config.yml
@@ -0,0 +1,50 @@
+# [Required] JFrog Applications Config version
+version: "1.0"
+
+modules:
+ # [Required] Module name
+ - name: FrogLeapApp
+ # [Optional, default: "."] Application's root directory
+ source_root: "src"
+ # [Optional] Directories to exclude from scanning across all scanners
+ exclude_patterns:
+ - "docs/"
+ # [Optional] Scanners to exclude from JFrog Advanced Security (Options: "secrets", "sast", "iac")
+ exclude_scanners:
+ - secrets
+ # [Optional] Customize scanner configurations
+ scanners:
+ # [Optional] Configuration for Static Application Security Testing (SAST)
+ sast:
+ # [Optional] Specify the programming language for SAST
+ language: java
+ # [Optional] Working directories specific to SAST (Relative to source_root)
+ working_dirs:
+ - "src/module1"
+ - "src/module2"
+ # [Optional] Additional exclude patterns for this scanner
+ exclude_patterns:
+ - "src/module1/test"
+ # [Optional] List of specific scan rules to exclude from the scan
+ excluded_rules:
+ - xss-injection
+
+ # [Optional] Configuration for secrets scan
+ secrets:
+ # [Optional] Working directories specific to the secret scanner (Relative to source_root)
+ working_dirs:
+ - "src/module1"
+ - "src/module2"
+ # [Optional] Additional exclude patterns for this scanner
+ exclude_patterns:
+ - "src/module1/test"
+
+ # [Optional] Configuration for Infrastructure as Code scan (IaC)
+ iac:
+ # [Optional] Working directories specific to IaC (Relative to source_root)
+ working_dirs:
+ - "src/module1"
+ - "src/module2"
+ # [Optional] Additional exclude patterns for this Scanner
+ exclude_patterns:
+ - "src/module1/test"
\ No newline at end of file
diff --git a/commands/audit/jasrunner.go b/commands/audit/jasrunner.go
new file mode 100644
index 00000000..3d4fadad
--- /dev/null
+++ b/commands/audit/jasrunner.go
@@ -0,0 +1,60 @@
+package audit
+
+import (
+ "errors"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas/applicability"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas/iac"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas/sast"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas/secrets"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/utils/io"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+)
+
+func runJasScannersAndSetResults(scanResults *utils.Results, directDependencies []string,
+ serverDetails *config.ServerDetails, workingDirs []string, progress io.ProgressMgr, multiScanId string, thirdPartyApplicabilityScan bool) (err error) {
+ if serverDetails == nil || len(serverDetails.Url) == 0 {
+ log.Warn("To include 'Advanced Security' scan as part of the audit output, please run the 'jf c add' command before running this command.")
+ return
+ }
+ scanner, err := jas.NewJasScanner(workingDirs, serverDetails, multiScanId)
+ if err != nil {
+ return
+ }
+ defer func() {
+ cleanup := scanner.ScannerDirCleanupFunc
+ err = errors.Join(err, cleanup())
+ }()
+ if progress != nil {
+ progress.SetHeadlineMsg("Running applicability scanning")
+ }
+ scanResults.ExtendedScanResults.ApplicabilityScanResults, err = applicability.RunApplicabilityScan(scanResults.GetScaScansXrayResults(), directDependencies, scanResults.GetScaScannedTechnologies(), scanner, thirdPartyApplicabilityScan)
+ if err != nil {
+ return
+ }
+ // Don't execute other scanners when scanning third party dependencies.
+ if thirdPartyApplicabilityScan {
+ return
+ }
+ if progress != nil {
+ progress.SetHeadlineMsg("Running secrets scanning")
+ }
+ scanResults.ExtendedScanResults.SecretsScanResults, err = secrets.RunSecretsScan(scanner)
+ if err != nil {
+ return
+ }
+ if progress != nil {
+ progress.SetHeadlineMsg("Running IaC scanning")
+ }
+ scanResults.ExtendedScanResults.IacScanResults, err = iac.RunIacScan(scanner)
+ if err != nil {
+ return
+ }
+ if progress != nil {
+ progress.SetHeadlineMsg("Running SAST scanning")
+ }
+ scanResults.ExtendedScanResults.SastScanResults, err = sast.RunSastScan(scanner)
+ return
+}
diff --git a/commands/audit/jasrunner_test.go b/commands/audit/jasrunner_test.go
new file mode 100644
index 00000000..2acd536f
--- /dev/null
+++ b/commands/audit/jasrunner_test.go
@@ -0,0 +1,44 @@
+package audit
+
+import (
+ "os"
+ "testing"
+
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/jas"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+ "github.com/stretchr/testify/assert"
+)
+
+func TestGetExtendedScanResults_AnalyzerManagerDoesntExist(t *testing.T) {
+ tmpDir, err := fileutils.CreateTempDir()
+ defer func() {
+ assert.NoError(t, fileutils.RemoveTempDir(tmpDir))
+ }()
+ assert.NoError(t, err)
+ assert.NoError(t, os.Setenv(coreutils.HomeDir, tmpDir))
+ defer func() {
+ assert.NoError(t, os.Unsetenv(coreutils.HomeDir))
+ }()
+ scanResults := &utils.Results{ScaResults: []utils.ScaScanResult{{Technology: coreutils.Yarn, XrayResults: jas.FakeBasicXrayResults}}, ExtendedScanResults: &utils.ExtendedScanResults{}}
+ err = runJasScannersAndSetResults(scanResults, []string{"issueId_1_direct_dependency", "issueId_2_direct_dependency"}, &jas.FakeServerDetails, nil, nil, "", false)
+ // Expect error:
+ assert.Error(t, err)
+}
+
+func TestGetExtendedScanResults_ServerNotValid(t *testing.T) {
+ scanResults := &utils.Results{ScaResults: []utils.ScaScanResult{{Technology: coreutils.Pip, XrayResults: jas.FakeBasicXrayResults}}, ExtendedScanResults: &utils.ExtendedScanResults{}}
+ err := runJasScannersAndSetResults(scanResults, []string{"issueId_1_direct_dependency", "issueId_2_direct_dependency"}, nil, nil, nil, "", false)
+ assert.NoError(t, err)
+}
+
+func TestGetExtendedScanResults_AnalyzerManagerReturnsError(t *testing.T) {
+ assert.NoError(t, utils.DownloadAnalyzerManagerIfNeeded())
+
+ scanResults := &utils.Results{ScaResults: []utils.ScaScanResult{{Technology: coreutils.Yarn, XrayResults: jas.FakeBasicXrayResults}}, ExtendedScanResults: &utils.ExtendedScanResults{}}
+ err := runJasScannersAndSetResults(scanResults, []string{"issueId_2_direct_dependency", "issueId_1_direct_dependency"}, &jas.FakeServerDetails, nil, nil, "", false)
+
+ // Expect error:
+ assert.ErrorContains(t, err, "failed to run Applicability scan")
+}
diff --git a/commands/audit/sca/common.go b/commands/audit/sca/common.go
new file mode 100644
index 00000000..260ece62
--- /dev/null
+++ b/commands/audit/sca/common.go
@@ -0,0 +1,152 @@
+package sca
+
+import (
+ "fmt"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
+ "github.com/jfrog/jfrog-cli-security/scangraph"
+ "github.com/jfrog/jfrog-client-go/utils/errorutils"
+ ioUtils "github.com/jfrog/jfrog-client-go/utils/io"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ "github.com/jfrog/jfrog-client-go/xray/services"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+ "os/exec"
+ "path/filepath"
+ "strings"
+ "testing"
+)
+
+func RunXrayDependenciesTreeScanGraph(dependencyTree *xrayUtils.GraphNode, progress ioUtils.ProgressMgr, technology coreutils.Technology, scanGraphParams *scangraph.ScanGraphParams) (results []services.ScanResponse, err error) {
+ scanGraphParams.XrayGraphScanParams().DependenciesGraph = dependencyTree
+ xscGitInfoContext := scanGraphParams.XrayGraphScanParams().XscGitInfoContext
+ if xscGitInfoContext != nil {
+ xscGitInfoContext.Technologies = []string{technology.String()}
+ }
+ scanMessage := fmt.Sprintf("Scanning %d %s dependencies", len(dependencyTree.Nodes), technology)
+ if progress != nil {
+ progress.SetHeadlineMsg(scanMessage)
+ }
+ log.Info(scanMessage + "...")
+ var scanResults *services.ScanResponse
+ scanResults, err = scangraph.RunScanGraphAndGetResults(scanGraphParams)
+ if err != nil {
+ err = errorutils.CheckErrorf("scanning %s dependencies failed with error: %s", string(technology), err.Error())
+ return
+ }
+ for i := range scanResults.Vulnerabilities {
+ scanResults.Vulnerabilities[i].Technology = technology.String()
+ }
+ for i := range scanResults.Violations {
+ scanResults.Violations[i].Technology = technology.String()
+ }
+ results = append(results, *scanResults)
+ return
+}
+
+func CreateTestWorkspace(t *testing.T, sourceDir string) (string, func()) {
+ return tests.CreateTestWorkspace(t, filepath.Join("..", "..", "..", "..", "tests", "testdata", sourceDir))
+}
+
+// LogExecutableVersion runs "<executable> --version" and prints the result to the debug log if possible.
+// Only supported for package managers that accept the "--version" flag.
+func LogExecutableVersion(executable string) {
+ verBytes, err := exec.Command(executable, "--version").CombinedOutput()
+ if err != nil {
+ log.Debug(fmt.Sprintf("%q --version command received an error: %s", executable, err.Error()))
+ return
+ }
+ if len(verBytes) == 0 {
+ log.Debug(fmt.Sprintf("%q --version command received an empty response", executable))
+ return
+ }
+ version := strings.TrimSpace(string(verBytes))
+ log.Debug(fmt.Sprintf("Used %q version: %s", executable, version))
+}
+
+// BuildImpactPathsForScanResponse builds the full impact paths for each vulnerability found in the scanResult argument, using the dependencyTrees argument.
+// Returns the updated services.ScanResponse slice.
+func BuildImpactPathsForScanResponse(scanResult []services.ScanResponse, dependencyTree []*xrayUtils.GraphNode) []services.ScanResponse {
+ for _, result := range scanResult {
+ if len(result.Vulnerabilities) > 0 {
+ buildVulnerabilitiesImpactPaths(result.Vulnerabilities, dependencyTree)
+ }
+ if len(result.Violations) > 0 {
+ buildViolationsImpactPaths(result.Violations, dependencyTree)
+ }
+ if len(result.Licenses) > 0 {
+ buildLicensesImpactPaths(result.Licenses, dependencyTree)
+ }
+ }
+ return scanResult
+}
+
+// Initialize a map of issues to empty impact paths
+func fillIssuesMapWithEmptyImpactPaths(issuesImpactPathsMap map[string][][]services.ImpactPathNode, components map[string]services.Component) {
+ for dependencyName := range components {
+ issuesImpactPathsMap[dependencyName] = [][]services.ImpactPathNode{}
+ }
+}
+
+// Set the impact paths for each issue in the map
+func buildImpactPaths(issuesImpactPathsMap map[string][][]services.ImpactPathNode, dependencyTrees []*xrayUtils.GraphNode) {
+ for _, dependency := range dependencyTrees {
+ setPathsForIssues(dependency, issuesImpactPathsMap, []services.ImpactPathNode{})
+ }
+}
+
+func buildVulnerabilitiesImpactPaths(vulnerabilities []services.Vulnerability, dependencyTrees []*xrayUtils.GraphNode) {
+ issuesMap := make(map[string][][]services.ImpactPathNode)
+ for _, vulnerability := range vulnerabilities {
+ fillIssuesMapWithEmptyImpactPaths(issuesMap, vulnerability.Components)
+ }
+ buildImpactPaths(issuesMap, dependencyTrees)
+ for i := range vulnerabilities {
+ updateComponentsWithImpactPaths(vulnerabilities[i].Components, issuesMap)
+ }
+}
+
+func buildViolationsImpactPaths(violations []services.Violation, dependencyTrees []*xrayUtils.GraphNode) {
+ issuesMap := make(map[string][][]services.ImpactPathNode)
+ for _, violation := range violations {
+ fillIssuesMapWithEmptyImpactPaths(issuesMap, violation.Components)
+ }
+ buildImpactPaths(issuesMap, dependencyTrees)
+ for i := range violations {
+ updateComponentsWithImpactPaths(violations[i].Components, issuesMap)
+ }
+}
+
+func buildLicensesImpactPaths(licenses []services.License, dependencyTrees []*xrayUtils.GraphNode) {
+ issuesMap := make(map[string][][]services.ImpactPathNode)
+ for _, license := range licenses {
+ fillIssuesMapWithEmptyImpactPaths(issuesMap, license.Components)
+ }
+ buildImpactPaths(issuesMap, dependencyTrees)
+ for i := range licenses {
+ updateComponentsWithImpactPaths(licenses[i].Components, issuesMap)
+ }
+}
+
+func updateComponentsWithImpactPaths(components map[string]services.Component, issuesMap map[string][][]services.ImpactPathNode) {
+ for dependencyName := range components {
+ updatedComponent := services.Component{
+ FixedVersions: components[dependencyName].FixedVersions,
+ ImpactPaths: issuesMap[dependencyName],
+ Cpes: components[dependencyName].Cpes,
+ }
+ components[dependencyName] = updatedComponent
+ }
+}
+
+func setPathsForIssues(dependency *xrayUtils.GraphNode, issuesImpactPathsMap map[string][][]services.ImpactPathNode, pathFromRoot []services.ImpactPathNode) {
+ pathFromRoot = append(pathFromRoot, services.ImpactPathNode{ComponentId: dependency.Id})
+ if _, exists := issuesImpactPathsMap[dependency.Id]; exists {
+ // Create a copy of pathFromRoot to avoid modifying the original slice
+ pathCopy := make([]services.ImpactPathNode, len(pathFromRoot))
+ copy(pathCopy, pathFromRoot)
+ issuesImpactPathsMap[dependency.Id] = append(issuesImpactPathsMap[dependency.Id], pathCopy)
+ }
+ for _, depChild := range dependency.Nodes {
+ setPathsForIssues(depChild, issuesImpactPathsMap, pathFromRoot)
+ }
+}
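`setPathsForIssues` walks the tree depth-first and must copy `pathFromRoot` before storing it, because `append` may reuse the same backing array across sibling branches. A self-contained sketch of that traversal with simplified local types (`node` and string ids stand in for the `xrayUtils`/`services` types):

```go
package main

import "fmt"

type node struct {
	id    string
	nodes []*node
}

// setPaths mirrors setPathsForIssues: record the root-to-node path for every
// id present in paths, copying the slice so later appends don't clobber it.
func setPaths(n *node, paths map[string][][]string, fromRoot []string) {
	fromRoot = append(fromRoot, n.id)
	if _, ok := paths[n.id]; ok {
		pathCopy := make([]string, len(fromRoot))
		copy(pathCopy, fromRoot)
		paths[n.id] = append(paths[n.id], pathCopy)
	}
	for _, child := range n.nodes {
		setPaths(child, paths, fromRoot)
	}
}

func main() {
	root := &node{id: "root", nodes: []*node{
		{id: "a", nodes: []*node{{id: "leaf"}}},
		{id: "b", nodes: []*node{{id: "leaf"}}},
	}}
	paths := map[string][][]string{"leaf": {}}
	setPaths(root, paths, nil)
	fmt.Println(paths["leaf"]) // "leaf" is reachable via both "a" and "b"
}
```

Without the explicit copy, the two recorded paths could share storage and the second branch's writes could corrupt the first path, which is exactly the pitfall the comment in `setPathsForIssues` guards against.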
diff --git a/commands/audit/sca/common_test.go b/commands/audit/sca/common_test.go
new file mode 100644
index 00000000..1fb1c2d7
--- /dev/null
+++ b/commands/audit/sca/common_test.go
@@ -0,0 +1,227 @@
+package sca
+
+import (
+ "reflect"
+ "testing"
+
+ "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
+ coreXray "github.com/jfrog/jfrog-cli-core/v2/utils/xray"
+ "github.com/jfrog/jfrog-client-go/xray/services"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+ "github.com/stretchr/testify/assert"
+)
+
+func TestBuildXrayDependencyTree(t *testing.T) {
+ treeHelper := make(map[string][]string)
+ rootDep := []string{"topDep1", "topDep2", "topDep3"}
+ topDep1 := []string{"midDep1", "midDep2"}
+ topDep2 := []string{"midDep2", "midDep3"}
+ midDep1 := []string{"bottomDep1"}
+ midDep2 := []string{"bottomDep2", "bottomDep3"}
+ bottomDep3 := []string{"leafDep"}
+ treeHelper["rootDep"] = rootDep
+ treeHelper["topDep1"] = topDep1
+ treeHelper["topDep2"] = topDep2
+ treeHelper["midDep1"] = midDep1
+ treeHelper["midDep2"] = midDep2
+ treeHelper["bottomDep3"] = bottomDep3
+
+ expectedUniqueDeps := []string{"rootDep", "topDep1", "topDep2", "topDep3", "midDep1", "midDep2", "midDep3", "bottomDep1", "bottomDep2", "bottomDep3", "leafDep"}
+
+ // Constructing the expected tree Nodes
+ leafDepNode := &xrayUtils.GraphNode{Id: "leafDep", Nodes: []*xrayUtils.GraphNode{}}
+ bottomDep3Node := &xrayUtils.GraphNode{Id: "bottomDep3", Nodes: []*xrayUtils.GraphNode{}}
+ bottomDep2Node := &xrayUtils.GraphNode{Id: "bottomDep2", Nodes: []*xrayUtils.GraphNode{}}
+ bottomDep1Node := &xrayUtils.GraphNode{Id: "bottomDep1", Nodes: []*xrayUtils.GraphNode{}}
+ midDep3Node := &xrayUtils.GraphNode{Id: "midDep3", Nodes: []*xrayUtils.GraphNode{}}
+ midDep2Node := &xrayUtils.GraphNode{Id: "midDep2", Nodes: []*xrayUtils.GraphNode{}}
+ midDep1Node := &xrayUtils.GraphNode{Id: "midDep1", Nodes: []*xrayUtils.GraphNode{}}
+ topDep3Node := &xrayUtils.GraphNode{Id: "topDep3", Nodes: []*xrayUtils.GraphNode{}}
+ topDep2Node := &xrayUtils.GraphNode{Id: "topDep2", Nodes: []*xrayUtils.GraphNode{}}
+ topDep1Node := &xrayUtils.GraphNode{Id: "topDep1", Nodes: []*xrayUtils.GraphNode{}}
+ rootNode := &xrayUtils.GraphNode{Id: "rootDep", Nodes: []*xrayUtils.GraphNode{}}
+
+ // Setting children to parents
+ bottomDep3Node.Nodes = append(bottomDep3Node.Nodes, leafDepNode)
+ midDep2Node.Nodes = append(midDep2Node.Nodes, bottomDep3Node)
+ midDep2Node.Nodes = append(midDep2Node.Nodes, bottomDep2Node)
+ midDep1Node.Nodes = append(midDep1Node.Nodes, bottomDep1Node)
+ topDep2Node.Nodes = append(topDep2Node.Nodes, midDep3Node)
+ topDep2Node.Nodes = append(topDep2Node.Nodes, midDep2Node)
+ topDep1Node.Nodes = append(topDep1Node.Nodes, midDep2Node)
+ topDep1Node.Nodes = append(topDep1Node.Nodes, midDep1Node)
+ rootNode.Nodes = append(rootNode.Nodes, topDep1Node)
+ rootNode.Nodes = append(rootNode.Nodes, topDep2Node)
+ rootNode.Nodes = append(rootNode.Nodes, topDep3Node)
+
+ // Setting parents to children
+ leafDepNode.Parent = bottomDep3Node
+ bottomDep3Node.Parent = midDep2Node
+ bottomDep2Node.Parent = midDep2Node
+ bottomDep1Node.Parent = midDep1Node
+ midDep3Node.Parent = topDep2Node
+ midDep2Node.Parent = topDep2Node
+ midDep2Node.Parent = topDep1Node
+ midDep1Node.Parent = topDep1Node
+ topDep1Node.Parent = rootNode
+ topDep2Node.Parent = rootNode
+ topDep3Node.Parent = rootNode
+
+ tree, uniqueDeps := coreXray.BuildXrayDependencyTree(treeHelper, "rootDep")
+
+ assert.ElementsMatch(t, expectedUniqueDeps, uniqueDeps)
+ assert.True(t, tests.CompareTree(tree, rootNode))
+}
+
+func TestSetPathsForIssues(t *testing.T) {
+ // Create a test dependency tree
+ rootNode := &xrayUtils.GraphNode{Id: "root"}
+ childNode1 := &xrayUtils.GraphNode{Id: "child1"}
+ childNode2 := &xrayUtils.GraphNode{Id: "child2"}
+ childNode3 := &xrayUtils.GraphNode{Id: "child3"}
+ childNode4 := &xrayUtils.GraphNode{Id: "child4"}
+ childNode5 := &xrayUtils.GraphNode{Id: "child5"}
+ rootNode.Nodes = []*xrayUtils.GraphNode{childNode1, childNode2, childNode3}
+ childNode2.Nodes = []*xrayUtils.GraphNode{childNode4}
+ childNode3.Nodes = []*xrayUtils.GraphNode{childNode5}
+
+ // Create a test issues map
+ issuesMap := make(map[string][][]services.ImpactPathNode)
+ issuesMap["child1"] = [][]services.ImpactPathNode{}
+ issuesMap["child4"] = [][]services.ImpactPathNode{}
+ issuesMap["child5"] = [][]services.ImpactPathNode{}
+
+ // Call setPathsForIssues with the test data
+ setPathsForIssues(rootNode, issuesMap, []services.ImpactPathNode{})
+
+ // Check the results
+ assert.Equal(t, issuesMap["child1"][0][0].ComponentId, "root")
+ assert.Equal(t, issuesMap["child1"][0][1].ComponentId, "child1")
+
+ assert.Equal(t, issuesMap["child4"][0][0].ComponentId, "root")
+ assert.Equal(t, issuesMap["child4"][0][1].ComponentId, "child2")
+ assert.Equal(t, issuesMap["child4"][0][2].ComponentId, "child4")
+
+ assert.Equal(t, issuesMap["child5"][0][0].ComponentId, "root")
+ assert.Equal(t, issuesMap["child5"][0][1].ComponentId, "child3")
+ assert.Equal(t, issuesMap["child5"][0][2].ComponentId, "child5")
+}
+
+func TestUpdateVulnerableComponent(t *testing.T) {
+ components := map[string]services.Component{
+ "dependency1": {
+ FixedVersions: []string{"1.0.0"},
+ ImpactPaths: [][]services.ImpactPathNode{},
+ },
+ }
+ dependencyName, issuesMap := "dependency1", map[string][][]services.ImpactPathNode{
+ "dependency1": {},
+ }
+
+ updateComponentsWithImpactPaths(components, issuesMap)
+
+ // Check the result
+ expected := services.Component{
+ FixedVersions: []string{"1.0.0"},
+ ImpactPaths: issuesMap[dependencyName],
+ }
+ assert.Equal(t, expected, components[dependencyName])
+}
+
+func TestBuildImpactPaths(t *testing.T) {
+ // create sample scan result and dependency trees
+ scanResult := []services.ScanResponse{
+ {
+ Vulnerabilities: []services.Vulnerability{
+ {
+ Components: map[string]services.Component{
+ "dep1": {
+ FixedVersions: []string{"1.2.3"},
+ Cpes: []string{"cpe:/o:vendor:product:1.2.3"},
+ },
+ "dep2": {
+ FixedVersions: []string{"3.0.0"},
+ },
+ },
+ },
+ },
+ Violations: []services.Violation{
+ {
+ Components: map[string]services.Component{
+ "dep2": {
+ FixedVersions: []string{"4.5.6"},
+ Cpes: []string{"cpe:/o:vendor:product:4.5.6"},
+ },
+ },
+ },
+ },
+ Licenses: []services.License{
+ {
+ Components: map[string]services.Component{
+ "dep3": {
+ FixedVersions: []string{"7.8.9"},
+ Cpes: []string{"cpe:/o:vendor:product:7.8.9"},
+ },
+ },
+ },
+ },
+ },
+ }
+ dependencyTrees := []*xrayUtils.GraphNode{
+ {
+ Id: "dep1",
+ Nodes: []*xrayUtils.GraphNode{
+ {
+ Id: "dep2",
+ Nodes: []*xrayUtils.GraphNode{
+ {
+ Id: "dep3",
+ Nodes: []*xrayUtils.GraphNode{},
+ },
+ },
+ },
+ },
+ },
+ {
+ Id: "dep7",
+ Nodes: []*xrayUtils.GraphNode{
+ {
+ Id: "dep4",
+ Nodes: []*xrayUtils.GraphNode{
+ {
+ Id: "dep2",
+ Nodes: []*xrayUtils.GraphNode{},
+ },
+ {
+ Id: "dep5",
+ Nodes: []*xrayUtils.GraphNode{},
+ },
+ {
+ Id: "dep6",
+ Nodes: []*xrayUtils.GraphNode{},
+ },
+ },
+ },
+ },
+ },
+ }
+
+ scanResult = BuildImpactPathsForScanResponse(scanResult, dependencyTrees)
+ // Assert that the components were updated with impact paths.
+ expectedImpactPaths := [][]services.ImpactPathNode{{{ComponentId: "dep1"}}}
+ assert.Equal(t, expectedImpactPaths, scanResult[0].Vulnerabilities[0].Components["dep1"].ImpactPaths)
+ expectedImpactPath := []services.ImpactPathNode{{ComponentId: "dep1"}, {ComponentId: "dep2"}}
+ assert.True(t, reflect.DeepEqual(expectedImpactPath, scanResult[0].Vulnerabilities[0].Components["dep2"].ImpactPaths[0]))
+ assert.True(t, reflect.DeepEqual(expectedImpactPath, scanResult[0].Violations[0].Components["dep2"].ImpactPaths[0]))
+ expectedImpactPath = []services.ImpactPathNode{{ComponentId: "dep7"}, {ComponentId: "dep4"}, {ComponentId: "dep2"}}
+ assert.True(t, reflect.DeepEqual(expectedImpactPath, scanResult[0].Vulnerabilities[0].Components["dep2"].ImpactPaths[1]))
+ assert.True(t, reflect.DeepEqual(expectedImpactPath, scanResult[0].Violations[0].Components["dep2"].ImpactPaths[1]))
+ expectedImpactPath = []services.ImpactPathNode{{ComponentId: "dep1"}, {ComponentId: "dep2"}, {ComponentId: "dep3"}}
+ assert.True(t, reflect.DeepEqual(expectedImpactPath, scanResult[0].Licenses[0].Components["dep3"].ImpactPaths[0]))
+}
diff --git a/commands/audit/sca/go/gloang_test.go b/commands/audit/sca/go/gloang_test.go
new file mode 100644
index 00000000..d8e5d063
--- /dev/null
+++ b/commands/audit/sca/go/gloang_test.go
@@ -0,0 +1,79 @@
+package _go
+
+import (
+ "os"
+ "path/filepath"
+ "strings"
+ "testing"
+
+ "github.com/jfrog/build-info-go/utils"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/sca"
+ xrayutils "github.com/jfrog/jfrog-cli-security/utils"
+
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+
+ "github.com/stretchr/testify/assert"
+)
+
+func TestBuildGoDependencyList(t *testing.T) {
+ // Create and change directory to test workspace
+ _, cleanUp := sca.CreateTestWorkspace(t, filepath.Join("projects", "package-managers", "go", "go-project"))
+ defer cleanUp()
+
+ err := removeTxtSuffix("go.mod.txt")
+ assert.NoError(t, err)
+ err = removeTxtSuffix("go.sum.txt")
+ assert.NoError(t, err)
+ err = removeTxtSuffix("test.go.txt")
+ assert.NoError(t, err)
+
+ // Run getModulesDependencyTrees
+ server := &config.ServerDetails{
+ Url: "https://api.go.here",
+ ArtifactoryUrl: "https://api.go.here/artifactory",
+ User: "user",
+ AccessToken: "sdsdccs2232",
+ }
+ goVersionID, err := getGoVersionAsDependency()
+ assert.NoError(t, err)
+ expectedUniqueDeps := []string{
+ goPackageTypeIdentifier + "golang.org/x/text:v0.3.3",
+ goPackageTypeIdentifier + "rsc.io/quote:v1.5.2",
+ goPackageTypeIdentifier + "rsc.io/sampler:v1.3.0",
+ goPackageTypeIdentifier + "testGoList",
+ goVersionID.Id,
+ }
+
+ auditBasicParams := (&xrayutils.AuditBasicParams{}).SetServerDetails(server).SetDepsRepo("test-remote")
+ rootNode, uniqueDeps, err := BuildDependencyTree(auditBasicParams)
+ assert.NoError(t, err)
+ assert.ElementsMatch(t, uniqueDeps, expectedUniqueDeps, "First is actual, Second is Expected")
+
+ assert.Equal(t, "https://user:sdsdccs2232@api.go.here/artifactoryapi/go/test-remote|direct", os.Getenv("GOPROXY"))
+ assert.NotEmpty(t, rootNode)
+
+ // Check root module
+ assert.Equal(t, rootNode[0].Id, goPackageTypeIdentifier+"testGoList")
+ assert.Len(t, rootNode[0].Nodes, 3)
+
+ // Test go version node
+ goVersion, err := utils.GetParsedGoVersion()
+ assert.NoError(t, err)
+ tests.GetAndAssertNode(t, rootNode[0].Nodes, strings.ReplaceAll(goVersion.GetVersion(), "go", goSourceCodePrefix))
+
+ // Test child without sub nodes
+ child1 := tests.GetAndAssertNode(t, rootNode[0].Nodes, "golang.org/x/text:v0.3.3")
+ assert.Len(t, child1.Nodes, 0)
+
+ // Test child with 1 sub node
+ child2 := tests.GetAndAssertNode(t, rootNode[0].Nodes, "rsc.io/quote:v1.5.2")
+ assert.Len(t, child2.Nodes, 1)
+ tests.GetAndAssertNode(t, child2.Nodes, "rsc.io/sampler:v1.3.0")
+}
+
+func removeTxtSuffix(txtFileName string) error {
+ // go.sum.txt >> go.sum
+ return fileutils.MoveFile(txtFileName, strings.TrimSuffix(txtFileName, ".txt"))
+}
diff --git a/commands/audit/sca/go/golang.go b/commands/audit/sca/go/golang.go
new file mode 100644
index 00000000..e94ffb56
--- /dev/null
+++ b/commands/audit/sca/go/golang.go
@@ -0,0 +1,115 @@
+package _go
+
+import (
+ "fmt"
+ biutils "github.com/jfrog/build-info-go/utils"
+ "github.com/jfrog/gofrog/datastructures"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ goutils "github.com/jfrog/jfrog-cli-core/v2/utils/golang"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+ "os"
+ "strings"
+)
+
+const (
+ goPackageTypeIdentifier = "go://"
+ goSourceCodePrefix = "github.com/golang/go:v"
+)
+
+func BuildDependencyTree(params utils.AuditParams) (dependencyTree []*xrayUtils.GraphNode, uniqueDeps []string, err error) {
+ currentDir, err := coreutils.GetWorkingDirectory()
+ if err != nil {
+ return
+ }
+
+ server, err := params.ServerDetails()
+ if err != nil {
+ err = fmt.Errorf("failed while getting server details: %s", err.Error())
+ return
+ }
+
+ remoteGoRepo := params.DepsRepo()
+ if remoteGoRepo != "" {
+ if err = setGoProxy(server, remoteGoRepo); err != nil {
+ return
+ }
+ }
+ // Calculate go dependencies graph
+ dependenciesGraph, err := goutils.GetDependenciesGraph(currentDir)
+ if err != nil || len(dependenciesGraph) == 0 {
+ return
+ }
+ // Calculate go dependencies list
+ dependenciesList, err := goutils.GetDependenciesList(currentDir)
+ if err != nil {
+ return
+ }
+ // Get root module name
+ rootModuleName, err := goutils.GetModuleName(currentDir)
+ if err != nil {
+ return
+ }
+ // Parse the dependencies into Xray dependency tree format
+ rootNode := &xrayUtils.GraphNode{
+ Id: goPackageTypeIdentifier + rootModuleName,
+ Nodes: []*xrayUtils.GraphNode{},
+ }
+ uniqueDepsSet := datastructures.MakeSet[string]()
+ populateGoDependencyTree(rootNode, dependenciesGraph, dependenciesList, uniqueDepsSet)
+
+ goVersionDependency, err := getGoVersionAsDependency()
+ if err != nil {
+ return
+ }
+ rootNode.Nodes = append(rootNode.Nodes, goVersionDependency)
+ uniqueDepsSet.Add(goVersionDependency.Id)
+
+ dependencyTree = []*xrayUtils.GraphNode{rootNode}
+ uniqueDeps = uniqueDepsSet.ToSlice()
+ return
+}
+
+func setGoProxy(server *config.ServerDetails, remoteGoRepo string) error {
+ repoUrl, err := goutils.GetArtifactoryRemoteRepoUrl(server, remoteGoRepo)
+ if err != nil {
+ return err
+ }
+ repoUrl += "|direct"
+ return os.Setenv("GOPROXY", repoUrl)
+}
+
+func populateGoDependencyTree(currNode *xrayUtils.GraphNode, dependenciesGraph map[string][]string, dependenciesList map[string]bool, uniqueDepsSet *datastructures.Set[string]) {
+ if currNode.NodeHasLoop() {
+ return
+ }
+ uniqueDepsSet.Add(currNode.Id)
+ currDepChildren := dependenciesGraph[strings.TrimPrefix(currNode.Id, goPackageTypeIdentifier)]
+ // Recursively create & append all node's dependencies.
+ for _, childName := range currDepChildren {
+ if !dependenciesList[childName] {
+ // 'go list' is more accurate than 'go mod graph', so filter out dependencies that are missing from the 'go list' output.
+ continue
+ }
+ childNode := &xrayUtils.GraphNode{
+ Id: goPackageTypeIdentifier + childName,
+ Nodes: []*xrayUtils.GraphNode{},
+ Parent: currNode,
+ }
+ currNode.Nodes = append(currNode.Nodes, childNode)
+ populateGoDependencyTree(childNode, dependenciesGraph, dependenciesList, uniqueDepsSet)
+ }
+}
+
+func getGoVersionAsDependency() (*xrayUtils.GraphNode, error) {
+ goVersion, err := biutils.GetParsedGoVersion()
+ if err != nil {
+ return nil, err
+ }
+ // Convert "go1.17.3" to "github.com/golang/go:v1.17.3"
+ goVersionID := strings.ReplaceAll(goVersion.GetVersion(), "go", goSourceCodePrefix)
+ return &xrayUtils.GraphNode{
+ Id: goPackageTypeIdentifier + goVersionID,
+ }, nil
+}
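`getGoVersionAsDependency` turns the toolchain version string into a pseudo-dependency id so the Go runtime itself is included in the scan graph. The conversion is a plain prefix substitution; a standalone sketch (`goVersionToDependencyId` is an illustrative helper name, not part of the file above):

```go
package main

import (
	"fmt"
	"strings"
)

const (
	goPackageTypeIdentifier = "go://"
	goSourceCodePrefix      = "github.com/golang/go:v"
)

// goVersionToDependencyId converts a raw toolchain version such as "go1.17.3"
// into the Xray component id "go://github.com/golang/go:v1.17.3".
func goVersionToDependencyId(rawVersion string) string {
	return goPackageTypeIdentifier + strings.ReplaceAll(rawVersion, "go", goSourceCodePrefix)
}

func main() {
	fmt.Println(goVersionToDependencyId("go1.17.3"))
}
```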
diff --git a/commands/audit/sca/npm/npm.go b/commands/audit/sca/npm/npm.go
new file mode 100644
index 00000000..7074be3e
--- /dev/null
+++ b/commands/audit/sca/npm/npm.go
@@ -0,0 +1,140 @@
+package npm
+
+import (
+ "errors"
+ "fmt"
+ biutils "github.com/jfrog/build-info-go/build/utils"
+ buildinfo "github.com/jfrog/build-info-go/entities"
+ "github.com/jfrog/jfrog-cli-core/v2/artifactory/commands/npm"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ coreXray "github.com/jfrog/jfrog-cli-core/v2/utils/xray"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+ "golang.org/x/exp/slices"
+)
+
+const (
+ ignoreScriptsFlag = "--ignore-scripts"
+)
+
+func BuildDependencyTree(params utils.AuditParams) (dependencyTrees []*xrayUtils.GraphNode, uniqueDeps []string, err error) {
+ currentDir, err := coreutils.GetWorkingDirectory()
+ if err != nil {
+ return
+ }
+ npmVersion, npmExecutablePath, err := biutils.GetNpmVersionAndExecPath(log.Logger)
+ if err != nil {
+ return
+ }
+ packageInfo, err := biutils.ReadPackageInfoFromPackageJsonIfExists(currentDir, npmVersion)
+ if err != nil {
+ return
+ }
+
+ treeDepsParam := createTreeDepsParam(params)
+
+ restoreNpmrcFunc, err := configNpmResolutionServerIfNeeded(params)
+ if err != nil {
+ err = fmt.Errorf("failed while configuring a resolution server: %s", err.Error())
+ return
+ }
+ defer func() {
+ if restoreNpmrcFunc != nil {
+ err = errors.Join(err, restoreNpmrcFunc())
+ }
+ }()
+
+ // Calculate npm dependencies
+ dependenciesMap, err := biutils.CalculateDependenciesMap(npmExecutablePath, currentDir, packageInfo.BuildInfoModuleId(), treeDepsParam, log.Logger)
+ if err != nil {
+ log.Info("Used npm version:", npmVersion.GetVersion())
+ return
+ }
+ var dependenciesList []buildinfo.Dependency
+ for _, dependency := range dependenciesMap {
+ dependenciesList = append(dependenciesList, dependency.Dependency)
+ }
+ // Parse the dependencies into Xray dependency tree format
+ dependencyTree, uniqueDeps := parseNpmDependenciesList(dependenciesList, packageInfo)
+ dependencyTrees = []*xrayUtils.GraphNode{dependencyTree}
+ return
+}
+
+// Generates a .npmrc file to configure an Artifactory server as the resolver server.
+func configNpmResolutionServerIfNeeded(params utils.AuditParams) (restoreNpmrcFunc func() error, err error) {
+ if params == nil {
+ err = fmt.Errorf("received nil params while configuring the resolution server")
+ return
+ }
+ serverDetails, err := params.ServerDetails()
+ if err != nil || serverDetails == nil {
+ return
+ }
+ depsRepo := params.DepsRepo()
+ if depsRepo == "" {
+ return
+ }
+
+ npmCmd := npm.NewNpmCommand("install", false).SetServerDetails(serverDetails)
+ if err = npmCmd.PreparePrerequisites(depsRepo); err != nil {
+ return
+ }
+ if err = npmCmd.CreateTempNpmrc(); err != nil {
+ return
+ }
+ restoreNpmrcFunc = npmCmd.RestoreNpmrcFunc()
+ log.Info(fmt.Sprintf("Resolving dependencies from server '%s', repository '%s'", serverDetails.Url, depsRepo))
+ return
+}
+
+func createTreeDepsParam(params utils.AuditParams) biutils.NpmTreeDepListParam {
+ if params == nil {
+ return biutils.NpmTreeDepListParam{
+ Args: addIgnoreScriptsFlag([]string{}),
+ }
+ }
+ npmTreeDepParam := biutils.NpmTreeDepListParam{
+ Args: addIgnoreScriptsFlag(params.Args()),
+ InstallCommandArgs: params.InstallCommandArgs(),
+ }
+ if npmParams, ok := params.(utils.AuditNpmParams); ok {
+ npmTreeDepParam.IgnoreNodeModules = npmParams.NpmIgnoreNodeModules()
+ npmTreeDepParam.OverwritePackageLock = npmParams.NpmOverwritePackageLock()
+ }
+ return npmTreeDepParam
+}
+
+// Add the --ignore-scripts flag to prevent execution of npm scripts during npm install.
+func addIgnoreScriptsFlag(npmArgs []string) []string {
+ if !slices.Contains(npmArgs, ignoreScriptsFlag) {
+ return append(npmArgs, ignoreScriptsFlag)
+ }
+ return npmArgs
+}
+
+// Parse the dependencies into an Xray dependency tree format
+func parseNpmDependenciesList(dependencies []buildinfo.Dependency, packageInfo *biutils.PackageInfo) (*xrayUtils.GraphNode, []string) {
+ treeMap := make(map[string][]string)
+ for _, dependency := range dependencies {
+ dependencyId := utils.NpmPackageTypeIdentifier + dependency.Id
+ for _, requestedByNode := range dependency.RequestedBy {
+ parent := utils.NpmPackageTypeIdentifier + requestedByNode[0]
+ if children, ok := treeMap[parent]; ok {
+ treeMap[parent] = appendUniqueChild(children, dependencyId)
+ } else {
+ treeMap[parent] = []string{dependencyId}
+ }
+ }
+ }
+ return coreXray.BuildXrayDependencyTree(treeMap, utils.NpmPackageTypeIdentifier+packageInfo.BuildInfoModuleId())
+}
+
+func appendUniqueChild(children []string, candidateDependency string) []string {
+ for _, existingChild := range children {
+ if existingChild == candidateDependency {
+ return children
+ }
+ }
+ return append(children, candidateDependency)
+}
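The parent-to-children map built by `parseNpmDependenciesList` and the de-duplication done by `appendUniqueChild` can be illustrated outside the patch. The sketch below is not part of the change; the dependency IDs are illustrative, and the helper simply mirrors the logic above:

```go
package main

import "fmt"

// appendUniqueChild mirrors the helper in npm.go: it appends the candidate
// only if it is not already present among the children.
func appendUniqueChild(children []string, candidate string) []string {
	for _, existing := range children {
		if existing == candidate {
			return children
		}
	}
	return append(children, candidate)
}

func main() {
	// Build a parent -> children map the same way parseNpmDependenciesList
	// does, including a duplicate edge that should be collapsed.
	treeMap := map[string][]string{}
	edges := []struct{ child, parent string }{
		{"npm://js-tokens:4.0.0", "npm://loose-envify:1.4.0"},
		{"npm://js-tokens:4.0.0", "npm://loose-envify:1.4.0"}, // duplicate edge
		{"npm://loose-envify:1.4.0", "npm://react:18.2.0"},
	}
	for _, e := range edges {
		treeMap[e.parent] = appendUniqueChild(treeMap[e.parent], e.child)
	}
	fmt.Println(len(treeMap["npm://loose-envify:1.4.0"])) // duplicate collapsed to one child
}
```

Note that `appendUniqueChild` accepts a nil slice, so the map needs no pre-initialization per parent.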
diff --git a/commands/audit/sca/npm/npm_test.go b/commands/audit/sca/npm/npm_test.go
new file mode 100644
index 00000000..def8bfe4
--- /dev/null
+++ b/commands/audit/sca/npm/npm_test.go
@@ -0,0 +1,124 @@
+package npm
+
+import (
+ "encoding/json"
+ "os"
+ "path/filepath"
+ "testing"
+
+ "github.com/jfrog/jfrog-cli-security/commands/audit/sca"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+
+ biutils "github.com/jfrog/build-info-go/build/utils"
+ buildinfo "github.com/jfrog/build-info-go/entities"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
+ "github.com/stretchr/testify/assert"
+)
+
+func TestParseNpmDependenciesList(t *testing.T) {
+ // Create and change directory to test workspace
+ _, cleanUp := sca.CreateTestWorkspace(t, filepath.Join("other", "npm"))
+ defer cleanUp()
+ dependenciesJson, err := os.ReadFile("dependencies.json")
+ assert.NoError(t, err)
+ var dependencies []buildinfo.Dependency
+ err = json.Unmarshal(dependenciesJson, &dependencies)
+ assert.NoError(t, err)
+ packageInfo := &biutils.PackageInfo{Name: "npmexmaple", Version: "0.1.0"}
+ looseEnvifyJsTokens := []*xrayUtils.GraphNode{{Id: "npm://loose-envify:1.4.0", Nodes: []*xrayUtils.GraphNode{{Id: "npm://js-tokens:4.0.0"}}}}
+ expectedTree := &xrayUtils.GraphNode{
+ Id: "npm://npmexmaple:0.1.0",
+ Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://next-auth:4.22.1",
+ Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://react-dom:18.2.0", Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://react:18.2.0", Nodes: looseEnvifyJsTokens},
+ {Id: "npm://loose-envify:1.4.0", Nodes: []*xrayUtils.GraphNode{{Id: "npm://js-tokens:4.0.0"}}},
+ {Id: "npm://scheduler:0.23.0", Nodes: looseEnvifyJsTokens},
+ }},
+ {Id: "npm://jose:4.14.4", Nodes: []*xrayUtils.GraphNode{}},
+ {Id: "npm://react:18.2.0", Nodes: looseEnvifyJsTokens},
+ {Id: "npm://uuid:8.3.2", Nodes: []*xrayUtils.GraphNode{}},
+ {Id: "npm://openid-client:5.4.2", Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://jose:4.14.4"},
+ {Id: "npm://lru-cache:6.0.0", Nodes: []*xrayUtils.GraphNode{{Id: "npm://yallist:4.0.0"}}},
+ {Id: "npm://oidc-token-hash:5.0.3", Nodes: []*xrayUtils.GraphNode{}},
+ {Id: "npm://object-hash:2.2.0", Nodes: []*xrayUtils.GraphNode{}},
+ }},
+ {Id: "npm://next:12.0.10", Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://react-dom:18.2.0", Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://react:18.2.0", Nodes: looseEnvifyJsTokens},
+ {Id: "npm://loose-envify:1.4.0", Nodes: []*xrayUtils.GraphNode{{Id: "npm://js-tokens:4.0.0"}}},
+ {Id: "npm://scheduler:0.23.0", Nodes: looseEnvifyJsTokens}}},
+ {Id: "npm://styled-jsx:5.0.0"},
+ {Id: "npm://@next/swc-darwin-arm64:12.0.10"},
+ {Id: "npm://react:18.2.0", Nodes: looseEnvifyJsTokens},
+ {Id: "npm://@next/env:12.0.10"},
+ {Id: "npm://caniuse-lite:1.0.30001486"},
+ {Id: "npm://postcss:8.4.5", Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://picocolors:1.0.0"},
+ {Id: "npm://source-map-js:1.0.2"},
+ {Id: "npm://nanoid:3.3.6"},
+ }},
+ {Id: "npm://use-subscription:1.5.1", Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://object-assign:4.1.1"},
+ }},
+ }},
+ {Id: "npm://@panva/hkdf:1.1.1"},
+ {Id: "npm://preact-render-to-string:5.2.6", Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://pretty-format:3.8.0"},
+ {Id: "npm://preact:10.13.2"},
+ }},
+ {Id: "npm://preact:10.13.2"},
+ {Id: "npm://@babel/runtime:7.21.5", Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://regenerator-runtime:0.13.11"},
+ }},
+ {Id: "npm://cookie:0.5.0"},
+ {Id: "npm://oauth:0.9.15"},
+ }},
+ {Id: "npm://next:12.0.10", Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://react-dom:18.2.0", Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://react:18.2.0"},
+ {Id: "npm://scheduler:0.23.0"}}},
+ {Id: "npm://styled-jsx:5.0.0"},
+ {Id: "npm://@next/swc-darwin-arm64:12.0.10"},
+ {Id: "npm://react:18.2.0"},
+ {Id: "npm://@next/env:12.0.10"},
+ {Id: "npm://caniuse-lite:1.0.30001486"},
+ {Id: "npm://postcss:8.4.5", Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://picocolors:1.0.0"},
+ {Id: "npm://source-map-js:1.0.2"},
+ {Id: "npm://nanoid:3.3.6"},
+ }},
+ {Id: "npm://use-subscription:1.5.1", Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://object-assign:4.1.1"},
+ }},
+ }},
+ },
+ }
+
+ xrayDependenciesTree, uniqueDeps := parseNpmDependenciesList(dependencies, packageInfo)
+ equals := tests.CompareTree(expectedTree, xrayDependenciesTree)
+ if !equals {
+ t.Error("expected:", expectedTree.Nodes, "got:", xrayDependenciesTree.Nodes)
+ }
+ expectedUniqueDeps := []string{xrayDependenciesTree.Id}
+ for _, dep := range dependencies {
+ expectedUniqueDeps = append(expectedUniqueDeps, utils.NpmPackageTypeIdentifier+dep.Id)
+ }
+ assert.ElementsMatch(t, uniqueDeps, expectedUniqueDeps, "First is actual, Second is Expected")
+
+}
+
+func TestIgnoreScripts(t *testing.T) {
+ // Create and change directory to test workspace
+ _, cleanUp := sca.CreateTestWorkspace(t, filepath.Join("projects", "package-managers", "npm", "npm-scripts"))
+ defer cleanUp()
+
+ // The package.json file contains a postinstall script that runs an "exit 1" command.
+ // Without the "--ignore-scripts" flag, the test would fail.
+ params := &utils.AuditBasicParams{}
+ _, _, err := BuildDependencyTree(params)
+ assert.NoError(t, err)
+}
diff --git a/commands/audit/sca/nuget/nuget.go b/commands/audit/sca/nuget/nuget.go
new file mode 100644
index 00000000..8dd0d841
--- /dev/null
+++ b/commands/audit/sca/nuget/nuget.go
@@ -0,0 +1,236 @@
+package nuget
+
+import (
+ "errors"
+ "fmt"
+ bidotnet "github.com/jfrog/build-info-go/build/utils/dotnet"
+ "github.com/jfrog/build-info-go/build/utils/dotnet/solution"
+ "github.com/jfrog/build-info-go/entities"
+ biutils "github.com/jfrog/build-info-go/utils"
+ "github.com/jfrog/gofrog/datastructures"
+ "github.com/jfrog/jfrog-cli-core/v2/artifactory/commands/dotnet"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ coreXray "github.com/jfrog/jfrog-cli-core/v2/utils/xray"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/utils/errorutils"
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+ "io/fs"
+ "os"
+ "os/exec"
+ "path/filepath"
+ "strings"
+)
+
+const (
+ nugetPackageTypeIdentifier = "nuget://"
+ csprojFileSuffix = ".csproj"
+ packageReferenceSyntax = "PackageReference"
+ packagesConfigFileName = "packages.config"
+ installCommandName = "restore"
+ dotnetToolType = "dotnet"
+ nugetToolType = "nuget"
+ globalPackagesNotFoundErrorMessage = "could not find global packages path at:"
+)
+
+func BuildDependencyTree(params utils.AuditParams) (dependencyTree []*xrayUtils.GraphNode, uniqueDeps []string, err error) {
+ wd, err := os.Getwd()
+ if err != nil {
+ return
+ }
+ sol, err := solution.Load(wd, "", log.Logger)
+ if err != nil && !strings.Contains(err.Error(), globalPackagesNotFoundErrorMessage) {
+ // In older NuGet projects that use the NuGet CLI and packages.config, solution.Load raises an error when the project is not installed, because it cannot find the global packages path.
+ // This is resolved by running 'nuget restore' and then calling solution.Load again, so in that scenario we proceed with the flow.
+ return
+ }
+
+ if isInstallRequired(params, sol) {
+ log.Info("Dependency sources were not detected and no 'install' command was provided. Running the 'restore' command")
+ sol, err = runDotnetRestoreAndLoadSolution(params, wd)
+ if err != nil {
+ return
+ }
+ }
+
+ buildInfo, err := sol.BuildInfo("", log.Logger)
+ if err != nil {
+ return
+ }
+ dependencyTree, uniqueDeps = parseNugetDependencyTree(buildInfo)
+ return
+}
+
+// Verifies whether the execution of an 'install' command is necessary, either because the project isn't installed or because the user has specified an 'install' command
+func isInstallRequired(params utils.AuditParams, sol solution.Solution) bool {
+ // If the user has specified an 'install' command, we proceed with executing the 'restore' command even if the project is already installed
+ // Additionally, if dependency sources were not identified during the construction of the Solution struct, the project will necessitate an 'install'
+ solDependencySourcesExists := len(sol.GetDependenciesSources()) > 0
+ solProjectsExists := len(sol.GetProjects()) > 0
+ return len(params.InstallCommandArgs()) > 0 || !solDependencySourcesExists || !solProjectsExists
+}
+
+ // Generates a temporary duplicate of the project and executes the 'install' command in it, so the original directory is left untouched, and creates the JFrog configuration file for Artifactory resolution
+ // Additionally, re-loads the project's Solution so that the dependency sources are identified
+func runDotnetRestoreAndLoadSolution(params utils.AuditParams, originalWd string) (sol solution.Solution, err error) {
+ // Create a temporary copy of the project so the 'install' command can run without affecting the original directory
+ tmpWd, err := fileutils.CreateTempDir()
+ if err != nil {
+ err = fmt.Errorf("failed to create a temporary dir: %w", err)
+ return
+ }
+ defer func() {
+ err = errors.Join(err, fileutils.RemoveTempDir(tmpWd))
+ }()
+
+ err = biutils.CopyDir(originalWd, tmpWd, true, nil)
+ if err != nil {
+ err = fmt.Errorf("failed copying project to temp dir: %w", err)
+ return
+ }
+
+ toolName := params.InstallCommandName()
+ if toolName == "" {
+ // Determine if the project is a NuGet or .NET project
+ toolName, err = getProjectToolName(originalWd)
+ if err != nil {
+ err = fmt.Errorf("failed while detecting the project's tool type: %s", err.Error())
+ return
+ }
+ }
+
+ toolType := bidotnet.ConvertNameToToolType(toolName)
+
+ var installCommandArgs []string
+ // Set up an Artifactory server as a resolution server if needed
+ depsRepo := params.DepsRepo()
+ if depsRepo != "" {
+ var serverDetails *config.ServerDetails
+ serverDetails, err = params.ServerDetails()
+ if err != nil {
+ err = fmt.Errorf("failed to get server details: %s", err.Error())
+ return
+ }
+
+ log.Info(fmt.Sprintf("Resolving dependencies from '%s', repository '%s'", serverDetails.Url, depsRepo))
+
+ var configFile *os.File
+ configFile, err = dotnet.InitNewConfig(tmpWd, depsRepo, serverDetails, false)
+ if err != nil {
+ err = fmt.Errorf("failed to generate a configuration file for setting up Artifactory as a resolution server: %w", err)
+ return
+ }
+ installCommandArgs = append(installCommandArgs, toolType.GetTypeFlagPrefix()+"configfile", configFile.Name())
+ }
+
+ err = runDotnetRestore(tmpWd, params, toolType, installCommandArgs)
+ if err != nil {
+ return
+ }
+ sol, err = solution.Load(tmpWd, "", log.Logger)
+ return
+}
+
+// Detects if the project is utilizing either .NET CLI or NuGet CLI, prioritizing .NET CLI.
+// Note: For multi-module projects, only one of these tools can be identified and will be uniformly applied across all modules.
+func getProjectToolName(wd string) (toolName string, err error) {
+ projectConfigFilesPaths, err := getProjectConfigurationFilesPaths(wd)
+ if err != nil {
+ err = fmt.Errorf("failed while retrieving list of files in '%s': %s", wd, err.Error())
+ return
+ }
+
+ var packagesConfigFiles []string
+ for _, configFilePath := range projectConfigFilesPaths {
+ if strings.HasSuffix(configFilePath, csprojFileSuffix) {
+ var fileData []byte
+ fileData, err = os.ReadFile(configFilePath)
+ if err != nil {
+ err = fmt.Errorf("failed to read file '%s': %s", configFilePath, err.Error())
+ return
+ }
+
+ // If the .csproj file contains the syntax, it signifies the usage of .NET CLI as the tool type
+ if strings.Contains(string(fileData), packageReferenceSyntax) {
+ toolName = dotnetToolType
+ return
+ }
+ } else {
+ packagesConfigFiles = append(packagesConfigFiles, configFilePath)
+ }
+ }
+
+ // If the syntax isn't found in any .csproj file but a packages.config file is present, it indicates that the tool type being used is the NuGet CLI
+ if len(packagesConfigFiles) > 0 {
+ toolName = nugetToolType
+ return
+ }
+
+ err = errorutils.CheckErrorf("the project's tool type (.NET/NuGet CLI) couldn't be detected. Please execute the 'restore' command.\nNote: Certain entry points allow providing an 'install' command instead of manually executing it")
+ return
+}
+
+// Returns a slice of absolute paths for the project's configuration files, strictly limited to .csproj files and packages.config files.
+func getProjectConfigurationFilesPaths(wd string) (projectConfigFilesPaths []string, err error) {
+ err = filepath.WalkDir(wd, func(path string, d fs.DirEntry, innerErr error) error {
+ if innerErr != nil {
+ return fmt.Errorf("an error occurred while trying to access or traverse the file system: %s", innerErr.Error())
+ }
+
+ if strings.HasSuffix(path, csprojFileSuffix) || strings.HasSuffix(path, packagesConfigFileName) {
+ var absFilePath string
+ absFilePath, innerErr = filepath.Abs(path)
+ if innerErr != nil {
+ return fmt.Errorf("couldn't retrieve the absolute path of './%s': %s", path, innerErr.Error())
+ }
+ projectConfigFilesPaths = append(projectConfigFilesPaths, absFilePath)
+ }
+ return nil
+ })
+ return
+}
+
+func runDotnetRestore(wd string, params utils.AuditParams, toolType bidotnet.ToolchainType, commandExtraArgs []string) (err error) {
+ var completeCommandArgs []string
+ if len(params.InstallCommandArgs()) > 0 {
+ // If the user has specified an 'install' command, we execute the command that has been provided.
+ completeCommandArgs = append(completeCommandArgs, params.InstallCommandName())
+ completeCommandArgs = append(completeCommandArgs, params.InstallCommandArgs()...)
+ } else {
+ completeCommandArgs = append(completeCommandArgs, toolType.String(), installCommandName)
+ }
+
+ // We include the flag that allows resolution from an Artifactory server, if it exists.
+ completeCommandArgs = append(completeCommandArgs, commandExtraArgs...)
+ command := exec.Command(completeCommandArgs[0], completeCommandArgs[1:]...)
+ command.Dir = wd
+ output, err := command.CombinedOutput()
+ if err != nil {
+ err = errorutils.CheckErrorf("restore command failed: %s - %s", err.Error(), output)
+ }
+ return
+}
+
+func parseNugetDependencyTree(buildInfo *entities.BuildInfo) (nodes []*xrayUtils.GraphNode, allUniqueDeps []string) {
+ uniqueDepsSet := datastructures.MakeSet[string]()
+ for _, module := range buildInfo.Modules {
+ treeMap := make(map[string][]string)
+ for _, dependency := range module.Dependencies {
+ dependencyId := nugetPackageTypeIdentifier + dependency.Id
+ parent := nugetPackageTypeIdentifier + dependency.RequestedBy[0][0]
+ if children, ok := treeMap[parent]; ok {
+ treeMap[parent] = append(children, dependencyId)
+ } else {
+ treeMap[parent] = []string{dependencyId}
+ }
+ }
+ dependencyTree, uniqueDeps := coreXray.BuildXrayDependencyTree(treeMap, nugetPackageTypeIdentifier+module.Id)
+ nodes = append(nodes, dependencyTree)
+ for _, uniqueDep := range uniqueDeps {
+ uniqueDepsSet.Add(uniqueDep)
+ }
+ }
+ allUniqueDeps = uniqueDepsSet.ToSlice()
+ return
+}
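The tool-type detection in `getProjectToolName` boils down to two rules: a .csproj containing the `PackageReference` syntax implies the .NET CLI, while a packages.config file (with no such .csproj) implies the NuGet CLI. A minimal standalone sketch of that decision, separate from the patch and with illustrative inputs:

```go
package main

import (
	"fmt"
	"strings"
)

// chooseTool is a simplified sketch of getProjectToolName's decision logic,
// operating on file contents directly instead of walking the project tree.
func chooseTool(csprojContents []string, hasPackagesConfig bool) (string, error) {
	for _, contents := range csprojContents {
		// PackageReference in any .csproj signifies the .NET CLI.
		if strings.Contains(contents, "PackageReference") {
			return "dotnet", nil
		}
	}
	// No PackageReference syntax, but a packages.config file: NuGet CLI.
	if hasPackagesConfig {
		return "nuget", nil
	}
	return "", fmt.Errorf("the project's tool type couldn't be detected")
}

func main() {
	tool, _ := chooseTool([]string{`<PackageReference Include="NUnit" Version="3.10.1" />`}, false)
	fmt.Println(tool) // dotnet
	tool, _ = chooseTool(nil, true)
	fmt.Println(tool) // nuget
}
```

As in the real implementation, the .NET CLI takes priority when both indicators are present.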
diff --git a/commands/audit/sca/nuget/nuget_test.go b/commands/audit/sca/nuget/nuget_test.go
new file mode 100644
index 00000000..078df7aa
--- /dev/null
+++ b/commands/audit/sca/nuget/nuget_test.go
@@ -0,0 +1,148 @@
+package nuget
+
+import (
+ "encoding/json"
+ "github.com/jfrog/build-info-go/build/utils/dotnet/solution"
+ "github.com/jfrog/build-info-go/utils"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/sca"
+ xrayUtils2 "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+ "os"
+ "path/filepath"
+ "testing"
+
+ "github.com/jfrog/build-info-go/entities"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
+ "github.com/stretchr/testify/assert"
+)
+
+var testDataDir = filepath.Join("..", "..", "..", "..", "tests", "testdata", "projects", "package-managers")
+
+func TestBuildNugetDependencyTree(t *testing.T) {
+ // Create and change directory to test workspace
+ _, cleanUp := sca.CreateTestWorkspace(t, filepath.Join("other", "nuget"))
+ defer cleanUp()
+ dependenciesJson, err := os.ReadFile("dependencies.json")
+ assert.NoError(t, err)
+
+ var dependencies *entities.BuildInfo
+ err = json.Unmarshal(dependenciesJson, &dependencies)
+ assert.NoError(t, err)
+ expectedUniqueDeps := []string{
+ nugetPackageTypeIdentifier + "Microsoft.Net.Http:2.2.29",
+ nugetPackageTypeIdentifier + "Microsoft.Bcl:1.1.10",
+ nugetPackageTypeIdentifier + "Microsoft.Bcl.Build:1.0.14",
+ nugetPackageTypeIdentifier + "Newtonsoft.Json:11.0.2",
+ nugetPackageTypeIdentifier + "NUnit:3.10.1",
+ nugetPackageTypeIdentifier + "bootstrap:4.1.1",
+ nugetPackageTypeIdentifier + "popper.js:1.14.0",
+ nugetPackageTypeIdentifier + "jQuery:3.0.0",
+ nugetPackageTypeIdentifier + "MsbuildExample",
+ nugetPackageTypeIdentifier + "MsbuildLibrary",
+ }
+ xrayDependenciesTree, uniqueDeps := parseNugetDependencyTree(dependencies)
+ assert.ElementsMatch(t, uniqueDeps, expectedUniqueDeps, "First is actual, Second is Expected")
+ expectedTreeJson, err := os.ReadFile("expectedTree.json")
+ assert.NoError(t, err)
+
+ var expectedTrees *[]xrayUtils.GraphNode
+ err = json.Unmarshal(expectedTreeJson, &expectedTrees)
+ assert.NoError(t, err)
+
+ for i := range *expectedTrees {
+ expectedTree := &(*expectedTrees)[i]
+ assert.True(t, tests.CompareTree(expectedTree, xrayDependenciesTree[i]), "expected:", expectedTree.Nodes, "got:", xrayDependenciesTree[i].Nodes)
+ }
+}
+
+func TestGetProjectToolName(t *testing.T) {
+ testCases := []struct {
+ testProjectName string
+ expectedOutput string
+ }{
+ {testProjectName: "dotnet-single", expectedOutput: "dotnet"},
+ {testProjectName: "dotnet-single", expectedOutput: "nuget"},
+ {testProjectName: "dotnet-multi", expectedOutput: "dotnet"},
+ }
+
+ for _, testcase := range testCases {
+ tempDirPath, createTempDirCallback := tests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ dotnetProjectPath := filepath.Join(testDataDir, "dotnet", testcase.testProjectName)
+ assert.NoError(t, utils.CopyDir(dotnetProjectPath, tempDirPath, true, nil))
+
+ // This phase converts the project into an 'old' NuGet project that uses packages.config instead of a .csproj file for its dependency definitions
+ if testcase.expectedOutput == "nuget" {
+ assert.NoError(t, os.Remove(filepath.Join(tempDirPath, testcase.testProjectName+".csproj")))
+ tempFile, err := os.Create(filepath.Join(tempDirPath, "packages.config"))
+ assert.NoError(t, err)
+ defer func() {
+ assert.NoError(t, tempFile.Close())
+ }()
+ }
+
+ toolName, err := getProjectToolName(tempDirPath)
+ assert.NoError(t, err)
+ assert.Equal(t, testcase.expectedOutput, toolName)
+ }
+
+ // Verify that an error is returned when neither .csproj files nor packages.config files are detected
+ emptyProject, createTempDirCallback := tests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ toolName, err := getProjectToolName(emptyProject)
+ assert.Empty(t, toolName)
+ assert.Error(t, err)
+}
+
+func TestGetProjectConfigurationFilesPaths(t *testing.T) {
+ dotnetProjectPath, err := filepath.Abs(filepath.Join(testDataDir, "dotnet"))
+ assert.NoError(t, err)
+
+ testCases := []struct {
+ testProjectPath string
+ expectedOutput []string
+ }{
+ {
+ testProjectPath: filepath.Join(dotnetProjectPath, "dotnet-single"),
+ expectedOutput: []string{
+ filepath.Join(dotnetProjectPath, "dotnet-single", "dotnet-single.csproj"),
+ },
+ },
+ {
+ testProjectPath: filepath.Join(dotnetProjectPath, "dotnet-multi"),
+ expectedOutput: []string{
+ filepath.Join(dotnetProjectPath, "dotnet-multi", "ClassLibrary1", "ClassLibrary1.csproj"),
+ filepath.Join(dotnetProjectPath, "dotnet-multi", "TestApp1", "TestApp1.csproj"),
+ },
+ },
+ }
+
+ for _, testcase := range testCases {
+ var projectFiles []string
+ projectFiles, err = getProjectConfigurationFilesPaths(testcase.testProjectPath)
+ assert.NoError(t, err)
+ assert.Equal(t, testcase.expectedOutput, projectFiles)
+ }
+}
+
+func TestRunDotnetRestoreAndLoadSolution(t *testing.T) {
+ projectsToCheck := []string{"dotnet-single", "dotnet-multi"}
+ for _, projectName := range projectsToCheck {
+ tempDirPath, createTempDirCallback := tests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ dotnetProjectPath := filepath.Join(testDataDir, "dotnet", projectName)
+ assert.NoError(t, utils.CopyDir(dotnetProjectPath, tempDirPath, true, nil))
+
+ sol, err := solution.Load(tempDirPath, "", log.Logger)
+ assert.NoError(t, err)
+ assert.Empty(t, sol.GetProjects())
+ assert.Empty(t, sol.GetDependenciesSources())
+
+ params := &xrayUtils2.AuditBasicParams{}
+ sol, err = runDotnetRestoreAndLoadSolution(params, tempDirPath)
+ assert.NoError(t, err)
+ assert.NotEmpty(t, sol.GetProjects())
+ assert.NotEmpty(t, sol.GetDependenciesSources())
+ }
+}
diff --git a/commands/audit/sca/python/python.go b/commands/audit/sca/python/python.go
new file mode 100644
index 00000000..3cdde455
--- /dev/null
+++ b/commands/audit/sca/python/python.go
@@ -0,0 +1,282 @@
+package python
+
+import (
+ "errors"
+ "fmt"
+ biutils "github.com/jfrog/build-info-go/utils"
+ "github.com/jfrog/build-info-go/utils/pythonutils"
+ "github.com/jfrog/gofrog/datastructures"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ utils "github.com/jfrog/jfrog-cli-core/v2/utils/python"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/sca"
+ "github.com/jfrog/jfrog-client-go/utils/errorutils"
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+ "os"
+ "os/exec"
+ "path/filepath"
+ "runtime"
+ "strings"
+)
+
+const (
+ pythonPackageTypeIdentifier = "pypi://"
+)
+
+type AuditPython struct {
+ Server *config.ServerDetails
+ Tool pythonutils.PythonTool
+ RemotePypiRepo string
+ PipRequirementsFile string
+}
+
+func BuildDependencyTree(auditPython *AuditPython) (dependencyTree []*xrayUtils.GraphNode, uniqueDeps []string, err error) {
+ dependenciesGraph, directDependenciesList, err := getDependencies(auditPython)
+ if err != nil {
+ return
+ }
+ directDependencies := []*xrayUtils.GraphNode{}
+ uniqueDepsSet := datastructures.MakeSet[string]()
+ for _, rootDep := range directDependenciesList {
+ directDependency := &xrayUtils.GraphNode{
+ Id: pythonPackageTypeIdentifier + rootDep,
+ Nodes: []*xrayUtils.GraphNode{},
+ }
+ populatePythonDependencyTree(directDependency, dependenciesGraph, uniqueDepsSet)
+ directDependencies = append(directDependencies, directDependency)
+ }
+ root := &xrayUtils.GraphNode{
+ Id: "root",
+ Nodes: directDependencies,
+ }
+ dependencyTree = []*xrayUtils.GraphNode{root}
+ uniqueDeps = uniqueDepsSet.ToSlice()
+ return
+}
+
+func getDependencies(auditPython *AuditPython) (dependenciesGraph map[string][]string, directDependencies []string, err error) {
+ wd, err := os.Getwd()
+ if errorutils.CheckError(err) != nil {
+ return
+ }
+
+ // Create temp dir to run all work outside users working directory
+ tempDirPath, err := fileutils.CreateTempDir()
+ if err != nil {
+ return
+ }
+
+ err = os.Chdir(tempDirPath)
+ if errorutils.CheckError(err) != nil {
+ return
+ }
+
+ defer func() {
+ err = errors.Join(
+ err,
+ errorutils.CheckError(os.Chdir(wd)),
+ fileutils.RemoveTempDir(tempDirPath),
+ )
+ }()
+
+ err = biutils.CopyDir(wd, tempDirPath, true, nil)
+ if err != nil {
+ return
+ }
+
+ restoreEnv, err := runPythonInstall(auditPython)
+ defer func() {
+ err = errors.Join(err, restoreEnv())
+ }()
+ if err != nil {
+ return
+ }
+
+ localDependenciesPath, err := config.GetJfrogDependenciesPath()
+ if err != nil {
+ return
+ }
+ dependenciesGraph, directDependencies, err = pythonutils.GetPythonDependencies(auditPython.Tool, tempDirPath, localDependenciesPath)
+ if err != nil {
+ sca.LogExecutableVersion("python")
+ sca.LogExecutableVersion(string(auditPython.Tool))
+ }
+ return
+}
+
+func runPythonInstall(auditPython *AuditPython) (restoreEnv func() error, err error) {
+ switch auditPython.Tool {
+ case pythonutils.Pip:
+ return installPipDeps(auditPython)
+ case pythonutils.Pipenv:
+ return installPipenvDeps(auditPython)
+ case pythonutils.Poetry:
+ return installPoetryDeps(auditPython)
+ }
+ return
+}
+
+func installPoetryDeps(auditPython *AuditPython) (restoreEnv func() error, err error) {
+ restoreEnv = func() error {
+ return nil
+ }
+ if auditPython.RemotePypiRepo != "" {
+ rtUrl, username, password, err := utils.GetPypiRepoUrlWithCredentials(auditPython.Server, auditPython.RemotePypiRepo)
+ if err != nil {
+ return restoreEnv, err
+ }
+ if password != "" {
+ err = utils.ConfigPoetryRepo(rtUrl.Scheme+"://"+rtUrl.Host+rtUrl.Path, username, password, auditPython.RemotePypiRepo)
+ if err != nil {
+ return restoreEnv, err
+ }
+ }
+ }
+ // Run 'poetry install'
+ return restoreEnv, executeCommand("poetry", "install")
+}
+
+func installPipenvDeps(auditPython *AuditPython) (restoreEnv func() error, err error) {
+ // Set virtualenv path to venv dir
+ err = os.Setenv("WORKON_HOME", ".jfrog")
+ if err != nil {
+ return
+ }
+ restoreEnv = func() error {
+ return os.Unsetenv("WORKON_HOME")
+ }
+ if auditPython.RemotePypiRepo != "" {
+ return restoreEnv, runPipenvInstallFromRemoteRegistry(auditPython.Server, auditPython.RemotePypiRepo)
+ }
+ // Run 'pipenv install -d'
+ return restoreEnv, executeCommand("pipenv", "install", "-d")
+}
+
+func installPipDeps(auditPython *AuditPython) (restoreEnv func() error, err error) {
+ restoreEnv, err = SetPipVirtualEnvPath()
+ if err != nil {
+ return
+ }
+
+ remoteUrl := ""
+ if auditPython.RemotePypiRepo != "" {
+ remoteUrl, err = utils.GetPypiRepoUrl(auditPython.Server, auditPython.RemotePypiRepo)
+ if err != nil {
+ return
+ }
+ }
+ pipInstallArgs := getPipInstallArgs(auditPython.PipRequirementsFile, remoteUrl)
+ err = executeCommand("python", pipInstallArgs...)
+ if err != nil && auditPython.PipRequirementsFile == "" {
+ pipInstallArgs = getPipInstallArgs("requirements.txt", remoteUrl)
+ reqErr := executeCommand("python", pipInstallArgs...)
+ if reqErr != nil {
+ // Return Pip install error and log the requirements fallback error.
+ log.Debug(reqErr.Error())
+ } else {
+ err = nil
+ }
+ }
+ return
+}
+
+func executeCommand(executable string, args ...string) error {
+ installCmd := exec.Command(executable, args...)
+ maskedCmdString := coreutils.GetMaskedCommandString(installCmd)
+ log.Debug("Running", maskedCmdString)
+ output, err := installCmd.CombinedOutput()
+ if err != nil {
+ sca.LogExecutableVersion(executable)
+ return errorutils.CheckErrorf("%q command failed: %s - %s", maskedCmdString, err.Error(), output)
+ }
+ return nil
+}
+
+func getPipInstallArgs(requirementsFile, remoteUrl string) []string {
+ args := []string{"-m", "pip", "install"}
+ if requirementsFile == "" {
+ // Run 'pip install .'
+ args = append(args, ".")
+ } else {
+ // Run 'pip install -r <requirements file>'
+ args = append(args, "-r", requirementsFile)
+ }
+ if remoteUrl != "" {
+ args = append(args, utils.GetPypiRemoteRegistryFlag(pythonutils.Pip), remoteUrl)
+ }
+ return args
+}
+
+func runPipenvInstallFromRemoteRegistry(server *config.ServerDetails, depsRepoName string) (err error) {
+ rtUrl, err := utils.GetPypiRepoUrl(server, depsRepoName)
+ if err != nil {
+ return err
+ }
+ args := []string{"install", "-d", utils.GetPypiRemoteRegistryFlag(pythonutils.Pipenv), rtUrl}
+ return executeCommand("pipenv", args...)
+}
+
+// Execute virtualenv command: "virtualenv venvdir" / "python3 -m venv venvdir" and set path
+func SetPipVirtualEnvPath() (restoreEnv func() error, err error) {
+ restoreEnv = func() error {
+ return nil
+ }
+ venvdirName := "venvdir"
+ var cmdArgs []string
+ pythonPath, windowsPyArg := pythonutils.GetPython3Executable()
+ if windowsPyArg != "" {
+ // Add '-3' arg for windows 'py -3' command
+ cmdArgs = append(cmdArgs, windowsPyArg)
+ }
+ cmdArgs = append(cmdArgs, "-m", "venv", venvdirName)
+ err = executeCommand(pythonPath, cmdArgs...)
+ if err != nil {
+ // Failed running 'python -m venv', trying to run 'virtualenv'
+ log.Debug("Failed running python venv:", err.Error())
+ err = executeCommand("virtualenv", "-p", pythonPath, venvdirName)
+ if err != nil {
+ return
+ }
+ }
+
+ // Keep original value of 'PATH'.
+ origPathValue := os.Getenv("PATH")
+ venvPath, err := filepath.Abs(venvdirName)
+ if err != nil {
+ return
+ }
+ var venvBinPath string
+ if runtime.GOOS == "windows" {
+ venvBinPath = filepath.Join(venvPath, "Scripts")
+ } else {
+ venvBinPath = filepath.Join(venvPath, "bin")
+ }
+ err = os.Setenv("PATH", fmt.Sprintf("%s%c%s", venvBinPath, os.PathListSeparator, origPathValue))
+ if err != nil {
+ return
+ }
+ restoreEnv = func() error {
+ return os.Setenv("PATH", origPathValue)
+ }
+ return
+}
+
+func populatePythonDependencyTree(currNode *xrayUtils.GraphNode, dependenciesGraph map[string][]string, uniqueDepsSet *datastructures.Set[string]) {
+ if currNode.NodeHasLoop() {
+ return
+ }
+ uniqueDepsSet.Add(currNode.Id)
+ currDepChildren := dependenciesGraph[strings.TrimPrefix(currNode.Id, pythonPackageTypeIdentifier)]
+ // Recursively create & append all node's dependencies.
+ for _, dependency := range currDepChildren {
+ childNode := &xrayUtils.GraphNode{
+ Id: pythonPackageTypeIdentifier + dependency,
+ Nodes: []*xrayUtils.GraphNode{},
+ Parent: currNode,
+ }
+ currNode.Nodes = append(currNode.Nodes, childNode)
+ populatePythonDependencyTree(childNode, dependenciesGraph, uniqueDepsSet)
+ }
+}
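The recursion in `populatePythonDependencyTree` walks a flat name-to-children adjacency map while collecting every unique dependency and guarding against cycles. A trimmed-down sketch of that walk, outside the patch, with a visited set standing in for the `NodeHasLoop()` check and illustrative package names:

```go
package main

import "fmt"

// collect walks a flat name -> children adjacency map from a root node and
// records every node it reaches, analogous to how populatePythonDependencyTree
// gathers unique dependencies. The seen map doubles as the cycle guard that
// NodeHasLoop() provides in the real implementation.
func collect(node string, graph map[string][]string, seen map[string]bool) {
	if seen[node] {
		return // already visited; stop to avoid infinite recursion on cycles
	}
	seen[node] = true
	for _, child := range graph[node] {
		collect(child, graph, seen)
	}
}

func main() {
	graph := map[string][]string{
		"pip-example:1.2.3": {"pexpect:4.8.0"},
		"pexpect:4.8.0":     {"ptyprocess:0.7.0"},
	}
	seen := map[string]bool{}
	collect("pip-example:1.2.3", graph, seen)
	fmt.Println(len(seen)) // 3 unique dependencies reached from the root
}
```

The real function additionally builds `GraphNode` children as it descends; this sketch keeps only the traversal and uniqueness bookkeeping.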
diff --git a/commands/audit/sca/python/python_test.go b/commands/audit/sca/python/python_test.go
new file mode 100644
index 00000000..91ead416
--- /dev/null
+++ b/commands/audit/sca/python/python_test.go
@@ -0,0 +1,146 @@
+package python
+
+import (
+ "path/filepath"
+ "testing"
+
+ "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/sca"
+
+ "github.com/jfrog/build-info-go/utils/pythonutils"
+ "github.com/stretchr/testify/assert"
+)
+
+func TestBuildPipDependencyListSetuppy(t *testing.T) {
+ // Create and change directory to test workspace
+ _, cleanUp := sca.CreateTestWorkspace(t, filepath.Join("projects", "package-managers", "python", "pip", "pip", "setuppyproject"))
+ defer cleanUp()
+ // Run getModulesDependencyTrees
+ rootNode, uniqueDeps, err := BuildDependencyTree(&AuditPython{Tool: pythonutils.Pip})
+ assert.NoError(t, err)
+ assert.Contains(t, uniqueDeps, pythonPackageTypeIdentifier+"pexpect:4.8.0")
+ assert.Contains(t, uniqueDeps, pythonPackageTypeIdentifier+"ptyprocess:0.7.0")
+ assert.Contains(t, uniqueDeps, pythonPackageTypeIdentifier+"pip-example:1.2.3")
+ assert.Len(t, rootNode, 1)
+ if len(rootNode) > 0 {
+ assert.NotEmpty(t, rootNode[0].Nodes)
+ if rootNode[0].Nodes != nil {
+ // Test direct dependency
+ directDepNode := tests.GetAndAssertNode(t, rootNode[0].Nodes, "pip-example:1.2.3")
+ // Test child module
+ childNode := tests.GetAndAssertNode(t, directDepNode.Nodes, "pexpect:4.8.0")
+ // Test sub child module
+ tests.GetAndAssertNode(t, childNode.Nodes, "ptyprocess:0.7.0")
+ }
+ }
+}
+
+func TestPipDependencyListRequirementsFallback(t *testing.T) {
+ // Create and change directory to test workspace
+ _, cleanUp := sca.CreateTestWorkspace(t, filepath.Join("projects", "package-managers", "python", "pip", "pip", "requirementsproject"))
+ defer cleanUp()
+ // No requirements file field specified, expect the command to use the fallback 'pip install -r requirements.txt' command
+ rootNode, uniqueDeps, err := BuildDependencyTree(&AuditPython{Tool: pythonutils.Pip})
+ assert.NoError(t, err)
+ assert.Contains(t, uniqueDeps, pythonPackageTypeIdentifier+"pexpect:4.7.0")
+ assert.Contains(t, uniqueDeps, pythonPackageTypeIdentifier+"ptyprocess:0.7.0")
+ assert.Len(t, rootNode, 1)
+ if assert.GreaterOrEqual(t, len(rootNode[0].Nodes), 2) {
+ childNode := tests.GetAndAssertNode(t, rootNode[0].Nodes, "pexpect:4.7.0")
+ if childNode != nil {
+ // Test child module
+ tests.GetAndAssertNode(t, childNode.Nodes, "ptyprocess:0.7.0")
+ }
+ }
+}
+
+func TestBuildPipDependencyListRequirements(t *testing.T) {
+ // Create and change directory to test workspace
+ _, cleanUp := sca.CreateTestWorkspace(t, filepath.Join("projects", "package-managers", "python", "pip", "pip", "requirementsproject"))
+ defer cleanUp()
+ // Run getModulesDependencyTrees
+ rootNode, uniqueDeps, err := BuildDependencyTree(&AuditPython{Tool: pythonutils.Pip, PipRequirementsFile: "requirements.txt"})
+ assert.NoError(t, err)
+ assert.Contains(t, uniqueDeps, pythonPackageTypeIdentifier+"pexpect:4.7.0")
+ assert.Contains(t, uniqueDeps, pythonPackageTypeIdentifier+"ptyprocess:0.7.0")
+ assert.Len(t, rootNode, 1)
+ if len(rootNode) > 0 {
+ assert.NotEmpty(t, rootNode[0].Nodes)
+ if rootNode[0].Nodes != nil {
+ // Test root module
+ directDepNode := tests.GetAndAssertNode(t, rootNode[0].Nodes, "pexpect:4.7.0")
+ // Test child module
+ tests.GetAndAssertNode(t, directDepNode.Nodes, "ptyprocess:0.7.0")
+ }
+ }
+}
+
+func TestBuildPipenvDependencyList(t *testing.T) {
+ // Create and change directory to test workspace
+ _, cleanUp := sca.CreateTestWorkspace(t, filepath.Join("projects", "package-managers", "python", "pipenv", "pipenv", "pipenvproject"))
+ defer cleanUp()
+ expectedPipenvUniqueDeps := []string{
+ pythonPackageTypeIdentifier + "toml:0.10.2",
+ pythonPackageTypeIdentifier + "pexpect:4.8.0",
+ pythonPackageTypeIdentifier + "ptyprocess:0.7.0",
+ }
+ // Run getModulesDependencyTrees
+ rootNode, uniqueDeps, err := BuildDependencyTree(&AuditPython{Tool: pythonutils.Pipenv})
+ if err != nil {
+ t.Fatal(err)
+ }
+ assert.ElementsMatch(t, uniqueDeps, expectedPipenvUniqueDeps, "First is actual, Second is Expected")
+ assert.Len(t, rootNode, 1)
+ if len(rootNode) > 0 {
+ assert.NotEmpty(t, rootNode[0].Nodes)
+ // Test child module
+ childNode := tests.GetAndAssertNode(t, rootNode[0].Nodes, "pexpect:4.8.0")
+ // Test sub child module
+ if assert.NotNil(t, childNode) {
+ tests.GetAndAssertNode(t, childNode.Nodes, "ptyprocess:0.7.0")
+ }
+ }
+}
+
+func TestBuildPoetryDependencyList(t *testing.T) {
+ // Create and change directory to test workspace
+ _, cleanUp := sca.CreateTestWorkspace(t, filepath.Join("projects", "package-managers", "python", "poetry", "my-poetry-project"))
+ defer cleanUp()
+ expectedPoetryUniqueDeps := []string{
+ pythonPackageTypeIdentifier + "wcwidth:0.2.8",
+ pythonPackageTypeIdentifier + "colorama:0.4.6",
+ pythonPackageTypeIdentifier + "packaging:23.2",
+ pythonPackageTypeIdentifier + "python:",
+ pythonPackageTypeIdentifier + "pluggy:0.13.1",
+ pythonPackageTypeIdentifier + "py:1.11.0",
+ pythonPackageTypeIdentifier + "atomicwrites:1.4.1",
+ pythonPackageTypeIdentifier + "attrs:23.1.0",
+ pythonPackageTypeIdentifier + "more-itertools:10.1.0",
+ pythonPackageTypeIdentifier + "numpy:1.26.1",
+ pythonPackageTypeIdentifier + "pytest:5.4.3",
+ }
+ // Run getModulesDependencyTrees
+ rootNode, uniqueDeps, err := BuildDependencyTree(&AuditPython{Tool: pythonutils.Poetry})
+ if err != nil {
+ t.Fatal(err)
+ }
+ assert.ElementsMatch(t, uniqueDeps, expectedPoetryUniqueDeps, "First is actual, Second is Expected")
+ assert.Len(t, rootNode, 1)
+ if len(rootNode) > 0 {
+ assert.NotEmpty(t, rootNode[0].Nodes)
+ // Test child module
+ childNode := tests.GetAndAssertNode(t, rootNode[0].Nodes, "pytest:5.4.3")
+ // Test sub child module
+ if assert.NotNil(t, childNode) {
+ tests.GetAndAssertNode(t, childNode.Nodes, "packaging:23.2")
+ }
+ }
+}
+
+func TestGetPipInstallArgs(t *testing.T) {
+ assert.Equal(t, []string{"-m", "pip", "install", "."}, getPipInstallArgs("", ""))
+ assert.Equal(t, []string{"-m", "pip", "install", "-r", "requirements.txt"}, getPipInstallArgs("requirements.txt", ""))
+
+ assert.Equal(t, []string{"-m", "pip", "install", ".", "-i", "https://user@pass:remote.url/repo"}, getPipInstallArgs("", "https://user@pass:remote.url/repo"))
+ assert.Equal(t, []string{"-m", "pip", "install", "-r", "requirements.txt", "-i", "https://user@pass:remote.url/repo"}, getPipInstallArgs("requirements.txt", "https://user@pass:remote.url/repo"))
+}
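The `TestGetPipInstallArgs` cases above pin down how the pip install arguments are assembled. As a standalone illustration (a sketch mirroring the behavior the tests assert, not the actual `getPipInstallArgs` implementation), the flag logic can be reproduced like this:

```go
package main

import "fmt"

// pipInstallArgs mirrors the flag assembly exercised by the tests above:
// install the current project by default, or a requirements file when one is
// given, and append an index URL when a remote PyPI repository is configured.
func pipInstallArgs(requirementsFile, remoteUrl string) []string {
	args := []string{"-m", "pip", "install"}
	if requirementsFile == "" {
		args = append(args, ".")
	} else {
		args = append(args, "-r", requirementsFile)
	}
	if remoteUrl != "" {
		args = append(args, "-i", remoteUrl)
	}
	return args
}

func main() {
	fmt.Println(pipInstallArgs("", ""))
	fmt.Println(pipInstallArgs("requirements.txt", "https://r"))
}
```

The `-r` and `-i` flags are standard pip options (`--requirement` and `--index-url`), which is why the tests can assert the exact argument order.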
diff --git a/commands/audit/sca/yarn/yarn.go b/commands/audit/sca/yarn/yarn.go
new file mode 100644
index 00000000..025da9d2
--- /dev/null
+++ b/commands/audit/sca/yarn/yarn.go
@@ -0,0 +1,218 @@
+package yarn
+
+import (
+ "errors"
+ "fmt"
+ "path/filepath"
+
+ "github.com/jfrog/build-info-go/build"
+ biutils "github.com/jfrog/build-info-go/build/utils"
+ "github.com/jfrog/gofrog/version"
+ "github.com/jfrog/jfrog-cli-core/v2/artifactory/commands/yarn"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/ioutils"
+ coreXray "github.com/jfrog/jfrog-cli-core/v2/utils/xray"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/utils/errorutils"
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+)
+
+const (
+ // Do not execute any scripts defined in the project package.json and its dependencies.
+ v1IgnoreScriptsFlag = "--ignore-scripts"
+ // Run yarn install without printing installation log.
+ v1SilentFlag = "--silent"
+ // Disable interactive prompts, like when there’s an invalid version of a dependency.
+ v1NonInteractiveFlag = "--non-interactive"
+ // Ignores any build scripts
+ v2SkipBuildFlag = "--skip-builds"
+ // Skips linking and fetches only packages that are missing from the yarn.lock file
+ v3UpdateLockfileFlag = "--mode=update-lockfile"
+ // Ignores any build scripts
+ v3SkipBuildFlag = "--mode=skip-build"
+ yarnV2Version = "2.0.0"
+ yarnV3Version = "3.0.0"
+ yarnV4Version = "4.0.0"
+ nodeModulesRepoName = "node_modules"
+)
+
+func BuildDependencyTree(params utils.AuditParams) (dependencyTrees []*xrayUtils.GraphNode, uniqueDeps []string, err error) {
+ currentDir, err := coreutils.GetWorkingDirectory()
+ if err != nil {
+ return
+ }
+ executablePath, err := biutils.GetYarnExecutable()
+ if errorutils.CheckError(err) != nil {
+ return
+ }
+
+ packageInfo, err := biutils.ReadPackageInfoFromPackageJsonIfExists(currentDir, nil)
+ if errorutils.CheckError(err) != nil {
+ return
+ }
+
+ installRequired, err := isInstallRequired(currentDir, params.InstallCommandArgs())
+ if err != nil {
+ return
+ }
+
+ if installRequired {
+ err = configureYarnResolutionServerAndRunInstall(params, currentDir, executablePath)
+ if err != nil {
+ err = fmt.Errorf("failed to configure an Artifactory resolution server or to run the install command: %s", err.Error())
+ return
+ }
+ }
+
+ // Calculate Yarn dependencies
+ dependenciesMap, root, err := biutils.GetYarnDependencies(executablePath, currentDir, packageInfo, log.Logger)
+ if err != nil {
+ return
+ }
+ // Parse the dependencies into Xray dependency tree format
+ dependencyTree, uniqueDeps := parseYarnDependenciesMap(dependenciesMap, getXrayDependencyId(root))
+ dependencyTrees = []*xrayUtils.GraphNode{dependencyTree}
+ return
+}
+
+// Sets up Artifactory server configurations for dependency resolution, if such were provided by the user.
+// Executes the user's 'install' command or a default 'install' command if none was specified.
+func configureYarnResolutionServerAndRunInstall(params utils.AuditParams, curWd, yarnExecPath string) (err error) {
+ depsRepo := params.DepsRepo()
+ if depsRepo == "" {
+ // Run install without configuring an Artifactory server
+ return runYarnInstallAccordingToVersion(curWd, yarnExecPath, params.InstallCommandArgs())
+ }
+
+ executableYarnVersion, err := biutils.GetVersion(yarnExecPath, curWd)
+ if err != nil {
+ return
+ }
+ // Check whether the current Yarn version is Yarn V1 or Yarn V4 and, if so, abort. Resolving dependencies from Artifactory is currently not supported for Yarn V1 and V4
+ yarnVersion := version.NewVersion(executableYarnVersion)
+ if yarnVersion.Compare(yarnV2Version) > 0 || yarnVersion.Compare(yarnV4Version) <= 0 {
+ err = errors.New("resolving Yarn dependencies from Artifactory is currently not supported for Yarn V1 and Yarn V4. The current Yarn version is: " + executableYarnVersion)
+ return
+ }
+
+ var serverDetails *config.ServerDetails
+ serverDetails, err = params.ServerDetails()
+ if err != nil {
+ err = fmt.Errorf("failed to get server details while building yarn dependency tree: %s", err.Error())
+ return
+ }
+
+ // If an Artifactory resolution repository was provided, we first configure resolution from it and only then run the 'install' command
+ restoreYarnrcFunc, err := ioutils.BackupFile(filepath.Join(curWd, yarn.YarnrcFileName), yarn.YarnrcBackupFileName)
+ if err != nil {
+ return
+ }
+
+ registry, repoAuthIdent, err := yarn.GetYarnAuthDetails(serverDetails, depsRepo)
+ if err != nil {
+ err = errors.Join(err, restoreYarnrcFunc())
+ return
+ }
+
+ backupEnvMap, err := yarn.ModifyYarnConfigurations(yarnExecPath, registry, repoAuthIdent)
+ if err != nil {
+ if len(backupEnvMap) > 0 {
+ err = errors.Join(err, yarn.RestoreConfigurationsFromBackup(backupEnvMap, restoreYarnrcFunc))
+ } else {
+ err = errors.Join(err, restoreYarnrcFunc())
+ }
+ return
+ }
+ defer func() {
+ err = errors.Join(err, yarn.RestoreConfigurationsFromBackup(backupEnvMap, restoreYarnrcFunc))
+ }()
+
+ log.Info(fmt.Sprintf("Resolving dependencies from server '%s', repository '%s'", serverDetails.Url, depsRepo))
+ return runYarnInstallAccordingToVersion(curWd, yarnExecPath, params.InstallCommandArgs())
+}
+
+func isInstallRequired(currentDir string, installCommandArgs []string) (installRequired bool, err error) {
+ yarnLockExists, err := fileutils.IsFileExists(filepath.Join(currentDir, yarn.YarnLockFileName), false)
+ if err != nil {
+ err = fmt.Errorf("failed to check the existence of '%s' file: %s", filepath.Join(currentDir, yarn.YarnLockFileName), err.Error())
+ return
+ }
+
+ // We verify the project's installation status by examining the presence of the yarn.lock file and the presence of a user-provided installation command.
+ // Note: if the package.json file is altered manually, the yarn.lock file must be updated manually as well.
+ if len(installCommandArgs) > 0 || !yarnLockExists {
+ installRequired = true
+ }
+ return
+}
+
+// Executes the user-defined 'install' command; if absent, defaults to running an 'install' command with specific flags suited to the current yarn version.
+func runYarnInstallAccordingToVersion(curWd, yarnExecPath string, installCommandArgs []string) (err error) {
+ // If the installCommandArgs in the params is not empty, it signifies that the user has provided it, and 'install' is already included as one of the arguments
+ installCommandProvidedFromUser := len(installCommandArgs) != 0
+
+ // Upon receiving a user-provided 'install' command, we execute the command exactly as provided
+ if installCommandProvidedFromUser {
+ return build.RunYarnCommand(yarnExecPath, curWd, installCommandArgs...)
+ }
+
+ installCommandArgs = []string{"install"}
+ executableVersionStr, err := biutils.GetVersion(yarnExecPath, curWd)
+ if err != nil {
+ return
+ }
+
+ yarnVersion := version.NewVersion(executableVersionStr)
+ isYarnV1 := yarnVersion.Compare(yarnV2Version) > 0
+
+ if isYarnV1 {
+ // When executing 'yarn install...', the node_modules directory is automatically generated.
+ // If it did not exist prior to the 'install' command, we aim to remove it.
+ nodeModulesFullPath := filepath.Join(curWd, nodeModulesRepoName)
+ var nodeModulesDirExists bool
+ nodeModulesDirExists, err = fileutils.IsDirExists(nodeModulesFullPath, false)
+ if err != nil {
+ err = fmt.Errorf("failed while checking for existence of node_modules directory: %s", err.Error())
+ return
+ }
+ if !nodeModulesDirExists {
+ defer func() {
+ err = errors.Join(err, fileutils.RemoveTempDir(nodeModulesFullPath))
+ }()
+ }
+
+ installCommandArgs = append(installCommandArgs, v1IgnoreScriptsFlag, v1SilentFlag, v1NonInteractiveFlag)
+ } else {
+ // Checks if the version is V2 or V3 to insert the correct flags
+ if yarnVersion.Compare(yarnV3Version) > 0 {
+ installCommandArgs = append(installCommandArgs, v2SkipBuildFlag)
+ } else {
+ installCommandArgs = append(installCommandArgs, v3UpdateLockfileFlag, v3SkipBuildFlag)
+ }
+ }
+ err = build.RunYarnCommand(yarnExecPath, curWd, installCommandArgs...)
+ return
+}
+
+// Parse the dependencies into a Xray dependency tree format
+func parseYarnDependenciesMap(dependencies map[string]*biutils.YarnDependency, rootXrayId string) (*xrayUtils.GraphNode, []string) {
+ treeMap := make(map[string][]string)
+ for _, dependency := range dependencies {
+ xrayDepId := getXrayDependencyId(dependency)
+ var subDeps []string
+ for _, subDepPtr := range dependency.Details.Dependencies {
+ subDeps = append(subDeps, getXrayDependencyId(dependencies[biutils.GetYarnDependencyKeyFromLocator(subDepPtr.Locator)]))
+ }
+ if len(subDeps) > 0 {
+ treeMap[xrayDepId] = subDeps
+ }
+ }
+ return coreXray.BuildXrayDependencyTree(treeMap, rootXrayId)
+}
+
+func getXrayDependencyId(yarnDependency *biutils.YarnDependency) string {
+ return utils.NpmPackageTypeIdentifier + yarnDependency.Name() + ":" + yarnDependency.Details.Version
+}
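The version gating in `runYarnInstallAccordingToVersion` can be summarized in a small standalone sketch. This is a simplification: the real code compares versions with gofrog's `version` package rather than parsing the major number, and `installFlags` is a hypothetical helper, but the flag selection per major version follows the logic above:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// installFlags sketches the per-version flag selection: Yarn v1 gets its
// legacy non-interactive flags, Yarn v2 gets --skip-builds, and Yarn v3+
// gets the --mode flags.
func installFlags(yarnVersion string) []string {
	major, _ := strconv.Atoi(strings.SplitN(yarnVersion, ".", 2)[0])
	switch {
	case major < 2:
		return []string{"--ignore-scripts", "--silent", "--non-interactive"}
	case major == 2:
		return []string{"--skip-builds"}
	default:
		return []string{"--mode=update-lockfile", "--mode=skip-build"}
	}
}

func main() {
	for _, v := range []string{"1.22.19", "2.4.3", "3.6.4"} {
		fmt.Printf("yarn %s -> install %s\n", v, strings.Join(installFlags(v), " "))
	}
}
```

All three flag sets avoid running package lifecycle scripts, which matters for an audit command that installs untrusted dependency trees only to enumerate them.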
diff --git a/commands/audit/sca/yarn/yarn_test.go b/commands/audit/sca/yarn/yarn_test.go
new file mode 100644
index 00000000..38bd3216
--- /dev/null
+++ b/commands/audit/sca/yarn/yarn_test.go
@@ -0,0 +1,101 @@
+package yarn
+
+import (
+ "github.com/jfrog/build-info-go/build"
+ biutils "github.com/jfrog/build-info-go/build/utils"
+ utils2 "github.com/jfrog/build-info-go/utils"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+ "github.com/stretchr/testify/assert"
+ "path/filepath"
+ "testing"
+)
+
+func TestParseYarnDependenciesList(t *testing.T) {
+ yarnDependencies := map[string]*biutils.YarnDependency{
+ "pack1@npm:1.0.0": {Value: "pack1@npm:1.0.0", Details: biutils.YarnDepDetails{Version: "1.0.0", Dependencies: []biutils.YarnDependencyPointer{{Locator: "pack4@npm:4.0.0"}}}},
+ "pack2@npm:2.0.0": {Value: "pack2@npm:2.0.0", Details: biutils.YarnDepDetails{Version: "2.0.0", Dependencies: []biutils.YarnDependencyPointer{{Locator: "pack4@npm:4.0.0"}, {Locator: "pack5@npm:5.0.0"}}}},
+ "@jfrog/pack3@npm:3.0.0": {Value: "@jfrog/pack3@npm:3.0.0", Details: biutils.YarnDepDetails{Version: "3.0.0", Dependencies: []biutils.YarnDependencyPointer{{Locator: "pack1@virtual:c192f6b3b32cd5d11a443144e162ec3bc#npm:1.0.0"}, {Locator: "pack2@npm:2.0.0"}}}},
+ "pack4@npm:4.0.0": {Value: "pack4@npm:4.0.0", Details: biutils.YarnDepDetails{Version: "4.0.0"}},
+ "pack5@npm:5.0.0": {Value: "pack5@npm:5.0.0", Details: biutils.YarnDepDetails{Version: "5.0.0", Dependencies: []biutils.YarnDependencyPointer{{Locator: "pack2@npm:2.0.0"}}}},
+ }
+
+ rootXrayId := utils.NpmPackageTypeIdentifier + "@jfrog/pack3:3.0.0"
+ expectedTree := &xrayUtils.GraphNode{
+ Id: rootXrayId,
+ Nodes: []*xrayUtils.GraphNode{
+ {Id: utils.NpmPackageTypeIdentifier + "pack1:1.0.0",
+ Nodes: []*xrayUtils.GraphNode{
+ {Id: utils.NpmPackageTypeIdentifier + "pack4:4.0.0",
+ Nodes: []*xrayUtils.GraphNode{}},
+ }},
+ {Id: utils.NpmPackageTypeIdentifier + "pack2:2.0.0",
+ Nodes: []*xrayUtils.GraphNode{
+ {Id: utils.NpmPackageTypeIdentifier + "pack4:4.0.0",
+ Nodes: []*xrayUtils.GraphNode{}},
+ {Id: utils.NpmPackageTypeIdentifier + "pack5:5.0.0",
+ Nodes: []*xrayUtils.GraphNode{}},
+ }},
+ },
+ }
+ expectedUniqueDeps := []string{
+ utils.NpmPackageTypeIdentifier + "pack1:1.0.0",
+ utils.NpmPackageTypeIdentifier + "pack2:2.0.0",
+ utils.NpmPackageTypeIdentifier + "pack4:4.0.0",
+ utils.NpmPackageTypeIdentifier + "pack5:5.0.0",
+ utils.NpmPackageTypeIdentifier + "@jfrog/pack3:3.0.0",
+ }
+
+ xrayDependenciesTree, uniqueDeps := parseYarnDependenciesMap(yarnDependencies, rootXrayId)
+ assert.ElementsMatch(t, uniqueDeps, expectedUniqueDeps, "First is actual, Second is Expected")
+ assert.True(t, tests.CompareTree(expectedTree, xrayDependenciesTree), "expected:", expectedTree.Nodes, "got:", xrayDependenciesTree.Nodes)
+}
+
+func TestIsInstallRequired(t *testing.T) {
+ tempDirPath, createTempDirCallback := tests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ yarnProjectPath := filepath.Join("..", "..", "..", "testdata", "yarn-project")
+ assert.NoError(t, utils2.CopyDir(yarnProjectPath, tempDirPath, true, nil))
+ installRequired, err := isInstallRequired(tempDirPath, []string{})
+ assert.NoError(t, err)
+ assert.True(t, installRequired)
+ executablePath, err := biutils.GetYarnExecutable()
+ assert.NoError(t, err)
+
+ // We provide a user defined 'install' command and expect to get 'true' as an answer
+ installRequired, err = isInstallRequired(tempDirPath, []string{"yarn", "install"})
+ assert.NoError(t, err)
+ assert.True(t, installRequired)
+
+ // We install the project so yarn.lock will be created and expect to get 'false' as an answer
+ assert.NoError(t, build.RunYarnCommand(executablePath, tempDirPath, "install"))
+ installRequired, err = isInstallRequired(tempDirPath, []string{})
+ assert.NoError(t, err)
+ assert.False(t, installRequired)
+}
+
+func TestRunYarnInstallAccordingToVersion(t *testing.T) {
+ // Testing default 'install' command
+ executeRunYarnInstallAccordingToVersionAndVerifyInstallation(t, []string{})
+ // Testing user provided 'install' command
+ executeRunYarnInstallAccordingToVersionAndVerifyInstallation(t, []string{"install", v1IgnoreScriptsFlag})
+}
+
+func executeRunYarnInstallAccordingToVersionAndVerifyInstallation(t *testing.T, params []string) {
+ tempDirPath, createTempDirCallback := tests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ yarnProjectPath := filepath.Join("..", "..", "..", "testdata", "yarn-project")
+ assert.NoError(t, utils2.CopyDir(yarnProjectPath, tempDirPath, true, nil))
+
+ executablePath, err := biutils.GetYarnExecutable()
+ assert.NoError(t, err)
+
+ err = runYarnInstallAccordingToVersion(tempDirPath, executablePath, params)
+ assert.NoError(t, err)
+
+ // Checking the installation worked - we expect to get a 'false' answer when checking whether the project is installed
+ installRequired, err := isInstallRequired(tempDirPath, []string{})
+ assert.NoError(t, err)
+ assert.False(t, installRequired)
+}
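The shape of `parseYarnDependenciesMap` tested above is easiest to see with plain types. In this stripped-down sketch, `Dep` and the `npm://` prefix stand in for `biutils.YarnDependency` and `utils.NpmPackageTypeIdentifier`; the real function then hands the adjacency map to `coreXray.BuildXrayDependencyTree`:

```go
package main

import "fmt"

// Dep is a minimal stand-in for a yarn dependency entry: a name, a version,
// and locator keys pointing at its children in the dependency map.
type Dep struct {
	Name, Version string
	Children      []string // keys into the deps map
}

func xrayId(d Dep) string { return "npm://" + d.Name + ":" + d.Version }

// buildTreeMap flattens the dependency map into an id -> children adjacency
// map, keeping only nodes that actually have children.
func buildTreeMap(deps map[string]Dep) map[string][]string {
	treeMap := make(map[string][]string)
	for _, d := range deps {
		var children []string
		for _, key := range d.Children {
			children = append(children, xrayId(deps[key]))
		}
		if len(children) > 0 {
			treeMap[xrayId(d)] = children
		}
	}
	return treeMap
}

func main() {
	deps := map[string]Dep{
		"pack1": {Name: "pack1", Version: "1.0.0", Children: []string{"pack2"}},
		"pack2": {Name: "pack2", Version: "2.0.0"},
	}
	fmt.Println(buildTreeMap(deps))
}
```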
diff --git a/commands/audit/scarunner.go b/commands/audit/scarunner.go
new file mode 100644
index 00000000..7d521912
--- /dev/null
+++ b/commands/audit/scarunner.go
@@ -0,0 +1,298 @@
+package audit
+
+import (
+ "encoding/json"
+ "errors"
+ "fmt"
+ "os"
+ "time"
+
+ "github.com/jfrog/build-info-go/utils/pythonutils"
+ "github.com/jfrog/gofrog/datastructures"
+ "github.com/jfrog/jfrog-cli-core/v2/common/project"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/java"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/sca"
+ _go "github.com/jfrog/jfrog-cli-security/commands/audit/sca/go"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/sca/npm"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/sca/nuget"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/sca/python"
+ "github.com/jfrog/jfrog-cli-security/commands/audit/sca/yarn"
+ "github.com/jfrog/jfrog-cli-security/scangraph"
+ xrayutils "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/artifactory/services/fspatterns"
+ clientutils "github.com/jfrog/jfrog-client-go/utils"
+ "github.com/jfrog/jfrog-client-go/utils/errorutils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ "github.com/jfrog/jfrog-client-go/xray/services"
+ xrayCmdUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+)
+
+var DefaultExcludePatterns = []string{"*.git*", "*node_modules*", "*target*", "*venv*", "*test*"}
+
+func runScaScan(params *AuditParams, results *xrayutils.Results) (err error) {
+ // Prepare
+ currentWorkingDir, err := os.Getwd()
+ if errorutils.CheckError(err) != nil {
+ return
+ }
+ serverDetails, err := params.ServerDetails()
+ if err != nil {
+ return
+ }
+
+ scans := getScaScansToPreform(params)
+ if len(scans) == 0 {
+ log.Info("Couldn't determine a package manager or build tool used by this project. Skipping the SCA scan...")
+ return
+ }
+ scanInfo, err := coreutils.GetJsonIndent(scans)
+ if err != nil {
+ return
+ }
+ log.Info(fmt.Sprintf("Performing %d SCA scans:\n%s", len(scans), scanInfo))
+
+ defer func() {
+ // Make sure to return to the original working directory, executeScaScan may change it
+ err = errors.Join(err, os.Chdir(currentWorkingDir))
+ }()
+ for _, scan := range scans {
+ // Run the scan
+ log.Info("Running SCA scan for", scan.Technology, "vulnerable dependencies in", scan.WorkingDirectory, "directory...")
+ if wdScanErr := executeScaScan(serverDetails, params, scan); wdScanErr != nil {
+ err = errors.Join(err, fmt.Errorf("audit command in '%s' failed:\n%s", scan.WorkingDirectory, wdScanErr.Error()))
+ continue
+ }
+ // Add the scan to the results
+ results.ScaResults = append(results.ScaResults, *scan)
+ }
+ return
+}
+
+// Calculate the scans to perform
+func getScaScansToPreform(params *AuditParams) (scansToPreform []*xrayutils.ScaScanResult) {
+ for _, requestedDirectory := range params.workingDirs {
+ // Detect descriptors and technologies in the requested directory.
+ techToWorkingDirs, err := coreutils.DetectTechnologiesDescriptors(requestedDirectory, params.isRecursiveScan, params.Technologies(), getRequestedDescriptors(params), getExcludePattern(params, params.isRecursiveScan))
+ if err != nil {
+ log.Warn("Couldn't detect technologies in", requestedDirectory, "directory.", err.Error())
+ continue
+ }
+ // Create scans to perform
+ for tech, workingDirs := range techToWorkingDirs {
+ if tech == coreutils.Dotnet {
+ // We detect Dotnet and Nuget the same way, if one detected so does the other.
+ // We don't need to scan for both and get duplicate results.
+ continue
+ }
+ if len(workingDirs) == 0 {
+ // Requested technology (from params) descriptors/indicators were not found; scan only the requested directory for this technology.
+ scansToPreform = append(scansToPreform, &xrayutils.ScaScanResult{WorkingDirectory: requestedDirectory, Technology: tech})
+ }
+ for workingDir, descriptors := range workingDirs {
+ // Add scan for each detected working directory.
+ scansToPreform = append(scansToPreform, &xrayutils.ScaScanResult{WorkingDirectory: workingDir, Technology: tech, Descriptors: descriptors})
+ }
+ }
+ }
+ return
+}
+
+func getRequestedDescriptors(params *AuditParams) map[coreutils.Technology][]string {
+ requestedDescriptors := map[coreutils.Technology][]string{}
+ if params.PipRequirementsFile() != "" {
+ requestedDescriptors[coreutils.Pip] = []string{params.PipRequirementsFile()}
+ }
+ return requestedDescriptors
+}
+
+func getExcludePattern(params *AuditParams, recursive bool) string {
+ exclusions := params.Exclusions()
+ if len(exclusions) == 0 {
+ exclusions = append(exclusions, DefaultExcludePatterns...)
+ }
+ return fspatterns.PrepareExcludePathPattern(exclusions, clientutils.WildCardPattern, recursive)
+}
+
+// Perform the SCA scan for the given scan information.
+// This method will change the working directory to the scan's working directory.
+func executeScaScan(serverDetails *config.ServerDetails, params *AuditParams, scan *xrayutils.ScaScanResult) (err error) {
+ // Get the dependency tree for the technology in the working directory.
+ if err = os.Chdir(scan.WorkingDirectory); err != nil {
+ return errorutils.CheckError(err)
+ }
+ flattenTree, fullDependencyTrees, techErr := GetTechDependencyTree(params.AuditBasicParams, scan.Technology)
+ if techErr != nil {
+ return fmt.Errorf("failed while building '%s' dependency tree:\n%s", scan.Technology, techErr.Error())
+ }
+ if flattenTree == nil || len(flattenTree.Nodes) == 0 {
+ return errorutils.CheckErrorf("no dependencies were found. Please try to build your project and re-run the audit command")
+ }
+ // Scan the dependency tree.
+ scanResults, xrayErr := runScaWithTech(scan.Technology, params, serverDetails, flattenTree, fullDependencyTrees)
+ if xrayErr != nil {
+ return fmt.Errorf("'%s' Xray dependency tree scan request failed:\n%s", scan.Technology, xrayErr.Error())
+ }
+ scan.IsMultipleRootProject = clientutils.Pointer(len(fullDependencyTrees) > 1)
+ addThirdPartyDependenciesToParams(params, scan.Technology, flattenTree, fullDependencyTrees)
+ scan.XrayResults = append(scan.XrayResults, scanResults...)
+ return
+}
+
+func runScaWithTech(tech coreutils.Technology, params *AuditParams, serverDetails *config.ServerDetails, flatTree *xrayCmdUtils.GraphNode, fullDependencyTrees []*xrayCmdUtils.GraphNode) (techResults []services.ScanResponse, err error) {
+ scanGraphParams := scangraph.NewScanGraphParams().
+ SetServerDetails(serverDetails).
+ SetXrayGraphScanParams(params.xrayGraphScanParams).
+ SetXrayVersion(params.xrayVersion).
+ SetFixableOnly(params.fixableOnly).
+ SetSeverityLevel(params.minSeverityFilter)
+ techResults, err = sca.RunXrayDependenciesTreeScanGraph(flatTree, params.Progress(), tech, scanGraphParams)
+ if err != nil {
+ return
+ }
+ techResults = sca.BuildImpactPathsForScanResponse(techResults, fullDependencyTrees)
+ return
+}
+
+func addThirdPartyDependenciesToParams(params *AuditParams, tech coreutils.Technology, flatTree *xrayCmdUtils.GraphNode, fullDependencyTrees []*xrayCmdUtils.GraphNode) {
+ var dependenciesForApplicabilityScan []string
+ if shouldUseAllDependencies(params.thirdPartyApplicabilityScan, tech) {
+ dependenciesForApplicabilityScan = getDirectDependenciesFromTree([]*xrayCmdUtils.GraphNode{flatTree})
+ } else {
+ dependenciesForApplicabilityScan = getDirectDependenciesFromTree(fullDependencyTrees)
+ }
+ params.AppendDependenciesForApplicabilityScan(dependenciesForApplicabilityScan)
+}
+
+// When building pip dependency tree using pipdeptree, some of the direct dependencies are recognized as transitive and missed by the CA scanner.
+// Our solution for this case is to send all dependencies to the CA scanner.
+// When thirdPartyApplicabilityScan is true, use flatten graph to include all the dependencies in applicability scanning.
+// Only npm is supported for this flag.
+func shouldUseAllDependencies(thirdPartyApplicabilityScan bool, tech coreutils.Technology) bool {
+ return tech == coreutils.Pip || (thirdPartyApplicabilityScan && tech == coreutils.Npm)
+}
+
+// This function retrieves the dependency trees of the scanned project and extracts a set that contains only the direct dependencies.
+func getDirectDependenciesFromTree(dependencyTrees []*xrayCmdUtils.GraphNode) []string {
+ directDependencies := datastructures.MakeSet[string]()
+ for _, tree := range dependencyTrees {
+ for _, node := range tree.Nodes {
+ directDependencies.Add(node.Id)
+ }
+ }
+ return directDependencies.ToSlice()
+}
+
+func GetTechDependencyTree(params xrayutils.AuditParams, tech coreutils.Technology) (flatTree *xrayCmdUtils.GraphNode, fullDependencyTrees []*xrayCmdUtils.GraphNode, err error) {
+ logMessage := fmt.Sprintf("Calculating %s dependencies", tech.ToFormal())
+ log.Info(logMessage + "...")
+ if params.Progress() != nil {
+ params.Progress().SetHeadlineMsg(logMessage)
+ }
+ serverDetails, err := params.ServerDetails()
+ if err != nil {
+ return
+ }
+ err = SetResolutionRepoIfExists(params, tech)
+ if err != nil {
+ return
+ }
+ var uniqueDeps []string
+ startTime := time.Now()
+ switch tech {
+ case coreutils.Maven, coreutils.Gradle:
+ fullDependencyTrees, uniqueDeps, err = java.BuildDependencyTree(serverDetails, params.DepsRepo(), params.UseWrapper(), params.IsMavenDepTreeInstalled(), tech)
+ case coreutils.Npm:
+ fullDependencyTrees, uniqueDeps, err = npm.BuildDependencyTree(params)
+ case coreutils.Yarn:
+ fullDependencyTrees, uniqueDeps, err = yarn.BuildDependencyTree(params)
+ case coreutils.Go:
+ fullDependencyTrees, uniqueDeps, err = _go.BuildDependencyTree(params)
+ case coreutils.Pipenv, coreutils.Pip, coreutils.Poetry:
+ fullDependencyTrees, uniqueDeps, err = python.BuildDependencyTree(&python.AuditPython{
+ Server: serverDetails,
+ Tool: pythonutils.PythonTool(tech),
+ RemotePypiRepo: params.DepsRepo(),
+ PipRequirementsFile: params.PipRequirementsFile()})
+ case coreutils.Nuget:
+ fullDependencyTrees, uniqueDeps, err = nuget.BuildDependencyTree(params)
+ default:
+ err = errorutils.CheckErrorf("%s is currently not supported", string(tech))
+ }
+ if err != nil || len(uniqueDeps) == 0 {
+ return
+ }
+ log.Debug(fmt.Sprintf("Created '%s' dependency tree with %d nodes. Elapsed time: %.1f seconds.", tech.ToFormal(), len(uniqueDeps), time.Since(startTime).Seconds()))
+ flatTree, err = createFlatTree(uniqueDeps)
+ return
+}
+
+// Maps each technology to its corresponding project type.
+// Docker is not present, as there is no docker-config command and, consequently, no docker.yaml file we need to operate on.
+var techType = map[coreutils.Technology]project.ProjectType{
+ coreutils.Maven: project.Maven, coreutils.Gradle: project.Gradle, coreutils.Npm: project.Npm, coreutils.Yarn: project.Yarn, coreutils.Go: project.Go, coreutils.Pip: project.Pip,
+ coreutils.Pipenv: project.Pipenv, coreutils.Poetry: project.Poetry, coreutils.Nuget: project.Nuget, coreutils.Dotnet: project.Dotnet,
+}
+
+// Verifies the existence of depsRepo. If it doesn't exist, it searches for a configuration file based on the technology type. If found, it assigns depsRepo in the AuditParams.
+func SetResolutionRepoIfExists(params xrayutils.AuditParams, tech coreutils.Technology) (err error) {
+ if params.DepsRepo() != "" || params.IgnoreConfigFile() {
+ return
+ }
+
+ configFilePath, exists, err := project.GetProjectConfFilePath(techType[tech])
+ if err != nil {
+ err = fmt.Errorf("failed while searching for %s.yaml config file: %s", tech.String(), err.Error())
+ return
+ }
+ if !exists {
+ // Nuget and Dotnet are identified similarly in the detection process. To prevent redundancy, Dotnet is filtered out earlier in the process, focusing solely on detecting Nuget.
+ // Consequently, it becomes necessary to verify the presence of dotnet.yaml when Nuget detection occurs.
+ if tech == coreutils.Nuget {
+ configFilePath, exists, err = project.GetProjectConfFilePath(techType[coreutils.Dotnet])
+ if err != nil {
+ err = fmt.Errorf("failed while searching for %s.yaml config file: %s", tech.String(), err.Error())
+ return
+ }
+ if !exists {
+ log.Debug(fmt.Sprintf("Neither %s.yaml nor %s.yaml configuration file was found. Resolving dependencies from %s default registry", coreutils.Nuget.String(), coreutils.Dotnet.String(), tech.String()))
+ return
+ }
+ } else {
+ log.Debug(fmt.Sprintf("No %s.yaml configuration file was found. Resolving dependencies from %s default registry", tech.String(), tech.String()))
+ return
+ }
+ }
+
+ log.Debug("Using resolver config from", configFilePath)
+ repoConfig, err := project.ReadResolutionOnlyConfiguration(configFilePath)
+ if err != nil {
+ err = fmt.Errorf("failed while reading %s.yaml config file: %s", tech.String(), err.Error())
+ return
+ }
+ details, err := repoConfig.ServerDetails()
+ if err != nil {
+ err = fmt.Errorf("failed getting server details: %s", err.Error())
+ return
+ }
+ params.SetServerDetails(details)
+ params.SetDepsRepo(repoConfig.TargetRepo())
+ return
+}
+
+func createFlatTree(uniqueDeps []string) (*xrayCmdUtils.GraphNode, error) {
+ if log.GetLogger().GetLogLevel() == log.DEBUG {
+		// Avoid marshaling and printing if not in DEBUG mode.
+ jsonList, err := json.Marshal(uniqueDeps)
+ if errorutils.CheckError(err) != nil {
+ return nil, err
+ }
+ log.Debug("Unique dependencies list:\n" + clientutils.IndentJsonArray(jsonList))
+ }
+ uniqueNodes := []*xrayCmdUtils.GraphNode{}
+ for _, uniqueDep := range uniqueDeps {
+ uniqueNodes = append(uniqueNodes, &xrayCmdUtils.GraphNode{Id: uniqueDep})
+ }
+ return &xrayCmdUtils.GraphNode{Id: "root", Nodes: uniqueNodes}, nil
+}
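The flat tree built by `createFlatTree` above can be illustrated in isolation. This is a minimal sketch that uses a local `GraphNode` stand-in for `xrayCmdUtils.GraphNode` (the local type and sample IDs are illustrative, not part of the patch):

```go
package main

import "fmt"

// GraphNode is a hypothetical local stand-in mirroring the shape of
// xrayCmdUtils.GraphNode used in the patch above.
type GraphNode struct {
	Id    string
	Nodes []*GraphNode
}

// createFlatTree attaches every unique dependency as a direct child of a
// synthetic "root" node, producing a one-level (flat) dependency graph.
func createFlatTree(uniqueDeps []string) *GraphNode {
	nodes := make([]*GraphNode, 0, len(uniqueDeps))
	for _, dep := range uniqueDeps {
		nodes = append(nodes, &GraphNode{Id: dep})
	}
	return &GraphNode{Id: "root", Nodes: nodes}
}

func main() {
	tree := createFlatTree([]string{"npm://a:1.0.0", "npm://b:2.0.0"})
	fmt.Println(tree.Id, len(tree.Nodes)) // root 2
}
```

The flat shape matters because the curation flow later iterates `graph.Nodes` directly, relying on the flattened graph having no nesting and no duplicates.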
diff --git a/commands/audit/scarunner_test.go b/commands/audit/scarunner_test.go
new file mode 100644
index 00000000..7f2cfc0c
--- /dev/null
+++ b/commands/audit/scarunner_test.go
@@ -0,0 +1,271 @@
+package audit
+
+import (
+ "os"
+ "path/filepath"
+ "sort"
+ "testing"
+
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ xrayutils "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+ "github.com/stretchr/testify/assert"
+)
+
+func TestGetDirectDependenciesList(t *testing.T) {
+ tests := []struct {
+ dependenciesTrees []*xrayUtils.GraphNode
+ expectedResult []string
+ }{
+ {
+ dependenciesTrees: nil,
+ expectedResult: []string{},
+ },
+ {
+ dependenciesTrees: []*xrayUtils.GraphNode{
+ {Id: "parent_node_id", Nodes: []*xrayUtils.GraphNode{
+ {Id: "issueId_1_direct_dependency", Nodes: []*xrayUtils.GraphNode{{Id: "issueId_1_non_direct_dependency"}}},
+ {Id: "issueId_2_direct_dependency", Nodes: nil},
+ },
+ },
+ },
+ expectedResult: []string{"issueId_1_direct_dependency", "issueId_2_direct_dependency"},
+ },
+ {
+ dependenciesTrees: []*xrayUtils.GraphNode{
+ {Id: "parent_node_id", Nodes: []*xrayUtils.GraphNode{
+ {Id: "issueId_1_direct_dependency", Nodes: nil},
+ {Id: "issueId_2_direct_dependency", Nodes: nil},
+ },
+ },
+ },
+ expectedResult: []string{"issueId_1_direct_dependency", "issueId_2_direct_dependency"},
+ },
+ }
+
+ for _, test := range tests {
+ result := getDirectDependenciesFromTree(test.dependenciesTrees)
+ assert.ElementsMatch(t, test.expectedResult, result)
+ }
+}
+
+func createTestDir(t *testing.T) (directory string, cleanUp func()) {
+ tmpDir, err := fileutils.CreateTempDir()
+ assert.NoError(t, err)
+
+	// Temp dir structure:
+	// tempDir
+	// ├── dir
+	// │   ├── maven
+	// │   │   ├── maven-sub
+	// │   │   └── maven-sub2
+	// │   ├── npm
+	// │   └── go
+	// ├── yarn
+	// │   ├── Pip
+	// │   └── Pipenv
+	// └── Nuget
+	//     └── Nuget-sub
+
+ dir := createEmptyDir(t, filepath.Join(tmpDir, "dir"))
+ // Maven
+ maven := createEmptyDir(t, filepath.Join(dir, "maven"))
+ createEmptyFile(t, filepath.Join(maven, "pom.xml"))
+ mavenSub := createEmptyDir(t, filepath.Join(maven, "maven-sub"))
+ createEmptyFile(t, filepath.Join(mavenSub, "pom.xml"))
+ mavenSub2 := createEmptyDir(t, filepath.Join(maven, "maven-sub2"))
+ createEmptyFile(t, filepath.Join(mavenSub2, "pom.xml"))
+ // Npm
+ npm := createEmptyDir(t, filepath.Join(dir, "npm"))
+ createEmptyFile(t, filepath.Join(npm, "package.json"))
+ createEmptyFile(t, filepath.Join(npm, "package-lock.json"))
+ // Go
+ goDir := createEmptyDir(t, filepath.Join(dir, "go"))
+ createEmptyFile(t, filepath.Join(goDir, "go.mod"))
+ // Yarn
+ yarn := createEmptyDir(t, filepath.Join(tmpDir, "yarn"))
+ createEmptyFile(t, filepath.Join(yarn, "package.json"))
+ createEmptyFile(t, filepath.Join(yarn, "yarn.lock"))
+ // Pip
+ pip := createEmptyDir(t, filepath.Join(yarn, "Pip"))
+ createEmptyFile(t, filepath.Join(pip, "requirements.txt"))
+ // Pipenv
+ pipenv := createEmptyDir(t, filepath.Join(yarn, "Pipenv"))
+ createEmptyFile(t, filepath.Join(pipenv, "Pipfile"))
+ createEmptyFile(t, filepath.Join(pipenv, "Pipfile.lock"))
+ // Nuget
+ nuget := createEmptyDir(t, filepath.Join(tmpDir, "Nuget"))
+ createEmptyFile(t, filepath.Join(nuget, "project.sln"))
+ nugetSub := createEmptyDir(t, filepath.Join(nuget, "Nuget-sub"))
+ createEmptyFile(t, filepath.Join(nugetSub, "project.csproj"))
+
+ return tmpDir, func() {
+ assert.NoError(t, fileutils.RemoveTempDir(tmpDir), "Couldn't removeAll: "+tmpDir)
+ }
+}
+
+func createEmptyDir(t *testing.T, path string) string {
+ assert.NoError(t, fileutils.CreateDirIfNotExist(path))
+ return path
+}
+
+func createEmptyFile(t *testing.T, path string) {
+ file, err := os.Create(path)
+ assert.NoError(t, err)
+ assert.NoError(t, file.Close())
+}
+
+func TestGetExcludePattern(t *testing.T) {
+ tests := []struct {
+ name string
+ params func() *AuditParams
+ recursive bool
+ expected string
+ }{
+ {
+ name: "Test exclude pattern recursive",
+ params: func() *AuditParams {
+ param := NewAuditParams()
+ param.SetExclusions([]string{"exclude1", "exclude2"})
+ return param
+ },
+ recursive: true,
+ expected: "(^exclude1$)|(^exclude2$)",
+ },
+ {
+ name: "Test no exclude pattern recursive",
+ params: NewAuditParams,
+ recursive: true,
+ expected: "(^.*\\.git.*$)|(^.*node_modules.*$)|(^.*target.*$)|(^.*venv.*$)|(^.*test.*$)",
+ },
+ {
+ name: "Test exclude pattern not recursive",
+ params: func() *AuditParams {
+ param := NewAuditParams()
+ param.SetExclusions([]string{"exclude1", "exclude2"})
+ return param
+ },
+ recursive: false,
+ expected: "(^exclude1$)|(^exclude2$)",
+ },
+ {
+ name: "Test no exclude pattern",
+ params: NewAuditParams,
+ recursive: false,
+ expected: "(^.*\\.git.*$)|(^.*node_modules.*$)|(^.*target.*$)|(^.*venv.*$)|(^.*test.*$)",
+ },
+ }
+
+ for _, test := range tests {
+ t.Run(test.name, func(t *testing.T) {
+ result := getExcludePattern(test.params(), test.recursive)
+ assert.Equal(t, test.expected, result)
+ })
+ }
+}
+
+func TestGetScaScansToPreform(t *testing.T) {
+ dir, cleanUp := createTestDir(t)
+
+ tests := []struct {
+ name string
+ wd string
+ params func() *AuditParams
+ expected []*xrayutils.ScaScanResult
+ }{
+ {
+ name: "Test specific technologies",
+ wd: dir,
+ params: func() *AuditParams {
+ param := NewAuditParams().SetIsRecursiveScan(true).SetWorkingDirs([]string{dir})
+ param.SetTechnologies([]string{"maven", "npm", "go"})
+ return param
+ },
+ expected: []*xrayutils.ScaScanResult{
+ {
+ Technology: coreutils.Maven,
+ WorkingDirectory: filepath.Join(dir, "dir", "maven"),
+ Descriptors: []string{
+ filepath.Join(dir, "dir", "maven", "pom.xml"),
+ filepath.Join(dir, "dir", "maven", "maven-sub", "pom.xml"),
+ filepath.Join(dir, "dir", "maven", "maven-sub2", "pom.xml"),
+ },
+ },
+ {
+ Technology: coreutils.Npm,
+ WorkingDirectory: filepath.Join(dir, "dir", "npm"),
+ Descriptors: []string{filepath.Join(dir, "dir", "npm", "package.json")},
+ },
+ {
+ Technology: coreutils.Go,
+ WorkingDirectory: filepath.Join(dir, "dir", "go"),
+ Descriptors: []string{filepath.Join(dir, "dir", "go", "go.mod")},
+ },
+ },
+ },
+ {
+ name: "Test all",
+ wd: dir,
+ params: func() *AuditParams {
+ return NewAuditParams().SetIsRecursiveScan(true).SetWorkingDirs([]string{dir})
+ },
+ expected: []*xrayutils.ScaScanResult{
+ {
+ Technology: coreutils.Maven,
+ WorkingDirectory: filepath.Join(dir, "dir", "maven"),
+ Descriptors: []string{
+ filepath.Join(dir, "dir", "maven", "pom.xml"),
+ filepath.Join(dir, "dir", "maven", "maven-sub", "pom.xml"),
+ filepath.Join(dir, "dir", "maven", "maven-sub2", "pom.xml"),
+ },
+ },
+ {
+ Technology: coreutils.Npm,
+ WorkingDirectory: filepath.Join(dir, "dir", "npm"),
+ Descriptors: []string{filepath.Join(dir, "dir", "npm", "package.json")},
+ },
+ {
+ Technology: coreutils.Go,
+ WorkingDirectory: filepath.Join(dir, "dir", "go"),
+ Descriptors: []string{filepath.Join(dir, "dir", "go", "go.mod")},
+ },
+ {
+ Technology: coreutils.Yarn,
+ WorkingDirectory: filepath.Join(dir, "yarn"),
+ Descriptors: []string{filepath.Join(dir, "yarn", "package.json")},
+ },
+ {
+ Technology: coreutils.Pip,
+ WorkingDirectory: filepath.Join(dir, "yarn", "Pip"),
+ Descriptors: []string{filepath.Join(dir, "yarn", "Pip", "requirements.txt")},
+ },
+ {
+ Technology: coreutils.Pipenv,
+ WorkingDirectory: filepath.Join(dir, "yarn", "Pipenv"),
+ Descriptors: []string{filepath.Join(dir, "yarn", "Pipenv", "Pipfile")},
+ },
+ {
+ Technology: coreutils.Nuget,
+ WorkingDirectory: filepath.Join(dir, "Nuget"),
+ Descriptors: []string{filepath.Join(dir, "Nuget", "project.sln"), filepath.Join(dir, "Nuget", "Nuget-sub", "project.csproj")},
+ },
+ },
+ },
+ }
+
+ for _, test := range tests {
+ t.Run(test.name, func(t *testing.T) {
+ result := getScaScansToPreform(test.params())
+ for i := range result {
+ sort.Strings(result[i].Descriptors)
+ sort.Strings(test.expected[i].Descriptors)
+ }
+ assert.ElementsMatch(t, test.expected, result)
+ })
+ }
+
+ cleanUp()
+}
diff --git a/commands/curation/curationaudit.go b/commands/curation/curationaudit.go
new file mode 100644
index 00000000..91ba0cf5
--- /dev/null
+++ b/commands/curation/curationaudit.go
@@ -0,0 +1,573 @@
+package curation
+
+import (
+ "encoding/json"
+ "errors"
+ "fmt"
+ "net/http"
+ "os"
+ "path/filepath"
+ "regexp"
+ "sort"
+ "strings"
+ "sync"
+
+ "github.com/jfrog/gofrog/datastructures"
+ "github.com/jfrog/gofrog/parallel"
+ rtUtils "github.com/jfrog/jfrog-cli-core/v2/artifactory/utils"
+ outFormat "github.com/jfrog/jfrog-cli-core/v2/common/format"
+ "github.com/jfrog/jfrog-cli-core/v2/common/project"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-cli-security/commands/audit"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/artifactory"
+ "github.com/jfrog/jfrog-client-go/auth"
+ clientutils "github.com/jfrog/jfrog-client-go/utils"
+ "github.com/jfrog/jfrog-client-go/utils/errorutils"
+ "github.com/jfrog/jfrog-client-go/utils/io/httputils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+)
+
+const (
+	// "blocked" is the unapproved status that the Curation service may return for a dependency.
+ blocked = "blocked"
+ BlockingReasonPolicy = "Policy violations"
+ BlockingReasonNotFound = "Package pending update"
+
+ directRelation = "direct"
+ indirectRelation = "indirect"
+
+ BlockMessageKey = "jfrog packages curation"
+ NotBeingFoundKey = "not being found"
+
+ extractPoliciesRegexTemplate = "({.*?})"
+
+ errorTemplateHeadRequest = "failed sending HEAD request to %s for package '%s:%s'. Status-code: %v. Cause: %v"
+
+	errorTemplateUnsupportedTech = "It looks like this project uses '%s' to download its dependencies. " +
+		"However, this package manager isn't supported by this command."
+
+ TotalConcurrentRequests = 10
+)
+
+var CurationOutputFormats = []string{string(outFormat.Table), string(outFormat.Json)}
+
+var supportedTech = map[coreutils.Technology]struct{}{
+ coreutils.Npm: {},
+}
+
+type ErrorsResp struct {
+ Errors []ErrorResp `json:"errors"`
+}
+
+type ErrorResp struct {
+ Status int `json:"status"`
+ Message string `json:"message"`
+}
+
+type PackageStatus struct {
+ Action string `json:"action"`
+ ParentName string `json:"direct_dependency_package_name"`
+ ParentVersion string `json:"direct_dependency_package_version"`
+ BlockedPackageUrl string `json:"blocked_package_url,omitempty"`
+ PackageName string `json:"blocked_package_name"`
+ PackageVersion string `json:"blocked_package_version"`
+ BlockingReason string `json:"blocking_reason"`
+ DepRelation string `json:"dependency_relation"`
+ PkgType string `json:"type"`
+ Policy []Policy `json:"policies,omitempty"`
+}
+
+type Policy struct {
+ Policy string `json:"policy"`
+ Condition string `json:"condition"`
+ Explanation string `json:"explanation"`
+ Recommendation string `json:"recommendation"`
+}
+
+type PackageStatusTable struct {
+ ParentName string `col-name:"Direct\nDependency\nPackage\nName" auto-merge:"true"`
+ ParentVersion string `col-name:"Direct\nDependency\nPackage\nVersion" auto-merge:"true"`
+ PackageName string `col-name:"Blocked\nPackage\nName" auto-merge:"true"`
+ PackageVersion string `col-name:"Blocked\nPackage\nVersion" auto-merge:"true"`
+ BlockingReason string `col-name:"Blocking Reason" auto-merge:"true"`
+ PkgType string `col-name:"Package\nType" auto-merge:"true"`
+ Policy string `col-name:"Violated\nPolicy\nName"`
+ Condition string `col-name:"Violated Condition\nName"`
+ Explanation string `col-name:"Explanation"`
+ Recommendation string `col-name:"Recommendation"`
+}
+
+type treeAnalyzer struct {
+ rtManager artifactory.ArtifactoryServicesManager
+ extractPoliciesRegex *regexp.Regexp
+ rtAuth auth.ServiceDetails
+ httpClientDetails httputils.HttpClientDetails
+ url string
+ repo string
+ tech coreutils.Technology
+ parallelRequests int
+}
+
+type CurationAuditCommand struct {
+ PackageManagerConfig *project.RepositoryConfig
+ extractPoliciesRegex *regexp.Regexp
+ workingDirs []string
+ OriginPath string
+ parallelRequests int
+ utils.AuditParams
+}
+
+func NewCurationAuditCommand() *CurationAuditCommand {
+ return &CurationAuditCommand{
+ extractPoliciesRegex: regexp.MustCompile(extractPoliciesRegexTemplate),
+ AuditParams: &utils.AuditBasicParams{},
+ }
+}
+
+func (ca *CurationAuditCommand) setPackageManagerConfig(pkgMangerConfig *project.RepositoryConfig) *CurationAuditCommand {
+ ca.PackageManagerConfig = pkgMangerConfig
+ return ca
+}
+
+func (ca *CurationAuditCommand) SetWorkingDirs(dirs []string) *CurationAuditCommand {
+ ca.workingDirs = dirs
+ return ca
+}
+
+func (ca *CurationAuditCommand) SetParallelRequests(threads int) *CurationAuditCommand {
+ ca.parallelRequests = threads
+ return ca
+}
+
+func (ca *CurationAuditCommand) Run() (err error) {
+ rootDir, err := os.Getwd()
+ if err != nil {
+ return errorutils.CheckError(err)
+ }
+ if len(ca.workingDirs) > 0 {
+ defer func() {
+ if e := errorutils.CheckError(os.Chdir(rootDir)); err == nil {
+ err = e
+ }
+ }()
+ } else {
+ ca.workingDirs = append(ca.workingDirs, rootDir)
+ }
+ results := map[string][]*PackageStatus{}
+ for _, workDir := range ca.workingDirs {
+ var absWd string
+ absWd, err = filepath.Abs(workDir)
+ if err != nil {
+ return errorutils.CheckError(err)
+ }
+ log.Info("Running curation audit on project:", absWd)
+ if absWd != rootDir {
+ if err = os.Chdir(absWd); err != nil {
+ return errorutils.CheckError(err)
+ }
+ }
+		// If an error is returned, continue to print the results (if any), and return the error at the end.
+ if e := ca.doCurateAudit(results); e != nil {
+ err = errors.Join(err, e)
+ }
+ }
+ if ca.Progress() != nil {
+ err = errors.Join(err, ca.Progress().Quit())
+ }
+
+ for projectPath, packagesStatus := range results {
+ err = errors.Join(err, printResult(ca.OutputFormat(), projectPath, packagesStatus))
+ }
+ return
+}
+
+func (ca *CurationAuditCommand) doCurateAudit(results map[string][]*PackageStatus) error {
+ techs := coreutils.DetectedTechnologiesList()
+ for _, tech := range techs {
+ if _, ok := supportedTech[coreutils.Technology(tech)]; !ok {
+ log.Info(fmt.Sprintf(errorTemplateUnsupportedTech, tech))
+ continue
+ }
+ if err := ca.auditTree(coreutils.Technology(tech), results); err != nil {
+ return err
+ }
+ }
+ return nil
+}
+
+func (ca *CurationAuditCommand) getAuditParamsByTech(tech coreutils.Technology) utils.AuditParams {
+ if tech == coreutils.Npm {
+ return utils.AuditNpmParams{AuditParams: ca.AuditParams}.
+ SetNpmIgnoreNodeModules(true).
+ SetNpmOverwritePackageLock(true)
+ }
+ return ca.AuditParams
+}
+
+func (ca *CurationAuditCommand) auditTree(tech coreutils.Technology, results map[string][]*PackageStatus) error {
+ flattenGraph, fullDependenciesTree, err := audit.GetTechDependencyTree(ca.getAuditParamsByTech(tech), tech)
+ if err != nil {
+ return err
+ }
+ // Validate the graph isn't empty.
+ if len(fullDependenciesTree) == 0 {
+ return errorutils.CheckErrorf("found no dependencies for the audited project using '%v' as the package manager", tech.String())
+ }
+ if err = ca.SetRepo(tech); err != nil {
+ return err
+ }
+ // Resolve the dependencies of the project.
+ serverDetails, err := ca.PackageManagerConfig.ServerDetails()
+ if err != nil {
+ return err
+ }
+ rtManager, err := rtUtils.CreateServiceManager(serverDetails, 2, 0, false)
+ if err != nil {
+ return err
+ }
+ rtAuth, err := serverDetails.CreateArtAuthConfig()
+ if err != nil {
+ return err
+ }
+ rootNode := fullDependenciesTree[0]
+ _, projectName, projectScope, projectVersion := getUrlNameAndVersionByTech(tech, rootNode.Id, "", "")
+ if ca.Progress() != nil {
+ ca.Progress().SetHeadlineMsg(fmt.Sprintf("Fetch curation status for %s graph with %v nodes project name: %s:%s", tech.ToFormal(), len(flattenGraph.Nodes)-1, projectName, projectVersion))
+ }
+ if projectScope != "" {
+ projectName = projectScope + "/" + projectName
+ }
+ if ca.parallelRequests == 0 {
+ ca.parallelRequests = TotalConcurrentRequests
+ }
+ var packagesStatus []*PackageStatus
+ analyzer := treeAnalyzer{
+ rtManager: rtManager,
+ extractPoliciesRegex: ca.extractPoliciesRegex,
+ rtAuth: rtAuth,
+ httpClientDetails: rtAuth.CreateHttpClientDetails(),
+ url: rtAuth.GetUrl(),
+ repo: ca.PackageManagerConfig.TargetRepo(),
+ tech: tech,
+ parallelRequests: ca.parallelRequests,
+ }
+ packagesStatusMap := sync.Map{}
+	// Fetch the status of each node from the flattened graph, which contains no duplicate nodes.
+ err = analyzer.fetchNodesStatus(flattenGraph, &packagesStatusMap, rootNode.Id)
+ analyzer.fillGraphRelations(rootNode, &packagesStatusMap,
+ &packagesStatus, "", "", datastructures.MakeSet[string](), true)
+ sort.Slice(packagesStatus, func(i, j int) bool {
+ return packagesStatus[i].ParentName < packagesStatus[j].ParentName
+ })
+ results[fmt.Sprintf("%s:%s", projectName, projectVersion)] = packagesStatus
+ return err
+}
+
+func printResult(format outFormat.OutputFormat, projectPath string, packagesStatus []*PackageStatus) error {
+ if format == "" {
+ format = outFormat.Table
+ }
+ log.Output(fmt.Sprintf("Found %v blocked packages for project %s", len(packagesStatus), projectPath))
+ switch format {
+ case outFormat.Json:
+ if len(packagesStatus) > 0 {
+ err := utils.PrintJson(packagesStatus)
+ if err != nil {
+ return err
+ }
+ }
+ case outFormat.Table:
+ pkgStatusTable := convertToPackageStatusTable(packagesStatus)
+ err := coreutils.PrintTable(pkgStatusTable, "Curation", "Found 0 blocked packages", true)
+ if err != nil {
+ return err
+ }
+ }
+ log.Output("\n")
+ return nil
+}
+
+func convertToPackageStatusTable(packagesStatus []*PackageStatus) []PackageStatusTable {
+ var pkgStatusTable []PackageStatusTable
+ for index, pkgStatus := range packagesStatus {
+		// We rely on the auto-merge feature of the 'go-pretty' library, which can't merge rows by a group of unique fields.
+		// To work around this, we alternate appending a space to each group's fields, so a group merges only with itself and not with the next group.
+ uniqLineSep := ""
+ if index%2 == 0 {
+ uniqLineSep = " "
+ }
+ pkgTable := PackageStatusTable{
+ ParentName: pkgStatus.ParentName + uniqLineSep,
+ ParentVersion: pkgStatus.ParentVersion + uniqLineSep,
+ PackageName: pkgStatus.PackageName + uniqLineSep,
+ PackageVersion: pkgStatus.PackageVersion + uniqLineSep,
+ BlockingReason: pkgStatus.BlockingReason + uniqLineSep,
+ PkgType: pkgStatus.PkgType + uniqLineSep,
+ }
+ if len(pkgStatus.Policy) == 0 {
+ pkgStatusTable = append(pkgStatusTable, pkgTable)
+ continue
+ }
+ for _, policyCond := range pkgStatus.Policy {
+ pkgTable.Policy = policyCond.Policy
+ pkgTable.Explanation = policyCond.Explanation
+ pkgTable.Recommendation = policyCond.Recommendation
+ pkgTable.Condition = policyCond.Condition
+ pkgStatusTable = append(pkgStatusTable, pkgTable)
+ }
+ }
+
+ return pkgStatusTable
+}
+
+func (ca *CurationAuditCommand) CommandName() string {
+ return "curation_audit"
+}
+
+func (ca *CurationAuditCommand) SetRepo(tech coreutils.Technology) error {
+ switch tech {
+ case coreutils.Npm:
+ configFilePath, exists, err := project.GetProjectConfFilePath(project.Npm)
+ if err != nil {
+ return err
+ }
+ if !exists {
+ return errorutils.CheckErrorf("no config file was found! Before running the npm command on a " +
+ "project for the first time, the project should be configured using the 'jf npmc' command")
+ }
+ vConfig, err := project.ReadConfigFile(configFilePath, project.YAML)
+ if err != nil {
+ return err
+ }
+ resolverParams, err := project.GetRepoConfigByPrefix(configFilePath, project.ProjectConfigResolverPrefix, vConfig)
+ if err != nil {
+ return err
+ }
+ ca.setPackageManagerConfig(resolverParams)
+ default:
+ return errorutils.CheckErrorf(errorTemplateUnsupportedTech, tech.String())
+ }
+ return nil
+}
+
+func (nc *treeAnalyzer) fillGraphRelations(node *xrayUtils.GraphNode, preProcessMap *sync.Map,
+ packagesStatus *[]*PackageStatus, parent, parentVersion string, visited *datastructures.Set[string], isRoot bool) {
+ for _, child := range node.Nodes {
+ packageUrl, name, scope, version := getUrlNameAndVersionByTech(nc.tech, child.Id, nc.url, nc.repo)
+ if isRoot {
+ parent = name
+ parentVersion = version
+ if scope != "" {
+ parent = scope + "/" + parent
+ }
+ }
+ if visited.Exists(scope + name + version + "-" + parent + parentVersion) {
+ continue
+ }
+
+ visited.Add(scope + name + version + "-" + parent + parentVersion)
+ if pkgStatus, exist := preProcessMap.Load(packageUrl); exist {
+ relation := indirectRelation
+ if isRoot {
+ relation = directRelation
+ }
+ pkgStatusCast, isPkgStatus := pkgStatus.(*PackageStatus)
+ if isPkgStatus {
+ pkgStatusClone := *pkgStatusCast
+ pkgStatusClone.DepRelation = relation
+ pkgStatusClone.ParentName = parent
+ pkgStatusClone.ParentVersion = parentVersion
+ *packagesStatus = append(*packagesStatus, &pkgStatusClone)
+ }
+ }
+ nc.fillGraphRelations(child, preProcessMap, packagesStatus, parent, parentVersion, visited, false)
+ }
+}
+
+func (nc *treeAnalyzer) fetchNodesStatus(graph *xrayUtils.GraphNode, p *sync.Map, rootNodeId string) error {
+ var multiErrors error
+ consumerProducer := parallel.NewBounedRunner(nc.parallelRequests, false)
+ errorsQueue := clientutils.NewErrorsQueue(1)
+ go func() {
+ defer consumerProducer.Done()
+ for _, node := range graph.Nodes {
+ if node.Id == rootNodeId {
+ continue
+ }
+ getTask := func(node xrayUtils.GraphNode) func(threadId int) error {
+ return func(threadId int) (err error) {
+ return nc.fetchNodeStatus(node, p)
+ }
+ }
+ if _, err := consumerProducer.AddTaskWithError(getTask(*node), errorsQueue.AddError); err != nil {
+ multiErrors = errors.Join(err, multiErrors)
+ }
+ }
+ }()
+ consumerProducer.Run()
+ if err := errorsQueue.GetError(); err != nil {
+ multiErrors = errors.Join(err, multiErrors)
+ }
+ return multiErrors
+}
+
+func (nc *treeAnalyzer) fetchNodeStatus(node xrayUtils.GraphNode, p *sync.Map) error {
+ packageUrl, name, scope, version := getUrlNameAndVersionByTech(nc.tech, node.Id, nc.url, nc.repo)
+ if scope != "" {
+ name = scope + "/" + name
+ }
+ resp, _, err := nc.rtManager.Client().SendHead(packageUrl, &nc.httpClientDetails)
+ if err != nil {
+ if resp != nil && resp.StatusCode >= 400 {
+ return errorutils.CheckErrorf(errorTemplateHeadRequest, packageUrl, name, version, resp.StatusCode, err)
+ }
+ if resp == nil || resp.StatusCode != http.StatusForbidden {
+ return err
+ }
+ }
+ if resp != nil && resp.StatusCode >= 400 && resp.StatusCode != http.StatusForbidden {
+ return errorutils.CheckErrorf(errorTemplateHeadRequest, packageUrl, name, version, resp.StatusCode, err)
+ }
+	if resp != nil && resp.StatusCode == http.StatusForbidden {
+ pkStatus, err := nc.getBlockedPackageDetails(packageUrl, name, version)
+ if err != nil {
+ return err
+ }
+ if pkStatus != nil {
+ p.Store(pkStatus.BlockedPackageUrl, pkStatus)
+ }
+ }
+ return nil
+}
+
+// After a HEAD request returns a Forbidden status code, we try to collect the curation details from the GET response.
+func (nc *treeAnalyzer) getBlockedPackageDetails(packageUrl string, name string, version string) (*PackageStatus, error) {
+ getResp, respBody, _, err := nc.rtManager.Client().SendGet(packageUrl, true, &nc.httpClientDetails)
+ if err != nil {
+ if getResp == nil {
+ return nil, err
+ }
+ if getResp.StatusCode != http.StatusForbidden {
+ return nil, errorutils.CheckErrorf(errorTemplateHeadRequest, packageUrl, name, version, getResp.StatusCode, err)
+ }
+ }
+ if getResp.StatusCode == http.StatusForbidden {
+ respError := &ErrorsResp{}
+ if err := json.Unmarshal(respBody, respError); err != nil {
+ return nil, errorutils.CheckError(err)
+ }
+ if len(respError.Errors) == 0 {
+ return nil, errorutils.CheckErrorf("received 403 for unknown reason, no curation status will be presented for this package. "+
+ "package name: %s, version: %s, download url: %s ", name, version, packageUrl)
+ }
+		// If the error message contains the curation key string, we can be sure the package was blocked by the Curation service.
+ if strings.Contains(strings.ToLower(respError.Errors[0].Message), BlockMessageKey) {
+ blockingReason := BlockingReasonPolicy
+ if strings.Contains(strings.ToLower(respError.Errors[0].Message), NotBeingFoundKey) {
+ blockingReason = BlockingReasonNotFound
+ }
+ policies := nc.extractPoliciesFromMsg(respError)
+ return &PackageStatus{
+ PackageName: name,
+ PackageVersion: version,
+ BlockedPackageUrl: packageUrl,
+ Action: blocked,
+ Policy: policies,
+ BlockingReason: blockingReason,
+ PkgType: string(nc.tech),
+ }, nil
+ }
+ }
+ return nil, nil
+}
+
+// Return policies and conditions names from the FORBIDDEN HTTP error message.
+// Message structure: Package %s:%s download was blocked by JFrog Packages Curation service due to the following policies violated {%s, %s, %s, %s},{%s, %s, %s, %s}.
+func (nc *treeAnalyzer) extractPoliciesFromMsg(respError *ErrorsResp) []Policy {
+ var policies []Policy
+ msg := respError.Errors[0].Message
+ allMatches := nc.extractPoliciesRegex.FindAllString(msg, -1)
+ for _, match := range allMatches {
+ match = strings.TrimSuffix(strings.TrimPrefix(match, "{"), "}")
+ polCond := strings.Split(match, ",")
+ if len(polCond) >= 2 {
+ pol := polCond[0]
+ cond := polCond[1]
+
+ if len(polCond) == 4 {
+ exp, rec := makeLegiblePolicyDetails(polCond[2], polCond[3])
+ policies = append(policies, Policy{Policy: strings.TrimSpace(pol),
+ Condition: strings.TrimSpace(cond), Explanation: strings.TrimSpace(exp), Recommendation: strings.TrimSpace(rec)})
+ continue
+ }
+ policies = append(policies, Policy{Policy: strings.TrimSpace(pol), Condition: strings.TrimSpace(cond)})
+ }
+ }
+ return policies
+}
+
+// Adds a new line after the headline and replaces every "|" with a new line.
+func makeLegiblePolicyDetails(explanation, recommendation string) (string, string) {
+ explanation = strings.ReplaceAll(strings.Replace(explanation, ": ", ":\n", 1), " | ", "\n")
+ recommendation = strings.ReplaceAll(strings.Replace(recommendation, ": ", ":\n", 1), " | ", "\n")
+ return explanation, recommendation
+}
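The policy-extraction step above can be sketched standalone. This is a simplified, hypothetical version of `extractPoliciesFromMsg`: it uses the same `({.*?})` template to pull `{policy, condition}` pairs out of a Curation block message, but returns plain string pairs instead of the `Policy` struct (the sample message is illustrative):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// Same template as extractPoliciesRegexTemplate above: each violated
// policy is reported inside curly braces in the error message.
var policyRe = regexp.MustCompile(`({.*?})`)

// extractPolicies pulls "policy, condition" pairs out of a Curation
// block message (simplified sketch of extractPoliciesFromMsg).
func extractPolicies(msg string) [][2]string {
	var out [][2]string
	for _, m := range policyRe.FindAllString(msg, -1) {
		m = strings.TrimSuffix(strings.TrimPrefix(m, "{"), "}")
		parts := strings.Split(m, ",")
		if len(parts) >= 2 {
			out = append(out, [2]string{strings.TrimSpace(parts[0]), strings.TrimSpace(parts[1])})
		}
	}
	return out
}

func main() {
	msg := "Package test:1.0.0 download was blocked by JFrog Packages Curation service due to the following policies violated {policy1, condition1}, {policy2, condition2}."
	fmt.Println(extractPolicies(msg))
}
```

The non-greedy `.*?` is what keeps each `{...}` group separate; a greedy `.*` would swallow everything from the first `{` to the last `}` as a single match.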
+
+func getUrlNameAndVersionByTech(tech coreutils.Technology, nodeId, artiUrl, repo string) (downloadUrl string, name string, scope string, version string) {
+ if tech == coreutils.Npm {
+ return getNpmNameScopeAndVersion(nodeId, artiUrl, repo, coreutils.Npm.String())
+ }
+ return
+}
+
+// The graph holds, for each node, the component ID (xray representation)
+// from which we extract the package name, version, and construct the Artifactory download URL.
+func getNpmNameScopeAndVersion(id, artiUrl, repo, tech string) (downloadUrl, name, scope, version string) {
+ id = strings.TrimPrefix(id, tech+"://")
+
+ nameVersion := strings.Split(id, ":")
+ name = nameVersion[0]
+ if len(nameVersion) > 1 {
+ version = nameVersion[1]
+ }
+ scopeSplit := strings.Split(name, "/")
+ if len(scopeSplit) > 1 {
+ scope = scopeSplit[0]
+ name = scopeSplit[1]
+ }
+ return buildNpmDownloadUrl(artiUrl, repo, name, scope, version), name, scope, version
+}
+
+func buildNpmDownloadUrl(url, repo, name, scope, version string) string {
+ var packageUrl string
+ if scope != "" {
+ packageUrl = fmt.Sprintf("%s/api/npm/%s/%s/%s/-/%s-%s.tgz", strings.TrimSuffix(url, "/"), repo, scope, name, name, version)
+ } else {
+ packageUrl = fmt.Sprintf("%s/api/npm/%s/%s/-/%s-%s.tgz", strings.TrimSuffix(url, "/"), repo, name, name, version)
+ }
+ return packageUrl
+}
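The component-ID parsing and tarball URL layout above can be sketched together. These are local helpers mirroring `getNpmNameScopeAndVersion` and `buildNpmDownloadUrl` (the host, repo name, and component ID are illustrative values matching the tests in this patch):

```go
package main

import (
	"fmt"
	"strings"
)

// splitComponentId breaks an Xray npm component ID ("npm://[scope/]name:version")
// into its name, scope, and version parts, mirroring getNpmNameScopeAndVersion.
func splitComponentId(id string) (name, scope, version string) {
	id = strings.TrimPrefix(id, "npm://")
	nameVersion := strings.Split(id, ":")
	name = nameVersion[0]
	if len(nameVersion) > 1 {
		version = nameVersion[1]
	}
	if scopeSplit := strings.Split(name, "/"); len(scopeSplit) > 1 {
		scope, name = scopeSplit[0], scopeSplit[1]
	}
	return
}

// buildNpmDownloadUrl reproduces the Artifactory npm tarball URL layout used above:
// <base>/api/npm/<repo>[/<scope>]/<name>/-/<name>-<version>.tgz
func buildNpmDownloadUrl(url, repo, name, scope, version string) string {
	base := strings.TrimSuffix(url, "/")
	if scope != "" {
		return fmt.Sprintf("%s/api/npm/%s/%s/%s/-/%s-%s.tgz", base, repo, scope, name, name, version)
	}
	return fmt.Sprintf("%s/api/npm/%s/%s/-/%s-%s.tgz", base, repo, name, name, version)
}

func main() {
	name, scope, version := splitComponentId("npm://dev/test:1.0.0")
	fmt.Println(buildNpmDownloadUrl("http://localhost:8000/artifactory", "npm", name, scope, version))
	// http://localhost:8000/artifactory/api/npm/npm/dev/test/-/test-1.0.0.tgz
}
```

This is the URL the analyzer probes with HEAD/GET requests; a 403 on it is what triggers the blocked-package detail lookup.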
+
+func DetectNumOfThreads(threadsCount int) (int, error) {
+ if threadsCount > TotalConcurrentRequests {
+		return 0, errorutils.CheckErrorf("number of threads exceeded the maximum; up to %v threads are allowed", TotalConcurrentRequests)
+ }
+ return threadsCount, nil
+}
+
+func GetCurationOutputFormat(formatFlagVal string) (format outFormat.OutputFormat, err error) {
+ // Default print format is table.
+ format = outFormat.Table
+ if formatFlagVal != "" {
+ switch strings.ToLower(formatFlagVal) {
+ case string(outFormat.Table):
+ format = outFormat.Table
+ case string(outFormat.Json):
+ format = outFormat.Json
+ default:
+ err = errorutils.CheckErrorf("only the following output formats are supported: " + coreutils.ListToText(CurationOutputFormats))
+ }
+ }
+ return
+}
diff --git a/commands/curation/curationaudit_test.go b/commands/curation/curationaudit_test.go
new file mode 100644
index 00000000..759b1bf6
--- /dev/null
+++ b/commands/curation/curationaudit_test.go
@@ -0,0 +1,582 @@
+package curation
+
+import (
+ "encoding/json"
+ "fmt"
+ "github.com/jfrog/gofrog/datastructures"
+ coretests "github.com/jfrog/jfrog-cli-core/v2/common/tests"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+ clienttestutils "github.com/jfrog/jfrog-client-go/utils/tests"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+ "github.com/stretchr/testify/assert"
+ "net/http"
+ "net/http/httptest"
+ "os"
+ "path/filepath"
+ "regexp"
+ "sort"
+ "strconv"
+ "strings"
+ "sync"
+ "testing"
+)
+
+var TestDataDir = filepath.Join("..", "..", "tests", "testdata")
+
+func TestExtractPoliciesFromMsg(t *testing.T) {
+	extractPoliciesRegex := regexp.MustCompile(extractPoliciesRegexTemplate)
+ tests := getTestCasesForExtractPoliciesFromMsg()
+ for _, tt := range tests {
+ t.Run(tt.name, func(t *testing.T) {
+ ta := treeAnalyzer{extractPoliciesRegex: extractPoliciesRegex}
+ got := ta.extractPoliciesFromMsg(tt.errResp)
+ assert.Equal(t, tt.expect, got)
+ })
+ }
+}
+
+func getTestCasesForExtractPoliciesFromMsg() []struct {
+ name string
+ errResp *ErrorsResp
+ expect []Policy
+} {
+ tests := []struct {
+ name string
+ errResp *ErrorsResp
+ expect []Policy
+ }{
+ {
+ name: "one policy",
+ errResp: &ErrorsResp{
+ Errors: []ErrorResp{
+ {
+ Status: 403,
+ Message: "Package test:1.0.0 download was blocked by JFrog Packages Curation service due to the following policies violated {policy1, condition1}.",
+ },
+ },
+ },
+ expect: []Policy{
+ {
+ Policy: "policy1",
+ Condition: "condition1",
+ },
+ },
+ },
+ {
+			name: "one policy with explanation and recommendation",
+ errResp: &ErrorsResp{
+ Errors: []ErrorResp{
+ {
+ Status: 403,
+ Message: "Package test:1.0.0 download was blocked by JFrog Packages Curation service due to the following policies violated {policy1, condition1, Package is 3339 days old, Upgrade to version 0.2.4 (latest)}.",
+ },
+ },
+ },
+ expect: []Policy{
+ {
+ Policy: "policy1",
+ Condition: "condition1",
+ Explanation: "Package is 3339 days old",
+ Recommendation: "Upgrade to version 0.2.4 (latest)",
+ },
+ },
+ },
+ {
+ name: "two policies",
+ errResp: &ErrorsResp{
+ Errors: []ErrorResp{
+ {
+ Status: 403,
+ Message: "Package test:1.0.0 download was blocked by JFrog Packages Curation service due to" +
+ " the following policies violated {policy1, condition1}, {policy2, condition2}.",
+ },
+ },
+ },
+ expect: []Policy{
+ {
+ Policy: "policy1",
+ Condition: "condition1",
+ },
+ {
+ Policy: "policy2",
+ Condition: "condition2",
+ },
+ },
+ },
+ {
+ name: "no policies",
+ errResp: &ErrorsResp{
+ Errors: []ErrorResp{
+ {
+ Status: 403,
+ Message: "not the expected message format.",
+ },
+ },
+ },
+ expect: nil,
+ },
+ }
+ return tests
+}
+
+func TestGetNameScopeAndVersion(t *testing.T) {
+ tests := []struct {
+ name string
+ componentId string
+ artiUrl string
+ repo string
+ tech string
+ wantDownloadUrl string
+ wantName string
+ wantVersion string
+ wantScope string
+ }{
+ {
+ name: "npm component",
+ componentId: "npm://test:1.0.0",
+ artiUrl: "http://localhost:8000/artifactory",
+ repo: "npm",
+ tech: coreutils.Npm.String(),
+ wantDownloadUrl: "http://localhost:8000/artifactory/api/npm/npm/test/-/test-1.0.0.tgz",
+ wantName: "test",
+ wantVersion: "1.0.0",
+ },
+ {
+ name: "npm component with scope",
+ componentId: "npm://dev/test:1.0.0",
+ artiUrl: "http://localhost:8000/artifactory",
+ repo: "npm",
+ tech: coreutils.Npm.String(),
+ wantDownloadUrl: "http://localhost:8000/artifactory/api/npm/npm/dev/test/-/test-1.0.0.tgz",
+ wantName: "test",
+ wantVersion: "1.0.0",
+ wantScope: "dev",
+ },
+ }
+ for _, tt := range tests {
+ t.Run(tt.name, func(t *testing.T) {
+			gotDownloadUrl, gotName, gotScope, gotVersion := getNpmNameScopeAndVersion(tt.componentId, tt.artiUrl, tt.repo, tt.tech)
+			assert.Equal(t, tt.wantDownloadUrl, gotDownloadUrl, "getNpmNameScopeAndVersion() gotDownloadUrl = %v, want %v", gotDownloadUrl, tt.wantDownloadUrl)
+ assert.Equal(t, tt.wantName, gotName, "getNpmNameScopeAndVersion() gotName = %v, want %v", gotName, tt.wantName)
+ assert.Equal(t, tt.wantScope, gotScope, "getNpmNameScopeAndVersion() gotScope = %v, want %v", gotScope, tt.wantScope)
+ assert.Equal(t, tt.wantVersion, gotVersion, "getNpmNameScopeAndVersion() gotVersion = %v, want %v", gotVersion, tt.wantVersion)
+ })
+ }
+}
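
The expected download URLs in the table above follow the npm registry tarball layout: `{artifactoryUrl}/api/npm/{repo}/[{scope}/]{name}/-/{name}-{version}.tgz`. A minimal sketch of that parsing and URL construction (the helper name and logic here are illustrative, not the CLI's actual `getNpmNameScopeAndVersion` implementation):

```go
package main

import (
	"fmt"
	"strings"
)

// npmDownloadURL splits an "npm://[scope/]name:version" component ID and
// rebuilds the registry tarball URL the tests above expect.
func npmDownloadURL(artiURL, repo, componentID string) (url, name, scope, version string) {
	id := strings.TrimPrefix(componentID, "npm://")
	nameAndVersion := strings.SplitN(id, ":", 2)
	name, version = nameAndVersion[0], nameAndVersion[1]
	if idx := strings.Index(name, "/"); idx != -1 {
		scope, name = name[:idx], name[idx+1:]
	}
	base := strings.TrimSuffix(artiURL, "/") + "/api/npm/" + repo
	if scope != "" {
		url = fmt.Sprintf("%s/%s/%s/-/%s-%s.tgz", base, scope, name, name, version)
	} else {
		url = fmt.Sprintf("%s/%s/-/%s-%s.tgz", base, name, name, version)
	}
	return
}

func main() {
	u, n, s, v := npmDownloadURL("http://localhost:8000/artifactory", "npm", "npm://dev/test:1.0.0")
	fmt.Println(u, n, s, v)
}
```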
+
+func TestTreeAnalyzerFillGraphRelations(t *testing.T) {
+ tests := getTestCasesForFillGraphRelations()
+ for _, tt := range tests {
+ t.Run(tt.name, func(t *testing.T) {
+ nc := &treeAnalyzer{
+ url: "http://localhost:8046/artifactory",
+ repo: "npm-repo",
+ tech: "npm",
+ }
+ var packageStatus []*PackageStatus
+ preProcessedMap := fillSyncedMap(tt.givenMap)
+ nc.fillGraphRelations(tt.givenGraph, preProcessedMap, &packageStatus, "", "", datastructures.MakeSet[string](), true)
+ sort.Slice(packageStatus, func(i, j int) bool {
+ if packageStatus[i].BlockedPackageUrl == packageStatus[j].BlockedPackageUrl {
+ return packageStatus[i].ParentName < packageStatus[j].ParentName
+ }
+ return packageStatus[i].BlockedPackageUrl < packageStatus[j].BlockedPackageUrl
+ })
+ sort.Slice(tt.expectedPackagesStatus, func(i, j int) bool {
+ if tt.expectedPackagesStatus[i].BlockedPackageUrl == tt.expectedPackagesStatus[j].BlockedPackageUrl {
+ return tt.expectedPackagesStatus[i].ParentName < tt.expectedPackagesStatus[j].ParentName
+ }
+ return tt.expectedPackagesStatus[i].BlockedPackageUrl < tt.expectedPackagesStatus[j].BlockedPackageUrl
+ })
+ assert.Equal(t, tt.expectedPackagesStatus, packageStatus)
+ })
+ }
+}
+
+func getTestCasesForFillGraphRelations() []struct {
+ name string
+ givenGraph *xrayUtils.GraphNode
+ givenMap []*PackageStatus
+ expectedPackagesStatus []*PackageStatus
+} {
+ tests := []struct {
+ name string
+ givenGraph *xrayUtils.GraphNode
+ givenMap []*PackageStatus
+ expectedPackagesStatus []*PackageStatus
+ }{
+ {
+ name: "block indirect",
+ givenGraph: &xrayUtils.GraphNode{
+ Id: "npm://root-test",
+ Nodes: []*xrayUtils.GraphNode{
+ {
+ Id: "npm://test-parent:1.0.0",
+ Nodes: []*xrayUtils.GraphNode{
+ {Id: "npm://test-child:2.0.0"},
+ },
+ },
+ },
+ },
+ givenMap: []*PackageStatus{
+ {
+ Action: "blocked",
+ BlockedPackageUrl: "http://localhost:8046/artifactory/api/npm/npm-repo/test-child/-/test-child-2.0.0.tgz",
+ PackageName: "test-child",
+ PackageVersion: "2.0.0",
+ BlockingReason: "Policy violations",
+ PkgType: "npm",
+ Policy: []Policy{
+ {
+ Policy: "policy1",
+ Condition: "condition1",
+ },
+ },
+ },
+ },
+ expectedPackagesStatus: []*PackageStatus{
+ {
+ Action: "blocked",
+ BlockedPackageUrl: "http://localhost:8046/artifactory/api/npm/npm-repo/test-child/-/test-child-2.0.0.tgz",
+ PackageName: "test-child",
+ PackageVersion: "2.0.0",
+ BlockingReason: "Policy violations",
+ PkgType: "npm",
+ Policy: []Policy{
+ {
+ Policy: "policy1",
+ Condition: "condition1",
+ },
+ },
+ ParentName: "test-parent",
+ ParentVersion: "1.0.0",
+ DepRelation: "indirect",
+ },
+ },
+ },
+ {
+ name: "no duplications",
+ givenGraph: &xrayUtils.GraphNode{
+ Id: "npm://root-test",
+ Nodes: []*xrayUtils.GraphNode{
+ {
+ Id: "npm://test-parent:1.0.0",
+ Nodes: []*xrayUtils.GraphNode{
+ {
+ Id: "npm://test-child:2.0.0",
+ Nodes: []*xrayUtils.GraphNode{
+ {
+ Id: "npm://@dev/test-child:4.0.0",
+ },
+ },
+ },
+ {
+ Id: "npm://test-child:3.0.0",
+ Nodes: []*xrayUtils.GraphNode{
+ {
+ Id: "npm://@dev/test-child:4.0.0",
+ },
+ },
+ },
+ {
+ Id: "npm://@dev/test-child:5.0.0",
+ Nodes: []*xrayUtils.GraphNode{
+ {
+ Id: "npm://test-child:4.0.0",
+ },
+ },
+ },
+ },
+ },
+ {
+ Id: "npm://@dev/test-parent:1.0.0",
+ Nodes: []*xrayUtils.GraphNode{
+ {
+ Id: "npm://test-child:4.0.0",
+ },
+ },
+ },
+ },
+ },
+ givenMap: []*PackageStatus{
+ {
+ Action: "blocked",
+ BlockedPackageUrl: "http://localhost:8046/artifactory/api/npm/npm-repo/@dev/test-child/-/test-child-4.0.0.tgz",
+ PackageName: "@dev/test-child",
+ PackageVersion: "4.0.0",
+ BlockingReason: "Policy violations",
+ PkgType: "npm",
+ Policy: []Policy{
+ {
+ Policy: "policy1",
+ Condition: "condition1",
+ },
+ },
+ },
+ {
+ Action: "blocked",
+ BlockedPackageUrl: "http://localhost:8046/artifactory/api/npm/npm-repo/test-child/-/test-child-4.0.0.tgz",
+ PackageName: "test-child",
+ PackageVersion: "4.0.0",
+ BlockingReason: "Policy violations",
+ PkgType: "npm",
+ Policy: []Policy{
+ {
+ Policy: "policy1",
+ Condition: "condition1",
+ },
+ },
+ },
+ },
+ expectedPackagesStatus: []*PackageStatus{
+ {
+ Action: "blocked",
+ BlockedPackageUrl: "http://localhost:8046/artifactory/api/npm/npm-repo/test-child/-/test-child-4.0.0.tgz",
+ PackageName: "test-child",
+ PackageVersion: "4.0.0",
+ BlockingReason: "Policy violations",
+ PkgType: "npm",
+ Policy: []Policy{
+ {
+ Policy: "policy1",
+ Condition: "condition1",
+ },
+ },
+ ParentName: "test-parent",
+ ParentVersion: "1.0.0",
+ DepRelation: "indirect",
+ },
+ {
+ Action: "blocked",
+ BlockedPackageUrl: "http://localhost:8046/artifactory/api/npm/npm-repo/test-child/-/test-child-4.0.0.tgz",
+ PackageName: "test-child",
+ PackageVersion: "4.0.0",
+ BlockingReason: "Policy violations",
+ PkgType: "npm",
+ Policy: []Policy{
+ {
+ Policy: "policy1",
+ Condition: "condition1",
+ },
+ },
+ ParentName: "@dev/test-parent",
+ ParentVersion: "1.0.0",
+ DepRelation: "indirect",
+ },
+ {
+ Action: "blocked",
+ BlockedPackageUrl: "http://localhost:8046/artifactory/api/npm/npm-repo/@dev/test-child/-/test-child-4.0.0.tgz",
+ PackageName: "@dev/test-child",
+ PackageVersion: "4.0.0",
+ BlockingReason: "Policy violations",
+ PkgType: "npm",
+ Policy: []Policy{
+ {
+ Policy: "policy1",
+ Condition: "condition1",
+ },
+ },
+ ParentName: "test-parent",
+ ParentVersion: "1.0.0",
+ DepRelation: "indirect",
+ },
+ },
+ },
+ }
+ return tests
+}
+
+func fillSyncedMap(pkgStatus []*PackageStatus) *sync.Map {
+ syncMap := sync.Map{}
+ for _, value := range pkgStatus {
+ syncMap.Store(value.BlockedPackageUrl, value)
+ }
+ return &syncMap
+}
+
+func TestDoCurationAudit(t *testing.T) {
+ tests := getTestCasesForDoCurationAudit()
+ for _, tt := range tests {
+ t.Run(tt.name, func(t *testing.T) {
+ currentDir, err := os.Getwd()
+ assert.NoError(t, err)
+ configurationDir := filepath.Join(TestDataDir, "projects", "package-managers", "npm", "npm-project", ".jfrog")
+ callback := clienttestutils.SetEnvWithCallbackAndAssert(t, coreutils.HomeDir, filepath.Join(currentDir, configurationDir))
+ defer callback()
+
+ mockServer, config := curationServer(t, tt.expectedRequest, tt.requestToFail, tt.requestToError)
+ defer mockServer.Close()
+ configFilePath := WriteServerDetailsConfigFileBytes(t, config.ArtifactoryUrl, configurationDir)
+ defer func() {
+ assert.NoError(t, fileutils.RemoveTempDir(configFilePath))
+ }()
+ curationCmd := NewCurationAuditCommand()
+ curationCmd.parallelRequests = 3
+ curationCmd.SetIgnoreConfigFile(true)
+ rootDir, err := os.Getwd()
+ assert.NoError(t, err)
+			// Set the working dir to the npm project.
+ callback = clienttestutils.ChangeDirWithCallback(t, rootDir, filepath.Join(TestDataDir, "projects", "package-managers", "npm", "npm-project"))
+ defer callback()
+ results := map[string][]*PackageStatus{}
+ if tt.requestToError == nil {
+ assert.NoError(t, curationCmd.doCurateAudit(results))
+ } else {
+ gotError := curationCmd.doCurateAudit(results)
+ assert.Error(t, gotError)
+				startUrl := strings.Index(tt.expectedError, "/")
+				assert.GreaterOrEqual(t, startUrl, 0)
+				errMsgExpected := tt.expectedError[:startUrl] + config.ArtifactoryUrl + tt.expectedError[startUrl+1:]
+ assert.EqualError(t, gotError, errMsgExpected)
+ }
+			// Prepend the mock server URL to the expected blocked-package URLs
+ for index := range tt.expectedResp {
+ tt.expectedResp[index].BlockedPackageUrl = fmt.Sprintf("%s%s", strings.TrimSuffix(config.GetArtifactoryUrl(), "/"), tt.expectedResp[index].BlockedPackageUrl)
+ }
+ gotResults := results["npm_test:1.0.0"]
+ assert.Equal(t, tt.expectedResp, gotResults)
+ for _, requestDone := range tt.expectedRequest {
+ assert.True(t, requestDone)
+ }
+ })
+ }
+}
+
+func getTestCasesForDoCurationAudit() []struct {
+ name string
+ expectedRequest map[string]bool
+ requestToFail map[string]bool
+ expectedResp []*PackageStatus
+ requestToError map[string]bool
+ expectedError string
+} {
+ tests := []struct {
+ name string
+ expectedRequest map[string]bool
+ requestToFail map[string]bool
+ expectedResp []*PackageStatus
+ requestToError map[string]bool
+ expectedError string
+ }{
+ {
+			name: "npm tree - two packages, one blocked",
+ expectedRequest: map[string]bool{
+ "/api/npm/npms/lightweight/-/lightweight-0.1.0.tgz": false,
+ "/api/npm/npms/underscore/-/underscore-1.13.6.tgz": false,
+ },
+ requestToFail: map[string]bool{
+ "/api/npm/npms/underscore/-/underscore-1.13.6.tgz": false,
+ },
+ expectedResp: []*PackageStatus{
+ {
+ Action: "blocked",
+ ParentVersion: "1.13.6",
+ ParentName: "underscore",
+ BlockedPackageUrl: "/api/npm/npms/underscore/-/underscore-1.13.6.tgz",
+ PackageName: "underscore",
+ PackageVersion: "1.13.6",
+ BlockingReason: "Policy violations",
+ PkgType: "npm",
+ DepRelation: "direct",
+ Policy: []Policy{
+ {
+ Policy: "pol1",
+ Condition: "cond1",
+ },
+ },
+ },
+ },
+ },
+ {
+			name: "npm tree - two packages, one blocked and one error",
+ expectedRequest: map[string]bool{
+ "/api/npm/npms/lightweight/-/lightweight-0.1.0.tgz": false,
+ "/api/npm/npms/underscore/-/underscore-1.13.6.tgz": false,
+ },
+ requestToFail: map[string]bool{
+ "/api/npm/npms/underscore/-/underscore-1.13.6.tgz": false,
+ },
+ requestToError: map[string]bool{
+ "/api/npm/npms/lightweight/-/lightweight-0.1.0.tgz": false,
+ },
+ expectedResp: []*PackageStatus{
+ {
+ Action: "blocked",
+ ParentVersion: "1.13.6",
+ ParentName: "underscore",
+ BlockedPackageUrl: "/api/npm/npms/underscore/-/underscore-1.13.6.tgz",
+ PackageName: "underscore",
+ PackageVersion: "1.13.6",
+ BlockingReason: "Policy violations",
+ PkgType: "npm",
+ DepRelation: "direct",
+ Policy: []Policy{
+ {
+ Policy: "pol1",
+ Condition: "cond1",
+ },
+ },
+ },
+ },
+ expectedError: fmt.Sprintf("failed sending HEAD request to %s for package '%s'. Status-code: %v. "+
+ "Cause: executor timeout after 2 attempts with 0 milliseconds wait intervals",
+ "/api/npm/npms/lightweight/-/lightweight-0.1.0.tgz", "lightweight:0.1.0", http.StatusInternalServerError),
+ },
+ }
+ return tests
+}
+
+func curationServer(t *testing.T, expectedRequest map[string]bool, requestToFail map[string]bool, requestToError map[string]bool) (*httptest.Server, *config.ServerDetails) {
+ mapLockReadWrite := sync.Mutex{}
+ serverMock, config, _ := coretests.CreateRtRestsMockServer(t, func(w http.ResponseWriter, r *http.Request) {
+ if r.Method == http.MethodHead {
+ mapLockReadWrite.Lock()
+ if _, exist := expectedRequest[r.RequestURI]; exist {
+ expectedRequest[r.RequestURI] = true
+ }
+ mapLockReadWrite.Unlock()
+ if _, exist := requestToFail[r.RequestURI]; exist {
+ w.WriteHeader(http.StatusForbidden)
+ }
+ if _, exist := requestToError[r.RequestURI]; exist {
+ w.WriteHeader(http.StatusInternalServerError)
+ }
+ }
+ if r.Method == http.MethodGet {
+ if _, exist := requestToFail[r.RequestURI]; exist {
+ w.WriteHeader(http.StatusForbidden)
+				_, err := w.Write([]byte(`{"errors": [{"status": 403, "message": "Package download was blocked by JFrog Packages Curation service due to the following policies violated {pol1, cond1}"}]}`))
+ assert.NoError(t, err)
+ }
+ }
+ })
+ return serverMock, config
+}
+
+func WriteServerDetailsConfigFileBytes(t *testing.T, url string, configPath string) string {
+ serverDetails := config.ConfigV5{
+ Servers: []*config.ServerDetails{
+ {
+ User: "admin",
+ Password: "password",
+ ServerId: "test",
+ Url: url,
+ ArtifactoryUrl: url,
+ },
+ },
+ Version: "v" + strconv.Itoa(coreutils.GetCliConfigVersion()),
+ }
+
+ detailsByte, err := json.Marshal(serverDetails)
+ assert.NoError(t, err)
+ confFilePath := filepath.Join(configPath, "jfrog-cli.conf.v"+strconv.Itoa(coreutils.GetCliConfigVersion()))
+ assert.NoError(t, os.WriteFile(confFilePath, detailsByte, 0644))
+ return confFilePath
+}
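
The curation block messages exercised by these tests embed each violated policy as a `{policy, condition[, explanation, recommendation]}` group inside the error text. A rough sketch of such an extractor, assuming a simple brace-matching regex (the CLI's actual `extractPoliciesRegex` may differ):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

type policy struct {
	Policy, Condition, Explanation, Recommendation string
}

// Illustrative pattern: capture every comma-separated group between braces.
var policiesRegex = regexp.MustCompile(`{([^}]+)}`)

// extractPolicies pulls the violated-policy groups out of a curation
// block message; groups with neither 2 nor 4 fields are ignored.
func extractPolicies(msg string) []policy {
	var out []policy
	for _, m := range policiesRegex.FindAllStringSubmatch(msg, -1) {
		parts := strings.Split(m[1], ",")
		for i := range parts {
			parts[i] = strings.TrimSpace(parts[i])
		}
		switch len(parts) {
		case 2:
			out = append(out, policy{Policy: parts[0], Condition: parts[1]})
		case 4:
			out = append(out, policy{parts[0], parts[1], parts[2], parts[3]})
		}
	}
	return out
}

func main() {
	msg := "download was blocked ... policies violated {policy1, condition1}, {policy2, condition2}."
	fmt.Println(extractPolicies(msg))
}
```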
diff --git a/commands/scan/buildscan.go b/commands/scan/buildscan.go
new file mode 100644
index 00000000..bbd18d0b
--- /dev/null
+++ b/commands/scan/buildscan.go
@@ -0,0 +1,168 @@
+package scan
+
+import (
+ "errors"
+
+ "github.com/jfrog/jfrog-cli-core/v2/common/build"
+ outputFormat "github.com/jfrog/jfrog-cli-core/v2/common/format"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-security/utils"
+ xrutils "github.com/jfrog/jfrog-cli-security/utils"
+ clientutils "github.com/jfrog/jfrog-client-go/utils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ "github.com/jfrog/jfrog-client-go/xray"
+ "github.com/jfrog/jfrog-client-go/xray/services"
+)
+
+const (
+ BuildScanMinVersion = "3.37.0"
+ BuildScanIncludeVulnerabilitiesMinVersion = "3.40.0"
+)
+
+type BuildScanCommand struct {
+ serverDetails *config.ServerDetails
+ outputFormat outputFormat.OutputFormat
+ buildConfiguration *build.BuildConfiguration
+ includeVulnerabilities bool
+ failBuild bool
+ printExtendedTable bool
+ rescan bool
+}
+
+func NewBuildScanCommand() *BuildScanCommand {
+ return &BuildScanCommand{}
+}
+
+func (bsc *BuildScanCommand) SetServerDetails(server *config.ServerDetails) *BuildScanCommand {
+ bsc.serverDetails = server
+ return bsc
+}
+
+func (bsc *BuildScanCommand) SetOutputFormat(format outputFormat.OutputFormat) *BuildScanCommand {
+ bsc.outputFormat = format
+ return bsc
+}
+
+func (bsc *BuildScanCommand) ServerDetails() (*config.ServerDetails, error) {
+ return bsc.serverDetails, nil
+}
+
+func (bsc *BuildScanCommand) SetBuildConfiguration(buildConfiguration *build.BuildConfiguration) *BuildScanCommand {
+ bsc.buildConfiguration = buildConfiguration
+ return bsc
+}
+
+func (bsc *BuildScanCommand) SetIncludeVulnerabilities(include bool) *BuildScanCommand {
+ bsc.includeVulnerabilities = include
+ return bsc
+}
+
+func (bsc *BuildScanCommand) SetFailBuild(failBuild bool) *BuildScanCommand {
+ bsc.failBuild = failBuild
+ return bsc
+}
+
+func (bsc *BuildScanCommand) SetPrintExtendedTable(printExtendedTable bool) *BuildScanCommand {
+ bsc.printExtendedTable = printExtendedTable
+ return bsc
+}
+
+func (bsc *BuildScanCommand) SetRescan(rescan bool) *BuildScanCommand {
+ bsc.rescan = rescan
+ return bsc
+}
+
+// Scan published builds with Xray
+func (bsc *BuildScanCommand) Run() (err error) {
+ xrayManager, xrayVersion, err := utils.CreateXrayServiceManagerAndGetVersion(bsc.serverDetails)
+ if err != nil {
+ return err
+ }
+ err = clientutils.ValidateMinimumVersion(clientutils.Xray, xrayVersion, BuildScanMinVersion)
+ if err != nil {
+ return err
+ }
+ if bsc.includeVulnerabilities {
+ err = clientutils.ValidateMinimumVersion(clientutils.Xray, xrayVersion, BuildScanIncludeVulnerabilitiesMinVersion)
+ if err != nil {
+ return errors.New("build-scan command with '--vuln' flag is not supported on your current Xray version. " + err.Error())
+ }
+ }
+ buildName, err := bsc.buildConfiguration.GetBuildName()
+ if err != nil {
+ return err
+ }
+ buildNumber, err := bsc.buildConfiguration.GetBuildNumber()
+ if err != nil {
+ return err
+ }
+ params := services.XrayBuildParams{
+ BuildName: buildName,
+ BuildNumber: buildNumber,
+ Project: bsc.buildConfiguration.GetProject(),
+ Rescan: bsc.rescan,
+ }
+
+ isFailBuildResponse, err := bsc.runBuildScanAndPrintResults(xrayManager, xrayVersion, params)
+ if err != nil {
+ return err
+ }
+	// Return a fail-build error if the failBuild flag is set and Xray responded with a fail-build indication
+ if bsc.failBuild && isFailBuildResponse {
+ return xrutils.NewFailBuildError()
+ }
+ return
+}
+
+func (bsc *BuildScanCommand) runBuildScanAndPrintResults(xrayManager *xray.XrayServicesManager, xrayVersion string, params services.XrayBuildParams) (isFailBuildResponse bool, err error) {
+ buildScanResults, noFailBuildPolicy, err := xrayManager.BuildScan(params, bsc.includeVulnerabilities)
+ if err != nil {
+ return false, err
+ }
+ log.Info("The scan data is available at: " + buildScanResults.MoreDetailsUrl)
+ isFailBuildResponse = buildScanResults.FailBuild
+
+ scanResponse := []services.ScanResponse{{
+ Violations: buildScanResults.Violations,
+ Vulnerabilities: buildScanResults.Vulnerabilities,
+ XrayDataUrl: buildScanResults.MoreDetailsUrl,
+ }}
+
+ scanResults := xrutils.NewAuditResults()
+ scanResults.XrayVersion = xrayVersion
+ scanResults.ScaResults = []xrutils.ScaScanResult{{XrayResults: scanResponse}}
+
+ resultsPrinter := xrutils.NewResultsWriter(scanResults).
+ SetOutputFormat(bsc.outputFormat).
+ SetIncludeVulnerabilities(bsc.includeVulnerabilities).
+ SetIncludeLicenses(false).
+ SetIsMultipleRootProject(true).
+ SetPrintExtendedTable(bsc.printExtendedTable).
+ SetScanType(services.Binary).
+ SetExtraMessages(nil)
+
+ if bsc.outputFormat != outputFormat.Table {
+ // Print the violations and/or vulnerabilities as part of one JSON.
+ err = resultsPrinter.PrintScanResults()
+ } else {
+ // Print two different tables for violations and vulnerabilities (if needed)
+
+		// If a "No Xray fail build policy" response was received, there is no need to print violations
+ if !noFailBuildPolicy {
+ if err = resultsPrinter.PrintScanResults(); err != nil {
+ return false, err
+ }
+ }
+ if bsc.includeVulnerabilities {
+ resultsPrinter.SetIncludeVulnerabilities(true)
+ if err = resultsPrinter.PrintScanResults(); err != nil {
+ return false, err
+ }
+ }
+ }
+ return
+}
+
+func (bsc *BuildScanCommand) CommandName() string {
+ return "xr_build_scan"
+}
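
`BuildScanCommand.Run` gates features on the Xray server version: 3.37.0 is the baseline for build-scan and 3.40.0 is required for the `--vuln` flag. A minimal sketch of such a dotted-version gate, as a stand-in for `clientutils.ValidateMinimumVersion` (names here are illustrative):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// atLeast reports whether got >= min for numeric dotted versions,
// comparing segment by segment; missing segments count as zero.
func atLeast(got, min string) bool {
	gs, ms := strings.Split(got, "."), strings.Split(min, ".")
	for i := 0; i < len(ms); i++ {
		g := 0
		if i < len(gs) {
			g, _ = strconv.Atoi(gs[i])
		}
		m, _ := strconv.Atoi(ms[i])
		if g != m {
			return g > m
		}
	}
	return true
}

func main() {
	// A 3.37.x server supports build-scan but not --vuln (3.40.0+).
	fmt.Println(atLeast("3.37.2", "3.37.0"), atLeast("3.37.2", "3.40.0"))
}
```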
diff --git a/commands/scan/dockerscan.go b/commands/scan/dockerscan.go
new file mode 100644
index 00000000..8e954792
--- /dev/null
+++ b/commands/scan/dockerscan.go
@@ -0,0 +1,145 @@
+package scan
+
+import (
+ "bytes"
+ "fmt"
+ "github.com/jfrog/jfrog-cli-core/v2/common/spec"
+ xrayutils "github.com/jfrog/jfrog-cli-security/utils"
+ clientutils "github.com/jfrog/jfrog-client-go/utils"
+ "github.com/jfrog/jfrog-client-go/utils/errorutils"
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ "os"
+ "os/exec"
+ "path/filepath"
+ "strings"
+)
+
+const (
+ indexerEnvPrefix = "JFROG_INDEXER_"
+ DockerScanMinXrayVersion = "3.40.0"
+)
+
+type DockerScanCommand struct {
+ ScanCommand
+ imageTag string
+ targetRepoPath string
+}
+
+func NewDockerScanCommand() *DockerScanCommand {
+ return &DockerScanCommand{ScanCommand: *NewScanCommand()}
+}
+
+func (dsc *DockerScanCommand) SetImageTag(imageTag string) *DockerScanCommand {
+ dsc.imageTag = imageTag
+ return dsc
+}
+
+func (dsc *DockerScanCommand) SetTargetRepoPath(repoPath string) *DockerScanCommand {
+ dsc.targetRepoPath = repoPath
+ return dsc
+}
+
+func (dsc *DockerScanCommand) Run() (err error) {
+ // Validate Xray minimum version
+ _, xrayVersion, err := xrayutils.CreateXrayServiceManagerAndGetVersion(dsc.ScanCommand.serverDetails)
+ if err != nil {
+ return err
+ }
+ err = clientutils.ValidateMinimumVersion(clientutils.Xray, xrayVersion, DockerScanMinXrayVersion)
+ if err != nil {
+ return err
+ }
+
+ tempDirPath, err := fileutils.CreateTempDir()
+ if err != nil {
+ return err
+ }
+ defer func() {
+ e := fileutils.RemoveTempDir(tempDirPath)
+ if err == nil {
+ err = e
+ }
+ }()
+
+	// Run 'docker save' to create a tar file from the Docker image, and pass it to the indexer-app
+ if dsc.progress != nil {
+ dsc.progress.SetHeadlineMsg("Creating image archive 📦")
+ }
+ log.Info("Creating image archive...")
+ imageTarPath := filepath.Join(tempDirPath, "image.tar")
+ dockerSaveCmd := exec.Command("docker", "save", dsc.imageTag, "-o", imageTarPath)
+ var stderr bytes.Buffer
+ dockerSaveCmd.Stderr = &stderr
+ err = dockerSaveCmd.Run()
+ if err != nil {
+ return fmt.Errorf("failed running command: '%s' with error: %s - %s", strings.Join(dockerSaveCmd.Args, " "), err.Error(), stderr.String())
+ }
+
+ // Perform scan on image.tar
+ dsc.SetSpec(spec.NewBuilder().
+ Pattern(imageTarPath).
+ Target(dsc.targetRepoPath).
+ BuildSpec()).SetThreads(1)
+ err = dsc.setCredentialEnvsForIndexerApp()
+ if err != nil {
+ return errorutils.CheckError(err)
+ }
+ defer func() {
+ e := dsc.unsetCredentialEnvsForIndexerApp()
+ if err == nil {
+ err = errorutils.CheckError(e)
+ }
+ }()
+ return dsc.ScanCommand.Run()
+}
+
+// When indexing RPM files inside the docker container, the indexer-app needs to connect to the Xray Server.
+// This is because RPM indexing is performed on the server side. This method therefore sets the Xray credentials as env vars to be read and used by the indexer-app.
+func (dsc *DockerScanCommand) setCredentialEnvsForIndexerApp() error {
+ err := os.Setenv(indexerEnvPrefix+"XRAY_URL", dsc.serverDetails.XrayUrl)
+ if err != nil {
+ return err
+ }
+ if dsc.serverDetails.AccessToken != "" {
+ err = os.Setenv(indexerEnvPrefix+"XRAY_ACCESS_TOKEN", dsc.serverDetails.AccessToken)
+ if err != nil {
+ return err
+ }
+ } else {
+ err = os.Setenv(indexerEnvPrefix+"XRAY_USER", dsc.serverDetails.User)
+ if err != nil {
+ return err
+ }
+ err = os.Setenv(indexerEnvPrefix+"XRAY_PASSWORD", dsc.serverDetails.Password)
+ if err != nil {
+ return err
+ }
+ }
+ return nil
+}
+
+func (dsc *DockerScanCommand) unsetCredentialEnvsForIndexerApp() error {
+ err := os.Unsetenv(indexerEnvPrefix + "XRAY_URL")
+ if err != nil {
+ return err
+ }
+ err = os.Unsetenv(indexerEnvPrefix + "XRAY_ACCESS_TOKEN")
+ if err != nil {
+ return err
+ }
+ err = os.Unsetenv(indexerEnvPrefix + "XRAY_USER")
+ if err != nil {
+ return err
+ }
+ err = os.Unsetenv(indexerEnvPrefix + "XRAY_PASSWORD")
+ if err != nil {
+ return err
+ }
+
+ return nil
+}
+
+func (dsc *DockerScanCommand) CommandName() string {
+ return "xr_docker_scan"
+}
diff --git a/commands/scan/downloadindexer.go b/commands/scan/downloadindexer.go
new file mode 100644
index 00000000..d0c3456d
--- /dev/null
+++ b/commands/scan/downloadindexer.go
@@ -0,0 +1,193 @@
+package scan
+
+import (
+ "errors"
+ "fmt"
+ "net/http"
+ "os"
+ "path/filepath"
+ "runtime"
+ "sort"
+ "strings"
+
+ gofrogio "github.com/jfrog/gofrog/io"
+ "github.com/jfrog/gofrog/version"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/lock"
+ "github.com/jfrog/jfrog-client-go/http/httpclient"
+ "github.com/jfrog/jfrog-client-go/utils/errorutils"
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ "github.com/jfrog/jfrog-client-go/xray"
+)
+
+const (
+ indexerDirName = "xray-indexer"
+ tempIndexerDirName = "temp"
+)
+
+func DownloadIndexerIfNeeded(xrayManager *xray.XrayServicesManager, xrayVersionStr string) (indexerPath string, err error) {
+ dependenciesPath, err := config.GetJfrogDependenciesPath()
+ if err != nil {
+ return
+ }
+ indexerDirPath := filepath.Join(dependenciesPath, indexerDirName)
+ indexerBinaryName := getIndexerBinaryName()
+ indexerPath = filepath.Join(indexerDirPath, xrayVersionStr, indexerBinaryName)
+
+ locksDirPath, err := coreutils.GetJfrogLocksDir()
+ if err != nil {
+ return
+ }
+ unlockFunc, err := lock.CreateLock(filepath.Join(locksDirPath, indexerDirName))
+	// Defer unlockFunc() before returning a possible error, to avoid deadlocks.
+ defer func() {
+ e := unlockFunc()
+ if err == nil {
+ err = e
+ }
+ }()
+ if err != nil {
+ return
+ }
+ exists, err := fileutils.IsFileExists(indexerPath, false)
+ if exists || err != nil {
+ return
+ }
+
+ log.Info("JFrog Xray Indexer " + xrayVersionStr + " is not cached locally. Downloading it now...")
+ indexerPath, err = downloadIndexer(xrayManager, indexerDirPath, indexerBinaryName)
+ if err != nil {
+ err = errors.New("failed while attempting to download Xray indexer: " + err.Error())
+ }
+ return
+}
+
+func downloadIndexer(xrayManager *xray.XrayServicesManager, indexerDirPath, indexerBinaryName string) (string, error) {
+ tempDirPath := filepath.Join(indexerDirPath, tempIndexerDirName)
+
+ // Delete the temporary directory if it exists
+ tempDirExists, err := fileutils.IsDirExists(tempDirPath, false)
+ if err != nil {
+ return "", err
+ }
+ if tempDirExists {
+ err = fileutils.RemoveTempDir(tempDirPath)
+ if err != nil {
+ return "", errorutils.CheckErrorf("Temporary download directory already exists, and can't be removed: %s\nRemove this directory manually and try again: %s", err.Error(), tempDirPath)
+ }
+ }
+
+	// Delete all old indexers except the two newest
+ err = deleteOldIndexers(indexerDirPath)
+ if err != nil {
+ return "", err
+ }
+
+ // Download the indexer from Xray to the temporary directory
+ url := fmt.Sprintf("%sapi/v1/indexer-resources/download/%s/%s", xrayManager.Config().GetServiceDetails().GetUrl(), runtime.GOOS, runtime.GOARCH)
+ downloadFileDetails := &httpclient.DownloadFileDetails{
+ DownloadPath: url,
+ LocalPath: tempDirPath,
+ LocalFileName: indexerBinaryName,
+ }
+ httpClientDetails := xrayManager.Config().GetServiceDetails().CreateHttpClientDetails()
+ resp, err := xrayManager.Client().DownloadFile(downloadFileDetails, "", &httpClientDetails, false, false)
+ if err != nil {
+ return "", fmt.Errorf("an error occurred while trying to download '%s':\n%s", url, err.Error())
+ }
+ if err = errorutils.CheckResponseStatus(resp, http.StatusOK); err != nil {
+ if resp.StatusCode == http.StatusUnauthorized {
+			err = errors.New(err.Error() + "\nHint: It appears that the credentials provided do not have sufficient permissions for JFrog Xray. This could be due to either incorrect credentials or limited permissions restricted to Artifactory only.")
+ }
+ return "", fmt.Errorf("failed while attempting to download '%s':\n%s", url, err.Error())
+ }
+
+ // Add execution permissions to the indexer
+ indexerPath := filepath.Join(tempDirPath, indexerBinaryName)
+ err = os.Chmod(indexerPath, 0777)
+ if err != nil {
+ return "", errorutils.CheckError(err)
+ }
+ indexerVersion, err := getIndexerVersion(indexerPath)
+ if err != nil {
+ return "", err
+ }
+ log.Info("The downloaded Xray Indexer version is " + indexerVersion)
+ newDirPath := filepath.Join(indexerDirPath, indexerVersion)
+
+ // In case of a hot upgrade of Xray in progress, the version of the downloaded indexer might be different from the Xray version we got above,
+ // so the indexer we just downloaded may already exist.
+ newDirExists, err := fileutils.IsDirExists(newDirPath, false)
+ if err != nil {
+ return "", err
+ }
+ if newDirExists {
+ err = fileutils.RemoveTempDir(tempDirPath)
+ } else {
+ err = fileutils.MoveDir(tempDirPath, newDirPath)
+ }
+
+ return filepath.Join(newDirPath, indexerBinaryName), errorutils.CheckError(err)
+}
+
+func getIndexerVersion(indexerPath string) (string, error) {
+ indexCmd := &coreutils.GeneralExecCmd{
+ ExecPath: indexerPath,
+ Command: []string{"version"},
+ }
+ output, err := gofrogio.RunCmdOutput(indexCmd)
+ if err != nil {
+ return "", errorutils.CheckError(err)
+ }
+	// The output of the version command looks like: "jfrog xray indexer-app version 1.2.3".
+	splitOutput := strings.Split(output, " ")
+	indexerVersion := strings.TrimSuffix(splitOutput[len(splitOutput)-1], "\n")
+ return indexerVersion, nil
+}
+
+func deleteOldIndexers(indexerDirPath string) error {
+ indexerDirExists, err := fileutils.IsDirExists(indexerDirPath, false)
+ if !indexerDirExists || err != nil {
+ return err
+ }
+
+ filesList, err := os.ReadDir(indexerDirPath)
+ if err != nil {
+ return errorutils.CheckError(err)
+ }
+ var dirsList []string
+ for _, file := range filesList {
+ if file.IsDir() {
+ dirsList = append(dirsList, file.Name())
+ }
+ }
+
+ if len(dirsList) <= 2 {
+ return nil
+ }
+
+ sort.Slice(dirsList, func(i, j int) bool {
+ currVersion := version.NewVersion(dirsList[i])
+ return currVersion.AtLeast(dirsList[j])
+ })
+
+ for i := 2; i < len(dirsList); i++ {
+ err = os.RemoveAll(filepath.Join(indexerDirPath, dirsList[i]))
+ if err != nil {
+ return errorutils.CheckError(err)
+ }
+ }
+
+ return nil
+}
+
+func getIndexerBinaryName() string {
+ switch runtime.GOOS {
+ case "windows":
+ return "indexer-app.exe"
+ default:
+ return "indexer-app"
+ }
+}
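
`deleteOldIndexers` keeps only the two newest version-named directories by sorting them newest-first and removing the rest. That selection step can be sketched in isolation; the dotted-version comparison below is a minimal numeric stand-in for gofrog's `version.AtLeast`, and the function names are illustrative:

```go
package main

import (
	"fmt"
	"sort"
	"strconv"
	"strings"
)

// toRemove sorts version-named directories newest-first and returns the
// names a deleteOldIndexers-style cleanup would delete: everything past
// the two newest. The input slice is reordered in place.
func toRemove(dirs []string) []string {
	sort.Slice(dirs, func(i, j int) bool { return versionLess(dirs[j], dirs[i]) })
	if len(dirs) <= 2 {
		return nil
	}
	return dirs[2:]
}

// versionLess compares dotted numeric versions segment by segment.
func versionLess(a, b string) bool {
	as, bs := strings.Split(a, "."), strings.Split(b, ".")
	for i := 0; i < len(as) && i < len(bs); i++ {
		ai, _ := strconv.Atoi(as[i])
		bi, _ := strconv.Atoi(bs[i])
		if ai != bi {
			return ai < bi
		}
	}
	return len(as) < len(bs)
}

func main() {
	fmt.Println(toRemove([]string{"1.0.0", "1.3.0", "1.2.0"}))
}
```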
diff --git a/commands/scan/downloadindexer_test.go b/commands/scan/downloadindexer_test.go
new file mode 100644
index 00000000..0cc7e7b8
--- /dev/null
+++ b/commands/scan/downloadindexer_test.go
@@ -0,0 +1,59 @@
+package scan
+
+import (
+ "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
+ "os"
+ "path/filepath"
+ "testing"
+
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+ "github.com/stretchr/testify/assert"
+)
+
+func TestDeleteOldIndexers(t *testing.T) {
+ testsDir, createTempDirCallback := tests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ indexersDir := filepath.Join(testsDir, "xray-indexer")
+ indexersDirsPaths := []string{
+ filepath.Join(indexersDir, "1.0.0"),
+ filepath.Join(indexersDir, "1.2.0"),
+ filepath.Join(indexersDir, "1.3.x-SNAPSHOT"),
+ }
+
+ // Test no indexers directory at all
+ assert.NoError(t, deleteOldIndexers(indexersDir))
+
+ // Test there are two directories in the indexers directory - nothing should be deleted
+ createDummyIndexer(t, indexersDirsPaths[0])
+ createDummyIndexer(t, indexersDirsPaths[1])
+ assert.NoError(t, deleteOldIndexers(indexersDir))
+ assert.True(t, checkIndexerExists(t, indexersDirsPaths[0]))
+ assert.True(t, checkIndexerExists(t, indexersDirsPaths[1]))
+
+ // Test there are three directories in the indexers directory - the oldest one (by version) should be deleted
+ createDummyIndexer(t, indexersDirsPaths[2])
+ assert.NoError(t, deleteOldIndexers(indexersDir))
+ assert.False(t, checkIndexerExists(t, indexersDirsPaths[0]))
+ assert.True(t, checkIndexerExists(t, indexersDirsPaths[1]))
+ assert.True(t, checkIndexerExists(t, indexersDirsPaths[2]))
+}
+
+func createDummyIndexer(t *testing.T, dirPath string) {
+ err := os.MkdirAll(dirPath, 0777)
+ assert.NoError(t, err)
+ fullPath := filepath.Join(dirPath, getIndexerBinaryName())
+ file, err := os.Create(fullPath)
+ assert.NoError(t, err)
+ defer func() {
+ assert.NoError(t, file.Close())
+ }()
+ _, err = file.Write([]byte(fullPath))
+ assert.NoError(t, err)
+}
+
+func checkIndexerExists(t *testing.T, dirPath string) bool {
+ indexerPath := filepath.Join(dirPath, getIndexerBinaryName())
+ exists, err := fileutils.IsFileExists(indexerPath, true)
+ assert.NoError(t, err)
+ return exists
+}
diff --git a/commands/scan/scan.go b/commands/scan/scan.go
new file mode 100644
index 00000000..535d81f9
--- /dev/null
+++ b/commands/scan/scan.go
@@ -0,0 +1,459 @@
+package scan
+
+import (
+ "bytes"
+ "encoding/json"
+ "errors"
+ "fmt"
+ "os/exec"
+ "path/filepath"
+ "regexp"
+ "strings"
+
+ "github.com/jfrog/jfrog-cli-security/scangraph"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+
+ "github.com/jfrog/gofrog/parallel"
+ "github.com/jfrog/jfrog-cli-core/v2/common/format"
+ outputFormat "github.com/jfrog/jfrog-cli-core/v2/common/format"
+ "github.com/jfrog/jfrog-cli-core/v2/common/spec"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-cli-security/formats"
+ xrutils "github.com/jfrog/jfrog-cli-security/utils"
+ "github.com/jfrog/jfrog-client-go/artifactory/services/fspatterns"
+ clientutils "github.com/jfrog/jfrog-client-go/utils"
+ "github.com/jfrog/jfrog-client-go/utils/errorutils"
+ ioUtils "github.com/jfrog/jfrog-client-go/utils/io"
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+ "github.com/jfrog/jfrog-client-go/xray/services"
+)
+
+type FileContext func(string) parallel.TaskFunc
+type indexFileHandlerFunc func(file string)
+
+const (
+ BypassArchiveLimitsMinXrayVersion = "3.59.0"
+ indexingCommand = "graph"
+ fileNotSupportedExitCode = 3
+)
+
+type ScanCommand struct {
+ serverDetails *config.ServerDetails
+ spec *spec.SpecFiles
+ threads int
+ // The location of the downloaded Xray indexer binary on the local file system.
+ indexerPath string
+ indexerTempDir string
+ outputFormat outputFormat.OutputFormat
+ projectKey string
+ minSeverityFilter string
+ watches []string
+ includeVulnerabilities bool
+ includeLicenses bool
+ fail bool
+ printExtendedTable bool
+ bypassArchiveLimits bool
+ fixableOnly bool
+ progress ioUtils.ProgressMgr
+}
+
+func (scanCmd *ScanCommand) SetMinSeverityFilter(minSeverityFilter string) *ScanCommand {
+ scanCmd.minSeverityFilter = minSeverityFilter
+ return scanCmd
+}
+
+func (scanCmd *ScanCommand) SetFixableOnly(fixable bool) *ScanCommand {
+ scanCmd.fixableOnly = fixable
+ return scanCmd
+}
+
+func (scanCmd *ScanCommand) SetProgress(progress ioUtils.ProgressMgr) {
+ scanCmd.progress = progress
+}
+
+func (scanCmd *ScanCommand) SetThreads(threads int) *ScanCommand {
+ scanCmd.threads = threads
+ return scanCmd
+}
+
+func (scanCmd *ScanCommand) SetOutputFormat(format outputFormat.OutputFormat) *ScanCommand {
+ scanCmd.outputFormat = format
+ return scanCmd
+}
+
+func (scanCmd *ScanCommand) SetServerDetails(server *config.ServerDetails) *ScanCommand {
+ scanCmd.serverDetails = server
+ return scanCmd
+}
+
+func (scanCmd *ScanCommand) SetSpec(spec *spec.SpecFiles) *ScanCommand {
+ scanCmd.spec = spec
+ return scanCmd
+}
+
+func (scanCmd *ScanCommand) SetProject(project string) *ScanCommand {
+ scanCmd.projectKey = project
+ return scanCmd
+}
+
+func (scanCmd *ScanCommand) SetWatches(watches []string) *ScanCommand {
+ scanCmd.watches = watches
+ return scanCmd
+}
+
+func (scanCmd *ScanCommand) SetIncludeVulnerabilities(include bool) *ScanCommand {
+ scanCmd.includeVulnerabilities = include
+ return scanCmd
+}
+
+func (scanCmd *ScanCommand) SetIncludeLicenses(include bool) *ScanCommand {
+ scanCmd.includeLicenses = include
+ return scanCmd
+}
+
+func (scanCmd *ScanCommand) ServerDetails() (*config.ServerDetails, error) {
+ return scanCmd.serverDetails, nil
+}
+
+func (scanCmd *ScanCommand) SetFail(fail bool) *ScanCommand {
+ scanCmd.fail = fail
+ return scanCmd
+}
+
+func (scanCmd *ScanCommand) SetPrintExtendedTable(printExtendedTable bool) *ScanCommand {
+ scanCmd.printExtendedTable = printExtendedTable
+ return scanCmd
+}
+
+func (scanCmd *ScanCommand) SetBypassArchiveLimits(bypassArchiveLimits bool) *ScanCommand {
+ scanCmd.bypassArchiveLimits = bypassArchiveLimits
+ return scanCmd
+}
+
+func (scanCmd *ScanCommand) indexFile(filePath string) (*xrayUtils.BinaryGraphNode, error) {
+ var indexerResults xrayUtils.BinaryGraphNode
+ indexerCmd := exec.Command(scanCmd.indexerPath, indexingCommand, filePath, "--temp-dir", scanCmd.indexerTempDir)
+ if scanCmd.bypassArchiveLimits {
+ indexerCmd.Args = append(indexerCmd.Args, "--bypass-archive-limits")
+ }
+ var stderr bytes.Buffer
+ var stdout bytes.Buffer
+ indexerCmd.Stdout = &stdout
+ indexerCmd.Stderr = &stderr
+ err := indexerCmd.Run()
+ if err != nil {
+ var e *exec.ExitError
+ if errors.As(err, &e) {
+ if e.ExitCode() == fileNotSupportedExitCode {
+ log.Debug(fmt.Sprintf("File %s is not supported by Xray indexer app.", filePath))
+ return &indexerResults, nil
+ }
+ }
+ return nil, errorutils.CheckErrorf("Xray indexer app failed indexing %s with %s: %s", filePath, err, stderr.String())
+ }
+ if stderr.String() != "" {
+ log.Info(stderr.String())
+ }
+ err = json.Unmarshal(stdout.Bytes(), &indexerResults)
+ return &indexerResults, errorutils.CheckError(err)
+}
+
+func (scanCmd *ScanCommand) Run() (err error) {
+ defer func() {
+ if err != nil {
+ var e *exec.ExitError
+ if errors.As(err, &e) {
+ if e.ExitCode() != coreutils.ExitCodeVulnerableBuild.Code {
+ err = errors.New("Scan command failed. " + err.Error())
+ }
+ }
+ }
+ }()
+ xrayManager, xrayVersion, err := xrutils.CreateXrayServiceManagerAndGetVersion(scanCmd.serverDetails)
+ if err != nil {
+ return err
+ }
+
+ // Validate Xray minimum version for graph scan command
+ err = clientutils.ValidateMinimumVersion(clientutils.Xray, xrayVersion, scangraph.GraphScanMinXrayVersion)
+ if err != nil {
+ return err
+ }
+
+ if scanCmd.bypassArchiveLimits {
+ // Validate Xray minimum version for BypassArchiveLimits flag for indexer
+ err = clientutils.ValidateMinimumVersion(clientutils.Xray, xrayVersion, BypassArchiveLimitsMinXrayVersion)
+ if err != nil {
+ return err
+ }
+ }
+ log.Info("JFrog Xray version is:", xrayVersion)
+ // First download Xray Indexer if needed
+ scanCmd.indexerPath, err = DownloadIndexerIfNeeded(xrayManager, xrayVersion)
+ if err != nil {
+ return err
+ }
+ // Create Temp dir for Xray Indexer
+ scanCmd.indexerTempDir, err = fileutils.CreateTempDir()
+ if err != nil {
+ return err
+ }
+ defer func() {
+ e := fileutils.RemoveTempDir(scanCmd.indexerTempDir)
+ if err == nil {
+ err = e
+ }
+ }()
+ threads := 1
+ if scanCmd.threads > 1 {
+ threads = scanCmd.threads
+ }
+
+ // resultsArr is a two-dimensional array. Each array in it contains a list of ScanResponses that were requested and collected by a specific thread.
+ resultsArr := make([][]*services.ScanResponse, threads)
+	fileProducerConsumer := parallel.NewRunner(threads, 20000, false)
+ fileProducerErrors := make([][]formats.SimpleJsonError, threads)
+	indexedFileProducerConsumer := parallel.NewRunner(threads, 20000, false)
+ indexedFileProducerErrors := make([][]formats.SimpleJsonError, threads)
+ fileCollectingErrorsQueue := clientutils.NewErrorsQueue(1)
+ // Start walking on the filesystem to "produce" files that match the given pattern
+ // while the consumer uses the indexer to index those files.
+ scanCmd.prepareScanTasks(fileProducerConsumer, indexedFileProducerConsumer, resultsArr, fileProducerErrors, indexedFileProducerErrors, fileCollectingErrorsQueue, xrayVersion)
+ scanCmd.performScanTasks(fileProducerConsumer, indexedFileProducerConsumer)
+
+ // Handle results
+ flatResults := []services.ScanResponse{}
+ for _, arr := range resultsArr {
+ for _, res := range arr {
+ flatResults = append(flatResults, *res)
+ }
+ }
+ if scanCmd.progress != nil {
+ if err = scanCmd.progress.Quit(); err != nil {
+ return err
+ }
+ }
+
+ fileCollectingErr := fileCollectingErrorsQueue.GetError()
+ var scanErrors []formats.SimpleJsonError
+ if fileCollectingErr != nil {
+ scanErrors = append(scanErrors, formats.SimpleJsonError{ErrorMessage: fileCollectingErr.Error()})
+ }
+ scanErrors = appendErrorSlice(scanErrors, fileProducerErrors)
+ scanErrors = appendErrorSlice(scanErrors, indexedFileProducerErrors)
+
+ scanResults := xrutils.NewAuditResults()
+ scanResults.XrayVersion = xrayVersion
+ scanResults.ScaResults = []xrutils.ScaScanResult{{XrayResults: flatResults}}
+
+ if err = xrutils.NewResultsWriter(scanResults).
+ SetOutputFormat(scanCmd.outputFormat).
+ SetIncludeVulnerabilities(scanCmd.includeVulnerabilities).
+ SetIncludeLicenses(scanCmd.includeLicenses).
+ SetPrintExtendedTable(scanCmd.printExtendedTable).
+ SetIsMultipleRootProject(true).
+ SetScanType(services.Binary).
+ PrintScanResults(); err != nil {
+		return
+	}
+
+ // If includeVulnerabilities is false it means that context was provided, so we need to check for build violations.
+ // If user provided --fail=false, don't fail the build.
+ if scanCmd.fail && !scanCmd.includeVulnerabilities {
+ if xrutils.CheckIfFailBuild(flatResults) {
+ return xrutils.NewFailBuildError()
+ }
+ }
+ if len(scanErrors) > 0 {
+		return errorutils.CheckErrorf("%s", scanErrors[0].ErrorMessage)
+ }
+ log.Info("Scan completed successfully.")
+ return nil
+}
+
+func NewScanCommand() *ScanCommand {
+ return &ScanCommand{}
+}
+
+func (scanCmd *ScanCommand) CommandName() string {
+ return "xr_scan"
+}
+
+func (scanCmd *ScanCommand) prepareScanTasks(fileProducer, indexedFileProducer parallel.Runner, resultsArr [][]*services.ScanResponse, fileErrors, indexedFileErrors [][]formats.SimpleJsonError, fileCollectingErrorsQueue *clientutils.ErrorsQueue, xrayVersion string) {
+ go func() {
+ defer fileProducer.Done()
+ // Iterate over file-spec groups and produce indexing tasks.
+ // When encountering an error, log and move to next group.
+ specFiles := scanCmd.spec.Files
+ for i := range specFiles {
+ artifactHandlerFunc := scanCmd.createIndexerHandlerFunc(&specFiles[i], indexedFileProducer, resultsArr, fileErrors, indexedFileErrors, xrayVersion)
+ taskHandler := getAddTaskToProducerFunc(fileProducer, artifactHandlerFunc)
+
+ err := collectFilesForIndexing(specFiles[i], taskHandler)
+ if err != nil {
+ log.Error(err)
+ fileCollectingErrorsQueue.AddError(err)
+ }
+ }
+ }()
+}
+
+func (scanCmd *ScanCommand) createIndexerHandlerFunc(file *spec.File, indexedFileProducer parallel.Runner, resultsArr [][]*services.ScanResponse, fileErrors, indexedFileErrors [][]formats.SimpleJsonError, xrayVersion string) FileContext {
+ return func(filePath string) parallel.TaskFunc {
+ return func(threadId int) (err error) {
+ logMsgPrefix := clientutils.GetLogMsgPrefix(threadId, false)
+ log.Info(logMsgPrefix+"Indexing file:", filePath)
+ if scanCmd.progress != nil {
+ scanCmd.progress.SetHeadlineMsg("Indexing file: " + filepath.Base(filePath) + " 🗄")
+ }
+ graph, err := scanCmd.indexFile(filePath)
+ if err != nil {
+ fileErrors[threadId] = append(fileErrors[threadId], formats.SimpleJsonError{FilePath: filePath, ErrorMessage: err.Error()})
+ return err
+ }
+			// If the indexer returned an empty graph, for instance due to an
+			// unsupported file format, continue without sending a graph request to Xray.
+ if graph.Id == "" {
+ return nil
+ }
+ // Add a new task to the second producer/consumer
+ // which will send the indexed binary to Xray and then will store the received result.
+ taskFunc := func(threadId int) (err error) {
+ params := &services.XrayGraphScanParams{
+ BinaryGraph: graph,
+ RepoPath: getXrayRepoPathFromTarget(file.Target),
+ Watches: scanCmd.watches,
+ IncludeLicenses: scanCmd.includeLicenses,
+ IncludeVulnerabilities: scanCmd.includeVulnerabilities,
+ ProjectKey: scanCmd.projectKey,
+ ScanType: services.Binary,
+ }
+ if scanCmd.progress != nil {
+ scanCmd.progress.SetHeadlineMsg("Scanning 🔍")
+ }
+ scanGraphParams := scangraph.NewScanGraphParams().
+ SetServerDetails(scanCmd.serverDetails).
+ SetXrayGraphScanParams(params).
+ SetXrayVersion(xrayVersion).
+ SetFixableOnly(scanCmd.fixableOnly).
+ SetSeverityLevel(scanCmd.minSeverityFilter)
+ scanResults, err := scangraph.RunScanGraphAndGetResults(scanGraphParams)
+ if err != nil {
+ log.Error(fmt.Sprintf("scanning '%s' failed with error: %s", graph.Id, err.Error()))
+ indexedFileErrors[threadId] = append(indexedFileErrors[threadId], formats.SimpleJsonError{FilePath: filePath, ErrorMessage: err.Error()})
+ return
+ }
+ resultsArr[threadId] = append(resultsArr[threadId], scanResults)
+ return
+ }
+
+ _, _ = indexedFileProducer.AddTask(taskFunc)
+ return
+ }
+ }
+}
+
+func getAddTaskToProducerFunc(producer parallel.Runner, fileHandlerFunc FileContext) indexFileHandlerFunc {
+ return func(filePath string) {
+ taskFunc := fileHandlerFunc(filePath)
+ _, _ = producer.AddTask(taskFunc)
+ }
+}
+
+func (scanCmd *ScanCommand) performScanTasks(fileConsumer parallel.Runner, indexedFileConsumer parallel.Runner) {
+ go func() {
+ // Blocking until consuming is finished.
+ fileConsumer.Run()
+		// After all files have been indexed, the second producer notifies that no more tasks will be produced.
+ indexedFileConsumer.Done()
+ }()
+ // Blocking until consuming is finished.
+ indexedFileConsumer.Run()
+}
+
+func collectFilesForIndexing(fileData spec.File, dataHandlerFunc indexFileHandlerFunc) error {
+ fileData.Pattern = clientutils.ReplaceTildeWithUserHome(fileData.Pattern)
+ patternType := fileData.GetPatternType()
+ rootPath, err := fspatterns.GetRootPath(fileData.Pattern, fileData.Target, "", patternType, false)
+ if err != nil {
+ return err
+ }
+
+ isDir, err := fileutils.IsDirExists(rootPath, false)
+ if err != nil {
+ return err
+ }
+
+ // If the path is a single file, index it and return
+ if !isDir {
+ dataHandlerFunc(rootPath)
+ return nil
+ }
+ fileData.Pattern = clientutils.ConvertLocalPatternToRegexp(fileData.Pattern, patternType)
+ return collectPatternMatchingFiles(fileData, rootPath, dataHandlerFunc)
+}
+
+func collectPatternMatchingFiles(fileData spec.File, rootPath string, dataHandlerFunc indexFileHandlerFunc) error {
+ fileParams, err := fileData.ToCommonParams()
+ if err != nil {
+ return err
+ }
+ excludePathPattern := fspatterns.PrepareExcludePathPattern(fileParams.Exclusions, fileParams.GetPatternType(), fileParams.IsRecursive())
+ patternRegex, err := regexp.Compile(fileData.Pattern)
+ if errorutils.CheckError(err) != nil {
+ return err
+ }
+ recursive, err := fileData.IsRecursive(true)
+ if err != nil {
+ return err
+ }
+
+ paths, err := fspatterns.ListFiles(rootPath, recursive, false, false, false, excludePathPattern)
+ if err != nil {
+ return err
+ }
+ for _, path := range paths {
+ matches, isDir, err := fspatterns.SearchPatterns(path, false, false, patternRegex)
+ if err != nil {
+ return err
+ }
+		// paths contains all files and directories (the walk is recursive), so directories can be skipped - only files are relevant for indexing.
+ if isDir {
+ continue
+ }
+ if len(matches) > 0 {
+ dataHandlerFunc(path)
+ }
+ }
+ return nil
+}
+
+// Xray expects a path inside a repo, but does not accept a path to a file.
+// Therefore, if the given target path points to a file, the path to its
+// parent directory is returned. Otherwise, the path is returned as is.
+func getXrayRepoPathFromTarget(target string) (repoPath string) {
+ if strings.HasSuffix(target, "/") {
+ return target
+ }
+ return target[:strings.LastIndex(target, "/")+1]
+}
+
+func appendErrorSlice(scanErrors []formats.SimpleJsonError, errorsToAdd [][]formats.SimpleJsonError) []formats.SimpleJsonError {
+ for _, errorSlice := range errorsToAdd {
+ scanErrors = append(scanErrors, errorSlice...)
+ }
+ return scanErrors
+}
+
+func ConditionalUploadDefaultScanFunc(serverDetails *config.ServerDetails, fileSpec *spec.SpecFiles, threads int, scanOutputFormat format.OutputFormat) error {
+ return NewScanCommand().SetServerDetails(serverDetails).SetSpec(fileSpec).SetThreads(threads).SetOutputFormat(scanOutputFormat).SetFail(true).SetPrintExtendedTable(false).Run()
+}
diff --git a/commands/xray/curl/curl.go b/commands/xray/curl/curl.go
new file mode 100644
index 00000000..54a047a2
--- /dev/null
+++ b/commands/xray/curl/curl.go
@@ -0,0 +1,17 @@
+package curl
+
+import (
+ "github.com/jfrog/jfrog-cli-core/v2/common/commands"
+)
+
+type XrCurlCommand struct {
+ commands.CurlCommand
+}
+
+func NewXrCurlCommand(curlCommand commands.CurlCommand) *XrCurlCommand {
+ return &XrCurlCommand{curlCommand}
+}
+
+func (curlCmd *XrCurlCommand) CommandName() string {
+ return "xr_curl"
+}
diff --git a/commands/xray/offlineupdate/offlineupdate.go b/commands/xray/offlineupdate/offlineupdate.go
new file mode 100644
index 00000000..ccc49cad
--- /dev/null
+++ b/commands/xray/offlineupdate/offlineupdate.go
@@ -0,0 +1,463 @@
+package offlineupdate
+
+import (
+ "encoding/json"
+ "errors"
+ "fmt"
+ "golang.org/x/exp/maps"
+ "net/http"
+ "os"
+ "path/filepath"
+ "sort"
+ "strconv"
+ "strings"
+
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ "github.com/jfrog/jfrog-client-go/http/httpclient"
+ "github.com/jfrog/jfrog-client-go/utils"
+ "github.com/jfrog/jfrog-client-go/utils/errorutils"
+ "github.com/jfrog/jfrog-client-go/utils/io/fileutils"
+ "github.com/jfrog/jfrog-client-go/utils/io/httputils"
+ "github.com/jfrog/jfrog-client-go/utils/log"
+)
+
+const (
+ Vulnerability = "__vuln"
+ Component = "__comp"
+ JxrayDefaultBaseUrl = "https://jxray.jfrog.io/"
+ JxrayApiBundles = "api/v1/updates/bundles"
+ JxrayApiOnboarding = "api/v1/updates/onboarding"
+ periodicState = "periodic"
+ onboardingState = "onboarding"
+)
+
+func OfflineUpdate(flags *OfflineUpdatesFlags) error {
+ if shouldUseDBSyncV3(flags) {
+ return handleDBSyncV3OfflineUpdate(flags)
+ }
+ return handleDBSyncV1OfflineUpdate(flags)
+}
+
+// Verify that the given string is a valid stream name.
+func ValidateStream(stream string) (string, error) {
+ streams := NewValidStreams()
+ if streams.StreamsMap[stream] {
+ return stream, nil
+ }
+	return "", errorutils.CheckErrorf("Invalid stream type: %s. Possible values are: %v", stream, streams.GetValidStreamsString())
+}
+
+// Should use DBSync version 3 if the 'stream' flag was specified.
+func shouldUseDBSyncV3(flags *OfflineUpdatesFlags) bool {
+ return flags.Stream != ""
+}
+
+func handleDBSyncV1OfflineUpdate(flags *OfflineUpdatesFlags) error {
+ updatesUrl, err := buildUpdatesUrl(flags)
+ if err != nil {
+ return err
+ }
+ vulnerabilities, components, lastUpdate, err := getFilesList(updatesUrl, flags)
+ if err != nil {
+ return err
+ }
+ zipSuffix := "_" + strconv.FormatInt(lastUpdate, 10)
+ xrayTempDir, err := getXrayTempDir()
+ if err != nil {
+ return err
+ }
+
+ if flags.Target != "" && (len(vulnerabilities) > 0 || len(components) > 0) {
+ err = os.MkdirAll(flags.Target, 0777)
+ if errorutils.CheckError(err) != nil {
+ return err
+ }
+ }
+
+ if len(vulnerabilities) > 0 {
+ log.Info("Downloading vulnerabilities...")
+ err := saveData(xrayTempDir, "vuln", zipSuffix, flags.Target, vulnerabilities)
+ if err != nil {
+ return err
+ }
+ } else {
+ log.Info("There are no new vulnerabilities.")
+ }
+
+ if len(components) > 0 {
+ log.Info("Downloading components...")
+ err := saveData(xrayTempDir, "comp", zipSuffix, flags.Target, components)
+ if err != nil {
+ return err
+ }
+ } else {
+ log.Info("There are no new components.")
+ }
+ return nil
+}
+
+func getURLsToDownloadDBSyncV3(responseBody []byte, isPeriodicUpdate bool) ([]string, error) {
+ var onboardingResponse OnboardingResponse
+ var periodicResponse V3PeriodicUpdateResponse
+ var urlsToDownload []string
+ var err error
+ if isPeriodicUpdate {
+ err = json.Unmarshal(responseBody, &periodicResponse)
+ if err != nil {
+ return nil, errorutils.CheckError(err)
+ }
+ for _, packageUrl := range periodicResponse.Update {
+ urlsToDownload = append(urlsToDownload, packageUrl.DownloadUrl)
+ }
+ for _, packageUrl := range periodicResponse.Deletion {
+ urlsToDownload = append(urlsToDownload, packageUrl.DownloadUrl)
+ }
+ } else {
+ err = json.Unmarshal(responseBody, &onboardingResponse)
+ if err != nil {
+ return nil, errorutils.CheckError(err)
+ }
+ for _, packageUrl := range onboardingResponse {
+ urlsToDownload = append(urlsToDownload, packageUrl.DownloadUrl)
+ }
+ }
+ return urlsToDownload, nil
+}
+
+func createV3MetadataFile(state string, body []byte, destFolder string) (err error) {
+ metaDataFileName := state + ".json"
+ metaDataFile := filepath.Join(destFolder, metaDataFileName)
+ f, err := os.Create(metaDataFile)
+ if err != nil {
+ return errorutils.CheckError(err)
+ }
+	defer func() {
+		if cerr := f.Close(); cerr != nil && err == nil {
+			err = cerr
+		}
+	}()
+ _, err = f.Write(body)
+ return errorutils.CheckError(err)
+}
+
+func handleDBSyncV3OfflineUpdate(flags *OfflineUpdatesFlags) (err error) {
+ url := buildUrlDBSyncV3(flags)
+ log.Info("Getting updates...")
+ headers := make(map[string]string)
+ headers["X-Xray-License"] = flags.License
+ httpClientDetails := httputils.HttpClientDetails{
+ Headers: headers,
+ }
+ client, err := httpclient.ClientBuilder().SetRetries(3).Build()
+ if err != nil {
+ return err
+ }
+ resp, body, _, err := client.SendGet(url, false, httpClientDetails, "")
+ if errorutils.CheckError(err) != nil {
+ return err
+ }
+ if err = errorutils.CheckResponseStatusWithBody(resp, body, http.StatusOK); err != nil {
+ return err
+ }
+
+ urlsToDownload, err := getURLsToDownloadDBSyncV3(body, flags.IsPeriodicUpdate)
+ if err != nil {
+ return err
+ }
+
+ var state string
+ if flags.IsPeriodicUpdate {
+ state = periodicState
+ } else {
+ state = onboardingState
+ }
+ dataDir, err := os.MkdirTemp(flags.Target, "xray_downloaded_data")
+ if err != nil {
+ return err
+ }
+ defer func() {
+		if cerr := fileutils.RemoveTempDir(dataDir); cerr != nil && err == nil {
+ err = cerr
+ }
+ }()
+ err = downloadData(urlsToDownload, dataDir, createXrayFileNameFromUrlV3)
+ if err != nil {
+ return err
+ }
+
+ err = createV3MetadataFile(state, body, dataDir)
+ if err != nil {
+ return err
+ }
+
+ packageName := "xray_" + flags.Stream + "_update_package" + "_" + state
+ err = createZipArchive(dataDir, flags.Target, packageName, "")
+ if err != nil {
+ return err
+ }
+ return nil
+}
+
+func buildUrlDBSyncV3(flags *OfflineUpdatesFlags) string {
+ streams := NewValidStreams()
+ url := getJxRayBaseUrl() + "api/v3/updates/"
+	// Append the JAS stream path segment to the URL if needed.
+ if flags.Stream == streams.GetExposuresStream() {
+ url += streams.GetExposuresStream() + "/"
+ } else if flags.Stream == streams.GetContextualAnalysisStream() {
+ url += streams.GetContextualAnalysisStream() + "/"
+ }
+
+ if flags.IsPeriodicUpdate {
+ return url + periodicState
+ } else {
+ return url + onboardingState
+ }
+}
+
+func getJxRayBaseUrl() string {
+ jxRayBaseUrl := os.Getenv("JFROG_CLI_JXRAY_BASE_URL")
+ jxRayBaseUrl = utils.AddTrailingSlashIfNeeded(jxRayBaseUrl)
+ if jxRayBaseUrl == "" {
+ jxRayBaseUrl = JxrayDefaultBaseUrl
+ }
+ return jxRayBaseUrl
+}
+
+func getUpdatesBaseUrl(datesSpecified bool) string {
+ jxRayBaseUrl := getJxRayBaseUrl()
+ if datesSpecified {
+ return jxRayBaseUrl + JxrayApiBundles
+ }
+ return jxRayBaseUrl + JxrayApiOnboarding
+}
+
+func buildUpdatesUrl(flags *OfflineUpdatesFlags) (string, error) {
+ var queryParams string
+ datesSpecified := flags.From > 0 && flags.To > 0
+ if datesSpecified {
+ if err := validateDates(flags.From, flags.To); err != nil {
+ return "", err
+ }
+ queryParams += fmt.Sprintf("from=%v&to=%v", flags.From, flags.To)
+ }
+ if flags.Version != "" {
+ if queryParams != "" {
+ queryParams += "&"
+ }
+ queryParams += fmt.Sprintf("version=%v", flags.Version)
+ }
+ url := getUpdatesBaseUrl(datesSpecified)
+ if queryParams != "" {
+ url += "?" + queryParams
+ }
+ return url, nil
+}
+
+func validateDates(from, to int64) error {
+ if from < 0 || to < 0 {
+ err := errors.New("invalid dates")
+ return errorutils.CheckError(err)
+ }
+ if from > to {
+ err := errors.New("invalid dates range")
+ return errorutils.CheckError(err)
+ }
+ return nil
+}
+
+func getXrayTempDir() (string, error) {
+ xrayDir := filepath.Join(coreutils.GetCliPersistentTempDirPath(), "jfrog", "xray")
+ if err := os.MkdirAll(xrayDir, 0777); err != nil {
+ return "", errorutils.CheckError(err)
+ }
+ return xrayDir, nil
+}
+
+func downloadData(urlsList []string, dataDir string, fileNameFromUrlFunc func(string) (string, error)) error {
+ log.Info(fmt.Sprintf("Downloading updated packages to %s.", dataDir))
+ for _, url := range urlsList {
+ fileName, err := fileNameFromUrlFunc(url)
+ if err != nil {
+ return err
+ }
+ client, err := httpclient.ClientBuilder().SetRetries(3).Build()
+ if err != nil {
+ log.Error(fmt.Sprintf("Couldn't download from %s", url))
+ return err
+ }
+
+ details := &httpclient.DownloadFileDetails{
+ FileName: fileName,
+ DownloadPath: url,
+ LocalPath: dataDir,
+ LocalFileName: fileName}
+ response, _, err := client.SendHead(url, httputils.HttpClientDetails{}, "")
+ if err != nil {
+ return fmt.Errorf("couldn't get content length of %s. Error: %s", url, err.Error())
+ }
+ log.Info(fmt.Sprintf("Downloading updated package from %s. Content size: %.4f MB.", url, float64(response.ContentLength)/1000000))
+ _, err = client.DownloadFile(details, "", httputils.HttpClientDetails{}, false, false)
+ if err != nil {
+ return errorutils.CheckErrorf("Couldn't download from %s. Error: %s", url, err.Error())
+ }
+ }
+ log.Info("Download completed.")
+ return nil
+}
+
+func createZipArchive(dataDir, targetPath, filesPrefix, zipSuffix string) error {
+ log.Info("Zipping files.")
+ err := fileutils.ZipFolderFiles(dataDir, filepath.Join(targetPath, filesPrefix+zipSuffix+".zip"))
+ if err != nil {
+ return err
+ }
+ log.Info("Done zipping files.")
+ return nil
+}
+
+func saveData(xrayTmpDir, filesPrefix, zipSuffix, targetPath string, urlsList []string) (err error) {
+ dataDir, err := os.MkdirTemp(xrayTmpDir, filesPrefix)
+ if err != nil {
+ return err
+ }
+ defer func() {
+ if cerr := fileutils.RemoveTempDir(dataDir); cerr != nil && err == nil {
+ err = cerr
+ }
+ }()
+ err = downloadData(urlsList, dataDir, createXrayFileNameFromUrl)
+ if err != nil {
+ return err
+ }
+ err = createZipArchive(dataDir, targetPath, filesPrefix, zipSuffix)
+ if err != nil {
+ return err
+ }
+ return nil
+}
+
+func getUrlSections(url string) []string {
+ index := strings.Index(url, "?")
+ if index != -1 {
+ url = url[:index]
+ }
+ index = strings.Index(url, ";")
+ if index != -1 {
+ url = url[:index]
+ }
+ return strings.Split(url, "/")
+}
+
+func createXrayFileNameFromUrlV3(url string) (string, error) {
+ sections := getUrlSections(url)
+ length := len(sections)
+ return sections[length-1], nil
+}
+
+func createXrayFileNameFromUrl(url string) (fileName string, err error) {
+ sections := getUrlSections(url)
+ length := len(sections)
+ if length < 2 {
+ err = errorutils.CheckErrorf("Unexpected URL format: %s", url)
+ return
+ }
+ fileName = fmt.Sprintf("%s__%s", sections[length-2], sections[length-1])
+ return
+}
+
+func getFilesList(updatesUrl string, flags *OfflineUpdatesFlags) (vulnerabilities []string, components []string, lastUpdate int64, err error) {
+ log.Info("Getting updates...")
+ headers := make(map[string]string)
+ headers["X-Xray-License"] = flags.License
+ httpClientDetails := httputils.HttpClientDetails{
+ Headers: headers,
+ }
+ client, err := httpclient.ClientBuilder().SetRetries(3).Build()
+ if err != nil {
+ return
+ }
+ resp, body, _, err := client.SendGet(updatesUrl, false, httpClientDetails, "")
+ if errorutils.CheckError(err) != nil {
+ return
+ }
+ if err = errorutils.CheckResponseStatusWithBody(resp, body, http.StatusOK); err != nil {
+ return
+ }
+
+ var urls FilesList
+ err = json.Unmarshal(body, &urls)
+ if err != nil {
+		err = errorutils.CheckErrorf("Failed parsing JSON response: %s", string(body))
+ return
+ }
+
+ for _, v := range urls.Urls {
+ if strings.Contains(v, Vulnerability) {
+ vulnerabilities = append(vulnerabilities, v)
+ } else if strings.Contains(v, Component) {
+ components = append(components, v)
+ }
+ }
+ lastUpdate = urls.LastUpdate
+ return
+}
+
+// ValidStreams represents the valid values that can be provided to the 'stream' flag during offline updates.
+type ValidStreams struct {
+ StreamsMap map[string]bool
+}
+
+func NewValidStreams() *ValidStreams {
+ validStreams := &ValidStreams{StreamsMap: map[string]bool{}}
+ validStreams.StreamsMap[validStreams.GetPublicDataStream()] = true
+ validStreams.StreamsMap[validStreams.GetExposuresStream()] = true
+ validStreams.StreamsMap[validStreams.GetContextualAnalysisStream()] = true
+ return validStreams
+}
+
+func (vs *ValidStreams) GetValidStreamsString() string {
+ streams := maps.Keys(vs.StreamsMap)
+ sort.Sort(sort.Reverse(sort.StringSlice(streams)))
+ streamsStr := strings.Join(streams[0:len(streams)-1], ", ")
+ return fmt.Sprintf("%s and %s", streamsStr, streams[len(streams)-1])
+}
+
+func (vs *ValidStreams) GetPublicDataStream() string {
+ return "public_data"
+}
+
+func (vs *ValidStreams) GetExposuresStream() string {
+ return "exposures"
+}
+
+func (vs *ValidStreams) GetContextualAnalysisStream() string {
+ return "contextual_analysis"
+}
+
+type OfflineUpdatesFlags struct {
+ License string
+ From int64
+ To int64
+ Version string
+ Target string
+ Stream string
+ IsPeriodicUpdate bool
+}
+
+type FilesList struct {
+ LastUpdate int64
+ Urls []string
+}
+
+type V3UpdateResponseItem struct {
+ DownloadUrl string `json:"download_url"`
+ Timestamp int64 `json:"timestamp"`
+}
+
+type V3PeriodicUpdateResponse struct {
+ Update []V3UpdateResponseItem `json:"update"`
+ Deletion []V3UpdateResponseItem `json:"deletion"`
+}
+
+type OnboardingResponse []V3UpdateResponseItem
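The V3 endpoint routing implemented by `buildUrlDBSyncV3` is deterministic and easy to exercise in isolation. A standalone sketch of the same routing (base URL and stream names taken from the constants and stream getters in this file; `buildV3URL` is an illustrative name, not part of this patch):

```go
package main

import "fmt"

// buildV3URL mirrors buildUrlDBSyncV3: JAS streams (exposures,
// contextual_analysis) get their own path segment, and the final
// segment selects periodic vs. onboarding sync.
func buildV3URL(baseURL, stream string, periodic bool) string {
	url := baseURL + "api/v3/updates/"
	switch stream {
	case "exposures", "contextual_analysis":
		url += stream + "/"
	}
	if periodic {
		return url + "periodic"
	}
	return url + "onboarding"
}

func main() {
	fmt.Println(buildV3URL("https://jxray.jfrog.io/", "exposures", true))
	fmt.Println(buildV3URL("https://jxray.jfrog.io/", "public_data", false))
}
```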
diff --git a/commands/xray/offlineupdate/offlineupdate_test.go b/commands/xray/offlineupdate/offlineupdate_test.go
new file mode 100644
index 00000000..0c29b64d
--- /dev/null
+++ b/commands/xray/offlineupdate/offlineupdate_test.go
@@ -0,0 +1,157 @@
+package offlineupdate
+
+import (
+ "os"
+ "path/filepath"
+ "reflect"
+ "strings"
+ "testing"
+
+ "github.com/magiconair/properties/assert"
+)
+
+func TestCreateXrayFileNameFromUrl(t *testing.T) {
+ tests := []struct {
+ url string
+ fileName string
+ }{
+ {url: "https://dl.bintray.com/jfrog/jxray-data-dumps/2018-05/onboarding__vuln9__.json?expiry=1528473414900&id=K8v%2BJBItDfdcU9%2BBa2lxhmJRitQZPWsH89MtXs3pYfWKvRUwGNuUB8vcHv7EvJydaJGrwKm%2B%2FYAIAjMR3TaTm5VIRouiChTABPYbDNTNf%2F4%3D&signature=ePBfZuVZBljVvQTFHIpPH6lo7Qby%2Ban44resdLMo5f16W%2FX4Ou6gOleBHo5ViyWFy1tnFoPopk1XIQgB918ZFg%3D%3D", fileName: "2018-05__onboarding__vuln9__.json"},
+ {url: "https://dl.bintray.com/jfrog/jxray-data-dumps/2018-06-07/onboarding__vulnR1_1__.zip?expiry=1528711288481&id=K8v%2BJBItDfdcU9%2BBa2lxhmJRitQZPWsH89MtXs3pYfWKvRUwGNuUB8vcHv7EvJyd3g6UkTiZXV2mGxN1HD6KtovwjhKr4IdWuYKciRkl1agY487O8kk4jIOibc7paR2t&signature=eiB%2FcOMjWKjJdSybFr%2BPo56FmusgHvzvRMTnHSuHwIWvY5QnX2dIumsbIlMaVvL9xzoQObWHJ%2FMZyWnTVcv67g%3D%3D", fileName: "2018-06-07__onboarding__vulnR1_1__.zip"},
+ {url: "https://dl.bintray.com/jfrog/jxray-data-dumps/2018-05/onboarding__vulnR1_16__.zip?expiry=1528711287386&id=K8v%2BJBItDfdcU9%2BBa2lxhmJRitQZPWsH89MtXs3pYfWKvRUwGNuUB8vcHv7EvJydaJGrwKm%2B%2FYAIAjMR3TaTm9Wd2NqK5BiRQc33JGl4b0DZ9e%2B1cTtPyVtm5jxX9kIL&signature=HQKhgmRBtvJ1EwomgggR47M9TAWSegvWFUw9NItI5Cj22EZ2BqbhxIfcVAti8WJTjsKfAS2ap7yb%2BBmQilnSng%3D%3D", fileName: "2018-05__onboarding__vulnR1_16__.zip"},
+ {url: "https://dl.bintray.com/jfrog/jxray-data-dumps/2018-06-07/onboarding__vulnS1_1__.zip?expiry=1528711288397&id=K8v%2BJBItDfdcU9%2BBa2lxhmJRitQZPWsH89MtXs3pYfWKvRUwGNuUB8vcHv7EvJyd3g6UkTiZXV2mGxN1HD6KtozvQ8zhzPTSbLjftsv4zhgY487O8kk4jIOibc7paR2t&signature=P9XPWugJkqz5ekpEQrOkAbIsogAB7EOgq7ou1%2BTAPSOFfsKA9j1I%2Fhj94y%2BoIipYRtUUtSGCqXyRP%2B%2BG%2FscwmA%3D%3D", fileName: "2018-06-07__onboarding__vulnS1_1__.zip"},
+ }
+
+ for _, test := range tests {
+ fileName, err := createXrayFileNameFromUrl(test.url)
+ if err != nil {
+ t.Error(err)
+ }
+ assert.Equal(t, fileName, test.fileName)
+ }
+}
+
+func TestValidateStream(t *testing.T) {
+ streams := NewValidStreams()
+ type args struct {
+ streams string
+ }
+ tests := []struct {
+ name string
+ args args
+ want string
+ wantErr bool
+ }{
+ {"empty array", args{streams: ""}, "", true},
+ {"PublicData", args{streams: streams.GetPublicDataStream()}, streams.GetPublicDataStream(), false},
+ {"ContextualAnalysis", args{streams: streams.GetContextualAnalysisStream()}, streams.GetContextualAnalysisStream(), false},
+ {"Exposures", args{streams: streams.GetExposuresStream()}, streams.GetExposuresStream(), false},
+ {"invalid elements", args{streams: "bad_stream"}, "", true},
+ {"array", args{streams: streams.GetPublicDataStream() + ";" + streams.GetContextualAnalysisStream()}, "", true},
+ }
+ for _, tt := range tests {
+ t.Run(tt.name, func(t *testing.T) {
+ got, err := ValidateStream(tt.args.streams)
+ if (err != nil) != tt.wantErr {
+ t.Errorf("validateStream() error = %v, wantErr %v", err, tt.wantErr)
+ return
+ }
+ if !reflect.DeepEqual(got, tt.want) {
+ t.Errorf("validateStream() got = %v, want %v", got, tt.want)
+ }
+ })
+ }
+}
+
+// DBSync V3 test data
+
+var periodicUpdateResponse = "[{\"download_url\":\"some_url_to_package_update\",\"timestamp\":1234}]"
+var periodicDeletionResponse = "[{\"download_url\":\"some_url_to_package_delete\",\"timestamp\":1234}]"
+var periodicUpdateResponseSection = "\"update\":" + periodicUpdateResponse
+var periodicDeleteResponseSection = "\"deletion\":" + periodicDeletionResponse
+
+var periodicResponse = "{" + periodicUpdateResponseSection + "," + periodicDeleteResponseSection + "}"
+var onboardingResponse = "[{\"download_url\":\"some_url_to_package_onboard\",\"timestamp\":1234}]"
+
+func TestBuildUrlDBSyncV3(t *testing.T) {
+ streams := NewValidStreams()
+ tests := []struct {
+ flags *OfflineUpdatesFlags
+ expected string
+ }{
+ {&OfflineUpdatesFlags{Stream: streams.GetPublicDataStream(), IsPeriodicUpdate: true}, "api/v3/updates/periodic"},
+ {&OfflineUpdatesFlags{Stream: streams.GetPublicDataStream(), IsPeriodicUpdate: false}, "api/v3/updates/onboarding"},
+ {&OfflineUpdatesFlags{Stream: streams.GetExposuresStream(), IsPeriodicUpdate: true}, "api/v3/updates/exposures/periodic"},
+ {&OfflineUpdatesFlags{Stream: streams.GetExposuresStream(), IsPeriodicUpdate: false}, "api/v3/updates/exposures/onboarding"},
+ {&OfflineUpdatesFlags{Stream: streams.GetContextualAnalysisStream(), IsPeriodicUpdate: true}, "api/v3/updates/contextual_analysis/periodic"},
+ {&OfflineUpdatesFlags{Stream: streams.GetContextualAnalysisStream(), IsPeriodicUpdate: false}, "api/v3/updates/contextual_analysis/onboarding"},
+ }
+ for _, test := range tests {
+ url := buildUrlDBSyncV3(test.flags)
+		assert.True(t, strings.HasSuffix(url, test.expected))
+ }
+}
+
+func TestDBSyncV3getURLsToDownload(t *testing.T) {
+ tests := []struct {
+ serverResponse []byte
+ isPeriodic bool
+ expected []string
+ }{
+ {[]byte(periodicResponse), true, []string{"some_url_to_package_update", "some_url_to_package_delete"}},
+ {[]byte(onboardingResponse), false, []string{"some_url_to_package_onboard"}},
+ }
+
+ for _, test := range tests {
+ urls, err := getURLsToDownloadDBSyncV3(test.serverResponse, test.isPeriodic)
+ if err != nil {
+ t.Error(err)
+ }
+		assert.Equal(t, test.expected, urls)
+ }
+}
+
+func TestDBSyncV3createXrayFileNameFromURL(t *testing.T) {
+ tests := []struct {
+ url string
+ expected string
+ }{{"a/b/c/d.zip", "d.zip"}, {"x/y.zip", "y.zip"}}
+
+ for _, test := range tests {
+		fileName, err := createXrayFileNameFromUrlV3(test.url)
+		if err != nil {
+			t.Error(err)
+		}
+		assert.Equal(t, test.expected, fileName)
+ }
+}
+
+func TestDBSyncV3createV3MetadataFile(t *testing.T) {
+ tests := []struct {
+ serverResponse []byte
+ state string
+ expectedFilename string
+ }{
+ {[]byte(periodicResponse), periodicState, periodicState + ".json"},
+ {[]byte(onboardingResponse), onboardingState, onboardingState + ".json"},
+ }
+
+ for _, test := range tests {
+ dir := t.TempDir()
+ err := createV3MetadataFile(test.state, test.serverResponse, dir)
+ if err != nil {
+ t.Error(err)
+ }
+ fileContent, err := os.ReadFile(filepath.Join(dir, test.expectedFilename))
+ if err != nil {
+ t.Error(err)
+ }
+		assert.Equal(t, test.serverResponse, fileContent)
+ }
+}
+
+func TestGetValidStreamsString(t *testing.T) {
+ validStreams := NewValidStreams()
+ assert.Equal(t, 3, len(validStreams.StreamsMap))
+	assert.Equal(t, "public_data, exposures and contextual_analysis", validStreams.GetValidStreamsString())
+}
diff --git a/formats/conversion.go b/formats/conversion.go
new file mode 100644
index 00000000..1a360ec1
--- /dev/null
+++ b/formats/conversion.go
@@ -0,0 +1,196 @@
+package formats
+
+import (
+ "strconv"
+ "strings"
+)
+
+func ConvertToVulnerabilityTableRow(rows []VulnerabilityOrViolationRow) (tableRows []vulnerabilityTableRow) {
+ for i := range rows {
+ tableRows = append(tableRows, vulnerabilityTableRow{
+ severity: rows[i].Severity,
+ severityNumValue: rows[i].SeverityNumValue,
+ applicable: rows[i].Applicable,
+ impactedDependencyName: rows[i].ImpactedDependencyName,
+ impactedDependencyVersion: rows[i].ImpactedDependencyVersion,
+ impactedDependencyType: rows[i].ImpactedDependencyType,
+ fixedVersions: strings.Join(rows[i].FixedVersions, "\n"),
+ directDependencies: convertToComponentTableRow(rows[i].Components),
+ cves: convertToCveTableRow(rows[i].Cves),
+ issueId: rows[i].IssueId,
+ })
+ }
+ return
+}
+
+func ConvertToVulnerabilityScanTableRow(rows []VulnerabilityOrViolationRow) (tableRows []vulnerabilityScanTableRow) {
+ for i := range rows {
+ tableRows = append(tableRows, vulnerabilityScanTableRow{
+ severity: rows[i].Severity,
+ severityNumValue: rows[i].SeverityNumValue,
+ impactedPackageName: rows[i].ImpactedDependencyName,
+ impactedPackageVersion: rows[i].ImpactedDependencyVersion,
+ ImpactedPackageType: rows[i].ImpactedDependencyType,
+ fixedVersions: strings.Join(rows[i].FixedVersions, "\n"),
+ directPackages: convertToComponentScanTableRow(rows[i].Components),
+ cves: convertToCveTableRow(rows[i].Cves),
+ issueId: rows[i].IssueId,
+ })
+ }
+ return
+}
+
+func ConvertToLicenseViolationTableRow(rows []LicenseRow) (tableRows []licenseViolationTableRow) {
+ for i := range rows {
+ tableRows = append(tableRows, licenseViolationTableRow{
+ licenseKey: rows[i].LicenseKey,
+ severity: rows[i].Severity,
+ severityNumValue: rows[i].SeverityNumValue,
+ impactedDependencyName: rows[i].ImpactedDependencyName,
+ impactedDependencyVersion: rows[i].ImpactedDependencyVersion,
+ impactedDependencyType: rows[i].ImpactedDependencyType,
+ directDependencies: convertToComponentTableRow(rows[i].Components),
+ })
+ }
+ return
+}
+
+func ConvertToLicenseViolationScanTableRow(rows []LicenseRow) (tableRows []licenseViolationScanTableRow) {
+ for i := range rows {
+ tableRows = append(tableRows, licenseViolationScanTableRow{
+ licenseKey: rows[i].LicenseKey,
+ severity: rows[i].Severity,
+ severityNumValue: rows[i].SeverityNumValue,
+ impactedPackageName: rows[i].ImpactedDependencyName,
+ impactedPackageVersion: rows[i].ImpactedDependencyVersion,
+ impactedDependencyType: rows[i].ImpactedDependencyType,
+ directDependencies: convertToComponentScanTableRow(rows[i].Components),
+ })
+ }
+ return
+}
+
+func ConvertToLicenseTableRow(rows []LicenseRow) (tableRows []licenseTableRow) {
+ for i := range rows {
+ tableRows = append(tableRows, licenseTableRow{
+ licenseKey: rows[i].LicenseKey,
+ impactedDependencyName: rows[i].ImpactedDependencyName,
+ impactedDependencyVersion: rows[i].ImpactedDependencyVersion,
+ impactedDependencyType: rows[i].ImpactedDependencyType,
+ directDependencies: convertToComponentTableRow(rows[i].Components),
+ })
+ }
+ return
+}
+
+func ConvertToLicenseScanTableRow(rows []LicenseRow) (tableRows []licenseScanTableRow) {
+ for i := range rows {
+ tableRows = append(tableRows, licenseScanTableRow{
+ licenseKey: rows[i].LicenseKey,
+ impactedPackageName: rows[i].ImpactedDependencyName,
+ impactedPackageVersion: rows[i].ImpactedDependencyVersion,
+ impactedDependencyType: rows[i].ImpactedDependencyType,
+ directDependencies: convertToComponentScanTableRow(rows[i].Components),
+ })
+ }
+ return
+}
+
+func ConvertToOperationalRiskViolationTableRow(rows []OperationalRiskViolationRow) (tableRows []operationalRiskViolationTableRow) {
+ for i := range rows {
+ tableRows = append(tableRows, operationalRiskViolationTableRow{
+ Severity: rows[i].Severity,
+ severityNumValue: rows[i].SeverityNumValue,
+ impactedDependencyName: rows[i].ImpactedDependencyName,
+ impactedDependencyVersion: rows[i].ImpactedDependencyVersion,
+ impactedDependencyType: rows[i].ImpactedDependencyType,
+ directDependencies: convertToComponentTableRow(rows[i].Components),
+ isEol: rows[i].IsEol,
+ cadence: rows[i].Cadence,
+ Commits: rows[i].Commits,
+ committers: rows[i].Committers,
+ newerVersions: rows[i].NewerVersions,
+ latestVersion: rows[i].LatestVersion,
+ riskReason: rows[i].RiskReason,
+ eolMessage: rows[i].EolMessage,
+ })
+ }
+ return
+}
+
+func ConvertToOperationalRiskViolationScanTableRow(rows []OperationalRiskViolationRow) (tableRows []operationalRiskViolationScanTableRow) {
+ for i := range rows {
+ tableRows = append(tableRows, operationalRiskViolationScanTableRow{
+ Severity: rows[i].Severity,
+ severityNumValue: rows[i].SeverityNumValue,
+ impactedPackageName: rows[i].ImpactedDependencyName,
+ impactedPackageVersion: rows[i].ImpactedDependencyVersion,
+ impactedDependencyType: rows[i].ImpactedDependencyType,
+ directDependencies: convertToComponentScanTableRow(rows[i].Components),
+ isEol: rows[i].IsEol,
+ cadence: rows[i].Cadence,
+ commits: rows[i].Commits,
+ committers: rows[i].Committers,
+ newerVersions: rows[i].NewerVersions,
+ latestVersion: rows[i].LatestVersion,
+ riskReason: rows[i].RiskReason,
+ eolMessage: rows[i].EolMessage,
+ })
+ }
+ return
+}
+
+func ConvertToSecretsTableRow(rows []SourceCodeRow) (tableRows []secretsTableRow) {
+ for i := range rows {
+ tableRows = append(tableRows, secretsTableRow{
+ severity: rows[i].Severity,
+ file: rows[i].File,
+ lineColumn: strconv.Itoa(rows[i].StartLine) + ":" + strconv.Itoa(rows[i].StartColumn),
+ secret: rows[i].Snippet,
+ })
+ }
+ return
+}
+
+func ConvertToIacOrSastTableRow(rows []SourceCodeRow) (tableRows []iacOrSastTableRow) {
+ for i := range rows {
+ tableRows = append(tableRows, iacOrSastTableRow{
+ severity: rows[i].Severity,
+ file: rows[i].File,
+ lineColumn: strconv.Itoa(rows[i].StartLine) + ":" + strconv.Itoa(rows[i].StartColumn),
+ finding: rows[i].Finding,
+ })
+ }
+ return
+}
+
+func convertToComponentTableRow(rows []ComponentRow) (tableRows []directDependenciesTableRow) {
+ for i := range rows {
+ tableRows = append(tableRows, directDependenciesTableRow{
+ name: rows[i].Name,
+ version: rows[i].Version,
+ })
+ }
+ return
+}
+
+func convertToComponentScanTableRow(rows []ComponentRow) (tableRows []directPackagesTableRow) {
+ for i := range rows {
+ tableRows = append(tableRows, directPackagesTableRow{
+ name: rows[i].Name,
+ version: rows[i].Version,
+ })
+ }
+ return
+}
+
+func convertToCveTableRow(rows []CveRow) (tableRows []cveTableRow) {
+ for i := range rows {
+ tableRows = append(tableRows, cveTableRow{
+ id: rows[i].Id,
+ cvssV2: rows[i].CvssV2,
+ cvssV3: rows[i].CvssV3,
+ })
+ }
+ return
+}
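
The conversion helpers above all follow one shape: iterate rows by index, flatten slice fields into display strings, and copy nested component rows. A self-contained sketch of that pattern, with hypothetical stand-in types (not the package's real ones):

```go
package main

import (
	"fmt"
	"strings"
)

// Hypothetical stand-ins for the formats package types, for illustration only.
type componentRow struct{ Name, Version string }

type apiRow struct {
	Severity      string
	FixedVersions []string
	Components    []componentRow
}

type tableRow struct {
	severity      string
	fixedVersions string // newline-joined so table cells render multi-line
	direct        []componentRow
}

// convert mirrors the ConvertTo*TableRow pattern: iterate by index,
// flatten slices into display strings, and copy nested rows as-is.
func convert(rows []apiRow) (tableRows []tableRow) {
	for i := range rows {
		tableRows = append(tableRows, tableRow{
			severity:      rows[i].Severity,
			fixedVersions: strings.Join(rows[i].FixedVersions, "\n"),
			direct:        rows[i].Components,
		})
	}
	return
}

func main() {
	rows := convert([]apiRow{{
		Severity:      "High",
		FixedVersions: []string{"1.2.3", "2.0.0"},
		Components:    []componentRow{{Name: "lodash", Version: "4.17.0"}},
	}})
	fmt.Printf("%s fixed in: %s\n", rows[0].severity, strings.ReplaceAll(rows[0].fixedVersions, "\n", ", "))
}
```

Iterating with `for i := range rows` rather than a value copy avoids duplicating the row struct on each step, which matters for the larger rows in this package.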
diff --git a/formats/simplejsonapi.go b/formats/simplejsonapi.go
new file mode 100644
index 00000000..81e90baf
--- /dev/null
+++ b/formats/simplejsonapi.go
@@ -0,0 +1,123 @@
+package formats
+
+import "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+
+// The structs in this file must NOT be changed!
+// They serve as the public API of the simple-json format, so changing their structure or their 'json' tags will break that API.
+
+// SimpleJsonResults holds the sorted results of the simple-json output.
+type SimpleJsonResults struct {
+ Vulnerabilities []VulnerabilityOrViolationRow `json:"vulnerabilities"`
+ SecurityViolations []VulnerabilityOrViolationRow `json:"securityViolations"`
+ LicensesViolations []LicenseRow `json:"licensesViolations"`
+ Licenses []LicenseRow `json:"licenses"`
+ OperationalRiskViolations []OperationalRiskViolationRow `json:"operationalRiskViolations"`
+ Secrets []SourceCodeRow `json:"secrets"`
+ Iacs []SourceCodeRow `json:"iacViolations"`
+ Sast []SourceCodeRow `json:"sastViolations"`
+ Errors []SimpleJsonError `json:"errors"`
+}
+
+type SeverityDetails struct {
+ Severity string `json:"severity"`
+ SeverityNumValue int `json:"-"` // For sorting
+}
+
+type ImpactedDependencyDetails struct {
+ SeverityDetails
+ ImpactedDependencyName string `json:"impactedPackageName"`
+ ImpactedDependencyVersion string `json:"impactedPackageVersion"`
+ ImpactedDependencyType string `json:"impactedPackageType"`
+ Components []ComponentRow `json:"components"`
+}
+
+// Used for vulnerabilities and security violations
+type VulnerabilityOrViolationRow struct {
+ ImpactedDependencyDetails
+ Summary string `json:"summary"`
+ Applicable string `json:"applicable"`
+ FixedVersions []string `json:"fixedVersions"`
+ Cves []CveRow `json:"cves"`
+ IssueId string `json:"issueId"`
+ References []string `json:"references"`
+ ImpactPaths [][]ComponentRow `json:"impactPaths"`
+ JfrogResearchInformation *JfrogResearchInformation `json:"jfrogResearchInformation"`
+ Technology coreutils.Technology `json:"-"`
+}
+
+type LicenseRow struct {
+ ImpactedDependencyDetails
+ LicenseKey string `json:"licenseKey"`
+ ImpactPaths [][]ComponentRow `json:"impactPaths"`
+}
+
+type OperationalRiskViolationRow struct {
+ ImpactedDependencyDetails
+ RiskReason string `json:"riskReason"`
+ IsEol string `json:"isEndOfLife"`
+ EolMessage string `json:"endOfLifeMessage"`
+ Cadence string `json:"cadence"`
+ Commits string `json:"commits"`
+ Committers string `json:"committers"`
+ NewerVersions string `json:"newerVersions"`
+ LatestVersion string `json:"latestVersion"`
+}
+
+type SourceCodeRow struct {
+ SeverityDetails
+ Location
+ Finding string `json:"finding,omitempty"`
+ ScannerDescription string `json:"scannerDescription,omitempty"`
+ CodeFlow [][]Location `json:"codeFlow,omitempty"`
+}
+
+type Location struct {
+ File string `json:"file"`
+ StartLine int `json:"startLine,omitempty"`
+ StartColumn int `json:"startColumn,omitempty"`
+ EndLine int `json:"endLine,omitempty"`
+ EndColumn int `json:"endColumn,omitempty"`
+ Snippet string `json:"snippet,omitempty"`
+}
+
+type ComponentRow struct {
+ Name string `json:"name"`
+ Version string `json:"version"`
+}
+
+type CveRow struct {
+ Id string `json:"id"`
+ CvssV2 string `json:"cvssV2"`
+ CvssV3 string `json:"cvssV3"`
+ Applicability *Applicability `json:"applicability,omitempty"`
+}
+
+type Applicability struct {
+ Status string `json:"status"`
+ ScannerDescription string `json:"scannerDescription,omitempty"`
+ Evidence []Evidence `json:"evidence,omitempty"`
+}
+
+type Evidence struct {
+ Location
+ Reason string `json:"reason,omitempty"`
+}
+
+type SimpleJsonError struct {
+ FilePath string `json:"filePath"`
+ ErrorMessage string `json:"errorMessage"`
+}
+
+type JfrogResearchInformation struct {
+ SeverityDetails
+ Summary string `json:"summary,omitempty"`
+ Details string `json:"details,omitempty"`
+ SeverityReasons []JfrogResearchSeverityReason `json:"severityReasons,omitempty"`
+ Remediation string `json:"remediation,omitempty"`
+}
+
+type JfrogResearchSeverityReason struct {
+ Name string `json:"name,omitempty"`
+ Description string `json:"description,omitempty"`
+ IsPositive bool `json:"isPositive,omitempty"`
+}
diff --git a/formats/table.go b/formats/table.go
new file mode 100644
index 00000000..8dd71d1d
--- /dev/null
+++ b/formats/table.go
@@ -0,0 +1,137 @@
+package formats
+
+// Structs in this file are used for the 'table' format output of the scan/audit commands.
+// Field annotations follow the conventions documented on tableutils.PrintTable.
+// Use the conversion methods in this package to convert from the API structs to the table structs.
+
+// Used for vulnerabilities and security violations
+type vulnerabilityTableRow struct {
+ severity string `col-name:"Severity"`
+ applicable string `col-name:"Contextual\nAnalysis" omitempty:"true"`
+ // For sorting
+ severityNumValue int
+ directDependencies []directDependenciesTableRow `embed-table:"true"`
+ impactedDependencyName string `col-name:"Impacted\nDependency\nName"`
+ impactedDependencyVersion string `col-name:"Impacted\nDependency\nVersion"`
+ fixedVersions string `col-name:"Fixed\nVersions"`
+ impactedDependencyType string `col-name:"Type"`
+ cves []cveTableRow `embed-table:"true"`
+ issueId string `col-name:"Issue ID" extended:"true"`
+}
+
+type vulnerabilityScanTableRow struct {
+ severity string `col-name:"Severity"`
+ // For sorting
+ severityNumValue int
+ directPackages []directPackagesTableRow `embed-table:"true"`
+ impactedPackageName string `col-name:"Impacted\nPackage\nName"`
+ impactedPackageVersion string `col-name:"Impacted\nPackage\nVersion"`
+ fixedVersions string `col-name:"Fixed\nVersions"`
+ ImpactedPackageType string `col-name:"Type"`
+ cves []cveTableRow `embed-table:"true"`
+ issueId string `col-name:"Issue ID" extended:"true"`
+}
+
+type licenseTableRow struct {
+ licenseKey string `col-name:"License"`
+ directDependencies []directDependenciesTableRow `embed-table:"true"`
+ impactedDependencyName string `col-name:"Impacted\nDependency"`
+ impactedDependencyVersion string `col-name:"Impacted\nDependency\nVersion"`
+ impactedDependencyType string `col-name:"Type"`
+}
+
+type licenseScanTableRow struct {
+ licenseKey string `col-name:"License"`
+ directDependencies []directPackagesTableRow `embed-table:"true"`
+ impactedPackageName string `col-name:"Impacted\nPackage"`
+ impactedPackageVersion string `col-name:"Impacted\nPackage\nVersion"`
+ impactedDependencyType string `col-name:"Type"`
+}
+
+type licenseViolationTableRow struct {
+ licenseKey string `col-name:"License"`
+ severity string `col-name:"Severity"`
+ // For sorting
+ severityNumValue int
+ directDependencies []directDependenciesTableRow `embed-table:"true"`
+ impactedDependencyName string `col-name:"Impacted\nDependency"`
+ impactedDependencyVersion string `col-name:"Impacted\nDependency\nVersion"`
+ impactedDependencyType string `col-name:"Type"`
+}
+
+type licenseViolationScanTableRow struct {
+ licenseKey string `col-name:"License"`
+ severity string `col-name:"Severity"`
+ // For sorting
+ severityNumValue int
+ directDependencies []directPackagesTableRow `embed-table:"true"`
+ impactedPackageName string `col-name:"Impacted\nPackage"`
+ impactedPackageVersion string `col-name:"Impacted\nPackage\nVersion"`
+ impactedDependencyType string `col-name:"Type"`
+}
+
+type operationalRiskViolationTableRow struct {
+ Severity string `col-name:"Severity"`
+ // For sorting
+ severityNumValue int
+ directDependencies []directDependenciesTableRow `embed-table:"true"`
+ impactedDependencyName string `col-name:"Impacted\nDependency"`
+ impactedDependencyVersion string `col-name:"Impacted\nDependency\nVersion"`
+ impactedDependencyType string `col-name:"Type"`
+ riskReason string `col-name:"Risk\nReason"`
+ isEol string `col-name:"Is\nEnd\nOf\nLife" extended:"true"`
+ eolMessage string `col-name:"End\nOf\nLife\nMessage" extended:"true"`
+ cadence string `col-name:"Cadence" extended:"true"`
+ Commits string `col-name:"Commits" extended:"true"`
+ committers string `col-name:"Committers" extended:"true"`
+ newerVersions string `col-name:"Newer\nVersions" extended:"true"`
+ latestVersion string `col-name:"Latest\nVersion" extended:"true"`
+}
+
+type operationalRiskViolationScanTableRow struct {
+ Severity string `col-name:"Severity"`
+ // For sorting
+ severityNumValue int
+ directDependencies []directPackagesTableRow `embed-table:"true"`
+ impactedPackageName string `col-name:"Impacted\nPackage"`
+ impactedPackageVersion string `col-name:"Impacted\nPackage\nVersion"`
+ impactedDependencyType string `col-name:"Type"`
+ riskReason string `col-name:"Risk\nReason"`
+ isEol string `col-name:"Is\nEnd\nOf\nLife" extended:"true"`
+ eolMessage string `col-name:"End\nOf\nLife\nMessage" extended:"true"`
+ cadence string `col-name:"Cadence" extended:"true"`
+ commits string `col-name:"Commits" extended:"true"`
+ committers string `col-name:"Committers" extended:"true"`
+ newerVersions string `col-name:"Newer\nVersions" extended:"true"`
+ latestVersion string `col-name:"Latest\nVersion" extended:"true"`
+}
+
+type directDependenciesTableRow struct {
+ name string `col-name:"Direct\nDependency"`
+ version string `col-name:"Direct\nDependency\nVersion"`
+}
+
+type directPackagesTableRow struct {
+ name string `col-name:"Direct\nPackage"`
+ version string `col-name:"Direct\nPackage\nVersion"`
+}
+
+type cveTableRow struct {
+ id string `col-name:"CVE"`
+ cvssV2 string `col-name:"CVSS\nv2" extended:"true"`
+ cvssV3 string `col-name:"CVSS\nv3" extended:"true"`
+}
+
+type secretsTableRow struct {
+ severity string `col-name:"Severity"`
+ file string `col-name:"File"`
+ lineColumn string `col-name:"Line:Column"`
+ secret string `col-name:"Secret"`
+}
+
+type iacOrSastTableRow struct {
+ severity string `col-name:"Severity"`
+ file string `col-name:"File"`
+ lineColumn string `col-name:"Line:Column"`
+ finding string `col-name:"Finding"`
+}
diff --git a/go.mod b/go.mod
index 0b286e2a..e81b863f 100644
--- a/go.mod
+++ b/go.mod
@@ -3,9 +3,19 @@ module github.com/jfrog/jfrog-cli-security
go 1.20
require (
+ github.com/gookit/color v1.5.4
+ github.com/jfrog/build-info-go v1.9.20
+ github.com/jfrog/gofrog v1.5.0
+ github.com/jfrog/jfrog-apps-config v1.0.1
github.com/jfrog/jfrog-cli-core/v2 v2.46.2
- github.com/jfrog/jfrog-client-go v1.35.5
+ github.com/jfrog/jfrog-client-go v1.35.6
+ github.com/magiconair/properties v1.8.7
+ github.com/owenrumney/go-sarif/v2 v2.3.0
github.com/stretchr/testify v1.8.4
+ golang.org/x/exp v0.0.0-20240112132812-db7319d0e0e3
+ golang.org/x/sync v0.6.0
+ golang.org/x/text v0.14.0
+ gopkg.in/yaml.v3 v3.0.1
)
require (
@@ -14,8 +24,11 @@ require (
github.com/CycloneDX/cyclonedx-go v0.7.2 // indirect
github.com/Microsoft/go-winio v0.6.1 // indirect
github.com/ProtonMail/go-crypto v0.0.0-20230923063757-afb1ddc0824c // indirect
+ github.com/VividCortex/ewma v1.2.0 // indirect
+ github.com/acarl005/stripansi v0.0.0-20180116102854-5a71ef0e047d // indirect
github.com/andybalholm/brotli v1.0.1 // indirect
github.com/buger/jsonparser v1.1.1 // indirect
+ github.com/c-bata/go-prompt v0.2.5 // indirect
github.com/chzyer/readline v1.5.1 // indirect
github.com/cloudflare/circl v1.3.3 // indirect
github.com/cpuguy83/go-md2man/v2 v2.0.2 // indirect
@@ -28,32 +41,32 @@ require (
github.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376 // indirect
github.com/go-git/go-billy/v5 v5.5.0 // indirect
github.com/go-git/go-git/v5 v5.11.0 // indirect
+ github.com/gocarina/gocsv v0.0.0-20231116093920-b87c2d0e983a // indirect
github.com/golang-jwt/jwt/v4 v4.5.0 // indirect
github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da // indirect
github.com/golang/snappy v0.0.2 // indirect
github.com/google/uuid v1.5.0 // indirect
- github.com/gookit/color v1.5.4 // indirect
github.com/hashicorp/hcl v1.0.0 // indirect
github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99 // indirect
- github.com/jedib0t/go-pretty/v6 v6.4.0 // indirect
- github.com/jfrog/build-info-go v1.9.19 // indirect
- github.com/jfrog/gofrog v1.4.0 // indirect
+ github.com/jedib0t/go-pretty/v6 v6.5.3 // indirect
+ github.com/jfrog/archiver/v3 v3.5.3 // indirect
github.com/kevinburke/ssh_config v1.2.0 // indirect
github.com/klauspost/compress v1.17.0 // indirect
github.com/klauspost/cpuid/v2 v2.2.3 // indirect
github.com/klauspost/pgzip v1.2.5 // indirect
- github.com/magiconair/properties v1.8.7 // indirect
github.com/manifoldco/promptui v0.9.0 // indirect
- github.com/mattn/go-runewidth v0.0.13 // indirect
- github.com/mholt/archiver/v3 v3.5.1 // indirect
+ github.com/mattn/go-colorable v0.1.13 // indirect
+ github.com/mattn/go-isatty v0.0.17 // indirect
+ github.com/mattn/go-runewidth v0.0.15 // indirect
+ github.com/mattn/go-tty v0.0.3 // indirect
github.com/minio/sha256-simd v1.0.1 // indirect
github.com/mitchellh/mapstructure v1.5.0 // indirect
github.com/nwaples/rardecode v1.1.0 // indirect
- github.com/owenrumney/go-sarif/v2 v2.3.0 // indirect
github.com/pelletier/go-toml/v2 v2.1.0 // indirect
github.com/pierrec/lz4/v4 v4.1.2 // indirect
github.com/pjbgf/sha1cd v0.3.0 // indirect
- github.com/pkg/browser v0.0.0-20210911075715-681adbf594b8 // indirect
+ github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c // indirect
+ github.com/pkg/term v1.1.0 // indirect
github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2 // indirect
github.com/rivo/uniseg v0.4.3 // indirect
github.com/russross/blackfriday/v2 v2.1.0 // indirect
@@ -69,23 +82,20 @@ require (
github.com/subosito/gotenv v1.6.0 // indirect
github.com/ulikunitz/xz v0.5.9 // indirect
github.com/urfave/cli v1.22.14 // indirect
+ github.com/vbauerster/mpb/v7 v7.5.3 // indirect
github.com/xanzy/ssh-agent v0.3.3 // indirect
github.com/xi2/xz v0.0.0-20171230120015-48954b6210f8 // indirect
github.com/xo/terminfo v0.0.0-20210125001918-ca9a967f8778 // indirect
go.uber.org/atomic v1.9.0 // indirect
go.uber.org/multierr v1.9.0 // indirect
- golang.org/x/crypto v0.17.0 // indirect
- golang.org/x/exp v0.0.0-20231226003508-02704c960a9b // indirect
+ golang.org/x/crypto v0.18.0 // indirect
golang.org/x/mod v0.14.0 // indirect
- golang.org/x/net v0.19.0 // indirect
- golang.org/x/sync v0.5.0 // indirect
- golang.org/x/sys v0.15.0 // indirect
- golang.org/x/term v0.15.0 // indirect
- golang.org/x/text v0.14.0 // indirect
- golang.org/x/tools v0.16.0 // indirect
+ golang.org/x/net v0.20.0 // indirect
+ golang.org/x/sys v0.16.0 // indirect
+ golang.org/x/term v0.16.0 // indirect
+ golang.org/x/tools v0.17.0 // indirect
gopkg.in/ini.v1 v1.67.0 // indirect
gopkg.in/warnings.v0 v0.1.2 // indirect
- gopkg.in/yaml.v3 v3.0.1 // indirect
)
-replace github.com/jfrog/jfrog-cli-core/v2 => github.com/attiasas/jfrog-cli-core/v2 v2.0.0-20231231090348-2b7cc2486048
+replace github.com/jfrog/jfrog-cli-core/v2 => github.com/jfrog/jfrog-cli-core/v2 v2.31.1-0.20240118100957-b4e1537e91dd
diff --git a/go.sum b/go.sum
index 3dbe6280..fb521b7f 100644
--- a/go.sum
+++ b/go.sum
@@ -9,17 +9,21 @@ github.com/Microsoft/go-winio v0.6.1 h1:9/kr64B9VUZrLm5YYwbGtUJnMgqWVOdUAXu6Migc
github.com/Microsoft/go-winio v0.6.1/go.mod h1:LRdKpFKfdobln8UmuiYcKPot9D2v6svN5+sAH+4kjUM=
github.com/ProtonMail/go-crypto v0.0.0-20230923063757-afb1ddc0824c h1:kMFnB0vCcX7IL/m9Y5LO+KQYv+t1CQOiFe6+SV2J7bE=
github.com/ProtonMail/go-crypto v0.0.0-20230923063757-afb1ddc0824c/go.mod h1:EjAoLdwvbIOoOQr3ihjnSoLZRtE8azugULFRteWMNc0=
+github.com/VividCortex/ewma v1.2.0 h1:f58SaIzcDXrSy3kWaHNvuJgJ3Nmz59Zji6XoJR/q1ow=
+github.com/VividCortex/ewma v1.2.0/go.mod h1:nz4BbCtbLyFDeC9SUHbtcT5644juEuWfUAUnGx7j5l4=
+github.com/acarl005/stripansi v0.0.0-20180116102854-5a71ef0e047d h1:licZJFw2RwpHMqeKTCYkitsPqHNxTmd4SNR5r94FGM8=
+github.com/acarl005/stripansi v0.0.0-20180116102854-5a71ef0e047d/go.mod h1:asat636LX7Bqt5lYEZ27JNDcqxfjdBQuJ/MM4CN/Lzo=
github.com/andybalholm/brotli v1.0.1 h1:KqhlKozYbRtJvsPrrEeXcO+N2l6NYT5A2QAFmSULpEc=
github.com/andybalholm/brotli v1.0.1/go.mod h1:loMXtMfwqflxFJPmdbJO0a3KNoPuLBgiu3qAvBg8x/Y=
github.com/anmitsu/go-shlex v0.0.0-20200514113438-38f4b401e2be h1:9AeTilPcZAjCFIImctFaOjnTIavg87rW78vTPkQqLI8=
github.com/apparentlymart/go-textseg/v13 v13.0.0/go.mod h1:ZK2fH7c4NqDTLtiYLvIkEghdlcqw7yxLeM89kiTRPUo=
github.com/armon/go-socks5 v0.0.0-20160902184237-e75332964ef5 h1:0CwZNZbxp69SHPdPJAN/hZIm0C4OItdklCFmMRWYpio=
-github.com/attiasas/jfrog-cli-core/v2 v2.0.0-20231231090348-2b7cc2486048 h1:CgwdiO5lMeu9nIa3a4p6FXQ4J2Hw4uRHmjW44mGlirQ=
-github.com/attiasas/jfrog-cli-core/v2 v2.0.0-20231231090348-2b7cc2486048/go.mod h1:l5y34dJhQ0W16o7OrCUjTQdGikoZPKTRI1NKGneoJ0g=
github.com/bradleyjkemp/cupaloy/v2 v2.8.0 h1:any4BmKE+jGIaMpnU8YgH/I2LPiLBufr6oMMlVBbn9M=
github.com/buger/jsonparser v1.1.1 h1:2PnMjfWD7wBILjqQbt530v576A/cAbQvEW9gGIpYMUs=
github.com/buger/jsonparser v1.1.1/go.mod h1:6RYKKt7H4d4+iWqouImQ9R2FZql3VbhNgx27UK13J/0=
github.com/bwesterb/go-ristretto v1.2.3/go.mod h1:fUIoIZaG73pV5biE2Blr2xEzDoMj7NFEuV9ekS419A0=
+github.com/c-bata/go-prompt v0.2.5 h1:3zg6PecEywxNn0xiqcXHD96fkbxghD+gdB2tbsYfl+Y=
+github.com/c-bata/go-prompt v0.2.5/go.mod h1:vFnjEGDIIA/Lib7giyE4E9c50Lvl8j0S+7FVlAwDAVw=
github.com/chzyer/logex v1.1.10/go.mod h1:+Ywpsq7O8HXn0nuIou7OrIPyXbp3wmkHB+jjWRnGsAI=
github.com/chzyer/logex v1.2.1 h1:XHDu3E6q+gdHgsdTPH6ImJMIp436vR6MPtH8gP05QzM=
github.com/chzyer/logex v1.2.1/go.mod h1:JLbx6lG2kDbNRFnfkgvh4eRJRPX1QCoOIWomwysCBrQ=
@@ -58,6 +62,8 @@ github.com/go-git/go-billy/v5 v5.5.0/go.mod h1:hmexnoNsr2SJU1Ju67OaNz5ASJY3+sHgF
github.com/go-git/go-git-fixtures/v4 v4.3.2-0.20231010084843-55a94097c399 h1:eMje31YglSBqCdIqdhKBW8lokaMrL3uTkpGYlE2OOT4=
github.com/go-git/go-git/v5 v5.11.0 h1:XIZc1p+8YzypNr34itUfSvYJcv+eYdTnTvOZ2vD3cA4=
github.com/go-git/go-git/v5 v5.11.0/go.mod h1:6GFcX2P3NM7FPBfpePbpLd21XxsgdAt+lKqXmCUiUCY=
+github.com/gocarina/gocsv v0.0.0-20231116093920-b87c2d0e983a h1:RYfmiM0zluBJOiPDJseKLEN4BapJ42uSi9SZBQ2YyiA=
+github.com/gocarina/gocsv v0.0.0-20231116093920-b87c2d0e983a/go.mod h1:5YoVOkjYAQumqlV356Hj3xeYh4BdZuLE0/nRkf2NKkI=
github.com/golang-jwt/jwt/v4 v4.5.0 h1:7cYmW1XlMY7h7ii7UhUyChSgS5wUJEnm9uZVTGqOWzg=
github.com/golang-jwt/jwt/v4 v4.5.0/go.mod h1:m21LjoU+eqJr34lmDMbreY2eSTRJ1cv77w39/MY0Ch0=
github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da h1:oI5xCqsCo564l8iNU+DwB5epxmsaqB+rhGL0m5jtYqE=
@@ -77,14 +83,20 @@ github.com/hashicorp/hcl v1.0.0 h1:0Anlzjpi4vEasTeNFn2mLJgTSwt0+6sfsiTG8qcWGx4=
github.com/hashicorp/hcl v1.0.0/go.mod h1:E5yfLk+7swimpb2L/Alb/PJmXilQ/rhwaUYs4T20WEQ=
github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99 h1:BQSFePA1RWJOlocH6Fxy8MmwDt+yVQYULKfN0RoTN8A=
github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99/go.mod h1:1lJo3i6rXxKeerYnT8Nvf0QmHCRC1n8sfWVwXF2Frvo=
-github.com/jedib0t/go-pretty/v6 v6.4.0 h1:YlI/2zYDrweA4MThiYMKtGRfT+2qZOO65ulej8GTcVI=
-github.com/jedib0t/go-pretty/v6 v6.4.0/go.mod h1:MgmISkTWDSFu0xOqiZ0mKNntMQ2mDgOcwOkwBEkMDJI=
-github.com/jfrog/build-info-go v1.9.19 h1:tFPR0Je+ETLXcJqa7UrICkSjwc27zeY06AoWaMYPdQI=
-github.com/jfrog/build-info-go v1.9.19/go.mod h1:DBxqvz1N/uI9iI/1gkCfjKjOrlcCzQ3hiKXqtKJUrrY=
-github.com/jfrog/gofrog v1.4.0 h1:s7eysVnmIBfVheMs4LPU43MAlxwPa4K8u2N5h7kwzXA=
-github.com/jfrog/gofrog v1.4.0/go.mod h1:AQo5Fq0G9nDEF6icH7MYQK0iohR4HuEAXl8jaxRuT6Q=
-github.com/jfrog/jfrog-client-go v1.35.5 h1:1QlrXdMhGi099Cs3mVKIpeVre2w1DiYhU7WGSEH2gQU=
-github.com/jfrog/jfrog-client-go v1.35.5/go.mod h1:Leua+MdhCV+M4gl746PcTsHF8dDP7+LLJ/NgHCTl/Fo=
+github.com/jedib0t/go-pretty/v6 v6.5.3 h1:GIXn6Er/anHTkVUoufs7ptEvxdD6KIhR7Axa2wYCPF0=
+github.com/jedib0t/go-pretty/v6 v6.5.3/go.mod h1:5LQIxa52oJ/DlDSLv0HEkWOFMDGoWkJb9ss5KqPpJBg=
+github.com/jfrog/archiver/v3 v3.5.3 h1:Udz6+z/YIhTFmcEp1TeW2DEwNyo7JSAnrGUsrbL2FZI=
+github.com/jfrog/archiver/v3 v3.5.3/go.mod h1:/MbmBhPzkliu9PtweAg9lCYHGcKdapwMMZS/QS09T5c=
+github.com/jfrog/build-info-go v1.9.20 h1:tQF6EMjt/EEX8syTrgpL/c7FjhlBSjtv848jNvxpMp8=
+github.com/jfrog/build-info-go v1.9.20/go.mod h1:Vxv6zmx4e1NWsx40OHaDWCCYDeYAq2yXzpJ4nsDChbE=
+github.com/jfrog/gofrog v1.5.0 h1:OLaXpNaEniliE4Kq8lJ5evVYzzt3zdYtpMIBu6TO++c=
+github.com/jfrog/gofrog v1.5.0/go.mod h1:wQqagqq2VpuCWRPlq/65GbH9gsRz+7Bgc1Q+PKD4Y+k=
+github.com/jfrog/jfrog-apps-config v1.0.1 h1:mtv6k7g8A8BVhlHGlSveapqf4mJfonwvXYLipdsOFMY=
+github.com/jfrog/jfrog-apps-config v1.0.1/go.mod h1:8AIIr1oY9JuH5dylz2S6f8Ym2MaadPLR6noCBO4C22w=
+github.com/jfrog/jfrog-cli-core/v2 v2.31.1-0.20240118100957-b4e1537e91dd h1:7JOQANVaULKq0b2X10ERsEAZOGccfooOvstr3UZcGTc=
+github.com/jfrog/jfrog-cli-core/v2 v2.31.1-0.20240118100957-b4e1537e91dd/go.mod h1:tbplJYWXBgQNLMWadfZYh2uaajZjG1tLgBb1txLNAQw=
+github.com/jfrog/jfrog-client-go v1.35.6 h1:nVS94x6cwSRkhtj8OM3elbUcGgQhqsK8YMPvC/gf5sk=
+github.com/jfrog/jfrog-client-go v1.35.6/go.mod h1:V+XKC27k6GA5OcWIAItpnxZAZnCigg8xCkpXKP905Fk=
github.com/kevinburke/ssh_config v1.2.0 h1:x584FjTGwHzMwvHx18PXxbBVzfnxogHaAReU4gf13a4=
github.com/kevinburke/ssh_config v1.2.0/go.mod h1:CT57kijsi8u/K/BOFA39wgDQJ9CxiF4nAY/ojJ6r6mM=
github.com/klauspost/compress v1.4.1/go.mod h1:RyIbtBH6LamlWaDj8nUwkbUhJ87Yi3uG0guNDohfE1A=
@@ -105,10 +117,23 @@ github.com/magiconair/properties v1.8.7 h1:IeQXZAiQcpL9mgcAe1Nu6cX9LLw6ExEHKjN0V
github.com/magiconair/properties v1.8.7/go.mod h1:Dhd985XPs7jluiymwWYZ0G4Z61jb3vdS329zhj2hYo0=
github.com/manifoldco/promptui v0.9.0 h1:3V4HzJk1TtXW1MTZMP7mdlwbBpIinw3HztaIlYthEiA=
github.com/manifoldco/promptui v0.9.0/go.mod h1:ka04sppxSGFAtxX0qhlYQjISsg9mR4GWtQEhdbn6Pgg=
-github.com/mattn/go-runewidth v0.0.13 h1:lTGmDsbAYt5DmK6OnoV7EuIF1wEIFAcxld6ypU4OSgU=
+github.com/mattn/go-colorable v0.1.4/go.mod h1:U0ppj6V5qS13XJ6of8GYAs25YV2eR4EVcfRqFIhoBtE=
+github.com/mattn/go-colorable v0.1.7/go.mod h1:u6P/XSegPjTcexA+o6vUJrdnUu04hMope9wVRipJSqc=
+github.com/mattn/go-colorable v0.1.13 h1:fFA4WZxdEF4tXPZVKMLwD8oUnCTTo08duU7wxecdEvA=
+github.com/mattn/go-colorable v0.1.13/go.mod h1:7S9/ev0klgBDR4GtXTXX8a3vIGJpMovkB8vQcUbaXHg=
+github.com/mattn/go-isatty v0.0.8/go.mod h1:Iq45c/XA43vh69/j3iqttzPXn0bhXyGjM0Hdxcsrc5s=
+github.com/mattn/go-isatty v0.0.10/go.mod h1:qgIWMr58cqv1PHHyhnkY9lrL7etaEgOFcMEpPG5Rm84=
+github.com/mattn/go-isatty v0.0.12/go.mod h1:cbi8OIDigv2wuxKPP5vlRcQ1OAZbq2CE4Kysco4FUpU=
+github.com/mattn/go-isatty v0.0.16/go.mod h1:kYGgaQfpe5nmfYZH+SKPsOc2e4SrIfOl2e/yFXSvRLM=
+github.com/mattn/go-isatty v0.0.17 h1:BTarxUcIeDqL27Mc+vyvdWYSL28zpIhv3RoTdsLMPng=
+github.com/mattn/go-isatty v0.0.17/go.mod h1:kYGgaQfpe5nmfYZH+SKPsOc2e4SrIfOl2e/yFXSvRLM=
+github.com/mattn/go-runewidth v0.0.6/go.mod h1:H031xJmbD/WCDINGzjvQ9THkh0rPKHF+m2gUSrubnMI=
+github.com/mattn/go-runewidth v0.0.9/go.mod h1:H031xJmbD/WCDINGzjvQ9THkh0rPKHF+m2gUSrubnMI=
github.com/mattn/go-runewidth v0.0.13/go.mod h1:Jdepj2loyihRzMpdS35Xk/zdY8IAYHsh153qUoGf23w=
-github.com/mholt/archiver/v3 v3.5.1 h1:rDjOBX9JSF5BvoJGvjqK479aL70qh9DIpZCl+k7Clwo=
-github.com/mholt/archiver/v3 v3.5.1/go.mod h1:e3dqJ7H78uzsRSEACH1joayhuSyhnonssnDhppzS1L4=
+github.com/mattn/go-runewidth v0.0.15 h1:UNAjwbU9l54TA3KzvqLGxwWjHmMgBUVhBiTjelZgg3U=
+github.com/mattn/go-runewidth v0.0.15/go.mod h1:Jdepj2loyihRzMpdS35Xk/zdY8IAYHsh153qUoGf23w=
+github.com/mattn/go-tty v0.0.3 h1:5OfyWorkyO7xP52Mq7tB36ajHDG5OHrmBGIS/DtakQI=
+github.com/mattn/go-tty v0.0.3/go.mod h1:ihxohKRERHTVzN+aSVRwACLCeqIoZAWpoICkkvrWyR0=
github.com/minio/sha256-simd v1.0.1 h1:6kaan5IFmwTNynnKKpDHe6FWHohJOHhCPchzK49dzMM=
github.com/minio/sha256-simd v1.0.1/go.mod h1:Pz6AKMiUdngCLpeTL/RJY1M9rUuPMYujV5xJjtbRSN8=
github.com/mitchellh/mapstructure v1.5.0 h1:jeMsZIYE/09sWLaz43PL7Gy6RuMjD2eJVyuac5Z2hdY=
@@ -125,11 +150,12 @@ github.com/pierrec/lz4/v4 v4.1.2 h1:qvY3YFXRQE/XB8MlLzJH7mSzBs74eA2gg52YTk6jUPM=
github.com/pierrec/lz4/v4 v4.1.2/go.mod h1:gZWDp/Ze/IJXGXf23ltt2EXimqmTUXEy0GFuRQyBid4=
github.com/pjbgf/sha1cd v0.3.0 h1:4D5XXmUUBUl/xQ6IjCkEAbqXskkq/4O7LmGn0AqMDs4=
github.com/pjbgf/sha1cd v0.3.0/go.mod h1:nZ1rrWOcGJ5uZgEEVL1VUM9iRQiZvWdbZjkKyFzPPsI=
-github.com/pkg/browser v0.0.0-20210911075715-681adbf594b8 h1:KoWmjvw+nsYOo29YJK9vDA65RGE3NrOnUtO7a+RF9HU=
-github.com/pkg/browser v0.0.0-20210911075715-681adbf594b8/go.mod h1:HKlIX3XHQyzLZPlr7++PzdhaXEj94dEiJgZDTsxEqUI=
+github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c h1:+mdjkGKdHQG3305AYmdv1U2eRNDiU2ErMBj1gwrq8eQ=
+github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c/go.mod h1:7rwL4CYBLnjLxUqIJNnCWiEdr3bn6IUYi15bNlnbCCU=
github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
-github.com/pkg/profile v1.6.0/go.mod h1:qBsxPvzyUincmltOk6iyRVxHYg4adc0OFOv72ZdLa18=
+github.com/pkg/term v1.1.0 h1:xIAAdCMh3QIAy+5FrE8Ad8XoDhEU4ufwbaSozViP9kk=
+github.com/pkg/term v1.1.0/go.mod h1:E25nymQcrSllhX42Ok8MRm1+hyBdHY0dCeiKZ9jpNGw=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2 h1:Jamvg5psRIccs7FGNTlIRMkT8wgtp5eCXdBlqhYGL6U=
github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
@@ -166,7 +192,6 @@ github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UV
github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
-github.com/stretchr/testify v1.7.4/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=
github.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=
github.com/stretchr/testify v1.8.4 h1:CcVxjf3Q8PM0mHUKJCdn+eZZtm5yQwehR5yeSVQQcUk=
github.com/stretchr/testify v1.8.4/go.mod h1:sz/lmYIOXD/1dqDmKjjqLyZ2RngseejIcXlSw2iwfAo=
@@ -178,11 +203,13 @@ github.com/ulikunitz/xz v0.5.9 h1:RsKRIA2MO8x56wkkcd3LbtcE/uMszhb6DpRf+3uwa3I=
github.com/ulikunitz/xz v0.5.9/go.mod h1:nbz6k7qbPmH4IRqmfOplQw/tblSgqTqBwxkY0oWt/14=
github.com/urfave/cli v1.22.14 h1:ebbhrRiGK2i4naQJr+1Xj92HXZCrK7MsyTS/ob3HnAk=
github.com/urfave/cli v1.22.14/go.mod h1:X0eDS6pD6Exaclxm99NJ3FiCDRED7vIHpx2mDOHLvkA=
+github.com/vbauerster/mpb/v7 v7.5.3 h1:BkGfmb6nMrrBQDFECR/Q7RkKCw7ylMetCb4079CGs4w=
+github.com/vbauerster/mpb/v7 v7.5.3/go.mod h1:i+h4QY6lmLvBNK2ah1fSreiw3ajskRlBp9AhY/PnuOE=
github.com/vmihailenco/msgpack/v4 v4.3.12/go.mod h1:gborTTJjAo/GWTqqRjrLCn9pgNN+NXzzngzBKDPIqw4=
github.com/vmihailenco/tagparser v0.1.1/go.mod h1:OeAg3pn3UbLjkWt+rN9oFYB6u/cQgqMEUPoW2WPyhdI=
github.com/xanzy/ssh-agent v0.3.3 h1:+/15pJfg/RsTxqYcX6fHqOXZwwMP+2VyYWJeWM2qQFM=
github.com/xanzy/ssh-agent v0.3.3/go.mod h1:6dzNDKs0J9rVPHPhaGCukekBHKqfl+L3KghI1Bc68Uw=
-github.com/xeipuuv/gojsonpointer v0.0.0-20180127040702-4e3ac2762d5f h1:J9EGpcZtP0E/raorCMxlFGSTBrsSlaDGf3jU/qvAE2c=
+github.com/xeipuuv/gojsonpointer v0.0.0-20190905194746-02993c407bfb h1:zGWFAtiMcyryUHoUjUJX0/lt1H2+i2Ka2n+D3DImSNo=
github.com/xeipuuv/gojsonreference v0.0.0-20180127040603-bd5ef7bd5415 h1:EzJWgHovont7NscjpAxXsDA8S8BMYve8Y5+7cuRE7R0=
github.com/xeipuuv/gojsonschema v1.2.0 h1:LhYJRs+L4fBtjZUfuSZIKGeVu0QRy8e5Xi7D17UxZ74=
github.com/xi2/xz v0.0.0-20171230120015-48954b6210f8 h1:nIPpBwaJSVYIxUFsDv3M8ofmx9yWTog9BfvIu0q41lo=
@@ -200,10 +227,10 @@ golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5y
golang.org/x/crypto v0.0.0-20220622213112-05595931fe9d/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=
golang.org/x/crypto v0.3.1-0.20221117191849-2c476679df9a/go.mod h1:hebNnKkNXi2UzZN1eVRvBB7co0a+JxK6XbPiWVs/3J4=
golang.org/x/crypto v0.7.0/go.mod h1:pYwdfH91IfpZVANVyUOhSIPZaFoJGxTFbZhFTx+dXZU=
-golang.org/x/crypto v0.17.0 h1:r8bRNjWL3GshPW3gkd+RpvzWrZAwPS49OmTGZ/uhM4k=
-golang.org/x/crypto v0.17.0/go.mod h1:gCAAfMLgwOJRpTjQ2zCCt2OcSfYMTeZVSRtQlPC7Nq4=
-golang.org/x/exp v0.0.0-20231226003508-02704c960a9b h1:kLiC65FbiHWFAOu+lxwNPujcsl8VYyTYYEZnsOO1WK4=
-golang.org/x/exp v0.0.0-20231226003508-02704c960a9b/go.mod h1:iRJReGqOEeBhDZGkGbynYwcHlctCvnjTYIamk7uXpHI=
+golang.org/x/crypto v0.18.0 h1:PGVlW0xEltQnzFZ55hkuX5+KLyrMYhHld1YHO4AKcdc=
+golang.org/x/crypto v0.18.0/go.mod h1:R0j02AL6hcrfOiy9T4ZYp/rcWeMxM3L6QYxlOuEG1mg=
+golang.org/x/exp v0.0.0-20240112132812-db7319d0e0e3 h1:hNQpMuAJe5CtcUqCXaWga3FHu+kQvCqcsoVaQgSV60o=
+golang.org/x/exp v0.0.0-20240112132812-db7319d0e0e3/go.mod h1:idGWGoKP1toJGkd5/ig9ZLuPcZBC3ewk7SzmH0uou08=
golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=
golang.org/x/mod v0.8.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/mod v0.14.0 h1:dGoOF9QVLYng8IHTm7BAyWqCqSheQ5pYWGhzW00YJr0=
@@ -217,40 +244,49 @@ golang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug
golang.org/x/net v0.2.0/go.mod h1:KqCZLdyyvdV855qA2rE3GC2aiw5xGR5TEjj8smXukLY=
golang.org/x/net v0.6.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
golang.org/x/net v0.8.0/go.mod h1:QVkue5JL9kW//ek3r6jTKnTFis1tRmNAW2P1shuFdJc=
-golang.org/x/net v0.19.0 h1:zTwKpTd2XuCqf8huc7Fo2iSy+4RHPd10s4KzeTnVr1c=
-golang.org/x/net v0.19.0/go.mod h1:CfAk/cbD4CthTvqiEl8NpboMuiuOYsAr/7NOjZJtv1U=
+golang.org/x/net v0.20.0 h1:aCL9BSgETF1k+blQaYUBx9hJ9LOGP3gAVemcZlf1Kpo=
+golang.org/x/net v0.20.0/go.mod h1:z8BVo6PvndSri0LbOE3hAn0apkU+1YvI6E70E9jsnvY=
golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
+golang.org/x/sync v0.0.0-20190911185100-cd5d95a43a6e/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
-golang.org/x/sync v0.5.0 h1:60k92dhOjHxJkrqnwsfl8KuaHbn/5dl0lUPUklKo3qE=
-golang.org/x/sync v0.5.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
+golang.org/x/sync v0.6.0 h1:5BMeUDZ7vkXGfEr1x9B4bRcTH4lpkTkpdh0T/J+qjbQ=
+golang.org/x/sync v0.6.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sys v0.0.0-20181122145206-62eef0e2fa9b/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
-golang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20190222072716-a9d3bda3a223/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
+golang.org/x/sys v0.0.0-20191008105621-543471e840be/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20191026070338-33540a1f6037/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20191120155948-bd437916bb0e/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20200116001909-b77594299b42/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20200223170610-d5e6a3e2c0ae/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20200909081042-eff7692f9009/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
+golang.org/x/sys v0.0.0-20200918174421-af09f7315aff/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210124154548-22da62e12c0c/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210423082822-04245dca01da/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
-golang.org/x/sys v0.0.0-20210616045830-e2b7044e8c71/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220310020820-b874c991c1a5/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220704084225-05e143d24a9e/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.0.0-20220811171246-fbc7d0a398ab/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.0.0-20220909162455-aba9fc2a8ff2/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
+golang.org/x/sys v0.1.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.2.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.3.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
-golang.org/x/sys v0.15.0 h1:h48lPFYpsTvQJZF4EKyI4aLHaev3CxivZmv7yZig9pc=
-golang.org/x/sys v0.15.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
+golang.org/x/sys v0.16.0 h1:xWw16ngr6ZMtmxDyKyIgsE93KNKz5HKmMa3b8ALHidU=
+golang.org/x/sys v0.16.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
golang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
golang.org/x/term v0.2.0/go.mod h1:TVmDHMZPmdnySmBfhjOoOdhjzdE1h4u1VwSiw2l1Nuc=
golang.org/x/term v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=
golang.org/x/term v0.6.0/go.mod h1:m6U89DPEgQRMq3DNkDClhWw02AUbt2daBVO4cn4Hv9U=
-golang.org/x/term v0.15.0 h1:y/Oo/a/q3IXu26lQgl04j/gjuBDOBlx7X6Om1j2CPW4=
-golang.org/x/term v0.15.0/go.mod h1:BDl952bC7+uMoWR75FIrCDx79TPU9oHkTZ9yRbYOrX0=
+golang.org/x/term v0.16.0 h1:m+B6fahuftsE9qjo0VWp2FW0mB3MTJvR0BaMQrq0pmE=
+golang.org/x/term v0.16.0/go.mod h1:yn7UURbUtPyrVJPGPq404EukNFxcm/foM+bV/bfcDsY=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=
golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
@@ -266,8 +302,8 @@ golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGm
golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=
golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
golang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=
-golang.org/x/tools v0.16.0 h1:GO788SKMRunPIBCXiQyo2AaexLstOrVhuAL5YwsckQM=
-golang.org/x/tools v0.16.0/go.mod h1:kYVVN6I1mBNoB1OX+noeBjbRk4IUEPa7JJ+TJMEooJ0=
+golang.org/x/tools v0.17.0 h1:FvmRgNOcs3kOa+T20R1uhfP9F6HgG2mfxDv1vrx1Htc=
+golang.org/x/tools v0.17.0/go.mod h1:xsh6VxdV005rRVaS6SSAf9oiAqljS7UZUacMZ8Bnsps=
golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
google.golang.org/appengine v1.6.5/go.mod h1:8WjMMxjGQR8xUklV/ARdw2HLXBOI7O7uCIDZVag1xfc=
diff --git a/jfrogclisecurity_test.go b/jfrogclisecurity_test.go
index 39b14ccd..3af50790 100644
--- a/jfrogclisecurity_test.go
+++ b/jfrogclisecurity_test.go
@@ -3,46 +3,27 @@ package main
import (
"flag"
"fmt"
- "os"
- "testing"
-
- "github.com/stretchr/testify/assert"
-
- coreTests "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
- clientTests "github.com/jfrog/jfrog-client-go/utils/tests"
"github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
"github.com/jfrog/jfrog-cli-core/v2/utils/log"
+ "github.com/jfrog/jfrog-cli-security/tests/utils"
+
+ configTests "github.com/jfrog/jfrog-cli-security/tests"
+
clientLog "github.com/jfrog/jfrog-client-go/utils/log"
-)
-const (
- CliIntegrationTests = "github.com/jfrog/jfrog-cli-security/tests/integration"
+ "os"
+ "testing"
)
func TestMain(m *testing.M) {
- setupTests()
+ setupIntegrationTests()
result := m.Run()
-
+ tearDownIntegrationTests()
os.Exit(result)
}
-func TestUnitTests(t *testing.T) {
- // Create temp jfrog home
- cleanUpJfrogHome, err := coreTests.SetJfrogHome()
- if err != nil {
- clientLog.Error(err)
- os.Exit(1)
- }
- // Clean from previous tests.
- defer cleanUpJfrogHome()
-
- packages := clientTests.GetTestPackages("./...")
- packages = clientTests.ExcludeTestsPackage(packages, CliIntegrationTests)
- assert.NoError(t, clientTests.RunTests(packages, false))
-}
-
-func setupTests() {
+func setupIntegrationTests() {
// Disable usage report.
if err := os.Setenv(coreutils.ReportUsage, "false"); err != nil {
clientLog.Error(fmt.Sprintf("Couldn't set env: %s. Error: %s", coreutils.ReportUsage, err.Error()))
@@ -56,4 +37,14 @@ func setupTests() {
// General
flag.Parse()
log.SetDefaultLogger()
+ // Init
+ utils.InitTestCliDetails()
+ utils.AuthenticateArtifactory()
+ utils.CreateRequiredRepositories()
+}
+
+func tearDownIntegrationTests() {
+	// Important - virtual repositories must be deleted first, since they reference the other repositories
+ utils.DeleteRepos(configTests.CreatedVirtualRepositories)
+ utils.DeleteRepos(configTests.CreatedNonVirtualRepositories)
}
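The refactored `TestMain` above calls `tearDownIntegrationTests()` explicitly before `os.Exit`, rather than deferring it. The reason is that `os.Exit` terminates the process without running deferred calls. A minimal sketch of that ordering (names here are illustrative, not the real test helpers):

```go
package main

import "fmt"

// setup and teardown stand in for setupIntegrationTests / tearDownIntegrationTests.
func setup()    { fmt.Println("setup") }
func teardown() { fmt.Println("teardown") }

// run mirrors the TestMain flow: teardown is called explicitly before the
// exit code is handed to os.Exit, because os.Exit skips deferred calls.
func run() int {
	setup()
	// m.Run() would execute the test binary here in a real TestMain.
	result := 0
	teardown() // a `defer teardown()` would never fire once os.Exit runs
	return result
}

func main() {
	code := run()
	fmt.Println("exit code:", code)
	// os.Exit(code) would follow here in a real TestMain.
}
```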
diff --git a/scangraph/params.go b/scangraph/params.go
new file mode 100644
index 00000000..76e0106e
--- /dev/null
+++ b/scangraph/params.go
@@ -0,0 +1,59 @@
+package scangraph
+
+import (
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-client-go/xray/services"
+)
+
+type ScanGraphParams struct {
+ serverDetails *config.ServerDetails
+ xrayGraphScanParams *services.XrayGraphScanParams
+ fixableOnly bool
+ xrayVersion string
+ severityLevel int
+}
+
+func NewScanGraphParams() *ScanGraphParams {
+ return &ScanGraphParams{}
+}
+
+func (sgp *ScanGraphParams) SetServerDetails(serverDetails *config.ServerDetails) *ScanGraphParams {
+ sgp.serverDetails = serverDetails
+ return sgp
+}
+
+func (sgp *ScanGraphParams) SetXrayGraphScanParams(params *services.XrayGraphScanParams) *ScanGraphParams {
+ sgp.xrayGraphScanParams = params
+ return sgp
+}
+
+func (sgp *ScanGraphParams) SetXrayVersion(xrayVersion string) *ScanGraphParams {
+ sgp.xrayVersion = xrayVersion
+ return sgp
+}
+
+func (sgp *ScanGraphParams) SetSeverityLevel(severity string) *ScanGraphParams {
+ sgp.severityLevel = getLevelOfSeverity(severity)
+ return sgp
+}
+
+func (sgp *ScanGraphParams) XrayGraphScanParams() *services.XrayGraphScanParams {
+ return sgp.xrayGraphScanParams
+}
+
+func (sgp *ScanGraphParams) XrayVersion() string {
+ return sgp.xrayVersion
+}
+
+func (sgp *ScanGraphParams) ServerDetails() *config.ServerDetails {
+ return sgp.serverDetails
+}
+
+func (sgp *ScanGraphParams) FixableOnly() bool {
+ return sgp.fixableOnly
+}
+
+func (sgp *ScanGraphParams) SetFixableOnly(fixable bool) *ScanGraphParams {
+ sgp.fixableOnly = fixable
+ return sgp
+}
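`ScanGraphParams` uses fluent setters: each `Set*` method mutates the receiver and returns it, so configuration can be chained from the constructor. A minimal self-contained sketch of the same pattern (the type and field names below are illustrative stand-ins, not the real `jfrog-cli-security` API):

```go
package main

import "fmt"

// scanParams mirrors the fluent-setter style of ScanGraphParams.
type scanParams struct {
	xrayVersion string
	fixableOnly bool
	severity    int
}

func newScanParams() *scanParams { return &scanParams{} }

// Each setter returns the receiver so calls can be chained.
func (p *scanParams) SetXrayVersion(v string) *scanParams { p.xrayVersion = v; return p }
func (p *scanParams) SetFixableOnly(f bool) *scanParams   { p.fixableOnly = f; return p }
func (p *scanParams) SetSeverity(s int) *scanParams       { p.severity = s; return p }

func main() {
	p := newScanParams().
		SetXrayVersion("3.29.0").
		SetFixableOnly(true).
		SetSeverity(11)
	fmt.Println(p.xrayVersion, p.fixableOnly, p.severity) // 3.29.0 true 11
}
```

Keeping the fields unexported and exposing them only through setters/getters, as `params.go` does, lets the package enforce invariants (e.g. translating a severity string to its numeric level) at assignment time.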
diff --git a/scangraph/scangraph.go b/scangraph/scangraph.go
new file mode 100644
index 00000000..3752fdd2
--- /dev/null
+++ b/scangraph/scangraph.go
@@ -0,0 +1,108 @@
+package scangraph
+
+import (
+ "github.com/jfrog/jfrog-cli-security/utils"
+ clientutils "github.com/jfrog/jfrog-client-go/utils"
+ "github.com/jfrog/jfrog-client-go/xray/services"
+ "golang.org/x/text/cases"
+ "golang.org/x/text/language"
+)
+
+const (
+ GraphScanMinXrayVersion = "3.29.0"
+ ScanTypeMinXrayVersion = "3.37.2"
+)
+
+func RunScanGraphAndGetResults(params *ScanGraphParams) (*services.ScanResponse, error) {
+ xrayManager, err := utils.CreateXrayServiceManager(params.serverDetails)
+ if err != nil {
+ return nil, err
+ }
+
+ err = clientutils.ValidateMinimumVersion(clientutils.Xray, params.xrayVersion, ScanTypeMinXrayVersion)
+ if err != nil {
+ // Remove scan type param if Xray version is under the minimum supported version
+ params.xrayGraphScanParams.ScanType = ""
+ }
+
+ if params.xrayGraphScanParams.XscGitInfoContext != nil {
+ if params.xrayGraphScanParams.XscVersion, err = xrayManager.XscEnabled(); err != nil {
+ return nil, err
+ }
+ }
+
+ scanId, err := xrayManager.ScanGraph(*params.xrayGraphScanParams)
+ if err != nil {
+ return nil, err
+ }
+
+ xscEnabled := params.xrayGraphScanParams.XscVersion != ""
+ scanResult, err := xrayManager.GetScanGraphResults(scanId, params.XrayGraphScanParams().IncludeVulnerabilities, params.XrayGraphScanParams().IncludeLicenses, xscEnabled)
+ if err != nil {
+ return nil, err
+ }
+ return filterResultIfNeeded(scanResult, params), nil
+}
+
+func filterResultIfNeeded(scanResult *services.ScanResponse, params *ScanGraphParams) *services.ScanResponse {
+ if !shouldFilterResults(params) {
+ return scanResult
+ }
+
+ scanResult.Violations = filterViolations(scanResult.Violations, params)
+ scanResult.Vulnerabilities = filterVulnerabilities(scanResult.Vulnerabilities, params)
+ return scanResult
+}
+
+func shouldFilterResults(params *ScanGraphParams) bool {
+ return params.severityLevel > 0 || params.fixableOnly
+}
+
+func filterViolations(violations []services.Violation, params *ScanGraphParams) []services.Violation {
+ var filteredViolations []services.Violation
+ for _, violation := range violations {
+ if params.fixableOnly {
+ violation.Components = getFixableComponents(violation.Components)
+ if len(violation.Components) == 0 {
+				// All the components were filtered out, so skip this violation
+ continue
+ }
+ }
+ if getLevelOfSeverity(violation.Severity) >= params.severityLevel {
+ filteredViolations = append(filteredViolations, violation)
+ }
+ }
+ return filteredViolations
+}
+
+func filterVulnerabilities(vulnerabilities []services.Vulnerability, params *ScanGraphParams) []services.Vulnerability {
+ var filteredVulnerabilities []services.Vulnerability
+ for _, vulnerability := range vulnerabilities {
+ if params.fixableOnly {
+ vulnerability.Components = getFixableComponents(vulnerability.Components)
+ if len(vulnerability.Components) == 0 {
+				// All the components were filtered out, so skip this vulnerability
+ continue
+ }
+ }
+ if getLevelOfSeverity(vulnerability.Severity) >= params.severityLevel {
+ filteredVulnerabilities = append(filteredVulnerabilities, vulnerability)
+ }
+ }
+ return filteredVulnerabilities
+}
+
+func getFixableComponents(components map[string]services.Component) map[string]services.Component {
+ fixableComponents := make(map[string]services.Component)
+ for vulnKey, vulnDetails := range components {
+ if len(vulnDetails.FixedVersions) > 0 {
+ fixableComponents[vulnKey] = vulnDetails
+ }
+ }
+ return fixableComponents
+}
+
+func getLevelOfSeverity(s string) int {
+ severity := utils.GetSeverity(cases.Title(language.Und).String(s), utils.ApplicabilityUndetermined)
+ return severity.NumValue()
+}
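The filtering logic above applies two independent predicates to each result: drop components without fixed versions when `fixableOnly` is set (and drop the whole entry if nothing remains), then keep only entries at or above the severity threshold. A self-contained sketch of that flow, using simplified stand-in types and an illustrative severity ranking (the real ranks come from `utils.GetSeverity(...).NumValue()`):

```go
package main

import "fmt"

// Simplified stand-ins for services.Component and services.Violation.
type component struct{ FixedVersions []string }
type violation struct {
	Severity   string
	Components map[string]component
}

// severityRank is an illustrative stand-in for utils.GetSeverity(...).NumValue().
func severityRank(s string) int {
	switch s {
	case "Critical":
		return 12
	case "High":
		return 11
	case "Medium":
		return 9
	case "Low":
		return 6
	}
	return 0
}

// fixableOnly keeps only components that have at least one fixed version.
func fixableOnly(components map[string]component) map[string]component {
	fixable := make(map[string]component)
	for key, c := range components {
		if len(c.FixedVersions) > 0 {
			fixable[key] = c
		}
	}
	return fixable
}

func filterViolations(violations []violation, minLevel int, onlyFixable bool) []violation {
	var filtered []violation
	for _, v := range violations {
		if onlyFixable {
			v.Components = fixableOnly(v.Components)
			if len(v.Components) == 0 {
				continue // every component was filtered out, drop the violation
			}
		}
		if severityRank(v.Severity) >= minLevel {
			filtered = append(filtered, v)
		}
	}
	return filtered
}

func main() {
	input := []violation{
		{Severity: "Low", Components: map[string]component{"a": {FixedVersions: []string{"1.0"}}}},
		{Severity: "High", Components: map[string]component{"b": {}}},
		{Severity: "Critical", Components: map[string]component{"c": {FixedVersions: []string{"2.0"}}}},
	}
	// Low fails the severity gate, High has no fixable components, Critical survives.
	out := filterViolations(input, 11, true)
	fmt.Println(len(out)) // 1
}
```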
diff --git a/scangraph/scangraph_test.go b/scangraph/scangraph_test.go
new file mode 100644
index 00000000..28c5d19f
--- /dev/null
+++ b/scangraph/scangraph_test.go
@@ -0,0 +1,161 @@
+package scangraph
+
+import (
+ "github.com/jfrog/jfrog-client-go/xray/services"
+ "github.com/stretchr/testify/assert"
+ "reflect"
+ "testing"
+)
+
+func TestFilterResultIfNeeded(t *testing.T) {
+ // Define test cases
+ tests := []struct {
+ name string
+ scanResult services.ScanResponse
+ params ScanGraphParams
+ expected services.ScanResponse
+ }{
+ {
+ name: "Should not filter",
+ scanResult: services.ScanResponse{},
+ params: ScanGraphParams{},
+ expected: services.ScanResponse{},
+ },
+ {
+ name: "No filter level specified",
+ scanResult: services.ScanResponse{
+ Violations: []services.Violation{
+ {Severity: "Low"},
+ {Severity: "Medium"},
+ {Severity: "High"},
+ {Severity: "Critical"},
+ },
+ Vulnerabilities: []services.Vulnerability{
+ {Severity: "Low"},
+ {Severity: "Medium"},
+ {Severity: "High"},
+ {Severity: "Critical"},
+ },
+ },
+ params: ScanGraphParams{
+ severityLevel: 0,
+ },
+ expected: services.ScanResponse{
+ Violations: []services.Violation{
+ {Severity: "Low"},
+ {Severity: "Medium"},
+ {Severity: "High"},
+ {Severity: "Critical"},
+ },
+ Vulnerabilities: []services.Vulnerability{
+ {Severity: "Low"},
+ {Severity: "Medium"},
+ {Severity: "High"},
+ {Severity: "Critical"},
+ },
+ },
+ },
+ {
+ name: "Filter violations and vulnerabilities by high severity",
+ scanResult: services.ScanResponse{
+ Violations: []services.Violation{
+ {Severity: "Low"},
+ {Severity: "Medium"},
+ {Severity: "High"},
+ {Severity: "Critical"},
+ },
+ Vulnerabilities: []services.Vulnerability{
+ {Severity: "Low"},
+ {Severity: "Medium"},
+ {Severity: "High"},
+ {Severity: "Critical"},
+ },
+ },
+ params: ScanGraphParams{
+ severityLevel: 11,
+ },
+ expected: services.ScanResponse{
+ Violations: []services.Violation{
+ {Severity: "High"},
+ {Severity: "Critical"},
+ },
+ Vulnerabilities: []services.Vulnerability{
+ {Severity: "High"},
+ {Severity: "Critical"},
+ },
+ },
+ },
+ }
+
+ // Run test cases
+ for i := range tests {
+ t.Run(tests[i].name, func(t *testing.T) {
+ // Call the function with the input parameters
+ actual := filterResultIfNeeded(&tests[i].scanResult, &tests[i].params)
+ // Check that the function returned the expected result
+ assert.True(t, reflect.DeepEqual(*actual, tests[i].expected))
+ })
+ }
+}
+
+func TestGetFixableComponents(t *testing.T) {
+ // create test cases
+ testCases := []struct {
+ name string
+ components map[string]services.Component
+ expectedMap map[string]services.Component
+ }{
+ {
+ name: "Returns an empty map when all components have no fixed versions",
+ components: map[string]services.Component{
+ "vuln1": {
+ FixedVersions: []string{},
+ },
+ "vuln2": {
+ FixedVersions: []string{},
+ },
+ },
+ expectedMap: map[string]services.Component{},
+ },
+ {
+ name: "Returns a filtered map with only components that have fixed versions",
+ components: map[string]services.Component{
+ "vuln1": {
+ FixedVersions: []string{},
+ },
+ "vuln2": {
+ FixedVersions: []string{"1.0.0"},
+ },
+ "vuln3": {
+ FixedVersions: []string{"2.0.0", "3.0.0"},
+ },
+ "vuln4": {
+ FixedVersions: []string{},
+ },
+ },
+ expectedMap: map[string]services.Component{
+ "vuln2": {
+ FixedVersions: []string{"1.0.0"},
+ },
+ "vuln3": {
+ FixedVersions: []string{"2.0.0", "3.0.0"},
+ },
+ },
+ },
+ }
+
+ // run test cases
+ for _, tc := range testCases {
+ t.Run(tc.name, func(t *testing.T) {
+ actualMap := getFixableComponents(tc.components)
+ assert.Equal(t, len(tc.expectedMap), len(actualMap))
+ for k, v := range tc.expectedMap {
+ if v.FixedVersions == nil {
+ assert.True(t, actualMap[k].FixedVersions == nil)
+ } else {
+ assert.Equal(t, len(actualMap[k].FixedVersions), len(v.FixedVersions))
+ }
+ }
+ })
+ }
+}
diff --git a/scans_test.go b/scans_test.go
new file mode 100644
index 00000000..6d9b6bae
--- /dev/null
+++ b/scans_test.go
@@ -0,0 +1,355 @@
+package main
+
+import (
+ "encoding/json"
+ "fmt"
+ "net/http"
+ "net/http/httptest"
+ "os"
+ "path"
+ "path/filepath"
+ "strconv"
+ "strings"
+ "sync"
+ "testing"
+ "time"
+
+ biutils "github.com/jfrog/build-info-go/utils"
+ "github.com/stretchr/testify/assert"
+ "github.com/stretchr/testify/require"
+
+ "github.com/jfrog/jfrog-cli-security/cli"
+ "github.com/jfrog/jfrog-cli-security/cli/docs"
+ "github.com/jfrog/jfrog-cli-security/commands/curation"
+ "github.com/jfrog/jfrog-cli-security/commands/scan"
+ securityTests "github.com/jfrog/jfrog-cli-security/tests"
+ securityTestUtils "github.com/jfrog/jfrog-cli-security/tests/utils"
+
+ "github.com/jfrog/jfrog-cli-core/v2/artifactory/commands/container"
+ containerUtils "github.com/jfrog/jfrog-cli-core/v2/artifactory/utils/container"
+ pluginsCommon "github.com/jfrog/jfrog-cli-core/v2/plugins/common"
+ "github.com/jfrog/jfrog-cli-core/v2/plugins/components"
+
+ "github.com/jfrog/jfrog-cli-core/v2/common/build"
+ commonCommands "github.com/jfrog/jfrog-cli-core/v2/common/commands"
+ "github.com/jfrog/jfrog-cli-core/v2/common/format"
+ "github.com/jfrog/jfrog-cli-core/v2/common/project"
+ commonTests "github.com/jfrog/jfrog-cli-core/v2/common/tests"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/coreutils"
+ coreTests "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
+
+ "github.com/jfrog/jfrog-cli-security/scangraph"
+ "github.com/jfrog/jfrog-cli-security/utils"
+
+ clientUtils "github.com/jfrog/jfrog-client-go/utils"
+ clientTestUtils "github.com/jfrog/jfrog-client-go/utils/tests"
+ xrayUtils "github.com/jfrog/jfrog-client-go/xray/services/utils"
+)
+
+// Binary scan tests
+
+func TestXrayBinaryScanJson(t *testing.T) {
+ output := testXrayBinaryScan(t, string(format.Json))
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, 1, 1)
+}
+
+func TestXrayBinaryScanSimpleJson(t *testing.T) {
+ output := testXrayBinaryScan(t, string(format.SimpleJson))
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, 1, 1)
+}
+
+func TestXrayBinaryScanJsonWithProgress(t *testing.T) {
+ callback := commonTests.MockProgressInitialization()
+ defer callback()
+ output := testXrayBinaryScan(t, string(format.Json))
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, 1, 1)
+}
+
+func TestXrayBinaryScanSimpleJsonWithProgress(t *testing.T) {
+ callback := commonTests.MockProgressInitialization()
+ defer callback()
+ output := testXrayBinaryScan(t, string(format.SimpleJson))
+ securityTestUtils.VerifySimpleJsonScanResults(t, output, 1, 1)
+}
+
+func testXrayBinaryScan(t *testing.T, format string) string {
+ securityTestUtils.InitSecurityTest(t, scangraph.GraphScanMinXrayVersion)
+ binariesPath := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "binaries", "*")
+ return securityTests.PlatformCli.RunCliCmdWithOutput(t, "scan", binariesPath, "--licenses", "--format="+format)
+}
+
+func TestXrayBinaryScanWithBypassArchiveLimits(t *testing.T) {
+ securityTestUtils.InitSecurityTest(t, scan.BypassArchiveLimitsMinXrayVersion)
+ unsetEnv := clientTestUtils.SetEnvWithCallbackAndAssert(t, "JF_INDEXER_COMPRESS_MAXENTITIES", "10")
+ defer unsetEnv()
+ binariesPath := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "binaries", "*")
+ scanArgs := []string{"scan", binariesPath, "--format=json", "--licenses"}
+ // Run without bypass flag and expect scan to fail
+ err := securityTests.PlatformCli.Exec(scanArgs...)
+ // Expect error
+ assert.Error(t, err)
+
+ // Run with bypass flag and expect it to find vulnerabilities
+ scanArgs = append(scanArgs, "--bypass-archive-limits")
+ output := securityTests.PlatformCli.RunCliCmdWithOutput(t, scanArgs...)
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, 1, 1)
+}
+
+// Docker scan tests
+
+func TestDockerScanWithProgressBar(t *testing.T) {
+ callback := commonTests.MockProgressInitialization()
+ defer callback()
+ TestDockerScan(t)
+}
+
+func TestDockerScan(t *testing.T) {
+ cleanup := initNativeDockerWithXrayTest(t)
+ defer cleanup()
+
+ watchName, deleteWatch := createTestWatch(t)
+ defer deleteWatch()
+
+ imagesToScan := []string{
+ // Simple image with vulnerabilities
+ "bitnami/minio:2022",
+
+ // Image with RPM with vulnerabilities
+ "redhat/ubi8-micro:8.4",
+ }
+ for _, imageName := range imagesToScan {
+ runDockerScan(t, imageName, watchName, 3, 3, 3)
+ }
+
+	// Xray 3.40.3 has a bug where scanning a Docker image with 0 vulnerabilities fails,
+	// so we skip this image until the next Xray version is released.
+ securityTestUtils.ValidateXrayVersion(t, "3.41.0")
+
+ // Image with 0 vulnerabilities
+ runDockerScan(t, "busybox:1.35", "", 0, 0, 0)
+}
+
+func initNativeDockerWithXrayTest(t *testing.T) func() {
+ if !*securityTests.TestDockerScan || !*securityTests.TestSecurity {
+ t.Skip("Skipping Docker scan test. To run Xray Docker test add the '-test.dockerScan=true' and '-test.security=true' options.")
+ }
+ oldHomeDir := os.Getenv(coreutils.HomeDir)
+ securityTestUtils.ValidateXrayVersion(t, scan.DockerScanMinXrayVersion)
+ // Create server config to use with the command.
+ securityTestUtils.CreateJfrogHomeConfig(t, true)
+ // Add docker scan mock command
+ securityTests.TestApplication.Commands = append(securityTests.TestApplication.Commands, dockerScanMockCommand(t))
+ return func() {
+ clientTestUtils.SetEnvAndAssert(t, coreutils.HomeDir, oldHomeDir)
+ // remove docker scan mock command
+ securityTests.TestApplication.Commands = securityTests.TestApplication.Commands[:len(securityTests.TestApplication.Commands)-1]
+ }
+}
+
+func dockerScanMockCommand(t *testing.T) components.Command {
+ return components.Command{
+ Name: "docker",
+ Flags: docs.GetCommandFlags(docs.DockerScan),
+ Action: func(c *components.Context) error {
+ args := pluginsCommon.ExtractArguments(c)
+ var cmd, image string
+ // We may have prior flags before push/pull commands for the docker client.
+ for _, arg := range args {
+ if !strings.HasPrefix(arg, "-") {
+ if cmd == "" {
+ cmd = arg
+ } else {
+ image = arg
+ break
+ }
+ }
+ }
+ assert.Equal(t, "scan", cmd)
+ return cli.DockerScan(c, image)
+ },
+ }
+}
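The argument scan inside `dockerScanMockCommand` skips any leading client flags and takes the first bare word as the subcommand and the second as the image. That loop can be sketched as a standalone helper (the function name is illustrative):

```go
package main

import (
	"fmt"
	"strings"
)

// parseDockerArgs mirrors the loop in dockerScanMockCommand: flags (anything
// starting with "-") may precede the subcommand, so skip them; the first bare
// word is the subcommand and the second is the image reference.
func parseDockerArgs(args []string) (cmd, image string) {
	for _, arg := range args {
		if strings.HasPrefix(arg, "-") {
			continue
		}
		if cmd == "" {
			cmd = arg
		} else {
			image = arg
			break
		}
	}
	return cmd, image
}

func main() {
	cmd, image := parseDockerArgs([]string{"--debug", "scan", "busybox:1.35"})
	fmt.Println(cmd, image) // scan busybox:1.35
}
```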
+
+func runDockerScan(t *testing.T, imageName, watchName string, minViolations, minVulnerabilities, minLicenses int) {
+ // Pull image from docker repo
+ imageTag := path.Join(*securityTests.ContainerRegistry, securityTests.DockerVirtualRepo, imageName)
+ dockerPullCommand := container.NewPullCommand(containerUtils.DockerClient)
+ dockerPullCommand.SetCmdParams([]string{"pull", imageTag}).SetImageTag(imageTag).SetRepo(securityTests.DockerVirtualRepo).SetServerDetails(securityTests.XrDetails).SetBuildConfiguration(new(build.BuildConfiguration))
+ if assert.NoError(t, dockerPullCommand.Run()) {
+ defer commonTests.DeleteTestImage(t, imageTag, containerUtils.DockerClient)
+ // Run docker scan on image
+ cmdArgs := []string{"docker", "scan", imageTag, "--server-id=default", "--licenses", "--format=json", "--fail=false", "--min-severity=low", "--fixable-only"}
+ output := securityTests.PlatformCli.WithoutCredentials().RunCliCmdWithOutput(t, cmdArgs...)
+ if assert.NotEmpty(t, output) {
+ securityTestUtils.VerifyJsonScanResults(t, output, 0, minVulnerabilities, minLicenses)
+ }
+ // Run docker scan on image with watch
+ if watchName == "" {
+ return
+ }
+ cmdArgs = append(cmdArgs, "--watches="+watchName)
+ output = securityTests.PlatformCli.WithoutCredentials().RunCliCmdWithOutput(t, cmdArgs...)
+ if assert.NotEmpty(t, output) {
+ securityTestUtils.VerifyJsonScanResults(t, output, minViolations, 0, 0)
+ }
+ }
+}
+
+func createTestWatch(t *testing.T) (string, func()) {
+ xrayManager, err := utils.CreateXrayServiceManager(securityTests.XrDetails)
+ require.NoError(t, err)
+ // Create new default policy.
+ policyParams := xrayUtils.PolicyParams{
+ Name: fmt.Sprintf("%s-%s", "docker-policy", strconv.FormatInt(time.Now().Unix(), 10)),
+ Type: xrayUtils.Security,
+ Rules: []xrayUtils.PolicyRule{{
+ Name: "sec_rule",
+ Criteria: *xrayUtils.CreateSeverityPolicyCriteria(xrayUtils.Low),
+ Priority: 1,
+ Actions: &xrayUtils.PolicyAction{
+ FailBuild: clientUtils.Pointer(true),
+ },
+ }},
+ }
+ if !assert.NoError(t, xrayManager.CreatePolicy(policyParams)) {
+ return "", func() {}
+ }
+ // Create new default watch.
+ watchParams := xrayUtils.NewWatchParams()
+ watchParams.Name = fmt.Sprintf("%s-%s", "docker-watch", strconv.FormatInt(time.Now().Unix(), 10))
+ watchParams.Active = true
+ watchParams.Builds.Type = xrayUtils.WatchBuildAll
+ watchParams.Policies = []xrayUtils.AssignedPolicy{
+ {
+ Name: policyParams.Name,
+ Type: "security",
+ },
+ }
+ assert.NoError(t, xrayManager.CreateWatch(watchParams))
+ return watchParams.Name, func() {
+ assert.NoError(t, xrayManager.DeleteWatch(watchParams.Name))
+ assert.NoError(t, xrayManager.DeletePolicy(policyParams.Name))
+ }
+}
+
+// Curation tests
+
+func TestCurationAudit(t *testing.T) {
+ securityTestUtils.InitSecurityTest(t, "")
+ tempDirPath, createTempDirCallback := coreTests.CreateTempDirWithCallbackAndAssert(t)
+ defer createTempDirCallback()
+ multiProject := filepath.Join(filepath.FromSlash(securityTestUtils.GetTestResourcesPath()), "projects", "package-managers", "npm")
+ assert.NoError(t, biutils.CopyDir(multiProject, tempDirPath, true, nil))
+ rootDir, err := os.Getwd()
+ require.NoError(t, err)
+ defer func() {
+ assert.NoError(t, os.Chdir(rootDir))
+ }()
+ require.NoError(t, os.Chdir(filepath.Join(tempDirPath, "npm")))
+ expectedRequest := map[string]bool{
+ "/api/npm/npms/json/-/json-9.0.6.tgz": false,
+ "/api/npm/npms/xml/-/xml-1.0.1.tgz": false,
+ }
+ requestToFail := map[string]bool{
+ "/api/npm/npms/xml/-/xml-1.0.1.tgz": false,
+ }
+ serverMock, config := curationServer(t, expectedRequest, requestToFail)
+
+ cleanUpJfrogHome, err := coreTests.SetJfrogHome()
+ assert.NoError(t, err)
+ defer cleanUpJfrogHome()
+
+ config.User = "admin"
+ config.Password = "password"
+ config.ServerId = "test"
+ configCmd := commonCommands.NewConfigCommand(commonCommands.AddOrEdit, config.ServerId).SetDetails(config).SetUseBasicAuthOnly(true).SetInteractive(false)
+ assert.NoError(t, configCmd.Run())
+
+ defer serverMock.Close()
+ // Create build config
+ assert.NoError(t, commonCommands.CreateBuildConfigWithOptions(false, project.Npm,
+ commonCommands.WithResolverServerId(config.ServerId),
+ commonCommands.WithResolverRepo("npms"),
+ commonCommands.WithDeployerServerId(config.ServerId),
+ commonCommands.WithDeployerRepo("npm-local"),
+ ))
+
+ localXrayCli := securityTests.PlatformCli.WithoutCredentials()
+ workingDirsFlag := fmt.Sprintf("--working-dirs=%s", filepath.Join(tempDirPath, "npm"))
+ output := localXrayCli.RunCliCmdWithOutput(t, "curation-audit", "--format="+string(format.Json), workingDirsFlag)
+ expectedResp := getCurationExpectedResponse(config)
+ var got []curation.PackageStatus
+ bracketIndex := strings.Index(output, "[")
+	require.GreaterOrEqual(t, bracketIndex, 0, "Unexpected Curation output with missing '['")
+ err = json.Unmarshal([]byte(output[bracketIndex:]), &got)
+ assert.NoError(t, err)
+ assert.Equal(t, expectedResp, got)
+ for k, v := range expectedRequest {
+		assert.Truef(t, v, "didn't receive the expected HEAD request for package url %s", k)
+ }
+}
+
+func getCurationExpectedResponse(config *config.ServerDetails) []curation.PackageStatus {
+ expectedResp := []curation.PackageStatus{
+ {
+ Action: "blocked",
+ PackageName: "xml",
+ PackageVersion: "1.0.1",
+ BlockedPackageUrl: config.ArtifactoryUrl + "api/npm/npms/xml/-/xml-1.0.1.tgz",
+ BlockingReason: curation.BlockingReasonPolicy,
+ ParentName: "xml",
+ ParentVersion: "1.0.1",
+ DepRelation: "direct",
+ PkgType: "npm",
+ Policy: []curation.Policy{
+ {
+ Policy: "pol1",
+ Condition: "cond1",
+ Explanation: "explanation",
+ Recommendation: "recommendation",
+ },
+ {
+ Policy: "pol2",
+ Condition: "cond2",
+ Explanation: "explanation2",
+ Recommendation: "recommendation2",
+ },
+ },
+ },
+ }
+ return expectedResp
+}
+
+func curationServer(t *testing.T, expectedRequest map[string]bool, requestToFail map[string]bool) (*httptest.Server, *config.ServerDetails) {
+ mapLockReadWrite := sync.Mutex{}
+ serverMock, config, _ := commonTests.CreateRtRestsMockServer(t, func(w http.ResponseWriter, r *http.Request) {
+ if r.Method == http.MethodHead {
+ mapLockReadWrite.Lock()
+ if _, exist := expectedRequest[r.RequestURI]; exist {
+ expectedRequest[r.RequestURI] = true
+ }
+ mapLockReadWrite.Unlock()
+ if _, exist := requestToFail[r.RequestURI]; exist {
+ w.WriteHeader(http.StatusForbidden)
+ }
+ }
+ if r.Method == http.MethodGet {
+ if r.RequestURI == "/api/system/version" {
+				w.WriteHeader(http.StatusOK)
+				_, err := w.Write([]byte(`{"version": "7.0.0"}`))
+				require.NoError(t, err)
+ return
+ }
+
+ if _, exist := requestToFail[r.RequestURI]; exist {
+ w.WriteHeader(http.StatusForbidden)
+ _, err := w.Write([]byte("{\n \"errors\": [\n {\n \"status\": 403,\n " +
+ "\"message\": \"Package download was blocked by JFrog Packages " +
+ "Curation service due to the following policies violated {pol1, cond1, explanation, recommendation}, {pol2, cond2, explanation2, recommendation2}\"\n }\n ]\n}"))
+ require.NoError(t, err)
+ }
+ }
+ })
+ return serverMock, config
+}
diff --git a/tests/config.go b/tests/config.go
new file mode 100644
index 00000000..9fe0641e
--- /dev/null
+++ b/tests/config.go
@@ -0,0 +1,66 @@
+package tests
+
+import (
+ "flag"
+
+ "github.com/jfrog/jfrog-cli-core/v2/plugins/components"
+ "github.com/jfrog/jfrog-cli-core/v2/utils/config"
+ "github.com/jfrog/jfrog-client-go/auth"
+ "github.com/jfrog/jfrog-client-go/utils/io/httputils"
+
+ coreTests "github.com/jfrog/jfrog-cli-core/v2/utils/tests"
+)
+
+// Integration tests - global variables
+var (
+ XrDetails *config.ServerDetails
+ XrAuth auth.ServiceDetails
+
+ RtDetails *config.ServerDetails
+ RtAuth auth.ServiceDetails
+ RtHttpDetails httputils.HttpClientDetails
+
+ PlatformCli *coreTests.JfrogCli
+
+ TestApplication *components.App
+
+ timestampAdded bool
+)
+
+// Test flags
+var (
+ TestSecurity *bool
+ TestDockerScan *bool
+
+ JfrogUrl *string
+ JfrogUser *string
+ JfrogPassword *string
+ JfrogSshKeyPath *string
+ JfrogSshPassphrase *string
+ JfrogAccessToken *string
+
+ ContainerRegistry *string
+
+ HideUnitTestLog *bool
+ SkipUnitTests *bool
+ ciRunId *string
+)
+
+func init() {
+ TestSecurity = flag.Bool("test.security", true, "Test Security")
+ TestDockerScan = flag.Bool("test.dockerScan", false, "Test Docker scan")
+
+	JfrogUrl = flag.String("jfrog.url", "http://localhost:8081/", "JFrog platform URL")
+	JfrogUser = flag.String("jfrog.user", "admin", "JFrog platform username")
+	JfrogPassword = flag.String("jfrog.password", "password", "JFrog platform password")
+	JfrogSshKeyPath = flag.String("jfrog.sshKeyPath", "", "SSH key file path")
+	JfrogSshPassphrase = flag.String("jfrog.sshPassphrase", "", "SSH key passphrase")
+	JfrogAccessToken = flag.String("jfrog.adminToken", "", "JFrog platform admin token")
+
+ ContainerRegistry = flag.String("test.containerRegistry", "localhost:8082", "Container registry")
+
+	HideUnitTestLog = flag.Bool("test.hideUnitTestLog", false, "Hide unit test logs and write them to a file instead")
+ SkipUnitTests = flag.Bool("test.skipUnitTests", false, "Skip unit tests")
+
+ ciRunId = flag.String("ci.runId", "", "A unique identifier used as a suffix to create repositories and builds in the tests")
+}
diff --git a/tests/consts.go b/tests/consts.go
new file mode 100644
index 00000000..2b39678b
--- /dev/null
+++ b/tests/consts.go
@@ -0,0 +1,182 @@
+package tests
+
+import (
+ "strconv"
+ "strings"
+ "time"
+)
+
+const (
+ JvmLaunchEnvVar = "MAVEN_OPTS"
+ GoCacheEnvVar = "GOMODCACHE"
+ PipCacheEnvVar = "PIP_CACHE_DIR"
+
+ MavenCacheRedirectionVal = "-Dmaven.repo.local="
+)
+
+const (
+ XrayEndpoint = "xray/"
+ ArtifactoryEndpoint = "artifactory/"
+ AccessEndpoint = "access/"
+ RepoDetailsEndpoint = "api/repositories/"
+
+ Out = "out"
+ Temp = "tmp"
+)
+
+// Integration tests - Artifactory information
+var (
+ ServerId = "testServerId"
+
+ // Repositories
+ RtRepo1 = "cli-rt1"
+ RtVirtualRepo = "cli-rt-virtual"
+
+ DockerVirtualRepo = "cli-docker-virtual"
+ DockerLocalRepo = "cli-docker-local"
+ DockerRemoteRepo = "cli-docker-remote"
+ NpmRemoteRepo = "cli-npm-remote"
+ NugetRemoteRepo = "cli-nuget-remote"
+ YarnRemoteRepo = "cli-yarn-remote"
+ GradleRemoteRepo = "cli-gradle-remote"
+ MvnRemoteRepo = "cli-mvn-remote"
+ GoVirtualRepo = "cli-go-virtual"
+ GoRemoteRepo = "cli-go-remote"
+ GoRepo = "cli-go"
+ PypiRemoteRepo = "cli-pypi-remote"
+)
+
+// Integration tests - Artifactory repositories creation templates
+const (
+ DockerVirtualRepositoryConfig = "docker_virtual_repository_config.json"
+ DockerLocalRepositoryConfig = "docker_local_repository_config.json"
+ DockerRemoteRepositoryConfig = "docker_remote_repository_config.json"
+ NpmRemoteRepositoryConfig = "npm_remote_repository_config.json"
+ NugetRemoteRepositoryConfig = "nuget_remote_repository_config.json"
+ YarnRemoteRepositoryConfig = "yarn_remote_repository_config.json"
+ GradleRemoteRepositoryConfig = "gradle_remote_repository_config.json"
+ MavenRemoteRepositoryConfig = "maven_remote_repository_config.json"
+ GoVirtualRepositoryConfig = "go_virtual_repository_config.json"
+ GoRemoteRepositoryConfig = "go_remote_repository_config.json"
+ GoLocalRepositoryConfig = "go_local_repository_config.json"
+ PypiRemoteRepositoryConfig = "pypi_remote_repository_config.json"
+
+ Repo1RepositoryConfig = "repo1_repository_config.json"
+ VirtualRepositoryConfig = "specs_virtual_repository_config.json"
+)
+
+var reposConfigMap = map[*string]string{
+ &RtRepo1: Repo1RepositoryConfig,
+ &RtVirtualRepo: VirtualRepositoryConfig,
+
+ &DockerVirtualRepo: DockerVirtualRepositoryConfig,
+ &DockerLocalRepo: DockerLocalRepositoryConfig,
+ &DockerRemoteRepo: DockerRemoteRepositoryConfig,
+ &NpmRemoteRepo: NpmRemoteRepositoryConfig,
+ &NugetRemoteRepo: NugetRemoteRepositoryConfig,
+ &YarnRemoteRepo: YarnRemoteRepositoryConfig,
+ &GradleRemoteRepo: GradleRemoteRepositoryConfig,
+ &MvnRemoteRepo: MavenRemoteRepositoryConfig,
+ &GoVirtualRepo: GoVirtualRepositoryConfig,
+ &GoRemoteRepo: GoRemoteRepositoryConfig,
+ &GoRepo: GoLocalRepositoryConfig,
+ &PypiRemoteRepo: PypiRemoteRepositoryConfig,
+}
+
+// Return local and remote repositories for the test suites, respectively
+func GetNonVirtualRepositories() map[*string]string {
+ nonVirtualReposMap := map[*bool][]*string{
+ TestDockerScan: {&DockerLocalRepo, &DockerRemoteRepo},
+ TestSecurity: {&NpmRemoteRepo, &NugetRemoteRepo, &YarnRemoteRepo, &GradleRemoteRepo, &MvnRemoteRepo, &GoRepo, &GoRemoteRepo, &PypiRemoteRepo},
+ }
+ return getNeededRepositories(nonVirtualReposMap)
+}
+
+// Return virtual repositories for the test suites, respectively
+func GetVirtualRepositories() map[*string]string {
+ virtualReposMap := map[*bool][]*string{
+ TestDockerScan: {&DockerVirtualRepo},
+ TestSecurity: {&GoVirtualRepo},
+ }
+ return getNeededRepositories(virtualReposMap)
+}
+
+var CreatedNonVirtualRepositories map[*string]string
+var CreatedVirtualRepositories map[*string]string
+
+func GetAllRepositoriesNames() []string {
+ var baseRepoNames []string
+ for repoName := range GetNonVirtualRepositories() {
+ baseRepoNames = append(baseRepoNames, *repoName)
+ }
+ for repoName := range GetVirtualRepositories() {
+ baseRepoNames = append(baseRepoNames, *repoName)
+ }
+ return baseRepoNames
+}
+
+func getNeededRepositories(reposMap map[*bool][]*string) map[*string]string {
+ reposToCreate := map[*string]string{}
+ for needed, testRepos := range reposMap {
+ if *needed {
+ for _, repo := range testRepos {
+ reposToCreate[repo] = reposConfigMap[repo]
+ }
+ }
+ }
+ return reposToCreate
+}
+
+func AddTimestampToGlobalVars() {
+	// Make sure the global timestamp is added only once, even if multiple test flags are set
+ if timestampAdded {
+ return
+ }
+ timestamp := strconv.FormatInt(time.Now().Unix(), 10)
+ uniqueSuffix := "-" + timestamp
+
+ if *ciRunId != "" {
+ uniqueSuffix = "-" + *ciRunId + uniqueSuffix
+ }
+ // Artifactory accepts only lowercase repository names
+ uniqueSuffix = strings.ToLower(uniqueSuffix)
+
+ // Repositories
+ GoRepo += uniqueSuffix
+ GoRemoteRepo += uniqueSuffix
+ GoVirtualRepo += uniqueSuffix
+ DockerLocalRepo += uniqueSuffix
+ DockerRemoteRepo += uniqueSuffix
+ DockerVirtualRepo += uniqueSuffix
+ GradleRemoteRepo += uniqueSuffix
+ MvnRemoteRepo += uniqueSuffix
+ NpmRemoteRepo += uniqueSuffix
+ NugetRemoteRepo += uniqueSuffix
+ YarnRemoteRepo += uniqueSuffix
+ PypiRemoteRepo += uniqueSuffix
+
+ timestampAdded = true
+}
+
+// Build and repository names to replace in the test files.
+// A substitution map is used to set repository and build names with a timestamp suffix.
+func GetSubstitutionMap() map[string]string {
+ return map[string]string{
+ "${REPO1}": RtRepo1,
+ "${VIRTUAL_REPO}": RtVirtualRepo,
+
+ "${DOCKER_REPO}": DockerLocalRepo,
+ "${DOCKER_REMOTE_REPO}": DockerRemoteRepo,
+ "${DOCKER_VIRTUAL_REPO}": DockerVirtualRepo,
+
+ "${GO_REPO}": GoRepo,
+ "${GO_REMOTE_REPO}": GoRemoteRepo,
+ "${GO_VIRTUAL_REPO}": GoVirtualRepo,
+ "${GRADLE_REMOTE_REPO}": GradleRemoteRepo,
+ "${MAVEN_REMOTE_REPO}": MvnRemoteRepo,
+ "${NPM_REMOTE_REPO}": NpmRemoteRepo,
+ "${NUGET_REMOTE_REPO}": NugetRemoteRepo,
+ "${PYPI_REMOTE_REPO}": PypiRemoteRepo,
+ "${YARN_REMOTE_REPO}": YarnRemoteRepo,
+ }
+}
diff --git a/tests/integration/xray_test.go b/tests/integration/xray_test.go
deleted file mode 100644
index 76ab1b72..00000000
--- a/tests/integration/xray_test.go
+++ /dev/null
@@ -1 +0,0 @@
-package integration
diff --git a/tests/testdata/artifactory-repo-configs/docker_local_repository_config.json b/tests/testdata/artifactory-repo-configs/docker_local_repository_config.json
new file mode 100644
index 00000000..d58a3d5b
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/docker_local_repository_config.json
@@ -0,0 +1,6 @@
+{
+ "key": "${DOCKER_REPO}",
+ "rclass": "local",
+ "packageType": "docker",
+ "blockPushingSchema1": "false"
+}
\ No newline at end of file
diff --git a/tests/testdata/artifactory-repo-configs/docker_remote_repository_config.json b/tests/testdata/artifactory-repo-configs/docker_remote_repository_config.json
new file mode 100644
index 00000000..2842fd0a
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/docker_remote_repository_config.json
@@ -0,0 +1,7 @@
+{
+ "key": "${DOCKER_REMOTE_REPO}",
+ "rclass": "remote",
+ "packageType": "docker",
+ "url": "https://registry-1.docker.io/",
+ "blockPushingSchema1": "false"
+}
\ No newline at end of file
diff --git a/tests/testdata/artifactory-repo-configs/docker_virtual_repository_config.json b/tests/testdata/artifactory-repo-configs/docker_virtual_repository_config.json
new file mode 100644
index 00000000..7aa3a5c1
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/docker_virtual_repository_config.json
@@ -0,0 +1,8 @@
+{
+ "key": "${DOCKER_VIRTUAL_REPO}",
+ "rclass": "virtual",
+ "packageType": "docker",
+ "repositories": ["${DOCKER_REMOTE_REPO}", "${DOCKER_REPO}"],
+ "defaultDeploymentRepo": "${DOCKER_REPO}",
+ "blockPushingSchema1": "false"
+}
\ No newline at end of file
diff --git a/tests/testdata/artifactory-repo-configs/go_local_repository_config.json b/tests/testdata/artifactory-repo-configs/go_local_repository_config.json
new file mode 100644
index 00000000..e244362e
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/go_local_repository_config.json
@@ -0,0 +1,5 @@
+{
+ "key": "${GO_REPO}",
+ "rclass": "local",
+ "packageType": "go"
+}
diff --git a/tests/testdata/artifactory-repo-configs/go_remote_repository_config.json b/tests/testdata/artifactory-repo-configs/go_remote_repository_config.json
new file mode 100644
index 00000000..4229a068
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/go_remote_repository_config.json
@@ -0,0 +1,7 @@
+{
+ "key": "${GO_REMOTE_REPO}",
+ "rclass": "remote",
+ "packageType": "go",
+ "url": "https://proxy.golang.org",
+ "vcsGitProvider" : "ARTIFACTORY"
+}
diff --git a/tests/testdata/artifactory-repo-configs/go_virtual_repository_config.json b/tests/testdata/artifactory-repo-configs/go_virtual_repository_config.json
new file mode 100644
index 00000000..b71693cd
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/go_virtual_repository_config.json
@@ -0,0 +1,6 @@
+{
+ "key": "${GO_VIRTUAL_REPO}",
+ "rclass": "virtual",
+ "packageType": "go",
+ "repositories": ["${GO_REMOTE_REPO}", "${GO_REPO}"]
+}
diff --git a/tests/testdata/artifactory-repo-configs/gradle_remote_repository_config.json b/tests/testdata/artifactory-repo-configs/gradle_remote_repository_config.json
new file mode 100644
index 00000000..63956d53
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/gradle_remote_repository_config.json
@@ -0,0 +1,6 @@
+{
+ "key": "${GRADLE_REMOTE_REPO}",
+ "rclass": "remote",
+ "packageType": "gradle",
+ "url": "https://repo.maven.apache.org/maven2"
+}
\ No newline at end of file
diff --git a/tests/testdata/artifactory-repo-configs/maven_remote_repository_config.json b/tests/testdata/artifactory-repo-configs/maven_remote_repository_config.json
new file mode 100644
index 00000000..9a6012db
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/maven_remote_repository_config.json
@@ -0,0 +1,6 @@
+{
+ "key": "${MAVEN_REMOTE_REPO}",
+ "rclass": "remote",
+ "packageType": "maven",
+ "url": "https://repo.maven.apache.org/maven2"
+}
\ No newline at end of file
diff --git a/tests/testdata/artifactory-repo-configs/npm_remote_repository_config.json b/tests/testdata/artifactory-repo-configs/npm_remote_repository_config.json
new file mode 100644
index 00000000..2d4e8ef3
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/npm_remote_repository_config.json
@@ -0,0 +1,6 @@
+{
+ "key": "${NPM_REMOTE_REPO}",
+ "rclass": "remote",
+ "packageType": "npm",
+ "url": "https://registry.npmjs.org"
+}
diff --git a/tests/testdata/artifactory-repo-configs/nuget_remote_repository_config.json b/tests/testdata/artifactory-repo-configs/nuget_remote_repository_config.json
new file mode 100644
index 00000000..6a65c4f8
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/nuget_remote_repository_config.json
@@ -0,0 +1,9 @@
+{
+ "key": "${NUGET_REMOTE_REPO}",
+ "rclass": "remote",
+ "packageType": "nuget",
+ "url": "https://www.nuget.org/",
+ "downloadContextPath":"api/v3/package",
+ "feedContextPath":"api/v3",
+ "v3FeedUrl":"https://api.nuget.org/v3/index.json"
+}
diff --git a/tests/testdata/artifactory-repo-configs/pypi_remote_repository_config.json b/tests/testdata/artifactory-repo-configs/pypi_remote_repository_config.json
new file mode 100644
index 00000000..76f6ef16
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/pypi_remote_repository_config.json
@@ -0,0 +1,6 @@
+{
+ "key": "${PYPI_REMOTE_REPO}",
+ "rclass": "remote",
+ "packageType": "pypi",
+ "url": "https://files.pythonhosted.org"
+}
diff --git a/tests/testdata/artifactory-repo-configs/repo1_repository_config.json b/tests/testdata/artifactory-repo-configs/repo1_repository_config.json
new file mode 100644
index 00000000..ecbfecdb
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/repo1_repository_config.json
@@ -0,0 +1,5 @@
+{
+ "key": "${REPO1}",
+ "rclass": "local",
+ "packageType": "generic"
+}
diff --git a/tests/testdata/artifactory-repo-configs/specs_virtual_repository_config.json b/tests/testdata/artifactory-repo-configs/specs_virtual_repository_config.json
new file mode 100644
index 00000000..7d1a13b4
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/specs_virtual_repository_config.json
@@ -0,0 +1,6 @@
+{
+ "key": "${VIRTUAL_REPO}",
+ "packageType": "generic",
+ "repositories": ["${REPO1}"],
+ "rclass": "virtual"
+}
diff --git a/tests/testdata/artifactory-repo-configs/yarn_remote_repository_config.json b/tests/testdata/artifactory-repo-configs/yarn_remote_repository_config.json
new file mode 100644
index 00000000..1474dc89
--- /dev/null
+++ b/tests/testdata/artifactory-repo-configs/yarn_remote_repository_config.json
@@ -0,0 +1,6 @@
+{
+ "key": "${YARN_REMOTE_REPO}",
+ "rclass": "remote",
+ "packageType": "npm",
+ "url": "https://registry.npmjs.org"
+}
diff --git a/tests/testdata/other/applicability-scan/applicable-cve-results.sarif b/tests/testdata/other/applicability-scan/applicable-cve-results.sarif
new file mode 100644
index 00000000..66aee38a
--- /dev/null
+++ b/tests/testdata/other/applicability-scan/applicable-cve-results.sarif
@@ -0,0 +1,84 @@
+{
+ "runs": [
+ {
+ "tool": {
+ "driver": {
+ "name": "JFrog Applicability Scanner",
+ "rules": [
+ {
+ "id": "applic_testCve1",
+ "fullDescription": {
+ "text": "The scanner checks whether the vulnerable function `ansi-regex` is called.",
+ "markdown": "The scanner checks whether the vulnerable function `ansi-regex` is called."
+ },
+ "name": "CVE-2021-3807",
+ "shortDescription": {
+ "text": "Scanner for CVE-2021-3807"
+ }
+ },
+ {
+ "id": "applic_testCve3",
+ "fullDescription": {
+ "text": "The scanner checks whether any of the following vulnerable functions are called:\n\n* `json-schema.validate` with external input to its 1st (`instance`) argument.\n* `json-schema.checkPropertyChange` with external input to its 2nd (`schema`) argument.",
+ "markdown": "The scanner checks whether any of the following vulnerable functions are called:\n\n* `json-schema.validate` with external input to its 1st (`instance`) argument.\n* `json-schema.checkPropertyChange` with external input to its 2nd (`schema`) argument."
+ },
+ "name": "CVE-2021-3918",
+ "shortDescription": {
+ "text": "Scanner for CVE-2021-3918"
+ }
+ }
+ ],
+ "version": "APPLIC_SCANNERv0.2.3"
+ }
+ },
+ "invocations": [
+ {
+ "executionSuccessful": true,
+ "arguments": [
+ "ca_scanner/applicability_scanner",
+ "scan",
+ "ca_config_example.yaml"
+ ],
+ "workingDirectory": {
+ "uri": "file:///Users/ort/Desktop/am_versions_for_ga/mac_arm"
+ }
+ }
+ ],
+ "results": [
+ {
+ "message": {
+ "text": "The vulnerable function json-schema.validate is called with external input"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/ort/Desktop/am_versions_for_ga/mac_arm/demo/node_modules/npm/node_modules/jsprim/lib/jsprim.js"
+ },
+ "region": {
+ "endColumn": 53,
+ "endLine": 531,
+ "snippet": {
+ "text": "mod_jsonschema.validate(input, schema)"
+ },
+ "startColumn": 15,
+ "startLine": 531
+ }
+ }
+ }
+ ],
+ "ruleId": "applic_testCve1"
+ },
+ {
+ "message": {
+ "text": "The scanner checks whether the vulnerable function `ansi-regex` is called."
+ },
+ "kind": "pass",
+ "ruleId": "applic_testCve3"
+ }
+ ]
+ }
+ ],
+ "version": "2.1.0",
+ "$schema": "https://docs.oasis-open.org/sarif/sarif/v2.1.0/cos02/schemas/sarif-schema-2.1.0.json"
+}
\ No newline at end of file
diff --git a/tests/testdata/other/applicability-scan/empty-results.sarif b/tests/testdata/other/applicability-scan/empty-results.sarif
new file mode 100644
index 00000000..ec3c9ccf
--- /dev/null
+++ b/tests/testdata/other/applicability-scan/empty-results.sarif
@@ -0,0 +1,29 @@
+{
+ "runs": [
+ {
+ "tool": {
+ "driver": {
+ "name": "JFrog Applicability Scanner",
+ "rules": [],
+ "version": "APPLIC_SCANNERv0.2.3"
+ }
+ },
+ "invocations": [
+ {
+ "executionSuccessful": true,
+ "arguments": [
+ "ca_scanner/applicability_scanner",
+ "scan",
+ "/Users/ort/workspace/src/jfrog.com/jfrog-cli/LeWVtGDMPW.yaml"
+ ],
+ "workingDirectory": {
+ "uri": "file:///Users/ort/workspace/src/jfrog.com/jfrog-cli"
+ }
+ }
+ ],
+ "results": []
+ }
+ ],
+ "version": "2.1.0",
+ "$schema": "https://docs.oasis-open.org/sarif/sarif/v2.1.0/cos02/schemas/sarif-schema-2.1.0.json"
+}
\ No newline at end of file
diff --git a/tests/testdata/other/applicability-scan/no-applicable-cves-results.sarif b/tests/testdata/other/applicability-scan/no-applicable-cves-results.sarif
new file mode 100644
index 00000000..4257bc86
--- /dev/null
+++ b/tests/testdata/other/applicability-scan/no-applicable-cves-results.sarif
@@ -0,0 +1,121 @@
+{
+ "runs": [
+ {
+ "tool": {
+ "driver": {
+ "name": "JFrog Applicability Scanner",
+ "rules": [
+ {
+ "id": "applic_testCve2",
+ "fullDescription": {
+ "text": "The scanner checks whether the vulnerable function `ansi-regex` is called.",
+ "markdown": "The scanner checks whether the vulnerable function `ansi-regex` is called."
+ },
+ "name": "CVE-2021-3807",
+ "shortDescription": {
+ "text": "Scanner for CVE-2021-3807"
+ }
+ },
+ {
+ "id": "applic_testCve3",
+ "fullDescription": {
+ "text": "The scanner checks whether any of the following vulnerable functions are called:\n\n* `json-schema.validate` with external input to its 1st (`instance`) argument.\n* `json-schema.checkPropertyChange` with external input to its 2nd (`schema`) argument.",
+ "markdown": "The scanner checks whether any of the following vulnerable functions are called:\n\n* `json-schema.validate` with external input to its 1st (`instance`) argument.\n* `json-schema.checkPropertyChange` with external input to its 2nd (`schema`) argument."
+ },
+ "name": "CVE-2021-3918",
+ "shortDescription": {
+ "text": "Scanner for CVE-2021-3918"
+ }
+ },
+ {
+ "id": "applic_testCve4",
+ "fullDescription": {
+ "text": "The scanner checks whether the vulnerable function `ansi-regex` is called.",
+ "markdown": "The scanner checks whether the vulnerable function `ansi-regex` is called."
+ },
+ "name": "CVE-2021-3807",
+ "shortDescription": {
+ "text": "Scanner for CVE-2021-3807"
+ }
+ },
+ {
+ "id": "applic_testCve5",
+ "fullDescription": {
+ "text": "The scanner checks whether any of the following vulnerable functions are called:\n\n* `json-schema.validate` with external input to its 1st (`instance`) argument.\n* `json-schema.checkPropertyChange` with external input to its 2nd (`schema`) argument.",
+ "markdown": "The scanner checks whether any of the following vulnerable functions are called:\n\n* `json-schema.validate` with external input to its 1st (`instance`) argument.\n* `json-schema.checkPropertyChange` with external input to its 2nd (`schema`) argument."
+ },
+ "name": "CVE-2021-3918",
+ "shortDescription": {
+ "text": "Scanner for CVE-2021-3918"
+ }
+ },
+ {
+ "id": "applic_testCve1",
+ "fullDescription": {
+ "text": "The scanner checks whether the vulnerable function `ansi-regex` is called.",
+ "markdown": "The scanner checks whether the vulnerable function `ansi-regex` is called."
+ },
+ "name": "CVE-2021-3807",
+ "shortDescription": {
+ "text": "Scanner for CVE-2021-3807"
+ }
+ }
+ ],
+ "version": "APPLIC_SCANNERv0.2.3"
+ }
+ },
+ "invocations": [
+ {
+ "executionSuccessful": true,
+ "arguments": [
+ "ca_scanner/applicability_scanner",
+ "scan",
+ "ca_config_example.yaml"
+ ],
+ "workingDirectory": {
+ "uri": "file:///Users/ort/Desktop/am_versions_for_ga/mac_arm"
+ }
+ }
+ ],
+ "results": [
+ {
+ "message": {
+ "text": "The scanner checks whether the vulnerable function `ansi-regex` is called."
+ },
+ "kind": "pass",
+ "ruleId": "applic_testCve2"
+ },
+ {
+ "message": {
+ "text": "The scanner checks whether the vulnerable function `ansi-regex` is called."
+ },
+ "kind": "pass",
+ "ruleId": "applic_testCve3"
+ },
+ {
+ "message": {
+ "text": "The scanner checks whether the vulnerable function `ansi-regex` is called."
+ },
+ "kind": "pass",
+ "ruleId": "applic_testCve4"
+ },
+ {
+ "message": {
+ "text": "The scanner checks whether the vulnerable function `ansi-regex` is called."
+ },
+ "kind": "pass",
+ "ruleId": "applic_testCve5"
+ },
+ {
+ "message": {
+ "text": "The scanner checks whether the vulnerable function `ansi-regex` is called."
+ },
+ "kind": "pass",
+ "ruleId": "applic_testCve1"
+ }
+ ]
+ }
+ ],
+ "version": "2.1.0",
+ "$schema": "https://docs.oasis-open.org/sarif/sarif/v2.1.0/cos02/schemas/sarif-schema-2.1.0.json"
+}
\ No newline at end of file
diff --git a/tests/testdata/other/iac-scan/contains-iac-violations-working-dir.sarif b/tests/testdata/other/iac-scan/contains-iac-violations-working-dir.sarif
new file mode 100644
index 00000000..cb7d1aa5
--- /dev/null
+++ b/tests/testdata/other/iac-scan/contains-iac-violations-working-dir.sarif
@@ -0,0 +1,669 @@
+{
+ "runs": [
+ {
+ "tool": {
+ "driver": {
+ "name": "JFrog Terraform scanner",
+ "rules": [
+ {
+ "id": "aws_alb_https_only",
+ "fullDescription": {
+ "text": "Resources `aws_lb_listener` and `aws_alb_listener` should set `protocol = \"HTTPS\"` (default is `\"HTTP\"`)\n\nVulnerable example -\n```\nresource \"aws_lb_listener\" \"vulnerable_example\" {\n protocol = \"HTTP\"\n}\n```\n\nSecure example -\n```\nresource \"aws_lb_listener\" \"secure_example\" {\n protocol = \"HTTPS\"\n}\n```",
+ "markdown": "Resources `aws_lb_listener` and `aws_alb_listener` should set `protocol = \"HTTPS\"` (default is `\"HTTP\"`)\n\nVulnerable example -\n```\nresource \"aws_lb_listener\" \"vulnerable_example\" {\n protocol = \"HTTP\"\n}\n```\n\nSecure example -\n```\nresource \"aws_lb_listener\" \"secure_example\" {\n protocol = \"HTTPS\"\n}\n```"
+ },
+ "shortDescription": {
+ "text": "Scanner for aws_alb_https_only"
+ }
+ },
+ {
+ "id": "aws_cloudwatch_log_encrypt",
+ "fullDescription": {
+ "text": "Resource `aws_cloudwatch_log_group` should have `kms_key_id`\n\nVulnerable example - \n```\nresource \"aws_cloudwatch_log_group\" \"vulnerable_example\" {\n # kms_key_id is not set\n}\n```\n\nSecure example -\n```\nresource \"aws_cloudwatch_log_group\" \"secure_example\" {\n kms_key_id = aws_kms_key.example.arn\n}\n```",
+ "markdown": "Resource `aws_cloudwatch_log_group` should have `kms_key_id`\n\nVulnerable example - \n```\nresource \"aws_cloudwatch_log_group\" \"vulnerable_example\" {\n # kms_key_id is not set\n}\n```\n\nSecure example -\n```\nresource \"aws_cloudwatch_log_group\" \"secure_example\" {\n kms_key_id = aws_kms_key.example.arn\n}\n```"
+ },
+ "shortDescription": {
+ "text": "Scanner for aws_cloudwatch_log_encrypt"
+ }
+ },
+ {
+ "id": "aws_docdb_encrypt_cluster",
+ "fullDescription": {
+ "text": "Resource `aws_docdb_cluster` should have `storage_encrypted=true`\n\nVulnerable example - \n```\nresource \"aws_docdb_cluster\" \"vulnerable_example\" {\n # storage_encrypted is unset\n}\n```\n\nSecure example -\n```\nresource \"aws_docdb_cluster\" \"secure_example\" {\n storage_encrypted = true\n}\n```",
+ "markdown": "Resource `aws_docdb_cluster` should have `storage_encrypted=true`\n\nVulnerable example - \n```\nresource \"aws_docdb_cluster\" \"vulnerable_example\" {\n # storage_encrypted is unset\n}\n```\n\nSecure example -\n```\nresource \"aws_docdb_cluster\" \"secure_example\" {\n storage_encrypted = true\n}\n```"
+ },
+ "shortDescription": {
+ "text": "Scanner for aws_docdb_encrypt_cluster"
+ }
+ },
+ {
+ "id": "aws_eks_encrypt_cluster",
+ "fullDescription": {
+ "text": "Resource `aws_eks_cluster` should have key `encryption_config`\n\nVulnerable example -\n```\nresource \"aws_eks_cluster\" \"vulnerable_example\" {\n # encryption_config is not set\n}\n```\n\nSecure example -\n```\nresource \"aws_eks_cluster\" \"secure_example\" {\n encryption_config {\n resources = [ \"secrets\" ]\n provider {\n key_arn = aws_kms_key.example.arn\n }\n }\n}\n```",
+ "markdown": "Resource `aws_eks_cluster` should have key `encryption_config`\n\nVulnerable example -\n```\nresource \"aws_eks_cluster\" \"vulnerable_example\" {\n # encryption_config is not set\n}\n```\n\nSecure example -\n```\nresource \"aws_eks_cluster\" \"secure_example\" {\n encryption_config {\n resources = [ \"secrets\" ]\n provider {\n key_arn = aws_kms_key.example.arn\n }\n }\n}\n```"
+ },
+ "shortDescription": {
+ "text": "Scanner for aws_eks_encrypt_cluster"
+ }
+ },
+ {
+ "id": "aws_eks_no_cidr",
+ "fullDescription": {
+ "text": "Resource `aws_eks_cluster` should have key `public_access_cidrs` (default is `0.0.0.0/0` which is overly permissive). Note that this endpoint is only enabled when `endpoint_public_access = true` (default is `true`)\n\nVulnerable example -\n```\nresource \"aws_eks_cluster\" \"vulnerable_example\" {\n vpc_config {\n endpoint_public_access = true # or unset\n public_access_cidrs = [\"0.0.0.0/0\"] # or unset \n }\n }\n```\n\nSecure example #1 -\n```\nresource \"aws_eks_cluster\" \"secure_example_1\" {\n vpc_config {\n endpoint_public_access = false\n }\n }\n```\n\nSecure example #2 -\n```\nresource \"aws_eks_cluster\" \"secure_example_2\" {\n vpc_config {\n endpoint_public_access = true\n public_access_cidrs = [\"192.168.0.0/24\"]\n }\n }\n```",
+ "markdown": "Resource `aws_eks_cluster` should have key `public_access_cidrs` (default is `0.0.0.0/0` which is overly permissive). Note that this endpoint is only enabled when `endpoint_public_access = true` (default is `true`)\n\nVulnerable example -\n```\nresource \"aws_eks_cluster\" \"vulnerable_example\" {\n vpc_config {\n endpoint_public_access = true # or unset\n public_access_cidrs = [\"0.0.0.0/0\"] # or unset \n }\n }\n```\n\nSecure example #1 -\n```\nresource \"aws_eks_cluster\" \"secure_example_1\" {\n vpc_config {\n endpoint_public_access = false\n }\n }\n```\n\nSecure example #2 -\n```\nresource \"aws_eks_cluster\" \"secure_example_2\" {\n vpc_config {\n endpoint_public_access = true\n public_access_cidrs = [\"192.168.0.0/24\"]\n }\n }\n```"
+ },
+ "shortDescription": {
+ "text": "Scanner for aws_eks_no_cidr"
+ }
+ },
+ {
+ "id": "aws_rds_encrypt_instance",
+ "fullDescription": {
+ "text": "Resource `aws_db_instance` should have `storage_encrypted=true`\r\n\r\nVulnerable example -\r\n```\r\nresource \"aws_db_instance\" \"vulnerable_example\" {\r\n # storage_encrypted is not set\r\n}\r\n```\r\n\r\nSecure example -\r\n```\r\nresource \"aws_db_instance\" \"secure_example\" {\r\n kms_key_id = aws_kms_key.example.arn\r\n storage_encrypted = true\r\n}\r\n```",
+ "markdown": "Resource `aws_db_instance` should have `storage_encrypted=true`\r\n\r\nVulnerable example -\r\n```\r\nresource \"aws_db_instance\" \"vulnerable_example\" {\r\n # storage_encrypted is not set\r\n}\r\n```\r\n\r\nSecure example -\r\n```\r\nresource \"aws_db_instance\" \"secure_example\" {\r\n kms_key_id = aws_kms_key.example.arn\r\n storage_encrypted = true\r\n}\r\n```"
+ },
+ "shortDescription": {
+ "text": "Scanner for aws_rds_encrypt_instance"
+ }
+ },
+ {
+ "id": "aws_rds_iam_auth",
+ "fullDescription": {
+ "text": "Resource `aws_db_instance` should have `iam_database_authentication_enabled=true`\r\n\r\nVulnerable example -\r\n```\r\nresource \"aws_db_instance\" \"vulnerable_example\" {\r\n # iam_database_authentication_enabled is unset (or false)\r\n}\r\n```\r\n\r\nSecure example -\r\n```\r\nresource \"aws_db_instance\" \"secure_example\" {\r\n iam_database_authentication_enabled = true\r\n}\r\n```",
+ "markdown": "Resource `aws_db_instance` should have `iam_database_authentication_enabled=true`\r\n\r\nVulnerable example -\r\n```\r\nresource \"aws_db_instance\" \"vulnerable_example\" {\r\n # iam_database_authentication_enabled is unset (or false)\r\n}\r\n```\r\n\r\nSecure example -\r\n```\r\nresource \"aws_db_instance\" \"secure_example\" {\r\n iam_database_authentication_enabled = true\r\n}\r\n```"
+ },
+ "shortDescription": {
+ "text": "Scanner for aws_rds_iam_auth"
+ }
+ },
+ {
+ "id": "aws_s3_block_public_acl",
+ "fullDescription": {
+ "text": "If resource `aws_s3_bucket` exists, then resource `aws_s3_bucket_public_access_block` must also exist and have `block_public_acls=true`\r\n\r\nVulnerable example -\r\n```\r\nresource \"aws_s3_bucket\" \"example_bucket\" {\r\n bucket = \"mybucket\"\r\n}\r\n\r\nresource \"aws_s3_bucket_public_access_block\" \"vulnerable_example\" {\r\n bucket = aws_s3_bucket.example_bucket.id\r\n # block_public_acls is not set\r\n}\r\n```\r\n\r\nSecure example -\r\n```\r\nresource \"aws_s3_bucket\" \"example_bucket\" {\r\n bucket = \"mybucket\"\r\n}\r\n\r\nresource \"aws_s3_bucket_public_access_block\" \"secure_example\" {\r\n bucket = aws_s3_bucket.example_bucket.id\r\n block_public_acls = true\r\n}\r\n```",
+ "markdown": "If resource `aws_s3_bucket` exists, then resource `aws_s3_bucket_public_access_block` must also exist and have `block_public_acls=true`\r\n\r\nVulnerable example -\r\n```\r\nresource \"aws_s3_bucket\" \"example_bucket\" {\r\n bucket = \"mybucket\"\r\n}\r\n\r\nresource \"aws_s3_bucket_public_access_block\" \"vulnerable_example\" {\r\n bucket = aws_s3_bucket.example_bucket.id\r\n # block_public_acls is not set\r\n}\r\n```\r\n\r\nSecure example -\r\n```\r\nresource \"aws_s3_bucket\" \"example_bucket\" {\r\n bucket = \"mybucket\"\r\n}\r\n\r\nresource \"aws_s3_bucket_public_access_block\" \"secure_example\" {\r\n bucket = aws_s3_bucket.example_bucket.id\r\n block_public_acls = true\r\n}\r\n```"
+ },
+ "shortDescription": {
+ "text": "Scanner for aws_s3_block_public_acl"
+ }
+ },
+ {
+ "id": "aws_s3_block_public_policy",
+ "fullDescription": {
+ "text": "If resource `aws_s3_bucket` exists, then resource `aws_s3_bucket_public_access_block` must also exist and have `block_public_acls=true`\r\n\r\nVulnerable example -\r\n```\r\nresource \"aws_s3_bucket\" \"example_bucket\" {\r\n bucket = \"mybucket\"\r\n}\r\n\r\nresource \"aws_s3_bucket_public_access_block\" \"vulnerable_example\" {\r\n bucket = aws_s3_bucket.example_bucket.id\r\n # block_public_acls is not set\r\n}\r\n```\r\n\r\nSecure example -\r\n```\r\nresource \"aws_s3_bucket\" \"example_bucket\" {\r\n bucket = \"mybucket\"\r\n}\r\n\r\nresource \"aws_s3_bucket_public_access_block\" \"secure_example\" {\r\n bucket = aws_s3_bucket.example_bucket.id\r\n block_public_acls = true\r\n}\r\n```",
+ "markdown": "If resource `aws_s3_bucket` exists, then resource `aws_s3_bucket_public_access_block` must also exist and have `block_public_acls=true`\r\n\r\nVulnerable example -\r\n```\r\nresource \"aws_s3_bucket\" \"example_bucket\" {\r\n bucket = \"mybucket\"\r\n}\r\n\r\nresource \"aws_s3_bucket_public_access_block\" \"vulnerable_example\" {\r\n bucket = aws_s3_bucket.example_bucket.id\r\n # block_public_acls is not set\r\n}\r\n```\r\n\r\nSecure example -\r\n```\r\nresource \"aws_s3_bucket\" \"example_bucket\" {\r\n bucket = \"mybucket\"\r\n}\r\n\r\nresource \"aws_s3_bucket_public_access_block\" \"secure_example\" {\r\n bucket = aws_s3_bucket.example_bucket.id\r\n block_public_acls = true\r\n}\r\n```"
+ },
+ "shortDescription": {
+ "text": "Scanner for aws_s3_block_public_policy"
+ }
+ },
+ {
+ "id": "aws_s3_encrypt",
+ "fullDescription": {
+ "text": "If resource `aws_s3_bucket` exists, then resource `aws_s3_bucket_server_side_encryption_configuration` must also exist with the key `apply_server_side_encryption_by_default`. Alternatively, the `aws_s3_bucket` resource should have the (deprecated) `server_side_encryption_configuration` key.\n\nVulnerable example #1 -\n```\nresource \"aws_s3_bucket\" \"mybucket\" {\n bucket = \"mybucket\"\n}\n\n# resource \"aws_s3_bucket_server_side_encryption_configuration\" is not defined\n```\n\nSecure example #1 -\n```\nresource \"aws_s3_bucket\" \"mybucket\" {\n bucket = \"mybucket\"\n}\n\nresource \"aws_s3_bucket_server_side_encryption_configuration\" \"secure_example_1\" {\n bucket = aws_s3_bucket.mybucket.bucket\n\n rule {\n apply_server_side_encryption_by_default {\n kms_master_key_id = aws_kms_key.mykey.arn\n sse_algorithm = \"aws:kms\"\n }\n }\n}\n```\n\nVulnerable example #2 -\n```\nresource \"aws_s3_bucket\" \"vulnerable_example_2\" {\n # server_side_encryption_configuration is not set\n}\n```\n\nSecure example #2 -\n```\nresource \"aws_s3_bucket\" \"secure_example_2\" {\n bucket = \"mybucket\"\n\n server_side_encryption_configuration {\n rule {\n apply_server_side_encryption_by_default {\n kms_master_key_id = aws_kms_key.mykey.arn\n sse_algorithm = \"aws:kms\"\n }\n }\n }\n}\n```",
+ "markdown": "If resource `aws_s3_bucket` exists, then resource `aws_s3_bucket_server_side_encryption_configuration` must also exist with the key `apply_server_side_encryption_by_default`. Alternatively, the `aws_s3_bucket` resource should have the (deprecated) `server_side_encryption_configuration` key.\n\nVulnerable example #1 -\n```\nresource \"aws_s3_bucket\" \"mybucket\" {\n bucket = \"mybucket\"\n}\n\n# resource \"aws_s3_bucket_server_side_encryption_configuration\" is not defined\n```\n\nSecure example #1 -\n```\nresource \"aws_s3_bucket\" \"mybucket\" {\n bucket = \"mybucket\"\n}\n\nresource \"aws_s3_bucket_server_side_encryption_configuration\" \"secure_example_1\" {\n bucket = aws_s3_bucket.mybucket.bucket\n\n rule {\n apply_server_side_encryption_by_default {\n kms_master_key_id = aws_kms_key.mykey.arn\n sse_algorithm = \"aws:kms\"\n }\n }\n}\n```\n\nVulnerable example #2 -\n```\nresource \"aws_s3_bucket\" \"vulnerable_example_2\" {\n # server_side_encryption_configuration is not set\n}\n```\n\nSecure example #2 -\n```\nresource \"aws_s3_bucket\" \"secure_example_2\" {\n bucket = \"mybucket\"\n\n server_side_encryption_configuration {\n rule {\n apply_server_side_encryption_by_default {\n kms_master_key_id = aws_kms_key.mykey.arn\n sse_algorithm = \"aws:kms\"\n }\n }\n }\n}\n```"
+ },
+ "shortDescription": {
+ "text": "Scanner for aws_s3_encrypt"
+ }
+ }
+ ],
+ "version": ""
+ }
+ },
+ "invocations": [
+ {
+ "executionSuccessful": true,
+ "arguments": [
+ "iac_scanner/tf_scanner",
+ "scan",
+ "/var/folders/mj/sk15wcdx5kl7p5shk662bjt80000gn/T/jfrog.cli.temp.-1690974158-62790465/config.yaml"
+ ],
+ "workingDirectory": {
+ "uri": "file:///Users/omerz/.jfrog/dependencies/analyzerManager"
+ }
+ }
+ ],
+ "results": [
+ {
+ "message": {
+ "text": "storage_encrypted=false was detected"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/byok/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 121,
+ "snippet": {
+ "text": "byok_database"
+ },
+ "startColumn": 1,
+ "startLine": 69
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_rds_encrypt_instance"
+ },
+ {
+ "message": {
+ "text": "iam_database_authentication_enabled=False was detected"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/byok/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 121,
+ "snippet": {
+ "text": "byok_database"
+ },
+ "startColumn": 1,
+ "startLine": 69
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_rds_iam_auth"
+ },
+ {
+ "message": {
+ "text": "storage_encrypted=False was detected"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/documentdb/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 31,
+ "snippet": {
+ "text": "default"
+ },
+ "startColumn": 1,
+ "startLine": 15
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_docdb_encrypt_cluster"
+ },
+ {
+ "message": {
+ "text": "AWS EKS public API server is publicly accessible"
+ },
+ "level": "error",
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/eks_mng_ng/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 65,
+ "snippet": {
+ "text": "aws_eks"
+ },
+ "startColumn": 1,
+ "startLine": 36
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_eks_no_cidr"
+ },
+ {
+ "message": {
+ "text": "AWS EKS public API server is publicly accessible"
+ },
+ "level": "error",
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/eks_mng_ng_coralogix/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 65,
+ "snippet": {
+ "text": "aws_eks"
+ },
+ "startColumn": 1,
+ "startLine": 36
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_eks_no_cidr"
+ },
+ {
+ "message": {
+ "text": "AWS EKS public API server is publicly accessible"
+ },
+ "level": "error",
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/k8s/module/cluster.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 24,
+ "snippet": {
+ "text": "this"
+ },
+ "startColumn": 1,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_eks_no_cidr"
+ },
+ {
+ "message": {
+ "text": "AWS EKS secrets do not usedata-at-rest encryption"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/k8s/module/cluster.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 24,
+ "snippet": {
+ "text": "this"
+ },
+ "startColumn": 1,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_eks_encrypt_cluster"
+ },
+ {
+ "message": {
+ "text": "AWS EKS public API server is publicly accessible"
+ },
+ "level": "error",
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/k8s/module2/cluster.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 49,
+ "snippet": {
+ "text": "this"
+ },
+ "startColumn": 1,
+ "startLine": 9
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_eks_no_cidr"
+ },
+ {
+ "message": {
+ "text": "kms_key_id='' was detected"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/msk/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 33,
+ "snippet": {
+ "text": "log"
+ },
+ "startColumn": 1,
+ "startLine": 30
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_cloudwatch_log_encrypt"
+ },
+ {
+ "message": {
+ "text": "block_public_acls=false was detected"
+ },
+ "level": "error",
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/msk/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 39,
+ "snippet": {
+ "text": "bucket"
+ },
+ "startColumn": 1,
+ "startLine": 35
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_s3_block_public_acl"
+ },
+ {
+ "message": {
+ "text": "block_public_acls=false was detected"
+ },
+ "level": "error",
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/msk/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 39,
+ "snippet": {
+ "text": "bucket"
+ },
+ "startColumn": 1,
+ "startLine": 35
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_s3_block_public_policy"
+ },
+ {
+ "message": {
+ "text": "Missing server_side_encryption_configuration was detected, Missing aws_s3_bucket_server_side_encryption_configuration was detected"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/msk/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 39,
+ "snippet": {
+ "text": "bucket"
+ },
+ "startColumn": 1,
+ "startLine": 35
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_s3_encrypt"
+ },
+ {
+ "message": {
+ "text": "Missing server_side_encryption_configuration was detected, Missing aws_s3_bucket_server_side_encryption_configuration was detected"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/msk/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 39,
+ "snippet": {
+ "text": "bucket"
+ },
+ "startColumn": 1,
+ "startLine": 35
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_s3_encrypt"
+ },
+ {
+ "message": {
+ "text": "storage_encrypted=false was detected"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/mysql_coralogix/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 147,
+ "snippet": {
+ "text": "k8s_database"
+ },
+ "startColumn": 1,
+ "startLine": 102
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_rds_encrypt_instance"
+ },
+ {
+ "message": {
+ "text": "iam_database_authentication_enabled=False was detected"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/mysql_coralogix/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 147,
+ "snippet": {
+ "text": "k8s_database"
+ },
+ "startColumn": 1,
+ "startLine": 102
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_rds_iam_auth"
+ },
+ {
+ "message": {
+ "text": "AWS Load balancer using insecure communications"
+ },
+ "level": "error",
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/private_link/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 68,
+ "snippet": {
+ "text": "pl_lb_listener"
+ },
+ "startColumn": 1,
+ "startLine": 53
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_alb_https_only"
+ },
+ {
+ "message": {
+ "text": "AWS Load balancer using insecure communications"
+ },
+ "level": "error",
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/private_link/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 115,
+ "snippet": {
+ "text": "pl_lb_listener_plain"
+ },
+ "startColumn": 1,
+ "startLine": 100
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_alb_https_only"
+ },
+ {
+ "message": {
+ "text": "storage_encrypted=false was detected"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/rds/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 152,
+ "snippet": {
+ "text": "k8s_database"
+ },
+ "startColumn": 1,
+ "startLine": 103
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_rds_encrypt_instance"
+ },
+ {
+ "message": {
+ "text": "iam_database_authentication_enabled=False was detected"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/rds/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 152,
+ "snippet": {
+ "text": "k8s_database"
+ },
+ "startColumn": 1,
+ "startLine": 103
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_rds_iam_auth"
+ },
+ {
+ "message": {
+ "text": "block_public_acls=false was detected"
+ },
+ "level": "error",
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/s3/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 28,
+ "snippet": {
+ "text": "default"
+ },
+ "startColumn": 1,
+ "startLine": 8
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_s3_block_public_acl"
+ },
+ {
+ "message": {
+ "text": "block_public_acls=false was detected"
+ },
+ "level": "error",
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/s3/module.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 28,
+ "snippet": {
+ "text": "default"
+ },
+ "startColumn": 1,
+ "startLine": 8
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_s3_block_public_policy"
+ },
+ {
+ "message": {
+ "text": "kms_key_id='' was detected"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/omerz/Documents/analyzers_test/iac/aws/vpc/module2/examples/vpc-flow-logs/cloud-watch-logs.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 53,
+ "snippet": {
+ "text": "flow_log"
+ },
+ "startColumn": 1,
+ "startLine": 51
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_cloudwatch_log_encrypt"
+ }
+ ]
+ }
+ ],
+ "version": "2.1.0",
+ "$schema": "https://docs.oasis-open.org/sarif/sarif/v2.1.0/cos02/schemas/sarif-schema-2.1.0.json"
+}
\ No newline at end of file
diff --git a/tests/testdata/other/iac-scan/contains-iac-violations.sarif b/tests/testdata/other/iac-scan/contains-iac-violations.sarif
new file mode 100644
index 00000000..f9ee5301
--- /dev/null
+++ b/tests/testdata/other/iac-scan/contains-iac-violations.sarif
@@ -0,0 +1,129 @@
+{
+ "runs": [
+ {
+ "tool": {
+ "driver": {
+ "name": "JFrog Terraform scanner",
+ "rules": [],
+ "version": ""
+ }
+ },
+ "invocations": [
+ {
+ "executionSuccessful": true,
+ "arguments": [
+ "./tf_scanner",
+ "scan",
+ "scan.yaml"
+ ],
+ "workingDirectory": {
+ "uri": "file:///Users/ilya/Downloads/tf-scanner-main/src/dist/tf_scanner"
+ }
+ }
+ ],
+ "results": [
+ {
+ "message": {
+ "text": "AWS Load balancer using insecure communications"
+ },
+ "level": "error",
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/ilya/Downloads/tf-scanner-main/tests/hcl/applicable/req_sw_terraform_aws_alb_https_only.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 12,
+ "snippet": {
+ "text": "vulnerable_example"
+ },
+ "startColumn": 1,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_alb_https_only"
+ },
+ {
+ "message": {
+ "text": "authorization=NONE was detected"
+ },
+ "level": "error",
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/ilya/Downloads/tf-scanner-main/tests/hcl/applicable/req_sw_terraform_aws_api_gateway_auth.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 6,
+ "snippet": {
+ "text": "vulnerable_method"
+ },
+ "startColumn": 1,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_api_gateway_auth"
+ },
+ {
+ "message": {
+ "text": "cache_data_encrypted=False was detected"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/ilya/Downloads/tf-scanner-main/tests/hcl/applicable/req_sw_terraform_aws_api_gateway_encrypt_cache.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 8,
+ "snippet": {
+ "text": "vulnerable_example"
+ },
+ "startColumn": 1,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_api_gateway_encrypt_cache"
+ },
+ {
+ "message": {
+ "text": "security_policy!=TLS_1_2 was detected"
+ },
+ "level": "note",
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/ilya/Downloads/tf-scanner-main/tests/hcl/applicable/req_sw_terraform_aws_api_gateway_tls_version.tf"
+ },
+ "region": {
+ "endColumn": 2,
+ "endLine": 4,
+ "snippet": {
+ "text": "vulnerable_example"
+ },
+ "startColumn": 1,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "aws_api_gateway_tls_version"
+ }
+ ]
+ }
+ ],
+ "version": "2.1.0",
+ "$schema": "https://docs.oasis-open.org/sarif/sarif/v2.1.0/cos02/schemas/sarif-schema-2.1.0.json"
+}
\ No newline at end of file
diff --git a/tests/testdata/other/iac-scan/no-violations.sarif b/tests/testdata/other/iac-scan/no-violations.sarif
new file mode 100644
index 00000000..0edb478b
--- /dev/null
+++ b/tests/testdata/other/iac-scan/no-violations.sarif
@@ -0,0 +1,30 @@
+{
+ "runs": [
+ {
+ "tool": {
+ "driver": {
+ "name": "JFrog Terraform scanner",
+ "rules": [],
+ "version": ""
+ }
+ },
+ "invocations": [
+ {
+ "executionSuccessful": true,
+ "arguments": [
+ "./tf_scanner",
+ "scan",
+ "scan.yaml"
+ ],
+ "workingDirectory": {
+ "uri": "file:///Users/ilya/Downloads/tf-scanner-main/src/dist/tf_scanner"
+ }
+ }
+ ],
+ "results": [
+ ]
+ }
+ ],
+ "version": "2.1.0",
+ "$schema": "https://docs.oasis-open.org/sarif/sarif/v2.1.0/cos02/schemas/sarif-schema-2.1.0.json"
+}
\ No newline at end of file
diff --git a/tests/testdata/other/npm/dependencies.json b/tests/testdata/other/npm/dependencies.json
new file mode 100644
index 00000000..eadc8ca1
--- /dev/null
+++ b/tests/testdata/other/npm/dependencies.json
@@ -0,0 +1,447 @@
+[
+ {
+ "id": "react-dom:18.2.0",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ],
+ [
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "styled-jsx:5.0.0",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "jose:4.14.4",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ],
+ [
+ "openid-client:5.4.2",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "yallist:4.0.0",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "lru-cache:6.0.0",
+ "openid-client:5.4.2",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "lru-cache:6.0.0",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "openid-client:5.4.2",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "picocolors:1.0.0",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "postcss:8.4.5",
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "@next/swc-darwin-arm64:12.0.10",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "react:18.2.0",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "react-dom:18.2.0",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ],
+ [
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ],
+ [
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ],
+ [
+ "styled-jsx:5.0.0",
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ],
+ [
+ "use-subscription:1.5.1",
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "uuid:8.3.2",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "oidc-token-hash:5.0.3",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "openid-client:5.4.2",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "pretty-format:3.8.0",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "preact-render-to-string:5.2.6",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "loose-envify:1.4.0",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "react-dom:18.2.0",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ],
+ [
+ "scheduler:0.23.0",
+ "react-dom:18.2.0",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ],
+ [
+ "react:18.2.0",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "@next/env:12.0.10",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "caniuse-lite:1.0.30001486",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "source-map-js:1.0.2",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "postcss:8.4.5",
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "regenerator-runtime:0.13.11",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "@babel/runtime:7.21.5",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "openid-client:5.4.2",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "js-tokens:4.0.0",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "loose-envify:1.4.0",
+ "react-dom:18.2.0",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "next-auth:4.22.1",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "next:12.0.10",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ],
+ [
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "nanoid:3.3.6",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "postcss:8.4.5",
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "@panva/hkdf:1.1.1",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "preact-render-to-string:5.2.6",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "preact:10.13.2",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "preact-render-to-string:5.2.6",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ],
+ [
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "scheduler:0.23.0",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "react-dom:18.2.0",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "postcss:8.4.5",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "use-subscription:1.5.1",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "@babel/runtime:7.21.5",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "cookie:0.5.0",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "object-assign:4.1.1",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "use-subscription:1.5.1",
+ "next:12.0.10",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "oauth:0.9.15",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ },
+ {
+ "id": "object-hash:2.2.0",
+ "scopes": [
+ "prod"
+ ],
+ "requestedBy": [
+ [
+ "openid-client:5.4.2",
+ "next-auth:4.22.1",
+ "npmexmaple:0.1.0"
+ ]
+ ]
+ }
+]
diff --git a/tests/testdata/other/nuget/dependencies.json b/tests/testdata/other/nuget/dependencies.json
new file mode 100644
index 00000000..cc8a0700
--- /dev/null
+++ b/tests/testdata/other/nuget/dependencies.json
@@ -0,0 +1,194 @@
+{
+ "modules": [
+ {
+ "type": "nuget",
+ "id": "MsbuildExample",
+ "dependencies": [
+ {
+ "id": "popper.js:1.14.0",
+ "requestedBy": [
+ [
+ "bootstrap:4.1.1",
+ "MsbuildExample"
+ ]
+ ],
+ "sha1": "d5d19f07320da701533fa3312ff9a1ab19226a58",
+ "md5": "ab6c0c4a4115189ec0f98365aa6d5f24"
+ },
+ {
+ "id": "bootstrap:4.1.1",
+ "requestedBy": [
+ [
+ "MsbuildExample"
+ ]
+ ],
+ "sha1": "b7ed3361060a0340f320775fd1683bbd99c69a9e",
+ "md5": "0985d5cbfe518b73d0c2c6e1d88f3383"
+ },
+ {
+ "id": "jQuery:3.0.0",
+ "requestedBy": [
+ [
+ "bootstrap:4.1.1",
+ "MsbuildExample"
+ ]
+ ],
+ "sha1": "9e33e5144cf5bf44209bf69eb7afd7055be4d084",
+ "md5": "396cfda081aa84129e4a250c4011c183"
+ },
+ {
+ "id": "Microsoft.Bcl:1.1.10",
+ "requestedBy": [
+ [
+ "Microsoft.Net.Http:2.2.29",
+ "MsbuildExample"
+ ]
+ ],
+ "sha1": "70eee9600a89d2056280636105a385a0f4573737",
+ "md5": "ff7048162a09a686a1515922e28cf75e"
+ },
+ {
+ "id": "Microsoft.Bcl.Build:1.0.14",
+ "requestedBy": [
+ [
+ "Microsoft.Bcl:1.1.10",
+ "Microsoft.Net.Http:2.2.29",
+ "MsbuildExample"
+ ],
+ [
+ "Microsoft.Net.Http:2.2.29",
+ "MsbuildExample"
+ ]
+ ],
+ "sha1": "b8f2886356d67b2010bf3e5b31980270cba8a501",
+ "md5": "4589a330c8ca38a950280c276d566ee1"
+ },
+ {
+ "id": "Microsoft.Net.Http:2.2.29",
+ "requestedBy": [
+ [
+ "MsbuildExample"
+ ]
+ ],
+ "sha1": "b4064674a5e08b9cdb8d5fa91b6574c22fe86894",
+ "md5": "6addd47c57094b7d67d5ebc0bf0b1a6f"
+ },
+ {
+ "id": "Newtonsoft.Json:11.0.2",
+ "requestedBy": [
+ [
+ "MsbuildExample"
+ ]
+ ],
+ "sha1": "5703f14d06c5de46adc5b4699af3a4dacbb84956",
+ "md5": "cceef905d79cd7cd5d8e22ea68fe7fb6"
+ },
+ {
+ "id": "NUnit:3.10.1",
+ "requestedBy": [
+ [
+ "MsbuildExample"
+ ]
+ ],
+ "sha1": "3023720a81a788cae5996b171be3b352b6e73b26",
+ "md5": "2060d814f22b1b640952674a4870c89a"
+ }
+ ]
+ },
+ {
+ "type": "nuget",
+ "id": "MsbuildLibrary",
+ "dependencies": [
+ {
+ "id": "Microsoft.Bcl.Build:1.0.14",
+ "requestedBy": [
+ [
+ "Microsoft.Bcl:1.1.10",
+ "Microsoft.Net.Http:2.2.29",
+ "MsbuildLibrary"
+ ],
+ [
+ "Microsoft.Net.Http:2.2.29",
+ "MsbuildLibrary"
+ ]
+ ],
+ "sha1": "b8f2886356d67b2010bf3e5b31980270cba8a501",
+ "md5": "4589a330c8ca38a950280c276d566ee1"
+ },
+ {
+ "id": "Microsoft.Net.Http:2.2.29",
+ "requestedBy": [
+ [
+ "MsbuildLibrary"
+ ]
+ ],
+ "sha1": "b4064674a5e08b9cdb8d5fa91b6574c22fe86894",
+ "md5": "6addd47c57094b7d67d5ebc0bf0b1a6f"
+ },
+ {
+ "id": "Newtonsoft.Json:11.0.2",
+ "requestedBy": [
+ [
+ "MsbuildLibrary"
+ ]
+ ],
+ "sha1": "5703f14d06c5de46adc5b4699af3a4dacbb84956",
+ "md5": "cceef905d79cd7cd5d8e22ea68fe7fb6"
+ },
+ {
+ "id": "NUnit:3.10.1",
+ "requestedBy": [
+ [
+ "MsbuildLibrary"
+ ]
+ ],
+ "sha1": "3023720a81a788cae5996b171be3b352b6e73b26",
+ "md5": "2060d814f22b1b640952674a4870c89a"
+ },
+ {
+ "id": "popper.js:1.14.0",
+ "requestedBy": [
+ [
+ "bootstrap:4.1.1",
+ "MsbuildLibrary"
+ ]
+ ],
+ "sha1": "d5d19f07320da701533fa3312ff9a1ab19226a58",
+ "md5": "ab6c0c4a4115189ec0f98365aa6d5f24"
+ },
+ {
+ "id": "bootstrap:4.1.1",
+ "requestedBy": [
+ [
+ "MsbuildLibrary"
+ ]
+ ],
+ "sha1": "b7ed3361060a0340f320775fd1683bbd99c69a9e",
+ "md5": "0985d5cbfe518b73d0c2c6e1d88f3383"
+ },
+ {
+ "id": "jQuery:3.0.0",
+ "requestedBy": [
+ [
+ "bootstrap:4.1.1",
+ "MsbuildLibrary"
+ ]
+ ],
+ "sha1": "9e33e5144cf5bf44209bf69eb7afd7055be4d084",
+ "md5": "396cfda081aa84129e4a250c4011c183"
+ },
+ {
+ "id": "Microsoft.Bcl:1.1.10",
+ "requestedBy": [
+ [
+ "Microsoft.Net.Http:2.2.29",
+ "MsbuildLibrary"
+ ]
+ ],
+ "sha1": "70eee9600a89d2056280636105a385a0f4573737",
+ "md5": "ff7048162a09a686a1515922e28cf75e"
+ }
+ ]
+ }
+ ]
+}
\ No newline at end of file
diff --git a/tests/testdata/other/nuget/expectedTree.json b/tests/testdata/other/nuget/expectedTree.json
new file mode 100644
index 00000000..f05871fd
--- /dev/null
+++ b/tests/testdata/other/nuget/expectedTree.json
@@ -0,0 +1,92 @@
+[
+ {
+ "component_id": "nuget://MsbuildExample",
+ "nodes": [
+ {
+ "component_id": "nuget://bootstrap:4.1.1",
+ "nodes": [
+ {
+ "component_id": "nuget://popper.js:1.14.0"
+ },
+ {
+ "component_id": "nuget://jQuery:3.0.0"
+ }
+ ]
+ },
+ {
+ "component_id": "nuget://Microsoft.Net.Http:2.2.29",
+ "nodes": [
+ {
+ "component_id": "nuget://Microsoft.Bcl:1.1.10",
+ "nodes": [
+ {
+ "component_id": "nuget://Microsoft.Bcl.Build:1.0.14"
+ }
+ ]
+ }
+ ]
+ },
+ {
+ "component_id": "nuget://Newtonsoft.Json:11.0.2"
+ },
+ {
+ "component_id": "nuget://NUnit:3.10.1"
+ }
+ ]
+ },
+ {
+ "component_id": "nuget://MsbuildLibrary",
+ "nodes": [
+ {
+ "component_id": "nuget://Microsoft.Net.Http:2.2.29",
+ "nodes": [
+ {
+ "component_id": "nuget://Microsoft.Bcl:1.1.10",
+ "nodes": [
+ {
+ "component_id": "nuget://Microsoft.Bcl.Build:1.0.14"
+ },
+ {
+ "component_id": "nuget://Microsoft.Bcl.Build:1.0.14"
+ }
+ ]
+ },
+ {
+ "component_id": "nuget://Microsoft.Bcl:1.1.10",
+ "nodes": [
+ {
+ "component_id": "nuget://Microsoft.Bcl.Build:1.0.14"
+ },
+ {
+ "component_id": "nuget://Microsoft.Bcl.Build:1.0.14"
+ }
+ ]
+ }
+ ]
+ },
+ {
+ "component_id": "nuget://Newtonsoft.Json:11.0.2"
+ },
+ {
+ "component_id": "nuget://NUnit:3.10.1"
+ },
+ {
+ "component_id": "nuget://bootstrap:4.1.1",
+ "nodes": [
+ {
+ "component_id": "nuget://popper.js:1.14.0"
+ },
+ {
+ "component_id": "nuget://jQuery:3.0.0"
+ },
+ {
+ "component_id": "nuget://popper.js:1.14.0"
+ },
+ {
+ "component_id": "nuget://jQuery:3.0.0"
+ }
+ ]
+ }
+ ]
+ }
+]
\ No newline at end of file
diff --git a/tests/testdata/other/sast-scan/contains-sast-violations.sarif b/tests/testdata/other/sast-scan/contains-sast-violations.sarif
new file mode 100644
index 00000000..d8b3c02e
--- /dev/null
+++ b/tests/testdata/other/sast-scan/contains-sast-violations.sarif
@@ -0,0 +1,907 @@
+{
+ "runs": [
+ {
+ "tool": {
+ "driver": {
+ "name": "USAF",
+ "rules": [
+ {
+ "id": "python-command-injection",
+ "defaultConfiguration": {
+ "parameters": {
+ "properties": {
+ "CWE": "78"
+ }
+ }
+ },
+ "fullDescription": {
+ "text": "\nRemote Code Execution is a type of vulnerability that allows an attacker\nto execute arbitrary code on a remote computer or device.\nThis can allow the attacker to gain full control of the target system, potentially\nleading to sensitive information being compromised or unauthorized actions being performed.\n\nIn this query we look for user inputs that can flow directly into execution commands\nin Python. There are many types of Command Injection, so in the future we will make\nfine-tuning changes that will need to apply to this query in the future.\n",
+ "markdown": "\nRemote Code Execution is a type of vulnerability that allows an attacker\nto execute arbitrary code on a remote computer or device.\nThis can allow the attacker to gain full control of the target system, potentially\nleading to sensitive information being compromised or unauthorized actions being performed.\n\nIn this query we look for user inputs that can flow directly into execution commands\nin Python. There are many types of Command Injection, so in the future we will make\nfine-tuning changes that will need to apply to this query in the future.\n"
+ },
+ "shortDescription": {
+ "text": "Command Injection"
+ }
+ },
+ {
+ "id": "python-flask-debug",
+ "defaultConfiguration": {
+ "parameters": {
+ "properties": {
+ "CWE": "1295"
+ }
+ }
+ },
+ "fullDescription": {
+ "text": "\n### Overview\nDebug mode in a Flask app is a feature that allows the developer to see detailed\nerror messages and tracebacks when an error occurs. This can be useful for debugging\nand troubleshooting, but it can also create a security vulnerability if the app is\ndeployed in debug mode. In debug mode, Flask will display detailed error messages and\ntracebacks to the user, even if the error is caused by malicious input.\nThis can provide attackers with valuable information about the app's internal workings\nand vulnerabilities, making it easier for them to exploit those vulnerabilities.\n\n### Query operation\nIn this query we look Flask applications that set the `debug` argument to `True`\n\n### Vulnerable example\n```python\nfrom flask import Flask\n\napp = Flask(__name__)\n\n@app.route('/')\ndef hello():\n return 'Hello, World!'\n\nif __name__ == '__main__':\n app.run(debug=True)\n```\nIn this example, the Flask application is set to run in debug mode by passing\n`debug=True` as an argument to the `app.run()` function. This will make the application\nemit potentially sensitive information to the users.\n\n### Remediation\nWhen using `app.run`, omit the `debug` flag or set it to `False` -\n```diff\nif __name__ == '__main__':\n- app.run(debug=True)\n+ app.run()\n```\n",
+ "markdown": "\n### Overview\nDebug mode in a Flask app is a feature that allows the developer to see detailed\nerror messages and tracebacks when an error occurs. This can be useful for debugging\nand troubleshooting, but it can also create a security vulnerability if the app is\ndeployed in debug mode. In debug mode, Flask will display detailed error messages and\ntracebacks to the user, even if the error is caused by malicious input.\nThis can provide attackers with valuable information about the app's internal workings\nand vulnerabilities, making it easier for them to exploit those vulnerabilities.\n\n### Query operation\nIn this query we look Flask applications that set the `debug` argument to `True`\n\n### Vulnerable example\n```python\nfrom flask import Flask\n\napp = Flask(__name__)\n\n@app.route('/')\ndef hello():\n return 'Hello, World!'\n\nif __name__ == '__main__':\n app.run(debug=True)\n```\nIn this example, the Flask application is set to run in debug mode by passing\n`debug=True` as an argument to the `app.run()` function. This will make the application\nemit potentially sensitive information to the users.\n\n### Remediation\nWhen using `app.run`, omit the `debug` flag or set it to `False` -\n```diff\nif __name__ == '__main__':\n- app.run(debug=True)\n+ app.run()\n```\n"
+ },
+ "shortDescription": {
+ "text": "Flask Running in Debug"
+ }
+ },
+ {
+ "id": "python-open-redirect",
+ "defaultConfiguration": {
+ "parameters": {
+ "properties": {
+ "CWE": "601"
+ }
+ }
+ },
+ "fullDescription": {
+ "text": "\n### Overview\nAn open redirect is a type of vulnerability that occurs when a web application or website\nredirects a user to an arbitrary URL, without properly validating the destination URL.\nThis can allow an attacker to redirect a user to a malicious website via a trusted website,\npotentially tricking the user into providing sensitive information or downloading malware.\n\n### Query operation\nIn this query we look for redirections that are affected by any user input.\n\n### Vulnerable example\nIn the following example, the application has a route `/redirect`\nthat takes a query parameter `url` and performs a redirection to that URL\nusing Flask's redirect() function.\n```python\nfrom flask import Flask, request, redirect\n\napp = Flask(__name__)\n\n@app.route('/')\ndef index():\n return \"\"\"\n Welcome to Example App \n Click here to visit Google.\n \"\"\"\n\n@app.route('/redirect')\ndef redirect_to_external():\n url = request.args.get('url', '/')\n return redirect(url)\n\nif __name__ == '__main__':\n app.run()\n```\nThe vulnerability lies in the fact that the application does not validate or sanitize the\n`url` parameter, allowing an attacker to redirect users to malicious or unintended websites.\nAn attacker could exploit this vulnerability by modifying the `url` parameter to a different\nsite, such as:\n`http://localhost:5000/redirect?url=https://www.malicious.com`\n\n### Remediation\nBefore redirection, check whether the target URL leads to a trusted domain, for example by\nusing a whitelist -\n```python\ndef is_safe_url(url):\n # Whitelist trusted domains\n trusted_domains = ['https://www.google.com', 'https://www.example.com']\n\n # Check if the provided URL is in the trusted domains\n for domain in trusted_domains:\n if url.startswith(domain):\n return True\n\n return False\n```\n\n```diff\n@app.route('/redirect')\ndef redirect_to_external():\n url = request.args.get('url', '/')\n\n # Validate the URL to ensure it's a trusted destination\n+ if is_safe_url(url):\n+ return redirect(url)\n+ else:\n+ abort(400) # Bad Request\n```\n",
+ "markdown": "\n### Overview\nAn open redirect is a type of vulnerability that occurs when a web application or website\nredirects a user to an arbitrary URL, without properly validating the destination URL.\nThis can allow an attacker to redirect a user to a malicious website via a trusted website,\npotentially tricking the user into providing sensitive information or downloading malware.\n\n### Query operation\nIn this query we look for redirections that are affected by any user input.\n\n### Vulnerable example\nIn the following example, the application has a route `/redirect`\nthat takes a query parameter `url` and performs a redirection to that URL\nusing Flask's redirect() function.\n```python\nfrom flask import Flask, request, redirect\n\napp = Flask(__name__)\n\n@app.route('/')\ndef index():\n return \"\"\"\n Welcome to Example App \n Click here to visit Google.\n \"\"\"\n\n@app.route('/redirect')\ndef redirect_to_external():\n url = request.args.get('url', '/')\n return redirect(url)\n\nif __name__ == '__main__':\n app.run()\n```\nThe vulnerability lies in the fact that the application does not validate or sanitize the\n`url` parameter, allowing an attacker to redirect users to malicious or unintended websites.\nAn attacker could exploit this vulnerability by modifying the `url` parameter to a different\nsite, such as:\n`http://localhost:5000/redirect?url=https://www.malicious.com`\n\n### Remediation\nBefore redirection, check whether the target URL leads to a trusted domain, for example by\nusing a whitelist -\n```python\ndef is_safe_url(url):\n # Whitelist trusted domains\n trusted_domains = ['https://www.google.com', 'https://www.example.com']\n\n # Check if the provided URL is in the trusted domains\n for domain in trusted_domains:\n if url.startswith(domain):\n return True\n\n return False\n```\n\n```diff\n@app.route('/redirect')\ndef redirect_to_external():\n url = request.args.get('url', '/')\n\n # Validate the URL to ensure it's a trusted destination\n+ if is_safe_url(url):\n+ return redirect(url)\n+ else:\n+ abort(400) # Bad Request\n```\n"
+ },
+ "shortDescription": {
+ "text": "Open Redirect"
+ }
+ },
+ {
+ "id": "python-parameter-injection",
+ "defaultConfiguration": {
+ "parameters": {
+ "properties": {
+ "CWE": "74"
+ }
+ }
+ },
+ "fullDescription": {
+ "text": "\nRemote Code Execution is a type of vulnerability that allows an attacker\nto execute arbitrary code on a remote computer or device.\nThis can allow the attacker to gain full control of the target system, potentially\nleading to sensitive information being compromised or unauthorized actions being performed.\n\nIn this query we look for user inputs that can flow directly into execution commands\nin Python. There are many types of Command Injection, so in the future we will make\nfine-tuning changes that will need to apply to this query in the future.\n",
+ "markdown": "\nRemote Code Execution is a type of vulnerability that allows an attacker\nto execute arbitrary code on a remote computer or device.\nThis can allow the attacker to gain full control of the target system, potentially\nleading to sensitive information being compromised or unauthorized actions being performed.\n\nIn this query we look for user inputs that can flow directly into execution commands\nin Python. There are many types of Command Injection, so in the future we will make\nfine-tuning changes that will need to apply to this query in the future.\n"
+ },
+ "shortDescription": {
+ "text": "Parameter Injection"
+ }
+ },
+ {
+ "id": "python-path-traversal",
+ "defaultConfiguration": {
+ "parameters": {
+ "properties": {
+ "CWE": "22"
+ }
+ }
+ },
+ "fullDescription": {
+ "text": "\n### Overview\nPath traversal, also known as directory traversal, is a type of vulnerability that allows an\nattacker to access files or directories on a computer or device that are outside of\nthe intended directory.\nAllowing arbitrary read access can allow the attacker to read sensitive files, such as\nconfiguration files or sensitive data, potentially leading data loss\nor even system compromise.\nAllowing arbitrary write access is more severe and in most cases leads to arbitrary code\nexecution, via editing important system files or sensitive data.\n\n### Query operation\nIn this query we look for user input that can flow un-sanitized as a path into file access\nfunctions\n(either read or write access)\n\n### Vulnerable example\n```python\nfrom flask import Flask, request, send_file\napp = Flask(__name__)\n\n@app.route('/files/')\ndef serve_file():\n filename = request.args.get('filename')\n basepath = 'static/files/'\n return send_file(basepath + filename)\n\nif __name__ == '__main__':\n app.run()\n```\nIn this example, the application has a route `/files/` that serves files from a directory\ncalled `static/files`. The vulnerability lies in the fact that the application does not\nproperly validate or sanitize the `filename` parameter, allowing an attacker to traverse\nbeyond the intended directory and access sensitive files on the server.\nAn attacker could exploit this vulnerability by manipulating the `filename` parameter\nand providing a relative path to access files outside of the `static/files` directory.\nFor example, they could craft a URL like this:\n`http://localhost:5000/files/?filename=../../../etc/passwd`\n\n### Remediation\nWhen possible, use inherently safe path functions such as `send_from_directory` that perform\nfilename escaping -\n```diff\n@app.route('/files/')\ndef serve_file():\n filename = request.args.get('filename')\n basepath = 'static/files/'\n- return send_file(basepath + filename)\n+ return send_from_directory(basepath, filename)\n```\nAlternatively, before accessing a potential path, check that the user's `filename` does not\nescape the intended path -\n```python\nfrom pathlib import Path\ndef is_escaping_path(basepath, userpath):\n try:\n Path(basepath).joinpath(userpath).resolve().relative_to(basepath.resolve())\n return False\n except ValueError:\n return True\n```\n\n```diff\n@app.route('/files/')\ndef serve_file():\n filename = request.args.get('filename')\n basepath = 'static/files/'\n+ if is_escaping_path(basepath, filename):\n+ abort(400) # Bad Request\n return send_file(basepath + filename)\n```\nAlternatively - use inherently safe\n",
+ "markdown": "\n### Overview\nPath traversal, also known as directory traversal, is a type of vulnerability that allows an\nattacker to access files or directories on a computer or device that are outside of\nthe intended directory.\nAllowing arbitrary read access can allow the attacker to read sensitive files, such as\nconfiguration files or sensitive data, potentially leading data loss\nor even system compromise.\nAllowing arbitrary write access is more severe and in most cases leads to arbitrary code\nexecution, via editing important system files or sensitive data.\n\n### Query operation\nIn this query we look for user input that can flow un-sanitized as a path into file access\nfunctions\n(either read or write access)\n\n### Vulnerable example\n```python\nfrom flask import Flask, request, send_file\napp = Flask(__name__)\n\n@app.route('/files/')\ndef serve_file():\n filename = request.args.get('filename')\n basepath = 'static/files/'\n return send_file(basepath + filename)\n\nif __name__ == '__main__':\n app.run()\n```\nIn this example, the application has a route `/files/` that serves files from a directory\ncalled `static/files`. The vulnerability lies in the fact that the application does not\nproperly validate or sanitize the `filename` parameter, allowing an attacker to traverse\nbeyond the intended directory and access sensitive files on the server.\nAn attacker could exploit this vulnerability by manipulating the `filename` parameter\nand providing a relative path to access files outside of the `static/files` directory.\nFor example, they could craft a URL like this:\n`http://localhost:5000/files/?filename=../../../etc/passwd`\n\n### Remediation\nWhen possible, use inherently safe path functions such as `send_from_directory` that perform\nfilename escaping -\n```diff\n@app.route('/files/')\ndef serve_file():\n filename = request.args.get('filename')\n basepath = 'static/files/'\n- return send_file(basepath + filename)\n+ return send_from_directory(basepath, filename)\n```\nAlternatively, before accessing a potential path, check that the user's `filename` does not\nescape the intended path -\n```python\nfrom pathlib import Path\ndef is_escaping_path(basepath, userpath):\n try:\n Path(basepath).joinpath(userpath).resolve().relative_to(basepath.resolve())\n return False\n except ValueError:\n return True\n```\n\n```diff\n@app.route('/files/')\ndef serve_file():\n filename = request.args.get('filename')\n basepath = 'static/files/'\n+ if is_escaping_path(basepath, filename):\n+ abort(400) # Bad Request\n return send_file(basepath + filename)\n```\nAlternatively - use inherently safe\n"
+ },
+ "shortDescription": {
+ "text": "Path Traversal"
+ }
+ },
+ {
+ "id": "python-sqli",
+ "defaultConfiguration": {
+ "parameters": {
+ "properties": {
+ "CWE": "89"
+ }
+ }
+ },
+ "fullDescription": {
+ "text": "\n### Overview\nSQL injection is a type of vulnerability that allows an attacker to execute arbitrary SQL\ncommands on a database.\nThis can allow the attacker to gain access to sensitive information,\nsuch as user credentials or sensitive data, or to perform unauthorized actions,\nsuch as deleting or modifying data.\n\n### Query operation\nIn this query we check if a user input can flow un-sanitized into an SQL query.\n\n### Vulnerable example\n```python\nfrom flask import Flask, request\nimport sqlite3\n\napp = Flask(__name__)\n\n@app.route('/login', methods=['POST'])\ndef login():\n username = request.form.get('username')\n password = request.form.get('password')\n\n # Vulnerable SQL query\n query = f\"SELECT * FROM users WHERE username = '{username}' AND password = '{password}'\"\n\n # Execute the query\n conn = sqlite3.connect('database.db')\n cursor = conn.cursor()\n cursor.execute(query)\n user = cursor.fetchone()\n conn.close()\n\n if user:\n return 'Login successful'\n else:\n return 'Login failed'\n\nif __name__ == '__main__':\n app.run()\n```\nIn this example, the application accepts a `username` and `password` from a login form via a\nPOST request. The SQL query is constructed using string concatenation, which makes it\nvulnerable to SQL injection attacks.\n\nAn attacker can exploit this vulnerability by entering `' OR 1=1 --` as the `username`.\nThe resulting query would become -\n```sql\nSELECT * FROM users WHERE username = '' OR 1=1 --' AND password = ''\n```\nwhich will always evaluate to TRUE, leading to an authentication bypass\nsince the attacker has no valid credentials.\n\n### Remediation\nReplace the vulnerable string concatenation with a parameterized query\nusing `?` placeholders -\n```diff\n@app.route('/login', methods=['POST'])\n def login():\n username = request.form.get('username')\n password = request.form.get('password')\n\n # Vulnerable SQL query\n- query = f\"SELECT * FROM users WHERE username = '{username}' AND password = '{password}'\"\n+ query = \"SELECT * FROM users WHERE username = ? AND password = ?\"\n\n # Execute the query\n conn = sqlite3.connect('database.db')\n cursor = conn.cursor()\n- cursor.execute(query)\n+ cursor.execute(query, (username, password))\n user = cursor.fetchone()\n conn.close()\n\n if user:\n return 'Login successful'\n else:\n return 'Login failed'\n```\n",
+ "markdown": "\n### Overview\nSQL injection is a type of vulnerability that allows an attacker to execute arbitrary SQL\ncommands on a database.\nThis can allow the attacker to gain access to sensitive information,\nsuch as user credentials or sensitive data, or to perform unauthorized actions,\nsuch as deleting or modifying data.\n\n### Query operation\nIn this query we check if a user input can flow un-sanitized into an SQL query.\n\n### Vulnerable example\n```python\nfrom flask import Flask, request\nimport sqlite3\n\napp = Flask(__name__)\n\n@app.route('/login', methods=['POST'])\ndef login():\n username = request.form.get('username')\n password = request.form.get('password')\n\n # Vulnerable SQL query\n query = f\"SELECT * FROM users WHERE username = '{username}' AND password = '{password}'\"\n\n # Execute the query\n conn = sqlite3.connect('database.db')\n cursor = conn.cursor()\n cursor.execute(query)\n user = cursor.fetchone()\n conn.close()\n\n if user:\n return 'Login successful'\n else:\n return 'Login failed'\n\nif __name__ == '__main__':\n app.run()\n```\nIn this example, the application accepts a `username` and `password` from a login form via a\nPOST request. The SQL query is constructed using string concatenation, which makes it\nvulnerable to SQL injection attacks.\n\nAn attacker can exploit this vulnerability by entering `' OR 1=1 --` as the `username`.\nThe resulting query would become -\n```sql\nSELECT * FROM users WHERE username = '' OR 1=1 --' AND password = ''\n```\nwhich will always evaluate to TRUE, leading to an authentication bypass\nsince the attacker has no valid credentials.\n\n### Remediation\nReplace the vulnerable string concatenation with a parameterized query\nusing `?` placeholders -\n```diff\n@app.route('/login', methods=['POST'])\n def login():\n username = request.form.get('username')\n password = request.form.get('password')\n\n # Vulnerable SQL query\n- query = f\"SELECT * FROM users WHERE username = '{username}' AND password = '{password}'\"\n+ query = \"SELECT * FROM users WHERE username = ? AND password = ?\"\n\n # Execute the query\n conn = sqlite3.connect('database.db')\n cursor = conn.cursor()\n- cursor.execute(query)\n+ cursor.execute(query, (username, password))\n user = cursor.fetchone()\n conn.close()\n\n if user:\n return 'Login successful'\n else:\n return 'Login failed'\n```\n"
+ },
+ "shortDescription": {
+ "text": "SQL Injection"
+ }
+ },
+ {
+ "id": "python-stack-trace-exposure",
+ "defaultConfiguration": {
+ "parameters": {
+ "properties": {
+ "CWE": "209"
+ }
+ }
+ },
+ "fullDescription": {
+ "text": "\n### Overview\nStack trace exposure is a type of security vulnerability that occurs when a program reveals\nsensitive information, such as the names and locations of internal files and variables,\nin error messages or other diagnostic output. This can happen when a program crashes or\nencounters an error, and the stack trace (a record of the program's call stack at the time\nof the error) is included in the output. Stack trace exposure can provide attackers with\nvaluable information about a program's internal workings and vulnerabilities, making it\neasier for them to exploit those vulnerabilities and gain unauthorized access\nto the system.\n\n### Query operation\nIn this query we look for any stack trace information flowing into the output.\n\n### Vulnerable example\n```python\nimport traceback\n\ndef my_function():\n try:\n # Some code that may raise an exception\n raise ValueError('Something went wrong')\n except ValueError as e:\n traceback.print_tb(e.__traceback__)\n\nmy_function()\n```\nIn this example, the `my_function()` function intentionally raises\na `ValueError` exception.\nThe `traceback.print_tb()` function is then used to print the stack trace\nwhen the exception is caught. The vulnerability lies in using `traceback.print_tb()`\nto output the stack trace directly to the console or any other output stream.\nIf this code were part of a web application or exposed through an API,\nthe stack trace would be exposed in the server logs or potentially returned\nas part of an error response to the client.\n\n### Remediation\nLog the exception to a logging framework or file, instead of outputting directly to the\nconsole-\n\n```python\ndef log_exception(exception):\n logging.exception('An exception occurred', exc_info=exception)\n```\n\n```diff\ndef my_function():\n try:\n # Some code that may raise an exception\n raise ValueError('Something went wrong')\n except ValueError as e:\n- traceback.print_tb(e.__traceback__)\n+ log_exception(e)\n```\n",
+ "markdown": "\n### Overview\nStack trace exposure is a type of security vulnerability that occurs when a program reveals\nsensitive information, such as the names and locations of internal files and variables,\nin error messages or other diagnostic output. This can happen when a program crashes or\nencounters an error, and the stack trace (a record of the program's call stack at the time\nof the error) is included in the output. Stack trace exposure can provide attackers with\nvaluable information about a program's internal workings and vulnerabilities, making it\neasier for them to exploit those vulnerabilities and gain unauthorized access\nto the system.\n\n### Query operation\nIn this query we look for any stack trace information flowing into the output.\n\n### Vulnerable example\n```python\nimport traceback\n\ndef my_function():\n try:\n # Some code that may raise an exception\n raise ValueError('Something went wrong')\n except ValueError as e:\n traceback.print_tb(e.__traceback__)\n\nmy_function()\n```\nIn this example, the `my_function()` function intentionally raises\na `ValueError` exception.\nThe `traceback.print_tb()` function is then used to print the stack trace\nwhen the exception is caught. The vulnerability lies in using `traceback.print_tb()`\nto output the stack trace directly to the console or any other output stream.\nIf this code were part of a web application or exposed through an API,\nthe stack trace would be exposed in the server logs or potentially returned\nas part of an error response to the client.\n\n### Remediation\nLog the exception to a logging framework or file, instead of outputting directly to the\nconsole-\n\n```python\ndef log_exception(exception):\n logging.exception('An exception occurred', exc_info=exception)\n```\n\n```diff\ndef my_function():\n try:\n # Some code that may raise an exception\n raise ValueError('Something went wrong')\n except ValueError as e:\n- traceback.print_tb(e.__traceback__)\n+ log_exception(e)\n```\n"
+ },
+ "shortDescription": {
+ "text": "Stack Trace Exposure"
+ }
+ },
+ {
+ "id": "python-unsafe-deserialization",
+ "defaultConfiguration": {
+ "parameters": {
+ "properties": {
+ "CWE": "502"
+ }
+ }
+ },
+ "fullDescription": {
+ "text": "\n### Overview\nUnsafe deserialization is a security vulnerability that occurs when a program deserializes\nuntrusted data with a potentially dangerous deserializer.\nDeserialization is the process of converting serialized data (data that\nhas been converted into a format that can be easily transmitted or stored) back into its\noriginal form. In some (\"unsafe\") serialization protocols, if an attacker is able to\nmanipulate the serialized data, they may be able to execute arbitrary code or perform other\nmalicious actions when the data is deserialized.\n\n### Query operation\nIn this query we look for user input that can flow un-sanitized to potentially unsafe\ndeserialization methods\n\n### Vulnerable example\n```python\nimport yaml\nfrom flask import Flask, request\n\napp = Flask(__name__)\n\n@app.route('/process', methods=['POST'])\ndef process():\n data = request.get_data()\n\n # Vulnerable deserialization\n obj = yaml.load(data)\n\n # Process the deserialized object (for simplicity, we're just printing it)\n print(obj)\n\n return 'Data processed'\n\nif __name__ == '__main__':\n app.run()\n```\nIn this example, the application exposes a `/process` endpoint that accepts data via a POST\nrequest. The vulnerable code uses the `yaml.load()` function\nto deserialize the received data.\nThe vulnerability lies in the fact that the `yaml` module can execute arbitrary code\nduring the deserialization process.\nAn attacker can exploit this by crafting a malicious payload\nthat executes arbitrary code when the `yaml.load()` function is called.\n\n### Remediation\nUse deserialization routines that are known to handle untrusted data securely, such as\n`yaml.safe_load`. It is highly recommended to use the `json` module for serialization, as it\ndeserializes untrusted data securely.\n\n```diff\n@app.route('/process', methods=['POST'])\ndef process():\n data = request.get_data()\n\n # Safe deserialization\n- obj = yaml.load(data)\n+ obj = yaml.safe_load(data)\n\n # Process the deserialized object (for simplicity, we're just printing it)\n print(obj)\n\n return 'Data processed'\n```\n",
+ "markdown": "\n### Overview\nUnsafe deserialization is a security vulnerability that occurs when a program deserializes\nuntrusted data with a potentially dangerous deserializer.\nDeserialization is the process of converting serialized data (data that\nhas been converted into a format that can be easily transmitted or stored) back into its\noriginal form. In some (\"unsafe\") serialization protocols, if an attacker is able to\nmanipulate the serialized data, they may be able to execute arbitrary code or perform other\nmalicious actions when the data is deserialized.\n\n### Query operation\nIn this query we look for user input that can flow un-sanitized to potentially unsafe\ndeserialization methods\n\n### Vulnerable example\n```python\nimport yaml\nfrom flask import Flask, request\n\napp = Flask(__name__)\n\n@app.route('/process', methods=['POST'])\ndef process():\n data = request.get_data()\n\n # Vulnerable deserialization\n obj = yaml.load(data)\n\n # Process the deserialized object (for simplicity, we're just printing it)\n print(obj)\n\n return 'Data processed'\n\nif __name__ == '__main__':\n app.run()\n```\nIn this example, the application exposes a `/process` endpoint that accepts data via a POST\nrequest. The vulnerable code uses the `yaml.load()` function\nto deserialize the received data.\nThe vulnerability lies in the fact that the `yaml` module can execute arbitrary code\nduring the deserialization process.\nAn attacker can exploit this by crafting a malicious payload\nthat executes arbitrary code when the `yaml.load()` function is called.\n\n### Remediation\nUse deserialization routines that are known to handle untrusted data securely, such as\n`yaml.safe_load`. It is highly recommended to use the `json` module for serialization, as it\ndeserializes untrusted data securely.\n\n```diff\n@app.route('/process', methods=['POST'])\ndef process():\n data = request.get_data()\n\n # Safe deserialization\n- obj = yaml.load(data)\n+ obj = yaml.safe_load(data)\n\n # Process the deserialized object (for simplicity, we're just printing it)\n print(obj)\n\n return 'Data processed'\n```\n"
+ },
+ "shortDescription": {
+ "text": "Unsafe Deserialization"
+ }
+ },
+ {
+ "id": "python-xss",
+ "defaultConfiguration": {
+ "parameters": {
+ "properties": {
+ "CWE": "79"
+ }
+ }
+ },
+ "fullDescription": {
+ "text": "\n### Overview\nXSS, or Cross-Site Scripting, is a type of vulnerability that allows an attacker to\ninject malicious code into a website or web application.\nThis can allow the attacker to steal sensitive information from users, such as their\ncookies or login credentials, or to perform unauthorized actions on their behalf.\n\n### Query operation\nIn the query we look for any user input that flows into\na potential output of the application.\n\n### Vulnerable example\nIn the following example, the Flask application takes a user-supplied parameter (`name`)\nfrom the query string and renders it directly into an HTML template using the\n`render_template_string` function. The issue is that\nthe user input is not properly sanitized or escaped, making it vulnerable to XSS attacks.\n```python\nfrom flask import Flask, request, render_template_string\n\napp = Flask(__name__)\n\n@app.route('/')\ndef index():\n name = request.args.get('name', 'Guest')\n message = f'Hello, {name}!'\n return render_template_string('{} '.format(message))\n\nif __name__ == '__main__':\n app.run()\n```\nAn attacker can exploit this vulnerability by injecting malicious JavaScript code into the\n`name` parameter. For instance, they could modify the URL to include the following payload:\n`http://localhost:5000/?name=`\n\n### Remediation\nWhen rendering templates, use parametrized variable assignments (which are automatically\nescaped) instead of direct string manipulation -\n```diff\n@app.route('/')\ndef index():\n name = request.args.get('name', 'Guest')\n message = f'Hello, {name}!'\n- return render_template_string('{} '.format(message))\n+ return render_template_string('{{ message }} ', message=message)\n```\n",
+ "markdown": "\n### Overview\nXSS, or Cross-Site Scripting, is a type of vulnerability that allows an attacker to\ninject malicious code into a website or web application.\nThis can allow the attacker to steal sensitive information from users, such as their\ncookies or login credentials, or to perform unauthorized actions on their behalf.\n\n### Query operation\nIn the query we look for any user input that flows into\na potential output of the application.\n\n### Vulnerable example\nIn the following example, the Flask application takes a user-supplied parameter (`name`)\nfrom the query string and renders it directly into an HTML template using the\n`render_template_string` function. The issue is that\nthe user input is not properly sanitized or escaped, making it vulnerable to XSS attacks.\n```python\nfrom flask import Flask, request, render_template_string\n\napp = Flask(__name__)\n\n@app.route('/')\ndef index():\n name = request.args.get('name', 'Guest')\n message = f'Hello, {name}!'\n return render_template_string('{} '.format(message))\n\nif __name__ == '__main__':\n app.run()\n```\nAn attacker can exploit this vulnerability by injecting malicious JavaScript code into the\n`name` parameter. For instance, they could modify the URL to include the following payload:\n`http://localhost:5000/?name=`\n\n### Remediation\nWhen rendering templates, use parametrized variable assignments (which are automatically\nescaped) instead of direct string manipulation -\n```diff\n@app.route('/')\ndef index():\n name = request.args.get('name', 'Guest')\n message = f'Hello, {name}!'\n- return render_template_string('{} '.format(message))\n+ return render_template_string('{{ message }} ', message=message)\n```\n"
+ },
+ "shortDescription": {
+ "text": "XSS Vulnerability"
+ }
+ }
+ ]
+ }
+ },
+ "invocations": [
+ {
+ "executionSuccessful": true,
+ "arguments": [
+ "/Users/assafa/.jfrog/dependencies/analyzerManager/zd_scanner/scanner",
+ "scan",
+ "/var/folders/xv/th4cksxn7jv9wjrdnn1h4tj00000gq/T/jfrog.cli.temp.-1693477603-3697552683/results.sarif"
+ ],
+ "workingDirectory": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat"
+ }
+ }
+ ],
+ "results": [
+ {
+ "message": {
+ "text": "SQL Injection"
+ },
+ "codeFlows": [
+ {
+ "threadFlows": [
+ {
+ "locations": [
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 28,
+ "endLine": 9,
+ "snippet": {
+ "text": "request.form"
+ },
+ "startColumn": 16,
+ "startLine": 9
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 32,
+ "endLine": 9,
+ "snippet": {
+ "text": "request.form.get"
+ },
+ "startColumn": 16,
+ "startLine": 9
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 44,
+ "endLine": 9,
+ "snippet": {
+ "text": "request.form.get(\"username\")"
+ },
+ "startColumn": 16,
+ "startLine": 9
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 13,
+ "endLine": 9,
+ "snippet": {
+ "text": "username"
+ },
+ "startColumn": 5,
+ "startLine": 9
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 31,
+ "endLine": 20,
+ "snippet": {
+ "text": "(username, password)"
+ },
+ "startColumn": 11,
+ "startLine": 20
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 31,
+ "endLine": 20,
+ "snippet": {
+ "text": "\"SELECT id, username, access_level FROM user WHERE username = '%s' AND password = '%s'\"\n % (username, password)"
+ },
+ "startColumn": 9,
+ "startLine": 19
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 10,
+ "endLine": 18,
+ "snippet": {
+ "text": "query"
+ },
+ "startColumn": 5,
+ "startLine": 18
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 39,
+ "endLine": 22,
+ "snippet": {
+ "text": "query_db(query, [], True)"
+ },
+ "startColumn": 14,
+ "startLine": 22
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.__init__.query_db"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/__init__.py"
+ },
+ "region": {
+ "endColumn": 19,
+ "endLine": 10,
+ "snippet": {
+ "text": "query"
+ },
+ "startColumn": 14,
+ "startLine": 10
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.__init__.query_db"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/__init__.py"
+ },
+ "region": {
+ "endColumn": 49,
+ "endLine": 14,
+ "snippet": {
+ "text": "conn.cursor().execute(query, args)"
+ },
+ "startColumn": 15,
+ "startLine": 14
+ }
+ }
+ }
+ }
+ ]
+ }
+ ]
+ }
+ ],
+ "level": "error",
+ "locations": [
+ {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.__init__.query_db"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/__init__.py"
+ },
+ "region": {
+ "endColumn": 49,
+ "endLine": 14,
+ "snippet": {
+ "text": "conn.cursor().execute(query, args)"
+ },
+ "startColumn": 15,
+ "startLine": 14
+ }
+ }
+ }
+ ],
+ "ruleId": "python-sqli"
+ },
+ {
+ "message": {
+ "text": "SQL Injection"
+ },
+ "codeFlows": [
+ {
+ "threadFlows": [
+ {
+ "locations": [
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 28,
+ "endLine": 10,
+ "snippet": {
+ "text": "request.form"
+ },
+ "startColumn": 16,
+ "startLine": 10
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 32,
+ "endLine": 10,
+ "snippet": {
+ "text": "request.form.get"
+ },
+ "startColumn": 16,
+ "startLine": 10
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 44,
+ "endLine": 10,
+ "snippet": {
+ "text": "request.form.get(\"password\")"
+ },
+ "startColumn": 16,
+ "startLine": 10
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 13,
+ "endLine": 10,
+ "snippet": {
+ "text": "password"
+ },
+ "startColumn": 5,
+ "startLine": 10
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 31,
+ "endLine": 20,
+ "snippet": {
+ "text": "(username, password)"
+ },
+ "startColumn": 11,
+ "startLine": 20
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 31,
+ "endLine": 20,
+ "snippet": {
+ "text": "\"SELECT id, username, access_level FROM user WHERE username = '%s' AND password = '%s'\"\n % (username, password)"
+ },
+ "startColumn": 9,
+ "startLine": 19
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 10,
+ "endLine": 18,
+ "snippet": {
+ "text": "query"
+ },
+ "startColumn": 5,
+ "startLine": 18
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 39,
+ "endLine": 22,
+ "snippet": {
+ "text": "query_db(query, [], True)"
+ },
+ "startColumn": 14,
+ "startLine": 22
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.__init__.query_db"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/__init__.py"
+ },
+ "region": {
+ "endColumn": 19,
+ "endLine": 10,
+ "snippet": {
+ "text": "query"
+ },
+ "startColumn": 14,
+ "startLine": 10
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.__init__.query_db"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/__init__.py"
+ },
+ "region": {
+ "endColumn": 49,
+ "endLine": 14,
+ "snippet": {
+ "text": "conn.cursor().execute(query, args)"
+ },
+ "startColumn": 15,
+ "startLine": 14
+ }
+ }
+ }
+ }
+ ]
+ }
+ ]
+ }
+ ],
+ "level": "error",
+ "locations": [
+ {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.__init__.query_db"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/__init__.py"
+ },
+ "region": {
+ "endColumn": 49,
+ "endLine": 14,
+ "snippet": {
+ "text": "conn.cursor().execute(query, args)"
+ },
+ "startColumn": 15,
+ "startLine": 14
+ }
+ }
+ }
+ ],
+ "ruleId": "python-sqli"
+ },
+ {
+ "message": {
+ "text": "Open Redirect"
+ },
+ "codeFlows": [
+ {
+ "threadFlows": [
+ {
+ "locations": [
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login_and_redirect"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 23,
+ "endLine": 33,
+ "snippet": {
+ "text": "request.args"
+ },
+ "startColumn": 11,
+ "startLine": 33
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login_and_redirect"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 27,
+ "endLine": 33,
+ "snippet": {
+ "text": "request.args.get"
+ },
+ "startColumn": 11,
+ "startLine": 33
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login_and_redirect"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 34,
+ "endLine": 33,
+ "snippet": {
+ "text": "request.args.get(\"url\")"
+ },
+ "startColumn": 11,
+ "startLine": 33
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login_and_redirect"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 8,
+ "endLine": 33,
+ "snippet": {
+ "text": "url"
+ },
+ "startColumn": 5,
+ "startLine": 33
+ }
+ }
+ }
+ },
+ {
+ "location": {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login_and_redirect"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 29,
+ "endLine": 46,
+ "snippet": {
+ "text": "redirect(url)"
+ },
+ "startColumn": 16,
+ "startLine": 46
+ }
+ }
+ }
+ }
+ ]
+ }
+ ]
+ }
+ ],
+ "level": "note",
+ "locations": [
+ {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "flask_webgoat.auth.login_and_redirect"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/flask_webgoat/auth.py"
+ },
+ "region": {
+ "endColumn": 29,
+ "endLine": 46,
+ "snippet": {
+ "text": "redirect(url)"
+ },
+ "startColumn": 16,
+ "startLine": 46
+ }
+ }
+ }
+ ],
+ "ruleId": "python-open-redirect"
+ },
+ {
+ "message": {
+ "text": "Flask Running in Debug"
+ },
+ "locations": [
+ {
+ "logicalLocations": [
+ {
+ "fullyQualifiedName": "run"
+ }
+ ],
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file:///Users/assafa/Documents/code/flask-webgoat/run.py"
+ },
+ "region": {
+ "endColumn": 24,
+ "endLine": 15,
+ "snippet": {
+ "text": "app.run(debug=True)"
+ },
+ "startColumn": 5,
+ "startLine": 15
+ }
+ }
+ }
+ ],
+ "ruleId": "python-flask-debug"
+ }
+ ]
+ }
+ ],
+ "version": "2.1.0",
+ "$schema": "https://docs.oasis-open.org/sarif/sarif/v2.1.0/cos02/schemas/sarif-schema-2.1.0.json"
+}
\ No newline at end of file
diff --git a/tests/testdata/other/sast-scan/no-violations.sarif b/tests/testdata/other/sast-scan/no-violations.sarif
new file mode 100644
index 00000000..ed129e6e
--- /dev/null
+++ b/tests/testdata/other/sast-scan/no-violations.sarif
@@ -0,0 +1,28 @@
+{
+ "runs": [
+ {
+ "tool": {
+ "driver": {
+ "name": "USAF",
+ "rules": []
+ }
+ },
+ "invocations": [
+ {
+ "executionSuccessful": true,
+ "arguments": [
+ "/Users/assafa/.jfrog/dependencies/analyzerManager/zd_scanner/scanner",
+ "scan",
+ "/var/folders/xv/th4cksxn7jv9wjrdnn1h4tj00000gq/T/jfrog.cli.temp.-1693477603-3697552683/results.sarif"
+ ],
+ "workingDirectory": {
+ "uri": "file:///Users/assafa/Documents/code/terraform"
+ }
+ }
+ ],
+ "results": []
+ }
+ ],
+ "version": "2.1.0",
+ "$schema": "https://docs.oasis-open.org/sarif/sarif/v2.1.0/cos02/schemas/sarif-schema-2.1.0.json"
+}
\ No newline at end of file
diff --git a/tests/testdata/other/secrets-scan/contain-secrets.sarif b/tests/testdata/other/secrets-scan/contain-secrets.sarif
new file mode 100644
index 00000000..af3f378e
--- /dev/null
+++ b/tests/testdata/other/secrets-scan/contain-secrets.sarif
@@ -0,0 +1,234 @@
+{
+ "runs": [
+ {
+ "tool": {
+ "driver": {
+ "name": "JFrog Terraform scanner",
+ "rules": [
+ {
+ "id": "entropy",
+ "shortDescription": {
+ "text": "Scanner for entropy"
+ }
+ }
+ ],
+ "version": ""
+ }
+ },
+ "invocations": [
+ {
+ "executionSuccessful": true,
+ "arguments": [
+ "./secrets_scanner",
+ "scan",
+ "sec_config_example.yaml"
+ ],
+ "workingDirectory": {
+ "uri": "secrets_scanner"
+ }
+ }
+ ],
+ "results": [
+ {
+ "message": {
+ "text": "Hardcoded secrets were found in source files"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file://secrets_scanner/tests/req.nodejs.hardcoded-secrets/applicable_base64.js"
+ },
+ "region": {
+ "endColumn": 118,
+ "endLine": 1,
+ "snippet": {
+ "text": "2VTHzn1mKZ/n9apD5P6nxsajSQh8QhmyyKvUIRoZWAHCB8lSbBm3YWx5nOdZ1zPEOaA0zIZy1eFgHgfB2HkfAdVrbQj19kagXDVe"
+ },
+ "startColumn": 18,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "entropy"
+ },
+ {
+ "message": {
+ "text": "Hardcoded secrets were found in source files"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file://secrets_scanner/tests/req.nodejs.hardcoded-secrets/applicable_base64.js.approval.json"
+ },
+ "region": {
+ "endColumn": 195,
+ "endLine": 1,
+ "snippet": {
+ "text": "2VTHzn1mKZ/n9apD5P6nxsajSQh8QhmyyKvUIRoZWAHCB8lSbBm3YWx5nOdZ1zPEOaA0zIZy1eFgHgfB2HkfAdVrbQj19kagXDVe"
+ },
+ "startColumn": 95,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "entropy"
+ },
+ {
+ "message": {
+ "text": "Hardcoded secrets were found in source files"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file://secrets_scanner/tests/req.nodejs.hardcoded-secrets/applicable_hex.js"
+ },
+ "region": {
+ "endColumn": 138,
+ "endLine": 1,
+ "snippet": {
+ "text": "0159392e31dc912156e1cc6eab32a3d7df7154aecdf2ffe7d66f10da0d5706f7d9ba3183a366389112819b728b20026d04a4f6304da649beefc7fe49"
+ },
+ "startColumn": 18,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "entropy"
+ },
+ {
+ "message": {
+ "text": "Hardcoded secrets were found in source files"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file://secrets_scanner/tests/req.nodejs.hardcoded-secrets/applicable_hex.js.approval.json"
+ },
+ "region": {
+ "endColumn": 215,
+ "endLine": 1,
+ "snippet": {
+ "text": "0159392e31dc912156e1cc6eab32a3d7df7154aecdf2ffe7d66f10da0d5706f7d9ba3183a366389112819b728b20026d04a4f6304da649beefc7fe49"
+ },
+ "startColumn": 95,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "entropy"
+ },
+ {
+ "message": {
+ "text": "Hardcoded secrets were found in source files"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file://secrets_scanner/tests/req.python.hardcoded-secrets/applicable_base64.py"
+ },
+ "region": {
+ "endColumn": 112,
+ "endLine": 1,
+ "snippet": {
+ "text": "2VTHzn1mKZ/n9apD5P6nxsajSQh8QhmyyKvUIRoZWAHCB8lSbBm3YWx5nOdZ1zPEOaA0zIZy1eFgHgfB2HkfAdVrbQj19kagXDVe"
+ },
+ "startColumn": 12,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "entropy"
+ },
+ {
+ "message": {
+ "text": "Hardcoded secrets were found in source files"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file://secrets_scanner/tests/req.python.hardcoded-secrets/applicable_base64.py.approval.json"
+ },
+ "region": {
+ "endColumn": 191,
+ "endLine": 1,
+ "snippet": {
+ "text": "2VTHzn1mKZ/n9apD5P6nxsajSQh8QhmyyKvUIRoZWAHCB8lSbBm3YWx5nOdZ1zPEOaA0zIZy1eFgHgfB2HkfAdVrbQj19kagXDVe"
+ },
+ "startColumn": 91,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "entropy"
+ },
+ {
+ "message": {
+ "text": "Hardcoded secrets were found in source files"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file://secrets_scanner/tests/req.python.hardcoded-secrets/applicable_hex.py"
+ },
+ "region": {
+ "endColumn": 132,
+ "endLine": 1,
+ "snippet": {
+ "text": "0159392e31dc912156e1cc6eab32a3d7df7154aecdf2ffe7d66f10da0d5706f7d9ba3183a366389112819b728b20026d04a4f6304da649beefc7fe49"
+ },
+ "startColumn": 12,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "entropy"
+ },
+ {
+ "message": {
+ "text": "Hardcoded secrets were found in source files"
+ },
+ "locations": [
+ {
+ "physicalLocation": {
+ "artifactLocation": {
+ "uri": "file://secrets_scanner/tests/req.python.hardcoded-secrets/applicable_hex.py.approval.json"
+ },
+ "region": {
+ "endColumn": 211,
+ "endLine": 1,
+ "snippet": {
+ "text": "0159392e31dc912156e1cc6eab32a3d7df7154aecdf2ffe7d66f10da0d5706f7d9ba3183a366389112819b728b20026d04a4f6304da649beefc7fe49"
+ },
+ "startColumn": 91,
+ "startLine": 1
+ }
+ }
+ }
+ ],
+ "ruleId": "entropy",
+ "suppressions": [
+ {
+ "kind": "inSource"
+ }
+ ]
+ }
+ ]
+ }
+ ],
+ "version": "2.1.0",
+ "$schema": "https://docs.oasis-open.org/sarif/sarif/v2.1.0/cos02/schemas/sarif-schema-2.1.0.json"
+}
\ No newline at end of file
diff --git a/tests/testdata/other/secrets-scan/no-secrets.sarif b/tests/testdata/other/secrets-scan/no-secrets.sarif
new file mode 100644
index 00000000..8b0ae50d
--- /dev/null
+++ b/tests/testdata/other/secrets-scan/no-secrets.sarif
@@ -0,0 +1,29 @@
+{
+ "runs": [
+ {
+ "tool": {
+ "driver": {
+ "name": "JFrog Terraform scanner",
+ "rules": [],
+ "version": ""
+ }
+ },
+ "invocations": [
+ {
+ "executionSuccessful": true,
+ "arguments": [
+ "mac_arm/secrets_scanner/secrets_scanner",
+ "scan",
+ "sec_config_example.yaml"
+ ],
+ "workingDirectory": {
+ "uri": "file:///am_versions_for_leap"
+ }
+ }
+ ],
+ "results": []
+ }
+ ],
+ "version": "2.1.0",
+ "$schema": "https://docs.oasis-open.org/sarif/sarif/v2.1.0/cos02/schemas/sarif-schema-2.1.0.json"
+}
\ No newline at end of file
diff --git a/tests/testdata/projects/binaries/build-info-extractor-maven3-2.20.0-uber.jar b/tests/testdata/projects/binaries/build-info-extractor-maven3-2.20.0-uber.jar
new file mode 100644
index 0000000000000000000000000000000000000000..597b58be3fc5857a323cbf9f0a8934dec1d3a9b0
GIT binary patch
literal 8974067
zcmbTeb986HvNjysPA0Z(+rQYhZQHhO+qUgYoJ?$ElF2*woO|!O_g&vQ>%8AytN(kd
zx_0mG>Zhx!w}K2P7z_{)6cCVuo0cNb|9GH*z<>Z^DnfKpa^ehsrhtGH{u2rXWd9e+
zc&k>S`xkEcx1s%Q{|yBQ$w`TeDXY)}#2*1O)3P#j^o#H^bkuV*^UX?(D=hoRPBS7X
zjb&YWl^R2d~@mt1Ng9NH(iLwI<2*JL;V8tZ2?O0u&Odd_1s
z%tJJaOaQsXgVVhe@PC^F^bd3Xhv7g-|I-`?8&eNgXNLbW9tgo^!p#0{)sbkwX`w$7g}-tKhUNgE>4EVF7{6ULXPlX%GnyancDpe
zX^MYC+R4<=#Pna8K=p4aakjT{`b~R>Axr7e|~Xeds_#4J5xIsXL=7?n`>Pe=OcEwzT0{OXK6Dd)8^~S9ka1ZQss8f
z*xc(md`sx2&Qj6T664=XdwK~;`AI1Kr5jx%?D|0bAcR0#bEo-Mt%xC?5;(*-Qft?i
zPCq$G^7-}jk_7Ba&yskqxy(ZFT*q~w&^<~Bya*$2fGZ_dM$=19b3Wmfn2?j?=Omo0
zAaY1`Da`vf*~E+FLAZ$v7fR}ZFb$*LVn726VH3$|N$a49=Pb@Zuo-unLy}BsS$=sp
zMVwhg9Z}6i+$9U}Bq{J2_LRFmU5<_#FgjG2ZXh>#4;}?x9M)}icv|eOB%DvC)DuCL
zMuhnPmMxmdMJ_~kI$lC}m^u%DlCn?-d9gI?z?Ur3+~A{-e)AF}+i>MAYca{~%L}%h
zA(`Um0SYW8;;L+yYW5ZGTiKd@>C`>eeteG~ub=;=-Md;Pz{_F(^H-x7PCVlJ?CHr7
zs9zV^bNKDA`jg>jE?_Wt=t0ILaR`$XY(b)3=_QB^;~`8YWonT0_ZhMCA-jCHK$lV2
z38YYBP=2jTZrz`Fpa9L&BEu|uCrPiOT;fVROc2P*ey2=Ih4Vej+*-<}iNT^9F
zlkpu$#dYKopP8&(D>LPF>3*G!9n&f?+F1so;C7O>9GSv(Ljw4!&P_6OF;k9hg*}Jn
zT4vj11`l9VN8b;MnK*O#kMcl8=;+`n3K)>MnKge(DHABG`fC*O>u-QBvs&?mwea0p
z=3jcKY_kfiToHp!l!5zJ$I#LC4j9>h4SxEWx|VyNI=X~H^8HjtQ2KEQi@8HzmF^r~
zzxh@ZxNH(NAK=}Z0O2r-3c>=g6r$HlH9_Y}|B2BJ?Hlag=Vc@1lS$l4j^wP{Jm#Hf
zC}O{*dKhbLK)fqq>YzaA!kxmi-dONj_Bi>nvz%~pCdK5|GO(6cW8uDQ0nbXIY{pvE
zo|#k|H4dRy%87Z;y;8
z9rf0I9w`=)I+I{37TH9BNgpTX4liYPicUH5OXY{X92gdJI6Wxa
z%XIaP!WxX2WCSGO47Pj~kqx#W^48P*5
z8mKW&b^IiT)}UW$1`RsaHkfN!M_C9Kg@sI+9U25{ChMs)*BDjrq4{yLleLw-#rq!Q
zH0JyG#`Yh;*v4cl;9DM0!fs!>z5;o-DGD1=kX)F39y#57?ps)F+RU+BY
zyYqxhzvcP~#>y8p^;gVeYE5{xFTV
zsM39J5dW-7(_ei^7dq$_f89#xtvXh3{hV#;4zhkHlYM&7q_He>zia*c+h3Ck!{x)t
zzH{#dNO9Ld1wCUN-!Rt=Yz&4==Fr)0NE2S6x5-5`6bBzs7CF}}wR8OV4%zTks+d_y
z3+IE)P*Csjzy_CJS^UTW1UBPb>e6Esrh#b|1j0zvQYFUj_FU=dB
zkce{06#Oi+zTgfDt5B2*biuok)+_^l%fN^L4|qXxnQC0VzZTqz47T?%WUuNS`mL{O
z&lWpUCiWH6DAs#kAq?|c=l3QGyNfg*p!FByHPc~6AQhk$SK?O3O#PNJy12Gzn~7jM
zar$5~;-D!ajQbHw@7@L=#=9A!C!nOx`HpN~HCVlw@7Dc5r5;c{->wGvVkp|tWDKCIIeLDKV&+1f5bbQ}TeG9qTNw&dq!3qM_`VEGf-@+Wiot`n|e`_%}9hAtx70vw!dzPX#A?D^p_^;r}2-rcU(6Hipj5
zrD~T-xMFC3a4;ak0+H$l4MMOIWeiXND_RQ{K{JMa6Y?13|a|H4JJjP%@An-7wSiLUXBK%Lz$Yav$3o|
zS5SQ>)_F_josUcM4UfO^oQb6A^~o
zN8z5{?mSPAZtRJ6a~QuJCvrCsuR-=u^Y{qHoU~!1XbSsja^51BP!Cu|lGzP42xhMy
z`0OYeU9Mwd4$tXgr$KM7BA68vn%d-xBpLdt{iMM(V{$_cETBJ!l+R(Jnw#BP+0f)r
z#GT^ZhTE`W8bU@KzL(~(KjbJXv2ZD@&SFk8mztk!(SJVSZno8rZofN5LYksFJMJM1
z(fTql3#MA?CVR$@OG#P_oesnHz5C7U^nF6)6{xaWf%YW&$&7?>XT(}W4Ie>tG
z>uI=SW5>QRlc$lpw~DLy?8~m$z~mF#$qA%`-Pw5Rh4zoq*ONXIBM__2@C{nTPQXO>hlhgQnxIL=Zji8vUm6w%Tpa(Y=wW@%AHt}r4wCRcYt2}c`nOy;IC>d;H&yFc%7h?1tQw&$a@dY%2H=;00bLbgRFBpw8#b7!nb)x0a(b;8E(7RoW%?+Q{!(g{~-t*L4B
z46Zmw@_d<9T_3k=HMZ1usU$AF)52Gz+eQ4smns>?L(%-eTdkyin=0VY1g0wJ6l(Zk
za1s8&7F3BZ@yc;FS(doC!Ri+fwr9B%knLi>HslAlv_nm`57;hBti%u}oO37GY3kZaP-y%Jv>vx0HT?YGAh_o*akiVoDLN1ArLT%&Vs(V<#9bJCLlc-9
zj9*r8bHLBV(CHUn_xgMVjK8!Y(u`>FnJ^ZQ%R6)gwfqhGzkS%f*i)Jy3J{PPGY}BZ
zzv09F;ju(b9c=79ZT}u7Bu#A`{>y)jYC`*{Jm-FYIrUEd3MI)w1%?!znDi((3=x4+
zXAMaJL%Q?I^dtpkc^u>7tWj@mTl=Y|jUL=yv=PN2Cs?&1^7E&0y<2znR^8Uk=8jvt
zD`LB=X@CEBwr8d+Iha*LzRyYT_x=I*#iif^JKSFH
zZ53eww?+Nw-`7rF;eW;e0;TzrPxi<6cHm!lU)y5+3CI2JivknB33i_3pXNcn80
zL|0RLfXKVtlQv8x>+*^bmUV3$9a?ep4x0x^<((X!8zh-q_>@KW(&wEYYBo8fjWz4~
zw2e7$w@7nlOtYcuOuM4%%)>ueta#*&@pH6B(y!lf;pEfYK5nT`Ao9>MOk
z1TvtAV@{kI0pD3~|0<(%a)D(?8%NRX6BU;+ZIKn91?krjr!i?!60e@}WNV)|Q}WbN
z8?5aS$43p-$!6?VJ95LUcXkZV=INAKs7)V-z_fR{Kenqn0pQrLdB7@bZd{x*9mzIp
z*LuV=7=X{}k)+;@4xP~Vm~fz7-?vg%kdw;2GcFpsyTCpl$tW5*I`hmY_*Nm;^zn5
zXLct!k|eYo_C+)f$9$xr^FrsC!ip>^I58})hCU+A^79)P8XE!#qYMW@1GNDm$Vk^khn1vZI(Wf^ta8+^bz$E;+=kbudrs@t&pv{}=0B<~9>wqU{C;AAcV36^K-aB_!)x|o_T
zLD@hmlC_Mf7tmISx|EKo7wTE~uqE**dzx2JSdkXmm*syq9-g*j76|IZMVEFB7mC)S
za>^bQI{C96gsm4jF!HA~yu0M2O=6j1u%C`<&=Fe-2RayfT|O!OdGpZ|t$`u?8J^K@
z+(M=@C0jtwCE7)M&ro*Be*-o3d?hufH2rhi>Zna=QiIykS%Yl_UlK!3o>0)jav_N*
za{SP2CL((pwiB=nO*ZS?M+Dt(Qf)M5Yl75Ppw|hxooxAT;Q^jZBNJdLF67sYBGrwq
z7+PG=Vhs74wqQCoaL19WyS6aICS8nK)j~_C$t@Nn$7~pQ@?#kVL-zC+69?KoA>Txz
zCxt61$?ze0Z2jVf0-qPhG9<+#Wv{i(E3CeK($Q&MreF&~Qi(*ftS=f1NA
z?&iEHA$d*~-pOg|No9;*QhnerKLiur5IM%n>-7UlV4Jn#XS9ca1$!tN_kLWx8LP$WUF~4%k7_e
zjkviHpJyj-^O1NV=AsP6ItbZJXzpV{L-0mc+O=bGo`4IY{^{meik9pzs-qln`^C=orS)Fp)KIy_mPdEwp
z9>YC1m*t4}Ov!ajtdq@dTL?P3^D7Z|7@hUSqc(cPO9wb6AgrWdo_eCFlg4aR75|l9
z7)qD5NH%v2!Ero?ufmqTFZT4JC!Juz>(722BDlF2uia>`8eu}zIGMP2C~1aFK>?;H
zCt3TMM0@t&s(2CRF?fCqx4;Bq;mqYHqd<^W{bO;`AN-^7#PG0N_^`pqpz24|Po%73
zR8^te*GSX6lyT)A{nPSy8rf-nuhO;m>dyeVy>f!H&lm#%LOq{z2?^eaD0?i9+Yz1}sGCeh=n@D$qPv(aIS
zy`Vr{{;Ak!YmCFvp*;4wIv^~w1V&85FKNOl(1#GCF%u))HFB;L;V
zlW?)ZZk2;1P2_^zIOiz1eERAgOnm1`ge(IgwU)UUg~4oLGmU!Fj?{Y*6V{i+j8J}3
zT~f4<$7!VxKM`EOF&xEK%?X>m4bd2{Qyzg#P1I;|NA?tfA!T?d>Pjh0ba+C)y;GgJ
zv`odYtP5nO1>;O(wGdlDJWx9V*tX^!s?-W;Dn?=Xn7(?z?M2Pn46qA6XCTA@yNvnA
z@S8X2dwezoeb?}$O)ft6Ay)U=@{MEqjc70i3^j@z(x@%e(uVQ7h{L0*D0W6B&rvJo
z2z*El#6|^Y(7>D;BRC0_9A<7M%W$X8&7<$o3?(EaJ7!#kLZJ_hK!)WujuvX!x?E@^
z{!&d9jR^u91izdAkty(Rs4POHPI}}RrrcLOlQh|SZmNHWN&|E3z^56xwWu%vxlJ)<
z8Y48&->MV1mE4<3LsdE39h1V8_7^KgGUT=S4}?zjD>f9{1+a|xs`dS#bgHxpRzA>m
z)dKaX%I#4rk9KV2WejduLLOMToXN0jL&53}39o!E5;WUzwu8a7r43yCD>#kFvew}k
zjY(N?7y5~-b2jj;szvL&!5wr@astyQ(?js3!DtNb#gVPF2|yCR23qUFy0IqTfsN(}
z4Q|+<{ZmKdcFvhB6;AA^3T<~Oh)%3SHVVQ0oFAr>B|H~Odz*M>GtwcXD9BKgL^8I;
zf^9ocHXL3n0{)rQ5VO+jv^xz&v6a58Bh!O$$TUWsA^7~OE}X=nift29S&M>{zzE+{
z);&$Yabwk-Y0@Xh@iRA465(Z28x{2p@w23OEHRPM59~Q~C1mplYT#}!S1;gQjCZqOZ?n(L
zcm*Oe5$(siTk7)P#9V)+!7(a=?nIT_fi9GvvG-*11M-2<5HKYw!MaD}*H$@RloR`F
zZV1r{3X=d|Dq5;(K-W86qbPJ}tB_uiJc5AY2e7=Rpyj=sx;_zR2)OyF-k-F;J#>dlP~
z;AfoG4+O%v4wSC_^|BHIvzNsWU11-7551B&e4_KqGAaQ3<@$3DmD%c!)O97${Ml&E
z1kZ581zjJ|y%^7cV&?G7QK}*rh&;^U;H-)`3?&-6p`Tp%}
zZOJ$sJOz)M_rH6uBFq(YZVKiN;cwC%0oXfI`IzpbXRuFV9hEURMctJev3;WYVYJL$
z;Wwc&-ETx)h~1_wkmD3QcdwwkyO+bedRIHr#sURfY+iyatX>V=gz-l$F;j&qvG{9e
z$XAWx@i0()Ws654n@8x?cN(#Kj#+}}!^7NnWaEP&YCs7Vw0wn!&IEpcH&{2IVOaz8
z%k}8`)#3G@8~Q~#k2%9GcF1h(D+EQ4IPr?z<-f21b}9(X+xiz&_bJrjng$HH(WOcP
z81Wy|K)z-OPhkBJ762jZre9P%UHQ~oBbUgprv$#p(pjMC7NTPdmfSi+e(eGE^j$e^
z27b`GWhaN?>qFqtsWPjBJ)#f=-Ig%m4&+dhpb0_)w%{g6VeE+mUMV~?;fqsDn9G>P
z$-gR={Bs#m0z^#D^MRDo@vx#H3l1KBB&FYp_6<*w9^GnW|lIKHN27AFV
zJxF0e3+qBBpiQcN
z6E;zrvWdzpXhMab0wWUPJzq_ph{iB;A(Kq{j|TK)>lI)YW^fI>B~3*Z^=MJQ2h1&;
[GIT binary patch data (base85-encoded) omitted]