Feature/docker CVE issues (#558)

* updated Dockerfile to try to remove security vulnerabilities w/ burnettk

* we require curl for health checks w/ burnettk

* try to scan docker image in ci

* use Dockerfile from backend w/ burnettk

* continue-on-error w/ burnettk

* attempt to elevate permissions of snyk w/ burnettk

* added snyk security github workflow w/ burnettk

* fixed location of constraints w/ burnettk

* add in or true for snyk tests w/ burnettk

* sent the snyk token w/ burnettk

* specify the directory for the sarif file w/ burnettk

* updated spiffworkflow-connector-command for snyk issue w/ burnettk

* updated sql statements sanitize input

* ignore issues for debug_controller and check frontend with snyk w/ burnettk

* updated babel and electron for snyk w/ burnettk

* some more updates to fix vulnerabilities w/ burnettk

* prune repeated deps for frontend builds since the snyk container scan fails otherwise

* uncomment ci code so it runs again and use node for frontend base image w/ burnettk

* fixed backend image name w/ burnettk

* pyl w/ burnettk

---------

Co-authored-by: jasquat <jasquat@users.noreply.github.com>
jasquat authored 2023-10-19 14:22:52 -04:00, committed by GitHub
parent baa93f5e2c
commit fe4dc14b8d
9 changed files with 635 additions and 498 deletions


@@ -208,6 +208,10 @@ jobs:
run: ./bin/run_pre_commit_in_ci
check_docker_start_script:
permissions:
contents: read # for actions/checkout to fetch code
security-events: write # for github/codeql-action/upload-sarif to upload SARIF results
actions: read # only required for a private repository by github/codeql-action/upload-sarif to get the Action run status
runs-on: ubuntu-latest
steps:
- name: Check out the repository

.github/workflows/snyk-security.yml (new file, 137 lines)

@@ -0,0 +1,137 @@
# This workflow uses actions that are not certified by GitHub.
# They are provided by a third-party and are governed by
# separate terms of service, privacy policy, and support
# documentation.
# A sample workflow which sets up Snyk to analyze the full Snyk platform (Snyk Open Source, Snyk Code,
# Snyk Container and Snyk Infrastructure as Code)
# The setup installs the Snyk CLI - for more details on the possible commands
# check https://docs.snyk.io/snyk-cli/cli-reference
# The results of Snyk Code are then uploaded to GitHub Security Code Scanning
#
# In order to use the Snyk Action you will need to have a Snyk API token.
# More details in https://github.com/snyk/actions#getting-your-snyk-token
# or you can signup for free at https://snyk.io/login
#
# For more examples, including how to limit scans to only high-severity issues
# and fail PR checks, see https://github.com/snyk/actions/
name: Snyk Security
# on:
# push:
# branches: ["main" ]
# pull_request:
# branches: ["main"]
on:
- push
- pull_request
permissions:
contents: read
jobs:
snyk-backend:
permissions:
contents: read # for actions/checkout to fetch code
security-events: write # for github/codeql-action/upload-sarif to upload SARIF results
actions: read # only required for a private repository by github/codeql-action/upload-sarif to get the Action run status
runs-on: ubuntu-latest
defaults:
run:
working-directory: spiffworkflow-backend
env:
# This is where you will need to introduce the Snyk API token created with your Snyk account
SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}
steps:
- uses: actions/checkout@v3
- name: Set up Snyk CLI to check for security issues
# Snyk can be used to break the build when it detects security issues.
# In this case we want to upload the SAST issues to GitHub Code Scanning
uses: snyk/actions/setup@806182742461562b67788a64410098c9d9b96adb
env:
# This is where you will need to introduce the Snyk API token created with your Snyk account
SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}
- name: Upgrade pip
run: |
pip install --constraint=../.github/workflows/constraints.txt pip
pip --version
- name: Install Poetry
run: |
pipx install --pip-args=--constraint=../.github/workflows/constraints.txt poetry
poetry --version
- name: Poetry Install
run: poetry install
# Runs Snyk Code (SAST) analysis and uploads result into GitHub.
# Use || true to not fail the pipeline
- name: Snyk Code test
run: snyk code test --sarif > snyk-code.sarif || true
# Runs Snyk Open Source (SCA) analysis and uploads result to Snyk.
- name: Snyk Open Source monitor
run: snyk monitor --all-projects
# Build the docker image for testing
- name: Build a Docker image
run: docker build -t spiffworkflow-backend/snyk-test .
# Runs Snyk Container (Container and SCA) analysis and uploads result to Snyk.
- name: Snyk Container monitor
run: snyk container monitor spiffworkflow-backend/snyk-test --file=Dockerfile
# Push the Snyk Code results into GitHub Code Scanning tab
- name: Upload result to GitHub Code Scanning
uses: github/codeql-action/upload-sarif@v2
with:
sarif_file: spiffworkflow-backend/snyk-code.sarif
snyk-frontend:
permissions:
contents: read # for actions/checkout to fetch code
security-events: write # for github/codeql-action/upload-sarif to upload SARIF results
actions: read # only required for a private repository by github/codeql-action/upload-sarif to get the Action run status
runs-on: ubuntu-latest
defaults:
run:
working-directory: spiffworkflow-frontend
env:
# This is where you will need to introduce the Snyk API token created with your Snyk account
SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}
steps:
- uses: actions/checkout@v3
- name: Set up Snyk CLI to check for security issues
# Snyk can be used to break the build when it detects security issues.
# In this case we want to upload the SAST issues to GitHub Code Scanning
uses: snyk/actions/setup@806182742461562b67788a64410098c9d9b96adb
env:
# This is where you will need to introduce the Snyk API token created with your Snyk account
SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}
- name: Setup Node
uses: actions/setup-node@v3
with:
node-version: 18.x
- run: npm install
# Runs Snyk Code (SAST) analysis and uploads result into GitHub.
# Use || true to not fail the pipeline
- name: Snyk Code test
run: snyk code test --sarif > snyk-code.sarif || true
# Runs Snyk Open Source (SCA) analysis and uploads result to Snyk.
- name: Snyk Open Source monitor
run: snyk monitor --all-projects
# Build the docker image for testing
- name: Build a Docker image
run: docker build -t spiffworkflow-frontend/snyk-test .
# Runs Snyk Container (Container and SCA) analysis and uploads result to Snyk.
- name: Snyk Container monitor
# pruning repeated subdependencies because it fails otherwise
run: snyk container monitor spiffworkflow-frontend/snyk-test --file=Dockerfile --prune-repeated-subdependencies
# Push the Snyk Code results into GitHub Code Scanning tab
- name: Upload result to GitHub Code Scanning
uses: github/codeql-action/upload-sarif@v2
with:
sarif_file: spiffworkflow-frontend/snyk-code.sarif


@@ -1,12 +1,19 @@
# Snyk (https://snyk.io) policy file, patches or ignores known vulnerabilities.
version: v1.25.0
# ignores vulnerabilities until expiry date; change duration by modifying expiry date
#
# leaving for documenting how to ignore items
# ignore:
# SNYK-PYTHON-FLASK-5490129:
# - '*':
# reason: Filed ticket to upgrade flask
# expires: 2024-06-02T14:48:14.372Z
# created: 2023-05-03T14:48:14.379Z
ignore: {}
patch: {}
# When running "snyk ignore" to ignore issues found with "snyk code test",
# make sure to EXCLUDE the --id option. Otherwise a bad file is created.
#
# Works:
# snyk ignore --file-path=src/spiffworkflow_backend/routes/debug_controller.py
#
# Does not work:
# snyk ignore --file-path=src/spiffworkflow_backend/routes/debug_controller.py --id=whatever
#
# A single vulnerability cannot be ignored for "snyk code test". Only whole files can be ignored.
exclude:
global:
- src/spiffworkflow_backend/routes/debug_controller.py


@@ -1,5 +1,5 @@
# Base image to share ENV vars that activate VENV.
FROM ghcr.io/sartography/python:3.11 AS base
FROM python:3.11.6-slim-bookworm AS base
ENV VIRTUAL_ENV=/app/venv
RUN python3 -m venv $VIRTUAL_ENV
@@ -11,11 +11,18 @@ WORKDIR /app
# vim is just for debugging
FROM base AS deployment
# git-core because the app does "git commit", etc
# curl because the docker health check uses it
# gunicorn3 for web server
# default-mysql-client for convenience accessing mysql docker container
# vim ftw
RUN apt-get update \
&& apt-get clean -y \
&& apt-get install -y -q curl git-core gunicorn3 default-mysql-client vim \
&& apt-get install -y -q git-core curl gunicorn3 default-mysql-client vim \
&& rm -rf /var/lib/apt/lists/*
RUN pip install poetry==1.6.1
# Setup image for installing Python dependencies.
FROM base AS setup
@@ -24,7 +31,7 @@ FROM base AS setup
# problem with poetry but with lazy-object-proxy (1.7.1) not supporting PEP 517 builds.
# You can verify this by running 'pip wheel --use-pep517 "lazy-object-proxy (==1.7.1) ; python_version >= "3.6""'.
# Pinning to 1.3.2 to attempt to avoid it.
RUN pip install poetry==1.3.2
RUN pip install poetry==1.6.1
RUN useradd _gunicorn --no-create-home --user-group
# default-libmysqlclient-dev for mysqlclient lib


@@ -2389,7 +2389,7 @@ typing-extensions = "^4.8.0"
type = "git"
url = "https://github.com/sartography/spiffworkflow-connector-command.git"
reference = "main"
resolved_reference = "022c01826e98fd3997ec21652c88c1343279b6f4"
resolved_reference = "7e20603849ee609267ddc97c8433077e09ccd124"
[[package]]
name = "sqlalchemy"


@@ -37,7 +37,7 @@ class KKVDataStore(BpmnDataStoreSpecification):  # type: ignore
def set(self, my_task: SpiffTask) -> None:
"""set."""
data = my_task.data[self.bpmn_id]
if type(data) != dict:
if not isinstance(data, dict):
raise Exception(
f"When writing to this data store, a dictionary is expected as the value for variable '{self.bpmn_id}'"
)
@@ -45,7 +45,7 @@
if second_level is None:
self._delete_all_for_top_level_key(top_level_key)
continue
if type(second_level) != dict:
if not isinstance(second_level, dict):
raise Exception(
"When writing to this data store, a dictionary is expected as the value for"
f" '{self.bpmn_id}[\"{top_level_key}\"]'"


@@ -33,8 +33,9 @@ from spiffworkflow_backend.models.user_group_assignment_waiting import UserGroup
from spiffworkflow_backend.routes.openid_blueprint import openid_blueprint
from spiffworkflow_backend.services.user_service import UserService
from sqlalchemy import and_
from sqlalchemy import func
from sqlalchemy import literal
from sqlalchemy import or_
from sqlalchemy import text
@dataclass
@@ -111,10 +112,11 @@ class AuthorizationService:
.join(PermissionTargetModel)
.filter(
or_(
text(f"'{target_uri_normalized}' LIKE permission_target.uri"),
# found from https://stackoverflow.com/a/46783555
literal(target_uri_normalized).like(PermissionTargetModel.uri),
# to check for exact matches as well
# see test_user_can_access_base_path_when_given_wildcard_permission unit test
text(f"'{target_uri_normalized}' = replace(replace(permission_target.uri, '/%', ''), ':%', '')"),
func.REPLACE(func.REPLACE(PermissionTargetModel.uri, "/%", ""), ":%", "") == target_uri_normalized,
)
)
.all()
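
The rewrite above moves target_uri_normalized out of raw text() fragments built with f-strings and into SQLAlchemy expressions, so the value is sent as a bound parameter rather than interpolated into the SQL string. A minimal standalone sketch of the same pattern, assuming SQLAlchemy 1.4+ and a hypothetical PermissionTarget model standing in for the real PermissionTargetModel:

from sqlalchemy import Column, Integer, String, func, literal, or_, select, text
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class PermissionTarget(Base):  # hypothetical stand-in for PermissionTargetModel
    __tablename__ = "permission_target"
    id = Column(Integer, primary_key=True)
    uri = Column(String(255))

target_uri_normalized = "/process-models/some-group"

# Old pattern: the value is spliced directly into the SQL text.
unsafe_clause = text(f"'{target_uri_normalized}' LIKE permission_target.uri")

# New pattern: the value travels as a bound parameter.
safe_query = select(PermissionTarget).where(
    or_(
        literal(target_uri_normalized).like(PermissionTarget.uri),
        func.REPLACE(func.REPLACE(PermissionTarget.uri, "/%", ""), ":%", "") == target_uri_normalized,
    )
)
print(safe_query)  # rendered SQL contains placeholders (e.g. :param_1), not the raw value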


@@ -1,7 +1,8 @@
# Base image to share ENV vars that activate VENV.
FROM quay.io/sartography/node:latest AS base
FROM node:20.8.1-bookworm-slim AS base
RUN mkdir /app
WORKDIR /app
# this matches total memory on spiffworkflow-demo
@@ -31,11 +32,14 @@ FROM base AS final
LABEL description="Frontend component of SpiffWorkflow, a software development platform for building, running, and monitoring executable diagrams"
# WARNING: On localhost frontend assumes backend is one port lowe.
# WARNING: On localhost frontend assumes backend is one port lower.
ENV PORT0=7001
COPY --from=setup /app/build /app/build
COPY --from=setup /app/bin /app/bin
COPY --from=setup /app/node_modules.justserve /app/node_modules
ENTRYPOINT ["/app/bin/boot_server_in_docker"]
RUN chown -R node:node /app
USER node
CMD ["/app/bin/boot_server_in_docker"]

File diff suppressed because it is too large.