78 Commits

Author SHA1 Message Date
7ddb3c153e form for upload, fix
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
2025-07-15 17:24:34 +02:00
677e34f1f3 form for upload 2025-07-15 17:16:14 +02:00
2a2a1316e1 tune error messages, fix 2025-07-15 16:31:56 +02:00
f2d6178304 tune error messages 2025-07-15 16:28:01 +02:00
7236c35ef9 fix paths, fix 2025-07-15 16:01:09 +02:00
b2cf3fe4c7 fix paths 2025-07-15 15:59:21 +02:00
a56119379a add forgotten module 2025-07-15 15:48:46 +02:00
bd368822aa fix paths in api 2025-07-15 15:45:45 +02:00
1cb9451c47 fix shell in entrypoint script of server 2025-07-15 15:40:14 +02:00
5eedb7c523 fix image name confusion, fix 2025-07-15 15:36:32 +02:00
294f30eb38 fix image name confusion
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
2025-07-15 15:32:53 +02:00
5af202469c fix start script, add port 2025-07-15 15:26:10 +02:00
81bd403069 fix start script 2025-07-15 15:20:26 +02:00
93222237ee adjust ci rules, fix 2025-07-15 15:18:46 +02:00
d5bda1c2d4 adjust ci rules 2025-07-15 15:16:54 +02:00
b430afcfef add deploy stage, fix 3 2025-07-15 15:10:24 +02:00
3ce0b0a4cf add deploy stage, fix 2 2025-07-15 15:09:50 +02:00
c88a74daa3 add deploy stage, fix 2025-07-15 15:09:05 +02:00
10d14d87fb add deploy stage
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
2025-07-15 15:07:53 +02:00
58795aca81 rename dockerfiles, fix 2 2025-07-15 14:45:32 +02:00
13271a6d5e rename dockerfiles, fix 2025-07-15 14:44:18 +02:00
5a9493fe32 rename dockerfiles
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
2025-07-15 14:42:53 +02:00
708b99852f add second dockerfile, add ci snippet
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
2025-07-15 14:40:40 +02:00
e15973db53 add second dockerfile 2025-07-15 14:40:06 +02:00
b2db5b35ad prepare second dockerfile
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
2025-07-15 14:33:07 +02:00
b21bd408f7 there is still an error
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2025-07-14 23:13:30 +02:00
e1aa900f4d works
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2025-07-14 22:47:08 +02:00
91dd245318 add server
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
2025-07-14 22:06:16 +02:00
921a784fc0 add webservice boilerplate snippet
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2025-07-14 17:58:43 +02:00
43678c69fb hidden imports, 9 2025-07-14 16:55:56 +02:00
4577f8f0a5 hidden imports, 8 2025-07-14 16:49:32 +02:00
4dd3e9e799 hidden imports, 7 2025-07-14 16:40:35 +02:00
46ce0e1d54 hidden imports, 6 2025-07-14 16:37:16 +02:00
07b5a2a512 hidden imports, 5 2025-07-14 16:32:22 +02:00
d30abf3d0c hidden imports, 4 2025-07-14 16:28:23 +02:00
f8061aaa7a hidden imports, 3 2025-07-14 16:23:01 +02:00
e1cce96308 hidden imports, 2 2025-07-14 16:18:08 +02:00
3bd9882beb hidden imports
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-07-14 16:13:18 +02:00
6004f6aeb4 add windows build step, 9
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-07-11 16:22:11 +02:00
3ffcf262e5 add windows build step, 8
All checks were successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-07-11 16:17:24 +02:00
550b5ff28a add windows build step, 7
All checks were successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-07-11 16:10:35 +02:00
302f4df307 add windows build step, 6
All checks were successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-07-11 16:09:15 +02:00
b8f4a3c46f add windows build step, 5
All checks were successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-07-11 16:07:54 +02:00
1cee3b5dae add windows build step, 4
All checks were successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-07-11 16:03:20 +02:00
0d28c61c0f add windows build step, 3
All checks were successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-07-11 16:01:54 +02:00
7fefc75d64 add windows build step, 2
All checks were successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-07-11 15:59:48 +02:00
e0398bd8fb add windows build step
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-07-11 15:49:58 +02:00
5ff83f3af7 change artifacts
All checks were successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-07-11 15:00:30 +02:00
e85858d342 Multiple cpe experiments, failed. Add reimport feature.
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2025-07-11 13:29:02 +02:00
6811740835 prepare local env
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2025-07-10 16:23:46 +02:00
86ab9808d8 local build env
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2025-07-10 14:55:34 +02:00
117a74989e hallo hier ein commit
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2025-07-10 12:06:41 +02:00
b91a7ae0fc fix in ci script, 2
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2025-07-09 12:18:13 +02:00
e3043c5646 fix in ci script
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
2025-07-09 11:44:18 +02:00
9afa00f61f add minimal sbom converter
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-07-09 11:26:50 +02:00
bd92d8eb87 drop plantuml snippet
Some checks failed
ci/woodpecker/push/woodpecker Pipeline failed
2025-07-09 08:43:37 +02:00
5a1d6903e8 test plantuml integration
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2025-06-25 16:05:48 +02:00
67bab6710c documentation
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
2025-06-20 10:38:12 +02:00
f55c3da3ef solve conflicting option
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-06-11 10:40:53 +02:00
f50d821aec verbose switch
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-06-11 09:59:13 +02:00
609f33b181 use correct custom ca location
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-06-11 07:40:14 +02:00
7c8e1156aa some debug
All checks were successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-06-10 18:48:47 +02:00
226456ccd2 absolute pathes
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-06-10 17:18:34 +02:00
227ef294d3 custom ca, 14
All checks were successful
ci/woodpecker/push/woodpecker Pipeline was successful
ci/woodpecker/tag/woodpecker Pipeline was successful
2025-06-04 16:14:16 +02:00
a14e0ab2c5 custom ca, 13
Some checks are pending
ci/woodpecker/tag/woodpecker Pipeline is pending
2025-06-04 16:13:14 +02:00
471fcb2177 custom ca, 12
Some checks are pending
ci/woodpecker/tag/woodpecker Pipeline is pending
2025-06-04 16:01:37 +02:00
0d4ac4022a custom ca, 11
Some checks are pending
ci/woodpecker/tag/woodpecker Pipeline is pending
2025-06-04 15:56:55 +02:00
405d66cdcb custom ca, 10
Some checks are pending
ci/woodpecker/tag/woodpecker Pipeline is pending
2025-06-04 15:53:49 +02:00
a32d9fd643 custom ca, 9
Some checks are pending
ci/woodpecker/tag/woodpecker Pipeline is pending
2025-06-04 15:51:21 +02:00
7f394f82ee custom ca, 8 2025-06-04 15:49:42 +02:00
c8577edf0c custom ca, 7
Some checks failed
ci/woodpecker/tag/woodpecker Pipeline failed
2025-06-04 15:45:06 +02:00
02aba34391 custom ca, 6
Some checks failed
ci/woodpecker/tag/woodpecker Pipeline failed
2025-06-04 15:42:15 +02:00
1fb4c387a7 custom ca, 5
Some checks are pending
ci/woodpecker/tag/woodpecker Pipeline is pending
2025-06-04 15:37:48 +02:00
92b61fdae0 custom ca, 4
Some checks are pending
ci/woodpecker/tag/woodpecker Pipeline is pending
2025-06-04 15:35:23 +02:00
4ddb6cfd30 custom ca, 3
Some checks are pending
ci/woodpecker/tag/woodpecker Pipeline is pending
2025-06-04 15:33:14 +02:00
0eb761db27 custom ca, 2
Some checks are pending
ci/woodpecker/tag/woodpecker Pipeline is pending
2025-06-04 15:31:14 +02:00
9cc81373dc custom ca
Some checks are pending
ci/woodpecker/tag/woodpecker Pipeline is pending
2025-06-04 15:24:15 +02:00
b856424640 factorize stages
Some checks are pending
ci/woodpecker/tag/woodpecker Pipeline is pending
2025-06-04 15:08:00 +02:00
23 changed files with 1085 additions and 320 deletions

1
.gitignore vendored

@@ -4,4 +4,5 @@ defs/
__pycache__/
.*.swp
tmp/
locallibs

View File

@@ -1,15 +1,17 @@
stages:
- generate-api-clients
- dockerize
- build
- deploy
variables:
REGISTRY: devnexus.krohne.com:18079/repository/docker-krohne
IMAGE_NAME: $REGISTRY/$CI_PROJECT_NAME
IMAGE_NAME_PREFIX: $REGISTRY/$CI_PROJECT_NAME
DTRACK_API_URL: https://dtrack-api-rd.krohne.com
DEFECTDOJO_API_URL: https://defectdojo-rd.krohne.com
KROHNE_CA_URL: https://devwiki.krohnegroup.com/lib/exe/fetch.php?media=krohne-ca.crt
KROHNE_CA_CHECKSUM: a921e440a742f1e67c7714306e2c0d76
generate-dtrack-api:
.generate-api:
stage: generate-api-clients
image: openapitools/openapi-generator-cli:v7.12.0
tags:
@@ -19,9 +21,18 @@ generate-dtrack-api:
rules:
- if: '$CI_COMMIT_BRANCH == "main"'
- if: '$CI_COMMIT_TAG'
before_script:
- curl --insecure $KROHNE_CA_URL -o krohne-ca.crt
- echo "$KROHNE_CA_CHECKSUM krohne-ca.crt" | md5sum -c
- mv krohne-ca.crt /usr/local/share/ca-certificates
- update-ca-certificates
generate-dtrack-api:
extends: .generate-api
artifacts:
paths:
- dtrack-api-client.tgz
- dependencytrack-client
expire_in: 1 week
script:
- curl ${DTRACK_API_URL}/api/openapi.json > dependencytrack-openapi.json
@@ -40,22 +51,13 @@ generate-dtrack-api:
-o dependencytrack-client \
--package-name dependencytrack_api \
-t dependencytrack-openapi-custom-template
- tar -czvf dtrack-api-client.tgz dependencytrack-client
generate-defectdojo-api:
stage: generate-api-clients
image: openapitools/openapi-generator-cli:v7.12.0
tags:
- linux
- docker
- bash
rules:
- if: '$CI_COMMIT_BRANCH == "main"'
- if: '$CI_COMMIT_TAG'
extends: .generate-api
artifacts:
paths:
- defectdojo-api-client.tgz
- defectdojo-client
expire_in: 1 week
script:
- curl ${DEFECTDOJO_API_URL}/api/v2/oa3/schema/?format=json > defectdojo-openapi.json
@@ -66,61 +68,91 @@ generate-defectdojo-api:
-g python \
-o defectdojo-client \
--package-name defectdojo_api
- tar -czvf defectdojo-api-client.tgz defectdojo-client
dockerize:
stage: dockerize
.dockerize:
stage: build
image: devnexus.krohne.com:18079/repository/docker-krohne/krohnedockerbash:0.5
tags:
- linux
- docker
- bash
rules:
- if: '$CI_COMMIT_TAG'
- if: '$CI_COMMIT_BRANCH == "main" || $CI_COMMIT_BRANCH == "production_deployment"'
script:
- tar -xzf defectdojo-api-client.tgz
- tar -xzf dtrack-api-client.tgz
- docker build --tag $IMAGE_NAME:latest
- IMAGE_NAME=$IMAGE_NAME_PREFIX"-"$IMAGE_NAME_SUFFIX
- docker build --build-arg ADDITIONAL_CA_URL="$KROHNE_CA_URL"
--build-arg ADDITIONAL_CA_CHECKSUM=$KROHNE_CA_CHECKSUM
--tag $IMAGE_NAME:latest-$CI_COMMIT_BRANCH
--tag $IMAGE_NAME:$CI_COMMIT_SHA
--tag $IMAGE_NAME:$CI_COMMIT_TAG
-f $DOCKERFILE
.
- docker login -u $NEXUS_USER -p $NEXUS_PASSWORD $REGISTRY
- docker push $IMAGE_NAME:latest
- docker push $IMAGE_NAME:latest-$CI_COMMIT_BRANCH
- docker push $IMAGE_NAME:$CI_COMMIT_SHA
- docker push $IMAGE_NAME:$CI_COMMIT_TAG
dockerize-cli:
extends: .dockerize
variables:
IMAGE_NAME_SUFFIX: cli
DOCKERFILE: Dockerfile-cli
dockerize-server:
extends: .dockerize
variables:
IMAGE_NAME_SUFFIX: server
DOCKERFILE: Dockerfile-server
#
# build:
# image: plugins/kaniko
# settings:
# repo: ${FORGE_NAME}/${CI_REPO}
# registry:
# from_secret: container_registry
# tags: latest,${CI_COMMIT_SHA},${CI_COMMIT_TAG}
# username:
# from_secret: container_registry_username
# password:
# from_secret: container_registry_password
# dockerfile: Dockerfile
# when:
# - event: [ push, tag ]
#
# build-for-quay:
# image: plugins/kaniko
# settings:
# repo: quay.io/wollud1969/${CI_REPO_NAME}
# registry: quay.io
# tags:
# - latest
# - ${CI_COMMIT_TAG}
# username:
# from_secret: quay_username
# password:
# from_secret: quay_password
# dockerfile: Dockerfile
# when:
# - event: [tag]
#
.deploy:
stage: deploy
image: devnexus.krohne.com:18079/repository/docker-krohne/krohnedockerbash:0.5
variables:
GIT_STRATEGY: none
SERVICE: sbom-dd-dt-integrator
script:
- IMAGE_NAME=$IMAGE_NAME_PREFIX"-server"
- VERSION=$CI_COMMIT_SHA
- CONTAINER_NAME=$SERVICE"-"$INSTANCE_SPECIFIER
- SERVICE_VOLUME=$SERVICE"-"$INSTANCE_SPECIFIER"-data"
- docker volume inspect $SERVICE_VOLUME || docker volume create $SERVICE_VOLUME
- docker stop $CONTAINER_NAME || echo "$CONTAINER_NAME not running, anyway okay"
- docker rm $CONTAINER_NAME || echo "$CONTAINER_NAME not running, anyway okay"
- docker login -u $NEXUS_USER -p $NEXUS_PASSWORD $REGISTRY
- docker pull $IMAGE_NAME:$VERSION
- |
cat - > /start-scripts/${CONTAINER_NAME}.sh << EOT
docker run \
-d \
--restart always \
-p 4701:8000 \
--name $CONTAINER_NAME \
-e DTRACK_API_URL=$DTRACK_API_URL \
-e DTRACK_TOKEN=$DTRACK_TOKEN \
-e DEFECTDOJO_URL=$DEFECTDOJO_URL \
-e DEFECTDOJO_TOKEN=$DEFECTDOJO_TOKEN \
$IMAGE_NAME:$VERSION
EOT
- chmod 755 /start-scripts/${CONTAINER_NAME}.sh
- /start-scripts/${CONTAINER_NAME}.sh
deploy-test:
extends: .deploy
rules:
- if: '$CI_COMMIT_BRANCH == "main"'
tags:
- test-deployment-de01rdtst01
variables:
INSTANCE_SPECIFIER: test
environment:
name: test
deploy-dev:
extends: .deploy
rules:
- if: '$CI_COMMIT_BRANCH == "production_deployment"'
tags:
- for-common-services-prod-deployment-only
variables:
INSTANCE_SPECIFIER: prod
environment:
name: prod

View File

@@ -41,14 +41,16 @@ steps:
repo: ${FORGE_NAME}/${CI_REPO}
registry:
from_secret: container_registry
tags: latest,${CI_COMMIT_SHA},${CI_COMMIT_TAG}
tags:
- latest
- ${CI_COMMIT_SHA}
username:
from_secret: container_registry_username
password:
from_secret: container_registry_password
dockerfile: Dockerfile
when:
- event: [ push, tag ]
- event: [ push ]
build-for-quay:
image: plugins/kaniko

View File

@@ -1,35 +0,0 @@
FROM python:3.12.10-alpine3.22
ENV DTRACK_API_URL=""
ENV DTRACK_TOKEN=""
ENV DEFECTDOJO_URL=""
ENV DEFECTDOJO_TOKEN=""
ARG APP_DIR=/opt/app
RUN \
apk add --no-cache syft &&\
adduser -s /bin/sh -D user &&\
mkdir -p $APP_DIR &&\
chown user:user $APP_DIR
USER user
WORKDIR $APP_DIR
COPY src/requirements.txt .
COPY src/sbom-dt-dd.py .
COPY src/entrypoint.sh .
COPY dependencytrack-client/ ./dependencytrack-client
COPY defectdojo-client/ ./defectdojo-client
RUN \
python -m venv .venv &&\
. ./.venv/bin/activate &&\
pip install -r requirements.txt &&\
pip install -r dependencytrack-client/requirements.txt &&\
pip install -r defectdojo-client/requirements.txt
ENTRYPOINT [ "./entrypoint.sh" ]

51
Dockerfile-cli Normal file

@@ -0,0 +1,51 @@
FROM python:3.12.10-alpine3.22
ENV DTRACK_API_URL=""
ENV DTRACK_TOKEN=""
ENV DEFECTDOJO_URL=""
ENV DEFECTDOJO_TOKEN=""
ARG APP_DIR=/opt/app
ARG ADDITIONAL_CA_URL="x"
ARG ADDITIONAL_CA_CHECKSUM="y"
RUN \
set -e &&\
apk add --no-cache syft &&\
adduser -s /bin/sh -D user &&\
mkdir -p $APP_DIR &&\
chown user:user $APP_DIR &&\
echo $ADDITIONAL_CA_URL &&\
echo $ADDITIONAL_CA_CHECKSUM &&\
if [ "$ADDITIONAL_CA_URL" != "x" ]; then \
cd /usr/local/share/ca-certificates; \
wget --no-check-certificate -O custom-ca.crt $ADDITIONAL_CA_URL; \
echo "$ADDITIONAL_CA_CHECKSUM custom-ca.crt" | md5sum -c; \
/usr/sbin/update-ca-certificates; \
echo "custom ca added"; \
else \
echo "no additional ca"; \
fi
USER user
WORKDIR $APP_DIR
COPY src/requirements.txt .
COPY src/sbom_dt_dd.py .
COPY src/sbom_dt_dd_cli.py .
COPY src/converter.py .
COPY src/entrypoint-cli.sh .
COPY dependencytrack-client/ ./dependencytrack-client
COPY defectdojo-client/ ./defectdojo-client
RUN \
python -m venv .venv &&\
. ./.venv/bin/activate &&\
pip install -r requirements.txt &&\
pip install -r dependencytrack-client/requirements.txt &&\
pip install -r defectdojo-client/requirements.txt
ENTRYPOINT [ "./entrypoint-cli.sh" ]

52
Dockerfile-server Normal file

@@ -0,0 +1,52 @@
FROM python:3.12.10-alpine3.22
ENV DTRACK_API_URL=""
ENV DTRACK_TOKEN=""
ENV DEFECTDOJO_URL=""
ENV DEFECTDOJO_TOKEN=""
ARG APP_DIR=/opt/app
ARG ADDITIONAL_CA_URL="x"
ARG ADDITIONAL_CA_CHECKSUM="y"
RUN \
set -e &&\
adduser -s /bin/sh -D user &&\
mkdir -p $APP_DIR &&\
chown user:user $APP_DIR &&\
echo $ADDITIONAL_CA_URL &&\
echo $ADDITIONAL_CA_CHECKSUM &&\
if [ "$ADDITIONAL_CA_URL" != "x" ]; then \
cd /usr/local/share/ca-certificates; \
wget --no-check-certificate -O custom-ca.crt $ADDITIONAL_CA_URL; \
echo "$ADDITIONAL_CA_CHECKSUM custom-ca.crt" | md5sum -c; \
/usr/sbin/update-ca-certificates; \
echo "custom ca added"; \
else \
echo "no additional ca"; \
fi
USER user
WORKDIR $APP_DIR
COPY src/requirements.txt .
COPY src/sbom_dt_dd.py .
COPY src/sbom_dt_dd_api.py .
COPY src/converter.py .
COPY src/entrypoint-server.sh .
COPY dependencytrack-client/ ./dependencytrack-client
COPY defectdojo-client/ ./defectdojo-client
RUN \
python -m venv .venv &&\
. ./.venv/bin/activate &&\
pip install -r requirements.txt &&\
pip install -r dependencytrack-client/requirements.txt &&\
pip install -r defectdojo-client/requirements.txt
EXPOSE 8000
ENTRYPOINT [ "./entrypoint-server.sh" ]

100
readme.md

@@ -1,6 +1,96 @@
# Python Client Packages for the DependencyTrack and DefectDojo API
# DependencyTrack and DefectDojo Automation
## Download the OpenAPI definitions
## Using
### Distribution
The glue logic comes as a docker image and is started as a docker container. This approach was chosen because of the dependencies, especially those related to the APIs of DependencyTrack and DefectDojo.
The image is available at
```
quay.io/wollud1969/dtrack-defectdojo-automation
```
and at
```
devnexus.krohne.com:18079/repository/docker-krohne/dtrack-defectdojo-automation
```
The tag to be used at the moment is `1.0.5`.
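To fetch the image, a plain `docker pull` with this tag is enough, for example:
```
docker pull quay.io/wollud1969/dtrack-defectdojo-automation:1.0.5
```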
### Start script
On Linux I've created two files to start the beast:
env-sbom-dt-dd
```
DTRACK_API_URL=https://dtrack-api-rd.krohne.com
DEFECTDOJO_URL=https://defectdojo-rd.krohne.com
DTRACK_TOKEN=...
DEFECTDOJO_TOKEN=...
```
The correct values for the tokens must be set here, obviously.
sbom-dt-dd.sh
```
#!/bin/bash
docker run -t -v $PWD:/work --rm --env-file ~/env-sbom-dt-dd devnexus.krohne.com:18079/repository/docker-krohne/dtrack-defectdojo-automation:1.0.5 "$@"
```
I keep both files directly in my home directory.
### File locations
When using the container and the script, keep in mind that the container does not have full access to your filesystem; you need to mount the required parts of your filesystem into the container. In the script above this is done with the option `-v $PWD:/work`, which mounts the current directory (the one from which you start the script and thus the container) to the directory `/work` within the container.
This is required when scanning a directory or uploading a prepared SBOM file.
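For instance, to scan a source directory you change into it first (so that it becomes `/work` inside the container) and pass the mounted path as the target; the options are explained in the next section, and the project metadata below is purely illustrative:
```
cd software1/
~/sbom-dt-dd.sh --name software1-server --version 0.0.1 --description "Server software for the Software1 platform" --type 1 --classifier APPLICATION --target /work
```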
### Options of the container/script
The container uses the glue logic script as its entrypoint. To see the available options, call
```
dehottgw@DE01RDDEV01:~$ docker run -t -v $PWD:/work --rm --env-file ~/env-sbom-dt-dd devnexus.krohne.com:18079/repository/docker-krohne/dtrack-defectdojo-automation:1.0.5 -- -h
usage: sbom-dt-dd.py [-h] --name NAME --version VERSION --description DESCRIPTION --type TYPE --classifier
{APPLICATION,FRAMEWORK,LIBRARY,CONTAINER,OPERATING_SYSTEM,DEVICE,FIRMWARE,FILE,PLATFORM,DEVICE_DRIVER,MACHINE_LEARNING_MODEL,DATA}
[--uploadsbom] [--sbomfile SBOMFILE] [--target TARGET] [--verbose]
sbom-dt-dd.py: error: the following arguments are required: --name/-n, --version/-v, --description/-d, --type/-t, --classifier/-c
dehottgw@DE01RDDEV01:~$
```
Note the double dash on the command line right before the `-h`. It is necessary; otherwise the `-h` would be treated as an option of the docker command itself.
### SBOM upload example
For this example I have a file `combined-sbom.json` in the directory `software1`:
```
cd software1/
~/sbom-dt-dd.sh --name software1-server --version 0.0.1 --description "Server software for the Software1 platform" --type 1 --classifier APPLICATION --uploadsbom --sbomfile /work/combined-sbom.json -V
```
## Building
### Python Client Packages for the DependencyTrack and DefectDojo API
#### Download the OpenAPI definitions
```
curl https://dtrack-api.hottis.de/api/openapi.json \
@@ -10,7 +100,7 @@ curl https://defectdojo.hottis.de/api/v2/oa3/schema/?format=json \
```
## Naive Generation of the Client Package for DefectDojo
#### Naive Generation of the Client Package for DefectDojo
```
docker run \
@@ -28,7 +118,7 @@ docker run \
For DefectDojo the naive code generation works.
## Naive Generation of the Client Package for DependencyTrack
#### Naive Generation of the Client Package for DependencyTrack
```
docker run \
@@ -43,7 +133,7 @@ docker run \
--package-name dependencytrack_api
```
## Fixed Generation of the Client Package for DependencyTrack
#### Fixed Generation of the Client Package for DependencyTrack
The OpenAPI definition of DependencyTrack uses a regex that is not understood by Python's default regex implementation `re`, which in turn is hardwired into the code generated by openapi-generator.
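The workaround used in this repository (see `src/prepare-local-env.sh` and the CI configuration) is to generate a customized template set and patch it to use the `regex` module instead of `re`, roughly like this:
```
docker run -v $PWD:/work -u $UID openapitools/openapi-generator-cli:v7.12.0 \
  author template -g python -o /work/dependencytrack-openapi-custom-template
sed -i -e 's/import re/import regex as re/' dependencytrack-openapi-custom-template/model_anyof.mustache
sed -i -e 's/import re/import regex as re/' dependencytrack-openapi-custom-template/model_generic.mustache
```
The client is then generated with `-t dependencytrack-openapi-custom-template`, and `regex` is listed in `src/requirements.txt`.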

33
snippets/websrv/main.py Normal file

@@ -0,0 +1,33 @@
from fastapi import FastAPI
from fastapi.responses import JSONResponse
app = FastAPI(
title="My FastAPI App",
version="1.0.0",
description="A simple FastAPI example with uvicorn and gunicorn."
)
@app.get("/hello")
async def say_hello(name: str):
"""
Returns a friendly greeting.
---
parameters:
- name: name
in: query
required: true
schema:
type: string
responses:
200:
description: Successful Response
content:
application/json:
schema:
type: object
properties:
message:
type: string
"""
return JSONResponse(content={"message": f"Hello, {name}!"})

View File

@@ -0,0 +1,3 @@
fastapi==0.116.1
gunicorn==23.0.0
uvicorn==0.35.0

4
snippets/websrv/server.sh Executable file

@@ -0,0 +1,4 @@
#!/bin/bash
./.venv/bin/gunicorn main:app -k uvicorn.workers.UvicornWorker -w 4 -b 0.0.0.0:8000
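For reference, once this snippet server has been started via `server.sh`, the endpoint can be exercised directly; host and port follow the bind address above, and the expected reply is shown as a comment:
curl "http://localhost:8000/hello?name=World"
# expected: {"message":"Hello, World!"}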

10
src/ENV-krohne.asc Normal file

@@ -0,0 +1,10 @@
-----BEGIN PGP MESSAGE-----
jA0ECQMC0qbzN9I9kGP/0sAlARybIFvSNy12iziCC4waAcAPBvvvVrutjyIYtaV1
z9WeoBv7TlHB9aKAgxj8LuSh44iDH6uz9FvZfYcZ2BpC9PQYr5IkIw9+iqq9hODM
P90Kr9CPazMR8BQUb+4iJjNlHKJL1HCYaFnSHdquzCD4KGqUkkRPPt4Oj/5baJVi
kfhU6bKuM6rarcVL0ebSbc2jUIEaugXhnvEWRTiAfOE8v6o7CneoK5hdMbhVA1iC
j3sVIcCWgfgMOGDfL2P8DCr7GsGoOxMXvfsPZZL1BRNIf8WXWGpml/TA5Q1vw8TM
z8l6SIHklQ==
=T8wW
-----END PGP MESSAGE-----

5
src/ENV-python Normal file

@@ -0,0 +1,5 @@
export PYTHONPATH=./locallibs/defectdojo-client:./locallibs/dependencytrack-client

117
src/converter.py Normal file

@@ -0,0 +1,117 @@
from loguru import logger
import yaml
import uuid
from packageurl import PackageURL
from cyclonedx.builder.this import this_component as cdx_lib_component
from cyclonedx.factory.license import LicenseFactory
from cyclonedx.model.bom import Bom
from cyclonedx.model.component import Component, ComponentType
from cyclonedx.model.contact import OrganizationalEntity
from cyclonedx.model import XsUri
from cyclonedx.model import ExternalReference
from cyclonedx.output.json import JsonV1Dot5
class MyLocalConverterException(Exception): pass
def __converterClassifierToComponentType(classifier):
componentType = ''
match classifier:
case 'APPLICATION':
componentType = ComponentType.APPLICATION
case 'FRAMEWORK':
componentType = ComponentType.FRAMEWORK
case 'LIBRARY':
componentType = ComponentType.LIBRARY
case 'CONTAINER':
componentType = ComponentType.CONTAINER
case 'OPERATING_SYSTEM':
componentType = ComponentType.OPERATING_SYSTEM
case 'DEVICE':
componentType = ComponentType.DEVICE
case 'FIRMWARE':
componentType = ComponentType.FIRMWARE
case 'FILE':
componentType = ComponentType.FILE
case 'PLATFORM':
componentType = ComponentType.PLATFORM
case 'DEVICE_DRIVER':
componentType = ComponentType.DEVICE_DRIVER
case 'MACHINE_LEARNING_MODEL':
componentType = ComponentType.MACHINE_LEARNING_MODEL
case 'DATA':
componentType = ComponentType.DATA
case _:
raise MyLocalConverterException(f"No componentType for {classifier} found")
return componentType
def minimalSbomFormatConverter(minimalSbom):
logger.info(f"Minimal input: {minimalSbom}")
lc_factory = LicenseFactory()
minimalSbomObject = yaml.safe_load(minimalSbom)
logger.debug(f"{minimalSbomObject=}")
bom = Bom(
version=minimalSbomObject['sbomVersion']
)
bom.metadata.tools.components.add(cdx_lib_component())
bom.metadata.tools.components.add(Component(
name='sbom-dt-dd',
type=ComponentType.APPLICATION
))
bom.metadata.component = root_component = Component(
name=minimalSbomObject['product'],
type=__converterClassifierToComponentType(minimalSbomObject['classifier']),
description=minimalSbomObject['description'],
version=minimalSbomObject['version'],
licenses=[lc_factory.make_from_string(minimalSbomObject['license'])],
supplier=OrganizationalEntity(
name=minimalSbomObject['supplier']['name'],
urls=[XsUri(minimalSbomObject['supplier']['url'])]
),
bom_ref = f"urn:uuid:{uuid.uuid4()}"
)
component = Component(
type=__converterClassifierToComponentType(minimalSbomObject['classifier']),
name=f"{minimalSbomObject['supplier']['name']}´s own code",
version=minimalSbomObject['version'],
licenses=[lc_factory.make_from_string(minimalSbomObject['license'])],
supplier=OrganizationalEntity(
name=minimalSbomObject['supplier']['name'],
urls=[XsUri(minimalSbomObject['supplier']['url'])]
),
bom_ref = f"urn:uuid:{uuid.uuid4()}"
)
bom.components.add(component)
bom.register_dependency(root_component, [component])
for minimalComponentDescription in minimalSbomObject['components']:
component = Component(
type=ComponentType.LIBRARY,
name=minimalComponentDescription['name'],
version=minimalComponentDescription['version'],
licenses=[lc_factory.make_from_string(minimalComponentDescription['license'])],
bom_ref = f"urn:uuid:{uuid.uuid4()}"
)
if 'cpe' in minimalComponentDescription:
component.cpe = minimalComponentDescription['cpe']
if 'purl' in minimalComponentDescription:
component.purl = PackageURL.from_string(minimalComponentDescription['purl'])
bom.components.add(component)
bom.register_dependency(root_component, [component])
outputSbom = JsonV1Dot5(bom).output_as_string(indent=2)
logger.info(outputSbom)
with open('/tmp/bom.json', 'w') as f:
f.write(outputSbom)
return (outputSbom, minimalSbomObject['product'], minimalSbomObject['version'], minimalSbomObject['classifier'], minimalSbomObject['description'])
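For reference, a minimal SBOM accepted by this converter would look roughly as follows; the field names are taken from the code above, while all values (product, supplier, components) are purely illustrative:
sbomVersion: 1
product: example-product
version: 0.0.1
classifier: APPLICATION
description: Example product built from this minimal SBOM
license: MIT
supplier:
  name: Example Corp
  url: https://example.com
components:
  - name: libfoo
    version: 1.2.3
    license: Apache-2.0
    purl: pkg:pypi/libfoo@1.2.3
  - name: libbar
    version: 4.5.6
    license: MIT
    cpe: cpe:2.3:a:example:libbar:4.5.6:*:*:*:*:*:*:*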

10
src/entrypoint-cli.sh Executable file

@@ -0,0 +1,10 @@
#!/bin/sh
# entrypoint.sh
source /opt/app/.venv/bin/activate
PYTHONPATH="$PYTHONPATH:/opt/app/dependencytrack-client"
PYTHONPATH="$PYTHONPATH:/opt/app/defectdojo-client"
export PYTHONPATH
exec python /opt/app/sbom_dt_dd_cli.py "$@"

9
src/entrypoint-server.sh Executable file

@@ -0,0 +1,9 @@
#!/bin/sh
source /opt/app/.venv/bin/activate
PYTHONPATH="$PYTHONPATH:/opt/app/dependencytrack-client"
PYTHONPATH="$PYTHONPATH:/opt/app/defectdojo-client"
export PYTHONPATH
gunicorn sbom_dt_dd_api:app -k uvicorn.workers.UvicornWorker -w 4 -b 0.0.0.0:8000

View File

@@ -1,11 +0,0 @@
#!/bin/sh
# entrypoint.sh
source ./.venv/bin/activate
PYTHONPATH="$PYTHONPATH:./dependencytrack-client"
PYTHONPATH="$PYTHONPATH:./defectdojo-client"
export PYTHONPATH
exec python sbom-dt-dd.py "$@"

47
src/prepare-local-env.sh Executable file

@@ -0,0 +1,47 @@
#!/bin/bash
set -e
. ./ENV
LOCALLBIS=./locallibs
OPENAPI_GENERATOR=openapitools/openapi-generator-cli:v7.12.0
mkdir $LOCALLBIS && cd $LOCALLBIS
# --- DependencyTrack Client Library -----------------------------------------------------
curl ${DTRACK_API_URL}/api/openapi.json >dependencytrack-openapi.json
docker run -v $PWD:/work -u $UID $OPENAPI_GENERATOR \
author template \
-g python \
-o /work/dependencytrack-openapi-custom-template
sed -i -e 's/import re/import regex as re/' dependencytrack-openapi-custom-template/model_anyof.mustache
sed -i -e 's/import re/import regex as re/' dependencytrack-openapi-custom-template/model_generic.mustache
docker run -v $PWD:/work -u $UID $OPENAPI_GENERATOR \
generate \
-i /work/dependencytrack-openapi.json \
-g python \
-o /work/dependencytrack-client \
--package-name dependencytrack_api \
-t /work/dependencytrack-openapi-custom-template
# --- Defectdojo Client Library ----------------------------------------------------------
curl ${DEFECTDOJO_URL}/api/v2/oa3/schema/?format=json >defectdojo-openapi.json
docker run -v $PWD:/work -u $UID $OPENAPI_GENERATOR \
generate \
-i /work/defectdojo-openapi.json \
-g python \
-o /work/defectdojo-client \
--package-name defectdojo_api
cd ..
python3 -m venv .venv
. .venv/bin/activate
pip install -r requirements.txt
pip install -r $LOCALLBIS/dependencytrack-client/requirements.txt
pip install -r $LOCALLBIS/defectdojo-client/requirements.txt

View File

@@ -1,3 +1,8 @@
regex==2024.11.6
loguru==0.7.3
PyYAML==6.0.2
cyclonedx-python-lib==10.4.1
fastapi==0.116.1
gunicorn==23.0.0
uvicorn==0.35.0
python-multipart==0.0.20

View File

@@ -1,208 +0,0 @@
import os
from loguru import logger
import argparse
import subprocess
import json
import defectdojo_api
from defectdojo_api.rest import ApiException as DefectDojoApiException
import datetime
from dateutil.relativedelta import relativedelta
import dependencytrack_api
from dependencytrack_api.rest import ApiException as DependencyTrackApiException
class MyLocalException(Exception): pass
def executeApiCall(apiClient, ApiClass, EndpointMethod, RequestClass, requestParams, additionalParams=[]):
try:
logger.info(f"Calling {ApiClass}.{EndpointMethod} with {RequestClass} ({additionalParams}, {requestParams})")
instance = ApiClass(apiClient)
if RequestClass:
request = RequestClass(**requestParams)
response = EndpointMethod(instance, *additionalParams, request)
else:
response = EndpointMethod(instance, *additionalParams)
logger.info(f"Response is {response}")
return response
except Exception as e:
logger.error(f"Caught error {e} with {str(e)}")
raise MyLocalException(e)
def generateSBOM(target='.', name='dummyName', version='0.0.0'):
try:
result = subprocess.run(
["syft", "scan", target, "-o", "cyclonedx-json", "--source-name", name, "--source-version", version],
check=True,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
text=True
)
sbom = json.loads(result.stdout)
return sbom
except subprocess.CalledProcessError as e:
logger.error(f"SBOM scanner failed: {e.stderr}")
raise MyLocalException(e)
# ---- main starts here with preparation of config -----------------------------------------------------------------------
try:
DTRACK_API_URL = os.environ["DTRACK_API_URL"]
DTRACK_TOKEN = os.environ["DTRACK_TOKEN"]
DEFECTDOJO_URL = os.environ["DEFECTDOJO_URL"]
DEFECTDOJO_TOKEN = os.environ["DEFECTDOJO_TOKEN"]
except KeyError as e:
raise Exception(f"Env variable {e} is shall be set")
parser = argparse.ArgumentParser(description='sbom-dt-dd glue logic')
parser.add_argument('--name', '-n',
help='Project Name',
required=True)
parser.add_argument('--version', '-v',
help='Project Version',
required=True)
parser.add_argument('--description', '-d',
help='Project Description',
required=True)
parser.add_argument('--type', '-t',
help='Product Type from DefectDojo',
type=int,
required=True)
parser.add_argument('--classifier', '-c',
help='Project Classifier from DependencyTrack',
choices=['APPLICATION', 'FRAMEWORK', 'LIBRARY', 'CONTAINER', 'OPERATING_SYSTEM', 'DEVICE',
'FIRMWARE', 'FILE', 'PLATFORM', 'DEVICE_DRIVER', 'MACHINE_LEARNING_MODEL', 'DATA'],
required=True)
parser.add_argument('--uploadsbom', '-U',
help='Upload a already existing SBOM instead of generating it. Give the SBOM file at -F instead of a target',
required=False,
action='store_true',
default=False)
parser.add_argument('--sbomfile', '-F',
help='Filename of existing SBOM file to upload, use together with -U, do not use together with -T',
required=False)
parser.add_argument('--target', '-T',
help='Target to scan, either path name for sources or docker image tag',
required=False)
args = parser.parse_args()
projectName = args.name
projectVersion = args.version
projectDescription = args.description
productType = args.type
projectClassifier = args.classifier
uploadSbomFlag = args.uploadsbom
if uploadSbomFlag:
sbomFileName = args.sbomfile
else:
target = args.target
# ---- main starts here --------------------------------------------------------------------------------------------------
if uploadSbomFlag:
# ------- read uploaded SBOM -------------
logger.info(f"Reading SBOM from file {sbomFileName}")
with open(sbomFileName, 'r') as sbomFile:
sbom = sbomFile.read()
logger.info("Done.")
else:
# ------- generate SBOM ------------
logger.info(f"Generating SBOM for {target}")
sbomJson = generateSBOM(target, projectName, projectVersion)
sbom = json.dumps(sbomJson)
logger.info("Done.")
# ------- create product and engagement in DefectDojo -------
defectdojo_configuration = defectdojo_api.Configuration(
host = DEFECTDOJO_URL
)
defectdojo_configuration.api_key['tokenAuth'] = DEFECTDOJO_TOKEN
defectdojo_configuration.api_key_prefix['tokenAuth'] = 'Token'
with defectdojo_api.ApiClient(defectdojo_configuration) as defectdojo_api_client:
print("Create product in DefectDojo")
productName = f"{projectName}:{projectVersion}"
product_response = \
executeApiCall(
defectdojo_api_client,
defectdojo_api.ProductsApi,
defectdojo_api.ProductsApi.products_create,
defectdojo_api.ProductRequest,
{ 'name': productName, 'description': projectDescription, 'prod_type': productType },
[]
)
product_id = product_response.id
print(f"{product_id=}")
print("Create engagement in DefectDojo")
start_time = datetime.date.today()
end_time = start_time + relativedelta(years=10)
engagementName = f"{productName} DTrack Link"
engagement_response = \
executeApiCall(
defectdojo_api_client,
defectdojo_api.EngagementsApi,
defectdojo_api.EngagementsApi.engagements_create,
defectdojo_api.EngagementRequest,
{ 'name': engagementName, 'target_start': start_time, 'target_end': end_time, 'status': 'In Progress', 'product': product_id },
[]
)
engagement_id = engagement_response.id
print(f"{engagement_id=}")
# ------- create project in DependencyTrack, connect project to engagement in DefectDojo, upload SBOM --------
dependencytrack_configuration = dependencytrack_api.Configuration(
host = f"{DTRACK_API_URL}/api"
)
dependencytrack_configuration.debug = False
dependencytrack_configuration.api_key['ApiKeyAuth'] = DTRACK_TOKEN
with dependencytrack_api.ApiClient(dependencytrack_configuration) as dependencytrack_api_client:
project_response = \
executeApiCall(
dependencytrack_api_client,
dependencytrack_api.ProjectApi,
dependencytrack_api.ProjectApi.create_project,
dependencytrack_api.Project,
{ 'name': projectName, 'version': projectVersion, 'classifier': projectClassifier, 'uuid': "", 'last_bom_import': 0 },
[]
)
project_uuid = project_response.uuid
print(f"{project_uuid=}")
properties = [
{ 'group_name': "integrations", 'property_name': "defectdojo.engagementId",
'property_value': str(engagement_id), 'property_type': "STRING" },
{ 'group_name': "integrations", 'property_name': "defectdojo.doNotReactivate",
'property_value': "true", 'property_type': "BOOLEAN" },
{ 'group_name': "integrations", 'property_name': "defectdojo.reimport",
'property_value': "true", 'property_type': "BOOLEAN" }
]
for property in properties:
executeApiCall(
dependencytrack_api_client,
dependencytrack_api.ProjectPropertyApi,
dependencytrack_api.ProjectPropertyApi.create_property1,
dependencytrack_api.ProjectProperty,
property,
[ project_uuid ]
)
bom_response = \
executeApiCall(
dependencytrack_api_client,
dependencytrack_api.BomApi,
dependencytrack_api.BomApi.upload_bom,
None,
None,
[ None, False, projectName, projectVersion, None, None, None, None, True, sbom ]
)

175
src/sbom_dt_dd.py Normal file

@@ -0,0 +1,175 @@
import os
import sys
from loguru import logger
import json
import subprocess
import datetime
from dateutil.relativedelta import relativedelta
sys.path.insert(0, os.path.join(os.path.dirname(__file__), 'defectdojo-client'))
sys.path.insert(0, os.path.join(os.path.dirname(__file__), 'dependencytrack-client'))
import defectdojo_api
from defectdojo_api.rest import ApiException as DefectDojoApiException
import dependencytrack_api
from dependencytrack_api.rest import ApiException as DependencyTrackApiException
class MyLocalException(Exception): pass   # local (non-API) failures, e.g. from the SBOM scanner
class ApiException(Exception):
def __init__(self, cause):
self.cause = cause
self.status = cause.status
self.reason = cause.reason
self.body = cause.body
self.data = cause.data
self.headers = cause.headers
class ApiCallExecutor:
def __init__(self, verbose):
self.verbose = verbose
def innerExecuteApiCall(self, ApiClass, EndpointMethod, RequestClass, requestParams, additionalParams):
logger.info(f"Calling {ApiClass=}.{EndpointMethod=} with {RequestClass=})")
if self.verbose:
logger.debug(f"{additionalParams=}, {requestParams=}")
instance = ApiClass(self)
if RequestClass:
request = RequestClass(**requestParams)
response = EndpointMethod(instance, *additionalParams, request)
else:
response = EndpointMethod(instance, *additionalParams)
logger.info(f"Response is {response}")
return response
class DefectDojoApiClient(defectdojo_api.ApiClient, ApiCallExecutor):
def __init__(self, config, verbose):
defectdojo_api.ApiClient.__init__(self, config)
ApiCallExecutor.__init__(self, verbose)
def executeApiCall(self, ApiClass, EndpointMethod, RequestClass, requestParams, additionalParams):
try:
return self.innerExecuteApiCall(ApiClass, EndpointMethod, RequestClass, requestParams, additionalParams)
except defectdojo_api.exceptions.ApiException as e:
raise ApiException(e)
class DependencyTrackApiClient(dependencytrack_api.ApiClient, ApiCallExecutor):
def __init__(self, config, verbose):
dependencytrack_api.ApiClient.__init__(self, config)
ApiCallExecutor.__init__(self, verbose)
def executeApiCall(self, ApiClass, EndpointMethod, RequestClass, requestParams, additionalParams):
try:
return self.innerExecuteApiCall(ApiClass, EndpointMethod, RequestClass, requestParams, additionalParams)
except dependencytrack_api.exceptions.ApiException as e:
raise ApiException(e)
def generateSBOM(target='.', name='dummyName', version='0.0.0'):
try:
result = subprocess.run(
["syft", "scan", target, "-o", "cyclonedx-json", "--source-name", name, "--source-version", version],
check=True,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
text=True
)
sbom = json.loads(result.stdout)
return sbom
except subprocess.CalledProcessError as e:
logger.error(f"SBOM scanner failed: {e.stderr}")
raise MyLocalException(e)
def loadToDTrackAndDefectDojo(config, projectName, projectVersion, projectClassifier, projectDescription, productType, sbom, reImport):
# ------- create product and engagement in DefectDojo -------
if not reImport:
# in case of a reimport no modification on DefectDojo are required
defectdojo_configuration = defectdojo_api.Configuration(
host = config['DEFECTDOJO_URL']
)
defectdojo_configuration.api_key['tokenAuth'] = config['DEFECTDOJO_TOKEN']
defectdojo_configuration.api_key_prefix['tokenAuth'] = 'Token'
with DefectDojoApiClient(defectdojo_configuration, config['VERBOSE']) as client:
print("Create product in DefectDojo")
productName = f"{projectName}:{projectVersion}"
product_response = \
client.executeApiCall(
defectdojo_api.ProductsApi,
defectdojo_api.ProductsApi.products_create,
defectdojo_api.ProductRequest,
{ 'name': productName, 'description': projectDescription, 'prod_type': productType },
[]
)
product_id = product_response.id
print(f"{product_id=}")
print("Create engagement in DefectDojo")
start_time = datetime.date.today()
end_time = start_time + relativedelta(years=10)
engagementName = f"{productName} DTrack Link"
engagement_response = \
client.executeApiCall(
defectdojo_api.EngagementsApi,
defectdojo_api.EngagementsApi.engagements_create,
defectdojo_api.EngagementRequest,
{ 'name': engagementName, 'target_start': start_time, 'target_end': end_time, 'status': 'In Progress', 'product': product_id },
[]
)
engagement_id = engagement_response.id
print(f"{engagement_id=}")
# ------- create project in DependencyTrack, connect project to engagement in DefectDojo, upload SBOM --------
dependencytrack_configuration = dependencytrack_api.Configuration(
host = f"{config['DTRACK_API_URL']}/api"
)
dependencytrack_configuration.debug = False
dependencytrack_configuration.api_key['ApiKeyAuth'] = config['DTRACK_TOKEN']
with DependencyTrackApiClient(dependencytrack_configuration, config['VERBOSE']) as client:
if not reImport:
# in case of a reimport it is not necessary to create the project
project_response = \
client.executeApiCall(
dependencytrack_api.ProjectApi,
dependencytrack_api.ProjectApi.create_project,
dependencytrack_api.Project,
{ 'name': projectName, 'version': projectVersion, 'classifier': projectClassifier, 'uuid': "", 'last_bom_import': 0 },
[]
)
project_uuid = project_response.uuid
print(f"{project_uuid=}")
properties = [
{ 'group_name': "integrations", 'property_name': "defectdojo.engagementId",
'property_value': str(engagement_id), 'property_type': "STRING" },
{ 'group_name': "integrations", 'property_name': "defectdojo.doNotReactivate",
'property_value': "true", 'property_type': "BOOLEAN" },
{ 'group_name': "integrations", 'property_name': "defectdojo.reimport",
'property_value': "true", 'property_type': "BOOLEAN" }
]
for property in properties:
client.executeApiCall(
dependencytrack_api.ProjectPropertyApi,
dependencytrack_api.ProjectPropertyApi.create_property1,
dependencytrack_api.ProjectProperty,
property,
[ project_uuid ]
)
bom_response = \
client.executeApiCall(
dependencytrack_api.BomApi,
dependencytrack_api.BomApi.upload_bom,
None,
None,
[ None, False, projectName, projectVersion, None, None, None, None, True, sbom ]
)

240
src/sbom_dt_dd_api.py Normal file

@@ -0,0 +1,240 @@
import os
import json
import yaml
from loguru import logger
from fastapi import FastAPI, UploadFile, File, Form, HTTPException, Request
from fastapi.responses import JSONResponse, HTMLResponse
from fastapi.templating import Jinja2Templates
from converter import minimalSbomFormatConverter
from sbom_dt_dd import generateSBOM, loadToDTrackAndDefectDojo, ApiException
app = FastAPI(
title="SBOM DTrack DefectDojo Synchronization API",
version="0.0.1",
description="",
root_path="/sbom-integrator/v1"
)
config = {}
try:
config['DTRACK_API_URL'] = os.environ["DTRACK_API_URL"]
config['DTRACK_TOKEN'] = os.environ["DTRACK_TOKEN"]
config['DEFECTDOJO_URL'] = os.environ["DEFECTDOJO_URL"]
config['DEFECTDOJO_TOKEN'] = os.environ["DEFECTDOJO_TOKEN"]
config['VERBOSE'] = True
except KeyError as e:
raise Exception(f"Env variable {e} is shall be set")
app.state.config = config
@app.get("/upload-form", response_class=HTMLResponse)
async def upload_form(request: Request):
"""
Route serving an HTML page with the upload form
"""
# BE AWARE OF THE HARDCODED ROOT_PATH BELOW
html_content = """
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Upload Minimal SBOM</title>
</head>
<body>
<h1>Upload Minimal SBOM</h1>
<form id="sbomForm">
<label for="file">Select SBOM file:</label><br>
<input type="file" id="file" name="file" required><br><br>
<label for="reimport">Reimport:</label>
<select name="reimport" id="reimport">
<option value="true">true</option>
<option value="false" selected>false</option>
</select><br><br>
<button type="submit">Upload SBOM</button>
</form>
<div id="result"></div>
<script>
document.getElementById("sbomForm").addEventListener("submit", async function(event) {
event.preventDefault();
let form = document.getElementById("sbomForm");
let formData = new FormData(form);
try {
let response = await fetch("/sbom-integrator/v1/upload-minimal-sbom/", {
method: "POST",
body: formData
});
let resultDiv = document.getElementById("result");
if (response.ok) {
let data = await response.json();
resultDiv.innerHTML = "<p style='color:green;'>Upload successful</p>";
} else {
let errorData = await response.json();
let detail = errorData.detail;
// Dynamically generate the HTML
let html = "<p style='color:red;'>Upload failed:</p><ul>";
for (const [key, value] of Object.entries(detail)) {
html += "<li style='color:red'><strong>" + key + ":</strong> " + formatValue(value) + "</li>";
}
html += "</ul>";
resultDiv.innerHTML = html;
}
} catch (error) {
console.log(error);
document.getElementById("result").innerHTML = "<p style='color:red;'>Error: " + error + "</p>";
}
});
// Helper function for nested objects
function formatValue(value) {
if (typeof value === 'object' && value !== null) {
return "<pre>" + escapeHtml(JSON.stringify(value, null, 2)) + "</pre>";
} else {
return escapeHtml(value);
}
}
function escapeHtml(unsafe) {
if (unsafe === null || unsafe === undefined) {
return '';
}
return String(unsafe)
.replace(/&/g, "&amp;")
.replace(/</g, "&lt;")
.replace(/>/g, "&gt;")
.replace(/"/g, "&quot;")
.replace(/'/g, "&#39;");
}
</script>
</body>
</html>
"""
return HTMLResponse(content=html_content)
@app.post("/upload-minimal-sbom/")
async def uploadMinimalSBOM(
file: UploadFile = File(...),
reimport: bool = Form(...)
):
"""
Endpoint to upload a minimal SBOM definition
"""
try:
sbom = await file.read()
logger.info("Start converting from minimal format into cyclonedx")
(sbom, projectName, projectVersion, projectClassifier, projectDescription) = minimalSbomFormatConverter(sbom)
logger.info("Converted")
loadToDTrackAndDefectDojo(app.state.config, projectName, projectVersion, projectClassifier, projectDescription, 1, sbom, reimport)
logger.info("Done.")
except yaml.scanner.ScannerError as e:
logger.warning(f"uploadMinimalSBOM, yaml ScannerError: {e.context=}, {e.context_mark=}, {e.problem=}, {e.problem_mark=}, {e.note=}")
raise HTTPException(
status_code=400,
detail={
"error": "yaml ScannerError",
"context": e.context,
"context_mark": str(e.context_mark),
"problem": e.problem,
"problem_mark": str(e.problem_mark),
"note": e.note
}
)
except ApiException as e:
logger.warning(f"uploadMinimalSBOM, ApiException: {type(e.cause)=}, {e.status=}, {e.reason=}, {e.body=}")
raise HTTPException(
status_code=e.status,
detail={
"type": str(type(e.cause)),
"reason": e.reason,
"body": e.body,
"data": e.data
}
)
except Exception as e:
logger.warning(f"uploadMinimalSBOM, Exception: {type(e)=}, {str(e)=}")
raise HTTPException(
status_code=500,
detail={
"error": "Exception occurred",
"type": str(type(e)),
"message": str(e)
}
)
return JSONResponse(content={
"message": "Upload successful!"
})
@app.post("/upload-sbom/")
async def uploadSBOM(
file: UploadFile = File(...),
projectName: str = Form(...),
projectVersion: str = Form(...),
projectClassifier: str = Form(...),
projectDescription: str = Form(...),
reimport: bool = Form(...)
):
"""
Endpoint to upload a CycloneDX SBOM
"""
sbom = await file.read()
try:
sbomJson = json.loads(sbom)
sbom = json.dumps(sbomJson)
loadToDTrackAndDefectDojo(app.state.config, projectName, projectVersion, projectClassifier, projectDescription, 1, str(sbom), reimport)
logger.info("Done.")
except json.decoder.JSONDecodeError as e:
logger.warning(f"uploadSBOM, JSONDecodeError: {e.msg=}")
raise HTTPException(
status_code=400,
detail={
"error": "JSON decoding error",
"msg": e.msg,
"doc": e.doc,
"pos": e.pos,
"lineno": e.lineno,
"colno": e.colno
}
)
except ApiException as e:
logger.warning(f"uploadSBOM, ApiException: {type(e.cause)=}, {e.status=}, {e.reason=}, {e.body=}")
raise HTTPException(
status_code=e.status,
detail={
"type": str(type(e.cause)),
"reason": e.reason,
"body": e.body,
"data": e.data
}
)
except Exception as e:
logger.warning(f"uploadSBOM, Exception: {type(e)=}, {str(e)=}")
raise HTTPException(
status_code=500,
detail={
"error": "Exception occurred",
"type": str(type(e)),
"message": str(e)
}
)
return JSONResponse(content={
"message": "Upload successful!"
})
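For reference, the minimal-SBOM endpoint can be exercised with a multipart POST, e.g. via curl. This sketch assumes the service is reached directly on the container port 8000 (behind a reverse proxy the hardcoded `/sbom-integrator/v1` prefix from the upload form applies instead) and that `minimal-sbom.yaml` is an illustrative file in the format shown after `src/converter.py` above:
curl -F "file=@minimal-sbom.yaml" -F "reimport=false" http://localhost:8000/upload-minimal-sbom/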

133
src/sbom_dt_dd_cli.py Normal file

@@ -0,0 +1,133 @@
import os
import sys
from loguru import logger
import argparse
import subprocess
import json
from converter import minimalSbomFormatConverter
from sbom_dt_dd import generateSBOM, loadToDTrackAndDefectDojo
class MyLocalException(Exception): pass   # local error type used for argument validation below
# ---- main starts here with preparation of config -----------------------------------------------------------------------
parser = argparse.ArgumentParser(description='sbom-dt-dd glue logic')
parser.add_argument('--name', '-n',
help='Project Name',
required=False,
default='')
parser.add_argument('--version', '-v',
help='Project Version',
required=False,
default='')
parser.add_argument('--description', '-d',
help='Project Description',
required=False,
default='')
parser.add_argument('--type', '-t',
help='Product Type from DefectDojo',
type=int,
required=True)
parser.add_argument('--classifier', '-c',
help='Project Classifier from DependencyTrack',
choices=['APPLICATION', 'FRAMEWORK', 'LIBRARY', 'CONTAINER', 'OPERATING_SYSTEM', 'DEVICE',
'FIRMWARE', 'FILE', 'PLATFORM', 'DEVICE_DRIVER', 'MACHINE_LEARNING_MODEL', 'DATA'],
required=False,
default='')
parser.add_argument('--uploadsbom', '-U',
help='Upload an already existing SBOM instead of generating it. Give the SBOM file via -F instead of a target',
required=False,
action='store_true',
default=False)
parser.add_argument('--sbomfile', '-F',
help='Filename of existing SBOM file to upload, use together with -U, do not use together with -T',
required=False)
parser.add_argument('--minimalsbomformat', '-K',
help='SBOM file comes in dedicated minimal format and will be converted into cyclonedx before uploading',
action='store_true',
default=False)
parser.add_argument('--overwritemetadata', '-O',
help='Overwrite name, version, description and classifier with data from minimal SBOM',
action='store_true',
default=False)
parser.add_argument('--target', '-T',
help='Target to scan, either path name for sources or docker image tag',
required=False)
parser.add_argument('--reimport', '-R',
help='Import the SBOM for an existing project/product once again',
required=False,
action='store_true',
default=False)
parser.add_argument('--verbose', '-V',
help='A lot of debug output',
required=False,
action='store_true',
default=False)
args = parser.parse_args()
projectName = args.name
projectVersion = args.version
projectDescription = args.description
productType = args.type
projectClassifier = args.classifier
reImport = args.reimport
uploadSbomFlag = args.uploadsbom
if uploadSbomFlag:
sbomFileName = args.sbomfile
minimalSbomFormat = args.minimalsbomformat
else:
target = args.target
if minimalSbomFormat:
overwriteMetadata = args.overwritemetadata
if not overwriteMetadata and not (projectName and projectVersion and projectClassifier and projectDescription):
raise MyLocalException("If overwriteMetadata is not selected, projectName, projectVersion, projectClassifier and projectDescription must be set.")
CONFIG = {}
try:
CONFIG['DTRACK_API_URL'] = os.environ["DTRACK_API_URL"]
CONFIG['DTRACK_TOKEN'] = os.environ["DTRACK_TOKEN"]
CONFIG['DEFECTDOJO_URL'] = os.environ["DEFECTDOJO_URL"]
CONFIG['DEFECTDOJO_TOKEN'] = os.environ["DEFECTDOJO_TOKEN"]
except KeyError as e:
raise Exception(f"Env variable {e} is shall be set")
CONFIG['VERBOSE'] = args.verbose
# ---- main starts here --------------------------------------------------------------------------------------------------
if uploadSbomFlag:
# ------- read uploaded SBOM -------------
logger.info(f"Reading SBOM from file {sbomFileName}")
with open(sbomFileName, 'r') as sbomFile:
sbom = sbomFile.read()
logger.info("SBOM file read.")
if minimalSbomFormat:
logger.info("Start converting from minimal format into cyclonedx")
(sbom, nameFromMinimalSbom, versionFromMinimalSbom, classifierFromMinimalSbom, descriptionFromMinimalSbom) = minimalSbomFormatConverter(sbom)
logger.info("Converted")
if overwriteMetadata:
projectName = nameFromMinimalSbom
projectVersion = versionFromMinimalSbom
projectClassifier = classifierFromMinimalSbom
projectDescription = descriptionFromMinimalSbom
logger.info("Done.")
else:
# ------- generate SBOM ------------
logger.info(f"Generating SBOM for {target}")
sbomJson = generateSBOM(target, projectName, projectVersion)
sbom = json.dumps(sbomJson)
logger.info("Done.")
loadToDTrackAndDefectDojo(CONFIG, projectName, projectVersion, projectClassifier, projectDescription, productType, sbom, reImport)
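For reference, a typical invocation of this CLI for the minimal-SBOM path, taking all project metadata from the SBOM file itself, would look roughly like this (the file name is illustrative, and the DTRACK/DEFECTDOJO URL and token environment variables must be exported beforehand):
python sbom_dt_dd_cli.py --type 1 --uploadsbom --sbomfile minimal-sbom.yaml --minimalsbomformat --overwritemetadata --verbose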