Mirror of https://github.com/pcvolkmer/etl-processor.git
Synced 2025-07-01 06:02:54 +00:00

Compare commits - 70 commits (SHA1, newest first):
8a6f9a6e02, 91f17f6af5, 8d4497bf2c, 4ab20a5f16, 167587a473, e5d80f89b0, 5d0e815037, a5a19e0cea, 1493a63e02, fe927e65aa,
add09c3f9c, 5eb969c36a, 3cc4f8c1a4, 707bc55ab6, d7949a7dce, f5999ff325, a62da60809, ced6609d9a, 8dee349c37, 3e45de56cf,
7f54efe034, effcdd811f, acf49a892e, 284806d130, cf2d338e13, d5552b3ca4, 892c0dea8f, 0305e69e9e, 1a913b2644, 0eee1908df,
ffea9343c8, eb24995ed9, 4196664060, 2824951e5e, 1e1db1c4d9, 7440fe1e23, 3f5c5e28fa, 6397b2a019, bf8f87b261, 2f32834de0,
79709caa39, c52509054d, 8fd587c2a3, edafe30a4b, e24be0d325, 5e93e834ad, 5e5bd579fb, a24f869c84, 635985bfd1, 25143745c4,
532254593f, 01ff53ab23, 9643c80cc5, aa40da4995, da26b5a2c8, bbea48322f, 480f165c7b, 3d2c73ff8f, 9921e1e684, 5bd26b894c,
8dc82225a4, 2eb5cc61b9, 78b2287163, 66dc96680d, 64b8636145, 2e7ef25a49, 7186a45f6c, 72295202ec, bc48a7217e, a075f73162
.gitignore (vendored, 1 change)

@ -36,3 +36,4 @@ out/

### VS Code ###
.vscode/
/dev/gpas*
/deploy/.env
README.md (121 changes)

@ -2,9 +2,31 @@

This application sends a bwHC MTB file to the bwHC backend and pseudonymizes the patient ID.

### Role within a DNPM ETL pipeline

This application accepts HTTP/REST requests from the Onkostar plugin **[onkostar-plugin-dnpmexport](https://github.com/CCC-MF/onkostar-plugin-dnpmexport)**.

If the content of a request is a bwHC MTB file, it is pseudonymized and checked for duplicates.
Duplicates are discarded; changes are forwarded.
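As a hedged illustration only: assuming the application is reachable on `localhost:8080` (the port published in `deploy/docker-compose.yaml` below) and uses the `/mtbfile` endpoint exercised by the integration tests in this comparison, such a request could look roughly like this; `mtbfile.json` is a placeholder file name.

```bash
# Illustrative sketch - host, port and file name are assumptions
curl -X POST http://localhost:8080/mtbfile \
  -H "Content-Type: application/json" \
  -d @mtbfile.json
```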
Deletion requests are always forwarded to the bwHC backend as a deletion request.

![Model](docs/etl.png)

#### HTTP/REST configuration

Unless treated as a duplicate, requests are sent directly to the bwHC backend after pseudonymization.

#### Configuration for Apache Kafka

Unless treated as a duplicate, requests are handed over to Apache Kafka after pseudonymization.
A response is likewise transmitted via Apache Kafka and processed once it has been received.

See also: https://github.com/CCC-MF/kafka-to-bwhc

## Pseudonymization of the patient ID

If a URI to a gPAS instance is configured, it is used.
If a URI to a gPAS instance (version >= 2023.1.0) is configured, it is used.
If it is not set, the patient ID is anonymized internally.

* `APP_PSEUDONYMIZE_PREFIX`: site-specific prefix - `UNKNOWN` if not set

@ -20,12 +42,27 @@ used as the patient pseudonym.

If the use of gPAS has been configured, further settings have to be provided (see the sketch after the list):

* `APP_PSEUDONYMIZE_GPAS_URI`: URI of the gPAS instance including the endpoint (e.g. `http://localhost:8080/ttp-fhir/fhir/gpas/$pseudonymizeAllowCreate`)
* `APP_PSEUDONYMIZE_GPAS_URI`: URI of the gPAS instance including the endpoint (
  e.g. `http://localhost:8080/ttp-fhir/fhir/gpas/$$pseudonymizeAllowCreate`)
* `APP_PSEUDONYMIZE_GPAS_TARGET`: gPAS domain name
* `APP_PSEUDONYMIZE_GPAS_USERNAME`: gPAS basic-auth username
* `APP_PSEUDONYMIZE_GPAS_PASSWORD`: gPAS basic-auth password
* `APP_PSEUDONYMIZE_GPAS_SSLCALOCATION`: root certificate for gPAS, in case it has to be added explicitly
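A minimal sketch of a gPAS configuration; only the variable names are taken from this README and from `deploy/docker-compose.yaml`, the concrete values (host, domain, credentials) are placeholders.

```
# Sketch with placeholder values - adjust to your gPAS installation
APP_PSEUDONYMIZER=GPAS
APP_PSEUDONYMIZE_PREFIX=SITE_A
APP_PSEUDONYMIZE_GPAS_URI=http://localhost:8080/ttp-fhir/fhir/gpas/$pseudonymizeAllowCreate
APP_PSEUDONYMIZE_GPAS_TARGET=etl-processor
APP_PSEUDONYMIZE_GPAS_USERNAME=gpas-user
APP_PSEUDONYMIZE_GPAS_PASSWORD=gpas-password
```

Note that the changed README line above escapes the `$` in the endpoint as `$$`, presumably to avoid variable interpolation when the value is used in a docker-compose file.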
## Transformation of values

In Onkostar it can happen that a value of a catalogue of attributes has been customized at a site and therefore no longer matches the value
accepted by the bwHC backend.

This application therefore offers the option of applying a transformation. To do so, the "path" within the JSON MTB file has to be specified,
together with which value is to be replaced and by what.

The first transformation rule (index 0 - further rules follow with 1, 2, ...) is configured via the following variables; a complete example follows after the list:

* `APP_TRANSFORMATIONS_0_PATH`: path to the value in the JSON MTB file. Example: `diagnoses[*].icd10.version` for **all** diagnoses
* `APP_TRANSFORMATIONS_0_FROM`: the value that is to be replaced. Other values remain unchanged.
* `APP_TRANSFORMATIONS_0_TO`: the new value.
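As referenced above, a complete sketch of such a rule, using the ICD-10 version replacement that also appears in the integration test further down in this comparison:

```
# Rewrite ICD-10 catalogue version 2013 to 2014 in all diagnoses
APP_TRANSFORMATIONS_0_PATH=diagnoses[*].icd10.version
APP_TRANSFORMATIONS_0_FROM=2013
APP_TRANSFORMATIONS_0_TO=2014
```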
## Possible endpoints

Endpoints can be configured both for REST requests and for the use of Kafka topics; a combined sketch of the relevant settings follows below.
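A hedged sketch of the endpoint-related settings, assembled from the variable names in `deploy/docker-compose.yaml` and the values in `application-dev.yml` further down in this comparison; treat the concrete values as assumptions and configure either REST or Kafka.

```
# Direct HTTP/REST target (leave unset when using Kafka)
APP_REST_URI=http://localhost:9000/bwhc/etl/api
# Kafka target
APP_KAFKA_TOPIC=test
APP_KAFKA_RESPONSE_TOPIC=test_response
APP_KAFKA_GROUP_ID=test
APP_KAFKA_SERVERS=localhost:9094
```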
@ -55,6 +92,84 @@ Further settings can be configured via the Spring Kafka parameters.

If no connection to the bwHC backend can be established, a response with status code `900` is expected, a code that
does not exist for HTTP.

#### Retention time

In general, Apache Kafka retains all records according to its configuration.
Without special configuration, a record is stored in Apache Kafka for 7 days.
Old information therefore remains available within this period even if consent was later withdrawn.

This can be prevented by configuring the topic accordingly.

Example - to be executed inside the Kafka container: delete old records after one day

```
kafka-configs.sh --bootstrap-server localhost:9092 --alter --topic test --add-config retention.ms=86400000
```
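Depending on the Kafka version, the applied settings can be checked afterwards; this command is an assumption about the available tooling flags, not part of the original documentation:

```
kafka-configs.sh --bootstrap-server localhost:9092 --describe --topic test
```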
#### Key-based retention

If, on the other hand, you only ever want to keep the latest message for a patient and a disease in Apache Kafka,
the following configuration of the Kafka topics is helpful.

* `retention.ms`: as short a period as possible during which old records are still kept, e.g. 10 seconds = 10000
* `cleanup.policy`: delete old records and keep the latest record per key: [delete,compact]

Examples for a topic `test`; please adapt these to the topics actually used.

```
kafka-configs.sh --bootstrap-server localhost:9092 --alter --topic test --add-config retention.ms=10000
kafka-configs.sh --bootstrap-server localhost:9092 --alter --topic test --add-config cleanup.policy=[delete,compact]
```

Since the (pseudonymized) patient ID and the (anonymized) disease ID are used as the key of a record,
with the above configuration of the Kafka topics only the latest entry for each key remains available
after 10 seconds.

Since the key is built identically for the records sent towards the bwHC backend and for the response, this also means that,
in the case of a consent withdrawal, both the contained data and their disclosure through verification data in the
response are effectively prevented, because they are deleted after 10 seconds.
Only the most recent piece of information then remains, namely that a consent withdrawal occurred
for a patient/disease.
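To inspect which key a record was written with, the standard console consumer can print keys. According to the `KafkaMtbFileSender` change in this comparison, the key has the form `{"pid": "<pseudonym>", "eid": "<episode id>"}`; broker address and topic name below are the ones used in the examples above and are assumptions for your setup.

```bash
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test \
  --from-beginning --property print.key=true
```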
## Docker images

This application is also available as a Docker image: https://github.com/CCC-MF/etl-processor/pkgs/container/etl-processor
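For example, the image and tag referenced in `deploy/docker-compose.yaml` can be pulled directly:

```bash
docker pull ghcr.io/ccc-mf/etl-processor:latest
```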
### Building images locally

```bash
./gradlew bootBuildImage
```

## Deployment
*Running as a Docker container:*

```bash
cd ./deploy
cp env-sample.env .env
```
If desired, make changes in the `.env` file.

```bash
docker compose up -d
```

## Development setup

The accompanying file `dev-compose.yml` can be used to start a local development and test environment.
It can be adapted to use either the **MariaDB** or the **PostgreSQL** database.

To use Apache Kafka, an entry has to be added to the hosts file so that the hostname `kafka` points to the local
IP address (an example follows below). Without this setting, Apache Kafka cannot be used from outside the Docker environment.
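A hedged example of such a hosts entry; the loopback address is an assumption - use whatever address your machine is reachable on locally:

```
# /etc/hosts
127.0.0.1   kafka
```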
When the application is started with the `dev` profile, the environment defined in `dev-compose.yml` is started
together with the application:

```
SPRING_PROFILES_ACTIVE=dev ./gradlew bootRun
```

The file `application-dev.yml` contains the corresponding configuration for the `dev` profile.

When the integration tests are executed, a test database is started in a Docker container.
See also the class `AbstractTestcontainerTest` under `src/integrationTest`.
build.gradle.kts

@ -4,14 +4,21 @@ import org.springframework.boot.gradle.tasks.bundling.BootBuildImage

plugins {
    war
    id("org.springframework.boot") version "3.1.2"
    id("io.spring.dependency-management") version "1.1.0"
    kotlin("jvm") version "1.9.0"
    kotlin("plugin.spring") version "1.9.0"
    id("org.springframework.boot") version "3.2.1"
    id("io.spring.dependency-management") version "1.1.4"
    kotlin("jvm") version "1.9.22"
    kotlin("plugin.spring") version "1.9.22"
}

group = "de.ukw.ccc"
version = "0.1.1"
version = "0.4.0"

var versions = mapOf(
    "bwhc-dto-java" to "0.2.0",
    "hapi-fhir" to "6.10.2",
    "httpclient5" to "5.2.1",
    "mockito-kotlin" to "5.2.1"
)

java {
    sourceCompatibility = JavaVersion.VERSION_17

@ -52,10 +59,11 @@ dependencies {
    implementation("org.flywaydb:flyway-mysql")
    implementation("commons-codec:commons-codec")
    implementation("io.projectreactor.kotlin:reactor-kotlin-extensions")
    implementation("de.ukw.ccc:bwhc-dto-java:0.2.0")
    implementation("ca.uhn.hapi.fhir:hapi-fhir-base:6.6.2")
    implementation("ca.uhn.hapi.fhir:hapi-fhir-structures-r4:6.6.2")
    implementation("org.apache.httpcomponents.client5:httpclient5:5.2.1")
    implementation("de.ukw.ccc:bwhc-dto-java:${versions["bwhc-dto-java"]}")
    implementation("ca.uhn.hapi.fhir:hapi-fhir-base:${versions["hapi-fhir"]}")
    implementation("ca.uhn.hapi.fhir:hapi-fhir-structures-r4:${versions["hapi-fhir"]}")
    implementation("org.apache.httpcomponents.client5:httpclient5:${versions["httpclient5"]}")
    implementation("com.jayway.jsonpath:json-path")
    runtimeOnly("org.mariadb.jdbc:mariadb-java-client")
    runtimeOnly("org.postgresql:postgresql")
    developmentOnly("org.springframework.boot:spring-boot-devtools")

@ -64,7 +72,7 @@ dependencies {
    providedRuntime("org.springframework.boot:spring-boot-starter-tomcat")
    testImplementation("org.springframework.boot:spring-boot-starter-test")
    testImplementation("io.projectreactor:reactor-test")
    testImplementation("org.mockito.kotlin:mockito-kotlin:5.0.0")
    testImplementation("org.mockito.kotlin:mockito-kotlin:${versions["mockito-kotlin"]}")
    integrationTestImplementation("org.testcontainers:junit-jupiter")
    integrationTestImplementation("org.testcontainers:postgresql")
}
deploy/docker-compose.yaml (new file, 55 lines)

@ -0,0 +1,55 @@

services:
  dnpm-etl-processor:
    image: ghcr.io/ccc-mf/etl-processor:latest
    environment:
      LOGGING_LEVEL_DEV: ${DNPM_LOG_LEVEL:-INFO}
      SPRING_KAFKA_SECURITY_PROTOCOL: ${DNPM_KAFKA_SECURITY_PROTOCOL:-SSL}
      SPRING_KAFKA_SSL_TRUST-STORE-TYPE: PKCS12
      SPRING_KAFKA_SSL_TRUST-STORE-LOCATION: /opt/dnpm-processor/ssl/truststore.jks
      SPRING_KAFKA_SSL_TRUST-STORE-PASSWORD: ${KAFKA_TRUST_STORE_PASSWORD}
      SPRING_KAFKA_SSL_KEY-STORE-TYPE: PKCS12
      SPRING_KAFKA_SSL_KEY-STORE-LOCATION: /opt/dnpm-processor/ssl/keystore.jks
      SPRING_KAFKA_SSL_KEY-STORE-PASSWORD: ${DNPM_PROCESSOR_KEY_STORE_PASSWORD}
      SPRING_KAFKA_PRODUCER_COMPRESSION-TYPE: gzip
      APP_KAFKA_TOPIC: ${DNPM_KAFKA_TOPIC}
      APP_KAFKA_SERVERS: ${KAFKA_BROKERS}
      APP_KAFKA_GROUP_ID: ${DNPM_KAFKA_GROUP_ID}
      APP_KAFKA_RESPONSE_TOPIC: ${DNPM_KAFKA_RESPONSE_TOPIC}
      APP_REST_URI: ${DNPM_BWHC_REST_URI}
      SPRING_DATASOURCE_URL: ${DNPM_DATASOURCE_URL}
      SPRING_DATASOURCE_PASSWORD: ${DNPM_MARIADB_USER_PW}
      SPRING_DATASOURCE_USERNAME: ${DNPM_MARIADB_DB}
      APP_PSEUDONYMIZE_GPAS_SSLCALOCATION: /workspace/opt/dnpm-processor/ssl/mosaic.crt
      APP_PSEUDONYMIZE_GPAS_PASSWORD: ${DNPM_PSEUDONYMIZE_GPAS_PASSWORD}
      APP_PSEUDONYMIZE_GPAS_USERNAME: ${DNPM_PSEUDONYMIZE_GPAS_USERNAME}
      APP_PSEUDONYMIZE_GPAS_TARGET: ${DNPM_PSEUDONYMIZE_GPAS_TARGET}
      APP_PSEUDONYMIZE_GPAS_URI: ${DNPM_PSEUDONYMIZE_GPAS_URI}
      APP_PSEUDONYMIZE_PREFIX: ${DNPM_APP_PSEUDONYMIZE_PREFIX}
      APP_PSEUDONYMIZER: ${DNPM_PSEUDONYMIZE_GENERATOR}
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - /etc/timezone:/etc/timezone:ro
      #- ${DNPM_TO_SSL_KEYSTORE_LOCATION}:/workspace/opt/dnpm-processor/ssl/keystore.jks:ro
      #- ${KAFKA_TRUST_STORE_LOCATION}:/workspace/opt/dnpm-processor/ssl/truststore.jks:ro
      #- ${DNPM_PSEUDONYMIZE_GPAS_SSLCALOCATION}:/workspace/opt/dnpm-processor/ssl/mosaic.crt

    depends_on:
      - dnpm-monitor-db
    ports:
      - "${DNPM_MONITORING_HTTP_PORT:-8080}:8080"

  # todo add volume
  dnpm-monitor-db:
    image: mariadb:10
    environment:
      MARIADB_DATABASE: ${DNPM_MARIADB_DB}
      MARIADB_USER: ${DNPM_MARIADB_USER}
      MARIADB_PASSWORD: ${DNPM_MARIADB_USER_PW}
      MARIADB_ROOT_PASSWORD: ${DNPM_MARIADB_ROOT_PW}
    expose:
      - "3306"
deploy/env-sample.env (new file, 40 lines)

@ -0,0 +1,40 @@

# monitoring access port
DNPM_MONITORING_HTTP_PORT=8088
DNPM_LOG_LEVEL=INFO

# GPAS or BUILDIN
DNPM_PSEUDONYMIZE_GENERATOR=BUILDIN
DNPM_APP_PSEUDONYMIZE_PREFIX=ANONYM
DNPM_PSEUDONYMIZE_GPAS_URI=
DNPM_PSEUDONYMIZE_GPAS_TARGET=
DNPM_PSEUDONYMIZE_GPAS_USERNAME=
DNPM_PSEUDONYMIZE_GPAS_PASSWORD=

# path to ca root cert if needed
DNPM_PSEUDONYMIZE_GPAS_SSLCALOCATION=

DNPM_MARIADB_DB=dnpm_monitoring
DNPM_MARIADB_USER=$DNPM_MARIADB_DB
DNPM_MARIADB_USER_PW=MySuperSecurePassword111
DNPM_MARIADB_ROOT_PW=MySuperDuperSecurePassword111

# monitoring data db
DNPM_DATASOURCE_URL=jdbc:mariadb://dnpm-monitor-db:3306/$DNPM_MARIADB_DB

## TARGET SYSTEMS CONFIG
# in case of direct access to bwhc enter endpoint url here
DNPM_BWHC_REST_URI=

# produce mtb files to this topic - values 'false' disabling kafka processing
DNPM_KAFKA_TOPIC=false
KAFKA_BROKERS=false
DNPM_KAFKA_SECURITY_PROTOCOL=PLAINTEXT

# here we receive responses from bwhc
DNPM_KAFKA_RESPONSE_TOPIC=dnpm-response
DNPM_KAFKA_GROUP_ID=dnpm

# SSL or PLAINTEXT
DNPM_PROCESSOR_KEY_STORE_PASSWORD=
DNPM_TO_SSL_KEYSTORE_LOCATION=
dev-compose.yml

@ -4,8 +4,33 @@ services:
    hostname: kafka
    ports:
      - "9092:9092"
      - "9094:9094"
    environment:
      ALLOW_PLAINTEXT_LISTENER: "yes"
      KAFKA_CFG_NODE_ID: "0"
      KAFKA_CFG_PROCESS_ROLES: "controller,broker"
      KAFKA_CFG_LISTENERS: PLAINTEXT://:9092,CONTROLLER://:9093,EXTERNAL://:9094
      KAFKA_CFG_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,EXTERNAL://localhost:9094
      KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP: CONTROLLER:PLAINTEXT,EXTERNAL:PLAINTEXT,PLAINTEXT:PLAINTEXT
      KAFKA_CFG_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE: true
      KAFKA_CFG_CONTROLLER_QUORUM_VOTERS: 0@kafka:9093
      KAFKA_CFG_CONTROLLER_LISTENER_NAMES: CONTROLLER

  akhq:
    image: tchiotludo/akhq:0.21.0
    environment:
      AKHQ_CONFIGURATION: |
        akhq:
          connections:
            docker-kafka-server:
              properties:
                bootstrap.servers: "kafka:9092"
              connect:
                - name: "kafka-connect"
                  url: "http://kafka-connect:8083"
    ports:
      - "8084:8080"

  mariadb:
    image: mariadb:10
@ -16,6 +41,7 @@ services:
      MARIADB_USER: dev
      MARIADB_PASSWORD: dev
      MARIADB_ROOT_PASSWORD: dev

#  postgres:
#    image: postgres:alpine
#    ports:
@ -23,4 +49,4 @@ services:
#    environment:
#      POSTGRES_DB: dev
#      POSTGRES_USER: dev
#      POSTGRES_PASSWORD: dev
#      POSTGRES_PASSWORD: dev
docs/etl.png (new binary file, 75 KiB - not shown)
@ -19,22 +19,125 @@

package dev.dnpm.etl.processor

import com.fasterxml.jackson.databind.ObjectMapper
import de.ukw.ccc.bwhc.dto.*
import dev.dnpm.etl.processor.monitoring.RequestRepository
import dev.dnpm.etl.processor.monitoring.RequestStatus
import dev.dnpm.etl.processor.output.MtbFileSender
import org.assertj.core.api.Assertions.assertThat
import org.junit.jupiter.api.BeforeEach
import org.junit.jupiter.api.Nested
import org.junit.jupiter.api.Test
import org.junit.jupiter.api.extension.ExtendWith
import org.mockito.kotlin.*
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc
import org.springframework.boot.test.context.SpringBootTest
import org.springframework.boot.test.mock.mockito.MockBean
import org.springframework.context.ApplicationContext
import org.springframework.http.MediaType
import org.springframework.test.context.TestPropertySource
import org.springframework.test.context.junit.jupiter.SpringExtension
import org.springframework.test.web.servlet.MockMvc
import org.springframework.test.web.servlet.post
import org.testcontainers.junit.jupiter.Testcontainers

@Testcontainers
@ExtendWith(SpringExtension::class)
@SpringBootTest
@MockBean(MtbFileSender::class)
@TestPropertySource(
    properties = [
        "app.rest.uri=http://example.com"
    ]
)
class EtlProcessorApplicationTests : AbstractTestcontainerTest() {

    @Test
    fun contextLoadsIfMtbFileSenderConfigured() {
    fun contextLoadsIfMtbFileSenderConfigured(@Autowired context: ApplicationContext) {
        // Simply check bean configuration
        assertThat(context).isNotNull
    }

    @Nested
    @SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.MOCK)
    @AutoConfigureMockMvc
    @TestPropertySource(
        properties = [
            "app.transformations[0].path=diagnoses[*].icd10.version",
            "app.transformations[0].from=2013",
            "app.transformations[0].to=2014",
        ]
    )
    inner class TransformationTest {

        @MockBean
        private lateinit var mtbFileSender: MtbFileSender

        @Autowired
        private lateinit var mockMvc: MockMvc

        @Autowired
        private lateinit var objectMapper: ObjectMapper

        @BeforeEach
        fun setup(@Autowired requestRepository: RequestRepository) {
            requestRepository.deleteAll()
        }

        @Test
        fun mtbFileIsTransformed() {
            doAnswer {
                MtbFileSender.Response(RequestStatus.SUCCESS)
            }.whenever(mtbFileSender).send(any<MtbFileSender.MtbFileRequest>())

            val mtbFile = MtbFile.builder()
                .withPatient(
                    Patient.builder()
                        .withId("TEST_12345678")
                        .withBirthDate("2000-08-08")
                        .withGender(Patient.Gender.MALE)
                        .build()
                )
                .withConsent(
                    Consent.builder()
                        .withId("1")
                        .withStatus(Consent.Status.ACTIVE)
                        .withPatient("TEST_12345678")
                        .build()
                )
                .withEpisode(
                    Episode.builder()
                        .withId("1")
                        .withPatient("TEST_12345678")
                        .withPeriod(PeriodStart("2023-08-08"))
                        .build()
                )
                .withDiagnoses(
                    listOf(
                        Diagnosis.builder()
                            .withId("1234")
                            .withIcd10(Icd10.builder().withCode("F79.9").withVersion("2013").build())
                            .build()
                    )
                )
                .build()

            mockMvc.post("/mtbfile") {
                content = objectMapper.writeValueAsString(mtbFile)
                contentType = MediaType.APPLICATION_JSON
            }.andExpect {
                status {
                    isAccepted()
                }
            }

            val captor = argumentCaptor<MtbFileSender.MtbFileRequest>()
            verify(mtbFileSender).send(captor.capture())
            assertThat(captor.firstValue.mtbFile.diagnoses).hasSize(1).allMatch { diagnosis ->
                diagnosis.icd10.version == "2014"
            }
        }
    }

}
@ -31,13 +31,13 @@ import org.springframework.beans.factory.NoSuchBeanDefinitionException
import org.springframework.boot.autoconfigure.kafka.KafkaAutoConfiguration
import org.springframework.boot.test.context.SpringBootTest
import org.springframework.boot.test.mock.mockito.MockBean
import org.springframework.boot.test.mock.mockito.MockBeans
import org.springframework.context.ApplicationContext
import org.springframework.test.context.ContextConfiguration
import org.springframework.test.context.TestPropertySource

@SpringBootTest
@ContextConfiguration(classes = [KafkaAutoConfiguration::class, AppKafkaConfiguration::class, AppRestConfiguration::class])
@ContextConfiguration(classes = [AppConfiguration::class, KafkaAutoConfiguration::class, AppKafkaConfiguration::class, AppRestConfiguration::class])
@MockBean(ObjectMapper::class)
class AppConfigurationTest {

    @Nested
@ -65,10 +65,7 @@ class AppConfigurationTest {
            "app.kafka.group-id=test"
        ]
    )
    @MockBeans(value = [
        MockBean(ObjectMapper::class),
        MockBean(RequestRepository::class)
    ])
    @MockBean(RequestRepository::class)
    inner class AppConfigurationKafkaTest(private val context: ApplicationContext) {

        @Test
@ -99,4 +96,24 @@ class AppConfigurationTest {

    }

    @Nested
    @TestPropertySource(
        properties = [
            "app.transformations[0].path=consent.status",
            "app.transformations[0].from=rejected",
            "app.transformations[0].to=accept",
        ]
    )
    inner class AppConfigurationTransformationTest(private val context: ApplicationContext) {

        @Test
        fun shouldRecognizeTransformations() {
            val appConfigProperties = context.getBean(AppConfigProperties::class.java)

            assertThat(appConfigProperties).isNotNull
            assertThat(appConfigProperties.transformations).hasSize(1)
        }

    }

}
@ -32,6 +32,7 @@ import org.junit.jupiter.api.extension.ExtendWith
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.context.SpringBootTest
import org.springframework.boot.test.mock.mockito.MockBean
import org.springframework.test.context.TestPropertySource
import org.springframework.test.context.junit.jupiter.SpringExtension
import org.springframework.transaction.annotation.Transactional
import org.testcontainers.junit.jupiter.Testcontainers
@ -43,6 +44,11 @@ import java.util.*
@SpringBootTest
@Transactional
@MockBean(MtbFileSender::class)
@TestPropertySource(
    properties = [
        "app.rest.uri=http://example.com"
    ]
)
class RequestServiceIntegrationTest : AbstractTestcontainerTest() {

    private lateinit var requestRepository: RequestRepository
@ -22,6 +22,21 @@ package dev.dnpm.etl.processor.pseudonym;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.parser.IParser;
import dev.dnpm.etl.processor.config.GPasConfigProperties;
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.net.ConnectException;
import java.security.KeyManagementException;
import java.security.KeyStore;
import java.security.KeyStoreException;
import java.security.NoSuchAlgorithmException;
import java.security.cert.CertificateException;
import java.security.cert.CertificateFactory;
import java.security.cert.X509Certificate;
import java.util.Base64;
import java.util.HashMap;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;
import org.apache.commons.lang3.StringUtils;
import org.apache.hc.client5.http.impl.classic.CloseableHttpClient;
import org.apache.hc.client5.http.impl.classic.HttpClients;
@ -39,7 +54,11 @@ import org.jetbrains.annotations.NotNull;
import org.jetbrains.annotations.Nullable;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.http.*;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpMethod;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.http.client.HttpComponentsClientHttpRequestFactory;
import org.springframework.retry.RetryCallback;
import org.springframework.retry.RetryContext;
@ -51,31 +70,13 @@ import org.springframework.retry.support.RetryTemplate;
import org.springframework.web.client.RestClientException;
import org.springframework.web.client.RestTemplate;

import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.net.ConnectException;
import java.security.KeyManagementException;
import java.security.KeyStore;
import java.security.KeyStoreException;
import java.security.NoSuchAlgorithmException;
import java.security.cert.CertificateException;
import java.security.cert.CertificateFactory;
import java.security.cert.X509Certificate;
import java.util.Base64;
import java.util.HashMap;

public class GpasPseudonymGenerator implements Generator {

    private final static FhirContext r4Context = FhirContext.forR4();
    private final String gPasUrl;
    private final String psnTargetDomain;
    private static FhirContext r4Context = FhirContext.forR4();
    private final HttpHeaders httpHeader;

    private final RetryTemplate retryTemplate = defaultTemplate();

    private final Logger log = LoggerFactory.getLogger(GpasPseudonymGenerator.class);

    private SSLContext customSslContext;
@ -90,12 +91,16 @@ public class GpasPseudonymGenerator implements Generator {
        try {
            if (StringUtils.isNotBlank(gpasCfg.getSslCaLocation())) {
                customSslContext = getSslContext(gpasCfg.getSslCaLocation());
                log.debug(String.format("%s has been initialized with SSL certificate %s",
                    this.getClass().getName(), gpasCfg.getSslCaLocation()));
            }
        } catch (IOException | KeyManagementException | KeyStoreException | CertificateException |
            NoSuchAlgorithmException e) {
            throw new RuntimeException(e);
        }

        log.debug(String.format("%s has been initialized", this.getClass().getName()));

    }

    @Override
@ -110,12 +115,33 @@ public class GpasPseudonymGenerator implements Generator {

    @NotNull
    public static String unwrapPseudonym(Parameters gPasPseudonymResult) {
        Identifier pseudonym = (Identifier) gPasPseudonymResult.getParameter().stream().findFirst()
            .get().getPart().stream().filter(a -> a.getName().equals("pseudonym")).findFirst()
        final var parameters = gPasPseudonymResult.getParameter().stream().findFirst();

        if (parameters.isEmpty()) {
            throw new PseudonymRequestFailed("Empty HL7 parameters, cannot find first one");
        }

        final var identifier = (Identifier) parameters.get().getPart().stream()
            .filter(a -> a.getName().equals("pseudonym"))
            .findFirst()
            .orElseGet(ParametersParameterComponent::new).getValue();

        // pseudonym
        return pseudonym.getSystem() + "|" + pseudonym.getValue();
        return sanitizeValue(identifier.getValue());
    }

    /**
     * Allow only filename friendly values
     *
     * @param psnValue GAPS pseudonym value
     * @return cleaned up value
     */
    public static String sanitizeValue(String psnValue) {
        // pattern to match forbidden characters
        String forbiddenCharsRegex = "[\\\\/:*?\"<>|;]";

        // Replace all forbidden characters with underscores
        return psnValue.replaceAll(forbiddenCharsRegex, "_");
    }
@ -23,8 +23,9 @@ import org.springframework.boot.context.properties.ConfigurationProperties

@ConfigurationProperties(AppConfigProperties.NAME)
data class AppConfigProperties(
    var bwhc_uri: String?,
    var generator: PseudonymGenerator = PseudonymGenerator.BUILDIN
    var bwhcUri: String?,
    var generator: PseudonymGenerator = PseudonymGenerator.BUILDIN,
    var transformations: List<TransformationProperties> = listOf()
) {
    companion object {
        const val NAME = "app"
@ -78,4 +79,10 @@ data class KafkaTargetProperties(
    enum class PseudonymGenerator {
        BUILDIN,
        GPAS
    }
}

data class TransformationProperties(
    val path: String,
    val from: String,
    val to: String
)
@ -1,7 +1,7 @@
/*
 * This file is part of ETL-Processor
 *
 * Copyright (c) 2023 Comprehensive Cancer Center Mainfranken, Datenintegrationszentrum Philipps-Universität Marburg and Contributors
 * Copyright (c) 2024 Comprehensive Cancer Center Mainfranken, Datenintegrationszentrum Philipps-Universität Marburg and Contributors
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as published
@ -25,10 +25,14 @@ import dev.dnpm.etl.processor.pseudonym.AnonymizingGenerator
import dev.dnpm.etl.processor.pseudonym.Generator
import dev.dnpm.etl.processor.pseudonym.GpasPseudonymGenerator
import dev.dnpm.etl.processor.pseudonym.PseudonymizeService
import dev.dnpm.etl.processor.services.Transformation
import dev.dnpm.etl.processor.services.TransformationService
import org.slf4j.LoggerFactory
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty
import org.springframework.boot.context.properties.EnableConfigurationProperties
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.scheduling.annotation.EnableScheduling
import reactor.core.publisher.Sinks

@Configuration
@ -39,8 +43,11 @@ import reactor.core.publisher.Sinks
        GPasConfigProperties::class
    ]
)
@EnableScheduling
class AppConfiguration {

    private val logger = LoggerFactory.getLogger(AppConfiguration::class.java)

    @ConditionalOnProperty(value = ["app.pseudonymizer"], havingValue = "GPAS")
    @Bean
    fun gpasPseudonymGenerator(configProperties: GPasConfigProperties): Generator {
@ -71,5 +78,16 @@ class AppConfiguration {
        return Sinks.many().multicast().directBestEffort()
    }

    @Bean
    fun transformationService(
        objectMapper: ObjectMapper,
        configProperties: AppConfigProperties
    ): TransformationService {
        logger.info("Apply ${configProperties.transformations.size} transformation rules")
        return TransformationService(objectMapper, configProperties.transformations.map {
            Transformation.of(it.path) from it.from to it.to
        })
    }

}
@ -1,7 +1,7 @@
/*
 * This file is part of ETL-Processor
 *
 * Copyright (c) 2023 Comprehensive Cancer Center Mainfranken, Datenintegrationszentrum Philipps-Universität Marburg and Contributors
 * Copyright (c) 2024 Comprehensive Cancer Center Mainfranken, Datenintegrationszentrum Philipps-Universität Marburg and Contributors
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as published
@ -20,6 +20,8 @@
package dev.dnpm.etl.processor.config

import com.fasterxml.jackson.databind.ObjectMapper
import dev.dnpm.etl.processor.monitoring.ConnectionCheckService
import dev.dnpm.etl.processor.monitoring.KafkaConnectionCheckService
import dev.dnpm.etl.processor.output.KafkaMtbFileSender
import dev.dnpm.etl.processor.output.MtbFileSender
import dev.dnpm.etl.processor.services.kafka.KafkaResponseProcessor
@ -76,4 +78,9 @@ class AppKafkaConfiguration {
        return KafkaResponseProcessor(applicationEventPublisher, objectMapper)
    }

    @Bean
    fun connectionCheckService(consumerFactory: ConsumerFactory<String, String>): ConnectionCheckService {
        return KafkaConnectionCheckService(consumerFactory.createConsumer())
    }

}
@ -1,7 +1,7 @@
/*
 * This file is part of ETL-Processor
 *
 * Copyright (c) 2023 Comprehensive Cancer Center Mainfranken, Datenintegrationszentrum Philipps-Universität Marburg and Contributors
 * Copyright (c) 2024 Comprehensive Cancer Center Mainfranken, Datenintegrationszentrum Philipps-Universität Marburg and Contributors
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as published
@ -19,6 +19,8 @@

package dev.dnpm.etl.processor.config

import dev.dnpm.etl.processor.monitoring.ConnectionCheckService
import dev.dnpm.etl.processor.monitoring.RestConnectionCheckService
import dev.dnpm.etl.processor.output.MtbFileSender
import dev.dnpm.etl.processor.output.RestMtbFileSender
import org.slf4j.LoggerFactory
@ -54,5 +56,13 @@ class AppRestConfiguration {
        return RestMtbFileSender(restTemplate, restTargetProperties)
    }

    @Bean
    fun connectionCheckService(
        restTemplate: RestTemplate,
        restTargetProperties: RestTargetProperties
    ): ConnectionCheckService {
        return RestConnectionCheckService(restTemplate, restTargetProperties)
    }

}
@ -0,0 +1,85 @@
/*
 * This file is part of ETL-Processor
 *
 * Copyright (c) 2024 Comprehensive Cancer Center Mainfranken, Datenintegrationszentrum Philipps-Universität Marburg and Contributors
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as published
 * by the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU Affero General Public License for more details.
 *
 * You should have received a copy of the GNU Affero General Public License
 * along with this program. If not, see <https://www.gnu.org/licenses/>.
 */


package dev.dnpm.etl.processor.monitoring

import dev.dnpm.etl.processor.config.RestTargetProperties
import jakarta.annotation.PostConstruct
import org.apache.kafka.clients.consumer.Consumer
import org.apache.kafka.common.errors.TimeoutException
import org.springframework.http.HttpStatus
import org.springframework.scheduling.annotation.Scheduled
import org.springframework.web.client.RestTemplate
import kotlin.time.Duration.Companion.seconds
import kotlin.time.toJavaDuration

interface ConnectionCheckService {

    fun connectionAvailable(): Boolean

}

class KafkaConnectionCheckService(
    private val consumer: Consumer<String, String>
) : ConnectionCheckService {

    private var connectionAvailable: Boolean = false


    @PostConstruct
    @Scheduled(cron = "0 * * * * *")
    fun check() {
        connectionAvailable = try {
            null != consumer.listTopics(5.seconds.toJavaDuration())
        } catch (e: TimeoutException) {
            false
        }
    }

    override fun connectionAvailable(): Boolean {
        return this.connectionAvailable
    }

}

class RestConnectionCheckService(
    private val restTemplate: RestTemplate,
    private val restTargetProperties: RestTargetProperties
) : ConnectionCheckService {

    private var connectionAvailable: Boolean = false

    @PostConstruct
    @Scheduled(cron = "0 * * * * *")
    fun check() {
        connectionAvailable = try {
            restTemplate.getForEntity(
                restTargetProperties.uri?.replace("/etl/api", "").toString(),
                String::class.java
            ).statusCode == HttpStatus.OK
        } catch (e: Exception) {
            false
        }
    }

    override fun connectionAvailable(): Boolean {
        return this.connectionAvailable
    }
}
@ -34,7 +34,10 @@ class ReportService(
            return listOf()
        }
        return try {
            objectMapper.readValue(dataQualityReport, DataQualityReport::class.java).issues
            objectMapper
                .readValue(dataQualityReport, DataQualityReport::class.java)
                .issues
                .sortedBy { it.severity }
        } catch (e: Exception) {
            val otherIssue =
                Issue(Severity.ERROR, "Not parsable data quality report '$dataQualityReport'")
@ -56,5 +59,6 @@ class ReportService(
    enum class Severity(@JsonValue val value: String) {
        ERROR("error"),
        WARNING("warning"),
        INFO("info")
    }
}
@ -40,7 +40,7 @@ class KafkaMtbFileSender(
        val result = kafkaTemplate.send(
            kafkaTargetProperties.topic,
            key(request),
            objectMapper.writeValueAsString(request.mtbFile)
            objectMapper.writeValueAsString(Data(request.requestId, request.mtbFile))
        )
        if (result.get() != null) {
            logger.debug("Sent file via KafkaMtbFileSender")
@ -68,7 +68,7 @@ class KafkaMtbFileSender(
        val result = kafkaTemplate.send(
            kafkaTargetProperties.topic,
            key(request),
            objectMapper.writeValueAsString(dummyMtbFile)
            objectMapper.writeValueAsString(Data(request.requestId, dummyMtbFile))
        )

        if (result.get() != null) {
@ -85,12 +85,12 @@ class KafkaMtbFileSender(

    private fun key(request: MtbFileSender.MtbFileRequest): String {
        return "{\"pid\": \"${request.mtbFile.patient.id}\", " +
            "\"eid\": \"${request.mtbFile.episode.id}\", " +
            "\"requestId\": \"${request.requestId}\"}"
            "\"eid\": \"${request.mtbFile.episode.id}\"}"
    }

    private fun key(request: MtbFileSender.DeleteRequest): String {
        return "{\"pid\": \"${request.patientId}\", " +
            "\"requestId\": \"${request.requestId}\"}"
        return "{\"pid\": \"${request.patientId}\"}"
    }

    data class Data(val requestId: String, val content: MtbFile)
}
@ -35,16 +35,19 @@ infix fun MtbFile.pseudonymizeWith(pseudonymizeService: PseudonymizeService) {
    this.familyMemberDiagnoses.forEach { it.patient = patientPseudonym }
    this.geneticCounsellingRequests.forEach { it.patient = patientPseudonym }
    this.histologyReevaluationRequests.forEach { it.patient = patientPseudonym }
    this.histologyReports.forEach { it.patient = patientPseudonym }
    this.histologyReports.forEach {
        it.patient = patientPseudonym
        it.tumorMorphology.patient = patientPseudonym
    }
    this.lastGuidelineTherapies.forEach { it.patient = patientPseudonym }
    this.molecularPathologyFindings.forEach { it.patient = patientPseudonym }
    this.molecularTherapies.forEach { it.history.forEach { it.patient = patientPseudonym } }
    this.molecularTherapies.forEach { molecularTherapy -> molecularTherapy.history.forEach { it.patient = patientPseudonym } }
    this.ngsReports.forEach { it.patient = patientPseudonym }
    this.previousGuidelineTherapies.forEach { it.patient = patientPseudonym }
    this.rebiopsyRequests.forEach { it.patient = patientPseudonym }
    this.recommendations.forEach { it.patient = patientPseudonym }
    this.recommendations.forEach { it.patient = patientPseudonym }
    this.responses.forEach { it.patient = patientPseudonym }
    this.specimens.forEach { it.patient = patientPseudonym }
    this.studyInclusionRequests.forEach { it.patient = patientPseudonym }
    this.specimens.forEach { it.patient = patientPseudonym }
}
@ -30,7 +30,6 @@ import dev.dnpm.etl.processor.pseudonym.PseudonymizeService
import dev.dnpm.etl.processor.pseudonym.pseudonymizeWith
import org.apache.commons.codec.binary.Base32
import org.apache.commons.codec.digest.DigestUtils
import org.slf4j.LoggerFactory
import org.springframework.context.ApplicationEventPublisher
import org.springframework.stereotype.Service
import java.time.Instant
@ -39,21 +38,20 @@ import java.util.*
@Service
class RequestProcessor(
    private val pseudonymizeService: PseudonymizeService,
    private val transformationService: TransformationService,
    private val sender: MtbFileSender,
    private val requestService: RequestService,
    private val objectMapper: ObjectMapper,
    private val applicationEventPublisher: ApplicationEventPublisher
) {

    private val logger = LoggerFactory.getLogger(RequestProcessor::class.java)

    fun processMtbFile(mtbFile: MtbFile) {
        val requestId = UUID.randomUUID().toString()
        val pid = mtbFile.patient.id

        mtbFile pseudonymizeWith pseudonymizeService

        val request = MtbFileSender.MtbFileRequest(requestId, mtbFile)
        val request = MtbFileSender.MtbFileRequest(requestId, transformationService.transform(mtbFile))

        requestService.save(
            Request(
@ -19,7 +19,6 @@

package dev.dnpm.etl.processor.services

import com.fasterxml.jackson.databind.ObjectMapper
import dev.dnpm.etl.processor.monitoring.Report
import dev.dnpm.etl.processor.monitoring.RequestRepository
import dev.dnpm.etl.processor.monitoring.RequestStatus
@ -33,8 +32,7 @@ import java.util.*
@Service
class ResponseProcessor(
    private val requestRepository: RequestRepository,
    private val statisticsUpdateProducer: Sinks.Many<Any>,
    private val objectMapper: ObjectMapper
    private val statisticsUpdateProducer: Sinks.Many<Any>
) {

    private val logger = LoggerFactory.getLogger(ResponseProcessor::class.java)
@ -73,7 +71,7 @@ class ResponseProcessor(
            }

            else -> {
                logger.error("Cannot process response: Unknown response code!")
                logger.error("Cannot process response: Unknown response!")
                return@ifPresentOrElse
            }
        }
@ -0,0 +1,85 @@
/*
 * This file is part of ETL-Processor
 *
 * Copyright (c) 2023 Comprehensive Cancer Center Mainfranken, Datenintegrationszentrum Philipps-Universität Marburg and Contributors
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as published
 * by the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU Affero General Public License for more details.
 *
 * You should have received a copy of the GNU Affero General Public License
 * along with this program. If not, see <https://www.gnu.org/licenses/>.
 */

package dev.dnpm.etl.processor.services

import com.fasterxml.jackson.databind.ObjectMapper
import com.jayway.jsonpath.JsonPath
import com.jayway.jsonpath.PathNotFoundException
import de.ukw.ccc.bwhc.dto.MtbFile

class TransformationService(private val objectMapper: ObjectMapper, private val transformations: List<Transformation>) {
    fun transform(mtbFile: MtbFile): MtbFile {
        var json = objectMapper.writeValueAsString(mtbFile)

        transformations.forEach { transformation ->
            val jsonPath = JsonPath.parse(json)

            try {
                val before = transformation.path.substringBeforeLast(".")
                val last = transformation.path.substringAfterLast(".")

                val existingValue = if (transformation.existingValue is Number) transformation.existingValue else transformation.existingValue.toString()
                val newValue = if (transformation.newValue is Number) transformation.newValue else transformation.newValue.toString()

                jsonPath.set("$.$before.[?]$last", newValue, {
                    it.item(HashMap::class.java)[last] == existingValue
                })
            } catch (e: PathNotFoundException) {
                // Ignore
            }

            json = jsonPath.jsonString()
        }

        return objectMapper.readValue(json, MtbFile::class.java)
    }

    fun getTransformations(): List<Transformation> {
        return this.transformations
    }

}

class Transformation private constructor(val path: String) {

    lateinit var existingValue: Any
        private set
    lateinit var newValue: Any
        private set

    infix fun from(value: Any): Transformation {
        this.existingValue = value
        return this
    }

    infix fun to(value: Any): Transformation {
        this.newValue = value
        return this
    }

    companion object {

        fun of(path: String): Transformation {
            return Transformation(path)
        }

    }

}
@ -41,50 +41,40 @@ class KafkaResponseProcessor(

    override fun onMessage(data: ConsumerRecord<String, String>) {
        try {
            Optional.of(objectMapper.readValue(data.key(), ResponseKey::class.java))
            Optional.of(objectMapper.readValue(data.value(), ResponseBody::class.java))
        } catch (e: Exception) {
            logger.error("Cannot process Kafka response", e)
            Optional.empty()
        }.ifPresentOrElse({ responseKey ->
            val event = try {
                val responseBody = objectMapper.readValue(data.value(), ResponseBody::class.java)
                ResponseEvent(
                    responseKey.requestId,
                    Instant.ofEpochMilli(data.timestamp()),
                    responseBody.statusCode.asRequestStatus(),
                    when (responseBody.statusCode.asRequestStatus()) {
                        RequestStatus.SUCCESS -> {
                            Optional.empty()
                        }

                        RequestStatus.WARNING, RequestStatus.ERROR -> {
                            Optional.of(objectMapper.writeValueAsString(responseBody.statusBody))
                        }

                        else -> {
                            logger.error("Kafka response: Unknown response code!")
                            Optional.empty()
                        }
        }.ifPresentOrElse({ responseBody ->
            val event = ResponseEvent(
                responseBody.requestId,
                Instant.ofEpochMilli(data.timestamp()),
                responseBody.statusCode.asRequestStatus(),
                when (responseBody.statusCode.asRequestStatus()) {
                    RequestStatus.SUCCESS -> {
                        Optional.empty()
                    }
                )
            } catch (e: Exception) {
                logger.error("Cannot process Kafka response", e)
                ResponseEvent(
                    responseKey.requestId,
                    Instant.ofEpochMilli(data.timestamp()),
                    RequestStatus.ERROR,
                    Optional.of("Cannot process Kafka response")
                )
            }

                    RequestStatus.WARNING, RequestStatus.ERROR -> {
                        Optional.of(objectMapper.writeValueAsString(responseBody.statusBody))
                    }

                    else -> {
                        logger.error("Kafka response: Unknown response code '{}'!", responseBody.statusCode)
                        Optional.empty()
                    }
                }
            )
            eventPublisher.publishEvent(event)
        }, {
            logger.error("No response key in Kafka response")
            logger.error("No requestId in Kafka response")
        })
    }

    data class ResponseKey(val requestId: String)

    data class ResponseBody(
        @JsonProperty("status_code") @JsonAlias("status code") val statusCode: Int,
        @JsonProperty("status_body") val statusBody: Map<String, Any>
        @JsonProperty("request_id") @JsonAlias("requestId") val requestId: String,
        @JsonProperty("status_code") @JsonAlias("statusCode") val statusCode: Int,
        @JsonProperty("status_body") @JsonAlias("statusBody") val statusBody: Map<String, Any>
    )

}
@ -0,0 +1,51 @@
/*
 * This file is part of ETL-Processor
 *
 * Copyright (c) 2023 Comprehensive Cancer Center Mainfranken, Datenintegrationszentrum Philipps-Universität Marburg and Contributors
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as published
 * by the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU Affero General Public License for more details.
 *
 * You should have received a copy of the GNU Affero General Public License
 * along with this program. If not, see <https://www.gnu.org/licenses/>.
 */

package dev.dnpm.etl.processor.web

import dev.dnpm.etl.processor.monitoring.ConnectionCheckService
import dev.dnpm.etl.processor.output.MtbFileSender
import dev.dnpm.etl.processor.pseudonym.Generator
import dev.dnpm.etl.processor.services.TransformationService
import org.springframework.stereotype.Controller
import org.springframework.ui.Model
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RequestMapping

@Controller
@RequestMapping(path = ["configs"])
class ConfigController(
    private val transformationService: TransformationService,
    private val pseudonymGenerator: Generator,
    private val mtbFileSender: MtbFileSender,
    private val connectionCheckService: ConnectionCheckService

) {

    @GetMapping
    fun index(model: Model): String {
        model.addAttribute("pseudonymGenerator", pseudonymGenerator.javaClass.simpleName)
        model.addAttribute("mtbFileSender", mtbFileSender.javaClass.simpleName)
        model.addAttribute("connectionAvailable", connectionCheckService.connectionAvailable())
        model.addAttribute("transformations", transformationService.getTransformations())

        return "configs"
    }

}
@ -83,9 +83,9 @@ class StatisticsRestController(
            .groupBy { formatter.format(it.processedAt) }
            .map {
                val requestList = it.value
                    .groupBy { it.status }
                    .map {
                        Pair(it.key, it.value.size)
                    .groupBy { request -> request.status }
                    .map { request ->
                        Pair(request.key, request.value.size)
                    }
                    .toMap()
                Pair(
application-dev.yml

@ -4,12 +4,12 @@ spring:
      file: ./dev-compose.yml

app:
  rest:
    uri: http://localhost:9000/bwhc/etl/api
  #rest:
  #  uri: http://localhost:9000/bwhc/etl/api
  kafka:
    topic: test
    response-topic: test-response
    servers: kafka:9092
    response-topic: test_response
    servers: localhost:9094

server:
  port: 8000
style.css

@ -1,3 +1,24 @@
:root {
    --table-border: rgba(96, 96, 96, 1);

    --bg-blue: rgb(0, 74, 157);
    --bg-blue-op: rgba(0, 74, 157, .35);

    --bg-green: rgb(0, 128, 0);
    --bg-green-op: rgba(0, 128, 0, .35);

    --bg-yellow: rgb(255, 140, 0);
    --bg-yellow-op: rgba(255, 140, 0, .35);

    --bg-red: rgb(255, 0, 0);
    --bg-red-op: rgba(255, 0, 0, .35);

    --bg-gray: rgb(112, 128, 144);
    --bg-gray-op: rgba(112, 128, 144, .35);
}

body {
    margin: 0;
    font-family: sans-serif;
@ -57,7 +78,7 @@ nav > ul > li:first-of-type {
    display: inline;
}

.breadcrumps ul li+li:before {
.breadcrumps ul li + li:before {
    padding: .4rem;
    color: gray;
    content: "/\00a0";
@ -68,6 +89,10 @@ nav > ul > li:first-of-type {
    text-decoration: none;
}

.centered {
    text-align: center;
}

main {
    margin: 0 auto;
    max-width: 1140px;
@ -115,8 +140,8 @@ form.samplecode-input input:focus-visible {
}

table {
    border-top: 1px solid lightgray;
    border-left: 1px solid lightgray;
    border-top: 1px solid var(--table-border);
    border-left: 1px solid var(--table-border);
    border-spacing: 0;
    border-radius: 3px;

@ -145,10 +170,10 @@ th {
}

td, th {
    padding: .2rem;
    padding: 0.4rem .2rem;

    border-right: 1px solid lightgray;
    border-bottom: 1px solid lightgray;
    border-right: 1px solid var(--table-border);
    border-bottom: 1px solid var(--table-border);

    text-align: left;
    white-space: nowrap;
@ -159,26 +184,66 @@ td {
    font-family: monospace;
}

td.bg-green, th.bg-green {
    background: green;
td.bg-blue, th.bg-blue {
    background: var(--bg-blue);
    color: white;
}

tr:has(td.bg-blue) {
    background: var(--bg-blue-op);
}

td.bg-green, th.bg-green {
    background: var(--bg-green);
    color: white;
}

tr:has(td.bg-green) {
    background: var(--bg-green-op);
}

td.bg-yellow, th.bg-yellow {
    background: darkorange;
    background: var(--bg-yellow);
    color: white;
}

tr:has(td.bg-yellow) {
    background: var(--bg-yellow-op);
}

td.bg-red, th.bg-red {
    background: red;
    background: var(--bg-red);
    color: white;
}

tr:has(td.bg-red) {
    background: var(--bg-red-op);
}

td.bg-gray, th.bg-gray {
    background: slategray;
    background: var(--bg-gray);
    color: white;
}

.bg-path {
    background: var(--bg-gray-op);
}

.bg-from {
    background: var(--bg-red-op);
}

.bg-to {
    background: var(--bg-green-op);
}

.bg-path, .bg-from, .bg-to {
    padding: 0.25rem 0.5rem;
    border-radius: 3px;

    font-family: monospace;
}

td.bg-shaded, th.bg-shaded {
    background: repeating-linear-gradient(140deg, white, #e5e5f5 4px, white 8px);
}
@ -279,7 +344,7 @@ input.inline:focus-visible {
    padding: 1rem;
    margin: .2rem;

    border: 1px solid lightgray;
    border: 1px solid var(--table-border);
    border-radius: 3px;

    width: calc(100% - 2.4rem - 4px);
src/main/resources/templates/configs.html (new file, 76 lines)
@@ -0,0 +1,76 @@
<!DOCTYPE html>
<html lang="de" xmlns:th="http://www.thymeleaf.org">
<head>
    <meta charset="UTF-8">
    <title>ETL-Prozessor</title>
    <link rel="stylesheet" th:href="@{/style.css}" />
</head>
<body>
<div th:replace="~{fragments.html :: nav}"></div>
<main>
    <h1>Konfiguration</h1>

    <h2>Allgemeine Konfiguration</h2>
    <table>
        <thead>
        <tr>
            <th>Name</th>
            <th>Wert</th>
        </tr>
        </thead>
        <tbody>
        <tr>
            <td>Pseudonym erzeugt über</td>
            <td>[[ ${pseudonymGenerator} ]]</td>
        </tr>
        <tr>
            <td>MTBFile-Sender</td>
            <td>[[ ${mtbFileSender} ]]</td>
        </tr>
        </tbody>
    </table>

    <h2><span th:if="${connectionAvailable}">✅</span><span th:if="${not(connectionAvailable)}">⚡</span> Verbindung zum bwHC-Backend</h2>
    <p>
        Verbindung über <code>[[ ${mtbFileSender} ]]</code>. Die Verbindung ist aktuell
        <strong th:if="${connectionAvailable}" style="color: green">verfügbar.</strong>
        <strong th:if="${not(connectionAvailable)}" style="color: red">nicht verfügbar!</strong>
    </p>

    <h2>Transformationen</h2>

    <h3>Syntax</h3>
    Hier einige Beispiele zum Syntax des JSON-Path
    <ul>
        <li style="padding: 0.6rem 0;"><span class="bg-path">diagnoses[*].icdO3T.version</span>: Ersetze die ICD-O3T-Version in allen Diagnosen, z.B. zur Version der deutschen Übersetzung</li>
        <li style="padding: 0.6rem 0;"><span class="bg-path">patient.gender</span>: Ersetze das Geschlecht des Patienten, z.B. in das von bwHC verlangte Format</li>
    </ul>

    <h3>Konfigurierte Transformationen</h3>
    <p>
        Hier sehen Sie eine Übersicht der konfigurierten Transformationen.
    </p>

    <table>
        <thead>
        <tr>
            <th>JSON-Path</th>
            <th>Transformation von ⇒ nach</th>
        </tr>
        </thead>
        <tbody>
        <tr th:each="transformation : ${transformations}">
            <td>
                <span class="bg-path" title="Ersetze Wert(e) an dieser Stelle im MTB-File">[[ ${transformation.path} ]]</span>
            </td>
            <td>
                <span class="bg-from" title="Ersetze immer dann, wenn dieser Wert enthalten ist">[[ ${transformation.existingValue} ]]</span>
                <strong>⇒</strong>
                <span class="bg-to" title="Ersetze durch diesen Wert">[[ ${transformation.newValue} ]]</span>
            </td>
        </tr>
        </tbody>
    </table>
</main>
</body>
</html>
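The new `configs.html` template expects the model attributes `pseudonymGenerator`, `mtbFileSender`, `connectionAvailable` and `transformations`. A hypothetical controller sketch that would populate them (class name, display values and wiring are assumptions, not the repository's actual controller):

```kotlin
// Hypothetical sketch: a Spring MVC controller filling the model attributes
// referenced by configs.html. Values shown here are placeholders.
import dev.dnpm.etl.processor.services.Transformation
import org.springframework.stereotype.Controller
import org.springframework.ui.Model
import org.springframework.web.bind.annotation.GetMapping

@Controller
class ConfigSketchController(
    private val transformations: List<Transformation>   // assumed to come from configuration
) {

    @GetMapping(path = ["/configs"])
    fun configs(model: Model): String {
        model.addAttribute("pseudonymGenerator", "BUILDIN")       // assumed display value
        model.addAttribute("mtbFileSender", "RestMtbFileSender")  // assumed display value
        model.addAttribute("connectionAvailable", true)           // would come from a connection check
        model.addAttribute("transformations", transformations)
        return "configs"                                          // resolves to templates/configs.html
    }
}
```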
@@ -10,6 +10,7 @@
     <ul>
         <li><a th:href="@{/}">Übersicht</a></li>
         <li><a th:href="@{/statistics}">Statistiken</a></li>
+        <li><a th:href="@{/configs}">Konfiguration</a></li>
     </ul>
 </nav>
 </div>
@@ -45,6 +45,7 @@
         </thead>
         <tbody>
         <tr th:each="issue : ${issues}">
+            <td th:if="${issue.severity.value == 'info'}" class="bg-blue"><small>[[ ${issue.severity} ]]</small></td>
             <td th:if="${issue.severity.value == 'warning'}" class="bg-yellow"><small>[[ ${issue.severity} ]]</small></td>
             <td th:if="${issue.severity.value == 'error'}" class="bg-red"><small>[[ ${issue.severity} ]]</small></td>
             <td>[[ ${issue.message} ]]</td>
@@ -97,9 +97,9 @@ class KafkaMtbFileSenderTest {
         val captor = argumentCaptor<String>()
         verify(kafkaTemplate, times(1)).send(anyString(), captor.capture(), captor.capture())
         assertThat(captor.firstValue).isNotNull
-        assertThat(captor.firstValue).isEqualTo("{\"pid\": \"PID\", \"eid\": \"1\", \"requestId\": \"TestID\"}")
+        assertThat(captor.firstValue).isEqualTo("{\"pid\": \"PID\", \"eid\": \"1\"}")
         assertThat(captor.secondValue).isNotNull
-        assertThat(captor.secondValue).isEqualTo(objectMapper.writeValueAsString(mtbFile(Consent.Status.ACTIVE)))
+        assertThat(captor.secondValue).isEqualTo(objectMapper.writeValueAsString(kafkaRecordData("TestID", Consent.Status.ACTIVE)))
     }

     @Test
@@ -113,9 +113,9 @@ class KafkaMtbFileSenderTest {
         val captor = argumentCaptor<String>()
         verify(kafkaTemplate, times(1)).send(anyString(), captor.capture(), captor.capture())
         assertThat(captor.firstValue).isNotNull
-        assertThat(captor.firstValue).isEqualTo("{\"pid\": \"PID\", \"requestId\": \"TestID\"}")
+        assertThat(captor.firstValue).isEqualTo("{\"pid\": \"PID\"}")
         assertThat(captor.secondValue).isNotNull
-        assertThat(captor.secondValue).isEqualTo(objectMapper.writeValueAsString(mtbFile(Consent.Status.REJECTED)))
+        assertThat(captor.secondValue).isEqualTo(objectMapper.writeValueAsString(kafkaRecordData("TestID", Consent.Status.REJECTED)))
     }

     companion object {
@@ -154,6 +154,10 @@ class KafkaMtbFileSenderTest {
             }.build()
         }

+        fun kafkaRecordData(requestId: String, consentStatus: Consent.Status): KafkaMtbFileSender.Data {
+            return KafkaMtbFileSender.Data(requestId, mtbFile(consentStatus))
+        }
+
         data class TestData(val requestStatus: RequestStatus, val exception: Throwable? = null)

         @JvmStatic
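The updated assertions show that the request ID no longer rides in the Kafka record key but inside the serialized value, wrapped together with the MTB file in `KafkaMtbFileSender.Data`. A sketch of that record layout (`buildRecord` and its parameters are illustrative only; the `Data` class mirrors the constructor used in `kafkaRecordData()` above):

```kotlin
// Sketch of the expected record layout: key carries patient/episode only,
// the value carries request ID plus the complete MTB file.
import com.fasterxml.jackson.databind.ObjectMapper
import de.ukw.ccc.bwhc.dto.MtbFile

// Mirrors KafkaMtbFileSender.Data as used in the test's companion object.
data class Data(val requestId: String, val mtbFile: MtbFile)

fun buildRecord(requestId: String, pid: String, eid: String, mtbFile: MtbFile): Pair<String, String> {
    val key = """{"pid": "$pid", "eid": "$eid"}"""                            // record key
    val value = ObjectMapper().writeValueAsString(Data(requestId, mtbFile))   // record value
    return key to value
}
```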
@@ -105,7 +105,7 @@ class RestMtbFileSenderTest {
             }
         """.trimIndent()

-        val mtbFile = MtbFile.builder()
+        val mtbFile: MtbFile = MtbFile.builder()
             .withPatient(
                 Patient.builder()
                     .withId("PID")
@@ -129,7 +129,7 @@ class RestMtbFileSenderTest {
             )
             .build()

-        private val errorResponseBody = "Sonstiger Fehler bei der Übertragung"
+        private const val ERROR_RESPONSE_BODY = "Sonstiger Fehler bei der Übertragung"

         /**
          * Synthetic http responses with related request status
@@ -147,23 +147,23 @@ class RestMtbFileSenderTest {
             RequestWithResponse(
                 HttpStatus.BAD_REQUEST,
                 "??",
-                MtbFileSender.Response(RequestStatus.ERROR, errorResponseBody)
+                MtbFileSender.Response(RequestStatus.ERROR, ERROR_RESPONSE_BODY)
             ),
             RequestWithResponse(
                 HttpStatus.UNPROCESSABLE_ENTITY,
                 errorBody,
-                MtbFileSender.Response(RequestStatus.ERROR, errorResponseBody)
+                MtbFileSender.Response(RequestStatus.ERROR, ERROR_RESPONSE_BODY)
             ),
             // Some more errors not mentioned in documentation
             RequestWithResponse(
                 HttpStatus.NOT_FOUND,
                 "what????",
-                MtbFileSender.Response(RequestStatus.ERROR, errorResponseBody)
+                MtbFileSender.Response(RequestStatus.ERROR, ERROR_RESPONSE_BODY)
             ),
             RequestWithResponse(
                 HttpStatus.INTERNAL_SERVER_ERROR,
                 "what????",
-                MtbFileSender.Response(RequestStatus.ERROR, errorResponseBody)
+                MtbFileSender.Response(RequestStatus.ERROR, ERROR_RESPONSE_BODY)
             )
         )
     }
@@ -180,12 +180,12 @@ class RestMtbFileSenderTest {
             RequestWithResponse(
                 HttpStatus.NOT_FOUND,
                 "what????",
-                MtbFileSender.Response(RequestStatus.ERROR, errorResponseBody)
+                MtbFileSender.Response(RequestStatus.ERROR, ERROR_RESPONSE_BODY)
             ),
             RequestWithResponse(
                 HttpStatus.INTERNAL_SERVER_ERROR,
                 "what????",
-                MtbFileSender.Response(RequestStatus.ERROR, errorResponseBody)
+                MtbFileSender.Response(RequestStatus.ERROR, ERROR_RESPONSE_BODY)
             )
         )
     }
@@ -0,0 +1,64 @@
/*
 * This file is part of ETL-Processor
 *
 * Copyright (c) 2023 Comprehensive Cancer Center Mainfranken, Datenintegrationszentrum Philipps-Universität Marburg and Contributors
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as published
 * by the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU Affero General Public License for more details.
 *
 * You should have received a copy of the GNU Affero General Public License
 * along with this program. If not, see <https://www.gnu.org/licenses/>.
 */

package dev.dnpm.etl.processor.pseudonym

import com.fasterxml.jackson.databind.ObjectMapper
import de.ukw.ccc.bwhc.dto.MtbFile
import org.assertj.core.api.Assertions.assertThat
import org.junit.jupiter.api.Test
import org.junit.jupiter.api.extension.ExtendWith
import org.mockito.ArgumentMatchers
import org.mockito.Mock
import org.mockito.junit.jupiter.MockitoExtension
import org.mockito.kotlin.doAnswer
import org.mockito.kotlin.whenever
import org.springframework.core.io.ClassPathResource

const val FAKE_MTB_FILE_PATH = "fake_MTBFile.json"
const val CLEAN_PATIENT_ID = "5dad2f0b-49c6-47d8-a952-7b9e9e0f7549"

@ExtendWith(MockitoExtension::class)
class ExtensionsTest {

    private fun fakeMtbFile(): MtbFile {
        val mtbFile = ClassPathResource(FAKE_MTB_FILE_PATH).inputStream
        return ObjectMapper().readValue(mtbFile, MtbFile::class.java)
    }

    private fun MtbFile.serialized(): String {
        return ObjectMapper().writeValueAsString(this)
    }

    @Test
    fun shouldNotContainCleanPatientId(@Mock pseudonymizeService: PseudonymizeService) {
        doAnswer {
            it.arguments[0]
            "PSEUDO-ID"
        }.whenever(pseudonymizeService).patientPseudonym(ArgumentMatchers.anyString())

        val mtbFile = fakeMtbFile()

        mtbFile.pseudonymizeWith(pseudonymizeService)

        assertThat(mtbFile.patient.id).isEqualTo("PSEUDO-ID")
        assertThat(mtbFile.serialized()).doesNotContain(CLEAN_PATIENT_ID)
    }

}
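The new test stubs `patientPseudonym()` to return `"PSEUDO-ID"` and then checks that the original patient ID no longer appears anywhere in the serialized file. A heavily reduced sketch of an extension in that direction, assuming the DTO exposes a setter for the patient ID; the repository's real `pseudonymizeWith()` must also rewrite every other reference to the patient ID:

```kotlin
// Reduced sketch, not the repository implementation.
import de.ukw.ccc.bwhc.dto.MtbFile
import dev.dnpm.etl.processor.pseudonym.PseudonymizeService

fun MtbFile.pseudonymizePatientIdSketch(pseudonymizeService: PseudonymizeService) {
    // assumes the generated DTO exposes a setter for the patient ID
    this.patient.id = pseudonymizeService.patientPseudonym(this.patient.id)
    // ...plus the same replacement for every other occurrence of the patient ID
}
```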
@@ -70,6 +70,13 @@ class PseudonymizeServiceTest {
         assertThat(mtbFile.patient.id).isEqualTo("123")
     }

+    @Test
+    fun sanitizeFileName(@Mock generator: GpasPseudonymGenerator) {
+        val result= GpasPseudonymGenerator.sanitizeValue("l://a\\bs;1*2?3>")
+
+        assertThat(result).isEqualTo("l___a_bs_1_2_3_")
+    }
+
     @Test
     fun shouldUsePseudonymPrefixForBuiltin(@Mock generator: AnonymizingGenerator) {
         doAnswer {
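The new `sanitizeFileName` test pins down how `GpasPseudonymGenerator.sanitizeValue()` is expected to behave: characters that are unsafe in file or domain names become underscores. A sketch that reproduces the asserted value (the exact character set is an assumption):

```kotlin
// Sketch: map file-name-unsafe characters to '_'; character set is assumed.
fun sanitizeValue(value: String): String =
    value.replace(Regex("[/\\\\:;*?<>|\"]"), "_")

fun main() {
    // Matches the expected value in the test above.
    check(sanitizeValue("l://a\\bs;1*2?3>") == "l___a_bs_1_2_3_")
}
```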
@@ -41,6 +41,7 @@ class ReportServiceTest {
            {
                "patient": "4711",
                "issues": [
+                   { "severity": "info", "message": "Info Message" },
                    { "severity": "warning", "message": "Warning Message" },
                    { "severity": "error", "message": "Error Message" }
                ]
@@ -49,11 +50,13 @@

         val actual = this.reportService.deserialize(json)

-        assertThat(actual).hasSize(2)
-        assertThat(actual[0].severity).isEqualTo(ReportService.Severity.WARNING)
-        assertThat(actual[0].message).isEqualTo("Warning Message")
-        assertThat(actual[1].severity).isEqualTo(ReportService.Severity.ERROR)
-        assertThat(actual[1].message).isEqualTo("Error Message")
+        assertThat(actual).hasSize(3)
+        assertThat(actual[0].severity).isEqualTo(ReportService.Severity.ERROR)
+        assertThat(actual[0].message).isEqualTo("Error Message")
+        assertThat(actual[1].severity).isEqualTo(ReportService.Severity.WARNING)
+        assertThat(actual[1].message).isEqualTo("Warning Message")
+        assertThat(actual[2].severity).isEqualTo(ReportService.Severity.INFO)
+        assertThat(actual[2].message).isEqualTo("Info Message")
     }

     @Test
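The adjusted assertions imply that `ReportService.deserialize()` now returns issues sorted by severity, most severe first. A sketch of that ordering, assuming the severity enum declares `ERROR` before `WARNING` before `INFO` (the types here are stand-ins, not the project's classes):

```kotlin
// Sketch: sorting by the enum's declaration order yields ERROR, WARNING, INFO.
enum class Severity { ERROR, WARNING, INFO }
data class Issue(val severity: Severity, val message: String)

fun List<Issue>.sortedBySeverity(): List<Issue> = sortedBy { it.severity }

fun main() {
    val sorted = listOf(
        Issue(Severity.INFO, "Info Message"),
        Issue(Severity.WARNING, "Warning Message"),
        Issue(Severity.ERROR, "Error Message"),
    ).sortedBySeverity()
    println(sorted.map { it.severity })  // [ERROR, WARNING, INFO]
}
```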
@@ -37,6 +37,7 @@ import org.mockito.Mockito.*
 import org.mockito.junit.jupiter.MockitoExtension
 import org.mockito.kotlin.any
 import org.mockito.kotlin.argumentCaptor
+import org.mockito.kotlin.whenever
 import org.springframework.context.ApplicationEventPublisher
 import java.time.Instant
 import java.util.*
@@ -46,6 +47,7 @@ import java.util.*
 class RequestProcessorTest {

     private lateinit var pseudonymizeService: PseudonymizeService
+    private lateinit var transformationService: TransformationService
     private lateinit var sender: MtbFileSender
     private lateinit var requestService: RequestService
     private lateinit var applicationEventPublisher: ApplicationEventPublisher
@@ -55,11 +57,13 @@ class RequestProcessorTest {
     @BeforeEach
     fun setup(
         @Mock pseudonymizeService: PseudonymizeService,
+        @Mock transformationService: TransformationService,
         @Mock sender: RestMtbFileSender,
         @Mock requestService: RequestService,
         @Mock applicationEventPublisher: ApplicationEventPublisher
     ) {
         this.pseudonymizeService = pseudonymizeService
+        this.transformationService = transformationService
         this.sender = sender
         this.requestService = requestService
         this.applicationEventPublisher = applicationEventPublisher
@@ -68,6 +72,7 @@ class RequestProcessorTest {

         requestProcessor = RequestProcessor(
             pseudonymizeService,
+            transformationService,
             sender,
             requestService,
             objectMapper,
@@ -98,6 +103,10 @@ class RequestProcessorTest {
             it.arguments[0] as String
         }.`when`(pseudonymizeService).patientPseudonym(any())

+        doAnswer {
+            it.arguments[0]
+        }.whenever(transformationService).transform(any())
+
         val mtbFile = MtbFile.builder()
             .withPatient(
                 Patient.builder()
@@ -153,6 +162,10 @@ class RequestProcessorTest {
             it.arguments[0] as String
         }.`when`(pseudonymizeService).patientPseudonym(any())

+        doAnswer {
+            it.arguments[0]
+        }.whenever(transformationService).transform(any())
+
         val mtbFile = MtbFile.builder()
             .withPatient(
                 Patient.builder()
@@ -212,6 +225,10 @@ class RequestProcessorTest {
             it.arguments[0] as String
         }.`when`(pseudonymizeService).patientPseudonym(any())

+        doAnswer {
+            it.arguments[0]
+        }.whenever(transformationService).transform(any())
+
         val mtbFile = MtbFile.builder()
             .withPatient(
                 Patient.builder()
@@ -271,6 +288,10 @@ class RequestProcessorTest {
             it.arguments[0] as String
         }.`when`(pseudonymizeService).patientPseudonym(any())

+        doAnswer {
+            it.arguments[0]
+        }.whenever(transformationService).transform(any())
+
         val mtbFile = MtbFile.builder()
             .withPatient(
                 Patient.builder()
@@ -19,8 +19,6 @@

 package dev.dnpm.etl.processor.services

-import com.fasterxml.jackson.databind.ObjectMapper
-import com.fasterxml.jackson.module.kotlin.KotlinModule
 import dev.dnpm.etl.processor.monitoring.Request
 import dev.dnpm.etl.processor.monitoring.RequestRepository
 import dev.dnpm.etl.processor.monitoring.RequestStatus
@@ -62,12 +60,10 @@ class ResponseProcessorTest {
         @Mock requestRepository: RequestRepository,
         @Mock statisticsUpdateProducer: Sinks.Many<Any>
     ) {
-        val objectMapper = ObjectMapper().registerModule(KotlinModule.Builder().build())
-
         this.requestRepository = requestRepository
         this.statisticsUpdateProducer = statisticsUpdateProducer

-        this.responseProcessor = ResponseProcessor(requestRepository, statisticsUpdateProducer, objectMapper)
+        this.responseProcessor = ResponseProcessor(requestRepository, statisticsUpdateProducer)
     }

     @Test
@@ -0,0 +1,95 @@
/*
 * This file is part of ETL-Processor
 *
 * Copyright (c) 2023 Comprehensive Cancer Center Mainfranken, Datenintegrationszentrum Philipps-Universität Marburg and Contributors
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as published
 * by the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU Affero General Public License for more details.
 *
 * You should have received a copy of the GNU Affero General Public License
 * along with this program. If not, see <https://www.gnu.org/licenses/>.
 */

package dev.dnpm.etl.processor.services

import com.fasterxml.jackson.databind.ObjectMapper
import de.ukw.ccc.bwhc.dto.Consent
import de.ukw.ccc.bwhc.dto.Diagnosis
import de.ukw.ccc.bwhc.dto.Icd10
import de.ukw.ccc.bwhc.dto.MtbFile
import org.assertj.core.api.Assertions.assertThat
import org.junit.jupiter.api.BeforeEach
import org.junit.jupiter.api.Test

class TransformationServiceTest {

    private lateinit var service: TransformationService

    @BeforeEach
    fun setup() {
        this.service = TransformationService(
            ObjectMapper(), listOf(
                Transformation.of("consent.status") from Consent.Status.ACTIVE to Consent.Status.REJECTED,
                Transformation.of("diagnoses[*].icd10.version") from "2013" to "2014",
            )
        )
    }

    @Test
    fun shouldTransformMtbFile() {
        val mtbFile = MtbFile.builder().withDiagnoses(
            listOf(
                Diagnosis.builder().withId("1234").withIcd10(Icd10("F79.9").also {
                    it.version = "2013"
                }).build()
            )
        ).build()

        val actual = this.service.transform(mtbFile)

        assertThat(actual).isNotNull
        assertThat(actual.diagnoses[0].icd10.version).isEqualTo("2014")
    }

    @Test
    fun shouldOnlyTransformGivenValues() {
        val mtbFile = MtbFile.builder().withDiagnoses(
            listOf(
                Diagnosis.builder().withId("1234").withIcd10(Icd10("F79.9").also {
                    it.version = "2013"
                }).build(),
                Diagnosis.builder().withId("5678").withIcd10(Icd10("F79.8").also {
                    it.version = "2019"
                }).build()
            )
        ).build()

        val actual = this.service.transform(mtbFile)

        assertThat(actual).isNotNull
        assertThat(actual.diagnoses[0].icd10.code).isEqualTo("F79.9")
        assertThat(actual.diagnoses[0].icd10.version).isEqualTo("2014")
        assertThat(actual.diagnoses[1].icd10.code).isEqualTo("F79.8")
        assertThat(actual.diagnoses[1].icd10.version).isEqualTo("2019")
    }

    @Test
    fun shouldTransformMtbFileWithConsentEnum() {
        val mtbFile = MtbFile.builder().withConsent(
            Consent("123", "456", Consent.Status.ACTIVE)
        ).build()

        val actual = this.service.transform(mtbFile)

        assertThat(actual.consent).isNotNull
        assertThat(actual.consent.status).isEqualTo(Consent.Status.REJECTED)
    }

}
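The new test exercises a small infix DSL: `Transformation.of("path") from OLD to NEW`. A minimal sketch of a builder that supports exactly this call shape; the repository's real `Transformation` class may be implemented differently:

```kotlin
// Minimal sketch of a fluent builder compatible with the usage in the test above.
class Transformation private constructor(val path: String) {
    var existingValue: Any? = null
        private set
    var newValue: Any? = null
        private set

    infix fun from(value: Any): Transformation = apply { existingValue = value }

    infix fun to(value: Any): Transformation = apply { newValue = value }

    companion object {
        fun of(path: String) = Transformation(path)
    }
}

fun main() {
    val t = Transformation.of("diagnoses[*].icd10.version") from "2013" to "2014"
    println("${t.path}: ${t.existingValue} -> ${t.newValue}")
}
```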
@@ -45,8 +45,8 @@ class KafkaResponseProcessorTest {

     private lateinit var kafkaResponseProcessor: KafkaResponseProcessor

-    private fun createkafkaRecord(
-        requestId: String? = null,
+    private fun createKafkaRecord(
+        requestId: String,
         statusCode: Int = 200,
         statusBody: Map<String, Any>? = mapOf()
     ): ConsumerRecord<String, String> {
@@ -54,15 +54,11 @@ class KafkaResponseProcessorTest {
             "test-topic",
             0,
             0,
-            if (requestId == null) {
-                null
-            } else {
-                this.objectMapper.writeValueAsString(KafkaResponseProcessor.ResponseKey(requestId))
-            },
+            null,
             if (statusBody == null) {
                 ""
             } else {
-                this.objectMapper.writeValueAsString(KafkaResponseProcessor.ResponseBody(statusCode, statusBody))
+                this.objectMapper.writeValueAsString(KafkaResponseProcessor.ResponseBody(requestId, statusCode, statusBody))
             }
         )
     }
@@ -78,23 +74,57 @@ class KafkaResponseProcessorTest {
     }

     @Test
-    fun shouldNotProcessRecordsWithoutValidKey() {
-        this.kafkaResponseProcessor.onMessage(createkafkaRecord(null, 200))
+    fun shouldNotProcessRecordsWithoutRequestIdInBody() {
+        val record = ConsumerRecord<String, String>(
+            "test-topic",
+            0,
+            0,
+            null,
+            """
+                {
+                    "statusCode": 200,
+                    "statusBody": {}
+                }
+            """.trimIndent()
+        )

-        verify(eventPublisher, never()).publishEvent(any())
+        this.kafkaResponseProcessor.onMessage(record)
+
+        verify(eventPublisher, never()).publishEvent(any<ResponseEvent>())
     }

     @Test
-    fun shouldNotProcessRecordsWithoutValidBody() {
-        this.kafkaResponseProcessor.onMessage(createkafkaRecord(requestId = "TestID1234", statusBody = null))
+    fun shouldProcessRecordsWithAliasNames() {
+        val record = ConsumerRecord<String, String>(
+            "test-topic",
+            0,
+            0,
+            null,
+            """
+                {
+                    "request_id": "test0123456789",
+                    "status_code": 200,
+                    "status_body": {}
+                }
+            """.trimIndent()
+        )

-        verify(eventPublisher, never()).publishEvent(any())
+        this.kafkaResponseProcessor.onMessage(record)
+
+        verify(eventPublisher, times(1)).publishEvent(any<ResponseEvent>())
+    }
+
+    @Test
+    fun shouldNotProcessRecordsWithoutValidStatusBody() {
+        this.kafkaResponseProcessor.onMessage(createKafkaRecord(requestId = "TestID1234", statusBody = null))
+
+        verify(eventPublisher, never()).publishEvent(any<ResponseEvent>())
     }

     @ParameterizedTest
     @MethodSource("statusCodeSource")
     fun shouldProcessValidRecordsWithStatusCode(statusCode: Int) {
-        this.kafkaResponseProcessor.onMessage(createkafkaRecord("TestID1234", statusCode))
+        this.kafkaResponseProcessor.onMessage(createKafkaRecord("TestID1234", statusCode))
         verify(eventPublisher, times(1)).publishEvent(any<ResponseEvent>())
     }
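The reworked tests show that the response key was dropped, the request ID moved into the response body, and snake_case field names (`request_id`, `status_code`, `status_body`) must be accepted as aliases. A sketch of a Jackson-friendly body with that alias handling (the class is a stand-in for `KafkaResponseProcessor.ResponseBody`, whose real annotations may differ):

```kotlin
// Sketch: a response body readable from both camelCase and the snake_case
// aliases exercised by the tests above.
import com.fasterxml.jackson.annotation.JsonAlias
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.kotlin.registerKotlinModule

data class ResponseBody(
    @JsonAlias("request_id") val requestId: String,
    @JsonAlias("status_code") val statusCode: Int,
    @JsonAlias("status_body") val statusBody: Map<String, Any>
)

fun main() {
    val json = """{"request_id": "test0123456789", "status_code": 200, "status_body": {}}"""
    val body = ObjectMapper().registerKotlinModule().readValue(json, ResponseBody::class.java)
    println(body.requestId)  // test0123456789
}
```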
src/test/resources/fake_MTBFile.json (new file, 1 line)
File diff suppressed because one or more lines are too long