
Commit

…systeme/AW40-hub-docker into bugfix/66602_keycloak_access_token_is_not_refreshing_properly
mgr committed Dec 6, 2024
2 parents 9d0bc6e + 6615153 commit 9eb6161
Showing 91 changed files with 4,777 additions and 393 deletions.
3 changes: 1 addition & 2 deletions LICENSE
Original file line number Diff line number Diff line change
@@ -1,6 +1,6 @@
MIT License

Copyright (c) 2023 Timo Thurow
Copyright (c) 2023 Hochschule Osnabrück, LMIS AG, THGA, BO-I-T

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
@@ -19,4 +19,3 @@ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

20 changes: 15 additions & 5 deletions README.md
@@ -1,5 +1,15 @@
# AW40-hub-docker

# AW40-HUB

<p align="left">
<a href="https://editor.swagger.io/?url=https://raw.githubusercontent.com/FieldRobotEvent/REST-API-24/main/docs/static/openapi.json"><img src="https://img.shields.io/badge/open--API-V3.1-brightgreen.svg?style=flat&label=OpenAPI" alt="OpenAPI"/></a>
<a href="https://www.python.org/"><img src="https://img.shields.io/badge/Python-3.12-3776AB.svg?style=flat&logo=python&logoColor=white" alt="Python 3.12"/></a>
<a href="https://fastapi.tiangolo.com/"><img src="https://img.shields.io/badge/FastAPI-0.112.2-009688.svg?style=flat&logo=FastAPI&logoColor=white" alt="FastAPI"/></a>
<a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-blue.svg" alt="License: MIT"/></a>
</p>

## Description
This is the prototype implementation of the AW4.0 HUB architecture, part of the [Car Repair 4.0](https://www.autowerkstatt40.org/en/) research project. The purpose of the HUB is to enable car workshops to use AI-driven diagnostics, to persist data acquired from cars in a database, and to participate, alongside other car workshops and AI model providers, in a [Gaia-X](https://gaia-x.eu/) compatible dataspace to sell data and acquire new AI models.
The name AW40 is a shortened form of the German project title "Autowerkstatt 4.0".

## Requirements

- Docker v25.0 or later (run `docker --version`)
@@ -10,7 +10,7 @@ If you just need to update buildx, see [this section](#updating-docker-buildx-bu

## Overview

Prototype implementation of the AW40 HUB Architecture on Docker\
This is the prototype implementation of the AW4.0 HUB Architecture.\
Currently included services:

| Service (see [docker-compose.yml](docker-compose.yml)) | Description |
@@ -30,8 +40,8 @@ Currently included services:

## Usage

### Start the developement HUB
**WARNING: DO NOT RUN THE DEVELOPEMENT HUB ON PUBLIC SERVER**\
### Start the development HUB
**WARNING: DO NOT RUN THE DEVELOPMENT HUB ON A PUBLIC SERVER**\
To start the HUB in developer mode use:\
```docker compose --env-file=dev.env --profile full up -d```

3 changes: 3 additions & 0 deletions api/Dockerfile
@@ -13,6 +13,9 @@ RUN groupadd -r api && \
# use api users home directory as workdir
WORKDIR /home/api

# create directory to store asset data and chown to api user
RUN mkdir /home/api/asset-data && chown api:api /home/api/asset-data

# install minimal requirements
COPY ./requirements.txt /home/api/requirements.txt
RUN pip install --upgrade pip && \
1 change: 1 addition & 0 deletions api/api.env
@@ -1,5 +1,6 @@
API_ALLOW_ORIGINS=${API_ALLOW_ORIGINS:?error}
API_KEY_DIAGNOSTICS=${API_KEY_DIAGNOSTICS:?err}
API_KEY_ASSETS=${API_KEY_ASSETS:?err}
MONGO_HOST=mongo
MONGO_USERNAME=${MONGO_API_USERNAME:-mongo-api-user}
MONGO_PASSWORD=${MONGO_API_PASSWORD:?error}
11 changes: 11 additions & 0 deletions api/api/data_management/__init__.py
@@ -1,4 +1,11 @@
__all__ = [
"NewAsset",
"Asset",
"AssetDefinition",
"AssetMetaData",
"NewPublication",
"Publication",
"AssetDataStatus",
"NewCase",
"Case",
"CaseUpdate",
@@ -31,6 +38,10 @@
"BaseSignalStore"
]

from .assets import (
NewAsset, AssetDefinition, Asset, AssetMetaData, Publication,
NewPublication, AssetDataStatus
)
from .case import NewCase, Case, CaseUpdate
from .customer import Customer, CustomerBase, CustomerUpdate
from .diagnosis import (
182 changes: 182 additions & 0 deletions api/api/data_management/assets.py
@@ -0,0 +1,182 @@
import json
import os
from datetime import datetime, UTC
from enum import Enum
from typing import Optional, Annotated, ClassVar, Literal
from zipfile import ZipFile

from beanie import Document, before_event, Delete
from pydantic import BaseModel, StringConstraints, Field

from .case import Case


class AssetDataStatus(str, Enum):
defined = "defined"
processing = "processing"
ready = "ready"


class AssetDefinition(BaseModel):
"""
Defines filter conditions that cases have to match to be included in an
asset.
"""
vin: Optional[
Annotated[str, StringConstraints(min_length=3, max_length=9)]
] = Field(
default=None,
description="Partial VIN used to filter cases for inclusion in the "
"asset."
)
obd_data_dtc: Optional[
Annotated[str, StringConstraints(min_length=5, max_length=5)]
] = Field(
default=None,
description="DTC that has to be present in a case's OBD datasets for "
"inclusion in the asset."
)
timeseries_data_component: Optional[str] = Field(
default=None,
description="Timeseries data component that has to be present in a "
"case's timeseries datasets for inclusion in the asset."
)


class PublicationNetwork(str, Enum):
pontusxdev = "PONTUSXDEV"
pontusxtest = "PONTUSXTEST"


class PublicationBase(BaseModel):
network: PublicationNetwork = Field(
description="Network that an asset is available in via this "
"publication",
default=PublicationNetwork.pontusxdev
)
license: str = "CUSTOM"
price: float = 1.0


class NewPublication(PublicationBase):
"""Schema for new asset publications."""
nautilus_private_key: str = Field(
description="Key for dataspace authentication."
)


class Publication(PublicationBase):
"""Publication information for an asset."""
did: str = Field(
description="Id of this publication within its network."
)
asset_url: str = Field(
description="URL to access asset data from the network."
)
asset_key: str = Field(
description="Publication specific key to access data via `asset_url`.",
exclude=True
)


class AssetMetaData(BaseModel):
name: str
definition: AssetDefinition
description: str
timestamp: datetime = Field(default_factory=lambda: datetime.now(UTC))
type: Literal["dataset"] = "dataset"
author: str


class Asset(AssetMetaData, Document):
"""DB schema and interface for assets."""

class Settings:
name = "assets"

data_status: AssetDataStatus = AssetDataStatus.defined
publication: Optional[Publication] = None

asset_data_dir_path: ClassVar[str] = "asset-data"

@staticmethod
def _publication_case_json(case: Case) -> str:
"""Convert a Case into a publication ready json string."""
# Keep WMI+VDS from VIN and mask VIS. See
# https://de.wikipedia.org/wiki/Fahrzeug-Identifizierungsnummer#Aufbau
case.vehicle_vin = case.vehicle_vin[:9] + 8*"*"
# Exclude fields only relevant for internal data management from case
exclude = {
field: True for field in [
"customer_id", "workshop_id", "diagnosis_id",
"timeseries_data_added", "obd_data_added", "symptoms_added",
"status"
]
}
# Exclude fields only relevant for internal data management from
# submodels
for data_submodel in ["timeseries_data", "obd_data", "symptoms"]:
exclude[data_submodel] = {"__all__": {"data_id"}}

case_json = case.model_dump_json(exclude=exclude, indent=1)
return case_json
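
The anonymization step above can be sketched in isolation: for a standard 17-character VIN, the first 9 characters (WMI+VDS) survive and the 8-character VIS is masked. The example VIN is a made-up placeholder:

```python
def mask_vin(vin: str) -> str:
    """Keep the WMI+VDS prefix (first 9 chars) and mask the 8-char VIS."""
    return vin[:9] + 8 * "*"


# Hypothetical 17-character VIN.
print(mask_vin("WVWZZZ1JZXW000001"))  # -> WVWZZZ1JZ********
```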

@property
def data_file_name(self):
"""Zip file name of the asset's dataset."""
return f"{str(self.id)}.zip"

@property
def data_file_path(self):
"""Path to this asset's dataset."""
return os.path.join(
self.asset_data_dir_path, self.data_file_name
)

async def process_definition(self):
"""
Process the definition of an Asset to prepare the defined data for
publication in a dataspace.
"""
self.data_status = AssetDataStatus.processing
await self.save()
# Find all cases matching the definition
cases = await Case.find_in_hub(
vin=self.definition.vin,
obd_data_dtc=self.definition.obd_data_dtc,
timeseries_data_component=self.definition.timeseries_data_component
)
# Create a new zip archive for this asset
with ZipFile(self.data_file_path, "x") as archive:
archive.mkdir("cases")
archive.mkdir("signals")
for case in cases:
case_id = str(case.id)
case_json = self._publication_case_json(case)
archive.writestr(
f"cases/{case_id}.json", data=case_json
)
for tsd in case.timeseries_data:
signal_id = str(tsd.signal_id)
signal = await tsd.get_signal()
archive.writestr(
f"signals/{signal_id}.json", data=json.dumps(signal)
)

self.data_status = AssetDataStatus.ready
await self.save()

@before_event(Delete)
def _delete_asset_data(self):
"""Remove associated data when asset is deleted."""
# If there is an archive file associated with this asset, delete it.
if os.path.exists(self.data_file_path):
os.remove(self.data_file_path)


class NewAsset(BaseModel):
    """Schema for new assets added via the API."""
name: str
definition: Optional[AssetDefinition] = AssetDefinition()
description: str
author: str
44 changes: 40 additions & 4 deletions api/api/data_management/case.py
Expand Up @@ -126,19 +126,55 @@ async def find_in_hub(
cls,
customer_id: Optional[str] = None,
vin: Optional[str] = None,
workshop_id: Optional[str] = None
workshop_id: Optional[str] = None,
obd_data_dtc: Optional[str] = None,
timeseries_data_component: Optional[str] = None
) -> List[Self]:
"""
Get list of all cases filtered by customer_id, vehicle_vin and
workshop_id.
        Get list of all cases with optional filtering by customer_id,
        vehicle_vin, workshop_id, obd_data dtc or timeseries_data component.
Parameters
----------
customer_id
Customer Id to search for. Only cases associated with the specified
customer are returned
vin
(Partial) VIN to search for. The specified parameter value is
matched against the beginning of the stored vins.
This allows partial vin specification e.g. to search for cases with
vehicles by a specific manufacturer.
workshop_id
Workshop Id to search for. Only cases from the specified workshop
are returned.
obd_data_dtc
DTC to search for. Only cases with at least one occurrence of the
specified dtc in any of the OBD datasets are returned.
timeseries_data_component
Timeseries data component to search for. Only cases that contain at
least one timeseries dataset for the specified component are
returned.
Returns
-------
List of cases matching the specified search criteria.
"""
filter = {}
if customer_id is not None:
filter["customer_id"] = PydanticObjectId(customer_id)
if vin is not None:
filter["vehicle_vin"] = vin
# VIN is matched against beginning of stored vins
filter["vehicle_vin"] = {"$regex": f"^{vin}"}
if workshop_id is not None:
filter["workshop_id"] = workshop_id
if obd_data_dtc is not None:
# Only return cases that contain the specified dtc in any
# of the obd datasets
filter["obd_data.dtcs"] = obd_data_dtc
if timeseries_data_component is not None:
# Only return cases that contain a timeseries dataset with the
# specified component
filter["timeseries_data.component"] = timeseries_data_component

cases = await cls.find(filter).to_list()
return cases
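
The filter built above can be sketched as a standalone function (illustrative only; in the real method Beanie passes this dict to MongoDB, where comparing a scalar against an array field such as `obd_data.dtcs` matches any element):

```python
from typing import Optional


def build_case_filter(
    vin: Optional[str] = None,
    obd_data_dtc: Optional[str] = None,
    timeseries_data_component: Optional[str] = None,
) -> dict:
    """Illustrative reconstruction of the MongoDB filter dict."""
    filter: dict = {}
    if vin is not None:
        # Anchor the regex so a partial VIN matches as a prefix.
        filter["vehicle_vin"] = {"$regex": f"^{vin}"}
    if obd_data_dtc is not None:
        # A scalar comparison on an array field matches any element.
        filter["obd_data.dtcs"] = obd_data_dtc
    if timeseries_data_component is not None:
        filter["timeseries_data.component"] = timeseries_data_component
    return filter


print(build_case_filter(vin="WVW", obd_data_dtc="P0301"))
# -> {'vehicle_vin': {'$regex': '^WVW'}, 'obd_data.dtcs': 'P0301'}
```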
5 changes: 5 additions & 0 deletions api/api/dataspace_management/__init__.py
@@ -0,0 +1,5 @@
__all__ = [
"Nautilus"
]

from .nautilus import Nautilus