Integrate data mesh over platform observability #57
Working on an initial POC for data mesh on application observability, and also a reservation system for an airline customer. The goal is to create an event database to be used as a source for AI models doing anomaly detection.
@jpaulrajredhat as discussed, let's start introducing the observability components into the data mesh deployment through this issue, please. The end state we are looking at is metrics, logs, and traces stored for long-term queries in MinIO, with the ability to query them through Trino. This will be the starting point for AIOps pipelines in the future.
Sure, I'll do this this week.
The MinIO and Airflow components are already in the datamesh repo; the only things we need to add are the OpenTelemetry Collector, the Elasticsearch APM components, and Kafka. For a single-pane-of-glass view, is Elasticsearch APM the right option, or do we need to build a custom dashboard?
Integrate with OpenTelemetry to export metrics, logs, and traces from the platform (and potentially Kepler) into data mesh ingestion. For this, we focus on the technical stacks to be used long term by our engineering team for metrics, logs, and traces collection in the platform:
Metrics: Prometheus / Thanos
Logs: Loki / Vector
Traces: Jaeger / OpenTelemetry
The proposed approach would create a single data-delivery layer for the metrics, logs, and traces collected and stored (potentially ingested through Trino / Iceberg).
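As one possible starting point, the OpenTelemetry Collector described above could be wired to forward all three signals into Kafka for downstream ingestion. This is only a sketch: the broker address is a placeholder, and the exact receiver/exporter set would depend on how Prometheus, Loki, and Jaeger are actually bridged in the deployment.

```yaml
# Sketch of an OpenTelemetry Collector config for the proposed flow.
# Broker address is a placeholder, not a value from this repo.
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317

processors:
  batch: {}

exporters:
  kafka:
    brokers: ["kafka:9092"]  # placeholder; point at the deployed Kafka cluster
    # With no explicit topic, the kafka exporter uses per-signal defaults
    # (otlp_spans / otlp_metrics / otlp_logs).

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [kafka]
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [kafka]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [kafka]
```

From there, an Airflow job (or a Kafka-to-Iceberg sink) could land the events in MinIO as Iceberg tables so Trino can query them long term.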