Releases: AbsaOSS/atum-service
v0.3.0
Breaking Changes 💥
- Additional data methods of `AtumContext` use REST API v2 (Agent 0.3.0+ is incompatible with server 0.2.0) by @lsulak in #283
- Full Flyway integration developed by @benedeki in #276
New Features 🎉
- Atum server REST API v2 developed by @salamonpavel, @TebaleloS, @lsulak, @benedeki in #140
- Introduced response envelopes providing additional metadata (requestId) for REST API v2 endpoints by @salamonpavel in #197 (see the decoding sketch after this list)
- Replaced the Json4s and Jackson serialization libraries with Circe by @TebaleloS, @salamonpavel, @benedeki in #214
- Introduced a health API endpoint in the form that StatusBoard projects expect by @salamonpavel in #282
- Dockerfile and application configuration verified for deployment with ZIO and the Http4s web server by @salamonpavel in #274
- Dockerfile adjusted to the ZIO framework; custom configuration is now passed during `docker run`, i.e. independently of the sbt and docker builds, by @lsulak in #279
Silent Live 🤫
- Introduced the Reader module to make reading of information stored in the Atum server easy, by @benedeki in #248 (not published yet, only in the code base)
- Atum server REST API v2 endpoints developed by @salamonpavel, @TebaleloS, @lsulak, @benedeki in #140
- Numerous other endpoints are implemented besides those mentioned above. We still discourage their usage, though, as they are subject to change, particularly their payloads.
Full Changelog
v0.1.1
This version fixes the configuration of the application in the Dockerized environment.
Bugfixes 🛠
- Wrong format of the `application.properties` file by @lsulak in #130
- Renamed the `application.properties` file to be just a template and not the real one; the docker image MUST provide it, by @lsulak in #131
- Bugfix: removed hardcoded `application.properties` references from the code and added the ability to use config from the `SPRING_CONFIG_LOCATION` env var, by @lsulak in #132
Full Changelog: v0.1.0...v0.1.1
v0.2.0
Breaking Changes 💥
- Dropped support of Spark 2.4 by @benedeki, @lsulak, @salamonpavel in #193
- Server moved from Spring to ZIO/Tapir by @salamonpavel in #145
  - As the application now includes the Http4s Blaze server backend, there is no need for a servlet container such as Tomcat.
  - The application is packaged as a JAR file and run directly using `java -jar`.
- Server requires Java 11 platform by @salamonpavel in #151
- The groupId of the libraries changed from `za.co.absa` to `za.co.absa.atum-service` (see the sbt sketch below)
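For library consumers, the groupId change means updating dependency coordinates in `build.sbt`. A minimal sketch; the artifact name and version are illustrative placeholders, only the groupId change itself comes from this release note:

```scala
// build.sbt - hypothetical artifact name and version; verify the exact
// coordinates on Maven Central before use.
libraryDependencies += "za.co.absa.atum-service" %% "atum-agent" % "0.2.0"
```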
New Features 🎉
- Flows can now be identified by their "main partitioning" - the partitioning they were created from, by @benedeki in #178
- Implemented monitoring of the Atum server's runtime and of the HTTP server's communication by @salamonpavel, @benedeki in #166
- Database functions (API) to get checkpoints of a partitioning or flow by @lsulak in #187 and @TebaleloS, @benedeki in #189
- To improve testability of the agent, the `AtumAgent` class was refactored and `CapturingDispatcher` (in-memory storage of server requests) was added by @filiphornak, @benedeki in #97
- Integration tests defined, distinguished from unit tests, and added to CI/CD by @miroslavpojer in #185
- Partitioning is now checked to be in the expected JSON format upon write to the DB by @lsulak, @benedeki in #69
- DB login credentials are read from AWS Secrets Manager by @TebaleloS, @lsulak in #107 (see the sketch after this list)
- Using the Fa-Db library with Doobie as the engine instead of Slick by @salamonpavel in #148
- Atum server is now built using Scala 2.13 by @salamonpavel in #149
- Ability to save and retrieve Additional data (additional metadata) with `AtumContext` by @benedeki, @lsulak, @salamonpavel, @TebaleloS in #36
- Measures now require 0-n columns in their definition instead of exactly one (depending on the function's nature) by @salamonpavel in #100
- `AtumContext` content is now properly read from the Atum Server by @benedeki, @lsulak in #59
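The general approach behind reading DB credentials from AWS Secrets Manager (#107) can be sketched with the AWS SDK for Java v2. The secret name and the layout of the secret string below are assumptions, not taken from the Atum codebase:

```scala
import software.amazon.awssdk.services.secretsmanager.SecretsManagerClient
import software.amazon.awssdk.services.secretsmanager.model.GetSecretValueRequest

// "atum/db-credentials" is a placeholder secret name; the real one is deployment-specific.
val client = SecretsManagerClient.create()
val request = GetSecretValueRequest.builder()
  .secretId("atum/db-credentials")
  .build()

// The secret string is typically a small JSON document with username/password,
// which would be parsed (e.g. with Circe) before configuring the DB connection pool.
val secretString: String = client.getSecretValue(request).secretString()
```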
Bugfixes 🛠
- A request for `AtumContext` containing a custom/unknown measure will no longer fail by @salamonpavel in #170
- Sbt cross-build fixed by @benedeki, @salamonpavel in #184
Full Changelog
v0.1.0
Initial release of the Atum service
Server
- has two endpoints (see the sketch after this list):
  - `/api/v1/createPartitioning` to register or retrieve a partitioning and optionally establish a relation with another partitioning
  - `/api/v1/createCheckpoint` to record measurement data
- connects to a Postgres DB that stores the data
- a newly created partitioning automatically contains the count function as a measure
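Since the server speaks plain HTTP, a checkpoint can be recorded with the JDK's built-in HTTP client (Java 11+). A minimal sketch; the base URL and all JSON field names in the payload are assumptions for illustration, not the documented v1 request schema:

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

// Hypothetical payload; field names are illustrative, not the real v1 schema.
val payload =
  """{
    |  "partitioning": [{"key": "country", "value": "ZA"}],
    |  "checkpointName": "after-load",
    |  "measurements": [{"measure": "count", "result": "12345"}]
    |}""".stripMargin

val client = HttpClient.newHttpClient()
val request = HttpRequest.newBuilder()
  .uri(URI.create("http://localhost:8080/api/v1/createCheckpoint")) // host and port assumed
  .header("Content-Type", "application/json")
  .POST(HttpRequest.BodyPublishers.ofString(payload))
  .build()

val response = client.send(request, HttpResponse.BodyHandlers.ofString())
println(s"${response.statusCode()}: ${response.body()}")
```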
Agent
- spawns a context based on the provided key (partitioning)
- adds measuring functions (illustrated in plain Spark after this list); currently supported are:
  - count
  - distinctCount
  - aggregatedTotal - sum of values in the column
  - absAggregatedTotal
  - hashCrc32
- provides interfaces to measure data completeness on DataFrames (create checkpoints)
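What the listed measure functions compute can be illustrated with plain Spark aggregations. This is not the agent's API, only an explanatory sketch of each measure's meaning on a toy DataFrame:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().appName("measure-sketch").master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq(("A", 10.0), ("B", -5.0), ("A", 2.5)).toDF("key", "amount")

val rowCount     = df.count()                                        // count
val distinctKeys = df.select("key").distinct().count()               // distinctCount on "key"
val aggTotal     = df.agg(sum($"amount")).first().getDouble(0)       // aggregatedTotal on "amount"
val absAggTotal  = df.agg(sum(abs($"amount"))).first().getDouble(0)  // absAggregatedTotal on "amount"
// hashCrc32-style checksum: CRC32 of each row's concatenated columns, summed up
val crcChecksum  = df.agg(sum(crc32(concat_ws("|", df.columns.map(col): _*)))).first().getLong(0)
```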
Database
- created, including DB Roles and an ownership model of the database objects for the Roles
- stores and processes data related to:
  - Partitioning
  - Additional Data
  - Measurement
  - Measure Definition
  - Checkpoint
  - Flow - a concept of how to describe the data as they go through the systems and how different partitionings relate