# Releases · dimajix/flowman
## 0.16.0

## 0.15.0
- New configuration variable `flowman.default.target.rebalance`
- New configuration variable `flowman.default.target.parallelism`
- Changed behaviour: the `mergeFiles` target no longer assumes that the target is local. If you already use `mergeFiles` with a local file, you need to prefix the target file name with `file://`.
- Add new `-t` argument for selectively building a subset of targets
- Remove example-plugin
- Add quickstart guide
- Add new "flowman-parent" BOM for projects using Flowman
- Move `com.dimajix.flowman.annotations` package to `com.dimajix.flowman.spec.annotations`
- Add new log redaction
- Integrate Scala code coverage analysis
- `assemble` will fail when trying to use non-existing columns
- Move `swagger` and `json` schema support into separate plugins
- Change default build to Spark 3.0 and Hadoop 3.2
- Update Spark to 3.0.2
- Rename class `Executor` to `Execution` - watch your plugins!
- Implement new configurable `Executor` class for executing build targets
- Add build profile for Spark 3.1.x
- Update ScalaTest to 3.2.5 - watch your unit tests for the changed ScalaTest API!
- Add new `case` mapping
- Add new `--dry-run` command line option
- Add new `mock` and `null` mapping types
- Add new `mock` relation
- Add new `values` mapping
- Add new `values` dataset
- Implement new testing capabilities
- Rename the `update` mapping to `upsert`, which better describes its functionality
- Introduce new `VALIDATE` phase, which is executed even before the `CREATE` phase
- Implement new `validate` and `verify` targets
- Implement new `deptree` command in the Flowman shell
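The two new `flowman.default.target.*` variables are plain configuration properties. A minimal, illustrative fragment for setting them — only the property names come from the notes above; the values and the idea of a properties-style file are assumptions, and the exact place such settings belong in a Flowman deployment may differ:

```
# Illustrative only: property names from the 0.15.0 notes, values assumed.
# Repartition outputs before writing, and use 16 output partitions.
flowman.default.target.rebalance=true
flowman.default.target.parallelism=16
```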
## 0.14.2

## 0.14.1

## 0.14.0
- Fix AWS plugin for Hadoop 3.x
- Improve setup of logging
- Shade Velocity for better interoperability with Spark 3
- Add new web hook facility in namespaces and jobs
- Existing targets will not be overwritten anymore by default. Either use the `--force` command line option, or set the configuration property `flowman.execution.target.forceDirty` to `true` for the old behaviour.
- Add new `--keep-going` command line option
- Implement new `com.dimajix.spark.io.DeferredFileCommitProtocol`, which can be used by setting the Spark configuration parameter `spark.sql.sources.commitProtocolClass`
- Add new `flowshell` application
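The overwrite behaviour and the commit protocol mentioned above are both controlled through configuration properties. A hedged sketch of setting them together — only the property names and the protocol class name are taken from the notes; the properties-file layout is an assumption:

```
# Hedged sketch: restore the pre-0.14 "always overwrite" behaviour and opt
# into the deferred file commit protocol. Names from the 0.14.0 notes;
# where these lines live in a real deployment is an assumption.
flowman.execution.target.forceDirty=true
spark.sql.sources.commitProtocolClass=com.dimajix.spark.io.DeferredFileCommitProtocol
```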