A little Kafka daemon for shipping logs from the file system to topics.
The configuration format is compatible with the Amazon Kinesis Agent's. An example `agent.json`:
```json
{
  "checkpointFile": "/tmp/aws-kinesis-agent-checkpoints/main.log",
  "kinesis.endpoint": "https://kinesis.us-west-2.amazonaws.com",
  "flows": [
    {
      "filePattern": "/tmp/aws-kinesis-agent-test1.log*",
      "initialPosition": "END_OF_FILE",
      "kinesisStream": "aws-kinesis-agent-test1",
      "dataProcessingOptions": [
        {
          "optionName": "CSVTOJSON",
          "customFieldNames": [ "field1", "field2", "field3" ],
          "delimiter": "\\t"
        }
      ]
    },
    {
      "filePattern": "/tmp/aws-kinesis-agent-test2.log*",
      "initialPosition": "START_OF_FILE",
      "kinesisStream": "aws-kinesis-agent-test2",
      "dataProcessingOptions": [
        {
          "optionName": "LOGTOJSON",
          "logFormat": "COMMONAPACHELOG"
        }
      ]
    }
  ]
}
```
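For a rough sense of what the `dataProcessingOptions` above do, assuming filekäfer mirrors the Kinesis Agent's documented behavior: `CSVTOJSON` turns each delimiter-separated line into a JSON object keyed by `customFieldNames`, and `LOGTOJSON` parses Apache common log lines into JSON fields. A sketch for the first flow:

```
# a tab-separated line appended to /tmp/aws-kinesis-agent-test1.log
alpha	beta	gamma

# the record shipped to the topic named by kinesisStream (assumed mapping)
{"field1": "alpha", "field2": "beta", "field3": "gamma"}
```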
Possible values for `initialPosition` are:

- START_OF_FILE
- END_OF_FILE

As with the Kinesis Agent, START_OF_FILE ships a file's existing contents from the beginning, while END_OF_FILE only ships lines appended after the agent starts.
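Assuming the only required flow keys are the ones shown above (the full schema isn't spelled out here), a minimal flow that ships lines verbatim might look like the following; the `filePattern` and `kinesisStream` values are placeholders:

```json
{
  "filePattern": "/var/log/app/*.log",
  "initialPosition": "END_OF_FILE",
  "kinesisStream": "app-logs"
}
```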
Working with filekäfer is relatively straightforward, assuming Rust is already present on the system. Docker and `docker-compose` are recommended, as is `kafkacat` for debugging.
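For debugging, a consumer like the one below can confirm that records are arriving; the broker address here assumes the `docker-compose` default of `localhost:9092`, so check the compose file:

```sh
# consume the topic fed by the first flow, starting from the earliest offset
kafkacat -C -b localhost:9092 -t aws-kinesis-agent-test1 -o beginning
```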
Running `make` should list the available targets with a short description of each. For the most part, `make run`, `make check`, and `make watch` are the targets used during development of filekäfer.
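A typical development loop might look like this; the per-target descriptions are assumptions, since `make` itself prints the authoritative list:

```sh
make          # list available targets with short descriptions
make check    # assumed: run tests and lints
make run      # assumed: build and start filekäfer locally
make watch    # assumed: rebuild and rerun whenever sources change
```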