
Commit baaee63

Authored by Anmol Vijaywargiya (anmol1vw13) and shubhang.balkundi
Structured logging with ziggurat (#238)
* [Anmol|Shubhang] Add Structured logging capability
* Add test for setting decoder when using json format
* Remove log4j.test.xml
* Removed unused code
* Add mdc to text log
* Excluding cambium macros from coverage
* Quoting the cambium libs in cloverage
* Add default log format
* Change log level to error
* Handling decreased line coverage
* changes log formats and removes timezone block
* moves set-properties-for-structured-logging inside initialize-config
* Organize imports
* Structuring ziggurat logs (#239)
* Add structured logs to ziggurat logs
* Fix linting
* changes log formats and removes timezone block
* adds logging context to mapper flow and rabbitmq consume flow
* Resolve PR comments
* Updated changelog and project version for 4.2.0

Co-authored-by: Anmol Vijaywargiya <[email protected]>
Co-authored-by: shubhang.balkundi <[email protected]>
1 parent c8343fb commit baaee63

15 files changed: +191 −127 lines
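The headline change is structured logging through cambium: log calls take a context map whose keys become top-level JSON fields (with the JSON layout) or MDC entries (with the text layout). A minimal sketch of the API this diff adopts — the context keys here are made up for illustration, only `cambium.core`'s documented macros are assumed:

```clojure
(ns example.logging
  (:require [cambium.core :as clog]))

;; A context map plus a message; the map is merged into the MDC for
;; the duration of this log call.
(clog/info {:order-id "o-123" :attempt 1} "Processing order")

;; Context can also be scoped over a block of code, so every log call
;; inside it (including from called functions) carries the same keys:
(clog/with-logging-context {:consumer-group "payments"}
  (clog/info "polling for messages"))
```

This is the pattern the ziggurat consumer code below switches to, e.g. attaching `:consumer-group` to the whole poll loop.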

CHANGELOG.md

+5
@@ -2,6 +2,11 @@
 
 All notable changes to this project will be documented in this file. This change log follows the conventions of [keepachangelog.com](http://keepachangelog.com/).
 
+## 4.2.0
+- Enabled structured logging via [cambium](https://cambium-clojure.github.io/)
+- Replaced log4j with logback as the default slf4j implementation.
+- Added structured logs in ziggurat codebase as well.
+
 ## 4.1.0
 - Changed the code for consuming from RabbitMQ - on exception during de-serialization, message is sent to the dead-set
   queues and NOT re-queued back in the queue like previous versions.

project.clj

+18-6
@@ -2,7 +2,7 @@
 (cemerick.pomegranate.aether/register-wagon-factory!
   "http" #(org.apache.maven.wagon.providers.http.HttpWagon.))
 
-(defproject tech.gojek/ziggurat "4.1.0"
+(defproject tech.gojek/ziggurat "4.2.0"
   :description "A stream processing framework to build stateless applications on kafka"
   :url "https://github.com/gojektech/ziggurat"
   :license {:name "Apache License, Version 2.0"
@@ -30,8 +30,6 @@
                  [io.opentracing.contrib/opentracing-rabbitmq-client "0.1.11" :exclusions [com.rabbitmq/amqp-client]]
                  [org.apache.httpcomponents/fluent-hc "4.5.13"]
                  [org.apache.kafka/kafka-streams "2.8.0" :exclusions [org.slf4j/slf4j-log4j12 log4j]]
-                 [org.apache.logging.log4j/log4j-core "2.14.1"]
-                 [org.apache.logging.log4j/log4j-slf4j-impl "2.14.1"]
                  [org.clojure/clojure "1.10.3"]
                  [org.clojure/tools.logging "1.1.0"]
                  [nrepl/nrepl "0.8.3"]
@@ -53,7 +51,14 @@
                               com.fasterxml.jackson.core/jackson-core
                               com.fasterxml.jackson.dataformat/jackson-dataformat-smile
                               com.fasterxml.jackson.dataformat/jackson-dataformat-cbor]]
-                 [metosin/ring-swagger-ui "3.46.0"]]
+                 [metosin/ring-swagger-ui "3.46.0"]
+                 [cambium/cambium.core "1.1.0"]
+                 [cambium/cambium.codec-cheshire "1.0.0"]
+                 [cambium/cambium.logback.json "0.4.4"]
+                 [ch.qos.logback/logback-classic "1.2.3"]
+                 [ch.qos.logback.contrib/logback-json-classic "0.1.5"]
+                 [ch.qos.logback.contrib/logback-jackson "0.1.5"]
+                 [net.logstash.logback/logstash-logback-encoder "6.6"]]
  :deploy-repositories [["clojars" {:url "https://clojars.org/repo"
                                    :username :env/clojars_username
                                    :password :env/clojars_password
@@ -67,14 +72,21 @@
              :global-vars {*warn-on-reflection* true}
              :pedantic? :abort}
    :test {:java-source-paths ["src/com" "test/com"]
-          :jvm-opts ["-Dlog4j.configurationFile=resources/log4j2.test.xml"]
+          :jvm-opts ["-Dlogback.configurationFile=resources/logback.test.xml"]
           :dependencies [[com.google.protobuf/protobuf-java "3.17.0"]
                          [junit/junit "4.13.2"]
                          [org.hamcrest/hamcrest-core "2.2"]
                          [org.apache.kafka/kafka-streams "2.8.0" :classifier "test" :exclusions [org.slf4j/slf4j-log4j12 log4j]]
                          [org.apache.kafka/kafka-clients "2.8.0" :classifier "test"]
                          [org.clojure/test.check "1.1.0"]]
-          :plugins [[lein-cloverage "1.0.13" :exclusions [org.clojure/clojure]]]
+          :plugins [[lein-cloverage "1.2.2" :exclusions [org.clojure/clojure]]]
+          :cloverage {:exclude-call ['cambium.core/info
+                                     'cambium.core/debug
+                                     'cambium.core/trace
+                                     'cambium.core/warn
+                                     'cambium.core/error
+                                     'cambium.core/fatal
+                                     'cambium.core/with-logging-context]}
           :repositories [["confluent-repo" "https://packages.confluent.io/maven/"]]}
    :dev {:plugins [[lein-ancient "0.6.15"]
                    [lein-cljfmt "0.6.4"]

resources/config.test.edn

+2-1
@@ -85,4 +85,5 @@
                    :value-deserializer-class-config "org.apache.kafka.common.serialization.ByteArrayDeserializer"}}
   :tracer {:enabled [true :bool]
            :custom-provider ""}
-  :new-relic {:report-errors false}}}
+  :new-relic {:report-errors false}
+  :log-format "text"}}

resources/log4j2.test.xml

-21
This file was deleted.

resources/logback.test.xml

+35
@@ -0,0 +1,35 @@
+<configuration>
+    <property name="type" value="${ZIGGURAT_LOG_FORMAT:-text}"/>
+
+    <if condition='property("type").equals("json")'>
+        <then>
+            <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
+                <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
+                    <layout class="cambium.logback.json.FlatJsonLayout">
+                        <jsonFormatter class="ch.qos.logback.contrib.jackson.JacksonJsonFormatter">
+                            <prettyPrint>false</prettyPrint>
+                        </jsonFormatter>
+                        <timestampFormat>yyyy-MM-dd'T'HH:mm:ss.SSS'Z'</timestampFormat>
+                        <appendLineSeparator>true</appendLineSeparator>
+                    </layout>
+                </encoder>
+            </appender>
+        </then>
+    </if>
+
+    <if condition='property("type").equals("text")'>
+        <then>
+            <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
+                <layout class="ch.qos.logback.classic.PatternLayout">
+                    <pattern>
+                        [%-5level] %d [%t] %c:%M: %m { %mdc }%n
+                    </pattern>
+                </layout>
+            </appender>
+        </then>
+    </if>
+
+    <root level="error">
+        <appender-ref ref="STDOUT"/>
+    </root>
+</configuration>
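The `${ZIGGURAT_LOG_FORMAT:-text}` property in this config uses logback's default-value syntax: when the `ZIGGURAT_LOG_FORMAT` environment variable is not defined, the property resolves to `text`, so the plain pattern layout is chosen. The behaviour is a rough analogy to shell default parameter expansion (logback's substitution is its own mini-language, but the fallback semantics match):

```shell
# logback resolves ${ZIGGURAT_LOG_FORMAT:-text} to "text" when the
# variable is undefined, much like shell default expansion:
unset ZIGGURAT_LOG_FORMAT
echo "format=${ZIGGURAT_LOG_FORMAT:-text}"
# → format=text

export ZIGGURAT_LOG_FORMAT=json
echo "format=${ZIGGURAT_LOG_FORMAT:-text}"
# → format=json
```

So deployments flip the whole file between JSON and text output by exporting one environment variable, with no logback config change.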

src/ziggurat/config.clj

+2-1
@@ -39,7 +39,8 @@
    :http-server {:middlewares {:swagger {:enabled false}}
                  :port 8080
                  :thread-count 100}
-   :new-relic {:report-errors false}}})
+   :new-relic {:report-errors false}
+   :log-format "text"}})
 
 (defn- interpolate-val [val app-name]
   (if (string? val)
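With `:log-format` defaulting to `"text"` as above, an application opts into JSON logs from its own config file. A hypothetical `config.edn` fragment (the key path is inferred from the default config map; only the two values shown appear in this diff):

```clojure
;; Supported values: "text" (default) and "json".
{:ziggurat {:log-format "json"}}
```

Note this is the application-config counterpart of the `ZIGGURAT_LOG_FORMAT` environment variable used by the bundled logback test config.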

src/ziggurat/init.clj

+14-7
@@ -1,9 +1,11 @@
 (ns ziggurat.init
   "Contains the entry point for your application."
-  (:require [clojure.tools.logging :as log]
+  (:require [cambium.codec :as codec]
+            [cambium.logback.json.flat-layout :as flat]
+            [clojure.set :as set]
+            [clojure.tools.logging :as log]
             [mount.core :as mount :refer [defstate]]
             [schema.core :as s]
-            [clojure.set :as set]
             [ziggurat.config :refer [ziggurat-config] :as config]
             [ziggurat.messaging.connection :as messaging-connection :refer [connection]]
             [ziggurat.messaging.consumer :as messaging-consumer]
@@ -44,6 +46,10 @@
   (start-rabbitmq-connection args)
   (messaging-producer/make-queues (event-routes args)))
 
+(defn- set-properties-for-structured-logging []
+  (if (= (:log-format (ziggurat-config)) "json")
+    (flat/set-decoder! codec/destringify-val)))
+
 (defn start-kafka-producers []
   (start* #{#'kafka-producers}))
 
@@ -130,7 +136,8 @@
       ((fnk (get valid-modes-fns mode)) args)))))
 
 (defn initialize-config []
-  (start* #{#'config/config}))
+  (start* #{#'config/config})
+  (set-properties-for-structured-logging))
 
 (defn start-common-states []
   (start* #{#'metrics/statsd-reporter
@@ -188,9 +195,9 @@
 (defn- validate-routes-against-config
   ([routes route-type]
    (doseq [[topic-entity handler-map] routes]
-     (let [route-config (-> (ziggurat-config)
-                            (get-in [route-type topic-entity]))
-           channels (-> handler-map (dissoc :handler-fn) (keys) (set))
+     (let [route-config    (-> (ziggurat-config)
+                               (get-in [route-type topic-entity]))
+           channels        (-> handler-map (dissoc :handler-fn) (keys) (set))
           config-channels (-> (ziggurat-config)
                               (get-in [route-type topic-entity :channels])
                               (keys)
@@ -210,7 +217,7 @@
     (validate-routes-against-config batch-routes :batch-routes)))
 
 (defn- derive-modes [stream-routes batch-routes actor-routes]
-  (let [base-modes [:management-api :worker]]
+  (let [base-modes      [:management-api :worker]]
    (if (and (nil? stream-routes) (nil? batch-routes))
      (throw (IllegalArgumentException. "Either :stream-routes or :batch-routes should be present in init args")))
    (cond-> base-modes
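The decoder installed by `set-properties-for-structured-logging` exists because MDC values must be strings: the cheshire codec serializes rich context values on the way into the MDC, and `codec/destringify-val` lets cambium's `FlatJsonLayout` parse them back so nested data comes out as JSON rather than quoted strings. A sketch of the round trip, assuming only the documented `cambium.codec` functions (exact encoded form not shown, as it is codec-internal):

```clojure
(ns example.codec
  (:require [cambium.codec :as codec]))

;; Rich value → MDC-safe string (cheshire codec serializes it)...
(def encoded (codec/stringify-val {:attempt 2 :nested {:a 1}}))

;; ...and back again; this is what the JSON layout does at render time
;; once flat/set-decoder! has been called with destringify-val.
(def decoded (codec/destringify-val encoded))
```

Moving the call inside `initialize-config` (one of the commit's bullet points) ensures the decoder is set as soon as config is loaded, before any mode starts logging.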

src/ziggurat/kafka_consumer/consumer_driver.clj

+4-2
@@ -6,12 +6,14 @@
             [clojure.tools.logging :as log]
             [ziggurat.kafka-consumer.executor-service :refer :all]
             [mount.core :as mount]
-            [ziggurat.metrics :as metrics])
+            [ziggurat.metrics :as metrics]
+            [cambium.core :as clog])
   (:import (java.util.concurrent ExecutorService RejectedExecutionException)
            (org.apache.kafka.clients.consumer Consumer)))
 
 (defn- start-polling-with-consumer
   [consumer init-arg topic-entity consumer-config]
+
   (let [message-poller (cast Runnable #(ch/poll-for-messages consumer (:handler-fn init-arg) topic-entity consumer-config))]
     (when message-poller
       (try
@@ -43,7 +45,7 @@
 (defn- stop-consumers [consumer-groups]
   (do (log/info "stopping consumers")
       (doseq [[topic-entity consumers] consumer-groups]
-        (log/info "Stopping threads for consumer group: " topic-entity)
+        (clog/info {:consumer-group topic-entity} (str "Stopping threads for consumer group: " topic-entity))
        (doseq [consumer consumers]
          (.wakeup ^Consumer consumer)))))

src/ziggurat/kafka_consumer/consumer_handler.clj

+23-21
@@ -3,7 +3,8 @@
             [ziggurat.config :refer :all]
             [ziggurat.messaging.producer :as producer]
             [ziggurat.message-payload :refer [map->MessagePayload]]
-            [ziggurat.metrics :as metrics])
+            [ziggurat.metrics :as metrics]
+            [cambium.core :as clog])
   (:import (org.apache.kafka.common.errors WakeupException)
            (java.time Duration Instant)
            (tech.gojek.ziggurat.internal InvalidReturnTypeException)
@@ -26,8 +27,8 @@
    (producer/retry batch-payload))
   ([batch current-retry-count topic-entity]
    (when (pos? (count batch))
-     (let [message (map->MessagePayload {:message batch
-                                         :retry-count current-retry-count
+     (let [message (map->MessagePayload {:message      batch
+                                         :retry-count  current-retry-count
                                          :topic-entity topic-entity})]
       (producer/retry message)))))
 
@@ -49,17 +50,17 @@
         batch-size (count batch)]
     (try
       (when (not-empty batch)
-        (log/infof "[Consumer Group: %s] Processing the batch with %d messages" topic-entity batch-size)
-        (let [start-time (Instant/now)
-              result (batch-handler batch)
-              time-taken-in-millis (.toMillis (Duration/between start-time (Instant/now)))]
+        (clog/info {:batch-size batch-size} (format "[Consumer Group: %s] Processing the batch with %d messages" topic-entity batch-size))
+        (let [start-time           (Instant/now)
+              result               (batch-handler batch)
+              time-taken-in-millis (.toMillis (Duration/between start-time (Instant/now)))]
          (validate-batch-processing-result result)
          (let [messages-to-be-retried (:retry result)
                to-be-retried-count (count messages-to-be-retried)
                skip-count (count (:skip result))
                success-count (- batch-size (+ to-be-retried-count skip-count))]
-            (log/infof "[Consumer Group: %s] Processed the batch with success: [%d], skip: [%d] and retries: [%d] \n"
-                       topic-entity success-count skip-count to-be-retried-count)
+
+            (clog/info {:messages-successfully-processed success-count :messages-skipped skip-count :messages-to-be-retried to-be-retried-count} (format "[Consumer Group: %s] Processed the batch with success: [%d], skip: [%d] and retries: [%d] \n" topic-entity success-count skip-count to-be-retried-count))
            (publish-batch-process-metrics topic-entity batch-size success-count skip-count to-be-retried-count time-taken-in-millis)
            (retry messages-to-be-retried current-retry-count topic-entity))))
      (catch InvalidReturnTypeException e
@@ -78,16 +79,17 @@
 
 (defn poll-for-messages
   [^Consumer consumer handler-fn topic-entity consumer-config]
-  (try
-    (loop [records []]
-      (when (not-empty records)
-        (let [batch-payload (create-batch-payload records topic-entity)]
-          (process handler-fn batch-payload)))
-      (recur (seq (.poll consumer (Duration/ofMillis (or (:poll-timeout-ms-config consumer-config) DEFAULT_POLL_TIMEOUT_MS_CONFIG))))))
-    (catch WakeupException e
-      (log/errorf e "WakeupException while polling for messages for: %s" topic-entity))
-    (catch Exception e
-      (log/errorf e "Exception while polling for messages for: %s" topic-entity))
-    (finally (do (log/info "Closing the Kafka Consumer for: " topic-entity)
-                 (.close consumer)))))
+  (clog/with-logging-context {:consumer-group topic-entity}
+    (try
+      (loop [records []]
+        (when (not-empty records)
+          (let [batch-payload (create-batch-payload records topic-entity)]
+            (process handler-fn batch-payload)))
+        (recur (seq (.poll consumer (Duration/ofMillis (or (:poll-timeout-ms-config consumer-config) DEFAULT_POLL_TIMEOUT_MS_CONFIG))))))
+      (catch WakeupException e
+        (log/errorf e "WakeupException while polling for messages for: %s" topic-entity))
+      (catch Exception e
+        (log/errorf e "Exception while polling for messages for: %s" topic-entity))
+      (finally (do (log/info "Closing the Kafka Consumer for: " topic-entity)
+                   (.close consumer))))))
