Replicate ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG -> "io.confluent.kafka.serializers.KafkaAvroSerializer" functionality #148
-
Hi, sorry if this is a silly question, but we are trying to create a parallel pipeline that replicates data to an existing Kafka topic. The legacy producer serialized the keys of the Kafka message using a producer config (Scala code) that sets `ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG` to `"io.confluent.kafka.serializers.KafkaAvroSerializer"`.
Normally I would use a string serializer for this, but we need to be backwards compatible with the existing messages, so we are trying to replicate the logic that occurs when you produce a message. Looking at the Confluent code, it seems that when you pass a primitive object it generates a schema for you: https://github.com/confluentinc/schema-registry/blob/9ffc7f6cc40b19c80fd76878e6e732611816bcbe/client/src/main/java/io/confluent/kafka/schemaregistry/avro/AvroSchemaUtils.java#L104. We tried replicating this idea by creating a schema that looks something like:
and then calling encode:
This produces the following when logged:
However, we get the following error while using kafka-avro-console-consumer (note: we only get this error if we use the option).
The legacy consumer receives the same error. My goal is to be able to serialize this key in such a way that we can continue consuming these messages without modifying the legacy consumer. Thank you!
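For context, the Confluent serializers do not write raw Avro bytes: they prepend a 5-byte header (a `0x00` magic byte followed by the 4-byte big-endian schema-registry ID) before the Avro binary payload, which is why consumers built on the Confluent deserializer reject un-framed data. Below is a minimal, dependency-free sketch of that framing for a primitive string key; the schema ID `42` is a made-up placeholder (a real producer obtains it by registering `{"type":"string"}` with the registry):

```java
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class WireFormatSketch {
    // Avro binary encoding of a string: zig-zag varint length, then UTF-8 bytes.
    static byte[] avroString(String s) {
        byte[] utf8 = s.getBytes(StandardCharsets.UTF_8);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        long n = ((long) utf8.length) << 1; // zig-zag encoding of a non-negative long
        do {
            int b = (int) (n & 0x7f);
            n >>>= 7;
            if (n != 0) b |= 0x80;
            out.write(b);
        } while (n != 0);
        out.write(utf8, 0, utf8.length);
        return out.toByteArray();
    }

    // Confluent wire format: magic byte 0x00, 4-byte big-endian schema ID, Avro payload.
    static byte[] confluentFrame(int schemaId, byte[] avroPayload) {
        ByteBuffer buf = ByteBuffer.allocate(5 + avroPayload.length);
        buf.put((byte) 0x00);
        buf.putInt(schemaId); // ByteBuffer is big-endian by default
        buf.put(avroPayload);
        return buf.array();
    }

    public static void main(String[] args) {
        int schemaId = 42; // placeholder: the real ID comes from the schema registry
        byte[] framed = confluentFrame(schemaId, avroString("my-key"));
        System.out.println(framed.length); // 5-byte header + 1-byte length + 6 UTF-8 bytes
        System.out.println(framed[0]);     // magic byte is 0
    }
}
```

If the legacy pipeline really used `KafkaAvroSerializer` for keys, reproducing both the header and the same registered schema ID is what makes the bytes compatible; Avro-encoding the string alone is not enough.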
-
Hi! It's likely that the code is using the Confluent Schema Registry to store the generated schema, and using the "messaging" variation of Avro to encode the data. Try reading that part of the README and let me know if it works.
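To illustrate why the framing matters: a Confluent-style deserializer checks the magic byte before anything else, so data encoded without the 5-byte header (plain Avro binary, or Avro's single-object encoding with its own `C3 01` marker) fails immediately with a magic-byte error. A hypothetical sketch (`FrameReader` and the sample bytes are illustrative, not part of any library):

```java
import java.nio.ByteBuffer;

public class FrameReader {
    // Parse a Confluent-framed message: 0x00 magic byte, then 4-byte big-endian
    // schema ID. Rejecting a missing magic byte mirrors the check the Confluent
    // deserializer performs before it touches the Avro payload.
    static int schemaId(byte[] msg) {
        ByteBuffer buf = ByteBuffer.wrap(msg);
        if (buf.get() != 0x00) {
            throw new IllegalArgumentException("Unknown magic byte!");
        }
        return buf.getInt();
    }

    public static void main(String[] args) {
        // Properly framed: header (0x00 + schema ID 42) followed by an Avro string.
        byte[] framed = {0x00, 0, 0, 0, 42, 0x0c, 'm', 'y', '-', 'k', 'e', 'y'};
        System.out.println(schemaId(framed));

        // Un-framed Avro binary: the first byte is the string length, not 0x00.
        try {
            schemaId(new byte[]{0x0c, 'm', 'y', '-', 'k', 'e', 'y'});
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

This is consistent with the reply above: if the goal is byte-for-byte compatibility with a `KafkaAvroSerializer` producer, the replacement pipeline has to register the schema and emit this header, not just valid Avro.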