Exception while trying to use avro serdes: "No implementation of method: :clj->avro of protocol:" #169
Comments
The included confluent serde currently expects to find a local copy of the schema on the classpath when serializing a message, but not when deserializing it. This behavior is inherited from the upstream Serializer. If it can't find one, that might be why it is throwing an error (admittedly not the most informative one). In earlier versions it would have thrown an error at resolution time, but this PR relaxed that a bit so that consumers are not required to provide one. Try putting those avro files on the resource-path (i.e., assuming a standard lein project layout, in the resources directory).
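For reference, a rough sketch of wiring the serdes up so the schema files are loaded from the classpath, loosely following examples/serdes. The registry URL, topic name, and the exact resolver signature are assumptions and may differ between jackdaw versions; the schema files must be on the classpath, e.g. under resources/ in a standard lein project.

```clojure
;; Sketch only: registry URL, topic name, and the exact resolver signature are
;; assumptions; see examples/serdes in the jackdaw repo for the real thing.
;; key_schema.json and value_schema.json must be on the classpath, e.g. under
;; resources/ in a standard lein project layout.
(require '[jackdaw.serdes.resolver :as resolver])

(def resolve-serde
  (resolver/serde-resolver :schema-registry-url "http://localhost:8081"))

(def topic-config
  {:topic-name "example-topic"
   :partition-count 1
   :replication-factor 1
   :key-serde   (resolve-serde {:serde-keyword   :jackdaw.serdes.avro.confluent/serde
                                :schema-filename "key_schema.json"
                                :key?            true})
   :value-serde (resolve-serde {:serde-keyword   :jackdaw.serdes.avro.confluent/serde
                                :schema-filename "value_schema.json"
                                :key?            false})})
```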
Thank you for your reply! Actually, the schema files are already under the resources dir. When they're not, the friendlier exception "Execution error (ExceptionInfo) at jackdaw.serdes.resolver/load-schema (resolver.clj:18)" is thrown instead.
Updated the OP, using Confluent Platform for better reproducibility.
Ah, ok, sorry. Our confluent serde expects the keys in the message key and value maps to be keyword, hyphenated versions of the corresponding Avro fields. There is #126, which seeks to provide the option to not mangle the names in this way.
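To illustrate the comment above (the field name here is hypothetical, not from the reporter's schemas):

```clojure
;; Hypothetical Avro field for illustration: {"name": "user_id", "type": "string"}

;; The confluent serde expects the keyword, hyphenated form of the field name:
{:user-id "abc-123"}

;; whereas un-mangled keys do not match the schema fields, which appears to be
;; the kind of mismatch behind the :clj->avro error in this thread:
{"user_id" "abc-123"}
```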
Thank you! Using keywords will do for now, until the option not to mangle names gets released. I tried it and it worked with a local Confluent Platform install. I had some problems with Confluent Cloud, but I'll open another issue for those.
Hi! I'm trying to use various jackdaw serdes, but unfortunately I'm not managing to get the Avro serdes to work. The exception from the title ("No implementation of method: :clj->avro of protocol: ...") is thrown.
I'm using Clojure 1.10.0 and jackdaw 0.6.6.
The Kafka cluster/schema registry is managed by Confluent Cloud (update: using Confluent Platform 5.2.1 instead; see below). The key/value schemas are already in the registry. I'd really appreciate any help or suggestions! This code is mostly the same as examples/serdes; string and JSON serdes are working fine. The schema files are at resources/key_schema.json and resources/value_schema.json.
Update: Using Confluent Platform, since the issue doesn't depend on Confluent Cloud.
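For completeness, a rough sketch of producing with the resolved serdes and keyword, hyphenated keys. It assumes the topic-config from the resolver sketch above; the broker address and field names are placeholders.

```clojure
;; Rough sketch; assumes topic-config from the resolver sketch above.
;; Broker address and field names are placeholders.
(require '[jackdaw.client :as jc])

(with-open [producer (jc/producer {"bootstrap.servers" "localhost:9092"}
                                  topic-config)]
  ;; key and value maps use keyword, hyphenated versions of the Avro field names
  @(jc/produce! producer topic-config
                {:user-id "abc-123"}
                {:user-id "abc-123" :favorite-color "blue"}))
```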