Production of Avro specific record seems to be creating a GenericRecord - v1.1.19/Lambda/Java 17 - Protobuf reading a typed record doesn't find the class to map to
#339 · Open · bjanischevsky opened this issue on Mar 18, 2024 · 0 comments
I am testing all the features of this library and found that if I produce an Avro specific record and then try to read it back as a specific record, I get this error:
```
Caused by: java.lang.NullPointerException: Cannot invoke "java.lang.Class.newInstance()" because "readerClass" is null
	at com.amazonaws.services.schemaregistry.deserializers.avro.DatumReaderInstance.from(DatumReaderInstance.java:42)
	at com.amazonaws.services.schemaregistry.deserializers.avro.AvroDeserializer$DatumReaderCache.load(AvroDeserializer.java:114)
	at com.amazonaws.services.schemaregistry.deserializers.avro.AvroDeserializer$DatumReaderCache.load(AvroDeserializer.java:111)
	at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3529)
	at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2278)
	at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2155)
	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2045)
```
but when I read it as a generic record, it all works. I have had the same ticket open with AWS developer support for more than 72 hours and nobody is responding. Is this the right channel for this?
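For reference, reading as a specific record also requires the record-type setting on the consumer side. A minimal sketch of that configuration, assuming the Glue Schema Registry Kafka deserializer (the region value and deserializer choice here are illustrative assumptions, not taken from the issue):

```java
// Consumer-side sketch: AVRO_RECORD_TYPE must be SPECIFIC_RECORD here too,
// otherwise the deserializer falls back to GenericRecord.
Map<String, Object> consumerConfig = new HashMap<>();
consumerConfig.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
        StringDeserializer.class.getName());
consumerConfig.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
        GlueSchemaRegistryKafkaDeserializer.class.getName());
consumerConfig.put(AWSSchemaRegistryConstants.AWS_REGION, "us-east-1"); // placeholder region
consumerConfig.put(AWSSchemaRegistryConstants.AVRO_RECORD_TYPE,
        AvroRecordType.SPECIFIC_RECORD.getName());
```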
Here's the code that produces it:
```java
config.put(AWSSchemaRegistryConstants.AVRO_RECORD_TYPE, AvroRecordType.SPECIFIC_RECORD.getName());
try (var producerSpecific = new KafkaProducer<String, simpleschema>(config)) {
    simpleschema simpleschema = gson.fromJson(jsonMessage, simpleschema.class);
    outputText = gson.toJson(simpleschema);
    AppSingleton.logger.log("Specific record type: " + simpleschema.getClass().getName() + " " + outputText);
    // send the record to Kafka and wait for the ack
    var record = new ProducerRecord<>(typedMessage.topic, key, simpleschema);
    var producerResult = producerSpecific.send(record);
    producerResult.get();
    producerSpecific.flush();
    AppSingleton.logger.log("Finished producing a specific message to Kafka with settings: " + gson.toJson(config));
}
```
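A likely mechanism behind the null readerClass: with SPECIFIC_RECORD, the deserializer resolves the reader class from the writer schema's full name (namespace plus record name), and when no class with that exact name is on the Lambda's classpath, the lookup comes back empty. A stdlib-only sketch of that failure mode (the name com.example.simpleschema is a hypothetical placeholder, not the library's actual lookup code):

```java
public class ReaderClassLookup {
    // Hypothetical sketch of a schema-name-to-class lookup; when the Avro
    // schema's namespace/name do not match a generated class's package/name,
    // the result is null, which later surfaces as the NPE quoted above.
    static Class<?> resolveReaderClass(String schemaFullName) {
        try {
            return Class.forName(schemaFullName);
        } catch (ClassNotFoundException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        // Not on the classpath: the lookup returns null.
        System.out.println(resolveReaderClass("com.example.simpleschema")); // prints "null"
        // A class that does exist resolves normally.
        System.out.println(resolveReaderClass("java.lang.String")); // prints "class java.lang.String"
    }
}
```

So it is worth double-checking that the Avro schema's namespace and name exactly match the generated class's package and class name.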
I have a similar issue with Protobuf: when I read the data, the library tries to instantiate an object of a given type but fails to find it, with this error:
```
Caused by: java.lang.ClassNotFoundException: /PersonSchemaDynamic$Person
	at java.base/java.lang.Class.forName0(Native Method)
	at java.base/java.lang.Class.forName(Unknown Source)
	at com.amazonaws.services.schemaregistry.deserializers.protobuf.ProtobufWireFormatDecoder.deserializeToPojo(ProtobufWireFormatDecoder.java:72)
	... 28 more
```
I made sure the class file is embedded at different places in the JAR file used to deploy the Lambda, but it always gets the same error.
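One detail worth noting: the name in the ClassNotFoundException starts with a slash (/PersonSchemaDynamic$Person), and a leading slash is never part of a valid Java binary class name, so the lookup fails regardless of where the class file sits in the JAR. That may point at an empty or missing Java package in the generated Protobuf class (the decoder seemingly concatenating "<package>/<outer>$<message>") rather than a packaging problem. A stdlib-only demonstration, with java.lang.String standing in for the generated class:

```java
public class SlashLookupDemo {
    public static void main(String[] args) throws Exception {
        // A binary name with a leading '/' can never resolve, even though
        // the class itself is definitely present on the classpath.
        try {
            Class.forName("/java.lang.String");
            System.out.println("found");
        } catch (ClassNotFoundException e) {
            // The exception message is the bad name itself, matching the
            // "/PersonSchemaDynamic$Person" form seen in the stack trace.
            System.out.println("not found: " + e.getMessage());
        }
        // The same name without the slash resolves fine.
        System.out.println(Class.forName("java.lang.String").getName()); // prints "java.lang.String"
    }
}
```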