
Decoding wrong values #32

Open
stepanurban opened this issue Jul 16, 2019 · 1 comment
@stepanurban

All float values from Kafka printed by Logstash are wrong. I'm sending the following record to Kafka:

ProducerRecord(topic=weather, partition=0, headers=RecordHeaders(headers = [], isReadOnly = true), key=BN, value={"Temperature": 20.49, "Humidity": 56.0, "Pressure": 1011.0, "Wind": 1.0, "Cloudiness": 75.0}, timestamp=1562936770006)

In Kafka it is stored properly, but the Logstash console output is:

{
       "@version" => "1",
       "Pressure" => 23.718717575073242,
     "Cloudiness" => -0.09371757507324219,
           "Wind" => 56.0,
     "@timestamp" => 2019-07-12T13:06:10.012Z,
    "Temperature" => 0.0,
       "Humidity" => -0.09371758997440338
}

My Logstash config file is:

input {
  kafka {
    codec => avro {
      schema_uri => "./weather.avsc"
    }
    bootstrap_servers => "localhost:9092"
    topics => ["weather"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "weather"
    workers => 1
  }
  stdout {}
}

and my schema, weather.avsc:

{
  "namespace": "weather",
  "type": "record",
  "name": "WeatherMessage",
  "fields": [
    { "name": "Temperature", "type": "float" },
    { "name": "Humidity",    "type": "float" },
    { "name": "Pressure",    "type": "float" },
    { "name": "Wind",        "type": "float" },
    { "name": "Cloudiness",  "type": "float" }
  ]
}
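For reference, here is a minimal sketch (not the plugin's code) of what the Avro binary encoding of a WeatherMessage looks like under this schema, assuming the spec's rules: a record body is just its fields encoded in declaration order, and a `float` is 4 bytes of little-endian IEEE 754. The field names mirror weather.avsc.

```python
import struct

# Field order must match the "fields" array in weather.avsc.
FIELDS = ["Temperature", "Humidity", "Pressure", "Wind", "Cloudiness"]

def encode(record):
    # Avro binary encoding of this record: five little-endian float32s,
    # concatenated in schema order - 20 bytes total, no field names on the wire.
    return b"".join(struct.pack("<f", record[name]) for name in FIELDS)

def decode(payload):
    # Decoding relies entirely on the schema for field order and types.
    values = struct.unpack("<" + "f" * len(FIELDS), payload)
    return dict(zip(FIELDS, values))

record = {"Temperature": 20.49, "Humidity": 56.0, "Pressure": 1011.0,
          "Wind": 1.0, "Cloudiness": 75.0}
payload = encode(record)
decoded = decode(payload)
assert len(payload) == 20
# Round-tripping through float32 perturbs values by well under 1e-4.
assert all(abs(decoded[k] - record[k]) < 1e-4 for k in FIELDS)
```

Because nothing but the schema tells the decoder where each field starts, any mismatch between the bytes on the wire and the schema shifts every subsequent field.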

I'm using Logstash 7.2 with io.confluent:kafka-avro-serializer:3.3.3 and org.apache.avro:avro-tools:1.9.0, on Ubuntu 18.04.

@colinsurprenant
Contributor

@stepanurban thanks for the report.
These look to me like "normal" float/double representation errors - see https://www.ericlin.me/2016/02/avro-data-types/ for background. If you want to preserve the exact decimal representation, I believe you need to use the decimal logical type.
To help diagnose this better, could you provide an input example together with the Logstash output it produces?
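The representation error described above can be sketched in a few lines: 20.49 has no exact binary representation, so forcing it through Avro's 4-byte `float` and widening it back to a double shifts it slightly (this is an illustration of the general effect, not this plugin's code).

```python
import struct

def as_float32(x):
    # Round-trip a Python float (an IEEE 754 double) through float32.
    return struct.unpack("<f", struct.pack("<f", x))[0]

assert as_float32(20.49) != 20.49             # 20.49 is not exactly representable
assert abs(as_float32(20.49) - 20.49) < 1e-5  # but the error is tiny
assert as_float32(56.0) == 56.0               # small whole numbers survive exactly
```

Note the scale: this kind of error is on the order of 1e-6 near these magnitudes, far smaller than the differences shown in the Logstash output in the report.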
