
Logstash Error #15

Open
jconlon opened this issue Feb 19, 2016 · 2 comments

Comments


jconlon commented Feb 19, 2016

Getting the following error while trying to decode Avro Kafka payloads:

���T.2016-02-18T21:04:14.182MadisonMadison._sp80UM-LEeWzy4JC2MRxcg._dM83cNa1EeWNn7pTE75EYA jconlon@mudshark"Condor Industries"Condor Industries._iqWIsM9sEeWzy4JC2MRxcg2cdo://mudshark:2036/repo1�� {:exception=>#<NoMethodError: undefined method `decode' for #<Array:0x61a3ee90>>, :backtrace=>["/opt/elastic/logstash/logstash-2.2.2/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-2.0.4/lib/logstash/inputs/kafka.rb:178:in `queue_event'", "/opt/elastic/logstash/logstash-2.2.2/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-2.0.4/lib/logstash/inputs/kafka.rb:148:in `run'", "/opt/elastic/logstash/logstash-2.2.2/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:331:in `inputworker'", "/opt/elastic/logstash/logstash-2.2.2/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:325:in `start_input'"], :level=>:error}

Logstash configuration

input {
     kafka {
       topic_id => "Observe"
       type => "Observe"
       reset_beginning => true
       auto_offset_reset => smallest
       codec => {
         avro => {
           schema_uri => "/home/jconlon/git/com.verticon.irouter/com.verticon.im.avro/avroSchema/observeEvent.avsc"
         }
       }
    }  
}

Payloads created with Avro 1.7.7 in Java:

Observe event = (Observe) abstractEvent;
ObserveEvent observeEvent = ObserveEvent.newBuilder().setComments(event.getComments())
        .setEntityKey(entity.getKey()).setEntityName(entity.getName()).setEntityTag(entity.getTag())
        .setEventKey(event.getKey())
        .setLocalTime(event.getLocalDateTime() != null ? event.getLocalDateTime().toString() : null)
        .setModelRepository(modelRepo).setCurrentParentKey(parent.getKey())
        .setCurrentParentName(parent.getName()).setCurrentParentTag(parent.getTag())
        .setTimeStamp(event.getUtc() != null ? event.getUtc().toEpochMilli() : null)
        .setUrl(event.getUrl() != null ? event.getUrl().toString() : null).setUser(event.getUser()).build();

logger.debug("Transformed payload {}", observeEvent);

ByteArrayOutputStream out = new ByteArrayOutputStream();
// BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
BinaryEncoder encoder = EncoderFactory.get().blockingBinaryEncoder(out, null);
DatumWriter<ObserveEvent> writer = new SpecificDatumWriter<ObserveEvent>(ObserveEvent.class);

try {
    writer.write(observeEvent, encoder);
    encoder.flush();
    out.close();
    serializedBytes = out.toByteArray();
} catch (IOException e) {
    logger.error("Failed to write byte array", e);
}

I tried both the plain binaryEncoder and the blockingBinaryEncoder:

// BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
BinaryEncoder encoder = EncoderFactory.get().blockingBinaryEncoder(out, null);
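
For reference, here is a minimal sketch of reading those bytes back with Avro 1.7.7's SpecificDatumReader (the ObserveEventRoundTrip class and deserialize method are just illustration names, not part of my code); if this round-trips cleanly, the payload itself should be fine and the problem is likely on the Logstash side:

import java.io.IOException;

import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;

// Hypothetical helper, only to sanity-check the serialized bytes locally.
public class ObserveEventRoundTrip {

    public static ObserveEvent deserialize(byte[] serializedBytes) throws IOException {
        // A plain BinaryDecoder reads the standard Avro binary encoding,
        // which both encoder variants above produce for this record.
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(serializedBytes, null);
        DatumReader<ObserveEvent> reader = new SpecificDatumReader<ObserveEvent>(ObserveEvent.class);
        return reader.read(null, decoder);
    }
}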

Any tips on solving this would be appreciated.

thanks,
John


rmoff commented Mar 16, 2016

Per http://stackoverflow.com/a/33211940/350613, have you tried this alternative Logstash config for the avro codec:

input {
  kafka {
    topic_id => "Observe"
    type => "Observe"
    reset_beginning => true
    auto_offset_reset => smallest
    codec => avro {
      schema_uri => "/home/jconlon/git/com.verticon.irouter/com.verticon.im.avro/avroSchema/observeEvent.avsc"
    }
  }
}
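
schema_uri needs to point at a plain .avsc file readable on the Logstash host. Purely as a hypothetical illustration of the format (your real observeEvent.avsc will differ), a cut-down schema with just two of the fields from your builder might look like this:

{
  "type": "record",
  "name": "ObserveEvent",
  "namespace": "com.verticon.im.avro",
  "fields": [
    {"name": "comments",  "type": ["null", "string"]},
    {"name": "timeStamp", "type": ["null", "long"]}
  ]
}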


micahlmartin commented Apr 27, 2016

+1 I'm getting the same error. When I try the config from that Stack Overflow post, I get this error:

fetched an invalid config {:config=>"input {
  kafka {
    topic_id => \"test\"
    zk_connect => \"zookeeper:2181\"
    codec => avro {
        schema_uri => \"/etc/avro/test.avsc\"
      }
    }
  }
  }

output {

  file {
    path => \"/data/test-%{+YYYY-MM-dd}.log\"
  }

  }


", :reason=>"Expected one of #, input, filter, output at line 10, column 1 (byte 300) after ", :level=>:error}

talevy removed their assignment May 15, 2017