Log pushed using s3 output plugin throws error "2020-06-16 13:23:35 +0000 [warn]: #0 [out_s3] buffer flush took longer time than slow_flush_log_threshold: elapsed_time=44.62332443147898 slow_flush_log_threshold=20.0 plugin_id="out_s3"" #334
Labels
help wanted
Need help from users
Hi team,
Below is my config for td-agent:
```
# Include config files in the ./config.d directory
@include config.d/*.conf

<match **>
  @type s3
  @id out_s3
  @log_level debug
  aws_key_id "xx"
  aws_sec_key "xx"
  s3_bucket "xx"
  s3_endpoint "xx"
  s3_region xx
  s3_object_key_format %Y-%m-%d-%H-%M-%S-%{index}-%{hostname}.%{file_extension}
  store_as "gzip"
  time_key time
  tag_key tag
  localtime false
  time_format "%Y-%m-%dT%H:%M:%SZ"
  time_type string
  <format>
    @type json
  </format>
  <buffer time>
    @type file
    path /var/log/fluentd-buffers/s3.buffer
    timekey 60
    flush_at_shutdown true
    timekey_wait 10
    timekey_use_utc true
    chunk_limit_size 10m
  </buffer>
</match>
```
I am using the in_tail plugin to parse logs and the s3 output plugin, and td-agent is consuming 100% CPU. When I checked the logs I found the warning below. Could anyone please let me know what I am missing here?
2020-06-16 13:24:11 +0000 [warn]: #0 [out_s3] buffer flush took longer time than slow_flush_log_threshold: elapsed_time=35.5713529381901 slow_flush_log_threshold=20.0 plugin_id="out_s3"
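For context, the 20-second threshold in that warning is itself configurable: `slow_flush_log_threshold` is a standard fluentd output-plugin parameter set directly in the `<match>` block. A minimal sketch (note that raising it only suppresses the warning; it does not make flushes faster):

```
<match **>
  @type s3
  # ... existing s3 settings ...
  # Raise the warning threshold from the 20.0s default
  slow_flush_log_threshold 40.0
</match>
```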
Plugin version: fluent-plugin-s3 1.3.2.
I tried with the options below as well, but it didn't help:
```
<buffer time>
  @type file
  path /var/log/fluentd-buffers/s3.buffer
  timekey 60
  flush_interval 30s
  flush_thread_interval 5
  flush_thread_burst_interval 15
  flush_thread_count 10
  timekey_wait 10
  timekey_use_utc true
  chunk_limit_size 6m
  buffer_chunk_limit 256m
</buffer>
```
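To check whether chunks are actually queueing up (rather than just flushing slowly), fluentd's built-in monitor_agent input can expose per-plugin buffer metrics. A sketch, assuming the default port 24220 is free:

```
<source>
  @type monitor_agent
  bind 0.0.0.0
  port 24220
</source>
```

With this in place, `curl http://localhost:24220/api/plugins.json` reports fields such as `buffer_queue_length` and `buffer_total_queued_size` for the out_s3 plugin, which helps distinguish a growing backlog from occasional slow flushes.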