enable_input_metrics: inaccurate values #4717

Open
daipom opened this issue Nov 27, 2024 · 0 comments
Labels
bug Something isn't working


daipom commented Nov 27, 2024

Describe the bug

Metrics for input plugins can be collected by setting enable_input_metrics.
However, the values can be inaccurate.

I haven't confirmed the root cause directly yet, but I have confirmed that @caller_plugin_id of EventRouter has a race condition.

So it is possible that the wrong metric_callbacks entry is selected:

def find_callback
  if @caller_plugin_id
    @metric_callbacks[@caller_plugin_id]
  else
    nil
  end
end

This could cause metrics to be attributed to the wrong plugin and calculated incorrectly.
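The suspected race can be sketched as follows. This is a minimal, illustrative model (class and method names here are ours, not Fluentd's actual EventRouter implementation): the caller id is kept in a plain instance variable shared across threads, so two inputs emitting concurrently can overwrite each other's id before the callback lookup happens, and one input's records get counted against another input's callback.

```ruby
# Illustrative sketch of the suspected race, NOT Fluentd's real code.
# The caller id is a shared instance variable, so concurrent emits
# from different input threads can clobber it between the write and
# the callback lookup.
class RouterSketch
  def initialize
    @metric_callbacks = {}
    @caller_plugin_id = nil
  end

  def add_metric_callbacks(plugin_id, &block)
    @metric_callbacks[plugin_id] = block
  end

  def emit(plugin_id, record)
    @caller_plugin_id = plugin_id # shared state: this is the race
    sleep(rand * 0.001)           # widen the race window
    find_callback&.call(record)
  end

  def find_callback
    if @caller_plugin_id
      @metric_callbacks[@caller_plugin_id]
    else
      nil
    end
  end
end

counts = Hash.new(0)
mutex = Mutex.new
router = RouterSketch.new
%w[foo bar].each do |id|
  # Each callback increments its own plugin's counter; the counter
  # itself is mutex-protected so only the caller-id race remains.
  router.add_metric_callbacks(id) { mutex.synchronize { counts[id] += 1 } }
end

threads = %w[foo bar].map do |id|
  Thread.new { 500.times { router.emit(id, {}) } }
end
threads.each(&:join)

# The total is always 1000, but with the race the split between
# counts["foo"] and counts["bar"] usually drifts away from 500/500.
puts counts
```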

To Reproduce

I haven't verified this exact scenario yet, but the following configuration should produce a slight error in the metric values.

<system>
  enable_input_metrics
</system>

<source>
  @type monitor_agent
</source>

<source>
  @type sample
  tag test.foo
  rate 100
</source>

<source>
  @type sample
  tag test.bar
  rate 100
</source>

<source>
  @type sample
  tag test.boo
  rate 100
</source>

<match test.**>
  @type null
</match>

Wait a few minutes and check the metrics.

curl http://localhost:24220/api/plugins.json | jq
{
  "plugins": [
    {
      "plugin_id": "object:d34",
      "plugin_category": "input",
      "type": "monitor_agent",
      "config": {
        "@type": "monitor_agent"
      },
      "output_plugin": false,
      "retry_count": null,
      "emit_records": 0,
      "emit_size": 0
    },
    {
      "plugin_id": "object:d48",
      "plugin_category": "input",
      "type": "sample",
      "config": {
        "@type": "sample",
        "tag": "test.foo",
        "rate": "100"
      },
      "output_plugin": false,
      "retry_count": null,
      "emit_records": 43112,
      "emit_size": 0
    },
    {
      "plugin_id": "object:d5c",
      "plugin_category": "input",
      "type": "sample",
      "config": {
        "@type": "sample",
        "tag": "test.bar",
        "rate": "100"
      },
      "output_plugin": false,
      "retry_count": null,
      "emit_records": 43109,
      "emit_size": 0
    },
    {
      "plugin_id": "object:d70",
      "plugin_category": "input",
      "type": "sample",
      "config": {
        "@type": "sample",
        "tag": "test.boo",
        "rate": "100"
      },
      "output_plugin": false,
      "retry_count": null,
      "emit_records": 43109,
      "emit_size": 0
    },
    {
      "plugin_id": "object:d0c",
      "plugin_category": "output",
      "type": "null",
      "config": {
        "@type": "null"
      },
      "output_plugin": true,
      "retry_count": 0,
      "emit_records": 129330,
      "emit_size": 0,
      "emit_count": 129330,
      "write_count": 0,
      "rollback_count": 0,
      "slow_flush_count": 0,
      "flush_time_count": 0,
      "retry": {}
    }
  ]
}

You can see that the emit_records values differ between the in_sample instances.
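The drift can be tallied from the monitor_agent output instead of reading the JSON by eye. The payload below is abridged from the output above; a live check could fetch http://localhost:24220/api/plugins.json instead:

```ruby
require 'json'

# Abridged from the monitor_agent output shown above.
payload = <<~JSON
  {"plugins": [
    {"type": "sample", "config": {"tag": "test.foo"}, "emit_records": 43112},
    {"type": "sample", "config": {"tag": "test.bar"}, "emit_records": 43109},
    {"type": "sample", "config": {"tag": "test.boo"}, "emit_records": 43109}
  ]}
JSON

samples = JSON.parse(payload)['plugins'].select { |p| p['type'] == 'sample' }
counts = samples.to_h { |p| [p.dig('config', 'tag'), p['emit_records']] }
drift = counts.values.max - counts.values.min

puts counts
puts "drift between inputs: #{drift} records" # 3 records in this report
```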

Expected behavior

There should be no difference in the emit_records values among the in_sample instances.
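One possible direction, sketched here purely as an illustration (not Fluentd's actual code or a proposed patch), is to keep the caller id in thread-local storage so that concurrent emits from different input threads cannot clobber each other's id:

```ruby
# Illustrative sketch only: caller id stored per-thread instead of in
# a shared instance variable, so the callback lookup always sees the
# id set by the emitting thread itself.
class ThreadLocalRouterSketch
  def initialize
    @metric_callbacks = {}
  end

  def add_metric_callbacks(plugin_id, &block)
    @metric_callbacks[plugin_id] = block
  end

  def emit(plugin_id, record)
    Thread.current[:caller_plugin_id] = plugin_id
    find_callback&.call(record)
  ensure
    Thread.current[:caller_plugin_id] = nil
  end

  def find_callback
    id = Thread.current[:caller_plugin_id]
    id ? @metric_callbacks[id] : nil
  end
end

counts = Hash.new(0)
mutex = Mutex.new
router = ThreadLocalRouterSketch.new
%w[foo bar].each do |id|
  router.add_metric_callbacks(id) { mutex.synchronize { counts[id] += 1 } }
end

threads = %w[foo bar].map do |id|
  Thread.new { 1000.times { router.emit(id, {}) } }
end
threads.each(&:join)

# Each input's callback now fires exactly once per emit.
puts counts
```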

Your Environment

- Fluentd version: 1.18.0
- Package version:
- Operating system: Ubuntu 20.04.6 LTS (Focal Fossa)
- Kernel version: 5.15.0-124-generic

Your Configuration

Noted in `To Reproduce`.

Your Error Log

No error.

Additional context

No response

@daipom daipom moved this to To-Do in Fluentd Kanban Nov 27, 2024
@daipom daipom added bug Something isn't working and removed waiting-for-triage labels Dec 3, 2024