Unsupported usage of MetricsLevel #280
As far as I know,

Good catch, this seems like dead code then.

I am going to unassign myself since there is nothing to do here for Collector 1.0.
github-merge-queue bot pushed a commit to open-telemetry/opentelemetry-collector that referenced this issue on Jan 22, 2025:

#### Description

Drops metrics that depend on the metrics level:

- Batch processor metric
- otelarrow metrics (see open-telemetry/otel-arrow/issues/280 for limitation).
- internal/otelarrow/netstats metrics.

I did not implement https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/a25f058256e8339e49e4c89ac622a9ef47b52334/internal/otelarrow/netstats/netstats.go#L133-L136 since `LevelNone` drops all metrics.

This attempts to unblock #11601 by hardcoding the metrics here, since there is a small number of them. Once we do #11754 we can move this back to the individual components.

#### Link to tracking issue

Updates #11061
Let's remove the code associated with this MetricsLevel.

I will try and pick this one up and remove the unused code!
👋 I am looking to drop `MetricsLevel` from `component.TelemetrySettings` and replace it with a system through which components can pass the service a set of views that control this (see open-telemetry/opentelemetry-collector/issues/11754). otel-arrow is one of the two components that currently use `MetricsLevel`, so I am analyzing how it uses `MetricsLevel`. There is one usage here:

otel-arrow/pkg/otel/arrow_record/consumer.go, lines 199 to 201 in c39257c

This would have to be removed or changed to address open-telemetry/opentelemetry-collector/issues/11061. To address this issue, I will create views to replicate the current behavior (see open-telemetry/opentelemetry-collector/pull/12143) and set the metrics level to a fixed value on the otelarrow receiver.