[processor/filter] panic: interface conversion: interface {} is string, not bool #13573

Closed
ajsaclayan opened this issue Aug 24, 2022 · 9 comments
Labels: bug, priority:p2 (Medium), processor/filter

ajsaclayan commented Aug 24, 2022

Describe the bug
Using the filter processor, we have a collector configured to route metrics that have a specific label to a dedicated pipeline. We have noticed that this error happens intermittently:

panic: interface conversion: interface {} is string, not bool [recovered]
	panic: interface conversion: interface {} is string, not bool
goroutine 2858 [running]:
go.opentelemetry.io/otel/sdk/trace.(*recordingSpan).End.func1()
	go.opentelemetry.io/otel/[email protected]/trace/span.go:359 +0x2a
go.opentelemetry.io/otel/sdk/trace.(*recordingSpan).End(0xc031aac900, {0x0, 0x0, 0x5?})
	go.opentelemetry.io/otel/[email protected]/trace/span.go:398 +0x8ee
panic({0x3045540, 0xc036c22750})
	runtime/panic.go:884 +0x212
github.com/open-telemetry/opentelemetry-collector-contrib/internal/coreinternal/processor/filterexpr.(*Matcher).match(0x99999999999999?, 0x38?)
	github.com/open-telemetry/opentelemetry-collector-contrib/internal/[email protected]/processor/filterexpr/matcher.go:124 +0x77
github.com/open-telemetry/opentelemetry-collector-contrib/internal/coreinternal/processor/filterexpr.(*Matcher).matchEnv(0x4246c7?, {0xc031b68e10, 0x27}, {0x20?})
	github.com/open-telemetry/opentelemetry-collector-contrib/internal/[email protected]/processor/filterexpr/matcher.go:109 +0x88
github.com/open-telemetry/opentelemetry-collector-contrib/internal/coreinternal/processor/filterexpr.(*Matcher).matchSum(0x0?, {0xc031b68e10, 0x27}, {0x41c5e6?})
	github.com/open-telemetry/opentelemetry-collector-contrib/internal/[email protected]/processor/filterexpr/matcher.go:83 +0x67
github.com/open-telemetry/opentelemetry-collector-contrib/internal/coreinternal/processor/filterexpr.(*Matcher).MatchMetric(0x40dd3f?, {0x7fe8ef1e2a68?})
	github.com/open-telemetry/opentelemetry-collector-contrib/internal/[email protected]/processor/filterexpr/matcher.go:58 +0x157
github.com/open-telemetry/opentelemetry-collector-contrib/internal/coreinternal/processor/filtermetric.(*exprMatcher).MatchMetric(0x7fe8ef1e2a68?, {0x80?})
	github.com/open-telemetry/opentelemetry-collector-contrib/internal/[email protected]/processor/filtermetric/expr_matcher.go:41 +0x5f
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).shouldKeepMetric(0xc001acdc00, {0xc0304e0fb8?})
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/filter_processor.go:161 +0x38
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1.1.1({0xa09c85?})
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/filter_processor.go:140 +0x3a
go.opentelemetry.io/collector/pdata/internal.MetricSlice.RemoveIf({0x1?}, 0xc0304e10b0)
	go.opentelemetry.io/collector/[email protected]/internal/generated_pmetric.go:536 +0x62
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1.1({0x32524a0?})
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/filter_processor.go:139 +0x47
go.opentelemetry.io/collector/pdata/internal.ScopeMetricsSlice.RemoveIf({0x7fe8ef1e2a68?}, 0xc0304e1128)
	go.opentelemetry.io/collector/[email protected]/internal/generated_pmetric.go:342 +0x62
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1({0x1c0?})
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/filter_processor.go:138 +0x85
go.opentelemetry.io/collector/pdata/internal.ResourceMetricsSlice.RemoveIf({0x7fe8ef1e2a68?}, 0xc0304e1198)
	go.opentelemetry.io/collector/[email protected]/internal/generated_pmetric.go:148 +0x62
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics(0x40e087?, {0x10?, 0x2d9c000?}, {0x1?})
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/filter_processor.go:128 +0x3f
go.opentelemetry.io/collector/processor/processorhelper.NewMetricsProcessor.func1({0x3d3e208, 0xc036c22660}, {0x0?})
	go.opentelemetry.io/[email protected]/processor/processorhelper/metrics.go:62 +0xf8
go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics(...)
	go.opentelemetry.io/[email protected]/consumer/metrics.go:36
go.opentelemetry.io/collector/service/internal/fanoutconsumer.(*metricsConsumer).ConsumeMetrics(0xc000e12810, {0x3d3e208, 0xc036c22660}, {0x3682bde?})
	go.opentelemetry.io/[email protected]/service/internal/fanoutconsumer/metrics.go:72 +0x16f
go.opentelemetry.io/collector/receiver/otlpreceiver/internal/metrics.(*Receiver).Export(0xc00035b4b8, {0x3d3e208, 0xc036c225d0}, {0xc03235ee40?})
	go.opentelemetry.io/[email protected]/receiver/otlpreceiver/internal/metrics/otlp.go:59 +0xd3
go.opentelemetry.io/collector/pdata/pmetric/pmetricotlp.rawMetricsServer.Export({{0x3d14880?, 0xc00035b4b8?}}, {0x3d3e208, 0xc036c225d0}, 0xc038514150)
	go.opentelemetry.io/collector/[email protected]/pmetric/pmetricotlp/metrics.go:167 +0xff
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/metrics/v1._MetricsService_Export_Handler.func1({0x3d3e208, 0xc036c225d0}, {0x34c6080?, 0xc038514150})
	go.opentelemetry.io/collector/[email protected]/internal/data/protogen/collector/metrics/v1/metrics_service.pb.go:216 +0x78
go.opentelemetry.io/collector/config/configgrpc.enhanceWithClientInformation.func1({0x3d3e208?, 0xc036c22570?}, {0x34c6080, 0xc038514150}, 0x0?, 0xc0385141f8)
	go.opentelemetry.io/[email protected]/config/configgrpc/configgrpc.go:386 +0x4c
google.golang.org/grpc.chainUnaryInterceptors.func1.1({0x3d3e208?, 0xc036c22570?}, {0x34c6080?, 0xc038514150?})
	google.golang.org/[email protected]/server.go:1117 +0x5b
go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.UnaryServerInterceptor.func1({0x3d3e208, 0xc036c22480}, {0x34c6080, 0xc038514150}, 0xc03480ca20, 0xc0306ae700)
	go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/[email protected]/interceptor.go:325 +0x676

Steps to reproduce
Simplified collector config with:

  1. otlp receiver
  2. filter processor
  3. logging exporter

receivers:
  otlp:

processors:
  filter/test:
    metrics:
      include:
        match_type: expr
        expressions:
          - HasLabel("some.attribute")

exporters:
  logging:

service:
  pipelines:
    metrics:
        receivers: [otlp]
        processors: [filter/test]
        exporters: [logging]

What did you expect to see?

What did you see instead?
For some telemetry, we see log errors like the following:

{ 
   caller: [email protected]/filter_processor.go:142
   error: runtime error: index out of range [-1] (1:1)
 | HasLabel("some.attribute")
 | ^
   kind: processor
   level: error
   msg: shouldKeepMetric failed
   name: filter/test
   pipeline: metrics
   ts: 1661354049.4371357
}

What version did you use?
Version: 0.50.0
We have not seen any changes to the filterprocessor since then that indicate this has been fixed in a later release.

What config did you use?
Config: see the simplified config under "Steps to reproduce" above.

Environment
OS: Linux

Additional context
We followed the stack trace down to this code, which expects a boolean but gets a string: https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/v0.50.0/internal/coreinternal/processor/filterexpr/matcher.go#L124
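
For illustration, here is a minimal, self-contained sketch (not the collector's code; the expression and names are made up for the example) of how an unchecked result.(bool) assertion on an expr evaluation result panics when the expression happens to yield a string, which matches the panic message above:

package main

import (
	"fmt"

	"github.com/antonmedv/expr"
)

func main() {
	// Compile an expression that evaluates to a string rather than a bool.
	program, err := expr.Compile(`"some-label-value"`)
	if err != nil {
		panic(err)
	}

	result, err := expr.Run(program, nil)
	if err != nil {
		panic(err)
	}

	// Unchecked assertion, mirroring the pattern at matcher.go:124:
	// panics with "interface conversion: interface {} is string, not bool".
	fmt.Println(result.(bool))
}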

ajsaclayan added the bug label on Aug 24, 2022

frzifus commented Aug 25, 2022

GCHQDeveloper638 commented Aug 27, 2022

We're experiencing this as well on the latest release (0.58.0), with a similarly basic filter expression to the initial reporter's: Label("name") == "expectedValue".

After adding some extra error handling to the match function (below), it seems that when this error occurs the return value is just an empty string.

func (m *Matcher) match(env *env) (bool, error) {
	result, err := m.v.Run(m.program, env)
	if err != nil {
		return false, err
	}

	if _, ok := result.(bool); !ok {
		return false, fmt.Errorf("filter returned non-boolean value type=%T result=%v metric=%s, attributes=%v",
			result, result, env.MetricName, env.attributes.AsRaw())
	}

	return result.(bool), nil
}
2022-08-27T20:01:26.614Z	error	[email protected]/filter_processor.go:142	shouldKeepMetric failed	{"kind": "processor", "name": "filter/myFilter", "pipeline": "metrics/myMetrics", "error": "filter returned non-boolean value type=string result= metric=kafka_topic_partition_oldest_offset, attributes=map[partition:21 topic:__consumer_offsets]"}

Also, like the initial reporter, we periodically see index-out-of-range errors. These are much less frequent for us than the wrong return type, though:

2022-08-27T19:28:28.667Z	error	[email protected]/filter_processor.go:142	shouldKeepMetric failed	{"kind": "processor", "name": "filter/myFilter", "pipeline": "metrics/myMetrics", "error": "runtime error: index out of range [-1] (1:1)\n | Label(\"name\") == \"expectedValue\"\n | ^"}

github-actions bot commented Sep 9, 2022

Pinging code owners: @boostchicken. See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot commented Nov 10, 2022

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

github-actions bot added the Stale label on Nov 10, 2022

frzifus commented Nov 10, 2022

It's done.

@adam-kiss-sg

With 0.64.0, I'm getting all kinds of strange errors (see below). Could this be some kind of race condition?

2022-11-11T21:30:07.794Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [807] with length 5"}
2022-11-11T21:30:36.565Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=latency metric=latency, attributes=map[http.method:GET http.status_code:200 operation:GET /healthz service.name:dummy span.kind:SPAN_KIND_SERVER status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:31:05.602Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=latency metric=latency, attributes=map[http.method:UNKOWN operation:healthz service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:31:06.598Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "interface conversion: interface {} is map[string]struct {}, not bool (1:45)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | ............................................^"}
2022-11-11T21:31:07.807Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=UNKOWN metric=latency, attributes=map[http.method:UNKOWN operation:healthz service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:31:35.625Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=UNKOWN metric=calls_total, attributes=map[http.method:UNKOWN operation:middleware - jsonParser service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:31:35.626Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=GET metric=latency, attributes=map[http.method:GET operation:HealthCheckController.healthz service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:31:36.620Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [-1] (1:66)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | .................................................................^"}
2022-11-11T21:31:36.624Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=UNKOWN metric=calls_total, attributes=map[http.method:UNKOWN operation:middleware - urlencodedParser service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:31:36.620Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [-1] (1:66)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | .................................................................^"}
2022-11-11T21:31:37.816Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=UNKOWN metric=latency, attributes=map[http.method:UNKOWN operation:middleware - expressInit service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:32:06.656Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=calls_total metric=calls_total, attributes=map[http.method:UNKOWN operation:request handler - /healthz service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:32:06.656Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [-1]"}
2022-11-11T21:32:06.656Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [4114] with length 5"}
2022-11-11T21:32:06.657Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "interface conversion: interface {} is string, not bool (1:45)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | ............................................^"}
2022-11-11T21:33:06.713Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=latency metric=calls_total, attributes=map[http.method:UNKOWN operation:healthz service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:33:35.709Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "interface conversion: interface {} is string, not bool (1:66)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | .................................................................^"}
2022-11-11T21:33:35.709Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [-1] (1:66)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | .................................................................^"}
2022-11-11T21:33:36.743Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "interface conversion: interface {} is string, not bool (1:45)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | ............................................^"}
2022-11-11T21:33:37.870Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=UNKOWN metric=latency, attributes=map[http.method:UNKOWN operation:request handler - /healthz service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:34:05.724Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "operator \"in\"\" not defined on string (1:12)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | ...........^"}
2022-11-11T21:34:05.724Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "operator \"in\"\" not defined on string (1:12)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | ...........^"}
2022-11-11T21:34:05.724Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [-1] (1:45)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | ............................................^"}
2022-11-11T21:34:36.764Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [807] with length 5"}
2022-11-11T21:34:36.764Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [807] with length 5"}
2022-11-11T21:34:36.765Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "interface conversion: interface {} is string, not bool (1:66)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | .................................................................^"}
2022-11-11T21:34:37.893Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [4114] with length 5 (1:12)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | ...........^"}
2022-11-11T21:34:37.893Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [-1] (1:12)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | ...........^"}
2022-11-11T21:35:06.801Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=UNKOWN metric=latency, attributes=map[http.method:UNKOWN operation:middleware - expressInit service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:36:35.859Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=latency metric=latency, attributes=map[http.method:UNKOWN operation:middleware - corsMiddleware service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:36:36.878Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=UNKOWN metric=calls_total, attributes=map[http.method:UNKOWN operation:middleware - expressInit service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:37:05.873Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=UNKOWN metric=calls_total, attributes=map[http.method:UNKOWN operation:middleware - jsonParser service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:37:07.947Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=calls_total metric=calls_total, attributes=map[http.method:UNKOWN operation:middleware - expressInit service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:37:07.948Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "interface conversion: interface {} is map[string]struct {}, not bool (1:1)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | ^"}
2022-11-11T21:37:35.886Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "interface conversion: interface {} is map[string]struct {}, not bool (1:42)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | .........................................^"}
2022-11-11T21:37:35.886Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "reflect.Value.MapIndex: value of type map[string]struct {} is not assignable to type string (1:42)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | .........................................^"}
2022-11-11T21:37:36.914Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "interface conversion: interface {} is string, not bool (1:1)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | ^"}
2022-11-11T21:38:05.904Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=calls_total metric=calls_total, attributes=map[http.method:UNKOWN operation:middleware - query service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:38:07.975Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [4114] with length 5"}
2022-11-11T21:38:07.975Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [-1]"}
2022-11-11T21:38:07.975Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=GET metric=calls_total, attributes=map[http.method:GET http.status_code:200 operation:GET /healthz service.name:dummy span.kind:SPAN_KIND_SERVER status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:38:37.974Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [-1] (1:66)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | .................................................................^"}
2022-11-11T21:38:37.975Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "interface conversion: interface {} is map[string]struct {}, not bool (1:45)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | ............................................^"}
2022-11-11T21:38:37.974Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "interface conversion: interface {} is string, not bool (1:66)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | .................................................................^"}
2022-11-11T21:39:07.015Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [-1] (1:45)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | ............................................^"}
2022-11-11T21:39:07.016Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "filter returned non-boolean value type=string result=latency metric=latency, attributes=map[http.method:UNKOWN operation:middleware - expressInit service.name:dummy span.kind:SPAN_KIND_INTERNAL status.code:STATUS_CODE_UNSET]"}
2022-11-11T21:39:07.017Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "runtime error: index out of range [4608] with length 5 (1:42)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | .........................................^"}
2022-11-11T21:39:07.017Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "interface conversion: interface {} is map[string]struct {}, not bool (1:42)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | .........................................^"}
2022-11-11T21:39:07.985Z        error   [email protected]/filter_processor.go:142 shouldKeepMetric failed {"kind": "processor", "name": "filter/spanmetrics", "pipeline": "metrics/spanmetrics", "error": "interface conversion: interface {} is string, not bool (1:45)\n | MetricName in [\"calls_total\", \"latency\"] && Label(\"http.method\") != \"UNKOWN\"\n | ............................................^"}
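
One hedged explanation for the mix of errors above (it is not confirmed anywhere in this thread): the filterexpr Matcher reuses a single expr vm.VM for every evaluation, and vm.VM is not safe for concurrent use because it keeps a mutable value stack between runs. When several batches are filtered concurrently, two evaluations can interleave on that shared stack, which would account for both the index-out-of-range panics and the non-boolean (or wrong-string) results. Below is a minimal sketch of serializing access with a mutex; the type and field names mirror the Matcher shown earlier but are assumptions, not the actual fix.

package filterexpr

import (
	"fmt"
	"sync"

	"github.com/antonmedv/expr/vm"
)

type Matcher struct {
	program *vm.Program
	// vm.VM reuses an internal value stack between Run calls, so concurrent
	// calls can pop each other's intermediate values.
	v  vm.VM
	mu sync.Mutex // serializes access to v
}

func (m *Matcher) match(env interface{}) (bool, error) {
	m.mu.Lock()
	defer m.mu.Unlock()

	result, err := m.v.Run(m.program, env)
	if err != nil {
		return false, err
	}

	b, ok := result.(bool)
	if !ok {
		return false, fmt.Errorf("filter expression returned %T, want bool", result)
	}
	return b, nil
}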

ajsaclayan commented Dec 13, 2022

@adam-kiss-sg We're also seeing this in v0.64.0; it occurs intermittently.

Our config looks something like:

receivers:
  otlp:

processors:
  filter/histograms:
    metrics:
      include:
        match_type: expr
        expressions:
          - MetricType == "Histogram"

exporters:
  logging:

service:
  pipelines:
    metrics:
        receivers: [otlp]
        processors: [filter/histograms]
        exporters: [logging]

Log message:

error: filter returned non-boolean value type=string result=Gauge... *excluded rest of payload*
message: shouldKeepMetric failed
exception: stacktrace:github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1.1.1
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/filter_processor.go:142
go.opentelemetry.io/collector/pdata/pmetric.MetricSlice.RemoveIf
	go.opentelemetry.io/collector/[email protected]/pmetric/generated_metrics.go:554
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1.1
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/filter_processor.go:139
go.opentelemetry.io/collector/pdata/pmetric.ScopeMetricsSlice.RemoveIf
	go.opentelemetry.io/collector/[email protected]/pmetric/generated_metrics.go:354
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics.func1
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/filter_processor.go:138
go.opentelemetry.io/collector/pdata/pmetric.ResourceMetricsSlice.RemoveIf
	go.opentelemetry.io/collector/[email protected]/pmetric/generated_metrics.go:154
github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterMetricProcessor).processMetrics
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/filter_processor.go:128
go.opentelemetry.io/collector/processor/processorhelper.NewMetricsProcessor.func1
	go.opentelemetry.io/[email protected]/processor/processorhelper/metrics.go:62
go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics
	go.opentelemetry.io/[email protected]/consumer/metrics.go:36
go.opentelemetry.io/collector/processor/processorhelper.NewMetricsProcessor.func1
	go.opentelemetry.io/[email protected]/processor/processorhelper/metrics.go:70
go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics
	go.opentelemetry.io/[email protected]/consumer/metrics.go:36
go.opentelemetry.io/collector/service/internal/fanoutconsumer.(*metricsConsumer).ConsumeMetrics
	go.opentelemetry.io/[email protected]/service/internal/fanoutconsumer/metrics.go:74
go.opentelemetry.io/collector/receiver/otlpreceiver/internal/metrics.(*Receiver).Export
	go.opentelemetry.io/collector/receiver/[email protected]/internal/metrics/otlp.go:58
go.opentelemetry.io/collector/pdata/pmetric/pmetricotlp.rawMetricsServer.Export
	go.opentelemetry.io/collector/[email protected]/pmetric/pmetricotlp/grpc.go:71
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/metrics/v1._MetricsService_Export_Handler.func1
	go.opentelemetry.io/collector/[email protected]/internal/data/protogen/collector/metrics/v1/metrics_service.pb.go:311
go.opentelemetry.io/collector/config/configgrpc.enhanceWithClientInformation.func1
	go.opentelemetry.io/[email protected]/config/configgrpc/configgrpc.go:421
google.golang.org/grpc.chainUnaryInterceptors.func1.1
	google.golang.org/[email protected]/server.go:1162
go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.UnaryServerInterceptor.func1
	go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/[email protected]/interceptor.go:341
google.golang.org/grpc.chainUnaryInterceptors.func1.1
	google.golang.org/[email protected]/server.go:1165
google.golang.org/grpc.chainUnaryInterceptors.func1
	google.golang.org/[email protected]/server.go:1167
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/metrics/v1._MetricsService_Export_Handler
	go.opentelemetry.io/collector/[email protected]/internal/data/protogen/collector/metrics/v1/metrics_service.pb.go:313
google.golang.org/grpc.(*Server).processUnaryRPC
	google.golang.org/[email protected]/server.go:1340
google.golang.org/grpc.(*Server).handleStream
	google.golang.org/[email protected]/server.go:1713
google.golang.org/grpc.(*Server).serveStreams.func1.2
	google.golang.org/[email protected]/server.go:965

@adam-kiss-sg

@ajsaclayan Can you try it now? According to the changelog, the fix was released in v0.67.0.

ajsaclayan commented Jan 10, 2023

@adam-kiss-sg Verified the issue is no longer there. Thank you!
