
Fix propagation issues for context in the new obsreport usage #625

Merged: 1 commit merged into open-telemetry:master on Mar 12, 2020

Conversation

@bogdandrutu (Member) opened this pull request.

No description provided.

@bogdandrutu (Member, Author) left a comment:

Almost every receiver had an issue with passing the created Span or the Tags to the next component :(
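
To make the pattern concrete, here is a minimal, self-contained sketch of the bug class, using go.opencensus.io/trace directly rather than the collector's obsreport helpers (the consume function and span name are illustrative, not from this PR):

    package main

    import (
        "context"
        "fmt"

        "go.opencensus.io/trace"
    )

    // consume stands in for the next consumer in the pipeline; it only reports
    // whether a span arrived via the context it was handed.
    func consume(ctx context.Context) {
        fmt.Println("span present:", trace.FromContext(ctx) != nil)
    }

    func main() {
        parent := context.Background()

        // Buggy pattern: the context returned by StartSpan is discarded, so the
        // consumer sees no span (prints "span present: false").
        _, span := trace.StartSpan(parent, "receiver/op")
        consume(parent)
        span.End()

        // Fixed pattern: the returned context is what goes downstream
        // (prints "span present: true").
        ctx, span2 := trace.StartSpan(parent, "receiver/op")
        consume(ctx)
        span2.End()
    }

The review comments below flag each call site where the returned context was dropped in this way.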

Comment on lines -335 to -336
_, span := obsreport.StartTraceDataReceiveOp(
receiverCtx, jr.instanceName, collectorHTTPTransport)

@bogdandrutu (Member, Author):
Here it was a bug because the Span was not propagated to the consumer.

Comment on lines -347 to -348
_, span := obsreport.StartTraceDataReceiveOp(
ctx, jtr.instanceName, collectorTChannelTransport)

@bogdandrutu (Member, Author):
Here it was a bug because the Span was not propagated to the consumer.

Comment on lines -369 to -370
_, span := obsreport.StartTraceDataReceiveOp(
context.Background(), jr.instanceName, agentTransport)

@bogdandrutu (Member, Author):
Here it was a bug because the Span was not propagated to the consumer.

Comment on lines -403 to -404
_, span := obsreport.StartTraceDataReceiveOp(
ctx, jr.instanceName, grpcTransport)

@bogdandrutu (Member, Author):
Here it was a bug because the Span was not propagated to the consumer.

@@ -113,13 +114,17 @@ func (ocr *Receiver) processReceivedMetrics(longLivedRPCCtx context.Context, ni
 func (ocr *Receiver) sendToNextConsumer(longLivedRPCCtx context.Context, md consumerdata.MetricsData) {
 	// Do not use longLivedRPCCtx to start the span so this trace ends right at this
 	// function, and the span is not a child of any span from the stream context.
-	_, span := obsreport.StartMetricsReceiveOp(
+	tmpCtx := obsreport.StartMetricsReceiveOp(
 		context.Background(),
 		ocr.instanceName,
 		receiverTransport)

@bogdandrutu (Member, Author):
Here it was a bug because the Span was not propagated to the consumer.

@@ -160,7 +165,7 @@ func (ocr *Receiver) sendToNextConsumer(longLivedRPCCtx context.Context, traceda
err = ocr.nextConsumer.ConsumeTraceData(ctx, *tracedata)

@bogdandrutu (Member, Author):
Bug: We did not pass the tags from the longLivedRPCCtx to the consumer, so metrics recorded in the producer/exporter will not have them.
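
For illustration, a small self-contained sketch of why deriving the downstream context from context.Background() loses the tags (the tag key and value are made up; the collector's real tags are attached by obsreport). OpenCensus stats recorded against a context only carry whatever tag map that context holds:

    package main

    import (
        "context"
        "fmt"

        "go.opencensus.io/tag"
    )

    // Illustrative tag key; the collector's receiver/transport tags are set elsewhere.
    var keyReceiver = tag.MustNewKey("receiver")

    func main() {
        // The long-lived RPC context carries tags such as the receiver name.
        longLivedRPCCtx, _ := tag.New(context.Background(), tag.Insert(keyReceiver, "opencensus"))

        // Buggy pattern: build the downstream context from context.Background(),
        // so the tag map is empty and downstream metrics lose the labels.
        dropped := context.Background()
        fmt.Println("tag map present on dropped ctx:", tag.FromContext(dropped) != nil)         // false

        // Fixed pattern: derive the downstream context from longLivedRPCCtx so the
        // tags travel with the data to the consumer.
        fmt.Println("tag map present on kept ctx:   ", tag.FromContext(longLivedRPCCtx) != nil) // true
    }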

@codecov-io commented:

Codecov Report

Merging #625 into master will decrease coverage by <.01%.
The diff coverage is 85.1%.


@@            Coverage Diff             @@
##           master     #625      +/-   ##
==========================================
- Coverage   75.27%   75.26%   -0.01%     
==========================================
  Files         139      139              
  Lines        9674     9672       -2     
==========================================
- Hits         7282     7280       -2     
  Misses       2072     2072              
  Partials      320      320
Impacted Files Coverage Δ
obsreport/obsreport.go 66.66% <0%> (-1.2%) ⬇️
receiver/zipkinreceiver/trace_receiver.go 60.84% <0%> (ø) ⬆️
...eceiver/opencensusreceiver/ocmetrics/opencensus.go 84.61% <100%> (ø) ⬆️
exporter/exporterhelper/tracehelper.go 93.33% <100%> (ø) ⬆️
obsreport/obsreport_exporter.go 100% <100%> (ø) ⬆️
exporter/exporterhelper/metricshelper.go 95.34% <100%> (-0.11%) ⬇️
receiver/jaegerreceiver/trace_receiver.go 85.88% <100%> (-0.06%) ⬇️
receiver/opencensusreceiver/octrace/opencensus.go 75.43% <100%> (+0.43%) ⬆️
obsreport/obsreport_receiver.go 100% <100%> (ø) ⬆️
...eceiver/prometheusreceiver/internal/transaction.go 77.45% <66.66%> (+0.75%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 5334b3a...20a532d.

@tigrannajaryan (Member) left a comment:

Will you have some time to do a quick presentation on how to do context propagation correctly? I have no idea how this code works, and I suspect there may be others on the team who would benefit from what you can tell us.

@pjanotti (Contributor) left a comment:

Thanks for catching my bugs @bogdandrutu

@@ -75,7 +75,7 @@ func StartTraceDataReceiveOp(
 	operationCtx context.Context,
 	receiver string,
 	transport string,
-) (context.Context, *trace.Span) {
+) context.Context {

@pjanotti (Contributor):
Ok, it makes sense: the few cases where one may want to add something to the span can be handled by extracting it from the context.
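
A small illustration of that (a sketch, with trace.StartSpan standing in for the context returned by the obsreport helper, and a made-up attribute name):

    package main

    import (
        "context"

        "go.opencensus.io/trace"
    )

    func main() {
        // Stand-in for the context returned by obsreport.Start*ReceiveOp, which now
        // carries the started span instead of returning it as a second value.
        ctx, _ := trace.StartSpan(context.Background(), "receiver/example")

        // The rare caller that wants to annotate the span extracts it again:
        if span := trace.FromContext(ctx); span != nil {
            span.AddAttributes(trace.StringAttribute("example.attribute", "value")) // illustrative attribute
            span.End()
        }
    }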


// TODO: We should offer a version of StartTraceDataReceiveOp that starts the Span
// as root, and links to the longLived context.
ctx := trace.NewContext(longLivedRPCCtx, trace.FromContext(tmpCtx))

@pjanotti (Contributor):
Thanks, I wasn't aware that this is the correct way to propagate the tags from the parent while still having a trace for the individual operation. I will work on getting the TODO done.
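
For reference, a self-contained sketch of that combination (the tag key, span name, and values are illustrative): the per-operation span is started from a fresh context so it does not become a child of the stream's span, and trace.NewContext then mounts that span onto the long-lived context so its tags still reach the consumer.

    package main

    import (
        "context"
        "fmt"

        "go.opencensus.io/tag"
        "go.opencensus.io/trace"
    )

    // Illustrative tag key; the collector's real tags are attached by obsreport.
    var keyTransport = tag.MustNewKey("transport")

    func main() {
        // Long-lived stream context: carries tags (and possibly a long-lived span).
        longLivedRPCCtx, _ := tag.New(context.Background(), tag.Insert(keyTransport, "grpc"))

        // Start the per-operation span from a fresh context so it is a root span,
        // not a child of anything on the stream context.
        tmpCtx, span := trace.StartSpan(context.Background(), "receiver/consume")
        defer span.End()

        // Combine: keep the values (tags) of longLivedRPCCtx but attach the
        // per-operation span; this is the context handed to the next consumer.
        ctx := trace.NewContext(longLivedRPCCtx, trace.FromContext(tmpCtx))

        fmt.Println("tags preserved:", tag.FromContext(ctx) != nil)   // true
        fmt.Println("span attached: ", trace.FromContext(ctx) != nil) // true
    }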

@tigrannajaryan merged commit 4f385cd into open-telemetry:master on Mar 12, 2020.
@bogdandrutu deleted the obsrec branch on August 5, 2020 at 18:49.
hughesjj pushed a commit to hughesjj/opentelemetry-collector that referenced this pull request Apr 27, 2023
Troels51 pushed a commit to Troels51/opentelemetry-collector that referenced this pull request Jul 5, 2024