
tracing: improve testing of shadow tracers and fix logging regression #61247

Merged

3 commits merged into cockroachdb:master from tracing-verbose-external on Mar 2, 2021

Conversation

tbg
Member

@tbg tbg commented Mar 1, 2021

Rearrange the existing test, and test the Zipkin tracer as well. Phase
out the mockTracer; we can just use a zipkin tracer with an in-mem
collector.

Note that the updated tests also verify that if we create a span from a
tracer configured with a shadow tracer (zipkin), that span does indeed
record verbose log messages (cc @andreimatei).

Release justification: testing improvement
Release note: None

Previously (i.e. before the current cycle's tracing work), a trace that had
a shadow trace attached to it was always regarded as verbose (there
was no explicit notion of verbosity; the span was recording, which
was equivalent to it being verbose).

Now, verbose spans are those that collect all log messages in the
crdb-internal span. The log package had not been updated to reflect
this, so when a non-verbose span was set up with, say, a zipkin trace,
it was not regarded as worth logging into by `log.Event` and friends.

This commit fixes this as follows:

  • a span that has a shadow span but isn't explicitly set to verbose
    remains non-verbose, but
  • the logging framework (ExpensiveLogEnabled) asks the tracer whether
    an auxiliary logging sink is currently set up; if so it does relay all
    log messages to the trace span nevertheless, in effect restoring the
    old behavior.
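The two bullets above amount to a small predicate change, which can be sketched as follows. The types and names here are illustrative stand-ins, not the actual `tracing` package API:

```go
package main

import "fmt"

// span is an illustrative stand-in for a trace span: it may be
// explicitly verbose, and it may have an auxiliary logging sink
// (e.g. a Zipkin shadow span) attached.
type span struct {
	verbose   bool
	hasShadow bool
}

// expensiveLogEnabled sketches the fixed check: a span with a shadow
// span stays non-verbose, but log.Event and friends still log into it
// when an auxiliary sink is set up.
func expensiveLogEnabled(s *span) bool {
	if s == nil {
		return false
	}
	return s.verbose || s.hasShadow
}

func main() {
	fmt.Println(expensiveLogEnabled(&span{hasShadow: true})) // shadow sink restores the old behavior
	fmt.Println(expensiveLogEnabled(&span{}))                // plain span: skip expensive logging
}
```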

As a nice improvement, we now hand log messages only to the
shadow span. Previously, the messages were added to the shadow span
and duplicated in the crdb-internal span, for no good reason. This
alone should cut the memory overhead of running with an external
trace collector by around 50%.
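The new routing might look like the following sketch. `recordingSpan` and its fields are hypothetical names for illustration, not the actual implementation:

```go
package main

import "fmt"

// recordingSpan sketches where a log record ends up after this change:
// with a shadow span attached it goes only to the shadow span, and the
// crdb-internal recording only sees it when the span is verbose.
type recordingSpan struct {
	verbose   bool
	hasShadow bool
	shadow    []string // forwarded to the external (e.g. Zipkin) span
	crdb      []string // kept in the crdb-internal recording
}

func (s *recordingSpan) record(msg string) {
	if s.hasShadow {
		s.shadow = append(s.shadow, msg)
		// Previously the message would also fall through and be
		// duplicated into s.crdb below; returning here is the ~50%
		// memory saving the commit message describes.
		return
	}
	if s.verbose {
		s.crdb = append(s.crdb, msg)
	}
}

func main() {
	s := &recordingSpan{hasShadow: true}
	s.record("evaluating batch")
	fmt.Println(len(s.shadow), len(s.crdb)) // message lives only in the shadow span
}
```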

Release justification: fixes a recent regression in tracing functionality
Release note: None

@cockroach-teamcity
Member

This change is Reviewable

Contributor

@knz knz left a comment


I believe this commit also fixes a bug (with the `&& tr != nil` condition)? If so maybe mention that in the commit message.

Reviewed 4 of 4 files at r1.
Reviewable status: :shipit: complete! 0 of 0 LGTMs obtained (waiting on @angelapwen and @irfansharif)

@tbg tbg changed the title from "tracing: improve testing of shadow tracers" to "tracing: improve testing of shadow tracers and fix logging regression" on Mar 1, 2021
@tbg
Member Author

tbg commented Mar 1, 2021

This will need another review, as a substantial commit was added.

Contributor

@knz knz left a comment


:lgtm: with nit

also you need a round of `make bazel-generate` on the 2nd commit

Reviewed 3 of 3 files at r2, 5 of 5 files at r3.
Reviewable status: :shipit: complete! 1 of 0 LGTMs obtained (waiting on @angelapwen, @irfansharif, and @tbg)


pkg/util/log/trace_client_test.go, line 23 at r2 (raw file):

func TestTrace(t *testing.T) {
	ctx := context.Background()

might need a `defer leaktest.AfterTest(t)()`


pkg/util/log/trace_client_test.go, line 58 at r2 (raw file):

func TestTraceWithTags(t *testing.T) {
	ctx := context.Background()

ditto
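For context, `leaktest.AfterTest` guards against goroutine leaks across a test. A rough stdlib-only approximation of what the deferred check does (the real CockroachDB helper diffs goroutine stack dumps rather than counting, so this is only a sketch):

```go
package main

import (
	"fmt"
	"runtime"
	"time"
)

// afterTest snapshots the goroutine count and returns a func that
// verifies no goroutines leaked, retrying briefly to let well-behaved
// goroutines finish exiting.
func afterTest() func() error {
	before := runtime.NumGoroutine()
	return func() error {
		for i := 0; i < 50; i++ {
			if runtime.NumGoroutine() <= before {
				return nil
			}
			time.Sleep(10 * time.Millisecond)
		}
		return fmt.Errorf("leaked goroutines: %d -> %d", before, runtime.NumGoroutine())
	}
}

func main() {
	check := afterTest() // in a test this would be: defer leaktest.AfterTest(t)()
	done := make(chan struct{})
	go func() { close(done) }() // well-behaved goroutine exits promptly
	<-done
	if err := check(); err != nil {
		fmt.Println("leak:", err)
	} else {
		fmt.Println("no leaks")
	}
}
```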

@tbg tbg force-pushed the tracing-verbose-external branch from 750e244 to fddfba6 on March 1, 2021 at 14:58
tbg added 3 commits March 2, 2021 09:19
@tbg tbg force-pushed the tracing-verbose-external branch from fddfba6 to 3c946d0 on March 2, 2021 at 08:22
@tbg
Member Author

tbg commented Mar 2, 2021

bors r=knz

@craig
Contributor

craig bot commented Mar 2, 2021

Build succeeded:

@craig craig bot merged commit 9bf97ac into cockroachdb:master Mar 2, 2021
4 participants