AWS XRay extension doesn't send traces #28610

Closed
snowe2010 opened this issue Oct 14, 2022 · 5 comments
Labels: area/tracing, kind/bug

@snowe2010

Describe the bug

X-Ray traces are not mapped between nodes. This has previously been reported. Using @patriot1burke's xray-demo provided in that issue, you can easily deploy it and see that no X-Ray traces are generated for the subsegments. In addition, I have created a reproducer that uses a Lambda rather than an API endpoint, which demonstrates that automatic instrumentation does not show traces between applications.

Expected behavior

I expect the service map in AWS (see attached screenshot) to display the full map of services called, rather than only the Quarkus app itself.

Actual behavior

For Quarkus apps we only see a single node with no dependencies (see attached screenshot).

How to Reproduce?

Using @patriot1burke's xray-demo:

  1. build with mvn clean install -Dquarkus.native.container-build=true -Dquarkus.native.builder-image=quay.io/quarkus/ubi-quarkus-native-image:22.1.0-java17
  2. deploy with sam deploy -t sam.native.yaml -g
  3. hit the endpoint through the API Gateway test console
  4. verify in the service map that there are no subsegments

Using my Lambda reproducer:

  1. build with mvn clean install -Dquarkus.native.container-build=true -Dquarkus.native.builder-image=quay.io/quarkus/ubi-quarkus-native-image:22.1.0-java17
  2. deploy with sam deploy -t sam.native.yaml -g
  3. hit the Lambda through the Test tab in the console
  4. verify in the service map that there are no calls or subsegments to any AWS services

Output of uname -a or ver

Darwin MacBook-Pro.local 21.6.0 Darwin Kernel Version 21.6.0: Mon Aug 22 20:17:10 PDT 2022; root:xnu-8020.140.49~2/RELEASE_X86_64 x86_64

Output of java -version

openjdk version "17.0.3" 2022-04-19
OpenJDK Runtime Environment GraalVM CE 22.1.0 (build 17.0.3+7-jvmci-22.1-b06)
OpenJDK 64-Bit Server VM GraalVM CE 22.1.0 (build 17.0.3+7-jvmci-22.1-b06, mixed mode, sharing)

GraalVM version (if different from Java)

No response

Quarkus version or git rev

2.13.1

Build tool (i.e. output of mvnw --version or gradlew --version)

Apache Maven 3.8.6 (84538c9988a25aec085021c365c560670ad80f63)
Maven home: /Users/tyler/.asdf/installs/maven/3.8.6
Java version: 17.0.3, vendor: GraalVM Community, runtime: /Users/tyler/.asdf/installs/java/graalvm-22.1.0+java17
Default locale: en_US, platform encoding: UTF-8
OS name: "mac os x", version: "12.6", arch: "x86_64", family: "mac"

Additional information

https://quarkusio.zulipchat.com/#narrow/stream/187030-users/topic/aws.20xray

snowe2010 added the kind/bug label Oct 14, 2022

quarkus-bot bot commented Oct 14, 2022

/cc @Ladicek, @brunobat, @radcortez


quarkus-bot bot commented Oct 14, 2022

You added a link to a Zulip discussion, please make sure the description of the issue is comprehensive and doesn't require accessing Zulip.

This message is automatically generated by a bot.


dvelho commented Nov 2, 2022

I'm also having the same issue.

radcortez (Member) commented Nov 4, 2022

I've tried to reproduce the issue by running the reproducers. I believe that everything is working as intended.

If you use https://github.com/patriot1burke/xray-demo, the configuration is missing Tracing: Active in the generated SAM file (see the sketch below). I'm wondering if we should add that automatically when the X-Ray extension is added; on the other hand, I'm not sure this is something we want to do.
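For reference, a minimal sketch of where that setting goes in a SAM template; the resource name, handler, and CodeUri below are placeholders, not taken from the demo:

```yaml
# Hypothetical fragment of a sam.native.yaml; resource name and properties are placeholders.
Resources:
  XRayDemoFunction:
    Type: AWS::Serverless::Function
    Properties:
      Runtime: provided
      Handler: not.used.in.provided.runtime
      CodeUri: function.zip
      Tracing: Active   # enables active X-Ray tracing for this function's invocations

# Or enable it once for every function in the template:
Globals:
  Function:
    Tracing: Active
```

With active tracing enabled, Lambda samples invocations and sends the resulting segments to X-Ray, so subsegments created by the function can be linked into the service map.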

[screenshot: 2022-11-04 at 12 20 24]

We can observe the subsegment generated in the tracing log.

When using your reproducer with the Lambda function, I was also able to observe AWS resources being called:

[screenshot: 2022-11-04 at 15 04 36]

I get an error invoking the service, but this is expected since I don't have SSM set up; it doesn't matter, though, because we can still observe the call to that AWS service.

radcortez self-assigned this Nov 4, 2022
radcortez (Member) commented Nov 29, 2022

Closing due to no response and being unable to reproduce.

radcortez closed this as not planned Nov 29, 2022