
[chore][pkg/pdataset/pmetrictest] introduce IgnoreDatapointAttributesOrder option to CompareMetricsOption #34076

Merged · 3 commits merged into open-telemetry:main on Jul 16, 2024

Conversation

@odubajDT (Contributor) commented on Jul 15, 2024:

By introducing the new option and using it in the servicegraphconnector test, the flakiness of the test, and therefore the false positives, are resolved.

Fixes #33998

Signed-off-by: odubajDT <[email protected]>
@jpkrohling changed the title from "[chore]: avoid flakyness in tests" to "[chore][connector/servicegraph] avoid flakyness in tests" on Jul 16, 2024
@jpkrohling (Member) commented:

Is this a duplicate of #34070?

@odubajDT (Contributor, Author) commented on Jul 16, 2024:

> Is this a duplicate of #34070?

I don't think this is a duplicate. This PR introduces an IgnoreDatapointAttributesOrder option (a CompareMetricsOption), which on the one hand resolves the flakiness of the tests, and on the other can also be used in other tests to easily check whether attribute lists are equal without having to consider the actual order of the attributes.
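For illustration, here is a minimal sketch of how such an option might be passed to pmetrictest.CompareMetrics in a test. The constructor name IgnoreDatapointAttributesOrder is taken from the PR title; the import path (pkg/pdatatest/pmetrictest), the metric name, the attribute keys, and the newMetrics helper are assumptions made for this example only:

```go
package servicegraphconnector_test

import (
	"testing"

	"github.com/stretchr/testify/require"
	"go.opentelemetry.io/collector/pdata/pmetric"

	"github.com/open-telemetry/opentelemetry-collector-contrib/pkg/pdatatest/pmetrictest"
)

// newMetrics is a hypothetical helper that builds a single sum metric whose
// datapoint carries the given attribute keys in the given insertion order.
func newMetrics(orderedKeys []string) pmetric.Metrics {
	md := pmetric.NewMetrics()
	m := md.ResourceMetrics().AppendEmpty().ScopeMetrics().AppendEmpty().Metrics().AppendEmpty()
	m.SetName("traces_service_graph_request_total") // illustrative metric name
	dp := m.SetEmptySum().DataPoints().AppendEmpty()
	dp.SetIntValue(1)
	for _, k := range orderedKeys {
		dp.Attributes().PutStr(k, k+"-value")
	}
	return md
}

// The two metrics below differ only in the insertion order of their datapoint
// attributes; with the new option the comparison is expected to pass.
func TestCompareIgnoringAttributeOrder(t *testing.T) {
	expected := newMetrics([]string{"client", "server", "failed"})
	actual := newMetrics([]string{"failed", "client", "server"})

	err := pmetrictest.CompareMetrics(expected, actual,
		// option introduced by this PR; exact constructor signature assumed
		pmetrictest.IgnoreDatapointAttributesOrder(),
	)
	require.NoError(t, err)
}
```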

@odubajDT changed the title from "[chore][connector/servicegraph] avoid flakyness in tests" to "[chore][pkg/pdataset/pmetrictest] introduce IgnoreDatapointAttributesOrder option to CompareMetricsOption" on Jul 16, 2024
@odubajDT marked this pull request as ready for review on July 16, 2024 07:42
@odubajDT requested a review from a team on July 16, 2024 07:42
@jpkrohling (Member) commented:

Sorry, my question was whether they solve the same problem. This PR here had no description before, making it hard to tell what problem it's solving. Can you add "Fixes #xyz" to the description, pointing to the issue it fixes?

@odubajDT (Contributor, Author) commented:

> Sorry, my question was whether they solve the same problem. This PR here had no description before, making it hard to tell what problem it's solving. Can you add "Fixes #xyz" to the description, pointing to the issue it fixes?

Yes, I agree the title was a little misleading, sorry for that; I have already adapted it.

Unfortunately (AFAIK) there is no issue filed for the flakiness of this test. But since I was working on one of the components and the connector test was randomly failing, I looked into it and found that the failures come from comparing the attribute lists (different order, but same content). I therefore went for a small improvement and introduced the IgnoreDatapointAttributesOrder option into CompareMetricsOption, which orders the attribute lists and thereby removes the potential failures.

I hope it's not a problem that there is no issue for this, since it's basically only an adaptation of test code, not production code.
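To illustrate the ordering idea described above, here is a minimal sketch of sorting a datapoint attribute map by key before comparison, using the pcommon API from go.opentelemetry.io/collector/pdata. This is an assumed illustration (including the helper name and package placement), not the PR's actual implementation:

```go
package pmetrictest // hypothetical location for this sketch

import (
	"sort"

	"go.opentelemetry.io/collector/pdata/pcommon"
)

// sortAttributes is a hypothetical helper that rebuilds an attribute map with
// its keys in lexicographic order, so two maps with the same content but a
// different insertion order compare as equal afterwards.
func sortAttributes(attrs pcommon.Map) {
	// Collect all keys present in the map.
	keys := make([]string, 0, attrs.Len())
	attrs.Range(func(k string, _ pcommon.Value) bool {
		keys = append(keys, k)
		return true
	})
	sort.Strings(keys)

	// Rebuild the map in sorted key order, then copy it back over the original.
	sorted := pcommon.NewMap()
	for _, k := range keys {
		v, _ := attrs.Get(k)
		v.CopyTo(sorted.PutEmpty(k))
	}
	sorted.CopyTo(attrs)
}
```

In the option itself, a helper like this would presumably be applied to every datapoint of both the expected and the actual metrics before the existing comparison runs.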

@jpkrohling (Member) commented:

Sorry, but it does sound like it's the same issue that the other PR is aiming to solve: #33998

I think I prefer this approach here, but I want to make sure we don't have two work streams for the same thing, and end up with both PRs getting merged when only one is necessary.

@odubajDT (Contributor, Author) commented:

> Sorry, but it does sound like it's the same issue that the other PR is aiming to solve: #33998
>
> I think I prefer this approach here, but I want to make sure we don't have two work streams for the same thing, and end up with both PRs getting merged when only one is necessary.

Leaving the decision up to you, thanks for the info!

@jpkrohling (Member) commented:

I'm marking this as solving #33998, but let me know if you think this is a different problem.

@jpkrohling (Member) commented:

@t00mas, can you review this one as well, and compare with the other PR solving the same problem?

@jpkrohling merged commit 8a28ee3 into open-telemetry:main on Jul 16, 2024 · 156 checks passed
@jpkrohling (Member) commented:

Given how frequent this failure is, and that this change seems more sustainable than the other PR, I approved and merged this one. I'll block the other one, unblocking it in case it provides extra value.

@evantorrie (Contributor) commented:

Merging this PR did not fix the flaky test. It continues to fail regularly after this merge, usually every three or so runs on my desktop.

#34120 was added earlier today to completely skip this test.

#34070 does fix #33998, so I will update #34070 to reinstate the now-skipped test in the main branch.

@odubajDT (Contributor, Author) commented:

> Merging this PR did not fix the flaky test. It continues to fail regularly after this merge, usually every three or so runs on my desktop.
>
> #34120 was added earlier today to completely skip this test.
>
> #34070 does fix #33998, so I will update #34070 to reinstate the now-skipped test in the main branch.

Hi, can you point me to the failing test run? Thank you!

Linked issue (may be closed by this PR): [connector/servicegraph] Unit test failure: TestConnectorConsume/test_fix_failed_label_not_work