[Testing Platform] InconclusiveTestNodeStateProperty - Discussion #3699

Open
thomhurst opened this issue Aug 26, 2024 · 16 comments
Labels
Area: Testing Platform (Belongs to the Microsoft.Testing.Platform core library) · Type: Discussion

Comments

@thomhurst
Contributor

There is no current way to mark a test as inconclusive. What do you think about a TestNodeStateProperty for this?
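
For illustration, such a property could simply mirror the shape of the existing state properties. This is purely hypothetical, it does not exist in Microsoft.Testing.Platform today, and the namespace and base record are my assumptions based on TestNodeProperties.cs:

using Microsoft.Testing.Platform.Extensions.Messages;

// Hypothetical sketch only, modeled on SkippedTestNodeStateProperty and friends.
// Not part of the platform; the base record and namespace are assumed.
public sealed record InconclusiveTestNodeStateProperty(string? Explanation = null)
    : TestNodeStateProperty(Explanation);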

NUnit has a method to mark a test as inconclusive: Assert.Inconclusive() - So I'm not sure how they plan to handle this in the new world.

For my framework, I have allowed tests to depend on other tests. If the dependency fails, I don't necessarily want to mark the dependent tests as failed, as that creates noise and log pollution and makes it more difficult to track down the issue, so I'd want them to be marked as inconclusive. Basically saying, "I don't know what state this test is in".

I don't want to set it as Skipped because I see that as an explicit choice by the user, which might cause confusion.

What do you think?

@Evangelink
Member

What does "inconclusive" represents for you? We see that as just a skip for some reason, there is for me no strong difference between Skip/Ignore attribute, some Conditional attribute or a condition in code that makes you skip the test.

@Evangelink added the Type: Discussion and Area: Testing Platform labels and removed the Needs: Triage 🔍 label on Aug 28, 2024
@thomhurst
Contributor Author

What does "inconclusive" represents for you? We see that as just a skip for some reason, there is for me no strong difference between Skip/Ignore attribute, some Conditional attribute or a condition in code that makes you skip the test.

I feel like "Skip" is a user choice, they decide whether to Skip a test.

I mean here's the dictionary definition:

not giving or having a result or decision
https://dictionary.cambridge.org/dictionary/english/inconclusive

So for my scenario:

  • Test has a dependency
  • Dependency fails
  • My test may not have the correct state now, so it isn't run (but it was intended to be run)
  • Therefore there is no result to give to the user, which means it is inconclusive

@thomhurst
Contributor Author

I see discussions are turned on now! Shall we convert this issue to a discussion?

@MarcoRossignoli
Contributor

MarcoRossignoli commented Sep 5, 2024

So for my scenario:

  • Test has a dependency
  • Dependency fails
  • My test may not have the correct state now, so it isn't run (but it was intended to be run)
  • Therefore there is no result to give to the user, which means it is inconclusive

Isn't this a Skip(Reason="Dependency x failed")?

I mean, is your expectation that the suite should fail or succeed?

@thomhurst
Contributor Author

thomhurst commented Sep 5, 2024

So for my scenario:

  • Test has a dependency
  • Dependency fails
  • My test may not have the correct state now, so it isn't run (but it was intended to be run)
  • Therefore there is no result to give to the user, which means it is inconclusive

Isn't this a Skip(Reason="Dependency x failed")?

I mean, is your expectation that the suite should fail or succeed?

I would say fail? Because if we've got inconclusive tests, we're not fully confident that our test suite has exercised all of our paths?

Also, to throw another spanner in the works, the trx report has a counter for inconclusive.

  <ResultSummary outcome="Completed">
    <Counters total="1" executed="1" passed="0" failed="1" error="0" timeout="0" aborted="0" inconclusive="0" passedButRunAborted="0" notRunnable="0" notExecuted="0" disconnected="0" warning="0" completed="0" inProgress="0" pending="0" />
  </ResultSummary>

@bradwilson

bradwilson commented Sep 5, 2024

In xUnit.net terms, the rough equivalent is an Explicit test which is not run unless you ask it to be run (I believe NUnit also has this concept). When I asked about this, the suggestion to Skip was given, and that's where I went at first...until I realized that would mark the test as Skipped in Test Explorer, which I didn't want. Test Explorer already has a Not Run state, and I wanted to preserve that.
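
For reference, marking a test as explicit in xUnit.net v3 looks roughly like this (the Explicit property name is from memory, so treat it as approximate):

using Xunit;

public class DatabaseMaintenanceTests
{
    // Explicit tests are not run in a normal pass; they only run when
    // explicitly requested, so they don't show up as Skipped.
    [Fact(Explicit = true)]
    public void ResetTestDatabase()
    {
        // deliberately-run, expensive work would go here
    }
}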

Also, to throw another spanner in the works, the trx report has a counter for inconclusive.

Yep, it also has notRunnable, which seems like it might fit perfectly for my use case (and now I'm going to go update my TRX report to use that 😂).

@MarcoRossignoli
Contributor

MarcoRossignoli commented Sep 5, 2024

I think the TRX outcomes are more "extended" than the cases supported by VS for historical reasons (AzDo support?) and predate VS? https://github.com/microsoft/vstest/blob/main/src/Microsoft.TestPlatform.Extensions.TrxLogger/ObjectModel/TestOutcome.cs

@nohwnd @drognanar do you know if my assumption makes sense, or whether VS supports all these states?

If we find out that they are supported, we can add new states too as long as we don't break VS; if we would break it, we can make some updates there to handle the states correctly.

@bradwilson

I can verify that Visual Studio 2022 17.11 does correctly show (and differentiate) NotRunnable tests after updating our TRX report to show it as such.

Abbreviated test run of four modified tests to show our four states:

<?xml version="1.0" encoding="utf-8"?>
<TestRun id="00a0cc90-a4a4-4e1f-8b53-b53fdc7816a0" name="bradwilson@SERVER 09/05/2024 13:31:55" runUser="bradwilson" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Times creation="2024-09-05T13:31:55.4660973-07:00" queuing="2024-09-05T13:31:55.4660973-07:00" start="2024-09-05T13:31:55.4660973-07:00" finish="2024-09-05T20:31:55.5612491+00:00" />
  <TestSettings name="default" id="6c4d5628-128d-4c3b-a1a4-ab366a4594ad" />
  <Results>
    <UnitTestResult testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" testName="BooleanAssertsTests+True.ThrowsExceptionWhenNull" testId="717a3529-fa4a-45fb-8530-f9c0a09f9d3a" executionId="717a3529-fa4a-45fb-8530-f9c0a09f9d3a" computerName="SERVER" outcome="Failed" duration="00:00:00.0140000" startTime="2024-09-05T20:31:55.5182366+00:00" endTime="2024-09-05T20:31:55.5532361+00:00">
      <Output>
        <ErrorInfo>
          <Message>System.DivideByZeroException : Attempted to divide by zero.</Message>
          <StackTrace>   at BooleanAssertsTests.True.ThrowsExceptionWhenNull() in C:\Dev\xunit\xunit\src\xunit.v3.assert.tests\Asserts\BooleanAssertsTests.cs:line 90</StackTrace>
        </ErrorInfo>
      </Output>
    </UnitTestResult>
    <UnitTestResult testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" testName="BooleanAssertsTests+True.ThrowsExceptionWhenFalse" testId="5a7cb26b-40c6-4f4d-b399-59c7c58b9dc2" executionId="5a7cb26b-40c6-4f4d-b399-59c7c58b9dc2" computerName="SERVER" outcome="NotRunnable" duration="00:00:00.0000000" startTime="2024-09-05T20:31:55.5582487+00:00" endTime="2024-09-05T20:31:55.5592493+00:00" />
    <UnitTestResult testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" testName="BooleanAssertsTests+True.AssertTrue" testId="651b4b40-af9f-4aab-815a-962a2839a324" executionId="651b4b40-af9f-4aab-815a-962a2839a324" computerName="SERVER" outcome="NotExecuted" duration="00:00:00.0000000" startTime="2024-09-05T20:31:55.5592493+00:00" endTime="2024-09-05T20:31:55.5592493+00:00">
      <Output>
        <StdOut>Not run</StdOut>
      </Output>
    </UnitTestResult>
    <UnitTestResult testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" testName="BooleanAssertsTests+True.UserSuppliedMessage" testId="17bfb80a-1894-4bce-8232-4d8e8837eb18" executionId="17bfb80a-1894-4bce-8232-4d8e8837eb18" computerName="SERVER" outcome="Passed" duration="00:00:00.0010000" startTime="2024-09-05T20:31:55.5602491+00:00" endTime="2024-09-05T20:31:55.5612491+00:00" />
  </Results>
  <TestDefinitions>
    <UnitTest name="BooleanAssertsTests+True.ThrowsExceptionWhenNull" storage="C:\Dev\xunit\xunit\src\xunit.v3.assert.tests\bin\Debug\net472\xunit.v3.assert.tests.exe" id="717a3529-fa4a-45fb-8530-f9c0a09f9d3a">
      <Execution id="717a3529-fa4a-45fb-8530-f9c0a09f9d3a" />
      <TestMethod codeBase="C:\Dev\xunit\xunit\src\xunit.v3.assert.tests\bin\Debug\net472\xunit.v3.assert.tests.exe" className="BooleanAssertsTests+True" name="ThrowsExceptionWhenNull" adapterTypeName="executor://00a0cc90-a4a4-4e1f-8b53-b53fdc7816a0/" />
    </UnitTest>
    <UnitTest name="BooleanAssertsTests+True.ThrowsExceptionWhenFalse" storage="C:\Dev\xunit\xunit\src\xunit.v3.assert.tests\bin\Debug\net472\xunit.v3.assert.tests.exe" id="5a7cb26b-40c6-4f4d-b399-59c7c58b9dc2">
      <Execution id="5a7cb26b-40c6-4f4d-b399-59c7c58b9dc2" />
      <TestMethod codeBase="C:\Dev\xunit\xunit\src\xunit.v3.assert.tests\bin\Debug\net472\xunit.v3.assert.tests.exe" className="BooleanAssertsTests+True" name="ThrowsExceptionWhenFalse" adapterTypeName="executor://00a0cc90-a4a4-4e1f-8b53-b53fdc7816a0/" />
    </UnitTest>
    <UnitTest name="BooleanAssertsTests+True.AssertTrue" storage="C:\Dev\xunit\xunit\src\xunit.v3.assert.tests\bin\Debug\net472\xunit.v3.assert.tests.exe" id="651b4b40-af9f-4aab-815a-962a2839a324">
      <Execution id="651b4b40-af9f-4aab-815a-962a2839a324" />
      <TestMethod codeBase="C:\Dev\xunit\xunit\src\xunit.v3.assert.tests\bin\Debug\net472\xunit.v3.assert.tests.exe" className="BooleanAssertsTests+True" name="AssertTrue" adapterTypeName="executor://00a0cc90-a4a4-4e1f-8b53-b53fdc7816a0/" />
    </UnitTest>
    <UnitTest name="BooleanAssertsTests+True.UserSuppliedMessage" storage="C:\Dev\xunit\xunit\src\xunit.v3.assert.tests\bin\Debug\net472\xunit.v3.assert.tests.exe" id="17bfb80a-1894-4bce-8232-4d8e8837eb18">
      <Execution id="17bfb80a-1894-4bce-8232-4d8e8837eb18" />
      <TestMethod codeBase="C:\Dev\xunit\xunit\src\xunit.v3.assert.tests\bin\Debug\net472\xunit.v3.assert.tests.exe" className="BooleanAssertsTests+True" name="UserSuppliedMessage" adapterTypeName="executor://00a0cc90-a4a4-4e1f-8b53-b53fdc7816a0/" />
    </UnitTest>
  </TestDefinitions>
  <TestEntries>
    <TestEntry testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" testId="717a3529-fa4a-45fb-8530-f9c0a09f9d3a" executionId="717a3529-fa4a-45fb-8530-f9c0a09f9d3a" />
    <TestEntry testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" testId="5a7cb26b-40c6-4f4d-b399-59c7c58b9dc2" executionId="5a7cb26b-40c6-4f4d-b399-59c7c58b9dc2" />
    <TestEntry testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" testId="651b4b40-af9f-4aab-815a-962a2839a324" executionId="651b4b40-af9f-4aab-815a-962a2839a324" />
    <TestEntry testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" testId="17bfb80a-1894-4bce-8232-4d8e8837eb18" executionId="17bfb80a-1894-4bce-8232-4d8e8837eb18" />
  </TestEntries>
  <TestLists>
    <TestList name="Results Not in a List" id="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestList name="All Loaded Results" id="19431567-8539-422a-85d7-44ee4e166bda" />
  </TestLists>
  <ResultSummary outcome="Completed">
    <Counters total="4" executed="2" passed="1" failed="1" error="0" timeout="0" aborted="0" inconclusive="0" passedButRunAborted="0" notRunnable="1" notExecuted="1" disconnected="0" warning="0" completed="0" inProgress="0" pending="0" />
  </ResultSummary>
</TestRun>

And the Visual Studio UI after double-clicking on this TRX file:

[screenshot of the Visual Studio UI]

As an FYI, the Microsoft.Testing.Platform TRX report does not include the skipped tests in the notExecuted counters list (despite showing it had an outcome of NotExecuted), nor is it showing the reported StdOut which includes the skip reason:

<?xml version="1.0" encoding="utf-8"?>
<TestRun id="ee72b06a-db83-4d18-a7ea-379c91c78bc1" name="bradwilson@SERVER 2024-09-05 20:31:55.626" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Times creation="2024-09-05T20:31:55.2773214Z" queuing="2024-09-05T20:31:55.2773214Z" start="2024-09-05T20:31:55.2773214Z" finish="2024-09-05T20:31:55.6274699Z" />
  <TestSettings name="default" id="33afb8f1-3a79-4db4-8e85-26eadb430422">
    <Deployment runDeploymentRoot="bradwilson_SERVER_2024-09-05_20_31_55.626" />
  </TestSettings>
  <Results>
    <UnitTestResult executionId="066eeefb-a5d7-42dd-bd56-caeebe3c7236" testId="f2ea74e4-8466-e2c0-42a0-bf8451cc7d0d" testName="BooleanAssertsTests+True.ThrowsExceptionWhenNull" computerName="SERVER" duration="00:00:00.0140000" startTime="2024-09-05T20:31:55.5182366+00:00" endTime="2024-09-05T20:31:55.5532361+00:00" testType="13CDC9D9-DDB5-4fa4-A97D-D965CCFC6D4B" outcome="Failed" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" relativeResultsDirectory="066eeefb-a5d7-42dd-bd56-caeebe3c7236">
      <Output>
        <ErrorInfo>
          <Message>System.DivideByZeroException : Attempted to divide by zero.</Message>
          <StackTrace>   at BooleanAssertsTests.True.ThrowsExceptionWhenNull() in C:\Dev\xunit\xunit\src\xunit.v3.assert.tests\Asserts\BooleanAssertsTests.cs:line 90</StackTrace>
        </ErrorInfo>
      </Output>
    </UnitTestResult>
    <UnitTestResult executionId="a9ca34ac-304b-44ea-818b-7dd677932b0c" testId="f4695967-e356-4f63-e2e5-6afcb3d3c000" testName="BooleanAssertsTests+True.AssertTrue" computerName="SERVER" duration="00:00:00.0000000" startTime="2024-09-05T20:31:55.5592493+00:00" endTime="2024-09-05T20:31:55.5592493+00:00" testType="13CDC9D9-DDB5-4fa4-A97D-D965CCFC6D4B" outcome="NotExecuted" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" relativeResultsDirectory="a9ca34ac-304b-44ea-818b-7dd677932b0c" />
    <UnitTestResult executionId="b55cabed-b31b-4c33-a362-5614cc016fc5" testId="3cdd7e2c-5b59-cfb4-42fe-471a34930670" testName="BooleanAssertsTests+True.UserSuppliedMessage" computerName="SERVER" duration="00:00:00.0010000" startTime="2024-09-05T20:31:55.5602491+00:00" endTime="2024-09-05T20:31:55.5612491+00:00" testType="13CDC9D9-DDB5-4fa4-A97D-D965CCFC6D4B" outcome="Passed" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" relativeResultsDirectory="b55cabed-b31b-4c33-a362-5614cc016fc5" />
  </Results>
  <TestDefinitions>
    <UnitTest name="BooleanAssertsTests+True.ThrowsExceptionWhenNull" storage="c:\dev\xunit\xunit\src\xunit.v3.assert.tests\bin\debug\net472\xunit.v3.assert.tests.exe" id="f2ea74e4-8466-e2c0-42a0-bf8451cc7d0d">
      <Execution id="066eeefb-a5d7-42dd-bd56-caeebe3c7236" />
      <TestMethod codeBase="C:\Dev\xunit\xunit\src\xunit.v3.assert.tests\bin\Debug\net472\xunit.v3.assert.tests.exe" adapterTypeName="executor://30ea7c6e-dd24-4152-a360-1387158cd41d/0.3.0" className="BooleanAssertsTests+True" name="BooleanAssertsTests+True.ThrowsExceptionWhenNull" />
    </UnitTest>
    <UnitTest name="BooleanAssertsTests+True.AssertTrue" storage="c:\dev\xunit\xunit\src\xunit.v3.assert.tests\bin\debug\net472\xunit.v3.assert.tests.exe" id="f4695967-e356-4f63-e2e5-6afcb3d3c000">
      <Execution id="a9ca34ac-304b-44ea-818b-7dd677932b0c" />
      <TestMethod codeBase="C:\Dev\xunit\xunit\src\xunit.v3.assert.tests\bin\Debug\net472\xunit.v3.assert.tests.exe" adapterTypeName="executor://30ea7c6e-dd24-4152-a360-1387158cd41d/0.3.0" className="BooleanAssertsTests+True" name="BooleanAssertsTests+True.AssertTrue" />
    </UnitTest>
    <UnitTest name="BooleanAssertsTests+True.UserSuppliedMessage" storage="c:\dev\xunit\xunit\src\xunit.v3.assert.tests\bin\debug\net472\xunit.v3.assert.tests.exe" id="3cdd7e2c-5b59-cfb4-42fe-471a34930670">
      <Execution id="b55cabed-b31b-4c33-a362-5614cc016fc5" />
      <TestMethod codeBase="C:\Dev\xunit\xunit\src\xunit.v3.assert.tests\bin\Debug\net472\xunit.v3.assert.tests.exe" adapterTypeName="executor://30ea7c6e-dd24-4152-a360-1387158cd41d/0.3.0" className="BooleanAssertsTests+True" name="BooleanAssertsTests+True.UserSuppliedMessage" />
    </UnitTest>
  </TestDefinitions>
  <TestEntries>
    <TestEntry testId="f2ea74e4-8466-e2c0-42a0-bf8451cc7d0d" executionId="066eeefb-a5d7-42dd-bd56-caeebe3c7236" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" />
    <TestEntry testId="f4695967-e356-4f63-e2e5-6afcb3d3c000" executionId="a9ca34ac-304b-44ea-818b-7dd677932b0c" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" />
    <TestEntry testId="3cdd7e2c-5b59-cfb4-42fe-471a34930670" executionId="b55cabed-b31b-4c33-a362-5614cc016fc5" testListId="8C84FA94-04C1-424b-9868-57A2D4851A1D" />
  </TestEntries>
  <TestLists>
    <TestList name="Results Not in a List" id="8C84FA94-04C1-424b-9868-57A2D4851A1D" />
    <TestList name="All Loaded Results" id="19431567-8539-422a-85d7-44ee4e166bda" />
  </TestLists>
  <ResultSummary outcome="Completed">
    <Counters total="3" executed="2" passed="1" failed="1" error="0" timeout="0" aborted="0" inconclusive="0" passedButRunAborted="0" notRunnable="0" notExecuted="0" disconnected="0" warning="0" completed="0" inProgress="0" pending="0" />
    <CollectorDataEntries>
      <Collector agentName="SERVER" uri="datacollector://30ea7c6e-dd24-4152-a360-1387158cd41d/0.3.0" collectorDisplayName="xUnit.net v3 Microsoft.Testing.Platform test framework">
        <UriAttachments>
          <UriAttachment>
            <A href="SERVER\xunit.trx" />
          </UriAttachment>
        </UriAttachments>
      </Collector>
    </CollectorDataEntries>
  </ResultSummary>
</TestRun>

@thomhurst
Contributor Author

As an FYI, the Microsoft.Testing.Platform TRX report does not include the skipped tests in the notExecuted counters list (despite showing it had an outcome of NotExecuted), nor is it showing the reported StdOut which includes the skip reason:

Yeah, I noticed this earlier too and raised #3767 for it.

@nohwnd
Member

nohwnd commented Sep 6, 2024

I see support for inconclusive in testfx (where it is converted to skipped or failed based on a setting): https://github.com/microsoft/testfx/blob/main/src/Adapter/MSTest.TestAdapter/Helpers/UnitTestOutcomeHelper.cs#L24

but not in vstest: https://github.com/microsoft/vstest/blob/main/src/Microsoft.TestPlatform.ObjectModel/TestOutcome.cs#L9
or the vstest TRX reporter: https://github.com/microsoft/vstest/blob/main/src/Microsoft.TestPlatform.Extensions.TrxLogger/ObjectModel/TestRunSummary.cs#L104 (even though the types for TRX do define inconclusive: https://github.com/microsoft/vstest/blob/main/src/Microsoft.TestPlatform.Extensions.TrxLogger/ObjectModel/TestOutcome.cs#L42)
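
Roughly, the helper linked above does something like the following; this is a simplified paraphrase, not the actual adapter code, and the flag corresponds to MSTest's MapInconclusiveToFailed runsettings option:

using Microsoft.VisualStudio.TestPlatform.ObjectModel;

// Simplified: MSTest downgrades Inconclusive to Skipped unless the user has
// asked for it to be treated as Failed.
static TestOutcome MapInconclusive(bool mapInconclusiveToFailed) =>
    mapInconclusiveToFailed ? TestOutcome.Failed : TestOutcome.Skipped;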

I also don't see support for inconclusive in TestExplorer. (non-public code)

I am not against adding the states. Especially in Test Explorer I would appreciate it if blue (not run yet) tests always switched to a final state: either an error, when they cannot run for whatever reason, or a new, e.g. gray, "not run" state when the framework decides not to run the test.

It would also be nice if we allowed a reason / explanation for the results, same as there is for a skipped result.

@thomhurst
Contributor Author

NotRun works for me - but then the question is should it error the test run? Could it be configured to have a property like "ShouldFail" that we could set?

@MarcoRossignoli
Contributor

It would also be nice if we allowed a reason / explanation for the results, same as there is for a skipped result.

We have an explanation for every state, so that can already fit: https://github.com/microsoft/testfx/blob/main/src/Platform/Microsoft.Testing.Platform/Messages/TestNodeProperties.cs#L24

NotRun works for me - but then the question is should it error the test run? Could it be configured to have a property like "ShouldFail" that we could set?

I think that to keep things simple all states that are not a FailedTestNodeStateProperty or ErrorTestNodeStateProperty should not fail the suite; in the case where a not-run should fail, the adapter should create a failed state with that reason.

@MarcoRossignoli
Contributor

MarcoRossignoli commented Sep 6, 2024

For my framework, I have allowed tests to depend on other tests. If the dependency fails, I don't necessarily want to mark the dependent tests as failed, as that creates noise and log pollution and makes it more difficult to track down the issue, so I'd want them to be marked as inconclusive. Basically saying, "I don't know what state this test is in".

InconclusiveTestNodeStateProperty could be good for this scenario. But I expect that's a successful state for the platform.
In this case your reason would be "Dependency failed".

@thomhurst
Contributor Author

For my framework, I have allowed tests to depend on other tests. If the dependency fails, I don't necessarily want to mark the dependent tests as failed, as that creates noise and log pollution and makes it more difficult to track down the issue, so I'd want them to be marked as inconclusive. Basically saying, "I don't know what state this test is in".

InconclusiveTestNodeStateProperty could be good for this scenario. But I expect that's a successful state for the platform. In this case your reason would be "Dependency failed".

To be fair it's fine if it's a successful state in my scenario, because the dependency test will report a failure anyway. I just didn't want all the dependents to report failures too, because then the logs/output/test explorer become noisy and it's harder to track down the culprits.

@MarcoRossignoli
Contributor

MarcoRossignoli commented Sep 6, 2024

To be fair it's fine if it's a successful state in my scenario, because the dependency test will report a failure anyway. I just didn't want all the dependents to report failures too, because then the logs/output/test explorer become noisy and it's harder to track down the culprits.

I think what is more important is that "it is wrong": a dev shouldn't have to check failures that are not true failures, it's a huge waste of time.
For now you can add those to "skipped" with reason "dependency failed", that's also a solution.
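
A minimal sketch of that workaround, assuming the usual Microsoft.Testing.Platform message-bus pattern (the test id, dependency name, and helper method are illustrative; namespaces are from memory):

using System.Threading.Tasks;
using Microsoft.Testing.Platform.Extensions.Messages;
using Microsoft.Testing.Platform.Messages;
using Microsoft.Testing.Platform.TestHost;

// Report a dependent test as skipped-with-a-reason instead of failed.
// Called from a test framework that already has the message bus, the
// session uid, and an IDataProducer (typically the framework itself).
static Task ReportDependencyNotMetAsync(
    IMessageBus messageBus, IDataProducer producer, SessionUid sessionUid)
{
    var node = new TestNode
    {
        Uid = new TestNodeUid("MyTests.QueryReturnsRows"), // illustrative id
        DisplayName = "QueryReturnsRows",
        Properties = new PropertyBag(
            new SkippedTestNodeStateProperty("Dependency 'SetupDatabase' failed")),
    };

    return messageBus.PublishAsync(producer, new TestNodeUpdateMessage(sessionUid, node));
}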

@bradwilson

NotRun works for me - but then the question is should it error the test run?

In the way xUnit.net and NUnit use it, definitely not. Explicit is to mark a test that should only ever be run consciously and is not considered part of the normal test suite.

Sometimes people even use it for non-test-result reasons out of convenience (for example, something very slow/expensive like completely resetting a test database back to a clean state, something that's too expensive to want to do regularly and should only need to be done in some exceptional circumstance). These are places where I'd personally prefer to rely on a script of some kind instead. 😄

I think that to keep things simple all states that are not a FailedTestNodeStateProperty or ErrorTestNodeStateProperty should not fail the suite

This feels correct to me.

For now you can add those to "skipped" with reason "dependency failed", that's also a solution.

It's tough to know for sure, but in some ways it does feel like this is the same situation I was in with NotRun tests, where @thomhurst might not want any GUI to show an inconclusive test as skipped, because it's hard to visually differentiate those from tests which are explicitly skipped.

As an aside, I am in the "unconditionally skipping should never be permanent" camp, but with v3 have come around to the idea of "conditionally skipping is fine" when the condition is out of your control, like having tests that can only run in specific environments and your CI system runs your tests across many environments for complete coverage. We end up using this today for things where Mono is an incomplete version of .NET Framework, like how modern versions of F# are known to be broken and there's no intention of fixing them.
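
A conditional skip of that kind looks roughly like this in xUnit.net v3 (the dynamic-skip assert name is from memory, and the Mono check is just illustrative):

using System.Runtime.InteropServices;
using Xunit;

public class FSharpInteropTests
{
    [Fact]
    public void ModernFSharpWorks()
    {
        // Skip (rather than fail) when the condition is out of our control,
        // e.g. running on Mono where modern F# is known to be broken.
        Assert.SkipWhen(
            RuntimeInformation.FrameworkDescription.Contains("Mono"),
            "Modern F# is known to be broken on Mono");

        // ... actual assertions would go here ...
    }
}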
