
Tests Not Executed Should Show as Skipped #162

Open
michaeltlombardi opened this issue Jun 22, 2017 · 8 comments

Comments

@michaeltlombardi
Contributor

Expected Behavior

AS A reader of the test output
I WANT all tests _NOT_ executed to show as skipped
SO THAT my test results more clearly indicate reality; the tests were not run and so should show as skipped instead of result unknown.

Current Behavior

When running Invoke-Vester, any item that has been removed from the config, or whose value has been set to null, shows in the test results as Unknown, because no tests run but the Describe block still generates a result.
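
For illustration, here's a minimal sketch of the pattern that produces this; the names are made up, not Vester's actual code. The It block is gated on a non-null config value, so when the value is null nothing runs, but the Describe block still emits a result:

```powershell
# Minimal repro sketch; the names ($cfg, $Cluster, the Describe/It strings)
# are illustrative, not Vester's actual code.
Describe 'Cluster: DRS Mode' {
    $Desired = $cfg.cluster.drsmode   # $null, or the key was removed entirely

    if ($null -ne $Desired) {
        It 'DRS mode matches the config' {
            $Cluster.DrsMode | Should Be $Desired
        }
    }
    # With $Desired null, zero tests run, but the Describe block still emits
    # an entry that downstream reports render as Unknown.
}
```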

[Screenshot: Pester Host Output (Tests)]

[Screenshot: Pester Host Output (Summary)]

[Screenshot: ReportUnit Report (Suites)]

[Screenshot: ReportUnit Report (Summary)]

Possible Solution

  1. Modify the project to ignore the Describe blocks whose value is null or not present in the config, so that no data is shown for them at all (see the sketch after this list);
  2. Modify the project to create tests for each applicable object, but skip them;
  3. A combination of these, where items not in the config are not shown, but items in the config that are set to null are skipped.

These aren't the only ways to solve this, just the ones that came initially to mind.
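
To make option 1 above a bit more concrete, the null check would move outside the Describe block so that nothing at all is emitted for that setting (again, illustrative names only):

```powershell
# Sketch of option 1: don't emit the Describe block at all when the value is
# null or absent from the config (illustrative names, not Vester's actual code).
if ($null -ne $cfg.cluster.drsmode) {
    Describe 'Cluster: DRS Mode' {
        It 'DRS mode matches the config' {
            $Cluster.DrsMode | Should Be $cfg.cluster.drsmode
        }
    }
}
```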

Context

Often the readers of test output aren't the people most familiar with the ins and outs of how the test project works. They're stakeholders who just want to poke in and see how things are doing, engineers/admins keeping an eye on health/compliance, people peer-reviewing changes, auditors, and so on. The primary value of this project, from my perspective, is that the people reviewing the output don't need to know how it works; they just need to be able to use the information to drive better decisions and improve their insight into the environment.

Having these tests show up in a report as Unknown, or seeing them appear to run without any tests underneath them actually executing, can be confusing and concerning for people, or lead them down the wrong line of thinking.

@midacts
Contributor

midacts commented Jun 23, 2017

This seems similar to #130
Would this be a potential fix?

@michaeltlombardi
Contributor Author

I'm not sure the gist would fix this problem - #130 seems to exist to handle the problem of "What if I need to test that a given setting is set to $Null?"

The problem I'm referencing here is that we need a way to distinguish between:

  • Tests to run
  • Tests to not include at all
  • Tests to skip
  • Tests to verify that a setting equals $Null

@midacts
Contributor

midacts commented Jun 24, 2017

I see the differentiation now.

So if you remove entire lines from the config, when you run Invoke-Vester those items show as Unknown (that's how it reads to me; I haven't tested it)?

  1. Modify the project to ignore the Describe blocks whose value is null or not present in the config, so that no data is shown for them at all;

I'm sketched out about skipping $Null values.
If the config has $Null as the value, CURRENTLY it does not run that test, period.
The problem I ran into is that if the config says $Null, but you have something out there that is not $Null (and should be $Null), it will never be checked and corrected.

@michaeltlombardi
Contributor Author

Agreed, skipping on null seems bad, especially since there are values we want to verify are, in fact, null.

So the other option then would be to remove the tests from the config or otherwise mark them for skipping?

Removing them from the config implies that any test not found in the config is skipped, meaning you wouldn't get newly released tests by default; they'd have to be explicitly added. On the other hand, that's already true today.

Marking them with the special value "VESTER_SKIP" or something might be an option if we want to keep the tests in the config but clearly skip them.

But I think this uncovers another problem - since we don't use tags or anything, we don't have a way to get Pester to show the tests as existing but skipped.

@brianbunke
Contributor

I don't mean to interrupt, but as an acknowledgment:

Yeah, good report, and no quick answer. I'm sure we can figure out a reasonable option to present them properly as "skipped"; it'll just take time to dive into the code and play around with it.
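
For reference, Pester's It block does accept a -Skip switch that reports a test as Skipped without running its body, so one rough sketch of the VESTER_SKIP idea might look like this (illustrative names, not a committed design):

```powershell
# Sketch only: -Skip marks the test as Skipped in the results instead of
# leaving the Describe block empty. 'VESTER_SKIP' is the sentinel proposed
# above, not an existing Vester feature; other names are illustrative too.
Describe 'Cluster: DRS Mode' {
    $Desired = $cfg.cluster.drsmode

    It 'DRS mode matches the config' -Skip:($Desired -eq 'VESTER_SKIP') {
        $Cluster.DrsMode | Should Be $Desired
    }
}
```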

@jeffgreenca
Contributor

Here's an idea, based on the above conversation. The person configuring and running Vester probably needs to know about available, unreferenced tests, but like @michaeltlombardi says the person reading the output may find "300 skipped tests" non-intuitive.

| Goal | How to Configure | Output |
| --- | --- | --- |
| Tests to run | Include in .config | Standard Report output |
| Tests to not include at all | Exclude from .config | Runtime message: "X available tests were unreferenced in your .config. Use -ListUnreferencedTests for details." |
| Tests to skip | Include in .config with VESTER_SKIP | Standard Report output |
| Tests to verify that a setting equals $Null | $Desired = $Null | Standard Report output |

This would require implementing -ListUnreferencedTests, implementing something like #130, and a VESTER_SKIP option. I'd be up for helping on the code side if you all think this is a feasible direction.
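
To make the table concrete, a config fragment covering the four cases might look roughly like this (key names are examples only, not Vester's actual schema):

```powershell
# Hypothetical config fragment illustrating the four rows in the table above;
# the key names are examples only, not Vester's actual schema.
$cfg = @{
    cluster = @{
        drsmode  = 'FullyAutomated'   # Tests to run: a normal desired value
        # drslevel omitted entirely   # Tests to not include at all
        haenable = 'VESTER_SKIP'      # Tests to skip: the proposed sentinel
        evcmode  = $null              # Tests that verify a setting equals $Null (#130)
    }
}
```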

@michaeltlombardi
Contributor Author

I like this approach a lot - gives people a way to discover tests, makes test output pretty obvious, and lets people test for null values.

@brianbunke
Contributor

Yes! In particular, dynamically counting the unreferenced tests as they're processed and then reporting the total number at the end sounds great. (Write-Warning?) I'd love all the help you can offer with implementing this.
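
A rough sketch of that counting idea, assuming a hypothetical $AllVesterTests collection and a hypothetical Test-ConfigValue helper that checks whether the config references a given test (neither exists today):

```powershell
# Sketch only: accumulate unreferenced tests while processing, then warn once
# at the end. $AllVesterTests and Test-ConfigValue are hypothetical.
$UnreferencedTests = @()

foreach ($Test in $AllVesterTests) {
    if (-not (Test-ConfigValue -Config $cfg -Test $Test)) {
        $UnreferencedTests += $Test.Name
        continue
    }
    # ...invoke the test as usual...
}

if ($UnreferencedTests.Count -gt 0) {
    $Msg = '{0} available tests were unreferenced in your .config. Use -ListUnreferencedTests for details.'
    Write-Warning ($Msg -f $UnreferencedTests.Count)
}
```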

Here are some quick questions that come to mind:

  1. Would we want to pursue New-VesterConfig automatically setting some tests to VESTER_SKIP? If so, what condition(s) would cause that?
  2. Would -ListUnreferencedTests be a new parameter on Get-VesterTest or Invoke-Vester?
