
Retrying using `--retry-timeout` with an output format like JSON makes it very difficult to pipe or capture with other tools

mkobit opened this issue on Apr 10, 2019 • 6 comments

I think output reporting and STDERR output should be looked at to better support both immediate feedback and generating useful reports.

For example, given a goss.yaml like this:

command:
  exit 1:
    exit-status: 0

Running it without specifying a format uses the human-readable output:

> goss validate
F

Failures/Skipped:

Command: exit 1: exit-status:
Expected
    <int>: 1
to equal
    <int>: 0

Total Duration: 0.002s
Count: 1, Failed: 1, Skipped: 0

Running with `--retry-timeout` sort of groups things by attempt:

goss validate --retry-timeout 3s
F

Failures/Skipped:

Command: exit 1: exit-status:
Expected
    <int>: 1
to equal
    <int>: 0

Total Duration: 0.001s
Count: 1, Failed: 1, Skipped: 0
Retrying in 1s (elapsed/timeout time: 0.001s/3s)


Attempt #2:
F

Failures/Skipped:

Command: exit 1: exit-status:
Expected
    <int>: 1
to equal
    <int>: 0

Total Duration: 0.001s
Count: 1, Failed: 1, Skipped: 0
Retrying in 1s (elapsed/timeout time: 1.002s/3s)


Attempt #3:
F

Failures/Skipped:

Command: exit 1: exit-status:
Expected
    <int>: 1
to equal
    <int>: 0

Total Duration: 0.001s
Count: 1, Failed: 1, Skipped: 0

ERROR: Timeout of 3s reached before tests entered a passing state

But when using an output format like JSON, the output is unparseable and can't be combined with other tooling, because it all goes to the same place:

> goss validate --retry-timeout 3s --format json         
{                                                          
    "results": [                                           
        {                                                  
            "duration": 1560616,                                                                                       
            "err": null,                                   
            "expected": [                                  
                "0"                                        
            ],                                             
            "found": [                                     
                "1"
            ],
            "human": "Expected\n    \u003cint\u003e: 1\nto equal\n    \u003cint\u003e: 0",
            "meta": null,                                  
            "property": "exit-status",                     
            "resource-id": "exit 1",                       
            "resource-type": "Command",                    
            "result": 1,                                   
            "successful": false,                           
            "summary-line": "Command: exit 1: exit-status:\nExpected\n    \u003cint\u003e: 1\nto equal\n    \u003cint\u003e: 0",
            "test-type": 0,                                
            "title": ""                                    
        }                                                  
    ],                                                     
    "summary": {                                                                                                       
        "failed-count": 1,                                 
        "summary-line": "Count: 1, Failed: 1, Duration: 0.002s",
        "test-count": 1,                                   
        "total-duration": 2035821                          
    }                                                      
}                                                          
Retrying in 1s (elapsed/timeout time: 0.003s/3s)

                                                           
Attempt #2:                                                
{                                                          
    "results": [                                           
        {                                                                                                              
            "duration": 1455115,                                
            "err": null,                                   
            "expected": [                                  
                "0"                                        
            ],        
            "found": [                                                                                                 
                "1"                                              
            ],                                                                            
            "human": "Expected\n    \u003cint\u003e: 1\nto equal\n    \u003cint\u003e: 0",
            "meta": null,
            "property": "exit-status",
            "resource-id": "exit 1",
            "resource-type": "Command",
            "result": 1,
            "successful": false,
            "summary-line": "Command: exit 1: exit-status:\nExpected\n    \u003cint\u003e: 1\nto equal\n    \u003cint\u003e: 0",
            "test-type": 0,
            "title": ""
        }
    ],
    "summary": {
        "failed-count": 1,
        "summary-line": "Count: 1, Failed: 1, Duration: 0.002s",
        "test-count": 1,
        "total-duration": 1954849
    }
}
Retrying in 1s (elapsed/timeout time: 1.006s/3s)

                                    
Attempt #3:                            
{                       
    "results": [                
        {                                                                                                                       
            "duration": 1710724,
            "err": null,
            "expected": [
                "0"
            ],  
            "found": [    
                "1"                                             
            ],          
            "human": "Expected\n    \u003cint\u003e: 1\nto equal\n    \u003cint\u003e: 0",
            "meta": null,
            "property": "exit-status",
            "resource-id": "exit 1",            
            "resource-type": "Command",
            "result": 1,
            "successful": false,
            "summary-line": "Command: exit 1: exit-status:\nExpected\n    \u003cint\u003e: 1\nto equal\n    \u003cint\u003e: 0",
            "test-type": 0,
            "title": ""
        }                       
    ],                  
    "summary": {         
        "failed-count": 1,
        "summary-line": "Count: 1, Failed: 1, Duration: 0.002s",
        "test-count": 1,
        "total-duration": 2280836
    }         
}                                                                                         
                         
ERROR: Timeout of 3s reached before tests entered a passing state
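
As a concrete illustration (a hypothetical pipeline, shown only to demonstrate the failure mode), trying to feed that combined stream into a JSON processor breaks, because the JSON documents are interleaved with the plain-text "Retrying in..." and "Attempt #N:" progress lines:

> goss validate --retry-timeout 3s --format json | jq '.summary."failed-count"'

jq (or any similar tool) errors out on the non-JSON lines, so the stream can't be consumed programmatically as-is.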

Expectation

My expectation would be that reporting lets me capture the output in the requested format regardless of what user-facing output is displayed. It would be nice for it to behave more like other Unix tools.
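
For instance (a sketch of the kind of Unix-style behavior I mean, not something goss does today), if the per-attempt progress went to STDERR and only the final machine-readable report went to STDOUT, a pipeline like this would just work:

> goss validate --retry-timeout 3s --format json 2>retry-progress.log | jq '.summary'

with the human-oriented "Retrying in 1s..." lines landing in retry-progress.log (a made-up file name) and a single JSON document on STDOUT.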

mkobit avatar Apr 10 '19 16:04 mkobit

Can you give an example of how you'd like the output to be when retry+JSON is enabled?

Wonder if a flag like --only-display-last would resolve this. So only the very last attempt is printed out.

aelsabbahy avatar Apr 10 '19 16:04 aelsabbahy

Tough to say exactly what the output should be.

I think it could be useful to have all of the output go... somewhere, with tracking of the results and each invocation, or maybe retry could even be scoped to a specific test?

I'm trying to think of how something like this would be handled in a test framework like https://junit.org/junit5/docs/current/user-guide/#writing-tests-repeated-tests, where each execution is an individual test event, reporters and listeners can decide how to deal with them, and the test engine/extension decides whether or not the overall test plan should fail. Right now, we primitively just capture the output and shove it into a file.

I haven't tried the above case with the junit output, but I would bet it is similar. Is there a "standard" for test reporting output?

mkobit avatar Apr 11 '19 15:04 mkobit

That link doesn't show any output format.

I'm not really sure there's a good way to handle this without changing things to either:

  1. Only displaying the last one (feels hacky and silent)
  2. Allowing reporters to consolidate output into an array:
{
  "attempts": [
    {"results": [], "summary": {}},
    {"results": [], "summary": {}}
  ]
}

That said, the output would have to be delayed until all the test attempts have concluded.
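
If the consolidated shape above were adopted, downstream tooling could still pull out just the final attempt cheaply, e.g. (hypothetical, assuming the "attempts" array sketched above and a captured report.json):

jq '.attempts | last | .summary' report.json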

Any thoughts?

aelsabbahy avatar Dec 27 '19 19:12 aelsabbahy

I ran into this too. IMO the correct behavior would be to output just the last one always, since the previous tests aren't useful when you're just seeing if you can converge with retries. There isn't a use case we have, at least, where we'd care about the intermediate tests that weren't successful.

cove avatar Feb 28 '20 03:02 cove

I ran into this too. IMO the correct behavior would be to output just the last one always, since the previous tests aren't useful when you're just seeing if you can converge with retries. There isn't a use case we have, at least, where we'd care about the intermediate tests that weren't successful.

I support just showing the last test and the attempt number it belongs to.

Wonder if a flag like --only-display-last would resolve this. So only the very last attempt is printed out.

The other direction makes more sense even if it's a breaking change: a flag called --show-all-attempts, for the users that want to log the failed attempts or just want them for debugging.

pedroMMM avatar Feb 29 '20 15:02 pedroMMM

Wonder if a flag like --only-display-last would resolve this. So only the very last attempt is printed out.

This is my preferred solution. This way a new goss user can see what the retry does, since it's visible by default.

Marking this as approved for the --only-display-last option.
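
Once that lands, the retry+JSON case should reduce to something like (hypothetical invocation, the flag doesn't exist yet as of this comment):

goss validate --retry-timeout 3s --format json --only-display-last

i.e. a single JSON document for the final attempt, which is straightforward to pipe or capture.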

aelsabbahy avatar Feb 29 '20 15:02 aelsabbahy