
Can't use `setup` construct for pipeline testing

Open Aratz opened this issue 2 years ago • 4 comments

Hi!

I'm having some issues setting up tests with the `setup` method. Working from the examples in the documentation, I've narrowed it down to the following minimal reproducible examples (I'm using nf-test 0.8.1):

  • All of this is based on the following script pipeline.nf from the documentation:
#!/usr/bin/env nextflow
nextflow.enable.dsl=2

process sayHello {
    input:
        val cheers

    output:
        stdout emit: verbiage_ch
        path '*.txt', emit: verbiage_ch2

    script:
    """
    echo -n $cheers
    echo -n $cheers > ${cheers}.txt
    """
}

workflow trial {
    take: things
    main:
        sayHello(things)
        sayHello.out.verbiage_ch.view()
    emit:
        trial_out_ch = sayHello.out.verbiage_ch2
}

workflow {
    Channel.from('hello','nf-test') | trial
}
  • Testing the workflow trial with a setup is no problem and works as expected:
nextflow_workflow {

    name "Test Pipeline with 1 process"
    script "pipeline.nf"
    workflow "trial"

    test("Should run without failures") {

        setup {
            run("sayHello") {
                script "pipeline.nf"
                process {
                    """
                    input[0] = 'hello'
                    """
                }
            }
        }

        when {
            workflow {
                """
                input[0] = Channel.from('hello','nf-test')
                """
            }
        }

        then {
            assert workflow.success
        }
    }
}
  • Testing the pipeline without setup (as is done in the docs) also works without issue:
nextflow_pipeline {

    name "Test Pipeline with 1 process"
    script "pipeline.nf"

    test("Should run without failures") {

        when {
            params {
              input_text = "hello,nf-test"
            }
        }

        then {
            assert workflow.success
        }
    }
}
  • However, adding the same setup block as before does not work:
nextflow_pipeline {

    name "Test Pipeline with 1 process"
    script "pipeline.nf"

    test("Should run without failures") {

        setup {
            run("sayHello") {
                script "pipeline.nf"
                process {
                    """
                    input[0] = 'hello'
                    """
                }
            }
        }

        when {
            params {
              input_text = "hello,nf-test"
            }
        }

        then {
            assert workflow.success
        }
    }
}

It gives me the following error:

  groovy.lang.MissingMethodException: No signature of method: pipeline_nf$_run_closure1$_closure2.run() is applicable for argument types: (String, pipeline_nf$_run_closure1$_closure2$_closure3$_closure6) values: [sayHello, pipeline_nf$_run_closure1$_closure2$_closure3$_closure6@1e886a5b]
  Possible solutions: find(), any(), dump(), grep(), find(), any()

Aratz avatar Oct 18 '23 07:10 Aratz

Thanks! `setup` is currently not supported by `nextflow_pipeline` test suites. At the moment, it is only supported by `nextflow_process` and `nextflow_workflow`.

lukfor avatar Dec 27 '23 09:12 lukfor

I am closing this issue for now. Pipeline tests are end-to-end by nature, so running dependency processes in a `setup` block is not a good fit.

lukfor avatar Jan 27 '24 14:01 lukfor

Well, I have a use case where I receive a compressed file that I need to decompress before passing it to the pipeline. Do you have a suggestion for how I could address this?

The file is here :point_right: https://github.com/nf-core/demultiplex/blob/cfaa725883897f6fcd7a6c63bc3e0cd14ceb3130/tests/pipeline/kraken.nf.test

Also, it would be nice if the documentation made it clearer that this feature is not available for pipeline testing.

Aratz avatar Jan 31 '24 07:01 Aratz
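
[Editor's note] Since `setup` is unavailable for `nextflow_pipeline` suites, one workaround is to decompress test fixtures in a wrapper step before invoking nf-test. A minimal sketch; all file names and paths here are hypothetical placeholders, not the actual files from the linked demultiplex test:

```shell
#!/bin/sh
# Sketch: prepare a decompressed fixture, then run the pipeline test.
set -eu

# Create a compressed fixture (in practice this would already exist)
printf 'hello,nf-test\n' > input.csv
gzip -f input.csv              # produces input.csv.gz, removes input.csv

# Decompress before the test run (-d decompress, -k keep .gz, -f overwrite)
gzip -dkf input.csv.gz

cat input.csv                  # prints: hello,nf-test
# A real invocation would then follow, e.g.:
# nf-test test tests/pipeline/kraken.nf.test
```

This keeps the pipeline test itself purely end-to-end, which matches the rationale for closing the issue, at the cost of moving the preparation step outside nf-test.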

@lukfor Are there any plans to support setup() for pipelines? My use case is that I'd like to do an end-to-end run and then write several tests on the results. I do not want to write one big test (that asserts all expectations) because a) this is somewhat cumbersome and b) I'd like to have the test results separated (i.e. not a single PASSED but several, especially in combination with --csv/--junitxml).

Or do you have another suggestion for how to achieve that (i.e. run the pipeline only once and execute several independent tests on the results)?

rollf avatar Sep 20 '24 12:09 rollf
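
[Editor's note] As of this writing the thread has no reply, but a partial mitigation is nf-test's documented `assertAll` helper: it evaluates every closure and reports each failure rather than stopping at the first failed assert. A sketch (the specific trace assertions are illustrative, not taken from this thread):

```groovy
then {
    assertAll(
        { assert workflow.success },
        // workflow.trace is available in nextflow_pipeline tests;
        // the expected count here is a hypothetical example
        { assert workflow.trace.succeeded().size() == 2 },
        { assert workflow.trace.failed().size() == 0 }
    )
}
```

This still yields a single PASSED/FAILED entry per test, so it does not fully address the request for separately reported results, but it does surface all failing expectations from one pipeline run.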