Missing plugin 'nf-amazon' required to read file
Hello,
I run my Nextflow pipeline within a Docker container in a Kubernetes environment, and I am having trouble with the nf-amazon plugin. In particular:
- `nextflow plugins install nf-amazon` does not seem to do anything: there is no `.nextflow/plugins` folder in the directory where I execute Nextflow, although I can see a folder `/root/.nextflow/plugins/nf-amazon-1.3.4`.
- Adding the plugin to nextflow.config does not do much either. I always get the error:
Missing plugin 'nf-amazon' required to read file: s3://path/to/my/file
I checked the /tmp directory, but could not see anything related to this plugin. I believe Nextflow is not looking at the location /root/.nextflow/plugins/nf-amazon-1.3.4.
Could you help me, please? Alternatively, is it possible to download this plugin manually and store it where Nextflow expects it? If so, could you give me the instructions, please?
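For reference, this is the kind of snippet I added to nextflow.config (the version number here is assumed from the folder name above):

```groovy
plugins {
    id 'nf-amazon@1.3.4'
}
```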
Thanks
nextflow info
Version: 21.10.6 build 5660
Created: 21-12-2021 16:55 UTC
System: Linux 5.13.0-28-generic
Runtime: Groovy 3.0.9 on OpenJDK 64-Bit Server VM 11.0.13+8-post-Debian-1deb11u1
Encoding: UTF-8 (UTF-8)
Hi @pdp10 ,
Not sure why this might be happening so I can't comment on that. But in case this is an immediate blocker you might want to try out with the fully-bundled distribution https://github.com/nextflow-io/nextflow/releases/download/v21.10.6/nextflow-21.10.6-all
Please refer to the note in the installation docs.
Thank you, @abhi18av ,
I downloaded the fully-bundled distribution and called it directly to execute my pipeline, but for some odd reason I get the same error message. Interestingly, nextflow reports the file on S3. Is there a way to get more logging messages?
I deleted the whole `.nextflow` directory and reran my pipeline:
# rm -rf .nextflow/
# cd -
#/app/pipeline# nextflow run .
CAPSULE: Downloading dependency org.multiverse:multiverse-core:jar:0.7.0
CAPSULE: Downloading dependency ch.qos.logback:logback-classic:jar:1.2.9
CAPSULE: Downloading dependency org.checkerframework:checker-compat-qual:jar:2.0.0
CAPSULE: Downloading dependency org.codehaus.mojo:animal-sniffer-annotations:jar:1.14
CAPSULE: Downloading dependency com.google.errorprone:error_prone_annotations:jar:2.1.3
CAPSULE: Downloading dependency com.google.guava:guava:jar:24.1.1-jre
CAPSULE: Downloading dependency org.codehaus.jsr166-mirror:jsr166y:jar:1.7.0
CAPSULE: Downloading dependency org.slf4j:jcl-over-slf4j:jar:1.7.32
CAPSULE: Downloading dependency com.beust:jcommander:jar:1.35
CAPSULE: Downloading dependency org.slf4j:jul-to-slf4j:jar:1.7.32
CAPSULE: Downloading dependency org.jsoup:jsoup:jar:1.11.2
CAPSULE: Downloading dependency com.google.j2objc:j2objc-annotations:jar:1.1
CAPSULE: Downloading dependency ch.grengine:grengine:jar:1.3.0
CAPSULE: Downloading dependency io.nextflow:nf-httpfs:jar:21.10.6
CAPSULE: Downloading dependency jline:jline:jar:2.9
CAPSULE: Downloading dependency io.nextflow:nf-commons:jar:21.10.6
CAPSULE: Downloading dependency org.codehaus.groovy:groovy-nio:jar:3.0.9
CAPSULE: Downloading dependency com.github.zafarkhaja:java-semver:jar:0.9.0
CAPSULE: Downloading dependency org.codehaus.groovy:groovy:jar:3.0.9
CAPSULE: Downloading dependency javax.mail:mail:jar:1.4.7
CAPSULE: Downloading dependency commons-lang:commons-lang:jar:2.6
CAPSULE: Downloading dependency org.iq80.leveldb:leveldb-api:jar:0.12
CAPSULE: Downloading dependency org.codehaus.gpars:gpars:jar:1.2.1
CAPSULE: Downloading dependency org.slf4j:slf4j-api:jar:1.7.32
CAPSULE: Downloading dependency com.google.code.findbugs:jsr305:jar:1.3.9
CAPSULE: Downloading dependency org.objenesis:objenesis:jar:2.1
CAPSULE: Downloading dependency org.codehaus.groovy:groovy-json:jar:3.0.9
CAPSULE: Downloading dependency com.esotericsoftware.kryo:kryo:jar:2.24.0
CAPSULE: Downloading dependency org.apache.ivy:ivy:jar:2.3.0
CAPSULE: Downloading dependency org.codehaus.groovy:groovy-templates:jar:3.0.9
CAPSULE: Downloading dependency com.googlecode.javaewah:JavaEWAH:jar:1.1.6
CAPSULE: Downloading dependency org.slf4j:log4j-over-slf4j:jar:1.7.32
CAPSULE: Downloading dependency io.nextflow:nextflow:jar:21.10.6
CAPSULE: Downloading dependency javax.activation:activation:jar:1.1.1
CAPSULE: Downloading dependency org.yaml:snakeyaml:jar:1.28
CAPSULE: Downloading dependency com.jcraft:jsch:jar:0.1.54
CAPSULE: Downloading dependency org.pf4j:pf4j:jar:3.4.1
CAPSULE: Downloading dependency org.iq80.leveldb:leveldb:jar:0.12
CAPSULE: Downloading dependency org.codehaus.groovy:groovy-xml:jar:3.0.9
CAPSULE: Downloading dependency ch.qos.logback:logback-core:jar:1.2.9
CAPSULE: Downloading dependency com.jcraft:jzlib:jar:1.1.1
CAPSULE: Downloading dependency com.google.code.gson:gson:jar:2.2.4
CAPSULE: Downloading dependency org.eclipse.jgit:org.eclipse.jgit:jar:5.2.1.201812262042-r
CAPSULE: Downloading dependency org.pf4j:pf4j-update:jar:2.3.0
CAPSULE: Downloading dependency commons-codec:commons-codec:jar:1.10
N E X T F L O W ~ version 21.10.6
Launching `./main.nf` [determined_jepsen] - revision: 4b5f532c5d
Downloading plugin [email protected]
Missing plugin 'nf-amazon' required to read file: s3:/****.fastq.gz
Interestingly, I can see the plugin:
ls /root/.nextflow/plugins/
nf-amazon-1.3.4
It looks like the download worked fine, but for some reason Nextflow cannot see that location.
Not entirely sure about this @pdp10 , I've marked this as a bug and raised it with people more knowledgeable than me 🙂
Could you please share a minimal reproducible case, like a repo with some steps?
Hey,
we've seen the same error when trying to use an S3 bucket as a working directory (using `-work-dir "s3://bucket-name"` with `nextflow run`).
Digging into the code, we came across the source of said error message in this line of modules/nf-commons/src/main/nextflow/file/FileHelper.groovy.
Inverting the condition linked above, i.e. requiring that the provided URI scheme is NOT in the static PLUGINS_MAP, made Nextflow skip the "missing plugin" error and proceed with execution.
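For readers not following the link, the check being described can be sketched roughly like this. This is a toy mirror of the described logic, with an assumed map entry; it is not the actual Nextflow source:

```java
import java.net.URI;
import java.util.Map;
import java.util.Set;

public class SchemeCheck {
    // Toy stand-in for the PLUGINS_MAP mentioned above; the entry is
    // assumed, not copied from the Nextflow source.
    static final Map<String, String> PLUGINS_MAP = Map.of("s3", "nf-amazon");

    // A known scheme whose plugin is not loaded triggers the error message.
    static String checkPlugin(String uri, Set<String> loadedPlugins) {
        String scheme = URI.create(uri).getScheme();
        String plugin = PLUGINS_MAP.get(scheme);
        if (plugin != null && !loadedPlugins.contains(plugin)) {
            return "Missing plugin '" + plugin + "' required to read file: " + uri;
        }
        return "ok";
    }

    public static void main(String[] args) {
        // No plugins loaded: the s3 scheme triggers the error message.
        System.out.println(checkPlugin("s3://bucket/file.bam", Set.of()));
        // With nf-amazon loaded, the same URI passes.
        System.out.println(checkPlugin("s3://bucket/file.bam", Set.of("nf-amazon")));
    }
}
```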
We've run into follow-up errors after that, but we are still investigating them, and that's not the topic here.
Kind regards, Lukas
I believe I found the issue, or at least a way to bypass it.
This was my original code, which works when the file is local.
import java.nio.file.Paths
def bamSearchPath = []
dirNameGlob = params.directory.replaceAll(/\/+$/, "") + '**'
fileNameGlob = '*.bam'
searchPath = Paths.get(dirNameGlob, fileNameGlob)
bamSearchPath.add(searchPath.toString())
If this is replaced with:
filePathGlob = params.directory.replaceAll(/\/+$/, "") + '**' + '/*.bam'
I pretty much get what I need. The latter works with both local and S3 files; in particular, it does not raise the error in the first post, and instead prints Staging foreign file: s3://****.bam when the pipeline runs.
It seems to me that Paths has issues with S3 URIs.
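That matches what `java.nio.file.Paths` does on the default (local, Unix) filesystem: consecutive separators are collapsed, so the `s3://` scheme loses one slash. A minimal demonstration (the bucket name is made up):

```java
import java.nio.file.Paths;

public class S3PathDemo {
    public static void main(String[] args) {
        // Paths.get joins against the default local filesystem, which
        // normalizes "//" down to "/" -- mangling the S3 scheme prefix.
        String viaPaths = Paths.get("s3://my-bucket/data/**", "*.bam").toString();
        System.out.println(viaPaths);   // s3:/my-bucket/data/**/*.bam

        // Plain string concatenation leaves the scheme untouched.
        String viaConcat = "s3://my-bucket/data" + "/**" + "/*.bam";
        System.out.println(viaConcat);  // s3://my-bucket/data/**/*.bam
    }
}
```

This is why the string-concatenation workaround behaves differently from the `Paths.get` version.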
Thanks @pdp10 for following up on this, but to be honest I haven't seen the need to do this manually in NF so far 🤔
In any case, I still haven't been able to reproduce this issue with the standard nextflow-io/rnaseq-nf pipeline using the batch profile for AWS Batch.
- Delete the nf-amazon-xyz plugins from the plugins cache:
rm -r /Users/eklavya/.nextflow/plugins/nf-amazon-1.*
- Then run the pipeline, with tweaks to the configuration for your process.queue, aws.region and workDir settings:
nextflow run ./rnaseq-nf/main.nf -profile batch
N E X T F L O W ~ version 21.10.6
Launching `./rnaseq-nf/main.nf` [awesome_solvay] - revision: 4ba66eb0c8
Downloading plugin [email protected]
R N A S E Q - N F P I P E L I N E
===================================
transcriptome: s3://rnaseq-nf/data/ggal/transcript.fa
reads : s3://rnaseq-nf/data/ggal/lung_{1,2}.fq
outdir : results
Uploading local `bin` scripts folder to s3://rnaseq-nf/work/tmp/f4/c4d5991675dae3c1c6cfa3b485520f/bin
executor > awsbatch (4)
[d0/bdb9c1] process > RNASEQ:INDEX (transcript) [100%] 1 of 1 ✔
[51/9b7fb4] process > RNASEQ:FASTQC (FASTQC on lung) [100%] 1 of 1 ✔
[92/2b7197] process > RNASEQ:QUANT (lung) [100%] 1 of 1 ✔
[ee/6bcee6] process > MULTIQC [100%] 1 of 1 ✔
Done! Open the following report in your browser --> results/multiqc_report.html
Completed at: 09-Feb-2022 20:07:58
Duration : 7m 44s
CPU hours : (a few seconds)
Succeeded : 4
@pdp10 I think your issue is what you described; I was able to reproduce it as well. In your example the error is
`Missing plugin 'nf-amazon' required to read file: s3:/****.fastq.gz`
If you notice, the S3 path is missing one `/`, and this is why your workaround works.
To verify this was the problem, you can test by providing the parameter that expects the S3 path as
`s3:/my-bucket/my-file.bam`
and then the same with
`s3://my-bucket/my-file.bam`
This should show you whether you are getting the error because of this typo.
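To catch this class of typo early, one could validate the parameter before passing it along. A small sketch; the helper name and regex are my own, not part of Nextflow:

```java
import java.util.regex.Pattern;

public class S3UriCheck {
    // "s3://" followed by a non-empty bucket name (hypothetical helper).
    private static final Pattern S3_URI = Pattern.compile("^s3://[^/]+(/.*)?$");

    static boolean looksLikeS3Uri(String uri) {
        return S3_URI.matcher(uri).matches();
    }

    public static void main(String[] args) {
        System.out.println(looksLikeS3Uri("s3:/my-bucket/my-file.bam"));   // false: one slash
        System.out.println(looksLikeS3Uri("s3://my-bucket/my-file.bam"));  // true
        System.out.println(looksLikeS3Uri("s3:///some_bucket"));           // false: empty bucket
    }
}
```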
Thanks @cgpu , I think you are correct.
I had a similar situation with a path with an extra slash, e.g. s3:///some_bucket
Clean up your $HOME/.nextflow/plugins directory and try again.
In my case it had nothing to do with plugins. I just deleted the extra slash and it worked fine. I suspect that failure to read from S3 goes straight to that plugins error message regardless of the cause.
Interesting, we'll check this
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Wondering if this issue has been solved? I'm running into the same problem...
@snafees , based on our discussion, I'm wondering whether nextflow was installed with sudo on your machine.
I don't see any other reason why the sudo nextflow run ... command was able to download the nf-amazon plugin and move it to the $HOME/.nextflow folder.
I think it's worth downloading NF again using https://www.nextflow.io/docs/edge/getstarted.html#installation
Hello, I recently started using this pipeline as well. I was able to download the latest nf-core/rnaseq but get the same error as @snafees. I was also able to download a separate nf-amazon and a few other plugins but still get the same error. Not too sure what to do here. I wanted to follow up in case it could be due to new updates?
Please include the full .nextflow.log file.
Hello, here is my nextflow.log file. I'm following the quickstart tutorial on nf-core/rnaseq, trying to test out the pipeline, and I get the "missing nf-amazon plugin" error.
I manually downloaded the plug-ins as well, if that helps.
Are you behind a proxy? I can see this error in the logs:
javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
AVnextflow.log: https://github.com/nextflow-io/nextflow/files/9814167/AVnextflow.log
I believe I am for my worksite so I got around a few things by cloning nf-core/rnaseq repository as well as cloning nextflow plug-ins. Is there by any chance a way to use the plug-ins that I manually downloaded?
Apologies if this is the wrong place to ask this!
I think this is caused by a failed TLS certificate validation. Plugins are downloaded from GitHub, which only allows TLS 1.2.
Please try updating Nextflow to the latest version using this command:
nextflow self-update
If it still fails, upgrade your Java runtime to version 17. See SDKMAN for how to install it.
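A quick way to check which TLS versions the local JVM supports (note this only checks protocol support, not the certificate-path issue quoted above):

```java
import javax.net.ssl.SSLContext;

public class TlsSupport {
    public static void main(String[] args) throws Exception {
        // List the TLS protocol versions this JVM can negotiate;
        // GitHub downloads require at least TLSv1.2.
        SSLContext ctx = SSLContext.getDefault();
        for (String proto : ctx.getSupportedSSLParameters().getProtocols()) {
            System.out.println(proto);
        }
    }
}
```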
I did indeed have file permission issues. After resolving those, I don't have the issue anymore. Thank you!
Mystery solved!