Hello!

I followed the steps in the README to run the workflow on my data, but no results were generated. I only get the following output, with no other errors:
```
(snakemake) username@x111:MEDUSA-main$ snakemake --cores
Building DAG of jobs...
Using shell: /usr/bin/bash
Provided cores: 32
Rules claiming more threads will be scaled down.
Job stats:
job      count    min threads    max threads
-----    -------  -------------  -------------
all      1        1              1
total    1        1              1

Select jobs to execute...

[Sun Jan 29 14:19:32 2023]
localrule all:
    jobid: 0
    reason: Rules with neither input nor output files are always executed.
    resources: tmpdir=/tmp

[Sun Jan 29 14:19:32 2023]
Finished job 0.
1 of 1 steps (100%) done
Complete log: .snakemake/log/2023-01-29T141932.335381.snakemake.log
```
I executed `snakemake --cores` from the directory containing the Snakefile and made sure my raw FASTQ files are in the `Pipeline/data/raw` folder. I also made sure that my reads end with the suffixes `_1.fast` and `_2.fastq`.

Can you please help me figure this out?
Thank you!
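One thing I noticed while double-checking: if the workflow discovers samples by globbing for `_1.fastq`/`_2.fastq` pairs (an assumption on my part, I have not read the Snakefile closely), then a file ending in `_1.fast` would silently match nothing, and only the input-less `all` rule would run. A quick sketch of what I mean, using a throwaway directory that mimics my layout:

```shell
# Throwaway illustration (not my real data): recreate the reported layout,
# where one read file ends in _1.fast instead of _1.fastq.
mkdir -p Pipeline/data/raw
touch Pipeline/data/raw/sample_1.fast Pipeline/data/raw/sample_2.fastq

# A glob for *_1.fastq finds nothing, because the file actually ends in _1.fast:
ls Pipeline/data/raw/*_1.fastq 2>/dev/null | wc -l   # 0 matches

# The *_2.fastq mate is found as expected:
ls Pipeline/data/raw/*_2.fastq | wc -l               # 1 match
```

If that guess is right, `snakemake --cores --dry-run` should also show only the `all` job being scheduled, with no per-sample jobs.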