
Align parabricks subworkflow #6876

Open · wants to merge 69 commits into master

Conversation

@famosab (Contributor) commented Oct 28, 2024

PR checklist

Closes #6894

  • This comment contains a description of changes (with reason).
  • If you've fixed a bug or added code that should be tested, add tests!
  • If you've added a new tool, have you followed the module conventions in the contribution docs?
  • If necessary, include test data in your PR.
  • Remove all TODO statements.
  • Emit the versions.yml file.
  • Follow the naming conventions.
  • Follow the parameters requirements.
  • Follow the input/output options guidelines.
  • Add a resource label
  • Use BioConda and BioContainers if possible to fulfil software requirements.
  • Ensure that the test works with either Docker / Singularity. Conda CI tests can be quite flaky:
    • For modules:
      • nf-core modules test <MODULE> --profile docker
      • nf-core modules test <MODULE> --profile singularity
      • nf-core modules test <MODULE> --profile conda
    • For subworkflows:
      • nf-core subworkflows test <SUBWORKFLOW> --profile docker
      • nf-core subworkflows test <SUBWORKFLOW> --profile singularity
      • nf-core subworkflows test <SUBWORKFLOW> --profile conda

@famosab famosab linked an issue Oct 29, 2024 that may be closed by this pull request
Comment on lines 28 to 29
ch_qc_metrics = ch_qc_metrics.mix(PARABRICKS_FQ2BAM.out.qc_metrics)
ch_duplicate_metrics = ch_duplicate_metrics.mix(PARABRICKS_FQ2BAM.out.duplicate_metrics)
Member

where do you pass this downstream?

Contributor Author

I planned to emit it at the end; it's not needed for applybqsr, though.
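
For illustration, a minimal sketch of what surfacing these channels from the subworkflow could look like; the channel and output names here are assumptions, not the final interface:

```nextflow
    // Hypothetical emit block for the subworkflow; ch_bam and the output
    // names are placeholders for illustration only.
    emit:
    bam               = ch_bam               // channel: [ val(meta), path(bam) ]
    qc_metrics        = ch_qc_metrics        // channel: [ val(meta), path(qc_metrics) ]
    duplicate_metrics = ch_duplicate_metrics // channel: [ val(meta), path(duplicate_metrics) ]
    versions          = ch_versions          // channel: [ path(versions.yml) ]
```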

@famosab (Contributor Author) commented Jan 21, 2025

@sateeshperi Do you have any idea why the fq2bam process seems not to produce the expected output BAM files?

ext.args = '--low-memory'
}
// Ref: https://forums.developer.nvidia.com/t/problem-with-gpu/256825/6
// Parabricks’s fq2bam requires 24GB of memory.
Contributor

Should we maybe do a

Suggested change:
    // Parabricks’s fq2bam requires 24GB of memory.
    // resourceLimits = [cpus: 6GB, memory: 24.GB]
    memory = '24.GB'

https://docs.nvidia.com/clara/parabricks/latest/documentation/tooldocs/man_fq2bam.html#man-fq2bam

Also it would be awesome to do something like --memory-limit ${task.memory} / 2 by default, or make sure there are 16 CPUs requested per GPU.

Just trying to push the resourceLimits syntax to the limits here 😆
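
For illustration, a minimal sketch of what such a test config could look like, assuming a withName: 'PARABRICKS_FQ2BAM' selector and that fq2bam's --memory-limit takes a value in gigabytes; the numbers and the ext.args closure are illustrative, not the merged configuration:

```nextflow
// Hypothetical conf/test.config sketch; the selector name, resource figures,
// and --memory-limit handling are assumptions based on the review comment above.
process {
    withName: 'PARABRICKS_FQ2BAM' {
        memory         = 24.GB
        resourceLimits = [cpus: 16, memory: 24.GB]
        // Hand roughly half of the allocated memory to fq2bam, as suggested above
        ext.args       = { "--low-memory --memory-limit ${task.memory.toGiga().intdiv(2)}" }
    }
}
```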

Labels: gpu (module uses GPU), help wanted (Extra attention is needed), new subworkflow
Projects: None yet
Development: Successfully merging this pull request may close these issues: new subworkflow: fastq_align_parabricks
5 participants