Updating workflows/microbiome/pathogen-identification/gene-based-pathogen-identification from 0.1 to 0.2 #756

Open · wants to merge 1 commit into main

Conversation

gxydevbot
Contributor

Hello! This is an automated update of the following workflow: workflows/microbiome/pathogen-identification/gene-based-pathogen-identification. I created this PR because one or more of the component tools appear to be out of date, i.e., a newer version is available on the ToolShed.

Comparing against the latest versions available on the ToolShed, the following tools appear to be outdated:

  • toolshed.g2.bx.psu.edu/repos/bgruening/split_file_to_collection/split_file_to_collection/0.5.0 should be updated to toolshed.g2.bx.psu.edu/repos/bgruening/split_file_to_collection/split_file_to_collection/0.5.2
  • toolshed.g2.bx.psu.edu/repos/bgruening/flye/flye/2.9.1+galaxy0 should be updated to toolshed.g2.bx.psu.edu/repos/bgruening/flye/flye/2.9.5+galaxy1
  • toolshed.g2.bx.psu.edu/repos/iuc/medaka_consensus_pipeline/medaka_consensus_pipeline/1.7.2+galaxy0 should be updated to toolshed.g2.bx.psu.edu/repos/iuc/medaka_consensus_pipeline/medaka_consensus_pipeline/1.7.2+galaxy1
  • toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_find_and_replace/1.1.4 should be updated to toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_find_and_replace/9.5+galaxy0

The workflow release number has been updated from 0.1 to 0.2.

If you want to skip this change, close this PR without deleting the branch. It will be reopened if another change is detected.
Any commit from an author other than 'planemo-autoupdate' will prevent further auto-updates.
To discard manual changes and allow auto-updates again, delete the branch.
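
To preview or reproduce this kind of update locally, a minimal sketch (assuming Planemo is installed; the workflow filename below matches the test name in this report, but the exact options used by the bot are not shown here) is:

  # hypothetical local run; filename and options are illustrative
  planemo autoupdate Gene-based-Pathogen-Identification.ga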


Test Results (powered by Planemo)

Test Summary

Test State   Count
Total        1
Passed       0
Error        1
Failure      0
Skipped      0
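
The summary above and the step-by-step details below come from Planemo's workflow testing; a hedged sketch of an equivalent local invocation (the exact flags used by the CI are not shown in this report) is:

  # illustrative; requires a Galaxy instance that Planemo can provision or connect to
  planemo test Gene-based-Pathogen-Identification.ga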
Errored Tests
  • ❌ Gene-based-Pathogen-Identification.ga_0

    Execution Problem:

    • Failed to run workflow, at least one job is in [error] state.
      

    Workflow invocation details

    • Invocation Messages

    • Steps
      • Step 1: collection_of_preprocessed_samples:

        • step_state: scheduled
      • Step 2: toolshed.g2.bx.psu.edu/repos/iuc/collection_element_identifiers/collection_element_identifiers/0.0.2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • mv '/tmp/tmpaqgrhj37/job_working_directory/000/3/configs/tmp_c7k36p8' '/tmp/tmpaqgrhj37/job_working_directory/000/3/outputs/dataset_13aafb64-7bc0-4018-a3d5-e66f9d935cf7.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "fastqsanger.gz"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input_collection {"values": [{"id": 1, "src": "hdca"}]}
      • Step 11: toolshed.g2.bx.psu.edu/repos/iuc/abricate/abricate/1.0.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -sf '/tmp/tmpaqgrhj37/files/d/3/d/dataset_d3dad808-fab5-4eff-ab12-f25ed11c2b19.dat' nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10 &&  abricate nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10  --minid=50.0 --mincov=50.0 --db=vfdb > '/tmp/tmpaqgrhj37/job_working_directory/000/15/outputs/dataset_682fe403-9478-4f42-9a77-c5c7027d4bd3.dat'

            Exit Code:

            • 0

            Standard Error:

            • Using nucl database vfdb:  2597 sequences -  2024-Dec-15
              Processing: nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10
              Found 3 genes in nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10
              Tip: abricate can also find virulence factors; use --list to see all supported databases.
              Done.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              adv {"db": "vfdb", "min_cov": "50.0", "min_dna_id": "50.0", "no_header": false}
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
          • Job 2:

            • Job state is ok

            Command Line:

            • ln -sf '/tmp/tmpaqgrhj37/files/2/9/5/dataset_295f5a79-06b4-4f45-a3ef-790e1d53131a.dat' nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode12 &&  abricate nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode12  --minid=50.0 --mincov=50.0 --db=vfdb > '/tmp/tmpaqgrhj37/job_working_directory/000/16/outputs/dataset_311a4555-dc88-416a-839c-c8c6232cd86f.dat'

            Exit Code:

            • 0

            Standard Error:

            • Using nucl database vfdb:  2597 sequences -  2024-Dec-15
              Processing: nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode12
              Found 133 genes in nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode12
              Tip: have a suggestion for abricate? Tell me at https://github.com/tseemann/abricate/issues
              Done.
              
              Exception in thread Thread-42 (run_postfork):
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              adv {"db": "vfdb", "min_cov": "50.0", "min_dna_id": "50.0", "no_header": false}
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
      • Step 12: toolshed.g2.bx.psu.edu/repos/iuc/abricate/abricate/1.0.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -sf '/tmp/tmpaqgrhj37/files/d/3/d/dataset_d3dad808-fab5-4eff-ab12-f25ed11c2b19.dat' nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10 &&  abricate nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10  --minid=50.0 --mincov=50.0 --db=ncbi > '/tmp/tmpaqgrhj37/job_working_directory/000/17/outputs/dataset_6b7d132f-8b4a-4daf-a968-1fe6e15b98cc.dat'

            Exit Code:

            • 0

            Standard Error:

            • Using nucl database ncbi:  5386 sequences -  2024-Dec-15
              Processing: nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10
              Found 0 genes in nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10
              Tip: abricate can also find virulence factors; use --list to see all supported databases.
              Done.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              adv {"db": "ncbi", "min_cov": "50.0", "min_dna_id": "50.0", "no_header": false}
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
          • Job 2:

            • Job state is running

            Command Line:

            • ln -sf '/tmp/tmpaqgrhj37/files/2/9/5/dataset_295f5a79-06b4-4f45-a3ef-790e1d53131a.dat' nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode12 &&  abricate nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode12  --minid=50.0 --mincov=50.0 --db=ncbi > '/tmp/tmpaqgrhj37/job_working_directory/000/18/outputs/dataset_921ba470-0285-458c-bddf-ae72e52c77ce.dat'

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              adv {"db": "ncbi", "min_cov": "50.0", "min_dna_id": "50.0", "no_header": false}
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
      • Step 13: toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_find_and_replace/9.5+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • perl '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/3dc70b59608c/text_processing/find_and_replace' -c 1 -o '/tmp/tmpaqgrhj37/job_working_directory/000/23/outputs/dataset_083a8c8a-c85a-415c-9c33-0e48107b7cc9.dat' -g    -r '^(.+)$' 'nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10_$1' '/tmp/tmpaqgrhj37/files/b/b/d/dataset_bbd0fa53-07fa-41ce-9ddf-442be2508a16.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              find_and_replace [{"__index__": 0, "caseinsensitive": false, "find_pattern": "^(.+)$", "global": true, "is_regex": true, "replace_pattern": "nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10_$1", "searchwhere": {"__current_case__": 1, "column": "1", "searchwhere_select": "column"}, "skip_first_line": false, "wholewords": false}]
          • Job 2:

            • Job state is ok

            Command Line:

            • perl '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/3dc70b59608c/text_processing/find_and_replace' -c 1 -o '/tmp/tmpaqgrhj37/job_working_directory/000/24/outputs/dataset_ca5b6b08-a0c9-408b-afe4-b7d05c1241fe.dat' -g    -r '^(.+)$' 'nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode12_$1' '/tmp/tmpaqgrhj37/files/a/6/b/dataset_a6b8f7dd-d609-4e75-bef3-5b6eaa048fcd.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              find_and_replace [{"__index__": 0, "caseinsensitive": false, "find_pattern": "^(.+)$", "global": true, "is_regex": true, "replace_pattern": "nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode12_$1", "searchwhere": {"__current_case__": 1, "column": "1", "searchwhere_select": "column"}, "skip_first_line": false, "wholewords": false}]
      • Step 14: toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_find_and_replace/9.5+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is running

            Command Line:

            • perl '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/3dc70b59608c/text_processing/find_and_replace' -c 1 -g     '#FILE' 'SampleID' '/tmp/tmpaqgrhj37/files/6/8/2/dataset_682fe403-9478-4f42-9a77-c5c7027d4bd3.dat' | perl '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/3dc70b59608c/text_processing/find_and_replace' -c 2 -o '/tmp/tmpaqgrhj37/job_working_directory/000/25/outputs/dataset_eb2c0017-0aea-41eb-9b58-a372d7aab4e4.dat' -g   -s -r '^(.+)$' 'nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10_$1'

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              find_and_replace [{"__index__": 0, "caseinsensitive": false, "find_pattern": "#FILE", "global": true, "is_regex": false, "replace_pattern": "SampleID", "searchwhere": {"__current_case__": 1, "column": "1", "searchwhere_select": "column"}, "skip_first_line": false, "wholewords": false}, {"__index__": 1, "caseinsensitive": false, "find_pattern": "^(.+)$", "global": true, "is_regex": true, "replace_pattern": "nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10_$1", "searchwhere": {"__current_case__": 1, "column": "2", "searchwhere_select": "column"}, "skip_first_line": true, "wholewords": false}]
          • Job 2:

            • Job state is new

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              find_and_replace [{"__index__": 0, "caseinsensitive": false, "find_pattern": "#FILE", "global": true, "is_regex": false, "replace_pattern": "SampleID", "searchwhere": {"__current_case__": 1, "column": "1", "searchwhere_select": "column"}, "skip_first_line": false, "wholewords": false}, {"__index__": 1, "caseinsensitive": false, "find_pattern": "^(.+)$", "global": true, "is_regex": true, "replace_pattern": "nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode12_$1", "searchwhere": {"__current_case__": 1, "column": "2", "searchwhere_select": "column"}, "skip_first_line": true, "wholewords": false}]
      • Step 15: toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_find_and_replace/9.5+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is error

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              find_and_replace [{"__index__": 0, "caseinsensitive": false, "find_pattern": "#FILE", "global": true, "is_regex": false, "replace_pattern": "SampleID", "searchwhere": {"__current_case__": 1, "column": "1", "searchwhere_select": "column"}, "skip_first_line": false, "wholewords": false}, {"__index__": 1, "caseinsensitive": false, "find_pattern": "^(.+)$", "global": true, "is_regex": true, "replace_pattern": "nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10_$1", "searchwhere": {"__current_case__": 1, "column": "2", "searchwhere_select": "column"}, "skip_first_line": true, "wholewords": false}]
          • Job 2:

            • Job state is new

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              find_and_replace [{"__index__": 0, "caseinsensitive": false, "find_pattern": "#FILE", "global": true, "is_regex": false, "replace_pattern": "SampleID", "searchwhere": {"__current_case__": 1, "column": "1", "searchwhere_select": "column"}, "skip_first_line": false, "wholewords": false}, {"__index__": 1, "caseinsensitive": false, "find_pattern": "^(.+)$", "global": true, "is_regex": true, "replace_pattern": "nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode12_$1", "searchwhere": {"__current_case__": 1, "column": "2", "searchwhere_select": "column"}, "skip_first_line": true, "wholewords": false}]
      • Step 16: toolshed.g2.bx.psu.edu/repos/devteam/tabular_to_fasta/tab2fasta/1.1.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/tabular_to_fasta/0a7799698fe5/tabular_to_fasta/tabular_to_fasta.py' '/tmp/tmpaqgrhj37/files/0/8/3/dataset_083a8c8a-c85a-415c-9c33-0e48107b7cc9.dat' 1 2 '/tmp/tmpaqgrhj37/job_working_directory/000/29/outputs/dataset_55abdf72-0bec-4107-b22d-d8600b73e0a2.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              seq_col "2"
              title_col "1"
          • Job 2:

            • Job state is queued

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/tabular_to_fasta/0a7799698fe5/tabular_to_fasta/tabular_to_fasta.py' '/tmp/tmpaqgrhj37/files/c/a/5/dataset_ca5b6b08-a0c9-408b-afe4-b7d05c1241fe.dat' 1 2 '/tmp/tmpaqgrhj37/job_working_directory/000/30/outputs/dataset_209f0d11-8976-44c8-9f48-dcec01df0df8.dat'

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              seq_col "2"
              title_col "1"
      • Step 3: __BUILD_LIST__:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              datasets [{"__index__": 0, "id_cond": {"__current_case__": 0, "id_select": "idx"}, "input": {"values": [{"id": 1, "src": "hda"}]}}]
          • Job 2:

            • Job state is ok

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              datasets [{"__index__": 0, "id_cond": {"__current_case__": 0, "id_select": "idx"}, "input": {"values": [{"id": 2, "src": "hda"}]}}]
      • Step 4: toolshed.g2.bx.psu.edu/repos/bgruening/split_file_to_collection/split_file_to_collection/0.5.2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • mkdir ./out && python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/bgruening/split_file_to_collection/2dae863c8f42/split_file_to_collection/split_file_to_collection.py' --out ./out --in '/tmp/tmpaqgrhj37/files/1/3/a/dataset_13aafb64-7bc0-4018-a3d5-e66f9d935cf7.dat' --ftype 'txt' --chunksize 1 --file_names 'split_file' --file_ext 'txt'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "txt"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              split_parms {"__current_case__": 5, "input": {"values": [{"id": 3, "src": "hda"}]}, "newfilenames": "split_file", "select_allocate": {"__current_case__": 2, "allocate": "byrow"}, "select_ftype": "txt", "select_mode": {"__current_case__": 0, "chunksize": "1", "mode": "chunk"}}
      • Step 5: toolshed.g2.bx.psu.edu/repos/bgruening/flye/flye/2.9.5+galaxy1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -sf '/tmp/tmpaqgrhj37/files/8/4/8/dataset_84880410-a9fc-43f1-8adb-663a8c06f453.dat' ./input_0.fastq.gz && flye --nano-hq ./input_0.fastq.gz -o out_dir -t ${GALAXY_SLOTS:-4} -i 1 --meta

            Exit Code:

            • 0

            Standard Error:

            • [2025-03-21 11:57:38] INFO: Starting Flye 2.9.5-b1801
              [2025-03-21 11:57:38] INFO: >>>STAGE: configure
              [2025-03-21 11:57:38] INFO: Configuring run
              [2025-03-21 11:57:38] INFO: Total read length: 16599171
              [2025-03-21 11:57:38] INFO: Reads N50/N90: 13590 / 510
              [2025-03-21 11:57:38] INFO: Minimum overlap set to 1000
              [2025-03-21 11:57:38] INFO: >>>STAGE: assembly
              [2025-03-21 11:57:38] INFO: Assembling disjointigs
              [2025-03-21 11:57:38] INFO: Reading sequences
              [2025-03-21 11:57:38] INFO: Building minimizer index
              [2025-03-21 11:57:38] INFO: Pre-calculating index storage
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:57:39] INFO: Filling index
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:57:44] INFO: Extending reads
              [2025-03-21 11:57:47] INFO: Overlap-based coverage: 1
              [2025-03-21 11:57:47] INFO: Median overlap divergence: 0.0672248
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:57:52] INFO: Assembled 162 disjointigs
              [2025-03-21 11:57:52] INFO: Generating sequence
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:57:53] INFO: Filtering contained disjointigs
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:57:55] INFO: Contained seqs: 7
              [2025-03-21 11:57:55] INFO: >>>STAGE: consensus
              [2025-03-21 11:57:55] INFO: Running Minimap2
              [2025-03-21 11:58:03] INFO: Computing consensus
              [2025-03-21 11:58:28] INFO: Alignment error rate: 0.089401
              [2025-03-21 11:58:28] INFO: >>>STAGE: repeat
              [2025-03-21 11:58:28] INFO: Building and resolving repeat graph
              [2025-03-21 11:58:28] INFO: Parsing disjointigs
              [2025-03-21 11:58:28] INFO: Building repeat graph
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:58:34] INFO: Median overlap divergence: 0.228709
              [2025-03-21 11:58:34] INFO: Parsing reads
              [2025-03-21 11:58:35] INFO: Aligning reads to the graph
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:58:48] INFO: Aligned read sequence: 8293453 / 13948017 (0.594597)
              [2025-03-21 11:58:48] INFO: Median overlap divergence: 0.0403482
              [2025-03-21 11:58:48] INFO: Mean edge coverage: 1
              [2025-03-21 11:58:48] INFO: Simplifying the graph
              [2025-03-21 11:58:48] INFO: >>>STAGE: contigger
              [2025-03-21 11:58:48] INFO: Generating contigs
              [2025-03-21 11:58:48] INFO: Reading sequences
              [2025-03-21 11:58:49] INFO: Generated 16 contigs
              [2025-03-21 11:58:49] INFO: Added 0 scaffold connections
              [2025-03-21 11:58:49] INFO: >>>STAGE: polishing
              [2025-03-21 11:58:49] INFO: Polishing genome (1/1)
              [2025-03-21 11:58:49] INFO: Running minimap2
              [2025-03-21 11:58:50] INFO: Separating alignment into bubbles
              [2025-03-21 11:58:51] INFO: Alignment error rate: 0.126796
              [2025-03-21 11:58:51] INFO: Correcting bubbles
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 12:00:20] INFO: >>>STAGE: finalize
              [2025-03-21 12:00:20] INFO: Assembly statistics:
              
              	Total length:	294971
              	Fragments:	13
              	Fragments N50:	41795
              	Largest frg:	76247
              	Scaffolds:	0
              	Mean coverage:	4
              
              [2025-03-21 12:00:20] INFO: Final assembly: /tmp/tmpaqgrhj37/job_working_directory/000/7/working/out_dir/assembly.fasta
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              asm {"__current_case__": 1, "asm_select": "false"}
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              generate_log false
              iterations "1"
              keep_haplotypes false
              meta true
              min_overlap None
              mode_conditional {"__current_case__": 2, "mode": "--nano-hq"}
              no_alt_contigs false
              scaffold false
          • Job 2:

            • Job state is ok

            Command Line:

            • ln -sf '/tmp/tmpaqgrhj37/files/e/2/7/dataset_e27ed020-17b2-4eff-83a4-e91a92e63613.dat' ./input_0.fastq.gz && flye --nano-hq ./input_0.fastq.gz -o out_dir -t ${GALAXY_SLOTS:-4} -i 1 --meta

            Exit Code:

            • 0

            Standard Error:

            • [2025-03-21 11:57:38] INFO: Starting Flye 2.9.5-b1801
              [2025-03-21 11:57:38] INFO: >>>STAGE: configure
              [2025-03-21 11:57:38] INFO: Configuring run
              [2025-03-21 11:57:38] INFO: Total read length: 16082845
              [2025-03-21 11:57:38] INFO: Reads N50/N90: 31018 / 4353
              [2025-03-21 11:57:38] INFO: Minimum overlap set to 4000
              [2025-03-21 11:57:38] INFO: >>>STAGE: assembly
              [2025-03-21 11:57:38] INFO: Assembling disjointigs
              [2025-03-21 11:57:38] INFO: Reading sequences
              [2025-03-21 11:57:38] INFO: Building minimizer index
              [2025-03-21 11:57:38] INFO: Pre-calculating index storage
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:57:39] INFO: Filling index
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:57:56] INFO: Extending reads
              [2025-03-21 11:58:09] INFO: Overlap-based coverage: 3
              [2025-03-21 11:58:09] INFO: Median overlap divergence: 0.0622277
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:58:13] INFO: Assembled 46 disjointigs
              [2025-03-21 11:58:13] INFO: Generating sequence
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:58:14] INFO: Filtering contained disjointigs
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:58:16] INFO: Contained seqs: 0
              [2025-03-21 11:58:16] INFO: >>>STAGE: consensus
              [2025-03-21 11:58:16] INFO: Running Minimap2
              [2025-03-21 11:58:24] INFO: Computing consensus
              [2025-03-21 11:58:45] INFO: Alignment error rate: 0.069028
              [2025-03-21 11:58:45] INFO: >>>STAGE: repeat
              [2025-03-21 11:58:45] INFO: Building and resolving repeat graph
              [2025-03-21 11:58:45] INFO: Parsing disjointigs
              [2025-03-21 11:58:45] INFO: Building repeat graph
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:58:47] INFO: Median overlap divergence: 0.0646416
              [2025-03-21 11:58:47] INFO: Parsing reads
              [2025-03-21 11:58:47] INFO: Aligning reads to the graph
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:58:57] INFO: Aligned read sequence: 13430829 / 14568068 (0.921936)
              [2025-03-21 11:58:57] INFO: Median overlap divergence: 0.0324551
              [2025-03-21 11:58:57] INFO: Mean edge coverage: 3
              [2025-03-21 11:58:57] INFO: Simplifying the graph
              [2025-03-21 11:58:57] INFO: >>>STAGE: contigger
              [2025-03-21 11:58:57] INFO: Generating contigs
              [2025-03-21 11:58:57] INFO: Reading sequences
              [2025-03-21 11:58:58] INFO: Generated 23 contigs
              [2025-03-21 11:58:58] INFO: Added 0 scaffold connections
              [2025-03-21 11:58:58] INFO: >>>STAGE: polishing
              [2025-03-21 11:58:58] INFO: Polishing genome (1/1)
              [2025-03-21 11:58:58] INFO: Running minimap2
              [2025-03-21 11:59:03] INFO: Separating alignment into bubbles
              [2025-03-21 11:59:15] INFO: Alignment error rate: 0.049562
              [2025-03-21 11:59:15] INFO: Correcting bubbles
              0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% 
              [2025-03-21 11:59:47] INFO: >>>STAGE: finalize
              [2025-03-21 11:59:47] INFO: Assembly statistics:
              
              	Total length:	3294294
              	Fragments:	21
              	Fragments N50:	183164
              	Largest frg:	409469
              	Scaffolds:	0
              	Mean coverage:	3
              
              [2025-03-21 11:59:47] INFO: Final assembly: /tmp/tmpaqgrhj37/job_working_directory/000/8/working/out_dir/assembly.fasta
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              asm {"__current_case__": 1, "asm_select": "false"}
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              generate_log false
              iterations "1"
              keep_haplotypes false
              meta true
              min_overlap None
              mode_conditional {"__current_case__": 2, "mode": "--nano-hq"}
              no_alt_contigs false
              scaffold false
      • Step 6: param_value_from_file:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "txt"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              param_type "text"
              remove_newlines true
          • Job 2:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "txt"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              param_type "text"
              remove_newlines true
      • Step 7: toolshed.g2.bx.psu.edu/repos/iuc/medaka_consensus_pipeline/medaka_consensus_pipeline/1.7.2+galaxy1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cp '/tmp/tmpaqgrhj37/files/7/4/4/dataset_744d204f-9ba3-4529-91e0-8d96b01d9b88.dat' 'input_assembly.fa' && medaka_consensus -m r941_min_hac_g507 -b 100 -o results -t ${GALAXY_SLOTS:-4} -i '/tmp/tmpaqgrhj37/files/8/4/8/dataset_84880410-a9fc-43f1-8adb-663a8c06f453.dat' -d 'input_assembly.fa'  2>&1 | tee '/tmp/tmpaqgrhj37/job_working_directory/000/9/outputs/dataset_aadced41-2f55-4e1f-8d2f-e9dad7899c85.dat'

            Exit Code:

            • 0

            Standard Output:

            • Checking program versions
              This is medaka 1.7.2
              Program    Version    Required   Pass     
              bcftools   1.15.1     1.11       True     
              bgzip      1.16       1.11       True     
              minimap2   2.24       2.11       True     
              samtools   1.15.1     1.11       True     
              tabix      1.16       1.11       True     
              Aligning basecalls to draft
              Creating fai index file /tmp/tmpaqgrhj37/job_working_directory/000/9/working/input_assembly.fa.fai
              Creating mmi index file /tmp/tmpaqgrhj37/job_working_directory/000/9/working/input_assembly.fa.map-ont.mmi
              [M::mm_idx_gen::0.011*1.09] collected minimizers
              [M::mm_idx_gen::0.019*1.34] sorted minimizers
              [M::main::0.022*1.30] loaded/built the index for 13 target sequence(s)
              [M::mm_idx_stat] kmer size: 15; skip: 10; is_hpc: 0; #seq: 13
              [M::mm_idx_stat::0.023*1.29] distinct minimizers: 54348 (99.26% are singletons); average occurrences: 1.034; average spacing: 5.248; total length: 294971
              [M::main] Version: 2.24-r1122
              [M::main] CMD: minimap2 -I 16G -x map-ont -d /tmp/tmpaqgrhj37/job_working_directory/000/9/working/input_assembly.fa.map-ont.mmi /tmp/tmpaqgrhj37/job_working_directory/000/9/working/input_assembly.fa
              [M::main] Real time: 0.024 sec; CPU: 0.031 sec; Peak RSS: 0.011 GB
              [M::main::0.010*1.09] loaded/built the index for 13 target sequence(s)
              [M::mm_mapopt_update::0.011*1.08] mid_occ = 10
              [M::mm_idx_stat] kmer size: 15; skip: 10; is_hpc: 0; #seq: 13
              [M::mm_idx_stat::0.012*1.08] distinct minimizers: 54348 (99.26% are singletons); average occurrences: 1.034; average spacing: 5.248; total length: 294971
              [M::worker_pipeline::1.557*0.89] mapped 8974 sequences
              [M::main] Version: 2.24-r1122
              [M::main] CMD: minimap2 -x map-ont --secondary=no -L --MD -A 2 -B 4 -O 4,24 -E 2,1 -t 1 -a /tmp/tmpaqgrhj37/job_working_directory/000/9/working/input_assembly.fa.map-ont.mmi /tmp/tmpaqgrhj37/files/8/4/8/dataset_84880410-a9fc-43f1-8adb-663a8c06f453.dat
              [M::main] Real time: 1.559 sec; CPU: 1.392 sec; Peak RSS: 0.134 GB
              Running medaka consensus
              [12:00:52 - Predict] Setting tensorflow inter/intra-op threads to 1/1.
              [12:00:52 - Predict] Processing region(s): contig_1:0-16886 contig_10:0-19537 contig_12:0-5568 contig_13:0-24846 contig_14:0-1299 contig_15:0-76247 contig_16:0-35171 contig_2:0-41795 contig_3:0-3617 contig_4:0-2172 contig_5:0-7123 contig_8:0-58135 contig_9:0-2575
              [12:00:52 - Predict] Using model: /usr/local/lib/python3.8/site-packages/medaka/data/r941_min_hac_g507_model.tar.gz.
              [12:00:52 - Predict] Processing 13 long region(s) with batching.
              [12:00:53 - MdlStrTF] Model <keras.engine.sequential.Sequential object at 0x7f39cc5edaf0>
              [12:00:53 - MdlStrTF] loading weights from /tmp/tmpaqgrhj37/tmp/tmpxbs0pi72/model/variables/variables
              [12:00:53 - BAMFile] Creating pool of 16 BAM file sets.
              [12:00:53 - Sampler] Initializing sampler for consensus of region contig_1:0-16886.
              [12:00:53 - Sampler] Initializing sampler for consensus of region contig_10:0-19537.
              [12:00:53 - PWorker] Running inference for 0.3M draft bases.
              [12:00:53 - Feature] Processed contig_1:0.0-16885.0 (median depth 3.0)
              [12:00:53 - Sampler] Took 0.21s to make features.
              [12:00:53 - Sampler] Initializing sampler for consensus of region contig_12:0-5568.
              [12:00:53 - Feature] Processed contig_10:0.0-19536.0 (median depth 4.0)
              [12:00:53 - Sampler] Took 0.31s to make features.
              [12:00:53 - Sampler] Initializing sampler for consensus of region contig_13:0-24846.
              [12:00:53 - Feature] Processed contig_12:0.0-5567.0 (median depth 2.0)
              [12:00:53 - Sampler] Took 0.18s to make features.
              [12:00:53 - Sampler] Region contig_12:0.0-5567.0 (5741 positions) is smaller than inference chunk length 10000, quarantining.
              [12:00:53 - Sampler] Initializing sampler for consensus of region contig_14:0-1299.
              [12:00:53 - Feature] Processed contig_14:0.0-1298.0 (median depth 4.0)
              [12:00:53 - Sampler] Took 0.11s to make features.
              [12:00:53 - Sampler] Region contig_14:0.0-1298.0 (1334 positions) is smaller than inference chunk length 10000, quarantining.
              [12:00:53 - Sampler] Initializing sampler for consensus of region contig_15:0-76247.
              [12:00:53 - Feature] Processed contig_13:0.0-24845.0 (median depth 4.0)
              [12:00:53 - Sampler] Took 0.23s to make features.
              [12:00:53 - Sampler] Initializing sampler for consensus of region contig_16:0-35171.
              [12:00:54 - Feature] Processed contig_15:0.0-76246.0 (median depth 4.0)
              [12:00:54 - Sampler] Took 0.26s to make features.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_2:0-41795.
              [12:00:54 - Feature] Processed contig_16:0.0-35170.0 (median depth 4.0)
              [12:00:54 - Sampler] Took 0.21s to make features.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_3:0-3617.
              [12:00:54 - Feature] Processed contig_3:0.0-3616.0 (median depth 3.0)
              [12:00:54 - Sampler] Took 0.18s to make features.
              [12:00:54 - Sampler] Region contig_3:0.0-3616.0 (3707 positions) is smaller than inference chunk length 10000, quarantining.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_4:0-2172.
              [12:00:54 - Feature] Processed contig_4:0.0-2171.0 (median depth 5.0)
              [12:00:54 - Sampler] Took 0.17s to make features.
              [12:00:54 - Sampler] Region contig_4:0.0-2171.0 (3070 positions) is smaller than inference chunk length 10000, quarantining.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_5:0-7123.
              [12:00:54 - Feature] Pileup counts do not span requested region, requested contig_5:0-7123, received 1435-4496.
              [12:00:54 - Feature] Processed contig_2:0.0-41794.0 (median depth 4.0)
              [12:00:54 - Sampler] Took 0.44s to make features.
              [12:00:54 - Feature] Processed contig_5:1435.0-4496.0 (median depth 2.0)
              [12:00:54 - Feature] Pileup counts do not span requested region, requested contig_5:0-7123, received 4897-6362.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_8:0-58135.
              [12:00:54 - Feature] Processed contig_5:4897.0-6362.0 (median depth 2.0)
              [12:00:54 - Feature] Pileup counts do not span requested region, requested contig_5:0-7123, received 6378-6901.
              [12:00:54 - Feature] Processed contig_5:6378.0-6901.0 (median depth 2.0)
              [12:00:54 - Sampler] Took 0.20s to make features.
              [12:00:54 - Sampler] Region contig_5:1435.0-4496.0 (3140 positions) is smaller than inference chunk length 10000, quarantining.
              [12:00:54 - Sampler] Region contig_5:4897.0-6362.0 (1855 positions) is smaller than inference chunk length 10000, quarantining.
              [12:00:54 - Sampler] Region contig_5:6378.0-6901.0 (681 positions) is smaller than inference chunk length 10000, quarantining.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_9:0-2575.
              [12:00:54 - Feature] Processed contig_9:0.0-2574.0 (median depth 1.0)
              [12:00:54 - Sampler] Took 0.22s to make features.
              [12:00:54 - Sampler] Region contig_9:0.0-2574.0 (2599 positions) is smaller than inference chunk length 10000, quarantining.
              [12:00:54 - Feature] Processed contig_8:0.0-58134.0 (median depth 6.0)
              [12:00:54 - Sampler] Took 0.33s to make features.
              [12:01:03 - PWorker] Batches in cache: 1.
              [12:01:03 - PWorker] Processed 1 batches
              [12:01:03 - PWorker] All done, 8 remainder regions.
              [12:01:03 - Predict] Processing 8 short region(s).
              [12:01:03 - MdlStrTF] Model <keras.engine.sequential.Sequential object at 0x7f39c9515d90>
              [12:01:03 - MdlStrTF] loading weights from /tmp/tmpaqgrhj37/tmp/tmpxbs0pi72/model/variables/variables
              [12:01:03 - Sampler] Initializing sampler for consensus of region contig_12:0-5568.
              [12:01:03 - Sampler] Initializing sampler for consensus of region contig_14:0-1299.
              [12:01:03 - PWorker] Running inference for 0.0M draft bases.
              [12:01:04 - Feature] Processed contig_12:0.0-5567.0 (median depth 2.0)
              [12:01:04 - Sampler] Took 0.11s to make features.
              [12:01:04 - Feature] Processed contig_14:0.0-1298.0 (median depth 4.0)
              [12:01:04 - Sampler] Took 0.11s to make features.
              [12:01:04 - Sampler] Initializing sampler for consensus of region contig_3:0-3617.
              [12:01:04 - Sampler] Initializing sampler for consensus of region contig_4:0-2172.
              [12:01:04 - Feature] Processed contig_3:0.0-3616.0 (median depth 3.0)
              [12:01:04 - Sampler] Took 0.01s to make features.
              [12:01:04 - Sampler] Initializing sampler for consensus of region contig_5:1435-4497.
              [12:01:04 - Feature] Processed contig_4:0.0-2171.0 (median depth 5.0)
              [12:01:04 - Sampler] Took 0.05s to make features.
              [12:01:04 - Sampler] Initializing sampler for consensus of region contig_5:4897-6363.
              [12:01:04 - Feature] Processed contig_5:1435.0-4496.0 (median depth 2.0)
              [12:01:04 - Sampler] Took 0.05s to make features.
              [12:01:04 - Sampler] Initializing sampler for consensus of region contig_5:6378-6902.
              [12:01:04 - Feature] Processed contig_5:4897.0-6362.0 (median depth 2.0)
              [12:01:04 - Sampler] Took 0.04s to make features.
              [12:01:04 - Feature] Processed contig_5:6378.0-6901.0 (median depth 2.0)
              [12:01:04 - Sampler] Took 0.02s to make features.
              [12:01:04 - Sampler] Initializing sampler for consensus of region contig_9:0-2575.
              [12:01:04 - Feature] Processed contig_9:0.0-2574.0 (median depth 1.0)
              [12:01:04 - Sampler] Took 0.07s to make features.
              [12:01:14 - PWorker] Batches in cache: 8.
              [12:01:14 - PWorker] 22.7% Done (0.0/0.0 Mbases) in 10.5s
              [12:01:23 - PWorker] Batches in cache: 6.
              [12:01:29 - PWorker] Batches in cache: 5.
              [12:01:29 - PWorker] 43.1% Done (0.0/0.0 Mbases) in 25.3s
              [12:01:34 - PWorker] Batches in cache: 4.
              [12:01:39 - PWorker] 59.9% Done (0.0/0.0 Mbases) in 35.7s
              [12:01:44 - PWorker] Batches in cache: 1.
              [12:01:44 - PWorker] Processed 8 batches
              [12:01:44 - PWorker] All done, 0 remainder regions.
              [12:01:44 - Predict] Finished processing all regions.
              [12:01:46 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:01:46 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:01:46 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:01:46 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:01:46 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:01:46 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:01:46 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:01:46 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:01:47 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:01:47 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:01:47 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:01:47 - TrimOlap] contig_5:1435.0-4496.0 and contig_5:4897.0-6362.0 cannot be concatenated as there is no overlap and they do not abut.
              [12:01:47 - TrimOlap] contig_5:4897.0-6362.0 and contig_5:6378.0-6901.0 cannot be concatenated as there is no overlap and they do not abut.
              [12:01:47 - DataIndx] Loaded 1/1 (100.00%) sample files.
              /usr/local/lib/python3.8/site-packages/medaka/labels.py:387: RuntimeWarning: divide by zero encountered in log10
                q = -10 * np.log10(err)
              [12:01:47 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:01:47 - DataIndx] Loaded 1/1 (100.00%) sample files.
              Polished assembly written to results/consensus.fasta, have a nice day.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              b "100"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              g false
              m "r941_min_hac_g507"
              out ["consensus", "probs", "calls", "log", "gaps"]
          • Job 2:

            • Job state is ok

            Command Line:

            • cp '/tmp/tmpaqgrhj37/files/b/8/1/dataset_b81a2fe5-1639-4bd2-8f5f-a6eade77e6d4.dat' 'input_assembly.fa' && medaka_consensus -m r941_min_hac_g507 -b 100 -o results -t ${GALAXY_SLOTS:-4} -i '/tmp/tmpaqgrhj37/files/e/2/7/dataset_e27ed020-17b2-4eff-83a4-e91a92e63613.dat' -d 'input_assembly.fa'  2>&1 | tee '/tmp/tmpaqgrhj37/job_working_directory/000/10/outputs/dataset_95d8c176-20b3-401a-bafe-ea2ef7a438c3.dat'

            Exit Code:

            • 0

            Standard Output:

            • Checking program versions
              This is medaka 1.7.2
              Program    Version    Required   Pass     
              bcftools   1.15.1     1.11       True     
              bgzip      1.16       1.11       True     
              minimap2   2.24       2.11       True     
              samtools   1.15.1     1.11       True     
              tabix      1.16       1.11       True     
              Aligning basecalls to draft
              Creating fai index file /tmp/tmpaqgrhj37/job_working_directory/000/10/working/input_assembly.fa.fai
              Creating mmi index file /tmp/tmpaqgrhj37/job_working_directory/000/10/working/input_assembly.fa.map-ont.mmi
              [M::mm_idx_gen::0.079*1.01] collected minimizers
              [M::mm_idx_gen::0.122*1.08] sorted minimizers
              [M::main::0.154*1.06] loaded/built the index for 21 target sequence(s)
              [M::mm_idx_stat] kmer size: 15; skip: 10; is_hpc: 0; #seq: 21
              [M::mm_idx_stat::0.160*1.06] distinct minimizers: 600555 (98.14% are singletons); average occurrences: 1.026; average spacing: 5.345; total length: 3294294
              [M::main] Version: 2.24-r1122
              [M::main] CMD: minimap2 -I 16G -x map-ont -d /tmp/tmpaqgrhj37/job_working_directory/000/10/working/input_assembly.fa.map-ont.mmi /tmp/tmpaqgrhj37/job_working_directory/000/10/working/input_assembly.fa
              [M::main] Real time: 0.166 sec; CPU: 0.176 sec; Peak RSS: 0.036 GB
              [M::main::0.054*0.77] loaded/built the index for 21 target sequence(s)
              [M::mm_mapopt_update::0.062*0.80] mid_occ = 10
              [M::mm_idx_stat] kmer size: 15; skip: 10; is_hpc: 0; #seq: 21
              [M::mm_idx_stat::0.068*0.81] distinct minimizers: 600555 (98.14% are singletons); average occurrences: 1.026; average spacing: 5.345; total length: 3294294
              [M::worker_pipeline::4.863*0.88] mapped 2700 sequences
              [M::main] Version: 2.24-r1122
              [M::main] CMD: minimap2 -x map-ont --secondary=no -L --MD -A 2 -B 4 -O 4,24 -E 2,1 -t 1 -a /tmp/tmpaqgrhj37/job_working_directory/000/10/working/input_assembly.fa.map-ont.mmi /tmp/tmpaqgrhj37/files/e/2/7/dataset_e27ed020-17b2-4eff-83a4-e91a92e63613.dat
              [M::main] Real time: 4.869 sec; CPU: 4.269 sec; Peak RSS: 0.152 GB
              Running medaka consensus
              [12:00:51 - Predict] Setting tensorflow inter/intra-op threads to 1/1.
              [12:00:51 - Predict] Processing region(s): contig_1:0-234039 contig_10:0-133454 contig_11:0-62420 contig_12:0-260499 contig_13:0-166546 contig_14:0-147861 contig_15:0-105710 contig_16:0-114258 contig_17:0-18503 contig_18:0-87162 contig_19:0-247063 contig_2:0-77870 contig_20:0-83469 contig_21:0-289421 contig_22:0-159392 contig_23:0-153507 contig_3:0-409469 contig_4:0-124313 contig_5:0-52874 contig_6:0-183164 contig_9:0-183300
              [12:00:51 - Predict] Using model: /usr/local/lib/python3.8/site-packages/medaka/data/r941_min_hac_g507_model.tar.gz.
              [12:00:51 - Predict] Processing 21 long region(s) with batching.
              [12:00:52 - MdlStrTF] Model <keras.engine.sequential.Sequential object at 0x7fd865e09af0>
              [12:00:52 - MdlStrTF] loading weights from /tmp/tmpaqgrhj37/tmp/tmpr474z745/model/variables/variables
              [12:00:52 - BAMFile] Creating pool of 16 BAM file sets.
              [12:00:52 - Sampler] Initializing sampler for consensus of region contig_1:0-234039.
              [12:00:52 - Sampler] Initializing sampler for consensus of region contig_10:0-133454.
              [12:00:52 - PWorker] Running inference for 3.3M draft bases.
              [12:00:53 - Feature] Processed contig_10:0.0-133453.0 (median depth 4.0)
              [12:00:53 - Sampler] Took 0.94s to make features.
              [12:00:53 - Sampler] Initializing sampler for consensus of region contig_11:0-62420.
              [12:00:53 - Feature] Processed contig_1:0.0-234038.0 (median depth 4.0)
              [12:00:53 - Sampler] Took 1.01s to make features.
              [12:00:53 - Sampler] Initializing sampler for consensus of region contig_12:0-260499.
              [12:00:53 - Feature] Processed contig_11:0.0-62419.0 (median depth 3.0)
              [12:00:53 - Sampler] Took 0.55s to make features.
              [12:00:53 - Sampler] Initializing sampler for consensus of region contig_13:0-166546.
              [12:00:53 - Feature] Processed contig_12:0.0-260498.0 (median depth 4.0)
              [12:00:53 - Sampler] Took 0.65s to make features.
              [12:00:53 - Sampler] Initializing sampler for consensus of region contig_14:0-147861.
              [12:00:54 - Feature] Processed contig_13:0.0-166545.0 (median depth 3.0)
              [12:00:54 - Sampler] Took 0.38s to make features.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_15:0-105710.
              [12:00:54 - Feature] Processed contig_14:0.0-147860.0 (median depth 4.0)
              [12:00:54 - Sampler] Took 0.31s to make features.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_16:0-114258.
              [12:00:54 - Feature] Processed contig_15:0.0-105709.0 (median depth 5.0)
              [12:00:54 - Sampler] Took 0.25s to make features.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_17:0-18503.
              [12:00:54 - Feature] Processed contig_17:0.0-18502.0 (median depth 3.0)
              [12:00:54 - Sampler] Took 0.05s to make features.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_18:0-87162.
              [12:00:54 - Feature] Processed contig_16:0.0-114257.0 (median depth 3.0)
              [12:00:54 - Sampler] Took 0.19s to make features.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_19:0-247063.
              [12:00:54 - Feature] Processed contig_18:0.0-87161.0 (median depth 3.0)
              [12:00:54 - Sampler] Took 0.15s to make features.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_2:0-77870.
              [12:00:54 - Feature] Processed contig_19:0.0-247062.0 (median depth 4.0)
              [12:00:54 - Sampler] Took 0.35s to make features.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_20:0-83469.
              [12:00:54 - Feature] Processed contig_2:0.0-77869.0 (median depth 4.0)
              [12:00:54 - Sampler] Took 0.27s to make features.
              [12:00:54 - Sampler] Initializing sampler for consensus of region contig_21:0-289421.
              [12:00:54 - Feature] Pileup counts do not span requested region, requested contig_20:0-83469, received 2-83468.
              [12:00:55 - Feature] Processed contig_20:2.0-83468.0 (median depth 4.0)
              [12:00:55 - Sampler] Took 0.20s to make features.
              [12:00:55 - Sampler] Initializing sampler for consensus of region contig_22:0-159392.
              [12:00:55 - Feature] Processed contig_21:0.0-289420.0 (median depth 4.0)
              [12:00:55 - Sampler] Took 0.28s to make features.
              [12:00:55 - Sampler] Initializing sampler for consensus of region contig_23:0-153507.
              [12:00:55 - Feature] Processed contig_22:0.0-159391.0 (median depth 3.0)
              [12:00:55 - Sampler] Took 0.27s to make features.
              [12:00:55 - Sampler] Initializing sampler for consensus of region contig_3:0-409469.
              [12:00:55 - Feature] Processed contig_23:0.0-153506.0 (median depth 4.0)
              [12:00:55 - Sampler] Took 0.19s to make features.
              [12:00:55 - Sampler] Initializing sampler for consensus of region contig_4:0-124313.
              [12:00:55 - Feature] Processed contig_3:0.0-409468.0 (median depth 3.0)
              [12:00:55 - Sampler] Took 0.38s to make features.
              [12:00:55 - Sampler] Initializing sampler for consensus of region contig_5:0-52874.
              [12:00:55 - Feature] Processed contig_4:0.0-124312.0 (median depth 5.0)
              [12:00:55 - Sampler] Took 0.32s to make features.
              [12:00:55 - Sampler] Initializing sampler for consensus of region contig_6:0-183164.
              [12:00:55 - Feature] Pileup counts do not span requested region, requested contig_6:0-183164, received 178-183163.
              [12:00:55 - Feature] Processed contig_5:0.0-52873.0 (median depth 4.0)
              [12:00:55 - Sampler] Took 0.13s to make features.
              [12:00:55 - Sampler] Initializing sampler for consensus of region contig_9:0-183300.
              [12:00:55 - Feature] Processed contig_6:178.0-183163.0 (median depth 4.0)
              [12:00:55 - Sampler] Took 0.21s to make features.
              [12:00:56 - Feature] Processed contig_9:0.0-183299.0 (median depth 3.0)
              [12:00:56 - Sampler] Took 0.26s to make features.
              [12:01:15 - PWorker] Batches in cache: 4.
              [12:01:15 - PWorker] 26.5% Done (0.9/3.3 Mbases) in 23.4s
              [12:01:34 - PWorker] Batches in cache: 3.
              [12:01:34 - PWorker] 52.9% Done (1.7/3.3 Mbases) in 42.0s
              [12:01:52 - PWorker] Batches in cache: 2.
              [12:01:52 - PWorker] 79.3% Done (2.6/3.3 Mbases) in 60.1s
              [12:02:09 - PWorker] Batches in cache: 1.
              [12:02:09 - PWorker] 100.0% Done (3.3/3.3 Mbases) in 77.3s
              [12:02:10 - PWorker] Processed 4 batches
              [12:02:10 - PWorker] All done, 0 remainder regions.
              [12:02:10 - Predict] Finished processing all regions.
              [12:02:12 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:12 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:12 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:12 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:13 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:13 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:13 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:13 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:13 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:13 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:13 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:14 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:14 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:14 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:14 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:14 - DataIndx] Loaded 1/1 (100.00%) sample files.
              /usr/local/lib/python3.8/site-packages/medaka/labels.py:387: RuntimeWarning: divide by zero encountered in log10
                q = -10 * np.log10(err)
              [12:02:14 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:14 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:15 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:15 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:15 - DataIndx] Loaded 1/1 (100.00%) sample files.
              [12:02:15 - DataIndx] Loaded 1/1 (100.00%) sample files.
              Polished assembly written to results/consensus.fasta, have a nice day.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              b "100"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              g false
              m "r941_min_hac_g507"
              out ["consensus", "probs", "calls", "log", "gaps"]
      • Step 8: toolshed.g2.bx.psu.edu/repos/iuc/bandage/bandage_image/2022.09+galaxy4:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • ln -s '/tmp/tmpaqgrhj37/files/0/4/7/dataset_04725390-733b-4065-acd1-c1dfcf2ad2b6.dat' input.gfa &&  export QT_QPA_PLATFORM='offscreen' && Bandage image input.gfa 'out.jpg' --height '1000'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              fontsize None
              height "1000"
              lengths false
              names false
              nodewidth None
              output_format "jpg"
              width None
          • Job 2:

            • Job state is ok

            Command Line:

            • ln -s '/tmp/tmpaqgrhj37/files/b/1/8/dataset_b18f85a9-af23-4fb9-95a8-c65131137cc0.dat' input.gfa &&  export QT_QPA_PLATFORM='offscreen' && Bandage image input.gfa 'out.jpg' --height '1000'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              fontsize None
              height "1000"
              lengths false
              names false
              nodewidth None
              output_format "jpg"
              width None
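              Note: both jobs in this step follow the pattern visible in the command lines above: symlink the GFA dataset to input.gfa, set QT_QPA_PLATFORM=offscreen so Qt can render without a display, then call Bandage's image subcommand. A rough re-creation of the same invocation outside Galaxy, sketched in Python (file names are hypothetical placeholders):

                import os
                import subprocess

                gfa = "assembly_graph.gfa"                             # placeholder for the Galaxy GFA dataset
                env = dict(os.environ, QT_QPA_PLATFORM="offscreen")    # headless Qt rendering, as in the job command line

                subprocess.run(
                    ["Bandage", "image", gfa, "out.jpg", "--height", "1000"],
                    env=env,
                    check=True,
                )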
      • Step 9: toolshed.g2.bx.psu.edu/repos/iuc/compose_text_param/compose_text_param/0.1.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode10", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "_$1", "select_param_type": "text"}}]
              dbkey "?"
          • Job 2:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "nanopore_preprocessed_collection_of_all_samples_Spike3bBarcode12", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "_$1", "select_param_type": "text"}}]
              dbkey "?"
      • Step 10: toolshed.g2.bx.psu.edu/repos/devteam/fasta_to_tabular/fasta2tab/1.1.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/fasta_to_tabular/e7ed3c310b74/fasta_to_tabular/fasta_to_tabular.py' '/tmp/tmpaqgrhj37/files/d/3/d/dataset_d3dad808-fab5-4eff-ab12-f25ed11c2b19.dat' '/tmp/tmpaqgrhj37/job_working_directory/000/13/outputs/dataset_bbd0fa53-07fa-41ce-9ddf-442be2508a16.dat' 0 1

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              descr_columns "1"
              keep_first "0"
          • Job 2:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/fasta_to_tabular/e7ed3c310b74/fasta_to_tabular/fasta_to_tabular.py' '/tmp/tmpaqgrhj37/files/2/9/5/dataset_295f5a79-06b4-4f45-a3ef-790e1d53131a.dat' '/tmp/tmpaqgrhj37/job_working_directory/000/14/outputs/dataset_a6b8f7dd-d609-4e75-bef3-5b6eaa048fcd.dat' 0 1

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "9b9d22ac064b11f08a57002248903f92"
              chromInfo "/tmp/tmpaqgrhj37/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              descr_columns "1"
              keep_first "0"
    • Other invocation details
      • error_message

        • Failed to run workflow, at least one job is in [error] state.
      • history_id

        • ff7eda6804dad4a7
      • history_state

        • error
      • invocation_id

        • ff7eda6804dad4a7
      • invocation_state

        • scheduled
      • workflow_id

        • ff7eda6804dad4a7
