Fix concat_logfiles error
The workflow crashes while concatenating log files when run inside a Docker or Apptainer container, with the following error:
```
INFO [job concat_logfiles] /tmp/opbuchxo$ sh \
    concatenate.sh \
    /tmp/lrvs7szl/stg2560ce15-998e-44e0-9023-d593eb66cc21/L888536_SAP000_SB026_uv.MS_preprocess.log \
    /tmp/lrvs7szl/stg38a03685-96c7-47c2-a2c4-4a8a1b9a826a/L888536_SAP000_SB026_uv.MS_preprocess_err.log
concatenate.sh: 2: Syntax error: "(" unexpected
DEBUG Could not collect memory usage, job ended before monitoring began.
WARNING [job concat_logfiles] exited with status: 2
ERROR [job concat_logfiles] Job error:
("Error collecting output for parameter 'output': ../../usr/local/share/prep/steps/concatenate_files.cwl:23:7: Did not find output file with glob pattern: ['pipeline.log'].", {})
WARNING [job concat_logfiles] completed permanentFail
DEBUG [job concat_logfiles] outputs {}
ERROR [step concat_logfiles] Output is missing expected field file:///usr/local/share/prep/workflows/pipeline.cwl#concat_logfiles/output
DEBUG [step concat_logfiles] produced output {}
WARNING [step concat_logfiles] completed permanentFail
```
This is likely caused by using `sh` instead of `bash` to run the step. Hence, this MR reverts that change.
In a previous MR, it was suggested to use `sh` instead of `bash`. This seemed to work fine on DAS-6, where, unbeknownst to me, `sh` is symlinked to `bash`. It therefore caused no issues until the workflow ran in a container, where `/usr/bin/sh -> dash`.
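To illustrate the failure mode: `dash` implements only the POSIX shell, so bash-only constructs such as arrays abort with exactly the `Syntax error: "(" unexpected` seen in the log. The snippet below is a hypothetical stand-in (the real `concatenate.sh` is not shown here), demonstrating a bash-ism that `dash` rejects:

```shell
# Hypothetical demo script using a bash-only array; NOT the actual
# concatenate.sh from this repository.
cat > /tmp/concat_demo.sh <<'EOF'
logs=("$@")                      # bash array syntax; dash fails here with
cat "${logs[@]}" > pipeline.log  # 'Syntax error: "(" unexpected'
EOF

printf 'preprocess\n' > /tmp/a.log
printf 'preprocess_err\n' > /tmp/b.log

# Under bash this succeeds; under dash (e.g. /usr/bin/sh in Debian-based
# container images) the same script exits with status 2, as in the log above.
bash /tmp/concat_demo.sh /tmp/a.log /tmp/b.log
cat pipeline.log
```

Invoking the step with `bash` (or restricting the script to POSIX constructs) avoids the problem regardless of what `/usr/bin/sh` points to.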
Edited by Mick Veldhuis