[med-svn] [Git][med-team/toil][master] 2 commits: d/patches/soften-configargparser-deps: removed, no longer needed
Michael R. Crusoe (@crusoe)
gitlab at salsa.debian.org
Mon Mar 17 16:55:52 GMT 2025
Michael R. Crusoe pushed to branch master at Debian Med / toil
Commits:
a4bba5fe by Michael R. Crusoe at 2025-03-17T14:10:22+01:00
d/patches/soften-configargparser-deps: removed, no longer needed
- - - - -
0bc69b5f by Michael R. Crusoe at 2025-03-17T17:48:08+01:00
more test fixes
- - - - -
16 changed files:
- debian/changelog
- debian/patches/accept_debian_packaged_docker_version.patch
- debian/patches/allow_newer_requests
- debian/patches/atomic_copy_as_alternative.patch
- debian/patches/debianize_docs
- debian/patches/fix_tests
- debian/patches/intersphinx
- debian/patches/needs_aws-proxyfix
- debian/patches/no_galaxy_lib
- debian/patches/pyproject.toml
- debian/patches/series
- debian/patches/setting_version.patch
- − debian/patches/soften-configargparser-deps
- debian/patches/soften-cwltool-dep.patch
- debian/patches/soften-mesos-deps
- debian/patches/soften-psutil
Changes:
=====================================
debian/changelog
=====================================
@@ -27,6 +27,7 @@ toil (8.0.0-1) UNRELEASED; urgency=medium
* Switch autopkgtest to autopkgtest-pkg-pybuild
* d/TODO: removed several completed items.
* d/toil.lintian-overrides: removed, no longer applicable
+ * d/patches/soften-configargparser-deps: removed, no longer needed
[ Andreas Tille ]
* Fix Python3.12 string syntax
=====================================
debian/patches/accept_debian_packaged_docker_version.patch
=====================================
@@ -1,10 +1,18 @@
-Description: Do not force any specific docker version, just take the Debian packaged one
-Author: Andreas Tille <tille at debian.org>
-Last-Update: Sat, 24 Feb 2024 22:17:54 +0100
+From: Andreas Tille <tille at debian.org>
+Date: Sun, 25 Feb 2024 07:06:47 +0100
+Subject: Do not force any specific docker version,
+ just take the Debian packaged one
+
+Last-Update: 2025-02-13
Forwarded: not-needed
+---
+ requirements.txt | 2 +-
+ 1 file changed, 1 insertion(+), 1 deletion(-)
---- toil.orig/requirements.txt
-+++ toil/requirements.txt
+diff --git a/requirements.txt b/requirements.txt
+index af6cc67..8c7ac27 100644
+--- a/requirements.txt
++++ b/requirements.txt
@@ -1,6 +1,6 @@
dill>=0.3.2, <0.4
requests<=2.31.0
=====================================
debian/patches/allow_newer_requests
=====================================
@@ -1,9 +1,16 @@
-Author: Michael R. Crusoe <crusoe at debian.org>
-Description: remove max version pin for 'requests'
+From: "Michael R. Crusoe" <crusoe at debian.org>
+Date: Tue, 18 Jun 2024 18:23:13 +0200
+Subject: remove max version pin for 'requests'
+
Forwarded: not-needed
+---
+ requirements.txt | 2 +-
+ 1 file changed, 1 insertion(+), 1 deletion(-)
---- toil.orig/requirements.txt
-+++ toil/requirements.txt
+diff --git a/requirements.txt b/requirements.txt
+index 8c7ac27..9a22200 100644
+--- a/requirements.txt
++++ b/requirements.txt
@@ -1,5 +1,5 @@
dill>=0.3.2, <0.4
-requests<=2.31.0
=====================================
debian/patches/atomic_copy_as_alternative.patch
=====================================
@@ -1,11 +1,21 @@
-Description: os.link does not work with debci based setup and hence default to atomic copy
- when it does not work as intended
-Author: Nilesh Patra <nilesh at debian.org>
-Last-Update: 2022-03-10
+From: Nilesh Patra <nilesh at debian.org>
+Date: Thu, 10 Mar 2022 17:55:03 +0530
+Subject: os.link does not work with debci based setup and hence default to
+ atomic copy
+
+Last-Update: 2025-02-13
Forwarded: not-needed
---- toil.orig/src/toil/jobStores/fileJobStore.py
-+++ toil/src/toil/jobStores/fileJobStore.py
-@@ -608,13 +608,16 @@
+
+when it does not work as intended
+---
+ src/toil/jobStores/fileJobStore.py | 17 ++++++++++-------
+ 1 file changed, 10 insertions(+), 7 deletions(-)
+
+diff --git a/src/toil/jobStores/fileJobStore.py b/src/toil/jobStores/fileJobStore.py
+index fa6df1d..39dc2d4 100644
+--- a/src/toil/jobStores/fileJobStore.py
++++ b/src/toil/jobStores/fileJobStore.py
+@@ -608,13 +608,16 @@ class FileJobStore(AbstractJobStore):
except OSError as e:
# For the list of the possible errno codes, see: https://linux.die.net/man/2/link
if e.errno == errno.EEXIST:
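For reference, the fallback this patch relies on is the classic "hard link, else atomic copy" pattern. A minimal Python sketch with a hypothetical helper name, not the actual fileJobStore.py code:

    import errno
    import os
    import shutil
    import tempfile

    def link_or_atomic_copy(src: str, dst: str) -> None:
        """Hard-link src to dst; if linking fails, fall back to an atomic copy."""
        try:
            os.link(src, dst)
        except OSError as e:
            # For the possible errno codes, see: https://linux.die.net/man/2/link
            if e.errno == errno.EEXIST:
                return  # destination already exists, nothing to do
            # Copy into a temp file next to the destination, then rename into
            # place; os.replace() is atomic on POSIX filesystems.
            fd, tmp = tempfile.mkstemp(dir=os.path.dirname(os.path.abspath(dst)))
            os.close(fd)
            shutil.copyfile(src, tmp)
            os.replace(tmp, dst)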
=====================================
debian/patches/debianize_docs
=====================================
@@ -1,19 +1,19 @@
-Author: Michael R. Crusoe <crusoe at debian.org>
-Description: Update docs to reflect a local install
+From: "Michael R. Crusoe" <crusoe at debian.org>
+Date: Mon, 7 Jan 2019 02:43:15 -0800
+Subject: Update docs to reflect a local install
+
Forwarded: not-needed
---- toil.orig/docs/index.rst
-+++ toil/docs/index.rst
-@@ -29,7 +29,6 @@
- .. toctree::
- :caption: Getting Started
-
-- gettingStarted/install
- gettingStarted/quickStart
-
- .. toctree::
---- toil.orig/docs/gettingStarted/quickStart.rst
-+++ toil/docs/gettingStarted/quickStart.rst
-@@ -13,18 +13,6 @@
+Last-Update: 2024-01-23
+---
+ docs/gettingStarted/quickStart.rst | 85 +++++++++++++++-----------------------
+ docs/index.rst | 1 -
+ 2 files changed, 33 insertions(+), 53 deletions(-)
+
+diff --git a/docs/gettingStarted/quickStart.rst b/docs/gettingStarted/quickStart.rst
+index 7a4dcf2..fbaeb41 100644
+--- a/docs/gettingStarted/quickStart.rst
++++ b/docs/gettingStarted/quickStart.rst
+@@ -13,18 +13,6 @@ The `Common Workflow Language`_ (CWL) is an emerging standard for writing
workflows that are portable across multiple workflow engines and platforms.
Running CWL workflows using Toil is easy.
@@ -32,7 +32,7 @@ Forwarded: not-needed
#. Copy and paste the following code block into ``example.cwl``:
.. code-block:: yaml
-@@ -50,11 +38,11 @@
+@@ -50,11 +38,11 @@ Running CWL workflows using Toil is easy.
#. To run the workflow simply enter ::
@@ -46,7 +46,7 @@ Forwarded: not-needed
Hello world!
-@@ -84,13 +72,6 @@
+@@ -84,13 +72,6 @@ Running a basic WDL workflow
The `Workflow Description Language`_ (WDL) is another emerging language for writing workflows that are portable across multiple workflow engines and platforms.
Running WDL workflows using Toil is still in alpha, and currently experimental. Toil currently supports basic workflow syntax (see :ref:`wdl` for more details and examples). Here we go over running a basic WDL helloworld workflow.
@@ -60,7 +60,7 @@ Forwarded: not-needed
#. Copy and paste the following code block into ``wdl-helloworld.wdl``::
workflow write_simple_file {
-@@ -110,11 +91,11 @@
+@@ -110,11 +91,11 @@ Running WDL workflows using Toil is still in alpha, and currently experimental.
#. To run the workflow simply enter ::
@@ -74,7 +74,7 @@ Forwarded: not-needed
Hello world!
This will, like the CWL example above, use the ``single_machine`` batch system
-@@ -144,7 +125,7 @@
+@@ -144,7 +125,7 @@ An example Toil Python workflow can be run with just three steps:
3. Specify the name of the :ref:`job store <jobStoreOverview>` and run the workflow::
@@ -83,7 +83,7 @@ Forwarded: not-needed
For something beyond a "Hello, world!" example, refer to :ref:`runningDetail`.
-@@ -169,7 +150,7 @@
+@@ -169,7 +150,7 @@ Running the example
#. Run it with the default settings::
@@ -92,7 +92,7 @@ Forwarded: not-needed
The workflow created a file called ``sortedFile.txt`` in your current directory.
Have a look at it and notice that it contains a whole lot of sorted lines!
-@@ -186,7 +167,7 @@
+@@ -186,7 +167,7 @@ Running the example
3. Run with custom options::
@@ -101,7 +101,7 @@ Forwarded: not-needed
--numLines=5000 \
--lineLength=10 \
--overwriteOutput=True \
-@@ -305,7 +286,7 @@
+@@ -305,7 +286,7 @@ in addition to messages from the batch system and jobs. This can be configured
with the ``--logLevel`` flag. For example, to only log ``CRITICAL`` level
messages to the screen::
@@ -110,7 +110,7 @@ Forwarded: not-needed
--logLevel=critical \
--overwriteOutput=True
-@@ -331,7 +312,7 @@
+@@ -331,7 +312,7 @@ example (the first line of ``down()``):
When we run the pipeline, Toil will show a detailed failure log with a traceback::
@@ -119,7 +119,7 @@ Forwarded: not-needed
...
---TOIL WORKER OUTPUT LOG---
...
-@@ -353,13 +334,13 @@
+@@ -353,13 +334,13 @@ that a job store of the same name already exists. By default, in the event of a
failure, the job store is preserved so that the workflow can be restarted,
starting from the previously failed jobs. We can restart the pipeline by running ::
@@ -135,7 +135,7 @@ Forwarded: not-needed
--retryCount 2 \
--restart \
--overwriteOutput=True
-@@ -373,7 +354,7 @@
+@@ -373,7 +354,7 @@ line 30, or remove it, and then run
::
@@ -144,7 +144,7 @@ Forwarded: not-needed
--restart \
--overwriteOutput=True
-@@ -401,7 +382,7 @@
+@@ -401,7 +382,7 @@ Also! Remember to use the :ref:`destroyCluster` command when finished to destro
#. Launch a cluster in AWS using the :ref:`launchCluster` command::
@@ -153,7 +153,7 @@ Forwarded: not-needed
--clusterType kubernetes \
--keyPairName <AWS-key-pair-name> \
--leaderNodeType t2.medium \
-@@ -412,13 +393,13 @@
+@@ -412,13 +393,13 @@ Also! Remember to use the :ref:`destroyCluster` command when finished to destro
#. Copy ``helloWorld.py`` to the ``/tmp`` directory on the leader node using the :ref:`rsyncCluster` command::
@@ -169,7 +169,7 @@ Forwarded: not-needed
Note that this command will log you in as the ``root`` user.
-@@ -439,7 +420,7 @@
+@@ -439,7 +420,7 @@ Also! Remember to use the :ref:`destroyCluster` command when finished to destro
#. Use the :ref:`destroyCluster` command to destroy the cluster::
@@ -178,7 +178,7 @@ Forwarded: not-needed
Note that this command will destroy the cluster leader
node and any resources created to run the job, including the S3 bucket.
-@@ -457,7 +438,7 @@
+@@ -457,7 +438,7 @@ Also! Remember to use the :ref:`destroyCluster` command when finished to destro
#. First launch a node in AWS using the :ref:`launchCluster` command::
@@ -187,7 +187,7 @@ Forwarded: not-needed
--clusterType kubernetes \
--keyPairName <AWS-key-pair-name> \
--leaderNodeType t2.medium \
-@@ -467,12 +448,12 @@
+@@ -467,12 +448,12 @@ Also! Remember to use the :ref:`destroyCluster` command when finished to destro
#. Copy ``example.cwl`` and ``example-job.yaml`` from the :ref:`CWL example <cwlquickstart>` to the node using
the :ref:`rsyncCluster` command::
@@ -203,7 +203,7 @@ Forwarded: not-needed
#. Once on the leader node, command line tools such as ``kubectl`` will be available to you. It's also a good idea to
update and install the following::
-@@ -503,7 +484,7 @@
+@@ -503,7 +484,7 @@ Also! Remember to use the :ref:`destroyCluster` command when finished to destro
#. Finally, log out of the leader node and from your local computer, destroy the cluster::
@@ -212,7 +212,7 @@ Forwarded: not-needed
.. _awscactus:
-@@ -543,7 +524,7 @@
+@@ -543,7 +524,7 @@ Also! Remember to use the :ref:`destroyCluster` command when finished to destro
#. Launch a cluster using the :ref:`launchCluster` command::
@@ -221,7 +221,7 @@ Forwarded: not-needed
--provisioner <aws, gce> \
--keyPairName <key-pair-name> \
--leaderNodeType <type> \
-@@ -559,11 +540,11 @@
+@@ -559,11 +540,11 @@ Also! Remember to use the :ref:`destroyCluster` command when finished to destro
When using AWS, setting the environment variable eliminates having to specify the ``--zone`` option
for each command. This will be supported for GCE in the future. ::
@@ -235,7 +235,7 @@ Forwarded: not-needed
$ mkdir /root/cact_ex
$ exit
-@@ -572,18 +553,18 @@
+@@ -572,18 +553,18 @@ Also! Remember to use the :ref:`destroyCluster` command when finished to destro
`here <https://github.com/ComparativeGenomicsToolkit/cactus#seqfile-the-input-file>`__), organisms' genome sequence
files in FASTA format, and configuration files (e.g. blockTrim1.xml, if desired), up to the leader node::
@@ -263,3 +263,15 @@ Forwarded: not-needed
#. Set up the environment of the leader node to run Cactus::
+diff --git a/docs/index.rst b/docs/index.rst
+index 02e32f0..7c30b13 100644
+--- a/docs/index.rst
++++ b/docs/index.rst
+@@ -29,7 +29,6 @@ If using Toil for your research, please cite
+ .. toctree::
+ :caption: Getting Started
+
+- gettingStarted/install
+ gettingStarted/quickStart
+
+ .. toctree::
=====================================
debian/patches/fix_tests
=====================================
@@ -1,53 +1,61 @@
-From: Michael R. Crusoe <crusoe at debian.org>
+From: "Michael R. Crusoe" <crusoe at debian.org>
+Date: Mon, 7 Jan 2019 05:08:12 -0800
Subject: Make tests more independent
+
Forwarded: https://github.com/DataBiosphere/toil/pull/5211
---- toil.orig/src/toil/test/provisioners/clusterTest.py
-+++ toil/src/toil/test/provisioners/clusterTest.py
-@@ -39,7 +39,7 @@
- class AbstractClusterTest(ToilTest):
- def __init__(self, methodName: str) -> None:
- super().__init__(methodName=methodName)
-- self.keyName = os.getenv("TOIL_AWS_KEYNAME").strip() or "id_rsa"
-+ self.keyName = os.getenv("TOIL_AWS_KEYNAME", "id_rsa").strip()
- self.clusterName = f"aws-provisioner-test-{uuid4()}"
- self.leaderNodeType = "t2.medium"
- self.clusterType = "mesos"
---- toil.orig/src/toil/test/utils/toilKillTest.py
-+++ toil/src/toil/test/utils/toilKillTest.py
-@@ -23,7 +23,7 @@
- from toil.common import Toil
- from toil.jobStores.abstractJobStore import NoSuchFileException, NoSuchJobStoreException
- from toil.jobStores.utils import generate_locator
--from toil.test import ToilTest, needs_aws_s3, needs_cwl
-+from toil.test import ToilTest, get_data, needs_aws_s3, needs_cwl
-
- logger = logging.getLogger(__name__)
-
-@@ -40,8 +40,8 @@
-
- def setUp(self):
- """Shared test variables."""
-- self.cwl = os.path.abspath("src/toil/test/utils/ABCWorkflowDebug/sleep.cwl")
-- self.yaml = os.path.abspath("src/toil/test/utils/ABCWorkflowDebug/sleep.yaml")
-+ self.cwl = get_data("toil/test/utils/ABCWorkflowDebug/sleep.cwl")
-+ self.yaml = get_data("toil/test/utils/ABCWorkflowDebug/sleep.yaml")
-
- def tearDown(self):
- """Default tearDown for unittest."""
-@@ -90,8 +90,8 @@
- class ToilKillTestWithAWSJobStore(ToilKillTest):
- """A set of test cases for "toil kill" using the AWS job store."""
-
-- def __init__(self, *args, **kwargs):
-- super().__init__(*args, **kwargs)
-+ def setUp(self):
-+ super().setUp()
- self.job_store = generate_locator("aws", decoration="testkill")
-
-
---- toil.orig/src/toil/test/__init__.py
-+++ toil/src/toil/test/__init__.py
-@@ -30,6 +30,7 @@
+Last-Update: 2025-03-17
+---
+ MANIFEST.in | 9 ++
+ setup.py | 4 +-
+ src/toil/test/__init__.py | 19 ++++
+ src/toil/test/cwl/cwlTest.py | 113 ++++++++++-----------
+ src/toil/test/docs/scripts/tutorial_staging.py | 10 +-
+ .../test/provisioners/aws/awsProvisionerTest.py | 5 +-
+ src/toil/test/provisioners/clusterTest.py | 2 +-
+ src/toil/test/provisioners/gceProvisionerTest.py | 9 +-
+ src/toil/test/src/resourceTest.py | 2 -
+ .../test/utils/ABCWorkflowDebug/debugWorkflow.py | 10 +-
+ src/toil/test/utils/toilDebugTest.py | 14 ++-
+ src/toil/test/utils/toilKillTest.py | 10 +-
+ src/toil/test/utils/utilsTest.py | 11 +-
+ src/toil/test/wdl/wdltoil_test.py | 31 +++---
+ 14 files changed, 134 insertions(+), 115 deletions(-)
+
+diff --git a/MANIFEST.in b/MANIFEST.in
+index a4a46ee..35b3a05 100644
+--- a/MANIFEST.in
++++ b/MANIFEST.in
+@@ -1 +1,10 @@
+ include requirements*.txt
++include src/toil/server/api_spec/workflow_execution_service.swagger.yaml
++include src/toil/test/cwl/colon_test_output_job.yaml
++include src/toil/test/cwl/conditional_wf.yaml
++include src/toil/test/cwl/mock_mpi/fake_mpi.yml
++include src/toil/test/docs/scripts/*
++include src/toil/test/utils/ABCWorkflowDebug/sleep.yaml
++include src/toil/test/utils/ABCWorkflowDebug/*
++recursive-include src/toil/test/ *.cwl
++recursive-include src/toil/test/ *.txt
+diff --git a/setup.py b/setup.py
+index b5e0288..ad0f716 100755
+--- a/setup.py
++++ b/setup.py
+@@ -111,9 +111,7 @@ def run_setup():
+ extras_require=extras_require,
+ package_dir={"": "src"},
+ packages=find_packages(where="src"),
+- package_data={
+- "": ["*.yml", "*.yaml", "cloud-config", "*.cwl"],
+- },
++ include_package_data=True,
+ # Unfortunately, the names of the entry points are hard-coded elsewhere in the code base so
+ # you can't just change them here. Luckily, most of them are pretty unique strings, and thus
+ # easy to search for.
+diff --git a/src/toil/test/__init__.py b/src/toil/test/__init__.py
+index 1ebaf74..c9198a2 100644
+--- a/src/toil/test/__init__.py
++++ b/src/toil/test/__init__.py
+@@ -30,6 +30,7 @@ from abc import ABCMeta, abstractmethod
from collections.abc import Generator
from contextlib import contextmanager
from inspect import getsource
@@ -55,7 +63,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
from shutil import which
from tempfile import mkstemp
from textwrap import dedent
-@@ -38,6 +39,8 @@
+@@ -38,6 +39,8 @@ from unittest.util import strclass
from urllib.error import HTTPError, URLError
from urllib.request import urlopen
@@ -64,7 +72,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
from toil import ApplianceImageNotFound, applianceSelf, toilPackageDirPath
from toil.lib.accelerators import (
have_working_nvidia_docker_runtime,
-@@ -52,6 +55,22 @@
+@@ -52,6 +55,22 @@ from toil.version import distVersion
logger = logging.getLogger(__name__)
@@ -87,9 +95,11 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
class ToilTest(unittest.TestCase):
"""
A common base class for Toil tests.
---- toil.orig/src/toil/test/cwl/cwlTest.py
-+++ toil/src/toil/test/cwl/cwlTest.py
-@@ -51,6 +51,7 @@
+diff --git a/src/toil/test/cwl/cwlTest.py b/src/toil/test/cwl/cwlTest.py
+index 4098976..822cfb4 100644
+--- a/src/toil/test/cwl/cwlTest.py
++++ b/src/toil/test/cwl/cwlTest.py
+@@ -51,6 +51,7 @@ from toil.fileStores.abstractFileStore import AbstractFileStore
from toil.lib.threading import cpu_count
from toil.test import (
ToilTest,
@@ -97,7 +107,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
needs_aws_s3,
needs_cwl,
needs_docker,
-@@ -237,7 +238,6 @@
+@@ -237,7 +238,6 @@ class CWLWorkflowTest(ToilTest):
"""Runs anew before each test to create farm fresh temp dirs."""
self.outDir = f"/tmp/toil-cwl-test-{str(uuid.uuid4())}"
os.makedirs(self.outDir)
@@ -105,7 +115,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
self.jobStoreDir = f"./jobstore-{str(uuid.uuid4())}"
def tearDown(self) -> None:
-@@ -254,7 +254,7 @@
+@@ -254,7 +254,7 @@ class CWLWorkflowTest(ToilTest):
"""
from toil.cwl import cwltoil
@@ -114,7 +124,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
args = [cwlfile, "--message", "str", "--sleep", "2"]
st = StringIO()
# If the workflow runs, it must have had options
-@@ -278,8 +278,8 @@
+@@ -278,8 +278,8 @@ class CWLWorkflowTest(ToilTest):
main_args.extend(["--logDebug", "--outdir", self.outDir])
main_args.extend(
[
@@ -125,7 +135,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
]
)
cwltoil.main(main_args, stdout=st)
-@@ -312,8 +312,8 @@
+@@ -312,8 +312,8 @@ class CWLWorkflowTest(ToilTest):
"--debugWorker",
"--outdir",
self.outDir,
@@ -136,7 +146,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
],
stdout=st,
)
-@@ -325,46 +325,46 @@
+@@ -325,46 +325,46 @@ class CWLWorkflowTest(ToilTest):
def revsort(self, cwl_filename: str, tester_fn: TesterFuncType) -> None:
tester_fn(
@@ -195,7 +205,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
input_location,
self._expected_download_output(self.outDir),
)
-@@ -379,11 +379,11 @@
+@@ -379,11 +379,11 @@ class CWLWorkflowTest(ToilTest):
"--enable-dev",
"--enable-ext",
"--mpi-config-file",
@@ -210,7 +220,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
cwltoil.main(main_args, stdout=stdout)
os.environ["PATH"] = path
out = json.loads(stdout.getvalue())
-@@ -401,8 +401,8 @@
+@@ -401,8 +401,8 @@ class CWLWorkflowTest(ToilTest):
main_args = [
"--outdir",
self.outDir,
@@ -221,7 +231,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
]
cwltoil.main(main_args, stdout=stdout)
out = json.loads(stdout.getvalue())
-@@ -434,8 +434,8 @@
+@@ -434,8 +434,8 @@ class CWLWorkflowTest(ToilTest):
def test_run_colon_output(self) -> None:
self._tester(
@@ -232,7 +242,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
self._expected_colon_output(self.outDir),
out_name="result",
)
-@@ -464,8 +464,8 @@
+@@ -464,8 +464,8 @@ class CWLWorkflowTest(ToilTest):
# We need to output to the current directory to make sure that
# works.
self._tester(
@@ -243,7 +253,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
self._expected_glob_dir_output(os.getcwd()),
main_args=["--bypass-file-store"],
output_here=True,
-@@ -480,8 +480,8 @@
+@@ -480,8 +480,8 @@ class CWLWorkflowTest(ToilTest):
def test_required_input_condition_protection(self) -> None:
# This doesn't run containerized
self._tester(
@@ -254,7 +264,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
{},
)
-@@ -506,7 +506,7 @@
+@@ -506,7 +506,7 @@ class CWLWorkflowTest(ToilTest):
"--slurmDefaultAllMem=True",
"--outdir",
self.outDir,
@@ -263,7 +273,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
]
try:
log.debug("Start test workflow")
-@@ -600,8 +600,8 @@
+@@ -600,8 +600,8 @@ class CWLWorkflowTest(ToilTest):
@unittest.skip("Fails too often due to remote service")
def test_bioconda(self) -> None:
self._tester(
@@ -274,7 +284,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
self._expected_seqtk_output(self.outDir),
main_args=["--beta-conda-dependencies"],
out_name="output1",
-@@ -610,8 +610,8 @@
+@@ -610,8 +610,8 @@ class CWLWorkflowTest(ToilTest):
@needs_docker
def test_default_args(self) -> None:
self._tester(
@@ -285,7 +295,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
self._expected_seqtk_output(self.outDir),
main_args=[
"--default-container",
-@@ -625,8 +625,8 @@
+@@ -625,8 +625,8 @@ class CWLWorkflowTest(ToilTest):
@unittest.skip("Fails too often due to remote service")
def test_biocontainers(self) -> None:
self._tester(
@@ -296,7 +306,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
self._expected_seqtk_output(self.outDir),
main_args=["--beta-use-biocontainers"],
out_name="output1",
-@@ -637,8 +637,8 @@
+@@ -637,8 +637,8 @@ class CWLWorkflowTest(ToilTest):
@needs_local_cuda
def test_cuda(self) -> None:
self._tester(
@@ -307,7 +317,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
{},
out_name="result",
)
-@@ -709,8 +709,8 @@
+@@ -709,8 +709,8 @@ class CWLWorkflowTest(ToilTest):
Test that a file with 'streamable'=True is a named pipe.
This is a CWL1.2 feature.
"""
@@ -318,7 +328,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
out_name = "output"
jobstore = f"--jobStore=aws:us-west-1:toil-stream-{uuid.uuid4()}"
from toil.cwl import cwltoil
-@@ -721,8 +721,8 @@
+@@ -721,8 +721,8 @@ class CWLWorkflowTest(ToilTest):
"--outdir",
self.outDir,
jobstore,
@@ -329,7 +339,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
]
if extra_args:
args = extra_args + args
-@@ -747,8 +747,8 @@
+@@ -747,8 +747,8 @@ class CWLWorkflowTest(ToilTest):
"""
Tests that the http://arvados.org/cwl#UsePreemptible extension is supported.
"""
@@ -340,7 +350,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
out_name = "output"
from toil.cwl import cwltoil
-@@ -756,8 +756,8 @@
+@@ -756,8 +756,8 @@ class CWLWorkflowTest(ToilTest):
args = [
"--outdir",
self.outDir,
@@ -351,7 +361,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
]
cwltoil.main(args, stdout=st)
out = json.loads(st.getvalue())
-@@ -771,16 +771,16 @@
+@@ -771,16 +771,16 @@ class CWLWorkflowTest(ToilTest):
"""
Tests that the http://arvados.org/cwl#UsePreemptible extension is validated.
"""
@@ -372,7 +382,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
]
try:
cwltoil.main(args, stdout=st)
-@@ -938,8 +938,7 @@
+@@ -938,8 +938,7 @@ class CWLv10Test(ToilTest):
"""Runs anew before each test to create farm fresh temp dirs."""
self.outDir = f"/tmp/toil-cwl-test-{str(uuid.uuid4())}"
os.makedirs(self.outDir)
@@ -382,7 +392,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
self.workDir = os.path.join(self.cwlSpec, "v1.0")
# The latest cwl git commit hash from https://github.com/common-workflow-language/common-workflow-language.
# Update it to get the latest tests.
-@@ -1072,15 +1071,13 @@
+@@ -1072,15 +1071,13 @@ class CWLv11Test(ToilTest):
Run the CWL 1.1 conformance tests in various environments.
"""
@@ -399,7 +409,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
cls.test_yaml = os.path.join(cls.cwlSpec, "conformance_tests.yaml")
# TODO: Use a commit zip in case someone decides to rewrite master's history?
url = "https://github.com/common-workflow-language/cwl-v1.1.git"
-@@ -1152,7 +1149,7 @@
+@@ -1152,7 +1149,7 @@ class CWLv12Test(ToilTest):
def setUpClass(cls) -> None:
"""Runs anew before each test."""
cls.rootDir = cls._projectRootPath()
@@ -408,7 +418,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
cls.test_yaml = os.path.join(cls.cwlSpec, "conformance_tests.yaml")
# TODO: Use a commit zip in case someone decides to rewrite master's history?
url = "https://github.com/common-workflow-language/cwl-v1.2.git"
-@@ -1770,8 +1767,8 @@
+@@ -1770,8 +1767,8 @@ def test_download_structure(tmp_path: Path) -> None:
@pytest.mark.timeout(300)
def test_import_on_workers() -> None:
args = [
@@ -419,9 +429,52 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
"--runImportsOnWorkers",
"--importWorkersDisk=10MiB",
"--realTimeLogging=True",
---- toil.orig/src/toil/test/provisioners/aws/awsProvisionerTest.py
-+++ toil/src/toil/test/provisioners/aws/awsProvisionerTest.py
-@@ -28,6 +28,7 @@
+diff --git a/src/toil/test/docs/scripts/tutorial_staging.py b/src/toil/test/docs/scripts/tutorial_staging.py
+index 17c2782..31a84d3 100644
+--- a/src/toil/test/docs/scripts/tutorial_staging.py
++++ b/src/toil/test/docs/scripts/tutorial_staging.py
+@@ -3,7 +3,7 @@ import os
+ from toil.common import Toil
+ from toil.job import Job
+ from toil.lib.io import mkdtemp
+-
++from toil.test import get_data
+
+ class HelloWorld(Job):
+ def __init__(self, id):
+@@ -22,6 +22,7 @@ class HelloWorld(Job):
+
+ if __name__ == "__main__":
+ jobstore: str = mkdtemp("tutorial_staging")
++ tmp: str = mkdtemp("tutorial_staging_tmp")
+ os.rmdir(jobstore)
+ options = Job.Runner.getDefaultOptions(jobstore)
+ options.logLevel = "INFO"
+@@ -29,11 +30,8 @@ if __name__ == "__main__":
+
+ with Toil(options) as toil:
+ if not toil.options.restart:
+- ioFileDirectory = os.path.join(
+- os.path.dirname(os.path.abspath(__file__)), "stagingExampleFiles"
+- )
+ inputFileID = toil.importFile(
+- "file://" + os.path.abspath(os.path.join(ioFileDirectory, "in.txt"))
++ "file://" + get_data("toil/test/docs/scripts/stagingExampleFiles/in.txt")
+ )
+ outputFileID = toil.start(HelloWorld(inputFileID))
+ else:
+@@ -41,5 +39,5 @@ if __name__ == "__main__":
+
+ toil.exportFile(
+ outputFileID,
+- "file://" + os.path.abspath(os.path.join(ioFileDirectory, "out.txt")),
++ "file://" + get_data("toil/test/docs/scripts/stagingExampleFiles/out.txt"),
+ )
+diff --git a/src/toil/test/provisioners/aws/awsProvisionerTest.py b/src/toil/test/provisioners/aws/awsProvisionerTest.py
+index cae25e4..a505da4 100644
+--- a/src/toil/test/provisioners/aws/awsProvisionerTest.py
++++ b/src/toil/test/provisioners/aws/awsProvisionerTest.py
+@@ -28,6 +28,7 @@ from toil.provisioners import cluster_factory
from toil.provisioners.aws.awsProvisioner import AWSProvisioner
from toil.test import (
ToilTest,
@@ -429,7 +482,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
integrative,
needs_aws_ec2,
needs_fetchable_appliance,
-@@ -286,7 +287,7 @@
+@@ -286,7 +287,7 @@ class AWSAutoscaleTest(AbstractAWSAutoscaleTest):
# Fixme: making this file larger causes the test to hang
f.write("01234567890123456789012345678901")
self.rsyncUtil(
@@ -438,7 +491,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
":" + self.script(),
)
self.rsyncUtil(fileToSort, ":" + self.data("sortFile"))
-@@ -502,7 +503,7 @@
+@@ -502,7 +503,7 @@ class AWSAutoscaleTestMultipleNodeTypes(AbstractAWSAutoscaleTest):
with open(sseKeyFile, "w") as f:
f.write("01234567890123456789012345678901")
self.rsyncUtil(
@@ -447,9 +500,24 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
":" + self.script(),
)
self.rsyncUtil(sseKeyFile, ":" + self.data("keyFile"))
---- toil.orig/src/toil/test/provisioners/gceProvisionerTest.py
-+++ toil/src/toil/test/provisioners/gceProvisionerTest.py
-@@ -21,6 +21,7 @@
+diff --git a/src/toil/test/provisioners/clusterTest.py b/src/toil/test/provisioners/clusterTest.py
+index d642831..273c27a 100644
+--- a/src/toil/test/provisioners/clusterTest.py
++++ b/src/toil/test/provisioners/clusterTest.py
+@@ -39,7 +39,7 @@ log = logging.getLogger(__name__)
+ class AbstractClusterTest(ToilTest):
+ def __init__(self, methodName: str) -> None:
+ super().__init__(methodName=methodName)
+- self.keyName = os.getenv("TOIL_AWS_KEYNAME").strip() or "id_rsa"
++ self.keyName = os.getenv("TOIL_AWS_KEYNAME", "id_rsa").strip()
+ self.clusterName = f"aws-provisioner-test-{uuid4()}"
+ self.leaderNodeType = "t2.medium"
+ self.clusterType = "mesos"
+diff --git a/src/toil/test/provisioners/gceProvisionerTest.py b/src/toil/test/provisioners/gceProvisionerTest.py
+index a2852e0..56ad79a 100644
+--- a/src/toil/test/provisioners/gceProvisionerTest.py
++++ b/src/toil/test/provisioners/gceProvisionerTest.py
+@@ -21,6 +21,7 @@ import pytest
from toil.test import (
ToilTest,
@@ -457,7 +525,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
integrative,
needs_fetchable_appliance,
needs_google_project,
-@@ -215,7 +216,7 @@
+@@ -215,7 +216,7 @@ class GCEAutoscaleTest(AbstractGCEAutoscaleTest):
# Fixme: making this file larger causes the test to hang
f.write("01234567890123456789012345678901")
self.rsyncUtil(
@@ -466,7 +534,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
":/home/sort.py",
)
self.rsyncUtil(fileToSort, ":/home/sortFile")
-@@ -325,7 +326,7 @@
+@@ -325,7 +326,7 @@ class GCEAutoscaleTestMultipleNodeTypes(AbstractGCEAutoscaleTest):
with open(sseKeyFile, "w") as f:
f.write("01234567890123456789012345678901")
self.rsyncUtil(
@@ -475,7 +543,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
":/home/sort.py",
)
self.rsyncUtil(sseKeyFile, ":/home/keyFile")
-@@ -377,9 +378,7 @@
+@@ -377,9 +378,7 @@ class GCERestartTest(AbstractGCEAutoscaleTest):
def _getScript(self):
self.rsyncUtil(
@@ -486,9 +554,24 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
":" + self.scriptName,
)
---- toil.orig/src/toil/test/utils/ABCWorkflowDebug/debugWorkflow.py
-+++ toil/src/toil/test/utils/ABCWorkflowDebug/debugWorkflow.py
-@@ -6,6 +6,7 @@
+diff --git a/src/toil/test/src/resourceTest.py b/src/toil/test/src/resourceTest.py
+index d2cc555..a7ab8ca 100644
+--- a/src/toil/test/src/resourceTest.py
++++ b/src/toil/test/src/resourceTest.py
+@@ -150,8 +150,6 @@ class ResourceTest(ToilTest):
+ # Assert basic attributes and properties
+ self.assertEqual(module.belongsToToil, shouldBelongToToil)
+ self.assertEqual(module.name, module_name)
+- if shouldBelongToToil:
+- self.assertTrue(module.dirPath.endswith("/src"))
+
+ # Before the module is saved as a resource, localize() and globalize() are identity
+ # methods. This should log.warnings.
+diff --git a/src/toil/test/utils/ABCWorkflowDebug/debugWorkflow.py b/src/toil/test/utils/ABCWorkflowDebug/debugWorkflow.py
+index e32f59a..9d35008 100644
+--- a/src/toil/test/utils/ABCWorkflowDebug/debugWorkflow.py
++++ b/src/toil/test/utils/ABCWorkflowDebug/debugWorkflow.py
+@@ -6,6 +6,7 @@ import sys
from toil.common import Toil
from toil.job import Job
from toil.lib.io import mkdtemp
@@ -496,7 +579,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
from toil.version import python
logger = logging.getLogger(__name__)
-@@ -157,6 +158,7 @@
+@@ -157,6 +158,7 @@ def broken_job(job, num):
if __name__ == "__main__":
jobStorePath = sys.argv[1] if len(sys.argv) > 1 else mkdtemp("debugWorkflow")
@@ -504,7 +587,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
options = Job.Runner.getDefaultOptions(jobStorePath)
options.clean = "never"
options.stats = True
-@@ -164,20 +166,18 @@
+@@ -164,20 +166,18 @@ if __name__ == "__main__":
with Toil(options) as toil:
B_file0 = toil.importFile(
@@ -528,9 +611,11 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
job0 = Job.wrapJobFn(initialize_jobs)
job1 = Job.wrapJobFn(writeA, file_maker)
---- toil.orig/src/toil/test/utils/toilDebugTest.py
-+++ toil/src/toil/test/utils/toilDebugTest.py
-@@ -17,7 +17,7 @@
+diff --git a/src/toil/test/utils/toilDebugTest.py b/src/toil/test/utils/toilDebugTest.py
+index 010cff6..aa8d773 100644
+--- a/src/toil/test/utils/toilDebugTest.py
++++ b/src/toil/test/utils/toilDebugTest.py
+@@ -17,7 +17,7 @@ import subprocess
import tempfile
from toil.lib.resources import glob
@@ -539,7 +624,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
from toil.version import python
logger = logging.getLogger(__name__)
-@@ -28,7 +28,7 @@
+@@ -28,7 +28,7 @@ def workflow_debug_jobstore() -> str:
subprocess.check_call(
[
python,
@@ -548,7 +633,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
job_store_path,
]
)
-@@ -49,7 +49,7 @@
+@@ -49,7 +49,7 @@ def testJobStoreContents():
subprocess.check_call(
[
python,
@@ -557,7 +642,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
workflow_debug_jobstore(),
"--logDebug",
"--listFilesInJobStore=True",
-@@ -86,7 +86,7 @@
+@@ -86,7 +86,7 @@ def fetchFiles(symLink: bool, jobStoreDir: str, outputDir: str):
contents = ["A.txt", "B.txt", "C.txt", "ABC.txt", "mkFile.py"]
cmd = [
python,
@@ -566,7 +651,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
jobStoreDir,
"--fetch",
"*A.txt",
-@@ -137,7 +137,7 @@
+@@ -137,7 +137,7 @@ class DebugJobTest(ToilTest):
subprocess.check_call(
[
python,
@@ -575,7 +660,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
"--retryCount=0",
"--logCritical",
"--disableProgress",
-@@ -172,9 +172,7 @@
+@@ -172,9 +172,7 @@ class DebugJobTest(ToilTest):
wf_result = subprocess.run(
[
"toil-wdl-runner",
@@ -586,9 +671,46 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
"--retryCount=0",
"--logDebug",
"--disableProgress",
---- toil.orig/src/toil/test/utils/utilsTest.py
-+++ toil/src/toil/test/utils/utilsTest.py
-@@ -33,6 +33,7 @@
+diff --git a/src/toil/test/utils/toilKillTest.py b/src/toil/test/utils/toilKillTest.py
+index 1f8ef4b..a0037e2 100644
+--- a/src/toil/test/utils/toilKillTest.py
++++ b/src/toil/test/utils/toilKillTest.py
+@@ -23,7 +23,7 @@ import unittest
+ from toil.common import Toil
+ from toil.jobStores.abstractJobStore import NoSuchFileException, NoSuchJobStoreException
+ from toil.jobStores.utils import generate_locator
+-from toil.test import ToilTest, needs_aws_s3, needs_cwl
++from toil.test import ToilTest, get_data, needs_aws_s3, needs_cwl
+
+ logger = logging.getLogger(__name__)
+
+@@ -40,8 +40,8 @@ class ToilKillTest(ToilTest):
+
+ def setUp(self):
+ """Shared test variables."""
+- self.cwl = os.path.abspath("src/toil/test/utils/ABCWorkflowDebug/sleep.cwl")
+- self.yaml = os.path.abspath("src/toil/test/utils/ABCWorkflowDebug/sleep.yaml")
++ self.cwl = get_data("toil/test/utils/ABCWorkflowDebug/sleep.cwl")
++ self.yaml = get_data("toil/test/utils/ABCWorkflowDebug/sleep.yaml")
+
+ def tearDown(self):
+ """Default tearDown for unittest."""
+@@ -90,8 +90,8 @@ class ToilKillTest(ToilTest):
+ class ToilKillTestWithAWSJobStore(ToilKillTest):
+ """A set of test cases for "toil kill" using the AWS job store."""
+
+- def __init__(self, *args, **kwargs):
+- super().__init__(*args, **kwargs)
++ def setUp(self):
++ super().setUp()
+ self.job_store = generate_locator("aws", decoration="testkill")
+
+
+diff --git a/src/toil/test/utils/utilsTest.py b/src/toil/test/utils/utilsTest.py
+index 1525f44..92d3e85 100644
+--- a/src/toil/test/utils/utilsTest.py
++++ b/src/toil/test/utils/utilsTest.py
+@@ -33,6 +33,7 @@ from toil.job import Job
from toil.lib.bioio import system
from toil.test import (
ToilTest,
@@ -596,7 +718,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
get_temp_file,
integrative,
needs_aws_ec2,
-@@ -444,10 +445,10 @@
+@@ -444,10 +445,10 @@ class UtilsTest(ToilTest):
self.toilDir,
"--clean=never",
"--badWorker=1",
@@ -609,7 +731,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
f"--outdir={self.tempDir}",
]
logger.info("Run command: %s", " ".join(cmd))
-@@ -465,10 +466,10 @@
+@@ -465,10 +466,10 @@ class UtilsTest(ToilTest):
"--jobStore",
self.toilDir,
"--clean=never",
@@ -622,7 +744,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
f"--outdir={self.tempDir}",
]
wf = subprocess.Popen(cmd)
-@@ -487,7 +488,7 @@
+@@ -487,7 +488,7 @@ class UtilsTest(ToilTest):
"--jobStore",
self.toilDir,
"--clean=never",
@@ -631,8 +753,10 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
"--message",
"Testing",
]
---- toil.orig/src/toil/test/wdl/wdltoil_test.py
-+++ toil/src/toil/test/wdl/wdltoil_test.py
+diff --git a/src/toil/test/wdl/wdltoil_test.py b/src/toil/test/wdl/wdltoil_test.py
+index 68beb2b..8b34061 100644
+--- a/src/toil/test/wdl/wdltoil_test.py
++++ b/src/toil/test/wdl/wdltoil_test.py
@@ -1,7 +1,6 @@
import json
import logging
@@ -641,7 +765,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
import re
import shutil
import string
-@@ -11,12 +10,14 @@
+@@ -11,12 +10,14 @@ from typing import Any, Optional, Union
from unittest.mock import patch
from uuid import uuid4
@@ -656,7 +780,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
needs_docker,
needs_docker_cuda,
needs_google_storage,
-@@ -203,8 +204,8 @@
+@@ -203,8 +204,8 @@ class WDLTests(BaseWDLTest):
def test_MD5sum(self):
"""Test if Toil produces the same outputs as known good outputs for WDL's
GATK tutorial #1."""
@@ -667,7 +791,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
result_json = subprocess.check_output(
self.base_command
-@@ -221,7 +222,7 @@
+@@ -221,7 +222,7 @@ class WDLTests(BaseWDLTest):
"""
Test if web URL strings can be coerced to usable Files.
"""
@@ -676,7 +800,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
result_json = subprocess.check_output(
self.base_command
-@@ -238,7 +239,7 @@
+@@ -238,7 +239,7 @@ class WDLTests(BaseWDLTest):
"""
Test if Bash "wait" works in WDL scripts.
"""
@@ -685,7 +809,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
result_json = subprocess.check_output(
self.base_command
-@@ -262,7 +263,7 @@
+@@ -262,7 +263,7 @@ class WDLTests(BaseWDLTest):
"""
Test if Toil can collect all call outputs from a workflow that doesn't expose them.
"""
@@ -694,7 +818,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
# With no flag we don't include the call outputs
result_json = subprocess.check_output(
-@@ -319,7 +320,7 @@
+@@ -319,7 +320,7 @@ class WDLTests(BaseWDLTest):
"""
Test if Toil can detect and do something sensible with Cromwell Output Organizer workflows.
"""
@@ -703,7 +827,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
# With no flag we should include all task outputs
result_json = subprocess.check_output(
-@@ -357,7 +358,7 @@
+@@ -357,7 +358,7 @@ class WDLTests(BaseWDLTest):
"""
Test if Toil can cache task runs.
"""
@@ -712,7 +836,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
caching_env = dict(os.environ)
caching_env["MINIWDL__CALL_CACHE__GET"] = "true"
-@@ -412,7 +413,7 @@
+@@ -412,7 +413,7 @@ class WDLTests(BaseWDLTest):
"""
Test if missing and error-producing URLs are handled correctly for optional File? values.
"""
@@ -721,7 +845,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
def run_for_code(code: int) -> dict:
"""
-@@ -457,8 +458,8 @@
+@@ -457,8 +458,8 @@ class WDLTests(BaseWDLTest):
"""
Test if Toil can run a WDL workflow into a new directory.
"""
@@ -732,7 +856,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
subprocess.check_call(
self.base_command
+ [
-@@ -474,8 +475,8 @@
+@@ -474,8 +475,8 @@ class WDLTests(BaseWDLTest):
@needs_singularity_or_docker
def test_miniwdl_self_test(self, extra_args: Optional[list[str]] = None) -> None:
"""Test if the MiniWDL self test runs and produces the expected output."""
@@ -743,7 +867,7 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
result_json = subprocess.check_output(
self.base_command
-@@ -656,8 +657,8 @@
+@@ -656,8 +657,8 @@ class WDLTests(BaseWDLTest):
@needs_google_storage
def test_gs_uri(self):
"""Test if Toil can access Google Storage URIs."""
@@ -754,61 +878,3 @@ Forwarded: https://github.com/DataBiosphere/toil/pull/5211
result_json = subprocess.check_output(
self.base_command + [wdl, json_file, "-o", self.output_dir, "--logDebug"]
---- toil.orig/src/toil/test/src/resourceTest.py
-+++ toil/src/toil/test/src/resourceTest.py
-@@ -150,8 +150,6 @@
- # Assert basic attributes and properties
- self.assertEqual(module.belongsToToil, shouldBelongToToil)
- self.assertEqual(module.name, module_name)
-- if shouldBelongToToil:
-- self.assertTrue(module.dirPath.endswith("/src"))
-
- # Before the module is saved as a resource, localize() and globalize() are identity
- # methods. This should log.warnings.
---- toil.orig/MANIFEST.in
-+++ toil/MANIFEST.in
-@@ -1 +1,10 @@
- include requirements*.txt
-+include src/toil/server/api_spec/workflow_execution_service.swagger.yaml
-+include src/toil/test/cwl/colon_test_output_job.yaml
-+include src/toil/test/cwl/conditional_wf.yaml
-+include src/toil/test/cwl/mock_mpi/fake_mpi.yml
-+include src/toil/test/docs/scripts/*
-+include src/toil/test/utils/ABCWorkflowDebug/sleep.yaml
-+include src/toil/test/utils/ABCWorkflowDebug/*
-+recursive-include src/toil/test/ *.cwl
-+recursive-include src/toil/test/ *.txt
---- toil.orig/setup.py
-+++ toil/setup.py
-@@ -111,9 +111,7 @@
- extras_require=extras_require,
- package_dir={"": "src"},
- packages=find_packages(where="src"),
-- package_data={
-- "": ["*.yml", "*.yaml", "cloud-config", "*.cwl"],
-- },
-+ include_package_data=True,
- # Unfortunately, the names of the entry points are hard-coded elsewhere in the code base so
- # you can't just change them here. Luckily, most of them are pretty unique strings, and thus
- # easy to search for.
---- toil.orig/src/toil/test/docs/scripts/tutorial_staging.py
-+++ toil/src/toil/test/docs/scripts/tutorial_staging.py
-@@ -22,6 +22,7 @@
-
- if __name__ == "__main__":
- jobstore: str = mkdtemp("tutorial_staging")
-+ tmp: str = mkdtemp("tutorial_staging_tmp")
- os.rmdir(jobstore)
- options = Job.Runner.getDefaultOptions(jobstore)
- options.logLevel = "INFO"
-@@ -29,9 +30,7 @@
-
- with Toil(options) as toil:
- if not toil.options.restart:
-- ioFileDirectory = os.path.join(
-- os.path.dirname(os.path.abspath(__file__)), "stagingExampleFiles"
-- )
-+ ioFileDirectory = os.path.join(tmp, "stagingExampleFiles")
- inputFileID = toil.importFile(
- "file://" + os.path.abspath(os.path.join(ioFileDirectory, "in.txt"))
- )
=====================================
debian/patches/intersphinx
=====================================
@@ -1,9 +1,18 @@
-Author: Michael R. Crusoe <crusoe at debian.org>
-Description: Link to the offline Python docs.
+From: "Michael R. Crusoe" <crusoe at debian.org>
+Date: Tue, 23 Jan 2024 12:48:46 +0100
+Subject: Link to the offline Python docs.
+
Forwarded: not-needed
---- toil.orig/docs/conf.py
-+++ toil/docs/conf.py
-@@ -71,8 +71,8 @@
+Last-Update: 2024-06-18
+---
+ docs/conf.py | 4 ++--
+ 1 file changed, 2 insertions(+), 2 deletions(-)
+
+diff --git a/docs/conf.py b/docs/conf.py
+index a26d25a..e1daea4 100644
+--- a/docs/conf.py
++++ b/docs/conf.py
+@@ -71,8 +71,8 @@ extensions = [
]
intersphinx_mapping = {
=====================================
debian/patches/needs_aws-proxyfix
=====================================
@@ -1,12 +1,20 @@
-From: Michael R. Crusoe <crusoe at debian.org>
+From: "Michael R. Crusoe" <crusoe at debian.org>
+Date: Tue, 18 Jun 2024 19:36:42 +0200
Subject: Skip AWS requiring tests when a broken proxy is set.
When building Debian packages we purposely set HTTP{,S}_PROXY to http://127.0.0.1:9/
to quickly avoid internet access.
---- toil.orig/src/toil/test/__init__.py
-+++ toil/src/toil/test/__init__.py
-@@ -52,6 +52,8 @@
+Last-Update: 2025-02-13
+---
+ src/toil/test/__init__.py | 7 +++++++
+ 1 file changed, 7 insertions(+)
+
+diff --git a/src/toil/test/__init__.py b/src/toil/test/__init__.py
+index c9198a2..3d33f70 100644
+--- a/src/toil/test/__init__.py
++++ b/src/toil/test/__init__.py
+@@ -52,6 +52,8 @@ from toil.lib.memoize import memoize
from toil.lib.threading import ExceptionalThread, cpu_count
from toil.version import distVersion
@@ -15,7 +23,7 @@ to quickly avoid internet access.
logger = logging.getLogger(__name__)
-@@ -402,6 +404,11 @@
+@@ -402,6 +404,11 @@ def needs_aws_s3(test_item: MT) -> MT:
return unittest.skip("Install Toil with the 'aws' extra to include this test.")(
test_item
)
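The skip added by this patch (its body is truncated in this mail) amounts to detecting that intentionally broken proxy before attempting AWS access. A rough sketch of the idea, with a hypothetical decorator name and not the patch's exact code:

    import os
    import unittest

    BROKEN_PROXY = "http://127.0.0.1:9/"  # set on purpose during Debian package builds

    def skip_if_proxy_is_broken(test_item):
        # When HTTP(S)_PROXY points at the deliberately unreachable proxy,
        # network-dependent tests (e.g. the AWS S3 ones) cannot work, so skip them.
        for var in ("HTTP_PROXY", "HTTPS_PROXY", "http_proxy", "https_proxy"):
            if os.environ.get(var) == BROKEN_PROXY:
                return unittest.skip("Internet access disabled via broken proxy.")(test_item)
        return test_item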
=====================================
debian/patches/no_galaxy_lib
=====================================
@@ -1,8 +1,17 @@
-Author: Michael R. Crusoe <crusoe at debian.org>
-Description: skip galaxy-lib, not yet packaged for Debian
+From: "Michael R. Crusoe" <crusoe at debian.org>
+Date: Mon, 7 Jan 2019 02:04:13 -0800
+Subject: skip galaxy-lib, not yet packaged for Debian
+
Forwarded: not-needed
---- toil.orig/requirements-cwl.txt
-+++ toil/requirements-cwl.txt
+Last-Update: 2025-02-13
+---
+ requirements-cwl.txt | 2 --
+ 1 file changed, 2 deletions(-)
+
+diff --git a/requirements-cwl.txt b/requirements-cwl.txt
+index e506d44..68616f4 100644
+--- a/requirements-cwl.txt
++++ b/requirements-cwl.txt
@@ -1,7 +1,5 @@
cwltool==3.1.20250110105449
schema-salad>=8.4.20230128170514,<9
=====================================
debian/patches/pyproject.toml
=====================================
@@ -1,8 +1,17 @@
-From: Michael R. Crusoe <crusoe at debian.org>
+From: "Michael R. Crusoe" <crusoe at debian.org>
+Date: Thu, 13 Feb 2025 09:06:59 +0100
Subject: minimal PEP-517 compliance
+---
+ pyproject.toml | 3 +++
+ 1 file changed, 3 insertions(+)
+ create mode 100644 pyproject.toml
+
+diff --git a/pyproject.toml b/pyproject.toml
+new file mode 100644
+index 0000000..dbdf92a
--- /dev/null
-+++ toil/pyproject.toml
++++ b/pyproject.toml
@@ -0,0 +1,3 @@
+[build-system]
+requires = ["setuptools>=64", "setuptools_scm>=8"]
=====================================
debian/patches/series
=====================================
@@ -1,7 +1,6 @@
pyproject.toml
fix_tests
needs_aws-proxyfix
-soften-configargparser-deps
intersphinx
setting_version.patch
no_galaxy_lib
=====================================
debian/patches/setting_version.patch
=====================================
@@ -1,16 +1,26 @@
-Description: Fixing version.py that is variable and required
-Author: Steffen Moeller <moeller at debian.org>
+From: Steffen Moeller <moeller at debian.org>
+Date: Thu, 14 Jun 2018 12:09:05 +0200
+Subject: Fixing version.py that is variable and required
+
Bug-Debian: https://bugs.debian.org/851365
Forwarded: not-needed
-Last-Update: 2024-01-23
+Last-Update: 2024-02-13
Generated by running `python version_template.py` from a git checkout of the
source to the tag corresponding to the release
As of Toil 5.7.0 the `cwltool_version` field is set in requirements-cwl.txt, please
copy/update that field manually
+---
+ src/toil/version.py | 14 ++++++++++++++
+ 1 file changed, 14 insertions(+)
+ create mode 100644 src/toil/version.py
+
+diff --git a/src/toil/version.py b/src/toil/version.py
+new file mode 100644
+index 0000000..a2bbe38
--- /dev/null
-+++ toil/src/toil/version.py
++++ b/src/toil/version.py
@@ -0,0 +1,14 @@
+baseVersion = '8.0.0'
+cgcloudVersion = '1.6.0a1.dev393'
=====================================
debian/patches/soften-configargparser-deps deleted
=====================================
@@ -1,14 +0,0 @@
-Author: Michael R. Crusoe <crusoe at debian.org>
-Description: Older configargparser is good enough
-Forwarded: not-needed
---- toil.orig/requirements.txt
-+++ toil/requirements.txt
-@@ -8,7 +8,7 @@
- addict>=2.2.1, <2.5
- backports.zoneinfo[tzdata];python_version<"3.9"
- enlighten>=1.5.2, <2
--configargparse>=1.7,<2
-+configargparse>=1.5.3,<2
- ruamel.yaml>=0.15
- pyyaml>=6,<7
- typing-extensions>=4.6.2, <5
=====================================
debian/patches/soften-cwltool-dep.patch
=====================================
@@ -1,9 +1,17 @@
-Description: Soften versioned dependency from cwltool
-Author: Andreas Tille <tille at debian.org>
-Last-Update: Tue, 12 Dec 2023 16:57:57 +0100
+From: Andreas Tille <tille at debian.org>
+Date: Tue, 23 Jan 2024 10:30:09 +0100
+Subject: Soften versioned dependency from cwltool
+
+Last-Update: 2025-02-13
Forwarded: not-needed
---- toil.orig/requirements-cwl.txt
-+++ toil/requirements-cwl.txt
+---
+ requirements-cwl.txt | 2 +-
+ 1 file changed, 1 insertion(+), 1 deletion(-)
+
+diff --git a/requirements-cwl.txt b/requirements-cwl.txt
+index 68616f4..fe5253b 100644
+--- a/requirements-cwl.txt
++++ b/requirements-cwl.txt
@@ -1,4 +1,4 @@
-cwltool==3.1.20250110105449
+cwltool>=3.1.20250110105449
=====================================
debian/patches/soften-mesos-deps
=====================================
@@ -1,13 +1,22 @@
-Author: Michael R. Crusoe <crusoe at debian.org>
+From: "Michael R. Crusoe" <crusoe at debian.org>
+Date: Sun, 13 Jan 2019 23:06:51 -0800
Subject: Skip addict
+
Forwarded: not-needed
---- toil.orig/requirements.txt
-+++ toil/requirements.txt
-@@ -5,7 +5,6 @@
+Last-Update: 2025-02-13
+---
+ requirements.txt | 1 -
+ 1 file changed, 1 deletion(-)
+
+diff --git a/requirements.txt b/requirements.txt
+index 4f302d5..af6cc67 100644
+--- a/requirements.txt
++++ b/requirements.txt
+@@ -5,7 +5,6 @@ urllib3>=1.26.0,<3
python-dateutil
psutil >= 6.1.0, < 7
PyPubSub >=4.0.3, <5
-addict>=2.2.1, <2.5
backports.zoneinfo[tzdata];python_version<"3.9"
enlighten>=1.5.2, <2
- configargparse>=1.5.3,<2
+ configargparse>=1.7,<2
=====================================
debian/patches/soften-psutil
=====================================
@@ -1,9 +1,17 @@
-Author: Michael R. Crusoe <crusoe at debian.org>
-Description: Remove restrictions on the psutil version.
+From: "Michael R. Crusoe" <crusoe at debian.org>
+Date: Thu, 13 Feb 2025 09:01:20 +0100
+Subject: Remove restrictions on the psutil version.
+
Forwarded: not-needed
---- toil.orig/requirements.txt
-+++ toil/requirements.txt
-@@ -3,7 +3,7 @@
+---
+ requirements.txt | 2 +-
+ 1 file changed, 1 insertion(+), 1 deletion(-)
+
+diff --git a/requirements.txt b/requirements.txt
+index 9a22200..51ab826 100644
+--- a/requirements.txt
++++ b/requirements.txt
+@@ -3,7 +3,7 @@ requests
docker
urllib3>=1.26.0,<3
python-dateutil
View it on GitLab: https://salsa.debian.org/med-team/toil/-/compare/d01104ffbcb8907adfbf06986e8827f80c5bf83d...0bc69b5fcd2d933429b6500c82abc8ba03b35b1b