[med-svn] [Git][med-team/snakemake][master] 3 commits: New upstream version 4.8.1

Andreas Tille gitlab at salsa.debian.org
Mon May 7 13:05:53 BST 2018


Andreas Tille pushed to branch master at Debian Med / snakemake


Commits:
03479336 by Andreas Tille at 2018-05-07T13:58:29+02:00
New upstream version 4.8.1
- - - - -
87315e65 by Andreas Tille at 2018-05-07T13:59:17+02:00
Update upstream source from tag 'upstream/4.8.1'

Update to upstream version '4.8.1'
with Debian dir eda4fa5fcf2572e571e50e6fbd672574f5e44166
- - - - -
d2fb2919 by Andreas Tille at 2018-05-07T14:00:16+02:00
Upstream version 4.8.1

- - - - -


27 changed files:

- .circleci/common.sh
- .gitignore
- CHANGELOG.md
- Dockerfile
- debian/changelog
- docs/getting_started/installation.rst
- docs/project_info/authors.rst
- docs/project_info/faq.rst
- docs/snakefiles/modularization.rst
- docs/tutorial/basics.rst
- snakemake/__init__.py
- snakemake/_version.py
- snakemake/executors.py
- snakemake/io.py
- snakemake/jobs.py
- snakemake/output_index.py
- snakemake/parser.py
- snakemake/remote/FTP.py
- snakemake/remote/SFTP.py
- snakemake/remote/__init__.py
- snakemake/remote/gfal.py
- snakemake/workflow.py
- + tests/test_issue805/Snakefile
- + tests/test_issue805/expected-results/test.out
- tests/test_kubernetes/Snakefile
- tests/test_speed/Snakefile
- tests/tests.py


Changes:

=====================================
.circleci/common.sh
=====================================
--- a/.circleci/common.sh
+++ b/.circleci/common.sh
@@ -1 +1 @@
-SINGULARITY_VER=2.4.2
+SINGULARITY_VER=2.4.5


=====================================
.gitignore
=====================================
--- a/.gitignore
+++ b/.gitignore
@@ -9,3 +9,5 @@ dist/
 *.egg
 .eggs/
 .snakemake*
+
+.idea


=====================================
CHANGELOG.md
=====================================
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,5 +1,16 @@
 # Change Log
 
+# [4.8.1] - 2018-04-25
+### Added
+- Allow URLs for the conda directive.
+### Changed
+- Various minor updates in the docs.
+- Several bug fixes with remote file handling.
+- Fix ImportError occurring with script directive.
+- Use latest singularity.
+- Improved caching for file existence checks. We first check existence of parent directories and cache these results. By this, large parts of the generated FS tree can be pruned if files are not yet present. If files are present, the overhead is minimal, since the checks for the parents are cached.
+- Various minor bug fixes.
+
 # [4.8.0] - 2018-03-13
 ### Added
 - Integration with CWL: the `cwl` directive allows the use of CWL tool definitions in addition to shell commands or Snakemake wrappers.
@@ -37,7 +48,7 @@
 ### Added
 - A new shadow mode (minimal) that only symlinks input files has been added.
 ### Changed
-- The default shell is now bash on linux and maxOS. If bash is not installed, we fall back to sh. Previously, Snakemake used the default shell of the user, which defeats the purpose of portability. If the developer decides so, the shell can be always overwritten using shell.executable().
+- The default shell is now bash on linux and macOS. If bash is not installed, we fall back to sh. Previously, Snakemake used the default shell of the user, which defeats the purpose of portability. If the developer decides so, the shell can be always overwritten using shell.executable().
 - Snakemake now requires Singularity 2.4.1 at least (only when running with --use-singularity).
 - HTTP remote provider no longer automatically unpacks gzipped files.
 - Fixed various smaller bugs.
@@ -75,7 +86,7 @@
 ### Changed
 - The scheduler now tries to get rid of the largest temp files first.
 - The Docker image used for kubernetes support can now be configured at the command line.
-- Rate-limiting for cluster interaction has be unified.
+- Rate-limiting for cluster interaction has been unified.
 - S3 remote provider uses boto3.
 - Resource functions can now use an additional `attempt` parameter, that contains the number of times this job has already been tried.
 - Various minor fixes.
@@ -133,7 +144,7 @@
 ### Changed
 - Benchmark files now also include the maximal RSS and VMS size of the Snakemake process and all sub processes.
 - Speedup conda environment creation.
-- Allow specification, of DRMAA log dir.
+- Allow specification of DRMAA log dir.
 - Pass cluster config to subworkflow.
 
 
@@ -189,7 +200,7 @@
 
 ## [3.9.0] - 2016-11-15
 ### Added
-- Ability to define isolated conda software environments (YAML) per rule. Environment will be deployed by Snakemake upon workflow execution.
+- Ability to define isolated conda software environments (YAML) per rule. Environments will be deployed by Snakemake upon workflow execution.
 - Command line argument --wrapper-prefix in order to overwrite the default URL for looking up wrapper scripts.
 ### Changed
 - --summary now displays the log files corresponding to each output file.


=====================================
Dockerfile
=====================================
--- a/Dockerfile
+++ b/Dockerfile
@@ -1,8 +1,8 @@
 FROM bitnami/minideb:stretch
 MAINTAINER Johannes Köster <johannes.koester at tu-dortmund.de>
-ENV SINGULARITY_VERSION=2.3.2
+ENV SINGULARITY_VERSION=2.4.5
 ADD . /tmp/repo
-RUN install_packages wget bzip2 ca-certificates gnupg2
+RUN install_packages wget bzip2 ca-certificates gnupg2 squashfs-tools
 RUN wget -O- http://neuro.debian.net/lists/xenial.us-ca.full > /etc/apt/sources.list.d/neurodebian.sources.list
 RUN wget -O- http://neuro.debian.net/_static/neuro.debian.net.asc | apt-key add -
 RUN install_packages singularity-container
@@ -12,5 +12,5 @@ RUN wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh &
 ENV PATH /opt/conda/bin:${PATH}
 ENV LANG C.UTF-8
 ENV SHELL /bin/bash
-RUN conda env update --name root --file /tmp/repo/environment.yml && conda clean --all -y
+RUN conda update -n base conda && conda env update --name root --file /tmp/repo/environment.yml && conda clean --all -y
 RUN pip install /tmp/repo


=====================================
debian/changelog
=====================================
--- a/debian/changelog
+++ b/debian/changelog
@@ -1,4 +1,4 @@
-snakemake (4.8.0-1) UNRELEASED; urgency=medium
+snakemake (4.8.1-1) UNRELEASED; urgency=medium
 
   * Team upload.
   * New upstream version


=====================================
docs/getting_started/installation.rst
=====================================
--- a/docs/getting_started/installation.rst
+++ b/docs/getting_started/installation.rst
@@ -25,7 +25,7 @@ Then, you can install Snakemake with
 
 .. code-block:: console
 
-    $ conda install -c bioconda snakemake
+    $ conda install -c bioconda -c conda-forge snakemake
 
 from the `Bioconda <https://bioconda.github.io>`_ channel.
 


=====================================
docs/project_info/authors.rst
=====================================
--- a/docs/project_info/authors.rst
+++ b/docs/project_info/authors.rst
@@ -19,6 +19,7 @@ Development Team
 - Manuel Holtgrewe
 - Christian Arnold
 - Wibowo Arindrarto
+- Rasmus Ågren
 
 Contributors
 ------------


=====================================
docs/project_info/faq.rst
=====================================
--- a/docs/project_info/faq.rst
+++ b/docs/project_info/faq.rst
@@ -183,6 +183,14 @@ I get a NameError with my shell command. Are braces unsupported?
 
 You can use the entire Python `format minilanguage <http://docs.python.org/3/library/string.html#formatspec>`_ in shell commands. Braces in shell commands that are not intended to insert variable values thus have to be escaped by doubling them:
 
+This:
+
+.. code-block:: python
+
+    ...
+    shell: "awk '{print $1}' {input}"
+    
+becomes:
 
 .. code-block:: python
 
@@ -191,11 +199,19 @@ You can use the entire Python `format minilanguage <http://docs.python.org/3/lib
 
 Here the double braces are escapes, i.e., single braces will remain in the final command. In contrast, ``{input}`` is replaced with an input filename.
 
-In addition, if your shell command has literal slashes, `\`, you must escape them with a slash, `\\`. For example:
+In addition, if your shell command has literal backslashes, `\\ `, you must escape them with a backslash, `\\\\ `. For example:
+
+This:
 
 .. code-block:: python
 
-    shell: """printf \\">%s\\"" {{input}}""" 
+    shell: """printf \">%s\"" {{input}}""" 
+
+becomes:
+
+.. code-block:: python
+
+    shell: """printf \\">%s\\"" {{input}}"""  
     
 How do I incorporate files that do not follow a consistent naming scheme?
 -------------------------------------------------------------------------
@@ -466,3 +482,40 @@ There are two ways to exit a currently running workflow.
    .. code-block:: bash
 
        killall -TERM snakemake
+
+How do I access elements of input or output by a variable index?
+----------------------------------------------------------------
+
+Assuming you have something like the following rule
+
+   .. code-block:: python
+
+      rule a:
+          output:
+              expand("test.{i}.out", i=range(20))
+          run:
+              for i in range(20):
+                  shell("echo test > {output[i]}")
+
+Snakemake will fail upon execution with the error ``'OutputFiles' object has no attribute 'i'``. The reason is that the shell command uses the `Python format mini language <https://docs.python.org/3/library/string.html#formatspec>`_, which only allows indexing via constants, e.g., ``output[1]``, not via variables. Variables are treated as attribute names instead. The solution is to write
+
+   .. code-block:: python
+
+      rule a:
+          output:
+              expand("test.{i}.out", i=range(20))
+          run:
+              for i in range(20):
+                  f = output[i]
+                  shell("echo test > {f}")
+
+or, more concisely in this special case:
+
+   .. code-block:: python
+
+      rule a:
+          output:
+              expand("test.{i}.out", i=range(20))
+          run:
+              for f in output:
+                  shell("echo test > {f}")

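The brace handling described in the FAQ entries above comes down to Python's ``str.format``. A minimal standalone sketch (the file name ``data.txt`` is made up for illustration):

```python
# Doubled braces survive formatting as literal braces; single braces
# are substituted -- the same rules Snakemake applies to shell commands.
cmd = "awk '{{print $1}}' {input}".format(input="data.txt")
print(cmd)  # awk '{print $1}' data.txt
```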

=====================================
docs/snakefiles/modularization.rst
=====================================
--- a/docs/snakefiles/modularization.rst
+++ b/docs/snakefiles/modularization.rst
@@ -6,7 +6,7 @@ Modularization
 
 Modularization in Snakemake comes at different levels.
 
-1. The most fine-grained level are wrappers. They are available and can be published at the `Snakemake Wrapper Repository <https://snakemake-wrappers.readthedocs.io>`_. These wrappers can then be composed and customized according to your needs, by copying skeleton rules into your workflow. In combination with conda integration, wrappers also automatically deploy the needed software dependencies into isolated environments.
+1. The most fine-grained level is wrappers. They are available and can be published at the `Snakemake Wrapper Repository <https://snakemake-wrappers.readthedocs.io>`_. These wrappers can then be composed and customized according to your needs, by copying skeleton rules into your workflow. In combination with conda integration, wrappers also automatically deploy the needed software dependencies into isolated environments.
 2. For larger, reusable parts that shall be integrated into a common workflow, it is recommended to write small Snakefiles and include them into a master Snakefile via the include statement. In such a setup, all rules share a common config file.
 3. The third level of separation is subworkflows. Importantly, these are meant rather as links between otherwise separate data analyses.
 
@@ -73,7 +73,7 @@ Snakemake will execute the rule by invoking `cwltool`, which has to be available
 When used in combination with :ref:`--use-singularity <singularity>`, Snakemake will instruct `cwltool` to execute the command via Singularity in user space.
 Otherwise, `cwltool` will in most cases use a Docker container, which requires Docker to be set up properly.
 
-The advantage is that predefined tools available via the `official repository <https://github.com/common-workflow-language/workflows/tree/master/tools>`_ can be used in any supporting workflow management system.
+The advantage is that predefined tools available via any `repository of CWL tool definitions <http://www.commonwl.org/#Repositories_of_CWL_Tools_and_Workflows>`_ can be used in any supporting workflow management system.
 In contrast to a :ref:`Snakemake wrapper <snakefiles-wrappers>`, CWL tool definitions are in general not suited to alter the behavior of a tool, e.g., by normalizing output names or special input handling.
 As you can see in comparison to the analog :ref:`wrapper declaration <snakefiles-wrappers>` above, the rule becomes slightly more verbose, because input, output, and params have to be dispatched to the specific expectations of the CWL tool definition.
 


=====================================
docs/tutorial/basics.rst
=====================================
--- a/docs/tutorial/basics.rst
+++ b/docs/tutorial/basics.rst
@@ -66,7 +66,7 @@ For technical reasons, DNA sequencing cuts the DNA of a sample into millions
 of small pieces, called **reads**.
 In order to recover the genome of the sample, one has to map these reads against
 a known **reference genome** (e.g., the human one obtained during the famous
-`human genome genome project <https://en.wikipedia.org/wiki/Human_Genome_Project>`_).
+`human genome project <https://en.wikipedia.org/wiki/Human_Genome_Project>`_).
 This task is called **read mapping**.
 Often, it is of interest where an individual genome is different from the species-wide consensus
 represented with the reference genome.


=====================================
snakemake/__init__.py
=====================================
--- a/snakemake/__init__.py
+++ b/snakemake/__init__.py
@@ -746,7 +746,7 @@ def get_argument_parser(profile=None):
          "dictionary inside the workflow."))
     parser.add_argument("--list", "-l",
                         action="store_true",
-                        help="Show availiable rules in given Snakefile.")
+                        help="Show available rules in given Snakefile.")
     parser.add_argument("--list-target-rules", "--lt",
                         action="store_true",
                         help="Show available target rules in given Snakefile.")


=====================================
snakemake/_version.py
=====================================
--- a/snakemake/_version.py
+++ b/snakemake/_version.py
@@ -23,9 +23,9 @@ def get_keywords():
     # setup.py/versioneer.py will grep for the variable names, so they must
     # each be defined on a line of their own. _version.py will just call
     # get_keywords().
-    git_refnames = " (tag: v4.8.0)"
-    git_full = "e0c4734235c57aa7db250e2057d1545b3b5aac62"
-    git_date = "2018-03-13 18:56:01 +0100"
+    git_refnames = " (tag: v4.8.1)"
+    git_full = "7f3006da24affc63752c1fcb261152105fe160ec"
+    git_date = "2018-04-25 15:55:48 +0200"
     keywords = {"refnames": git_refnames, "full": git_full, "date": git_date}
     return keywords
 


=====================================
snakemake/executors.py
=====================================
--- a/snakemake/executors.py
+++ b/snakemake/executors.py
@@ -306,7 +306,7 @@ class CPUExecutor(RealExecutor):
             '--force-use-threads --wrapper-prefix {workflow.wrapper_prefix} ',
             '--latency-wait {latency_wait} ',
             self.get_default_remote_provider_args(),
-            '{overwrite_workdir} {overwrite_config} ',
+            '{overwrite_workdir} {overwrite_config} {printshellcmds} ',
             '--notemp --quiet --no-hooks --nolock --mode {} '.format(Mode.subprocess)))
 
         if self.workflow.use_conda:
@@ -457,8 +457,6 @@ class ClusterExecutor(RealExecutor):
         else:
             self.exec_job = exec_job
 
-        if printshellcmds:
-            self.exec_job += " --printshellcmds "
         if self.workflow.use_conda:
             self.exec_job += " --use-conda "
             if self.workflow.conda_prefix:
@@ -545,8 +543,8 @@ class ClusterExecutor(RealExecutor):
         if self.assume_shared_fs:
             wait_for_files.append(self.tmpdir)
             wait_for_files.extend(job.local_input)
-            wait_for_files.extend(f.local_file()
-                                  for f in job.remote_input if not f.stay_on_remote)
+            wait_for_files.extend(f for f in job.remote_input
+                                    if not f.should_stay_on_remote)
 
             if job.shadow_dir:
                 wait_for_files.append(job.shadow_dir)
@@ -1092,7 +1090,11 @@ class KubernetesExecutor(ClusterExecutor):
         # use relative path to Snakefile
         self.snakefile = os.path.relpath(workflow.snakefile)
 
-        from kubernetes import config
+        try:
+            from kubernetes import config
+        except ImportError:
+            raise WorkflowError("The Python 3 package 'kubernetes' "
+                                "must be installed to use Kubernetes")
         config.load_kube_config()
 
         import kubernetes.client
@@ -1104,9 +1106,10 @@ class KubernetesExecutor(ClusterExecutor):
         self.run_namespace = str(uuid.uuid4())
         self.secret_envvars = {}
         self.register_secret()
+        last_stable_version = __version__.split("+")[0]
         self.container_image = (
             container_image or
-            "quay.io/snakemake/snakemake:{}".format(__version__))
+            "quay.io/snakemake/snakemake:{}".format(last_stable_version))
 
     def register_secret(self):
         import kubernetes.client


=====================================
snakemake/io.py
=====================================
--- a/snakemake/io.py
+++ b/snakemake/io.py
@@ -6,6 +6,7 @@ __license__ = "MIT"
 import collections
 import os
 import shutil
+from pathlib import Path
 import re
 import stat
 import time
@@ -64,14 +65,16 @@ def lchmod(f, mode):
 class IOCache:
     def __init__(self):
         self.mtime = dict()
-        self.exists = dict()
+        self.exists_local = dict()
+        self.exists_remote = dict()
         self.size = dict()
         self.active = True
 
     def clear(self):
         self.mtime.clear()
-        self.exists.clear()
         self.size.clear()
+        self.exists_local.clear()
+        self.exists_remote.clear()
 
     def deactivate(self):
         self.clear()
@@ -147,17 +150,17 @@ class _IOFile(str):
     def update_remote_filepath(self):
         # if the file string is different in the iofile, update the remote object
         # (as in the case of wildcard expansion)
-        remote_object = get_flag_value(self._file, "remote_object")
+        remote_object = self.remote_object
         if remote_object._file != self._file:
             remote_object._iofile = self
 
     @property
     def should_keep_local(self):
-        return get_flag_value(self._file, "remote_object").keep_local
+        return self.remote_object.keep_local
 
     @property
     def should_stay_on_remote(self):
-        return get_flag_value(self._file, "remote_object").stay_on_remote
+        return self.remote_object.stay_on_remote
 
     @property
     def remote_object(self):
@@ -196,18 +199,59 @@ class _IOFile(str):
                     self._file, os.path.sep, hint))
 
     @property
-    @iocache
-    @_refer_to_remote
     def exists(self):
-        return self.exists_local
+        if self.is_remote:
+            return self.exists_remote
+        else:
+            return self.exists_local
+
+    def parents(self, omit=0):
+        """Yield all parent paths, omitting the given number of ancenstors."""
+        for p in list(Path(self.file).parents)[::-1][omit:]:
+            p = IOFile(str(p), rule=self.rule)
+            p.clone_flags(self)
+            yield p
 
     @property
+    @iocache
     def exists_local(self):
+        if self.rule.workflow.iocache.active:
+            # The idea is to first check existence of parent directories and
+            # cache the results.
+            # We omit the last ancestor, because this is always "." or "/" or a
+            # drive letter.
+            for p in self.parents(omit=1):
+                try:
+                    if not p.exists_local:
+                        return False
+                except:
+                    # In case of an error, we continue, because it can be that
+                    # we simply don't have the permissions to access a parent
+                    # directory.
+                    continue
         return os.path.exists(self.file)
 
     @property
+    @iocache
     def exists_remote(self):
-        return (self.is_remote and self.remote_object.exists())
+        if not self.is_remote:
+            return False
+        if (self.rule.workflow.iocache.active and
+            self.remote_object.provider.allows_directories):
+            # The idea is to first check existence of parent directories and
+            # cache the results.
+            # We omit the last 2 ancestors, because these are "." and the host
+            # name of the remote location.
+            for p in self.parents(omit=2):
+                try:
+                    if not p.exists_remote:
+                        return False
+                except:
+                    # In case of an error, we continue, because it can be that
+                    # we simply don't have the permissions to access a parent
+                    # directory in the remote.
+                    continue
+        return self.remote_object.exists()
 
     @property
     def protected(self):

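The `parents(omit=...)` helper added above builds on `pathlib`. A minimal sketch of the ancestor ordering it relies on (the example path is made up):

```python
from pathlib import Path

# Path.parents yields ancestors nearest-first; the helper reverses them
# and drops the first `omit` entries ("." for a relative path).
ancestors = list(Path("a/b/c.txt").parents)[::-1][1:]
print([str(p) for p in ancestors])  # ['a', 'a/b']
```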

=====================================
snakemake/jobs.py
=====================================
--- a/snakemake/jobs.py
+++ b/snakemake/jobs.py
@@ -179,9 +179,6 @@ class Job:
             if self._conda_env is None:
                 self._conda_env = self.dag.conda_envs.get(
                     (self.conda_env_file, self.singularity_img_url))
-            if self._conda_env is None:
-                raise ValueError("Conda environment {} not found.".format(
-                                 self.conda_env_file))
             return self._conda_env
         return None
 


=====================================
snakemake/output_index.py
=====================================
--- a/snakemake/output_index.py
+++ b/snakemake/output_index.py
@@ -5,13 +5,12 @@ __license__ = "MIT"
 
 from itertools import chain
 
-import datrie
-
 from snakemake.io import _IOFile
 
 
 class OutputIndex:
     def __init__(self, rules):
+        import datrie
         def prefixes(rule):
             return (str(o.constant_prefix()) for o in rule.products)
         def reverse_suffixes(rule):


=====================================
snakemake/parser.py
=====================================
--- a/snakemake/parser.py
+++ b/snakemake/parser.py
@@ -638,6 +638,10 @@ class Rule(GlobalKeywordState):
                             "Multiple run or shell keywords in rule {}.".format(
                                 self.rulename), token)
                     self.run = True
+                elif self.run:
+                    raise self.error("No rule keywords allowed after "
+                                     "run/shell/script/wrapper/cwl in "
+                                     "rule {}.".format(self.rulename), token)
                 for t in self.subautomaton(token.string,
                                            rulename=self.rulename).consume():
                     yield t


=====================================
snakemake/remote/FTP.py
=====================================
--- a/snakemake/remote/FTP.py
+++ b/snakemake/remote/FTP.py
@@ -26,6 +26,7 @@ except ImportError as e:
 class RemoteProvider(AbstractRemoteProvider):
 
     supports_default = True
+    allows_directories = True
 
     def __init__(self, *args, stay_on_remote=False, immediate_close=False, **kwargs):
         super(RemoteProvider, self).__init__(*args, stay_on_remote=stay_on_remote, **kwargs)


=====================================
snakemake/remote/SFTP.py
=====================================
--- a/snakemake/remote/SFTP.py
+++ b/snakemake/remote/SFTP.py
@@ -21,6 +21,7 @@ except ImportError as e:
 class RemoteProvider(AbstractRemoteProvider):
 
     supports_default = True
+    allows_directories = True
 
     def __init__(self, *args, stay_on_remote=False, **kwargs):
         super(RemoteProvider, self).__init__(*args, stay_on_remote=stay_on_remote, **kwargs)


=====================================
snakemake/remote/__init__.py
=====================================
--- a/snakemake/remote/__init__.py
+++ b/snakemake/remote/__init__.py
@@ -45,6 +45,9 @@ class AbstractRemoteProvider:
         and are then passed to RemoteObjects.
     """
     __metaclass__ = ABCMeta
+    
+    supports_default = False
+    allows_directories = False
 
     def __init__(self, *args, keep_local=False, stay_on_remote=False, **kwargs):
         self.args = args


=====================================
snakemake/remote/gfal.py
=====================================
--- a/snakemake/remote/gfal.py
+++ b/snakemake/remote/gfal.py
@@ -24,6 +24,7 @@ if not shutil.which("gfal-copy"):
 class RemoteProvider(AbstractRemoteProvider):
 
     supports_default = True
+    allows_directories = True
 
     def __init__(self, *args, stay_on_remote=False, retry=5, **kwargs):
         super(RemoteProvider, self).__init__(*args, stay_on_remote=stay_on_remote, **kwargs)


=====================================
snakemake/workflow.py
=====================================
--- a/snakemake/workflow.py
+++ b/snakemake/workflow.py
@@ -773,7 +773,8 @@ class Workflow:
                     raise RuleException("Conda environments are only allowed "
                         "with shell, script, or wrapper directives "
                         "(not with run).", rule=rule)
-                if not os.path.isabs(ruleinfo.conda_env):
+                if not (urllib.parse.urlparse(ruleinfo.conda_env).scheme
+                        or os.path.isabs(ruleinfo.conda_env)):
                     ruleinfo.conda_env = os.path.join(self.current_basedir, ruleinfo.conda_env)
                 rule.conda_env = ruleinfo.conda_env
 

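The URL-versus-path check added to `workflow.py` above can be illustrated in isolation; `resolve_conda_env` and the base directory are hypothetical names for this sketch:

```python
import os
from urllib.parse import urlparse

def resolve_conda_env(env, basedir="/workflow"):
    # URLs (non-empty scheme) and absolute paths pass through unchanged;
    # relative paths are anchored at the Snakefile's directory.
    if urlparse(env).scheme or os.path.isabs(env):
        return env
    return os.path.join(basedir, env)

print(resolve_conda_env("https://example.com/env.yaml"))
print(resolve_conda_env("envs/gzip.yaml"))  # /workflow/envs/gzip.yaml
```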

=====================================
tests/test_issue805/Snakefile
=====================================
--- /dev/null
+++ b/tests/test_issue805/Snakefile
@@ -0,0 +1,16 @@
+rule a:
+    output:
+        "test.out"
+    shell:
+        "echo {params.b} > {output}"
+    params:
+        b=1
+
+
+rule b:
+    input:
+        "test.out"
+    output:
+        "test2.out"
+    shell:
+        "touch {output}"


=====================================
tests/test_issue805/expected-results/test.out
=====================================
--- /dev/null
+++ b/tests/test_issue805/expected-results/test.out
@@ -0,0 +1 @@
+1


=====================================
tests/test_kubernetes/Snakefile
=====================================
--- a/tests/test_kubernetes/Snakefile
+++ b/tests/test_kubernetes/Snakefile
@@ -23,6 +23,8 @@ rule pack:
         "landsat-data.txt.bz2"
     conda:
         "envs/gzip.yaml"
+    singularity:
+        "docker://continuumio/miniconda3:4.4.10"
     log:
         "logs/pack.log"
     shell:


=====================================
tests/test_speed/Snakefile
=====================================
--- a/tests/test_speed/Snakefile
+++ b/tests/test_speed/Snakefile
@@ -1,7 +1,7 @@
 
 rule all:
     input:
-        expand("step3/{sample}.txt", sample=range(10000))
+        expand("step3/{sample}.txt", sample=range(100))
 
 
 rule a:


=====================================
tests/tests.py
=====================================
--- a/tests/tests.py
+++ b/tests/tests.py
@@ -566,6 +566,8 @@ def test_remote_log():
 def test_remote_http():
     run(dpath("test_remote_http"))
 
+def test_remote_http_cluster():
+    run(dpath("test_remote_http"), cluster=os.path.abspath(dpath("test14/qsub")))
 
 def test_profile():
     run(dpath("test_profile"))
@@ -604,6 +606,9 @@ def test_gcloud():
             sudo $GCLOUD container clusters get-credentials {cluster} --zone us-central1-a
             """)
             run(dpath("test_kubernetes"))
+            run(dpath("test_kubernetes"), use_conda=True)
+            run(dpath("test_kubernetes"), use_singularity=True)
+            run(dpath("test_kubernetes"), use_singularity=True, use_conda=True)
         finally:
             shell("sudo $GCLOUD container clusters delete {cluster} --zone us-central1-a --quiet")
     print("Skipping google cloud test")
@@ -617,6 +622,10 @@ def test_cwl_singularity():
     run(dpath("test_cwl"), use_singularity=True)
 
 
+def test_issue805():
+    run(dpath("test_issue805"), shouldfail=True)
+
+
 if __name__ == '__main__':
     import nose
     nose.run(defaultTest=__name__)



View it on GitLab: https://salsa.debian.org/med-team/snakemake/compare/6906708e0b1ea72d7044bc26618ebb0623611395...d2fb29195bfada1fcd8a8451e84dd5026730a581
