[med-svn] [Git][med-team/ariba][master] 11 commits: routine-update: New upstream version

Étienne Mollier (@emollier) gitlab@salsa.debian.org
Sat Oct 14 11:13:04 BST 2023



Étienne Mollier pushed to branch master at Debian Med / ariba


Commits:
1fb08341 by Étienne Mollier at 2023-10-14T10:48:34+02:00
routine-update: New upstream version

- - - - -
310aabf3 by Étienne Mollier at 2023-10-14T10:48:35+02:00
New upstream version 2.14.7+ds
- - - - -
505982d8 by Étienne Mollier at 2023-10-14T10:48:40+02:00
Update upstream source from tag 'upstream/2.14.7+ds'

Update to upstream version '2.14.7+ds'
with Debian dir ebb85cbdb1d1c44d88188549cbd3748e05e8b742
- - - - -
6a88e723 by Étienne Mollier at 2023-10-14T10:58:53+02:00
port-to-pytest.patch: remove: applied upstream.

- - - - -
1eec3b9c by Étienne Mollier at 2023-10-14T10:59:27+02:00
mpileup-1.16.patch: remove: applied upstream.

- - - - -
9f9c4e61 by Étienne Mollier at 2023-10-14T10:59:50+02:00
add-testdata.patch: refresh.

- - - - -
0a68b8f5 by Étienne Mollier at 2023-10-14T10:59:59+02:00
skip-cluster-test.patch: refresh.

- - - - -
5fc4ae35 by Étienne Mollier at 2023-10-14T11:22:40+02:00
adjust-bowtie2-test.patch: remove: fixed upstream.

- - - - -
18b239a9 by Étienne Mollier at 2023-10-14T11:23:56+02:00
skip-cluster-test.patch: remove: fixed upstream.

- - - - -
77b95580 by Étienne Mollier at 2023-10-14T11:24:27+02:00
disable-tests-with-internet-access.patch: forwarding not-needed.

- - - - -
ecb9b56f by Étienne Mollier at 2023-10-14T12:10:59+02:00
ready to upload to unstable

- - - - -


23 changed files:

- + .github/workflows/build.yaml
- − .travis.yml
- Dockerfile
- README.md
- + Singularity.def
- ariba/external_progs.py
- ariba/mlst_profile.py
- ariba/ref_genes_getter.py
- ariba/samtools_variants.py
- ariba/tests/data/samtools_variants_make_vcf_and_depths_files.expect.depths.gz
- ariba/tests/data/samtools_variants_make_vcf_and_depths_files.expect.vcf
- ariba/tests/samtools_variants_test.py
- debian/changelog
- debian/patches/add-testdata.patch
- − debian/patches/adjust-bowtie2-test.patch
- debian/patches/disable-tests-with-internet-access.patch
- − debian/patches/mpileup-1.16.patch
- − debian/patches/port-to-pytest.patch
- debian/patches/series
- − debian/patches/skip-cluster-test.patch
- install_dependencies.sh
- + requirements.txt
- setup.py


Changes:

=====================================
.github/workflows/build.yaml
=====================================
@@ -0,0 +1,116 @@
+name: Build ariba images
+
+on:
+  push:
+    tags:
+      - 'v*.*.*'
+    branches:
+      - master
+  pull_request:
+    branches:
+      - master
+
+env:
+  REGISTRY: ghcr.io
+  IMAGE_NAME: ${{ github.repository }}
+
+jobs:
+  build:
+    name: Build
+    runs-on: ubuntu-20.04
+    steps:
+
+    - name: Set up Go 1.16
+      uses: actions/setup-go@v1
+      with:
+        go-version: 1.16
+      id: go
+
+    - name: Install Dependencies
+      run: |
+        sudo apt-get update && sudo apt-get install -y \
+          build-essential \
+          libssl-dev \
+          uuid-dev \
+          libgpgme11-dev \
+          squashfs-tools \
+          libseccomp-dev \
+          pkg-config \
+          debootstrap \
+          debian-keyring \
+          debian-archive-keyring \
+          rsync
+
+    - name: Install Singularity
+      env:
+        SINGULARITY_VERSION: 3.5.3
+        GOPATH: /tmp/go
+      run: |
+        mkdir -p $GOPATH
+        sudo mkdir -p /usr/local/var/singularity/mnt
+        mkdir -p $GOPATH/src/github.com/sylabs
+        cd $GOPATH/src/github.com/sylabs
+        wget https://github.com/hpcng/singularity/releases/download/v${SINGULARITY_VERSION}/singularity-${SINGULARITY_VERSION}.tar.gz
+        tar -xzf singularity-${SINGULARITY_VERSION}.tar.gz
+        cd singularity
+        ./mconfig -v -p /usr/local
+        make -j `nproc 2>/dev/null || echo 1` -C ./builddir all
+        sudo make -C ./builddir install
+
+
+    - name: Check out code for the container build
+      uses: actions/checkout@v2
+
+    - name: Set release version if is a release
+      if: startsWith(github.event.ref, 'refs/tags/v')
+      run: echo "RELEASE_VERSION=${GITHUB_REF#refs/*/}" >> $GITHUB_ENV
+
+    - name: Set release version if not a release
+      if: false == startsWith(github.event.ref, 'refs/tags/v')
+      run: echo "RELEASE_VERSION=test" >> $GITHUB_ENV
+
+    - name: Build Singularity container
+      env:
+        SINGULARITY_RECIPE: Singularity.def
+        OUTPUT_CONTAINER: ariba_${{env.RELEASE_VERSION}}.img
+      run: |
+        ls
+        if [ -f "${SINGULARITY_RECIPE}" ]; then
+            sudo -E singularity build ${OUTPUT_CONTAINER} ${SINGULARITY_RECIPE}
+        else
+            echo "${SINGULARITY_RECIPE} is not found."
+            echo "Present working directory: $PWD"
+            ls
+        fi
+
+    - name: Release
+      if: startsWith(github.event.ref, 'refs/tags/v')
+      uses: softprops/action-gh-release@v1
+      with:
+        files: ariba*.img
+      env:
+        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+
+    - name: Log in to github container registry
+      if: startsWith(github.event.ref, 'refs/tags/v')
+      uses: docker/login-action@f054a8b539a109f9f41c372932f1ae047eff08c9
+      with:
+        registry: ${{ env.REGISTRY }}
+        username: ${{ github.actor }}
+        password: ${{ secrets.GITHUB_TOKEN }}
+
+    - name: Extract metadata (tags, labels) for Docker
+      if: startsWith(github.event.ref, 'refs/tags/v')
+      id: meta
+      uses: docker/metadata-action@98669ae865ea3cffbcbaa878cf57c20bbf1c6c38
+      with:
+        images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
+
+    - name: Build and push Docker image
+      if: startsWith(github.event.ref, 'refs/tags/v')
+      uses: docker/build-push-action@ad44023a93711e3deb337508980b4b5e9bcdc5dc
+      with:
+        context: .
+        push: true
+        tags: ${{ steps.meta.outputs.tags }}
+        labels: ${{ steps.meta.outputs.labels }}

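The two release-version steps in the workflow above rely on bash parameter expansion to derive RELEASE_VERSION from GITHUB_REF. A quick standalone illustration of the `${GITHUB_REF#refs/*/}` stripping (ref values are illustrative):

```shell
#!/usr/bin/env bash
# '#refs/*/' removes the shortest leading match of refs/*/ from the variable.

# Tag push: refs/tags/v2.14.7 -> v2.14.7
GITHUB_REF="refs/tags/v2.14.7"
echo "${GITHUB_REF#refs/*/}"   # prints: v2.14.7

# Branch push: refs/heads/master -> master
GITHUB_REF="refs/heads/master"
echo "${GITHUB_REF#refs/*/}"   # prints: master
```

This is why the workflow only needs one expansion for both tags and branches, with the non-tag case overridden to the literal value "test".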

=====================================
.travis.yml deleted
=====================================
@@ -1,21 +0,0 @@
-language: python
-addons:
-  apt:
-    packages:
-    - zlib1g-dev
-    - libblas-dev
-    - liblapack-dev
-    - libgfortran3
-    - libncurses5-dev
-python:
-- '3.6'
-sudo: false
-install:
-- pip install pysam
-- source ./install_dependencies.sh
-before_script:
-- pip install codecov
-script:
-- coverage run setup.py test
-after_success:
-- codecov


=====================================
Dockerfile
=====================================
@@ -1,62 +1,31 @@
-FROM ubuntu:18.04
+FROM ubuntu:20.04
 
 ENV DEBIAN_FRONTEND=noninteractive
 
-MAINTAINER ariba-help@sanger.ac.uk
-
-# Software version numbers
-ARG BOWTIE2_VERSION=2.2.9
-ARG SPADES_VERSION=3.13.1
-ARG ARIBA_TAG=master
 ARG ARIBA_BUILD_DIR=/ariba
-
+ARG DEPS_DIR=/bioinf-tools
 ARG LOCALE_COUNTRY=en_GB
 
-RUN apt-get -qq update && \
-    apt-get install --no-install-recommends -y \
-  build-essential \
-  cd-hit \
-  curl \
-  git \
-  libbz2-dev \
-  liblzma-dev \
-  mummer \
-  python3-dev \
-  python3-setuptools \
-  python3-pip \
-  python3-tk \
-  python3-matplotlib \
-  unzip \
-  wget \
-  zlib1g-dev
-
 # Install locales
-RUN apt-get update && apt-get install -y locales-all && rm -rf /var/lib/apt/lists/* 
+RUN apt-get update && apt-get install -y locales-all && rm -rf /var/lib/apt/lists/*
 # Set a default locale.
 ENV LANG=${LOCALE_COUNTRY}.UTF-8 \
-    LANGUAGE=${LOCALE_COUNTRY}:en 
+    LANGUAGE=${LOCALE_COUNTRY}:en
 
-# Install bowtie
-RUN wget -q http://downloads.sourceforge.net/project/bowtie-bio/bowtie2/${BOWTIE2_VERSION}/bowtie2-${BOWTIE2_VERSION}-linux-x86_64.zip \
-  && unzip bowtie2-${BOWTIE2_VERSION}-linux-x86_64.zip \
-  && rm -f bowtie2-${BOWTIE2_VERSION}-linux-x86_64.zip
-
-# Install SPAdes
-RUN wget -q https://github.com/ablab/spades/releases/download/v${SPADES_VERSION}/SPAdes-${SPADES_VERSION}-Linux.tar.gz \
-  && tar -zxf SPAdes-${SPADES_VERSION}-Linux.tar.gz \
-  && rm -f SPAdes-${SPADES_VERSION}-Linux.tar.gz
+RUN mkdir -p $ARIBA_BUILD_DIR
+COPY . $ARIBA_BUILD_DIR
+RUN $ARIBA_BUILD_DIR/install_dependencies.sh $DEPS_DIR
 
 # Need MPLBACKEND="agg" to make matplotlib work without X11, otherwise get the error
 # _tkinter.TclError: no display name and no $DISPLAY environment variable
-ENV ARIBA_BOWTIE2=$PWD/bowtie2-${BOWTIE2_VERSION}/bowtie2 ARIBA_CDHIT=cdhit-est MPLBACKEND="agg"
-ENV PATH=$PATH:$PWD/SPAdes-${SPADES_VERSION}-Linux/bin
-
-RUN ln -s -f /usr/bin/python3 /usr/local/bin/python
+ENV ARIBA_BOWTIE2=$DEPS_DIR/bowtie2/bowtie2 \
+    ARIBA_CDHIT=cdhit-est \
+    MPLBACKEND="agg" \
+    PATH=$PATH:$DEPS_DIR/SPAdes/bin
 
 # Install Ariba
-RUN mkdir -p $ARIBA_BUILD_DIR
-COPY . $ARIBA_BUILD_DIR
 RUN cd $ARIBA_BUILD_DIR \
+  && python3 -m pip install -r requirements.txt \
   && python3 setup.py clean --all \
   && python3 setup.py test \
   && python3 setup.py install \


=====================================
README.md
=====================================
@@ -6,15 +6,12 @@ For how to use ARIBA, please see the [ARIBA wiki page][ARIBA wiki].
 
 PLEASE NOTE: we currently do not have the resources to provide support for Ariba - see the [Feedback/Issues](#feedbackissues) section.
 
-[![Unmaintained](http://unmaintained.tech/badge.svg)](http://unmaintained.tech/)  
-[![Build Status](https://travis-ci.org/sanger-pathogens/ariba.svg?branch=master)](https://travis-ci.org/sanger-pathogens/ariba)   
-[![License: GPL v3](https://img.shields.io/badge/License-GPL%20v3-brightgreen.svg)](https://github.com/sanger-pathogens/ariba/blob/master/LICENSE)   
-[![status](https://img.shields.io/badge/MGEN-10.1099%2Fmgen.0.000131-brightgreen.svg)](http://mgen.microbiologyresearch.org/content/journal/mgen/10.1099/mgen.0.000131)   
-[![install with bioconda](https://img.shields.io/badge/install%20with-bioconda-brightgreen.svg)](http://bioconda.github.io/recipes/ariba/README.html)  
-[![Container ready](https://img.shields.io/badge/container-ready-brightgreen.svg)](https://quay.io/repository/biocontainers/ariba)  
-[![Docker Build Status](https://img.shields.io/docker/build/sangerpathogens/ariba.svg)](https://hub.docker.com/r/sangerpathogens/ariba)  
-[![Docker Pulls](https://img.shields.io/docker/pulls/sangerpathogens/ariba.svg)](https://hub.docker.com/r/sangerpathogens/ariba)  
-[![codecov](https://codecov.io/gh/sanger-pathogens/ariba/branch/master/graph/badge.svg)](https://codecov.io/gh/sanger-pathogens/ariba)
+[![Unmaintained](http://unmaintained.tech/badge.svg)](http://unmaintained.tech/)
+[![Build Status](https://github.com/sanger-pathogens/ariba/actions/workflows/build.yaml/badge.svg?branch=master)](https://github.com/sanger-pathogens/ariba/actions/workflows/build.yaml)
+[![License: GPL v3](https://img.shields.io/badge/License-GPL%20v3-brightgreen.svg)](https://github.com/sanger-pathogens/ariba/blob/master/LICENSE)
+[![status](https://img.shields.io/badge/MGEN-10.1099%2Fmgen.0.000131-brightgreen.svg)](http://mgen.microbiologyresearch.org/content/journal/mgen/10.1099/mgen.0.000131)
+[![install with bioconda](https://img.shields.io/badge/install%20with-bioconda-brightgreen.svg)](http://bioconda.github.io/recipes/ariba/README.html)
+[![Container ready](https://img.shields.io/badge/container-ready-brightgreen.svg)](https://quay.io/repository/biocontainers/ariba)
 
 ## Contents
 * [Introduction](#introduction)
@@ -24,6 +21,7 @@ PLEASE NOTE: we currently do not have the resources to provide support for Ariba
   * [Using pip3](#using-pip3)
   * [From Source](#from-source)
   * [Docker](#docker)
+  * [Singularity](#singularity)
   * [Debian (testing)](#debian-testing)
   * [Ubuntu](#ubuntu)
   * [Dependencies and environment variables](#dependencies-and-environment-variables)
@@ -102,9 +100,13 @@ Alternatively, install directly from github using:
     pip3 install git+https://github.com/sanger-pathogens/ariba.git #--user
 
 ### Docker
-ARIBA can be run in a Docker container. First install Docker, then install ARIBA:
+ARIBA can be run in a Docker container. First install Docker, then install the latest
+version of ARIBA:
 
-    docker pull sangerpathogens/ariba
+    docker pull ghcr.io/sanger-pathogens/ariba:latest
+
+All Docker images are listed in the
+[packages page](https://github.com/sanger-pathogens/ariba/pkgs/container/ariba).
 
 To use ARIBA use a command like this (substituting in your directories), where your files are assumed to be stored in /home/ubuntu/data:
 
@@ -113,6 +115,19 @@ To use ARIBA use a command like this (substituting in your directories), where y
 When calling Ariba via Docker (as above) you'll also need to add **/data/** in front of all the passed in file or directory names (e.g. /data/my_output_folder).
 
 
+### Singularity
+
+ARIBA can be run in a Singularity container. First install Singularity.
+[Releases](https://github.com/sanger-pathogens/ariba/releases) include
+a Singularity image to download.
+
+Alternatively, build your own Singularity image:
+
+```
+singularity build ariba.simg Singularity.def
+```
+
+
 ### Debian (Ariba version may not be the latest)
 ARIBA is available in the latest version of Debian, and over time will progressively filter through to Ubuntu and other distributions which use Debian. To install it as root:
 


=====================================
Singularity.def
=====================================
@@ -0,0 +1,31 @@
+BootStrap: library
+From: ubuntu:20.04
+
+%environment
+    export PATH=/bioinf-tools/SPAdes/bin/:$PATH
+    export LANG=C.UTF-8
+    export ARIBA_BOWTIE2=/bioinf-tools/bowtie2/bowtie2
+    export ARIBA_CDHIT=cdhit-est
+    export MPLBACKEND="agg"
+
+
+%setup
+    mkdir $SINGULARITY_ROOTFS/ariba
+    rsync -a install_dependencies.sh MANIFEST.in LICENSE scripts ariba requirements.txt setup.py third_party $SINGULARITY_ROOTFS/ariba/
+
+
+%post
+    export PATH=/bioinf-tools/SPAdes/bin/:$PATH
+    export ARIBA_BOWTIE2=/bioinf-tools/bowtie2/bowtie2
+    export ARIBA_CDHIT=cdhit-est
+
+    /ariba/install_dependencies.sh /bioinf-tools
+    cd /ariba
+    python3 -m pip install -r requirements.txt
+    python3 setup.py clean --all
+    python3 setup.py test
+    python3 setup.py install
+
+
+%runscript
+    ariba "$@"


=====================================
ariba/external_progs.py
=====================================
@@ -33,7 +33,7 @@ prog_to_version_cmd = {
     'bowtie2': ('--version', re.compile('.*bowtie2.*version (.*)$')),
     'cdhit': ('', re.compile('CD-HIT version ([0-9\.]+) \(')),
     'nucmer': ('--version', re.compile('([0-9]+\.[0-9\.]+.*$)')),
-    'spades': ('--version', re.compile('SPAdes\s+v([0-9\.]+)'))
+    'spades': ('--version', re.compile('SPAdes.*v([0-9\.]+)'))
 }
 
 

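The loosened pattern in the hunk above accommodates newer SPAdes banners, which print something like "SPAdes genome assembler v3.13.1" rather than "SPAdes v3.13.1". A small check of the old versus new regex (banner strings are illustrative):

```python
import re

# Old and new version-detection patterns from the diff above.
old = re.compile(r'SPAdes\s+v([0-9\.]+)')
new = re.compile(r'SPAdes.*v([0-9\.]+)')

banner_old = 'SPAdes v3.9.0'                     # older output style
banner_new = 'SPAdes genome assembler v3.13.1'   # newer output style

print(new.search(banner_old).group(1))   # prints: 3.9.0
print(new.search(banner_new).group(1))   # prints: 3.13.1
print(old.search(banner_new))            # prints: None (old pattern misses it)
```

The `.*` makes the match tolerant of any words between the program name and the version token, while still capturing the dotted version number.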

=====================================
ariba/mlst_profile.py
=====================================
@@ -8,7 +8,7 @@ class MlstProfile:
     def __init__(self, infile, duplicate_warnings=True):
         self.infile = infile
         self.duplicate_warnings = duplicate_warnings
-        self.columns_to_ignore = ['clonal_complex', 'CC', 'Lineage', 'mlst_clade', 'species']
+        self.columns_to_ignore = ['clonal_complex', 'CC', 'Lineage', 'mlst_clade', 'species', 'comments']
 
         if not os.path.exists(self.infile):
             raise Error('Error! Input file "' + self.infile + '" not found.')


=====================================
ariba/ref_genes_getter.py
=====================================
@@ -611,13 +611,7 @@ class RefGenesGetter:
             #up to here
         if len(acc_list) > 0:
             print(f"E-fetching {len(acc_list)} genbank records from BioProject {BIOPROJECT} and writing to.  This may take a while.", file=sys.stderr)
-            records = Entrez.efetch(db="nucleotide",
-                                               rettype="gbwithparts", retmode="text",
-                                               retstart=0, retmax=RETMAX,
-                                               webenv=webenv, query_key=query_key,
-                                               idtype="acc")
             #pull out the records as fasta from the genbank
-            from Bio.Alphabet import generic_dna
             from Bio import SeqIO
             from Bio.Seq import Seq
             from Bio.SeqRecord import SeqRecord
@@ -625,8 +619,16 @@ class RefGenesGetter:
             print(f"Parsing genbank records.")
             with open(final_fasta, "w") as f_out_fa, \
                  open(final_tsv, "w") as f_out_tsv:
-                for idx, gb_record in enumerate(SeqIO.parse(records, "genbank")):
-                    print(f"'{gb_record.id}'")
+                for idx, record_id in enumerate(acc_list):
+                    print(f"'{record_id}'")
+                    if record_id.startswith('AP'):
+                        print('skipping')
+                        continue
+                    record = Entrez.efetch(db="nucleotide",
+                                                rettype="gbwithparts", retmode="text",
+                                                id=record_id)
+                    gb_record = SeqIO.read(record, "genbank")
+
                     n=0
                     record_new=[]
                     for index, feature in enumerate(gb_record.features):
@@ -645,7 +647,7 @@ class RefGenesGetter:
                                 except KeyError:
                                     print(f"gb_feature.qualifer not found", file=sys.stderr)
                             accession = gb_record.id
-                            seq_out = Seq(str(gb_feature.extract(gb_record.seq)), generic_dna)
+                            seq_out = Seq(str(gb_feature.extract(gb_record.seq)))
                             record_new.append(SeqRecord(seq_out,
                                          id=f"{id[0]}.{accession}",
                                          description=""))


=====================================
ariba/samtools_variants.py
=====================================
@@ -1,6 +1,7 @@
 import os
 import sys
 import pysam
+import pysam.bcftools
 import pyfastaq
 import vcfcall_ariba
 
@@ -36,13 +37,11 @@ class SamtoolsVariants:
 
         tmp_vcf = self.vcf_file + '.tmp'
         with open(tmp_vcf, 'w') as f:
-            print(pysam.mpileup(
-                '-t', 'INFO/AD,INFO/ADF,INFO/ADR',
+            print(pysam.bcftools.mpileup(
+                '-a', 'INFO/AD,INFO/ADF,INFO/ADR',
                 '-L', '99999999',
                 '-A',
                 '-f', self.ref_fa,
-                '-u',
-                '-v',
                 self.bam,
             ), end='', file=f)
 


=====================================
ariba/tests/data/samtools_variants_make_vcf_and_depths_files.expect.depths.gz
=====================================
Binary files a/ariba/tests/data/samtools_variants_make_vcf_and_depths_files.expect.depths.gz and b/ariba/tests/data/samtools_variants_make_vcf_and_depths_files.expect.depths.gz differ


=====================================
ariba/tests/data/samtools_variants_make_vcf_and_depths_files.expect.vcf
=====================================
@@ -1,4 +1,4 @@
-ref1	121	.	C	A,T,<*>	0	.	DP=142;ADF=72,16,18,0;ADR=22,8,1,0;AD=94,24,19,0;I16=72,22,34,9,3752,149792,1660,65086,3948,165816,1300,42424,1488,30666,682,13620;QS=0.748454,0.135248,0.116298,0;VDB=0.419877;SGB=-0.693146;RPB=0.693393;MQB=2.09583e-09;MQSB=0.677623;BQB=0.54095;MQ0F=0	PL	78,0,255,65,197,255,255,255,255,255
-ref1	149	.	AC	ACC	0	.	INDEL;IDV=162;IMF=0.910112;DP=178;ADF=0,96;ADR=0,79;AD=0,175;I16=0,0,96,79,0,0,6916,273882,0,0,5838,208760,0,0,2791,56033;QS=0,1;VDB=0.470182;SGB=-0.693147;MQSB=0.999996;MQ0F=0	PL	255,255,0
-ref1	201	.	C	G,<*>	0	.	DP=190;ADF=0,80,0;ADR=0,94,0;AD=0,174,0;I16=0,0,80,94,0,0,6796,267896,0,0,6554,257056,0,0,2900,58094;QS=0,1,0;VDB=0.437278;SGB=-0.693147;MQSB=0.849709;MQ0F=0	PL	255,255,0,255,255,255
-ref2	170	.	A	T,<*>	0	.	DP=117;ADF=0,52,0;ADR=0,59,0;AD=0,111,0;I16=0,0,52,59,0,0,4390,174306,0,0,4662,195804,0,0,1824,36678;QS=0,1,0;VDB=0.30232;SGB=-0.693147;MQSB=1;MQ0F=0	PL	255,255,0,255,255,255
+ref1	121	.	C	A,T,<*>	0	.	DP=142;ADF=72,16,20,0;ADR=22,8,2,0;AD=94,24,22,0;I16=72,22,36,10,3752,149792,1684,65278,3948,165816,1408,46528,1488,30666,685,13623;QS=0.744888,0.134604,0.120508,0;VDB=0.740082;SGB=-0.693147;RPBZ=-0.623486;MQBZ=-8.79322;MQSBZ=-1.56952;BQBZ=-3.61356;SCBZ=0;MQ0F=0	PL	77,0,255,51,195,255,255,255,255,255
+ref1	149	.	AC	ACC	0	.	INDEL;IDV=162;IMF=0.910112;DP=178;ADF=6,91;ADR=2,71;AD=8,162;I16=6,2,91,71,320,12800,6480,259200,296,11478,5382,191916,12,22,2780,56012;QS=0.0254111,0.974589;VDB=0.0591298;SGB=-0.693147;RPBZ=1.64808;MQBZ=0.803052;MQSBZ=-1.56952;BQBZ=-3.61356;SCBZ=0;MQ0F=0	PL	255,255,0
+ref1	201	.	C	G,<*>	0	.	DP=190;ADF=0,84,0;ADR=0,99,0;AD=0,183,0;I16=0,0,84,99,0,0,6868,268472,0,0,6842,266992,0,0,2911,58109;QS=0,1,0;VDB=0.708247;SGB=-0.693147;MQSBZ=0.614759;MQ0F=0	PL	255,255,0,255,255,255
+ref2	170	.	A	T,<*>	0	.	DP=117;ADF=0,55,0;ADR=0,62,0;AD=0,117,0;I16=0,0,55,62,0,0,4680,187200,0,0,4914,206388,0,0,1827,36681;QS=0,1,0;VDB=0.655552;SGB=-0.693147;MQSBZ=0;MQ0F=0	PL	255,255,0,255,255,255


=====================================
ariba/tests/samtools_variants_test.py
=====================================
@@ -48,7 +48,9 @@ class TestSamtoolsVariants(unittest.TestCase):
             got = got_lines[i].split('\t')
             self.assertEqual(len(expected), len(got))
             self.assertEqual(expected[:7], got[:7])
-            self.assertEqual(expected[-2:], got[-2:])
+            # last two cols are PL and then values. These are not used, and
+            # can change slightly between samtools/bcftools versions. Ignore.
+            #self.assertEqual(expected[-2:], got[-2:])
             exp_set = set(expected[7].split(';'))
             got_set = set(got[7].split(';'))
             self.assertEqual(exp_set, got_set)

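The adjusted test above compares VCF lines column-wise: the first seven columns verbatim, the semicolon-separated INFO field as an order-insensitive set, and the trailing PL columns ignored. That comparison logic can be sketched on its own (the sample lines below are made up, not real test data):

```python
def vcf_lines_equivalent(expected_line, got_line):
    """Compare two VCF body lines the way the patched test does:
    first seven columns must match exactly, INFO is treated as an
    unordered set of entries, and the last two (PL) columns are
    ignored because they vary between samtools/bcftools versions."""
    expected = expected_line.split('\t')
    got = got_line.split('\t')
    if len(expected) != len(got) or expected[:7] != got[:7]:
        return False
    return set(expected[7].split(';')) == set(got[7].split(';'))

# Hypothetical lines differing only in INFO entry order and PL values:
a = 'ref1\t121\t.\tC\tA\t0\t.\tDP=142;MQ0F=0\tPL\t78,0,255'
b = 'ref1\t121\t.\tC\tA\t0\t.\tMQ0F=0;DP=142\tPL\t99,0,255'
print(vcf_lines_equivalent(a, b))  # prints: True
```

Treating INFO as a set avoids spurious failures when a newer bcftools emits the same annotations in a different order.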

=====================================
debian/changelog
=====================================
@@ -1,3 +1,15 @@
+ariba (2.14.7+ds-1) unstable; urgency=medium
+
+  * New upstream version 2.14.7+ds
+  * port-to-pytest.patch: remove: applied upstream.
+  * mpileup-1.16.patch: remove: applied upstream.
+  * add-testdata.patch: refresh.
+  * adjust-bowtie2-test.patch: remove: fixed upstream.
+  * skip-cluster-test.patch: remove: fixed upstream.
+  * disable-tests-with-internet-access.patch: forwarding not-needed.
+
+ -- Étienne Mollier <emollier@debian.org>  Sat, 14 Oct 2023 11:24:46 +0200
+
 ariba (2.14.6+ds-5) unstable; urgency=medium
 
   * d/salsa-ci.yml: disable i386 run due to missing B-D.


=====================================
debian/patches/add-testdata.patch
=====================================
@@ -1,9 +1,9 @@
 Description: include testdata in package
 Author: Sascha Steinbiss <satta@debian.org>
 Last-Update: 2019-09-09
---- a/setup.py
-+++ b/setup.py
-@@ -57,7 +57,43 @@
+--- ariba.orig/setup.py
++++ ariba/setup.py
+@@ -61,7 +61,43 @@
      version='2.14.6',
      description='ARIBA: Antibiotic Resistance Identification By Assembly',
      packages = find_packages(),
@@ -46,5 +46,5 @@ Last-Update: 2019-09-09
 +        'tests/data/cluster_test_init_no_reads_1/*',
 +        'tests/data/cluster_full_run_smtls_snp_presabs_gene/*']},
      author='Martin Hunt',
-     author_email='ariba-help@sanger.ac.uk',
      url='https://github.com/sanger-pathogens/ariba',
+     scripts=glob.glob('scripts/*'),


=====================================
debian/patches/adjust-bowtie2-test.patch deleted
=====================================
@@ -1,35 +0,0 @@
-Description: skip tests failing with new bowtie2
- Some of the test data seem to need 
-Author: Étienne Mollier <emollier@debian.org>
-Bug: https://github.com/sanger-pathogens/ariba/issues/329
-Bug-Debian: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1021675#23
-Last-Update: 2022-12-14
----
-This patch header follows DEP-3: http://dep.debian.net/deps/dep3/
---- ariba.orig/ariba/tests/mapping_test.py
-+++ ariba/ariba/tests/mapping_test.py
-@@ -2,6 +2,7 @@
- import os
- import pysam
- import pyfastaq
-+import pytest
- from ariba import mapping, external_progs
- 
- modules_dir = os.path.dirname(os.path.abspath(mapping.__file__))
-@@ -41,7 +42,7 @@
- 
-         os.unlink(tmp_ref)
- 
--
-+    @pytest.mark.skip("Test data needs being updated.")
-     def test_run_bowtie2(self):
-         '''Test run_bowtie2 unsorted'''
-         self.maxDiff = None
-@@ -85,6 +86,7 @@
-         os.unlink(out_prefix + '.bam')
- 
- 
-+    @pytest.mark.skip("Test data needs being updated.")
-     def test_run_bowtie2_and_sort(self):
-         '''Test run_bowtie2 sorted'''
-         ref = os.path.join(data_dir, 'mapping_test_bowtie2_ref.fa')


=====================================
debian/patches/disable-tests-with-internet-access.patch
=====================================
@@ -1,6 +1,7 @@
 Description: disable tests that need Internet access
 Author: Sascha Steinbiss <satta@debian.org>
 Last-Update: 2019-09-09
+Forwarded: not-needed
 --- a/ariba/tests/ncbi_getter_test.py
 +++ b/ariba/tests/ncbi_getter_test.py
 @@ -3,16 +3,16 @@


=====================================
debian/patches/mpileup-1.16.patch deleted
=====================================
@@ -1,36 +0,0 @@
-Description: move to bcftools 1.16 mpileup
- The original mpileup command from samtools has been deprecated and removed
- starting with version 1.16.  The command has been moved to bcftools 1.16.
- This patch adjusts the pysam call to reach for bcftools appropriately instead
- of the plain samtools wrapper.
-Author: Étienne Mollier <emollier@debian.org>
-Bug: https://github.com/sanger-pathogens/ariba/issues/327
-Bug-Debian: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1022508
-Last-Update: 2022-11-28
----
-This patch header follows DEP-3: http://dep.debian.net/deps/dep3/
---- ariba.orig/ariba/samtools_variants.py
-+++ ariba/ariba/samtools_variants.py
-@@ -1,6 +1,7 @@
- import os
- import sys
- import pysam
-+import pysam.bcftools
- import pyfastaq
- import vcfcall_ariba
- 
-@@ -36,13 +37,11 @@
- 
-         tmp_vcf = self.vcf_file + '.tmp'
-         with open(tmp_vcf, 'w') as f:
--            print(pysam.mpileup(
-+            print(pysam.bcftools.mpileup(
-                 '-t', 'INFO/AD,INFO/ADF,INFO/ADR',
-                 '-L', '99999999',
-                 '-A',
-                 '-f', self.ref_fa,
--                '-u',
--                '-v',
-                 self.bam,
-             ), end='', file=f)
- 


=====================================
debian/patches/port-to-pytest.patch deleted
=====================================
@@ -1,21 +0,0 @@
-Description: Switch from nose -> pytest as the former is deprecated
-Author: Nilesh Patra <nilesh@debian.org>
-Forwarded: https://github.com/sanger-pathogens/ariba/pull/318
-Last-Update: December 19, 2021
---- /dev/null
-+++ b/pytest.ini
-@@ -0,0 +1,2 @@
-+[pytest]
-+addopts = --ignore=env
---- a/setup.py
-+++ b/setup.py
-@@ -98,8 +98,7 @@
-    author_email='ariba-help@sanger.ac.uk',
-     url='https://github.com/sanger-pathogens/ariba',
-     scripts=glob.glob('scripts/*'),
--    test_suite='nose.collector',
--    tests_require=['nose >= 1.3'],
-+    tests_require=['pytest'],
-     install_requires=[
-         'BeautifulSoup4 >= 4.1.0',
-         'biopython',


=====================================
debian/patches/series
=====================================
@@ -4,7 +4,3 @@ add-testdata.patch
 disable-tests-with-internet-access.patch
 support-pymummer-0.11.patch
 run-debian-spades-wrapper.patch
-port-to-pytest.patch
-mpileup-1.16.patch
-skip-cluster-test.patch
-adjust-bowtie2-test.patch


=====================================
debian/patches/skip-cluster-test.patch deleted
=====================================
@@ -1,55 +0,0 @@
-Description: skip tests using dataset from older samtools mpileup
- This will need to be reverted once new upstream versions support htslib /
- samtools / bcftools 1.16.
-Author: Étienne Mollier <emollier@debian.org>
-Bug: https://github.com/sanger-pathogens/ariba/issues/327
-Bug-Debian: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1022508
-Forwarded: not-needed
-Last-Update: 2022-11-28
----
-This patch header follows DEP-3: http://dep.debian.net/deps/dep3/
---- ariba.orig/ariba/tests/cluster_test.py
-+++ ariba/ariba/tests/cluster_test.py
-@@ -2,6 +2,7 @@
- import os
- import shutil
- import filecmp
-+import pytest
- from ariba import cluster, common, reference_data
- 
- modules_dir = os.path.dirname(os.path.abspath(cluster.__file__))
-@@ -28,6 +29,7 @@
-                 os.unlink(full_path)
- 
- 
-+ at pytest.mark.skip(reason="test crafted for older samtools versions.")
- class TestCluster(unittest.TestCase):
-     def test_init_fail_files_missing(self):
-         '''test init_fail_files_missing'''
---- ariba.orig/ariba/tests/samtools_variants_test.py
-+++ ariba/ariba/tests/samtools_variants_test.py
-@@ -2,6 +2,7 @@
- import os
- import filecmp
- import pyfastaq
-+import pytest
- from ariba import samtools_variants
- 
- modules_dir = os.path.dirname(os.path.abspath(samtools_variants.__file__))
-@@ -16,6 +17,7 @@
- 
- 
- class TestSamtoolsVariants(unittest.TestCase):
-+    @pytest.mark.skip("since samtools update, one index seems off by two")
-     def test_make_vcf_and_depths_files(self):
-         '''test _make_vcf_and_read_depths_files'''
-         ref = os.path.join(data_dir, 'samtools_variants_make_vcf_and_depths_files.asmbly.fa')
-@@ -146,7 +148,7 @@
-         got = samtools_variants.SamtoolsVariants.variants_in_coords(nucmer_hits, vcf_file)
-         self.assertEqual(expected, got)
- 
--
-+    @pytest.mark.skip("refs seem to have been reorganised")
-     def test_get_depths_at_position(self):
-         '''test get_depths_at_position'''
-         bam = os.path.join(data_dir, 'samtools_variants_test_get_depths_at_position.bam')


=====================================
install_dependencies.sh
=====================================
@@ -1,73 +1,56 @@
-#!/bin/bash
-set -e
-set -x
+#!/usr/bin/env bash
+set -vexu
 
-start_dir=$(pwd)
+install_root=$1
 
 BOWTIE2_VERSION=2.3.1
-CDHIT_VERSION=4.6.5
-MUMMER_VERSION=3.23
-
-BOWTIE2_DOWNLOAD_URL="http://downloads.sourceforge.net/project/bowtie-bio/bowtie2/${BOWTIE2_VERSION}/bowtie2-${BOWTIE2_VERSION}-legacy-linux-x86_64.zip"
-CDHIT_DOWNLOAD_URL="https://github.com/weizhongli/cdhit/archive/V${CDHIT_VERSION}.tar.gz"
-MUMMER_DOWNLOAD_URL="http://downloads.sourceforge.net/project/mummer/mummer/${MUMMER_VERSION}/MUMmer${MUMMER_VERSION}.tar.gz"
-
-
-# Make an install location
-if [ ! -d 'build' ]; then
-  mkdir build
+SPADES_VERSION=3.13.1
+
+apt-get update -qq
+apt-get install -y software-properties-common
+apt-add-repository universe
+apt-get update -qq
+
+apt-get install --no-install-recommends -y \
+  build-essential \
+  cd-hit \
+  curl \
+  git \
+  libcurl4-gnutls-dev \
+  libssl-dev \
+  libbz2-dev \
+  liblzma-dev \
+  mummer \
+  python3-dev \
+  python3-setuptools \
+  python3-pip \
+  python3-tk \
+  python3-matplotlib \
+  unzip \
+  wget \
+  zlib1g-dev
+
+ln -s -f /usr/bin/python3 /usr/local/bin/python
+
+if [ ! -d $install_root ]; then
+  mkdir $install_root
 fi
-cd build
-build_dir=$(pwd)
-
-# DOWNLOAD ALL THE THINGS
-download () {
-  url=$1
-  download_location=$2
-
-  if [ -e $download_location ]; then
-    echo "Skipping download of $url, $download_location already exists"
-  else
-    echo "Downloading $url to $download_location"
-    wget $url -O $download_location
-  fi
-}
+cd $install_root
 
+# pysam from pip because apt-get version is too old
+python3 -m pip install 'pysam>=0.21.0'
 
 # --------------- bowtie2 ------------------
-cd $build_dir
-download $BOWTIE2_DOWNLOAD_URL "bowtie2-${BOWTIE2_VERSION}-legacy.zip"
-bowtie2_dir="$build_dir/bowtie2-${BOWTIE2_VERSION}-legacy"
-unzip -n bowtie2-${BOWTIE2_VERSION}-legacy.zip
-
-
-# --------------- cdhit --------------------
-cd $build_dir
-download $CDHIT_DOWNLOAD_URL "cdhit-${CDHIT_VERSION}.tar.gz"
-tar -zxf cdhit-${CDHIT_VERSION}.tar.gz
-cdhit_dir="$build_dir/cdhit-${CDHIT_VERSION}"
-cd $cdhit_dir
-make
-
-
-# --------------- mummer ------------------
-cd $build_dir
-download $MUMMER_DOWNLOAD_URL "MUMmer${MUMMER_VERSION}.tar.gz"
-mummer_dir="$build_dir/MUMmer${MUMMER_VERSION}"
-tar -zxf MUMmer${MUMMER_VERSION}.tar.gz
-cd $mummer_dir
-make
-
-
-cd $start_dir
-
-update_path () {
-  new_dir=$1
-  if [[ ! "$PATH" =~ (^|:)"${new_dir}"(:|$) ]]; then
-    export PATH=${new_dir}:${PATH}
-  fi
-}
-
-update_path ${bowtie2_dir}
-update_path ${cdhit_dir}
-update_path ${mummer_dir}
+cd $install_root
+wget -q http://downloads.sourceforge.net/project/bowtie-bio/bowtie2/${BOWTIE2_VERSION}/bowtie2-${BOWTIE2_VERSION}-legacy-linux-x86_64.zip
+unzip -n bowtie2-${BOWTIE2_VERSION}-legacy-linux-x86_64.zip
+rm bowtie2-${BOWTIE2_VERSION}-legacy-linux-x86_64.zip
+mv bowtie2-${BOWTIE2_VERSION}-legacy bowtie2
+
+
+# --------------- spades -------------------
+cd $install_root
+wget -q https://github.com/ablab/spades/releases/download/v${SPADES_VERSION}/SPAdes-${SPADES_VERSION}-Linux.tar.gz
+tar xf SPAdes-${SPADES_VERSION}-Linux.tar.gz
+rm SPAdes-${SPADES_VERSION}-Linux.tar.gz
+mv SPAdes-${SPADES_VERSION}-Linux SPAdes


=====================================
requirements.txt
=====================================
@@ -0,0 +1,7 @@
+BeautifulSoup4 >= 4.1.0
+biopython < 1.78
+dendropy >= 4.2.0
+pyfastaq >= 3.12.0
+pysam >= 0.21.0
+pymummer <= 0.10.3
+matplotlib >= 3.1.0


=====================================
setup.py
=====================================
@@ -52,6 +52,10 @@ vcfcall_mod = Extension(
     [os.path.join('ariba', 'ext', 'vcfcall_ariba.cpp')],
 )
 
+
+with open("requirements.txt") as f:
+    install_requires = [x.rstrip() for x in f]
+
 setup(
     ext_modules=[minimap_mod, fermilite_mod, vcfcall_mod],
     name='ariba',
@@ -60,20 +64,11 @@ setup(
     packages = find_packages(),
     package_data={'ariba': ['test_run_data/*', 'tb_data/*']},
     author='Martin Hunt',
-    author_email='ariba-help@sanger.ac.uk',
     url='https://github.com/sanger-pathogens/ariba',
     scripts=glob.glob('scripts/*'),
     test_suite='nose.collector',
     tests_require=['nose >= 1.3'],
-    install_requires=[
-        'BeautifulSoup4 >= 4.1.0',
-        'biopython',
-        'dendropy >= 4.2.0',
-        'pyfastaq >= 3.12.0',
-        'pysam >= 0.9.1',
-        'pymummer<=0.10.3',
-        'matplotlib >= 3.1.0',
-    ],
+    install_requires=install_requires,
     license='GPLv3',
     classifiers=[
         'Development Status :: 4 - Beta',


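The new setup.py hunk above derives install_requires directly from requirements.txt with a one-line comprehension. A minimal reproduction of that pattern, with an added guard against blank lines and comments that the one-liner itself does not have (the sample content is illustrative):

```python
import io

def read_requirements(fileobj):
    """Parse a requirements.txt-style stream into a list of
    requirement strings, skipping blank lines and '#' comments."""
    reqs = []
    for line in fileobj:
        line = line.strip()
        if line and not line.startswith('#'):
            reqs.append(line)
    return reqs

sample = io.StringIO("""\
# core dependencies
BeautifulSoup4 >= 4.1.0
biopython < 1.78

pysam >= 0.21.0
""")
print(read_requirements(sample))
# prints: ['BeautifulSoup4 >= 4.1.0', 'biopython < 1.78', 'pysam >= 0.21.0']
```

Keeping the pinned versions in one file lets the Dockerfile, Singularity recipe, and setup.py all install the same dependency set.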

View it on GitLab: https://salsa.debian.org/med-team/ariba/-/compare/48edc5431d070d5c312978f0188d1e5357e2a0f5...ecb9b56fa137953bd658c15dfb2ee9622d2dc8a8
