[med-svn] [Git][med-team/ataqv][master] 5 commits: New upstream version 1.3.0+ds
Andreas Tille (@tille)
gitlab@salsa.debian.org
Sun Jan 16 07:21:25 GMT 2022
Andreas Tille pushed to branch master at Debian Med / ataqv
Commits:
b740d52e by Andreas Tille at 2022-01-16T08:12:21+01:00
New upstream version 1.3.0+ds
- - - - -
564c1e1f by Andreas Tille at 2022-01-16T08:12:21+01:00
routine-update: New upstream version
- - - - -
592d2b8c by Andreas Tille at 2022-01-16T08:12:25+01:00
Update upstream source from tag 'upstream/1.3.0+ds'
Update to upstream version '1.3.0+ds'
with Debian dir 1fcd03a74ae95294d479bd086a2014dc58320185
- - - - -
c8fd655d by Andreas Tille at 2022-01-16T08:12:34+01:00
Add missing build dependency on dh addon.
Changes-By: lintian-brush
Fixes: lintian: missing-build-dependency-for-dh_-command
See-also: https://lintian.debian.org/tags/missing-build-dependency-for-dh_-command.html
- - - - -
1529c1c0 by Andreas Tille at 2022-01-16T08:20:34+01:00
routine-update: Ready to upload to unstable
- - - - -
9 changed files:
- Makefile
- README.rst
- debian/changelog
- debian/control
- src/cpp/Metrics.cpp
- src/cpp/Metrics.hpp
- src/cpp/Version.hpp
- src/cpp/ataqv.cpp
- src/cpp/test_metrics.cpp
Changes:
=====================================
Makefile
=====================================
@@ -2,7 +2,7 @@
# VARIABLES
#
-VERSION = 1.2.1
+VERSION = 1.3.0
#
# PATHS
@@ -143,7 +143,7 @@ dist-static: checkdirs $(BUILD_DIR)/ataqv-static
deb:
(cd .. && tar czf ataqv_$(VERSION).orig.tar.gz --exclude .git --exclude build ataqv)
- debuild -uc -us
+ debuild -uc -us -d
deb-static: deb static
install -m 0755 $(BUILD_DIR)/ataqv-static debian/ataqv/usr/bin/ataqv
=====================================
README.rst
=====================================
@@ -108,7 +108,7 @@ installed via `Homebrew`_ or `Linuxbrew`_.
On Debian-based Linux distributions, you can install dependencies
with::
- sudo apt install libboost-all-dev libhts-dev libncurses5-dev libtinfo-dev zlib1g-dev lcov
+ sudo apt install libboost-all-dev libhts-dev ncurses-dev libtinfo-dev zlib1g-dev lcov
and the latest supported option among::
@@ -262,6 +262,67 @@ The main program is ataqv, which is run as follows::
derived from the read group IDs, with ".problems" appended. If no read groups
are found, the reads will be written to one file named after the BAM file.
+ --less-redundant
+ If given, output a subset of metrics that should be less redundant. If this flag is used,
+ the same flag should be passed to mkarv when making the viewer.
+
+ Metadata
+ --------
+
+ The following options provide metadata to be included in the metrics JSON file.
+ They make it easier to compare results in the ataqv web interface.
+
+ --name "name"
+ A label to be used for the metrics when there are no read groups. If there are read
+ groups, each will have its metrics named using its ID field. With no read groups and
+ no --name given, your metrics will be named after the alignment file.
+
+ --ignore-read-groups
+ Even if read groups are present in the BAM file, ignore them and combine metrics
+ for all reads under a single sample and library named with the --name option. This
+ also implies that a single peak file will be used for all reads; see the --peak option.
+
+ --nucleus-barcode-tag "nucleus_barcode_tag"
+ Data is single-nucleus, with the barcode stored in this BAM tag.
+ In this case, metrics will be collected per barcode.
+
+ --description "description"
+ A short description of the experiment.
+
+ --url "URL"
+ A URL for more detail on the experiment (perhaps using a DOI).
+
+ --library-description "description"
+ Use this description for all libraries in the BAM file, instead of using the DS
+ field from each read group.
+
+ Reference Genome Configuration
+ ------------------------------
+
+ ataqv includes lists of autosomes for several organisms:
+
+   Organism   Autosomal References
+   --------   --------------------
+   fly        2R 2L 3R 3L 4
+   human      1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22
+   mouse      1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19
+   rat        1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
+   worm       I II III IV V
+   yeast      I II III IV V VI VII VIII IX X XI XII XIII XIV XV XVI
+
+ The default autosomal reference lists contain names with "chr" prefixes
+ ("chr1") and without ("1"). If you need a different set of autosomes, you can
+ supply a list with --autosomal-reference-file.
+
+ --autosomal-reference-file "file name"
+ A file containing autosomal reference names, one per line. The names must match
+ the reference names in the alignment file exactly, or the metrics based on counts
+ of autosomal alignments will be wrong.
+
+ --mitochondrial-reference-name "name"
+ If the reference name for mitochondrial DNA in your alignment file is not "chrM",
+ use this option to supply the correct name. Again, if this name is wrong, all the
+ measurements involving mitochondrial alignments will be wrong.
When run, ataqv prints a human-readable summary to its standard
output, and writes complete metrics to the JSON file named with the
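
As a usage illustration of the options documented in the new README text above, a single-nucleus run with a custom autosome list might look roughly like the following. This is a sketch only: the barcode tag "CB", the file names, and the positional arguments "human sample1.bam" are placeholders inferred from this changeset, not copied from the full usage text::

    ataqv --nucleus-barcode-tag CB \
          --name sample1 \
          --autosomal-reference-file autosomes.txt \
          --mitochondrial-reference-name MT \
          human sample1.bam
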
=====================================
debian/changelog
=====================================
@@ -1,3 +1,11 @@
+ataqv (1.3.0+ds-1) unstable; urgency=medium
+
+ * Team upload.
+ * New upstream version
+ * Add missing build dependency on dh addon.
+
+ -- Andreas Tille <tille@debian.org>  Sun, 16 Jan 2022 08:12:45 +0100
+
ataqv (1.2.1+ds-2) unstable; urgency=medium
* Team upload.
=====================================
debian/control
=====================================
@@ -19,7 +19,8 @@ Build-Depends: debhelper-compat (= 13),
node-normalize.css,
help2man,
python3,
- node-d3
+ node-d3,
+ debhelper
Standards-Version: 4.6.0
Vcs-Browser: https://salsa.debian.org/med-team/ataqv
Vcs-Git: https://salsa.debian.org/med-team/ataqv.git
=====================================
src/cpp/Metrics.cpp
=====================================
@@ -24,6 +24,7 @@
MetricsCollector::MetricsCollector(const std::string& name,
const std::string& organism,
+ const std::string& nucleus_barcode_tag,
const std::string& description,
const std::string& library_description,
const std::string& url,
@@ -36,12 +37,14 @@ MetricsCollector::MetricsCollector(const std::string& name,
bool verbose,
const int thread_limit,
bool ignore_read_groups,
+ bool is_single_nucleus,
bool log_problematic_reads,
bool less_redundant,
const std::vector<std::string>& excluded_region_filenames) :
metrics({}),
name(name),
organism(organism),
+ nucleus_barcode_tag(nucleus_barcode_tag),
description(description),
library_description(library_description),
url(url),
@@ -54,6 +57,7 @@ MetricsCollector::MetricsCollector(const std::string& name,
verbose(verbose),
thread_limit(thread_limit),
ignore_read_groups(ignore_read_groups),
+ is_single_nucleus(is_single_nucleus),
log_problematic_reads(log_problematic_reads),
less_redundant(less_redundant),
excluded_region_filenames(excluded_region_filenames)
@@ -80,7 +84,8 @@ std::string MetricsCollector::configuration_string() const {
cs
<< "Thread limit: " << thread_limit << std::endl
- << "Ignoring read groups: " << (ignore_read_groups ? "yes" : "no") << std::endl;
+ << "Ignoring read groups: " << (ignore_read_groups ? "yes" : "no") << std::endl
+ << "Is single nucleus: " << (is_single_nucleus ? "yes" : "no") << std::endl;
if (!tss_filename.empty()) {
cs << "TSS extension: " << tss_extension << std::endl;
@@ -320,7 +325,7 @@ void MetricsCollector::load_alignments() {
try {
sam_header header = parse_sam_header(alignment_file_header->text);
- if (!ignore_read_groups && header.count("RG") > 0) {
+ if (!ignore_read_groups && header.count("RG") > 0 && !is_single_nucleus) {
for (auto read_group : header["RG"]) {
std::string read_group_id = read_group["ID"];
metrics[read_group_id] = new Metrics(this, read_group_id);
@@ -341,7 +346,7 @@ void MetricsCollector::load_alignments() {
metrics[read_group_id]->library = library;
}
- } else {
+ } else if (ignore_read_groups && !is_single_nucleus) {
metrics[default_metrics_id] = new Metrics(this, default_metrics_id);
Library library;
@@ -361,22 +366,42 @@ void MetricsCollector::load_alignments() {
Metrics* m;
uint8_t* rgaux = bam_aux_get(record, "RG");
- if (!ignore_read_groups && rgaux) {
- std::string read_group_id = bam_aux2Z(rgaux);
-
- // It can happen that records have RG tags that don't
- // exist in the file header. If we're not ignoring
- // read groups altogether, create new Metrics
- // instances for these rapscallions.
- try {
- m = metrics.at(read_group_id);
- } catch (std::out_of_range&) {
- std::cout << "Adding metrics for read group missing from file header: " << read_group_id << std::endl;
- metrics[read_group_id] = new Metrics(this, read_group_id);
- m = metrics[read_group_id];
- }
+ uint8_t* bcaux = bam_aux_get(record, nucleus_barcode_tag.c_str());
+ std::string barcode = bcaux ? bam_aux2Z(bcaux) : "no_barcode";
+ std::string read_group_id = rgaux ? bam_aux2Z(rgaux) : default_metrics_id;
+ std::string metrics_id;
+
+ if (!ignore_read_groups && is_single_nucleus) {
+ metrics_id = read_group_id + "-" + barcode;
+ } else if (ignore_read_groups && is_single_nucleus) {
+ metrics_id = barcode;
+ } else if (!ignore_read_groups && !is_single_nucleus) {
+ metrics_id = read_group_id;
} else {
- m = metrics[default_metrics_id];
+ metrics_id = default_metrics_id;
+ }
+
+ // If running in single nucleus mode, barcodes
+ // are unknown ahead of time and Metrics must be created
+ // as new barcodes are encountered
+ //
+ // If not running in single nucleus mode,
+ // it can happen that records have RG tags that don't
+ // exist in the file header. If we're not ignoring
+ // read groups altogether, create new Metrics
+ // instances for these rapscallions.
+ try {
+ m = metrics.at(metrics_id);
+ } catch (std::out_of_range&) {
+ if (!ignore_read_groups && !is_single_nucleus) {
+ std::cout << "Adding metrics for read group missing from file header: " << metrics_id << std::endl;
+ } else if (!ignore_read_groups && is_single_nucleus) {
+ std::cout << "Adding metrics for read group and barcode: " << read_group_id << ", " << barcode << std::endl;
+ } else if (ignore_read_groups && is_single_nucleus) {
+ std::cout << "Adding metrics for barcode: " << barcode << std::endl;
+ }
+ metrics[metrics_id] = new Metrics(this, metrics_id);
+ m = metrics[metrics_id];
}
m->add_alignment(alignment_file_header, record);
@@ -989,11 +1014,21 @@ std::map<std::string,std::map<int, unsigned long long int>> MetricsCollector::ge
fragments_seen[qname] = true;
if (fragment.overlaps(tss_region)) {
- std::string metrics_id = name.empty() ? basename(alignment_filename) : name;
- if (!ignore_read_groups) {
- uint8_t* rgaux = bam_aux_get(record, "RG");
- metrics_id = bam_aux2Z(rgaux);
+ std::string default_metrics_id = name.empty() ? basename(alignment_filename) : name;
+ uint8_t* rgaux = bam_aux_get(record, "RG");
+ uint8_t* bcaux = bam_aux_get(record, nucleus_barcode_tag.c_str());
+ std::string barcode = bcaux ? bam_aux2Z(bcaux) : "no_barcode";
+ std::string read_group_id = rgaux ? bam_aux2Z(rgaux) : default_metrics_id;
+ std::string metrics_id;
+
+ if (!ignore_read_groups && is_single_nucleus) {
+ metrics_id = read_group_id + "-" + barcode;
+ } else if (ignore_read_groups && is_single_nucleus) {
+ metrics_id = barcode;
+ } else {
+ metrics_id = read_group_id;
}
+
for (unsigned long long int pos = tss_region.start; pos <= tss_region.end; pos++) {
if (pos >= fragment.start && pos <= fragment.end) {
int base = tss.is_reverse() ? (tss_region.end - pos) : (pos - tss_region.start);
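
To summarize the branching added above: each record's metrics are keyed by read group, by nucleus barcode, or by both, depending on --ignore-read-groups and whether a barcode tag was supplied. A condensed, hypothetical restatement of that selection follows; the standalone function is invented for illustration and is not part of ataqv itself:

#include <string>

// Hypothetical helper mirroring the metrics-key logic in
// MetricsCollector::load_alignments(); not actual ataqv code.
std::string choose_metrics_id(bool ignore_read_groups,
                              bool is_single_nucleus,
                              const std::string& read_group_id,      // falls back to the default ID when no RG tag
                              const std::string& barcode,            // falls back to "no_barcode" when the tag is absent
                              const std::string& default_metrics_id) {
    if (!ignore_read_groups && is_single_nucleus) {
        return read_group_id + "-" + barcode;    // per read group and nucleus
    }
    if (ignore_read_groups && is_single_nucleus) {
        return barcode;                          // per nucleus only
    }
    if (!ignore_read_groups && !is_single_nucleus) {
        return read_group_id;                    // classic per-read-group metrics
    }
    return default_metrics_id;                   // everything pooled under one ID
}

New Metrics objects are then created lazily the first time a key is seen, which is why single-nucleus mode works without barcodes (or even read groups) being declared in the file header.
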
=====================================
src/cpp/Metrics.hpp
=====================================
@@ -43,6 +43,7 @@ public:
std::string name = "";
std::string organism = "";
+ std::string nucleus_barcode_tag = "";
std::string description = "";
std::string library_description = "";
std::string url = "";
@@ -65,6 +66,7 @@ public:
bool verbose = false;
int thread_limit = 1;
bool ignore_read_groups = false;
+ bool is_single_nucleus = false;
bool log_problematic_reads = false;
bool less_redundant = false;
@@ -73,6 +75,7 @@ public:
MetricsCollector(const std::string& name = "",
const std::string& organism = "human",
+ const std::string& nucleus_barcode_tag = "",
const std::string& description = "",
const std::string& library_description = "",
const std::string& url = "",
@@ -85,6 +88,7 @@ public:
bool verbose = false,
const int thread_limit = 1,
bool ignore_read_groups = false,
+ bool is_single_nucleus = false,
bool log_problematic_reads = false,
bool less_redundant = false,
const std::vector<std::string>& excluded_region_filenames = {});
=====================================
src/cpp/Version.hpp
=====================================
@@ -4,4 +4,4 @@
// Licensed under Version 3 of the GPL or any later version
//
-#define VERSION "1.2.1"
+#define VERSION "1.3.0"
=====================================
src/cpp/ataqv.cpp
=====================================
@@ -42,6 +42,7 @@ enum {
OPT_NAME,
OPT_IGNORE_READ_GROUPS,
+ OPT_NUCLEUS_BARCODE_TAG,
OPT_DESCRIPTION,
OPT_LIBRARY_DESCRIPTION,
OPT_URL,
@@ -114,8 +115,9 @@ void print_usage() {
<< " derived from the read group IDs, with \".problems\" appended. If no read groups" << std::endl
<< " are found, the reads will be written to one file named after the BAM file." << std::endl << std::endl
- << "--less-redundant" << std::endl
- << " If given, output a subset of metrics that should be less redundant. If this flag is used, the same flag should be passed to mkarv when making the viewer." << std::endl
+ << "--less-redundant" << std::endl
+ << " If given, output a subset of metrics that should be less redundant. If this flag is used," << std::endl
+ << " the same flag should be passed to mkarv when making the viewer." << std::endl
<< std::endl
@@ -135,6 +137,10 @@ void print_usage() {
<< " for all reads under a single sample and library named with the --name option. This" << std::endl
<< " also implies that a single peak file will be used for all reads; see the --peak option." << std::endl << std::endl
+ << "--nucleus-barcode-tag \"nucleus_barcode_tag\"" << std::endl
+ << " Data is single-nucleus, with the barcode stored in this BAM tag." << std::endl
+ << " In this case, metrics will be collected per barcode." << std::endl << std::endl
+
<< "--description \"description\"" << std::endl
<< " A short description of the experiment." << std::endl << std::endl
@@ -221,6 +227,8 @@ int main(int argc, char **argv) {
std::string name;
bool ignore_read_groups = false;
std::string description;
+ bool is_single_nucleus = false;
+ std::string nucleus_barcode_tag;
std::string library_description;
std::string url;
std::string organism;
@@ -246,6 +254,7 @@ int main(int argc, char **argv) {
{"less-redundant", no_argument, nullptr, OPT_LESS_REDUNDANT},
{"name", required_argument, nullptr, OPT_NAME},
{"ignore-read-groups", no_argument, nullptr, OPT_IGNORE_READ_GROUPS},
+ {"nucleus-barcode-tag", required_argument, nullptr, OPT_NUCLEUS_BARCODE_TAG},
{"description", required_argument, nullptr, OPT_DESCRIPTION},
{"library-description", required_argument, nullptr, OPT_LIBRARY_DESCRIPTION},
{"url", required_argument, nullptr, OPT_URL},
@@ -277,7 +286,7 @@ int main(int argc, char **argv) {
case OPT_LOG_PROBLEMATIC_READS:
log_problematic_reads = true;
break;
- case OPT_LESS_REDUNDANT:
+ case OPT_LESS_REDUNDANT:
less_redundant = true;
break;
case OPT_NAME:
@@ -286,6 +295,10 @@ int main(int argc, char **argv) {
case OPT_IGNORE_READ_GROUPS:
ignore_read_groups = true;
break;
+ case OPT_NUCLEUS_BARCODE_TAG:
+ nucleus_barcode_tag = optarg;
+ is_single_nucleus = true;
+ break;
case OPT_DESCRIPTION:
description = optarg;
break;
@@ -351,6 +364,7 @@ int main(int argc, char **argv) {
MetricsCollector collector(
name,
organism,
+ nucleus_barcode_tag,
description,
library_description,
url,
@@ -363,6 +377,7 @@ int main(int argc, char **argv) {
verbose,
thread_limit,
ignore_read_groups,
+ is_single_nucleus,
log_problematic_reads,
less_redundant,
excluded_region_filenames);
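
For readers unfamiliar with the pattern used in ataqv.cpp above: a single required-argument long option both records the barcode tag and flips the single-nucleus mode flag. Below is a minimal, self-contained sketch of that getopt_long pattern; the enum value, the bare option table, and the main() wrapper are illustrative only, since the real program defines many more options:

#include <getopt.h>
#include <string>

int main(int argc, char** argv) {
    // Illustrative option code; the real enum in ataqv.cpp defines many values.
    enum { OPT_NUCLEUS_BARCODE_TAG = 1000 };

    std::string nucleus_barcode_tag;
    bool is_single_nucleus = false;

    static struct option long_options[] = {
        {"nucleus-barcode-tag", required_argument, nullptr, OPT_NUCLEUS_BARCODE_TAG},
        {nullptr, 0, nullptr, 0}
    };

    int c;
    while ((c = getopt_long(argc, argv, "", long_options, nullptr)) != -1) {
        switch (c) {
            case OPT_NUCLEUS_BARCODE_TAG:
                nucleus_barcode_tag = optarg;  // e.g. "CB"
                is_single_nucleus = true;      // one flag drives both settings
                break;
            default:
                break;
        }
    }

    // nucleus_barcode_tag and is_single_nucleus would then be passed on to the
    // MetricsCollector constructor, as shown in the diff above.
    return 0;
}
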
=====================================
src/cpp/test_metrics.cpp
=====================================
@@ -6,7 +6,7 @@
TEST_CASE("MetricsCollector basics", "[metrics/collector]") {
- MetricsCollector collector("Test collector", "human", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", "test.bam");
+ MetricsCollector collector("Test collector", "human", "", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", "test.bam");
SECTION("MetricsCollector::is_autosomal") {
REQUIRE(collector.is_autosomal("chr1"));
@@ -26,7 +26,8 @@ TEST_CASE("MetricsCollector basics", "[metrics/collector]") {
"Operating parameters\n" +
"====================\n" +
"Thread limit: 1\n" +
- "Ignoring read groups: no\n\n" +
+ "Ignoring read groups: no\n" +
+ "Is single nucleus: no\n\n" +
"Experiment information\n" +
"======================\n" +
"Organism: human\n" +
@@ -63,7 +64,7 @@ TEST_CASE("MetricsCollector::test_supplied_references", "[metrics/test_supplied_
*out << "I\nII\nIII\n";
}
- MetricsCollector collector("Test collector", "human", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", "test.bam", autosomal_reference_file, "M", "", "", 1000, true, false);
+ MetricsCollector collector("Test collector", "human", "", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", "test.bam", autosomal_reference_file, "M", "", "", 1000, true, false, false);
std::remove(autosomal_reference_file.c_str());
@@ -82,7 +83,7 @@ TEST_CASE("MetricsCollector::test_supplied_references", "[metrics/test_supplied_
}
SECTION("Bad autosomal reference file") {
- REQUIRE_THROWS(MetricsCollector badcollector("Test collector", "human", "a collector with a bad autosomal reference file", "a library of brutal tests?", "https://theparkerlab.org", "test.bam", "bad_autosomal_reference_file.txt"));
+ REQUIRE_THROWS(MetricsCollector badcollector("Test collector", "human", "", "a collector with a bad autosomal reference file", "a library of brutal tests?", "https://theparkerlab.org", "test.bam", "bad_autosomal_reference_file.txt"));
}
}
@@ -92,7 +93,7 @@ TEST_CASE("Metrics::load_alignments with no excluded regions", "[metrics/load_al
std::string alignment_file_name("SRR891275.bam");
std::string peak_file_name("SRR891275.peaks.gz");
- MetricsCollector collector(name, "human", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", alignment_file_name, "", "chrM", peak_file_name, "", true, false);
+ MetricsCollector collector(name, "human", "", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", alignment_file_name, "", "chrM", peak_file_name, "", true, false, false);
collector.load_alignments();
@@ -108,7 +109,7 @@ TEST_CASE("Metrics::load_alignments", "[metrics/load_alignments]") {
std::string peak_file_name("test.peaks.gz");
std::string tss_file_name("hg19.tss.refseq.bed.gz");
- MetricsCollector collector(name, "human", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", alignment_file_name, "", "chrM", peak_file_name, tss_file_name, 1000, true, 1, false, true, false, {"exclude.dac.bed.gz", "exclude.duke.bed.gz"});
+ MetricsCollector collector(name, "human", "", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", alignment_file_name, "", "chrM", peak_file_name, tss_file_name, 1000, true, 1, false, false, true, false, {"exclude.dac.bed.gz", "exclude.duke.bed.gz"});
collector.load_alignments();
@@ -168,12 +169,12 @@ TEST_CASE("Metrics::load_alignments", "[metrics/load_alignments]") {
TEST_CASE("Metrics::load_alignments errors", "[metrics/load_alignments_errors]") {
SECTION("MetricsCollector::load_alignments fails without alignment file name") {
- MetricsCollector collector("Broken collector", "human", "a collector without an alignment file", "a library of brutal tests?", "https://theparkerlab.org", "", "", "", "");
+ MetricsCollector collector("Broken collector", "human", "", "a collector without an alignment file", "a library of brutal tests?", "https://theparkerlab.org", "", "", "", "");
REQUIRE_THROWS_AS(collector.load_alignments(), FileException);
}
SECTION("MetricsCollector::load_alignments fails with bad alignment file name") {
- MetricsCollector collector("Broken collector", "human", "a collector with a non-existent alignment file", "a library of brutal tests?", "https://theparkerlab.org", "missing_alignment_file.bam", "", "chrM", "");
+ MetricsCollector collector("Broken collector", "human", "", "a collector with a non-existent alignment file", "a library of brutal tests?", "https://theparkerlab.org", "missing_alignment_file.bam", "", "chrM", "");
REQUIRE_THROWS_AS(collector.load_alignments(), FileException);
}
}
@@ -183,7 +184,7 @@ TEST_CASE("Metrics::ignore_read_groups", "[metrics/ignore_read_groups]") {
std::string alignment_file_name("test.bam");
std::string peak_file_name("test.peaks.gz");
- MetricsCollector collector(name, "human", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", alignment_file_name, "", "chrM", peak_file_name, "", 1000, true, 1, true, true, false, {"exclude.dac.bed.gz", "exclude.duke.bed.gz"});
+ MetricsCollector collector(name, "human", "", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", alignment_file_name, "", "chrM", peak_file_name, "", 1000, true, 1, true, false, true, false, {"exclude.dac.bed.gz", "exclude.duke.bed.gz"});
collector.load_alignments();
@@ -244,7 +245,7 @@ TEST_CASE("Metrics::missing_peak_file", "[metrics/missing_peak_file]") {
std::string alignment_file_name("test.bam");
std::string peak_file_name("notthere.peaks.gz");
- MetricsCollector collector(name, "human", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", alignment_file_name, "", "chrM", peak_file_name, "", 1000, true, 1, true, true, false, {"exclude.dac.bed.gz", "exclude.duke.bed.gz"});
+ MetricsCollector collector(name, "human", "", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", alignment_file_name, "", "chrM", peak_file_name, "", 1000, true, 1, true, false, true, false, {"exclude.dac.bed.gz", "exclude.duke.bed.gz"});
REQUIRE_THROWS_AS(collector.load_alignments(), FileException);
}
@@ -254,6 +255,6 @@ TEST_CASE("Metrics::missing_tss_file", "[metrics/missing_ss_file]") {
std::string peak_file_name("test.peaks.gz");
std::string tss_file_name("notthere.bed.gz");
- MetricsCollector collector(name, "human", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", alignment_file_name, "", "chrM", peak_file_name, tss_file_name, 1000, true, 1, true, true, false, {"exclude.dac.bed.gz", "exclude.duke.bed.gz"});
+ MetricsCollector collector(name, "human", "", "a collector for unit tests", "a library of brutal tests?", "https://theparkerlab.org", alignment_file_name, "", "chrM", peak_file_name, tss_file_name, 1000, true, 1, true, false, true, false, {"exclude.dac.bed.gz", "exclude.duke.bed.gz"});
REQUIRE_THROWS_AS(collector.load_alignments(), FileException);
}
View it on GitLab: https://salsa.debian.org/med-team/ataqv/-/compare/26fb17817775d37a30252c6316dc8f6d04b25922...1529c1c0cf48bf0439092efe0ce5a43da8e37799