[Debian-med-packaging] Bug#1022508: ariba: FTBFS: failed tests

Lucas Nussbaum lucas at debian.org
Sun Oct 23 14:39:57 BST 2022


Source: ariba
Version: 2.14.6+ds-3
Severity: serious
Justification: FTBFS
Tags: bookworm sid ftbfs
User: lucas at debian.org
Usertags: ftbfs-20221023 ftbfs-bookworm

Hi,

During a rebuild of all packages in sid, your package failed to build
on amd64.
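
In case you want to try reproducing this locally, something along these
lines should do it (assuming deb-src entries for sid, or an sbuild/pbuilder
chroot tracking sid; the exact unpacked directory name may differ):

  apt-get source ariba
  cd ariba-2.14.6+ds
  dpkg-buildpackage -us -uc -b

The package builds with dh/pybuild, so the test suite is exercised as part
of the build and the failure should show up at that point.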


Relevant part (hopefully):
>  debian/rules binary
> dh binary --with python3 --buildsystem=pybuild
>    dh_update_autotools_config -O--buildsystem=pybuild
>    dh_autoreconf -O--buildsystem=pybuild
>    dh_auto_configure -O--buildsystem=pybuild
> I: pybuild base:240: python3.10 setup.py config 
> running config
>    dh_auto_build -O--buildsystem=pybuild
> I: pybuild base:240: /usr/bin/python3 setup.py build 
> running build
> running build_py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/sequence_metadata.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/summary_sample.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/reference_data.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/summary_cluster_variant.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/assembly.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/sequence_variant.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/pubmlst_getter.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/histogram.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/read_filter.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/mapping.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/versions.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/megares_data_finder.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/summary_cluster.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/megares_zip_parser.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/ref_genes_getter.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/summary.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/card_record.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/report_flag_expander.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/report.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/clusters.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/mlst_profile.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/bam_parse.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/report_filter.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/tb.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/cdhit.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/aln_to_metadata.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/scaffold_graph.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/mlst_reporter.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/read_store.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/common.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/pubmlst_ref_preparer.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/external_progs.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/samtools_variants.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/vfdb_parser.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/link.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/ref_seq_chooser.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/refdata_query.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/cluster.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/assembly_compare.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/mic_plotter.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/flag.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/assembly_variants.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/ref_preparer.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> copying ariba/faidx.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/read_filter_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/flag_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/ref_seq_chooser_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/aln_to_metadata_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/summary_sample_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/common_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/samtools_variants_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/scaffold_graph_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/versions_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/histogram_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/megares_data_finder_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/ncbi_getter_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/clusters_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/read_store_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/cdhit_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/pubmlst_getter_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/faidx_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/test_refdata_query.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/summary_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/mapping_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/cluster_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/mic_plotter_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/link_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/megares_zip_parser_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/mlst_reporter_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/sequence_metadata_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/ref_genes_getter_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/sequence_variant_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/tb_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/summary_cluster_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/reference_data_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/ref_preparer_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/card_record_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/vfdb_parser_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/assembly_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/bam_parse_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/mlst_profile_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/report_flag_expander_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/assembly_compare_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/report_filter_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/external_progs_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/summary_cluster_variant_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/assembly_variants_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> copying ariba/tests/pubmlst_ref_preparer_test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/test.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/reportfilter.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/micplot.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/expandflag.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/summary.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/prepareref_tb.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/run.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/aln2meta.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/pubmlstget.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/prepareref.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/refquery.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/pubmlstspecies.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/flag.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/getref.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> copying ariba/tasks/version.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tasks
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/test_run_data
> copying ariba/test_run_data/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/test_run_data
> copying ariba/test_run_data/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/test_run_data
> copying ariba/test_run_data/ref_seqs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/test_run_data
> copying ariba/test_run_data/ref_fasta_to_make_reads_from.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/test_run_data
> copying ariba/test_run_data/metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/test_run_data
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tb_data
> copying ariba/tb_data/panel.20190102.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tb_data
> copying ariba/tb_data/panel.20190102.json -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tb_data
> copying ariba/tb_data/panel.20181115.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tb_data
> copying ariba/tb_data/panel.20181221.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tb_data
> copying ariba/tb_data/NC_000962.3.gb -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tb_data
> copying ariba/tb_data/panel.20181221.json -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tb_data
> copying ariba/tb_data/NC_000962.3.fa.gz -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tb_data
> copying ariba/tb_data/panel.20181115.json -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tb_data
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cdhit_test_load_user_clusters_file.bad3 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_test_get_variant_positions_from_vcf.vcf -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_contained_ref_seq.all_refs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_variants_one_var_one_ctg_cdg.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_sam_to_soft_clipped.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_store_test_clean.in -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_make_vcf_and_depths_files.expect.vcf -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_make_cluster_file.out -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_input_check_seq_names.good.fa.2 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_test_get_read_depths.gz.tbi -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_make_vcf_and_depths_files.expect.depths.gz -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_test_load_fofn -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_dummy_db.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_assemble_with_fermilite_fails.reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_to_matrix.1.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/refcheck_test_check_not_gene.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/best_seq_chooser_total_alignment_score_reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_store_test_get_reads.expected.reads_subset.2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_fasta_file.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_parse.ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mapping_test_bowtie2_remove_both_unmapped_reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_input_check_seq_names.good.fa.1 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_init_fails.empty.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_preparer_test_run.in.4.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/graph_test_write_all_links_to_file.out -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_test_get_read_depths.gz -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_presabs_gene.ref_for_reads.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_store_test_sort_file.out -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_varonly.not_present.always_report.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_gene.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/best_seq_chooser_get_best_seq_by_alignment_score_reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_make_vcf_and_depths_files.asmbly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_make_reads_for_assembly.out2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_no_snp.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/tb_write_prepareref_metadata_file.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_filter_test_run.expected.reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_run_noncoding.out.cluster -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_variants_test_get_variants_presence_absence.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_filter_test_run.in.ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_parse_assembly_bam.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_nonc.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_write_gene_fa.db.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_test_write_distance_matrix.distances -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_init_refdata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_best_match_is_in_cluster.contigs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cdhit_test_load_user_clusters_file.good -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mapping_test_bowtie2_ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_bam_to_clusters_reads.db.fa.smi -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_input_check_seq_names.bad.fa.1 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_all_metadata_tsvs.2.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_input_check_seq_names.bad.csv.2 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_load_data_info_file -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_init_fails.empty.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_write_metadata_tsv.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mapping_test_bowtie2_remove_both_unmapped_reads.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_load_vars_file_bad.2.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc_no_snp.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_load_minimap_files.insertHistogram -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_test_remove_bad_noncoding.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/link_test_init.reads.no_link.bam.bai -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_run_coding.out.cluster -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_test_write_seqs_to_fasta.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_ref_not_in_cluster.all_refs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_cluster_w_cdhit_clstrs_file.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_dummy_reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_sam_pair_to_insert_reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/best_seq_chooser_get_best_seq_by_alignment_score_ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_write_catted_assemblies_fasta.expected.out.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_contained_ref_seq.cluster_refs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_rename_sequences.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/refcheck_test_fix_out.rename -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_load_summary_file.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/report_filter_test_run.expected.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_preparer_test_fasta_to_metadata.noncoding.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_sequence.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/link_test_init.ref.fa.fai -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.het_snps.out.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_make_vcf_and_depths_files.expect.cov -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_genes_getter.fix_virulencefinder_fasta_file.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.profile.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_sequence.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/report_filter_test_run.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.new_st.report.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_assemble_with_spades_fails_reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_profile_test.profile.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_not_in_cluster.allrefs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_write_mlst_reports.ariba.report.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/best_seq_chooser_best_seq_reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_remove_bad_genes.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_assemble_with_spades_reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_variants_one_var_one_ctg_noncdg.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_sample_test_column_summary_data.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_ok_gene_start_mismatch.ref_for_reads.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.het_snps.out.details.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_update_soft_clipped_from_sam.reads.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/tb_report_to_resistance_dict.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_test_whole_run.in.1.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_ok_variants_only.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_run_coding.out.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/megares_zip_parser_load_annotations.csv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_filter_bad_data.expected.check_metadata.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_check_spades_log_file.log.good -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_run_fermilite.expected.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_test_write_seqs_to_fasta.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.new_st.report.out.details.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_make_reads_for_assembly.out1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/vfdb_parser_test_run.out.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_test_remove_bad_genes.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_not_in_cluster.clusterrefs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_test_whole_run.out.csv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_update_soft_clipped_from_sam.ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_assemble_with_fermilite.expected.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_cluster_w_cdhit_nocluster.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cdhit_test_run.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/best_seq_chooser_best_seq_ref.fa.fai -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/link_test_init.ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/best_seq_chooser_total_alignment_score_reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_to_boxplot_tsv.antibio1.no.no_combinations.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/best_seq_chooser_get_best_seq_by_alignment_score_ref.fa.fai -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_assembly_fail.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_no_snp.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_test_flanking.expected_contigs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/link_test_init.reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_test_flanking.contigs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_write_sequences_to_files.all.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_minimap_reads_to_all_refs.out.clstr_count -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_load_minimap_insert_histogram.in -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/megares_zip_parser_write_files.expect.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_bam_to_clusters_reads.db.fa.fai -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_rename_file.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_run_noncoding.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.all_present_perfect.report.out.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_update_unmapped_mates_from_sam.ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_to_boxplot_tsv.antibio1.yes.no_combinations.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_cluster_w_cdhit_clstrs_file.in.clstrs.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_minimap_reads_to_all_refs.out.pairs -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_known_snp_presabs_nonc.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cdhit_test_fake_run.non-unique.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/megares_zip_parser_load_header_mappings.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_test_flanking.cluster_refs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_preparer_test_run.in.1.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_assemble_with_fermilite.expected.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_preparer_test_run.in.2.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_smtls_snp_varonly_nonc.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_profile_test.init.profile.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_load_minimap_files.clusterCounts -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_input_check_seq_names.bad.fa.2 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_write_report.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_to_boxplot_tsv.antibio1.exclude.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_filter_test_run_cdhit_est_2d.ref.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_run_fermilite_fails.expected.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_best_match_not_in_cluster.clusterrefs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_make_vcf_and_depths_files.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_profile_test.init_multiple_extra_columns.profile.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_load_vars_file_bad.1.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_load_vars_file_good.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_input_check_seq_names.bad.csv.1 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_init_ok.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/graph_test_update_from_sam.reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_to_matrix.2.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_minimap_reads_to_all_refs.reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/test_common_cat_files.in.2 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_write_mlst_reports.out.details.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_make_vcf_and_depths_files.reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_write_metadata_tsv.expected.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cdhit_test_run_get_clusters_from_dict.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_sample_test_var_groups.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_ok_non_coding.metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_sam_pair_to_insert_ref.fa.sma -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_load_minimap_proper_pairs.in -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_to_boxplot_tsv.antibio1.no.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_variants_one_var_one_ctg_cdg.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_filter_bad_data.expected.check_metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_test_newick_from_dist_matrix.distances -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_test_load_input_files.1.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_store_test_get_reads.expected.reads.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/refcheck_test_check_too_long.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_test_flanking.all_refs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_no_nucmer_match.contigs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cdhit_test_load_user_clusters_file.bad1 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/faidx_test_write_fa_subset.in.fa.fai -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cdhit_test_load_user_clusters_file.bad2 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_update_unmapped_mates_from_sam.reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_sample_test_load_file.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_store_test_get_reads.expected.reads_1.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mapping_test_bowtie2_unsorted.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_load_minimap_files.properPairs -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.het_snps.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_cluster_w_cdhit_nocluster.expect.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_minimap_reads_to_all_refs.ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_presabs_nonc.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_test_total_depth_per_contig -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_update_unmapped_mates_from_sam.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.one_gene_missing.out.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_test_write_cluster_allocation_file.expected -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_contained_ref_seq.contigs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_init_ok.params.json -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_best_match_not_in_cluster.contigs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mapping_test_sam_to_fastq.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_load_mic_file.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mapping_test_bowtie2_ref.fa.fai -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/vfdb_parser_test_run.out.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_delete_codon.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_best_match_is_in_cluster.allrefs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_parse.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_run_with_tb.reads_1.fq.gz -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_filter_bad_data_metadata.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_delete_codon.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_partial_asmbly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_filter_bad_data.expected.metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.het_snp.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_not_in_cluster.contigs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mapping_test_bowtie2_sorted.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/refcheck_test_check_too_short.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_count_reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_write_tb_resistance_calls_json.out.json -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_cat_genes_match_ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_assemble_with_fermilite_fails.reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/refcheck_test_fix_out.removed.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_insert_codon.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_keep_seqs_from_dict.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/link_test_init.reads.make_link.bam.bai -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_preparer_test_fasta_to_metadata.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_to_boxplot_tsv.antibio2.no.no_combinations.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_run_coding.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_best_match_is_in_cluster.clusterrefs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_write_mlst_reports.out.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_variants_test_get_variants_presence_absence.snps -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_minimap_reads_to_all_refs.out.clstr2rep -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_remove_bad_noncoding.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_sample_test_non_synon_variants.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_metadata_tsv.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_write_sequences_to_files.gene.varonly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_filter_bad_data.expected.all.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_run_coding.out.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_make_vcf_and_depths_files.reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_filter_cdhit_clstr_to_reads.in.clstr -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_test_get_depths_at_position.ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_preparer_test_fasta_to_metadata.coding.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_scaffold_with_sspace_reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_write_mlst_reports.mlst_profile.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_cluster_w_cdhit_clstrs_file.in.meta.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_presabs_gene.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_run_noncoding.out.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_2.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_cluster_w_cdhit_nocluster.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc_no_snp.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_make_reads_for_assembly.in2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/report_flag_expander.run.out.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_sequence_type.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_choose_ref_fail.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_load_minimap_out_cluster2representative.in -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_init_ok.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_sam_to_soft_clipped.reads.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_gather_unfiltered_output_data.in.1.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/pubmlst_rename_seqs.expected.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_matching_contig_pieces.expect.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_sequence_is_in_fasta_file.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_filter_test_run_cdhit_est_2d.reads.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_rename_sequences_metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_test_rename_sequences.out -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/best_seq_chooser_get_best_seq_by_alignment_score_reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_assemble_with_fermilite.reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_assemble_with_spades_fails_reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_all_fasta_files.in.1 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_filter_bad_data_non_coding.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_filter_bad_data.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/refcheck_test_fix_out.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_minimap_reads_to_all_refs.out.hist -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_variants_test_get_mummer_variants.snp.snps -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_sequence_type.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/link_test_init.reads.no_link_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.new_st.report.out.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_parse.reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_assemble_with_fermilite.reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.all_present_perfect.report.out.details.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_load_minimap_out_cluster_counts.in -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_run_coding.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/refcheck_test_check_ok.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_ok_gene_start_mismatch.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_test_whole_run.in.2.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_init_refdata.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_gene.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_insert_codon.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_load_aln_file.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_test_all_non_wild_type_variants.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.one_gene_missing.out.details.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_variants_test_get_variants_variants_only.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/megares_zip_parse_extract_files_one_missing.zip -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_ok_presence_absence.metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_gene.ref_for_reads.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_scaffold_with_sspace_contigs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_load_minimap_files.cluster2representative -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_choose_ref_fail.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_run_noncoding.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_ok_presence_absence.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_preparer_test_run.in.2.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_store_test_get_reads.expected.reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_ok_variants_only.not_present.metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_sam_pair_to_insert_ref.fa.smi -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_store_test_get_reads.expected.reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/best_seq_chooser_best_seq_ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/best_seq_chooser_total_alignment_score_ref_seqs.fa.fai -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_init_fails.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_sam_pair_to_insert_reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_bam_to_clusters_reads.db.fa.sma -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/refdata_query_prepareref.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_filter_bad_data_variants_only.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/graph_test_update_from_sam.reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_nonc.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_variants_test_get_variants_variants_only.snps -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/refcheck_test_check_duplicate_name.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_store_test_sort_file.in -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_remove_bad_genes.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/megares_zip_parse_extract_files_ok.zip -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mapping_test_bowtie2_reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_write_gene_fa.db.fa.fai -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_ok_non_coding.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_filter_bad_data_presence_absence.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/report_filter_test_write_report.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_test_get_variants.read_depths.gz -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/graph_test_update_from_sam.ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_test_get_depths_at_position.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_filter_bad_data.expected.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_init_ok.rename.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_filter_test_run.in.read_store -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_scaffold_with_sspace_reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_multiple_vars.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_variants_one_var_one_ctg_noncdg.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_test_cluster_with_cdhit.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/refcheck_test_fix_in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_test_init.fofn -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_count_reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.one_gene_missing.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_genes_getter.fix_virulencefinder_fasta_file.out.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/refcheck_test_check_spaces_in_name.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_run_fermilite.expected.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_no_nucmer_match.allrefs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_multiple_vars.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cdhit_test_fake_run.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_to_boxplot_tsv.antibio2.no.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mapping_test_bowtie2_reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_presabs_gene.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_preparer_test_run.in.3.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/aln_to_metadata_run_noncoding.out.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_write_catted_assembled_genes_fasta.expected.out.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_preparer_test_run.in.1.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.het_snp.out.details.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_fix_contig_orientation.out.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_matching_contig_pieces.ctg.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_input_check_seq_names.good.csv.2 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/report_flag_expander.run.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_to_boxplot_tsv.antibio2.yes.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_assemble_with_fermilite_fails.expected.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_minimap_reads_to_all_refs.reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_preparer_test_run.in.4.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/pubmlst_getter.dbases.xml -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_filter_test_run_cdhit_est_2d.expected.clstr -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_to_boxplot_tsv.antibio2.yes.no_combinations.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_make_vcf_and_depths_files.asmbly.fa.fai -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/graph_test_update_from_sam.ref.fa.fai -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/report_filter_test_load_report_good.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_test_get_variants.vcf -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/test_common_cat_files.in.1 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_to_boxplot_tsv.antibio2.exclude.no_combinations.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_write_sequences_to_files.noncoding.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_ok_variants_only.present.metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/report_filter_test_load_report_bad.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_presabs_nonc.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_all_fasta_files.in.2 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/tb_genbank_to_gene_coords.gb -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_fix_contig_orientation.ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_gather_unfiltered_output_data.in.2.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_no_reads_after_filtering.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cdhit_test_run_get_clusters_from_dict_rename.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/report_filter_test_init_bad.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mapping_test_sam_pair_to_insert.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_check_spades_log_file.log.bad -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_remove_bad_noncoding.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_test_load_input_files.2.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_parse_assembly_bam.assembly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/graph_test_update_from_sam.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_minimap_reads_to_all_refs.clstrs.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/link_test_init.reads.make_link.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_sam_pair_to_insert_ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/vfdb_parser_test_run.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/faidx_test_write_fa_subset.out.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_run_fermilite_fail.reads.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_ref_not_in_cluster.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_run_fermilite.reads.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_store_test_get_reads.expected.reads.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_filter_test_run.in.reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/report_filter_test_init_good.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_filter_test_run.expected.reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_smtls_snp_varonly_nonc.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_make_vcf_and_depths_files.for_reads.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_run_with_tb.reads_2.fq.gz -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_assembly_fail.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_write_sequences_to_files.gene.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_store_test_get_reads.expected.reads_2.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_write_gene_fa.out.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/refdata_query_prepareref.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/pubmlst_ref_prepare.test_load_fa_and_clusters.expect.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/summary_sample_test_column_names_tuples_and_het_snps.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_dummy_db.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_parse.reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/best_seq_chooser_total_alignment_score_ref_seqs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_write_tb_resistance_calls_json.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/clusters_test_dummy_reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_keep_seqs_from_dict.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_cluster_w_cdhit_clstrs_file.expect.clstrs.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_write_sequences_to_files.noncoding.varonly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/link_test_init.reads.no_link.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_assemble_with_spades_reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_test_get_variants.read_depths.gz.tbi -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_update_unmapped_mates_from_sam.reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_test_fix_contig_orientation.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_input_check_seq_names.good.csv.1 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_to_boxplot_tsv.antibio1.exclude.no_combinations.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_to_boxplot_tsv.antibio1.yes.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_no_reads_after_filtering.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.het_snp.out.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_store_test_get_reads.expected.reads_subset.1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_store_test_get_reads.in -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/refcheck_test_fix_out.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cdhit_test_run_get_clusters_from_dict.in.clusters -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_update_soft_clipped_from_sam.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_filter_test_run.in.reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/test_common_cat_files.in.3 -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/bam_parse_test_sam_to_soft_clipped.ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cdhit_test_get_clusters_from_bak_file.in -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_test_variants_in_coords.vcf -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/test_common_cat_files.out -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/pubmlst_rename_seqs.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mlst_reporter.all_present_perfect.report.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_matching_contig_pieces.coords -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/link_test_init.reads.make_link_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/dummy.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_test_cluster_with_cdhit.expected.clusters.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/best_seq_chooser_best_seq_reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_partial_asmbly.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/megares_zip_parser_write_files.expect.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_variants_test_get_mummer_variants.none.snps -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/read_store_test_compress_and_index_file.in -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mapping_test_bowtie2_remove_both_unmapped_reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/link_test_init.reads.no_link_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_test_write_seqs_to_fasta.expected.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_matching_contig_pieces.ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/faidx_test_write_fa_subset.in.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_best_match_not_in_cluster.allrefs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/samtools_variants_test_get_depths_at_position.ref.fa.fai -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_test_all_non_wild_type_variants.ref.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_known_snp_presabs_nonc.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/ref_seq_chooser_full_run_no_nucmer_match.clusterrefs.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_ok_gene_start_mismatch.metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/assembly_compare_parse_nucmer_coords_file.coords -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_full_run_ref_not_in_cluster.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/link_test_init.reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_2.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mic_plotter_to_boxplot_tsv.antibio2.exclude.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_load_all_metadata_tsvs.1.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/link_test_init.reads.make_link_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/reference_data_test_cluster_with_cdhit.in.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/cluster_test_make_reads_for_assembly.in1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> copying ariba/tests/data/mapping_test_get_total_alignment_score.bam -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/02.cdhit.clusters.pickle -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/02.cdhit.gene.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/01.filter.check_metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/02.cdhit.gene.varonly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/00.rename_info -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/01.filter.check_noncoding.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/00.version_info.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/01.filter.check_metadata.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/02.cdhit.all.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/01.filter.check_genes.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/02.cdhit.noncoding.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/02.cdhit.noncoding.varonly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/00.info.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> copying ariba/tests/data/refdata_query_prepareref/02.cdhit.clusters.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/refdata_query_prepareref
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_non_coding
> copying ariba/tests/data/cluster_test_full_run_ok_non_coding/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_non_coding
> copying ariba/tests/data/cluster_test_full_run_ok_non_coding/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_non_coding
> copying ariba/tests/data/cluster_test_full_run_ok_non_coding/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_non_coding
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ref_not_in_cluster
> copying ariba/tests/data/cluster_test_full_run_ref_not_in_cluster/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ref_not_in_cluster
> copying ariba/tests/data/cluster_test_full_run_ref_not_in_cluster/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ref_not_in_cluster
> copying ariba/tests/data/cluster_test_full_run_ref_not_in_cluster/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ref_not_in_cluster
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_no_snp
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_no_snp/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_no_snp
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_no_snp/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_no_snp
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_no_snp/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_no_snp
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> copying ariba/tests/data/ref_preparer_test_run.out/02.cdhit.clusters.pickle -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> copying ariba/tests/data/ref_preparer_test_run.out/02.cdhit.gene.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> copying ariba/tests/data/ref_preparer_test_run.out/01.filter.check_metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> copying ariba/tests/data/ref_preparer_test_run.out/02.cdhit.gene.varonly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> copying ariba/tests/data/ref_preparer_test_run.out/01.filter.check_noncoding.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> copying ariba/tests/data/ref_preparer_test_run.out/00.version_info.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> copying ariba/tests/data/ref_preparer_test_run.out/01.filter.check_metadata.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> copying ariba/tests/data/ref_preparer_test_run.out/02.cdhit.all.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> copying ariba/tests/data/ref_preparer_test_run.out/01.filter.check_genes.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> copying ariba/tests/data/ref_preparer_test_run.out/02.cdhit.noncoding.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> copying ariba/tests/data/ref_preparer_test_run.out/02.cdhit.noncoding.varonly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> copying ariba/tests/data/ref_preparer_test_run.out/00.info.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> copying ariba/tests/data/ref_preparer_test_run.out/02.cdhit.clusters.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run.out
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/02.cdhit.clusters.pickle -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/02.cdhit.gene.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/01.filter.check_metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/02.cdhit.gene.varonly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/00.rename_info -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/01.filter.check_noncoding.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/00.version_info.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/01.filter.check_metadata.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/02.cdhit.all.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/01.filter.check_genes.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/02.cdhit.noncoding.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/02.cdhit.noncoding.varonly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/00.info.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> copying ariba/tests/data/ref_preparer_test_run_noncoding_checks.out/02.cdhit.clusters.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_noncoding_checks.out
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> copying ariba/tests/data/clusters_run_with_tb.ref/02.cdhit.clusters.pickle -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> copying ariba/tests/data/clusters_run_with_tb.ref/00.params.json -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> copying ariba/tests/data/clusters_run_with_tb.ref/02.cdhit.gene.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> copying ariba/tests/data/clusters_run_with_tb.ref/01.filter.check_metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> copying ariba/tests/data/clusters_run_with_tb.ref/02.cdhit.gene.varonly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> copying ariba/tests/data/clusters_run_with_tb.ref/00.version_info.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> copying ariba/tests/data/clusters_run_with_tb.ref/01.filter.check_metadata.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> copying ariba/tests/data/clusters_run_with_tb.ref/02.cdhit.all.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> copying ariba/tests/data/clusters_run_with_tb.ref/01.filter.check_genes.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> copying ariba/tests/data/clusters_run_with_tb.ref/02.cdhit.noncoding.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> copying ariba/tests/data/clusters_run_with_tb.ref/02.cdhit.noncoding.varonly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> copying ariba/tests/data/clusters_run_with_tb.ref/00.info.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> copying ariba/tests/data/clusters_run_with_tb.ref/02.cdhit.clusters.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_run_with_tb.ref
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_delete_codon
> copying ariba/tests/data/cluster_test_full_run_delete_codon/for_reads.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_delete_codon
> copying ariba/tests/data/cluster_test_full_run_delete_codon/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_delete_codon
> copying ariba/tests/data/cluster_test_full_run_delete_codon/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_delete_codon
> copying ariba/tests/data/cluster_test_full_run_delete_codon/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_delete_codon
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_nonc
> copying ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_nonc/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_nonc
> copying ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_nonc/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_nonc
> copying ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_nonc/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_nonc
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_assembly_fail
> copying ariba/tests/data/cluster_test_full_run_assembly_fail/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_assembly_fail
> copying ariba/tests/data/cluster_test_full_run_assembly_fail/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_assembly_fail
> copying ariba/tests/data/cluster_test_full_run_assembly_fail/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_assembly_fail
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_gene_start_mismatch
> copying ariba/tests/data/cluster_test_full_run_ok_gene_start_mismatch/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_gene_start_mismatch
> copying ariba/tests/data/cluster_test_full_run_ok_gene_start_mismatch/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_gene_start_mismatch
> copying ariba/tests/data/cluster_test_full_run_ok_gene_start_mismatch/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_gene_start_mismatch
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_presence_absence
> copying ariba/tests/data/cluster_test_full_run_ok_presence_absence/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_presence_absence
> copying ariba/tests/data/cluster_test_full_run_ok_presence_absence/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_presence_absence
> copying ariba/tests/data/cluster_test_full_run_ok_presence_absence/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_presence_absence
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_multiple_vars
> copying ariba/tests/data/cluster_test_full_run_multiple_vars/for_reads.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_multiple_vars
> copying ariba/tests/data/cluster_test_full_run_multiple_vars/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_multiple_vars
> copying ariba/tests/data/cluster_test_full_run_multiple_vars/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_multiple_vars
> copying ariba/tests/data/cluster_test_full_run_multiple_vars/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_multiple_vars
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_gene
> copying ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_gene/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_gene
> copying ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_gene/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_gene
> copying ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_gene/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_known_smtls_snp_presabs_gene
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_partial_asmbly
> copying ariba/tests/data/cluster_test_full_run_partial_asmbly/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_partial_asmbly
> copying ariba/tests/data/cluster_test_full_run_partial_asmbly/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_partial_asmbly
> copying ariba/tests/data/cluster_test_full_run_partial_asmbly/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_partial_asmbly
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/pubmlst_ref_prepare.test_load_fa_and_clusters.in
> copying ariba/tests/data/pubmlst_ref_prepare.test_load_fa_and_clusters.in/gene2.tfa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/pubmlst_ref_prepare.test_load_fa_and_clusters.in
> copying ariba/tests/data/pubmlst_ref_prepare.test_load_fa_and_clusters.in/gene1.tfa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/pubmlst_ref_prepare.test_load_fa_and_clusters.in
> copying ariba/tests/data/pubmlst_ref_prepare.test_load_fa_and_clusters.in/profile.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/pubmlst_ref_prepare.test_load_fa_and_clusters.in
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_init_no_refs_fa
> copying ariba/tests/data/cluster_test_init_no_refs_fa/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_init_no_refs_fa
> copying ariba/tests/data/cluster_test_init_no_refs_fa/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_init_no_refs_fa
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_2
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_2/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_2
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_2/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_2
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_2/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_gene_2
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_choose_ref_fail
> copying ariba/tests/data/cluster_test_full_run_choose_ref_fail/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_choose_ref_fail
> copying ariba/tests/data/cluster_test_full_run_choose_ref_fail/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_choose_ref_fail
> copying ariba/tests/data/cluster_test_full_run_choose_ref_fail/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_choose_ref_fail
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/02.cdhit.clusters.pickle -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/00.auto_metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/02.cdhit.gene.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/01.filter.check_metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/02.cdhit.gene.varonly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/01.filter.check_noncoding.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/00.version_info.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/01.filter.check_metadata.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/02.cdhit.all.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/01.filter.check_genes.log -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/02.cdhit.noncoding.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/02.cdhit.noncoding.varonly.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/00.info.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> copying ariba/tests/data/ref_preparer_test_run_all_noncoding.out/02.cdhit.clusters.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/ref_preparer_test_run_all_noncoding.out
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/megares_zip_parser_write_files
> copying ariba/tests/data/megares_zip_parser_write_files/megares_annotations_v1.01.csv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/megares_zip_parser_write_files
> copying ariba/tests/data/megares_zip_parser_write_files/megares_to_external_header_mappings_v1.01.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/megares_zip_parser_write_files
> copying ariba/tests/data/megares_zip_parser_write_files/megares_database_v1.01.fasta -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/megares_zip_parser_write_files
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_presabs_nonc
> copying ariba/tests/data/cluster_full_run_smtls_snp_presabs_nonc/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_presabs_nonc
> copying ariba/tests/data/cluster_full_run_smtls_snp_presabs_nonc/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_presabs_nonc
> copying ariba/tests/data/cluster_full_run_smtls_snp_presabs_nonc/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_presabs_nonc
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_no_reads_after_filtering
> copying ariba/tests/data/cluster_test_full_run_no_reads_after_filtering/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_no_reads_after_filtering
> copying ariba/tests/data/cluster_test_full_run_no_reads_after_filtering/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_no_reads_after_filtering
> copying ariba/tests/data/cluster_test_full_run_no_reads_after_filtering/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_no_reads_after_filtering
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc_no_snp
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc_no_snp/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc_no_snp
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc_no_snp/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc_no_snp
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc_no_snp/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc_no_snp
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_init_no_reads_2
> copying ariba/tests/data/cluster_test_init_no_reads_2/genes.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_init_no_reads_2
> copying ariba/tests/data/cluster_test_init_no_reads_2/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_init_no_reads_2
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_load_ref_data_from_dir
> copying ariba/tests/data/clusters_load_ref_data_from_dir/02.cdhit.clusters.pickle -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_load_ref_data_from_dir
> copying ariba/tests/data/clusters_load_ref_data_from_dir/00.params.json -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_load_ref_data_from_dir
> copying ariba/tests/data/clusters_load_ref_data_from_dir/01.filter.check_metadata.tsv -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_load_ref_data_from_dir
> copying ariba/tests/data/clusters_load_ref_data_from_dir/02.cdhit.all.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_load_ref_data_from_dir
> copying ariba/tests/data/clusters_load_ref_data_from_dir/00.info.txt -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/clusters_load_ref_data_from_dir
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_known_snp_presabs_nonc
> copying ariba/tests/data/cluster_full_run_smtls_known_snp_presabs_nonc/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_known_snp_presabs_nonc
> copying ariba/tests/data/cluster_full_run_smtls_known_snp_presabs_nonc/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_known_snp_presabs_nonc
> copying ariba/tests/data/cluster_full_run_smtls_known_snp_presabs_nonc/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_known_snp_presabs_nonc
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_smtls_snp_varonly_nonc
> copying ariba/tests/data/cluster_test_full_run_smtls_snp_varonly_nonc/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_smtls_snp_varonly_nonc
> copying ariba/tests/data/cluster_test_full_run_smtls_snp_varonly_nonc/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_smtls_snp_varonly_nonc
> copying ariba/tests/data/cluster_test_full_run_smtls_snp_varonly_nonc/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_smtls_snp_varonly_nonc
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_variants_only
> copying ariba/tests/data/cluster_test_full_run_ok_variants_only/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_variants_only
> copying ariba/tests/data/cluster_test_full_run_ok_variants_only/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_variants_only
> copying ariba/tests/data/cluster_test_full_run_ok_variants_only/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_ok_variants_only
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc
> copying ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_varonly_nonc
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_insert_codon
> copying ariba/tests/data/cluster_test_full_run_insert_codon/for_reads.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_insert_codon
> copying ariba/tests/data/cluster_test_full_run_insert_codon/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_insert_codon
> copying ariba/tests/data/cluster_test_full_run_insert_codon/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_insert_codon
> copying ariba/tests/data/cluster_test_full_run_insert_codon/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_full_run_insert_codon
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_init_no_reads_1
> copying ariba/tests/data/cluster_test_init_no_reads_1/genes.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_init_no_reads_1
> copying ariba/tests/data/cluster_test_init_no_reads_1/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_test_init_no_reads_1
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_presabs_gene
> copying ariba/tests/data/cluster_full_run_smtls_snp_presabs_gene/reads_2.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_presabs_gene
> copying ariba/tests/data/cluster_full_run_smtls_snp_presabs_gene/reads_1.fq -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_presabs_gene
> copying ariba/tests/data/cluster_full_run_smtls_snp_presabs_gene/references.fa -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/data/cluster_full_run_smtls_snp_presabs_gene
> running build_ext
> building 'minimap_ariba' extension
> creating build
> creating build/temp.linux-x86_64-cpython-310
> creating build/temp.linux-x86_64-cpython-310/ariba
> creating build/temp.linux-x86_64-cpython-310/ariba/ext
> x86_64-linux-gnu-gcc -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -ffile-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/minimap -I/usr/include/python3.10 -c ariba/ext/minimap_ariba.cpp -o build/temp.linux-x86_64-cpython-310/ariba/ext/minimap_ariba.o
> ariba/ext/minimap_ariba.cpp: In function ‘bool readMappingOk(const mm_reg1_t*, const mm_idx_t*, const kseq_t*, uint32_t)’:
> ariba/ext/minimap_ariba.cpp:371:23: warning: comparison of integer expressions of different signedness: ‘int’ and ‘const unsigned int’ [-Wsign-compare]
>   371 |     if (r->qe - r->qs < std::min((unsigned) 50, (int) 0.5 * ks->seq.l))
>       |         ~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> ariba/ext/minimap_ariba.cpp:381:26: warning: comparison of integer expressions of different signedness: ‘const int32_t’ {aka ‘const int’} and ‘uint32_t’ {aka ‘unsigned int’} [-Wsign-compare]
>   381 |         startOk = (r->qs < endTolerance || refLength - r->re < endTolerance);
>       |                    ~~~~~~^~~~~~~~~~~~~~
> ariba/ext/minimap_ariba.cpp:382:60: warning: comparison of integer expressions of different signedness: ‘const int32_t’ {aka ‘const int’} and ‘uint32_t’ {aka ‘unsigned int’} [-Wsign-compare]
>   382 |         endOk = (ks->seq.l - r->qe < endTolerance || r->rs < endTolerance);
>       |                                                      ~~~~~~^~~~~~~~~~~~~~
> ariba/ext/minimap_ariba.cpp:386:26: warning: comparison of integer expressions of different signedness: ‘const int32_t’ {aka ‘const int’} and ‘uint32_t’ {aka ‘unsigned int’} [-Wsign-compare]
>   386 |         startOk = (r->qs < endTolerance || r->rs < endTolerance);
>       |                    ~~~~~~^~~~~~~~~~~~~~
> ariba/ext/minimap_ariba.cpp:386:50: warning: comparison of integer expressions of different signedness: ‘const int32_t’ {aka ‘const int’} and ‘uint32_t’ {aka ‘unsigned int’} [-Wsign-compare]
>   386 |         startOk = (r->qs < endTolerance || r->rs < endTolerance);
>       |                                            ~~~~~~^~~~~~~~~~~~~~
> x86_64-linux-gnu-g++ -shared -Wl,-O1 -Wl,-Bsymbolic-functions -g -fwrapv -O2 -Wl,-z,relro -Wl,-z,now -g -O2 -ffile-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-cpython-310/ariba/ext/minimap_ariba.o -L/usr/lib/x86_64-linux-gnu -o /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/minimap_ariba.cpython-310-x86_64-linux-gnu.so -lz -lminimap
> building 'fermilite_ariba' extension
> x86_64-linux-gnu-gcc -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -ffile-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.10 -c ariba/ext/fml-asm_ariba.cpp -o build/temp.linux-x86_64-cpython-310/ariba/ext/fml-asm_ariba.o
> x86_64-linux-gnu-g++ -shared -Wl,-O1 -Wl,-Bsymbolic-functions -g -fwrapv -O2 -Wl,-z,relro -Wl,-z,now -g -O2 -ffile-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-cpython-310/ariba/ext/fml-asm_ariba.o -L/usr/lib/x86_64-linux-gnu -o /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/fermilite_ariba.cpython-310-x86_64-linux-gnu.so -lz -lfml
> building 'vcfcall_ariba' extension
> x86_64-linux-gnu-gcc -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -ffile-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.10 -c ariba/ext/vcfcall_ariba.cpp -o build/temp.linux-x86_64-cpython-310/ariba/ext/vcfcall_ariba.o
> x86_64-linux-gnu-g++ -shared -Wl,-O1 -Wl,-Bsymbolic-functions -g -fwrapv -O2 -Wl,-z,relro -Wl,-z,now -g -O2 -ffile-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-cpython-310/ariba/ext/vcfcall_ariba.o -L/usr/lib/x86_64-linux-gnu -o /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/vcfcall_ariba.cpython-310-x86_64-linux-gnu.so
> running build_scripts
> creating build/scripts-3.10
> copying and adjusting scripts/ariba -> build/scripts-3.10
> changing mode of build/scripts-3.10/ariba from 644 to 755
>    dh_auto_test -O--buildsystem=pybuild
> I: pybuild base:240: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build; python3.10 -m pytest 
> ============================= test session starts ==============================
> platform linux -- Python 3.10.7, pytest-7.1.2, pluggy-1.0.0+repack
> rootdir: /<<PKGBUILDDIR>>, configfile: pytest.ini
> collected 356 items
> 
> ariba/tests/aln_to_metadata_test.py ....................                 [  5%]
> ariba/tests/assembly_compare_test.py ................                    [ 10%]
> ariba/tests/assembly_test.py .........                                   [ 12%]
> ariba/tests/assembly_variants_test.py ...........                        [ 15%]
> ariba/tests/bam_parse_test.py ......                                     [ 17%]
> ariba/tests/card_record_test.py ........                                 [ 19%]
> ariba/tests/cdhit_test.py ................                               [ 24%]
> ariba/tests/cluster_test.py ..FFFFFF.FFFFFFF.FFFFFFFF....                [ 32%]
> ariba/tests/clusters_test.py .........F......                            [ 36%]
> ariba/tests/common_test.py ..                                            [ 37%]
> ariba/tests/external_progs_test.py .                                     [ 37%]
> ariba/tests/faidx_test.py .                                              [ 37%]
> ariba/tests/flag_test.py .......                                         [ 39%]
> ariba/tests/histogram_test.py ....                                       [ 41%]
> ariba/tests/link_test.py .........                                       [ 43%]
> ariba/tests/mapping_test.py .......                                      [ 45%]
> ariba/tests/megares_data_finder_test.py ..                               [ 46%]
> ariba/tests/megares_zip_parser_test.py .....                             [ 47%]
> ariba/tests/mic_plotter_test.py .............                            [ 51%]
> ariba/tests/mlst_profile_test.py ....                                    [ 52%]
> ariba/tests/mlst_reporter_test.py .....                                  [ 53%]
> ariba/tests/pubmlst_getter_test.py ...                                   [ 54%]
> ariba/tests/pubmlst_ref_preparer_test.py ..                              [ 55%]
> ariba/tests/read_filter_test.py .                                        [ 55%]
> ariba/tests/read_store_test.py .......                                   [ 57%]
> ariba/tests/ref_genes_getter_test.py .                                   [ 57%]
> ariba/tests/ref_preparer_test.py .....                                   [ 58%]
> ariba/tests/ref_seq_chooser_test.py .FFFF..                              [ 60%]
> ariba/tests/reference_data_test.py ..............................        [ 69%]
> ariba/tests/report_filter_test.py .................                      [ 74%]
> ariba/tests/report_flag_expander_test.py .                               [ 74%]
> ariba/tests/samtools_variants_test.py F...F..                            [ 76%]
> ariba/tests/scaffold_graph_test.py .....                                 [ 77%]
> ariba/tests/sequence_metadata_test.py .....                              [ 79%]
> ariba/tests/sequence_variant_test.py .......                             [ 81%]
> ariba/tests/summary_cluster_test.py .......................              [ 87%]
> ariba/tests/summary_cluster_variant_test.py ....                         [ 88%]
> ariba/tests/summary_sample_test.py ....                                  [ 89%]
> ariba/tests/summary_test.py ....................                         [ 95%]
> ariba/tests/tb_test.py .......                                           [ 97%]
> ariba/tests/test_refdata_query.py .....                                  [ 98%]
> ariba/tests/versions_test.py .                                           [ 99%]
> ariba/tests/vfdb_parser_test.py ...                                      [100%]
> 
> =================================== FAILURES ===================================
> ____ TestCluster.test_full_run_cluster_test_full_run_smtls_snp_varonly_nonc ____
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_cluster_test_full_run_smtls_snp_varonly_nonc>
> 
>     def test_full_run_cluster_test_full_run_smtls_snp_varonly_nonc(self):
>         '''test complete run where samtools calls a snp at a known snp location in a presence/absence noncoding and sample has the var'''
>         fasta_in = os.path.join(data_dir, 'cluster_test_full_run_smtls_snp_varonly_nonc.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_test_full_run_smtls_snp_varonly_nonc.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_test_full_run_smtls_snp_varonly_nonc'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=148, total_reads_bases=13320)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:493: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:447: in _run
>     self.samtools_vars.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:169: in run
>     self._make_vcf_and_read_depths_files()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:39: in _make_vcf_and_read_depths_files
>     print(pysam.mpileup(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <pysam.utils.PysamDispatcher object at 0x7f174ab56f80>
> args = ('-t', 'INFO/AD,INFO/ADF,INFO/ADR', '-L', '99999999', '-A', '-f', ...)
> kwargs = {}, retval = 1
> stderr = '\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is i...up" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> stdout = ''
> 
>     def __call__(self, *args, **kwargs):
>         '''execute a samtools command.
>     
>         Keyword arguments:
>         catch_stdout -- redirect stdout from the samtools command and
>             return as variable (default True)
>         save_stdout -- redirect stdout to a filename.
>         raw -- ignore any parsers associated with this samtools command.
>         split_lines -- return stdout (if catch_stdout is True and stderr
>                        as a list of strings.
>         '''
>         retval, stderr, stdout = _pysam_dispatch(
>             self.collection,
>             self.dispatch,
>             args,
>             catch_stdout=kwargs.get("catch_stdout", True),
>             save_stdout=kwargs.get("save_stdout", None))
>     
>         if kwargs.get("split_lines", False):
>             stdout = stdout.splitlines()
>             if stderr:
>                 stderr = stderr.splitlines()
>     
>         if retval:
> >           raise SamtoolsError(
>                 "%s returned with error %i: "
>                 "stdout=%s, stderr=%s" %
>                 (self.collection,
>                  retval,
>                  stdout,
>                  stderr))
> E           pysam.utils.SamtoolsError: 'samtools returned with error 1: stdout=, stderr=\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is in the Illumina-1.3+ encoding\n  -A, --count-orphans     do not discard anomalous read pairs\n  -b, --bam-list FILE     list of input BAM filenames, one per line\n  -B, --no-BAQ            disable BAQ (per-Base Alignment Quality)\n  -C, --adjust-MQ INT     adjust mapping quality; recommended:50, disable:0 [0]\n  -d, --max-depth INT     max per-file depth; avoids excessive memory usage [8000]\n  -E, --redo-BAQ          recalculate BAQ on the fly, ignore existing BQs\n  -f, --fasta-ref FILE    faidx indexed reference sequence file\n  -G, --exclude-RG FILE   exclude read groups listed in FILE\n  -l, --positions FILE    skip unlisted positions (chr pos) or regions (BED)\n  -q, --min-MQ INT        skip alignments with mapQ smaller than INT [0]\n  -Q, --min-BQ INT        skip bases with baseQ/BAQ smaller than INT [13]\n  -r, --region REG        region in which pileup is generated\n  -R, --ignore-RG         ignore RG tags (one BAM = one sample)\n  --rf, --incl-flags STR|INT  required flags: include reads with any of the mask bits set []\n  --ff, --excl-flags STR|INT  filter flags: skip reads with any of the mask bits set\n                                            [UNMAP,SECONDARY,QCFAIL,DUP]\n  -x, --ignore-overlaps   disable read-pair overlap detection\n  -X, --customized-index  use customized index files\n\nOutput options:\n  -o, --output FILE        write output to FILE [standard output]\n  -O, --output-BP          output base positions on reads, current orientation\n      --output-BP-5        output base positions on reads, 5\' to 3\' orientation\n  -M, --output-mods        output base modifications\n  -s, --output-MQ          output mapping quality\n      --output-QNAME       output read names\n      --output-extra STR   output extra read fields and read tag values\n      --output-sep CHAR    set the separator character for tag lists [,]\n      --output-empty CHAR  set the no value character for tag lists [*]\n      --no-output-ins      skip insertion sequence after +NUM\n                           Use twice for complete insertion removal\n      --no-output-ins-mods don\'t display base modifications within insertions\n      --no-output-del      skip deletion sequence after -NUM\n                           Use twice for complete deletion removal\n      --no-output-ends     remove ^MQUAL and $ markup in sequence column\n      --reverse-del        use \'#\' character for deletions on the reverse strand\n  -a                       output all positions (including zero depth)\n  -a -a (or -aa)           output absolutely all positions, including unused ref. sequences\n\nGeneric options:\n      --input-fmt-option OPT[=VAL]\n               Specify a single input file format option in the form\n               of OPTION or OPTION=VALUE\n      --reference FILE\n               Reference sequence FASTA FILE [null]\n      --verbosity INT\n               Set level of verbosity\n\nNote that using "samtools mpileup" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> 
> /usr/lib/python3/dist-packages/pysam/utils.py:69: SamtoolsError
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> mpileup: invalid option -- 't'
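
All of the cluster/samtools_variants failures below share this root cause: ariba's
samtools_variants._make_vcf_and_read_depths_files() still drives pysam.mpileup with the old
"-t INFO/AD,INFO/ADF,INFO/ADR" style options, but the samtools now in sid rejects "-t" and,
as the usage text above says, VCF/BCF generation has been moved to "bcftools mpileup".
As a rough illustration only (not the packaged code; the helper name, arguments and exact
option set are made up for the sketch and upstream may choose differently), an adapted call
via pysam's bcftools wrapper could look something like this:

import pysam

def make_vcf(ref_fa, bam, vcf_out):
    # Sketch only: bcftools mpileup produces VCF/BCF itself, so the removed
    # "samtools mpileup -t ..." step is not needed.  "-a" re-adds the
    # AD/ADF/ADR depth annotations, "-A" keeps anomalous read pairs as the
    # old call did, and "-O v" asks for plain-text VCF on stdout, which the
    # pysam dispatcher captures and returns as a string.
    vcf_text = pysam.bcftools.mpileup(
        '-a', 'INFO/AD,INFO/ADF,INFO/ADR',
        '-A',
        '-f', ref_fa,
        '-O', 'v',
        bam)
    with open(vcf_out, 'w') as f:
        f.write(vcf_text)

Since bcftools mpileup accepts the same AD/ADF/ADR tags via -a/--annotate, the read-depth
information the tests rely on should still be obtainable from the resulting VCF.
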
> ____________________ TestCluster.test_full_run_delete_codon ____________________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_delete_codon>
> 
>     def test_full_run_delete_codon(self):
>         '''Test complete run where there is a deleted codon'''
>         fasta_in = os.path.join(data_dir, 'cluster_test_full_run_delete_codon.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_test_full_run_delete_codon.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_delete_codon'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_test_full_run_delete_codon'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=292, total_reads_bases=20900)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:550: 
> [... same traceback as the first failure: pysam.utils.SamtoolsError raised from the pysam.mpileup call in samtools_variants.py, stderr "mpileup: invalid option -- 't'" ...]
> ____________________ TestCluster.test_full_run_insert_codon ____________________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_insert_codon>
> 
>     def test_full_run_insert_codon(self):
>         '''Test complete run where there is a inserted codon'''
>         fasta_in = os.path.join(data_dir, 'cluster_test_full_run_insert_codon.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_test_full_run_insert_codon.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_insert_codon'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_test_full_run_insert_codon'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=292, total_reads_bases=20900)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:568: 
> [... same traceback as the first failure: pysam.utils.SamtoolsError raised from the pysam.mpileup call in samtools_variants.py, stderr "mpileup: invalid option -- 't'" ...]
> ____________ TestCluster.test_full_run_known_smtls_snp_presabs_gene ____________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_known_smtls_snp_presabs_gene>
> 
>     def test_full_run_known_smtls_snp_presabs_gene(self):
>         '''test complete run where samtools calls a snp at a known snp location in a presence/absence gene'''
>         fasta_in = os.path.join(data_dir, 'cluster_full_run_known_smtls_snp_presabs_gene.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_full_run_known_smtls_snp_presabs_gene.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_full_run_known_smtls_snp_presabs_gene'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=148, total_reads_bases=13320)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:342: 
> [... same traceback as the first failure: pysam.utils.SamtoolsError raised from the pysam.mpileup call in samtools_variants.py, stderr "mpileup: invalid option -- 't'" ...]
> ____________ TestCluster.test_full_run_known_smtls_snp_presabs_nonc ____________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_known_smtls_snp_presabs_nonc>
> 
>     def test_full_run_known_smtls_snp_presabs_nonc(self):
>         '''test complete run where samtools calls a snp at a known snp location in a presence/absence noncoding'''
>         fasta_in = os.path.join(data_dir, 'cluster_full_run_known_smtls_snp_presabs_nonc.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_full_run_known_smtls_snp_presabs_nonc.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_full_run_known_smtls_snp_presabs_nonc'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=148, total_reads_bases=13320)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:453: 
> [... same traceback as the first failure: pysam.utils.SamtoolsError raised from the pysam.mpileup call in samtools_variants.py, stderr "mpileup: invalid option -- 't'" ...]
> _______________ TestCluster.test_full_run_multiple_vars_in_codon _______________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_multiple_vars_in_codon>
> 
>     def test_full_run_multiple_vars_in_codon(self):
>         '''Test complete run where there is a codon with a SNP and an indel'''
>         fasta_in = os.path.join(data_dir, 'cluster_test_full_run_multiple_vars.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_test_full_run_multiple_vars.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_multiple_vars'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_test_full_run_multiple_vars'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=292, total_reads_bases=20900)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:531: 
> [... same traceback as the first failure: pysam.utils.SamtoolsError raised from the pysam.mpileup call in samtools_variants.py, stderr "mpileup: invalid option -- 't'" ...]
> _______________ TestCluster.test_full_run_ok_gene_start_mismatch _______________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_ok_gene_start_mismatch>
> 
>     def test_full_run_ok_gene_start_mismatch(self):
>         '''test complete run where gene extended because too different at end for full nucmer match'''
>         fasta_in = os.path.join(data_dir, 'cluster_test_full_run_ok_gene_start_mismatch.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_test_full_run_ok_gene_start_mismatch.metadata.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_ok_gene_start_mismatch'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_test_full_run_ok_gene_start_mismatch'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=112, total_reads_bases=1080)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:289: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:447: in _run
>     self.samtools_vars.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:169: in run
>     self._make_vcf_and_read_depths_files()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:39: in _make_vcf_and_read_depths_files
>     print(pysam.mpileup(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <pysam.utils.PysamDispatcher object at 0x7f174ab56f80>
> args = ('-t', 'INFO/AD,INFO/ADF,INFO/ADR', '-L', '99999999', '-A', '-f', ...)
> kwargs = {}, retval = 1
> stderr = '\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is i...up" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> stdout = ''
> 
>     def __call__(self, *args, **kwargs):
>         '''execute a samtools command.
>     
>         Keyword arguments:
>         catch_stdout -- redirect stdout from the samtools command and
>             return as variable (default True)
>         save_stdout -- redirect stdout to a filename.
>         raw -- ignore any parsers associated with this samtools command.
>         split_lines -- return stdout (if catch_stdout is True and stderr
>                        as a list of strings.
>         '''
>         retval, stderr, stdout = _pysam_dispatch(
>             self.collection,
>             self.dispatch,
>             args,
>             catch_stdout=kwargs.get("catch_stdout", True),
>             save_stdout=kwargs.get("save_stdout", None))
>     
>         if kwargs.get("split_lines", False):
>             stdout = stdout.splitlines()
>             if stderr:
>                 stderr = stderr.splitlines()
>     
>         if retval:
> >           raise SamtoolsError(
>                 "%s returned with error %i: "
>                 "stdout=%s, stderr=%s" %
>                 (self.collection,
>                  retval,
>                  stdout,
>                  stderr))
> E           pysam.utils.SamtoolsError: 'samtools returned with error 1: stdout=, stderr=\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is in the Illumina-1.3+ encoding\n  -A, --count-orphans     do not discard anomalous read pairs\n  -b, --bam-list FILE     list of input BAM filenames, one per line\n  -B, --no-BAQ            disable BAQ (per-Base Alignment Quality)\n  -C, --adjust-MQ INT     adjust mapping quality; recommended:50, disable:0 [0]\n  -d, --max-depth INT     max per-file depth; avoids excessive memory usage [8000]\n  -E, --redo-BAQ          recalculate BAQ on the fly, ignore existing BQs\n  -f, --fasta-ref FILE    faidx indexed reference sequence file\n  -G, --exclude-RG FILE   exclude read groups listed in FILE\n  -l, --positions FILE    skip unlisted positions (chr pos) or regions (BED)\n  -q, --min-MQ INT        skip alignments with mapQ smaller than INT [0]\n  -Q, --min-BQ INT        skip bases with baseQ/BAQ smaller than INT [13]\n  -r, --region REG        region in which pileup is generated\n  -R, --ignore-RG         ignore RG tags (one BAM = one sample)\n  --rf, --incl-flags STR|INT  required flags: include reads with any of the mask bits set []\n  --ff, --excl-flags STR|INT  filter flags: skip reads with any of the mask bits set\n                                            [UNMAP,SECONDARY,QCFAIL,DUP]\n  -x, --ignore-overlaps   disable read-pair overlap detection\n  -X, --customized-index  use customized index files\n\nOutput options:\n  -o, --output FILE        write output to FILE [standard output]\n  -O, --output-BP          output base positions on reads, current orientation\n      --output-BP-5        output base positions on reads, 5\' to 3\' orientation\n  -M, --output-mods        output base modifications\n  -s, --output-MQ          output mapping quality\n      --output-QNAME       output read names\n      --output-extra STR   output extra read fields and read tag values\n      --output-sep CHAR    set the separator character for tag lists [,]\n      --output-empty CHAR  set the no value character for tag lists [*]\n      --no-output-ins      skip insertion sequence after +NUM\n                           Use twice for complete insertion removal\n      --no-output-ins-mods don\'t display base modifications within insertions\n      --no-output-del      skip deletion sequence after -NUM\n                           Use twice for complete deletion removal\n      --no-output-ends     remove ^MQUAL and $ markup in sequence column\n      --reverse-del        use \'#\' character for deletions on the reverse strand\n  -a                       output all positions (including zero depth)\n  -a -a (or -aa)           output absolutely all positions, including unused ref. sequences\n\nGeneric options:\n      --input-fmt-option OPT[=VAL]\n               Specify a single input file format option in the form\n               of OPTION or OPTION=VALUE\n      --reference FILE\n               Reference sequence FASTA FILE [null]\n      --verbosity INT\n               Set level of verbosity\n\nNote that using "samtools mpileup" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> 
> /usr/lib/python3/dist-packages/pysam/utils.py:69: SamtoolsError
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> mpileup: invalid option -- 't'
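
All of the failing cluster tests below hit the same root cause: ariba's
samtools_variants.py still invokes pysam.mpileup() with the old
"-t INFO/AD,INFO/ADF,INFO/ADR" output-tag option, but current samtools has
dropped VCF/BCF generation from "samtools mpileup", so the option is rejected
("mpileup: invalid option -- 't'") and pysam raises SamtoolsError. The usage
text quoted above points to "bcftools mpileup" for this use case. Below is a
minimal sketch of the equivalent call, assuming bcftools is on PATH and using
placeholder file names rather than ariba's actual ones; it is not the upstream
patch, just an illustration of where the removed "-t" tags would go:

    import subprocess

    # Sketch only: replace the removed "samtools mpileup -t ..." call with
    # "bcftools mpileup", which still supports these annotation tags.
    # File names below are placeholders, not ariba's real paths.
    cmd = [
        "bcftools", "mpileup",
        "-a", "INFO/AD,INFO/ADF,INFO/ADR",  # tags previously requested via samtools "-t"
        "-L", "99999999",                   # max per-file depth for indel calling
        "-A",                               # count anomalous (orphan) read pairs
        "-f", "assembly.fa",                # reference FASTA (placeholder name)
        "-O", "v",                          # emit uncompressed VCF
        "-o", "variants.vcf",               # output file (placeholder name)
        "reads.sorted.bam",                 # input BAM (placeholder name)
    ]
    subprocess.run(cmd, check=True)

The "-a" annotation list carries the same INFO/AD, INFO/ADF and INFO/ADR tags
that the old "-t" option requested; whether ariba should shell out like this
or go through a pysam wrapper for bcftools is left to upstream.
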
> ___________________ TestCluster.test_full_run_ok_non_coding ____________________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_ok_non_coding>
> 
>     def test_full_run_ok_non_coding(self):
>         '''test complete run of cluster on a noncoding sequence'''
>         fasta_in = os.path.join(data_dir, 'cluster_test_full_run_ok_non_coding.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_test_full_run_ok_non_coding.metadata.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.test_full_run_ok_non_coding'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_test_full_run_ok_non_coding'), tmpdir)
>     
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=72, total_reads_bases=3600)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:185: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:447: in _run
>     self.samtools_vars.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:169: in run
>     self._make_vcf_and_read_depths_files()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:39: in _make_vcf_and_read_depths_files
>     print(pysam.mpileup(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <pysam.utils.PysamDispatcher object at 0x7f174ab56f80>
> args = ('-t', 'INFO/AD,INFO/ADF,INFO/ADR', '-L', '99999999', '-A', '-f', ...)
> kwargs = {}, retval = 1
> stderr = '\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is i...up" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> stdout = ''
> 
>     def __call__(self, *args, **kwargs):
>         '''execute a samtools command.
>     
>         Keyword arguments:
>         catch_stdout -- redirect stdout from the samtools command and
>             return as variable (default True)
>         save_stdout -- redirect stdout to a filename.
>         raw -- ignore any parsers associated with this samtools command.
>         split_lines -- return stdout (if catch_stdout is True and stderr
>                        as a list of strings.
>         '''
>         retval, stderr, stdout = _pysam_dispatch(
>             self.collection,
>             self.dispatch,
>             args,
>             catch_stdout=kwargs.get("catch_stdout", True),
>             save_stdout=kwargs.get("save_stdout", None))
>     
>         if kwargs.get("split_lines", False):
>             stdout = stdout.splitlines()
>             if stderr:
>                 stderr = stderr.splitlines()
>     
>         if retval:
> >           raise SamtoolsError(
>                 "%s returned with error %i: "
>                 "stdout=%s, stderr=%s" %
>                 (self.collection,
>                  retval,
>                  stdout,
>                  stderr))
> E           pysam.utils.SamtoolsError: 'samtools returned with error 1: stdout=, stderr=\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is in the Illumina-1.3+ encoding\n  -A, --count-orphans     do not discard anomalous read pairs\n  -b, --bam-list FILE     list of input BAM filenames, one per line\n  -B, --no-BAQ            disable BAQ (per-Base Alignment Quality)\n  -C, --adjust-MQ INT     adjust mapping quality; recommended:50, disable:0 [0]\n  -d, --max-depth INT     max per-file depth; avoids excessive memory usage [8000]\n  -E, --redo-BAQ          recalculate BAQ on the fly, ignore existing BQs\n  -f, --fasta-ref FILE    faidx indexed reference sequence file\n  -G, --exclude-RG FILE   exclude read groups listed in FILE\n  -l, --positions FILE    skip unlisted positions (chr pos) or regions (BED)\n  -q, --min-MQ INT        skip alignments with mapQ smaller than INT [0]\n  -Q, --min-BQ INT        skip bases with baseQ/BAQ smaller than INT [13]\n  -r, --region REG        region in which pileup is generated\n  -R, --ignore-RG         ignore RG tags (one BAM = one sample)\n  --rf, --incl-flags STR|INT  required flags: include reads with any of the mask bits set []\n  --ff, --excl-flags STR|INT  filter flags: skip reads with any of the mask bits set\n                                            [UNMAP,SECONDARY,QCFAIL,DUP]\n  -x, --ignore-overlaps   disable read-pair overlap detection\n  -X, --customized-index  use customized index files\n\nOutput options:\n  -o, --output FILE        write output to FILE [standard output]\n  -O, --output-BP          output base positions on reads, current orientation\n      --output-BP-5        output base positions on reads, 5\' to 3\' orientation\n  -M, --output-mods        output base modifications\n  -s, --output-MQ          output mapping quality\n      --output-QNAME       output read names\n      --output-extra STR   output extra read fields and read tag values\n      --output-sep CHAR    set the separator character for tag lists [,]\n      --output-empty CHAR  set the no value character for tag lists [*]\n      --no-output-ins      skip insertion sequence after +NUM\n                           Use twice for complete insertion removal\n      --no-output-ins-mods don\'t display base modifications within insertions\n      --no-output-del      skip deletion sequence after -NUM\n                           Use twice for complete deletion removal\n      --no-output-ends     remove ^MQUAL and $ markup in sequence column\n      --reverse-del        use \'#\' character for deletions on the reverse strand\n  -a                       output all positions (including zero depth)\n  -a -a (or -aa)           output absolutely all positions, including unused ref. sequences\n\nGeneric options:\n      --input-fmt-option OPT[=VAL]\n               Specify a single input file format option in the form\n               of OPTION or OPTION=VALUE\n      --reference FILE\n               Reference sequence FASTA FILE [null]\n      --verbosity INT\n               Set level of verbosity\n\nNote that using "samtools mpileup" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> 
> /usr/lib/python3/dist-packages/pysam/utils.py:69: SamtoolsError
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> mpileup: invalid option -- 't'
> ________________ TestCluster.test_full_run_ok_presence_absence _________________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_ok_presence_absence>
> 
>     def test_full_run_ok_presence_absence(self):
>         '''test complete run of cluster on a presence absence gene'''
>         fasta_in = os.path.join(data_dir, 'cluster_test_full_run_ok_presence_absence.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_test_full_run_ok_presence_absence.metadata.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_ok_presence_absence'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_test_full_run_ok_presence_absence'), tmpdir)
>     
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=64, total_reads_bases=3200)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:211: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:447: in _run
>     self.samtools_vars.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:169: in run
>     self._make_vcf_and_read_depths_files()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:39: in _make_vcf_and_read_depths_files
>     print(pysam.mpileup(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <pysam.utils.PysamDispatcher object at 0x7f174ab56f80>
> args = ('-t', 'INFO/AD,INFO/ADF,INFO/ADR', '-L', '99999999', '-A', '-f', ...)
> kwargs = {}, retval = 1
> stderr = '\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is i...up" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> stdout = ''
> 
>     def __call__(self, *args, **kwargs):
>         '''execute a samtools command.
>     
>         Keyword arguments:
>         catch_stdout -- redirect stdout from the samtools command and
>             return as variable (default True)
>         save_stdout -- redirect stdout to a filename.
>         raw -- ignore any parsers associated with this samtools command.
>         split_lines -- return stdout (if catch_stdout is True and stderr
>                        as a list of strings.
>         '''
>         retval, stderr, stdout = _pysam_dispatch(
>             self.collection,
>             self.dispatch,
>             args,
>             catch_stdout=kwargs.get("catch_stdout", True),
>             save_stdout=kwargs.get("save_stdout", None))
>     
>         if kwargs.get("split_lines", False):
>             stdout = stdout.splitlines()
>             if stderr:
>                 stderr = stderr.splitlines()
>     
>         if retval:
> >           raise SamtoolsError(
>                 "%s returned with error %i: "
>                 "stdout=%s, stderr=%s" %
>                 (self.collection,
>                  retval,
>                  stdout,
>                  stderr))
> E           pysam.utils.SamtoolsError: 'samtools returned with error 1: stdout=, stderr=\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is in the Illumina-1.3+ encoding\n  -A, --count-orphans     do not discard anomalous read pairs\n  -b, --bam-list FILE     list of input BAM filenames, one per line\n  -B, --no-BAQ            disable BAQ (per-Base Alignment Quality)\n  -C, --adjust-MQ INT     adjust mapping quality; recommended:50, disable:0 [0]\n  -d, --max-depth INT     max per-file depth; avoids excessive memory usage [8000]\n  -E, --redo-BAQ          recalculate BAQ on the fly, ignore existing BQs\n  -f, --fasta-ref FILE    faidx indexed reference sequence file\n  -G, --exclude-RG FILE   exclude read groups listed in FILE\n  -l, --positions FILE    skip unlisted positions (chr pos) or regions (BED)\n  -q, --min-MQ INT        skip alignments with mapQ smaller than INT [0]\n  -Q, --min-BQ INT        skip bases with baseQ/BAQ smaller than INT [13]\n  -r, --region REG        region in which pileup is generated\n  -R, --ignore-RG         ignore RG tags (one BAM = one sample)\n  --rf, --incl-flags STR|INT  required flags: include reads with any of the mask bits set []\n  --ff, --excl-flags STR|INT  filter flags: skip reads with any of the mask bits set\n                                            [UNMAP,SECONDARY,QCFAIL,DUP]\n  -x, --ignore-overlaps   disable read-pair overlap detection\n  -X, --customized-index  use customized index files\n\nOutput options:\n  -o, --output FILE        write output to FILE [standard output]\n  -O, --output-BP          output base positions on reads, current orientation\n      --output-BP-5        output base positions on reads, 5\' to 3\' orientation\n  -M, --output-mods        output base modifications\n  -s, --output-MQ          output mapping quality\n      --output-QNAME       output read names\n      --output-extra STR   output extra read fields and read tag values\n      --output-sep CHAR    set the separator character for tag lists [,]\n      --output-empty CHAR  set the no value character for tag lists [*]\n      --no-output-ins      skip insertion sequence after +NUM\n                           Use twice for complete insertion removal\n      --no-output-ins-mods don\'t display base modifications within insertions\n      --no-output-del      skip deletion sequence after -NUM\n                           Use twice for complete deletion removal\n      --no-output-ends     remove ^MQUAL and $ markup in sequence column\n      --reverse-del        use \'#\' character for deletions on the reverse strand\n  -a                       output all positions (including zero depth)\n  -a -a (or -aa)           output absolutely all positions, including unused ref. sequences\n\nGeneric options:\n      --input-fmt-option OPT[=VAL]\n               Specify a single input file format option in the form\n               of OPTION or OPTION=VALUE\n      --reference FILE\n               Reference sequence FASTA FILE [null]\n      --verbosity INT\n               Set level of verbosity\n\nNote that using "samtools mpileup" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> 
> /usr/lib/python3/dist-packages/pysam/utils.py:69: SamtoolsError
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> mpileup: invalid option -- 't'
> ________ TestCluster.test_full_run_ok_variants_only_variant_is_present _________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_ok_variants_only_variant_is_present>
> 
>     def test_full_run_ok_variants_only_variant_is_present(self):
>         '''test complete run of cluster on a variants only gene when variant is present'''
>         fasta_in = os.path.join(data_dir, 'cluster_test_full_run_ok_variants_only.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_test_full_run_ok_variants_only.present.metadata.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_ok_variants_only.present'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_test_full_run_ok_variants_only'), tmpdir)
>     
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=66, total_reads_bases=3300)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:270: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:447: in _run
>     self.samtools_vars.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:169: in run
>     self._make_vcf_and_read_depths_files()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:39: in _make_vcf_and_read_depths_files
>     print(pysam.mpileup(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <pysam.utils.PysamDispatcher object at 0x7f174ab56f80>
> args = ('-t', 'INFO/AD,INFO/ADF,INFO/ADR', '-L', '99999999', '-A', '-f', ...)
> kwargs = {}, retval = 1
> stderr = '\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is i...up" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> stdout = ''
> 
>     def __call__(self, *args, **kwargs):
>         '''execute a samtools command.
>     
>         Keyword arguments:
>         catch_stdout -- redirect stdout from the samtools command and
>             return as variable (default True)
>         save_stdout -- redirect stdout to a filename.
>         raw -- ignore any parsers associated with this samtools command.
>         split_lines -- return stdout (if catch_stdout is True and stderr
>                        as a list of strings.
>         '''
>         retval, stderr, stdout = _pysam_dispatch(
>             self.collection,
>             self.dispatch,
>             args,
>             catch_stdout=kwargs.get("catch_stdout", True),
>             save_stdout=kwargs.get("save_stdout", None))
>     
>         if kwargs.get("split_lines", False):
>             stdout = stdout.splitlines()
>             if stderr:
>                 stderr = stderr.splitlines()
>     
>         if retval:
> >           raise SamtoolsError(
>                 "%s returned with error %i: "
>                 "stdout=%s, stderr=%s" %
>                 (self.collection,
>                  retval,
>                  stdout,
>                  stderr))
> E           pysam.utils.SamtoolsError: 'samtools returned with error 1: stdout=, stderr=\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is in the Illumina-1.3+ encoding\n  -A, --count-orphans     do not discard anomalous read pairs\n  -b, --bam-list FILE     list of input BAM filenames, one per line\n  -B, --no-BAQ            disable BAQ (per-Base Alignment Quality)\n  -C, --adjust-MQ INT     adjust mapping quality; recommended:50, disable:0 [0]\n  -d, --max-depth INT     max per-file depth; avoids excessive memory usage [8000]\n  -E, --redo-BAQ          recalculate BAQ on the fly, ignore existing BQs\n  -f, --fasta-ref FILE    faidx indexed reference sequence file\n  -G, --exclude-RG FILE   exclude read groups listed in FILE\n  -l, --positions FILE    skip unlisted positions (chr pos) or regions (BED)\n  -q, --min-MQ INT        skip alignments with mapQ smaller than INT [0]\n  -Q, --min-BQ INT        skip bases with baseQ/BAQ smaller than INT [13]\n  -r, --region REG        region in which pileup is generated\n  -R, --ignore-RG         ignore RG tags (one BAM = one sample)\n  --rf, --incl-flags STR|INT  required flags: include reads with any of the mask bits set []\n  --ff, --excl-flags STR|INT  filter flags: skip reads with any of the mask bits set\n                                            [UNMAP,SECONDARY,QCFAIL,DUP]\n  -x, --ignore-overlaps   disable read-pair overlap detection\n  -X, --customized-index  use customized index files\n\nOutput options:\n  -o, --output FILE        write output to FILE [standard output]\n  -O, --output-BP          output base positions on reads, current orientation\n      --output-BP-5        output base positions on reads, 5\' to 3\' orientation\n  -M, --output-mods        output base modifications\n  -s, --output-MQ          output mapping quality\n      --output-QNAME       output read names\n      --output-extra STR   output extra read fields and read tag values\n      --output-sep CHAR    set the separator character for tag lists [,]\n      --output-empty CHAR  set the no value character for tag lists [*]\n      --no-output-ins      skip insertion sequence after +NUM\n                           Use twice for complete insertion removal\n      --no-output-ins-mods don\'t display base modifications within insertions\n      --no-output-del      skip deletion sequence after -NUM\n                           Use twice for complete deletion removal\n      --no-output-ends     remove ^MQUAL and $ markup in sequence column\n      --reverse-del        use \'#\' character for deletions on the reverse strand\n  -a                       output all positions (including zero depth)\n  -a -a (or -aa)           output absolutely all positions, including unused ref. sequences\n\nGeneric options:\n      --input-fmt-option OPT[=VAL]\n               Specify a single input file format option in the form\n               of OPTION or OPTION=VALUE\n      --reference FILE\n               Reference sequence FASTA FILE [null]\n      --verbosity INT\n               Set level of verbosity\n\nNote that using "samtools mpileup" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> 
> /usr/lib/python3/dist-packages/pysam/utils.py:69: SamtoolsError
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> mpileup: invalid option -- 't'
> ________ TestCluster.test_full_run_ok_variants_only_variant_not_present ________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_ok_variants_only_variant_not_present>
> 
>     def test_full_run_ok_variants_only_variant_not_present(self):
>         '''test complete run of cluster on a variants only gene when variant not present'''
>         fasta_in = os.path.join(data_dir, 'cluster_test_full_run_ok_variants_only.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_test_full_run_ok_variants_only.not_present.metadata.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_ok_variants_only.not_present'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_test_full_run_ok_variants_only'), tmpdir)
>     
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=66, total_reads_bases=3300)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:234: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:447: in _run
>     self.samtools_vars.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:169: in run
>     self._make_vcf_and_read_depths_files()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:39: in _make_vcf_and_read_depths_files
>     print(pysam.mpileup(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <pysam.utils.PysamDispatcher object at 0x7f174ab56f80>
> args = ('-t', 'INFO/AD,INFO/ADF,INFO/ADR', '-L', '99999999', '-A', '-f', ...)
> kwargs = {}, retval = 1
> stderr = '\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is i...up" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> stdout = ''
> 
>     def __call__(self, *args, **kwargs):
>         '''execute a samtools command.
>     
>         Keyword arguments:
>         catch_stdout -- redirect stdout from the samtools command and
>             return as variable (default True)
>         save_stdout -- redirect stdout to a filename.
>         raw -- ignore any parsers associated with this samtools command.
>         split_lines -- return stdout (if catch_stdout is True and stderr
>                        as a list of strings.
>         '''
>         retval, stderr, stdout = _pysam_dispatch(
>             self.collection,
>             self.dispatch,
>             args,
>             catch_stdout=kwargs.get("catch_stdout", True),
>             save_stdout=kwargs.get("save_stdout", None))
>     
>         if kwargs.get("split_lines", False):
>             stdout = stdout.splitlines()
>             if stderr:
>                 stderr = stderr.splitlines()
>     
>         if retval:
> >           raise SamtoolsError(
>                 "%s returned with error %i: "
>                 "stdout=%s, stderr=%s" %
>                 (self.collection,
>                  retval,
>                  stdout,
>                  stderr))
> E           pysam.utils.SamtoolsError: 'samtools returned with error 1: stdout=, stderr=\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is in the Illumina-1.3+ encoding\n  -A, --count-orphans     do not discard anomalous read pairs\n  -b, --bam-list FILE     list of input BAM filenames, one per line\n  -B, --no-BAQ            disable BAQ (per-Base Alignment Quality)\n  -C, --adjust-MQ INT     adjust mapping quality; recommended:50, disable:0 [0]\n  -d, --max-depth INT     max per-file depth; avoids excessive memory usage [8000]\n  -E, --redo-BAQ          recalculate BAQ on the fly, ignore existing BQs\n  -f, --fasta-ref FILE    faidx indexed reference sequence file\n  -G, --exclude-RG FILE   exclude read groups listed in FILE\n  -l, --positions FILE    skip unlisted positions (chr pos) or regions (BED)\n  -q, --min-MQ INT        skip alignments with mapQ smaller than INT [0]\n  -Q, --min-BQ INT        skip bases with baseQ/BAQ smaller than INT [13]\n  -r, --region REG        region in which pileup is generated\n  -R, --ignore-RG         ignore RG tags (one BAM = one sample)\n  --rf, --incl-flags STR|INT  required flags: include reads with any of the mask bits set []\n  --ff, --excl-flags STR|INT  filter flags: skip reads with any of the mask bits set\n                                            [UNMAP,SECONDARY,QCFAIL,DUP]\n  -x, --ignore-overlaps   disable read-pair overlap detection\n  -X, --customized-index  use customized index files\n\nOutput options:\n  -o, --output FILE        write output to FILE [standard output]\n  -O, --output-BP          output base positions on reads, current orientation\n      --output-BP-5        output base positions on reads, 5\' to 3\' orientation\n  -M, --output-mods        output base modifications\n  -s, --output-MQ          output mapping quality\n      --output-QNAME       output read names\n      --output-extra STR   output extra read fields and read tag values\n      --output-sep CHAR    set the separator character for tag lists [,]\n      --output-empty CHAR  set the no value character for tag lists [*]\n      --no-output-ins      skip insertion sequence after +NUM\n                           Use twice for complete insertion removal\n      --no-output-ins-mods don\'t display base modifications within insertions\n      --no-output-del      skip deletion sequence after -NUM\n                           Use twice for complete deletion removal\n      --no-output-ends     remove ^MQUAL and $ markup in sequence column\n      --reverse-del        use \'#\' character for deletions on the reverse strand\n  -a                       output all positions (including zero depth)\n  -a -a (or -aa)           output absolutely all positions, including unused ref. sequences\n\nGeneric options:\n      --input-fmt-option OPT[=VAL]\n               Specify a single input file format option in the form\n               of OPTION or OPTION=VALUE\n      --reference FILE\n               Reference sequence FASTA FILE [null]\n      --verbosity INT\n               Set level of verbosity\n\nNote that using "samtools mpileup" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> 
> /usr/lib/python3/dist-packages/pysam/utils.py:69: SamtoolsError
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> mpileup: invalid option -- 't'
> _ TestCluster.test_full_run_ok_variants_only_variant_not_present_always_report _
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_ok_variants_only_variant_not_present_always_report>
> 
>     def test_full_run_ok_variants_only_variant_not_present_always_report(self):
>         '''test complete run of cluster on a variants only gene when variant not present but always report variant'''
>         fasta_in = os.path.join(data_dir, 'cluster_test_full_run_ok_variants_only.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_full_run_varonly.not_present.always_report.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_full_run_varonly.not_present.always_report'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_test_full_run_ok_variants_only'), tmpdir)
>     
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=66, total_reads_bases=3300)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:252: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:447: in _run
>     self.samtools_vars.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:169: in run
>     self._make_vcf_and_read_depths_files()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:39: in _make_vcf_and_read_depths_files
>     print(pysam.mpileup(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <pysam.utils.PysamDispatcher object at 0x7f174ab56f80>
> args = ('-t', 'INFO/AD,INFO/ADF,INFO/ADR', '-L', '99999999', '-A', '-f', ...)
> kwargs = {}, retval = 1
> stderr = '\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is i...up" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> stdout = ''
> 
>     def __call__(self, *args, **kwargs):
>         '''execute a samtools command.
>     
>         Keyword arguments:
>         catch_stdout -- redirect stdout from the samtools command and
>             return as variable (default True)
>         save_stdout -- redirect stdout to a filename.
>         raw -- ignore any parsers associated with this samtools command.
>         split_lines -- return stdout (if catch_stdout is True and stderr
>                        as a list of strings.
>         '''
>         retval, stderr, stdout = _pysam_dispatch(
>             self.collection,
>             self.dispatch,
>             args,
>             catch_stdout=kwargs.get("catch_stdout", True),
>             save_stdout=kwargs.get("save_stdout", None))
>     
>         if kwargs.get("split_lines", False):
>             stdout = stdout.splitlines()
>             if stderr:
>                 stderr = stderr.splitlines()
>     
>         if retval:
> >           raise SamtoolsError(
>                 "%s returned with error %i: "
>                 "stdout=%s, stderr=%s" %
>                 (self.collection,
>                  retval,
>                  stdout,
>                  stderr))
> E           pysam.utils.SamtoolsError: 'samtools returned with error 1: stdout=, stderr=\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is in the Illumina-1.3+ encoding\n  -A, --count-orphans     do not discard anomalous read pairs\n  -b, --bam-list FILE     list of input BAM filenames, one per line\n  -B, --no-BAQ            disable BAQ (per-Base Alignment Quality)\n  -C, --adjust-MQ INT     adjust mapping quality; recommended:50, disable:0 [0]\n  -d, --max-depth INT     max per-file depth; avoids excessive memory usage [8000]\n  -E, --redo-BAQ          recalculate BAQ on the fly, ignore existing BQs\n  -f, --fasta-ref FILE    faidx indexed reference sequence file\n  -G, --exclude-RG FILE   exclude read groups listed in FILE\n  -l, --positions FILE    skip unlisted positions (chr pos) or regions (BED)\n  -q, --min-MQ INT        skip alignments with mapQ smaller than INT [0]\n  -Q, --min-BQ INT        skip bases with baseQ/BAQ smaller than INT [13]\n  -r, --region REG        region in which pileup is generated\n  -R, --ignore-RG         ignore RG tags (one BAM = one sample)\n  --rf, --incl-flags STR|INT  required flags: include reads with any of the mask bits set []\n  --ff, --excl-flags STR|INT  filter flags: skip reads with any of the mask bits set\n                                            [UNMAP,SECONDARY,QCFAIL,DUP]\n  -x, --ignore-overlaps   disable read-pair overlap detection\n  -X, --customized-index  use customized index files\n\nOutput options:\n  -o, --output FILE        write output to FILE [standard output]\n  -O, --output-BP          output base positions on reads, current orientation\n      --output-BP-5        output base positions on reads, 5\' to 3\' orientation\n  -M, --output-mods        output base modifications\n  -s, --output-MQ          output mapping quality\n      --output-QNAME       output read names\n      --output-extra STR   output extra read fields and read tag values\n      --output-sep CHAR    set the separator character for tag lists [,]\n      --output-empty CHAR  set the no value character for tag lists [*]\n      --no-output-ins      skip insertion sequence after +NUM\n                           Use twice for complete insertion removal\n      --no-output-ins-mods don\'t display base modifications within insertions\n      --no-output-del      skip deletion sequence after -NUM\n                           Use twice for complete deletion removal\n      --no-output-ends     remove ^MQUAL and $ markup in sequence column\n      --reverse-del        use \'#\' character for deletions on the reverse strand\n  -a                       output all positions (including zero depth)\n  -a -a (or -aa)           output absolutely all positions, including unused ref. sequences\n\nGeneric options:\n      --input-fmt-option OPT[=VAL]\n               Specify a single input file format option in the form\n               of OPTION or OPTION=VALUE\n      --reference FILE\n               Reference sequence FASTA FILE [null]\n      --verbosity INT\n               Set level of verbosity\n\nNote that using "samtools mpileup" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> 
> /usr/lib/python3/dist-packages/pysam/utils.py:69: SamtoolsError
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> mpileup: invalid option -- 't'
> __________________ TestCluster.test_full_run_partial_assembly __________________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_partial_assembly>
> 
>     def test_full_run_partial_assembly(self):
>         '''Test complete run where only part of the ref gene is present in the reads'''
>         fasta_in = os.path.join(data_dir, 'cluster_test_full_run_partial_asmbly.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_test_full_run_partial_asmbly.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_partial_assembly'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_test_full_run_partial_asmbly'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=278, total_reads_bases=15020)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:513: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:447: in _run
>     self.samtools_vars.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:169: in run
>     self._make_vcf_and_read_depths_files()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:39: in _make_vcf_and_read_depths_files
>     print(pysam.mpileup(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <pysam.utils.PysamDispatcher object at 0x7f174ab56f80>
> args = ('-t', 'INFO/AD,INFO/ADF,INFO/ADR', '-L', '99999999', '-A', '-f', ...)
> kwargs = {}, retval = 1
> stderr = '\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is i...up" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> stdout = ''
> 
>     def __call__(self, *args, **kwargs):
>         '''execute a samtools command.
>     
>         Keyword arguments:
>         catch_stdout -- redirect stdout from the samtools command and
>             return as variable (default True)
>         save_stdout -- redirect stdout to a filename.
>         raw -- ignore any parsers associated with this samtools command.
>         split_lines -- return stdout (if catch_stdout is True and stderr
>                        as a list of strings.
>         '''
>         retval, stderr, stdout = _pysam_dispatch(
>             self.collection,
>             self.dispatch,
>             args,
>             catch_stdout=kwargs.get("catch_stdout", True),
>             save_stdout=kwargs.get("save_stdout", None))
>     
>         if kwargs.get("split_lines", False):
>             stdout = stdout.splitlines()
>             if stderr:
>                 stderr = stderr.splitlines()
>     
>         if retval:
> >           raise SamtoolsError(
>                 "%s returned with error %i: "
>                 "stdout=%s, stderr=%s" %
>                 (self.collection,
>                  retval,
>                  stdout,
>                  stderr))
> E           pysam.utils.SamtoolsError: 'samtools returned with error 1: stdout=, stderr=\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is in the Illumina-1.3+ encoding\n  -A, --count-orphans     do not discard anomalous read pairs\n  -b, --bam-list FILE     list of input BAM filenames, one per line\n  -B, --no-BAQ            disable BAQ (per-Base Alignment Quality)\n  -C, --adjust-MQ INT     adjust mapping quality; recommended:50, disable:0 [0]\n  -d, --max-depth INT     max per-file depth; avoids excessive memory usage [8000]\n  -E, --redo-BAQ          recalculate BAQ on the fly, ignore existing BQs\n  -f, --fasta-ref FILE    faidx indexed reference sequence file\n  -G, --exclude-RG FILE   exclude read groups listed in FILE\n  -l, --positions FILE    skip unlisted positions (chr pos) or regions (BED)\n  -q, --min-MQ INT        skip alignments with mapQ smaller than INT [0]\n  -Q, --min-BQ INT        skip bases with baseQ/BAQ smaller than INT [13]\n  -r, --region REG        region in which pileup is generated\n  -R, --ignore-RG         ignore RG tags (one BAM = one sample)\n  --rf, --incl-flags STR|INT  required flags: include reads with any of the mask bits set []\n  --ff, --excl-flags STR|INT  filter flags: skip reads with any of the mask bits set\n                                            [UNMAP,SECONDARY,QCFAIL,DUP]\n  -x, --ignore-overlaps   disable read-pair overlap detection\n  -X, --customized-index  use customized index files\n\nOutput options:\n  -o, --output FILE        write output to FILE [standard output]\n  -O, --output-BP          output base positions on reads, current orientation\n      --output-BP-5        output base positions on reads, 5\' to 3\' orientation\n  -M, --output-mods        output base modifications\n  -s, --output-MQ          output mapping quality\n      --output-QNAME       output read names\n      --output-extra STR   output extra read fields and read tag values\n      --output-sep CHAR    set the separator character for tag lists [,]\n      --output-empty CHAR  set the no value character for tag lists [*]\n      --no-output-ins      skip insertion sequence after +NUM\n                           Use twice for complete insertion removal\n      --no-output-ins-mods don\'t display base modifications within insertions\n      --no-output-del      skip deletion sequence after -NUM\n                           Use twice for complete deletion removal\n      --no-output-ends     remove ^MQUAL and $ markup in sequence column\n      --reverse-del        use \'#\' character for deletions on the reverse strand\n  -a                       output all positions (including zero depth)\n  -a -a (or -aa)           output absolutely all positions, including unused ref. sequences\n\nGeneric options:\n      --input-fmt-option OPT[=VAL]\n               Specify a single input file format option in the form\n               of OPTION or OPTION=VALUE\n      --reference FILE\n               Reference sequence FASTA FILE [null]\n      --verbosity INT\n               Set level of verbosity\n\nNote that using "samtools mpileup" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> 
> /usr/lib/python3/dist-packages/pysam/utils.py:69: SamtoolsError
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> mpileup: invalid option -- 't'
> ____________ TestCluster.test_full_run_smtls_known_snp_presabs_nonc ____________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_smtls_known_snp_presabs_nonc>
> 
>     def test_full_run_smtls_known_snp_presabs_nonc(self):
>         '''test complete run where samtools calls a snp in a presence/absence noncoding sequence at a known snp position'''
>         fasta_in = os.path.join(data_dir, 'cluster_full_run_smtls_known_snp_presabs_nonc.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_full_run_smtls_known_snp_presabs_nonc.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_full_run_smtls_known_snp_presabs_nonc'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=148, total_reads_bases=13320)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:419: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:447: in _run
>     self.samtools_vars.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:169: in run
>     self._make_vcf_and_read_depths_files()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:39: in _make_vcf_and_read_depths_files
>     print(pysam.mpileup(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <pysam.utils.PysamDispatcher object at 0x7f174ab56f80>
> args = ('-t', 'INFO/AD,INFO/ADF,INFO/ADR', '-L', '99999999', '-A', '-f', ...)
> kwargs = {}, retval = 1
> stderr = '\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is i...up" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> stdout = ''
> 
>     def __call__(self, *args, **kwargs):
>         '''execute a samtools command.
>     
>         Keyword arguments:
>         catch_stdout -- redirect stdout from the samtools command and
>             return as variable (default True)
>         save_stdout -- redirect stdout to a filename.
>         raw -- ignore any parsers associated with this samtools command.
>         split_lines -- return stdout (if catch_stdout is True and stderr
>                        as a list of strings.
>         '''
>         retval, stderr, stdout = _pysam_dispatch(
>             self.collection,
>             self.dispatch,
>             args,
>             catch_stdout=kwargs.get("catch_stdout", True),
>             save_stdout=kwargs.get("save_stdout", None))
>     
>         if kwargs.get("split_lines", False):
>             stdout = stdout.splitlines()
>             if stderr:
>                 stderr = stderr.splitlines()
>     
>         if retval:
> >           raise SamtoolsError(
>                 "%s returned with error %i: "
>                 "stdout=%s, stderr=%s" %
>                 (self.collection,
>                  retval,
>                  stdout,
>                  stderr))
> E           pysam.utils.SamtoolsError: 'samtools returned with error 1: stdout=, stderr=\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is in the Illumina-1.3+ encoding\n  -A, --count-orphans     do not discard anomalous read pairs\n  -b, --bam-list FILE     list of input BAM filenames, one per line\n  -B, --no-BAQ            disable BAQ (per-Base Alignment Quality)\n  -C, --adjust-MQ INT     adjust mapping quality; recommended:50, disable:0 [0]\n  -d, --max-depth INT     max per-file depth; avoids excessive memory usage [8000]\n  -E, --redo-BAQ          recalculate BAQ on the fly, ignore existing BQs\n  -f, --fasta-ref FILE    faidx indexed reference sequence file\n  -G, --exclude-RG FILE   exclude read groups listed in FILE\n  -l, --positions FILE    skip unlisted positions (chr pos) or regions (BED)\n  -q, --min-MQ INT        skip alignments with mapQ smaller than INT [0]\n  -Q, --min-BQ INT        skip bases with baseQ/BAQ smaller than INT [13]\n  -r, --region REG        region in which pileup is generated\n  -R, --ignore-RG         ignore RG tags (one BAM = one sample)\n  --rf, --incl-flags STR|INT  required flags: include reads with any of the mask bits set []\n  --ff, --excl-flags STR|INT  filter flags: skip reads with any of the mask bits set\n                                            [UNMAP,SECONDARY,QCFAIL,DUP]\n  -x, --ignore-overlaps   disable read-pair overlap detection\n  -X, --customized-index  use customized index files\n\nOutput options:\n  -o, --output FILE        write output to FILE [standard output]\n  -O, --output-BP          output base positions on reads, current orientation\n      --output-BP-5        output base positions on reads, 5\' to 3\' orientation\n  -M, --output-mods        output base modifications\n  -s, --output-MQ          output mapping quality\n      --output-QNAME       output read names\n      --output-extra STR   output extra read fields and read tag values\n      --output-sep CHAR    set the separator character for tag lists [,]\n      --output-empty CHAR  set the no value character for tag lists [*]\n      --no-output-ins      skip insertion sequence after +NUM\n                           Use twice for complete insertion removal\n      --no-output-ins-mods don\'t display base modifications within insertions\n      --no-output-del      skip deletion sequence after -NUM\n                           Use twice for complete deletion removal\n      --no-output-ends     remove ^MQUAL and $ markup in sequence column\n      --reverse-del        use \'#\' character for deletions on the reverse strand\n  -a                       output all positions (including zero depth)\n  -a -a (or -aa)           output absolutely all positions, including unused ref. sequences\n\nGeneric options:\n      --input-fmt-option OPT[=VAL]\n               Specify a single input file format option in the form\n               of OPTION or OPTION=VALUE\n      --reference FILE\n               Reference sequence FASTA FILE [null]\n      --verbosity INT\n               Set level of verbosity\n\nNote that using "samtools mpileup" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> 
> /usr/lib/python3/dist-packages/pysam/utils.py:69: SamtoolsError
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> mpileup: invalid option -- 't'
> _______________ TestCluster.test_full_run_smtls_snp_presabs_gene _______________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_smtls_snp_presabs_gene>
> 
>     def test_full_run_smtls_snp_presabs_gene(self):
>         '''test complete run where samtools calls a snp in a presence/absence gene'''
>         fasta_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_presabs_gene.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_presabs_gene.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_full_run_smtls_snp_presabs_gene'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=148, total_reads_bases=13320)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:306: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:447: in _run
>     self.samtools_vars.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:169: in run
>     self._make_vcf_and_read_depths_files()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:39: in _make_vcf_and_read_depths_files
>     print(pysam.mpileup(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <pysam.utils.PysamDispatcher object at 0x7f174ab56f80>
> args = ('-t', 'INFO/AD,INFO/ADF,INFO/ADR', '-L', '99999999', '-A', '-f', ...)
> kwargs = {}, retval = 1
> stderr = '\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is i...up" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> stdout = ''
> 
>     def __call__(self, *args, **kwargs):
>         '''execute a samtools command.
>     
>         Keyword arguments:
>         catch_stdout -- redirect stdout from the samtools command and
>             return as variable (default True)
>         save_stdout -- redirect stdout to a filename.
>         raw -- ignore any parsers associated with this samtools command.
>         split_lines -- return stdout (if catch_stdout is True and stderr
>                        as a list of strings.
>         '''
>         retval, stderr, stdout = _pysam_dispatch(
>             self.collection,
>             self.dispatch,
>             args,
>             catch_stdout=kwargs.get("catch_stdout", True),
>             save_stdout=kwargs.get("save_stdout", None))
>     
>         if kwargs.get("split_lines", False):
>             stdout = stdout.splitlines()
>             if stderr:
>                 stderr = stderr.splitlines()
>     
>         if retval:
> >           raise SamtoolsError(
>                 "%s returned with error %i: "
>                 "stdout=%s, stderr=%s" %
>                 (self.collection,
>                  retval,
>                  stdout,
>                  stderr))
> E           pysam.utils.SamtoolsError: 'samtools returned with error 1: stdout=, stderr=\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is in the Illumina-1.3+ encoding\n  -A, --count-orphans     do not discard anomalous read pairs\n  -b, --bam-list FILE     list of input BAM filenames, one per line\n  -B, --no-BAQ            disable BAQ (per-Base Alignment Quality)\n  -C, --adjust-MQ INT     adjust mapping quality; recommended:50, disable:0 [0]\n  -d, --max-depth INT     max per-file depth; avoids excessive memory usage [8000]\n  -E, --redo-BAQ          recalculate BAQ on the fly, ignore existing BQs\n  -f, --fasta-ref FILE    faidx indexed reference sequence file\n  -G, --exclude-RG FILE   exclude read groups listed in FILE\n  -l, --positions FILE    skip unlisted positions (chr pos) or regions (BED)\n  -q, --min-MQ INT        skip alignments with mapQ smaller than INT [0]\n  -Q, --min-BQ INT        skip bases with baseQ/BAQ smaller than INT [13]\n  -r, --region REG        region in which pileup is generated\n  -R, --ignore-RG         ignore RG tags (one BAM = one sample)\n  --rf, --incl-flags STR|INT  required flags: include reads with any of the mask bits set []\n  --ff, --excl-flags STR|INT  filter flags: skip reads with any of the mask bits set\n                                            [UNMAP,SECONDARY,QCFAIL,DUP]\n  -x, --ignore-overlaps   disable read-pair overlap detection\n  -X, --customized-index  use customized index files\n\nOutput options:\n  -o, --output FILE        write output to FILE [standard output]\n  -O, --output-BP          output base positions on reads, current orientation\n      --output-BP-5        output base positions on reads, 5\' to 3\' orientation\n  -M, --output-mods        output base modifications\n  -s, --output-MQ          output mapping quality\n      --output-QNAME       output read names\n      --output-extra STR   output extra read fields and read tag values\n      --output-sep CHAR    set the separator character for tag lists [,]\n      --output-empty CHAR  set the no value character for tag lists [*]\n      --no-output-ins      skip insertion sequence after +NUM\n                           Use twice for complete insertion removal\n      --no-output-ins-mods don\'t display base modifications within insertions\n      --no-output-del      skip deletion sequence after -NUM\n                           Use twice for complete deletion removal\n      --no-output-ends     remove ^MQUAL and $ markup in sequence column\n      --reverse-del        use \'#\' character for deletions on the reverse strand\n  -a                       output all positions (including zero depth)\n  -a -a (or -aa)           output absolutely all positions, including unused ref. sequences\n\nGeneric options:\n      --input-fmt-option OPT[=VAL]\n               Specify a single input file format option in the form\n               of OPTION or OPTION=VALUE\n      --reference FILE\n               Reference sequence FASTA FILE [null]\n      --verbosity INT\n               Set level of verbosity\n\nNote that using "samtools mpileup" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> 
> /usr/lib/python3/dist-packages/pysam/utils.py:69: SamtoolsError
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> mpileup: invalid option -- 't'
> _______________ TestCluster.test_full_run_smtls_snp_presabs_nonc _______________
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_smtls_snp_presabs_nonc>
> 
>     def test_full_run_smtls_snp_presabs_nonc(self):
>         '''test complete run where samtools calls a snp in a presence/absence noncoding sequence'''
>         fasta_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_presabs_nonc.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_presabs_nonc.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_smtls_snp_presabs_nonc'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_full_run_smtls_snp_presabs_nonc'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=148, total_reads_bases=13320)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:402: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:447: in _run
>     self.samtools_vars.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:169: in run
>     self._make_vcf_and_read_depths_files()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:39: in _make_vcf_and_read_depths_files
>     print(pysam.mpileup(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <pysam.utils.PysamDispatcher object at 0x7f174ab56f80>
> args = ('-t', 'INFO/AD,INFO/ADF,INFO/ADR', '-L', '99999999', '-A', '-f', ...)
> kwargs = {}, retval = 1
> stderr = '\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is i...up" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> stdout = ''
> 
>     def __call__(self, *args, **kwargs):
>         '''execute a samtools command.
>     
>         Keyword arguments:
>         catch_stdout -- redirect stdout from the samtools command and
>             return as variable (default True)
>         save_stdout -- redirect stdout to a filename.
>         raw -- ignore any parsers associated with this samtools command.
>         split_lines -- return stdout (if catch_stdout is True and stderr
>                        as a list of strings.
>         '''
>         retval, stderr, stdout = _pysam_dispatch(
>             self.collection,
>             self.dispatch,
>             args,
>             catch_stdout=kwargs.get("catch_stdout", True),
>             save_stdout=kwargs.get("save_stdout", None))
>     
>         if kwargs.get("split_lines", False):
>             stdout = stdout.splitlines()
>             if stderr:
>                 stderr = stderr.splitlines()
>     
>         if retval:
> >           raise SamtoolsError(
>                 "%s returned with error %i: "
>                 "stdout=%s, stderr=%s" %
>                 (self.collection,
>                  retval,
>                  stdout,
>                  stderr))
> E           pysam.utils.SamtoolsError: 'samtools returned with error 1: stdout=, stderr=\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is in the Illumina-1.3+ encoding\n  -A, --count-orphans     do not discard anomalous read pairs\n  -b, --bam-list FILE     list of input BAM filenames, one per line\n  -B, --no-BAQ            disable BAQ (per-Base Alignment Quality)\n  -C, --adjust-MQ INT     adjust mapping quality; recommended:50, disable:0 [0]\n  -d, --max-depth INT     max per-file depth; avoids excessive memory usage [8000]\n  -E, --redo-BAQ          recalculate BAQ on the fly, ignore existing BQs\n  -f, --fasta-ref FILE    faidx indexed reference sequence file\n  -G, --exclude-RG FILE   exclude read groups listed in FILE\n  -l, --positions FILE    skip unlisted positions (chr pos) or regions (BED)\n  -q, --min-MQ INT        skip alignments with mapQ smaller than INT [0]\n  -Q, --min-BQ INT        skip bases with baseQ/BAQ smaller than INT [13]\n  -r, --region REG        region in which pileup is generated\n  -R, --ignore-RG         ignore RG tags (one BAM = one sample)\n  --rf, --incl-flags STR|INT  required flags: include reads with any of the mask bits set []\n  --ff, --excl-flags STR|INT  filter flags: skip reads with any of the mask bits set\n                                            [UNMAP,SECONDARY,QCFAIL,DUP]\n  -x, --ignore-overlaps   disable read-pair overlap detection\n  -X, --customized-index  use customized index files\n\nOutput options:\n  -o, --output FILE        write output to FILE [standard output]\n  -O, --output-BP          output base positions on reads, current orientation\n      --output-BP-5        output base positions on reads, 5\' to 3\' orientation\n  -M, --output-mods        output base modifications\n  -s, --output-MQ          output mapping quality\n      --output-QNAME       output read names\n      --output-extra STR   output extra read fields and read tag values\n      --output-sep CHAR    set the separator character for tag lists [,]\n      --output-empty CHAR  set the no value character for tag lists [*]\n      --no-output-ins      skip insertion sequence after +NUM\n                           Use twice for complete insertion removal\n      --no-output-ins-mods don\'t display base modifications within insertions\n      --no-output-del      skip deletion sequence after -NUM\n                           Use twice for complete deletion removal\n      --no-output-ends     remove ^MQUAL and $ markup in sequence column\n      --reverse-del        use \'#\' character for deletions on the reverse strand\n  -a                       output all positions (including zero depth)\n  -a -a (or -aa)           output absolutely all positions, including unused ref. sequences\n\nGeneric options:\n      --input-fmt-option OPT[=VAL]\n               Specify a single input file format option in the form\n               of OPTION or OPTION=VALUE\n      --reference FILE\n               Reference sequence FASTA FILE [null]\n      --verbosity INT\n               Set level of verbosity\n\nNote that using "samtools mpileup" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> 
> /usr/lib/python3/dist-packages/pysam/utils.py:69: SamtoolsError
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> mpileup: invalid option -- 't'
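
Note: the mpileup failures above share one cause. Current samtools has dropped
the options that made "samtools mpileup" emit VCF/BCF (including -t/--output-tags;
the usage text in the traceback says to use "bcftools mpileup" instead), so
ariba's samtools_variants.py call pysam.mpileup('-t', 'INFO/AD,INFO/ADF,INFO/ADR', ...)
now fails. Below is only a minimal sketch of one possible adaptation, assuming
bcftools is available at build time; the function name and file arguments are
illustrative and this is not necessarily the patch upstream will choose.

    # Sketch only: produce the VCF with bcftools mpileup + call instead of the
    # removed "samtools mpileup -t ..." VCF output path.  The INFO/AD tags are
    # the ones the old ariba call requested via -t.
    import subprocess

    def make_vcf(ref_fasta, sorted_bam, vcf_out):
        mpileup = subprocess.run(
            ['bcftools', 'mpileup',
             '-a', 'INFO/AD,INFO/ADF,INFO/ADR',   # per-allele depth annotations
             '-f', ref_fasta, sorted_bam],
            check=True, stdout=subprocess.PIPE)
        with open(vcf_out, 'wb') as f:
            subprocess.run(['bcftools', 'call', '-m', '-v'],
                           input=mpileup.stdout, stdout=f, check=True)
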
> _______________ TestCluster.test_full_run_smtls_snp_varonly_gene _______________
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
> >           output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:20: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> timeout = None, popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2}
> 
>     def check_output(*popenargs, timeout=None, **kwargs):
>         r"""Run command with arguments and return its output.
>     
>         If the exit code was non-zero it raises a CalledProcessError.  The
>         CalledProcessError object will have the return code in the returncode
>         attribute and output in the output attribute.
>     
>         The arguments are the same as for the Popen constructor.  Example:
>     
>         >>> check_output(["ls", "-l", "/dev/null"])
>         b'crw-rw-rw- 1 root root 1, 3 Oct 18  2007 /dev/null\n'
>     
>         The stdout argument is not allowed as it is used internally.
>         To capture standard error in the result, use stderr=STDOUT.
>     
>         >>> check_output(["/bin/sh", "-c",
>         ...               "ls -l non_existent_file ; exit 0"],
>         ...              stderr=STDOUT)
>         b'ls: non_existent_file: No such file or directory\n'
>     
>         There is an additional optional argument, "input", allowing you to
>         pass a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it too will be used internally.  Example:
>     
>         >>> check_output(["sed", "-e", "s/foo/bar/"],
>         ...              input=b"when in the course of fooman events\n")
>         b'when in the course of barman events\n'
>     
>         By default, all communication is in bytes, and therefore any "input"
>         should be bytes, and the return value will be bytes.  If in text mode,
>         any "input" should be a string, and the return value will be a string
>         decoded according to locale encoding, or by "encoding" if set. Text mode
>         is triggered by setting any of text, encoding, errors or universal_newlines.
>         """
>         if 'stdout' in kwargs:
>             raise ValueError('stdout argument not allowed, it will be overridden.')
>     
>         if 'input' in kwargs and kwargs['input'] is None:
>             # Explicitly passing input=None was previously equivalent to passing an
>             # empty string. That is maintained here for backwards compatibility.
>             if kwargs.get('universal_newlines') or kwargs.get('text'):
>                 empty = ''
>             else:
>                 empty = b''
>             kwargs['input'] = empty
>     
> >       return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
>                    **kwargs).stdout
> 
> /usr/lib/python3.10/subprocess.py:420: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> input = None, capture_output = False, timeout = None, check = True
> popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2, 'stdout': -1}
> process = <Popen: returncode: 255 args: 'bash run_nucmer.sh'>
> stdout = b'1: PREPARING DATA\n2,3: RUNNING mummer AND CREATING CLUSTERS\n# reading input file "p.ntref" of length 97\n# constru...Could not parse delta file, p.delta\nerror no: 400\nERROR: Could not parse delta file, p.delta.filter\nerror no: 402\n'
> stderr = None, retcode = 255
> 
>     def run(*popenargs,
>             input=None, capture_output=False, timeout=None, check=False, **kwargs):
>         """Run command with arguments and return a CompletedProcess instance.
>     
>         The returned instance will have attributes args, returncode, stdout and
>         stderr. By default, stdout and stderr are not captured, and those attributes
>         will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
>     
>         If check is True and the exit code was non-zero, it raises a
>         CalledProcessError. The CalledProcessError object will have the return code
>         in the returncode attribute, and output & stderr attributes if those streams
>         were captured.
>     
>         If timeout is given, and the process takes too long, a TimeoutExpired
>         exception will be raised.
>     
>         There is an optional argument "input", allowing you to
>         pass bytes or a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it will be used internally.
>     
>         By default, all communication is in bytes, and therefore any "input" should
>         be bytes, and the stdout and stderr will be bytes. If in text mode, any
>         "input" should be a string, and stdout and stderr will be strings decoded
>         according to locale encoding, or by "encoding" if set. Text mode is
>         triggered by setting any of text, encoding, errors or universal_newlines.
>     
>         The other arguments are the same as for the Popen constructor.
>         """
>         if input is not None:
>             if kwargs.get('stdin') is not None:
>                 raise ValueError('stdin and input arguments may not both be used.')
>             kwargs['stdin'] = PIPE
>     
>         if capture_output:
>             if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
>                 raise ValueError('stdout and stderr arguments may not be used '
>                                  'with capture_output.')
>             kwargs['stdout'] = PIPE
>             kwargs['stderr'] = PIPE
>     
>         with Popen(*popenargs, **kwargs) as process:
>             try:
>                 stdout, stderr = process.communicate(input, timeout=timeout)
>             except TimeoutExpired as exc:
>                 process.kill()
>                 if _mswindows:
>                     # Windows accumulates the output in a single blocking
>                     # read() call run on child threads, with the timeout
>                     # being done in a join() on those threads.  communicate()
>                     # _after_ kill() is required to collect that and add it
>                     # to the exception.
>                     exc.stdout, exc.stderr = process.communicate()
>                 else:
>                     # POSIX _communicate already populated the output so
>                     # far into the TimeoutExpired exception.
>                     process.wait()
>                 raise
>             except:  # Including KeyboardInterrupt, communicate handled that.
>                 process.kill()
>                 # We don't call process.wait() as .__exit__ does that for us.
>                 raise
>             retcode = process.poll()
>             if check and retcode:
> >               raise CalledProcessError(retcode, process.args,
>                                          output=stdout, stderr=stderr)
> E               subprocess.CalledProcessError: Command 'bash run_nucmer.sh' returned non-zero exit status 255.
> 
> /usr/lib/python3.10/subprocess.py:524: CalledProcessError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_smtls_snp_varonly_gene>
> 
>     def test_full_run_smtls_snp_varonly_gene(self):
>         '''test complete run where samtools calls a snp at a known snp location in a variant only gene, gene does have variant'''
>         fasta_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_gene.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_gene.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_gene'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=148, total_reads_bases=13320)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:382: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:365: in _run
>     self.assembly.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/assembly.py:285: in run
>     ref_chooser.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:174: in run
>     best_hit_from_all_seqs, not_needed = RefSeqChooser._closest_nucmer_match_between_fastas(self.all_refs_fasta, pieces_fasta_file, self.log_fh, self.nucmer_min_id, self.nucmer_min_len, self.nucmer_breaklen, True, False)
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:148: in _closest_nucmer_match_between_fastas
>     ).run()
> /usr/lib/python3/dist-packages/pymummer/nucmer.py:144: in run
>     syscall.run('bash ' + script, verbose=self.verbose)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
>             output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
>         except subprocess.CalledProcessError as error:
>             print('The following command failed with exit code', error.returncode, file=sys.stderr)
>             print(cmd, file=sys.stderr)
>             print('\nThe output was:\n', file=sys.stderr)
>             print(error.output.decode(), file=sys.stderr)
> >           raise Error('Error running command:', cmd)
> E           pymummer.syscall.Error: ('Error running command:', 'bash run_nucmer.sh')
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:26: Error
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> The following command failed with exit code 255
> bash run_nucmer.sh
> 
> The output was:
> 
> 1: PREPARING DATA
> 2,3: RUNNING mummer AND CREATING CLUSTERS
> # reading input file "p.ntref" of length 97
> # construct suffix tree for sequence of length 97
> # (maximum reference length is 536870908)
> # (maximum query length is 4294967295)
> # CONSTRUCTIONTIME /usr/bin/mummer p.ntref 0.00
> # reading input file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.choose_ref.ncsxe5i8/nucmer_vs_cluster_refs.pieces.fa" of length 96
> # matching query-file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.choose_ref.ncsxe5i8/nucmer_vs_cluster_refs.pieces.fa"
> # against subject-file "p.ntref"
> # COMPLETETIME /usr/bin/mummer p.ntref 0.00
> # SPACE /usr/bin/mummer p.ntref 0.00
> 4: FINISHING DATA
> *** buffer overflow detected ***: terminated
> Aborted
> ERROR: postnuc returned non-zero
> ERROR: Could not parse delta file, p.delta
> error no: 400
> ERROR: Could not parse delta file, p.delta.filter
> error no: 402
> 
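
Note: the nucmer failures (this one and the identical ones that follow) look
unrelated to the samtools issue: MUMmer 3's postnuc aborts with "*** buffer
overflow detected ***" after being handed the extremely long absolute paths
built from the nested tmp.* test directories, several hundred characters in
the query-file line above. Assuming the path length is indeed the trigger, a
workaround is to keep nucmer's inputs on short relative paths; the helper
name and the /tmp base below are illustrative only.

    # Sketch only: copy the FASTA inputs into a short scratch directory and
    # run nucmer there with relative paths, so postnuc never sees the long
    # nested tmp.* paths quoted in the log above.
    import os, shutil, subprocess, tempfile

    def run_nucmer_short_paths(ref_fasta, qry_fasta, prefix='p'):
        workdir = tempfile.mkdtemp(prefix='nuc.', dir='/tmp')
        shutil.copy(ref_fasta, os.path.join(workdir, 'ref.fa'))
        shutil.copy(qry_fasta, os.path.join(workdir, 'qry.fa'))
        subprocess.run(['nucmer', '-p', prefix, 'ref.fa', 'qry.fa'],
                       cwd=workdir, check=True)
        return os.path.join(workdir, prefix + '.delta')
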
> ______________ TestCluster.test_full_run_smtls_snp_varonly_gene_2 ______________
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
> >           output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:20: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> timeout = None, popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2}
> 
>     def check_output(*popenargs, timeout=None, **kwargs):
>         r"""Run command with arguments and return its output.
>     
>         If the exit code was non-zero it raises a CalledProcessError.  The
>         CalledProcessError object will have the return code in the returncode
>         attribute and output in the output attribute.
>     
>         The arguments are the same as for the Popen constructor.  Example:
>     
>         >>> check_output(["ls", "-l", "/dev/null"])
>         b'crw-rw-rw- 1 root root 1, 3 Oct 18  2007 /dev/null\n'
>     
>         The stdout argument is not allowed as it is used internally.
>         To capture standard error in the result, use stderr=STDOUT.
>     
>         >>> check_output(["/bin/sh", "-c",
>         ...               "ls -l non_existent_file ; exit 0"],
>         ...              stderr=STDOUT)
>         b'ls: non_existent_file: No such file or directory\n'
>     
>         There is an additional optional argument, "input", allowing you to
>         pass a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it too will be used internally.  Example:
>     
>         >>> check_output(["sed", "-e", "s/foo/bar/"],
>         ...              input=b"when in the course of fooman events\n")
>         b'when in the course of barman events\n'
>     
>         By default, all communication is in bytes, and therefore any "input"
>         should be bytes, and the return value will be bytes.  If in text mode,
>         any "input" should be a string, and the return value will be a string
>         decoded according to locale encoding, or by "encoding" if set. Text mode
>         is triggered by setting any of text, encoding, errors or universal_newlines.
>         """
>         if 'stdout' in kwargs:
>             raise ValueError('stdout argument not allowed, it will be overridden.')
>     
>         if 'input' in kwargs and kwargs['input'] is None:
>             # Explicitly passing input=None was previously equivalent to passing an
>             # empty string. That is maintained here for backwards compatibility.
>             if kwargs.get('universal_newlines') or kwargs.get('text'):
>                 empty = ''
>             else:
>                 empty = b''
>             kwargs['input'] = empty
>     
> >       return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
>                    **kwargs).stdout
> 
> /usr/lib/python3.10/subprocess.py:420: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> input = None, capture_output = False, timeout = None, check = True
> popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2, 'stdout': -1}
> process = <Popen: returncode: 255 args: 'bash run_nucmer.sh'>
> stdout = b'1: PREPARING DATA\n2,3: RUNNING mummer AND CREATING CLUSTERS\n# reading input file "p.ntref" of length 97\n# constru...Could not parse delta file, p.delta\nerror no: 400\nERROR: Could not parse delta file, p.delta.filter\nerror no: 402\n'
> stderr = None, retcode = 255
> 
>     def run(*popenargs,
>             input=None, capture_output=False, timeout=None, check=False, **kwargs):
>         """Run command with arguments and return a CompletedProcess instance.
>     
>         The returned instance will have attributes args, returncode, stdout and
>         stderr. By default, stdout and stderr are not captured, and those attributes
>         will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
>     
>         If check is True and the exit code was non-zero, it raises a
>         CalledProcessError. The CalledProcessError object will have the return code
>         in the returncode attribute, and output & stderr attributes if those streams
>         were captured.
>     
>         If timeout is given, and the process takes too long, a TimeoutExpired
>         exception will be raised.
>     
>         There is an optional argument "input", allowing you to
>         pass bytes or a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it will be used internally.
>     
>         By default, all communication is in bytes, and therefore any "input" should
>         be bytes, and the stdout and stderr will be bytes. If in text mode, any
>         "input" should be a string, and stdout and stderr will be strings decoded
>         according to locale encoding, or by "encoding" if set. Text mode is
>         triggered by setting any of text, encoding, errors or universal_newlines.
>     
>         The other arguments are the same as for the Popen constructor.
>         """
>         if input is not None:
>             if kwargs.get('stdin') is not None:
>                 raise ValueError('stdin and input arguments may not both be used.')
>             kwargs['stdin'] = PIPE
>     
>         if capture_output:
>             if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
>                 raise ValueError('stdout and stderr arguments may not be used '
>                                  'with capture_output.')
>             kwargs['stdout'] = PIPE
>             kwargs['stderr'] = PIPE
>     
>         with Popen(*popenargs, **kwargs) as process:
>             try:
>                 stdout, stderr = process.communicate(input, timeout=timeout)
>             except TimeoutExpired as exc:
>                 process.kill()
>                 if _mswindows:
>                     # Windows accumulates the output in a single blocking
>                     # read() call run on child threads, with the timeout
>                     # being done in a join() on those threads.  communicate()
>                     # _after_ kill() is required to collect that and add it
>                     # to the exception.
>                     exc.stdout, exc.stderr = process.communicate()
>                 else:
>                     # POSIX _communicate already populated the output so
>                     # far into the TimeoutExpired exception.
>                     process.wait()
>                 raise
>             except:  # Including KeyboardInterrupt, communicate handled that.
>                 process.kill()
>                 # We don't call process.wait() as .__exit__ does that for us.
>                 raise
>             retcode = process.poll()
>             if check and retcode:
> >               raise CalledProcessError(retcode, process.args,
>                                          output=stdout, stderr=stderr)
> E               subprocess.CalledProcessError: Command 'bash run_nucmer.sh' returned non-zero exit status 255.
> 
> /usr/lib/python3.10/subprocess.py:524: CalledProcessError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_smtls_snp_varonly_gene_2>
> 
>     def test_full_run_smtls_snp_varonly_gene_2(self):
>         '''test complete run where samtools calls a snp in a variant only gene'''
>         # _2 because I think test_full_run_smtls_snp_varonly_gene tests the asame functionality.
>         # ... but let's leave both tests in anyway
>         fasta_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_gene_2.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_gene_2.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_full_run_smtls_snp_varonly_gene_2'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_gene_2'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=148, total_reads_bases=13320)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:325: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:365: in _run
>     self.assembly.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/assembly.py:285: in run
>     ref_chooser.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:161: in run
>     best_hit_from_cluster, nucmer_matches = RefSeqChooser._closest_nucmer_match_between_fastas(self.cluster_fasta, self.assembly_fasta_in, self.log_fh, self.nucmer_min_id, self.nucmer_min_len, self.nucmer_breaklen, False, True)
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:148: in _closest_nucmer_match_between_fastas
>     ).run()
> /usr/lib/python3/dist-packages/pymummer/nucmer.py:144: in run
>     syscall.run('bash ' + script, verbose=self.verbose)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
>             output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
>         except subprocess.CalledProcessError as error:
>             print('The following command failed with exit code', error.returncode, file=sys.stderr)
>             print(cmd, file=sys.stderr)
>             print('\nThe output was:\n', file=sys.stderr)
>             print(error.output.decode(), file=sys.stderr)
> >           raise Error('Error running command:', cmd)
> E           pymummer.syscall.Error: ('Error running command:', 'bash run_nucmer.sh')
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:26: Error
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> The following command failed with exit code 255
> bash run_nucmer.sh
> 
> The output was:
> 
> 1: PREPARING DATA
> 2,3: RUNNING mummer AND CREATING CLUSTERS
> # reading input file "p.ntref" of length 97
> # construct suffix tree for sequence of length 97
> # (maximum reference length is 536870908)
> # (maximum query length is 4294967295)
> # CONSTRUCTIONTIME /usr/bin/mummer p.ntref 0.00
> # reading input file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/Assembly/debug_all_contigs.fa" of length 1911
> # matching query-file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/Assembly/debug_all_contigs.fa"
> # against subject-file "p.ntref"
> # COMPLETETIME /usr/bin/mummer p.ntref 0.00
> # SPACE /usr/bin/mummer p.ntref 0.00
> 4: FINISHING DATA
> *** buffer overflow detected ***: terminated
> Aborted
> ERROR: postnuc returned non-zero
> ERROR: Could not parse delta file, p.delta
> error no: 400
> ERROR: Could not parse delta file, p.delta.filter
> error no: 402
> 
> ___________ TestCluster.test_full_run_smtls_snp_varonly_gene_no_snp ____________
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
> >           output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:20: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> timeout = None, popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2}
> 
>     def check_output(*popenargs, timeout=None, **kwargs):
>         r"""Run command with arguments and return its output.
>     
>         If the exit code was non-zero it raises a CalledProcessError.  The
>         CalledProcessError object will have the return code in the returncode
>         attribute and output in the output attribute.
>     
>         The arguments are the same as for the Popen constructor.  Example:
>     
>         >>> check_output(["ls", "-l", "/dev/null"])
>         b'crw-rw-rw- 1 root root 1, 3 Oct 18  2007 /dev/null\n'
>     
>         The stdout argument is not allowed as it is used internally.
>         To capture standard error in the result, use stderr=STDOUT.
>     
>         >>> check_output(["/bin/sh", "-c",
>         ...               "ls -l non_existent_file ; exit 0"],
>         ...              stderr=STDOUT)
>         b'ls: non_existent_file: No such file or directory\n'
>     
>         There is an additional optional argument, "input", allowing you to
>         pass a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it too will be used internally.  Example:
>     
>         >>> check_output(["sed", "-e", "s/foo/bar/"],
>         ...              input=b"when in the course of fooman events\n")
>         b'when in the course of barman events\n'
>     
>         By default, all communication is in bytes, and therefore any "input"
>         should be bytes, and the return value will be bytes.  If in text mode,
>         any "input" should be a string, and the return value will be a string
>         decoded according to locale encoding, or by "encoding" if set. Text mode
>         is triggered by setting any of text, encoding, errors or universal_newlines.
>         """
>         if 'stdout' in kwargs:
>             raise ValueError('stdout argument not allowed, it will be overridden.')
>     
>         if 'input' in kwargs and kwargs['input'] is None:
>             # Explicitly passing input=None was previously equivalent to passing an
>             # empty string. That is maintained here for backwards compatibility.
>             if kwargs.get('universal_newlines') or kwargs.get('text'):
>                 empty = ''
>             else:
>                 empty = b''
>             kwargs['input'] = empty
>     
> >       return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
>                    **kwargs).stdout
> 
> /usr/lib/python3.10/subprocess.py:420: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> input = None, capture_output = False, timeout = None, check = True
> popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2, 'stdout': -1}
> process = <Popen: returncode: 255 args: 'bash run_nucmer.sh'>
> stdout = b'1: PREPARING DATA\n2,3: RUNNING mummer AND CREATING CLUSTERS\n# reading input file "p.ntref" of length 97\n# constru...Could not parse delta file, p.delta\nerror no: 400\nERROR: Could not parse delta file, p.delta.filter\nerror no: 402\n'
> stderr = None, retcode = 255
> 
>     def run(*popenargs,
>             input=None, capture_output=False, timeout=None, check=False, **kwargs):
>         """Run command with arguments and return a CompletedProcess instance.
>     
>         The returned instance will have attributes args, returncode, stdout and
>         stderr. By default, stdout and stderr are not captured, and those attributes
>         will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
>     
>         If check is True and the exit code was non-zero, it raises a
>         CalledProcessError. The CalledProcessError object will have the return code
>         in the returncode attribute, and output & stderr attributes if those streams
>         were captured.
>     
>         If timeout is given, and the process takes too long, a TimeoutExpired
>         exception will be raised.
>     
>         There is an optional argument "input", allowing you to
>         pass bytes or a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it will be used internally.
>     
>         By default, all communication is in bytes, and therefore any "input" should
>         be bytes, and the stdout and stderr will be bytes. If in text mode, any
>         "input" should be a string, and stdout and stderr will be strings decoded
>         according to locale encoding, or by "encoding" if set. Text mode is
>         triggered by setting any of text, encoding, errors or universal_newlines.
>     
>         The other arguments are the same as for the Popen constructor.
>         """
>         if input is not None:
>             if kwargs.get('stdin') is not None:
>                 raise ValueError('stdin and input arguments may not both be used.')
>             kwargs['stdin'] = PIPE
>     
>         if capture_output:
>             if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
>                 raise ValueError('stdout and stderr arguments may not be used '
>                                  'with capture_output.')
>             kwargs['stdout'] = PIPE
>             kwargs['stderr'] = PIPE
>     
>         with Popen(*popenargs, **kwargs) as process:
>             try:
>                 stdout, stderr = process.communicate(input, timeout=timeout)
>             except TimeoutExpired as exc:
>                 process.kill()
>                 if _mswindows:
>                     # Windows accumulates the output in a single blocking
>                     # read() call run on child threads, with the timeout
>                     # being done in a join() on those threads.  communicate()
>                     # _after_ kill() is required to collect that and add it
>                     # to the exception.
>                     exc.stdout, exc.stderr = process.communicate()
>                 else:
>                     # POSIX _communicate already populated the output so
>                     # far into the TimeoutExpired exception.
>                     process.wait()
>                 raise
>             except:  # Including KeyboardInterrupt, communicate handled that.
>                 process.kill()
>                 # We don't call process.wait() as .__exit__ does that for us.
>                 raise
>             retcode = process.poll()
>             if check and retcode:
> >               raise CalledProcessError(retcode, process.args,
>                                          output=stdout, stderr=stderr)
> E               subprocess.CalledProcessError: Command 'bash run_nucmer.sh' returned non-zero exit status 255.
> 
> /usr/lib/python3.10/subprocess.py:524: CalledProcessError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_smtls_snp_varonly_gene_no_snp>
> 
>     def test_full_run_smtls_snp_varonly_gene_no_snp(self):
>         '''test complete run where samtools calls a snp at a known snp location in a variant only gene, gene does not have variant'''
>         fasta_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_gene_no_snp.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_gene_no_snp.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_gene_no_snp'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=148, total_reads_bases=13320)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:362: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:365: in _run
>     self.assembly.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/assembly.py:285: in run
>     ref_chooser.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:161: in run
>     best_hit_from_cluster, nucmer_matches = RefSeqChooser._closest_nucmer_match_between_fastas(self.cluster_fasta, self.assembly_fasta_in, self.log_fh, self.nucmer_min_id, self.nucmer_min_len, self.nucmer_breaklen, False, True)
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:148: in _closest_nucmer_match_between_fastas
>     ).run()
> /usr/lib/python3/dist-packages/pymummer/nucmer.py:144: in run
>     syscall.run('bash ' + script, verbose=self.verbose)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
>             output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
>         except subprocess.CalledProcessError as error:
>             print('The following command failed with exit code', error.returncode, file=sys.stderr)
>             print(cmd, file=sys.stderr)
>             print('\nThe output was:\n', file=sys.stderr)
>             print(error.output.decode(), file=sys.stderr)
> >           raise Error('Error running command:', cmd)
> E           pymummer.syscall.Error: ('Error running command:', 'bash run_nucmer.sh')
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:26: Error
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> The following command failed with exit code 255
> bash run_nucmer.sh
> 
> The output was:
> 
> 1: PREPARING DATA
> 2,3: RUNNING mummer AND CREATING CLUSTERS
> # reading input file "p.ntref" of length 97
> # construct suffix tree for sequence of length 97
> # (maximum reference length is 536870908)
> # (maximum query length is 4294967295)
> # CONSTRUCTIONTIME /usr/bin/mummer p.ntref 0.00
> # reading input file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/Assembly/debug_all_contigs.fa" of length 1911
> # matching query-file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/Assembly/debug_all_contigs.fa"
> # against subject-file "p.ntref"
> # COMPLETETIME /usr/bin/mummer p.ntref 0.00
> # SPACE /usr/bin/mummer p.ntref 0.00
> 4: FINISHING DATA
> *** buffer overflow detected ***: terminated
> Aborted
> ERROR: postnuc returned non-zero
> ERROR: Could not parse delta file, p.delta
> error no: 400
> ERROR: Could not parse delta file, p.delta.filter
> error no: 402
> 
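
Note: the paths in these failures keep growing from test to test, which
suggests that a failed run leaves the process inside its tmp.run_nucmer.* /
tmp.cluster_* working directory, so the next test creates its tmp directory
inside the previous one. If that reading is correct, restoring the working
directory and removing the per-test tmp tree even on failure would keep the
paths short. The class and helper names in this sketch are hypothetical, not
ariba's actual test code.

    # Sketch only: undo directory changes and delete the per-test tmp tree
    # even when a test fails, so later tests do not nest inside leftover
    # tmp.* directories.
    import os, shutil, unittest

    class ClusterTestBase(unittest.TestCase):
        def setUp(self):
            self._start_dir = os.getcwd()
            self.addCleanup(os.chdir, self._start_dir)

        def make_tmpdir(self, path):
            shutil.rmtree(path, ignore_errors=True)
            os.makedirs(path)
            self.addCleanup(shutil.rmtree, path, ignore_errors=True)
            return path
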
> _______________ TestCluster.test_full_run_smtls_snp_varonly_nonc _______________
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
> >           output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:20: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> timeout = None, popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2}
> 
>     def check_output(*popenargs, timeout=None, **kwargs):
>         r"""Run command with arguments and return its output.
>     
>         If the exit code was non-zero it raises a CalledProcessError.  The
>         CalledProcessError object will have the return code in the returncode
>         attribute and output in the output attribute.
>     
>         The arguments are the same as for the Popen constructor.  Example:
>     
>         >>> check_output(["ls", "-l", "/dev/null"])
>         b'crw-rw-rw- 1 root root 1, 3 Oct 18  2007 /dev/null\n'
>     
>         The stdout argument is not allowed as it is used internally.
>         To capture standard error in the result, use stderr=STDOUT.
>     
>         >>> check_output(["/bin/sh", "-c",
>         ...               "ls -l non_existent_file ; exit 0"],
>         ...              stderr=STDOUT)
>         b'ls: non_existent_file: No such file or directory\n'
>     
>         There is an additional optional argument, "input", allowing you to
>         pass a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it too will be used internally.  Example:
>     
>         >>> check_output(["sed", "-e", "s/foo/bar/"],
>         ...              input=b"when in the course of fooman events\n")
>         b'when in the course of barman events\n'
>     
>         By default, all communication is in bytes, and therefore any "input"
>         should be bytes, and the return value will be bytes.  If in text mode,
>         any "input" should be a string, and the return value will be a string
>         decoded according to locale encoding, or by "encoding" if set. Text mode
>         is triggered by setting any of text, encoding, errors or universal_newlines.
>         """
>         if 'stdout' in kwargs:
>             raise ValueError('stdout argument not allowed, it will be overridden.')
>     
>         if 'input' in kwargs and kwargs['input'] is None:
>             # Explicitly passing input=None was previously equivalent to passing an
>             # empty string. That is maintained here for backwards compatibility.
>             if kwargs.get('universal_newlines') or kwargs.get('text'):
>                 empty = ''
>             else:
>                 empty = b''
>             kwargs['input'] = empty
>     
> >       return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
>                    **kwargs).stdout
> 
> /usr/lib/python3.10/subprocess.py:420: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> input = None, capture_output = False, timeout = None, check = True
> popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2, 'stdout': -1}
> process = <Popen: returncode: 255 args: 'bash run_nucmer.sh'>
> stdout = b'1: PREPARING DATA\n2,3: RUNNING mummer AND CREATING CLUSTERS\n# reading input file "p.ntref" of length 97\n# constru...Could not parse delta file, p.delta\nerror no: 400\nERROR: Could not parse delta file, p.delta.filter\nerror no: 402\n'
> stderr = None, retcode = 255
> 
>     def run(*popenargs,
>             input=None, capture_output=False, timeout=None, check=False, **kwargs):
>         """Run command with arguments and return a CompletedProcess instance.
>     
>         The returned instance will have attributes args, returncode, stdout and
>         stderr. By default, stdout and stderr are not captured, and those attributes
>         will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
>     
>         If check is True and the exit code was non-zero, it raises a
>         CalledProcessError. The CalledProcessError object will have the return code
>         in the returncode attribute, and output & stderr attributes if those streams
>         were captured.
>     
>         If timeout is given, and the process takes too long, a TimeoutExpired
>         exception will be raised.
>     
>         There is an optional argument "input", allowing you to
>         pass bytes or a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it will be used internally.
>     
>         By default, all communication is in bytes, and therefore any "input" should
>         be bytes, and the stdout and stderr will be bytes. If in text mode, any
>         "input" should be a string, and stdout and stderr will be strings decoded
>         according to locale encoding, or by "encoding" if set. Text mode is
>         triggered by setting any of text, encoding, errors or universal_newlines.
>     
>         The other arguments are the same as for the Popen constructor.
>         """
>         if input is not None:
>             if kwargs.get('stdin') is not None:
>                 raise ValueError('stdin and input arguments may not both be used.')
>             kwargs['stdin'] = PIPE
>     
>         if capture_output:
>             if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
>                 raise ValueError('stdout and stderr arguments may not be used '
>                                  'with capture_output.')
>             kwargs['stdout'] = PIPE
>             kwargs['stderr'] = PIPE
>     
>         with Popen(*popenargs, **kwargs) as process:
>             try:
>                 stdout, stderr = process.communicate(input, timeout=timeout)
>             except TimeoutExpired as exc:
>                 process.kill()
>                 if _mswindows:
>                     # Windows accumulates the output in a single blocking
>                     # read() call run on child threads, with the timeout
>                     # being done in a join() on those threads.  communicate()
>                     # _after_ kill() is required to collect that and add it
>                     # to the exception.
>                     exc.stdout, exc.stderr = process.communicate()
>                 else:
>                     # POSIX _communicate already populated the output so
>                     # far into the TimeoutExpired exception.
>                     process.wait()
>                 raise
>             except:  # Including KeyboardInterrupt, communicate handled that.
>                 process.kill()
>                 # We don't call process.wait() as .__exit__ does that for us.
>                 raise
>             retcode = process.poll()
>             if check and retcode:
> >               raise CalledProcessError(retcode, process.args,
>                                          output=stdout, stderr=stderr)
> E               subprocess.CalledProcessError: Command 'bash run_nucmer.sh' returned non-zero exit status 255.
> 
> /usr/lib/python3.10/subprocess.py:524: CalledProcessError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_smtls_snp_varonly_nonc>
> 
>     def test_full_run_smtls_snp_varonly_nonc(self):
>         '''test complete run where samtools calls a snp in a presence/absence noncoding sequence'''
>         fasta_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_nonc.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_nonc.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_full_run_smtls_snp_varonly_nonc'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_nonc'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=148, total_reads_bases=13320)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:436: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:365: in _run
>     self.assembly.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/assembly.py:285: in run
>     ref_chooser.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:161: in run
>     best_hit_from_cluster, nucmer_matches = RefSeqChooser._closest_nucmer_match_between_fastas(self.cluster_fasta, self.assembly_fasta_in, self.log_fh, self.nucmer_min_id, self.nucmer_min_len, self.nucmer_breaklen, False, True)
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:148: in _closest_nucmer_match_between_fastas
>     ).run()
> /usr/lib/python3/dist-packages/pymummer/nucmer.py:144: in run
>     syscall.run('bash ' + script, verbose=self.verbose)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
>             output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
>         except subprocess.CalledProcessError as error:
>             print('The following command failed with exit code', error.returncode, file=sys.stderr)
>             print(cmd, file=sys.stderr)
>             print('\nThe output was:\n', file=sys.stderr)
>             print(error.output.decode(), file=sys.stderr)
> >           raise Error('Error running command:', cmd)
> E           pymummer.syscall.Error: ('Error running command:', 'bash run_nucmer.sh')
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:26: Error
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> The following command failed with exit code 255
> bash run_nucmer.sh
> 
> The output was:
> 
> 1: PREPARING DATA
> 2,3: RUNNING mummer AND CREATING CLUSTERS
> # reading input file "p.ntref" of length 97
> # construct suffix tree for sequence of length 97
> # (maximum reference length is 536870908)
> # (maximum query length is 4294967295)
> # CONSTRUCTIONTIME /usr/bin/mummer p.ntref 0.00
> # reading input file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/Assembly/debug_all_contigs.fa" of length 1911
> # matching query-file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/Assembly/debug_all_contigs.fa"
> # against subject-file "p.ntref"
> # COMPLETETIME /usr/bin/mummer p.ntref 0.00
> # SPACE /usr/bin/mummer p.ntref 0.00
> 4: FINISHING DATA
> *** buffer overflow detected ***: terminated
> Aborted
> ERROR: postnuc returned non-zero
> ERROR: Could not parse delta file, p.delta
> error no: 400
> ERROR: Could not parse delta file, p.delta.filter
> error no: 402
> 
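The tracebacks above all bottom out in the same call, subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT), inside pymummer's syscall.run. A minimal, self-contained sketch of that call pattern (the shell command below is purely illustrative, not the real run_nucmer.sh) shows how a non-zero exit status such as 255 surfaces as CalledProcessError carrying the merged output:

    import subprocess

    # Illustrative stand-in for 'bash run_nucmer.sh': a command that prints
    # something and then exits with status 255, as in the failures above.
    cmd = 'echo "simulated nucmer output"; exit 255'
    try:
        subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
    except subprocess.CalledProcessError as error:
        print('exit code:', error.returncode)       # 255, as reported above
        print('output   :', error.output.decode())  # stdout and stderr merged

Because stderr is redirected into stdout, the mummer diagnostics and the "*** buffer overflow detected ***" line end up in error.output (the captured stdout shown in the log) rather than on a separate stderr stream.
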
> ___________ TestCluster.test_full_run_smtls_snp_varonly_nonc_no_snp ____________
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
> >           output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:20: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> timeout = None, popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2}
> 
>     def check_output(*popenargs, timeout=None, **kwargs):
>         r"""Run command with arguments and return its output.
>     
>         If the exit code was non-zero it raises a CalledProcessError.  The
>         CalledProcessError object will have the return code in the returncode
>         attribute and output in the output attribute.
>     
>         The arguments are the same as for the Popen constructor.  Example:
>     
>         >>> check_output(["ls", "-l", "/dev/null"])
>         b'crw-rw-rw- 1 root root 1, 3 Oct 18  2007 /dev/null\n'
>     
>         The stdout argument is not allowed as it is used internally.
>         To capture standard error in the result, use stderr=STDOUT.
>     
>         >>> check_output(["/bin/sh", "-c",
>         ...               "ls -l non_existent_file ; exit 0"],
>         ...              stderr=STDOUT)
>         b'ls: non_existent_file: No such file or directory\n'
>     
>         There is an additional optional argument, "input", allowing you to
>         pass a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it too will be used internally.  Example:
>     
>         >>> check_output(["sed", "-e", "s/foo/bar/"],
>         ...              input=b"when in the course of fooman events\n")
>         b'when in the course of barman events\n'
>     
>         By default, all communication is in bytes, and therefore any "input"
>         should be bytes, and the return value will be bytes.  If in text mode,
>         any "input" should be a string, and the return value will be a string
>         decoded according to locale encoding, or by "encoding" if set. Text mode
>         is triggered by setting any of text, encoding, errors or universal_newlines.
>         """
>         if 'stdout' in kwargs:
>             raise ValueError('stdout argument not allowed, it will be overridden.')
>     
>         if 'input' in kwargs and kwargs['input'] is None:
>             # Explicitly passing input=None was previously equivalent to passing an
>             # empty string. That is maintained here for backwards compatibility.
>             if kwargs.get('universal_newlines') or kwargs.get('text'):
>                 empty = ''
>             else:
>                 empty = b''
>             kwargs['input'] = empty
>     
> >       return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
>                    **kwargs).stdout
> 
> /usr/lib/python3.10/subprocess.py:420: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> input = None, capture_output = False, timeout = None, check = True
> popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2, 'stdout': -1}
> process = <Popen: returncode: 255 args: 'bash run_nucmer.sh'>
> stdout = b'1: PREPARING DATA\n2,3: RUNNING mummer AND CREATING CLUSTERS\n# reading input file "p.ntref" of length 97\n# constru...Could not parse delta file, p.delta\nerror no: 400\nERROR: Could not parse delta file, p.delta.filter\nerror no: 402\n'
> stderr = None, retcode = 255
> 
>     def run(*popenargs,
>             input=None, capture_output=False, timeout=None, check=False, **kwargs):
>         """Run command with arguments and return a CompletedProcess instance.
>     
>         The returned instance will have attributes args, returncode, stdout and
>         stderr. By default, stdout and stderr are not captured, and those attributes
>         will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
>     
>         If check is True and the exit code was non-zero, it raises a
>         CalledProcessError. The CalledProcessError object will have the return code
>         in the returncode attribute, and output & stderr attributes if those streams
>         were captured.
>     
>         If timeout is given, and the process takes too long, a TimeoutExpired
>         exception will be raised.
>     
>         There is an optional argument "input", allowing you to
>         pass bytes or a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it will be used internally.
>     
>         By default, all communication is in bytes, and therefore any "input" should
>         be bytes, and the stdout and stderr will be bytes. If in text mode, any
>         "input" should be a string, and stdout and stderr will be strings decoded
>         according to locale encoding, or by "encoding" if set. Text mode is
>         triggered by setting any of text, encoding, errors or universal_newlines.
>     
>         The other arguments are the same as for the Popen constructor.
>         """
>         if input is not None:
>             if kwargs.get('stdin') is not None:
>                 raise ValueError('stdin and input arguments may not both be used.')
>             kwargs['stdin'] = PIPE
>     
>         if capture_output:
>             if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
>                 raise ValueError('stdout and stderr arguments may not be used '
>                                  'with capture_output.')
>             kwargs['stdout'] = PIPE
>             kwargs['stderr'] = PIPE
>     
>         with Popen(*popenargs, **kwargs) as process:
>             try:
>                 stdout, stderr = process.communicate(input, timeout=timeout)
>             except TimeoutExpired as exc:
>                 process.kill()
>                 if _mswindows:
>                     # Windows accumulates the output in a single blocking
>                     # read() call run on child threads, with the timeout
>                     # being done in a join() on those threads.  communicate()
>                     # _after_ kill() is required to collect that and add it
>                     # to the exception.
>                     exc.stdout, exc.stderr = process.communicate()
>                 else:
>                     # POSIX _communicate already populated the output so
>                     # far into the TimeoutExpired exception.
>                     process.wait()
>                 raise
>             except:  # Including KeyboardInterrupt, communicate handled that.
>                 process.kill()
>                 # We don't call process.wait() as .__exit__ does that for us.
>                 raise
>             retcode = process.poll()
>             if check and retcode:
> >               raise CalledProcessError(retcode, process.args,
>                                          output=stdout, stderr=stderr)
> E               subprocess.CalledProcessError: Command 'bash run_nucmer.sh' returned non-zero exit status 255.
> 
> /usr/lib/python3.10/subprocess.py:524: CalledProcessError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <ariba.tests.cluster_test.TestCluster testMethod=test_full_run_smtls_snp_varonly_nonc_no_snp>
> 
>     def test_full_run_smtls_snp_varonly_nonc_no_snp(self):
>         '''test complete run where samtools calls a snp at a known snp location in a presence/absence noncoding and sample does not have the var'''
>         fasta_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_nonc_no_snp.fa')
>         tsv_in = os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_nonc_no_snp.tsv')
>         refdata = reference_data.ReferenceData([fasta_in], [tsv_in])
>         tmpdir = 'tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding'
>         common.rmtree(tmpdir)
>         shutil.copytree(os.path.join(data_dir, 'cluster_full_run_smtls_snp_varonly_nonc_no_snp'), tmpdir)
>         c = cluster.Cluster(tmpdir, 'cluster_name', refdata, total_reads=148, total_reads_bases=13320)
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/cluster_test.py:473: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:313: in run
>     self._run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/cluster.py:365: in _run
>     self.assembly.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/assembly.py:285: in run
>     ref_chooser.run()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:161: in run
>     best_hit_from_cluster, nucmer_matches = RefSeqChooser._closest_nucmer_match_between_fastas(self.cluster_fasta, self.assembly_fasta_in, self.log_fh, self.nucmer_min_id, self.nucmer_min_len, self.nucmer_breaklen, False, True)
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:148: in _closest_nucmer_match_between_fastas
>     ).run()
> /usr/lib/python3/dist-packages/pymummer/nucmer.py:144: in run
>     syscall.run('bash ' + script, verbose=self.verbose)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
>             output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
>         except subprocess.CalledProcessError as error:
>             print('The following command failed with exit code', error.returncode, file=sys.stderr)
>             print(cmd, file=sys.stderr)
>             print('\nThe output was:\n', file=sys.stderr)
>             print(error.output.decode(), file=sys.stderr)
> >           raise Error('Error running command:', cmd)
> E           pymummer.syscall.Error: ('Error running command:', 'bash run_nucmer.sh')
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:26: Error
> ----------------------------- Captured stdout call -----------------------------
> cluster_name detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> The following command failed with exit code 255
> bash run_nucmer.sh
> 
> The output was:
> 
> 1: PREPARING DATA
> 2,3: RUNNING mummer AND CREATING CLUSTERS
> # reading input file "p.ntref" of length 97
> # construct suffix tree for sequence of length 97
> # (maximum reference length is 536870908)
> # (maximum query length is 4294967295)
> # CONSTRUCTIONTIME /usr/bin/mummer p.ntref 0.00
> # reading input file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/tmp.run_nucmer.gyynmk9t/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/Assembly/debug_all_contigs.fa" of length 1911
> # matching query-file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/tmp.run_nucmer.gyynmk9t/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/Assembly/debug_all_contigs.fa"
> # against subject-file "p.ntref"
> # COMPLETETIME /usr/bin/mummer p.ntref 0.00
> # SPACE /usr/bin/mummer p.ntref 0.00
> 4: FINISHING DATA
> *** buffer overflow detected ***: terminated
> Aborted
> ERROR: postnuc returned non-zero
> ERROR: Could not parse delta file, p.delta
> error no: 400
> ERROR: Could not parse delta file, p.delta.filter
> error no: 402
> 
> ________________________ TestClusters.test_run_with_tb _________________________
> 
> self = <ariba.clusters.Clusters object at 0x7f1705cd5d20>
> 
>     def run(self):
>         try:
> >           self._run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/clusters.py:612: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <ariba.clusters.Clusters object at 0x7f1705cd5d20>
> 
>     def _run(self):
>         cwd = os.getcwd()
>         try:
>             os.chdir(self.outdir)
>             self.write_versions_file(cwd)
>             self._map_and_cluster_reads()
>             self.log_files = None
>     
>             if len(self.cluster_to_dir) > 0:
>                 got_insert_data_ok = self._set_insert_size_data()
>                 if not got_insert_data_ok:
>                     print('WARNING: not enough proper read pairs (found ' + str(self.proper_pairs) + ') to determine insert size.', file=sys.stderr)
>                     print('This probably means that very few reads were mapped at all. No local assemblies will be run', file=sys.stderr)
>                     if self.verbose:
>                         print('Not enough proper read pairs mapped to determine insert size. Skipping all assemblies.', flush=True)
>                 else:
>                     if self.verbose:
>                         print('{:_^79}'.format(' Assembling each cluster '))
>                         print('Will run', self.threads, 'cluster(s) in parallel', flush=True)
>                     self._init_and_run_clusters()
>                     if self.verbose:
>                         print('Finished assembling clusters\n')
>             else:
>                 if self.verbose:
>                     print('No reads mapped. Skipping all assemblies', flush=True)
>                 print('WARNING: no reads mapped to reference genes. Therefore no local assemblies will be run', file=sys.stderr)
>     
>             if not self.clusters_all_ran_ok:
> >               raise Error('At least one cluster failed! Stopping...')
> E               ariba.clusters.Error: At least one cluster failed! Stopping...
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/clusters.py:646: Error
> 
> During handling of the above exception, another exception occurred:
> 
> self = <ariba.tests.clusters_test.TestClusters testMethod=test_run_with_tb>
> 
>     def test_run_with_tb(self):
>         '''test complete run with TB amr calling'''
>         tmp_out = 'tmp.clusters_run_with_tb'
>         c = clusters.Clusters(
>             os.path.join(data_dir, 'clusters_run_with_tb.ref'),
>             os.path.join(data_dir, 'clusters_run_with_tb.reads_1.fq.gz'),
>             os.path.join(data_dir, 'clusters_run_with_tb.reads_2.fq.gz'),
>             tmp_out,
>             extern_progs,
>         )
> >       c.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/clusters_test.py:342: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <ariba.clusters.Clusters object at 0x7f1705cd5d20>
> 
>     def run(self):
>         try:
>             self._run()
>         except Error as err:
>             self._emergency_stop()
> >           raise Error('Something went wrong during ariba run. Cannot continue. Error was:\n' + str(err))
> E           ariba.clusters.Error: Something went wrong during ariba run. Cannot continue. Error was:
> E           At least one cluster failed! Stopping...
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/clusters.py:615: Error
> ----------------------------- Captured stdout call -----------------------------
> cluster detected 1 threads available to it
> ----------------------------- Captured stderr call -----------------------------
> The following command failed with exit code 255
> bash run_nucmer.sh
> 
> The output was:
> 
> 1: PREPARING DATA
> 2,3: RUNNING mummer AND CREATING CLUSTERS
> # reading input file "p.ntref" of length 676
> # construct suffix tree for sequence of length 676
> # (maximum reference length is 536870908)
> # (maximum query length is 4294967295)
> # CONSTRUCTIONTIME /usr/bin/mummer p.ntref 0.00
> # reading input file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/tmp.run_nucmer.gyynmk9t/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.run_nucmer.lutusrvz/tmp.clusters_run_with_tb/ariba.tmp.jbwmd4sf/cluster/Assembly/debug_all_contigs.fa" of length 4573
> # matching query-file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/tmp.run_nucmer.gyynmk9t/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.run_nucmer.lutusrvz/tmp.clusters_run_with_tb/ariba.tmp.jbwmd4sf/cluster/Assembly/debug_all_contigs.fa"
> # against subject-file "p.ntref"
> # COMPLETETIME /usr/bin/mummer p.ntref 0.00
> # SPACE /usr/bin/mummer p.ntref 0.01
> 4: FINISHING DATA
> *** buffer overflow detected ***: terminated
> Aborted
> ERROR: postnuc returned non-zero
> ERROR: Could not parse delta file, p.delta
> error no: 400
> ERROR: Could not parse delta file, p.delta.filter
> error no: 402
> 
> Failed cluster: cluster
> Other clusters failed. Will not start cluster cluster_1
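Each failure in this log prints two tracebacks joined by "During handling of the above exception, another exception occurred" because pymummer's syscall.run catches the CalledProcessError and raises its own Error inside the except block, which implicitly chains the original exception. A small sketch of that wrapping pattern (the Error class here is a stand-in for illustration, not the real pymummer.syscall.Error):

    import subprocess

    class Error(Exception):
        """Stand-in for a library-specific error type (illustrative only)."""

    def run(cmd):
        try:
            return subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
        except subprocess.CalledProcessError:
            # Raising here, inside the except block, chains the original
            # CalledProcessError as __context__, producing the two-part
            # tracebacks seen throughout this log.
            raise Error('Error running command:', cmd)

    try:
        run('exit 255')
    except Error as err:
        print('wrapper error:', err)
        print('chained from :', repr(err.__context__))  # the CalledProcessError
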
> _____________ TestRefSeqChooser.test_run_best_match_is_in_cluster ______________
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
> >           output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:20: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> timeout = None, popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2}
> 
>     def check_output(*popenargs, timeout=None, **kwargs):
>         r"""Run command with arguments and return its output.
>     
>         If the exit code was non-zero it raises a CalledProcessError.  The
>         CalledProcessError object will have the return code in the returncode
>         attribute and output in the output attribute.
>     
>         The arguments are the same as for the Popen constructor.  Example:
>     
>         >>> check_output(["ls", "-l", "/dev/null"])
>         b'crw-rw-rw- 1 root root 1, 3 Oct 18  2007 /dev/null\n'
>     
>         The stdout argument is not allowed as it is used internally.
>         To capture standard error in the result, use stderr=STDOUT.
>     
>         >>> check_output(["/bin/sh", "-c",
>         ...               "ls -l non_existent_file ; exit 0"],
>         ...              stderr=STDOUT)
>         b'ls: non_existent_file: No such file or directory\n'
>     
>         There is an additional optional argument, "input", allowing you to
>         pass a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it too will be used internally.  Example:
>     
>         >>> check_output(["sed", "-e", "s/foo/bar/"],
>         ...              input=b"when in the course of fooman events\n")
>         b'when in the course of barman events\n'
>     
>         By default, all communication is in bytes, and therefore any "input"
>         should be bytes, and the return value will be bytes.  If in text mode,
>         any "input" should be a string, and the return value will be a string
>         decoded according to locale encoding, or by "encoding" if set. Text mode
>         is triggered by setting any of text, encoding, errors or universal_newlines.
>         """
>         if 'stdout' in kwargs:
>             raise ValueError('stdout argument not allowed, it will be overridden.')
>     
>         if 'input' in kwargs and kwargs['input'] is None:
>             # Explicitly passing input=None was previously equivalent to passing an
>             # empty string. That is maintained here for backwards compatibility.
>             if kwargs.get('universal_newlines') or kwargs.get('text'):
>                 empty = ''
>             else:
>                 empty = b''
>             kwargs['input'] = empty
>     
> >       return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
>                    **kwargs).stdout
> 
> /usr/lib/python3.10/subprocess.py:420: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> input = None, capture_output = False, timeout = None, check = True
> popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2, 'stdout': -1}
> process = <Popen: returncode: 255 args: 'bash run_nucmer.sh'>
> stdout = b'1: PREPARING DATA\n2,3: RUNNING mummer AND CREATING CLUSTERS\n# reading input file "p.ntref" of length 2505\n# const...Could not parse delta file, p.delta\nerror no: 400\nERROR: Could not parse delta file, p.delta.filter\nerror no: 402\n'
> stderr = None, retcode = 255
> 
>     def run(*popenargs,
>             input=None, capture_output=False, timeout=None, check=False, **kwargs):
>         """Run command with arguments and return a CompletedProcess instance.
>     
>         The returned instance will have attributes args, returncode, stdout and
>         stderr. By default, stdout and stderr are not captured, and those attributes
>         will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
>     
>         If check is True and the exit code was non-zero, it raises a
>         CalledProcessError. The CalledProcessError object will have the return code
>         in the returncode attribute, and output & stderr attributes if those streams
>         were captured.
>     
>         If timeout is given, and the process takes too long, a TimeoutExpired
>         exception will be raised.
>     
>         There is an optional argument "input", allowing you to
>         pass bytes or a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it will be used internally.
>     
>         By default, all communication is in bytes, and therefore any "input" should
>         be bytes, and the stdout and stderr will be bytes. If in text mode, any
>         "input" should be a string, and stdout and stderr will be strings decoded
>         according to locale encoding, or by "encoding" if set. Text mode is
>         triggered by setting any of text, encoding, errors or universal_newlines.
>     
>         The other arguments are the same as for the Popen constructor.
>         """
>         if input is not None:
>             if kwargs.get('stdin') is not None:
>                 raise ValueError('stdin and input arguments may not both be used.')
>             kwargs['stdin'] = PIPE
>     
>         if capture_output:
>             if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
>                 raise ValueError('stdout and stderr arguments may not be used '
>                                  'with capture_output.')
>             kwargs['stdout'] = PIPE
>             kwargs['stderr'] = PIPE
>     
>         with Popen(*popenargs, **kwargs) as process:
>             try:
>                 stdout, stderr = process.communicate(input, timeout=timeout)
>             except TimeoutExpired as exc:
>                 process.kill()
>                 if _mswindows:
>                     # Windows accumulates the output in a single blocking
>                     # read() call run on child threads, with the timeout
>                     # being done in a join() on those threads.  communicate()
>                     # _after_ kill() is required to collect that and add it
>                     # to the exception.
>                     exc.stdout, exc.stderr = process.communicate()
>                 else:
>                     # POSIX _communicate already populated the output so
>                     # far into the TimeoutExpired exception.
>                     process.wait()
>                 raise
>             except:  # Including KeyboardInterrupt, communicate handled that.
>                 process.kill()
>                 # We don't call process.wait() as .__exit__ does that for us.
>                 raise
>             retcode = process.poll()
>             if check and retcode:
> >               raise CalledProcessError(retcode, process.args,
>                                          output=stdout, stderr=stderr)
> E               subprocess.CalledProcessError: Command 'bash run_nucmer.sh' returned non-zero exit status 255.
> 
> /usr/lib/python3.10/subprocess.py:524: CalledProcessError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <ariba.tests.ref_seq_chooser_test.TestRefSeqChooser testMethod=test_run_best_match_is_in_cluster>
> 
>     def test_run_best_match_is_in_cluster(self):
>         '''Test full run where the best match is in the cluster'''
>         all_ref_fasta = os.path.join(data_dir, 'ref_seq_chooser_full_run_best_match_is_in_cluster.allrefs.fa')
>         cluster_fasta = os.path.join(data_dir, 'ref_seq_chooser_full_run_best_match_is_in_cluster.clusterrefs.fa')
>         contig_fasta = os.path.join(data_dir, 'ref_seq_chooser_full_run_best_match_is_in_cluster.contigs.fa')
>         tmp_out = 'tmp.ref_seq_chooser_full_run_best_match_is_in_cluster.fa'
>         refchooser = ref_seq_chooser.RefSeqChooser(cluster_fasta, all_ref_fasta, contig_fasta, tmp_out, sys.stdout)
> >       refchooser.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/ref_seq_chooser_test.py:78: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:174: in run
>     best_hit_from_all_seqs, not_needed = RefSeqChooser._closest_nucmer_match_between_fastas(self.all_refs_fasta, pieces_fasta_file, self.log_fh, self.nucmer_min_id, self.nucmer_min_len, self.nucmer_breaklen, True, False)
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:148: in _closest_nucmer_match_between_fastas
>     ).run()
> /usr/lib/python3/dist-packages/pymummer/nucmer.py:144: in run
>     syscall.run('bash ' + script, verbose=self.verbose)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
>             output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
>         except subprocess.CalledProcessError as error:
>             print('The following command failed with exit code', error.returncode, file=sys.stderr)
>             print(cmd, file=sys.stderr)
>             print('\nThe output was:\n', file=sys.stderr)
>             print(error.output.decode(), file=sys.stderr)
> >           raise Error('Error running command:', cmd)
> E           pymummer.syscall.Error: ('Error running command:', 'bash run_nucmer.sh')
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:26: Error
> ----------------------------- Captured stdout call -----------------------------
> Looking for closest match from sequences within cluster
> [choose ref nucmer]	1	500	1	500	500	500	100.00	500	500	1	ref1	ref1.l30.c4.ctg.1
> Closest cluster ref sequence is ref1 to assembly ref1.l30.c4
> Checking for a better match to a ref sequence outside the cluster
> ----------------------------- Captured stderr call -----------------------------
> The following command failed with exit code 255
> bash run_nucmer.sh
> 
> The output was:
> 
> 1: PREPARING DATA
> 2,3: RUNNING mummer AND CREATING CLUSTERS
> # reading input file "p.ntref" of length 2505
> # construct suffix tree for sequence of length 2505
> # (maximum reference length is 536870908)
> # (maximum query length is 4294967295)
> # CONSTRUCTIONTIME /usr/bin/mummer p.ntref 0.00
> # reading input file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/tmp.run_nucmer.gyynmk9t/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.run_nucmer.lutusrvz/tmp.choose_ref.ywr6ak5u/nucmer_vs_cluster_refs.pieces.fa" of length 500
> # matching query-file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/tmp.run_nucmer.gyynmk9t/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.run_nucmer.lutusrvz/tmp.choose_ref.ywr6ak5u/nucmer_vs_cluster_refs.pieces.fa"
> # against subject-file "p.ntref"
> # COMPLETETIME /usr/bin/mummer p.ntref 0.00
> # SPACE /usr/bin/mummer p.ntref 0.00
> 4: FINISHING DATA
> *** buffer overflow detected ***: terminated
> Aborted
> ERROR: postnuc returned non-zero
> ERROR: Could not parse delta file, p.delta
> error no: 400
> ERROR: Could not parse delta file, p.delta.filter
> error no: 402
> 
> _____________ TestRefSeqChooser.test_run_best_match_not_in_cluster _____________
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
> >           output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:20: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> timeout = None, popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2}
> 
>     def check_output(*popenargs, timeout=None, **kwargs):
>         r"""Run command with arguments and return its output.
>     
>         If the exit code was non-zero it raises a CalledProcessError.  The
>         CalledProcessError object will have the return code in the returncode
>         attribute and output in the output attribute.
>     
>         The arguments are the same as for the Popen constructor.  Example:
>     
>         >>> check_output(["ls", "-l", "/dev/null"])
>         b'crw-rw-rw- 1 root root 1, 3 Oct 18  2007 /dev/null\n'
>     
>         The stdout argument is not allowed as it is used internally.
>         To capture standard error in the result, use stderr=STDOUT.
>     
>         >>> check_output(["/bin/sh", "-c",
>         ...               "ls -l non_existent_file ; exit 0"],
>         ...              stderr=STDOUT)
>         b'ls: non_existent_file: No such file or directory\n'
>     
>         There is an additional optional argument, "input", allowing you to
>         pass a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it too will be used internally.  Example:
>     
>         >>> check_output(["sed", "-e", "s/foo/bar/"],
>         ...              input=b"when in the course of fooman events\n")
>         b'when in the course of barman events\n'
>     
>         By default, all communication is in bytes, and therefore any "input"
>         should be bytes, and the return value will be bytes.  If in text mode,
>         any "input" should be a string, and the return value will be a string
>         decoded according to locale encoding, or by "encoding" if set. Text mode
>         is triggered by setting any of text, encoding, errors or universal_newlines.
>         """
>         if 'stdout' in kwargs:
>             raise ValueError('stdout argument not allowed, it will be overridden.')
>     
>         if 'input' in kwargs and kwargs['input'] is None:
>             # Explicitly passing input=None was previously equivalent to passing an
>             # empty string. That is maintained here for backwards compatibility.
>             if kwargs.get('universal_newlines') or kwargs.get('text'):
>                 empty = ''
>             else:
>                 empty = b''
>             kwargs['input'] = empty
>     
> >       return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
>                    **kwargs).stdout
> 
> /usr/lib/python3.10/subprocess.py:420: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> input = None, capture_output = False, timeout = None, check = True
> popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2, 'stdout': -1}
> process = <Popen: returncode: 255 args: 'bash run_nucmer.sh'>
> stdout = b'1: PREPARING DATA\n2,3: RUNNING mummer AND CREATING CLUSTERS\n# reading input file "p.ntref" of length 2505\n# const...Could not parse delta file, p.delta\nerror no: 400\nERROR: Could not parse delta file, p.delta.filter\nerror no: 402\n'
> stderr = None, retcode = 255
> 
>     def run(*popenargs,
>             input=None, capture_output=False, timeout=None, check=False, **kwargs):
>         """Run command with arguments and return a CompletedProcess instance.
>     
>         The returned instance will have attributes args, returncode, stdout and
>         stderr. By default, stdout and stderr are not captured, and those attributes
>         will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
>     
>         If check is True and the exit code was non-zero, it raises a
>         CalledProcessError. The CalledProcessError object will have the return code
>         in the returncode attribute, and output & stderr attributes if those streams
>         were captured.
>     
>         If timeout is given, and the process takes too long, a TimeoutExpired
>         exception will be raised.
>     
>         There is an optional argument "input", allowing you to
>         pass bytes or a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it will be used internally.
>     
>         By default, all communication is in bytes, and therefore any "input" should
>         be bytes, and the stdout and stderr will be bytes. If in text mode, any
>         "input" should be a string, and stdout and stderr will be strings decoded
>         according to locale encoding, or by "encoding" if set. Text mode is
>         triggered by setting any of text, encoding, errors or universal_newlines.
>     
>         The other arguments are the same as for the Popen constructor.
>         """
>         if input is not None:
>             if kwargs.get('stdin') is not None:
>                 raise ValueError('stdin and input arguments may not both be used.')
>             kwargs['stdin'] = PIPE
>     
>         if capture_output:
>             if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
>                 raise ValueError('stdout and stderr arguments may not be used '
>                                  'with capture_output.')
>             kwargs['stdout'] = PIPE
>             kwargs['stderr'] = PIPE
>     
>         with Popen(*popenargs, **kwargs) as process:
>             try:
>                 stdout, stderr = process.communicate(input, timeout=timeout)
>             except TimeoutExpired as exc:
>                 process.kill()
>                 if _mswindows:
>                     # Windows accumulates the output in a single blocking
>                     # read() call run on child threads, with the timeout
>                     # being done in a join() on those threads.  communicate()
>                     # _after_ kill() is required to collect that and add it
>                     # to the exception.
>                     exc.stdout, exc.stderr = process.communicate()
>                 else:
>                     # POSIX _communicate already populated the output so
>                     # far into the TimeoutExpired exception.
>                     process.wait()
>                 raise
>             except:  # Including KeyboardInterrupt, communicate handled that.
>                 process.kill()
>                 # We don't call process.wait() as .__exit__ does that for us.
>                 raise
>             retcode = process.poll()
>             if check and retcode:
> >               raise CalledProcessError(retcode, process.args,
>                                          output=stdout, stderr=stderr)
> E               subprocess.CalledProcessError: Command 'bash run_nucmer.sh' returned non-zero exit status 255.
> 
> /usr/lib/python3.10/subprocess.py:524: CalledProcessError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <ariba.tests.ref_seq_chooser_test.TestRefSeqChooser testMethod=test_run_best_match_not_in_cluster>
> 
>     def test_run_best_match_not_in_cluster(self):
>         '''Test full run where there is a match in cluster, but better match to seq not in cluster'''
>         all_ref_fasta = os.path.join(data_dir, 'ref_seq_chooser_full_run_best_match_not_in_cluster.allrefs.fa')
>         cluster_fasta = os.path.join(data_dir, 'ref_seq_chooser_full_run_best_match_not_in_cluster.clusterrefs.fa')
>         contig_fasta = os.path.join(data_dir, 'ref_seq_chooser_full_run_best_match_not_in_cluster.contigs.fa')
>         tmp_out = 'tmp.ref_seq_chooser_full_run_best_match_not_in_cluster.fa'
>         refchooser = ref_seq_chooser.RefSeqChooser(cluster_fasta, all_ref_fasta, contig_fasta, tmp_out, sys.stdout)
> >       refchooser.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/ref_seq_chooser_test.py:65: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:174: in run
>     best_hit_from_all_seqs, not_needed = RefSeqChooser._closest_nucmer_match_between_fastas(self.all_refs_fasta, pieces_fasta_file, self.log_fh, self.nucmer_min_id, self.nucmer_min_len, self.nucmer_breaklen, True, False)
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:148: in _closest_nucmer_match_between_fastas
>     ).run()
> /usr/lib/python3/dist-packages/pymummer/nucmer.py:144: in run
>     syscall.run('bash ' + script, verbose=self.verbose)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
>             output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
>         except subprocess.CalledProcessError as error:
>             print('The following command failed with exit code', error.returncode, file=sys.stderr)
>             print(cmd, file=sys.stderr)
>             print('\nThe output was:\n', file=sys.stderr)
>             print(error.output.decode(), file=sys.stderr)
> >           raise Error('Error running command:', cmd)
> E           pymummer.syscall.Error: ('Error running command:', 'bash run_nucmer.sh')
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:26: Error
> ----------------------------- Captured stdout call -----------------------------
> Looking for closest match from sequences within cluster
> [choose ref nucmer]	1	500	1	500	500	500	99.00	500	500	1	ref1	ref2.l30.c4.ctg.1
> Closest cluster ref sequence is ref1 to assembly ref2.l30.c4
> Checking for a better match to a ref sequence outside the cluster
> ----------------------------- Captured stderr call -----------------------------
> The following command failed with exit code 255
> bash run_nucmer.sh
> 
> The output was:
> 
> 1: PREPARING DATA
> 2,3: RUNNING mummer AND CREATING CLUSTERS
> # reading input file "p.ntref" of length 2505
> # construct suffix tree for sequence of length 2505
> # (maximum reference length is 536870908)
> # (maximum query length is 4294967295)
> # CONSTRUCTIONTIME /usr/bin/mummer p.ntref 0.00
> # reading input file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/tmp.run_nucmer.gyynmk9t/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.run_nucmer.lutusrvz/tmp.run_nucmer.77haucmq/tmp.choose_ref.bg89_0xu/nucmer_vs_cluster_refs.pieces.fa" of length 500
> # matching query-file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/tmp.run_nucmer.gyynmk9t/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.run_nucmer.lutusrvz/tmp.run_nucmer.77haucmq/tmp.choose_ref.bg89_0xu/nucmer_vs_cluster_refs.pieces.fa"
> # against subject-file "p.ntref"
> # COMPLETETIME /usr/bin/mummer p.ntref 0.00
> # SPACE /usr/bin/mummer p.ntref 0.00
> 4: FINISHING DATA
> *** buffer overflow detected ***: terminated
> Aborted
> ERROR: postnuc returned non-zero
> ERROR: Could not parse delta file, p.delta
> error no: 400
> ERROR: Could not parse delta file, p.delta.filter
> error no: 402
> 
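One detail visible in the mummer output above: the "reading input file" path grows with every failed test (tmp.run_nucmer.1upsbxjt, tmp.run_nucmer.d_iigmi0, tmp.run_nucmer.gyynmk9t, ... end up nested inside one another), which suggests the working directory is not restored after a failed run, so each new temporary directory is created inside the previous one. As a general illustration only (not code taken from ariba or pymummer), a context manager that always returns to the previous directory looks like this:

    import os
    import tempfile
    from contextlib import contextmanager

    @contextmanager
    def restore_cwd(path):
        # Hypothetical helper: chdir into `path` and always return to the
        # previous working directory, even if the body raises.
        old = os.getcwd()
        os.chdir(path)
        try:
            yield
        finally:
            os.chdir(old)

    with tempfile.TemporaryDirectory() as tmp:
        try:
            with restore_cwd(tmp):
                raise RuntimeError('simulated failing nucmer run')
        except RuntimeError:
            pass
    print(os.getcwd())  # unchanged, unlike the nested tmp.* paths in the log
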
> _________________ TestRefSeqChooser.test_run_contained_ref_seq _________________
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
> >           output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:20: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> timeout = None, popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2}
> 
>     def check_output(*popenargs, timeout=None, **kwargs):
>         r"""Run command with arguments and return its output.
>     
>         If the exit code was non-zero it raises a CalledProcessError.  The
>         CalledProcessError object will have the return code in the returncode
>         attribute and output in the output attribute.
>     
>         The arguments are the same as for the Popen constructor.  Example:
>     
>         >>> check_output(["ls", "-l", "/dev/null"])
>         b'crw-rw-rw- 1 root root 1, 3 Oct 18  2007 /dev/null\n'
>     
>         The stdout argument is not allowed as it is used internally.
>         To capture standard error in the result, use stderr=STDOUT.
>     
>         >>> check_output(["/bin/sh", "-c",
>         ...               "ls -l non_existent_file ; exit 0"],
>         ...              stderr=STDOUT)
>         b'ls: non_existent_file: No such file or directory\n'
>     
>         There is an additional optional argument, "input", allowing you to
>         pass a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it too will be used internally.  Example:
>     
>         >>> check_output(["sed", "-e", "s/foo/bar/"],
>         ...              input=b"when in the course of fooman events\n")
>         b'when in the course of barman events\n'
>     
>         By default, all communication is in bytes, and therefore any "input"
>         should be bytes, and the return value will be bytes.  If in text mode,
>         any "input" should be a string, and the return value will be a string
>         decoded according to locale encoding, or by "encoding" if set. Text mode
>         is triggered by setting any of text, encoding, errors or universal_newlines.
>         """
>         if 'stdout' in kwargs:
>             raise ValueError('stdout argument not allowed, it will be overridden.')
>     
>         if 'input' in kwargs and kwargs['input'] is None:
>             # Explicitly passing input=None was previously equivalent to passing an
>             # empty string. That is maintained here for backwards compatibility.
>             if kwargs.get('universal_newlines') or kwargs.get('text'):
>                 empty = ''
>             else:
>                 empty = b''
>             kwargs['input'] = empty
>     
> >       return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
>                    **kwargs).stdout
> 
> /usr/lib/python3.10/subprocess.py:420: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> input = None, capture_output = False, timeout = None, check = True
> popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2, 'stdout': -1}
> process = <Popen: returncode: 255 args: 'bash run_nucmer.sh'>
> stdout = b'1: PREPARING DATA\n2,3: RUNNING mummer AND CREATING CLUSTERS\n# reading input file "p.ntref" of length 1744\n# const...Could not parse delta file, p.delta\nerror no: 400\nERROR: Could not parse delta file, p.delta.filter\nerror no: 402\n'
> stderr = None, retcode = 255
> 
>     def run(*popenargs,
>             input=None, capture_output=False, timeout=None, check=False, **kwargs):
>         """Run command with arguments and return a CompletedProcess instance.
>     
>         The returned instance will have attributes args, returncode, stdout and
>         stderr. By default, stdout and stderr are not captured, and those attributes
>         will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
>     
>         If check is True and the exit code was non-zero, it raises a
>         CalledProcessError. The CalledProcessError object will have the return code
>         in the returncode attribute, and output & stderr attributes if those streams
>         were captured.
>     
>         If timeout is given, and the process takes too long, a TimeoutExpired
>         exception will be raised.
>     
>         There is an optional argument "input", allowing you to
>         pass bytes or a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it will be used internally.
>     
>         By default, all communication is in bytes, and therefore any "input" should
>         be bytes, and the stdout and stderr will be bytes. If in text mode, any
>         "input" should be a string, and stdout and stderr will be strings decoded
>         according to locale encoding, or by "encoding" if set. Text mode is
>         triggered by setting any of text, encoding, errors or universal_newlines.
>     
>         The other arguments are the same as for the Popen constructor.
>         """
>         if input is not None:
>             if kwargs.get('stdin') is not None:
>                 raise ValueError('stdin and input arguments may not both be used.')
>             kwargs['stdin'] = PIPE
>     
>         if capture_output:
>             if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
>                 raise ValueError('stdout and stderr arguments may not be used '
>                                  'with capture_output.')
>             kwargs['stdout'] = PIPE
>             kwargs['stderr'] = PIPE
>     
>         with Popen(*popenargs, **kwargs) as process:
>             try:
>                 stdout, stderr = process.communicate(input, timeout=timeout)
>             except TimeoutExpired as exc:
>                 process.kill()
>                 if _mswindows:
>                     # Windows accumulates the output in a single blocking
>                     # read() call run on child threads, with the timeout
>                     # being done in a join() on those threads.  communicate()
>                     # _after_ kill() is required to collect that and add it
>                     # to the exception.
>                     exc.stdout, exc.stderr = process.communicate()
>                 else:
>                     # POSIX _communicate already populated the output so
>                     # far into the TimeoutExpired exception.
>                     process.wait()
>                 raise
>             except:  # Including KeyboardInterrupt, communicate handled that.
>                 process.kill()
>                 # We don't call process.wait() as .__exit__ does that for us.
>                 raise
>             retcode = process.poll()
>             if check and retcode:
> >               raise CalledProcessError(retcode, process.args,
>                                          output=stdout, stderr=stderr)
> E               subprocess.CalledProcessError: Command 'bash run_nucmer.sh' returned non-zero exit status 255.
> 
> /usr/lib/python3.10/subprocess.py:524: CalledProcessError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <ariba.tests.ref_seq_chooser_test.TestRefSeqChooser testMethod=test_run_contained_ref_seq>
> 
>     def test_run_contained_ref_seq(self):
>         '''Test full run where ref seq completely contains another seq outside cluster'''
>         all_ref_fasta = os.path.join(data_dir, 'ref_seq_chooser_full_run_contained_ref_seq.all_refs.fa')
>         cluster_fasta = os.path.join(data_dir, 'ref_seq_chooser_full_run_contained_ref_seq.cluster_refs.fa')
>         contig_fasta = os.path.join(data_dir, 'ref_seq_chooser_full_run_contained_ref_seq.contigs.fa')
>         tmp_out = 'tmp.ref_seq_chooser_full_run_contained_ref_seq.fa'
>         refchooser = ref_seq_chooser.RefSeqChooser(cluster_fasta, all_ref_fasta, contig_fasta, tmp_out, sys.stdout)
> >       refchooser.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/ref_seq_chooser_test.py:92: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:174: in run
>     best_hit_from_all_seqs, not_needed = RefSeqChooser._closest_nucmer_match_between_fastas(self.all_refs_fasta, pieces_fasta_file, self.log_fh, self.nucmer_min_id, self.nucmer_min_len, self.nucmer_breaklen, True, False)
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:148: in _closest_nucmer_match_between_fastas
>     ).run()
> /usr/lib/python3/dist-packages/pymummer/nucmer.py:144: in run
>     syscall.run('bash ' + script, verbose=self.verbose)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
>             output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
>         except subprocess.CalledProcessError as error:
>             print('The following command failed with exit code', error.returncode, file=sys.stderr)
>             print(cmd, file=sys.stderr)
>             print('\nThe output was:\n', file=sys.stderr)
>             print(error.output.decode(), file=sys.stderr)
> >           raise Error('Error running command:', cmd)
> E           pymummer.syscall.Error: ('Error running command:', 'bash run_nucmer.sh')
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:26: Error
> ----------------------------- Captured stdout call -----------------------------
> Looking for closest match from sequences within cluster
> [choose ref nucmer]	1	500	1	500	500	500	99.80	500	500	1	ref2	ref2.l30.c4.ctg.1
> Closest cluster ref sequence is ref2 to assembly ref2.l30.c4
> Checking for a better match to a ref sequence outside the cluster
> ----------------------------- Captured stderr call -----------------------------
> The following command failed with exit code 255
> bash run_nucmer.sh
> 
> The output was:
> 
> 1: PREPARING DATA
> 2,3: RUNNING mummer AND CREATING CLUSTERS
> # reading input file "p.ntref" of length 1744
> # construct suffix tree for sequence of length 1744
> # (maximum reference length is 536870908)
> # (maximum query length is 4294967295)
> # CONSTRUCTIONTIME /usr/bin/mummer p.ntref 0.00
> # reading input file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/tmp.run_nucmer.gyynmk9t/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.run_nucmer.lutusrvz/tmp.run_nucmer.77haucmq/tmp.run_nucmer.7r3lm6w5/tmp.choose_ref.zpqk_6l7/nucmer_vs_cluster_refs.pieces.fa" of length 500
> # matching query-file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/tmp.run_nucmer.gyynmk9t/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.run_nucmer.lutusrvz/tmp.run_nucmer.77haucmq/tmp.run_nucmer.7r3lm6w5/tmp.choose_ref.zpqk_6l7/nucmer_vs_cluster_refs.pieces.fa"
> # against subject-file "p.ntref"
> # COMPLETETIME /usr/bin/mummer p.ntref 0.00
> # SPACE /usr/bin/mummer p.ntref 0.00
> 4: FINISHING DATA
> *** buffer overflow detected ***: terminated
> Aborted
> ERROR: postnuc returned non-zero
> ERROR: Could not parse delta file, p.delta
> error no: 400
> ERROR: Could not parse delta file, p.delta.filter
> error no: 402
> 
> ________________ TestRefSeqChooser.test_run_flanking_different _________________
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
> >           output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:20: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> timeout = None, popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2}
> 
>     def check_output(*popenargs, timeout=None, **kwargs):
>         r"""Run command with arguments and return its output.
>     
>         If the exit code was non-zero it raises a CalledProcessError.  The
>         CalledProcessError object will have the return code in the returncode
>         attribute and output in the output attribute.
>     
>         The arguments are the same as for the Popen constructor.  Example:
>     
>         >>> check_output(["ls", "-l", "/dev/null"])
>         b'crw-rw-rw- 1 root root 1, 3 Oct 18  2007 /dev/null\n'
>     
>         The stdout argument is not allowed as it is used internally.
>         To capture standard error in the result, use stderr=STDOUT.
>     
>         >>> check_output(["/bin/sh", "-c",
>         ...               "ls -l non_existent_file ; exit 0"],
>         ...              stderr=STDOUT)
>         b'ls: non_existent_file: No such file or directory\n'
>     
>         There is an additional optional argument, "input", allowing you to
>         pass a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it too will be used internally.  Example:
>     
>         >>> check_output(["sed", "-e", "s/foo/bar/"],
>         ...              input=b"when in the course of fooman events\n")
>         b'when in the course of barman events\n'
>     
>         By default, all communication is in bytes, and therefore any "input"
>         should be bytes, and the return value will be bytes.  If in text mode,
>         any "input" should be a string, and the return value will be a string
>         decoded according to locale encoding, or by "encoding" if set. Text mode
>         is triggered by setting any of text, encoding, errors or universal_newlines.
>         """
>         if 'stdout' in kwargs:
>             raise ValueError('stdout argument not allowed, it will be overridden.')
>     
>         if 'input' in kwargs and kwargs['input'] is None:
>             # Explicitly passing input=None was previously equivalent to passing an
>             # empty string. That is maintained here for backwards compatibility.
>             if kwargs.get('universal_newlines') or kwargs.get('text'):
>                 empty = ''
>             else:
>                 empty = b''
>             kwargs['input'] = empty
>     
> >       return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
>                    **kwargs).stdout
> 
> /usr/lib/python3.10/subprocess.py:420: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> input = None, capture_output = False, timeout = None, check = True
> popenargs = ('bash run_nucmer.sh',)
> kwargs = {'shell': True, 'stderr': -2, 'stdout': -1}
> process = <Popen: returncode: 255 args: 'bash run_nucmer.sh'>
> stdout = b'1: PREPARING DATA\n2,3: RUNNING mummer AND CREATING CLUSTERS\n# reading input file "p.ntref" of length 1002\n# const...Could not parse delta file, p.delta\nerror no: 400\nERROR: Could not parse delta file, p.delta.filter\nerror no: 402\n'
> stderr = None, retcode = 255
> 
>     def run(*popenargs,
>             input=None, capture_output=False, timeout=None, check=False, **kwargs):
>         """Run command with arguments and return a CompletedProcess instance.
>     
>         The returned instance will have attributes args, returncode, stdout and
>         stderr. By default, stdout and stderr are not captured, and those attributes
>         will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.
>     
>         If check is True and the exit code was non-zero, it raises a
>         CalledProcessError. The CalledProcessError object will have the return code
>         in the returncode attribute, and output & stderr attributes if those streams
>         were captured.
>     
>         If timeout is given, and the process takes too long, a TimeoutExpired
>         exception will be raised.
>     
>         There is an optional argument "input", allowing you to
>         pass bytes or a string to the subprocess's stdin.  If you use this argument
>         you may not also use the Popen constructor's "stdin" argument, as
>         it will be used internally.
>     
>         By default, all communication is in bytes, and therefore any "input" should
>         be bytes, and the stdout and stderr will be bytes. If in text mode, any
>         "input" should be a string, and stdout and stderr will be strings decoded
>         according to locale encoding, or by "encoding" if set. Text mode is
>         triggered by setting any of text, encoding, errors or universal_newlines.
>     
>         The other arguments are the same as for the Popen constructor.
>         """
>         if input is not None:
>             if kwargs.get('stdin') is not None:
>                 raise ValueError('stdin and input arguments may not both be used.')
>             kwargs['stdin'] = PIPE
>     
>         if capture_output:
>             if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
>                 raise ValueError('stdout and stderr arguments may not be used '
>                                  'with capture_output.')
>             kwargs['stdout'] = PIPE
>             kwargs['stderr'] = PIPE
>     
>         with Popen(*popenargs, **kwargs) as process:
>             try:
>                 stdout, stderr = process.communicate(input, timeout=timeout)
>             except TimeoutExpired as exc:
>                 process.kill()
>                 if _mswindows:
>                     # Windows accumulates the output in a single blocking
>                     # read() call run on child threads, with the timeout
>                     # being done in a join() on those threads.  communicate()
>                     # _after_ kill() is required to collect that and add it
>                     # to the exception.
>                     exc.stdout, exc.stderr = process.communicate()
>                 else:
>                     # POSIX _communicate already populated the output so
>                     # far into the TimeoutExpired exception.
>                     process.wait()
>                 raise
>             except:  # Including KeyboardInterrupt, communicate handled that.
>                 process.kill()
>                 # We don't call process.wait() as .__exit__ does that for us.
>                 raise
>             retcode = process.poll()
>             if check and retcode:
> >               raise CalledProcessError(retcode, process.args,
>                                          output=stdout, stderr=stderr)
> E               subprocess.CalledProcessError: Command 'bash run_nucmer.sh' returned non-zero exit status 255.
> 
> /usr/lib/python3.10/subprocess.py:524: CalledProcessError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <ariba.tests.ref_seq_chooser_test.TestRefSeqChooser testMethod=test_run_flanking_different>
> 
>     def test_run_flanking_different(self):
>         '''Test full run where amount of flanking seq varies'''
>         all_ref_fasta = os.path.join(data_dir, 'ref_seq_chooser_test_flanking.all_refs.fa')
>         cluster_fasta = os.path.join(data_dir, 'ref_seq_chooser_test_flanking.cluster_refs.fa')
>         contig_fasta = os.path.join(data_dir, 'ref_seq_chooser_test_flanking.contigs.fa')
>         expected_fa = os.path.join(data_dir, 'ref_seq_chooser_test_flanking.expected_contigs.fa')
>         tmp_out = 'tmp.ref_seq_chooser_test_flanking.fa'
>         refchooser = ref_seq_chooser.RefSeqChooser(cluster_fasta, all_ref_fasta, contig_fasta, tmp_out, sys.stdout)
> >       refchooser.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/ref_seq_chooser_test.py:107: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:174: in run
>     best_hit_from_all_seqs, not_needed = RefSeqChooser._closest_nucmer_match_between_fastas(self.all_refs_fasta, pieces_fasta_file, self.log_fh, self.nucmer_min_id, self.nucmer_min_len, self.nucmer_breaklen, True, False)
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/ref_seq_chooser.py:148: in _closest_nucmer_match_between_fastas
>     ).run()
> /usr/lib/python3/dist-packages/pymummer/nucmer.py:144: in run
>     syscall.run('bash ' + script, verbose=self.verbose)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> cmd = 'bash run_nucmer.sh', verbose = False
> 
>     def run(cmd, verbose=False):
>         if verbose:
>             print('Running command:', cmd, flush=True)
>         try:
>             output = subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
>         except subprocess.CalledProcessError as error:
>             print('The following command failed with exit code', error.returncode, file=sys.stderr)
>             print(cmd, file=sys.stderr)
>             print('\nThe output was:\n', file=sys.stderr)
>             print(error.output.decode(), file=sys.stderr)
> >           raise Error('Error running command:', cmd)
> E           pymummer.syscall.Error: ('Error running command:', 'bash run_nucmer.sh')
> 
> /usr/lib/python3/dist-packages/pymummer/syscall.py:26: Error
> ----------------------------- Captured stdout call -----------------------------
> Looking for closest match from sequences within cluster
> [choose ref nucmer]	1	494	61	554	494	494	100.00	500	560	1	ref1	cluster.l15.c17.ctg.1
> [choose ref nucmer]	1	494	61	554	494	494	100.00	500	600	1	ref1	cluster.l6.c4.ctg.1
> Closest cluster ref sequence is ref1 to assembly cluster.l6.c4
> Checking for a better match to a ref sequence outside the cluster
> ----------------------------- Captured stderr call -----------------------------
> The following command failed with exit code 255
> bash run_nucmer.sh
> 
> The output was:
> 
> 1: PREPARING DATA
> 2,3: RUNNING mummer AND CREATING CLUSTERS
> # reading input file "p.ntref" of length 1002
> # construct suffix tree for sequence of length 1002
> # (maximum reference length is 536870908)
> # (maximum query length is 4294967295)
> # CONSTRUCTIONTIME /usr/bin/mummer p.ntref 0.00
> # reading input file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/tmp.run_nucmer.gyynmk9t/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.run_nucmer.lutusrvz/tmp.run_nucmer.77haucmq/tmp.run_nucmer.7r3lm6w5/tmp.run_nucmer.j4f4vvlc/tmp.choose_ref.zj1wsc8o/nucmer_vs_cluster_refs.pieces.fa" of length 494
> # matching query-file "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.cluster_test_full_delete_codon/tmp.cluster_test_full_insert_codon/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_gene/tmp.cluster_test_full_run_ok_samtools_snp_known_position_pres_abs_noncoding/tmp.cluster_test_full_run_multiple_vars/tmp.cluster_test_full_run_ok_gene_start_mismatch/tmp.test_full_run_ok_non_coding/tmp.cluster_test_full_run_ok_presence_absence/tmp.cluster_test_full_run_ok_variants_only.present/tmp.cluster_test_full_run_ok_variants_only.not_present/tmp.cluster_full_run_varonly.not_present.always_report/tmp.cluster_test_full_run_partial_assembly/tmp.cluster_test_full_run_smtls_known_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_pres_abs_gene/tmp.cluster_test_full_run_smtls_snp_presabs_nonc/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_gene_does_have_var/tmp.run_nucmer.1upsbxjt/tmp.cluster_full_run_smtls_snp_varonly_gene_2/tmp.run_nucmer.d_iigmi0/tmp.cluster_test_full_run_smtls_snp_varonly_gene_no_snp/tmp.run_nucmer.1o4pw2ni/tmp.cluster_full_run_smtls_snp_varonly_nonc/tmp.run_nucmer.gyynmk9t/tmp.cluster_test_full_run_ok_samtools_snp_known_position_var_only_noncoding/tmp.run_nucmer.lutusrvz/tmp.run_nucmer.77haucmq/tmp.run_nucmer.7r3lm6w5/tmp.run_nucmer.j4f4vvlc/tmp.choose_ref.zj1wsc8o/nucmer_vs_cluster_refs.pieces.fa"
> # against subject-file "p.ntref"
> # COMPLETETIME /usr/bin/mummer p.ntref 0.00
> # SPACE /usr/bin/mummer p.ntref 0.00
> 4: FINISHING DATA
> *** buffer overflow detected ***: terminated
> Aborted
> ERROR: postnuc returned non-zero
> ERROR: Could not parse delta file, p.delta
> error no: 400
> ERROR: Could not parse delta file, p.delta.filter
> error no: 402
> 
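Both TestRefSeqChooser failures above (and the TestCluster failures listed in
the summary further down) bottom out in the same error: mummer's postnuc step
aborts with "*** buffer overflow detected ***" before p.delta can be written,
plausibly related to the extremely long nested tmp.* working directories that
show up as input paths in the mummer log. A minimal sketch for re-running the
same nucmer / delta-filter / show-coords pipeline by hand from a short
directory (file names, thresholds and flags are illustrative, not copied from
the generated run_nucmer.sh):

    import subprocess

    ref = "all_refs.fa"   # hypothetical copy of the reference FASTA
    qry = "pieces.fa"     # hypothetical copy of the query pieces FASTA

    # nucmer writes p.delta (output prefix chosen with -p)
    subprocess.run(["nucmer", "--maxmatch", "-p", "p", ref, qry], check=True)

    # keep only hits above illustrative identity/length thresholds
    with open("p.delta.filter", "w") as fh:
        subprocess.run(["delta-filter", "-i", "90", "-l", "20", "p.delta"],
                       check=True, stdout=fh)

    # tab-delimited coordinate output
    subprocess.run(["show-coords", "-dTlro", "p.delta.filter"], check=True)

If the same inputs align cleanly from a short path, that would point at the
path length rather than the sequences themselves.
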
> _______________ TestSamtoolsVariants.test_get_depths_at_position _______________
> 
> self = <ariba.tests.samtools_variants_test.TestSamtoolsVariants testMethod=test_get_depths_at_position>
> 
>     def test_get_depths_at_position(self):
>         '''test get_depths_at_position'''
>         bam = os.path.join(data_dir, 'samtools_variants_test_get_depths_at_position.bam')
>         ref_fa = os.path.join(data_dir, 'samtools_variants_test_get_depths_at_position.ref.fa')
>         tmp_prefix = 'tmp.test_get_depths_at_position'
>         samtools_vars = samtools_variants.SamtoolsVariants(
>             ref_fa,
>             bam,
>             tmp_prefix,
>         )
> >       samtools_vars.run()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/samtools_variants_test.py:160: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:169: in run
>     self._make_vcf_and_read_depths_files()
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:39: in _make_vcf_and_read_depths_files
>     print(pysam.mpileup(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <pysam.utils.PysamDispatcher object at 0x7f174ab56f80>
> args = ('-t', 'INFO/AD,INFO/ADF,INFO/ADR', '-L', '99999999', '-A', '-f', ...)
> kwargs = {}, retval = 1
> stderr = '\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is i...up" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> stdout = ''
> 
>     def __call__(self, *args, **kwargs):
>         '''execute a samtools command.
>     
>         Keyword arguments:
>         catch_stdout -- redirect stdout from the samtools command and
>             return as variable (default True)
>         save_stdout -- redirect stdout to a filename.
>         raw -- ignore any parsers associated with this samtools command.
>         split_lines -- return stdout (if catch_stdout is True and stderr
>                        as a list of strings.
>         '''
>         retval, stderr, stdout = _pysam_dispatch(
>             self.collection,
>             self.dispatch,
>             args,
>             catch_stdout=kwargs.get("catch_stdout", True),
>             save_stdout=kwargs.get("save_stdout", None))
>     
>         if kwargs.get("split_lines", False):
>             stdout = stdout.splitlines()
>             if stderr:
>                 stderr = stderr.splitlines()
>     
>         if retval:
> >           raise SamtoolsError(
>                 "%s returned with error %i: "
>                 "stdout=%s, stderr=%s" %
>                 (self.collection,
>                  retval,
>                  stdout,
>                  stderr))
> E           pysam.utils.SamtoolsError: 'samtools returned with error 1: stdout=, stderr=\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is in the Illumina-1.3+ encoding\n  -A, --count-orphans     do not discard anomalous read pairs\n  -b, --bam-list FILE     list of input BAM filenames, one per line\n  -B, --no-BAQ            disable BAQ (per-Base Alignment Quality)\n  -C, --adjust-MQ INT     adjust mapping quality; recommended:50, disable:0 [0]\n  -d, --max-depth INT     max per-file depth; avoids excessive memory usage [8000]\n  -E, --redo-BAQ          recalculate BAQ on the fly, ignore existing BQs\n  -f, --fasta-ref FILE    faidx indexed reference sequence file\n  -G, --exclude-RG FILE   exclude read groups listed in FILE\n  -l, --positions FILE    skip unlisted positions (chr pos) or regions (BED)\n  -q, --min-MQ INT        skip alignments with mapQ smaller than INT [0]\n  -Q, --min-BQ INT        skip bases with baseQ/BAQ smaller than INT [13]\n  -r, --region REG        region in which pileup is generated\n  -R, --ignore-RG         ignore RG tags (one BAM = one sample)\n  --rf, --incl-flags STR|INT  required flags: include reads with any of the mask bits set []\n  --ff, --excl-flags STR|INT  filter flags: skip reads with any of the mask bits set\n                                            [UNMAP,SECONDARY,QCFAIL,DUP]\n  -x, --ignore-overlaps   disable read-pair overlap detection\n  -X, --customized-index  use customized index files\n\nOutput options:\n  -o, --output FILE        write output to FILE [standard output]\n  -O, --output-BP          output base positions on reads, current orientation\n      --output-BP-5        output base positions on reads, 5\' to 3\' orientation\n  -M, --output-mods        output base modifications\n  -s, --output-MQ          output mapping quality\n      --output-QNAME       output read names\n      --output-extra STR   output extra read fields and read tag values\n      --output-sep CHAR    set the separator character for tag lists [,]\n      --output-empty CHAR  set the no value character for tag lists [*]\n      --no-output-ins      skip insertion sequence after +NUM\n                           Use twice for complete insertion removal\n      --no-output-ins-mods don\'t display base modifications within insertions\n      --no-output-del      skip deletion sequence after -NUM\n                           Use twice for complete deletion removal\n      --no-output-ends     remove ^MQUAL and $ markup in sequence column\n      --reverse-del        use \'#\' character for deletions on the reverse strand\n  -a                       output all positions (including zero depth)\n  -a -a (or -aa)           output absolutely all positions, including unused ref. sequences\n\nGeneric options:\n      --input-fmt-option OPT[=VAL]\n               Specify a single input file format option in the form\n               of OPTION or OPTION=VALUE\n      --reference FILE\n               Reference sequence FASTA FILE [null]\n      --verbosity INT\n               Set level of verbosity\n\nNote that using "samtools mpileup" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> 
> /usr/lib/python3/dist-packages/pysam/utils.py:69: SamtoolsError
> ----------------------------- Captured stderr call -----------------------------
> mpileup: invalid option -- 't'
> _____________ TestSamtoolsVariants.test_make_vcf_and_depths_files ______________
> 
> self = <ariba.tests.samtools_variants_test.TestSamtoolsVariants testMethod=test_make_vcf_and_depths_files>
> 
>     def test_make_vcf_and_depths_files(self):
>         '''test _make_vcf_and_read_depths_files'''
>         ref = os.path.join(data_dir, 'samtools_variants_make_vcf_and_depths_files.asmbly.fa')
>         bam = os.path.join(data_dir, 'samtools_variants_make_vcf_and_depths_files.bam')
>         expected_vcf = os.path.join(data_dir, 'samtools_variants_make_vcf_and_depths_files.expect.vcf')
>         expected_depths = os.path.join(data_dir, 'samtools_variants_make_vcf_and_depths_files.expect.depths.gz')
>         expected_coverage = os.path.join(data_dir, 'samtools_variants_make_vcf_and_depths_files.expect.cov')
>         tmp_prefix = 'tmp.test_make_vcf_and_depths_files'
>         sv = samtools_variants.SamtoolsVariants(
>             ref,
>             bam,
>             tmp_prefix,
>         )
> >       sv._make_vcf_and_read_depths_files()
> 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/tests/samtools_variants_test.py:32: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/samtools_variants.py:39: in _make_vcf_and_read_depths_files
>     print(pysam.mpileup(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> self = <pysam.utils.PysamDispatcher object at 0x7f174ab56f80>
> args = ('-t', 'INFO/AD,INFO/ADF,INFO/ADR', '-L', '99999999', '-A', '-f', ...)
> kwargs = {}, retval = 1
> stderr = '\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is i...up" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> stdout = ''
> 
>     def __call__(self, *args, **kwargs):
>         '''execute a samtools command.
>     
>         Keyword arguments:
>         catch_stdout -- redirect stdout from the samtools command and
>             return as variable (default True)
>         save_stdout -- redirect stdout to a filename.
>         raw -- ignore any parsers associated with this samtools command.
>         split_lines -- return stdout (if catch_stdout is True and stderr
>                        as a list of strings.
>         '''
>         retval, stderr, stdout = _pysam_dispatch(
>             self.collection,
>             self.dispatch,
>             args,
>             catch_stdout=kwargs.get("catch_stdout", True),
>             save_stdout=kwargs.get("save_stdout", None))
>     
>         if kwargs.get("split_lines", False):
>             stdout = stdout.splitlines()
>             if stderr:
>                 stderr = stderr.splitlines()
>     
>         if retval:
> >           raise SamtoolsError(
>                 "%s returned with error %i: "
>                 "stdout=%s, stderr=%s" %
>                 (self.collection,
>                  retval,
>                  stdout,
>                  stderr))
> E           pysam.utils.SamtoolsError: 'samtools returned with error 1: stdout=, stderr=\nUsage: samtools mpileup [options] in1.bam [in2.bam [...]]\n\nInput options:\n  -6, --illumina1.3+      quality is in the Illumina-1.3+ encoding\n  -A, --count-orphans     do not discard anomalous read pairs\n  -b, --bam-list FILE     list of input BAM filenames, one per line\n  -B, --no-BAQ            disable BAQ (per-Base Alignment Quality)\n  -C, --adjust-MQ INT     adjust mapping quality; recommended:50, disable:0 [0]\n  -d, --max-depth INT     max per-file depth; avoids excessive memory usage [8000]\n  -E, --redo-BAQ          recalculate BAQ on the fly, ignore existing BQs\n  -f, --fasta-ref FILE    faidx indexed reference sequence file\n  -G, --exclude-RG FILE   exclude read groups listed in FILE\n  -l, --positions FILE    skip unlisted positions (chr pos) or regions (BED)\n  -q, --min-MQ INT        skip alignments with mapQ smaller than INT [0]\n  -Q, --min-BQ INT        skip bases with baseQ/BAQ smaller than INT [13]\n  -r, --region REG        region in which pileup is generated\n  -R, --ignore-RG         ignore RG tags (one BAM = one sample)\n  --rf, --incl-flags STR|INT  required flags: include reads with any of the mask bits set []\n  --ff, --excl-flags STR|INT  filter flags: skip reads with any of the mask bits set\n                                            [UNMAP,SECONDARY,QCFAIL,DUP]\n  -x, --ignore-overlaps   disable read-pair overlap detection\n  -X, --customized-index  use customized index files\n\nOutput options:\n  -o, --output FILE        write output to FILE [standard output]\n  -O, --output-BP          output base positions on reads, current orientation\n      --output-BP-5        output base positions on reads, 5\' to 3\' orientation\n  -M, --output-mods        output base modifications\n  -s, --output-MQ          output mapping quality\n      --output-QNAME       output read names\n      --output-extra STR   output extra read fields and read tag values\n      --output-sep CHAR    set the separator character for tag lists [,]\n      --output-empty CHAR  set the no value character for tag lists [*]\n      --no-output-ins      skip insertion sequence after +NUM\n                           Use twice for complete insertion removal\n      --no-output-ins-mods don\'t display base modifications within insertions\n      --no-output-del      skip deletion sequence after -NUM\n                           Use twice for complete deletion removal\n      --no-output-ends     remove ^MQUAL and $ markup in sequence column\n      --reverse-del        use \'#\' character for deletions on the reverse strand\n  -a                       output all positions (including zero depth)\n  -a -a (or -aa)           output absolutely all positions, including unused ref. sequences\n\nGeneric options:\n      --input-fmt-option OPT[=VAL]\n               Specify a single input file format option in the form\n               of OPTION or OPTION=VALUE\n      --reference FILE\n               Reference sequence FASTA FILE [null]\n      --verbosity INT\n               Set level of verbosity\n\nNote that using "samtools mpileup" to generate BCF or VCF files has been\nremoved.  To output these formats, please use "bcftools mpileup" instead.\n'
> 
> /usr/lib/python3/dist-packages/pysam/utils.py:69: SamtoolsError
> ----------------------------- Captured stderr call -----------------------------
> mpileup: invalid option -- 't'
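The two samtools_variants failures are a separate problem: samtools_variants.py
still drives "samtools mpileup" through pysam with the old "-t INFO/AD,..."
tag option, but samtools has dropped VCF/BCF generation from mpileup
("mpileup: invalid option -- 't'"; the usage text above points to
"bcftools mpileup" instead). One possible shape of the replacement call,
sketched against the bcftools command line rather than any particular pysam
wrapper (file names are hypothetical, the annotation tags mirror the ones the
failing call requested, and this is not necessarily how upstream fixed it):

    import subprocess

    ref = "assembly.fa"   # hypothetical faidx-indexed reference FASTA
    bam = "reads.bam"     # hypothetical sorted, indexed BAM

    # bcftools mpileup now owns VCF generation; -a/--annotate replaces the
    # removed "samtools mpileup -t" tag list.
    vcf_text = subprocess.check_output(
        ["bcftools", "mpileup",
         "--fasta-ref", ref,
         "--annotate", "INFO/AD,INFO/ADF,INFO/ADR",
         "--count-orphans",
         "--output-type", "v",
         bam],
        text=True,
    )
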
> =============================== warnings summary ===============================
> ariba/mapping.py:3
>   /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/mapping.py:3: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
>     from distutils.version import LooseVersion
> 
> ariba/external_progs.py:34
>   /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/external_progs.py:34: DeprecationWarning: invalid escape sequence '\.'
>     'cdhit': ('', re.compile('CD-HIT version ([0-9\.]+) \(')),
> 
> ariba/external_progs.py:35
>   /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/external_progs.py:35: DeprecationWarning: invalid escape sequence '\.'
>     'nucmer': ('--version', re.compile('([0-9]+\.[0-9\.]+.*$)')),
> 
> ariba/external_progs.py:36
>   /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build/ariba/external_progs.py:36: DeprecationWarning: invalid escape sequence '\s'
>     'spades': ('--version', re.compile('SPAdes\s+v([0-9\.]+)'))
> 
> -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
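Unrelated to the failures, the three invalid-escape-sequence DeprecationWarnings
above would go away if the regex literals in external_progs.py were raw strings
(the distutils warning is a separate Python 3.12 deprecation). The patterns
below are copied from the warning text; only the dict name is illustrative:

    import re

    prog_version_regexes = {
        'cdhit': ('', re.compile(r'CD-HIT version ([0-9\.]+) \(')),
        'nucmer': ('--version', re.compile(r'([0-9]+\.[0-9\.]+.*$)')),
        'spades': ('--version', re.compile(r'SPAdes\s+v([0-9\.]+)')),
    }
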
> =========================== short test summary info ============================
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_cluster_test_full_run_smtls_snp_varonly_nonc
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_delete_codon
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_insert_codon
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_known_smtls_snp_presabs_gene
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_known_smtls_snp_presabs_nonc
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_multiple_vars_in_codon
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_ok_gene_start_mismatch
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_ok_non_coding
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_ok_presence_absence
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_ok_variants_only_variant_is_present
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_ok_variants_only_variant_not_present
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_ok_variants_only_variant_not_present_always_report
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_partial_assembly
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_smtls_known_snp_presabs_nonc
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_smtls_snp_presabs_gene
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_smtls_snp_presabs_nonc
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_smtls_snp_varonly_gene
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_smtls_snp_varonly_gene_2
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_smtls_snp_varonly_gene_no_snp
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_smtls_snp_varonly_nonc
> FAILED ariba/tests/cluster_test.py::TestCluster::test_full_run_smtls_snp_varonly_nonc_no_snp
> FAILED ariba/tests/clusters_test.py::TestClusters::test_run_with_tb - ariba.c...
> FAILED ariba/tests/ref_seq_chooser_test.py::TestRefSeqChooser::test_run_best_match_is_in_cluster
> FAILED ariba/tests/ref_seq_chooser_test.py::TestRefSeqChooser::test_run_best_match_not_in_cluster
> FAILED ariba/tests/ref_seq_chooser_test.py::TestRefSeqChooser::test_run_contained_ref_seq
> FAILED ariba/tests/ref_seq_chooser_test.py::TestRefSeqChooser::test_run_flanking_different
> FAILED ariba/tests/samtools_variants_test.py::TestSamtoolsVariants::test_get_depths_at_position
> FAILED ariba/tests/samtools_variants_test.py::TestSamtoolsVariants::test_make_vcf_and_depths_files
> ================= 28 failed, 328 passed, 4 warnings in 59.07s ==================
> E: pybuild pybuild:379: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.10_ariba/build; python3.10 -m pytest 
> dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p 3.10 returned exit code 13


The full build log is available from:
http://qa-logs.debian.net/2022/10/23/ariba_2.14.6+ds-3_unstable.log

All bugs filed during this archive rebuild are listed at:
https://bugs.debian.org/cgi-bin/pkgreport.cgi?tag=ftbfs-20221023;users=lucas@debian.org
or:
https://udd.debian.org/bugs/?release=na&merged=ign&fnewerval=7&flastmodval=7&fusertag=only&fusertagtag=ftbfs-20221023&fusertaguser=lucas@debian.org&allbugs=1&cseverity=1&ctags=1&caffected=1#results

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

If you reassign this bug to another package, please mark it as 'affects'-ing
this package. See https://www.debian.org/Bugs/server-control#affects

If you fail to reproduce this, please provide a build log and diff it with mine
so that we can identify if something relevant changed in the meantime.


