Bug#966971: tpot: FTBFS: dh_auto_test: error: pybuild --test --test-nose -i python{version} -p 3.8 returned exit code 13

Lucas Nussbaum <lucas at debian.org>
Mon Aug 3 09:35:06 BST 2020


Source: tpot
Version: 0.11.1+dfsg2-3
Severity: serious
Justification: FTBFS on amd64
Tags: bullseye sid ftbfs
Usertags: ftbfs-20200802 ftbfs-bullseye

Hi,

During a rebuild of all packages in sid, your package failed to build
on amd64.

Relevant part (hopefully):
> make[1]: Entering directory '/<<PKGBUILDDIR>>'
> dh_auto_build
> I: pybuild base:217: /usr/bin/python3 setup.py build 
> running build
> running build_py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot
> copying tpot/gp_types.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot
> copying tpot/driver.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot
> copying tpot/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot
> copying tpot/decorators.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot
> copying tpot/export_utils.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot
> copying tpot/operator_utils.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot
> copying tpot/gp_deap.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot
> copying tpot/metrics.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot
> copying tpot/_version.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot
> copying tpot/base.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot
> copying tpot/tpot.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/config
> copying tpot/config/classifier_mdr.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/config
> copying tpot/config/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/config
> copying tpot/config/regressor_sparse.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/config
> copying tpot/config/classifier_light.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/config
> copying tpot/config/regressor_mdr.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/config
> copying tpot/config/regressor_light.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/config
> copying tpot/config/classifier.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/config
> copying tpot/config/regressor.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/config
> copying tpot/config/classifier_sparse.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/config
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/builtins
> copying tpot/builtins/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/builtins
> copying tpot/builtins/one_hot_encoder.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/builtins
> copying tpot/builtins/zero_count.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/builtins
> copying tpot/builtins/stacking_estimator.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/builtins
> copying tpot/builtins/feature_set_selector.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/builtins
> copying tpot/builtins/combine_dfs.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/builtins
> copying tpot/builtins/feature_transformers.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tpot/builtins
> mkdocs build --clean --theme readthedocs
> WARNING -  Config value: 'pages'. Warning: The 'pages' configuration option has been deprecated and will be removed in a future release of MkDocs. Use 'nav' instead. 
> INFO    -  Cleaning site directory 
> INFO    -  Building documentation to directory: /<<PKGBUILDDIR>>/docs 
> rm -f docs/sitemap.xml.gz
> cp -r images docs/
> sed -i -e 's,https://raw.githubusercontent.com/EpistasisLab/tpot/master/,,' docs/index.html
> make[1]: Leaving directory '/<<PKGBUILDDIR>>'
>    dh_auto_test -O--buildsystem=pybuild
> I: pybuild pybuild:284: cp -r /<<PKGBUILDDIR>>/tests /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build; sed -i -e 's/python -m/python3.8 -m/' /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tests/driver_tests.py
> I: pybuild base:217: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build; python3.8 -m nose -v tests
> Assert that the TPOT driver stores correct default values for all parameters. ... ok
> Assert that _print_args prints correct values for all parameters in default settings. ... ok
> Assert that _print_args prints correct values for all parameters in regression mode. ... ok
> driver_tests.test_scoring_function_argument ... ok
> Assert that the TPOT driver outputs normal result in mode mode. ... /usr/lib/python3.8/runpy.py:127: RuntimeWarning: 'tpot.driver' found in sys.modules after import of package 'tpot', but prior to execution of 'tpot.driver'; this may result in unpredictable behaviour
>   warn(RuntimeWarning(msg))
> ok
> Assert that the tpot_driver() in TPOT driver outputs normal result with verbosity = 1. ... ok
> Assert that the tpot_driver() in TPOT driver outputs normal result with verbosity = 2. ... ok
> Assert that the tpot_driver() in TPOT driver outputs normal result with verbosity = 3. ... ok
> Assert that the tpot_driver() in TPOT driver outputs normal result with exported python file and verbosity = 0. ... ok
> Assert that _read_data_file raises ValueError when the targe column is missing. ... ok
> Assert that the TPOT CLI interface's integer parsing throws an exception when n < 0. ... ok
> Assert that the TPOT CLI interface's integer parsing returns the integer value of a string encoded integer when n > 0. ... ok
> Assert that the TPOT CLI interface's integer parsing throws an exception when n is not an integer. ... ok
> Assert that the TPOT CLI interface's positive_integer_or_none parsing throws an exception when n < 0. ... ok
> Assert that the TPOT CLI interface's positive_integer_or_none parsing returns the integer value of a string encoded integer when n > 0. ... ok
> Assert that the TPOT CLI interface's positive_integer_or_none parsing throws an exception when n is not an integer and not None. ... ok
> Assert that the TPOT CLI interface's positive_integer_or_none parsing return None when value is string 'None' or 'none'. ... ok
> Assert that the TPOT CLI interface's float range returns a float with input is in 0. - 1.0. ... ok
> Assert that the TPOT CLI interface's float range throws an exception when input it out of range. ... ok
> Assert that the TPOT CLI interface's float range throws an exception when input is not a float. ... ok
> Assert that the TPOTClassifier can generate the same pipeline export with random seed of 39. ... ok
> Assert that TPOT's export function throws a RuntimeError when no optimized pipeline exists. ... ok
> Assert that TPOT's export function returns the expected pipeline text as a string. ... ok
> Assert that generate_pipeline_code() returns the correct code given a specific pipeline. ... ok
> Assert that generate_pipeline_code() returns the correct code given a specific pipeline with two CombineDFs. ... ok
> Assert that generate_import_code() returns the correct set of dependancies for a given pipeline. ... ok
> Assert that generate_import_code() returns the correct set of dependancies and dependancies are importable. ... ok
> Assert that the TPOT FeatureAgglomeration operator exports as expected ... ok
> Assert that the TPOT FastICA operator exports as expected ... ok
> Assert that the TPOT PCA operator exports as expected ... ok
> Assert that the TPOT ExtraTreesClassifier operator exports as expected ... ok
> Assert that the TPOT GradientBoostingClassifier operator exports as expected ... ok
> Assert that the TPOT RandomForestClassifier operator exports as expected ... ok
> Assert that the TPOT RFE operator exports as expected ... ok
> Assert that the TPOT SelectFromModel operator exports as expected ... ok
> Assert that the TPOT SelectFwe operator exports as expected ... ok
> Assert that the TPOT SelectPercentile operator exports as expected ... ok
> Assert that the TPOT VarianceThreshold operator exports as expected ... ok
> Assert that the TPOT Nystroem operator exports as expected ... ok
> Assert that the TPOT RBFSampler operator exports as expected ... ok
> Assert that the TPOT LogisticRegression operator exports as expected ... ok
> Assert that the TPOT SGDClassifier operator exports as expected ... ok
> Assert that the TPOT BernoulliNB operator exports as expected ... ok
> Assert that the TPOT GaussianNB operator exports as expected ... ok
> Assert that the TPOT MultinomialNB operator exports as expected ... ok
> Assert that the TPOT KNeighborsClassifier operator exports as expected ... ok
> Assert that the TPOT Binarizer operator exports as expected ... ok
> Assert that the TPOT MaxAbsScaler operator exports as expected ... ok
> Assert that the TPOT MinMaxScaler operator exports as expected ... ok
> Assert that the TPOT Normalizer operator exports as expected ... ok
> Assert that the TPOT PolynomialFeatures operator exports as expected ... ok
> Assert that the TPOT RobustScaler operator exports as expected ... ok
> Assert that the TPOT StandardScaler operator exports as expected ... ok
> Assert that the TPOT LinearSVC operator exports as expected ... ok
> Assert that the TPOT DecisionTreeClassifier operator exports as expected ... ok
> Assert that the TPOT OneHotEncoder operator exports as expected ... ok
> Assert that the TPOT ZeroCount operator exports as expected ... ok
> Assert that exported_pipeline() generated a compile source file as expected given a fixed pipeline. ... ok
> Assert that exported_pipeline() generated a compile source file as expected given a fixed simple pipeline (only one classifier). ... ok
> Assert that exported_pipeline() generated a compile source file as expected given a fixed simple pipeline with a preprocessor. ... ok
> Assert that exported_pipeline() generated a compile source file as expected given a fixed simple pipeline with input_matrix in CombineDFs. ... ok
> Assert that exported_pipeline() generated a compile source file as expected given a fixed simple pipeline with SelectFromModel. ... ok
> Assert that exported_pipeline() generated a compile source file with random_state and data_file_path. ... ok
> Assert that a TPOT operator can export properly with a callable function as a parameter. ... ok
> Assert that a TPOT operator can export properly with a BaseEstimator as a parameter. ... ok
> Assert that the Operator class returns operators by name appropriately. ... ok
> Assert that get_by_name raises TypeError with a incorrect operator name. ... ok
> Assert that get_by_name raises ValueError with duplicate operators in operator dictionary. ... ok
> Assert that indenting a multiline string by 4 spaces prepends 4 spaces before each new line. ... ok
> Assert that the TPOTClassifier can generate a scored pipeline export correctly. ... ok
> Assert that TPOT exports a pipeline with an imputation step if imputation was used in fit(). ... ok
> export_tests.test_set_param_recursive ... ok
> Assert that set_param_recursive sets "random_state" to 42 in nested estimator in SelectFromModel. ... ok
> Assert that set_param_recursive sets "random_state" to 42 in nested estimator in StackingEstimator in a complex pipeline. ... ok
> Assert that the StackingEstimator returns transformed X based on test feature list 1. ... ok
> Assert that the StackingEstimator returns transformed X based on test feature list 2. ... ok
> Assert that the StackingEstimator returns transformed X based on 2 subsets' names ... ok
> Assert that the StackingEstimator returns transformed X based on 2 subsets' indexs ... ok
> Assert that the StackingEstimator returns transformed X seleced based on test feature list 1's index. ... ok
> Assert that the _get_support_mask function returns correct mask. ... ok
> Assert that the StackingEstimator works as expected when input X is np.array. ... ok
> Assert that the StackingEstimator rasies ValueError when features are not available. ... ok
> Assert that the StackingEstimator __name__ returns correct class name. ... ok
> Assert that CategoricalSelector works as expected. ... ok
> Assert that CategoricalSelector works as expected with threshold=5. ... ok
> Assert that CategoricalSelector works as expected with threshold=20. ... ok
> Assert that CategoricalSelector rasies ValueError without categorical features. ... ok
> Assert that fit() in CategoricalSelector does nothing. ... ok
> Assert that ContinuousSelector works as expected. ... ok
> Assert that ContinuousSelector works as expected with threshold=5. ... ok
> Assert that ContinuousSelector works as expected with svd_solver='full' ... ok
> Assert that ContinuousSelector rasies ValueError without categorical features. ... ok
> Assert that fit() in ContinuousSelector does nothing. ... ok
> /usr/lib/python3/dist-packages/sklearn/utils/deprecation.py:143: FutureWarning: The sklearn.utils.testing module is  deprecated in version 0.22 and will be removed in version 0.24. The corresponding classes / functions should instead be imported from sklearn.utils. Anything that cannot be imported from sklearn.utils is now part of the private API.
>   warnings.warn(message, FutureWarning)
> Assert that automatic selection of categorical features works as expected with a threshold of 10. ... ok
> Test fit_transform a dense matrix. ... ok
> Test fit_transform a dense matrix with minimum_fraction=0.5. ... ok
> Test fit_transform a dense matrix including NaNs. ... ok
> Test fit_transform a dense matrix including NaNs with minimum_fraction=0.5 ... ok
> Test fit_transform a dense matrix including NaNs with specifying categorical_features. ... ok
> Test fit_transform a dense matrix with minimum_fraction as sparse ... ok
> Test fit_transform a dense matrix including all NaN slice. ... ok
> Test fit_transform a sparse matrix. ... ok
> Test fit_transform a sparse matrix with minimum_fraction=0.5. ... ok
> Test fit_transform a sparse matrix with specifying categorical_features. ... ok
> Test fit_transform a sparse matrix including all zeros slice. ... ok
> Test fit_transform a sparse matrix including all zeros slice with minimum_fraction=0.5. ... ok
> Test fit_transform another sparse matrix including all zeros slice. ... ok
> Test OneHotEncoder with both dense and sparse matrixes. ... ok
> Assert _transform_selected return original X when selected is empty list ... ok
> Assert _transform_selected return original X when selected is a list of False values ... ok
> Test OneHotEncoder with categorical_features='auto'. ... ok
> Assert that the StackingEstimator returns transformed X with synthetic features in classification. ... ok
> Assert that the StackingEstimator returns transformed X with a synthetic feature in regression. ... ok
> Assert that the StackingEstimator worked as expected in scikit-learn pipeline in classification. ... ok
> Assert that the StackingEstimator worked as expected in scikit-learn pipeline in regression. ... FAIL
> Asserts that gp_deap.initialize_stats_dict initializes individual statistics correctly ... ok
> Assert that self._mate_operator updates stats as expected. ... ok
> Asserts that self._random_mutation_operator updates stats as expected. ... ok
> Failure: SkipTest () ... SKIP
> Assert that the TPOT instantiator stores the TPOT variables properly. ... ok
> Assert that TPOT intitializes with the correct default scoring function. ... ok
> Assert that TPOT rasies ValueError with a invalid sklearn metric function. ... ok
> Assert that TPOT intitializes with a valid _BaseScorer. ... ok
> Assert that TPOT intitializes with a valid scorer. ... ok
> Assert that TPOT rasies ValueError with a invalid sklearn metric function roc_auc_score. ... ok
> Assert that TPOT rasies ValueError with a invalid sklearn metric function from __main__. ... ok
> Assert that TPOT rasies ValueError with a valid sklearn metric function from __main__. ... ok
> Assert that the TPOT intitializes raises a ValueError when the scoring metrics is not available in SCORERS. ... ok
> Assert that the TPOT fit function raises a ValueError when dataset is not in right format. ... ok
> Assert that the TPOT intitializes raises a ValueError when subsample ratio is not in the range (0.0, 1.0]. ... ok
> Assert that the TPOT intitializes raises a ValueError when the sum of crossover and mutation probabilities is large than 1. ... ok
> Assert that the TPOT init stores max run time and sets generations to 1000000. ... ok
> Assert that the TPOT init stores max run time but keeps the generations at the user-supplied value. ... ok
> Assert that the TPOT init stores current number of processes. ... ok
> Assert that the TPOT init assign right ... ok
> Assert that the TPOT init rasies ValueError if n_jobs=0. ... ok
> Assert that _wrapped_cross_val_score return Timeout in a time limit. ... ok
> Assert that _wrapped_cross_val_score return -float('inf') with a invalid_pipeline ... ok
> Assert that the balanced_accuracy in TPOT returns correct accuracy. ... ok
> Assert that get_params returns the exact dictionary of parameters used by TPOT. ... ok
> Assert that set_params returns a reference to the TPOT instance. ... ok
> Assert that set_params updates TPOT's instance variables. ... ok
> Assert that TPOTBase class raises RuntimeError when using it directly. ... ok
> Assert that TPOT uses the pre-configured dictionary of operators when config_dict is 'TPOT light' or 'TPOT MDR'. ... ok
> Assert that TPOT uses a custom dictionary of operators when config_dict is Python dictionary. ... ok
> Assert that TPOT uses a custom dictionary of operators when config_dict is the path of Python dictionary. ... ok
> Assert that _read_config_file rasies FileNotFoundError with a wrong path. ... ok
> Assert that _read_config_file rasies ValueError with wrong dictionary format ... ok
> Assert that _read_config_file rasies ValueError without a dictionary named 'tpot_config'. ... ok
> Assert that the TPOTClassifier can generate the same pipeline with same random seed. ... ok
> Assert that the TPOTRegressor can generate the same pipeline with same random seed. ... ok
> Assert that the TPOT score function raises a RuntimeError when no optimized pipeline exists. ... ok
> Assert that the TPOTClassifier score function outputs a known score for a fixed pipeline. ... ok
> Assert that the TPOTRegressor score function outputs a known score for a fixed pipeline. ... ok
> Assert that the TPOTRegressor score function outputs a known score for a fixed pipeline with sample weights. ... FAIL
> Assert that TPOT template option generates pipeline when each step is a type of operator. ... ok
> Assert that TPOT template option generates pipeline when each step is operator type with a duplicate main type. ... ok
> Assert that TPOT template option generates pipeline when one of steps is a specific operator. ... ok
> Assert that TPOT template option generates pipeline when one of steps is a specific operator. ... ok
> Assert that TPOT rasie ValueError when template parameter is invalid. ... ok
> Assert that TPOT properly handles the group parameter when using GroupKFold. ... ok
> Assert that the TPOT predict function raises a RuntimeError when no optimized pipeline exists. ... ok
> Assert that the TPOT predict function returns a numpy matrix of shape (num_testing_rows,). ... ok
> Assert that the TPOT predict function works on dataset with nan ... ok
> Assert that the TPOT predict_proba function returns a numpy matrix of shape (num_testing_rows, num_testing_target). ... ok
> Assert that the TPOT predict_proba function returns a numpy matrix filled with probabilities (float). ... ok
> Assert that the TPOT predict_proba function raises a RuntimeError when no optimized pipeline exists. ... ok
> Assert that the TPOT predict_proba function raises a RuntimeError when the optimized pipeline do not have the predict_proba() function ... ok
> Assert that the TPOT predict_proba function works on dataset with nan. ... ok
> Assert that the TPOT warm_start flag stores the pop and pareto_front from the first run. ... ok
> Assert that the TPOT fit function provides an optimized pipeline. ... ok
> Assert that the TPOT fit function provides an optimized pipeline when config_dict is 'TPOT light'. ... ok
> Assert that the TPOT fit function provides an optimized pipeline with subsample of 0.8. ... ok
> Assert that the TPOT fit function provides an optimized pipeline with max_time_mins of 2 second. ... ok
> Assert that the TPOT fit function provides an optimized pipeline with max_time_mins of 2 second with warm_start=True. ... ok
> Assert that the TPOT fit function provides an optimized pipeline with pandas DataFrame ... ok
> Assert that the TPOT fit function runs normally with memory='auto'. ... ok
> Assert that the TPOT _setup_memory function runs normally with a valid path. ... ok
> Assert that the TPOT fit function does not clean up caching directory when memory is a valid path. ... ok
> Assert that the TPOT _setup_memory function create a directory which does not exist. ... ok
> Assert that the TPOT _setup_memory function runs normally with a Memory object. ... ok
> Assert that the TPOT _setup_memory function rasies ValueError with a invalid object. ... ok
> Assert that the _check_periodic_pipeline exports periodic pipeline. ... ok
> Assert that the _check_periodic_pipeline rasie StopIteration if self._last_optimized_pareto_front_n_gens >= self.early_stop. ... ok
> Assert that the _save_periodic_pipeline does not export periodic pipeline if exception happened ... ok
> Assert that _save_periodic_pipeline creates the checkpoint folder and exports to it if it didn't exist ... ok
> Assert that the _save_periodic_pipeline does not export periodic pipeline if the pipeline has been saved before. ... ok
> Assert that the TPOT fit_predict function provides an optimized pipeline and correct output. ... ok
> Assert that the TPOT _update_top_pipeline updated an optimized pipeline. ... ok
> Assert that the TPOT _update_top_pipeline raises RuntimeError when self._pareto_front is empty. ... ok
> Assert that the TPOT _update_top_pipeline raises RuntimeError when self._optimized_pipeline is not updated. ... ok
> Assert that the TPOT _update_top_pipeline raises RuntimeError when self._optimized_pipeline is not updated. ... ok
> Assert that evaluated_individuals_ stores current pipelines and their CV scores. ... ok
> Assert that _stop_by_max_time_mins raises KeyboardInterrupt when maximum minutes have elapsed. ... ok
> Assert that _update_evaluated_individuals_ raises ValueError when scoring function does not return a float. ... ok
> Assert that _evaluate_individuals returns operator_counts and CV scores in correct order. ... ok
> Assert that _evaluate_individuals returns operator_counts and CV scores in correct order with n_jobs=2 ... ok
> Assert that _update_pbar updates self._pbar with printing correct warning message. ... ok
> Assert _update_val updates result score in list and prints timeout message. ... ok
> Assert _preprocess_individuals preprocess DEAP individuals including one evaluated individual ... ok
> Assert _preprocess_individuals preprocess DEAP individuals with one invalid pipeline ... ok
> Assert _preprocess_individuals updatas self._pbar.total when max_time_mins is not None ... ok
> Assert that the check_dataset function returns feature and target as expected. ... ok
> Assert that the check_dataset function raise ValueError when sample_weight can not be converted to float array ... ok
> Assert that the check_dataset function raise ValueError when sample_weight has NaN ... ok
> Assert that the check_dataset function raise ValueError when sample_weight has a length different length ... ok
> Assert that the check_dataset function returns feature and target as expected. ... ok
> Assert that the TPOT fit function will not raise a ValueError in a dataset where NaNs are present. ... ok
> Assert that the TPOT predict function will not raise a ValueError in a dataset where NaNs are present. ... ok
> Assert that the TPOT _impute_values function returns a feature matrix with imputed NaN values. ... ok
> Assert that the TPOT score function will not raise a ValueError in a dataset where NaNs are present. ... ok
> Assert that the TPOT fit function will raise a ValueError in a sparse matrix with config_dict='TPOT light'. ... ok
> Assert that the TPOT fit function will raise a ValueError in a sparse matrix with config_dict=None. ... ok
> Assert that the TPOT fit function will raise a ValueError in a sparse matrix with config_dict='TPOT MDR'. ... ok
> Assert that the TPOT fit function will not raise a ValueError in a sparse matrix with config_dict='TPOT sparse'. ... ok
> Assert that the TPOT fit function will not raise a ValueError in a sparse matrix with a customized config dictionary. ... ok
> Assert that the source_decode can decode operator source and import operator class. ... ok
> Assert that the source_decode return None when sourcecode is not available. ... ok
> Assert that the source_decode raise ImportError when sourcecode is not available and verbose=3. ... ok
> Assert that the TPOT operators class factory. ... ok
> Assert that TPOT allows only one PolynomialFeatures operator in a pipeline. ... ok
> Assert that pick_two_individuals_eligible_for_crossover() picks the correct pair of nodes to perform crossover with ... ok
> Assert that pick_two_individuals_eligible_for_crossover() returns the right output when no pair is eligible ... ok
> Assert that self._mate_operator returns offsprings as expected. ... ok
> Assert that cxOnePoint() returns the correct type of node between two fixed pipelines. ... ok
> Assert that mutNodeReplacement() returns the correct type of mutation node in a fixed pipeline. ... ok
> Assert that mutNodeReplacement() returns the correct type of mutation node in a complex pipeline. ... ok
> Assert that varOr() applys crossover only and removes CV scores in offsprings. ... ok
> Assert that varOr() applys mutation only and removes CV scores in offsprings. ... ok
> Assert that varOr() applys reproduction only and does NOT remove CV scores in offsprings. ... ok
> Assert that TPOT operators return their type, e.g. 'Classifier', 'Preprocessor'. ... ok
> Assert that TPOT's gen_grow_safe function returns a pipeline of expected structure. ... ok
> Assert that clean_pipeline_string correctly returns a string without parameter prefixes ... ok
> Assert that ZeroCount operator returns correct transformed X. ... ok
> Assert that fit() in ZeroCount does nothing. ... ok
> 
> ======================================================================
> FAIL: Assert that the StackingEstimator worked as expected in scikit-learn pipeline in regression.
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/nose/case.py", line 197, in runTest
>     self.test(*self.arg)
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tests/stacking_estimator_tests.py", line 114, in test_StackingEstimator_4
>     assert np.allclose(known_cv_score, cv_score)
> AssertionError
> 
> ======================================================================
> FAIL: Assert that the TPOTRegressor score function outputs a known score for a fixed pipeline with sample weights.
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/nose/case.py", line 197, in runTest
>     self.test(*self.arg)
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build/tests/tpot_tests.py", line 633, in test_sample_weight_func
>     assert np.allclose(known_score, score, rtol=0.01)
> AssertionError: 
> -------------------- >> begin captured stdout << ---------------------
> Warning: xgboost.XGBRegressor is not available and will not be used by TPOT.
> 
> --------------------- >> end captured stdout << ----------------------
> 
> ----------------------------------------------------------------------
> Ran 235 tests in 28.971s
> 
> FAILED (SKIP=1, failures=2)
> E: pybuild pybuild:352: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_tpot/build; python3.8 -m nose -v tests
> dh_auto_test: error: pybuild --test --test-nose -i python{version} -p 3.8 returned exit code 13

The full build log is available from:
   http://qa-logs.debian.net/2020/08/02/tpot_0.11.1+dfsg2-3_unstable.log

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

About the archive rebuild: The rebuild was done on EC2 VM instances from
Amazon Web Services, using a clean, minimal and up-to-date chroot. Every
failed build was retried once to eliminate random failures.


