[Debian-med-packaging] Bug#1058302: patsy: FTBFS: dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.12 3.11" returned exit code 13
Lucas Nussbaum
lucas at debian.org
Tue Dec 12 08:19:47 GMT 2023
Source: patsy
Version: 0.5.3-1
Severity: serious
Justification: FTBFS
Tags: trixie sid ftbfs
User: lucas at debian.org
Usertags: ftbfs-20231212 ftbfs-trixie
Hi,
During a rebuild of all packages in sid, your package failed to build
on amd64.
Relevant part (hopefully):
> make[1]: Entering directory '/<<PKGBUILDDIR>>'
> dpkg-query: no packages found matching ipython
> py3versions: no X-Python3-Version in control file, using supported versions
> py3versions: no X-Python3-Version in control file, using supported versions
> python3.12 setup.py build
> /usr/lib/python3/dist-packages/setuptools/config/setupcfg.py:293: _DeprecatedConfig: Deprecated config in `setup.cfg`
> !!
>
> ********************************************************************************
> The license_file parameter is deprecated, use license_files instead.
>
> This deprecation is overdue, please update your project and remove deprecated
> calls to avoid build errors in the future.
>
> See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
> ********************************************************************************
>
> !!
> parsed = self.parsers.get(option_name, lambda x: x)(value)
> running build
> running build_py
> creating build
> creating build/lib
> creating build/lib/patsy
> copying patsy/tokens.py -> build/lib/patsy
> copying patsy/compat.py -> build/lib/patsy
> copying patsy/__init__.py -> build/lib/patsy
> copying patsy/test_state.py -> build/lib/patsy
> copying patsy/desc.py -> build/lib/patsy
> copying patsy/test_splines_crs_data.py -> build/lib/patsy
> copying patsy/redundancy.py -> build/lib/patsy
> copying patsy/missing.py -> build/lib/patsy
> copying patsy/parse_formula.py -> build/lib/patsy
> copying patsy/version.py -> build/lib/patsy
> copying patsy/test_splines_bs_data.py -> build/lib/patsy
> copying patsy/builtins.py -> build/lib/patsy
> copying patsy/test_regressions.py -> build/lib/patsy
> copying patsy/origin.py -> build/lib/patsy
> copying patsy/build.py -> build/lib/patsy
> copying patsy/contrasts.py -> build/lib/patsy
> copying patsy/design_info.py -> build/lib/patsy
> copying patsy/state.py -> build/lib/patsy
> copying patsy/mgcv_cubic_splines.py -> build/lib/patsy
> copying patsy/user_util.py -> build/lib/patsy
> copying patsy/constraint.py -> build/lib/patsy
> copying patsy/splines.py -> build/lib/patsy
> copying patsy/infix_parser.py -> build/lib/patsy
> copying patsy/eval.py -> build/lib/patsy
> copying patsy/highlevel.py -> build/lib/patsy
> copying patsy/categorical.py -> build/lib/patsy
> copying patsy/compat_ordereddict.py -> build/lib/patsy
> copying patsy/util.py -> build/lib/patsy
> copying patsy/test_highlevel.py -> build/lib/patsy
> copying patsy/test_build.py -> build/lib/patsy
> python3.11 setup.py build
> /usr/lib/python3/dist-packages/setuptools/config/setupcfg.py:293: _DeprecatedConfig: Deprecated config in `setup.cfg`
> !!
>
> ********************************************************************************
> The license_file parameter is deprecated, use license_files instead.
>
> This deprecation is overdue, please update your project and remove deprecated
> calls to avoid build errors in the future.
>
> See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
> ********************************************************************************
>
> !!
> parsed = self.parsers.get(option_name, lambda x: x)(value)
> running build
> running build_py
> dh_auto_build
> pybuild --build -i python{version} -p "3.12 3.11"
> I: pybuild base:310: /usr/bin/python3.12 setup.py build
> /usr/lib/python3/dist-packages/setuptools/config/setupcfg.py:293: _DeprecatedConfig: Deprecated config in `setup.cfg`
> !!
>
> ********************************************************************************
> The license_file parameter is deprecated, use license_files instead.
>
> This deprecation is overdue, please update your project and remove deprecated
> calls to avoid build errors in the future.
>
> See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
> ********************************************************************************
>
> !!
> parsed = self.parsers.get(option_name, lambda x: x)(value)
> running build
> running build_py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/tokens.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/compat.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/test_state.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/desc.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/test_splines_crs_data.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/redundancy.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/missing.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/parse_formula.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/version.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/test_splines_bs_data.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/builtins.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/test_regressions.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/origin.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/build.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/contrasts.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/design_info.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/state.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/mgcv_cubic_splines.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/user_util.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/constraint.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/splines.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/infix_parser.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/eval.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/highlevel.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/categorical.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/compat_ordereddict.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/util.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/test_highlevel.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> copying patsy/test_build.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build/patsy
> I: pybuild base:310: /usr/bin/python3 setup.py build
> /usr/lib/python3/dist-packages/setuptools/config/setupcfg.py:293: _DeprecatedConfig: Deprecated config in `setup.cfg`
> !!
>
> ********************************************************************************
> The license_file parameter is deprecated, use license_files instead.
>
> This deprecation is overdue, please update your project and remove deprecated
> calls to avoid build errors in the future.
>
> See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
> ********************************************************************************
>
> !!
> parsed = self.parsers.get(option_name, lambda x: x)(value)
> running build
> running build_py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/tokens.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/compat.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/test_state.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/desc.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/test_splines_crs_data.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/redundancy.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/missing.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/parse_formula.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/version.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/test_splines_bs_data.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/builtins.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/test_regressions.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/origin.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/build.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/contrasts.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/design_info.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/state.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/mgcv_cubic_splines.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/user_util.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/constraint.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/splines.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/infix_parser.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/eval.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/highlevel.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/categorical.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/compat_ordereddict.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/util.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/test_highlevel.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> copying patsy/test_build.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build/patsy
> make[1]: Leaving directory '/<<PKGBUILDDIR>>'
> dh_auto_test -O--buildsystem=pybuild
> pybuild --test --test-pytest -i python{version} -p "3.12 3.11"
> I: pybuild base:310: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build; python3.12 -m pytest
> ============================= test session starts ==============================
> platform linux -- Python 3.12.1, pytest-7.4.3, pluggy-1.3.0
> rootdir: /<<PKGBUILDDIR>>
> configfile: setup.cfg
> collected 148 items
>
> patsy/build.py ....... [ 4%]
> patsy/builtins.py .. [ 6%]
> patsy/categorical.py .... [ 8%]
> patsy/constraint.py ..... [ 12%]
> patsy/contrasts.py ......... [ 18%]
> patsy/desc.py ....F. [ 22%]
> patsy/design_info.py ........ [ 27%]
> patsy/eval.py ............... [ 37%]
> patsy/infix_parser.py . [ 38%]
> patsy/mgcv_cubic_splines.py ............ [ 46%]
> patsy/missing.py ..... [ 50%]
> patsy/origin.py . [ 50%]
> patsy/parse_formula.py ...FF [ 54%]
> patsy/redundancy.py .... [ 56%]
> patsy/splines.py .... [ 59%]
> patsy/test_build.py ................. [ 70%]
> patsy/test_highlevel.py .................. [ 83%]
> patsy/test_regressions.py . [ 83%]
> patsy/test_state.py ... [ 85%]
> patsy/tokens.py F. [ 87%]
> patsy/user_util.py ... [ 89%]
> patsy/util.py ................ [100%]
>
> =================================== FAILURES ===================================
> ______________________ test_eval_formula_error_reporting _______________________
>
> def test_eval_formula_error_reporting():
> from patsy.parse_formula import _parsing_error_test
> parse_fn = lambda formula: ModelDesc.from_formula(formula)
> > _parsing_error_test(parse_fn, _eval_error_tests)
>
> patsy/desc.py:617:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> patsy/parse_formula.py:273: in _parsing_error_test
> parse_fn(bad_code)
> patsy/desc.py:616: in <lambda>
> parse_fn = lambda formula: ModelDesc.from_formula(formula)
> patsy/desc.py:164: in from_formula
> tree = parse_formula(tree_or_string)
> patsy/parse_formula.py:146: in parse_formula
> tree = infix_parse(_tokenize_formula(code, operator_strings),
> patsy/infix_parser.py:210: in infix_parse
> for token in token_source:
> patsy/parse_formula.py:89: in _tokenize_formula
> for pytype, token_string, origin in it:
> patsy/util.py:349: in next
> return six.advance_iterator(self._it)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> code = 'a + ('
>
> def python_tokenize(code):
> # Since formulas can only contain Python expressions, and Python
> # expressions cannot meaningfully contain newlines, we'll just remove all
> # the newlines up front to avoid any complications:
> code = code.replace("\n", " ").strip()
> it = tokenize.generate_tokens(StringIO(code).readline)
> try:
> for (pytype, string, (_, start), (_, end), code) in it:
> if pytype == tokenize.ENDMARKER:
> break
> origin = Origin(code, start, end)
> > assert pytype != tokenize.NL
> E assert 65 != 65
> E + where 65 = tokenize.NL
>
> patsy/tokens.py:35: AssertionError
> ----------------------------- Captured stdout call -----------------------------
> a <+>
> 'a +' 2 3
> expected a noun, but instead the expression ended
> a +
> ^
> a + <(>
> 'a + (' 4 5
> ______________________________ test_parse_errors _______________________________
>
> extra_operators = []
>
> def test_parse_errors(extra_operators=[]):
> def parse_fn(code):
> return parse_formula(code, extra_operators=extra_operators)
> > _parsing_error_test(parse_fn, _parser_error_tests)
>
> patsy/parse_formula.py:285:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> patsy/parse_formula.py:273: in _parsing_error_test
> parse_fn(bad_code)
> patsy/parse_formula.py:284: in parse_fn
> return parse_formula(code, extra_operators=extra_operators)
> patsy/parse_formula.py:146: in parse_formula
> tree = infix_parse(_tokenize_formula(code, operator_strings),
> patsy/infix_parser.py:210: in infix_parse
> for token in token_source:
> patsy/parse_formula.py:89: in _tokenize_formula
> for pytype, token_string, origin in it:
> patsy/util.py:349: in next
> return six.advance_iterator(self._it)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> code = 'a + ('
>
> def python_tokenize(code):
> # Since formulas can only contain Python expressions, and Python
> # expressions cannot meaningfully contain newlines, we'll just remove all
> # the newlines up front to avoid any complications:
> code = code.replace("\n", " ").strip()
> it = tokenize.generate_tokens(StringIO(code).readline)
> try:
> for (pytype, string, (_, start), (_, end), code) in it:
> if pytype == tokenize.ENDMARKER:
> break
> origin = Origin(code, start, end)
> > assert pytype != tokenize.NL
> E assert 65 != 65
> E + where 65 = tokenize.NL
>
> patsy/tokens.py:35: AssertionError
> ----------------------------- Captured stdout call -----------------------------
> a <+>
> 'a +' 2 3
> expected a noun, but instead the expression ended
> a +
> ^
> a + <(>
> 'a + (' 4 5
> _____________________________ test_parse_extra_op ______________________________
>
> def test_parse_extra_op():
> extra_operators = [Operator("|", 2, 250)]
> _do_parse_test(_parser_tests,
> extra_operators=extra_operators)
> _do_parse_test(_extra_op_parser_tests,
> extra_operators=extra_operators)
> > test_parse_errors(extra_operators=extra_operators)
>
> patsy/parse_formula.py:298:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> patsy/parse_formula.py:285: in test_parse_errors
> _parsing_error_test(parse_fn, _parser_error_tests)
> patsy/parse_formula.py:273: in _parsing_error_test
> parse_fn(bad_code)
> patsy/parse_formula.py:284: in parse_fn
> return parse_formula(code, extra_operators=extra_operators)
> patsy/parse_formula.py:146: in parse_formula
> tree = infix_parse(_tokenize_formula(code, operator_strings),
> patsy/infix_parser.py:210: in infix_parse
> for token in token_source:
> patsy/parse_formula.py:89: in _tokenize_formula
> for pytype, token_string, origin in it:
> patsy/util.py:349: in next
> return six.advance_iterator(self._it)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> code = 'a + ('
>
> def python_tokenize(code):
> # Since formulas can only contain Python expressions, and Python
> # expressions cannot meaningfully contain newlines, we'll just remove all
> # the newlines up front to avoid any complications:
> code = code.replace("\n", " ").strip()
> it = tokenize.generate_tokens(StringIO(code).readline)
> try:
> for (pytype, string, (_, start), (_, end), code) in it:
> if pytype == tokenize.ENDMARKER:
> break
> origin = Origin(code, start, end)
> > assert pytype != tokenize.NL
> E assert 65 != 65
> E + where 65 = tokenize.NL
>
> patsy/tokens.py:35: AssertionError
> ----------------------------- Captured stdout call -----------------------------
> '' ['~', '1']
> ParseNode('~', Token('~', <Origin ->~<- 1 (0-1)>), [ParseNode('ONE', Token('ONE', <Origin ~ ->1<- (2-3)>, extra='1'), [])])
> ' ' ['~', '1']
> ParseNode('~', Token('~', <Origin ->~<- 1 (0-1)>), [ParseNode('ONE', Token('ONE', <Origin ~ ->1<- (2-3)>, extra='1'), [])])
> ' \n ' ['~', '1']
> ParseNode('~', Token('~', <Origin ->~<- 1 (0-1)>), [ParseNode('ONE', Token('ONE', <Origin ~ ->1<- (2-3)>, extra='1'), [])])
> '1' ['~', '1']
> ParseNode('~', None, [ParseNode('ONE', Token('ONE', <Origin ->1<- (0-1)>, extra='1'), [])])
> 'a' ['~', 'a']
> ParseNode('~', None, [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<- (0-1)>, extra='a'), [])])
> 'a ~ b' ['~', 'a', 'b']
> ParseNode('~', Token('~', <Origin a ->~<- b (2-3)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<- ~ b (0-1)>, extra='a'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a ~ ->b<- (4-5)>, extra='b'), [])])
> '(a ~ b)' ['~', 'a', 'b']
> ParseNode('~', Token('~', <Origin (a ->~<- b) (3-4)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin (->a<- ~ b) (1-2)>, extra='a'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin (a ~ ->b<-) (5-6)>, extra='b'), [])])
> 'a ~ ((((b))))' ['~', 'a', 'b']
> ParseNode('~', Token('~', <Origin a ->~<- ((((b)))) (2-3)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<- ~ ((((b)))) (0-1)>, extra='a'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a ~ ((((->b<-)))) (8-9)>, extra='b'), [])])
> 'a ~ ((((+b))))' ['~', 'a', ['+', 'b']]
> ParseNode('~', Token('~', <Origin a ->~<- ((((+b)))) (2-3)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<- ~ ((((+b)))) (0-1)>, extra='a'), []), ParseNode('+', Token('+', <Origin a ~ ((((->+<-b)))) (8-9)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a ~ ((((+->b<-)))) (9-10)>, extra='b'), [])])])
> 'a + b + c' ['~', ['+', ['+', 'a', 'b'], 'c']]
> ParseNode('~', None, [ParseNode('+', Token('+', <Origin a + b ->+<- c (6-7)>), [ParseNode('+', Token('+', <Origin a ->+<- b + c (2-3)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<- + b + c (0-1)>, extra='a'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + ->b<- + c (4-5)>, extra='b'), [])]), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + b + ->c<- (8-9)>, extra='c'), [])])])
> 'a + (b ~ c) + d' ['~', ['+', ['+', 'a', ['~', 'b', 'c']], 'd']]
> ParseNode('~', None, [ParseNode('+', Token('+', <Origin a + (b ~ c) ->+<- d (12-13)>), [ParseNode('+', Token('+', <Origin a ->+<- (b ~ c) + d (2-3)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<- + (b ~ c) + d (0-1)>, extra='a'), []), ParseNode('~', Token('~', <Origin a + (b ->~<- c) + d (7-8)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + (->b<- ~ c) + d (5-6)>, extra='b'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + (b ~ ->c<-) + d (9-10)>, extra='c'), [])])]), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + (b ~ c) + ->d<- (14-15)>, extra='d'), [])])])
> 'a + np.log(a, base=10)' ['~', ['+', 'a', 'np.log(a, base=10)']]
> ParseNode('~', None, [ParseNode('+', Token('+', <Origin a ->+<- np.log(a, base=10) (2-3)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<- + np.log(a, base=10) (0-1)>, extra='a'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + ->np.log(a, base=10)<- (4-22)>, extra='np.log(a, base=10)'), [])])])
> 'a + np . log(a , base = 10)' ['~', ['+', 'a', 'np.log(a, base=10)']]
> ParseNode('~', None, [ParseNode('+', Token('+', <Origin a ->+<- np . log(a , base = 10) (2-3)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<- + np . log(a , base = 10) (0-1)>, extra='a'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + ->np . log(a , base = 10)<- (4-27)>, extra='np.log(a, base=10)'), [])])])
> 'a + b ~ c * d' ['~', ['+', 'a', 'b'], ['*', 'c', 'd']]
> ParseNode('~', Token('~', <Origin a + b ->~<- c * d (6-7)>), [ParseNode('+', Token('+', <Origin a ->+<- b ~ c * d (2-3)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<- + b ~ c * d (0-1)>, extra='a'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + ->b<- ~ c * d (4-5)>, extra='b'), [])]), ParseNode('*', Token('*', <Origin a + b ~ c ->*<- d (10-11)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + b ~ ->c<- * d (8-9)>, extra='c'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + b ~ c * ->d<- (12-13)>, extra='d'), [])])])
> 'a + b * c' ['~', ['+', 'a', ['*', 'b', 'c']]]
> ParseNode('~', None, [ParseNode('+', Token('+', <Origin a ->+<- b * c (2-3)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<- + b * c (0-1)>, extra='a'), []), ParseNode('*', Token('*', <Origin a + b ->*<- c (6-7)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + ->b<- * c (4-5)>, extra='b'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + b * ->c<- (8-9)>, extra='c'), [])])])])
> '-a**2' ['~', ['-', ['**', 'a', '2']]]
> ParseNode('~', None, [ParseNode('-', Token('-', <Origin ->-<-a**2 (0-1)>), [ParseNode('**', Token('**', <Origin -a->**<-2 (2-4)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin -->a<-**2 (1-2)>, extra='a'), []), ParseNode('NUMBER', Token('NUMBER', <Origin -a**->2<- (4-5)>, extra='2'), [])])])])
> '-a:b' ['~', ['-', [':', 'a', 'b']]]
> ParseNode('~', None, [ParseNode('-', Token('-', <Origin ->-<-a:b (0-1)>), [ParseNode(':', Token(':', <Origin -a->:<-b (2-3)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin -->a<-:b (1-2)>, extra='a'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin -a:->b<- (3-4)>, extra='b'), [])])])])
> 'a + b:c' ['~', ['+', 'a', [':', 'b', 'c']]]
> ParseNode('~', None, [ParseNode('+', Token('+', <Origin a ->+<- b:c (2-3)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<- + b:c (0-1)>, extra='a'), []), ParseNode(':', Token(':', <Origin a + b->:<-c (5-6)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + ->b<-:c (4-5)>, extra='b'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a + b:->c<- (6-7)>, extra='c'), [])])])])
> '(a + b):c' ['~', [':', ['+', 'a', 'b'], 'c']]
> ParseNode('~', None, [ParseNode(':', Token(':', <Origin (a + b)->:<-c (7-8)>), [ParseNode('+', Token('+', <Origin (a ->+<- b):c (3-4)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin (->a<- + b):c (1-2)>, extra='a'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin (a + ->b<-):c (5-6)>, extra='b'), [])]), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin (a + b):->c<- (8-9)>, extra='c'), [])])])
> 'a*b:c' ['~', ['*', 'a', [':', 'b', 'c']]]
> ParseNode('~', None, [ParseNode('*', Token('*', <Origin a->*<-b:c (1-2)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<-*b:c (0-1)>, extra='a'), []), ParseNode(':', Token(':', <Origin a*b->:<-c (3-4)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a*->b<-:c (2-3)>, extra='b'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a*b:->c<- (4-5)>, extra='c'), [])])])])
> 'a+b / c' ['~', ['+', 'a', ['/', 'b', 'c']]]
> ParseNode('~', None, [ParseNode('+', Token('+', <Origin a->+<-b / c (1-2)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<-+b / c (0-1)>, extra='a'), []), ParseNode('/', Token('/', <Origin a+b ->/<- c (4-5)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a+->b<- / c (2-3)>, extra='b'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a+b / ->c<- (6-7)>, extra='c'), [])])])])
> '~ a' ['~', 'a']
> ParseNode('~', Token('~', <Origin ->~<- a (0-1)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ~ ->a<- (2-3)>, extra='a'), [])])
> '-1' ['~', ['-', '1']]
> ParseNode('~', None, [ParseNode('-', Token('-', <Origin ->-<-1 (0-1)>), [ParseNode('ONE', Token('ONE', <Origin -->1<- (1-2)>, extra='1'), [])])])
> 'a | b' ['~', ['|', 'a', 'b']]
> ParseNode('~', None, [ParseNode('|', Token('|', <Origin a ->|<- b (2-3)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<- | b (0-1)>, extra='a'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a | ->b<- (4-5)>, extra='b'), [])])])
> 'a * b|c' ['~', ['*', 'a', ['|', 'b', 'c']]]
> ParseNode('~', None, [ParseNode('*', Token('*', <Origin a ->*<- b|c (2-3)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin ->a<- * b|c (0-1)>, extra='a'), []), ParseNode('|', Token('|', <Origin a * b->|<-c (5-6)>), [ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a * ->b<-|c (4-5)>, extra='b'), []), ParseNode('PYTHON_EXPR', Token('PYTHON_EXPR', <Origin a * b|->c<- (6-7)>, extra='c'), [])])])])
> a <+>
> 'a +' 2 3
> expected a noun, but instead the expression ended
> a +
> ^
> a + <(>
> 'a + (' 4 5
> _____________________________ test_python_tokenize _____________________________
>
> def test_python_tokenize():
> code = "a + (foo * -1)"
> tokens = list(python_tokenize(code))
> expected = [(tokenize.NAME, "a", Origin(code, 0, 1)),
> (tokenize.OP, "+", Origin(code, 2, 3)),
> (tokenize.OP, "(", Origin(code, 4, 5)),
> (tokenize.NAME, "foo", Origin(code, 5, 8)),
> (tokenize.OP, "*", Origin(code, 9, 10)),
> (tokenize.OP, "-", Origin(code, 11, 12)),
> (tokenize.NUMBER, "1", Origin(code, 12, 13)),
> (tokenize.OP, ")", Origin(code, 13, 14))]
> assert tokens == expected
>
> code2 = "a + (b"
> > tokens2 = list(python_tokenize(code2))
>
> patsy/tokens.py:74:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> code = 'a + (b'
>
> def python_tokenize(code):
> # Since formulas can only contain Python expressions, and Python
> # expressions cannot meaningfully contain newlines, we'll just remove all
> # the newlines up front to avoid any complications:
> code = code.replace("\n", " ").strip()
> it = tokenize.generate_tokens(StringIO(code).readline)
> try:
> for (pytype, string, (_, start), (_, end), code) in it:
> if pytype == tokenize.ENDMARKER:
> break
> origin = Origin(code, start, end)
> > assert pytype != tokenize.NL
> E assert 65 != 65
> E + where 65 = tokenize.NL
>
> patsy/tokens.py:35: AssertionError
> =============================== warnings summary ===============================
> ../../../../../../usr/lib/python3/dist-packages/dateutil/tz/tz.py:37
> /usr/lib/python3/dist-packages/dateutil/tz/tz.py:37: DeprecationWarning: datetime.datetime.utcfromtimestamp() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.fromtimestamp(timestamp, datetime.UTC).
> EPOCH = datetime.datetime.utcfromtimestamp(0)
>
> -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
> =========================== short test summary info ============================
> FAILED patsy/desc.py::test_eval_formula_error_reporting - assert 65 != 65
> FAILED patsy/parse_formula.py::test_parse_errors - assert 65 != 65
> FAILED patsy/parse_formula.py::test_parse_extra_op - assert 65 != 65
> FAILED patsy/tokens.py::test_python_tokenize - assert 65 != 65
> ================== 4 failed, 144 passed, 1 warning in 36.75s ===================
> E: pybuild pybuild:395: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12/build; python3.12 -m pytest
> I: pybuild base:310: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11/build; python3.11 -m pytest
> ============================= test session starts ==============================
> platform linux -- Python 3.11.7, pytest-7.4.3, pluggy-1.3.0
> rootdir: /<<PKGBUILDDIR>>
> configfile: setup.cfg
> collected 148 items
>
> patsy/build.py ....... [ 4%]
> patsy/builtins.py .. [ 6%]
> patsy/categorical.py .... [ 8%]
> patsy/constraint.py ..... [ 12%]
> patsy/contrasts.py ......... [ 18%]
> patsy/desc.py ...... [ 22%]
> patsy/design_info.py ........ [ 27%]
> patsy/eval.py ............... [ 37%]
> patsy/infix_parser.py . [ 38%]
> patsy/mgcv_cubic_splines.py ............ [ 46%]
> patsy/missing.py ..... [ 50%]
> patsy/origin.py . [ 50%]
> patsy/parse_formula.py ..... [ 54%]
> patsy/redundancy.py .... [ 56%]
> patsy/splines.py .... [ 59%]
> patsy/test_build.py ................. [ 70%]
> patsy/test_highlevel.py .................. [ 83%]
> patsy/test_regressions.py . [ 83%]
> patsy/test_state.py ... [ 85%]
> patsy/tokens.py .. [ 87%]
> patsy/user_util.py ... [ 89%]
> patsy/util.py ................ [100%]
>
> ============================= 148 passed in 35.93s =============================
> rm -fr -- /tmp/dh-xdg-rundir-1FwRDwYw
> dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.12 3.11" returned exit code 13
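All four failures hit the same assertion in patsy/tokens.py (assert pytype != tokenize.NL), and in each case the input being tokenized is the deliberately incomplete formula 'a + ('. The suite passes on Python 3.11 and only fails on 3.12, which suggests the 3.12 tokenize module now yields an NL token for this kind of input where 3.11 did not. Below is a minimal sketch (my own, not taken from the log; the behaviour I describe for 3.11 is an assumption) that can be run outside patsy to check that hypothesis:

    # Reproduction sketch, not from the build log: print what tokenize yields
    # for the same incomplete fragment the failing tests use.
    import tokenize
    from io import StringIO

    code = "a + ("  # no trailing newline, unbalanced '('
    try:
        for tok in tokenize.generate_tokens(StringIO(code).readline):
            print(tokenize.tok_name[tok.type], repr(tok.string))
    except tokenize.TokenizeError as exc:
        # On 3.11 I would expect the unbalanced '(' to end in a TokenizeError
        # (which patsy already handles); on 3.12 the log shows an NL token
        # (type 65) is yielded first, which is what trips the assert.
        print("TokenizeError:", exc)

If that is confirmed, one possible workaround (I have not checked what upstream actually changed) would be for python_tokenize to skip NL/NEWLINE tokens instead of asserting they never occur, e.g. "if pytype in (tokenize.NL, tokenize.NEWLINE): continue" before the assertion.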
The full build log is available from:
http://qa-logs.debian.net/2023/12/12/patsy_0.5.3-1_unstable.log
All bugs filed during this archive rebuild are listed at:
https://bugs.debian.org/cgi-bin/pkgreport.cgi?tag=ftbfs-20231212;users=lucas@debian.org
or:
https://udd.debian.org/bugs/?release=na&merged=ign&fnewerval=7&flastmodval=7&fusertag=only&fusertagtag=ftbfs-20231212&fusertaguser=lucas@debian.org&allbugs=1&cseverity=1&ctags=1&caffected=1#results
A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!
If you reassign this bug to another package, please mark it as affecting
this package using the 'affects' command. See https://www.debian.org/Bugs/server-control#affects
If you fail to reproduce this, please provide a build log and diff it with mine
so that we can identify if something relevant changed in the meantime.