[med-svn] [Git][med-team/hdmf][master] 4 commits: routine-update: New upstream version
Andreas Tille (@tille)
gitlab at salsa.debian.org
Tue Oct 11 20:27:49 BST 2022
Andreas Tille pushed to branch master at Debian Med / hdmf
Commits:
7161cd80 by Andreas Tille at 2022-10-11T20:48:08+02:00
routine-update: New upstream version
- - - - -
4caf6955 by Andreas Tille at 2022-10-11T20:48:09+02:00
New upstream version 3.4.6
- - - - -
51740433 by Andreas Tille at 2022-10-11T20:48:10+02:00
Update upstream source from tag 'upstream/3.4.6'
Update to upstream version '3.4.6'
with Debian dir d4254178d3381db7d0a471b8584258f2b03d4882
- - - - -
d572b2da by Andreas Tille at 2022-10-11T20:49:37+02:00
routine-update: Ready to upload to unstable
- - - - -
24 changed files:
- PKG-INFO
- README.rst
- debian/changelog
- requirements-min.txt
- setup.cfg
- src/hdmf.egg-info/PKG-INFO
- src/hdmf/_version.py
- src/hdmf/backends/hdf5/h5_utils.py
- src/hdmf/build/classgenerator.py
- src/hdmf/common/__init__.py
- src/hdmf/common/io/table.py
- src/hdmf/common/table.py
- src/hdmf/container.py
- src/hdmf/data_utils.py
- src/hdmf/utils.py
- test.py
- tests/unit/build_tests/test_classgenerator.py
- tests/unit/common/test_common_io.py
- tests/unit/common/test_generate_table.py
- tests/unit/test_io_hdf5_h5tools.py
- tests/unit/test_multicontainerinterface.py
- tests/unit/utils_test/test_core_DataIO.py
- tests/unit/utils_test/test_docval.py
- tox.ini
Changes:
=====================================
PKG-INFO
=====================================
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: hdmf
-Version: 3.4.0
+Version: 3.4.6
Summary: A package for standardizing hierarchical object data
Home-page: https://github.com/hdmf-dev/hdmf
Author: Andrew Tritt
@@ -29,9 +29,9 @@ The Hierarchical Data Modeling Framework
The Hierarchical Data Modeling Framework, or *HDMF*, is a Python package for working with hierarchical data.
It provides APIs for specifying data models, reading and writing data to different storage backends, and
-representing data with Python object.
+representing data with Python objects.
-Documentation of HDMF can be found at https://hdmf.readthedocs.io
+Documentation of HDMF can be found at https://hdmf.readthedocs.io.
Latest Release
==============
@@ -74,14 +74,14 @@ Overall Health
:target: https://requires.io/github/hdmf-dev/hdmf/requirements/?branch=dev
:alt: Requirements Status
-.. image:: https://readthedocs.org/projects/hdmf/badge/?version=latest
- :target: https://hdmf.readthedocs.io/en/latest/?badge=latest
+.. image:: https://readthedocs.org/projects/hdmf/badge/?version=stable
+ :target: https://hdmf.readthedocs.io/en/stable/?badge=stable
:alt: Documentation Status
Installation
============
-See the HDMF documentation for details http://hdmf.readthedocs.io/en/latest/getting_started.html#installation
+See the `HDMF documentation <http://hdmf.readthedocs.io/en/stable/getting_started.html#installation>`_.
Code of Conduct
===============
=====================================
README.rst
=====================================
@@ -4,9 +4,9 @@ The Hierarchical Data Modeling Framework
The Hierarchical Data Modeling Framework, or *HDMF*, is a Python package for working with hierarchical data.
It provides APIs for specifying data models, reading and writing data to different storage backends, and
-representing data with Python object.
+representing data with Python objects.
-Documentation of HDMF can be found at https://hdmf.readthedocs.io
+Documentation of HDMF can be found at https://hdmf.readthedocs.io.
Latest Release
==============
@@ -49,14 +49,14 @@ Overall Health
:target: https://requires.io/github/hdmf-dev/hdmf/requirements/?branch=dev
:alt: Requirements Status
-.. image:: https://readthedocs.org/projects/hdmf/badge/?version=latest
- :target: https://hdmf.readthedocs.io/en/latest/?badge=latest
+.. image:: https://readthedocs.org/projects/hdmf/badge/?version=stable
+ :target: https://hdmf.readthedocs.io/en/stable/?badge=stable
:alt: Documentation Status
Installation
============
-See the HDMF documentation for details http://hdmf.readthedocs.io/en/latest/getting_started.html#installation
+See the `HDMF documentation <http://hdmf.readthedocs.io/en/stable/getting_started.html#installation>`_.
Code of Conduct
===============
=====================================
debian/changelog
=====================================
@@ -1,3 +1,10 @@
+hdmf (3.4.6-1) unstable; urgency=medium
+
+ * Team upload.
+ * New upstream version
+
+ -- Andreas Tille <tille at debian.org> Tue, 11 Oct 2022 20:48:38 +0200
+
hdmf (3.4.0-1) unstable; urgency=medium
* Team upload.
=====================================
requirements-min.txt
=====================================
@@ -2,7 +2,7 @@
h5py==2.10 # support for selection of datasets with list of indices added in 2.10
jsonschema==2.6.0
numpy==1.16
-pandas==1.0.5
+pandas==1.0.5 # when this is changed to >=1.5.0, see TODO items referenced in #762
ruamel.yaml==0.16
scipy==1.1
setuptools
=====================================
setup.cfg
=====================================
@@ -22,15 +22,15 @@ exclude =
docs/source/tutorials/
docs/_build/
per-file-ignores =
- docs/gallery/*:E402,T001
+ docs/gallery/*:E402,T201
src/hdmf/__init__.py:F401
src/hdmf/backends/__init__.py:F401
src/hdmf/backends/hdf5/__init__.py:F401
src/hdmf/build/__init__.py:F401
src/hdmf/spec/__init__.py:F401
src/hdmf/validate/__init__.py:F401
- setup.py:T001
- test.py:T001
+ setup.py:T201
+ test.py:T201
[metadata]
description_file = README.rst
=====================================
src/hdmf.egg-info/PKG-INFO
=====================================
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: hdmf
-Version: 3.4.0
+Version: 3.4.6
Summary: A package for standardizing hierarchical object data
Home-page: https://github.com/hdmf-dev/hdmf
Author: Andrew Tritt
@@ -29,9 +29,9 @@ The Hierarchical Data Modeling Framework
The Hierarchical Data Modeling Framework, or *HDMF*, is a Python package for working with hierarchical data.
It provides APIs for specifying data models, reading and writing data to different storage backends, and
-representing data with Python object.
+representing data with Python objects.
-Documentation of HDMF can be found at https://hdmf.readthedocs.io
+Documentation of HDMF can be found at https://hdmf.readthedocs.io.
Latest Release
==============
@@ -74,14 +74,14 @@ Overall Health
:target: https://requires.io/github/hdmf-dev/hdmf/requirements/?branch=dev
:alt: Requirements Status
-.. image:: https://readthedocs.org/projects/hdmf/badge/?version=latest
- :target: https://hdmf.readthedocs.io/en/latest/?badge=latest
+.. image:: https://readthedocs.org/projects/hdmf/badge/?version=stable
+ :target: https://hdmf.readthedocs.io/en/stable/?badge=stable
:alt: Documentation Status
Installation
============
-See the HDMF documentation for details http://hdmf.readthedocs.io/en/latest/getting_started.html#installation
+See the `HDMF documentation <http://hdmf.readthedocs.io/en/stable/getting_started.html#installation>`_.
Code of Conduct
===============
=====================================
src/hdmf/_version.py
=====================================
@@ -8,11 +8,11 @@ import json
version_json = '''
{
- "date": "2022-08-05T10:51:30-0700",
+ "date": "2022-10-04T14:35:09-0700",
"dirty": false,
"error": null,
- "full-revisionid": "7be40229d778dea940134753de957f2457a11e1f",
- "version": "3.4.0"
+ "full-revisionid": "abb5d3b115823ff9581575cf049d025540b194c4",
+ "version": "3.4.6"
}
''' # END VERSION_JSON
=====================================
src/hdmf/backends/hdf5/h5_utils.py
=====================================
@@ -489,6 +489,7 @@ class H5DataIO(DataIO):
# Get the list of I/O options that user has passed in
ioarg_names = [name for name in kwargs.keys() if name not in ['data', 'link_data', 'allow_plugin_filters',
'dtype', 'shape']]
+
# Remove the ioargs from kwargs
ioarg_values = [popargs(argname, kwargs) for argname in ioarg_names]
# Consume link_data parameter
=====================================
src/hdmf/build/classgenerator.py
=====================================
@@ -292,7 +292,8 @@ class CustomClassGenerator:
parent_docval_args = set(arg['name'] for arg in get_docval(base.__init__))
new_args = list()
for attr_name, field_spec in not_inherited_fields.items():
- # auto-initialize arguments not found in superclass
+ # store arguments for fields that are not in the superclass and not in the superclass __init__ docval
+ # so that they are set after calling base.__init__
if attr_name not in parent_docval_args:
new_args.append(attr_name)
@@ -300,13 +301,24 @@ class CustomClassGenerator:
def __init__(self, **kwargs):
if name is not None: # force container name to be the fixed name in the spec
kwargs.update(name=name)
+
+ # remove arguments from kwargs that correspond to fields that are new (not inherited)
+ # set these arguments after calling base.__init__
new_kwargs = dict()
for f in new_args:
new_kwargs[f] = popargs(f, kwargs) if f in kwargs else None
- base.__init__(self, **kwargs) # special case: need to pass self to __init__
+
+ # NOTE: the docval of some constructors do not include all of the fields. the constructor may set
+ # some fields to fixed values. so only keep the kwargs that are used in the constructor docval
+ kwargs_to_pass = {k: v for k, v in kwargs.items() if k in parent_docval_args}
+
+ base.__init__(self, **kwargs_to_pass) # special case: need to pass self to __init__
# TODO should super() be used above instead of base?
+
+ # set the fields that are new to this class (not inherited)
for f, arg_val in new_kwargs.items():
setattr(self, f, arg_val)
+
classdict['__init__'] = __init__
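The classgenerator hunk above changes the generated `__init__` to forward only the kwargs the superclass constructor actually declares, and to set new (non-inherited) fields afterwards. A minimal self-contained sketch of that pattern, using toy classes and `inspect` in place of hdmf's docval machinery (names here are illustrative, not hdmf's):

```python
import inspect

class Base:
    def __init__(self, name, data):
        self.name = name
        self.data = data
        self.attr1 = "fixed_attr1"   # base sets a fixed value not in its signature

def make_init(base, new_args):
    # arguments the parent constructor accepts (stand-in for parent_docval_args)
    parent_args = set(inspect.signature(base.__init__).parameters) - {"self"}

    def __init__(self, **kwargs):
        # pull out fields that are new to the generated class
        new_kwargs = {f: kwargs.pop(f, None) for f in new_args}
        # only forward what the parent constructor actually declares
        kwargs_to_pass = {k: v for k, v in kwargs.items() if k in parent_args}
        base.__init__(self, **kwargs_to_pass)
        # set the new fields after the base constructor has run
        for f, v in new_kwargs.items():
            setattr(self, f, v)
    return __init__

Generated = type("Generated", (Base,), {"__init__": make_init(Base, ["attr4"])})
obj = Generated(name="x", data=[1, 2], attr4=1.0, attr1="ignored")
print(obj.attr1, obj.attr4)  # fixed_attr1 1.0
```

The filtering step is what makes the new test `test_dynamic_container_super_init_fixed_value` below pass: a superclass that pins a field to a fixed value never receives that field as a kwarg.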
=====================================
src/hdmf/common/__init__.py
=====================================
@@ -181,12 +181,12 @@ def validate(**kwargs):
@docval(*get_docval(HDF5IO.__init__), is_method=False)
def get_hdf5io(**kwargs):
"""
- A convenience method for getting an HDF5IO object
+ A convenience method for getting an HDF5IO object using an HDMF-common build manager if none is provided.
"""
manager = getargs('manager', kwargs)
if manager is None:
kwargs['manager'] = get_manager()
- return HDF5IO.__init__(**kwargs)
+ return HDF5IO(**kwargs)
# load the hdmf-common namespace
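The one-line fix above (`HDF5IO.__init__(**kwargs)` → `HDF5IO(**kwargs)`) matters because calling a class's `__init__` directly returns `None` rather than an instance. A self-contained sketch with toy stand-ins for `HDF5IO` and `get_manager` (these fakes are illustrative only, not hdmf's API):

```python
class FakeManager:
    pass

class FakeHDF5IO:
    def __init__(self, path, mode="r", manager=None):
        self.path = path
        self.mode = mode
        self.manager = manager

def get_io(path, mode="r", manager=None):
    # mirrors the patched get_hdf5io: fall back to a default manager,
    # then construct the object by calling the class, not its __init__
    if manager is None:
        manager = FakeManager()
    return FakeHDF5IO(path, mode, manager=manager)

io = get_io("example.h5", "w")
print(io.manager is not None)              # True
print(FakeHDF5IO.__init__(io, "example"))  # None: why the old code returned nothing
```

The new `TestGetHdf5IO` cases further down check exactly these two behaviors: a manager is created when none is given, and a supplied manager is passed through.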
=====================================
src/hdmf/common/io/table.py
=====================================
@@ -82,6 +82,8 @@ class DynamicTableGenerator(CustomClassGenerator):
# the spec does not know which table this DTR points to
# the user must specify the table attribute on the DTR after it is generated
column_conf['table'] = True
+ else:
+ column_conf['class'] = dtype
index_counter = 0
index_name = attr_name
=====================================
src/hdmf/common/table.py
=====================================
@@ -654,13 +654,16 @@ class DynamicTable(Container):
{'name': 'col_cls', 'type': type, 'default': VectorData,
'doc': ('class to use to represent the column data. If table=True, this field is ignored and a '
'DynamicTableRegion object is used. If enum=True, this field is ignored and a EnumData '
- 'object is used.')}, )
+ 'object is used.')},
+ allow_extra=True)
def add_column(self, **kwargs): # noqa: C901
"""
Add a column to this table.
If data is provided, it must contain the same number of rows as the current state of the table.
+ Extra keyword arguments will be passed to the constructor of the column class ("col_cls").
+
:raises ValueError: if the column has already been added to the table
"""
name, data = getargs('name', 'data', kwargs)
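The `allow_extra=True` change to `add_column` above lets keyword arguments not named in the docval spec flow through to the column class constructor. A toy sketch of that pass-through (plain `**kwargs`, not the real docval decorator; `unit` is a made-up extra argument):

```python
class VectorData:
    def __init__(self, name, description, data=None, **kwargs):
        self.name = name
        self.description = description
        self.data = data if data is not None else []
        self.extra = kwargs  # extra keyword arguments land here

def add_column(name, description, col_cls=VectorData, **extra_kwargs):
    # known args are consumed above; everything else is forwarded to col_cls,
    # which is what allow_extra=True enables in the real docval-decorated method
    return col_cls(name, description, **extra_kwargs)

col = add_column("spikes", "spike counts", unit="count")
print(col.extra)  # {'unit': 'count'}
```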
=====================================
src/hdmf/container.py
=====================================
@@ -728,10 +728,10 @@ class MultiContainerInterface(Container):
@classmethod
def __make_add(cls, func_name, attr_name, container_type):
- doc = "Add %s to this %s" % (cls.__add_article(container_type), cls.__name__)
+ doc = "Add one or multiple %s objects to this %s" % (cls.__join(container_type), cls.__name__)
@docval({'name': attr_name, 'type': (list, tuple, dict, container_type),
- 'doc': 'the %s to add' % cls.__join(container_type)},
+ 'doc': 'one or multiple %s objects to add to this %s' % (cls.__join(container_type), cls.__name__)},
func_name=func_name, doc=doc)
def _func(self, **kwargs):
container = getargs(attr_name, kwargs)
@@ -759,7 +759,7 @@ class MultiContainerInterface(Container):
@classmethod
def __make_create(cls, func_name, add_name, container_type):
- doc = "Create %s and add it to this %s" % (cls.__add_article(container_type), cls.__name__)
+ doc = "Create %s object and add it to this %s" % (cls.__add_article(container_type), cls.__name__)
@docval(*get_docval(container_type.__init__), func_name=func_name, doc=doc,
returns="the %s object that was created" % cls.__join(container_type), rtype=container_type)
=====================================
src/hdmf/data_utils.py
=====================================
@@ -921,11 +921,16 @@ class DataIO:
'default': None})
def __init__(self, **kwargs):
data, dtype, shape = popargs('data', 'dtype', 'shape', kwargs)
- if data is not None:
+ if data is None:
+ if (dtype is None) ^ (shape is None):
+ raise ValueError("Must specify 'dtype' and 'shape' if not specifying 'data'")
+ else:
if dtype is not None:
- raise ValueError("Setting the dtype when data is not None is not supported")
+ warn("Argument 'dtype' is ignored when 'data' is specified")
+ dtype = None
if shape is not None:
- raise ValueError("Setting the shape when data is not None is not supported")
+ warn("Argument 'shape' is ignored when 'data' is specified")
+ shape = None
self.__data = data
self.__dtype = dtype
self.__shape = shape
@@ -995,6 +1000,8 @@ class DataIO:
def __len__(self):
"""Number of values in self.data"""
+ if self.__shape is not None:
+ return self.__shape[0]
if not self.valid:
raise InvalidDataIOError("Cannot get length of data. Data is not valid.")
return len(self.data)
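The `DataIO` hunk above relaxes the old hard errors: `dtype`/`shape` must now be given together when `data` is absent, are ignored with a warning when `data` is present, and a declared `shape` supplies `__len__` even before data exists. A minimal re-creation of just that logic (a toy class, not hdmf's `DataIO`):

```python
import warnings

class ToyDataIO:
    def __init__(self, data=None, dtype=None, shape=None):
        if data is None:
            if (dtype is None) ^ (shape is None):
                raise ValueError("Must specify 'dtype' and 'shape' if not specifying 'data'")
        else:
            if dtype is not None:
                warnings.warn("Argument 'dtype' is ignored when 'data' is specified")
                dtype = None
            if shape is not None:
                warnings.warn("Argument 'shape' is ignored when 'data' is specified")
                shape = None
        self.data, self.dtype, self.shape = data, dtype, shape

    def __len__(self):
        # new behavior: a declared shape supplies the length even without data
        if self.shape is not None:
            return self.shape[0]
        return len(self.data)

print(len(ToyDataIO(dtype=int, shape=(10, 10))))  # 10
```

This is the behavior the new `H5DataIOTests` and the updated `test_core_DataIO.py` assertions below exercise.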
=====================================
src/hdmf/utils.py
=====================================
@@ -319,7 +319,7 @@ def __parse_args(validator, args, kwargs, enforce_type=True, enforce_shape=True,
ret[argname] = _copy.deepcopy(arg['default'])
argval = ret[argname]
if enforce_type:
- if not __type_okay(argval, arg['type'], arg['default'] is None):
+ if not __type_okay(argval, arg['type'], arg['default'] is None or arg.get('allow_none', False)):
if argval is None and arg['default'] is None:
fmt_val = (argname, __format_type(arg['type']))
type_errors.append("None is not allowed for '%s' (expected '%s', not None)" % fmt_val)
@@ -522,7 +522,9 @@ def docval(*validator, **options): # noqa: C901
must contain the following keys: ``'name'``, ``'type'``, and ``'doc'``. This will define a
positional argument. To define a keyword argument, specify a default value
using the key ``'default'``. To validate the dimensions of an input array
- add the optional ``'shape'`` parameter.
+ add the optional ``'shape'`` parameter. To allow a None value for an argument,
+ either the default value must be None or a different default value must be provided
+ and ``'allow_none': True`` must be passed.
The decorated method must take ``self`` and ``**kwargs`` as arguments.
@@ -570,7 +572,7 @@ def docval(*validator, **options): # noqa: C901
kw = list()
for a in validator:
# catch unsupported keys
- allowable_terms = ('name', 'doc', 'type', 'shape', 'enum', 'default', 'help')
+ allowable_terms = ('name', 'doc', 'type', 'shape', 'enum', 'default', 'allow_none', 'help')
unsupported_terms = set(a.keys()) - set(allowable_terms)
if unsupported_terms:
raise Exception('docval for {}: keys {} are not supported by docval'.format(a['name'],
@@ -596,6 +598,10 @@ def docval(*validator, **options): # noqa: C901
msg = ('docval for {}: enum values are of types not allowed by arg type (got {}, '
'expected {})'.format(a['name'], [type(x) for x in a['enum']], a['type']))
raise Exception(msg)
+ if a.get('allow_none', False) and 'default' not in a:
+ msg = ('docval for {}: allow_none=True can only be set if a default value is provided.').format(
+ a['name'])
+ raise Exception(msg)
if 'default' in a:
kw.append(a)
else:
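The `utils.py` hunks above add an `allow_none` flag to docval: `None` is accepted when the default is `None` or when `allow_none=True` is set alongside a non-None default, and setting `allow_none` without any default is rejected at decoration time. A hand-rolled sketch of that rule as a single validation function (this mimics the semantics only; the real implementation is `hdmf.utils.docval`):

```python
def validate_arg(spec, value):
    """Mimic the docval type check: None passes if the default is None
    or the spec sets allow_none=True (which requires a default)."""
    if spec.get('allow_none', False) and 'default' not in spec:
        raise Exception(
            "docval for %s: allow_none=True can only be set if a default "
            "value is provided." % spec['name'])
    none_ok = spec.get('default', object()) is None or spec.get('allow_none', False)
    if value is None:
        if not none_ok:
            raise TypeError("None is not allowed for '%s'" % spec['name'])
        return None
    if not isinstance(value, spec['type']):
        raise TypeError("incorrect type for '%s'" % spec['name'])
    return value

spec = {'name': 'arg1', 'type': bool, 'default': True, 'allow_none': True}
print(validate_arg(spec, None))   # None accepted despite default=True
print(validate_arg(spec, False))  # False
```

The new `test_allow_none*` cases in `test_docval.py` below cover the same four combinations.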
=====================================
test.py
=====================================
@@ -1,6 +1,7 @@
#!/usr/bin/env python
-# NOTE this script is currently used in CI *only* to test the sphinx gallery examples using python test.py -e
+# NOTE This script is deprecated. Please use pytest to run unit tests and run python test_gallery.py to
+# test Sphinx Gallery files.
import warnings
import re
@@ -92,6 +93,12 @@ def run_example_tests():
def main():
+ warnings.warn(
+ "python test.py is deprecated. Please use pytest to run unit tests and run python test_gallery.py to "
+ "test Sphinx Gallery files.",
+ DeprecationWarning
+ )
+
# setup and parse arguments
parser = argparse.ArgumentParser('python test.py [options]')
parser.set_defaults(verbosity=1, suites=[])
=====================================
tests/unit/build_tests/test_classgenerator.py
=====================================
@@ -8,7 +8,7 @@ from hdmf.build.classgenerator import ClassGenerator, MCIClassGenerator
from hdmf.container import Container, Data, MultiContainerInterface, AbstractContainer
from hdmf.spec import GroupSpec, AttributeSpec, DatasetSpec, SpecCatalog, SpecNamespace, NamespaceCatalog, LinkSpec
from hdmf.testing import TestCase
-from hdmf.utils import get_docval
+from hdmf.utils import get_docval, docval
from .test_io_map import Bar
from tests.unit.utils import CORE_NAMESPACE, create_test_type_map, create_load_namespace_yaml
@@ -263,6 +263,40 @@ class TestDynamicContainer(TestCase):
obj = Baz(data=[1, 2, 3, 4], attr1='string attribute', attr2=1000)
self.assertEqual(obj.name, 'Baz')
+ def test_dynamic_container_super_init_fixed_value(self):
+ """Test that dynamic class generation when the superclass init does not include all fields works"""
+
+ class FixedAttrBar(Bar):
+ @docval({'name': 'name', 'type': str, 'doc': 'the name of this Bar'},
+ {'name': 'data', 'type': ('data', 'array_data'), 'doc': 'some data'},
+ {'name': 'attr2', 'type': int, 'doc': 'another attribute'},
+ {'name': 'attr3', 'type': float, 'doc': 'a third attribute', 'default': 3.14},
+ {'name': 'foo', 'type': 'Foo', 'doc': 'a group', 'default': None})
+ def __init__(self, **kwargs):
+ kwargs["attr1"] = "fixed_attr1"
+ super().__init__(**kwargs)
+
+ # overwrite the "Bar" to Bar class mapping from setUp()
+ self.type_map.register_container_type(CORE_NAMESPACE, "Bar", FixedAttrBar)
+
+ baz_spec = GroupSpec('A test extension with no Container class',
+ data_type_def='Baz', data_type_inc=self.bar_spec,
+ attributes=[AttributeSpec('attr3', 'a float attribute', 'float'),
+ AttributeSpec('attr4', 'another float attribute', 'float')])
+ self.spec_catalog.register_spec(baz_spec, 'extension.yaml')
+ cls = self.type_map.get_dt_container_cls('Baz', CORE_NAMESPACE)
+ expected_args = {'name', 'data', 'attr2', 'attr3', 'attr4'}
+ received_args = set()
+ for x in get_docval(cls.__init__):
+ if x['name'] != 'foo':
+ received_args.add(x['name'])
+ with self.subTest(name=x['name']):
+ self.assertNotIn('default', x)
+ self.assertSetEqual(expected_args, received_args)
+ self.assertTrue(issubclass(cls, FixedAttrBar))
+ inst = cls(name="My Baz", data=[1, 2, 3, 4], attr2=1000, attr3=98.6, attr4=1.0)
+ self.assertEqual(inst.attr1, "fixed_attr1")
+
def test_multi_container_spec(self):
multi_spec = GroupSpec(
doc='A test extension that contains a multi',
=====================================
tests/unit/common/test_common_io.py
=====================================
@@ -1,7 +1,7 @@
from h5py import File
from hdmf.backends.hdf5 import HDF5IO
-from hdmf.common import Container, get_manager
+from hdmf.common import Container, get_manager, get_hdf5io
from hdmf.spec import NamespaceCatalog
from hdmf.testing import TestCase, remove_test_file
@@ -67,3 +67,23 @@ class TestCacheSpec(TestCase):
self.assertIsNotNone(cached_spec)
with self.subTest('Cached spec matches original spec'):
self.assertDictEqual(original_spec, cached_spec)
+
+
+class TestGetHdf5IO(TestCase):
+
+ def setUp(self):
+ self.path = get_temp_filepath()
+
+ def tearDown(self):
+ remove_test_file(self.path)
+
+ def test_gethdf5io(self):
+ """Test the get_hdf5io convenience method with manager=None."""
+ with get_hdf5io(self.path, "w") as io:
+ self.assertIsNotNone(io.manager)
+
+ def test_gethdf5io_manager(self):
+ """Test the get_hdf5io convenience method with manager set."""
+ manager = get_manager()
+ with get_hdf5io(self.path, "w", manager=manager) as io:
+ self.assertIs(io.manager, manager)
=====================================
tests/unit/common/test_generate_table.py
=====================================
@@ -5,7 +5,7 @@ import tempfile
from hdmf.backends.hdf5 import HDF5IO
from hdmf.build import BuildManager, TypeMap
-from hdmf.common import get_type_map, DynamicTable
+from hdmf.common import get_type_map, DynamicTable, VectorData
from hdmf.spec import GroupSpec, DatasetSpec, SpecCatalog, SpecNamespace, NamespaceCatalog
from hdmf.testing import TestCase
from hdmf.validate import ValidatorMap
@@ -142,11 +142,12 @@ class TestDynamicDynamicTable(TestCase):
def test_dynamic_table(self):
assert issubclass(self.TestTable, DynamicTable)
- assert self.TestTable.__columns__[0] == dict(
- name='my_col',
- description='a test column',
- required=True
- )
+ assert self.TestTable.__columns__[0] == {
+ 'name': 'my_col',
+ 'description': 'a test column',
+ 'class': VectorData,
+ 'required': True
+ }
def test_forbids_incorrect_col(self):
test_table = self.TestTable(name='test_table', description='my test table')
=====================================
tests/unit/test_io_hdf5_h5tools.py
=====================================
@@ -3137,3 +3137,29 @@ class HDF5IOClassmethodTests(TestCase):
HDF5IO.__setup_empty_dset__(self.f, 'foo', {'shape': (3, 3), 'dtype': 'float'})
with self.assertRaisesRegex(Exception, "Could not create dataset foo in /"):
HDF5IO.__setup_empty_dset__(self.f, 'foo', {'shape': (3, 3), 'dtype': 'float'})
+
+
+class H5DataIOTests(TestCase):
+
+ def _bad_arg_cm(self):
+ return self.assertRaisesRegex(ValueError, "Must specify 'dtype' and 'shape' "
+ "if not specifying 'data'")
+
+ def test_dataio_bad_args(self):
+ with self._bad_arg_cm():
+ H5DataIO(shape=(10, 10))
+ with self._bad_arg_cm():
+ H5DataIO(dtype=int)
+ with self.assertWarnsRegex(UserWarning, "Argument 'dtype' is ignored when 'data' is specified"):
+ H5DataIO(data=np.zeros((10, 10)), dtype=int)
+ with self.assertWarnsRegex(UserWarning, "Argument 'shape' is ignored when 'data' is specified"):
+ H5DataIO(data=np.zeros((10, 10)), shape=(10, 10))
+
+ def test_dataio_len(self):
+ dataio = H5DataIO(shape=(10, 10), dtype=int)
+ self.assertEqual(len(dataio), 10)
+
+ def test_dataio_shape_then_data(self):
+ dataio = H5DataIO(shape=(10, 10), dtype=int)
+ with self.assertRaisesRegex(ValueError, "Setting data when dtype and shape are not None is not supported"):
+ dataio.data = list()
=====================================
tests/unit/test_multicontainerinterface.py
=====================================
@@ -101,10 +101,12 @@ class TestBasic(TestCase):
def test_add_docval(self):
"""Test that the docval for the add method is set correctly."""
+ expected_doc = "add_container(containers)\n\nAdd one or multiple Container objects to this Foo"
+ self.assertTrue(Foo.add_container.__doc__.startswith(expected_doc))
dv = get_docval(Foo.add_container)
self.assertEqual(dv[0]['name'], 'containers')
self.assertTupleEqual(dv[0]['type'], (list, tuple, dict, Container))
- self.assertEqual(dv[0]['doc'], 'the Container to add')
+ self.assertEqual(dv[0]['doc'], 'one or multiple Container objects to add to this Foo')
self.assertFalse('default' in dv[0])
def test_create_docval(self):
=====================================
tests/unit/utils_test/test_core_DataIO.py
=====================================
@@ -60,9 +60,9 @@ class DataIOTests(TestCase):
"""
Test that either data or dtype+shape are specified exclusively
"""
- with self.assertRaisesRegex(ValueError, "Setting the dtype when data is not None is not supported"):
+ with self.assertWarnsRegex(UserWarning, "Argument 'dtype' is ignored when 'data' is specified"):
DataIO(data=np.arange(5), dtype=int)
- with self.assertRaisesRegex(ValueError, "Setting the shape when data is not None is not supported"):
+ with self.assertWarnsRegex(UserWarning, "Argument 'shape' is ignored when 'data' is specified"):
DataIO(data=np.arange(5), shape=(3,))
dataio = DataIO(shape=(3,), dtype=int)
=====================================
tests/unit/utils_test/test_docval.py
=====================================
@@ -644,6 +644,61 @@ class TestDocValidator(TestCase):
with self.assertRaisesWith(SyntaxError, msg):
method(self, True)
+ def test_allow_none_false(self):
+ """Test that docval with allow_none=True and non-None default value works"""
+ @docval({'name': 'arg1', 'type': bool, 'doc': 'this is a bool or None with a default', 'default': True,
+ 'allow_none': False})
+ def method(self, **kwargs):
+ return popargs('arg1', kwargs)
+
+ # if provided, None is not allowed
+ msg = ("TestDocValidator.test_allow_none_false.<locals>.method: incorrect type for 'arg1' "
+ "(got 'NoneType', expected 'bool')")
+ with self.assertRaisesWith(TypeError, msg):
+ res = method(self, arg1=None)
+
+ # if not provided, the default value is used
+ res = method(self)
+ self.assertTrue(res)
+
+ def test_allow_none(self):
+ """Test that docval with allow_none=True and non-None default value works"""
+ @docval({'name': 'arg1', 'type': bool, 'doc': 'this is a bool or None with a default', 'default': True,
+ 'allow_none': True})
+ def method(self, **kwargs):
+ return popargs('arg1', kwargs)
+
+ # if provided, None is allowed
+ res = method(self, arg1=None)
+ self.assertIsNone(res)
+
+ # if not provided, the default value is used
+ res = method(self)
+ self.assertTrue(res)
+
+ def test_allow_none_redundant(self):
+ """Test that docval with allow_none=True and default=None works"""
+ @docval({'name': 'arg1', 'type': bool, 'doc': 'this is a bool or None with a default', 'default': None,
+ 'allow_none': True})
+ def method(self, **kwargs):
+ return popargs('arg1', kwargs)
+
+ # if provided, None is allowed
+ res = method(self, arg1=None)
+ self.assertIsNone(res)
+
+ # if not provided, the default value is used
+ res = method(self)
+ self.assertIsNone(res)
+
+ def test_allow_none_no_default(self):
+ """Test that docval with allow_none=True and no default raises an error"""
+ msg = ("docval for arg1: allow_none=True can only be set if a default value is provided.")
+ with self.assertRaisesWith(Exception, msg):
+ @docval({'name': 'arg1', 'type': bool, 'doc': 'this is a bool or None with a default', 'allow_none': True})
+ def method(self, **kwargs):
+ return popargs('arg1', kwargs)
+
def test_enum_str(self):
"""Test that the basic usage of an enum check on strings works"""
@docval({'name': 'arg1', 'type': str, 'doc': 'an arg', 'enum': ['a', 'b']}) # also use enum: list
=====================================
tox.ini
=====================================
@@ -123,7 +123,7 @@ commands = {[testenv:build]commands}
# Envs that will test installation from a wheel
[testenv:wheelinstall]
deps = null
-commands = python -c "import hdmf"
+commands = python -c "import hdmf; import hdmf.common"
# Envs that will execute gallery tests
[testenv:gallery]
@@ -136,7 +136,7 @@ deps =
-rrequirements-doc.txt
commands =
- python test.py --example
+ python test_gallery.py
[testenv:gallery-py37]
basepython = python3.7
View it on GitLab: https://salsa.debian.org/med-team/hdmf/-/compare/d8d51fe68ca4967a8cf7feec23cf7d74b45a64c0...d572b2dadceeaa0627fe4a7f2e376c7d9e024afe