[Debian-med-packaging] Bug#1064752: intake: FTBFS: dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.12 3.11" returned exit code 13

Lucas Nussbaum lucas at debian.org
Sun Feb 25 19:37:35 GMT 2024


Source: intake
Version: 0.6.6-3
Severity: serious
Justification: FTBFS
Tags: trixie sid ftbfs
User: lucas at debian.org
Usertags: ftbfs-20240224 ftbfs-trixie

Hi,

During a rebuild of all packages in sid, your package failed to build
on amd64.


Relevant part (hopefully):
> make[1]: Entering directory '/<<PKGBUILDDIR>>'
> dh_install
> # For some reason, not all data is movable via install or dh_install, so
> # just force copy (and overwrite if needed) for pytest to actually work
> set -e \
> ; for py in `py3versions -sv` \
> ; do builddir=".pybuild/cpython3_${py}_intake/build" \
> ;    cp -a intake/source/tests ${builddir}/intake/source \
> ;    cp -a intake/catalog/tests ${builddir}/intake/catalog \
> ;    cp -a intake/interface/tests ${builddir}/intake/interface \
> ; done
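
For reference, the copy loop above boils down to this Python sketch (an
illustration only; the build really runs the shell recipe, and py3versions
is Debian's helper that lists the supported interpreter versions):

    import shutil
    import subprocess

    # One build directory per supported Python version, as pybuild lays out.
    pyvers = subprocess.check_output(["py3versions", "-sv"], text=True).split()
    for py in pyvers:
        builddir = f".pybuild/cpython3_{py}_intake/build"
        # Force-copy the test data trees; dirs_exist_ok=True mirrors the
        # "overwrite if needed" intent of cp -a in the recipe.
        for sub in ("source", "catalog", "interface"):
            shutil.copytree(f"intake/{sub}/tests",
                            f"{builddir}/intake/{sub}/tests",
                            dirs_exist_ok=True)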
> PYBUILD_SYSTEM=custom \
> 	       PYBUILD_TEST_ARGS='cd {build_dir}; PATH=/<<PKGBUILDDIR>>/debian/{package}/usr/bin:/<<PKGBUILDDIR>>/debian/{package}/usr/lib:/<<PKGBUILDDIR>>/debian/{package}/build/intake:$PATH {interpreter} -m pytest' \
> 	       dh_auto_test --buildsystem=pybuild
> I: pybuild base:305: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build; PATH=/<<PKGBUILDDIR>>/debian/python3-intake/usr/bin:/<<PKGBUILDDIR>>/debian/python3-intake/usr/lib:/<<PKGBUILDDIR>>/debian/python3-intake/build/intake:$PATH python3.12 -m pytest
> ============================= test session starts ==============================
> platform linux -- Python 3.12.2, pytest-7.4.4, pluggy-1.4.0
> rootdir: /<<PKGBUILDDIR>>
> collected 427 items / 10 skipped
> 
> intake/auth/tests/test_auth.py ......                                    [  1%]
> intake/catalog/tests/test_alias.py ..                                    [  1%]
> intake/catalog/tests/test_auth_integration.py ..                         [  2%]
> intake/catalog/tests/test_caching_integration.py ...F..............      [  6%]
> intake/catalog/tests/test_catalog_save.py .                              [  6%]
> intake/catalog/tests/test_core.py ..                                     [  7%]
> intake/catalog/tests/test_default.py .                                   [  7%]
> intake/catalog/tests/test_discovery.py ..                                [  7%]
> intake/catalog/tests/test_gui.py ..s..s                                  [  9%]
> intake/catalog/tests/test_local.py ..F.................................. [ 18%]
> ......................................................ssssss             [ 32%]
> intake/catalog/tests/test_parameters.py ..............                   [ 35%]
> intake/catalog/tests/test_persist.py .s                                  [ 35%]
> intake/catalog/tests/test_reload_integration.py ....                     [ 36%]
> intake/catalog/tests/test_remote_integration.py F...F..F.F.FFFFFF.F..... [ 42%]
> ...                                                                      [ 43%]
> intake/catalog/tests/test_utils.py .............                         [ 46%]
> intake/catalog/tests/test_zarr.py ...                                    [ 46%]
> intake/cli/client/tests/test_cache.py ......                             [ 48%]
> intake/cli/client/tests/test_conf.py .....                               [ 49%]
> intake/cli/client/tests/test_local_integration.py .....FF..              [ 51%]
> intake/cli/server/tests/test_serializer.py sss.........                  [ 54%]
> intake/cli/server/tests/test_server.py ..FFF.ss..                        [ 56%]
> intake/cli/tests/test_util.py ........                                   [ 58%]
> intake/container/tests/test_generics.py .                                [ 58%]
> intake/container/tests/test_persist.py ...s                              [ 59%]
> intake/interface/tests/test_init_gui.py ..s                              [ 60%]
> intake/source/tests/test_base.py .......................                 [ 65%]
> intake/source/tests/test_cache.py ...............s                       [ 69%]
> intake/source/tests/test_csv.py ...........s..                           [ 72%]
> intake/source/tests/test_derived.py ...F                                 [ 73%]
> intake/source/tests/test_discovery.py .....                              [ 74%]
> intake/source/tests/test_json.py .....................                   [ 79%]
> intake/source/tests/test_npy.py ...........                              [ 82%]
> intake/source/tests/test_text.py ................FF                      [ 86%]
> intake/source/tests/test_utils.py ................................       [ 94%]
> intake/tests/test_config.py .........                                    [ 96%]
> intake/tests/test_top_level.py ......s...                                [ 98%]
> intake/tests/test_utils.py ......                                        [100%]
> 
> =================================== FAILURES ===================================
> ______________________________ test_load_textfile ______________________________
> 
> catalog_cache = <Intake catalog: catalog_caching>
> 
>     def test_load_textfile(catalog_cache):
>         cat = catalog_cache['text_cache']
>         cache = cat.cache[0]
>     
>         cache_paths = cache.load(cat._urlpath, output=False)
> >       cache_path = cache_paths[-1]
> E       TypeError: 'NoneType' object is not subscriptable
> 
> intake/catalog/tests/test_caching_integration.py:53: TypeError
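
The TypeError above is just None being indexed: cache.load() came back with
no result. A standalone illustration (not intake code):

    # cache.load(...) effectively returned None here, so indexing fails:
    cache_paths = None
    try:
        cache_paths[-1]
    except TypeError as err:
        print(err)  # "'NoneType' object is not subscriptable"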
> _________________________________ test_nested __________________________________
> 
> args = ('/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv',)
> kwargs = {'storage_options': None}
> func = <function make_reader.<locals>.read at 0x7ff50e4e9f80>
> 
>     @wraps(fn)
>     def wrapper(*args, **kwargs):
>         func = getattr(self, dispatch_name)
>         try:
> >           return func(*args, **kwargs)
> 
> /usr/lib/python3/dist-packages/dask/backends.py:136: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:763: in read
>     return read_pandas(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> reader = <function read_csv at 0x7ff50f5aaf20>
> urlpath = '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv'
> blocksize = 'default', lineterminator = '\n', compression = 'infer'
> sample = 256000, sample_rows = 10, enforce = False, assume_missing = False
> storage_options = None, include_path_column = False, kwargs = {}
> reader_name = 'read_csv', kw = 'chunksize', lastskiprow = 0, firstrow = 0
> path_converter = None, paths = []
> 
>     def read_pandas(
>         reader,
>         urlpath,
>         blocksize="default",
>         lineterminator=None,
>         compression="infer",
>         sample=256000,
>         sample_rows=10,
>         enforce=False,
>         assume_missing=False,
>         storage_options=None,
>         include_path_column=False,
>         **kwargs,
>     ):
>         reader_name = reader.__name__
>         if lineterminator is not None and len(lineterminator) == 1:
>             kwargs["lineterminator"] = lineterminator
>         else:
>             lineterminator = "\n"
>         if include_path_column and isinstance(include_path_column, bool):
>             include_path_column = "path"
>         if "index" in kwargs or (
>             "index_col" in kwargs and kwargs.get("index_col") is not False
>         ):
>             raise ValueError(
>                 "Keywords 'index' and 'index_col' not supported, except for "
>                 "'index_col=False'. Use dd.{reader_name}(...).set_index('my-index') instead"
>             )
>         for kw in ["iterator", "chunksize"]:
>             if kw in kwargs:
>                 raise ValueError(f"{kw} not supported for dd.{reader_name}")
>         if kwargs.get("nrows", None):
>             raise ValueError(
>                 "The 'nrows' keyword is not supported by "
>                 "`dd.{0}`. To achieve the same behavior, it's "
>                 "recommended to use `dd.{0}(...)."
>                 "head(n=nrows)`".format(reader_name)
>             )
>         if isinstance(kwargs.get("skiprows"), int):
>             lastskiprow = firstrow = kwargs.get("skiprows")
>         elif kwargs.get("skiprows") is None:
>             lastskiprow = firstrow = 0
>         else:
>             # When skiprows is a list, we expect more than max(skiprows) to
>             # be included in the sample. This means that [0,2] will work well,
>             # but [0, 440] might not work.
>             skiprows = set(kwargs.get("skiprows"))
>             lastskiprow = max(skiprows)
>             # find the firstrow that is not skipped, for use as header
>             firstrow = min(set(range(len(skiprows) + 1)) - set(skiprows))
>         if isinstance(kwargs.get("header"), list):
>             raise TypeError(f"List of header rows not supported for dd.{reader_name}")
>         if isinstance(kwargs.get("converters"), dict) and include_path_column:
>             path_converter = kwargs.get("converters").get(include_path_column, None)
>         else:
>             path_converter = None
>     
>         # If compression is "infer", inspect the (first) path suffix and
>         # set the proper compression option if the suffix is recognized.
>         if compression == "infer":
>             # Translate the input urlpath to a simple path list
>             paths = get_fs_token_paths(urlpath, mode="rb", storage_options=storage_options)[
>                 2
>             ]
>     
>             # Check for at least one valid path
>             if len(paths) == 0:
> >               raise OSError(f"{urlpath} resolved to no files")
> E               OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:535: OSError
> 
> The above exception was the direct cause of the following exception:
> 
> catalog1 = <Intake catalog: name_in_cat>
> 
>     def test_nested(catalog1):
>         assert 'nested' in catalog1
>         assert 'entry1' in catalog1.nested.nested()
> >       assert catalog1.entry1.read().equals(catalog1.nested.nested.entry1.read())
> 
> intake/catalog/tests/test_local.py:86: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/source/csv.py:129: in read
>     self._get_schema()
> intake/source/csv.py:115: in _get_schema
>     self._open_dataset(urlpath)
> intake/source/csv.py:94: in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> args = ('/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv',)
> kwargs = {'storage_options': None}
> func = <function make_reader.<locals>.read at 0x7ff50e4e9f80>
> 
>     @wraps(fn)
>     def wrapper(*args, **kwargs):
>         func = getattr(self, dispatch_name)
>         try:
>             return func(*args, **kwargs)
>         except Exception as e:
> >           raise type(e)(
>                 f"An error occurred while calling the {funcname(func)} "
>                 f"method registered to the {self.backend} backend.\n"
>                 f"Original Message: {e}"
>             ) from e
> E           OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> E           Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> /usr/lib/python3/dist-packages/dask/backends.py:138: OSError
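
All of the remote-integration failures below share this root cause: the
entry1_*.csv glob matches nothing in the build directory. A minimal
reproduction, assuming fsspec is installed (dask's read_pandas uses it to
expand the glob); the path below is a hypothetical stand-in:

    from fsspec.core import get_fs_token_paths

    # A pattern under a directory that holds no matching files.
    urlpath = "/tmp/definitely-empty/entry1_*.csv"
    paths = get_fs_token_paths(urlpath, mode="rb")[2]
    if len(paths) == 0:
        print(f"{urlpath} resolved to no files")  # dask raises OSError here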
> ______________________________ test_info_describe ______________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_info_describe(intake_server):
>         catalog = open_catalog(intake_server)
>     
>         assert_items_equal(list(catalog), ['use_example1', 'nested', 'entry1',
>                                            'entry1_part', 'remote_env',
>                                            'local_env', 'text', 'arr', 'datetime'])
>     
> >       info = catalog['entry1'].describe()
> 
> intake/catalog/tests/test_remote_integration.py:29: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/catalog/base.py:436: in __getitem__
>     s = self._get_entry(key)
> intake/catalog/utils.py:45: in wrapper
>     return f(self, *args, **kwargs)
> intake/catalog/base.py:323: in _get_entry
>     return entry()
> intake/catalog/entry.py:77: in __call__
>     s = self.get(**kwargs)
> intake/catalog/remote.py:459: in get
>     return open_remote(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe'
> user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}}
> page_size = None, persist_mode = 'default'
> auth = <intake.auth.base.BaseClientAuth object at 0x7ff50c8c93a0>, getenv = True
> getshell = True
> 
>     def open_remote(url, entry, container, user_parameters, description, http_args,
>                     page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None):
>         """Create either local direct data source or remote streamed source"""
>         from intake.container import container_map
>         import msgpack
>         import requests
>         from requests.compat import urljoin
>     
>         if url.startswith('intake://'):
>             url = url[len('intake://'):]
>         payload = dict(action='open',
>                        name=entry,
>                        parameters=user_parameters,
>                        available_plugins=list(plugin_registry))
>         req = requests.post(urljoin(url, 'v1/source'),
>                             data=msgpack.packb(payload, **pack_kwargs),
>                             **http_args)
>         if req.ok:
>             response = msgpack.unpackb(req.content, **unpack_kwargs)
>     
>             if 'plugin' in response:
>                 pl = response['plugin']
>                 pl = [pl] if isinstance(pl, str) else pl
>                 # Direct access
>                 for p in pl:
>                     if p in plugin_registry:
>                         source = plugin_registry[p](**response['args'])
>                         proxy = False
>                         break
>                 else:
>                     proxy = True
>             else:
>                 proxy = True
>             if proxy:
>                 response.pop('container')
>                 response.update({'name': entry, 'parameters': user_parameters})
>                 if container == 'catalog':
>                     response.update({'auth': auth,
>                                      'getenv': getenv,
>                                      'getshell': getshell,
>                                      'page_size': page_size,
>                                      'persist_mode': persist_mode
>                                      # TODO ttl?
>                                      # TODO storage_options?
>                                      })
>                 source = container_map[container](url, http_args, **response)
>             source.description = description
>             return source
>         else:
> >           raise Exception('Server error: %d, %s' % (req.status_code, req.reason))
> E           Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> intake/catalog/remote.py:519: Exception
> ---------------------------- Captured stderr setup -----------------------------
> 2024-02-24 22:52:23,664 - intake - INFO - __main__.py:main:L53 - Creating catalog from:
> 2024-02-24 22:52:23,664 - intake - INFO - __main__.py:main:L55 -   - /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests/catalog1.yml
> 2024-02-24 22:52:23,970 - intake - INFO - __main__.py:main:L62 - catalog_args: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests/catalog1.yml
> 2024-02-24 22:52:23,970 - intake - INFO - __main__.py:main:L70 - Listening on localhost:7483
> ----------------------------- Captured stderr call -----------------------------
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 136, in wrapper
>     return func(*args, **kwargs)
>            ^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 763, in read
>     return read_pandas(
>            ^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 535, in read_pandas
>     raise OSError(f"{urlpath} resolved to no files")
> OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/cli/server/server.py", line 306, in post
>     source.discover()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/base.py", line 347, in discover
>     self._load_metadata()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/base.py", line 285, in _load_metadata
>     self._schema = self._get_schema()
>                    ^^^^^^^^^^^^^^^^^^
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/csv.py", line 115, in _get_schema
>     self._open_dataset(urlpath)
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/csv.py", line 94, in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
>                       ^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 138, in wrapper
>     raise type(e)(
> OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 400 POST /v1/source (127.0.0.1): Discover failed
> 400 POST /v1/source (127.0.0.1) 159.88ms
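
For context, the failing client/server handshake in open_remote() is a plain
msgpack-over-HTTP POST; a trimmed sketch of the request side (the URL and
plugin list are placeholders, and use_bin_type is an assumption for the
pack_kwargs seen in the quoted function):

    import msgpack
    import requests

    payload = dict(action="open", name="entry1", parameters={},
                   available_plugins=["csv"])
    # The server's discover() fails (the OSError above), so it answers 400
    # and the client turns that into the "Server error: 400, ..." Exception.
    req = requests.post("http://localhost:7483/v1/source",
                        data=msgpack.packb(payload, use_bin_type=True))
    if not req.ok:
        raise Exception("Server error: %d, %s" % (req.status_code, req.reason))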
> ______________________________ test_remote_direct ______________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_remote_direct(intake_server):
>         from intake.container.dataframe import RemoteDataFrame
>         catalog = open_catalog(intake_server)
> >       s0 = catalog.entry1()
> 
> intake/catalog/tests/test_remote_integration.py:74: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> [snip: traceback into open_remote() and captured stderr as in test_info_describe above]
> E           Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> intake/catalog/remote.py:519: Exception
> _______________________ test_remote_datasource_interface _______________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_remote_datasource_interface(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       d = catalog['entry1']
> 
> intake/catalog/tests/test_remote_integration.py:101: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> [snip: traceback into open_remote() and captured stderr as in test_info_describe above]
> E           Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> intake/catalog/remote.py:519: Exception
> __________________________________ test_read ___________________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_read(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       d = catalog['entry1']
> 
> intake/catalog/tests/test_remote_integration.py:116: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> [snip: traceback into open_remote() and captured stderr as in test_info_describe above]
> E           Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> intake/catalog/remote.py:519: Exception
> _______________________________ test_read_chunks _______________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_read_chunks(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       d = catalog.entry1
> 
> intake/catalog/tests/test_remote_integration.py:170: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> [snip: traceback into open_remote() and captured stderr as in test_info_describe above]
> E           Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> intake/catalog/remote.py:519: Exception
> _____________________________ test_read_partition ______________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_read_partition(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       d = catalog.entry1
> 
> intake/catalog/tests/test_remote_integration.py:186: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> [snip: traceback into open_remote() and captured stderr as in test_info_describe above]
> E           Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> intake/catalog/remote.py:519: Exception
> __________________________________ test_close __________________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_close(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       d = catalog.entry1
> 
> intake/catalog/tests/test_remote_integration.py:201: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> [snip: traceback into open_remote() and captured stderr as in test_info_describe above]
> E           Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> intake/catalog/remote.py:519: Exception
> __________________________________ test_with ___________________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_with(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       with catalog.entry1 as f:
> 
> intake/catalog/tests/test_remote_integration.py:208: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/catalog/base.py:391: in __getattr__
>     return self[item]  # triggers reload_on_change
> intake/catalog/base.py:436: in __getitem__
>     s = self._get_entry(key)
> intake/catalog/utils.py:45: in wrapper
>     return f(self, *args, **kwargs)
> intake/catalog/base.py:323: in _get_entry
>     return entry()
> intake/catalog/entry.py:77: in __call__
>     s = self.get(**kwargs)
> intake/catalog/remote.py:459: in get
>     return open_remote(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe'
> user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}}
> page_size = None, persist_mode = 'default'
> auth = <intake.auth.base.BaseClientAuth object at 0x7ff50c8dfd40>, getenv = True
> getshell = True
> 
>     def open_remote(url, entry, container, user_parameters, description, http_args,
>                     page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None):
>         """Create either local direct data source or remote streamed source"""
>         from intake.container import container_map
>         import msgpack
>         import requests
>         from requests.compat import urljoin
>     
>         if url.startswith('intake://'):
>             url = url[len('intake://'):]
>         payload = dict(action='open',
>                        name=entry,
>                        parameters=user_parameters,
>                        available_plugins=list(plugin_registry))
>         req = requests.post(urljoin(url, 'v1/source'),
>                             data=msgpack.packb(payload, **pack_kwargs),
>                             **http_args)
>         if req.ok:
>             response = msgpack.unpackb(req.content, **unpack_kwargs)
>     
>             if 'plugin' in response:
>                 pl = response['plugin']
>                 pl = [pl] if isinstance(pl, str) else pl
>                 # Direct access
>                 for p in pl:
>                     if p in plugin_registry:
>                         source = plugin_registry[p](**response['args'])
>                         proxy = False
>                         break
>                 else:
>                     proxy = True
>             else:
>                 proxy = True
>             if proxy:
>                 response.pop('container')
>                 response.update({'name': entry, 'parameters': user_parameters})
>                 if container == 'catalog':
>                     response.update({'auth': auth,
>                                      'getenv': getenv,
>                                      'getshell': getshell,
>                                      'page_size': page_size,
>                                      'persist_mode': persist_mode
>                                      # TODO ttl?
>                                      # TODO storage_options?
>                                      })
>                 source = container_map[container](url, http_args, **response)
>             source.description = description
>             return source
>         else:
> >           raise Exception('Server error: %d, %s' % (req.status_code, req.reason))
> E           Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> intake/catalog/remote.py:519: Exception
> ----------------------------- Captured stderr call -----------------------------
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 136, in wrapper
>     return func(*args, **kwargs)
>            ^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 763, in read
>     return read_pandas(
>            ^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 535, in read_pandas
>     raise OSError(f"{urlpath} resolved to no files")
> OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/cli/server/server.py", line 306, in post
>     source.discover()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/base.py", line 347, in discover
>     self._load_metadata()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/base.py", line 285, in _load_metadata
>     self._schema = self._get_schema()
>                    ^^^^^^^^^^^^^^^^^^
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/csv.py", line 115, in _get_schema
>     self._open_dataset(urlpath)
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/csv.py", line 94, in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
>                       ^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 138, in wrapper
>     raise type(e)(
> OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 400 POST /v1/source (127.0.0.1): Discover failed
> 400 POST /v1/source (127.0.0.1) 2.71ms
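
Note: the proxy/direct decision in open_remote (quoted above) hinges on
Python's for/else: the else branch runs only when the loop finishes without
break, i.e. none of the plugins advertised by the server is available
locally. A stripped-down sketch with made-up names, not intake's API:

    advertised = ["csv"]      # plugins the server says can open the source
    local_registry = {}       # drivers importable on the client

    for name in advertised:
        if name in local_registry:
            proxy = False     # open the data source directly
            break
    else:
        proxy = True          # no local driver: stream through the server

    print(proxy)              # -> True
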
> _________________________________ test_pickle __________________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_pickle(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       d = catalog.entry1
> 
> intake/catalog/tests/test_remote_integration.py:215: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> [remainder of the test_pickle traceback, the open_remote listing, and the
>  captured stderr are identical to test_with above: Exception: Server error:
>  400, ... entry1_*.csv resolved to no files, at intake/catalog/remote.py:519]
> _________________________________ test_to_dask _________________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_to_dask(intake_server):
>         catalog = open_catalog(intake_server)
> >       d = catalog.entry1
> 
> intake/catalog/tests/test_remote_integration.py:231: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> [remainder of the test_to_dask traceback, the open_remote listing, and the
>  captured stderr are identical to test_with above: Exception: Server error:
>  400, ... entry1_*.csv resolved to no files, at intake/catalog/remote.py:519]
> _____________________________ test_remote_sequence _____________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_remote_sequence(intake_server):
>         import glob
>         d = os.path.dirname(TEST_CATALOG_PATH)
>         catalog = open_catalog(intake_server)
>         assert 'text' in catalog
>         s = catalog.text()
>         s.discover()
> >       assert s.npartitions == len(glob.glob(os.path.join(d, '*.yml')))
> E       AssertionError: assert 0 == 29
> E        +  where 0 = sources:\n  text:\n    args:\n      dtype: null\n      extra_metadata:\n        catalog_dir: /<<BUILDDIR>>/intake-0....tadata:\n      catalog_dir: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests/\n.npartitions
> E        +  and   29 = len(['/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests/catalog_dup_sources.yml',...d/intake-5i3flj/intake-0.6.6/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests/params_name_non_string.yml', ...])
> E        +    where ['/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests/catalog_dup_sources.yml',...d/intake-5i3flj/intake-0.6.6/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests/params_name_non_string.yml', ...] = <function glob at 0x7ff51426a7a0>('/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests/*.yml')
> E        +      where <function glob at 0x7ff51426a7a0> = <module 'glob' from '/usr/lib/python3.12/glob.py'>.glob
> E        +      and   '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests/*.yml' = <function join at 0x7ff514738c20>('/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/catalog/tests', '*.yml')
> E        +        where <function join at 0x7ff514738c20> = <module 'posixpath' (frozen)>.join
> E        +          where <module 'posixpath' (frozen)> = os.path
> 
> intake/catalog/tests/test_remote_integration.py:263: AssertionError
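
Note: the 0 == 29 mismatch is the missing-files problem in another guise.
The test equates npartitions of the 'text' source with the number of *.yml
files in the catalog tests directory; a server-side discover() that resolves
no files leaves npartitions at 0. The expected side of the assertion in
isolation (the directory path is a placeholder):

    import glob, os

    d = "/path/to/intake/catalog/tests"                  # placeholder
    expected = len(glob.glob(os.path.join(d, "*.yml")))  # 29 in this build tree
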
> ________________________________ test_discover _________________________________
> 
>     def test_discover():
>         cmd = [ex, '-m', 'intake.cli.client', 'discover', TEST_CATALOG_YAML,
>                'entry1']
>         process = subprocess.Popen(cmd, stdout=subprocess.PIPE,
>                                    universal_newlines=True)
>         out, _ = process.communicate()
>     
> >       assert "'dtype':" in out
> E       assert "'dtype':" in ''
> 
> intake/cli/client/tests/test_local_integration.py:89: AssertionError
> ----------------------------- Captured stderr call -----------------------------
> ERROR: OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/cli/client/tests//entry1_*.csv resolved to no files')
> ________________________________ test_get_pass _________________________________
> 
>     def test_get_pass():
>         cmd = [ex, '-m', 'intake.cli.client', 'get', TEST_CATALOG_YAML, 'entry1']
>         process = subprocess.Popen(cmd, stdout=subprocess.PIPE,
>                                    universal_newlines=True)
>         out, _ = process.communicate()
>     
> >       assert 'Charlie1   25.0     3' in out
> E       AssertionError: assert 'Charlie1   25.0     3' in ''
> 
> intake/cli/client/tests/test_local_integration.py:101: AssertionError
> ----------------------------- Captured stderr call -----------------------------
> ERROR: OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/cli/client/tests//entry1_*.csv resolved to no files')
> ______________________ TestServerV1Source.test_idle_timer ______________________
> 
> self = <intake.cli.server.tests.test_server.TestServerV1Source testMethod=test_idle_timer>
> 
>     def test_idle_timer(self):
>         self.server.start_periodic_functions(close_idle_after=0.1,
>                                              remove_idle_after=0.2)
>     
>         msg = dict(action='open', name='entry1', parameters={})
> >       resp_msg, = self.make_post_request(msg)
> 
> intake/cli/server/tests/test_server.py:208: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/cli/server/tests/test_server.py:96: in make_post_request
>     self.assertEqual(response.code, expected_status)
> E   AssertionError: 400 != 200
> ----------------------------- Captured stderr call -----------------------------
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 136, in wrapper
>     return func(*args, **kwargs)
>            ^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 763, in read
>     return read_pandas(
>            ^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 535, in read_pandas
>     raise OSError(f"{urlpath} resolved to no files")
> OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/cli/server/tests//entry1_*.csv resolved to no files
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/cli/server/server.py", line 306, in post
>     source.discover()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/base.py", line 347, in discover
>     self._load_metadata()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/base.py", line 285, in _load_metadata
>     self._schema = self._get_schema()
>                    ^^^^^^^^^^^^^^^^^^
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/csv.py", line 115, in _get_schema
>     self._open_dataset(urlpath)
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/csv.py", line 94, in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
>                       ^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 138, in wrapper
>     raise type(e)(
> OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/cli/server/tests//entry1_*.csv resolved to no files
> ------------------------------ Captured log call -------------------------------
> WARNING  tornado.general:web.py:1873 400 POST /v1/source (127.0.0.1): Discover failed
> WARNING  tornado.access:web.py:2348 400 POST /v1/source (127.0.0.1) 4.68ms
> ______________________ TestServerV1Source.test_no_format _______________________
> 
> self = <intake.cli.server.tests.test_server.TestServerV1Source testMethod=test_no_format>
> 
>     def test_no_format(self):
>         msg = dict(action='open', name='entry1', parameters={})
> >       resp_msg, = self.make_post_request(msg)
> 
> intake/cli/server/tests/test_server.py:195: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/cli/server/tests/test_server.py:96: in make_post_request
>     self.assertEqual(response.code, expected_status)
> E   AssertionError: 400 != 200
> ----------------------------- Captured stderr call -----------------------------
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 136, in wrapper
>     return func(*args, **kwargs)
>            ^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 763, in read
>     return read_pandas(
>            ^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 535, in read_pandas
>     raise OSError(f"{urlpath} resolved to no files")
> OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/cli/server/tests//entry1_*.csv resolved to no files
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/cli/server/server.py", line 306, in post
>     source.discover()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/base.py", line 347, in discover
>     self._load_metadata()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/base.py", line 285, in _load_metadata
>     self._schema = self._get_schema()
>                    ^^^^^^^^^^^^^^^^^^
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/csv.py", line 115, in _get_schema
>     self._open_dataset(urlpath)
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/csv.py", line 94, in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
>                       ^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 138, in wrapper
>     raise type(e)(
> OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/cli/server/tests//entry1_*.csv resolved to no files
> ------------------------------ Captured log call -------------------------------
> WARNING  tornado.general:web.py:1873 400 POST /v1/source (127.0.0.1): Discover failed
> WARNING  tornado.access:web.py:2348 400 POST /v1/source (127.0.0.1) 4.07ms
> _________________________ TestServerV1Source.test_open _________________________
> 
> self = <intake.cli.server.tests.test_server.TestServerV1Source testMethod=test_open>
> 
>     def test_open(self):
>         msg = dict(action='open', name='entry1', parameters={})
> >       resp_msg, = self.make_post_request(msg)
> 
> intake/cli/server/tests/test_server.py:112: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/cli/server/tests/test_server.py:96: in make_post_request
>     self.assertEqual(response.code, expected_status)
> E   AssertionError: 400 != 200
> ----------------------------- Captured stderr call -----------------------------
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 136, in wrapper
>     return func(*args, **kwargs)
>            ^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 763, in read
>     return read_pandas(
>            ^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 535, in read_pandas
>     raise OSError(f"{urlpath} resolved to no files")
> OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/cli/server/tests//entry1_*.csv resolved to no files
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/cli/server/server.py", line 306, in post
>     source.discover()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/base.py", line 347, in discover
>     self._load_metadata()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/base.py", line 285, in _load_metadata
>     self._schema = self._get_schema()
>                    ^^^^^^^^^^^^^^^^^^
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/csv.py", line 115, in _get_schema
>     self._open_dataset(urlpath)
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/csv.py", line 94, in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
>                       ^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 138, in wrapper
>     raise type(e)(
> OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/cli/server/tests//entry1_*.csv resolved to no files
> ------------------------------ Captured log call -------------------------------
> WARNING  tornado.general:web.py:1873 400 POST /v1/source (127.0.0.1): Discover failed
> WARNING  tornado.access:web.py:2348 400 POST /v1/source (127.0.0.1) 3.90ms
> ________________________________ test_other_cat ________________________________
> 
> args = ('/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/tests/../../catalog/tests//entry1_*.csv',)
> kwargs = {'storage_options': None}
> func = <function make_reader.<locals>.read at 0x7ff50e4e9f80>
> 
>     @wraps(fn)
>     def wrapper(*args, **kwargs):
>         func = getattr(self, dispatch_name)
>         try:
> >           return func(*args, **kwargs)
> 
> /usr/lib/python3/dist-packages/dask/backends.py:136: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:763: in read
>     return read_pandas(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> reader = <function read_csv at 0x7ff50f5aaf20>
> urlpath = '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/tests/../../catalog/tests//entry1_*.csv'
> blocksize = 'default', lineterminator = '\n', compression = 'infer'
> sample = 256000, sample_rows = 10, enforce = False, assume_missing = False
> storage_options = None, include_path_column = False, kwargs = {}
> reader_name = 'read_csv', kw = 'chunksize', lastskiprow = 0, firstrow = 0
> path_converter = None, paths = []
> 
>     def read_pandas(
>         reader,
>         urlpath,
>         blocksize="default",
>         lineterminator=None,
>         compression="infer",
>         sample=256000,
>         sample_rows=10,
>         enforce=False,
>         assume_missing=False,
>         storage_options=None,
>         include_path_column=False,
>         **kwargs,
>     ):
>         reader_name = reader.__name__
>         if lineterminator is not None and len(lineterminator) == 1:
>             kwargs["lineterminator"] = lineterminator
>         else:
>             lineterminator = "\n"
>         if include_path_column and isinstance(include_path_column, bool):
>             include_path_column = "path"
>         if "index" in kwargs or (
>             "index_col" in kwargs and kwargs.get("index_col") is not False
>         ):
>             raise ValueError(
>                 "Keywords 'index' and 'index_col' not supported, except for "
>                 "'index_col=False'. Use dd.{reader_name}(...).set_index('my-index') instead"
>             )
>         for kw in ["iterator", "chunksize"]:
>             if kw in kwargs:
>                 raise ValueError(f"{kw} not supported for dd.{reader_name}")
>         if kwargs.get("nrows", None):
>             raise ValueError(
>                 "The 'nrows' keyword is not supported by "
>                 "`dd.{0}`. To achieve the same behavior, it's "
>                 "recommended to use `dd.{0}(...)."
>                 "head(n=nrows)`".format(reader_name)
>             )
>         if isinstance(kwargs.get("skiprows"), int):
>             lastskiprow = firstrow = kwargs.get("skiprows")
>         elif kwargs.get("skiprows") is None:
>             lastskiprow = firstrow = 0
>         else:
>             # When skiprows is a list, we expect more than max(skiprows) to
>             # be included in the sample. This means that [0,2] will work well,
>             # but [0, 440] might not work.
>             skiprows = set(kwargs.get("skiprows"))
>             lastskiprow = max(skiprows)
>             # find the firstrow that is not skipped, for use as header
>             firstrow = min(set(range(len(skiprows) + 1)) - set(skiprows))
>         if isinstance(kwargs.get("header"), list):
>             raise TypeError(f"List of header rows not supported for dd.{reader_name}")
>         if isinstance(kwargs.get("converters"), dict) and include_path_column:
>             path_converter = kwargs.get("converters").get(include_path_column, None)
>         else:
>             path_converter = None
>     
>         # If compression is "infer", inspect the (first) path suffix and
>         # set the proper compression option if the suffix is recognized.
>         if compression == "infer":
>             # Translate the input urlpath to a simple path list
>             paths = get_fs_token_paths(urlpath, mode="rb", storage_options=storage_options)[
>                 2
>             ]
>     
>             # Check for at least one valid path
>             if len(paths) == 0:
> >               raise OSError(f"{urlpath} resolved to no files")
> E               OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files
> 
> /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:535: OSError
> 
> The above exception was the direct cause of the following exception:
> 
>     def test_other_cat():
>         cat = intake.open_catalog(catfile)
> >       df1 = cat.other_cat.read()
> 
> intake/source/tests/test_derived.py:35: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/source/derived.py:252: in read
>     return self.to_dask().compute()
> intake/source/derived.py:239: in to_dask
>     self._df = self._transform(self._source.to_dask(),
> intake/source/csv.py:133: in to_dask
>     self._get_schema()
> intake/source/csv.py:115: in _get_schema
>     self._open_dataset(urlpath)
> intake/source/csv.py:94: in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> args = ('/<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/tests/../../catalog/tests//entry1_*.csv',)
> kwargs = {'storage_options': None}
> func = <function make_reader.<locals>.read at 0x7ff50e4e9f80>
> 
>     @wraps(fn)
>     def wrapper(*args, **kwargs):
>         func = getattr(self, dispatch_name)
>         try:
>             return func(*args, **kwargs)
>         except Exception as e:
> >           raise type(e)(
>                 f"An error occurred while calling the {funcname(func)} "
>                 f"method registered to the {self.backend} backend.\n"
>                 f"Original Message: {e}"
>             ) from e
> E           OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> E           Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files
> 
> /usr/lib/python3/dist-packages/dask/backends.py:138: OSError
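
Note: the doubled OSError text above comes from dask's backend dispatcher,
which wraps the original exception in a new instance of the same type and
chains it with 'from'. The idiom reduced to its core (a sketch, not dask's
exact code):

    def call_backend(func, *args, **kwargs):
        # Re-raise as the same exception type, keeping both the wrapper
        # message and the original one in the traceback via chaining.
        try:
            return func(*args, **kwargs)
        except Exception as e:
            raise type(e)(
                f"An error occurred while calling the {func.__name__} "
                f"method.\nOriginal Message: {e}"
            ) from e
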
> ______________________________ test_text_persist _______________________________
> 
> temp_cache = None
> 
>     def test_text_persist(temp_cache):
>         cat = intake.open_catalog(os.path.join(here, 'sources.yaml'))
>         s = cat.sometext()
> >       s2 = s.persist()
> 
> intake/source/tests/test_text.py:88: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/source/base.py:226: in persist
>     out = self._export(store.getdir(self), **kwargs)
> intake/source/base.py:460: in _export
>     out = method(self, path=path, **kwargs)
> intake/container/semistructured.py:70: in _persist
>     return RemoteSequenceSource._data_to_source(b, path, encoder, **kwargs)
> intake/container/semistructured.py:90: in _data_to_source
>     files = open_files(posixpath.join(path, 'part.*'), mode='wt',
> /usr/lib/python3/dist-packages/fsspec/core.py:283: in open_files
>     fs, fs_token, paths = get_fs_token_paths(
> /usr/lib/python3/dist-packages/fsspec/core.py:649: in get_fs_token_paths
>     paths = _expand_paths(paths, name_function, num)
> /usr/lib/python3/dist-packages/fsspec/core.py:668: in _expand_paths
>     name_function = build_name_function(num - 1)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> max_int = -0.99999999
> 
>     def build_name_function(max_int: float) -> Callable[[int], str]:
>         """Returns a function that receives a single integer
>         and returns it as a string padded by enough zero characters
>         to align with maximum possible integer
>     
>         >>> name_f = build_name_function(57)
>     
>         >>> name_f(7)
>         '07'
>         >>> name_f(31)
>         '31'
>         >>> build_name_function(1000)(42)
>         '0042'
>         >>> build_name_function(999)(42)
>         '042'
>         >>> build_name_function(0)(0)
>         '0'
>         """
>         # handle corner cases max_int is 0 or exact power of 10
>         max_int += 1e-8
>     
> >       pad_length = int(math.ceil(math.log10(max_int)))
> E       ValueError: math domain error
> 
> /usr/lib/python3/dist-packages/fsspec/utils.py:175: ValueError
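
Note: this ValueError is arithmetic fallout from the same empty-source
problem. The locals above show max_int == -0.99999999, which points to
fsspec's open_files being asked to expand 'part.*' for num == 0 paths:
build_name_function then receives num - 1 == -1, and log10 is undefined
for negative numbers. The failing step in isolation:

    import math

    max_int = 0 - 1                   # num - 1 with num == 0 partitions
    max_int += 1e-8                   # -0.99999999, matching the locals
    math.ceil(math.log10(max_int))    # ValueError: math domain error
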
> _______________________________ test_text_export _______________________________
> 
> temp_cache = None
> 
>     def test_text_export(temp_cache):
>         import tempfile
>         outdir = tempfile.mkdtemp()
>         cat = intake.open_catalog(os.path.join(here, 'sources.yaml'))
>         s = cat.sometext()
> >       out = s.export(outdir)
> 
> intake/source/tests/test_text.py:97: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/source/base.py:452: in export
>     return self._export(path, **kwargs)
> intake/source/base.py:460: in _export
>     out = method(self, path=path, **kwargs)
> intake/container/semistructured.py:70: in _persist
>     return RemoteSequenceSource._data_to_source(b, path, encoder, **kwargs)
> intake/container/semistructured.py:90: in _data_to_source
>     files = open_files(posixpath.join(path, 'part.*'), mode='wt',
> /usr/lib/python3/dist-packages/fsspec/core.py:283: in open_files
>     fs, fs_token, paths = get_fs_token_paths(
> /usr/lib/python3/dist-packages/fsspec/core.py:649: in get_fs_token_paths
>     paths = _expand_paths(paths, name_function, num)
> /usr/lib/python3/dist-packages/fsspec/core.py:668: in _expand_paths
>     name_function = build_name_function(num - 1)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> [locals and build_name_function listing identical to test_text_persist above]
> >       pad_length = int(math.ceil(math.log10(max_int)))
> E       ValueError: math domain error
> 
> /usr/lib/python3/dist-packages/fsspec/utils.py:175: ValueError
> =============================== warnings summary ===============================
> ../../../../../../usr/lib/python3/dist-packages/dateutil/tz/tz.py:37
>   /usr/lib/python3/dist-packages/dateutil/tz/tz.py:37: DeprecationWarning: datetime.datetime.utcfromtimestamp() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.fromtimestamp(timestamp, datetime.UTC).
>     EPOCH = datetime.datetime.utcfromtimestamp(0)
> 
> .pybuild/cpython3_3.12_intake/build/intake/catalog/tests/test_remote_integration.py::test_dir
> .pybuild/cpython3_3.12_intake/build/intake/catalog/tests/test_remote_integration.py::test_dir
>   /usr/lib/python3/dist-packages/_pytest/python.py:194: PytestRemovedIn8Warning: Passing None has been deprecated.
>   See https://docs.pytest.org/en/latest/how-to/capture-warnings.html#additional-use-cases-of-warnings-in-tests for alternatives in common use cases.
>     result = testfunction(**testargs)
> 
> .pybuild/cpython3_3.12_intake/build/intake/source/tests/test_cache.py::test_filtered_compressed_cache
> .pybuild/cpython3_3.12_intake/build/intake/source/tests/test_cache.py::test_compressions[tgz]
> .pybuild/cpython3_3.12_intake/build/intake/source/tests/test_cache.py::test_compressions[tgz]
> .pybuild/cpython3_3.12_intake/build/intake/source/tests/test_cache.py::test_compressions[tbz]
> .pybuild/cpython3_3.12_intake/build/intake/source/tests/test_cache.py::test_compressions[tbz]
> .pybuild/cpython3_3.12_intake/build/intake/source/tests/test_cache.py::test_compressions[tar]
> .pybuild/cpython3_3.12_intake/build/intake/source/tests/test_cache.py::test_compressions[tar]
>   /usr/lib/python3.12/tarfile.py:2221: DeprecationWarning: Python 3.14 will, by default, filter extracted tar archives and reject files or modify their metadata. Use the filter argument to control this behavior.
>     warnings.warn(
> 
> .pybuild/cpython3_3.12_intake/build/intake/source/tests/test_discovery.py::test_package_scan
> .pybuild/cpython3_3.12_intake/build/intake/source/tests/test_discovery.py::test_package_scan
> .pybuild/cpython3_3.12_intake/build/intake/source/tests/test_discovery.py::test_enable_and_disable
>   /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build/intake/source/discovery.py:194: PendingDeprecationWarning: Package scanning may be removed
>     warnings.warn("Package scanning may be removed", category=PendingDeprecationWarning)
> 
> -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
> =========================== short test summary info ============================
> FAILED intake/catalog/tests/test_caching_integration.py::test_load_textfile
> FAILED intake/catalog/tests/test_local.py::test_nested - OSError: An error oc...
> FAILED intake/catalog/tests/test_remote_integration.py::test_info_describe - ...
> FAILED intake/catalog/tests/test_remote_integration.py::test_remote_direct - ...
> FAILED intake/catalog/tests/test_remote_integration.py::test_remote_datasource_interface
> FAILED intake/catalog/tests/test_remote_integration.py::test_read - Exception...
> FAILED intake/catalog/tests/test_remote_integration.py::test_read_chunks - Ex...
> FAILED intake/catalog/tests/test_remote_integration.py::test_read_partition
> FAILED intake/catalog/tests/test_remote_integration.py::test_close - Exceptio...
> FAILED intake/catalog/tests/test_remote_integration.py::test_with - Exception...
> FAILED intake/catalog/tests/test_remote_integration.py::test_pickle - Excepti...
> FAILED intake/catalog/tests/test_remote_integration.py::test_to_dask - Except...
> FAILED intake/catalog/tests/test_remote_integration.py::test_remote_sequence
> FAILED intake/cli/client/tests/test_local_integration.py::test_discover - ass...
> FAILED intake/cli/client/tests/test_local_integration.py::test_get_pass - Ass...
> FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_idle_timer
> FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_no_format
> FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_open
> FAILED intake/source/tests/test_derived.py::test_other_cat - OSError: An erro...
> FAILED intake/source/tests/test_text.py::test_text_persist - ValueError: math...
> FAILED intake/source/tests/test_text.py::test_text_export - ValueError: math ...
> =========== 21 failed, 387 passed, 29 skipped, 13 warnings in 42.43s ===========
> E: pybuild pybuild:391: test: plugin custom failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_intake/build; PATH=/<<PKGBUILDDIR>>/debian/python3-intake/usr/bin:/<<PKGBUILDDIR>>/debian/python3-intake/usr/lib:/<<PKGBUILDDIR>>/debian/python3-intake/build/intake:$PATH python3.12 -m pytest
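
Note: the failures in this session cluster around two symptoms of missing
test data under the pybuild tree: CSV globs resolving to no files, and
fsspec being handed zero paths to expand. A quick hypothetical check (not
part of this log) for the first symptom:

    # List what the failing glob would actually match inside the pybuild
    # test directory; an empty list confirms the OSError above.
    from pathlib import Path

    tests = Path(".pybuild/cpython3_3.12_intake/build/intake/catalog/tests")
    print(sorted(tests.glob("entry1_*.csv")))
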
> I: pybuild base:305: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build; PATH=/<<PKGBUILDDIR>>/debian/python3-intake/usr/bin:/<<PKGBUILDDIR>>/debian/python3-intake/usr/lib:/<<PKGBUILDDIR>>/debian/python3-intake/build/intake:$PATH python3.11 -m pytest
> ============================= test session starts ==============================
> platform linux -- Python 3.11.8, pytest-7.4.4, pluggy-1.4.0
> rootdir: /<<PKGBUILDDIR>>
> collected 427 items / 10 skipped
> 
> intake/auth/tests/test_auth.py ......                                    [  1%]
> intake/catalog/tests/test_alias.py ..                                    [  1%]
> intake/catalog/tests/test_auth_integration.py ..                         [  2%]
> intake/catalog/tests/test_caching_integration.py ...F..............      [  6%]
> intake/catalog/tests/test_catalog_save.py .                              [  6%]
> intake/catalog/tests/test_core.py ..                                     [  7%]
> intake/catalog/tests/test_default.py .                                   [  7%]
> intake/catalog/tests/test_discovery.py ..                                [  7%]
> intake/catalog/tests/test_gui.py ..s..s                                  [  9%]
> intake/catalog/tests/test_local.py ..F.................................. [ 18%]
> ......................................................ssssss             [ 32%]
> intake/catalog/tests/test_parameters.py ..............                   [ 35%]
> intake/catalog/tests/test_persist.py .s                                  [ 35%]
> intake/catalog/tests/test_reload_integration.py ....                     [ 36%]
> intake/catalog/tests/test_remote_integration.py F...F..F.F.FFFFFF.F..... [ 42%]
> ...                                                                      [ 43%]
> intake/catalog/tests/test_utils.py .............                         [ 46%]
> intake/catalog/tests/test_zarr.py ...                                    [ 46%]
> intake/cli/client/tests/test_cache.py ......                             [ 48%]
> intake/cli/client/tests/test_conf.py .....                               [ 49%]
> intake/cli/client/tests/test_local_integration.py .....FF..              [ 51%]
> intake/cli/server/tests/test_serializer.py sss.........                  [ 54%]
> intake/cli/server/tests/test_server.py ..FFF.ss..                        [ 56%]
> intake/cli/tests/test_util.py ........                                   [ 58%]
> intake/container/tests/test_generics.py .                                [ 58%]
> intake/container/tests/test_persist.py ...s                              [ 59%]
> intake/interface/tests/test_init_gui.py ..s                              [ 60%]
> intake/source/tests/test_base.py .......................                 [ 65%]
> intake/source/tests/test_cache.py ...............s                       [ 69%]
> intake/source/tests/test_csv.py ...........s..                           [ 72%]
> intake/source/tests/test_derived.py ...F                                 [ 73%]
> intake/source/tests/test_discovery.py .....                              [ 74%]
> intake/source/tests/test_json.py .....................                   [ 79%]
> intake/source/tests/test_npy.py ...........                              [ 82%]
> intake/source/tests/test_text.py ................FF                      [ 86%]
> intake/source/tests/test_utils.py ................................       [ 94%]
> intake/tests/test_config.py .........                                    [ 96%]
> intake/tests/test_top_level.py ......s...                                [ 98%]
> intake/tests/test_utils.py ......                                        [100%]
> 
> =================================== FAILURES ===================================
> ______________________________ test_load_textfile ______________________________
> 
> catalog_cache = <Intake catalog: catalog_caching>
> 
>     def test_load_textfile(catalog_cache):
>         cat = catalog_cache['text_cache']
>         cache = cat.cache[0]
>     
>         cache_paths = cache.load(cat._urlpath, output=False)
> >       cache_path = cache_paths[-1]
> E       TypeError: 'NoneType' object is not subscriptable
> 
> intake/catalog/tests/test_caching_integration.py:53: TypeError
> _________________________________ test_nested __________________________________
> 
> args = ('/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests//entry1_*.csv',)
> kwargs = {'storage_options': None}
> func = <function make_reader.<locals>.read at 0x7f6de05d1080>
> 
>     @wraps(fn)
>     def wrapper(*args, **kwargs):
>         func = getattr(self, dispatch_name)
>         try:
> >           return func(*args, **kwargs)
> 
> /usr/lib/python3/dist-packages/dask/backends.py:136: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:763: in read
>     return read_pandas(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> reader = <function read_csv at 0x7f6de15aade0>
> urlpath = '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests//entry1_*.csv'
> blocksize = 'default', lineterminator = '\n', compression = 'infer'
> sample = 256000, sample_rows = 10, enforce = False, assume_missing = False
> storage_options = None, include_path_column = False, kwargs = {}
> reader_name = 'read_csv', kw = 'chunksize', lastskiprow = 0, firstrow = 0
> path_converter = None, paths = []
> 
> [read_pandas listing identical to the one quoted under test_other_cat in the
>  3.12 session; the same empty-glob check fails]
> E               OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:535: OSError
> 
> The above exception was the direct cause of the following exception:
> 
> catalog1 = <Intake catalog: name_in_cat>
> 
>     def test_nested(catalog1):
>         assert 'nested' in catalog1
>         assert 'entry1' in catalog1.nested.nested()
> >       assert catalog1.entry1.read().equals(catalog1.nested.nested.entry1.read())
> 
> intake/catalog/tests/test_local.py:86: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/source/csv.py:129: in read
>     self._get_schema()
> intake/source/csv.py:115: in _get_schema
>     self._open_dataset(urlpath)
> intake/source/csv.py:94: in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> args = ('/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests//entry1_*.csv',)
> kwargs = {'storage_options': None}
> func = <function make_reader.<locals>.read at 0x7f6de05d1080>
> 
>     @wraps(fn)
>     def wrapper(*args, **kwargs):
>         func = getattr(self, dispatch_name)
>         try:
>             return func(*args, **kwargs)
>         except Exception as e:
> >           raise type(e)(
>                 f"An error occurred while calling the {funcname(func)} "
>                 f"method registered to the {self.backend} backend.\n"
>                 f"Original Message: {e}"
>             ) from e
> E           OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> E           Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> /usr/lib/python3/dist-packages/dask/backends.py:138: OSError
> ______________________________ test_info_describe ______________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_info_describe(intake_server):
>         catalog = open_catalog(intake_server)
>     
>         assert_items_equal(list(catalog), ['use_example1', 'nested', 'entry1',
>                                            'entry1_part', 'remote_env',
>                                            'local_env', 'text', 'arr', 'datetime'])
>     
> >       info = catalog['entry1'].describe()
> 
> intake/catalog/tests/test_remote_integration.py:29: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/catalog/base.py:436: in __getitem__
>     s = self._get_entry(key)
> intake/catalog/utils.py:45: in wrapper
>     return f(self, *args, **kwargs)
> intake/catalog/base.py:323: in _get_entry
>     return entry()
> intake/catalog/entry.py:77: in __call__
>     s = self.get(**kwargs)
> intake/catalog/remote.py:459: in get
>     return open_remote(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe'
> user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}}
> page_size = None, persist_mode = 'default'
> auth = <intake.auth.base.BaseClientAuth object at 0x7f6de0474490>, getenv = True
> getshell = True
> 
> [open_remote listing identical to the copies quoted in the 3.12 session
>  above; it fails at the same raise]
> E           Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> intake/catalog/remote.py:519: Exception
> ---------------------------- Captured stderr setup -----------------------------
> 2024-02-24 22:53:06,307 - intake - INFO - __main__.py:main:L53 - Creating catalog from:
> 2024-02-24 22:53:06,307 - intake - INFO - __main__.py:main:L55 -   - /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests/catalog1.yml
> 2024-02-24 22:53:06,602 - intake - INFO - __main__.py:main:L62 - catalog_args: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests/catalog1.yml
> 2024-02-24 22:53:06,603 - intake - INFO - __main__.py:main:L70 - Listening on localhost:7483
> ----------------------------- Captured stderr call -----------------------------
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 136, in wrapper
>     return func(*args, **kwargs)
>            ^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 763, in read
>     return read_pandas(
>            ^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 535, in read_pandas
>     raise OSError(f"{urlpath} resolved to no files")
> OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/cli/server/server.py", line 306, in post
>     source.discover()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/base.py", line 347, in discover
>     self._load_metadata()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/base.py", line 285, in _load_metadata
>     self._schema = self._get_schema()
>                    ^^^^^^^^^^^^^^^^^^
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/csv.py", line 115, in _get_schema
>     self._open_dataset(urlpath)
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/csv.py", line 94, in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
>                       ^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 138, in wrapper
>     raise type(e)(
> OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 400 POST /v1/source (127.0.0.1): Discover failed
> 400 POST /v1/source (127.0.0.1) 177.84ms
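
The server-side discover() fails before any test logic runs: the CSV
source's glob matches nothing under the pybuild tree. The underlying
error is reproducible with dask alone (a sketch; the pattern below is
the path from the log, but any glob matching no files behaves the same):

    # Reproduce the root error: read_csv on a glob that matches no files.
    import dask.dataframe as dd

    try:
        dd.read_csv("/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake"
                    "/build/intake/catalog/tests/entry1_*.csv")
    except OSError as e:
        print(e)  # "... resolved to no files"

The remaining test_remote_integration dataframe failures below are this
same error surfacing through the intake server as a 400 response.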
> ______________________________ test_remote_direct ______________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_remote_direct(intake_server):
>         from intake.container.dataframe import RemoteDataFrame
>         catalog = open_catalog(intake_server)
> >       s0 = catalog.entry1()
> 
> intake/catalog/tests/test_remote_integration.py:74: 
> [traceback, open_remote listing, and captured stderr identical to
> test_info_describe above: Server error: 400, read_csv glob
> entry1_*.csv resolved to no files]
> _______________________ test_remote_datasource_interface _______________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_remote_datasource_interface(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       d = catalog['entry1']
> 
> intake/catalog/tests/test_remote_integration.py:101: 
> [traceback, open_remote listing, and captured stderr identical to
> test_info_describe above: Server error: 400, read_csv glob
> entry1_*.csv resolved to no files]
> __________________________________ test_read ___________________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_read(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       d = catalog['entry1']
> 
> intake/catalog/tests/test_remote_integration.py:116: 
> [traceback, open_remote listing, and captured stderr identical to
> test_info_describe above: Server error: 400, read_csv glob
> entry1_*.csv resolved to no files]
> _______________________________ test_read_chunks _______________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_read_chunks(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       d = catalog.entry1
> 
> intake/catalog/tests/test_remote_integration.py:170: 
> [traceback, open_remote listing, and captured stderr identical to
> test_info_describe above: Server error: 400, read_csv glob
> entry1_*.csv resolved to no files]
> _____________________________ test_read_partition ______________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_read_partition(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       d = catalog.entry1
> 
> intake/catalog/tests/test_remote_integration.py:186: 
> [traceback, open_remote listing, and captured stderr identical to
> test_info_describe above: Server error: 400, read_csv glob
> entry1_*.csv resolved to no files]
> __________________________________ test_close __________________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_close(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       d = catalog.entry1
> 
> intake/catalog/tests/test_remote_integration.py:201: 
> [traceback, open_remote listing, and captured stderr identical to
> test_info_describe above: Server error: 400, read_csv glob
> entry1_*.csv resolved to no files]
> __________________________________ test_with ___________________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_with(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       with catalog.entry1 as f:
> 
> intake/catalog/tests/test_remote_integration.py:208: 
> [traceback, open_remote listing, and captured stderr identical to
> test_info_describe above: Server error: 400, read_csv glob
> entry1_*.csv resolved to no files]
> _________________________________ test_pickle __________________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_pickle(intake_server):
>         catalog = open_catalog(intake_server)
>     
> >       d = catalog.entry1
> 
> intake/catalog/tests/test_remote_integration.py:215: 
> [traceback, open_remote listing, and captured stderr identical to
> test_info_describe above: Server error: 400, read_csv glob
> entry1_*.csv resolved to no files]
> _________________________________ test_to_dask _________________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_to_dask(intake_server):
>         catalog = open_catalog(intake_server)
> >       d = catalog.entry1
> 
> intake/catalog/tests/test_remote_integration.py:231: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/catalog/base.py:391: in __getattr__
>     return self[item]  # triggers reload_on_change
> intake/catalog/base.py:436: in __getitem__
>     s = self._get_entry(key)
> intake/catalog/utils.py:45: in wrapper
>     return f(self, *args, **kwargs)
> intake/catalog/base.py:323: in _get_entry
>     return entry()
> intake/catalog/entry.py:77: in __call__
>     s = self.get(**kwargs)
> intake/catalog/remote.py:459: in get
>     return open_remote(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> url = 'http://localhost:7483/', entry = 'entry1', container = 'dataframe'
> user_parameters = {}, description = 'entry1 full', http_args = {'headers': {}}
> page_size = None, persist_mode = 'default'
> auth = <intake.auth.base.BaseClientAuth object at 0x7f6ddd7e7110>, getenv = True
> getshell = True
> 
>     def open_remote(url, entry, container, user_parameters, description, http_args,
>                     page_size=None, persist_mode=None, auth=None, getenv=None, getshell=None):
>         """Create either local direct data source or remote streamed source"""
>         from intake.container import container_map
>         import msgpack
>         import requests
>         from requests.compat import urljoin
>     
>         if url.startswith('intake://'):
>             url = url[len('intake://'):]
>         payload = dict(action='open',
>                        name=entry,
>                        parameters=user_parameters,
>                        available_plugins=list(plugin_registry))
>         req = requests.post(urljoin(url, 'v1/source'),
>                             data=msgpack.packb(payload, **pack_kwargs),
>                             **http_args)
>         if req.ok:
>             response = msgpack.unpackb(req.content, **unpack_kwargs)
>     
>             if 'plugin' in response:
>                 pl = response['plugin']
>                 pl = [pl] if isinstance(pl, str) else pl
>                 # Direct access
>                 for p in pl:
>                     if p in plugin_registry:
>                         source = plugin_registry[p](**response['args'])
>                         proxy = False
>                         break
>                 else:
>                     proxy = True
>             else:
>                 proxy = True
>             if proxy:
>                 response.pop('container')
>                 response.update({'name': entry, 'parameters': user_parameters})
>                 if container == 'catalog':
>                     response.update({'auth': auth,
>                                      'getenv': getenv,
>                                      'getshell': getshell,
>                                      'page_size': page_size,
>                                      'persist_mode': persist_mode
>                                      # TODO ttl?
>                                      # TODO storage_options?
>                                      })
>                 source = container_map[container](url, http_args, **response)
>             source.description = description
>             return source
>         else:
> >           raise Exception('Server error: %d, %s' % (req.status_code, req.reason))
> E           Exception: Server error: 400, An error occurred while calling the read_csv method registered to the pandas backend. Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> intake/catalog/remote.py:519: Exception
> ----------------------------- Captured stderr call -----------------------------
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 136, in wrapper
>     return func(*args, **kwargs)
>            ^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 763, in read
>     return read_pandas(
>            ^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 535, in read_pandas
>     raise OSError(f"{urlpath} resolved to no files")
> OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/cli/server/server.py", line 306, in post
>     source.discover()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/base.py", line 347, in discover
>     self._load_metadata()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/base.py", line 285, in _load_metadata
>     self._schema = self._get_schema()
>                    ^^^^^^^^^^^^^^^^^^
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/csv.py", line 115, in _get_schema
>     self._open_dataset(urlpath)
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/csv.py", line 94, in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
>                       ^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 138, in wrapper
>     raise type(e)(
> OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests//entry1_*.csv resolved to no files
> 400 POST /v1/source (127.0.0.1): Discover failed
> 400 POST /v1/source (127.0.0.1) 2.86ms
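
Note: the immediate cause here is that the glob the CSV source expands
matches nothing under the pybuild tree, so dask refuses to build the
dataframe before any compute() happens. A minimal sketch of the same
failure, assuming only that dask is importable (the doubled slash mirrors
the paths in the log and is harmless on POSIX):

    import dask.dataframe as dd

    try:
        # Any glob that matches no files fails eagerly, at read_csv() time:
        dd.read_csv("/no/such/dir//entry1_*.csv")
    except OSError as exc:
        print(exc)  # "... resolved to no files", as in the traceback above
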
> _____________________________ test_remote_sequence _____________________________
> 
> intake_server = 'intake://localhost:7483'
> 
>     def test_remote_sequence(intake_server):
>         import glob
>         d = os.path.dirname(TEST_CATALOG_PATH)
>         catalog = open_catalog(intake_server)
>         assert 'text' in catalog
>         s = catalog.text()
>         s.discover()
> >       assert s.npartitions == len(glob.glob(os.path.join(d, '*.yml')))
> E       AssertionError: assert 0 == 29
> E        +  where 0 = sources:\n  text:\n    args:\n      dtype: null\n      extra_metadata:\n        catalog_dir: /<<BUILDDIR>>/intake-0....tadata:\n      catalog_dir: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests/\n.npartitions
> E        +  and   29 = len(['/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests/catalog_dup_sources.yml',...d/intake-5i3flj/intake-0.6.6/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests/params_name_non_string.yml', ...])
> E        +    where ['/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests/catalog_dup_sources.yml',...d/intake-5i3flj/intake-0.6.6/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests/params_name_non_string.yml', ...] = <function glob at 0x7f6de619a700>('/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests/*.yml')
> E        +      where <function glob at 0x7f6de619a700> = <module 'glob' from '/usr/lib/python3.11/glob.py'>.glob
> E        +      and   '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests/*.yml' = <function join at 0x7f6de6728720>('/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/catalog/tests', '*.yml')
> E        +        where <function join at 0x7f6de6728720> = <module 'posixpath' (frozen)>.join
> E        +          where <module 'posixpath' (frozen)> = os.path
> 
> intake/catalog/tests/test_remote_integration.py:263: AssertionError
> ________________________________ test_discover _________________________________
> 
>     def test_discover():
>         cmd = [ex, '-m', 'intake.cli.client', 'discover', TEST_CATALOG_YAML,
>                'entry1']
>         process = subprocess.Popen(cmd, stdout=subprocess.PIPE,
>                                    universal_newlines=True)
>         out, _ = process.communicate()
>     
> >       assert "'dtype':" in out
> E       assert "'dtype':" in ''
> 
> intake/cli/client/tests/test_local_integration.py:89: AssertionError
> ----------------------------- Captured stderr call -----------------------------
> ERROR: OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/cli/client/tests//entry1_*.csv resolved to no files')
> ________________________________ test_get_pass _________________________________
> 
>     def test_get_pass():
>         cmd = [ex, '-m', 'intake.cli.client', 'get', TEST_CATALOG_YAML, 'entry1']
>         process = subprocess.Popen(cmd, stdout=subprocess.PIPE,
>                                    universal_newlines=True)
>         out, _ = process.communicate()
>     
> >       assert 'Charlie1   25.0     3' in out
> E       AssertionError: assert 'Charlie1   25.0     3' in ''
> 
> intake/cli/client/tests/test_local_integration.py:101: AssertionError
> ----------------------------- Captured stderr call -----------------------------
> ERROR: OSError('An error occurred while calling the read_csv method registered to the pandas backend.\nOriginal Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/cli/client/tests//entry1_*.csv resolved to no files')
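
Both CLI failures (test_discover and test_get_pass) look secondary: the
client subprocess hits the same read_csv OSError, prints only the ERROR
line on stderr, and writes nothing to stdout, so the assertions on out
compare against an empty string. A sketch with a hypothetical child
process standing in for the failing intake.cli.client runs:

    import subprocess, sys

    # Hypothetical child: writes the error to stderr only, like the CLI above.
    child = "import sys; sys.stderr.write('ERROR: no files')"
    proc = subprocess.Popen([sys.executable, "-c", child],
                            stdout=subprocess.PIPE, universal_newlines=True)
    out, _ = proc.communicate()
    assert out == ""  # hence "'dtype':" in out and the table check both fail
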
> ______________________ TestServerV1Source.test_idle_timer ______________________
> 
> self = <intake.cli.server.tests.test_server.TestServerV1Source testMethod=test_idle_timer>
> 
>     def test_idle_timer(self):
>         self.server.start_periodic_functions(close_idle_after=0.1,
>                                              remove_idle_after=0.2)
>     
>         msg = dict(action='open', name='entry1', parameters={})
> >       resp_msg, = self.make_post_request(msg)
> 
> intake/cli/server/tests/test_server.py:208: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/cli/server/tests/test_server.py:96: in make_post_request
>     self.assertEqual(response.code, expected_status)
> E   AssertionError: 400 != 200
> ----------------------------- Captured stderr call -----------------------------
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 136, in wrapper
>     return func(*args, **kwargs)
>            ^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 763, in read
>     return read_pandas(
>            ^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 535, in read_pandas
>     raise OSError(f"{urlpath} resolved to no files")
> OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/cli/server/tests//entry1_*.csv resolved to no files
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/cli/server/server.py", line 306, in post
>     source.discover()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/base.py", line 347, in discover
>     self._load_metadata()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/base.py", line 285, in _load_metadata
>     self._schema = self._get_schema()
>                    ^^^^^^^^^^^^^^^^^^
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/csv.py", line 115, in _get_schema
>     self._open_dataset(urlpath)
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/csv.py", line 94, in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
>                       ^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 138, in wrapper
>     raise type(e)(
> OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/cli/server/tests//entry1_*.csv resolved to no files
> ------------------------------ Captured log call -------------------------------
> WARNING  tornado.general:web.py:1873 400 POST /v1/source (127.0.0.1): Discover failed
> WARNING  tornado.access:web.py:2348 400 POST /v1/source (127.0.0.1) 4.75ms
> ______________________ TestServerV1Source.test_no_format _______________________
> 
> self = <intake.cli.server.tests.test_server.TestServerV1Source testMethod=test_no_format>
> 
>     def test_no_format(self):
>         msg = dict(action='open', name='entry1', parameters={})
> >       resp_msg, = self.make_post_request(msg)
> 
> intake/cli/server/tests/test_server.py:195: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/cli/server/tests/test_server.py:96: in make_post_request
>     self.assertEqual(response.code, expected_status)
> E   AssertionError: 400 != 200
> ----------------------------- Captured stderr call -----------------------------
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 136, in wrapper
>     return func(*args, **kwargs)
>            ^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 763, in read
>     return read_pandas(
>            ^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 535, in read_pandas
>     raise OSError(f"{urlpath} resolved to no files")
> OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/cli/server/tests//entry1_*.csv resolved to no files
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/cli/server/server.py", line 306, in post
>     source.discover()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/base.py", line 347, in discover
>     self._load_metadata()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/base.py", line 285, in _load_metadata
>     self._schema = self._get_schema()
>                    ^^^^^^^^^^^^^^^^^^
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/csv.py", line 115, in _get_schema
>     self._open_dataset(urlpath)
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/csv.py", line 94, in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
>                       ^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 138, in wrapper
>     raise type(e)(
> OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/cli/server/tests//entry1_*.csv resolved to no files
> ------------------------------ Captured log call -------------------------------
> WARNING  tornado.general:web.py:1873 400 POST /v1/source (127.0.0.1): Discover failed
> WARNING  tornado.access:web.py:2348 400 POST /v1/source (127.0.0.1) 3.90ms
> _________________________ TestServerV1Source.test_open _________________________
> 
> self = <intake.cli.server.tests.test_server.TestServerV1Source testMethod=test_open>
> 
>     def test_open(self):
>         msg = dict(action='open', name='entry1', parameters={})
> >       resp_msg, = self.make_post_request(msg)
> 
> intake/cli/server/tests/test_server.py:112: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/cli/server/tests/test_server.py:96: in make_post_request
>     self.assertEqual(response.code, expected_status)
> E   AssertionError: 400 != 200
> ----------------------------- Captured stderr call -----------------------------
> Traceback (most recent call last):
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 136, in wrapper
>     return func(*args, **kwargs)
>            ^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 763, in read
>     return read_pandas(
>            ^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/dataframe/io/csv.py", line 535, in read_pandas
>     raise OSError(f"{urlpath} resolved to no files")
> OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/cli/server/tests//entry1_*.csv resolved to no files
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/cli/server/server.py", line 306, in post
>     source.discover()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/base.py", line 347, in discover
>     self._load_metadata()
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/base.py", line 285, in _load_metadata
>     self._schema = self._get_schema()
>                    ^^^^^^^^^^^^^^^^^^
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/csv.py", line 115, in _get_schema
>     self._open_dataset(urlpath)
>   File "/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/csv.py", line 94, in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
>                       ^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/usr/lib/python3/dist-packages/dask/backends.py", line 138, in wrapper
>     raise type(e)(
> OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/cli/server/tests//entry1_*.csv resolved to no files
> ------------------------------ Captured log call -------------------------------
> WARNING  tornado.general:web.py:1873 400 POST /v1/source (127.0.0.1): Discover failed
> WARNING  tornado.access:web.py:2348 400 POST /v1/source (127.0.0.1) 3.92ms
> ________________________________ test_other_cat ________________________________
> 
> args = ('/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/tests/../../catalog/tests//entry1_*.csv',)
> kwargs = {'storage_options': None}
> func = <function make_reader.<locals>.read at 0x7f6de05d1080>
> 
>     @wraps(fn)
>     def wrapper(*args, **kwargs):
>         func = getattr(self, dispatch_name)
>         try:
> >           return func(*args, **kwargs)
> 
> /usr/lib/python3/dist-packages/dask/backends.py:136: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:763: in read
>     return read_pandas(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> reader = <function read_csv at 0x7f6de15aade0>
> urlpath = '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/tests/../../catalog/tests//entry1_*.csv'
> blocksize = 'default', lineterminator = '\n', compression = 'infer'
> sample = 256000, sample_rows = 10, enforce = False, assume_missing = False
> storage_options = None, include_path_column = False, kwargs = {}
> reader_name = 'read_csv', kw = 'chunksize', lastskiprow = 0, firstrow = 0
> path_converter = None, paths = []
> 
>     def read_pandas(
>         reader,
>         urlpath,
>         blocksize="default",
>         lineterminator=None,
>         compression="infer",
>         sample=256000,
>         sample_rows=10,
>         enforce=False,
>         assume_missing=False,
>         storage_options=None,
>         include_path_column=False,
>         **kwargs,
>     ):
>         reader_name = reader.__name__
>         if lineterminator is not None and len(lineterminator) == 1:
>             kwargs["lineterminator"] = lineterminator
>         else:
>             lineterminator = "\n"
>         if include_path_column and isinstance(include_path_column, bool):
>             include_path_column = "path"
>         if "index" in kwargs or (
>             "index_col" in kwargs and kwargs.get("index_col") is not False
>         ):
>             raise ValueError(
>                 "Keywords 'index' and 'index_col' not supported, except for "
>                 "'index_col=False'. Use dd.{reader_name}(...).set_index('my-index') instead"
>             )
>         for kw in ["iterator", "chunksize"]:
>             if kw in kwargs:
>                 raise ValueError(f"{kw} not supported for dd.{reader_name}")
>         if kwargs.get("nrows", None):
>             raise ValueError(
>                 "The 'nrows' keyword is not supported by "
>                 "`dd.{0}`. To achieve the same behavior, it's "
>                 "recommended to use `dd.{0}(...)."
>                 "head(n=nrows)`".format(reader_name)
>             )
>         if isinstance(kwargs.get("skiprows"), int):
>             lastskiprow = firstrow = kwargs.get("skiprows")
>         elif kwargs.get("skiprows") is None:
>             lastskiprow = firstrow = 0
>         else:
>             # When skiprows is a list, we expect more than max(skiprows) to
>             # be included in the sample. This means that [0,2] will work well,
>             # but [0, 440] might not work.
>             skiprows = set(kwargs.get("skiprows"))
>             lastskiprow = max(skiprows)
>             # find the firstrow that is not skipped, for use as header
>             firstrow = min(set(range(len(skiprows) + 1)) - set(skiprows))
>         if isinstance(kwargs.get("header"), list):
>             raise TypeError(f"List of header rows not supported for dd.{reader_name}")
>         if isinstance(kwargs.get("converters"), dict) and include_path_column:
>             path_converter = kwargs.get("converters").get(include_path_column, None)
>         else:
>             path_converter = None
>     
>         # If compression is "infer", inspect the (first) path suffix and
>         # set the proper compression option if the suffix is recognized.
>         if compression == "infer":
>             # Translate the input urlpath to a simple path list
>             paths = get_fs_token_paths(urlpath, mode="rb", storage_options=storage_options)[
>                 2
>             ]
>     
>             # Check for at least one valid path
>             if len(paths) == 0:
> >               raise OSError(f"{urlpath} resolved to no files")
> E               OSError: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files
> 
> /usr/lib/python3/dist-packages/dask/dataframe/io/csv.py:535: OSError
> 
> The above exception was the direct cause of the following exception:
> 
>     def test_other_cat():
>         cat = intake.open_catalog(catfile)
> >       df1 = cat.other_cat.read()
> 
> intake/source/tests/test_derived.py:35: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/source/derived.py:252: in read
>     return self.to_dask().compute()
> intake/source/derived.py:239: in to_dask
>     self._df = self._transform(self._source.to_dask(),
> intake/source/csv.py:133: in to_dask
>     self._get_schema()
> intake/source/csv.py:115: in _get_schema
>     self._open_dataset(urlpath)
> intake/source/csv.py:94: in _open_dataset
>     self._dataframe = dask.dataframe.read_csv(
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> args = ('/<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/tests/../../catalog/tests//entry1_*.csv',)
> kwargs = {'storage_options': None}
> func = <function make_reader.<locals>.read at 0x7f6de05d1080>
> 
>     @wraps(fn)
>     def wrapper(*args, **kwargs):
>         func = getattr(self, dispatch_name)
>         try:
>             return func(*args, **kwargs)
>         except Exception as e:
> >           raise type(e)(
>                 f"An error occurred while calling the {funcname(func)} "
>                 f"method registered to the {self.backend} backend.\n"
>                 f"Original Message: {e}"
>             ) from e
> E           OSError: An error occurred while calling the read_csv method registered to the pandas backend.
> E           Original Message: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/tests/../../catalog/tests//entry1_*.csv resolved to no files
> 
> /usr/lib/python3/dist-packages/dask/backends.py:138: OSError
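
The doubled slash in these paths ("tests//entry1_*.csv") is a red
herring: POSIX treats consecutive separators as one, and Python's glob
matches straight through them, so the error really means the fixture CSVs
are absent from the build tree. A quick check, using a hypothetical
temporary directory:

    import glob, os, tempfile

    d = tempfile.mkdtemp()
    open(os.path.join(d, "entry1_1.csv"), "w").close()
    print(glob.glob(d + "//entry1_*.csv"))  # still matches despite the "//"
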
> ______________________________ test_text_persist _______________________________
> 
> temp_cache = None
> 
>     def test_text_persist(temp_cache):
>         cat = intake.open_catalog(os.path.join(here, 'sources.yaml'))
>         s = cat.sometext()
> >       s2 = s.persist()
> 
> intake/source/tests/test_text.py:88: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/source/base.py:226: in persist
>     out = self._export(store.getdir(self), **kwargs)
> intake/source/base.py:460: in _export
>     out = method(self, path=path, **kwargs)
> intake/container/semistructured.py:70: in _persist
>     return RemoteSequenceSource._data_to_source(b, path, encoder, **kwargs)
> intake/container/semistructured.py:90: in _data_to_source
>     files = open_files(posixpath.join(path, 'part.*'), mode='wt',
> /usr/lib/python3/dist-packages/fsspec/core.py:283: in open_files
>     fs, fs_token, paths = get_fs_token_paths(
> /usr/lib/python3/dist-packages/fsspec/core.py:649: in get_fs_token_paths
>     paths = _expand_paths(paths, name_function, num)
> /usr/lib/python3/dist-packages/fsspec/core.py:668: in _expand_paths
>     name_function = build_name_function(num - 1)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> max_int = -0.99999999
> 
>     def build_name_function(max_int: float) -> Callable[[int], str]:
>         """Returns a function that receives a single integer
>         and returns it as a string padded by enough zero characters
>         to align with maximum possible integer
>     
>         >>> name_f = build_name_function(57)
>     
>         >>> name_f(7)
>         '07'
>         >>> name_f(31)
>         '31'
>         >>> build_name_function(1000)(42)
>         '0042'
>         >>> build_name_function(999)(42)
>         '042'
>         >>> build_name_function(0)(0)
>         '0'
>         """
>         # handle corner cases max_int is 0 or exact power of 10
>         max_int += 1e-8
>     
> >       pad_length = int(math.ceil(math.log10(max_int)))
> E       ValueError: math domain error
> 
> /usr/lib/python3/dist-packages/fsspec/utils.py:175: ValueError
> _______________________________ test_text_export _______________________________
> 
> temp_cache = None
> 
>     def test_text_export(temp_cache):
>         import tempfile
>         outdir = tempfile.mkdtemp()
>         cat = intake.open_catalog(os.path.join(here, 'sources.yaml'))
>         s = cat.sometext()
> >       out = s.export(outdir)
> 
> intake/source/tests/test_text.py:97: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> intake/source/base.py:452: in export
>     return self._export(path, **kwargs)
> intake/source/base.py:460: in _export
>     out = method(self, path=path, **kwargs)
> intake/container/semistructured.py:70: in _persist
>     return RemoteSequenceSource._data_to_source(b, path, encoder, **kwargs)
> intake/container/semistructured.py:90: in _data_to_source
>     files = open_files(posixpath.join(path, 'part.*'), mode='wt',
> /usr/lib/python3/dist-packages/fsspec/core.py:283: in open_files
>     fs, fs_token, paths = get_fs_token_paths(
> /usr/lib/python3/dist-packages/fsspec/core.py:649: in get_fs_token_paths
>     paths = _expand_paths(paths, name_function, num)
> /usr/lib/python3/dist-packages/fsspec/core.py:668: in _expand_paths
>     name_function = build_name_function(num - 1)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 
> max_int = -0.99999999
> 
>     def build_name_function(max_int: float) -> Callable[[int], str]:
>         """Returns a function that receives a single integer
>         and returns it as a string padded by enough zero characters
>         to align with maximum possible integer
>     
>         >>> name_f = build_name_function(57)
>     
>         >>> name_f(7)
>         '07'
>         >>> name_f(31)
>         '31'
>         >>> build_name_function(1000)(42)
>         '0042'
>         >>> build_name_function(999)(42)
>         '042'
>         >>> build_name_function(0)(0)
>         '0'
>         """
>         # handle corner cases max_int is 0 or exact power of 10
>         max_int += 1e-8
>     
> >       pad_length = int(math.ceil(math.log10(max_int)))
> E       ValueError: math domain error
> 
> /usr/lib/python3/dist-packages/fsspec/utils.py:175: ValueError
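
The two test_text failures share one arithmetic mechanism: the persist
step produced zero output partitions (presumably because the text source
matched no files, i.e. the same fixture problem as above), so
_expand_paths calls build_name_function(num - 1) with num == 0. After the
1e-8 nudge the argument to math.log10 is still negative, hence the math
domain error. The arithmetic in isolation, stdlib only:

    import math

    num = 0                     # zero partitions to write out
    max_int = (num - 1) + 1e-8  # -0.99999999, exactly as shown above
    try:
        math.ceil(math.log10(max_int))
    except ValueError as exc:
        print(exc)              # math domain error
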
> =============================== warnings summary ===============================
> .pybuild/cpython3_3.11_intake/build/intake/catalog/tests/test_remote_integration.py::test_dir
> .pybuild/cpython3_3.11_intake/build/intake/catalog/tests/test_remote_integration.py::test_dir
>   /usr/lib/python3/dist-packages/_pytest/python.py:194: PytestRemovedIn8Warning: Passing None has been deprecated.
>   See https://docs.pytest.org/en/latest/how-to/capture-warnings.html#additional-use-cases-of-warnings-in-tests for alternatives in common use cases.
>     result = testfunction(**testargs)
> 
> .pybuild/cpython3_3.11_intake/build/intake/source/tests/test_discovery.py::test_package_scan
> .pybuild/cpython3_3.11_intake/build/intake/source/tests/test_discovery.py::test_package_scan
> .pybuild/cpython3_3.11_intake/build/intake/source/tests/test_discovery.py::test_enable_and_disable
>   /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build/intake/source/discovery.py:194: PendingDeprecationWarning: Package scanning may be removed
>     warnings.warn("Package scanning may be removed", category=PendingDeprecationWarning)
> 
> -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
> =========================== short test summary info ============================
> FAILED intake/catalog/tests/test_caching_integration.py::test_load_textfile
> FAILED intake/catalog/tests/test_local.py::test_nested - OSError: An error oc...
> FAILED intake/catalog/tests/test_remote_integration.py::test_info_describe - ...
> FAILED intake/catalog/tests/test_remote_integration.py::test_remote_direct - ...
> FAILED intake/catalog/tests/test_remote_integration.py::test_remote_datasource_interface
> FAILED intake/catalog/tests/test_remote_integration.py::test_read - Exception...
> FAILED intake/catalog/tests/test_remote_integration.py::test_read_chunks - Ex...
> FAILED intake/catalog/tests/test_remote_integration.py::test_read_partition
> FAILED intake/catalog/tests/test_remote_integration.py::test_close - Exceptio...
> FAILED intake/catalog/tests/test_remote_integration.py::test_with - Exception...
> FAILED intake/catalog/tests/test_remote_integration.py::test_pickle - Excepti...
> FAILED intake/catalog/tests/test_remote_integration.py::test_to_dask - Except...
> FAILED intake/catalog/tests/test_remote_integration.py::test_remote_sequence
> FAILED intake/cli/client/tests/test_local_integration.py::test_discover - ass...
> FAILED intake/cli/client/tests/test_local_integration.py::test_get_pass - Ass...
> FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_idle_timer
> FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_no_format
> FAILED intake/cli/server/tests/test_server.py::TestServerV1Source::test_open
> FAILED intake/source/tests/test_derived.py::test_other_cat - OSError: An erro...
> FAILED intake/source/tests/test_text.py::test_text_persist - ValueError: math...
> FAILED intake/source/tests/test_text.py::test_text_export - ValueError: math ...
> =========== 21 failed, 387 passed, 29 skipped, 5 warnings in 42.01s ============
> E: pybuild pybuild:391: test: plugin custom failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_intake/build; PATH=/<<PKGBUILDDIR>>/debian/python3-intake/usr/bin:/<<PKGBUILDDIR>>/debian/python3-intake/usr/lib:/<<PKGBUILDDIR>>/debian/python3-intake/build/intake:$PATH python3.11 -m pytest
> dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.12 3.11" returned exit code 13


The full build log is available from:
http://qa-logs.debian.net/2024/02/24/intake_0.6.6-3_unstable.log

All bugs filed during this archive rebuild are listed at:
https://bugs.debian.org/cgi-bin/pkgreport.cgi?tag=ftbfs-20240224;users=lucas@debian.org
or:
https://udd.debian.org/bugs/?release=na&merged=ign&fnewerval=7&flastmodval=7&fusertag=only&fusertagtag=ftbfs-20240224&fusertaguser=lucas@debian.org&allbugs=1&cseverity=1&ctags=1&caffected=1#results

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

If you reassign this bug to another package, please mark it as affecting
this package ('affects'). See https://www.debian.org/Bugs/server-control#affects

If you fail to reproduce this, please provide a build log and diff it with mine
so that we can identify if something relevant changed in the meantime.


