[Git][debian-gis-team/pyshp][master] 4 commits: New upstream version 3.0.1
Bas Couwenberg (@sebastic)
gitlab at salsa.debian.org
Wed Aug 20 04:52:47 BST 2025
Bas Couwenberg pushed to branch master at Debian GIS Project / pyshp
Commits:
d439a599 by Bas Couwenberg at 2025-08-20T05:45:53+02:00
New upstream version 3.0.1
- - - - -
c5562a0f by Bas Couwenberg at 2025-08-20T05:46:03+02:00
Update upstream source from tag 'upstream/3.0.1'
Update to upstream version '3.0.1'
with Debian dir 0fcc27b081fb1c311d327556efd66d005180b4ae
- - - - -
2f696a7c by Bas Couwenberg at 2025-08-20T05:48:11+02:00
New upstream release.
- - - - -
548bbc52 by Bas Couwenberg at 2025-08-20T05:49:16+02:00
Set distribution to unstable.
- - - - -
10 changed files:
- .github/workflows/run_checks_build_and_test.yml
- .pre-commit-config.yaml
- LICENSE.TXT
- README.md
- changelog.txt
- debian/changelog
- pyproject.toml
- run_benchmarks.py
- src/shapefile.py
- test_shapefile.py
Changes:
=====================================
.github/workflows/run_checks_build_and_test.yml
=====================================
@@ -13,14 +13,28 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- - uses: actions/setup-python@v5
- uses: pre-commit/action@v3.0.1
+ mypy-strict:
+ runs-on: ubuntu-latest
+ steps:
+ - uses: actions/checkout@v4
+ - uses: actions/setup-python@v5
+ with:
+ python-version: "3.13"
+ - name: Install mypy
+ run: pip install mypy
+ - name: Run mypy --strict
+ run: mypy --strict ./src/shapefile.py
+
+
build_wheel_and_sdist:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
+ with:
+ python-version: "3.13"
- name: Build wheel from the project repo
uses: ./.github/actions/build_wheel_and_sdist
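The new mypy-strict job above simply runs mypy --strict against src/shapefile.py on Python 3.13. For a comparable local check, the same invocation can be driven through mypy's programmatic API; a minimal sketch, assuming mypy is installed:

    from mypy import api

    # Equivalent of the CI step: mypy --strict ./src/shapefile.py
    stdout, stderr, exit_status = api.run(["--strict", "./src/shapefile.py"])
    print(stdout, end="")
    raise SystemExit(exit_status)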
=====================================
.pre-commit-config.yaml
=====================================
@@ -7,12 +7,6 @@ repos:
args: [ --fix ]
# Run the formatter
- id: ruff-format
-- repo: https://github.com/pycqa/isort
- rev: 6.0.1
- hooks:
- - id: isort
- name: isort (python)
- args: ["--profile", "black"]
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
hooks:
=====================================
LICENSE.TXT
=====================================
@@ -1,8 +1,8 @@
The MIT License (MIT)
-Copyright © 2013 Joel Lawhead
+Copyright © 2013 Joel Lawhead
-Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
+Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
=====================================
README.md
=====================================
@@ -93,6 +93,17 @@ part of your geospatial project.
# Version Changes
+## 3.0.1
+
+### Improvements
+- Reader(shp=, dbf=, shx=) now supports pathlib.Path and any other path-like object (@mwtoews).
+### Bug fixes
+- PyShp 3 no longer modifies the global doctest module (@JamesParrott).
+### Code quality
+- isort replaced by Ruff check's I rule (@mwtoews).
+- mypy --strict used in CI (@JamesParrott).
+- LICENSE.TXT re-encoded in UTF-8 (@musicinmybrain).
+
## 3.0.0
### Breaking Changes:
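The pathlib support listed under 3.0.1 above means Reader accepts Path objects both for the positional shapefile path and for the shp/shx/dbf keyword arguments, as exercised by the new tests in test_shapefile.py below. A minimal sketch, assuming a local directory containing the blockgroups test shapefiles:

    from pathlib import Path

    import shapefile

    base = Path("shapefiles")  # assumed location of blockgroups.shp/.shx/.dbf
    with shapefile.Reader(base / "blockgroups") as sf:
        print(len(sf))
    # Individual constituent files can also be passed as Paths:
    with shapefile.Reader(dbf=base / "blockgroups.dbf") as sf:
        print(sf.record(3))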
=====================================
changelog.txt
=====================================
@@ -1,3 +1,16 @@
+VERSION 3.0.1
+
+2025-08-19
+ Improvements:
+ * Reader(shp=, dbf=, shx=) now supports pathlib.Path and any other path-like object (@mwtoews).
+ Bug fixes:
+ * PyShp 3 no longer modifies the global doctest module (@JamesParrott).
+ Code quality:
+ * isort replaced by Ruff check's I rule (@mwtoews).
+ * mypy --strict used in CI (@JamesParrott).
+ * LICENSE.TXT re-encoded in UTF-8 (@musicinmybrain)
+
+
VERSION 3.0.0
2025-08-03
@@ -15,13 +28,13 @@ VERSION 3.0.0
Code quality
* Statically typed and checked with Mypy
- * Checked with Ruff.
+ * Checked with Ruff. (@mwtoews)
* Type hints
* f-strings
* Remove Python 2 specific functions.
* Run doctests against wheels.
* Testing of wheels before publishing them
- * pyproject.toml src layout
+ * Updated metadata, changed build backend to Hatch, and restructured repo into pyproject.toml src layout (@mwtoews)
* Slow test marked.
Improvements:
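Regarding the doctest fix listed for 3.0.1: the module-level doctest.NORMALIZE_WHITESPACE = 1 assignment has been removed (see the src/shapefile.py hunk below). Code that relied on that global can pass the flag per run instead; a minimal sketch against an arbitrary doctest-bearing file:

    import doctest

    # Pass the option per run rather than mutating the doctest module globally.
    results = doctest.testfile(
        "README.md",  # any file containing doctests
        module_relative=False,
        optionflags=doctest.NORMALIZE_WHITESPACE,
    )
    print(f"{results.failed} failures out of {results.attempted} tests")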
=====================================
debian/changelog
=====================================
@@ -1,9 +1,10 @@
-pyshp (3.0.0-2) UNRELEASED; urgency=medium
+pyshp (3.0.1-1) unstable; urgency=medium
* Team upload.
+ * New upstream release.
* Mark python3-pyshp as Multi-Arch: foreign.
- -- Bas Couwenberg <sebastic at debian.org> Tue, 12 Aug 2025 21:06:41 +0200
+ -- Bas Couwenberg <sebastic at debian.org> Wed, 20 Aug 2025 05:49:06 +0200
pyshp (3.0.0-1) unstable; urgency=medium
=====================================
pyproject.toml
=====================================
@@ -87,12 +87,14 @@ exclude = [
line-length = 88
indent-width = 4
-# Assume Python 3.9
-target-version = "py39"
-
[tool.ruff.lint]
-# Enable Pyflakes (`F`) and a subset of the pycodestyle (`E`) codes by default.
-select = ["E4", "E7", "E9", "F"]
+select = [
+ "E4", # pycodestyle: Import
+ "E7", # pycodestyle: Statement
+ "E9", # pycodestyle: Runtime
+ "F", # pyflakes
+ "I", # isort
+]
ignore = []
# Allow fix for all enabled rules (when `--fix`) is provided.
=====================================
run_benchmarks.py
=====================================
@@ -7,9 +7,10 @@ import functools
import os
import timeit
from collections.abc import Callable
+from os import PathLike
from pathlib import Path
from tempfile import TemporaryFile as TempF
-from typing import Union
+from typing import Iterable, Union, cast
import shapefile
@@ -50,21 +51,22 @@ fields = {}
shapeRecords = collections.defaultdict(list)
-def open_shapefile_with_PyShp(target: Union[str, os.PathLike]):
+def open_shapefile_with_PyShp(target: Union[str, PathLike]):
with shapefile.Reader(target) as r:
fields[target] = r.fields
for shapeRecord in r.iterShapeRecords():
shapeRecords[target].append(shapeRecord)
-def write_shapefile_with_PyShp(target: Union[str, os.PathLike]):
+def write_shapefile_with_PyShp(target: Union[str, PathLike]):
with TempF("wb") as shp, TempF("wb") as dbf, TempF("wb") as shx:
with shapefile.Writer(shp=shp, dbf=dbf, shx=shx) as w: # type: ignore [arg-type]
for field_info_tuple in fields[target]:
w.field(*field_info_tuple)
for shapeRecord in shapeRecords[target]:
- w.shape(shapeRecord.shape)
- w.record(*shapeRecord.record)
+ w.shape(cast(shapefile.Shape, shapeRecord.shape))
+ record = cast(Iterable, shapeRecord.record)
+ w.record(*record)
SHAPEFILES = {
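The cast(...) calls added above only affect type checking: typing.cast performs no runtime conversion or validation, it simply tells mypy to treat the value as the stated type. A trivial sketch:

    from typing import Any, cast

    value: Any = [(1.0, 2.0), (3.0, 4.0)]
    points = cast(list[tuple[float, float]], value)
    assert points is value  # cast returns the same object unchanged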
=====================================
src/shapefile.py
=====================================
@@ -8,7 +8,7 @@ Compatible with Python versions >=3.9
from __future__ import annotations
-__version__ = "3.0.0"
+__version__ = "3.0.1"
import array
import doctest
@@ -20,7 +20,9 @@ import tempfile
import time
import zipfile
from datetime import date
+from os import PathLike
from struct import Struct, calcsize, error, pack, unpack
+from types import TracebackType
from typing import (
IO,
Any,
@@ -36,6 +38,7 @@ from typing import (
Protocol,
Reversible,
Sequence,
+ SupportsIndex,
TypedDict,
TypeVar,
Union,
@@ -49,8 +52,6 @@ from urllib.request import Request, urlopen
# Create named logger
logger = logging.getLogger(__name__)
-doctest.NORMALIZE_WHITESPACE = 1
-
# Module settings
VERBOSE = True
@@ -130,34 +131,34 @@ ZBox = tuple[float, float]
class WriteableBinStream(Protocol):
- def write(self, b: bytes): ...
+ def write(self, b: bytes) -> int: ...
class ReadableBinStream(Protocol):
- def read(self, size: int = -1): ...
+ def read(self, size: int = -1) -> bytes: ...
class WriteSeekableBinStream(Protocol):
- def write(self, b: bytes): ...
- def seek(self, offset: int, whence: int = 0): ...
- def tell(self): ...
+ def write(self, b: bytes) -> int: ...
+ def seek(self, offset: int, whence: int = 0) -> int: ...
+ def tell(self) -> int: ...
class ReadSeekableBinStream(Protocol):
- def seek(self, offset: int, whence: int = 0): ...
- def tell(self): ...
- def read(self, size: int = -1): ...
+ def seek(self, offset: int, whence: int = 0) -> int: ...
+ def tell(self) -> int: ...
+ def read(self, size: int = -1) -> bytes: ...
class ReadWriteSeekableBinStream(Protocol):
- def write(self, b: bytes): ...
- def seek(self, offset: int, whence: int = 0): ...
- def tell(self): ...
- def read(self, size: int = -1): ...
+ def write(self, b: bytes) -> int: ...
+ def seek(self, offset: int, whence: int = 0) -> int: ...
+ def tell(self) -> int: ...
+ def read(self, size: int = -1) -> bytes: ...
# File name, file object or anything with a read() method that returns bytes.
-BinaryFileT = Union[str, IO[bytes]]
+BinaryFileT = Union[str, PathLike[Any], IO[bytes]]
BinaryFileStreamT = Union[IO[bytes], io.BytesIO, WriteSeekableBinStream]
FieldTypeT = Literal["C", "D", "F", "L", "M", "N"]
@@ -230,7 +231,7 @@ class Field(NamedTuple):
return f'Field(name="{self.name}", field_type=FieldType.{self.field_type}, size={self.size}, decimal={self.decimal})'
-RecordValueNotDate = Union[bool, int, float, str, date]
+RecordValueNotDate = Union[bool, int, float, str]
# A Possible value in a Shapefile dbf record, i.e. L, N, M, F, C, or D types
RecordValue = Union[RecordValueNotDate, date]
@@ -238,7 +239,7 @@ RecordValue = Union[RecordValueNotDate, date]
class HasGeoInterface(Protocol):
@property
- def __geo_interface__(self) -> Any: ...
+ def __geo_interface__(self) -> GeoJSONHomogeneousGeometryObject: ...
class GeoJSONPoint(TypedDict):
@@ -323,7 +324,10 @@ class GeoJSONFeatureCollection(TypedDict):
class GeoJSONFeatureCollectionWithBBox(GeoJSONFeatureCollection):
- # bbox is technically optional under the spec
+ # bbox is technically optional under the spec, but supporting that
+ # seems a very minor improvement that would require NotRequired
+ # from the typing-extensions backport for Python 3.9
+ # (PyShp has resisted taking on any other dependencies so far!)
bbox: list[float]
@@ -336,11 +340,11 @@ unpack_2_int32_be = Struct(">2i").unpack
@overload
-def fsdecode_if_pathlike(path: os.PathLike) -> str: ...
+def fsdecode_if_pathlike(path: PathLike[Any]) -> str: ...
@overload
def fsdecode_if_pathlike(path: T) -> T: ...
-def fsdecode_if_pathlike(path):
- if isinstance(path, os.PathLike):
+def fsdecode_if_pathlike(path: Any) -> Any:
+ if isinstance(path, PathLike):
return os.fsdecode(path) # str
return path
@@ -348,12 +352,16 @@ def fsdecode_if_pathlike(path):
# Begin
+ARR_TYPE = TypeVar("ARR_TYPE", int, float)
+
-class _Array(array.array, Generic[T]):
+# In Python 3.12 we can do:
+# class _Array(array.array[ARR_TYPE], Generic[ARR_TYPE]):
+class _Array(array.array, Generic[ARR_TYPE]): # type: ignore[type-arg]
"""Converts python tuples to lists of the appropriate type.
Used to unpack different shapefile header parts."""
- def __repr__(self):
+ def __repr__(self) -> str:
return str(self.tolist())
@@ -369,7 +377,7 @@ def signed_area(
xs, ys = map(list, list(zip(*coords))[:2]) # ignore any z or m values
xs.append(xs[1])
ys.append(ys[1])
- area2 = sum(xs[i] * (ys[i + 1] - ys[i - 1]) for i in range(1, len(coords)))
+ area2: float = sum(xs[i] * (ys[i + 1] - ys[i - 1]) for i in range(1, len(coords)))
if fast:
return area2
@@ -471,7 +479,7 @@ def ring_sample(coords: PointsT, ccw: bool = False) -> Point2D:
"""
triplet = []
- def itercoords():
+ def itercoords() -> Iterator[PointT]:
# iterate full closed ring
yield from coords
# finally, yield the second coordinate to the end to allow checking the last triplet
@@ -657,6 +665,9 @@ class _NoShapeTypeSentinel:
"""
+_NO_SHAPE_TYPE_SENTINEL: Final = _NoShapeTypeSentinel()
+
+
def _m_from_point(point: Union[PointMT, PointZT], mpos: int) -> Optional[float]:
if len(point) > mpos and point[mpos] is not None:
return cast(float, point[mpos])
@@ -694,7 +705,7 @@ class CanHaveBboxNoLinesKwargs(TypedDict, total=False):
class Shape:
def __init__(
self,
- shapeType: Union[int, _NoShapeTypeSentinel] = _NoShapeTypeSentinel(),
+ shapeType: Union[int, _NoShapeTypeSentinel] = _NO_SHAPE_TYPE_SENTINEL,
points: Optional[PointsT] = None,
parts: Optional[Sequence[int]] = None, # index of start point of each part
lines: Optional[list[PointsT]] = None,
@@ -717,11 +728,16 @@ class Shape:
are designated by their starting index in geometry record's
list of shapes. For MultiPatch geometry, partTypes designates
the patch type of each of the parts.
+ Lines allows the points-lists and parts to be denoted together
+ in one argument. It is intended for multi-point shapes
+ (polylines, polygons and multipatches), but if it is used as a
+ length-1 nested list for a multipoint (instead of points, for some
+ reason), PyShp will not complain, as multipoints only have one part
+ internally.
"""
# Preserve previous behaviour for anyone who set self.shapeType = None
- if not isinstance(shapeType, _NoShapeTypeSentinel):
- self.shapeType = shapeType
+ if shapeType is not _NO_SHAPE_TYPE_SENTINEL:
+ self.shapeType = cast(int, shapeType)
else:
class_name = self.__class__.__name__
self.shapeType = SHAPETYPENUM_LOOKUP.get(class_name.upper(), NULL)
@@ -732,7 +748,6 @@ class Shape:
default_points: PointsT = []
default_parts: list[int] = []
- # Make sure polygon rings (parts) are closed
if lines is not None:
if self.shapeType in Polygon_shapeTypes:
lines = list(lines)
@@ -973,7 +988,7 @@ still included but were encoded as GeoJSON exterior rings instead of holes."
)
@staticmethod
- def _from_geojson(geoj) -> Shape:
+ def _from_geojson(geoj: GeoJSONHomogeneousGeometryObject) -> Shape:
# create empty shape
# set shapeType
geojType = geoj["type"] if geoj else "Null"
@@ -982,18 +997,26 @@ still included but were encoded as GeoJSON exterior rings instead of holes."
else:
raise GeoJSON_Error(f"Cannot create Shape from GeoJSON type '{geojType}'")
+ coordinates = geoj["coordinates"]
+
+ if coordinates == ():
+ raise GeoJSON_Error(f"Cannot create non-Null Shape from: {coordinates=}")
+
+ points: PointsT
+ parts: list[int]
+
# set points and parts
if geojType == "Point":
- points = [geoj["coordinates"]]
+ points = [cast(PointT, coordinates)]
parts = [0]
elif geojType in ("MultiPoint", "LineString"):
- points = geoj["coordinates"]
+ points = cast(PointsT, coordinates)
parts = [0]
elif geojType == "Polygon":
points = []
parts = []
index = 0
- for i, ext_or_hole in enumerate(geoj["coordinates"]):
+ for i, ext_or_hole in enumerate(cast(list[PointsT], coordinates)):
# although the latest GeoJSON spec states that exterior rings should have
# counter-clockwise orientation, we explicitly check orientation since older
# GeoJSONs might not enforce this.
@@ -1010,7 +1033,7 @@ still included but were encoded as GeoJSON exterior rings instead of holes."
points = []
parts = []
index = 0
- for linestring in geoj["coordinates"]:
+ for linestring in cast(list[PointsT], coordinates):
points.extend(linestring)
parts.append(index)
index += len(linestring)
@@ -1018,7 +1041,7 @@ still included but were encoded as GeoJSON exterior rings instead of holes."
points = []
parts = []
index = 0
- for polygon in geoj["coordinates"]:
+ for polygon in cast(list[list[PointsT]], coordinates):
for i, ext_or_hole in enumerate(polygon):
# although the latest GeoJSON spec states that exterior rings should have
# counter-clockwise orientation, we explicitly check orientation since older
@@ -1043,7 +1066,7 @@ still included but were encoded as GeoJSON exterior rings instead of holes."
def shapeTypeName(self) -> str:
return SHAPETYPE_LOOKUP[self.shapeType]
- def __repr__(self):
+ def __repr__(self) -> str:
class_name = self.__class__.__name__
if class_name == "Shape":
return f"Shape #{self.__oid}: {self.shapeTypeName}"
@@ -1055,7 +1078,7 @@ still included but were encoded as GeoJSON exterior rings instead of holes."
class NullShape(Shape):
# Shape.shapeType = NULL already,
# to preserve handling of default args in Shape.__init__
- # Repeated for clarity.
+ # Repeated for the avoidance of doubt.
def __init__(
self,
oid: Optional[int] = None,
@@ -1100,8 +1123,8 @@ _CanHaveBBox_shapeTypes = frozenset(
class _CanHaveBBox(Shape):
"""As well as setting bounding boxes, we also utilize the
- fact that this mixin applies to all the shapes that are
- not a single point.
+ fact that this mixin only applies to shapes that are not
+ a single point (polylines, polygons, multipatches and multipoints).
"""
@staticmethod
@@ -1123,7 +1146,8 @@ class _CanHaveBBox(Shape):
@staticmethod
def _read_npoints_from_byte_stream(b_io: ReadableBinStream) -> int:
- return unpack("<i", b_io.read(4))[0]
+ (nPoints,) = unpack("<i", b_io.read(4))
+ return cast(int, nPoints)
@staticmethod
def _write_npoints_to_byte_stream(b_io: WriteableBinStream, s: _CanHaveBBox) -> int:
@@ -1185,10 +1209,6 @@ class _CanHaveBBox(Shape):
b_io, nParts
)
- # else:
- # parts = None
- # partTypes = None
-
if nPoints:
kwargs["points"] = cast(
PointsT, cls._read_points_from_byte_stream(b_io, nPoints)
@@ -1204,25 +1224,7 @@ class _CanHaveBBox(Shape):
b_io, nPoints, next_shape
)
- # else:
- # points = None
- # zbox, zs = None, None
- # mbox, ms = None, None
-
return ShapeClass(**kwargs)
- # return ShapeClass(
- # shapeType=shapeType,
- # # Mypy 1.17.1 doesn't figure out that an Optional[list[Point2D]] is an Optional[list[PointT]]
- # points=cast(Optional[PointsT], points),
- # parts=parts,
- # partTypes=partTypes,
- # oid=oid,
- # m=ms,
- # z=zs,
- # bbox=shape_bbox,
- # mbox=mbox,
- # zbox=zbox,
- # )
@staticmethod
def write_to_byte_stream(
@@ -1231,7 +1233,7 @@ class _CanHaveBBox(Shape):
i: int,
) -> int:
# We use static methods here and below,
- # to support s only being an instance of a the
+ # to support s only being an instance of the
# Shape base class (with shapeType set)
# i.e. not necessarily one of our newer shape specific
# sub classes.
@@ -1290,7 +1292,8 @@ class _CanHaveParts(_CanHaveBBox):
@staticmethod
def _read_nparts_from_byte_stream(b_io: ReadableBinStream) -> int:
- return unpack("<i", b_io.read(4))[0]
+ (nParts,) = unpack("<i", b_io.read(4))
+ return cast(int, nParts)
@staticmethod
def _write_nparts_to_byte_stream(b_io: WriteableBinStream, s: _CanHaveParts) -> int:
@@ -1325,8 +1328,7 @@ class Point(Shape):
Shape.__init__(self, points=[(x, y)], oid=oid)
@staticmethod
- def _x_y_from_byte_stream(b_io: ReadableBinStream):
- # Unpack _Array too
+ def _x_y_from_byte_stream(b_io: ReadableBinStream) -> tuple[float, float]:
x, y = unpack("<2d", b_io.read(16))
# Convert to tuple
return x, y
@@ -1964,7 +1966,7 @@ SHAPE_CLASS_FROM_SHAPETYPE: dict[int, type[Union[NullShape, Point, _CanHaveBBox]
}
-class _Record(list):
+class _Record(list[RecordValue]):
"""
A class to hold a record. Subclasses list to ensure compatibility with
former work and to reuse all the optimizations of the builtin list.
@@ -2023,7 +2025,7 @@ class _Record(list):
f"{item} found as a field but not enough values available."
)
- def __setattr__(self, key: str, value: RecordValue):
+ def __setattr__(self, key: str, value: RecordValue) -> None:
"""
Sets a value of a field attribute
:param key: The field name
@@ -2039,7 +2041,15 @@ class _Record(list):
except KeyError:
raise AttributeError(f"{key} is not a field name")
- def __getitem__(self, item):
+ @overload
+ def __getitem__(self, i: SupportsIndex) -> RecordValue: ...
+ @overload
+ def __getitem__(self, s: slice) -> list[RecordValue]: ...
+ @overload
+ def __getitem__(self, s: str) -> RecordValue: ...
+ def __getitem__(
+ self, item: Union[SupportsIndex, slice, str]
+ ) -> Union[RecordValue, list[RecordValue]]:
"""
Extends the normal list item access with
access using a fieldname
@@ -2049,10 +2059,10 @@ class _Record(list):
:return: the value of the field
"""
try:
- return list.__getitem__(self, item)
+ return list.__getitem__(self, item) # type: ignore[index]
except TypeError:
try:
- index = self.__field_positions[item]
+ index = self.__field_positions[item] # type: ignore[index]
except KeyError:
index = None
if index is not None:
@@ -2060,7 +2070,17 @@ class _Record(list):
raise IndexError(f'"{item}" is not a field name and not an int')
- def __setitem__(self, key, value):
+ @overload
+ def __setitem__(self, key: SupportsIndex, value: RecordValue) -> None: ...
+ @overload
+ def __setitem__(self, key: slice, value: Iterable[RecordValue]) -> None: ...
+ @overload
+ def __setitem__(self, key: str, value: RecordValue) -> None: ...
+ def __setitem__(
+ self,
+ key: Union[SupportsIndex, slice, str],
+ value: Union[RecordValue, Iterable[RecordValue]],
+ ) -> None:
"""
Extends the normal list item access with
access using a fieldname
@@ -2070,11 +2090,11 @@ class _Record(list):
:param value: the new value of the field
"""
try:
- return list.__setitem__(self, key, value)
+ return list.__setitem__(self, key, value) # type: ignore[misc,assignment]
except TypeError:
- index = self.__field_positions.get(key)
+ index = self.__field_positions.get(key) # type: ignore[arg-type]
if index is not None:
- return list.__setitem__(self, index, value)
+ return list.__setitem__(self, index, value) # type: ignore[misc]
raise IndexError(f"{key} is not a field name and not an int")
@@ -2095,7 +2115,7 @@ class _Record(list):
dct[k] = f"{v.year:04d}{v.month:02d}{v.day:02d}"
return dct
- def __repr__(self):
+ def __repr__(self) -> str:
return f"Record #{self.__oid}: {list(self)}"
def __dir__(self) -> list[str]:
@@ -2113,8 +2133,8 @@ class _Record(list):
) # plus field names (random order if Python version < 3.6)
return default + fnames
- def __eq__(self, other):
- if isinstance(other, self.__class__):
+ def __eq__(self, other: Any) -> bool:
+ if isinstance(other, _Record):
if self.__field_positions != other.__field_positions:
return False
return list.__eq__(self, other)
@@ -2147,7 +2167,7 @@ class Shapes(list[Optional[Shape]]):
In addition to the list interface, this also provides the GeoJSON __geo_interface__
to return a GeometryCollection dictionary."""
- def __repr__(self):
+ def __repr__(self) -> str:
return f"Shapes: {list(self)}"
@property
@@ -2167,7 +2187,7 @@ class ShapeRecords(list[ShapeRecord]):
In addition to the list interface, this also provides the GeoJSON __geo_interface__
to return a FeatureCollection dictionary."""
- def __repr__(self):
+ def __repr__(self) -> str:
return f"ShapeRecords: {list(self)}"
@property
@@ -2190,6 +2210,9 @@ class _NoShpSentinel:
"""
+_NO_SHP_SENTINEL = _NoShpSentinel()
+
+
class Reader:
"""Reads the three files of a shapefile as a unit or
separately. If one of the three files (.shp, .shx,
@@ -2214,21 +2237,21 @@ class Reader:
CONSTITUENT_FILE_EXTS = ["shp", "shx", "dbf"]
assert all(ext.islower() for ext in CONSTITUENT_FILE_EXTS)
- def _assert_ext_is_supported(self, ext: str):
+ def _assert_ext_is_supported(self, ext: str) -> None:
assert ext in self.CONSTITUENT_FILE_EXTS
def __init__(
self,
- shapefile_path: Union[str, os.PathLike] = "",
+ shapefile_path: Union[str, PathLike[Any]] = "",
/,
*,
encoding: str = "utf-8",
encodingErrors: str = "strict",
- shp: Union[_NoShpSentinel, Optional[BinaryFileT]] = _NoShpSentinel(),
+ shp: Union[_NoShpSentinel, Optional[BinaryFileT]] = _NO_SHP_SENTINEL,
shx: Optional[BinaryFileT] = None,
dbf: Optional[BinaryFileT] = None,
# Keep kwargs even though unused, to preserve PyShp 2.4 API
- **kwargs,
+ **kwargs: Any,
):
self.shp = None
self.shx = None
@@ -2263,7 +2286,9 @@ class Reader:
zpath = path[: path.find(".zip") + 4]
shapefile = path[path.find(".zip") + 4 + 1 :]
- zipfileobj: Union[tempfile._TemporaryFileWrapper, io.BufferedReader]
+ zipfileobj: Union[
+ tempfile._TemporaryFileWrapper[bytes], io.BufferedReader
+ ]
# Create a zip file handle
if zpath.startswith("http"):
# Zipfile is from a url
@@ -2327,6 +2352,10 @@ class Reader:
# Close and delete the temporary zipfile
try:
zipfileobj.close()
+ # TODO Does catching all possible exceptions really increase
+ # the chances of closing the zipfile successully, or does it
+ # just mean .close() failures will still fail, but fail
+ # silently?
except: # noqa: E722
pass
# Try to load shapefile
@@ -2380,7 +2409,8 @@ class Reader:
self.load(path)
return
- if not isinstance(shp, _NoShpSentinel):
+ if shp is not _NO_SHP_SENTINEL:
+ shp = cast(Union[str, PathLike[Any], IO[bytes], None], shp)
self.shp = self.__seek_0_on_file_obj_wrap_or_open_from_name("shp", shp)
self.shx = self.__seek_0_on_file_obj_wrap_or_open_from_name("shx", shx)
@@ -2401,7 +2431,7 @@ class Reader:
if file_ is None:
return None
- if isinstance(file_, str):
+ if isinstance(file_, (str, PathLike)):
baseName, __ = os.path.splitext(file_)
return self._load_constituent_file(baseName, ext)
@@ -2417,7 +2447,7 @@ class Reader:
f"Could not load shapefile constituent file from: {file_}"
)
- def __str__(self):
+ def __str__(self) -> str:
"""
Use some general info on the shapefile as __str__
"""
@@ -2430,26 +2460,34 @@ class Reader:
info.append(f" {len(self)} records ({len(self.fields)} fields)")
return "\n".join(info)
- def __enter__(self):
+ def __enter__(self) -> Reader:
"""
Enter phase of context manager.
"""
return self
- def __exit__(self, exc_type, exc_val, exc_tb):
+ # def __exit__(self, exc_type, exc_val, exc_tb) -> None:
+ def __exit__(
+ self,
+ exc_type: Optional[type[BaseException]],
+ exc_val: Optional[BaseException],
+ exc_tb: Optional[TracebackType],
+ ) -> Optional[bool]:
"""
Exit phase of context manager, close opened files.
"""
self.close()
+ return None
- def __len__(self):
+ def __len__(self) -> int:
"""Returns the number of shapes/records in the shapefile."""
if self.dbf:
# Preferably use dbf record count
if self.numRecords is None:
self.__dbfHeader()
- return self.numRecords
+ # .__dbfHeader sets self.numRecords or raises Exception
+ return cast(int, self.numRecords)
if self.shp:
# Otherwise use shape count
@@ -2457,7 +2495,8 @@ class Reader:
if self.numShapes is None:
self.__shxHeader()
- return self.numShapes
+ # .__shxHeader sets self.numShapes or raises Exception
+ return cast(int, self.numShapes)
# Index file not available, iterate all shapes to get total count
if self.numShapes is None:
@@ -2488,7 +2527,7 @@ class Reader:
# No file loaded yet, treat as 'empty' shapefile
return 0
- def __iter__(self):
+ def __iter__(self) -> Iterator[ShapeRecord]:
"""Iterates through the shapes/records in the shapefile."""
yield from self.iterShapeRecords()
@@ -2505,7 +2544,7 @@ class Reader:
def shapeTypeName(self) -> str:
return SHAPETYPE_LOOKUP[self.shapeType]
- def load(self, shapefile=None):
+ def load(self, shapefile: Optional[str] = None) -> None:
"""Opens a shapefile from a filename or file-like
object. Normally this method would be called by the
constructor with the file name as an argument."""
@@ -2521,7 +2560,7 @@ class Reader:
)
self._try_to_set_constituent_file_headers()
- def _try_to_set_constituent_file_headers(self):
+ def _try_to_set_constituent_file_headers(self) -> None:
if self.shp:
self.__shpHeader()
if self.dbf:
@@ -2567,28 +2606,28 @@ class Reader:
self._files_to_close.append(shp_dbf_or_dhx_file)
return shp_dbf_or_dhx_file
- def load_shp(self, shapefile_name):
+ def load_shp(self, shapefile_name: str) -> None:
"""
Attempts to load file with .shp extension as both lower and upper case
"""
self.shp = self._load_constituent_file(shapefile_name, "shp")
- def load_shx(self, shapefile_name):
+ def load_shx(self, shapefile_name: str) -> None:
"""
Attempts to load file with .shx extension as both lower and upper case
"""
self.shx = self._load_constituent_file(shapefile_name, "shx")
- def load_dbf(self, shapefile_name):
+ def load_dbf(self, shapefile_name: str) -> None:
"""
Attempts to load file with .dbf extension as both lower and upper case
"""
self.dbf = self._load_constituent_file(shapefile_name, "dbf")
- def __del__(self):
+ def __del__(self) -> None:
self.close()
- def close(self):
+ def close(self) -> None:
# Close any files that the reader opened (but not those given by user)
for attribute in self._files_to_close:
if hasattr(attribute, "close"):
@@ -2692,7 +2731,7 @@ class Reader:
return shape
- def __shxHeader(self):
+ def __shxHeader(self) -> None:
"""Reads the header information from a .shx file."""
shx = self.shx
if not shx:
@@ -2704,7 +2743,7 @@ class Reader:
shxRecordLength = (unpack(">i", shx.read(4))[0] * 2) - 100
self.numShapes = shxRecordLength // 8
- def __shxOffsets(self):
+ def __shxOffsets(self) -> None:
"""Reads the shape offset positions from a .shx file"""
shx = self.shx
if not shx:
@@ -3195,7 +3234,7 @@ class Writer:
def __init__(
self,
- target: Union[str, os.PathLike, None] = None,
+ target: Union[str, PathLike[Any], None] = None,
shapeType: Optional[int] = None,
autoBalance: bool = False,
*,
@@ -3205,7 +3244,7 @@ class Writer:
shx: Optional[WriteSeekableBinStream] = None,
dbf: Optional[WriteSeekableBinStream] = None,
# Keep kwargs even though unused, to preserve PyShp 2.4 API
- **kwargs,
+ **kwargs: Any,
):
self.target = target
self.autoBalance = autoBalance
@@ -3252,28 +3291,34 @@ class Writer:
self.encoding = encoding
self.encodingErrors = encodingErrors
- def __len__(self):
+ def __len__(self) -> int:
"""Returns the current number of features written to the shapefile.
If shapes and records are unbalanced, the length is considered the highest
of the two."""
return max(self.recNum, self.shpNum)
- def __enter__(self):
+ def __enter__(self) -> Writer:
"""
Enter phase of context manager.
"""
return self
- def __exit__(self, exc_type, exc_val, exc_tb):
+ def __exit__(
+ self,
+ exc_type: Optional[type[BaseException]],
+ exc_val: Optional[BaseException],
+ exc_tb: Optional[TracebackType],
+ ) -> Optional[bool]:
"""
Exit phase of context manager, finish writing and close the files.
"""
self.close()
+ return None
- def __del__(self):
+ def __del__(self) -> None:
self.close()
- def close(self):
+ def close(self) -> None:
"""
Write final shp, shx, and dbf headers, close opened files.
"""
@@ -3327,7 +3372,9 @@ class Writer:
def __getFileObj(self, f: None) -> NoReturn: ...
@overload
def __getFileObj(self, f: WriteSeekableBinStream) -> WriteSeekableBinStream: ...
- def __getFileObj(self, f):
+ def __getFileObj(
+ self, f: Union[str, None, WriteSeekableBinStream]
+ ) -> WriteSeekableBinStream:
"""Safety handler to verify file-like objects"""
if not f:
raise ShapefileException("No file-like object available.")
@@ -3359,7 +3406,7 @@ class Writer:
shp.seek(start)
return size
- def _update_file_bbox(self, s: Shape):
+ def _update_file_bbox(self, s: Shape) -> None:
if s.shapeType == NULL:
shape_bbox = None
elif s.shapeType in _CanHaveBBox_shapeTypes:
@@ -3369,7 +3416,7 @@ class Writer:
shape_bbox = (x, y, x, y)
if shape_bbox is None:
- return
+ return None
if self._bbox:
# compare with existing
@@ -3382,8 +3429,9 @@ class Writer:
else:
# first time bbox is being set
self._bbox = shape_bbox
+ return None
- def _update_file_zbox(self, s: Union[_HasZ, PointZ]):
+ def _update_file_zbox(self, s: Union[_HasZ, PointZ]) -> None:
if self._zbox:
# compare with existing
self._zbox = (min(s.zbox[0], self._zbox[0]), max(s.zbox[1], self._zbox[1]))
@@ -3391,7 +3439,7 @@ class Writer:
# first time zbox is being set
self._zbox = s.zbox
- def _update_file_mbox(self, s: Union[_HasM, PointM]):
+ def _update_file_mbox(self, s: Union[_HasM, PointM]) -> None:
mbox = s.mbox
if self._mbox:
# compare with existing
@@ -3542,7 +3590,7 @@ class Writer:
def shape(
self,
- s: Union[Shape, HasGeoInterface, dict],
+ s: Union[Shape, HasGeoInterface, GeoJSONHomogeneousGeometryObject],
) -> None:
# Balance if already not balanced
if self.autoBalance and self.recNum < self.shpNum:
@@ -3550,15 +3598,17 @@ class Writer:
# Check is shape or import from geojson
if not isinstance(s, Shape):
if hasattr(s, "__geo_interface__"):
- s = s.__geo_interface__ # type: ignore [assignment]
- if isinstance(s, dict):
- s = Shape._from_geojson(s)
+ s = cast(HasGeoInterface, s)
+ shape_dict = s.__geo_interface__
+ elif isinstance(s, dict): # TypedDict is a dict at runtime
+ shape_dict = s
else:
raise TypeError(
"Can only write Shape objects, GeoJSON dictionaries, "
"or objects with the __geo_interface__, "
f"not: {s}"
)
+ s = Shape._from_geojson(shape_dict)
# Write to file
offset, length = self.__shpRecord(s)
if self.shx:
@@ -3600,7 +3650,8 @@ class Writer:
# Record number, Content length place holder
b_io.write(pack(">2i", self.shpNum, -1))
- # Track number of content bytes written. Excluding self.shpNum and length t.b.c.
+ # Track number of content bytes written, excluding
+ # self.shpNum and length (t.b.c.)
n = 0
n += b_io.write(pack("<i", s.shapeType))
@@ -3962,7 +4013,7 @@ def _filter_network_doctests(
def _replace_remote_url(
old_url: str,
- # Default port of Python http.server and Python 2's SimpleHttpServer
+ # Default port of Python http.server
port: int = 8000,
scheme: str = "http",
netloc: str = "localhost",
@@ -3978,7 +4029,7 @@ def _replace_remote_url(
if path is None:
path = old_parsed.path.rpartition("/")[2]
- if port not in (None, ""):
+ if port not in (None, ""): # type: ignore[comparison-overlap]
netloc = f"{netloc}:{port}"
new_parsed = old_parsed._replace(
@@ -4043,7 +4094,7 @@ def _test(args: list[str] = sys.argv[1:], verbosity: bool = False) -> int:
return failure_count
-def main():
+def main() -> None:
"""
Doctests are contained in the file 'README.md', and are tested using the built-in
testing libraries.
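The Writer.shape() hunk above tightens the type hints, but the accepted inputs are unchanged: a Shape, a GeoJSON-style geometry mapping, or any object exposing __geo_interface__. A minimal sketch with a hypothetical output name:

    import shapefile

    # "points_demo" is a hypothetical basename; .shp/.shx/.dbf files are created.
    with shapefile.Writer("points_demo", shapeType=shapefile.POINT) as w:
        w.field("NAME", "C")
        w.shape({"type": "Point", "coordinates": (10.0, 20.0)})  # GeoJSON-style dict
        w.record("first")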
=====================================
test_shapefile.py
=====================================
@@ -13,6 +13,8 @@ import pytest
# our imports
import shapefile
+shapefiles_dir = Path(__file__).parent / "shapefiles"
+
# define various test shape tuples of (type, points, parts indexes, and expected geo interface output)
geo_interface_tests = [
(
@@ -719,8 +721,7 @@ def test_reader_pathlike():
"""
Assert that path-like objects can be read.
"""
- base = Path("shapefiles")
- with shapefile.Reader(base / "blockgroups") as sf:
+ with shapefile.Reader(shapefiles_dir / "blockgroups") as sf:
assert len(sf) == 663
@@ -736,6 +737,18 @@ def test_reader_dbf_only():
assert record[1:3] == ["060750601001", 4715]
+def test_reader_dbf_only_from_Path():
+ """
+ Assert that specifying just the
+ dbf argument to the shapefile reader as a Path
+ reads just the dbf file.
+ """
+ with shapefile.Reader(dbf=shapefiles_dir / "blockgroups.dbf") as sf:
+ assert len(sf) == 663
+ record = sf.record(3)
+ assert record[1:3] == ["060750601001", 4715]
+
+
def test_reader_shp_shx_only():
"""
Assert that specifying just the
@@ -750,6 +763,20 @@ def test_reader_shp_shx_only():
assert len(shape.points) == 173
+def test_reader_shp_shx_only_from_Paths():
+ """
+ Assert that specifying just the
+ shp and shx argument to the shapefile reader as Paths
+ reads just the shp and shx file.
+ """
+ with shapefile.Reader(
+ shp=shapefiles_dir / "blockgroups.shp", shx=shapefiles_dir / "blockgroups.shx"
+ ) as sf:
+ assert len(sf) == 663
+ shape = sf.shape(3)
+ assert len(shape.points) == 173
+
+
def test_reader_shp_dbf_only():
"""
Assert that specifying just the
@@ -766,6 +793,22 @@ def test_reader_shp_dbf_only():
assert record[1:3] == ["060750601001", 4715]
+def test_reader_shp_dbf_only_from_Paths():
+ """
+ Assert that specifying just the
+ shp and dbf argument to the shapefile reader as Paths
+ reads just the shp and dbf file.
+ """
+ with shapefile.Reader(
+ shp=shapefiles_dir / "blockgroups.shp", dbf=shapefiles_dir / "blockgroups.dbf"
+ ) as sf:
+ assert len(sf) == 663
+ shape = sf.shape(3)
+ assert len(shape.points) == 173
+ record = sf.record(3)
+ assert record[1:3] == ["060750601001", 4715]
+
+
def test_reader_shp_only():
"""
Assert that specifying just the
@@ -778,6 +821,18 @@ def test_reader_shp_only():
assert len(shape.points) == 173
+def test_reader_shp_only_from_Path():
+ """
+ Assert that specifying just the
+ shp argument to the shapefile reader as a Path
+ reads just the shp file (shx optional).
+ """
+ with shapefile.Reader(shp=shapefiles_dir / "blockgroups.shp") as sf:
+ assert len(sf) == 663
+ shape = sf.shape(3)
+ assert len(shape.points) == 173
+
+
def test_reader_filelike_dbf_only():
"""
Assert that specifying just the
View it on GitLab: https://salsa.debian.org/debian-gis-team/pyshp/-/compare/dd79cd9bbd7f49ca6039f2437dbdc305a6df5861...548bbc52175a2c537aee4e6df6ac9f3c551fbf9c