[med-svn] [Git][python-team/packages/mypy][upstream] New upstream version 1.19.1
Michael R. Crusoe (@crusoe)
gitlab@salsa.debian.org
Mon Dec 15 09:35:16 GMT 2025
Michael R. Crusoe pushed to branch upstream at Debian Python Team / packages / mypy
Commits:
8e10d6ad by Michael R. Crusoe at 2025-12-15T10:28:57+01:00
New upstream version 1.19.1
- - - - -
29 changed files:
- CHANGELOG.md
- PKG-INFO
- mypy-requirements.txt
- mypy.egg-info/PKG-INFO
- mypy.egg-info/requires.txt
- mypy/build.py
- mypy/cache.py
- mypy/checkpattern.py
- mypy/errors.py
- mypy/join.py
- mypy/main.py
- mypy/plugins/proper_plugin.py
- mypy/semanal.py
- mypy/semanal_typeargs.py
- mypy/test/testtypes.py
- mypy/typeanal.py
- mypy/typeops.py
- mypy/version.py
- mypyc/irbuild/builder.py
- mypyc/irbuild/for_helpers.py
- mypyc/test-data/run-generators.test
- mypyc/test-data/run-loops.test
- pyproject.toml
- test-data/unit/check-incremental.test
- test-data/unit/check-overloading.test
- test-data/unit/check-python310.test
- test-data/unit/check-type-aliases.test
- test-data/unit/check-typevar-tuple.test
- test-requirements.txt
Changes:
=====================================
CHANGELOG.md
=====================================
@@ -202,6 +202,17 @@ Related PRs:
Please see [git log](https://github.com/python/typeshed/commits/main?after=ebce8d766b41fbf4d83cf47c1297563a9508ff60+0&branch=main&path=stdlib) for full list of standard library typeshed stub changes.
+### Mypy 1.19.1
+
+- Fix noncommutative joins with bounded TypeVars (Shantanu, PR [20345](https://github.com/python/mypy/pull/20345))
+- Respect output format for cached runs by serializing raw errors in cache metas (Ivan Levkivskyi, PR [20372](https://github.com/python/mypy/pull/20372))
+- Allow `types.NoneType` in match cases (A5rocks, PR [20383](https://github.com/python/mypy/pull/20383))
+- Fix mypyc generator regression with empty tuple (BobTheBuidler, PR [20371](https://github.com/python/mypy/pull/20371))
+- Fix crash involving Unpack-ed TypeVarTuple (Shantanu, PR [20323](https://github.com/python/mypy/pull/20323))
+- Fix crash on star import of redefinition (Ivan Levkivskyi, PR [20333](https://github.com/python/mypy/pull/20333))
+- Fix crash on typevar with forward ref used in other module (Ivan Levkivskyi, PR [20334](https://github.com/python/mypy/pull/20334))
+- Fail with an explicit error on PyPy (Ivan Levkivskyi, PR [20389](https://github.com/python/mypy/pull/20389))
+
### Acknowledgements
Thanks to all mypy contributors who contributed to this release:
@@ -237,7 +248,7 @@ Thanks to all mypy contributors who contributed to this release:
I’d also like to thank my employer, Dropbox, for supporting mypy development.
-## Mypy 1.18.1
+## Mypy 1.18
We’ve just uploaded mypy 1.18.1 to the Python Package Index ([PyPI](https://pypi.org/project/mypy/)).
Mypy is a static type checker for Python. This release includes new features, performance
=====================================
PKG-INFO
=====================================
@@ -1,6 +1,6 @@
Metadata-Version: 2.4
Name: mypy
-Version: 1.19.0
+Version: 1.19.1
Summary: Optional static typing for Python
Author-email: Jukka Lehtosalo <jukka.lehtosalo@iki.fi>
License: MIT
@@ -29,7 +29,7 @@ Requires-Dist: typing_extensions>=4.6.0
Requires-Dist: mypy_extensions>=1.0.0
Requires-Dist: pathspec>=0.9.0
Requires-Dist: tomli>=1.1.0; python_version < "3.11"
-Requires-Dist: librt>=0.6.2
+Requires-Dist: librt>=0.6.2; platform_python_implementation != "PyPy"
Provides-Extra: dmypy
Requires-Dist: psutil>=4.0; extra == "dmypy"
Provides-Extra: mypyc
=====================================
mypy-requirements.txt
=====================================
@@ -4,4 +4,4 @@ typing_extensions>=4.6.0
mypy_extensions>=1.0.0
pathspec>=0.9.0
tomli>=1.1.0; python_version<'3.11'
-librt>=0.6.2
+librt>=0.6.2; platform_python_implementation != 'PyPy'
=====================================
mypy.egg-info/PKG-INFO
=====================================
@@ -1,6 +1,6 @@
Metadata-Version: 2.4
Name: mypy
-Version: 1.19.0
+Version: 1.19.1
Summary: Optional static typing for Python
Author-email: Jukka Lehtosalo <jukka.lehtosalo@iki.fi>
License: MIT
@@ -29,7 +29,7 @@ Requires-Dist: typing_extensions>=4.6.0
Requires-Dist: mypy_extensions>=1.0.0
Requires-Dist: pathspec>=0.9.0
Requires-Dist: tomli>=1.1.0; python_version < "3.11"
-Requires-Dist: librt>=0.6.2
+Requires-Dist: librt>=0.6.2; platform_python_implementation != "PyPy"
Provides-Extra: dmypy
Requires-Dist: psutil>=4.0; extra == "dmypy"
Provides-Extra: mypyc
=====================================
mypy.egg-info/requires.txt
=====================================
@@ -1,6 +1,8 @@
typing_extensions>=4.6.0
mypy_extensions>=1.0.0
pathspec>=0.9.0
+
+[:platform_python_implementation != "PyPy"]
librt>=0.6.2
[:python_version < "3.11"]
=====================================
mypy/build.py
=====================================
@@ -31,10 +31,17 @@ from typing_extensions import TypeAlias as _TypeAlias
from librt.internal import cache_version
import mypy.semanal_main
-from mypy.cache import CACHE_VERSION, CacheMeta, ReadBuffer, WriteBuffer
+from mypy.cache import (
+ CACHE_VERSION,
+ CacheMeta,
+ ReadBuffer,
+ SerializedError,
+ WriteBuffer,
+ write_json,
+)
from mypy.checker import TypeChecker
from mypy.error_formatter import OUTPUT_CHOICES, ErrorFormatter
-from mypy.errors import CompileError, ErrorInfo, Errors, report_internal_error
+from mypy.errors import CompileError, ErrorInfo, Errors, ErrorTuple, report_internal_error
from mypy.graph_utils import prepare_sccs, strongly_connected_components, topsort
from mypy.indirection import TypeIndirectionVisitor
from mypy.messages import MessageBuilder
@@ -1869,7 +1876,7 @@ class State:
dep_hashes: dict[str, bytes] = {}
# List of errors reported for this file last time.
- error_lines: list[str] = []
+ error_lines: list[SerializedError] = []
# Parent package, its parent, etc.
ancestors: list[str] | None = None
@@ -3286,9 +3293,13 @@ def find_stale_sccs(
scc = order_ascc_ex(graph, ascc)
for id in scc:
if graph[id].error_lines:
- manager.flush_errors(
- manager.errors.simplify_path(graph[id].xpath), graph[id].error_lines, False
+ path = manager.errors.simplify_path(graph[id].xpath)
+ formatted = manager.errors.format_messages(
+ path,
+ deserialize_codes(graph[id].error_lines),
+ formatter=manager.error_formatter,
)
+ manager.flush_errors(path, formatted, False)
fresh_sccs.append(ascc)
else:
size = len(ascc.mod_ids)
@@ -3492,13 +3503,16 @@ def process_stale_scc(graph: Graph, ascc: SCC, manager: BuildManager) -> None:
# Flush errors, and write cache in two phases: first data files, then meta files.
meta_tuples = {}
errors_by_id = {}
+ formatted_by_id = {}
for id in stale:
if graph[id].xpath not in manager.errors.ignored_files:
- errors = manager.errors.file_messages(
- graph[id].xpath, formatter=manager.error_formatter
+ errors = manager.errors.file_messages(graph[id].xpath)
+ formatted = manager.errors.format_messages(
+ graph[id].xpath, errors, formatter=manager.error_formatter
)
- manager.flush_errors(manager.errors.simplify_path(graph[id].xpath), errors, False)
+ manager.flush_errors(manager.errors.simplify_path(graph[id].xpath), formatted, False)
errors_by_id[id] = errors
+ formatted_by_id[id] = formatted
meta_tuples[id] = graph[id].write_cache()
graph[id].mark_as_rechecked()
for id in stale:
@@ -3507,7 +3521,7 @@ def process_stale_scc(graph: Graph, ascc: SCC, manager: BuildManager) -> None:
continue
meta, meta_file = meta_tuple
meta.dep_hashes = [graph[dep].interface_hash for dep in graph[id].dependencies]
- meta.error_lines = errors_by_id.get(id, [])
+ meta.error_lines = serialize_codes(errors_by_id.get(id, []))
write_cache_meta(meta, manager, meta_file)
manager.done_sccs.add(ascc.id)
@@ -3640,3 +3654,40 @@ def write_undocumented_ref_info(
deps_json = get_undocumented_ref_info_json(state.tree, type_map)
metastore.write(ref_info_file, json_dumps(deps_json))
+
+
+def sources_to_bytes(sources: list[BuildSource]) -> bytes:
+ source_tuples = [(s.path, s.module, s.text, s.base_dir, s.followed) for s in sources]
+ buf = WriteBuffer()
+ write_json(buf, {"sources": source_tuples})
+ return buf.getvalue()
+
+
+def sccs_to_bytes(sccs: list[SCC]) -> bytes:
+ scc_tuples = [(list(scc.mod_ids), scc.id, list(scc.deps)) for scc in sccs]
+ buf = WriteBuffer()
+ write_json(buf, {"sccs": scc_tuples})
+ return buf.getvalue()
+
+
+def serialize_codes(errs: list[ErrorTuple]) -> list[SerializedError]:
+ return [
+ (path, line, column, end_line, end_column, severity, message, code.code if code else None)
+ for path, line, column, end_line, end_column, severity, message, code in errs
+ ]
+
+
+def deserialize_codes(errs: list[SerializedError]) -> list[ErrorTuple]:
+ return [
+ (
+ path,
+ line,
+ column,
+ end_line,
+ end_column,
+ severity,
+ message,
+ codes.error_codes.get(code) if code else None,
+ )
+ for path, line, column, end_line, end_column, severity, message, code in errs
+ ]
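The serialize_codes/deserialize_codes pair above reduces each ErrorCode object to its string name so error tuples can be stored in cache metas and rebuilt on a cached run. A self-contained sketch of the same round-trip, with shortened tuples and a stand-in ErrorCode class (names hypothetical):

```python
from typing import NamedTuple


class ErrorCode(NamedTuple):
    """Stand-in for mypy.errorcodes.ErrorCode."""
    code: str


# Registry keyed by code name, mirroring codes.error_codes.
error_codes = {"return": ErrorCode("return"), "arg-type": ErrorCode("arg-type")}


def serialize_codes(errs):
    # ErrorCode -> plain string (or None), safe to write into a cache meta.
    return [(path, line, msg, code.code if code else None)
            for path, line, msg, code in errs]


def deserialize_codes(errs):
    # String name -> ErrorCode object, looked up in the registry on load.
    return [(path, line, msg, error_codes.get(code) if code else None)
            for path, line, msg, code in errs]
```

The real functions carry the full eight-element ErrorTuple; only the last element needs conversion.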
=====================================
mypy/cache.py
=====================================
@@ -48,7 +48,7 @@ bump CACHE_VERSION below.
from __future__ import annotations
from collections.abc import Sequence
-from typing import Any, Final, Union
+from typing import Any, Final, Optional, Union
from typing_extensions import TypeAlias as _TypeAlias
from librt.internal import (
@@ -70,7 +70,9 @@ from librt.internal import (
from mypy_extensions import u8
# High-level cache layout format
-CACHE_VERSION: Final = 0
+CACHE_VERSION: Final = 1
+
+SerializedError: _TypeAlias = tuple[Optional[str], int, int, int, int, str, str, Optional[str]]
class CacheMeta:
@@ -93,7 +95,7 @@ class CacheMeta:
dep_lines: list[int],
dep_hashes: list[bytes],
interface_hash: bytes,
- error_lines: list[str],
+ error_lines: list[SerializedError],
version_id: str,
ignore_all: bool,
plugin_data: Any,
@@ -158,7 +160,7 @@ class CacheMeta:
dep_lines=meta["dep_lines"],
dep_hashes=[bytes.fromhex(dep) for dep in meta["dep_hashes"]],
interface_hash=bytes.fromhex(meta["interface_hash"]),
- error_lines=meta["error_lines"],
+ error_lines=[tuple(err) for err in meta["error_lines"]],
version_id=meta["version_id"],
ignore_all=meta["ignore_all"],
plugin_data=meta["plugin_data"],
@@ -180,7 +182,7 @@ class CacheMeta:
write_int_list(data, self.dep_lines)
write_bytes_list(data, self.dep_hashes)
write_bytes(data, self.interface_hash)
- write_str_list(data, self.error_lines)
+ write_errors(data, self.error_lines)
write_str(data, self.version_id)
write_bool(data, self.ignore_all)
# Plugin data may be not a dictionary, so we use
@@ -205,7 +207,7 @@ class CacheMeta:
dep_lines=read_int_list(data),
dep_hashes=read_bytes_list(data),
interface_hash=read_bytes(data),
- error_lines=read_str_list(data),
+ error_lines=read_errors(data),
version_id=read_str(data),
ignore_all=read_bool(data),
plugin_data=read_json_value(data),
@@ -232,6 +234,7 @@ LIST_GEN: Final[Tag] = 20
LIST_INT: Final[Tag] = 21
LIST_STR: Final[Tag] = 22
LIST_BYTES: Final[Tag] = 23
+TUPLE_GEN: Final[Tag] = 24
DICT_STR_GEN: Final[Tag] = 30
# Misc classes.
@@ -391,7 +394,13 @@ def write_str_opt_list(data: WriteBuffer, value: list[str | None]) -> None:
write_str_opt(data, item)
-JsonValue: _TypeAlias = Union[None, int, str, bool, list["JsonValue"], dict[str, "JsonValue"]]
+Value: _TypeAlias = Union[None, int, str, bool]
+
+# Our JSON format is somewhat non-standard as we distinguish lists and tuples.
+# This is convenient for some internal things, like mypyc plugin and error serialization.
+JsonValue: _TypeAlias = Union[
+ Value, list["JsonValue"], dict[str, "JsonValue"], tuple["JsonValue", ...]
+]
def read_json_value(data: ReadBuffer) -> JsonValue:
@@ -409,15 +418,16 @@ def read_json_value(data: ReadBuffer) -> JsonValue:
if tag == LIST_GEN:
size = read_int_bare(data)
return [read_json_value(data) for _ in range(size)]
+ if tag == TUPLE_GEN:
+ size = read_int_bare(data)
+ return tuple(read_json_value(data) for _ in range(size))
if tag == DICT_STR_GEN:
size = read_int_bare(data)
return {read_str_bare(data): read_json_value(data) for _ in range(size)}
assert False, f"Invalid JSON tag: {tag}"
-# Currently tuples are used by mypyc plugin. They will be normalized to
-# JSON lists after a roundtrip.
-def write_json_value(data: WriteBuffer, value: JsonValue | tuple[JsonValue, ...]) -> None:
+def write_json_value(data: WriteBuffer, value: JsonValue) -> None:
if value is None:
write_tag(data, LITERAL_NONE)
elif isinstance(value, bool):
@@ -428,11 +438,16 @@ def write_json_value(data: WriteBuffer, value: JsonValue | tuple[JsonValue, ...]
elif isinstance(value, str):
write_tag(data, LITERAL_STR)
write_str_bare(data, value)
- elif isinstance(value, (list, tuple)):
+ elif isinstance(value, list):
write_tag(data, LIST_GEN)
write_int_bare(data, len(value))
for val in value:
write_json_value(data, val)
+ elif isinstance(value, tuple):
+ write_tag(data, TUPLE_GEN)
+ write_int_bare(data, len(value))
+ for val in value:
+ write_json_value(data, val)
elif isinstance(value, dict):
write_tag(data, DICT_STR_GEN)
write_int_bare(data, len(value))
@@ -457,3 +472,38 @@ def write_json(data: WriteBuffer, value: dict[str, Any]) -> None:
for key in sorted(value):
write_str_bare(data, key)
write_json_value(data, value[key])
+
+
+def write_errors(data: WriteBuffer, errs: list[SerializedError]) -> None:
+ write_tag(data, LIST_GEN)
+ write_int_bare(data, len(errs))
+ for path, line, column, end_line, end_column, severity, message, code in errs:
+ write_tag(data, TUPLE_GEN)
+ write_str_opt(data, path)
+ write_int(data, line)
+ write_int(data, column)
+ write_int(data, end_line)
+ write_int(data, end_column)
+ write_str(data, severity)
+ write_str(data, message)
+ write_str_opt(data, code)
+
+
+def read_errors(data: ReadBuffer) -> list[SerializedError]:
+ assert read_tag(data) == LIST_GEN
+ result = []
+ for _ in range(read_int_bare(data)):
+ assert read_tag(data) == TUPLE_GEN
+ result.append(
+ (
+ read_str_opt(data),
+ read_int(data),
+ read_int(data),
+ read_int(data),
+ read_int(data),
+ read_str(data),
+ read_str(data),
+ read_str_opt(data),
+ )
+ )
+ return result
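The new TUPLE_GEN tag exists because the old format wrote tuples with LIST_GEN, so they came back as lists after a round-trip. A minimal sketch of tag-based encoding that preserves the distinction (hypothetical byte layout, not librt's actual wire format):

```python
import io
import struct

LIST_GEN, TUPLE_GEN, LIT_INT = 20, 24, 2  # illustrative tag values


def write(buf: io.BytesIO, value) -> None:
    if isinstance(value, int):
        buf.write(bytes([LIT_INT]))
        buf.write(struct.pack("<q", value))
    elif isinstance(value, (list, tuple)):
        # Distinct tags let the reader restore the original container type.
        tag = TUPLE_GEN if isinstance(value, tuple) else LIST_GEN
        buf.write(bytes([tag]))
        buf.write(struct.pack("<q", len(value)))
        for item in value:
            write(buf, item)


def read(buf: io.BytesIO):
    tag = buf.read(1)[0]
    if tag == LIT_INT:
        return struct.unpack("<q", buf.read(8))[0]
    size = struct.unpack("<q", buf.read(8))[0]
    items = [read(buf) for _ in range(size)]
    return tuple(items) if tag == TUPLE_GEN else items
```

With a single list tag, `(1, [2, 3])` would decode as `[1, [2, 3]]`; the extra tag keeps the round-trip lossless.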
=====================================
mypy/checkpattern.py
=====================================
@@ -46,6 +46,7 @@ from mypy.types import (
Type,
TypedDictType,
TypeOfAny,
+ TypeType,
TypeVarTupleType,
TypeVarType,
UninhabitedType,
@@ -556,6 +557,8 @@ class PatternChecker(PatternVisitor[PatternType]):
fallback = self.chk.named_type("builtins.function")
any_type = AnyType(TypeOfAny.unannotated)
typ = callable_with_ellipsis(any_type, ret_type=any_type, fallback=fallback)
+ elif isinstance(p_typ, TypeType) and isinstance(p_typ.item, NoneType):
+ typ = p_typ.item
elif not isinstance(p_typ, AnyType):
self.msg.fail(
message_registry.CLASS_PATTERN_TYPE_REQUIRED.format(
=====================================
mypy/errors.py
=====================================
@@ -951,7 +951,7 @@ class Errors:
self.new_messages(), use_stdout=use_stdout, module_with_blocker=self.blocker_module()
)
- def format_messages(
+ def format_messages_default(
self, error_tuples: list[ErrorTuple], source_lines: list[str] | None
) -> list[str]:
"""Return a string list that represents the error messages.
@@ -1009,24 +1009,28 @@ class Errors:
a.append(" " * (DEFAULT_SOURCE_OFFSET + column) + marker)
return a
- def file_messages(self, path: str, formatter: ErrorFormatter | None = None) -> list[str]:
- """Return a string list of new error messages from a given file.
-
- Use a form suitable for displaying to the user.
- """
+ def file_messages(self, path: str) -> list[ErrorTuple]:
+ """Return an error tuple list of new error messages from a given file."""
if path not in self.error_info_map:
return []
error_info = self.error_info_map[path]
error_info = [info for info in error_info if not info.hidden]
error_info = self.remove_duplicates(self.sort_messages(error_info))
- error_tuples = self.render_messages(error_info)
+ return self.render_messages(error_info)
+ def format_messages(
+ self, path: str, error_tuples: list[ErrorTuple], formatter: ErrorFormatter | None = None
+ ) -> list[str]:
+ """Return a string list of new error messages from a given file.
+
+ Use a form suitable for displaying to the user.
+ """
+ self.flushed_files.add(path)
if formatter is not None:
errors = create_errors(error_tuples)
return [formatter.report_error(err) for err in errors]
- self.flushed_files.add(path)
source_lines = None
if self.options.pretty and self.read_source:
# Find shadow file mapping and read source lines if a shadow file exists for the given path.
@@ -1036,7 +1040,7 @@ class Errors:
source_lines = self.read_source(mapped_path)
else:
source_lines = self.read_source(path)
- return self.format_messages(error_tuples, source_lines)
+ return self.format_messages_default(error_tuples, source_lines)
def find_shadow_file_mapping(self, path: str) -> str | None:
"""Return the shadow file path for a given source file path or None."""
@@ -1058,7 +1062,8 @@ class Errors:
msgs = []
for path in self.error_info_map.keys():
if path not in self.flushed_files:
- msgs.extend(self.file_messages(path))
+ error_tuples = self.file_messages(path)
+ msgs.extend(self.format_messages(path, error_tuples))
return msgs
def targets(self) -> set[str]:
=====================================
mypy/join.py
=====================================
@@ -297,10 +297,15 @@ class TypeJoinVisitor(TypeVisitor[ProperType]):
return self.s
def visit_type_var(self, t: TypeVarType) -> ProperType:
- if isinstance(self.s, TypeVarType) and self.s.id == t.id:
- if self.s.upper_bound == t.upper_bound:
- return self.s
- return self.s.copy_modified(upper_bound=join_types(self.s.upper_bound, t.upper_bound))
+ if isinstance(self.s, TypeVarType):
+ if self.s.id == t.id:
+ if self.s.upper_bound == t.upper_bound:
+ return self.s
+ return self.s.copy_modified(
+ upper_bound=join_types(self.s.upper_bound, t.upper_bound)
+ )
+ # Fix non-commutative joins
+ return get_proper_type(join_types(self.s.upper_bound, t.upper_bound))
else:
return self.default(self.s)
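The bug fixed here was that joining two different bounded TypeVars depended on argument order. A toy join over nominal classes (illustrative only, not mypy's algorithm) showing the commutativity property being restored:

```python
def join(a: type, b: type) -> type:
    # First class in a's MRO that is also a superclass of b:
    # the least common ancestor of the two nominal types.
    for base in a.__mro__:
        if issubclass(b, base):
            return base
    return object


class A: ...
class B(A): ...
class C(A): ...
```

A sound join must satisfy `join(x, y) == join(y, x)`; the new branch falls back to joining the two upper bounds so distinct TypeVars no longer break that symmetry.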
=====================================
mypy/main.py
=====================================
@@ -4,6 +4,7 @@ from __future__ import annotations
import argparse
import os
+import platform
import subprocess
import sys
import time
@@ -39,6 +40,13 @@ from mypy.version import __version__
if TYPE_CHECKING:
from _typeshed import SupportsWrite
+if platform.python_implementation() == "PyPy":
+ sys.stderr.write(
+ "ERROR: Running mypy on PyPy is not supported yet.\n"
+ "To type-check a PyPy library please use an equivalent CPython version,\n"
+ "see https://github.com/mypyc/librt/issues/16 for possible workarounds.\n"
+ )
+ sys.exit(2)
orig_stat: Final = os.stat
MEM_PROFILE: Final = False # If True, dump memory profile
=====================================
mypy/plugins/proper_plugin.py
=====================================
@@ -108,6 +108,7 @@ def is_special_target(right: ProperType) -> bool:
"mypy.types.RequiredType",
"mypy.types.ReadOnlyType",
"mypy.types.TypeGuardedType",
+ "mypy.types.PlaceholderType",
):
# Special case: these are not valid targets for a type alias and thus safe.
# TODO: introduce a SyntheticType base to simplify this?
=====================================
mypy/semanal.py
=====================================
@@ -3195,6 +3195,10 @@ class SemanticAnalyzer(
# namespace is incomplete.
self.mark_incomplete("*", i)
for name, node in m.names.items():
+ if node.no_serialize:
+ # This is either internal or generated symbol, skip it to avoid problems
+ # like accidental name conflicts or invalid cross-references.
+ continue
fullname = i_id + "." + name
self.set_future_import_flags(fullname)
# if '__all__' exists, all nodes not included have had module_public set to
@@ -4935,7 +4939,7 @@ class SemanticAnalyzer(
)
if analyzed is None:
# Type variables are special: we need to place them in the symbol table
- # soon, even if upper bound is not ready yet. Otherwise avoiding
+ # soon, even if upper bound is not ready yet. Otherwise, avoiding
# a "deadlock" in this common pattern would be tricky:
# T = TypeVar('T', bound=Custom[Any])
# class Custom(Generic[T]):
=====================================
mypy/semanal_typeargs.py
=====================================
@@ -176,12 +176,12 @@ class TypeArgumentAnalyzer(MixedTraverserVisitor):
code=codes.VALID_TYPE,
)
continue
+ if self.in_type_alias_expr and isinstance(arg, TypeVarType):
+ # Type aliases are allowed to use unconstrained type variables
+ # error will be checked at substitution point.
+ continue
if tvar.values:
if isinstance(arg, TypeVarType):
- if self.in_type_alias_expr:
- # Type aliases are allowed to use unconstrained type variables
- # error will be checked at substitution point.
- continue
arg_values = arg.values
if not arg_values:
is_error = True
@@ -205,10 +205,6 @@ class TypeArgumentAnalyzer(MixedTraverserVisitor):
and upper_bound.type.fullname == "builtins.object"
)
if not object_upper_bound and not is_subtype(arg, upper_bound):
- if self.in_type_alias_expr and isinstance(arg, TypeVarType):
- # Type aliases are allowed to use unconstrained type variables
- # error will be checked at substitution point.
- continue
is_error = True
self.fail(
message_registry.INVALID_TYPEVAR_ARG_BOUND.format(
=====================================
mypy/test/testtypes.py
=====================================
@@ -1051,6 +1051,35 @@ class JoinSuite(Suite):
self.assert_join(self.fx.type_a, self.fx.t, self.fx.o)
self.assert_join(self.fx.t, self.fx.type_a, self.fx.o)
+ def test_join_type_var_bounds(self) -> None:
+ tvar1 = TypeVarType(
+ "tvar1",
+ "tvar1",
+ TypeVarId(-100),
+ [],
+ self.fx.o,
+ AnyType(TypeOfAny.from_omitted_generics),
+ INVARIANT,
+ )
+ any_type = AnyType(TypeOfAny.special_form)
+ tvar2 = TypeVarType(
+ "tvar2",
+ "tvar2",
+ TypeVarId(-101),
+ [],
+ upper_bound=UnionType(
+ [
+ TupleType([any_type], self.fx.std_tuple),
+ TupleType([any_type, any_type], self.fx.std_tuple),
+ ]
+ ),
+ default=AnyType(TypeOfAny.from_omitted_generics),
+ variance=INVARIANT,
+ )
+
+ self.assert_join(tvar1, tvar2, self.fx.o)
+ self.assert_join(tvar2, tvar1, self.fx.o)
+
# There are additional test cases in check-inference.test.
# TODO: Function types + varargs and default args.
=====================================
mypy/typeanal.py
=====================================
@@ -346,6 +346,15 @@ class TypeAnalyser(SyntheticTypeVisitor[Type], TypeAnalyzerPluginInterface):
if hook is not None:
return hook(AnalyzeTypeContext(t, t, self))
tvar_def = self.tvar_scope.get_binding(sym)
+ if tvar_def is not None:
+ # We need to cover special-case explained in get_typevarlike_argument() here,
+ # since otherwise the deferral will not be triggered if the type variable is
+ # used in a different module. Using isinstance() should be safe for this purpose.
+ tvar_params = [tvar_def.upper_bound, tvar_def.default]
+ if isinstance(tvar_def, TypeVarType):
+ tvar_params += tvar_def.values
+ if any(isinstance(tp, PlaceholderType) for tp in tvar_params):
+ self.api.defer()
if isinstance(sym.node, ParamSpecExpr):
if tvar_def is None:
if self.allow_unbound_tvars:
=====================================
mypy/typeops.py
=====================================
@@ -508,7 +508,7 @@ def erase_to_bound(t: Type) -> Type:
def callable_corresponding_argument(
typ: NormalizedCallableType | Parameters, model: FormalArgument
) -> FormalArgument | None:
- """Return the argument a function that corresponds to `model`"""
+ """Return the argument of a function that corresponds to `model`"""
by_name = typ.argument_by_name(model.name)
by_pos = typ.argument_by_position(model.pos)
@@ -522,17 +522,23 @@ def callable_corresponding_argument(
# taking both *args and **args, or a pair of functions like so:
# def right(a: int = ...) -> None: ...
- # def left(__a: int = ..., *, a: int = ...) -> None: ...
+ # def left(x: int = ..., /, *, a: int = ...) -> None: ...
from mypy.meet import meet_types
if (
not (by_name.required or by_pos.required)
and by_pos.name is None
and by_name.pos is None
+ # This is not principled, but prevents a crash. It's weird to have a FormalArgument
+ # that has an UnpackType.
+ and not isinstance(by_name.typ, UnpackType)
+ and not isinstance(by_pos.typ, UnpackType)
):
return FormalArgument(
by_name.name, by_pos.pos, meet_types(by_name.typ, by_pos.typ), False
)
+ return by_name
+
return by_name if by_name is not None else by_pos
=====================================
mypy/version.py
=====================================
@@ -8,7 +8,7 @@ from mypy import git
# - Release versions have the form "1.2.3".
# - Dev versions have the form "1.2.3+dev" (PLUS sign to conform to PEP 440).
# - Before 1.0 we had the form "0.NNN".
-__version__ = "1.19.0"
+__version__ = "1.19.1"
base_version = __version__
mypy_dir = os.path.abspath(os.path.dirname(os.path.dirname(__file__)))
=====================================
mypyc/irbuild/builder.py
=====================================
@@ -990,8 +990,10 @@ class IRBuilder:
elif isinstance(target_type, TypeVarLikeType):
return self.get_sequence_type_from_type(target_type.upper_bound)
elif isinstance(target_type, TupleType):
+ items = target_type.items
+ assert items, "This function does not support empty tuples"
# Tuple might have elements of different types.
- rtypes = {self.mapper.type_to_rtype(item) for item in target_type.items}
+ rtypes = set(map(self.mapper.type_to_rtype, items))
if len(rtypes) == 1:
return rtypes.pop()
else:
=====================================
mypyc/irbuild/for_helpers.py
=====================================
@@ -7,7 +7,7 @@ such special case.
from __future__ import annotations
-from typing import Callable, ClassVar
+from typing import Callable, ClassVar, cast
from mypy.nodes import (
ARG_POS,
@@ -241,25 +241,45 @@ def sequence_from_generator_preallocate_helper(
rtype = builder.node_type(sequence_expr)
if not (is_sequence_rprimitive(rtype) or isinstance(rtype, RTuple)):
return None
- sequence = builder.accept(sequence_expr)
- length = get_expr_length_value(builder, sequence_expr, sequence, line, use_pyssize_t=True)
+
if isinstance(rtype, RTuple):
# If input is RTuple, box it to tuple_rprimitive for generic iteration
# TODO: this can be optimized a bit better with an unrolled ForRTuple helper
proper_type = get_proper_type(builder.types[sequence_expr])
assert isinstance(proper_type, TupleType), proper_type
- get_item_ops = [
- (
- LoadLiteral(typ.value, object_rprimitive)
- if isinstance(typ, LiteralType)
- else TupleGet(sequence, i, line)
- )
- for i, typ in enumerate(get_proper_types(proper_type.items))
- ]
+ # the for_loop_helper_with_index crashes for empty tuples, bail out
+ if not proper_type.items:
+ return None
+
+ proper_types = get_proper_types(proper_type.items)
+
+ get_item_ops: list[LoadLiteral | TupleGet]
+ if all(isinstance(typ, LiteralType) for typ in proper_types):
+ get_item_ops = [
+ LoadLiteral(cast(LiteralType, typ).value, object_rprimitive)
+ for typ in proper_types
+ ]
+
+ else:
+ sequence = builder.accept(sequence_expr)
+ get_item_ops = [
+ (
+ LoadLiteral(typ.value, object_rprimitive)
+ if isinstance(typ, LiteralType)
+ else TupleGet(sequence, i, line)
+ )
+ for i, typ in enumerate(proper_types)
+ ]
+
items = list(map(builder.add, get_item_ops))
sequence = builder.new_tuple(items, line)
+ else:
+ sequence = builder.accept(sequence_expr)
+
+ length = get_expr_length_value(builder, sequence_expr, sequence, line, use_pyssize_t=True)
+
target_op = empty_op_llbuilder(length, line)
def set_item(item_index: Value) -> None:
=====================================
mypyc/test-data/run-generators.test
=====================================
@@ -936,3 +936,11 @@ def test_generator_override() -> None:
assert base1_foo(Base1()) == [1]
assert base1_foo(Derived1()) == [2, 3]
assert derived1_foo(Derived1()) == [2, 3]
+
+[case testGeneratorEmptyTuple]
+from collections.abc import Generator
+from typing import Optional, Union
+
+def test_compiledGeneratorEmptyTuple() -> None:
+ jobs: Generator[Optional[str], None, None] = (_ for _ in ())
+ assert list(jobs) == []
=====================================
mypyc/test-data/run-loops.test
=====================================
@@ -1,7 +1,7 @@
# Test cases for "range" objects, "for" and "while" loops (compile and run)
[case testFor]
-from typing import List, Tuple
+from typing import Any, List, Tuple
def count(n: int) -> None:
for i in range(n):
print(i)
@@ -21,6 +21,10 @@ def list_iter(l: List[int]) -> None:
def tuple_iter(l: Tuple[int, ...]) -> None:
for i in l:
print(i)
+def empty_tuple_iter(l: Tuple[()]) -> None:
+ i: Any
+ for i in l:
+ print(i)
def str_iter(l: str) -> None:
for i in l:
print(i)
@@ -39,7 +43,7 @@ def count_down_short() -> None:
[file driver.py]
from native import (
count, list_iter, list_rev_iter, list_rev_iter_lol, count_between, count_down, count_double,
- count_down_short, tuple_iter, str_iter,
+ count_down_short, tuple_iter, empty_tuple_iter, str_iter,
)
count(5)
list_iter(list(reversed(range(5))))
@@ -52,6 +56,7 @@ count_down_short()
print('==')
list_rev_iter_lol(list(reversed(range(5))))
tuple_iter((1, 2, 3))
+empty_tuple_iter(())
str_iter("abc")
[out]
0
=====================================
pyproject.toml
=====================================
@@ -9,7 +9,7 @@ requires = [
"mypy_extensions>=1.0.0",
"pathspec>=0.9.0",
"tomli>=1.1.0; python_version<'3.11'",
- "librt>=0.6.2",
+ "librt>=0.6.2; platform_python_implementation != 'PyPy'",
# the following is from build-requirements.txt
"types-psutil",
"types-setuptools",
@@ -54,7 +54,7 @@ dependencies = [
"mypy_extensions>=1.0.0",
"pathspec>=0.9.0",
"tomli>=1.1.0; python_version<'3.11'",
- "librt>=0.6.2",
+ "librt>=0.6.2; platform_python_implementation != 'PyPy'",
]
dynamic = ["version"]
=====================================
test-data/unit/check-incremental.test
=====================================
@@ -7596,3 +7596,43 @@ X = 0
tmp/a.py:6: error: "object" has no attribute "dtypes"
[out2]
tmp/a.py:2: error: "object" has no attribute "dtypes"
+
+[case testStarImportCycleRedefinition]
+import m
+
+[file m.py]
+import a
+
+[file m.py.2]
+import a
+reveal_type(a.C)
+
+[file a/__init__.py]
+from a.b import *
+from a.c import *
+x = 1
+
+[file a/b.py]
+from other import C
+from a.c import y
+class C: ... # type: ignore
+
+[file a/c.py]
+from other import C
+from a import x
+y = 1
+
+[file other.py]
+class C: ...
+[out2]
+tmp/m.py:2: note: Revealed type is "def () -> other.C"
+
+[case testOutputFormatterIncremental]
+# flags2: --output json
+def wrong() -> int:
+ if wrong():
+ return 0
+[out]
+main:2: error: Missing return statement
+[out2]
+{"file": "main", "line": 2, "column": 0, "message": "Missing return statement", "hint": null, "code": "return", "severity": "error"}
=====================================
test-data/unit/check-overloading.test
=====================================
@@ -263,6 +263,19 @@ def foo(*args: int | str, **kw: int | Foo) -> None:
pass
[builtins fixtures/tuple.pyi]
+
+[case testTypeCheckOverloadImplOverlapVarArgsAndKwargsNever]
+from __future__ import annotations
+from typing import overload
+
+@overload # E: Single overload definition, multiple required
+def foo(x: int) -> None: ...
+
+def foo(*args: int, **kw: str) -> None: # E: Overloaded function implementation does not accept all possible arguments of signature 1
+ pass
+[builtins fixtures/tuple.pyi]
+
+
[case testTypeCheckOverloadWithImplTooSpecificRetType]
from typing import overload, Any
=====================================
test-data/unit/check-python310.test
=====================================
@@ -3178,3 +3178,19 @@ match 5:
reveal_type(b) # N: Revealed type is "Any"
case BlahBlah(c=c): # E: Name "BlahBlah" is not defined
reveal_type(c) # N: Revealed type is "Any"
+
+[case testMatchAllowsNoneTypeAsClass]
+import types
+
+class V:
+ X = types.NoneType
+
+def fun(val: str | None):
+ match val:
+ case V.X():
+ reveal_type(val) # N: Revealed type is "None"
+
+ match val:
+ case types.NoneType():
+ reveal_type(val) # N: Revealed type is "None"
+[builtins fixtures/tuple.pyi]
=====================================
test-data/unit/check-type-aliases.test
=====================================
@@ -1351,3 +1351,93 @@ reveal_type(D(x="asdf")) # E: No overload variant of "dict" matches argument ty
# N: def __init__(self, arg: Iterable[tuple[str, int]], **kwargs: int) -> dict[str, int] \
# N: Revealed type is "Any"
[builtins fixtures/dict.pyi]
+
+[case testTypeAliasesInCyclicImport1]
+import p.aliases
+
+[file p/__init__.py]
+[file p/aliases.py]
+from typing_extensions import TypeAlias
+from .defs import C, Alias1
+
+Alias2: TypeAlias = Alias1[C]
+
+[file p/defs.py]
+from typing import TypeVar
+from typing_extensions import TypeAlias
+import p.aliases
+
+C = TypeVar("C", bound="SomeClass")
+Alias1: TypeAlias = C
+
+class SomeClass:
+ pass
+[builtins fixtures/tuple.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeAliasesInCyclicImport2]
+import p.aliases
+
+[file p/__init__.py]
+[file p/aliases.py]
+from typing_extensions import TypeAlias
+from .defs import C, Alias1
+
+Alias2: TypeAlias = Alias1[C]
+
+[file p/defs.py]
+from typing import TypeVar, Union
+from typing_extensions import TypeAlias
+import p.aliases
+
+C = TypeVar("C", bound="SomeClass")
+Alias1: TypeAlias = Union[C, int]
+
+class SomeClass:
+ pass
+[builtins fixtures/tuple.pyi]
+
+[case testTypeAliasesInCyclicImport3]
+import p.aliases
+
+[file p/__init__.py]
+[file p/aliases.py]
+from typing_extensions import TypeAlias
+from .defs import C, Alias1
+
+Alias2: TypeAlias = Alias1[C]
+
+[file p/defs.py]
+from typing import TypeVar
+from typing_extensions import TypeAlias
+import p.aliases
+
+C = TypeVar("C", bound="list[SomeClass]")
+Alias1: TypeAlias = C
+
+class SomeClass:
+ pass
+[builtins fixtures/tuple.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeAliasesInCyclicImport4]
+import p.aliases
+
+[file p/__init__.py]
+[file p/aliases.py]
+from typing_extensions import TypeAlias
+from .defs import C, Alias1
+
+Alias2: TypeAlias = Alias1[C]
+
+[file p/defs.py]
+from typing import TypeVar, Union
+from typing_extensions import TypeAlias
+import p.aliases
+
+C = TypeVar("C", bound="list[SomeClass]")
+Alias1: TypeAlias = Union[C, int]
+
+class SomeClass:
+ pass
+[builtins fixtures/tuple.pyi]
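The four cyclic-import cases above all center on a bounded `TypeVar` referenced through an alias. A minimal single-module sketch of the shape from `testTypeAliasesInCyclicImport4` (the tests split this across `p/aliases.py` and `p/defs.py` to create the import cycle; here the cycle is omitted):

```python
from typing import TypeVar, Union, get_args

class SomeClass: ...

# TypeVar with a forward-referenced bound, aliased into a Union.
C = TypeVar("C", bound="list[SomeClass]")
Alias1 = Union[C, int]
```

The crash fixed upstream occurred while resolving the string bound `"list[SomeClass]"` when the alias was expanded from the other side of the import cycle.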
=====================================
test-data/unit/check-typevar-tuple.test
=====================================
@@ -2716,3 +2716,26 @@ class MyTuple(tuple[Unpack[Union[int, str]]], Generic[Unpack[Ts]]): # E: "Union
x: MyTuple[int, str]
reveal_type(x[0]) # N: Revealed type is "Any"
[builtins fixtures/tuple.pyi]
+
+[case testHigherOrderFunctionUnpackTypeVarTupleViaParamSpec]
+from typing import Callable, ParamSpec, TypeVar, TypeVarTuple, Unpack
+
+P = ParamSpec("P")
+T = TypeVar("T")
+Ts = TypeVarTuple("Ts")
+
+def call(func: Callable[P, T], *args: P.args, **kwargs: P.kwargs) -> T:
+ return func(*args, **kwargs)
+
+
+def run(func: Callable[[Unpack[Ts]], T], *args: Unpack[Ts], some_kwarg: str = "asdf") -> T:
+ raise
+
+
+def foo() -> str:
+ return "hello"
+
+
+# this is a false positive, but it no longer crashes
+call(run, foo, some_kwarg="a") # E: Argument 1 to "call" has incompatible type "def [Ts`-1, T] run(func: def (*Unpack[Ts]) -> T, *args: Unpack[Ts], some_kwarg: str = ...) -> T"; expected "Callable[[Callable[[], str], str], str]"
+[builtins fixtures/tuple.pyi]
=====================================
test-requirements.txt
=====================================
@@ -22,7 +22,7 @@ identify==2.6.15
# via pre-commit
iniconfig==2.1.0
# via pytest
-librt==0.6.2
+librt==0.7.3 ; platform_python_implementation != 'PyPy'
# via -r mypy-requirements.txt
lxml==6.0.2 ; python_version < "3.15"
# via -r test-requirements.in
View it on GitLab: https://salsa.debian.org/python-team/packages/mypy/-/commit/8e10d6ad677cca9c614af7124e6f37b775aedd05