[Git][debian-gis-team/pygeofilter][upstream] New upstream version 0.3.1
Antonio Valentino (@antonio.valentino)
gitlab@salsa.debian.org
Fri Jan 3 17:05:16 GMT 2025
Antonio Valentino pushed to branch upstream at Debian GIS Project / pygeofilter
Commits:
766a061d by Antonio Valentino at 2025-01-03T16:28:48+00:00
New upstream version 0.3.1
- - - - -
28 changed files:
- .github/workflows/main.yml
- .gitignore
- CHANGELOG.md
- CONTRIBUTING.md
- README.md
- pygeofilter/__init__.py
- pygeofilter/backends/cql2_json/evaluate.py
- + pygeofilter/backends/opensearch/__init__.py
- + pygeofilter/backends/opensearch/evaluate.py
- + pygeofilter/backends/opensearch/util.py
- pygeofilter/parsers/cql2_json/parser.py
- pygeofilter/parsers/cql2_text/parser.py
- pygeofilter/parsers/ecql/grammar.lark
- pygeofilter/parsers/ecql/parser.py
- pygeofilter/parsers/fes/__init__.py
- pygeofilter/parsers/iso8601.py
- pygeofilter/parsers/wkt.py
- + pygeofilter/version.py
- requirements-test.txt
- setup.cfg
- setup.py
- + tests/backends/opensearch/__init__.py
- + tests/backends/opensearch/test_evaluate.py
- tests/backends/sqlalchemy/test_evaluate.py
- tests/parsers/cql2_json/fixtures.json
- tests/parsers/cql2_json/test_parser.py
- tests/parsers/cql2_text/test_parser.py
- tests/parsers/ecql/test_parser.py
Changes:
=====================================
.github/workflows/main.yml
=====================================
@@ -7,7 +7,7 @@ jobs:
runs-on: ubuntu-20.04
strategy:
matrix:
- python-version: ['3.7', '3.8', '3.9']
+ python-version: ['3.8', '3.9', '3.10']
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
@@ -38,6 +38,12 @@ jobs:
host node port: 9300
node port: 9300
discovery type: 'single-node'
+ - name: Install and run OpenSearch 📦
+ uses: esmarkowski/opensearch-github-action@v1.0.0
+ with:
+ version: 2.18.0
+ security-disabled: true
+ port: 9209
- name: Run unit tests
run: |
pytest
=====================================
.gitignore
=====================================
@@ -130,4 +130,5 @@ dmypy.json
.doctrees
-.vscode
\ No newline at end of file
+.vscode
+.idea
=====================================
CHANGELOG.md
=====================================
@@ -1,5 +1,32 @@
# Changelog
+## [0.3.1](https://github.com/geopython/pygeofilter/compare/v0.3.0...v0.3.1) (2024-12-31)
+
+
+### Bug Fixes
+
+* **CI:** using separate file for tracking version to help with release-please action ([1c28b7c](https://github.com/geopython/pygeofilter/commit/1c28b7c45415ecedabd01570b114902f1d8f9310))
+
+## [0.3.0](https://github.com/geopython/pygeofilter/compare/v0.2.4...v0.3.0) (2024-12-30)
+
+
+### Features
+
+* add support for OpenSearch backend ([#111](https://github.com/geopython/pygeofilter/pull/111))
+* Update lark ([#110](https://github.com/geopython/pygeofilter/pull/110))
+
+
+### Bug Fixes
+
+* Handle boolean in ecql like cql_text ([#108](https://github.com/geopython/pygeofilter/pull/108))
+* Fix compatibility with i386 ([#107](https://github.com/geopython/pygeofilter/pull/107))
+* add FES parser import shortcut as other filter languages ([#102](https://github.com/geopython/pygeofilter/pull/102))
+
+
+### Miscellaneous Chores
+
+* release 0.3.0 ([48de1f1](https://github.com/geopython/pygeofilter/commit/48de1f128c4956a99d6760487146636122e119a3))
+
## [0.2.4](https://github.com/geopython/pygeofilter/compare/v0.2.3...v0.2.4) (2024-07-10)
=====================================
CONTRIBUTING.md
=====================================
@@ -33,7 +33,7 @@ Before submitting a bug, please do the following:
Make sure your report gets the attention it deserves: bug reports with missing information may be ignored or punted back to you, delaying a fix. The below constitutes a bare minimum; more info is almost always better:
-* __What version of Python are you using?__ For example, are you using Python 2.7, Python 3.7, PyPy 2.0?
+* __What version of Python are you using?__ For example, are you using Python 3.8+, PyPy 2.0?
* __What operating system are you using?__ Windows (7, 8, 10, 32-bit, 64-bit), Mac OS X, (10.7.4, 10.9.0), GNU/Linux (which distribution, which version?) Again, more detail is better.
* __Which version or versions of the software are you using?__ Ideally, you've followed the advice above and are on the latest version, but please confirm this.
* __How can we recreate your problem?__ Imagine that we have never used pygeofilter before and have downloaded it for the first time. Exactly what steps do we need to take to reproduce your problem?
=====================================
README.md
=====================================
@@ -89,6 +89,8 @@ There are a number of pre-defined backends available, where parsed expressions c
* Django
* sqlalchemy
* (Geo)Pandas
+* Elasticsearch
+* OpenSearch
* Pure Python object filtering
The usage of those are described in their own documentation.
=====================================
pygeofilter/__init__.py
=====================================
@@ -25,4 +25,7 @@
# THE SOFTWARE.
# ------------------------------------------------------------------------------
-__version__ = "0.2.4"
+from .version import __version__
+
+
+__all__ = ["__version__"]
=====================================
pygeofilter/backends/cql2_json/evaluate.py
=====================================
@@ -75,7 +75,7 @@ class CQL2Evaluator(Evaluator):
@handle(ast.IsNull)
def isnull(self, node, arg):
- return {"op": "isNull", "args": arg}
+ return {"op": "isNull", "args": [arg]}
@handle(ast.Function)
def function(self, node, *args):
=====================================
pygeofilter/backends/opensearch/__init__.py
=====================================
@@ -0,0 +1,33 @@
+# ------------------------------------------------------------------------------
+#
+# Project: pygeofilter <https://github.com/geopython/pygeofilter>
+# Authors: Fabian Schindler <fabian.schindler@eox.at>
+#
+# ------------------------------------------------------------------------------
+# Copyright (C) 2022 EOX IT Services GmbH
+#
+# Permission is hereby granted, free of charge, to any person obtaining a copy
+# of this software and associated documentation files (the "Software"), to deal
+# in the Software without restriction, including without limitation the rights
+# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+# copies of the Software, and to permit persons to whom the Software is
+# furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included in
+# all copies of this Software or works derived from this Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+# THE SOFTWARE.
+# ------------------------------------------------------------------------------
+
+""" OpenSearch backend for pygeofilter.
+"""
+
+from .evaluate import to_filter
+
+__all__ = ["to_filter"]
=====================================
pygeofilter/backends/opensearch/evaluate.py
=====================================
@@ -0,0 +1,312 @@
+# ------------------------------------------------------------------------------
+#
+# Project: pygeofilter <https://github.com/geopython/pygeofilter>
+# Authors: Fabian Schindler <fabian.schindler@eox.at>
+#
+# ------------------------------------------------------------------------------
+# Copyright (C) 2022 EOX IT Services GmbH
+#
+# Permission is hereby granted, free of charge, to any person obtaining a copy
+# of this software and associated documentation files (the "Software"), to deal
+# in the Software without restriction, including without limitation the rights
+# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+# copies of the Software, and to permit persons to whom the Software is
+# furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included in
+# all copies of this Software or works derived from this Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+# THE SOFTWARE.
+# ------------------------------------------------------------------------------
+
+"""
+OpenSearch filter evaluator.
+
+Uses opensearch-dsl package to create filter objects.
+"""
+
+
+# pylint: disable=E1130,C0103,W0223
+
+from datetime import date, datetime
+from typing import Dict, Optional, Union
+
+from opensearch_dsl import Q
+from packaging.version import Version
+
+from ... import ast, values
+from ..evaluator import Evaluator, handle
+from .util import like_to_wildcard
+
+VERSION_7_10_0 = Version("7.10.0")
+
+
+COMPARISON_OP_MAP = {
+ ast.ComparisonOp.LT: "lt",
+ ast.ComparisonOp.LE: "lte",
+ ast.ComparisonOp.GT: "gt",
+ ast.ComparisonOp.GE: "gte",
+}
+
+
+ARITHMETIC_OP_MAP = {
+ ast.ArithmeticOp.ADD: "+",
+ ast.ArithmeticOp.SUB: "-",
+ ast.ArithmeticOp.MUL: "*",
+ ast.ArithmeticOp.DIV: "/",
+}
+
+
+class OpenSearchDSLEvaluator(Evaluator):
+ """A filter evaluator for OpenSearch DSL."""
+
+ def __init__(
+ self,
+ attribute_map: Optional[Dict[str, str]] = None,
+ version: Optional[Version] = None,
+ ):
+ self.attribute_map = attribute_map
+ self.version = version or Version("7.1.0")
+
+ @handle(ast.Not)
+ def not_(self, _, sub):
+ """Inverts a filter object."""
+ return ~sub
+
+ @handle(ast.And)
+ def and_(self, _, lhs, rhs):
+ """Joins two filter objects with an `and` operator."""
+ return lhs & rhs
+
+ @handle(ast.Or)
+ def or_(self, _, lhs, rhs):
+ """Joins two filter objects with an `or` operator."""
+ return lhs | rhs
+
+ @handle(ast.Equal, ast.NotEqual)
+ def equality(self, node, lhs, rhs):
+ """Creates a match filter."""
+ q = Q("match", **{lhs: rhs})
+ if node.op == ast.ComparisonOp.NE:
+ q = ~q
+ return q
+
+ @handle(ast.LessThan, ast.LessEqual, ast.GreaterThan, ast.GreaterEqual)
+ def comparison(self, node, lhs, rhs):
+ """Creates a `range` filter."""
+ return Q("range", **{lhs: {COMPARISON_OP_MAP[node.op]: rhs}})
+
+ @handle(ast.Between)
+ def between(self, node: ast.Between, lhs, low, high):
+ """Creates a `range` filter."""
+ q = Q("range", **{lhs: {"gte": low, "lte": high}})
+ if node.not_:
+ q = ~q
+ return q
+
+ @handle(ast.Like)
+ def like(self, node: ast.Like, lhs):
+ """Transforms the provided LIKE pattern to an OpenSearch wildcard
+ pattern. Thus, this only works properly on "wildcard" fields.
+ The case-insensitivity flag is ignored when the OpenSearch version is below 7.10.0.
+ """
+ pattern = like_to_wildcard(
+ node.pattern, node.wildcard, node.singlechar, node.escapechar
+ )
+ expr: Dict[str, Union[str, bool]] = {
+ "value": pattern,
+ }
+ if self.version >= VERSION_7_10_0:
+ expr["case_insensitive"] = node.nocase
+
+ q = Q("wildcard", **{lhs: expr})
+ if node.not_:
+ q = ~q
+ return q
+
+ @handle(ast.In)
+ def in_(self, node, lhs, *options):
+ """Creates a `terms` filter."""
+ q = Q("terms", **{lhs: options})
+ if node.not_:
+ q = ~q
+ return q
+
+ @handle(ast.IsNull)
+ def null(self, node: ast.IsNull, lhs):
+ """Performs a null check, by using the `exists` query on the given
+ field.
+ """
+ q = Q("exists", field=lhs)
+ if not node.not_:
+ q = ~q
+ return q
+
+ @handle(ast.Exists)
+ def exists(self, node: ast.Exists, lhs):
+ """Performs an existence check, by using the `exists` query on the
+ given field.
+ """
+ q = Q("exists", field=lhs)
+ if node.not_:
+ q = ~q
+ return q
+
+ @handle(ast.TemporalPredicate, subclasses=True)
+ def temporal(self, node: ast.TemporalPredicate, lhs, rhs):
+ """Creates a filter to match the given temporal predicate"""
+ op = node.op
+ if isinstance(rhs, (date, datetime)):
+ low = high = rhs
+ else:
+ low, high = rhs
+
+ query = "range"
+ not_ = False
+ predicate: Dict[str, Union[date, datetime, str]]
+ if op == ast.TemporalComparisonOp.DISJOINT:
+ not_ = True
+ predicate = {"gte": low, "lte": high}
+ elif op == ast.TemporalComparisonOp.AFTER:
+ predicate = {"gt": high}
+ elif op == ast.TemporalComparisonOp.BEFORE:
+ predicate = {"lt": low}
+ elif (
+ op == ast.TemporalComparisonOp.TOVERLAPS
+ or op == ast.TemporalComparisonOp.OVERLAPPEDBY
+ ):
+ predicate = {"gte": low, "lte": high}
+ elif op == ast.TemporalComparisonOp.BEGINS:
+ query = "term"
+ predicate = {"value": low}
+ elif op == ast.TemporalComparisonOp.BEGUNBY:
+ query = "term"
+ predicate = {"value": high}
+ elif op == ast.TemporalComparisonOp.DURING:
+ predicate = {"gt": low, "lt": high, "relation": "WITHIN"}
+ elif op == ast.TemporalComparisonOp.TCONTAINS:
+ predicate = {"gt": low, "lt": high, "relation": "CONTAINS"}
+ # elif op == ast.TemporalComparisonOp.ENDS:
+ # pass
+ # elif op == ast.TemporalComparisonOp.ENDEDBY:
+ # pass
+ # elif op == ast.TemporalComparisonOp.TEQUALS:
+ # pass
+ # elif op == ast.TemporalComparisonOp.BEFORE_OR_DURING:
+ # pass
+ # elif op == ast.TemporalComparisonOp.DURING_OR_AFTER:
+ # pass
+ else:
+ raise NotImplementedError(f"Unsupported temporal operator: {op}")
+
+ q = Q(
+ query,
+ **{lhs: predicate},
+ )
+ if not_:
+ q = ~q
+ return q
+
+ @handle(
+ ast.GeometryIntersects,
+ ast.GeometryDisjoint,
+ ast.GeometryWithin,
+ ast.GeometryContains,
+ )
+ def spatial_comparison(self, node: ast.SpatialComparisonPredicate, lhs: str, rhs):
+ """Creates a geo_shape query for the given spatial comparison
+ predicate.
+ """
+ return Q(
+ "geo_shape",
+ **{
+ lhs: {
+ "shape": rhs,
+ "relation": node.op.value.lower(),
+ },
+ },
+ )
+
+ @handle(ast.BBox)
+ def bbox(self, node: ast.BBox, lhs):
+ """Performs a geo_shape query for the given bounding box.
+ Ignores CRS parameter, as it is not supported by OpenSearch.
+ """
+ return Q(
+ "geo_shape",
+ **{
+ lhs: {
+ "shape": self.envelope(
+ values.Envelope(node.minx, node.maxx, node.miny, node.maxy)
+ ),
+ "relation": "intersects",
+ },
+ },
+ )
+
+ @handle(ast.Attribute)
+ def attribute(self, node: ast.Attribute):
+ """Attribute mapping from filter fields to OpenSearch fields.
+ If an attribute mapping is provided, it is used to look up the
+ field name.
+ """
+ if self.attribute_map is not None:
+ return self.attribute_map[node.name]
+ return node.name
+
+ # @handle(ast.Arithmetic, subclasses=True)
+ # def arithmetic(self, node: ast.Arithmetic, lhs, rhs):
+ # op = ARITHMETIC_OP_MAP[node.op]
+ # return f"({lhs} {op} {rhs})"
+
+ # @handle(ast.Function)
+ # def function(self, node, *arguments):
+ # func = self.function_map[node.name]
+ # return f"{func}({','.join(arguments)})"
+
+ @handle(*values.LITERALS)
+ def literal(self, node):
+ """Literal values are directly passed to opensearch-dsl"""
+ return node
+
+ @handle(values.Geometry)
+ def geometry(self, node: values.Geometry):
+ """Geometry values are converted to a GeoJSON object"""
+ return node.geometry
+
+ @handle(values.Envelope)
+ def envelope(self, node: values.Envelope):
+ """Envelope values are converted to a GeoJSON OpenSearch
+ extension object."""
+ return {
+ "type": "envelope",
+ "coordinates": [
+ [
+ min(node.x1, node.x2),
+ max(node.y1, node.y2),
+ ],
+ [
+ max(node.x1, node.x2),
+ min(node.y1, node.y2),
+ ],
+ ],
+ }
+
+
+def to_filter(
+ root,
+ attribute_map: Optional[Dict[str, str]] = None,
+ version: Optional[str] = None,
+):
+ """Shorthand function to convert a pygeofilter AST to an OpenSearch
+ filter structure.
+ """
+ return OpenSearchDSLEvaluator(
+ attribute_map, Version(version) if version else None
+ ).evaluate(root)
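The `envelope` handler above normalizes two arbitrary corner points into the OpenSearch GeoJSON "envelope" extension shape, which expects an upper-left / lower-right coordinate pair. A standalone sketch of that normalization (the helper name is hypothetical; the structure mirrors the handler):

```python
def envelope_shape(x1: float, x2: float, y1: float, y2: float) -> dict:
    """Normalize two corner points into the OpenSearch "envelope" shape:
    [[west, north], [east, south]], i.e. upper-left then lower-right."""
    return {
        "type": "envelope",
        "coordinates": [
            [min(x1, x2), max(y1, y2)],  # upper-left corner
            [max(x1, x2), min(y1, y2)],  # lower-right corner
        ],
    }


envelope_shape(0.0, 5.0, 0.0, 5.0)
# {'type': 'envelope', 'coordinates': [[0.0, 5.0], [5.0, 0.0]]}
```

Because of the `min`/`max` calls, the handler accepts corners in any order, which is why `bbox` can pass `(minx, maxx, miny, maxy)` straight through.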
=====================================
pygeofilter/backends/opensearch/util.py
=====================================
@@ -0,0 +1,63 @@
+# ------------------------------------------------------------------------------
+#
+# Project: pygeofilter <https://github.com/geopython/pygeofilter>
+# Authors: Fabian Schindler <fabian.schindler@eox.at>
+#
+# ------------------------------------------------------------------------------
+# Copyright (C) 2022 EOX IT Services GmbH
+#
+# Permission is hereby granted, free of charge, to any person obtaining a copy
+# of this software and associated documentation files (the "Software"), to deal
+# in the Software without restriction, including without limitation the rights
+# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+# copies of the Software, and to permit persons to whom the Software is
+# furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included in
+# all copies of this Software or works derived from this Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+# THE SOFTWARE.
+# ------------------------------------------------------------------------------
+
+""" General utilities for the OpenSearch backend.
+"""
+
+import re
+
+
+def like_to_wildcard(
+ value: str, wildcard: str, single_char: str, escape_char: str = "\\"
+) -> str:
+ """Adapts a "LIKE" pattern to create an OpenSearch "wildcard"
+ pattern.
+ """
+
+ x_wildcard = re.escape(wildcard)
+ x_single_char = re.escape(single_char)
+
+ if escape_char == "\\":
+ x_escape_char = "\\\\\\\\"
+ else:
+ x_escape_char = re.escape(escape_char)
+
+ if wildcard != "*":
+ value = re.sub(
+ f"(?<!{x_escape_char}){x_wildcard}",
+ "*",
+ value,
+ )
+
+ if single_char != "?":
+ value = re.sub(
+ f"(?<!{x_escape_char}){x_single_char}",
+ "?",
+ value,
+ )
+
+ return value
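The LIKE-to-wildcard rewrite above can be exercised on its own with only the stdlib `re` module (the function body below is copied from the diff; the demo inputs match the ECQL test fixtures, which use `%` as the wildcard and `.` as the single-character placeholder):

```python
import re


def like_to_wildcard(value: str, wildcard: str, single_char: str,
                     escape_char: str = "\\") -> str:
    """Rewrite a LIKE pattern into an OpenSearch "wildcard" pattern."""
    x_wildcard = re.escape(wildcard)
    x_single_char = re.escape(single_char)

    if escape_char == "\\":
        x_escape_char = "\\\\\\\\"
    else:
        x_escape_char = re.escape(escape_char)

    if wildcard != "*":
        # replace unescaped wildcard characters with "*"
        value = re.sub(f"(?<!{x_escape_char}){x_wildcard}", "*", value)

    if single_char != "?":
        # replace unescaped single-character placeholders with "?"
        value = re.sub(f"(?<!{x_escape_char}){x_single_char}", "?", value)

    return value


like_to_wildcard("this is % test", "%", ".")  # 'this is * test'
like_to_wildcard("this is . test", "%", ".")  # 'this is ? test'
```

Note that literal `*` and `?` already present in the input are passed through unescaped, so this works best on "wildcard"-mapped fields, as the `like` handler's docstring says.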
=====================================
pygeofilter/parsers/cql2_json/parser.py
=====================================
@@ -125,7 +125,10 @@ def walk_cql_json(node: JsonType): # noqa: C901
return ast.Not(cast(ast.Node, walk_cql_json(args)))
elif op == "isNull":
- return ast.IsNull(cast(ast.Node, walk_cql_json(args)), False)
+ # like with "not", allow both arrays and objects
+ if isinstance(args, list):
+ args = args[0]
+ return ast.IsNull(cast(ast.Node, walk_cql_json(args)), not_=False)
elif op == "between":
return ast.Between(
@@ -153,12 +156,6 @@ def walk_cql_json(node: JsonType): # noqa: C901
not_=False,
)
- elif op == "isNull":
- return ast.IsNull(
- walk_cql_json(args),
- not_=False,
- )
-
elif op in BINARY_OP_PREDICATES_MAP:
args = [cast(ast.Node, walk_cql_json(arg)) for arg in args]
return BINARY_OP_PREDICATES_MAP[op](*args)
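The parser change above lets `isNull` accept its `args` either as a bare object or as a single-element array, matching the fixture and test updates elsewhere in this commit. The normalization reduces to a small sketch:

```python
def isnull_operand(args):
    # CQL2 JSON "isNull": args may be a bare object or a
    # single-element array; reduce both to the one operand
    # (mirrors the existing handling of "not")
    if isinstance(args, list):
        args = args[0]
    return args


isnull_operand({"property": "attr"})    # {'property': 'attr'}
isnull_operand([{"property": "attr"}])  # {'property': 'attr'}
```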
=====================================
pygeofilter/parsers/cql2_text/parser.py
=====================================
@@ -38,7 +38,7 @@ from ..wkt import WKTTransformer
logger.setLevel(logging.DEBUG)
-@v_args(inline=True)
+@v_args(meta=False, inline=True)
class CQLTransformer(WKTTransformer, ISO8601Transformer):
def and_(self, *args):
return ast.And.from_items(*args)
@@ -202,6 +202,7 @@ parser = Lark.open(
rel_to=__file__,
parser="lalr",
debug=True,
+ maybe_placeholders=False,
transformer=CQLTransformer(),
import_paths=[os.path.dirname(os.path.dirname(__file__))],
)
=====================================
pygeofilter/parsers/ecql/grammar.lark
=====================================
@@ -109,7 +109,7 @@ period: DATETIME "/" DATETIME
envelope: "ENVELOPE" "(" number number number number ")"
-BOOLEAN: ( "TRUE" | "FALSE" )
+BOOLEAN.2: ( "TRUE"i | "FALSE"i )
DOUBLE_QUOTED: "\"" /.*?/ "\""
SINGLE_QUOTED: "'" /.*?/ "'"
=====================================
pygeofilter/parsers/ecql/parser.py
=====================================
@@ -48,7 +48,7 @@ SPATIAL_PREDICATES_MAP = {
}
-@v_args(inline=True)
+@v_args(meta=False, inline=True)
class ECQLTransformer(WKTTransformer, ISO8601Transformer):
def and_(self, lhs, rhs):
return ast.And(lhs, rhs)
@@ -181,7 +181,7 @@ class ECQLTransformer(WKTTransformer, ISO8601Transformer):
return float(value)
def BOOLEAN(self, value):
- return value == "TRUE"
+ return value.lower() == "true"
def DOUBLE_QUOTED(self, token):
return token[1:-1]
@@ -201,6 +201,7 @@ parser = Lark.open(
rel_to=__file__,
parser="lalr",
debug=True,
+ maybe_placeholders=False,
transformer=ECQLTransformer(),
import_paths=[os.path.dirname(os.path.dirname(__file__))],
)
=====================================
pygeofilter/parsers/fes/__init__.py
=====================================
@@ -0,0 +1,3 @@
+from .parser import parse
+
+__all__ = ["parse"]
=====================================
pygeofilter/parsers/iso8601.py
=====================================
@@ -30,7 +30,7 @@ from lark import Transformer, v_args
from ..util import parse_datetime, parse_duration
-@v_args(inline=True)
+@v_args(meta=False, inline=True)
class ISO8601Transformer(Transformer):
def DATETIME(self, dt):
return parse_datetime(dt)
=====================================
pygeofilter/parsers/wkt.py
=====================================
@@ -28,7 +28,7 @@
from lark import Transformer, v_args
-@v_args(inline=True)
+@v_args(meta=False, inline=True)
class WKTTransformer(Transformer):
def wkt__geometry_with_srid(self, srid, geometry):
print(srid, geometry)
=====================================
pygeofilter/version.py
=====================================
@@ -0,0 +1 @@
+__version__ = "0.3.1"
=====================================
requirements-test.txt
=====================================
@@ -9,4 +9,6 @@ pygml
dateparser
lark
elasticsearch
-elasticsearch-dsl
\ No newline at end of file
+elasticsearch-dsl
+opensearch-py
+opensearch-dsl
=====================================
setup.cfg
=====================================
@@ -1,5 +1,5 @@
[metadata]
-version = attr: pygeofilter.__version__
+version = attr: pygeofilter.version.__version__
######################################################
# code formatting / lint / type checking configurations
=====================================
setup.py
=====================================
@@ -57,7 +57,7 @@ setup(
install_requires=(
[
"dateparser",
- "lark<1.0",
+ "lark",
"pygeoif>=1.0.0",
"dataclasses;python_version<'3.7'",
]
@@ -69,6 +69,7 @@ setup(
"backend-sqlalchemy": ["geoalchemy2", "sqlalchemy"],
"backend-native": ["shapely"],
"backend-elasticsearch": ["elasticsearch", "elasticsearch-dsl"],
+ "backend-opensearch": ["opensearch-py", "opensearch-dsl"],
"fes": ["pygml>=0.2"],
},
classifiers=[
@@ -76,10 +77,11 @@ setup(
"Intended Audience :: Developers",
"Topic :: Scientific/Engineering :: GIS",
"License :: OSI Approved :: MIT License",
- "Programming Language :: Python :: 3.6",
- "Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: 3.12",
],
tests_require=["pytest"],
)
=====================================
tests/backends/opensearch/__init__.py
=====================================
=====================================
tests/backends/opensearch/test_evaluate.py
=====================================
@@ -0,0 +1,298 @@
+# pylint: disable=W0621,C0114,C0115,C0116
+
+import pytest
+from opensearch_dsl import (
+ Date,
+ DateRange,
+ Document,
+ Field,
+ Float,
+ GeoPoint,
+ GeoShape,
+ Index,
+ InnerDoc,
+ Integer,
+ Nested,
+ Range,
+ Text,
+ connections,
+)
+
+from pygeofilter import ast
+from pygeofilter.backends.opensearch import to_filter
+from pygeofilter.parsers.ecql import parse
+from pygeofilter.util import parse_datetime
+
+
+class Wildcard(Field):
+ name = "wildcard"
+
+
+class RecordMeta(InnerDoc):
+ float_meta_attribute = Float()
+ int_meta_attribute = Integer()
+ str_meta_attribute = Text()
+ datetime_meta_attribute = Date()
+
+
+class Record(Document):
+ identifier = Text()
+ geometry = GeoShape()
+ center = GeoPoint()
+ float_attribute = Float()
+ int_attribute = Integer()
+ str_attribute = Wildcard()
+ maybe_str_attribute = Text()
+ datetime_attribute = Date()
+ daterange_attribute = DateRange()
+ record_metas = Nested(RecordMeta)
+
+ class Index:
+ name = "record"
+
+
+@pytest.fixture(autouse=True, scope="session")
+def connection():
+ connections.create_connection(
+ hosts=["http://localhost:9209"],
+ )
+
+
+@pytest.fixture(autouse=True, scope="session")
+def index(connection):
+ Record.init()
+ index = Index(Record.Index.name)
+ yield index
+ index.delete()
+
+
+@pytest.fixture(autouse=True, scope="session")
+def data(index):
+ """Fixture to add initial data to the search index."""
+ record_a = Record(
+ identifier="A",
+ geometry="MULTIPOLYGON(((0 0, 0 5, 5 5,5 0,0 0)))",
+ center="POINT(2.5 2.5)",
+ float_attribute=0.0,
+ int_attribute=5,
+ str_attribute="this is a test",
+ maybe_str_attribute=None,
+ datetime_attribute=parse_datetime("2000-01-01T00:00:00Z"),
+ daterange_attribute=Range(
+ gte=parse_datetime("2000-01-01T00:00:00Z"),
+ lte=parse_datetime("2000-01-02T00:00:00Z"),
+ ),
+ )
+ record_a.save()
+
+ record_b = Record(
+ identifier="B",
+ geometry="MULTIPOLYGON(((5 5, 5 10, 10 10,10 5,5 5)))",
+ center="POINT(7.5 7.5)",
+ float_attribute=30.0,
+ int_attribute=None,
+ str_attribute="this is another test",
+ maybe_str_attribute="some value",
+ datetime_attribute=parse_datetime("2000-01-01T00:00:10Z"),
+ daterange_attribute=Range(
+ gte=parse_datetime("2000-01-04T00:00:00Z"),
+ lte=parse_datetime("2000-01-05T00:00:00Z"),
+ ),
+ )
+ record_b.save()
+ index.refresh()
+
+ yield [record_a, record_b]
+
+
+def filter_(ast_):
+ query = to_filter(ast_, version="8.2")
+ print(query)
+ result = Record.search().query(query).execute()
+ print([r.identifier for r in result])
+ return result
+
+
+def test_comparison(data):
+ result = filter_(parse("int_attribute = 5"))
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+ result = filter_(parse("float_attribute < 6"))
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+ result = filter_(parse("float_attribute > 6"))
+ assert len(result) == 1 and result[0].identifier == data[1].identifier
+
+ result = filter_(parse("int_attribute <= 5"))
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+ result = filter_(parse("float_attribute >= 8"))
+ assert len(result) == 1 and result[0].identifier == data[1].identifier
+
+ result = filter_(parse("float_attribute <> 0.0"))
+ assert len(result) == 1 and result[0].identifier == data[1].identifier
+
+
+def test_combination(data):
+ result = filter_(parse("int_attribute = 5 AND float_attribute < 6.0"))
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+ result = filter_(parse("int_attribute = 6 OR float_attribute < 6.0"))
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+
+def test_between(data):
+ result = filter_(parse("float_attribute BETWEEN -1 AND 1"))
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+ result = filter_(parse("int_attribute NOT BETWEEN 4 AND 6"))
+ assert len(result) == 1 and result[0].identifier == data[1].identifier
+
+
+def test_like(data):
+ result = filter_(parse("str_attribute LIKE 'this is a test'"))
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+ result = filter_(parse("str_attribute LIKE 'this is % test'"))
+ assert len(result) == 2
+
+ result = filter_(parse("str_attribute NOT LIKE '% another test'"))
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+ result = filter_(parse("str_attribute NOT LIKE 'this is . test'"))
+ assert len(result) == 1 and result[0].identifier == data[1].identifier
+
+ result = filter_(parse("str_attribute ILIKE 'THIS IS . TEST'"))
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+ result = filter_(parse("str_attribute ILIKE 'THIS IS % TEST'"))
+ assert len(result) == 2
+
+
+def test_in(data):
+ result = filter_(parse("int_attribute IN ( 1, 2, 3, 4, 5 )"))
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+ result = filter_(parse("int_attribute NOT IN ( 1, 2, 3, 4, 5 )"))
+ assert len(result) == 1 and result[0].identifier == data[1].identifier
+
+
+def test_null(data):
+ result = filter_(parse("maybe_str_attribute IS NULL"))
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+ result = filter_(parse("maybe_str_attribute IS NOT NULL"))
+ assert len(result) == 1 and result[0].identifier == data[1].identifier
+
+
+def test_has_attr():
+ result = filter_(parse("extra_attr EXISTS"))
+ assert len(result) == 0
+
+ result = filter_(parse("extra_attr DOES-NOT-EXIST"))
+ assert len(result) == 2
+
+
+def test_temporal(data):
+ result = filter_(
+ ast.TimeDisjoint(
+ ast.Attribute("datetime_attribute"),
+ [
+ parse_datetime("2000-01-01T00:00:05.00Z"),
+ parse_datetime("2000-01-01T00:00:15.00Z"),
+ ],
+ )
+ )
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+ result = filter_(
+ parse("datetime_attribute BEFORE 2000-01-01T00:00:05.00Z"),
+ )
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+ result = filter_(
+ parse("datetime_attribute AFTER 2000-01-01T00:00:05.00Z"),
+ )
+ assert len(result) == 1 and result[0].identifier == data[1].identifier
+
+
+# def test_array():
+# result = filter_(
+# ast.ArrayEquals(
+# ast.Attribute('array_attr'),
+# [2, 3],
+# ),
+# data
+# )
+# assert len(result) == 1 and result[0] is data[0]
+
+# result = filter_(
+# ast.ArrayContains(
+# ast.Attribute('array_attr'),
+# [1, 2, 3, 4],
+# ),
+# data
+# )
+# assert len(result) == 1 and result[0] is data[1]
+
+# result = filter_(
+# ast.ArrayContainedBy(
+# ast.Attribute('array_attr'),
+# [1, 2, 3, 4],
+# ),
+# data
+# )
+# assert len(result) == 1 and result[0] is data[0]
+
+# result = filter_(
+# ast.ArrayOverlaps(
+# ast.Attribute('array_attr'),
+# [5, 6, 7],
+# ),
+# data
+# )
+# assert len(result) == 1 and result[0] is data[1]
+
+
+def test_spatial(data):
+ result = filter_(
+ parse("INTERSECTS(geometry, ENVELOPE (0.0 1.0 0.0 1.0))"),
+ )
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+ # TODO: test more spatial queries
+
+ result = filter_(
+ parse("BBOX(center, 2, 2, 3, 3)"),
+ )
+ assert len(result) == 1 and result[0].identifier == data[0].identifier
+
+
+# def test_arithmetic():
+# result = filter_(
+# parse('int_attr = float_attr - 0.5'),
+# data,
+# )
+# assert len(result) == 2
+
+# result = filter_(
+# parse('int_attr = 5 + 20 / 2 - 10'),
+# data,
+# )
+# assert len(result) == 1 and result[0] is data[0]
+
+
+# def test_function():
+# result = filter_(
+# parse('sin(float_attr) BETWEEN -0.75 AND -0.70'),
+# data,
+# )
+# assert len(result) == 1 and result[0] is data[0]
+
+
+# def test_nested():
+# result = filter_(
+# parse('"nested_attr.str_attr" = \'this is a test\''),
+# data,
+# )
+# assert len(result) == 1 and result[0] is data[0]
=====================================
tests/backends/sqlalchemy/test_evaluate.py
=====================================
@@ -22,6 +22,18 @@ from pygeofilter.parsers.ecql import parse
Base = declarative_base()
+mod_spatialite = ctypes.util.find_library("mod_spatialite")
+if not mod_spatialite:
+ import pathlib
+ matches = list(pathlib.Path("/usr/lib").glob("*/mod_spatialite.so"))
+ if matches:
+ mod_spatialite = str(matches[0])
+
+import pytest
+pytestmark = pytest.mark.skipif(
+ not mod_spatialite, reason="mod_spatialite.so not available"
+)
+
class Record(Base):
__tablename__ = "record"
@@ -70,10 +82,7 @@ FIELD_MAPPING = {
def load_spatialite(dbapi_conn, connection_record):
dbapi_conn.enable_load_extension(True)
- dbapi_conn.load_extension(
- ctypes.util.find_library("mod_spatialite")
- or "/usr/lib/x86_64-linux-gnu/mod_spatialite.so"
- )
+ dbapi_conn.load_extension(mod_spatialite)
@pytest.fixture(scope="session")
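The lookup added above degrades gracefully when SpatiaLite is missing: the loader path is tried first, then Debian-style multiarch directories, and the module is skipped if neither yields a hit. Extracted as a standalone helper (hypothetical name) it reads:

```python
import ctypes.util
import pathlib


def find_mod_spatialite():
    """Return a loadable path for mod_spatialite, or None if absent.

    Tries the platform library loader first, then falls back to
    globbing Debian-style multiarch directories under /usr/lib.
    """
    found = ctypes.util.find_library("mod_spatialite")
    if not found:
        matches = list(pathlib.Path("/usr/lib").glob("*/mod_spatialite.so"))
        if matches:
            found = str(matches[0])
    return found


# With pytest, a None result skips the whole module, e.g.:
# pytestmark = pytest.mark.skipif(find_mod_spatialite() is None,
#                                 reason="mod_spatialite.so not available")
```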
=====================================
tests/parsers/cql2_json/fixtures.json
=====================================
@@ -33,7 +33,7 @@
},
"Example 9": {
"text": "filter=sentinel:data_coverage > 50 OR landsat:coverage_percent < 10 OR (sentinel:data_coverage IS NULL AND landsat:coverage_percent IS NULL)",
- "json": "{\"filter-lang\": \"cql2-json\", \"filter\": {\"op\": \"or\", \"args\": [{\"op\": \">\", \"args\": [{\"property\": \"sentinel:data_coverage\"}, 50]}, {\"op\": \"<\", \"args\": [{\"property\": \"landsat:coverage_percent\"}, 10]}, {\"op\": \"and\", \"args\": [{\"op\": \"isNull\", \"args\": {\"property\": \"sentinel:data_coverage\"}}, {\"op\": \"isNull\", \"args\": {\"property\": \"landsat:coverage_percent\"}}]}]}}"
+ "json": "{\"filter-lang\": \"cql2-json\", \"filter\": {\"op\": \"or\", \"args\": [{\"op\": \">\", \"args\": [{\"property\": \"sentinel:data_coverage\"}, 50]}, {\"op\": \"<\", \"args\": [{\"property\": \"landsat:coverage_percent\"}, 10]}, {\"op\": \"and\", \"args\": [{\"op\": \"isNull\", \"args\": [{\"property\": \"sentinel:data_coverage\"}]}, {\"op\": \"isNull\", \"args\": [{\"property\": \"landsat:coverage_percent\"}]}]}]}}"
},
"Example 10": {
"text": "filter=eo:cloud_cover BETWEEN 0 AND 50",
=====================================
tests/parsers/cql2_json/test_parser.py
=====================================
@@ -145,7 +145,7 @@ def test_attribute_in_list():
def test_attribute_is_null():
- result = parse({"op": "isNull", "args": {"property": "attr"}})
+ result = parse({"op": "isNull", "args": [{"property": "attr"}]})
assert result == ast.IsNull(ast.Attribute("attr"), False)
=====================================
tests/parsers/cql2_text/test_parser.py
=====================================
@@ -9,6 +9,7 @@ def test_attribute_eq_true_uppercase():
True,
)
+
def test_attribute_eq_true_lowercase():
result = parse("attr = true")
assert result == ast.Equal(
=====================================
tests/parsers/ecql/test_parser.py
=====================================
@@ -41,6 +41,7 @@ def test_namespace_attribute_eq_literal():
"A",
)
+
def test_prefixed_attribute_eq_literal():
result = parse("properties.ns:attr = 'A'")
assert result == ast.Equal(
@@ -48,6 +49,7 @@ def test_prefixed_attribute_eq_literal():
"A",
)
+
def test_attribute_eq_literal():
result = parse("attr = 'A'")
assert result == ast.Equal(
@@ -595,3 +597,35 @@ def test_function_attr_string_arg():
],
),
)
+
+
+def test_attribute_eq_true_uppercase():
+ result = parse("attr = TRUE")
+ assert result == ast.Equal(
+ ast.Attribute("attr"),
+ True,
+ )
+
+
+def test_attribute_eq_true_lowercase():
+ result = parse("attr = true")
+ assert result == ast.Equal(
+ ast.Attribute("attr"),
+ True,
+ )
+
+
+def test_attribute_eq_false_uppercase():
+ result = parse("attr = FALSE")
+ assert result == ast.Equal(
+ ast.Attribute("attr"),
+ False,
+ )
+
+
+def test_attribute_eq_false_lowercase():
+ result = parse("attr = false")
+ assert result == ast.Equal(
+ ast.Attribute("attr"),
+ False,
+ )
View it on GitLab: https://salsa.debian.org/debian-gis-team/pygeofilter/-/commit/766a061d51df2bef608fbcf485ff030ec5f6f1af