feature[next]: support for Python 3.11 #1407

Closed
wants to merge 11 commits
2 changes: 1 addition & 1 deletion .github/workflows/daily-ci.yml
@@ -14,7 +14,7 @@ jobs:
daily-ci:
strategy:
matrix:
python-version: ["3.8", "3.9", "3.10"]
python-version: ["3.8", "3.9", "3.10", "3.11"]
tox-module-factor: ["cartesian", "eve", "next", "storage"]
os: ["ubuntu-latest"]
requirements-file: ["requirements-dev.txt", "min-requirements-test.txt", "min-extra-requirements-test.txt"]
2 changes: 1 addition & 1 deletion .github/workflows/test-cartesian-fallback.yml
@@ -16,7 +16,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.8", "3.9", "3.10"]
python-version: ["3.8", "3.9", "3.10", "3.11"]
backends: [internal-cpu, dace-cpu]

steps:
2 changes: 1 addition & 1 deletion .github/workflows/test-cartesian.yml
@@ -23,7 +23,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.8", "3.9", "3.10"]
python-version: ["3.8", "3.9", "3.10", "3.11"]
backends: [internal-cpu, dace-cpu]
steps:
- uses: actions/checkout@v2
2 changes: 1 addition & 1 deletion .github/workflows/test-eve-fallback.yml
@@ -17,7 +17,7 @@ jobs:
test-eve:
strategy:
matrix:
python-version: ["3.8", "3.9", "3.10"]
python-version: ["3.8", "3.9", "3.10", "3.11"]
os: ["ubuntu-latest"]

runs-on: ${{ matrix.os }}
3 changes: 1 addition & 2 deletions .github/workflows/test-eve.yml
@@ -20,7 +20,7 @@ jobs:
test-eve:
strategy:
matrix:
python-version: ["3.8", "3.9", "3.10"]
python-version: ["3.8", "3.9", "3.10", "3.11"]
os: ["ubuntu-latest"]
fail-fast: false

@@ -68,4 +68,3 @@ jobs:
# with:
# name: info-py${{ matrix.python-version }}-${{ matrix.os }}
# path: info.txt

2 changes: 1 addition & 1 deletion .github/workflows/test-next-fallback.yml
@@ -15,7 +15,7 @@ jobs:
test-next:
strategy:
matrix:
python-version: ["3.10"]
python-version: ["3.10", "3.11"]
tox-env-factor: ["nomesh", "atlas"]
os: ["ubuntu-latest"]

2 changes: 1 addition & 1 deletion .github/workflows/test-next.yml
@@ -18,7 +18,7 @@ jobs:
test-next:
strategy:
matrix:
python-version: ["3.10"]
python-version: ["3.10", "3.11"]
tox-env-factor: ["nomesh", "atlas"]
os: ["ubuntu-latest"]
fail-fast: false
2 changes: 1 addition & 1 deletion .github/workflows/test-storage-fallback.yml
@@ -18,7 +18,7 @@ jobs:
test-storage:
strategy:
matrix:
python-version: ["3.8", "3.9", "3.10"]
python-version: ["3.8", "3.9", "3.10", "3.11"]
backends: [internal-cpu, dace-cpu]
os: ["ubuntu-latest"]

3 changes: 1 addition & 2 deletions .github/workflows/test-storage.yml
@@ -21,7 +21,7 @@ jobs:
test-storage:
strategy:
matrix:
python-version: ["3.8", "3.9", "3.10"]
python-version: ["3.8", "3.9", "3.10", "3.11"]
backends: [internal-cpu, dace-cpu]
os: ["ubuntu-latest"]
fail-fast: false
@@ -70,4 +70,3 @@ jobs:
# with:
# name: info-py${{ matrix.python-version }}-${{ matrix.os }}
# path: info.txt

17 changes: 16 additions & 1 deletion src/gt4py/eve/datamodels/core.py
@@ -971,6 +971,14 @@ def __pretty__(
return __pretty__


def _is_concrete_data_model(
cls: Type, type_args: Tuple[Type]
Review comment (Contributor Author): if `cls` is annotated as plain `Type`, we need to do more checks inside; try `Type[GenericDataModelT]` again.

) -> typing.TypeGuard[Type[DataModelT]]:
return hasattr(cls, "__bound_type_params__") and all(
a == b for a, b in zip(cls.__bound_type_params__.values(), type_args)
)


def _make_data_model_class_getitem() -> classmethod:
def __class_getitem__(
cls: Type[GenericDataModelT], args: Union[Type, Tuple[Type]]
@@ -980,7 +988,9 @@ def __class_getitem__(
See :class:`GenericDataModelAlias` for further information.
"""
type_args: Tuple[Type] = args if isinstance(args, tuple) else (args,)
concrete_cls: Type[DataModelT] = concretize(cls, *type_args)
concrete_cls: Type[DataModelT] = (
cls if _is_concrete_data_model(cls, type_args) else concretize(cls, *type_args)
)
res = xtyping.StdGenericAliasType(concrete_cls, type_args)
if sys.version_info < (3, 9):
# in Python 3.8, xtyping.StdGenericAliasType (aka typing._GenericAlias)
@@ -1348,9 +1358,14 @@ def _make_concrete_with_cache(

class_name = f"{datamodel_cls.__name__}__{'_'.join(arg_names)}"

bound_type_params = {
tp_var.__name__: type_params_map[tp_var] for tp_var in datamodel_cls.__parameters__
}

namespace = {
"__annotations__": new_annotations,
"__module__": module if module else datamodel_cls.__module__,
"__bound_type_params__": bound_type_params, # TODO(havogt) is this useful information?
Review comment (Contributor Author): not sure whether it is useful to store this, but it enables the check above.

**new_field_c_attrs,
}

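Taken together, the two hunks above make `concretize` record which type parameters a concretized data model was bound to (`__bound_type_params__`) and let `__class_getitem__` skip re-concretizing a class that is already bound to the requested arguments. The snippet below is only a standalone sketch of that guard — `ToyModel` and `is_concrete_for` are made-up names, not gt4py API — showing the check that `_is_concrete_data_model` performs.

```python
# Standalone sketch of the guard introduced above (hypothetical names, not gt4py API).
from typing import Tuple, Type


def is_concrete_for(cls: Type, type_args: Tuple[Type, ...]) -> bool:
    # Mirrors `_is_concrete_data_model`: a class counts as already concrete
    # when it carries `__bound_type_params__` matching the requested arguments.
    return hasattr(cls, "__bound_type_params__") and all(
        a == b for a, b in zip(cls.__bound_type_params__.values(), type_args)
    )


class ToyModel:
    # Stand-in for a class produced by `concretize(...)`, which (with this PR)
    # stores its bound type parameters in the class namespace.
    __bound_type_params__ = {"T": int}


assert is_concrete_for(ToyModel, (int,))        # already bound to int: reuse the class
assert not is_concrete_for(ToyModel, (float,))  # bound to something else: concretize again
assert not is_concrete_for(dict, (int,))        # nothing recorded: not a concrete data model
```
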
2 changes: 2 additions & 0 deletions src/gt4py/eve/type_validation.py
@@ -358,6 +358,8 @@ def make_is_instance_of(name: str, type_: type) -> FixedTypeValidator:
"""Create an ``FixedTypeValidator`` validator for a specific type."""

def _is_instance_of(value: Any, **kwargs: Any) -> None:
if type_ is Any:
Review comment (Contributor Author): possibly the tests are wrong.

Review comment (Contributor @egparedes, Jan 9, 2024): The actual reason for this was a bug in a previous fix for handling typing.Any as a type in modern Python and typing_extensions versions. I removed this fix and applied the proper bug fix in extended_typing.

return
if not isinstance(value, type_):
raise TypeError(
f"'{name}' must be {type_} (got '{value}' which is a {type(value)})."
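The early return above makes a validator built for a declared type of `typing.Any` accept every value instead of attempting `isinstance(value, Any)`, which raises `TypeError`; per the follow-up comment, the underlying issue was ultimately fixed in `extended_typing` rather than here. A minimal sketch of the pattern, assuming a simplified validator factory (`make_checker` is a hypothetical name, not gt4py's `make_is_instance_of`):

```python
# Minimal sketch of the `Any` short-circuit above, using a simplified factory.
from typing import Any, Callable


def make_checker(name: str, type_: Any) -> Callable[[Any], None]:
    def _check(value: Any) -> None:
        if type_ is Any:
            # `isinstance(value, Any)` raises TypeError, and a field declared
            # as Any should accept every value anyway, so return early.
            return
        if not isinstance(value, type_):
            raise TypeError(f"'{name}' must be {type_} (got '{value}' which is a {type(value)}).")

    return _check


make_checker("x", int)(3)           # passes
make_checker("y", Any)("anything")  # passes: Any never rejects a value
```
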
@@ -12,7 +12,7 @@
#
# SPDX-License-Identifier: GPL-3.0-or-later

from dataclasses import dataclass
import dataclasses

import numpy as np
import pytest
@@ -201,22 +201,26 @@ def test_setup(fieldview_backend):
grid_type=common.GridType.UNSTRUCTURED,
)

@dataclass(frozen=True)
@dataclasses.dataclass(frozen=True)
class setup:
case: cases.Case = test_case
cell_size = case.default_sizes[Cell]
k_size = case.default_sizes[KDim]
z_alpha = case.as_field(
case: cases.Case = dataclasses.field(default_factory=lambda: test_case)
cell_size = test_case.default_sizes[Cell]
k_size = test_case.default_sizes[KDim]
z_alpha = test_case.as_field(
[Cell, KDim], np.random.default_rng().uniform(size=(cell_size, k_size + 1))
)
z_beta = case.as_field(
z_beta = test_case.as_field(
[Cell, KDim], np.random.default_rng().uniform(size=(cell_size, k_size))
)
z_q = test_case.as_field(
[Cell, KDim], np.random.default_rng().uniform(size=(cell_size, k_size))
)
w = test_case.as_field(
[Cell, KDim], np.random.default_rng().uniform(size=(cell_size, k_size))
)
z_q = case.as_field([Cell, KDim], np.random.default_rng().uniform(size=(cell_size, k_size)))
w = case.as_field([Cell, KDim], np.random.default_rng().uniform(size=(cell_size, k_size)))
z_q_ref, w_ref = reference(z_alpha.ndarray, z_beta.ndarray, z_q.ndarray, w.ndarray)
dummy = case.as_field([Cell, KDim], np.zeros((cell_size, k_size), dtype=bool))
z_q_out = case.as_field([Cell, KDim], np.zeros((cell_size, k_size)))
dummy = test_case.as_field([Cell, KDim], np.zeros((cell_size, k_size), dtype=bool))
z_q_out = test_case.as_field([Cell, KDim], np.zeros((cell_size, k_size)))

return setup()

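The rewrite above is needed because Python 3.11 tightened the dataclass default-value check: instead of rejecting only `list`, `dict`, and `set` defaults, it now rejects any unhashable default, which presumably includes a `cases.Case` instance. Wrapping the default in `dataclasses.field(default_factory=...)` avoids that check, and the remaining class-body expressions read `test_case` directly because they are plain class attributes evaluated at class-definition time, not dataclass fields. A small illustration with made-up classes (`Config`, `Broken`, `Setup` are stand-ins, not names from the test suite):

```python
# Illustration of the Python 3.11 dataclass behavior behind the change above.
import dataclasses
import sys


@dataclasses.dataclass(eq=True)  # eq=True without frozen=True => __hash__ is None (unhashable)
class Config:
    value: int = 0


default_config = Config()

try:
    @dataclasses.dataclass(frozen=True)
    class Broken:
        config: Config = default_config  # ValueError on 3.11+: unhashable default value
except ValueError:
    assert sys.version_info >= (3, 11)


@dataclasses.dataclass(frozen=True)
class Setup:
    # default_factory defers creating the default, so the hashability check does not apply
    config: Config = dataclasses.field(default_factory=lambda: default_config)


assert Setup().config is default_config
```
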
@@ -108,7 +108,10 @@ def test_unpacking_swap():
lines = ast.unparse(ssa_ast).split("\n")
assert lines[0] == f"a{SEP}0 = 5"
assert lines[1] == f"b{SEP}0 = 1"
assert lines[2] == f"(b{SEP}1, a{SEP}1) = (a{SEP}0, b{SEP}0)"
assert lines[2] in [
f"(b{SEP}1, a{SEP}1) = (a{SEP}0, b{SEP}0)",
f"b{SEP}1, a{SEP}1 = (a{SEP}0, b{SEP}0)",
] # unparse produces different parentheses in different Python versions


def test_annotated_assign():
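The relaxed assertion accounts for `ast.unparse` rendering the left-hand tuple of an assignment with or without parentheses depending on the Python version. A quick, deliberately permissive check of the same behavior (the exact form printed depends on the interpreter running it):

```python
# Quick illustration of the version-dependent rendering handled by the test above.
import ast

out = ast.unparse(ast.parse("b, a = a, b"))
print(out)  # e.g. "(b, a) = (a, b)" or "b, a = (a, b)", depending on the Python version
assert out in {"(b, a) = (a, b)", "b, a = (a, b)", "b, a = a, b"}  # deliberately permissive
```
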
35 changes: 20 additions & 15 deletions tox.ini
@@ -11,21 +11,24 @@ envlist =
# docs
labels =
test-cartesian-cpu = cartesian-py38-internal-cpu, cartesian-py39-internal-cpu, cartesian-py310-internal-cpu, \
cartesian-py38-dace-cpu, cartesian-py39-dace-cpu, cartesian-py310-dace-cpu
cartesian-py311-internal-cpu, cartesian-py38-dace-cpu, cartesian-py39-dace-cpu, cartesian-py310-dace-cpu, \
cartesian-py311-dace-cpu

test-eve-cpu = eve-py38, eve-py39, eve-py310
test-eve-cpu = eve-py38, eve-py39, eve-py310, eve-py311

test-next-cpu = next-py310-nomesh, next-py310-atlas
test-next-cpu = next-py310-nomesh, next-py311-nomesh, next-py310-atlas, next-py311-atlas

test-storage-cpu = storage-py38-internal-cpu, storage-py39-internal-cpu, storage-py310-internal-cpu, \
storage-py38-dace-cpu, storage-py39-dace-cpu, storage-py310-dace-cpu
storage-py311-internal-cpu, storage-py38-dace-cpu, storage-py39-dace-cpu, storage-py310-dace-cpu, \
storage-py311-dace-cpu

test-cpu = cartesian-py38-internal-cpu, cartesian-py39-internal-cpu, cartesian-py310-internal-cpu, \
cartesian-py38-dace-cpu, cartesian-py39-dace-cpu, cartesian-py310-dace-cpu, \
eve-py38, eve-py39, eve-py310, \
next-py310-nomesh, next-py310-atlas, \
storage-py38-internal-cpu, storage-py39-internal-cpu, storage-py310-internal-cpu, \
storage-py38-dace-cpu, storage-py39-dace-cpu, storage-py310-dace-cpu
cartesian-py311-internal-cpu, cartesian-py38-dace-cpu, cartesian-py39-dace-cpu, cartesian-py310-dace-cpu, \
cartesian-py311-dace-cpu, \
eve-py38, eve-py39, eve-py310, eve-py311, \
next-py310-nomesh, next-py311-nomesh, next-py310-atlas, next-py311-atlas, \
storage-py38-internal-cpu, storage-py39-internal-cpu, storage-py310-internal-cpu, storage-py311-internal-cpu, \
storage-py38-dace-cpu, storage-py39-dace-cpu, storage-py310-dace-cpu, storage-py311-dace-cpu

[testenv]
deps = -r {tox_root}{/}{env:ENV_REQUIREMENTS_FILE:requirements-dev.txt}
@@ -44,7 +47,7 @@ pass_env = NUM_PROCESSES
set_env =
PYTHONWARNINGS = {env:PYTHONWARNINGS:ignore:Support for `[tool.setuptools]` in `pyproject.toml` is still *beta*:UserWarning}

[testenv:cartesian-py{38,39,310}-{internal,dace}-{cpu,cuda,cuda11x,cuda12x}]
[testenv:cartesian-py{38,39,310,311}-{internal,dace}-{cpu,cuda,cuda11x,cuda12x}]
description = Run 'gt4py.cartesian' tests
pass_env = {[testenv]pass_env}, BOOST_ROOT, BOOST_HOME, CUDA_HOME, CUDA_PATH, CXX, CC, OPENMP_CPPFLAGS, OPENMP_LDFLAGS, PIP_USER, PYTHONUSERBASE
allowlist_externals =
@@ -65,13 +68,13 @@ commands =
; coverage json --rcfile=setup.cfg
; coverage html --rcfile=setup.cfg --show-contexts

[testenv:eve-py{38,39,310}]
[testenv:eve-py{38,39,310,311}]
description = Run 'gt4py.eve' tests
commands =
python -m pytest --cache-clear -v -n {env:NUM_PROCESSES:1} {posargs} tests{/}eve_tests
python -m pytest --doctest-modules src{/}gt4py{/}eve

[testenv:next-py{310}-{nomesh,atlas}-{cpu,cuda,cuda11x,cuda12x}]
[testenv:next-py{310,311}-{nomesh,atlas}-{cpu,cuda,cuda11x,cuda12x}]
description = Run 'gt4py.next' tests
pass_env = {[testenv]pass_env}, BOOST_ROOT, BOOST_HOME, CUDA_HOME, CUDA_PATH
deps =
@@ -87,14 +90,14 @@ commands =
# atlas-{cuda,cuda11x,cuda12x}: python -m pytest --cache-clear -v -n {env:NUM_PROCESSES:1} -m "requires_atlas and requires_gpu" {posargs} tests{/}next_tests # TODO(ricoh): activate when such tests exist
pytest --doctest-modules src{/}gt4py{/}next

[testenv:storage-py{38,39,310}-{internal,dace}-{cpu,cuda,cuda11x,cuda12x}]
[testenv:storage-py{38,39,310,311}-{internal,dace}-{cpu,cuda,cuda11x,cuda12x}]
description = Run 'gt4py.storage' tests
commands =
cpu: python -m pytest --cache-clear -v -n {env:NUM_PROCESSES:1} -m "not requires_gpu" {posargs} tests{/}storage_tests
{cuda,cuda11x,cuda12x}: python -m pytest --cache-clear -v -n {env:NUM_PROCESSES:1} -m "requires_gpu" {posargs} tests{/}storage_tests
#pytest doctest-modules {posargs} src{/}gt4py{/}storage

[testenv:linters-py{38,39,310}]
[testenv:linters-py{38,39,310,311}]
description = Run linters
commands =
flake8 .{/}src
@@ -134,11 +137,13 @@ description =
py38: Update requirements for testing a specific python version
py39: Update requirements for testing a specific python version
py310: Update requirements for testing a specific python version
py311: Update requirements for testing a specific python version
base_python =
common: py38
py38: py38
py39: py39
py310: py310
py311: py311
deps =
cogapp>=3.3
pip-tools>=6.10
@@ -178,7 +183,7 @@ commands =
# Run cog to update .pre-commit-config.yaml with new versions
common: cog -r -P .pre-commit-config.yaml

[testenv:dev-py{38,39,310}{-atlas,}]
[testenv:dev-py{38,39,310,311}{-atlas,}]
description = Initialize development environment for gt4py
deps =
-r {tox_root}{/}requirements-dev.txt