Redesign core #500

Status: Open. Wants to merge 26 commits into base branch main.

Commits (26):
22b1f6e  add current branch as part of CI check (skim0119, Nov 20, 2024)
d1c1eee  clean up pipeline implementation (skim0119, Nov 20, 2024)
32b5efa  rework: redesign how callback functions are gathered (skim0119, Nov 20, 2024)
37247e3  cleanup: simplified cache check (skim0119, Nov 20, 2024)
e50de8f  remove skip-plot: instead add separate operator plotter (skim0119, Nov 20, 2024)
b9ea818  rework: generator-type operator callback redesign (skim0119, Nov 20, 2024)
2472b3a  update test cases (skim0119, Nov 20, 2024)
97a5156  allow user to pass callback functions as either operator instance or … (skim0119, Nov 20, 2024)
fa4d71c  bump up version for python to 3.10 (skim0119, Nov 20, 2024)
122a0df  remove dependency lock for now (skim0119, Nov 20, 2024)
767791e  recreate lock file within CI (skim0119, Nov 20, 2024)
614da1d  lock dependencies (skim0119, Nov 20, 2024)
115c1a2  remove centrality graph test (skim0119, Nov 20, 2024)
f82fec2  wip: type hinting operator consistency (skim0119, Nov 20, 2024)
9b12685  Merge branch 'main' into redesign_core (skim0119, Nov 30, 2024)
0510e0a  fix test with new pipeline rework (skim0119, Nov 30, 2024)
37051cd  Merge remote-tracking branch 'public/main' into redesign_core (skim0119, Dec 1, 2024)
f95cb15  remove ParallelGeneratorFetch: issue with un-picklable object queueing (skim0119, Dec 1, 2024)
fe8e451  Update main.yml (skim0119, Dec 1, 2024)
b7a4b4a  Update main.yml (skim0119, Dec 1, 2024)
99bd622  wip: cleanup unused code (skim0119, Dec 1, 2024)
62a1854  wip: typing intan loader (skim0119, Dec 2, 2024)
a7e21c0  wip: typing openephys loader, remove unused functions in data loaders (skim0119, Dec 2, 2024)
d38f65a  wip: restructuring behavior design (skim0119, Dec 4, 2024)
2a48c79  wip: restructure behavior definition (skim0119, Dec 4, 2024)
411e3e1  update: reassign dependency trees (skim0119, Dec 10, 2024)
17 changes: 7 additions & 10 deletions .github/workflows/main.yml
@@ -2,9 +2,9 @@ name: CI
 
 on:
   push:
-    branches: [ main, update-** ]
+    branches: [main, update-**, redesign_core]
   pull_request:
-    branches: [ '**' ]
+    branches: ["**"]
 
 jobs:
   build:
@@ -13,7 +13,7 @@ jobs:
       matrix:
         python-version: ["3.10", "3.11", "3.12"]
         os: [macos-13, ubuntu-latest]
-        mpi: ["openmpi"] # [ 'mpich', 'openmpi', 'intelmpi']
+        mpi: ["openmpi"]  # [ 'mpich', 'openmpi', 'intelmpi']
         include:
           - os: macos-13
             path: ~/Library/Caches/pip
@@ -48,17 +48,14 @@ jobs:
           key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('**/poetry.lock') }}
       - name: Install dependencies
         if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
-        run: poetry install --no-interaction --no-root --all-extras --with=dev,algorithmExtension,sortingExtension #,mpi
+        run: poetry install --no-interaction --all-extras --with=dev,algorithmExtension,sortingExtension #,mpi
       - uses: FedericoCarboni/setup-ffmpeg@v3
         id: setup-ffmpeg
         with:
           # like "6.1.0". At the moment semver specifiers (i.e. >=6.1.0) are supported
           # only on Windows, on other platforms they are allowed but version is matched
           # exactly regardless.
           ffmpeg-version: release
-          # Target architecture of the ffmpeg executable to install. Defaults to the
-          # system architecture. Only x64 and arm64 are supported (arm64 only on Linux).
-          architecture: ''
           # Linking type of the binaries. Use "shared" to download shared binaries and
           # "static" for statically linked ones. Shared builds are currently only available
           # for windows releases. Defaults to "static"
@@ -70,17 +67,17 @@ jobs:
       - name: Run pytests
         if: always()
         run: |
-          source $VENV
+          source .venv/bin/activate
           make test
       - name: Run mypy
         if: always()
         run: |
-          source $VENV
+          source .venv/bin/activate
           make mypy
       - name: Run formatting check
         if: always()
         run: |
-          source $VENV
+          source .venv/bin/activate
           make check-codestyle
       # Upload coverage to Codecov (use python 3.10 ubuntu-latest)
       - name: Upload coverage to Codecov (only on 3.10 ubuntu-latest)
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -17,7 +17,7 @@ repos:
     hooks:
       - id: pyupgrade
         name: pyupgrade
-        entry: poetry run pyupgrade --py311-plus
+        entry: poetry run pyupgrade --py310-plus
         types: [python]
         language: system
 
7 changes: 4 additions & 3 deletions Makefile
@@ -23,7 +23,7 @@ pre-commit-install:
 #* Formatters
 .PHONY: codestyle
 codestyle:
-	# poetry run pyupgrade --exit-zero-even-if-changed --py38-plus **/*.py
+	poetry run pyupgrade --exit-zero-even-if-changed --py310-plus **/*.py
 	# poetry run isort --settings-path pyproject.toml ./
 	poetry run black --config pyproject.toml ./
 
@@ -33,7 +33,8 @@ formatting: codestyle
 #* Linting
 .PHONY: test
 test:
-	poetry run pytest -c pyproject.toml --cov=miv --cov-report=xml
+	poetry run pytest -c pyproject.toml --cov=miv/core
+	# poetry run pytest -c pyproject.toml --cov=miv/core --cov-report=xml
 
 .PHONY: check-codestyle
 check-codestyle:
@@ -42,7 +43,7 @@ check-codestyle:
 
 .PHONY: mypy
 mypy:
-	poetry run mypy --config-file pyproject.toml miv
+	poetry run mypy --config-file pyproject.toml miv/core
 
 .PHONY: lint
 lint: test check-codestyle mypy check-safety
3 changes: 3 additions & 0 deletions docs/Makefile
@@ -19,5 +19,8 @@ help:
 %: Makefile
 	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
 
+.PHONY: clean clean_cache
 clean:
 	rm -rf $(BUILDDIR)/*
+clean_cache:
+	rm -rf **/results **/datasets
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -85,7 +85,7 @@
 autosummary_generate = True
 autosummary_generate_overwrite = False
 
-source_parsers: Dict[str, str] = {}
+source_parsers: dict[str, str] = {}
 source_suffix = {
     ".rst": "restructuredtext",
     ".md": "myst-nb",
7 changes: 3 additions & 4 deletions miv/core/datatype/__init__.py
@@ -2,10 +2,9 @@
 
 from miv.core.datatype.collapsable import *
 from miv.core.datatype.events import *
-from miv.core.datatype.protocol import *
-from miv.core.datatype.pure_python import *
-from miv.core.datatype.signal import *
-from miv.core.datatype.spikestamps import *
+from .pure_python import *
+from .signal import *
+from .spikestamps import *
 
 DataTypes = Any # Union[ # TODO
 # miv.core.datatype.signal.Signal,
23 changes: 12 additions & 11 deletions miv/core/datatype/collapsable.py
@@ -1,20 +1,21 @@
-from typing import Generator, Protocol
-
-from miv.core.datatype.protocol import Extendable
+from typing import Any, Protocol
+from collections.abc import Iterable
 
 
 class _Collapsable(Protocol):
     @classmethod
-    def from_collapse(self) -> None: ...
+    def from_collapse(self, values: Iterable["_Collapsable"]) -> "_Collapsable": ...
+
+    def extend(self, *args: Any, **kwargs: Any) -> None: ...
 
 
-class CollapseExtendableMixin:
-    def __init__(self, *args, **kwargs):
-        super().__init__(*args, **kwargs)
-
+class CollapseExtendableMixin:
     @classmethod
-    def from_collapse(cls, values: Generator[Extendable, None, None]):
-        obj = cls()
-        for value in values:
-            obj.extend(value)
+    def from_collapse(cls, values: Iterable[_Collapsable]) -> _Collapsable:
+        obj: _Collapsable
+        for idx, value in enumerate(values):
+            if idx == 0:
+                obj = value
+            else:
+                obj.extend(value)
         return obj
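The reworked from_collapse no longer instantiates cls() up front; it adopts the first element of the iterable and folds the rest in with extend(). A minimal standalone sketch of that contract, using a hypothetical Chunk class (not part of the PR) in place of a real datatype:

```python
# Hypothetical stand-in for a datatype satisfying the _Collapsable protocol:
# it has extend() and the collapse-from-iterable classmethod.
class Chunk:
    def __init__(self, values):
        self.values = list(values)

    def extend(self, other):
        # Merge another chunk into this one in place.
        self.values.extend(other.values)

    @classmethod
    def from_collapse(cls, chunks):
        # Adopt the first element, then fold the rest in via extend(),
        # mirroring the enumerate-based loop in the diff.
        obj = None
        for idx, chunk in enumerate(chunks):
            if idx == 0:
                obj = chunk
            else:
                obj.extend(chunk)
        return obj


merged = Chunk.from_collapse([Chunk([1, 2]), Chunk([3]), Chunk([4, 5])])
print(merged.values)  # [1, 2, 3, 4, 5]
```

One consequence of this design is that from_collapse mutates and returns the first element rather than building a fresh object, which is why the protocol now also requires extend().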
33 changes: 17 additions & 16 deletions miv/core/datatype/events.py
@@ -9,7 +9,7 @@
 
 __all__ = ["Events"]
 
-from typing import List, Optional
+from typing import Optional, cast
 
 from collections import UserList
 
@@ -28,36 +28,36 @@ class Events(CollapseExtendableMixin, DataNodeMixin):
     Comply with `Extendable` protocols.
     """
 
-    def __init__(self, data: List[float] = None):
+    def __init__(self, data: list[float] | None = None) -> None:
         super().__init__()
-        self.data = np.asarray(data) if data is not None else []
+        self.data = np.asarray(data) if data is not None else np.array([])
 
-    def append(self, item):
-        raise NotImplementedError("Not implemented yet. Need to append and sort")
+    def append(self, item: float) -> None:
+        self.data = np.append(self.data, item)
 
-    def extend(self, other):
-        raise NotImplementedError("Not implemented yet. Need to extend and sort")
+    def extend(self, other: "Events") -> None:
+        self.data = np.append(self.data, other.data)
 
-    def __len__(self):
+    def __len__(self) -> int:
         return len(self.data)
 
-    def get_last_event(self):
+    def get_last_event(self) -> float:
         """Return timestamps of the last event"""
-        return max(self.data)
+        return cast(float, max(self.data))
 
-    def get_first_event(self):
+    def get_first_event(self) -> float:
         """Return timestamps of the first event"""
-        return min(self.data)
+        return cast(float, min(self.data))
 
-    def get_view(self, t_start: float, t_end: float):
+    def get_view(self, t_start: float, t_end: float) -> "Events":
         """Truncate array and only includes spikestamps between t_start and t_end."""
         return Events(sorted(list(filter(lambda x: t_start <= x <= t_end, self.data))))
 
     def binning(
         self,
-        bin_size: float = 1 * pq.ms,
-        t_start: Optional[float] = None,
-        t_end: Optional[float] = None,
+        bin_size: float | pq.Quantity = 0.001,
+        t_start: float | None = None,
+        t_end: float | None = None,
         return_count: bool = False,
     ) -> Signal:
         """
@@ -89,6 +89,7 @@ def binning(
             rate=1.0 / bin_size,
         )
 
+        # TODO: Make separate free function for this binning process
         bins = np.digitize(self.data, time)
         bincount = np.bincount(bins, minlength=n_bins + 2)[1:-1]
         if return_count:
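The digitize-plus-bincount step in Events.binning can be sketched in isolation. The toy timestamps, bin edges, and derivation of n_bins below are illustrative assumptions, not code from the PR:

```python
import numpy as np

# Toy event timestamps (seconds) and binning parameters; in Events.binning
# these come from self.data, bin_size, t_start, and t_end.
events = np.array([0.0005, 0.0012, 0.0031, 0.0032])
bin_size = 0.001
t_start, t_end = 0.0, 0.004

n_bins = int(np.ceil((t_end - t_start) / bin_size))
time = t_start + np.arange(n_bins + 1) * bin_size  # bin edges

# digitize maps each event to a bin index: 0 means "before t_start" and
# n_bins + 1 means "after t_end"; slicing [1:-1] drops both overflow bins.
bins = np.digitize(events, time)
bincount = np.bincount(bins, minlength=n_bins + 2)[1:-1]
print(bincount)  # [1 1 0 2]
```

minlength=n_bins + 2 guarantees the bincount array always covers every bin plus the two overflow slots, even when no event falls in the last bins.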
27 changes: 0 additions & 27 deletions miv/core/datatype/protocol.py

This file was deleted.

43 changes: 19 additions & 24 deletions miv/core/datatype/pure_python.py
@@ -1,31 +1,26 @@
 __all__ = ["PythonDataType", "NumpyDType", "GeneratorType"]
 
-from typing import Protocol, Union
+from typing import Protocol, Union, TypeAlias, Any
+from collections.abc import Generator, Iterator
 
 import numpy as np
 
+from miv.core.operator.operator import DataNodeMixin
 from miv.core.operator.chainable import BaseChainingMixin
 
 
-class RawValuesProtocol(Protocol):
-    @staticmethod
-    def is_valid(value) -> bool: ...
+PurePythonTypes: TypeAlias = Union[int, float, str, bool, list, tuple, dict]
 
 
-class ValuesMixin(BaseChainingMixin):
+class ValuesMixin(DataNodeMixin, BaseChainingMixin):
     """
     This mixin is used to convert pure/numpy data type to be a valid input/output of a node.
     """
 
-    def __init__(self, value, *args, **kwargs):
+    def __init__(
+        self, data: np.ndarray | PurePythonTypes, *args: Any, **kwargs: Any
+    ) -> None:
         super().__init__(*args, **kwargs)
-        self.value = value
-
-    def output(self):
-        return self.value
-
-    def run(self, *args, **kwargs):
-        return self.output()
+        self.data = data
 
 
 class PythonDataType(ValuesMixin):
@@ -35,9 +30,9 @@ class PythonDataType(ValuesMixin):
     """
 
     @staticmethod
-    def is_valid(value):
-        return value is None or isinstance(
-            value, (int, float, str, bool, list, tuple, dict)
+    def is_valid(data: Any) -> bool:
+        return data is None or isinstance(
+            data, (int, float, str, bool, list, tuple, dict)
         )


@@ -47,23 +42,23 @@ class NumpyDType(ValuesMixin):
     """
 
     @staticmethod
-    def is_valid(value):
-        return isinstance(value, np.ndarray)
+    def is_valid(data: Any) -> bool:
+        return isinstance(data, np.ndarray)
 
 
 class GeneratorType(BaseChainingMixin):
-    def __init__(self, iterator, *args, **kwargs):
+    def __init__(self, iterator: Iterator, *args: Any, **kwargs: Any) -> None:
         super().__init__(*args, **kwargs)
         self.iterator = iterator
 
-    def output(self):
+    def output(self) -> Generator:
         yield from self.iterator
 
-    def run(self, **kwargs):
+    def run(self, **kwargs: Any) -> Generator:
         yield from self.output()
 
     @staticmethod
-    def is_valid(value):
+    def is_valid(data: Any) -> bool:
         import inspect
 
-        return inspect.isgenerator(value)
+        return inspect.isgenerator(data)
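Stripped of BaseChainingMixin, the generator wrapper above behaves like this minimal sketch. The simplified class below is illustrative, not the PR's full implementation:

```python
import inspect
from collections.abc import Iterator


class GeneratorType:
    """Simplified stand-in: wraps an iterator so output()/run() re-yield it."""

    def __init__(self, iterator: Iterator) -> None:
        self.iterator = iterator

    def output(self):
        yield from self.iterator

    def run(self, **kwargs):
        yield from self.output()

    @staticmethod
    def is_valid(data) -> bool:
        # Note: accepts generators only, not arbitrary iterators.
        return inspect.isgenerator(data)


print(GeneratorType.is_valid(x * x for x in range(4)))       # True
print(GeneratorType.is_valid([0, 1, 4, 9]))                  # False
print(list(GeneratorType(x * x for x in range(4)).run()))    # [0, 1, 4, 9]
```

Because is_valid checks inspect.isgenerator, a plain list or range is rejected even though the wrapper itself could iterate it; this matches the check in the diff.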