Commit 34d785a: [update] project info
AIboy996 committed Apr 26, 2024 (1 parent: d340af9)
Showing 9 changed files with 4 additions and 126 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/pythoh-package-publish.yml
@@ -14,7 +14,7 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        python-version: ["3.9", "3.10", "3.11"]
+        python-version: ["3.9", "3.10", "3.11", "3.12"]
     steps:
     - uses: actions/checkout@v4
     - name: Set up Python ${{ matrix.python-version }}
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
 
 [project]
 name = "npnn"
-version = "0.1.0"
+version = "0.1.1"
 dependencies = [
     "numpy"
 ]
97 changes: 0 additions & 97 deletions src/npnn.egg-info/PKG-INFO

This file was deleted.

15 changes: 0 additions & 15 deletions src/npnn.egg-info/SOURCES.txt

This file was deleted.

1 change: 0 additions & 1 deletion src/npnn.egg-info/dependency_links.txt

This file was deleted.

7 changes: 0 additions & 7 deletions src/npnn.egg-info/requires.txt

This file was deleted.

1 change: 0 additions & 1 deletion src/npnn.egg-info/top_level.txt

This file was deleted.
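
These egg-info directories are build artifacts that setuptools regenerates on every build, so deleting them from version control is the usual cleanup (they account for nearly all of the 126 removed lines). A hedged sketch of the .gitignore entries that would keep them out; the repository's actual ignore rules are not shown in this commit:

    # setuptools packaging artifacts (hypothetical .gitignore entries)
    *.egg-info/
    dist/
    build/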

3 changes: 1 addition & 2 deletions src/npnn/__init__.py
@@ -1,7 +1,6 @@
 """
 # NPNN
-> NumPy Neural Network
+> [NumPy Neural Network](https://github.com/AIboy996/npnn/)
 """
 
 from .autograd import Tensor
2 changes: 1 addition & 1 deletion src/npnn/autograd.py
@@ -156,6 +156,6 @@ def T(self):
 if __name__ == "__main__":
     from .functional import Inner
     x = Tensor(np.random.random((1, 3, 1)), requires_grad=True)
-    loss = Inner()(x.T, x)  # this condition is not included
+    loss = Inner()(x.T, x)  # this case is not considered
     loss.backward()
     print(x.grad)  # we will get a wrong grad.
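
The updated comment documents a real autograd pitfall: when the same tensor reaches an op twice, once directly and once through a view such as .T, both occurrences must be recognized as the same node or the accumulated gradient comes out wrong. A minimal NumPy-only sketch, independent of npnn's Tensor/Inner API and with the shape simplified to a flat vector, of what the correct gradient looks like:

    import numpy as np

    # For f(x) = <x, x>, x feeds both arguments of the inner product,
    # so the chain rule accumulates two contributions: grad f(x) = 2x.
    x = np.random.random(3)

    def f(v):
        return v @ v  # inner product of v with itself

    analytic = 2 * x  # both occurrences of x contribute

    # Central-difference numerical gradient as a ground-truth check.
    eps = 1e-6
    numerical = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                          for e in np.eye(3)])

    print(np.allclose(numerical, analytic))  # True
    print(np.allclose(numerical, x))         # False

An engine that treats x.T as a detached copy would backpropagate through only one of the two occurrences and report x instead of 2x, which is presumably the "wrong grad" the comment warns about.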
