Generic reshape #213

Merged: 3 commits, Mar 14, 2025
4 changes: 2 additions & 2 deletions CMakeLists.txt
@@ -64,10 +64,10 @@ add_library(neural-fortran
src/nf/nf_parallel.f90
src/nf/nf_parallel_submodule.f90
src/nf/nf_random.f90
src/nf/nf_reshape_layer.f90
src/nf/nf_reshape_layer_submodule.f90
src/nf/nf_reshape2d_layer.f90
src/nf/nf_reshape2d_layer_submodule.f90
src/nf/nf_reshape3d_layer.f90
src/nf/nf_reshape3d_layer_submodule.f90
src/nf/nf_self_attention_layer.f90
src/nf/io/nf_io_binary.f90
src/nf/io/nf_io_binary_submodule.f90
11 changes: 6 additions & 5 deletions README.md
@@ -15,8 +15,8 @@ Read the paper [here](https://arxiv.org/abs/1902.06714).

## Features

* Training and inference of dense (fully connected) and convolutional neural
networks
* Training and inference of dense (fully connected), convolutional (1-d and 2-d),
and transformer neural networks
* Stochastic gradient descent optimizers: Classic, momentum, Nesterov momentum,
RMSProp, Adagrad, Adam, AdamW
* More than a dozen activation functions and their derivatives
@@ -41,9 +41,8 @@ Read the paper [here](https://arxiv.org/abs/1902.06714).
| Linear (2-d) | `linear2d` | `input2d`, `layernorm`, `linear2d`, `self_attention` | 2 | ✅ | ✅ |
| Self-attention | `self_attention` | `input2d`, `layernorm`, `linear2d`, `self_attention` | 2 | ✅ | ✅ |
| Layer Normalization | `layernorm` | `linear2d`, `self_attention` | 2 | ✅ | ✅ |
| Flatten | `flatten` | `input2d`, `input3d`, `conv2d`, `maxpool2d`, `reshape` | 1 | ✅ | ✅ |
| Reshape (1-d to 2-d) | `reshape2d` | `input2d`, `conv1d`, `locally_connected1d`, `maxpool1d` | 2 | ✅ | ✅ |
| Reshape (1-d to 3-d) | `reshape` | `input1d`, `dense`, `flatten` | 3 | ✅ | ✅ |
| Flatten | `flatten` | `input2d`, `input3d`, `conv1d`, `conv2d`, `maxpool1d`, `maxpool2d`, `reshape` | 1 | ✅ | ✅ |
| Reshape (1-d to 2-d or 3-d) | `reshape` | `dense`, `dropout`, `flatten`, `input1d` | 2, 3 | ✅ | ✅ |
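With this change, both target ranks are reached through the single generic `reshape` constructor, dispatched on argument count. A minimal usage sketch, assuming only the public `nf` API shown in this diff (the 784 → 28×28 layout is taken from the PR's own MNIST examples):

```fortran
program reshape_demo
  ! Sketch of the unified reshape API after this PR:
  ! reshape(d1, d2) yields a rank-2 output, reshape(d1, d2, d3) rank-3.
  use nf, only: network, input, reshape
  implicit none
  type(network) :: net2d, net3d

  ! 1-d input reshaped to 2-d, e.g. to feed conv1d or locally_connected1d
  net2d = network([input(784), reshape(28, 28)])

  ! 1-d input reshaped to 3-d, e.g. to feed conv2d
  net3d = network([input(784), reshape(1, 28, 28)])
end program reshape_demo
```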

## Getting started

@@ -263,11 +262,13 @@ It may be useful to read if you want to contribute a new feature to neural-fortran.

Thanks to all open-source contributors to neural-fortran:
[awvwgk](https://github.com/awvwgk),
[certik](https://github.com/certik),
[ggoyman](https://github.com/ggoyman),
[ivan-pi](https://github.com/ivan-pi),
[jacobwilliams](https://github.com/jacobwilliams),
[jvdp1](https://github.com/jvdp1),
[jvo203](https://github.com/jvo203),
[mathomp4](https://github.com/mathomp4),
[milancurcic](https://github.com/milancurcic),
[OneAdder](https://github.com/OneAdder),
[pirpyn](https://github.com/pirpyn),
2 changes: 1 addition & 1 deletion example/cnn_mnist.f90
@@ -20,7 +20,7 @@ program cnn_mnist

net = network([ &
input(784), &
reshape([1,28,28]), &
reshape(1, 28, 28), &
conv2d(filters=8, kernel_size=3, activation=relu()), &
maxpool2d(pool_size=2), &
conv2d(filters=16, kernel_size=3, activation=relu()), &
4 changes: 2 additions & 2 deletions example/cnn_mnist_1d.f90
@@ -1,7 +1,7 @@
program cnn_mnist_1d

use nf, only: network, sgd, &
input, conv1d, maxpool1d, flatten, dense, reshape, reshape2d, locally_connected1d, &
input, conv1d, maxpool1d, flatten, dense, reshape, locally_connected1d, &
load_mnist, label_digits, softmax, relu

implicit none
@@ -20,7 +20,7 @@ program cnn_mnist_1d

net = network([ &
input(784), &
reshape2d([28, 28]), &
reshape(28, 28), &
locally_connected1d(filters=8, kernel_size=3, activation=relu()), &
maxpool1d(pool_size=2), &
locally_connected1d(filters=16, kernel_size=3, activation=relu()), &
2 changes: 1 addition & 1 deletion fpm.toml
@@ -1,5 +1,5 @@
name = "neural-fortran"
version = "0.20.0"
version = "0.21.0"
license = "MIT"
author = "Milan Curcic"
maintainer = "[email protected]"
1 change: 0 additions & 1 deletion src/nf.f90
@@ -16,7 +16,6 @@ module nf
maxpool1d, &
maxpool2d, &
reshape, &
reshape2d, &
self_attention
use nf_loss, only: mse, quadratic
use nf_metrics, only: corr, maxabs
45 changes: 22 additions & 23 deletions src/nf/nf_layer_constructors.f90
@@ -20,7 +20,6 @@ module nf_layer_constructors
maxpool1d, &
maxpool2d, &
reshape, &
reshape2d, &
self_attention, &
embedding, &
layernorm
@@ -94,6 +93,28 @@ end function input3d

end interface input


interface reshape

module function reshape2d(dim1, dim2) result(res)
!! Rank-1 to rank-2 reshape layer constructor.
integer, intent(in) :: dim1, dim2
!! Shape of the output
type(layer) :: res
!! Resulting layer instance
end function reshape2d

module function reshape3d(dim1, dim2, dim3) result(res)
!! Rank-1 to rank-3 reshape layer constructor.
integer, intent(in) :: dim1, dim2, dim3
!! Shape of the output
type(layer) :: res
!! Resulting layer instance
end function reshape3d

end interface reshape


interface

module function dense(layer_size, activation) result(res)
@@ -283,28 +304,6 @@ module function maxpool2d(pool_size, stride) result(res)
!! Resulting layer instance
end function maxpool2d

module function reshape(output_shape) result(res)
!! Rank-1 to rank-any reshape layer constructor.
!! Currently implemented is only rank-3 for the output of the reshape.
!!
!! This layer is for connecting 1-d inputs to conv2d or similar layers.
integer, intent(in) :: output_shape(:)
!! Shape of the output
type(layer) :: res
!! Resulting layer instance
end function reshape

module function reshape2d(output_shape) result(res)
!! Rank-1 to rank-any reshape layer constructor.
!! Currently implemented is only rank-2 for the output of the reshape.
!!
!! This layer is for connecting 1-d inputs to conv1d or similar layers.
integer, intent(in) :: output_shape(:)
!! Shape of the output
type(layer) :: res
!! Resulting layer instance
end function reshape2d

module function linear2d(out_features) result(res)
!! Rank-2 (sequence_length, out_features) linear layer constructor.
!! sequence_length is determined at layer initialization, based on the
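The two specifics collected under the `reshape` generic are disambiguated by argument count, which is standard Fortran generic-interface resolution. A stripped-down sketch of the mechanism (module and procedure names here are illustrative, not part of neural-fortran):

```fortran
module demo_generic
  implicit none
  ! One generic name, two specifics distinguished by their argument lists,
  ! mirroring how reshape maps to reshape2d and reshape3d in this PR.
  interface make_shape
    module procedure make_shape_2d
    module procedure make_shape_3d
  end interface
contains
  function make_shape_2d(d1, d2) result(s)
    integer, intent(in) :: d1, d2
    integer, allocatable :: s(:)
    s = [d1, d2]
  end function make_shape_2d
  function make_shape_3d(d1, d2, d3) result(s)
    integer, intent(in) :: d1, d2, d3
    integer, allocatable :: s(:)
    s = [d1, d2, d3]
  end function make_shape_3d
end module demo_generic
```

The compiler resolves `make_shape(4, 4)` to the rank-2 specific and `make_shape(1, 28, 28)` to the rank-3 one at compile time, which is why callers can drop the old `reshape2d` name entirely.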
39 changes: 13 additions & 26 deletions src/nf/nf_layer_constructors_submodule.f90
@@ -12,8 +12,8 @@
use nf_locally_connected1d_layer, only: locally_connected1d_layer
use nf_maxpool1d_layer, only: maxpool1d_layer
use nf_maxpool2d_layer, only: maxpool2d_layer
use nf_reshape_layer, only: reshape3d_layer
use nf_reshape2d_layer, only: reshape2d_layer
use nf_reshape3d_layer, only: reshape3d_layer
use nf_linear2d_layer, only: linear2d_layer
use nf_self_attention_layer, only: self_attention_layer
use nf_embedding_layer, only: embedding_layer
@@ -229,35 +229,22 @@ module function maxpool2d(pool_size, stride) result(res)
end function maxpool2d


module function reshape(output_shape) result(res)
integer, intent(in) :: output_shape(:)
type(layer) :: res

res % name = 'reshape'
res % layer_shape = output_shape

if (size(output_shape) == 3) then
allocate(res % p, source=reshape3d_layer(output_shape))
else
error stop 'size(output_shape) of the reshape layer must == 3'
end if

end function reshape

module function reshape2d(output_shape) result(res)
integer, intent(in) :: output_shape(:)
module function reshape2d(dim1, dim2) result(res)
integer, intent(in) :: dim1, dim2
type(layer) :: res

res % name = 'reshape2d'
res % layer_shape = output_shape
res % layer_shape = [dim1, dim2]
allocate(res % p, source=reshape2d_layer(res % layer_shape))
end function reshape2d

if (size(output_shape) == 2) then
allocate(res % p, source=reshape2d_layer(output_shape))
else
error stop 'size(output_shape) of the reshape layer must == 2'
end if

end function reshape2d
module function reshape3d(dim1, dim2, dim3) result(res)
integer, intent(in) :: dim1, dim2, dim3
type(layer) :: res
res % name = 'reshape3d'
res % layer_shape = [dim1, dim2, dim3]
allocate(res % p, source=reshape3d_layer(res % layer_shape))
end function reshape3d


module function linear2d(out_features) result(res)
2 changes: 1 addition & 1 deletion src/nf/nf_layer_submodule.f90
@@ -13,7 +13,7 @@
use nf_maxpool1d_layer, only: maxpool1d_layer
use nf_maxpool2d_layer, only: maxpool2d_layer
use nf_reshape2d_layer, only: reshape2d_layer
use nf_reshape_layer, only: reshape3d_layer
use nf_reshape3d_layer, only: reshape3d_layer
use nf_linear2d_layer, only: linear2d_layer
use nf_self_attention_layer, only: self_attention_layer
use nf_embedding_layer, only: embedding_layer
4 changes: 2 additions & 2 deletions src/nf/nf_network_submodule.f90
@@ -12,13 +12,13 @@
use nf_maxpool1d_layer, only: maxpool1d_layer
use nf_maxpool2d_layer, only: maxpool2d_layer
use nf_reshape2d_layer, only: reshape2d_layer
use nf_reshape_layer, only: reshape3d_layer
use nf_reshape3d_layer, only: reshape3d_layer
use nf_linear2d_layer, only: linear2d_layer
use nf_self_attention_layer, only: self_attention_layer
use nf_embedding_layer, only: embedding_layer
use nf_layernorm_layer, only: layernorm_layer
use nf_layer, only: layer
use nf_layer_constructors, only: conv1d, conv2d, dense, flatten, input, maxpool1d, maxpool2d, reshape, reshape2d
use nf_layer_constructors, only: conv1d, conv2d, dense, flatten, input, maxpool1d, maxpool2d, reshape
use nf_loss, only: quadratic
use nf_optimizers, only: optimizer_base_type, sgd
use nf_parallel, only: tile_indices
4 changes: 2 additions & 2 deletions src/nf/nf_reshape_layer.f90 → src/nf/nf_reshape3d_layer.f90
@@ -1,4 +1,4 @@
module nf_reshape_layer
module nf_reshape3d_layer

!! This module provides the concrete reshape layer type.
!! It is used internally by the layer type.
@@ -73,4 +73,4 @@ end subroutine init

end interface

end module nf_reshape_layer
end module nf_reshape3d_layer
@@ -1,4 +1,4 @@
submodule(nf_reshape_layer) nf_reshape_layer_submodule
submodule(nf_reshape3d_layer) nf_reshape3d_layer_submodule

use nf_base_layer, only: base_layer

@@ -48,4 +48,4 @@ module subroutine init(self, input_shape)

end subroutine init

end submodule nf_reshape_layer_submodule
end submodule nf_reshape3d_layer_submodule
2 changes: 1 addition & 1 deletion test/test_insert_flatten.f90
@@ -45,7 +45,7 @@ program test_insert_flatten

net = network([ &
input(4), &
reshape([1, 2, 2]), &
reshape(1, 2, 2), &
dense(4) &
])

6 changes: 2 additions & 4 deletions test/test_reshape2d_layer.f90
@@ -1,23 +1,21 @@
program test_reshape2d_layer

use iso_fortran_env, only: stderr => error_unit
use nf, only: input, network, reshape2d_layer => reshape2d
use nf_datasets, only: download_and_unpack, keras_reshape_url
use nf, only: input, network, reshape2d => reshape

implicit none

type(network) :: net
real, allocatable :: sample_input(:), output(:,:)
integer, parameter :: output_shape(2) = [4,4]
integer, parameter :: input_size = product(output_shape)
character(*), parameter :: keras_reshape_path = 'keras_reshape.h5'
logical :: file_exists
logical :: ok = .true.

! Create the network
net = network([ &
input(input_size), &
reshape2d_layer(output_shape) &
reshape2d(output_shape(1), output_shape(2)) &
])

if (.not. size(net % layers) == 2) then
4 changes: 2 additions & 2 deletions test/test_reshape_layer.f90
@@ -1,7 +1,7 @@
program test_reshape_layer

use iso_fortran_env, only: stderr => error_unit
use nf, only: input, network, reshape_layer => reshape
use nf, only: input, network, reshape3d => reshape
use nf_datasets, only: download_and_unpack, keras_reshape_url

implicit none
@@ -17,7 +17,7 @@ program test_reshape_layer
! Create the network
net = network([ &
input(input_size), &
reshape_layer(output_shape) &
reshape3d(3, 32, 32) &
])

if (.not. size(net % layers) == 2) then