julia> dotdec(g, rand(2, 5))
1×6 Matrix{Float64}:
 0.345098  0.458305  0.106353  0.345098  0.458305  0.106353

GraphNeuralNetworks.GNNChain — Type

GNNChain(layers...)
GNNChain(name = layer, ...)

Collects multiple layers / functions to be called in sequence on a given input graph and input node features.

It allows one to compose layers in a sequential fashion as Flux.Chain does, propagating the output of each layer to the next one. In addition, GNNChain handles the input graph as well, providing it as a first argument only to layers subtyping the GNNLayer abstract type.

GNNChain supports indexing and slicing, m[2] or m[1:end-1], and if names are given, m[:name] == m[1] etc.

Examples

julia> using Flux, GraphNeuralNetworks
julia> m = GNNChain(GCNConv(2=>5),
                    BatchNorm(5),
                    x -> relu.(x),
                    Dense(5, 4));

julia> x = randn(Float32, 2, 3);

julia> g = rand_graph(3, 6);

julia> m2 = GNNChain(enc = m,
                     dec = DotDecoder());

julia> m2(g, x)
1×6 Matrix{Float32}:
 2.90053  2.90053  2.90053  2.90053  2.90053  2.90053

julia> m2[:enc](g, x) == m(g, x)
true
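The indexing and slicing behavior described above can be sketched a bit more fully. This is an illustrative example, not taken from the library's documentation; it assumes GraphNeuralNetworks.jl and Flux are loaded:

```julia
using Flux, GraphNeuralNetworks

g = rand_graph(3, 6)          # random graph with 3 nodes, 6 edges
x = randn(Float32, 2, 3)      # 2 features per node

m = GNNChain(GCNConv(2 => 4), GCNConv(4 => 4), Dense(4, 2))

m[1]          # the first layer: a GCNConv(2 => 4)
m[1:end-1]    # a GNNChain containing all but the last layer

# With names, layers can also be retrieved by symbol:
mn = GNNChain(conv = GCNConv(2 => 4), dense = Dense(4, 2))
@assert mn[:conv] == mn[1]
```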
GraphNeuralNetworks.GNNLayer — Type

abstract type GNNLayer end

An abstract type from which graph neural network layers are derived.

See also GNNChain.

GraphNeuralNetworks.WithGraph — Type

WithGraph(model, g::GNNGraph; traingraph=false)

A type wrapping the model and tying it to the graph g. In the forward pass, it can only take feature arrays as inputs, returning model(g, x...; kws...).

If traingraph=false, the graph's parameters won't be part of the trainable parameters in the gradient updates.

Examples

g = GNNGraph([1,2,3], [2,3,1])
x = rand(Float32, 2, 3)
model = SAGEConv(2 => 3)
wg = WithGraph(model, g)
# No need to pass the graph to `wg`
@assert wg(x) == model(g, x)

g2 = GNNGraph([1,1,2,3], [2,4,1,1])
x2 = rand(Float32, 2, 4)
# WithGraph will ignore the internal graph if fed with a new one.
@assert wg(g2, x2) == model(g2, x2)
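The GNNLayer abstract type documented above is what tells GNNChain which layers should receive the graph. A minimal sketch of a custom layer, assuming GraphNeuralNetworks.jl and Flux are loaded (the layer name NodeMeanLayer and its behavior are hypothetical, not part of the library):

```julia
using Flux, GraphNeuralNetworks

# A hypothetical layer that averages node features over the graph
# and broadcasts the mean back to every node.
struct NodeMeanLayer <: GNNLayer end

(l::NodeMeanLayer)(g::GNNGraph, x::AbstractMatrix) =
    repeat(sum(x, dims = 2) ./ g.num_nodes, 1, g.num_nodes)

g = rand_graph(4, 8)
x = rand(Float32, 3, 4)

# Because NodeMeanLayer <: GNNLayer, GNNChain passes `g` to it
# automatically; the plain Dense layer receives only the features.
chain = GNNChain(NodeMeanLayer(), Dense(3, 2))
y = chain(g, x)
@assert size(y) == (2, 4)
```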
This document was generated with Documenter.jl version 1.5.0 on Thursday 25 July 2024. Using Julia version 1.10.4.