
Add TemporalGraphConv #369

Closed
wants to merge 7 commits

Conversation

aurorarossi
Member

This PR adds `TemporalGraphConv`, a constructor that applies a convolutional homogeneous graph `layer` to each snapshot of a `TemporalSnapshotsGNNGraph`.
A follow-up PR will update the documentation.

src/layers/temporalconv.jl
TemporalGraphConv(layer)

A constructor for applying a convolutional homogeneous graph `layer` to each snapshot of the `TemporalSnapshotsGNNGraph`.
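Putting together the pieces quoted in this thread, the wrapper could look roughly like the following self-contained sketch (everything beyond the names quoted in the diff is illustrative, not the actual implementation):

```julia
# Hypothetical sketch of the wrapper discussed in this PR.
# `TemporalGraphConv` stores a single homogeneous graph layer and
# applies it independently to each snapshot of a temporal graph.
struct TemporalGraphConv{L}
    layer::L   # any callable layer, e.g. a graph convolution
end

# One output per snapshot; `tsg` is assumed to expose a `snapshots` field
# and `xs` to be a vector with one feature array per snapshot.
(tg::TemporalGraphConv)(tsg, xs) = map((g, x) -> tg.layer(g, x), tsg.snapshots, xs)
```

With a toy "layer" such as multiplication, `TemporalGraphConv(*)((snapshots = [2, 3],), [10, 20])` returns the two-element vector `[20, 60]`.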

Member

A paper citation is missing, and more explanation of how it works is needed.

Am I correct in saying that this applies `layer` independently to each temporal slice?

Member Author

Yes, you are right.
This constructor has been used in the following papers, for example:

And I do not know of any paper that defines it. I would say it is similar to the HeteroGraphConv for heterographs.

Member

> I would say it is similar to the HeteroGraphConv for heterographs.

Yes, although HeteroGraphConv in the end aggregates the messages coming from different edge types, while here there is no cross-talk between different time steps; that has to happen in separate modules (e.g. temporal attention in the last reference above).
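The difference can be illustrated with a toy sketch (hypothetical names, plain functions standing in for layers and graphs; the real HeteroGraphConv is more involved):

```julia
# Hetero-style: per-edge-type results are aggregated into a single output.
hetero_style(layers, graphs, x) = reduce(+, map((l, g) -> l(g, x), layers, graphs))

# Temporal-style: one output per snapshot, no aggregation across times.
temporal_style(layer, snapshots, xs) = map((g, x) -> layer(g, x), snapshots, xs)
```

With `f(g, x) = g * x`, `hetero_style([f, f], [2, 3], 10)` collapses to the single value `50`, whereas `temporal_style(f, [2, 3], [10, 10])` keeps one output per time step, `[20, 30]`.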

src/layers/temporalconv.jl
```julia
end

function (tgconv::TemporalGraphConv)(tsg::TemporalSnapshotsGNNGraph, x)
    return tgconv.layer.(tsg.snapshots, x)
```
Member

Also, broadcasting should be avoided here, since both `snapshots` and `x` could be expanded by it.

Member Author

Suggested change

```diff
- return tgconv.layer.(tsg.snapshots, x)
+ return map((a, b) -> tgconv.layer(a, b), tsg.snapshots, x)
```

Is this a better way to write the broadcast in this case?

Member

@CarloLucibello CarloLucibello Jan 30, 2024

Sorry, I got confused; it is the same, provided `x` is a vector of the same length as `snapshots`.
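A quick way to check that equivalence in plain Julia (no package needed; broadcasting and `map` only diverge when one argument needs expansion, e.g. a scalar or a differently shaped array):

```julia
# f.(xs, ys) and map(f, xs, ys) coincide when xs and ys are vectors
# of the same length.
f(a, b) = a + b
xs = [1, 2, 3]
ys = [10, 20, 30]
f.(xs, ys) == map(f, xs, ys)   # true for equal-length vectors
```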

aurorarossi and others added 2 commits January 30, 2024 10:32
Co-authored-by: Carlo Lucibello <[email protected]>
Co-authored-by: Carlo Lucibello <[email protected]>
@CarloLucibello
Member

I think that instead of introducing a new layer, we can extend all layers to take a TemporalSnapshotsGNNGraph as input and apply themselves timewise. The time dimension, in this way, is like a batch dimension.
It would be something like

```julia
function (l::GNNLayer)(g::TemporalSnapshotsGNNGraph, x::AbstractVector; kws...)
    y = ....
    return y
end
```

Maybe the operation could be made more efficient by batching the snapshots first and then unbatching the output y later, but this is a performance optimization that can be explored later.
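Filling in the body of the sketch above, the proposal could be implemented, for instance, as a simple map over the snapshots (a hedged sketch assuming the `GNNLayer` and `TemporalSnapshotsGNNGraph` types from GraphNeuralNetworks.jl and a `snapshots` field; not the final implementation):

```julia
using GraphNeuralNetworks  # assumed: provides GNNLayer and TemporalSnapshotsGNNGraph

# Sketch: every GNNLayer handles a TemporalSnapshotsGNNGraph by mapping
# itself over the snapshots; the time dimension acts like a batch dimension.
function (l::GNNLayer)(g::TemporalSnapshotsGNNGraph, x::AbstractVector; kws...)
    return map((sg, xi) -> l(sg, xi; kws...), g.snapshots, x)
end
```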

@aurorarossi
Copy link
Member Author

Ah ok, I see. Do you prefer one big PR adding support for TemporalSnapshotsGNNGraphs to all layers, or one PR for each layer?
In both cases I will close this PR.
Thanks!
