Releases: alibaba/graph-learn
v1.1.0
We are glad to announce several new features and improvements to graphlearn, including heterogeneous graph support in SubGraph-based GNNs, KNN support, HDFS support, new models, recommendation datasets, and evaluation metrics. We also introduce an online sampling module named Dynamic Graph Service (DGS) for online inference services. In addition, we have restructured the codebase to make it easier to follow: the training part now lives in graphlearn, and DGS lives in dynamic_graph_service.
New Features
- [DGS] Add DGS to GraphLearn. DGS is an online GNN inference service that supports real-time sampling on dynamic graphs with streaming graph updates. doc
- Add heterogeneous graph support for SubGraph-based GNN: add `HeteroSubGraph` and `HeteroConv`, and a bipartite GraphSAGE example.
- Add `nn.dataset` support for sparse data.
- Add edge feature support in both EgoGraph and SubGraph.
- Add HDFS graph source support.
- Add KNN operator based on faiss.
- Add AmazonBooks data and ego_bipartite_sage example.
- Add recommendation metrics: Recall, NDCG and HitRate.
- Add UltraGCN (CIKM 2021).
- Add hiactor-based graph engine implementation.
- Enable dumping the TensorFlow timeline in the trainer for profiling.
- Add a setting for negative sampling strictness.
- [pytorch] Add interface `get_stats` to get the number of nodes and edges on each graphlearn server, and refine PyG's GCN example.
- [pytorch] Add support for `HeteroData` in PyG 2.x.
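The recommendation metrics listed above (Recall, NDCG, HitRate) can be sketched in plain Python. This is a minimal, hypothetical reference implementation for a single user's ranked list, independent of graphlearn's actual API; the function names and signatures are illustrative, not the library's.

```python
import math

def hit_rate_at_k(ranked_items, ground_truth, k):
    """1.0 if any relevant item appears in the top-k ranking, else 0.0."""
    return 1.0 if any(i in ground_truth for i in ranked_items[:k]) else 0.0

def recall_at_k(ranked_items, ground_truth, k):
    """Share of the user's relevant items recovered in the top-k ranking."""
    if not ground_truth:
        return 0.0
    hits = sum(1 for i in ranked_items[:k] if i in ground_truth)
    return hits / len(ground_truth)

def ndcg_at_k(ranked_items, ground_truth, k):
    """Normalized discounted cumulative gain with binary relevance."""
    # DCG: each hit at 0-based rank r contributes 1 / log2(r + 2).
    dcg = sum(1.0 / math.log2(rank + 2)
              for rank, item in enumerate(ranked_items[:k])
              if item in ground_truth)
    # Ideal DCG: all relevant items packed at the top of the list.
    ideal = sum(1.0 / math.log2(rank + 2)
                for rank in range(min(len(ground_truth), k)))
    return dcg / ideal if ideal > 0 else 0.0
```

Averaging these per-user scores over all test users gives the usual dataset-level Recall@k, NDCG@k, and HitRate@k.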
Breaking changes
- Refactor SubGraph inducer to support any sampling and subgraph generation method.
Bugfixes
- Fix the trainer log when there is no data in an epoch.
- Fix the session not being closed as expected in DistTrainer.
- Fix `transform` in EgoGraph and change the window size.
- [pytorch] Get the real data length for PyTorch DDP distributed training.
v1.0.1
GraphLearn r0.4.0 provided a graph operation API and simple EgoGraph-based GNN models. Recently we found that more and more users need to implement custom algorithms. To simplify GNN algorithm development, we have built an algorithm framework for algorithm developers. This version supports both TF 1.12 and PyTorch, and is also compatible with PyG. The GNN programming framework supports both fixed-size neighbor sampling and full neighbor sampling, and provides complete examples and algorithm development documentation.
New Features
- Refine Graph Sampling Language (GSL).
- Refactor the model implementation to simplify the model development process, abstract the data layer and model layer, and provide complete algorithm development examples and documentation.
- Refine EgoGraph based models.
- Add a new model development paradigm based on SubGraph.
- Add support for OGB data.
- Add a link-prediction example using the SEAL algorithm.
- Add RGCN example.
- Add support for PyTorch.
- Add support for PyG.
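The v1.0.1 intro above distinguishes fixed-size neighbor sampling from full neighbor sampling. The contrast can be sketched in plain Python over an adjacency dict; this is an illustrative sketch only, not graphlearn's implementation, and the padding-by-replacement strategy shown for small neighborhoods is one common choice, assumed here for simplicity.

```python
import random

def full_neighbors(adj, node):
    """Full neighbor sampling: return every neighbor (variable-size output)."""
    return list(adj.get(node, []))

def fixed_size_neighbors(adj, node, k, seed=None):
    """Fixed-size sampling: return exactly k neighbors, so downstream tensors
    have a static shape. When the neighborhood has fewer than k nodes, we
    sample with replacement to pad (an assumed strategy for this sketch)."""
    rng = random.Random(seed)
    nbrs = adj.get(node, [])
    if not nbrs:
        return []
    if len(nbrs) >= k:
        return rng.sample(nbrs, k)          # without replacement
    return [rng.choice(nbrs) for _ in range(k)]  # pad with replacement
```

Fixed-size sampling is what makes EgoGraph-style models batchable into dense tensors, while full sampling preserves the exact neighborhood at the cost of ragged shapes.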