GraphTER: Unsupervised Learning of Graph Transformation Equivariant Representations via Auto-Encoding Node-Wise Transformations


Sep 29, 2020
Details
Authors: Xiang Gao, Wei Hu, Guo-Jun Qi

Description: Recent advances in Graph Convolutional Neural Networks (GCNNs) have shown their effectiveness for non-Euclidean data on graphs, but they often require large amounts of labeled data, which is costly to obtain. It is thus critical in practice to learn graph feature representations in an unsupervised manner. To this end, we propose GraphTER, a novel unsupervised method for learning Graph Transformation Equivariant Representations, which aims to capture intrinsic patterns of graph structure under both global and local transformations. Specifically, we sample different groups of nodes from a graph and transform them node-wise, either isotropically or anisotropically. We then self-train a representation encoder to capture the graph structure by reconstructing these node-wise transformations from the feature representations of the original and transformed graphs. In experiments, we apply the learned GraphTER to graphs of 3D point cloud data; results on point cloud segmentation and classification show that GraphTER significantly outperforms state-of-the-art unsupervised approaches and closes much of the gap to the upper bound set by fully supervised counterparts. The code is available at: https://github.com/gyshgx868/graph-ter.
