#graphembedding #machinelearning #skipgram
graph2vec proposes a technique to embed entire graphs into a high-dimensional vector space. It is inspired by the doc2vec learning approach and operates over graphs and their rooted subgraphs. It achieves significant improvements in classification and clustering accuracy over substructure representation learning approaches and is competitive with state-of-the-art graph kernels. Watch to know more :)
⏩ Abstract: Recent works on representation learning for graph structured data predominantly focus on learning distributed representations of graph substructures such as nodes and subgraphs. However, many graph analytics tasks such as graph classification and clustering require representing entire graphs as fixed length feature vectors. While the aforementioned approaches are naturally unequipped to learn such representations, graph kernels remain as the most effective way of obtaining them. However, these graph kernels use handcrafted features (e.g., shortest paths, graphlets, etc.) and hence are hampered by problems such as poor generalization. To address this limitation, in this work, we propose a neural embedding framework named graph2vec to learn data-driven distributed representations of arbitrary sized graphs. graph2vec's embeddings are learnt in an unsupervised manner and are task agnostic. Hence, they could be used for any downstream task such as graph classification, clustering and even seeding supervised representation learning approaches. Our experiments on several benchmark and large real-world datasets show that graph2vec achieves significant improvements in classification and clustering accuracies over substructure representation learning approaches and are competitive with state-of-the-art graph kernels.
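The core idea covered in the video is that each node's Weisfeiler-Lehman (WL) rooted subgraphs act as "words" and the whole graph acts as the "document" for a doc2vec-style skipgram model. As a rough illustration (not the authors' code; graph representation, labels, and hashing scheme are my own assumptions), the WL relabelling step behind GetWLSubgraph can be sketched like this:

```python
import hashlib

def get_wl_subgraphs(adj, labels, degree):
    """Collect rooted-subgraph labels for every node up to WL iteration
    `degree` (a sketch of the GetWLSubgraph idea, not the paper's code)."""
    current = dict(labels)                  # iteration-0 labels: original node labels
    vocab = {n: [l] for n, l in current.items()}
    for _ in range(degree):
        new = {}
        for node in adj:
            # New label = own label + sorted neighbour labels, compressed by hashing
            neigh = sorted(current[v] for v in adj[node])
            sig = current[node] + "|" + ",".join(neigh)
            new[node] = hashlib.md5(sig.encode()).hexdigest()[:8]
        current = new
        for node, lbl in current.items():
            vocab[node].append(lbl)
    return vocab

# Toy graph: a triangle (0, 1, 2) with a pendant node 3 attached to node 2
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
labels = {0: "A", 1: "A", 2: "B", 3: "A"}
subgraphs = get_wl_subgraphs(adj, labels, degree=2)
# Nodes 0 and 1 are structurally equivalent, so they get identical
# rooted-subgraph labels at every iteration; node 3 does not.
```

The multisets in `vocab` would then be fed, per graph, to a skipgram/doc2vec-style model to learn one fixed-length vector per graph.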
Please feel free to share out the content and subscribe to my channel :)
⏩ Subscribe - https://youtube.com/channel/UCoz8NrwgL7U9535VNc0mRPA?sub_confirmation=1
⏩ OUTLINE:
0:00 - Abstract
01:51 - Advantages of graph2vec approach
04:08 - Problem statement and notations
05:45 - Neural document embeddings model (doc2vec)
07:97 - Doc2Vec, Skipgram and Graph2Vec visual representation
09:22 - Advantages of rooted subgraph
11:03 - Algorithm 1 - Graph2Vec function walkthrough
12:49 - Algorithm 2 - GetWLSubgraph function walkthrough
13:41 - Wrap up
⏩ Paper Title: graph2vec: Learning Distributed Representations of Graphs
⏩ Paper: https://arxiv.org/pdf/1707.05005.pdf
⏩ Authors: Annamalai Narayanan, Mahinthan Chandramohan, Rajasekar Venkatesan, Lihui Chen, Yang Liu, Shantanu Jaiswal
⏩ Organisation: Nanyang Technological University, Singapore
⏩ IMPORTANT LINKS
Full Playlist on Machine Learning with Graphs: https://www.youtube.com/watch?v=-uJL_ANy1jc&list=PLsAqq9lZFOtU7tT6mDXX_fhv1R1-jGiYf
*********************************************
⏩ Youtube - https://www.youtube.com/c/TechVizTheDataScienceGuy
⏩ Blog - https://prakhartechviz.blogspot.com
⏩ LinkedIn - https://linkedin.com/in/prakhar21
⏩ Medium - https://medium.com/@prakhar.mishra
⏩ GitHub - https://github.com/prakhar21
⏩ Twitter - https://twitter.com/rattller
*********************************************
Tools I use for making videos :)
⏩ iPad - https://tinyurl.com/y39p6pwc
⏩ Apple Pencil - https://tinyurl.com/y5rk8txn
⏩ GoodNotes - https://tinyurl.com/y627cfsa
#techviz #datascienceguy #ml_with_graphs #node #representation #learning