The Graphs and more Complex structures for Learning and Reasoning (GCLR) workshop was held at AAAI 2021. For more details about the workshop, please visit the website: https://sites.google.com/view/gclr2021/.
Speaker's Bio: Stephen Bach is an Assistant Professor in the Computer Science Department at Brown University. His research focuses on weakly supervised machine learning, in which the goal is to train models without hand-labeled data. He also works on statistical relational learning and information extraction. In this talk, Stephen will discuss his recent work on learning knowledge graph representations for zero-shot learning in NLP and vision.
Title of the talk: Using Knowledge Graphs to Learn without Labels
Abstract: How can we automatically exploit the information in common sense knowledge graphs to create classifiers for new concepts without labeled training examples? I will discuss our recent work on methods for incorporating knowledge graphs into zero-shot learning. We introduce ZSL-KG, a framework for identifying concepts represented as graph nodes without any examples for those concepts. ZSL-KG is based on a novel class of graph neural networks called transformer GCNs. These networks use non-linear, permutation-invariant aggregators based on self-attention to better capture the rich information in knowledge graphs. This framework is completely inductive, meaning that new nodes and edges can be added to the knowledge graph at test time to describe novel classes. On computer vision and natural language processing tasks, ZSL-KG significantly outperforms (+5 percentage points of accuracy on average) prior general-purpose, graph-based methods. It also outperforms specialized methods developed for specific tasks.
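To give a flavor of the aggregators the abstract describes, here is a minimal NumPy sketch of a permutation-invariant neighborhood aggregator built from self-attention. This is an illustrative assumption of how such an aggregator can be structured, not the actual ZSL-KG implementation: the weight shapes, single attention head, and mean-pooling choice are all hypothetical.

```python
# Sketch (assumption, not the ZSL-KG code): aggregate a node's
# neighbor embeddings with self-attention, then pool. Self-attention
# is permutation-equivariant over the neighbor set, and pooling the
# attended outputs makes the final summary permutation-invariant.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_aggregate(neighbors, Wq, Wk, Wv):
    """Summarize a set of neighbor embeddings (n x d) into one vector."""
    Q, K, V = neighbors @ Wq, neighbors @ Wk, neighbors @ Wv
    scores = softmax(Q @ K.T / np.sqrt(K.shape[1]), axis=-1)
    attended = scores @ V          # one attended vector per neighbor
    return attended.mean(axis=0)   # pooling removes neighbor ordering

rng = np.random.default_rng(0)
d = 4
H = rng.normal(size=(5, d))        # 5 hypothetical neighbor embeddings
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

out = attention_aggregate(H, Wq, Wk, Wv)
shuffled = attention_aggregate(H[rng.permutation(5)], Wq, Wk, Wv)
assert np.allclose(out, shuffled)  # same summary for any neighbor order
```

The key contrast with a plain GCN layer is that the attention scores weight each neighbor non-linearly by its relevance to the others, rather than averaging all neighbors uniformly.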