We introduce FlyingSquid, a weak supervision framework that generates probabilistic training labels, runs orders of magnitude faster than previous weak supervision approaches, and requires fewer assumptions. We demonstrate that FlyingSquid matches or exceeds the quality of previous approaches without the need to tune an SGD procedure, recovers model parameters 170 times faster on average, and enables new video analysis and online learning applications.
Blog: http://hazyresearch.stanford.edu/flyingsquid
Source Code: https://github.com/HazyResearch/flyingsquid
Paper: https://arxiv.org/abs/2002.11955
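The speedup over SGD-based approaches comes from recovering labeling-function accuracies in closed form from pairwise agreement statistics. Below is a minimal, self-contained sketch of that "triplet" idea on synthetic data; it is an illustration of the method-of-moments principle, not the library's actual API (see the source code link above for real usage), and all names and the synthetic setup are our own.

```python
import numpy as np

# Sketch of closed-form accuracy recovery: for three conditionally
# independent labeling functions lambda_i in {-1, +1} with accuracy
# parameters a_i = E[lambda_i * Y], independence gives
#   E[lambda_i * lambda_j] = a_i * a_j,
# so each a_i is recoverable from observable pairwise moments alone,
# with no access to the true labels Y and no SGD.

rng = np.random.default_rng(0)
n = 200_000
y = rng.choice([-1, 1], size=n)            # latent true labels (never used for recovery)
true_acc = np.array([0.9, 0.75, 0.6])      # P(lambda_i == y), illustrative values

# Each labeling function agrees with y with probability true_acc[i].
L = np.stack([np.where(rng.random(n) < p, y, -y) for p in true_acc], axis=1)

# Observable pairwise agreement moments E[lambda_i * lambda_j].
m01 = np.mean(L[:, 0] * L[:, 1])
m02 = np.mean(L[:, 0] * L[:, 2])
m12 = np.mean(L[:, 1] * L[:, 2])

# Solve the triplet system in closed form: a_0 = sqrt(m01 * m02 / m12), etc.
a = np.array([
    np.sqrt(m01 * m02 / m12),
    np.sqrt(m01 * m12 / m02),
    np.sqrt(m02 * m12 / m01),
])

# Convert a_i = E[lambda_i * Y] = 2 * P(lambda_i == y) - 1 back to accuracies.
recovered_acc = (a + 1) / 2
print(recovered_acc)  # close to [0.9, 0.75, 0.6]
```

Because the estimate is just a few sample means and square roots, it runs in a single pass over the label matrix, which is why no learning-rate tuning or iterative optimization is needed.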