Conventional neural networks can approximate simple arithmetic operations, but fail to generalize beyond the range of numbers seen during training. Neural arithmetic units aim to overcome this difficulty, but current arithmetic units are either limited to operating on positive numbers or can only represent a subset of arithmetic operations. We introduce the Neural Power Unit (NPU) that operates on the full domain of real numbers and is capable of learning arbitrary power functions in a single layer. The NPU thus fixes the shortcomings of existing arithmetic units and extends their expressivity. This is achieved by internally using complex weights without requiring a conversion of the remaining network to complex numbers. We show that the NPU outperforms its competitors in terms of accuracy and sparsity on artificial arithmetic datasets. Additionally, we demonstrate how the NPU can be used as a highly interpretable model to discover the generating equation of a dynamical system purely from data.
Speakers: Niklas Maximilian Heim, Tomas Pevny, Vasek Smidl
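As a sketch of the mechanism described above: each input x_j can be treated as a complex number r_j·exp(iπk_j) with r_j = |x_j| and k_j = 1 for negative inputs, and a complex weight matrix W = W_re + i·W_im then lets the layer output the real part of exp(W·log x). The minimal NumPy snippet below illustrates this idea; it omits the initialization, regularization, and gating of a full unit, and the function and parameter names are illustrative, not the authors' API:

```python
import numpy as np

def npu(x, W_re, W_im):
    """Sketch of a single NPU-style forward pass.

    Treats each input x_j as r_j * exp(i*pi*k_j) with r_j = |x_j| and
    k_j = 1 for negative inputs, then returns the real part of
    exp(W @ log(x)) for a complex weight matrix W = W_re + i*W_im.
    """
    x = np.asarray(x, dtype=float)
    W_re, W_im = np.asarray(W_re), np.asarray(W_im)
    r = np.maximum(np.abs(x), 1e-12)   # clip so log stays finite at 0
    k = (x < 0).astype(float)          # 1 where the input is negative
    log_r = np.log(r)
    # Real and imaginary parts of W @ (log r + i*pi*k):
    re = W_re @ log_r - np.pi * (W_im @ k)
    im = W_im @ log_r + np.pi * (W_re @ k)
    return np.exp(re) * np.cos(im)     # real part of exp(W @ log x)

# A row with weights [1, 1] multiplies its two inputs; the cos term
# recovers the sign for negative operands:
print(npu([-2.0, 3.0], W_re=[[1.0, 1.0]], W_im=[[0.0, 0.0]]))  # ≈ [-6.]
# Fractional weights give power functions, e.g. a square root:
print(npu([2.0], W_re=[[0.5]], W_im=[[0.0]]))                  # ≈ [1.4142]
```

Real-valued weights suffice for products and powers of positive and negative inputs; the imaginary part W_im is what extends the unit beyond integer-sign behavior while the rest of the network stays real-valued.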