    Data Science Chair

    Our article "iNALU: Improved Neural Arithmetic Logic Unit" has been published

    29.09.2020

Our article "iNALU: Improved Neural Arithmetic Logic Unit" has recently been published in the Machine Learning and Artificial Intelligence section of the journal Frontiers in Artificial Intelligence.

    Abstract

Neural networks have to capture mathematical relationships in order to learn various tasks. They approximate these relations implicitly and therefore often do not generalize well. The recently proposed Neural Arithmetic Logic Unit (NALU) is a novel neural architecture whose units explicitly represent mathematical relationships, allowing it to learn operations such as summation, subtraction, or multiplication. Although NALUs have been shown to perform well on various downstream tasks, an in-depth analysis reveals practical design shortcomings, such as the inability to multiply or divide negative input values and training stability issues for deeper networks. We address these issues and propose an improved model architecture. We evaluate our model empirically in various settings, from learning basic arithmetic operations to more complex functions. Our experiments indicate that our model solves the stability issues and outperforms the original NALU model in terms of arithmetic precision and convergence.
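To illustrate the architecture the abstract refers to, below is a minimal sketch of the original NALU cell (Trask et al., 2018), the model that iNALU improves upon. It is written in PyTorch with illustrative parameter names, not taken from either paper's code; the multiplicative path operating on log(|x| + eps) is what causes the negative-input limitation mentioned above.

```python
# Minimal sketch of the original NALU cell; parameter names are illustrative.
import torch
import torch.nn as nn


class NALU(nn.Module):
    def __init__(self, in_dim, out_dim, eps=1e-7):
        super().__init__()
        self.eps = eps
        # Unconstrained parameters; the effective weight W = tanh(W_hat) * sigmoid(M_hat)
        # is pushed towards {-1, 0, 1}, so each unit selects, adds, or subtracts inputs.
        self.W_hat = nn.Parameter(torch.empty(out_dim, in_dim))
        self.M_hat = nn.Parameter(torch.empty(out_dim, in_dim))
        self.G = nn.Parameter(torch.empty(out_dim, in_dim))
        for p in (self.W_hat, self.M_hat, self.G):
            nn.init.xavier_uniform_(p)

    def forward(self, x):
        W = torch.tanh(self.W_hat) * torch.sigmoid(self.M_hat)
        add = x @ W.t()  # additive path: learned sums and differences
        # Multiplicative path in log space; using |x| + eps discards the sign,
        # which is one of the shortcomings iNALU addresses.
        mul = torch.exp(torch.log(torch.abs(x) + self.eps) @ W.t())
        g = torch.sigmoid(x @ self.G.t())  # gate between the two paths
        return g * add + (1 - g) * mul
```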

    You can find our article on BibSonomy and Frontiers.
