
Improved Neural Arithmetic Logic Unit (iNALU) - a layer to learn +, -, *, and / operations in a transparent way

FilipKubackiSoton opened this issue 3 years ago · 0 comments

Describe the feature and the current behavior/state.

iNALU (improved Neural Arithmetic Logic Unit) is a layer that learns addition, subtraction, multiplication, and division in a fully explainable/transparent way: its weights are biased toward discrete values, so the learned operation can be read off as an arithmetic expression. Because the layer computes actual arithmetic rather than approximating it with a generic nonlinearity, inference extrapolates well to OOD (out-of-distribution) inputs. A sketch of the mechanism is given below.
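For context, here is a minimal sketch of the mechanism in TensorFlow/Keras. It implements the original NALU formulation (a shared weight matrix gating an additive path and a log-space multiplicative path) plus one of iNALU's stability fixes (clipping the exponent); the full iNALU additionally uses independent weights per path, sign retrieval for the multiplicative path, and weight regularization. The class name `NALUSketch` and all hyperparameters are illustrative, not part of any existing API.

```python
import tensorflow as tf


class NALUSketch(tf.keras.layers.Layer):
    """Simplified NALU-style layer: a sketch of the core idea, not the
    full iNALU from the paper (which adds independent path weights,
    sign retrieval, and regularization)."""

    def __init__(self, units, eps=1e-7, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.eps = eps

    def build(self, input_shape):
        dim = int(input_shape[-1])
        # W = tanh(w_hat) * sigmoid(m_hat) biases entries toward {-1, 0, 1},
        # which is what makes the learned operation human-readable.
        self.w_hat = self.add_weight(
            shape=(dim, self.units), initializer="glorot_uniform", name="w_hat")
        self.m_hat = self.add_weight(
            shape=(dim, self.units), initializer="glorot_uniform", name="m_hat")
        self.g = self.add_weight(
            shape=(dim, self.units), initializer="glorot_uniform", name="g")

    def call(self, x):
        w = tf.tanh(self.w_hat) * tf.sigmoid(self.m_hat)
        # Additive path: plain linear map -> learns + and -.
        add_path = tf.matmul(x, w)
        # Multiplicative path in log space -> learns * and /.
        log_path = tf.matmul(tf.math.log(tf.abs(x) + self.eps), w)
        # Clipping the exponent is one of iNALU's stability fixes.
        mul_path = tf.exp(tf.minimum(log_path, 20.0))
        # A learned gate mixes the two paths per output unit.
        gate = tf.sigmoid(tf.matmul(x, self.g))
        return gate * add_path + (1.0 - gate) * mul_path
```

A hypothetical usage, training on `y = a * b` and then testing on inputs outside the training range to probe the OOD behavior described above:

```python
x = tf.random.uniform((1024, 2), 0.0, 5.0)
y = x[:, :1] * x[:, 1:]
model = tf.keras.Sequential([NALUSketch(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=50, verbose=0)
```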

Relevant information

  • Are you willing to contribute it (yes/no): yes (per the requirements for new contributions in CONTRIBUTING.md)
  • Are you willing to maintain it going forward? (yes/no): yes
  • Is there a relevant academic paper? (if so, where): yes, iNALU: Improved Neural Arithmetic Logic Unit (Schlör et al., 2020)
  • Does the relevant academic paper exceed 50 citations? (yes/no): no
  • Is there already an implementation in another framework? (if so, where): inalu
  • Was it part of tf.contrib? (if so, where):

Which API type would this fall under (layer, metric, optimizer, etc.)? layer

Who will benefit from this feature? Everyone

Any other info: the original NALU paper has 160+ citations. Thank you.

FilipKubackiSoton · Oct 05 '22 11:10