
Memory unit for enhancing memorization in autoencoders

Open · DanielWicz opened this issue 3 years ago · 0 comments

Describe the feature and the current behavior/state.

The feature would add a new layer type called MemoryUnit. This layer lets a network efficiently memorize prototypes of the data it was trained on. At the moment, the unit is used in MemAE (the memory-augmented autoencoder), but if it became more widely known it could be applied to other problems, much like the attention module became widespread after a while. A rough sketch of what the layer could look like is given below.
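A minimal sketch of such a layer in TensorFlow/Keras, following the memory-addressing scheme described in the MemAE paper (cosine-similarity addressing, softmax attention, and hard shrinkage for sparse addressing). The class name and hyperparameter names (`mem_dim`, `shrink_thres`) are illustrative, not a proposed addons API:

```python
import tensorflow as tf


class MemoryUnit(tf.keras.layers.Layer):
    """Sketch of a MemAE-style memory module.

    Stores `mem_dim` learnable prototype vectors and reconstructs each
    input latent code as a sparse convex combination of them.
    """

    def __init__(self, mem_dim, shrink_thres=0.0025, **kwargs):
        super().__init__(**kwargs)
        self.mem_dim = mem_dim
        self.shrink_thres = shrink_thres

    def build(self, input_shape):
        feature_dim = int(input_shape[-1])
        # Memory bank: one row per stored prototype vector.
        self.memory = self.add_weight(
            name="memory",
            shape=(self.mem_dim, feature_dim),
            initializer="glorot_uniform",
            trainable=True,
        )

    def call(self, z):
        # Addressing weights from cosine similarity between z and memory items.
        att = tf.matmul(
            tf.math.l2_normalize(z, axis=-1),
            tf.math.l2_normalize(self.memory, axis=-1),
            transpose_b=True,
        )  # (batch, mem_dim)
        att = tf.nn.softmax(att, axis=-1)

        # Hard shrinkage encourages sparse addressing, then re-normalize.
        if self.shrink_thres > 0:
            att = (tf.nn.relu(att - self.shrink_thres) * att
                   / (tf.abs(att - self.shrink_thres) + 1e-12))
            att = att / (tf.reduce_sum(att, axis=-1, keepdims=True) + 1e-12)

        # Reconstruct the latent code from the memory items.
        z_hat = tf.matmul(att, self.memory)  # (batch, feature_dim)
        return z_hat, att
```

In a MemAE-style setup this layer would sit between the encoder and decoder; the returned attention weights can additionally feed an entropy regularizer to further encourage sparsity, as done in the paper.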

Relevant information

  • Are you willing to contribute it (yes/no): No
  • Are you willing to maintain it going forward? (yes/no): No
  • Is there a relevant academic paper? (if so, where): https://arxiv.org/abs/1904.02639
  • Does the relevant academic paper exceed 50 citations? (yes/no): yes (550 citations)
  • Is there already an implementation in another framework? (if so, where): https://github.com/donggong1/memae-anomaly-detection/blob/master/models/memory_module.py (class called MemoryUnit)
  • Was it part of tf.contrib? (if so, where): No

Which API type would this fall under (layer, metric, optimizer, etc.)? Layer

Who will benefit from this feature? People who use autoencoders for anomaly detection, and researchers studying data memorization in neural networks.

Any other info.

DanielWicz · Sep 06 '22 07:09