Learning to remember more with less memorization
Version 2 2024-06-05, 02:32
Version 1 2019-09-05, 09:06
conference contribution
posted on 2024-06-05, 02:32 authored by H. Le, Truyen Tran, Svetha Venkatesh

© 7th International Conference on Learning Representations, ICLR 2019. All Rights Reserved.

Memory-augmented neural networks, consisting of a neural controller and an external memory, have shown potential in long-term sequential learning. Current RAM-like memory models access memory at every timestep and therefore do not effectively leverage the short-term memory held in the controller. We hypothesize that this writing scheme is suboptimal in memory utilization and introduces redundant computation. To validate our hypothesis, we derive a theoretical bound on the amount of information stored in a RAM-like system and formulate an optimization problem that maximizes the bound. The proposed solution, dubbed Uniform Writing, is proved optimal under the assumption of equal timestep contributions. To relax this assumption, we introduce modifications to the original solution, resulting in a method termed Cached Uniform Writing, which balances maximizing memorization against forgetting via overwriting mechanisms. Through an extensive set of experiments, we empirically demonstrate the advantages of our solutions over other recurrent architectures, achieving state-of-the-art results on various sequential modeling tasks.
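The abstract describes the mechanism but not its equations, so the following is a minimal sketch of the core idea in Python, under stated assumptions: a toy RNN controller with fixed random weights, one memory write per slot at evenly spaced timesteps (standing in for the paper's uniform write intervals), and a simple mean over cached controller states standing in for the paper's cache-summarization step. All class, function, and parameter names here (`uniform_write_positions`, `ToyMemoryAugmentedRNN`, `num_slots`) are illustrative, not taken from the authors' code.

```python
import numpy as np

def uniform_write_positions(seq_len: int, num_writes: int) -> set:
    """Evenly spaced write timesteps over a sequence of length seq_len.

    With num_writes writes available, each write summarizes roughly
    seq_len / num_writes consecutive timesteps, rather than writing
    to memory at every single step.
    """
    interval = seq_len / num_writes
    return {int(round((i + 1) * interval)) - 1 for i in range(num_writes)}

class ToyMemoryAugmentedRNN:
    """Simplified RAM-like memory plus RNN controller (illustrative only)."""

    def __init__(self, hidden_size: int, num_slots: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.hidden_size = hidden_size
        self.num_slots = num_slots
        # Fixed random controller weights; a real model would learn these.
        self.W_h = rng.standard_normal((hidden_size, hidden_size)) * 0.1
        self.W_x = rng.standard_normal((hidden_size, hidden_size)) * 0.1

    def run(self, inputs: np.ndarray) -> np.ndarray:
        """Process inputs of shape (T, hidden_size); write to external
        memory only at uniformly spaced timesteps instead of every step."""
        T = inputs.shape[0]
        memory = np.zeros((self.num_slots, self.hidden_size))
        write_steps = uniform_write_positions(T, self.num_slots)
        h = np.zeros(self.hidden_size)
        cache = []  # short-term cache of controller states between writes
        slot = 0
        for t in range(T):
            h = np.tanh(self.W_h @ h + self.W_x @ inputs[t])
            cache.append(h)
            if t in write_steps:
                # Cached-style write: summarize the cached states into one
                # slot, overwriting its previous content. A plain mean is
                # used here; the paper describes a learned summary instead.
                memory[slot] = np.mean(cache, axis=0)
                slot += 1
                cache.clear()
        return memory

mem = ToyMemoryAugmentedRNN(hidden_size=8, num_slots=4).run(
    np.random.default_rng(1).standard_normal((100, 8)))
print(mem.shape)  # (4, 8): four slots, each summarizing ~25 timesteps
```

With 100 timesteps and 4 slots, the sketch writes at steps 24, 49, 74, and 99, so the controller's short-term memory carries information between writes and the external memory is touched only 4 times instead of 100; this spacing reflects the equal-contribution assumption under which the abstract says Uniform Writing is optimal.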
History
Location: New Orleans, Louisiana
Start date: 2019-05-06
End date: 2019-05-09
Language: eng
Publication classification: E1 Full written paper - refereed
Copyright notice: 2019, 7th International Conference on Learning Representations, ICLR 2019
Title of proceedings: ICLR 2019: Proceedings of the 7th International Conference on Learning Representations
Event: Learning Representations. International Conference (7th : 2019 : New Orleans, Louisiana)
Publisher: ICLR
Place of publication: [New Orleans, Louisiana]
Publication URL: