Journal articles
2 articles found
1. Language Model Using Differentiable Neural Computer Based on Forget Gate-Based Memory Deallocation
Authors: Donghyun Lee, Hosung Park, Soonshin Seo, Changmin Kim, Hyunsoo Son, Gyujin Kim, Ji-Hwan Kim. Computers, Materials & Continua (SCIE, EI), 2021, No. 7, pp. 537-551 (15 pages).
A differentiable neural computer (DNC) is analogous to a Von Neumann machine, with a neural network controller that interacts with an external memory through an attention mechanism. Such DNCs offer a generalized method for task-specific deep learning models and have demonstrated reliability on reasoning problems. In this study, we apply a DNC to a language model (LM) task. The LM task is a reasoning problem, because it predicts the next word from the preceding word sequence. However, memory deallocation is a problem in DNCs: some information unrelated to the input sequence is not deallocated and remains in the external memory, which degrades performance. Therefore, we propose a forget gate-based memory deallocation (FMD) method, which searches for the minimum value among the elements of a forget gate-based retention vector. The forget gate-based retention vector indicates the retention degree of the information stored at each external memory address. In experiments, we applied the proposed architecture to LM tasks as a task-specific example and to rescoring for speech recognition as a general-purpose example. For LM tasks, we evaluated the DNC on the Penn Treebank and enwik8 LM tasks. Although it does not yield SOTA results, the FMD method shows a relative improvement over the baseline DNC in terms of bits-per-character. For the speech recognition rescoring task, FMD again showed a relative improvement on the LibriSpeech data in terms of word error rate.
Keywords: forget gate-based memory deallocation; differentiable neural computer; language model; forget gate-based retention vector
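The abstract describes the core idea of FMD as locating the minimum element of a forget gate-based retention vector, where each element reflects how strongly the information at one external memory address should be retained. The paper's equations are not given here, so the following is only a minimal illustrative sketch of that idea under stated assumptions: the function name, the hard-erasing of the selected slot, and all shapes and values are hypothetical and simplified relative to the actual FMD mechanism inside a DNC's allocation weighting.

```python
import numpy as np

def fmd_deallocate(memory, retention):
    """Illustrative sketch of forget gate-based memory deallocation.

    memory:    (N, W) external memory matrix, N slots of width W.
    retention: (N,) retention vector in [0, 1]; a low value means the
               forget gates deem that slot's contents stale.
    Returns a copy of the memory with the least-retained slot cleared,
    plus the index of that slot.
    """
    # Find the address with the minimum retention value ...
    stale = int(np.argmin(retention))
    # ... and "deallocate" it here by simply erasing its contents so a
    # later write can reuse the address (a simplification of FMD).
    memory = memory.copy()
    memory[stale, :] = 0.0
    return memory, stale

# Toy usage: 4 memory slots of width 3; slot 2 has the lowest retention.
mem = np.arange(12, dtype=float).reshape(4, 3)
psi = np.array([0.9, 0.7, 0.1, 0.8])
new_mem, freed = fmd_deallocate(mem, psi)
print(freed)  # -> 2
```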
2. A phenomenological memristor model for synaptic memory and learning behaviors
Authors: 邵楠, 张盛兵, 邵舒渊. Chinese Physics B (SCIE, EI, CAS, CSCD), 2017, No. 11, pp. 526-536 (11 pages).
Properties similar to the memory and learning functions of biological systems have been observed and reported in experimental studies of memristors fabricated from different materials. These properties include the forgetting effect, the transition from short-term memory (STM) to long-term memory (LTM), and learning-experience behavior. A mathematical model of this kind of memristor is important for its theoretical analysis and application design. In analyzing the existing memristor model with these properties, we find that some behaviors of the model are inconsistent with the reported experimental observations. We therefore propose a phenomenological memristor model for this kind of memristor. The model design is based on the forgetting effect and the STM-to-LTM transition, since these are two typical properties of such memristors. Further analysis shows that the model can also be used directly, or modified, to describe other experimentally observed behaviors. Simulations show that the proposed model describes the reported memory and learning behaviors of this kind of memristor better than the existing model.
Keywords: memristor model; forgetting effect; transition from short-term memory (STM) to long-term memory (LTM); learning-experience behavior
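The abstract names two behaviors the model is built around: a forgetting effect (the memory state relaxes when stimulation stops) and an STM-to-LTM transition (repeated stimulation makes the state decay more slowly). The authors' actual equations are not reproduced in this listing; the sketch below is a generic toy simulation of those two behaviors only, with a hypothetical state variable and made-up parameters (tau_short, tau_long, k_learn, consolidation), not the model proposed in the paper.

```python
import numpy as np

def simulate_memory(pulses, dt=1e-3, tau_short=0.5, tau_long=50.0,
                    k_learn=5.0, consolidation=0.02):
    """Toy simulation of forgetting and STM-to-LTM consolidation.

    w(t) in [0, 1] stands in for the device's memory level.  Between
    stimulation pulses w decays exponentially (forgetting effect);
    each pulse both raises w and lengthens its decay time constant,
    mimicking the reported STM-to-LTM transition.
    """
    w, tau = 0.0, tau_short
    trace = []
    for v in pulses:                      # v: applied stimulus at each step
        if v > 0.0:
            # Learning: the pulse drives w toward its upper bound ...
            w += k_learn * v * (1.0 - w) * dt
            # ... and consolidates the state by slowing future decay.
            tau = min(tau_long, tau * (1.0 + consolidation))
        else:
            # Forgetting: exponential relaxation toward the ground state.
            w -= (w / tau) * dt
        trace.append(w)
    return np.array(trace)

# Usage: a burst of pulses followed by a rest period shows fast learning,
# then slow forgetting from the consolidated (longer-tau) state.
stim = np.concatenate([np.ones(200), np.zeros(2000)])
trace = simulate_memory(stim)
print(trace[199], trace[-1])  # peak level, then partially retained level
```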