Abstract
In-memory computing is a promising approach to accelerating the massive data-computing tasks of artificial intelligence (AI) and breaking through the memory wall. In this work, we propose a 2T1C DRAM structure for in-memory computing. It integrates a monolayer graphene transistor, a monolayer MoS₂ transistor, and a capacitor in a two-transistor-one-capacitor (2T1C) configuration. In this structure, the storage node occupies a position similar to that in one-transistor-one-capacitor (1T1C) dynamic random-access memory (DRAM), while an additional graphene transistor accomplishes nondestructive readout of the stored information. Furthermore, the ultralow leakage current of the MoS₂ transistor enables multi-level voltages to be stored on the capacitor with a long retention time. Owing to the excellent linearity of the graphene transistor, the stored charges can effectively tune its channel conductance, so that linear analog multiplication can be realized. Because of the almost unlimited cycling endurance of DRAM, our 2T1C DRAM has great potential for in situ training and recognition, which can significantly improve the recognition accuracy of neural networks.
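To make the analog-multiplication idea concrete, the following minimal Python sketch models a 2T1C cell behaviorally: the multi-level voltage held on the capacitor is assumed to tune the graphene readout transistor's conductance linearly, so the read current is the product of a stored weight and an applied read voltage, and summing the currents of cells sharing a read line yields a multiply-accumulate. All names and parameter values (G0, K, TAU) are illustrative assumptions, not measured values from this work.

```python
import numpy as np

# Behavioral sketch (illustrative, not the authors' model): the capacitor
# voltage V_store linearly tunes the graphene readout-transistor conductance,
# so the read current realizes an analog multiplication I = G(V_store) * V_read.

G0 = 10e-6   # hypothetical baseline conductance (S)
K = 50e-6    # hypothetical conductance tuning slope (S/V)
TAU = 10.0   # hypothetical retention time constant (s), set by MoS2 leakage

def conductance(v_store):
    """Graphene channel conductance, assumed linear in the stored voltage."""
    return G0 + K * v_store

def read_current(v_store, v_read):
    """Nondestructive readout: current through the graphene transistor."""
    return conductance(v_store) * v_read

def retained_voltage(v_store, t):
    """Stored multi-level voltage decaying through MoS2 off-state leakage."""
    return v_store * np.exp(-t / TAU)

# A row of cells on a shared read line performs a multiply-accumulate:
weights = np.array([0.2, 0.5, 0.8, 0.3])  # multi-level stored voltages (V)
inputs = np.array([0.1, 0.4, 0.2, 0.3])   # read voltages encoding inputs (V)
i_total = read_current(weights, inputs).sum()
print(f"accumulated read current: {i_total * 1e6:.3f} uA")
```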
Funding
This work was supported by the National Key Research and Development Program (2021YFA1200500), in part by the Innovation Program of Shanghai Municipal Education Commission (2021-01-07-00-07-E00077), and by the Shanghai Municipal Science and Technology Commission (21DZ1100900).