
Efficient stochastic parallel gradient descent training for on-chip optical processor (cited by: 1)

Abstract: In recent years, space-division multiplexing (SDM) technology, which transmits data on multiple parallel channels for efficient capacity scaling, has been widely used in fiber and free-space optical communication systems. To enable flexible data management and cope with mixing between different channels, an integrated reconfigurable optical processor is used for optical switching and for mitigating channel crosstalk. However, efficient online training becomes intricate and challenging, particularly when dealing with a large number of channels. Here we use the stochastic parallel gradient descent (SPGD) algorithm to configure the integrated optical processor, which requires less computation than the traditional gradient descent (GD) algorithm. We design and fabricate a 6×6 on-chip optical processor on a silicon platform to implement optical switching and descrambling, assisted by online training with the SPGD algorithm. Moreover, we apply the on-chip processor configured by the SPGD algorithm to optical communications for optical switching and efficient mitigation of channel crosstalk in SDM systems. Compared with the traditional GD algorithm, the SPGD algorithm shows better performance, especially when the matrix scale is large, which means it has the potential to optimize large-scale optical matrix computation acceleration chips.
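SPGD's computational appeal is that it estimates an update for the whole parameter vector from only two cost-function measurements per iteration: all parameters are perturbed simultaneously with random ±δ, and the single scalar cost difference scales the perturbation to form the update. The following is a minimal toy sketch of that update rule; the cost function, parameter count, and step sizes are hypothetical stand-ins, not the paper's actual on-chip setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the measured cost: mean-squared distance between the
# current phase-shifter settings u and an unknown optimum. (36 phases is
# just an illustrative figure for a small mesh, not the device's count.)
n_phases = 36
optimum = rng.uniform(0.0, 2.0 * np.pi, size=n_phases)

def cost(u):
    return float(np.mean((u - optimum) ** 2))

def spgd_step(u, delta=0.3, gain=1.0):
    """One SPGD iteration: perturb ALL parameters at once, then update
    every parameter from a single scalar cost difference."""
    perturb = delta * rng.choice([-1.0, 1.0], size=u.shape)
    d_cost = cost(u + perturb) - cost(u - perturb)  # two measurements total
    return u - gain * d_cost * perturb              # move opposite the rise

u = np.zeros(n_phases)
for _ in range(3000):
    u = spgd_step(u)

print(f"residual cost: {cost(u):.2e}")
```

Note that the per-iteration measurement count stays at two regardless of how many phase shifters are optimized, whereas a finite-difference GD estimate needs measurements proportional to the parameter count, which is consistent with SPGD's advantage growing with matrix scale.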
Source: Opto-Electronic Advances (SCIE, EI, CAS, CSCD), 2024, Issue 4, pp. 5-15 (11 pages).
Funding: Supported by the National Natural Science Foundation of China (NSFC) (62125503, 62261160388), the Natural Science Foundation of Hubei Province of China (2023AFA028), and the Innovation Project of Optics Valley Laboratory (OVL2021BG004).