Journal Articles
3 articles found
1. Divergent Projection Patterns Revealed by Reconstruction of Individual Neurons in Orbitofrontal Cortex (cited 7 times)
Authors: Junjun Wang, Pei Sun, Xiaohua Lv, Sen Jin, Anan Li, Jianxia Kuang, Ning Li, Yadong Gang, Rui Guo, Shaoqun Zeng, Fuqiang Xu, Yu-Hui Zhang
Neuroscience Bulletin (SCIE, CAS, CSCD), 2021, No. 4, pp. 461-477 (17 pages)
The orbitofrontal cortex (OFC) is involved in diverse brain functions via its extensive projections to multiple target regions. There is a growing understanding of the overall outputs of the OFC at the population level, but reports of the projection patterns of individual OFC neurons across different cortical layers remain rare. Here, by combining sparse, bright neuronal labeling with a whole-brain fluorescence imaging system (fMOST), we obtained an uninterrupted three-dimensional whole-brain dataset and achieved full morphological reconstruction of 25 OFC pyramidal neurons. We compared the whole-brain projection targets of these individual OFC neurons both across and within cortical layers. We found cortical layer-dependent projections characterized by divergent patterns of information delivery. Our study not only provides a structural basis for understanding the principles of laminar organization in the OFC, but also offers clues for future functional and behavioral studies of OFC pyramidal neurons.
Keywords: orbitofrontal cortex; whole-brain imaging; morphological reconstruction; output projection pattern
2. Measurement on Camber Deformation of Wings of Free-flying Dragonflies and Beating-flying Dragonflies
Authors: Deqiang Song (1,2), Lijiang Zeng (1)
1. State Key Laboratory of Precision Measurement Technology and Instruments, Tsinghua University, Beijing 100084, P.R. China; 2. 9500 Gilman Dr. 0409, University of California, San Diego, CA 92093, USA
Journal of Bionic Engineering (SCIE, EI, CSCD), 2004, No. 1, pp. 41-45 (5 pages)
Knowledge of wing orientation and deformation during flapping flight is necessary for a complete aerodynamic analysis, but to date these kinematic features have not been simultaneously quantified for free-flying insects. A projected comb-fringe (PCF) method has been developed for measuring spanwise camber changes on free-flying and beating-flying dragonflies through the course of a wingbeat; it is based on projecting a fringe pattern over the whole measurement area and then measuring the wing deformation from the distorted fringe pattern. Experimental results demonstrate substantial camber changes both along the wingspan and through the course of a wingbeat. For a free-flying dragonfly, the ratio of camber deformation to chord length for the hind wing reaches 0.11 at 75% span at a flapping angle of -0.66 degrees.
Keywords: free flight; fringe pattern projection; insect flight; wing orientation; wing camber
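The camber metric reported in this abstract, the ratio of camber deformation to chord length, can be computed once a chordwise wing profile has been recovered from the distorted fringe pattern. The following is a minimal sketch of that final step only (not the PCF reconstruction itself), assuming the profile is available as sampled (x, z) points from leading to trailing edge; the function name and sampling convention are illustrative, not from the paper.

```python
import numpy as np

def camber_ratio(profile):
    """Camber-to-chord ratio for one chordwise wing section.

    profile: (N, 2) array of (x, z) points ordered from leading edge
    to trailing edge. The chord line joins the first and last points;
    camber is the maximum perpendicular deviation from that line.
    """
    p = np.asarray(profile, dtype=float)
    le, te = p[0], p[-1]                  # leading and trailing edge
    chord_vec = te - le
    chord_len = np.linalg.norm(chord_vec)
    rel = p - le
    # 2D cross product gives (signed) area; dividing by chord length
    # yields each point's perpendicular distance from the chord line.
    dist = np.abs(rel[:, 0] * chord_vec[1] - rel[:, 1] * chord_vec[0]) / chord_len
    return dist.max() / chord_len
```

For a parabolic arc of height 0.11 over a unit chord, the function returns 0.11, matching the hind-wing value quoted in the abstract.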
3. Generative deep-learning-embedded asynchronous structured light for three-dimensional imaging
Authors: Lei Lu, Chenhao Bu, Zhilong Su, Banglei Guan, Qifeng Yu, Wei Pan, Qinghui Zhang
Advanced Photonics (SCIE, EI, CAS, CSCD), 2024, No. 4, pp. 45-58 (14 pages)
Three-dimensional (3D) imaging with structured light is crucial in diverse scenarios, ranging from intelligent manufacturing and medicine to entertainment. However, current structured light methods rely on projector-camera synchronization, limiting the use of affordable imaging devices and their consumer applications. In this work, we introduce an asynchronous structured light imaging approach based on generative deep neural networks that relaxes the synchronization constraint and addresses the challenge of fringe pattern aliasing without relying on any a priori constraint of the projection system. To this end, we propose a generative deep neural network with a U-Net-like encoder-decoder architecture that learns the underlying fringe features directly by exploiting the intrinsic priors in fringe pattern aliasing. We train the network within an adversarial learning framework and supervise training with a statistics-informed loss function. We demonstrate its performance on intensity, phase, and 3D reconstruction: the trained network separates aliased fringe patterns and produces results comparable to those of the synchronous approach, with an absolute error no greater than 8 μm and a standard deviation not exceeding 3 μm. Evaluation on multiple objects and pattern types shows that the approach can be generalized to any asynchronous structured light scene.
Keywords: structured light; fringe pattern projection; asynchrony; deep learning; generative neural networks; three-dimensional imaging
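The "synchronous" baseline this abstract compares against recovers phase from cleanly captured phase-shifted fringe images. A minimal sketch of that standard N-step phase-shifting calculation (not the paper's generative network), assuming N equally spaced phase shifts and images modeled as I_k = A + B*cos(phi + 2*pi*k/N):

```python
import numpy as np

def wrapped_phase(images):
    """Wrapped phase map from N phase-shifted fringe images.

    images: array of shape (N, H, W), the k-th image captured with an
    added phase shift of 2*pi*k/N. Returns phase in (-pi, pi].
    Standard N-step algorithm (valid for N >= 3): the DC term A cancels
    when summing against sin/cos of the equally spaced shifts.
    """
    imgs = np.asarray(images, dtype=float)
    n = imgs.shape[0]
    deltas = 2 * np.pi * np.arange(n) / n
    num = np.tensordot(np.sin(deltas), imgs, axes=1)  # ∝ -B*sin(phi)
    den = np.tensordot(np.cos(deltas), imgs, axes=1)  # ∝  B*cos(phi)
    return -np.arctan2(num, den)
```

In the asynchronous setting the captured frames mix two projected patterns (aliasing), which is why the paper substitutes a learned separation network before a step like this can be applied.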