Funding: the National Natural Science Foundation of China (Grants 61827825, 31770924, 31470056, and 31600692), the Science Fund for Creative Research Group of China (61721092), and the Director Fund of Wuhan National Laboratory for Optoelectronics.
Abstract: The orbitofrontal cortex (OFC) is involved in diverse brain functions via its extensive projections to multiple target regions. There is a growing understanding of the overall outputs of the OFC at the population level, but reports of the projection patterns of individual OFC neurons across different cortical layers remain rare. Here, by combining sparse and bright neuronal labeling with a whole-brain fluorescence imaging system (fMOST), we obtained an uninterrupted three-dimensional whole-brain dataset and achieved full morphological reconstruction of 25 OFC pyramidal neurons. We compared the whole-brain projection targets of these individual OFC neurons both across different cortical layers and within the same cortical layer. We found cortical layer-dependent projections characterized by divergent patterns of information delivery. Our study not only provides a structural basis for understanding the principles of laminar organization in the OFC, but also offers clues for future functional and behavioral studies of OFC pyramidal neurons.
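As a purely illustrative sketch of the kind of layer-wise comparison described in this abstract (not the authors' actual analysis pipeline), the Python snippet below tabulates, for a few hypothetical reconstructed neurons, which target regions each cortical layer projects to. The input format, region acronyms, and axon-length threshold are all assumptions.

```python
# Hypothetical sketch: compare projection targets of reconstructed neurons by cortical layer.
# The input format, region names, and threshold are illustrative assumptions, not the
# authors' actual data structures.
from collections import defaultdict

# Each entry: (neuron_id, soma_layer, {target_region: total_axonal_length_in_um})
reconstructions = [
    ("OFC_01", "L2/3", {"MOs": 4200.0, "CP": 310.0, "AId": 980.0}),
    ("OFC_02", "L5",   {"MOs": 1500.0, "CP": 5200.0, "MD": 2100.0}),
    ("OFC_03", "L5",   {"CP": 4800.0, "MD": 1800.0, "AId": 120.0}),
]

MIN_AXON_UM = 500.0  # assumed threshold for calling a region a genuine projection target

def layer_projection_profiles(recs, min_axon_um=MIN_AXON_UM):
    """Count, per cortical layer, how many neurons project to each target region."""
    profiles = defaultdict(lambda: defaultdict(int))
    for _, layer, targets in recs:
        for region, axon_um in targets.items():
            if axon_um >= min_axon_um:
                profiles[layer][region] += 1
    return {layer: dict(regions) for layer, regions in profiles.items()}

if __name__ == "__main__":
    for layer, regions in layer_projection_profiles(reconstructions).items():
        print(layer, regions)
```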
Abstract: Knowledge of wing orientation and deformation during flapping flight is necessary for a complete aerodynamic analysis, but to date these kinematic features have not been simultaneously quantified for free-flying insects. A projected comb-fringe (PCF) method has been developed for measuring spanwise camber changes on free-flying and beating-flying dragonflies through the course of a wingbeat; it is based on projecting a fringe pattern over the whole measurement area and then recovering the wing deformation from the distorted fringe pattern. Experimental results demonstrate substantial camber changes both along the wingspan and through the course of a wingbeat. For a free-flying dragonfly, the ratio of camber deformation to chord length for the hind wing reaches 0.11 at the 75% spanwise position at a flapping angle of -0.66 degrees.
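For readers unfamiliar with the camber metric quoted above, here is a hedged Python sketch (not the PCF paper's exact procedure) that computes a camber-to-chord ratio from one sampled chordwise cross-section of a wing; the coordinates, units, and sampling density are illustrative assumptions.

```python
# Hypothetical sketch: camber-to-chord ratio of a wing cross-section at one spanwise station.
# Coordinates, units, and the example profile are illustrative assumptions.
import numpy as np

def camber_to_chord_ratio(x, z):
    """x, z: chordwise position and out-of-plane height of points along one cross-section.

    Camber is taken here as the maximum deviation of the section from the straight chord
    line joining the leading and trailing edges, normalized by the chord length.
    """
    x = np.asarray(x, dtype=float)
    z = np.asarray(z, dtype=float)
    # Chord line from leading edge (first point) to trailing edge (last point).
    chord_vec = np.array([x[-1] - x[0], z[-1] - z[0]])
    chord_len = np.hypot(*chord_vec)
    # Perpendicular distance of every sampled point from the chord line.
    rel = np.stack([x - x[0], z - z[0]], axis=1)
    cross = rel[:, 0] * chord_vec[1] - rel[:, 1] * chord_vec[0]
    dist = np.abs(cross) / chord_len
    return dist.max() / chord_len

# Example: a mildly cambered section sampled at 11 chordwise points (millimetres).
x = np.linspace(0.0, 10.0, 11)
z = 0.9 * np.sin(np.pi * x / 10.0)  # peak deflection ~0.9 mm over a 10 mm chord
print(f"camber/chord = {camber_to_chord_ratio(x, z):.3f}")  # ~0.09
```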
Funding: the National Natural Science Foundation of China (Grant Nos. 62375078 and 12002197), the Youth Talent Launching Program of Shanghai University, the General Science Foundation of Henan Province (Grant No. 222300420427), the Key Research Project Plan for Higher Education Institutions in Henan Province (Grant No. 24ZX011), and the National Key Laboratory of Ship Structural Safety.
Abstract: Three-dimensional (3D) imaging with structured light is crucial in diverse scenarios, ranging from intelligent manufacturing and medicine to entertainment. However, current structured light methods rely on projector-camera synchronization, limiting the use of affordable imaging devices and their consumer applications. In this work, we introduce an asynchronous structured light imaging approach based on generative deep neural networks that relaxes the synchronization constraint and addresses the resulting challenge of fringe pattern aliasing, without relying on any a priori constraint of the projection system. To this end, we propose a generative deep neural network with a U-Net-like encoder-decoder architecture that learns the underlying fringe features directly by exploiting the intrinsic prior principles in the aliased fringe patterns. We train the network within an adversarial learning framework and supervise it via a statistics-informed loss function. We demonstrate the method by evaluating its performance in terms of intensity, phase, and 3D reconstruction. The trained network can separate aliased fringe patterns and produces results comparable to those of the synchronous approach: the absolute error is no greater than 8 μm, and the standard deviation does not exceed 3 μm. Evaluation results on multiple objects and pattern types show that the approach can be generalized to any asynchronous structured light scene.
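As a hedged illustration of the U-Net-like encoder-decoder mentioned in this abstract (a minimal sketch, not the authors' network), the PyTorch snippet below maps one aliased fringe image to two separated fringe channels. The channel counts, depth, and input size are assumptions, and the adversarial and statistics-informed losses described in the abstract are not implemented here.

```python
# Minimal PyTorch sketch of a U-Net-like encoder-decoder that takes a single aliased
# fringe image and predicts two separated fringe patterns. Widths, depth, and training
# details are illustrative assumptions, not the network described in the abstract.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class FringeSeparator(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(1, 32)
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, 2, 1)  # two output channels: the two separated patterns

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))  # skip connection from enc2
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection from enc1
        return self.head(d1)

if __name__ == "__main__":
    net = FringeSeparator()
    aliased = torch.randn(1, 1, 128, 128)   # one aliased fringe image
    separated = net(aliased)                # -> shape (1, 2, 128, 128)
    print(separated.shape)
```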