Sensory drive, the concept that sensory systems primarily evolve under the influence of environmental features and that animal signals are evolutionarily shaped and tuned by these previously existing sensory systems, has been thoroughly studied with regard to visual signals across many animals. Much of this work has focused on spectral aspects of vision and signals. Here, I review work on polarized light signals of animals and relate these to what is known of polarization visual systems, polarized light aspects of visual scenes, and polarization-related behavior (e.g., orientation, habitat finding, contrast enhancement). Other than the broad patterns of scattered polarized light in the sky, most polarization in both terrestrial and aquatic environments results from either reflection or scattering in the horizontal plane. With overhead illumination, horizontal features such as the surfaces of many leaves or of air:water interfaces reflect horizontal polarization, and water scatters horizontally polarized light under most conditions. Several animal species have been demonstrated to use horizontally polarized light fields or features in critical aspects of their biology. Significantly, most biological signals are also horizontally polarized. Here, I present relevant polarization-related behavior and discuss the hypothesis that sensory drive has evolutionarily influenced the structure of polarization signals. The paper also considers the evolutionary origin of circular polarization vision and circularly polarized signals; it appears that this class of signals did not evolve under the influence of sensory drive. The study of signals based on polarized light is becoming a mature field of research.
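The physical basis for this pattern is standard Fresnel optics rather than anything specific to the review: at Brewster's angle the reflected p (in-plane) component vanishes, so the reflection is fully s-polarized, and for a horizontal interface the s direction lies horizontal. A minimal worked example, assuming an air–water interface:

\[
r_p = \frac{n_2 \cos\theta_i - n_1 \cos\theta_t}{n_2 \cos\theta_i + n_1 \cos\theta_t},
\qquad
r_p = 0 \;\Longleftrightarrow\; \theta_i = \theta_B = \arctan\frac{n_2}{n_1}.
\]

For air over water, \(n_1 = 1\) and \(n_2 \approx 1.33\), giving \(\theta_B \approx 53^\circ\); near this incidence angle, glare from water or from waxy leaf surfaces is almost entirely horizontally polarized, which is consistent with the horizontally polarized fields and features the abstract describes.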
Most state-of-the-art robotic cars' perception systems are quite different from the way a human driver understands traffic environments. First, humans assimilate information from the traffic scene mainly through visual perception, while machine perception of traffic environments must fuse information from several different kinds of sensors to meet safety-critical requirements. Second, a robotic car requires nearly 100% correct perception results for autonomous driving, whereas an experienced human driver copes well with dynamic traffic environments in which machine perception can easily produce noisy results. In this paper, we propose a vision-centered multi-sensor fusion framework for traffic environment perception in autonomous driving, which fuses camera, LIDAR, and GIS information consistently via both geometrical and semantic constraints for efficient self-localization and obstacle perception. We also discuss robust machine vision algorithms that have been successfully integrated with the framework, addressing multiple levels of machine vision techniques, from collecting training data, efficiently processing sensor data, and extracting low-level features, to higher-level object and environment mapping. The proposed framework has been tested extensively in actual urban scenes with our self-developed robotic cars for eight years. The empirical results validate its robustness and efficiency.
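At the geometrical level, camera–LIDAR fusion of the kind the abstract describes typically rests on a two-step constraint: a rigid-body extrinsic transform from the LIDAR frame into the camera frame, followed by a pinhole projection onto the image. The sketch below illustrates only that generic step; the function and calibration values (project_lidar_to_image, T_lidar_to_cam, K) are hypothetical stand-ins, not the authors' implementation.

import numpy as np

def project_lidar_to_image(points_lidar, T_lidar_to_cam, K):
    """Project 3-D LIDAR points into pixel coordinates.

    points_lidar   : (N, 3) points in the LIDAR frame.
    T_lidar_to_cam : (4, 4) extrinsic rigid-body transform (from calibration).
    K              : (3, 3) camera intrinsic matrix.
    Returns (M, 2) pixel coordinates for points in front of the camera.
    """
    n = points_lidar.shape[0]
    # Homogeneous coordinates: (N, 4).
    pts_h = np.hstack([points_lidar, np.ones((n, 1))])
    # Rigid-body transform into the camera frame.
    pts_cam = (T_lidar_to_cam @ pts_h.T).T[:, :3]
    # Keep only points in front of the image plane (z > 0).
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]
    # Pinhole projection: u = fx*x/z + cx, v = fy*y/z + cy.
    uv = (K @ pts_cam.T).T
    return uv[:, :2] / uv[:, 2:3]

# Hypothetical calibration values, for illustration only.
K = np.array([[720.0, 0.0, 640.0],
              [0.0, 720.0, 360.0],
              [0.0, 0.0, 1.0]])
T = np.eye(4)  # identity stands in for a real extrinsic calibration
pixels = project_lidar_to_image(np.random.rand(100, 3) * 20.0, T, K)

Projected points that land inside the image can then be checked against semantic constraints, for example whether a projected LIDAR obstacle overlaps an image region detected as a vehicle, which is one plausible way the geometrical and semantic constraints mentioned above can be combined.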
Funding: supported by the National Key Program Project of China (No. 2016YFB1001004) and the National Natural Science Foundation of China (Nos. 91320301 and 61273252).