
Feature extraction of image stack and its application in nematode classification
(图像栈的特征提取以及在线虫分类中的应用) · Cited by: 1
Abstract: The effective information in a 3D multi-focal image stack is distributed across different image layers, so feature extraction and classification of image stacks differ substantially from those of ordinary 2D images. To address image-stack classification, a multilinear feature extraction and classification method based on multi-direction image fusion is proposed. First, image fusion is used to obtain fused images of the 3D stack along several orthogonal directions, and features are extracted from these fused images. Then, canonical correlation analysis (CCA) is applied to combine the features extracted from the fused images of different directions, and the resulting canonical correlation features are used for classification. Furthermore, because image-stack data involve several factors that influence classification, such as sample, class and fusion direction, the multi-direction image fusion method is embedded into a multilinear analysis framework so that the interactions among these factors are modelled jointly. Experiments on a nematode image-stack dataset show that the proposed method achieves a recognition rate of 97.0%, demonstrating its high accuracy.
Authors: 王学平 (Wang Xueping), 刘敏 (Liu Min)
Source: Journal of Electronic Measurement and Instrumentation (《电子测量与仪器学报》, CSCD, Peking University Core Journal), 2017, No. 11, pp. 1753-1759 (7 pages)
Funding: National Natural Science Foundation of China (61301254); Natural Science Foundation of Hunan Province (14JJ3069)
Keywords: 3D image stack; multilinear analysis; image fusion; image classification
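The abstract above outlines a three-step pipeline: fuse the multi-focal stack along orthogonal directions, extract features from each fused view, and combine the per-direction features with canonical correlation analysis (CCA) before classification. The Python sketch below only illustrates that flow under stated assumptions: the max-intensity projection used as the fusion rule, the grey-level-histogram features, and all parameter values are illustrative stand-ins rather than the authors' implementation, and only two of the three fused views are paired because scikit-learn's CCA couples two feature sets at a time.

```python
# Minimal sketch of a multi-direction fusion + CCA feature pipeline,
# assuming each sample is a 3D multi-focal stack of shape (Z, H, W)
# with intensities normalized to [0, 1]. Fusion rule, features and
# parameters are illustrative assumptions, not the paper's method.
import numpy as np
from sklearn.cross_decomposition import CCA


def fuse_along_axes(stack):
    """Fuse the stack along the three orthogonal directions.

    Max-intensity projection is a simple stand-in for the paper's
    image-fusion step; axis 0/1/2 give the XY, XZ and YZ views.
    """
    return [stack.max(axis=a) for a in range(3)]


def simple_features(img, bins=32):
    """Toy feature vector: normalized grey-level histogram of a fused view."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0), density=True)
    return hist


def cca_combined_features(stacks, labels, n_components=8):
    """Extract per-direction features and combine two views with CCA.

    `stacks` is a list of (Z, H, W) arrays, one per sample; requires at
    least `n_components` samples. Returns the concatenated canonical
    variates and the label array.
    """
    view1, view2 = [], []
    for s in stacks:
        fused = fuse_along_axes(s)
        view1.append(simple_features(fused[0]))
        view2.append(simple_features(fused[1]))
    X1, X2 = np.asarray(view1), np.asarray(view2)

    cca = CCA(n_components=n_components)
    Z1, Z2 = cca.fit_transform(X1, X2)
    # Concatenate the canonical variates as the combined representation.
    return np.hstack([Z1, Z2]), np.asarray(labels)
```

A standard classifier (e.g., an SVM or nearest-neighbour model) can then be trained on the combined canonical features; the multilinear step that models sample, class and direction interactions jointly is not reproduced in this sketch.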

相关作者

内容加载中请稍等...

相关机构

内容加载中请稍等...

相关主题

内容加载中请稍等...

浏览历史

内容加载中请稍等...
;
使用帮助 返回顶部