Journal Articles
9 articles found
1. Evaluation of Two Absolute Radiometric Normalization Algorithms for Pre-processing of Landsat Imagery (Cited by: 13)
Author: 徐涵秋 | Journal of China University of Geosciences | SCIE, CSCD | 2006, No. 2, pp. 146-150, 157 (6 pages)
In order to evaluate radiometric normalization techniques, two image normalization algorithms for absolute radiometric correction of Landsat imagery were quantitatively compared: the Illumination Correction Model proposed by Markham and Irish, and the Illumination and Atmospheric Correction Model developed by the Remote Sensing and GIS Laboratory of Utah State University. Relative noise, correlation coefficient, and slope were used as the criteria for evaluation and comparison; all three were derived from pseudo-invariant features identified in multitemporal Landsat image pairs of the Xiamen (厦门) and Fuzhou (福州) areas, both located in eastern Fujian (福建) Province, China. Compared with the unnormalized images, the radiometric differences between the normalized multitemporal images were significantly reduced when the images came from different seasons. However, there was no significant difference between the normalized and unnormalized images under similar seasonal conditions. Furthermore, the correction results of the two algorithms are similar when the images are relatively clear with a uniform atmospheric condition. Therefore, radiometric normalization should be carried out if the multitemporal images have a significant seasonal difference.
Keywords: Landsat; radiometric correction; data normalization; pseudo-invariant features; image processing
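The three evaluation criteria translate directly into code. Below is a minimal sketch, assuming two co-registered single-band arrays and a boolean pseudo-invariant-feature (PIF) mask; the function names and the particular relative-noise formula are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def pif_agreement(band_t1, band_t2, pif_mask):
    """Compare pseudo-invariant feature (PIF) pixels of two dates.

    band_t1, band_t2 : 2-D arrays of the same band at two dates
    pif_mask         : boolean array marking PIF pixels
    Returns (correlation coefficient, regression slope, relative noise).
    """
    x = band_t1[pif_mask].astype(float)
    y = band_t2[pif_mask].astype(float)

    r = np.corrcoef(x, y)[0, 1]         # correlation coefficient
    slope = np.polyfit(x, y, 1)[0]      # slope of the y = a*x + b fit
    # One simple relative-noise figure: RMS difference over mean PIF level
    rel_noise = np.sqrt(np.mean((y - x) ** 2)) / np.mean((x + y) / 2)
    return r, slope, rel_noise

# Toy usage with synthetic data
rng = np.random.default_rng(0)
t1 = rng.uniform(30, 200, (100, 100))
t2 = 0.95 * t1 + rng.normal(0, 2, t1.shape)  # nearly invariant targets
mask = np.zeros_like(t1, dtype=bool)
mask[::10, ::10] = True                      # sparse PIF sample
print(pif_agreement(t1, t2, mask))
```

After normalization, a slope near 1 and high correlation over the PIFs indicate the two dates have been brought onto a common radiometric scale.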
2. Ensembling Neural Networks for User’s Indoor Localization Using Magnetic Field Data from Smartphones (Cited by: 1)
Authors: Imran Ashraf, Soojung Hur, Yousaf Bin Zikria, Yongwan Park | Computers, Materials & Continua | SCIE, EI | 2021, No. 8, pp. 2597-2620 (24 pages)
The localization accuracy of magnetic field-based approaches is limited mainly by two factors: smartphone heterogeneity and short data lengths. The use of multifarious smartphones cripples the performance of such approaches owing to the variability of the magnetic field data. In the same vein, shorter magnetic field sequences decrease the localization accuracy substantially. The current study proposes the use of multiple neural networks, namely a deep neural network (DNN), a long short-term memory network (LSTM), and a gated recurrent unit network (GRN), to perform indoor localization based on the smartphone's embedded magnetic sensor. A voting scheme is introduced that combines the predictions of the neural networks to estimate the user's current location. Contrary to conventional magnetic field-based localization approaches that rely on magnetic field intensity, this study utilizes normalized magnetic field data. Training of the neural networks is carried out using Galaxy S8 data, while testing is performed with three devices: LG G7, Galaxy S8, and LG Q6. Experiments are performed during different times of the day to analyze the impact of time variability. Results indicate that the proposed approach minimizes the impact of smartphone variability and elevates the localization accuracy. Performance comparison with three existing approaches reveals that the proposed approach outperforms them in mean, 50th-percentile, and 75th-percentile error, even while using less magnetic field data.
Keywords: indoor localization; magnetic field data; long short-term memory network; data normalization; gated recurrent unit network; deep learning
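The two central ideas, normalizing away device-dependent intensity and voting across networks, are easy to sketch. A minimal illustration follows, assuming three already-trained classifiers exposing a scikit-learn-style predict() method and location labels as classes; all names are hypothetical and the paper's actual network architectures and weighting are not reproduced.

```python
import numpy as np
from collections import Counter

def normalize_magnetic(samples):
    """Scale each magnetic-field sample vector to unit norm, so models
    see the field's 'shape' rather than device-dependent intensity."""
    samples = np.asarray(samples, dtype=float)
    norms = np.linalg.norm(samples, axis=1, keepdims=True)
    return samples / np.where(norms == 0, 1, norms)

def vote(models, samples):
    """Majority vote over the per-model location predictions."""
    all_preds = [m.predict(samples) for m in models]
    return [Counter(per_sample).most_common(1)[0][0]
            for per_sample in zip(*all_preds)]

class _Dummy:  # stand-in for a trained DNN / LSTM / GRN
    def __init__(self, answer): self.answer = answer
    def predict(self, X): return [self.answer] * len(X)

models = [_Dummy("cell_3"), _Dummy("cell_3"), _Dummy("cell_7")]
X = normalize_magnetic([[22.0, -5.0, 40.0], [18.0, -3.0, 44.0]])
print(vote(models, X))  # -> ['cell_3', 'cell_3']: two of three models agree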
3. Monitoring Soil Salt Content Using HJ-1A Hyperspectral Data: A Case Study of Coastal Areas in Rudong County, Eastern China (Cited by: 5)
Authors: LI Jianguo, PU Lijie, ZHU Ming, DAI Xiaoqing, XU Yan, CHEN Xinjian, ZHANG Lifang, ZHANG Runsen | Chinese Geographical Science | SCIE, CSCD | 2015, No. 2, pp. 213-223 (11 pages)
Hyperspectral data are an important source for monitoring soil salt content on a large scale. However, in previous studies, barriers such as interference from vegetation restricted the precision of soil salt content mapping. This study tested a new method for predicting soil salt content with improved precision by using Chinese hyperspectral data from the Huan Jing-Hyper Spectral Imager (HJ-HSI) in the coastal area of Rudong County, Eastern China. The vegetation-covered area and the coastal bare flat area were distinguished using the normalized differential vegetation index at 705 nm (NDVI705), and the soil salt content of each area was predicted by separate algorithms. A Normal Soil Salt Content Response Index (NSSRI) was constructed from continuum-removed reflectance (CR-reflectance) at wavelengths of 908.95 nm and 687.41 nm to predict the soil salt content in the coastal bare flat area (NDVI705 < 0.2). The soil adjusted salinity index (SAVI) was applied to predict the soil salt content in the vegetation-covered area (NDVI705 ≥ 0.2). The results demonstrate that 1) the new method significantly improves the accuracy of soil salt content mapping (R² = 0.6396, RMSE = 0.3591), and 2) HJ-HSI data can be used to map soil salt content precisely and are suitable for monitoring soil salt content on a large scale.
Keywords: soil salt content; normalized differential vegetation index (NDVI); hyperspectral data; Huan Jing-Hyper Spectral Imager (HJ-HSI); coastal area; eastern China
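The NDVI705-based branching rule can be sketched compactly. In the snippet below, the red-edge NDVI formula using the 750 nm and 705 nm bands is one common formulation; the NSSRI stand-in is a plain normalized difference, since the abstract does not give the published formula, and all array names are illustrative.

```python
import numpy as np

def ndvi705(r750, r705):
    """Red-edge NDVI; one common formulation uses the 750 nm and 705 nm bands."""
    return (r750 - r705) / (r750 + r705 + 1e-12)

def nssri_standin(cr909, cr687):
    """Stand-in for NSSRI built from continuum-removed reflectance at
    908.95 nm and 687.41 nm; a normalized difference is used purely
    for illustration (the exact formula is in the paper, not the abstract)."""
    return (cr909 - cr687) / (cr909 + cr687 + 1e-12)

def route_pixels(r750, r705, cr909, cr687, savi):
    """Apply the abstract's rule: bare flats (NDVI705 < 0.2) use the
    NSSRI predictor, vegetated pixels (NDVI705 >= 0.2) use SAVI."""
    v = ndvi705(r750, r705)
    return np.where(v < 0.2, nssri_standin(cr909, cr687), savi)

# Toy per-pixel reflectance values (two pixels: bare flat, vegetated)
r750 = np.array([0.18, 0.45]); r705 = np.array([0.17, 0.20])
cr909 = np.array([0.60, 0.70]); cr687 = np.array([0.40, 0.50])
savi = np.array([0.05, 0.33])  # precomputed soil adjusted salinity index
print(route_pixels(r750, r705, cr909, cr687, savi))
```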
4. An Evolutionary Normalization Algorithm for Signed Floating-Point Multiply-Accumulate Operation
Authors: Rajkumar Sarma, Cherry Bhargava, Ketan Kotecha | Computers, Materials & Continua | SCIE, EI | 2022, No. 7, pp. 481-495 (15 pages)
In the era of digital signal processing, as in graphics and computation systems, multiply-accumulate is one of the prime operations. A MAC unit is a vital component of a digital system, appearing in Fast Fourier Transform (FFT) algorithms, convolution, image processing algorithms, and so on. Normalization architectures are used very widely in digital signal processing; the main purpose of normalization is to perform comparison and shift operations. In this paper, an evolutionary approach for designing an optimized normalization algorithm is proposed using basic logical blocks such as multiplexers and adders. The proposed normalization algorithm is then used to design an 8×8-bit Signed Floating-Point Multiply-Accumulate (SFMAC) architecture. Since the SFMAC accepts an 8-bit significand and a 3-bit exponent, its input can lie between −(7.96872)₁₀ and +(7.96872)₁₀. The proposed architecture is designed and implemented in Cadence Virtuoso using 90 nm and 130 nm technologies (Generic Process Design Kit (GPDK) and Taiwan Semiconductor Manufacturing Company (TSMC), respectively). To reduce the power consumption of the proposed normalization architecture, techniques such as “block enabling” and “clock gating” are used rigorously. According to the analysis done in Cadence, the proposed architecture uses the least power compared with its current predecessors.
Keywords: data normalization; Cadence Virtuoso; signed floating-point MAC; evolutionary optimized algorithm; block enabling; clock gating
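Normalization in a floating-point datapath amounts to a leading-zero count, a shift, and an exponent adjustment. A minimal behavioral sketch follows; the bit widths echo the abstract's 8-bit significand and 3-bit exponent, but this illustrates the generic operation, not the paper's gate-level design.

```python
def normalize(significand: int, exponent: int, width: int = 8):
    """Left-shift an unsigned significand until its MSB is 1 and
    decrement the exponent accordingly (generic FP normalization).

    Returns (normalized significand, adjusted exponent).
    """
    if significand == 0:
        return 0, 0                    # canonical zero
    msb = 1 << (width - 1)
    while not (significand & msb):     # the compare-and-shift loop that a
        significand <<= 1              # hardware leading-zero counter plus
        exponent -= 1                  # barrel shifter resolves in one step
    return significand & ((1 << width) - 1), exponent

# Example: 0b00010110 with exponent 3 normalizes to 0b10110000, exponent 0
sig, exp = normalize(0b00010110, 3)
print(f"{sig:08b}", exp)
```

In hardware the loop is replaced by combinational logic, which is precisely where comparison and shift blocks, and hence the power-saving techniques the paper targets, come in.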
5. Pretreating and normalizing metabolomics data for statistical analysis (Cited by: 1)
Authors: Jun Sun, Yinglin Xia | Genes & Diseases | SCIE, CSCD | 2024, No. 3, pp. 188-205 (18 pages)
Metabolomics, as a research field and a set of techniques, studies the entire complement of small molecules in biological samples. Metabolomics is emerging as a powerful tool for precision medicine in general; in particular, integration of the microbiome and the metabolome has revealed the mechanisms and functionality of the microbiome in human health and disease. However, metabolomics data are very complicated, and preprocessing/pretreatment and normalization procedures are usually required before statistical analysis. In this review article, we comprehensively review the methods used to preprocess and pretreat metabolomics data, including MS-based and NMR-based data preprocessing, handling of zero and/or missing values, outlier detection, data normalization, data centering and scaling, and data transformation. We discuss the advantages and limitations of each method. The choice of a suitable preprocessing method is determined by the biological hypothesis, the characteristics of the data set, and the selected statistical analysis method. We then provide a perspective on their applications in microbiome and metabolome research.
Keywords: data centering and scaling; data normalization; data transformation; missing values; MS-based data preprocessing; NMR data preprocessing; outliers; preprocessing/pretreatment
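Imputation, transformation, centering, and scaling are easy to make concrete. A minimal sketch follows, assuming a samples × metabolites matrix with NaN marking missing values; half-minimum imputation, log transformation, and autoscaling are common choices in this literature, shown as one possible pipeline rather than the review's prescribed one.

```python
import numpy as np

def pretreat(X):
    """Toy metabolomics pretreatment: impute, transform, autoscale.

    X : (n_samples, n_metabolites) array with NaN for missing values.
    """
    X = np.array(X, dtype=float)
    # 1) Half-minimum imputation per metabolite (a common missing-value fix)
    for j in range(X.shape[1]):
        col = X[:, j]
        col[np.isnan(col)] = np.nanmin(col) / 2.0
    # 2) Log transformation to tame right-skewed intensity distributions
    X = np.log(X + 1.0)
    # 3) Autoscaling: mean-center, then divide by each metabolite's std
    return (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

demo = [[10.0, np.nan, 300.0],
        [12.0, 55.0,  290.0],
        [ 9.5, 60.0,  np.nan]]
print(pretreat(demo).round(2))
```

Each step has alternatives (e.g., scaling by range or by the square root of the standard deviation), which is exactly the design space the review surveys.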
6. An Ensemble of Optimal Deep Learning Features for Brain Tumor Classification (Cited by: 2)
Authors: Ahsan Aziz, Muhammad Attique, Usman Tariq, Yunyoung Nam, Muhammad Nazir, Chang-Won Jeong, Reham R. Mostafa, Rasha H. Sakr | Computers, Materials & Continua | SCIE, EI | 2021, No. 11, pp. 2653-2670 (18 pages)
Owing to technological developments, medical image analysis has received considerable attention for the rapid detection and classification of diseases. The brain is an essential human organ, and brain tumors cause loss of memory, vision, and naming ability. In 2020, approximately 18,020 deaths occurred due to brain tumors. These cases can be minimized if a brain tumor is diagnosed at a very early stage. Computer vision researchers have introduced several techniques for brain tumor detection and classification, but owing to many factors this remains a challenging task. The challenges relate to tumor size, tumor shape, tumor location, and the selection of important features, among others. In this study, we propose a framework for multimodal brain tumor classification using an ensemble of optimal deep learning features. In the proposed framework, a database is first normalized into high-grade glioma (HGG) and low-grade glioma (LGG) patients, and two pre-trained deep learning models (ResNet50 and DenseNet201) are chosen. The deep learning models were modified and trained using transfer learning. Subsequently, an enhanced ant colony optimization algorithm is proposed for best feature selection from both deep models. The selected features are fused using a serial-based approach and classified using a cubic support vector machine. The experimental process was conducted on the BraTS2019 dataset and achieved accuracies of 87.8% and 84.6% for HGG and LGG, respectively. A comparison with several classification methods shows the significance of the proposed technique.
Keywords: brain tumor; data normalization; transfer learning; feature optimization; feature fusion
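Serial fusion followed by a cubic-kernel SVM is straightforward to sketch with scikit-learn. The snippet below assumes two already-extracted and already-selected deep feature matrices (random stand-ins here); the ant colony optimization selection step is omitted, since its details live in the paper, not the abstract.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Stand-ins for selected deep features from the two backbones
feats_resnet = rng.normal(size=(200, 64))    # e.g., ResNet50-derived
feats_densenet = rng.normal(size=(200, 64))  # e.g., DenseNet201-derived
labels = rng.integers(0, 2, 200)             # 0 = LGG, 1 = HGG (toy labels)

# Serial (concatenation-based) fusion of the two feature sets
fused = np.hstack([feats_resnet, feats_densenet])

Xtr, Xte, ytr, yte = train_test_split(fused, labels,
                                      test_size=0.3, random_state=0)

# 'Cubic SVM' = support vector machine with a degree-3 polynomial kernel
clf = SVC(kernel="poly", degree=3).fit(Xtr, ytr)
print("toy accuracy:", clf.score(Xte, yte))  # ~chance on random features
```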
7. GITAR: An Open Source Tool for Analysis and Visualization of Hi-C Data (Cited by: 1)
Authors: Riccardo Calandrelli, Qiuyang Wu, Jihong Guan, Sheng Zhong | Genomics, Proteomics & Bioinformatics | SCIE, CAS, CSCD | 2018, No. 5, pp. 365-372 (8 pages)
Interactions between chromatin segments play a large role in functional genomic assays, and developments in genomic interaction detection methods have revealed interacting topological domains within the genome. Among these methods, Hi-C plays a key role. Here, we present the Genome Interaction Tools and Resources (GITAR), a software suite for comprehensive Hi-C data analysis, including data preprocessing, normalization, and visualization, as well as analysis of topologically-associated domains (TADs). GITAR is composed of two main modules: (1) HiCtool, a Python library to process and visualize Hi-C data, including TAD analysis; and (2) a processed-data library, a large collection of human and mouse datasets processed using HiCtool. HiCtool leads the user step by step through a pipeline that goes from raw Hi-C data to the computation, visualization, and optimized storage of intra-chromosomal contact matrices and TAD coordinates. A large collection of standardized processed data allows users to compare different datasets in a consistent way, while saving the time needed to obtain data for visualization or additional analyses. More importantly, GITAR enables users without any programming or bioinformatics expertise to work with Hi-C data. GITAR is publicly available at http://genomegitar.org as open-source software.
Keywords: chromatin interaction; pipeline; Hi-C data normalization; topologically-associated domain; processed Hi-C data library
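Hi-C normalization commonly means matrix balancing: dividing out per-bin coverage biases so that each genomic bin contributes comparably. Below is a generic sketch in the spirit of iterative correction (ICE-style balancing); it is emphatically not HiCtool's implementation or API, whose actual normalization is documented at the project site.

```python
import numpy as np

def iterative_balance(M, n_iter=50, eps=1e-10):
    """Generic iterative correction of a symmetric Hi-C contact matrix:
    repeatedly divide by row/column coverage until row sums are ~equal."""
    W = np.array(M, dtype=float)
    for _ in range(n_iter):
        cov = W.sum(axis=1)
        nz = cov > eps
        cov = cov / cov[nz].mean()   # rescale so the mean bias is 1
        cov[~nz] = 1.0               # leave empty bins untouched
        W /= np.outer(cov, cov)      # symmetric bias removal
    return W

raw = np.array([[10.,  4.,  1.],
                [ 4., 40.,  6.],
                [ 1.,  6., 20.]])
bal = iterative_balance(raw)
print(bal.round(2))
print(bal.sum(axis=1).round(2))      # near-uniform row sums
```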
8. Generalized T3-plot for testing high-dimensional normality (Cited by: 1)
Authors: Mingyao AI, Jiajuan LIANG, Man-Lai TANG | Frontiers of Mathematics in China | SCIE, CSCD | 2016, No. 6, pp. 1363-1378 (16 pages)
A new dimension-reduction graphical method for testing high-dimensional normality is developed by using the theory of spherical distributions and the idea of principal component analysis. The dimension reduction is realized by projecting high-dimensional data onto selected eigenvector directions. The asymptotic statistical independence of the plotting functions on the selected eigenvector directions provides the principle for the new plot. A departure from multivariate normality of the raw data can be captured by at least one plot on a selected eigenvector direction. Acceptance regions associated with the plots are provided to enhance their interpretability. Monte Carlo studies and an illustrative example show that the proposed graphical method has competitive power and significantly improves on the existing graphical method for testing high-dimensional normality.
Keywords: dimension reduction; graphical method; high-dimensional data; multivariate normality; spherical distribution
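The projection step is easy to illustrate. The sketch below projects a data matrix onto the leading eigenvectors of the sample covariance and checks each one-dimensional projection for normality; a Shapiro-Wilk test stands in for the paper's T3-plot statistic, which is not reproduced here.

```python
import numpy as np
from scipy import stats

def eigen_projections(X, k=3):
    """Project centered data onto the top-k covariance eigenvectors."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)           # ascending eigenvalues
    top = vecs[:, np.argsort(vals)[::-1][:k]]  # leading k directions
    return Xc @ top

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 20))                 # truly Gaussian toy data
for i, proj in enumerate(eigen_projections(X).T):
    w, p = stats.shapiro(proj)
    print(f"direction {i}: Shapiro-Wilk p = {p:.3f}")  # large p: no departure
```

The paper's point is that a departure from joint normality must surface in at least one such projection, so examining a few directions (with proper acceptance regions) suffices.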
9. Asymptotic Normality of Wavelet Density Estimator under Censored Dependent Observations
Author: Si-li NIU | Acta Mathematicae Applicatae Sinica | SCIE, CSCD | 2012, No. 4, pp. 781-794 (14 pages)
In this paper, we discuss the asymptotic normality of the wavelet estimator of the density function based on censored data, when the survival and censoring times form a stationary α-mixing sequence. To approximate the distribution of the estimator, so that statistical inference for the density function is easy to perform, a random weighted estimator of the density function is also constructed and investigated. Finite-sample behavior of the estimator is investigated via simulation as well.
Keywords: wavelet density estimator; asymptotic normality; censored data; α-mixing; random weighted estimator
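A linear wavelet density estimator is compact enough to sketch. Below is a toy Haar-basis version for complete (uncensored, i.i.d.) data on [0, 1]; the paper's censored, α-mixing setting and its random-weighting construction are not reproduced.

```python
import numpy as np

def haar_density(x_eval, sample, j=4):
    """Linear wavelet density estimate with the Haar scaling function:
    f_hat(x) = sum_k a_{j,k} * phi_{j,k}(x), a_{j,k} = mean of phi_{j,k}(X_i),
    phi_{j,k}(x) = 2^{j/2} * 1{k <= 2^j x < k+1}.  For Haar this reduces
    to a histogram with 2^j equal-width bins on [0, 1]."""
    scale = 2 ** j
    bins = np.clip(np.floor(scale * np.asarray(sample)).astype(int),
                   0, scale - 1)
    # Empirical scaling coefficients a_{j,k}
    coeff = np.bincount(bins, minlength=scale) / len(sample) * np.sqrt(scale)
    k_eval = np.clip(np.floor(scale * np.asarray(x_eval)).astype(int),
                     0, scale - 1)
    return coeff[k_eval] * np.sqrt(scale)

rng = np.random.default_rng(3)
data = rng.beta(2, 5, size=2000)          # a density supported on [0, 1]
grid = np.linspace(0.01, 0.99, 5)
print(haar_density(grid, data).round(2))  # rough pointwise density values
```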