Funding: Supported by the Major State Basic Research Development Program of China ("973" Program, No. 2010CB731502)
Abstract: A data processing method was proposed for eliminating the end restraint in triaxial tests of soil. A digital image processing method was used to calculate the local deformations and local stresses for any region on the surface of triaxial soil specimens. The principle and implementation of this digital image processing method were introduced, as well as the calculation method for the local mechanical properties of soil specimens. Comparisons were made between test results calculated from data for the entire specimen and for local regions, and it was found that the deformations were more uniform in the middle region than over the entire specimen. To quantify the non-uniform character of the deformation, non-uniformity coefficients of strain were defined and calculated. Traditional and end-lubricated triaxial tests were conducted under the same conditions to investigate the effect of using local-region data in deformation calculations on eliminating the end restraint of specimens. Statistical analysis of all test results showed that, for the tested soil specimens of size 39.1 mm × 80 mm, using the middle 35 mm region of traditional specimens in data processing eliminated end restraint more effectively than end lubrication. Furthermore, the local data analysis in this paper was validated through comparisons with test results from other researchers.
Funding: Project (61374140) supported by the National Natural Science Foundation of China
Abstract: Real industrial processes have multiple operating modes, and the collected data follow complex multimodal distributions, so most traditional process monitoring methods are no longer applicable because they presume that sampled data obey a single Gaussian or non-Gaussian distribution. To solve this problem, a novel weighted local standardization (WLS) strategy is proposed to standardize multimodal data; it eliminates the multimode characteristics of the collected data and normalizes them into a unimodal distribution. After a detailed analysis of this data preprocessing strategy, a new algorithm combining the WLS strategy with support vector data description (SVDD) is put forward for multimode process monitoring. Unlike strategies that build multiple local models, the developed method requires only a single model and no prior knowledge of the multimode process. To demonstrate the proposed method's validity, it is applied to a numerical example and the Tennessee Eastman (TE) process. The simulation results show that the WLS strategy is very effective at standardizing multimodal data, and that the WLS-SVDD monitoring method has clear advantages over traditional SVDD and over PCA combined with a local standardization strategy (LNS-PCA) in multimode process monitoring.
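A hedged sketch of the *local* standardization idea: standardize each sample by the statistics of its k nearest neighbours rather than by global statistics, so clusters from different operating modes collapse onto a common unimodal scale. The exact WLS weighting from the paper is not reproduced; k and the Euclidean metric are illustrative choices:

```python
import numpy as np

def local_standardize(X, k=10):
    """Standardize each row by the mean/std of its k nearest neighbours."""
    X = np.asarray(X, dtype=float)
    out = np.empty_like(X)
    for i, x in enumerate(X):
        d = np.linalg.norm(X - x, axis=1)
        idx = np.argsort(d)[:k]            # k nearest neighbours (incl. the point)
        mu = X[idx].mean(axis=0)
        sigma = X[idx].std(axis=0) + 1e-12 # guard against zero variance
        out[i] = (x - mu) / sigma
    return out

# Two operating modes with very different centres...
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(20, 1, (100, 2))])
Z = local_standardize(X, k=10)
# ...become a single near-zero-mean cloud after local standardization.
print(Z.mean(), X.mean())
```

After this preprocessing a single one-class model (such as SVDD) can be trained on the pooled data, which is the appeal of the one-model approach described above.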
Funding: Supported by the National Basic Research Program of China (2011CB013103)
Abstract: A new numerical differentiation method with local optimum by data segmentation is proposed. The segmentation of the data is based on second derivatives computed by a Fourier development method. A filtering process is used to achieve an acceptable segmentation. Numerical results obtained with the data segmentation method are presented and compared with a regularization method. For further investigation, the proposed algorithm is applied to the resistance-capacitance (RC) network identification problem, and improved results are obtained with this algorithm.
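The core idea, differentiating noisy data segment by segment with a local fit per segment, can be sketched as follows. This is only an illustration: the paper's Fourier-based second-derivative segmentation and filtering are replaced here by fixed equal-length segments and a quadratic least-squares fit:

```python
import numpy as np

def segmented_derivative(x, y, n_seg=4, deg=2):
    """Differentiate y(x) by fitting a low-order polynomial on each of n_seg
    equal segments and evaluating its derivative (a piecewise local optimum)."""
    dy = np.empty_like(y)
    for idx in np.array_split(np.arange(len(x)), n_seg):
        p = np.polyfit(x[idx], y[idx], deg)      # local least-squares fit
        dy[idx] = np.polyval(np.polyder(p), x[idx])
    return dy

x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(x) + 0.01 * np.random.default_rng(1).normal(size=x.size)
dy = segmented_derivative(x, y, n_seg=8)
print(np.max(np.abs(dy - np.cos(x))))  # small error despite the noisy data
```

The least-squares fit suppresses the noise amplification that plagues pointwise finite differences, which is why segmentation-based schemes compare favourably with global regularization in the abstract above.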
Abstract: The localization of blanket jamming is studied and a new method of resolving localization ambiguity is proposed. Radars can acquire only angle information, without range information, when encountering blanket jamming. When there is a single blanket jammer, netted radars can obtain its position by making use of the radars' relative positions and the angle information. The localization method and its accuracy analysis for a single blanket jammer in the presence of error are given. However, if there is more than one blanket jammer, and two jammers and two radars are coplanar, the localization may be erroneous due to localization ambiguity. To resolve this ambiguity, a Kalman filter model is established for all intersections, and through a multi-target track initiation and association algorithm the false intersections can be eliminated. Simulations show that the presented method is valid.
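The bearings-only triangulation step underlying this method can be sketched in a few lines: two netted radars at known positions each measure only an angle to the jammer, which lies at the intersection of the two bearing lines. The positions and angles below are made-up test values, and the Kalman-filter disambiguation of multiple intersections is not shown:

```python
import math

def intersect_bearings(p1, theta1, p2, theta2):
    """Intersection of rays from p1 at angle theta1 and p2 at angle theta2
    (angles in radians, measured from the x-axis)."""
    x1, y1 = p1
    x2, y2 = p2
    t1, t2 = math.tan(theta1), math.tan(theta2)
    # Solve y1 + t1*(x - x1) = y2 + t2*(x - x2) for x:
    x = (y2 - y1 + t1 * x1 - t2 * x2) / (t1 - t2)
    y = y1 + t1 * (x - x1)
    return x, y

# Jammer at (5, 5); radars at the origin and at (10, 0):
b1 = math.atan2(5 - 0, 5 - 0)    # bearing from radar 1
b2 = math.atan2(5 - 0, 5 - 10)   # bearing from radar 2
print(intersect_bearings((0, 0), b1, (10, 0), b2))  # -> approximately (5, 5)
```

With two jammers, each radar reports two bearings and the four line intersections include false "ghost" points, which is exactly the ambiguity the track initiation and association step is designed to remove.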
Abstract: Omics data provide an essential means for molecular biology and systems biology to capture the systematic properties of the inner activities of cells. One of the strongest challenges biological researchers face is finding methods for discovering biomarkers that track the progress of diseases such as cancer, so feature selection methods have been widely used for the biomarker discovery problem. However, omics data usually contain a large number of features but a small number of samples, and some omics data have a wide distribution range, which makes feature selection difficult. To overcome these problems, we present a computing method called localized statistic of abundance distribution based on a Gaussian window (LSADBGW) to test the significance of features. Experiments on three datasets, including gene and protein datasets, showed the accuracy and efficiency of LSADBGW for feature selection.
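A hedged sketch of the Gaussian-window idea: smooth each feature's per-class abundance distribution with a Gaussian window and score the feature by how little the class densities overlap. This only illustrates the flavour of a localized abundance statistic; the paper's exact LSADBGW definition is not reproduced, and the bandwidth h is an arbitrary choice:

```python
import numpy as np

def gaussian_kde(values, grid, h):
    """Gaussian-window density estimate of `values` evaluated on `grid`."""
    z = (grid[:, None] - values[None, :]) / h
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (values.size * h * np.sqrt(2 * np.pi))

def feature_score(x, y, h=0.5, n_grid=200):
    """Larger score = better class separation for one feature x with labels y."""
    grid = np.linspace(x.min() - 3 * h, x.max() + 3 * h, n_grid)
    d0 = gaussian_kde(x[y == 0], grid, h)
    d1 = gaussian_kde(x[y == 1], grid, h)
    # Half the integrated absolute density difference: 0 (identical) .. 1 (disjoint).
    return 0.5 * np.abs(d0 - d1).sum() * (grid[1] - grid[0])

rng = np.random.default_rng(0)
y = np.repeat([0, 1], 50)
informative = np.concatenate([rng.normal(0, 1, 50), rng.normal(4, 1, 50)])
noise = rng.normal(0, 1, 100)
print(feature_score(informative, y), feature_score(noise, y))  # high vs. low
```

Because the score is computed feature by feature from smoothed densities, it remains usable in the many-features/few-samples regime the abstract describes.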
Funding: Joint Seismological Science Foundation of China (198009)
Abstract: A new concept is suggested in tectonomagnetic research concerning the noise in simultaneous geomagnetic difference data caused by the effect of the Sq local-time variation, together with a method for its theoretical calculation. The level of this noise and its contribution to the total noise of the difference data are analyzed. The result indicates that the noise increases linearly with the distance between the two stations within a 40° longitude difference, at a rate of about 0.4 nT/(°) at latitude 40°N. An example calculated for a pair of sites with a longitude difference of 0.357° shows that this noise is about one fifth of the total noise of the difference data on geomagnetic quiet days.
Abstract: This paper presents the Optimized Kalman Particle Swarm (OKPS) filter. This filter results from two years of research and improves on the Swarm Particle Filter (SPF). The OKPS has been designed to be both cooperative and reactive. It combines the advantages of the Particle Filter (PF) and the metaheuristic Particle Swarm Optimization (PSO) for ego-vehicle localization applications. In addition to a simple fusion of swarm optimization and particle filtering (which leads to the Swarm Particle Filter), the OKPS uses some attributes of the Extended Kalman Filter (EKF). The OKPS filter innovates by giving its particles a capacity for self-diagnosis by means of the EKF covariance uncertainty matrix. The particles can therefore evolve by exchanging information to assess the optimized position of the ego-vehicle. The OKPS fuses data coming from embedded sensors (low-cost INS, GPS and odometer) to perform robust ego-vehicle positioning. The OKPS is compared with the EKF and with particle-based filters (PF and SPF) on real data from our equipped vehicle.
Abstract: The diversity of activities on one hand, and the conflicts between beneficiaries on the other, necessitate efficient management and supervision of coastal areas. Accordingly, monitoring and evaluation of such areas can be considered a critical factor in national development and in stewardship of resources. In this regard, remote sensing technologies combined with the analytical operations of geographic information systems (GIS) are remarkably advantageous. Iran's south-eastern Makran coasts are geopolitically and economically important due to their strategic characteristics, but they have been neglected, and their development and transit infrastructure fall significantly short of international standards. Therefore, given the importance of developing the Makran coasts, a Multi-Criteria Decision Analysis (MCDA) method was applied in this paper to identify and prioritize the criteria and parameters of zoning in order to establish new maritime zones. The major scope of this study is to employ satellite data, remote sensing methods, and regional statistics obtained from the Jask synoptic station; to investigate the region's topography, rainfall, and temperature changes so as to reach a comprehensive monitoring and zoning of the coastline; and to provide a pervasive local database via GIS and MCDA to be used in developing the coastal regions. The article explains the steps of coastal monitoring, its main objectives, and the procedures required; it then states the general steps of marine climate identification and the study of marine parameters, and determines the final outputs of the coastal monitoring process. Since this article focuses on the monitoring of the Makran beaches, the method of work in that region is described and its specific differences and complexities are discussed in detail. The impact of such projects on future research is also discussed.
Abstract: To achieve traceability of industrial products, direct part marking (DPM) technology, in which a barcode is machined directly onto the part surface, has attracted increasing attention at home and abroad. For metal parts, which are highly reflective, barcode images of the metal surface captured by a camera often exhibit local specular highlights that prevent correct reading. To address the local highlights appearing in two-dimensional barcodes laser-marked on metal surfaces, a barcode reconstruction method based on a five-step reconstruction model is proposed to recover the barcode information in the highlight regions. The captured barcode image is first deskewed so that the "L"-shaped solid boundary lies at the lower-left corner of the image, and the barcode is divided into a grid to locate each module. Highlight regions are detected on the basis of a Modified Specular-Free (MSF) image. The five-step reconstruction model is then used to fill in the values of the barcode modules, and the barcode is read. Experiments show that the algorithm removes local highlights from barcodes on metal surfaces and achieves a high rate of correct reading.
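The highlight-detection step can be roughly sketched with the classic specular-free approximation: subtracting each pixel's minimum channel removes much of the diffuse chromatic component, and pixels whose original intensity greatly exceeds the specular-free intensity are flagged as highlights. The paper's Modified Specular-Free construction and five-step module reconstruction are not reproduced here, and the threshold is made up:

```python
import numpy as np

def highlight_mask(rgb, threshold=0.4):
    """Boolean mask of likely specular-highlight pixels in an HxWx3 image."""
    rgb = np.asarray(rgb, dtype=float)
    sf = rgb - rgb.min(axis=2, keepdims=True)   # specular-free approximation
    diff = rgb.mean(axis=2) - sf.mean(axis=2)   # estimated specular component
    return diff > threshold

img = np.full((4, 4, 3), 0.2)     # dark diffuse metal surface
img[1, 1] = [1.0, 0.95, 0.9]      # one nearly-white specular pixel
print(highlight_mask(img))        # True only at (1, 1)
```

Modules falling inside the mask are the ones whose values must be reconstructed before decoding.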
Funding: National Natural Science Foundation of China (No. 61074079); Shanghai Leading Academic Discipline Project, China (No. B504)
Abstract: Complex industrial processes often contain multiple operating modes, and the challenge of multimode process monitoring has recently gained much attention. However, most multivariate statistical process monitoring (MSPM) methods are based on the assumption that the process has only one nominal mode; when the process data contain different distributions, these methods may not function as well as in single-mode processes. To address this issue, an improved partial least squares (IPLS) method was proposed for multimode process monitoring. By utilizing a novel local standardization strategy, normal data from multiple modes can be centralized after standardization, making the fundamental assumption of partial least squares (PLS) valid again in multimode processes. In this way, the PLS method is extended to suit not only single-mode but also multimode processes. The efficiency of the proposed method was illustrated by comparing the monitoring results of PLS and IPLS on the Tennessee Eastman (TE) process.
Abstract: This paper considers local linear regression estimators for the partially linear model with censored data. The estimators have some nice large-sample properties and are easy to implement. Through many simulation runs, the author also found that the estimators perform remarkably well in the small-sample case.
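The local linear regression building block (without the censoring adjustment or the parametric part of the partially linear model) can be sketched as a kernel-weighted straight-line fit at each evaluation point; the bandwidth h below is an illustrative choice:

```python
import numpy as np

def local_linear(x, y, x0, h=0.3):
    """Local linear estimate of E[y|x=x0] with Gaussian kernel weights."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)            # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])    # local linear design
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta[0]                                    # fitted value at x0

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 3, 300))
y = np.sin(x) + 0.1 * rng.normal(size=x.size)
print(abs(local_linear(x, y, 1.5) - np.sin(1.5)))  # small estimation error
```

Local linear fits are preferred over the local constant (Nadaraya-Watson) estimator because they have smaller boundary bias, one of the "nice large-sample behaviors" alluded to above.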
Funding: The Key Knowledge Innovation Project (KZCX3-SW-131), the Hundred Talents Program of the Chinese Academy of Sciences, and the National Natural Science Foundation of China (40374029)
Abstract: Using 11 global ocean tide models and tidal gauge data obtained in the East China Sea and the South China Sea, the influence of ocean loading on the gravity field in China and its neighboring areas is calculated in this paper. Furthermore, the differences between the results from the original global models and from models modified with local tides are discussed on the basis of this calculation. The comparison shows that the differences at positions near the sea are so large that local tides must be taken into account in the calculation. When the global ocean tide models CSR4.0, FES02, GOT00, NAO99 or ORI96 are chosen, the local effect for M2 is less than 0.10 × 10⁻⁸ m·s⁻² over areas far from the sea; the local effect for O1 is less than 0.05 × 10⁻⁸ m·s⁻² over those areas when the AG95 or CSR3.0 models are chosen. This numerical result demonstrates that the choice of model is a complex problem because of the inconsistent accuracy of the models over the East and South China Seas.
Abstract: At present, big data is very popular because it has proved highly successful in many fields, such as social media and e-commerce transactions. Big data describes the tools and technologies needed to capture, manage, store, distribute, and analyze petabyte-scale or larger datasets with diverse structures at high speed. Big data can be structured, unstructured, or semi-structured. Hadoop is an open-source framework used to process large amounts of data in an inexpensive and efficient way, and job scheduling is a key factor for achieving high performance in big data processing. This paper gives an overview of big data and highlights its problems and challenges. It then describes the Hadoop Distributed File System (HDFS), Hadoop MapReduce, and various parameters that affect the performance of job scheduling algorithms in big data, such as the JobTracker, TaskTracker, NameNode, and DataNode. The primary purpose of this paper is to present a comparative study of job scheduling algorithms along with their experimental results in a Hadoop environment. In addition, the paper describes the advantages, disadvantages, features, and drawbacks of various Hadoop job schedulers, such as FIFO, Fair, Capacity, Deadline Constraints, Delay, LATE, and Resource Aware, and provides a comparative study of these schedulers.
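The difference between the first two schedulers compared in such studies can be shown with a toy simulation (this is not Hadoop code): given jobs as (name, n_tasks) submitted in order, FIFO runs each job to completion before the next starts, while fair sharing round-robins one task per job per step. Task slots, data locality, and preemption are ignored:

```python
from collections import deque

def fifo(jobs):
    """FIFO: all tasks of the first-submitted job run before the next job's."""
    order = []
    for name, tasks in jobs:
        order.extend([name] * tasks)
    return order

def fair(jobs):
    """Fair sharing: one task from each live job per round, in rotation."""
    queues = deque((name, deque(range(t))) for name, t in jobs)
    order = []
    while queues:
        name, tasks = queues.popleft()
        tasks.popleft()
        order.append(name)
        if tasks:
            queues.append((name, tasks))
    return order

jobs = [("big", 4), ("small", 2)]
print(fifo(jobs))  # ['big', 'big', 'big', 'big', 'small', 'small']
print(fair(jobs))  # ['big', 'small', 'big', 'small', 'big', 'big']
```

Under FIFO the small job waits behind the big one; under fair sharing it finishes after the third step, which is the response-time advantage fair-style schedulers trade against throughput.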
Funding: Supported by the National Basic Research Program of China (973 Program: 2013CB329004)
Abstract: As data services rapidly penetrate our daily life, the mobile network becomes more complicated and the amount of data transmitted keeps increasing. In this situation, traditional statistical methods for anomalous cell detection cannot adapt to the evolution of networks, and data mining has become the mainstream. In this paper, we propose a novel kernel density-based local outlier factor (KLOF) to assign a degree of outlierness to each object. Firstly, the notion of KLOF is introduced, which captures exactly the relative degree of isolation. Then, by analyzing its properties, including the tightness of its upper and lower bounds and its sensitivity to density perturbation, we find that KLOF is much greater than 1 for outliers. Lastly, KLOF is applied to a real-world dataset to detect anomalous cells with abnormal key performance indicators (KPIs) to verify its reliability. The experiment shows that KLOF can find outliers efficiently and can serve as a guideline for operators to perform faster and more efficient troubleshooting.
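A hedged sketch of a kernel-density local outlier factor: each point's density is estimated with a Gaussian kernel over its k nearest neighbours, and the factor is the ratio of the neighbours' mean density to the point's own density, so values well above 1 indicate outliers. This mirrors the KLOF idea but is not the paper's exact definition; k and the bandwidth h are illustrative:

```python
import numpy as np

def kde_lof(X, k=5, h=1.0):
    """Kernel-density local outlier factor: neighbours' mean density / own density."""
    X = np.asarray(X, dtype=float)
    D = np.linalg.norm(X[:, None] - X[None, :], axis=2)   # pairwise distances
    nn = np.argsort(D, axis=1)[:, 1:k + 1]                # k nearest neighbours
    dens = np.array([np.exp(-0.5 * (D[i, nn[i]] / h) ** 2).sum()
                     for i in range(len(X))])
    return np.array([dens[nn[i]].mean() / dens[i] for i in range(len(X))])

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (40, 2)), [[6.0, 6.0]]])  # one far outlier
scores = kde_lof(X)
print(scores[-1], scores[:-1].max())  # outlier score >> 1, inliers near 1
```

Because the score is a *relative* density ratio, it stays near 1 for points inside clusters of any density, which is the "relative degree of isolation" property emphasized above.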