A signal pre-processing method based on optimal variational mode decomposition (OVMD) is proposed to improve the efficiency and accuracy of local data filtering and analysis at edge nodes in distributed electromechanical systems. Firstly, the singular points of the original signals are eliminated effectively by the first-order difference method. Then the OVMD method is applied for signal modal decomposition. Furthermore, correlation analysis is conducted to determine the degree of correlation between each mode and the original signal, so as to accurately separate the real operating signal from the noise signal. On the basis of theoretical analysis and simulation, an edge-node pre-processing system for distributed electromechanical systems is designed. Finally, the signal pre-processing effect is evaluated by the signal-to-noise ratio (SNR) and root-mean-square error (RMSE) indicators. The experimental results show that the OVMD-based edge-node pre-processing system can extract signals with different characteristics and improve the SNR of reconstructed signals. Owing to its high fidelity and reliability, the system can also provide data-quality assurance for subsequent system health monitoring and fault diagnosis.
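The two pre-processing steps just described, first-order difference screening of singular points and correlation-based mode selection, can be sketched as follows. The outlier threshold `k`, the correlation cutoff `thr`, and the linear-interpolation repair are illustrative assumptions, not details given in the abstract; the VMD decomposition itself is omitted.

```python
import numpy as np

def remove_singular_points(x, k=3.0):
    """Flag samples whose first-order difference deviates from the mean
    difference by more than k standard deviations, then repair them by
    linear interpolation (illustrative criterion)."""
    d = np.diff(x)
    bad = np.zeros(x.shape, dtype=bool)
    bad[1:] = np.abs(d - d.mean()) > k * d.std()
    good = np.flatnonzero(~bad)
    x_clean = x.copy()
    x_clean[bad] = np.interp(np.flatnonzero(bad), good, x[good])
    return x_clean

def select_modes_by_correlation(modes, x, thr=0.5):
    """Keep only the modes whose Pearson correlation with the original
    signal exceeds thr, and reconstruct the signal from them."""
    kept = [m for m in modes if abs(np.corrcoef(m, x)[0, 1]) >= thr]
    return kept, np.sum(kept, axis=0)
```

On a noisy sine with one spike, the first function repairs the spike and the second keeps only the correlated mode.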
In order to meet the demands for high transmission rates and high service quality in broadband wireless communication systems, orthogonal frequency division multiplexing (OFDM) has been adopted in several standards. However, inter-block interference (IBI) and inter-carrier interference (ICI) degrade the performance of an OFDM system. To mitigate IBI and ICI, pre-processing approaches based on full channel state information (CSI) have been proposed, which improve system performance. In this paper, a pre-processing filter based on partial CSI at the transmitter is designed and investigated. The filter coefficients are obtained by an optimization procedure, the symbol error rate (SER) is tested, and the computational complexity of the proposed scheme is analyzed. Computer simulation results show that the proposed pre-processing filter can effectively mitigate IBI and ICI and improve performance. Compared with transmitter pre-processing approaches based on full CSI, the proposed scheme offers high spectral efficiency, limited CSI feedback and low computational complexity.
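For contrast with the partial-CSI design, a minimal full-CSI zero-forcing pre-equalizer (the kind of baseline the abstract compares against, not the proposed filter) can be simulated in a few lines. The block length, channel taps, and QPSK mapping are assumptions, not values from the paper.

```python
import numpy as np

# Full-CSI zero-forcing pre-equalization at the transmitter: each subcarrier
# is pre-divided by the channel response, so the receiver needs no equalizer.
N, CP = 64, 8
h = np.array([1.0, 0.3, 0.1])           # assumed 3-tap multipath channel
H = np.fft.fft(h, N)                    # per-subcarrier channel response

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, (N, 2))
X = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)  # QPSK

x = np.fft.ifft(X / H)                  # pre-equalize, then OFDM modulate
tx = np.concatenate([x[-CP:], x])       # cyclic prefix absorbs IBI
rx = np.convolve(tx, h)[CP:CP + N]      # channel, then CP removal
Y = np.fft.fft(rx)                      # receiver: plain FFT, no equalizer
```

Because CP is at least the channel memory, the linear channel acts circularly on the block and Y = H·(X/H) = X exactly; the partial-CSI design trades this exactness for reduced feedback.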
In order to carry out numerical simulation using geologic structural data obtained from Landmark (a seismic interpretation system), underground geological structures are abstracted into mechanical models that reflect actual conditions and facilitate computation and analysis. Given the importance of model building, further processing methods for traditional seismic interpretation results from Landmark should be studied, so that the processed results can be used directly in numerical simulation computations. Through this data conversion procedure, Landmark and FLAC (a widely used stress-analysis code) are seamlessly connected. Thus, the format conversion between the two systems and the pre- and post-processing in simulation computation are realized. A practical application indicates that this method has many advantages, such as simple operation, high accuracy of element subdivision and high speed, and it satisfies the actual needs of floor grid cutting.
The Chang'e-3 (CE-3) mission is China's first exploration mission on the surface of the Moon that uses a lander and a rover. Eight instruments that form the scientific payloads have the following objectives: (1) investigate the morphological features and geological structures at the landing site; (2) integrated in-situ analysis of minerals and chemical compositions; (3) integrated exploration of the structure of the lunar interior; (4) exploration of the lunar-terrestrial space environment and lunar surface environment, and acquisition of Moon-based ultraviolet astronomical observations. The Ground Research and Application System (GRAS) is in charge of data acquisition and pre-processing, management of the payload in orbit, and managing the data products and their applications. The Data Pre-processing Subsystem (DPS) is a part of GRAS. The task of DPS is the pre-processing of raw data from the eight instruments that are part of CE-3, including channel processing, unpacking, package sorting, calibration and correction, identification of geographical location, calculation of probe azimuth angle, probe zenith angle, solar azimuth angle, and solar zenith angle, and conducting quality checks. These processes produce Level 0, Level 1 and Level 2 data. The computing platform of this subsystem is comprised of a high-performance computing cluster, including a real-time subsystem used for processing Level 0 data and a post-time subsystem for generating Level 1 and Level 2 data. This paper describes the CE-3 data pre-processing method, the data pre-processing subsystem, data classification, data validity and data products that are used for scientific studies.
Regular expression matching plays an important role in deep inspection. The rapid development of SDN and NFV makes the network more dynamic, bringing serious challenges to traditional deep inspection matching engines. State-of-the-art matching methods often require a significant amount of pre-processing time and hence are not suitable for this fast-updating scenario. In this paper, a novel matching engine called BFA is proposed to achieve high-speed regular expression matching with fast pre-processing. Experiments demonstrate that BFA achieves 5 to 20 times faster updates than existing regular expression matching methods, and scales well on multi-core platforms.
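The pre-processing/matching trade-off at the heart of such engines can be illustrated in the simplest setting, a single literal pattern: the Knuth-Morris-Pratt algorithm spends O(|p|) pre-processing to build a failure table, after which matching runs in O(|t|). This is a generic illustration, not BFA's algorithm.

```python
def build_lps(p):
    """Pre-processing: longest-proper-prefix-suffix (failure) table for p."""
    lps, k = [0] * len(p), 0
    for i in range(1, len(p)):
        while k and p[i] != p[k]:
            k = lps[k - 1]
        if p[i] == p[k]:
            k += 1
        lps[i] = k
    return lps

def kmp_search(t, p):
    """Matching: return the first index of p in t, or -1, in O(len(t))."""
    lps, j = build_lps(p), 0
    for i, c in enumerate(t):
        while j and c != p[j]:
            j = lps[j - 1]
        if c == p[j]:
            j += 1
        if j == len(p):
            return i - len(p) + 1
    return -1
```

The table is built once per pattern and reused across all inputs, which is exactly the cost BFA tries to keep small for frequently updated rule sets.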
High-resolution ice core records covering long time spans enable reconstruction of past climatic and environmental conditions, allowing investigation of the earth system's evolution. Preprocessing of ice cores directly affects data quality control for further analysis, since conventional ice core processing is time-consuming, produces only qualitative data, leads to ice mass loss, and risks potential secondary pollution. However, over the past several decades, preprocessing of ice cores has received less attention than improvements in ice drilling, the analytical methodology of various indices, and research on the climatic and environmental significance of ice core records. Therefore, this paper reviews the development of processing for ice cores, including frameworks, designs and materials, and analyzes the technical advantages and disadvantages of the different systems. In the past, continuous flow analysis (CFA) has been successfully applied to polar ice cores. However, it is not suitable for ice cores outside the polar regions because of their high particle load, the memory effect between samples, and the need for filtration before injection. Ice core processing is a subtle and professional operation due to the fragility of the non-metallic materials and the random distribution of particles and air bubbles in ice cores, which aggravates measurement uncertainty. The future development of CFA is discussed with respect to preprocessing, the memory effect, the challenge of brittle ice, coupling with real-time analysis, and optimization of CFA in the field.
Furthermore, non-polluting cutters with different configurations could be designed to cut and scrape in multiple directions and to separate the inner and outer portions of the core. Such a system also needs to be coupled with streamlined packaging, coding, and stacking operations that can be implemented at high resolution and rate, avoiding manual intervention. At the same time, information on the longitudinal sections could be scanned, identified, and then classified to obtain quantitative data. In addition, irregular ice volume and weight could also be measured accurately. These improvements would be recorded automatically via user-friendly interfaces. These innovations may be applied to other paleoclimate media with similar features and needs.
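One of the CFA issues noted above, the memory effect between samples, is commonly modeled as first-order mixing in the tubing; under that assumed model the original signal can be recovered exactly by inverting the recursion.

```python
import numpy as np

def smear(x, a):
    """Forward model of the memory effect: each reading mixes the current
    sample with a fraction a of the previous reading (assumed first-order
    model, not a calibrated CFA transfer function)."""
    y = np.zeros_like(x)
    prev = 0.0
    for n, xn in enumerate(x):
        prev = (1.0 - a) * xn + a * prev
        y[n] = prev
    return y

def desmear(y, a):
    """Exact inverse of the first-order smearing recursion."""
    x = np.zeros_like(y)
    prev = 0.0
    for n, yn in enumerate(y):
        x[n] = (yn - a * prev) / (1.0 - a)
        prev = yn
    return x
```

The round trip is exact under the model; real CFA correction must additionally estimate the mixing parameter from calibration runs.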
Mathematical morphology is widely applied in digital image processing, and various morphological constructions and algorithms have been developed for different image processing tasks. The basic idea of mathematical morphology is to measure image morphology with structuring elements in order to solve image-understanding problems. This article presents an advanced cellular neural network that forms the mathematical morphological cellular neural network (MMCNN) equation, suited to mathematical morphology filtering. It gives the theory of the MMCNN dynamic range and stable states, and shows that the mathematical morphology filter is realized through the steady state of the dynamic process under definite conditions.
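The structuring-element idea can be made concrete with binary dilation and erosion under an assumed 3x3 cross structuring element; an MMCNN realizes operations of this kind through its network dynamics, which are not reproduced here.

```python
import numpy as np

def dilate(img):
    """Binary dilation by a 3x3 cross: a pixel turns on if it or any
    4-neighbour is on (pixels outside the image treated as off)."""
    out = img.copy()
    out[1:, :] |= img[:-1, :]
    out[:-1, :] |= img[1:, :]
    out[:, 1:] |= img[:, :-1]
    out[:, :-1] |= img[:, 1:]
    return out

def erode(img):
    """Erosion via duality: erode(A) = complement(dilate(complement(A)))."""
    return ~dilate(~img)
```

Dilating a single pixel yields the cross itself, and eroding it removes it, the two basic behaviours a morphological filter composes.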
There is a considerable amount of dirty data in the observation data sets derived from an integrated ocean observing network system. Thus, the data must be carefully and reasonably processed before being used for forecasting or analysis. This paper proposes a data pre-processing model based on intelligent algorithms. Firstly, we introduce the integrated network platform of ocean observation. Next, the pre-processing model of the data is presented, and an intelligent data-cleaning model is proposed. Based on fuzzy clustering, the Kohonen clustering network is improved to perform the parallel calculation of fuzzy c-means clustering. The proposed dynamic algorithm can automatically find a new clustering center as the sample data are updated. The rapid and dynamic performance of the model makes it suitable for real-time calculation, and its efficiency and accuracy are proved by test results on observation data.
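The fuzzy c-means iteration that the improved Kohonen network parallelizes can be sketched in its plain serial form; the parallelization and the dynamic center tracking are omitted, and the 1-D data and parameters below are illustrative.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means for 1-D data: alternate membership and
    centre updates until the centres settle."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                       # memberships sum to 1 per point
    for _ in range(iters):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1)  # fuzzily weighted means
        d = np.abs(centers[:, None] - X[None, :]) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=0)            # standard FCM membership update
    return centers, U
```

On two well-separated 1-D groups the centres converge near the group means, which is the behaviour the dynamic cleaning model relies on.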
Microarray data is inherently noisy due to noise contaminated from various sources during the preparation of the microarray slide, which greatly affects the accuracy of gene expression measurements. How to eliminate the effect of this noise constitutes a challenging problem in microarray analysis. Efficient denoising is often a necessary first step, taken before the image data is analyzed, to compensate for data corruption and to allow effective use of the data. Hence pre-processing of the microarray image is essential to eliminate background noise, enhance image quality and enable effective quantification. Existing denoising techniques based on transformed domains have been utilized for microarray noise reduction, each with its own limitations. The objective of this paper is to introduce novel pre-processing techniques, optimized spatial resolution (OSR) and spatial domain filtering (SDF), for reducing noise in microarray data and reducing error during the quantification process, so that microarray spots can be estimated accurately to determine gene expression levels. In addition, a combination of optimized spatial resolution and spatial filtering is proposed and found to improve denoising of microarray data with effective quantification of spots. The proposed method has been validated on microarray images of gene expression profiles of Myeloid Leukemia from the Stanford Microarray Database using various quality measures, such as signal-to-noise ratio, peak signal-to-noise ratio, image fidelity, structural content, absolute average difference and correlation quality.
Quantitative analysis showed that the proposed technique is more efficient at denoising the microarray image, making it suitable for effective quantification.
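A minimal instance of the spatial domain filtering (SDF) idea is a 3x3 median filter, which removes impulse noise from spot images; the paper's exact filters are not specified here, so this is only a representative example.

```python
import numpy as np

def median3x3(img):
    """3x3 median filter with edge replication: each output pixel is the
    median of its 3x3 neighbourhood."""
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    windows = np.stack([p[r:r + h, c:c + w]
                        for r in range(3) for c in range(3)])
    return np.median(windows, axis=0)
```

A single hot pixel in a flat background is completely suppressed, since it is outvoted by its eight neighbours in every window.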
The solution of systems of linear equations is applied in oil exploration, structural vibration analysis, computational fluid dynamics, and other fields. For in-depth analysis of large or very large complicated structures, parallel algorithms running on high-performance computers must be used to solve the resulting complex problems. This paper introduces the implementation of a parallel solver for sparse systems of linear equations.
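One reason such solvers parallelize well: in the Jacobi iteration every component of the new iterate depends only on the previous iterate, so all components can be updated simultaneously. The dense toy version below only illustrates the iteration; a real solver would use a sparse storage format and distribute the rows.

```python
import numpy as np

def jacobi(A, b, iters=100):
    """Jacobi iteration x_{k+1} = D^{-1} (b - (A - D) x_k).  Each entry of
    x_{k+1} is independent of the others, so the update is embarrassingly
    parallel.  Converges for diagonally dominant A."""
    d = np.diag(A)
    R = A - np.diag(d)
    x = np.zeros_like(b)
    for _ in range(iters):
        x = (b - R @ x) / d
    return x
```

On a small diagonally dominant system the iterate matches the direct solution to machine precision.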
With the development of information technology, a large amount of product quality data from the entire manufacturing process is accumulated, but it is not explored and used effectively. Traditional product quality prediction models have many disadvantages, such as high complexity and low accuracy. To overcome these problems, we propose an optimized data equalization method to pre-process the dataset and design a simple but effective product quality prediction model: a radial basis function model optimized by the firefly algorithm with a Levy flight mechanism (RBFFALM). First, the new data equalization method is introduced to pre-process the dataset, which reduces the dimension of the data, removes redundant features, and improves the data distribution. Then the RBFFALM is used to predict product quality. Comprehensive experiments conducted on real-world product quality datasets validate that the new RBFFALM model, combined with the new data pre-processing method, outperforms previous methods in predicting product quality.
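The RBF part of such a model can be sketched as a Gaussian-basis least-squares fit to a toy 1-D curve. The firefly algorithm with Levy flights, which the paper uses to tune the model, is omitted; the centres and width below are fixed by assumption rather than optimized.

```python
import numpy as np

def rbf_design(X, centers, gamma):
    """Gaussian RBF design matrix: one column per centre."""
    return np.exp(-gamma * (X[:, None] - centers[None, :]) ** 2)

# Fit an RBF model to a toy target curve by linear least squares.
X = np.linspace(0.0, 2.0 * np.pi, 50)
y = np.sin(X)                                  # stand-in "quality" response
centers = np.linspace(0.0, 2.0 * np.pi, 10)    # fixed grid (no metaheuristic)
Phi = rbf_design(X, centers, 1.0)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)    # output-layer weights
pred = Phi @ w
```

With the basis fixed, only the output weights are learned, which is the linear subproblem the metaheuristic search wraps around.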
Forecasting river flow is crucial for optimal planning, management, and sustainable use of freshwater resources. Many machine learning (ML) approaches have been enhanced to improve streamflow prediction. Hybrid techniques have been viewed as a viable method for enhancing the accuracy of univariate streamflow estimation compared to standalone approaches, and current researchers have also emphasised using hybrid models to improve forecast accuracy. Accordingly, this paper conducts an updated literature review of applications of hybrid models to streamflow estimation over the last five years, summarising data pre-processing, univariate machine learning modelling strategies, advantages and disadvantages of standalone ML techniques, hybrid models, and performance metrics. This study focuses on two types of hybrid models: parameter optimisation-based hybrid models (OBH) and hybrids of parameter optimisation-based and pre-processing-based models (HOPH). Overall, this research supports the idea that meta-heuristic approaches precisely improve ML techniques. It is also one of the first efforts to comprehensively examine the efficiency of various meta-heuristic approaches (classified into four primary classes) hybridised with ML techniques. This study revealed that previous research applied swarm, evolutionary, physics, and hybrid metaheuristics in 77%, 61%, 12%, and 12% of cases, respectively. Finally, there is still room for improving OBH and HOPH models by examining different data pre-processing techniques and meta-heuristic algorithms.
Broadcasting gateway equipment generally uses a method of simply switching to a spare input stream when a failure occurs in the main input stream. However, when the transmission environment is unstable, problems such as reduced equipment lifespan due to frequent switching, and interruption, delay, and stoppage of services may occur. Therefore, a machine learning (ML) method is required that can automatically judge and classify network-related service anomalies and switch multi-input signals without dropping or altering them, by predicting or quickly determining the time of error occurrence, so that streams are switched smoothly when problems such as transmission errors arise. In this paper, we propose an intelligent packet switching method based on ML classification, one of the supervised learning approaches, which presents the risk level of abnormal multi-streams occurring in broadcasting gateway equipment based on data. Furthermore, we subdivide the risk levels obtained from the classification techniques into probabilities, derive vectorized representative values for each attribute of the collected input data, and continuously update them. The resulting reference vector is used in the switching judgment through its cosine similarity with the input data obtained when a dangerous situation occurs. Broadcasting gateway equipment to which the proposed method is applied can switch more stably and intelligently than before, solving the reliability and broadcasting-accident problems of the equipment and maintaining stable video streaming.
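The switching judgment described above reduces to a cosine-similarity test between an incoming measurement vector and the learned reference vector; the threshold and the vectors below are invented for illustration, not values from the paper.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two attribute vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def should_switch(sample, reference, threshold=0.9):
    """Switch to the spare stream when the sample no longer resembles
    the healthy reference profile (illustrative threshold)."""
    return cosine_similarity(sample, reference) < threshold
```

A sample proportional to the reference scores 1.0 and keeps the main stream; a sample pointing elsewhere falls under the threshold and triggers the switch.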
The use of traditional herbal drugs derived from natural sources is on the rise due to their minimal side effects and numerous health benefits. However, a major limitation is the lack of standardized knowledge for identifying and mapping the quality of these herbal medicines. This article aims to provide practical insights into the application of artificial intelligence for quality-based commercialization of raw herbal drugs. It focuses on feature extraction methods, image processing techniques, and the preparation of herbal images for compatibility with machine learning models. The article discusses commonly used image processing tools such as normalization, slicing, cropping, and augmentation to prepare images for artificial intelligence-based models. It also provides an overview of global herbal image databases and the models employed for herbal plant/drug identification. Readers will gain a comprehensive understanding of the potential application of various machine learning models, including artificial neural networks and convolutional neural networks. The article delves into suitable validation parameters like true positive rates, accuracy, precision, and more for the development of artificial intelligence-based identification and authentication techniques for herbal drugs. This article offers valuable insights and a conclusive platform for the further exploration of artificial intelligence in the field of herbal drugs, paving the way for smarter identification and authentication methods.
This research concentrates on modeling an efficient thyroid prediction approach, addressing a significant health problem faced by the women's community. The major research problem is the lack of an automated model to attain earlier prediction, and some existing models fail to give good prediction accuracy. Here, a novel clinical decision support system is framed to make proper decisions in complex situations. Multiple stages are followed in the proposed framework, each playing a substantial role in thyroid prediction: i) data acquisition, ii) outlier prediction, and iii) a multi-stage weight-based ensemble learning process (MS-WEL). The weighted analysis of the base classifier and the other classifier models helps bridge the gap encountered with any single classifier model. Various classifiers are merged to handle the issues identified in the others, with the intent of enhancing the prediction rate. The proposed model provides superior outcomes and a good prediction rate. The simulation is done in the MATLAB 2020a environment and establishes a better trade-off than various existing approaches, achieving a prediction accuracy of 97.28%.
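The core operation of a weight-based ensemble is a weighted vote over base-classifier outputs. The sketch below uses invented placeholder weights, not the MS-WEL weighting scheme, which the abstract does not detail.

```python
from collections import defaultdict

def weighted_vote(predictions, weights):
    """Return the label with the largest total classifier weight.
    predictions[i] is classifier i's label, weights[i] its weight."""
    score = defaultdict(float)
    for label, w in zip(predictions, weights):
        score[label] += w
    return max(score, key=score.get)
```

A strong classifier can thus overrule two weaker ones, which is how weighting bridges the gaps of any single model.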
The satellite laser ranging (SLR) data quality from COMPASS was analyzed, the difference between curve recognition in computer vision and the pre-processing of SLR data was discussed, and a new algorithm for SLR data based on curve recognition from point clouds is proposed. The results obtained by the new algorithm are 85% (or even more) consistent with those of the screen-displaying method; furthermore, the new method can process SLR data automatically, which makes it suitable for use in the development of the COMPASS navigation system.
The Low Earth Orbit (LEO) remote sensing satellite mega-constellation is characterized by a large number of satellites of various types, which gives it unique advantages in carrying out concurrent multiple tasks. However, the large number of tasks and satellites increases the complexity of resource allocation. Therefore, the primary problem in implementing concurrent multiple tasks via a LEO mega-constellation is to pre-process the tasks and observation resources. To address this challenge, we propose a pre-processing algorithm for the mega-constellation based on highly Dynamic Spatio-Temporal Grids (DSTG). In the first stage, this paper describes the management model of the mega-constellation and the multiple tasks. Then, the coding method of the DSTG is proposed, based on which the description of complex mega-constellation observation resources is realized. In the third part, the DSTG algorithm is used to process concurrent multiple tasks at multiple levels, such as task space attributes, time attributes and grid task importance evaluation. Finally, simulation results for a constellation case are given to verify the effectiveness of concurrent multi-task pre-processing based on DSTG. The autonomous processing of task decomposition, task fusion and mapping to grids, and the convenient indexing of time windows are verified.
Biomedical image processing is an essential part of several medical applications supporting computer-aided disease diagnosis. Magnetic Resonance Imaging (MRI) is a commonly utilized imaging tool used to scan gliomas for clinical examination. Biomedical image segmentation plays a vital role in the healthcare decision-making process and helps to identify the affected regions in the MRI. Though numerous segmentation models are available in the literature, effective segmentation models for brain tumors (BT) are still needed. This study develops a salp swarm algorithm with multi-level thresholding based brain tumor segmentation (SSAMLT-BTS) model. The presented SSAMLT-BTS model initially employs bilateral filtering for noise removal and skull stripping as a pre-processing phase. In addition, the Otsu thresholding approach is applied to segment the biomedical images, and the optimum threshold values are chosen by the use of SSA. Finally, the active contour (AC) technique is used to identify suspicious regions in the medical image. A comprehensive experimental analysis of the SSAMLT-BTS model is performed using a benchmark dataset, and the outcomes are inspected in many aspects. The simulation outcomes show the improved performance of the SSAMLT-BTS model over recent approaches, with a maximum accuracy of 95.95%.
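The Otsu step selects the grey level that maximizes the between-class variance of the two classes it induces. The sketch below is the plain single-threshold version, without the salp-swarm search over multiple thresholds that the paper adds.

```python
import numpy as np

def otsu_threshold(img):
    """Single-level Otsu on an 8-bit image: pick the grey level t that
    maximises the between-class variance w0*w1*(mu0 - mu1)^2."""
    p = np.bincount(img.ravel(), minlength=256) / img.size
    total_mean = np.dot(np.arange(256), p)
    w0 = mu0w = 0.0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 += p[t]                  # class-0 probability up to level t
        mu0w += t * p[t]            # class-0 unnormalised mean
        if w0 <= 0.0 or w0 >= 1.0:
            continue
        mu0 = mu0w / w0
        mu1 = (total_mean - mu0w) / (1.0 - w0)
        var = w0 * (1.0 - w0) * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

On a clean bimodal image the returned threshold separates the two grey populations exactly.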
The aviation industry is one of the most competitive markets. The most common approach for airline service providers is to improve passenger satisfaction. Passenger satisfaction in the aviation industry occurs when passengers' expectations are met during flights. Airline service quality is critical in attracting new passengers and retaining existing ones, so it is crucial to identify passengers' pain points and enhance their satisfaction with the services offered. Airlines have used a variety of techniques to improve service quality, applying data analysis approaches to passenger pain-point data. These solutions have focused simply on surveys; consequently, deep-learning approaches have received insufficient attention. In this study, deep neural networks with the adaptive moment estimation (Adam) optimization algorithm were applied to enhance classification performance. In previous studies, the quality of the dataset has been ignored. The proposed approach was applied to the airline passenger satisfaction dataset from the Kaggle repository. It was validated by applying artificial neural networks (ANNs), random forests, and support vector machine techniques to the same dataset, and compared with other research papers that used the same dataset and addressed a similar problem. The experimental results showed that the proposed approach outperformed previous studies, achieving an accuracy of 99.3%.
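The Adam update rule used by the proposed network can be shown on a toy 1-D objective f(x) = (x - 3)^2; the hyperparameters below are the common defaults scaled down for a deterministic problem, not values from the paper.

```python
import math

def adam_minimize(grad, x0, lr=0.01, beta1=0.9, beta2=0.999,
                  eps=1e-8, steps=3000):
    """Adam: bias-corrected first and second moment estimates of the
    gradient scale each parameter update."""
    x, m, v = x0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g          # first moment (momentum)
        v = beta2 * v + (1 - beta2) * g * g      # second moment (scale)
        m_hat = m / (1 - beta1 ** t)             # bias corrections
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x
```

Because the step is normalized by the gradient scale, Adam behaves like a sign-based method near the optimum and settles within roughly the learning rate of it.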
Systems based on EHRs (electronic health records) have been in use for many years, and their amplified realization has been felt recently; they remain pioneering collections of massive volumes of health data. Duplicate detection involves discovering records referring to the same practical components, a task that generally depends on several input parameters that experts supply. Record linkage refers to the problem of finding identical records across various data sources. The similarity between two records is characterized by domain-based similarity functions over different features. De-duplication of one dataset, or the linkage of multiple data sets, has become a highly significant operation in the data processing stages of different data mining programmes; the objective is to match all the records associated with the same entity. Various measures have been used to represent the quality and complexity of data linkage algorithms, and many novel metrics have been introduced. An outline of the problems in measuring data linkage and de-duplication quality and complexity is presented. This article focuses on the pre-processing of health data that is horizontally divided among data custodians, with each custodian providing similar features for sets of patients. The first step in this technique is the automatic selection of training examples of superior quality from the compared record pairs, and the second step involves training the reciprocal neuro-fuzzy inference system (RANFIS) classifier. Using the optimal threshold classifier, it is presumed that information about the original match status of all compared record pairs is available, and an optimal threshold can then be computed for the respective RANFIS by Ant Lion Optimization. The Febrl, Clinical Decision (CD), and Cork Open Research Archive (CORA) data repositories help analyze the proposed method against benchmarks evaluated with current techniques.
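The domain-based field similarities that feed such a classifier can be sketched with a token-set Jaccard measure; the RANFIS classifier and the ALO threshold search are beyond a short sketch, and the equal field weighting below is an assumption.

```python
def jaccard(a, b):
    """Token-set Jaccard similarity between two field values."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 1.0

def record_similarity(rec1, rec2):
    """Mean field-wise similarity of two records (equal field weights
    assumed for illustration)."""
    return sum(jaccard(x, y) for x, y in zip(rec1, rec2)) / len(rec1)
```

A pair scoring above a learned threshold would be classified as a match; the cited methods learn that threshold instead of fixing it by hand.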
Funding: National Natural Science Foundation of China (No. 61903291); Industrialization Project of Shaanxi Provincial Department of Education (No. 18JC018).
Funding: supported by the National Natural Science Foundation of China (60902045) and the National High-Tech Research and Development Program of China (863 Program) (2011AA01A105).
Funding: Projects 50221402, 50490271 and 50025413 supported by the National Natural Science Foundation of China; the National Basic Research Program of China (2009CB219603, 2009CB724601, 2006CB202209 and 2005CB221500); the Key Project of the Ministry of Education (306002); and the Program for Changjiang Scholars and Innovative Research Teams in Universities of MOE (IRT0408).
Abstract: In order to carry out numerical simulation using geologic structural data obtained from Landmark (a seismic interpretation system), underground geological structures are abstracted into mechanical models that reflect actual conditions and facilitate computation and analysis. Given the importance of model building, further processing methods for traditional seismic interpretation results from Landmark should be studied so that the processed results can be used directly in numerical simulation computations. Through this data conversion procedure, Landmark and FLAC (a widely used stress-analysis code) are seamlessly connected. Thus, format conversion between the two systems and the pre- and post-processing in simulation computation are realized. A practical application indicates that this method has many advantages, such as simple operation, high accuracy of element subdivision and high speed, which satisfy the actual needs of floor grid cutting.
Abstract: The Chang'e-3 (CE-3) mission is China's first exploration mission on the surface of the Moon that uses a lander and a rover. The eight instruments that form the scientific payloads have the following objectives: (1) investigate the morphological features and geological structures at the landing site; (2) perform integrated in-situ analysis of minerals and chemical compositions; (3) carry out integrated exploration of the structure of the lunar interior; (4) explore the lunar-terrestrial space environment and lunar surface environment, and acquire Moon-based ultraviolet astronomical observations. The Ground Research and Application System (GRAS) is in charge of data acquisition and pre-processing, management of the payload in orbit, and managing the data products and their applications. The Data Pre-processing Subsystem (DPS) is a part of GRAS. The task of the DPS is the pre-processing of raw data from the eight instruments on CE-3, including channel processing, unpacking, package sorting, calibration and correction, identification of geographical location, calculation of the probe azimuth angle, probe zenith angle, solar azimuth angle and solar zenith angle, and conducting quality checks. These processes produce Level 0, Level 1 and Level 2 data. The computing platform of this subsystem comprises a high-performance computing cluster, including a real-time subsystem used for processing Level 0 data and a post-time subsystem for generating Level 1 and Level 2 data. This paper describes the CE-3 data pre-processing method, the data pre-processing subsystem, data classification, data validity and the data products that are used for scientific studies.
Funding: Supported by the National Key Technology R&D Program of China under Grant No. 2015BAK34B00 and the National Key Research and Development Program of China under Grant No. 2016YFB1000102.
Abstract: Regular expression matching plays an important role in deep inspection. The rapid development of SDN and NFV makes the network more dynamic, bringing serious challenges to traditional deep-inspection matching engines. State-of-the-art matching methods often require a significant amount of pre-processing time and hence are not suitable for this fast-updating scenario. In this paper, a novel matching engine called BFA is proposed to achieve high-speed regular expression matching with fast pre-processing. Experiments demonstrate that BFA achieves 5 to 20 times faster updates compared to existing regular expression matching methods, and scales well on multi-core platforms.
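The pre-processing cost that BFA targets comes from compiling patterns into automata before any byte is matched. As a contrast, the sketch below matches a small regex subset ('.' and '*') with no compilation step at all, paying O(|pattern|·|text|) per match instead; it is a didactic illustration of the compile-time/match-time trade-off, not BFA's algorithm.

```python
def match(pattern, text):
    """Full-string match for a regex subset: literals, '.', and '*'.
    No pre-compilation: the full cost is paid per match."""
    m, n = len(text), len(pattern)
    # dp[i][j]: does text[i:] match pattern[j:]?
    dp = [[False] * (n + 1) for _ in range(m + 1)]
    dp[m][n] = True
    for i in range(m, -1, -1):
        for j in range(n - 1, -1, -1):
            first = i < m and pattern[j] in (text[i], '.')
            if j + 1 < n and pattern[j + 1] == '*':
                # either skip "c*" entirely, or consume one char and stay
                dp[i][j] = dp[i][j + 2] or (first and dp[i + 1][j])
            else:
                dp[i][j] = first and dp[i + 1][j + 1]
    return dp[0][0]
```

A DFA-based engine would move this table-building work into a pre-processing phase and then match in linear time, which is precisely the cost BFA tries to keep small under frequent rule updates.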
Funding: Supported by the National Natural Science Foundation of China (Grant No. 41630754), the State Key Laboratory of Cryospheric Science (SKLCS-ZZ-2017), the CAS Key Technology Talent Program and the Open Foundation of the State Key Laboratory of Hydrology-Water Resources and Hydraulic Engineering (2017490711).
Abstract: High-resolution ice core records covering long time spans enable reconstruction of past climatic and environmental conditions, allowing investigation of the earth system's evolution. Pre-processing of ice cores has a direct impact on data quality control for further analysis, since conventional ice core processing is time-consuming, produces only qualitative data, leads to ice mass loss, and risks secondary pollution. However, over the past several decades, pre-processing of ice cores has received less attention than improvements in ice drilling, the analytical methodology of various indices, and research on the climatic and environmental significance of ice core records. Therefore, this paper reviews the development of ice core processing, including framework, design and materials, and analyzes the technical advantages and disadvantages of the different systems. In the past, continuous flow analysis (CFA) has been successfully applied to process polar ice cores. However, it is not suitable for ice cores outside the polar regions because of high particle loads, the memory effect between samples, and the filtration required before injection. Ice core processing is a subtle and professional operation owing to the fragility of the non-metallic materials and the random distribution of particles and air bubbles in ice cores, which aggravates uncertainty in the measurements. Future developments of CFA are discussed with respect to pre-processing, the memory effect, the challenge of brittle ice, coupling with real-time analysis, and optimization of CFA in the field. Furthermore, non-polluting cutters with different configurations could be designed to cut and scrape in multiple directions and to separate the inner and outer portions of the core. Such a system also needs to be coupled with a streamlined operation of packaging, coding and stacking that can be implemented at high resolution and rate, avoiding manual intervention. At the same time, information on the longitudinal sections could be scanned, identified and then classified to obtain quantitative data. In addition, irregular ice volume and weight can also be obtained accurately. These improvements are recorded automatically via user-friendly interfaces. These innovations may be applied to other paleo-media with similar features and needs.
Abstract: Mathematical morphology is widely applied in digital image processing. Various morphological constructions and algorithms have been developed for different digital image processing tasks. The basic idea of mathematical morphology is to use a structuring element to measure image morphology in order to solve image-understanding problems. This article presents an advanced cellular neural network that forms a mathematical morphological cellular neural network (MMCNN) equation suited to mathematical morphology filtering. It gives theories for the MMCNN dynamic range and stable states, and shows that the mathematical morphology filter is attained through the steadying of the dynamic process under definite conditions.
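For readers unfamiliar with the underlying operators, the sketch below implements plain binary erosion and dilation with a structuring element, the building blocks that an MMCNN realises through its network dynamics; the naive double loop and the 3x3 element are illustrative choices only, not the paper's network formulation.

```python
import numpy as np

def dilate(img, se):
    """Binary dilation of `img` by structuring element `se` (both 0/1 arrays)."""
    ih, iw = img.shape
    sh, sw = se.shape
    ph, pw = sh // 2, sw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img)
    for y in range(ih):
        for x in range(iw):
            out[y, x] = np.any(padded[y:y + sh, x:x + sw] & se)
    return out

def erode(img, se):
    """Binary erosion: the SE must fit entirely inside the foreground."""
    ih, iw = img.shape
    sh, sw = se.shape
    ph, pw = sh // 2, sw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img)
    for y in range(ih):
        for x in range(iw):
            out[y, x] = np.all(padded[y:y + sh, x:x + sw][se == 1])
    return out

img = np.zeros((7, 7), dtype=int)
img[2:5, 2:5] = 1                      # 3x3 foreground square
se = np.ones((3, 3), dtype=int)
opened = dilate(erode(img, se), se)    # morphological opening
```

Opening a square that exactly fits the structuring element returns it unchanged, which is the kind of fixed-point behaviour the MMCNN's stable states are meant to encode.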
Funding: Key Science and Technology Project of the Shanghai Committee of Science and Technology, China (No. 06dz1200921); Major Basic Research Project of the Shanghai Committee of Science and Technology (No. 08JC1400100); Shanghai Talent Developing Foundation, China (No. 001); and the Specialized Foundation for Excellent Talent of Shanghai, China.
Abstract: There is a large amount of dirty data in the observation data sets derived from an integrated ocean observing network system. Thus, the data must be carefully and reasonably processed before they are used for forecasting or analysis. This paper proposes a data pre-processing model based on intelligent algorithms. Firstly, we introduce the integrated network platform of ocean observation. Next, the pre-processing model of the data is presented, and an intelligent data-cleaning model is proposed. Based on fuzzy clustering, the Kohonen clustering network is improved to fulfill the parallel calculation of fuzzy c-means clustering. The proposed dynamic algorithm can automatically find the new clustering center with the updated sample data. The rapid and dynamic performance of the model makes it suitable for real-time calculation, and the efficiency and accuracy of the model are proved by test results through observation data analysis.
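A minimal serial fuzzy c-means, the algorithm whose parallel Kohonen-style variant the paper builds on, can be sketched as follows; the update rules are the textbook ones, while the blob data, the fuzzifier m = 2 and the iteration count are illustrative assumptions.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns (centers, membership matrix U)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1
    for _ in range(iters):
        w = U ** m
        centers = (w.T @ X) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))         # standard membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Two well-separated blobs standing in for "clean" sensor regimes
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
centers, U = fuzzy_cmeans(X, c=2)
```

Samples with low maximum membership in every cluster are the natural candidates for the "dirty data" the cleaning model removes.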
Abstract: Microarray data are inherently noisy due to noise contamination from various sources during the preparation of the microarray slide, and this greatly affects the accuracy of the gene expression measurements. How to eliminate the effect of the noise constitutes a challenging problem in microarray analysis. Efficient denoising is often a necessary first step, taken before the image data are analyzed, to compensate for data corruption and enable effective utilization of these data. Hence pre-processing of the microarray image is essential to eliminate the background noise, enhance image quality and allow effective quantification. Existing denoising techniques based on transform domains have been utilized for microarray noise reduction, each with its own limitations. The objective of this paper is to introduce novel pre-processing techniques, namely optimized spatial resolution (OSR) and spatial domain filtering (SDF), for reducing noise in microarray data and reducing error during the quantification process, so that microarray spots can be estimated accurately to determine the expression level of genes. In addition, combined optimized spatial resolution and spatial filtering is proposed and found to improve denoising of microarray data with effective quantification of spots. The proposed method has been validated on microarray images of gene expression profiles of myeloid leukemia from the Stanford Microarray Database using various quality measures such as signal-to-noise ratio, peak signal-to-noise ratio, image fidelity, structural content, absolute average difference and correlation quality. Quantitative analysis showed that the proposed technique is more efficient at denoising the microarray image, making it suitable for effective quantification.
Abstract: The solution of systems of linear equations can be applied to oil exploration, structural vibration analysis, computational fluid dynamics, and other fields. When performing in-depth analysis of large or very large complicated structures, parallel algorithms running on high-performance computers must be used to solve the resulting complex problems. This paper introduces the implementation of a parallel solver for sparse systems of linear equations.
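To make the setting concrete, the sketch below pairs a CSR sparse matrix-vector product with a serial conjugate-gradient solver on a small SPD tridiagonal system; it illustrates the data structure and solver family involved, under the assumption of a symmetric positive-definite system, and says nothing about the paper's actual parallel implementation.

```python
import numpy as np

def csr_matvec(data, indices, indptr, x):
    """y = A @ x with A stored in CSR (compressed sparse row) form."""
    n = len(indptr) - 1
    y = np.zeros(n)
    for i in range(n):
        for k in range(indptr[i], indptr[i + 1]):
            y[i] += data[k] * x[indices[k]]
    return y

def conjugate_gradient(matvec, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for SPD A given only a matvec routine."""
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Sparse SPD tridiagonal system: 2 on the diagonal, -1 off-diagonal
n = 10
data, indices, indptr = [], [], [0]
for i in range(n):
    for j in (i - 1, i, i + 1):
        if 0 <= j < n:
            data.append(2.0 if i == j else -1.0)
            indices.append(j)
    indptr.append(len(data))
data = np.array(data)

b = np.ones(n)
x = conjugate_gradient(lambda v: csr_matvec(data, indices, indptr, v), b)
```

In a parallel implementation, the row loop of the matvec is the natural unit of distribution; the CG iteration itself only needs dot products and vector updates.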
Funding: Supported by the National Science and Technology Innovation 2030 Next-Generation Artificial Intelligence Major Project (2018AAA0101801) and the National Natural Science Foundation of China (72271188).
Abstract: With the development of information technology, a large amount of product quality data covering the entire manufacturing process has been accumulated, but it is not explored and used effectively. Traditional product quality prediction models have many disadvantages, such as high complexity and low accuracy. To overcome these problems, we propose an optimized data equalization method to pre-process the dataset and design a simple but effective product quality prediction model: a radial basis function model optimized by the firefly algorithm with a Levy flight mechanism (RBFFALFM). First, the new data equalization method is introduced to pre-process the dataset, which reduces the dimension of the data, removes redundant features, and improves the data distribution. Then the RBFFALFM is used to predict product quality. Comprehensive experiments conducted on real-world product quality datasets validate that the new RBFFALFM model, combined with the new data pre-processing method, outperforms previous methods in predicting product quality.
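The optimizer half of the model can be illustrated in isolation. Below is a generic firefly algorithm with Mantegna-style Levy steps minimising a sphere function; the population size, attractiveness parameters and step scaling are arbitrary assumptions, and the paper's coupling of the optimizer to RBF training is not reproduced.

```python
import numpy as np
from math import gamma, sin, pi

def levy(rng, dim, beta=1.5):
    """Mantegna's algorithm for heavy-tailed Levy-flight step lengths."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def firefly_levy(f, dim=2, n=15, iters=100, alpha=0.3,
                 beta0=1.0, gamma_=0.01, seed=0):
    """Firefly algorithm with Levy-flight perturbation, minimising f."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, (n, dim))
    fit = np.array([f(x) for x in X])
    best_i = int(np.argmin(fit))
    best_x, best_f = X[best_i].copy(), fit[best_i]
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if fit[j] < fit[i]:              # move i toward brighter j
                    r2 = np.sum((X[i] - X[j]) ** 2)
                    attract = beta0 * np.exp(-gamma_ * r2)
                    X[i] = X[i] + attract * (X[j] - X[i]) + alpha * 0.01 * levy(rng, dim)
                    fit[i] = f(X[i])
                    if fit[i] < best_f:          # track the global best
                        best_x, best_f = X[i].copy(), fit[i]
        alpha *= 0.97                            # shrink the random walk
    return best_x, best_f

x_best, f_best = firefly_levy(lambda x: float(np.sum(x ** 2)))
```

In the paper's setting, `f` would instead score an RBF model's prediction error over the training data, so the fireflies search the model's parameter space.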
Funding: The authors thank the anonymous reviewers and journal editors, whose assistance enhanced the paper's logical organisation and content quality.
Abstract: Forecasting river flow is crucial for optimal planning, management, and sustainable use of freshwater resources. Many machine learning (ML) approaches have been enhanced to improve streamflow prediction. Hybrid techniques have been viewed as a viable method for enhancing the accuracy of univariate streamflow estimation compared to standalone approaches, and current researchers have also emphasised using hybrid models to improve forecast accuracy. Accordingly, this paper conducts an updated literature review of applications of hybrid models to streamflow estimation over the last five years, summarising data pre-processing, univariate machine learning modelling strategies, advantages and disadvantages of standalone ML techniques, hybrid models, and performance metrics. This study focuses on two types of hybrid models: parameter optimisation-based hybrid models (OBH) and the hybridisation of parameter optimisation-based and preprocessing-based hybrid models (HOPH). Overall, this research supports the idea that meta-heuristic approaches precisely improve ML techniques. It is also one of the first efforts to comprehensively examine the efficiency of various meta-heuristic approaches (classified into four primary classes) hybridised with ML techniques. This study revealed that previous research applied swarm, evolutionary, physics, and hybrid metaheuristics in 77%, 61%, 12%, and 12% of cases, respectively. Finally, there is still room to improve OBH and HOPH models by examining different data pre-processing techniques and metaheuristic algorithms.
Funding: This work was supported by a research grant from Seoul Women's University (2023-0183).
Abstract: Broadcasting gateway equipment generally uses a method of simply switching to a spare input stream when a failure occurs in the main input stream. However, when the transmission environment is unstable, problems such as a reduced equipment lifespan due to frequent switching, and interruption, delay and stoppage of services, may occur. Therefore, a machine learning (ML) method is required that can automatically judge and classify network-related service anomalies and switch multi-input signals without dropping or changing signals, by predicting or quickly determining the time of error occurrence, for smooth stream switching when problems such as transmission errors arise. In this paper, we propose an intelligent packet-switching method based on classification, one of the supervised learning methods, that estimates from data the risk level of abnormal multi-streams occurring in broadcasting gateway equipment. Furthermore, we subdivide the risk levels obtained from the classification techniques into probabilities, and then derive vectorized representative values for each attribute of the collected input data and continuously update them. The obtained reference vector is used for the switching judgment through its cosine similarity with the input data obtained when a dangerous situation occurs. In broadcasting gateway equipment to which the proposed method is applied, switching can be performed more stably and intelligently than before, solving the reliability problems and broadcasting accidents of the equipment while maintaining stable video streaming.
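The switching judgment reduces to comparing each new sample of stream metrics against a continuously updated reference vector. The sketch below shows one plausible shape for that loop; the metric triple (bitrate, error count, jitter), the 0.95 threshold, and the exponential-moving-average update rate are invented for illustration and are not taken from the paper.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

class StreamMonitor:
    """Keeps a running reference vector of 'normal' stream metrics and
    flags a switch when a new sample drifts too far from it."""
    def __init__(self, threshold=0.95, alpha=0.1):
        self.ref = None
        self.threshold = threshold   # similarity below this triggers a switch
        self.alpha = alpha           # EMA rate for updating the reference

    def observe(self, sample):
        if self.ref is None:
            self.ref = list(sample)  # first sample seeds the reference
            return False
        sim = cosine_similarity(self.ref, sample)
        if sim < self.threshold:
            return True              # recommend switching to the spare input
        # healthy sample: fold it into the reference vector
        self.ref = [(1 - self.alpha) * r + self.alpha * s
                    for r, s in zip(self.ref, sample)]
        return False

mon = StreamMonitor()
mon.observe([100, 5, 0.1])          # e.g. (bitrate, error count, jitter)
ok = mon.observe([102, 5, 0.1])     # near-identical metrics: no switch
bad = mon.observe([3, 90, 40.0])    # abnormal metrics: switch
```

Continuously updating the reference only on healthy samples keeps the baseline current without letting anomalies contaminate it.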
Abstract: The use of traditional herbal drugs derived from natural sources is on the rise due to their minimal side effects and numerous health benefits. However, a major limitation is the lack of standardized knowledge for identifying and mapping the quality of these herbal medicines. This article aims to provide practical insights into the application of artificial intelligence for quality-based commercialization of raw herbal drugs. It focuses on feature extraction methods, image processing techniques, and the preparation of herbal images for compatibility with machine learning models. The article discusses commonly used image processing tools such as normalization, slicing, cropping, and augmentation to prepare images for artificial intelligence-based models. It also provides an overview of global herbal image databases and the models employed for herbal plant/drug identification. Readers will gain a comprehensive understanding of the potential application of various machine learning models, including artificial neural networks and convolutional neural networks. The article delves into suitable validation parameters like true positive rates, accuracy, precision, and more for the development of artificial intelligence-based identification and authentication techniques for herbal drugs. This article offers valuable insights and a conclusive platform for the further exploration of artificial intelligence in the field of herbal drugs, paving the way for smarter identification and authentication methods.
Abstract: This research concentrates on modelling an efficient thyroid prediction approach, addressing a significant health problem faced by the women's community. The major research problem is the lack of an automated model to attain earlier prediction, and some existing models fail to give better prediction accuracy. Here, a novel clinical decision support system is framed to make the proper decision during times of complexity. Multiple stages are followed in the proposed framework, each playing a substantial role in thyroid prediction: i) data acquisition, ii) outlier prediction, and iii) a multi-stage weight-based ensemble learning process (MS-WEL). The weighted analysis of the base classifier and the other classifier models helps bridge the gap encountered in any single classifier model. Various classifiers are merged to handle the issues identified in the others and to enhance the prediction rate. The proposed model provides superior outcomes and a good-quality prediction rate. The simulation is done in the MATLAB 2020a environment and establishes a better trade-off than various existing approaches, giving a prediction accuracy of 97.28%, higher than other models.
Abstract: The satellite laser ranging (SLR) data quality from COMPASS was analyzed, and the difference between curve recognition in computer vision and the pre-processing of SLR data was discussed; finally, a new algorithm for SLR data based on curve recognition from point clouds is proposed. The results obtained by the new algorithm are 85% (or even more) consistent with those of the screen-displaying method; furthermore, the new method can process SLR data automatically, which makes it possible to use it in the development of the COMPASS navigation system.
Funding: Supported by the National Natural Science Foundation of China (Nos. 62003115 and 11972130), the Shenzhen Science and Technology Program, China (JCYJ20220818102207015), and the Heilongjiang Touyan Team Program, China.
Abstract: The Low Earth Orbit (LEO) remote sensing satellite mega-constellation is characterized by a large number of satellites of various types, which gives it a unique advantage in carrying out concurrent multiple tasks. However, the complexity of resource allocation is increased by the large number of tasks and satellites. Therefore, the primary problem in implementing concurrent multiple tasks via a LEO mega-constellation is to pre-process the tasks and observation resources. To address this challenge, we propose a pre-processing algorithm for the mega-constellation based on highly Dynamic Spatio-Temporal Grids (DSTG). In the first stage, this paper describes the management model of the mega-constellation and the multiple tasks. Then, the coding method of the DSTG is proposed, based on which the description of complex mega-constellation observation resources is realized. In the third part, the DSTG algorithm is used to process concurrent multiple tasks at multiple levels, such as task space attributes, time attributes and grid task importance evaluation. Finally, simulation results for a constellation case are given to verify the effectiveness of concurrent multi-task pre-processing based on DSTG. The autonomous processing of task decomposition, task fusion and mapping to grids, and the convenient indexing of time windows are verified.
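The core idea of a spatio-temporal grid is discretising position and time into indexable cells so that tasks and observation windows can be matched by integer lookup rather than geometric computation. A minimal encoding might look like the following; the 1-degree/10-minute resolutions and the tuple code are invented stand-ins for the paper's DSTG scheme.

```python
from datetime import datetime, timezone

def grid_code(lat, lon, t, lat_res=1.0, lon_res=1.0, t_res_s=600):
    """Encode (latitude, longitude, time) into a discrete grid-cell id.
    Resolutions are illustrative: 1-degree cells and 10-minute slots."""
    lat_idx = int((lat + 90) // lat_res)
    lon_idx = int((lon + 180) // lon_res)
    t_idx = int(t.timestamp() // t_res_s)
    return (lat_idx, lon_idx, t_idx)

# Two observation opportunities 5 minutes apart fall in the same time slot
t1 = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
t2 = datetime(2024, 1, 1, 12, 5, tzinfo=timezone.utc)
same = grid_code(30.5, 114.2, t1) == grid_code(30.5, 114.2, t2)
```

Tasks and satellite access windows mapped to the same cell id can then be fused or indexed directly, which is the mechanism behind the task decomposition/fusion and time-window indexing the abstract describes.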
Funding: The authors would like to express their gratitude to the Ministry of Education and the Deanship of Scientific Research, Najran University, Kingdom of Saudi Arabia, for their financial and technical support under code number NU/NRP/SERC/11/3.
Abstract: Biomedical image processing is an essential part of several medical applications supporting computer-aided disease diagnosis. Magnetic Resonance Imaging (MRI) is a commonly utilized imaging tool used to scan gliomas for clinical examination. Biomedical image segmentation plays a vital role in the healthcare decision-making process and also helps to identify the affected regions in the MRI. Though numerous segmentation models are available in the literature, effective segmentation models for brain tumors (BT) are still needed. This study develops a salp swarm algorithm with multi-level thresholding based brain tumor segmentation (SSAMLT-BTS) model. The presented SSAMLT-BTS model initially employs bilateral filtering for noise removal and skull stripping as a pre-processing phase. In addition, the Otsu thresholding approach is applied to segment the biomedical images, and the optimum threshold values are chosen by the use of the SSA. Finally, the active contour (AC) technique is used to identify the suspicious regions in the medical image. A comprehensive experimental analysis of the SSAMLT-BTS model is performed using a benchmark dataset, and the outcomes are inspected in many aspects. The simulation outcomes report the improved performance of the SSAMLT-BTS model over recent approaches, with a maximum accuracy of 95.95%.
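The thresholding stage is the most self-contained part of the pipeline. Below is a plain single-level Otsu implementation on a synthetic bimodal intensity distribution; the paper's version is multi-level, with thresholds searched by the salp swarm algorithm rather than exhaustively, and the synthetic data are an assumption.

```python
import numpy as np

def otsu_threshold(img, bins=256):
    """Exhaustive Otsu: pick the threshold maximising between-class variance."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, bins):
        w0, w1 = p[:t].sum(), p[t:].sum()      # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0
        mu1 = (np.arange(t, bins) * p[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Synthetic bimodal "image": dark background and a bright lesion-like region
rng = np.random.default_rng(0)
img = np.clip(np.concatenate([rng.normal(50, 10, 5000),
                              rng.normal(200, 10, 1000)]), 0, 255)
t = otsu_threshold(img)
```

Multi-level thresholding generalises this to several cut points, and the exhaustive search over all combinations is what makes a metaheuristic like SSA attractive.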
Abstract: The aviation industry is one of the most competitive markets. The most common approach for airline service providers is to improve passenger satisfaction. Passenger satisfaction in the aviation industry occurs when passengers' expectations are met during flights. Airline service quality is critical in attracting new passengers and retaining existing ones, so it is crucial to identify passengers' pain points and enhance their satisfaction with the services offered. The airlines have used a variety of techniques to improve service quality, employing data analysis approaches to analyze the passenger pain-point data. These solutions have focused simply on surveys; consequently, deep-learning approaches have received insufficient attention. In this study, deep neural networks with the adaptive moment estimation (Adam) optimization algorithm were applied to enhance classification performance. In previous studies, the quality of the dataset has been ignored. The proposed approach was applied to the airline passenger satisfaction dataset from the Kaggle repository. It was validated by applying artificial neural networks (ANNs), random forests, and support vector machine techniques to the same dataset, and was compared with other research papers that used the same dataset and addressed a similar problem. The experimental results showed that the proposed approach outperformed previous studies, achieving an accuracy of 99.3%.
Funding: This research project was funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2022R234), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: Systems based on EHRs (electronic health records) have been in use for many years, and their benefits have been felt even more strongly in recent times; they have been pioneering collections of massive volumes of health data. Duplicate detection involves discovering records referring to the same practical entity, a task that generally depends on several input parameters supplied by experts. Record linkage refers to the problem of finding identical records across various data sources. The similarity between two records is characterized by domain-based similarity functions over different features. De-duplication of one dataset, or the linkage of multiple datasets, has become a highly significant operation in the data processing stages of different data mining programmes. The objective is to match all the records associated with the same entity. Various measures have been used to represent the quality and complexity of data linkage algorithms, and many other novel metrics have been introduced. An outline of the problems in measuring data linkage and de-duplication quality and complexity is presented. This article focuses on the pre-processing of health data that is horizontally divided among data custodians, where custodians hold similar features for sets of patients. The first step in this technique is the automatic selection of high-quality training examples from the compared record pairs, and the second step involves training the reciprocal neuro-fuzzy inference system (RANFIS) classifier. With the Optimal Threshold classifier, it is presumed that information about the true match status is available for all compared record pairs, and an optimal threshold can therefore be computed based on the respective RANFIS using Ant Lion Optimization. The Febrl, Clinical Decision (CD), and Cork Open Research Archive (CORA) data repositories help analyze the proposed method against evaluated benchmarks and current techniques.
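The domain-based similarity functions the abstract refers to can be made concrete with a toy pair scorer. The field weights, the 0.85 threshold, and the sample records below are invented for illustration; the paper's RANFIS classifier and Ant Lion-optimised threshold are not reproduced.

```python
def levenshtein(a, b):
    """Edit distance via two-row dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def field_similarity(a, b):
    """Normalised string similarity in [0, 1]."""
    if not a and not b:
        return 1.0
    return 1.0 - levenshtein(a, b) / max(len(a), len(b))

def pair_score(rec1, rec2, weights):
    """Weighted average of per-field similarities for a record pair."""
    total = sum(weights.values())
    return sum(w * field_similarity(rec1[f], rec2[f])
               for f, w in weights.items()) / total

weights = {"name": 0.5, "dob": 0.3, "city": 0.2}
r1 = {"name": "johnathan smith", "dob": "1980-02-01", "city": "cork"}
r2 = {"name": "jonathan smith",  "dob": "1980-02-01", "city": "cork"}
r3 = {"name": "mary oconnor",    "dob": "1975-11-23", "city": "dublin"}

is_match = pair_score(r1, r2, weights) > 0.85      # likely the same patient
non_match = pair_score(r1, r3, weights) > 0.85     # clearly different
```

Replacing the fixed weights and threshold with a trained classifier and an optimised cut-off is exactly the step the paper's RANFIS/ALO pipeline performs.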