Automatic modulation recognition (AMR) of radiation source signals is a research focus in the field of cognitive radio. However, the AMR of radiation source signals at low SNRs still faces a great challenge. Therefore, an AMR method for radiation source signals based on a two-dimensional data matrix and an improved residual neural network is proposed in this paper. First, the time series of the radiation source signals are reconstructed into a two-dimensional data matrix, which greatly simplifies the signal preprocessing process. Second, a residual neural network based on depthwise convolution and large-size convolutional kernels (DLRNet) is proposed to improve the feature extraction capability of the AMR model. Finally, the model performs feature extraction and classification on the two-dimensional data matrix to obtain a recognition vector that represents the signal modulation type. Theoretical analysis and simulation results show that the proposed method significantly improves recognition accuracy, which remains above 90% even at -14 dB SNR.
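As a rough sketch of the reconstruction step, the 1-D sampled signal can be folded into a 2-D matrix with plain array reshaping; the 32 x 32 shape below is an assumption, since the abstract does not give the paper's actual matrix dimensions:

```python
import numpy as np

def to_matrix(signal, rows):
    """Reshape a 1-D sampled signal into a 2-D data matrix.

    `rows` is a free parameter here (the paper's dimensions are not
    stated in the abstract); trailing samples that do not fill a
    complete row are dropped.
    """
    cols = len(signal) // rows
    return np.asarray(signal[: rows * cols]).reshape(rows, cols)

# A 1024-sample signal becomes a 32 x 32 matrix usable as CNN input.
sig = np.sin(0.1 * np.arange(1024))
m = to_matrix(sig, 32)
print(m.shape)  # (32, 32)
```

The matrix can then be fed to a convolutional model with no further preprocessing, which is the simplification the abstract highlights.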
The identification of intercepted radio fuze modulation types is a prerequisite for decision-making in interference systems. However, the electromagnetic environment of modern battlefields is complex, and the signal-to-noise ratio (SNR) of such environments is usually low, which makes it difficult to implement accurate recognition of radio fuzes. To solve the above problem, a radio fuze automatic modulation recognition (AMR) method for low-SNR environments is proposed. First, an adaptive denoising algorithm based on data rearrangement and the two-dimensional (2D) fast Fourier transform (FFT), DR2D, is used to reduce the noise of the intercepted radio fuze intermediate frequency (IF) signal. Then, the textural features of the denoised IF signal's rearranged data matrix are extracted from the statistical indicator vectors of gray-level co-occurrence matrices (GLCMs), and support vector machines (SVMs) are used for classification. The DR2D-based adaptive denoising algorithm achieves an average correlation coefficient of more than 0.76 for ten fuze types under SNRs of -10 dB and above, which is higher than that of other typical algorithms. The trained SVM classification model achieves an average recognition accuracy of more than 96% on seven modulation types and recognition accuracies of more than 94% on each modulation type under SNRs of -12 dB and above, which represents a good AMR performance for radio fuzes under low SNRs.
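The GLCM indicator vector can be illustrated with a minimal NumPy sketch. The specific statistical indicators used in the paper are not listed in the abstract; contrast, energy, and homogeneity are typical choices, and the resulting vectors would then feed an SVM classifier:

```python
import numpy as np

def glcm(img, levels=8, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    img = np.asarray(img)
    g = np.zeros((levels, levels))
    h, w = img.shape
    for i in range(h - dy):
        for j in range(w - dx):
            g[img[i, j], img[i + dy, j + dx]] += 1
    return g / g.sum()

def glcm_stats(p):
    """Contrast, energy and homogeneity computed from a normalized GLCM."""
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)
    energy = np.sum(p ** 2)
    homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
    return np.array([contrast, energy, homogeneity])

# Quantize a toy "rearranged data matrix" to 8 gray levels, then extract
# the indicator vector that an SVM classifier would consume.
mat = np.abs(np.sin(np.outer(np.arange(16), np.arange(16)) * 0.1))
q = np.minimum((mat * 8).astype(int), 7)
print(glcm_stats(glcm(q)))
```

In practice one would compute several offsets (dx, dy) and stack the indicator vectors before training the SVM.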
In order to provide important parameters for schedule design, decision-making bases for transit operation management, and references for passengers traveling by bus, bus transit travel time reliability is analyzed and evaluated based on automatic vehicle location (AVL) data. Based on the statistical analysis of bus transit travel times, six indices are proposed: the coefficient of variation, the width of the travel time distribution, the mean commercial speed, the congestion frequency, the planning time index, and the buffer time index. Moreover, a framework for evaluating bus transit travel time reliability is constructed. Finally, a case study on a bus route in Suzhou is conducted. Results show that the proposed evaluation index system is simple and intuitive, and it can effectively reflect the efficiency and stability of bus operations. A distinguishing feature of bus transit travel time reliability is its temporal pattern: it varies across different time periods.
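Assuming the standard definitions of these indices (the abstract names them but gives no formulas), a sketch of their computation from AVL travel-time samples might look like this:

```python
import numpy as np

def travel_time_indices(t, free_flow):
    """Common travel-time reliability indices. The definitions below are
    the usual ones from the reliability literature, assumed rather than
    taken from the paper."""
    t = np.asarray(t, dtype=float)
    mean = t.mean()
    p95 = np.percentile(t, 95)
    return {
        "coefficient_of_variation": t.std() / mean,
        "planning_time_index": p95 / free_flow,    # 95th-pct time vs free flow
        "buffer_time_index": (p95 - mean) / mean,  # extra margin vs the mean
    }

# 30 observed trip times (minutes) on a route with a 20-minute free-flow time.
rng = np.random.default_rng(0)
times = 20 + rng.gamma(2.0, 2.0, size=30)
print(travel_time_indices(times, free_flow=20.0))
```

Computed per time-of-day bin, these indices expose the temporal pattern the case study reports.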
A data acquisition, analysis and calibration system mounted on the vehicle is developed for research on the automatic shift system (ASS). Considering the vehicle's harsh environment, including vibration, high and low temperatures, and electromagnetic disturbance, the most suitable design is selected. A PC104 computer transfers data with the ECU by serial communication, and a solid-state disk is used as FLASH ROM. Techniques including frequency division of the data are adopted in the software design to ensure the sampling frequency. The analysis and debugging software is also designed according to the characteristics of the ASS. The system plays an important role in the development of the ASS because of its good reliability and practicability in application.
The control system of Hefei Light Source II (HLS-II) is a distributed system based on the Experimental Physics and Industrial Control System (EPICS). The existing archiving system requires maintenance of central configuration files: when process variables in the control system are added, removed, or updated, the configuration files must be manually modified to maintain consistency with the control system. This paper presents a new method for data archiving that realizes automatic configuration of the archiving parameters. The system uses a microservice architecture to integrate the EPICS Archiver Appliance and RecSync. In this way, the system can collect all the archiving meta-configuration from the distributed input/output controllers and enter it into the EPICS Archiver Appliance automatically. Furthermore, we also developed a web-based GUI to provide automatic visualization of real-time and historical data. At present, this system is under commissioning at HLS-II. The results indicate that the new archiving system is reliable and convenient to operate. The maintenance-free operation mode is valuable for large-scale scientific facilities.
A novel technique for automatic seismic data processing using both integral and local features of seismograms is presented in this paper. Here, the term "integral feature" refers to a feature that depicts the shape of the whole seismogram. Unlike some previous efforts that completely abandon the DIAL approach (signal detection, phase identification, association, and event localization) and seek to use envelope cross-correlation to detect seismic events directly, our technique keeps following the DIAL approach; however, in addition to detecting signals corresponding to individual seismic phases, it also detects continuous wave-trains and exploits their features for phase-type identification and signal association. More concrete ideas about how to define wave-trains and combine them with various detections, as well as how to measure and utilize their features in seismic data processing, are expounded in the paper. This approach has been applied in our routine data processing for years, and test results for a 16-day period using data from the Xinjiang seismic station network are presented. The automatic processing results have fairly low false-event and missed-event rates simultaneously, showing that the new technique has good prospects for improving automatic seismic data processing.
Accurate detection and picking of the P-phase onset time in noisy microseismic data from underground mines remains a big challenge. Reliable P-phase onset time picking is necessary for the accurate source location needed for planning and rescue operations in the event of failures. In this paper, a new technique based on the discrete stationary wavelet transform (DSWT) and higher-order statistics is proposed for processing noisy data from underground mines. The objectives of this method are to (i) improve manual detection and picking of the P-phase onset and (ii) provide an automatic means of detecting and picking the P-phase onset time accurately. The DSWT is first used to filter the signal over several scales. Manual P-phase onset detection and picking are then obtained by computing the signal energy across selected scales with frequency bands that capture the signal of interest. The automatic P-phase onset, on the other hand, is achieved by applying a skewness- and kurtosis-based criterion to selected scales in a time-frequency domain. The method was tested using synthetic and field data from an underground limestone mine. Results were compared with those obtained using the short-term to long-term average (STA/LTA) ratio and with those of Ge et al. (2009). The results show that the method provides a more reliable estimate of the P-phase onset arrival than the STA/LTA method when the signal-to-noise ratio is very low. Also, the results obtained from the field data matched the results of Ge et al. (2009) accurately.
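A minimal illustration of a kurtosis-based onset criterion, applied here to a raw trace rather than to DSWT scales, and with an arbitrary window length; the intuition is that excess kurtosis jumps when an impulsive arrival breaks the Gaussian noise background:

```python
import numpy as np

def kurtosis(x):
    """Sample excess kurtosis (zero for Gaussian data)."""
    x = np.asarray(x, dtype=float)
    d = x - x.mean()
    return np.mean(d ** 4) / (np.mean(d ** 2) ** 2 + 1e-12) - 3.0

def pick_onset(sig, win=50):
    """Return the index of the sliding window with the strongest
    non-Gaussianity. A crude sketch of the criterion only: the paper
    applies it per wavelet scale, which is omitted here."""
    k = np.array([kurtosis(sig[i:i + win]) for i in range(len(sig) - win)])
    return int(np.argmax(k))

rng = np.random.default_rng(1)
noise = rng.normal(0.0, 1.0, 600)
noise[300:320] += 15.0 * np.exp(-0.3 * np.arange(20))  # impulsive P arrival
print(pick_onset(noise))
```

A production picker would refine the window position (e.g. by the kurtosis gradient) rather than taking the raw argmax.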
[Objective] The research aimed to study the influence of automatic station data on the sequence continuity of historical meteorological data. [Method] Based on the temperature data measured by an automatic meteorological station and the corresponding manual observation data from January to December 2001, the monthly average, maximum, and minimum temperatures from the automatic station were compared with the corresponding manual observations in the parallel observation period using the contrast difference and the standard deviation of the difference series, so as to understand the differences between the automatic station and manual data and their variation characteristics. Meanwhile, a significance test and analysis of annual average values were carried out on the data sequence from 1990 to 2009, and the influence of replacing manual observation with the automatic station on the sequence continuity of historical temperature data was discussed. [Result] Although the two temperature series in the parallel observation period differed somewhat, the difference was on average within the permitted range for automatic station differences; only individual months exceeded that range. The significance test showed that the annual average temperature and the annual average minimum temperature observed at the automatic station differed from the historical data, which had a certain influence on the annual temperature sequence, but the difference was not significant as a whole. When automatic observations are combined with manual observations, the sequence needs homogeneity testing and correction.
[Conclusion] The research plays an important role in guaranteeing the standalone operation of automatic stations, optimizing the surface meteorological observation system, and improving the climate sequence continuity of meteorological elements and the reliability of climate statistics.
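The contrast difference and the standard deviation of the difference series reduce to simple statistics on the paired observations; the temperature values below are invented for illustration:

```python
import numpy as np

def parallel_comparison(auto, manual):
    """Contrast difference (mean of auto - manual) and standard deviation
    of the difference series for a parallel observation period."""
    d = np.asarray(auto, dtype=float) - np.asarray(manual, dtype=float)
    return d.mean(), d.std(ddof=1)

auto_t = [15.2, 16.1, 18.4, 20.0, 22.3]    # automatic station monthly means, degC
manual_t = [15.0, 16.3, 18.2, 19.9, 22.5]  # parallel manual observations, degC
bias, spread = parallel_comparison(auto_t, manual_t)
print(round(bias, 3), round(spread, 3))  # 0.02 0.205
```

A month whose difference falls outside the permitted range would then be flagged for the homogeneity test the abstract mentions.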
As the demand for wind energy continues to grow at an exponential rate, reducing operation and maintenance (O&M) costs and improving reliability have become top priorities in wind turbine maintenance strategies. Predicting wind turbine failures before they reach a catastrophic stage is critical to reducing the O&M cost caused by unnecessary scheduled maintenance. A SCADA-data-based condition monitoring system, which takes advantage of data already collected at the wind turbine controller, is a cost-effective way to monitor wind turbines for early warning of failures. This article proposes a methodology for fault prediction and automatic generation of warnings and alarms for wind turbine main bearings based on stored SCADA data using an artificial neural network (ANN). An ANN model of the turbine main bearing's normal behavior is established, and the deviation between the estimated and actual values of the monitored parameter is calculated. Furthermore, a method has been developed to generate early warnings and alarms based on this deviation while avoiding false warnings and alarms. In this way, wind farm operators have enough time to plan maintenance; thus, unanticipated downtime can be avoided and O&M costs can be reduced.
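A hedged sketch of the residual-based warning/alarm logic: the thresholds, the persistence rule, and the constant stand-in for the ANN's normal-behavior estimate are illustrative, not the paper's values. Requiring the deviation to persist is one simple way to suppress the false alerts the abstract mentions:

```python
import numpy as np

def alarms(actual, estimated, warn=3.0, alarm=5.0, persist=5):
    """Raise a warning/alarm only when the residual between measured and
    model-estimated bearing temperature stays above a threshold for
    `persist` consecutive samples (illustrative parameters)."""
    resid = np.abs(np.asarray(actual) - np.asarray(estimated))
    state = []
    run_w = run_a = 0
    for r in resid:
        run_w = run_w + 1 if r > warn else 0
        run_a = run_a + 1 if r > alarm else 0
        state.append("alarm" if run_a >= persist else
                     "warning" if run_w >= persist else "ok")
    return state

est = np.full(20, 60.0)  # stand-in for the ANN normal-behaviour estimate, degC
act = np.concatenate([np.full(10, 60.5), np.full(10, 66.0)])  # measured values
print(alarms(act, est))
```

With these numbers the first ten samples stay "ok" and the sustained 6 degC deviation escalates to "alarm" only after five consecutive exceedances.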
Meteorological data are useful for varied applications and sectors, ranging from weather and climate forecasting and landscape planning to disaster management, among others. However, the availability of these data requires a good network of manual meteorological stations and other support systems for their collection, recording, processing, archiving, communication, and dissemination. In sub-Saharan Africa, such networks are limited due to low investment and capacity. To bridge this gap, the National Meteorological Service in Kenya and a few others in African countries have moved to install a number of Automatic Weather Stations (AWSs) in the past decade, including a few additions from private institutions and individuals. Although these AWSs have the potential to improve the existing observation network and the early warning systems in the region, the quality and capacity of the data collected from the stations are not well exploited, mainly due to low confidence, among data users, in electronically observed data. In this study, we set out to confirm that electronically observed data are of comparable quality to data recorded by a human observer and can thus be used to bridge data gaps at temporal and spatial scales. To assess this potential, we applied the simple Pearson correlation method and other statistical tests and approaches, conducting an inter-comparison analysis of weather observations from the manual synoptic station and from two Automatic Weather Stations (TAHMO and 3D-PAWS) co-located at KMD Headquarters to establish existing consistencies and variances in several weather parameters. Results show comparable consistency in most of the weather parameters across the three stations.
Strong associations were noted between the TAHMO and manual station data for minimum (r = 0.65) and maximum temperatures (r = 0.86), and for the maximum temperature between TAHMO and 3D-PAWS (r = 0.56). Similar associations were indicated for surface pressure (r = 0.99) and RH (r > 0.6), with the weakest correlations occurring in wind direction and speed. The Shapiro test for the normality assumption indicated that several parameters compared between the three stations were normally distributed (p > 0.05). We conclude that these findings can be used as a basis for wider use of data sets from Automatic Weather Stations in Kenya and elsewhere, which can inform various applications in weather- and climate-related decisions.
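Both tests used in the inter-comparison are available in SciPy; the data below are synthetic stand-ins for the co-located station records:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
manual = 20 + 5 * rng.standard_normal(100)  # manual daily max temperature, degC
tahmo = manual + rng.normal(0.0, 2.0, 100)  # co-located AWS reading with error

r, p_r = stats.pearsonr(manual, tahmo)      # agreement between the two stations
w, p_norm = stats.shapiro(manual - tahmo)   # normality of the difference series
print(f"r={r:.2f}")
```

The same two calls, run per weather parameter and per station pair, reproduce the kind of r and p values the study tabulates.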
[Objective] The aim was to study the rear-end real-time data quality control method for regional automatic weather stations. [Method] The basic content and steps of rear-end real-time data quality control for regional automatic weather stations were introduced. Each element was treated with a systematic quality control procedure. Problems existing in the rear-end real-time data of regional meteorological stations in Guangxi were expounded. Combining relevant elements and their linear changes, improvements were made on the traditional quality control method. By means of element evaluation and cross-checks between elements, the quality of temperature and pressure data was controlled. [Result] The method was optimized relative to the traditional quality control method and shortened the time needed for real-time data quality control. The quality check of hourly precipitation applied a cross-check against hourly minimum temperature and a vertical consistency check against radar data, which can effectively improve the accuracy and credibility of hourly precipitation quality control. [Conclusion] The method was trialed for one year in the quality control of real-time data from regional automatic meteorological stations in Guangxi and gained good results.
The quality control system for meteorological real-time data from automatic weather stations in Shandong realizes integration of the communication system and the provincial quality control system, and an interaction platform, mainly Web-based, was set up. The system not only fully guarantees the running of basic operations but also improves the reliability of the data.
Objectives: The aim of this study was to investigate and develop a data storage and exchange format for the process of automatic systematic reviews (ASR) of traditional Chinese medicine (TCM). Methods: A lightweight and commonly used data format, JavaScript Object Notation (JSON), was introduced in this study. We designed a fully described data structure to collect TCM clinical trial information based on JSON syntax. Results: A smart and powerful data format, JSON-ASR, was developed. JSON-ASR uses a plain-text data format in the form of key/value pairs and consists of six sections and more than 80 preset pairs. JSON-ASR adopts extensible structured arrays to support situations with multiple groups and multiple outcomes. Conclusion: JSON-ASR has the characteristics of light weight, flexibility, and good scalability, making it suitable for the complex data of clinical evidence.
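A sketch of what a JSON-ASR-like record might look like; the field names here are illustrative guesses, not the preset pairs the format actually defines, but they show the key/value style and the extensible arrays for multiple groups and outcomes:

```python
import json

# Hypothetical record: the section and key names are invented for
# illustration; JSON-ASR's real schema has six sections and 80+ pairs.
trial = {
    "identification": {"title": "...", "registry_id": "..."},
    "interventions": [  # extensible array: one entry per study group
        {"group": "treatment", "intervention": "TCM decoction", "n": 60},
        {"group": "control", "intervention": "placebo", "n": 60},
    ],
    "outcomes": [       # extensible array: one entry per outcome
        {"name": "response rate", "measure": "risk ratio", "value": 1.25},
    ],
}

text = json.dumps(trial, indent=2)  # serialize for storage or exchange
assert json.loads(text) == trial    # plain-text format round-trips losslessly
```

Because the payload is plain JSON, any ASR pipeline stage can parse it with a standard library, which is the light-weight/scalability point of the abstract.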
In the present paper, a new criterion is derived to obtain the optimum fitting curve when using cubic B-spline basis functions to remove statistical noise from spectroscopic data. In this criterion, smoothed fitting curves using cubic B-spline basis functions are first computed with an increasing number of knots. The best fitting curve is then selected according to the minimum residual sum of squares (RSS) between two adjacent fitting curves. In the case of more than one best fitting curve, the authors use Reinsch's first condition to find a better one. The minimum RSS of the fitting curve against the noisy data is not recommended as the criterion to determine the best fitting curve, because this value decreases to zero as the number of selected knots increases, and the minimum value gives no smoothing effect. Compared with Reinsch's method, the derived criterion is simple and enables the smoothing conditions to be determined automatically without any initial input parameter. With the derived criterion, satisfactory results were obtained in removing statistical noise from experimental spectroscopic data using cubic B-spline basis functions.
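A simplified reading of the adjacent-RSS criterion can be sketched with SciPy's least-squares B-spline fitting; the evenly spaced knots and the stopping tolerance are assumptions, and Reinsch's tie-breaking condition is omitted:

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

def best_spline(x, y, max_knots=40, tol=None):
    """Increase the number of evenly spaced interior knots and stop once
    the RSS of two adjacent fits differs by less than `tol`, returning
    the lowest-RSS fit seen up to that point. A rough sketch of the
    criterion, not the paper's exact procedure."""
    if tol is None:
        tol = 0.01 * np.var(y) * len(y)  # assumed stopping tolerance
    best = prev = None
    for n in range(2, max_knots):
        t = np.linspace(x[0], x[-1], n + 2)[1:-1]  # interior knots only
        spl = LSQUnivariateSpline(x, y, t, k=3)    # cubic least-squares fit
        rss = spl.get_residual()
        if best is None or rss < best.get_residual():
            best = spl
        if prev is not None and abs(prev - rss) < tol:
            break
        prev = rss
    return best

x = np.linspace(0.0, 1.0, 300)
rng = np.random.default_rng(3)
clean = np.exp(-((x - 0.5) ** 2) / 0.05)       # broad spectral peak
noisy = clean + rng.normal(0.0, 0.05, x.size)  # add statistical noise
smooth = best_spline(x, noisy)
print(float(np.max(np.abs(smooth(x) - clean))))
```

Stopping on the RSS difference between adjacent fits, rather than on the RSS itself, avoids the degenerate interpolating solution the abstract warns about.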
High-frequency surface wave radar (HFSWR) and the automatic identification system (AIS) are the two most important sensors used for vessel tracking. HFSWR can be applied to tracking all vessels in a detection area, while the AIS is usually used to verify the information of cooperative vessels. Because of interference from sea clutter, employing single-frequency HFSWR for vessel tracking may obscure vessels located in the blind zones of the Bragg peaks. Analyzing changes across the detection frequencies constitutes an effective method for addressing this deficiency. A vessel fusion tracking solution is proposed using dual-frequency HFSWR data calibrated by the AIS. Since different systematic biases exist between the HFSWR measurements at each frequency and the AIS measurements, AIS information is used to estimate and correct the HFSWR systematic biases at each frequency. First, AIS point measurements for cooperative vessels are associated with the HFSWR measurements using a JVC assignment algorithm. From the association results for the cooperative vessels, the systematic biases in the dual-frequency HFSWR data are estimated and corrected. Then, based on the corrected dual-frequency HFSWR data, the vessels are tracked using a dual-frequency fusion joint probabilistic data association (JPDA) unscented Kalman filter (UKF) algorithm. Experimental results using real-life detection data show that the proposed method is efficient at tracking vessels in real time and can improve tracking capability and accuracy compared with tracking based on single-frequency data.
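The bias estimation and correction step reduces to averaging the offsets over associated radar/AIS pairs; a toy sketch with the association already done (the real pipeline associates the points with a JVC assignment first, and would repeat this per radar frequency):

```python
import numpy as np

def estimate_bias(radar_xy, ais_xy):
    """Estimate a constant systematic bias of radar measurements as the
    mean offset from associated AIS truth points."""
    return np.mean(np.asarray(radar_xy) - np.asarray(ais_xy), axis=0)

# Toy positions (arbitrary units) for three cooperative vessels.
ais = np.array([[10.0, 5.0], [12.0, 6.0], [15.0, 8.0]])
radar = ais + np.array([0.4, -0.3]) + 0.01 * np.array(  # bias + small noise
    [[1, -1], [-1, 1], [0, 0]])

b = estimate_bias(radar, ais)
corrected = radar - b
print(np.round(b, 2))  # ≈ [ 0.4 -0.3]
```

Only after this per-frequency correction are the two HFSWR streams fused by the JPDA-UKF tracker.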
AIM: To perform automatic gastric cancer risk classification using photofluorography toward effective mass screening, as a preliminary study. METHODS: We used data for 2100 subjects, including X-ray images, pepsinogen I and II levels, the PG I/PG II ratio, Helicobacter pylori (H. pylori) antibody status, H. pylori eradication history, and interview sheets. We performed two-stage classification with our system. In the first stage, H. pylori infection status classification was performed, and H. pylori-infected subjects were automatically detected. In the second stage, we performed atrophic-level classification to validate the effectiveness of our system. RESULTS: The sensitivity, specificity, and Youden index (YI) of H. pylori infection status classification were 0.884, 0.895, and 0.779, respectively, in the first stage. In the second stage, the sensitivity, specificity, and YI of atrophic-level classification for H. pylori-infected subjects were 0.777, 0.824, and 0.601, respectively. CONCLUSION: Although further improvements of the system are needed, the experimental results indicated the effectiveness of machine learning techniques for estimating gastric cancer risk.
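The reported Youden indices can be checked directly from the standard definition YI = sensitivity + specificity - 1:

```python
def youden(sensitivity, specificity):
    """Youden index: YI = sensitivity + specificity - 1."""
    return sensitivity + specificity - 1.0

# Figures reported in the abstract for the two classification stages.
print(round(youden(0.884, 0.895), 3))  # first stage (H. pylori status): 0.779
print(round(youden(0.777, 0.824), 3))  # second stage (atrophic level): 0.601
```

Both values match the YIs quoted in the results, confirming the internal consistency of the reported figures.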
Purpose: The main objective of this work is to show the potential of recently developed approaches for automatic knowledge extraction directly from universities' websites. The information automatically extracted can potentially be updated with a frequency higher than once per year and is safe from manipulations or misinterpretations. Moreover, this approach gives us flexibility in collecting indicators about the efficiency of universities' websites and their effectiveness in disseminating key content. These new indicators can complement traditional indicators of scientific research (e.g., number of articles and number of citations) and teaching (e.g., number of students and graduates) by introducing further dimensions that allow new insights for "profiling" the analyzed universities. Design/methodology/approach: Webometrics relies on web mining methods and techniques to perform quantitative analyses of the web. This study implements an advanced application of the webometric approach, exploiting all three categories of web mining: web content mining, web structure mining, and web usage mining. The information used to compute our indicators was extracted from the universities' websites using web scraping and text mining techniques. The scraped information was stored in a NoSQL DB in a semistructured form to allow information to be retrieved efficiently by text mining techniques. This provides increased flexibility in the design of new indicators, opening the door to new types of analyses. Some data were also collected by means of batch interrogations of search engines (Bing, www.bing.com) or from a leading provider of web analytics (SimilarWeb, http://www.similarweb.com). The information extracted from the web was combined with university
structural information taken from the European Tertiary Education Register (https://eter.joanneum.at/#/home), a database collecting information on Higher Education Institutions (HEIs) at the European level. All the above was used to perform a clustering of 79 Italian universities based on structural and digital indicators. Findings: The main findings of this study concern the evaluation of universities' potential in digitalization, in particular by presenting techniques for the automatic extraction of information from the web to build indicators of the quality and impact of universities' websites. These indicators can complement traditional indicators and can be used to identify groups of universities with common features by applying clustering techniques to the above indicators. Research limitations: The results reported in this study refer to Italian universities only, but the approach could be extended to other university systems abroad. Practical implications: The approach proposed in this study and its illustration on Italian universities show the usefulness of recently introduced automatic data extraction and web scraping approaches and their practical relevance for characterizing and profiling the activities of universities on the basis of their websites. The approach could be applied to other university systems. Originality/value: This work applies, for the first time to university websites, recently introduced techniques for automatic knowledge extraction based on web scraping, optical character recognition, and nontrivial text mining operations (Bruni & Bianchi, 2020).
Automatic pavement crack detection is a critical task for maintaining pavement stability and driving safety. The task is challenging because shadows on the pavement may have an intensity similar to that of cracks, which interferes with crack detection performance. To date, efficient algorithm models and training datasets for dealing with the interference brought by shadows are still lacking. To fill this gap, we make several contributions. First, we propose a new pavement shadow and crack dataset, which contains a variety of shadow and pavement pixel size combinations. It also covers all common crack types (linear cracks and network cracks), placing higher demands on crack detection methods. Second, we design a two-step shadow-removal-oriented crack detection approach, SROCD, which improves performance by first removing the shadow and then detecting cracks; beyond shadows, the method can also cope with other noise disturbances. Third, we explore the mechanism by which shadows affect crack detection. Based on this mechanism, we propose a data augmentation method based on differences in brightness values, which can adapt to brightness changes caused by seasonal and weather variations. Finally, we introduce a residual feature augmentation algorithm to detect the small cracks that can predict sudden disasters, and this algorithm improves the overall performance of the model. We compare our method with state-of-the-art methods on existing pavement crack datasets and on the shadow-crack dataset, and the experimental results demonstrate the superiority of our method.
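A simplified additive version of brightness-difference augmentation; the paper's exact rule is not given in the abstract, but the idea of generating training copies at shifted brightness levels can be sketched as:

```python
import numpy as np

def brightness_augment(img, deltas=(-40, -20, 20, 40)):
    """Generate copies of a pavement image with shifted brightness,
    mimicking illumination changes from season and weather. The additive
    shifts are illustrative; pixel values are kept in [0, 255]."""
    img = np.asarray(img, dtype=np.int16)  # widen to avoid uint8 wrap-around
    return [np.clip(img + d, 0, 255).astype(np.uint8) for d in deltas]

patch = np.full((4, 4), 200, dtype=np.uint8)  # toy bright pavement patch
augs = brightness_augment(patch)
print([int(a[0, 0]) for a in augs])  # [160, 180, 220, 240]
```

Training on such shifted copies exposes the detector to the brightness range in which shadow pixels and crack pixels become confusable.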
Funding: National Natural Science Foundation of China under Grant No. 61973037; China Postdoctoral Science Foundation under Grant No. 2022M720419.
Funding: National Natural Science Foundation of China under Grant No. 61973037; the China Postdoctoral Science Foundation (2022M720419) provided funds for conducting the experiments.
Funding: The Soft Science Research Project of the Ministry of Housing and Urban-Rural Development of China (No. 2008-k5-14)
Abstract: In order to provide important parameters for schedule design, decision-making bases for transit operation management, and references for passengers traveling by bus, bus transit travel time reliability is analyzed and evaluated based on automatic vehicle location (AVL) data. Based on a statistical analysis of bus transit travel times, six indices are proposed: the coefficient of variation, the width of the travel time distribution, the mean commercial speed, the congestion frequency, the planning time index, and the buffer time index. Moreover, a framework for evaluating bus transit travel time reliability is constructed. Finally, a case study on a bus route in Suzhou is conducted. Results show that the proposed evaluation index system is simple and intuitive, and that it can effectively reflect the efficiency and stability of bus operations. A distinguishing feature of bus transit travel time reliability is its temporal pattern: it varies across different time periods.
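Two of the indices named above, the planning time index and the buffer time index, have widely used textbook definitions built on the 95th-percentile travel time. A sketch under those standard definitions follows (the abstract does not spell out the exact formulas used in the paper, so treat these as the conventional forms):

```python
def percentile(values, p):
    """Nearest-rank percentile (simplified; real studies may interpolate)."""
    s = sorted(values)
    k = max(0, min(len(s) - 1, int(round(p / 100 * (len(s) - 1)))))
    return s[k]

def buffer_time_index(times):
    """BTI = (95th-percentile travel time - mean travel time) / mean."""
    mean = sum(times) / len(times)
    return (percentile(times, 95) - mean) / mean

def planning_time_index(times, free_flow):
    """PTI = 95th-percentile travel time / free-flow travel time."""
    return percentile(times, 95) / free_flow

runs = [10.0, 10.0, 10.0, 12.0, 30.0]  # hypothetical AVL run times (min)
bti = buffer_time_index(runs)
pti = planning_time_index(runs, free_flow=10.0)
```

A high BTI tells passengers how much extra time to budget; a high PTI tells planners how much worse the worst runs are than free-flow conditions.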
Abstract: A vehicle-mounted data acquisition, analysis and calibration system is developed for research on the automatic shift system (ASS). Considering the vehicle's harsh environment, including vibration, high and low temperatures, and electromagnetic disturbance, the most suitable design is selected. A PC104 computer transfers data with the ECU by serial communication, and a solid-state disk is used as FLASH ROM. Techniques including frequency division of the data are adopted in the software design to ensure the sampling frequency. The analysis and debugging software is also designed according to the characteristics of the ASS. The system plays an important role in the development of the ASS because of its good reliability and practicability in application.
Funding: Supported by the National Natural Science Foundation of China (No. 11375186)
Abstract: The control system of Hefei Light Source II (HLS-Ⅱ) is a distributed system based on the Experimental Physics and Industrial Control System (EPICS). It is necessary to maintain central configuration files for the existing archiving system: when process variables in the control system are added, removed, or updated, the configuration files must be manually modified to maintain consistency with the control system. This paper presents a new method for data archiving that realizes automatic configuration of the archiving parameters. The system uses a microservice architecture to integrate the EPICS Archiver Appliance and RecSync. In this way, the system can collect all the archiving meta-configuration from the distributed input/output controllers and enter it into the EPICS Archiver Appliance automatically. Furthermore, we also developed a web-based GUI to provide automatic visualization of real-time and historical data. At present, this system is under commissioning at HLS-Ⅱ. The results indicate that the new archiving system is reliable and convenient to operate. The maintenance-free operation mode is valuable for large-scale scientific facilities.
Abstract: A novel technique for automatic seismic data processing using both integral and local features of seismograms is presented in this paper. Here, the term "integral feature" refers to a feature that depicts the shape of the whole seismogram. Unlike some previous efforts that completely abandon the DIAL approach (signal detection, phase identification, association, and event localization) and seek to use envelope cross-correlation to detect seismic events directly, our technique keeps following the DIAL approach; but in addition to detecting signals corresponding to individual seismic phases, it also detects continuous wave-trains and exploits their features for phase-type identification and signal association. Concrete ideas about how to define wave-trains and combine them with various detections, as well as how to measure and utilize their features in seismic data processing, are elaborated in the paper. This approach has been applied in our routine data processing for years, and test results for a 16-day period using data from the Xinjiang seismic station network are presented. The automatic processing results have simultaneously low false and missed event rates, showing that the new technique has good application prospects for improving automatic seismic data processing.
Abstract: Accurate detection and picking of the P-phase onset time in noisy microseismic data from underground mines remains a big challenge. Reliable P-phase onset time picking is necessary for the accurate source location needed for planning and rescue operations in the event of failures. In this paper, a new technique based on the discrete stationary wavelet transform (DSWT) and higher-order statistics is proposed for processing noisy data from underground mines. The objectives of this method are to (i) improve manual detection and picking of the P-phase onset, and (ii) provide an automatic means of detecting and picking the P-phase onset time accurately. The DSWT is first used to filter the signal over several scales. Manual P-phase onset detection and picking are then obtained by computing the signal energy across selected scales with frequency bands that capture the signal of interest. The automatic P-phase onset, on the other hand, is achieved by applying a skewness- and kurtosis-based criterion to selected scales in the time-frequency domain. The method was tested using synthetic and field data from an underground limestone mine. Results were compared with those obtained using the short-term to long-term average (STA/LTA) ratio and those of Ge et al. (2009). The results show that the method provides a more reliable estimate of the P-phase onset arrival than the STA/LTA method when the signal-to-noise ratio is very low. Also, the results obtained from the field data matched accurately with the results of Ge et al. (2009).
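The kurtosis-based criterion works because excess kurtosis of a sliding window is near zero for stationary noise but spikes when an impulsive P arrival enters the window. The sketch below shows only that core idea on a raw trace; the paper applies the criterion to selected DSWT scales, and the window length and threshold here are illustrative assumptions.

```python
def excess_kurtosis(window):
    """Population excess kurtosis; large when an impulsive arrival
    enters an otherwise low-amplitude window."""
    n = len(window)
    mean = sum(window) / n
    m2 = sum((x - mean) ** 2 for x in window) / n
    m4 = sum((x - mean) ** 4 for x in window) / n
    return m4 / (m2 ** 2) - 3 if m2 > 0 else 0.0

def pick_onset(signal, win, thresh):
    """Return the first index whose trailing-window kurtosis exceeds
    thresh, or None if no pick is triggered (thresholds are hypothetical)."""
    for i in range(win, len(signal)):
        if excess_kurtosis(signal[i - win:i]) > thresh:
            return i
    return None

trace = [0.0] * 40 + [5.0] + [0.0] * 9   # quiet noise floor plus one impulse
pick = pick_onset(trace, win=20, thresh=5.0)
```

On the synthetic trace the pick fires at the first window that contains the impulse, which is the behavior exploited for automatic P-phase picking.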
Abstract: [Objective] The research aimed to study the influence of automatic station data on the continuity of historical meteorological data series. [Method] Based on temperature data measured by an automatic meteorological station and the corresponding manual observations during January-December 2001, the monthly average, maximum and minimum temperatures from the automatic station were compared with the corresponding manual observations from the parallel observation period using the mean difference and the standard deviation of the differences, so that the differences between the automatic station and manual data, and their variation characteristics, could be understood. Meanwhile, a significance test and analysis of the annual average values were carried out on the 1990-2009 data series, and the influence of replacing manual observation with the automatic station on the continuity of the historical temperature series was discussed. [Result] Although the two temperature datasets from the parallel observation period differed somewhat, the difference was on average within the permitted range for automatic stations; only individual months exceeded that range. The significance test showed that the annual average temperature and the annual average minimum temperature observed at the automatic station differed from the historical data, which had a certain influence on the annual temperature series, but the difference was not significant overall. When automatic observations are combined with manual observations, the series needs to undergo homogeneity testing and correction.
[Conclusion] The research plays an important role in guaranteeing the standalone operation of automatic stations, optimizing the meteorological surface observation system, and improving the continuity of climate element series and the reliability of climate statistics.
Abstract: As the demand for wind energy continues to grow at an exponential rate, reducing operation and maintenance (O&M) costs and improving reliability have become top priorities in wind turbine maintenance strategies. Predicting wind turbine failures before they reach a catastrophic stage is critical to reducing the O&M costs incurred by unnecessary scheduled maintenance. A SCADA-data-based condition monitoring system, which takes advantage of data already collected at the wind turbine controller, is a cost-effective way to monitor wind turbines for early warning of failures. This article proposes a methodology for fault prediction and the automatic generation of warnings and alarms for wind turbine main bearings based on stored SCADA data using an artificial neural network (ANN). An ANN model of the turbine main bearing's normal behavior is established, and the deviation between the estimated and actual values of the monitored parameter is then calculated. Furthermore, a method has been developed to generate early warnings and alarms from this deviation while avoiding false warnings and alarms. In this way, wind farm operators have enough time to plan maintenance; thus, unanticipated downtime can be avoided and O&M costs reduced.
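Once a normal-behavior model produces residuals (actual minus estimated values), a common way to suppress false alerts is to require the deviation to persist over several consecutive samples before escalating. The abstract does not describe the exact alarm logic, so the persistence rule, threshold names, and values below are illustrative assumptions:

```python
def alert_level(residuals, warn, alarm, persist=3):
    """Classify a residual sequence as 'normal', 'warning', or 'alarm'.

    A level is raised only if the deviation exceeds its threshold for
    `persist` consecutive samples, so a single spike does not trigger
    a false alert. Thresholds here are hypothetical, not from the paper.
    """
    run_w = run_a = best_w = best_a = 0
    for r in residuals:
        run_w = run_w + 1 if r > warn else 0    # consecutive warning-level hits
        run_a = run_a + 1 if r > alarm else 0   # consecutive alarm-level hits
        best_w = max(best_w, run_w)
        best_a = max(best_a, run_a)
    if best_a >= persist:
        return "alarm"
    if best_w >= persist:
        return "warning"
    return "normal"
```

With this rule, a one-sample sensor glitch stays "normal", while a sustained temperature-residual rise escalates through "warning" to "alarm", giving operators lead time to plan maintenance.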
Abstract: Meteorological data are useful for varied applications and sectors, ranging from weather and climate forecasting and landscape planning to disaster management, among others. However, the availability of these data requires a good network of manual meteorological stations and other support systems for their collection, recording, processing, archiving, communication and dissemination. In sub-Saharan Africa, such networks are limited due to low investment and capacity. To bridge this gap, the National Meteorological Service in Kenya, together with a few others in African countries, has moved to install a number of Automatic Weather Stations (AWSs) in the past decade, including a few additions from private institutions and individuals. Although these AWSs have the potential to improve the existing observation network and the early warning systems in the region, the quality and capacity of the data collected from the stations are not well exploited, mainly due to data users' low confidence in electronically observed data. In this study, we set out to confirm that electronically observed data are of comparable quality to human-recorded data and can thus be used to bridge data gaps at temporal and spatial scales. To assess this potential, we applied the simple Pearson correlation method and other statistical tests and approaches, conducting an inter-comparison analysis of weather observations from the manual synoptic station and data from two Automatic Weather Stations (TAHMO and 3D-PAWS) co-located at KMD Headquarters to establish existing consistencies and variances in several weather parameters. Results show comparable consistency in most weather parameters between the three stations. Strong associations were noted between the TAHMO and manual station data for minimum (r = 0.65) and maximum temperature (r = 0.86), and for maximum temperature between TAHMO and 3D-PAWS (r = 0.56).
Similar associations were indicated for surface pressure (r = 0.99) and RH (r > 0.6), with the weakest correlations occurring in wind direction and speed. The Shapiro test for the normality assumption indicated that several parameters compared between the three stations were normally distributed (p > 0.05). We conclude that these findings can be used as a basis for wider use of datasets from Automatic Weather Stations in Kenya and elsewhere, which can inform various applications in weather- and climate-related decisions.
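The inter-comparison above rests on the Pearson correlation coefficient, which for two co-located observation series is the covariance normalized by the product of the standard deviations. A minimal stdlib sketch (the example series are made up, not station data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical daily maximum temperatures from two co-located stations.
manual = [24.1, 25.3, 26.0, 24.8, 27.2]
aws    = [24.0, 25.1, 26.3, 24.9, 27.0]
r = pearson_r(manual, aws)
```

Values of r near 1, as reported for surface pressure (r = 0.99), indicate that the AWS tracks the manual station almost perfectly.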
Abstract: [Objective] The aim was to study a back-end real-time data quality control method for regional automatic weather stations. [Method] The basic content and steps of back-end real-time quality control of regional automatic weather station data were introduced, and each element was treated with a systematic quality control procedure. The problems existing in the back-end real-time data of regional meteorological stations in Guangxi were expounded. Drawing on related elements and their linear changes, improvements over the traditional quality control method were made, and the quality of temperature and pressure data was controlled through evaluation and related-element checks. [Result] The method, optimized from the traditional quality control method, improved the timeliness of real-time data quality control. The quality check of hourly precipitation applied a related check on hourly minimum temperature and a vertical consistency check on radar data, which can effectively improve the accuracy and credibility of hourly precipitation quality control. [Conclusion] The method was trialed for one year in the quality control of real-time data from regional automatic meteorological stations in Guangxi and achieved good results.
Abstract: The quality control system for real-time meteorological data from automatic weather stations in Shandong realized the integration of the communication system and the provincial quality control system, and an interaction platform, created mainly with Web technologies, was set up. The system not only fully guaranteed the running of basic operations but also improved the reliability of the data.
Funding: The National Key R&D Program of China (Grant No. 2019YFC1709803); National Natural Science Foundation of China (Grant No. 81873183).
Abstract: Objectives: The aim of this study was to investigate and develop a data storage and exchange format for the process of automatic systematic reviews (ASR) of traditional Chinese medicine (TCM). Methods: A lightweight and commonly used data format, namely JavaScript Object Notation (JSON), was introduced in this study. We designed a fully described data structure to collect TCM clinical trial information based on JSON syntax. Results: A compact and powerful data format, JSON-ASR, was developed. JSON-ASR uses a plain-text data format in the form of key/value pairs and consists of six sections and more than 80 preset pairs. JSON-ASR adopts extensible structured arrays to support multi-group and multi-outcome situations. Conclusion: JSON-ASR is lightweight, flexible, and scalable, which makes it suitable for the complex data of clinical evidence.
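The abstract's key design points, key/value pairs plus extensible arrays for multi-group, multi-outcome trials, can be illustrated with a tiny JSON round-trip. The section and key names below are invented for illustration; the real JSON-ASR schema has six sections and 80+ preset pairs not reproduced here.

```python
import json

# Hypothetical record in the spirit of JSON-ASR: key/value pairs with
# extensible arrays so one trial can carry several groups and outcomes.
record = {
    "study": {"id": "TCM-0001", "design": "RCT"},       # invented keys
    "groups": [                                          # multi-group array
        {"name": "intervention", "n": 60},
        {"name": "control", "n": 58},
    ],
    "outcomes": [                                        # multi-outcome array
        {"name": "response rate", "measure": "risk ratio"},
    ],
}

text = json.dumps(record)      # serialize for storage/exchange
parsed = json.loads(text)      # any JSON-aware tool can read it back
```

Because the arrays are plain JSON lists, adding a third arm or a second outcome is just appending an object, which is the scalability the abstract highlights.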
Funding: Supported by the Science and Technology Development Fund of Macao (China) grants (No. 042/2007/A3, No. 003/2008/A1); partly supported by NSFC Project (No. 10631080) and the National Key Basic Research Project of China grant (No. 2004CB318000).
Abstract: In the present paper, a new criterion is derived to obtain the optimum fitting curve when using cubic B-spline basis functions to remove statistical noise from spectroscopic data. Under this criterion, smoothed fitting curves using cubic B-spline basis functions are first computed with an increasing number of knots. The best fitting curve is then selected according to the minimum residual sum of squares (RSS) between two adjacent fitting curves. If more than one best fitting curve results, Reinsch's first condition is used to find a better one. The minimum RSS of a fitting curve against the noisy data itself is not recommended as the criterion for determining the best fitting curve, because this value decreases to zero as the number of selected channels increases, and the minimum value gives no smoothing effect. Compared with Reinsch's method, the derived criterion is simple and enables the smoothing conditions to be determined automatically, without any initial input parameter. With the derived criterion, satisfactory results were obtained in removing statistical noise from experimental spectroscopic data using cubic B-spline basis functions.
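The selection logic described above, comparing the RSS of fits with adjacent knot counts rather than the RSS against the noisy data, can be sketched independently of the B-spline machinery. Below, `fit_with_knots` is a user-supplied fitter (hypothetical; a real implementation would be a cubic B-spline least-squares fit), and the sketch merely shows the adjacent-RSS stopping criterion as paraphrased from the abstract:

```python
def rss(y, fit):
    """Residual sum of squares between data and a fitted curve."""
    return sum((a - b) ** 2 for a, b in zip(y, fit))

def select_knot_number(y, fit_with_knots, k_min, k_max):
    """Pick the knot count where the RSS change between two adjacent
    fits is smallest, i.e. where adding knots stops paying off.

    fit_with_knots(y, k) -> fitted curve with k knots (supplied by caller).
    """
    prev = rss(y, fit_with_knots(y, k_min))
    best_k, best_drop = k_min, float("inf")
    for k in range(k_min + 1, k_max + 1):
        cur = rss(y, fit_with_knots(y, k))
        drop = abs(prev - cur)
        if drop < best_drop:
            best_k, best_drop = k, drop
        prev = cur
    return best_k
```

The point of comparing adjacent fits is exactly the abstract's warning: RSS against the noisy data alone always shrinks as knots are added, so minimizing it directly would select an unsmoothed, overfitted curve.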
Funding: The National Natural Science Foundation of China under contract No. 61362002; the Marine Scientific Research Special Funds for Public Welfare of China under contract No. 201505002.
Abstract: High-frequency surface wave radar (HFSWR) and the automatic identification system (AIS) are the two most important sensors used for vessel tracking. HFSWR can be applied to tracking all vessels in a detection area, while the AIS is usually used to verify the information of cooperative vessels. Because of interference from sea clutter, employing single-frequency HFSWR for vessel tracking may obscure vessels located in the blind zones of the Bragg peaks. Changing the detection frequency constitutes an effective method for addressing this deficiency. A vessel fusion tracking solution is proposed using dual-frequency HFSWR data calibrated by the AIS. Since different systematic biases exist between the HFSWR measurements at each frequency and the AIS measurements, AIS information is used to estimate and correct the HFSWR systematic biases at each frequency. First, AIS point measurements for cooperative vessels are associated with the HFSWR measurements using a JVC assignment algorithm. From the association results for the cooperative vessels, the systematic biases in the dual-frequency HFSWR data are estimated and corrected. Then, based on the corrected dual-frequency HFSWR data, the vessels are tracked using a dual-frequency fusion joint probabilistic data association (JPDA)-unscented Kalman filter (UKF) algorithm. Experimental results using real-life detection data show that the proposed method can track vessels efficiently in real time and can improve tracking capability and accuracy compared with tracking processes using single-frequency data.
Abstract: AIM: To perform automatic gastric cancer risk classification using photofluorography toward effective mass screening, as a preliminary study. METHODS: We used data for 2100 subjects, including X-ray images, pepsinogen I and II levels, the PGI/PGII ratio, Helicobacter pylori (H. pylori) antibody status, H. pylori eradication history, and interview sheets. We performed two-stage classification with our system. In the first stage, H. pylori infection status classification was performed, and H. pylori-infected subjects were automatically detected. In the second stage, we performed atrophic level classification to validate the effectiveness of our system. RESULTS: The sensitivity, specificity and Youden index (YI) of H. pylori infection status classification were 0.884, 0.895 and 0.779, respectively, in the first stage. In the second stage, the sensitivity, specificity and YI of atrophic level classification for H. pylori-infected subjects were 0.777, 0.824 and 0.601, respectively. CONCLUSION: Although further improvements of the system are needed, the experimental results indicated the effectiveness of machine learning techniques for estimating gastric cancer risk.
Funding: This work was developed with the support of the H2020 RISIS 2 Project (No. 824091) and of the "Sapienza" Research Awards No. RM1161550376E40E of 2016 and RM11916B8853C925 of 2019. This article is a largely extended version of Bianchi et al. (2019), presented at the ISSI 2019 Conference held in Rome, 2-5 September 2019.
Abstract: Purpose: The main objective of this work is to show the potential of recently developed approaches for automatic knowledge extraction directly from university websites. The information automatically extracted can potentially be updated more frequently than once per year, and is safe from manipulation or misinterpretation. Moreover, this approach allows flexibility in collecting indicators of the efficiency of university websites and their effectiveness in disseminating key content. These new indicators can complement traditional indicators of scientific research (e.g., number of articles and number of citations) and teaching (e.g., number of students and graduates) by introducing further dimensions that allow new insights for "profiling" the analyzed universities. Design/methodology/approach: Webometrics relies on web mining methods and techniques to perform quantitative analyses of the web. This study implements an advanced application of the webometric approach, exploiting all three categories of web mining: web content mining, web structure mining, and web usage mining. The information needed to compute our indicators was extracted from university websites using web scraping and text mining techniques. The scraped information was stored in a NoSQL DB in a semistructured form to allow information to be retrieved efficiently by text mining techniques. This provides increased flexibility in the design of new indicators, opening the door to new types of analyses. Some data were also collected through batch interrogations of search engines (Bing, www.bing.com) or from a leading provider of web analytics (SimilarWeb, http://www.similarweb.com). The information extracted from the web was combined with university structural information taken from the European Tertiary Education Register (https://eter.joanneum.at/#/home), a database collecting information on Higher Education Institutions (HEIs) at the European level. All the above was used to
perform a clusterization of 79 Italian universities based on structural and digital indicators. Findings: The main findings of this study concern the evaluation of universities' potential in digitalization, in particular by presenting techniques for the automatic extraction of information from the web to build indicators of the quality and impact of university websites. These indicators can complement traditional indicators and can be used to identify groups of universities with common features by applying clustering techniques to the above indicators. Research limitations: The results reported in this study refer to Italian universities only, but the approach could be extended to other university systems abroad. Practical implications: The approach proposed in this study, and its illustration on Italian universities, shows the usefulness of recently introduced automatic data extraction and web scraping approaches and their practical relevance for characterizing and profiling the activities of universities on the basis of their websites. The approach could be applied to other university systems. Originality/value: This work applies, for the first time, to university websites some recently introduced techniques for automatic knowledge extraction based on web scraping, optical character recognition, and nontrivial text mining operations (Bruni & Bianchi, 2020).
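The web content mining step described above starts by stripping markup from fetched pages to obtain analyzable text. A minimal stdlib sketch of that extraction follows (a real pipeline like the one in the study would add crawling, a NoSQL store, and text mining on top; this shows only the tag-stripping step):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the visible text fragments of an HTML page, dropping tags."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        # Keep only non-whitespace fragments between tags.
        if data.strip():
            self.chunks.append(data.strip())

def page_text(html):
    """Return the page's text content as a single space-joined string."""
    p = TextExtractor()
    p.feed(html)
    return " ".join(p.chunks)

sample = "<html><body><h1>Research</h1><p>Open data</p></body></html>"
text = page_text(sample)
```

Indicators such as "does the site mention key content X" then reduce to text mining over strings like `text`, which is what makes the indicator set flexible and extensible.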
Funding: Supported in part by the 14th Five-Year Plan Project of the Ministry of Science and Technology of China (2021YFD2000304); the Fundamental Research Funds for the Central Universities (531118010509); and the Natural Science Foundation of Hunan Province, China (2021JJ40114).
Abstract: Automatic pavement crack detection is a critical task for maintaining pavement stability and driving safety. The task is challenging because shadows on the pavement may have intensity similar to that of cracks, which interferes with crack detection performance. To date, efficient algorithm models and training datasets for dealing with the interference brought by shadows are still lacking. To fill this gap, we make several contributions. First, we propose a new pavement shadow and crack dataset, which contains a variety of shadow and pavement pixel size combinations. It also covers all common crack types (linear cracks and network cracks), placing higher demands on crack detection methods. Second, we design a two-step shadow-removal-oriented crack detection approach, SROCD, which improves performance by first removing the shadow and then detecting the crack. In addition to shadows, the method can cope with other noise disturbances. Third, we explore the mechanism by which shadows affect crack detection. Based on this mechanism, we propose a data augmentation method based on differences in brightness values, which can adapt to brightness changes caused by seasonal and weather variations. Finally, we introduce a residual feature augmentation algorithm to detect the small cracks that can predict sudden disasters, and this algorithm improves the overall performance of the model. We compare our method with state-of-the-art methods on existing pavement crack datasets and on the shadow-crack dataset, and the experimental results demonstrate the superiority of our method.
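The brightness-difference augmentation idea above can be caricatured as generating training copies of an image shifted by a set of brightness deltas, clipped to the valid pixel range, so the detector sees crack/shadow contrasts under varying illumination. This is only a sketch of the general idea; the paper's actual augmentation derives the deltas from measured brightness differences rather than a fixed list.

```python
def augment_brightness(img, deltas):
    """Return one brightness-shifted copy of a grayscale image per delta.

    `img` is a 2-D list of pixel values in [0, 255]; shifted values are
    clipped to that range. The delta list here is an illustrative choice.
    """
    return [[[min(255, max(0, p + d)) for p in row] for row in img]
            for d in deltas]

patch = [[100, 250],
         [40, 10]]                      # tiny hypothetical grayscale patch
variants = augment_brightness(patch, deltas=[-10, 10])
```

Training on such variants makes the learned features less sensitive to the absolute brightness shifts caused by season and weather.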