Air-gun arrays are used in marine-seismic exploration. Far-field wavelets in subsurface media represent the stacking of single air-gun ideal wavelets. We derived single air-gun ideal wavelets from near-field wavelets recorded by near-field geophones and then synthesized them into far-field wavelets, which is critical for wavelet processing in marine-seismic exploration. For this purpose, several algorithms are currently used to decompose and synthesize wavelets in the time domain. If the traveltime of single air-gun wavelets is not an integral multiple of the sampling interval, the time-domain method requires complex and error-prone resampling of the seismic signals. Based on the relation between frequency-domain phase and time-domain delay, we propose a method that first transforms the recorded near-field wavelet to the frequency domain via the Fourier transform, then decomposes it and composes the wavelet spectrum in the frequency domain, and finally transforms the result back to the time domain. Thus, the resampling problem is avoided, and single air-gun wavelets and far-field wavelets can be reliably derived. The effect of ghost reflections is also considered: the ghosts are removed while the wavelet is decomposed. Modeling and real-data processing demonstrate the feasibility of the proposed method.
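The core identity this frequency-domain approach relies on is that a time delay τ corresponds to multiplication by exp(-2πifτ) in the frequency domain, so a wavelet can be shifted by a non-integer number of samples without resampling. A minimal NumPy sketch of that identity (the function name and interface are illustrative, not from the paper):

```python
import numpy as np

def fractional_delay(signal, delay, dt):
    """Delay a sampled signal by an arbitrary (non-integer-sample) time
    using a linear phase ramp in the frequency domain."""
    n = len(signal)
    spectrum = np.fft.fft(signal)
    freqs = np.fft.fftfreq(n, d=dt)                   # frequency axis in Hz
    spectrum *= np.exp(-2j * np.pi * freqs * delay)   # time shift <-> phase ramp
    return np.fft.ifft(spectrum).real
```

For an integer-sample delay this reduces to a circular shift, which makes the identity easy to check; for fractional delays it interpolates the waveform exactly within the signal's bandwidth.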
To analyze the errors of processed data, the testing principle for jet elements is introduced and the properties of the testing system are studied theoretically and experimentally. On this basis, the data processing method is presented and the error formulae, which are functions of the testing system's properties, are derived. Finally, methods for reducing the errors are provided. The measured results agree with the theoretical conclusions.
In multi-component seismic exploration, the horizontal and vertical components both contain P- and SV-waves. The P- and SV-wavefields in a seismic record can be separated by their horizontal and vertical displacements when upgoing P- and SV-waves arrive at the sea floor. If the sea-floor P-wave velocity, S-wave velocity, and density are known, the separation can be achieved in the τ-p domain. The separated wavefields are then transformed back to the time domain. A method of separating P- and SV-wavefields is presented in this paper and used to effectively separate the wavefields in synthetic and real data. The application to real data shows that the method is feasible and effective. It can also be applied to free-surface data.
The use of underwater acoustic data has rapidly expanded with the application of multichannel, large-aperture underwater detection arrays. This study presents an underwater acoustic data compression method based on compressed sensing. Underwater acoustic signals are transformed into the sparse domain for data storage at the receiving terminal, and an improved orthogonal matching pursuit (IOMP) algorithm is used to reconstruct the original underwater acoustic signals at the data processing terminal. When an increase in sidelobe level occasionally causes a direction-of-arrival estimation error, the proposed compression method still achieves compression ratios 10 times higher for narrowband signals and 5 times higher for wideband signals than the orthogonal matching pursuit (OMP) algorithm. The IOMP algorithm also reduces computing time by about 20% compared with the original OMP algorithm. Simulation and experimental results are discussed.
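The paper's improved IOMP variant is not specified in the abstract; as a baseline, a plain textbook orthogonal matching pursuit reconstructs a k-sparse vector from compressed measurements as follows (a sketch, not the authors' implementation):

```python
import numpy as np

def omp(A, y, k, tol=1e-6):
    """Orthogonal matching pursuit: recover a k-sparse x from y ≈ A @ x.
    Plain textbook OMP, not the paper's improved (IOMP) variant."""
    residual = y.astype(float).copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # pick the column most correlated with the current residual
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        # least-squares fit on the selected support, then update the residual
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x[support] = coef
    return x
```

With an orthonormal measurement matrix the recovery of an exactly k-sparse vector is exact, which gives a simple sanity check; realistic compressed-sensing dictionaries only satisfy weaker incoherence conditions.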
During advanced water detection using the transient electromagnetic method, exploration results for water-rich areas are often poor due to interference from bolts distributed at different positions in the working face. Studying the interference characteristics of bolts in different states therefore has important significance for improving acquisition quality and data processing in water detection. Based on an analysis of the distribution of the magnetic field excited by a small multi-turn coincident loop in a homogeneous full space, a test of bolt interference was designed in a mine. By drilling 18 holes around the coincident loop in the working face, a large amount of data was collected in sequence as the position and the exposed bolt length were varied. Comprehensive analysis of the data shows that the transient electromagnetic field is strongly disturbed when the distance between the bolt and the center of the loop is less than 3 m, and the interference varies greatly with this distance; when the distance exceeds 3 m, the field induced by the bolts can be ignored. These findings can help improve data acquisition and correction during advanced water detection with the transient electromagnetic method.
A data processing method was proposed for eliminating the end restraint in triaxial tests of soil. A digital image processing method was used to calculate the local deformations and local stresses for any region on the surface of triaxial soil specimens. The principle and implementation of this digital image processing method were introduced, as well as the calculation method for local mechanical properties of soil specimens. Comparisons were made between test results calculated from data for the entire specimen and for local regions, and it was found that deformations were more uniform in the middle region than over the entire specimen. To quantify the non-uniformity of deformation, non-uniformity coefficients of strain were defined and calculated. Traditional and end-lubricated triaxial tests were conducted under the same conditions to investigate the effect of using local-region data for deformation calculation on eliminating the end restraint of specimens. Statistical analysis of all test results showed that, for the tested soil specimens of size 39.1 mm × 80 mm, using the middle 35 mm region of traditional specimens in data processing eliminated end restraint more effectively than end lubrication. Furthermore, the local data analysis in this paper was validated through comparisons with test results from other researchers.
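As an illustration of how local quantities might be extracted from the image-tracked surface, the sketch below computes the axial engineering strain of a local region from marker coordinates, together with one plausible non-uniformity measure. The coefficient-of-variation definition here is an assumption for illustration, not necessarily the paper's formula:

```python
import numpy as np

def local_axial_strain(y_top, y_bottom, y_top0, y_bottom0):
    """Axial (compression-positive) engineering strain of a local region
    bounded by two rows of tracked surface markers.
    y_*0 are initial coordinates, y_* the current (deformed) ones."""
    h0 = abs(y_top0 - y_bottom0)   # initial gauge height of the region
    h = abs(y_top - y_bottom)      # current gauge height
    return (h0 - h) / h0

def nonuniformity_coefficient(strains):
    """One plausible non-uniformity measure (an assumption, not the paper's
    definition): coefficient of variation of the local strains."""
    strains = np.asarray(strains, float)
    return strains.std() / strains.mean()
```

A uniform set of local strains yields a coefficient of zero; larger values flag end-restraint effects concentrated near the platens.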
Deformation patterns, shortening amounts, and shortening rates in the late Quaternary across the Kalpin thrust system have received little attention in the past. This paper discusses them, mainly in the eastern part of the thrust system, through field investigation along the faults and folds, measurement of geomorphic deformation, trench excavation at several important sites where young alluvial fans were clearly displaced, and dating of young deposits on alluvial terraces. There are two types of surface and near-surface deformation in the Kalpin thrust system in the late Quaternary: movement on low-angle thrust faults and bending of young folds. Both kinds of deformation are expressed as shortening and uplift of young geomorphic surfaces. The ages of three stages of surfaces were calculated by dating 20 samples with the TL method in the study area and by comparison with earlier results on the deposition and incision times of alluvial terraces in the Tianshan mountains; they are 100 ka B.P., 33-18 ka B.P., and 6.6-8.2 ka B.P. for the large-scale deformed alluvial surfaces T3, T2, and T1 in the Kalpin region, respectively. Then, 19 sets of shortening amounts and rates were obtained at 13 sites along 4 rows of anticlines in front of the Kalpin thrust system and the Piqiang fold. The shortening amounts and rates show two sections where deformation is stronger than elsewhere; these sections form two southward-pointing arcs. The shortening rates near the tops of the arcs are 1.32 mm/a in the west and 1.39 mm/a in the east across the thrust system. In addition, deformation is stronger in the front rows of bifurcate folds than in the rear ones.
Application-specific data processing units (DPUs) are commonly adopted for operational control and data processing in space missions. To overcome the limitations of traditional radiation-hardened or fully commercial design approaches, a reconfigurable system-on-chip (RSoC) solution based on a state-of-the-art FPGA is introduced. The flexibility and reliability of this approach are outlined, and the requirements for an enhanced RSoC design with in-flight reconfigurability for space applications are presented. The design has been demonstrated as an on-board computer prototype, providing an in-flight reconfigurable DPU design approach using integrated hardwired processors.
The identification of fractures is of great importance in gravity and magnetic data processing and interpretation. In this study, four fracture identification methods widely used in gravity anomaly processing and analysis are applied to gravity data acquired in a given area in Heilongjiang: the vertical second derivative method, the tilt derivative method, the theta map method, and the normalized differential method. By comparing the distribution of the zero contours or maximum contours, we summarize the application effects and the advantages and disadvantages of each method. The tilt derivative method and the normalized differential method perform better than the other two, yielding narrower anomaly gradient belts and higher identification precision for fractures and geological boundaries. The inferred fractures and geological boundaries match well with the results obtained from geologic maps and remote sensing interpretation. These results provide a theoretical foundation for identifying faults and geological boundaries.
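Of the four methods, the tilt derivative has a standard closed form, TDR = arctan[(∂T/∂z)/√((∂T/∂x)² + (∂T/∂y)²)], and fracture or boundary positions are traced along its zero contours. A spectral-derivative sketch for gridded data (an illustration under common potential-field conventions, not the study's code):

```python
import numpy as np

def tilt_derivative(grid, dx, dy):
    """Tilt derivative of a gridded potential-field anomaly:
    TDR = arctan(dT/dz / sqrt((dT/dx)^2 + (dT/dy)^2)).
    Derivatives are computed spectrally; the vertical derivative uses
    the |k| operator from upward-continuation theory."""
    ny, nx = grid.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, dy)
    KX, KY = np.meshgrid(kx, ky)          # shapes (ny, nx)
    F = np.fft.fft2(grid)
    dTdx = np.fft.ifft2(1j * KX * F).real
    dTdy = np.fft.ifft2(1j * KY * F).real
    dTdz = np.fft.ifft2(np.sqrt(KX**2 + KY**2) * F).real
    return np.arctan2(dTdz, np.hypot(dTdx, dTdy))
```

Because the horizontal-gradient magnitude is non-negative, the result is bounded in [-π/2, π/2] regardless of anomaly amplitude, which is the property that makes the tilt derivative useful for enhancing weak anomalies.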
In this paper, an indoor environmental monitoring solution is proposed for the transformer substation system using ZigBee technology. It analyzes the principle of a sulfur hexafluoride sensor based on the acoustic method and puts forward a single-channel differential data processing method with temperature compensation to address the power consumption of sensor detection. The method not only reduces sensor power consumption but also ensures accuracy and effectively extends the lifetime of wireless sensor nodes. The paper also analyzes the design of the base station mode and describes the operating processes of the routers and sensors. The feasibility of the proposed approach has been tested and verified.
The paper first gives a brief overview of the use of statistical methods in sociology in order to show the continuity and importance of these methods in the development of sociology as a science. The education of sociologists therefore requires, among other things, training in statistical methods applicable to data processing and analysis in sociological research. The research then continues with a comparative analysis of the curricula of undergraduate academic studies in sociology, and especially the presence of statistics teaching in them, in the Republic of Serbia and the neighbouring countries. Different programs of the Department of Sociology of the Faculty of Philosophy in Novi Sad are analyzed, together with the results of the students' evaluations for the academic years 2007/2008 and 2009/2010.
The existing surface roughness standards comprise only two dimensions, but the real roughness of a surface is three-dimensional (3D). Roughness parameters of the 3D surface are also important in analyzing the mechanics of contact surfaces, and such problems depend on the accuracy of the 3D surface roughness characterization. One of the most important factors in determining 3D characteristics is the number of data points along the x and y axes, that is, the number of points within the cut-off length. The number of data points has a substantial influence on the accuracy of measurement results, the measuring time, and the size of the output data file (especially along the y-axis direction, where the number of data points is the number of parallel profiles). The number of data points must be optimal: too few points lead to incorrect results and increase the distribution amplitude, while too many do not enlarge the range of fundamental information but substantially increase the measuring time. Therefore, an optimal number of data points must be found for each surface processing method.
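The effect of the number of data points can be probed by evaluating a 3D roughness parameter, such as the arithmetic mean height Sa (mean absolute deviation of heights from the mean plane), on the same surface at several sampling densities. The helpers below are an illustrative sketch, not tied to any particular standard's implementation details:

```python
import numpy as np

def sa(z):
    """Arithmetic mean height Sa of a 3D surface patch:
    mean absolute deviation of heights from the mean plane."""
    z = np.asarray(z, float)
    return np.mean(np.abs(z - z.mean()))

def sa_vs_sampling(surface, steps):
    """Evaluate Sa on the same height map subsampled at several point
    spacings, to see how the number of data points shifts the result."""
    return {s: sa(surface[::s, ::s]) for s in steps}
```

Plotting the returned values against the subsampling step shows the point beyond which extra data points stop changing Sa appreciably, which is one way to pick the optimal density the abstract calls for.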
The analysis of relevant standards and guidelines revealed a lack of information on actions and activities concerning data warehouse testing. The absence of a complex data warehouse testing methodology is particularly crucial in the phase of data warehouse implementation. The aim of this article is to suggest basic data warehouse testing activities as a final part of a data warehouse testing methodology. The testing activities that must be implemented in the process of data warehouse testing can be split into four logical units: multidimensional database testing, data pump testing, metadata testing, and OLAP (Online Analytical Processing) testing. The main testing activities include revision of the multidimensional database scheme, optimization of the number of fact tables, the data explosion problem, and testing the correctness of data aggregation and summation.
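One of the listed activities, testing the correctness of aggregation, can be automated by comparing a detail-level aggregate from the fact table against its pre-computed rollup. A minimal SQLite sketch with illustrative table and column names (not taken from the article):

```python
import sqlite3

def check_aggregation(conn):
    """Compare a detail-level SUM over the fact table against a
    pre-aggregated rollup table; they must match for every region.
    Table and column names are hypothetical examples."""
    detail = dict(conn.execute(
        "SELECT region, SUM(amount) FROM sales_fact GROUP BY region"))
    rollup = dict(conn.execute(
        "SELECT region, total_amount FROM sales_region_agg"))
    return detail == rollup
```

In practice such checks run after each ETL load; a mismatch points either at a faulty data pump or at a stale aggregate table.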
In order to make effective use of the large amount of graduate data that colleges and universities accumulate through teaching management work, this paper studies data mining on a higher-vocational-graduates database. A variety of data preprocessing methods are applied to the original data, and a mining algorithm based on the commonly used Apriori association rule algorithm is put forward. An association rule mining system is then designed and implemented according to actual needs, and the mining results benefit the employment guidance and teaching management decisions of the college for its graduates.
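A minimal version of the Apriori frequent-itemset step such a system builds on can be sketched as follows (supports are plain counts here for simplicity, and rule generation from the frequent itemsets is omitted):

```python
from itertools import combinations  # handy for extending to rule generation

def apriori(transactions, min_support):
    """Minimal Apriori: return all frequent itemsets (as frozensets)
    whose support count is at least min_support."""
    transactions = [frozenset(t) for t in transactions]
    items = {i for t in transactions for i in t}
    # frequent 1-itemsets
    freq = {frozenset([i]) for i in items
            if sum(i in t for t in transactions) >= min_support}
    result = set(freq)
    k = 2
    while freq:
        # candidate generation: join frequent (k-1)-itemsets
        candidates = {a | b for a in freq for b in freq if len(a | b) == k}
        # support counting prunes infrequent candidates
        freq = {c for c in candidates
                if sum(c <= t for t in transactions) >= min_support}
        result |= freq
        k += 1
    return result
```

Association rules are then read off each frequent itemset by splitting it into an antecedent and consequent and checking the confidence ratio of their support counts.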
Successful modeling of hydro-environmental processes relies widely on the quantity and quality of accessible data, and noisy data can affect modeling performance. On the other hand, in the training phase of any Artificial Intelligence (AI) based model, each training data set is usually a limited sample of the possible patterns of the process and hence might not show the behavior of the whole population. Accordingly, in the present paper, a wavelet-based denoising method was used to smooth hydrological time series. Thereafter, small normally distributed noises with zero mean and various standard deviations were generated and added to the smoothed time series to form different denoised-jittered data sets. Finally, the pre-processed data were fed into Artificial Neural Network (ANN) and Adaptive Neuro-Fuzzy Inference System (ANFIS) models for daily runoff-sediment modeling of the Minnesota River. To evaluate modeling performance, the outcomes were compared with the results of multiple linear regression (MLR) and Auto-Regressive Integrated Moving Average (ARIMA) models. The comparison showed that the proposed data processing approach, which combines denoising and jittering techniques, could enhance the performance of ANN- and ANFIS-based runoff-sediment modeling of the case study by up to 34% and 25% in the verification phase, respectively.
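The denoise-then-jitter preprocessing can be illustrated with a single-level Haar wavelet transform and soft thresholding; the study presumably uses a multi-level wavelet decomposition, so this is only a stand-in for the idea:

```python
import numpy as np

def haar_denoise(x, threshold):
    """One-level Haar wavelet denoising (even-length input): soft-threshold
    the detail coefficients, keep the approximation, invert the transform."""
    x = np.asarray(x, float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)  # soft threshold
    out = np.empty_like(x)
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

def jitter(x, sigma, rng):
    """Add small zero-mean Gaussian noise to a denoised series."""
    return x + rng.normal(0.0, sigma, size=len(x))
```

Running `jitter` several times with different `sigma` values on the output of `haar_denoise` reproduces the paper's family of denoised-jittered training sets.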
The authors present a case study to demonstrate the possibility of discovering complex and interesting latent structures using hierarchical latent class (HLC) models. A similar effort was made earlier by Zhang (2002), but that study involved only small applications with 4 or 5 observed variables and no more than 2 latent variables, due to the lack of efficient learning algorithms. Significant progress has since been made in algorithmic research, and it is now possible to learn HLC models with dozens of observed variables. This allows the benefits of HLC models to be demonstrated more convincingly than before. The authors have successfully analyzed the CoIL Challenge 2000 data set using HLC models. The model obtained consists of 22 latent variables, and its structure is intuitively appealing. It is exciting that such a large and meaningful latent structure can be automatically inferred from data.
Reconstruction of natural streamflow is fundamental to the sustainable management of water resources. In China, previous reconstructions from sparse and poor-quality gauge measurements have led to large biases in simulations of the interannual and seasonal variability of natural flows. Here we use a well-trained and tested land surface model, coupled to a routing model with flow direction correction, to reconstruct the first high-quality gauge-based natural streamflow dataset for China, covering all its 330 catchments during the period from 1961 to 2018. After flow direction correction for the 330 catchments, a stronger positive linear relationship holds between upstream routing cells and drainage areas. We also introduce a parameter-uncertainty analysis framework including sensitivity analysis, optimization, and regionalization, which further minimizes biases between modeled streamflow and natural streamflow inferred from natural or near-natural gauges. The resulting behavior of the natural hydrological system is represented properly by the model, which achieves high skill metrics for monthly streamflow: about 83% of the 330 catchments have a Nash-Sutcliffe efficiency coefficient (NSE) > 0.7, and about 56% have a Kling-Gupta efficiency coefficient (KGE) > 0.7. The proposed reconstruction scheme has important implications for similar simulation studies in other regions, and the developed low-bias, long-term national datasets produced by statistical postprocessing should be useful in supporting river management activities in China.
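The two skill metrics quoted above have standard definitions: NSE compares squared errors against the variance of the observations, and KGE (in its common form) combines correlation, variability ratio, and bias ratio. A straightforward implementation:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance-about-mean of obs.
    1 is a perfect fit; 0 means no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta efficiency: 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2),
    with r the correlation, alpha the std ratio, beta the mean ratio."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)
```

Both metrics equal 1 for a perfect simulation, so the paper's ">0.7" thresholds mark catchments where the reconstruction explains most of the observed monthly variability.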
In recent years, sensor arrays have attracted much attention in the field of complex system analysis on the basis of their good selectivity and easy operation. Many optical colorimetric sensor arrays are designed to analyze multi-target analytes owing to the good sensitivity of optical signals. In this review, we introduce the target analytes, sensing mechanisms, and data processing methods of optical colorimetric sensor arrays based on optical probes (including organic molecular probes, polymer materials, and nanomaterials). Research progress in the detection of metal ions, anions, toxic gases, organic compounds, biomolecules, and living organisms (such as DNA, amino acids, proteins, microbes, and cells) and actual sample mixtures is summarized. The review illustrates the types, application advantages, and development prospects of optical colorimetric sensor arrays to help readers understand the research progress in the application of chemical sensor arrays.
Funding: supported by the Geosciences and Technology Academy of China University of Petroleum (East China).
Funding: sponsored by the National Natural Science Foundation of China (No. 40272041) and the Innovative Foundation of CNPC (No. 04E702).
Funding: Project 11174235 supported by the National Natural Science Foundation of China; Project 3102014JC02010301 supported by the Fundamental Research Funds for the Central Universities, China.
Funding: supported by the Key Projects of the Anhui Provincial Scientific and Technological Program (11010401015) and the Key Program of the National Natural Science Foundation of China (51134012).
Funding: supported by the Major State Basic Research Development Program of China ("973" Program, No. 2010CB731502).
Funding: sponsored by the "Special Project of Emergency Response to the MS 6.8 Bachu-Jiashi, Xinjiang Earthquake" of the China Earthquake Administration.
Funding: supported by the Innovative Program of the Chinese Academy of Sciences (No. KGCY-SYW-407-02) and the Grand International Cooperation Foundation of the Shanghai Science and Technology Commission (No. 052207046).
文摘Application-specific data processing units (DPUs) are commonly adopted for operational control and data processing in space missions. To overcome the limitations of traditional radiation-hardened or fully commercial design approaches, a reconfigurable-system-on-chip (RSoC) solution based on state-of-the-art FPGA is introduced. The flexibility and reliability of this approach are outlined, and the requirements for an enhanced RSoC design with in-flight reconfigurability for space applications are presented. This design has been demonstrated as an on-board computer prototype, providing an in-flight reconfigurable DPU design approach using integrated hardwired processors.
Abstract: The identification of fractures is of great importance in gravity and magnetic data processing and interpretation. In this study, four fracture-identification methods widely used in the processing and analysis of gravity anomalies are applied to gravity data acquired in a given area in Heilongjiang: the vertical second derivative method, the tilt derivative method, the theta map method and the normalized differential method. By comparing the distribution of the zero contours or maximum contours, we summarize the application effects and the advantages and disadvantages of each method. The tilt derivative method and the normalized differential method prove more effective than the other two, yielding narrower anomaly gradient belts and higher identification precision for fractures and geological boundaries. The inferred fractures and geological boundaries match well with results obtained from the geologic map and from remote sensing data interpretation. These results provide a theoretical foundation for identifying faults and geological boundaries in the area.
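Of the four methods, the tilt derivative has a particularly compact definition: the arctangent of the vertical derivative of the anomaly over its total horizontal gradient, so its zero contour tracks source edges. A minimal sketch, assuming the derivative grids have already been computed by some standard means (finite differences or FFT methods, which the abstract does not specify):

```python
import math

def tilt_derivative(dz: float, dx: float, dy: float) -> float:
    """Tilt derivative (radians): arctan of the vertical derivative dz
    over the total horizontal gradient sqrt(dx^2 + dy^2) of the anomaly."""
    thd = math.hypot(dx, dy)  # total horizontal derivative
    return math.atan2(dz, thd)

# Directly over a source body the horizontal gradient vanishes and the
# tilt tends to +pi/2; over the edge the vertical derivative crosses
# zero, which is why the zero contour is used to map fractures.
print(tilt_derivative(1.0, 0.0, 0.0))  # +pi/2 over the source
print(tilt_derivative(0.0, 1.0, 0.5))  # 0 at the edge
```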
Funding: Supported by the National Natural Science Foundation of China (No. 10974044), the Fundamental Research Funds for the Central Universities of Hohai University (No. 2009B31514) and the 2009 Jiangsu Province Graduate Education Reform and Practice Project (No. 2009-22).
Abstract: In this paper, an indoor environmental monitoring solution based on ZigBee technology is proposed for the transformer substation system. The paper analyzes the principle of a sulfur hexafluoride sensor based on the acoustic method, and proposes a single-channel difference data-processing method with temperature-compensation points to address the power-consumption issue in sensor detection. The method not only reduces sensor power consumption but also ensures accuracy and effectively extends the lifetime of the wireless sensor nodes. The paper also presents the design of the base-station mode and describes the operation of the routers and sensors. The feasibility of the proposed approach has been tested and proved.
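The abstract does not spell out the compensation model, so the following is only a hypothetical sketch of what a single-channel difference measurement with temperature compensation can look like: the raw reading is corrected by subtracting a temperature-dependent baseline interpolated between calibration points. The calibration table and formula are illustrative, not the paper's:

```python
# Hypothetical single-channel difference measurement with temperature
# compensation. BASELINE holds (temperature C, baseline reading) pairs
# measured at calibration points; all values here are made up.

BASELINE = [(0.0, 2.0), (20.0, 2.5), (40.0, 3.4)]

def baseline_at(temp_c: float) -> float:
    """Linearly interpolate the baseline between calibration points;
    clamp to the nearest endpoint outside the calibrated range."""
    pts = sorted(BASELINE)
    for (t0, b0), (t1, b1) in zip(pts, pts[1:]):
        if t0 <= temp_c <= t1:
            return b0 + (b1 - b0) * (temp_c - t0) / (t1 - t0)
    return pts[0][1] if temp_c < pts[0][0] else pts[-1][1]

def compensated_reading(raw: float, temp_c: float) -> float:
    """Difference method: subtract the temperature-dependent baseline."""
    return raw - baseline_at(temp_c)

print(compensated_reading(5.0, 10.0))  # 5.0 - 2.25 = 2.75
```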
Abstract: The paper first gives a brief overview of the use of statistical methods in sociology, showing the continuity and importance of these methods in the development of sociology as a science. The education of sociologists therefore requires, among other things, training in the statistical methods used in data processing and analysis in sociological research. The paper then presents a comparative analysis of the curricula of undergraduate academic studies of sociology, and in particular of the presence of statistics teaching in them, in the Republic of Serbia and the neighbouring countries. Different programs of the Department of Sociology of the Faculty of Philosophy in Novi Sad are analyzed, together with the results of student evaluations for the academic years 2007/2008 and 2009/2010.
Abstract: The existing surface roughness standards cover only two dimensions, whereas the real roughness of a surface is three-dimensional (3D). Roughness parameters of the 3D surface are also important in analyzing the mechanics of contact surfaces, and such problems depend on the accuracy of the 3D surface-roughness characterization. One of the most important factors in determining 3D characteristics is the number of data points along the x and y axes, understood as the number of points within the cut-off length. The number of data points has a substantial influence on the accuracy of measurement results, the measuring time and the size of the output data file (especially along the y-axis, where the number of data points is the number of parallel profiles). The number of data points must therefore be optimal: too few data points lead to incorrect results and increase the distribution amplitude, while too many do not enlarge the range of fundamental information but substantially increase the measuring time. An optimal number of data points must thus be found for each surface-processing method.
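The effect of the number of data points can be illustrated by computing a standard 3D parameter, the arithmetic mean height Sa, from a dense grid and from a subsampled one. The synthetic sinusoidal surface below is purely illustrative; it is not a measured profile from the paper:

```python
# Illustration: the arithmetic mean height Sa of a synthetic surface,
# computed on a dense grid and on a coarse grid with too few points.
import math

def sa(grid):
    """Sa: mean absolute deviation of heights from the mean plane."""
    pts = [z for row in grid for z in row]
    mean = sum(pts) / len(pts)
    return sum(abs(z - mean) for z in pts) / len(pts)

def surface(n):
    """Sample an illustrative sinusoidal test surface on an n x n grid."""
    return [[math.sin(8 * math.pi * i / n) * math.cos(8 * math.pi * j / n)
             for j in range(n)] for i in range(n)]

dense, sparse = sa(surface(256)), sa(surface(16))
print(dense, sparse)  # the coarse grid misestimates Sa
```

With 16 points per axis the grid undersamples the surface texture and Sa comes out noticeably different from the densely sampled value, which is the kind of error the abstract warns about.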
Abstract: An analysis of the relevant standards and guidelines showed a lack of information on the actions and activities involved in data warehouse testing. The absence of a comprehensive data warehouse testing methodology is particularly critical in the implementation phase of a data warehouse. The aim of this article is to suggest basic data warehouse testing activities as the final part of a data warehouse testing methodology. The testing activities to be implemented in the data warehouse testing process can be split into four logical units: multidimensional database testing, data pump testing, metadata testing and OLAP (Online Analytical Processing) testing. The main testing activities include revision of the multidimensional database scheme, optimization of the number of fact tables, the data explosion problem, and testing the correctness of data aggregation and summation.
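One of the activities named above, testing the correctness of aggregation, amounts to reconciling an aggregate cell against the sum of the underlying fact rows. A minimal sketch with made-up table contents (the article does not give a schema):

```python
# Minimal aggregation-correctness check: the total in each aggregate
# (cube) cell must equal the sum of the underlying fact rows.
# The fact table and cube values below are illustrative only.
from collections import defaultdict

fact_rows = [  # (region, product, amount)
    ("EU", "A", 10.0), ("EU", "B", 5.0),
    ("US", "A", 7.5), ("EU", "A", 2.5),
]

def aggregate_by_region(rows):
    """Recompute the region totals directly from the fact rows."""
    totals = defaultdict(float)
    for region, _product, amount in rows:
        totals[region] += amount
    return dict(totals)

cube = {"EU": 17.5, "US": 7.5}  # totals loaded by the data pump
assert aggregate_by_region(fact_rows) == cube, "aggregation mismatch"
print("aggregation test passed")
```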
Abstract: To make effective use of the large amount of graduate data accumulated through teaching-management work in colleges and universities, this paper studies data mining on a higher vocational graduates database using data mining technology. Various data preprocessing methods are applied to the original data, and a mining algorithm based on the commonly used Apriori association-rule algorithm is put forward. An association-rule mining system is then designed and implemented according to actual needs. The mining results benefit employment guidance and teaching-management decisions for colleges and their graduates.
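The Apriori algorithm the paper builds on grows frequent itemsets level by level, pruning any candidate that has an infrequent subset. A compact sketch of that core idea; the transaction items are hypothetical stand-ins, not fields from the graduates database:

```python
# Minimal Apriori: find all itemsets appearing in >= min_support
# transactions, growing candidates one item at a time and pruning
# candidates with an infrequent (k-1)-subset.
from itertools import combinations

def apriori(transactions, min_support):
    transactions = [frozenset(t) for t in transactions]
    items = {i for t in transactions for i in t}
    freq, level = {}, [frozenset([i]) for i in sorted(items)]
    while level:
        counts = {c: sum(c <= t for t in transactions) for c in level}
        survivors = {c: n for c, n in counts.items() if n >= min_support}
        freq.update(survivors)
        keys = list(survivors)
        # join step: unions one item larger than the current level
        level = list({a | b for a in keys for b in keys
                      if len(a | b) == len(a) + 1})
        # prune step: every (k-1)-subset must itself be frequent
        level = [c for c in level
                 if all(frozenset(s) in survivors
                        for s in combinations(c, len(c) - 1))]
    return freq

tx = [{"major:IT", "cert:yes", "employed"},
      {"major:IT", "employed"},
      {"major:IT", "cert:yes", "employed"},
      {"cert:yes"}]
print(apriori(tx, 2))
```

From the frequent itemsets, rules such as {major:IT} → {employed} are then scored by confidence (support of the union over support of the antecedent).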
Funding: Financially supported by a grant from the Research Affairs of Najafabad Branch, Islamic Azad University, Iran.
Abstract: Successful modeling of hydro-environmental processes relies heavily on the quantity and quality of accessible data, and noisy data can degrade modeling performance. Moreover, in the training phase of any Artificial Intelligence (AI) based model, each training data set is usually a limited sample of the possible patterns of the process and hence might not represent the behavior of the whole population. Accordingly, in the present paper a wavelet-based denoising method was used to smooth hydrological time series. Thereafter, small normally distributed noises with zero mean and various standard deviations were generated and added to the smoothed time series to form different denoised-jittered data sets. Finally, the pre-processed data were fed into Artificial Neural Network (ANN) and Adaptive Neuro-Fuzzy Inference System (ANFIS) models for daily runoff-sediment modeling of the Minnesota River. To evaluate modeling performance, the outcomes were compared with the results of multiple linear regression (MLR) and Auto-Regressive Integrated Moving Average (ARIMA) models. The comparison showed that the proposed data-processing approach, which combines denoising and jittering techniques, could enhance the performance of ANN- and ANFIS-based runoff-sediment modeling of the case study by up to 34% and 25% in the verification phase, respectively.
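The denoise-then-jitter preprocessing can be sketched in miniature. The paper's actual wavelet family and threshold rule are not specified in the abstract, so the example below uses a one-level Haar transform with soft thresholding as an illustrative stand-in, followed by zero-mean Gaussian jittering:

```python
# One-level Haar wavelet denoising plus jittering, as a stand-in for
# the wavelet-based smoothing described above (the paper's wavelet and
# threshold choices are not given in the abstract).
import random

def haar_denoise(x, thresh):
    """Soft-threshold the detail coefficients of a one-level Haar DWT,
    then reconstruct. x must have even length."""
    approx = [(a + b) / 2 for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) / 2 for a, b in zip(x[::2], x[1::2])]
    soft = [max(abs(d) - thresh, 0.0) * (1 if d >= 0 else -1)
            for d in detail]
    out = []
    for a, d in zip(approx, soft):
        out += [a + d, a - d]
    return out

def jitter(x, sigma, seed=0):
    """Add small zero-mean Gaussian noise to form a jittered replica."""
    rng = random.Random(seed)
    return [v + rng.gauss(0.0, sigma) for v in x]

series = [3.0, 3.2, 7.9, 8.1, 4.0, 4.4]
smooth = haar_denoise(series, thresh=0.5)
print(smooth)  # small pairwise wiggles are flattened
print(jitter(smooth, sigma=0.05, seed=42))
```

Repeating the jitter step with several standard deviations yields the family of denoised-jittered training sets the abstract describes.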
Funding: Supported by Hong Kong Research Grants Council Grants #622105 and #622307, and the National Basic Research Program of China (the 973 Program) under project No. 2003CB517106.
Abstract: The authors present a case study demonstrating the possibility of discovering complex and interesting latent structures using hierarchical latent class (HLC) models. A similar effort was made earlier by Zhang (2002), but that study involved only small applications with 4 or 5 observed variables and no more than 2 latent variables, owing to the lack of efficient learning algorithms. Significant progress has since been made in algorithmic research, and it is now possible to learn HLC models with dozens of observed variables. This allows the benefits of HLC models to be demonstrated more convincingly than before. The authors have successfully analyzed the CoIL Challenge 2000 data set using HLC models. The model obtained consists of 22 latent variables, and its structure is intuitively appealing. It is exciting that such a large and meaningful latent structure can be automatically inferred from data.
Funding: Supported by the Second Tibetan Plateau Scientific Expedition and Research Program (2019QZKK0405) and the National Natural Science Foundation of China (42041006, 41877155); with support from the Center for Geodata and Analysis, Faculty of Geographical Science, Beijing Normal University (https://gda.bnu.edu.cn/); reviewed by the Ministry of Natural Resources of the People's Republic of China (GS(2021)7303).
Abstract: Reconstruction of natural streamflow is fundamental to the sustainable management of water resources. In China, previous reconstructions from sparse and poor-quality gauge measurements have led to large biases in simulating the interannual and seasonal variability of natural flows. Here we use a well-trained and tested land surface model, coupled to a routing model with flow-direction correction, to reconstruct the first high-quality gauge-based natural streamflow dataset for China, covering all of its 330 catchments from 1961 to 2018. After flow-direction correction for the 330 catchments, a stronger positive linear relationship holds between upstream routing cells and drainage areas. We also introduce a parameter-uncertainty analysis framework, including sensitivity analysis, optimization and regionalization, which further minimizes the biases between modeled streamflow and the natural streamflow inferred from natural or near-natural gauges. The resulting behavior of the natural hydrological system is represented properly by the model, which achieves high skill-metric values for monthly streamflow: about 83% of the 330 catchments have a Nash-Sutcliffe efficiency coefficient (NSE) > 0.7, and about 56% have a Kling-Gupta efficiency coefficient (KGE) > 0.7. The proposed reconstruction scheme has important implications for similar simulation studies in other regions, and the long-term, low-bias national datasets developed by statistical postprocessing should be useful in supporting river-management activities in China.
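The two skill metrics quoted above have standard definitions: NSE compares the model error to the variance of the observations, and KGE combines correlation, variability ratio and bias ratio. A minimal sketch with illustrative flow values (not data from the 330-catchment dataset):

```python
# Standard definitions of the NSE and KGE skill metrics, applied to
# made-up observed/simulated monthly flows.
import math
import statistics

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / sum of squared deviations
    of the observations from their mean. 1 is a perfect fit."""
    mean_obs = statistics.fmean(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    return 1.0 - sse / sum((o - mean_obs) ** 2 for o in obs)

def pearson(x, y):
    """Pearson correlation coefficient."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x)
                    * sum((b - my) ** 2 for b in y))
    return num / den

def kge(obs, sim):
    """Kling-Gupta efficiency from correlation r, variability ratio
    alpha and bias ratio beta. 1 is a perfect fit."""
    r = pearson(obs, sim)
    alpha = statistics.stdev(sim) / statistics.stdev(obs)
    beta = statistics.fmean(sim) / statistics.fmean(obs)
    return 1.0 - math.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = [10.0, 14.0, 22.0, 18.0, 12.0]
sim = [11.0, 13.5, 21.0, 19.0, 12.5]
print(round(nse(obs, sim), 3), round(kge(obs, sim), 3))
```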
Funding: Supported by the Beijing Natural Science Foundation (L172018), the National Natural Science Foundation of China (21575032, 21775010, 81728010), the Fundamental Research Funds for the Central Universities (PYBZ1707, buctrc201607, PT1801), and an Open Grant from the Beijing National Laboratory for Molecular Sciences, Institute of Chemistry, Chinese Academy of Sciences.
Abstract: In recent years, the sensor array has attracted much attention in the field of complex-system analysis because of its good selectivity and easy operation. Many optical colorimetric sensor arrays are designed to analyze multiple target analytes, owing to the good sensitivity of the optical signal. In this review, we introduce the target analytes, sensing mechanisms and data-processing methods of optical colorimetric sensor arrays based on optical probes (including organic molecular probes, polymer materials and nanomaterials). Research progress in the detection of metal ions, anions, toxic gases, organic compounds, biomolecules and living organisms (such as DNA, amino acids, proteins, microbes and cells) and in real sample mixtures is summarized. The review illustrates the types, application advantages and development prospects of the optical colorimetric sensor array, to help readers understand research progress in the application of chemical sensor arrays.