Injection of water to enhance oil production is commonplace, and improvements in understanding the process are economically important. This study examines predictive models of the injection-production ratio. First, the error between the fitted and actual injection-production ratio is computed for five methods: the injection-production ratio and water-oil ratio method, the material balance method, the multiple regression method, the grey-theory GM(1,1) model, and the back-propagation (BP) neural network method. The relative average errors are 1.67%, 1.08%, 19.2%, 1.38%, and 0.88%, respectively. Second, the sources of error in the different prediction methods are analyzed theoretically, showing that the BP neural network method has high prediction precision and better self-adaptability, because it can capture the internal relationship between the injection-production ratio and its influencing factors. The BP neural network method is therefore well suited to predicting the injection-production ratio.
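For concreteness, here is a minimal sketch of the comparison metric, assuming the relative average error is the mean of |fitted − actual| / actual; the arrays are illustrative placeholders, not the study's field data.

```python
import numpy as np

def relative_average_error(actual, fitted):
    """Mean relative error between fitted and actual injection-production ratios."""
    actual = np.asarray(actual, dtype=float)
    fitted = np.asarray(fitted, dtype=float)
    return np.mean(np.abs(fitted - actual) / np.abs(actual))

# Illustrative values only -- not the study's field data.
actual = np.array([1.05, 1.10, 1.08, 1.12, 1.15])
fitted_bp = np.array([1.04, 1.11, 1.07, 1.13, 1.14])  # e.g. BP network output
print(f"relative average error: {relative_average_error(actual, fitted_bp):.2%}")
```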
Coverage analysis is a structural testing technique that helps to eliminate gaps in a test suite and determines when to stop testing. To compute test coverage, this letter proposes a new concept, coverage of variables, based on program slicing. By assigning weights according to the importance of variables, users can focus on the important variables to obtain higher test coverage. The letter presents methods to compute basic coverage based on program structure graphs. In most cases, the coverage obtained here is higher than that obtained by a traditional measure, because the coverage of a variable takes only the related code into account.
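The sketch below shows one plausible reading of weighted variable coverage: each variable's coverage is the fraction of its program slice that the test suite executed, and the overall figure is the importance-weighted average. The slices and weights are hypothetical inputs that a program slicer and the user would supply.

```python
from typing import Dict, Set

def variable_coverage(slice_lines: Set[int], executed: Set[int]) -> float:
    """Coverage of one variable: fraction of its slice that the tests executed."""
    return len(slice_lines & executed) / len(slice_lines) if slice_lines else 1.0

def weighted_coverage(slices: Dict[str, Set[int]],
                      weights: Dict[str, float],
                      executed: Set[int]) -> float:
    """Weight each variable's coverage by its importance and normalize."""
    total = sum(weights.values())
    return sum(weights[v] * variable_coverage(slices[v], executed)
               for v in slices) / total

# Hypothetical slices (line sets from a program slicer) and importance weights.
slices = {"x": {1, 2, 5, 8}, "y": {3, 4, 8}}
weights = {"x": 2.0, "y": 1.0}
executed = {1, 2, 3, 8}
print(f"weighted variable coverage: {weighted_coverage(slices, weights, executed):.2f}")
```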
Based on the synthesis and analysis of recursive receivers, a new algorithm, the partial grouping maximum likelihood algorithm, is proposed to achieve satisfactory performance with moderate computational complexity. The analysis describes some interesting properties shared by the proposed procedures. Finally, the performance assessment shows that the new scheme is superior to the linear detector and to the ordinary grouping algorithm, and achieves a bit-error rate close to that of the optimum receiver.
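The abstract does not spell out the grouping rule, so the following is a hedged sketch of one plausible form of partial grouping maximum likelihood detection: initialize with a linear (pseudo-inverse) detector, then exhaustively refine one small group of symbols at a time while holding the rest fixed. The channel model and group sizes are illustrative assumptions.

```python
import numpy as np
from itertools import product

def partial_grouping_ml(y, H, groups):
    """Sketch: start from a linear (zero-forcing) estimate, then refine each
    group of BPSK symbols by exhaustive ML search while the others stay fixed."""
    s = np.sign(np.linalg.pinv(H) @ y)          # linear detector initialization
    for g in groups:                            # refine one group at a time
        best, best_metric = s[g].copy(), np.inf
        for cand in product([-1.0, 1.0], repeat=len(g)):
            s[g] = cand
            metric = np.linalg.norm(y - H @ s) ** 2
            if metric < best_metric:
                best, best_metric = np.array(cand), metric
        s[g] = best
    return s

rng = np.random.default_rng(0)
K = 6
H = rng.normal(size=(8, K))                     # illustrative channel matrix
s_true = rng.choice([-1.0, 1.0], size=K)
y = H @ s_true + 0.1 * rng.normal(size=8)
s_hat = partial_grouping_ml(y, H, groups=[[0, 1, 2], [3, 4, 5]])
print("bit errors:", int(np.sum(s_hat != s_true)))
```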
Three progressive stages of testing techniques are elaborated: entirely manual operation, testing with separate instruments, and computer program control. The testing method and principle are detailed based on the testing process for meteorological parameters, air pressure, air quality, and rotating velocity, and every testing technique is analyzed. Finally, the outlook for the technique is given. All of this plays a leading role in the development of these testing techniques.
The ESS software package is designed for electrical data processing in the fields of coal prospecting and hydrogeological engineering, and can also be used in other fields of electrical data processing. It can be operated on any microcomputer with more than 512 kB of internal memory. The ESS software package leads office work into an era of automatic data processing and frees field work from tedious, repetitive data treatment and mapping, so that engineers have more time to analyse and interpret field data. Undoubtedly, it helps improve the reliability of geological evaluation.
The aim of this study is to identify the functions and states of the brain according to the values of the complexity measure of EEG signals. EEG signals of 30 normal samples and 30 patient samples are collected. After preprocessing of the raw data, a computational program for the complexity measure is compiled and the complexity measures of all samples are calculated. The mean value and standard error of the complexity measure are 0.33 and 0.10 for the control group, and 0.53 and 0.08 for the normal group. At the 0.05 significance level, the confidence interval for the population mean of the complexity measure is (0.2871, 0.3652) for the control group and (0.4944, 0.5552) for the normal group. The statistical results show that normal samples and patient samples can be clearly distinguished by the value of the measure. In clinical medicine, the results can serve as a reference for evaluating brain function or state, diagnosing disease, and monitoring the rehabilitation progress of the brain.
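As a worked illustration of the interval computation, the sketch below draws placeholder complexity values for two groups of 30 subjects and forms the two-sided t confidence interval for each population mean at the 0.05 level; the simulated numbers only mimic the reported means and spreads and are not the study's data.

```python
import numpy as np
from scipy import stats

def mean_confidence_interval(samples, alpha=0.05):
    """Two-sided t confidence interval for the population mean."""
    samples = np.asarray(samples, dtype=float)
    n = samples.size
    m, se = samples.mean(), samples.std(ddof=1) / np.sqrt(n)
    half = stats.t.ppf(1 - alpha / 2, df=n - 1) * se
    return m - half, m + half

# Simulated complexity values for 30 subjects per group (not the study's data).
rng = np.random.default_rng(1)
control = rng.normal(0.33, 0.10, size=30)
normal = rng.normal(0.53, 0.08, size=30)
print("control CI:", mean_confidence_interval(control))
print("normal  CI:", mean_confidence_interval(normal))
```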
This paper uses the TSA (thermoelastic stress analysis) technique to determine the stress concentration factor (Kt) of a U-notch in an aluminum plate, and then compares the results with those obtained from an FEA (finite element analysis) of the same specimen. To do so, it devises a calculation procedure to extrapolate the thermoelastic data near the tip of the notch and then applies the resulting algorithm to seven distinct experiments with different loading frequencies, mean loads, and load ranges. The overall positive results suggest that the technique may be suitable for Kt measurements in real-world structures. A discussion of the calibration factor of the thermoelastic data is included, comparing calibration results from independent uniaxial tensile tests with those from the paired U-notch TSA and FEA specimen data.
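The extrapolation procedure is not detailed in the abstract, so the following is a hedged sketch of one common approach: fit a low-order polynomial to the TSA stress profile along a line approaching the notch tip, skip the few points nearest the tip where thermoelastic data are unreliable, and extrapolate to the tip to estimate Kt. The stress profile and nominal stress are synthetic.

```python
import numpy as np

def extrapolate_kt(r, sigma_tsa, sigma_nominal, skip=3, deg=2):
    """Fit a polynomial to the TSA stress profile, skipping the points nearest
    the tip (edge/motion artifacts), and extrapolate to r = 0 to estimate Kt."""
    coeffs = np.polyfit(r[skip:], sigma_tsa[skip:], deg)
    sigma_tip = np.polyval(coeffs, 0.0)
    return sigma_tip / sigma_nominal

# Synthetic profile: stress decaying away from the tip (not measured data).
r = np.linspace(0.1, 2.0, 20)          # distance from the notch tip, mm
sigma = 300.0 / (1.0 + 2.0 * r)        # synthetic TSA stress signal, MPa
print(f"Kt estimate: {extrapolate_kt(r, sigma, sigma_nominal=100.0):.2f}")
```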
This paper focuses on the stability testing of fractional-delay systems. It begins with a brief introduction to a recently reported algorithm, a detailed demonstration of a failure in applications of the algorithm, and the key points behind the failure. It then presents a criterion, via integration and stated directly in terms of the characteristic function of the fractional-delay system, for testing whether the characteristic function has roots with negative real parts only. As two applications of the proposed criterion, an algorithm for calculating the rightmost characteristic root and an algorithm for determining the stability switches are proposed. Illustrative examples show that the algorithms work effectively in the stability testing of fractional-delay systems.
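The paper's integral criterion is not reproduced here; as a generic stand-in, the sketch below locates an approximate rightmost characteristic root by Newton-refining seeds taken from a coarse grid over the complex plane, for an illustrative fractional-delay characteristic function (principal branch for the fractional powers). A negative real part of the found root suggests stability.

```python
import numpy as np

def rightmost_root(D, re_range=(-3.0, 1.0), im_range=(0.0, 30.0),
                   grid=200, n_seeds=40, iters=60, h=1e-7, tol=1e-10):
    """Sketch: Newton-refine candidate roots of D(s) seeded from a coarse
    complex-plane grid; return the converged root with the largest real part."""
    S = (np.linspace(*re_range, grid)[None, :]
         + 1j * np.linspace(*im_range, grid)[:, None]).ravel()
    seeds = S[np.argsort(np.abs(D(S)))[:n_seeds]]   # most promising grid points
    roots = []
    for s in seeds:
        for _ in range(iters):
            d = (D(s + h) - D(s - h)) / (2 * h)     # numerical derivative
            step = D(s) / d
            s = s - step
            if abs(step) < tol:
                break
        if abs(D(s)) < 1e-8:                        # keep converged roots only
            roots.append(s)
    return max(roots, key=lambda z: z.real) if roots else None

# Illustrative characteristic function with fractional powers and a delay term:
# D(s) = s^1.5 + 2 s^0.5 + 1 + 0.5 e^{-s}
D = lambda s: s**1.5 + 2.0 * s**0.5 + 1.0 + 0.5 * np.exp(-s)
print("approximate rightmost root:", rightmost_root(D))
```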
A novel approach that can handle ambiguous data from weak targets is proposed within the randomized Hough transform track-before-detect (RHT-TBD) framework. The main idea is that, without a pre-detection and ambiguity-resolution step at each time step, the ambiguous measurements are mapped by a multiple hypothesis ranging (MHR) procedure. In this way, all the information, based on the correlation in the time and pulse repetition frequency (PRF) domains, can be gathered across different PRFs and integrated over time via a batch procedure. The final step is to perform the RHT with all the extended measurements; the ambiguous data are unfolded and the detection decision is confirmed at the end of the processing chain. Unlike classic methods, the new approach resolves the range ambiguity problem while detecting the true target track. Finally, its application is illustrated by analyzing and comparing the performance of the proposed approach and an existing approach. Simulation results demonstrate the effectiveness of the approach.
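A minimal sketch of the range-unfolding idea behind MHR follows: each PRF measures range modulo its unambiguous range, every hypothesis r_i + k * R_ua,i is generated, and the hypothesis most consistent across PRFs is kept. The PRF set, tolerance, and target range are illustrative.

```python
def unfold_range(ambiguous, unambiguous, r_max=200e3, tol=50.0):
    """Sketch of multiple hypothesis ranging: the true range lies in the set
    {r_i + k * R_ua_i} for each PRF i; pick the first-PRF hypothesis that is
    most consistent with the hypothesis sets of all other PRFs."""
    cands = [ambiguous[0] + k * unambiguous[0]
             for k in range(int(r_max // unambiguous[0]) + 1)]
    def mismatch(r):
        # distance from r to the nearest hypothesis of every other PRF
        return sum(min(abs(r - (ri + k * ua))
                       for k in range(int(r_max // ua) + 1))
                   for ri, ua in zip(ambiguous[1:], unambiguous[1:]))
    best = min(cands, key=mismatch)
    return best if mismatch(best) < tol * (len(ambiguous) - 1) else None

# Illustrative three-PRF example: unambiguous ranges (c / (2 * PRF)) in meters.
unamb = [15000.0, 18000.0, 21000.0]
true_r = 73000.0
amb = [true_r % ua for ua in unamb]     # what each PRF actually measures
print("unfolded range:", unfold_range(amb, unamb))
```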
The costs of mispredicting flaring and non-flaring samples differ across applications of solar flare prediction; hence, solar flare prediction is considered a cost-sensitive problem. A cost-sensitive solar flare prediction model is built by modifying the basic decision tree algorithm. The inconsistency rate, combined with an exhaustive search strategy, is used to determine the optimal combination of magnetic field parameters in an active region, and the selected parameters are used as inputs of the prediction model. The performance of the cost-sensitive model is evaluated for different thresholds of solar flares. It is found that, as the cost of wrongly predicting flaring samples as non-flaring increases, more flaring samples are correctly predicted and more non-flaring samples are wrongly predicted, and that a larger cost of this kind is required for a higher threshold of solar flares. This can serve as a guideline for choosing a proper cost to meet the requirements of different applications.
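As a stand-in for the paper's modified decision tree, the sketch below uses scikit-learn's class_weight parameter to encode asymmetric misprediction costs on synthetic data; raising the cost of missing a flare tends to raise flare recall at the price of more false alarms, matching the trend the abstract reports. The features and labels are synthetic placeholders, not magnetic field data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for magnetic field parameters of active regions.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 1.2).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Raising the cost of missing a flare (class 1) tends to catch more flaring
# samples at the price of more false alarms.
for cost in (1, 5, 20):
    tree = DecisionTreeClassifier(class_weight={0: 1, 1: cost},
                                  max_depth=4, random_state=0)
    tree.fit(X_tr, y_tr)
    pred = tree.predict(X_te)
    recall = (pred[y_te == 1] == 1).mean()
    false_alarm = (pred[y_te == 0] == 1).mean()
    print(f"cost {cost:2d}: flare recall {recall:.2f}, false-alarm rate {false_alarm:.2f}")
```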