In existing landslide susceptibility prediction (LSP) models, the influence of random errors in landslide conditioning factors on LSP is not considered; instead, the original conditioning factors are taken directly as model inputs, which brings uncertainties to the LSP results. This study aims to reveal how different proportions of random error in conditioning factors affect LSP uncertainty, and to explore a method that can effectively reduce these random errors. The original conditioning factors are first used to construct original-factors-based LSP models, and random errors of 5%, 10%, 15% and 20% are then added to these original factors to construct the corresponding errors-based LSP models. Secondly, low-pass-filter-based LSP models are constructed by eliminating the random errors with a low-pass filter. Thirdly, Ruijin County, China, with 370 landslides and 16 conditioning factors, is used as the study case. Three typical machine learning models, i.e. multilayer perceptron (MLP), support vector machine (SVM) and random forest (RF), are selected as LSP models. Finally, the LSP uncertainties are discussed, and the results show that: (1) the low-pass filter can effectively reduce the random errors in conditioning factors and thereby decrease LSP uncertainty; (2) as the proportion of random error increases from 5% to 20%, LSP uncertainty increases continuously; (3) the original-factors-based models are feasible for LSP in the absence of more accurate conditioning factors; (4) the two sources of uncertainty, the choice of machine learning model and the proportion of random error, influence LSP modeling to a large and broadly similar degree; and (5) Shapley values effectively explain the internal mechanism by which the machine learning models predict landslide susceptibility. In conclusion, a greater proportion of random error in the conditioning factors results in higher LSP uncertainty, and a low-pass filter can effectively reduce these random errors.
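As a concrete illustration of the de-noising step, the sketch below adds 20% proportional random error to a synthetic conditioning factor and suppresses it with a simple moving-average low-pass filter; the paper does not specify the filter design, and the data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical conditioning factor sampled along a spatial profile
x = np.linspace(0, 1, 500)
factor = np.sin(2 * np.pi * 2 * x) + 0.5 * x              # smooth "true" factor

# Add 20% proportional random error, as in the errors-based models
noisy = factor + 0.20 * np.abs(factor) * rng.standard_normal(x.size)

# Simple low-pass filter: moving average (one possible filter choice)
window = 21
kernel = np.ones(window) / window
denoised = np.convolve(noisy, kernel, mode="same")

# Compare errors on the interior (edges suffer from zero-padding)
sl = slice(window, -window)
rmse_noisy = np.sqrt(np.mean((noisy[sl] - factor[sl]) ** 2))
rmse_denoised = np.sqrt(np.mean((denoised[sl] - factor[sl]) ** 2))
print(rmse_denoised < rmse_noisy)
```

The window length trades noise suppression against distortion of the underlying factor; a real application would tune it to the spatial scale of the factor map.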
In the process of using the original key stratum theory to predict the height of a water-flowing fractured zone (WFZ), two factors are ignored: the influence of rock strata outside the calculation range on the strata within it, and the fact that the shape of the overburden deformation area changes with the excavation length. In this paper, an improved key stratum theory (IKS theory) is proposed that fixes these two shortcomings, and a WFZ height prediction method based on IKS theory is established and applied. First, the range of overburden involved in the analysis is determined according to the tensile stress distribution range above the goaf. Second, the key stratum within this overburden is identified through IKS theory. Finally, the tendency of the WFZ to develop upward is determined by judging whether or not the identified key stratum will break. The proposed method was applied and verified in a mining case study, and it also fully explains the differences in WFZ development patterns between coalfields in Northwest and East China.
This article compares the probability method and the least squares method in the design of linear predictive models. It points out that the two approaches have distinct theoretical foundations and, depending on the assumptions made, can lead to different or similar results in terms of precision and performance. The article underlines the importance of comparing the two approaches in order to choose the one best suited to the context, the available data and the modeling objectives.
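The relationship between the two approaches can be made concrete: under the assumption of independent Gaussian errors, the probability (maximum-likelihood) estimator of a linear model coincides with the least-squares estimator, since both solve the same normal equations. A minimal sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic linear data: y = 2.0 * x + 1.0 + Gaussian noise
x = rng.uniform(0, 10, 200)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, x.size)

# Least squares method: minimize the sum of squared residuals
A = np.column_stack([x, np.ones_like(x)])
beta_ls, *_ = np.linalg.lstsq(A, y, rcond=None)

# "Probability" (maximum-likelihood) method under Gaussian errors:
# maximizing the likelihood yields the same normal equations A'A b = A'y
beta_ml = np.linalg.solve(A.T @ A, A.T @ y)

# Under the Gaussian assumption the two estimators coincide
print(np.allclose(beta_ls, beta_ml))   # True
```

When the error distribution is not Gaussian (e.g. heavy-tailed), the maximum-likelihood estimator differs from least squares, which is where the precision and performance of the two approaches diverge.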
Reservoir identification and production prediction are two of the most important tasks in petroleum exploration and development. Machine learning (ML) methods have been used in petroleum-related studies, but not for reservoir identification combined with production prediction based on the identification results. Production forecasting studies are typically based on overall reservoir thickness and lack accuracy when reservoirs contain a water or dry layer with no oil production. In this paper, a systematic ML workflow was developed that uses classification models for reservoir identification and regression models for production prediction, with the production models built on the reservoir identification results. For reservoir identification, seven optimized ML methods were used: four typical single ML methods and three ensemble ML methods. These methods classify the reservoir into five types of layers: water, dry, and three levels of oil (I, II and III oil layers). The validation and test results suggest that the three ensemble methods perform better than the four single methods in reservoir identification, with XGBoost producing the most accurate model, at up to 99% accuracy. The effective thickness of the I and II oil layers determined during reservoir identification was then fed into the production-prediction models. Effective thickness accounts for the distribution of water and oil, yielding a more reasonable production prediction than one based on overall reservoir thickness. To validate the superiority of this approach, reference models using overall reservoir thickness were built for comparison. The models based on effective thickness outperformed the reference models in every evaluation metric, with prediction accuracy 10% higher than that of the reference models. Free of the personal error and data distortion of traditional methods, this system enables rapid data analysis while reducing the time required to resolve reservoir classification and production prediction challenges. The ML models using effective thickness obtained from reservoir identification were more accurate in predicting oil production than previous studies that used overall reservoir thickness.
Objective: To investigate the reliability of kinetic assay of a substance with background predicted by the integrated method, using the uricase reaction as a model. Methods: The absorbance before uricase action (A0) was estimated by extrapolation with a given lag time of the steady-state reaction. With Km fixed at 12.5 μmol/L, the background absorbance (Ab) was predicted by nonlinearly fitting the integrated Michaelis-Menten equation to the Candida utilis uricase reaction curve. Uric acid in the reaction solution was determined from the difference (ΔA) between A0 and Ab. Results: Ab usually deviated by <3% from the direct assay, with residual substrate down to one fifth of the initial substrate used for analysis. ΔA showed a CV of <5%, resisted common interferences except xanthine, and responded linearly to uric acid with a slope consistent with the absorptivity of uric acid. The lower limit was 2.0 μmol/L and the upper limit reached 30 μmol/L in the reaction solution, with data monitored within 8 min of reaction at 0.015 U/ml uricase. Preliminary application to serum and urine gave better precision than the direct equilibrium method, without removal of proteins before analysis. Conclusion: This kinetic method with background predicted by the integrated method is reliable for enzymatic analysis; it resists common interferences and enhances efficiency at much lower cost.
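The integrated Michaelis-Menten equation underlying the background prediction can be sketched numerically as follows; the Km value matches the abstract, while Vmax, the initial concentration, and the bisection inversion are illustrative assumptions.

```python
import numpy as np

# Integrated Michaelis-Menten equation:
#   Vmax * t = S0 - S(t) + Km * ln(S0 / S(t))
# Given Km (fixed at 12.5 umol/L in the abstract) this maps substrate
# concentration to reaction time; inverting it numerically gives S(t).

KM = 12.5        # umol/L, fixed as in the abstract
VMAX = 2.0       # umol/L/min, hypothetical rate for this sketch

def time_of(s, s0):
    """Reaction time at which substrate has fallen from s0 to s."""
    return (s0 - s + KM * np.log(s0 / s)) / VMAX

def substrate_at(t, s0, tol=1e-10):
    """Invert the integrated equation for S(t) by bisection."""
    lo, hi = 1e-12, s0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if time_of(mid, s0) > t:   # mid is too far along the curve
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

s0 = 20.0                          # umol/L initial uric acid (hypothetical)
s = substrate_at(5.0, s0)          # substrate remaining after 5 min
print(0 < s < s0)                  # consumption is partial and monotone
```

In the assay itself this curve model is fitted to the measured absorbance trace to recover the background term Ab; the sketch only shows the forward relation being inverted.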
Prediction of surface subsidence caused by longwall mining in inclined coal seams is often very challenging, and the existing empirical prediction methods are inflexible under varying geological and mining conditions. An improved influence function method has been developed to take advantage of its fundamentally sound nature and flexibility. In developing this method, the original Knothe function was transformed to produce a continuous and asymmetrical subsidence influence function. Empirical equations for the final subsidence parameters, derived from collected longwall subsidence data, were incorporated into the mathematical models to improve prediction accuracy. A number of demonstration cases for longwall mining in coal seams with varying inclination angles, depths and panel widths were used to verify the applicability of the new subsidence prediction model.
The sea surface temperature (SST) in the Indian Ocean affects the regional climate over the Asian continent, mostly through modulation of the monsoon system, yet it is still difficult to provide an a priori indication of the SST variability over the tropical Indian Ocean on seasonal timescales. It is widely recognized that warm and cold SST events over the tropical Indian Ocean are strongly linked to those of the equatorial eastern Pacific. In this study, a statistical model was developed to predict the monthly SST over the tropical Indian Ocean. It is a linear regression model based on the lag relationship between the SST over the tropical Indian Ocean and the Nino3.4 (5°S-5°N, 170°W-120°W) SST Index. The predictor (the Nino3.4 SST Index) has been operationally predicted by a large-ensemble El Niño and Southern Oscillation (ENSO) forecast system with coupled data assimilation (Leefs_CDA), which achieves high predictive skill up to a 24-month lead time for the equatorial eastern Pacific SST. As a result, the prediction skill of the statistical model over the tropical Indian Ocean is better than that of persistence prediction for January 1982 through December 2009.
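A minimal sketch of such a lagged linear regression, on synthetic monthly indices rather than the real Nino3.4 and Indian Ocean SST data, could look like this:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly indices: the Indian Ocean SST anomaly lags Nino3.4
n = 360                                   # 30 years of months
nino34 = np.sin(2 * np.pi * np.arange(n) / 48) + 0.3 * rng.standard_normal(n)
lag = 4                                   # months, hypothetical lead time
iosst = 0.6 * np.roll(nino34, lag) + 0.2 * rng.standard_normal(n)

# Lagged linear regression: IO_SST(t) ~ a + b * Nino3.4(t - lag)
x, y = nino34[:-lag], iosst[lag:]
A = np.column_stack([x, np.ones_like(x)])
(b, a), *_ = np.linalg.lstsq(A, y, rcond=None)

pred = a + b * x
skill = np.corrcoef(pred, y)[0, 1]
print(skill > 0.8)    # the lagged predictor captures most of the variance
```

The operational version differs in that the predictor itself comes from an ENSO forecast ensemble rather than observed Nino3.4 values, so the regression skill is compounded with the ENSO forecast skill.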
Dimensional analysis and numerical simulations were carried out to develop a method for predicting the water breakthrough time of horizontal wells in bottom-water reservoirs. Four dimensionless independent variables and a dimensionless time were derived from the 10 influencing factors of the problem using dimensional analysis, and simulations of a horizontal well in a reservoir with bottom water were run to find the prediction correlation. A general and concise functional relationship for predicting breakthrough time was established from the simulation results and theoretical analysis. The breakthrough time of a conceptual model predicted by the correlation is very close to the Eclipse result, with less than 2% error. The actual breakthrough time of one well in the Helder oilfield is 10 d, and the method predicts 11.2 d, which is more accurate than the analytical result. The case study indicates that the method can accurately predict the breakthrough time of horizontal wells under different reservoir conditions. Given its universality and ease of use, the method is suitable for quick prediction of breakthrough time.
Aquaculture has long been a critical economic sector in Taiwan. Since a key factor in aquaculture production efficiency is water quality, an effective means of monitoring the dissolved oxygen content (DOC) of aquaculture water is essential. This study developed an Internet of Things system for monitoring DOC by collecting essential water-quality data, and artificial intelligence technology was used to construct a water quality prediction model within a complete water quality management system. Since aquaculture water quality depends on a continuous interaction among multiple factors, with the current state correlated with the previous state, a time-series model is required. This study therefore used recurrent neural networks (RNNs), which have sequential characteristics. Commonly used RNNs such as the long short-term memory (LSTM) model and the gated recurrent unit (GRU) model have a memory function that appropriately retains previous results for use in processing current ones. To construct a suitable RNN model, the Taguchi method was used to optimize the hyperparameters (hidden layer neuron count, iteration count, batch size, learning rate, and dropout ratio), and optimization performance was compared between 5-layer and 7-layer network architectures. The experimental results revealed that the 7-layer GRU was more suitable for this application, with a mean absolute percentage error of 3.7134%, a root mean square error of 0.0638, and an R-value of 0.9984 in tests of prediction performance. The water quality management system developed in this study can therefore quickly provide practitioners with highly accurate data, which is essential for a timely response to water quality issues. This study was performed in collaboration with the Taiwan Industrial Technology Research Institute and a local fishery company. Practical application of the system by the fishery company confirmed that the monitoring system improves the survival rate of farmed fish by providing the data needed to keep DOC above the standard value.
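The time-series setup behind such a model can be sketched as follows; this illustrative snippet builds sliding-window samples from a synthetic DOC series and evaluates the MAPE metric cited above with a simple persistence baseline standing in for the trained GRU.

```python
import numpy as np

def make_windows(series, n_lags):
    """Sliding-window samples: predict x[t] from the previous n_lags values."""
    X = np.stack([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

def mape(y_true, y_pred):
    """Mean absolute percentage error, as reported in the study."""
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

# Hypothetical DOC series (mg/L) with a smooth daily cycle
t = np.arange(500)
doc = 6.0 + 0.8 * np.sin(2 * np.pi * t / 96)

X, y = make_windows(doc, n_lags=8)
naive = X[:, -1]                        # persistence baseline, not the real GRU
print(X.shape, mape(y, naive) < 5.0)
```

In the actual system each window row would feed the GRU's input sequence, and the Taguchi-selected hyperparameters would control the training loop.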
The development of prediction supports is a critical step in information systems engineering in this era defined by the knowledge economy, whose hub is big data. Currently, the lack of a predictive model, whether qualitative or quantitative, appropriate to a company's areas of intervention can handicap or weaken its competitive capacities, endangering its survival. For quantitative prediction, a variety of methods and tools are available depending on the efficacy criteria, and multiple linear regression is one of them. A linear regression model regresses an explained variable on one or more explanatory variables, where the function linking the explanatory variables to the explained variable is linear in its parameters. The purpose of this work is to demonstrate the use of multiple linear regression, one aspect of decisional mathematics. Applying multiple linear regression to random data, which can be replaced by real data collected by or from organizations, provides decision makers with reliable data knowledge; machine learning methods can thus supply decision makers with relevant and trustworthy information. The main goal of this article is therefore to define the objective function whose influencing factors will be determined for its optimization using the linear regression method.
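A minimal sketch of multiple linear regression on random data, as described above, using ordinary least squares on synthetic explanatory variables:

```python
import numpy as np

rng = np.random.default_rng(3)

# Random explanatory variables standing in for organizational data
X = rng.normal(size=(300, 3))
true_beta = np.array([1.5, -2.0, 0.7])
y = 4.0 + X @ true_beta + rng.normal(0, 0.1, 300)

# Multiple linear regression via least squares, intercept included
A = np.column_stack([np.ones(len(X)), X])
beta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)

# The fitted coefficients recover the generating intercept and slopes
print(np.allclose(beta_hat, [4.0, 1.5, -2.0, 0.7], atol=0.05))
```

With real organizational data the same fit yields the coefficient estimates that quantify each influencing factor's contribution to the objective function.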
Predicting solar activity is a significant task for space weather and solar physics. All kinds of approaches have been used to forecast solar activity, with applications ranging from solar dynamo simulation to space mission planning. In this paper, we employ the long short-term memory (LSTM) and neural network autoregression (NNAR) deep learning methods to predict the upcoming 25th solar cycle using sunspot area (SSA) data from May 1874 to December 2020. Our results show that, by the LSTM method, Solar Cycle 25 will be 55% stronger than Solar Cycle 24, with a maximum sunspot area of 3115±401, reaching its peak in October 2022. The results also show that the deep learning algorithms perform better than other commonly used methods and have high application value.
Geo-engineering problems are known for their complexity and high levels of uncertainty, requiring precise definitions, past experience, logical reasoning, mathematical analysis, and practical insight to address them effectively. Soft computing (SC) methods have gained popularity in engineering disciplines such as mining and civil engineering thanks to advancements in computer hardware and machine learning. Unlike traditional hard computing approaches, SC models use soft values and fuzzy sets to navigate uncertain environments. This study focuses on the application of SC methods to predict backbreak, a common issue in blasting operations within mining and civil projects. Backbreak, the unintended fracturing of rock beyond the desired blast perimeter, can significantly impact project timelines and costs. The study explores how SC methods can be employed to anticipate and mitigate the undesirable consequences of blasting operations, focusing specifically on backbreak prediction, and highlights the potential benefits of SC methods for this challenging geo-engineering issue.
In response to the lack of reliable physical parameters in process simulation of butadiene extraction, a large amount of phase equilibrium data was collected in the context of the actual process of butadiene production via acetonitrile. The accuracy of five prediction methods applied to the butadiene extraction process, UNIFAC (UNIQUAC Functional-group Activity Coefficients), UNIFAC-LL, UNIFAC-LBY, UNIFAC-DMD and COSMO-RS, was verified against partial phase equilibrium data. The results showed that the UNIFAC-DMD method had the highest accuracy in predicting phase equilibrium data for the missing systems, and COSMO-RS predicted multiple systems with good accuracy; a large number of missing phase equilibrium data were therefore estimated using the UNIFAC-DMD and COSMO-RS methods. The predicted phase equilibrium data were checked for consistency, and the NRTL-RK (Non-Random Two-Liquid-Redlich-Kwong equation of state) and UNIQUAC thermodynamic models were used to correlate them. Industrial device simulations were used to verify the accuracy of the thermodynamic model applied to the butadiene extraction process. The simulation results showed that the average deviations of the results obtained with the correlated thermodynamic model from the actual values were less than 2%, much smaller than the deviations of simulations using the commercial simulation software Aspen Plus and its database (>10%), indicating that the obtained phase equilibrium data are highly accurate and reliable. The best phase equilibrium data and thermodynamic model parameters for butadiene extraction are provided. This improves the accuracy and reliability of process design, optimization and control, and provides a basis and guarantee for developing a more environmentally friendly and economical butadiene extraction process.
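For reference, the binary NRTL activity-coefficient expression used in this kind of correlation work can be sketched as follows; the interaction parameters here are hypothetical, not the fitted butadiene/acetonitrile values.

```python
import numpy as np

def nrtl_gamma1(x1, tau12, tau21, alpha=0.3):
    """Activity coefficient of component 1 in a binary NRTL model."""
    x2 = 1.0 - x1
    G12, G21 = np.exp(-alpha * tau12), np.exp(-alpha * tau21)
    term1 = tau21 * (G21 / (x1 + x2 * G21)) ** 2
    term2 = tau12 * G12 / (x2 + x1 * G12) ** 2
    return np.exp(x2 ** 2 * (term1 + term2))

# Hypothetical binary interaction parameters
tau12, tau21 = 0.8, 1.2

# Pure-component limit: gamma1 -> 1 as x1 -> 1
print(abs(nrtl_gamma1(1.0, tau12, tau21) - 1.0) < 1e-12)

# Infinite-dilution limit: ln(gamma1_inf) = tau21 + tau12 * exp(-alpha * tau12)
g_inf = nrtl_gamma1(0.0, tau12, tau21)
print(abs(np.log(g_inf) - (1.2 + 0.8 * np.exp(-0.3 * 0.8))) < 1e-12)
```

Correlating phase equilibrium data amounts to fitting tau12 and tau21 (and possibly alpha) so that the predicted activity coefficients reproduce the measured or estimated equilibrium compositions.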
The complexity of the sand-casting process, combined with interactions between process parameters, makes casting quality difficult to control and results in a high scrap rate. A strategy based on a data-driven model was proposed to reduce casting defects and improve production efficiency; it comprises a random forest (RF) classification model, feature importance analysis, and process parameter optimization with Monte Carlo simulation. The collected data, covering four types of defects and the corresponding process parameters, were used to construct the RF model, and the classification results show a recall rate above 90% for all categories. The Gini index was used to assess the importance of the process parameters in the formation of the various defects in the RF model. Finally, the classification model was applied to different production conditions for quality prediction; in the case of process parameter optimization for gas porosity defects, the model serves as the experimental process in the Monte Carlo method to estimate a better temperature distribution. When applied in the factory, the prediction model greatly improved the efficiency of defect detection, with the scrap rate decreasing from 10.16% to 6.68%.
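The Gini-index importance measure rests on the impurity decrease of individual splits, which can be sketched as follows; the casting data here are invented for illustration.

```python
import numpy as np

def gini(labels):
    """Gini impurity of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def gini_decrease(feature, labels, threshold):
    """Impurity decrease of one binary split; an RF accumulates these per feature."""
    left, right = labels[feature <= threshold], labels[feature > threshold]
    w_l, w_r = len(left) / len(labels), len(right) / len(labels)
    return gini(labels) - (w_l * gini(left) + w_r * gini(right))

# Hypothetical casting data: pouring temperature separates porosity from sound parts
temp = np.array([1320, 1340, 1360, 1380, 1400, 1420, 1440, 1460.0])
defect = np.array([0, 0, 0, 0, 1, 1, 1, 1])    # 1 = gas porosity

print(gini_decrease(temp, defect, 1390))       # perfect split: decrease = 0.5
```

A random forest sums these decreases over every node where a feature is used, which is the importance score reported for each process parameter.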
Background: Co-salient object detection (Co-SOD) aims to identify and segment commonly salient objects in a set of related images. However, most current Co-SOD methods include irrelevant information in the co-representation, which hampers their ability to locate co-salient objects and significantly restricts detection accuracy. Methods: To address this issue, this study introduces a novel Co-SOD method with iterative purification and predictive optimization (IPPO), comprising a common salient purification module (CSPM), a predictive optimizing module (POM), and a diminishing mixed enhancement block (DMEB). Results: These components are designed to explore noise-free joint representations, help the model enhance the quality of the final predictions, and significantly improve the performance of the Co-SOD algorithm. A comprehensive evaluation of IPPO against state-of-the-art algorithms, focusing on the roles of CSPM, POM, and DMEB, confirmed that these components are pivotal to the model's performance, substantiating the significant advances of our method over existing benchmarks. Experiments on several challenging benchmark co-saliency datasets demonstrate that the proposed IPPO achieves state-of-the-art performance.
Based on an analysis of the limitations of conventional production-component methods for natural gas development planning, this study proposes a new method that uses life cycle models for trend fitting and prediction of production. In the new method, the annual production of old and new wells is first predicted year by year and then summed to yield the production for the planning period. It is shown that changes in the production of old wells in old blocks can be fitted and predicted using the vapor pressure model (VPM), with a precision of 80%-95%, which is 6.6%-13.2% higher than that of other life cycle models. Furthermore, a new production prediction process and method for new wells was established based on this life cycle model to predict the production of medium-to-shallow gas reservoirs in the western Sichuan Basin, with prediction errors of the production rate in 2021 and 2022 of 6% and 3%, respectively. The new method can be used to guide medium- and long-term planning or annual scheme preparation for gas development, and it is also applicable to planning for large single gas blocks that require continuous infill drilling and adjustment to improve gas recovery.
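One common form of such a vapor-pressure-type life cycle model is ln q(t) = a + b/t + c·ln t (assumed here; the paper's exact formulation may differ). Because it is linear in the coefficients after taking logs, the trend fit reduces to least squares:

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed VPM life-cycle form: ln q(t) = a + b / t + c * ln(t)
t = np.arange(1.0, 21.0)                       # production year 1..20
a, b, c = 8.0, -3.0, -1.2                      # hypothetical coefficients
q = np.exp(a + b / t + c * np.log(t)) * np.exp(rng.normal(0, 0.02, t.size))

# Linear in (a, b, c) after taking logs, so the trend fit is least squares
A = np.column_stack([np.ones_like(t), 1.0 / t, np.log(t)])
coef, *_ = np.linalg.lstsq(A, np.log(q), rcond=None)

fit = np.exp(A @ coef)
precision = 100.0 * (1.0 - np.mean(np.abs(fit - q) / q))
print(precision > 95.0)   # trend fit lands in the precision range reported above
```

Extrapolating the fitted curve beyond the training years gives the annual production of old wells that the planning method then sums with the new-well forecasts.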
It is found that there is a linear relationship between log P_w and the parameter term V_f/{0.5E_coh[1 + (δ_w − δ_p)²/δ_p²]}, based on the water permeability (P_w) data of 21 polymers covering 4 orders of magnitude. This correlation may be useful in choosing membrane materials for the dehumidification of gases.
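A sketch of the parameter term, under the grouping reconstructed above (an assumption where the original typography was ambiguous) and with hypothetical polymer values:

```python
import numpy as np

def perm_parameter(v_f, e_coh, delta_w, delta_p):
    """Parameter term V_f / {0.5 * E_coh * [1 + (d_w - d_p)^2 / d_p^2]}."""
    return v_f / (0.5 * e_coh * (1.0 + (delta_w - delta_p) ** 2 / delta_p ** 2))

# Hypothetical polymers with equal free volume and cohesive energy: the one whose
# solubility parameter is closer to water's gives the larger parameter term and
# hence, by the correlation, the higher water permeability.
delta_water = 47.8   # MPa**0.5, a common literature value for water (assumption)
close = perm_parameter(0.15, 400.0, delta_water, 45.0)
far = perm_parameter(0.15, 400.0, delta_water, 20.0)
print(close > far)
```

Ranking candidate membrane materials by this single term is the practical use suggested by the correlation.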
In order to improve prediction accuracy when using the empirical orthogonal function (EOF) method, this paper describes a novel approach to two-dimensional (2D) EOF analysis based on extrapolating both the spatial and temporal EOF components for long-term prediction of coastal morphological changes. The approach was investigated with data obtained from a process-based numerical model, COAST2D, applied to an idealized study site with a group of shore-parallel breakwaters. The progressive behavior of the spatial and temporal EOF components related to bathymetric changes over a training period was demonstrated, and the EOF components were extrapolated with combined linear and exponential functions for long-term prediction. The extrapolated EOF components were then used to reconstruct bathymetric changes. Comparison of the reconstructed bathymetric changes with the modeled results from COAST2D illustrates that the presented approach can be effective for long-term prediction of coastal morphological changes, and that extrapolating both the spatial and temporal EOF components yields better results than extrapolating only the temporal component.
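The core of the approach, EOF decomposition of a time-space data matrix followed by extrapolation of the components, can be sketched with an SVD on synthetic data; for brevity only the temporal component is extrapolated here (linearly), whereas the paper extrapolates both components with combined linear and exponential functions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic bathymetric-change field: one spatial mode whose amplitude grows in time
n_t, n_x = 40, 120
space_mode = np.sin(np.linspace(0.0, np.pi, n_x))
time_amp = 0.05 * np.arange(n_t)                       # training-period trend
field = np.outer(time_amp, space_mode) + 0.01 * rng.standard_normal((n_t, n_x))

# EOF analysis of the (time x space) matrix via SVD
U, s, Vt = np.linalg.svd(field, full_matrices=False)
temporal = U[:, 0] * s[0]                              # leading temporal component
spatial = Vt[0]                                        # leading spatial EOF (unit norm)

# Extrapolate the temporal component and reconstruct a future change field
t = np.arange(n_t)
slope, intercept = np.polyfit(t, temporal, 1)
future_amp = slope * (n_t + 10) + intercept
predicted = future_amp * spatial

truth = 0.05 * (n_t + 10) * space_mode
err = np.max(np.abs(predicted - truth))
print(err < 0.2)   # extrapolated EOF reconstruction tracks the true field
```

Note that the sign of each SVD mode is arbitrary, but the temporal-spatial product is sign-invariant, so the reconstruction is unaffected.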
The water content of output crude oil is hard to measure precisely because the dielectric coefficient of the crude oil varies widely as a result of injected dehydrating and demulsifying agents. The method proposed in this paper to reduce the measurement error of water content in crude oil is based on automatically switching the measuring ranges of an on-line water content analyzer. Measurement precision on data collected from the oil field and analyzed by in-field operators can be markedly improved by using a back propagation (BP) neural network to predict the water content of the output crude oil. Application results show that the difficulty of accurately measuring the water-oil ratio can be solved effectively through this combination of automatic measuring-range switching and real-time prediction, as the method has been tested repeatedly on site in oil fields with satisfactory prediction results.
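A back-propagation network of the kind mentioned can be sketched from scratch in a few lines; the data and architecture below are invented for illustration, not the analyzer's actual calibration data.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy data: water content as a nonlinear function of two normalized sensor inputs
X = rng.uniform(-1, 1, (200, 2))
y = (0.5 * np.sin(np.pi * X[:, 0]) + 0.3 * X[:, 1] + 0.5).reshape(-1, 1)

# One hidden layer, trained by plain back-propagation (full-batch gradient descent)
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.1

for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                 # forward pass
    out = h @ W2 + b2
    err = out - y                            # backward pass, squared-error loss
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)         # tanh derivative
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(mse < 0.1)   # the network learns the nonlinear mapping
```

In the deployed system the inputs would include the analyzer reading and its active measuring range, with the network correcting for the dielectric-coefficient drift.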
The development of defect prediction plays a significant role in improving software quality. Such predictions are used to identify defective modules before testing and to minimize time and cost, since software with defects negatively impacts operational costs and ultimately customer satisfaction. Numerous approaches exist to predict software defects, but timely and accurate prediction remains a major challenge. To improve timely and accurate software defect prediction, a novel technique called nonparametric Statistical feature scaled QuAdratic regressive convolution Deep nEural Network (SQADEN) is introduced. The proposed SQADEN technique includes two major processes: metric (feature) selection and classification. First, SQADEN uses the nonparametric statistical Torgerson-Gower scaling technique to identify the relevant software metrics, measuring similarity with the Dice coefficient; this feature selection step minimizes the time complexity of software fault prediction. With the selected metrics, software faults are then predicted with a quadratic censored regressive convolution deep neural network classifier. The deep learning classifier analyzes the training and testing samples using the contingency correlation coefficient, and a softstep activation function provides the final fault prediction results. To minimize the error, the Nelder-Mead method is applied to solve the nonlinear least-squares problem, and accurate classification results with minimum error are obtained at the output layer. Experimental evaluation is carried out with quantitative metrics including accuracy, precision, recall, F-measure, and time complexity.
The results demonstrate the superior performance of the proposed SQADEN technique, with accuracy, sensitivity and specificity higher by 3%, 3%, 2% and 3%, and time and space requirements lower by 13% and 15%, compared with the two state-of-the-art methods.
基金This work is funded by the National Natural Science Foundation of China(Grant Nos.42377164 and 52079062)the National Science Fund for Distinguished Young Scholars of China(Grant No.52222905).
Abstract: In existing landslide susceptibility prediction (LSP) models, the influence of random errors in landslide conditioning factors on LSP is not considered; instead, the original conditioning factors are taken directly as model inputs, which brings uncertainty to the LSP results. This study aims to reveal how different proportions of random error in the conditioning factors influence LSP uncertainty, and further to explore a method that can effectively reduce these random errors. The original conditioning factors are first used to construct original factors-based LSP models, and then random errors of 5%, 10%, 15% and 20% are added to these original factors to construct the corresponding errors-based LSP models. Secondly, low-pass filter-based LSP models are constructed by eliminating the random errors with a low-pass filter. Thirdly, Ruijin County, China, with 370 landslides and 16 conditioning factors, is used as the study case. Three typical machine learning models, i.e. multilayer perceptron (MLP), support vector machine (SVM) and random forest (RF), are selected as LSP models. Finally, the LSP uncertainties are discussed, and the results show that: (1) The low-pass filter can effectively reduce the random errors in the conditioning factors and thereby decrease the LSP uncertainties. (2) As the proportion of random error increases from 5% to 20%, the LSP uncertainty increases continuously. (3) The original factors-based models are feasible for LSP in the absence of more accurate conditioning factors. (4) The two uncertainty sources, the choice of machine learning model and the proportion of random error, both influence LSP modeling strongly and to roughly the same degree. (5) Shapley values effectively explain the internal mechanism by which the machine learning models predict landslide susceptibility. In conclusion, a greater proportion of random error in the conditioning factors results in higher LSP uncertainty, and a low-pass filter can effectively reduce these random errors.
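The core workflow described above, injecting proportional random error into a conditioning factor and then suppressing it with a low-pass filter, can be sketched in a few lines. This is an illustrative sketch on a synthetic one-dimensional profile with a simple moving-average filter; the paper's actual factors, filter design and study-area data are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 1-D "conditioning factor" profile (a stand-in for a real factor
# such as elevation along a transect; values are illustrative).
x = np.linspace(0, 4 * np.pi, 500)
factor = 100 + 20 * np.sin(x)

# Add 10% proportional random error, analogous to the errors-based models.
noisy = factor * (1 + 0.10 * rng.standard_normal(factor.size))

# Simple moving-average low-pass filter (window length chosen ad hoc).
window = 15
filtered = np.convolve(noisy, np.ones(window) / window, mode="same")

# Compare errors on the interior to avoid convolution edge effects.
core = slice(window, -window)
rmse_noisy = np.sqrt(np.mean((noisy[core] - factor[core]) ** 2))
rmse_filtered = np.sqrt(np.mean((filtered[core] - factor[core]) ** 2))
```

Averaging over a window of 15 samples attenuates uncorrelated noise by roughly the square root of the window length, which is why the filtered profile sits much closer to the true factor than the noisy one.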
Funding: Supported by the Key Projects of the Natural Science Foundation of China (No. 41931284) and the Scientific Research Start-Up Fund for High-Level Introduced Talents of Anhui University of Science and Technology (No. 2022yjrc21).
Abstract: When the original key stratum theory is used to predict the height of a water-flowing fractured zone (WFZ), two effects are ignored: the influence of rock strata outside the calculation range on the strata within it, and the fact that the shape of the overburden deformation area changes with the excavation length. In this paper, an improved key stratum theory (IKS theory) is proposed to fix these two shortcomings. A WFZ height prediction method based on IKS theory is then established and applied. First, the range of overburden involved in the analysis is determined according to the tensile stress distribution range above the goaf. Second, the key stratum within this range is identified through IKS theory. Finally, the tendency of the WFZ to develop upward is determined by judging whether the identified key stratum will break. The proposed method was applied and verified in a mining case study, and it also fully explains why WFZ development patterns differ between coalfields in Northwest and East China.
Abstract: This article compares the probability method and the least squares method in the design of linear predictive models. It points out that the two approaches rest on distinct theoretical foundations and, under certain assumptions, can lead to varied or similar results in terms of precision and performance. The article underlines the importance of comparing the two approaches in order to choose the one best suited to the context, the available data and the modeling objectives.
Abstract: Reservoir identification and production prediction are two of the most important tasks in petroleum exploration and development. Machine learning (ML) methods have been used in petroleum-related studies, but have not been applied to reservoir identification together with production prediction based on the identification results. Production forecasting studies are typically based on the overall reservoir thickness and lack accuracy when reservoirs contain a water or dry layer that produces no oil. In this paper, a systematic ML method was developed that uses classification models for reservoir identification and regression models for production prediction, with the production models built on the reservoir identification results. For reservoir identification, seven optimized ML methods were used: four typical single ML methods and three ensemble ML methods. These methods classify the reservoir into five types of layers: water, dry and three levels of oil (I, II and III oil layers). The validation and test results suggest that the three ensemble methods outperform the four single ML methods in reservoir identification, with XGBoost producing the most accurate model, up to 99%. The effective thickness of the I and II oil layers determined during reservoir identification was then fed into the production prediction models. Effective thickness accounts for the distribution of water and oil, yielding a more reasonable production prediction than one based on the overall reservoir thickness. To validate the superiority of the ML methods, reference models using the overall reservoir thickness were built for comparison. The models based on effective thickness outperformed the reference models on every evaluation metric, with prediction accuracy about 10% higher. Free of the personal error and data distortion of traditional methods, this novel system enables rapid data analysis while reducing the time required to resolve reservoir classification and production prediction challenges. The ML models using the effective thickness obtained from reservoir identification predicted oil production more accurately than previous studies that used the overall reservoir thickness.
Abstract: Objective: To investigate the reliability of a kinetic assay of substances with background predicted by the integrated method, using the uricase reaction as a model. Methods: Absorbance before uricase action (Δ0) was estimated by extrapolation given the lag time of the steady-state reaction. With Km fixed at 12.5 μmol/L, background absorbance (Δb) was predicted by nonlinearly fitting the integrated Michaelis-Menten equation to the Candida utilis uricase reaction curve. Uric acid in the reaction solution was determined from the difference (ΔA) between Δ0 and Δb. Results: Δb usually deviated by <3% from the direct assay, with residual substrate down to one fifth of the initial substrate used for analysis. ΔA showed a CV of <5%, resisted common interferences except xanthine, and responded linearly to uric acid with a slope consistent with the absorptivity of uric acid. The lower limit was 2.0 μmol/L, and the upper limit reached 30 μmol/L in the reaction solution, with data monitored within 8 min of reaction at 0.015 U/ml uricase. Preliminary application to serum and urine gave better precision than the direct equilibrium method, without removal of proteins before analysis. Conclusion: This kinetic method with background predicted by the integrated method is reliable for enzymatic analysis; it resists common interferences and enhances efficiency at much lower cost.
Abstract: Prediction of surface subsidence caused by longwall mining in inclined coal seams is often very challenging, and the existing empirical prediction methods are inflexible under varying geological and mining conditions. An improved influence function method has been developed to take advantage of its fundamentally sound nature and flexibility. In developing this method, the original Knothe function was transformed to produce a continuous and asymmetrical subsidence influence function. Empirical equations for the final subsidence parameters, derived from collected longwall subsidence data, were incorporated into the mathematical models to improve prediction accuracy. A number of demonstration cases for longwall mining operations in coal seams with varying inclination angles, depths and panel widths were used to verify the applicability of the new subsidence prediction model.
Funding: Supported by the National Basic Research Program of China (Grant No. 2012CB417404) and the National Natural Science Foundation of China (Grant Nos. 41075064 and 41176014).
Abstract: The sea surface temperature (SST) in the Indian Ocean affects the regional climate over the Asian continent mostly through a modulation of the monsoon system. It is still difficult to provide an a priori indication of the seasonal variability over the Indian Ocean. It is widely recognized that the warm and cold SST events over the tropical Indian Ocean are strongly linked to those of the equatorial eastern Pacific. In this study, a statistical model has been developed to predict the monthly SST over the tropical Indian Ocean. It is a linear regression model based on the lag relationship between the SST over the tropical Indian Ocean and the Nino3.4 (5°S-5°N, 170°W-120°W) SST Index. The predictor (i.e., the Nino3.4 SST Index) has been operationally predicted by a large-ensemble El Niño and Southern Oscillation (ENSO) forecast system with coupled data assimilation (Leefs_CDA), which achieves high predictive skill at up to a 24-month lead time for the equatorial eastern Pacific SST. As a result, the prediction skill of the present statistical model over the tropical Indian Ocean is better than that of persistence prediction for January 1982 through December 2009.
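The lag regression described above can be sketched in numpy on synthetic anomaly series. The lag, regression coefficient and noise level below are invented for illustration and are not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly anomalies: the target SST is driven by the index value
# `lag` months earlier (lag, coefficient and noise are illustrative).
n, lag = 360, 3
nino = np.convolve(rng.standard_normal(n + lag), np.ones(6) / 6, mode="same")
io_sst = 0.6 * nino[:-lag] + 0.1 * rng.standard_normal(n)

# Fit the lag regression: IO_SST(t) ~ a * Nino3.4(t - lag) + b
a, b = np.polyfit(nino[:-lag], io_sst, 1)
pred = a * nino[:-lag] + b
skill = np.corrcoef(pred, io_sst)[0, 1]
```

In operational use the lagged predictor would itself come from the ENSO forecast system, so the achievable lead time of the statistical model is inherited from the predictor's lead time.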
Funding: Project (2011ZX05009-004) supported by the National Science and Technology Major Projects of China.
Abstract: Dimensional analysis and numerical simulations were carried out to develop a method for predicting the breakthrough time of horizontal wells in bottom-water reservoirs. Four dimensionless independent variables and a dimensionless time were derived from the 10 influencing factors of the problem by dimensional analysis. Simulations of a horizontal well in a reservoir with bottom water were run to find the prediction correlation, and a general, concise functional relationship for predicting breakthrough time was established from the simulation results and theoretical analysis. The breakthrough time of one conceptual model predicted by the correlation is very close to the Eclipse result, with less than 2% error. The actual breakthrough time of one well in the Helder oilfield is 10 d, and the method predicts 11.2 d, which is more accurate than the analytical result. The case study indicates that the method can accurately predict the breakthrough time of horizontal wells under different reservoir conditions. For its universality and ease of use, the method is suitable for quick prediction of breakthrough time.
Funding: Publication costs are funded by the Ministry of Science and Technology, Taiwan, under Grant Number MOST 110-2221-E-153-010.
Abstract: Aquaculture has long been a critical economic sector in Taiwan. Since a key factor in aquaculture production efficiency is water quality, an effective means of monitoring the dissolved oxygen content (DOC) of aquaculture water is essential. This study developed an Internet of Things system for monitoring DOC by collecting essential water quality data, and artificial intelligence technology was used to construct a water quality prediction model for a complete water quality management system. Since aquaculture water quality depends on a continuous interaction among multiple factors, and the current state is correlated with the previous state, a model with time series is required; this study therefore used recurrent neural networks (RNNs), which have sequential characteristics. Commonly used RNNs such as the long short-term memory (LSTM) model and the gated recurrent unit (GRU) model have a memory function that appropriately retains previous results for use in processing current results. To construct a suitable RNN model, this study used the Taguchi method to optimize the hyperparameters (hidden layer neuron count, iteration count, batch size, learning rate and dropout ratio), and the optimization performance of 5-layer and 7-layer network architectures was also compared. The experimental results revealed that the 7-layer GRU was more suitable for this application, with a mean absolute percentage error of 3.7134%, a root mean square error of 0.0638 and an R-value of 0.9984 in tests of prediction performance. The water quality management system developed in this study can therefore quickly provide practitioners with highly accurate data, which is essential for a timely response to water quality issues. This study was performed in collaboration with the Taiwan Industrial Technology Research Institute and a local fishery company. Practical application of the system by the fishery company confirmed that the monitoring system effectively improves the survival rate of farmed fish by providing the data needed to maintain DOC above the standard value.
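The GRU mentioned above can be illustrated with a single-cell forward pass in plain numpy. This is the standard GRU formulation with made-up dimensions (e.g. four sensor inputs, eight hidden units), not the paper's 7-layer architecture or its Taguchi-tuned hyperparameters; note also that some references swap the roles of z and 1-z in the final interpolation.

```python
import numpy as np

def gru_step(x, h, W_z, U_z, b_z, W_r, U_r, b_r, W_h, U_h, b_h):
    """One forward step of a GRU cell (standard formulation)."""
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    z = sigmoid(W_z @ x + U_z @ h + b_z)               # update gate
    r = sigmoid(W_r @ x + U_r @ h + b_r)               # reset gate
    h_tilde = np.tanh(W_h @ x + U_h @ (r * h) + b_h)   # candidate state
    return (1 - z) * h + z * h_tilde                   # new hidden state

rng = np.random.default_rng(1)
n_in, n_hid = 4, 8   # e.g. 4 water-quality sensors, 8 hidden units (illustrative)
params = [rng.standard_normal(s) * 0.1
          for s in [(n_hid, n_in), (n_hid, n_hid), n_hid] * 3]

# Run a short synthetic DOC-related time series through the cell.
h = np.zeros(n_hid)
for x_t in rng.standard_normal((10, n_in)):
    h = gru_step(x_t, h, *params)
```

The reset gate controls how much of the previous state enters the candidate, and the update gate interpolates between old and candidate states; this gating is the "memory function" the abstract refers to.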
Abstract: The development of prediction supports is a critical step in information systems engineering in this era defined by the knowledge economy, whose hub is big data. Currently, the lack of a predictive model, whether qualitative or quantitative, for a company's areas of intervention can handicap or weaken its competitive capacity, endangering its survival. For quantitative prediction, a variety of methods and tools are available depending on the efficacy criteria; multiple linear regression is one of them. A linear regression model regresses an explained variable on one or more explanatory variables, with the function linking them being linear in its parameters. The purpose of this work is to demonstrate how to use multiple linear regression, one aspect of decisional mathematics. Applying multiple linear regression to random data, which can be replaced by real data collected by or from organizations, provides decision makers with reliable knowledge of their data; machine learning methods can thus supply decision makers with relevant and trustworthy information. The main goal of this article is therefore to define the objective function whose influencing factors will be identified and optimized using the linear regression method.
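The workflow the abstract describes, fitting a multiple linear regression to random data that stands in for real organizational data, can be sketched with ordinary least squares in numpy. All coefficients and data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)

# Random explanatory variables (stand-ins for real organizational data).
n, p = 200, 3
X = rng.standard_normal((n, p))
true_beta = np.array([2.0, -1.0, 0.5])
y = 4.0 + X @ true_beta + 0.1 * rng.standard_normal(n)

# Ordinary least squares: solve for [intercept, beta_1..beta_p].
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
```

With low noise the fitted coefficients recover the generating ones closely, which is the "reliable knowledge" the method is meant to deliver when applied to real data.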
Funding: Supported by the National Natural Science Foundation of China under Grant Numbers U2031202, U1731124 and U1531247, the Special Foundation Work of the Ministry of Science and Technology of the People's Republic of China under Grant Number 2014FY120300, and the 13th Five-Year Informatization Plan of the Chinese Academy of Sciences under Grant Number XXH13505-04.
Abstract: Predicting solar activity is a significant task for space weather and solar physics. All kinds of approaches have been used to forecast solar activity, with applications ranging from solar dynamo simulations to space mission planning. In this paper, we employ the long short-term memory (LSTM) and neural network autoregression (NNAR) deep learning methods to predict the upcoming 25th solar cycle using sunspot area (SSA) data from May 1874 to December 2020. Our results show that, by the LSTM method, Solar Cycle 25 will be 55% stronger than Solar Cycle 24, with a maximum sunspot area of 3115±401 and a peak in October 2022. They also show that the deep learning algorithms perform better than the other commonly used methods and have high application value.
Abstract: Geo-engineering problems are known for their complexity and high levels of uncertainty, requiring precise definitions, past experience, logical reasoning, mathematical analysis and practical insight to address effectively. Soft computing (SC) methods have gained popularity in engineering disciplines such as mining and civil engineering thanks to advances in computer hardware and machine learning. Unlike traditional hard computing approaches, SC models use soft values and fuzzy sets to navigate uncertain environments. This study focuses on the application of SC methods to predict backbreak, a common issue in blasting operations within mining and civil projects. Backbreak, the unintended fracturing of rock beyond the desired blast perimeter, can significantly impact project timelines and costs. The study explores how SC methods can be effectively employed to anticipate and mitigate the undesirable consequences of blasting operations, focusing on backbreak prediction, and highlights the potential benefits of SC methods for this challenging geo-engineering issue.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 22178190).
Abstract: In response to the lack of reliable physical parameters for process simulation of butadiene extraction, a large amount of phase equilibrium data was collected in the context of the actual process of butadiene production by acetonitrile. The accuracy of five prediction methods applied to the butadiene extraction process, UNIFAC (UNIQUAC Functional-group Activity Coefficients), UNIFAC-LL, UNIFAC-LBY, UNIFAC-DMD and COSMO-RS, was verified against partial phase equilibrium data. The results showed that the UNIFAC-DMD method had the highest accuracy in predicting phase equilibrium data for the missing systems, and COSMO-RS showed good accuracy over multiple systems; a large number of missing phase equilibrium data were therefore estimated using the UNIFAC-DMD and COSMO-RS methods. The predicted phase equilibrium data were checked for consistency, and the NRTL-RK (Non-Random Two-Liquid-Redlich-Kwong equation of state) and UNIQUAC thermodynamic models were used to correlate them. Industrial device simulations were used to verify the accuracy of the thermodynamic model applied to the butadiene extraction process. The simulation results showed that the average deviation of the results using the correlated thermodynamic model from the actual values was less than 2%, much smaller than that of simulations using the commercial simulation software Aspen Plus and its database (>10%), indicating that the obtained phase equilibrium data are highly accurate and reliable. The best phase equilibrium data and thermodynamic model parameters for butadiene extraction are provided. This improves the accuracy and reliability of process design, optimization and control, and provides a basis and guarantee for developing a more environmentally friendly and economical butadiene extraction process.
Funding: Financially supported by the National Key Research and Development Program of China (2022YFB3706800, 2020YFB1710100) and the National Natural Science Foundation of China (51821001, 52090042, 52074183).
Abstract: The complex sand-casting process, combined with interactions between process parameters, makes casting quality difficult to control, resulting in a high scrap rate. A strategy based on a data-driven model was proposed to reduce casting defects and improve production efficiency, comprising a random forest (RF) classification model, feature importance analysis, and process parameter optimization with Monte Carlo simulation. The collected data, covering four types of defects and the corresponding process parameters, were used to construct the RF model; classification results show a recall rate above 90% for all categories. The Gini index was used to assess the importance of the process parameters to the formation of each defect in the RF model. Finally, the classification model was applied to different production conditions for quality prediction. For process parameter optimization against gas porosity defects, the model serves as the experimental process in the Monte Carlo method to estimate a better temperature distribution. Applied in the factory, the prediction model greatly improved the efficiency of defect detection, and the scrap rate decreased from 10.16% to 6.68%.
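The Gini-based importance assessment mentioned above rests on the impurity decrease of candidate splits; a random forest sums these decreases per feature across all trees. A minimal sketch on synthetic casting data (the feature names, threshold and defect rule are invented for illustration):

```python
import numpy as np

def gini(labels):
    """Gini impurity of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def gini_decrease(feature, labels, threshold):
    """Impurity decrease from splitting on feature <= threshold
    (the quantity RF aggregates into per-feature Gini importance)."""
    mask = feature <= threshold
    left, right = labels[mask], labels[~mask]
    if left.size == 0 or right.size == 0:
        return 0.0
    n = labels.size
    return gini(labels) - (left.size / n) * gini(left) - (right.size / n) * gini(right)

rng = np.random.default_rng(3)
# Synthetic data: low pouring temperature drives a defect (illustrative rule).
temp = rng.uniform(1300, 1450, 400)
defect = (temp < 1350).astype(int)       # informative feature
noise = rng.uniform(0, 1, 400)           # uninformative feature

informative = gini_decrease(temp, defect, 1350.0)
uninformative = gini_decrease(noise, defect, 0.5)
```

The informative split separates the classes almost perfectly and yields a large impurity decrease, while the unrelated feature yields one near zero; ranking features by these accumulated decreases is exactly what the Gini importance in the abstract does.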
Funding: Supported by the National Natural Science Foundation of China under Grants 62301330 and 62101346, the Guangdong Basic and Applied Basic Research Foundation (2024A1515010496, 2022A1515110101), the Stable Support Plan for Shenzhen Higher Education Institutions (20231121103807001), and the Guangdong Provincial Key Laboratory under Grant 2023B1212060076.
Abstract: Background: Co-salient object detection (Co-SOD) aims to identify and segment commonly salient objects in a set of related images. However, most current Co-SOD methods suffer from the inclusion of irrelevant information in the co-representation, which hampers their ability to locate co-salient objects and significantly restricts detection accuracy. Methods: To address this issue, this study introduces a novel Co-SOD method with iterative purification and predictive optimization (IPPO), comprising a common salient purification module (CSPM), a predictive optimizing module (POM) and a diminishing mixed enhancement block (DMEB). Results: These components are designed to explore noise-free joint representations, help the model enhance the quality of the final predictions, and significantly improve the performance of the Co-SOD algorithm. A comprehensive evaluation of IPPO against state-of-the-art algorithms, focusing on the roles of CSPM, POM and DMEB, confirmed that these components are pivotal to the model's performance, substantiating the advancement of our method over existing benchmarks. Experiments on several challenging benchmark co-saliency datasets demonstrate that the proposed IPPO achieves state-of-the-art performance.
Funding: Funded by the project Technical Countermeasures for the Quantitative Characterization and Adjustment of Residual Gas in Tight Sandstone Gas Reservoirs of the Daniudi Gas Field (P20065-1), organized by the Science & Technology R&D Department of Sinopec.
Abstract: Based on an analysis of the limitations of conventional production component methods for natural gas development planning, this study proposes a new method that uses life cycle models for trend fitting and prediction of production. In the new method, the annual production of old and new wells is predicted year by year and then summed to yield the production for the planning period. The changes in the production of old wells in old blocks can be fitted and predicted using the vapor pressure model (VPM), with a precision of 80%-95%, which is 6.6%-13.2% higher than that of other life cycle models. Furthermore, a new production prediction process and method for new wells was established based on this life cycle model to predict the production of medium-to-shallow gas reservoirs in the western Sichuan Basin, with prediction errors of the production rate in 2021 and 2022 of 6% and 3%, respectively. The new method can be used to guide medium- and long-term planning or annual scheme preparation for gas development. It is also applicable to planning for large single gas blocks that require continuous infill drilling and adjustment to improve gas recovery.
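One common form of the vapor pressure model (VPM) in production forecasting is q(t) = exp(a + b/t + c·ln t), which becomes linear in its parameters after taking logs; whether this matches the exact VPM variant used in the study is an assumption. A synthetic fitting sketch:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic annual production following a VPM-type curve
# q(t) = exp(a + b/t + c*ln(t)); parameter values are illustrative.
t = np.arange(1, 16, dtype=float)
a_true, b_true, c_true = 6.0, -2.0, -0.8
q = np.exp(a_true + b_true / t + c_true * np.log(t)) * (1 + 0.01 * rng.standard_normal(t.size))

# ln(q) is linear in (a, b, c), so ordinary least squares fits the VPM.
A = np.column_stack([np.ones_like(t), 1.0 / t, np.log(t)])
a_fit, b_fit, c_fit = np.linalg.lstsq(A, np.log(q), rcond=None)[0]

# Extrapolate the fitted trend one year beyond the data.
t_next = t[-1] + 1
q_next = np.exp(a_fit + b_fit / t_next + c_fit * np.log(t_next))
```

The log-linearization is what makes the VPM convenient for routine planning work: a standard least-squares solver recovers the trend, which can then be extrapolated year by year and summed across wells.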
Funding: This work was supported by the National Natural Science Foundation of China.
Abstract: From the water permeability (P_w) data of 21 polymers covering 4 orders of magnitude, it is found that there is a linear relationship between log P_w and the parameter term V_f/(0.5E_coh)·[1 + (δ_w − δ_p)²/δ_p²]. This correlation may be useful in choosing membrane materials for the dehumidification of gases.
Funding: Supported by the School of Engineering at Cardiff University through a PhD studentship to accomplish the research.
Abstract: In order to improve prediction accuracy when using the empirical orthogonal function (EOF) method, this paper describes a novel approach to two-dimensional (2D) EOF analysis that extrapolates both the spatial and temporal EOF components for long-term prediction of coastal morphological changes. The approach was investigated with data obtained from a process-based numerical model, COAST2D, applied to an idealized study site with a group of shore-parallel breakwaters. The progressive behavior of the spatial and temporal EOF components related to bathymetric changes over a training period was demonstrated, and the EOF components were extrapolated with combined linear and exponential functions for long-term prediction. The extrapolated EOF components were then used to reconstruct bathymetric changes. Comparison of the reconstructed bathymetric changes with the modeled results from COAST2D illustrates that the presented approach can be effective for long-term prediction of coastal morphological changes, and that extrapolating both the spatial and temporal EOF components yields better results than extrapolating only the temporal component.
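The idea of extrapolating EOF components can be sketched with an SVD of a synthetic space-time matrix. Here only a linear extrapolation of the leading temporal component is shown, with the spatial pattern assumed stationary; the paper itself extrapolates both components with combined linear and exponential functions on real bathymetry.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic space-time field: one spatial mode whose amplitude grows in time.
nx, nt = 40, 24
spatial = np.sin(np.linspace(0, np.pi, nx))
temporal = 0.5 * np.arange(nt)                     # linear trend in time
field = np.outer(spatial, temporal) + 0.05 * rng.standard_normal((nx, nt))

# EOF analysis = SVD of the space-time matrix.
U, s, Vt = np.linalg.svd(field, full_matrices=False)
eof1 = U[:, 0]                 # leading spatial EOF
pc1 = s[0] * Vt[0]             # leading temporal component (PC)

# Extrapolate the temporal component linearly, then reconstruct the field
# at a future time from the (assumed stationary) spatial EOF.
coeffs = np.polyfit(np.arange(nt), pc1, 1)
pc1_future = np.polyval(coeffs, nt + 3)
field_future = eof1 * pc1_future
```

Because the product of the leading spatial and temporal components is invariant to the SVD's sign ambiguity, the reconstruction is well defined even though each component's sign is arbitrary.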
Funding: Sponsored by the Basic Research Foundation of Beijing Institute of Technology (200705422009).
Abstract: The water content of output crude oil is hard to measure precisely because the injected dehydrating and demulsifying agents cause the dielectric coefficient of the crude oil to vary over a wide range. The method proposed in this paper to reduce the measurement error of water content in crude oil is based on automatically switching the measuring ranges of an on-line water content analyzer. Measurement precision on data collected from the oil field and analyzed by in-field operators can be markedly improved by using a back propagation (BP) neural network to predict the water content of the output crude oil. Application results show that the difficulty of accurately measuring the water-oil ratio can be solved effectively through this combination of automatic measuring range switching and real-time prediction; the method has been tested repeatedly on site in oil fields, with satisfactory prediction results.
Abstract: The development of defect prediction plays a significant role in improving software quality. Such predictions are used to identify defective modules before testing and to minimize time and cost, since software with defects negatively impacts operational costs and ultimately customer satisfaction. Numerous approaches exist to predict software defects, but timely and accurate prediction remains a major challenge. To improve it, a novel technique called Nonparametric Statistical feature scaled QuAdratic regressive convolution Deep nEural Network (SQADEN) is introduced. The proposed SQADEN technique comprises two major processes, namely metric (feature) selection and classification. First, SQADEN uses the nonparametric statistical Torgerson-Gower scaling technique to identify the relevant software metrics, measuring similarity with the Dice coefficient; this feature selection step minimizes the time complexity of software fault prediction. With the selected metrics, software faults are predicted using Quadratic Censored regressive convolution deep neural network-based classification. The deep learning classifier analyzes the training and testing samples using the contingency correlation coefficient, and the softstep activation function provides the final fault prediction results. To minimize the error, the Nelder-Mead method is applied to solve the non-linear least-squares problem, so that accurate classification results with minimum error are obtained at the output layer. Experimental evaluation was carried out with quantitative metrics including accuracy, precision, recall, F-measure and time complexity. The results demonstrate the superior performance of the proposed SQADEN technique, with maximum accuracy, sensitivity and specificity improved by 3%, 3%, 2% and 3%, and minimum time and space reduced by 13% and 15%, compared with two state-of-the-art methods.