At present, the acquisition of seismic data is developing toward high-precision, high-density methods. However, complex natural environments and cultural factors in many exploration areas make uniform, dense acquisition difficult to achieve, so complete seismic data cannot always be collected. Data reconstruction is therefore required during processing to ensure imaging accuracy. Deep learning, a rapidly developing field, offers clear advantages in feature extraction and modeling. In this study, a convolutional neural network (CNN) deep learning algorithm is applied to seismic data reconstruction. Based on the CNN algorithm and the characteristics of seismic data acquisition, two training strategies, supervised and unsupervised, are designed to reconstruct sparsely acquired seismic records. First, a supervised learning strategy is proposed for labeled data, in which complete seismic data are segmented to form the training set and are randomly resampled before each training pass, increasing both the number of samples and the richness of their features. Second, an unsupervised learning strategy based on large samples is proposed for unlabeled data, in which a rolling segmentation method updates the pseudo-labels and training parameters during training. Reconstruction tests on simulated and field data show that the CNN-based deep learning algorithm achieves better reconstruction quality and higher accuracy than compressed sensing based on the curvelet transform.
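As a rough illustration of the supervised strategy's random patch sampling, the sketch below crops random training patches from a complete section and decimates traces to form sparse inputs. The function names, patch sizes, and the simple zero-trace decimation are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def extract_random_patches(section, patch_shape, n_patches, rng):
    """Randomly crop training patches from a complete seismic section.

    Resampling new patch positions before each epoch (as the supervised
    strategy suggests) enlarges the effective sample pool.
    """
    nt, nx = section.shape
    pt, px = patch_shape
    patches = np.empty((n_patches, pt, px))
    for i in range(n_patches):
        t0 = rng.integers(0, nt - pt + 1)
        x0 = rng.integers(0, nx - px + 1)
        patches[i] = section[t0:t0 + pt, x0:x0 + px]
    return patches

def decimate_traces(patch, keep_ratio, rng):
    # zero out a random subset of traces (columns) to mimic sparse acquisition
    keep = rng.random(patch.shape[1]) < keep_ratio
    return patch * keep[np.newaxis, :], keep

rng = np.random.default_rng(0)
section = rng.standard_normal((256, 128))       # stand-in for a shot gather
labels = extract_random_patches(section, (64, 32), 16, rng)
inputs = np.stack([decimate_traces(p, 0.5, rng)[0] for p in labels])
print(labels.shape, inputs.shape)  # (16, 64, 32) (16, 64, 32)
```

A CNN would then be trained to map `inputs` back to `labels`; the pairs here are only the data-preparation step.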
Seismic data reconstruction is an essential and fundamental step in the seismic data processing workflow, and it is of profound significance for improving migration imaging quality, multiple suppression, and seismic inversion accuracy. Regularization methods play a central role in solving the underdetermined inverse problem of seismic data reconstruction. In this paper, a novel regularization approach, the low-dimensional manifold model (LDMM), is proposed for reconstructing missing seismic data. Our work relies on the fact that seismic patches always occupy a low-dimensional manifold. Specifically, we use the dimension of the seismic patch manifold as a regularization term in the reconstruction problem and reconstruct the missing seismic data by enforcing low dimensionality on this manifold. The crucial step of the proposed method is computing the dimension of the patch manifold. To this end, we adopt an efficient dimensionality calculation method based on low-rank approximation, which provides a reliable safeguard for enforcing the constraints during reconstruction. Numerical experiments on synthetic and field seismic data demonstrate that, compared with the curvelet-based sparsity-promoting L1-norm minimization method and the multichannel singular spectrum analysis method, the proposed method obtains state-of-the-art reconstruction results.
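The low-rank connection behind LDMM can be illustrated in miniature: patches drawn from a simple seismic-like event stack into a matrix that is well approximated at very low rank. The sketch below, assuming a toy single-event section and a plain truncated SVD (not the paper's algorithm), shows this:

```python
import numpy as np

def patch_matrix(data, patch, step):
    # slide a window over the section and stack vectorized patches as columns
    pt, px = patch
    nt, nx = data.shape
    cols = []
    for t0 in range(0, nt - pt + 1, step):
        for x0 in range(0, nx - px + 1, step):
            cols.append(data[t0:t0 + pt, x0:x0 + px].ravel())
    return np.array(cols).T

def low_rank_project(M, rank):
    # truncated SVD: keep only the leading singular triplets
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, :rank] * s[:rank] @ Vt[:rank]

# a laterally constant sinusoidal "event" makes the patches highly redundant
t = np.arange(64)[:, None]
data = np.sin(0.2 * t) @ np.ones((1, 64))
M = patch_matrix(data, (16, 16), 8)
approx = low_rank_project(M, 2)
print(np.allclose(M, approx, atol=1e-8))  # True: patches lie near a low-dim manifold
```

Real seismic patches are not exactly rank-2, of course; the point is only that their effective dimension is far below the ambient patch dimension, which is what the LDMM regularizer exploits.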
Erasure codes over binary fields, which need only AND and XOR operations for encoding and decoding and therefore have high computational efficiency, are widely used across information technology. A matrix decoding method is proposed in this paper. The method is a universal data reconstruction scheme for erasure codes over binary fields. In addition to pre-judging whether errors are recoverable, the method can rebuild sectors of lost data on a fault-tolerant storage system constructed with erasure codes when disk errors occur. The data reconstruction process of the new method has simple, clear steps, so it is easy to implement in computer code. Moreover, it can easily be extended to non-binary fields, so the method is expected to find wide application in the future.
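A minimal single-parity example of XOR-based reconstruction (a much simpler special case than the general matrix decoding method described here) might look like:

```python
import numpy as np

def xor_parity(blocks):
    # parity block = XOR of all data blocks (addition over GF(2))
    p = np.zeros_like(blocks[0])
    for b in blocks:
        p ^= b
    return p

def rebuild(blocks, parity, lost):
    # XOR of the parity with every surviving block recovers the lost one
    r = parity.copy()
    for i, b in enumerate(blocks):
        if i != lost:
            r ^= b
    return r

rng = np.random.default_rng(1)
data = [rng.integers(0, 256, 8, dtype=np.uint8) for _ in range(4)]
parity = xor_parity(data)
recovered = rebuild(data, parity, lost=2)
print(np.array_equal(recovered, data[2]))  # True
```

A full erasure code tolerating multiple failures replaces this single parity with a generator matrix over GF(2), and decoding becomes the matrix inversion the paper addresses.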
This paper describes a data reconstruction technique for a multi-function sensor based on the M-estimator, which uses the least squares and weighted least squares methods. The algorithm is more robust than conventional least squares, which can amplify the errors of inaccurate data. The M-estimator places particular emphasis on reducing the effects of large data errors, which are further suppressed by an iterative regression process that assigns small weights to large, off-group data errors and large weights to small data errors. Simulation results are consistent with the hypothesis: 81 groups of regression data achieve an average accuracy of 3.5%, demonstrating that the M-estimator provides more accurate and reliable data reconstruction.
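The iterative reweighting idea can be sketched as Huber-weighted iteratively reweighted least squares. The cut-off constant, the MAD scale estimate, and the toy data below are standard textbook choices, not necessarily those of the paper:

```python
import numpy as np

def huber_weights(u, k=1.345):
    # unit weight for small scaled residuals, k/|u| beyond the cut-off
    a = np.abs(u)
    w = np.ones_like(a)
    big = a > k
    w[big] = k / a[big]
    return w

def m_estimate(X, y, k=1.345, n_iter=30):
    # iteratively reweighted least squares with a robust (MAD) scale
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - X @ beta
        scale = np.median(np.abs(r)) / 0.6745 + 1e-12
        sw = np.sqrt(huber_weights(r / scale, k))
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 81)                 # 81 samples, echoing the abstract
X = np.column_stack([np.ones_like(x), x])
y = 2.0 + 3.0 * x + 0.01 * rng.standard_normal(81)
y[::10] += 5.0                                # gross off-group errors
beta_ls = np.linalg.lstsq(X, y, rcond=None)[0]
beta_m = m_estimate(X, y)
print(round(beta_ls[0], 2), round(beta_m[0], 2))
```

The ordinary least-squares intercept is pulled upward by the contaminated samples, while the reweighted fit stays near the true value of 2.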
Oil and gas seismic exploration has to adopt irregular seismic acquisition because of increasingly complex exploration conditions, in order to adapt to complex geological settings and environments. However, irregular seismic acquisition is accompanied by missing acquisition data, which requires high-precision regularization. In this paper, the sparsity of the signal in a transform domain, as used in compressed sensing theory, is exploited to recover the missing signal; this involves sparse transform base optimization and threshold modeling. First, this paper analyzes and compares the effects of six sparse transform bases on the reconstruction accuracy and efficiency of irregular seismic data and establishes the quantitative relationship between the sparse transform and reconstruction accuracy and efficiency. Second, an adaptive threshold modeling method based on the sparse coefficients is provided to improve reconstruction accuracy. Test results show that the method adapts well to different seismic data and sparse transform bases. The f-x domain reconstruction method of effective frequency samples is studied to address the problem of low computational efficiency. A parallel computing strategy combining the curvelet transform with OpenMP is further proposed, which substantially improves computational efficiency while preserving reconstruction accuracy. Finally, actual acquisition data are used to verify the proposed method. The results indicate that the proposed strategy can solve the regularization problem of irregular seismic data in production and improve the imaging quality of the target layer economically and efficiently.
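A compact stand-in for transform-domain reconstruction is projection onto convex sets (POCS) with a decaying threshold, shown here with a 2-D FFT instead of the curvelet transform for brevity; the threshold schedule and toy data are illustrative only:

```python
import numpy as np

def pocs_reconstruct(observed, mask, n_iter=100, p_max=0.99, p_min=0.01):
    # POCS: threshold in the transform domain (hard threshold here; soft
    # thresholding is a common alternative), then re-insert the known traces
    x = observed.copy()
    for i in range(n_iter):
        c = np.fft.fft2(x)
        frac = p_max * (p_min / p_max) ** (i / (n_iter - 1))
        thresh = np.max(np.abs(c)) * frac
        c = c * (np.abs(c) > thresh)
        x = np.real(np.fft.ifft2(c))
        x = observed * mask + x * (1 - mask)
    return x

rng = np.random.default_rng(0)
t = np.arange(64)[:, None]
h = np.arange(64)[None, :]
data = np.sin(2 * np.pi * (4 * t / 64 + 2 * h / 64))   # one linear event
mask = (rng.random(64) < 0.7).astype(float)[None, :]   # ~30% of traces missing
rec = pocs_reconstruct(data * mask, np.broadcast_to(mask, data.shape))
err = np.linalg.norm(rec - data) / np.linalg.norm(data)
print(round(err, 3))
```

Because the event is exactly sparse in the Fourier domain, the decaying threshold recovers the missing traces to a small relative error; the paper's curvelet version applies the same loop with a different transform and its adaptive threshold model.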
BACKGROUND: Lutetium has been shown to be an important potential innovation in pre-treated metastatic castration-resistant prostate cancer. Two clinical trials have evaluated lutetium thus far (TheraP and VISION, with 99 and 385 patients, respectively), but their results are discordant. AIM: To synthesize the available evidence on the effectiveness of lutetium in pre-treated metastatic castration-resistant prostate cancer, and to test the application of a new artificial intelligence technique that synthesizes effectiveness from reconstructed patient-level data. METHODS: We employed a new artificial intelligence method (the shiny method) to pool the survival data of these two trials and evaluate to what extent the lutetium cohorts differed from one another. The shiny technique employs an original reconstruction of individual patient data from the Kaplan-Meier curves. The progression-free survival graphs of the two lutetium cohorts were analyzed and compared. RESULTS: The estimated hazard ratio favored the VISION trial, and the difference was statistically significant (P<0.001). These results indicate that further studies on lutetium are needed because the survival data of the two trials published thus far are conflicting. CONCLUSION: Our study confirms the feasibility of reconstructing patient-level data from survival graphs in order to generate survival statistics.
A new B-spline surface reconstruction method from layer data, based on a deformable model, is presented. An initial deformable surface, represented as a closed cylinder, is first given. The surface is subject to internal forces describing its implicit smoothness property and external forces attracting it toward the layer data points. The finite element method is then adopted to solve the energy minimization problem, which yields a bicubic closed B-spline surface with C^2 continuity. The proposed method provides a smooth and accurate surface model directly from the layer data, without the need to fit cross-sectional curves and make them compatible. The feasibility of the proposed method is verified by experimental results.
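The internal/external force balance can be illustrated with a 2-D active-contour analogue: a polyline rather than a B-spline surface, and explicit gradient steps rather than FEM. The force weights and data are arbitrary illustrative choices:

```python
import numpy as np

def deform(contour, targets, alpha=0.3, beta=0.5, n_iter=200):
    # alpha weights the internal (smoothing) force, beta the external pull
    # toward the nearest data point; explicit Euler steps stand in for the
    # paper's FEM energy minimization
    c = contour.copy()
    for _ in range(n_iter):
        smooth = 0.5 * (np.roll(c, 1, axis=0) + np.roll(c, -1, axis=0)) - c
        d2 = ((targets[None, :, :] - c[:, None, :]) ** 2).sum(-1)
        nearest = targets[np.argmin(d2, axis=1)]
        c = c + alpha * smooth + beta * (nearest - c)
    return c

theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
targets = np.column_stack([2 * np.cos(theta), np.sin(theta)])    # ellipse samples
start = np.column_stack([4 * np.cos(theta), 4 * np.sin(theta)])  # enclosing circle
fitted = deform(start, targets)
gap = np.sqrt(((fitted[:, None] - targets[None]) ** 2).sum(-1)).min(1).mean()
print(round(gap, 3))
```

The enclosing circle shrinks onto the sampled ellipse while the smoothing force keeps it from wrinkling; the 3-D method does the same with a cylinder collapsing onto the layer data.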
We propose a novel machine learning approach to reconstruct meshless surface wind speed fields, i.e., to reconstruct the surface wind speed at any location, based on meteorological background fields and geographical information. The random forest method is selected to develop the machine learning data reconstruction model (MLDRM-RF) for wind speeds over Beijing from 2015-19. We use temporal, geospatial-attribute, and meteorological background field features as inputs. The wind speed field can be reconstructed at any station in the region not used in the training process, allowing cross-validation of model performance. The evaluation considers the spatial distribution of, and seasonal variations in, the root mean squared error (RMSE) of the reconstructed wind speed field across Beijing. The average RMSE is 1.09 m s^(-1), considerably smaller than the result (1.29 m s^(-1)) obtained with inverse distance weighting (IDW) interpolation. Finally, we extract the important feature permutations by the mean decrease in impurity (MDI) method and discuss the reasonableness of the model predictions. MLDRM-RF is a reasonable approach with excellent potential for improved reconstruction of historical surface wind speed fields at arbitrary grid resolutions. Such a model is needed in many wind applications, such as wind energy and aviation safety assessments.
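For context, the IDW baseline that the random forest is compared against can be sketched in a few lines; the power parameter and station layout are illustrative:

```python
import numpy as np

def idw(stations, values, query, power=2.0, eps=1e-12):
    # inverse distance weighting: weights proportional to 1 / distance^power
    d = np.linalg.norm(stations - query, axis=1)
    if np.any(d < eps):
        return float(values[np.argmin(d)])   # query coincides with a station
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))

stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
wind = np.array([2.0, 4.0, 4.0, 6.0])        # observed speeds, m/s
center = idw(stations, wind, np.array([0.5, 0.5]))
print(round(center, 6))  # 4.0 (equidistant stations -> plain average)
```

Unlike IDW, the random forest can exploit non-spatial predictors (terrain, background fields, time of day), which is what lets it undercut the IDW RMSE in the study.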
A long-term dataset of photosynthetically active radiation (Qp) is reconstructed from a broadband global solar radiation (Rs) dataset through an all-weather reconstruction model. The method is based on four years' worth of data collected in Beijing. Observations of Rs and Qp from 2005-2008 are used to investigate the temporal variability of Qp and its dependence on the clearness index and solar zenith angle. A simple and efficient all-weather, empirically derived reconstruction model is proposed to reconstruct Qp from Rs. This reconstruction method is found to estimate instantaneous Qp with high accuracy. The annual mean of the daily values of Qp during the period 1958-2005 is 25.06 mol m^(-2) d^(-1). The long-term trend in the annually averaged Qp is -0.19 mol m^(-2) yr^(-1) from 1958-1997 and -0.12 mol m^(-2) yr^(-1) from 1958-2005. The trend in Qp exhibits sharp decreases in spring and summer and gentler decreases in autumn and winter.
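The flavor of an Rs-to-Qp empirical model can be sketched as below. The PAR fraction, its clearness-index dependence, and the 4.57 umol J^-1 energy-to-photon conversion are illustrative placeholders, not the paper's fitted coefficients:

```python
def clearness_index(rs, extraterrestrial):
    # kt = Rs / extraterrestrial irradiance on a horizontal plane
    return rs / extraterrestrial

def qp_from_rs(rs_wm2, kt):
    # illustrative empirical model: the PAR fraction of Rs rises slightly
    # under cloudier (low-kt) skies; ~4.57 umol J^-1 converts PAR energy
    # to photon flux. Coefficients are placeholders, not the paper's.
    par_fraction = 0.48 - 0.04 * (kt - 0.5)
    return rs_wm2 * par_fraction * 4.57          # umol m^-2 s^-1

rs = 500.0                                       # W m^-2
kt = clearness_index(rs, 1000.0)
qp = qp_from_rs(rs, kt)
print(round(qp, 1))  # 1096.8
```

Fitting `par_fraction` as a function of the clearness index and solar zenith angle against a few years of paired Rs/Qp observations is essentially what the all-weather model does, after which Qp can be hindcast over the full Rs record.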
Streamflow forecasting in drylands is challenging. Data are scarce, catchments are highly human-modified, and streamflow exhibits strong nonlinear responses to rainfall. The goal of this study was to evaluate monthly and seasonal streamflow forecasting in two large catchments of the Jaguaribe River Basin in the Brazilian semi-arid region. We adopted four lead times: one month ahead at the monthly scale, and two, three, and four months ahead at the seasonal scale. Gaps in the historic streamflow series were filled using rainfall-runoff modelling. Time series modelling techniques were then applied: the locally constant model, the locally averaged model, the k-nearest-neighbours algorithm (k-NN), and the autoregressive (AR) model. The reliability criterion for the validation results is that the forecast must be more skillful than streamflow climatology. Our approach outperformed the streamflow climatology for all monthly streamflows; on average, it was 25% better. The seasonal streamflow forecasting (SSF) was also reliable (on average, 20% better than climatology), failing slightly only for the high-flow season of one catchment (6% worse than climatology). Considering an uncertainty envelope (probabilistic forecasting), which was considerably narrower than the data standard deviation, the streamflow forecasting performance increased by about 50% at both scales. The forecast errors were mainly driven by the streamflow intra-seasonality at the monthly scale and by the forecast lead time at the seasonal scale. The best-fit and worst-fit time series models were the k-NN approach and the AR model, respectively. The rainfall-runoff modelling outputs played an important role in improving streamflow forecasting for one streamgauge with 35% data gaps. The developed data-driven approach is mathematically and computationally very simple, demands few resources for operational implementation, and is applicable to other dryland watersheds. Our findings may become part of drought forecasting systems and could help allocate water months in advance. Moreover, the developed strategy can serve as a baseline for more complex streamflow forecast systems.
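The k-NN technique, the best-performing model here, amounts to analogue forecasting on past windows; a minimal sketch, with window length and k chosen arbitrarily:

```python
import numpy as np

def knn_forecast(series, window, k):
    # find the k past windows most similar to the latest one and average
    # the values that followed each of them
    hist = np.array([series[i:i + window] for i in range(len(series) - window)])
    followers = series[window:]
    query = series[-window:]
    d = np.linalg.norm(hist - query, axis=1)
    idx = np.argsort(d)[:k]
    return followers[idx].mean()

# periodic toy "monthly streamflow": the forecast should continue the cycle
series = np.tile([1.0, 2.0, 4.0, 2.0], 6)      # 24 "months"
pred = knn_forecast(series, window=3, k=2)
print(pred)  # 1.0: the pattern ...2, 4, 2 is always followed by 1
```

With real streamflow the analogues are never exact, so the averaged followers yield a smoothed, climatology-beating estimate rather than a perfect match.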
Precipitation is the most discontinuous atmospheric parameter because of its temporal and spatial variability. Precipitation observations at automatic weather stations (AWSs) show different patterns over different time periods. This paper aims to reconstruct missing data by finding the time periods in which precipitation patterns are similar, using a method called the intermittent sliding window period (ISWP) technique, a novel approach to reconstructing the majority of non-continuous missing real-time precipitation data. The ISWP technique is applied to a one-year precipitation dataset (January 2015 to January 2016), with a temporal resolution of 1 h, collected at 11 AWSs run by the India Meteorological Department in the capital region of Delhi. The dataset has 13.66% of its precipitation data missing, of which 90.6% are reconstructed successfully. Furthermore, several traditional estimation algorithms are applied to the reconstructed dataset to estimate the remaining missing values on an hourly basis. The results show that interpolation of the dataset reconstructed with the ISWP technique is of high quality compared with interpolation of the raw dataset. By adopting the ISWP technique, the root-mean-square errors (RMSEs) in the estimation of missing rainfall data based on the arithmetic mean, multiple linear regression, linear regression, and moving average methods are reduced by 4.2%, 55.47%, 19.44%, and 9.64%, respectively. However, adopting the ISWP technique with the inverse distance weighted method increases the RMSE by 0.07%, because the reconstructed data add a more diverse relation to neighboring AWSs.
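A much-simplified version of the similar-window idea behind ISWP might look like the following; the window length, similarity metric, and toy hourly series are assumptions, not the paper's procedure:

```python
import numpy as np

def similar_window_fill(series, gap, window=5):
    # borrow the value from the most similar complete window elsewhere in
    # the record (a simplified take on the sliding-window-period idea)
    half = window // 2
    context = series[gap - half: gap + half + 1]
    known = ~np.isnan(context)
    best_d, best_val = np.inf, np.nan
    for s in range(len(series) - window + 1):
        if s <= gap <= s + window - 1:
            continue                      # candidate must not contain the gap
        cand = series[s:s + window]
        if np.any(np.isnan(cand)):
            continue
        d = np.mean((cand[known] - context[known]) ** 2)
        if d < best_d:
            best_d, best_val = d, cand[half]
    return best_val

rain = np.array([0, 0, 3, 8, 3, 0, 0, 1, 0, 0, 3, np.nan, 3, 0, 0], float)
print(similar_window_fill(rain, gap=11))  # 8.0: 0,3,_,3,0 matched 0,3,8,3,0
```

The actual technique operates on multi-station periods rather than a single series, but the core move is the same: locate a period whose precipitation pattern matches the one surrounding the gap and reconstruct from it.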
An online model is proposed to identify the reasons behind changes in the energy consumption of the reheating furnace of a steel processing plant. The heat conversion of the furnace was analyzed and combined with the furnace's fuel consumption to obtain a model of its energy consumption. Together with the mechanism analysis, the basic parameters affecting energy consumption were determined, and four key influencing factors were identified: furnace output, furnace charging temperature, furnace tapping temperature, and steel type. A specific method for calculating the contribution of each influencing factor was derived to define the conditions of the baseline energy consumption, while online data were used to calculate the energy value and the actual performance value of the baseline energy consumption. The contribution of each influencing factor was determined through normalization. A cloud platform was used for database reconstruction and programming to realize online intelligent evaluation of the reheating furnace's energy consumption. Finally, a case study evaluating the practical energy consumption of a steel rolling furnace in a steel plant is presented. The intelligent evaluation results were quantified and displayed online, and the system's ability to reduce production-line energy consumption was demonstrated.
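The normalization of factor contributions can be sketched generically: each factor's deviation from its baseline is scaled by a sensitivity and the results are expressed as shares of the total change. The factor names, sensitivities, and values below are entirely hypothetical:

```python
def factor_contributions(baseline, actual, sensitivities):
    # per-factor contribution = sensitivity x deviation from baseline;
    # shares are then normalized so they sum to 1 (the normalization step
    # the abstract mentions). All numbers here are hypothetical.
    deltas = {k: sensitivities[k] * (actual[k] - baseline[k]) for k in baseline}
    total = sum(deltas.values())
    shares = {k: d / total for k, d in deltas.items()}
    return deltas, shares

baseline = {"output_t_h": 100.0, "charge_temp_C": 400.0, "tap_temp_C": 1200.0}
actual = {"output_t_h": 90.0, "charge_temp_C": 350.0, "tap_temp_C": 1210.0}
sens = {"output_t_h": -0.5, "charge_temp_C": -0.02, "tap_temp_C": 0.05}  # GJ per unit
deltas, shares = factor_contributions(baseline, actual, sens)
print(deltas["output_t_h"], round(sum(shares.values()), 6))  # 5.0 1.0
```

In the online system, the sensitivities come from the mechanism analysis rather than being fixed constants, and the shares drive the dashboard's attribution of energy-consumption changes.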
Funding: This study was supported by the National Natural Science Foundation of China under the project 'Research on the Dynamic Location of Receiver Points and Wave Field Separation Technology Based on Deep Learning in OBN Seismic Exploration' (No. 42074140).
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 41874146 and 42030103) and the Postgraduate Innovation Project of China University of Petroleum (East China) (No. YCX2021012).
Funding: Supported by the National Natural Science Foundation of China under Grant No. 61501064 and the Sichuan Provincial Science and Technology Project under Grant No. 2016GZ0122.
Funding: The National Natural Science Foundation of China (Nos. 60172071 and 60372005).
Funding: Supported by the National Science and Technology Major Project (No. 2016ZX05024001003), the Innovation Consortium Project of China Petroleum, and Southwest Petroleum University (No. 2020CX010201).
Funding: This project is supported by the National Natural Science Foundation of China (No. 10272033) and the Provincial Natural Science Foundation of Guangdong, China (No. 04105385).
Funding: Supported by the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDA19030402), the Key Special Projects for International Cooperation in Science and Technology Innovation between Governments (Grant No. 2017YFE0133600), and the Beijing Municipal Natural Science Foundation Youth Project 8214066: Application Research of Beijing Road Visibility Prediction Based on Machine Learning Methods.
Funding: Supported by the National Basic Research Program of China (No. 2007CB407303).
Funding: The first author thanks the Brazilian National Council for Scientific and Technological Development for the post-doc scholarship (155814/2018-4).
Abstract: Streamflow forecasting in drylands is challenging. Data are scarce, catchments are highly human-modified, and streamflow exhibits strong nonlinear responses to rainfall. The goal of this study was to evaluate monthly and seasonal streamflow forecasting in two large catchments of the Jaguaribe River Basin in the Brazilian semi-arid region. We adopted four lead times: one month ahead at the monthly scale, and two, three, and four months ahead at the seasonal scale. Gaps in the historic streamflow series were filled using rainfall-runoff modelling. Time series techniques were then applied, namely the locally constant model, the locally averaged model, the k-nearest-neighbours algorithm (k-NN), and the autoregressive (AR) model. The reliability criterion for the validation results is that the forecast be more skillful than streamflow climatology. Our approach outperformed the streamflow climatology for all monthly streamflows, being on average 25% better. The seasonal streamflow forecasting (SSF) was also reliable (on average, 20% better than the climatology), failing slightly only for the high-flow season of one catchment (6% worse than the climatology). Considering an uncertainty envelope (probabilistic forecasting), which was considerably narrower than the data standard deviation, the streamflow forecasting performance increased by about 50% at both scales. The forecast errors were driven mainly by the streamflow intra-seasonality at the monthly scale and by the forecast lead time at the seasonal scale. The best-fit and worst-fit time series models were the k-NN approach and the AR model, respectively. The rainfall-runoff modelling outputs played an important role in improving streamflow forecasting for one streamgauge that had 35% of its data missing. The developed data-driven approach is mathematically and computationally very simple, demands few resources for operational implementation, and is applicable to other dryland watersheds. Our findings may feed into drought forecasting systems and potentially help allocate water months in advance. Moreover, the developed strategy can serve as a baseline for more complex streamflow forecast systems.
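The k-NN analogue forecasting applied above can be sketched in a few lines: embed the series into lag vectors, find the k historical vectors closest to the most recent one, and average their successors. This is a minimal illustration only; the lag length, Euclidean distance metric, and averaging rule are assumptions, not the authors' exact configuration.

```python
import numpy as np

def knn_forecast(series, k=3, lag=3):
    """Forecast the next value of a series by averaging the successors
    of the k historical lag-vectors closest to the most recent one."""
    x = np.asarray(series, dtype=float)
    # Library of lag vectors and the value that followed each of them
    vectors = np.array([x[i:i + lag] for i in range(len(x) - lag)])
    successors = x[lag:]
    # The most recent lag vector is the query (its successor is unknown)
    query = x[-lag:]
    dists = np.linalg.norm(vectors - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return successors[nearest].mean()
```

For a strictly periodic series the nearest analogues match exactly and the forecast reproduces the pattern; on real streamflow the choice of k and lag would be tuned in validation.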
Abstract: Precipitation is the most discontinuous atmospheric parameter because of its temporal and spatial variability. Precipitation observations at automatic weather stations (AWSs) show different patterns over different time periods. This paper aims to reconstruct missing data by finding time periods in which precipitation patterns are similar, using a method called the intermittent sliding window period (ISWP) technique, a novel approach to reconstructing the majority of non-continuous missing real-time precipitation data. The ISWP technique is applied to a 1-yr precipitation dataset (January 2015 to January 2016), with a temporal resolution of 1 h, collected at 11 AWSs run by the Indian Meteorological Department in the capital region of Delhi. The acquired dataset has 13.66% of its precipitation data missing, of which 90.6% are reconstructed successfully. Furthermore, some traditional estimation algorithms are applied to the reconstructed dataset to estimate the remaining missing values on an hourly basis. The results show that interpolation of the reconstructed dataset using the ISWP technique is of high quality compared with interpolation of the raw dataset. By adopting the ISWP technique, the root-mean-square errors (RMSEs) in the estimation of missing rainfall data, based on the arithmetic mean, multiple linear regression, linear regression, and moving average methods, are reduced by 4.2%, 55.47%, 19.44%, and 9.64%, respectively. However, adopting the ISWP technique with the inverse distance weighted method increases the RMSE by 0.07%, because the reconstructed data add a more diverse relation to the neighbouring AWSs.
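Of the traditional estimators compared above, the inverse distance weighted (IDW) method is the simplest to state: a missing value is estimated as a weighted mean of neighbouring stations, with weights decaying as an inverse power of distance. The sketch below assumes hypothetical station distances and values and the common power-of-2 weighting; it is not the paper's specific configuration.

```python
def idw_estimate(neighbors, power=2):
    """Inverse distance weighted estimate of a missing value from
    neighboring stations, given as (distance, observed_value) pairs."""
    num = sum(value / dist ** power for dist, value in neighbors)
    den = sum(1.0 / dist ** power for dist, value in neighbors)
    return num / den
```

Equidistant neighbours reduce to a plain average, while a nearby station dominates the estimate as its weight grows with 1/d^power.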
Funding: Supported by the National Key Research and Development Program of China (Grant No. 2020YFB1711101) and the Anhui Provincial University Natural Science Foundation Key Project (Grant No. KJ2019A127).
Abstract: An online model was proposed to identify the reasons behind changes in the energy consumption of the reheating furnace of a steel processing plant. The heat conversion of the furnace was analyzed and integrated with its fuel consumption to obtain a model of the energy consumption. Combined with the mechanism analysis, the basic parameters affecting energy consumption were determined, yielding four key influencing factors: furnace output, furnace charging temperature, furnace tapping temperature, and steel type. A specific calculation method for the contribution of each influencing factor was derived to define the conditions of the baseline energy consumption, while online data were used to calculate the energy value and the actual performance value of the baseline energy consumption. The contribution of each influencing factor was determined through normalization. A cloud platform was used for database reconstruction and programming to realize the online intelligent evaluation of the reheating furnace's energy consumption. Finally, a case study evaluating the practical energy consumption of a steel rolling furnace in a steel plant was presented. The intelligent evaluation results were quantified and displayed online, and the system's ability to reduce production-line energy consumption was demonstrated.
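The normalization step, attributing a share of the total energy-consumption change to each influencing factor, might look like the following minimal sketch. The factor names and deviation values are hypothetical, and the paper's derivation of each raw contribution from the heat-conversion model is not reproduced here.

```python
def contribution_shares(factor_deviations):
    """Normalize the energy-consumption deviation attributed to each
    influencing factor into a percentage share of the total absolute change."""
    total = sum(abs(v) for v in factor_deviations.values())
    return {name: 100.0 * abs(v) / total
            for name, v in factor_deviations.items()}
```

Using absolute deviations means factors that raise and factors that lower consumption both count toward the total, and the shares always sum to 100%.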