Journal Articles
11 articles found
1. Brittleness index predictions from Lower Barnett Shale well-log data applying an optimized data matching algorithm at various sampling densities (Cited: 1)
Author: David A. Wood. Geoscience Frontiers (SCIE, CAS, CSCD), 2021, Issue 6, pp. 444-457 (14 pages).
The capability of accurately predicting mineralogical brittleness index (BI) from basic suites of well logs is desirable, as it provides a useful indicator of the fracability of tight formations. Measuring mineralogical components in rocks is expensive and time consuming. However, the basic well log curves are not well correlated with BI, so correlation-based, machine-learning methods are not able to derive highly accurate BI predictions from such data. A correlation-free, optimized data-matching algorithm is configured to predict BI on a supervised basis from well log and core data available from two published wells in the Lower Barnett Shale Formation (Texas). This transparent open box (TOB) algorithm matches data records by calculating the sum of squared errors between their variables and selecting the best matches as those with the minimum squared errors. It then applies optimizers to adjust the weights applied to individual variable errors to minimize the root mean square error (RMSE) between calculated and predicted BI. The prediction accuracy achieved by TOB using just five well logs (Gr, ρb, Ns, Rs, Dt) to predict BI depends on the density of data records sampled. At a sampling density of about one sample per 0.5 ft, BI is predicted with RMSE ~0.056 and R² ~0.790. At a sampling density of about one sample per 0.1 ft, BI is predicted with RMSE ~0.008 and R² ~0.995. Adding a stratigraphic height index as an additional (sixth) input variable improves BI prediction accuracy to RMSE ~0.003 and R² ~0.999 for the two wells, with only 1 record in 10,000 yielding a BI prediction error of >±0.1. The model has the potential to be applied on an unsupervised basis to predict BI from basic well log data in surrounding wells lacking mineralogical measurements but with similar lithofacies and burial histories. The method could also be extended to predict elastic rock properties and seismic attributes from well and seismic data to improve the precision of brittleness index and fracability mapping spatially.
Keywords: well-log brittleness index estimates; data record sample densities; zoomed-in data interpolation; correlation-free prediction analysis; mineralogical and elastic influences
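The matching-and-averaging core of the TOB approach can be sketched in a few lines. This is a minimal illustration with synthetic data; the function name, toy weights, and k-best averaging are assumptions, and the published algorithm additionally tunes the per-variable error weights with optimizers to minimize RMSE:

```python
import numpy as np

def tob_predict(train_X, train_y, query, weights, k=3):
    """Predict a target value for `query` by weighted squared-error matching.

    Each training record is scored by the weighted sum of squared
    differences of its log variables from the query; the prediction is
    the mean target value of the k best-matching records.
    """
    errors = ((train_X - query) ** 2 * weights).sum(axis=1)
    best = np.argsort(errors)[:k]
    return train_y[best].mean()

# Toy example: 5 log variables standing in for (Gr, rho_b, Ns, Rs, Dt).
rng = np.random.default_rng(0)
train_X = rng.random((100, 5))
train_y = train_X.mean(axis=1)   # synthetic stand-in for brittleness index
weights = np.ones(5)             # TOB would optimize these per-variable weights
pred = tob_predict(train_X, train_y, train_X[0], weights, k=1)
```

With k=1 and a query identical to a training record, the matcher returns that record's target exactly, which is the degenerate case of the matching logic.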
2. Knowledge Transfer Learning via Dual Density Sampling for Resource-Limited Domain Adaptation (Cited: 1)
Authors: Zefeng Zheng, Luyao Teng, Wei Zhang, Naiqi Wu, Shaohua Teng. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2023, Issue 12, pp. 2269-2291 (23 pages).
Most existing domain adaptation (DA) methods aim to achieve favorable performance under complicated environments by sampling. However, three unsolved problems limit their efficiency: (i) they adopt global sampling but neglect to exploit global and local sampling simultaneously; (ii) they transfer knowledge from either a global perspective or a local perspective, while overlooking transmission of confident knowledge from both perspectives; and (iii) they apply repeated sampling during iteration, which takes a lot of time. To address these problems, knowledge transfer learning via dual density sampling (KTL-DDS) is proposed in this study, which consists of three parts: (i) dual density sampling (DDS), which jointly leverages two sampling methods associated with different views, i.e., global density sampling that extracts representative samples with the most common features and local density sampling that selects representative samples with critical boundary information; (ii) consistent maximum mean discrepancy (CMMD), which reduces intra- and cross-domain risks and guarantees high consistency of knowledge by shortening the distances between every two of the four subsets collected by DDS; and (iii) knowledge dissemination (KD), which transmits confident and consistent knowledge from the representative target samples with global and local properties to the whole target domain by preserving the neighboring relationships of the target domain. Mathematical analyses show that DDS avoids repeated sampling during the iteration. With the above three actions, confident knowledge with both global and local properties is transferred, and the memory and running time are greatly reduced. In addition, a general framework named dual density sampling approximation (DDSA) is derived, which can be easily applied to other DA algorithms. Extensive experiments on five datasets in clean, label corruption (LC), feature missing (FM), and LC&FM environments demonstrate the encouraging performance of KTL-DDS.
Keywords: cross-domain risk; dual density sampling; intra-domain risk; maximum mean discrepancy; knowledge transfer learning; resource-limited domain adaptation
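A rough sketch of the dual-sampling idea, assuming a k-nearest-neighbour density proxy (the paper's actual density estimators and selection rules are not reproduced here): high-density points stand in for "global" representative samples with common features, and low-density points stand in for "local" samples carrying boundary information:

```python
import numpy as np

def knn_density(X, k=5):
    # Density proxy: inverse of the distance to the k-th nearest neighbour.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    kth = np.sort(d, axis=1)[:, k]   # column 0 is the zero self-distance
    return 1.0 / (kth + 1e-12)

def dual_density_sample(X, n_global, n_local, k=5):
    dens = knn_density(X, k)
    order = np.argsort(dens)
    global_idx = order[-n_global:]   # highest density: representative cores
    local_idx = order[:n_local]      # lowest density: boundary/edge points
    return global_idx, local_idx

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 2))
g, l = dual_density_sample(X, 5, 5)
```

Because densities are computed once and sorted, the selection itself involves no repeated sampling, in the spirit of the efficiency claim above.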
3. Spatial Variability of Soil Properties at Capulin Volcano, New Mexico, USA: Implications for Sampling Strategy (Cited: 40)
Author: D. C. Weindorf. Pedosphere (SCIE, CAS, CSCD), 2010, Issue 2, pp. 185-197 (13 pages).
Non-agricultural lands are generally surveyed sparsely. Meanwhile, soils in these areas usually exhibit strong spatial variability, which requires more samples to produce acceptable estimates. Capulin Volcano National Monument, a typical sparsely-surveyed area, was chosen to assess the spatial variability of a variety of soil properties and, furthermore, to investigate its implications for sampling design. One hundred and forty-one composited soil samples were collected across the Monument and the surrounding areas. Soil properties including pH, organic matter content, extractable elements such as calcium (Ca), magnesium (Mg), potassium (K), sodium (Na), phosphorus (P), sulfur (S), zinc (Zn), and copper (Cu), as well as sand, silt, and clay percentages, were analyzed for each sample. Semivariograms of all properties were constructed, standardized, and compared to estimate the spatial variability of the soil properties in the area. Based on the similarity among standardized semivariograms, we found that the semivariograms could be generalized for physical and chemical properties, respectively. The generalized semivariogram for physical properties had a much greater sill value (2.635) and effective range (7500 m) than that for chemical properties. Optimal sampling density (OSD), which is derived from the generalized semivariogram and defines the relationship between sampling density and expected error percentage, was proposed to represent, interpret, and compare soil spatial variability and to provide guidance for sampling scheme design. OSDs showed that chemical properties exhibit stronger local spatial variability than soil texture parameters, implying that more samples or analyses are required to achieve a similar level of precision.
Keywords: generalized semivariogram; GIS; optimal sampling density; sampling design
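An empirical semivariogram of the kind standardized and compared in this study estimates gamma(h) as the mean of half the squared differences between sample pairs in each distance bin. The sketch below uses toy coordinates and arbitrary lag bins; it is illustrative only, not the study's fitting procedure:

```python
import numpy as np

def empirical_semivariogram(coords, values, lags):
    """gamma(h): mean of 0.5*(z_i - z_j)^2 over pairs whose separation falls in each lag bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(len(values), k=1)       # each pair counted once
    dist = d[iu]
    sqdiff = 0.5 * (values[:, None] - values[None, :])[iu] ** 2
    gamma = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        m = (dist >= lo) & (dist < hi)
        gamma.append(sqdiff[m].mean() if m.any() else np.nan)
    return np.array(gamma)

rng = np.random.default_rng(2)
coords = rng.random((50, 2)) * 1000   # hypothetical sample locations in metres
values = rng.random(50)               # hypothetical soil property values
lags = np.linspace(0, 1000, 6)        # five lag bins of 200 m
gam = empirical_semivariogram(coords, values, lags)
```

Dividing each curve by its sill would give the standardized semivariograms the authors compare across properties.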
4. Improved Prediction and Reduction of Sampling Density for Soil Salinity by Different Geostatistical Methods (Cited: 7)
Authors: LI Yan, SHI Zhou, WU Ci-fang, LI Hong-yi, LI Feng. Agricultural Sciences in China (CAS, CSCD), 2007, Issue 7, pp. 832-841 (10 pages).
The spatial estimation of soil properties can be improved, and sampling intensities decreased, by incorporating auxiliary data. In this study, two interpolation methods that incorporate auxiliary variables, cokriging and regression-kriging, were shown to perform well. Using the salinity data from the first two survey stages as auxiliary variables, both methods improved the interpolation of soil salinity in coastal saline land. The prediction accuracy of the three methods was assessed under different sampling densities of the target variable by comparison with a separate group of 80 validation sample points, from which the root-mean-square error (RMSE) and correlation coefficient (r) between the predicted and measured values were calculated. The results showed that, with the help of auxiliary data, whatever the sample size of the target variable, cokriging and regression-kriging performed better than ordinary kriging. Moreover, regression-kriging produced on average more accurate predictions than cokriging. Compared with the ordinary kriging results, cokriging improved the estimations by reducing RMSE by 23.3-29% and increasing r by 16.6-25.5%, while regression-kriging improved the estimations by reducing RMSE by 25-41.5% and increasing r by 16.8-27.2%. Therefore, regression-kriging shows promise for improved prediction of soil salinity while considerably reducing soil sampling intensity and maintaining high prediction accuracy. Moreover, in regression-kriging the regression model can take any form, such as generalized linear models, non-linear models, or tree-based models, which makes it possible to include more ancillary variables.
Keywords: auxiliary data; prediction precision; sampling density; soil salinity; kriging
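Regression-kriging as described above, regressing the target on an auxiliary variable and then kriging the regression residuals, can be sketched as follows. The exponential covariance, its parameters, the tiny nugget, and the simple-kriging residual step are assumptions for illustration, not the paper's fitted model:

```python
import numpy as np

def exp_cov(h, sill=1.0, corr_range=300.0):
    # Assumed exponential covariance model for the residual field.
    return sill * np.exp(-h / corr_range)

def regression_kriging(coords, target, aux, query_coords, query_aux):
    # Step 1: linear regression of the target on the auxiliary variable.
    A = np.column_stack([np.ones_like(aux), aux])
    beta, *_ = np.linalg.lstsq(A, target, rcond=None)
    resid = target - A @ beta
    # Step 2: simple kriging of the regression residuals.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    C = exp_cov(d) + 1e-8 * np.eye(len(coords))  # small nugget for stability
    d0 = np.linalg.norm(coords - query_coords, axis=-1)
    w = np.linalg.solve(C, exp_cov(d0))
    # Step 3: regression trend plus kriged residual.
    trend = beta[0] + beta[1] * query_aux
    return trend + w @ resid

rng = np.random.default_rng(3)
coords = rng.random((40, 2)) * 1000                       # sample locations, metres
aux = rng.random(40)                                      # auxiliary variable
target = 2.0 + 3.0 * aux + rng.normal(scale=0.1, size=40) # synthetic salinity
pred = regression_kriging(coords, target, aux, coords[0], aux[0])
```

At a data location the kriged residual reproduces the observed residual almost exactly, so the prediction honors the data, which is the usual exactness property of kriging interpolators.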
5. Characteristics analysis on high density spatial sampling seismic data (Cited: 11)
Authors: Cai Xiling, Liu Xuewei, Deng Chunyan, Lv Yingme. Applied Geophysics (SCIE, CSCD), 2006, Issue 1, pp. 48-54 (7 pages).
China's continental deposition basins are characterized by complex geological structures and various reservoir lithologies. Therefore, high precision exploration methods are needed. High density spatial sampling is a new technology for increasing the accuracy of seismic exploration. We briefly discuss point source and receiver technology, analyze the high density spatial sampling in-situ method, introduce the symmetric sampling principles presented by Gijs J. O. Vermeer, and discuss high density spatial sampling technology from the point of view of wave field continuity. We emphasize the analysis of high density spatial sampling characteristics, including the advantages of high density first breaks for investigating near-surface structure and improving static correction precision, and the use of dense receiver spacing at short offsets to increase both the effective coverage at shallow depth and the accuracy of reflection imaging. Coherent noise is not aliased, so the precision of noise analysis and suppression increases as a result. High density spatial sampling enhances wave field continuity and the accuracy of various mathematical transforms, which benefits wave field separation. Finally, we point out that the difficult part of high density spatial sampling technology is the data processing. More research needs to be done on methods of analyzing and processing huge amounts of seismic data.
Keywords: high density spatial sampling; symmetric sampling; static correction; noise suppression; wave field separation and data processing
6. Seismic data analysis based on spatial subsets (Cited: 2)
Authors: 蔡希玲, 刘学伟, 李虹, 钱宇明, 吕英梅. Applied Geophysics (SCIE, CSCD), 2009, Issue 4, pp. 384-392, 395 (10 pages).
There are some limitations when conventional methods are applied to analyze the massive amounts of seismic data acquired with high-density spatial sampling, since processors usually obtain the properties of raw data from common shot gathers or other datasets located at certain points or along lines. We propose a novel method in this paper to observe seismic data on time slices from spatial subsets. The composition of a spatial subset and the unique character of orthogonal or oblique subsets are described, and pre-stack subsets are shown by 3D visualization. In seismic data processing, spatial subsets can be used for the following purposes: (1) to check trace distribution uniformity and regularity; (2) to observe the main features of ground-roll and linear noise; (3) to find abnormal traces from slices of datasets; and (4) to QC the results of pre-stack noise attenuation. The field data application shows that seismic data analysis in spatial subsets is an effective method that may lead to better discrimination among various wavefields and help us obtain more information.
Keywords: spatial subset; 3D visualization; high density sampling; noise attenuation; data analysis
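Viewing pre-stack data on time slices from a spatial subset is, at its core, array slicing. Below is a minimal sketch with a hypothetical volume indexed as (time sample, receiver line, receiver station), plus a toy RMS-based abnormal-trace check in the spirit of use (3); the axis layout and the 3-sigma threshold are assumptions:

```python
import numpy as np

# Hypothetical pre-stack subset: 500 time samples, 20 receiver lines, 30 stations.
rng = np.random.default_rng(4)
volume = rng.normal(size=(500, 20, 30))

time_slice = volume[250]              # amplitudes across the spatial subset at one time
orthogonal_subset = volume[:, 10, :]  # all times along one receiver line

# Simple QC: flag "abnormal" traces whose RMS amplitude deviates by more than
# three standard deviations from the subset mean.
rms = np.sqrt((volume ** 2).mean(axis=0))
abnormal = rms > rms.mean() + 3 * rms.std()
```

The same slicing pattern extends to oblique subsets by indexing along a diagonal of the (line, station) grid.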
7. Geochemical Mapping: With Special Emphasis on Analytical Requirements (Cited: 5)
Authors: XIE Xuejing, CHENG Hangxin, LIU Dawen. Acta Geologica Sinica (English Edition) (SCIE, CAS, CSCD), 2008, Issue 2, pp. 451-462 (12 pages).
More than 40 national and regional geochemical mapping projects carried out worldwide from 1973 to 1988 do not conform to common standards. In particular, they have many analytical deficiencies. In the period 1988 to 1992, the International Geochemical Mapping project (Project 259 of UNESCO's IGCP Program) prepared recommendations designed to standardize geochemical mapping methods. The analytical requirements are an essential component of the overall recommendations. They included the following: 71 elements should be analyzed in future mapping projects; the detection limits of trace and ultratrace elements must be lower than the corresponding crustal abundances; and the Chinese GSD and Canadian STSD standard sample series should be used for the correlation of global data. A proposal was also made to collect 5000 composite samples at very low sampling densities to cover the whole of the Earth's land surface. In 1997, an IUGS Working Group on Global Geochemical Baselines was formed to continue the work begun with IGCP 259. From 1997 up to now, new progress has been made, especially in China and the FOREGS countries under the aegis of this working group, including the study of suitable sampling media, development of a multi-element analytical system, a new proficiency test for the selection of competent laboratories, and the role of wide-spaced mapping in mineral exploration. One of the major problems awaiting solution has been the inability of many laboratories to meet the IGCP recommendations for generating high quality geochemical maps. Fortunately, several laboratories in China and Europe have demonstrated an ability to meet the requirements, and they will be well placed to render technical assistance to other countries.
Keywords: geochemical mapping; geoanalysis; analytical requirements; extremely low density sampling
8. SMC-PHD based multi-target track-before-detect with nonstandard point observations model (Cited: 5)
Authors: 占荣辉, 高彦钊, 胡杰民, 张军. Journal of Central South University (SCIE, EI, CAS, CSCD), 2015, Issue 1, pp. 232-240 (9 pages).
Detection and tracking of multiple targets with unknown and varying number is a challenging issue, especially under the condition of low signal-to-noise ratio (SNR). A modified multi-target track-before-detect (TBD) method was proposed to tackle this issue using a nonstandard point observation model. The method was developed from the sequential Monte Carlo (SMC)-based probability hypothesis density (PHD) filter, and it was implemented by modifying the original calculation of the update weights of the particles and by adopting an adaptive particle sampling strategy. To execute the SMC-PHD based TBD method efficiently, a fast implementation approach was also presented by partitioning the particles into multiple subsets according to their position coordinates in the 2D resolution cells of the sensor. Simulation results show the effectiveness of the proposed method for time-varying multi-target tracking using raw observation data.
Keywords: adaptive particle sampling; multi-target track-before-detect; probability hypothesis density (PHD) filter; sequential Monte Carlo (SMC) method
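The SMC machinery underlying the PHD filter maintains a set of weighted particles and periodically resamples them in proportion to their weights. A generic systematic-resampling step (a standard SMC building block, not the paper's modified weight update for the nonstandard observation model) can be sketched as:

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling: return n particle indices drawn in proportion to weights."""
    n = len(weights)
    # One random offset, then evenly spaced positions across [0, 1).
    positions = (rng.random() + np.arange(n)) / n
    cumsum = np.cumsum(weights / weights.sum())
    return np.searchsorted(cumsum, positions)

rng = np.random.default_rng(5)
weights = np.array([0.1, 0.1, 0.7, 0.1])  # toy particle weights
idx = systematic_resample(weights, rng)
```

Systematic resampling keeps the particle count fixed while concentrating particles on high-weight hypotheses, with lower variance than naive multinomial resampling.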
9. Free clustering optimal particle probability hypothesis density (PHD) filter
Authors: 李云湘, 肖怀铁, 宋志勇, 范红旗, 付强. Journal of Central South University (SCIE, EI, CAS), 2014, Issue 7, pp. 2673-2683 (11 pages).
Given that it is difficult to obtain an analytical form of the optimal sampling density, and that the tracking performance of the standard particle probability hypothesis density (P-PHD) filter declines when a clustering algorithm is used to extract target states, a free clustering optimal P-PHD (FCO-P-PHD) filter is proposed. This method yields an analytical form of the optimal sampling density of the P-PHD filter and realizes the optimal P-PHD filter without using clustering algorithms to extract target states. Besides, since the state extraction method in the FCO-P-PHD filter is coupled with the process of obtaining the analytical form of the optimal sampling density, a new single-sensor free clustering state extraction method is derived through a decoupling process. By combining this method with the standard P-PHD filter, the FC-P-PHD filter is obtained, which significantly improves the tracking performance of the P-PHD filter. Finally, the effectiveness of the proposed algorithms and their advantages over other algorithms are validated through several simulation experiments.
Keywords: multiple target tracking; probability hypothesis density filter; optimal sampling density; particle filter; random finite set; clustering algorithm; state extraction
10. Research on Influencing Factors of Spatial Variation of Soil Nutrients
Author: Tingting MENG. Meteorological and Environmental Research (CAS), 2020, Issue 4, pp. 95-97 (3 pages).
Based on previous studies, the research methods and influencing factors of spatial variation of soil nutrients are summarized. It is concluded that the spatial variation of soil nutrients is generally studied by geostatistical methods, and the spatial distribution of nutrients is visualized using the Kriging interpolation method. The influencing factors mainly include topography, sampling method, sampling spacing, sampling density, and sampling scale. The influence of random sampling and grid sampling on interpolation is analyzed based on the specific conditions of the actual study area. The influence of sampling density and topography on the spatial variation of soil nutrients cannot be ignored, especially for available nutrients. When samples are collected over a large area (at a small or medium scale), the spatial variation of soil nutrients is large and shows strong spatial autocorrelation; over a small area (at a large scale), the spatial variability of soil nutrients is small and shows obvious spatial autocorrelation. This study can provide intuitive and convenient reference material for subsequent researchers.
Keywords: soil nutrients; spatial variation; geostatistics; sampling method; sampling scale; sampling density
11. Static Frame Model Validation with Small Samples Solution Using Improved Kernel Density Estimation and Confidence Level Method (Cited: 5)
Authors: ZHANG Baoqiang, CHEN Guoping, GUO Qintao. Chinese Journal of Aeronautics (SCIE, EI, CAS, CSCD), 2012, Issue 6, pp. 879-886 (8 pages).
An improved method using kernel density estimation (KDE) and confidence level is presented for model validation with small samples. Decision making is a challenging problem because of input uncertainty, and only small samples can be used due to the high costs of experimental measurements. However, model validation provides more confidence for decision makers while improving prediction accuracy at the same time. The confidence level method is introduced, and the optimum sample variance is determined using a new method of kernel density estimation to increase the credibility of model validation. As a numerical example, the static frame model validation challenge problem presented by Sandia National Laboratories is chosen. The optimum bandwidth is selected in kernel density estimation in order to build the probability model based on the calibration data. The model assessment is carried out using the validation and accreditation experimental data, respectively, based on the probability model. Finally, the target structure prediction is performed using the validated model, and the results are consistent with those obtained by other researchers. The results demonstrate that the method using the improved confidence level and kernel density estimation is an effective approach to the model validation problem with small samples.
Keywords: model validation; small samples; uncertainty analysis; kernel density estimation; confidence level; prediction
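The bandwidth-selection step at the heart of the method can be illustrated with a hand-rolled Gaussian KDE. Silverman's rule of thumb is used here as a simple stand-in for the paper's improved optimum-bandwidth selection; the data are synthetic:

```python
import numpy as np

def silverman_bandwidth(x):
    # Silverman's rule of thumb for a 1-D Gaussian kernel.
    return 1.06 * x.std(ddof=1) * len(x) ** (-0.2)

def kde(x_eval, samples, h):
    # Gaussian kernel density estimate evaluated at the points x_eval.
    u = (x_eval[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(samples) * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(6)
samples = rng.normal(loc=0.0, scale=1.0, size=200)  # stand-in calibration data
h = silverman_bandwidth(samples)
grid = np.linspace(-4, 4, 81)
density = kde(grid, samples, h)
# Trapezoidal integral of the estimated density over the grid.
area = ((density[:-1] + density[1:]) / 2 * np.diff(grid)).sum()
```

The resulting density should integrate to approximately one over a range covering the samples, which is the basic sanity check before using it as a probability model for validation metrics.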