The flux-variance similarity relation and the vertical transfer of scalars exhibit dissimilarity over different types of surfaces, resulting in different parameterization approaches of the relative transport efficiency among scalars when estimating turbulent fluxes with the flux-variance method. We investigated these issues using eddy-covariance measurements over an open, homogeneous and flat grassland in the eastern Tibetan Plateau in summer, under intermediate hydrological conditions during the rainy season. In unstable conditions, temperature, water vapor and CO2 followed the flux-variance similarity relation, but not in precisely the same way, owing to the different roles (active or passive) of these scalars. The similarity constants of temperature, water vapor and CO2 were found to be 1.12, 1.19 and 1.17, respectively. Heat was transported more efficiently than water vapor and CO2. Based on the estimated sensible heat flux, five parameterization methods for the relative transport efficiency of heat to water vapor and CO2 were examined to estimate latent heat and CO2 fluxes. Local determination of the flux-variance similarity relation is recommended for estimating latent heat and CO2 fluxes: it better represents the averaged relative transport efficiency and is technically easier to apply than the more complex alternatives.
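The free-convection form of the flux-variance relation can be sketched as follows. The similarity constant C_T = 1.12 is taken from the abstract; the site values (measurement height z, mean temperature, temperature standard deviation) are illustrative assumptions, not data from the paper:

```python
import math

def sensible_heat_flux(sigma_T, z, T_mean, C_T=1.12,
                       rho=1.2, cp=1004.0, k=0.4, g=9.81):
    """Free-convection flux-variance estimate of sensible heat flux H (W m^-2).

    From sigma_T / |T_*| = C_T * (-z/L)^(-1/3) in the free-convection limit:
        H = rho * cp * sqrt(k*g*z / T_mean) * (sigma_T / C_T)^(3/2)
    """
    return rho * cp * math.sqrt(k * g * z / T_mean) * (sigma_T / C_T) ** 1.5

# Illustrative values: sigma_T = 0.5 K at z = 2 m in a 293 K surface layer
H = sensible_heat_flux(sigma_T=0.5, z=2.0, T_mean=293.0)
print(f"H = {H:.1f} W m^-2")
```

With these inputs the estimate lands in the tens of W m^-2, a plausible daytime grassland value; the latent heat and CO2 fluxes are then obtained from H via the parameterized relative transport efficiencies.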
Mixed models support a wide range of applications, including hierarchical modeling and longitudinal studies. Testing a variance component in mixed models has long been a methodological challenge because the null value lies on the boundary of the parameter space. It is well documented in the literature that the traditional first-order methods (the likelihood ratio, Wald and score statistics) provide an excessively conservative approximation to the null distribution. However, the magnitude of this conservativeness has not been thoroughly explored. In this paper, we propose a likelihood-based third-order method for mixed models to test null hypotheses of zero and non-zero variance components. The proposed method dramatically improves the accuracy of the tests. Extensive simulations were carried out to demonstrate the accuracy of the proposed method in comparison with the standard first-order methods. The results show the conservativeness of the first-order methods and the accuracy of the proposed method in approximating p-values and confidence intervals, even when the sample size is small.
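The boundary effect behind the conservativeness can be made concrete with the standard first-order asymptotics (this is the textbook 50:50 chi-square mixture correction, not the paper's third-order method): under H0: variance component = 0, the LRT statistic follows a 50:50 mixture of a point mass at zero and chi-square with 1 df, so referring it to a plain chi-square(1) roughly doubles the p-value.

```python
import math

def chi2_1_sf(x):
    """Survival function of chi-square with 1 df: P(X > x) = erfc(sqrt(x/2))."""
    return math.erfc(math.sqrt(x / 2.0))

def mixture_pvalue(lrt):
    """p-value under the boundary mixture 0.5*chi2_0 + 0.5*chi2_1."""
    return 0.5 * chi2_1_sf(lrt) if lrt > 0 else 1.0

# Naive chi2_1 5% critical value is 3.841; the mixture needs only
# 0.5 * P(chi2_1 > c) = 0.05, i.e. P(chi2_1 > c) = 0.10, giving c = 2.706.
lrt = 3.0
print(chi2_1_sf(lrt), mixture_pvalue(lrt))  # mixture p is half the naive p
```

The halved p-value is exactly the sense in which the naive first-order test is conservative; the third-order method of the paper refines this further for small samples.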
A mixed distribution of empirical variances, composed of two distributions (a basic one and a contaminating one) and referred to as the PERG mixed distribution of empirical variances, is considered. The paper gives a robust inverse-problem solution, namely a new robust method (the PEROBVC method) for estimating the variances of both distributions, together with estimates of the numbers of observations belonging to each distribution and, thereby, an estimate of the degree of contamination.
This paper studies the estimation of variance and covariance components for a GPS baseline network by the MINQUE method. A fundamental rule for selecting the variance-covariance model is presented, and an alternative algorithm is developed that simultaneously estimates the fixed variance components and the scaled variance components of the distance, azimuth and geodetic height difference for a GPS baseline vector.
In order to make scientific pavement maintenance decisions, a grey-theory-based prediction framework is proposed to predict pavement performance. Based on field pavement rutting data, analysis of variance (ANOVA) was first used to study the influence of different factors on pavement rutting. Cluster analysis was then employed to investigate the rutting development trend. Based on the clustering results, grey theory was applied to build a pavement rutting model for each cluster, which effectively reduces the complexity of the predictive model. The results show that axle load and asphalt binder type play important roles in rutting development. The prediction model is capable of capturing the uncertainty in the pavement performance prediction process, meets the requirements of highway pavement maintenance, and therefore has wide application prospects.
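The grey model most commonly used for this kind of per-cluster performance curve is GM(1,1), which can be sketched in a few lines. The rutting data are not reproduced here; the series below is a synthetic, quasi-exponential stand-in:

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Fit a GM(1,1) grey model to series x0 and forecast `steps` values ahead."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                        # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])             # mean sequence of adjacent AGO values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # grey equation dx1/dt + a*x1 = b
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # AGO-level time response
    x0_hat = np.diff(x1_hat, prepend=0.0)              # inverse AGO restores the series
    return x0_hat[len(x0):]

# Synthetic near-exponential series; GM(1,1) assumes quasi-exponential growth
series = [2.0, 2.2, 2.42, 2.662]
print(gm11_forecast(series, steps=1))  # next value, close to 2.93
```

Fitting one such model per cluster, as the abstract describes, keeps each model small instead of forcing one global rutting model over heterogeneous sections.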
Background: The double sampling method known as “big BAF sampling” has been advocated as a way to reduce sampling effort while still maintaining a reasonably precise estimate of volume. A well-known method for variance determination, Bruce’s method, is customarily used because the volume estimator takes the form of a product of random variables. However, the genesis of Bruce’s method is not known to most foresters who use it in practice. Methods: We establish that the Taylor series approximation known as the Delta method provides a plausible explanation for the origins of Bruce’s method. Simulations were conducted on two different tree populations to ascertain the similarity of the Delta method to the exact variance of a product. Additionally, two alternative estimators for the variance of individual tree volume-basal area ratios, which are part of the estimation process, were compared within the overall variance estimation procedure. Results: The simulation results demonstrate that Bruce’s method provides a robust way to estimate the variance of inventories conducted with the big BAF method. The simulations also demonstrate that the variance of the mean volume-basal area ratios can be computed using either the usual sample variance of the mean or the ratio variance estimator with equal accuracy, which had not been shown previously for big BAF sampling. Conclusions: A plausible explanation for the origins of Bruce’s method has been set forth, both historically and mathematically, in the Delta method. In most settings there is evidently no practical difference between applying the exact variance of a product and the Delta method; either can be used. A caution is articulated concerning the aggregation of tree-wise attributes into point-wise summaries in order to test the correlation between the two as a possible indicator of the need for further covariance augmentation.
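The closeness of the Delta method to the exact product variance can be checked with a few lines. For independent sample means, Var(X̄Ȳ) = μ_Y²Var(X̄) + μ_X²Var(Ȳ) + Var(X̄)Var(Ȳ) exactly, and the Delta method drops only the last, O(1/n²) term. The moments below are illustrative numbers, not values from the paper's tree populations:

```python
# Illustrative moments for a product-of-means estimator (e.g. mean
# volume/basal-area ratio times mean basal area in big BAF sampling)
mu_x, var_xbar = 10.0, 0.04   # mean of X, variance of its sample mean
mu_y, var_ybar = 5.0, 0.01    # mean of Y, variance of its sample mean

delta = mu_y**2 * var_xbar + mu_x**2 * var_ybar   # Delta-method (Bruce-style) variance
exact = delta + var_xbar * var_ybar               # exact product variance, X̄ ⊥ Ȳ

print(delta, exact)   # 2.0 vs 2.0004: relative gap of 2e-4
```

The gap shrinks like 1/n² while both terms retained by the Delta method shrink like 1/n, which is why the two approaches are practically indistinguishable in most settings, as the Conclusions state.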
In the present work, response surface methodology software was used with five measurement levels and three factors. These were applied to optimize the operating parameters that affect the gas separation performance of polyurethane–zeolite 3A and polyurethane–ZSM-5 mixed matrix membranes. The experiments were based on a rotatable central composite design (CCD). The three independent variables studied were zeolite content (0–24 wt%), operating temperature (25–45 ℃) and operating pressure (0.2–1.0 MPa). The effects of these three variables on membrane selectivity and permeability were studied by analysis of variance (ANOVA). Optimal conditions for enhancing the gas separation performance of polyurethane–zeolite 3A were found to be 18 wt%, 30 ℃ and 0.8 MPa. Under these conditions, the permeabilities of carbon dioxide, methane, oxygen and nitrogen were measured as 138.4, 22.9, 15.7 and 6.4 Barrer respectively, while the CO2/CH4, CO2/N2 and O2/N2 selectivities were 5.8, 22.5 and 2.5, respectively. The optimal conditions for improving the gas separation performance of polyurethane–ZSM-5 were found to be 15.64 wt%, 30 ℃ and 0.4 MPa. The permeabilities of the same four gases were 164.7, 21.2, 21.5 and 8.1 Barrer, while the CO2/CH4, CO2/N2 and O2/N2 selectivities were 7.8, 20.6 and 2.7, respectively.
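The ANOVA screening step can be sketched with a hand-rolled one-way F statistic. The membrane measurements are not reproduced here; the three groups below are synthetic permeability readings at three hypothetical zeolite loadings:

```python
import numpy as np

def one_way_anova_F(groups):
    """One-way ANOVA F statistic: between-group mean square over within-group."""
    grand = np.mean(np.concatenate(groups))
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(np.sum((np.asarray(g) - np.mean(g)) ** 2) for g in groups)
    df_between = len(groups) - 1
    df_within = sum(len(g) for g in groups) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Synthetic permeabilities (Barrer) at three zeolite loadings
groups = [[10.0, 11.0, 12.0], [20.0, 21.0, 22.0], [30.0, 31.0, 32.0]]
print(one_way_anova_F(groups))   # 300.0: the factor clearly matters
```

A large F relative to the F(df_between, df_within) reference distribution flags the factor as significant, which is how the CCD runs are screened before fitting the response surface.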
This paper advances a new simplified formula for estimating variance components, sums up the basic law for calculating the weights of observed values, presents a circulation method using the increments of weights when estimating the variance components of traverse nets, advances the characteristic-roots method for estimating the variance components of traverse nets, and presents a practical method for simultaneously reducing two real symmetric matrices to diagonal form.
Background: A new variance estimator is derived and tested for big BAF (Basal Area Factor) sampling, a forest inventory system that uses Bitterlich sampling (point sampling) with two BAF sizes: a small BAF for tree counts and a larger BAF on whose selection trees measurements are made, usually including the DBHs and heights needed for volume estimation. Methods: The new estimator is derived with the Delta method from an existing formulation of the big BAF estimator as consisting of three sample means. The new formula is compared to existing big BAF estimators, including a popular estimator based on Bruce’s formula. Results: Several computer simulation studies were conducted comparing the new variance estimator to all variance estimators for big BAF currently known in the forest inventory literature. In simulations the new estimator performed well and comparably to existing variance formulas. Conclusions: A possible advantage of the new estimator is that it does not require the assumption of negligible correlation between basal area counts on the small BAF factor and volume-basal area ratios based on the large BAF selection trees, an assumption required by all previous big BAF variance estimation formulas. Although this correlation was negligible on the simulation stands used in this study, it is conceivable that it could be significant in some forest types, such as those in which the DBH-height relationship can be affected substantially by density, perhaps through competition. We derived a formula that can be used to estimate the covariance between estimates of mean basal area and the ratio of estimates of mean volume and mean basal area. We also mathematically derived expressions for bias in the big BAF estimator that can be used to show the bias approaches zero in large samples on the order of 1/n, where n is the number of sample points.
A model for both stochastic jumps and volatility of equity returns in option pricing is the stochastic volatility process with jumps (SVPJ). A major advantage of this model lies in capturing mean reversion and volatility clustering between returns and volatility during upward movements in asset prices. In this article, we propose to solve the SVPJ model numerically through a discretized variational iteration method (DVIM) to obtain sample paths for the state variable and variance process at various timesteps and replications, in order to estimate the expected jump times at the various iterates produced as n increases in the DVIM. These jumps help in estimating the degree of randomness in the financial market. The average computed expected jump times for the state variable and variance process are moderated by the parameters κ (mean-reversion rate of the variance process), Θ (long-run mean of the variance process), σ (volatility of the variance process) and λ (constant intensity of the Poisson process) at each iterate. For instance, when κ = 0.0, Θ = 0.0, σ = 0.0 and λ = 1.0, the state variable clustered maximally compared to the variance process, which showed less volatility clustering, with an average computed expected jump time of 52.40607869 as n increases in the DVIM scheme. Similarly, when κ = 3.99, Θ = 0.014, σ = 0.27 and λ = 0.11, the stochastic jumps for the state variable are less clustered compared to the variance process, which shows maximum volatility clustering, as n increases in the DVIM scheme. In terms of option pricing, the value 52.40607869 suggests a better bargain than the value 20.40344029 because it corresponds to a lower volatility rate. MAPLE 18 software was used for all computations in this research.
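The dynamics being discretized can be illustrated with a plain Euler scheme for a square-root variance process plus compound-Poisson jumps in the log-return (a generic Bates-style sketch with full truncation, not the paper's DVIM; the jump-size scale and other defaults are illustrative assumptions):

```python
import numpy as np

def simulate_jump_paths(kappa=3.99, theta=0.014, sigma=0.27, lam=0.11,
                        s0=100.0, v0=0.04, mu=0.0, jump_scale=0.1,
                        T=1.0, n_steps=252, n_paths=2000, seed=0):
    """Euler-simulate log-price with Poisson jumps and a CIR variance process.

    Returns the mean number of jumps per path, which should approach lam*T.
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    logS = np.full(n_paths, np.log(s0))
    v = np.full(n_paths, v0)
    jumps = np.zeros(n_paths)
    for _ in range(n_steps):
        vp = np.maximum(v, 0.0)                 # full truncation keeps sqrt(v) real
        dW1 = rng.normal(0.0, np.sqrt(dt), n_paths)
        dW2 = rng.normal(0.0, np.sqrt(dt), n_paths)
        dN = rng.poisson(lam * dt, n_paths)     # jump arrivals this step
        J = rng.normal(0.0, jump_scale, n_paths) * dN   # jump term (lam*dt small)
        logS += (mu - 0.5 * vp) * dt + np.sqrt(vp) * dW1 + J
        v += kappa * (theta - vp) * dt + sigma * np.sqrt(vp) * dW2
        jumps += dN
    return jumps.mean()

mean_jumps = simulate_jump_paths(lam=1.0)
print(mean_jumps)   # close to lam*T = 1.0
```

Counting arrivals across replications, as above, is the same bookkeeping the abstract uses to compare expected jump behavior across the two parameter settings.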
Background: The local pivotal method (LPM), which utilizes auxiliary data in sample selection, has recently been proposed as a sampling method for national forest inventories (NFIs). Its performance compared to simple random sampling (SRS) and to LPM with geographical coordinates has produced promising results in simulation studies. In this simulation study we compared all these sampling methods to systematic sampling. The LPM samples were selected solely using the coordinates (LPMxy) or, in addition to the coordinates, auxiliary remote-sensing-based forest variables (RS variables). We utilized field measurement data (NFI-field) and Multi-Source NFI (MS-NFI) maps as target data, and independent MS-NFI maps as auxiliary data. The designs were compared using relative efficiency (RE): the ratio of the mean squared error of the reference sampling design to that of the studied design. Applying a method in an NFI also requires a proven estimator for the variance. Therefore, three different variance estimators were evaluated against the empirical variance over replications: 1) an estimator corresponding to SRS; 2) a Grafström-Schelin estimator repurposed for LPM; and 3) a Matérn estimator applied in the Finnish NFI for the systematic sampling design. Results: LPMxy was nearly comparable with the systematic design for most target variables. The REs of the LPM designs utilizing auxiliary data, compared to the systematic design, varied between 0.74 and 1.18, depending on the target variable. The SRS variance estimator was, as expected, the most biased and conservative. Similarly, the Grafström-Schelin estimator gave overestimates in the case of LPMxy. When the RS variables were utilized as auxiliary data, the Grafström-Schelin estimates tended to underestimate the empirical variance. In systematic sampling the Matérn and Grafström-Schelin estimators performed equally for practical purposes. Conclusions: LPM optimized for a specific variable tended to be more efficient than systematic sampling, but all of the considered LPM designs were less efficient than the systematic sampling design for some target variables. The Grafström-Schelin estimator could be used as such with LPMxy, or instead of the Matérn estimator in systematic sampling. Further studies of the variance estimators are needed if other auxiliary variables are to be used in LPM.
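Relative efficiency as used here is simply a ratio of mean squared errors over replications. A self-contained sketch with made-up replicated estimates (RE > 1 means the studied design beats the reference):

```python
def relative_efficiency(ref_estimates, design_estimates, true_value):
    """RE = MSE(reference design) / MSE(studied design), over replications."""
    mse = lambda ests: sum((e - true_value) ** 2 for e in ests) / len(ests)
    return mse(ref_estimates) / mse(design_estimates)

# Toy replications of an estimator whose true target is 0:
ref = [1.0, -1.0, 1.0, -1.0]      # reference design, MSE = 1.0
alt = [0.5, -0.5, 0.5, -0.5]      # studied design, MSE = 0.25
print(relative_efficiency(ref, alt, 0.0))   # 4.0: studied design 4x as efficient
```

In the study the replications come from repeated sample selection over the NFI target data, and the reference design in the RE ratio is the systematic sampling design.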
Value-at-Risk (VaR) estimation via Monte Carlo (MC) simulation is studied here. A variance reduction technique is proposed in order to speed up the MC algorithm. An algorithm for estimating the probability of high portfolio losses (a more general risk measure), based on Cross-Entropy importance sampling, is developed. This algorithm can easily be applied in any light- or heavy-tailed case without extra adaptation, and it does not lose performance in comparison to other known methods. A numerical study in both cases is performed, and the variance reduction rate is compared with that of other known methods. The problem of VaR estimation using procedures for estimating the probability of high portfolio losses is also discussed.
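The variance-reduction idea can be illustrated with the simplest case: estimating a Gaussian tail probability by shifting the sampling distribution into the tail (a fixed mean-shift tilt; the cross-entropy method would choose this shift automatically, which is not shown here):

```python
import math
import random

def tail_prob_crude(x, n, seed=1):
    """Crude Monte Carlo estimate of P(N(0,1) > x)."""
    rng = random.Random(seed)
    return sum(1 for _ in range(n) if rng.gauss(0.0, 1.0) > x) / n

def tail_prob_is(x, n, seed=1):
    """Importance sampling: draw Y ~ N(x,1), weight by f(y)/g(y) = exp(x^2/2 - x*y)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(x, 1.0)
        if y > x:
            total += math.exp(0.5 * x * x - x * y)
    return total / n

exact = 0.5 * math.erfc(3.0 / math.sqrt(2.0))   # P(N(0,1) > 3), about 1.35e-3
print(tail_prob_crude(3.0, 10_000), tail_prob_is(3.0, 10_000), exact)
```

With 10,000 draws the crude estimator sees only a handful of tail hits, while every importance sample contributes, cutting the estimator variance by orders of magnitude; the same mechanism applies to rare high-loss events of a portfolio.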
The fractional-order Boussinesq equations (FBSQe) are investigated in this work to see whether they can effectively improve the situation in which the shallow water equation cannot directly handle dispersive waves. The fuzzy forms of analytical FBSQe solutions are first derived using the Adomian decomposition method. The phenomenon also occurs at the sea floor, not only at the surface. A set of dynamical partial differential equations (PDEs) in this article exemplifies unconfined aquifer flow. This methodology can accurately simulate climatological intrinsic waves, so the ripples spread across a large geographic zone. The Aboodh transform, merged with the mechanism of Adomian decomposition, is implemented to obtain the fuzzified FBSQe in R, R^n and (2n)th order involving generalized Hukuhara differentiability. According to the system parameter, we classify the qualitative features of the Aboodh transform in the fuzzified Caputo and Atangana-Baleanu-Caputo fractional derivative formulations, which are addressed in detail. The illustrations provide a comparison between the two fractional operators under gH-differentiability, as well as the appropriate attributes for the fractional order and the unpredictability factor σ ∈ [0,1]. A statistical experiment is conducted on the findings of both fractional derivatives to prevent changing the hypothesis after the results are known. Based on the suggested analyses, hydrodynamic technicians, such as irrigation or aquifer quality experts, may be able to obtain an appropriate storage intensity amount, including an unpredictability threshold.
Funding: the Chinese National Key Programme for Developing Basic Sciences, the National Natural Science Foundation of China, the Key Program of the Chinese Academy of Sciences, and the Foundation for Excellent Young Scholars of CAREERI.
Funding: The Major Scientific and Technological Special Project of Jiangsu Provincial Communications Department (No. 2011Y/02-G1).
Funding: Research Joint Venture Agreement 17-JV-11242306045, “Old Growth Forest Dynamics and Structure,” between the USDA Forest Service and the University of New Hampshire. Additional support to MJD was provided by the USDA National Institute of Food and Agriculture McIntire-Stennis Project, Accession Number 1020142, “Forest Structure, Volume, and Biomass in the Northeastern United States.” TBL: This work was supported by the USDA National Institute of Food and Agriculture, McIntire-Stennis project OKL02834, and the Division of Agricultural Sciences and Natural Resources at Oklahoma State University.
Funding: the Ministry of Agriculture and Forestry key project "Puuta liikkeelle ja uusia tuotteita metsästä" ("Wood on the move and new products from forest") and the Academy of Finland (project numbers 295100, 306875).
Abstract: Background: The local pivotal method (LPM), utilizing auxiliary data in sample selection, has recently been proposed as a sampling method for national forest inventories (NFIs). Its performance compared to simple random sampling (SRS) and to LPM with geographical coordinates has produced promising results in simulation studies. In this simulation study we compared all these sampling methods to systematic sampling. The LPM samples were selected solely using the coordinates (LPMxy) or, in addition to that, auxiliary remote sensing-based forest variables (RS variables). We utilized field measurement data (NFI-field) and Multi-Source NFI (MS-NFI) maps as target data, and independent MS-NFI maps as auxiliary data. The designs were compared using relative efficiency (RE): the ratio of the mean squared error of the reference sampling design to that of the studied design. Applying a method in an NFI also requires a proven estimator for the variance. Therefore, three different variance estimators were evaluated against the empirical variance of replications: 1) an estimator corresponding to SRS; 2) a Grafström-Schelin estimator repurposed for LPM; and 3) a Matérn estimator applied for the systematic sampling design in the Finnish NFI. Results: The LPMxy was nearly comparable with the systematic design for most target variables. The REs of the LPM designs utilizing auxiliary data, compared to the systematic design, varied between 0.74 and 1.18 depending on the target variable. The SRS estimator for variance was, as expected, the most biased and conservative estimator. Similarly, the Grafström-Schelin estimator gave overestimates in the case of LPMxy. When the RS variables were utilized as auxiliary data, the Grafström-Schelin estimates tended to underestimate the empirical variance. In systematic sampling the Matérn and Grafström-Schelin estimators performed, for practical purposes, equally. Conclusions: LPM optimized for a specific variable tended to be more efficient than systematic sampling, but all of the considered LPM designs were less efficient than the systematic sampling design for some target variables. The Grafström-Schelin estimator could be used as such with LPMxy, or instead of the Matérn estimator in systematic sampling. Further studies of the variance estimators are needed if other auxiliary variables are to be used in LPM.
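The relative efficiency used to compare the designs above is a simple ratio of mean squared errors across simulation replications. A minimal sketch, with hypothetical replicate estimates (the function name is not from the article):

```python
def relative_efficiency(ref_estimates, design_estimates, true_value):
    """RE = MSE(reference design) / MSE(studied design), computed over
    replicated simulation estimates of the same population parameter.
    RE > 1 means the studied design is more efficient than the reference.
    """
    def mse(estimates):
        return sum((x - true_value) ** 2 for x in estimates) / len(estimates)
    return mse(ref_estimates) / mse(design_estimates)

# Hypothetical replications: reference design vs. a studied design,
# both estimating a true value of 10.0.
re = relative_efficiency([9, 11, 10, 12], [9.5, 10.5, 10, 11], 10.0)
# re > 1 here, so the studied design is more efficient in this toy case.
```

In the study the systematic design served as the reference, so REs in the 0.74–1.18 range mean the LPM designs were sometimes better and sometimes worse than systematic sampling.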
Abstract: Value-at-Risk (VaR) estimation via Monte Carlo (MC) simulation is studied here. A variance reduction technique is proposed in order to speed up the MC algorithm. An algorithm for estimating the probability of high portfolio losses (a more general risk measure), based on Cross-Entropy importance sampling, is developed. This algorithm can easily be applied in any light- or heavy-tailed case without extra adaptation, and it does not lose performance in comparison to other known methods. A numerical study of both cases is performed and the variance reduction rate is compared with that of other known methods. The problem of VaR estimation using procedures for estimating the probability of high portfolio losses is also discussed.
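The core idea behind Cross-Entropy importance sampling for rare loss events is to sample from a tilted proposal distribution and reweight by the likelihood ratio. The sketch below shows only the reweighting step for a fixed Gaussian mean shift; the Cross-Entropy method itself additionally chooses that shift adaptively, and the function name and parameters here are illustrative assumptions:

```python
import math
import random

def tail_prob_importance_sampling(threshold, shift, n=100_000, seed=1):
    """Estimate P(Z > threshold) for standard normal Z by sampling from
    N(shift, 1) and reweighting with the likelihood ratio
    phi(x) / phi(x - shift) = exp(-shift*x + shift^2 / 2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)  # draw from the tilted proposal
        if x > threshold:
            total += math.exp(-shift * x + 0.5 * shift * shift)
    return total / n

# Shifting the proposal mean onto the threshold makes the rare event
# common under the proposal; the weights correct the bias.
p_hat = tail_prob_importance_sampling(threshold=3.0, shift=3.0)
```

Plain MC would need millions of draws to resolve P(Z > 3) ≈ 1.35e-3 with comparable accuracy; the tilted sampler hits the tail on roughly half its draws.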
Abstract: The fractional-order Boussinesq equations (FBSQe) are investigated in this work to see whether they can effectively improve the situation in which the shallow water equation cannot directly handle the dispersion wave. The fuzzy forms of analytical FBSQe solutions are first derived using the Adomian decomposition method. It also occurs on the sea floor as opposed to at the functionality. A set of dynamical partial differential equations (PDEs) in this article exemplifies an unconfined aquifer flow. This methodology can accurately simulate climatological intrinsic waves, so the ripples are spread across a large demographic zone. The Aboodh transform, merged with the mechanism of Adomian decomposition, is implemented to obtain the fuzzified FBSQe in R, R^(n) and (2n)th order involving generalized Hukuhara differentiability. According to the system parameter, we classify the qualitative features of the Aboodh transform in the fuzzified Caputo and Atangana-Baleanu-Caputo fractional derivative formulations, which are addressed in detail. The illustrations depict a comparative analysis between the two fractional operators under gH-differentiability, as well as the appropriate attributes for the fractional order and unpredictability factors σ ∈ [0,1]. A statistical experiment is conducted between the findings of both fractional derivatives to prevent changing the hypothesis after the results are known. Based on the suggested analyses, hydrodynamic technicians, such as irrigation or aquifer quality experts, may be able to obtain an appropriate storage intensity amount, including an unpredictability threshold.
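For reference, the Caputo fractional derivative of order σ ∈ (0,1) on which the fuzzified Caputo formulation builds is conventionally defined as follows (standard definition, not taken from the article; the fuzzy and Atangana-Baleanu-Caputo variants replace the power-law kernel or lift this to gH-differentiable fuzzy-valued functions):

```latex
{}^{C}\!D^{\sigma}_{t} f(t)
  = \frac{1}{\Gamma(1-\sigma)} \int_{0}^{t} (t-\tau)^{-\sigma} f'(\tau)\,\mathrm{d}\tau,
  \qquad 0 < \sigma < 1 .
```

As σ → 1 this operator recovers the ordinary first derivative, which is why the fractional model nests the classical Boussinesq equations as a limiting case.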