A method for determination of 226Ra and 228Ra in environmental samples using the α-β coincidence liquid scintillation counting (LSC) has been developed. Radium were preconcentrated from environmental samples by copr...A method for determination of 226Ra and 228Ra in environmental samples using the α-β coincidence liquid scintillation counting (LSC) has been developed. Radium were preconcentrated from environmental samples by coprecipitation with BaSO4, then purified from others radionuclide interferences using the cation column exchange (Bio-Rad AG 50 W-X4 resin with 200-400 mesh size and H+ form) and operating in warm temperature which is between 70-80oC. Then, the Ba(Ra)SO4 precipitate was filtered through the Millipore filter paper, dried and weighed to calculate chemical yield. The activity concentration of radium isotopes in mixture of liquid scintillation cocktails were measured using LSC after being stored for over 21 days to allow the growth of the progeny nuclides. The method has been validated with a certi-fied reference material supplied by the International Atomic Energy Agency and reliable results were obtained. The radiochemical yields for radium were 59% - 90% and recovery was 97% and 80% for 226Ra and 228Ra, respectively. Sixteen seawater and fish flesh samples collected in Kapar coastal water have been analyzed with the developed method. The obtained radium activity concentrations in seawater were in the range of 02.08 ± 0.82 mBq/L to 3.69 ± 1.29 mBq/L for 226Ra and 6.01 ± 3.05 mBq/L to 17.07 ± 6.62 mBq/L for 228Ra. Meanwhile, the activity concentrations of 226Ra and 228Ra in fish flesh were in the range of 11.82 ± 5.23 – 16.53 ± 6.53 Bq/kg dry wt. and 43.52 ± 16.34 – 53.57 ± 19.86 Bq/kg dry wt., respectively.展开更多
The following qualitative conclusions of forest resources in Zigui can be drawn by the research on 73 plots and 5 vegetation plots:forest area is increasing; forest growing stock is increasing; the adjustment of fores...The following qualitative conclusions of forest resources in Zigui can be drawn by the research on 73 plots and 5 vegetation plots:forest area is increasing; forest growing stock is increasing; the adjustment of forest category structure is constantly improved; forest quality has been improving; stand structure is optimized continuously; biodiversity has initially appeared.展开更多
<strong><em>Background: </em></strong>The appropriate time to initiate antiretroviral therapy (ART) in HIV/AIDS patients is determined by measurement of CD4+/CD8+ T cell count. The CD4/CD8+ T c...<strong><em>Background: </em></strong>The appropriate time to initiate antiretroviral therapy (ART) in HIV/AIDS patients is determined by measurement of CD4+/CD8+ T cell count. The CD4/CD8+ T cell count is also useful, together with viral load, in monitoring disease progression and effectiveness treatment regimens. Several factors may contribute to sample rejection during the CD4+/CD8+ T cells count, resulting in negative effects on patient management. <strong> <em>Objective: </em></strong>Evaluate the causes for CD4+CD8+ T cell count sample rejection at the Kenyatta National Hospital Comprehensive Care Center Laboratory. <strong><em>Method:</em></strong> A retrospective cross-sectional study was conducted between 2018 and 2020. Data was obtained from the “rejected samples” for Partec<sup>R</sup> FlowCyp flow cytometry file. Designed data collection sheet was used for data capture. A total of 3972 samples were submitted for CD4+/CD8+ T cell count during the study period. Causes for sample rejection were numbered 1 to 12, each representing a reason for sample rejection. Number 1 was sub-categorized into clotted, hemolyzed, short-draw and lipemic. Data was analyzed using excel, and presented using tables, graphs and pie charts. Approval to conduct the study was obtained from KNH/UoN ERC. <strong> <em>Results: </em></strong>In the study period, 81/3972 (2.0%) samples were rejected. Samples submitted more than 48 hours after collection were mostly rejected. Other factors included improper collection technique, delayed testing, patient identification error and incorrect use of vacutainer. 
A combination of clotted samples, specimen submission more than 48 hours caused the most frequent sample rejection, followed with combination of specimen submission more than 48 hours, delayed testing and delayed specimen processing. Together, clotted samples, incorrect vacutainer and poor specimen label caused the least sample rejection. <strong><em>Conclusion:</em></strong> Sample rejection rate for CD4/CD8+ T cell count was relatively low, and multiple factors contributed to rejection. However, improved quality assurance will enable more benefit to patients who seek this test in the laboratory.展开更多
Background:The local pivotal method(LPM)utilizing auxiliary data in sample selection has recently been proposed as a sampling method for national forest inventories(NFIs).Its performance compared to simple random samp...Background:The local pivotal method(LPM)utilizing auxiliary data in sample selection has recently been proposed as a sampling method for national forest inventories(NFIs).Its performance compared to simple random sampling(SRS)and LPM with geographical coordinates has produced promising results in simulation studies.In this simulation study we compared all these sampling methods to systematic sampling.The LPM samples were selected solely using the coordinates(LPMxy)or,in addition to that,auxiliary remote sensing-based forest variables(RS variables).We utilized field measurement data(NFI-field)and Multi-Source NFI(MS-NFI)maps as target data,and independent MS-NFI maps as auxiliary data.The designs were compared using relative efficiency(RE);a ratio of mean squared errors of the reference sampling design against the studied design.Applying a method in NFI also requires a proven estimator for the variance.Therefore,three different variance estimators were evaluated against the empirical variance of replications:1)an estimator corresponding to SRS;2)a Grafström-Schelin estimator repurposed for LPM;and 3)a Matérn estimator applied in the Finnish NFI for systematic sampling design.Results:The LPMxy was nearly comparable with the systematic design for the most target variables.The REs of the LPM designs utilizing auxiliary data compared to the systematic design varied between 0.74–1.18,according to the studied target variable.The SRS estimator for variance was expectedly the most biased and conservative estimator.Similarly,the Grafström-Schelin estimator gave overestimates in the case of LPMxy.When the RS variables were utilized as auxiliary data,the Grafström-Schelin estimates tended to underestimate the empirical variance.In systematic sampling the Matérn 
and Grafström-Schelin estimators performed for practical purposes equally.Conclusions:LPM optimized for a specific variable tended to be more efficient than systematic sampling,but all of the considered LPM designs were less efficient than the systematic sampling design for some target variables.The Grafström-Schelin estimator could be used as such with LPMxy or instead of the Matérn estimator in systematic sampling.Further studies of the variance estimators are needed if other auxiliary variables are to be used in LPM.展开更多
The material identification is a pressing requirement for the sensitive security applications. Dual-energy X-ray computer tomography (DXCT) has been investigated for material identification in the medical and security...The material identification is a pressing requirement for the sensitive security applications. Dual-energy X-ray computer tomography (DXCT) has been investigated for material identification in the medical and security fields. It requires two tomographic images at sufficiently different energies. To discriminate dangerous materials of light elements such as plastic bombs in luggage, it is needed to measure accurately with several tens of kilo electron volts where such materials exhibit significant spectral differences. However, CT images in that energy region often include artifacts from beam hardening. To reduce these artifacts, a novel reconstruction method has been investigated. It is an extension of the Al-gebraic Reconstruction Technique and Total Variation (ART-TV) method that reduces the artifacts in a lower-energy CT image by referencing it to an image obtained at higher energy. The CT image of a titanium sample was recon-structed using this method in order to demonstrate the artifact reduction capability.展开更多
There are two distinct types of domains,design-and cross-classes domains,with the former extensively studied under the topic of small-area estimation.In natural resource inventory,however,most classes listed in the co...There are two distinct types of domains,design-and cross-classes domains,with the former extensively studied under the topic of small-area estimation.In natural resource inventory,however,most classes listed in the condition tables of national inventory programs are characterized as cross-classes domains,such as vegetation type,productivity class,and age class.To date,challenges remain active for inventorying cross-classes domains because these domains are usually of unknown sampling frame and spatial distribution with the result that inference relies on population-level as opposed to domain-level sampling.Multiple challenges are noteworthy:(1)efficient sampling strategies are difficult to develop because of little priori information about the target domain;(2)domain inference relies on a sample designed for the population,so within-domain sample sizes could be too small to support a precise estimation;and(3)increasing sample size for the population does not ensure an increase to the domain,so actual sample size for a target domain remains highly uncertain,particularly for small domains.In this paper,we introduce a design-based generalized systematic adaptive cluster sampling(GSACS)for inventorying cross-classes domains.Design-unbiased Hansen-Hurwitz and Horvitz-Thompson estimators are derived for domain totals and compared within GSACS and with systematic sampling(SYS).Comprehensive Monte Carlo simulations show that(1)GSACS Hansen-Hurwitz and Horvitz-Thompson estimators are unbiased and equally efficient,whereas thelatter outperforms the former for supporting a sample of size one;(2)SYS is a special case of GSACS while the latter outperforms the former in terms of increased efficiency and reduced intensity;(3)GSACS Horvitz-Thompson variance 
estimator is design-unbiased for a single SYS sample;and(4)rules-ofthumb summarized with respect to sampling design and spatial effect improve precision.Because inventorying a mini domain is analogous to inventorying a rare variable,alternative network sampling procedures are also readily available for inventorying cross-classes domains.展开更多
Background:The double sampling method known as“big BAF sampling”has been advocated as a way to reduce sampling effort while still maintaining a reasonably precise estimate of volume.A well-known method for variance ...Background:The double sampling method known as“big BAF sampling”has been advocated as a way to reduce sampling effort while still maintaining a reasonably precise estimate of volume.A well-known method for variance determination,Bruce’s method,is customarily used because the volume estimator takes the form of a product of random variables.However,the genesis of Bruce’s method is not known to most foresters who use the method in practice.Methods:We establish that the Taylor series approximation known as the Delta method provides a plausible explanation for the origins of Bruce’s method.Simulations were conducted on two different tree populations to ascertain the similarities of the Delta method to the exact variance of a product.Additionally,two alternative estimators for the variance of individual tree volume-basal area ratios,which are part of the estimation process,were compared within the overall variance estimation procedure.Results:The simulation results demonstrate that Bruce’s method provides a robust method for estimating the variance of inventories conducted with the big BAF method.The simulations also demonstrate that the variance of the mean volume-basal area ratios can be computed using either the usual sample variance of the mean or the ratio variance estimators with equal accuracy,which had not been shown previously for Big BAF sampling.Conclusions:A plausible explanation for the origins of Bruce’s method has been set forth both historically and mathematically in the Delta Method.In most settings,there is evidently no practical difference between applying the exact variance of a product or the Delta method—either can be used.A caution is articulated concerning the aggregation of tree-wise attributes into point-wise summaries in order to 
test the correlation between the two as a possible indicator of the need for further covariance augmentation.展开更多
In this study, for accuracy and cost an optimal inventory method was examined and introduced to obtain information about Zagros forests, Iran. For this purpose,three distance sampling methods(compound, order distance ...In this study, for accuracy and cost an optimal inventory method was examined and introduced to obtain information about Zagros forests, Iran. For this purpose,three distance sampling methods(compound, order distance and random-pairs) in 5 inventory networks(100 m × 100 m, 100 m × 150 m, 100 m × 200 m,150 m × 150 m, 200 m × 200 m) were implemented in GIS environment, and the related statistical analyses were carried out. Average tree density and canopy cover in hectare with 100% inventory were compared to each other.All the studied methods were implemented in 30 inventory points, and the implementation time of each was recorded.According to the results, the best inventory methods for estimating density and canopy cover were compound150 m × 150 m and 100 m × 100 m methods, respectively. The minimum amount of product inventory time per second(T), and(E%)2 square percent of inventory error of sampling for the compound 150 m × 150 m method regarding density in hectare was 691.8, and for the compound 100 m × 100 m method regarding canopy of 12,089 ha. It can be concluded that compound method is the best for estimating density and canopy features of the forests area.展开更多
Background: In this paper, a regression model for predicting the spatial distribution of forest cockchafer larvae in the Hessian Ried region (Germany) is presented. The forest cockchafer, a native biotic pest, is a...Background: In this paper, a regression model for predicting the spatial distribution of forest cockchafer larvae in the Hessian Ried region (Germany) is presented. The forest cockchafer, a native biotic pest, is a major cause of damage in forests in this region particularly during the regeneration phase. The model developed in this study is based on a systematic sample inventory of forest cockchafer larvae by excavation across the Hessian Ried. These forest cockchafer larvae data were characterized by excess zeros and overdispersion. Methods: Using specific generalized additive regression models, different discrete distributions, including the Poisson, negative binomial and zero-inflated Poisson distributions, were compared. The methodology employed allowed the simultaneous estimation of non-linear model effects of causal covariates and, to account for spatial autocorrelation, of a 2-dimensional spatial trend function. In the validation of the models, both the Akaike information criterion (AIC) and more detailed graphical procedures based on randomized quantile residuals were used. Results: The negative binomial distribution was superior to the Poisson and the zero-inflated Poisson distributions, providing a near perfect fit to the data, which was proven in an extensive validation process. The causal predictors found to affect the density of larvae significantly were distance to water table and percentage of pure clay layer in the soil to a depth of I m. Model predictions showed that larva density increased with an increase in distance to the water table up to almost 4 m, after which it remained constant, and with a reduction in the percentage of pure clay layer. However this latter correlation was weak and requires further investigation. 
The 2-dimensional trend function indicated a strong spatial effect, and thus explained by far the highest proportion of variation in larva density. Conclusions: As such the model can be used to support forest practitioners in their decision making for regeneration and forest protection planning in the Hessian predicting future spatial patterns of the larva density is still comparatively weak. Ried. However, the application of the model for somewhat limited because the causal effects are展开更多
Background:A new variance estimator is derived and tested for big BAF(Basal Area Factor)sampling which is a forest inventory system that utilizes Bitterlich sampling(point sampling)with two BAF sizes,a small BAF for t...Background:A new variance estimator is derived and tested for big BAF(Basal Area Factor)sampling which is a forest inventory system that utilizes Bitterlich sampling(point sampling)with two BAF sizes,a small BAF for tree counts and a larger BAF on which tree measurements are made usually including DBHs and heights needed for volume estimation.Methods:The new estimator is derived using the Delta method from an existing formulation of the big BAF estimator as consisting of three sample means.The new formula is compared to existing big BAF estimators including a popular estimator based on Bruce’s formula.Results:Several computer simulation studies were conducted comparing the new variance estimator to all known variance estimators for big BAF currently in the forest inventory literature.In simulations the new estimator performed well and comparably to existing variance formulas.Conclusions:A possible advantage of the new estimator is that it does not require the assumption of negligible correlation between basal area counts on the small BAF factor and volume-basal area ratios based on the large BAF factor selection trees,an assumption required by all previous big BAF variance estimation formulas.Although this correlation was negligible on the simulation stands used in this study,it is conceivable that the correlation could be significant in some forest types,such as those in which the DBH-height relationship can be affected substantially by density perhaps through competition.We derived a formula that can be used to estimate the covariance between estimates of mean basal area and the ratio of estimates of mean volume and mean basal area.We also mathematically derived expressions for bias in the big BAF estimator that can be used to show the bias 
approaches zero in large samples on the order of 1n where n is the number of sample points.展开更多
Background: Forest inventories have always been a primary information source concerning the forest ecosystem state. Various applied survey approaches arise from the numerous important factors during sampling scheme pl...Background: Forest inventories have always been a primary information source concerning the forest ecosystem state. Various applied survey approaches arise from the numerous important factors during sampling scheme planning. Paramount aspects include the survey goal and scale, target population inherent variation and patterns,and available resources. The last factor commonly inhibits the goal, and compromises have to be made. Airborne laser scanning(ALS) has been intensively tested as a cost-effective option for forest inventories. Despite existing foundations, research has provided disparate results. Environmental conditions are one of the factors greatly influencing inventory performance. Therefore, a need for site-related sampling optimization is well founded.Moreover, as stands are the basic operational unit of managed forest holdings, few related studies have presented stand-level results. As such, herein, we tested the sampling intensity influence on the performance of the ALSenhanced stand-level inventory.Results: Distributions of possible errors were plotted by comparing ALS model estimates, with reference values derived from field surveys of 3300 sample plots and more than 300 control stands located in 5 forest districts. No improvement in results was observed due to the scanning density. The variance in obtained errors stabilized in the interval of 200–300 sample plots, maintaining the bias within +/-5% and the precision above 80%. The sample plot area affected scores mostly when transitioning from 100 to 200 m2. Only a slight gain was observed when bigger plots were used.Conclusions: ALS-enhanced inventories effectively address the demand for comprehensive and detailed information on the structure of single stands over vast areas. 
Knowledge of the relation between the sampling intensity and accuracy of ALS estimates allows the determination of certain sampling intensity thresholds. This should be useful when matching the required sample size and accuracy with available resources. Site optimization may be necessary, as certain errors may occur due to the sampling scheme, estimator type or forest site, making these factors worth further consideration.展开更多
Big Basal Area Factor (Big BAF) and Point-3P are two-stage sampling methods. In the first stage the sampling units, in both methods, are Bitterlich points where the selection of the trees is proportional to their basa...Big Basal Area Factor (Big BAF) and Point-3P are two-stage sampling methods. In the first stage the sampling units, in both methods, are Bitterlich points where the selection of the trees is proportional to their basal area. In the second stage, sampling units are trees which are a subset of the first stage trees. In the Big BAF method, the probability of selecting trees in the second stage is made proportional to the two BAFs’ ratio, with a basal area factor larger than that of the first stage. In the Point-3P method the probability of selecting trees, in the second stage, is based on the height prediction and use of a specific random number table. Estimates of the forest stands’ volume and their sampling errors are based on the theory of the product of two random variables. The increasing error in the second stage is small, but the total cost of measuring the trees is much smaller than simply using the first stage, with all the trees measured. In general, the two sampling methods are modern and cost-effective approaches that can be applied in forest stand inventories for forest management purposes and are receiving the growing interest of researchers in the current decade.展开更多
In two-phase sampling, or double sampling, from a population with size N we take one, relatively large, sample size n. From this relatively large sample we take a small sub-sample size m, which usually costs more per ...In two-phase sampling, or double sampling, from a population with size N we take one, relatively large, sample size n. From this relatively large sample we take a small sub-sample size m, which usually costs more per sample unit than the first one. In double sampling with regression estimators, the sample of the first phase n is used for the estimation of the average of an auxiliary variable X, which should be strongly related to the main variable Y (which is estimated from the sub-sample m). Sampling optimization can be achieved by minimizing cost C with fixed var Y, or by finding a minimum var Y for fixed C. In this paper we optimize sampling with use of Lagrange multipliers, either by minimizing variance of Y and having predetermined cost, or by minimizing cost and having predetermined variance of Y.展开更多
文摘A method for determination of 226Ra and 228Ra in environmental samples using the α-β coincidence liquid scintillation counting (LSC) has been developed. Radium were preconcentrated from environmental samples by coprecipitation with BaSO4, then purified from others radionuclide interferences using the cation column exchange (Bio-Rad AG 50 W-X4 resin with 200-400 mesh size and H+ form) and operating in warm temperature which is between 70-80oC. Then, the Ba(Ra)SO4 precipitate was filtered through the Millipore filter paper, dried and weighed to calculate chemical yield. The activity concentration of radium isotopes in mixture of liquid scintillation cocktails were measured using LSC after being stored for over 21 days to allow the growth of the progeny nuclides. The method has been validated with a certi-fied reference material supplied by the International Atomic Energy Agency and reliable results were obtained. The radiochemical yields for radium were 59% - 90% and recovery was 97% and 80% for 226Ra and 228Ra, respectively. Sixteen seawater and fish flesh samples collected in Kapar coastal water have been analyzed with the developed method. The obtained radium activity concentrations in seawater were in the range of 02.08 ± 0.82 mBq/L to 3.69 ± 1.29 mBq/L for 226Ra and 6.01 ± 3.05 mBq/L to 17.07 ± 6.62 mBq/L for 228Ra. Meanwhile, the activity concentrations of 226Ra and 228Ra in fish flesh were in the range of 11.82 ± 5.23 – 16.53 ± 6.53 Bq/kg dry wt. and 43.52 ± 16.34 – 53.57 ± 19.86 Bq/kg dry wt., respectively.
文摘The following qualitative conclusions of forest resources in Zigui can be drawn by the research on 73 plots and 5 vegetation plots:forest area is increasing; forest growing stock is increasing; the adjustment of forest category structure is constantly improved; forest quality has been improving; stand structure is optimized continuously; biodiversity has initially appeared.
文摘<strong><em>Background: </em></strong>The appropriate time to initiate antiretroviral therapy (ART) in HIV/AIDS patients is determined by measurement of CD4+/CD8+ T cell count. The CD4/CD8+ T cell count is also useful, together with viral load, in monitoring disease progression and effectiveness treatment regimens. Several factors may contribute to sample rejection during the CD4+/CD8+ T cells count, resulting in negative effects on patient management. <strong> <em>Objective: </em></strong>Evaluate the causes for CD4+CD8+ T cell count sample rejection at the Kenyatta National Hospital Comprehensive Care Center Laboratory. <strong><em>Method:</em></strong> A retrospective cross-sectional study was conducted between 2018 and 2020. Data was obtained from the “rejected samples” for Partec<sup>R</sup> FlowCyp flow cytometry file. Designed data collection sheet was used for data capture. A total of 3972 samples were submitted for CD4+/CD8+ T cell count during the study period. Causes for sample rejection were numbered 1 to 12, each representing a reason for sample rejection. Number 1 was sub-categorized into clotted, hemolyzed, short-draw and lipemic. Data was analyzed using excel, and presented using tables, graphs and pie charts. Approval to conduct the study was obtained from KNH/UoN ERC. <strong> <em>Results: </em></strong>In the study period, 81/3972 (2.0%) samples were rejected. Samples submitted more than 48 hours after collection were mostly rejected. Other factors included improper collection technique, delayed testing, patient identification error and incorrect use of vacutainer. A combination of clotted samples, specimen submission more than 48 hours caused the most frequent sample rejection, followed with combination of specimen submission more than 48 hours, delayed testing and delayed specimen processing. Together, clotted samples, incorrect vacutainer and poor specimen label caused the least sample rejection. 
<strong><em>Conclusion:</em></strong> Sample rejection rate for CD4/CD8+ T cell count was relatively low, and multiple factors contributed to rejection. However, improved quality assurance will enable more benefit to patients who seek this test in the laboratory.
基金the Ministry of Agriculture and Forestry key project“Puuta liikkeelle ja uusia tuotteita metsästä”(“Wood on the move and new products from forest”)Academy of Finland(project numbers 295100 , 306875).
文摘Background:The local pivotal method(LPM)utilizing auxiliary data in sample selection has recently been proposed as a sampling method for national forest inventories(NFIs).Its performance compared to simple random sampling(SRS)and LPM with geographical coordinates has produced promising results in simulation studies.In this simulation study we compared all these sampling methods to systematic sampling.The LPM samples were selected solely using the coordinates(LPMxy)or,in addition to that,auxiliary remote sensing-based forest variables(RS variables).We utilized field measurement data(NFI-field)and Multi-Source NFI(MS-NFI)maps as target data,and independent MS-NFI maps as auxiliary data.The designs were compared using relative efficiency(RE);a ratio of mean squared errors of the reference sampling design against the studied design.Applying a method in NFI also requires a proven estimator for the variance.Therefore,three different variance estimators were evaluated against the empirical variance of replications:1)an estimator corresponding to SRS;2)a Grafström-Schelin estimator repurposed for LPM;and 3)a Matérn estimator applied in the Finnish NFI for systematic sampling design.Results:The LPMxy was nearly comparable with the systematic design for the most target variables.The REs of the LPM designs utilizing auxiliary data compared to the systematic design varied between 0.74–1.18,according to the studied target variable.The SRS estimator for variance was expectedly the most biased and conservative estimator.Similarly,the Grafström-Schelin estimator gave overestimates in the case of LPMxy.When the RS variables were utilized as auxiliary data,the Grafström-Schelin estimates tended to underestimate the empirical variance.In systematic sampling the Matérn and Grafström-Schelin estimators performed for practical purposes equally.Conclusions:LPM optimized for a specific variable tended to be more efficient than systematic sampling,but all of the considered LPM designs 
were less efficient than the systematic sampling design for some target variables.The Grafström-Schelin estimator could be used as such with LPMxy or instead of the Matérn estimator in systematic sampling.Further studies of the variance estimators are needed if other auxiliary variables are to be used in LPM.
文摘The material identification is a pressing requirement for the sensitive security applications. Dual-energy X-ray computer tomography (DXCT) has been investigated for material identification in the medical and security fields. It requires two tomographic images at sufficiently different energies. To discriminate dangerous materials of light elements such as plastic bombs in luggage, it is needed to measure accurately with several tens of kilo electron volts where such materials exhibit significant spectral differences. However, CT images in that energy region often include artifacts from beam hardening. To reduce these artifacts, a novel reconstruction method has been investigated. It is an extension of the Al-gebraic Reconstruction Technique and Total Variation (ART-TV) method that reduces the artifacts in a lower-energy CT image by referencing it to an image obtained at higher energy. The CT image of a titanium sample was recon-structed using this method in order to demonstrate the artifact reduction capability.
基金supported by the Fundamental Research Funds for the Central Universities (Grant No. 2021ZY04)the National Natural Science Foundation of China (Grant No. 32001252)the International Center for Bamboo and Rattan (Grant No. 1632020029)
Abstract: There are two distinct types of domains, design- and cross-classes domains, with the former extensively studied under the topic of small-area estimation. In natural resource inventory, however, most classes listed in the condition tables of national inventory programs are characterized as cross-classes domains, such as vegetation type, productivity class, and age class. To date, challenges remain active for inventorying cross-classes domains because these domains usually have an unknown sampling frame and spatial distribution, with the result that inference relies on population-level as opposed to domain-level sampling. Multiple challenges are noteworthy: (1) efficient sampling strategies are difficult to develop because there is little a priori information about the target domain; (2) domain inference relies on a sample designed for the population, so within-domain sample sizes could be too small to support precise estimation; and (3) increasing the sample size for the population does not ensure an increase for the domain, so the actual sample size for a target domain remains highly uncertain, particularly for small domains. In this paper, we introduce a design-based generalized systematic adaptive cluster sampling (GSACS) for inventorying cross-classes domains. Design-unbiased Hansen-Hurwitz and Horvitz-Thompson estimators are derived for domain totals and compared within GSACS and with systematic sampling (SYS). Comprehensive Monte Carlo simulations show that (1) GSACS Hansen-Hurwitz and Horvitz-Thompson estimators are unbiased and equally efficient, whereas the latter outperforms the former in supporting a sample of size one; (2) SYS is a special case of GSACS, while the latter outperforms the former in terms of increased efficiency and reduced intensity; (3) the GSACS Horvitz-Thompson variance estimator is design-unbiased for a single SYS sample; and (4) rules-of-thumb summarized with respect to sampling design and spatial effect improve precision. Because inventorying a mini domain is analogous to inventorying a rare variable, alternative network sampling procedures are also readily available for inventorying cross-classes domains.
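The Horvitz-Thompson estimator at the heart of this comparison can be sketched generically. The snippet below is an illustrative, minimal implementation of a Horvitz-Thompson domain total assuming known first-order inclusion probabilities; the GSACS-specific network inclusion probabilities derived in the paper are not reproduced here, and all numbers are made up.

```python
# Minimal sketch: Horvitz-Thompson estimator of a domain total,
# tau_hat = sum over sampled domain units of y_i / pi_i,
# where pi_i is the first-order inclusion probability of unit i.

def ht_domain_total(y, pi, in_domain):
    """Horvitz-Thompson estimate of a domain total.

    y         : observed values for the sampled units
    pi        : first-order inclusion probabilities of those units
    in_domain : booleans, True if the unit belongs to the target domain
    """
    return sum(yi / p for yi, p, d in zip(y, pi, in_domain) if d)

# Toy example: 4 sampled units with equal inclusion probability 0.2,
# two of which fall in the domain of interest.
est = ht_domain_total([10.0, 5.0, 8.0, 2.0], [0.2] * 4, [True, False, True, False])
print(est)  # 90.0  (10/0.2 + 8/0.2)
```

Units outside the domain contribute nothing, which is exactly why within-domain sample sizes, not the population sample size, drive the precision discussed above.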
Funding: Research Joint Venture Agreement 17-JV-11242306045, "Old Growth Forest Dynamics and Structure," between the USDA Forest Service and the University of New Hampshire. Additional support to MJD was provided by the USDA National Institute of Food and Agriculture McIntire-Stennis Project Accession Number 1020142, "Forest Structure, Volume, and Biomass in the Northeastern United States." TBL: This work was supported by the USDA National Institute of Food and Agriculture, McIntire-Stennis project OKL02834, and the Division of Agricultural Sciences and Natural Resources at Oklahoma State University.
Abstract: Background: The double sampling method known as "big BAF sampling" has been advocated as a way to reduce sampling effort while still maintaining a reasonably precise estimate of volume. A well-known method for variance determination, Bruce's method, is customarily used because the volume estimator takes the form of a product of random variables. However, the genesis of Bruce's method is not known to most foresters who use the method in practice. Methods: We establish that the Taylor series approximation known as the Delta method provides a plausible explanation for the origins of Bruce's method. Simulations were conducted on two different tree populations to ascertain the similarities of the Delta method to the exact variance of a product. Additionally, two alternative estimators for the variance of individual tree volume-basal area ratios, which are part of the estimation process, were compared within the overall variance estimation procedure. Results: The simulation results demonstrate that Bruce's method provides a robust method for estimating the variance of inventories conducted with the big BAF method. The simulations also demonstrate that the variance of the mean volume-basal area ratios can be computed using either the usual sample variance of the mean or the ratio variance estimators with equal accuracy, which had not been shown previously for big BAF sampling. Conclusions: A plausible explanation for the origins of Bruce's method has been set forth both historically and mathematically in the Delta method. In most settings, there is evidently no practical difference between applying the exact variance of a product or the Delta method; either can be used. A caution is articulated concerning the aggregation of tree-wise attributes into point-wise summaries in order to test the correlation between the two as a possible indicator of the need for further covariance augmentation.
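The Delta-method connection can be illustrated numerically. For a product of two independent random variables, the exact variance and the Delta-method (Taylor series) approximation differ only by a cross term; the sketch below, using illustrative moments rather than the paper's tree populations, shows how small that difference is at typical coefficients of variation.

```python
# Compare the Delta-method approximation
#   Var(XY) ≈ muY^2 * VarX + muX^2 * VarY
# with the exact variance of a product of two INDEPENDENT random variables,
#   Var(XY) = VarX * VarY + muY^2 * VarX + muX^2 * VarY.
# The moments below are made up for illustration.

mu_x, sd_x = 100.0, 10.0   # e.g. a mean count, CV = 10%
mu_y, sd_y = 5.0, 0.5      # e.g. a mean volume-basal area ratio, CV = 10%

delta_var = mu_y**2 * sd_x**2 + mu_x**2 * sd_y**2
exact_var = sd_x**2 * sd_y**2 + delta_var

print(delta_var)  # 5000.0
print(exact_var)  # 5025.0
# The Delta method omits only the cross term VarX*VarY,
# here 25.0, i.e. about 0.5% of the total.
```

At the coefficients of variation common in forest inventory, the omitted cross term is negligible, which is consistent with the abstract's conclusion that either form can be used.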
Abstract: In this study, an inventory method optimal for accuracy and cost was examined and introduced to obtain information about the Zagros forests, Iran. For this purpose, three distance sampling methods (compound, ordered distance, and random pairs) in five inventory networks (100 m × 100 m, 100 m × 150 m, 100 m × 200 m, 150 m × 150 m, 200 m × 200 m) were implemented in a GIS environment, and the related statistical analyses were carried out. Average tree density and canopy cover per hectare were compared against a 100% inventory. All the studied methods were implemented at 30 inventory points, and the implementation time of each was recorded. According to the results, the best inventory methods for estimating density and canopy cover were the compound 150 m × 150 m and compound 100 m × 100 m methods, respectively. The minimum product of inventory time in seconds (T) and squared inventory error percentage ((E%)²) was 691.8 for the compound 150 m × 150 m method regarding density per hectare, and 12,089 for the compound 100 m × 100 m method regarding canopy cover. It can be concluded that the compound method is the best for estimating density and canopy features of the forest area.
Abstract: Background: In this paper, a regression model for predicting the spatial distribution of forest cockchafer larvae in the Hessian Ried region (Germany) is presented. The forest cockchafer, a native biotic pest, is a major cause of damage in forests in this region, particularly during the regeneration phase. The model developed in this study is based on a systematic sample inventory of forest cockchafer larvae by excavation across the Hessian Ried. These forest cockchafer larvae data were characterized by excess zeros and overdispersion. Methods: Using specific generalized additive regression models, different discrete distributions, including the Poisson, negative binomial and zero-inflated Poisson distributions, were compared. The methodology employed allowed the simultaneous estimation of non-linear model effects of causal covariates and, to account for spatial autocorrelation, of a 2-dimensional spatial trend function. In the validation of the models, both the Akaike information criterion (AIC) and more detailed graphical procedures based on randomized quantile residuals were used. Results: The negative binomial distribution was superior to the Poisson and the zero-inflated Poisson distributions, providing a near perfect fit to the data, which was proven in an extensive validation process. The causal predictors found to affect the density of larvae significantly were the distance to the water table and the percentage of pure clay layer in the soil to a depth of 1 m. Model predictions showed that larva density increased with an increase in distance to the water table up to almost 4 m, after which it remained constant, and with a reduction in the percentage of pure clay layer. However, this latter correlation was weak and requires further investigation. The 2-dimensional trend function indicated a strong spatial effect, and thus explained by far the highest proportion of variation in larva density.
Conclusions: As such, the model can be used to support forest practitioners in their decision making for regeneration and forest protection planning in the Hessian Ried. However, the application of the model for predicting future spatial patterns of the larva density is somewhat limited because the causal effects are still comparatively weak.
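The distribution comparison described above can be sketched in miniature: fit an intercept-only Poisson and an intercept-only negative binomial (NB2) to overdispersed counts and compare them by AIC (lower is better). The counts and the method-of-moments dispersion estimate below are illustrative only; the paper fits full generalized additive models, which this sketch does not attempt.

```python
from math import lgamma, log

# Illustrative zero-heavy, overdispersed counts (NOT the Hessian Ried data).
counts = [0, 0, 0, 0, 1, 0, 2, 0, 7, 0, 0, 3, 0, 12, 0, 1, 0, 0, 5, 0]

n = len(counts)
mean = sum(counts) / n
var = sum((c - mean) ** 2 for c in counts) / (n - 1)   # var >> mean here

def poisson_loglik(lam, data):
    # log P(k) = k*log(lam) - lam - log(k!)
    return sum(k * log(lam) - lam - lgamma(k + 1) for k in data)

def nb_loglik(mu, alpha, data):
    # NB2 parameterisation: Var = mu + alpha*mu^2, with r = 1/alpha.
    r = 1.0 / alpha
    return sum(lgamma(k + r) - lgamma(r) - lgamma(k + 1)
               + r * log(r / (r + mu)) + k * log(mu / (r + mu))
               for k in data)

# Method-of-moments dispersion; the sample mean is the intercept-only MLE of mu.
alpha = max((var - mean) / mean ** 2, 1e-8)
aic_pois = 2 * 1 - 2 * poisson_loglik(mean, counts)   # 1 parameter
aic_nb = 2 * 2 - 2 * nb_loglik(mean, alpha, counts)   # 2 parameters
print(aic_nb < aic_pois)  # True: the NB model wins on these overdispersed counts
```

Even with its extra parameter penalised by AIC, the negative binomial wins decisively on overdispersed counts, mirroring the result reported in the abstract.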
Funding: Support was provided by Research Joint Venture Agreement 17-JV-11242306045, "Old Growth Forest Dynamics and Structure," between the USDA Forest Service and the University of New Hampshire. Additional support to MJD was provided by the USDA National Institute of Food and Agriculture McIntire-Stennis Project Accession Number 1020142, "Forest Structure, Volume, and Biomass in the Northeastern United States." Further support was provided by the USDA National Institute of Food and Agriculture, McIntire-Stennis project OKL02834, and the Division of Agricultural Sciences and Natural Resources at Oklahoma State University.
Abstract: Background: A new variance estimator is derived and tested for big BAF (Basal Area Factor) sampling, a forest inventory system that utilizes Bitterlich sampling (point sampling) with two BAF sizes: a small BAF for tree counts and a larger BAF on which tree measurements are made, usually including the DBHs and heights needed for volume estimation. Methods: The new estimator is derived using the Delta method from an existing formulation of the big BAF estimator as consisting of three sample means. The new formula is compared to existing big BAF estimators, including a popular estimator based on Bruce's formula. Results: Several computer simulation studies were conducted comparing the new variance estimator to all known variance estimators for big BAF currently in the forest inventory literature. In simulations the new estimator performed well and comparably to existing variance formulas. Conclusions: A possible advantage of the new estimator is that it does not require the assumption of negligible correlation between basal area counts on the small BAF factor and volume-basal area ratios based on the large BAF factor selection trees, an assumption required by all previous big BAF variance estimation formulas. Although this correlation was negligible on the simulation stands used in this study, it is conceivable that the correlation could be significant in some forest types, such as those in which the DBH-height relationship can be affected substantially by density, perhaps through competition. We derived a formula that can be used to estimate the covariance between estimates of mean basal area and the ratio of estimates of mean volume and mean basal area. We also mathematically derived expressions for bias in the big BAF estimator that can be used to show that the bias approaches zero in large samples, on the order of 1/n where n is the number of sample points.
Funding: This work was part of the research project entitled "Remote sensing-based assessment of woody biomass and carbon storage in forests", which was financially supported by the National Centre for Research and Development (Poland) under the BIOSTRATEG programme (Agreement No. BIOSTRATEG1/267755/4/NCBR/2015). Financial support was also received from the project entitled "Rozbudowa metody inwentaryzacji urządzeniowej stanu lasu z wykorzystaniem efektów projektu REMBIOFOR" (Project No. 500463, Agreement No. EO.271.3.12.2019 with the Polish State Forests National Forest Holding, signed on 14.10.2019), which constitutes a continuation of the former project.
Abstract: Background: Forest inventories have always been a primary information source concerning the state of the forest ecosystem. Various applied survey approaches arise from the numerous important factors considered during sampling scheme planning. Paramount aspects include the survey goal and scale, the target population's inherent variation and patterns, and the available resources. The last factor commonly inhibits the goal, and compromises have to be made. Airborne laser scanning (ALS) has been intensively tested as a cost-effective option for forest inventories. Despite existing foundations, research has provided disparate results. Environmental conditions are one of the factors greatly influencing inventory performance. Therefore, a need for site-related sampling optimization is well founded. Moreover, although stands are the basic operational unit of managed forest holdings, few related studies have presented stand-level results. As such, herein, we tested the influence of sampling intensity on the performance of the ALS-enhanced stand-level inventory. Results: Distributions of possible errors were plotted by comparing ALS model estimates with reference values derived from field surveys of 3300 sample plots and more than 300 control stands located in 5 forest districts. No improvement in results was observed due to the scanning density. The variance in the obtained errors stabilized in the interval of 200–300 sample plots, maintaining the bias within +/-5% and the precision above 80%. The sample plot area affected scores mostly when transitioning from 100 to 200 m². Only a slight gain was observed when bigger plots were used. Conclusions: ALS-enhanced inventories effectively address the demand for comprehensive and detailed information on the structure of single stands over vast areas. Knowledge of the relation between the sampling intensity and the accuracy of ALS estimates allows the determination of certain sampling intensity thresholds.
This should be useful when matching the required sample size and accuracy with available resources. Site optimization may be necessary, as certain errors may occur due to the sampling scheme, estimator type or forest site, making these factors worth further consideration.
Abstract: Big Basal Area Factor (big BAF) and Point-3P are two-stage sampling methods. In the first stage, the sampling units in both methods are Bitterlich points, where the selection of trees is proportional to their basal area. In the second stage, the sampling units are trees forming a subset of the first-stage trees. In the big BAF method, the probability of selecting trees in the second stage is made proportional to the ratio of the two BAFs, with a basal area factor larger than that of the first stage. In the Point-3P method, the probability of selecting trees in the second stage is based on a height prediction and the use of a specific random number table. Estimates of the volume of forest stands and their sampling errors are based on the theory of the product of two random variables. The increase in error contributed by the second stage is small, while the total cost of measuring the trees is much smaller than using the first stage alone with all the trees measured. In general, the two sampling methods are modern and cost-effective approaches that can be applied in forest stand inventories for forest management purposes and have received growing interest from researchers in the current decade.
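The product-of-random-variables error structure mentioned above underlies Bruce's well-known rule for big BAF sampling: the percent standard errors of the two stages combine in quadrature, SE%(volume) ≈ sqrt(SE%(count)² + SE%(ratio)²). A minimal sketch with invented counts and volume-basal area ratios:

```python
from math import sqrt
from statistics import mean, stdev

# Hedged sketch of Bruce's rule for big BAF sampling; all data are invented.
baf_small = 2.0                               # m^2/ha per counted tree (stage 1)
counts = [8, 11, 9, 12, 10, 7, 9, 10]         # "in" trees per Bitterlich point
vbars = [4.9, 5.3, 5.1, 5.4, 5.0, 5.2]        # m^3/m^2 on big BAF trees (stage 2)

def se_percent(xs):
    """Percent standard error of the mean of xs."""
    return 100 * (stdev(xs) / sqrt(len(xs))) / mean(xs)

# Volume estimate is a product: (BAF * mean count) * (mean volume-basal area ratio).
volume = baf_small * mean(counts) * mean(vbars)            # ~97.85 m^3/ha
se_pct = sqrt(se_percent(counts) ** 2 + se_percent(vbars) ** 2)
```

Because the two stages are sampled on different sets of trees, their percent errors are treated as independent here, which is exactly the assumption the quadrature rule rests on.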
Abstract: In two-phase sampling, or double sampling, from a population of size N we take one relatively large sample of size n. From this relatively large sample we take a small sub-sample of size m, which usually costs more per sampling unit than the first one. In double sampling with regression estimators, the first-phase sample of size n is used for the estimation of the mean of an auxiliary variable X, which should be strongly related to the main variable Y (which is estimated from the sub-sample of size m). Sampling optimization can be achieved by minimizing the cost C for a fixed Var(Y), or by finding the minimum Var(Y) for a fixed C. In this paper we optimize the sampling with the use of Lagrange multipliers, either by minimizing the variance of Y with a predetermined cost, or by minimizing the cost with a predetermined variance of Y.
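The fixed-cost case has a classical closed-form solution (the same one a Lagrange-multiplier derivation produces), assuming the usual approximate variance Var(Ȳ) ≈ S²ρ²/n + S²(1−ρ²)/m and a linear cost C = c₁n + c₂m. The sketch below uses that textbook form with illustrative numbers; it is not the paper's own derivation.

```python
from math import sqrt

def optimal_two_phase(C, c1, c2, rho):
    """Return (n, m) minimising the regression-estimator variance for cost C.

    Assumes Var(Y_bar) ≈ S^2*rho^2/n + S^2*(1-rho^2)/m and C = c1*n + c2*m;
    the Lagrange conditions give the optimal ratio
        m/n = sqrt((1 - rho^2) * c1 / (rho^2 * c2)).
    """
    k = sqrt((1 - rho**2) * c1 / (rho**2 * c2))   # optimal m/n ratio
    n = C / (c1 + c2 * k)                         # spend the whole budget
    return n, k * n

# Illustrative budget: cheap phase-1 plots, phase-2 plots 16x as expensive,
# and a correlation of rho^2 = 0.8 between X and Y.
n, m = optimal_two_phase(C=1000.0, c1=1.0, c2=16.0, rho=sqrt(0.8))
print(round(n), round(m))  # 333 42
```

The stronger the X-Y correlation and the cheaper the phase-1 units, the smaller the optimal sub-sample fraction m/n, which matches the intuition behind double sampling.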