The deformation and failure of coal and rock are energy-driven processes according to thermodynamics. It is therefore important to study the strain energy characteristics of coal-rock composite samples to better understand the deformation and failure mechanism of coal-rock composite structures. In this research, laboratory tests and numerical simulations of uniaxial compression of coal-rock composite samples were carried out at five different loading rates. The test results show that the strength, deformation, acoustic emission (AE) and energy evolution of the coal-rock composite samples all exhibit obvious loading-rate effects. The uniaxial compressive strength and elastic modulus increase with increasing loading rate. With increasing loading rate, the AE energy at the peak strength of the coal-rock composites first increases, then decreases, and then increases again, while the AE cumulative count first decreases and then increases. The total absorbed energy and dissipated energy of the coal-rock composite samples show nonlinear increasing trends, while the released elastic strain energy first increases and then decreases. The laboratory experiments on coal-rock composite samples were also simulated numerically using the particle flow code (PFC). With careful selection of suitable material constitutive models for coal and rock, and accurate estimation and calibration of the mechanical parameters of the coal-rock composite samples, good agreement between the laboratory and numerical results was obtained. This research can provide references for understanding the failure of underground coal-rock composite structures by using energy-related measuring methods.
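The energy quantities discussed above follow from standard relations: the total absorbed energy density is the integral of stress over strain, and the releasable elastic part at a given stress is σ²/(2E). A minimal sketch of that partition, using illustrative numbers rather than the paper's test data:

```python
# Sketch: partitioning absorbed strain energy from a uniaxial stress-strain
# record into elastic and dissipated parts, using U = integral(sigma dEps)
# for the total and U_e = sigma^2 / (2E) for the recoverable elastic part.
# The loading values below are synthetic illustration data.

def strain_energy_partition(strain, stress, E):
    """Return (total, elastic, dissipated) strain energy densities.

    strain : axial strains (dimensionless), monotonically increasing
    stress : axial stresses (MPa)
    E      : unloading elastic modulus (MPa)
    """
    # Total absorbed energy density: trapezoidal integration of sigma dEps
    total = 0.0
    for i in range(1, len(strain)):
        total += 0.5 * (stress[i] + stress[i - 1]) * (strain[i] - strain[i - 1])
    # Releasable elastic strain energy at the current stress level
    elastic = stress[-1] ** 2 / (2.0 * E)
    dissipated = total - elastic
    return total, elastic, dissipated

strain = [0.0, 0.002, 0.004, 0.006]
stress = [0.0, 10.0, 18.0, 24.0]      # MPa
total, elastic, dissipated = strain_energy_partition(strain, stress, 12000.0)
```

The dissipated share grows as damage accumulates, which is what links the AE observations to the energy balance.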
Terpenes, aldehydes, ketones, benzene, and toluene are the important volatile organic compounds (VOCs) emitted from wood composites. A VOC sampling apparatus for wood composites was designed and manufactured by Northeast Forestry University in China. The concentration of VOCs derived from wood-based materials, such as flooring, panel walls, finishing, and furniture, can be sampled in small stainless steel chambers. A protocol was also developed in this study to sample and measure new and representative specimens. Preliminary research showed that the equipment has good stability: the types and amounts of different components can be detected with it, and the apparatus is practicable.
A human face can be rebuilt into a three-dimensional (3D) digital profile with an optical 3D sensing system named Composite Fourier-Transform Profilometry (CFTP), in which a composite structured light is used. To study the sampling effect during the digitization process in practical CFTP, the comb (pectinate) function and the convolution theorem were introduced to analyze the potential phase errors caused by sampling the composite pattern along two orthogonal directions. Selection criteria for the sampling frequencies are derived. The results indicate that, to avoid spectral aliasing, the sampling frequency along the phase variation direction must be at least four times the baseband frequency, and along the orthogonal direction it must be at least three times the larger of the two carrier frequencies. A practical experiment reconstructing a model face verified these theories.
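The two anti-aliasing criteria stated above can be wrapped in a small checker. The function name and the example frequencies below are illustrative, not from the paper:

```python
# Sketch of the CFTP sampling criteria: the sampling frequency along the
# phase-variation direction must be at least 4x the baseband, and along the
# orthogonal direction at least 3x the larger of the two carrier frequencies.

def cftp_sampling_ok(fs_phase, fs_orth, f_baseband, f_carrier1, f_carrier2):
    """Check the anti-aliasing criteria derived for composite FTP."""
    return (fs_phase >= 4 * f_baseband and
            fs_orth >= 3 * max(f_carrier1, f_carrier2))

# A sensor sampling at exactly 4x / 3x the respective limits just passes:
ok = cftp_sampling_ok(fs_phase=400, fs_orth=360, f_baseband=100,
                      f_carrier1=90, f_carrier2=120)
# Dropping the phase-direction rate below 4x the baseband fails the check:
bad = cftp_sampling_ok(fs_phase=380, fs_orth=360, f_baseband=100,
                       f_carrier1=90, f_carrier2=120)
```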
The laboratories in the bauxite processing industry are always under a heavy workload of sample collection, analysis, and compilation of results. After size reduction in the grinding mills, samples of bauxite are collected at intervals of 3 to 4 hours. A large bauxite processing plant producing 1 million tons of pure aluminium can have three grinding mills, so the total number of samples to be tested in one day reaches 18 to 24. The bauxite ore sample coming from the grinding mill is tested for its particle size and composition. For composition testing, the sample is first prepared by fusing it with X-ray flux and is then sent for X-ray fluorescence analysis. Afterwards, the crucibles are washed in ultrasonic baths for the next test. The whole procedure takes about 2 - 3 hours. With a large number of samples reaching the laboratory, the chances of error in composition analysis increase. In this study, we used a composite sampling methodology to reduce the number of samples reaching the laboratory without compromising their validity. The average compositions of fifteen samples were measured against composite samples, and the mean difference was calculated. The standard deviation and paired t-test values were evaluated against predetermined critical values obtained using a two-tailed test. The paired t-test values were much lower than the critical values, validating the compositions obtained through composite sampling. The composite sampling approach reduced not only the number of samples but also the chemicals used in the laboratory. The objective of an improved analytical protocol that reduces the number of samples reaching the laboratory was thus achieved without compromising the quality of the analytical results.
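The validation step described above is a standard paired t-test. A minimal stdlib sketch, with made-up Al2O3 percentages standing in for the study's assay results:

```python
# Paired t-test comparing averaged individual-sample compositions against
# their matching composite-sample assays. Values are illustrative only.
import math

def paired_t(individual_means, composite_values):
    d = [a - b for a, b in zip(individual_means, composite_values)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

individual = [52.1, 51.8, 52.4, 51.9, 52.0]   # averaged individual assays, %
composite  = [52.0, 51.9, 52.3, 52.0, 51.9]   # matching composite assays, %
t_stat = paired_t(individual, composite)
# |t| far below the two-tailed critical value (2.776 for df = 4, alpha = 0.05)
# supports treating composite assays as equivalent to individual assays.
```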
Monte Carlo Simulations (MCS), commonly used for reliability analysis, require a large number of data points to obtain acceptable accuracy, even if Subset Simulation with Importance Sampling (SS/IS) methods are used. The Second Order Reliability Method (SORM) has proved to be an excellent rapid tool in the stochastic analysis of laminated composite structures compared with the slower MCS techniques. However, SORM requires differentiating the performance function with respect to each of the random variables involved in the simulation. The most suitable approach is to use a symbolic solver, which renders the simulations very slow, although still faster than MCS. Moreover, the inability to obtain the derivative of the performance function with respect to some parameters, such as ply thickness, limits the capabilities of classical SORM. In this work, a Neural Network-Based Second Order Reliability Method (NNBSORM) is developed to replace the finite element algorithm in the stochastic analysis of laminated composite plates in free vibration. Because expressions for the first and second derivatives of the NN outputs with respect to any of its inputs, such as material properties, ply thicknesses, and orientation angles, can be obtained directly, a symbolic solver is no longer needed to calculate the derivatives of the performance function. The proposed approach is accordingly much faster and easily allows for the consideration of ply-thickness uncertainty. The present analysis showed that treating ply thicknesses as random variables results in a 37% increase in the laminate's probability of failure.
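The key enabler claimed above is that a trained network's output has closed-form derivatives with respect to its inputs. A toy sketch with one tanh hidden layer and hand-picked (not trained) weights, checking the analytic derivative against a finite difference:

```python
# Why an NN surrogate removes the need for a symbolic solver: the chain rule
# gives exact input derivatives. One input, two tanh hidden units, one output;
# weights are arbitrary illustration values, not a trained model.
import math

W1 = [0.5, -0.3]   # input -> hidden weights
b1 = [0.1, 0.2]
W2 = [0.7, 0.4]    # hidden -> output weights
b2 = 0.05

def net(x):
    h = [math.tanh(W1[j] * x + b1[j]) for j in range(2)]
    return sum(W2[j] * h[j] for j in range(2)) + b2

def dnet_dx(x):
    # Analytic derivative: d tanh(u)/du = 1 - tanh(u)^2, chained through W1
    return sum(W2[j] * (1 - math.tanh(W1[j] * x + b1[j]) ** 2) * W1[j]
               for j in range(2))

x0 = 0.3
analytic = dnet_dx(x0)
numeric = (net(x0 + 1e-6) - net(x0 - 1e-6)) / 2e-6
# The two agree to numerical precision, so the performance-function gradient
# needed by SORM comes directly from the surrogate; second derivatives
# follow the same way.
```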
A new method for optimizing the stacking of composite glass fiber laminates is developed. The fiber orientations and layer angles of cylindrical shells are sought with respect to the buckling load. The proposed algorithm combines finite element analysis with the mode-pursuing sampling (MPS) method and suggests the optimal stacking sequence for achieving the maximal buckling load. The procedure is implemented by integrating ANSYS and MATLAB. Stacking sequence designs for symmetric angle-ply three-layered and five-layered composite cylindrical shells are presented to illustrate the optimization process. Compared with genetic algorithms, the proposed method is much faster and more efficient for composite stacking sequence planning.
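The workflow above couples an FE solver with adaptive sampling over discrete ply angles. As a stand-in for the ANSYS/MATLAB loop, this sketch random-searches symmetric three-layer half-stacks against a toy surrogate objective; the surrogate, angle set, and loop are all illustrative assumptions, not the paper's method:

```python
# Shape of a sampling-based stacking-sequence search: draw candidate ply-angle
# stacks, score each with a (here: made-up) buckling-load surrogate, keep the
# best. A real implementation would call an FE buckling analysis instead.
import random

ANGLES = [0, 15, 30, 45, 60, 75, 90]   # candidate ply orientations, degrees

def toy_buckling_load(stack):
    # Hypothetical surrogate rewarding +/-45-dominated stacks; NOT a real model
    return sum(100 - abs(a - 45) for a in stack)

rng = random.Random(3)
best_stack, best_load = None, float("-inf")
for _ in range(200):
    half = [rng.choice(ANGLES) for _ in range(3)]   # symmetric half-stack
    load = toy_buckling_load(half)
    if load > best_load:
        best_stack, best_load = half, load
```

MPS itself biases later draws toward high-scoring regions rather than sampling uniformly, which is what makes it faster than a genetic algorithm on this kind of discrete design space.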
In order to investigate the effects of different geometrical parameters and pretightening loads on failure mode and bearing strength, a large number of single-bolted T300/QY8911 composite laminates were tested under static tension load. Box plots were used to screen out singular bearing-strength values and obtain effective statistical values, and the independent-samples t-test was used to study how strongly pretightening loads influence bearing strength. The results show that geometrical parameters, such as the ratios of width to hole diameter (w/d) and edge distance to hole diameter (e/d), remarkably influence failure mode and bearing strength. Net-section failure occurs when w/d is smaller than 4, and shear-out failure occurs when e/d is smaller than 2. Bearing failure, or combined bearing and shear-out failure, occurs when w/d is greater than 4 and e/d is greater than 2. There is an optimal combination of geometrical parameters that achieves the highest bearing strength. For most specimens, pretightening loads do not explicitly influence bearing strength.
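The box-plot screening step amounts to flagging values outside [Q1 - 1.5·IQR, Q3 + 1.5·IQR]. A small sketch with illustrative strengths, not the T300/QY8911 data:

```python
# Box-plot (IQR) screening of bearing-strength values: anything outside
# [Q1 - 1.5*IQR, Q3 + 1.5*IQR] is treated as a singular value and removed
# before computing summary statistics.

def iqr_filter(values):
    s = sorted(values)
    n = len(s)
    def quartile(q):
        # Linear interpolation between closest ranks
        pos = q * (n - 1)
        lo = int(pos)
        frac = pos - lo
        return s[lo] + frac * (s[min(lo + 1, n - 1)] - s[lo])
    q1, q3 = quartile(0.25), quartile(0.75)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    kept = [v for v in values if lo <= v <= hi]
    outliers = [v for v in values if v < lo or v > hi]
    return kept, outliers

strengths = [612, 598, 605, 620, 471, 608, 615]   # MPa, one suspicious value
kept, outliers = iqr_filter(strengths)
```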
A series of triaxial compression tests was carried out on composite-reinforced soil samples to simulate the interaction between soil and pile. The samples were made with gravel or lime-soil columns of different lengths at the center. The experiments indicate that the strength of the composite samples cannot be obtained by simply superposing the strengths of the reinforcing pile and the soil according to their replacement proportions. They also reveal the law governing the stress ratio of the reinforcing column to the soil: the ratio increases and quickly reaches a peak while the load and strain are small, and then decreases. This law is in accordance with measurements made on construction sites.
In this work, empirical models describing sampling error (Δ) are reported based upon analytical findings elicited from three common probability density functions (PDFs): the Gaussian, representing any real-valued, randomly changing variable x with mean μ and standard deviation σ; the Poisson, representing counting data, i.e., any integer-valued count x (cells, clumps of cells or colony-forming units, molecules, mutations, etc.) per tested volume, area, or length of time, with population mean μ; and the binomial, representing the number of successful occurrences of something (x+) out of n observations or sub-samplings. These data were generated so as to simulate what should be observed in practice while avoiding other forms of experimental error. Based upon analyses of 10^4 Δ measurements, we show that the average Δ is proportional to σx·μ^-1 (Gaussian) or its Poisson and binomial counterparts, and the average proportionality constants associated with these disparate populations were nearly identical. In a similar vein, we empirically demonstrated that binomial-associated Δ were also proportional to σx·μ^-1. Furthermore, we established that, when all Δ were plotted against σx·μ^-1, there was only one relationship, with slope A (0.767 ± 0.0990) and a near-zero intercept. This latter finding also argues that all Δ, regardless of parent PDF, are proportional to σx·μ^-1, which is the coefficient of variation for a population of sample means. Lastly, we establish that the proportionality constant A is equivalent to the coefficient of variation associated with the Δ measurement. These results are noteworthy inasmuch as they provide a straightforward empirical link between stochastic sampling error and the aforementioned CVs. Finally, we demonstrate that all attendant empirical measures of Δ are reasonably small when an environmental microbiome was well sampled: n = 16 - 18 observations with μ ∼ 3 isolates per observation. These colony-counting results were supported by the fact that the two major isolates' relative abundance was reproducible in the four most probable composition observations from one common population.
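The central claim, that mean sampling error scales with the coefficient of variation of the sample mean, is easy to check numerically. For a Gaussian mean, theory gives E|x̄ − μ| = σ·√(2/(πn)), so the ratio of the mean absolute relative error to σ/(μ√n) should be √(2/π) ≈ 0.798, close to the slope A = 0.767 reported above. A stdlib-only simulation sketch:

```python
# Numerical check: mean absolute relative error of a Gaussian sample mean,
# compared against the CV of the sampling distribution of the mean.
import random

def mean_abs_relative_error(mu, sigma, n, trials=20000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xbar = sum(rng.gauss(mu, sigma) for _ in range(n)) / n
        total += abs(xbar - mu) / mu
    return total / trials

err = mean_abs_relative_error(mu=100.0, sigma=20.0, n=16)
cv_of_mean = 20.0 / (100.0 * 16 ** 0.5)   # sigma / (mu * sqrt(n)) = 0.05
ratio = err / cv_of_mean                  # expected near sqrt(2/pi) = 0.798
```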
The world needs around 150 Pg of negative carbon emissions to mitigate climate change. Global soils may provide a stable, sizeable reservoir to help achieve this goal by sequestering atmospheric carbon dioxide as soil organic carbon (SOC). In turn, SOC can support healthy soils and provide a multitude of ecosystem benefits. To support SOC sequestration, researchers and policy makers must be able to precisely measure the amount of SOC in a given plot of land. SOC measurement is typically accomplished by taking soil cores selected at random from the plot under study, mixing (compositing) some of them together, and analyzing (assaying) the composited samples in a laboratory. Compositing reduces assay costs, which can be substantial. Taking samples is also costly. Given uncertainties and costs in both sampling and assay along with a desired estimation precision, there is an optimal composite size that will minimize the budget required to achieve that precision. Conversely, given a fixed budget, there is a composite size that minimizes uncertainty. In this paper, we describe and formalize sampling and assay for SOC and derive the optima for three commonly used assay methods: dry combustion in an elemental analyzer, loss-on-ignition, and mid-infrared spectroscopy. We demonstrate the utility of this approach using data from a soil survey conducted in California. We give recommendations for practice and provide software to implement our framework.
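The composite-size trade-off can be sketched under a standard cost model (an assumption for illustration, not the paper's exact formulation): k composites of m cores each give estimator variance (σ_s²/m + σ_a²)/k at cost k·(m·c_s + c_a). For a target variance V, minimizing cost over m gives the closed form m* = √(σ_s²·c_a / (σ_a²·c_s)), which a grid search confirms:

```python
# Optimal composite size under a simple two-stage variance/cost model.
# sigma_s2 = between-core (sampling) variance, sigma_a2 = assay variance,
# c_s = cost per core taken, c_a = cost per laboratory assay. All numbers
# are illustrative, not from the California survey.
import math

def cost_for_m(m, sigma_s2, sigma_a2, c_s, c_a, V):
    k = (sigma_s2 / m + sigma_a2) / V       # composites needed to reach V
    return k * (m * c_s + c_a)              # total sampling + assay cost

sigma_s2, sigma_a2 = 4.0, 0.25
c_s, c_a = 5.0, 40.0
V = 0.1

best_m = min(range(1, 101),
             key=lambda m: cost_for_m(m, sigma_s2, sigma_a2, c_s, c_a, V))
closed_form = math.sqrt(sigma_s2 * c_a / (sigma_a2 * c_s))   # ~11.31
```

The grid optimum (m = 11) sits at the integer nearest the closed form, illustrating why expensive assays relative to cheap cores favor larger composites.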
Forest assessments are essential to understand the tree population structure and diversity status of forests and to provide information for biodiversity recovery planning. Unfortunately, the majority of Miombo woodlands in Mozambique lack inventory data, which are consequently often insufficient for management. This study aimed to assess the species richness, diversity, and structure of Miombo woodlands using a range of sampling sizes in Mocuba district, Mozambique. A plant inventory was carried out in 128 systematically selected sample plots over 71.6 ha, divided into eight treatments, i.e., T1: 0.1 ha; T2: 0.25 ha; T3: 0.375 ha; T4: 0.5 ha; T5: 0.625 ha; T6: 0.75 ha; T7: 0.875 ha; T8: 1.0 ha, with 16 repetitions each. All stems ≥ 10 cm diameter at breast height were recorded, together with species names, to evaluate the floristic composition, tree species richness, diversity, and diameter distribution. A total of 36,535 individuals were recorded, belonging to 124 species and 83 genera and representing 31 botanical families. The most important species was Brachystegia spiciformis Benth., and the richest botanical family was Fabaceae. The forest showed an average of 517 ± 85 trees/ha and high species diversity and evenness. Analysis of covariance shows that the intercepts and slopes of the exponential function for diameter distribution are not significantly different across the eight treatments. Compared with the entire composite forest, inventory means were accurately estimated and size-class distributions were well represented for plots < 0.25 ha, supporting the selection of an efficient sampling design suited to the forest characteristics and the inventory's purpose.
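The diversity and evenness summaries used in inventories like this one are typically the Shannon index H' = −Σ p_i ln p_i and Pielou evenness J = H'/ln S. A minimal sketch with made-up stem counts, not the Mocuba data:

```python
# Shannon diversity and Pielou evenness from per-species stem counts.
import math

def shannon_evenness(counts):
    total = sum(counts)
    p = [c / total for c in counts if c > 0]
    H = -sum(pi * math.log(pi) for pi in p)       # Shannon index H'
    J = H / math.log(len(p)) if len(p) > 1 else 0.0   # Pielou evenness
    return H, J

counts = [120, 80, 40, 30, 20, 10]   # stems per species in one plot
H, J = shannon_evenness(counts)
# J near 1 indicates stems are spread evenly across species.
```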
The analysis of stable carbon and oxygen isotopes in different carbonate rocks by the phosphoric acid method is no easier than by the laser sampling method developed in recent years, which focuses laser beams with sufficient energy onto a micro-area of a thin section in a vacuum sample box through a microscope. The CO2 produced by heating decomposition of the carbonate was purified in the vacuum system, and the stable carbon and oxygen isotope values were calculated and analyzed on a mass spectrometer. This paper adopted the laser micro-sampling technique to analyze the stable isotopes of carbon and oxygen in dolomite, carbonate cement, stromatolite, and different forms of dawsonite (donbassite). The results indicate that the laser micro-sampling method is effective in analyzing carbonate composition and can provide convincing evidence for carbonate composition analysis.
Length composition analysis can provide insights into the dynamics of a fish population. Accurate quantification of the size structure of a population is critical to understand the status of a fishery and how the population responds to environmental stressors. A scientific observer program is a reliable way to provide such accurate information. However, 100% observer coverage is usually impossible for most fisheries because of logistic and financial constraints. Thus, there is a need to evaluate observer program performance, identify suitable sample sizes, and optimize the allocation of observation effort. The objective of this study is to evaluate the effects of sample size on the quality of length composition data and to identify an optimal coverage rate and observation ratio to improve observation efficiency, using an onboard observer data set from China's tuna longline fishery in the western and central Pacific Ocean. We found that the required sample size varies with fish species, the indices used to describe length composition, the acceptable accuracy of the estimates, and the allocation methods of sampling effort. Ignoring other information requirements, 1000 individuals would be sufficient for most species to reliably quantify length compositions, and a smaller sample size could generate reliable estimates of mean length. A coverage rate of 20% would be sufficient for most species, but a lower coverage rate (5% or 10%) could also be effective to meet the accuracy and precision requirements in estimating length compositions. A nonrandom effort allocation among fishing baskets within a set could cause the length composition to be overestimated or underestimated for some species. The differences in effective sample sizes among species should be taken into consideration for a rational allocation of observation effort among species when there are different species management priorities.
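Sample-size evaluations of this kind are often done by bootstrapping progressively larger subsamples and watching the precision of the estimate stabilize. A sketch with a synthetic length distribution, not the observer data:

```python
# Bootstrap evaluation of how sample size affects the precision of a
# mean-length estimate: resample n lengths with replacement many times and
# compute the CV of the resulting means. Lengths are synthetic.
import random

def bootstrap_mean_cv(lengths, n, reps=500, seed=7):
    rng = random.Random(seed)
    means = [sum(rng.choices(lengths, k=n)) / n for _ in range(reps)]
    m = sum(means) / reps
    sd = (sum((x - m) ** 2 for x in means) / (reps - 1)) ** 0.5
    return sd / m   # CV of the estimated mean length

rng = random.Random(0)
lengths = [rng.gauss(150.0, 25.0) for _ in range(5000)]   # fork lengths, cm

cv_small = bootstrap_mean_cv(lengths, n=50)
cv_large = bootstrap_mean_cv(lengths, n=1000)
# cv_large << cv_small: once the CV falls below the acceptable threshold,
# additional sampling effort adds little precision.
```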
Funding: Projects 51774196, 51804181 and 51874190 supported by the National Natural Science Foundation of China; Project 2019GSF111020 supported by the Key R&D Program of Shandong Province, China; Project 201908370205 supported by the China Scholarship Council.
Funding: This project is supported by grants from the Overseas Returned Scholar Research Startup Fund of the China Ministry of Education, the Heilongjiang Postdoctoral Research Startup Fund, and an NEFU Creative Item.
Funding: Innovation Team Development Program of the Ministry of Education of China (No. IRT0763); National Natural Science Foundation of China (No. 50205028).
Funding: Project 51175424 supported by the National Natural Science Foundation of China; Project B07050 supported by the '111' Program of China; Project JC20110257 supported by the Basic Research Foundation of Northwestern Polytechnical University, China.
文摘In order to investigate the effects of different geometrical parameters and pretightening loads on failure mode and bearing strength,a large number of single-bolted T300/QY8911 composite laminates were tested under static tension load.Box-plot was used to extract the singular testing values of bearing strength and effective statistical values were obtained.T-test method of independent samples was used to study how much pretightening loads influence bearing strength.The results show that the geometrical parameters,such as ratios of width to hole diameter(w/d) and edge distance to hole diameter(e/d),remarkably influence failure mode and bearing strength.Net-section failure will occur when w/d is smaller than 4,and shear-out failure will occur when e/d is smaller than 2.Bearing failure or bearing and shear-out combined failure will occur when w/d is greater than 4 and e/d is greater than 2.There is an optimal combination of geometrical parameters to achieve the highest bearing strength.For most of specimens,pretightening loads do not explicitly influence bearing strength.
文摘A series of triaxial compression tests were arried out by means of composite-reinforced soil samples to simulate the interaction between soil and pile. The samples are made of gravel or lime-soil with different length at the center. The experiment indicates that the strength of the composite samples can not be obtained by superimposure of reinforcing pile and soil simply according to their replacement proportion. It also indicates the law for stress ratio of reinforcing column to soil. The stress ratio of reinforcing column to soil increases and reaches peak rapidly while load and strain is small. Then the ratio decreases. This law is in accordance with the measuring resuits in construction site.
文摘In this work empirical models describing sampling error (Δ) are reported based upon analytical findings elicited from 3 common probability density functions (PDF): the Gaussian, representing any real-valued, randomly changing variable x of mean μ?and standard deviation σthe Poisson, representing counting data: i.e., any integral-valued entity’s count of x (cells, clumps of cells or colony forming units, molecules, mutations, etc.) per tested volume, area, length of time, etc. with population mean of μ?and;binomial data representing the number of successful occurrences of something (x+) out of n observations or sub-samplings. These data were generated in such a way as to simulate what should be observed in practice but avoid other forms of experimental error. Based upon analyses of 104 Δ?measurements, we show that the average Δ?() is proportional to ?(σx•μ-1;Gaussian) or ?(Poisson & binomial). The average proportionality constants associated with these disparate populations were also nearly identical (;±s). However, since ?for any Poisson process, . In a similar vein, we have empirically demonstrated that binomial-associated ?were also proportional to σx•μ-1. Furthermore, we established that, when all ?were plotted against either ?or σx•μ-1, there was only one relationship with a slope = A (0.767 ± 0.0990) and a near-zero intercept. This latter finding also argues that all , regardless of parent PDF, are proportional to σx•μ-1?which is the coefficient of variation for a population of sample means (). Lastly, we establish that the proportionality constant A is equivalent to the coefficient of variation associated with Δ?() measurement and, therefore, . These results are noteworthy inasmuch as they provide a straightforward empirical link between stochastic sampling error and the aforementioned Cvs. 
Finally, we demonstrate that all attendant empirical measures of Δ are reasonably small when an environmental microbiome was well-sampled: n = 16 - 18 observations with μ ∼ 3 isolates per observation. These colony-counting results were supported by the fact that the two major isolates' relative abundance was reproducible in the four most probable composition observations from one common population.
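The proportionality reported above (average relative sampling error ≈ A·σ_x·μ⁻¹, with A ≈ 0.767) can be illustrated with a small simulation of the Gaussian case. This sketch is illustrative only and is not the authors' code; for a normally distributed sample mean, theory gives E|x̄ − μ|/μ = √(2/π)·σ/(√n·μ) ≈ 0.798·CV, close to the empirical slope reported.

```python
import math
import random

def mean_abs_relative_error(mu, sigma, n, trials=20_000, seed=1):
    """Average relative sampling error E|x_bar - mu| / mu for Gaussian samples of size n."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xbar = sum(rng.gauss(mu, sigma) for _ in range(n)) / n
        total += abs(xbar - mu) / mu
    return total / trials

# Coefficient of variation of a population of sample means: (sigma / sqrt(n)) / mu
mu, sigma, n = 50.0, 10.0, 16
cv = (sigma / math.sqrt(n)) / mu
delta_bar = mean_abs_relative_error(mu, sigma, n)
ratio = delta_bar / cv
print(ratio)  # close to sqrt(2/pi) ≈ 0.798
```

The simulated slope lands near √(2/π), consistent with the reported A within its stated uncertainty.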
Abstract: The world needs around 150 Pg of negative carbon emissions to mitigate climate change. Global soils may provide a stable, sizeable reservoir to help achieve this goal by sequestering atmospheric carbon dioxide as soil organic carbon (SOC). In turn, SOC can support healthy soils and provide a multitude of ecosystem benefits. To support SOC sequestration, researchers and policy makers must be able to precisely measure the amount of SOC in a given plot of land. SOC measurement is typically accomplished by taking soil cores selected at random from the plot under study, mixing (compositing) some of them together, and analyzing (assaying) the composited samples in a laboratory. Compositing reduces assay costs, which can be substantial. Taking samples is also costly. Given uncertainties and costs in both sampling and assay, along with a desired estimation precision, there is an optimal composite size that minimizes the budget required to achieve that precision. Conversely, given a fixed budget, there is a composite size that minimizes uncertainty. In this paper, we describe and formalize sampling and assay for SOC and derive the optima for three commonly used assay methods: dry combustion in an elemental analyzer, loss-on-ignition, and mid-infrared spectroscopy. We demonstrate the utility of this approach using data from a soil survey conducted in California. We give recommendations for practice and provide software to implement our framework.
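The optimum described above can be made concrete under a simple, assumed variance model: if n cores are composited into k groups of size m = n/k, the estimator variance is σ_p²/n + σ_a²/k (field heterogeneity plus assay error) and the cost is n·c_core + k·c_assay. Minimizing cost at a fixed variance then yields a closed-form composite size. This is a hedged sketch of the general idea, not the paper's exact derivation; `sigma_p`, `sigma_a`, `cost_core`, and `cost_assay` are hypothetical inputs.

```python
import math

def optimal_composite_size(sigma_p, sigma_a, cost_core, cost_assay):
    """Composite size m minimizing budget at fixed variance, under the
    simple model Var = sigma_p**2/n + sigma_a**2/k with n = m*k."""
    return (sigma_p / sigma_a) * math.sqrt(cost_assay / cost_core)

def budget_for_variance(target_var, m, sigma_p, sigma_a, cost_core, cost_assay):
    """Minimum budget achieving target_var using composites of size m."""
    k = (sigma_p**2 / m + sigma_a**2) / target_var   # number of assays needed
    return k * (m * cost_core + cost_assay)          # = n*cost_core + k*cost_assay

# Example: field variation dominates assay error, and an assay costs 9x a core
m_star = optimal_composite_size(2.0, 0.5, 1.0, 9.0)
print(m_star)  # 12.0 cores per composite
```

As a sanity check, evaluating `budget_for_variance` at composite sizes on either side of `m_star` gives a higher budget, confirming the interior optimum.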
Abstract: Forest assessments are essential to understand the tree population structure and diversity status of forests and to provide information for biodiversity recovery planning. Unfortunately, the majority of Miombo woodlands in Mozambique lack inventory data, which are consequently often insufficient for management. This study aimed to assess the species richness, diversity and structure of Miombo woodlands using a range of sampling sizes in Mocuba district, Mozambique. A plant inventory was carried out in 128 systematically selected sample plots over 71.6 ha, divided into eight treatments, i.e., T1: 0.1 ha; T2: 0.25 ha; T3: 0.375 ha; T4: 0.5 ha; T5: 0.625 ha; T6: 0.75 ha; T7: 0.875 ha; T8: 1.0 ha, with 16 repetitions each. All stems ≥ 10 cm in diameter at breast height were recorded, along with species names, to evaluate floristic composition, tree species richness, diversity and diameter distribution. A total of 36,535 individuals were recorded, belonging to 124 species and 83 genera and representing 31 botanical families. The most important species was Brachystegia spiciformis Benth., and the richest botanical family was Fabaceae. The forest showed an average of 517 ± 85 trees/ha, with high species diversity and evenness. Analysis of covariance shows that the intercepts and slopes of the exponential function for diameter distribution are not significantly different among the eight treatments. Compared with the entire composite forest, inventory means were accurately estimated and size-class distributions were well represented for plots < 0.25 ha, supporting the selection of an efficient sampling design suited to forest characteristics and the inventory's purpose.
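Diversity and evenness figures of the kind reported above are commonly Shannon-type indices computed from per-species stem counts. As a hedged illustration (the study does not state which index it used), a minimal sketch with hypothetical counts:

```python
import math

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln(p_i)) over species proportions p_i."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def pielou_evenness(counts):
    """Pielou's evenness J = H' / ln(S), where S is the number of species present."""
    s = sum(1 for c in counts if c > 0)
    return shannon_diversity(counts) / math.log(s) if s > 1 else 0.0

counts = [120, 80, 40, 10]   # stems per species in a hypothetical plot
h = shannon_diversity(counts)
j = pielou_evenness(counts)
print(round(h, 3), round(j, 3))
```

J approaches 1 when stems are spread evenly across species, matching the qualitative "high diversity and evenness" description.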
Abstract: The analysis of stable isotopes of carbon and oxygen in different carbonate rocks by the phosphoric acid method is no easier than by the laser sampling method developed in recent years, in which laser beams of sufficient energy are optically focused, via a microscope, on a micro-area of a thin section held in a vacuum sample box. The CO₂ produced by heating decomposition of the carbonate was purified by the vacuum system, and the stable isotopic values of carbon and oxygen were calculated and analyzed on a mass spectrometer. This paper adopted the laser micro-sampling technique to analyze the stable isotopes of carbon and oxygen in dolomite, carbonate cement, stromatolite and different forms of dawsonite (donbassite). Results indicated that the laser micro-sampling method is effective in analyzing carbonate composition and can provide convincing evidence for carbonate composition analysis.
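Isotopic values of the kind calculated here are conventionally expressed in δ-notation, per mil relative to a reference standard (e.g., VPDB for δ¹³C). A minimal sketch of that conversion, with an illustrative sample ratio (not data from this study):

```python
def delta_permil(r_sample, r_standard):
    """Delta value (per mil) of an isotope ratio relative to a reference standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A sample whose 13C/12C ratio is 0.1% above the standard reads as about +1.0 per mil
r_std = 0.0112372                       # VPDB 13C/12C reference ratio
d13c = delta_permil(r_std * 1.001, r_std)
print(d13c)
```

The mass spectrometer measures the sample/standard ratio directly; the δ value is just this normalized difference scaled to per mil.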
Funding: The work was supported by the scientific observer program of the distant-water fishery of the Agriculture Ministry of China (08–25).
Abstract: Length composition analysis can provide insights into the dynamics of a fish population. Accurate quantification of the size structure of a population is critical to understanding the status of a fishery and how the population responds to environmental stressors. A scientific observer program is a reliable way to provide such accurate information. However, 100% observer coverage is usually impossible for most fisheries because of logistic and financial constraints. Thus, there is a need to evaluate observer program performance, identify suitable sample sizes, and optimize the allocation of observation effort. The objective of this study is to evaluate the effects of sample size on the quality of length composition data and to identify an optimal coverage rate and observation ratio to improve observation efficiency, using an onboard observer data set from China's tuna longline fishery in the western and central Pacific Ocean. We found that the required sample size varies with fish species, the indices used to describe length composition, the acceptable accuracy of the estimates, and the allocation methods of sampling effort. Ignoring other information requirements, 1000 individuals would be sufficient for most species to reliably quantify length compositions, and a smaller sample size could generate reliable estimates of mean length. A coverage rate of 20% would be sufficient for most species, but a lower coverage rate (5% or 10%) could also be effective in meeting the accuracy and precision requirements for estimating length compositions. A nonrandom allocation of effort among fishing baskets within a set could cause the length composition to be overestimated or underestimated for some species. The differences in effective sample sizes among species should be considered for a rational allocation of observation effort when species have different management priorities.
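The kind of sample-size evaluation described can be sketched with a simple resampling exercise: subsample a length data set at increasing sizes and ask when the mean-length estimate meets an accuracy target. This is an illustrative sketch under assumed inputs, not the study's method; the length population below is simulated rather than observer data.

```python
import random
import statistics

def mean_length_error(population, n, reps=500, seed=7):
    """Average relative error of the mean-length estimate at sample size n."""
    rng = random.Random(seed)
    true_mean = statistics.fmean(population)
    errs = []
    for _ in range(reps):
        sample = rng.choices(population, k=n)   # resample with replacement
        errs.append(abs(statistics.fmean(sample) - true_mean) / true_mean)
    return statistics.fmean(errs)

def smallest_adequate_n(population, target, grid=(10, 25, 50, 100, 250, 500, 1000)):
    """First sample size on the grid whose average relative error meets the target."""
    for n in grid:
        if mean_length_error(population, n) <= target:
            return n
    return None

# Simulated fork lengths (cm): mean 100, sd 20
rng = random.Random(42)
lengths = [rng.gauss(100, 20) for _ in range(5000)]
n_req = smallest_adequate_n(lengths, target=0.02)
print(n_req)
```

As the abstract notes, the adequate size depends on the index and the accuracy threshold; tightening `target` or switching from mean length to a full length-frequency comparison would push the required n upward.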