Doubled haploid (DH) plants have been widely used for breeding and biological research in crops. Populus spp. have served as model woody plant species for biological research. However, the induction of DH poplar plants is onerous, and limited biological or breeding work has been carried out on DH individuals or populations. In this study, we provide an effective protocol for poplar haploid induction based on an anther culture method. A total of 96 whole DH plant lines were obtained using an F1 hybrid of Populus simonii × P. nigra as a donor tree. The phenotypes of the DH population showed exceptionally high variance when compared to those of half-sib progeny of the donor tree. Each DH line displayed distinct features compared to those of the other DH lines or the donor tree. Additionally, some excellent homozygous lines have the potential to be model plants in genetic and breeding studies.
The inter-cycle correlation of fission source distributions (FSDs) in the Monte Carlo power iteration process results in variance underestimation of tallied physical quantities, especially in large local tallies. This study provides a mesh-free, semiquantitative method for eliminating variance underestimation, yielding a credible confidence interval for the tallied results. The method comprises two procedures: Estimation and Elimination. In the Estimation procedure, the FSD inter-cycle correlation length is estimated using the Sliced Wasserstein distance algorithm. The batch method is then used in the Elimination procedure, where the FSD inter-cycle correlation length is shown to be the optimal batch length for eliminating variance underestimation. We exemplified this method using the OECD sphere array model and the 3D PWR BEAVRS model. The results showed that the average variance underestimation ratios of local tallies declined from 37%-87% to within ±5% in these models.
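The Elimination step above rests on a standard batch-means idea: when consecutive cycle tallies are positively correlated, the naive variance of the mean is biased low, and regrouping cycles into batches no shorter than the correlation length restores a credible estimate. A minimal sketch of that idea (the AR(1) tallies and batch length here are illustrative stand-ins, not the paper's models):

```python
import numpy as np

def batch_variance(cycle_tallies, batch_length):
    """Estimate the variance of the mean from per-cycle tallies by
    grouping consecutive cycles into batches of `batch_length` and
    treating the batch means as (nearly) independent samples."""
    x = np.asarray(cycle_tallies, dtype=float)
    n_batches = len(x) // batch_length
    batches = x[: n_batches * batch_length].reshape(n_batches, batch_length)
    means = batches.mean(axis=1)
    # sample variance of the batch means, divided by the number of batches
    return means.var(ddof=1) / n_batches

# AR(1)-correlated tallies mimic inter-cycle FSD correlation
rng = np.random.default_rng(0)
rho, n = 0.8, 20000
e = rng.standard_normal(n)
x = np.empty(n)
x[0] = e[0]
for i in range(1, n):
    x[i] = rho * x[i - 1] + e[i]

naive = x.var(ddof=1) / n          # assumes independent cycles: biased low
batched = batch_variance(x, 50)    # batch length >> correlation length
```

With positive inter-cycle correlation the batched estimate is several times the naive one, which is exactly the underestimation ratio the abstract quantifies.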
The zero-energy-variance principle can be exploited in variational quantum eigensolvers for solving general eigenstates, but its capacity for obtaining a specified eigenstate, such as the ground state, is limited because all eigenstates have zero energy variance. We propose a variance-based variational quantum eigensolver for solving the ground state by searching in an enlarged space of wavefunctions and Hamiltonians. With a mutual variance-Hamiltonian optimization procedure, the Hamiltonian is iteratively updated to guide the state toward the ground state of the target Hamiltonian by minimizing the energy variance in each iteration. We demonstrate the performance and properties of the algorithm with numerical simulations. Our work suggests an avenue for utilizing guided Hamiltonians in hybrid quantum-classical algorithms.
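The limitation stated in the first sentence, namely that plain variance minimization converges to some eigenstate rather than necessarily the ground state, is easy to reproduce on a single qubit. In the hedged sketch below (a toy Hamiltonian and Ry ansatz chosen for illustration; this is not the paper's mutual variance-Hamiltonian procedure), minimizing the energy variance over theta in [0, pi] lands on the excited eigenstate +sqrt(2) of H = X + Z, not the ground state -sqrt(2):

```python
import numpy as np

# Single-qubit toy Hamiltonian H = X + Z, eigenvalues +/- sqrt(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
H = X + Z
H2 = H @ H

def state(theta):
    """Ry(theta)|0> ansatz with real amplitudes."""
    return np.array([np.cos(theta / 2.0), np.sin(theta / 2.0)])

def energy_variance(theta):
    psi = state(theta)
    e = psi @ H @ psi
    return psi @ H2 @ psi - e ** 2   # <H^2> - <H>^2

# Minimize the variance by a dense grid search over the ansatz parameter
thetas = np.linspace(0.0, np.pi, 20001)
variances = np.array([energy_variance(t) for t in thetas])
theta_star = thetas[np.argmin(variances)]
energy = state(theta_star) @ H @ state(theta_star)
```

The optimizer reaches zero variance at theta = pi/4, an eigenstate with energy +sqrt(2); the ground-state direction lies outside this ansatz range, which is the ambiguity the paper's guided-Hamiltonian update is designed to resolve.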
As modern weapons and equipment undergo increasing levels of informatization, intelligence, and networking, the topology and traffic characteristics of battlefield data networks built with tactical data links are becoming progressively complex. In this paper, we employ a traffic matrix to model the tactical data link network. We propose a method that utilizes the Maximum Variance Unfolding (MVU) algorithm to conduct nonlinear dimensionality reduction analysis on high-dimensional open network traffic matrix datasets. This approach introduces novel ideas and methods for future applications, including traffic prediction and anomaly analysis in real battlefield network environments.
Monitoring temporal changes in sea level is important in assessing coastal risk. Sea level anomalies at a tide gauge station, if kinematically conceived, include systematic variations such as trend, acceleration, and periodic oscillations, as well as random disturbances. Among them, the non-stationary nature of random sea level variations of known or unknown origin at coastal regions has long been recognized by the sea level community. This study proposes analyzing subgroups of random residual statistics from a rigorously formulated kinematic model solution of tide gauge variations using X-bar and S control charts. The approach is demonstrated using the Key West, Florida tide gauge records. The means and standard errors of 5-year-long subgroups of the residuals revealed that sea level changes at this location have been progressively intensifying from 1913 to the present. Increasing oscillations in sea level at this locality may be attributed partly to the thermal expansion of seawater, in which rising temperatures cause larger buoyancy-related sea level fluctuations, and partly to the intensification of atmospheric events, including changing wind patterns and inverted barometer effects; both will alter coastal risk assessments for the future.
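The X-bar and S chart construction applied to the tide-gauge residuals can be sketched in a few lines: partition the residual series into equal subgroups, chart the subgroup means and standard deviations, and place three-sigma control limits using the unbiasing constant c4. The subgroup size and synthetic residuals below are illustrative, not the Key West data:

```python
import numpy as np
from math import gamma, sqrt

def xbar_s_limits(data, subgroup_size):
    """X-bar and S control-chart statistics and limits from rational subgroups."""
    n = subgroup_size
    groups = np.asarray(data, float).reshape(-1, n)
    xbar = groups.mean(axis=1)           # subgroup means (X-bar chart points)
    s = groups.std(axis=1, ddof=1)       # subgroup std devs (S chart points)
    # unbiasing constant c4 and the derived A3 / B3 / B4 chart constants
    c4 = sqrt(2.0 / (n - 1)) * gamma(n / 2.0) / gamma((n - 1) / 2.0)
    sbar, grand = s.mean(), xbar.mean()
    a3 = 3.0 / (c4 * sqrt(n))
    b = 3.0 * sqrt(1.0 - c4 ** 2) / c4
    limits = {
        "xbar": (grand - a3 * sbar, grand, grand + a3 * sbar),  # LCL, CL, UCL
        "s": (max(0.0, 1.0 - b) * sbar, sbar, (1.0 + b) * sbar),
    }
    return xbar, s, limits

rng = np.random.default_rng(1)
series = rng.normal(0.0, 1.0, size=600)   # stand-in for kinematic-model residuals
xbar, s, lim = xbar_s_limits(series, subgroup_size=60)  # 10 subgroups
```

Points falling outside the limits, or a drifting S chart as the study reports, signal non-stationary residual variance.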
In this study, based on the simulated discharge results of chemical disinfectants, a hypocotyl germination concentration-gradient pre-test and a concentration-gradient determination experiment were set up. Laboratory cultivation was conducted to compare and analyze root germination and germination indexes (cumulative root germination, cumulative germination, and cumulative expansion of the second pair of leaves) for the hypocotyls of three mangrove species: Kandelia candel (Linn.) Druce, Ceriops tagal C.B. Rob., and Bruguiera sexangula var. rhynchopetala. One-way analysis of variance was used to obtain the tolerance thresholds of the three mangrove hypocotyls to strong chlorine disinfectant. The study determined that, for the by-products of strong chlorine disinfectant, the toxic threshold concentrations for Kandelia candel (Linn.) Druce, Ceriops tagal C.B. Rob., and Bruguiera sexangula var. rhynchopetala are close to 0.55 mg/L, 0.55 mg/L, and 0.25 mg/L, respectively. This concentration range is lower than the average concentration of 1.183 mg/L of active chlorine emitted from strong chlorine concentrate during pond clearing in high-level shrimp ponds, indicating that transient emissions of strong chlorine concentrate during pond clearing can have a toxic effect on mangrove plants. In tolerance of the embryonic axes to effective chlorine contamination, Ceriops tagal C.B. Rob. was stronger than Bruguiera sexangula var. rhynchopetala, and Kandelia candel (Linn.) Druce was the weakest.
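The tolerance thresholds above were obtained with one-way analysis of variance; the F statistic is simply the ratio of between-group to within-group mean squares. A minimal sketch with hypothetical germination rates at three disinfectant concentrations (the numbers below are invented for illustration, not the study's measurements):

```python
import numpy as np

def one_way_anova(*groups):
    """One-way ANOVA F statistic: between-group vs within-group mean squares."""
    groups = [np.asarray(g, float) for g in groups]
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

# Hypothetical germination rates at three disinfectant concentrations
low = [0.92, 0.90, 0.88, 0.91]
mid = [0.78, 0.80, 0.75, 0.77]
high = [0.30, 0.28, 0.33, 0.31]
F, df_b, df_w = one_way_anova(low, mid, high)
```

A large F relative to the F(df_b, df_w) distribution indicates that germination differs across concentrations, which is how a toxic threshold concentration is located.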
Background: When continuous scale measurements are available, agreement between two measuring devices is assessed both graphically and analytically. In clinical investigations, Bland and Altman proposed plotting subject-wise differences between raters against subject-wise averages. In order to scientifically assess agreement, Bartko recommended combining the graphical approach with the statistical analytic procedure suggested by Bradley and Blackwood. The advantage of this approach is that it enables significance testing and sample size estimation. We noted that direct use of the regression results is misleading, and we provide a correction in this regard. Methods: Graphical and linear models are used to assess agreement for continuous scale measurements. We demonstrate that software linear regression results should not be used as reported, and we provide correct analytic procedures. The degrees of freedom of the F-statistics are incorrectly reported, and we propose methods to overcome this problem by introducing the correct analytic form of the F statistic. Methods for sample size estimation using R functions are also given. Results: We believe that the tutorial and the R code are useful tools for testing and estimating agreement between two rating protocols for continuous scale measurements. Interested readers may apply the code to their own data when agreement between two raters is the subject of interest.
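The Bradley-Blackwood procedure mentioned above reduces to a single regression-based F test: regress the paired differences on the paired averages and jointly test that intercept and slope are both zero. A hedged sketch (the rater data are simulated for illustration; the degrees-of-freedom correction discussed in the abstract is the paper's contribution and is not reproduced here):

```python
import numpy as np

def bradley_blackwood(x, y):
    """Bradley-Blackwood joint test of equal means and equal variances:
    regress d = x - y on a = (x + y)/2 and test intercept = slope = 0,
    with F = ((sum d^2 - SSE) / 2) / (SSE / (n - 2)) on (2, n-2) df."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    d, a = x - y, (x + y) / 2.0
    n = len(d)
    A = np.column_stack([np.ones(n), a])      # design matrix: intercept + averages
    coef, *_ = np.linalg.lstsq(A, d, rcond=None)
    sse = ((d - A @ coef) ** 2).sum()
    F = ((d @ d - sse) / 2.0) / (sse / (n - 2))
    return F, (2, n - 2)

rng = np.random.default_rng(2)
truth = rng.normal(50, 10, 30)
rater1 = truth + rng.normal(0, 1, 30)
rater2 = truth + rng.normal(0, 1, 30)   # same bias, same precision as rater1
F, df = bradley_blackwood(rater1, rater2)
```

A non-significant F supports agreement (no mean difference and no variance difference between raters).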
This article introduces proportional reinsurance contracts under the mean-variance criterion, studying the time-consistent investment portfolio problem while considering the interests of both insurance companies and reinsurance companies. The insurance claims process follows a jump-diffusion model, and the risky asset prices of the insurance and reinsurance companies are assumed to follow distinct CEV models. In the framework of game theory, the time-consistent equilibrium reinsurance strategy is obtained by solving the extended HJB equation analytically. Finally, numerical examples are used to illustrate the impact of model parameters on equilibrium strategies and to provide economic explanations. The results indicate that the decision weights of insurance companies and reinsurance companies have a significant impact on both the reinsurance ratio and the equilibrium reinsurance strategy.
This paper describes the application of the variance method for flux estimation over a mixed agricultural region in China. Eddy covariance and flux variance measurements were conducted in a near-surface layer over a non-uniform land surface in the central plain of China from 7 June to 20 July 2002. During this period, the mean canopy height was about 0.50 m. The study site consisted of grass (10% of area), beans (15%), corn (15%), and rice (60%). Under unstable conditions, the standard deviations of temperature and water vapor density (normalized by appropriate scaling parameters), observed by a single instrument, followed Monin-Obukhov similarity theory. The similarity constants for heat (CT) and water vapor (Cq) were 1.09 and 1.49, respectively. In comparison with direct measurements using eddy covariance techniques, the flux variance method on average underestimated sensible heat flux by 21% and latent heat flux by 24%. This may be attributed to the observed deviations (20%-30% at most) of the similarity "constants" from the generally valid relations, which are within the expected range of variation for a single instrument.
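For readers unfamiliar with the flux variance approach, its free-convection form relates sensible heat flux directly to the standard deviation of temperature through the similarity constant CT reported above. The sketch below uses a Tillman-type free-convection formula; the height, mean temperature, and sigma_T values are illustrative, not the campaign's measurements:

```python
import numpy as np

# Free-convection flux-variance estimate of sensible heat flux:
#   H = rho * cp * (sigma_T / C_T)**1.5 * sqrt(kappa * g * z / T)
# (Tillman-type relation; applicable under strongly unstable conditions.)
rho, cp = 1.2, 1004.0        # air density (kg/m^3), specific heat (J/kg/K)
kappa, g = 0.4, 9.81         # von Karman constant, gravity (m/s^2)
C_T = 1.09                   # similarity constant for heat, from the study
z, T = 2.0, 300.0            # measurement height (m), mean temperature (K)

def flux_variance_H(sigma_T):
    """Sensible heat flux (W/m^2) from the temperature standard deviation (K)."""
    return rho * cp * (sigma_T / C_T) ** 1.5 * np.sqrt(kappa * g * z / T)

H = flux_variance_H(sigma_T=0.4)   # 0.4 K: a plausible daytime value
```

The appeal of the method, as the abstract notes, is that only a fast temperature variance measurement is needed rather than a full eddy-covariance system.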
The multipath effect and movements of people in indoor environments lead to inaccurate localization. Through tests, calculation, and analysis of the received signal strength indication (RSSI) and its variance, we propose a novel variance-based fingerprint distance adjustment algorithm (VFDA). Based on the rule that variance decreases as the RSSI mean increases, VFDA estimates the RSSI variance from the mean value of the received RSSIs and derives a correction weight, which it uses to adjust the fingerprint distances. In addition, a threshold is applied to VFDA to further improve its performance. VFDA, with and without the threshold, was evaluated in two typical real indoor environments deployed with several Wi-Fi access points: a quadrate lab room and a long, narrow corridor of a building. Experimental results and performance analysis show that in indoor environments both variants achieve better positioning accuracy and environmental adaptability than typical positioning methods based on the k-nearest neighbor and weighted k-nearest neighbor algorithms, at similar computational cost.
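The core of VFDA's correction, down-weighting access points whose RSSI is unstable when computing fingerprint distances, can be illustrated with a variance-weighted Euclidean distance. This is an illustrative stand-in, not the paper's exact weighting formula, and the RSSI values below are invented:

```python
import numpy as np

def variance_weighted_distance(obs_rssi, fp_mean, fp_var, eps=1e-6):
    """Fingerprint distance down-weighted where RSSI variance is high
    (an illustrative stand-in for VFDA's variance-based correction)."""
    w = 1.0 / (np.asarray(fp_var, float) + eps)   # low variance -> high trust
    diff = np.asarray(obs_rssi, float) - np.asarray(fp_mean, float)
    return np.sqrt((w * diff ** 2).sum() / w.sum())

# Two reference points, two access points; AP2 is noisy at reference point B
obs = [-50.0, -62.0]
rp_a = variance_weighted_distance(obs, fp_mean=[-51.0, -60.0], fp_var=[1.0, 1.5])
rp_b = variance_weighted_distance(obs, fp_mean=[-49.0, -75.0], fp_var=[1.0, 40.0])
nearest = "A" if rp_a < rp_b else "B"
```

Feeding such variance-corrected distances into a (weighted) k-nearest-neighbor matcher is the fingerprinting pipeline the abstract compares against.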
This article studies the optimal proportional reinsurance and investment problem under a constant elasticity of variance (CEV) model. Assuming that the insurer's surplus process follows a jump-diffusion process, the insurer can purchase proportional reinsurance from the reinsurer via the variance principle and invest in a risk-free asset and a risky asset whose price is modeled by a CEV model. The diffusion term can represent either the uncertainty associated with the surplus of the insurer or additional small claims. The objective of the insurer is to maximize the expected exponential utility of terminal wealth. This optimization problem is studied in two cases depending on the interpretation of the diffusion term. In both cases, using techniques of stochastic control theory, closed-form expressions for the value functions and optimal strategies are obtained.
Underwater acoustic signal processing is one of the research hotspots in underwater acoustics, and noise reduction is key to it. Owing to the complexity of the marine environment and the particularity of the underwater acoustic channel, noise reduction of underwater acoustic signals has always been a difficult challenge. To address this, we propose a novel noise reduction technique for underwater acoustic signals based on complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN), the minimum mean square variance criterion (MMSVC), and the least mean square adaptive filter (LMSAF). This technique, named CEEMDAN-MMSVC-LMSAF, has three main advantages: (i) as an improved algorithm over empirical mode decomposition (EMD) and ensemble EMD (EEMD), CEEMDAN better suppresses mode mixing and avoids having to select the number of decompositions, as in variational mode decomposition (VMD); (ii) MMSVC can identify noisy intrinsic mode functions (IMFs) and avoids having to select thresholds for different permutation entropies; (iii) for noise reduction of noisy IMFs, LMSAF avoids the selection of the decomposition number and basis function required by wavelet noise reduction. First, CEEMDAN decomposes the original signal into IMFs, which can be divided into noisy IMFs and real IMFs. Then, MMSVC and LMSAF are used to identify the noisy IMFs and remove their noise components. Finally, the denoised noisy IMFs and the real IMFs are reconstructed to obtain the final denoised signal. Compared with other noise reduction techniques, the validity of CEEMDAN-MMSVC-LMSAF is demonstrated by the analysis of simulated signals and real underwater acoustic signals; it achieves a better noise reduction effect and has practical application value. CEEMDAN-MMSVC-LMSAF also provides a reliable basis for the detection, feature extraction, classification, and recognition of underwater acoustic signals.
Peak ground acceleration (PGA) estimation is an important task in earthquake engineering practice. One of the most well-known models is the Boore-Joyner-Fumal formula, which estimates the PGA using the moment magnitude, the site-to-fault distance, and the site foundation properties. In the present study, the complexity of this formula and the homogeneity assumption for the prediction-error variance are investigated, and an efficiency-robustness balanced formula is proposed. For this purpose, a reduced-order Monte Carlo simulation algorithm for Bayesian model class selection is presented to obtain the most suitable predictive formula and prediction-error model for the seismic attenuation relationship. In this approach, each model class (a predictive formula with a prediction-error model) is evaluated according to its plausibility given the data. The one with the highest plausibility is robust, since it possesses the optimal balance between data-fitting capability and sensitivity to noise. A database of strong ground motion records in the Tangshan region of China is obtained from the China Earthquake Data Center for the analysis, and the optimal predictive formula is proposed based on this database. It is shown that the proposed formula with heterogeneous prediction-error variance is much simpler than the attenuation model suggested by Boore, Joyner and Fumal (1993).
This study focuses on the low-frequency variance of the surface wave field in the area of the Antarctic Circumpolar Current (ACC) and its correlation with the Antarctic circumpolar wave (ACW). Analysis of a 44-year series of significant wave height (SWH) interannual anomalies reveals that the SWH anomalies have a strong periodicity of about 4-5 years, and that this signal propagated eastward clearly from 1985 to 1995, needing about 8 years to complete a circuit around the earth. The empirical orthogonal function (EOF) method is used to analyze the filtered monthly SWH anomalies and to study the spatio-temporal distributions and propagation characteristics of the low-frequency signals in the wave field. Both the dominant wavenumber-2 pattern in space and the propagation features in the South Pacific, the South Atlantic, and the South Indian Ocean show strong consistency with the ACW, so it is reasonable to conclude that the ACW signal also exists in the wave field. The ACW is important for the climate in the Southern Ocean, so the large-scale effect of the surface wave deserves more attention, as it may also be important for climate studies.
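The EOF analysis used to isolate the wavenumber-2 signal amounts to a singular value decomposition of the time-by-space anomaly matrix. A minimal sketch on a synthetic eastward-propagating wavenumber-2 field (the grid, period, and noise level are illustrative): a propagating wave shows up as a pair of EOFs in spatial quadrature that together explain most of the variance.

```python
import numpy as np

def eof_analysis(field, n_modes=2):
    """EOF decomposition of a (time x space) anomaly matrix via SVD.
    Returns spatial patterns, principal components, explained variance."""
    anomalies = field - field.mean(axis=0)
    u, sing, vt = np.linalg.svd(anomalies, full_matrices=False)
    var_explained = sing ** 2 / (sing ** 2).sum()
    pcs = u[:, :n_modes] * sing[:n_modes]        # PC time series
    eofs = vt[:n_modes]                          # spatial patterns
    return eofs, pcs, var_explained[:n_modes]

# Synthetic field: one eastward-propagating wavenumber-2 signal plus noise
t = np.arange(240)[:, None]                      # 240 months
lon = np.linspace(0, 2 * np.pi, 72)[None, :]     # 72 longitudes
field = np.cos(2 * lon - 2 * np.pi * t / 48.0)   # wavenumber 2, 4-year period
field = field + 0.1 * np.random.default_rng(3).standard_normal(field.shape)
eofs, pcs, var = eof_analysis(field)
```

The two leading modes carry nearly all the variance, and their PC time series lag each other by a quarter period, the standard EOF signature of a traveling wave like the ACW.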
In the present study, the authors investigated the relationship between the Arctic Oscillation (AO) and the high-frequency variability of daily sea level pressures in the Northern Hemisphere in winter (November through March), using NCEP/NCAR reanalysis datasets for the period 1948/49-2000/01. High-frequency signals are defined as those with timescales shorter than three weeks, measured in terms of variance for each winter at each grid point. Correlations between the monthly mean AO index and the high-frequency variance were computed. A predominant feature is that several regional centers of high correlation show up in the middle to high latitudes. Significant areas include mid- to high-latitude Asia centered on Siberia, northern Europe, and the middle-latitude North Atlantic east of northern Africa. Their strong correlations are also confirmed by singular value decomposition analysis of the covariance between mean sea level pressure (SLP) and high-frequency variance. This indicates that the relationship of the AO with daily SLP is confined to some specific regions, in association with the inherent atmospheric dynamics. In middle-latitude Asia, there is a significant (at the 95% level) variance trend of -2.26% (10 yr)^-1. Another region that displays a strong trend is the northwestern Pacific, with a significant rate of change of 0.80% (10 yr)^-1. If the winter of 1948/49, an apparent outlier, is excluded, a steady linear trend of +1.51% (10 yr)^-1 shows up in northern Europe. The variance probability density functions (PDFs) are found to change in association with different AO phases. The changes corresponding to high and low AO phases, however, are asymmetric in these regions. Some regions such as northern Europe display much stronger changes in high AO years, whereas other regions such as Siberia show a stronger connection to low AO conditions. These features are supported by ECMWF reanalysis data. However, the dynamical mechanisms involved in the AO-high-frequency SLP variance connection are not yet well understood, and this needs further study.
The acquisition of precise soil data representative of the entire survey area is a critical issue for many treatments in precision agriculture, such as irrigation or fertilization. The aim of this study was to investigate the spatial variability of soil bulk electrical conductivity (ECb) in a coastal saline field and to design an optimized spatial sampling scheme for ECb based on a sampling design algorithm, the variance quad-tree (VQT) method. Soil ECb data were collected from the field at 20 m intervals in a regular grid scheme. A smooth contour map of the whole field was obtained by ordinary kriging interpolation; the VQT algorithm was then used to split the contour map into the desired number of strata, within each of which sampling locations can be selected in subsequent sampling. The results indicated that the VQT method significantly increases the probability of choosing representative sampling sites, greatly reducing the number of samples compared to a grid sampling design while retaining the same prediction accuracy. The advantage of the VQT method is that it samples sparsely in areas where the spatial variability is relatively uniform and more intensively where the variability is large. The sampling efficiency can thus be improved, facilitating an assessment methodology that can be applied in a rapid, practical, and cost-effective manner.
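The VQT idea, subdividing most finely where spatial variability is largest, can be sketched as a greedy quad split: repeatedly quarter the stratum with the greatest within-stratum sum of squared deviations. This is a simplified reading of the algorithm, not the paper's implementation (note the while loop grows the stratum count in steps of three, so it stops at the first count at or above the target):

```python
import heapq
import numpy as np

def variance_quad_tree(grid, n_strata):
    """Split a 2-D field into about n_strata rectangles, always subdividing
    the stratum with the largest within-stratum sum of squared deviations."""
    def cost(r0, r1, c0, c1):
        block = grid[r0:r1, c0:c1]
        return float(((block - block.mean()) ** 2).sum())

    h, w = grid.shape
    strata = [(-cost(0, h, 0, w), (0, h, 0, w))]   # max-heap via negated cost
    while len(strata) < n_strata:
        _, (r0, r1, c0, c1) = heapq.heappop(strata)
        rm, cm = (r0 + r1) // 2, (c0 + c1) // 2
        for rr0, rr1, cc0, cc1 in [(r0, rm, c0, cm), (r0, rm, cm, c1),
                                   (rm, r1, c0, cm), (rm, r1, cm, c1)]:
            if rr1 > rr0 and cc1 > cc0:            # skip degenerate quadrants
                heapq.heappush(strata, (-cost(rr0, rr1, cc0, cc1),
                                        (rr0, rr1, cc0, cc1)))
    return [box for _, box in strata]

rng = np.random.default_rng(4)
grid = np.zeros((32, 32))
grid[:, 16:] = rng.normal(0, 5, (32, 16))   # variable right half, uniform left
boxes = variance_quad_tree(grid, 10)
```

On this synthetic field the strata concentrate in the variable right half, mirroring the paper's point that VQT places sampling effort where variability is large.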
In order to enhance the robustness and contrast of the minimum variance (MV) beamformer, an adaptive diagonal loading method is proposed. The conventional diagonal loading technique has already been used in the MV beamformer, but it has the drawback that the loading level is specified by a predefined parameter without consideration of the input data. To alleviate this problem, in the proposed adaptive diagonally loaded beamformer the loading level is computed appropriately and automatically from the given data by a shrinkage method. The performance of the proposed beamformer was tested on simulated point-target and cyst phantoms generated using Field II. In the point-target simulation, the proposed method shows higher lateral resolution than the conventional delay-and-sum beamformer and is more robust in estimating the amplitude peak than the MV beamformer when acoustic velocity errors exist. In the cyst phantom simulation, the proposed beamformer achieves an improvement in contrast ratio without distorting the edges of the cyst.
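A common way to make the diagonal loading level data-driven, as the abstract describes, is a Ledoit-Wolf-style shrinkage of the sample covariance toward a scaled identity before forming the MV weights. The sketch below is an illustrative stand-in for the paper's shrinkage estimator; the array size, snapshots, and steering vector are invented:

```python
import numpy as np

def mv_weights_adaptive_loading(snapshots, steering):
    """Minimum-variance weights with a data-driven diagonal load chosen by
    a Ledoit-Wolf-style shrinkage rule (illustrative, not the paper's exact
    estimator)."""
    X = np.asarray(snapshots)            # shape: (elements, snapshots)
    n, k = X.shape
    R = (X @ X.conj().T) / k             # sample covariance
    mu = np.trace(R).real / n            # shrinkage target: mu * I
    num = sum(np.linalg.norm(np.outer(x, x.conj()) - R, "fro") ** 2
              for x in X.T) / k ** 2     # variability of rank-1 terms
    den = np.linalg.norm(R - mu * np.eye(n), "fro") ** 2
    alpha = min(1.0, num / max(den, 1e-12))   # shrinkage intensity in [0, 1]
    R_loaded = (1 - alpha) * R + alpha * mu * np.eye(n)
    a = np.asarray(steering).reshape(-1, 1)
    Ri_a = np.linalg.solve(R_loaded, a)
    return (Ri_a / (a.conj().T @ Ri_a)).ravel()   # distortionless MV weights

rng = np.random.default_rng(5)
n, k = 8, 32
a = np.ones(n) / np.sqrt(n)                         # broadside steering vector
X = np.outer(a, rng.standard_normal(k)) * 3 + 0.5 * rng.standard_normal((n, k))
w = mv_weights_adaptive_loading(X, a)
```

The distortionless constraint w^H a = 1 is preserved regardless of the loading level, so the adaptation trades off only robustness against resolution.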
The Bayes decision rule of variance components for the one-way random effects model is derived, and empirical Bayes (EB) decision rules are constructed by the kernel estimation method. Under suitable conditions, it is shown that the proposed EB decision rules are asymptotically optimal with convergence rates near O(n^(-1/2)). Finally, an example concerning the main result is given.
Suppression effects in multiple regression analysis may be more common in research than is currently recognized. We review several studies of interest that treat the concept and types of suppressor variables. We also highlight systematic ways to identify suppression effects in multiple regression using statistics such as R^2, sums of squares, regression weights, and comparison of zero-order correlations with the Variance Inflation Factor (VIF). We further establish that the suppression effect is a function of multicollinearity; however, a suppressor variable should only be allowed in a regression analysis if its VIF is less than five (5).
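The VIF screen described above is straightforward to compute: regress each predictor on all the others and take VIF_j = 1 / (1 - R_j^2). A minimal sketch (the two correlated predictors are simulated for illustration):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of predictor matrix X:
    VIF_j = 1 / (1 - R_j^2), with R_j^2 from regressing X_j on the others."""
    X = np.asarray(X, float)
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])  # intercept + others
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        r2 = 1.0 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(6)
x1 = rng.standard_normal(200)
x2 = 0.6 * x1 + 0.8 * rng.standard_normal(200)   # moderately collinear with x1
X = np.column_stack([x1, x2])
vifs = vif(X)
```

Under the rule stated above, both predictors here would be admissible, since their VIFs fall below the threshold of five.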
Background: Large-area forest inventories often use regular grids (with a single random start) of sample locations to ensure a uniform sampling intensity across the space of the surveyed populations. A design-unbiased estimator of variance does not exist for this design. Oftentimes a quasi-default estimator applicable to simple random sampling (SRS) is used, even though it carries the risk of overestimating the variance by a practically important margin. To better exploit the precision of systematic sampling, we assess the performance of five estimators of variance, including the quasi-default. In this study, simulated systematic sampling was applied to artificial populations with contrasting covariance structures, with or without linear trends. We compared the results obtained with the SRS, Matérn's, successive difference replication (SDR), Ripley's, and D'Orazio's (DOR) variance estimators. Results: The variances obtained with the four alternatives to the SRS estimator were strongly correlated and, in all study settings, consistently closer to the target design variance than the SRS estimator, which always produced the greatest overestimation. In populations with near-zero spatial autocorrelation, all estimators performed equally well and delivered estimates close to the actual design variance. Conclusion: Without a linear trend, the SDR and DOR estimators were best, with variance estimates more narrowly distributed around the benchmark; yet in terms of the least average absolute deviation, Matérn's estimator held a narrow lead. With a strong or moderate linear trend, Matérn's estimator is the estimator of choice. In large populations with a low sampling intensity, the performance of the investigated estimators becomes more similar.
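Two of the compared estimators are easy to state side by side: the quasi-default SRS estimator, and the successive-difference estimator that underlies SDR. On a population with a strong linear trend, the SRS form overestimates the variance of a systematic sample mean by a wide margin, which is the pattern the study reports. A hedged sketch (the artificial population below is illustrative and simpler than the study's populations, and the plain successive-difference form is shown rather than its replication variant):

```python
import numpy as np

def var_srs(y):
    """Quasi-default SRS estimator of the variance of the sample mean."""
    y = np.asarray(y, float)
    return y.var(ddof=1) / len(y)

def var_successive_difference(y):
    """Successive-difference estimator of the variance of the sample mean,
    better suited to systematic samples from trended populations."""
    y = np.asarray(y, float)
    n = len(y)
    return np.sum(np.diff(y) ** 2) / (2.0 * n * (n - 1))

# Population with a strong linear trend; systematic sample of every 10th unit
pop = np.linspace(0, 100, 1000) + np.random.default_rng(7).normal(0, 1, 1000)
sample = pop[5::10]                      # n = 100, single random start
v_srs = var_srs(sample)
v_sd = var_successive_difference(sample)
```

Because the trend inflates the overall sample variance but barely affects the successive differences, v_srs greatly exceeds v_sd, illustrating why the SRS estimator "always produced the greatest overestimation" under a trend.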
Funding for the doubled haploid poplar study: supported by the National Key R&D Program of China (2021YFD2200203), the Heilongjiang Province Key R&D Program of China (GA21B010), the Heilongjiang Touyan Innovation Team Program (Tree Genetics and Breeding Innovation Team), and Heilongjiang Postdoctoral Financial Assistance (LBH-Z21097).
Funding for the Monte Carlo variance underestimation study: supported by the China Nuclear Power Engineering Co., Ltd. Scientific Research Project (No. KY22104) and a fellowship of the China Postdoctoral Science Foundation (No. 2022M721793).
Funding for the variance-based variational quantum eigensolver study: supported by the National Natural Science Foundation of China (Grant No. 12005065) and the Guangdong Basic and Applied Basic Research Fund (Grant No. 2021A1515010317).
文摘As modern weapons and equipment undergo increasing levels of informatization,intelligence,and networking,the topology and traffic characteristics of battlefield data networks built with tactical data links are becoming progressively complex.In this paper,we employ a traffic matrix to model the tactical data link network.We propose a method that utilizes the Maximum Variance Unfolding(MVU)algorithm to conduct nonlinear dimensionality reduction analysis on high-dimensional open network traffic matrix datasets.This approach introduces novel ideas and methods for future applications,including traffic prediction and anomaly analysis in real battlefield network environments.
Abstract: Monitoring temporal changes in sea level is important in assessing coastal risk. Sea level anomalies at a tide gauge station, if kinematically conceived, include systematic variations such as trend, acceleration, periodic oscillations, and random disturbances. Among them, the non-stationary nature of random sea level variations of known or unknown origin at coastal regions has long been recognized by the sea level community. This study proposes analyzing subgroups of the random residual statistics of a rigorously formulated kinematic model solution of tide gauge variations using X-bar and S control charts. The approach is demonstrated using the Key West, Florida tide gauge records. The means and standard errors of 5-year-long subgroups of the residuals revealed that sea level changes at this location have been progressively intensifying from 1913 to the present. Increasing oscillations in sea level at this locality may be attributed partly to the thermal expansion of seawater with increasing temperatures, which causes larger buoyancy-related sea level fluctuations, and partly to the intensification of atmospheric events, including wind patterns and changes in inverted barometer effects, that will alter coastal risk assessments for the future.
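The X-bar and S chart computation on consecutive subgroups can be sketched as follows. This is an illustrative implementation of the standard control-chart formulas (3-sigma S-chart limits with the c4 bias correction), not the authors' code; the subgroup size and input series are placeholders.

```python
import numpy as np
from math import gamma, sqrt

def xbar_s_summary(residuals, subgroup_size):
    """Split a residual series into consecutive subgroups and return the
    X-bar (subgroup mean) series, the S (subgroup standard deviation)
    series, and 3-sigma control limits for the S chart."""
    n = len(residuals) // subgroup_size
    groups = np.asarray(residuals[: n * subgroup_size], float)
    groups = groups.reshape(n, subgroup_size)
    xbar = groups.mean(axis=1)
    s = groups.std(axis=1, ddof=1)
    # c4 bias-correction constant for the sample standard deviation
    m = subgroup_size
    c4 = sqrt(2.0 / (m - 1)) * gamma(m / 2.0) / gamma((m - 1) / 2.0)
    s_bar = s.mean()
    half_width = 3.0 * (s_bar / c4) * sqrt(1.0 - c4 ** 2)
    lcl_s = max(0.0, s_bar - half_width)
    ucl_s = s_bar + half_width
    return xbar, s, (lcl_s, ucl_s)
```

Subgroup S values drifting above the upper control limit over time would signal the kind of progressive intensification of sea level variability reported in the abstract.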
Abstract: In this study, based on the simulated discharge of chemical disinfectants, a concentration gradient pre-test and a concentration gradient determination experiment for hypocotyl germination were set up. Laboratory cultivation was conducted to compare and analyze the root germination and germination indexes of the hypocotyls of three mangrove species, Kandelia candel (Linn.) Druce, Ceriops tagal C.B. Rob., and Bruguiera sexangula var. rhynchopetala, including cumulative root germination, cumulative germination, and cumulative expansion of the second pair of leaves. One-way analysis of variance was used to obtain the tolerance thresholds of the three mangrove hypocotyls to strong chlorine disinfectant. The study determined that the toxic threshold concentrations of the by-products of strong chlorine disinfectant for Kandelia candel (Linn.) Druce, Ceriops tagal C.B. Rob., and Bruguiera sexangula var. rhynchopetala are close to 0.55 mg/L, 0.55 mg/L, and 0.25 mg/L, respectively. This concentration range is lower than the average concentration of 1.183 mg/L of active chlorine emitted from strong chlorine concentrate during pond clearing in high-level shrimp ponds, indicating that transient emissions of strong chlorine concentrate during pond clearing can have a toxic effect on mangrove plants. In order of tolerance of the embryonic axes to effective chlorine contamination, Ceriops tagal C.B. Rob. was stronger than Bruguiera sexangula var. rhynchopetala, and Kandelia candel (Linn.) Druce was the weakest.
Abstract: Background: When continuous scale measurements are available, agreement between two measuring devices is assessed both graphically and analytically. In clinical investigations, Bland and Altman proposed plotting subject-wise differences between raters against subject-wise averages. In order to scientifically assess agreement, Bartko recommended combining the graphical approach with the statistical analytic procedure suggested by Bradley and Blackwood. The advantage of this approach is that it enables significance testing and sample size estimation. We note that the direct use of the regression results is misleading, and we provide a correction in this regard. Methods: Graphical and linear models are used to assess agreement for continuous scale measurements. We demonstrate that software linear regression results should not be used as reported, and we provide the correct analytic procedures. The degrees of freedom of the F-statistics are incorrectly reported, and we propose methods to overcome this problem by introducing the correct analytic form of the F statistic. Methods for sample size estimation using R functions are also given. Results: We believe that the tutorial and the R codes are useful tools for testing and estimating agreement between two rating protocols for continuous scale measurements. The interested reader may use the codes and apply them to their available data when agreement between two raters is the subject of interest.
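The Bradley-Blackwood procedure referenced above can be sketched numerically: regress the paired differences on the paired means and form an F statistic that simultaneously tests equal means and equal variances. The sketch below (in Python rather than the paper's R) uses the standard form of the statistic with the (2, n−2) degrees of freedom the abstract highlights; it is an illustration, not the authors' code.

```python
import numpy as np

def bradley_blackwood(x, y):
    """Bradley-Blackwood simultaneous test of equal means and variances
    for paired measurements: regress D = x - y on M = (x + y)/2 and form
    F = ((sum(D^2) - SSE) / 2) / (SSE / (n - 2)), with (2, n-2) df."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    d, m = x - y, (x + y) / 2.0
    n = len(d)
    # Least-squares fit of d on m (intercept + slope)
    A = np.column_stack([np.ones(n), m])
    coef, _, _, _ = np.linalg.lstsq(A, d, rcond=None)
    sse = float(np.sum((d - A @ coef) ** 2))
    f_stat = ((np.sum(d ** 2) - sse) / 2.0) / (sse / (n - 2))
    return f_stat, (2, n - 2)
```

A small F (compared with the F(2, n−2) critical value) supports agreement between the two raters; note that the numerator uses the uncorrected sum of squared differences, which is what makes the test simultaneous.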
Abstract: This article introduces proportional reinsurance contracts under the mean-variance criterion, studying the time-consistent investment portfolio problem while considering the interests of both insurance companies and reinsurance companies. The insurance claims process follows a jump-diffusion model, and the risky asset prices of the insurance company and the reinsurance company are assumed to follow CEV models different from each other. In the framework of game theory, the time-consistent equilibrium reinsurance strategy is obtained by solving the extended HJB equation analytically. Finally, numerical examples are used to illustrate the impact of model parameters on equilibrium strategies and to provide economic explanations. The results indicate that the decision weights of insurance companies and reinsurance companies have a significant impact on both the reinsurance ratio and the equilibrium reinsurance strategy.
Abstract: This paper describes the application of the variance method for flux estimation over a mixed agricultural region in China. Eddy covariance and flux variance measurements were conducted in a near-surface layer over a non-uniform land surface in the central plain of China from 7 June to 20 July 2002. During this period, the mean canopy height was about 0.50 m. The study site consisted of grass (10% of area), beans (15%), corn (15%), and rice (60%). Under unstable conditions, the standard deviations of temperature and water vapor density (normalized by appropriate scaling parameters), observed by a single instrument, followed Monin-Obukhov similarity theory. The similarity constants for heat (CT) and water vapor (Cq) were 1.09 and 1.49, respectively. In comparison with direct measurements using eddy covariance techniques, the flux variance method, on average, underestimated sensible heat flux by 21% and latent heat flux by 24%, which may be attributed to the fact that the observed slight deviations (20% or 30% at most) of the similarity "constants" may be within the expected range of variation of a single instrument from the generally valid relations.
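As a rough illustration of the flux-variance method described above, the free-convection similarity relation is commonly rearranged to estimate sensible heat flux from the temperature standard deviation alone. The formula below is the widely used free-convection simplification (an assumption on my part; the paper's exact working is not reproduced here), with the site's reported CT = 1.09 as the default similarity constant and the remaining constants as typical values.

```python
import numpy as np

def sensible_heat_flux_variance_method(sigma_T, T_mean, z, C_T=1.09,
                                       rho=1.2, cp=1004.0, k=0.4, g=9.81):
    """Free-convection flux-variance estimate of sensible heat flux:
    H = rho * cp * sqrt(k * g * z / T_mean) * (sigma_T / C_T)**1.5,
    where sigma_T is the temperature standard deviation (K), T_mean the
    mean absolute temperature (K), and z the measurement height (m)."""
    return rho * cp * np.sqrt(k * g * z / T_mean) * (sigma_T / C_T) ** 1.5
```

Because H scales with sigma_T to the 3/2 power, a modest deviation of the similarity "constant" CT propagates into a flux error of similar relative size, consistent with the 21%–24% underestimation the authors report.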
Funding: Supported by the National Natural Science Foundation of China (61202004, 61472192), the Special Fund for Fast Sharing of Science Papers in the Net Era by CSTD (2013116), and the Natural Science Fund of Higher Education of Jiangsu Province (14KJB520014).
Abstract: The multipath effect and movements of people in indoor environments lead to inaccurate localization. Through testing, calculation, and analysis of the received signal strength indication (RSSI) and the variance of RSSI, we propose a novel variance-based fingerprint distance adjustment algorithm (VFDA). Based on the rule that variance decreases as the RSSI mean increases, VFDA calculates the RSSI variance from the mean value of received RSSIs to obtain a correction weight, and then adjusts the fingerprint distances with this variance-based correction weight. Besides, a threshold value is applied to VFDA to improve its performance further. VFDA and VFDA with the threshold value were applied in two typical real indoor environments deployed with several Wi-Fi access points: a quadrate lab room, and a long, narrow corridor of a building. Experimental results and performance analysis show that in indoor environments, both VFDA and VFDA with the threshold have better positioning accuracy and environmental adaptability than the current typical positioning methods based on the k-nearest neighbor algorithm and the weighted k-nearest neighbor algorithm, with similar computational costs.
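The core idea of the variance-based adjustment can be sketched as a weighted fingerprint distance: access points whose readings are expected to be noisy (low mean RSSI, hence high variance under the rule stated above) are down-weighted. The exact weighting in VFDA is not reproduced here; the `var_of_mean` model and the inverse-variance weights below are illustrative assumptions.

```python
import numpy as np

def vfda_distance(online_rssi, fingerprint_rssi, var_of_mean):
    """Variance-adjusted fingerprint distance (sketch).
    `var_of_mean(mean_rssi)` models RSSI variance as a decreasing
    function of the RSSI mean; inverse-variance weights then reduce the
    influence of noisy access points on the fingerprint distance."""
    online = np.asarray(online_rssi, float)
    fp = np.asarray(fingerprint_rssi, float)
    variances = np.array([var_of_mean(r) for r in online])
    weights = 1.0 / (variances + 1e-9)   # higher variance -> lower weight
    weights /= weights.sum()
    return float(np.sqrt(np.sum(weights * (online - fp) ** 2)))
```

The adjusted distances would then feed the usual k-nearest-neighbor matching step in place of the plain Euclidean fingerprint distance.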
Abstract: This article studies the optimal proportional reinsurance and investment problem under a constant elasticity of variance (CEV) model. Assume that the insurer's surplus process follows a jump-diffusion process; the insurer can purchase proportional reinsurance from the reinsurer via the variance principle and invest in a risk-free asset and a risky asset whose price is modeled by a CEV model. The diffusion term can explain the uncertainty associated with the surplus of the insurer or the additional small claims. The objective of the insurer is to maximize the expected exponential utility of terminal wealth. This optimization problem is studied in two cases depending on the interpretation of the diffusion term. In both cases, by using techniques of stochastic control theory, closed-form expressions for the value functions and optimal strategies are obtained.
Funding: The authors gratefully acknowledge the support of the National Natural Science Foundation of China (No. 11574250).
Abstract: Underwater acoustic signal processing is one of the research hotspots in underwater acoustics, and noise reduction of underwater acoustic signals is its key step. Owing to the complexity of the marine environment and the particularity of the underwater acoustic channel, noise reduction of underwater acoustic signals has always been a difficult challenge in this field. To address this problem, we propose a novel noise reduction technique for underwater acoustic signals based on complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN), a minimum mean square variance criterion (MMSVC), and a least mean square adaptive filter (LMSAF). This technique, named CEEMDAN-MMSVC-LMSAF, has three main advantages: (i) as an improved algorithm of empirical mode decomposition (EMD) and ensemble EMD (EEMD), CEEMDAN can better suppress mode mixing and avoids selecting the number of decompositions required in variational mode decomposition (VMD); (ii) MMSVC can identify noisy intrinsic mode functions (IMFs) without selecting thresholds of different permutation entropies; (iii) for noise reduction of noisy IMFs, LMSAF avoids the selection of decomposition number and basis function required for wavelet noise reduction. First, CEEMDAN decomposes the original signal into IMFs, which can be divided into noisy IMFs and real IMFs. Then, MMSVC and LMSAF are used to identify noisy IMFs and remove noise components from them. Finally, both the denoised noisy IMFs and the real IMFs are reconstructed to obtain the final denoised signal. Compared with other noise reduction techniques, the validity of CEEMDAN-MMSVC-LMSAF is demonstrated by the analysis of simulated signals and real underwater acoustic signals; it has a better noise reduction effect and practical application value. CEEMDAN-MMSVC-LMSAF also provides a reliable basis for the detection, feature extraction, classification, and recognition of underwater acoustic signals.
Funding: Supported by the Research Committee of the University of Macao under Research Grant No. MYRG081(Y1-L2)-FST13-YKV and the Science and Technology Development Fund of the Macao SAR government under Grant No. 012/2013/A1.
Abstract: Peak ground acceleration (PGA) estimation is an important task in earthquake engineering practice. One of the most well-known models is the Boore-Joyner-Fumal formula, which estimates the PGA using the moment magnitude, the site-to-fault distance, and the site foundation properties. In the present study, the complexity of this formula and the homogeneity assumption for the prediction-error variance are investigated, and an efficiency-robustness balanced formula is proposed. For this purpose, a reduced-order Monte Carlo simulation algorithm for Bayesian model class selection is presented to obtain the most suitable predictive formula and prediction-error model for the seismic attenuation relationship. In this approach, each model class (a predictive formula with a prediction-error model) is evaluated according to its plausibility given the data. The one with the highest plausibility is robust, since it possesses the optimal balance between data-fitting capability and sensitivity to noise. A database of strong ground motion records in the Tangshan region of China is obtained from the China Earthquake Data Center for the analysis, and the optimal predictive formula is proposed based on this database. It is shown that the proposed formula with heterogeneous prediction-error variance is much simpler than the attenuation model suggested by Boore, Joyner and Fumal (1993).
Funding: The National Natural Science Foundation of China under contract Nos. 40976005 and 40930844.
Abstract: The low-frequency variance of the surface wave in the area of the Antarctic Circumpolar Current (ACC) and its correlation with the Antarctic Circumpolar Wave (ACW) are the focus of this study. The analysis of a 44-year series of significant wave height (SWH) interannual anomalies reveals that the SWH anomalies have a strong periodicity of about 4–5 years, and this signal propagated eastward clearly from 1985 to 1995, needing about 8 years to complete a circuit around the earth. The method of empirical orthogonal functions (EOF) is used to analyze the filtered monthly SWH anomalies to study the spatio-temporal distributions and the propagation characteristics of the low-frequency signals in the wave field. Both the dominant wavenumber-2 pattern in space and the propagation feature in the South Pacific, the South Atlantic, and the South Indian Ocean show strong consistency with the ACW. It is therefore reasonable to conclude that the ACW signal also exists in the wave field. The ACW is important for the climate of the Southern Ocean, so it is worth paying more attention to the large-scale effect of the surface wave, which may also be important for climate studies.
Abstract: In the present study, the authors investigated the relationship between the Arctic Oscillation (AO) and the high-frequency variability of daily sea level pressures in the Northern Hemisphere in winter (November through March), using NCEP/NCAR reanalysis datasets for the period 1948/49-2000/01. High-frequency signals are defined as those with timescales shorter than three weeks and are measured in terms of variance, for each winter and each grid point. Correlations between the monthly mean AO index and high-frequency variance were computed. A predominant feature is that several regional centers with high correlation show up in the middle to high latitudes. Significant areas include mid- to high-latitude Asia centered at Siberia, northern Europe, and the middle-latitude North Atlantic east of northern Africa. Their strong correlations can also be confirmed by singular value decomposition analysis of the covariance between mean Sea Level Pressure (SLP) and high-frequency variance. This indicates that the relationship of the AO with daily SLP is confined to some specific regions in association with the inherent atmospheric dynamics. In middle-latitude Asia, there is a significant (at the 95% level) trend of variance of -2.26% (10 yr)^-1. Another region that displays a strong trend is the northwestern Pacific, with a significant rate of change of 0.80% (10 yr)^-1. If the winter of 1948/49, an apparent outlier, is excluded, a steady linear trend of +1.51% (10 yr)^-1 shows up in northern Europe. The variance probability density functions (PDFs) are found to change in association with different AO phases. The changes corresponding to high and low AO phases, however, are asymmetric in these regions. Some regions such as northern Europe display much stronger changes in high AO years, whereas other regions such as Siberia show a stronger connection to low AO conditions. These features are supported by ECMWF reanalysis data. However, the dynamical mechanisms involved in the connection between the AO and high-frequency SLP variance are not well understood, and this needs further study.
Funding: We thank the National Natural Science Foundation of China (40701007, 40571066) and the Postdoctoral Science Foundation of China (20060401048) for financial support.
Abstract: The acquisition of precise soil data representative of the entire survey area is a critical issue for many treatments, such as irrigation or fertilization, in precision agriculture. The aim of this study was to investigate the spatial variability of soil bulk electrical conductivity (ECb) in a coastal saline field and to design an optimized spatial sampling scheme for ECb based on a sampling design algorithm, the variance quad-tree (VQT) method. Soil ECb data were collected from the field at 20 m intervals in a regular grid scheme. A smooth contour map of the whole field was obtained by ordinary kriging interpolation; the VQT algorithm was then used to split the smooth contour map into the desired number of strata, within each of which sampling locations can be selected in subsequent sampling. The results indicated that the probability of choosing representative sampling sites was increased significantly by the VQT method, with the sampling number greatly reduced compared to the grid sampling design while retaining the same prediction accuracy. The advantage of the VQT method is that it samples sparsely in parts of the field where the spatial variability is relatively uniform and more intensively where the variability is large. The sampling efficiency can thus be improved, facilitating an assessment methodology that can be applied in a rapid, practical, and cost-effective manner.
Funding: Project (2013GZX0147-3) supported by the Science and Technology Pillar Program of Sichuan Province, China.
Abstract: In order to enhance the robustness and contrast of the minimum variance (MV) beamformer, an adaptive diagonal loading method is proposed. The conventional diagonal loading technique has already been used in the MV beamformer, but it has the drawback that the loading level is specified by a predefined parameter without consideration of the input data. To alleviate this problem, in the proposed adaptive diagonally loaded beamformer the loading level is computed appropriately and automatically from the given data by a shrinkage method. The performance of the proposed beamformer was tested on simulated point target and cyst phantoms obtained using Field II. The point target simulation shows that the proposed method has higher lateral resolution than the conventional delay-and-sum beamformer and is more robust in estimating the amplitude peak than the MV beamformer when an acoustic velocity error exists. The cyst phantom simulation shows that the proposed beamformer achieves an improvement in contrast ratio without distorting the edges of the cyst.
Funding: The project is partly supported by the NSFC (19971085), the Doctoral Program Foundation of the Institute of High Education, and the Special Foundation of the Chinese Academy of Sciences.
Abstract: The Bayes decision rule of variance components for the one-way random effects model is derived, and empirical Bayes (EB) decision rules are constructed by the kernel estimation method. Under suitable conditions, it is shown that the proposed EB decision rules are asymptotically optimal with convergence rates near O(n^(-1/2)). Finally, an example concerning the main result is given.
Abstract: Suppression effects in multiple regression analysis may be more common in research than is currently recognized. We review several works of interest that treat the concept and types of suppressor variables. We also highlight systematic ways to identify suppression effects in multiple regression using statistics such as R2, sums of squares, regression weights, and the comparison of zero-order correlations with the Variance Inflation Factor (VIF). We establish that the suppression effect is a function of multicollinearity; however, a suppressor variable should only be allowed in a regression analysis if its VIF is less than five (5).
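The VIF screen mentioned above is straightforward to compute: each predictor is regressed on the remaining predictors, and VIF_j = 1 / (1 - R_j^2). The sketch below is an illustrative implementation of this standard definition, not code from the article.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of the predictor
    matrix X: regress column j on the other columns (with intercept)
    and return 1 / (1 - R_j^2). Values near 1 indicate little
    multicollinearity; the article's rule of thumb rejects VIF >= 5."""
    X = np.asarray(X, float)
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        Z = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), Z])
        coef, _, _, _ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        r2 = 1.0 - resid.var() / y.var()
        out.append(1.0 / max(1.0 - r2, 1e-12))
    return np.array(out)
```

Comparing each predictor's zero-order correlation with the outcome against its behavior in the full model, alongside its VIF, is then what flags a candidate suppressor.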
Abstract: Background: Large-area forest inventories often use regular grids (with a single random start) of sample locations to ensure a uniform sampling intensity across the space of the surveyed populations. A design-unbiased estimator of variance does not exist for this design. Oftentimes, a quasi-default estimator applicable to simple random sampling (SRS) is used, even though it carries the likely risk of overestimating the variance by a practically important margin. To better exploit the precision of systematic sampling, we assess the performance of five estimators of variance, including the quasi-default. In this study, simulated systematic sampling was applied to artificial populations with contrasting covariance structures and with or without linear trends. We compared the results obtained with the SRS, Matern's, successive difference replication (SDR), Ripley's, and D'Orazio's (DOR) variance estimators. Results: The variances obtained with the four alternatives to the SRS estimator were strongly correlated, and in all study settings consistently closer to the target design variance than the SRS estimator, which always produced the greatest overestimation. In populations with near-zero spatial autocorrelation, all estimators performed equally and delivered estimates close to the actual design variance. Conclusion: Without a linear trend, the SDR and DOR estimators were best, with variance estimates more narrowly distributed around the benchmark; yet in terms of the least average absolute deviation, Matern's estimator held a narrow lead. With a strong or moderate linear trend, Matern's estimator is the best choice. In large populations with a low sampling intensity, the performance of the investigated estimators becomes more similar.
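The contrast between the quasi-default SRS estimator and a trend-robust alternative can be illustrated with the basic (non-replicated) successive-difference estimator, a simplification of the SDR estimator named above (the replication machinery is omitted; this is a sketch, not the study's implementation). On a population with a linear trend, a systematic sample is far more precise than the SRS formula suggests.

```python
import numpy as np

def srs_variance(sample):
    """Quasi-default SRS estimator of the variance of the sample mean."""
    s = np.asarray(sample, float)
    return s.var(ddof=1) / len(s)

def successive_difference_variance(sample):
    """Basic successive-difference estimator: built from squared
    differences of neighbouring (systematically ordered) observations,
    so the contribution of a smooth trend largely cancels and the
    estimate tracks the design variance of systematic sampling."""
    s = np.asarray(sample, float)
    n = len(s)
    return np.sum(np.diff(s) ** 2) / (2.0 * n * (n - 1))

# Population with a strong linear trend; take every 25th unit.
trend = np.linspace(0.0, 10.0, 1000)
sys_sample = trend[::25]
print(srs_variance(sys_sample) > successive_difference_variance(sys_sample))
```

On this trended sample the SRS formula overstates the variance by orders of magnitude, which is the overestimation the Background paragraph warns about; with near-zero autocorrelation and no trend the two estimators roughly agree, matching the Results.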