We consider a problem from stock market modeling: specifically, the choice of an adequate distribution for modeling the extremal behavior of stock market data. The generalized extreme value (GEV) distribution and the generalized Pareto (GP) distribution are the classical distributions for this problem. However, since 2004, [1] and many other researchers have shown empirically that the generalized logistic (GL) distribution models extreme movements of stock market data better than the GEV and GP distributions. In this paper, we show that these results are not accidental. We prove the theoretical importance of the GL distribution in extreme value modeling. To do so, we introduce a general multivariate limit theorem and deduce some important multivariate theorems in probability as special cases. Using this theorem, we derive a limit theorem in extreme value theory in which the GL distribution, rather than the GEV distribution, plays the central role. The proof of this result parallels the proof of the classical extremal types theorem, in the sense that it possesses the important characteristics of classical extreme value theory, e.g., distributional properties, stability, convergence, and multivariate extension.
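The GL and GEV distributions differ only in the final transform applied to the same stabilized variable. A minimal sketch of the two CDFs side by side, assuming Hosking's k-parameterization for the GL shape (the abstract does not state which parameterization is used):

```python
import math

def gl_cdf(x, mu=0.0, sigma=1.0, k=0.1):
    """CDF of the generalized logistic (GL) distribution, Hosking's k form:
    F(x) = 1 / (1 + exp(-y)) with y = -log(1 - k*(x-mu)/sigma) / k."""
    if abs(k) < 1e-12:
        y = (x - mu) / sigma
    else:
        t = 1.0 - k * (x - mu) / sigma
        if t <= 0.0:                      # outside the support
            return 1.0 if k > 0 else 0.0
        y = -math.log(t) / k
    return 1.0 / (1.0 + math.exp(-y))

def gev_cdf(x, mu=0.0, sigma=1.0, xi=0.1):
    """CDF of the generalized extreme value (GEV) distribution:
    G(x) = exp(-(1 + xi*(x-mu)/sigma)**(-1/xi))."""
    if abs(xi) < 1e-12:
        return math.exp(-math.exp(-(x - mu) / sigma))
    t = 1.0 + xi * (x - mu) / sigma
    if t <= 0.0:
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))
```

Both reduce to familiar special cases at zero shape (logistic and Gumbel, respectively), which is one reason the GL family can substitute for the GEV in limit theorems of the kind described above.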
In this study, a morphodynamic numerical model is established with the Regional Ocean Modeling System (ROMS) to investigate the transient behavior of sand waves under realistic sea conditions. The simulation of sand wave evolution comprises two steps: 1) a regional-scale model is configured first to simulate the ocean hydrodynamics, i.e., tides and tidal currents, and 2) the transient behavior of sand waves is simulated in a small computational domain under the time-variant currents extracted from the large model. The evolution of sand waves on the continental shelf in the Beibu Gulf is specifically investigated. The numerical results of the two-year evolution of sand waves under normal sea conditions compare well with the field survey data. The transient behavior of sand waves in individual months shows that the sand waves are more stable in April and October than in other months, which can therefore be selected as windows for seabed operations. The effects of sediment properties, including settling velocity, critical shear stress, and surface erosion rate, on sand wave evolution are also analyzed. Then, typhoon-induced currents are superimposed on the tidal currents to represent extreme weather conditions. Sand waves of average wavelength generally behave more actively than smaller or larger sand waves. The characteristics of sand wave evolution during an individual typhoon differ considerably among hydrodynamic combinations. Under storm conditions, i.e., the real combination and maximum combination cases, the sand waves experience significant migration together with a damping in height due to the dominant suspended sediment transport. Under mild conditions, i.e., the pure tidal current and minimum combination cases, the sand waves migrate less, but their heights continue growing due to the dominant bedload transport.
Extreme value theory provides methods to analyze the most extreme parts of data. We predicted the ultimate 100 m dash records for men and women over specific periods using the generalized extreme value (GEV) distribution. The diagnostic plots, which assessed the accuracy of the GEV model, fitted the 100 m records in the world and Japan well, validating the model. The men's world record had a shape parameter of -0.250 with a 95% confidence interval of [-0.391, -0.109]; thus the 100 m record has a finite limit, and the calculated upper limit was 9.46 s. The return level estimates for the men's world record were 9.74, 9.62, and 9.58 s, with 95% confidence intervals of [9.69, 9.79], [9.54, 9.69], and [9.48, 9.67], for 10-, 100-, and 350-year return periods, respectively. In one year, the probability of occurrence of the men's world record, 9.58 s (Usain Bolt), was 1/350, while that of the women's record, 10.49 s (Florence Griffith-Joyner), was about 1/100, confirming that it is more difficult for men to break records than for women.
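The finite upper limit and the return levels quoted above follow from closed-form GEV formulas. A sketch with hypothetical location and scale values (the abstract reports only the shape estimate, so `mu` and `sigma` below are illustrative, not the paper's fit):

```python
import math

def gev_return_level(T, mu, sigma, xi):
    """T-period return level z_T, the solution of G(z_T) = 1 - 1/T
    for a GEV(mu, sigma, xi) distribution of period maxima."""
    y = -math.log(1.0 - 1.0 / T)
    if abs(xi) < 1e-12:
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

def gev_upper_endpoint(mu, sigma, xi):
    """Finite upper endpoint mu - sigma/xi; exists only when xi < 0."""
    assert xi < 0.0, "a finite upper limit requires a negative shape"
    return mu - sigma / xi

# Hypothetical parameters with the paper's negative-shape regime (xi = -0.25).
z10 = gev_return_level(10, 0.0, 1.0, -0.25)
z100 = gev_return_level(100, 0.0, 1.0, -0.25)
limit = gev_upper_endpoint(0.0, 1.0, -0.25)
```

For a negative shape, the return level grows with the return period but can never exceed the endpoint mu - sigma/xi, which is exactly what makes an "ultimate record" finite.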
It is very important to analyze network traffic for network control and management. In this paper, extreme value theory is first introduced, and a model with threshold methods is proposed to analyze the characteristics of network traffic. In this model, only traffic data that exceeds a threshold value is considered. The proposed model is then simulated on the trace using S-Plus software. The modeling results show that the network traffic model constructed from extreme value theory fits the empirical distribution well. Finally, the extreme value model is compared with FARIMA(p,d,q) modeling. The analytical results illustrate that extreme value theory has good application prospects in the statistical analysis of network traffic. In addition, since only traffic data exceeding the threshold is processed, the computational overhead is greatly reduced.
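The threshold method described here keeps only the excesses over a chosen level and fits a generalized Pareto distribution (GPD) to them. A minimal peaks-over-threshold sketch with a method-of-moments GPD fit (the paper's actual estimation procedure is not stated, so the moment estimator below is an illustrative choice):

```python
import random
import statistics

def exceedances(data, u):
    """Peaks-over-threshold step: excesses (x - u) for observations above u."""
    return [x - u for x in data if x > u]

def gpd_moment_fit(excess):
    """Method-of-moments estimates (shape xi, scale sigma) for the GPD,
    valid when the excesses have finite variance (xi < 1/2)."""
    m = sum(excess) / len(excess)
    v = statistics.variance(excess)
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (1.0 + m * m / v)
    return xi, sigma
```

As a sanity check, exponential traffic interarrivals correspond to a GPD with shape near zero, so the fitted `xi` on simulated exponential data should be close to 0. Because only the exceedances are retained, the data volume actually modeled is a small fraction of the trace, which is the source of the computational savings noted above.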
Extreme value theory provides methods to analyze the most extreme parts of data. We used the generalized extreme value (GEV) distribution to predict the ultimate 100 m, 200 m, 400 m, 4 × 100 m relay, and long jump records of male gold medalists at the Olympics. The diagnostic plots, which assessed the accuracy of the GEV model, were fitted to all event records, validating the model. The 100 m, 200 m, 400 m, 4 × 100 m, and long jump records had negative shape parameters and calculated upper limits of 9.58 s, 19.18 s, 42.97 s, 36.71 s, and 9.03 m, respectively. The calculated upper limit in the 100 m (9.58 s) was equal to the record of Usain Bolt (August 16, 2009). The 100 m and 200 m world records were close to the calculated upper limits, and reaching those limits will be difficult. The 400 m and 4 × 100 m relay world records were almost equal to the calculated upper limits and the 500-year return level estimates, and slight improvement is possible in both. At the Tokyo Olympics in August 2021, the probability of a new record occurring in one year was about 1/30 for the 100 m, 200 m, and 4 × 100 m, and about 1/20 for the 400 m and long jump. The greater difficulty of setting records in the 100 m, 200 m, and 4 × 100 m relay shows that fierce competition has taken place in these events.
The accurate calculation of marine environmental design parameters depends on the probability distribution model, and the results of different distribution models often differ. It is therefore very important to determine which distribution model is more stable and reasonable when extrapolating the recurrence level of the studied sea area. In this paper, we constructed an evaluation method for the overall uncertainty of the calculation results and a measurement of the uncertainty of the design-parameter derivation model by incorporating the influence of sample information, such as sample size, degree of dispersion, and sampling error, on the model's information entropy. Results show that the sample data size and the degree of dispersion are directly proportional to the information entropy. Within the same group of data, the maximum entropy distribution model has the lowest overall uncertainty, while the Gumbel distribution model has the largest. In other words, the maximum entropy distribution model is well suited to the accurate calculation of marine environmental design parameters.
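The paper's uncertainty measure folds sample size, dispersion, and sampling error into the model's information entropy; those ingredients are not specified in the abstract. What can be sketched safely is the distributional building block: closed-form differential entropies of the candidate extreme value models, here the Gumbel and the general GEV case:

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_entropy(beta):
    """Differential entropy of a Gumbel distribution with scale beta:
    ln(beta) + gamma + 1."""
    return math.log(beta) + EULER_GAMMA + 1.0

def gev_entropy(sigma, xi):
    """Differential entropy of a GEV distribution with scale sigma and
    shape xi: ln(sigma) + gamma*xi + gamma + 1 (reduces to Gumbel at xi=0)."""
    return math.log(sigma) + EULER_GAMMA * xi + EULER_GAMMA + 1.0
```

Comparing such entropy terms across fitted models is only one component of the overall uncertainty evaluation described above; the sample-information weighting is the paper's own contribution.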
Cyber losses, measured as the number of records breached in cyber incidents, commonly feature a significant portion of zeros and distinct characteristics for mid-range and large losses, which make it hard to model the whole range of losses with a standard loss distribution. We tackle this modeling problem by proposing a three-component spliced regression model that simultaneously models zero, moderate, and large losses and accommodates heterogeneous effects in the mixture components. To apply our proposed model to the Privacy Rights Clearinghouse (PRC) data breach chronology, we segment geographical groups using unsupervised cluster analysis, and use a covariate-dependent probability to model zero losses, finite mixture distributions for the moderate body, and an extreme value distribution for large losses, capturing the heavy-tailed nature of the loss data. Parameters and coefficients are estimated with the Expectation-Maximization (EM) algorithm. Combined with our frequency model for data breaches (a generalized linear mixed model), aggregate loss distributions are investigated, and applications to cyber insurance pricing and risk management are discussed.
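The three-component splice can be written down directly as a CDF: a point mass at zero, a body distribution truncated to (0, u], and a GPD tail above the threshold u. The sketch below uses a single lognormal for the body purely for brevity; the paper uses finite mixtures there, and all parameter values in the test are hypothetical:

```python
import math

def _lognorm_cdf(x, mu, s):
    """Lognormal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (s * math.sqrt(2.0))))

def spliced_cdf(x, p0, p_body, mu, s, u, xi, sigma):
    """CDF of a three-component spliced loss model: point mass p0 at zero,
    a lognormal body truncated to (0, u], and a GPD tail above u.
    Component weights p0, p_body, and 1 - p0 - p_body sum to one."""
    if x < 0.0:
        return 0.0
    if x == 0.0:
        return p0                         # the zero-loss component
    if x <= u:
        return p0 + p_body * _lognorm_cdf(x, mu, s) / _lognorm_cdf(u, mu, s)
    p_tail = 1.0 - p0 - p_body
    t = 1.0 + xi * (x - u) / sigma
    return p0 + p_body + p_tail * (1.0 - t ** (-1.0 / xi))
```

The splice is continuous at the threshold by construction (the body CDF reaches exactly p0 + p_body at u), which is what lets the EM algorithm treat the component memberships as latent variables.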
This review paper discusses advances in statistical inference for modeling extreme observations from multiple sources and heterogeneous populations. The paper starts by briefly reviewing classical univariate and multivariate extreme value theory, tail equivalence, and tail (in)dependence. New extreme value theory for heterogeneous populations is then introduced. Time series models for maxima and extreme observations are the focus of the review. These models naturally form a new system with similar structures and can serve as alternatives to the widely used ARMA and GARCH models. Applications of these time series models arise in many fields; the paper discusses two important ones: systematic risks and extreme co-movements/large-scale contagions.
In this paper, to obtain a consistent estimator of the number of communities, the authors present a new sequential testing procedure based on the locally smoothed adjacency matrix and extreme value theory. Under the null hypothesis, the test statistic converges to the type I extreme value distribution; otherwise, it explodes fast, and the divergence rate can even reach n (the size of the network) in the strong-signal case, guaranteeing high detection power. The method is simple to use and serves as an alternative to the approach of Lei (2016) based on random matrix theory. To detect changes in community structure, the authors also propose a two-sample test for the stochastic block model with two observed adjacency matrices. Simulation studies support the theory. The authors apply the proposed method to the political blog data set and find reasonable group structures.
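Because the null law of the statistic is type I extreme value (Gumbel), the rejection rule reduces to comparing the statistic with a Gumbel quantile. A sketch, assuming the statistic is standardized to the unit Gumbel (the paper's centering and scaling constants are not given in the abstract):

```python
import math

def gumbel_quantile(p, mu=0.0, beta=1.0):
    """Quantile of the type I extreme value (Gumbel) distribution:
    mu - beta * log(-log(p))."""
    return mu - beta * math.log(-math.log(p))

def reject_null(stat, alpha=0.05):
    """One-sided test: reject H0 (current community count is sufficient)
    if the statistic exceeds the Gumbel 1 - alpha quantile."""
    return stat > gumbel_quantile(1.0 - alpha)
```

In the sequential procedure, such a test is run for candidate community counts k = 1, 2, ... and stops at the first non-rejection; under strong signal the statistic diverges, so the test keeps rejecting until k reaches the true count.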
In this paper, a new type of distribution, the multivariate compound extreme value distribution (MCEVD), is introduced by compounding a discrete distribution with a multivariate continuous distribution of extreme sea events. In its engineering application, the number of events exceeding a certain threshold level per year is fitted to a Poisson distribution, and the corresponding extreme sea events are fitted to a nested logistic distribution; the resulting Poisson-nested-logistic trivariate compound extreme value distribution (PNLTCED) is proposed to predict extreme wave heights, periods, and wind speeds in the Yellow Sea. The new model gives more stable and reasonable predictions.
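The compounding step has a simple univariate skeleton: if the number of over-threshold storms per year is Poisson(λ) and each storm magnitude has CDF F, the annual maximum satisfies P(max ≤ x) = exp(-λ(1 - F(x))). The MCEVD replaces F with a multivariate (nested logistic) distribution; the sketch below shows only the univariate skeleton:

```python
import math

def compound_annual_max_cdf(F_x, lam):
    """P(annual maximum <= x) when storm counts are Poisson(lam) and each
    storm magnitude has CDF value F_x = F(x).  Summing the Poisson series
    sum_k e^{-lam} lam^k / k! * F_x**k gives exp(-lam * (1 - F_x))."""
    return math.exp(-lam * (1.0 - F_x))
```

Note the k = 0 term: a year with no storms contributes probability e^{-λ} regardless of x, which is why the compound CDF never drops below e^{-λ}.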
This study developed a hierarchical Bayesian (HB) model for local and regional flood frequency analysis in the Dongting Lake Basin, China. The annual maximum daily flows from 15 streamflow-gauged sites in the study area were analyzed with the HB model. The generalized extreme value (GEV) distribution was selected as the extreme flood distribution, and the GEV location and scale parameters were spatially modeled through a regression approach with the drainage area as a covariate. The Markov chain Monte Carlo (MCMC) method with Gibbs sampling was employed to compute the posterior distribution of the HB model. The results showed that the proposed HB model provided satisfactory Bayesian credible intervals for flood quantiles, while the traditional delta method could not provide reliable uncertainty estimates for large flood quantiles because the lower confidence bounds tended to decrease as the return periods increased. Furthermore, the HB model for regional analysis allowed some restrictive assumptions of the traditional index flood method to be relaxed, such as the homogeneous-region assumption and the scale-invariance assumption. The HB model can also provide an uncertainty band for flood quantile prediction at a poorly gauged or ungauged site, which the index flood method with L-moments does not provide directly. Therefore, the HB model is an effective way to implement a flexible local and regional frequency analysis scheme and to quantify the associated predictive uncertainty.
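The regression-on-drainage-area idea can be sketched at the quantile level: model the GEV location as a function of area, then read off the T-year flood. The log-linear link and the single shared shape below are illustrative assumptions; the abstract says only that location and scale are regressed on drainage area, and the hierarchical posterior is of course richer than this point sketch:

```python
import math

def gev_quantile(p, mu, sigma, xi):
    """GEV inverse CDF at probability p: mu + (sigma/xi) * ((-log p)**(-xi) - 1)."""
    y = -math.log(p)
    if abs(xi) < 1e-12:
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

def regional_flood_quantile(area, T, a, b, sigma, xi):
    """T-year flood quantile at a site whose GEV location is a + b*log(area).
    All coefficient values passed in are hypothetical."""
    mu = a + b * math.log(area)
    return gev_quantile(1.0 - 1.0 / T, mu, sigma, xi)
```

Because the covariate link carries information across sites, the same formula yields a quantile estimate at an ungauged site from its drainage area alone, which is the regional prediction described above; the HB posterior then puts an uncertainty band around it.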
The output of 25 models used in the Coupled Model Intercomparison Project phase 3 (CMIP3) was evaluated, with a focus on summer precipitation in eastern China over the last 40 years of the 20th century. Most models failed to reproduce the rainfall associated with the East Asian summer monsoon (EASM), and hence the seasonal cycle in eastern China, but provided reasonable results in Southwest (SW) and Northeast China (NE). The simulations produced reasonable results for the Yangtze-Huai (YH) Basin area, although the Meiyu phenomenon was generally underestimated. One typical regional phenomenon, the seasonal northward shift of the rain belt from early to late summer, was completely missed by most models. The long-term climate trends in rainfall over eastern China were largely underestimated, and the observed geographical pattern of rainfall changes was not reproduced by most models. Precipitation extremes were evaluated via the parameters of fitted generalized extreme value (GEV) distributions. The annual extremes were grossly underestimated in the monsoon-dominated YH and SW regions, but reasonable values were obtained for the North China (NC) and NE regions. These results suggest a general failure to capture the dynamics of the EASM in current coupled climate models. Nonetheless, models with higher resolution tend to reproduce larger decadal trends and annual extremes of precipitation in the regions studied.
Classification of intertidal areas in synthetic aperture radar (SAR) images is an important yet challenging task, given the complicated and dramatically changing features caused by tidal fluctuation. The difficulty is compounded because a high proportion of this area is frequently flooded by water, making statistical modeling methods with spatial contextual information often ineffective. Because polarimetric entropy and anisotropy play significant roles in characterizing intertidal areas, in this paper we propose a novel unsupervised contextual classification algorithm. The key point of the method is to combine a generalized extreme value (GEV) statistical model of the polarization features with a Markov random field (MRF) for contextual smoothing. A goodness-of-fit test is added to determine the significance of the components of the statistical model. The final classification results are obtained by effectively combining the results for polarimetric entropy and anisotropy. Experimental results on polarimetric data acquired by the Chinese Gaofen-3 SAR satellite demonstrate the feasibility and superiority of the proposed classification algorithm.
Most left ventricular (LV) Doppler measurements vary significantly with age and gender, making it necessary to correct them for physiological variances. We aimed to verify the hypothesis that different Doppler measurements correlate nonlinearly with different biometric variables raised to different scaling factors and exponents. In this work, a total of 23 LV Doppler parameters were measured in 1224 healthy Chinese adults. An optimized multivariable allometric model (OMAM) and scaling equations were developed in 70% of the subjects (group A), and the reliability of the model and equations was verified using the remaining 30% of the subjects (group B) as well as 183 overweight subjects (group C). The single-variable isometric model (SVIM) with body surface area (BSA) as the scaling variable was used for comparison. Before correction, all 23 LV Doppler parameters correlated significantly with one or more of the biometric variables. In group B, gender differences were found in 47.8% (11/23) of the parameters and were eliminated in 81.8% (9/11) of them after correction with OMAM. The successful correction rate with OMAM was 100% (23/23) in group B and 82.6% (19/23) in group C. New reference values for corrected Doppler measurements independent of biometric variables were established. The SVIM with BSA corrected none of the 23 parameters successfully. In conclusion, different LV Doppler parameters correlated allometrically with one or more of the biometric variables. The novel OMAM developed in this study successfully corrected the effects of physiological variances of most biometric variables on Doppler measurements in healthy and overweight subjects, and was found to be far superior to the SVIM. However, whether the OMAM equations can be applied to other ethnicities, obese subjects, and pathological conditions requires further investigation.
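An allometric model relates a measurement to a biometric variable through a power law, y = a·x^b, fitted by linear regression on logarithms. The single-covariate sketch below is a simplification of the multivariable OMAM (which uses several biometric variables at once); the helper names and test values are illustrative:

```python
import math

def fit_power_law(xs, ys):
    """Least-squares fit of y = a * x**b via linear regression on logs.
    Returns (a, b); a single-covariate sketch of an allometric model."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(xs)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) \
        / sum((u - mx) ** 2 for u in lx)
    a = math.exp(my - b * mx)
    return a, b

def corrected_value(y, x, b):
    """Scale away the biometric effect: the corrected index y / x**b
    is independent of x under the fitted power law."""
    return y / x ** b
```

The isometric SVIM corresponds to forcing b = 1 with BSA as x; allowing the exponent to be estimated per parameter is what makes the allometric correction succeed where the isometric one fails.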
Sticky Brownian motions can be viewed as time-changed semimartingale reflecting Brownian motions, which find applications in many areas including queueing theory and mathematical finance. In this paper, we focus on stationary distributions of sticky Brownian motions. The main results obtained here include tail asymptotic properties of the marginal and joint distributions. The kernel method, the copula concept, and extreme value theory are the main tools used in our analysis.
In this paper, drawing on real estate economics and macro-control theory, combined with the characteristics of the real estate market, macro-control of the real estate market is studied. After presenting a dynamic model of three-dimensional nonlinear differential equations based on the total number of houses held by real estate businesses, the government's average housing investment funds, and the standard price, the stability conditions of the equilibrium point of this model are systematically established. Moreover, through extreme value analysis of the model, principles for government investment in real estate construction are derived, and the construction base of real estate businesses is estimated. This provides a theoretical basis for government macro-control policy-making.
Funding (sand wave study): financially supported by the National Natural Science Foundation of China (Grant Nos. 51579232 and 51890913) and the Open Funding of the State Key Laboratory of Hydraulic Engineering Simulation and Safety (Grant No. HESS-1712).
Funding (marine environmental design parameter study): supported by the National Natural Science Foundation of China (Nos. 52071306 and 51379195), the Natural Science Foundation of Shandong Province (No. ZR2019MEE050), and the Graduate Education Foundation (No. HDYA19006).
Funding (review paper): partially supported by NSF-DMS-1505367 and NSF-DMS-2012298.
Funding (community-detection study): supported by the National Natural Science Foundation of China under Grant No. 71971118 and by the Major Natural Science Projects of Universities in Jiangsu Province under Grant No. 20KJA520002.
Funding: This work was supported by the National Natural Science Foundation of China (Grant No. 50379051).
Abstract: In this paper, a new type of distribution, the multivariate compound extreme value distribution (MCEVD), is introduced by compounding a discrete distribution with a multivariate continuous distribution of extreme sea events. In the engineering application, the number of events exceeding a certain threshold per year is fitted to a Poisson distribution, and the corresponding extreme sea events are fitted to a Nested Logistic distribution; the resulting Poisson-Nested Logistic trivariate compound extreme value distribution (PNLTCED) is then proposed to predict extreme wave heights, periods, and wind speeds in the Yellow Sea. The new model gives more stable and reasonable predictions.
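The compounding idea has a simple univariate analogue that can be computed directly: if the number of storms per year is Poisson(λ) and event magnitudes are i.i.d. with CDF F, the annual-maximum CDF is Σₙ P(N = n) F(x)ⁿ = exp(-λ(1 - F(x))). The sketch below uses a Weibull event distribution and a mean of 6 storms per year purely as hypothetical inputs; the paper's trivariate Nested Logistic construction is not attempted here.

```python
import math
from scipy import stats

def compound_annual_max_cdf(x, lam, event_dist):
    """CDF of the annual maximum when the number of events per year
    is Poisson(lam) and event magnitudes are i.i.d. with CDF
    event_dist.cdf: sum_n P(N=n) F(x)^n = exp(-lam * (1 - F(x)))."""
    return math.exp(-lam * (1.0 - event_dist.cdf(x)))

def return_level(T, lam, event_dist):
    """Magnitude exceeded on average once per T years: solve
    exp(-lam * (1 - F(x))) = 1 - 1/T for x."""
    target = 1.0 + math.log(1.0 - 1.0 / T) / lam
    return event_dist.ppf(target)

# Illustrative: storm wave heights ~ Weibull, 6 storms/year on average.
waves = stats.weibull_min(c=1.8, scale=4.0)
p10m = compound_annual_max_cdf(10.0, lam=6.0, event_dist=waves)
x100 = return_level(100.0, lam=6.0, event_dist=waves)
```

The closed form makes the compound CDF cheap to evaluate, and return levels follow by inverting it through the event distribution's quantile function.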
Funding: Supported by the National Natural Science Foundation of China (Grants No. 51779074 and 41371052), the Special Fund for the Public Welfare Industry of the Ministry of Water Resources of China (Grant No. 201501059), the National Key Research and Development Program of China (Grant No. 2017YFC0404304), the Jiangsu Water Conservancy Science and Technology Project (Grant No. 2017027), the Program for Outstanding Young Talents in Colleges and Universities of Anhui Province (Grant No. gxyq2018143), and the Natural Science Foundation of Wanjiang University of Technology (Grant No. WG18030).
Abstract: This study developed a hierarchical Bayesian (HB) model for local and regional flood frequency analysis in the Dongting Lake Basin, China. The annual maximum daily flows from 15 streamflow-gauged sites in the study area were analyzed with the HB model. The generalized extreme value (GEV) distribution was selected as the extreme flood distribution, and its location and scale parameters were spatially modeled through a regression approach with drainage area as a covariate. The Markov chain Monte Carlo (MCMC) method with Gibbs sampling was employed to compute the posterior distribution in the HB model. The results showed that the proposed HB model provided satisfactory Bayesian credible intervals for flood quantiles, while the traditional delta method could not provide reliable uncertainty estimates for large flood quantiles, because its lower confidence bounds tended to decrease as the return period increased. Furthermore, the HB model for regional analysis allowed some restrictive assumptions of the traditional index flood method, such as the homogeneous-region assumption and the scale-invariance assumption, to be relaxed. The HB model can also provide an uncertainty band for flood quantile prediction at a poorly gauged or ungauged site, which the index flood method with L-moments cannot provide directly. The HB model is therefore an effective way to implement a flexible local and regional frequency analysis scheme and to quantify the associated predictive uncertainty.
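The GEV building block of such an analysis is easy to demonstrate, though the sketch below is deliberately simpler than the paper's model: it fits a single site by maximum likelihood on synthetic annual maxima rather than placing priors and sampling with MCMC, and all numbers are invented. Note that scipy's shape parameter `c` equals minus the shape parameter ξ of the usual hydrological convention.

```python
import numpy as np
from scipy import stats

# Synthetic annual-maximum flows from a known GEV (hypothetical values;
# scipy's c = -xi relative to the common hydrological parameterization).
true_c, true_loc, true_scale = -0.1, 1000.0, 300.0
ams = stats.genextreme.rvs(true_c, loc=true_loc, scale=true_scale,
                           size=80, random_state=np.random.default_rng(42))

# Maximum-likelihood point estimates; the HB model of the paper would
# instead place priors on these parameters and sample the posterior.
c_hat, loc_hat, scale_hat = stats.genextreme.fit(ams)

def flood_quantile(T, c, loc, scale):
    """Flood quantile with return period T years: the (1 - 1/T)
    quantile of the fitted GEV distribution."""
    return stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

q100 = flood_quantile(100.0, c_hat, loc_hat, scale_hat)
```

In a Bayesian treatment, evaluating `flood_quantile` at each posterior draw of (c, loc, scale) yields the credible band for the 100-year flood that the delta method struggles to provide.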
Funding: Supported by the National Basic Research Program of China (2009CB421401 and 2006CB400503) and the Chinese Meteorological Administration Program (GYHY200706001).
Abstract: The output of 25 models used in the Coupled Model Intercomparison Project phase 3 (CMIP3) was evaluated, with a focus on summer precipitation in eastern China over the last 40 years of the 20th century. Most models failed to reproduce the rainfall associated with the East Asian summer monsoon (EASM), and hence the seasonal cycle in eastern China, but gave reasonable results for Southwest (SW) and Northeast China (NE). The simulations were reasonable for the Yangtze-Huai (YH) Basin area, although the Meiyu phenomenon was generally underestimated. One typical regional phenomenon, the seasonal northward shift of the rain belt from early to late summer, was completely missed by most models. The long-term climate trends in rainfall over eastern China were largely underestimated, and the observed geographical pattern of rainfall changes was not reproduced by most models. Precipitation extremes were evaluated via the parameters of fitted GEV (generalized extreme value) distributions. The annual extremes were grossly underestimated in the monsoon-dominated YH and SW regions, but reasonable values were obtained for the North China (NC) and NE regions. These results suggest a general failure to capture the dynamics of the EASM in current coupled climate models. Nonetheless, models with higher resolution tend to reproduce larger decadal trends and annual extremes of precipitation in the regions studied.
Funding: Project supported by the National Natural Science Foundation of China (No. 61331017).
Abstract: Classification of intertidal areas in synthetic aperture radar (SAR) images is an important yet challenging problem, given the complicated and dramatically changing features caused by tidal fluctuation. The difficulty is compounded because a high proportion of this area is frequently flooded by water, which often renders statistical modeling methods with spatial contextual information ineffective. Because polarimetric entropy and anisotropy play significant roles in characterizing intertidal areas, this paper proposes a novel unsupervised contextual classification algorithm. The key idea is to combine a generalized extreme value (GEV) statistical model of the polarization features with a Markov random field (MRF) for contextual smoothing. A goodness-of-fit test is added to determine the significance of the components of the statistical model. The final classification is obtained by effectively combining the results for polarimetric entropy and anisotropy. Experimental results on polarimetric data acquired by the Chinese Gaofen-3 SAR satellite demonstrate the feasibility and superiority of the proposed algorithm.
Funding: Supported by the Program of Introducing Talents of Discipline to Universities (BP 0719033), the State Key Program of the National Natural Science Foundation of China (82030051), the International Collaboration and Exchange Program of China (81920108003), the National Natural Science Foundation of China (81671703, 81770442, and 11771408), the Qingdao Key Health Discipline Development Fund (3311000000073), the People's Livelihood Science and Technology Project of Qingdao (18-6-1-62-nsh), and the Fundamental Research Funds for the Central Universities (201964006).
Abstract: Most left ventricular (LV) Doppler measurements vary significantly with age and gender, making it necessary to correct them for physiological variation. We aimed to verify the hypothesis that different Doppler measurements correlate nonlinearly with different biometric variables raised to different exponents. In this work, a total of 23 LV Doppler parameters were measured in 1224 healthy Chinese adults. An optimized multivariable allometric model (OMAM) and scaling equations were developed in 70% of the subjects (group A), and the reliability of the model and equations was verified using the remaining 30% of the subjects (group B) as well as 183 overweight subjects (group C). The single-variable isometric model (SVIM) with body surface area (BSA) as the scaling variable was used for comparison. Before correction, all 23 LV Doppler parameters correlated significantly with one or more of the biometric variables. In group B, gender differences were found in 47.8% (11/23) of the parameters and were eliminated in 81.8% (9/11) of them after correction with the OMAM. The successful correction rate of the OMAM was 100% (23/23) in group B and 82.6% (19/23) in group C. New reference values for corrected Doppler measurements, independent of biometric variables, were established. The SVIM with BSA corrected none of the 23 parameters successfully. In conclusion, different LV Doppler parameters correlate allometrically with one or more biometric variables. The novel OMAM developed in this study successfully corrected the effects of physiological variation in most biometric variables on Doppler measurements in healthy and overweight subjects, and was far superior to the SVIM. However, whether the OMAM equations can be applied to other ethnicities, obese subjects, and pathological conditions requires further investigation.
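The core of a multivariable allometric model y = a·x₁^b₁·x₂^b₂ becomes linear after taking logarithms, so the exponents can be fitted by least squares. The sketch below is illustrative only: the variable names, sample sizes, and exponent values are hypothetical and are not the paper's OMAM coefficients.

```python
import numpy as np

# Simulated biometric data and a measurement obeying a hypothetical
# allometric law y = 2.0 * height^0.8 * weight^0.3 with lognormal noise.
rng = np.random.default_rng(0)
n = 500
height = rng.uniform(1.5, 1.9, n)       # m
weight = rng.uniform(50.0, 90.0, n)     # kg
y = 2.0 * height**0.8 * weight**0.3 * rng.lognormal(0.0, 0.05, n)

# log y = log a + b1 log x1 + b2 log x2: an ordinary linear regression.
X = np.column_stack([np.ones(n), np.log(height), np.log(weight)])
coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
a_hat, b1_hat, b2_hat = np.exp(coef[0]), coef[1], coef[2]

# Correction divides out the fitted biometric trend, leaving a value
# (approximately) independent of the biometric variables.
y_corrected = y / (height**b1_hat * weight**b2_hat)
```

After correction, the residual correlation between the scaled measurement and each biometric variable should be near zero, which is the criterion behind the "successful correction rate" reported above.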
Funding: Supported by the Shandong Provincial Natural Science Foundation of China (Grant No. ZR2019MA035), the Natural Sciences and Engineering Research Council (NSERC) of Canada, and the China Scholarship Council (Grant No. 201708370006).
Abstract: Sticky Brownian motions can be viewed as time-changed semimartingale reflecting Brownian motions, which find applications in many areas including queueing theory and mathematical finance. In this paper, we focus on stationary distributions of sticky Brownian motions. The main results include tail asymptotic properties of the marginal distributions and of the joint distributions. The kernel method, the copula concept, and extreme value theory are the main tools in our analysis.
Abstract: In this paper, drawing on real estate economics and macro-control theory, and taking into account the characteristics of the real estate market, macro-control of the real estate market is studied. A dynamic model of three-dimensional nonlinear differential equations is given, whose state variables are the total housing stock of real estate developers, the government's average housing investment funds, and the standard price; the stability conditions of the equilibrium point of this model are then established systematically. Moreover, through extreme value analysis of the model, principles for the government's investment of funds in real estate construction are derived, and the construction base of real estate developers is estimated. This provides a theoretical basis for government macro-control policy-making.
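The equilibrium-stability step described above is mechanical once the vector field is specified: linearize at the equilibrium and check that all Jacobian eigenvalues have negative real parts. The abstract does not give the model's equations, so the three-dimensional system below is a hypothetical stand-in with the same state variables (housing stock, government investment, standard price); only the linearization procedure is the point.

```python
import numpy as np

def jacobian(f, x, eps=1e-6):
    """Numerical Jacobian of f: R^n -> R^n at x, by central differences."""
    n = len(x)
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (f(x + e) - f(x - e)) / (2 * eps)
    return J

def f(x):
    """Hypothetical 3D dynamics (NOT the paper's model): h = housing
    stock, g = government investment funds, p = standard price."""
    h, g, p = x
    return np.array([
        0.5 * g - 0.1 * h,                    # stock grows with investment
        0.02 * p - 0.3 * g,                   # investment tracks price
        -0.2 * p + 0.05 * h - 0.001 * h * p,  # price feedback, nonlinear term
    ])

x_star = np.zeros(3)            # an equilibrium: f(0) = 0
eigs = np.linalg.eigvals(jacobian(f, x_star))
stable = bool(np.all(eigs.real < 0))
```

For this stand-in system the equilibrium at the origin is locally asymptotically stable; the paper's stability conditions amount to the analogous Routh-Hurwitz inequalities on its own Jacobian.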