The study of extreme weather and space events has gained paramount importance in modern society owing to rapid advances in high technology. Understanding and describing exceptional occurrences plays a crucial role in making decisive assessments of their potential impact on technical, economic, and social aspects in various fields. This research focuses on analyzing the hourly values of the auroral electrojet (AE) geomagnetic index from 1957 to 2019 using the peak-over-threshold method of extreme value theory. By fitting the generalized Pareto distribution to extreme AE values, shape parameter estimates were derived; their negative values establish an upper bound for this time series. Consequently, it became evident that the AE values had reached a plateau, suggesting that extreme events exceeding the established upper limit are rare. As a result, although diligent precautions to mitigate the consequences of such extreme events remain necessary, surpassing the upper limit of AE values becomes increasingly challenging. An aurora can also be observed in the middle- and low-latitude regions during the maximum period of the AE index.
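As a sketch of the peak-over-threshold workflow this abstract describes, the following fits a generalized Pareto distribution to threshold excesses with scipy. The series is synthetic stand-in data; the real AE index values and the paper's fitted parameters are not reproduced here.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# Synthetic stand-in for an hourly index series; the real AE data are not reproduced here.
series = rng.gumbel(loc=200.0, scale=120.0, size=50_000)

# Peak over threshold: keep only the excesses above a high quantile.
threshold = np.quantile(series, 0.99)
excesses = series[series > threshold] - threshold

# Fit the generalized Pareto distribution to the excesses (location fixed at 0).
shape, _, scale = genpareto.fit(excesses, floc=0.0)
print(f"threshold={threshold:.1f}  shape={shape:.3f}  scale={scale:.1f}")

# A negative shape parameter implies a finite upper endpoint: threshold - scale/shape.
if shape < 0:
    print("implied upper bound:", threshold - scale / shape)
```

With real data, the threshold choice (here simply the 99th percentile) would be checked with mean-residual-life and parameter-stability plots before trusting the fit.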
We predicted the extreme values of the ENSO index, the Niño3.4 index, and the Southern Oscillation Index (SOI) using extreme value theory. Various diagnostic plots for assessing the accuracy of the Generalized Pareto (GP) model fitted to the Niño3.4 index and SOI are shown, and all four diagnostic plots support the fitted GP model. Because the shape parameter of the Niño3.4 was negative, the Niño3.4 index had a finite upper limit. In contrast, that of the SOI was zero; therefore the SOI did not have a finite upper limit, and there is a possibility that a significant risk will occur. We predicted the maximum return level for the return periods of 10, 20, 50, 100, 350, and 500 years and their respective 95% confidence intervals (CI). The 10-year and 100-year return levels for Niño3.4 were estimated to be 2.41 and 2.62, with 95% CI [2.22, 2.59] and [2.58, 2.66], respectively. The Niño3.4 index was 2.65 in the 2015/16 super El Niño, a phenomenon that occurs once every 500 years. The Niño3.4 index was 2.51 in the 1982/83 and 1997/98 super El Niños, a phenomenon that occurs once every 20 years. Recently, a large super El Niño event with a small probability of occurrence has occurred. As global warming proceeds, super El Niño events are becoming more likely to occur.
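Return levels like those quoted above follow from the standard GPD return-level formula; a minimal sketch, with illustrative parameter values rather than the paper's fitted ones:

```python
import numpy as np

def gp_return_level(threshold, shape, scale, exceed_rate, period, obs_per_year):
    """Return level for a `period`-year return period under a fitted GPD.

    exceed_rate is the fraction of observations above the threshold;
    the standard formula is u + (sigma/xi) * ((m * zeta_u)**xi - 1),
    with m = period * obs_per_year observations in the return period.
    """
    m = period * obs_per_year
    if abs(shape) < 1e-9:  # xi == 0 reduces to the exponential-tail form
        return threshold + scale * np.log(m * exceed_rate)
    return threshold + (scale / shape) * ((m * exceed_rate) ** shape - 1.0)

# Illustrative parameter values only, not the fitted Nino3.4/SOI models.
for T in (10, 100):
    print(T, gp_return_level(threshold=2.0, shape=-0.2, scale=0.3,
                             exceed_rate=0.05, period=T, obs_per_year=12))
```

A negative shape makes the return levels flatten toward a finite limit as the period grows, which is exactly why the Niño3.4 levels above increase so slowly from 10 to 100 years.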
We performed a multifractal analysis using wavelet transform to detect the changes in the fractality of the USD/JPY and EUR/JPY exchange rates, and predicted their extreme values using extreme value theory. After the 1997 Asian financial crisis, the USD/JPY and EUR/JPY became multifractal, then the USD/JPY became monofractal and stable, and yen depreciation was observed. However, the EUR/JPY became multifractal and unstable, and a strong yen depreciation was observed. The coherence between the USD/JPY and EUR/JPY was strong between 1995 and 2000. After the 2007-2008 financial crisis, the USD/JPY became monofractal and stable, and yen appreciation was observed. However, the EUR/JPY became multifractal and unstable, and strong yen appreciation was observed. Various diagnostic plots for assessing the accuracy of the GP model fitted to USD/JPY and EUR/JPY are shown, and all the diagnostic plots support the fitted GP model. The shape parameters of USD/JPY and EUR/JPY were close to zero, therefore the USD/JPY and EUR/JPY did not have finite upper limits. We predicted the maximum return level for the return periods of 10, 20, 50, 100, 350, and 500 years and their respective 95% confidence intervals (CI). As a result, the 10-year and 100-year return levels for USD/JPY were estimated to be 149.6 and 164.8, with 95% CI [143.2, 156.0] and [149.4, 180.1], respectively.
This paper investigates methods of value-at-risk (VaR) estimation using extreme value theory (EVT). It compares two different estimation methods, the 'two-step subsample bootstrap' based on moment estimation and maximum likelihood estimation (MLE), according to their theoretical bases and computation procedures. Then, the estimation results are analyzed together with those of the normal method and the empirical method. The empirical research on foreign exchange data shows that the EVT methods perform well in estimating VaR under extreme conditions, and the 'two-step subsample bootstrap' method is preferable to MLE.
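The MLE branch of such a comparison can be sketched as follows; the two-step subsample bootstrap is omitted, and the data, threshold choice, and confidence levels here are illustrative, not the paper's.

```python
import numpy as np
from scipy.stats import genpareto

def evt_var(losses, threshold, p):
    """Value-at-risk at confidence level p from a GPD fitted by MLE to the
    excesses over `threshold`. Tail formula:
    VaR_p = u + (sigma/xi) * (((1 - p) / zeta_u)**(-xi) - 1),
    where zeta_u is the fraction of observations above the threshold."""
    excess = losses[losses > threshold] - threshold
    shape, _, scale = genpareto.fit(excess, floc=0.0)
    tail_frac = excess.size / losses.size
    return threshold + (scale / shape) * (((1.0 - p) / tail_frac) ** (-shape) - 1.0)

rng = np.random.default_rng(1)
losses = np.abs(rng.standard_t(df=4, size=20_000))  # synthetic heavy-tailed losses
u = np.quantile(losses, 0.95)
print(evt_var(losses, u, 0.99), evt_var(losses, u, 0.999))
```

The formula is only valid for confidence levels beyond the threshold quantile (here p > 0.95), which is precisely the "extreme conditions" regime where EVT outperforms the normal and empirical methods.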
One of the most important and interesting issues associated with earthquakes is the long-term trend of extreme events. Extreme value theory provides methods for analysis of the most extreme parts of data. We estimated the annual maximum magnitude of earthquakes in Japan by extreme value theory using earthquake data between 1900 and 2019. The generalized extreme value (GEV) distribution was applied to fit the extreme indices. The distribution was used to estimate the probability of extreme values in specified time periods. The various diagnostic plots for assessing the accuracy of the GEV model fitted to the maximum earthquake magnitude data in Japan supported the validity of the GEV model. The extreme value index ξ was evaluated as −0.163, with a 95% confidence interval of [−0.260, −0.0174] by the use of profile likelihood. Hence, the annual maximum magnitude of earthquakes has a finite upper limit. We obtained the maximum return level for the return periods of 10, 20, 50, 100 and 500 years along with their respective 95% confidence intervals. Further, to get a more accurate confidence interval, we estimated the profile log-likelihood. The return level estimates were 7.83, 8.60 and 8.99, with 95% confidence intervals of [7.67, 8.06], [8.32, 9.21] and [8.61, 10.0] for the 10-, 100- and 500-year return periods, respectively. Hence, the 2011 off the Pacific coast of Tohoku Earthquake, which was the largest in the observation history of Japan with a magnitude of 9.0, was a phenomenon that occurs once every 500 years.
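A minimal sketch of the block-maxima workflow described above, fitting a GEV to synthetic annual maxima with scipy; note that scipy's shape parameter c is the negative of the EVT index ξ, and the data here are a stand-in, not the Japanese earthquake catalogue.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)
# Synthetic stand-in for 120 annual maximum magnitudes.
annual_max = rng.gumbel(loc=7.0, scale=0.4, size=120)

# scipy's shape parameter c is the negative of the usual EVT index xi.
c, loc, scale = genextreme.fit(annual_max)
xi = -c
print(f"xi = {xi:.3f}")

# The T-year return level is the (1 - 1/T) quantile of the fitted GEV.
levels = {T: genextreme.ppf(1.0 - 1.0 / T, c, loc, scale) for T in (10, 100, 500)}
print(levels)
```

The profile-likelihood intervals quoted in the abstract are asymmetric, unlike the symmetric delta-method intervals a plain `fit` would suggest, which is why the paper computes them separately.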
A new approach to evaluate the extreme value distribution (EVD) of the response and reliability of general multi-DOF nonlinear stochastic structures is proposed. The approach is based on the recently developed probability density evolution method, which enables the instantaneous probability density functions of the stochastic responses to be captured. In the proposed method, a virtual stochastic process is first constructed to satisfy the condition that the extreme value of the response equals the value of the constructed process at a certain instant of time. The probability density evolution method is then applied to evaluate the instantaneous probability density function of the response, yielding the EVD. The reliability is therefore available through a simple integration over the safe domain. A numerical algorithm is developed using the Number Theoretical Method to select the discretized representative points. Further, a hyper-ball is imposed to sieve the points from the preceding point set in the hypercube. In the numerical examples, the EVD of random variables is evaluated and compared with the analytical solution. A frame structure is analyzed to capture the EVD of the response and the dynamic reliability. The investigations indicate that the proposed approach provides reasonable accuracy and efficiency.
Extreme value theory provides methods to analyze the most extreme parts of data. We predicted the ultimate 100 m dash records for men and women for specific periods using the generalized extreme value (GEV) distribution. The various diagnostic plots, which assessed the accuracy of the GEV model, were well fitted to the 100 m records in the world and Japan, validating the model. The men's world record had a shape parameter of -0.250 with a 95% confidence interval of [-0.391, -0.109]. The 100 m record had a finite limit, and the calculated upper limit was 9.46 s. The return level estimates for the men's world record were 9.74, 9.62, and 9.58 s with 95% confidence intervals of [9.69, 9.79], [9.54, 9.69], and [9.48, 9.67] for 10-, 100- and 350-year return periods, respectively. In one year, the probability of occurrence of a new men's world record, 9.58 s (Usain Bolt), was 1/350, while that for women, 10.49 s (Florence Griffith-Joyner), was about 1/100, confirming that it is more difficult for men to break records than for women.
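An occurrence probability like the 1/350 above is a one-year exceedance probability under the fitted GEV. A sketch with made-up parameter values (not the paper's fit), modelling negated times so that the annual best is a maximum:

```python
from scipy.stats import genextreme

# Made-up GEV parameters for the negated annual-best 100 m time (larger = faster),
# chosen only to illustrate the calculation; c = -xi in scipy's convention.
xi, mu, sigma = -0.250, -9.825, 0.08
c = -xi

record = 9.58  # mark to beat, in seconds
# P(annual best beats the record) = P(negated best > -record) under the fitted GEV.
p = genextreme.sf(-record, c, loc=mu, scale=sigma)
print(f"one-year probability {p:.4f}, return period about {1.0 / p:.0f} years")
```

Inverting the same survival function at 1/T gives the T-year return level, so the two quantities in the abstract are consistent by construction.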
The accuracy and time scale invariance of value-at-risk (VaR) measurement methods for different stock indices and at different confidence levels are tested. Extreme value theory (EVT) is applied to model the extreme tail of the standardized residual series of daily/weekly index losses, and parametric and nonparametric methods are used to estimate parameters of the generalized Pareto distribution (GPD) and dynamic VaR for indices of three stock markets in China. The accuracy and time scale invariance of the risk measurement methods are also examined through a back-testing approach. Results show that not all the indices accept time scale invariance; there are some differences in accuracy between different indices at various confidence levels. The most powerful dynamic VaR estimation methods are EVT-GJR-Hill at the 97.5% level for weekly losses of the Shanghai stock market, and EVT-GARCH-MLE (Hill) at the 99.0% level for weekly losses of the Taiwan and Hong Kong stock markets, respectively.
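The Hill estimator used in the EVT-GJR-Hill and EVT-GARCH-Hill variants can be sketched on its own, without the GARCH/GJR filtering step; here it is applied to an exact Pareto sample with known tail index 1/3.

```python
import numpy as np

def hill_gamma(sample, k):
    """Hill estimator of the extreme value index gamma from the k largest
    order statistics: mean of log(X_(i)) - log(X_(k+1)), i = 1..k."""
    x = np.sort(sample)[::-1]  # descending order statistics
    return float(np.mean(np.log(x[:k]) - np.log(x[k])))

rng = np.random.default_rng(3)
sample = rng.pareto(3.0, size=20_000) + 1.0  # exact Pareto tail, true gamma = 1/3
print(hill_gamma(sample, k=500))
```

In practice the choice of k matters: too small and the estimate is noisy, too large and observations from the non-Pareto body bias it, which is why Hill plots over a range of k are usually inspected.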
In this paper, the maximum 1-hour rainfall (rain peak), the maximum 6-hour rainfall and the maximum 24-hour rainfall in the Caojiang River basin from 1967 to 2013 were taken as samples. The typical typhoon rainstorm hydrograph of the joint distribution of rainfall in three periods was constructed based on the asymmetric Archimedean Gumbel-Hougaard extreme value copula. The main conclusions were as follows: (1) The design rainstorm value in the Caojiang River basin calculated using the joint distribution of rainfall in three periods was larger than the design rainstorm values of the joint distribution in two periods and of a single period. The design rainstorm process hydrograph amplified at the same frequency had the best overall effect, which provided a new idea and method for studying design rainfall patterns. (2) According to the maximum 24-hour rainfall, the risk rate of the multi-peak rainstorm process whose main peak occurred late was the highest, and the constructed typical design rainstorm process hydrograph was the most representative. (3) The "OR" joint return period of the rainfall combination in three periods, as the design criterion of a watershed, was applicable to responding to the risk of rainfall and flood in this watershed.
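The "OR" joint return period follows directly from the Gumbel-Hougaard copula. A two-variable sketch (the paper uses three rainfall durations, and the parameter values here are illustrative):

```python
import math

def gumbel_hougaard_cdf(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1 controls upper-tail
    dependence, with theta = 1 being independence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def or_return_period(u, v, theta):
    """'OR' joint return period in years: at least one of the two annual
    maxima exceeds its design value. u and v are the marginal
    non-exceedance probabilities."""
    return 1.0 / (1.0 - gumbel_hougaard_cdf(u, v, theta))

# Two 100-year marginal events (u = v = 0.99) under moderate dependence.
print(or_return_period(0.99, 0.99, 2.0))
```

The "OR" period is always shorter than either marginal return period, so using it as the design criterion is the conservative choice the abstract recommends.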
In this paper we estimate the incubation period of a possible pathology following exposure to dioxins during a poor diet. The tools developed for this purpose include the probabilistic extremal model and the stochastic behavior of the distribution tails of the contamination. We propose a cumulative distribution function for a random variable that follows both a Gaussian distribution and a GPD. A global optimization method is also explored for the efficient estimation of parameters of the GPD.
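A spliced CDF of the kind described, Gaussian below a threshold and GPD above it, can be sketched as follows; the parameter names are ours, not the paper's.

```python
from scipy.stats import genpareto, norm

def spliced_cdf(x, u, xi, beta, mu=0.0, s=1.0):
    """CDF with a Gaussian(mu, s) body up to threshold u and a GPD(xi, beta)
    tail above it; continuous at u because the GPD survival function is
    rescaled by the Gaussian tail mass."""
    if x <= u:
        return norm.cdf(x, mu, s)
    tail_mass = norm.sf(u, mu, s)
    return 1.0 - tail_mass * genpareto.sf(x - u, xi, scale=beta)

print(spliced_cdf(1.0, 2.0, 0.2, 0.5), spliced_cdf(3.0, 2.0, 0.2, 0.5))
```

Rescaling the tail by the Gaussian tail mass is what makes the piecewise function a valid, monotone CDF; a global optimizer can then fit (u, xi, beta, mu, s) jointly by maximum likelihood.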
The bootstrap method, which this article employs, is a relatively new statistical technique and a major tool for studying and evaluating the values of parameters in probability distributions. Our research gives an overview of the theory of infinite distribution functions. The tools used to deal with the problems raised in the paper are the mathematical methods of random analysis (the theory of random processes and multivariate statistics). In this article, we introduce a new function to find the bias and standard error of generalized extreme value distributions with the jackknife method.
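A leave-one-out jackknife for the bias and standard error of a GEV parameter estimate can be sketched as follows; the data are synthetic, and this is our own minimal version, not the paper's function.

```python
import numpy as np
from scipy.stats import genextreme

def jackknife_gev_shape(data):
    """Leave-one-out jackknife bias and standard error of the GEV shape
    estimate (scipy's c); each deletion triggers a full ML refit."""
    n = data.size
    full = genextreme.fit(data)[0]
    loo = np.array([genextreme.fit(np.delete(data, i))[0] for i in range(n)])
    bias = (n - 1) * (loo.mean() - full)
    se = np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))
    return full, bias, se

rng = np.random.default_rng(4)
shape_c, bias, se = jackknife_gev_shape(rng.gumbel(size=50))
print(shape_c, bias, se)
```

The n refits make this O(n) times the cost of one fit, which is the usual trade-off against the bootstrap's B resamples.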
Extreme value theory provides methods to analyze the most extreme parts of data. We used the generalized extreme value (GEV) distribution to predict the ultimate 100 m, 200 m, 400 m, 4 × 100 m relay, and long jump records of male gold medalists at the Olympics. The diagnostic plots, which assessed the accuracy of the GEV model, were fitted to all event records, validating the model. The 100 m, 200 m, 400 m, 4 × 100 m, and long jump records had negative shape parameters and calculated upper limits of 9.58 s, 19.18 s, 42.97 s, 36.71 s, and 9.03 m, respectively. The calculated upper limit in the 100 m (9.58 s) was equal to the record of Usain Bolt (August 16, 2009). The 100 m and 200 m world records were close to the calculated upper limits, and achieving the calculated limit was difficult. The 400 m and 4 × 100 m relay world records were almost equal to the calculated upper limits and the 500-year return level estimates, and slight improvement was possible in both. At the Tokyo Olympics in August 2021, the probability of occurrence of a record in one year was about 1/30 in the 100 m, 200 m, and 4 × 100 m relay, and about 1/20 in the 400 m and long jump. The harder-to-break records in the 100 m, 200 m, and 4 × 100 m relay show that a fierce battle has taken place.
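Calculated upper limits of this kind use the finite endpoint of a GEV with negative shape, x* = μ − σ/ξ. A minimal sketch with illustrative parameter values, not the paper's fits:

```python
def gev_upper_limit(mu, sigma, xi):
    """Finite upper endpoint of a GEV distribution: x* = mu - sigma / xi.
    Defined only for a negative shape parameter xi."""
    if xi >= 0:
        raise ValueError("a finite upper endpoint exists only for xi < 0")
    return mu - sigma / xi

# Illustrative location/scale/shape values only.
print(gev_upper_limit(mu=8.80, sigma=0.15, xi=-0.25))
```

Because x* depends on all three parameters, its confidence interval is best obtained by profile likelihood rather than by plugging in the interval endpoints of ξ alone.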
The majority of tornado fatalities occur during severe thunderstorm occurrences that produce a large number of tornadoes, termed tornado outbreaks. This study used extreme value theory to estimate the impact of tornado outbreaks on fatalities while accounting for climate and demographic factors. The findings indicate that the number of fatalities increases as tornado outbreaks increase. Additionally, this study undertook a counterfactual analysis to determine what the probability of a tornado outbreak would have been under various climatic and demographic scenarios. The results of the counterfactual study indicate that the likelihood of increased mortality rises as the population forecast grows. Intensified El Niño events, on the other hand, reduce the likelihood of further fatalities, whereas La Niña events are expected to increase the probability of fatalities.
The exponential Randić index has important applications in the fields of biology and chemistry. The exponential Randić index of a graph G is defined as the sum of the weights e^(1/√(d(u)d(v))) over all edges uv of G, where d(u) denotes the degree of a vertex u in G. The paper mainly provides upper and lower bounds of the exponential Randić index in quasi-tree graphs, and characterizes the extremal graphs when the bounds are achieved.
In Part I the concept of the general regular variation of n-th order is proposed and its construction is discussed. The uniqueness of the standard expression and the higher order regularity of the auxiliary functions are proved.
The paper is concerned with the basic properties of the multivariate extreme value distribution (in the Logistic model). We obtain the characteristic function and a recurrence formula for the density function. The explicit algebraic formula for the Fisher information matrix is indicated. A simple and accurate procedure for generating random vectors from the multivariate extreme value distribution is presented.
Based on model tests, the statistical distribution of extreme values of wave-current in-line forces acting on a vertical circular cylinder is analyzed in this paper. It is shown that the results calculated by the simplified method proposed by the authors agree well with the test data; the Weibull distribution is also applicable in the region of high KC number, and the shape parameter α and scale parameter β each correlate well with the KC number.
The research of carbon content along the casting direction of 82B cord steel billets is of great significance for improving the quality of cord products in subsequent processing. However, the traditional segregation and billet quality evaluation methods have certain limitations, such as sampling length and analysis area, which affect the accuracy of quality judgment. Thus, the statistics of extreme values (SEV) method was introduced to predict the maximum value of carbon content along the casting direction, which can quantitatively characterize the segregation degree. The size of the selected billet is 150 mm × 150 mm, and the sampling location is the centerline of the billet. The experiment was conducted by considering the effect of cooling intensity and casting speed on the maximum value of carbon content. Firstly, the calculation results show that the SEV method can predict the maximum value of carbon content along the casting direction of 82B cord steel, and the SEV method is proved to be effective by analyzing the carbon distribution and fluctuation in billets. To some extent, the SEV method can break the limitations of sampling length and analysis area by predicting the maximum value of carbon content over a larger range of continuous casting billets with few samples. During the continuous casting process, an increase in cooling intensity raises the surface shrinking rate, which can slow down the flow of solute-enriched liquid to the center, so the center segregation can be reduced. On the other hand, the function area of the final electromagnetic stirring can be expanded by increasing the casting speed, which can reduce the concentration of carbon in the center of the billets and reduce the maximum value of carbon content. This can provide a new theoretical reference for the quantitative calculation of carbon content in continuous casting billets and the quality evaluation of continuous casting billets.
In this paper, to obtain a consistent estimator of the number of communities, the authors present a new sequential testing procedure, based on the locally smoothed adjacency matrix and extreme value theory. Under the null hypothesis, the test statistic converges to the type I extreme value distribution; otherwise, it explodes fast and the divergence rate could even reach n in the strong signal case, where n is the size of the network, guaranteeing high detection power. This method is simple to use and serves as an alternative approach to the novel one in Lei (2016) using random matrix theory. To detect changes in the community structure, the authors also propose a two-sample test for the stochastic block model with two observed adjacency matrices. Simulation studies justify the theory. The authors apply the proposed method to the political blog data set and find reasonable group structures.
We have applied the grey system theory to study triple jumps. In this paper we introduce the grey system theory and apply it to establish a monotonic-sequence nonlinear Verhulst differential dynamic model. Using that model and the triple jump records of the past 45 years, we calculate the future extreme values of the world triple jump, predict the optimal apportionment among the three phases, and study the tendency of development of triple jump techniques and strategy. Every event has its own development, maturity and peak periods. Our study helps coaches and athletes to develop their strategy on a scientific base. Based on the grey system theory, we predict that the world triple jump record will finally approach 20.65 m. The distances of the hop, step, and jump will approach 7.56 m, 6.06 m, and 7.03 m respectively. The apportionment will approach 36.6% for the hop, 29.4% for the step, and 34.0% for the jump. According to our calculation, the tendency of development is to follow the Russian style basically, and at the same time to absorb the advantage of the Polish style by placing a greater emphasis on the distance of the third phase.
Funded by the National Natural Science Foundation of China (No. 79970041).
Funded by the National Natural Science Foundation of China for Innovative Research Groups under Grant No. 50321803, and by the National Natural Science Foundation of China for Young Scholars under Grant No. 10402030.
文摘Extreme value theory provides methods to analyze the most extreme parts of data. We predicted the ultimate 100 m dash records for men and women over specific periods using the generalized extreme value (GEV) distribution. The various diagnostic plots, which assess the accuracy of the GEV model, fitted the 100 m records in the world and Japan well, validating the model. The men's world record had a shape parameter of −0.250 with a 95% confidence interval of [−0.391, −0.109]. The 100 m record therefore has a finite limit, with a calculated upper limit of 9.46 s. The return level estimates for the men's world record were 9.74, 9.62, and 9.58 s, with 95% confidence intervals of [9.69, 9.79], [9.54, 9.69], and [9.48, 9.67] for 10-, 100- and 350-year return periods, respectively. In one year, the probability of occurrence of the men's world record, 9.58 s (Usain Bolt), was 1/350, while that of the women's, 10.49 s (Florence Griffith-Joyner), was about 1/100, confirming that it is more difficult for men to break records than for women.
基金The National Natural Science Foundation of China (Nos. 70501025 & 70572089)
文摘The accuracy and time-scale invariance of value-at-risk (VaR) measurement methods are tested for different stock indices and confidence levels. Extreme value theory (EVT) is applied to model the extreme tails of the standardized residual series of daily/weekly index losses, and parametric and nonparametric methods are used to estimate the parameters of the generalized Pareto distribution (GPD) and the dynamic VaR for indices of three stock markets in China. The accuracy and time-scale invariance of the risk measurement methods are also examined through a back-testing approach. Results show that not all indices exhibit time-scale invariance, and accuracy differs between indices at various confidence levels. The most powerful dynamic VaR estimation methods are EVT-GJR-Hill at the 97.5% level for weekly losses on the Shanghai stock market, and EVT-GARCH-MLE (Hill) at the 99.0% level for weekly losses on the Taiwan and Hong Kong stock markets, respectively.
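The core of the EVT approach to VaR described here is a peaks-over-threshold fit of the GPD to tail losses. A minimal sketch on synthetic heavy-tailed losses follows; it uses the standard POT-GPD VaR estimator and omits the abstract's GARCH/GJR prefiltering and Hill estimation, and all data are simulated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic daily losses; Student-t tails loosely mimic equity returns
# (illustrative only, not any of the Chinese market indices).
losses = stats.t.rvs(df=4, size=5000, random_state=rng)

# Peaks over threshold: keep exceedances above a high empirical quantile.
u = np.quantile(losses, 0.95)
exceedances = losses[losses > u] - u
n, n_u = losses.size, exceedances.size

# Fit the GPD to the exceedances (threshold location fixed at zero).
xi, _, beta = stats.genpareto.fit(exceedances, floc=0.0)

# Standard POT estimator of VaR at confidence level q.
def var_evt(q):
    return u + (beta / xi) * (((n / n_u) * (1.0 - q)) ** (-xi) - 1.0)

print(f"xi = {xi:.3f}, 99% VaR = {var_evt(0.99):.3f}")
```

Back-testing, as in the abstract, would then count how often realized losses exceed `var_evt(q)` and compare that frequency with 1 − q.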
基金Supported by National Natural Science Foundation of China (Nos. 41771044, 41371498)。
文摘In this paper, the maximum 1-hour rainfall (rain peak), the maximum 6-hour rainfall, and the maximum 24-hour rainfall in the Caojiang River basin from 1967 to 2013 were taken as samples. A typical typhoon rainstorm hydrograph for the joint distribution of rainfall over the three periods was constructed based on the asymmetric Archimedean Gumbel-Hougaard extreme value copula. The main conclusions are as follows: (1) The design rainstorm value in the Caojiang River basin calculated using the joint distribution of rainfall over three periods was larger than the design rainstorm values from the two-period joint distribution and from a single period. The design rainstorm process hydrograph amplified at the same frequency had the best overall effect, providing a new idea and method for studying design rainfall patterns. (2) According to the maximum 24-hour rainfall, the risk rate of a multi-peak rainstorm process with the main peak at the rear was the highest, and the constructed typical design rainstorm process hydrograph was the most representative. (3) The "OR" joint return period of the three-period rainfall combination, used as the design criterion for a watershed, is applicable to managing rainfall and flood risk in that watershed.
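The "OR" joint return period used as the design criterion above follows directly from the copula: for annual maxima, T_or = 1 / (1 − C(u, v)). A minimal two-variable sketch with the Gumbel-Hougaard copula is shown below; the probabilities and the dependence parameter θ are hypothetical, and the paper's three-variable asymmetric construction is not reproduced.

```python
import math

def gumbel_hougaard(u, v, theta):
    """Gumbel-Hougaard copula C(u, v) with dependence parameter theta >= 1."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def or_return_period(u, v, theta):
    """'OR' joint return period: either variable exceeds its design value.
    T_or = 1 / P(U > u or V > v) = 1 / (1 - C(u, v)) for annual maxima."""
    return 1.0 / (1.0 - gumbel_hougaard(u, v, theta))

# Hypothetical non-exceedance probabilities of two rainfall duration maxima,
# each with a 100-year marginal return period.
u, v, theta = 0.99, 0.99, 2.5
print(f"OR joint return period: {or_return_period(u, v, theta):.1f} years")
```

The "OR" period is always no longer than the shortest marginal return period, which is why it is the conservative design criterion.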
文摘In this paper we estimate the incubation period of a possible pathology following dietary exposure to dioxins. The tools developed for this purpose include a probabilistic extremal model and the stochastic behavior of the distribution tails of the contamination. We propose a cumulative distribution function for a random variable that follows both a Gaussian distribution and a GPD. A global optimization method is also explored for efficient estimation of the GPD parameters.
文摘The bootstrap method, which this article uses, is one of the newer approaches in statistics and a major tool for estimating and evaluating the parameters of probability distributions. Our research gives an overview of the theory of infinite distribution functions. The tools used to address the problems raised in the paper are the mathematical methods of stochastic analysis (the theory of random processes and multivariate statistics). In this article, we introduce a new function to estimate the bias and standard error of the parameters of generalized extreme value distributions by the jackknife method.
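The jackknife bias and standard error mentioned here follow the standard leave-one-out formulas. A minimal sketch applied to the GEV shape parameter on synthetic maxima is below; the data and the choice of estimator are illustrative, not the article's.

```python
import numpy as np
from scipy import stats

def jackknife(sample, estimator):
    """Leave-one-out jackknife estimates of bias and standard error."""
    n = len(sample)
    theta_hat = estimator(sample)
    loo = np.array([estimator(np.delete(sample, i)) for i in range(n)])
    bias = (n - 1) * (loo.mean() - theta_hat)
    se = np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))
    return theta_hat - bias, se  # bias-corrected estimate, standard error

rng = np.random.default_rng(2)
# Illustrative: jackknife the GEV shape parameter fitted to synthetic maxima.
sample = stats.genextreme.rvs(0.1, size=60, random_state=rng)
shape = lambda x: stats.genextreme.fit(x)[0]
corrected, se = jackknife(sample, shape)
print(f"bias-corrected shape = {corrected:.3f}, SE = {se:.3f}")
```

The bootstrap analogue replaces the n leave-one-out samples with B resamples drawn with replacement; for MLE-based GEV fits the two usually give similar standard errors.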
文摘Extreme value theory provides methods to analyze the most extreme parts of data. We used the generalized extreme value (GEV) distribution to predict the ultimate 100 m, 200 m, 400 m, 4 × 100 m relay, and long jump records of male gold medalists at the Olympics. The diagnostic plots, which assess the accuracy of the GEV model, fitted all event records well, validating the model. The 100 m, 200 m, 400 m, 4 × 100 m, and long jump records had negative shape parameters and calculated upper limits of 9.58 s, 19.18 s, 42.97 s, 36.71 s, and 9.03 m, respectively. The calculated upper limit in the 100 m (9.58 s) was equal to the record of Usain Bolt (August 16, 2009). The 100 m and 200 m world records were close to the calculated upper limits, and achieving the calculated limit is difficult. The 400 m and 4 × 100 m relay world records were almost equal to the calculated upper limits and the 500-year return level estimates, and slight improvement is possible in both. At the Tokyo Olympics in August 2021, the probability of occurrence of a record in one year was about 1/30 in the 100 m, 200 m, and 4 × 100 m, and about 1/20 in the 400 m and long jump. In the 100 m, 200 m, and 4 × 100 m relay, the difficulty of further improving the records shows that a fierce battle has taken place.
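The quantities reported in these athletics abstracts (one-year occurrence probability of a record and the finite upper limit under a negative shape parameter) follow directly from a fitted GEV. The sketch below uses hypothetical parameters on negated 100 m times (so larger is better); none of the numbers are the papers' fitted values.

```python
from scipy import stats

# Hypothetical GEV for annual best marks on the negated-time scale
# (larger = better). scipy's shape c = -xi, so c = 0.25 means xi = -0.25.
# Chosen so the upper endpoint corresponds to a fastest possible 9.46 s.
c, loc, scale = 0.25, -9.90, 0.11

record = -9.58  # Usain Bolt's 9.58 s on the negated-time scale
p = stats.genextreme.sf(record, c, loc=loc, scale=scale)
print(f"one-year probability = {p:.4f}, return period = {1 / p:.0f} years")

# Finite upper endpoint when xi < 0: loc + scale / c, i.e. the fastest
# achievable time once negated back.
upper = loc + scale / c
print(f"upper limit = {-upper:.2f} s")
```

Negating times converts the minimum (fastest time) into a maximum, which is the usual trick for applying block-maxima GEV machinery to records that improve downward.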
文摘The majority of tornado fatalities occur during severe thunderstorm events that produce large numbers of tornadoes, termed tornado outbreaks. This study used extreme value theory to estimate the impact of tornado outbreaks on fatalities while accounting for climate and demographic factors. The findings indicate that the number of fatalities increases with the size of tornado outbreaks. Additionally, this study undertook a counterfactual analysis to determine the probability of a tornado outbreak under various climatic and demographic scenarios. The results of the counterfactual study indicate that the likelihood of increased mortality rises as the projected population grows. Intensified El Niño events, on the other hand, reduce the likelihood of further fatalities, while La Niña events are expected to increase the probability of fatalities.
文摘The exponential Randić index has important applications in biology and chemistry. The exponential Randić index of a graph G is defined as the sum of the weights e^(1/√(d(u)d(v))) over all edges uv of G, where d(u) denotes the degree of a vertex u in G. The paper mainly provides upper and lower bounds on the exponential Randić index of quasi-tree graphs, and characterizes the extremal graphs attaining the bounds.
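Assuming the standard definition of the index as the sum of e^(1/√(d(u)d(v))) over edges (the in-line formula here is reconstructed from a garbled rendering), computing it for a small graph is straightforward:

```python
import math

def exponential_randic(edges):
    """Exponential Randić index: sum over edges uv of e^(1/sqrt(d(u)*d(v)))."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return sum(math.exp(1.0 / math.sqrt(deg[u] * deg[v])) for u, v in edges)

# Path graph P4 (a tree, hence a quasi-tree): degrees 1, 2, 2, 1.
value = exponential_randic([(1, 2), (2, 3), (3, 4)])
print(f"exponential Randić index of P4 = {value:.4f}")
```

For P4 the two pendant edges each contribute e^(1/√2) and the middle edge contributes e^(1/2).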
文摘In Part Ⅰ the concept of the general regular variation of n-th order is proposed and its construction is discussed. The uniqueness of the standard expression and the higher order regularity of the auxiliary functions are proved.
文摘The paper is concerned with the basic properties of the multivariate extreme value distribution (in the logistic model). We obtain the characteristic function and a recurrence formula for the density function. An explicit algebraic formula for the Fisher information matrix is given. A simple and accurate procedure for generating random vectors from the multivariate extreme value distribution is presented.
文摘Based on model tests, the statistical distribution of extreme values of wave-current in-line forces acting on a vertical circular cylinder is analyzed in this paper. It is shown that the results calculated by the simplified method proposed by the authors agree well with the test data. The Weibull distribution is also applicable in the region of high KC numbers, and its shape parameter α and scale parameter β each correlate well with the KC number.
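Fitting a two-parameter Weibull distribution to force peaks, as described above, can be sketched with `scipy.stats.weibull_min`. All values below are synthetic and illustrative, not the model-test data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic extreme in-line force peaks in kN (illustrative only).
forces = stats.weibull_min.rvs(1.8, scale=12.0, size=300, random_state=rng)

# Fit a two-parameter Weibull (location fixed at zero):
# shape alpha and scale beta, as in the paper's parameterization.
alpha, _, beta = stats.weibull_min.fit(forces, floc=0.0)

# Exceedance probability of a design force under the fitted distribution.
design = 30.0
p_exceed = stats.weibull_min.sf(design, alpha, scale=beta)
print(f"alpha = {alpha:.2f}, beta = {beta:.2f}, "
      f"P(F > {design} kN) = {p_exceed:.4f}")
```

Repeating the fit across KC-number bins would reproduce the paper's observation that α and β vary systematically with KC.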
基金The authors are very grateful for support from the United Funds between the National Natural Science Foundation of China and Baowu Steel Group Corporation Limited (No. U1860101) and the Chongqing Fundamental Research and Cutting-Edge Technology Funds (No. cstc2017jcyjAX0019).
文摘Research on the carbon content along the casting direction of 82B cord steel billets is of great significance for improving the quality of cord products in subsequent processing. However, traditional segregation and billet quality evaluation methods have certain limitations, such as sampling length and analysis area, which affect the accuracy of quality judgment. Thus, the statistics of extreme values (SEV) method was introduced to predict the maximum carbon content along the casting direction, which can quantitatively characterize the degree of segregation. The size of the selected billet is 150 mm × 150 mm, and the sampling location is the centerline of the billet. The experiment considered the effect of cooling intensity and casting speed on the maximum carbon content. The calculation results show that the SEV method can predict the maximum carbon content along the casting direction of 82B cord steel, and its effectiveness is demonstrated by analyzing the carbon distribution and fluctuation in the billets. To some extent, the SEV method can overcome the limitations of sampling length and analysis area by predicting the maximum carbon content over a larger range of continuous casting billets from few samples. During the continuous casting process, an increase in cooling intensity raises the surface shrinking rate, which slows the flow of solute-enriched liquid to the center and reduces center segregation. On the other hand, the action area of the final electromagnetic stirring expands with increasing casting speed, which reduces the carbon concentration in the center of the billets and lowers the maximum carbon content. This can provide a new theoretical reference for the quantitative calculation of carbon content in continuous casting billets and for the quality evaluation of continuous casting billets.
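The SEV idea above — take the maximum of each inspected segment, fit a Gumbel distribution to those block maxima, and extrapolate to a larger inspection range via a return period — can be sketched as follows. The carbon values are synthetic and the segment layout is hypothetical, not the 82B billet measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Synthetic carbon contents (wt%) at points along the casting direction:
# 40 sampling segments of 50 points each (illustrative only).
carbon = rng.normal(0.82, 0.02, size=(40, 50))

# Statistics of extreme values: the maximum of each segment, fitted
# with a Gumbel (type I extreme value) distribution.
seg_max = carbon.max(axis=1)
loc, scale = stats.gumbel_r.fit(seg_max)

# Extrapolation: the expected maximum over a range T times larger than one
# segment is the Gumbel quantile at probability 1 - 1/T (return period T).
T = 500.0
predicted_max = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
print(f"predicted maximum carbon content = {predicted_max:.3f} wt%")
```

This is the same block-maxima extrapolation used in inclusion rating; it predicts a plausible worst case over a much longer billet length than was actually sampled.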
基金supported by the National Natural Science Foundation of China under Grant No.71971118supported by Major Natural Science Projects of Universities in Jiangsu Province under Grant No.20KJA520002。
文摘In this paper, to obtain a consistent estimator of the number of communities, the authors present a new sequential testing procedure based on the locally smoothed adjacency matrix and extreme value theory. Under the null hypothesis, the test statistic converges to the type I extreme value distribution; otherwise, it explodes fast, and the divergence rate can even reach n in the strong-signal case, where n is the size of the network, guaranteeing high detection power. This method is simple to use and serves as an alternative to the approach in Lei (2016) based on random matrix theory. To detect changes in the community structure, the authors also propose a two-sample test for the stochastic block model with two observed adjacency matrices. Simulation studies support the theory. The authors apply the proposed method to the political blog data set and find reasonable group structures.
文摘We have applied grey system theory to study the triple jump. In this paper we introduce grey system theory and apply it to establish a monotonic-sequence nonlinear Verhulst differential dynamic model. Using that model and the triple jump records of the past 45 years, we calculate the future extreme values of the world triple jump, predict the optimal apportionment among the three phases, and study the development trends of triple jump technique and strategy. Every event has its own development, maturity, and peak periods, and our study helps coaches and athletes to develop their strategy on a scientific basis. Based on grey system theory, we predict that the world triple jump record will finally approach 20.65 m, with the distances of the hop, step, and jump approaching 7.56 m, 6.06 m, and 7.03 m, respectively; the apportionment will approach 36.6% for the hop, 29.4% for the step, and 34.0% for the jump. According to our calculation, the tendency of development is basically to follow the Russian style, while absorbing the advantage of the Polish style by placing greater emphasis on the distance of the third phase.
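A grey Verhulst model of the kind described here fits the whitening equation dx/dt + a·x = b·x² by least squares and predicts the saturation level a/b as the ultimate extreme value. The sketch below uses a synthetic logistic-shaped progression rather than the actual triple jump record series, so the recovered limit (about 20) is a property of the synthetic data, not the paper's 20.65 m prediction.

```python
import numpy as np

def grey_verhulst(x1):
    """Fit the grey Verhulst whitening equation dx/dt + a*x = b*x^2 to a
    monotone, saturating sequence and return (a, b, a/b), where a/b is the
    predicted asymptotic extreme value (the saturation level)."""
    x0 = np.diff(x1)                   # inverse accumulated generating sequence
    z1 = 0.5 * (x1[1:] + x1[:-1])      # background (mean) sequence
    B = np.column_stack([-z1, z1 ** 2])
    a, b = np.linalg.lstsq(B, x0, rcond=None)[0]
    return a, b, a / b

# Synthetic logistic-shaped record progression saturating at 20
# (illustrative only; not the actual triple jump record data).
t = np.arange(15)
records = 20.0 / (1.0 + 9.0 * np.exp(-0.5 * t))
a, b, limit = grey_verhulst(records)
print(f"predicted ultimate value = {limit:.2f}")
```

Because the synthetic sequence is exactly logistic, the discrete least-squares fit recovers the saturation level closely; on noisy record data the estimate is more sensitive to the mean-sequence approximation.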