The study of extreme weather and space events has gained paramount importance in modern society owing to rapid advances in high technology. Understanding and describing exceptional occurrences plays a crucial role in making decisive assessments of their potential impact on technical, economic, and social aspects in various fields. This research focuses on analyzing the hourly values of the auroral electrojet (AE) geomagnetic index from 1957 to 2019 by using the peaks-over-threshold method in extreme value theory. By fitting the generalized Pareto distribution to extreme AE values, shape parameter estimates were derived; their negative values establish an upper bound for this time series. Consequently, it became evident that the AE values have reached a plateau, suggesting that extreme events exceeding the established upper limit are rare. As a result, although diligent precautions to mitigate the consequences of such extreme events are still needed, surpassing the upper limit of AE values becomes increasingly challenging. It is also possible to observe an aurora in middle- and low-latitude regions during the maximum period of the AE index.
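As a concrete illustration of the peaks-over-threshold step described above, the sketch below fits a generalized Pareto distribution to exceedances of a high threshold and reports the implied upper endpoint when the shape parameter is negative. It is a minimal sketch, not the authors' code: the input file, the 99th-percentile threshold, and the variable names are assumptions.

```python
import numpy as np
from scipy.stats import genpareto

ae_hourly = np.loadtxt("ae_hourly.txt")   # hypothetical hourly AE series
u = np.quantile(ae_hourly, 0.99)          # illustrative threshold choice
exceedances = ae_hourly[ae_hourly > u] - u

# Fix the location at 0 so only the shape (xi) and scale (sigma) are estimated.
xi, _, sigma = genpareto.fit(exceedances, floc=0)

if xi < 0:
    # For xi < 0 the fitted GPD support is bounded above at u + sigma/|xi|.
    print("estimated upper bound of AE:", u + sigma / abs(xi))
else:
    print("no finite upper bound implied (xi >= 0)")
```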
The weather in Nagano Prefecture, Japan, can be roughly classified into four types according to principal component analysis and k-means clustering. We predicted the extreme values of the maximum daily and hourly precipitation in Nagano Prefecture using extreme value theory. For the maximum daily precipitation, the values of ξ in Matsumoto, Karuizawa, Sugadaira, and Saku were positive; therefore, the distribution has no upper bound and tends to take large values, so these locations are at risk and caution is required. The values of ξ in Nagano, Kisofukushima, and Minamishinano were estimated to be zero; there was likewise no upper limit, although the probability of obtaining a large value was low, and caution was still required. We predicted the maximum return levels for return periods of 10, 20, 50, and 100 years along with respective 95% confidence intervals in Nagano, Matsumoto, Karuizawa, Sugadaira, Saku, Kisofukushima, and Minamishinano. In Matsumoto, the 100-year return level was 182 mm, with a 95% CI [129, 236]. In Minamishinano, the 100-year return level was 285 mm, with a 95% CI [173, 398]. The 100-year return levels for the maximum daily rainfall were 285, 271, and 271 mm in Minamishinano, Saku, and Karuizawa, respectively, where the changes in the daily maximum rainfall were larger than those at other points. Because these values are large, caution is required during heavy rainfall. The 100-year return levels for the maximum daily and hourly precipitation were similar in Karuizawa and Saku. In Sugadaira, the 100-year return level for the maximum hourly rainfall, 107.2 mm, was larger than the maximum daily rainfall. Hence, it is necessary to be careful about short-term rainfall events.
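The return levels quoted above are quantiles of a fitted extreme value distribution. The sketch below, a minimal illustration rather than the authors' code, fits a generalized extreme value (GEV) distribution to a hypothetical series of annual maximum daily precipitation and reads off the 10- to 100-year return levels; the input file name is an assumption.

```python
import numpy as np
from scipy.stats import genextreme

annual_max_mm = np.loadtxt("annual_max_daily_precip.txt")  # hypothetical annual maxima (mm)

# Note: scipy's shape parameter c equals -xi in the usual EVT sign convention.
c, loc, scale = genextreme.fit(annual_max_mm)

for T in (10, 20, 50, 100):
    # The T-year return level is the (1 - 1/T) quantile of the fitted GEV.
    level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T}-year return level: {level:.1f} mm")
```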
We predicted the extreme values of the ENSO indices, namely the Niño3.4 index and the Southern Oscillation Index (SOI), using extreme value theory. Various diagnostic plots for assessing the accuracy of the generalized Pareto (GP) model fitted to the Niño3.4 index and the SOI are shown, and all four diagnostic plots support the fitted GP model. Because the shape parameter of the Niño3.4 index was negative, the Niño3.4 index had a finite upper limit. In contrast, that of the SOI was zero; therefore, the SOI did not have a finite upper limit, and there is a possibility of significant risk. We predicted the maximum return levels for return periods of 10, 20, 50, 100, 350, and 500 years and their respective 95% confidence intervals (CI). The 10-year and 100-year return levels for Niño3.4 were estimated to be 2.41 and 2.62, with 95% CI [2.22, 2.59] and [2.58, 2.66], respectively. The Niño3.4 index was 2.65 in the 2015/16 super El Niño, a phenomenon that occurs once every 500 years. The Niño3.4 index was 2.51 in the 1982/83 and 1997/98 super El Niño events, phenomena that occur once every 20 years. Recently, a large super El Niño event with a small probability of occurrence has taken place. With global warming, super El Niño events are becoming more likely to occur.
We performed a multifractal analysis using the wavelet transform to detect changes in the fractality of the USD/JPY and EUR/JPY exchange rates, and predicted their extreme values using extreme value theory. After the 1997 Asian financial crisis, the USD/JPY and EUR/JPY became multifractal; then the USD/JPY became monofractal and stable, and yen depreciation was observed. However, the EUR/JPY became multifractal and unstable, and a strong yen depreciation was observed. The coherence between the USD/JPY and EUR/JPY was strong between 1995 and 2000. After the 2007-2008 financial crisis, the USD/JPY became monofractal and stable, and yen appreciation was observed. However, the EUR/JPY became multifractal and unstable, and strong yen appreciation was observed. Various diagnostic plots for assessing the accuracy of the generalized Pareto (GP) model fitted to the USD/JPY and EUR/JPY are shown, and all the diagnostic plots support the fitted GP model. The shape parameters of the USD/JPY and EUR/JPY were close to zero; therefore, the USD/JPY and EUR/JPY did not have finite upper limits. We predicted the maximum return levels for return periods of 10, 20, 50, 100, 350, and 500 years and their respective 95% confidence intervals (CI). As a result, the 10-year and 100-year return levels for USD/JPY were estimated to be 149.6 and 164.8, with 95% CI [143.2, 156.0] and [149.4, 180.1], respectively.
A GARCH-M (generalized autoregressive conditional heteroskedasticity in the mean) model is used to analyse the volatility clustering phenomenon in mobile communication network traffic. Normal distribution, t distribution, and generalized Pareto distribution assumptions are adopted respectively to simulate the random component in the model. Quantile estimates for the network traffic series indicate that the common GARCH-M model can only partially deal with the "fat tail" problem. However, the "fat tail" characteristic of the random component directly affects the accuracy of the calculation; even the t distribution rests on an assumption about the entire data set. On the other hand, extreme value theory, which concentrates only on the tail distribution, can provide more accurate results for high quantiles. The best result is obtained based on the generalized Pareto distribution assumption for the random component in the GARCH-M model.
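A simplified two-stage sketch of this idea is given below: a plain GARCH(1,1) filter (rather than the GARCH-in-mean model used above) is fitted first, and a generalized Pareto distribution is then fitted to the upper tail of the standardized residuals to estimate a high quantile. It assumes the third-party `arch` package and a hypothetical input series; it illustrates the general technique, not the paper's implementation.

```python
import numpy as np
from arch import arch_model
from scipy.stats import genpareto

x = np.loadtxt("traffic_series.txt")               # hypothetical traffic/return series
res = arch_model(x, vol="GARCH", p=1, q=1).fit(disp="off")
z = np.asarray(res.std_resid)
z = z[~np.isnan(z)]                                 # standardized residuals

u = np.quantile(z, 0.95)                            # illustrative threshold
exc = z[z > u] - u
xi, _, sigma = genpareto.fit(exc, floc=0)

# Peaks-over-threshold estimate of the 99% quantile of the innovations.
q, n, n_u = 0.99, len(z), int((z > u).sum())
z_q = u + (sigma / xi) * ((n / n_u * (1 - q)) ** (-xi) - 1)
print("99% quantile of standardized residuals:", z_q)
```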
Extreme value theory provides methods to analyze the most extreme parts of data. We predicted the ultimate 100 m dash records for men and women for specific periods using the generalized extreme value (GEV) distribution. The various diagnostic plots, which assessed the accuracy of the GEV model, showed that it fitted the 100 m records in the world and Japan well, validating the model. The men's world record had a shape parameter of -0.250, with a 95% confidence interval of [-0.391, -0.109]. The 100 m record therefore has a finite limit, and the calculated upper limit was 9.46 s. The return level estimates for the men's world record were 9.74, 9.62, and 9.58 s, with 95% confidence intervals of [9.69, 9.79], [9.54, 9.69], and [9.48, 9.67] for the 10-, 100-, and 350-year return periods, respectively. In one year, the probability of occurrence of the men's world record of 9.58 s (Usain Bolt) was 1/350, while that of the women's record of 10.49 s (Florence Griffith-Joyner) was about 1/100, confirming that it is more difficult for men to break records than women.
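The finite limit follows from the sign of the shape parameter: for ξ &lt; 0 the GEV support is bounded. The sketch below illustrates one common way to compute such an endpoint, negating annual best times so that faster performances become block maxima; the negation step, the input file, and the variable names are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np
from scipy.stats import genextreme

annual_best_s = np.loadtxt("annual_best_100m.txt")    # hypothetical annual best times (s)

# Negate times so that "faster" becomes "larger" and fit a GEV to the block maxima.
c, loc, scale = genextreme.fit(-annual_best_s)        # scipy's c equals -xi
xi = -c

if xi < 0:
    # For xi < 0 the GEV support is bounded above at loc + scale/|xi|,
    # so the ultimate (fastest possible) record is the negated endpoint.
    print("ultimate record (s):", -(loc + scale / abs(xi)))
else:
    print("no finite limit implied (xi >= 0)")
```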
One of the most important and interesting issues associated with earthquakes is the long-term trend of extreme events. Extreme value theory provides methods for analyzing the most extreme parts of data. We estimated the annual maximum magnitude of earthquakes in Japan by extreme value theory using earthquake data between 1900 and 2019. The generalized extreme value (GEV) distribution was applied to fit the extreme indices and was used to estimate the probability of extreme values in specified time periods. The various diagnostic plots for assessing the accuracy of the GEV model fitted to the annual maximum earthquake magnitudes in Japan supported the validity of the GEV model. The extreme value index ξ was estimated as −0.163, with a 95% confidence interval of [−0.260, −0.0174] obtained by profile likelihood. Hence, the annual maximum magnitude of earthquakes has a finite upper limit. We obtained the maximum return levels for return periods of 10, 20, 50, 100, and 500 years along with their respective 95% confidence intervals. Further, to obtain more accurate confidence intervals, we estimated the profile log-likelihood. The return level estimates were 7.83, 8.60, and 8.99, with 95% confidence intervals of [7.67, 8.06], [8.32, 9.21], and [8.61, 10.0] for the 10-, 100-, and 500-year return periods, respectively. Hence, the 2011 off the Pacific coast of Tohoku Earthquake, which was the largest in the observation history of Japan with a magnitude of 9.0, is a phenomenon that occurs once every 500 years.
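For reference, the T-year return level quoted above is the quantile of the fitted GEV distribution that is exceeded on average once every T years; in the usual notation (location μ, scale σ, shape ξ),

$$z_T = \mu - \frac{\sigma}{\xi}\left[1 - \left\{-\log\!\left(1 - \frac{1}{T}\right)\right\}^{-\xi}\right], \qquad \xi \neq 0.$$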
This paper investigates methods of value-at-risk (VaR) estimation using extreme value theory (EVT). It compares two different estimation methods, the "two-step subsample bootstrap" based on moment estimation and maximum likelihood estimation (MLE), according to their theoretical bases and computation procedures. Then, the estimation results are analyzed together with those of the normal method and the empirical method. The empirical study of foreign exchange data shows that the EVT methods perform well in estimating VaR under extreme conditions and that the "two-step subsample bootstrap" method is preferable to MLE.
The accuracy and time scale invariance of value-at-risk (VaR) measurement methods for different stock indices and at different confidence levels are tested. Extreme value theory (EVT) is applied to model the extreme tail of the standardized residual series of daily/weekly index losses, and parametric and nonparametric methods are used to estimate the parameters of the generalized Pareto distribution (GPD) and the dynamic VaR for indices of three stock markets in China. The accuracy and time scale invariance of the risk measurement methods are also examined through a back-testing approach. Results show that not all the indices accept time scale invariance; there are some differences in accuracy between different indices at various confidence levels. The most powerful dynamic VaR estimation methods are EVT-GJR-Hill at the 97.5% level for weekly losses on the Shanghai stock market, and EVT-GARCH-MLE (Hill) at the 99.0% level for weekly losses on the Taiwan and Hong Kong stock markets, respectively.
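The Hill estimator referred to in these method names is a simple order-statistics estimate of the tail index. The sketch below is a minimal, generic implementation for illustration; the input file, the choice of k, and the variable names are assumptions.

```python
import numpy as np

def hill_estimate(losses: np.ndarray, k: int) -> float:
    """Hill estimate of the tail index xi from the k largest (positive) losses."""
    x = np.sort(losses)[::-1]                  # descending order statistics
    return float(np.mean(np.log(x[:k])) - np.log(x[k]))

losses = np.loadtxt("weekly_index_losses.txt")    # hypothetical loss series
xi_hat = hill_estimate(losses, k=100)
print("Hill xi:", xi_hat, " tail index alpha ~", 1.0 / xi_hat)
```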
It is very important to analyze network traffic in network control and management. In this paper, extreme value theory is first introduced, and a model with threshold methods is proposed to analyze the characteristics of network traffic. In this model, only the traffic data that exceed the threshold value are considered. The proposed model is then simulated with the trace using S-Plus software. The modeling results show that the network traffic model constructed from extreme value theory fits the empirical distribution well. Finally, the extreme value model is compared with FARIMA(p, d, q) modeling. The analytical results illustrate that extreme value theory has good application prospects in the statistical analysis of network traffic. In addition, since only the traffic data that exceed the threshold are processed, the computational overhead is greatly reduced.
The article deals with the methodology of pseudorandom data analysis. As the mathematical tool for carrying out the research, extreme value theory was used, a branch of mathematical statistics concerned with the extreme deviations from the median values of probability distributions. Methods for estimating unknown parameters and an algorithm for random-number generation are also discussed. Models for treating extreme values are constructed based on machine-generated samples, and an approach is proposed for their future application to constructing forecasting models.
Extreme value theory provides methods to analyze the most extreme parts of data. We used the generalized extreme value (GEV) distribution to predict the ultimate 100 m, 200 m, 400 m, 4 × 100 m relay, and long jump records of male gold medalists at the Olympics. The diagnostic plots, which assessed the accuracy of the GEV model fitted to all event records, validated the model. The 100 m, 200 m, 400 m, 4 × 100 m relay, and long jump records had negative shape parameters and calculated upper limits of 9.58 s, 19.18 s, 42.97 s, 36.71 s, and 9.03 m, respectively. The calculated upper limit in the 100 m (9.58 s) was equal to the record of Usain Bolt (August 16, 2009). The 100 m and 200 m world records were close to the calculated upper limits, and achieving the calculated limits will be difficult. The 400 m and 4 × 100 m relay world records were almost equal to the calculated upper limits and the 500-year return level estimates, and slight improvement is possible in both. At the Tokyo Olympics in August 2021, the probability of occurrence of a record in one year was about 1/30 in the 100 m, 200 m, and 4 × 100 m relay, and about 1/20 in the 400 m and long jump. The increasingly difficult records in the 100 m, 200 m, and 4 × 100 m relay show that a fierce battle has taken place.
Many stock exchanges around the world enforce daily price limits on the amount asset prices can change in order to prevent the market from overreacting and to reduce volatility. Price limits are artificial boundaries set by market regulators that restrict price changes of a stock to a pre-specified range during a trading day or a single trading session. The primary aims of price limit rules are to stabilize the markets during panic trading, to moderate volatility by repressing excessive speculation, and to allow stocks to be traded at prices close to their fair value. However, their impact on the market is a somewhat unresolved issue (Harris, 1998). Using a methodology of comparing volatility based on the extreme value technique, the authors empirically investigate the impact of price limits on the volatility of the Stock Exchange of Thailand. The empirical results support price limit advocates, suggesting that price limit rules moderate stock price volatility.
The majority of tornado fatalities occur during severe thunderstorm events that produce a large number of tornadoes, termed tornado outbreaks. This study used extreme value theory to estimate the impact of tornado outbreaks on fatalities while accounting for climate and demographic factors. The findings indicate that the number of fatalities increases as tornado outbreaks increase. Additionally, this study undertook a counterfactual analysis to determine what the probability of a tornado outbreak would have been under various climatic and demographic scenarios. The results of the counterfactual study indicate that the likelihood of increased mortality rises as the population forecast grows. Intensified El Niño events, on the other hand, reduce the likelihood of further fatalities, whereas La Niña events are expected to increase the probability of fatalities.
Sea level rise has become an important issue in global climate change studies. This study investigates trends in sea level records, particularly extreme records, in the Pearl River Estuary, using measurements from two tide gauge stations in Macao and Hong Kong. Extremes in the original sea level records (daily higher high water heights) and in tidal residuals with and without the 18.6-year nodal modulation are investigated separately. Thresholds for defining extreme sea levels are calibrated based on extreme value theory. Extreme events are then modeled by peaks-over-threshold models. The model applied to extremes in the original sea level records does not include modeling of their durations, while a geometric distribution is added to model the duration of extremes in tidal residuals. Realistic modeling results are obtained for all stationary models. Parametric trends of extreme sea level records are then introduced into nonstationary models through a generalized linear model framework. The result shows that, in recent decades since the 1960s, no significant trends can be found in any type of extreme at either station, which may be related to a reduction in the influence of tropical cyclones in the region. For the longer-term record since the 1920s at Macao, a regime shift of tidal amplitudes around the 1970s may partially explain the diverse trends of extremes in the original sea level records and tidal residuals.
Extreme events are defined as values of a variable below or above a certain value called the threshold. A well-chosen threshold helps to identify the extreme levels. Several methods have been used to determine the threshold so as to analyze and model extreme events. One of the most successful methods is the maximum product of spacings (MPS). However, a problem encountered while modeling data through this method is that it breaks down when there is a tie in the exceedances. This study offers a solution for modeling data even if it contains ties. To do so, an optimal threshold that gives more optimal parameters for extreme events was determined. The study achieved its main objective by deriving a method that improves the MPS method for determining an optimal threshold for extreme values in a data set containing ties, estimating the generalized Pareto distribution (GPD) parameters for the derived optimal threshold, and comparing these GPD parameters with those determined through the standard MPS model. The study improved the maximum product of spacings method and used the generalized Pareto distribution (GPD) and peaks-over-threshold (POT) methods as the basis for identifying extreme values. This study will help statisticians in different sectors of our economy to model extreme events involving ties. To statisticians, the structure of the extreme levels that exist in the tails of ordinary distributions is very important in analyzing, predicting, and forecasting the likelihood of an occurrence of an extreme event.
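To make the tie problem concrete, the sketch below shows a plain maximum-product-of-spacings fit of a GPD to exceedances (not the improved method proposed above): tied observations produce zero spacings, for which the log-spacing objective is undefined. The input file and starting values are assumptions.

```python
import numpy as np
from scipy.stats import genpareto
from scipy.optimize import minimize

def neg_log_mps(params, exc):
    """Negative log maximum-product-of-spacings objective for a GPD (location fixed at 0)."""
    xi, sigma = params
    if sigma <= 0:
        return np.inf
    cdf = genpareto.cdf(np.sort(exc), xi, loc=0, scale=sigma)
    spacings = np.diff(np.concatenate(([0.0], cdf, [1.0])))
    if np.any(spacings <= 0):          # ties (or unsupported points) break the plain method
        return np.inf
    return -np.sum(np.log(spacings))

exc = np.loadtxt("exceedances.txt")    # hypothetical exceedances over a chosen threshold
res = minimize(neg_log_mps, x0=[0.1, np.std(exc)], args=(exc,), method="Nelder-Mead")
print("MPS estimates (xi, sigma):", res.x)
```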
This study investigates calendar anomalies, namely the day-of-the-week effect and the seasonal effect, in the value-at-risk (VaR) analysis of stock returns for AAPL during the period 1995 through 2015. The statistical properties are examined and a comprehensive set of diagnostic checks is made on the two decades of AAPL daily stock returns. Combining the extreme value approach with a statistical analysis, it is found that the lowest VaR typically occurs on Fridays and Mondays. Moreover, high Q4 and Q3 VaR are observed during the test period. These results are valuable for anyone who needs evaluation and forecasts of the risk situation in AAPL. Moreover, this methodology, which is applicable to any other stocks or portfolios, is more realistic and comprehensive than the commonly used VaR model based on the standard normal distribution.