We consider a problem from stock market modeling: the choice of an adequate distribution for modeling the extremal behavior of stock market data. The generalized extreme value (GEV) distribution and the generalized Pareto (GP) distribution are the classical distributions for this problem. However, since 2004, [1] and many other researchers have been showing empirically that the generalized logistic (GL) distribution is a better model than the GEV and GP distributions for the extreme movements of stock market data. In this paper, we show that these results are not accidental. We prove the theoretical importance of the GL distribution in extreme value modeling. To do so, we introduce a general multivariate limit theorem and deduce some important multivariate theorems in probability as special cases. Using this theorem, we derive a limit theorem in extreme value theory in which the GL distribution plays the central role instead of the GEV distribution. The proof of this result parallels the proof of the classical extremal types theorem, in the sense that it possesses the important characteristics of classical extreme value theory, e.g. distributional properties, stability, convergence, and multivariate extension.
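As a rough illustration of the distribution this abstract advocates, the sketch below implements the CDF and quantile function of the generalized logistic in Hosking's parameterization (conventions for "generalized logistic" vary across the literature, so this is one common choice, not necessarily the exact form used in the paper); xi = 0 recovers the ordinary logistic distribution.

```python
import math

def glo_cdf(x, mu=0.0, sigma=1.0, xi=0.1):
    # CDF of the generalized logistic (GLO), Hosking's parameterization;
    # xi = 0 reduces to the ordinary logistic distribution.
    if abs(xi) < 1e-12:
        y = (x - mu) / sigma
    else:
        t = 1.0 - xi * (x - mu) / sigma
        if t <= 0.0:
            return 1.0 if xi > 0 else 0.0  # beyond the finite endpoint
        y = -math.log(t) / xi
    return 1.0 / (1.0 + math.exp(-y))

def glo_ppf(p, mu=0.0, sigma=1.0, xi=0.1):
    # Quantile function: invert F through the logit y = log(p / (1 - p)).
    y = math.log(p / (1.0 - p))
    z = y if abs(xi) < 1e-12 else (1.0 - math.exp(-xi * y)) / xi
    return mu + sigma * z
```

The median is mu for every xi, and the two functions are mutual inverses, which provides a quick sanity check on the parameterization.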
The bootstrap method is one of the newer approaches in statistics and a major tool for studying and evaluating the values of parameters in probability distributions. Our research gives an overview of the theory of infinite distribution functions. The tools used to address the problems raised in the paper are the mathematical methods of stochastic analysis (the theory of random processes and multivariate statistics). In this article, we introduce a new function to find the bias and standard error, with the jackknife method, for generalized extreme value distributions.
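A minimal sketch of the jackknife idea the abstract describes, applied to maximum-likelihood GEV estimates (not the authors' code; note that scipy's shape convention is c = -xi, and the synthetic sample here is purely illustrative):

```python
import numpy as np
from scipy.stats import genextreme

def jackknife_gev(data):
    # Leave-one-out jackknife bias and standard error for the ML
    # estimates of the GEV parameters (scipy order: shape c, loc, scale).
    data = np.asarray(data, dtype=float)
    n = len(data)
    full = np.array(genextreme.fit(data))
    loo = np.array([genextreme.fit(np.delete(data, i)) for i in range(n)])
    mean_loo = loo.mean(axis=0)
    bias = (n - 1) * (mean_loo - full)                       # jackknife bias
    se = np.sqrt((n - 1) / n * ((loo - mean_loo) ** 2).sum(axis=0))
    return full - bias, bias, se                             # corrected, bias, SE

rng = np.random.default_rng(0)
sample = genextreme.rvs(-0.1, loc=10.0, scale=2.0, size=40, random_state=rng)
corrected, bias, se = jackknife_gev(sample)
```

Each of the n leave-one-out fits is a full ML estimation, so for large samples a grouped (delete-d) jackknife or the bootstrap is usually cheaper.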
A GARCH-M (generalized autoregressive conditional heteroskedasticity in the mean) model is used to analyse the volatility clustering phenomenon in mobile communication network traffic. Normal, t, and generalized Pareto distribution assumptions are adopted respectively to simulate the random component of the model. The demonstration on the quantiles of the network traffic series indicates that the common GARCH-M model can only partially deal with the "fat tail" problem: the fat-tail characteristic of the random component directly affects the accuracy of the calculation, and even the t distribution rests on an assumption about all the data. Extreme value theory, on the other hand, which concentrates only on the tail distribution, can provide more accurate results for high quantiles. The best result is obtained with the generalized Pareto distribution assumption for the random component of the GARCH-M model.
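The variance filter at the core of any GARCH-family model can be sketched as below (the GARCH-M risk-premium term in the mean equation is omitted, and the parameter values are hypothetical, not the paper's estimates):

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    # Conditional-variance recursion of a GARCH(1,1) model:
    #   sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]
    r = np.asarray(returns, dtype=float)
    s2 = np.empty(len(r))
    s2[0] = r.var()                       # a common initialization choice
    for t in range(1, len(r)):
        s2[t] = omega + alpha * r[t - 1] ** 2 + beta * s2[t - 1]
    return s2

rng = np.random.default_rng(42)
r = 0.01 * rng.standard_normal(500)       # synthetic "traffic innovation" series
s2 = garch11_variance(r, omega=1e-6, alpha=0.1, beta=0.85)
# A conditional high quantile then combines sqrt(s2[t]) with an innovation
# quantile, which EVT supplies from a GPD fit to the residual tail.
```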
Extreme value theory provides methods to analyze the most extreme parts of data. We predicted the ultimate 100 m dash records for men and women over specific periods using the generalized extreme value (GEV) distribution. The various diagnostic plots, which assessed the accuracy of the GEV model, fitted the 100 m records in the world and Japan well, validating the model. The men's world record had a shape parameter of -0.250 with a 95% confidence interval of [-0.391, -0.109]. The 100 m record therefore has a finite limit, and the calculated upper limit was 9.46 s. The return level estimates for the men's world record were 9.74, 9.62, and 9.58 s with 95% confidence intervals of [9.69, 9.79], [9.54, 9.69], and [9.48, 9.67] for 10-, 100- and 350-year return periods, respectively. In one year, the probability of occurrence of a new men's world record, 9.58 s (Usain Bolt), was 1/350, while that for women, 10.49 s (Florence Griffith-Joyner), was about 1/100, confirming that it is more difficult for men to break records than for women.
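The return-level and upper-limit calculations quoted in this abstract follow directly from the fitted GEV parameters. A sketch of the standard formulas (for record times, the fitted data are usually negated so that "fastest" becomes "largest"; the parameter values below are hypothetical, not the paper's fit):

```python
import math

def gev_return_level(mu, sigma, xi, T):
    # T-year return level of a GEV(mu, sigma, xi) annual-maximum model,
    # xi != 0 branch: the level exceeded on average once every T years.
    y = -math.log(1.0 - 1.0 / T)
    return mu - (sigma / xi) * (1.0 - y ** (-xi))

def gev_upper_limit(mu, sigma, xi):
    # Finite upper endpoint mu - sigma / xi, which exists only for xi < 0.
    assert xi < 0, "the GEV has a finite upper endpoint only for xi < 0"
    return mu - sigma / xi
```

For a negative shape parameter, return levels increase with T and converge to the finite upper endpoint, which is exactly the structure behind the 9.46 s limit reported above.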
One of the most important and interesting issues associated with earthquakes is the long-term trend of the extreme events. Extreme value theory provides methods for the analysis of the most extreme parts of data. We estimated the annual maximum magnitude of earthquakes in Japan by extreme value theory, using earthquake data between 1900 and 2019. The generalized extreme value (GEV) distribution was applied to fit the extreme indices, and the fitted distribution was used to estimate the probability of extreme values over specified time periods. The various diagnostic plots for assessing the accuracy of the GEV model fitted to the annual maximum earthquake magnitudes in Japan confirmed the validity of the model. The extreme value index ξ was evaluated as −0.163, with a 95% confidence interval of [−0.260, −0.0174] obtained by profile likelihood. Hence, the annual maximum magnitude of earthquakes has a finite upper limit. We obtained the maximum return level for return periods of 10, 20, 50, 100 and 500 years, along with their respective 95% confidence intervals; to get more accurate confidence intervals, we estimated the profile log-likelihood. The return level estimates were 7.83, 8.60 and 8.99, with 95% confidence intervals of [7.67, 8.06], [8.32, 9.21] and [8.61, 10.0] for the 10-, 100- and 500-year return periods, respectively. Hence, the 2011 off the Pacific coast of Tohoku Earthquake, which at magnitude 9.0 was the largest in the observation history of Japan, is a phenomenon that occurs once every 500 years.
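The profile-likelihood interval for the shape parameter mentioned above can be sketched as follows: fix ξ on a grid, maximize the likelihood over the remaining parameters, and keep the ξ whose profile value lies within χ²₁(0.95)/2 ≈ 1.92 of the maximum. This is a generic illustration on synthetic data, not the paper's analysis (scipy parameterizes the shape as c = -ξ, and `f0` pins it during fitting):

```python
import numpy as np
from scipy.stats import genextreme

def profile_loglik_xi(data, xi_grid):
    # Profile log-likelihood of the GEV shape xi: for each fixed xi,
    # maximize the likelihood over loc and scale only.
    prof = []
    for xi in xi_grid:
        c, loc, scale = genextreme.fit(data, f0=-xi)   # shape held fixed
        prof.append(genextreme.logpdf(data, c, loc, scale).sum())
    return np.array(prof)

rng = np.random.default_rng(7)
# Synthetic "annual maximum magnitudes" with xi = -0.163, as in the paper
mags = genextreme.rvs(0.163, loc=7.0, scale=0.35, size=120, random_state=rng)
grid = np.linspace(-0.5, 0.2, 15)
prof = profile_loglik_xi(mags, grid)
# 95% CI: {xi : prof(xi) >= prof.max() - 1.92}
```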
This paper investigates methods of value-at-risk (VaR) estimation using extreme value theory (EVT). It compares two different estimation methods, the 'two-step subsample bootstrap' based on moment estimation and maximum likelihood estimation (MLE), according to their theoretical bases and computation procedures. The estimation results are then analyzed together with those of the normal method and the empirical method. The empirical research on foreign exchange data shows that the EVT methods perform well in estimating VaR under extreme conditions, and that the 'two-step subsample bootstrap' method is preferable to MLE.
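Once a GPD tail has been fitted (by either estimation method), EVT-based VaR follows from the standard peaks-over-threshold formula; the sketch below shows that formula only, not the two-step subsample bootstrap itself, which is about choosing the number of tail observations:

```python
def var_pot(u, beta, xi, n, n_u, q):
    # POT-based VaR at confidence level q from a GPD(beta, xi) fitted to
    # the n_u exceedances over threshold u in a sample of n losses:
    #   VaR_q = u + (beta / xi) * (((n / n_u) * (1 - q)) ** (-xi) - 1)
    return u + (beta / xi) * (((n / n_u) * (1.0 - q)) ** (-xi) - 1.0)
```

Two built-in sanity checks: at q = 1 - n_u/n the formula returns the threshold itself, and VaR grows as q approaches 1 (faster when xi is larger, i.e. the tail is fatter).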
In recent years, red tides have erupted frequently and caused great economic losses. At present, most of the literature emphasizes academic research on the growth mechanism of red tide algae. In order to characterize red tides in detail and improve the precision of forecasts, this paper gives some new approaches to dealing with the red tide. Using extreme values, we carry out a red tide frequency analysis and obtain an estimate of the T-period red tide level U(T), the level that the concentration of red tide algae exceeds, on average, once in a period of length T.
It is very important to analyze network traffic in network control and management. In this paper, extreme value theory is first introduced and a model with threshold methods is proposed to analyze the characteristics of network traffic. In this model, only the traffic data greater than the threshold value is considered. The proposed model is then simulated against a trace using S-Plus software. The modeling results show that the network traffic model constructed from extreme value theory fits the empirical distribution well. Finally, the extreme value model is compared with FARIMA(p,d,q) modeling. The analytical results illustrate that extreme value theory has good application prospects in the statistical analysis of network traffic. In addition, since only the traffic data greater than the threshold is processed, the computation overhead is greatly reduced.
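The threshold model described here can be sketched in a few lines (Python with scipy rather than the paper's S-Plus, and synthetic heavy-tailed data standing in for a real traffic trace):

```python
import numpy as np
from scipy.stats import genpareto

def fit_pot(series, threshold):
    # Peaks-over-threshold: keep only the data above the threshold and
    # fit a GPD to the exceedances (location fixed at 0). Because only
    # the tail sample is processed, the computation stays cheap.
    exc = series[series > threshold] - threshold
    c, loc, scale = genpareto.fit(exc, floc=0)
    return c, scale, len(exc)

rng = np.random.default_rng(1)
traffic = rng.pareto(3.0, size=5000) + 1.0   # synthetic heavy-tailed "traffic"
c_hat, scale_hat, n_exc = fit_pot(traffic, threshold=2.0)
# For this Lomax(3) sample, exceedances over 2.0 are exactly GPD with
# shape 1/3 and scale 2/3, so the fit should land near those values.
```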
Extreme value theory provides methods to analyze the most extreme parts of data. We used the generalized extreme value (GEV) distribution to predict the ultimate 100 m, 200 m, 400 m, 4 × 100 m relay, and long jump records of male gold medalists at the Olympics. The diagnostic plots, which assessed the accuracy of the GEV model, fitted all event records well, validating the model. The 100 m, 200 m, 400 m, 4 × 100 m, and long jump records had negative shape parameters and calculated upper limits of 9.58 s, 19.18 s, 42.97 s, 36.71 s, and 9.03 m, respectively. The calculated upper limit in the 100 m (9.58 s) was equal to the record of Usain Bolt (August 16, 2009). The 100 m and 200 m world records were close to the calculated upper limits, and achieving the calculated limits would be difficult. The 400 m and 4 × 100 m relay world records were almost equal to the calculated upper limits and the 500-year return level estimates, and slight improvement is possible in both. For the Tokyo Olympics in August 2021, the one-year probability of occurrence of a record was about 1/30 in the 100 m, 200 m, and 4 × 100 m, and about 1/20 in the 400 m and long jump. In the 100 m, 200 m, and 4 × 100 m relay, the difficulty of the records shows that fierce competition has taken place.
It has been theoretically proven that at a high threshold an approximate expression for a quantile of the GEV (generalized extreme value) distribution can be derived from the GPD (generalized Pareto distribution). A quantile of extreme rainfall events for a given return period is then found using L-moment estimation, with extreme rainfall events simulated by both GPD and GEV and all aspects of their results compared. Numerical simulations show that the POT (peaks over threshold)-based GPD is advantageous in its simple operation and is subject to practically no effect of the sample size of the primitive series, producing steadily high-precision fittings over the whole range of values (including the heavy-tailed high end). In comparison, the BM (block maximum)-based GEV is limited, to some extent, in probability and quantile simulation, showing that GPD is an extension of GEV, the former being of greater utility and higher significance to climate research than the latter.
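The L-moment estimation mentioned here starts from the sample L-moments, which are simple linear combinations of probability-weighted moments of the order statistics. A sketch of the first three (Hosking's unbiased b-estimators; the subsequent mapping from L-moment ratios to GEV/GPD parameters is omitted):

```python
import numpy as np

def sample_lmoments(x):
    # First three sample L-moments (l1, l2, l3) via the unbiased
    # probability-weighted-moment estimators b0, b1, b2.
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    return b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
```

The L-skewness t3 = l3/l2 is the quantity usually matched against the theoretical ratio of the candidate distribution; for a symmetric sample it is exactly zero, which makes a convenient check.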
About 30 years of measurements made by the rain gauges located in Piedmont (Italy) have been analyzed. The rain gauges were divided into 4 datasets considering the complex orography near Turin, namely the flatlands, mountains, hills and urban areas. For each group of gauges, the generalized extreme value (GEV) distributions were estimated considering both the entire dataset of available data and different sets of 3 years of data in running mode. It is shown that the temporal series of GEV parameters estimated on the 3-year datasets do not present any specific trend over the entire period. The study presented here is preliminary to a future extreme rainfall event analysis using a high temporal and spatial resolution X-band weather radar with a limited temporal availability of radar maps covering the same area.
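The running-mode estimation can be sketched as a rolling-window refit; this is a generic illustration (synthetic maxima, a longer window than the paper's 3 years, since very short windows make ML fits noisy):

```python
import numpy as np
from scipy.stats import genextreme

def rolling_gev_params(maxima, window):
    # Re-fit GEV parameters (scipy order: shape c = -xi, loc, scale)
    # over running windows; trend checks then look at each column.
    m = np.asarray(maxima, dtype=float)
    return np.array([genextreme.fit(m[s:s + window])
                     for s in range(len(m) - window + 1)])

rng = np.random.default_rng(3)
maxima = genextreme.rvs(0.1, loc=30.0, scale=8.0, size=40, random_state=rng)
params = rolling_gev_params(maxima, window=15)
```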
Extreme events are defined as values of the event below or above a certain value called the threshold. A well-chosen threshold helps to identify the extreme levels. Several methods have been used to determine thresholds for analyzing and modeling extreme events. One of the most successful is the maximum product of spacing (MPS) method. However, the method breaks down when there is a tie in the exceedances. This study offers a solution to model data even if it contains ties. To do so, an optimal threshold that gives more optimal parameters for extreme events was determined. The study achieved its main objective by deriving a method that improves the MPS method for determining an optimal threshold for extreme values in a data set containing ties, estimating the generalized Pareto distribution (GPD) parameters for the derived optimal threshold, and comparing these GPD parameters with those determined through the standard MPS model. The improved maximum product of spacing method used the GPD and peaks over threshold (POT) methods as the basis for identifying extreme values. This study will help statisticians in different sectors of the economy to model extreme events involving ties, since the structure of the extreme levels in the tails of ordinary distributions is central to analyzing, predicting and forecasting the likelihood of an occurrence of an extreme event.
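To make the tie problem concrete: the MPS objective sums the logs of CDF spacings between sorted observations, and a tie makes a spacing zero, sending the log to minus infinity. A common remedy, sketched below, substitutes the log density at the tied point; this illustrates the kind of modification the study builds on, not the study's exact method:

```python
import numpy as np
from scipy.stats import genpareto

def neg_mps(params, exc):
    # Negative Moran (maximum-product-of-spacings) objective for a GPD
    # fit to exceedances; minimize over (shape c, scale).
    c, scale = params
    if scale <= 0:
        return np.inf
    x = np.sort(np.asarray(exc, dtype=float))
    F = genpareto.cdf(x, c, loc=0, scale=scale)
    sp = np.diff(np.concatenate(([0.0], F, [1.0])))   # n + 1 spacings
    logs = np.zeros_like(sp)
    pos = sp > 0
    logs[pos] = np.log(sp[pos])
    for j in np.flatnonzero(~pos):
        # Zero spacing from a tie: fall back to the log density there.
        logs[j] = genpareto.logpdf(x[min(j, len(x) - 1)], c, loc=0, scale=scale)
    return -logs.sum()
```

Minimizing `neg_mps` over (c, scale), e.g. with `scipy.optimize.minimize`, yields finite MPS estimates even when the exceedances contain ties.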
Classification of intertidal areas in synthetic aperture radar (SAR) images is an important yet challenging issue, considering the complicated and dramatically changing features of tidal fluctuation. The difficulty of intertidal area classification is compounded because a high proportion of this area is frequently flooded by water, often making statistical modeling methods with spatial contextual information ineffective. Because polarimetric entropy and anisotropy play significant roles in characterizing intertidal areas, in this paper we propose a novel unsupervised contextual classification algorithm. The key point of the method is to combine a generalized extreme value (GEV) statistical model of the polarization features with a Markov random field (MRF) for contextual smoothing. A goodness-of-fit test is added to determine the significance of the components of the statistical model. The final classification results are obtained by effectively combining the results for polarimetric entropy and anisotropy. Experimental results on polarimetric data obtained by the Chinese Gaofen-3 SAR satellite demonstrate the feasibility and superiority of the proposed classification algorithm.
One of the more critical issues in a changing climate is the behavior of extreme weather events, such as the severe tornadic storms seen recently in Moore and El Reno, Oklahoma. It is generally thought that such events will increase under a changing climate. How to evaluate this extreme behavior is a topic currently under much debate and investigation. One approach is to look at the behavior of large-scale indicators of severe weather. The use of the generalized extreme value distribution for annual maxima is explored for a combination product of convective available potential energy and wind shear. Results from this initial study show successful modeling and high-quantile prediction using extreme value methods. Predicted large-scale values are consistent across different extreme value modeling frameworks, and a general increase over time in predicted values is indicated. A case study utilizing this methodology considers the large-scale atmospheric indicators for the region of Moore, Oklahoma for the EF5 tornadoes on May 3, 1999 and May 20, 2013, and for the EF5 storm in El Reno, Oklahoma on May 31, 2013.
Extreme rainfall events are primary natural hazards which pose a severe threat to people and their property in populated cities, which in Vietnam are normally located in coastal areas. Analysing these events using a data series observed over many years helps us draw a picture of how climate change impacts local environments. The purpose of this report is to understand the characteristics of the extreme rainfall events in the Mekong River delta (south Vietnam). Daily rainfall data over a period of 30 years for one meteorological station in each area were collected from the Vietnam National Hydro-meteorological Service. Extreme rainfall events were defined as those exceeding the 95th percentile for each station. The analytical results show that the rainfall thresholds (95th percentile) are 37.4 mm/day at Nam Can station, 27 mm/day at My Thanh station, 22.4 mm/day at Hoa Binh station, 23.8 mm/day at Binh Dai station and 22.7 mm/day at Ben Trai station. The highest rainfall values ever recorded during 1983-2012 are 246.4 mm/day (Nam Can), 174.5 mm/day (My Thanh), 179 mm/day (Hoa Binh), 187.3 mm/day (Binh Dai) and 136.3 mm/day (Ben Trai). The results of the Mann-Kendall tests show that there was a significant increase in rainfall at the Nam Can and My Thanh stations in two periods (1983-2012, 1998-2012), while no clear trend was recorded at the Hoa Binh, Binh Dai and Ben Trai stations. In order to estimate the return period of the extreme rainfall events, the generalized extreme value distribution was used to fit the frequency distribution. The magnitudes of daily maximum rainfall were estimated for return periods from 2 to 100 years. The return period results show that the maximum rainfalls over 50 years are 46.6 mm at Nam Can station (highest) and 31.4 mm at Hoa Binh station (lowest). Similarly, maximum rainfalls are expected to be about 55.1 mm at Nam Can station and 37.2 mm at Hoa Binh station over 100 years.
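The Mann-Kendall trend test used above can be sketched in a few lines; this version uses the no-ties variance and omits the tie correction, so it is an illustration rather than a full implementation:

```python
import numpy as np

def mann_kendall(x):
    # Mann-Kendall trend test: S counts concordant minus discordant
    # pairs; Z applies the no-ties variance n(n-1)(2n+5)/18 and a
    # continuity correction. |Z| > 1.96 flags a trend at the 5% level.
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# The station thresholds quoted above are simply 95th percentiles of the
# daily series, e.g. np.percentile(daily_rain, 95).
```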
This study investigates two calendar anomalies, the day-of-the-week effect and the seasonal effect, in the Value-at-Risk (VaR) analysis of stock returns for AAPL during the period 1995 through 2015. The statistical properties are examined and a comprehensive set of diagnostic checks is made on the two decades of AAPL daily stock returns. Combining the extreme value approach with a statistical analysis, we find that the lowest VaR typically occurs on Fridays and Mondays. Moreover, high Q4 and Q3 VaR are observed during the test period. These results are valuable for anyone who needs evaluation and forecasts of the risk situation in AAPL. Moreover, this methodology, which is applicable to any other stock or portfolio, is more realistic and comprehensive than the standard normal-distribution-based VaR model that is commonly used.
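The day-of-the-week comparison can be sketched with a simple empirical per-weekday VaR (the paper's estimator is EVT-based; this minimal version, on hypothetical returns, only illustrates the grouping idea):

```python
import numpy as np

def var_by_weekday(returns, weekdays, alpha=0.05):
    # Empirical one-day VaR per weekday: the lower alpha-quantile of
    # that day's returns, sign-flipped so that VaR is a positive loss.
    returns = np.asarray(returns, dtype=float)
    weekdays = np.asarray(weekdays)
    return {d: -np.quantile(returns[weekdays == d], alpha)
            for d in sorted(set(weekdays.tolist()))}

# Hypothetical example: "weekday 1" has the fatter lower tail, so its
# VaR comes out larger than weekday 0's.
r = np.array([-0.05, -0.01, 0.01, -0.10, -0.02, 0.02])
d = np.array([0, 0, 0, 1, 1, 1])
vars_by_day = var_by_weekday(r, d)
```

Replacing the empirical quantile with a GPD-tail quantile per weekday turns this grouping into the EVT version of the comparison.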
Funding (GARCH-M network traffic study): supported by the University and College Doctoral Subject Special Scientific Research Fund (No. 20040056041).
Funding (VaR estimation study): supported by the National Natural Science Foundation of China (No. 79970041).
Funding (red tide study): supported by the National Natural Science Foundation of China (No. 10472077).
Funding (GPD/GEV rainfall study): supported jointly by the National Natural Science Foundation of China (Grant No. 40675043) and the Program of the Jiangsu Key Laboratory of Meteorological Disaster (Grant No. KLME050209).
文摘It has been theoretically proven that at a high threshold an approximate expression for a quantile of GEV (Generalized Extreme Values) distribution can be derived from GPD (Generalized Pareto Distribution). Afterwards, a quantile of extreme rainfall events in a certain return period is found using L-moment estimation and extreme rainfall events simulated by GPD and GEV, with all aspects of their results compared. Numerical simulations show that POT (Peaks Over Threshold)-based GPD is advantageous in its simple operation and subjected to practically no effect of the sample size of the primitive series, producing steady high-precision fittings in the whole field of values (including the high-end heavy tailed). In comparison, BM (Block Maximum)-based GEV is limited, to some extent, to the probability and quantile simulation, thereby showing that GPD is an extension of GEV, the former being of greater utility and higher significance to climate research compared to the latter.
文摘About 30 years of measurements made by the rain gauges located in Piedmont (Italy) have been analyzed. Rain gauges have been divided into 4 datasets considering the complex orography near Turin, namely the flatlands, mountains, hills and urban areas. For each group of gauges, the Generalized Extreme Values (GEV) distributions are estimated considering both the entire dataset of available data and different sets of 3 years of data in running mode. It is shown that the GEV estimated parameters temporal series for the 3 years dataset do not present any specific trend over the entire period. The study presented here is preliminary to a future extreme rainfall event analysis using high temporal and spatial resolution X-band weather radar with a limited temporal availability of radar maps covering the same area.
Abstract: Extreme events are defined as values falling below or above a certain value called the threshold. A well-chosen threshold helps to identify extreme levels. Several methods have been used to determine thresholds for analyzing and modeling extreme events; one of the most successful is the maximum product of spacings (MPS). However, the method breaks down when there are ties in the exceedances. This study offers a way to model data even when it contains ties. The study derived a method that improves MPS for determining an optimal threshold for extreme values in a data set containing ties, estimated the Generalized Pareto Distribution (GPD) parameters at the derived optimal threshold, and compared these parameters with those obtained through the standard MPS model, using the GPD and the peaks-over-threshold (POT) method as the basis for identifying extreme values. This work will help statisticians in different sectors of the economy to model extreme events involving ties: the structure of the extreme levels in the tails of ordinary distributions is central to analyzing, predicting, and forecasting the likelihood of an extreme event.
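The breakdown the study addresses comes from zero spacings: MPS maximizes the product of CDF spacings between ordered observations, and a tie makes a spacing exactly zero. One common repair (a way to sketch the idea, not necessarily the paper's exact construction) is to replace each zero spacing with the density at the tied point. An illustrative Python version for GPD exceedances:

```python
import numpy as np
from scipy import stats, optimize

def mps_objective(params, x):
    """Negative log product-of-spacings for a GPD (loc fixed at 0).
    Zero spacings caused by ties are replaced by the pdf at the
    tied point, so the objective stays finite."""
    xi, log_sigma = params
    sigma = np.exp(log_sigma)                 # keep the scale positive
    xs = np.sort(x)
    cdf = stats.genpareto.cdf(xs, xi, loc=0.0, scale=sigma)
    spacings = np.diff(np.concatenate(([0.0], cdf, [1.0])))
    tied = spacings <= 0.0
    if tied.any():
        pts = np.concatenate(([xs[0]], xs))   # point owning each spacing
        pdf = stats.genpareto.pdf(pts, xi, loc=0.0, scale=sigma)
        spacings = np.where(tied, pdf, spacings)
    return -np.sum(np.log(np.clip(spacings, 1e-300, None)))

# Exceedances over a threshold, deliberately containing ties:
exc = np.array([0.2, 0.5, 0.5, 0.9, 1.3, 1.3, 2.1, 3.4])
res = optimize.minimize(mps_objective, x0=np.array([0.1, 0.0]),
                        args=(exc,), method="Nelder-Mead")
xi_hat, sigma_hat = res.x[0], float(np.exp(res.x[1]))
```

Without the `tied` branch, the two repeated values would drive the product of spacings to zero and the objective to infinity, which is exactly the failure mode the abstract describes.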
Funding: Project supported by the National Natural Science Foundation of China (No. 61331017).
Abstract: Classification of intertidal areas in synthetic aperture radar (SAR) images is an important yet challenging problem, given the complex and rapidly changing features caused by tidal fluctuation. The difficulty is compounded because a high proportion of this area is frequently flooded by water, often rendering statistical modeling methods with spatial contextual information ineffective. Because polarimetric entropy and anisotropy play significant roles in characterizing intertidal areas, in this paper we propose a novel unsupervised contextual classification algorithm. The key idea is to combine a generalized extreme value (GEV) statistical model of the polarization features with a Markov random field (MRF) for contextual smoothing. A goodness-of-fit test is added to determine the significance of the components of the statistical model. The final classification results are obtained by effectively combining the results from polarimetric entropy and anisotropy. Experimental results on polarimetric data from the Chinese Gaofen-3 SAR satellite demonstrate the feasibility and superiority of the proposed classification algorithm.
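The goodness-of-fit step can be illustrated with a Kolmogorov–Smirnov test of a feature sample against its fitted GEV. This is a generic sketch on synthetic feature values, not the paper's SAR pipeline:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Stand-in for a polarimetric feature sample from one class (assumption):
feature = stats.genextreme.rvs(-0.1, loc=0.5, scale=0.1,
                               size=1000, random_state=rng)

c, loc, scale = stats.genextreme.fit(feature)

# KS test of the sample against the fitted GEV; a large p-value means
# the GEV component is not rejected for this class.
ks_stat, p_value = stats.kstest(feature, "genextreme", args=(c, loc, scale))
```

One caveat: testing against parameters estimated from the same sample biases the KS p-value upward, so in practice a corrected variant or held-out data is preferable.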
Abstract: One of the more critical issues in a changing climate is the behavior of extreme weather events, such as the severe tornadic storms seen recently in Moore and El Reno, Oklahoma. It is generally thought that such events would increase under a changing climate. How to evaluate this extreme behavior is a topic currently under much debate and investigation. One approach is to look at the behavior of large-scale indicators of severe weather. The use of the generalized extreme value distribution for annual maxima is explored for a product of convective available potential energy and wind shear. Results from this initial study show successful modeling and high-quantile prediction using extreme value methods. Predicted large-scale values are consistent across different extreme value modeling frameworks, and a general increase over time in predicted values is indicated. A case study using this methodology considers the large-scale atmospheric indicators for the region of Moore, Oklahoma for the EF5 tornadoes of May 3, 1999 and May 20, 2013, and for the EF5 storm in El Reno, Oklahoma on May 31, 2013.
Abstract: Extreme rainfall events are primary natural hazards that pose a severe threat to people and property in populated cities, which in Vietnam are normally located in coastal areas. Analyzing these events using data series observed over many years helps build a picture of how climate change impacts local environments. The purpose of this report is to characterize extreme rainfall events in the Mekong River Delta (southern Vietnam). Daily rainfall data over a 30-year period for one meteorological station in each area were collected from the Vietnam National Hydro-Meteorological Service. Extreme rainfall events were defined as those exceeding the 95th percentile for each station. The analysis shows that the 95th-percentile rainfall values are 37.4 mm/day at Nam Can station, 27 mm/day at My Thanh station, 22.4 mm/day at Hoa Binh station, 23.8 mm/day at Binh Dai station, and 22.7 mm/day at Ben Trai station. The highest rainfall values recorded during 1983-2012 are 246.4 mm/day (Nam Can), 174.5 mm/day (My Thanh), 179 mm/day (Hoa Binh), 187.3 mm/day (Binh Dai), and 136.3 mm/day (Ben Trai). Mann-Kendall tests show a significant increase in rainfall at the Nam Can and My Thanh stations in both periods examined (1983-2012, 1998-2012), while no clear trend was recorded at the Hoa Binh, Binh Dai, and Ben Trai stations. To estimate the return periods of extreme rainfall events, the Generalized Extreme Value distribution was used to fit the frequency distribution, considering return periods of daily maximum rainfall from 2 to 100 years. The results show that the 50-year maximum rainfall is 46.6 mm at Nam Can station (highest) and 31.4 mm at Hoa Binh station (lowest); similarly, the 100-year maxima are expected to be about 55.1 mm at Nam Can station and 37.2 mm at Hoa Binh station.
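The threshold and trend steps can be sketched in Python: the Mann-Kendall test on an annual series is equivalent to Kendall's tau between the series and time, which `scipy.stats.kendalltau` provides. The data below are synthetic stand-ins, not the station records:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Synthetic daily rainfall for one station over 30 "years" of 365 days
# (stationary stand-in data, so no real trend is present)
daily = stats.gamma.rvs(0.5, scale=12.0, size=30 * 365, random_state=rng)

# Extreme-event threshold: the station's 95th percentile
p95 = float(np.quantile(daily, 0.95))

# Annual count of days exceeding the threshold
years = np.arange(1983, 2013)
counts = (daily.reshape(30, 365) > p95).sum(axis=1)

# Mann-Kendall trend test == Kendall's tau of the series against time;
# for this stationary sample the test should find no significant trend,
# as reported for the Hoa Binh, Binh Dai, and Ben Trai stations.
tau, p_value = stats.kendalltau(years, counts)
```

A station with a genuine upward drift, as at Nam Can or My Thanh, would instead yield a positive tau with a small p-value.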
Abstract: This study investigates calendar anomalies (the day-of-the-week effect and the seasonal effect) in the Value-at-Risk (VaR) analysis of AAPL stock returns over the period 1995 through 2015. The statistical properties are examined and a comprehensive set of diagnostic checks is made on the two decades of AAPL daily stock returns. Combining the extreme value approach with statistical analysis, we find that the lowest VaR typically occurs on Fridays and Mondays. Moreover, high Q4 and Q3 VaR values are observed during the test period. These results are valuable for anyone who needs evaluation and forecasts of the risk situation in AAPL. Moreover, this methodology, which is applicable to any other stock or portfolio, is more realistic and comprehensive than the commonly used VaR model based on the standard normal distribution.
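An extreme-value VaR of the kind described can be sketched via peaks-over-threshold on the loss tail. The data, threshold, and confidence level below are illustrative assumptions, not the AAPL analysis itself:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic heavy-tailed daily log-returns (Student-t), a stand-in series
returns = stats.t.rvs(df=4, scale=0.01, size=5000, random_state=rng)

losses = -returns                      # VaR concerns the loss tail
u = np.quantile(losses, 0.95)          # POT threshold
exc = losses[losses > u] - u
xi, _, sigma = stats.genpareto.fit(exc, floc=0.0)

# EVT VaR at level q (q > 0.95 here), from the standard POT tail formula:
#   VaR_q = u + (sigma/xi) * (((1 - q) / (n_u / n)) ** (-xi) - 1)
q = 0.99
n_u, n = exc.size, losses.size
var_q = u + (sigma / xi) * (((1 - q) / (n_u / n)) ** (-xi) - 1.0)
print(var_q)
```

Grouping the returns by weekday or quarter before this computation yields the day-of-the-week and seasonal VaR comparisons the abstract reports.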