A Special Weight for Inverse Gaussian Mixing Distribution in Normal Variance Mean Mixture with Application

Abstract: Normal Variance-Mean Mixtures (NVMM) provide a general framework for deriving models with desirable properties for modelling financial market variables such as exchange rates, equity prices, and interest rates measured over short time intervals, i.e. daily or weekly. Such data sets are characterized by non-normality: they are usually skewed, fat-tailed, and exhibit excess kurtosis. The Generalised Hyperbolic Distribution (GHD) introduced by Barndorff-Nielsen (1977), which arises as a Normal variance-mean mixture with a Generalised Inverse Gaussian (GIG) mixing distribution, nests a number of special and limiting case distributions. The Normal Inverse Gaussian (NIG) distribution is obtained when the Inverse Gaussian is the mixing distribution, i.e., when the index parameter of the GIG is λ = −1/2. The NIG is very popular because of its analytical tractability.

In the mixing mechanism, the mixing distribution characterizes the prior information about the random variable of the conditional distribution. Considering finite mixture models is therefore one way of extending this work. The GIG is a three-parameter distribution, denoted GIG(λ, δ, γ), which nests several special and limiting cases. When λ = −1/2, we have GIG(−1/2, δ, γ), which is the Inverse Gaussian (IG) distribution. Other values of the index parameter λ yield further special cases; these distributions are related to the IG and are called weighted Inverse Gaussian distributions. In this work, we consider a finite mixture of two such weighted Inverse Gaussian distributions, show that the mixture is also a weighted Inverse Gaussian distribution, and use it to construct an NVMM. Due to the complexity of the likelihood, direct maximization is difficult. An EM-type algorithm is provided for Maximum Likelihood estimation of the parameters of the proposed model. We adopt an iterative scheme which is not based on an explicit solution of the normal equations. This subtle approach reduces the computational difficulty from solving the complicated quantities involved directly to designing an iterative scheme based on a representation of the normal equations. The algorithm is easily programmable, and we obtained monotonic convergence for the data sets used.
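The mixing mechanism the abstract describes can be sketched numerically. The snippet below is a minimal illustration of the NVMM construction X = μ + βZ + √Z·ε with an Inverse Gaussian mixing variable Z, the special case that yields the NIG distribution; it is not the paper's finite-mixture model, and the function name and parameter names (`mu`, `beta`, `ig_mean`, `ig_scale`) are illustrative choices, not the authors' notation.

```python
import numpy as np

def sample_nvmm_ig(n, mu=0.0, beta=0.5, ig_mean=1.0, ig_scale=2.0, seed=0):
    """Draw n variates X = mu + beta*Z + sqrt(Z)*eps with Z ~ IG, eps ~ N(0,1).

    With an Inverse Gaussian mixing distribution this is the NIG special
    case of the Normal Variance-Mean Mixture described in the abstract.
    """
    rng = np.random.default_rng(seed)
    # NumPy's Wald distribution is the Inverse Gaussian (mean, scale) form.
    z = rng.wald(ig_mean, ig_scale, size=n)
    eps = rng.standard_normal(n)
    return mu + beta * z + np.sqrt(z) * eps

x = sample_nvmm_ig(200_000)
# By construction E[X] = mu + beta * E[Z]; here 0 + 0.5 * 1.0 = 0.5,
# and beta > 0 induces the right skew typical of such financial data.
print(x.mean())
```

Because β enters through the mixing variable Z, the same mechanism that fattens the tails also skews the distribution, which is why NVMM models fit short-interval financial returns well.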
Authors: Calvin B. Maina, Patrick G. O. Weke, Carolyne A. Ogutu, Joseph A. M. Ottieno (Department of Mathematics and Actuarial Science, Kisii University, Kisii, Kenya; School of Mathematics, University of Nairobi, Nairobi, Kenya)
Source: Open Journal of Statistics, 2021, No. 6, pp. 977-992 (16 pages)
Keywords: Finite Mixture; Weighted Distribution; Mixed Model; EM-Algorithm