In this paper, we consider a Markov switching Lévy process model in which the underlying risky assets are driven by the stochastic exponential of a Markov switching Lévy process, and then apply the model to option pricing and hedging. In this model, the market interest rate, the volatility of the underlying risky assets and the N-state compensator depend on unobservable states of the economy, which are modeled by a continuous-time hidden Markov process. We use the minimal entropy martingale measure (MEMM) as the equivalent martingale measure. The option price under this model is obtained by the Fourier transform method. We obtain a closed-form solution for the hedge ratio by applying local risk-minimizing hedging.
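The paper prices options via the Fourier transform under the MEMM, which does not compress into a few lines. As a rough illustration of how regime switching feeds into an option value, the following Monte Carlo sketch (not the paper's method; the two-regime setup and all parameter values are invented for illustration) simulates a geometric price process whose volatility switches according to a hypothetical two-state Markov chain:

```python
import math
import random

def switching_mc_call(S0, K, T, r, vols, switch_rates, n_paths=10000, n_steps=100, seed=7):
    """Monte Carlo price of a European call when volatility switches between
    two regimes via a continuous-time Markov chain. vols[i] is the volatility
    in regime i; switch_rates[i] is the rate of leaving regime i."""
    rng = random.Random(seed)
    dt = T / n_steps
    payoff_sum = 0.0
    for _ in range(n_paths):
        S, regime = S0, 0
        for _ in range(n_steps):
            # Euler discretisation of the chain: switch with prob ~ rate * dt
            if rng.random() < switch_rates[regime] * dt:
                regime = 1 - regime
            sigma = vols[regime]
            z = rng.gauss(0.0, 1.0)
            S *= math.exp((r - 0.5 * sigma ** 2) * dt + sigma * math.sqrt(dt) * z)
        payoff_sum += max(S - K, 0.0)
    return math.exp(-r * T) * payoff_sum / n_paths

price = switching_mc_call(S0=100, K=100, T=1.0, r=0.03,
                          vols=(0.15, 0.35), switch_rates=(1.0, 1.0))
```

With symmetric switching the simulated price lands between the two constant-volatility Black-Scholes prices, as one would expect.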
Markov modeling of HIV/AIDS progression was done under the assumption that the state holding time (waiting time) had a constant hazard. This paper discusses the properties of the hazard function of the Exponential distribution and of its modifications, namely the proportional hazards (PH) and accelerated failure time (AFT) models, and their effectiveness in modeling the state holding time in Markov modeling of HIV/AIDS progression with and without risk factors. Patients were categorized by gender and age, with the female gender as the baseline. Data simulated using R software were fitted to each model, and the model parameters were estimated. The estimated P and Z values were then used to test the null hypothesis that the state waiting time data followed an Exponential distribution. Model identification criteria, namely the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the log-likelihood (LL), and R2, were used to evaluate the performance of the models. For the survival regression model, P and Z values supported non-rejection of the null hypothesis for mixed gender without interaction, and supported rejection for mixed gender with an interaction term and for males aged 50 - 60 years. Both parameters supported non-rejection of the null hypothesis in the remaining age groups. For the male gender with interaction, both P and Z values supported rejection in all age groups except 20 - 30 years. For the Cox proportional hazards and AFT models, both P and Z values supported non-rejection of the null hypothesis across all age groups. The P-values for the three models supported different decisions for and against the null hypothesis, with the AFT and Cox values supporting similar decisions in most of the age groups.
Among the models considered, the regression model provided a superior fit based on the AIC, BIC, LL, and R2 model identification criteria. This was particularly evident in age and gender subgroups where the data exhibited non-proportional hazards and violated the assumptions required by the Cox proportional hazards model. Moreover, the simplicity of the regression model, along with its ability to capture essential state transitions without overfitting, made it a more appropriate choice.
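The analysis above was carried out in R on simulated patient data; the underlying model-comparison idea (fit a waiting-time distribution by maximum likelihood, then compare AIC/BIC) can be sketched in self-contained Python. The Weibull grid search and all parameter values below are illustrative assumptions, not the paper's procedure:

```python
import math
import random

def exp_loglik(data, lam):
    """Log-likelihood of i.i.d. Exponential(rate=lam) data."""
    return len(data) * math.log(lam) - lam * sum(data)

def weibull_loglik(data, k, scale):
    """Log-likelihood of i.i.d. Weibull(shape=k, scale) data."""
    return sum(math.log(k / scale) + (k - 1) * math.log(x / scale) - (x / scale) ** k
               for x in data)

rng = random.Random(11)
waits = [rng.expovariate(0.5) for _ in range(500)]   # simulated state holding times
n = len(waits)

# Exponential fit: the rate MLE is 1 / sample mean
lam_hat = n / sum(waits)
ll_exp = exp_loglik(waits, lam_hat)
aic_exp = 2 * 1 - 2 * ll_exp
bic_exp = 1 * math.log(n) - 2 * ll_exp

# Weibull fit: crude grid search over the shape; given k, the scale MLE is closed-form
best = None
for i in range(21):
    k = 0.5 + 0.1 * i                                # shapes 0.5 .. 2.5
    scale = (sum(x ** k for x in waits) / n) ** (1 / k)
    ll = weibull_loglik(waits, k, scale)
    if best is None or ll > best[0]:
        best = (ll, k)
ll_wei, k_hat = best
aic_wei = 2 * 2 - 2 * ll_wei
```

Because the Weibull nests the exponential at shape k = 1, its log-likelihood can only match or beat the exponential's; AIC and BIC then ask whether that gain justifies the extra parameter.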
The issue of document management has long been raised, especially with the appearance of office automation in the 1980s, which led to dematerialization and Electronic Document Management (EDM). In the same period, workflow management experienced significant development, but became more focused on industry. However, it seems to us that document workflows have not received the same interest from the scientific community. Nowadays, the emergence and supremacy of the Internet in electronic exchanges are leading to a massive dematerialization of documents, which requires a conceptual reconsideration of the organizational framework for processing these documents in both public and private administrations. This problem seems open to us and deserves the interest of the scientific community. Indeed, EDM has mainly focused on the storage (referencing) and circulation (traceability) of documents; it has paid little attention to the overall behavior of the system in processing documents. The purpose of our research is to model document processing systems. In previous works, we proposed a general model and its specialization to the case of small documents (any document processed by a single person at a time during its processing life cycle), which represent 70% of the documents processed by administrations, according to our study. In this contribution, we extend the model for processing small documents to the case where they are managed in a system comprising document classes organized into subclasses, which is the case for most administrations. We have thus observed that this model is a Markovian M^(L×K)/M^(L×K)/1 queueing network.
We have analyzed the constraints of this model and deduced certain characteristics and metrics. Ultimately, the objective of our work is to design a document workflow management system integrating a component for predicting global behavior.
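The M^(L×K)/M^(L×K)/1 network analysis itself is beyond the scope of an abstract, but the per-node building block is the classical M/M/1 queue, whose steady-state metrics are closed-form. A minimal sketch using the standard textbook formulas (not the paper's network model):

```python
def mm1_metrics(lam, mu):
    """Steady-state metrics of an M/M/1 queue with Poisson arrivals at rate
    lam and exponential service at rate mu (requires lam < mu for stability)."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu                 # server utilisation
    L = rho / (1 - rho)            # mean number in the system
    W = 1 / (mu - lam)             # mean time in the system (Little's law: L = lam * W)
    Lq = rho ** 2 / (1 - rho)      # mean number waiting, excluding the one in service
    return {"rho": rho, "L": L, "W": W, "Lq": Lq}

metrics = mm1_metrics(lam=2.0, mu=4.0)
```

Chaining such nodes, with routing between document subclasses, is what the network model generalizes.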
A Bayesian approach using Markov chain Monte Carlo algorithms has been developed to analyze Smith's discretized version of the discovery process model. It avoids the problems involved in the maximum likelihood method by effectively combining information from the prior distribution with that from the discovery sequence through posterior probabilities. All statistical inferences about the parameters of the model and the total resources can be quantified by drawing samples directly from the joint posterior distribution. In addition, the statistical errors of the samples can be easily assessed, and convergence can be monitored during sampling. Because the information contained in a discovery sequence is not enough to estimate all parameters, especially the number of fields, geologically justified prior information is crucial to the estimation. The Bayesian approach allows the analyst to specify his subjective estimates of the required parameters, and his degree of uncertainty about those estimates, in a clearly identified fashion throughout the analysis. As an example, this approach is applied to the same North Sea data on which Smith demonstrated his maximum likelihood method. For this case, the Bayesian approach substantially improves on the overly pessimistic results and downward bias of the maximum likelihood procedure.
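Stripped of the discovery-process specifics, the MCMC machinery described above is a Metropolis sampler whose posterior draws can be validated against a known answer. The toy below (an exponential-rate posterior under a conjugate Gamma prior; nothing here comes from Smith's model) illustrates drawing samples directly from a posterior:

```python
import math
import random

def log_posterior(lam, data, a=2.0, b=1.0):
    """Unnormalised log posterior for an exponential rate lam
    given i.i.d. data and a Gamma(a, b) prior on lam."""
    if lam <= 0:
        return float("-inf")
    return (a - 1 + len(data)) * math.log(lam) - lam * (b + sum(data))

def metropolis(data, n_iter=20000, step=0.3, seed=1):
    """Random-walk Metropolis sampler; returns the post-burn-in draws."""
    rng = random.Random(seed)
    lam, samples = 1.0, []
    for _ in range(n_iter):
        prop = lam + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_posterior(prop, data) - log_posterior(lam, data):
            lam = prop
        samples.append(lam)
    return samples[n_iter // 2:]          # discard the first half as burn-in

rng = random.Random(0)
data = [rng.expovariate(2.0) for _ in range(200)]
samples = metropolis(data)
post_mean = sum(samples) / len(samples)
analytic = (2.0 + len(data)) / (1.0 + sum(data))   # conjugate Gamma posterior mean
```

The conjugate Gamma(a + n, b + Σx) posterior gives an analytic mean against which the chain can be checked, mirroring the convergence monitoring the abstract mentions.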
Nowadays, remote sensing is an important technique for observing the Earth's surface, applied in areas such as land use, urban planning, remote monitoring, real-time soil deformation associated with earthquakes or landslides, variations in glacier thickness, measurement of volume changes during volcanic eruptions, deforestation, etc. Many approaches have been proposed to follow the evolution of these phenomena and to predict their future states. However, these approaches do not fully meet the needs of specialists, who still most commonly rely on data extracted from the images to make predictions in their studies. In this paper, we propose an innovative methodology based on hidden Markov models (HMMs). Our approach exploits temporal series of satellite images in order to predict spatio-temporal phenomena, using an HMM to represent, and make predictions about, any object in a satellite image. The first step builds a set of feature vectors gathering the available information. The next step applies the Baum-Welch learning algorithm to these vectors to detect state changes. Finally, the system interprets these changes to make predictions. The performance of our approach is evaluated through space-time interpretation of events over two study sites, using different time series of SPOT images, with an application to vegetation change using LANDSAT images.
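Baum-Welch re-estimates HMM parameters from the forward and backward passes; the forward pass alone, which scores an observation sequence under a given model, fits in a few lines. The two hypothetical states and observation symbols below merely stand in for image-derived classes (they are not the paper's feature vectors):

```python
def forward(obs, pi, A, B):
    """Forward algorithm: likelihood of an observation sequence under an HMM.
    pi[i]: initial state probs, A[i][j]: transition probs, B[i][o]: emission probs."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)

# Hypothetical 2-state model: state 0 = "vegetated", state 1 = "bare";
# observations 0/1 are two coarse spectral classes.
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
likelihood = forward([0, 1], pi, A, B)
```

A quick correctness check: summing the likelihood over every possible observation sequence of a fixed length must give exactly 1.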
We consider a continuous-time risk model based on a two-state Markov process, in which, after an exponentially distributed time, the claim frequency changes to a different level and can change back again in the same way. We derive the Laplace transform of the first passage time to surplus zero from a given negative surplus, and of the duration of negative surplus. Closed-form expressions are given in the case of exponential individual claims. Finally, numerical results are provided to show how to estimate the moments of the duration of negative surplus.
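The paper works through Laplace transforms; the "duration of negative surplus" quantity can also be sanity-checked by simulation. The sketch below uses a plain single-state compound Poisson model (not the paper's two-state Markov-modulated one), with premiums at rate c and exponential claims; with positive safety loading, the mean recovery time from deficit |u| is |u|/(c − λμ), which the simulation should roughly reproduce:

```python
import random

def time_to_recovery(u0, c, lam, mean_claim, t_max=1000.0, rng=None):
    """Simulate the time for surplus to climb from a negative level u0 back to
    zero: premiums accrue continuously at rate c, claims arrive as a
    Poisson(lam) process with Exponential(mean_claim) sizes.
    Returns t_max if recovery has not occurred by then."""
    rng = rng or random.Random()
    t, u = 0.0, u0
    while t < t_max:
        gap = rng.expovariate(lam)           # time until the next claim
        if u + c * gap >= 0.0:               # surplus reaches zero between claims
            return t + (-u) / c
        t += gap
        u += c * gap - rng.expovariate(1.0 / mean_claim)
    return t_max

rng = random.Random(3)
times = [time_to_recovery(-5.0, c=2.0, lam=1.0, mean_claim=1.0, rng=rng)
         for _ in range(2000)]
mean_duration = sum(times) / len(times)      # theory predicts 5/(2 - 1) = 5
```

Since upward movement is continuous, recovery can only happen along a premium-income stretch, which is why crossing is checked between claim arrivals.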
Drought conditions at a given location evolve randomly through time and are typically characterized by severity and duration. Researchers interested in modeling the economic effects of drought on agriculture or other water users often capture the stochastic nature of drought and its conditions via multiyear, stochastic economic models. Three major sources of uncertainty in applying a multiyear discrete stochastic model to evaluate user preparedness and response to drought are: (1) the assumption of independence of yearly weather conditions, (2) linguistic vagueness in the definition of drought itself, and (3) the duration of drought. One means of addressing these uncertainties is to re-cast drought as a stochastic, multiyear process using a “fuzzy” semi-Markov process. In this paper, we review “crisp” versus “fuzzy” representations of drought and show how fuzzy semi-Markov processes can aid researchers in developing more robust multiyear, discrete stochastic models.
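The "fuzzy" ingredient can be made concrete with overlapping membership functions: a deficit index then belongs partially to adjacent drought classes rather than crossing a crisp threshold. The class boundaries below are invented for illustration, not taken from any drought index standard:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: rises on [a, b], flat on [b, c], falls on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical fuzzy drought classes over a precipitation-deficit index
classes = {
    "mild":     lambda x: trapezoid(x, 0.0, 0.5, 1.0, 1.5),
    "moderate": lambda x: trapezoid(x, 1.0, 1.5, 2.0, 2.5),
    "severe":   lambda x: trapezoid(x, 2.0, 2.5, 4.0, 5.0),
}
memberships = {name: f(1.25) for name, f in classes.items()}
```

An index value of 1.25 is half "mild" and half "moderate", which is exactly the linguistic vagueness a crisp cutoff would erase.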
At any given time, a product stock manager is expected to carry out activities to check his or her holdings in general and to monitor the condition of the stock in particular. He should monitor the level or quantity available of a given product or item. On the basis of observations of the movements of previous periods, he may decide whether or not to order a certain quantity of products. This paper discusses the applicability of discrete-time Markov chains to making relevant decisions for the management of a stock of COTRA-Honey products. A Markov chain model based on the transition matrix and equilibrium probabilities was developed to help managers predict the likely state of the stock, in order to anticipate procurement decisions in the short, medium or long term. The objective of any manager is to ensure efficient management by limiting overstocking, minimising the risk of stock-outs as much as possible, and maximising profits. The resulting Markov chain model allows the manager to predict whether or not to order for the period following the current one and, if so, how much.
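The equilibrium probabilities mentioned above are the fixed point π = πP of the transition matrix, reachable by simple power iteration. A minimal sketch (the three stock states and matrix entries are hypothetical, not COTRA-Honey data):

```python
def stationary(P, n_iter=200):
    """Approximate the equilibrium distribution of a finite Markov chain
    by repeatedly applying the transition matrix P (rows sum to 1)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(n_iter):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Hypothetical 3-state stock model: 0 = low, 1 = adequate, 2 = overstocked
P = [[0.5, 0.4, 0.1],
     [0.2, 0.6, 0.2],
     [0.1, 0.5, 0.4]]
pi = stationary(P)
```

The long-run fraction of periods spent in the "low" state is what feeds a reorder decision: if it is too high under the current policy, the manager orders more.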
We consider risk minimization problems for Markov decision processes. With the aim of making the risk of the random reward variable at each time as small as possible, a risk measure is introduced using the conditional value-at-risk of the random immediate reward variables in a Markov decision process. Under this risk measure criterion, the risk-optimal policies are characterized by the optimality equations for the discounted and average cases. As an application, inventory models are considered.
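Conditional value-at-risk at level α is the expected loss in the worst (1 − α) tail; on a finite sample it reduces to a tail average, as in this sketch (a sample-based estimator for intuition, not the paper's optimality-equation treatment):

```python
def cvar(losses, alpha=0.95):
    """Sample conditional value-at-risk: mean loss over the worst
    (1 - alpha) fraction of the sorted sample."""
    xs = sorted(losses)
    k = min(int(len(xs) * alpha), len(xs) - 1)   # index where the tail begins
    tail = xs[k:]
    return sum(tail) / len(tail)

# Losses 1..100: the 95% CVaR is the mean of the five largest losses
tail_mean = cvar(list(range(1, 101)), alpha=0.95)
```

In the MDP setting, this statistic is applied to the random immediate reward at each stage, and policies are ranked by the resulting risk measure.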
A new stochastic epidemic model, namely a general continuous-time birth and death chain model, is formulated on the basis of a deterministic model that includes vaccination. We use a continuous-time Markov chain to construct the birth and death process. Through the Kolmogorov forward equation and the theory of moment generating functions, the corresponding population expectations are studied. A theoretical comparison of the stochastic model and its deterministic version is also given. Finally, numerical simulations are carried out to substantiate the theoretical results of the random walk.
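A continuous-time birth and death chain of this kind can be simulated exactly with the Gillespie algorithm: draw an exponential waiting time at the total event rate, then choose birth or death in proportion to its rate. A linear-rate sketch (the rates and population size are illustrative; this is not the paper's vaccination model):

```python
import random

def birth_death_path(n0, birth, death, t_max, seed=42):
    """Gillespie simulation of a linear birth-death chain: from state n,
    births occur at rate birth*n and deaths at rate death*n."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    path = [(0.0, n0)]
    while t < t_max and n > 0:                    # n = 0 is absorbing
        total_rate = (birth + death) * n
        t += rng.expovariate(total_rate)          # exponential waiting time
        if t >= t_max:
            break
        n += 1 if rng.random() < birth / (birth + death) else -1
        path.append((t, n))
    return path

path = birth_death_path(n0=10, birth=0.9, death=1.0, t_max=5.0)
```

Averaging many such paths recovers the population expectations that the Kolmogorov forward equation yields analytically.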
Funding: Supported by the National Natural Science Foundation of China (11201221) and the Natural Science Foundation of Jiangsu Province (BK2012468).
Funding: Supported in part by the National Natural Science Foundation of China and the Ministry of Education of China.