This study presents an estimation approach to non-life insurance claim counts relating to a specified time. The objective of this study is to estimate the parameters of the non-life insurance claim counting process, including the homogeneous Poisson process (HPP) and the non-homogeneous Poisson process (NHPP) with a bell-shaped intensity. We use an estimating function, the zero mean martingale (ZMM), as a procedure for parameter estimation in the insurance claim counting process. Then Λ(t), the compensator of N(t), is proposed for the number of claims in the time interval (0,t]. We present situations through a simulation study of both processes on the time interval (0,t]. Some examples of the situations in the simulation study are depicted by a sample path relating N(t) to its compensator Λ(t). In addition, an example of the claim counting process illustrates the result of compensator estimate misspecification.
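As a minimal illustration of the ZMM idea in the HPP case, the sketch below simulates a homogeneous Poisson process and checks that N(t) − Λ(t), with compensator Λ(t) = λt, averages to approximately zero across replications. All parameter values are illustrative assumptions, not values from the study.

```python
import random

def simulate_hpp(rate, horizon, rng):
    """Simulate a homogeneous Poisson process; return event times in (0, horizon]."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)  # i.i.d. exponential inter-arrival times
        if t > horizon:
            return times
        times.append(t)

rng = random.Random(42)
rate, horizon = 2.0, 10.0
# Zero mean martingale property: E[N(t) - Lambda(t)] = 0 with Lambda(t) = rate * t.
diffs = [len(simulate_hpp(rate, horizon, rng)) - rate * horizon
         for _ in range(2000)]
mean_diff = sum(diffs) / len(diffs)
```

With 2000 replications the sample mean of N(T) − λT should sit well within one standard error of zero.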
An important problem of actuarial risk management is the calculation of the probability of ruin. Using probability theory and the definition of the Laplace transform, one obtains expressions, in the classical risk model, for survival probabilities in a finite time horizon. Explicit solutions are then found by inverting the double Laplace transform, using algebra, the complex Laplace inversion formula, and Matlab, for the exponential claim amount distribution.
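The abstract's explicit solutions come from Laplace inversion; as a hedged cross-check, the same finite-horizon ruin probability in the classical (Cramér–Lundberg) model with exponential claims can be estimated by Monte Carlo. The parameters below (premium rate, claim rate, horizon) are illustrative assumptions, not the paper's values.

```python
import random

def ruin_probability(u, c, lam, mean_claim, horizon, n_paths, rng):
    """Monte Carlo estimate of the finite-horizon ruin probability in the
    classical risk model: surplus U(t) = u + c*t - (sum of claims up to t),
    with Poisson(lam) claim arrivals and exponential claim amounts."""
    ruined = 0
    for _ in range(n_paths):
        surplus, t = u, 0.0
        while True:
            wait = rng.expovariate(lam)            # time to next claim
            if t + wait > horizon:
                break                               # survived the horizon
            t += wait
            surplus += c * wait                     # premium income since last claim
            surplus -= rng.expovariate(1.0 / mean_claim)  # exponential claim amount
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths

rng = random.Random(7)
# Illustrative parameters: premium rate 1.2, claim rate 1, mean claim 1,
# horizon 10 -- a 20% safety loading; NOT values from the paper.
p_no_reserve = ruin_probability(0.0, 1.2, 1.0, 1.0, 10.0, 2000, rng)
p_large_reserve = ruin_probability(10.0, 1.2, 1.0, 1.0, 10.0, 2000, rng)
```

As expected, the estimated ruin probability decreases sharply as the initial reserve u grows.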
The aim of this study is to propose an estimation approach to non-life insurance claim counts related to the insurance claim counting process, including the non-homogeneous Poisson process (NHPP) with a bell-shaped intensity and a beta-shaped intensity. An estimating function, the zero mean martingale (ZMM), is used as a procedure for parameter estimation of the insurance claim counting process, and the parameters of the model claim intensity are estimated by the Bayesian method. Then Λ(t), the compensator of N(t), is proposed for the number of claims in a time interval (0,t]. Given the process over the time interval (0,t], situations are presented through a simulation study, and some examples of these situations are also depicted by a sample path relating N(t) to its compensator Λ(t).
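A standard way to simulate an NHPP with a bounded bell-shaped intensity is Lewis–Shedler thinning, paired here with a numerically integrated compensator Λ(t) = ∫₀ᵗ λ(s) ds. The Gaussian-shaped intensity and its parameters below are assumptions for illustration; the paper's actual intensity model may differ.

```python
import math
import random

def bell_intensity(t, a=5.0, m=5.0, s=1.5):
    """Bell-shaped claim intensity (illustrative parameters, peak a at time m)."""
    return a * math.exp(-((t - m) ** 2) / (2 * s ** 2))

def simulate_nhpp(intensity, horizon, lam_max, rng):
    """Lewis-Shedler thinning: generate candidate HPP(lam_max) events and
    accept each at time t with probability intensity(t) / lam_max."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(lam_max)
        if t > horizon:
            return times
        if rng.random() < intensity(t) / lam_max:
            times.append(t)

def compensator(intensity, t, n_steps=2000):
    """Lambda(t) = integral of the intensity over (0, t], trapezoidal rule."""
    h = t / n_steps
    total = 0.5 * (intensity(0.0) + intensity(t))
    total += sum(intensity(i * h) for i in range(1, n_steps))
    return total * h

rng = random.Random(1)
horizon, lam_max = 10.0, 5.0   # lam_max bounds the intensity (its peak is a = 5)
lam_hat = compensator(bell_intensity, horizon)
mean_count = sum(len(simulate_nhpp(bell_intensity, horizon, lam_max, rng))
                 for _ in range(500)) / 500
```

The martingale property again gives the check: the average simulated N(T) should be close to Λ(T).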
The issue of document management has been raised for a long time, especially with the appearance of office automation in the 1980s, which led to dematerialization and Electronic Document Management (EDM). In the same period, workflow management experienced significant development, but remained focused on industry. Document workflows, however, do not seem to have attracted the same interest from the scientific community. Nowadays, the emergence and supremacy of the Internet in electronic exchanges are leading to a massive dematerialization of documents, which requires a conceptual reconsideration of the organizational framework for processing these documents in both public and private administrations. This problem seems open to us and deserves the interest of the scientific community. Indeed, EDM has mainly focused on the storage (referencing) and circulation (traceability) of documents; it has paid little attention to the overall behavior of the system in processing them. The purpose of our research is to model document processing systems. In previous work, we proposed a general model and its specialization to the case of small documents (any document processed by a single person at a time during its processing life cycle), which, according to our study, represent 70% of the documents processed by administrations. In this contribution, we extend the model for processing small documents to the case where they are managed in a system comprising document classes organized into subclasses, which is the case for most administrations. We have thus observed that this model is a Markovian M^(L×K)/M^(L×K)/1 queueing network. We have analyzed the constraints of this model and deduced certain characteristics and metrics. Ultimately, the objective of our work is to design a document workflow management system integrating a component for predicting global behavior.
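The abstract models the system as a Markovian M^(L×K)/M^(L×K)/1 queueing network. As a much-reduced sketch of the kind of per-station metrics such an analysis builds on, the standard single-class M/M/1 formulas are shown below; this is textbook M/M/1, not the paper's multi-class model.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics of a single M/M/1 queue; requires rho < 1."""
    rho = arrival_rate / service_rate          # server utilization
    if rho >= 1.0:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    L = rho / (1.0 - rho)                      # mean number in system
    W = 1.0 / (service_rate - arrival_rate)    # mean time in system
    Lq = rho ** 2 / (1.0 - rho)                # mean queue length
    Wq = rho / (service_rate - arrival_rate)   # mean waiting time
    # Little's law ties these together: L = arrival_rate * W.
    return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

metrics = mm1_metrics(2.0, 5.0)  # illustrative rates, not from the paper
```

For arrival rate 2 and service rate 5, utilization is 0.4 and Little's law L = λW holds exactly.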
Glaucoma is a multifactorial optic neuropathy characterized by the damage and death of retinal ganglion cells. The disease results in vision loss and blindness. Any vision loss resulting from the disease cannot be restored, and there is currently no cure for glaucoma; however, early detection and treatment could offer neuronal protection and avoid later serious damage to visual function. A full understanding of the etiology of the disease will still require many scientific efforts. Glial activation has been observed in glaucoma, with microglial proliferation a hallmark of this neurodegenerative disease. A typical project studying these cellular changes in glaucoma often needs thousands of images, from several animals, covering different layers and regions of the retina. The gold standard for evaluating them is manual counting, which requires a large amount of time from specialized personnel and is a tedious process prone to human error. We present here a new method for counting microglial cells using a computer algorithm. It counts in one hour the same number of images that a researcher counts in four weeks, with no loss of reliability.
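The abstract does not specify the counting algorithm, so the sketch below only illustrates the general approach to automated cell counting: threshold the image, then count connected bright components. The threshold value and the 4-connectivity choice are assumptions for illustration, not the authors' method.

```python
def count_cells(image, threshold):
    """Count connected bright regions (candidate cells) in a 2D grey-level
    image, using thresholding plus 4-connected flood fill."""
    rows, cols = len(image), len(image[0])
    mask = [[image[r][c] > threshold for c in range(cols)] for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1                    # new component found
                stack = [(r, c)]
                seen[r][c] = True
                while stack:                  # flood-fill the whole component
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return count

# Tiny synthetic image with two bright blobs.
img = [[0, 0, 0, 0, 0],
       [0, 9, 9, 0, 0],
       [0, 9, 0, 0, 8],
       [0, 0, 0, 0, 8],
       [0, 0, 0, 0, 0]]
n_cells = count_cells(img, 5)
```

On the synthetic image the two bright blobs are counted as two cells; a real pipeline would add noise filtering and size constraints.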
Spinning has a significant influence on all textile processes, and the combined condition of the capital equipment reflects the critical state of the process. By transforming unprocessed fibers into carded sliver and yarn, the carding machine serves a critical role in the textile industry. The carding machine's licker-in and flat speeds are crucial operational factors that strongly influence the quality of the finished goods. The purpose of this study is to examine the link between licker-in and flat speeds and how they affect yarn and carded sliver quality. To accomplish this, a thorough experimental examination was carried out on a carding machine. The carded sliver and yarn produced under different combinations of licker-in and flat speeds were assessed for important quality factors including evenness, strength, and flaws. To account for changes in material qualities and machine settings, the study also considered the impact of various fiber types and processing conditions. The findings showed a direct relationship between the quality of the carded sliver and yarn and the licker-in and flat speeds. Within a limited range, greater licker-in speeds increased carding efficiency and decreased fiber tangling; extremely high speeds, on the other hand, led to more fiber breakage and neps. Higher flat speeds, in turn, helped to enhance fiber alignment, which increased the evenness and strength of the carded sliver and yarn. Additionally, the ideal combination of licker-in and flat speeds was found to vary with fiber type and processing conditions.
When being carded, various fibers displayed distinctive behaviors that required adjusting the operating settings to achieve the necessary quality results. The study also determined the critical speed ratios between the licker-in and flat speeds that reduced fiber breakage and increased the quality of the finished goods. The results of this study offer useful information for textile producers and process engineers to improve the quality of carded sliver and yarn while maximizing the performance of carding machines. By understanding the effects of licker-in and flat speeds, operators can make informed choices of machine settings and parameter adjustments, increasing efficiency, productivity, and product quality in the textile industry.
Funding (glaucoma study): supported by the Science Foundation of Arizona through the Bisgrove Program to PdG, Grant Number BSP 0529-13; the Ophthalmological Network OFTARED (RD12-0034/0002); the Institute of Health Carlos III; the PN I+D+i 2008–2011; the ISCIII-Subdireccion General de Redes y Centros de Investigación Cooperativa; the European Programme FEDER; the project SAF2014-53779-R; and the project "The role of encapsulated NSAIDs in PLGA microparticles as a neuroprotective treatment", funded by the Spanish Ministry of Economy and Competitiveness.