This study examines vishing, a form of social engineering scam that uses voice communication to deceive individuals into revealing sensitive information or losing money. With the rise of smartphone usage, people are more susceptible to vishing attacks. The proposed Emoti-Shing model analyzes potential victims’ emotions using Hidden Markov Models to track vishing scams by examining the emotional content of phone call audio conversations. This approach aims to detect vishing scams using biological features of humans, specifically emotions, which cannot be easily masked or spoofed. Experimental results on 30 generated emotions indicate the potential for increased vishing scam detection through this approach.
This paper presents an anomaly detection approach to detect intrusions into computer systems. In this approach, a hierarchical hidden Markov model (HHMM) is used to represent a temporal profile of normal behavior in a computer system. The HHMM of the norm profile is learned from historic data of the system's normal behavior. The observed behavior of the system is analyzed to infer the probability that the HHMM of the norm profile supports the observed behavior. A low probability of support indicates an anomalous behavior that may result from intrusive activities. The model was implemented and tested on the UNIX system call sequences collected by the University of New Mexico group. The testing results showed that the model can clearly identify the anomalous activities and has better performance than the hidden Markov model.
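As a rough illustration of the scoring step this abstract relies on (not the authors' hierarchical model), the sketch below computes the log-likelihood of a discrete observation sequence with the forward algorithm and flags sequences whose per-symbol log-likelihood falls below a threshold. The two-state model, the event alphabet, and the threshold are all hypothetical.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM.

    obs : 1-D int array of observation symbols
    pi  : (N,) initial state distribution
    A   : (N, N) transition matrix, A[i, j] = P(state j | state i)
    B   : (N, M) emission matrix,   B[i, k] = P(symbol k | state i)
    """
    alpha = pi * B[:, obs[0]]            # joint of first state and first symbol
    log_lik = 0.0
    for t in range(1, len(obs)):
        c = alpha.sum()                  # scale factor, avoids underflow
        log_lik += np.log(c)
        alpha = (alpha / c) @ A * B[:, obs[t]]
    log_lik += np.log(alpha.sum())
    return log_lik

# Hypothetical 2-state norm profile over 3 observable event types.
pi = np.array([0.6, 0.4])
A  = np.array([[0.9, 0.1],
               [0.2, 0.8]])
B  = np.array([[0.70, 0.25, 0.05],
               [0.15, 0.75, 0.10]])

normal_seq    = np.array([0, 0, 1, 0, 0, 1, 0])
anomalous_seq = np.array([2, 2, 2, 2, 2, 2, 2])   # symbol 2 is rare under both states

threshold = -1.5  # per-symbol log-likelihood; chosen for illustration only
for seq in (normal_seq, anomalous_seq):
    score = forward_log_likelihood(seq, pi, A, B) / len(seq)
    print(round(score, 3), "anomalous" if score < threshold else "normal")
```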
In this paper, we illustrate the use and power of Hidden Markov models in analyzing multivariate data over time. The data used in this study were obtained from the Organization for Economic Co-operation and Development (OECD.Stat database, https://stats.oecd.org/) and encompassed monthly data on the employment rate of males and females in Canada and the United States (aged 15 years and over; seasonally adjusted from January 1995 to July 2018). Two different underlying patterns of trends in employment over the 23-year observation period were uncovered.
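A minimal sketch of this kind of analysis, assuming the third-party hmmlearn package and synthetic data in place of the OECD series: a two-state Gaussian HMM is fitted to a multivariate monthly series and the hidden regime of each month is decoded.

```python
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)

# Synthetic 4-dimensional series (e.g. male/female rates for two countries)
# with a regime change halfway through; the numbers are purely illustrative.
regime1 = rng.normal(loc=[65, 58, 66, 59], scale=0.5, size=(140, 4))
regime2 = rng.normal(loc=[68, 62, 69, 63], scale=0.5, size=(140, 4))
X = np.vstack([regime1, regime2])

model = hmm.GaussianHMM(n_components=2, covariance_type="full",
                        n_iter=200, random_state=0)
model.fit(X)                       # Baum-Welch (EM) estimation
states = model.predict(X)          # Viterbi decoding of the hidden regimes

print("estimated state means:\n", model.means_.round(2))
print("decoded regime of first and last month:", states[0], states[-1])
```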
The links between low temperature and the incidence of disease have been studied by many researchers. What remains unclear is the exact nature of the relation, especially the mechanism by which changes in weather affect the onset of diseases. The existence of a lag period between exposure to temperature and its effect on mortality may reflect the nature of the onset of diseases; assessing lagged effects therefore becomes potentially important. Most studies on lags used lag-distributed Poisson regression and neglected extreme cases as random noise when estimating correlations. In order to assess the lagged effect, we propose a new approach, a Hidden Markov Model with states given by a Self-Organizing Map (HMM by SOM), apart from the well-known regression models. HMM by SOM is stochastic by nature and encompasses the extreme cases that were neglected by auto-regression models. The daily data on the number of patients transported by ambulance in Nagoya, Japan, were used. SOM was carried out to classify the meteorological elements into six classes, and these classes were used as the "states" of the HMM. The HMM was used to describe a background process that might produce the time series of the incidence of diseases; this background process was considered to change randomly among the weather states classified by SOM. We estimated the lagged effects of weather change on the onset of both cerebral infarction and ischemic heart disease. This is potentially important in that, if one could trace a path in the chain of events leading from temperature change to death, one might be able to prevent it and avert the fatal outcome.
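A minimal sketch of the "states by SOM" idea, assuming the third-party minisom package and synthetic weather data (the paper does not name its software): daily meteorological vectors are mapped onto six SOM nodes, and the winning node of each day becomes the state label from which an empirical transition matrix can be built.

```python
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(1)
# Columns: daily mean temperature, humidity, pressure (standardized, synthetic).
weather = rng.normal(size=(365, 3))

# A 6x1 map gives six weather classes, as in the study design.
som = MiniSom(6, 1, input_len=3, sigma=0.5, learning_rate=0.5, random_seed=1)
som.train_random(weather, 5000)

# The winning node (0..5) of each day is its weather class / HMM "state".
states = np.array([som.winner(day)[0] for day in weather])

# Empirical transition matrix between the six weather states.
A = np.zeros((6, 6))
for s, s_next in zip(states[:-1], states[1:]):
    A[s, s_next] += 1
A /= np.maximum(A.sum(axis=1, keepdims=True), 1)
print(A.round(2))
```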
A land cover classification procedure is presented utilizing the information content of fully polarimetric SAR images. The Cameron coherent target decomposition (CTD) is employed to characterize each pixel, using a set of canonical scattering mechanisms in order to describe the physical properties of the scatterer. The novelty of the proposed classification approach lies in the use of Hidden Markov Models (HMM) to uniquely characterize each type of land cover. The motivation for this approach is the investigation of the alternation between scattering mechanisms from SAR pixel to pixel. Depending on the observations (scattering mechanisms) and exploiting the transitions between the scattering mechanisms, we decide upon the HMM land cover type. The classification process is based on the likelihood of observation sequences being evaluated by each model. The performance of the classification approach is assessed by means of fully polarimetric SLC SAR data from the broader area of Vancouver, Canada, and was found satisfactory, reaching a success rate from 87% to over 99%.
In this paper the authors look into the three problems of Hidden Markov Models (HMMs): the evaluation, the decoding, and the learning problem. The authors have explored an approach to increase the effectiveness of HMMs in the speech recognition field. Although hidden Markov modeling has significantly improved the performance of current speech-recognition systems, the general problem of completely fluent speaker-independent speech recognition is still far from being solved. For example, there is no system capable of reliably recognizing unconstrained conversational speech. Also, there does not exist a good way to statistically infer the language structure from a limited corpus of spoken sentences. Therefore, the authors provide an overview of the theory of HMMs, discuss the role of statistical methods, and point out a range of theoretical and practical issues that deserve attention and are necessary to understand in order to further advance research in the field of speech recognition.
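The three classical problems named above map directly onto concrete routines. The sketch below, which assumes the third-party hmmlearn package and stand-in feature vectors, shows evaluation (score, forward algorithm), decoding (decode, Viterbi), and learning (fit, Baum-Welch) on one model.

```python
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))            # stand-in acoustic feature vectors

model = hmm.GaussianHMM(n_components=3, covariance_type="diag",
                        n_iter=50, random_state=0)
model.fit(X)                             # learning problem  (Baum-Welch / EM)
log_lik = model.score(X)                 # evaluation problem (forward algorithm)
log_prob, state_path = model.decode(X)   # decoding problem   (Viterbi)

print("log-likelihood:", round(log_lik, 1))
print("first ten decoded states:", state_path[:10])
```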
Many kinds of channel currents are especially weak, and the background noise dominates in patch clamp recordings. This makes threshold detection fail when estimating the transition probabilities. Direct fitting of the patch clamp recording, rather than of the histogram derived from the recordings, is therefore a desirable way to estimate the transition probabilities. The iterative batch EM algorithm based on the hidden Markov model has been used in this field, but it suffers from the "curse of dimensionality" and, moreover, cannot track the variation of the parameters. A new online sequential iterative algorithm is proposed here, which requires less computational effort and can adaptively track the variation of the parameters. Simulations suggest that it is robust, effective, and convenient.
Several studies have been devoted to investigating the effects of meteorological factors on the occurrence of stroke. Regression models have mostly been used to assess the correlation between weather and stroke incidence. However, these methods cannot describe the process proceeding in the background of stroke incidence. The purpose of this study was to provide a new approach based on Hidden Markov Models (HMMs) and self-organizing maps (SOM), interpreting the background from the viewpoint of weather variability. Based on meteorological data, SOM was performed to classify weather patterns. Using these SOM classes as randomly changing "states", our Hidden Markov Models were constructed with "observation data" extracted from the daily data of emergency transport in Nagoya City, Japan. We showed that SOM was an effective method to obtain weather patterns that would serve as "states" of Hidden Markov Models. Our Hidden Markov Models provided effective models to clarify the background process of stroke incidence. The effectiveness of these Hidden Markov Models was estimated by a stochastic test on root mean square errors (RMSE). "HMMs with states by SOM" serve as a description of the background process of stroke incidence and are useful for showing the influence of weather on stroke onset. This finding will contribute to an improvement of our understanding of the links between weather variability and stroke incidence.
Text labeling is a very important tool for the automatic processing of language. It is used in several applications such as morphological and syntactic text analysis, indexing, retrieval, deterministic finite-state networks (in which all combinations of words accepted by the grammar are listed), and statistical grammars (e.g., an n-gram model in which the probabilities of sequences of n words in a specific order are given). In this article, we developed a morphosyntactic labeling system for the "Baoule" language using hidden Markov models. This will allow us to build a tagged reference corpus and represent the major grammatical rules of the "Baoule" language in general. To estimate the parameters of this model, we used a training corpus manually labeled with a set of morpho-syntactic labels. We then improve the system through a re-estimation procedure for the parameters of this model.
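A minimal Viterbi tagger illustrating the decoding step of such a system. In the paper the transition and emission probabilities are estimated from the hand-labeled Baoule corpus; the tiny English-like tagset, vocabulary, and probabilities below are hypothetical.

```python
import numpy as np

tags = ["DET", "NOUN", "VERB"]
vocab = {"the": 0, "dog": 1, "barks": 2}

pi = np.array([0.8, 0.1, 0.1])                 # P(first tag)
A  = np.array([[0.05, 0.90, 0.05],             # P(next tag | current tag)
               [0.10, 0.20, 0.70],
               [0.60, 0.30, 0.10]])
B  = np.array([[0.90, 0.05, 0.05],             # P(word | tag)
               [0.05, 0.85, 0.10],
               [0.05, 0.10, 0.85]])

def viterbi(obs, pi, A, B):
    """Most probable tag sequence for a sequence of word indices."""
    T, N = len(obs), len(pi)
    delta = np.zeros((T, N))           # best log-score ending in each state
    psi = np.zeros((T, N), dtype=int)  # back-pointers
    delta[0] = np.log(pi) + np.log(B[:, obs[0]])
    for t in range(1, T):
        scores = delta[t - 1][:, None] + np.log(A)      # (from, to)
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

sentence = ["the", "dog", "barks"]
best = viterbi([vocab[w] for w in sentence], pi, A, B)
print(list(zip(sentence, [tags[i] for i in best])))
```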
In this paper, we tested our methodology on the stocks of four representative companies: Apple, Comcast Corporation (CMCST), Google, and Qualcomm. We compared the performance of these stocks using the hidden Markov model (HMM) and evaluated the forecasts using the mean absolute percentage error (MAPE). For simplicity, we considered four main features of these stocks: open, close, high, and low prices. When using the HMM for forecasting, the HMM gives the best prediction for the daily low stock price and daily high stock price of Apple and CMCST, respectively. Calculating the MAPE for the four data sets of Google, the close price has the largest prediction error, while the open price has the smallest prediction error. The HMM has the largest prediction error and the smallest prediction error for Qualcomm's daily low stock price and daily high stock price, respectively.
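A minimal sketch of the two ingredients named here, fitting a Gaussian HMM to the four price features and scoring with MAPE. The hmmlearn package and the synthetic price series are assumptions, and the in-sample reconstruction check at the end is only an illustration, not the paper's forecasting procedure.

```python
import numpy as np
from hmmlearn import hmm

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

rng = np.random.default_rng(0)
close = 100.0 + np.cumsum(rng.normal(0.0, 1.0, 250))   # synthetic random walk
open_ = close + rng.normal(0.0, 0.3, 250)
high = close + np.abs(rng.normal(0.0, 0.5, 250))
low = close - np.abs(rng.normal(0.0, 0.5, 250))
X = np.column_stack([open_, close, high, low])          # the four features

model = hmm.GaussianHMM(n_components=4, covariance_type="diag",
                        n_iter=100, random_state=0)
model.fit(X)

# Crude in-sample check (illustration only): represent each day's close by the
# mean close price of its decoded hidden state and score it with MAPE.
states = model.predict(X)
reconstructed_close = model.means_[states, 1]           # column 1 = close
print("in-sample MAPE of close price: %.2f%%" % mape(close, reconstructed_close))
```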
Existing ontology mapping methods mainly consider the structure of the ontology, and their mapping precision is relatively low. Based on statistical theory, a method built on the hidden Markov model is presented to establish ontology mappings. This method treats concepts as models, and the attributes, relations, hierarchies, siblings, and rules of the concepts as the states of the HMM, respectively. The models corresponding to the concepts are built by learning from many training instances. On the basis of the best state sequence, determined by the Viterbi algorithm for a given instance, mapping between the concepts can be established by maximum likelihood estimation. Experimental results show that this method can effectively improve the precision of heterogeneous ontology mapping.
A parameter estimation algorithm for the continuous hidden Markov model is introduced, together with a rigorous proof of its convergence. The algorithm uses the Viterbi algorithm, instead of the K-means clustering used in the segmental K-means algorithm, to determine optimal state and branch sequences. Based on the optimal sequence, parameters are estimated with maximum likelihood as the objective function. Comparisons with the traditional Baum-Welch and segmental K-means algorithms are made on various aspects, such as optimization objectives and fundamentals. All three algorithms are applied to face recognition. Results indicate that the proposed algorithm can reduce training time with a comparable recognition rate and is least sensitive to the training set, so its average performance exceeds that of the other two.
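The sketch below illustrates the underlying principle of segmental (Viterbi) training on a discrete HMM: decode the best state path, re-estimate the parameters by maximum likelihood from that hard alignment, and iterate. The paper's algorithm targets continuous-density HMMs for face recognition; this toy model and its data are illustrative only.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Best state path for a discrete observation sequence."""
    T, N = len(obs), len(pi)
    delta = np.log(pi) + np.log(B[:, obs[0]])
    psi = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + np.log(A)
        psi[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return np.array(path[::-1])

def viterbi_train(obs, n_states, n_symbols, n_iter=10, smooth=1e-3):
    """Segmental training: alternate Viterbi alignment and ML re-estimation."""
    rng = np.random.default_rng(0)
    pi = np.full(n_states, 1.0 / n_states)
    A = rng.dirichlet(np.ones(n_states), size=n_states)
    B = rng.dirichlet(np.ones(n_symbols), size=n_states)
    for _ in range(n_iter):
        path = viterbi(obs, pi, A, B)
        # Maximum-likelihood re-estimation from the hard alignment (smoothed).
        A_cnt = np.full((n_states, n_states), smooth)
        B_cnt = np.full((n_states, n_symbols), smooth)
        for s, s_next in zip(path[:-1], path[1:]):
            A_cnt[s, s_next] += 1
        for s, o in zip(path, obs):
            B_cnt[s, o] += 1
        pi = np.full(n_states, smooth)
        pi[path[0]] += 1.0
        pi /= pi.sum()
        A = A_cnt / A_cnt.sum(axis=1, keepdims=True)
        B = B_cnt / B_cnt.sum(axis=1, keepdims=True)
    return pi, A, B

obs = np.array([0, 0, 1, 1, 2, 2, 2, 0, 0, 1, 1, 2, 2, 0, 0, 1, 2, 2] * 5)
pi, A, B = viterbi_train(obs, n_states=3, n_symbols=3)
print("transition matrix:\n", A.round(2))
print("emission matrix:\n", B.round(2))
```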
An integrated framework is presented to represent and classify process data for on-line identification of abnormal operating conditions. It is based on pattern recognition principles and consists of a feature extraction step, in which wavelet transform and principal component analysis are used to capture the inherent characteristics of process measurements, followed by a similarity assessment step using a hidden Markov model (HMM) for pattern comparison. In most previous work, a fixed-length moving window was employed to track dynamic data; it often failed to capture enough information for each fault and sometimes even deteriorated the diagnostic performance. A variable moving window, whose length is modified with time, is introduced in this paper, and case studies on the Tennessee Eastman process illustrate the potential of the proposed method.
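A minimal sketch of the feature-extraction step under stated assumptions (PyWavelets and scikit-learn as the software, a synthetic signal, and a simple alternating rule for the window length): wavelet subband energies are computed over a moving window whose length changes with time and are then reduced by PCA.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 40 * np.pi, 4000)) + 0.3 * rng.normal(size=4000)
signal[2500:] += 1.5                      # a step fault in the second half

def wavelet_energies(window, wavelet="db4", level=3):
    """Relative energy of each wavelet subband of one data window."""
    coeffs = pywt.wavedec(window, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

# Variable moving window: the window length changes with time
# (here it simply alternates between 200 and 400 samples for illustration).
features, start = [], 0
while start + 400 <= len(signal):
    length = 200 if (len(features) % 2 == 0) else 400
    features.append(wavelet_energies(signal[start:start + length]))
    start += 100                          # window step
features = np.array(features)

scores = PCA(n_components=2).fit_transform(features)
print("feature matrix shape:", features.shape, "-> PCA scores:", scores.shape)
```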
Frame erasure concealment is studied to solve the problem of rapid speech quality degradation due to the loss of speech parameters during speech transmission. A large hidden Markov model is applied to model the immittance spectral frequency (ISF) parameters in the AMR-WB codec so as to optimally estimate the lost ISFs based on the minimum mean square error (MMSE) rule. The estimated ISFs are weighted with those of their previous neighbors to smooth the speech, resulting in the actual concealed ISF vectors, which are used instead of the lost ISFs in speech synthesis at the receiver. A comparison is made between the speech concealed by this algorithm and by Annex I of the G.722.2 specification, and simulation shows that the proposed concealment algorithm leads to better performance in terms of frequency-weighted spectral distortion and signal-to-noise ratio than the baseline method, with an increase of 2.41 dB in signal-to-noise ratio (SNR) and a reduction of 0.885 dB in frequency-weighted spectral distortion.
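A minimal sketch of the MMSE idea for a single lost frame under a Gaussian-emission HMM: the predictive state distribution times the per-state mean ISF vectors gives the MMSE estimate, which is then weighted with the previous frame. The tiny three-state model below is hypothetical; the paper uses a large HMM over real AMR-WB ISF parameters.

```python
import numpy as np

A = np.array([[0.80, 0.15, 0.05],       # state transition matrix
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]])
state_means = np.array([[0.30, 0.90, 1.60, 2.40],   # mean "ISF" vector per state
                        [0.35, 1.00, 1.70, 2.50],
                        [0.40, 1.10, 1.80, 2.60]])

filtered = np.array([0.7, 0.2, 0.1])    # P(state | frames received so far)
prev_isf = np.array([0.32, 0.95, 1.65, 2.45])        # last good ISF vector

predictive = filtered @ A               # P(state of the lost frame | past)
mmse_isf = predictive @ state_means     # MMSE estimate of the lost ISFs

w = 0.6                                 # smoothing weight (illustrative)
concealed_isf = w * mmse_isf + (1 - w) * prev_isf
print(concealed_isf.round(3))
```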
In order to overcome defects of the classical hidden Markov model (HMM), a new statistical model, the Markov family model (MFM), was proposed. The Markov family model was applied to speech recognition and natural language processing. Speaker-independent continuous speech recognition experiments and part-of-speech tagging experiments show that the Markov family model has higher performance than the hidden Markov model. The precision is enhanced from 94.642% to 96.214% in the part-of-speech tagging experiments, and the error rate is reduced by 11.9% in the speech recognition experiments with respect to the HMM baseline system.
Epilepsy is one of the most prevalent neurological disorders, affecting 70 million people worldwide. The present work focuses on designing an efficient algorithm for automatic seizure detection using the electroencephalogram (EEG), a noninvasive procedure for recording neuronal activity in the brain. The underlying dynamics of EEG signals are extracted to differentiate healthy and seizure EEG signals. Shannon entropy, collision entropy, transfer entropy, conditional probability, and Hjorth parameter features are extracted from subbands of the tunable Q wavelet transform. The efficient decomposition level for each feature vector is selected using the Kruskal-Wallis test to achieve good classification. Different features are combined using the discriminant correlation analysis fusion technique to form a single fused feature vector. The accuracy of the proposed approach is highest for Q=2 and J=10. Transfer entropy is observed to be significant for different class combinations. The proposed approach achieved 100% accuracy in classifying healthy versus seizure EEG signals using simple and robust features and a hidden Markov model with low computation time. The efficiency of the proposed approach is also evaluated in classifying seizure and non-seizure surface EEG signals; the system achieved 96.87% accuracy in classifying surface seizure and non-seizure EEG segments using efficient features extracted from different J levels.
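Two of the per-subband features listed above are easy to state in code. The sketch below computes Shannon entropy from the amplitude histogram and the three Hjorth parameters for a synthetic segment standing in for one tunable-Q wavelet subband.

```python
import numpy as np

def shannon_entropy(x, bins=32):
    """Shannon entropy (bits) of the amplitude distribution of x."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def hjorth_parameters(x):
    """Hjorth activity, mobility and complexity of a signal segment."""
    dx = np.diff(x)
    ddx = np.diff(dx)
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / np.var(x))
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

rng = np.random.default_rng(0)
segment = rng.normal(size=2048)          # stand-in for one EEG subband
print("Shannon entropy:", round(shannon_entropy(segment), 3))
print("Hjorth (activity, mobility, complexity):",
      tuple(round(v, 3) for v in hjorth_parameters(segment)))
```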
Aiming at solving the problems of machine learning in fault diagnosis, a diagnosis approach is proposed based on the hidden Markov model (HMM) and the support vector machine (SVM). The HMM usually describes intra-class measures well and is good at dealing with continuous dynamic signals, while the SVM expresses inter-class differences effectively and has excellent classification ability. The proposed approach builds on the merits of both HMM and SVM. An experiment is then conducted on the transmission system of a helicopter. With features extracted from the vibration signals of the gearbox, the HMM-SVM based diagnostic approach is trained and used to monitor and diagnose the gearbox's faults. The results show that this method outperforms HMM-based and SVM-based diagnostic methods, achieving higher diagnostic accuracy with small training samples.
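One common way to combine an HMM with an SVM, which may or may not match the paper's exact scheme, is to train one HMM per condition and feed the per-class log-likelihoods of a sequence to the SVM as features. The sketch below assumes hmmlearn, scikit-learn, and synthetic vibration-feature sequences.

```python
import numpy as np
from hmmlearn import hmm
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_sequences(mean, n_seq=30, length=80, dim=3):
    """Synthetic vibration-feature sequences for one condition."""
    return [mean + rng.normal(size=(length, dim)) for _ in range(n_seq)]

classes = {0: make_sequences(np.array([0.0, 0.0, 0.0])),    # healthy
           1: make_sequences(np.array([1.0, 0.5, -0.5]))}   # faulty

# One HMM per class, trained on that class's sequences.
models = {}
for label, seqs in classes.items():
    m = hmm.GaussianHMM(n_components=3, covariance_type="diag",
                        n_iter=30, random_state=0)
    m.fit(np.vstack(seqs), lengths=[len(s) for s in seqs])
    models[label] = m

# Per-class log-likelihoods of every sequence become the SVM features.
X, y = [], []
for label, seqs in classes.items():
    for s in seqs:
        X.append([models[k].score(s) / len(s) for k in sorted(models)])
        y.append(label)

clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```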
With the increasing availability of precipitation radar data from space, enhancement of the resolution of spaceborne precipitation observations is important, particularly for hazard prediction and climate modeling at local scales relevant to extreme precipitation intensities and gradients. In this paper, the statistical characteristics of radar precipitation reflectivity data are studied and modeled using a hidden Markov tree (HMT) in the wavelet domain. Then, a high-resolution interpolation algorithm is proposed for spaceborne radar reflectivity using the HMT model as prior information. Owing to the small and transient storm elements embedded in the larger and slowly varying elements, the radar precipitation data exhibit distinct multiscale statistical properties, including a non-Gaussian structure and scale-to-scale dependency. An HMT model can capture these statistical properties of radar precipitation well: the wavelet coefficients in each sub-band are characterized by a Gaussian mixture model (GMM), and the wavelet coefficients from the coarse scale to the fine scale are described by a multiscale Markov process. The state probabilities of the GMM are determined using the expectation-maximization method, and other parameters, for instance the variance decay parameters of the HMT model, are learned and estimated from high-resolution ground radar reflectivity images. Using the prior model, the wavelet coefficients at finer scales are estimated using local Wiener filtering. The interpolation algorithm is validated using data from the precipitation radar onboard the Tropical Rainfall Measuring Mission satellite, and the reconstructed results are found to enhance the spatial resolution while optimally reproducing the local extremes and gradients.
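A minimal sketch of the per-subband building block of the HMT prior, assuming PyWavelets, scikit-learn, and a synthetic reflectivity field: the wavelet coefficients of one sub-band are modeled by a two-component Gaussian mixture (a "small"/"large" coefficient state); the multiscale linking of states across scales is omitted.

```python
import numpy as np
import pywt
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic 2-D "reflectivity" field: a smooth background plus sparse cells.
x, y = np.meshgrid(np.linspace(0, 1, 128), np.linspace(0, 1, 128))
field = 10 * np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.05)
field += 5 * (rng.random((128, 128)) > 0.98) + 0.5 * rng.normal(size=(128, 128))

# 2-D wavelet decomposition; take the finest-scale horizontal-detail band.
coeffs = pywt.wavedec2(field, "haar", level=3)
finest_horizontal = coeffs[-1][0].ravel()

# Two-component Gaussian mixture for this sub-band's coefficients (the HMT
# commonly constrains the component means to zero; this generic fit does not).
gmm = GaussianMixture(n_components=2, covariance_type="spherical",
                      random_state=0).fit(finest_horizontal.reshape(-1, 1))
print("mixture weights:", gmm.weights_.round(3))
print("component variances:", gmm.covariances_.round(3))
```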
Translation software has become an important tool for communication between different languages. People's requirements for translation are increasingly high, mainly reflected in the desire for barrier-free cultural exchange. Even with a large corpus, the performance of statistical machine translation based on words and phrases is limited due to the small size of the modeling units. Previous statistical methods rely primarily on the size of the corpus and the number of its statistical results to avoid ambiguity in translation, ignoring context. To support the ongoing improvement of translation methods built upon deep learning, we propose a translation algorithm based on the Hidden Markov Model to improve the use of context in the process of translation. During translation, our Hidden Markov Model prediction chain selects a number of phrases with the highest resulting probability to form a sentence, and the collection of all generated sentences forms a topic sequence. Using probabilities and article sequences determined from the training set, our method again applies the Hidden Markov Model to form the final translation, improving the context relevance of the translation. This algorithm improves the accuracy of translation, avoids combinations of invalid words, and enhances the readability and meaning of the resulting translation.