Efficient Entropy-Based Decoding Algorithms For Higher-Order Hidden Markov Model
Main Author:
Format: Thesis
Language: English
Published: 2019
Subjects:
Online Access: http://eprints.usm.my/61146/1/Efficient%20entropy%20based%20decoding%20cut.pdf
Summary: The higher-order Hidden Markov model (HHMM) has higher prediction accuracy than the first-order Hidden Markov model (HMM) because it exploits more of the historical state information when predicting the next state. The state sequence of an HHMM is hidden, but the classical Viterbi algorithm can track the optimal state sequence. The extended entropy-based Viterbi algorithm is proposed for decoding HHMM. This algorithm is memory efficient because its required memory space is time independent; in other words, the required memory does not depend on the length of the observation sequence. The entropy-based Viterbi algorithm with a reduction approach (EVRA) is also introduced for decoding HHMM. Its required memory is likewise time independent, and the optimal state sequence obtained by the EVRA algorithm is the same as that obtained by the classical Viterbi algorithm for HHMM.
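
The abstract states that both proposed decoders need memory that does not grow with the length of the observation sequence, but it does not spell out how this is achieved. One common way such time-independent memory is realized in online Viterbi-style decoding is to emit the portion of the path on which every survivor already agrees and to discard its backpointers. The Python sketch below illustrates only that general idea for a plain first-order HMM; the function name, the log-space formulation, and the coalescence test are illustrative assumptions, not the thesis's extended entropy-based Viterbi or EVRA procedure for HHMM.

```python
import numpy as np

def online_viterbi(obs, pi, A, B):
    """Illustrative first-order Viterbi decoder (not the thesis's algorithm).

    Emits decoded states as soon as all survivor paths coalesce, so the number
    of stored backpointer columns is bounded by the coalescence depth rather
    than by the length of the observation sequence.
    pi: (N,) initial probabilities, A: (N, N) transitions, B: (N, M) emissions.
    Assumes strictly positive probabilities so that log() stays finite.
    """
    N = len(pi)
    decoded = []                                 # states already emitted
    cols = []                                    # retained backpointer columns
    delta = np.log(pi) + np.log(B[:, obs[0]])    # log-probabilities at time 0

    for t in range(1, len(obs)):
        scores = delta[:, None] + np.log(A)      # scores[i, j]: best path to i, then i -> j
        psi = scores.argmax(axis=0)              # best predecessor of each state j
        delta = scores.max(axis=0) + np.log(B[:, obs[t]])
        cols.append(psi)

        # Trace every survivor backwards; if all of them pass through a single
        # state at some earlier time, the prefix up to that time is final.
        alive = np.arange(N)
        for k in range(len(cols) - 1, -1, -1):
            alive = np.unique(cols[k][alive])
            if alive.size == 1:
                prefix = [int(alive[0])]         # fixed state at the merge time
                for col in reversed(cols[:k]):   # backtrack the fixed prefix
                    prefix.append(int(col[prefix[-1]]))
                decoded.extend(reversed(prefix))
                cols = cols[k + 1:]              # free memory for the emitted prefix
                break

    # Ordinary backtrace over the columns that never coalesced.
    tail = [int(delta.argmax())]
    for col in reversed(cols):
        tail.append(int(col[tail[-1]]))
    return decoded + list(reversed(tail))

# Tiny usage example with made-up parameters (two states, two symbols).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
print(online_viterbi([0, 1, 1, 0, 1], pi, A, B))
```

In this sketch the stored backpointer columns are bounded by the coalescence depth of the survivor paths rather than by the sequence length, which is the kind of time-independent memory property the abstract attributes to the proposed algorithms.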