how to do prediction? #20
Comments
I have the same question. Did you solve it? |
Referring to "https://web.ece.ucsb.edu/Faculty/Rabiner/ece259/Reprints/tutorial%20on%20hmm%20and%20applications.pdf" and the library "https://hmmlearn.readthedocs.io/en/latest/", I have found this solution: 1- Through log_gamma (the posterior distribution):
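The code that originally followed this comment did not survive extraction. As a sketch of the log-gamma (posterior-decoding) approach, here is a minimal scaled forward-backward pass in NumPy for a toy 2-state HMM with 3 discrete symbols; all parameter values (`pi`, `A`, `B`) and the observation sequence are invented for illustration, not taken from the issue:

```python
import numpy as np

# Hypothetical trained parameters (illustrative values only).
pi = np.array([0.6, 0.4])            # initial state distribution
A = np.array([[0.7, 0.3],            # transition matrix A[i, j]
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],       # emission matrix B[state, symbol]
              [0.1, 0.3, 0.6]])

def posterior_decode(obs):
    """Forward-backward pass; returns gamma[t, i] = P(state_t = i | obs)."""
    T, N = len(obs), len(pi)
    # Scaled forward pass (scaling avoids underflow, same role as log space).
    alpha = np.zeros((T, N))
    scale = np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]
    # Scaled backward pass, reusing the forward scaling factors.
    beta = np.ones((T, N))
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

obs = [0, 1, 2, 2]
gamma = posterior_decode(obs)
states = gamma.argmax(axis=1)   # most likely state at each time step
```

In hmmlearn, the same posteriors are available as `model.predict_proba(X)` after fitting.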
2- Viterbi Algorithm:
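The Viterbi snippet was also lost in extraction. As a sketch, here is a log-space Viterbi decoder in NumPy for the same hypothetical toy model (all parameter values are invented for illustration); it returns the single most likely hidden-state path rather than per-step posteriors:

```python
import numpy as np

# Hypothetical trained parameters in log space (illustrative values only).
log_pi = np.log([0.6, 0.4])
log_A = np.log([[0.7, 0.3],
                [0.4, 0.6]])
log_B = np.log([[0.5, 0.4, 0.1],
                [0.1, 0.3, 0.6]])

def viterbi(obs):
    """Most likely hidden-state path via dynamic programming in log space."""
    T, N = len(obs), len(log_pi)
    delta = np.zeros((T, N))            # best log-prob of any path ending in each state
    psi = np.zeros((T, N), dtype=int)   # backpointers to the best predecessor
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        trans = delta[t - 1][:, None] + log_A   # trans[i, j]: come from i, land in j
        psi[t] = trans.argmax(axis=0)
        delta[t] = trans.max(axis=0) + log_B[:, obs[t]]
    # Backtrack from the best final state.
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = psi[t + 1, path[t + 1]]
    return path, delta[-1].max()

path, logprob = viterbi([0, 1, 2, 2])   # → path [0, 0, 1, 1] for these toy values
```

In hmmlearn, `model.decode(X, algorithm="viterbi")` (or simply `model.predict(X)`) performs this step.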
Building on top of this, log-gamma can be used to decode the sequence of hidden states. To handle testing data, one could set the data to the testing set and re-run the E-step to get a new set of log-gammas (this does not update the transitions and emissions, so it would still use the trained transitions and emissions). Using these new log-gammas, re-run the decoding function as above.
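The workflow described above can be sketched as follows: keep the trained parameters frozen, run one E-step (forward-backward) on the unseen test sequence, and decode from the resulting gammas. The parameters and sequences below are invented for illustration; in hmmlearn this corresponds to calling `model.predict_proba(X_test)` after `model.fit(X_train)`, which computes posteriors without refitting:

```python
import numpy as np

# Parameters assumed to have been learned on the training set (illustrative).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])

def e_step_gamma(obs):
    """One E-step with frozen (pi, A, B): scaled forward-backward posteriors."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    scale = np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]
    beta = np.ones((T, N))
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

test_obs = [2, 2, 0, 1]                  # unseen test sequence
gamma_test = e_step_gamma(test_obs)      # new log-gammas, no M-step performed
decoded = gamma_test.argmax(axis=1)      # decode using the trained model only
```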
How to do prediction for new test data after training the model?