Exercises for TDT4171-Methods-in-AI at NTNU
Exercise 1 implements a Bayesian network and uses inference by enumeration to compute probability distributions.
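
The idea behind inference by enumeration is to sum out every hidden variable against the conditional probability tables and then normalize. Below is a minimal sketch of that idea, not the exercise's actual code: the Burglary/Earthquake/Alarm network and the dictionary-based CPT representation are illustrative assumptions.

```python
# Sketch of inference by enumeration on an assumed example network.
# Each CPT maps an assignment of the parents to P(variable = True | parents).
NETWORK = {
    "Burglary":   {"parents": [], "cpt": {(): 0.001}},
    "Earthquake": {"parents": [], "cpt": {(): 0.002}},
    "Alarm": {
        "parents": ["Burglary", "Earthquake"],
        "cpt": {(True, True): 0.95, (True, False): 0.94,
                (False, True): 0.29, (False, False): 0.001},
    },
    "JohnCalls": {"parents": ["Alarm"], "cpt": {(True,): 0.90, (False,): 0.05}},
    "MaryCalls": {"parents": ["Alarm"], "cpt": {(True,): 0.70, (False,): 0.01}},
}
ORDER = ["Burglary", "Earthquake", "Alarm", "JohnCalls", "MaryCalls"]  # topological order


def probability(var, value, evidence):
    """P(var = value | parents(var)), read straight from the CPT."""
    node = NETWORK[var]
    key = tuple(evidence[p] for p in node["parents"])
    p_true = node["cpt"][key]
    return p_true if value else 1.0 - p_true


def enumerate_all(variables, evidence):
    """Sum out every variable that is not fixed by the evidence."""
    if not variables:
        return 1.0
    first, rest = variables[0], variables[1:]
    if first in evidence:
        return probability(first, evidence[first], evidence) * enumerate_all(rest, evidence)
    return sum(
        probability(first, value, evidence)
        * enumerate_all(rest, {**evidence, first: value})
        for value in (True, False)
    )


def enumeration_ask(query, evidence):
    """Return the normalized distribution P(query | evidence)."""
    dist = {value: enumerate_all(ORDER, {**evidence, query: value}) for value in (True, False)}
    total = sum(dist.values())
    return {value: p / total for value, p in dist.items()}


if __name__ == "__main__":
    # P(Burglary | JohnCalls = true, MaryCalls = true) ~ {True: 0.284, False: 0.716}
    print(enumeration_ask("Burglary", {"JohnCalls": True, "MaryCalls": True}))
```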
In Exercise 2, we implement filtering, prediction, smoothing and "most likely sequence" (Viterbi) calculations for a Hidden Markov Model.
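
To illustrate the shape of these computations, here is a minimal sketch of the filtering (forward) step only, assuming the textbook umbrella-world HMM; the transition and sensor matrices below are illustrative, and prediction, smoothing, and Viterbi are not shown.

```python
# Sketch of HMM filtering (the forward algorithm) with an assumed umbrella-world model.
import numpy as np

# Transition model P(X_t | X_{t-1}) and sensor model P(e_t | X_t)
# over the states [rain, not rain]; evidence is "umbrella observed".
TRANSITION = np.array([[0.7, 0.3],
                       [0.3, 0.7]])
SENSOR = {True:  np.diag([0.9, 0.2]),   # umbrella seen
          False: np.diag([0.1, 0.8])}   # umbrella not seen


def forward(prior, observations):
    """Return the filtered distribution P(X_t | e_1:t) after each observation."""
    belief = np.asarray(prior, dtype=float)
    history = []
    for umbrella in observations:
        # Predict one step ahead, weight by the evidence, then normalize.
        belief = SENSOR[umbrella] @ TRANSITION.T @ belief
        belief /= belief.sum()
        history.append(belief)
    return history


if __name__ == "__main__":
    # Umbrella seen on days 1 and 2: P(rain_2 | e_1:2) ~ 0.883.
    for step in forward([0.5, 0.5], [True, True]):
        print(step)
```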
Exercise 3 has no code; it is about building a decision network, drawing it in Genie, and choosing reasonable probability distributions and dependencies.
In Exercise 4, we implement Decision Tree Learning and try to predict the survival of a given passenger on the Titanic.
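
The heart of decision-tree learning is picking the attribute whose split reduces entropy the most. The sketch below shows that step on a tiny made-up dataset; the column names and values are illustrative assumptions, not the exercise's Titanic data.

```python
# Sketch of the attribute-selection step in decision-tree learning:
# entropy and information gain over an assumed toy dataset.
from collections import Counter
import math


def entropy(labels):
    """Shannon entropy of a list of class labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())


def information_gain(rows, labels, attribute):
    """Entropy reduction obtained by splitting the examples on one attribute."""
    before = entropy(labels)
    remainder = 0.0
    for value in {row[attribute] for row in rows}:
        subset = [label for row, label in zip(rows, labels) if row[attribute] == value]
        remainder += len(subset) / len(labels) * entropy(subset)
    return before - remainder


if __name__ == "__main__":
    rows = [{"Sex": "female", "Pclass": 1}, {"Sex": "male", "Pclass": 3},
            {"Sex": "female", "Pclass": 3}, {"Sex": "male", "Pclass": 1}]
    labels = [1, 0, 1, 0]  # survived / did not survive
    print(information_gain(rows, labels, "Sex"))     # 1.0: separates the labels perfectly
    print(information_gain(rows, labels, "Pclass"))  # 0.0: gives no information
```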
In Exercise 5, we implement a neural network that can be used as a Perceptron or with one hidden layer. The Backpropagation algorithm computes the gradients, and gradient descent uses them to tune the weights.
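
As an illustration of that training loop, here is a minimal sketch of a one-hidden-layer network trained with backpropagation and gradient descent on XOR. The layer sizes, learning rate, epoch count, and the XOR task itself are assumptions for the sketch, not taken from the exercise.

```python
# Sketch of backpropagation and gradient descent for a one-hidden-layer network.
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def train(inputs, targets, hidden_units=4, learning_rate=1.0, epochs=20000):
    rng = np.random.default_rng(0)
    w1 = rng.normal(scale=0.5, size=(inputs.shape[1], hidden_units))
    b1 = np.zeros(hidden_units)
    w2 = rng.normal(scale=0.5, size=(hidden_units, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        # Forward pass through the hidden and output layers.
        hidden = sigmoid(inputs @ w1 + b1)
        output = sigmoid(hidden @ w2 + b2)
        # Backward pass: propagate the output error back through both layers.
        delta_out = (output - targets) * output * (1 - output)
        delta_hidden = (delta_out @ w2.T) * hidden * (1 - hidden)
        # Gradient-descent updates on weights and biases.
        w2 -= learning_rate * hidden.T @ delta_out
        b2 -= learning_rate * delta_out.sum(axis=0)
        w1 -= learning_rate * inputs.T @ delta_hidden
        b1 -= learning_rate * delta_hidden.sum(axis=0)
    return w1, b1, w2, b2


if __name__ == "__main__":
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    w1, b1, w2, b2 = train(X, y)
    # Typically converges to outputs close to [[0], [1], [1], [0]].
    print(sigmoid(sigmoid(X @ w1 + b1) @ w2 + b2).round(2))
```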