Commit
examples with msws
friguzzi committed Jun 6, 2016
1 parent bd53c11 commit bd825c6
Showing 3 changed files with 154 additions and 1 deletion.
9 changes: 8 additions & 1 deletion examples/inference/kalman_filter.pl
@@ -61,7 +61,9 @@

:- end_lpad.


hist(Samples,NBins,Chart):-
  mc_sample_arg(kf(1,_O1,Y),Samples,Y,L0),
  histogram(L0,NBins,Chart).

dens_lw(Samples,NBins,Chart):-
  mc_sample_arg(kf(1,_O1,Y),Samples,Y,L0),
@@ -76,6 +78,11 @@
% plot the density of the state at time 1 in case of no observation (prior)
% and in case of observing 2.5 by taking 1000 samples and dividing the domain
% in 40 bins
?- hist(1000,40,G).
% plot the density of the state at time 1 in case of no observation
% by taking 1000 samples and dividing the domain
% in 40 bins
*/
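The idea behind dens_lw (conditioning on continuous evidence, which has probability 0, by likelihood weighting) can be sketched outside Prolog: draw states from the prior and weight each sample by the Gaussian likelihood of the observation. A minimal Python sketch, not cplint's implementation, assuming the model's parameters (init N(0,1), transition noise N(0,2), observation noise N(0,1)):

```python
import math
import random

random.seed(0)

def lw_posterior_mean(obs, n=100_000):
    """Likelihood weighting: sample the time-1 state from the prior and
    weight each sample by the N(state, 1) likelihood of the observation."""
    total_w = 0.0
    total_wx = 0.0
    for _ in range(n):
        s0 = random.gauss(0.0, 1.0)                   # init ~ N(0, 1)
        s1 = s0 + random.gauss(0.0, math.sqrt(2.0))   # transition noise, variance 2
        w = math.exp(-0.5 * (obs - s1) ** 2)          # unnormalised N(s1, 1) likelihood
        total_w += w
        total_wx += w * s1
    return total_wx / total_w

print(lw_posterior_mean(2.5))
```

The prior on the time-1 state is N(0, 3), so the exact posterior mean after observing 2.5 is 3 * 2.5 / (3 + 1) = 1.875; the weighted estimate above should land close to it.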

85 changes: 85 additions & 0 deletions examples/inference/kalman_filtermsw.pl
@@ -0,0 +1,85 @@
/*
One-dimensional Kalman filter. Hidden Markov model with a real
value as state and a real value as output. The next state is given by
the current state plus Gaussian noise (mean 0 and variance 2 in this example)
and the output is given by the current state plus Gaussian noise (mean
0 and variance 1 in this example).
This example can be considered as modeling a random walk of a single continuous
state variable with noisy observations.
Given that at time 0 the value 2.5 was
observed, what is the distribution of the state at time 1 (filtering problem)?
The distribution of the state is plotted in the case of having (posterior) or
not having the observation (prior).
Likelihood weighting is used to condition the distribution on evidence on
a continuous random variable (evidence with probability 0).
CLP(R) constraints allow both sampling and weighting samples with the same
program.
From
Islam, Muhammad Asiful, C. R. Ramakrishnan, and I. V. Ramakrishnan.
"Inference in probabilistic logic programs with continuous random variables."
Theory and Practice of Logic Programming 12.4-5 (2012): 505-523.
http://arxiv.org/pdf/1112.2681v3.pdf
Russell, S. and Norvig, P. 2010. Artificial Intelligence: A Modern Approach.
Third Edition, Prentice Hall, Figure 15.10 page 587
*/
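The generative process just described can be sketched directly; a hypothetical Python analogue (not the program below), with the parameters stated in the text (init N(0,1), transition variance 2, observation variance 1):

```python
import math
import random

random.seed(0)

def kf_sample(n):
    """One run of the random walk: return (observations, final_state),
    mirroring the role of kf/3 in the Prolog program."""
    s = random.gauss(0.0, 1.0)                        # init ~ N(0, 1)
    obs = []
    for _ in range(n):
        s = s + random.gauss(0.0, math.sqrt(2.0))     # transition noise, variance 2
        obs.append(s + random.gauss(0.0, 1.0))        # observation noise, variance 1
    return obs, s

obs, final = kf_sample(5)
print(len(obs))  # 5: one observation per step
```

After a single step the state is distributed as N(0, 1 + 2) = N(0, 3), which is the prior that the filtering query conditions on.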
:- use_module(library(mcintyre)).
:- use_module(library(clpr)).
:- if(current_predicate(use_rendering/1)).
:- use_rendering(c3).
:- endif.
:- mc.
:- begin_lpad.

kf(N,O,T) :-
  msw(init,S),
  kf_part(0,N,S,O,T).

kf_part(I,N,S,[V|RO],T) :-
  I < N,
  NextI is I+1,
  trans(S,I,NextS),
  emit(NextS,I,V),
  kf_part(NextI,N,NextS,RO,T).

kf_part(N,N,S,[],S).

trans(S,_I,NextS) :-
  {NextS =:= E + S},
  msw(trans_err,E).

emit(NextS,_I,V) :-
  {NextS =:= V+X},
  msw(obs_err,X).

values(init,real).
values(trans_err,real).
values(obs_err,real).

:- set_sw(init, norm(0,1)).
:- set_sw(trans_err,norm(0,2)).
:- set_sw(obs_err,norm(0,1)).

% prior as in Russell and Norvig 2010, Fig 15.10
% transition noise as in Russell and Norvig 2010, Fig 15.10
% observation noise as in Russell and Norvig 2010, Fig 15.10

:- end_lpad.



hist(Samples,NBins,Chart):-
  mc_sample_arg(kf(1,_O1,Y),Samples,Y,L0),
  histogram(L0,NBins,Chart).
% plot the density of the state at time 1 in case of no observation (prior)
% and in case of observing 2.5.
% Observation as in Russell and Norvig 2010, Fig 15.10

/** <examples>
?- hist(1000,40,G).
% plot the density of the state at time 1 in case of no observation
% by taking 1000 samples and dividing the domain
% in 40 bins
*/
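As a sanity check on the sampled histogram, the one-step filtering problem has a closed-form answer (conjugate Gaussian update); a short Python computation, for illustration only:

```python
def gaussian_posterior(prior_mean, prior_var, obs, obs_var):
    """Posterior N(mean, var) of a Gaussian state after one noisy
    Gaussian observation (scalar Kalman update)."""
    k = prior_var / (prior_var + obs_var)             # Kalman gain
    return prior_mean + k * (obs - prior_mean), (1 - k) * prior_var

# State at time 1 before observing: N(0,1) init plus N(0,2) transition
# noise, i.e. N(0, 3); observation noise has variance 1.
mean, var = gaussian_posterior(0.0, 3.0, 2.5, 1.0)
print(mean, var)  # 1.875 0.75
```

The density plotted by hist/3 should approximate the N(0, 3) prior; conditioning on the observation 2.5 would shift it to N(1.875, 0.75).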

61 changes: 61 additions & 0 deletions examples/inference/widgetmsw.pl
@@ -0,0 +1,61 @@
/*
Factory producing widgets.
Consider a factory with two machines a and b. Each machine produces a widget
with a continuous feature. A widget is produced by machine a with probability
0.3 and by machine b with probability 0.7.
If the widget is produced by machine a, the feature is distributed as a
Gaussian with mean 2.0 and variance 1.0.
If the widget is produced by machine b, the feature is distributed as a
Gaussian with mean 3.0 and variance 1.0.
The widget then is processed by a third machine that adds a random quantity to
the feature distributed as a Gaussian with mean 0.5 and variance 0.1.
What is the distribution of the feature?
What is the distribution of the feature given that the widget was produced
by machine a?
What is the distribution of the feature given that the third machine added a
quantity greater than 0.2?
What is the distribution of the feature given that the third machine added
a quantity of 2.0?
Adapted from
Islam, Muhammad Asiful, C. R. Ramakrishnan, and I. V. Ramakrishnan.
"Inference in probabilistic logic programs with continuous random variables."
Theory and Practice of Logic Programming 12.4-5 (2012): 505-523.
http://arxiv.org/pdf/1112.2681v3.pdf
*/
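The generative process above can be sketched in Python (a hypothetical analogue, not cplint; probabilities read from the program's set_sw(m, [0.3, 0.7]), paired in order with values(m, [a, b]), so P(a)=0.3 and P(b)=0.7):

```python
import math
import random

random.seed(0)

MACHINE_MEAN = {'a': 2.0, 'b': 3.0}

def widget():
    m = 'a' if random.random() < 0.3 else 'b'     # msw(m, M)
    z = random.gauss(MACHINE_MEAN[m], 1.0)        # msw(st(M), Z), variance 1
    y = random.gauss(0.5, math.sqrt(0.1))         # msw(pt, Y), variance 0.1
    return y + z                                  # {X = Y + Z}

xs = [widget() for _ in range(100_000)]
print(sum(xs) / len(xs))  # close to the mixture mean 0.3*2.5 + 0.7*3.5 = 3.2
```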
:- use_module(library(mcintyre)).
:- use_module(library(clpr)).

:- if(current_predicate(use_rendering/1)).
:- use_rendering(c3).
:- endif.
:- mc.
:- begin_lpad.
widget(X) :-
  msw(m, M),
  msw(st(M), Z),
  msw(pt, Y),
  {X = Y + Z}.
% Ranges of RVs
values(m, [a,b]).
values(st(_), real).
values(pt, real).
% PDFs and PMFs:
:- set_sw(m, [0.3, 0.7]),
set_sw(st(a), norm(2.0, 1.0)),
set_sw(st(b), norm(3.0, 1.0)),
set_sw(pt, norm(0.5, 0.1)).


:- end_lpad.

hist_uncond(Samples,NBins,Chart):-
  mc_sample_arg(widget(X),Samples,X,L0),
  histogram(L0,NBins,Chart).
% What is the distribution of the feature?


/** <examples>
?- hist_uncond(10000,40,G).
% What is the distribution of the feature?
*/
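The distribution hist_uncond approximates is available in closed form: since independent Gaussian noises add, the feature is a two-component Gaussian mixture, 0.3*N(2.5, 1.1) + 0.7*N(3.5, 1.1) under the program's parameters (st(a) ~ N(2,1), st(b) ~ N(3,1), pt ~ N(0.5, 0.1), and P(a)=0.3, P(b)=0.7 per set_sw). A short check of its moments:

```python
# Two-component Gaussian mixture implied by the widget program:
# component mean = machine mean + pt mean; variances of independent
# noises add (1.0 + 0.1 for each component).
weights = [0.3, 0.7]
means = [2.0 + 0.5, 3.0 + 0.5]
var = 1.0 + 0.1

mix_mean = sum(w * m for w, m in zip(weights, means))
mix_var = var + sum(w * m * m for w, m in zip(weights, means)) - mix_mean ** 2

print(mix_mean, mix_var)  # mixture mean 3.2, variance 1.31 (up to float rounding)
```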
