
Extended Information Filter

I’m still reading Probabilistic Robotics by Thrun, Burgard, and Fox. Here’s how to implement the extended information filter algorithm.

I used the same problem as before with the unscented Kalman filter. We have a robot that moves in the plane and uses a nonlinear sensor to measure its distance from the origin.
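
For reference, the filter does not carry the mean and covariance directly; it carries the information matrix omega (the inverse covariance) and the information vector epsilon. Here is a minimal sketch of how the inputs to the function below could be built from an initial mean and covariance; the numbers are only illustrative and not from the original setup.

% Information-form parameterization of the initial belief (illustrative values).
sigma0   = eye(2);                % assumed initial covariance
mu0      = [10; 10];              % assumed initial mean
omega0   = inv(sigma0);           % information matrix
epsilon0 = omega0 * mu0;          % information vector
% The moments form is recovered with mu0 = inv(omega0)*epsilon0 and sigma0 = inv(omega0).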

function [ epsilon1, omega1 ] = eif( epsilon0, omega0, u1, z1, xErr, yErr, zErr)
% The extended information filter algorithm.

% R1 and Q1 are the covariances of the additive Gaussian noise terms.
R1 = [xErr  0; 0 yErr]; % process noise on the predicted position
Q1 = zErr;              % measurement noise

mu0 = inv(omega0)*epsilon0; % recover the previous mean from the information form

%---------------------------------------
% State transition prediction
% g is an arbitrary (possibly nonlinear) motion model; here it simply
% translates the previous mean by the control input u1.
% mu1_p = g(u1, mu0)
g = [mu0(1)+u1(1) ; mu0(2)+u1(2)];
%---------------------------------------

% G is the Jacobian of g, evaluated numerically at mu0
f = @(mu)[mu(1)+u1(1), mu(2)+u1(2)];
G = jacobianest(f,mu0);

omega_p = inv(G*inv(omega0)*G' + R1);
epsilon_p = omega_p * g;

mu1_p = g;

%---------------------------------------
% Measurement prediction
% h is an arbitrary nonlinear measurement model, evaluated at the predicted mean.
h = ((sqrt((mu1_p(1).^2) + (mu1_p(2).^2))) * (1/100)).^-1;
%---------------------------------------

% H is the Jacobian of h, evaluated numerically at mu1_p
f = @(mu)((sqrt((mu(1).^2) + (mu(2).^2))) * (1/100)).^-1;
H = jacobianest(f,mu1_p);

omega1 = omega_p + (H'*inv(Q1)*H);
epsilon1 = epsilon_p + H'*inv(Q1)*(z1 - h + H*mu1_p);

end

The function jacobianest comes from a useful MATLAB toolbox I found on this website.
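
For completeness, here is a minimal, hypothetical driver that runs the filter for a few steps. The noise magnitudes, control input, and simulated sensor are made up for illustration; the range model mirrors the h used inside eif, and jacobianest must be on the MATLAB path.

% Hypothetical usage sketch; all numbers are illustrative.
xErr = 0.1; yErr = 0.1; zErr = 0.5;          % assumed noise covariances
mu      = [10; 10];                          % assumed initial mean
omega   = inv(eye(2));                       % assumed initial covariance (identity)
epsilon = omega * mu;                        % initial information vector
x_true  = mu;                                % true robot position for the simulation

for t = 1:20
    u1 = [1; 0.5];                                            % assumed control input
    x_true = x_true + u1 + sqrt([xErr; yErr]).*randn(2,1);    % simulate the motion
    z1 = (norm(x_true)/100)^-1 + sqrt(zErr)*randn;            % simulate the range sensor
    [epsilon, omega] = eif(epsilon, omega, u1, z1, xErr, yErr, zErr);
    mu = inv(omega)*epsilon;                                  % recover the state estimate
end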