Here I would like to show you the changes in the source code of the Robust least squares fitting that are required for a general quadric surface (fitting of the planar model was introduced here). Assume you want to fit a set of data points in three-dimensional space with a general quadric described by five parameters $a_1$, $a_2$, $a_3$, $a_4$, and $a_5$ as a function of $x$ and $y$ in the following way:

$$z = a_1 x^2 + a_2 y^2 + a_3 x + a_4 y + a_5$$
Again, first we predefine the surface model with the so-called “ground truth” coefficients, which we will need later to evaluate the fitting results:
#!/usr/bin/env python
import numpy as np
import matplotlib.pyplot as plt

# coefficients of the model
a1, a2, a3, a4, a5 = 0.1, -0.2, -0.3, 0.1, 0.15

# ground truth
A_gt = [a1, a2, a3, a4, a5]
print('A_gt =', A_gt)

# create a coordinate matrix
nx = np.linspace(-1, 1, 41)
ny = np.linspace(-1, 1, 41)
x, y = np.meshgrid(nx, ny)

# evaluate the model on the grid
z = a1*x*x + a2*y*y + a3*x + a4*y + a5
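Since matplotlib is already imported, the ground-truth surface can be inspected right away. Here is a minimal sketch; the colormap and figure layout are my own choices, not taken from the original post:

# quick visual check of the ground-truth quadric (sketch)
plt.imshow(z, extent=(-1, 1, -1, 1), origin='lower', cmap='jet')
plt.colorbar()
plt.title('ground truth')
plt.show()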
Then, similar to the case with the planar surface, we find the model coefficients and display the output of the fitted model for the following three cases (a sketch of how the corrupted input data can be generated follows the list):
1. The input data is corrupted by Gaussian noise only, and the Regular linear least squares method is used.
2. The input data is corrupted by Gaussian noise AND outliers; the Regular linear least squares method is used.
3. The input data is corrupted by Gaussian noise AND outliers; the Robust linear least squares method is used.
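For reference, the corrupted input data for these cases can be generated along the following lines. This is only a sketch: the noise level, the outlier fraction, and the outlier magnitude below are my own assumptions, not the values used in the original post:

# case 1: Gaussian noise only (assumed standard deviation of 0.1)
z_noise = z + 0.1*np.random.standard_normal(z.shape)

# cases 2 and 3: Gaussian noise plus outliers
z_outliers = z_noise.copy()
n_out = int(0.1 * z.size)  # assumed: 10% of the points become outliers
idx = np.random.choice(z.size, n_out, replace=False)
z_outliers.ravel()[idx] += 2.0*np.random.standard_normal(n_out)  # gross errors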
Note that in the current example we have five parameters that need to be estimated; below you can see the required changes in the source code:
# build the design matrix for the quadric model
# z = a1*x^2 + a2*y^2 + a3*x + a4*y + a5
x_fl = x.flatten()
y_fl = y.flatten()
z_ones = np.ones([x.size, 1])
x_fl2 = x_fl**2
y_fl2 = y_fl**2
v1 = np.reshape(x_fl2, (len(x_fl2), 1))
v2 = np.reshape(y_fl2, (len(y_fl2), 1))
v3 = np.reshape(x_fl, (len(x_fl), 1))
v4 = np.reshape(y_fl, (len(y_fl), 1))
X = np.hstack((v1, v2, v3, v4, z_ones))
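With the design matrix in place, the regular least-squares estimate (case 1) can already be computed. This is a minimal sketch assuming the z_noise array from the earlier snippet; the actual solve step comes from the planar-fitting post and is not reproduced here:

# regular linear least squares: solve X * A = z for the 5 coefficients
z_fl = np.reshape(z_noise.flatten(), (z_noise.size, 1))
A_lsq = np.linalg.lstsq(X, z_fl, rcond=None)[0]
print('A_lsq =', A_lsq.ravel())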
The next lines remain the same as in the planar case until the weighted X-values need to be computed; there, the following change is needed:
# replicate the weight vector across all 5 columns of the design matrix
XW = np.tile(w, (1, 5)) * X
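For context, here is one way this line can sit inside an iteratively reweighted least-squares loop. Treat it as a sketch only: the bisquare (Tukey) weight function, its tuning constants, and the iteration count are my assumptions, since this section shows only the changed line and the full loop comes from the planar-fitting post:

# sketch of an IRLS loop around the changed line (bisquare weights assumed)
A_rob = A_lsq.copy()                     # start from the regular LSQ estimate
for it in range(20):
    r = z_fl - X.dot(A_rob)              # residuals of the current fit
    s = np.median(np.abs(r)) / 0.6745    # robust scale estimate (MAD)
    u = r / (4.685 * s)                  # normalized residuals
    w = (np.abs(u) < 1) * (1 - u**2)**2  # bisquare weight per point
    XW = np.tile(w, (1, 5)) * X          # the changed line: weight all 5 columns
    A_rob = np.linalg.solve(XW.T.dot(X), XW.T.dot(z_fl))
print('A_rob =', A_rob.ravel())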
That’s it. Here you can see the output results for all three cases:
And again, the output produced by the Robust linear least squares method looks pretty good despite the many outliers in the input data.