Start with the classical example due to Runge (1901), $f(t) = 1/(1 + 25 t^2)$ on $[-1, 1]$.
Solution. With $\boldsymbol{t} = [\, t_1 \;\; t_2 \;\; \cdots \;\; t_m \,]^T$
denoting the sampling point vector, and $\boldsymbol{y} = [\, f(t_1) \;\; f(t_2) \;\; \cdots \;\; f(t_m) \,]^T$
the function values at the sample points, the least squares problem
is
$$ \min_{\boldsymbol{x} \in \mathbb{R}^n} \| \boldsymbol{y} - \boldsymbol{A} \boldsymbol{x} \|_2 , $$
where $\boldsymbol{A} \in \mathbb{R}^{m \times n}$ collects the $n$ basis functions
evaluated at the sample points, $A_{ij} = b_j(t_i)$.
The solution $\boldsymbol{x}$ to the least squares problem
furnishes the approximation
$$ z(t) = \sum_{j=1}^{n} x_j \, b_j(t) , $$
that can be sampled at $M \gg m$
points to assess the approximation error.
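The helper code below invokes a function Runge whose definition is not shown here; a minimal sketch, assuming the classical form of Runge's example, is

Runge(t) = 1/(1+25*t^2);   # assumed definition: classical Runge function on [-1,1]

As an added aside (not part of the original solution), the backslash operator used throughout to solve the least squares problem agrees with the normal-equations minimizer; a small self-contained check (the names tt, yy, AA are ad hoc):

using LinearAlgebra
tt = LinRange(-1,1,8); yy = Runge.(tt)   # illustrative sample
AA = [tt.^0 tt.^1 tt.^2]                 # basis matrix for 1, t, t^2
x_qr = AA\yy                             # backslash: QR-based least squares solve
x_ne = (AA'*AA)\(AA'*yy)                 # normal equations yield the same minimizer
norm(x_qr - x_ne)                        # expected to be near machine precision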
When carrying out convergence studies such as these, it is convenient to
define functions for common tasks:
sample. Returns a sample of $f$ at $m$ equidistant points in $[a, b]$:

function sample(a,b,f,m)
  t = LinRange(a,b,m); y = f.(t)
  return t,y
end;
plotLSQ. Constructs a figure with plots of:
- the sample points (i.e., the data), represented as black dots;
- the function sampled at more points, i.e., $M \gg m$ points;
- the approximation sampled at the same $M$ points.

function plotLSQ(a,b,f,Basis,m,n,M)
  data=sample(a,b,f,m); t=data[1]; y=data[2]
  Data=sample(a,b,f,M); T=Data[1]; Y=Data[2]
  A = Basis(t,n); x = A\y; z = Basis(T,n)*x
  plot(t,y,"ok",T,z,"-r",T,Y,"-b"); grid("on")
  xlabel("t"); ylabel("y")
  title("Least squares approximation")
end;
plotConv. Constructs a convergence plot as the number of basis functions
$n = 2^p$, $p = 2, \dots, 5$, increases for a fixed number of sample points $m$:

function plotConv(t,y,Basis)
  E=zeros(4,2)
  for p=2:5
    n=2^p; A = Basis(t,n)
    x = A\y; logerr = log(2,norm(y-A*x))   # log2 of the residual norm at the data
    E[p-1,1]=p; E[p-1,2]=logerr
  end
  plot(E[:,1],E[:,2],"o-"); grid("on")
  xlabel("log(2,n)"); ylabel("log(2,err)")
  title("Convergence plot")
end;
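An added remark on reading the plotConv output (not part of the original solution): if the error decays algebraically, $\mathrm{err} \approx C\, n^{-r}$, then
$$ \log_2 \mathrm{err} \approx \log_2 C - r \log_2 n , $$
so a straight line of slope $-r$ in the $(\log_2 n, \log_2 \mathrm{err})$ plot indicates algebraic convergence of order $r$, while a curve bending downward indicates faster (e.g., exponential) convergence.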
This problem solution is obtained by:
- defining the function $f$ (the Runge function; a sketch of the assumed definition is given above);
- defining the basis, here the monomial basis $\{1, t, t^2, \dots, t^{n-1}\}$:
function MonomialBasis(t,n)
  m=size(t)[1]; A=ones(m,1)
  for j=1:n-1
    A = [A t.^j]
  end
  return A
end;
- invoking plotLSQ with appropriate parameters:

clf(); plotLSQ(-1,1,Runge,MonomialBasis,16,4,64);
FigPrefix=homedir()*"/courses/MATH661/images/H03";
savefig(FigPrefix*"Fig01.eps")
Figure 2. Least squares approximant (red) of the Runge function (blue) sampled at $m=16$ equidistant points (black dots).
Once the above are defined, cycling through the parameter ranges is
straightforward (open figure folds to see code).
Figure 3. First 8 monomial basis functions.
data=sample(-1,1,Runge,128); t=data[1]; A=MonomialBasis(t,8);
for k=1:8
  global t
  plot(t,A[:,k])
end
grid("on"); xlabel("t"); ylabel("B");
title("Monomial basis functions");
FigPrefix=homedir()*"/courses/MATH661/images/H03";
savefig(FigPrefix*"MonomialBasis.eps")
Figure 4. Effect of increasing number of monomial basis functions in least squares approximation of Runge function. Equidistant sample points.
for p=2:5
  clf(); n=2^p
  for q=4:8
    m=2^q; plotLSQ(-1,1,Runge,MonomialBasis,m,n,1024)
  end
  savefig(FigPrefix*"Fig01n="*string(n)*".eps")
end
Just as straightforward is the construction of the convergence plots
for $m = 2^q$, $q = 5, \dots, 8$.
Note that when $n = m$,
the error is zero (to machine precision), and the least squares approximant
becomes an interpolant. In all cases, as the number of basis
functions increases, the error
decreases, Fig. 5. How can this observation be reconciled with
Fig. 13, where increasing error is observed at the interval
endpoints? Note that in Fig. 13, a comparison is made
between the approximant and the exact function, both evaluated at
more points than are present in the least squares approximation
(LSQ). In Fig. 5, the error is evaluated only at the points
within the data used in the LSQ.
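The distinction can be illustrated numerically (an added sketch using the helpers above; the parameter values are illustrative):

using LinearAlgebra
let m = 16, n = 16, M = 1024                        # n = m: interpolation case
  t, y = sample(-1, 1, Runge, m)
  x = MonomialBasis(t, n) \ y
  err_data = norm(y - MonomialBasis(t, n)*x)        # error at the data points only (Fig. 5 measure)
  T, Y = sample(-1, 1, Runge, M)
  err_fine = norm(Y - MonomialBasis(T, n)*x, Inf)   # max error on a finer grid
  (err_data, err_fine)   # err_data ≈ 0 (interpolation); err_fine is large near the endpoints
end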
Figure 5. Convergence of the monomial approximation to the available data.
for q=5:8
  m=2^q
  data=sample(-1,1,Runge,m); t=data[1]; y=data[2]
  plotConv(t,y,MonomialBasis)
end
savefig(FigPrefix*"FigConv.eps")
Instead of the equidistant point samples of the Runge example above, use the Chebyshev nodes
$$ t_i = \cos\!\left( \frac{(2 i - 1)\pi}{2 m} \right), \quad i = 1, \dots, m, $$
keeping other parameters as in Problem 1.
Solution. Using the Chebyshev nodes corresponds to uniform sampling
of the composite function $(f \circ g)(\theta) = f(g(\theta))$ for
$\theta \in [\delta, \pi - \delta]$, with $g(\theta) = \cos\theta$ and
$\delta = \pi/(2m)$. It is straightforward to modify plotLSQ to take an
additional argument $g$:
function plotLSQ(a,b,f,g,Basis,m,n,M)
  data=sample(a,b,g,m); θ=data[1]; t=data[2]; y=f.(t)
  Data=sample(t[1],t[m],f,M); T=Data[1]; Y=Data[2]
  A = Basis(t,n); x = A\y; z = Basis(T,n)*x
  plot(t,y,"ok",T,z,"-r",T,Y,"-b"); grid("on")
  xlabel("t"); ylabel("y")
  title("Least squares approximation")
end;
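As a quick check (an added sketch, not part of the original solution), the equidistant angles produced by sample and mapped through $g$ reproduce the Chebyshev nodes $t_i = \cos((2i-1)\pi/(2m))$:

let m = 8, δ = π/(2*m)
  θ = LinRange(δ, π-δ, m)                        # equidistant angles, as generated by sample
  t_cheb = [cos((2*i-1)*π/(2*m)) for i in 1:m]   # Chebyshev nodes of the first kind
  maximum(abs.(cos.(θ) .- t_cheb))               # ≈ machine precision
end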
The approximant obtained for $m=16$
sample points with $n=4$
basis functions is compared at $M=64$
points in Fig. 6, with the results for the full
parameter sweep shown in Fig. 7. Use of the Chebyshev sample points
leads to significantly smaller approximation error upon finer
sampling of the domain of definition of $f$.
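This claim can be spot-checked directly (an added sketch mirroring the one after the Fig. 5 discussion; parameter values are illustrative), comparing the maximum fine-grid error of the two samplings in the interpolation case $n = m$:

using LinearAlgebra
let m = 16, n = 16, M = 1024
  T, Y = sample(-1, 1, Runge, M)                # fine evaluation grid
  te, ye = sample(-1, 1, Runge, m)              # equidistant sampling
  xe = MonomialBasis(te, n) \ ye
  err_equi = norm(Y - MonomialBasis(T, n)*xe, Inf)
  tc = [cos((2*i-1)*π/(2*m)) for i in 1:m]      # Chebyshev sampling
  xc = MonomialBasis(tc, n) \ Runge.(tc)
  err_cheb = norm(Y - MonomialBasis(T, n)*xc, Inf)
  (err_equi, err_cheb)    # err_cheb is expected to be markedly smaller than err_equi
end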
g(θ)=cos(θ); m=16; δ=π/(2*m);
clf(); plotLSQ(δ,π-δ,Runge,g,MonomialBasis,16,4,64);
FigPrefix=homedir()*"/courses/MATH661/images/H03";
savefig(FigPrefix*"Fig04.eps")
Figure 6. Least squares approximant (red) of the Runge function (blue) sampled at $m=16$ Chebyshev points (black dots).
Figure 7. Effect of increasing number of monomial basis functions in least squares approximation of Runge function. Chebyshev sample points.
for p=2:5
  figure(p-1); clf(); n=2^p
  for q=4:8
    local m=2^q; local δ=π/(2*m)   # recompute δ for each m so the nodes remain Chebyshev
    plotLSQ(δ,π-δ,Runge,g,MonomialBasis,m,n,1024)
  end
  savefig(FigPrefix*"Fig05n="*string(n)*".eps")
end
The convergence plot for the Chebyshev sample points is constructed in the same manner, recomputing $\delta = \pi/(2m)$ for each $m$ (Fig. 8).
Figure 8. Convergence of least squares approximation, monomial basis, Chebyshev sampling points.
figure(1); clf(); g(θ)=cos(θ);
for q=5:8
  local m,δ
  m=2^q; δ=π/(2*m)
  data=sample(δ,π-δ,g,m); θ=data[1]; t=data[2]; y=Runge.(t)
  plotConv(t,y,MonomialBasis)
end
savefig(FigPrefix*"FigConvChebPts.eps")
Instead of the monomial family of the Runge example, use the
piecewise linear $B$-spline basis of hat functions
$$ N_i(t) = \begin{cases} 1 - |t - t_i|/h, & |t - t_i| < h, \\ 0, & \text{otherwise}, \end{cases} $$
centered at equidistant knots $t_i$ with spacing $h$, keeping other parameters as in Problem 1.
Solution. First define $N(t, t_i, h)$, and then the basis set.
function N(t,ti,h)
  if ((t<=ti-h) || (ti+h<=t)) return 0 end
  if (t<ti) return (t-ti)/h+1
  else return (ti-t)/h+1 end
end;
function LinBsplineBasis(t,n)
  m=size(t)[1]; h=(t[m]-t[1])/(n-1); A=N.(t,t[1],h)
  for k=1:n-1
    A = [A N.(t,t[1]+k*h,h)]
  end
  return A
end;
t=LinRange(-1,1,4); A=LinBsplineBasis(t,2)

which evaluates the two hat functions centered at $t_1 = -1$ and $t_2 = 1$ (with $h = 2$) at the four sample points:
$$ \boldsymbol{A} = \begin{bmatrix} 1 & 0 \\ 2/3 & 1/3 \\ 1/3 & 2/3 \\ 0 & 1 \end{bmatrix} . \quad (2) $$
In contrast to the previous basis sets, the linear splines are
non-zero only over the interval $(t_i - h, t_i + h)$; they are said to have compact support.
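Besides compact support, the hat functions form a partition of unity on $[t_1, t_m]$: at every sample point their values sum to one. A quick check (an added sketch, not part of the original solution):

let t = LinRange(-1, 1, 33), n = 9
  A = LinBsplineBasis(t, n)
  maximum(abs.(sum(A, dims=2) .- 1))   # ≈ 0 up to roundoff: each row of A sums to 1
end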
Figure 12. Linear $B$-spline basis functions for $n=9$.
data=sample(-1,1,Runge,17); t=data[1]; A=LinBsplineBasis(t,9);
for k=1:8
  global t
  plot(t,A[:,k])
end
grid("on"); xlabel("t"); ylabel("B");
title("Linear B-spline basis functions");
FigPrefix=homedir()*"/courses/MATH661/images/H03";
savefig(FigPrefix*"BsplineBasis.eps")
Figure 13. Effect of increasing number of trigonometric basis functions in least squares approximation of Runge function. Equidistant sample points with $m = 2^q$, $q = 4, \dots, 8$ (from row 1 to row 5). Note that when more basis functions are used than data points, the approximation error is large.
Figure 14. Convergence of least squares approximation, linear $B$-spline basis.
for q=5:8
  local m,data,t,y
  m=2^q; data=sample(-1,1,Runge,m+1); t=data[1]; y=data[2]
  plotConv(t,y,LinBsplineBasis)
end
savefig(FigPrefix*"FigConvBspline.eps")
for p=2:5
  local n=2^p; ns=string(n)
  for q=4:8
    local m=2^q; ms=string(m)
    clf(); plotLSQ(-1,1,Runge,LinBsplineBasis,m+1,n+1,512)
    savefig(FigPrefix*"Fig09m="*ms*"n="*ns*".eps")
  end
end