Section 14.4 Examples and questions

For the function \(f(x) = e^x \) on the interval \([0, 1]\text{,}\) use Matlab to compute the trapezoidal, midpoint, and Simpson's approximations with \(n=10\) and find the error of each approximation (that is, its difference with the actual integral).

Then repeat the same with \(f(x) = e^{\sqrt{x}} \text{.}\) What could explain the lackluster performance of Simpson's rule here?

Solution
f = @(x) exp(x);  % or exp(sqrt(x))
a = 0;
b = 1;
exact = exp(1)-1;  % or 2

n = 10;
h = (b-a)/n;
x = a:h:b;
y = f(x);   

T = (h/2)*(y(1) + y(end) + 2*sum(y(2:end-1)));   % composite trapezoidal rule
S = (h/3)*(y(1) + y(end) + 4*sum(y(2:2:end-1)) + 2*sum(y(3:2:end-2)));   % composite Simpson's rule
midpoints = (x(1:end-1) + x(2:end))/2;   % midpoints of the subintervals
M = h*sum(f(midpoints));   % composite midpoint rule

er = abs([T M S] - exact);
fprintf('Errors: Trapezoidal %g, Midpoint %g, Simpson %g\n', er);
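
As a quick sanity check (a sketch, assuming the script above has just been run), the hand-coded values can be compared against Matlab's built-in adaptive quadrature integral, used here only as a reference value:

Q = integral(f, a, b);   % adaptive quadrature, serves as a reference value
fprintf('Reference %g; errors: Trapezoidal %g, Midpoint %g, Simpson %g\n', Q, abs([T M S] - Q));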

The function \(f(x) = e^{\sqrt{x}} \) is not differentiable at \(0\text{.}\) This lack of smoothness affects higher-order methods more, because they are based on comparing the function to a smooth one (such as a parabola). The midpoint method has an advantage in that it does not use the problematic point \(0\text{.}\)
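
To see this numerically, one could rerun the same composite formulas for several values of \(n\) and watch how quickly the errors shrink: for a smooth integrand Simpson's error should drop by roughly a factor of \(16\) each time \(n\) doubles, while here the drop is expected to be much smaller. A minimal sketch:

f = @(x) exp(sqrt(x));
exact = 2;
a = 0;
b = 1;
for n = [10 20 40]
    h = (b-a)/n;
    x = a:h:b;
    y = f(x);
    S = (h/3)*(y(1) + y(end) + 4*sum(y(2:2:end-1)) + 2*sum(y(3:2:end-2)));   % Simpson's rule
    M = h*sum(f((x(1:end-1) + x(2:end))/2));   % midpoint rule
    fprintf('n = %d: Midpoint error %g, Simpson error %g\n', n, abs(M-exact), abs(S-exact));
end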

We derived Simpson's rule from the trapezoidal rule using Richardson extrapolation. Why not do the same starting from the midpoint rule instead of the trapezoidal rule?
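
For anyone who wants to experiment before answering, here is a minimal sketch that applies Richardson extrapolation to two midpoint values \(M(h)\) and \(M(h/2)\text{,}\) using the same composite midpoint formula as above; the combination \((4M(h/2) - M(h))/3\) cancels the leading \(h^2\) error term:

f = @(x) exp(x);
a = 0; b = 1; exact = exp(1) - 1;
h = (b-a)/10;
mid = @(hh) hh*sum(f(a + hh/2 : hh : b));   % composite midpoint rule with step hh
R = (4*mid(h/2) - mid(h))/3;   % Richardson-extrapolated midpoint value
fprintf('Extrapolated midpoint error: %g\n', abs(R - exact));

Comparing the evaluation points used by \(M(h)\) and \(M(h/2)\) with those used by \(T(h)\) and \(T(h/2)\) may suggest an answer.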