# Optimization in MATLAB – Part II | MATLAB Tutorial

**by** Gunjan Gupta


In the previous tutorial, we began our discussion of optimization and, in particular, covered the golden-section search method of one-dimensional optimization. Moving on, we will first discuss the parabolic interpolation method of one-dimensional optimization, and then turn to multidimensional optimization.

**Parabolic Interpolation**

Parabolic interpolation takes advantage of the fact that a second-order polynomial often provides a good approximation to the shape of *f*(*x*) near an optimum (Fig. 1).

**Figure 1: Graphical depiction of parabolic interpolation.**

Just as there is only one straight line connecting two points, there is only one parabola connecting three points. Thus, if we have three points that jointly bracket an optimum, we can fit a parabola to the points. We can then differentiate it, set the result equal to zero, and solve for an estimate of the optimal *x*. It can be shown through some algebraic manipulations that the result is

x4 = x2 − (1/2) · [(x2 − x1)² (f(x2) − f(x3)) − (x2 − x3)² (f(x2) − f(x1))] / [(x2 − x1) (f(x2) − f(x3)) − (x2 − x3) (f(x2) − f(x1))]

where *x*1, *x*2, and *x*3 are the initial guesses, and *x*4 is the value of *x* that corresponds to the optimum of the parabolic fit to the guesses.
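As a quick numerical check of the formula, here is one parabolic-interpolation step sketched in Python (an illustrative sketch, not part of the original MATLAB material), using the function and guesses from Problem 1 below:

```python
import math

def parabolic_step(f, x1, x2, x3):
    """Return the vertex x4 of the parabola through (x1, f(x1)), (x2, f(x2)), (x3, f(x3))."""
    f1, f2, f3 = f(x1), f(x2), f(x3)
    num = (x2 - x1)**2 * (f2 - f3) - (x2 - x3)**2 * (f2 - f1)
    den = (x2 - x1) * (f2 - f3) - (x2 - x3) * (f2 - f1)
    return x2 - 0.5 * num / den

f = lambda x: x**2 / 10 - 2 * math.sin(x)
x4 = parabolic_step(f, 0, 1, 4)   # first iteration: x4 ≈ 1.5055
```

The result, x4 ≈ 1.5055 with f(x4) ≈ −1.7691, matches the first row of the iteration table below.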

**Problem 1:** Use parabolic interpolation to find the minimum of

f(x) = x²/10 − 2 sin x

with initial guesses of *x*1 = 0, *x*2 = 1, and *x*3 = 4.

The function values at the three guesses can be calculated: *f*(*x*1) = 0, *f*(*x*2) = −1.5829, and *f*(*x*3) = 3.1136.

Substituting these into the equation for *x*4, we get *x*4 = 1.5055, which has a function value of *f*(1.5055) = −1.7691.

Next, a strategy similar to the golden-section search can be employed to determine which point should be discarded. Because the function value for the new point is lower than for the intermediate point (*x*2), and the new *x* value is to the right of the intermediate point, the lower guess (*x*1) is discarded. Therefore, for the next iteration, *x*1 = 1, *x*2 = 1.5055, and *x*3 = 4.

Substituting these into the equation for *x*4, we get *x*4 = 1.4903, which has a function value of *f*(1.4903) = −1.7714. The process can then be repeated, with the results tabulated below:

| *i* | *x*1 | *f*(*x*1) | *x*2 | *f*(*x*2) | *x*3 | *f*(*x*3) | *x*4 | *f*(*x*4) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | 0 | 0 | 1 | −1.5829 | 4 | 3.1136 | 1.5055 | −1.7691 |
| 2 | 1 | −1.5829 | 1.5055 | −1.7691 | 4 | 3.1136 | 1.4903 | −1.7714 |
| 3 | 1 | −1.5829 | 1.4903 | −1.7714 | 1.5055 | −1.7691 | 1.4256 | −1.7757 |
| 4 | 1 | −1.5829 | 1.4256 | −1.7757 | 1.4903 | −1.7714 | 1.4266 | −1.7757 |
| 5 | 1.4256 | −1.7757 | 1.4266 | −1.7757 | 1.4903 | −1.7714 | 1.4275 | −1.7757 |

Thus, within five iterations, the result converges rapidly on the true minimum value of −1.7757 at *x* = 1.4276.

**MATLAB m-file:**

```matlab
function [x,fx] = parab(f,x1,x2,x3,es)
% parab: minimization by successive parabolic interpolation
% input:
%   f = function handle
%   x1, x2, x3 = initial guesses
%   es = desired relative error in percent (default = 0.0001%)
% output:
%   x = location of minimum
%   fx = minimum function value
if nargin < 4, error('at least 4 input arguments required'); end
if nargin < 5 || isempty(es), es = 0.0001; end
x(1) = x1; x(2) = x2; x(3) = x3;
for i = 4:1000
    % vertex of the parabola through the three most recent points
    x(i) = x(i-2) - 0.5*(((x(i-2)-x(i-3))^2)*(f(x(i-2))-f(x(i-1))) ...
        - ((x(i-2)-x(i-1))^2)*(f(x(i-2))-f(x(i-3)))) / ...
        ((x(i-2)-x(i-3))*(f(x(i-2))-f(x(i-1))) ...
        - (x(i-2)-x(i-1))*(f(x(i-2))-f(x(i-3))));
    % stop when the percent relative change in x falls below es
    if abs((x(i)-x(i-1))/x(i))*100 <= es
        break;
    end
end
x = x(i); fx = f(x);
```

```matlab
>> y = @(x) ((x^2)/10) - 2*sin(x);
>> [xmin,fmin] = parab(y,0,1,4)
xmin =
    1.4276
fmin =
   -1.7757
```
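The m-file keeps refitting the parabola through the three most recent points. The same loop can be sketched in Python (an illustrative translation of the logic, not the article's code):

```python
import math

def parab(f, x1, x2, x3, es=0.0001, maxit=1000):
    """Successive parabolic interpolation: refit the parabola through
    the three most recent points until the percent relative change
    in x drops below es."""
    xs = [x1, x2, x3]
    for _ in range(maxit):
        a, b, c = xs[-3], xs[-2], xs[-1]   # same roles as x(i-3), x(i-2), x(i-1)
        fa, fb, fc = f(a), f(b), f(c)
        num = (b - a)**2 * (fb - fc) - (b - c)**2 * (fb - fa)
        den = (b - a) * (fb - fc) - (b - c) * (fb - fa)
        xnew = b - 0.5 * num / den
        xs.append(xnew)
        if abs((xnew - c) / xnew) * 100 <= es:   # percent relative error
            break
    return xnew, f(xnew)

xmin, fmin = parab(lambda x: x**2 / 10 - 2 * math.sin(x), 0, 1, 4)
```

Run on the same function and guesses, this converges to xmin ≈ 1.4276 and fmin ≈ −1.7757, matching the MATLAB session above.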

**Multidimensional Optimization**

Aside from one-dimensional functions, optimization also deals with multidimensional functions. As shown in Fig. 2*a*, our visual image of a one-dimensional search is like a roller coaster. For two-dimensional cases, the image becomes that of mountains and valleys (Fig. 2*b*).

**Figure 2: (a) One-dimensional optimization. (b) Two-dimensional optimization**

**Problem 2:** Use MATLAB's graphical capabilities to display the following function and visually estimate its minimum in the range −2 ≤ *x*1 ≤ 0 and 0 ≤ *x*2 ≤ 3:

f(x1, x2) = 2 + x1 − x2 + 2x1² + 2x1x2 + x2²

```matlab
clc; clear all; close all;
x = linspace(-2,0,40); y = linspace(0,3,40);
[X,Y] = meshgrid(x,y);
Z = 2 + X - Y + 2*X.^2 + 2*X.*Y + Y.^2;
subplot(1,2,1);
cs = contour(X,Y,Z); clabel(cs);
xlabel('x_1'); ylabel('x_2');
title('(a) Contour plot'); grid;
subplot(1,2,2);
surfc(X,Y,Z);
xlabel('x_1'); ylabel('x_2'); zlabel('f(x_1,x_2)');
title('(b) Mesh plot');
```

As displayed in the figure, both plots indicate that the function has a minimum value of between *f*(*x*1, *x*2) = 0 and 1, located at about *x*1 = −1 and *x*2 = 1.5.
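The visual estimate can be cross-checked with a coarse grid scan, sketched here in Python (an illustrative check, not part of the original session):

```python
# Scan f(x1, x2) = 2 + x1 - x2 + 2*x1^2 + 2*x1*x2 + x2^2 over the
# same ranges used in the plot: -2 <= x1 <= 0, 0 <= x2 <= 3.
f = lambda x1, x2: 2 + x1 - x2 + 2*x1**2 + 2*x1*x2 + x2**2

n = 41  # 41 points per axis puts x1 = -1 and x2 = 1.5 exactly on the grid
grid = [(-2 + 2*i/(n-1), 3*j/(n-1)) for i in range(n) for j in range(n)]
fmin, (x1, x2) = min((f(a, b), (a, b)) for a, b in grid)
# fmin = 0.75 at (x1, x2) = (-1.0, 1.5)
```

The grid minimum of 0.75 at (−1, 1.5) agrees with the visual estimate from the contour plot.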

Techniques for multidimensional unconstrained optimization can be classified in a number of ways. For purposes of the present discussion, we will divide them depending on whether they require derivative evaluation. Those that require derivatives are called *gradient, *or *descent *(or ascent), methods. The approaches that do not require derivative evaluation are called *non-gradient, *or *direct, *methods.

Standard MATLAB has a function fminsearch that can be used to determine the minimum of a multidimensional function. A simple expression of its syntax is

[*xmin*, *fval*] = fminsearch(*function*,*x0*)

where *xmin* and *fval* are the location and value of the minimum, *function* is the name of the function being evaluated, and *x0* is the initial guess. Note that *x0* can be a scalar, vector, or matrix. A simple MATLAB session that uses fminsearch to determine the minimum of the function we just graphed in Problem 2:

```matlab
>> f = @(x) 2 + x(1) - x(2) + 2*x(1)^2 + 2*x(1)*x(2) + x(2)^2;
>> [x,fval] = fminsearch(f,[-0.5,0.5])
x =
   -1.0000    1.5000
fval =
    0.7500
```
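Because this objective is quadratic, fminsearch's answer can be verified analytically: setting the gradient to zero gives the linear system 4x1 + 2x2 = −1 and 2x1 + 2x2 = 1. A quick exact check in Python (illustrative, not part of the original session):

```python
from fractions import Fraction as F

# Gradient of f = 2 + x1 - x2 + 2*x1^2 + 2*x1*x2 + x2^2:
#   df/dx1 = 1 + 4*x1 + 2*x2 = 0  ->  4*x1 + 2*x2 = -1
#   df/dx2 = -1 + 2*x1 + 2*x2 = 0 ->  2*x1 + 2*x2 =  1
# Solve the 2x2 system exactly with rationals (Cramer's rule).
a11, a12, b1 = F(4), F(2), F(-1)
a21, a22, b2 = F(2), F(2), F(1)
det = a11*a22 - a12*a21
x1 = (b1*a22 - a12*b2) / det
x2 = (a11*b2 - b1*a21) / det
fval = 2 + x1 - x2 + 2*x1**2 + 2*x1*x2 + x2**2
# x1 = -1, x2 = 3/2, fval = 3/4 -- matching fminsearch's (-1.0000, 1.5000), 0.7500
```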
