YALMIP: A toolbox for modeling and optimization in MATLAB
Johan Löfberg
Automatic Control Laboratory, ETHZ
CH-8092 Zürich, Switzerland.
[email protected]
Abstract— The MATLAB toolbox YALMIP is introduced. It is described how YALMIP can be used to model and solve optimization problems typically occurring in systems and control theory.

I. INTRODUCTION

Two of the most important mathematical tools introduced in control and systems theory in the last decade are probably semidefinite programming (SDP) and linear matrix inequalities (LMIs). Semidefinite programming unifies a large number of control problems, ranging from the more than 100 year old classical Lyapunov theory for linear systems, via modern control theory from the 60's based on the algebraic Riccati equation, to more recent developments such as H∞ control in the 80's. More importantly, LMIs and SDP have led to many new results on stability analysis and synthesis for uncertain systems, robust model predictive control, control of piecewise affine systems and robust system identification, just to mention a few applications.

In the same sense that we earlier agreed that a control problem was solved if it boiled down to a Riccati equation, as in linear quadratic control, we have now come to a point where a problem with a solution implicitly described by an SDP can be considered solved, even though there is no analytic closed-form expression for the solution. It was recognized in the 90's that SDPs are convex optimization problems that can be solved efficiently in polynomial time [13]. Hence, for a problem stated as an SDP, not only can we solve the problem, we can also solve it relatively efficiently.

The large number of applications of SDP has led to intense research and development of software for solving these optimization problems. There are today around 10 public solvers available, most of them free and easily accessible on the Internet. However, these solvers typically take the problem description in a very compact format, making immediate use of the solvers time-consuming and error prone. To overcome this, modeling languages and interfaces are needed.

This paper introduces the free MATLAB toolbox YALMIP, developed initially to model SDPs and solve these by interfacing external solvers. The toolbox makes development of optimization problems in general, and control oriented SDP problems in particular, extremely simple. Rapid prototyping of an algorithm based on SDP can be done in a matter of minutes using standard MATLAB commands. In fact, learning 3 YALMIP specific commands is enough for most users to model and solve their optimization problems.

YALMIP was initially intended for SDP and LMIs (hence the now obsolete name Yet Another LMI Parser), but has evolved substantially over the years. The most recent release, YALMIP 3, supports linear programming (LP), quadratic programming (QP), second order cone programming (SOCP), semidefinite programming, determinant maximization, mixed integer programming, posynomial geometric programming, semidefinite programs with bilinear matrix inequalities (BMIs), and multiparametric linear and quadratic programming. To solve these problems, around 20 solvers are interfaced. This includes both freeware solvers such as SeDuMi [16] and SDPT3 [17], and commercial solvers such as the PENNON solvers [7], LMILAB [4] and CPLEX [1]. Due to a flexible solver interface and internal format, adding new solvers, and even new problem classes, can often be done with modest effort.

YALMIP automatically detects what kind of problem the user has defined, and selects a suitable solver based on this analysis. If no suitable solver is available, YALMIP tries to convert the problem into one it can solve. As an example, if the user defines second order cone constraints, but no second order cone programming solver is available, YALMIP converts the constraints to LMIs and solves the problem using any installed SDP solver.
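To illustrate the kind of conversion involved, consider the following sketch (not taken from the paper; A, b, c and d are assumed to be given data of compatible dimensions). The second order cone constraint ||Ax+b|| ≤ c^T x + d and its Schur complement based LMI reformulation describe the same feasible set.

>> x = sdpvar(size(A,2),1);
>> F_soc = set(cone(A*x+b, c'*x+d));                         % cone constraint form
>> m = length(b);
>> F_lmi = set([(c'*x+d)*eye(m) A*x+b; (A*x+b)' c'*x+d] > 0); % equivalent LMI form

The cone form is preferred whenever a dedicated SOCP solver is installed, since the LMI embedding is computationally more expensive.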
One of the most important extensions in YALMIP 3 compared to earlier versions is the possibility to work with nonlinear expressions. This has enabled YALMIP users to define optimization problems involving BMIs, which can then be solved using the solver PENBMI [6], the first public solver for problems with BMI constraints. These optimization problems are unfortunately extremely hard to solve, at least globally, but since an enormous number of problems in control theory fall into this problem class, it is our hope that YALMIP will inspire researchers to develop efficient BMI solvers and make them publicly available.

Another addition in YALMIP 3 is an internal branch-and-bound framework. This enables YALMIP to solve integer programs for all supported convex optimization classes, i.e. mixed integer linear, quadratic, second order cone and semidefinite programs. The built-in integer solver should not be considered a competitor to any dedicated integer solver such as CPLEX [1]. However, if the user has no integer solver installed, he or she will at least be able to solve some small integer problems using YALMIP. Moreover, there are currently no other free public solvers available for solving mixed integer second order cone and semidefinite programs.
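As a rough sketch of what such a problem can look like (this example is not from the paper; the symmetric data matrices F0, F1, F2, the cost vector c and the bounds are hypothetical), a small mixed integer semidefinite program is stated and solved with the same commands as any other YALMIP problem.

>> x = sdpvar(2,1);
>> F = set(integer(x));                       % x is required to be integer valued
>> F = F + set(F0 + x(1)*F1 + x(2)*F2 > 0);   % LMI in the integer decision variables
>> F = F + set(x < 5) + set(x > -5);          % bounds keep the branching finite
>> solvesdp(F, c'*x);
>> double(x)

Internally, an ordinary SDP solver is then called in each node of the branch-and-bound tree.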
The latest release of YALMIP has been extended to include a set of mid-level commands to facilitate advanced YALMIP programming. These commands have been used to develop scripts for moment relaxation problems [10] and sum-of-squares decompositions [14], two recent approaches, based on SDP and LMIs, for solving global polynomial optimization problems. There are dedicated, more efficient, packages available for solving these problems (GloptiPoly [5] and SOSTOOLS [15]), and the inclusion of these functionalities is mainly intended to give advanced users hints on how the mid-level commands can be used. The sum-of-squares functionality does however have a novel feature in that the sum-of-squares problem can be nonlinearly parameterized. In theory, this means that this functionality can be used, e.g., to synthesize controllers for nonlinear systems. However, the resulting optimization problem is a semidefinite program with BMIs instead of LMIs.
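For a flavor of the sum-of-squares functionality, the sketch below (not from the paper, and assuming the sum-of-squares module is invoked through the command solvesos, which is not covered in this excerpt) computes a lower bound t on a univariate polynomial by requiring p(x) - t to be a sum of squares.

>> x = sdpvar(1,1);
>> t = sdpvar(1,1);
>> p = 1 + x + x^7 + x^8;
>> F = set(sos(p - t));        % p(x) - t must be a sum of squares
>> solvesos(F, -t);            % minimize -t, i.e., maximize the lower bound t
>> double(t)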
Although SDPs can be solved relatively efficiently using polynomial time algorithms, large-scale control problems can easily become problematic, even for state-of-the-art semidefinite solvers. To reduce computational complexity, problem-specific solvers are needed in some cases. One problem class where structure can be exploited is KYP problems, a generalization of Lyapunov inequalities. YALMIP comes with a specialized command for defining KYP constraints, and interfaces the dedicated solver KYPD [20].

Other features worth mentioning are the capabilities to work transparently with complex-valued data and constraints, easy extraction of dual variables, and automatic reduction of variables in equality constrained problems.
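As an example of dual variable extraction, the following sketch (not from the paper; C denotes a given symmetric data matrix, and the dual and indexing commands are assumed to behave as described in the YALMIP manual) retrieves the dual of a semidefinite constraint after the solver has finished.

>> P = sdpvar(n,n);
>> F = set(P > 0) + set(trace(P) == 1);
>> solvesdp(F, trace(C*P));
>> Z = dual(F(1));             % dual matrix associated with the constraint P > 0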
II. PRELIMINARIES AND NOTATION

A symmetric matrix P is denoted positive semidefinite (P ⪰ 0) if z^T P z ≥ 0 ∀z, and positive definite (P ≻ 0) in the strict version z^T P z > 0 ∀z ≠ 0. A linear matrix inequality (LMI) denotes a constraint of the form F_0 + ∑_{i=1}^{n} F_i x_i ⪰ 0, where the F_i are fixed symmetric matrices and x ∈ R^n is the decision variable. Constraints of the form F_0 + ∑_{i=1}^{n} F_i x_i + ∑_{j=1}^{n} ∑_{i=1}^{n} F_{ij} x_i x_j ⪰ 0 are denoted BMIs (bilinear matrix inequalities). Constraints involving either LMIs or BMIs are called semidefinite constraints. Optimization problems involving semidefinite constraints are termed semidefinite programs (SDPs).

MATLAB commands and variables will be displayed using typewriter font. Commands will be written on separate lines and start with >>.

III. INTRODUCTION TO YALMIP

This paper does not serve as a manual to YALMIP. Nevertheless, a short introduction to the basic commands is included here to allow new users to get started. It is assumed that the reader is familiar with MATLAB.

A. Defining decision variables

The central components in an optimization problem are the decision variables. Decision variables are represented in YALMIP by sdpvar objects. Using the full syntax, a symmetric matrix P ∈ R^{n×n} is defined by the following command.

>> P = sdpvar(n,n,'symmetric','real');

Square matrices are by default symmetric and real, so the same variable can be defined using only the dimension arguments.

>> P = sdpvar(n,n);

A set of standard parameterizations is predefined and can be used to create, e.g., fully parameterized matrices and various types of matrices with complex variables.

>> Y = sdpvar(n,n,'full');
>> X = sdpvar(n,n,'hermitian','complex');

It is important to realize that most standard MATLAB commands and operators can be applied to sdpvar variables. Hence, the following construction is valid.

>> X = [P P(:,1);ones(1,n) sum(sum(P))];

B. Defining constraints

The most commonly used constraints in YALMIP are element-wise, semidefinite and equality constraints. The command to define these is called set (not to be confused with the built-in MATLAB function set).

The code below generates a list of constraints, gathered in the set object F, constraining a matrix to be positive definite, having all elements positive, and with the sum of all elements equal to n.

>> P = sdpvar(n,n);
>> F = set(P > 0);
>> F = F + set(P(:) > 0);
>> F = F + set(sum(sum(P)) == n);

Note that the operators > and < are used to describe both semidefinite constraints and standard element-wise constraints (non-strict inequalities >= and <= are also supported; the reader is referred to the YALMIP manual for details). A constraint is interpreted in terms of semidefiniteness if both the left-hand side and the right-hand side of the constraint are symmetric, and as an element-wise constraint otherwise.
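The following small sketch (not from the paper; n is assumed to be defined) shows the difference: a symmetric expression on both sides gives a semidefinite constraint, while a vector expression gives element-wise inequalities.

>> P = sdpvar(n,n);            % symmetric by default
>> F1 = set(P > eye(n));       % symmetric sides: P - I constrained to be positive definite
>> F2 = set(diag(P) > 0);      % vector expression: element-wise constraints on the diagonal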
In addition to these standard constraints, YALMIP also supports convenient definition of integrality constraints, second order cone constraints and sum-of-squares constraints. Without going into details, typical notation for these constraints would be

>> F = set(integer(x));
>> F = set(cone(A*x+b,c'*x+d));
>> F = set(sos(1+x+x^7+x^8));

C. Solving optimization problems

Once all variables and constraints have been defined, the optimization problem can be solved. Let us for simplicity assume that we have matrices c, A and b and that we wish to minimize c^T x subject to the constraints Ax ≤ b and ∑ x_i = 1. The YALMIP code to define and solve this problem is extremely intuitive and is essentially a one-to-one mapping from the mathematical description. The command solvesdp is used for all optimization problems and typically takes two arguments, a set of constraints and the objective function.

>> x = sdpvar(length(c),1);
>> F = set(A*x < b)+set(sum(x)==1);
>> solvesdp(F,c'*x);

YALMIP will automatically categorize this as a linear programming problem and call a suitable solver. The optimal solution can be extracted with the command double(x). A third argument can be used to guide YALMIP in the selection of solver, to set display levels, to change solver specific options, etc.
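For example, a sketch of such a call (not from the paper; the option names follow the YALMIP options command sdpsettings) could select SeDuMi and turn off the solver display.

>> ops = sdpsettings('solver','sedumi','verbose',0);
>> solvesdp(F, c'*x, ops);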
Consider the classical Lyapunov stability problem for a linear system ẋ = Ax, i.e., finding a matrix P ≻ 0 such that A^T P + PA ≺ 0. The YALMIP implementation of this feasibility problem is given below.

>> P = sdpvar(n,n);
>> F = set(P > 0) + set(A'*P+P*A < 0);
>> solvesdp(F)
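Once the solver has finished, the computed matrix can be extracted with double and checked using standard MATLAB commands (a small sketch).

>> Pfeas = double(P);
>> min(eig(Pfeas))                   % should be positive
>> max(eig(A'*Pfeas + Pfeas*A))      % should be negative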
The problem above can be addressed more efficiently by solving the classical Lyapunov equation, and the benefit of semidefinite programming and YALMIP first becomes apparent when we try to solve more complex problems. Consider the problem of finding a common Lyapunov function for two different systems with state matrices A1 and A2. Furthermore, let us assume that we want to find a diagonal solution P satisfying P ⪰ Q for some given symmetric matrix Q, and moreover, we want to find the minimum trace solution. Stating this as an SDP using YALMIP is straightforward.

>> P = diag(sdpvar(n,1));
>> F = set(P > Q);
>> F = F + set(A1'*P+P*A1 < 0);
>> F = F + set(A2'*P+P*A2 < 0);
>> solvesdp(F,trace(P))

To make things even more complicated, consider the minimum Frobenius norm (Tr PP^T) problem. The changes in the code are minimal.
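One way the change could look (a sketch, not the continuation of the original text; YALMIP is assumed to accept the quadratic objective directly) is to simply replace the linear objective in the last command.

>> solvesdp(F, trace(P*P'))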