Miguel Á. Carreira-Perpiñán
Assistant professor
Electrical Engineering and Computer Science
School of Engineering
University of California, Merced
mcarreira-perpinan-[at]-ucmerced.edu; 209-2284545
Office: 284, Science & Engineering Building

Office hours: Wednesdays 3-4pm (SE284).

Lectures: Mondays/Wednesdays 1:30-2:45pm (COB279)

Lab class: Wednesdays 4:30-7:15pm (Linux Lab, SE138)

Course web page: `http://faculty.ucmerced.edu/mcarreira-perpinan/EECS260`

Optimization problems arise in multiple areas of science, engineering and business. This course introduces theory and numerical methods for continuous multivariate optimization (constrained and unconstrained), including: line-search and trust-region strategies; conjugate-gradient, Newton, quasi-Newton and large-scale methods; linear programming; quadratic programming; penalty and augmented Lagrangian methods; sequential quadratic programming; and interior-point methods. The primary programming tool for this course is Matlab.
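For a concrete taste of the line-search strategy mentioned above, here is a minimal sketch of steepest descent with a backtracking (Armijo) line search on a small quadratic. The course itself uses Matlab; this sketch is in Python purely for illustration, and all function and variable names are illustrative, not part of the course materials.

```python
import numpy as np

def backtracking_gradient_descent(f, grad, x0, alpha0=1.0, rho=0.5, c=1e-4,
                                  tol=1e-8, max_iter=1000):
    """Steepest descent with a backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # stop when the gradient is (almost) zero
            break
        p = -g                           # steepest-descent direction
        a = alpha0
        # Shrink the step until the Armijo sufficient-decrease condition holds
        while f(x + a * p) > f(x) + c * a * g.dot(p):
            a *= rho
        x = x + a * p
    return x

# Example: minimize f(x) = 0.5 x'Ax - b'x; the minimizer solves Ax = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
xmin = backtracking_gradient_descent(f, grad, np.zeros(2))
```

Since `A` is symmetric positive definite, the iterates converge to the unique minimizer, which here can be checked against `np.linalg.solve(A, b)`.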

Prerequisites: MATH 23, MATH 24, MATH 141 or equivalent (undergraduate courses in linear algebra and multivariate calculus). Basic concepts will be briefly reviewed during the course.

Required textbook (get the errata and additional errata):

- Jorge Nocedal and Stephen J. Wright:
*Numerical Optimization*, second ed. Springer-Verlag, 2006. This book is accessible online from within UC Merced.

Other recommended books:

- R. Fletcher:
*Practical Methods of Optimization*. Wiley, 1987.
- David G. Luenberger and Yinyu Ye:
*Linear and Nonlinear Programming*, 3rd ed. Springer, 2008. Available online from within UC Merced.
- Stephen Boyd and Lieven Vandenberghe:
*Convex Optimization*. Cambridge University Press, 2004. Available online.
- W. H. Press, S. A. Teukolsky, W. T. Vetterling and B. P. Flannery:
*Numerical Recipes in C: The Art of Scientific Computing*, 2nd ed., chapter 10. Cambridge University Press, 1992.

If you want to refresh your knowledge of linear algebra and multivariate calculus, the following are helpful (any edition or similar book is fine):

- Seymour Lipschutz:
*Schaum's Outline of Linear Algebra*. McGraw-Hill.
- Murray Spiegel:
*Schaum's Outline of Vector Analysis*. McGraw-Hill.
- Frank Ayres:
*Schaum's Outline of Matrices*. McGraw-Hill.
- Richard Bronson:
*Schaum's Outline of Matrix Operations*. McGraw-Hill.
- Erwin Kreyszig:
*Advanced Engineering Mathematics*. Wiley.
- Gilbert Strang:
*Introduction to Linear Algebra*. Wellesley-Cambridge Press.
- Gilbert Strang:
*Linear Algebra and Its Applications*. Brooks Cole.

Before each class, you should have read the corresponding part of the textbook and the notes. I will teach the material in the order below (which is more or less the order in the book).

- Introduction (ch. 1: all) and math review (appendix A: skim it)
- Unconstrained optimization:
- Fundamentals of unconstrained optimization (ch. 2: all)
- Line search methods (ch. 3: pp. 30-49; skip proofs in sec. 3.3)
- Trust-region methods (ch. 4: pp. 66-71)
- Conjugate gradient methods (ch. 5: all, but skim pp. 113-114, 119-120, 125-133)
- Quasi-Newton methods (ch. 6: pp. 135-146; skip 147-152; skim rest)
- Large-scale unconstrained optimization (ch. 7: pp. 164-170, 176-181, 185-190; skip 171-175, 182-184)
- Calculating derivatives (ch. 8: pp. 193-206; skim 199-201, 203; skip rest)
- Derivative-free optimization (ch. 9: all except pp. 226-229, 237)
- Least-squares problems (ch. 10: all but skim pp. 251-252, 259-268)
- Nonlinear equations (ch. 11: all but skim 277-279, 287-295; skip 283-284)

- Constrained optimization:
- Theory of constrained optimization (ch. 12: all except as follows: from section 12.2 see defs. 12.3, 12.4 and skip the rest; from section 12.5 skip the proofs of second-order conditions; skip sections 12.4, 12.6, 12.7, 12.9)
- Linear programming: the simplex method (ch. 13: all except as follows: skip sections 13.4 to 13.7)
- Linear programming: interior-point methods (ch. 14: all except as follows: skip lemmas 14.1, 14.2 and the proof of th. 14.3; skim pp. 410, 412-413; skip section 14.3)
- Fundamentals of algorithms for nonlinear constrained optimization (ch. 15: pp. 421-437)
- Quadratic programming (ch. 16: all except as follows: skip sections 16.2-3, skim section 16.5 and skim pp. 482-485)
- Penalty and augmented Lagrangian methods (ch. 17: all except as follows: skip def. 17.1, skim section 17.4)
- Sequential quadratic programming (ch. 18: sections 18.1-2; skip rest)
- Interior-point methods for nonlinear programming (ch. 19: sections 19.1, 19.2 and 19.6; skip rest)

- Some Matlab code to plot contours of functions, etc. The function `numgradhess.m` is useful to check numerically whether the expressions for your gradient and Hessian are correct.
- The notes to accompany the textbook. Bring the corresponding part to each class.
- The notes for the 1st edition of the textbook.
- Homework (to do on your own, not graded):
- Homework #1 (covering chapters 1-4 and A from the book)
- Homework #2 (covering chapters 5-11 from the book)
- Homework #3 (covering chapters 12-15 from the book)
- Homework #4 (covering chapters 16-19 from the book)
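The kind of numerical derivative check that `numgradhess.m` performs can be sketched with central finite differences. Below is a minimal Python sketch (the course uses Matlab; all names here are illustrative) that compares an analytic gradient against a finite-difference approximation, using the Rosenbrock function as a test case:

```python
import numpy as np

def check_gradient(f, grad, x, h=1e-6):
    """Return the max absolute difference between grad(x) and a
    central finite-difference approximation of the gradient of f at x."""
    x = np.asarray(x, dtype=float)
    g_num = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g_num[i] = (f(x + e) - f(x - e)) / (2 * h)   # central difference
    return np.max(np.abs(g_num - grad(x)))

# Rosenbrock function and its analytic gradient
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
err = check_gradient(f, grad, np.array([-1.2, 1.0]))
```

If the analytic gradient is correct, `err` should be tiny (limited by truncation and roundoff error); a large value signals a bug in the gradient expression.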

- Projects (to be submitted and graded):
- Project #1 (implementing unconstrained optimization methods)
**due March 26**, in groups of 1 to 3 students
- Project #2 (reviewing a paper)
**due March 26**, individual (no groups)
- Project #3 (implementing a constrained optimization method)
**due May 7**, in groups of 1 to 3 students

For projects done in groups, briefly describe in the report what each member of the group did. Note that each member should understand the whole project. Project 3 will build on your project 1 solution.


The course grading will be based on three projects, a midterm exam and a final exam, as follows (note that too low a grade in the exams cannot be compensated by a high grade in the projects, or vice versa):

- Project 1 (15%): implementing a method for unconstrained optimization problems
- Project 2 (10%): reviewing a research paper related to optimization (I will accept suggestions for these)
- Project 3 (25%): implementing a method for constrained optimization problems
- Midterm exam (25%): in-class, closed-book, consisting of problems and conceptual questions
- Final exam (25%): as the midterm

While I encourage you to discuss your work with other students, the projects and the exams must be the result of your own work, without collaboration.

I will also give homework exercises (mainly from the textbook) of two types: pencil-and-paper and Matlab programming. These are not to be handed in, i.e., they will not count towards your grade. I will give out the solutions and solve some of the exercises in class. **However, I strongly recommend that you try to solve all the exercises on your own.**

- Demos:
- Matlab Optimization Toolbox. You may find it useful to compare the textbook and the user's guide. You can see some demos by running Matlab and typing `demo toolbox optimization` in the command window. But note that you will be writing your own code, rather than using the Toolbox.
- NEOS Guide: information about most areas of optimization
- Optimization Online: a repository of eprints about optimization
- Mathematical Programming Glossary
- Matrix identities (handy formulas for matrix derivatives, inverses, etc.):

If you have never used Matlab, there are many online tutorials available.

Miguel A. Carreira-Perpinan Last modified: Thu May 6 14:55:09 PDT 2010