Miguel Á. Carreira-Perpiñán
Assistant Professor
Dept. of Computer Science & Electrical Engineering
OGI School of Science & Engineering / OHSU
miguel-[at]-csee.ogi.edu; 503-748-1455
Office: Central Building 142

Office hours: by appointment (call or email, including [MATH519] in the subject).

Classes: Mondays and Wednesdays, 6:30-7:59pm, in classroom 401, Paul Clayton Building (number 2 on the map).

Course web page: `http://www.csee.ogi.edu/~miguel/math519`

Optimization problems arise in multiple areas of science, engineering and business. This course introduces numerical methods for continuous optimization, focusing on practical methods. The course will cover derivative-based methods for constrained and unconstrained multivariate optimization, including line-search and trust-region strategies; conjugate-gradient, Newton and quasi-Newton methods; linear programming (simplex and interior-point methods); quadratic programming; penalty, barrier and augmented Lagrangian methods; and sequential quadratic programming. The primary programming tool for this course is Matlab.
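To make the line-search strategy mentioned above concrete, here is a minimal sketch of steepest descent with a backtracking (Armijo) line search. It is written in Python/NumPy purely for illustration (the course itself uses Matlab), and the test function, starting point and parameter values are made up for the example:

```python
import numpy as np

def backtracking_line_search(f, grad, x, p, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink the step length until the sufficient-decrease (Armijo) condition holds."""
    fx, gx = f(x), grad(x)
    while f(x + alpha * p) > fx + c * alpha * gx @ p:
        alpha *= rho
    return alpha

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimize f by stepping along the negative gradient with a backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient is (nearly) zero
            break
        p = -g                         # steepest-descent direction
        alpha = backtracking_line_search(f, grad, x, p)
        x = x + alpha * p
    return x

# Example: a simple convex quadratic with minimizer (1, 2)
f = lambda x: (x[0] - 1.0)**2 + 4.0 * (x[1] - 2.0)**2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 8.0 * (x[1] - 2.0)])
x_star = steepest_descent(f, grad, np.zeros(2))
```

The same scheme underlies the more sophisticated methods in the course: Newton and quasi-Newton methods only change how the search direction `p` is chosen.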

Prerequisites: undergraduate courses in linear algebra and multivariate calculus. Basic concepts will be briefly reviewed during the course.

Required textbook (get the errata and additional errata):

- Jorge Nocedal and Stephen J. Wright:
*Numerical Optimization*. Springer-Verlag, 1999.

There is also a second edition that you can use if you want.

Other recommended books:

- R. Fletcher:
*Practical Methods of Optimization*. Wiley, 1987.
- David G. Luenberger:
*Introduction to Linear and Nonlinear Programming*, 2nd ed. Addison-Wesley, 1984.
- Stephen Boyd and Lieven Vandenberghe:
*Convex Optimization*. Cambridge University Press, 2004. Available online.
- W. H. Press, S. A. Teukolsky, W. T. Vetterling and B. P. Flannery:
*Numerical Recipes in C: The Art of Scientific Computing*, 2nd ed., chapter 10. Cambridge University Press, 1992. Available online.

All are on reserve in the OGI Library.

If you want to refresh your knowledge of linear algebra and multivariate calculus, the following are helpful (any edition or similar book is fine):

- Seymour Lipschutz:
*Schaum's Outline of Linear Algebra*. McGraw-Hill.
- Murray Spiegel:
*Schaum's Outline of Vector Analysis*. McGraw-Hill.
- Frank Ayres:
*Schaum's Outline of Matrices*. McGraw-Hill.
- Richard Bronson:
*Schaum's Outline of Matrix Operations*. McGraw-Hill.
- Erwin Kreyszig:
*Advanced Engineering Mathematics*. Wiley.
- Gilbert Strang:
*Introduction to Linear Algebra*. Wellesley-Cambridge Press.
- Gilbert Strang:
*Linear Algebra and Its Applications*. Brooks Cole.

Before each class, you should have read the corresponding part of the textbook and the notes. I will teach the material in the order below (which is more or less the order in the book) except that the __underlined page numbers__ (corresponding to __pages 35-37, 48-52, 62-63 in the notes__) will be taught at the end of the course.

- Introduction (ch. 1: all) and math review (appendix A: skim it)
- Unconstrained optimization
  - Mathematical criteria (ch. 2: all; skip R-rates of convergence)
  - Line search methods (ch. 3: pp. 35-55; skip proofs in sec. 3.3)
  - Trust-region methods (ch. 4: pp. 65-69, 77-79)
  - Conjugate gradient methods (ch. 5: pp. 101-117, 120-122; skim 112-117)
  - Practical Newton methods (ch. 6: skim pp. 135-142)
  - Quasi-Newton methods (ch. 8: pp. 193-200)
  - Nonlinear least-squares problems (ch. 10: pp. 251-264; skip proofs)
  - Nonlinear equations (ch. 11: skim it)
- Constrained optimization
  - Mathematical criteria (ch. 12: pp. 315-331, 342-348)
  - Linear programming: the simplex method (ch. 13: pp. 364-370, __370-378 and 391-392; skip proofs in sec. 13.2__)
  - Linear programming: interior-point methods (ch. 14: pp. 395-402, skim 402-411)
  - Fundamentals of algorithms for nonlinear constrained optimization (ch. 15: all)
  - Quadratic programming (ch. 16: pp. 441-447, __453-459, 462, 464-467, 476-480__, 480-484)
  - Penalty, barrier and augmented Lagrangian methods (ch. 17: all; skip proofs; skim pp. 521-523)
  - Sequential quadratic programming (ch. 18: pp. __529-534__)
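As a small taste of the penalty methods listed above (ch. 17), here is a minimal sketch of the quadratic-penalty method on a made-up equality-constrained toy problem. It is in Python/NumPy for illustration only (the course uses Matlab), and the problem data are invented for the example:

```python
import numpy as np

# Quadratic-penalty sketch for:  min x'x  subject to  a'x = b.
# (Toy problem; the exact constrained minimizer is x = b*a/(a'a) = (0.5, 0.5).)
a = np.array([1.0, 1.0])
b = 1.0

def penalty_subproblem(mu):
    """Minimize Q(x; mu) = x'x + (mu/2)(a'x - b)^2.
    Q is a quadratic, so its minimizer solves (2I + mu*a*a') x = mu*b*a."""
    n = a.size
    A = 2.0 * np.eye(n) + mu * np.outer(a, a)
    return np.linalg.solve(A, mu * b * a)

x = None
mu = 1.0
for _ in range(8):            # drive the penalty parameter mu toward infinity
    x = penalty_subproblem(mu)
    mu *= 10.0
# x approaches the constrained minimizer (0.5, 0.5) as mu grows
```

Note the classic drawback visible even here: as `mu` grows, the subproblem matrix becomes increasingly ill-conditioned, which is one motivation for the augmented Lagrangian methods covered in the same chapter.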

- Some Matlab code to plot contours of functions, etc.
- The notes and math review to accompany the textbook. Bring the corresponding part to each class.
- Homework (to do on your own, not graded):
  - Homework #1 (covering chapters 1-4 from the book); solutions #1 (**available from April 16**)
  - Homework #2 (covering chapters 5, 6, 8, 10 from the book); solutions #2 (**available from May 4**)
  - Homework #3 (covering chapters 12-15 from the book); solutions #3 (**available from May 14**)
  - Homework #4 (covering chapters 16-18 from the book); solutions #4 (**available from May 28**)

- Projects (to be submitted and graded):
  - Project #1 (implementing an unconstrained optimization method), **due May 14**, in groups of 1 or 2 students
  - Project #2 (reviewing a paper), **due May 14**, individual (no groups)
  - Project #3 (implementing a constrained optimization method), **due June 10**, in groups of 1 or 2 students


The course grading will be based on three projects and a final exam, as follows:

- Project 1 (17%): implementing a method for unconstrained optimization problems
- Project 2 (17%): reviewing a research paper related to optimization (I will accept suggestions for these)
- Project 3 (33%): implementing a method for constrained optimization problems
- Exam (33%): in-class, closed-book, consisting of problems and conceptual questions

While I encourage you to discuss your work with other students, the projects and the exam must be the result of your own work without collaboration.

I will also give homework exercises (mainly from the textbook) of two types: pencil-and-paper and Matlab programming. You will not be required to hand them in, i.e., they will not count towards your grade. I will give out the solutions and solve some of the exercises in class. **However, I strongly recommend that you try to solve all the exercises on your own.**

- Demos:
  - Matlab Optimization Toolbox. You may find it useful to compare the textbook and the user's guide. You can see some demos by running Matlab from an OGI PC and typing `demo toolbox optimization` in the command window. But note that you will be writing your own code rather than using the Toolbox.
- NEOS Guide: information about most areas of optimization
- Optimization Online: a repository of eprints about optimization
- Mathematical Programming Glossary
- Tutorials and books
- Matrix identities (handy formulas for matrix derivatives, inverses, etc.)
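Two examples of the kind of identities collected in such references (both standard results): the Sherman-Morrison formula, which underlies rank-one quasi-Newton updates such as those in BFGS, and a basic matrix-derivative rule:

```latex
% Sherman--Morrison: rank-one update of an inverse
% (A invertible, 1 + v^T A^{-1} u \neq 0)
(A + u v^T)^{-1} = A^{-1} - \frac{A^{-1} u v^T A^{-1}}{1 + v^T A^{-1} u}

% Derivative of a quadratic form (reduces to 2Ax when A is symmetric)
\frac{\partial}{\partial x}\, x^T A x = (A + A^T) x
```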

If you have never used Matlab, many online tutorials are available.

Miguel A. Carreira-Perpinan Last modified: Sat Jun 9 20:43:25 PDT 2007
