The method of auxiliary coordinates (MAC)

This page collects material about the method of auxiliary coordinates (MAC), a mathematical strategy for optimising "nested" systems, such as deep neural nets, without chain-rule gradients (backpropagation). MAC reuses single-layer training algorithms, can handle non-differentiable layers and introduces significant parallelism.
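
In brief (a sketch in my own notation, following the original MAC reference listed below; see the paper for the precise formulation): given a nested model with K layers trained by least squares,

\[ E(W) = \tfrac{1}{2}\sum_{n=1}^N \big\| y_n - f_K(\dots f_2(f_1(x_n;W_1);W_2)\dots;W_K) \big\|^2 , \]

MAC introduces an auxiliary coordinate z_{n,k} for the output of layer k on training point n and rewrites the problem in constrained form,

\[ \min_{W,Z}\ \tfrac{1}{2}\sum_{n=1}^N \| y_n - f_K(z_{n,K-1};W_K) \|^2 \quad \text{s.t.}\quad z_{n,k} = f_k(z_{n,k-1};W_k),\ \ k=1,\dots,K-1,\ \ z_{n,0} \equiv x_n , \]

which can then be solved, for example, with a quadratic-penalty method,

\[ E_Q(W,Z;\mu) = \tfrac{1}{2}\sum_{n=1}^N \| y_n - f_K(z_{n,K-1};W_K) \|^2 + \tfrac{\mu}{2}\sum_{n=1}^N\sum_{k=1}^{K-1} \| z_{n,k} - f_k(z_{n,k-1};W_k) \|^2 , \qquad \mu \to \infty , \]

by alternating optimisation over W and Z. The W-step decouples into K independent single-layer fits, so existing single-layer algorithms are reused and each layer only needs some single-layer solver rather than derivatives; the Z-step decouples into N independent small problems, one per training point, which is the source of the parallelism.

The following is a minimal numerical sketch of this alternation on a toy two-layer linear model (illustrative code of my own, not the authors' implementation; with linear layers both steps have closed-form least-squares solutions):

import numpy as np

rng = np.random.default_rng(0)
D, H, O, N = 5, 3, 2, 200                        # input, hidden, output dims; number of points
X = rng.normal(size=(D, N))
W1_true = rng.normal(size=(H, D))
W2_true = rng.normal(size=(O, H))
Y = W2_true @ (W1_true @ X) + 0.01 * rng.normal(size=(O, N))

# Initialise weights and auxiliary coordinates Z (one column per training point).
W1 = rng.normal(size=(H, D))
W2 = rng.normal(size=(O, H))
Z = W1 @ X                                       # warm start: Z = f1(X; W1)

for mu in [1.0, 10.0, 100.0, 1000.0]:            # increasing penalty parameter
    for _ in range(20):
        # W-step: decouples into one least-squares fit per layer.
        W1 = np.linalg.solve(X @ X.T, X @ Z.T).T     # fit layer 1 to (X, Z)
        W2 = np.linalg.solve(Z @ Z.T, Z @ Y.T).T     # fit layer 2 to (Z, Y)
        # Z-step: decouples over the N points; for linear layers it has the
        # closed form (W2^T W2 + mu*I) Z = W2^T Y + mu*W1*X.
        Z = np.linalg.solve(W2.T @ W2 + mu * np.eye(H), W2.T @ Y + mu * (W1 @ X))

print("nested objective:", 0.5 * np.linalg.norm(Y - W2 @ (W1 @ X))**2)

The toy example uses exact closed-form steps; in general each step would be an approximate solve, e.g. a few iterations of whatever training routine is appropriate for that layer.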

We have used ideas related to MAC in our "learning-compression" algorithm for neural net compression.
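
For context, that algorithm casts compression as a constrained optimisation problem (sketched here in my own notation; see the learning-compression papers for the exact formulation):

\[ \min_{w,\theta}\ L(w) \quad \text{s.t.}\quad w = \Delta(\theta) , \]

where L is the task loss on the uncompressed weights w, θ collects the low-dimensional compression parameters (e.g. a codebook and assignments for quantisation, or low-rank factors) and Δ is the corresponding decompression mapping. A penalty or augmented-Lagrangian method then alternates a learning (L) step, which retrains w under a quadratic pull towards Δ(θ), with a compression (C) step, which compresses the current weights by solving \(\min_\theta \|w - \Delta(\theta)\|^2\). The role of the constraints and the alternating structure parallel MAC's auxiliary coordinates and its W- and Z-steps.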

This work has been done jointly with my past and present students and collaborators Mehdi Alizadeh, Zhengdong Lu, Ramin Raziperchikolaei, Max Vladymyrov and Weiran Wang.

It has been funded in part by the National Science Foundation (NSF).

Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Selected presentations

Tutorials, reviews

Original reference for MAC

Other references: extensions, related work, etc.


Miguel Á. Carreira-Perpiñán
Last modified: Wed Nov 1 01:51:29 PDT 2017
