The method of auxiliary coordinates (MAC)

This page collects material about the method of auxiliary coordinates (MAC), a mathematical strategy for optimising "nested" systems, such as deep neural nets, without using chain-rule gradients (backpropagation). MAC reuses existing single-layer algorithms, handles non-differentiable layers and introduces significant parallelism.
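For illustration only, here is a minimal sketch of the MAC idea for a two-layer nested model; the notation below (f_1, f_2, the auxiliary coordinates z_n and the penalty parameter \mu) is assumed here rather than quoted from this page. Given the nested least-squares objective

    E(\mathbf{W}) = \sum_{n=1}^N \| \mathbf{y}_n - \mathbf{f}_2(\mathbf{f}_1(\mathbf{x}_n; \mathbf{W}_1); \mathbf{W}_2) \|^2,

MAC introduces one auxiliary coordinate vector \mathbf{z}_n per data point for the output of the inner layer, subject to the constraints \mathbf{z}_n = \mathbf{f}_1(\mathbf{x}_n; \mathbf{W}_1). A quadratic-penalty version of the constrained problem,

    E_Q(\mathbf{W}, \mathbf{Z}; \mu) = \sum_{n=1}^N \| \mathbf{y}_n - \mathbf{f}_2(\mathbf{z}_n; \mathbf{W}_2) \|^2 + \mu \sum_{n=1}^N \| \mathbf{z}_n - \mathbf{f}_1(\mathbf{x}_n; \mathbf{W}_1) \|^2,

is then minimised by alternating a step over the weights W, which splits into independent single-layer fits (each reusing an existing algorithm, with no chain rule, and trivially parallelisable), and a step over the coordinates Z, which splits into N small per-point problems, while \mu is driven to infinity so that the constraints, and hence the original nested objective, are recovered. Non-differentiable layers can be handled because each layer is only ever fitted to its own inputs and targets.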

We have used ideas related to MAC in our "learning-compression" algorithm for neural net compression.
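As a rough sketch of that connection (again with notation, L, \mathbf{w}, \boldsymbol{\Theta}, \boldsymbol{\Delta} and \mu, assumed here rather than quoted from this page), the learning-compression formulation writes compression as a constrained problem

    \min_{\mathbf{w}, \boldsymbol{\Theta}} L(\mathbf{w}) \quad \text{s.t.} \quad \mathbf{w} = \boldsymbol{\Delta}(\boldsymbol{\Theta}),

where L is the net's loss, \mathbf{w} its weights and \boldsymbol{\Delta}(\boldsymbol{\Theta}) the decompression mapping of low-dimensional compression parameters \boldsymbol{\Theta} (for example, codebooks for quantisation or factors for a low-rank decomposition). A penalty or augmented-Lagrangian scheme then alternates a learning step, which retrains the net with a quadratic term (\mu/2) \|\mathbf{w} - \boldsymbol{\Delta}(\boldsymbol{\Theta})\|^2 pulling the weights toward their compressed form, with a compression step, which optimally compresses the current weights, while \mu is driven to infinity.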

This work has been done in collaboration with my past and present students and collaborators Mehdi Alizadeh, Magzhan Gabidolla, Zhengdong Lu, Ramin Raziperchikolaei, Max Vladymyrov, Weiran Wang and Arman Zharmagambetov.

 

It has been funded in part by the National Science Foundation (NSF).

Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Selected presentations

Original reference for MAC

Other references: extensions, related work, etc.

Tutorials, reviews


Miguel Á. Carreira-Perpiñán
Last modified: Mon Jan 16 17:26:51 PST 2023
