I attended the 2017 International Conference on Machine Learning (ICML) in Sydney, Australia. ICML is one of the flagship machine learning conferences. Since machine learning is a very applied field, there was a considerable industry presence, including all the big-name tech giants: Google (Brain, DeepMind), Microsoft, Facebook, Amazon, NVIDIA, etc. Machine learning isn’t my area of expertise, but it was nonetheless interesting to step out of my comfort zone and hear about the state of the art in this very popular field. As one might expect, deep learning and reinforcement learning were prominently featured, and there were some very impressive demonstrations. Much work remains to be done on the theory side of deep learning, however; the next five years should be quite exciting!
At ICML, I presented joint work with Bin Hu on using energy-based methods (dissipativity theory) to understand and interpret why and how iterative algorithms converge. The key idea is that iterative algorithms are dynamical systems: if we define an appropriate notion of internal energy, we can write down a conservation law, just as we do in physics. The rate at which internal energy dissipates is then equivalent to the rate at which the algorithm converges to its fixed point. If you’re interested, you can download my slides, my poster, or the paper itself.
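To make the idea concrete, here is a toy illustration of my own (not the construction from the paper): gradient descent on a strongly convex quadratic, viewed as a dynamical system whose internal energy is the squared distance to the fixed point. The energy decreases by a fixed factor every iteration, and that dissipation factor is exactly the convergence rate.

```python
# Toy sketch (my own example, not the paper's framework): gradient descent
# x_{k+1} = x_k - alpha * grad f(x_k) on f(x) = 0.5 x^T A x, whose unique
# fixed point is x* = 0. The "internal energy" V(x) = ||x - x*||^2 acts as
# a Lyapunov function: it dissipates at a rate equal to the convergence rate.
import numpy as np

A = np.diag([1.0, 4.0])   # SPD Hessian; eigenvalues 1 and 4, so lambda_max = 4
alpha = 0.1               # step size; convergence requires alpha < 2 / lambda_max = 0.5

x = np.array([3.0, -2.0])
energies = []
for _ in range(50):
    energies.append(float(x @ x))   # record V(x_k) = ||x_k||^2
    x = x - alpha * (A @ x)         # one step of the dynamical system

# Dissipation inequality: V(x_{k+1}) <= rho^2 * V(x_k), where the
# contraction factor is rho = max_i |1 - alpha * lambda_i| = 0.9 here.
rho = max(abs(1.0 - alpha * lam) for lam in np.diag(A))
assert all(v1 <= rho**2 * v0 + 1e-12 for v0, v1 in zip(energies, energies[1:]))
```

The assertion at the end checks the per-step dissipation inequality numerically; the same certificate, phrased as a semidefinite constraint, is what lets one verify convergence rates for more elaborate algorithms.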
This was my first time visiting Australia, and my first time in the Southern Hemisphere too! The photo above is a panorama taken from a ferry looking back at Darling Harbour. In the photo, you can see the iconic Sydney Tower Eye, the Sydney Opera House, and the Sydney Harbour Bridge.