Inmar Givoni – Expectation Maximization, Gaussian Mixtures & Belief Propagation, OH MY!

In this episode I’m joined by Inmar Givoni, Autonomy Engineering Manager at Uber ATG, to discuss her work on the paper Min-Max Propagation, which was presented last month at NIPS in Long Beach.

Inmar and I get into a meaty discussion about graphical models, including what they are and how they’re used, some of the challenges they present for both training and inference, and how and where they’re best applied. Then we take an in-depth look at the key ideas behind the Min-Max Propagation paper itself, including its relationship to the broader domain of belief propagation and ideas like affinity propagation, and how all of these can be applied to a use case like the makespan problem. This was a really fun conversation! Enjoy!

Be sure to check out some of the great names that will be at the AI Conference in New York, Apr 29–May 2, where you’ll join leading minds in AI including Peter Norvig, George Church, Olga Russakovsky, Manuela Veloso, and Zoubin Ghahramani. Explore AI’s latest developments, separate what’s hype from what’s really game-changing, and learn how to apply AI in your organization right now. Save 20% on most passes with discount code PCTWIML. Visit twimlai.com/ainy2018 for registration details. Early pricing ends February 2!

The notes for this show can be found at twimlai.com/talk/101.
