Presentation: Modern Distributed Optimization
Abstract
We often want to find the best settings for our systems, whether that means tuning JVM parameters, optimizing user workflows, or choosing the right configuration for a machine learning algorithm. Black-box optimization techniques that can find good (hopefully optimal!) parameters have been studied for the last 60 years, but over the last 20 years significant attention has gone into variants that can take advantage of parallel compute.
In this talk, we’ll cover the kinds of real-world problems being solved with these techniques. We’ll do a deep dive into a few of the most popular ones, such as Distributed Nelder-Mead and Bayesian Optimization, and discuss their trade-offs. You should walk away with an understanding of what’s actually going on inside these black boxes and a good idea of how to start applying them to your problems today.
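As a taste of what the abstract is referring to, here is a minimal sketch (in Python, not taken from the talk) of black-box optimization with the Nelder-Mead simplex method via SciPy. The objective function and parameter names are illustrative stand-ins for an expensive measurement, such as benchmarking a service under a given configuration.

import numpy as np
from scipy.optimize import minimize

def objective(params):
    # Stand-in for an expensive black-box measurement, e.g. the p99 latency
    # of a service run with these (hypothetical) settings.
    heap_gb, threads = params
    return (heap_gb - 4.0) ** 2 + 0.5 * (threads - 16.0) ** 2

result = minimize(objective, x0=[1.0, 4.0], method="Nelder-Mead")
print("best settings:", result.x, "cost:", result.fun)

And a similarly hedged sketch of the core Bayesian Optimization loop: fit a Gaussian-process surrogate to the observations collected so far, then pick the next point to evaluate by maximizing an expected-improvement acquisition function. The surrogate here uses scikit-learn's GaussianProcessRegressor, and the one-dimensional objective is again a made-up stand-in.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def objective(x):
    # Stand-in for an expensive black-box measurement over one parameter in [0, 1].
    return (x - 0.3) ** 2 + 0.1 * np.sin(20 * x)

X = np.array([[0.0], [0.5], [1.0]])          # initial design points
y = np.array([objective(x[0]) for x in X])   # observed costs

for _ in range(20):
    # Fit a Gaussian-process surrogate to everything observed so far.
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    candidates = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
    mu, sigma = gp.predict(candidates, return_std=True)

    # Expected improvement: trade off exploring uncertain regions against
    # exploiting points the surrogate already predicts to be good.
    best = y.min()
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (best - mu) / sigma
        ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    ei[sigma == 0.0] = 0.0

    x_next = candidates[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print("best parameter:", X[np.argmin(y)][0], "cost:", y.min())

The surrogate-plus-acquisition structure is also what makes Bayesian Optimization easy to parallelize: several candidate points can be proposed and evaluated at once before the surrogate is refit.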
Similar Talks
Psychologically Safe Process Evolution in a Flat Structure (Christopher Lucian, Director of Software Development @Hunter_Ind)
Not Sold Yet, GraphQL: A Humble Tale From Skeptic to Enthusiast (Garrett Heinlen, Software Engineer @Netflix)
Let's talk locks! (Kavya Joshi, Software Engineer @Samsara)
PID Loops and the Art of Keeping Systems Stable (Colm MacCárthaigh, Senior Principal Engineer @awscloud)
How Did Things Go Right? Learning More From Incidents (Ryan Kitchens, Site Reliability Engineering @Netflix)
Graceful Degradation as a Feature (Lorne Kligerman, Director of Product @GremlinInc)
A Dive Into Streams @LinkedIn With Brooklin (Celia Kung, Data Infrastructure @LinkedIn)
Liberating Structures @CapitalOne (Greg Myers, Agile Coach, Engineering @CapitalOne)
Making 'npm install' Safe (Software Engineer @agoric)