GTC ON-DEMAND
Deep Learning Hyperparameter Optimization with Competing Objectives via Bayesian Optimization
Abstract:

Bayesian optimization is an efficient way to optimize machine learning model parameters, especially when evaluating different parameter settings is time-consuming or expensive. Models built with deep learning frameworks like MXNet are notoriously expensive to train, even on GPUs, and often have many tunable parameters, including hyperparameters, the architecture, and feature transformations, that can have a large impact on the efficacy of the model. In traditional optimization, a single metric like accuracy is optimized over a potentially large set of configurations with the goal of producing a single best configuration. We'll explore real-world extensions where multiple competing objectives need to be optimized, a portfolio of multiple solutions may be required, constraints on the underlying system make certain configurations unviable, and more. We'll present work from recent ICML and NIPS workshop papers, along with detailed examples, including code, for each extension.
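
For a flavor of what such code might look like, here is a minimal illustrative sketch (not the speakers' implementation): one simple way to handle two competing objectives is to scalarize them into a single blended score and minimize it with an off-the-shelf Gaussian-process optimizer, such as scikit-optimize's gp_minimize. The training function, parameter names, and weights below are hypothetical stand-ins for a real MXNet training job.

# A minimal sketch of scalarized multi-objective Bayesian optimization.
# train_and_evaluate and its tradeoff formula are illustrative only.
import time

from skopt import gp_minimize
from skopt.space import Integer, Real


def train_and_evaluate(learning_rate, num_layers):
    """Hypothetical stand-in for an expensive MXNet training run.

    Returns (validation_error, training_seconds).
    """
    start = time.time()
    # A real implementation would train and validate a model here; this
    # toy formula just mimics a tradeoff: deeper networks fit better but
    # cost more time to train.
    validation_error = 0.5 / num_layers + abs(learning_rate - 0.01)
    training_seconds = (time.time() - start) + 10.0 * num_layers
    return validation_error, training_seconds


def scalarized_objective(params):
    learning_rate, num_layers = params
    error, seconds = train_and_evaluate(learning_rate, num_layers)
    # Weighted scalarization: blend the two competing objectives into a
    # single score for the optimizer to minimize.
    return error + 0.001 * seconds


search_space = [
    Real(1e-4, 1e-1, prior="log-uniform", name="learning_rate"),
    Integer(1, 8, name="num_layers"),
]

result = gp_minimize(scalarized_objective, search_space, n_calls=25,
                     random_state=0)
print("best params:", result.x, "best blended score:", result.fun)

Sweeping the weight on training time traces out different points on the accuracy/cost tradeoff; richer treatments of multimetric, portfolio, and constrained optimization are the subject of the workshop papers covered in the talk.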

 
Topics:
Deep Learning & AI Frameworks, AI Startup
Type:
Talk
Event:
GTC Silicon Valley
Year:
2018
Session ID:
S8136