GTC ON-DEMAND

Abstract:
The average human brain has about 100 billion nerve cells. We therefore investigate whether there are algorithms for artificial neural networks whose cost is linear in the number of neurons, while the number of connections incident to each neuron is bounded by a constant. We offer two approaches to answer this question: First, we derive an algorithm that quantizes a trained artificial neural network so that the resulting complexity is linear. Second, we demonstrate that networks whose connections are determined by uniform sampling can be trained to a precision similar to that of fully connected layers; because they are sparse from the start, these networks can be trained much faster. Both approaches are made plausible by relating artificial neural units to Monte Carlo integration. We'll demonstrate the results on classic test datasets.
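The second approach described in the abstract, connecting each neuron to only a constant number of uniformly sampled inputs, can be sketched in a few lines. This is an illustrative sketch, not code from the talk; the function names and the choice of NumPy are assumptions. The point is that evaluating such a layer costs O(n_out * k) rather than O(n_out * n_in), since each output unit sums over only k sampled connections:

```python
import numpy as np

def sparse_layer(n_in, n_out, k, rng):
    """Create a sparsely connected layer: each of the n_out output
    neurons connects to k inputs chosen uniformly at random, so the
    number of connections per neuron is bounded by the constant k."""
    idx = rng.integers(0, n_in, size=(n_out, k))          # sampled connection indices
    w = rng.standard_normal((n_out, k)) / np.sqrt(k)      # one weight per connection
    return idx, w

def forward(x, idx, w):
    """Evaluate the sparse layer: gather the k sampled inputs for each
    output neuron and take the per-neuron dot product. In the Monte
    Carlo view, this sum over sampled connections plays the role of an
    estimate of the full dense dot product."""
    return np.einsum('ok,ok->o', w, x[idx])

# Example: a layer with 8 inputs, 4 outputs, and only 3 connections per neuron.
rng = np.random.default_rng(0)
idx, w = sparse_layer(8, 4, 3, rng)
y = forward(np.ones(8), idx, w)   # y has shape (4,)
```

Because the sparsity pattern is fixed up front, only the k weights per neuron need to be stored and updated during training, which is where the speed-up claimed in the abstract comes from.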
Topics:
AI Application Deployment and Inference, AI and DL Research
Type:
Talk
Event:
GTC Silicon Valley
Year:
2018
Session ID:
S8780