GTC ON-DEMAND

Abstract:

Modern data science demands interactive exploration and analysis of large volumes of data. Learn how NVIDIA and RAPIDS take advantage of GPU acceleration through libraries such as cuDF, cuIO, and cuStrings. The computational limits of CPUs are being reached. We'll show how RAPIDS uses GPUs to accelerate existing workflows and enable workflows that were previously impossible. We'll cover cuDF's high-level architecture and its use of the GPU, and take a technical dive into cuDF internals such as the cuIO and cuStrings libraries. We'll also share testing and benchmarking results and reveal some of the new features and optimizations we're investigating for the future of RAPIDS and cuDF.
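
The pandas-style cuDF API the abstract alludes to looks roughly like the sketch below (a minimal illustration assuming a RAPIDS install; the data and column names are invented, not from the session):

    # Minimal cuDF sketch (illustrative data, not from the session).
    import cudf

    # Construct a DataFrame directly in GPU memory.
    df = cudf.DataFrame({
        "ticker": ["NVDA", "AMD", "NVDA", "INTC"],
        "price": [150.0, 30.0, 152.5, 48.0],
    })

    # Group-by aggregations execute as CUDA kernels on the device.
    mean_price = df.groupby("ticker").agg({"price": "mean"})

    # String operations (the cuStrings work mentioned above) are exposed
    # through the familiar .str accessor.
    df["ticker_lower"] = df["ticker"].str.lower()

    print(mean_price)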
 
Topics:
Accelerated Data Science, Tools & Libraries
Type:
Talk
Event:
GTC Silicon Valley
Year:
2019
Session ID:
S9793
 
Abstract:
Data science and data mining are the exploration of data to extract novel knowledge and insight. That discovery process often involves a considerable amount of trial and error; after all, if you know what you are looking for, you are not doing discovery. The Python programming language has grown in popularity among data scientists for its flexibility, ease of programming, and readability. However, Python is not known for performance, which historically was not an issue. Today, though, a large amount of science is driven by the exploration of large volumes of data. Combined with the ever-increasing need for more complex algorithms and analytics, this has forced data scientists to turn more and more of their attention away from the problems they're trying to solve and toward implementing their hypotheses in less friendly, "more performant" systems. Luckily, work being done in the GPU Open Analytics Initiative (GOAI) and the new RAPIDS platform is pushing to make GPU-accelerated data science in Python a first-class citizen and to drive performance on par with other languages, including GPU-accelerated C/C++.
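
As a hedged illustration of what "first-class GPU-accelerated data science in Python" means in practice, here is a minimal pandas-to-cuDF sketch (the dataset and column names are invented, not from the talk):

    # Hypothetical pandas workflow moved onto the GPU with cuDF.
    import pandas as pd
    import cudf

    pdf = pd.DataFrame({"user": [1, 2, 1, 3], "clicks": [5, 3, 7, 1]})

    # Copy the host DataFrame into device memory.
    gdf = cudf.from_pandas(pdf)

    # The same pandas-style expression now runs on the GPU.
    top = gdf.groupby("user")["clicks"].sum().sort_values(ascending=False)

    # Bring results back to the CPU when a host-side library needs them.
    print(top.to_pandas())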
 
Topics:
Artificial Intelligence and Deep Learning
Type:
Talk
Event:
GTC Israel
Year:
2018
Session ID:
SIL8136
 
Abstract:
As cybersecurity data volumes grow, even the best-designed SIEMs struggle to perform complex analytics across a large range of data at interactive speeds. We'll discuss how NVIDIA GPU-accelerated its own Splunk instance with technologies from the GPU Open Analytics Initiative (GOAI) to drastically improve cyberhunting. Using tools such as Anaconda, BlazingDB, Graphistry, and MapD, NVIDIA interactively explored billions of events faster than ever to detect threats and perform root-cause analysis. We'll walk through how cyberdefenders can use open-source tools and libraries to accelerate their own Splunk instance, with code samples and how-tos. Finally, we'll discuss how to stay involved in the GPU-accelerated Splunk community.
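
The general pattern described here, moving events out of a SIEM into a GPU dataframe for interactive hunting, might look like the sketch below. Note this uses cuDF (a descendant of GOAI's GPU dataframe) as a stand-in for the talk's exact stack, and the export step, file name, and field names are all hypothetical:

    # Hypothetical hunt over events exported from Splunk (e.g., a CSV
    # export of a search); file and field names are placeholders.
    import cudf

    events = cudf.read_csv("events.csv")  # GPU-accelerated CSV parsing

    # Top talkers: total bytes transferred per source IP, computed on GPU.
    top_talkers = (
        events.groupby("src_ip")["bytes"]
        .sum()
        .sort_values(ascending=False)
        .head(20)
    )
    print(top_talkers.to_pandas())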
 
Topics:
Accelerated Data Science, 5G & Edge, Cyber Security
Type:
Talk
Event:
GTC Silicon Valley
Year:
2018
Session ID:
S8499
Streaming:
Download:
Share:
 
Abstract:

Customers are looking to extend the benefits of big data with the power of the deep learning and accelerated analytics ecosystems. The NVIDIA® DGX-1™ is the platform of AI pioneers, integrating the power of deep learning and accelerated analytics in a single hardware and software system. This session covers lessons learned and successes from real-world customer deployments of accelerated analytics. Learn how customers are leveraging deep learning and accelerated analytics to turn insights into AI-accelerated knowledge. We'll also cover the growing ecosystem of solutions and technologies delivering on this promise.
 
Topics:
Artificial Intelligence and Deep Learning, Intelligent Machines, IoT & Robotics
Type:
Talk
Event:
GTC Washington D.C.
Year:
2016
Session ID:
DCS16186
 
Abstract:

Cybersecurity has a unique, complex data problem; 250 million to 2 billion events daily is common. In addition, data is scattered across numerous protection and detection systems and data silos. Rethinking cybersecurity as a data-centric problem, the Accenture Labs Cyber Security team uses emerging big-data tools, graph databases and analysis, and GPUs to exploit the connected nature of the data. Pairing GPUs with traditional big-data technology created a best-of-breed, evolving system, ASGARD, that allows users to hunt for new, unknown threats and risks at speeds much faster than pure CPU systems. Learn how we're visualizing orders of magnitude more data with Graphistry, a GPU-powered visualization engine, and accelerating complex analytics on GPUs to level the playing field against new cyber threats.
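
A hedged sketch of the Graphistry usage pattern the abstract describes, plotting an event graph from a dataframe of network flows (the registration details, data, and column names are placeholders, not the ASGARD code):

    # Hypothetical Graphistry plot of network flows; server registration
    # details and column names are placeholders.
    import pandas as pd
    import graphistry

    graphistry.register(api=3, username="YOUR_USER", password="YOUR_PASS")

    flows = pd.DataFrame({
        "src_ip": ["10.0.0.1", "10.0.0.2", "10.0.0.1"],
        "dest_ip": ["10.0.0.9", "10.0.0.9", "10.0.0.3"],
    })

    # Bind edge endpoints and upload; plot() returns an interactive,
    # GPU-rendered view of the graph.
    graphistry.bind(source="src_ip", destination="dest_ip").edges(flows).plot()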
 
Topics:
Federal, Intelligent Machines, IoT & Robotics, HPC and AI
Type:
Talk
Event:
GTC Washington D.C.
Year:
2016
Session ID:
DCS16135
 
 