AI is revolutionizing the $10T transportation industry. Every vehicle will be autonomous: cars, trucks, taxis, buses, and shuttles. AI is core to enabling autonomous driving, but it is also being applied to mobility, logistics, connected vehicles, connected factories, customer experience, and a myriad of other automotive use cases. Come learn from experts at Audi, BMW, and VW about how they apply data ingestion, labeling, discovery, and exploration to develop trained AI models, with significant reductions in training time thanks to GPU-accelerated computing infrastructure.
While GPU acceleration has already radically reduced the time it takes to train machine learning models, data scientists and analysts still grapple with deriving insights from these complex models to better inform decision-making. The key: visualizing and interrogating black-box models on a GPU-enabled architecture. Volkswagen and MapD will discuss how interactive visual analytics are helping the automotive brand explore the output of their ML models and interrogate them in real time, for greater accuracy and reduced bias. They'll also examine how applying the GPU Data Frame has enabled them to accelerate data science by minimizing data transfers, making it possible for their complex, multi-platform machine learning workflows to run entirely on GPUs.
In the world of analytics and AI, GPU-accelerated analytics is, for many, equivalent to speeding up training time. The question remains, however: how does one interpret such highly complex black-box models, and how can these models inform decision-making? We'll discuss and present a GPU-based architecture that not only accelerates model training but also uses GPU-based databases and visual analytics to render billions of rows, addressing the challenge of interpreting these black-box models. With the advent of algorithms, databases, and visualization tools all built on a GPU architecture, such a solution has become far more accessible. Interactive visualization of the model, based on partial dependence analysis, is one approach to interpreting these opaque models and is our focus here.
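To make the partial dependence idea concrete, here is a minimal sketch of the classic recipe: sweep one feature over a grid while holding the other features at their observed values, and average the model's predictions at each grid point. All names here (the synthetic data, the gradient-boosted model, the helper function) are illustrative assumptions, not the speakers' actual pipeline, which runs at far larger scale on GPUs.

```python
# Hypothetical sketch of one-dimensional partial dependence for a
# black-box model. Illustrative only; not the talk's actual code.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(500, 3))
# Target depends mainly on feature 0, plus a little noise.
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)

model = GradientBoostingRegressor().fit(X, y)

def partial_dependence_1d(model, X, feature, grid_size=20):
    """Average model prediction as `feature` sweeps a grid,
    with all other features held at their observed values."""
    grid = np.linspace(X[:, feature].min(), X[:, feature].max(), grid_size)
    pd_values = []
    for v in grid:
        X_mod = X.copy()
        X_mod[:, feature] = v          # force the feature to the grid value
        pd_values.append(model.predict(X_mod).mean())
    return grid, np.array(pd_values)

grid, pd_curve = partial_dependence_1d(model, X, feature=0)
# pd_curve should roughly trace sin(x) over the grid for feature 0.
```

The talk's point is that each grid step requires re-scoring the full dataset, so at billions of rows this loop is exactly the kind of workload that benefits from keeping the data and the model on the GPU end to end.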