Pixvana's Creative Director Scott Squires and Product Manager/Evangelist Tywen Kelly will demonstrate the company's cloud-based GPU video-processing system, built to handle many large 360° videos. The talk will also showcase how VRWorks has enabled Pixvana to stitch multiple 8K+ videos, each composed from several cameras, in parallel, and how Pixvana has transformed the editing process by allowing in-headset creation of interactive experiences with branching narratives. The potential for AI and machine learning in VR will also be covered.
We will present NVIDIA's solution for interactive, real-time streaming of VR content (such as games and professional applications) from the cloud to a low-powered client driving a VR/AR headset. We will outline a few of the challenges, describe our design, and share some performance and quality metrics.
We'll provide an overview of new techniques developed by ZeroLight using NVIDIA's Volta and Turing GPUs to enhance real-time 3D visualization in the automotive industry for compelling retail experiences. We'll cover the challenges involved in integrating real-time ray-traced reflections at 60 fps in 4K and how future developments using DXR and NVIDIA RTX will enable improvements to both graphics and performance. We'll also discuss the challenges of achieving state-of-the-art graphical quality in virtual reality. Specifically, we'll explain how the team created a compelling commercial VR experience using the StarVR One and its eye-tracking capabilities for foveated rendering.
Large AEC projects involve complex structural design and validation tools, but they also benefit from high-end visualization and VR for true-to-scale immersion and a real sense of volume. CADMakers was one of the earliest adopters of the Dassault Systèmes 3DEXPERIENCE platform, which combines more than 20 years of CATIA CAD excellence with advanced rendering-material support and native VR immersion, with no need for external tools. This presentation provides a unique inside view of today's and tomorrow's possibilities for decision-making in building design, leveraging integrated virtual reality and visualization experiences that happen directly in the CAD tools of the 3DEXPERIENCE platform. The talk will present some of the latest GPU-intensive 3DEXPERIENCE achievements at CADMakers, including how the platform is used for building-construction simulation and high-end material validation for realistic AEC design review. It will also cover actual VR usage, showing the graphics performance gains obtained with large AEC projects and how VR SLI enables 90 FPS immersion for multi-user, multi-location VR reviews. The talk will be presented with actual AEC datasets from buildings CADMakers designed with 3DEXPERIENCE.
Learn about the plans of market leaders in streaming VR and AR content from the cloud in this panel discussion. From enterprise use cases to streaming VR to the 5G edge, panelists will describe the state of the art and the challenges of making XR truly mobile.
VR is rapidly evolving. HMD resolution and field of view are increasing, VR content is becoming more detailed, and demand for more realistic and more immersive experiences continues to grow. As we march forward in the pursuit of ever-better VR, how will we render fast enough to drive those higher-resolution displays? How will we generate realistic content for enormous virtual worlds? How will we continue to enhance the quality and depth of immersion? In this panel, we'll cover topics such as human perception and neurophysiology, adaptive rendering strategies that focus compute power where it's needed, and deep learning-based synthesis for virtual models and environments. Learn how these components are being integrated to drive the future of VR.
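One adaptive rendering strategy of the kind the panel describes is foveated rendering: spend full shading effort only near the user's gaze, where visual acuity is highest. The sketch below picks a shading rate from a pixel's angular distance to the gaze point; the zone thresholds and rates are illustrative assumptions, not values from any shipping headset or panelist's system.

```python
import math

# Hypothetical foveation zones: (max eccentricity in degrees, shading rate).
# Thresholds here are illustrative only.
FOVEATION_ZONES = [
    (5.0, 1.0),    # fovea: full-rate shading
    (15.0, 0.5),   # near periphery: half rate
    (30.0, 0.25),  # mid periphery: quarter rate
]
PERIPHERY_RATE = 0.125  # beyond all zones: coarsest rate

def shading_rate(pixel_deg, gaze_deg):
    """Pick a shading rate from the pixel's angular distance to the gaze."""
    eccentricity = math.dist(pixel_deg, gaze_deg)
    for max_ecc, rate in FOVEATION_ZONES:
        if eccentricity <= max_ecc:
            return rate
    return PERIPHERY_RATE
```

In a real renderer these rates would map onto hardware variable-rate shading tiles rather than per-pixel decisions, but the eccentricity-to-rate lookup is the core idea.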
Learn how Microsoft is extending WebRTC to enable real-time, interactive 3D streaming from the cloud to any low-powered remote device. The purpose is to provide an open-source toolkit that enables industries to leverage remote cloud rendering in their service and product pipelines. This is required for many industries in which the scale and complexity of 3D models, scenes, physics, and rendering is beyond the capabilities of a mobile device. We are extending the industry-standard WebRTC framework to 3D scenarios such as mixed reality. We'll explain the work we did to realize the goal of delivering high-quality 3D applications to any client — web, mobile, desktop, and embedded. This is made possible by the NVIDIA NVENCODE pipeline for server-side video encoding in the cloud.
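The architecture described above is a render-encode-transport loop on the server, with only decoded video reaching the thin client. The sketch below models those three stages with stub classes; every class and method name here is a hypothetical stand-in, not the Microsoft toolkit's or NVIDIA NVENC's actual API.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    number: int
    pixels: bytes

class StubRenderer:
    """Stands in for the server-side 3D renderer."""
    def __init__(self):
        self.count = 0
    def render(self) -> Frame:
        self.count += 1
        return Frame(self.count, b"\x00" * 16)

class StubEncoder:
    """Stands in for a hardware video encoder session (e.g. NVENC)."""
    def encode(self, frame: Frame) -> bytes:
        # Toy "compression": frame number header plus a slice of pixels.
        return frame.number.to_bytes(4, "big") + frame.pixels[:4]

class StubTransport:
    """Stands in for a WebRTC-style channel to the thin client."""
    def __init__(self):
        self.sent = []
    def send(self, packet: bytes):
        self.sent.append(packet)

def stream(renderer, encoder, transport, n_frames):
    """Render, encode, and ship n_frames to the client."""
    for _ in range(n_frames):
        transport.send(encoder.encode(renderer.render()))
```

The design point the talk highlights is that the expensive stages (rendering and encoding) stay server-side, so the client only needs a video decoder and an input channel back to the server.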
While most industries have experienced growth due to rapid adoption of digital tools, productivity in architecture and construction has dropped since 2005. The industry's toolkit is limited and disjointed, imposing a ceiling on the capabilities of even the most talented designers. The result is lost time, wasted money, and an industry-wide foreclosing of creative possibility that affects the world around us. We'll discuss a suite of tools developed and deployed by our team at SHoP Architects, and describe the pioneering workflows we're using to redefine the future of architecture.
We'll discuss how advancements in GPU technology, real-time ray tracing, and virtual reality technologies furthered the building design process of a global design firm. Our talk will trace the steps we took to move beyond earlier approaches to using digital technology in building design by adopting, planning, and implementing new technologies. We'll cover our firm's infrastructure and cloud technology, and conclude with a full reveal of how we deploy and embrace advanced technologies to enhance designs, enrich internal and client communication, and ultimately deliver buildings that further improve our communities.
We'll examine the potential for spatial computing and machine learning to reintroduce people to the physical potential of their bodies by focusing on Embody, MAP Lab's 2019 Sundance premiere. Inspired by movement traditions such as aikido, yoga, and dance, Embody is a social VR experience that uses visual metaphor and encouragement from teachers and friends to bring about coordinated body movement. We'll explain how this experience, which is piloted entirely by body movement and position, reclaims the body's potential inside the digital landscape. Users prompt each other with conversation, mirroring, and environmental channeling to step together through physical sequences designed to center, balance, extend, and strengthen. We hope players who experience Embody will be reminded of their deep physical potential and remember that the body is a flexible tool and able to change (http://www.sundance.org/projects/embody).