Project Ideas

In-situ visualization using Galaxy in MPM

Abstract

In-situ visualization is a rendering technique for visualizing simulation data in real time without requiring additional storage resources. This project will integrate distributed asynchronous ray-tracing with TACC Galaxy into the CB-Geo MPM code for rendering petascale simulations.
Intensity: Moderate
Priority: High
Involves: Integrating Galaxy with existing rendering systems in CB-Geo MPM; developing rendering of natural hazards, such as landslides, with an in-situ visualization interface.
Mentors:
Benefits of working on this project

Students who work on this project can expect their skill set to grow in:
Distributed asynchronous ray-tracing
Petascale simulations
In-situ visualization
Motivation
Current techniques for visualizing petascale simulations involve post-hoc rendering of a temporal slice of a subset of the data from disk, which means a significant portion of the information is disregarded and potentially lost. A large-scale simulation generates several terabytes of data, which already pushes rendering engines such as Blender, Mantra, and Arnold to their limits. At petascale and exascale, the data volume is several orders of magnitude larger, and the simulation ends up spending most of its supercomputing time on I/O.
Technical Details
In-situ visualization is a rendering technique for visualizing the simulation data in real-time without requiring additional storage resources. Running the visualization and simulation in tandem avoids the bottleneck of data transfer. Furthermore, this approach allows for monitoring and interaction while the simulation is running, enabling scientists to modify simulation parameters and explore the effect on the phenomena in real-time. Ray tracing engines such as TACC Galaxy offer distributed asynchronous in-situ visualization capabilities for petascale simulations.
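To make the coupling pattern concrete, the sketch below shows how an in-situ hook might sit inside an MPM time-stepping loop, with each rank handing its local particle state to the renderer instead of writing it to disk. This is only an illustrative sketch: the Particle, InSituRenderer, publish(), and solve() names are hypothetical placeholders, not the CB-Geo MPM or Galaxy API.

```cpp
// Hypothetical sketch of an in-situ hook inside an MPM time-stepping loop.
// InSituRenderer and its methods are placeholders, not the Galaxy API.
#include <cstddef>
#include <vector>

struct Particle {
  double position[3];
  double velocity[3];
};

class InSituRenderer {
 public:
  // Hand the rank-local particle state to the visualization engine.
  // In a real coupling this would stage the buffer for the renderer
  // (e.g., over sockets or MPI) instead of writing it to disk.
  void publish(const std::vector<Particle>& particles, std::size_t step) {
    (void)particles;
    (void)step;
    // ... forward data to the rendering backend asynchronously ...
  }
};

void solve(std::vector<Particle>& particles, std::size_t nsteps,
           std::size_t viz_interval, InSituRenderer& viz) {
  for (std::size_t step = 0; step < nsteps; ++step) {
    // ... MPM update: map particles to grid, solve, update particles ...
    if (viz_interval > 0 && step % viz_interval == 0) {
      // Visualization runs in tandem with the simulation; no post-hoc I/O.
      viz.publish(particles, step);
    }
  }
}
```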
This project will integrate an in-situ data visualization and query framework into the HPC-scalable CB-Geo MPM code using the TACC Galaxy and Intel OSPRay libraries. Galaxy is a fully asynchronous distributed parallel rendering engine geared towards full global illumination for large-scale visualization. It provides performant distributed rendering using an actor model to render scenes across multiple MPI tasks asynchronously. Galaxy employs asynchronous framebuffer updates and a novel subtractive lighting model to achieve acceptable image quality from the first ray generation onward, and the rendering quality improves continuously throughout the render epoch. Data is communicated transparently across the network using a client/server model over sockets or MPI. The project involves effective communication of distributed data between the simulator and the in-situ visualization engine. Advanced sampling combined with visual culling is essential to render photorealistic MPM simulations at petascale. An a priori adaptive sampling method, based on multiple viewpoints with visual culling of the rendered scenes, will be tested with Monte Carlo sampling to render a billion-particle simulation.
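As a rough illustration of the sampling side, the snippet below draws a uniform Monte Carlo subsample of a rank-local particle set before it is handed to the renderer; the a priori, viewpoint-based adaptive scheme described above would instead weight particles by their estimated visual contribution and cull occluded regions. The monte_carlo_subsample helper is illustrative only and is not part of Galaxy, OSPRay, or CB-Geo MPM.

```cpp
// Minimal sketch: uniform Monte Carlo subsampling of rank-local particles
// before handing them to the renderer. A viewpoint-aware adaptive scheme
// would replace the uniform draw with importance weights derived from the
// camera positions and would cull particles outside all view frusta.
#include <algorithm>
#include <cstddef>
#include <iterator>
#include <random>
#include <vector>

template <typename ParticleT>
std::vector<ParticleT> monte_carlo_subsample(
    const std::vector<ParticleT>& particles, std::size_t target_count,
    unsigned seed) {
  if (particles.size() <= target_count) return particles;
  std::vector<ParticleT> sample;
  sample.reserve(target_count);
  std::mt19937_64 rng(seed);
  // std::sample (C++17) draws target_count particles uniformly without
  // replacement, preserving their original order.
  std::sample(particles.begin(), particles.end(), std::back_inserter(sample),
              target_count, rng);
  return sample;
}
```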
The main goals of this project are to:
Integrate in-situ visualization with TACC Galaxy in CB-Geo MPM.
Perform asynchronous distributed ray-tracing of billions of particles in a petascale simulation of natural hazards, such as the Oso landslide in Washington.
Support context and user-specific visualization, i.e., different users may be interested in different aspects of the data sets.
Develop a modular and adaptive visualization infrastructure that automatically chooses appropriate rendering configurations (lighting, camera position, materials, and textures) and identifies regions of interest for petascale problems.
Benefits to project/community
CB-Geo MPM currently supports VTK output and rendering through Disney’s Partio library, which requires commercial rendering tools such as Houdini or Pixar’s RenderMan. Enabling in-situ visualization in the CB-Geo Material Point Method code will open a new avenue of possibilities for running realistic petascale simulations, which has not been attempted before. In-situ visualization with simultaneous multiple viewports will support context- and user-specific visualization of the simulation results. Lessons learned from rendering at petascale with asynchronous communication and ray-tracing will benefit the wider community and significantly influence policymakers in devising strategies in the event of a natural hazard. The TACC Galaxy and Intel OSPRay in-situ visualization libraries will enable a 100% open-source pipeline from input to final rendering.
Helpful Experience
Have a working knowledge of C++ and Python.
Experience with rendering or visualization tools (ParaView, Houdini, Blender) is encouraged but not required.
Experience with distributed parallel architectures such as MPI is helpful.
Thank you for providing awesome documentation. I have experience in Python and C++, and I am interested in working with you all and receiving guidance from the amazing mentors. I will surely complete the first steps and hope to work under your guidance in GSoC.
Greetings everyone,
I am Rohan Gupta, a third-year CSE undergraduate student at Shri Mata Vaishno Devi University. I have experience working with Python and C++, and I am interested in working on this project. It would be a pleasure to work on it for GSoC 2021.