This repo explores how AR Body Tracking and Unity VFX Graph can be combined to create AR VFX that content creators can use in real time. It uses the Unity ARFoundation and Apple ARKit packages for prototyping.
One of the purposes of AR Body Tracking is to expose 3D skeleton data that can be used to trigger or spawn events based on the position and rotation of each tracked bone, for example firing VFX events when the tracked body hits a specific pose.
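A minimal sketch of that idea, assuming an ARFoundation `ARHumanBodyManager` in the scene and a VFX Graph with an exposed `SpawnPosition` parameter and an `OnPose` event (both hypothetical names); the joint indices and distance threshold below are placeholders, since the actual skeleton layout depends on the ARKit provider:

```csharp
using UnityEngine;
using UnityEngine.VFX;
using UnityEngine.XR.ARFoundation;

// Sketch: listen for tracked bodies and fire a VFX Graph event when two
// joints come close together (a simple "hands together" pose check).
public class PoseTriggeredVfx : MonoBehaviour
{
    [SerializeField] ARHumanBodyManager bodyManager; // assign in the Inspector
    [SerializeField] VisualEffect vfx;               // graph exposing an "OnPose" event

    // Placeholder joint indices; real values depend on the provider's skeleton.
    const int LeftHandIndex = 64;
    const int RightHandIndex = 23;
    const float TriggerDistance = 0.15f; // meters

    void OnEnable()  => bodyManager.humanBodiesChanged += OnBodiesChanged;
    void OnDisable() => bodyManager.humanBodiesChanged -= OnBodiesChanged;

    void OnBodiesChanged(ARHumanBodiesChangedEventArgs args)
    {
        foreach (var body in args.updated)
        {
            var joints = body.joints;
            if (!joints.IsCreated || joints.Length <= LeftHandIndex) continue;

            // Joint poses are relative to the body anchor; convert to world space.
            Vector3 left  = body.transform.TransformPoint(joints[LeftHandIndex].anchorPose.position);
            Vector3 right = body.transform.TransformPoint(joints[RightHandIndex].anchorPose.position);

            if (Vector3.Distance(left, right) < TriggerDistance)
            {
                // Pass the spawn position to the graph, then fire the event.
                vfx.SetVector3("SpawnPosition", (left + right) * 0.5f);
                vfx.SendEvent("OnPose");
            }
        }
    }
}
```

The same pattern extends to rotation-based checks (e.g. comparing a joint's `anchorPose.rotation` against a target orientation) or to driving VFX Graph parameters continuously every frame instead of firing one-shot events.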
In immersive media, a major application of this is short-form video on platforms like TikTok, where dance and music feature prominently for a large share of the user base.
I believe AR Body Tracking has the potential to reshape how people create video.
Imagine Adobe After Effects reimagined as a lightweight, real-time editor for human bodies that fits in the palm of your hand. In the video e-commerce economy, building these lightweight VFX tools is critical to where the future of work and play is heading.