Unreal Engine 5 powers VR for wingsuit flying simulator

Intrepid individuals throughout history have attempted to conquer the skies using everything from pedal-powered planes to rocket-powered jet packs.

Arguably the closest a human has ever come to non-mechanically aided flight, however, is wingsuit BASE jumping – also known as wingsuit flying.

This highly dangerous and technically difficult sport involves BASE jumping from a high point while dressed in a webbing-sleeved jumpsuit that enables the wearer to glide rather than free fall.

Requiring years of skydiving and BASE jumping experience – and with a fatality rate of one in 500 jumps – wingsuit BASE jumping is a pursuit that has, until now, been beyond the reach of 99.9% of the population.

JUMP is the world’s first hyperreal wingsuit simulator, combining a real wingsuit, a virtual reality helmet and a mix of suspension, wind effects and hyperreal multi-sensory stimulation.

It is the brainchild of chief executive and founder James Jensen, who was part of the team that set up The VOID, one of the first free-roam VR experience companies.

Jensen assembled a team, and between 2019 and 2021, they built a prototype simulator. That led to a working facility in Bluffdale, Utah, which has now been operating for more than four months and has flown more than 5,000 people. “I’ve never skydived or BASE jumped,” says Jensen. “I rely on my professional athletes to tell me this is real – they’ve said it’s about 85% there. We’re pushing for 100%.”

JUMP takes the flyer into hyper-detailed 3D landscapes of some of the world’s most breathtaking BASE jumps, including Notch Peak in the US. To achieve this, the JUMP team flew a helicopter kitted out with top-of-the-range cameras, spending two days capturing thousands of ultra-high-resolution images of the landscape below.

The images were processed using the latest version of RealityCapture, a photogrammetry tool that creates ultra-realistic 3D models from sets of images and/or laser scans.

Reconstructing the 58,000 captured images required five supercomputers. The team also used precise information from gyroscopes and other sensors to create a high-precision custom flight log. The result was an incredibly detailed digital model of the environment, comprising more than eight billion polygons across 10 square miles.

The next step was to bring the huge dataset into Unreal Engine 5. “It took some support from the RealityCapture team, but in the end, we developed some new tools that helped chop up these massive data sets and assign the appropriate textures and materials,” says Jensen.
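Neither Epic nor the JUMP team has published those tools, but the core of “chopping up” a scan this size is spatial partitioning: bin the triangles into ground tiles small enough for the engine to stream, then assign textures and materials per tile. A minimal standalone C++ sketch of that binning step – the Vec3/Tri types and the tile size are illustrative assumptions, not JUMP’s code:

```cpp
#include <array>
#include <cmath>
#include <cstdint>
#include <map>
#include <utility>
#include <vector>

// Illustrative tiling pass, not JUMP's tools: bin every scanned triangle
// into a fixed-size ground tile by centroid, so each tile can be exported
// and textured as a separate engine asset.
struct Vec3 { double x, y, z; };
struct Tri  { std::array<Vec3, 3> v; };

using TileKey = std::pair<std::int64_t, std::int64_t>;  // tile grid coordinates

std::map<TileKey, std::vector<Tri>> ChopIntoTiles(const std::vector<Tri>& mesh,
                                                  double tileSizeMeters)
{
    std::map<TileKey, std::vector<Tri>> tiles;
    for (const Tri& t : mesh) {
        // Centroid in the XY ground plane decides which tile owns the triangle.
        const double cx = (t.v[0].x + t.v[1].x + t.v[2].x) / 3.0;
        const double cy = (t.v[0].y + t.v[1].y + t.v[2].y) / 3.0;
        const TileKey key{
            static_cast<std::int64_t>(std::floor(cx / tileSizeMeters)),
            static_cast<std::int64_t>(std::floor(cy / tileSizeMeters))};
        tiles[key].push_back(t);
    }
    return tiles;
}
```

Each tile can then be exported as a separate asset with its own materials, which is what makes per-part texturing of a multi-billion-polygon scan tractable.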

The team leveraged Nanite, Unreal Engine 5’s virtualised micro-polygon geometry system, to handle the import and replication of the multimillion-polygon mesh while maintaining a real-time frame rate without any noticeable loss of fidelity.

For the lighting and shadows, the team harnessed the power of Lumen, a fully dynamic global illumination system in Unreal Engine 5 that enables indirect lighting to adapt on the fly to changes to direct lighting or geometry.

“Because we are looking for total photorealism, we are leaning heavily into Nanite and Lumen to make our scenes come to life,” says Jensen. “We currently have the largest dataset in Nanite at eight billion polygons – more than 700 parts and 16k textures per part.”
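JUMP’s project settings are not public, but in a stock Unreal Engine 5 project both features can be switched on in a few lines. A minimal editor-side C++ sketch, assuming a loaded UStaticMesh asset and a post-process volume already placed in the level:

```cpp
// Editor-side sketch, not JUMP's actual code: opt a mesh asset into Nanite
// and select Lumen for global illumination and reflections.
#include "Engine/StaticMesh.h"
#include "Engine/PostProcessVolume.h"

void EnableNaniteAndLumen(UStaticMesh* Mesh, APostProcessVolume* Volume)
{
    // Nanite is a per-asset opt-in; the virtualised micro-polygon
    // representation is generated when the mesh is rebuilt.
    Mesh->NaniteSettings.bEnabled = true;
    Mesh->Build();

    // Lumen: fully dynamic GI and reflections, overriding project defaults.
    Volume->Settings.bOverride_DynamicGlobalIlluminationMethod = true;
    Volume->Settings.DynamicGlobalIlluminationMethod =
        EDynamicGlobalIlluminationMethod::Lumen;
    Volume->Settings.bOverride_ReflectionMethod = true;
    Volume->Settings.ReflectionMethod = EReflectionMethod::Lumen;
    Volume->bUnbound = true;  // apply to the whole level
}
```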

Jensen explains that features such as these are the reason JUMP used Unreal Engine to create the experience. “Unreal Engine is just flat-out leading the industry in high-resolution real-time simulations,” he says.

“Seeing the things that I used to do in video production that would take days, even weeks, and months to render now all happen in real time is unbelievable. Polygon count has always been a bottleneck, and global illumination with Lumen – it’s mind-blowing to see in real time.”

The JUMP team filled out the virtual environment with shrubs, trees, grass and other objects from Quixel Megascans, a library of photorealistic 3D-scanned tileable surfaces, textures, vegetation and other high-fidelity CG assets that is included with Unreal Engine 5.

They also developed their own physics engine, FLIGHT, which handles all of the configurations and physics for both the physical and digital worlds. Blender and Maya were used for the 3D art.
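FLIGHT itself is proprietary, but the heart of any such engine is making the digital glide obey believable aerodynamics. As an illustration only – textbook lift and drag equations with assumed coefficients, not FLIGHT’s code – here is a standalone C++ sketch of a two-dimensional wingsuit glide from a 1,200m exit:

```cpp
#include <cmath>
#include <cstdio>

// Illustrative glide model, not FLIGHT's code: lift/drag equations integrated
// with explicit Euler steps in a 2D plane (x = forward, z = up). All constants
// are assumed, order-of-magnitude wingsuit values.
int main()
{
    const double mass = 90.0;      // pilot + suit, kg (assumed)
    const double g    = 9.81;      // gravity, m/s^2
    const double rho  = 1.0;       // air density at altitude, kg/m^3 (assumed)
    const double area = 1.5;       // wingsuit reference area, m^2 (assumed)
    const double cl   = 0.7;       // lift coefficient (assumed)
    const double cd   = 0.25;      // drag coefficient (assumed)
    const double dt   = 0.01;      // integration step, s

    double vx = 0.0, vz = 0.0;     // velocity, m/s
    double x  = 0.0, z  = 1200.0;  // start atop a 1,200 m drop
    double t  = 0.0;

    while (z > 0.0) {
        const double v = std::hypot(vx, vz);
        const double q = 0.5 * rho * v * v * area;   // dynamic pressure * area
        const double dx = (v > 0.0) ? vx / v : 0.0;  // unit velocity direction
        const double dz = (v > 0.0) ? vz / v : -1.0;
        // Drag opposes motion; lift is perpendicular to it, tilted "up".
        const double ax = (q / mass) * (-cd * dx - cl * dz);
        const double az = (q / mass) * (-cd * dz + cl * dx) - g;
        vx += ax * dt;  vz += az * dt;
        x  += vx * dt;  z  += vz * dt;
        t  += dt;
    }
    std::printf("flew %.0f m forward in %.1f s (glide ratio %.2f)\n",
                x, t, x / 1200.0);
    return 0;
}
```

With these assumed coefficients the model settles at a glide ratio of cl/cd = 2.8, roughly the range achieved by real wingsuits.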

The result is an awe-inspiring virtual world realistic enough to trick flyers into believing they are standing on the precipice of a 1,200m drop.

But to create the fully immersive sensation of real flight, what is seen in the VR headset needs to be combined with a real wingsuit, suspension system, wind effects and multi-sensory stimulation.

“Physical effects are essential in being able to mimic reality,” says Jensen. “When you can synchronise physical sensation with visuals and audio, you go to a whole other dimension in virtual reality simulations.”

The simulation’s haptics are triggered by events in the virtual environment. “We’ve written custom code inside Unreal Engine specifically for moments inside of the wingsuit BASE jumping experience that initiates signals for scent, wind speed, haptic stage effects, sound effects and physical objects,” explains Jensen.
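Jensen does not elaborate on that code, but the pattern he describes is an event dispatcher: a gameplay moment fires once and fans out to every physical rig that should respond. A minimal standalone C++ sketch, with print statements standing in for hypothetical hardware controllers:

```cpp
#include <cstdio>
#include <functional>
#include <map>
#include <vector>

// Illustrative dispatcher, not JUMP's code: gameplay moments raise an effect
// on a channel, and every bound handler (fan, scent emitter, haptic stage,
// speaker) reacts. Here the handlers just print.
enum class Effect { Wind, Scent, Haptics, Sound };

class EffectDispatcher {
public:
    // Register a hardware handler for one effect channel.
    void Bind(Effect channel, std::function<void(float)> handler) {
        handlers_[channel].push_back(std::move(handler));
    }
    // Fire every handler on a channel with an intensity in [0, 1].
    void Trigger(Effect channel, float intensity) {
        for (auto& h : handlers_[channel]) h(intensity);
    }
private:
    std::map<Effect, std::vector<std::function<void(float)>>> handlers_;
};

int main() {
    EffectDispatcher fx;
    fx.Bind(Effect::Wind,    [](float i) { std::printf("fan duty %.0f%%\n", i * 100.0f); });
    fx.Bind(Effect::Scent,   [](float)   { std::printf("release mountain-air scent\n"); });
    fx.Bind(Effect::Haptics, [](float i) { std::printf("stage rumble %.2f\n", i); });

    // The moment the flyer leaves the cliff edge:
    fx.Trigger(Effect::Haptics, 0.8f);
    fx.Trigger(Effect::Wind, 0.3f);
    fx.Trigger(Effect::Scent, 1.0f);
    return 0;
}
```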

Details that provide a true sense of presence include filling the wingsuit with compressed air. Once a flyer jumps off the cliff, their wingsuit inflates within a few seconds, while a fan blows wind at an ever-increasing pace to add to the realism of the experience.
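One simple way to realise that ramp is to map the simulated airspeed directly onto fan output and gate the compressed-air valve on time since exit. The numbers below are illustrative assumptions, not JUMP’s calibration:

```cpp
#include <algorithm>

// Illustrative mappings, not JUMP's calibration code.

// Scale simulated airspeed into a fan duty cycle in [0, 1] so the apparent
// wind tracks the flyer's virtual speed, saturating at the rig's maximum.
float FanDutyFromAirspeed(float airspeedMs)
{
    const float maxSimSpeed = 55.0f;  // ~wingsuit cruise speed, m/s (assumed)
    return std::clamp(airspeedMs / maxSimSpeed, 0.0f, 1.0f);
}

// Open the compressed-air valve only for the first seconds after exit,
// matching the few-second suit inflation described above.
bool SuitValveOpen(float secondsSinceExit)
{
    const float inflateSeconds = 2.0f;  // assumed
    return secondsSinceExit >= 0.0f && secondsSinceExit < inflateSeconds;
}
```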

For now, JUMP is a location-based experience, but Jensen alludes to a future in which a version of the system could be operating in homes around the world.

“The JUMP simulator and technology are the foundation for true full mobility inside any metaverse,” he says. “Through a few years of location-based entertainment, we will inevitably derive a perfect virtual reality mobility product for at-home use.”
