‘Pathways’ Fast-Forwards Tracking Refinement for Hybrid Movie VFX
Learn how Lightcraft and ETC used iPhone tracking to fast-forward tracking refinement in movie VFX. Achieve studio-grade precision on an agile budget.
Written By: Jesse Radonski
In the world of high-end 3D VFX, there’s a ghost that haunts every production: the dreaded “wiggle.” If you’re unfamiliar, it’s a slight, nauseating jitter where a digital object doesn’t quite stick to the real-world object in the frame. For years, exorcising this ghost required the deep pockets of a major studio and an army of matchmove artists.

But at the Entertainment Technology Center (ETC) at USC, a new project titled Pathways is proving that the goalposts have moved. By combining virtual production, generative AI, and traditional VFX into a single, streamlined workflow, the Pathways team is achieving studio-grade tracking refinement precision with something most of us have in our pockets — an iPhone.

The Challenge: Beyond the LED Wall
Directed by Felipe Vargas, Pathways is a short film that merges live-action performance with AI-assisted cinematography and post-production. The story follows Isa in the wake of her grandfather’s death as she connects with her cultural roots over time, altering her physical reality by transporting to different AI-generated historical locations, including Machu Picchu. Renowned cinematographer Roberto Schaefer (Quantum of Solace, Finding Neverland) worked with the team to design a visual language that could later be enhanced using AI.

While machine learning has been a VFX staple for years, from character creation to rotoscoping, generative AI environment workflows still feel a bit like the Wild West. To make the team more cohesive, executive producer Tom Thudiyanplacakal worked with MovieLabs to define a new ontology, or common language, so that the director, DP, and AI artists spoke the same filmmaking language in their respective workflows. But while an ontology solves the human side of the equation, the technical side of generative AI can be unforgiving. You can have a perfect creative consensus on how a 1950s Peruvian street should look, but if that street doesn’t stay anchored to the actors’ feet, the illusion shatters.

The One-Pixel Barrier
The margin for error vanishes the moment a physical object, like a rock formation, must touch a virtual one. If your camera tracking has more than a single pixel of reprojection error, the CG elements will appear to float or slide. Traditionally, fixing this is a massive bottleneck: professional matchmoving can easily cost $300 per shot and eat up hours of manual labor. Most on-set tracking tools, while great for visualization, simply don’t provide the data needed at these critical joining points.
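To make that one-pixel threshold concrete, here is a minimal sketch of how tracking software scores a solve: project a solved 3D point through the solved camera and measure the pixel distance to where the 2D tracker actually saw it. This is a generic pinhole-camera illustration with hypothetical numbers, not the Pathways pipeline or SynthEyes code.

```python
import math

def project(p, f, cx, cy):
    # Pinhole projection, camera at origin looking down +Z:
    # u = f * X/Z + cx, v = f * Y/Z + cy (all in pixels).
    x, y, z = p
    return (f * x / z + cx, f * y / z + cy)

def reprojection_error(solved_3d, observed_2d, f, cx, cy):
    # Pixel distance between where the solve says the point lands
    # and where the 2D tracker actually found it on the plate.
    u, v = project(solved_3d, f, cx, cy)
    ou, ov = observed_2d
    return math.hypot(u - ou, v - ov)

# Hypothetical numbers: a point 3 m from camera on a 4K plate.
err = reprojection_error(
    solved_3d=(0.50, -0.20, 3.00),    # metres, camera space
    observed_2d=(2406.8, 885.1),      # pixels, from the 2D tracker
    f=2916.0, cx=1920.0, cy=1080.0,   # example intrinsics
)
print(f"{err:.2f} px")  # → "0.94 px", under the one-pixel bar
```

A solver reports this error averaged over every tracker on every frame; anything much above one pixel on a contact point shows up on screen as the wiggle.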
The iPhone’s Triple-Threat Workflow
Rather than relying on a single, expensive tracking sensor, the team used three distinct data streams already built into the iPhone:

- ARKit Data: Provides the initial rough 3D spatial orientation.
- LiDAR Geometry: Generates a 3D mesh of the physical set in real time.
- Lens Calibration: Maps the unique optical characteristics of the glass.
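The first two streams combine through one standard operation: ARKit’s camera pose takes a LiDAR mesh vertex from world space into camera space, where the lens calibration can then project it onto the plate. A minimal sketch of that pose inversion, with a hypothetical pose and vertex (not Autoshot’s actual code):

```python
def mat_vec(m, v):
    # Multiply a 3x3 matrix (tuple of rows) by a 3-vector.
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

def transpose(m):
    return tuple(tuple(m[c][r] for c in range(3)) for r in range(3))

def world_to_camera(p_world, rot, t):
    # ARKit-style poses are camera-to-world (rot, t); inverting a rigid
    # transform is p_cam = R^T @ (p_world - t), since R^-1 == R^T.
    d = tuple(p_world[i] - t[i] for i in range(3))
    return mat_vec(transpose(rot), d)

# Hypothetical pose: camera at (0, 1.5, 2) m, yawed 90° about Y,
# so its forward (+Z) axis points along world +X.
rot = ((0.0, 0.0, 1.0),
       (0.0, 1.0, 0.0),
       (-1.0, 0.0, 0.0))
t = (0.0, 1.5, 2.0)
vertex = (3.0, 1.5, 2.0)  # a LiDAR mesh vertex, world space
print(world_to_camera(vertex, rot, t))  # → (0.0, 0.0, 3.0): 3 m ahead
```

If the pose, mesh, and calibration were all perfect, every projected vertex would land exactly on its pixel. They never quite do, which is where refinement comes in.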
When synchronized via a dual-iPhone rig (one for the plate, one for tracking), these unlock a professional-grade solution. The bridge in this process is Lightcraft Technology’s Autoshot. Instead of a technician spending hours manually aligning points, Autoshot automates the handoff, generating custom SynthEyes scripts that pre-load all three data streams and give the tracking software a head start that previously didn’t exist. If you want to see this in action, here’s a video detailing how to fix a misaligned shot.

Case Study: The Citadel
In one of the film’s sequences, Isa precariously peeks over the edge of a stone shelf in Machu Picchu, taking in the awe-inspiring view of the Andes mountain range. Rather than film on location, the team wanted to take advantage of an AI-generated environment, both in the background and in some of the foreground. What you’ll see in the Tracking Refinement in Pathways video is that the actress playing Isa and the stones she’s lying on are the only real things in the shot.

Using a script automatically generated by Autoshot, the team imported the entire on-set package into SynthEyes. This is more than a video file; it’s a bundle containing the frame sequence, the iPhone’s lens calibration, and Jetset’s tracking data.

When you first hit play, you’ll see a ghost-like LiDAR scan overlaying the footage. It’s remarkably close, usually within a few centimeters, but in the world of high-end VFX, close is where the “wiggle” lives. If you zoom in on a foreground object, such as a jagged rock, you’ll notice the digital scan doesn’t perfectly stick to the live-action texture. That’s the team’s starting line.

To fix this, they needed to bridge the gap between the iPhone’s sensors and the actual pixels on the screen. Starting in the background, the team can use a magic wand tool to place several trackers on high-contrast markers in the distance. By tracking these points forward and locking them, the software gains a stable foundation. It basically tells the computer, “This is exactly where the room is. Now, help me find the camera.”

The most critical part of the process is the contact zone, the area where the CG will actually touch the live-action world. In Pathways, this meant the front surface of a rock.
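Why does where you place and weight trackers matter so much? A refinement solve minimizes reprojection error across all trackers at once, so points given more weight pull the camera solution toward themselves. A toy one-dimensional caricature with hypothetical residual values (real solvers adjust a full camera pose per frame, not a single scalar):

```python
def weighted_solve(residuals, weights):
    # For a single scalar camera correction x, the minimizer of
    # sum(w_i * (r_i - x)^2) is simply the weighted mean of r_i.
    return sum(w * r for w, r in zip(weights, residuals)) / sum(weights)

# Hypothetical per-tracker residuals (px): three background trackers,
# then two contact-zone trackers on the foreground rock face.
residuals = [0.4, -0.3, 0.2, 1.8, 2.1]
uniform    = weighted_solve(residuals, [1, 1, 1, 1, 1])
contact    = weighted_solve(residuals, [1, 1, 1, 10, 10])
print(f"uniform: {uniform:.2f} px, contact-weighted: {contact:.2f} px")
# → "uniform: 0.84 px, contact-weighted: 1.71 px"
```

With uniform weights the correction splits the difference everywhere; up-weighting the rock-face trackers drives the correction toward them, so the contact zone ends up pinned even if distant points absorb a little more residual, which the eye forgives.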
The team manually placed trackers on high-contrast features directly on the foreground surface, and by prioritizing these points during the refinement solve, they forced the virtual camera to align perfectly with the physical geometry. What was once a rough 3D scan becomes a locked, sub-pixel-accurate environment. By combining the broad spatial awareness of the iPhone’s LiDAR with the pixel-level precision of SynthEyes, the team can eliminate the wiggle entirely. The result is a shot where digital elements don’t just sit on top of the footage — they belong to it.

And the reward? A workflow that typically takes hours per shot is reduced to minutes.

The Future is Agile
Is it too bold to say a small team with iPhones can match a major VFX house? Perhaps. However, Pathways demonstrates that the barrier to entry for high-end cinematic integration is a lot lower than it once was.

As generative AI technology continues to improve, the need for precise spatial anchors, like those provided by Lightcraft Jetset, will become a critical part of the filmmaking pipeline.

Want to learn more about tracking refinement with Jetset and SynthEyes?