Disney’s upcoming animated film Big Hero 6 was rendered using Hyperion, software Disney built from the ground up to handle the film’s impressive lighting. Hyperion represents Disney’s biggest and riskiest commitment to R&D in animation technology so far, and at times its completion was in doubt, something Disney’s Chief Technology Officer Andy Hendrickson underscores when he says, “It’s the analog to building a car while you’re driving it.”
According to reports, it took a team of 10 people 10 years to create, with a plan B being developed in case project ‘Hyperion’ didn’t pan out. Engadget reports:
As Hendrickson explains, it handles incredibly complex calculations to account for how “light gets from its source to the camera as it’s bouncing and picking up colors and illuminating other things.” This software allowed animators to eschew the incredibly time-consuming manual effort to animate single-bounce, indirect lighting in favor of 10 to 20 bounces simulated by the software. It’s responsible for environmental effects — stuff most audiences might take for granted, like when they see Baymax, the soft, vinyl robot featured in the film, illuminated from behind. That seemingly mundane lighting trick is no small feat; it required the use of a 55,000-core supercomputer spread across four geographic locations.
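Hyperion’s internals are proprietary, but the idea behind multi-bounce lighting described above can be sketched in a few lines. The toy model below (not Disney’s code; the function and its parameters are invented for illustration) shows how light bouncing 10 to 20 times accumulates far more energy, and therefore more realistic indirect illumination, than a single bounce:

```python
# Toy model: light picked up along a path that bounces off surfaces.
# Each bounce attenuates the ray's energy by the surface reflectivity
# (albedo) and collects light emitted at that bounce.

def gathered_light(albedo, emitted, max_bounces):
    """Total light reaching the camera after up to max_bounces hits."""
    total = 0.0
    throughput = 1.0
    for _ in range(max_bounces):
        throughput *= albedo           # energy tinted/lost at each hit
        total += throughput * emitted  # contribution from this bounce
    return total

one_bounce = gathered_light(0.8, 1.0, 1)     # manual single-bounce look
many_bounces = gathered_light(0.8, 1.0, 20)  # Hyperion-style 20 bounces
```

With a reflective surface (albedo 0.8), twenty bounces gather several times the light of one, which is roughly why a backlit, translucent character like Baymax needs the multi-bounce simulation.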
“This movie’s so complex that humans couldn’t actually handle the complexity. We have to come up with automated systems,” says Hendrickson. To manage that cluster and the 400,000-plus computations it processes per day (roughly about 1.1 million computational hours), his team created software called Coda, which treats the four render farms like a single supercomputer. If one or more of those thousands of jobs fails, Coda alerts the appropriate staffers via an iPhone app.
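Coda itself is proprietary, but the behavior described, pooling four farms into one logical supercomputer, dispatching jobs, and flagging failures for staff, can be sketched as follows. Every class and name here is hypothetical, invented purely to illustrate the concept:

```python
# Hypothetical sketch of a Coda-like coordinator. Farm locations, core
# counts, and all identifiers are invented for illustration only.
from dataclasses import dataclass, field

@dataclass
class Farm:
    name: str
    free_cores: int

@dataclass
class Cluster:
    farms: list
    failed: list = field(default_factory=list)

    def total_free_cores(self):
        # The four farms present themselves as a single supercomputer.
        return sum(f.free_cores for f in self.farms)

    def dispatch(self, job_id, cores_needed):
        # Place the job on the first farm with enough idle cores.
        for farm in self.farms:
            if farm.free_cores >= cores_needed:
                farm.free_cores -= cores_needed
                return farm.name
        return None  # no single farm has capacity

    def report_failure(self, job_id):
        # Stand-in for pushing an alert to staffers' phones.
        self.failed.append(job_id)

cluster = Cluster([Farm("site_a", 20000), Farm("site_b", 15000),
                   Farm("site_c", 10000), Farm("site_d", 10000)])
placed_on = cluster.dispatch("render_shot_042", 12000)
cluster.report_failure("render_shot_099")
```

The key design point is the single front end: animators submit to one queue and never need to know which of the four geographic sites actually runs their frames.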
If that doesn’t drive the power of Disney’s proprietary renderer home, then consider this: San Fransokyo contains around 83,000 buildings, 260,000 trees, 215,000 streetlights and 100,000 vehicles (plus thousands of crowd extras generated by a tool called Denizen). What’s more, all of the detail you see in the city is actually based on assessor data for lots and street layouts from the real San Francisco. As Visual Effects Supervisor Kyle Odermatt explains, animating a city that lively and massive simply would not have been possible with previous technology. “You couldn’t zoom all the way out [for a] wide shot down to just a single street level the way we’re able to,” he says.
Sadly, such an amazing feat will be lost on the casual moviegoer, but it’s still impressive to see the amount of work that went into the film.