Tuesday 2 February 2010

The computing and data centre behind the making of Avatar

A palm-swept suburb of Wellington, New Zealand, is not the first place you'd look for one of the most powerful purpose-built data centers in the world. Yet Miramar, pop. 8,334, is home to precisely that, along with a huge campus of studios, production facilities and soundstages.
The compound is a project that began 15 years ago, inspired by filmmakers Peter Jackson, Richard Taylor and Jamie Selkirk. The studios have since been the main location for creating The Lord of the Rings movies, King Kong, and several others.

Right in the middle sits Weta Digital, the increasingly famous visual effects production house behind high-end commercials and blockbuster movies, most recently the $230 million James Cameron extravaganza AVATAR.
Despite the locale, Weta has drawn plenty of attention. Five Academy Award nominations and four Oscars will do that, but publicist Judy Alley says nothing has matched the buzz of AVATAR. “We’ve done more than 100 interviews in the last few months,” Alley says. With most of the attention focused on the movie’s immersive look, Alley was glad someone was interested in looking at the technology installation that sits within Weta, and she kindly connected us to two of the people who make it run.
As they explained, what makes Weta and a project like AVATAR work is in equal parts the computing power of the data center that creates the visual effects, and the data management of artistic processes that drive the computing.
Hot Gear
Weta Digital is really a visual effects job shop that manages thousands of work orders involving intense amounts of data. That workload dictates most of the equipment required: fast machines running at constant capacity. The data center used to process the effects for AVATAR is Weta’s 10,000 square foot facility, rebuilt and stocked with HP BL2x220c blades in the summer of 2008.
The computing core - 34 racks, each with four chassis of 32 machines - adds up to some 40,000 processors and 104 terabytes of RAM. The blades read and write against 3 petabytes of fast fiber channel, network-attached disk storage from BlueArc and NetApp.
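A rough sanity check of those numbers (a sketch only; the per-node core and memory figures below are assumptions, since the article quotes just the rack-level totals):

```python
# Back-of-the-envelope arithmetic for the render wall described above.
# Assumption: each "machine" is a blade node with two quad-core CPUs (8 cores)
# and about 24 GB of RAM -- the article itself only gives the overall totals.
racks = 34
chassis_per_rack = 4
machines_per_chassis = 32

machines = racks * chassis_per_rack * machines_per_chassis  # 4,352 nodes
cores = machines * 8            # ~34,800 cores, in the ballpark of "some 40,000 processors"
ram_tb = machines * 24 / 1000   # ~104 TB of RAM, matching the quoted figure

print(f"{machines} machines, about {cores:,} cores, about {ram_tb:.0f} TB of RAM")
```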
All the gear sits tightly packed and connected by multiple 10-gigabit network links. “We need to stack the gear closely to get the bandwidth we need for our visual effects, and, because the data flows are so great, the storage has to be local,” says Paul Gunn, Weta’s data center systems administrator.
That ruled out colocation or cloud infrastructure, leaving Gunn as a sort of owner-operator responsible for keeping the gear running. It also required some extra engineering for the hardware because the industry standard of raised floors and forced-air cooling could not keep up with the constant heat coming off the machines churning out a project like AVATAR.
Heat exchange for an installation like Weta’s has to be done with enclosed, water-cooled racks, where hot air is drawn through a radiator and cycled back to the front of the machines. “Plus,” Gunn says, “we run the machines a bit warm, which modern gear doesn’t mind, and the room itself is fairly cool.”

With building costs absorbed, water cooling becomes much less expensive than air conditioning, and the engineering in the data center allows fine tuning. “I don’t want to give you an exact figure,” says Gunn, “but we’re talking tens of thousands of dollars saved by changing the temperature by a degree.”
Because of passive heat exchangers and the local climate, Weta pays no more than the cost of running water pumps to get rid of heat for all but a couple of months a year. Just weeks ago, Weta won an energy excellence award for a smaller-footprint build with cooling costs 40 percent lower than is typical for a data center of its type.
Throughput Revisited
The other half of Weta Digital’s processing story comes from the intense visual effect activities that heat up the data center.
Weta is actually two companies: Weta Workshop, where a crew of artists and craftsmen create physical models, and Weta Digital, which creates digital effects for commercials, short films and blockbuster movies.
"If it's something that you can hold in your hand, it comes from Weta Workshop," says Gunn, "whereas if it's something that doesn't exist, we'll make it."
In the visual effects process, a mix of inputs comes from storyboards, director revisions and tweaking by internal and external digital artists, who turn a director’s concept into an image via 3D software such as Autodesk’s Maya or Pixar’s RenderMan. Artists work through concepts and iterate versions to get movement and lighting just right. It’s nothing the movie industry hasn’t done all along, says Gunn; only now the tools are different and more data intensive.
The main activity in a visual effects data center is called rendering, the process of turning the digital description of an image into an actual image that can be saved to disk and eventually written to film or another medium.
The banks of computers are called render walls, and Joe Wilkie serves as Weta’s “wrangler manager,” the person who oversees the data flow and feeds jobs through the pipeline.
“Wrangler” is a traditional but still common film industry term that first referred to the people who herded the horses and other livestock in Western movies. Likewise, Wilkie says he’s most often called a “render wrangler,” in this case someone who rounds up digital files rather than cattle. “Each part of a movie is an individual item, and it all has to be put together,” he says. “So when an artist is working on a shot, they will hit a button that launches a job on the render wall and loads it into our queueing system.”
The queueing system is a Pixar product called Alfred, which creates a hierarchical job structure or tree of multiple tasks that have to run in a certain order. In any single job, there might be thousands of interdependent tasks. As soon as CPUs on the render wall are freed up, new tasks are fired at idle processors.
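To make that concrete, here is a minimal sketch, not Alfred's actual API, of how a job can be modelled as a tree of dependent tasks that are dispatched to free processors only after the tasks they depend on have finished (the task names are invented for illustration):

```python
from collections import deque

class Task:
    """One node in a job's task tree; children must finish before the parent runs."""
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

def runnable_order(root):
    """Yield tasks so that every task appears after all the tasks it depends on."""
    order, stack = [], [root]
    while stack:
        task = stack.pop()
        order.append(task)
        stack.extend(task.children)
    return reversed(order)   # leaves first, root last

# A toy "render shot" job: bake and simulate before the layer render, composite last.
job = Task("composite_frame", [
    Task("render_layers", [Task("bake_textures"), Task("simulate_cloth")]),
    Task("motion_blur_pass"),
])

# Fire tasks at idle processors in dependency order as slots free up.
queue = deque(runnable_order(job))
while queue:
    task = queue.popleft()
    print(f"dispatching {task.name} to an idle slot")
```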
At the peak of AVATAR, Wilkie was wrangling more than 10,000 jobs and an estimated 1.3 to 1.4 million tasks per day. Each frame of the 24-frame-per-second movie saw multiple iterations of back and forth between directors and artists, and each took multiple hours to render.

For Gunn’s data center, that added up to processing seven or eight gigabytes of data per second, a job that ran 24 hours a day for the last month or more of production. It’s a Goldilocks task of keeping the gear running fast, “not too fast, not too slow, but just right,” Gunn says, to keep the production on schedule. “It’s a complex system and when you’re on deadline with a project like this, you really want to make sure the lights stay on.”
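For a sense of scale, that sustained rate over a full day comes to several hundred terabytes (a rough calculation, not a figure Weta quoted):

```python
# Rough daily volume implied by 7-8 GB/s sustained around the clock.
gb_per_second = 7.5                         # midpoint of the quoted range
tb_per_day = gb_per_second * 86_400 / 1000  # seconds per day, decimal TB
print(f"~{tb_per_day:.0f} TB processed per day")   # roughly 650 TB
```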
A final film copy of AVATAR is more humble than all the back and forth that went into its creation: at 12 megabytes per frame, each second stamped onto celluloid amounts to 288 megabytes, or 17.28 gigabytes per minute. Deduct the credits from the 166-minute running time and you get a good sense of how large the final file really is.
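Worked through explicitly, the arithmetic looks like this (a rough estimate; the 166-minute running time still includes the credits the article suggests deducting):

```python
# Back-of-the-envelope size of the finished film, from the figures above.
mb_per_frame = 12
frames_per_second = 24
running_time_min = 166          # includes credits, which could be deducted

mb_per_second = mb_per_frame * frames_per_second    # 288 MB/s
gb_per_minute = mb_per_second * 60 / 1000           # 17.28 GB/min
total_tb = gb_per_minute * running_time_min / 1000  # ~2.9 TB for the whole film

print(f"{mb_per_second} MB/s, {gb_per_minute:.2f} GB/min, ~{total_tb:.1f} TB total")
```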
But the immersive effect of AVATAR comes from the many hours or days of attention paid to each of the roughly 240,000 frames that go into the final product. Weta asked us to mention visual effects supervisor Joe Letteri, who oversaw the interplay between the directors, a half-dozen lead concept artists and the supporting artists whose work made the technology process so intensive and powerful.
