
The 'amazing' facial capture technology behind 'L.A. Noire'

Mark Milian
Actor Aaron Staton, from TV's "Mad Men," sits before 32 cameras in a cutting-edge studio to record his lines for "L.A. Noire."
STORY HIGHLIGHTS
  • For "L.A. Noire," engineers built a studio with 32 cameras pointed at actors' faces
  • The process, called MotionScan, realistically maps an actor's face to the game engine
  • The creators needed the technology for this detective saga, where eye contact is key
  • "L.A. Noire," set in 1947 Los Angeles, went on sale Tuesday

(CNN) -- Thirty-two: That's the magic number of cameras needed to capture the nuances of a person's facial expressions, according to the developers of a bold new video game, "L.A. Noire."

So that's why "Mad Men" actor Aaron Staton spent several months sitting in a bright, sterile room in a Culver City, California, studio, where 32 cameras were pointed at his head while he read lines from "L.A. Noire's" 2,200-page script.

The game's creators employed this groundbreaking animation technology to capture every nuance of Staton's facial performance and transfer him into a virtual 1940s-era Los Angeles. For Staton, the process was part of the gig's allure.

"My first thought in seeing it -- I just thought it was incredible," said the actor, who plays adman Ken Cosgrove in the AMC hit series. "I was just blown away by the detail, that this was footage from a video game."

Rockstar Games, creator of the "Grand Theft Auto" franchise and publisher of "L.A. Noire," hopes gamers are equally impressed. After "L.A. Noire's" coming-out party last month at the Tribeca Film Festival, the detective story arrives Tuesday for the PlayStation 3 and the Xbox 360.

The game, which echoes movies like "L.A. Confidential," puts players in the shoes of LAPD Detective Cole Phelps (Staton) as he investigates a string of arson attacks, racketeering conspiracies and murders rocking the city in 1947. Phelps must search for clues, chase down suspects and interrogate witnesses to figure out who is telling the truth and who is lying.

The game puts players in the shoes of LAPD detective Cole Phelps (Staton) as he investigates a string of crimes.

Because eye contact and subtle facial movements are key to the story, the game's creators needed a better way to capture those details.

Seven years of research and development went into "the rig," the crew's affectionate name for the 32-camera setup. At Depth Analysis studio, Hollywood effects merge with the cutting-edge camera system, which was designed to produce the most lifelike digital faces possible.

With spotlights beaming from all sides to eliminate shadows, Staton wore an orange T-shirt (a necessity for the editing software), makeup and a slicked-back hairdo. Every grin, wince and lick of the lips was fed from the cameras to a bank of computers and eventually into the game.

Depth Analysis owns the elaborate camera-and-server technology, which it calls MotionScan. A fellow Australian company, Team Bondi, developed the "L.A. Noire" game.

Both companies were founded and are run by Brendan McNamara, who penned "Noire's" ambitious script and directed the game's development. Before that, McNamara created "The Getaway," a London crime series for the PlayStation 2 that drew many comparisons to "Grand Theft Auto."

The interconnected camera rig he dreamed up could become a fixture of game development in this fast-moving industry.

Rather than asking animators to manipulate facial designs in 3-D rendering software as they've done for years, Depth Analysis uses the footage from all of those cameras, situated at various angles, to replicate an actor's face. Servers automatically map the faces, which greatly reduces how long the process takes, McNamara said.
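
MotionScan's pipeline itself is proprietary and not detailed in the article, but the general principle -- recovering 3-D geometry from synchronized footage shot at known angles -- is standard multi-view reconstruction. As a rough illustration only (the cameras, coordinates and point below are invented, and this is not Depth Analysis' actual algorithm), here is a minimal Python sketch that triangulates a single 3-D point from its 2-D position in several calibrated cameras, the kind of computation such a rig would repeat for thousands of facial surface points every frame:

```python
import numpy as np

def triangulate_point(proj_mats, pixels):
    """Estimate one 3-D point from its 2-D projections in several cameras.

    proj_mats: 3x4 projection matrices, one per camera (known from calibration).
    pixels:    (u, v) image coordinates of the same feature in each camera.
    Uses the direct linear transform: each view adds two linear constraints
    on the homogeneous point X, and the SVD gives the least-squares solution.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.vstack(rows))
    X = vt[-1]
    return X[:3] / X[3]  # back from homogeneous coordinates

# Toy check with three invented cameras (identity rotation, small offsets).
true_point = np.array([0.1, -0.05, 2.0, 1.0])
offsets = [np.zeros(3), np.array([-0.5, 0.0, 0.0]), np.array([0.0, -0.5, 0.0])]
proj_mats = [np.hstack([np.eye(3), t[:, None]]) for t in offsets]
pixels = []
for P in proj_mats:
    x = P @ true_point
    pixels.append((x[0] / x[2], x[1] / x[2]))

print(triangulate_point(proj_mats, pixels))  # ~[0.1, -0.05, 2.0]
```

The appeal of automating this step, as McNamara describes, is that servers can do the mapping directly from footage rather than waiting on animators to sculpt each expression by hand.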

Depth Analysis plans to license the technology to other companies. The studio has given tours to interested parties, including Activision Blizzard staffers potentially scouting for future "Call of Duty" games and Hideo Kojima, the famed Japanese creator of the "Metal Gear" series. (Kojima was "very impressed," McNamara said.)

Rockstar, which waited patiently as McNamara perfected the process, was also taken with the technology.

"It was pretty amazing in the tests that he had," said Jeronimo Barrera, Rockstar's vice president for product development. "It looks like it's been filmed. It doesn't look like it's been animated."

But some developers have expressed concern about ceding control over how a character looks and acts.

"Is it right for every game? No, not at all," said Barrera, adding that Rockstar has no immediate plans to use MotionScan in any of its other games. If you're not happy about a line, you have to bring the actor back in instead of just tweaking the problem in software, he said.

Game makers will have to adapt their processes to work under MotionScan's constraints, McNamara said. That entails hiring professional actors and directing them to do multiple takes -- much like a TV or film production.

"If you get a great performance out of somebody, why do you want to play around with it?" McNamara said. "Some people want to have this control over the character. They say: 'Can I control the actor's eyes?' And my answer was: 'The actor controls his own eyes.' "

Near-blinding lights and 32 cameras aside, the process feels in some ways like a Hollywood set.

Actors must plan on spending several hours in hair and makeup, Barrera said. If a character has taken a beating, black eyes are applied with powder and eyeliner; for especially bad smackdowns, the actor chomps on a blood capsule.

"We had burn victims who were in there for 4 hours getting prosthetic stuff," McNamara said.

Once an actor is in the rig, MotionScan requires him or her to stay mostly stationary -- ironic for a motion-capture system -- or else the software loses the full picture.

"You're sort of glued to a chair -- although not literally," said Staton, one of hundreds of actors who worked on "L.A. Noire." "Though, at one point, they did consider putting in seat belts."

McNamara has been working in motion capture for a dozen years. Traditional motion capture places neon-colored balls on an actor's knees, elbows and other body parts to let cameras record movements. When that's adapted to faces, "you're capturing rotations, not an eyelid flutter," he said. "You're kind of using the wrong thing."

"Why can't you just capture the outside of people instead of trying to find an approximation of where the bones are?" McNamara asked himself. "The idea, for me, has been around almost as long as since I started doing motion capture."

MotionScan was born from necessity. McNamara wanted to revive the detective-thriller genre, which has produced such movie hits as "Dick Tracy," "Se7en" and, yes, "L.A. Confidential," but virtually no great games. He figured that if gamers were to become virtual gumshoes, they'd need to be able to read the characters' faces to evaluate when someone is lying.

After many failed experiments, including an attempt at using sonar ("It kind of bounces all over the place," McNamara said), he arrived at the multi-camera concept.

The next iteration of MotionScan is expected to involve many more cameras shooting at even higher resolutions, so that actors can actually walk around instead of shooting motion capture and faces separately. Currently, the system records faces at 1 gigabyte per second, McNamara said. (That's about two high-definition episodes of "Mad Men" every second.) Full-body scans might require 150 gigabytes to 200 gigabytes a second, he said.
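
Those figures make the storage challenge concrete. A back-of-envelope calculation -- using only the rates quoted above, plus a hypothetical 10-minute take as an assumption -- shows the jump from face-only to full-body capture:

```python
# Rough storage estimates from the rates quoted above (taken at face value).
FACE_RATE_GB_S = 1            # current MotionScan face capture, GB per second
BODY_RATE_GB_S = (150, 200)   # projected full-body range, GB per second

take_seconds = 10 * 60        # hypothetical 10-minute take

face_gb = FACE_RATE_GB_S * take_seconds
body_gb = tuple(rate * take_seconds for rate in BODY_RATE_GB_S)

print(f"Face capture, 10-minute take: {face_gb:,} GB (~{face_gb / 1000:.1f} TB)")
print(f"Full-body capture, same take: {body_gb[0]:,}-{body_gb[1]:,} GB "
      f"(~{body_gb[0] / 1000:.0f}-{body_gb[1] / 1000:.0f} TB)")
```

Under those assumptions, a single 10-minute face-capture take produces about 600 GB, while a full-body version of the system would generate on the order of 90 to 120 terabytes for the same take.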
