INDIGO PROPHECY: Developer's Diary
THE TECHNOLOGY
Production
The full production of Indigo Prophecy took a team of nearly 80 people about two years, plus around 60 actors and stuntmen for the animation and voice work. The game was developed simultaneously on three platforms (PS2/Xbox/PC) using proprietary technologies, engines and tools. It features about 12 hours of optical motion capture animation and about three hours of speech translated into seven different languages.
Rendering
For Indigo Prophecy, we wanted to pay specific attention to rendering. We did not want to follow what you typically see in games: clean rendering, lens flares everywhere, and shiny environment and bump mapping all over the place. We were more interested in the quality of the final image as a whole, as in movies. We developed a technology based on post-rendering filters that allows you to change the key color of a scene to give it a special atmosphere. Most of the scenes are slightly tinted blue to give them this cold key color feel. Each image is treated with a second layer of high dynamic range effects, a rendering technique that increases the intensity of the light sources and slightly blurs everything on screen. A third layer adds a light grainy noise, similar to what you see in Silent Hill. The result is nothing that you could describe as "eye candy" at first glance, but when you play the game, it gives a unique atmosphere to the experience. Like the interface, it is something the player will forget after the first few minutes, but it greatly supports the ambiance.
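The three filter layers described above can be sketched as a simple per-pixel pipeline. This is a minimal illustration, not the game's actual shader code: the tint key, brightness threshold, and noise amount are made-up values, and the real HDR pass also blurs the image, which is omitted here.

```python
import random

# Shared deterministic noise source (seeded so the output is reproducible).
_RNG = random.Random(0)

def tint(pixel, key=(0.9, 0.95, 1.1)):
    """First layer: shift the image toward a cold, blue key color."""
    return tuple(min(255, int(c * k)) for c, k in zip(pixel, key))

def hdr_boost(pixel, threshold=600, intensity=1.15):
    """Second layer: crude HDR-style pass that intensifies bright pixels
    (the real effect also blurs the frame; that step is omitted here)."""
    if sum(pixel) > threshold:
        return tuple(min(255, int(c * intensity)) for c in pixel)
    return pixel

def grain(pixel, amount=8):
    """Third layer: light film grain, similar in spirit to Silent Hill."""
    n = _RNG.randint(-amount, amount)
    return tuple(max(0, min(255, c + n)) for c in pixel)

def post_process(image):
    """Apply the three layers in order: tint, HDR boost, grain."""
    return [grain(hdr_boost(tint(p))) for p in image]

frame = [(200, 210, 220), (250, 250, 250), (40, 50, 70)]
print(post_process(frame))
```

The point of the layered design is that each pass is independent and can be tuned (or disabled) per scene, which is what lets a single key-color change give a whole scene a different atmosphere.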
Cameras
In a game like Indigo Prophecy, the camera work is important. We wanted a technology that would give us as much freedom for direction as we would have producing a real movie. I also wanted to give the player the possibility to participate in the direction to a certain extent, without it becoming a burden for him. In short, he should be able to see what he wants to see, when he wants to see it.
The idea of having several cameras positioned on "the set", all tracking the player, was inspired by television broadcasting. For live shows on television, all cameras track the action, and the director decides which one he wants to use. We used a very similar system with four cameras at four points in each room. Each camera is positioned by us and can be static or moving along a trail, and trails can connect to one another, allowing the camera to move from one to the other seamlessly. With the left and right triggers on the controller, the player can switch between these four cameras at any time, which guarantees that he will always have a view showing him what he wants to see. We added a subjective view for close-ups to complete the system. Of course, we had to face the usual navigation issues that arise when you don't have a camera at your character's back, and the player needs to get used to it, but the gain was really significant.
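The trigger-based camera switching can be sketched as a small cycling rig. This is a hypothetical illustration: the actual system also moves cameras along trails and blends between them, which is left out here.

```python
class CameraRig:
    """Four pre-placed cameras per room; the controller triggers cycle
    between them, so the player can always find a useful angle."""

    def __init__(self, cameras):
        self.cameras = cameras
        self.index = 0  # current camera

    def left_trigger(self):
        """Cycle to the previous camera, wrapping around."""
        self.index = (self.index - 1) % len(self.cameras)
        return self.cameras[self.index]

    def right_trigger(self):
        """Cycle to the next camera, wrapping around."""
        self.index = (self.index + 1) % len(self.cameras)
        return self.cameras[self.index]

rig = CameraRig(["north", "east", "south", "west"])
print(rig.right_trigger())  # east
print(rig.right_trigger())  # south
print(rig.left_trigger())   # east
print(rig.left_trigger())   # north (wraps back around)
```

Because every camera is always tracking the action, switching is safe at any moment: whichever index is current, the player gets a valid view of the scene.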
For the more complex scenes requiring high-quality direction, we developed a tool that we call "M3" (Movie Maker Module), a kind of directing bench like Adobe Premiere, only in real-time 3D. Very complex cut scenes like the murder in the opening sequence were done with this tool. It probably gave me more freedom than a real camera, because with a virtual camera you can cross walls, go wherever you want without constraints, and change the lens at will. The tool also integrates filters and special effects as well as sound. All events are represented as blocks that can be moved and copied. Cameras and trails are placed directly in the real-time 3D window, with the possibility to place them while the animations play or are paused. For other scenes, we synchronized several movies that were played in Multiview in different windows.
M3 is really a dream for any director. The only limit is your imagination: you can experiment as long as you want and immediately see the result. I often thought that real directors would kill to have something like that rather than their heavy cameras.
IAM
IAM (Intelligent Adventure Manager) is the scripting tool we used for the game. It is a proprietary technology we created for our previous game, Omikron. The philosophy behind it is really simple: allow game designers themselves to assemble the game in real time. The entire tool is based on a simple language that anyone can learn without any programming skills, and it offers a graphical interface for entering instructions and triggers. IAM made complex scenes like the one in the diner possible, with many possibilities and paths that would probably have been a complete nightmare without it. The tool also manages conversation trees, something that was a real gift when building a game offering three hours of interactive dialogue in seven different languages!
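A conversation tree of the kind IAM manages might look like the sketch below. The node names, lines, and data layout are purely illustrative; they are not the actual IAM language or any real dialogue from the game.

```python
# Hypothetical conversation tree: each node has a spoken line and
# named choices leading to other nodes. Localizing the game means
# swapping the "line" strings while the structure stays identical.
tree = {
    "start": {"line": "Officer: Did you see anything unusual?",
              "choices": {"deny": "deny", "stall": "stall"}},
    "deny":  {"line": "Lucas: No, nothing at all.", "choices": {}},
    "stall": {"line": "Lucas: I... I need a minute.", "choices": {}},
}

def play(tree, node, picks):
    """Walk the tree from `node`, taking one named choice per branch,
    and return the transcript of spoken lines."""
    transcript = [tree[node]["line"]]
    for pick in picks:
        node = tree[node]["choices"][pick]
        transcript.append(tree[node]["line"])
    return transcript

print(play(tree, "start", ["deny"]))
```

Keeping the branching structure as data rather than code is what lets designers rework a scene like the diner without touching the engine, and lets seven language versions share one tree.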
MOCAP
In terms of motion capture use, Indigo Prophecy is probably one of the most demanding games ever made. It was clear that this would be the case when I started working on the project, and this is why we decided to have a motion capture set at our studio. Indigo Prophecy could not have been done without that.
The sessions took us about three months, which is about the same as a real movie shoot. We recreated the 3D space where the action takes place in the game, with props and accessories to give the actors reference points. Directing actors for motion capture is a very specific task. The main difficulty is helping them understand the situation and act without a real set or costume to react to. I tried to pay as much attention as possible to the quality of the acting. Another difficulty was the fact that several people contribute to the creation of one character. Three or four different actors worked on the motion capture for Lucas (depending on the required skills, acting or stunts), someone else provided the voice, and another person was in charge of the facial animation. Several months sometimes separated these elements, which made keeping an eye on consistency as the character passed through each artist extremely important.
Facial animation was made using a unique technique we developed based on a puppeteer and motion capture gloves. Each facial animation (phonemes for lip-syncing, eyebrows, smile, anger, irony, etc.) is assigned to a different finger. The puppeteer triggers them in real time, like a real puppet, to animate the face. We also added the possibility to animate the faces in our movie maker and by script, to make sure that faces would show emotions all the time and not only in cut scenes.
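The finger-to-expression mapping could be sketched as below. Both the finger assignments and the threshold are hypothetical; the diary does not document which finger drove which expression or how the glove data was blended.

```python
# Hypothetical mapping from data-glove fingers to facial expressions.
FINGER_MAP = {
    "thumb":  "smile",
    "index":  "eyebrows_raise",
    "middle": "anger",
    "ring":   "irony",
    "pinky":  "phoneme_open",
}

def frame_expressions(finger_bend, threshold=0.1):
    """Given per-finger bend amounts (0..1) read from the glove, return
    the expression weights to blend on the face this frame. Fingers
    below the threshold are treated as at rest."""
    return {FINGER_MAP[f]: bend
            for f, bend in finger_bend.items()
            if bend > threshold}

print(frame_expressions({"thumb": 0.8, "index": 0.05, "middle": 0.0}))
# -> {'smile': 0.8}  (only the thumb clears the threshold)
```

The appeal of the approach is that a puppeteer performs a whole face live, in sync with the body animation, instead of animators keyframing each expression after the fact.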
Music
For the soundtrack, I absolutely wanted to work with a movie composer who would bring a different quality of emotion to the game. I knew a lot of video game composers who would have done some "John Williams" stuff with a big orchestra, but I wanted exactly the opposite: something sensitive and emotional, the kind of soundtrack that we don't often hear in games.
Angelo Badalamenti was on my list almost from the beginning. I discovered his work on Twin Peaks a while ago and was totally amazed by the wonderful atmosphere he added to the series. As a big fan of David Lynch, I followed his work on all of Lynch's movies, as well as his collaboration with French director Jean-Pierre Jeunet. Working with Angelo quickly became a necessity in my mind. He was the best person to translate the fate of Lucas Kane into a music score.
I already had good experience working with gifted musicians after my collaboration with David Bowie on Omikron. Both have been incredibly easy to work with.
My only recommendation to Angelo was to forget that he was working on a video game and to think about his work exactly as if it were a real movie. I sent him the full scenario, a synopsis, the bible, visuals, storyboards, movies of the game, the full game of course, and some long, boring documents explaining my hopes for the music. Angelo was extremely receptive and open-minded. When he played me the first theme he wrote for Lucas during the opening sequence, it was obvious it was the right one. It had all the emotion I was looking for: this dark, epic, human feeling I was desperate to hear. He then worked on themes for all of the characters. The soundtrack was recorded with a full orchestra in Canada under the direction of Normand Corbeil, who did a wonderful job.
Labels: David Cage, Developers Diaries, English, Fahrenheit Indigo Prophecy
Thursday, September 22, 2005