Jon Favreau's live-action remake of The Lion King has already amassed close to $1 billion at the worldwide box office and, honestly, it now feels like a matter of when rather than if Disney decides to move forward with an adaptation of the animated sequel which introduced Simba and Scar's children.
In the meantime, we're keeping the focus on the first instalment as we recently had the chance to talk to MPC's Elliot Newman. He served as The Lion King's VFX Supervisor and here talks to us in detail about a huge number of topics, including fascinating explanations of how Virtual Reality helped create this world, how that technology could be used moving forward, and the creation of those realistic animals.
Elliot started working in visual effects in 2002 and joined MPC in 2004. He's worked on everything from World War Z to 300: Rise of an Empire and previously teamed up with Favreau on The Jungle Book.
This is a really interesting interview that's well worth spending the time to read through. We obviously want to say a huge thank you to Elliot for his time, and you can learn more about MPC right here.
Can you start by talking about the role VR played in the creation of The Lion King?
The VR toolsets we built for The Lion King were used to effectively shoot the film. Obviously, on a traditional movie set, you would have a cinematography crew to actually photograph real stuff with real cameras. With The Lion King, there were no real plates used in the movie...it was all pure graphics. We wanted to use more traditional filmmaking tools to actually produce the photographic side of it, like how the camera moves, and to have proper operators. That's where VR came in. We basically had a game engine which ran in real-time, and we built master scenes which were animated out in a rough way based on the story. Those master scenes, which would be quite long, would be loaded onto the stage/game engine, and there would be a virtual camera and a physical camera rig on the stage which would effectively be motion captured, so we'd pull all of the positional data from that camera. However, it wasn't actually a real camera and it didn't have a real lens on it; it was just a rig used to represent a camera. On the camera rig itself, you would have a monitor, and there were monitors on stage, so you could see exactly where you were pointing. That would be a feed from the game engine, and it would be tracking all of your movements in real-time.
Then, in the VR world, you'd put the VR goggles on and you'd be inside that game environment with that animation loaded in. You could see where the camera was positioned and you could see where the actors were. You could walk around, teleport yourself around the set in a virtual space, and that meant Caleb [Deschanel], the DP on the movie, was able to shoot it like he would any film. It was just him using slightly different tools, but they'd been designed to operate and film the same way as if he was doing a practical shoot. We then received that in post-production and it effectively became our plate, but what it gave us was 3D data from the shoot, so we could take the camera straight away. Traditionally that would be a match move process, where you track the movement of a camera and extract it from a plate so you can add visual effects and everything lines up. On this one, we captured all of that while the shoot was recorded, so we received the camera, the animation files, and everything in a 3D space. We'd then convert all the animation and assets to final quality and it would all be finally rendered. That's a very quick synopsis of the whole process!
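To make that camera-data handover a little more concrete, here is a minimal sketch of the idea: the motion-captured rig's per-frame transform (and the virtual lens the operator chose) is recorded during the take and rebuilt directly as an animated camera in post, so no matchmove pass is needed. The file format and the names (CameraSample, record_take, and so on) are illustrative assumptions, not MPC's actual toolset.

```python
# Minimal sketch of the camera-data handover described above: record the tracked
# rig's transform per frame on the stage, then rebuild it as a render camera in
# post instead of tracking ('matchmoving') it out of a filmed plate.
# File format and names are illustrative assumptions, not MPC's pipeline.

import json
from dataclasses import dataclass, asdict

@dataclass
class CameraSample:
    frame: int
    position: tuple         # world-space translation (x, y, z) in metres
    rotation: tuple         # Euler rotation in degrees (rx, ry, rz)
    focal_length_mm: float  # virtual lens selected on the stage

def record_take(samples, path):
    """Write the camera samples captured on the virtual production stage."""
    with open(path, "w") as f:
        json.dump([asdict(s) for s in samples], f, indent=2)

def load_take_as_render_camera(path):
    """Rebuild the animated camera in post straight from the stage data."""
    with open(path) as f:
        return [CameraSample(**s) for s in json.load(f)]

if __name__ == "__main__":
    # A five-frame dolly move, standing in for a real take.
    take = [CameraSample(frame=1001 + i,
                         position=(0.0, 1.7, -5.0 + 0.1 * i),
                         rotation=(0.0, 180.0, 0.0),
                         focal_length_mm=35.0)
            for i in range(5)]
    record_take(take, "take_001_camera.json")
    camera = load_take_as_render_camera("take_001_camera.json")
    print(f"Loaded {len(camera)} frames, first lens {camera[0].focal_length_mm}mm")
```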
Do you think, moving forward, that we're going to start seeing more locations created solely through VFX for an entire film, whether it's a jungle or even a city? How do you see the technology progressing from here?
The virtual reality toolset in itself didn't make the environment and jungle. I think in the future, there's the potential for that. With The Lion King, once we'd come off the stage, we went more into the traditional visual effects process of loading very complicated character rigs and scene assets, things that you wouldn't be able to load in a game engine. That would just be too complicated for a game engine to handle. As much as everything was captured on the stage in real-time with Caleb controlling the camera, it was still an approximation of the end result, which was our final quality render. I think there's definitely a lot of potential with the toolsets. We've already revised them massively since The Lion King, because the shoot happened about two years ago for us now and the tools we used for the virtual production side were first generation. We've improved them quite a lot since, streamlined them, and made them quite a lot quicker to use and quicker to get takes and render them. Obviously, game engines will keep improving; the technology that goes into games is looking pretty incredible, and we're seeing tests from things like Unreal Engine and Unity, which is what we use, that really are getting quite close to final production quality, which is a full-on, full-blown software render (which is how The Lion King was produced).
I think that's the way the industry is interested in going, as there's a convergence between the games industry and the visual effects industry, and I think for certain applications, maybe even in the short term, you'll see things being captured that way on set. You'll be able to do your visual effects ahead of time if you just want a simple background set extension that doesn't need stupid, complicated rendering. Maybe there is scope to have it all real-time in the game engine and pre-loaded, then to just shoot it and have that become your plate, with everything composited in real-time. I certainly think it's an interesting prospect for filmmakers as it gives them a lot more flexibility than they'd have otherwise. You can do all sorts of things, and you don't have the same physical restraints as a practical shoot. Even changing lenses can happen in an instant...you just say, 'I want to shoot on a 50 and not a 25,' and one second later, your render's changed. I think it definitely opens up new possibilities for how to shoot, and a lot of the filmmakers we're talking to are really interested in the process. Even if there's practical photography associated with the shoot, there's a lot of interest in figuring out what the composition looks like before committing to a plate. You can see it on the stage in real-time, and I definitely think this is just the beginning and we'll see more of it in the future, for sure.
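Elliot's point about swapping a 25 for a 50 in an instant comes down to the fact that, on a virtual camera, the lens is just a number feeding a simple field-of-view formula. A small worked example, assuming a Super 35-style sensor width of roughly 24.9 mm (an assumption, not a figure from the interview):

```python
# The lens swap described above is a parameter change on a virtual camera:
# horizontal field of view follows from hfov = 2 * atan(sensor_width / (2 * f)).
# The 24.89 mm sensor width is an assumed Super 35-style aperture.

import math

def horizontal_fov_degrees(focal_length_mm, sensor_width_mm=24.89):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

for lens_mm in (25.0, 50.0):
    print(f"{lens_mm:.0f}mm lens -> {horizontal_fov_degrees(lens_mm):.1f} deg horizontal FOV")

# 25mm gives roughly 53 degrees, 50mm roughly 28, and in a game engine the
# render reflects the change on the next frame.
```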
Do you think we've reached a point where a fully photorealistic VFX human actor could be created using a similar method to the animals in The Lion King, for a movie that looks 100% real, like this one?
I think humans are a different ball game. They've kind of been the Achilles heel for a long time. They're very hard. We've started doing more of that type of work; we did the Arnold [Schwarzenegger] reconstruction in Terminator: Genisys and we did Rachel in Blade Runner 2049. Those were fairly small sequences of fully CG performing actors. It's a slightly different process with a human. Firstly, humans are very unforgiving if you haven't quite polished the final product, as we're very sensitive to something that doesn't look right in a human face. We're so hard-wired to read even the smallest degree of emotional change in a human face that it becomes incredibly critical to pull that off, and if you don't, you get into the uncanny valley situation. With a lion, you have a little more flexibility and scope, as we're not quite as attuned to a lion's face as we are to a human's. My main thing, really, would be: what's the reason for it? There's a reason The Lion King was done this way. You can't go out and shoot this stuff in nature. You wouldn't be able to put a story together like we have, so there's a reason for the visual effects.
I think with anything you'd have to ask why you wouldn't just shoot an actor. Even if the technology existed and, with the right budget, we could create digital humans, what would be the reason to do that? Why would that be better? Why wouldn't you just shoot them in camera? It's the same with a David Attenborough documentary. Yeah, we've shown with The Lion King that we can produce something that gets incredibly close to reality, and it makes the audience wonder what they're looking at, and that's exciting. At the same time, personally, I wouldn't want to watch a David Attenborough documentary that was all animated because it wouldn't make sense. I'm watching it to see real animals, not to be fooled. I think as technology and processors evolve over time, if it gives you the ability to make actors thirty years younger, or there's a safety reason on set where you can't shoot something a certain way, digital acting is an option. Over the last five or six years we've definitely moved further away from the uncanny valley, which was the Achilles heel for a long time, but I think a lot of it would have to be justified. There needs to be a reason, because it's not easy and it requires a lot of humans to produce and animate every shot. It also requires a lot of technical data sets when you're working with CG humans. There are a lot of technical capture processes you have to put in place if you want to capture it correctly, so none of it's easy. Even if you could do it for ten shots, that's still an incredible amount of work for a lot of humans to take on. Technology is not at the point where you can click a button and say you want five digital actors in a shot! There's still a lot of work involved.
Finding a way to make realistic animals talk is one thing but how challenging was it when it came to having them sing?
That's a good question. I think it's probably something we thought was going to be a lot harder; I'm not sure, in reality, that it really was. Both represent their own challenges, because if you've got a scene that's very high on emotion and it's all about the subtleties of the performance, that can be just as hard as having Simba and Nala running at full pelt and singing. They're different levels of performance and they come with their own challenges. I think when we went into the movie and started getting into the animation of it, we were all a bit worried about how we would make these real lions sing, but I think the way it was done and the way Jon directed it helped a lot. He kept us to the more natural side of movement, not making things too floaty or too exaggerated, and that enabled us to work a lot closer to real reference.
There was a lot of research and reference photography that went into the movie, and to portray the lions as real lions it was key for us to always refer to nature. The way Jon directed those scenes helped because they're still just lions. Yeah, they're singing, but there's nothing that's been pushed too far, which would have made it a lot harder to pull off. I think it's something you expect to be really hard and tricky and worrisome, and then when you start getting into it, it was actually a really similar process to any other scene in the movie in terms of how you animate it. A lot of it was making sure the story and timing and editorial cuts worked. The animators stuck with doing what they do best, making sure everything transferred correctly, that the timings and the realism of the movements were sound, and doing their due diligence to work from that real reference as much as possible.
In terms of giving the animals a personality, how difficult was that and did you have to take the actors who voice them into account while going down that route?
Absolutely. We would animate with the video recordings of the voice performances, as those were captured as well, so for any dialogue shots the animators would have that to hand, along with the reference we captured ourselves in Kenya and at Animal Kingdom in Florida, where we shot some footage of real lions. It was all done in a way that allowed the animals to continue as they normally would. That was a remit from Jon: he didn't want any animal on a stage or in a scanning booth, and he didn't want us to interact with the animals while capturing them, so we always stayed at a distance and captured their natural actions in their normal habitats. So we did those shoots in Kenya and Florida and that built us a big library. We also had another research team pulling a lot of other footage of animal behaviour, and we used the two together. We used a lot of the behavioural traits seen in the wild in terms of how animals move and react, took some of the nuanced performances from the voice recordings, and then used parts of both.
We didn't ever want to bring too much of the human aspect into the animals, as it wouldn't have synced with the visuals; everything about this movie is about realism. It was really important to make sure we were always grounded in reality, and any time we wanted to push the emotion too far, it very quickly broke and Jon would always steer us back. It's about making sure the audience is following the emotion of the story and what's happening in it, and not pushing the facial expressions, because lions don't have those muscles and the human facial structure is very different to a lion's. Don't animate pulling the brow shape up to make the lion look sad, because as soon as you do that, you break the reality of the movie into a strange new territory which I don't think would have synced up with everything else in the film. It was a balance to hit, as they still had to talk and emote and you still had to understand what's happening in the story. They can't just be 100% real animals. There was a level of interpretation there, and Andy Jones, the animation director who worked with us in London, and Jon Favreau would both direct the team and make sure we were always hitting the right balance between the two.
Talking of realism, was there a specific reason why Mufasa's spirit was portrayed in that storm rather than just being a head in the sky, so to speak?
We always knew it was going to be more realistic. It would have just broken you out of what the movie was if we'd gone, 'Let's just put Mufasa in the sky and have him floating in the air.' It wouldn't have gelled with the rest of the movie, because it's about realism. We knew this was a magical scene, so we had some licence to do something different, but we didn't want to push it too far. We went through a lot of design iterations and blocked out the scene in a number of different ways, making the face in the clouds more visible, less visible...how do we light the face in the clouds? How do we use lightning to help us? We worked very closely with the editorial team in LA to get a sense of the timings. It was a difficult scene because, for most shots, you have character rigs loaded into the scene, you animate them, and then you present those in a rough sketch form for Jon to look at and say, 'I approve the animation in this, so let's move on to the next stage,' with the next stages being rendering and simulation, which are expensive and time consuming.
The process of animation was quite linear in that fashion, but the problem with the cloud sequence is that it's all just one big, heavy, expensive simulation of volumetric clouds, and they're some of the hardest things to get right. It was difficult for us to stay in that draft mode, because simulations at that scale are incredibly expensive and time consuming, so it was hard to support intuitive draft versions to work with Jon and show updates frequently. We had to come up with a way of simulating the clouds effectively so we could emulate the blocking process and present ideas. If we had just put the Mufasa character rig in the sky, animated it, and said to Jon, 'Imagine this as clouds in the sky,' he wouldn't have been able to tell and would have been like, 'Sure.' Then we would have gone and done a six month high resolution cloud simulation, shown him towards the end of the project and said, 'Here's the animation we showed you before and you said you were okay with, here it is in its final form,' and he'd see how different it looked.
We knew that was going to be a problem, as we knew we couldn't just block it out with animation, get Jon to buy into that, and move on to the next step. We had to start with proper cloud simulations, volumetrics, and rendering for the whole thing. Oliver Wayward, who was our lead effects artist on that sequence, found ways in Houdini, the software we used, to optimise it so it didn't take weeks and weeks to simulate, which meant we could actually show Jon intermediate steps and he could feel confident that it was working and be happy with it. Then it was just a process of how much do you want to see, how much do you want to leave up to the audience as to whether they saw something or not, and how visible do you want to make it? There was a lot of tinkering in that respect, but we always knew it was never going to be a literal 'there he is'. We always knew it had to be played down to a degree and use more of the audience's imagination as it's forming, and then you have the beats where you see it's definitely him. We used camera, composition, and editorial timing to emphasise that in shots, and we used Simba's reaction as well. It was one of the hardest sequences and one of the last we finished; definitely one of the hardest for sure, as it was very complicated simulation, design, and animation work.
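As a rough illustration of why those cloud simulations are so hard to iterate on: the cost of a volumetric simulation scales roughly with the number of voxels, so every halving of voxel size multiplies the work per frame by about eight. The back-of-the-envelope sketch below uses made-up numbers, not anything from MPC's actual Houdini setup:

```python
# Back-of-the-envelope illustration of why volumetric cloud simulation is hard
# to iterate on: the voxel count (and roughly the memory and solver cost per
# frame) grows with the cube of the resolution. All numbers are made up.

def voxel_count(volume_size_m, voxel_size_m):
    per_axis = int(volume_size_m / voxel_size_m)
    return per_axis ** 3

volume_size = 500.0  # an assumed 500 m cube of sky
for voxel in (2.0, 1.0, 0.5):
    print(f"{voxel:>4} m voxels -> {voxel_count(volume_size, voxel):,} voxels per frame")

# Each halving of voxel size multiplies the count by eight, which is why a
# coarse 'draft' pass for blocking with the director is so much cheaper than
# the final high-resolution simulation.
```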
Who would you say were the most challenging characters to animate? Scar is obviously very stylised in the original movie, so was it particularly difficult to give him that realistic look while also ensuring he didn't just look like a slightly different Mufasa?
That's a good question. Scar, I wouldn't say, was the most difficult, but he was probably the most stylised. Saying that, when we were looking at the lion reference, we were amazed at just how much variation there is between lions. You see skull structures changing from lion to lion, which is kind of insane. As a character, I think he is probably the most stylised. He was also very successful, as when we put him in a shot, he always worked brilliantly. Some of the characters worked like that and some needed more time and love in the shots. With some of the characters, you first see them rendered and say, 'That looks great.' He was one of them. I don't know if he was the hardest. Rafiki was quite difficult because there was a little bit of design work needed, as he was humanised a bit more than a real mandrill. You might notice a few differences, as we had to do that to support some of the performance requirements of Rafiki in the movie, so that required a little bit of time and attention. The Simba cub was probably the hardest one out of all the main characters because he's such a hero character and in so much of the movie. Simba was the first leading hero character we built when we went into the movie, and we developed a lot of new technology for shaders, for how we build eye anatomy, for how we textured the fur, and for how we developed dynamic fur movement. A lot of that technology and practice was employed on Simba first, so he became almost the guinea pig of the new technology in that regard. I think he probably ended up being the hardest, and he went through a lot of changes for us to get the correct recipe. Once we had that recipe and the new technology was working and locked in, a lot of the later characters we worked on came more easily as we employed the same technology.
One of the best examples of this technology was when Simba's hair travels back to Rafiki; just how challenging was it to create that sequence and were there any different ideas on the table?
I'm not sure there were that many other ideas. I think Jon had this idea in his head; it was definitely a Jon thing. The comedy aspect, and the fact this giraffe eats it and then poops it out, and then you've got this dung beetle rolling it. That was all Jon's direction, and when we watched it at the premiere in Los Angeles it was great to hear everyone laughing because [Laughs] it took us a long time. It was important that the storytelling actually read correctly. That was the most important thing for Jon. Do you see the tuft? Is it big enough? Is it lit? Do you understand what you're looking at when you follow that sequence? Do you understand what the story is, because there's no dialogue? That was challenging to get right because it did require us to model a few different variations of the tuft; as it travels through time it might lose a little bit of its density, and in this shot it might have to be slightly bigger than in the next one. We did that to emphasise the story points and make sure that, when you're watching it, the story flows and you understand what you're looking at. The tuft itself was one thing, but it travels through a lot of different environments, so those backgrounds had to be built, and there's one point where it lands in a river, which required a simulation of that river just to tell the story. It was a lot of work to support the journey this thing goes through.
Jon Favreau has said that there's only one real shot in the movie and, so far, no one has spotted it. How satisfying is that for you as the movie's VFX Supervisor?
It's great! I'm looking forward to someone finding it. We'll see. [Laughs] It was pretty cool. That was one thing he wanted to put in the movie, to be able to say that, yes, there are probably a few shots in the movie, establishing scenes, where you could have just used a real photographic plate. You definitely could have done. But Jon didn't want to do that; he wanted to be able to say that this movie is 100% animated and rendered apart from one shot. I don't think he's going to give it away, so we're going to have to see if anyone can find out which one it is. I'm not sure whether he'd confirm it if someone did find the right one, but I'm not saying anything!
Finally, which sequence that you worked on stands out as being your favourite?
I'd probably have to say the Mufasa death scene. It comes at the end of a very long sequence; the stampede sequence for us was quite long, intense, and full of action, and something we took on quite early in the production. Once we actually arrived at the part of the sequence where Mufasa has fallen and Simba has to find his dad, setting the mood and tone for that was quite intricate work, as was making sure the compositions and lighting were correct, and even the amount of residual dust in the air left over from the stampede. Making sure there was that emotional connection between Mufasa and Simba, and even the part where they nuzzle up against each other, was a complex interaction, as was the sand on the ground. There were a lot of mechanical parts we had to put together and make work, as well as a lot of compositional and photographic design to make it feel cinematic. Once we nailed that, we got a positive review, and that was one of the first major milestones for me on the movie. It's stuck with me because it was a real moment where you have a sigh of relief at having established something and can look forward to finishing the rest of the movie. At that point we still had a long way to go, but it did feel like quite a big leap for us to reach that point and say, okay, this is an incredibly important scene that Jon and the studio were worried about, and to then get the positive reaction where everyone was happy with it is definitely something that's stuck in my mind.