VES Talk: "Creating The Creator" (2024)

  • BLENDING LIVE-ACTION AND ANIMATION TO CAPTURE THE CAST OF IMAGINARY FRIENDS THAT SPARK IF May 29, 2024

    By OLIVER WEBB

    Images courtesy of Paramount Pictures.

    Writer/director/actor John Krasinski’s live-action animated feature IF follows 12-year-old Bea as she begins to see everyone’s imaginary friends (IFs) who have been forgotten after their real-life friends have grown up. The impressive ensemble cast features Cailey Fleming, Ryan Reynolds, Emily Blunt, Steve Carell, Blake Lively, Matt Damon, Brad Pitt, Louis Gossett Jr., Bradley Cooper, Phoebe Waller-Bridge, Bill Hader and George Clooney.

    Ryan Reynolds (Cal) and Steve Carell (Blue) in IF. Blue is around eight feet tall, almost as wide and fills the frame. A challenge was making his character more appealing to audiences.

    “We wanted [Blue and Blossom] to feel grounded and real and tangible in the world of our movie, without feeling like they are on another layer. John wanted the comedy energy, but he also wanted the heart and the emotional resonance of the movie to be with the characters in the world. We talked about magical realism only feeling like magic when it feels real; this idea that the characters needed to be very tangible. So, we did things where we made their eyes feel like they were puppets, with resin and tiny micro scratches.”

    —Chris Lawrence, Production Visual Effects Supervisor

    Ryan Reynolds (Cal), Cailey Fleming (Bea), Steve Carell (Blue) and Phoebe Waller-Bridge (Blossom). Director Krasinski wanted the characters to feel like actors on the screen, emphasizing that every character needed to be thinking and every emotion counted.

    Chris Lawrence served as Production Visual Effects Supervisor on the film. “We had a meeting with director John Krasinski in April 2022,” Lawrence recalls. “He’d seen our work previously on Christopher Robin, and there was an obvious similarity. John has this amazing energy. He’s a real film lover. You could really tell that he had thought every shot of the movie through. It became very clear that there was a love affair to have with these characters and ideas that John was pushing for. His kids are slightly older than mine, but there was an affinity with wanting to do something that the kids would enjoy, and it was the pride of my life seeing my four-year-old daughter on the edge of her seat at the premiere. There aren’t many films you can make in this industry that you can take someone that young to go and see.”

    Ryan Reynolds (Cal) and Steve Carell (Blue). Director Krasinski had done a lot of artwork and designs before bringing on the VFX team. As a result, they were able to capitalize on that advance work and progress quickly.

    Some of the creative references that were discussed in the early stages included notable ’80s fantasy movies such as The Dark Crystal, Labyrinth and E.T. “We also spoke about growing up in the ’80s and how we make movies now,” Lawrence adds. “It was originally one of those meetings where you can go in with complete confidence as I knew I wasn’t going to do the project because I was engaged on another show. But when I left the meeting, I really wanted to do it. The other show got pushed by quite a lot and suddenly I was available, so it worked out and that’s how it got started. It went very quickly from that first meeting and then shooting in August. It was a very quick turnaround from that moment.”

    There were more than 40 IFs created for the film. From left, clockwise: Maya Rudolph (Ally Alligator), Keegan-Michael Key (Slime Ball), Sam Rockwell (Super Dog), Phoebe Waller-Bridge (Blossom), George Clooney (Spaceman), Steve Carell (Blue), Matt Damon (Flower), Emily Blunt (Unicorn), Richard Jenkins (Art Teacher), Awkwafina (Bubble), Matthew Rhys (Ghost) and Bill Hader (Banana).

    “Particularly in a live-action world, with live-action actors acting next to a CG character, you really want to feel the believability and you want the characters to sound honest in terms of performance. When I recently watched it for the first time on a big IMAX screen, I was thinking how many details there are, and I was thinking how [director] John [Krasinski] really wanted these characters to feel like actors on the screen. Every character needs to be thinking and every emotion counts; they can tell a huge story with just one eye dart.”

    —Arslan Elver, Animation Supervisor, Framestore

    “John had done lots of artwork and designed lots of stuff before we joined, so we were really able to capitalize on that and move very quickly,” Lawrence continues. “We worked with Framestore’s visual development team to do really quick renders. I think John had been in this world of sketches and tests that weren’t quite there, and we had this opportunity to build on top of that, rather than to keep trying different things. So, we dived straight into that process, especially with the characters Blue and Blossom, who are the two hero IFs in the story. They were two completely different challenges. Blue is around eight feet tall and almost as wide and fills the frame. There was the challenge of that and making his character appealing. Then, there was Blossom, who was much smaller, but she was really modeled on ideas from the Fleischer era, such as Betty Boop. We tried to come up with a design that was authentic to that but worked three-dimensionally so it worked with the style of animation that we were going to use. Again, we wanted her to feel grounded and real and tangible in the world of our movie, without feeling like they are on another layer. John wanted the comedy energy, but he also wanted the heart and the emotional resonance of the movie to be with the characters in the world. We talked about magical realism only feeling like magic when it feels real; this idea that the characters needed to be very tangible. So, we did things where we made their eyes feel like they were puppets, with resin and tiny micro scratches.”

    Krasinski wanted the comedy energy, but he also wanted the heart and the emotional resonance of the movie to be with the characters. From left, front row: Jon Stewart (Robot), Sam Rockwell (Super Dog), Emily Blunt (Unicorn), Maya Rudolph (Ally Alligator), Phoebe Waller-Bridge (Blossom), Keegan-Michael Key (Slime Ball), Blake Lively (OctoCat). From left, back row: Matt Damon (Flower), Richard Jenkins (Art Teacher), Bill Hader (Banana), Amy Schumer (Gummy Bear) and Louis Gossett Jr. (Lewis).

    Arslan Elver was Framestore’s Animation Supervisor on the film. “Performance-wise it was also quite special,” Elver remarks. “I remember in the first meeting, John looked at Framestore’s character animation reel and there was a shot of Rocket from Guardians, which he loved as he felt that it looked so real and true. He said he felt like it was the first time a CG character worked as a supporting actor in a movie. He also really loved Christopher Robin because you really feel those characters in the scene, but they are also moving in a believable way. They are not over the top and too cartoony. Particularly in a live-action world, with live-action actors acting next to a CG character, you really want to feel the believability and you want the characters to sound honest in terms of performance. When I recently watched it for the first time on a big IMAX screen, I was thinking how many details there are, and I was thinking how John really wanted these characters to feel like actors on the screen. Every character needs to be thinking and every emotion counts; they can tell a huge story with just one eye dart.”

    “Blossom was another one of our characters that was really tricky, not just because of the design, but also performance-wise. It’s a very different character, proportion-wise as well. Phoebe Waller-Bridge brings an amazing energy to the character. They are very careful, curated performances for these CG characters, but when you watch the film, it’s not like they are CG; they’re just part of the film. I felt like that was a big hit of the film.”

    —Arslan Elver, Animation Supervisor, Framestore

    Matt Damon voices Flower. Looking at Framestore’s character animation reel, Krasinski admired Rocket from Guardians of the Galaxy because it looked so real and true-to-life. He felt like it was the first time a CG character successfully worked as a supporting actor in a movie.

    When it came to Blue’s first animation test, Elver and his team kept a sizzle reel with all the development work being cut in with the music to show the studio and filmmakers where they were. “I remember there was this costume they built for Blue, and I was able to walk around in it,” Elver notes. “It’s a big character and just to feel the weight of that was important. We started developing some animation walk-cycle tests to find his character, and we used our Framestore lobby reception as a testing ground and shot some tests for John. With animation, you have to try multiple things. Some of them will be wrong, but you will find what the director wants, and that will just really guide you through that world. Blossom was another one of our characters that was really tricky, not just because of the design, but also performance-wise. It’s a very different character, proportion-wise as well. Phoebe Waller-Bridge brings an amazing energy to the character. They are very careful, curated performances for these CG characters, but when you watch the film, it’s not like they are CG; they’re just part of the film. I felt like that was a big hit of the film.”

    Imaginary friends, from left: Sam Rockwell (Super Dog), Allyson Seeger (Viola), Awkwafina (Bubble), Matt Damon (Flower), Keegan-Michael Key (Slime Ball) and Jon Stewart (Robot).

    Elver and Lawrence didn’t want the animation and visual effects to draw attention to themselves. “Sometimes you can think of huge explosions as being the archetypal and challenging visual effects, but I think it’s very challenging to make visual effects that don’t impede the filmmaker and don’t put them in an environment where there is a greenscreen stage all around, or have them talking to somebody wearing a green morph suit,” Lawrence observes. “We worked really hard to overcome that. We cast a great puppeteer to perform as Blue. We made sure he took up the physical space. So, he was wearing a suit with a big thing above his head, and we didn’t use motion control. We would allow our Steadicam shots to develop, and they’d act with real people, giving them that live feedback and creating that magical spark that is so much part of John’s process on set. Our job then was to apply all the complexity behind that, which included some vastly complex paint-outs. There’s amazing behind-the-scenes invisible work that’s done by some of the more junior artists who worked on the film, and I think they deserve a huge amount of recognition, because they were the ones who enabled us to make this film and do it in a way that supported the filmmaking at every stage. We honestly took that philosophy all the way through from the first days of prep to the last days of post. We were always trying to prioritize getting the best result overall for the movie, even if it was making our lives a little bit harder.”

    One of the most challenging sequences to capture was the big dance sequence, which filmmakers executed in postvis. Because the sequence was performance animation, the characters needed to breathe for a moment, extending shots to let the emotional impact sink in. From left: George Clooney (Spaceman), Amy Schumer (Gummy Bear), Emily Blunt (Unicorn), Steve Carell (Blue), Matt Damon (Flower), Cailey Fleming (Bea), Phoebe Waller-Bridge (Blossom), Richard Jenkins (Art Teacher) and Maya Rudolph (Ally).

    Another one of the central characters was Lewis, voiced by Louis Gossett Jr. “We don’t talk about Lewis enough,” Elver says. “Although he doesn’t have a lot of screentime, we always considered Blue, Blossom and Lewis as the three primary characters of the film. Lewis was also a very cool character because his human used to be a jazz player. I remember doing two animation tests and showing them to John on set. One of them was just Lewis standing up and bowing his hat, and the other one was a little dance, but he loved the hat so much, and he wanted that in the film. It was a lovely performance generated by the animators. When I watched it on the big screen, it looked gorgeous. There was this little spark in Lewis’ eyes that I remember John wanted. It was very true and layered. These characters act next to a human without being out of place. Lewis was a character that was very successful and dear to me.”

    Blossom (voiced by Phoebe Waller-Bridge) was modeled on ideas from the Fleischer Studios era of the 1930s, like Betty Boop.

    Lawrence and his team decided to postvis the circled takes and selects so John could edit using the postvis. “Everybody said to me that it wouldn’t work because takes are a minute and a half long, so you can never track in time and never animate in time, but it turns out you can, and it does work,” Lawrence details. “It was the right solve for us because it gave John a base. If he changed the take, then we still had basically animated the take and you could still reload it, and it was pretty quick. The other benefit of that is it allowed him to make very deliberate editing choices, based on Arslan’s direction of the animation. So, there was a very trusting approach where Arslan was directing the characters to do what he understood having been there on the day and understanding what the nuance of their performance should be. Then presenting that to John and allowing him to select it and cut around it and give notes. It was just very successful for those sequences to come together in a good way.”
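
    To make the idea concrete, here is a minimal sketch of the bookkeeping this workflow implies; the names are hypothetical and this is not Framestore’s actual pipeline. Because each take is tracked and animated in full, a changed edit simply maps back into the already-animated source range and reloads, rather than triggering new animation.

        from dataclasses import dataclass

        @dataclass
        class TakePostvis:
            """Postvis tracked and animated against an entire source take."""
            take_id: str
            first_frame: int   # first animated source frame
            last_frame: int    # last animated source frame

            def covers(self, src_in: int, src_out: int) -> bool:
                return self.first_frame <= src_in <= src_out <= self.last_frame

            def frames_for_cut(self, src_in: int, src_out: int) -> range:
                """Source-frame range editorial can reload from the postvis."""
                if not self.covers(src_in, src_out):
                    raise ValueError(f"cut falls outside animated take {self.take_id}")
                return range(src_in, src_out + 1)

        # A 90-second take at 24 fps is roughly 2,160 frames, animated once.
        take = TakePostvis("sc12_t3", first_frame=1001, last_frame=3160)

        # Editorial trims the cut: the new range reloads from the same postvis.
        print(take.frames_for_cut(1450, 1570))   # range(1450, 1571)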

    The characters act next to humans without being out of place, such as Lewis (Louis Gossett Jr.) holding hands with Bea (Cailey Fleming) at the amusement park.

    One of the most challenging sequences to capture was the big dance sequence. “It was beautifully choreographed by Mandy Moore,” Elver explains. “We had these dancers, and for filmmakers to be able to do this scene in postvis was just so useful. Because this was performance animation, you need to let the characters breathe for a moment to make the necessary emotional impact hit you. Occasionally, I’d ask John if we could extend a shot, as I thought it would help, and he would always say ‘go for it.’ There were even a couple of shots where he’d edit more frames as he just loved it so much. He wanted that look, or that moment to stay on the screen a little longer for the emotional impact. That was a joyful experience because he was very collaborative.”

    Cosmo (Christopher Meloni) gets in Ryan Reynolds’ face, although it was director Krasinski who initially “attacked” Reynolds to get him in that position for the animation. Cailey Fleming (Bea) and Louis Gossett Jr. (Lewis) share the moment.

    A puppeteer was cast to perform as Blue. The VFX team felt that the characters needed to be grounded and tangible for the audience to believe they were real and not just another layer.

    “It’s an amazing body of work that Framestore did [for the big dance sequence], and the animation of the dancing is just joyous. On top of that, it’s a real high point of the film for me. It’s a celebration of imagination set to a beloved Tina Turner song. It exemplified our philosophy of what we were trying to do and why. It’s where visual effects can become this unsung hero, enabling amazing live-action filmmaking.”

    —Chris Lawrence, Production Visual Effects Supervisor

    Emily Blunt is the voice of Unicorn. Creative references discussed in the early stages included notable ’80s fantasy movies such as The Dark Crystal, Labyrinth and E.T. the Extra-Terrestrial.

    Director John Krasinski ready on set with the Unicorn model for positioning purposes with Ryan Reynolds, to be replaced by the animated character. Krasinski served as director, writer, producer and had a starring role in the film.

    Ryan Reynolds (Cal), Cailey Fleming (Bea) and Louis Gossett Jr. (Lewis) interview Jon Stewart (Robot). Framestore, Cadence, One of Us and Untold split the workload, which consisted of more than 1,200 visual effects shots and 700 animation shots.

    The Framestore team in Montreal was responsible for the dance sequence. “Framestore did a fantastic recreation of the theater environment,” Lawrence says. “Completely invisible work where they recreated the whole thing. It intercuts live-action, then it’s completely CG theater replacements next to live-action again. The lights are all moving. It was quite an amazing thing when we first saw that all projected with all the renders in. It’s an amazing body of work that Framestore did, and the animation of the dancing is just joyous. On top of that, it’s a real high point of the film for me. It’s a celebration of imagination set to a beloved Tina Turner song. It exemplified our philosophy of what we were trying to do and why. It’s where visual effects can become this unsung hero, enabling amazing live-action filmmaking.”

    IF Richard Jenkins (Art Teacher) interacts with Ryan Reynolds and Cailey Fleming. Krasinski chose to go with his original idea for IF, rather than take on an existing franchise or toy.

    Framestore, Cadence, One of Us and Untold served as the vendors on the film, splitting the workload, which consisted of over 1,200 visual effects shots and more than 700 animation shots. There were more than 40 IFs created for the film. “You don’t always work on a movie with such an enthusiastic and creative director, with characters who are such a joy to work with. That experience from start to finish was just pure joy for me because you don’t often work on a show that is so animation-orientated, and it’s really fun animation on top of that,” Elver says.

    When they first met, Steve Carell donned Blue’s costume and hugged Cailey Fleming. Production Visual Effects Supervisor Chris Lawrence and Director Krasinski shared an affinity for making a movie their young children could go to watch in the theater.

    Lawrence concludes, “I think the development period and the problem-solving around the creation and ideation of these characters was very special. It’s so rare that you get to work on something that’s a completely original idea. I want to applaud John’s bravery for pushing to do that, rather than taking an existing franchise or toy. He completely put it out there with his own ideation. John was tearfully happy to see these ideas he had visually represented. Previs always evolves, but it completely informed what we were going to do, and John’s emotional reaction to seeing that we were getting it, were on board and were able to service him in the making of this film was, for me, the true pleasure.”

  • BUILDING THE 3D WEED FARM EXTENSIONS FOR THE GENTLEMEN May 21, 2024

    By OLIVER WEBB

    Images courtesy of Netflix.

    Guy Ritchie’s The Gentlemen is a spin-off of Ritchie’s 2019 film of the same name. The series follows Eddie Horniman, the 13th Duke of Halstead (Theo James), as he inherits his family estate and discovers a huge weed empire hidden beneath the property run by Susie Glass (Kaya Scodelario). After realizing that Susie has no intention of leaving, Eddie must navigate unfamiliar territory.

    Aristocrat Eddie Horniman (Theo James) inherits the family estate and discovers that it’s home to a huge weed empire, and its proprietors aren’t going anywhere. A wide variety of VFX shots was needed to keep the story moving and add scale. From left: Liran Nathan, Theo James, Kaya Scodelario and Kristofer Hivju.

    James Jordan served as Visual Effects Producer and Supervisor on the show. “There wasn’t a huge amount of VFX on The Gentlemen, but it still needs to be managed well creatively, schedule-wise and quality-wise,” Jordan states. “Moonage, Guy Ritchie and Netflix had a high level of expectation for the show. Additionally, we wanted to ensure that the VFX for the show met the creative needs and moved forward from the original film.” A style guide already existed for the show due to the legacy of the film. “It wasn’t so much the look,” Jordan explains. “We had lots of meetings about the style in which Guy likes to shoot and how he likes to depict his action sequences and his blood or gunshots. Also, what the cannabis farm needed to evoke and look like. We needed to add scale from a VFX point of view and other details to enhance the creative.”

    Mob boss Kaya Scodelario (Susie Glass) is running her father’s cannabis empire while he’s in prison. Director Guy Ritchie is a storyteller for whom VFX is primarily a tool to enhance aspects that can’t be shot practically.

    Daniel Ings (Freddy Horniman), Eddie’s unreliable brother, patrols the estate while under the influence of cocaine, which is why he’s acting here like a manic guard-chicken. (Photo: Christopher Rafael)

    “Guy doesn’t really like to do anything that doesn’t drive the story. He’s 100% story-orientated. He doesn’t want the VFX to be the story. It’s there as a tool to enhance aspects that can’t be shot for real. Some of that has to do with scale while other things have to do with the visible VFX that we already have, such as gunshots, wounds and set extensions. We have a lot of set extensions on the houses and weather fixes. It’s quite a variety of shots that they needed, but it’s there really to keep the story going and add scale to the story.”

    —James Jordan, Visual Effects Producer/Supervisor

    Visual Effects Producer/Supervisor James Jordan and his team created mostly invisible VFX. Splitting the shots between multiple vendors allowed different teams to work in parallel and turn around the episodes efficiently without compromising quality. From left: Joely Richardson, Theo James and Daniel Ings.

    Jordan was already familiar with Ritchie’s work before coming onboard the project. “I really like his storytelling,” Jordan says. “I’ve watched most of his films, but The Gentlemen obviously felt the most relevant in terms of references as the show is a spin-off of the film. Guy doesn’t really like to do anything that doesn’t drive the story. He’s 100% story-orientated. He doesn’t want the VFX to be the story. It’s there as a tool to enhance aspects that can’t be shot for real. Some of that has to do with scale while other things have to do with the visible VFX that we already have, such as gunshots, wounds and set extensions. We have a lot of set extensions on the houses and weather fixes. It’s quite a variety of shots that they needed, but it’s there really to keep the story going and add scale to the story.”

    Lead creative vendor Jellyfish Pictures was primarily responsible for the weed farm extensions.

    When it came to managing the VFX workload with his team, Jordan went through the scripts in advance, breaking down the arc of the stories and finding out what the balance of shots was going to be. “That informed how we would create some of the assets that needed to be repeated,” Jordan notes. “For example, for the weed farm extension, the set was one container deep, so every time you see the weed farm looking through it, it all had to be a 3D build, and so that is a 3D set extension. The way we built that was determined by how many times we knew we were going to be seeing it. The more you are going to see it, the more you want to make sure that you’ve got enough detail and it’s a 3D asset that you can see from any angle. So, we worked out what types of shots we would need. There is also a graphic element in terms of the episode title, and Guy had a very specific idea of how he wanted the text to come across. There were other graphic elements that needed to have a certain energy to them, and even though they were digitally created, the series style required them to feel organic and more filmic, like an old-style film title rather than a digital film title. So, we very much wanted it to be a lot more set into the footage, rather than having a clean graphic. So, those were some of the discussions we were having creatively about how he wanted things to look and fit.”
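
    A rough sketch of that breakdown logic follows; the tiers, names and thresholds are illustrative assumptions, not the show’s actual rules. The point Jordan makes is that the build tier for an environment asset follows from how often, and how freely, the camera will see it.

        def choose_build_tier(shot_count: int, unique_angles: int, camera_moves: bool) -> str:
            """Pick an environment build tier from how the asset will be seen."""
            if shot_count <= 2 and unique_angles == 1 and not camera_moves:
                return "matte painting"      # one locked-off view: paint it
            if unique_angles <= 3 and not camera_moves:
                return "2.5D projection"     # a few views: project paint onto cards
            return "full 3D build"           # seen often, from anywhere: model it

        # The weed farm was seen repeatedly, through a set only one container
        # deep, from many angles, so it lands in the most expensive tier.
        print(choose_build_tier(shot_count=40, unique_angles=12, camera_moves=True))
        # full 3D build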

    Because the weed farm set was only one container deep, every time the weed farm was seen and looked through, a 3D build and 3D set extension was needed with enough detail that it could be viewed from any angle.

    “[F]or the weed farm extension, the set was one container deep, so every time you see the weed farm looking through it, it all had to be a 3D build, and so that is a 3D set extension. The way we built that was determined by how many times we knew we were going to be seeing it. The more you are going to see it, the more you want to make sure that you’ve got enough detail and it’s a 3D asset that you can see from any angle.”

    —James Jordan, Visual Effects Producer/Supervisor

    Visual Effects Producer/Supervisor James Jordan and his team shot a motion control animation sequence to create a time-lapse sequence as part of documenting a cannabis plant’s journey from seed to street.

    Jordan worked alongside VFX Editor Celine Glasman James and VFX Production Manager Irene Garcia Alonso. “We had the three of us on the VFX production team, but we split our work between five vendors because we knew we had quite a quick turnaround on each episode from a delivery point of view,” Jordan says. “We broke the show down so that we had vendors doing particular tasks. Our lead creative vendor, Jellyfish Pictures, was doing the weed farm extensions primarily and some of the enhancements. For example, the dart in the head we had to put into 3D as it was difficult to shoot with a prop. We also had Peerless, who did some of the matte paintings. For set extensions, we had a couple of mid-level vendors who were doing such things as replacements and graphic-type elements, and then a fifth company that was specializing primarily in clean-up. On average, we had about 70 to 80 shots per episode, with one episode consisting of 150 shots, so it was not a heavy VFX show. Splitting the shots between multiple vendors meant we could work in parallel and turn around the episodes very efficiently without compromising quality.”

    “The most complicated challenge that we had was time, in terms of how quickly we were turning the episodes around. From starting the VFX on any particular episode to delivering the finals, we had on average four or five weeks. That’s a really quick turnaround, particularly if you want to maintain a high quality, which was a must for us. The show is very polished, and the VFX had to meet the quality and the expectation of Guy, Moonage and Netflix…”

    —James Jordan, Visual Effects Producer/Supervisor

    Building the 3D environments for the weed farm extensions involved work across all the episodes, and had to service all angles and lighting conditions. Above: Daniel Ings as Freddy.

    The most complicated aspect of the visual effects for Jordan and his team was building the 3D environment for the weed farm extensions. “That involved work across all the episodes, and it had to service all the different angles and lighting conditions. That was probably our biggest asset for the season. We also shot a motion control animation sequence to create a time-lapse sequence as part of a cannabis plant’s journey from seed to street.”
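
    A toy illustration of why motion control suits a seed-to-street time-lapse (the path and rig here are hypothetical): the camera move is a deterministic function of frame number, so passes captured days apart produce frames that line up as one continuous move.

        def camera_samples(path, n_frames):
            """path(t) -> (position, rotation) for t in [0, 1]; deterministic."""
            return [path(i / (n_frames - 1)) for i in range(n_frames)]

        # Every growth stage is shot with identical samples...
        path = lambda t: ((t * 2.0, 1.5, -4.0 + t), (0.0, 15.0 * t, 0.0))
        pass_week1 = camera_samples(path, 48)
        pass_week4 = camera_samples(path, 48)
        assert pass_week1 == pass_week4   # ...so frame i matches across passes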

    Changes in the weather during the shoot meant weather corrections were needed. Fixes were added in post to make the sequences consistent with frost and snowfall.

    “Most of the shots were relatively straightforward,” Jordan continues. “In this type of show there are things like muzzle flashes and blood on face and weapons in close proximity to people where we added all the lighting and the muzzle flashes and blood interaction. I wouldn’t say any of those are particularly complicated, but they do need to look and feel right. The most complicated challenge that we had was time, in terms of how quickly we were turning the episodes around. From starting the VFX on any particular episode to delivering the finals, we had on average four or five weeks. That’s a really quick turnaround, particularly if you want to maintain a high quality, which was a must for us. The show is very polished, and the VFX had to meet the quality and the expectation of Guy, Moonage and Netflix while ensuring that everything was done on schedule and within budget, but still meeting the creative needs.”

    Muzzle flashes and blood were added in post. Most of the visible VFX in the show consisted of gunshots, wounds and set extensions.

    Invisible VFX included bluescreens, replacements and clean-up, such as removing vehicles parked along the road leading to the manor to make it look like an isolated country estate.

    “When you have block directors, it’s also interesting to see how they then take the framework of the lead director and add their own creativity for their episodes. Each director did two episodes, and Guy did Episodes 1 and 2. It was a truly collaborative team. … It all felt very supported, and even though there were busy days and lots of changes, it always felt like there was an understanding of what the challenge was and an appreciation of the efforts that were undertaken to make everything work.”

    —James Jordan, Visual Effects Producer/Supervisor

    Half of the outdoor prison sequence was shot with real snow and half without, because it was shot during a period when there was a cold spell with only a few days of snow.

    Jordan and his team relied on creating mostly invisible visual effects shots. “It’s not about the VFX in this show. In this instance, if you are seeing the VFX as obvious then we haven’t really done a good job,” Jordan notes. “There’s lots of weather corrections throughout, such as the sequence with the snow when the travelers arrive. Half of that was shot with real snow and half without because it was shot during a period when we had a cold spell and a few days of snow, which affected the arrival of the travelers at the house and also the prison scene with Ray Winstone where he was talking to Susie and Eddie for the first time. Half of that was shot when it was foggy and the other half wasn’t, so we had to weather correct all those shots so it all fits together well. They are quite tricky things to get right, but they are not things that you make a fuss about, yet it all flows and is not questioned. The challenge for [VFX] is to not stand out in any way, so you don’t notice it, which is much the same for some of the other visual effects that we had. Sometimes we were shooting day sequences or looking through windows at night, and they used bluescreen. For some of the shots, you are looking at the background in-camera; other [backgrounds] have been added digitally. If you notice the difference then we haven’t done our jobs well enough. There really isn’t such a thing anymore as a non-VFX show. Anything less than 120 or 150 shots per episode I would regard as low, but there are always lots of things that are needed. Every type of show you see now has VFX embedded.”
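
    As a loose illustration of the simplest form such a weather correction can take (a sketch, not the show’s actual comp setup), a clear-weather plate can be pushed toward its foggy neighbors with a depth-attenuated fog element:

        import numpy as np

        def add_fog(plate: np.ndarray, depth: np.ndarray,
                    fog_color=(0.78, 0.78, 0.80), density: float = 0.12) -> np.ndarray:
            """plate: HxWx3 float image; depth: HxW distance in metres."""
            # Classic exponential fog falloff: farther pixels take more fog.
            fog_amount = 1.0 - np.exp(-density * depth)          # HxW in [0, 1)
            fog = np.asarray(fog_color, dtype=plate.dtype)
            return plate * (1.0 - fog_amount[..., None]) + fog * fog_amount[..., None]

        # e.g. grade the clear-weather half of a scene toward the foggy half
        plate = np.random.rand(540, 960, 3).astype(np.float32)   # stand-in plate
        depth = np.linspace(2.0, 120.0, 960, dtype=np.float32)[None, :].repeat(540, 0)
        graded = add_fog(plate, depth)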

    Director Guy Ritchie and actor Vinnie Jones on the set of The Gentlemen. (Photo: Christopher Rafael)

    Jordan concludes, “It was an absolutely fantastic crew. I really enjoyed working with the three directors, David Caffrey, Eran Creevy and Nima Nourizadeh. They all had different approaches, which was great. Guy has his method, and having watched and enjoyed his films, I was intrigued to see how he pieces it all together. It’s great to see how a successful director runs his creative part of the set; I enjoyed seeing that. When you have block directors, it’s also interesting to see how they then take the framework of the lead director and add their own creativity for their episodes. Each director did two episodes, and Guy did Episodes 1 and 2. It was a truly collaborative team. Moonage is a fantastic production company. It all felt very supported, and even though there were busy days and lots of changes, it always felt like there was an understanding of what the challenge was and an appreciation of the efforts that were undertaken to make everything work. It’s one of the shows I’ve enjoyed working on the most, primarily for that reason. It never felt stressful, even though it was difficult at times.”

  • SCOTT PILGRIM TAKES OFF AS AN ANIME SERIES FOR NETFLIX May 15, 2024

    By TREVOR HOGG

    Images courtesy of Netflix.

    Time is not always cruel to box-office bombs. Just look at the cult status achieved by Blade Runner, Fight Club, The Shawshank Redemption and Scott Pilgrim vs. the World. Much of the original cast of Scott Pilgrim vs. the World reunited with filmmaker Edgar Wright during the first year of the COVID-19 pandemic for a table read to mark the 10th anniversary of its theatrical release. Also present at the time was the creator and illustrator of the graphic novel, Bryan Lee O’Malley, who would get an opportunity four years later for another reunion, but this time for an anime version produced by Netflix. O’Malley partnered with BenDavid Grabinski, with both of them serving as executive producers, writers and showrunners for the eight episodes that revisit and reimagine the story about an aspiring musician living in Toronto who has to battle the evil exes of his ladylove.

    “Science SARU [Japanese animation studio headquartered in Kichijōji, Tokyo] is so amazing and can push things so much further than I would have done in my 20s in a black-and-white comic book with my level of drawing. Having this Olympic team at our disposal, we were like, ‘Let’s come up with some fun and powerful images that we couldn’t have created on our own.’”

    —Bryan Lee O’Malley, Executive Producer/Writer/Showrunner

    Ramona Flowers changing her hair color is a visual motif that honors the anime tradition of repeating shots and emphasizes to the viewer that she is the main protagonist of the story.

    Two decades after the publication of Scott Pilgrim’s Precious Little Life, O’Malley has a different perspective of the source material. “Like a 1000%,” he says. “That’s why, despite maybe people thinking anime should be a complete direct translation of the book into animation, I wanted to revisit and re-excavate it and do different things with it, and that’s how I ended up reinventing it as a whole and we ended up coming up with these scripts. Science SARU [Japanese animation studio headquartered in Kichijōji, Tokyo] is so amazing and can push things so much further than I would have done in my 20s in a black-and-white comic book with my level of drawing. Having this Olympic team at our disposal, we were like, ‘Let’s come up with some fun and powerful images that we couldn’t have created on our own.’” A key plot point was altered. “BenDavid came up with this great idea of reversing the first fight and having Matthew Patel win, and then we started coming up with new stuff that could spin off of that and new ways to look at the characters. It felt freeing,” O’Malley reveals.

    Fight sequences take advantage of the fact that Scott Pilgrim was turned into an action RPG video game, Scott Pilgrim vs. the World: The Game.

    Along with being responsible for helming Science SARU’s contribution to Star Wars: Visions called “T0-B1,” Spanish animator Abel Góngora directed the eight episodes of Scott Pilgrim Takes Off. “Having part of the team in the U.S. added an extra level of difficulty because we have very different ways of developing animation and cinema,” Góngora states. “Nevertheless, most of the production followed the same development process as any other Japanese anime series. Bryan, BenDavid and I had meetings often where we shared ideas and opinions. Our creative team in Japan would develop the concepts and designs from the script, and we could share them every week and get feedback by email or have a discussion in a video call. Sometimes they would update the script based on our new ideas that would enrich the story.” A variety of visual research was conducted. Góngora remarks, “We traveled to Canada and took so many photos, visiting the places you see in the original story and the film, and also visiting the studios where the film was produced. We wanted to know how it feels to walk around Toronto. We also collected a lot of pop culture references from cinema, music, video games and comics; that was already very important in the graphic novel, so I wanted to push it much more.”

    The thick linework was not something usually associated with anime.

    One of the characters that benefited from an expanded storyline is Knives Chau.

    “I was worried about whether or not we’d be able to maintain that [use of thicker lines]. Because of how many different key animators and in-betweeners take part in commercial anime projects, you often see variation in linework. There was a lot of trial-and-error as we tried to get a look Abel [director Abel Góngora] was happy with, including with the effects we used on the lines.”

    —Masamichi Ishiyama, Character Designer and Executive Animation Director

    Toronto remains a character in its own right. “It’s tricky because Toronto has changed so much, so do you do 2000s Toronto or now Toronto?” O’Malley notes. “We had to split the difference and do a fantasy homage of Toronto, like in the Scott Pilgrim vs. the World movie, but we also branched out into this futuristic version of Toronto which is a contrasting weird anime thing. That was fun.” Hanging in O’Malley’s office is a poster of Sailor Moon, which was his introduction to anime and consequently was an inspiration for the production. “Sailor Moon played on Canadian TV back in 1994 or 1995, and my little sister was watching it before school and I would peek in and go, ‘This is girlie stuff.’ But then the story got so intricate and interesting that eventually I was on the couch with her every morning watching it.” Anime is well-suited to the subject matter. “Animation is one of the most ideal forms for doing this type of story,” Grabinski believes. “Bryan’s thing is a collection of different approaches where there is some level of realism of human behavior and a recognizable reality, but then there is heightened action and impressionistic things. When it all becomes animated, to me it feels more cohesive.”

    Showrunner Bryan Lee O’Malley got to revisit his early musician days by songwriting with American chiptune-based pop and rock band Anamanaguchi.

    Serving as the Character Designer and Executive Animation Director was Masamichi Ishiyama. “The biggest difference was of course the art style. There isn’t a lot of cartoon-style animation produced domestically in Japan, so you don’t get the chance to work on that very often. Personally, I’d always wanted to work on something more cartoony, so it was really great luck that I ended up working on Scott Pilgrim Takes Off. It was also quite fresh to do an adaptation of a foreign property.” An unusual decision was to make use of much thicker lines than what was usually done in anime. Ishiyama explains, “I was worried about whether or not we’d be able to maintain that. Because of how many different key animators and in-betweeners take part in commercial anime projects, you often see variation in linework. There was a lot of trial-and-error as we tried to get a look Abel was happy with, including with the effects we used on the lines. There was also the language barrier [which required the assistance of a Japanese-English interpreter]. There were some times when my understanding was different from the director’s, which led to retakes.”

    There is a graphic-novel aesthetic to the visual language that incorporates split screens.

    “It was my first anime. It was Bryan’s [creator/showrunner Bryan Lee O’Malley] first show. Science SARU’s first time working with ‘outside’ showrunners. We were all, every single day, learning a new part of it. It was fun, where we would spend so much time over here on the music and then they would be doing sound effects over there. The first time we hear those things combined, there were so many surprises. It was a fun process.”

    —BenDavid Grabinski, Executive Producer/Writer/Showrunner

    An environment revisited from the original graphic novel is the video store.

    Even though Scott Pilgrim is in the title, he is not the actual protagonist of the series, which is reflected by Ramona Flowers changing her hair color each time there is a shift in the narrative. “It felt to us like a great visual concept to keep telling the viewer this is Ramona’s show, not Scott’s,” Grabinski observes. The creative and thematic choice was also a homage. “It’s an anime thing too,” O’Malley remarks. “I was thinking back to time-saving, cost-cutting measures of anime where you’re always going to repeat scenes. I thought it would be a fun nod to that, even though we ended up having a lavish anime that doesn’t cut any corners.” Familiar and new environments are explored. O’Malley notes, “We figured out what each episode was, where we would go and what kind of locations we would bring back from the books, like the video store. We spent two episodes in a movie studio, which is totally new. That’s me working out my demons from having gone through the whole process of making the Scott Pilgrim movie. That was fun for me.”

    One of the new environments is the movie studio storyline, which was inspired in part by the making of the feature Scott Pilgrim vs. the World.

    Graphic novel and video game aesthetics are central to the visual language and shot design. “There are two main worlds in Scott Pilgrim,” Góngora points out. “The normal world is the Toronto of around 2010, where Scott lives, which I wanted to make look realistic and cinematic. We would use darker colors than usual and some other effects to integrate the characters in the backgrounds in a more realistic way. The second is the fantasy world, connected with Ramona and the League of Evil Exes, characterized by impossible fight scenes and flashy video game-style concepts. It is part of the Scott Pilgrim universe that anything can happen, and visually crazy things are definitely going to distract you from the story, but that is the point.” Visual effects were an essential tool. Góngora explains, “I gave special importance to the depth of field blur and perspective for more realistic visuals. The characters might look simple and two-dimensional, but I wanted to have rich and textured images, coming from underground-rock-band aesthetics and punk fanzines made with photocopies and then mixed with ’90s video game pixel-style graphics.”
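
    A toy version of that depth-of-field treatment can be sketched as follows, assuming layered cel renders with known depths; the layering scheme and parameters are illustrative assumptions, not Science SARU’s setup. Each 2D layer is blurred more the farther it sits from the focal plane, then composited back to front.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def defocus_layers(layers, depths, focus_depth, coc_scale=0.6):
            """layers: list of HxWx4 RGBA floats, back to front; depths in metres."""
            out = np.zeros_like(layers[0][..., :3])
            for rgba, depth in zip(layers, depths):
                sigma = coc_scale * abs(depth - focus_depth)   # crude circle of confusion
                rgb = gaussian_filter(rgba[..., :3], sigma=(sigma, sigma, 0))
                a = gaussian_filter(rgba[..., 3], sigma=sigma)[..., None]
                out = out * (1.0 - a) + rgb * a                # simple over operation
            return out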

    Visually crazy things are going to distract from the storytelling, but that is the point.

    “Another special case would be the Ninja Paparazzi design, who started as a normal, stereotyped paparazzi man, but going through many back and forth notes, the misunderstandings led us to make them look like evil ninja. This was unexpected, but Bryan and BenDavid loved it in the end.”

    —Abel Góngora, Director

    The casual acting featured in the coffee shop between Ramona Flowers and Julie Powers is a personal favorite of Character Designer and Executive Animation Director Masamichi Ishiyama.

    Getting the original cast to return to voice their anime personas was a coup. “They were all excited that they had new things to do and it wasn’t just coming back to do the same lines, emotions and scenes,” Grabinski observes. “They got to build on top of characters in a way they weren’t expecting to do because it could naturally feel like their entire experience in the Scott Pilgrim world had already been done. I don’t think that a lot of them had even planned to do more with it. I felt that people like Ellen Wong enjoyed getting to have Knives Chau have a new story and do a lot more fun stuff. Creatively, that’s a fun experience for anybody.” Pacing and tone had to be mapped out before animation commenced. “It was mostly instinct and our gut feeling about how much time we should be spending with each of these characters at this point of the story,” Grabinski explains. “How many digressions could we get away with? How often do we need to be on story? And how often could we go on these side quests? It became a lot of instinct as things would start coming together responding to [these questions], but there are not a lot of opportunities to change things at that point, so we have to be judicious at every stage of it.”

    Visual effects were used to manipulate the depth of field.

    Ramona Flowers was the character who went through the most iterations. “Ramona becomes the main character, so we put special effort in her clothes and hair changing color,” Góngora notes. “We also developed two future Ramona designs that never appeared in the comics or other media. It was very complicated, but also a lot of fun. Ramona’s house was also important, so I put special effort in designing the rooms and furniture, making it more complex than the original design and adding some extra details. Another special case would be the Ninja Paparazzi design, who started as a normal, stereotyped paparazzi man, but going through many back and forth notes, the misunderstandings led us to make them look like evil ninja. This was unexpected, but Bryan and BenDavid loved it in the end.” Music is a strong component of the storytelling. “I worked a lot on the songs with Anamanaguchi, the band,” O’Malley adds. “I used to be in a band and write songs. Actually, it was fulfilling to flex those muscles in this context and get to help them with songs. Filling the frame with music was totally new to me.”

    Science SARU was not only responsible for the animation but was also a major collaborator in developing the story and look.

    Creating and executing the pilot episode was no easy task. “It was the most difficult to pull off in terms of figuring out the structure, the pacing, how it would play for someone who has never seen any version of it, and how it would play for someone who has seen the movie a bunch of times and read the books,” Grabinski states. “There was more time and thought put into that episode, but still at the end of the day you feel like you don’t have your objectivity about it. That’s one of the best things about telling a new story. We could be, ‘Does this work as a new thing?’ Even on the page, it took more out of me to write the first episode by far than any other episode because it felt like there were a million ways to do it and there was no clear path of what the exact right thing was to do.” The production found its groove. “We were blown away by all of the fights, especially Episode 102,” O’Malley singles out. “It’s an amazing fight. Then I was happy with how the ending came together. It all dovetailed nicely.” Ishiyama is looking forward to people seeing a couple of scenes in particular. “There are tons of great parts in all the episodes, but if I was to pick one, I’d choose the conversation between Ramona and Julie in the coffee shop in Episode 103. The casual acting is really amazing. The battle later in the video store feels great as well. If you’re after really exciting action, though, I’d recommend all the action scenes in Episode 104. That was probably the highest-level animation in the whole show.”

    The North Star for the character designs for Scott Pilgrim Takes Off was an X-Men cover homage that Bryan Lee O’Malley did after he completed the six graphic novels.

    Scott Pilgrim Takes Off was an educational experience for all of those involved with the production. “This was my first time working on a project, based on a foreign property, that primarily targeted a foreign market,” Ishiyama remarks. “I wasn’t sure how far all the skills I’d developed would get me. I was also worried that my technical skills wouldn’t be up to Abel’s high standards. It was a great honor to be able to serve as Chief Animation Director using Mr. Handa’s designs, which really deftly captured the good parts of the comic, while also being a great source of pressure. The entire project, from start to finish, was nothing but new challenges for me. I’m forever grateful to Bryan Lee O’Malley, BenDavid Grabinski and everyone else involved with the project for giving me a chance.” Grabinski went through his own learning curve. “It was my first anime,” Grabinski states. “It was Bryan’s first show. Science SARU’s first time working with ‘outside’ showrunners. We were all, every single day, learning a new part of it. It was fun, where we would spend so much time over here on the music and then they would be doing sound effects over there. The first time we hear those things combined, there were so many surprises. It was a fun process.”

  • ADDING AN EXTRA LAYER OF CREATIVE TO GODZILLA X KONG: THE NEW EMPIRE May 8, 2024

    By OLIVER WEBB

    Images courtesy of Warner Bros. Pictures and Legendary Entertainment. GODZILLA TM & © Toho Co., Ltd.

    Godzilla x Kong: The New Empire, the fifth entry in the MonsterVerse franchise and the sequel to Godzilla vs. Kong, follows Godzilla and King Kong as they face a new threat hidden deep within the planet and must come together to ensure the survival of the human race.

    Godzilla x Kong: The New Empire was directed by Adam Wingard, who also directed the most recent movie in the franchise, Godzilla vs. Kong, and developed a distinctive visual style.

    Paul Franklin, who served as Visual Effects Supervisor on the film, is the Senior Visual Effects Supervisor at DNEG and one of the founders of the company. “Godzilla x Kong: The New Empire was already in post by the time [DNEG Visual Effects Supervisor] Aleks Pejic and I joined the show,” Franklin explains. “The work that we were awarded was very diverse and complex. It became pretty clear that we needed a bit of extra firepower in the team. The movie was shot during the summer of 2022, and a lot of the design had already been completed. We were brought on to really add an extra layer of creative, specifically for some of the bigger environments, in particular the Hollow Earth sequences, which from a design point of view are quite challenging. While the production had done a lot of concept work themselves and had a lot of creative exploration of that space, I think they developed their thinking as the movie moved through post.”

    “There is also a heightened aspect to everything, particularly when we go into the mysterious world of Hollow Earth. The clarity of photorealism with the stylings is what I would describe as neon pop art. There is very much an electric quality to all of the imagery that Adam brings in there, but also in terms of the way the storytelling works.”

    —Paul Franklin, Visual Effects Supervisor, DNEG

    Godzilla and Kong were already created by another vendor and DNEG picked up the assets.

    Godzilla x Kong: The New Empire was directed by Adam Wingard who also directed Godzilla vs. Kong. “I think Adam had developed a very distinctive visual style with that film,” Franklin says. “There is also a heightened aspect to everything, particularly when we go into the mysterious world of Hollow Earth. The clarity of photorealism with the stylings is what I would describe as neon pop art. There is very much an electric quality to all of the imagery that Adam brings in there, but also in terms of the way the storytelling works. A lot of the storytelling in these films is through the creatures themselves. They are not just giant raging monsters smashing everything up; they have a personality and a distinct character. Kong is very distinct, and Godzilla has different aspects to his character. He’s not just a belligerent, raging monster all the time. There are moments of pathos and introspection, which I know sounds a bit strange when you are talking about a 400-foot-tall radioactive dinosaur, but that’s very definitely there and was something we could see from looking at the previous film. I think Adam was quite keen to explore that. We were keen to bring as much of that as possible to the performance of the creatures in our sequences.”

    Much of the workload for DNEG consisted of creature work.

    In terms of references, Franklin and his team focused on real footage of demolitions, collapsing buildings and earthquakes, and looked at the way buildings and cities had been destroyed in past films. “Obviously, we were doing a lot of real-world environments as well as fantasy environments,” Franklin notes. “We had a big sequence set in Rome, which opens the movie, when Godzilla is facing a monster called Charybdis, which is a giant spider-lobster creature. We also spent a lot of time looking at the way places like Rome are actually photographed and how you show off the architecture in its best light. There is a lot of plate photography in that sequence, live-action photography from the location, but quite a bit of it is created in the computer. You are wanting to capture the character of what you see in the live action but do it in a creative and stylistic fashion. So that was pretty important.”

    DNEG Visual Effects Supervisors Paul Franklin and Aleks Pejic spent a lot of time looking at natural crystals and the stylizations that people bring to the depiction of crystals in big visual effects movies.

    “A lot of the storytelling in these films is through the creatures themselves. They are not just giant raging monsters smashing everything up; they have a personality and a distinct character. Kong is very distinct and Godzilla has different aspects to his character. He’s not just a belligerent, raging monster all the time. There are moments of pathos and introspection, which I know sounds a bit strange when you are talking about a 400-foot-tall radioactive dinosaur, but that’s very definitely there and was something we could see from looking at the previous film.”

    —Paul Franklin, Visual Effects Supervisor, DNEG

    The workload was split across three locations. Pejic notes, “It started with the team in Montreal and Mumbai and then London as well. We also had artists in Vancouver and Toronto, but it was all led by the Montreal office. Paul and I are both based in London. We had key people in each location, and they all took on board the briefs and notes from the client. With the time difference, where we wouldn’t be able to accommodate Zoom calls and review sessions, they would just step in and help the team.”

    VES Talk: "Creating The Creator" (60)

    Kong is very distinct, and Godzilla has different aspects to his character. They aren’t just giant raging monsters; they have a personality and a distinct character.

    When it came to creating the fantasy environments, the biggest and most challenging one was the Iwi realm. “That’s where we meet the Iwi people, in the village at the base of the trio of glowing pyramids,” Franklin explains. “That was something that had been extensively explored by the concept work that the production had done, but I think they felt they wanted to take it to a new place. So we spent a lot of time looking at natural crystals and the stylizations that people bring to the depiction of crystals in big visual effects movies. We tried to capture something that was somewhere between a natural-looking rock crystal and something that embodied this sense of mystical power and energy that the Iwi people were able to control. Then, of course, there are a couple of subsidiary environments there. We go inside what’s called the engineering room, which is a large cave carved with hieroglyphs and filled with a lake of liquid mercury, which rises into these giant crystal columns. Later, we go inside the pyramid itself and into what’s called the ceremonial chamber, which is a large space and void within the pyramid where you meet another even bigger crystal that controls mystical energies. Figuring all that out was quite a challenge because there are no real-world references that you can draw from and no single reference. You are thinking about ideas of architecture and how crystals look, and you are trying to tell this story of this magical world they’re in. Then you have all the challenges of integrating it with the live-action as well.”

    VES Talk: "Creating The Creator" (61)

    A lot of the storytelling in these films is told through the creatures themselves. Neither speaks, so it’s all done with the eyes, particularly with Kong.

    VES Talk: "Creating The Creator" (62)

    When it came to creating the fantasy environments, the biggest and most challenging one was the Iwi realm.

    VES Talk: "Creating The Creator" (63)

    There are two flavors of Godzilla: Godzilla and Godzilla evolved, which is the version with pink spines as opposed to blue spines.

    “There was also a short sequence called ‘the veil’ that was just as challenging, which is this sort of reflective organic energy barrier that protects the Iwi realm from the rest of Hollow Earth and hides it,” Franklin continues. “This was described as being a reflective barrier that was somehow organic, as if it had grown there and had this energetic reaction to anybody that interacts with it. That went through many iterations. It took lots of work from our internal art department to come up with all sorts of different ideas and concepts, and a lot of hard work from the visual effects team, especially the team in Montreal who did the work where we see the veil reacting to Trapper (Dan Stevens) when he’s pressing against it. Eventually, when they pull the veil apart, the look of the tearing fibers was achieved with very complex cloth simulations, and there were all sorts of things going into it to create that moment.”

    VES Talk: "Creating The Creator" (64)

    Director Adam Wingard wanted to change up the way the giant spider-lobster creature Charybdis looked and make it more exotic-looking, so the DNEG team gave it a more vivid color scheme. The big sequence, which opens the movie, was set in Rome.

    Much of the workload for DNEG consisted of creature work. “Godzilla and Kong were already created by another vendor,” Pejic details. “We picked up the assets. They delivered the data, but our renderer calculated the data slightly differently. On one hand, you get everything delivered, but on the other hand, it’s the bare bones that you need to put together, along with references provided that we needed to match. So there is still a lot of work involved in order to make it like-for-like because of the different software involved. That took a while for both Godzilla and Kong, because the different tools demand a completely different level of expertise in order to match it like-for-like.”

    “There’s a lot of character moments in this, and it’s not just about creature action. It’s about personality and the soul of the character. Of course, neither of these characters speak. They might roar and bellow, but they don’t speak, so it’s all done with the eyes, particularly with Kong. We really had to pay attention to the subtle differences in eye shapes and the way the light was hitting the eyes and reflecting inside the irises that really capture that sense of who Kong is. That took a little while to get right, but once we figured that out it started coming out really well.”

    —Paul Franklin, Visual Effects Supervisor, DNEG

    VES Talk: "Creating The Creator" (65)

    Although Kong is a gorilla, he’s not a regular gorilla. His skin is a lot lighter and a lot less shiny.

    Alessandro Ongaro was the Overall Visual Effects Supervisor on the film. “He is also a former colleague of ours,” Franklin adds. “He started his earlier career at DNEG, so we know him very well. He’s got incredible attention to detail, and he was scrutinizing our version of Kong right down to the individual placements of hair. We had to match that perfectly because you want these sequences to cut back and forth with material that has been produced by the folks over at Wētā Workshop and Scanline, who were the principal vendors on the show. Our work has to slot right in there.”

    VES Talk: "Creating The Creator" (66)

    Alessandro Ongaro was the Overall Visual Effects Supervisor on the film and also a former colleague of Franklin and Pejic.

    VES Talk: "Creating The Creator" (67)

    The workload was split across three DNEG locations: Montreal, Mumbai and London. DNEG provided over 300 visual effects shots for the film.

    “[In the Iwi realm] we go inside what’s called the engineering room, which is a large cave carved with hieroglyphs and filled with a lake of liquid mercury, which rises into these giant crystal columns. Later, we go inside the pyramid itself and into what’s called the ceremonial chamber, which is a large space and void within the pyramid where you meet another even bigger crystal that controls mystical energies. Figuring all that out was quite a challenge because there are no real-world references that you can draw from and no single reference.”

    —Paul Franklin, Visual Effects Supervisor, DNEG

    DNEG provided over 300 visual effects shots for the film. “One of the other things that we had to pay attention to was not just matching the assets on a technical level, but also understanding the cinematic language that has been developed to portray these characters, in particular in the way that they were lensing up Kong,” Franklin adds. “There was a sequence on Monarch Island with Kong where he is getting his tooth pulled. We are very close to Kong, and there are more close-ups of Kong in that sequence than in any other part of the movie. It was all these tight close-ups of his face, and it’s an IMAX sequence, so it’s delivered at very high resolution in the large format. One of the things that is important to bear in mind is that although Kong is a gorilla, he’s not a regular gorilla. His skin is a lot lighter and a lot less shiny. Once we understood that, we started getting good results. There’s a lot of character moments in this, and it’s not just about creature action. It’s about personality and the soul of the character. Of course, neither of these characters speak. They might roar and bellow, but they don’t speak, so it’s all done with the eyes, particularly with Kong. We really had to pay attention to the subtle differences in eye shapes and the way the light was hitting the eyes and reflecting inside the irises that really capture that sense of who Kong is. That took a little while to get right, but once we figured that out it started coming out really well.”

    VES Talk: "Creating The Creator" (68)

    DNEG paid specific attention to the subtle differences in eye shapes and the way the light was hitting Kong’s eyes.

    VES Talk: "Creating The Creator" (69)

    Franklin and his team wanted the effects to be bigger and better than the last film, not just more of the same.

    Concludes Franklin: “While we have seen Kong, Godzilla and Charybdis in other movies, there is a slightly different take each time. For example, Adam wanted to change up the way Charybdis looked and make it more exotic-looking, so we gave Charybdis a more vivid color scheme and that was interesting, but you also want to bring ever-increasing levels of spectacle to these films. I always describe it as a spectacle arms race. You don’t want to do what they just did in the last movie because people have already seen it; they want to come to this film and see something new. If you just go through the motions, you won’t get stuff that is worthy of going into a film of this type. Anyone working on these sorts of films always faces the same situation. We always want them to be bigger and better than the last one, not just more of the same.”

  • INTERSECTING DIGITAL AND PRACTICAL EFFECTS FOR CONSTELLATION May 1,2024

    By TREVOR HOGG

    Images courtesy of Apple TV+.

    If the success of Everything Everywhere All at Once has proven anything, it is that parallel universe storylines are not confined to the Marvel Cinematic Universe. Following that trend is the Apple TV+ series Constellation, which explores the concept of transitional points in time and space known as liminal spaces. The science fiction psychological thriller created by Peter Harness consists of eight episodes and revolves around a mysterious experiment onboard the International Space Station that causes a devastating accident, prompting an astronaut to question her reality. The visual effects team, supervised by Doug Larmour and consisting of One of Us, Outpost VFX, Jellyfish Pictures, Mathematic Studio, Spectral, Studio 51 and Dazzle Pictures, was tasked with 1,582 shots and with ensuring that everything remained visually coherent and consistent upon repeat viewings.

    VES Talk: "Creating The Creator" (70)

    Morocco stands in for Kazakhstan, where Roscosmos Mission Control is located.

    “We were aware through the editing process of how much to reveal of the alternative realities while still tricking the viewer into thinking that it is a linear story. We would put little quirks in the plates so that you think, ‘Was that the right or wrong thing I just saw?’”

    —Doug Larmour, Visual Effects Supervisor

    VES Talk: "Creating The Creator" (71)

    The computer operating systems onboard the International Space Station are from over 20 years ago because that was when the ISS was constructed.

    “In terms of the style and feel of what directors Michelle MacLaren, Joseph Cedar and Oliver Hirschbiegel and Peter Harness were going for, it was along the lines of putting yourself outside of your comfort zone to make you feel as if, ‘Are you sure as a viewer yourself that you have seen what you have seen?’” Visual Effects Supervisor Larmour explains. “We were aware through the editing process of how much to reveal of the alternative realities while still tricking the viewer into thinking that it is a linear story. We would put little quirks in the plates so that you think, ‘Was that the right or wrong thing I just saw?’”

    VES Talk: "Creating The Creator" (72)

    Reflections are one of the visual elements utilized to imply the existence of multiple realities.

    Liminality influenced the visual aesthetic of the show. “We experimented with several different ideas involving reflections, light changes and lens warping. Whenever you see Jo Ericsson (Noomi Rapace) in Sweden or in a liminal moment where she is transitioning from one reality to another, we used quite a lot of lens effects to create a more tunneled view with a wavering double imaging so you felt the change. Within that, you have the idea of slowness of time. We had to create a particle system of snow, build those individual snowflakes and make them slow down or speed up whenever going through the liminal forcefield.”
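    Conceptually, the effect Larmour describes comes down to scaling each snowflake’s local clock according to where it sits relative to the forcefield. The sketch below illustrates that idea only; it is not the production setup, and the field position, radius and slowdown factor are invented for illustration.

```python
# Minimal sketch of per-particle time scaling: snowflakes advance at a
# reduced rate while inside a hypothetical "liminal forcefield" region.
import random

FORCEFIELD_CENTER = (0.0, 0.0, 0.0)  # assumed position of the liminal barrier
FORCEFIELD_RADIUS = 5.0              # assumed influence radius
SLOWDOWN = 0.15                      # assumed time-scale factor inside the field

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

class Snowflake:
    def __init__(self):
        self.pos = [random.uniform(-20, 20), random.uniform(0, 10), random.uniform(-20, 20)]
        self.vel = [random.uniform(-0.1, 0.1), -0.5, random.uniform(-0.1, 0.1)]  # gentle fall

    def step(self, dt):
        # Slow this flake's local clock inside the field, so time appears to
        # stretch as it passes through the liminal boundary.
        scale = SLOWDOWN if dist(self.pos, FORCEFIELD_CENTER) < FORCEFIELD_RADIUS else 1.0
        self.pos = [p + v * dt * scale for p, v in zip(self.pos, self.vel)]

flakes = [Snowflake() for _ in range(1000)]
for frame in range(240):  # 10 seconds at 24 fps
    for f in flakes:
        f.step(1.0 / 24.0)
```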

    VES Talk: "Creating The Creator" (73)

    Actors did tricks like standing on one leg to simulate floating through the frame.

    “We experimented with several different ideas involving reflections, light changes and lens warping. Whenever you see Jo Ericsson (Noomi Rapace) in Sweden or in a liminal moment where she is transitioning from one reality to another, we used quite a lot of lens effects to create a more tunneled view with a wavering double imaging so you felt the change. Within that, you have the idea of slowness of time. We had to create a particle system of snow, build those individual snowflakes and make them slow down or speed up whenever going through the liminal forcefield.”

    —Doug Larmour, Visual Effects Supervisor

    VES Talk: "Creating The Creator" (74)

    A spacewalk takes place in an effort to repair the damaged ISS.

    VES Talk: "Creating The Creator" (75)

    Jo Ericsson (Noomi Rapace) literally and figuratively sees double representations of herself as she watches her colleagues head back to Earth.

    Natural elements like embers were treated differently in outer space as opposed to on Earth because of the absence of gravity. “We shot lots of elements of embers for the fire that you see in the cabin later on in Episode 107, but the zero-g embers were fully made in Houdini,” Larmour reveals. “There has not been much in the way of experiments with fire in space because obviously it’s quite dangerous. No one knows what a big zero-g fire would look like other than it’s not withheld by gravity and is not so up and down but a broader flame. It was the embers that gave the fire a floaty feel because they were moving every which way.” The snow was treated completely differently from the embers. “We had some effects guys who were able to create on-set snow in the Arctic Circle, but at the same time we had two units shooting across a frozen lake and forest, so it was impossible for us to cover the entire area all of the time. There was a whole continuity thing where we had to match the bits that had snow with the ones that did not,” Larmour adds.

    “We had a famous American astronaut, Scott Kelly, as our expert advisor, and he had created some experiments in space with fluid. We completely used those as a reference guide for what a blob of fluid would look like floating through the frame. The special effects makeup guys spent two weeks making me some zero-g blood. It was an aloe vera-type substance that would stick and wobble like jelly, but it wouldn’t drip. Quite often on set, when it was going to be a blood scene, we would put a blob of this zero-g blood on the wall behind them so we could dress it as if it floated off and stuck to the wall.”

    —Doug Larmour, Visual Effects Supervisor

    VES Talk: "Creating The Creator" (76)

    Practical sets were constructed for the mid and wide shots where characters would be traveling and/or touching various parts of the ISS.

    Rather than having a space capsule splash down off the coast of Florida, the water has been replaced with the desert environment of Kazakhstan. “I don’t think it’s a pleasant experience landing in a Soyuz capsule!” Larmour notes. “When it works well, they land in Kazakhstan, which is a flat area of tundra. There are rockets that fire just before it hits to slow it down. We did a lot of research in making sure that our landing matched all of the footage that we had of Soyuz capsules landing.” A dog-wolf crossbreed threatens Jo upon the capsule door opening. “Thankfully, it wasn’t CG, but it took a lot of handling to make it do the right things at the correct time, like growl. There were different bluescreen plates for Jo in the capsule and of the wolf. We didn’t have them onsite, so we had to shoot the background when we were in Morocco. It was a three-plate composite whenever you see Jo and the wolf together.”

    VES Talk: "Creating The Creator" (77)

    VES Talk: "Creating The Creator" (78)

    There were always a couple of screen graphics being captured in-camera while the rest were composited later in post.

    “The ISS was built 20 to 30 years ago, and a lot of the operating systems of the personal computers that are up there are different and older. We had to make sure that our screen graphics were matching the actual screen graphics in the ISS now. Also, it being an Apple show, you have to make sure that your Apple products are exactly right in terms of their operating systems, how they work and when and how you swipe. A lot of effort went into making sure all of these screens were exact.”

    —Doug Larmour, Visual Effects Supervisor

    VES Talk: "Creating The Creator" (79)

    The CAL (Cold Atomic Laboratory), which is the cause of the multiple reality chaos, was treated as an electrical device rather than a magical contraption.

    VES Talk: "Creating The Creator" (80)

    Special effects deployed fog machines used by the Navy to create the desired atmospheric effect.

    Invariably, comparisons with Gravity will be made because of the destruction of the ISS, which is ironic because both projects had the same Production Designer, Andy Nicholson. “It was a brilliant piece of hiring because of the whole wealth of experience that Andy brought from Gravity, not only from having done that show, but also from the problems that they had shooting it and where it had gone well and hadn’t gone well,” Larmour states. “The first thing I did was to get The Third Floor involved, which allowed us to do a virtual tour of the whole ISS and give Michelle one or two months just sitting with The Third Floor, flying her way through the ISS and working out where she would like to put things, where the actors would be and go from one place to another; and where to put the camera so at least we shot the whole scenes in the ISS before we actually had to shoot the [full scene]. Based on that, we were able to go to Andy and say, ‘These are the shots. For the big wide expanses of the ISS, let’s do those CG. The things where you are seeing a real close-up, it doesn’t matter as long as it has something in the background. The hard ones are the mid to wide shots where you see them travel and touch lots of bits of the ISS. Those are the bits we have to build with enough room to fly there.’ That meant roofs that they could take off and big greenscreen teasers all of the way down the massive stage we had in Berlin. Previs helped us to know exactly what we had to shoot.”

    VES Talk: "Creating The Creator" (81)

    The burning cottage was a practical fire setup.

    VES Talk: "Creating The Creator" (82)

    CG snow had to be art directed to ensure that it seamlessly matched with the shots where the snow was achieved practically.

    Anti-gravity movements were mimed by the cast. Larmour explains, “When it came to moving slower than usual, the actors stood on one leg so they could drift through a frame. After a certain period of time, you get used to the idea of what that feels like to shoot. Stefan Sosna, our camera operator, got used to the idea of having a little bit of float. There are so many videos of NASA astronauts or cosmonauts filming while they’re floating, and the camera is always slightly moving because the camera itself is floating. It never felt static. Having seen that with the full CG shots, we tried to integrate that as well. Whenever there wasn’t enough of that, we tried to integrate it in post in order to get that feel. Then you have lots of CG objects.” Carnage unfolds inside the ISS, resulting in blood simulations. “We had a famous American astronaut, Scott Kelly, as our expert advisor, and he had created some experiments in space with fluid. We completely used those as a reference guide for what a blob of fluid would look like floating through the frame. The special effects makeup guys spent two weeks making me some zero-g blood. It was an aloe vera-type substance that would stick and wobble like jelly, but it wouldn’t drip. Quite often on set, when it was going to be a blood scene, we would put a blob of this zero-g blood on the wall behind them so we could dress it as if it floated off and stuck to the wall,” Larmour says.

    VES Talk: "Creating The Creator" (83)

    Noomi Rapace and Henry David are supported and move through the set for the ISS via a wire system.

    UI had to be created for the various computer monitors. “We worked with [graphic designer] David Henry, who had previously collaborated with Michelle on The Morning Show, which also had a lot of screens,” Larmour remarks. “Whenever you see a lot of screens, there are some that are practical. There was never just a whole wall of blue. Usually, there were at least two or three screens out of the 35 that had something on them. However, we didn’t always keep what was there. The ISS was built 20 to 30 years ago, and a lot of the operating systems of the personal computers that are up there are different and older. We had to make sure that our screen graphics were matching the actual screen graphics in the ISS now. Also, being an Apple show, you have to make sure that your Apple products are exactly right in terms of their operating systems, how they work and when and how you swipe. A lot of effort went into making sure all of these screens were exact.”

    VES Talk: "Creating The Creator" (84)

    Previs was crucial in determining what sections of the ISS had to be built practically.

    VES Talk: "Creating The Creator" (85)

    The ISS sequences were shot at Turbin Studios in Berlin.

    VES Talk: "Creating The Creator" (86)

    Black screens were deployed for the spacewalk sequences to get the proper bounce light.

    In the middle of the multiple-reality chaos is a container called the Cold Atomic Laboratory (CAL). “We did that in-house. It’s a lot about feel because you don’t want it to feel too magical; you want to base it on the idea of being an electrical appliance. It has to react in a way that creates double realities. There are these massive mad experiments that are a kilometer underground with gold rooms, which are there to capture small particles from the sun. What you see are little sparks, so we used the idea that you would have little pinging particles along with the double exposure, because that’s what it’s creating, and it’s electrical, so it feels like a blue LED running through it. You put it all together and go, ‘That’s nice, isn’t it!?’”

  • DELIVERING THE FIERY CAVE DRAGON FOR DAMSEL April 24,2024

    By TREVOR HOGG

    Images courtesy of Netflix.

    When a young bride becomes an unexpected offering to a dragon, both the creature and the royal family that betrayed her get more than what they bargained for as she has no intention of carrying on the sacrificial tradition. This is the premise for the Netflix production of Damsel directed by Juan Carlos Fresnadillo (Intruders) and starring Millie Bobby Brown, Ray Winstone, Nick Robinson, Shohreh Aghdashloo, Angela Bassett and Robin Wright. Dividing the visual effects work for the dark fantasy feature were supervisors Nigel Denton-Howes and Nicholas Brooks. “I came in at the beginning of post-production to help bring the dragon along because my background is doing creature stuff,” Denton-Howes states. “The original supervisor was more of an environments person.”

    VES Talk: "Creating The Creator" (87)

    The notes of Production Designer Patrick Tatopoulos regarding the wings of the dragon.

    Responsible for the production design was Patrick Tatopoulos, who has made a name for himself as a creature designer. “Patrick was brought back as well in post, which is unusual,” Denton-Howes notes. “I managed to get him to work with the artists at One of Us, and I finished off the look development and all of the details that are needed to make [the dragon] look real when you get into the shots.” The desire was not to go for a lizard-inspired dragon like Game of Thrones. “Our dragon is much closer to a panther, which is why when we brought her into the environments and caves, she is just as comfortable running around the caves as she is flying around them. Whereas your stereotypical dragon is lumbering on the ground and graceful in the air,” Denton-Howes explains. Tatopoulos’ designs for the dragon were refined, with the original version having a strong graphical orange line going down the flanks and back of the neck. “We followed the line to the spine and tail. That allows her to stand out in the caves. But the whole textural approach is that she is part of this environment and is supposed to blend in,” Denton-Howes says.

    VES Talk: "Creating The Creator" (88)

    Great attention was paid by Production Designer Patrick Tatopoulos to getting the dragon anatomically correct, such as the hip bones.

    “Our dragon is much closer to a panther, which is why when we brought her into the environments and caves, she is just as comfortable running around the caves as she is flying around them. Whereas your stereotypical dragon is lumbering on the ground and graceful in the air.”

    —Nigel Denton-Howes, Visual Effects Supervisor

    VES Talk: "Creating The Creator" (89)

    Getting the fire simulations correct was a major responsibility for One of Us, which handled the dragon and was aided by on-set lighting.

    Shohreh Aghdashloo provides the voice of the dragon. “There are certain sounds that are awkward for a mouth that big and a muzzle that long to make,” Denton-Howes remarks. “There is some lip sync, and we’re using the jaw a lot, but a lot of the motion and mechanics were actually done with the neck. When she inhales, the neck plates open, and it compresses like bellows when she exhales. We added a shiver to the neck plates to correspond to her emotional state. When she is confident, there is very little flutter in them and when she gets angry, they vibrate like crazy. That informed the sound design.” Something unusual for Denton-Howes was getting an opportunity to work directly with the sound design team. “We did a bunch of loops back and forth of animation and sound tests until we got a final dialogue sound that was going to work.”

    VES Talk: "Creating The Creator" (90)

    The neck plates were utilized to help make it believable that the dragon could speak as well as convey the emotional state.

    VES Talk: "Creating The Creator" (91)

    VES Talk: "Creating The Creator" (92)

    The dragon was modeled on a panther, meaning that it was equally comfortable moving on the ground and flying.

    Environments were enhanced to get a proper interaction with the dragon. “A lot of the environments are CG, but on the sets that were built we added all of the rocks and debris on top of them because they were actually bare,” Denton-Howes says. “When the dragon is walking, she can kick stones, and everything extended in the background is CG. When she is interacting with characters, like when one of the guards gets grabbed, it’s a takeover into an all-CG character. For some of them, the whole shot is CG. When we’re interacting with Millie, like when the dragon’s hand is on her neck, on set there were interactive elements such as claws that could press down to allow her to feel some of it. Then we also bent her skin in 2D to add indentations, and, in the dragon, there was some modeling to push in the pads of the thumbs and fingers to make them squishy so you can feel that the two are really touching each other.” Each cave was distinct. “One had stalactites and stalagmites. The main action area has giant columns and looks like a cathedral. Then there is the crystal cave that Millie climbs up. It’s done so you don’t feel as if you’re in the same place all of the time.”

    VES Talk: "Creating The Creator" (93)

    Each cave was treated as a different environment.

    “A lot of the motion and mechanics were actually done with the neck. When she inhales, the neck plates open, and it compresses like bellows when she exhales. We added a shiver to the neck plates to correspond to her emotional state. When she is confident, there is very little flutter in them, and when she gets angry, they vibrate like crazy. That informed the sound design. We did a bunch of loops back and forth of animation and sound tests until we got a final dialogue sound that was going to work.”

    —Nigel Denton-Howes, Visual Effects Supervisor

    VES Talk: "Creating The Creator" (94)

    VES Talk: "Creating The Creator" (95)

    The task for visual effects was to refine the details for the dragon.

    Light continuity was the biggest issue when Elodie (Millie Bobby Brown) is tossed into a crevasse that leads to the caves inhabited by the dragon. “You were starting at one place and knew what was going to be at the bottom,” Denton-Howes describes. “They were totally different stages and sets, and you’re telling a story of moving through space with bespoke shots where no two shots are the same, so you’re not reusing anything other than the digital double.” Assisting the cave lighting were glow worms. “Glow worms don’t have magical healing properties, but they actually exist. The bluish white light was part of the production design, and Larry Fong (Kong: Skull Island), our DP, ran with that. Throughout the whole thing we were trying to be photographic. When Millie falls down and has the pomander, it goes to black and slowly comes back into lighting. The idea is that your eyes are adjusting to the dark. We were trying to find photographic reasons for there to be light, and glow worms were one of them. Even in the main caves, it was a tricky lighting scenario on the set because Larry didn’t have a lot of choice of how he lit because the stages were small.”

    VES Talk: "Creating The Creator" (96)

    VES Talk: "Creating The Creator" (97)

    Modern-day elements like cruise ships had to be painted out.

    “Castles are like digital people where everybody knows what they look like, so you know when it’s not quite right. We did a lot of work on that, balancing fantasy with realistic. Initially, [director] Juan Carlos [Fresnadillo] wanted the castle to be clean and beautiful, but when you do that it doesn’t look real. You need to grunge the castle up and allow it to have a couple of centuries of weathering, but it’s still beautiful and magnificent.”

    —Nigel Denton-Howes, Visual Effects Supervisor

    VES Talk: "Creating The Creator" (98)

    VES Talk: "Creating The Creator" (99)

    Because of an actual drought, the colors in the plate photography had to be enhanced by Rodeo FX to make Aurea appear lush.

    Primary vendors for the 1,202 visual effects shots were One of Us, who was responsible for the dragon; Rodeo FX, who did a lot of environments and the glow worms; Pixomondo, who handled the dragon lair and the opening and end sequences; and Important Looking Pirates, who created the harbor environment and Elodie’s homeland. Other contributions came from The Yard VFX, Rising Sun Pictures, Rebel Unit, Atomic Arts, Primary VFX, NetFX and TPO VFX. “Later in reshoots, we added the opening scenes of Elodie’s homeland as visual contrast, as well as for storytelling reasons,” Denton-Howes states. “When you get into Aurea, it needs to look realistic but really lush and beautiful. In the grade, [director] Juan Carlos Fresnadillo pushed it into gold and warmed things up even further, which is a subtle change.”

    VES Talk: "Creating The Creator" (100)

    Looming over the castle is the Stone Mountain, which was inspired by the tooth of a cat.

    “When we’re interacting with Millie [Bobby Brown], like when the dragon’s hand is on her neck, on set there were interactive elements such as claws that could press down to allow her to feel some of it. Then we also bent her skin in 2D to add indentations, and, in the dragon, there was some modeling to push in the pads of the thumbs and fingers to make them squishy so you can feel that the two are really touching each other.”

    —Nigel Denton-Howes, Visual Effects Supervisor

    VES Talk: "Creating The Creator" (101)

    Weathering had to be added to the castle to make it appear more believable.

    VES Talk: "Creating The Creator" (102)

    Atmospherics were pivotal in obscuring the Dragon Gate to the point that the viewer would not be sure if a real dragon was staring right at them.

    VES Talk: "Creating The Creator" (103)

    A lighting source in the caves is the glow worms, which have been given healing properties.

    VES Talk: "Creating The Creator" (104)

    VES Talk: "Creating The Creator" (105)

    Rodeo FX had to replicate a partial set for the crystal cave so it would appear to be a mountainous climb for Elodie.

    As for the Stone Mountains that loom over the castle, feline anatomy was an inspiration, thereby tying the ominous natural landmark to the design of the dragon. “If you were to zoom out, the main mountain is analogous to a tooth of a cat, and for the lower mountains, you could put a jaw of a cat there,” Denton-Howes reveals. “The base of the castle is real in close-up shots extended up, and for most shots it’s entirely CG. That was time-consuming to do. Castles are like digital people where everybody knows what they look like, so you know when it’s not quite right. We did a lot of work on that, balancing fantasy with realistic. Initially, Juan Carlos wanted the castle to be clean and beautiful, but when you do that it doesn’t look real. You need to grunge the castle up and allow it to have a couple of centuries of weathering, but it’s still beautiful and magnificent.”

  • LAS VEGAS’ SPHERE: WORLD’S LARGEST HIGH-RES LED SCREEN FOR LIVE ACTION AND VFX April 15,2024

    By CHRIS McGOWAN

    VES Talk: "Creating The Creator" (106)

    The newest addition to the Greater Las Vegas skyline is the 366-foot-tall Sphere. Its exosphere, the exterior shell of Sphere, has 580,000 square feet of LED panels that morph into all types of images. Sphere’s images range from a giant eyeball and leaf-like color bursts to an architectural lattice and a vivid moon. The Rockettes’ kicking and dancing also fill the Sphere and seem particularly well-suited to light up a Las Vegas night. (Photos courtesy of Sphere Entertainment)

    On the outskirts of the Las Vegas Strip, a 366-foot-tall eyeball gazes out at the urban landscape. The traffic-stopping orb, simply named Sphere, has an exosphere of 580,000 square feet of LED panels that morph into the moon, an immense pumpkin, vast fireworks and much more.

    While the exterior of Sphere is now an imposing part of the Greater Vegas skyline, its interior is an immersive, scaled-up entertainment destination with seats for 17,600+. Films, concerts and events are displayed on the largest high-resolution LED screen in the world, an arena-sized canvas for live action and visual effects.

    The wraparound 16K x 16K resolution interior display is 240 feet tall, covers 160,000 square feet and comprises 64,000 LED tiles manufactured by Montreal-based SACO Technologies. The audio system, powered by Berlin’s Holoplot, uses 3D audio beam-forming technology and wave-field synthesis. Sphere Entertainment’s $2.3 billion project was designed by global architectural design firm Populous.

    Sphere Entertainment developed bespoke technology for the outsized format, including its Big Sky 18K x 18K, 120 fps camera system. The Sphere Studios division’s main Burbank campus is dedicated to production and post-production of visuals and mixing of immersive audio for Sphere and houses Big Dome, a 28,000-square-foot, 100-foot-high geodesic dome that is a quarter-sized version of Sphere, for content screening.

    The rock band U2 inaugurated Sphere with a five-month-plus residency for “U2: UV Achtung Baby Live at Sphere,” and showed off the venue’s vast creative possibilities for live shows. Director Darren Aronofsky’s immersive 50-minute film Postcard from Earth, which debuted soon after U2’s launch, tells the story of our planet seen from the future. Postcard used the Big Sky camera as well as Sphere’s 4D technologies, including an infrasound haptic system to simulate the rumbles of thunder or a rocket launch and sensory effects like breezes and scents.

    VES Talk: "Creating The Creator" (107)

    Nevada’s most endangered species crowd Sphere’s interior in Es Devlin’s “Nevada Ark” for U2’s show. (Photo: Es Devlin. Courtesy of disguise and U2)

    “At its best, cinema is an immersive medium that transports the audience out of their regular life, whether that’s into fantasy and escapism, another place and time or another person’s subjective experience. The Sphere is an attempt to dial up that immersion,” Aronofsky wrote in a press release.

    Soon after Sphere’s opening, Autodesk and Marvel Studios teamed up to create an ad celebrating the former’s software and The Marvels film for an Autodesk customer event in Las Vegas. The Mill helped with the VFX, utilizing the Autodesk tools Maya and Arnold. The segment featured a gigantic Goose the flerken (a cat-like creature that transforms into a monstrous alien) on the exterior of Sphere, another massive visual certain to draw attention for miles around.

    7thSense provides Sphere’s in-house media servers, processing and distribution systems, which were utilized fully on Postcard from Earth and serve as the venue’s main playback system. For “U2:UV,” the visuals were coordinated by Treatment Studio and powered at Sphere by a disguise playback system.

    U2 AT SPHERE

    Brandon Kraemer served as a Technical Director for Treatment Studio on the “U2:UV” residency at Sphere. He comments, “The unique thing that Sphere brings to the concert experience is a sense of immersion. Given that it’s a spherical image format and covers much of your field of view – and it’s taller than the Statue of Liberty on the inside – means it becomes an instant spectacle, and if you leverage that for all its uniqueness, you can’t help but blow audiences’ minds.”

    Kraemer recalls, “Willie Williams [U2 Creative Director and Co-Founder of London-based Treatment Studio] contacted me in September of 2022 about the project. That was very early on in the process. Early creative was being discussed then, but just as importantly we started to embark on just how we were going to technically pull this off.”

    Kraemer continues, “The majority of the visuals were designed by the artists at Treatment under the creative direction of Williams and Producer Lizzie Pocock. However, there were other collaborators on key pieces as well. Khatsho Orfali, David Isetta and their team from Industrial Light & Magic created an amazing cityscape that deconstructs itself for U2’s new song ‘Atomic City.’” And, he adds, “Marco Brambilla and his team at The Mill in Paris created a unique world for ‘Even Better Than the Real Thing,’ a dense psychedelic collage.”

    VES Talk: "Creating The Creator" (108)

    VES Talk: "Creating The Creator" (109)

    VES Talk: "Creating The Creator" (110)

    The newest addition to the Greater Las Vegas skyline is the 366-foot-tall Sphere. Its exosphere, the exterior shell of Sphere, has 580,000 square feet of LED panels that morph into all types of images. Sphere’s images range from a giant eyeball and leaf-like color bursts to an architectural lattice and a vivid moon. The Rockettes’ kicking and dancing also fill the Sphere and seem particularly well-suited to light up a Las Vegas night. (Photos courtesy of Sphere Entertainment)

    VES Talk: "Creating The Creator" (111)

    VES Talk: "Creating The Creator" (112)

    To capture large-scale, ultra-high-resolution imagery, Sphere Entertainment’s Burbank-based unit, Sphere Studios, developed the 18K x 18K, 120fps Big Sky camera system, used in spectacular fashion by Darren Aronofsky’s Postcard from Earth. (Photo courtesy of Sphere Entertainment)

    VES Talk: "Creating The Creator" (113)

    A massive cross of light is a simple but powerful visual at this scale, part of the band’s “U2: UV Achtung Baby Live at Sphere” residency. (Photo Kevin Mazur. Courtesy of disguise and U2)

    There were numerous technical challenges and quite a few diplomatic challenges as well, and these two areas often overlapped. Kraemer explains, “Opening a building and working in a construction site while stepping through rehearsal programming is quite a feat. My hat’s off to U2’s legendary Production Manager, Jake Berry, for keeping the whole operation moving forward in the face of what were, at times, some serious headwinds. Getting content rendered on that screen has lots of challenges along the way, and we were also very fortunate to have the support of disguise and their [GX 3] servers as the backbone of the playback system. We couldn’t have produced the show we did without their support.” In addition, the show utilized a custom stage, based on a turntable design by Brian Eno, and covered by Yes Tech and ROE panels.

    U2’s reaction was very positive, according to Kraemer. “The band put a lot of trust in the teams that Willie Williams put together, and they were pretty blown away by it all.”

    DISGUISE

    Peter Kirkup, disguise’s Solutions and Innovation Director, recalls, “We first became involved in Sphere through [U2’s Technical Director and Video Director] Stefaan ‘Smasher’ Desmedt. Together with Smasher, disguise has been working on U2 shows for decades, so it was a perfect fit.”

    Kirkup adds, “Disguise’s software and hardware powered the visuals that were displayed on Sphere’s wraparound LED screen during the U2 show. First, our Designer software was used to help previsualize and edit the visual content – all brought together by the creative minds at Treatment Studio, including Brandon Kraemer and Lizzie Pocock as well as Willie Williams.”

    Disguise’s Designer software allowed the creative team to previs their visuals on a computer with the help of a 3D digital twin of the Sphere stage. “This real-time 3D stage simulator meant ideas could be communicated more clearly and quickly to get everyone on the same page,” Kirkup notes. “Designer also helped the team to sequence the visuals into a timeline of beats and bars – and import audio to lock visuals to the beat. This helped create snappy, rhythmic edits and some extra looping segments that could be pulled in on the fly in case the band decided to do an extra riff on the day of the show.”
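    Locking visuals to the beat ultimately reduces to converting musical time into frames. The sketch below shows that arithmetic only; it is not the Designer software’s actual interface, and the tempo, meter and frame rate values are assumed for illustration.

```python
# Hypothetical sketch: convert (bar, beat) positions to playback frames so
# that edits land exactly on the beat.
BPM = 120          # assumed tempo of the track
BEATS_PER_BAR = 4  # assumed 4/4 time
FPS = 60           # assumed playback frame rate

def beat_to_frame(bar: int, beat: float) -> int:
    """Return the playback frame on which a given bar/beat falls."""
    total_beats = (bar - 1) * BEATS_PER_BAR + (beat - 1)
    seconds = total_beats * 60.0 / BPM
    return round(seconds * FPS)

# Cue a visual hit on the downbeat of bar 9, and a loop point four bars later.
print(beat_to_frame(9, 1))   # frame 960
print(beat_to_frame(13, 1))  # frame 1440
```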

    Kirkup continues, “Once the visuals were complete, our software split and distributed the 16K video into sections. We were working with one contiguous LED screen but still needed to split the video into sections because of the sheer volume of content involved. We were playing real-time Notch effects and pre-rendered NotchLC content at 60fps across the Sphere’s 256,000,000 pixel, 16K x 16K interior canvas.

    “Finally, our GX 3 media servers enabled all individual pieces to be perfectly in sync throughout the show,” Kirkup says. “This technology also allowed us to composite layers of video together in real time. For example, the video feed of the band that cinematic cameras were capturing during the show could be composited into our LED visuals from the Designer software. Each server was also upgraded with a 30-terabyte hard drive, so we had local storage machines for playout and 100GB networking back to the content store for file transfers and media management.”

    Kirkup adds, “We furthered our Single Large Canvas workflows, which enable content to be broken up into pieces and distributed across a cluster of machines – essential work to make a project like this come to life. We also introduced some custom color pipeline work for Sphere, adapting our standard color pipeline to match the unique characteristics of the in-house LED system.” He continues, “A big challenge was handling such a large volume of content across 256,000,000 pixels – in real time. There were 18,000 people watching the show, and they all had their camera phones ready to broadcast to even more people, so we really had to make sure the show went well.”
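    The Single Large Canvas idea of splitting one contiguous screen across a cluster can be pictured as a simple tiling calculation. The sketch below is a hypothetical illustration, not disguise’s implementation; the 4 x 4 cluster layout is an assumption.

```python
# Hypothetical sketch: divide a single large canvas into per-server tiles.
CANVAS_W = CANVAS_H = 16_000  # 16K x 16K canvas, 256,000,000 pixels
COLS, ROWS = 4, 4             # assumed 16-server cluster, one tile each

def tile_rect(server_index: int):
    """Return the (x, y, width, height) region a given server plays back."""
    col, row = server_index % COLS, server_index // COLS
    w, h = CANVAS_W // COLS, CANVAS_H // ROWS
    return (col * w, row * h, w, h)

for i in range(COLS * ROWS):
    print(f"server {i:2d} -> region {tile_rect(i)}")  # each server owns a 4000 x 4000 tile
```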

    Kirkup remarks, “Bono mentioned this during the show, but I believe the most important thing about Sphere is that for the first time, a venue of this scale is being created with musicians in mind. In the past, musicians needed to squeeze into sporting arenas or stadiums that weren’t created for music – they may have had tiny screens or the wrong acoustics. With Sphere, that’s all changed. For real-time graphics and VFX artists, that’s a big trend to watch for in 2024 and beyond. I expect to see more venues designed specifically to highlight 3D visuals. With that, more VFX artists and studios will be pulled in to develop not only movie and TV effects – but incredible visuals for live events, too. The two industries will start to blur.”

    7THSENSE

    7thSense – a creative software and technology company based in Sussex, England – put together the Sphere in-house playback system and provides hardware for media serving, pixel processing and show control. “Building a first-of-its-kind venue like Sphere brought with it a significant number of challenges that the 7thSense team was keen to dig their collective fingers into,” explains Richard Brown, CTO of 7thSense.

    Brown notes, “Managing exceptionally large canvases of playback, generative and live media as a single harmonious system is of utmost importance in a venue of this scale, and it is a workflow and underpinning technology we have been working on for quite some time. With a 16K x 16K canvas size, Sphere placed a priority on accelerating the development of the tools for media playback, multi-node rendering of generative assets and live compositing from multiple ST 2110 streams, as well as for pre-visualizing the show without having access to the full system. Because time in the venue is an incredibly rare commodity, anything that can be done ‘offline’ helps to make the time in the venue more productive.”

    VES Talk: "Creating The Creator" (114)

    The visuals for U2’s “Atomic City,” with VFX work by ILM, include a stunning deconstruction of Las Vegas going back in time. (Photo: Rich Fury. Courtesy of disguise and U2)

    VES Talk: "Creating The Creator" (115)

    The desert landscape around Las Vegas became a backdrop for U2’s “Atomic City.” (Photo: Rich Fury. Courtesy of disguise and U2)

    VES Talk: "Creating The Creator" (116)

    Marco Brambilla’s dense psychedelic collage “King Size,” put together with the help of The Mill in Paris, is an ode to Elvis Presley that accompanies the U2 song “Even Better Than the Real Thing.” (Photo: Rich Fury. Courtesy of disguise and U2)

    VES Talk: "Creating The Creator" (117)

    The interior display of Sphere is 240 feet tall and covers 160,000 square feet with LED panels from SACO Technologies. (Photo: Rich Fury/Ross Andrew Stewart. Courtesy of disguise and U2)

    VES Talk: "Creating The Creator" (118)

    The interior display of Sphere can create huge individual displays for any performer, and the venue uses 3D audio beam-forming technology and wave field synthesis for an appropriately big and precise sound. (Photo courtesy of disguise and U2)

    VES Talk: "Creating The Creator" (119)

    The huge $2.3 billion Sphere has altered the Greater Las Vegas skyline and become an entertainment destination, celebrating its launch in September 2023 with the “U2: UV Achtung Baby Live at Sphere” residency. (Photo courtesy of Sphere Entertainment)

    Brown adds, “High-speed streaming of uncompressed media from Network Attached Storage (NAS) is something we have been wanting to do for a long time, but the technology was not sufficiently advanced to support the bandwidth and timely delivery of data until very recently. Fortunately, the use case for this technology aligned very much with the desired workflow at Sphere, giving us the chance to really dig into what could be an industry-changing technology for media production and presentation systems.”

    Brown continues, “Managing synchronized media playback across dozens of servers is one thing, but making it straightforward for a show programmer to build the show that spans dozens of servers is quite another. 7thSense developed an Asset Logistics workflow that simplifies determining which actual movie frames each server streams from the NAS, based on representative meta-media used for programming the show timeline.”

    Brown explains, “Each server is configured with what section of the dome it is responsible for playing back, and this information, coupled with the name of the movie from the timeline, is used to determine the file path on the NAS that each media server uses to access the appropriate movie frames. This workflow reduces user error and makes timeline programming significantly faster than managing individual movies per server.”
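    In other words, each server can derive its own streaming paths from just two inputs: its configured dome section and the movie name on the timeline. The sketch below illustrates that resolution step with an invented directory scheme; it is not 7thSense’s actual file layout.

```python
# Hypothetical sketch of the Asset Logistics idea: a server resolves which
# frames to stream from the NAS using only its dome section and the movie name.
from pathlib import Path

DOME_SECTION = "section_07"        # set once per server at install time (invented name)
NAS_ROOT = Path("/mnt/nas/shows")  # assumed mount point of the NAS

def frame_path(movie: str, frame: int) -> Path:
    """Map a timeline movie name and frame number to this server's media file."""
    return NAS_ROOT / movie / DOME_SECTION / f"{movie}.{frame:06d}.exr"

# The show programmer only places "postcard_reel1" on the timeline; every
# server derives its own file path from that name.
print(frame_path("postcard_reel1", 1042))
# -> /mnt/nas/shows/postcard_reel1/section_07/postcard_reel1.001042.exr
```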

    Brown comments that Sphere is the first entertainment venue of its kind when it comes to the size and resolution of the media being presented to an audience. He says, “It is imperative that all media players, generative engines and pixel processors are working in absolute synchronization, or the illusion of immersion is lost for the audience. Worse than that, image tearing or jitter could cause the audience to become ill because of the immersive nature of the media plane. Everywhere you look, you are surrounded by the media.”

    In addition, Brown notes, “Not only is it our first major application of ST 2110, it just happens to be the largest ST 2110 network in an entertainment venue on the planet! 7thSense has been in the world of immersive presentations in planetaria, domed theaters, museums and theme park attractions since going into business nearly 20 years ago. But what has been created at Sphere is something new, a destination live-event venue, and the technology far surpasses what has been built to date. This hybrid type of entertainment has the potential to create its own category of immersive live show experience. It’s exciting to be part of the team building it from the ground up.”

    “I think it’s an experience like no other,” Treatment Studio’s Kraemer says about Sphere. “It was a thrilling experience to be part of the first creative team to produce an amazing show there. I think ‘U2:UV’ will be a very tough act to follow, but I think there is a tremendous opportunity to give an audience something that is impossible in a stadium or arena show, and I look forward to seeing how this all evolves.”

  • THE EXPANDING HORIZONS OF MOTION CAPTURE April 15,2024

    By CHRIS McGOWAN

    VES Talk: "Creating The Creator" (120)

    Snoop Dogg at Astro Project motion capture studio in Santa Monica for his “Crip Ya Enthusiasm” music video utilizing the Vicon system and StretchSense gloves. (Image courtesy of Vicon and Astro Project, LLC)

    Motion capture, performance capture and volumetric video technologies are rapidly advancing, incorporating AI and ML to a greater extent and focusing on enhancing realism, precision and accessibility. Peter Rabel, Technical Product Manager at Digital Domain, comments, “The trend towards real-time capabilities has become prominent, allowing for immediate feedback and integration into virtual environments, video games and live events. As we integrate artificial intelligence and machine learning as tools to enhance these functions’ capabilities further, it will enable automated analysis and capture of movements in real-time, which will help save time on the process, leading to cost savings. It’s essential for us to stay updated on recent developments and industry trends to understand the current trajectory of these capture technologies as technology continues to evolve so we can better serve our clients.”

    VICON: MARKERLESS

    Vicon made a splash in 2023 with its Los Angeles SIGGRAPH announcement of the debut of its machine learning (ML) powered markerless mocap. The news came after some three years of research and development focusing on the integration of ML and AI into markerless motion capture at Vicon’s R&D facility in Oxford, U.K. Vicon collaborated on the technology with Artanim, the Swiss research institute that specializes in motion capture, and Dreamscape Immersive, the VR experience and tech company.

    “The ability to capture motion without markers while maintaining industry-leading accuracy and precision is an incredibly complex feat,” says Mark Finch, Vicon’s Chief Technology Officer. “After an initial research phase, we have focused on developing the world-class markerless capture algorithms, robust real-time tracking, labeling and solving needed to make this innovation a reality. It was our first step towards future product launches, which will culminate in a first-of-its-kind platform for markerless motion capture.”

    VES Talk: "Creating The Creator" (121)

    On the mocap set of She-Hulk: Attorney at Law with diode suit and Digital Domain’s Charlatan “face-swapping” system. (Photo: Chuck Zlotnick. Courtesy of Marvel Studios)

    Finch continues, “What we demonstrated at SIGGRAPH was markerless recognition of the human form – using prototype cameras, software and algorithms – to track six people, with their full body solved in real-time, in a VR experience. This completely removes the need for participants to wear heavy gear with motion capture markers. As a result, the VR experience is more seamless and believable as the motion capture technology is largely invisible and non-invasive.” Finch adds, “Of the technology we showcased, Sylvain Chagué, Co-Founder and CTO of Artanim and Dreamscape, said, ‘Achieving best-in-class virtual body ownership and immersion in VR requires both accurate tracking and very low latency. We spent substantial R&D effort evaluating the computational performance of ML-based tracking algorithms, implementing and fine-tuning the multi-modal tracking solution, as well as taking the best from the full-body markerless motion capture and VR headset tracking capabilities.’ ”

    ROKOKO VISION

    Based in Copenhagen, Rokoko had two major announcements on the product front in the last year. “First, with Rokoko Vision, our vision AI solution that allows for suit-less motion capture from any camera. We released the first iteration mainly to get to know the space and gather insights from early use of the product,” CEO and Founder Jakob Balslev comments. “It’s becoming increasingly clear to us what the users need, and we are excited to release more updates on that front.”

    VES Talk: "Creating The Creator" (122)

    Rokoko’s Coil Pro is the company’s recent innovation in motion capture hardware, featuring no drift and no occlusion through a fusion of EMF and IMU capture. (Image courtesy of Rokoko)

    VES Talk: "Creating The Creator" (123)

    OptiTrack’s PrimeX 120 and PrimeX 120W cameras offer the company’s longest camera-to-marker range for Passive and Active markers. OptiTrack accuracy with more range enables very large tracking volumes for a wide variety of training and simulation scenarios, extreme ground or aerial robotic facilities and larger cinematic virtual production studios. (Image courtesy of OptiTrack)

    VES Talk: "Creating The Creator" (124)

    OptiTrack’s PrimeX cameras quickly identify and track Passive and Active markers. (Image courtesy of OptiTrack)

    He adds, “Second, we unveiled our Coil Pro – the biggest innovation we’ve ever done on the hardware side – and, in my eyes, probably the biggest innovation ever in motion capture. Through a fusion of EMF and IMU capture, the Coil Pro unlocks the holy grail of motion capture: No drift and no occlusion. With drift-free global position over time and no need for line of sight from optical solutions, the Coil Pro is the best of both worlds of mocap [IMU and optical]. The underlying platform, named Volta Tracking Technology, fuses EMF and IMU and will be at the core of all our motion capture hardware solutions going forward.”
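    The appeal of fusing the two modalities can be shown with a generic complementary filter: IMU integration is smooth and high-rate but accumulates drift, while the EMF measurement supplies an absolute, drift-free position. The one-dimensional sketch below illustrates that principle only; it is not Rokoko’s Volta Tracking Technology, and the blend weight and sensor values are invented.

```python
# Generic 1-D complementary filter illustrating EMF + IMU fusion.
ALPHA = 0.98  # assumed blend weight: trust the IMU short-term, the EMF long-term
DT = 0.01     # 100 Hz update rate

def fuse(position, velocity, accel, emf_position):
    """One filter step: integrate the IMU, then pull toward the EMF fix."""
    velocity += accel * DT                # IMU path: smooth but drifting
    predicted = position + velocity * DT
    fused = ALPHA * predicted + (1.0 - ALPHA) * emf_position  # EMF correction
    return fused, velocity

pos_fused, vel_fused = 0.0, 0.0
pos_imu, vel_imu = 0.0, 0.0  # IMU-only dead reckoning, for contrast
for _ in range(1000):        # 10 seconds of updates
    accel = 0.02             # stand-in accelerometer bias: the sensor is actually still
    emf = 0.0                # stand-in EMF fix at the true (stationary) position
    vel_imu += accel * DT
    pos_imu += vel_imu * DT
    pos_fused, vel_fused = fuse(pos_fused, vel_fused, accel, emf)

# The IMU-only estimate drifts to ~1.0; the fused estimate stays roughly an
# order of magnitude closer to the true position of 0.
print(round(pos_imu, 3), round(pos_fused, 3))
```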

    DIGITAL DOMAIN: CHARLATAN

    Digital Domain is further developing its machine learning neural rendering software Charlatan (sometimes referred to as a face-swapping tool). “Acknowledging the expense and time associated with traditional methods, including our top-tier Masquerade [facial capture] system, we developed Charlatan to introduce efficiency and affordability,” Rabel comments. “Several years ago, Charlatan was created using machine learning techniques. This innovative approach involves utilizing real photography of an individual’s face and applying enhancements, seamlessly transferring it to another person’s face, or even manipulating discrete aspects such as aging or de-aging. Recently, we have been developing Charlatan 3D, which evolves this technology to produce full 3D geometry from this process but at a lower cost and simpler capture conditions than Masquerade. In essence, Charlatan represents a significant stride towards streamlining the creation of lifelike digital humans with unparalleled realism.”

    OPTITRACK: NEW CAMERAS

OptiTrack provides tracking solutions that vary in use, spanning AAA game studios, medical labs, and consumer and prosumer budgets. In November the firm announced its three most advanced motion capture cameras: the PrimeX 120, PrimeX 120W and SlimX 120. “With higher resolution and increased field of view, these new additions enable larger tracking areas for a wider variety of training and simulation scenarios and larger cinematic virtual production studios,” says Anthony Lazzaro, Senior Director of Software at OptiTrack. All three cameras, which are designed and manufactured at OptiTrack’s headquarters in Corvallis, Oregon, feature the company’s highest resolution yet: 12 megapixels. “With the PrimeX 120, customers benefit from a standard 24mm lens, while the PrimeX 120W comes with an 18mm lens with a wider field of view. [And] we have 24mm or 18mm wide lens options available with the SlimX 120.”
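The practical difference between the two lenses is field of view. Under a simple pinhole model, horizontal FOV = 2·atan(sensor width / (2 · focal length)); the sketch below applies that formula, with the sensor width an assumed, illustrative value since the article does not quote one.

```python
import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    """Pinhole-camera horizontal field of view, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

SENSOR_WIDTH_MM = 19.2  # assumed for illustration; not an OptiTrack spec

for label, focal_mm in (("24mm lens (PrimeX 120)", 24.0),
                        ("18mm lens (PrimeX 120W)", 18.0)):
    print(f"{label}: {horizontal_fov_deg(SENSOR_WIDTH_MM, focal_mm):.1f} degrees")
```

With the same sensor, the shorter 18mm lens sees a noticeably wider angle, which is exactly the trade made for larger tracking areas.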

Lazzaro continues, “We also released a more informative and intuitive version of our mocap software, which is now compatible with all OptiTrack mocap cameras. Motive 3.1 is aimed at simplifying high-quality, low-latency performance motion tracking, offering users easy-to-use presets and labeling for tracked items that deliver the best possible motion data while saving time and eliminating extra steps. Customers also have greater visibility into possible issues and can resolve them automatically, even in the harshest of tracking environments.”

    STRETCHSENSE: MOCAP GLOVES

    Founded in Auckland in 2012, StretchSense took on the mission to build the world’s best stretchable sensors for comfortably measuring the human body. “Building on top of our sensor technology, in 2019 we pivoted the business to focus on motion capture gloves for AAA studios, indie studios, streamers, VR/AR, live shows and more,” explains StretchSense Co-Founder and VP Partnerships & New Markets Benjamin O’Brien.

    “Our Studio Gloves are incredibly unobtrusive, with a less than 1mm thick sensor layer on top of breathable athletic fabric, and a small transmitting module,” O’Brien says. “This is more than just a comfort and style thing though; it means that our gloves don’t get in your way, and you can continue to type, use a mouse, hold a prop, use your phone or just get a pizza from the door. Once you start to think about mixed-reality applications, this becomes even more critical, as our gloves allow you to switch seamlessly between interacting with virtual spaces and the real world.”

    O’Brien adds, “Our mission is to democratize motion capture, allowing independent content creators and streamers to create incredible and immersive stories and experiences. To achieve this, we have a long-term goal of getting our gloves down to a true consumer price point, which will really open up the space. At $795, we think our latest StretchSense Studio Glove is the biggest step the industry has ever taken towards this goal; less than two years ago, something with similar performance would have cost well over $5,000.”

    ARCTURUS AND VOLUMETRIC VIDEO

    Based in Beverly Hills, Arcturus Studios was founded in 2016 by veterans of DreamWorks, YouTube, Autodesk, Netflix and other notable companies. “Together, they saw the potential for volumetric video and decided to work together to steer its development,” recalls Piotr Uzarowicz, Head of Partnerships and Marketing at Arcturus. “That led to the creation of the HoloSuite tools, consisting of HoloEdit – a tool that can edit the 3D performances of performers recorded with volumetric video – and HoloStream, software that can compress a completed volumetric video file and stream it to any 2D or 3D device, even if the broadband signal is unstable. Together, HoloSuite has helped make it possible to use volumetric video for everything from e-commerce to AR projects to virtual production and more.”
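Arcturus does not detail HoloStream’s streaming logic, but the standard adaptive-bitrate idea behind surviving an unstable connection is straightforward: measure the link, then serve the richest encoding tier that fits with some headroom, stepping down rather than stalling. The snippet below sketches that general technique with made-up tier numbers; it is not HoloStream’s actual algorithm.

```python
def pick_tier(measured_kbps, tiers_kbps=(2000, 6000, 12000, 25000), safety=0.8):
    """Standard adaptive-bitrate selection (illustrative, not HoloStream's
    proprietary logic): choose the richest tier the link can sustain,
    keeping headroom so jitter does not stall playback."""
    budget = measured_kbps * safety
    viable = [t for t in tiers_kbps if t <= budget]
    return max(viable) if viable else min(tiers_kbps)

# A mid-stream throughput dip simply steps the mesh/texture tier down.
for link_kbps in (30000, 9000, 1500):
    print(f"{link_kbps} kbps link -> {pick_tier(link_kbps)} kbps tier")
```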

    Uzarowicz continues, “Arcturus took over Microsoft’s Mixed Reality Capture Studios (MRCS) business [in 2023], including the development of that capture system – the most sophisticated in the world – as well as the rights to maintain and supply MRCS licenses to studios around the world. That has put Arcturus in a unique position where it is now developing for all stages of volumetric video, from the capture and editing all the way to the final distribution.”

    “One of our goals has always been to make volumetric video more accessible. We’re looking at new ways to make it easier to capture volumetric videos using fewer cameras, including the use of AI and machine learning. With the MRCS technology and our licensees, we are working with some of the best and most creative content creators in the world to find where the technology can evolve and improve the production experience,” comments Uzarowicz. “We just released a new video codec called Accelerated Volumetric Video (AVV) that makes it possible to add more volumetric characters to a digital environment. With the MRCS technology, the quality of a captured performance is better than ever. Volumetric video is constantly evolving,” he adds.

    OptiTrack’s Motive 3.1 advanced motion capture software can be paired with any of OptiTrack’s motion capture cameras, including the premium PrimeX, Slim or low-cost Flex series. Motive 3.1 also offers trained markersets, enhanced sensor fusion and pre-defined settings. (Image courtesy of OptiTrack)

    StretchSense makes motion capture gloves for major and indie studios, streamers, VR/AR and live shows. (Image courtesy of StretchSense)

    StretchSense’s mocap gloves are unobtrusive, with a less than 1mm-thick sensor layer on top of breathable athletic fabric and a small transmitting module. StretchSense’s $795 Studio Glove is a step toward the company’s goal of getting its gloves down to a true consumer price point. (Image courtesy of StretchSense)

    “The trend towards real-time capabilities has become prominent, allowing for immediate feedback and integration into virtual environments, video games and live events. As we integrate artificial intelligence and machine learning as tools to enhance these functions’ capabilities further, it will enable automated analysis and capture of movements in real-time, which will help save time on the process, leading to cost savings.”

    —Peter Rabel, Technical Product Manager, Digital Domain

Arcturus took over Microsoft’s Mixed Reality Capture Studios (MRCS) business in 2023, including development of the capture system, as well as rights to maintain and supply MRCS licenses to studios worldwide. Arcturus also now develops for all stages of volumetric video. (Image courtesy of Arcturus)

    Arcturus’s HoloSuite tools consist of HoloEdit – a tool that can edit the 3D performances of performers recorded with volumetric video – and HoloStream, software that can compress a completed volumetric video file and stream it to any 2D or 3D device, even if the broadband signal is unstable. With HoloSuite it’s possible to use volumetric video for e-commerce, AR projects and virtual production. (Image courtesy of Arcturus)

    MOVE AI

Move AI officially released its single-camera motion capture app, Move One, in late November. “The app is now available to animators and creators looking to bring realistic human motion to their 3D characters,” said the company. “Move AI makes it easy to capture and create 3D animations.”

    AI/ML

“Arcturus is currently experimenting with AI and machine learning in several ways. From the moment we were founded, one of our main goals has always been to make volumetric video more accessible, and AI can help us do that in a few different ways,” Uzarowicz comments. “Among other things, one of the areas we are currently focusing on in our R&D is using AI to help us capture the same level of quality – or better – we can currently capture but use fewer cameras. One of the things that makes our MRCS technology the best in the world is the software that converts the multiple captured recordings into a single 3D file. With AI, we hope to improve that process.” Regarding AI/ML, O’Brien says, “We are seeing many companies using motion capture to create their own proprietary databases for training or tuning generative AI models, and we are looking at how we can lean into this. Finally, we are ourselves constantly investing in machine learning to improve the data quality [of] our products.”

“Given our experience with machine learning, we see Gen AI as a tool like any other in our toolbox, enabling us to create artistically pleasing results efficiently in support of the story,” Digital Domain’s Rabel says. “We have found that the combination of powerful tools, such as machine learning and AI, with our artists’ creative talent produces the photorealistic, relatable, believable and lifelike performances we are striving for. We feel the nuances of an actor’s performance in combination with our AI and machine learning toolsets are critical to achieving photorealistic results that can captivate an audience and cross the uncanny valley.”

Lazzaro comments, “OptiTrack already uses ML algorithms to derive optimal solutions for things like continuous calibration and trained markersets. Continuous calibration takes existing visible objects in a scene, i.e. markers, and uses that data to determine how to make small adjustments to fix calibration issues related to bumps, heat or human error. Trained markersets allow you to feed marker data into an algorithm to make a model that can track objects that were previously not trackable, such as trampolines, jump ropes and other non-rigid objects.” Lazzaro adds, “Advances in AI and ML will continue to shape the way that objects are tracked in the future.” Rokoko’s Balslev notes, “AI/ML will fundamentally change the motion capture space. Text-to-motion tools are emerging and maturing and will eventually completely disrupt the stock space for online marketplaces and libraries. These tools will, however, not be able to replace any custom mocap that requires acting and specific timing.”
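A toy version of the continuous-calibration idea Lazzaro describes makes the principle visible: when a camera is nudged by a bump or heat expansion, the markers it already sees reproject with a consistent offset, and a small correction can be solved from those residuals. The sketch below recovers only a 2D image-space translation; OptiTrack’s actual solver is far more sophisticated and unpublished.

```python
import numpy as np

def small_correction(observed, expected):
    """Least-squares 2D translation that realigns where markers were
    expected to reproject with where the (slightly bumped) camera now
    sees them. A toy stand-in for continuous calibration."""
    return (observed - expected).mean(axis=0)

expected = np.array([[320.0, 240.0], [400.0, 260.0], [350.0, 300.0]])
observed = expected + np.array([1.8, -0.7])   # consistent offset after a bump
print("corrective shift (pixels):", small_correction(observed, expected))
```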

“Our mission is to democratize motion capture, allowing independent content creators and streamers to create incredible and immersive stories and experiences. To achieve this, we have a long-term goal of getting our gloves down to a true consumer price point, which will really open up the space. At $795, we think our latest StretchSense Studio Glove is the biggest step the industry has ever taken towards this goal; less than two years ago, something with similar performance would have cost well over $5,000.”

—Benjamin O’Brien, Co-Founder and VP Partnerships & New Markets, StretchSense

    Move AI offers a single-camera motion capture app, Move One, for animators looking to bring realistic human motion to their 3D characters, making it easy to capture and create 3D animations. (Images courtesy of Move AI)

    VR AND MOCAP

“We [Vicon and Dreamscape Immersive] are together mapping out just how far markerless mocap can go in providing a more true-to-life adventure than any other immersive VR experience by allowing for more free-flowing movement and exploration with even less user gear,” Vicon’s Finch comments. “Dreamscape has said it has long awaited the time when markerless could break from concept and into product, where the technology could support the precision required to realize its amazing potential. We’re testing that potential together now.” Finch adds, “Seeing people’s initial reactions to VR when they’re fully immersed is remarkable. The fantasy-reality line blurs the more freedom you have in a VR space, which is reduced when a user is tethered and they feel the pull of the cable or know they’re wearing a backpack.” He continues, “There’s also the customer experience element that’s a central driver in all of this. People’s experience with markerless is a big wow moment. Markerless is going to lead to more magic – more wow.”

    Lazzaro explains, “Mocap is used in all sorts of VR and AR applications. Typically, home systems use what is called inside-out tracking to have a head-mounted display [HMD] track the world around a user. This works great for HMD and controller tracking, but can’t be used to see other people wearing HMDs. OptiTrack uses an approach called outside-in tracking where we track the HMD, controllers and props using external cameras. This allows users to build location-based VR experiences in which multiple people can go through an experience together or engineers can work on designs in VR as a group.”
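At the heart of the outside-in approach Lazzaro describes is triangulation: each external camera turns a detected marker into a ray, and the marker’s 3D position is the point nearest to all of those rays. The sketch below shows the minimal two-camera case using the standard midpoint method; a real system fuses many cameras and solves whole rigid bodies, not single points.

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    """Midpoint method: the 3D point nearest to two camera rays
    (origin o, direction d). The minimal case of what an outside-in
    tracker does per marker across many cameras."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Minimize |(o1 + t1*d1) - (o2 + t2*d2)| over t1, t2.
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return ((o1 + t1 * d1) + (o2 + t2 * d2)) / 2

# Two wall-mounted cameras observing a marker near head height.
cam1, cam2 = np.array([-3.0, 2.5, 0.0]), np.array([3.0, 2.5, 0.0])
marker = np.array([0.0, 1.7, 2.0])
print(triangulate(cam1, marker - cam1, cam2, marker - cam2))  # recovers marker
```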

    OUTLOOK

“We think these markets [motion capture, performance capture and volumetric video] will all be changed with the continued increase in accessibility,” comments StretchSense’s O’Brien. “You can now do full-body mocap for less than the cost of a new iPhone, and basic volumetric capture can now be had for free on that same iPhone. This means different things for different markets: At a major AAA studio, you are going to see mocap happening on all of the people all of the time, and also on more ambitious projects that have more animated content than ever before. For independent creators, the financial costs of getting into mocap are dropping away so more people can join the space. Finally, there are millions of streamers worldwide who are getting new ways to connect with their community and make money while doing so by stepping into virtual worlds.”

    “Mocap has a bright future in a variety of markets,” OptiTrack’s Lazzaro says. “This includes but is not limited to movies, video games, medical applications, robotics, measurement and VR. Mocap techniques are also becoming more commonplace with V-Tubers and other prosumer applications.”

  • SEIZING THE OPPORTUNITY TO VISUALIZE THE 3 BODY PROBLEM April 15,2024

    By TREVOR HOGG

    Images courtesy of Netflix.

    A major visual effects undertaking was constructing the environment and crowd at Tsinghua University watching the torture of intellectuals during the Chinese Cultural Revolution.

A computational conundrum occurs when three celestial bodies mutually influence one another’s gravitational pull. This serves as the premise for the science fiction series 3 Body Problem, based on the novels by Liu Cixin, where an alien race living on an environmentally unstable planet caught between a trio of suns sets in motion a plan to invade Earth with the assistance of human conspirators. Adapting the novels for Netflix are Game of Thrones duo David Benioff and D.B. Weiss, along with True Blood veteran Alexander Woo. The first season of 3 Body Problem encompasses eight episodes that feature major visual effects spanning environment builds, a multi-dimensional supercomputer compressed into a proton, a sliced and diced oil tanker, characters being rehydrated/dehydrated and a virtual reality game that literally feels real. The epic scope of the project required the creation of 2,000 shots by Scanline VFX, Pixomondo, BUF, Image Engine, Screen Scene and El Ranchito. An in-house team took care of additional cleanups, which ranged from a character blinking too much to painting out unwanted background elements.
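The conundrum is computational because the three-body problem has no general closed-form solution: the trajectories must be stepped numerically, and tiny differences in the starting state diverge chaotically. A minimal planar integrator, with unit masses and G = 1 purely for illustration:

```python
import numpy as np

def accelerations(pos):
    """Newtonian gravity among three unit-mass bodies (G = 1)."""
    acc = np.zeros_like(pos)
    for i in range(3):
        for j in range(3):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += r / np.linalg.norm(r) ** 3
    return acc

pos = np.array([[-1.0, 0.0], [1.0, 0.0], [0.0, 0.5]])  # arbitrary start
vel = np.array([[0.0, -0.3], [0.0, 0.3], [0.3, 0.0]])
dt = 1e-3
for _ in range(10_000):        # leapfrog (kick-drift-kick) time steps
    vel += 0.5 * dt * accelerations(pos)
    pos += dt * vel
    vel += 0.5 * dt * accelerations(pos)
print(pos)  # nudge the starting state and this answer changes wildly
```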

    Previs was an indispensable tool. “It’s a complete game-changer being able to do everything in Unreal Engine,” Visual Effects Supervisor Stefen Fangmeier states. “We did nearly no storyboarding. It was essentially camerawork. The funny thing was they were trying to get me to use a camera controller, and I said, ‘No. I’m a curve guy.’ I set a keyframe here and a keyframe there and interpolate. I even reanimated characters, which you can do in Unreal Engine in the most elegant way. You can take a couple of big performances and mix them together; it’s a fantastic tool. We worked with NVIZ in London who would prep all of these scenes, do the animation, then I would go shoot and light it; that was a great joy for me, being interactive. What was so interesting about 3 Body Problem was there is an incredible variety of work.”
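Fangmeier’s “curve guy” approach, setting a keyframe here and a keyframe there and interpolating between them, is the basic machinery of any animation curve. A minimal sketch of the idea, using a smoothstep ease as a stand-in for Unreal Engine’s actual curve tangents:

```python
def ease(t):
    """Smoothstep: zero velocity at both keys, similar in spirit to a
    default auto-tangent on an animation curve."""
    return t * t * (3 - 2 * t)

def evaluate(keys, time):
    """keys: sorted (time, value) pairs; eases between the two
    surrounding keyframes, clamping outside the keyed range."""
    if time <= keys[0][0]:
        return keys[0][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= time <= t1:
            u = (time - t0) / (t1 - t0)
            return v0 + (v1 - v0) * ease(u)
    return keys[-1][1]

# A camera dolly keyed at frame 0 (0 m) and frame 48 (5 m).
keys = [(0, 0.0), (48, 5.0)]
print([round(evaluate(keys, f), 2) for f in (0, 12, 24, 36, 48)])
```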

    Vedette Lim as Vera Ye in one of the many environments given the desired scope and vastness through digital set extensions.

    A unique cinematic moment involves an oil tanker being sliced by nanowires as part of an elaborate trap to capture a hard drive belonging to a cult that supports the San-Ti invading Earth. “People get sliced every 50 cm, which we did mostly with digital doubles and a few practically built hallways and interior buildings. When you slice something that heavy vertically at 50 cm increments, the weight of what’s above it keeps it in place until the bow hits the shoreline. The dish on top of it collapses into the Panama Canal, which we created as a full CG environment,” Fangmeier states.

Opening the series is a massive crowd gathering at Tsinghua University during the Chinese Cultural Revolution to watch the torture of intellectuals, and because of the controversial nature of the subject matter shooting in Beijing was not an option. “Ultimately, we built the environment from photography and then took some liberties,” Visual Effects Producer Steve Kullback describes. “We wanted it to be realistic, but how big is the quad? What did the buildings actually look like? I don’t think anybody is tracking it quite that precisely, but what we ended up with is having 100,000 screaming students in front of us, and that was all shot quite virtually with a stage set that was built out and extended. It was an array of bluescreens on Manitous that were set up to move around and reposition behind 150 extras.” Crowd tiling was minimal. “We did one shot, which was a poor man’s motion control. The director wanted a shot where the camera is pushing out towards the stage over the crowd, so what we did was start in the foreground pushing over it, repeat the move pushing over it and move everyone up. We put the pieces together, and it worked quite well. We didn’t have a motion control crane, just a 50-foot Technocrane and a good team that was able to repeat their moves nicely,” Kullback says.

    Bai Mulin (Yang Hewen) sits alongside Young Ye Wenjie (Zine Tseng) who makes a fateful first contact with the San-Ti, which sets their invasion plans in motion.

    A radar dish test at Red Coast Base kills a flock of birds that were entirely CG.

    Sophon (Sea Shimooka) is an avatar in a VR game created by the San-Ti to illustrate the destructive environmental impact of living next to three suns.

    The reflective quality of the VR headset meant that extensive photogrammetry had to be taken so each set piece could be reconstructed digitally.

    One of the major environments simulated in the VR game is the observation deck of the Pleasure Dome constructed by Kublai Khan.

    Another key environment build was the Red Coast Base where astrophysics prodigy Ye Wenjie makes first contact with the San-Ti in the 1960s, which sparks an invasion conspiracy. “For Red Coast Base, we had part of an observation base in Spain that was on a mountaintop, and it was a windy day with no rain, so we had some nice sunsets and great clouds,” Visual Effects Supervisor Rainer Gombos remarks. “Some of the buildings didn’t match what we wanted, and the main building was missing the large radar dish. We only had the base built for that. We had some concepts from the art department for how the extensions should work, and then we did additional concept work once we had the specific shots and knew how the sequence would play out.” The years leading up to the present day have not been kind to the Chinese national defense facility. “The roofs have collapsed, so we had to design that. It had to look like winter and cold when it was actually a hot spring day with lots of insects flying around, which had to be painted out. There is a sequence where the radar dish is being used for some test, and birds are flying from the forest and get confused by what is happening, fly close to the dish and die. There were a lot of full CG shots there and CG birds that had to be added. Also, one of the characters revisits the base to commit suicide, so we had to introduce a digital cliff that allowed her to walk up to the side of the dish and look over,” Gombos adds.

    30 million Mongol soldiers appear in front of the Pleasure Dome before being lifted into the air because of the gravitational pull of the three suns.

Simulating what life is like on Trisolaris is a virtual reality experience developed by the San-Ti that demonstrates the global catastrophes caused by living in close proximity to three suns. “It was described as a simple arid desert landscape,” Fangmeier explains. “The more unique aspect of that was a certain lighting change. One sun, small and in the distance, was rising, and then suddenly that goes away and it’s night again. Having the light on the actors move that quickly was tricky to achieve on set. We decided along with Jonathan Freeman, the DP for Episodes 101 and 102, to shoot that in an LED stage with a bunch of sand on the ground where we could animate hot spots and the colors of the panels even though we were going to replace all of that in CG.” Being in the realm of VR meant that the destruction could be fantastical, such as 30 million Mongol soldiers being lifted in the air because gravity no longer exists, or witnessing the entire landscape engulfed by a sea of lava. Fangmeier explains, “Then, we have some pseudoscience, like going inside of a particle accelerator. The San-Ti have sent these two supercomputers the size of a proton to stop the progress of human technology, so when they arrive 400 years later [Trisolaris is over three light years from Earth], we won’t be able to easily destroy their fleet. The proton [referred to as a sophon] unfolds into this giant two-dimensional sphere that then gets etched with computer circuitry. We talked a lot about going from 10 dimensions down to two and then going back to a 10-dimensional object. It’s stuff where you go, ‘That’s what it said in the book and script. But how do you visualize that?’”

    The VR game created by the San-Ti is so sophisticated that it stimulates the five senses of users such as Jin Cheng (Jess Hong).

    The VR game setting allowed for a more hyper-real visual language and the ability to defy physics, like when Sophon (Sea Shimooka) talks with Jin Cheng (Jess Hong) and Jack Rooney (John Bradley) in Episode 103.

    The Follower (Eve Ridley) and Sophon (Sea Shimooka) are San-Ti appearing in human form to make it easier for VR users from Earth to relate to them.

    Eiza González portrays Auggie Salazar, a member of the Oxford Five, which attempts to foil the invasion plans of the San-Ti.

    Cinematographer Jonathan Freeman made use of complex and specific lighting panels for the VR setting shots to emulate what it would be like surrounded by three suns.

    To preserve their species until the chaotic era gives way to a stable one, the San-Ti have a specific methodology that involves dehydrating and rehydrating their bodies. “It happens in two places and provided us with unique challenges and creative opportunities,” Kullback observes. “The first time we see it is when the rolled-up dehydrated bodies are being tossed into the water by the army to bring our characters back to life. The rolled-up bodies that get rehydrated were a prop that was designed by the prosthetics artists and looked quite beautiful. We go underwater and see the roll land and begin to unfold. The camera is below it and the sun is above the water, so you have these beautiful caustics and an opportunity for all kinds of subsurface scattering and light effects that make the image magical and ethereal and support the birthing process that it’s meant to represent. At the end of the experience, you have a beautiful nude woman who comes to the surface. Then, you find there are other nude folks who have been rebirthed. We shot in a tank at Pinewood to have the underwater shots and the shots of the woman, who is the final realization of this rebirthing. For the elements of the roll landing in the water, we did shoot one for real, but ultimately that was CG. Then the environment above the surface was fully CG. But then you go to the virtual reality game where Jin Cheng is walking with the Emperor and the Follower, and a chaotic era suddenly comes upon us, and there is no room to hide behind a rock from the immense forces of the sun getting ready to melt everybody. The Follower lies down on the ground in a vast desert with the pyramid off in the distance and has to dehydrate. That one presented a bit more of a challenge because you didn’t have the opportunity to travel around her and have these beautiful caustics. We heavily researched the footage of things dehydrating, like fruit left in the sun rotting, to try to get a look that was like how the body would deflate when it was completely sapped of water.”

    Being able to digitally reconstruct sets and locations was made even more important by having a highly reflective VR headset. “The reflective headset required some photogrammetry type work while you were shooting because it was often in smaller places, and there’s some crew, all of the lighting equipment, and everything is dressed in one direction,” Gombos remarks. “You had to capture that three-dimensionally because as production turned around, you needed it for the paint-out from the other direction. We had HDRI panorama photography of that, but then we also had good spatial information about the room and how that would connect to the shot lighting we would do. We wanted to be precise, and on top of that, we often did a special reconstruction shoot after we were done. I would come in for a few hours and do the photography and LiDAR required for locations. These assets were created on the fly, so we had them to review our work but also to send off to the vendors, and they were using them in post. The 3D assets were helpful in quality-controlling the work and a good tool for orienting our teams. I could have this little 3D representation of the set and share and discuss that with the DP or director. I would say, ‘If they are here, it’s going to look like this.’ It wasn’t theoretical but quite precise.”

“One thing that was a bit different for me was that I did a lot of the concept work,” Gombos observes. “I enjoyed doing that for set extensions that then Stefen and the visual effects vendor working with him would execute.” Fangmeier is intrigued by what the viewer reaction will be beyond hardcore sci-fi fans of the books. “It’s not your typical sci-fi where you spend a lot of time in outer space or meet aliens, and it’s not an alien invasion per se. It’s the first season, so it’s fairly mellow and highbrow. It deals with concepts other than the stuff that people are usually used to when they watch sci-fi. I’m curious what the mainstream viewer will think about that.”

    There is a core mandate no matter the project for Kullback. “If we are able to help tell the story visually in areas where you can’t photograph something, then that’s our dimension. We’re never creating eye candy for the sake of eye candy. We work hard to have everything that we do fit into the greater whole and to do it in a seamless and attractive way. And, most importantly, in a way that communicates and moves the story forward and realizes the vision of the filmmakers.”

  • SEARIT HULUF BRINGS TOGETHER LIVE-ACTION AND ANIMATION April 15,2024

By TREVOR HOGG

    Images courtesy of Pixar Animation Studios.

    Searit Huluf, Writer and Director of “Self.”

With the release of “Self,” a cautionary tale about the desire to please and be accepted by others, Searit Huluf got an opportunity to showcase her filmmaking talents as part of the Pixar SparkShorts program. The project was partly inspired by her parents trying to adjust to life in America after immigrating from Ethiopia, which, at the time, was ravaged by civil war.

    “My mom and dad separated, so it was just my mom looking after me. I had a lot more independence because she was working a lot. I mainly stayed in the east side of Los Angeles, which became my playground. It wasn’t until I got to UCLA that I started to explore more of Los Angeles, in particular the west side, which felt like being in a different country because everything is so clean, and there were a lot more shops.”

An opportunity presented itself to visit Ethiopia right before the coronavirus pandemic paralyzed international travel. “It was our first mother/daughter trip, and I had forgotten what it was like to be under my mom again,” Huluf recalls. “While in Ethiopia, my mother was cautious because the capital of Addis Ababa is not where my people are from, which is the Tigray region. It wasn’t until we got to Mekelle, where my mom’s side of the family lives, that we got to relax and meet people.” Huluf watched her aunts make coffee called ‘buna’ from scratch. “After roasting the coffee, they take it to everyone to smell to say thanks before grinding. Then you have to hand-grind the roasted coffee with a mortar and pestle. My friends and I made it every day. It was so much fun.”

Participating in sports was not an affordable option growing up, so Huluf consumed a heavy dose of anime consisting of Sailor Moon, Naruto, One Piece and Bleach. In high school she was able to take community college classes in computer coding and engineering through STEM [Science, Technology, Engineering and Mathematics] programming. “I did a website competition inside of which there was a film competition, so I did a live-action short with all of the seniors in my group, and afterward I was like, ‘I want to go to art school.’” The art school in question was the UCLA School of Theater, Film and Television, where she studied screenwriting and stop-motion animation. “I was trying to figure out what is the closest I could get to animation but not have to draw, and it was stop-motion; that was the happy medium because I do love live-action and animation. My schooling was live-action, but a lot of my internships were animation; that’s how I divided it up.”

    Internships included Cartoon Network and DreamWorks Animation, then Pixar came to UCLA. “I kept in contact with the recruiter and started at Pixar as an intern in production management while making films on the side,” Huluf remarks. “I am also big in the employee resource groups within Pixar. I spearheaded the first celebration of Black History Month at Pixar and decided to make a documentary where Black Pixar employees talk about what it is like to be Black in America. The 19th Amendment documentary came about because I cared about people voting for the 2020 elections. It was a way to promote Pixar fans to go out and vote by having Pixar women talk about why they should do it and the complicated history of the 19th Amendment. Documentaries are scary because you go in with what’s there and make the story in the editing room. That was a lot of fun, and I gained more confidence to be a filmmaker, and I switched back to making narrative films.”

    Soul was the first high-profile project at Pixar for Searit Huluf.

    “I got to work with Tippett Studio, which I love! … There’s that Pixar comfort where everybody knows each other or someone adjacent. But these were complete strangers, and there was a big age gap between us. A little bit of me was going, ‘Are they not going to respect me?’ And it was the exact opposite. They were so loving and caring.”

    —Searit Huluf, Writer and Director of “Self”

Critiquing, not writing, is where Huluf excels. “I went to a talk where a writer said that you have to wear different hats when you’re writing. When you’re wearing the writing hat, you’re writing all of your thoughts and ideas. Once you’re done writing, you put on the critique hat, and that’s where you start editing what you wrote. Is this actually good? Is it going to help your story? Is your structure right? You can’t wear both hats at the same time. I think a lot about that when I write. What is also great is that I went to UCLA and did screenwriting. I’m still in touch with all my screenwriting friends, and everyone is still writing. It’s nice to write something and the next week we do a writing session together and talk about the things that we’re writing.” Two individuals stand out for their guidance, she says. “I still keep in touch with my UCLA professor, Kris Young, and am part of the Women in Animation mentorship program; [director] Mark Osborne is my mentor. It’s nice talking with him. He did Kung Fu Panda and The Little Prince. Mark is doing everything I want to do with my life! He’s doing live-action and animation. In this mentorship program, other women are working on their own projects. One Saturday we have it with him and the other Saturday is just us. That has been great.”

“Self” was inspired by Searit Huluf’s desire to gain social acceptance as well as by the struggles her parents faced immigrating to America from Ethiopia.

    “Self” marks the first time since WALL-E that live-action elements have been integrated with computer animation by Pixar.

    Soul afforded Huluf the opportunity to work with one of her role models, writer/director Kemp Powers, who co-directed Soul.

    Spearheading the first celebration of Black History Month at Pixar, Huluf went on to serve as a cultural consultant on Soul.

    Searit Huluf helped to facilitate brainstorming sessions to make sure that there was cultural authenticity to the story, character designs and animation for Soul.

“[Director] Mark [Osborne] is doing everything I want to do with my life! He’s doing live-action and animation. In this mentorship program, other women are working on their own projects. One Saturday we have it with him and the other Saturday is just us. That has been great.”

    —Searit Huluf, Writer and Director of “Self”

Huluf has a support network at Pixar. “Luckily for me, I’m not the first Black shorts director at Pixar. Aphton Corbin made ‘Twenty Something,’ so it’s nice to be able to talk to her about it. Michael Yates did the Win or Lose streaming [series for Disney+], and I keep regular contact with Kemp Powers. It’s nice to talk to people who are in your arena. Personally, too, that’s why I do both live-action and animation, because there’s something about both mediums that gives me motivation and hope.”

    Like Mark Osborne with The Little Prince, Huluf was able to combine computer animation and stop-motion to make “Self,” where the protagonist is a wooden puppet surrounded by environments and metallic characters created digitally. “I got to work with Tippett Studio, which I love! I studied stop-motion at UCLA, so I know what the process looks like, but I have never done it in a professional setting, and I’m not the animator; other people are doing this who have worked on James and the Giant Peach and The Nightmare Before Christmas. There’s that Pixar comfort where everybody knows each other or someone adjacent. But these were complete strangers, and there was a big age gap between us. A little bit of me was going, ‘Are they not going to respect me?’ And it was the exact opposite. They were so loving and caring. I still text with them.”

    “I spearheaded the first celebration of Black History Month at Pixar and decided to make a documentary where Black Pixar employees talk about what it is like to be Black in America. The 19th Amendment documentary came about because I cared about people voting for the 2020 elections. It was a way to promote Pixar fans to go out and vote by having Pixar women talk about why they should do it and the complicated history of the 19th Amendment.”

    —Searit Huluf, Writer and Director of “Self”

Going through various character designs for Self.

    A significant lesson was learned when making “Self.” “I did a lot of my independent films by myself, and this time I had people who are paid and wanted to be involved,” Huluf notes. “Working with the animators was one of the most insightful moments for me. I would film myself and say, ‘How about we do this?’ They would be like, ‘We could do that, but how about this?’ And it was so much better. In the beginning, I was very precious about it and slowly realized, ‘They know what this film is and what needs to be told, too.’ It was a learning curve for me.” The transition to feature directing is more likely to first occur in live-action rather than animation. “That’s primarily because the stakes are higher in animation than a live-action film. This is purely based on budgets.”

    A comparison of Self with one of the female Goldies.

    A personal joy for Huluf was being able to design the costume for Self.

    “When I think about filmmakers I look up to, I see that they start with smaller indie features. Barry Jenkins is a perfect example. Moonlight was only a couple of million dollars, and then he made a higher-ground film If Beale Street Could Talk. I want to start small and slowly build myself up. The big jump for me now is to do a feature. Luckily for me, I’m not too intimidated to do it. It’s more about when someone will give me the chance. I do believe in my ideas and storytelling capabilities. Right now, I’m writing and seeing how things go. I look forward to people watching ‘Self’ and being able to talk to them about it because that’s something new for me.”

    Tippett Studio Senior Art Director and Lead Puppeteer Mark Dubeau explains the puppet design to Searit Huluf.

    The hair of Self was the hardest aspect to get right. It was inspired by the hairstyle of Searit Huluf.

    A dream come true for Huluf was being able to collaborate with Tippett Studio on “Self.”

    Showcasing the detailed eyeballs for the stop-motion puppet crafted by Tippett Studio.

    Pixar SparkShorts Build “Self” Esteem for Emerging Filmmakers

    Treading a path blazed by WALL-E where live-action footage was incorporated into the storytelling, the Pixar SparkShort “Self,” conceived by Searit Huluf, revolves around a wooden stop-motion puppet desperate to be accepted into a society of metallic beings.

“For me, it was, ‘I really want to do stop-motion. I want to visually see something alive onscreen that you can see the handprint of a human touching it,’” Huluf states. “I wanted the story to be the reason it had to be stop-motion.”

    A central theme is the personal cost of gaining social acceptance. “I will play this game in my head of hiding parts of myself so I can conform and be part of the group,” Huluf explains. “That’s how I visualized Self as she literally rips herself apart to be like everyone else. The other aspect is my mom immigrated to America from Ethiopia, and I wanted to talk about how immigrants are usually not seen or heard. I wanted Self to feel like she is Ethiopian, so she has natural wood that has been carved by a masterful craftsman. There is something nice about her being so natural herself but wanting to be something so shiny, plastic and fake. There is something visually beautiful about that. Another layer on top is that she is even animated differently. Self is stop-motion, so she’s animated on 2s and 3s versus the CG Goldies, which are on 1s and are so slick when they move. Self is poppy and jumpy at points when she tries to talk and interact with them.”
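The difference Huluf describes between animating on 1s and on 2s or 3s is simply how often the pose updates: every frame for the CG Goldies, or held across two or three frames for the stop-motion Self. A tiny sketch of that sampling idea, with a stand-in pose curve:

```python
def sample_on_ns(curve, num_frames, step):
    """Evaluate a pose curve but hold each sampled pose for `step`
    frames: step=1 is CG-smooth ('on 1s'); step=2 or 3 gives the
    held, poppy cadence of stop-motion."""
    return [curve((f // step) * step) for f in range(num_frames)]

curve = lambda f: round(f / 24.0, 3)   # stand-in pose parameter per frame
print(sample_on_ns(curve, 8, 1))       # the Goldies: a new pose every frame
print(sample_on_ns(curve, 8, 2))       # Self: each pose held for two frames
```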

    Excitement and fear were felt when working out the logistics for the project. “I was excited about doing something so different and unique, but at the same time I had no idea of how you properly schedule out and manage a stop-motion film,” remarks Eric Rosales, Producer of “Self.” “I was like, ‘Alright, let’s learn this on the fly.’ You’re taking this whole new element and trying to fit pieces into our puzzle and take their puzzle pieces and put them all together.” The other puzzle pieces belonged to Tippett Studio which constructed, animated and shot the stop-motion puppet. Rosales says, “It was a breath of fresh air in the sense that you get to see how other studios approach their scheduling, decision-making and problem-solving. It was exciting for us to learn from them as much as they were learning from us, and learn how to take the different aspects of the stop-motion process and incorporate it into our pipeline. And vice versa, how we would handle something and transfer that information back over to Tippett. We did a lot of back and forth with them and shared a lot of thoughts.”

Complementing and informing the design of the physical puppet was the digital version. “We had a digital puppet that Pixar was able to move around in the computer and act out what they wanted the puppet to do. That informed us in terms of how we needed to build the puppet to be able to effectively move in those ways,” states Mark Dubeau, Senior Art Director and Lead Puppeteer at Tippett Studio. “There is a lot you can do digitally that you can’t do with a puppet, and so we knew we would probably have to build about three or four puppets to be able to do that number of shots.” Nine different faces were constructed to express panic, sadness, happiness and anger.

    For a long time, the digital double of Self was a placeholder for 19 shots that utilized stop-motion animation. “But as things progressed, we turned off our character as she is now being added in the comp,” states Nathan Fariss, Visual Effects Supervisor of “Self.” “The amount of color tweaking and general polish that was happening in comp, and even the color grading steps in post, were much more than any of our other projects because we needed to match a photographic element to our CG world and vice versa.”

    “Self” Producer Eric Rosales and Huluf examine the various pieces that go into making a stop-motion puppet.

    Various body parts and variations had to be created by Tippett Studio to give the stop-motion puppet the correct range of physicality and emotion.

Previs and layout dictated the shot design for the stop-motion scenes. “We had a first lighting pass that was already done even before Tippett started lighting everything up,” Rosales remarks. “We sent members of our lighting team over there to do the last bits of tweaking. Searit acted out every single shot that Tippett was going to do. She did it in her living room by herself. To sell the foot contact, Tippett ended up building a concrete slab out of Styrofoam so we were able to see Self physically walking on top of something.”

Self makes a wish upon a falling star that enables her to exchange wooden body parts with metallic ones. “I usually talk about what the character is feeling at the moment,” Huluf states. “The way we talked about that scene of her jumping off of the roof, I wanted to show how she goes from, ‘Oh, cool these body pieces are falling from the sky,’ to slowly becoming more obsessive in finding them. That face is the last piece for her. ‘I’m going to finally belong.’ A lot of people do a lot of crazy things to belong. In Self’s case she’ll rip herself apart to be like everyone. Self jumping off of the roof is the climax of the film because it’s her craziness and obsessiveness all wrapped into one as she falls into darkness. We had a lot of conversations about how she snaps out of it, and for me, your face is who you are. As she steps on her own face, it snaps her back into reality and makes her realize and go, ‘Oh, my God! Why did I do this?’”

The cityscape did not have to be heavily detailed. “We ended up settling on a look that was a flat color or a gradient so it felt like there was a little bit of life in the city and things were lit up,” Fariss reveals. “There were other people present in the buildings, but it didn’t necessarily draw the audience into the lives that are going on in the buildings around there. The cities were mostly hand-built. There wasn’t enough scope to warrant going a procedural route to put the cities together, so they were hand-dressed, and there was a lot of shot-by-shot scooting some buildings around to get a more pleasing composition.”

More problematic was getting the hair right for the puppet. States Dubeau, “Once we figured out what urethane to use, we did all of the hair. However, we found out it was too heavy for the head. We had to go back and make two pieces of hair that go down and frame either side of her face. Those were made out of that material and painted. We hollow-cast the ones on the back, which had a wire that went into the head, and then you could move those pieces around, but you couldn’t bend them. The ones in front could swing and twist. It totally worked. Now you got the sense of this light, fluffy hair that was bouncing around on her head.”

    “Self” was an educational experience. “One of the things that we learned from Lisa Cooke [Stop-Motion Producer] at Tippett is you end up saving your time in your shot production,” Rosales notes. “It’s all of the pre-production and building where you’re going to spend the bulk of your money. There was a lesson in patience for us because with CG we can take everything up to the last minute and say, ‘I want to make this or that change.’ But here we needed to zero in and know what we’ve got going on. Once the animators get their hands on the puppet and start doing the shots, the first couple of shots take a little bit of time. After that handful of shots, they get a feel for the character, movement and puppet, and it starts moving quickly. Then we were able to get our team on, and they were able to start learning their cadence as well. It started becoming a nice little machine that we were putting together.”

Huluf appreciated the collaborative spirit that made the stop-motion short possible. “I’m approving things at Tippett and going back to Pixar to approve all of the CG shots multiple times a week. We had a lot of people who were big fans of ‘Self’ and helped us while they were on other shows or even on vacation or working on the weekend because they were so passionate. I’m grateful that Jim Morris [President of Pixar] let me have this opportunity to make a stop-motion film, which has never been done before at Pixar.”

    Trevor Hogg
