Join artists from Sony Pictures Imageworks and Sony Pictures Animation for an exclusive behind-the-scenes presentation of "The Mitchells vs. The Machines." This production session will focus on the artistic and technological challenges of creating a visual style that has never been seen before, through the lens of the story of an everyday family's struggle to relate while technology rises up around the world!
When Katie Mitchell, a creative outsider, is accepted into the film school of her dreams, her plans to meet "her people" at college are upended when her nature-loving dad Rick determines the whole family should drive Katie to school together and bond as a family one last time. Katie and Rick are joined by the rest of the family, including Katie's wildly positive mom Linda, her quirky little brother Aaron, and the family's delightfully chubby pug Monchi for the ultimate family road trip. Suddenly, the Mitchells' plans are interrupted by a tech uprising: all around the world, the electronic devices people love -- from phones, to appliances, to an innovative new line of personal robots -- decide it's time to take over. With the help of two friendly malfunctioning robots, the Mitchells will have to get past their problems and work together to save each other and the world.
From director Mike Rianda (Gravity Falls) and producers Phil Lord and Christopher Miller (Spider-Man: Into the Spider-Verse), "The Mitchells vs. The Machines" is coming to Netflix in 2021.
In the Fall of 2019, Lifelike & Believable Animation Design, in partnership with Animatrik Film Design, began a multi-year collaboration with world-renowned circus arts collective, Les 7 Doigts (The 7 Fingers), to explore the combination of real-time motion capture, rendering and projection with traditional circus disciplines to create unique theatrical performances simultaneously presented before live theatre audiences and remotely connected VR participants. This collaboration came to be known as "The LiViCi Series" (for Live Virtual Circus).
Shocap Entertainment was born in April 2020 out of this partnership, and in 2021, Shocap and Les 7 Doigts announced plans to present "The LiViCi Series", a hybrid livestream / immersive performance series combining live music with death-defying acrobatics in cinematic virtual environments.
In this session, presented from Animatrik's performance capture studio in Burnaby, British Columbia, just outside of Vancouver, Shocap Executive and Creative Director, Athomas Goldberg, along with Les 7 Doigts Artistic Director, Samuel Tetreault, will discuss the unique technical challenges and creative opportunities that come with capturing the breathtakingly dynamic motion of trained circus performers in real-time and translating that into a poetic and life-affirming digital expression.
In a series of discussions and live demonstrations, we will take you through the live virtual performance production process, including a discussion of the tools, techniques and best practices used to bring this project to life, and give you a glimpse into what we have in store for the future of live physical performance that bridges the real and virtual worlds.
What are the mysterious Druun? How can enemies from different parts of a vast world learn to trust each other again? In Walt Disney Animation Studios' "Raya and the Last Dragon," our protagonist first breaks the world and then leads us on a journey in search of a magical solution to heal it.
In this session, our filmmakers show how they collaboratively crafted the extensive lands of Kumandra and filled them with distinctive characters and detail that represent a compelling mix of fantasy and Southeast Asian influences. We'll reveal the challenge of creating a brood of magical water dragons that rippled through multiple departments. Creating a nemesis for the dragons unlike anything seen before was another task for our effects and lighting departments. All shot through the lens of a story-enhancing cinematography plan that considered every detail right down to the dramatic film grain, all while working from home.
Please unite with us as we detail our collaborative filmmaking process.
The production of Soul faced many immense challenges, among them building a dialogue across all elements of the pipeline to help ensure an authentic approach to portraying a culturally specific story onscreen. In this production session, hear from a cross section of people who worked as artists on the film, along with those who participated in the "cultural trust" which helped guide many of the artistic choices along the way.
Please join Marvel Studios in presenting our first ever episodic streaming series. The teams that worked on WandaVision, The Falcon and the Winter Soldier, and Loki will take SIGGRAPH audiences through their VFX journeys as they discuss some of their shows' most innovative visual effects.
WandaVision - Wanda and Vision are two super-powered beings trying to fit in to their ideal suburban lives, but amidst the hijinks, they begin to suspect that everything is not as it seems. Marvel Studios, MARZ, and Rodeo FX discuss developing the various sitcom looks and Wanda's magic.
The Falcon and the Winter Soldier - After the events of Avengers: Endgame, Sam Wilson (Falcon) and Bucky Barnes (Winter Soldier) team up to go on a globe-trotting adventure in pursuit of a new foe, testing both their abilities and their patience with one another. Marvel Studios, Weta Digital, and Sony Pictures Imageworks discuss how they took the visual spectacle to new heights with Falcon's flying effects, and created the complex CG environments.
Loki - Picking up immediately after Loki steals the Tesseract in Avengers: Endgame, he finds himself called before the Time Variance Authority and given a choice: face deletion from reality as we know it or assist them in catching an even greater threat. Marvel Studios and ILM discuss designing some of the show's most mind-bending effects.
In this session, the team behind Bonfire, the Immersive Best-in-Show winner at SIGGRAPH 2019, reveals creative and technical insights from their two recent award-winning VR projects: Baba Yaga and Namoo.
Baba Yaga: Inspired by one of the most distinctive and well-known characters from Eastern European folklore, Baobab Studios' interactive VR story Baba Yaga re-imagines this ancient fairytale with themes of environmental conservation and female empowerment (the project features an all-female, diverse cast with Kate Winslet, Daisy Ridley, Jennifer Hudson, and Glenn Close). Baba Yaga is Baobab's most ambitious interactive experience to date, the culmination of all of its previous narrative experiments with intelligent AI characters, real-time responsive environments, and emergent branching storytelling, all while pushing the boundaries of what it means to tell stories in immersive animation. The creative team will explore the following areas of innovation (and more) on this project: How do they make you, the audience, a main character whose choices really matter and have meaningful consequences? How did they create a fairytale universe that is fully interactive, with real-time AI-driven characters and environments? How did they employ a theatrical art style for VR that combines theatrical lighting, stage-craft design elements, and a hand-crafted feel, all running in real-time on a mobile headset? How did they layer spatialized sound and music into their process to recreate the mythical world of Baba Yaga?
The Baba Yaga speakers are Eric Darnell, writer/director and co-founder of Baobab Studios; Nathaniel Dirksen, visual effects supervisor; Amy Tucker, lighting supervisor; Larry Cutler, executive producer; and Scot Stafford, sound supervisor.
Namoo: Namoo (meaning "tree" in Korean) is a narrative poem come to life as an animated VR experience created entirely with Oculus's VR animation tool "Quill." The project is led by esteemed Korean director Erick Oh (who won Annecy's Cristal Award for TV for The Dam Keeper Poems with Tonko House) in partnership with Baobab Studios. The entire piece takes place on a grassy knoll next to a seed that grows into a sapling and eventually a fully mature tree. This namoo might be interpreted as a kind of metaphor for a man's life, as it collects his meaningful memories in its branches -- from pacifiers and stuffed animals to books, typewriters, and favorite scarves, to broken glasses and objects from times he'd sooner forget. Namoo is a deeply personal yet surprisingly universal piece that will undoubtedly resonate with each viewer differently. The Namoo team will dive into all aspects of the VR filmmaking used to bring this visually rich film to life with Quill, from storyboarding to visual development to camera and staging to animation to optimizations for rendering on the Oculus Quest mobile headset.
The Namoo speakers are Erick Oh, writer/director; Anika Nagpal, production manager; Eusong Lee, art director; and Nick Ladd, lead Quill artist.
Godzilla vs. Kong is a classic hero's journey that follows Kong on his quest to reconcile his past and defend his future. In this talk, VFX Supervisor Kevin Smith teams up with Animation Supervisor Dave Clayton to discuss the evolution of Kong's personality and the creation of some of the amazing worlds he explores.
Weta Digital worked hand-in-hand with filmmakers to imagine and visualise shots of Kong's story moments well in advance, including Kong's hilarious 'morning routine'. Animation beats, editing style, and pacing were heavily explored from the early stages.
Weta was responsible for Hollow Earth and the spectacular effects created by the ships that travel there in pursuit of Kong.
Weta assumed responsibility for building thirteen hero environments, some of which sprawled hundreds of kilometres. One features a physical impossibility, with the horizon meeting an entirely new landscape rather than the sky. Another is in Antarctica, where wild weather whips through the scenes, coating sets and characters in layers of snow. Kevin and Dave will explain how they worked out a system of blocking out action beats and layouts as a single big choreographed movement independent of shot cuts. This approach gave filmmakers flexibility and freedom to get their action locked down in unison with FX and layout and edit the film as if it were a live-action sporting event, capturing a moment in time and presenting it to an audience with maximum drama.
ILM and its partner visual effects companies take you behind the scenes of the second season of Lucasfilm's hit Disney+ series "The Mandalorian." The team will discuss an array of the 5,000 visual effects shots created by the global team, from sand dragons to ice spiders to miniatures, as well as advancements made in the virtual production arena. The panelists will discuss ILM StageCraft 2.0 as well as Helios, ILM's groundbreaking new real-time render engine, implemented for the first time this season.
Derived from a novel called "The Good Shepherd" by C.S. Forester, Greyhound was adapted for the screen by Tom Hanks. The story follows the experience of U.S. Naval Commander Ernest Krause (played by Hanks) as he escorts a convoy across the Mid-Atlantic Gap in 1942.
Visualizing this story brought complex but exciting challenges to the DNEG team. We had to deliver elaborate shots of WWII ships and submarines, produce ocean sims, enhance limited live-action set photography and craft the North Atlantic around it - all while coming onto the film after principal photography had finished.
Once awarded to DNEG, Greyhound's VFX work came with an immediately tight timeframe. An early objective was to produce shot content that genuinely assisted the filmmakers in making confident narrative and editorial decisions. From our side at DNEG, this meant implementing efficient workflows that allowed a shot in layout to be passed quickly through the pipeline, all the way to producing a client deliverable with minimal repetitive human input. Although we were working with a short post-production period, we were able to allow the filmmakers a surprising amount of latitude by letting them view the impact of their decisions, not only as cheap low-fi renders but as lit, rendered and composited sequences.
Authenticity is key for a show like this. A minute-by-minute account, delivered as radar plans, was issued by production under the remit of a naval professional. This determined the weather conditions, Beaufort sea state, time of day, ship orientation and course per shot. We mocked up all of these plans in one master layout scene, which gave us a shot-by-shot lighting timeline for the entire show.
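To make the idea of a master lighting timeline concrete, here is a minimal sketch in Python. The ShotPlan fields, shot names, and values are hypothetical illustrations of the kind of per-shot data such plans would carry, not DNEG's actual production schema or tooling:

```python
from dataclasses import dataclass

@dataclass
class ShotPlan:
    shot: str           # shot identifier (hypothetical naming)
    story_time: float   # hour of day on the 24-hour story timeline
    beaufort: int       # sea state on the Beaufort scale
    heading_deg: float  # ship course for the shot

def build_lighting_timeline(plans):
    """Order per-shot plans by story time, giving lighting a single
    shot-by-shot timeline for the whole show."""
    return sorted(plans, key=lambda p: p.story_time)

plans = [
    ShotPlan("GH_0420", 22.5, 6, 310.0),
    ShotPlan("GH_0110", 6.0, 4, 285.0),
    ShotPlan("GH_0230", 13.25, 5, 290.0),
]
timeline = build_lighting_timeline(plans)
print([p.shot for p in timeline])  # prints ['GH_0110', 'GH_0230', 'GH_0420']
```

Sorting by story time rather than shooting or edit order is what lets lighting reason about continuity of sun position and weather across the whole 24-hour arc.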
The story required the portrayal of different hours (and subsequent weather changes) over a 24-hour period. To achieve this visualization we used a specially built multi-camera rig which enabled us to shoot 360-degree time-lapses from dawn until dusk in high dynamic range. For night-time battle sequences, we achieved photorealistic looks by using subtle moonlight, fire, flares, and gun-muzzle-flashes as light sources.
The story of Greyhound follows Allied convoys as they cross an oceanic zone known as the 'Black Pit', so called because it lies out of range of protective air cover. Because of this, the digital recreation of the North Atlantic gap was critical to visualizing the story.
The ocean itself is also critical in affecting how ships would move and react to changes in the weather. Using the Beaufort scale, DNEG was able to match the sea conditions to the film's story beats. This in turn drove the animation of the ships, the water simulations and the look of the shots. In Greyhound, every shot that featured the North Atlantic as a background was a digital replacement.
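A minimal sketch of how a per-shot Beaufort rating could be translated into rough ocean-sim inputs. The wind-speed ranges and wave heights follow the standard Beaufort scale, but the table, parameter names, and the sim_params helper are invented for illustration and are not DNEG's tooling:

```python
# Approximate Beaufort scale entries: wind speed range in knots and
# typical significant wave height in metres (illustrative subset).
BEAUFORT = {
    3: {"wind_kn": (7, 10),  "wave_m": 0.6},
    4: {"wind_kn": (11, 16), "wave_m": 1.0},
    5: {"wind_kn": (17, 21), "wave_m": 2.0},
    6: {"wind_kn": (22, 27), "wave_m": 3.0},
    7: {"wind_kn": (28, 33), "wave_m": 4.0},
}

def sim_params(beaufort: int) -> dict:
    """Translate a story beat's Beaufort rating into rough simulation
    inputs: wave height and a mid-range wind speed (hypothetical keys)."""
    entry = BEAUFORT[beaufort]
    lo, hi = entry["wind_kn"]
    return {"wave_height_m": entry["wave_m"],
            "wind_speed_kn": (lo + hi) / 2}

print(sim_params(6))  # prints {'wave_height_m': 3.0, 'wind_speed_kn': 24.5}
```

Keying everything off a single Beaufort number per shot is what lets the same story beat consistently drive ship animation, water simulation, and the look of the shot.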