How Avatar Has Changed the Game

Performance Capture

James Cameron’s Avatar pushed the boundaries of performance capture and motion capture techniques. Weta VFX built a more comprehensive facial solver around FACS (the Facial Action Coding System) to translate the actors’ captured performances more accurately onto the facial rig. They also shrank the capture footprint, using multiple cameras that were lighter, more mobile, and higher in resolution. For the sequel, Weta advanced its facial capture system, driving two CG puppets per performance and simulating the fibers of the eyes for the first time. They also cracked underwater performance capture with a revolutionary system that combines underwater filming with hundreds of cameras and markers in a 900,000-gallon tank. With these technological advancements, James Cameron is leading the way into a new era of filmmaking with Avatar 2.
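To make the facial-solver idea concrete, here is a minimal Python sketch of how captured FACS-style action-unit activations might be mapped to facial-rig control weights. The action-unit names, rig controls, and mapping matrix are illustrative placeholders, not Weta's actual solver or data.

```python
# Minimal sketch of a FACS-style facial solver: captured action-unit (AU)
# activations are mapped to rig control weights. All names and numbers
# below are invented for illustration.
import numpy as np

# Hypothetical capture frame: activation of a few FACS action units (0..1).
au_activations = {
    "AU1_inner_brow_raiser": 0.8,
    "AU12_lip_corner_puller": 0.6,
    "AU26_jaw_drop": 0.3,
}

# Hypothetical linear map from action units to facial-rig controls.
# Rows = rig controls, columns = action units (in insertion order above).
rig_controls = ["brow_up_L", "brow_up_R", "smile_L", "smile_R", "jaw_open"]
au_order = list(au_activations)
M = np.array([
    [0.9, 0.0, 0.0],   # brow_up_L driven mostly by AU1
    [0.9, 0.0, 0.0],   # brow_up_R
    [0.0, 1.0, 0.0],   # smile_L driven by AU12
    [0.0, 1.0, 0.0],   # smile_R
    [0.0, 0.1, 1.0],   # jaw_open driven by AU26, with a little AU12
])

x = np.array([au_activations[a] for a in au_order])
weights = np.clip(M @ x, 0.0, 1.0)  # keep rig controls in a valid range

for name, w in zip(rig_controls, weights):
    print(f"{name}: {w:.2f}")
```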

Animation

James Cameron and Weta VFX are also changing the game in character and environment creation. For the Na’vi and the other creatures, Weta developed a new facial rig in Maya with a blend-shape system that used facial muscles as the basis for its controls, along with improved skin texturing. They also built a virtual workflow for the lush jungles of Pandora and hand-painted everything so the assets would be of the highest quality and uniform in 3D space. For the sequel, Weta has developed new facial animation techniques and an organic tree-growth tool to depict more of Pandora’s jungles and new plant life underwater.
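As a rough illustration of the blend-shape approach described above, here is a minimal Python sketch in which each muscle-based shape stores vertex offsets from a neutral mesh, and the posed face is the neutral mesh plus a weighted sum of those offsets. The tiny mesh, shape names, and weights are invented for the example and are not the production rig.

```python
# Minimal blend-shape (morph-target) deformer sketch: final pose =
# neutral mesh + sum of per-shape vertex offsets scaled by their weights.
import numpy as np

neutral = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])          # tiny 3-vertex "face"

# Each shape stores offsets from the neutral mesh, named after a muscle.
shapes = {
    "zygomaticus_smile": np.array([[0.0, 0.1, 0.0],
                                   [0.0, 0.2, 0.0],
                                   [0.0, 0.0, 0.0]]),
    "jaw_open":          np.array([[0.0, -0.3, 0.0],
                                   [0.0, -0.3, 0.0],
                                   [0.0,  0.0, 0.0]]),
}

weights = {"zygomaticus_smile": 0.7, "jaw_open": 0.2}

posed = neutral + sum(w * shapes[name] for name, w in weights.items())
print(posed)
```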

Lighting

Weta VFX developed a global illumination system built on image-based lights converted to spherical harmonics. This allowed all the lighting contributions in a given scene to be precomputed and enabled realistic effects such as subsurface scattering and transmission through the ears and nasal cartilage of the Na’vi. To avoid a plastic look for the Na’vi’s blue skin, Weta studied how light reflects off plants and how the face picks up light from the sky, then applied green bounce light alongside white light to render the faces convincingly. Since then, Weta has continued to develop its lighting tools: PhysLight, which simulates on-set lighting, has been used in recent films such as The Batman and Black Panther: Wakanda Forever.
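To show what converting an image-based light to spherical harmonics can look like in practice, here is a minimal Python sketch that projects a tiny placeholder environment map onto the first nine SH basis functions, so the lighting can be precomputed once and evaluated cheaply per shading direction. This is a generic illustration of the technique, not Weta's renderer; the environment map is a made-up 2x4 grid.

```python
# Project an image-based light onto low-order spherical harmonics (bands 0-2),
# then evaluate the precomputed lighting for a direction with a dot product.
import numpy as np

def sh_basis(d):
    """First 9 real SH basis functions for a unit direction d = (x, y, z)."""
    x, y, z = d
    return np.array([
        0.282095,
        0.488603 * y, 0.488603 * z, 0.488603 * x,
        1.092548 * x * y, 1.092548 * y * z,
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z, 0.546274 * (x * x - y * y),
    ])

# Tiny latitude-longitude environment map: rows = theta, cols = phi (grayscale).
env = np.array([[2.0, 1.5, 0.5, 0.2],
                [0.3, 0.3, 0.2, 0.1]])
h, w = env.shape

# Precompute: numerically integrate the map against each SH basis function.
coeffs = np.zeros(9)
for i in range(h):
    theta = (i + 0.5) / h * np.pi
    for j in range(w):
        phi = (j + 0.5) / w * 2.0 * np.pi
        d = (np.sin(theta) * np.cos(phi),
             np.sin(theta) * np.sin(phi),
             np.cos(theta))
        d_omega = np.sin(theta) * (np.pi / h) * (2.0 * np.pi / w)
        coeffs += env[i, j] * sh_basis(d) * d_omega

# At render time, lighting from a direction is just a cheap dot product.
n = np.array([0.0, 0.0, 1.0])          # a surface normal, for example
radiance = coeffs @ sh_basis(n)
print(radiance)
```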

Rendering

For Avatar, Weta’s supercomputers rendered up to 1.4 million tasks per day using RenderMan, processing 8 GB of data per second, 24 hours a day, for more than a month. Many of the film’s frames took several hours each to render. But however computationally intensive and nightmarish the work was, the harder Weta pushed for photorealism, the more real the result looked.
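Taking the quoted figures at face value, a quick back-of-the-envelope calculation gives a sense of the data volume involved; the 30-day month is an assumption for the sake of the arithmetic.

```python
# Rough check on the quoted render-farm throughput: 8 GB/s sustained,
# 24 hours a day, assumed here to run for 30 days.
gb_per_second = 8
seconds_per_day = 24 * 60 * 60
days = 30

per_day_tb = gb_per_second * seconds_per_day / 1_000   # ~691 TB per day
total_pb = per_day_tb * days / 1_000                   # ~20.7 PB over the month
print(f"{per_day_tb:.0f} TB per day, {total_pb:.1f} PB over {days} days")
```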

Stereoscopic Cinematography and Projection

Finally, there is the revolution in stereoscopic cinematography and projection, led by the Sony CineAlta Venice 3D used on Avatar 2: The Way of Water. The camera consists of two Sony F950 cameras mounted on a special rig with two J-cam optical blocks and was designed to match-move the performance-captured CG characters for compositing in establishing shots. Cameron has also paid special attention to post-production for the best possible viewing experience, recently upgrading the original Avatar for a 4K reissue and using the latest advances in laser projection for a brighter picture. For Avatar 2, the camera will be used for both underwater and flying sequences, and the sequel will be offered in an unprecedented number of formats, including 4K and 3D at a high frame rate of 48 fps. However, Cameron found the hyperrealism of 120 fps too jarring for non-action scenes, so he has alternated between shooting at 48 fps and the traditional standard of 24 fps. In theaters, the projector runs the film at a constant 48 fps; for any part of a scene that needs to play at 24 fps, the same frame is simply shown twice. With this advanced cinematography and post-production, Cameron is leading the way into a new era of filmmaking with Avatar 2.
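As a rough sketch of that mixed-frame-rate delivery, the snippet below shows 24 fps shots having each frame repeated so the whole stream plays back at a constant 48 fps. The function name and frame identifiers are purely illustrative, not part of any real projection system.

```python
# Expand shots to a constant 48 fps stream: 24 fps material has each
# frame shown twice, 48 fps material passes through unchanged.
def to_48fps(frames, native_fps):
    """Return the shot's frames as a constant 48 fps sequence."""
    if native_fps == 48:
        return list(frames)
    if native_fps == 24:
        return [f for frame in frames for f in (frame, frame)]  # double each frame
    raise ValueError("expected a 24 or 48 fps source")

dialogue_shot = ["d1", "d2", "d3"]        # finished at 24 fps
action_shot = ["a1", "a2", "a3", "a4"]    # finished at 48 fps

stream = to_48fps(dialogue_shot, 24) + to_48fps(action_shot, 48)
print(stream)   # ['d1', 'd1', 'd2', 'd2', 'd3', 'd3', 'a1', 'a2', 'a3', 'a4']
```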
