Virtual cinematography

Virtual cinematography is the set of cinematographic techniques performed in a computer graphics environment. It covers a wide range of subjects, such as photographing real objects, often with a stereo or multi-camera setup, in order to recreate them as three-dimensional objects, as well as algorithms for the automated creation of real and simulated camera angles. Virtual cinematography can be used to shoot scenes from otherwise impossible camera angles, create the photography of animated films, and manipulate the appearance of computer-generated effects.

Early stages
An early example of a film integrating a virtual environment is the 1998 film What Dreams May Come, starring Robin Williams. The film's special effects team used actual building blueprints to generate scale wireframe models that were then used to generate the virtual world. The film went on to garner numerous nominations and awards, including the Academy Award for Best Visual Effects and the Art Directors Guild Award for Excellence in Production Design. The term "virtual cinematography" emerged in 1999, when special effects artist John Gaeta and his team coined it to name the new cinematic technologies they had created.

Modern virtual cinematography
The Matrix trilogy (The Matrix, The Matrix Reloaded, and The Matrix Revolutions) used early virtual cinematography techniques to develop the virtual "filming" of realistic computer-generated imagery. The work of John Gaeta and his crew at ESC Entertainment resulted in photo-realistic CGI versions of the performers, sets, and actions. Their work built on Paul Debevec et al.'s findings from 2000 on the acquisition and subsequent simulation of the reflectance field of the human face, acquired using the simplest of light stages. Famous scenes that would have been impossible or exceedingly time-consuming to produce with traditional cinematography include the burly brawl in The Matrix Reloaded (2003), in which Neo fights up to 100 Agent Smiths, and the beginning of the final showdown in The Matrix Revolutions (2003), in which Agent Smith's cheekbone is punched in by Neo, leaving the digital look-alike unharmed.

For The Matrix trilogy, the filmmakers relied heavily on virtual cinematography to attract audiences, though director of photography Bill Pope used the tool in a much more subtle manner. Nonetheless, these scenes reached a high level of realism, making it difficult for the audience to notice that they were actually watching shots created entirely by visual effects artists using 3D computer graphics tools.

In Spider-Man 2 (2004), the filmmakers manipulated the cameras to make the audience feel as if they were swinging through New York City alongside Spider-Man. Using motion-capture camera tracking, the camera operator moved in sync with the displayed animation, letting the audience experience Spider-Man's perspective and heightening the sense of reality. In Avengers: Infinity War (2018), the Titan sequence scenes were created using virtual cinematography. To make the scene more realistic, the producers reshot the entire sequence with a different camera that traveled according to the movement of the Titan. The filmmakers also produced what is known as a synthetic lens flare, making the flare closely resemble the originally produced footage. When the classic animated film The Lion King was remade in 2019, the producers used virtual cinematography to create realistic animation. In the final battle scene between Scar and Simba, the camera operator again moved the camera according to the movements of the characters. The goal of this technology is to further immerse the audience in the scene.

Virtual cinematography in post-production
In post-production, advanced technologies are used to modify, re-direct, and enhance scenes captured on set. Stereo or multi-camera setups photograph real objects in such a way that they can be recreated as 3D objects. Motion capture equipment, such as tracking dots and helmet cameras, can be used on set to facilitate retroactive data collection in post-production.

A machine vision technique called photogrammetry uses 3D scanners to capture 3D geometry. For example, the Arius 3D scanner used for the Matrix sequels could acquire details such as fine wrinkles and skin pores as small as 100 μm.
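At its core, photogrammetry recovers 3D geometry by triangulating the same surface point as seen from multiple calibrated cameras. The following is a minimal sketch of two-view linear (DLT) triangulation; the function name, toy camera matrices, and coordinates are illustrative, not taken from any production pipeline:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen by two cameras.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: 2D image coordinates of the same point in each view.
    Returns the 3D point in world coordinates.
    """
    # Each observation contributes two linear constraints on the
    # homogeneous 3D point X: x * (P[2] @ X) = P[0] @ X, etc.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The point is the null vector of A (smallest singular value).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize

# Two toy cameras: identity pose, and one shifted 1 unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.5, 0.2, 4.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]

X_est = triangulate(P1, P2, x1, x2)  # recovers X_true
```

Real scanners solve this over millions of correspondences per frame and add lens-distortion and bundle-adjustment steps, but the geometric principle is the same.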

Filmmakers have also experimented with multi-camera rigs to capture motion data without any on-set motion capture equipment. For example, a markerless, multi-camera photogrammetric capture technique based on optical flow was used to create digital look-alikes for the Matrix films.
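Optical flow estimates how pixel intensities move between frames, which is what lets a markerless system track a surface without attached dots. Below is a minimal single-patch Lucas-Kanade sketch, purely illustrative and far simpler than the dense, multi-camera flow systems used in production:

```python
import numpy as np

def lucas_kanade(frame0, frame1):
    """Estimate one translational flow vector (dx, dy) between two
    small grayscale patches via the Lucas-Kanade least-squares
    system A v = b, where A holds spatial image gradients."""
    # Spatial gradients (central differences) and temporal difference.
    Iy, Ix = np.gradient(frame0.astype(float))
    It = frame1.astype(float) - frame0.astype(float)
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    # Least-squares solve for the per-patch flow vector.
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v  # (dx, dy)

# Toy example: a brightness ramp shifted right by one pixel.
x = np.arange(16.0)
frame0 = np.tile(x, (16, 1))        # intensity increases along x
frame1 = np.tile(x - 1.0, (16, 1))  # same pattern, shifted by 1 px
dx, dy = lucas_kanade(frame0, frame1)
```

Dense production systems solve a window like this at every pixel across every camera in the rig, then fuse the flows into 3D motion.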

More recently, Martin Scorsese's crime film The Irishman utilized an entirely new facial capture system developed by Industrial Light & Magic (ILM) that used a special rig consisting of two digital cameras positioned on either side of the main camera to capture motion data in real time alongside the main performances. In post-production, this data was used to digitally render computer-generated versions of the actors.

Virtual camera rigs give cinematographers the ability to manipulate a virtual camera within a 3D world and photograph computer-generated 3D models. Once the virtual content has been assembled into a scene within a 3D engine, the images can be creatively composed, relit, and re-photographed from other angles as if the action were happening for the first time. The virtual "filming" of this realistic CGI also allows for physically impossible camera movements, such as the bullet-time scenes in The Matrix.
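Mathematically, a virtual camera is just a world-to-camera transform plus a projection, which is why the same CG scene can be re-photographed from any angle after the fact. A minimal sketch of this idea (the `look_at` and `project` helpers are illustrative names, not any particular engine's API):

```python
import numpy as np

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """World-to-camera rotation and translation for a virtual camera
    at `eye` looking toward `target` (right-handed, -z forward)."""
    f = target - eye
    f = f / np.linalg.norm(f)                    # forward axis
    r = np.cross(f, up); r = r / np.linalg.norm(r)  # right axis
    u = np.cross(r, f)                           # true up axis
    R = np.stack([r, u, -f])                     # camera axes as rows
    t = -R @ eye
    return R, t

def project(point, eye, target, focal=1.0):
    """Re-photograph a 3D point from an arbitrary camera position:
    transform to camera space, then apply a pinhole projection."""
    R, t = look_at(eye, target)
    pc = R @ point + t                 # camera-space coordinates
    return focal * pc[:2] / -pc[2]     # perspective divide (-z forward)

point = np.array([0.5, 0.2, 0.0])      # a vertex on a CG model
# The same subject "photographed" from two virtual camera angles.
front = project(point, eye=np.array([0.0, 0.0, 5.0]), target=np.zeros(3))
side = project(point, eye=np.array([5.0, 0.0, 0.0]), target=np.zeros(3))
```

Because the scene data persists, moving `eye` along any path, even a physically impossible one, simply produces a new sequence of projections, which is the essence of shots like bullet time.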

Virtual cinematography can also be used to build complete virtual worlds from scratch. More advanced motion controllers and tablet interfaces have made such visualization techniques possible within the budget constraints of smaller film productions.

On-set effects
The widespread adoption of visual effects spawned a desire to produce these effects directly on-set in ways that did not detract from the actors' performances. Effects artists began to implement virtual cinematographic techniques on-set, making computer-generated elements of a given shot visible to the actors and cinematographers responsible for capturing it.

Techniques such as real-time rendering, which allows an effect to be created before a scene is filmed rather than inserted digitally afterward, utilize previously unrelated technologies, including video game engines, projectors, and advanced cameras, to fuse conventional cinematography with its virtual counterpart.

The first real-time motion picture effect was developed by Industrial Light & Magic in conjunction with Epic Games, utilizing the Unreal Engine to display the classic Star Wars "light speed" effect for the 2018 film Solo: A Star Wars Story. The technology used for the film, dubbed "Stagecraft" by its creators, was subsequently used by ILM for various Star Wars projects as well as its parent company Disney's 2019 photorealistic animated remake of The Lion King.

Rather than scanning and representing an existing image with virtual cinematographic techniques, real-time effects require minimal extra work in post-production. Shots that include on-set virtual cinematography do not require any of the advanced post-production methods; any remaining effects can be achieved using traditional CGI animation.

Software

 * Autodesk Maya is a 3D computer graphics application that runs on Windows, macOS, and Linux.
 * Autodesk 3ds Max is a professional 3D computer graphics program for making 3D animations, models, games, and images, available for Windows only.
 * Blender is a free and open-source 3D computer graphics suite used for creating animated films, visual effects, art, 3D-printed models, interactive 3D applications, and video games, well suited to DIY virtual cinematographers.
 * Pointstream Software by Arius3D is a professional dense motion capture and optical flow system that uses a pixel and its movement as the unit of tracking, usually over a multi-camera setup.