Previsualization

Previsualization (also known as previsualisation, previs, previz, pre-rendering, preview or wireframe windows) is the practice of visualizing scenes or sequences in a movie before filming. The concept is also used in other creative arts, including animation, performing arts, video game design, and still photography. Previsualization typically describes techniques like storyboarding, which uses hand-drawn or digitally assisted sketches to plan or conceptualize movie scenes.

Description
Previsualization’s advantage is that it allows a director, cinematographer, production supervisor, or VFX supervisor to experiment with different staging and art-direction options, such as lighting, camera placement and movement, stage direction, and editing, without incurring actual production costs. On larger-budget projects, directors may previsualize with actors in the visual effects department or in dedicated rooms. Previsualization can include music, sound effects, and dialogue that closely mimic fully produced and edited sequences. It is usually employed in scenes that involve stunts, special effects (such as chroma key), or complex choreography and cinematography. It is also used in projects that combine production techniques, such as digital video, photography, and animation, notably 3D animation.

Origins
Ansel Adams wrote about visualization in photography, defining it as “the ability to anticipate a finished image before making the exposure.” The term previsualization has been attributed to Minor White, who divided visualization into previsualization, which occurs while studying the subject, and postvisualization, which concerns how the visualized image is rendered in printing. White said visualization was a “psychological concept” he learned from Adams and Edward Weston.

Storyboarding, the earliest planning technique, has been used since the silent picture era. Disney Studios first used the term “storyboard” sometime after 1928, when its typical practice was to present basic action and gags on drawn panels, usually three to six sketches per vertical page. By the 1930s, storyboarding live-action films was common and a regular studio art department task.

Disney Studios also invented the Leica reel process, in which filmed storyboards were edited to the film’s soundtrack; it is the predecessor of modern computer previsualization. Other 1930s prototyping techniques involved miniature sets that were often viewed with a “periscope,” a small optical device with deep depth of field. The director would insert the periscope into the miniature set to explore camera angles. Set designers also used a technique called “camera angle projection” to create perspective scene drawings from plan and elevation blueprints. This allowed the set to be accurately depicted for a lens of a specific focal length and film format.
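The optical relationship underlying camera angle projection can be sketched in a few lines of code. As a rough illustration (the function name and the example film-gate dimensions are mine, not from the historical technique), the angle of view a lens covers follows directly from the film gate size and the focal length:

```python
import math

def angle_of_view(gate_mm: float, focal_length_mm: float) -> float:
    """Angle of view in degrees across one film-gate dimension.

    Uses the standard pinhole-camera relation:
        theta = 2 * atan(gate / (2 * focal_length))
    """
    return math.degrees(2 * math.atan(gate_mm / (2 * focal_length_mm)))

# Illustrative example: a 36 mm-wide gate with a 50 mm lens
# covers roughly a 39.6-degree horizontal angle of view.
print(round(angle_of_view(36, 50), 1))
```

A designer who knows this angle for the production’s chosen lens and format can draw set perspectives that match what the camera will actually record, which is what camera angle projection accomplished on paper.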

With the arrival of cost-effective video cameras and editing equipment in the 1970s, most notably Sony’s ¾-inch U-matic video and editing systems, advertising agencies began to use animatics regularly, both as a sales tool for television commercials and to guide the ad’s actual production. An animatic is a video of a hand-drawn storyboard, with very limited added motion, accompanied by a soundtrack. Like the Leica reel, animatics were primarily used for live-action commercials.

Beginning in the mid-1970s, the first three Star Wars films introduced low-cost pre-planning innovations that refined complex visual effects sequences. George Lucas, working with visual effects artists from the newly established Industrial Light & Magic, used aerial dogfight footage from Hollywood World War II movies as a template for the X-wing space battles in the first Star Wars film. Another innovation, developed by Dennis Muren of Industrial Light & Magic, was shooting video in a miniature set using toy figures attached to hand-manipulated rods to previsualize the speeder bike forest chase in Return of the Jedi. This allowed the film’s producers to see a rough version of the sequence before costly full-scale production started.

Francis Ford Coppola made the most comprehensive and revolutionary use of new technology to plan movie sequences in his 1982 musical feature One From the Heart. He developed the “electronic cinema” process, making the animatic the basis for the entire film and giving himself on-set composing tools to extend his thought processes. The actors read the script dramatically in a “radio-style” recording. Storyboard artists then drew more than 1,800 individual storyboard frames, which were recorded onto analog videodiscs and edited to match the voice recordings.

Once production began, video from the 35 mm cameras shooting the live performances gradually replaced the storyboarded stills, giving Coppola a more complete vision of the film’s progress. Instead of working with the actors on set, Coppola directed from an Airstream trailer nicknamed “Silverfish,” outfitted with then state-of-the-art monitors and video editing equipment. Video feeds from the five stages at Hollywood General Studios ran into the trailer, which also had an off-line editing system, a switcher, a disk-based still store, and Ultimatte keyers. The setup allowed live or taped scenes to be composed from both full-scale and miniature sets.

3D computer graphics were relatively rare until 1993, when Steven Spielberg made Jurassic Park using revolutionary, Oscar-winning visual effects work by Industrial Light & Magic, one of the few companies then able to create such imagery with digital technology. On Jurassic Park, LightWave 3D was used for previsualization, running on an Amiga computer with a Video Toaster card.

On Paramount Pictures’ Mission: Impossible, visual effects supervisor (and Photoshop co-creator) John Knoll asked artist David Dozoretz to create one of the first previsualizations covering an entire sequence of shots rather than a single shot. Producer Rick McCallum showed the sequence to George Lucas, who hired Dozoretz in 1995 to work on the new Star Wars prequels. This was a novel development, marking the first time a previsualization artist reported to a film’s director rather than to its visual effects supervisor.

Since then, previsualization has become an essential tool for large-scale film productions, including The Matrix trilogy, The Lord of the Rings trilogy, Star Wars Episodes II and III, War of the Worlds, and X-Men.

Visual effects companies that specialize in large-project previsualization often use common software packages such as NewTek’s LightWave 3D, Autodesk Maya, MotionBuilder, and Softimage XSI. Because this technology is expensive and complex, some directors prefer general-purpose 3D programs like iClone, Poser, Daz Studio, Vue, and Real3d. Others use dedicated 3D previsualization programs like FrameForge 3D Studio, which, along with Avid’s MotionBuilder, won a Technical Achievement Emmy for representing “an improvement on existing methods [that] are so innovative in nature that they materially have affected the transmission, recording, or reception of television.”

Digital previsualization
Digital previsualization is merely technology applied to the visual plan for a motion picture. Coppola based his new methods on analog video technology, which was soon to be superseded by an even greater technological advance—personal computers and digital media. By the end of the 1980s, the desktop publishing revolution was followed by a similar revolution in film called multimedia (a term borrowed from the 1960s), but soon to be rechristened desktop video.

The first use of 3D computer software to previsualize a scene for a major motion picture was in 1988, by animator Lynda Weinman for Star Trek V: The Final Frontier (1989). The idea was first suggested to Star Trek producer Ralph Winter by Brad deGraf and Michael Wahrman of the VFX facility deGraf/Wahrman. Weinman created primitive 3D motion of the Starship Enterprise using Swivel 3D software, designing shots based on feedback from Winter and director William Shatner.

Another pioneering previsualization effort, this time using gaming technology, was for James Cameron’s The Abyss (1989). Mike Backes, co-founder of the Apple Computing Center at the American Film Institute (AFI), recognized the similarities between the environment of the early 3D game The Colony and the underwater lab in The Abyss, and introduced the game’s creator, David Smith, to Cameron. The concept was to use real-time gaming technology to previsualize camera movement and staging for the movie. While the idea yielded limited results for The Abyss, the effort led Smith to create Virtus Walkthrough, an architectural previsualization program, in 1990. Directors such as Brian De Palma and Sydney Pollack used Virtus Walkthrough for previsualization in the early ’90s.

The first outline of how the personal computer could be used to plan movie sequences appeared in Steven D. Katz’s directing guide Film Directing: Shot by Shot (1991), which detailed specific software for 2D moving storyboards and 3D animated film design, including real-time scene design with Virtus Walkthrough.

While teaching previsualization at the American Film Institute in 1993, Katz suggested to producer Ralph Singleton that a fully animated digital animatic of a seven-minute sequence for the Harrison Ford action movie Clear and Present Danger would solve a variety of production problems that arose when the location in Mexico became unavailable. This was the first fully produced computer previsualization created for a director outside a visual effects department, and solely for determining the dramatic impact and shot flow of a scene. The 3D sets and props were fully textured and built to match the set and location blueprints of production designer Terence Marsh and storyboards approved by director Phillip Noyce. The final digital sequence included every shot in the scene, along with dialogue, sound effects, and a musical score. Virtual cameras accurately predicted the composition achieved by actual camera lenses, as well as the shadow positions for the time of day of the shoot. The sequence was unique at the time in that it included long dramatic passages between virtual actors as well as action shots, in a complete presentation of all aspects of a key scene from the movie. It also signaled the beginning of previsualization as a new category of production apart from the visual effects unit.

In 1994, Colin Green began work on previsualization for Judge Dredd (1995). Green had been part of the Image Engineering department at Ride Film, Douglas Trumbull’s VFX company in the Berkshires of Massachusetts, where he was in charge of using CAD systems to create miniature physical models (rapid prototyping). Judge Dredd required many miniature sets, and Green was hired to oversee a new Image Engineering department. However, Green renamed the department Previsualization and shifted his interest to making 3D animatics. The majority of the previsualization for Judge Dredd was a long chase sequence used as an aid to the visual effects department. In 1995, Green started the first dedicated previsualization company, Pixel Liberation Front.

By the mid-1990s, digital previsualization was becoming an essential tool in the production of large-budget feature films. In 1994, David Dozoretz, working with Photoshop co-creator John Knoll, created digital animatics for the final chase scene of Mission: Impossible (1996). In 1995, when Star Wars prequel producer Rick McCallum saw the Mission: Impossible animatics, he tapped Dozoretz to create animatics for the pod race in Star Wars: Episode I – The Phantom Menace (1999). The previsualization proved so useful that Dozoretz and his team ended up making an average of four to six animatics for every effects shot in the film. Finished dailies replaced sections of the animatic as shooting progressed, and at various points the previsualization included elements as diverse as scanned-in storyboards, CG graphics, motion-capture data, and live action. Dozoretz and previsualization effects supervisor Dan Gregoire went on to do the previsualization for Star Wars: Episode II – Attack of the Clones (2002), and Gregoire finished with the final prequel, Star Wars: Episode III – Revenge of the Sith (2005).

Digital previsualization became affordable in the 2000s with the development of user-friendly digital film-design software available to any filmmaker with a computer. Borrowing technology developed by the video game industry, today’s previsualization software gives filmmakers the ability to compose electronic 2D storyboards on their own personal computers and to create 3D animated sequences that can predict with remarkable accuracy what will appear on screen.

More recently, Hollywood filmmakers have used the term pre-visualization (also known as pre-vis, pre vis, pre viz, pre-viz, previs, or animatics) to describe a technique in which digital technology aids the planning and efficiency of shot creation during the filmmaking process. It involves using computer graphics (often 3D) to create rough versions of the more complex shots in a movie sequence, such as visual effects or stunts. The rough graphics may be edited together with temporary music and even dialogue. Some previs can look like simple grey shapes representing the characters or elements in a scene, while other previs can be sophisticated enough to resemble a modern video game.

Many filmmakers now turn to fast yet optically accurate 3D software for previsualization, both to reduce budget and time constraints and to gain greater control over the creative process by generating the previs themselves.

Previs software
Some popular tools for directors, cinematographers, and VFX supervisors include FrameForge 3D Studio, ShotPro (for iPad and iPhone), Shot Designer, Toonboom Storyboard Pro, Moviestorm, and iClone, among others.