Bullet time

Bullet time (also known as frozen moment, dead time, flow motion or time slice) is a visual effect or visual impression of detaching the time and space of a camera (or viewer) from that of its visible subject. It is a depth-enhanced simulation of variable-speed action and performance found in films, broadcast advertisements, and real-time graphics within video games and other special media. It is characterized by its extreme transformation of both time (slow enough to show normally imperceptible and unfilmable events, such as flying bullets) and of space (by way of the ability of the camera angle—the audience's point of view—to move around the scene at a normal speed while events are slowed). This is almost impossible with conventional slow motion, as the physical camera would have to move implausibly fast; the concept implies that only a "virtual camera", often illustrated within the confines of a computer-generated environment such as a virtual world or virtual reality, would be capable of "filming" bullet-time types of moments. Technical and historical variations of this effect have been referred to as time slicing, view morphing, temps mort (French: "dead time") and virtual cinematography.

The term "bullet time" was first used with reference to the 1999 film The Matrix, and later in reference to the slow motion effects in the 2001 video game Max Payne. In the years since the introduction of the term via the Matrix films, it has become a commonly applied expression in popular culture.

History


The technique of using a group of still cameras to freeze motion predates the invention of cinema itself, with preliminary work by Eadweard Muybridge on chronophotography. In The Horse in Motion (1878), Muybridge analyzed the motion of a galloping horse by using a line of cameras to photograph the animal as it ran past. He placed still cameras along a racetrack, each actuated by a taut string stretched across the track; as the horse galloped past, the camera shutters snapped, taking one frame at a time. Muybridge later assembled the pictures into a rudimentary animation by having them traced onto a glass disk that rotated in a type of magic lantern with a stroboscopic shutter. This zoopraxiscope may have inspired Thomas Edison to explore the idea of motion pictures. In 1878–1879, Muybridge made dozens of studies of foreshortenings of horses and athletes, with five cameras capturing the same moment from different positions. For his studies with the University of Pennsylvania, published as Animal Locomotion (1887), Muybridge also took photos from six angles at the same instant, as well as series of 12 phases from three angles.

A debt may also be owed to MIT professor Harold Edgerton, who, in the 1940s, captured now-iconic photos of bullets using xenon strobe lights to "freeze" motion.

Bullet-time as a concept was frequently developed in cel animation. One of the earliest examples is the shot at the end of the title sequence for the 1966 Japanese anime series Speed Racer: as Speed leaps from the Mach Five, he freezes in mid-jump, and the "camera" performs an arc shot from a frontal view to a side view.

In 1980, Tim Macmillan began producing pioneering film, and later video, in this field while studying for a BA at the (then named) Bath Academy of Art, using 16mm film in a progressive circular arrangement of pinhole cameras. These were the first iteration of the "Time-Slice" motion-picture array cameras, which he developed further in the early 1990s, when still cameras capable of the image quality required for broadcast and movie applications became available for the array. In 1997 he founded Time-Slice Films Ltd. (UK). He applied the technique to his artistic practice in a video projection titled Dead Horse, an ironic reference to Muybridge, which was exhibited at the London Electronic Arts Gallery in 1998 and in 2000 was nominated for the Citibank Prize for photography.

Another precursor of the bullet-time technique was "Midnight Mover", a 1985 Accept video. In this video, Academy Award-winning special effects director Zbigniew Rybczynski mounted thirteen 16mm film cameras on a specially constructed hexagonal rig that encircled the performers. The resulting footage was meticulously edited to create the illusion of the band members spinning in place while moving in real time. In the 1990s, a morphing-based variation on time-slicing was employed by director Michel Gondry and the visual effects company BUF Compagnie in the music video for The Rolling Stones' "Like a Rolling Stone", and in a 1996 Smirnoff commercial the effect was used to depict slow-motion bullets being dodged. Similar time-slice effects were also featured in commercials for The Gap (directed by M. Rolston and again produced by BUF), in feature films such as Lost in Space (1998) and Buffalo '66 (1998), and in the television program The Human Body.

Slow-motion footage has long been used to depict action scenes in feature films, for example the gunfights in The Wild Bunch (directed by Sam Peckinpah) and the heroic bloodshed films of John Woo. Subsequently, the 1998 film Blade featured a scene that used computer-generated bullets and slow-motion footage to illustrate characters' superhuman bullet-dodging reflexes. The 1999 film The Matrix combined these elements (gunfight action scenes, superhuman bullet-dodging, and time-slice effects), popularizing both the effect and the term "bullet time". The Matrix version of the effect was created by John Gaeta and Manex Visual Effects. Rigs of still cameras were set up in patterns determined by simulations and then fired either simultaneously (producing an effect similar to previous time-slice scenes) or sequentially (which added a temporal element to the effect). Interpolation effects, digital compositing, and computer-generated "virtual" scenery were used to improve the fluidity of the apparent camera motion. Gaeta said of The Matrix's use of the effect:

"For artistic inspiration for bullet time, I would credit Otomo Katsuhiro, who co-wrote and directed Akira, which definitely blew me away, along with director Michel Gondry. His music videos experimented with a different type of technique called view-morphing and it was just part of the beginning of uncovering the creative approaches toward using still cameras for special effects. Our technique was significantly different because we built it to move around objects that were themselves in motion, and we were also able to create slow-motion events that 'virtual cameras' could move around – rather than the static action in Gondry's music videos with limited camera moves."

Following The Matrix, bullet time and other slow-motion effects were featured as key gameplay mechanics in various video games. While earlier games such as Cyclone Studios' Requiem: Avenging Angel, released in March 1999, featured slow-motion effects, Remedy Entertainment's 2001 video game Max Payne is considered the first true implementation of a bullet-time effect, giving the player limited control (such as aiming and shooting) during the slow-motion mechanic; the game explicitly called this mechanic "Bullet Time". The mechanic is also used extensively in the F.E.A.R. series, which combines it with squad-based enemy design that encourages the player to use bullet time to avoid being overwhelmed.

Bullet time was used for the first time in a live music environment in October 2009 for Creed's live DVD Creed Live.

The popular-science television program Time Warp used high-speed camera techniques to examine everyday occurrences and singular talents, including breaking glass, bullet trajectories and their impact effects.

Technology
The bullet time effect was originally achieved photographically by a set of still cameras surrounding the subject. The cameras are fired sequentially, or all at the same time, depending on the desired effect. Single frames from each camera are then arranged and displayed consecutively to produce an orbiting viewpoint of an action frozen in time or unfolding in hyper-slow motion. This technique suggests the limitless perspectives and variable frame rates possible with a virtual camera; when the array is built from physical cameras, however, the viewpoint is limited to the paths along which the cameras were placed.
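The distinction between the two firing modes can be made concrete with a minimal sketch. The code below is purely illustrative (the function names and the 2 ms interval are assumptions, not taken from any production setup): each camera in the array is assigned a capture time, and pairing camera index with capture time shows how simultaneous firing yields a frozen moment while staggered firing lets the action unfold as the viewpoint orbits.

```python
def trigger_times(num_cameras, interval=0.0):
    """Return a capture time (seconds) for each camera in the array.

    interval == 0.0 models simultaneous firing (a frozen moment);
    a positive interval models sequential firing, which adds a
    temporal element as the viewpoint orbits the subject.
    """
    return [i * interval for i in range(num_cameras)]

def assemble_sequence(num_cameras, interval=0.0):
    """Pair each camera index with its capture time, in display order.

    Displaying the frames in camera order produces the orbiting
    viewpoint; the capture times show how much action elapses.
    """
    return list(enumerate(trigger_times(num_cameras, interval)))

# Frozen moment: every camera fires at t = 0, so all frames show the
# same instant from different angles.
frozen = assemble_sequence(4)
# Hyper-slow motion: cameras fire 2 ms apart, so the action advances
# slightly from frame to frame while the view circles the subject.
slow = assemble_sequence(4, interval=0.002)
```

The choice of interval is the whole trade-off: zero gives the classic time-slice freeze, while a small positive value produces the slowed-but-moving action The Matrix popularized.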

In The Matrix, the camera path was pre-designed using computer-generated visualizations as a guide. Cameras were arranged, behind a green or blue screen, on a track and aligned through a laser targeting system, forming a complex curve through space. The cameras were then triggered at extremely close intervals, so the action continued to unfold, in extreme slow motion, while the viewpoint moved. Additionally, the individual frames were scanned for computer processing. Using sophisticated interpolation software, extra frames could be inserted to slow down the action further and improve the fluidity of the movement by raising the effective frame rate; frames could also be dropped to speed up the action. This approach provides greater flexibility than a purely photographic one. The same effect can also be simulated using pure CGI, motion capture and other approaches.
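The frame-insertion step described above can be sketched as follows. This is a toy stand-in, not the production pipeline: The Matrix used sophisticated interpolation software, whereas the sketch below blends adjacent frames linearly (a simple cross-dissolve), with each frame reduced to a list of pixel values. The function name and parameters are illustrative assumptions.

```python
def interpolate_frames(frames, inserts=1):
    """Insert `inserts` blended frames between each adjacent pair.

    Each frame is a flat list of pixel values. A linear blend is a
    crude stand-in for real interpolation software, which typically
    tracks motion between frames; it illustrates how adding in-between
    frames slows the action and raises the effective frame rate.
    """
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for k in range(1, inserts + 1):
            t = k / (inserts + 1)  # blend weight between frame a and b
            out.append([(1 - t) * pa + t * pb for pa, pb in zip(a, b)])
    out.append(frames[-1])
    return out

# Two one-pixel "frames"; one inserted frame halves the playback speed.
frames = [[0.0], [1.0]]
slowed = interpolate_frames(frames, inserts=1)  # [[0.0], [0.5], [1.0]]
```

Dropping frames (the inverse operation mentioned above) would simply be slicing the sequence, e.g. `frames[::2]`, to speed the action back up.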

Bullet time evolved further through The Matrix series with the introduction of high-definition computer-generated approaches such as virtual cinematography and universal capture. Universal capture, a machine-vision-guided system, was the first motion-picture deployment of an array of high-definition cameras focused on a common human subject (the actor playing Neo) in order to create volumetric photography. As with the concept of bullet time, the subject could be viewed from any angle, yet at the same time the depth-based media could be recomposed as well as spatially integrated within computer-generated constructs. It moved beyond the visual concept of a virtual camera to become an actual virtual camera. Virtual elements within the Matrix trilogy used state-of-the-art image-based computer rendering techniques pioneered in Paul Debevec's 1997 film The Campanile and custom-evolved for The Matrix by George Borshukov, an early collaborator of Debevec. Inspiration aside, the virtual camera methodologies pioneered within the Matrix trilogy have often been credited as fundamental contributions to the capture approaches required for emergent virtual reality and other immersive experience platforms.

For many years, it has been possible to use computer vision techniques to capture scenes and render images of novel viewpoints sufficient for bullet-time-style effects. More recently, these have been formalized into what is becoming known as free viewpoint television (FTV). At the time of The Matrix, FTV was not a fully mature technology. FTV is effectively the live-action version of bullet time, without the slow motion.