17

Imaging

One of the surest signs of an encroaching revolution is the tendency of the old guard to fall behind, mired in a mindset that has lost its relevance. A symbol of this is the glamorous Hollywood theater venue once known as the Kodak Theater and its rebirth as the Dolby Theater. Eastman Kodak Company, the raw film stock giant that had bankrolled the spectacular theater, dominated the motion picture industry for decades. But it was slow to read the writing on the wall. Kodak did not properly anticipate the demise of film or, rather, the rise of file-based digital cinema.

Almost overnight, studios abandoned the industry’s staple—celluloid film—and began investing in new companies with names like RED and Blackmagic. Even Panavision, the premier motion picture camera and lens manufacturer, faltered, losing much of its market share to competitors like ARRI with its Alexa digital camera. Under bankruptcy protection, Eastman Kodak transferred its stake in the expensive theater to Dolby, a company that had ridden the digital revolution with greater ease.

Originally a designer of noise-reduction systems, Dolby had branched out with innovations in multichannel digital surround sound (such as 5.1 and 7.1) and digital mastering of soundtracks (Dolby Digital). While speaking to my students, Michael Cioni, the CEO of the digital finishing company Light Iron, pointed out that audio tends to experience technological revolutions about ten years before video, due in part to the relative simplicity of audio material. Digital audio preceded digital video. File-based technologies such as CDs, Dolby Digital, and iPods appeared years before the transition from motion picture film and analog videotape to digital cinema.

K vs. mm

As we emerge from the celluloid past, leaving behind the world of millimeters—8 mm, 16 mm, 35 mm, and 70 mm—we find ourselves in a new world of Ks—2K, 4K, 5K, 6K, 8K, and upward. What was once measured in feet and frames is now measured in clock time or timecode. The world of filmmaking has changed forever.

The new, ultra-high resolutions put special demands on editors and the postproduction process. In the heyday of film, the options for media handling were fairly limited. Film, though marketed in various speeds—a measure of the medium’s light sensitivity, which in turn affected its resolution—remained a stable system because the choices for exposing, inputting, and outputting it were limited. With digital cinema the playing field is wide open.

Ironically, there exists a self-limiting factor to advancing technology. In the digital realm, as processing speed and hard drive capacity increase, camera manufacturers achieve greater resolution and color depth while software developers design more complex editing interfaces and systems capable of handling the higher resolutions. But at a cost. These improvements tax computer systems, causing potential glitches, slowdowns, and media-management issues. Handling large files takes more and more time, and importing and exporting become time consuming as well.

In order to handle these weighty new formats, manufacturers of nonlinear editing systems have introduced shortcuts to make the offline editing process more manageable. Avid recently introduced the Media Browser window, allowing editors and their assistants to link to large media files, such as those produced by the RED or Alexa cameras, without having to immediately import voluminous media running to many gigabytes per clip. Avid’s linking system creates a proxy that sees into the actual raw media while displaying it at a lower resolution, allowing the editor to begin editing immediately.

After a scene is edited, the linked material can be transcoded into new media at a different resolution and made stable. Playback quality can also be adjusted in systems like Premiere Pro by selecting quarter-, half-, or full-resolution playback. Workflows tend to fall into two camps. One is the approach taken on The Martian (2015), where massive amounts of media—250 hours of RED footage per eye in 3D plus another 60 hours of GoPro footage, some shot in resolutions as high as 6K—were all transcoded into Avid’s lower-resolution codec, DNxHD 36, to expedite the physical process of editing. This contrasts with Gone Girl (2014), where 500 hours of 6K RED footage demanded extensive media management in order to edit offline with 2.5K files while retaining the ability to play back the raw footage in 6K.
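
Tech Note

To see why transcoding to an offline codec such as DNxHD 36 expedites editing, a rough storage comparison helps. The short Python sketch below is illustrative only: DNxHD 36 runs at roughly 36 megabits per second, while the data rate of 6K RED raw varies widely with compression settings and frame rate, so the raw figure used here is an assumed placeholder rather than a measured value.

    # Rough storage estimate for a given amount of footage at a given data rate.
    def storage_terabytes(hours: float, megabits_per_second: float) -> float:
        seconds = hours * 3600
        megabits = seconds * megabits_per_second
        return megabits / 8 / 1_000_000  # megabits -> megabytes -> terabytes

    # 250 hours of offline DNxHD 36 proxies (roughly 36 Mbit/s):
    print(storage_terabytes(250, 36))   # about 4 TB

    # The same 250 hours at an assumed raw data rate of 800 Mbit/s (placeholder only):
    print(storage_terabytes(250, 800))  # about 90 TB

Even with generous assumptions, the offline proxies occupy a small fraction of the space the raw material requires, which is what makes them practical to store, copy, and play back on an editing system.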

Another crucial aspect involves rendering, which in some cases still haunts the realm of high-definition video. Rendering creates new, stabilized media from the original media. This becomes necessary when dealing with processor-intensive elements such as high-level color correction, visual effects, and titles. At one time, with systems like Premiere Pro, it was necessary to render even a cut on the timeline in order to view the edit. At this writing, one of the factors that still slows down Premiere is the frequent need to render files. Avid, while still requiring rendering for some complex effects, generally renders on the fly, meaning that the editor does not need to disrupt his or her process in order to render clips or effects.

From Taxidermy to Taxonomy

Film terms, techniques, and equipment are becoming relics of the past. The requirements of digital cinema have unleashed a world of new taxonomies, such as codec, metadata, LUT, and DCP, and have created new job opportunities, such as the position of DIT, or digital imaging technician. In the past, shooting celluloid film left the director and cinematographer with the mystery of what would be revealed when the film was developed by a laboratory and delivered as dailies the next day. Though experienced directors of photography (DPs) had a pretty good sense of how film would look once it was processed, file-based cinematography allows immediate access to and review of the exposed material. There is no guessing. The DIT is present on the set to organize and transcode scenes and takes, even introducing basic color correction before the dailies are delivered, by hard drive or over the Cloud, to the editor.

New wireless tablets allow directors, editors, script supervisors, and DITs to view and annotate takes as they are shot. With the advent of Cloud technology, new systems like BeBop eliminate the need for cumbersome hard drives, storing not only the media but also the editing application itself, so the editor’s workspace can be accessed from any remote location with internet access.

Digital cinema has also altered the aesthetic approach, allowing for greater flexibility in dealing with images.

Less Painful Extractions

In the editing room, one of the greatest breakthroughs of high-resolution production is the ability to alter an image without degradation. In the days of celluloid film and lower-resolution video, alterations to the frame, such as enlarging the image, became painfully noticeable: grain increased and sharpness decreased. With ultra-high-resolution formats, such as 6K and above, shots can be blown up, allowing a select portion of the frame to be extracted without sacrificing image quality. The editor can shift the frame up or down, left or right, resize it, and stabilize it.

This works particularly well when the final movie will be released at a resolution lower than the original camera resolution. For instance, when a 2K projection master is derived from 6K raw files, even a 5K extraction still exceeds the resolution of the final deliverable, so the image stays sharp no matter how many shots are scaled up significantly. This approach allows for extreme blow-ups to reinforce an emotion or story point, or to create coverage that was neglected during production (such as close-ups or insert shots). Or, by adding keyframes, the editor can animate a wide shot into a tighter view for a push-in.
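
Tech Note

The arithmetic behind lossless blow-ups is simple: as long as the cropped region still contains at least as many pixels across as the delivery format requires, no upscaling occurs. A minimal Python sketch follows; the pixel dimensions are assumptions for illustration, since actual 6K and 2K frame sizes vary with camera, sensor mode, and aspect ratio.

    # Assumed frame widths, for illustration only.
    SOURCE_WIDTH = 6144    # a common 6K frame width in pixels
    DELIVERY_WIDTH = 2048  # 2K DCI delivery width in pixels

    def max_blowup(source_width: int, delivery_width: int) -> float:
        """Largest push-in factor before the extraction must be upscaled."""
        return source_width / delivery_width

    def stays_sharp(scale: float, source_width: int, delivery_width: int) -> bool:
        """True if a blow-up by `scale` still fills the delivery frame with real pixels."""
        return source_width / scale >= delivery_width

    print(max_blowup(SOURCE_WIDTH, DELIVERY_WIDTH))        # 3.0 -> up to a 3x extraction
    print(stays_sharp(2.5, SOURCE_WIDTH, DELIVERY_WIDTH))  # True
    print(stays_sharp(3.5, SOURCE_WIDTH, DELIVERY_WIDTH))  # False -> would require upscaling

By the same logic, a 5K extraction from a 6K frame contains far more resolution than a 2K master needs, which is why such blow-ups cost nothing in apparent sharpness.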

Image alteration through other visual effects benefits from higher resolutions as well. The high-resolution realm allows effects artists and editors to introduce highly doctored images that blend well with the existing material, where previously extreme manipulation would have resulted in compromised images. Editors can also achieve high-quality image stabilization thanks to the extra resolution. What once required experts to steady an overly shaky image can now be corrected by the editor on a nonlinear, offline editing system.

Case Study

During the editing of a recent thriller, I was able to enhance the film’s tone by creating slow push-ins on faces or objects that had previously existed as static shots. Because the film was shot in 6K on the RED Dragon and initially released in 1920 × 1080, I had significant latitude to work with. Push-ins and blow-ups became second nature, with dozens created in the editing room. Due to the tight shooting schedule, some coverage was limited. In several scenes, close-up shots were created as extractions from the well-lit wide shots.

Case Study

In a recent documentary, the director and I tested blow-ups from ultra-high-resolution files and found that he could shoot some of the interviews in relatively wide shots and then blow them up in the editing room to create occasional close-ups of the interview subjects. This particularly helped in transitioning from one idea to another, where some of the interview subject’s dialogue would be trimmed out.

As mentioned in the documentary section in Chapter 10, in order to truncate dialogue the editor often needs to cut away to another image, usually B-camera coverage, hiding the audio edit beneath the other image before returning to the on-camera interviewee. With the ability to alter the image size, it is now possible to remain on the same subject, giving the impression of a match cut when interstitial dialogue is removed, by simply blowing up the frame from wide to close-up. Interview dialogue often presents a challenge when editing documentaries since only a minute or two of the interview (out of an hour or more of footage) may find its way into the final cut.

Tech Note

Another advantage of digital editing for documentaries resides in transition effects like Avid’s Fluid Morph. If the camera and subject remain fairly steady, this effect allows the editor to cut out part of an interview, butt the two pieces together, and then invisibly blend the cut point through morph technology. When done well, the cut is imperceptible and the audience sees one continuous image.

Many early high-definition digital cameras produced an image with a satisfying color and density scheme, which could be altered further in postproduction through color correction to achieve the tones that best fit the scene. With ultra-high resolution, many cameras shoot raw, meaning the image initially appears washed out and low in contrast. Directors who are not aware of this and see such an image for the first time sometimes panic. So it helps to apply a LUT—Look-Up Table—to achieve a balanced image while editing. In the final color grading process, the full advantages of raw files reveal themselves through the wider range of color and density information available.
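
Tech Note

A LUT is, at heart, just a table that maps the camera’s flat, log-encoded code values to more contrasty display values. The Python sketch below applies a simple one-dimensional curve by interpolation; it is a simplified illustration, since real viewing LUTs are usually three-dimensional .cube files supplied by the camera maker or colorist and applied by the NLE, the DIT’s software, or a monitor.

    import numpy as np

    # A toy 1D LUT: input code values (0-1) mapped to a contrast-adding output curve.
    lut_in = np.linspace(0.0, 1.0, 5)                 # 0.0, 0.25, 0.5, 0.75, 1.0
    lut_out = np.array([0.0, 0.15, 0.45, 0.80, 1.0])  # assumed response, for illustration

    def apply_lut(pixels: np.ndarray) -> np.ndarray:
        """Map washed-out pixel values through the LUT by linear interpolation."""
        return np.interp(pixels, lut_in, lut_out)

    flat_log_values = np.array([0.35, 0.50, 0.65])    # sample low-contrast values
    print(apply_lut(flat_log_values))                 # -> roughly [0.27 0.45 0.66], more contrast

The LUT only changes how the image is displayed while cutting; the underlying raw files remain untouched for the final grade.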

This process equates loosely to the one-light prints that film labs used to produce when delivering dailies. The one light is a single color and brightness correction that puts the film dailies into a viewable range. Later, after the film was edited and ready to be printed for wide release, a color timer sat down and re-timed, or graded, every scene in the film on a scale of 50 printer points each for red, green, and blue.

A similar process occurs after the DIT or editor applies a LUT. This general correction creates an acceptable look for the editor to cut with. Afterwards, when the film is locked, the LUT is removed and a colorist goes through the movie shot by shot and re-grades it. These days the options found in a color session with software such as DaVinci Resolve allow additional creative control. Using power windows, sections of the frame can be adjusted for color or density, further reinforcing the emotional aspects of the film. Ultimately, all these technical advantages exist to help better tell the movie’s story or to engender an emotional response.

Rx

Take advantage of the new avenues open to editors through ultra-high-definition digital cinema, including:

  •  Digital push-ins
  •  Frame repositioning
  •  Blow-ups (scaling)
  •  Digital blending as in Avid’s Fluid Morph