13
Video on the Web

Types of Video on the Web

One of the fastest-developing areas for the distribution of video content, movies, and television programming in general is the Internet, including the World Wide Web. Web distribution of video has become extremely popular. Streaming media players like Roku, Apple TV, Google Chromecast, and Amazon Fire, together with streaming media services such as Netflix, Hulu, and Amazon, among others, make it possible for audiences to access all kinds of programs (news, movies, television shows, sports, etc.) at any time, on demand or live, and on any device. In addition, video hosting services such as YouTube and Facebook allow users to upload their videos to be shared with friends, special groups, or millions of people. Users can also share links to live events from their smartphones. Webcasting, a method that uses the Internet to “broadcast” live programming (video, audio, text, or graphics), has become a tool widely used in business and educational organizations for marketing and training purposes. Webcasts can also be recorded and accessed on demand at a later time that is more convenient for the user.

Video streaming has grown exponentially and has become an essential component of the evolving communication landscape. Streaming devices coupled with sufficient bandwidth and new compression technologies allow any user to upload video to a server or the cloud, or broadcast live video to the whole world via the Internet. The Cisco Visual Networking Index 2015–2020 (www.cisco.com) estimates that by 2020 fixed broadband speeds will double and it would take a person more than 5 million years to watch the amount of video circulating through global IP networks.

This chapter explains the basic concepts of video on demand and streaming media, the steps in the process of preparing video and audio for streaming over the Internet, and the technology that is involved in this process.

Video on Demand

Beginning in the 1990s, Web-based video was modeled on the concept of video on demand that had been developed by the cable television industry. At that time cable television companies attempted to provide subscribers with the option to select their preferred programs at any time of the day or night from a large digital database of program material. Subscribers would select a program, and the cable company would transmit it to the viewer’s home television. Today, video on demand is found on many cable and satellite television systems and in hotels, where guests can select from a relatively small menu of movies and other program titles.

Web-based video on demand systems require that video be downloaded onto a computer hard disk and then played back on the subscriber’s personal computer, television set, or any other device with an LCD screen, such as a smartphone or tablet. In other words, Web video on demand is essentially a download-and-play technology. In the early years, because video files were very large and were typically downloaded over 28, 33, or 56 kbps (kilobits per second) dial-up modems, downloads simply took most users too long. As a result, most Internet content providers avoided Web-based video on demand because it was too slow and inefficient. Today, faster networks—e.g., 2, 5, and 10 Mbps (megabits per second)—and better compression technologies have made it possible to embed Web-based video in most Web pages and deliver full HD programming to any user’s device.

Streaming Media

With the advent of faster computers and networks, the increased adoption of wireless broadband and other high-speed Internet services, and the development of more efficient video and audio compression systems, far more convenient platforms for streaming media over the Internet have been developed.

Streaming media is a technology that delivers dynamic content, such as video or audio files, over a local computer network or the Internet. It transmits files in small packets that “stream” for immediate, real-time viewing or listening. While the user watches or listens to the first packet, the next one is buffered or downloaded to the computer’s memory and then released for continuous viewing and/or listening similar to live television. The stream (packet of information) is encoded at the sender’s end and decoded at the user’s end by using appropriate computer software. The term has come to exemplify the digital convergence of communication and information technologies, such as video, audio, graphics, voice, text, and data. These elements come together online to provide a wide range of audiovisual experiences that make the content much easier to understand and much more fun to interact with.
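
The buffer-and-play behavior just described can be illustrated with a minimal sketch; the packet names, the two-packet buffer target, and the playback delay are hypothetical values chosen only for illustration.

```python
import collections
import time

# Minimal sketch of buffered streaming: keep downloading the next packet
# while earlier packets play. All sizes and timings here are hypothetical.
BUFFER_TARGET = 2  # packets to hold before/while playing

def stream(packets):
    buffer = collections.deque()
    for packet in packets:
        buffer.append(packet)              # "download" the next packet
        if len(buffer) >= BUFFER_TARGET:   # enough buffered to play smoothly
            print("playing", buffer.popleft(), "while the next packet downloads")
            time.sleep(0.1)                # stand-in for real playback time
    while buffer:                          # drain whatever is still buffered
        print("playing", buffer.popleft())

stream([f"packet-{i}" for i in range(1, 6)])
```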

Many people consider streaming media to be a new delivery medium comparable to broadcast television or cable/satellite television. Called Internet television, it has revolutionized the ways in which organizations communicate with their employees, their customers, and the public in general. Many news media organizations stream news clips from their regular broadcast newscasts. For example, CBSN streams live 24/7 original CBS news programming. In addition, the digital video advertising market continues to grow and many social media sites have taken serious steps toward enhancing their streaming video capabilities.

Film studios and producers regularly stream movie trailers for upcoming theatrical film releases, while a plethora of media outlets such as Netflix and Amazon offer online movie streaming. Collegiate athletics has become an important player in the live video streaming field as more and more intercollegiate games are streamed live, in many cases with athletic departments using their own production crews and equipment, such as the TriCaster multi-format switcher, which allows for live streaming and recording of events. (See Figure 13.1.)

Table 13.1 Typical uses of streaming media

Areas           Uses
Entertainment   Music; Movies and movie trailers; Sports
Information     News; Weather
Government      Training; Military
Education       Distance learning; Academic events; Research
Corporate       Training; Sales; Marketing and advertising; Videoconferencing
Community       Services (church worship, government meetings)

Major League Baseball, in open competition with the broadcast networks, started to stream live baseball games during the 2003 season. Yahoo broadcast the 2003 NCAA basketball tournament through its Platinum Internet service. The Harvard School of Public Health and Stanford University, among many other educational institutions, deliver on-demand and live streaming video of courses, commencements, services provided by their departments, and important events taking place on their campuses and provide links to research sites with streaming content. Table 13.1 shows some of the areas in which streaming media, particularly video, are being used.

These examples illustrate some of the most common uses of streaming video. Often, as in the case of news clips or sportscasts, the streaming content is regular television programming that was originally produced for viewing on broadcast or cable television but now has been “repurposed” and “repackaged” for delivery through the Internet. In other cases, content (programs) is produced with specific streaming (Web) values and characteristics in order to optimize, both aesthetically and technically, the experience for the Web viewer, thus improving the chance of achieving the program’s objectives.

Video on the Web can be delivered from a streaming media server or from a Web server at different speeds or data rates. Higher delivery speeds or data rates produce better-quality streaming media than do slower delivery speeds. The delivery speeds or data rates are key factors when considering which technique or method for delivery of media is to be used. The next section presents an overview of such methods.

Methods of Delivering Video over the Web: Download and Play, Progressive Download, and Live Streaming

Download and Play

The download and play delivery approach follows the traditional model of prerecorded television broadcast delivery of programming. In the download and play delivery mode, archived (previously recorded) media residing on a web server is downloaded to the user’s hard drive and then accessed at the viewer’s convenience. This approach allows the user to copy or move the file to another device. However, because this raises issues of copyright management, it has fallen out of favor.

Progressive Download

Progressive download is very similar to the video on demand approach described earlier; however, the file does not have to be downloaded completely before viewing. In this approach, a media file to be delivered over a specific network’s bandwidth is placed on a regular Web server, also known as a standard HTTP server, and then a simple HTML or HTML5 hyperlink to the video file is provided.

Once the client computer requests the file, the server begins sending it, and after a few seconds of downloading the user can play the video or audio while the rest continues to download. (See Figure 13.2.) What the user sees is the part of the file that has been downloaded up to that moment, thus simulating the process of “true” streaming. When dealing with HTTP delivery it is important to keep in mind that it can use “adaptive streaming,” a method in which the system creates multiple files at different data rates and delivers the one that best fits the recipient’s bandwidth and technical capabilities.
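
A minimal sketch of the selection logic behind adaptive streaming, assuming a hypothetical set of pre-encoded data rates: the highest rate that fits the measured bandwidth is the one delivered.

```python
# Minimal sketch of adaptive-streaming rendition selection.
# The rendition list and measured bandwidths are hypothetical examples.
RENDITIONS_KBPS = [400, 1200, 2500, 5000, 9000]  # pre-encoded data rates

def pick_rendition(measured_bandwidth_kbps, renditions=RENDITIONS_KBPS):
    """Return the highest data rate that fits the available bandwidth."""
    fitting = [r for r in renditions if r <= measured_bandwidth_kbps]
    return max(fitting) if fitting else min(renditions)  # fall back to the lowest

print(pick_rendition(3000))  # -> 2500 (best fit for a 3 Mbps connection)
print(pick_rendition(300))   # -> 400  (below every rendition; use the lowest)
```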

Progressive download has largely been replaced by HLS (HTTP Live Streaming) and HTML5 video, which eliminate the separate download step: the files can be placed on an ordinary web server (HTTP) and accessed directly as a stream, without the need for a dedicated streaming server.
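
To make the HLS idea a little more concrete, the sketch below writes a minimal master playlist that points a player at several renditions; the bitrates, resolutions, and file names are hypothetical, and a real deployment would also need the segmented media playlists produced by the encoder.

```python
# Minimal sketch of an HLS master playlist; names and bitrates are hypothetical.
renditions = [
    (1_200_000, "640x360",   "sd/index.m3u8"),
    (5_000_000, "1280x720",  "hd/index.m3u8"),
    (9_000_000, "1920x1080", "fhd/index.m3u8"),
]

lines = ["#EXTM3U"]
for bandwidth, resolution, uri in renditions:
    lines.append(f"#EXT-X-STREAM-INF:BANDWIDTH={bandwidth},RESOLUTION={resolution}")
    lines.append(uri)

with open("master.m3u8", "w") as playlist:
    playlist.write("\n".join(lines) + "\n")
```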

Live Streaming

Live streaming follows the traditional model of live broadcast delivery. The live streaming technology allows for an event to be viewed in real time as it is transmitted live. Afterwards, it can be archived and then viewed or “rebroadcast” using the real-time mode of delivery.

The computer systems that are used for real-time and live streaming require special dedicated servers called streaming media servers that keep the bandwidth (capacity) of the network matched to that of the viewer. This mode of streaming offers anytime and anyplace delivery of archived (on demand) media to users anywhere in the world.

The Process of Preparing Streaming Video

The process for preparing streaming video can be divided into two stages: (1) creating/acquiring and editing the media (keep in mind that not all content requires editing as it may be streamed or recorded live and then delivered to a user), and (2) encoding, delivering to servers, and streaming the media. (See Figure 13.3.)

In Stage 1, during preproduction, scripts or storyboards are finalized, and decisions are made on which streaming technologies to use. During production and postproduction the video and audio are streamed live or recorded, captured to a video card as a digital signal, and edited if necessary.

In Stage 2, the signal is sent via the video card to a streaming encoder, which prepares it for distribution over a network. The signal is then delivered to a web server, a streaming server, or to a content delivery network (CDN) for distribution over the Internet.

A CDN consists of a series of networked or distributed servers located in different geographical areas, usually with a global reach. The advantage of a CDN over other types of delivery networks is that content is delivered to a user from the closest server, thus speeding the delivery of such content and providing protection from overflows of data traffic.
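
A minimal sketch of the “closest server” idea behind a CDN, assuming hypothetical edge locations and using measured round-trip time as the distance metric:

```python
# Minimal sketch of CDN edge selection; server names and latencies are hypothetical.
edge_latency_ms = {
    "edge-us-east": 12,
    "edge-eu-west": 95,
    "edge-ap-south": 180,
}

def pick_edge(latencies):
    """Route the viewer to the edge server with the lowest round-trip time."""
    return min(latencies, key=latencies.get)

print(pick_edge(edge_latency_ms))  # -> edge-us-east
```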

Stage 1: Creating and Encoding the Media

As you can see in Figure 13.4, the stage in which the video is created and prepared for streaming involves some of the already familiar steps seen in our production model in Chapter 3. Preproduction, production, and postproduction (if necessary) are all part of the process. However, as you will see in a moment, production values and postproduction procedures for streaming video operate on a different set of principles than those for traditional video. Furthermore, streaming technology works with different network protocols and tools that have to be taken into account during the very early preproduction stage. In other words, as in any video production, careful planning is required to make a successful streaming product.

PREPRODUCTION As we mentioned earlier in the book, planning is at the heart of any media production process, and the preproduction stage is critical in preparing your video for the Web. Besides the regular steps in the production model that was presented in Chapter 7, when preparing video content for the Web, you also need to consider the following:

  1. What type of production do you want or need to do? Is it going to be a single talking head or are you going to use a large cast?
  2. How is your intended target audience going to access your video? Large-scale streaming requires high bandwidth capability. Bandwidth, the amount of information per unit of time that your delivery channel can carry, is measured in bits per second (bps). Dial-up modems delivered information at speeds of 28, 33, or 56 kilobits (thousand bits) per second (kbps). Other channels such as DSL (digital subscriber line) or T1 lines have download speeds that range from around 384 kbps for DSL to 1.544 megabits (million bits) per second (Mbps) for T1 lines. Use of dial-up connections has fallen significantly (except in some rural areas), and most Internet home connections now use broadband, with upload speeds ranging from 500 kbps to 6 Mbps and download speeds of up to 120 Mbps. In general, the larger the bandwidth, the better the quality of your video on the Web. For example, if your message is going to be delivered through a 56 kbps dial-up modem, then you should not plan a video with lots of motion and effects. If you plan to deliver your program over a 700 kbps or faster channel, then you will have more flexibility in enhancing your video with more effects, transitions, and animated graphics. For instance, a broadband speed of 2 Mbps should be used for streaming Standard Definition video, 5 Mbps for High Definition, and 9 Mbps for Ultra High Definition. (A back-of-the-envelope sketch of what these guideline rates mean in practice follows this list.)
  3. What kind of delivery platform and media players will be used to encode and deliver your video? How much interactivity do you wish your audience to experience? When considering a streaming platform (a video hosting service like YouTube), it is important to understand that, in most cases, online video platforms use the adaptive HTTP live streaming (HLS) protocol.
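
As a back-of-the-envelope sketch of what the guideline rates in item 2 mean in practice (a hypothetical 10-minute program is assumed):

```python
# Back-of-the-envelope sketch: data transferred for a 10-minute program
# at the guideline streaming rates mentioned above (SD, HD, UHD).
GUIDELINE_MBPS = {"SD": 2, "HD": 5, "UHD": 9}
DURATION_SECONDS = 10 * 60  # hypothetical 10-minute program

for label, mbps in GUIDELINE_MBPS.items():
    megabytes = mbps * DURATION_SECONDS / 8  # megabits -> megabytes
    print(f"{label}: {mbps} Mbps for 10 minutes is roughly {megabytes:.0f} MB transferred")
# SD ~150 MB, HD ~375 MB, UHD ~675 MB
```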

PRODUCTION Shooting Video for the Web and Streaming There are significant differences between shooting video for distribution via traditional broadcast or cable television and shooting video for the Web. Although in both cases high aesthetic standards are to be applied, Web technology such as compression codecs and bandwidth limitations can in many cases change your design approach to video shooting.

The first important thing to understand is that before getting to the Web, video has to be compressed (made smaller). Video compression works by eliminating redundant information. When compression is employed, each frame of information is analyzed to identify which information in the frame is repetitive and which is new. If there are a lot of changes between one frame and the next, it will take more bandwidth to transmit the signal. If there are few changes between one frame and the next, only the new information needs to be transmitted. So when we shoot video for the Web, we are concerned with keeping the scene as unchanged as possible.
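
A toy sketch of this frame-to-frame redundancy, using tiny made-up “frames” of pixel values: only the pixels that change from the previous frame need to be transmitted, so a static scene costs far less bandwidth than a busy one.

```python
# Toy sketch of inter-frame redundancy; frames are tiny made-up lists of pixel values.
frame_1 = [10, 10, 10, 200, 10, 10]
frame_2 = [10, 10, 10, 210, 10, 10]   # one pixel changed (little motion)
frame_3 = [90, 80, 10, 210, 40, 10]   # several pixels changed (lots of motion)

def changed_pixels(previous, current):
    """Return (index, new_value) pairs for pixels that differ from the previous frame."""
    return [(i, c) for i, (p, c) in enumerate(zip(previous, current)) if p != c]

print(changed_pixels(frame_1, frame_2))  # 1 change  -> cheap to transmit
print(changed_pixels(frame_2, frame_3))  # 3 changes -> needs more bandwidth
```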

  1. Motion. Avoid unnecessary camera or subject movements. Pans, tilts, zooms, and other camera movements radically change the content of the shot from one frame to the next. Unfortunately, when video is compressed and distributed over the Web, these camera movements will mostly appear to be blurry on the screen. If you must pan or zoom, do it as slowly as possible. Try to keep your subject’s motion to a minimum. Also use a tripod whenever you can. Shaky pictures are as bad for compression as a moving image is.
  2. Background. The background should be kept as static and uniform (a single color) as possible. Excessive motion in the background will cause the picture to be blurry. Remember, compression is about change, and the less change your compressor has to worry about, the better your video is going to look. A talking head against a dark blue background is a good option for streaming video.
  3. Framing and focus. Video is, in general, a close-up medium. Video for the Web is even more of a close-up medium. Although we are in the age of large LED screens for home viewing, chances are good that your video will be displayed in a small window on a computer screen or on the small screen of a smartphone or tablet, not as a full-screen image. Close-ups will help the viewer to appreciate the details of your production. Also, make sure that your images are in sharp focus.
  4. Lighting and colors. Do not have talent wear white, black, or clothes with fine patterns (e.g., herringbone). Try to create good contrast between foreground and background. A well-lit and sharply detailed scene will undoubtedly look much better than a poorly lit one. Although today’s cameras allow you to shoot in very low light and your video may look acceptable when viewed on a regular video monitor, these images will degrade considerably by the time they are ready to be streamed.

Audio for the Web As in regular video production, there is a tendency to concentrate on the picture elements and take the audio for granted, thus degrading the whole production because of poor audio. However, keep in mind that audio and video both take up bandwidth. So, the more complex the audio portion of your program is, the less bandwidth you will have for the delivery of the video. Again, planning becomes essential to find the proper balance between audio and video. Here are some basic guidelines for dealing with the sound side of your production.

  1. Use external microphones. High-quality external microphones will usually give you better sound quality than will the camera’s built-in microphone. The idea is to create the best audio signal possible, clear and clean, because it will be degraded somewhat when compressed.
  2. Ambient sound. Natural background sound, also called ambient sound, is probably desired in most video productions shot on location. However, make sure the background sound your microphone is picking up is not simply background noise. And make sure that the background sound is not so loud that it competes for attention with the principal foreground sound in your program.
  3. Audio levels. Make sure your levels are high but do not peak into the red, which can cause serious distortion of the sound. When converting analog audio into digital files, remember that the analog reference peak of 0 VU corresponds to approximately −20 dBFS (decibels relative to digital full scale). (See Chapter 5 for more discussion about audio levels; a small level-conversion sketch follows this list.)
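
A small sketch of the level check described in item 3, converting a normalized digital peak to dBFS; the sample peaks are hypothetical, and −20 dBFS is the common alignment figure mentioned above.

```python
import math

# Small sketch: convert a normalized digital peak (1.0 = full scale) to dBFS
# and compare it with the common -20 dBFS alignment level (0 VU analog).
ALIGNMENT_DBFS = -20.0

def peak_dbfs(peak_amplitude):
    """Peak level in decibels relative to digital full scale (1.0)."""
    return 20 * math.log10(peak_amplitude)

for peak in (0.05, 0.1, 0.9):  # hypothetical sample peaks
    level = peak_dbfs(peak)
    note = "hot: above the alignment level" if level > ALIGNMENT_DBFS else "ok"
    print(f"peak {peak:.2f} -> {level:.1f} dBFS ({note})")
```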

Keep in mind that there are many sites to which you can upload videos depending on your intended audience and the type of video you wish to distribute. Each site will have its own parameters for encoding media at specific audio levels. You can find specific instructions for uploading requirements on each site. However, ensuring that there is no distortion on your end is always the best way to prepare audio for streaming.

POSTPRODUCTION Once your video and audio have been recorded, the next steps in preparing your media for distribution on the Web involve capturing and editing your media. You have already reviewed general principles of video editing in previous chapters. The following editing guidelines should help you to enhance the quality of your streaming video.

Editing Editing, as previously explained, is the process of constructing your completed video project out of individual shots, audio clips, titles, and animation. The guidelines that follow are specifically geared toward producing video for the Web.

  1. The first principle to adhere to is to make sure that your video and audio source material is recorded and captured at the highest quality level, because there will be some inevitable quality loss when it is compressed for the Web.
  2. Avoid unnecessary transitions and special effects. Just as we avoid camera movement and subject motion when shooting for the Web, we must avoid some of the transitions and special effects that we normally use when editing for the television screen. Dissolves, wipes, and other effects look good on the television screen, but they require a lot of bandwidth to transmit via the Internet, and they may not look very good after they have been compressed.
  3. Titles. Titles are part of almost every video, but again, you should not give them the same treatment that you do in regular video production. For the Web, keep your titles simple, and avoid any motion. Use as large a font as is practical because, as mentioned before, your video is likely to be displayed on a small screen on a mobile device or in a window smaller than full frame on the computer screen.

Stage 2: Encoding, Servers, and Delivery

Figure 13.5 presents a model of Stage 2 of the streaming process.

ENCODING AND COMPRESSION Encoding is a term that is used to describe the compression of files into specific formats. In the case of on-demand or streaming video files, encoding refers to the compression (reducing the file size) of those files into video files that can be viewed on demand or via streaming over a network. Keep in mind that most compression algorithms used in streaming are “lossy,” meaning that some data (bits of information) is discarded during processing, which lowers the quality of the image.

Data transmission rate and file size can also be affected by frame rate and window size. A regular NTSC (SD) video or television signal has a 4:3 aspect ratio and plays at a frame rate of 30 frames per second (fps) with a window size (output resolution) of 640 × 480 pixels, or 320 × 240. (See Figure 13.6.) However, almost all video programs today are shot and recorded in High Definition (HD) formats using a 16:9 aspect ratio and window sizes of 1920 × 1080 and 1280 × 720. In most cases frame rates are 30 fps, but certain events, such as live productions, can use 60 fps.
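
A quick back-of-the-envelope calculation (assuming 8-bit RGB, i.e., 24 bits per pixel) shows why these signals cannot travel over the Web uncompressed:

```python
# Back-of-the-envelope uncompressed data rates, assuming 24 bits per pixel.
formats = {
    "SD  640x480  @ 30 fps": (640, 480, 30),
    "HD  1280x720 @ 30 fps": (1280, 720, 30),
    "HD 1920x1080 @ 30 fps": (1920, 1080, 30),
}

for label, (width, height, fps) in formats.items():
    mbps = width * height * 24 * fps / 1_000_000  # bits per second -> megabits per second
    print(f"{label}: ~{mbps:,.0f} Mbps uncompressed")
# Roughly 221, 664, and 1,493 Mbps respectively -- hence the need for codecs.
```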

Once your video has been edited and finalized, you need to create files that are streamable. During the preproduction steps you made some decisions about your audience, what bandwidth you were going to use to deliver your video, and the type of media player that would be used to view it at the user’s end. Most streaming servers are now using the adaptive streaming technique. Therefore, it is no longer necessary to create a separate file for each speed you intend to stream to, since the server is capable of automatically choosing a speed for a given user or group of users. If you intend to stream your video from a streaming media server, then you will need to create only a single file capable of being streamed at different speeds. This is where container formats and codecs come in.

Figure 13.6

A container format or “wrapper” is a file format that contains specific information or metadata on how video, audio, and other types of data are encoded so players can identify the file’s basic elements and reproduce them accordingly. A common metaphor for understanding wrappers or containers is that of a shipping container. Inside the container there could be many different types of artifacts or content. In our case that content is audio and video tracks—a file—that has been encoded using a specific codec. (See Figure 13.7.) A container then can identify, read, and display the information encoded in such a file (can play the video). A container may be able to read a large number of codecs (identified by the file extension). For instance, QuickTime, one of the most common containers/wrappers used in video production, supports over 150 codecs. Other common containers are MPEG, JPEG, FLV (Flash Video), MOV (Apple), AVI (Microsoft Windows), and VOB (DVDs).
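
If the ffprobe tool (part of the FFmpeg package) is installed, a short sketch like the one below can report which codecs a container is wrapping; the file name is hypothetical.

```python
import json
import subprocess

# Sketch: ask ffprobe (part of FFmpeg; must be installed) what codecs a
# container holds. "sample.mov" is a hypothetical file name.
result = subprocess.run(
    ["ffprobe", "-v", "error",
     "-show_entries", "stream=codec_type,codec_name",
     "-of", "json", "sample.mov"],
    capture_output=True, text=True, check=True,
)

for stream in json.loads(result.stdout).get("streams", []):
    print(stream["codec_type"], "->", stream["codec_name"])
# e.g.  video -> h264
#       audio -> aac
```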

As explained in Chapter 5, codec is an abbreviation for “compression/decompression.” A codec is a mathematical algorithm that is used to squeeze media files into a smaller format for streaming. Compression occurs during the encoding stage, and decompression occurs when the file is viewed on the user’s device. Codecs can be categorized by their functionality as follows: (1) acquisition codecs, used in video cameras; (2) intermediate codecs, used in editing; and (3) delivery codecs, used for distribution.

The early days of streaming—the late 1990s and early 2000s—were dominated by five widely used video on demand and streaming technologies: RealNetworks Media Player, Windows Media Player, QuickTime, MPEG-4, and Flash Video. Then, in 2002, developers introduced MPEG-4 Part 10, or H.264, a compression technology that standardized encoding and delivery for streaming and multimedia in general. The H.264 codec and AAC (Advanced Audio Coding) have become the most common video and audio encoders used in the media industry. More recently, H.265 came to market as the successor to H.264, with strong capabilities for handling ultra-high-definition (UHD) video up to 8K.
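
As one illustration of H.264/AAC delivery encoding, the sketch below shells out to the ffmpeg command-line tool (which must be installed); the file names and the 5 Mbps video target are hypothetical choices, not fixed requirements.

```python
import subprocess

# Sketch: encode a finished edit to H.264 video + AAC audio for Web delivery.
# Requires the ffmpeg CLI; file names and bitrates here are hypothetical.
subprocess.run(
    [
        "ffmpeg", "-i", "master_edit.mov",
        "-c:v", "libx264", "-b:v", "5M",   # H.264 video at roughly 5 Mbps
        "-c:a", "aac", "-b:a", "128k",     # AAC audio at 128 kbps
        "-movflags", "+faststart",         # move metadata up front for quicker Web playback
        "web_delivery.mp4",
    ],
    check=True,
)
```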

In summary, these streaming technologies consist mainly of three components: the encoding software that compresses and converts the media files to a format that can be viewed by a player, the server software that delivers the stream to multiple users, and the media player that allows the end user to view (decompress) the media files.

The developers of these technologies are continually improving the quality of the encoding and media player products and developing new ones that are making the area of video on the Web more and more attractive to both content providers and end users. The MPEG format has played a significant role in the development of the interactive video and streaming industries.

A Final Consideration

One important issue that a producer must think about when planning a webcast is what kind of computer hardware and Internet connectivity the target audience has. This is very different from a live televised event, in which the producer does not worry about the type of television set or how the audience is receiving the signal (e.g., over-the-air broadcast, cable, satellite) because it can be assumed that every television set can decode and display the television signal. In contrast, for video streamers the type of technology the target audience has—bandwidth, computer, connection to the Internet—will determine many of the technical and aesthetic aspects of the streaming event. And this in turn makes it imperative that your streamed production, like every other media event production, must be the result of a comprehensive planning process.

Figure 13.8 illustrates one way in which a live event can be streamed and the different elements that come into play for this particular event. Notice that in this case the live signal is encoded on-site and then delivered through the Internet.

As the technology moves forward and broadband channels become more readily available to millions of users, we will see an ever-increasing number of live streaming events. Furthermore, as the convergence between video and computers moves them closer to becoming an integrated medium, we will be able to receive both television broadcasts and live streaming programs through the same channel and view them on the same screen.
