20. Media

The Media API provides applications with the ability to record and play audio files. This API overlaps somewhat with the capabilities of the Capture API described in Chapter 12. The Media API's capabilities are limited compared to the Capture API's, and it's likely that not much will be done with this API going forward. The documentation for the Media API even kicks off with a warning to developers reminding them that the API doesn't align with the W3C specification for media capture and that future development will revolve around the Capture API.

That being said, the Media API is still useful. Even though the Capture API allows you to capture audio files, duplicating the same functionality provided in the Media API, the Capture API doesn’t provide a mechanism to play audio files, so you’ll need the Media API to do that. The API is limited in that it offers support only for Android, iOS, and Windows Phone devices today. Any application you build using this API would not work on a BlackBerry device, for example.

This API works differently than any of the other PhoneGap APIs covered so far. As you’ll see in this chapter, the way the API exposes information about the media file creates some challenges for the PhoneGap developer.

The Media Object

The following sections describe how to work with the Media object in your PhoneGap applications.

Creating a Media Object

Before a PhoneGap application can play an audio file, it must first create a Media object that points to the audio file. Once the object has been created, methods are exposed through the object that allow the application to play the file plus pause and stop playback. These capabilities are essential for gaming applications that need the ability to play audio clips during play.

To create a Media object that can be used to play an audio file, at a minimum, an application would use the following code:

theMedia = new Media(mediaFileURI, onSuccess);

In this case, we’re creating a new Media object, theMedia, that the application will interact with. The Media object’s constructor is passed a file URI pointing to the audio file being opened plus an onSuccess function that will be executed when the Media API completes playing or recording an audio clip.


Note

The onSuccess function doesn’t execute when this call completes; it executes every time the object completes playing or recording an audio file.


When this process completes, you have a new Media object to work with, but nothing is known about the audio clip. The object you’ve created exposes methods your application will use to play the audio clip or record a new audio clip, but the application has not accessed the media file yet. As you’ll see later, there are methods you can use to determine the duration of a clip and read or set the current position within the clip, but none of those operations have any value unless the clip is currently being played.

File URI

The mediaFileURI passed to the constructor points to the audio clip that will be accessed. This could be a file located on a file server, as you’ll see in the example code shown later in the chapter. It could also point to a file on the local file system. In this case, the application could use the File API described in Chapter 18 to obtain access to the local file system and browse for and access files.

An application can also package the audio files it needs into the PhoneGap application and access them directly from within the application. The beauty of this approach is that the files will be guaranteed to be there when the application needs them without having to connect to a remote server to play them or rely upon the files being in a particular location in temporary or persistent storage (explained in Chapter 18). The drawback of this approach is that depending on the mobile platforms you support with your application, you will find the files in different locations on the device.

When you include media files in an application on Android, those files can be accessed from the android_asset folder, as shown in the following example:

theMedia = new Media("/android_asset/thefile.mp3", onSuccess);

On iOS, the files are accessible directly from the / folder, as shown in the following example:

theMedia = new Media("/thefile.mp3", onSuccess);

This path, just in case you’re interested, resolves to the www folder within the private folder structure created for each installed application on iOS:

Applications/08B5D45E-1128-4FA1-97D6-1CD092B16CD7/myapp.app/
  www/thefile.mp3

So, if you’re building applications for multiple mobile device platforms, you’ll have to determine which platform the application is running on and pull the files from the appropriate folder depending on the mobile device.
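The platform check described above can be sketched as follows. This helper is my own (it is not part of the Media API), and it assumes the PhoneGap Device API’s device.platform property is available to identify the current platform:

```javascript
// Hypothetical helper (not part of the Media API): build the path to a
// bundled audio file based on the current platform. The platform string
// is what the PhoneGap Device API reports via device.platform.
function mediaPathFor(platform, fileName) {
  if (platform === "Android") {
    // Bundled assets live under /android_asset on Android
    return "/android_asset/" + fileName;
  }
  // On iOS, the application's www folder is the root
  return "/" + fileName;
}

// Inside a PhoneGap app, after deviceready:
// theMedia = new Media(mediaPathFor(device.platform, "thefile.mp3"), onSuccess);
```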

Callback Functions

This is where things start to get weird with the Media API. The onSuccess function being passed in to the constructor doesn’t identify the code that will be executed when the creation of the new Media object completes successfully; instead, you’re specifying the function that will be executed when the playing or recording of audio clips completes successfully. Even though the documentation clearly said this, I didn’t understand it properly until I got my application working and saw what was really happening. You’ll see how this impacts your applications in a little while.

The Media constructor supports additional, optional callback functions, as shown in the following example:

theMedia = new Media(mediaFileURI, onSuccess, onError,
  onStatus);

The optional onError function is executed whenever an error occurs while playing or recording audio. As with the other PhoneGap APIs, the onError function is passed an object the application can use to understand and report the nature of the error. With many of the other PhoneGap APIs, only an error code is returned, so it must be compared to a list of error constants to determine the source of an error; the Media API makes this easier by also including an error message, as shown in the following example:

function onMediaError(e) {
  var msgText = "Media error: " + e.message + " (" + e.code +
    ")";
  console.log(msgText);
  navigator.notification.alert(msgText, null, "Media Error");
}

The API also provides the following error constants, which can be used to identify each error type:

MediaError.MEDIA_ERR_ABORTED

MediaError.MEDIA_ERR_DECODE

MediaError.MEDIA_ERR_NETWORK

MediaError.MEDIA_ERR_NONE_SUPPORTED

So, an application can respond directly to each type of error using an approach similar to the following one:

function onMediaError(e) {
  switch(e.code) {
    case MediaError.MEDIA_ERR_ABORTED:
      //Do something about the error

      break;
    case MediaError.MEDIA_ERR_NETWORK:
      //Do something about the error

      break;
    case MediaError.MEDIA_ERR_DECODE:
      //Do something about the error

      break;
    case MediaError.MEDIA_ERR_NONE_SUPPORTED:
      //Do something about the error

      break;
    default:
      navigator.notification.alert("Unknown Error: " +
        e.message + " (" + e.code + ")", null, "Media Error");
  }
}

The optional onStatus function is periodically executed during and after playback to indicate the status of the activity. The following function illustrates the onStatus function in action:

function onMediaStatus(statusCode) {
  console.log("Status: " + statusCode);
}

The supported values for statusCode are as follows:

• 0: Media.MEDIA_NONE

• 1: Media.MEDIA_STARTING

• 2: Media.MEDIA_RUNNING

• 3: Media.MEDIA_PAUSED

• 4: Media.MEDIA_STOPPED
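For logging purposes, the numeric status codes above can be translated into readable names. The lookup table and helper below are my own illustration, not part of the API; the array indexes mirror the Media.MEDIA_* constant values:

```javascript
// Illustrative lookup (names and helper are mine, not part of the API):
// translate a numeric statusCode into a readable name for logging.
var MEDIA_STATUS_NAMES = ["MEDIA_NONE", "MEDIA_STARTING",
  "MEDIA_RUNNING", "MEDIA_PAUSED", "MEDIA_STOPPED"];

function statusName(statusCode) {
  return MEDIA_STATUS_NAMES[statusCode] ||
    ("UNKNOWN (" + statusCode + ")");
}

// function onMediaStatus(statusCode) {
//   console.log("Status: " + statusName(statusCode));
// }
```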

Current Position

An application can determine the current position within a playing audio clip using the getCurrentPosition method, which uses a callback function to deliver the current position to the application, as shown in the following example:

function updateUI() {
  theMedia.getCurrentPosition(onGetPosition, onMediaError);
}

function onGetPosition(filePos) {
  console.log('Position: ' + Math.floor(filePos) + ' seconds');
}

This value applies only to an audio clip that is currently being played. If the clip is paused or has not yet been played, the method will return a value of -1.

The getCurrentPosition method doesn’t work unless the clip is playing, and, as you’ll see later, the Media object’s play method doesn’t provide a callback function. To be able to update its UI with information about playback progress, then, your application will need to fire off a timer immediately after calling the play method and have that timer query getCurrentPosition and update the application’s UI accordingly. The timer is created through a call to the setInterval method, as shown in the following example:

var theTimer = window.setInterval(updateUI, 1000);

The application will need to suspend updates before playback is paused or stopped. Refer to the source code listing for Example 20-1 for an example of one way to implement this approach.
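The start/suspend pattern can be sketched as a pair of small functions; startPolling and stopPolling are names of my own choosing, and pollFn stands in for whatever UI-update function your application uses:

```javascript
// Minimal sketch of the polling pattern: start a one-second timer when
// playback begins and clear it when playback pauses or stops.
var theTimer = null;

function startPolling(pollFn) {
  stopPolling();                        // never leave two timers running
  theTimer = setInterval(pollFn, 1000);
}

function stopPolling() {
  if (theTimer !== null) {
    clearInterval(theTimer);
    theTimer = null;
  }
}

// startPolling(updateUI);  // right after calling theMedia.play()
// stopPolling();           // right before theMedia.pause() or .stop()
```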

Duration

An application can determine the length of a playing audio clip using the getDuration method, as shown in the following example:

console.log('Duration: ' + theMedia.getDuration() + ' seconds');

getDuration will report a -1 if the audio clip is not currently playing. Unlike getCurrentPosition, the getDuration method will return the clip length even if playback is paused.
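Because getDuration reports -1 until the clip has started playing, any UI code displaying the duration should guard for that case. Here is a small illustrative guard; the helper name is mine, not part of the API:

```javascript
// Illustrative guard: getDuration returns -1 before the clip has been
// played, so report "unknown" instead of a negative number.
function formatDuration(seconds) {
  if (seconds < 0) {
    return "Duration: unknown";
  }
  return "Duration: " + Math.round(seconds) + " seconds";
}

// console.log(formatDuration(theMedia.getDuration()));
```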

Releasing the Media Object

When an application is finished with an audio clip, it should release the memory allocated to the Media object using the following code:

theMedia.release();

Performing this step is especially important on Android devices because of the way Android allocates resources for audio playback.
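A sketch of a cleanup helper follows; the helper is my own (not part of the API). It stops playback, frees the native resources, and returns null so the caller can clear its reference in one step:

```javascript
// Hypothetical cleanup helper: stop playback and release the native
// audio resources, which is especially important on Android.
function cleanupMedia(media) {
  if (media) {
    media.stop();      // safe even if nothing is currently playing
    media.release();   // frees the native audio resources
  }
  return null;
}

// theMedia = cleanupMedia(theMedia);
```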

Playing Audio Files

To work with audio files, an application using the Media API will first create a Media object, as shown earlier, and use methods on that object to control audio playback. The following sections will illustrate how to use each of the options available when playing audio files using this API.

Play

To play the audio clip associated with a Media object, an application should simply call the object’s play method, as shown in the following example:

theMedia.play();

The method does not support any input parameters or any callback functions. It simply starts playing the audio clip (if it can) and allows the application to continue. If your application needs to update the UI to indicate progress, it will need to use the setInterval method as described previously to create a timer that is fired periodically to update the UI and perform whatever additional housekeeping tasks are required.

When the play method is invoked, the application will open the file URI provided in the constructor for the Media object. This is the first time your application will have actually tried to access the media file. If the file is not available or is somehow not playable on the device, an error will be generated, and the application will have to do whatever it can to recover. If the file resource is stored on a remote server, there will be a delay in playback while the application first downloads the file before attempting playback.

This is a risky situation for any application. Since you won’t know whether the audio clip will play until you actually try to play it, your application will have to do extra work to ensure success or at least recover gracefully on failure.

Pause

To pause a playing audio clip, an application should call the Media object’s pause method, as shown in the following example:

theMedia.pause();

If an application invokes pause on a Media object that is not currently playing, no error will be reported to the application.

Stop

To stop playback of an audio clip, an application should call the Media object’s stop method, as shown in the following example:

theMedia.stop();

If an application invokes stop on a Media object that is not currently playing, no error will be reported to the application.

Seek

An application can programmatically seek to a specific position within an audio clip using the seekTo method of the Media object, as shown in the following example:

theMedia.seekTo(3600);

The method takes a single input parameter: a numeric value indicating the position within the audio file in milliseconds. So, in the example shown, playback will skip to a position 3,600 milliseconds (3.6 seconds) from the beginning of the audio clip.
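Since it’s easy to forget that seekTo expects milliseconds rather than seconds, a tiny wrapper can make calling code clearer. The wrapper name is mine, not part of the API:

```javascript
// Illustrative wrapper: accept seconds and convert to the milliseconds
// value the Media object's seekTo method expects.
function seekToSeconds(media, seconds) {
  media.seekTo(Math.round(seconds * 1000));
}

// seekToSeconds(theMedia, 3.6);  // same as theMedia.seekTo(3600)
```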

Recording Audio Files

To record audio files, an application must first create a media object as shown earlier and use methods on that object to control the audio recording process. The following sections will illustrate how to use options available for recording audio using this API.


Note

The audio recording capabilities offered by the PhoneGap Capture API are much better suited for audio recording; I recommend you utilize that API instead.


Start Recording

To begin recording audio, an application should call the startRecord method of a Media object, as shown in the following example:

theMedia.startRecord();

This method doesn’t support any direct callback functions, but the onError function that was defined when the Media object was created will fire if there’s an error creating the recording. If you want to indicate that the application is recording and update the application’s UI with the recording status (recording length, for example), you will have to do it manually using the setInterval method described previously.

Stop Recording

To discontinue recording audio, an application should call the stopRecord method of a Media object, as shown in the following example:

theMedia.stopRecord();

This method doesn’t support any direct callback functions, but the onError function that was defined when the Media object was created will fire if there’s an error.
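Putting the two methods together, a timed recording might be sketched as follows. The helper name, file name, and timings are my own illustration; the sketch assumes it runs inside a PhoneGap app after the deviceready event:

```javascript
// Minimal sketch of a timed recording: start recording, stop after the
// requested number of milliseconds, then release the object's resources.
function recordClip(media, milliseconds, onDone) {
  media.startRecord();
  setTimeout(function () {
    media.stopRecord();
    media.release();
    if (onDone) { onDone(); }
  }, milliseconds);
}

// var rec = new Media("myrecording.mp3", onSuccess, onError);
// recordClip(rec, 10000);  // record ten seconds of audio
```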

Seeing Media in Action

To illustrate how different aspects of the Media API are used within an application, I created Example 20-1, which highlights one way to manage audio clip playback using this API.

Figure 20-1 shows the application at startup. Notice how the audio clip’s duration is set to -1; this was discussed earlier in the chapter. Even though the application has created a Media object, the getDuration method does not return a value unless the clip is actually playing.

As with some of the other examples in this book, this application uses jQuery (www.jquery.com) and jQuery Mobile (www.jquerymobile.com) to create the application’s UI. The application doesn’t use as many features of jQuery Mobile as some other examples do; I wanted the buttons to fit together cleanly and an unobtrusive mechanism for updating page content, so I took this approach. Where you see the HTML attributes data-role and data-icon, those are instructions to jQuery Mobile to help clean up the UI. The $().html functions you see peppered throughout the code are simply a shortcut notation for updating the HTML content of page elements. Beyond that, everything is straight HTML and JavaScript.


Figure 20-1 Example 20-1 at startup


Example 20-1

<!DOCTYPE html>
<html>
  <head>
    <title>Example 20-1</title>
    <meta name="viewport" content="width=device-width,
      height=device-height, initial-scale=1.0,
      maximum-scale=1.0, user-scalable=no" />
    <meta http-equiv="Content-type" content="text/html;
      charset=utf-8">
    <link rel="stylesheet" href="jquery.mobile1.0b3.min.css" />
    <script type="text/javascript" charset="utf-8"
      src="jquery1.6.4.min.js"></script>
    <script type="text/javascript" charset="utf-8"
      src="jquery.mobile1.0b3.min.js"></script>
    <script type="text/javascript" charset="utf-8"
      src="phonegap.js"></script>
    <script type="text/javascript" charset="utf-8"
      src="main.js"></script>
    <script type="text/javascript">

      var fileDur, theMedia, theTimer;

      function onBodyLoad() {
        //Add the PhoneGap deviceready event listener
        document.addEventListener("deviceready", onDeviceReady,
          false);
      }

      function onDeviceReady() {
        //Get our media file and stuff
        init();
      }

      function init() {
        var fileName = "http://server/folder/file_name.mp3";
        console.log(fileName);
        //Create the media object we need to do everything we
        // need here
        theMedia = new Media(fileName, onMediaSuccess,
          onMediaError, onMediaStatus);
        //Update the UI with the track name
        $('#track').html("<b>File:</b> " + fileName);
        $('#pos').html('Duration: ' +
          Math.round(theMedia.getDuration()) + ' seconds');
      }

      function onMediaSuccess() {
        console.log("onMediaSuccess");
        window.clearInterval(theTimer);
        theTimer = null;
      }

      function onMediaError(e) {
        var msgText = "Media error: " + e.message + " (" +
          e.code + ")";
        console.log(msgText);
        navigator.notification.alert(msgText, null,
          "Media Error");
      }

      function onMediaStatus(statusCode) {
        console.log("Status: " + statusCode);
      }

      function doPlay() {
        if(theMedia) {
          //Start the media file playing
          theMedia.play();
          //fire off a timer to update the UI every second as
          //it plays
          theTimer = setInterval(updateUI, 1000);
        } else {
          alert("No media file to play");
        }
      }

      function doPause() {
        if(theMedia) {
          //Pause media play
          theMedia.pause();
          window.clearInterval(theTimer);
        }
      }

      function doStop() {
        if(theMedia) {
          //Kill the timer we have running
          window.clearInterval(theTimer);
          theTimer = null;
          //Then stop playing the audio clip
          theMedia.stop();
        }
      }

      function updateUI() {
        theMedia.getCurrentPosition(onGetPosition,
          onMediaError);
      }

      function onGetPosition(filePos) {
        //We won't have any information about the file until
        //it's actually played. Update the counter on the page
        $('#pos').html('Time: ' + Math.floor(filePos) + ' of '
          + theMedia.getDuration() + ' seconds');
      }
    </script>
  </head>
  <body onload="onBodyLoad()">
    <section id="main" data-role="page" >
      <header data-role="header">
        <h1>Example 20-1</h1>
      </header>
      <div data-role="content">
        <p id="track"></p>
        <p id="pos"></p>
        <div data-role="controlgroup">
          <a onclick="doPlay();" id="btnPlay"
            data-role="button" data-icon="arrow-r">Play</a>
          <a onclick="doPause();" id="btnPause"
            data-role="button" data-icon="grid">Pause</a>
          <a onclick="doStop();" id="btnStop"
            data-role="button" data-icon="delete">Stop</a>
        </div>
      </div>
    </section>
  </body>
</html>


When the user clicks play, the UI will update showing playback status, as shown in Figure 20-2, as the clip plays through the device speakers.


Figure 20-2 Example 20-1 playing an audio clip

To make the application work with a server-based audio clip on iOS, I had to configure the ExternalHosts array, as shown in Figure 20-3. This property is a list of external hosts that the application is authorized to pull content from.


Figure 20-3 Configuring ExternalHosts in Xcode

In my testing, I tried each of the following options for the field:

http://server_name/

http://server_name/*

http://server_name/folder_name/

http://server_name/folder_name/*

http://server_name/folder_name/file_name.ext

None of them worked; I had to use a single asterisk (*), which I thought was a wildcard value authorizing any external resource. What I learned afterward from Bryce is that this field expects a regular expression, not a wildcard. He indicated that it would be changing from regular expressions to wildcards in the future, and a recent update to the PhoneGap wiki included “Wildcards are ok. So if you are connecting to ‘http://phonegap.com’, you have to add ‘phonegap.com’ to the list (or use the wildcard ‘*.phonegap.com’ which will match subdomains as well),” so it looks like it’s already been fixed.

I’m not sure how PhoneGap Build will deal with this restriction. As documented today, Build doesn’t currently support configuration options for security settings like this.
