The main classes for handling audio in the SDK are in the AVFoundation and Media Player frameworks. This chapter will provide a brief overview of how to play and record audio using these frameworks.
While most iPhone models have only one microphone, the iPhone 4 has two. The main microphone is located as usual on the bottom, next to the dock connector, while the second microphone is built into the top near the headphone jack. This second microphone is intended for video calling, but is also used in conjunction with the main microphone to suppress background noise.
By comparison, the iPad 2 has a single microphone, but there is a difference between the two models which could lead to a difference in audio recording quality between the 3G and WiFi-only versions. On the WiFi-only model, the microphone hole is built into the back of the device, whereas on 3G models it's built into the antenna casing. There are suggestions that this difference may lead to cleaner audio recordings on the WiFi model, with the 3G model sounding muffled and echo-prone by comparison.
Both the iPhone 4 and the iPad use an Apple-branded Cirrus Logic 338S0589 for their audio DAC, with a frequency response of 20Hz to 20kHz, and audio sampling of 16-bit at 44.1kHz.
All of the current iPhone, iPad and iPod touch models use a 3.5mm 4-pole TRRS (tip, ring, ring, sleeve) connector which has a somewhat unorthodox pin mapping compared to a standard stereo TRS connector, as shown in Table 3-1.
Let’s first look at playing back existing media stored in the iPod library. Apple has provided convenience classes that allow you to select and play back iPod media inside your own application as part of the Media Player framework.
The following examples make use of the iPod library; this is not present in the iPhone Simulator and will only work correctly on the device itself.
The approach uses picker controllers and delegates as in the previous chapter. In this example I use an MPMediaPickerController that, via the MPMediaPickerControllerDelegate protocol, returns an MPMediaItemCollection object containing the media items the user has selected. The collection of items can be played using an MPMusicPlayerController object.
Let's go ahead and build a simple media player application to illustrate how to use the media picker controller. Open Xcode and start a new View-based Application project, naming it "Audio" when requested. Click on the Audio project file in the Project navigator, select the Target and click on the Build Phases tab. Expand the Link Binary With Libraries section and click on the + button to add the MediaPlayer framework.
Edit the AudioViewController.h interface file to import the MediaPlayer framework and declare the class as an MPMediaPickerControllerDelegate. Then add the IBOutlet instance variables and IBAction methods for the buttons we will create in Interface Builder:
#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>

@interface AudioViewController : UIViewController <MPMediaPickerControllerDelegate> {
    IBOutlet UIButton *pickButton;
    IBOutlet UIButton *playButton;
    IBOutlet UIButton *pauseButton;
    IBOutlet UIButton *stopButton;

    MPMusicPlayerController *musicPlayer;
}

- (IBAction)pushedPick:(id)sender;
- (IBAction)pushedPlay:(id)sender;
- (IBAction)pushedPause:(id)sender;
- (IBAction)pushedStop:(id)sender;

@end
Save your changes, and open the AudioViewController.m implementation file. In the pushedPick: method, instantiate an MPMediaPickerController object. The view will be modal, which means that the user must make a selection to leave picking mode. We'll link this method directly to a button in the user interface:
- (IBAction)pushedPick:(id)sender {
    MPMediaPickerController *mediaPicker =
        [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeAnyAudio];
    mediaPicker.delegate = self;
    mediaPicker.allowsPickingMultipleItems = YES;
    [self presentModalViewController:mediaPicker animated:YES];
    [mediaPicker release];
}
You must now implement the following two delegate methods, which are used to dismiss the view controller and handle the returned items:
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker
  didPickMediaItems:(MPMediaItemCollection *)userMediaItemCollection {
    [self dismissModalViewControllerAnimated:YES];
    // Retain the player, since we release it in dealloc
    musicPlayer = [[MPMusicPlayerController applicationMusicPlayer] retain];
    [musicPlayer setQueueWithItemCollection:userMediaItemCollection];
}

- (void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker {
    [self dismissModalViewControllerAnimated:YES];
}
You’ll link the remaining methods directly to buttons in the user interface:
- (IBAction)pushedPlay:(id)sender {
    [musicPlayer play];
}

- (IBAction)pushedPause:(id)sender {
    [musicPlayer pause];
}

- (IBAction)pushedStop:(id)sender {
    [musicPlayer stop];
}
Remember to release the instance objects in the dealloc method:
- (void)dealloc {
    [pickButton release];
    [playButton release];
    [pauseButton release];
    [stopButton release];

    [musicPlayer release];
    [super dealloc];
}
Save your changes, and click on the AudioViewController.xib NIB file to open it in Interface Builder. Drag four UIButton elements from the Library window into the View window. Double-click on each of them and change the default text to "Pick", "Play", "Pause" and "Stop". Then open the Assistant Editor (View→Assistant Editor→Show Assistant Editor) and Control-click and drag to associate the buttons with their respective IBOutlet and IBAction outlets and actions in the AudioViewController.h interface file, see Figure 3-1.
Save your changes and click on the Run button in the Xcode toolbar to build and deploy the code to your device.
Remember that you’ll need to test the application on your device.
Once the application loads, tap on the "Pick" button to bring up the picker controller, select some songs, and tap the Done button (see Figure 3-2). Press the "Play" button and the music you selected should start playing.
Once playback has begun you need to keep track of the currently playing item and display that to the user. At the very least you must provide some way for the user to pause or stop playback, and perhaps to change their selection. The MPMusicPlayerController class provides two methods for this: beginGeneratingPlaybackNotifications, and a corresponding endGeneratingPlaybackNotifications. Add the highlighted line below to your mediaPicker:didPickMediaItems: delegate method:
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker
  didPickMediaItems:(MPMediaItemCollection *)userMediaItemCollection {
    [self dismissModalViewControllerAnimated:YES];
    musicPlayer = [[MPMusicPlayerController applicationMusicPlayer] retain];
    [musicPlayer setQueueWithItemCollection:userMediaItemCollection];
    [musicPlayer beginGeneratingPlaybackNotifications];
}
When the begin method is invoked, the class will start to generate notifications when the player state changes and when the current playback item changes. Your application can access this information by adding itself as an observer using the NSNotificationCenter class:
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker
  didPickMediaItems:(MPMediaItemCollection *)userMediaItemCollection {
    [self dismissModalViewControllerAnimated:YES];
    musicPlayer = [[MPMusicPlayerController applicationMusicPlayer] retain];
    [musicPlayer setQueueWithItemCollection:userMediaItemCollection];
    [musicPlayer beginGeneratingPlaybackNotifications];

    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter
        addObserver:self
           selector:@selector(handleNowPlayingItemChanged:)
               name:MPMusicPlayerControllerNowPlayingItemDidChangeNotification
             object:musicPlayer];
    [notificationCenter
        addObserver:self
           selector:@selector(handlePlaybackStateChanged:)
               name:MPMusicPlayerControllerPlaybackStateDidChangeNotification
             object:musicPlayer];
}
This will invoke the selector methods in the class when the appropriate notification arrives. You could, for example, use the first to update a UILabel in the view telling the user the name of the currently playing song.
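For example, assuming the view had a hypothetical UILabel outlet called songLabel (not part of the project as built above), the handler for the now-playing notification could set it directly:

```objc
// Sketch only: songLabel is an assumed IBOutlet, not declared earlier.
MPMediaItem *currentItem = [musicPlayer nowPlayingItem];
songLabel.text = [currentItem valueForProperty:MPMediaItemPropertyTitle];
```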
For now let’s just go ahead and implement these methods to print messages to the console log. In the AudioViewController.m implementation file, add the method below. This will be called when the current item being played changes:
- (void)handleNowPlayingItemChanged:(id)notification {
    MPMediaItem *currentItem = [musicPlayer nowPlayingItem];
    NSString *title = [currentItem valueForProperty:MPMediaItemPropertyTitle];
    NSLog(@"Song title = %@", title);
}
Unusually, the MPMediaItem class has only one instance method, valueForProperty:. This is because the class can wrap a number of media types, and each type can have a fairly wide range of metadata associated with it. A full list of possible keys can be found in the MPMediaItem class reference; keys include MPMediaItemPropertyTitle, MPMediaItemPropertyArtwork, and so on.
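Note that not every property comes back as a string. Artwork, for instance, is returned as an MPMediaItemArtwork object, so there is one extra step to get a UIImage out of it (a sketch, reusing the currentItem variable from above):

```objc
// Retrieve the album artwork for a media item; artworkImage will be
// nil if the item has no artwork associated with it.
MPMediaItemArtwork *artwork =
    [currentItem valueForProperty:MPMediaItemPropertyArtwork];
UIImage *artworkImage = [artwork imageWithSize:CGSizeMake(100.0, 100.0)];
```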
You can use this to update the user interface, e.g., changing the state of the play and stop buttons when the music ends:
- (void)handlePlaybackStateChanged:(id)notification {
    MPMusicPlaybackState playbackState = [musicPlayer playbackState];

    if (playbackState == MPMusicPlaybackStatePaused) {
        NSLog(@"Paused");
    } else if (playbackState == MPMusicPlaybackStatePlaying) {
        NSLog(@"Playing");
    } else if (playbackState == MPMusicPlaybackStateStopped) {
        NSLog(@"Stopped");
    }
}
Save your changes, and click on the Run button in the Xcode toolbar to build and deploy your code onto your device. Once your application loads, press the “Pick” button to bring up the pick controller again, select some songs, and press the “Done” button. Press “Play” and the music should start playing. You should also see something similar to the log messages below in the Debugger Console:
2011-06-01 19:23:07.602 Audio[2844:707] Song title = Affirmation
2011-06-01 19:23:07.617 Audio[2844:707] Playing
You could go on to develop the application by displaying information about the currently playing and queued songs. Let’s move on from playing existing media and look at how to play and record your own audio on the device.
The AVAudioRecorder class is part of the AVFoundation framework and provides audio recording capabilities for an application. The framework allows you to:
Record until the user stops the recording
Record for a specified duration
Pause and resume a recording
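Recording for a fixed duration, for example, is a single call on an already-configured recorder (a sketch; setting up the recorder itself is covered below):

```objc
// Record for ten seconds, then stop automatically. When the time is
// up, the audioRecorderDidFinishRecording:successfully: delegate
// method is called, if a delegate has been set.
[recorder recordForDuration:10.0];
```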
The corresponding AVAudioPlayer class (also part of the AVFoundation framework) provides some fairly sophisticated functionality, allowing you to play sound in your application. It can:
Play sounds of any duration
Play sounds from files or memory buffers
Loop sounds
Play multiple sounds simultaneously (one sound per audio player) with precise synchronization
Control relative playback level and stereo positioning for each sound you are playing
Seek to a particular point in a sound file, which supports such application features as fast forward and rewind
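Most of these features map onto simple properties of the player. As a sketch, assuming a configured AVAudioPlayer object called player:

```objc
player.numberOfLoops = -1;   // loop indefinitely until the player is stopped
player.volume = 0.5;         // relative playback level, from 0.0 to 1.0
player.pan = -1.0;           // stereo position: -1.0 left, 0.0 center, 1.0 right
player.currentTime = 30.0;   // seek 30 seconds into the sound
```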
Let's build a simple application to record some audio to a file and play it back later. Open Xcode and start a new View-based Application, naming it "Recorder" when requested.
When the Xcode project opens, add both the AVFoundation and CoreAudio frameworks to the project, in the same way as we added the MediaPlayer framework to the Audio application earlier in the chapter.
Click on the RecorderViewController.h interface file to open it in the Standard Editor and make the following changes to the template file generated for you by Xcode:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface RecorderViewController : UIViewController <AVAudioRecorderDelegate> {
    IBOutlet UIButton *startStopButton;

    NSURL *tmpFile;
    AVAudioRecorder *recorder;
    BOOL recording;
}

- (IBAction)startStopButtonPressed;

@end
Save your changes and open the corresponding RecorderViewController.xib file in Interface Builder. Drag and drop a UIButton from the Object Library in the Utilities pane into the View and change the title text to "Start Recording". Then connect it to the IBOutlet and IBAction in your interface file using the Assistant Editor, as in Figure 3-3.
Save your changes and open the RecorderViewController.m implementation file in the Standard Editor, making the following changes to the default template generated by Xcode:
#import "RecorderViewController.h"
#import <CoreAudio/CoreAudioTypes.h>

@implementation RecorderViewController

- (IBAction)startStopButtonPressed {
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];

    if (!recording) {
        // Add code here...
    } else {
        // Add code here...
    }
}

- (void)dealloc {
    [startStopButton release];
    [tmpFile release];
    [recorder release];
    [super dealloc];
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
}

#pragma mark - View lifecycle

- (void)viewDidLoad {
    [super viewDidLoad];
    recording = NO;
}

- (void)viewDidUnload {
    [super viewDidUnload];
}

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
    return (interfaceOrientation == UIInterfaceOrientationPortrait);
}

@end
Then make the following changes to the startStopButtonPressed method:
- (IBAction)startStopButtonPressed {
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];

    if (!recording) {
        recording = YES;
        [audioSession setCategory:AVAudioSessionCategoryRecord error:nil];
        [audioSession setActive:YES error:nil];
        [startStopButton setTitle:@"Stop Recording"
                         forState:UIControlStateNormal];

        NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
        [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4]
                         forKey:AVFormatIDKey];
        [recordSetting setValue:[NSNumber numberWithFloat:44100.0]
                         forKey:AVSampleRateKey];
        [recordSetting setValue:[NSNumber numberWithInt:2]
                         forKey:AVNumberOfChannelsKey];

        // Retain the URL, since we release it in dealloc
        tmpFile = [[NSURL fileURLWithPath:
            [NSTemporaryDirectory() stringByAppendingPathComponent:
                [NSString stringWithFormat:@"%.0f.%@",
                    [NSDate timeIntervalSinceReferenceDate] * 1000.0, @"caf"]]] retain];

        recorder = [[AVAudioRecorder alloc] initWithURL:tmpFile
                                               settings:recordSetting
                                                  error:nil];
        [recordSetting release];
        [recorder setDelegate:self];
        [recorder prepareToRecord];
        [recorder record];
    } else {
        recording = NO;
        [audioSession setActive:NO error:nil];
        [startStopButton setTitle:@"Start Recording"
                         forState:UIControlStateNormal];
        [recorder stop];
    }
}
If you save your changes and click on the Run button to build and deploy the application to your device, you should see the "Start Recording" button change to "Stop Recording" when pressed. Pressing the button again should change the text back to "Start Recording". In the next section I'll show you a way to check that the device is actually recording audio.
Open up the RecorderViewController.h interface file and add the following IBOutlet instance variable:
IBOutlet UIButton *playButton;
along with the following IBAction method:
- (IBAction)playButtonPressed;
Then single-click on the RecorderViewController.xib file to open it in Interface Builder. Drag and drop a new UIButton into the view and change the title text to "Play Recording". Use the Assistant Editor to connect the new button to the recently added IBOutlet and IBAction in the interface file, see Figure 3-4.
Save your changes, return to the RecorderViewController.m implementation file and add the following method implementation:
- (IBAction)playButtonPressed {
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayback error:nil];
    [audioSession setActive:YES error:nil];

    AVAudioPlayer *player =
        [[AVAudioPlayer alloc] initWithContentsOfURL:tmpFile error:nil];
    [player prepareToPlay];
    // Note: the player is deliberately not released here, since
    // releasing it immediately would cut playback short.
    [player play];
}
Save your changes and click on the Run button in the Xcode toolbar to build and deploy the application to your device. You should see something like Figure 3-5.
If you now tap on the “Start Recording” button the title of the button should change to “Stop Recording”; speak into the iPhone’s microphone for few seconds and tap the button again. Then tap on the “Play Recording” button and you should hear yourself speaking.
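As written, playButtonPressed never releases the AVAudioPlayer it allocates, since releasing it immediately would stop playback. One way to tidy this up (a sketch, assuming you also declare the class as an AVAudioPlayerDelegate and set player.delegate = self before calling play) is to release the player once it reports that it has finished:

```objc
// AVAudioPlayerDelegate callback, invoked when playback completes.
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player
                       successfully:(BOOL)flag {
    [player release];
}
```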