Phones with cameras only started appearing on the market in late 2001; now they’re everywhere. By the end of 2003 more camera phones were sold worldwide than standalone digital cameras, and by 2006 half of the world’s mobile phones had a built-in camera.
The social impact of this phenomenon should not be underestimated; the ubiquity of these devices has had a profound effect on society and on the way that news and information propagate. Mobile phones are carried constantly, which means their cameras are always available. This constant availability has led to some innovative third-party applications, especially on the new generation of smartphones. The iPhone has been designed with always-on connectivity in mind.
Until recently, only the iPhone featured a camera across all of its available models. However, the latest generations of both the iPod touch and the iPad now also have cameras.
The original iPhone and iPhone 3G feature a fixed-focus 2.0-megapixel camera, while the iPhone 3GS features a 3.2-megapixel camera with autofocus, auto white balance and automatic macro focus (as close as 10 cm). The iPhone 3GS camera is also capable of capturing 640×480 pixel video at 30 frames per second. Although the earlier models are physically capable of capturing video, they are limited in software, and this feature is not available at the user level. The latest iPhone 4 features a 5-megapixel camera with better low-light sensitivity and a backside-illuminated sensor. The camera has an LED flash and is capable of capturing 720p HD video at 30 frames per second. The iPhone 4 also has a lower-resolution front-facing camera, which is capable of capturing 360p video at 30 frames per second.
The iPhone 3GS and iPhone 4 cameras are known to suffer from a rolling shutter effect when capturing video. This effect is a form of aliasing that may distort fast-moving objects, or produce image artifacts when lighting levels change as a frame is captured. At the time of writing it’s not clear whether the fourth-generation iPod touch and iPad 2 cameras suffer from the same problem.
The latest generation of iPod touch and the iPad 2 also have both rear- and front-facing cameras, both of which are far lower resolution than the camera fitted to the iPhone 4; see Table 2-1 for details. You’ll notice the difference in size between still and video images on the iPod touch and the iPad 2. It’s unclear whether Apple is using a 1280×720 sensor and cropping the left and right sides of the image for stills, or using a 960×720 sensor and upscaling it at the sides for video. The latter would be an unusual approach for Apple, but it is not inconceivable.
Model                      | Focus     | Flash     | Megapixels             | Size                             | Video
Original iPhone            | Fixed     | No        | 2.0                    | 1600×1200                        | No
iPhone 3G                  | Fixed     | No        | 2.0                    | 1600×1200                        | No
iPhone 3GS                 | Autofocus | No        | 3.2                    | 2048×1536                        | VGA at 30fps
iPhone 4 (rear)            | Autofocus | LED flash | 5.0 still, 1.4 video   | 2592×1944 still, 1280×1024 video | 720p at 30fps
iPhone 4 (front)           | Fixed     | No        | 1.4                    | 1280×1024                        | 360p at 30fps
iPod touch 4th Gen (rear)  | Fixed     | No        | 0.69 still, 0.92 video | 960×720 still, 1280×720 video    | 720p at 30fps
iPod touch 4th Gen (front) | Fixed     | No        | 1.4                    | 1280×1024                        | VGA at 30fps
iPad 2 (rear)              | Fixed     | No        | 0.69 still, 0.92 video | 960×720 still, 1280×720 video    | 720p at 30fps
iPad 2 (front)             | Fixed     | No        | 1.4                    | 1280×1024                        | VGA at 30fps
All models produce geocoded images by default.
The UIImagePickerController class is an Apple-supplied interface for choosing images and movies, and for taking new images or movies (on supported devices). This class handles all of the required interaction with the user and is very simple to use. All you need to do is tell it to start, then dismiss it after the user selects an image or movie.
Let’s go ahead and build a simple application to illustrate how to use the image picker controller. Open Xcode and start a new project. Select a View-based Application for the iPhone, and name it Media when requested.
The first thing to do is set up the main view. This is going to consist of a single button that is pressed to bring up the image picker controller, and a UIImageView that will display the image, or a thumbnail of the video, that is captured.
Select the MediaViewController.h interface file to open it in the editor and add a UIButton and an associated method to the interface file, flagging these as an IBOutlet and an IBAction respectively. You also need to add a UIImageView to display the image returned by the image picker, which also needs to be flagged as an IBOutlet. Finally, add a UIImagePickerController, and flag the view controller as both a UIImagePickerControllerDelegate and a UINavigationControllerDelegate. The code to add to the default template is shown in bold:
#import <UIKit/UIKit.h>

@interface MediaViewController : UIViewController
    <UIImagePickerControllerDelegate, UINavigationControllerDelegate> {
    IBOutlet UIButton *pickButton;
    IBOutlet UIImageView *imageView;
    UIImagePickerController *pickerController;
}

- (IBAction)pickImage:(id)sender;

@end
Next, open the MediaViewController.m implementation file and add a stub for the pickImage: method. As always, remember to release the pickButton, imageView and pickerController in the dealloc method:
- (IBAction)pickImage:(id)sender {
    // Code goes here later
}

- (void)dealloc {
    [pickButton release];
    [imageView release];
    [pickerController release];
    [super dealloc];
}
After saving your changes (⌘-S), single-click on the MediaViewController.xib NIB file to open it in Interface Builder. Drag and drop a UIButton and a UIImageView into the main View window. Go ahead and change the button text to something appropriate, and in the Attributes inspector of the Utilities panel set the UIImageView’s view mode to Aspect Fit. Use the Size inspector to resize the UIImageView to a 4:3 ratio. I used 280×210 points, which fits nicely in a portrait-mode iPhone screen.
Next click on “File’s Owner” in the main panel. In the Connections inspector of the Utilities panel, connect both the pickButton outlet and the pickImage: received action to the button you just dropped into the view, choosing Touch Up Inside as the action; see Figure 2-1. Then connect the imageView outlet to the UIImageView in our user interface.
Click on the MediaViewController.m implementation file and uncomment the viewDidLoad method. You’re going to use this to initialize the UIImagePickerController. Make the changes shown in bold:
- (void)viewDidLoad {
    [super viewDidLoad];
    pickerController = [[UIImagePickerController alloc] init];
    pickerController.allowsEditing = NO;
    pickerController.delegate = self;
}
This allocates and initializes the UIImagePickerController; don’t forget to release it inside the dealloc method.
This line prevents the picker controller from displaying the crop and resize tools. If enabled, the “crop and resize” stage is shown after capturing a still. For video, the trimming interface is presented.
This line sets the delegate to be the current class, the MediaViewController.
The UIImagePickerController can be directed to select an image (or video) from three sources: UIImagePickerControllerSourceTypeCamera, UIImagePickerControllerSourceTypePhotoLibrary and UIImagePickerControllerSourceTypeSavedPhotosAlbum. Each presents a different view to the user, allowing her to take an image (or a video) with the camera, or to choose one from the image library or the saved photos album.
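Before presenting one of these sources you should check that it is actually available on the current device. The following sketch (assuming the pickerController instance created in viewDidLoad) shows one way to interrogate the available sources and media types:

```objc
// Sketch: check which sources are usable on this device before
// offering them to the user. Assumes the pickerController instance
// created earlier in viewDidLoad.
if ([UIImagePickerController isSourceTypeAvailable:
        UIImagePickerControllerSourceTypeCamera]) {
    // The device has a camera; find out what it can capture
    NSArray *types = [UIImagePickerController
        availableMediaTypesForSourceType:
            UIImagePickerControllerSourceTypeCamera];
    BOOL canTakeVideo = [types containsObject:(NSString *)kUTTypeMovie];
    NSLog(@"Camera available; video capture supported: %d", canTakeVideo);
    pickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
} else {
    // No camera (e.g. the Simulator); fall back to the photo library
    pickerController.sourceType =
        UIImagePickerControllerSourceTypePhotoLibrary;
}
```

We’ll use exactly this pattern of falling back to a library source later in the chapter when writing the pickImage: method.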
Now write the pickImage: method that will present the image picker controller to the user. There are a few good ways to do this, depending on the interface you want to present. The first method makes use of a UIActionSheet to choose the source type, presenting the user with a list to decide whether to take a still image or a video:
- (IBAction)pickImage:(id)sender {
    UIActionSheet *popupQuery = [[UIActionSheet alloc]
        initWithTitle:nil
        delegate:self
        cancelButtonTitle:@"Cancel"
        destructiveButtonTitle:nil
        otherButtonTitles:@"Photo", @"Video", nil];
    popupQuery.actionSheetStyle = UIActionSheetStyleBlackOpaque;
    [popupQuery showInView:self.view];
    [popupQuery release];
}
If we’re going to use this method, we must specify that the view controller supports the UIActionSheetDelegate protocol in the interface file:
@interface MediaViewController : UIViewController
    <UIImagePickerControllerDelegate, UINavigationControllerDelegate,
     UIActionSheetDelegate> {
In the implementation file, provide an actionSheet:clickedButtonAtIndex: delegate method to handle presenting the image picker interface. If there is no camera present, the source will be set to the saved photos album:
- (void)actionSheet:(UIActionSheet *)actionSheet
        clickedButtonAtIndex:(NSInteger)buttonIndex {
    // If the user tapped Cancel, don't present the picker at all
    if (buttonIndex == actionSheet.cancelButtonIndex) {
        return;
    }
    if ([UIImagePickerController isSourceTypeAvailable:
            UIImagePickerControllerSourceTypeCamera]) {
        pickerController.sourceType =
            UIImagePickerControllerSourceTypeCamera;
    } else {
        pickerController.sourceType =
            UIImagePickerControllerSourceTypeSavedPhotosAlbum;
    }
    if (buttonIndex == 0) {
        pickerController.mediaTypes =
            [NSArray arrayWithObject:(NSString *)kUTTypeImage];
    } else if (buttonIndex == 1) {
        pickerController.mediaTypes =
            [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
    }
    [self presentModalViewController:pickerController animated:YES];
}
Since we’ve made use of the kUTTypeImage and kUTTypeMovie type codes in this method, we have to add the Mobile Core Services framework to our project.
For those of you used to working in Xcode 3, the way you add frameworks to a project has changed. In the past you could right-click on the Frameworks group and select Add→Existing Frameworks. Unfortunately this is no longer possible, and adding frameworks has become a more laborious process.
To add the framework, select the Media project file in the Project navigator. You should see a panel like the one in Figure 2-2. Select the Target and click on the Build Phases tab. Expand the Link Binary With Libraries section and use the + button to add MobileCoreServices.framework from the list of available frameworks.
Add the following to the view controller interface file:
#import <MobileCoreServices/MobileCoreServices.h>
After saving the changes, click on the Run button. You should be presented with an interface much like Figure 2-3 (left). Clicking on the “Go” button should present the UIActionSheet that prompts the user to choose between still image and video capture.
If you do go ahead and test the application in the iPhone Simulator, you’ll notice that there aren’t any images in the Saved Photos folder; see Figure 2-3 (right). However, there is a way around this problem. In the Simulator, tap on the Safari icon and drag and drop a picture from your computer (you can drag it from the Finder or iPhoto) into the browser. From the browser, you can save the image to the Saved Photos folder.
Instead of explicitly choosing between still image and video via an action sheet, you can let the camera interface itself offer both. The following alternative code determines whether your device supports a camera and adds all of the available media types to an array. If there is no camera present, the source will again be set to the saved photos album:
- (IBAction)pickImage:(id)sender {
    if ([UIImagePickerController isSourceTypeAvailable:
            UIImagePickerControllerSourceTypeCamera]) {
        pickerController.sourceType =
            UIImagePickerControllerSourceTypeCamera;
        NSArray *mediaTypes = [UIImagePickerController
            availableMediaTypesForSourceType:
                UIImagePickerControllerSourceTypeCamera];
        pickerController.mediaTypes = mediaTypes;
    } else {
        pickerController.sourceType =
            UIImagePickerControllerSourceTypeSavedPhotosAlbum;
    }
    [self presentModalViewController:pickerController animated:YES];
}
Here, instead of presenting an action sheet and allowing the user to choose which source type to use, we interrogate the hardware and decide which source types are available. We can see the different interfaces these two methods generate in Figure 2-4. The left image is the still camera interface, the middle image is the video camera interface, and the final (right-hand) image is the joint interface, which allows the user to take either a still image or video.
The final interface, where the user may choose to return either a still image or a video, is the one presented by the second version of the pickImage: method. This code is also more flexible, as it will run unmodified on any of the iPhone models that have a camera. If your application requires either a still image or a video (and cannot handle both), you should be careful to specify either the kUTTypeImage or kUTTypeMovie media type, as in the first version of the method.
You can choose either of the two different methods I’ve talked about above to present the image picker controller to the user. In either case when the user has finished picking an image (or video) the following delegate method will be called in the view controller:
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType =
        [info objectForKey:UIImagePickerControllerMediaType];
    // Compare string contents, not pointers
    if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) {
        // add code here
    } else {
        imageView.image =
            [info objectForKey:UIImagePickerControllerOriginalImage];
    }
    [self dismissModalViewControllerAnimated:YES];
}
When the UIImagePickerController returns, it passes an NSDictionary containing a number of keys, listed in Table 2-2. Use the UIImagePickerControllerMediaType key to decide whether the image picker is returning a still image or a movie to its delegate method.
Key                                  | Object type
UIImagePickerControllerMediaType     | NSString
UIImagePickerControllerOriginalImage | UIImage
UIImagePickerControllerEditedImage   | UIImage
UIImagePickerControllerCropRect      | NSValue (wrapping a CGRect)
UIImagePickerControllerMediaURL      | NSURL
UIImagePickerControllerReferenceURL  | NSURL
UIImagePickerControllerMediaMetadata | NSDictionary
We can retrieve the original image (or the cropped version, if editing is enabled) directly from the NSDictionary that was passed into the delegate method. This image reference can be passed directly to the UIImageView and displayed, as shown in the code in the next section and Figure 2-5.
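When allowsEditing is enabled on the picker, you will usually want the cropped version rather than the original. A minimal sketch (assuming the info dictionary passed to the delegate method and the imageView outlet from earlier):

```objc
// Sketch: prefer the cropped/edited image when editing was enabled,
// falling back to the original capture. The edited key is only
// present when allowsEditing is YES.
UIImage *picked = [info objectForKey:UIImagePickerControllerEditedImage];
if (picked == nil) {
    picked = [info objectForKey:UIImagePickerControllerOriginalImage];
}
imageView.image = picked;
```

The crop rectangle the user chose is also available under the UIImagePickerControllerCropRect key, should you want to apply it to the original image yourself.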
Unlike still photos, there is no easy way to retrieve a thumbnail of a video. This section illustrates two methods of grabbing raw image data from an image picker.
One way to grab a video frame for creating a thumbnail is to drop down to the underlying Quartz framework to capture an image of the picker itself. To do so, add the following highlighted code to the image picker delegate described previously in this chapter:
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType =
        [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) {
        CGSize pickerSize =
            CGSizeMake(picker.view.bounds.size.width,
                       picker.view.bounds.size.height - 100);
        UIGraphicsBeginImageContext(pickerSize);
        [picker.view.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        imageView.image = thumbnail;
    } else {
        imageView.image =
            [info objectForKey:UIImagePickerControllerOriginalImage];
    }
    [self dismissModalViewControllerAnimated:YES];
}
Since picker.view.layer is part of the UIView parent class and is of type CALayer, the compiler doesn’t know about the renderInContext: method unless you import the QuartzCore header file. Add the following to the implementation file:
#import <QuartzCore/QuartzCore.h>
Another method of obtaining a thumbnail, which will result in a better image, is to use the AVFoundation framework. First replace the code you added in the previous section with the highlighted code below:
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType =
        [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) {
        AVURLAsset *asset = [[AVURLAsset alloc]
            initWithURL:[info objectForKey:UIImagePickerControllerMediaURL]
                options:nil];
        AVAssetImageGenerator *generator =
            [[AVAssetImageGenerator alloc] initWithAsset:asset];
        generator.appliesPreferredTrackTransform = YES;
        [asset release];

        CMTime thumbTime = CMTimeMakeWithSeconds(0, 30);
        AVAssetImageGeneratorCompletionHandler handler =
            ^(CMTime requestedTime, CGImageRef im, CMTime actualTime,
              AVAssetImageGeneratorResult result, NSError *error) {
            if (result != AVAssetImageGeneratorSucceeded) {
                NSLog(@"Error: %@", error);
            }
            // The property setter retains the image; an extra
            // retain here would leak it
            imageView.image = [UIImage imageWithCGImage:im];
            [generator release];
        };

        CGSize maxSize = CGSizeMake(320, 180);
        generator.maximumSize = maxSize;
        [generator generateCGImagesAsynchronouslyForTimes:
            [NSArray arrayWithObject:[NSValue valueWithCMTime:thumbTime]]
                completionHandler:handler];
    } else {
        imageView.image =
            [info objectForKey:UIImagePickerControllerOriginalImage];
    }
    [self dismissModalViewControllerAnimated:YES];
}
Then make sure to add the AVFoundation and CoreMedia frameworks to the project, as you did for MobileCoreServices, and import the header files at the top of the implementation:
#import <AVFoundation/AVFoundation.h> #import <CoreMedia/CoreMedia.h>
The only real downside of this method is that AVAssetImageGenerator makes use of key frames, which are typically spaced at one-second intervals. Hopefully the key frame will make a good thumbnail image.
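If the nearest key frame is not good enough, newer SDKs (the tolerance properties were introduced after the SDK this chapter targets) let you tighten the generator’s seek tolerance so it decodes the exact frame you asked for, at the cost of slower extraction. A hedged sketch, reusing the generator from the listing above:

```objc
// Sketch: request the exact frame at thumbTime rather than the
// nearest key frame. Decoding is slower because the generator must
// decode forward from the previous key frame to the requested time.
generator.requestedTimeToleranceBefore = kCMTimeZero;
generator.requestedTimeToleranceAfter  = kCMTimeZero;
```

For a thumbnail this precision is rarely worth the extra decode time, but it matters if you are extracting frames for analysis.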
You can save both images and videos to the Photo Album using the UIImageWriteToSavedPhotosAlbum and UISaveVideoAtPathToSavedPhotosAlbum functions. The example in this section will also obtain a thumbnail image for the video.
The saving functions in this example are asynchronous; if the application is interrupted (e.g., by a phone call) or terminated, the image or video will be lost. You need to ensure, as part of your application interface, that your user is aware that processing is happening in the background.
The following example saves the image to the Photo Album by adding a call to UIImageWriteToSavedPhotosAlbum to the image picker delegate, and then provides feedback when the image has been successfully saved or an error occurs. Add the following highlighted lines to the image picker delegate presented earlier in the chapter:
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType =
        [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) {
        CGSize pickerSize =
            CGSizeMake(picker.view.bounds.size.width,
                       picker.view.bounds.size.height - 100);
        UIGraphicsBeginImageContext(pickerSize);
        [picker.view.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        imageView.image = thumbnail;
    } else {
        UIImage *image =
            [info objectForKey:UIImagePickerControllerOriginalImage];
        UIImageWriteToSavedPhotosAlbum(
            image, self,
            @selector(imageSavedToPhotosAlbum:didFinishSavingWithError:contextInfo:),
            nil);
        imageView.image = image;
    }
    [self dismissModalViewControllerAnimated:YES];
}
Then add the following method, which presents a UIAlertView notifying the user that the save has occurred:
- (void)imageSavedToPhotosAlbum:(UIImage *)image
        didFinishSavingWithError:(NSError *)error
                     contextInfo:(void *)contextInfo {
    NSString *title;
    NSString *message;
    if (!error) {
        title = @"Photo Saved";
        message = @"The photo has been saved to your Photo Album";
    } else {
        title = NSLocalizedString(@"Error Saving Photo", @"");
        message = [error description];
    }
    UIAlertView *alert = [[UIAlertView alloc]
        initWithTitle:title
              message:message
             delegate:nil
    cancelButtonTitle:@"OK"
    otherButtonTitles:nil];
    [alert show];
    [alert release];
}
The call to UIImageWriteToSavedPhotosAlbum can typically take up to 4 seconds to complete in the background. If the application is interrupted or terminated during this time, the image may not have been saved.
You can similarly add the following highlighted lines to the delegate method to save captured video:
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType =
        [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) {
        NSString *tempFilePath =
            [[info objectForKey:UIImagePickerControllerMediaURL] path];
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(tempFilePath)) {
            UISaveVideoAtPathToSavedPhotosAlbum(
                tempFilePath, self,
                @selector(video:didFinishSavingWithError:contextInfo:),
                tempFilePath);
        }
        CGSize pickerSize =
            CGSizeMake(picker.view.bounds.size.width,
                       picker.view.bounds.size.height - 100);
        UIGraphicsBeginImageContext(pickerSize);
        [picker.view.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        imageView.image = thumbnail;
    } else {
        UIImage *image =
            [info objectForKey:UIImagePickerControllerOriginalImage];
        UIImageWriteToSavedPhotosAlbum(
            image, self,
            @selector(imageSavedToPhotosAlbum:didFinishSavingWithError:contextInfo:),
            nil);
        imageView.image = image;
    }
    [self dismissModalViewControllerAnimated:YES];
}
Next add the following method to report whether the video has been successfully saved to the device’s Photo Album, or an error occurred:
- (void)video:(NSString *)videoPath
        didFinishSavingWithError:(NSError *)error
                     contextInfo:(NSString *)contextInfo {
    NSString *title;
    NSString *message;
    if (!error) {
        title = @"Video Saved";
        message = @"The video has been saved to your Photo Album";
    } else {
        title = NSLocalizedString(@"Error Saving Video", @"");
        message = [error description];
    }
    UIAlertView *alert = [[UIAlertView alloc]
        initWithTitle:title
              message:message
             delegate:nil
    cancelButtonTitle:@"OK"
    otherButtonTitles:nil];
    [alert show];
    [alert release];
}
Make sure you’ve saved your changes, and click on the Run button in the Xcode toolbar to compile and deploy the application to your device. If everything is working, you will see a thumbnail after you take a photo or video. After a few seconds a confirmation dialog will appear reporting success or an error. See Figure 2-6.
If you are capturing video, you can make some video-specific customizations using the videoQuality and videoMaximumDuration properties of the UIImagePickerController class:
pickerController.videoQuality = UIImagePickerControllerQualityTypeLow;
pickerController.videoMaximumDuration = 90; // maximum duration of 90 seconds
Table 2-3 shows the expected size of a typical 90-second movie file for the three possible image quality levels; the videoQuality property defaults to UIImagePickerControllerQualityTypeMedium.
Quality                                  | Size
UIImagePickerControllerQualityTypeLow    | 1.8 MB
UIImagePickerControllerQualityTypeMedium | 8.4 MB
UIImagePickerControllerQualityTypeHigh   | 32 MB
The maximum, and default, value of the videoMaximumDuration property is 10 minutes. Users are forced to trim longer videos to match the duration you request.