Chapter 1. The Hardware

The arrival of the iPhone changed the whole direction of software development for mobile platforms, and has had a profound impact on the hardware design of the smartphones that have followed it. The arrival of the iPad has turned what was a single class of device into a platform.

Available Sensor Hardware

While the iPhone is almost unique amongst mobile platforms in guaranteeing that your application will run on all of the current devices (see Figure 1-1), there is an increasing amount of variation in the available hardware between the various models, as shown in Table 1-1.

Figure 1-1. Timeline showing the availability of iPhone, iPod Touch, iPad models
Table 1-1. Hardware support in the various iPhone, iPod touch, and iPad models

[Table 1-1 is a grid showing which of the following hardware features are present in each model: Cellular, WiFi, Bluetooth, Speaker, Audio In, Accelerometer, Magnetometer, Gyroscope, GPS, Proximity Sensor, Camera, Video, and Vibration. The columns cover the iPhone (Original, 3G, 3GS, and 4), the iPod touch (1st, 2nd, 3rd, and 4th generation), the iPad (WiFi and 3G models), and the iPad 2 (WiFi and 3G models).]

Most of the examples in this book will be built as iPhone applications; however, depending on the availability of hardware, the examples will run equally well on the iPod touch and iPad. The underlying code is equally applicable, since for the most part we're dealing directly with the hardware itself.

Differences Between iPhone and iPad

The most striking, and obvious, difference between the iPhone and the iPad is screen size. The original iPhone screen has a 480×320 pixel resolution at 163 pixels per inch. The Retina Display of the iPhone 4 and 4th generation iPod touch has a resolution of 960×640 pixels at 326 pixels per inch, while both generations of the iPad screen have a 1024×768 pixel resolution at 132 pixels per inch. This difference will be the single most fundamental thing to affect the way you design your user interface on the two platforms. Attempting to treat the iPad as simply a rather oversized iPod touch or iPhone will lead to badly designed applications; the metaphors you use on the two platforms should be quite different.

The increased screen size of the device means that you can develop desktop-sized applications for the iPad, not just phone-sized ones, although in doing so you will need to rethink the user interface to adapt it to multi-touch. What works on the iPhone or the desktop won't automatically work on an iPad. For example, Apple totally redesigned the user interface of the iWork suite when it moved the applications to the iPad. If you intend to port a Mac OS X desktop application to the iPad, you should do something similar.

Note

Interestingly, there is now an option for iOS developers to port their iPhone and iPad projects directly to Mac OS X. The Chameleon Project (http://chameleonproject.org) is a drop-in replacement for UIKit that runs on Mac OS X, allowing iOS applications to be run on the desktop with little modification, and in some cases none.

Due to its size and function, the iPad is immediately associated in our minds with other, more familiar objects such as a legal pad or a book. Holding the device triggers powerful associations with these items, and we're mentally willing to accept the iPad as a successor to them. This is simply not true of the iPhone; the device is physically too small.

However, this book is not about how to design your user interface or manage your user experience. For the most part the examples I present are simple view-based applications that could equally well be written for the iPhone, iPod touch, or iPad; the user interface is only there to illustrate how to use the underlying hardware. This book is about how to use the collection of sensors in these mobile devices.

Device Orientation and the iPad

The slider button on the side of the iPad can, optionally, be used to lock the device's orientation: if you want the screen to stay in portrait mode, it won't rotate when you turn the device sideways. However, despite the presence of the rotation lock (and unlike the iPhone, where many applications support only portrait mode), an iPad application is expected to support all orientations equally.

Note

Apple has this to say about iPad applications: “An application’s interface should support all landscape and portrait orientations. This behavior differs slightly from the iPhone, where running in both portrait and landscape modes is not required.”

To implement basic support for all interface orientations, you should implement the shouldAutorotateToInterfaceOrientation: method in all of your application's view controllers, returning YES for all orientations. Additionally, you should configure the autoresizing mask property of your views inside Interface Builder so that they respond correctly to layout changes (i.e., rotation of the device).
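For example, a view controller offering this basic support might contain nothing more than the following (this is a minimal sketch; the method goes in whichever view controller classes your application uses):

- (BOOL)shouldAutorotateToInterfaceOrientation:
    (UIInterfaceOrientation)interfaceOrientation {
    // Returning YES for every orientation tells UIKit that this view
    // controller's view can be rotated to match however the user is
    // holding the device.
    return YES;
}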

Going beyond basic support

If you want to go beyond basic support for alternative orientations, there is more work involved. First, for custom views where the placement of subviews is critical to the UI and they need to be precisely located, you should override the layoutSubviews method to add your custom layout code. However, you should override this method only if the autoresizing behaviors of the subviews are not what you desire.
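As a rough sketch of what such an override might look like, assuming a custom view with two purely hypothetical subviews, a statusLabel pinned to the top edge and a contentView filling the rest of the bounds:

- (void)layoutSubviews {
    [super layoutSubviews];

    // These subviews are illustrative only; lay out whatever your
    // custom view actually contains relative to the current bounds.
    CGRect bounds = self.bounds;
    self.statusLabel.frame = CGRectMake(0, 0, bounds.size.width, 44);
    self.contentView.frame = CGRectMake(0, 44, bounds.size.width,
                                        bounds.size.height - 44);
}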

When an orientation event occurs, the UIWindow class works with the front-most UIViewController to adjust the current view. Therefore, if you need to perform tasks before, during, or after a device rotation, you should use the relevant UIViewController rotation methods. Specifically, the view controller's willRotateToInterfaceOrientation:duration:, willAnimateRotationToInterfaceOrientation:duration:, and didRotateFromInterfaceOrientation: methods are called at the relevant points during rotation, allowing you to perform tasks appropriate to the orientation change in progress. For instance, you might use these callbacks to add or remove specific views and reload your data in those views.
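A minimal sketch of how these callbacks might be used in a view controller follows; the sidebarView and tableView properties are hypothetical, standing in for whatever views your controller actually manages:

- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toOrientation
                                duration:(NSTimeInterval)duration {
    // Called before the rotation begins; a good place to hide views that
    // shouldn't be visible while the rotation animation runs.
    self.sidebarView.hidden = YES;
}

- (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)orientation
                                         duration:(NSTimeInterval)duration {
    // Called from within the rotation animation block; frame changes made
    // here are animated along with the rotation itself.
}

- (void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromOrientation {
    // Called once the rotation has finished; re-show any hidden views and
    // reload data whose layout depends on the new orientation.
    self.sidebarView.hidden = NO;
    [self.tableView reloadData];
}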

Detecting Hardware Differences

Because your application will likely support multiple devices, you’ll need to write code to check which features are supported and adjust your application’s behavior as appropriate.

Camera Availability

We cover the camera in detail in Chapter 2; however, it is a simple matter to determine whether a camera is present in the device:

BOOL available = [UIImagePickerController
  isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera];

Once you have determined that a camera is present, you can check whether it supports video by asking for the media types the camera supports:

NSArray *media = [UIImagePickerController availableMediaTypesForSourceType:
                  UIImagePickerControllerSourceTypeCamera];

If the kUTTypeMovie media type is returned as part of the array, then the camera supports video recording:

if ( [media containsObject:(NSString *)kUTTypeMovie ] ){
    NSLog(@"Camera supports movie capture.");
}
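Note that the kUTTypeMovie constant is declared in the MobileCoreServices framework, so you will also need to add that framework to your project and import its header:

#import <MobileCoreServices/MobileCoreServices.h>

Putting the two checks together, a small convenience method (a sketch rather than part of any Apple API) might look like this:

// Returns YES if the device has a camera capable of recording video.
- (BOOL)deviceSupportsVideoRecording {
    if (![UIImagePickerController
            isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        return NO;
    }
    NSArray *media = [UIImagePickerController availableMediaTypesForSourceType:
                      UIImagePickerControllerSourceTypeCamera];
    return [media containsObject:(NSString *)kUTTypeMovie];
}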

Audio Input Availability

An initial poll of whether audio input is available can be made using the AVAudioSession class, by checking the inputIsAvailable property of the shared audio session:

AVAudioSession *audioSession = [AVAudioSession sharedInstance];
BOOL audioAvailable = audioSession.inputIsAvailable;

Note

You will need to add AVFoundation.framework to your project (right-click or Control-click on the Frameworks folder in Xcode, then choose Add→Existing Frameworks). You'll also need to import the header; put the import in your class's header file if you plan to implement the AVAudioSessionDelegate protocol discussed later:

#import <AVFoundation/AVFoundation.h>

You can also be notified of any changes in the availability of audio input, e.g., if a second-generation iPod touch user has plugged in headphones with microphone capabilities. First, nominate your class as a delegate:

audioSession.delegate = self;

And then declare your class as implementing the AVAudioSessionDelegate protocol in its interface declaration:

@interface YourAppDelegate : NSObject <UIApplicationDelegate,
  AVAudioSessionDelegate >

Then implement the inputIsAvailableChanged: method in your class's implementation:

- (void)inputIsAvailableChanged:(BOOL)audioAvailable {
      NSLog(@"Audio availability has changed");
}

GPS Availability

The short answer to a commonly asked question is that the Core Location framework does not provide any way to get direct information, at application run time, about the availability of specific hardware such as GPS, although you can check whether location services are enabled:

BOOL locationAvailable = [CLLocationManager locationServicesEnabled];

However, you can require the presence of GPS hardware for your application to run at all (see "Setting Required Hardware Capabilities" later in this chapter).

Magnetometer Availability

Fortunately, Core Location does allow you to check for the presence of the magnetometer (digital compass) fairly simply:

BOOL magnetometerAvailable = [CLLocationManager headingAvailable];

Setting Required Hardware Capabilities

If your application requires specific hardware features in order to run you can add a list of required capabilities to your application’s Info.plist file. Your application will not start unless those capabilities are present on the device.

To do this, open the project and click on the application’s Info.plist file to open it in the Xcode editor. Click on the bottommost entry in the list. A plus button will appear to the right-hand side of the key-value pair table.

Click on this button to add a new row to the table, and scroll down the list of possible options and select “Required device capabilities” (the UIRequiredDeviceCapabilities key). This will add an (empty) array to the plist file.

The allowed values for the keys are:

  • telephony

  • wifi

  • sms

  • still-camera

  • auto-focus-camera

  • front-facing-camera

  • camera-flash

  • video-camera

  • accelerometer

  • gyroscope

  • location-services

  • gps

  • magnetometer

  • gamekit

  • microphone

  • opengles-1

  • opengles-2

  • armv6

  • armv7

  • peer-peer

A full description of the possible keys is given in the Device Support section of the iPhone Application Programming Guide available from the iPhone Development Center.
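For example, an application that cannot function without a video-capable camera and GPS hardware might contain an entry like the following in the raw XML of its Info.plist file (the particular values shown here are just an illustration; your own list will depend on your application):

<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>video-camera</string>
    <string>gps</string>
</array>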

Persistent WiFi

If your application requires a persistent WiFi connection, you can set the Boolean UIRequiresPersistentWiFi key in the application's Info.plist file to ensure that WiFi is available. If the key is set to YES, the operating system will open a WiFi connection when your application is launched and keep it open while the application is running. If the key is not present, or is set to NO, the operating system will close the active WiFi connection after 30 minutes.
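Viewed as raw XML, the entry in Info.plist would look something like this:

<key>UIRequiresPersistentWiFi</key>
<true/>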

Background Modes

Setting the UIBackgroundModes key in the application's Info.plist file notifies the operating system that the application should continue to run in the background after the user closes it, since it provides specific background services.

Note

Apple has this to say about background modes, “These keys should be used sparingly and only by applications providing the indicated services. Where alternatives for running in the background exist, those alternatives should be used instead. For example, applications can use the significant location change interface to receive location events instead of registering as a background location application.”

There are three possible values for this key: audio, location, and voip. The audio value indicates that, after the user closes it, the application will continue to play audible content. The location value indicates that the application provides location-based information for the user using the standard Core Location services, rather than the newer significant location change service. Finally, the voip value indicates that the application provides Voice-over-IP services; applications marked with this value are automatically launched after system boot so that they can attempt to re-establish their VoIP services.
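As an illustration, a streaming audio application that should keep playing when it moves to the background might include an entry like this in its Info.plist (audio being the only value such an application should declare):

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>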
