Hour 18. Sensing Orientation and Motion


What You’ll Learn in This Hour:

• The purpose of Core Motion

• How to determine a device’s orientation

• How to measure tilt and acceleration

• How to measure rotation


The Nintendo Wii introduced motion sensing as an effective input technique for mainstream consumer electronics. Apple has applied this technology with great success to the iPhone, iPod touch, and iPad.

Apple devices are equipped with an accelerometer that provides measurements of the orientation, movement, and tilt of the device. With the accelerometer, a user can control applications by simply adjusting the physical orientation of the device and moving it in space. In addition, Apple has included a gyroscope in all currently shipping iDevices. This enables the device to sense rotation motions that aren’t against the force of gravity. In short, if a user moves a gyroscope-enabled device, there are ways that your applications can detect and react to that movement.

The motion-input mechanism is exposed to third-party applications in iOS through a framework called Core Motion. In Hour 17, “Using Advanced Touches and Gestures,” you saw how the accelerometer provides the shake gesture. Now you learn how to take direct readings from iOS for determining orientation, acceleration, and rotation. For all the magic that a motion-enabled application appears to exhibit, using these features is surprisingly simple.

Understanding Motion Hardware

All iOS devices, to date, can sense motion through the use of the accelerometer and gyroscope hardware. To get a better sense for what this means to your applications, let’s review what information each of these pieces of hardware can provide.


Tip

For most applications in this book, using the iOS Simulator is perfectly acceptable, but the Simulator does not simulate the accelerometer or gyroscope hardware. So for this hour, be sure to have a physical device provisioned for development. To run this hour’s applications on your device, follow the steps in Hour 1, “Preparing Your System and iDevice for Development.”


Accelerometer

An accelerometer uses a unit of measure called a g, which is short for gravity. 1g is the force pulling down on something resting at sea level on Earth (9.8 meters per second squared). You don’t normally notice the feeling of 1g (that is, until you trip and fall, and then 1g hurts pretty bad). You are familiar with g-forces higher and lower than 1g if you’ve ever ridden on a roller coaster. The pull that pins you to your seat at the bottom of the roller coaster hill is a g-force greater than 1, and the feeling of floating up out of your seat at the top of a hill is negative g-force at work.


Note

An accelerometer measures acceleration relative to a free fall—meaning that if you drop your iDevice into a sustained free fall, say off the Empire State Building, its accelerometer will measure 0g on the way down. (Just trust me; don’t try this out.) The accelerometer of a device sitting in your lap, however, measures 1g along the axis it is resting on.


The measurement of the 1g pull of Earth’s gravity on the device while it’s at rest is how the accelerometer can be used to measure the orientation of the device. The accelerometer provides a measurement along three axes, called x, y, and z (see Figure 18.1).


FIGURE 18.1 The three measurable axes.

Depending on how your device is resting, the 1g of gravity will be pulling differently on the three possible axes. If it is standing straight up on one of its edges or is flat on its back or on its screen, the entire 1g is measured on one axis. If the device is tilted at an angle, the 1g is spread across multiple axes (see Figure 18.2).
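To make that concrete, here is a small sketch (an illustration only, not part of this hour’s projects) of how a single tilt angle splits the fixed 1g pull across two measurement axes; the components always recombine to 1g:

```swift
import Foundation

// Hypothetical helper: a device tilted by `tiltRadians` around one axis
// distributes the 1g gravity vector across two of its measurement axes.
func gravityComponents(tiltRadians: Double) -> (x: Double, y: Double) {
    return (sin(tiltRadians), cos(tiltRadians))
}

let flatOnBack = gravityComponents(tiltRadians: 0)             // all 1g on one axis
let tilted45 = gravityComponents(tiltRadians: Double.pi / 4)   // split across two axes
```

At 45 degrees, each of the two axes reads about 0.707g, and 0.707² + 0.707² still equals 1g of total force.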


FIGURE 18.2 The 1g of force on a device at rest.

Interpreting iOS accelerometer data depends largely on how it will be used. Acceleration data, for example, provides a measure of the g-forces on a device as it is being moved. A positive value along an axis means the device is accelerating in that axis’s positive direction; a negative value means it is accelerating in the opposite direction.

When the device is at rest, however, acceleration is 0, but there are still g-forces acting on it that can be used to determine how it is positioned. A measurement of the amount of “tilt” of your device around its x, y, and z axes is called its attitude and is independent of motion. Attitude values are referred to using the terms roll, pitch, and yaw. For iOS devices, roll is the amount of “tilt” around the y axis, pitch measures tilt around the x axis, and yaw measures tilt around the z axis. Attitude is sometimes defined relative to a reference frame—the “zero” point for tilting. When a game or application asks you to calibrate the tilting of your device, it is setting the reference frame.
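To see how attitude relates to those resting gravity components, here is a hedged sketch (using one common sign convention, and not code from this hour’s projects) of estimating roll and pitch from an accelerometer reading. Notice that yaw cannot be recovered from gravity alone, which is exactly why the gyroscope is needed:

```swift
import Foundation

// Hypothetical estimate of roll and pitch from a resting gravity reading.
// Sign conventions vary between frameworks; this is one common choice.
func estimatedAttitude(gx: Double, gy: Double, gz: Double)
        -> (roll: Double, pitch: Double) {
    let roll = atan2(gx, gz)                        // tilt around the y axis
    let pitch = atan2(-gy, sqrt(gx * gx + gz * gz)) // tilt around the x axis
    return (roll, pitch)
}

// With the full 1g on the z axis (device lying flat), roll and pitch are 0.
let flat = estimatedAttitude(gx: 0, gy: 0, gz: 1)
```

In practice, Core Motion performs this kind of sensor fusion for you and hands back a ready-made attitude object.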

Gyroscope

Think about what you’ve just learned about the accelerometer hardware. Is there anything it can’t do? It might seem, at first, that by using the measurements from the accelerometer, we can make a good guess as to what the user is doing, no matter what. Unfortunately, that’s not quite the case.

The accelerometer measures the force of gravity distributed across your device. Imagine, however, that your iPhone or iPad is lying face up on a table. We can detect this with the accelerometer, but what we cannot detect is if you start spinning it around in a rousing game of “spin the bottle... err... iDevice.” The accelerometer will still register the same value regardless of how the device is spinning.

The same goes for a device standing on one of its edges and rotating in place. The accelerometer can register motion only when the device’s orientation changes with respect to gravity; the gyroscope, by contrast, can determine whether the device is rotating in any given orientation, even while that orientation is maintained.

When querying a device’s gyroscope, the hardware reports back with a rotation value along the x, y, and z axes. The value is a measurement, in radians per second, of the speed of rotation along that axis. If you don’t remember your geometry, rotating 2 × pi radians is a complete circle, so a reading of 2 × pi (about 6.3) on any of the gyroscope’s three axes indicates that the device is spinning once per second, along that axis, as shown in Figure 18.3.
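As a quick unit check, here is a tiny conversion helper (an illustration, not part of this hour’s projects) expressing that relationship:

```swift
import Foundation

// A gyroscope axis reading, in radians per second, divided by 2 * pi
// gives the number of full revolutions per second around that axis.
func revolutionsPerSecond(radiansPerSecond: Double) -> Double {
    return radiansPerSecond / (2 * Double.pi)
}

// A reading of about 6.28 rad/s is one full spin per second.
let spinRate = revolutionsPerSecond(radiansPerSecond: 2 * Double.pi)
```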


FIGURE 18.3 A reading of roughly 6.3 from the gyroscope indicates that the device is rotating (spinning in a complete circle) at a rate of one revolution per second.

Accessing Orientation and Motion Data

To access orientation and motion information, we use two different approaches:

• First, to determine and react to distinct changes in orientation, we can request that our iOS device send notifications to our code as the orientation changes. We can then compare the messages we receive to constants representing all possible device orientations—including face up and face down—and determine what the user has done.

• Second, we take advantage of a framework called Core Motion to directly access the accelerometer and gyroscope data on scheduled intervals.

Let’s take a closer look before starting this hour’s projects.

Requesting Orientation Notifications Through UIDevice

Although it is possible to read the accelerometer hardware directly and use the values it returns to determine a device’s orientation, Apple has made the process much simpler for developers. The singleton instance UIDevice (representing our device) includes a method, beginGeneratingDeviceOrientationNotifications, that tells iOS to begin sending orientation notifications to the notification center (NSNotificationCenter). Once the notifications start, we can register with an NSNotificationCenter instance to have a method of our choosing invoked automatically when the device’s orientation changes.

Besides just knowing that an orientation event occurred, we need some reading of what the orientation is. We get this via the UIDevice orientation variable property. This property, of type UIDeviceOrientation, can be one of six predefined values:

• UIDeviceOrientation.FaceUp: The device is lying on its back, facing up.

• UIDeviceOrientation.FaceDown: The device is lying on its front, with the back facing up.

• UIDeviceOrientation.Portrait: The device is in the “normal” orientation, with the Home button at the bottom.

• UIDeviceOrientation.PortraitUpsideDown: The device is in portrait orientation with the Home button at the top.

• UIDeviceOrientation.LandscapeLeft: The device is lying on its left side.

• UIDeviceOrientation.LandscapeRight: The device is lying on its right side.

By comparing the variable property to each of these values, we can determine the orientation and react accordingly.

Reading Acceleration, Rotation, and Attitude with Core Motion

To work directly with motion data (readings from the accelerometer and gyroscope), you need to work with Core Motion. Despite the complex nature of motion sensors, this is one of the easier frameworks to integrate into your applications.

First, you need to add the Core Motion framework to your project. Next, you create an instance of the Core Motion motion manager: CMMotionManager. The motion manager should be treated as a singleton—one instance can provide accelerometer and gyroscope motion services for your entire application.


Note

Recall that a singleton is a class that is instantiated only once in the lifetime of your application. Readings from iOS hardware are often provided through singletons because there is only one accelerometer and one gyroscope in the device. Multiple CMMotionManager instances wouldn’t add any value and would add the complexity of managing them.

Unlike orientation notifications, the Core Motion motion manager enables you to determine how often you receive updates (in seconds) from the motion sensors and allows you to directly define a closure that executes each time an update is ready.



Tip

You need to decide how often your application can benefit from receiving motion updates. You should decide this by experimenting with different update values until you come up with an optimal frequency. Receiving more updates than your application can benefit from can have some negative consequences. Your application will use more system resources, which might negatively impact the performance of the other parts of your application and can certainly affect the battery life of the device. Because you’ll probably want fairly frequent updates so that your application responds smoothly, you should take some time to optimize the performance of your CMMotionManager-related code.


Setting up your application to use CMMotionManager is a simple three-step process of initializing and allocating the motion manager, setting an updating interval, and then requesting that updates begin and be sent to a handler closure via startDeviceMotionUpdatesToQueue:withHandler.

Consider the code snippet in Listing 18.1.

LISTING 18.1 Using the Motion Manager


1: var motionManager: CMMotionManager = CMMotionManager()
2: motionManager.deviceMotionUpdateInterval = 0.01
3: motionManager.startDeviceMotionUpdatesToQueue(NSOperationQueue.currentQueue(), withHandler: {
4:     (motion: CMDeviceMotion!, error: NSError!) in
5:      // Do something with the motion data here!
6: })


In line 1, the motion manager is allocated and initialized.

Line 2 requests that the motion sensors send updates every .01 seconds (or 100 times per second).

Lines 3–6 start the motion updates and define a closure that is called for each update.

The closure can look confusing, but in essence, it’s like a new method being defined within the startDeviceMotionUpdatesToQueue:withHandler invocation.

The motion handler is passed two parameters: motion, an object of type CMDeviceMotion; and error, of type NSError. The motion object includes variable properties for everything motion-related that you need:

• userAcceleration: An acceleration variable property of the type CMAcceleration. This is the information we are interested in reading; it includes acceleration values, measured in gravities, along the x, y, and z axes.

• rotationRate: Data of the type CMRotationRate. This provides rotation rates, in radians per second, along the x, y, and z axes.

• attitude: An object that contains information regarding the current tilt of the device. Within the attitude object are roll, pitch, and yaw values, measured in radians.


Tip

Unfortunately, the values returned in the motion variables will require conversion before they can be used. You’re going to see several uses of Double() and a CGFloat() or two before we’re through.

The basic steps for figuring out when this is necessary are to look at the Xcode documentation to see if the variable types you’re using (such as the acceleration, rotation, or attitude) match the method parameters that you’re using them in. If they don’t, you’ll need to wrap them in a conversion method like CGFloat(), Double(), or Float(). Someday, I expect these conversions to be automatic, but for now, there are many type-matching problems that appear with floating point values in Swift.
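For example, this plain-Swift snippet (Float stands in for CGFloat here, since CGFloat lives in the Core Graphics framework) shows the explicit wrapping Swift demands:

```swift
// Swift never converts between floating-point types implicitly, so motion
// values (Doubles) must be explicitly wrapped before being handed to an
// API that expects another floating-point type.
let rollRadians: Double = 0.5
let rollAsFloat: Float = Float(rollRadians)     // explicit conversion required
let backToDouble: Double = Double(rollAsFloat)  // and again going the other way
```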


When you have finished processing motion updates, you can stop receiving them with the CMMotionManager method stopDeviceMotionUpdates.

Feeling confused? Not to worry; it makes much more sense seeing the pieces come together in code.


Note

We’ve skipped over an explanation of the chunk of code that refers to the NSOperationQueue. An operations queue maintains a list of operations that need to be dealt with (such as motion readings). The queue you need to use already exists, and we can access it with the code fragment NSOperationQueue.currentQueue(). So long as you follow along, there’s no need to worry about managing operation queues manually.


Sensing Orientation

As our first introduction to detecting motion, we create the Orientation application. Orientation won’t be wowing users; it’s simply going to say which of six possible orientations the device is currently in. The Orientation application will detect these orientations: standing up, upside down, left side, right side, face down, and face up.

Implementation Overview

To create the Orientation application, we build an interface that contains a single label and then code up a method that executes whenever the orientation changes. For this method to be called, we must register with the NSNotificationCenter to receive notifications when appropriate.

Remember, this isn’t the same as interface rotation and resizing; it doesn’t necessitate a change in the interface, and it can handle the upside-down, face-up, and face-down orientations as well.

Setting Up the Project

As you’ve grown accustomed to doing, begin by starting Xcode and creating a new project. We use our old standby, the Single View Application template, and name the new project Orientation.

Planning the Variables and Connections

In this project, we need a single label in our main application view that can be updated from code. We name this orientationLabel and, as you might guess, set it to a string containing the current device orientation.

Designing the Interface

Orientation’s UI is simple (and very stylish); I’ve used a yellow text label in a field of gray. To create your interface, select the Main.storyboard file to open the Interface Builder (IB) editor. Use the Attributes Inspector with the view controller selected to switch to a standard device size.

Next, open the Object Library (View, Utilities, Show Object Library) and drag a label into the view. Set the label’s text to read Face Up.

Using the Attributes Inspector (Option-Command-4), set the color of the label, increase its font size, and set its alignment to center. After configuring the attributes of your label, do the same for your view, setting an appropriate background color behind the label.


Tip

Now is a good time to put into use the techniques you learned in Hour 16 to keep the text centered onscreen while the device rotates. It isn’t necessary for the completion of the project, but it is good practice!


The finished view should look like Figure 18.4.


FIGURE 18.4 The Orientation application’s UI.

Creating and Connecting the Outlet

Our application will need to be able to change the text of the label when the accelerometer indicates that the orientation of the device has changed. We need to create a connection for the label that we added. With the interface visible, switch to the assistant editor and make sure you are editing the ViewController.swift file.

Control-drag from the label to just below the class line in ViewController.swift. Name the new outlet orientationLabel when prompted. That’s it for the bridge to our code: just a single outlet and no action.

Implementing the Application Logic

Two pieces remain in this puzzle. First, we must tell iOS that we are interested in receiving notifications when the device orientation changes. Second, we must react to those changes. Because this is your first encounter with the notification center, it might seem a bit unusual, but concentrate on the outcome. The code patterns for notifications aren’t difficult to understand when you can see what the result is.

Registering for Orientation Updates

When our application’s view loads, we must register a method in our application to receive UIDeviceOrientationDidChangeNotification notifications from iOS. We also need to tell the device itself that it should begin generating these notifications so that we can react to them. You can accomplish all of this setup work in the ViewController.swift viewDidLoad method. Let’s implement that now. Update the viewDidLoad method to read as shown in Listing 18.2.

LISTING 18.2 Watching for Orientation Changes


 1: override func viewDidLoad() {
 2:     super.viewDidLoad()
 3:
 4:     UIDevice.currentDevice().beginGeneratingDeviceOrientationNotifications()
 5:
 6:     NSNotificationCenter.defaultCenter().addObserver(self,
 7:         selector: "orientationChanged:",
 8:         name: "UIDeviceOrientationDidChangeNotification",
 9:         object: nil)
10: }


In line 4, we use the method UIDevice.currentDevice() to return an instance of UIDevice that refers to the device our application is running on. We then use the beginGeneratingDeviceOrientationNotifications method to tell the device that we’re interested in hearing about it if the user changes the orientation of his or her device.

Lines 6–9 tell the NSNotificationCenter object that we are interested in subscribing to any notifications with the name UIDeviceOrientationDidChangeNotification that it may receive. They also set the class that is interested in the notifications to ViewController by way of the addObserver method’s first parameter: self. We use the selector parameter to say that we will be implementing a method called orientationChanged. In fact, coding up orientationChanged is the only thing left to do.

Determining Orientation

To determine the orientation of the device, we use the UIDevice variable property orientation. Unlike other values we’ve dealt with in the book, the orientation is of the type UIDeviceOrientation (a simple constant, not an object). Therefore, you can check each possible orientation via a simple switch statement and update the orientationLabel in the interface as needed.

Implement the orientationChanged method as shown in Listing 18.3.

LISTING 18.3 Changing the Label as the Orientation Changes


 1: func orientationChanged(notification: NSNotification) {
 2:
 3:     let orientation:UIDeviceOrientation = UIDevice.currentDevice().orientation
 4:
 5:     switch (orientation) {
 6:     case UIDeviceOrientation.FaceUp:
 7:         orientationLabel.text="Face Up"
 8:     case UIDeviceOrientation.FaceDown:
 9:         orientationLabel.text="Face Down"
10:     case UIDeviceOrientation.Portrait:
11:         orientationLabel.text="Standing Up"
12:     case UIDeviceOrientation.PortraitUpsideDown:
13:         orientationLabel.text="Upside Down"
14:     case UIDeviceOrientation.LandscapeLeft:
15:         orientationLabel.text="Left Side"
16:     case UIDeviceOrientation.LandscapeRight:
17:         orientationLabel.text="Right Side"
18:     default:
19:         orientationLabel.text="Unknown"
20:     }
21: }


The logic is straightforward. This method is called each time we have an update to the device’s orientation. The notification is passed as a parameter, but we don’t really need it for anything.

In line 3, we declare an orientation constant equal to the device’s orientation variable property (UIDevice.currentDevice().orientation).

Lines 5–20 implement a switch statement (refer to Hour 3, “Discovering Swift and the iOS Playground,” for details on switch) that compares each possible orientation constant to the value of the orientation variable. If they match, the orientationLabel text variable property is set appropriately.


Tip

Technically, the application is done; but when it runs, you’ll notice that the iPhone interface doesn’t support all orientations. The application runs as expected, but when you turn the phone upside down, the text will also be upside down. You can enable all possible orientations for your interface by adding this method to your view controller:

override func supportedInterfaceOrientations() -> Int {
    return Int(UIInterfaceOrientationMask.All.rawValue)
}

Adding this is purely optional—it corrects what you might consider a visual flaw, rather than a bug in functionality.


Building the Application

Save your files, and then run the application. Your results should resemble Figure 18.5. If you’re running in the iOS Simulator, rotating the virtual hardware (Hardware, Rotate Left/Right) will work, but you won’t be able to view the face-up and face-down orientations.


FIGURE 18.5 Orientation in action.

Detecting Acceleration, Tilt, and Rotation

In the Orientation application, we ignored the precise values coming from the accelerometer and instead just allowed iOS to make an all-or-nothing orientation decision. The gradations between these orientations, such as the device being somewhere between its left side and straight up and down, are often interesting to an application.

Imagine you are going to create a car racing game where the device acts as the steering wheel when tilted left and right and the gas and brake pedals when tilted forward and back. It is helpful to know how far the player has turned the wheel and how hard the user is pushing the pedals to know how to make the game respond.

Likewise, consider the possibilities offered by the gyroscope’s rotation measurements. Applications can now tell whether the device is rotating, even if there is no change in tilt. Imagine a turn-based game that switches between players just by rotating the iPhone or iPad around while it is lying on a table or sitting in a charging dock.

Implementation Overview

In our next application example, ColorTilt, we use acceleration to set the background color of a view, and rotation or tilt to make it progressively more transparent. The current attitude will be displayed at all times via roll, pitch, and yaw readouts. Three toggle switches (UISwitch) will be added to enable or disable motion updates, accelerometer reactions, and gyroscope reactions.

The application logic is broken down into four methods: one to toggle motion manager updates on and off; and three others to react to attitude readings, device acceleration, and device rotation.

It’s not as exciting as a car racing game, but it is something we can accomplish in an hour, and everything learned here will apply when you get down to writing a great iOS motion-enabled application.

Setting Up the Project

Open Xcode and begin by creating a new project based on the Single View Application template. Name this application ColorTilt.

Planning the Variables and Connections

Next, we’ll identify the variable properties and connections we need. Specifically, we want a view (UIView) that changes colors (colorView) and three UISwitch instances. Two of the switches will let us indicate whether we should watch the accelerometer and gyroscope data (toggleAccelerometer and toggleGyroscope). The third (toggleMotion) does the heavy lifting of turning motion monitoring on or off via a method, controlHardware. To provide output of the attitude values (roll, pitch, and yaw), we will use three UILabels: rollOutput, pitchOutput, and yawOutput.

We also need a constant (or a variable property) for our CMMotionManager object, which we’ll call motionManager. Because this is not directly related to an object in the storyboard and is part of enabling the functional logic, we add this in the implementation of the view controller logic.

Adding a Radian Conversion Constant

Later in the project, we’ll be asking Core Motion to provide us with the attitude of our iOS device. Core Motion supplies this information in radians, but we are mere mortals and tend to think in degrees. Converting radians to degrees requires multiplying by a constant, which we’ll add now. Update ViewController.swift to include a kRad2Deg constant after its class line:

let kRad2Deg:Double = 57.2957795
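If you prefer to see where that magic number comes from, here is an equivalent derivation (an optional variation, not required for the project):

```swift
import Foundation

// 57.2957795 is just 180 / pi; deriving it makes the intent clear and
// matches the hard-coded constant to the printed precision.
let kRad2Deg: Double = 180.0 / Double.pi

// A small convenience wrapper around the multiplication.
func degrees(fromRadians radians: Double) -> Double {
    return radians * kRad2Deg
}
```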


Note

As I write this hour, I struggle not to make a joke each time I type about an iOS device’s “attitude.” This has been very difficult for me, and I apologize for even thinking about it.


Designing the Interface

Like the Orientation application, the ColorTilt application’s interface is not a work of art. It requires a few switches, labels, and a view. Open the interface by selecting the Main.storyboard file. Use the Attributes Inspector with the view controller selected to switch to a standard device size, if desired.

Lay out the user interface by dragging three UISwitch instances from the Object Library to the top-right of the view. Stack them, one under the other. Use the Attributes Inspector (Option-Command-4) to set each switch’s default to Off.

Add three labels (UILabel), naming them Motion Tracking, Accelerometer, and Gyroscope, to the view, positioned beside each switch.

Next, add three more labels, positioned horizontally to output roll, pitch, and yaw, setting the default value of each to 0. Add Roll:, Pitch:, and Yaw: labels in front of each. Yes, that’s quite a few labels.

Finally, drag a UIView instance into the view and size it to fit in the view below the switches and labels. Use the Attributes Inspector to change the view’s background to green.

Your view should now resemble Figure 18.6. If you want to arrange the controls differently, feel free.


FIGURE 18.6 Create a layout that includes three switches, a boatload of labels, and a color view.

Creating and Connecting the Outlets and Actions

Despite its functional simplicity, quite a few connections are required in this application. Here’s what we’ll be using, starting with the outlets:

• The view that will change colors (UIView): colorView

• Toggle switch that activates/deactivates the accelerometer (UISwitch): toggleAccelerometer

• Toggle switch that activates/deactivates the gyroscope (UISwitch): toggleGyroscope

• Toggle switch that activates/deactivates motion tracking (UISwitch): toggleMotion

• Label displaying the attitude value roll in degrees (UILabel): rollOutput

• Label displaying the attitude value pitch in degrees (UILabel): pitchOutput

• How you make a horse move (UILabel): yawOutput

And the action:

• Toggle motion tracking on and off from the toggleMotion switch: controlHardware

Select the Main.storyboard file and open the assistant editor, making sure that ViewController.swift is visible on the right side. Clear some room in your workspace if necessary.

Adding the Outlets

Control-drag from the green UIView to just below the constant kRad2Deg in ViewController.swift. Name the outlet colorView when prompted, as shown in Figure 18.7. Repeat the process for the three switches, connecting the switch beside the Motion Tracking label to toggleMotion, and the switches beside the Accelerometer and Gyroscope labels to toggleAccelerometer and toggleGyroscope, respectively.


FIGURE 18.7 Connect the objects to the outlets.

Complete the outlets by connecting the roll, pitch, and yaw readouts to rollOutput, pitchOutput, and yawOutput.

Adding the Action

To finish the connections, the toggleMotion switch must be configured to call the controlHardware method when the Value Changed event occurs. Define the action by Control-dragging from the switch to just below the last @IBOutlet line in the ViewController.swift file.

When prompted, create a new action named controlHardware that responds to the switch’s Value Changed event.

Implementing the Application Logic

Our ColorTilt application isn’t complicated, but will require several different methods to add all the motion features we want. So, we need to cover the following areas:

1. Initialize and configure the Core Motion motion manager (CMMotionManager).

2. Manage events to toggle the motion tracking on and off (controlHardware), registering a handler closure when the hardware is turned on.

3. React to the accelerometer/gyroscope readings, updating the background color and alpha transparency values appropriately.

4. Prevent the device interface from rotating; interface rotation would interfere with displaying feedback during fast movements.

Let’s work our way through the corresponding pieces of code now.

Initializing the Core Motion Motion Manager

When the ColorTilt application launches, we need to allocate and initialize a Core Motion motion manager (CMMotionManager) instance. Before we can do that, however, we need to make sure that our code knows about the Core Motion framework by importing the Core Motion module. Add the following import statement to ViewController.swift following the default UIKit import:

import CoreMotion

Next, we need to declare our motion manager. Create a new constant, motionManager, by updating the code at the top of the ViewController.swift file and adding this line following the @IBOutlets you added earlier:

let motionManager: CMMotionManager = CMMotionManager()

Now we have an instance of the Core Motion motion manager already assigned to a constant, motionManager. We’re ready to start configuring and accessing the motion manager.

Configuring the manager is simple; we only need to set its variable property deviceMotionUpdateInterval to match the frequency (in seconds) with which we want to get updates from the hardware. We’ll update at 100 times a second, or an update value of .01. This configuration is done in viewDidLoad so that we are ready to start monitoring as soon as our interface loads.
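The interval and the update frequency are simple reciprocals, as this sketch illustrates (plain math; the variable name merely mirrors the CMMotionManager property):

```swift
// The update interval, in seconds, is the reciprocal of the desired
// update frequency: 100 updates per second means 1/100 = 0.01 seconds.
let updatesPerSecond = 100.0
let deviceMotionUpdateInterval = 1.0 / updatesPerSecond
```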

Update the viewDidLoad method, as shown in Listing 18.4.

LISTING 18.4 Initializing the Motion Manager


override func viewDidLoad() {
    super.viewDidLoad()
    // Do any additional setup after loading the view, typically from a nib.
    motionManager.deviceMotionUpdateInterval = 0.01
}


The next step is to implement our action, controlHardware, so that when one of the UISwitch instances is turned on or off, it tells the motion manager to begin or end readings from the accelerometer/gyroscope.

Managing Motion Updates

The controlHardware method acts as the “master control” for our application. If the motion tracking switch is toggled on, the CMMotionManager instance, motionManager, is asked to start monitoring for motion updates. Each update is processed by a handler closure that calls up to three additional methods: doAttitude to display attitude values, doAcceleration to handle acceleration events, and doRotation to process data related to rotation.

I say “up to three additional methods” because we don’t necessarily want the device trying to react to acceleration changes and rotation at the same time, so we take into account the status of the Accelerometer and Gyroscope switches before deciding to call each method.

In the case of the motion tracking switch being toggled off, the controlHardware method stops updates from the motion manager.

Update the controlHardware method stub, as shown in Listing 18.5.

LISTING 18.5 Implementing the controlHardware Method


 1: @IBAction func controlHardware(sender: AnyObject) {
 2:     if toggleMotion.on {
 3:         motionManager.startDeviceMotionUpdatesToQueue(NSOperationQueue.currentQueue(),
 4:             withHandler: {
 5:             (motion: CMDeviceMotion!, error: NSError!) in
 6:                 self.doAttitude(motion.attitude)
 7:                 if self.toggleAccelerometer.on {
 8:                     self.doAcceleration(motion.userAcceleration)
 9:                 }
10:                 if self.toggleGyroscope.on {
11:                     self.doRotation(motion.rotationRate)
12:                 }
13:             })
14:     } else {
15:         toggleGyroscope.on = false
16:         toggleAccelerometer.on = false
17:         motionManager.stopDeviceMotionUpdates()
18:     }
19: }


Let’s step through this method to make sure we’re all still on the same page.

Line 2 checks to see whether the motion tracking switch toggleMotion is set to On. If it is, lines 3–5 tell the motion manager to start sending updates and define a code block to handle each update.

The code block (lines 5–13) receives all the device’s motion data in a CMDeviceMotion object named motion. In line 6, we send the attitude data from this object (motion.attitude) to the method doAttitude. Lines 7–9 check whether the toggleAccelerometer switch is set to On and, if it is, send acceleration data (motion.userAcceleration) to the method doAcceleration. Lines 10–12 do the same for the toggleGyroscope switch, sending rotation data (motion.rotationRate) to the doRotation method. Notice anything weird here? I’m using self in front of the methods I call. This is because the closure is a self-contained chunk of code that knows which class it is defined in but also operates independently of that class. Without self in front of the method names, Swift can’t be sure we mean the methods within the ViewController class, and we’d get an error to that effect.
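You can see this requirement outside of Core Motion with a tiny standalone example (the Counter class here is hypothetical and written as plain Swift): a closure that outlives the method that created it must reference class members through self, which makes the capture explicit.

```swift
// Standalone illustration of why closures require explicit self.
// Any member of the class accessed inside the escaping closure must
// be written as self.member; the compiler rejects the bare name.
class Counter {
    var count = 0

    func makeIncrementer() -> () -> Int {
        return { () -> Int in
            self.count += 1      // "count += 1" alone is a compile error here
            return self.count
        }
    }
}

let counter = Counter()
let increment = counter.makeIncrementer()
print(increment()) // prints 1
print(increment()) // prints 2 -- the closure still references the same Counter
```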

The final lines (14–18) are evaluated if the motion tracking switch is set to Off. If it is, the other toggle switches are also turned off in lines 15–16. This helps keep our display consistent. Line 17 tells the motion manager to stop sending updates.


Tip

If you know you’re only going to use the accelerometer or gyroscope in your application—or want to provide different update rates for each—you can request updates for a specific motion monitor using the motion manager methods startAccelerometerUpdatesToQueue:withHandler and startGyroscopeUpdatesToQueue:withHandler. As the method names suggest, these are specific to the accelerometer and gyroscope and have their own independent update rates defined by the variable properties accelerometerUpdateInterval and gyroscopeUpdateInterval.



Caution: Building Bigger Closures Isn’t Better

It is possible to define all of our application’s motion logic within the closure supplied by the startDeviceMotionUpdatesToQueue:withHandler method, but following that approach can get ugly. Keeping your closures small and using them to invoke other methods results in more manageable and understandable code.


Displaying Attitude Data

The method doAttitude has a very simple function, and a very simple implementation. This method updates the roll, pitch, and yaw labels to display the corresponding values in degrees. The method also uses the pitch to determine the amount of tilt, forward and back, to set the alpha value of colorView. It will only do this, however, when the toggleGyroscope switch is off; otherwise, rotation and tilt are competing with one another.

Before we implement this method, it’s important to understand what data we will be getting from Core Motion. The method receives an object of the type CMAttitude. Within this, we’ll access variable properties roll, pitch, and yaw—each containing Euler angles in radians.


Tip

Learn about Euler angles in this excellent Wolfram Mathworld article: http://mathworld.wolfram.com/EulerAngles.html.


Each angle is expressed in radians. To convert from radians to degrees, we multiply by the constant kRad2Deg (180 divided by pi) that was added at the start of the project. The raw pitch values for an iOS device range from 0 (lying flat) to roughly plus or minus 1.57 (pi/2, standing straight up), so we can use the absolute value of the pitch to set the alpha for colorView; alpha values above 1.0 are simply treated as fully opaque. No additional math needed!
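As a quick check of the conversion, here is a plain-Swift sketch. It assumes kRad2Deg matches the 180/pi constant declared at the start of the project:

```swift
import Foundation

// Assumption: kRad2Deg matches the constant declared earlier in the project.
let kRad2Deg: Double = 180.0 / Double.pi

// A device standing straight up reports a pitch of about pi/2 radians.
let pitch = Double.pi / 2
let degrees = String(format: "%.0f", pitch * kRad2Deg)
print(degrees) // prints 90
```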

Open ViewController.swift and add the implementation of doAttitude shown in Listing 18.6.

LISTING 18.6 Implementing the doAttitude Method


1: func doAttitude(attitude: CMAttitude) {
2:     rollOutput.text=String(format:"%.0f",attitude.roll*kRad2Deg)
3:     pitchOutput.text=String(format:"%.0f",attitude.pitch*kRad2Deg)
4:     yawOutput.text=String(format:"%.0f",attitude.yaw*kRad2Deg)
5:     if !toggleGyroscope.on {
6:         colorView.alpha=CGFloat(fabs(attitude.pitch))
7:     }
8: }


In line 1, the method receives attitude data from controlHardware and references it as the object attitude. Lines 2–4 set the rollOutput, pitchOutput, and yawOutput labels to attitude’s corresponding roll, pitch, and yaw variable properties (after converting them to degrees). I use a string format of %.0f, which tells the system to output the floating-point value without any decimal places.

Lines 5–7 check whether the toggleGyroscope switch is on. If it isn’t, the code updates the colorView and sets its alpha value to the absolute value of attitude.pitch. The alpha variable property must be set to a CGFloat, so we use CGFloat() to convert the absolute value to the right type before making the assignment.

Handling Acceleration Data

Next we need to react to the accelerometer data. This method has one purpose: to change the color of colorView if the user moves the device suddenly.

To change colors, we need to sense motion. One way to do this is to look for g-forces greater than 1g along any of our x, y, and z axes. This is good for detecting quick, strong movements. As luck would have it, the doAcceleration method will be receiving a CMAcceleration structure that contains the acceleration the user imparts to the device, measured in g’s along x, y, and z. (Core Motion has already filtered the constant pull of gravity out of these userAcceleration values.)

Implement doAcceleration, as shown in Listing 18.7.

LISTING 18.7 Implementing the doAcceleration Method


 1: func doAcceleration(acceleration: CMAcceleration) {
 2:     if (acceleration.x > 1.3) {
 3:         colorView.backgroundColor = UIColor.greenColor()
 4:     } else if (acceleration.x < -1.3) {
 5:         colorView.backgroundColor = UIColor.orangeColor()
 6:     } else if (acceleration.y > 1.3) {
 7:         colorView.backgroundColor = UIColor.redColor()
 8:     } else if (acceleration.y < -1.3) {
 9:         colorView.backgroundColor = UIColor.blueColor()
10:     } else if (acceleration.z > 1.3) {
11:         colorView.backgroundColor = UIColor.yellowColor()
12:     } else if (acceleration.z < -1.3) {
13:         colorView.backgroundColor = UIColor.purpleColor()
14:     }
15: }


Lines 2–14 check the acceleration along each of the three axes to see whether it is greater than 1.3 (or less than –1.3), that is, a force stronger than 1.3 times the force of gravity. If it is, the colorView UIView’s backgroundColor variable property is set to one of six predefined colors. In other words, if you jerk the device in any direction, the color changes.
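The same decision logic can be factored into a plain function, which makes the threshold easy to experiment with and to test without UIKit. This is only a sketch; the function and its string color names are hypothetical (Listing 18.7 does the work inline against colorView):

```swift
import Foundation

// Hypothetical refactoring of Listing 18.7's threshold checks.
// Returns a color name, or nil when no axis exceeds the threshold (in g's).
func colorName(x: Double, y: Double, z: Double, threshold: Double = 1.3) -> String? {
    if x > threshold { return "green" }
    if x < -threshold { return "orange" }
    if y > threshold { return "red" }
    if y < -threshold { return "blue" }
    if z > threshold { return "yellow" }
    if z < -threshold { return "purple" }
    return nil
}

let hardPush = colorName(x: 1.5, y: 0, z: 0)  // "green" -- a hard push right
let gentle = colorName(x: 0.2, y: 0.1, z: 0)  // nil -- gentle movement is ignored
```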


Tip

A little experimentation shows that +/–1.3g is a good measure of an abrupt movement. Try it out yourself with a few different values; you might decide another value is better.


Not that bad, right? We finish up by reading device rotation via the gyroscope.

Reacting to Rotation

The goal of the doRotation method is to alter the alpha value of colorView as the user spins the device. Instead of forcing the user to rotate the device in one direction to get the alpha channel to change, we combine the rotation rates along all three axes. Rotation data is supplied to us in the form of another structure (this time, of type CMRotationRate). We access the x, y, and z values of this structure to determine the rotation, in radians per second, around each axis.

Implement doRotation, as shown in Listing 18.8.

LISTING 18.8 Implementing the doRotation Method


1: func doRotation(rotation: CMRotationRate) {
2:     var value: Double = (fabs(rotation.x)+fabs(rotation.y)+fabs(rotation.z))/12.5
3:     if (value > 1.0) { value = 1.0 }
4:     colorView.alpha = CGFloat(value)
5: }


In line 1, we receive the rotation rate data and store it in rotation. Line 2 declares value as a double-precision floating-point number (the same as the rotation rates) and sets it to the sum of the absolute values of the three axes’ rotation rates (rotation.x, rotation.y, and rotation.z) divided by 12.5.


Note

Why are we dividing by 12.5? Because an alpha value of 1.0 is a solid color, and a rotation rate of 1.0 means the device is rotating at only about 1/6 of a revolution per second (a full revolution is 2 × pi, roughly 6.28 radians). In practice, this is way too slow a rotation rate to get a good effect; barely turning the device at all gives us a solid color.

By dividing by 12.5, the combined rotation rate has to reach roughly two revolutions a second (2 revolutions × 2 × pi ≈ 12.57 radians per second) for value to reach 1, meaning that it takes much more effort to make the view’s background color solid.


In line 3, if value is greater than 1.0, it is set back to 1.0 because that is the maximum that the alpha variable property of colorView can accept. Finally, in line 4, alpha is set to value.
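To see the scaling and clamping at work, the same computation can be written as a standalone function. The function name is my own; Listing 18.8 performs this math inline:

```swift
import Foundation

// Hypothetical standalone version of Listing 18.8's math: sum the
// absolute rotation rates (radians/second), scale by 1/12.5, clamp to 1.0.
func rotationAlpha(x: Double, y: Double, z: Double) -> Double {
    var value = (abs(x) + abs(y) + abs(z)) / 12.5
    if value > 1.0 { value = 1.0 }
    return value
}

let modestSpin = rotationAlpha(x: 1.0, y: 1.0, z: 0.5) // 0.2 -- mostly transparent
let fastSpin = rotationAlpha(x: 13.0, y: 0, z: 0)      // 1.0 -- about two revolutions/second, solid
```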

Preventing Interface-Orientation Changes

At this point, you can run the application, but you probably won’t get very good visual feedback from the methods we’ve written. Apple’s iOS templates include interface rotation settings that are turned on by default. The animation of the interface rotation will interfere with the quick color changes we need to see in the view.

To fix the problem, we need to disable support for rotation. Do this by selecting the project group in the project navigator, then using the Deployment Info section under the General settings to uncheck all but the Portrait orientation. Alternatively (and possibly easier), add the method in Listing 18.9 to ViewController.swift.

LISTING 18.9 Disabling Interface Rotation


override func shouldAutorotate() -> Bool {
    return false
}


This disables interface rotation entirely, so the UI stays fixed in its initial portrait orientation no matter how the device is turned.

Building the Application

You’ve finished the application. Plug in your iDevice (this won’t work right in the Simulator), choose your Device from the Xcode Scheme pop-up menu, and then click Run. Experiment first with simple motion tracking and the attitude readings, as shown in Figure 18.8. Once you have a sense for how the attitude readings vary, try activating the accelerometer to read acceleration data and use sudden motions to change the background color. Your last test should be of the gyroscope. Activate the gyroscope to change the opacity of the background based on rotation speed, rather than tilt.

Image

FIGURE 18.8 Tilt the device to change the opacity of the background color.

It’s been a bit of a journey, but you can now tie directly into one of the core features of Apple’s iOS device family: motion. Even better, you’re doing so with one of Apple’s latest-and-greatest frameworks: Core Motion.

Further Exploration

The Core Motion framework provides a great set of tools for dealing with all the iOS motion hardware in a similar manner. As a next step, I recommend reviewing the Core Motion Framework Reference and the Event Handling Guide for iOS, both available through the developer documentation system in Xcode. You also want to review the CMAttitude class documentation, which offers additional methods to establish reference frames. These will help you determine device orientation and motion in reference to known frames of reference, such as “north.”

If you’re lucky enough to be able to target Apple’s latest iOS devices, including the iPhone 5s, you will want to look into the CMMotionActivityManager and CMMotionActivity classes. These classes work with the M7 coprocessor to provide access to motion data on an ongoing and historical basis, even when an application isn’t running. You can, for example, ask the new APIs to provide the number of steps taken since your app last started, or whether or not your user is using the app while in an automobile. Nifty stuff!

Regardless of how you read motion data or what data you use, the biggest challenge is to use motion readings to implement subtler and more natural interfaces than those in the two applications we created in this hour. A good step toward building effective motion interfaces for your applications is to dust off your old math, physics, and electronics texts and take a quick refresher course.

The simplest and most basic equations from electronics and Newtonian physics are all that is needed to create compelling interfaces. In electronics, a low-pass filter attenuates signal components above a cutoff frequency, passing through the smooth changes in the baseline signal. This is useful for detecting smooth movements and tilts of the device while ignoring bumps and the occasional odd, spiked reading from the accelerometer and gyroscope. A high-pass filter does the opposite, passing only abrupt changes; this can help in removing the effect of gravity and detecting only purposeful movements, even when they occur along the axes that gravity is acting upon.
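As a concrete starting point, here is a minimal exponential low-pass filter of the kind described above. The helper name and smoothing factor are my own choices, not from this hour's project:

```swift
import Foundation

// Minimal exponential (single-pole) low-pass filter. A small factor
// smooths heavily; a factor of 1.0 passes the raw signal through unchanged.
func lowPass(current: Double, previous: Double, factor: Double = 0.1) -> Double {
    return current * factor + previous * (1.0 - factor)
}

// A sudden 5g spike barely moves the smoothed value:
let smoothed = lowPass(current: 5.0, previous: 0.0) // 0.5, not 5.0

// The corresponding high-pass value is the raw reading minus the smoothed one:
let highPassed = 5.0 - smoothed // 4.5 -- the abrupt component survives
```

In a real app you would feed each new accelerometer reading through the filter inside your motion update handler, keeping the previous smoothed value in a property.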

When you have the right signal interpretation in place, there is one more requirement for your interface to feel natural to your users: It must react like the physical and analog world of mass, force, and momentum, and not like the digital and binary world of 1s and 0s. The key to simulating the physical world in the digital is just some basic seventeenth-century physics.

Summary

At this point, you know all the mechanics of working with orientation, and with the accelerometer and gyroscope via Core Motion. You understand how to use the Core Motion motion manager (CMMotionManager) to take direct readings from the available sensors to interpret orientation, tilt, movement, and rotation of the device. You understand how to create an instance of CMMotionManager, how to tell the manager to start sending motion updates, and how to interpret the measurements that are provided.

Q&A

Q. Should I base my game’s controls off of gyroscope and accelerometer readings?

A. I recommend that you offer more traditional touchscreen controls in addition to motion-based options. Although many users enjoy using motion controls, they require a certain amount of space and privacy that isn’t always available.

Q. Are motion features only good for games?

A. Absolutely not! By combining the different sensor readings from the iPhone or iPad, you can learn a lot about how your users interact with your applications and adjust accordingly. Is your application sensing a face-down orientation? Your user is likely lying down and reading in bed. So, offering to switch to night colors may be appropriate. Are you getting frequent spikes in the accelerometer readings? If yes, your application is likely being used while the user is walking or riding in a vehicle. Allowing more time to react to interface events might be an appropriate adaptation under these circumstances.

Workshop

Quiz

1. An accelerometer measures acceleration relative to what?

a. Shaking

b. Free fall

c. Speed

d. Thrust

2. Acceleration is measured along how many axes at once?

a. 0

b. 1

c. 2

d. 3

3. The tilt of a device at a point in time is called what?

a. Altitute

b. Acceleration

c. Rotation

d. Attitude

4. On iOS, rotation is measured in what?

a. Degrees/second

b. Radians/second

c. Feet/second

d. Meters/second

5. To set how quickly your device receives motion updates, you should set what variable property?

a. deviceMotionUpdateInterval

b. motionUpdateSetting

c. deviceIntervalMotionSetting

d. updateInterval

6. An independent block of code used with a handler (among other things) is known as a what?

a. Method

b. Blockset

c. Closer

d. Closure

7. An instance of which class is used to manage motion events?

a. CMMotionHandle

b. CMMotionManager

c. CMMotionMaker

d. CMMotion

8. Yaw is the measure of tilt around which axis?

a. Y

b. X

c. Z

d. X, Y, and Z

9. When a device is lying facing up, the orientation is set to what constant?

a. UIDeviceOrientation.FaceUp

b. UIDeviceOrientation.Up

c. UIDeviceOrientation.BackDown

d. UIDeviceOrientation.Face

10. To stop motion readings, you would use which method of the Core Motion motion manager?

a. cancelDeviceMotionUpdates()

b. cancelUpdates()

c. stopUpdates()

d. stopDeviceMotionUpdates()

Answers

1. B. An accelerometer measures acceleration relative to free-fall. Acceleration is measured in gravities.

2. D. Acceleration is measured along the x, y, and z axes simultaneously.

3. D. The measure of tilt of a device at a given moment in time is called its attitude.

4. B. iOS returns rotation rates in radians per second.

5. A. The deviceMotionUpdateInterval is a variable property that sets how quickly motion events are measured.

6. D. A closure is an independent block of code that acts independently of the code around it, without requiring a formal method/function definition.

7. B. To handle motion events, you create and use an instance of the CMMotionManager class.

8. C. Yaw is the measurement of tilt around the z axis.

9. A. When a device is lying face up, its orientation is set to (surprise!) UIDeviceOrientation.FaceUp.

10. D. Use the stopDeviceMotionUpdates() method to stop the Core Motion motion manager from sending motion updates to your code.

Activities

1. When the Orientation application is in use, the label stays put and the text changes. This means that for three of the six orientations (upside down, left side, and right side), the text itself is also upside down or on its side. Fix this by changing not just the label text but also the orientation of the label so that the text always reads normally for the user looking at the screen. Be sure to adjust the label back to its original orientation when the orientation is standing up, face down, or face up.

2. In the final version of the ColorTilt application, sudden movement is used to change the view’s color. You may have noticed that it can sometimes be difficult to get the desired color. This is because the accelerometer provides a reading for the deceleration of the device after your sudden movement. So, what often happens is that ColorTilt switches the color from the force of the deceleration immediately after switching it to the desired color from the force of the acceleration. Add a delay to the ColorTilt application so that the color can be switched at most once every second. This makes switching to the desired color easier because the acceleration will change the color but the deceleration will be ignored.
