iOS Sensor Fusion

Posted: July 10, 2011 by Mike Ilich in Design, Programming

One might suggest that we’re approaching the golden age of Human Factors design, with the peripheral sensory input that has been built into smartphones for several years now finally coming into its own. While multi-touch LCD screens have eliminated the need for a separate keyboard or pointing device, paving the way for modern tablets, screen real estate has become the limiting factor, with on-screen keyboards and the dreaded dual-thumbstick overlay for navigating 3D graphic environments. Built-in sensors such as accelerometers, gyroscopes and digital compasses have only recently seen extensive use as part of the overall UX design, offering physical motion and device orientation as viable UI controls.

While these sensors are built into most modern smartphone and tablet devices, they remain among the least documented features with respect to the development APIs for integrating them into mobile apps. A simple Google search will reveal that the few books covering iOS sensors have yet to be released, and Apple’s documentation is sparse, with few examples that provide an understanding of what’s happening at the hardware level.

On the surface, accelerometers typically measure acceleration relative to free fall in units of g-force, with gravity at rest registering as an upward supporting force of 1 g. You can account for this in your vectors and/or rotation matrices by normalizing an inverted gravity vector [-x, -y, -z] as the Z-axis, assigning the Y-axis to be a normalized vector perpendicular to gravity, and using the cross product of these to determine the X-axis. Gyroscopes, on the other hand, measure angular velocity about each of these three axes, which enables more precise motion detection for complete 6DOF tracking; however, orientation integrated from the gyroscope is prone to drift, and the initial reference may shift after subsequent rotations through 360 degrees. The digital compass provides a more direct measure of device orientation with respect to true (geographic) or magnetic north, but is susceptible to surrounding magnetic fields and electronic interference.
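As a minimal sketch of that basis construction, assuming a raw accelerometer reading taken while the device is roughly stationary (the Vec3 helpers and MakeGravityBasis are illustrative names, not part of Core Motion):

    #import <CoreMotion/CoreMotion.h>
    #include <math.h>

    typedef struct { double x, y, z; } Vec3;

    static Vec3 Vec3Normalize(Vec3 v) {
        double len = sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
        return (Vec3){ v.x / len, v.y / len, v.z / len };
    }

    static Vec3 Vec3Cross(Vec3 a, Vec3 b) {
        return (Vec3){ a.y * b.z - a.z * b.y,
                       a.z * b.x - a.x * b.z,
                       a.x * b.y - a.y * b.x };
    }

    // Build an orthonormal basis: Z opposes measured gravity, Y is a unit
    // vector perpendicular to gravity, and X = Y x Z completes the set.
    static void MakeGravityBasis(CMAcceleration g,
                                 Vec3 *xAxis, Vec3 *yAxis, Vec3 *zAxis) {
        *zAxis = Vec3Normalize((Vec3){ -g.x, -g.y, -g.z });
        // Pick any reference vector not parallel to gravity.
        Vec3 ref = fabs(zAxis->z) < 0.99 ? (Vec3){ 0, 0, 1 }
                                         : (Vec3){ 0, 1, 0 };
        *yAxis = Vec3Normalize(Vec3Cross(*zAxis, ref));
        *xAxis = Vec3Cross(*yAxis, *zAxis);
    }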

Each of these sensors possesses inherent strengths and weaknesses: the accelerometer detects instantaneous motion but incurs a delayed response, the gyroscope provides precise motion tracking but suffers from drift, and the compass offers precise orientation but is affected by magnetic interference. The advantage realized through sensor fusion lies in interpolating data from each of these devices to establish an initial point of reference, measure minor units of movement (finer granularity) and apply drift corrections using major units of movement (coarser granularity).
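One common way to realize this fusion, short of a full Kalman filter, is a complementary filter: integrate the gyroscope for fine-grained short-term motion and blend in the accelerometer’s gravity-derived angle to correct long-term drift. Here is a minimal single-axis sketch, where FusedAngle and the 0.98 blend factor are illustrative assumptions:

    // Illustrative single-axis complementary filter. The gyro term gives
    // fine granularity; the accelerometer term slowly pulls the estimate
    // back toward the gravity-derived angle, correcting drift.
    static double FusedAngle(double previousAngle,  // last estimate (rad)
                             double gyroRate,       // gyro rate (rad/s)
                             double accelAngle,     // gravity angle (rad)
                             double dt)             // update interval (s)
    {
        const double alpha = 0.98;  // trust gyro short-term, accel long-term
        return alpha * (previousAngle + gyroRate * dt)
             + (1.0 - alpha) * accelAngle;
    }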

On the iOS platform, start by establishing a Core Motion motion manager:

    #import <CoreMotion/CoreMotion.h>

    CMMotionManager *motionManager;
    CMAttitude *referenceAttitude;
    if (motionManager == nil) {
        motionManager = [[CMMotionManager alloc] init];
    }
    // Request updates at 100 Hz for both the raw accelerometer and the
    // fused device-motion stream.
    motionManager.accelerometerUpdateInterval = 0.01;
    motionManager.deviceMotionUpdateInterval = 0.01;
    [motionManager startDeviceMotionUpdates];
    // Note: deviceMotion may be nil until the first sample arrives.
    CMDeviceMotion *dm = motionManager.deviceMotion;
    referenceAttitude = [dm.attitude retain];   // manual retain (pre-ARC)

And later in your recurring event management callback:

    CMRotationMatrix rotation;
    CMAcceleration userAcceleration;
    dm = motionManager.deviceMotion;
    CMAttitude *attitude = dm.attitude;
    rotation = attitude.rotationMatrix;
    userAcceleration = dm.userAcceleration;  // gravity already subtracted
    float myRoll  = attitude.roll;    // rotation about the y-axis
    float myYaw   = attitude.yaw;     // rotation about the z-axis
    float myPitch = attitude.pitch;   // rotation about the x-axis

Note that these attitude values are all in radians.
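The referenceAttitude captured at startup comes into play if you want orientation relative to where the device began rather than Core Motion’s default reference frame; CMAttitude provides multiplyByInverseOfAttitude: for exactly this:

    CMAttitude *currentAttitude = motionManager.deviceMotion.attitude;
    // Re-express the current attitude relative to the stored reference.
    [currentAttitude multiplyByInverseOfAttitude:referenceAttitude];
    // roll/yaw/pitch are now deltas from the starting pose.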

From this point, apply a high-pass filter to the userAcceleration values (x, y, z) and use them as part of a higher-level interpolation algorithm for tracking position in real-world 3D space, mapping to your virtual or augmented reality environment as your application demands. For greater precision, or in cases where the signal-to-noise ratio on the sensor input is quite low (e.g. the accelerometer is nearly stationary), you may want to consider a basic Kalman filter over the aforementioned high-pass filter or rolling-average calculations. Here’s a white paper that discusses the Kalman filter.
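As a minimal sketch of that filtering step, a simple one-pole high-pass filter can strip any slowly varying bias from each acceleration axis; kFilterFactor and the static low-pass state are assumptions for this sketch, not part of Core Motion, and should be tuned to your update interval:

    // Illustrative one-pole high-pass filter over userAcceleration.
    static CMAcceleration lowPassState;  // slowly varying bias estimate

    static CMAcceleration HighPassAcceleration(CMAcceleration raw) {
        const double k = 0.1;  // filter factor; tune to update interval
        // Track the low-frequency component per axis...
        lowPassState.x = k * raw.x + (1.0 - k) * lowPassState.x;
        lowPassState.y = k * raw.y + (1.0 - k) * lowPassState.y;
        lowPassState.z = k * raw.z + (1.0 - k) * lowPassState.z;
        // ...and subtract it, leaving only the quick changes.
        CMAcceleration filtered = { raw.x - lowPassState.x,
                                    raw.y - lowPassState.y,
                                    raw.z - lowPassState.z };
        return filtered;
    }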

For the gyroscope values, the roll, yaw and pitch attributes of the attitude structure from the device motion manager provide values of rotation about each axis individually; however, these rotations are not independent and exhibit a strong interdependence in 3D space. Rotation matrices based on Euler angles can be applied, but these introduce the possibility of gimbal lock, in which rotation about the first two axes creates a parallel configuration that removes the degree of freedom in the third axis. Alternatively, quaternions serve as a number system in four-dimensional space that resolves the gimbal lock scenario by applying the transformation as a single rotation rather than a sequence. Translating perspective by successive rotation matrices, or by applying the individual roll, yaw and pitch angles, is directly affected by the order of rotation or multiplication. I found from personal experience with a recent application that ordering rotation by yaw (z-axis), pitch (x-axis) and roll (y-axis) minimizes errors due to the azimuth.
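Core Motion also exposes the attitude directly as a quaternion (the attitude.quaternion property) alongside rotationMatrix, so you rarely need to build the matrix yourself. Still, as a sketch of why the quaternion sidesteps ordering issues, the standard unit-quaternion-to-matrix conversion below applies the whole rotation at once, with no sequence of axis rotations to lock up (the helper function is illustrative, not a Core Motion API):

    // Standard conversion from a unit quaternion to a rotation matrix.
    // CMQuaternion and CMRotationMatrix are Core Motion types.
    static CMRotationMatrix MatrixFromQuaternion(CMQuaternion q) {
        CMRotationMatrix m;
        m.m11 = 1 - 2 * (q.y * q.y + q.z * q.z);
        m.m12 = 2 * (q.x * q.y - q.w * q.z);
        m.m13 = 2 * (q.x * q.z + q.w * q.y);
        m.m21 = 2 * (q.x * q.y + q.w * q.z);
        m.m22 = 1 - 2 * (q.x * q.x + q.z * q.z);
        m.m23 = 2 * (q.y * q.z - q.w * q.x);
        m.m31 = 2 * (q.x * q.z - q.w * q.y);
        m.m32 = 2 * (q.y * q.z + q.w * q.x);
        m.m33 = 1 - 2 * (q.x * q.x + q.y * q.y);
        return m;
    }

Calling MatrixFromQuaternion(dm.attitude.quaternion) should agree with attitude.rotationMatrix up to floating-point noise.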

[Figure: iPhone accelerometer axes]

Here’s a link to a fantastic Google Tech Talk by David Sachs on sensor fusion in mobile devices, specifically Android, but applicable to other platforms as well.

