
Your Phone has Attitude!


[Figure: the axes on a mobile device]

Sorry, this post isn’t about your phone or tablet’s bad attitude and the way it doesn’t let you do what you want. Instead, this post is about how we at TrueViewVisuals use the sensors built into your device to work out which direction it is pointing, and how that can be used to do interesting things.

This is a core piece of technology we use within our solutions. We have spent a lot of time and effort interfacing with the sensors in these devices, and that experience and skill go into several of our mobile apps to provide a more intuitive and useful mobile experience.

In this post, we’ll talk about attitude (or geospatial orientation), sensors and sensor fusion, then show some example code of how to get this attitude information on each of the major platforms. I’ll write a follow-up post that will dig more deeply into what sensor fusion is and how we have customised it in TrueViewVisuals to provide a better user experience in our augmented reality applications.

Attitude

To allow the device to present useful information about its surroundings, we need to know the direction the device is looking. This is key information for doing any proper augmented reality. The direction your device is looking is called ‘the attitude’ (or geospatial orientation) of the device. In essence, this is a value that represents the rotation of the device in real world coordinates. In mathematics, this rotation can be represented in a number of ways: a quaternion, a rotation matrix or as three separate values for yaw, pitch and roll. We use a quaternion to represent this rotation because it is smaller, involves simpler maths to work with and avoids known problems with rotation matrices – I’ll cover that in a separate blog post some time.
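To make the difference concrete, here’s a minimal C# sketch (using System.Numerics, which may or may not be the maths library you end up using) that builds the same rotation as yaw/pitch/roll, as a quaternion and as a rotation matrix:

[code language="csharp"]
using System;
using System.Numerics;

public static class AttitudeRepresentations
{
    public static void Demo()
    {
        // Yaw/pitch/roll in radians - here, rotated 90 degrees about the vertical axis.
        float yaw = (float)(Math.PI / 2), pitch = 0f, roll = 0f;

        // The same rotation as a quaternion: four numbers, no gimbal lock.
        Quaternion attitude = Quaternion.CreateFromYawPitchRoll(yaw, pitch, roll);

        // ...and as a rotation matrix: a 3x3 rotation embedded in a 4x4.
        Matrix4x4 rotationMatrix = Matrix4x4.CreateFromQuaternion(attitude);

        Console.WriteLine(attitude);       // compact: x, y, z, w
        Console.WriteLine(rotationMatrix); // bulkier: nine rotation values plus padding
    }
}
[/code]

The quaternion is four numbers, the rotation matrix nine – which is part of why we prefer the quaternion.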

Sensors

Modern phones and tablets have lots of sensors in them that give app developers an insight into the world around the device. In terms of attitude, the ones we are interested in for this post are listed below, with a quick sketch of reading them after the list:

  • Compass – gives the direction of magnetic north in 3D space
  • Gyroscope – measures angular rotation, i.e. how far you have rotated the device
  • Accelerometer – measures the direction of gravity in 3D space
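As promised, here is a rough sketch of reading those three raw sensors on Android with Xamarin – the class name and structure are purely illustrative, and the platform examples later in this post show the fused alternative you’d normally use:

[code language="csharp"]
// Assumes the usual Android.App, Android.Content and Android.Hardware namespaces.
public class RawSensorsExample : Activity, ISensorEventListener
{
    public void Start()
    {
        var sensorManager = (SensorManager)GetSystemService(Context.SensorService);

        // One listener, three physical sensors.
        sensorManager.RegisterListener(this,
            sensorManager.GetDefaultSensor(SensorType.Accelerometer), SensorDelay.Game);
        sensorManager.RegisterListener(this,
            sensorManager.GetDefaultSensor(SensorType.Gyroscope), SensorDelay.Game);
        sensorManager.RegisterListener(this,
            sensorManager.GetDefaultSensor(SensorType.MagneticField), SensorDelay.Game);
    }

    public void OnSensorChanged(SensorEvent e)
    {
        // e.Values holds three floats: acceleration (m/s²), rotation rate (rad/s)
        // or magnetic field (µT), depending on e.Sensor.Type.
    }

    public void OnAccuracyChanged(Sensor sensor, SensorStatus accuracy) { }
}
[/code]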

There are a couple of limitations of these sensors that are worth knowing about:

  • Digital compasses are very noisy and susceptible to interference, so readings often jump around during real world use. This is down to the characteristics of the sensor – as an app developer, there is not much you can do about it.
  • Gyroscopes tend to drift. There is no real world reference for the gyroscope – it is just measuring rotation. If you rotated the device through a complete 360°, you would expect the gyroscope to report that you are back where you started. Unfortunately it doesn’t quite, and after running for a while the reported rotation drifts away from reality (the sketch below illustrates why).
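To illustrate the drift problem, here is a deliberately naive sketch (the bias value is made up) that integrates gyroscope readings into a heading – the small per-sample error gets integrated too, so the heading slowly walks away from reality even if the device never moves:

[code language="csharp"]
public class GyroDriftSketch
{
    const double SampleInterval = 1.0 / 60.0; // 60 updates a second
    const double GyroBias = 0.001;            // made-up constant error (rad/s) of the kind real gyros exhibit

    double _heading;                          // integrated heading in radians

    // Called once per gyroscope sample with the angular rate about the vertical axis.
    public void OnGyroSample(double zRateRadiansPerSecond)
    {
        // Integrating the rate gives the angle - but the bias gets integrated too,
        // so the heading drifts even when the device is sitting still.
        _heading += (zRateRadiansPerSecond + GyroBias) * SampleInterval;
    }
}
[/code]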

For these reasons, some very clever people came up with the concept of sensor fusion.

Sensor Fusion

These sensors can be merged in software into a single “virtual” sensor using a process called Sensor Fusion. Many people have written in-depth articles about what Sensor Fusion is and how it works – but you may need a PhD to understand them. I think it is easiest to see it as a mathematical process that takes input from the three physical sensors (compass, gyroscope and accelerometer) and provides one unified quaternion representing the attitude of the device.

[Figure: Sensor Fusion block diagram]
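Our own fusion is more involved – that’s the follow-up post – but the basic idea can be sketched with a simple complementary filter: trust the gyroscope over short timescales, and gently pull the result towards the (noisy but drift-free) attitude derived from the compass and accelerometer. The code below is only an illustration of that idea using System.Numerics, not what we actually ship:

[code language="csharp"]
using System.Numerics;

public class ComplementaryFilterSketch
{
    // How hard we pull towards the compass/accelerometer attitude on each update.
    // Too small and gyro drift creeps back in; too large and compass noise shows through.
    const float CorrectionWeight = 0.02f;

    Quaternion _attitude = Quaternion.Identity;

    // gyroDelta: the rotation measured by the gyroscope since the last update.
    // referenceAttitude: an attitude derived from the accelerometer and compass.
    public Quaternion Update(Quaternion gyroDelta, Quaternion referenceAttitude)
    {
        // 1. Apply the gyro rotation - responsive and smooth, but it drifts over time.
        _attitude = Quaternion.Normalize(_attitude * gyroDelta);

        // 2. Nudge towards the reference attitude - noisy, but anchored to the real world.
        _attitude = Quaternion.Slerp(_attitude, referenceAttitude, CorrectionWeight);

        return _attitude;
    }
}
[/code]

The correction weight is the key tuning knob: it is the trade-off between responsiveness and stability that each platform’s built-in fusion has already made for you.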

To provide a more detailed example: if you were standing in the northern hemisphere with the device perpendicular to the ground (i.e. level on a tripod), starting facing the north pole at a heading of 0 degrees and then rotating through the headings below, the device’s attitude would be:

Heading    0°    45°           90°           180°
x           0     0             0             0
y          -1    -0.9238795    -0.7071068     0
z           0     0             0             0
w           0     0.3826834     0.7071068    -1
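If you are wondering where numbers like 0.3826834 and 0.7071068 come from: a rotation of angle θ about a unit axis (x, y, z) is represented by the quaternion

q = ( x·sin(θ/2), y·sin(θ/2), z·sin(θ/2), cos(θ/2) )

so the components always involve half angles – 0.3826834 is sin(22.5°) and 0.7071068 is sin(45°). The exact signs, and which axis the values land on, depend on the reference frame convention the platform uses.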

How does it help?

As I said at the start, integration with the sensors is at the core of what we do at TrueViewVisuals. We have several apps that read data from the sensors and provide a real-time view across a 3D world. We can pull in a real world terrain model and show what the terrain looks like in a particular direction.

Implementations

Each device manufacturer / OS vendor provides their own implementation of sensor fusion within their devices. These are usually good enough for general or gaming purposes – they tend to emphasise speed of response over absolute accuracy. Below I have shown some code that gets a quaternion out of the API provided by each OS.

All the code below is C#, as all the code we write is C#. For more information on running C# on iOS or Android, have a look at what Xamarin are up to.

Apple (iOS)

Apple provides the CMMotionManager class that can be used on iOS:

[code language=”csharp”]
public class IOSSensorFusionExample
{
    // keep references as fields so the manager and queue aren't garbage collected
    CMMotionManager _motionManager;
    NSOperationQueue _backgroundQueue = new NSOperationQueue();

    public void Start()
    {
        _motionManager = new CMMotionManager();
        _motionManager.DeviceMotionUpdateInterval = 1.0 / 60.0; // request 60 updates a second (1/60 would be integer division)

        // attitude relative to magnetic north, delivered on the background queue
        _motionManager.StartDeviceMotionUpdates(
            CMAttitudeReferenceFrame.XMagneticNorthZVertical,
            _backgroundQueue,
            delegate (CMDeviceMotion motionData, NSError error)
            {
                CMQuaternion cMQuatAttitude = motionData.Attitude.Quaternion;
                // do something useful with the quaternion here
            });
    }
}
[/code]

(See the Xamarin API for more details)

Android

Android provides the RotationVector sensor type, accessible from its SensorManager class:

[code language=”csharp”]
public class AndroidSensorFusionExample : Activity, ISensorEventListener
{
    public void Start()
    {
        // GetSystemService is available because this class derives from Activity.
        var sensorManager = (SensorManager)GetSystemService(Context.SensorService);
        var defaultRotationVectorSensor = sensorManager.GetDefaultSensor(SensorType.RotationVector);
        sensorManager.RegisterListener(this, defaultRotationVectorSensor, SensorDelay.Game);
    }

    public void OnSensorChanged(SensorEvent e)
    {
        // The rotation vector is converted to a quaternion as [w, x, y, z]...
        // (e.Values is an IList<float>, so ToArray needs System.Linq)
        float[] q = new float[4];
        SensorManager.GetQuaternionFromVector(q, e.Values.ToArray());

        // ...which we reorder into an (x, y, z, w) quaternion.
        Quaternion quaternion = new Quaternion(q[1], q[2], q[3], q[0]);
        // do something useful with the quaternion here
    }

    public void OnAccuracyChanged(Sensor sensor, SensorStatus accuracy)
    {
        // not needed for this example
    }
}
[/code]

(see the Xamarin Android API and the Android docs for more information)

Windows Phone

Windows Phone provides the Motion class:

[code language=”csharp”]
// Motion.IsSupported is worth checking first - not every device has the full sensor set.
Motion sensor = new Microsoft.Devices.Sensors.Motion();
sensor.CurrentValueChanged += (sender, args) =>
{
    var quaternion = args.SensorReading.Attitude.Quaternion;
    // do something useful with the quaternion here
};
sensor.Start();
[/code]

(see MSDN for more details)

Windows 8

Windows 8 provides the OrientationSensor class:

[code language=”csharp”]
// GetDefault returns null on devices without an orientation sensor.
var sensor = Windows.Devices.Sensors.OrientationSensor.GetDefault();
sensor.ReportInterval = 16; // in milliseconds - roughly 60 updates a second
sensor.ReadingChanged += (sender, args) =>
{
    var quaternion = args.Reading.Quaternion;
    // do something useful with the quaternion here
};
[/code]

(see MSDN for more details)