

Augmented Reality on Android: Using GPS and the Accelerometer

Follow this guide for implementing two critical elements of an Augmented Reality application on Android: GPS and the accelerometer.


Requesting Accelerometer Data

The last piece you need to implement for your AR engine is access to the accelerometer data. Thankfully, Android makes this information easy to gather. In the previous AR article, the example called for requesting the orientation of the phone, and a call to registerListener on the sensor manager object retrieved the compass data. You use nearly the same technique to request accelerometer data. First, grab a reference to the sensor manager with the following line:
sensorMan = (SensorManager) ctx.getSystemService(Context.SENSOR_SERVICE);

You call the getSystemService method on a context object (ctx in the above code). Here is the complete code for the combined orientation and accelerometer listener.

// These fields live in the enclosing class; an anonymous inner
// class cannot declare its own static fields.
private volatile float direction = 0f;
private volatile float inclination = 0f;
private volatile float aboveOrBelow = 0f;
private static final float kFilteringFactor = 0.05f;

private final SensorEventListener listener = new SensorEventListener(){
   public void onAccuracyChanged(Sensor sensor, int accuracy){}

   public void onSensorChanged(SensorEvent evt){
      float[] vals = evt.values;
      if(evt.sensor.getType() == Sensor.TYPE_ORIENTATION){
         float rawDirection = vals[0];

         // Rolling low-pass filter on the compass bearing
         direction = (float) ((rawDirection * kFilteringFactor) +
            (direction * (1.0 - kFilteringFactor)));

         // Rolling low-pass filter on the pitch
         inclination = (float) ((vals[2] * kFilteringFactor) +
            (inclination * (1.0 - kFilteringFactor)));

         // Flip the sign when the accelerometer says the screen faces down
         if(aboveOrBelow > 0){
            inclination = inclination * -1;
         }
      }

      if(evt.sensor.getType() == Sensor.TYPE_ACCELEROMETER){
         aboveOrBelow = (float) ((vals[2] * kFilteringFactor) +
            (aboveOrBelow * (1.0 - kFilteringFactor)));
      }
   }
};

Yikes, that's a lot of code. What is going on here? First, you set up all the values for your listener. This means that at any time (when a draw call comes through, for example) you can query the listener for the compass bearing and inclination of the phone. Each value arrives in an array of floats, and which values matter depends on the type of update you're getting (orientation or acceleration). You can see from the code which values to pull out when.
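For completeness, here is a sketch of how you might register this listener for both sensor types. It assumes sensorMan was obtained via getSystemService as shown earlier, and it uses the same TYPE_ORIENTATION sensor this article relies on:

```java
// Sketch: register one listener for both orientation and accelerometer
// updates; SENSOR_DELAY_GAME is a reasonable rate for an AR overlay.
Sensor orientation = sensorMan.getDefaultSensor(Sensor.TYPE_ORIENTATION);
Sensor accelerometer = sensorMan.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
sensorMan.registerListener(listener, orientation,
   SensorManager.SENSOR_DELAY_GAME);
sensorMan.registerListener(listener, accelerometer,
   SensorManager.SENSOR_DELAY_GAME);
```

Remember to call unregisterListener(listener) in your activity's onPause so you don't drain the battery while the app is in the background.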

Next, you take the sensor data and use a little filtering math to determine two critical pieces of information:

  1. The direction the phone is pointing
  2. The screen's angle relative to the horizon
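Both pieces of information come out of the same exponential smoothing step. Stripped of the Android types, the math can be sketched in plain Java (the class and method names here are illustrative, not from the article's code):

```java
public class RollingFilter {
    // Same constant as in the listener: a small factor means heavy smoothing.
    static final float kFilteringFactor = 0.05f;

    // One step of the rolling (exponential low-pass) filter: blend the new
    // raw reading into the running filtered value.
    static float filter(float rawValue, float filteredValue) {
        return (float) ((rawValue * kFilteringFactor)
                + (filteredValue * (1.0 - kFilteringFactor)));
    }

    public static void main(String[] args) {
        float direction = 0f;
        // Feed a constant bearing of 90 degrees: the filtered value creeps
        // toward 90 instead of jumping there, damping out sensor jitter.
        for (int i = 0; i < 5; i++) {
            direction = filter(90f, direction);
            System.out.println(direction);
        }
    }
}
```

The trade-off is responsiveness: the smaller kFilteringFactor is, the steadier the overlay, but the longer it lags behind a real turn of the phone.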

The first is called the azimuth, or direction from north, and the second is called the inclination, or the angle above or below the horizon at which the camera is pointed. To determine these values, your first mathematical task is to filter the compass movement of the camera. This is called a rolling filter because each new reading is blended into a running average, smoothing out sensor jitter. The direction variable tells you where the top of the phone is pointed, not where the camera itself is pointed, so you'll need to correct it a little.

Your second mathematical task is to run a rolling filter on the pitch, which will provide a measurement in degrees, where 90 is on the horizon, 45 is halfway up or halfway down, and 0 is straight up or straight down. Notice that with a reading of 45 you don't know whether the phone is pointed up or down from the horizon. This is where the accelerometer comes in. You use the reading generated by the accelerometer to make the inclination positive (above the horizon) or negative (below the horizon).
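The sign fix boils down to one comparison. Pulled out of the listener into plain Java, it might look like this (signAdjust is a hypothetical helper name, not part of the article's code):

```java
public class InclinationSign {
    // Negate the inclination when the accelerometer's filtered z value
    // (aboveOrBelow in the listener code) says the screen faces downward.
    static float signAdjust(float inclination, float aboveOrBelow) {
        return (aboveOrBelow > 0) ? -inclination : inclination;
    }

    public static void main(String[] args) {
        System.out.println(signAdjust(45f, 3.2f));   // prints -45.0: below the horizon
        System.out.println(signAdjust(45f, -3.2f));  // prints 45.0: above the horizon
    }
}
```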

That, in a nutshell, is all you need from the acceleration sensor.

The Building Blocks of Augmented Reality

You now have all the tools required to build your own Augmented Reality engine. All that's left is a little math, some Android layout foo, and a fair amount of gumption. If you are more interested in building an AR app than the AR engine itself, I am currently putting together an open source Augmented Reality engine for Android. You can follow my somewhat faltering progress on Twitter at twitter.com/androidarkit.

As much as I'd like to put all the math and drawing code together and unify these three pieces of information with an overlay on the camera, all that is beyond the scope of this article. However, this and the previous AR article should serve as complete introductions to the compass, camera preview, accelerometer, and GPS subsystems of Android. Now you have the building blocks you need to create the next big Augmented Reality application.

Chris Haseman is an independent software engineer specializing in wireless development. He can be found riding his bike between coffee shops in San Francisco. He's the author of the book Android Essentials (published by Apress). In his spare time, he's a resident DJ at xtcradio.com and a martial arts instructor.