NOTE: For the uber-eager, the actual rotation starts at the 1:00 mark. Before that, there’s some introductory explanation of what’s happening and the hardware in use.
This video is a demo of the IMU algorithm results (a.k.a. sensor fusion) achieved with SparkFun’s 6DOF motion sensor board, which uses an ADXL345 digital 3-axis accelerometer and ITG-3200 digital 3-axis gyroscope. The board provides raw readings from each device, which are combined on the Teensy++’s MCU into a quaternion representation of the orientation.
There is a very slight delay between the physical movement and the on-screen representation, which is caused by a secondary “smoothing” filter that I hope to improve a bit. The orientation algorithm itself is very fast, but the raw data coming from the sensors is a little noisy, which is why there’s some extra filtering in place.
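To show the general idea behind that kind of smoothing, here is a single-pole low-pass filter (exponential smoothing) of the sort commonly used to tame noisy raw accelerometer and gyro samples. This is just an illustration of the technique and its lag trade-off, not the exact filter used in the video; the names and the alpha value are assumptions.

```c
#include <math.h>

/* Hypothetical single-pole low-pass ("exponential smoothing") filter.
 * alpha in (0, 1]: smaller values give smoother output but more lag,
 * which is the trade-off behind the slight on-screen delay. */
typedef struct {
    float alpha;   /* smoothing factor */
    float state;   /* last filtered value */
    int   primed;  /* 0 until the first sample arrives */
} lowpass_t;

void lowpass_init(lowpass_t *f, float alpha) {
    f->alpha = alpha;
    f->state = 0.0f;
    f->primed = 0;
}

float lowpass_step(lowpass_t *f, float raw) {
    if (!f->primed) {          /* seed with the first raw sample */
        f->state = raw;
        f->primed = 1;
    } else {
        /* move a fraction of the way toward the new raw sample */
        f->state += f->alpha * (raw - f->state);
    }
    return f->state;
}
```

Running one of these per axis (six total for accel + gyro) costs almost nothing on the MCU, but every extra pole adds a little more of the delay mentioned above.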
This data is sent over the serial connection to the InvenSense “teapot” demo (designed to be used with their MPU-6050), which I used because I knew it worked given the right input data. The MPU-6050 is capable of doing all of the complex math on the chip itself, which is way better than forcing the MCU to do it, but this is a proof of concept that shows that it can all be done right on the microcontroller. I haven’t been able to get the MPU-6050 to work perfectly yet, so I’m taking advantage of the hardware I know how to use right now—namely, the 6DOF board.
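For the serial link itself, the quaternion has to be converted from floats into something compact. A common approach (and the one sketched here, as an assumption about the payload only, not the demo’s exact header/footer framing) is to scale each component into signed 16-bit fixed point and send it big-endian:

```c
#include <stdint.h>

/* Illustrative packing of a unit quaternion into signed 16-bit fixed
 * point for serial transmission, high byte first. The Q14 scale factor
 * of 16384 (giving a +/-2.0 range) is an assumption for illustration;
 * the InvenSense demo's actual packet framing is not shown here. */
void pack_quat_q14(const float q[4], uint8_t out[8]) {
    for (int i = 0; i < 4; i++) {
        int16_t fixed = (int16_t)(q[i] * 16384.0f);
        out[2 * i]     = (uint8_t)(fixed >> 8);    /* high byte */
        out[2 * i + 1] = (uint8_t)(fixed & 0xFF);  /* low byte */
    }
}
```

Eight bytes per update keeps the serial bandwidth low even at high sample rates, and the host side just reverses the scaling to recover the floats.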
Quaternion info and the basis for the IMU filter in use came from the Madgwick report (PDF), which Fabio Varesano (@fax8) recommended to me. I don’t quite grasp all of the math involved, but I understand the basic concept of what needs to happen. That report very helpfully includes a C-optimized example program implementing both a gyro/accel IMU algorithm and a gyro/accel/compass MARG algorithm. I’m using a slightly modified version of the IMU algorithm in this demo video. You can view the relevant Keyglove source file here.
I’m going to turn this into a simple Arduino sketch that you can use as-is with a SparkFun 6DOF board and get filtered orientation data. When that is finished, I’ll link to it from here.
With this information and a bit more math (most of which is already in place), I will be able to calculate gravity-free velocity with respect to the ground. That means that regardless of orientation, the six basic directions (up, down, left, right, forward, backward) will always be the same relative to a known “pointing forward” condition. As soon as that’s ready, I’ll post another video, along with a follow-up to the first post I wrote about dead reckoning.
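The gravity-compensation step can be sketched as follows: use the orientation quaternion to rotate the body-frame accelerometer reading into the world frame, then subtract the known 1 g gravity vector, leaving linear acceleration that can be integrated into velocity. This is a sketch of the general approach under assumed conventions (readings in g, world +z pointing up, my own function names), not the Keyglove source itself:

```c
#include <math.h>

/* Rotate a body-frame accel reading (in g) into the world frame using
 * the orientation quaternion (w, x, y, z), then subtract gravity.
 * What remains is linear acceleration, suitable for integrating into
 * velocity for dead reckoning. The (0, 0, 1) "z is up" gravity
 * convention is an assumption for this sketch. */
void gravity_free_accel(const float q[4], const float a_body[3],
                        float a_lin[3]) {
    float w = q[0], x = q[1], y = q[2], z = q[3];
    float ax = a_body[0], ay = a_body[1], az = a_body[2];

    /* world = R(q) * body, with R built from the quaternion */
    a_lin[0] = (1 - 2*(y*y + z*z)) * ax + 2*(x*y - w*z) * ay + 2*(x*z + w*y) * az;
    a_lin[1] = 2*(x*y + w*z) * ax + (1 - 2*(x*x + z*z)) * ay + 2*(y*z - w*x) * az;
    a_lin[2] = 2*(x*z - w*y) * ax + 2*(y*z + w*x) * ay + (1 - 2*(x*x + y*y)) * az;

    a_lin[2] -= 1.0f;  /* remove gravity: 1 g along world +z */
}
```

With a correct orientation estimate, a stationary sensor yields (0, 0, 0) here no matter how it is tilted, which is precisely the property that makes the six fixed directions possible.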