Sunday, October 30, 2011

Controlling PR2

Hello, everyone! Last week, my aim was to control a PR2 model in Unity with my iPhone. The model moved forward at a set pace, and as directions ("N", "SW", etc.) were received from the phone, it adjusted its rotation to match. See the video below for more details.
There are several problems with this approach. One is that no GPS data is currently being sent from the iPhone to the server; the only data being sent is the heading (0-360 degrees) and the readings from the phone's three accelerometers. Another problem is that the data is sent to the server over WiFi, which is sometimes slow and unreliable. Since we aim to control the robot in real time, lag or lost commands could be very dangerous. The third problem is that the "commands" are currently triggered by a certain amount of acceleration in any direction. There are no real "gestures", merely a check for when the phone's acceleration exceeds a certain threshold.
This week, I hope to fix the first two issues by sending more information (like GPS coordinates) to the server and by using Bluetooth communication instead of WiFi. The third issue (gestures) is a larger portion of the project and will be tackled over the coming weeks.

Saturday, October 8, 2011

Shake It Like a High Pass Filter

Hey everyone,

Last week, I got my feet wet with the iPhone accelerometer and compass data. To get started, I checked out some sample code from Apple's iOS Developer Library. I decided to use the AccelerometerGraph sample code as a basis for my code. 
The AccelerometerGraph app in action: two graphs, one of the raw data and one of the filtered data.
AccelerometerGraph samples the X, Y, and Z accelerometers at 60 Hz (displayed as red, green, and blue, respectively). The acceleration values are measured in units of g-force, with a value of 1.0 approximately equal to gravity acting along that axis of the phone. There was no minimum acceleration threshold in the original app; exactly 60 measurements were logged and graphed every second.

The app provides options for Low Pass/High Pass and Standard/Adaptive filtering. Low-pass filters let the slow, low-frequency components of the signal through while attenuating the fast ones. In terms of acceleration, this means that quick, sharp movements are filtered out, while steady sources of acceleration like gravity, or larger sweeping movements, show up in the graph. High-pass filters do the opposite: they pass the fast components and attenuate the slow ones, so the effect of gravity is filtered out, leaving just the shorter, sharper movements in the graph. An adaptive filter is one that self-adjusts based on an error signal, attenuating noise while leaving the underlying signal intact.
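To make the filter behavior concrete, here is a minimal sketch of the per-axis update rules I'm working from. This is my own illustration, not Apple's actual filter classes from the sample, and the constants (the 0.5 g normalization in particular) are arbitrary choices for readability.

#include <math.h>

// One step of each filter for a single axis. alpha lies between 0 and 1
// and is derived from the cutoff frequency and the 60 Hz sample rate.

// Low-pass: keeps the slow components (gravity, large sweeping motions).
double LowPass(double input, double prevOutput, double alpha) {
    return alpha * input + (1.0 - alpha) * prevOutput;
}

// High-pass: keeps the fast components (sharp shakes) and removes gravity.
double HighPass(double input, double prevInput, double prevOutput, double alpha) {
    return alpha * (prevOutput + input - prevInput);
}

// A rough take on "adaptive" for the low-pass case: when the signal is
// changing quickly, push alpha toward 1 (less smoothing, more responsive);
// when it is nearly still, fall back to the heavier smoothing of baseAlpha.
double AdaptiveAlpha(double input, double prevOutput, double baseAlpha) {
    double delta = fabs(input - prevOutput);   // how fast the signal is moving
    double k = fmin(delta / 0.5, 1.0);         // normalize; 0.5 g is arbitrary
    return baseAlpha + k * (1.0 - baseAlpha);
}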

When modifying AccelerometerGraph for gesture recognition, I added a minimum threshold: any measurement where none of the accelerometers recorded a value higher than 0.05 g was thrown out. The Adaptive High Pass Filter worked best for filtering out both the effects of gravity and small incidental movements like shifting from side to side. The image above shows the Adaptive High Pass Filter in action; the erratic measurements in the top graph are "toned down" to become the measurements in the bottom graph.
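In code, the sampling and threshold check look roughly like this. It's a sketch built on the UIAccelerometer API that AccelerometerGraph uses; in the real app the values go through the adaptive high-pass filter first, and the method names here are simplified.

#import <UIKit/UIKit.h>
#include <math.h>

static const double kMinAcceleration = 0.05;   // threshold, in g

// Configure the shared accelerometer for 60 Hz updates, as in AccelerometerGraph.
- (void)startSampling {
    UIAccelerometer *accelerometer = [UIAccelerometer sharedAccelerometer];
    accelerometer.updateInterval = 1.0 / 60.0;
    accelerometer.delegate = self;
}

// Delegate callback; x, y, and z are in units of g.
- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration {
    // Raw values shown here to keep the sketch short; the app filters first.
    if (fabs(acceleration.x) <= kMinAcceleration &&
        fabs(acceleration.y) <= kMinAcceleration &&
        fabs(acceleration.z) <= kMinAcceleration) {
        return;   // no axis passed the threshold, so throw the sample out
    }
    // Otherwise record the sample and grab the current compass heading.
}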

When an acceleration passes the threshold, I also grab the current heading from the compass. The heading is returned as a float from 0 to 360 degrees, where 0 is North, 90 is East, etc. Headings are normally relative to true North, but if that data is not available, magnetic North is used instead. The heading is measured with respect to the top of the phone, so some trigonometry may be required in the future if the phone is tilted.
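Reading the heading goes through Core Location. Here is a sketch; the locationManager and currentHeading properties are placeholders for however the real class stores these values.

#import <CoreLocation/CoreLocation.h>

// Start compass updates. The enclosing class acts as the CLLocationManagerDelegate.
- (void)startHeadingUpdates {
    if (![CLLocationManager headingAvailable]) return;   // no magnetometer
    self.locationManager = [[[CLLocationManager alloc] init] autorelease];
    self.locationManager.delegate = self;
    [self.locationManager startUpdatingHeading];
}

// Called as the heading changes; 0 is North, 90 is East, and so on.
- (void)locationManager:(CLLocationManager *)manager
       didUpdateHeading:(CLHeading *)newHeading {
    // trueHeading is negative when true North can't be determined,
    // so fall back to the magnetic heading in that case.
    CLLocationDirection heading = (newHeading.trueHeading >= 0.0)
                                      ? newHeading.trueHeading
                                      : newHeading.magneticHeading;
    self.currentHeading = heading;   // read later, when a sample passes the threshold
}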

Once the heading measurement is recorded, I use the TouchJSON library to encode the three accelerations and the heading as a JSON string. Using the ASIHTTPRequest library, I send the JSON to a Node.js backend (currently running on my laptop), where the heading value is converted to a compass-point string like "N", "NW", "NNW", etc. For next week, I hope to replace the Node backend with Unity/C#. In the long term, I also hope to develop more rigorous ways of detecting a gesture in addition to the threshold and filters; I will be reading up on the iOS shake gesture recognition code.
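The encode-and-send step looks roughly like the sketch below. Treat it as an approximation: the payload keys, the laptop's URL, and the exact TouchJSON serializer selector are assumptions, not the project's actual code.

#import "CJSONSerializer.h"    // TouchJSON
#import "ASIHTTPRequest.h"     // ASIHTTPRequest

- (void)sendAcceleration:(UIAcceleration *)acceleration heading:(double)heading {
    // Package the three accelerations and the heading (keys are illustrative).
    NSDictionary *payload = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithDouble:acceleration.x], @"x",
        [NSNumber numberWithDouble:acceleration.y], @"y",
        [NSNumber numberWithDouble:acceleration.z], @"z",
        [NSNumber numberWithDouble:heading],        @"heading",
        nil];

    // Serialize to JSON with TouchJSON.
    NSError *error = nil;
    NSData *json = [[CJSONSerializer serializer] serializeObject:payload error:&error];
    if (error != nil) return;

    // POST to the Node.js backend (placeholder address for my laptop).
    NSURL *url = [NSURL URLWithString:@"http://my-laptop.local:3000/motion"];
    ASIHTTPRequest *request = [ASIHTTPRequest requestWithURL:url];
    [request setRequestMethod:@"POST"];
    [request addRequestHeader:@"Content-Type" value:@"application/json"];
    [request appendPostData:json];
    [request startAsynchronous];
}

On the Node side, converting the heading to a compass-point string amounts to splitting the circle into sixteen 22.5-degree sectors and indexing into an array like ["N", "NNE", "NE", ...].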

Saturday, October 1, 2011

First Post

Hello, everyone! This is the inaugural post in my Master's Thesis blog. I'll be detailing my progress every week, and maybe even answering a few questions.

The high-level goal for the project is to use the iPhone accelerometers and compass to read certain gestures (e.g. pointing in a direction) from the motion of a person holding the phone. Those gestures, once parsed, can be sent to a server and used to control a variety of robotic vehicles.
The three iPhone accelerometers.
To start the project off, I'll be pulling data from the iPhone accelerometer/compass and streaming it to the computer. I'll also be spending a lot of time looking at Apple's iOS Developer documentation and Unity to see if there are additional technologies that we can utilize.

That's all for now, tune in next week.