Hello, everyone! This is the inaugural post in my Master's Thesis blog. I'll be detailing my progress every week, and maybe even answering a few questions.
The high-level goal for the project is to use the iPhone's accelerometer and compass to read certain gestures (e.g., pointing in a direction) from the motion of a person holding the phone. Those gestures, once parsed, can be sent to a server and used to control a variety of robotic vehicles.
[Figure: The three iPhone accelerometers.]
To start the project off, I'll be pulling data from the iPhone accelerometer/compass and streaming it to the computer. I'll also be spending a lot of time looking at Apple's iOS Developer documentation and Unity to see if there are additional technologies that we can utilize.
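As a rough sketch of what the computer side of that stream might look like, here's a small Python receiver. Everything about it is an assumption for illustration: the phone is imagined sending UDP packets containing comma-separated `ax,ay,az` readings, and the port number is a placeholder — neither reflects whatever format the project ends up using.

```python
import socket

def parse_sample(packet: str):
    """Parse a comma-separated 'ax,ay,az' accelerometer packet into floats.

    NOTE: the 'ax,ay,az' wire format is a placeholder assumption, not the
    actual format the phone will send.
    """
    ax, ay, az = (float(v) for v in packet.strip().split(","))
    return ax, ay, az

def listen(port: int = 5555, max_packets: int = 10):
    """Receive up to max_packets UDP datagrams and print the parsed samples."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))  # port 5555 is arbitrary
    try:
        for _ in range(max_packets):
            data, addr = sock.recvfrom(1024)
            print(addr, parse_sample(data.decode("utf-8")))
    finally:
        sock.close()
```

UDP keeps the example simple and tolerates dropped packets, which is usually acceptable for a continuous sensor stream; a TCP or WebSocket channel would be the alternative if every sample mattered.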
That's all for now; tune in next week.