This week was dedicated to filling out IRB forms and working on the questionnaire that will be given after the experiment. This marks the beginning of serious thinking about the experiment design. I decided that every subject should perform the obstacle course three times: once with on-screen controls, once with tilt-based controls, and once with hand gestures.
The questionnaire aims to explain why subjects performed the way they did during the study. There is a short section about video game fluency, especially with console games, to figure out which subjects are gamers and which are not. Most of the questions will be converted to 5-point Likert scale questions, and the questionnaire will be put online via SurveyMonkey or Google Docs (link to come).
In terms of the system itself, I purchased an armband for the experiments and will hopefully try it out over the next week. My hope is that being able to attach the phone to the subject in some way will reduce the awkwardness of handling the controls, since the phone is very large and flat.
Tune Belt armband
The hand-gesture-based controls are not complete, but the more important task right now is deciding which gestures will be used. Many of them will probably be drawn from the 1987 Army Visual Signals field manual.
There have been quite a few developments since Winter Break. First, on the administrative side, the proposal was submitted, including a draft of an abstract and title:
Using smartphones for gesture-based control of robotic systems
Robotic control systems are becoming more common, especially in the military. With military applications, there are lives at stake, so having the most efficient, intuitive control system can make a large difference in the success of a mission and the safety of the soldiers involved. Arm and hand gestures are typical human forms of communication, so applying them to a robotic control system can yield a more intuitive system. Varcholik et al. describe a gesture-based control system that uses the Nintendo Wiimote to recognize arm/hand gestures to control a robot. In this thesis, I propose the use of smartphones for gesture-based control of robotic systems. The proposed controller will be evaluated by performing a set of carefully designed human factors experiments and computing a set of metrics (e.g. time taken to complete tasks) to measure the efficacy of the gesture-based control system.
The thesis committee will consist of Prof. Badler, Prof. Lane, and Prof. Daniilidis. I also took Winter Break as an opportunity to purchase a Roomba 560, RooTooth dongle, XBOX 360 controller, and a Samsung Galaxy II for use with the project.
Samsung Galaxy II
Roomba 560
Next, the robotics side! After several days of working through the Roomba Open Interface Specification, the Roomba finally accepts commands through the RooTooth Bluetooth dongle. It can now be controlled through the XBOX controller, the keyboard arrow keys, and the command line, using scripts that I wrote with the RoombaSCI python library. I also downloaded the Cellbots Android app, so several more phone-based direct control schemes work out of the box (see the first part of the video).
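To give a flavor of what those control scripts are doing underneath (this is a sketch of the raw Open Interface protocol, not the RoombaSCI API; the serial port name and baud rate in the commented usage are placeholders for whatever the RooTooth exposes), the Roomba OI takes single-byte opcodes, and the Drive command is opcode 137 followed by velocity and turn radius as signed 16-bit big-endian values:

```python
import struct

# Roomba Open Interface opcodes (from the OI Specification)
START, SAFE, DRIVE = 128, 131, 137

STRAIGHT = -32768  # special radius value 0x8000: drive straight

def drive_packet(velocity_mm_s, radius_mm=STRAIGHT):
    """Build the 5-byte Drive command: opcode 137, then velocity
    (-500..500 mm/s) and turn radius as big-endian signed shorts."""
    assert -500 <= velocity_mm_s <= 500
    return struct.pack('>Bhh', DRIVE, velocity_mm_s, radius_mm)

# Hypothetical usage over the RooTooth serial link (port name is a guess):
# import serial
# port = serial.Serial('/dev/rfcomm0', baudrate=115200)
# port.write(bytes([START, SAFE]))       # wake the OI, enter Safe mode
# port.write(drive_packet(200))          # forward at 200 mm/s
# port.write(drive_packet(0))            # stop
```

Mapping arrow keys or an XBOX thumbstick onto control then just means translating key/axis state into a velocity and radius and sending a fresh Drive packet.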
Cellbots for Android. I plan to use the D-Pad and Tilt controls from Part 1. Cellbots provides an API for connecting Android phones to robots like the Roomba via Bluetooth, so I will almost certainly be using it to write the gesture-based app. I will also look into using this framework, which provides gesture recording and recognition as an Android service.
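Whatever the gesture framework provides, recognition ultimately comes down to comparing a freshly recorded accelerometer trace against stored templates. One common approach (an assumption on my part, not necessarily what that framework does) is dynamic time warping, which tolerates gestures performed at different speeds. A minimal 1-D sketch:

```python
# Minimal dynamic-time-warping gesture matcher.
# Sketch of one standard approach; real gestures would use 3-axis
# accelerometer samples with a vector distance instead of abs().

def dtw_distance(a, b):
    """Cost of optimally aligning sequences a and b (lower = more similar)."""
    n, m = len(a), len(b)
    INF = float('inf')
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def classify(trace, templates):
    """Return the name of the stored template closest to the trace."""
    return min(templates, key=lambda name: dtw_distance(trace, templates[name]))

# Toy example: a slowed-down "wave" still matches the wave template.
templates = {'wave': [0, 1, 0, -1, 0], 'thrust': [0, 2, 4, 2, 0]}
print(classify([0, 0.5, 1, 0.5, 0, -0.5, -1, -0.5, 0], templates))  # wave
```

Each of the Army visual signals would become one stored template (or several per signal, recorded from different people), and the app would classify each recorded trace and send the matching drive command.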
Once the "hand waving" app is complete, I can start developing experiments that compare the several control methods to determine which scheme people prefer.