“Gestures in Human-Computer Communication”, a chapter on gestural input in “The Art of Human-Computer Interface Design” co-authored by Gordon Kurtenbach and Eric A. Hulteen, made me think about how the iSight could be used as a gestural input device.
Stationary figures or symbols.
- Hold up your hand with 1 to 5 fingers extended to rate a selected song, photo, movie, whatever.
- Point in a direction with a single finger to scroll.
- Point left or right with two or more fingers to go Back or Forward in a file or web browser.
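Just to make the idea concrete, here's a minimal sketch of how those stationary gestures might dispatch to actions. It assumes some upstream recognizer (not shown, and very much hypothetical) has already reported an extended-finger count and an optional pointing direction; the function and action names are mine, purely for illustration.

```python
def interpret_static_gesture(fingers, direction=None):
    """Map a stationary hand pose to a UI action string.

    `fingers` is the number of extended fingers reported by a
    hypothetical recognizer; `direction` is None for a held-up hand,
    or one of "up", "down", "left", "right" for a pointing hand.
    """
    if direction is None:
        # Hand held up, 1-5 fingers extended: rate the selected item.
        if 1 <= fingers <= 5:
            return "rate:%d" % fingers
        return "none"
    if fingers == 1:
        # Single finger pointing: scroll in that direction.
        return "scroll:%s" % direction
    if fingers >= 2 and direction in ("left", "right"):
        # Two or more fingers pointing left/right: Back or Forward.
        return "back" if direction == "left" else "forward"
    return "none"
```

The hard part, of course, is the recognizer that feeds this; the mapping itself is trivial.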
Motion over time.
- Conduct a simple V gesture to input the beats per minute of a playing song.
- Lower your hand, palm down, to lower the audio volume; raise your hand, palm up, to raise the volume.
- Move your hand diagonally from the center toward a corner of the screen to trigger an Exposé action, activate the screen saver, or execute an arbitrary action.
I have no idea how computationally intensive visual gesture recognition is, particularly recognition of dynamic gestures, but I’d hope it can be done with hardware currently on the market.
My desire for visual gesture recognition is partly self-interested: I’ve been playing drums for 13 years, and for a time I practiced about 3 hours per day. In all that playing, I’ve never developed the kind of soreness I get from prolonged mouse use.