"Understanding How People Use Skin as an Input Surface for Mobile Computing" surveys how people might use skin as a gesture-based input surface. No actual sensors are involved; instead, the research focuses on the gestures themselves. The authors hope these findings will inform the design of future input sensors, rather than letting the sensors dictate which gestures are possible.
The paper contributes results from an empirical study of on-skin input, an emerging technique for controlling mobile devices. Skin is fundamentally different from off-body touch surfaces, opening up a new and largely unexplored interaction space. The authors investigate the characteristics of various skin-specific input modalities, analyze what kinds of gestures people perform on skin, and study which input locations are preferred.