Manipulation, Learning, and Recall with Tangible Pen-Like Input

Project description: For my master's work at the University of Waterloo, I designed and fabricated a cuboid-shaped tangible pen-like input device. I 3D modeled and 3D printed a conductive outer case that works on any capacitive touch-screen. An IMU inside the device senses its orientation and transmits this information to a laptop over Wi-Fi. Machine learning techniques use angle data from the IMU to determine which of the device's 26 corners, edges, or faces contacts the touch-screen. The idea is to map a different command to each contact, enabling quick command selection that doesn't take up screen space. Two important usability questions arise: how long does it take users to manipulate the device, and does this depend on its size? And can anyone actually learn the locations of all 26 commands (spoiler: yes!)? This work formed part of my master's thesis and was published at CHI 2020.
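To give a sense of the contact-detection idea: a cuboid has 6 faces, 12 edges, and 8 corners, for 26 possible contacts in total, and each corresponds to a distinct direction from the device's center. The sketch below is a hypothetical, simplified stand-in for the machine-learning classifier described above; it just picks the contact direction best aligned with a "down" vector estimated from the IMU. All names here are illustrative, not from the actual system.

```python
# Hypothetical sketch: classify which of a cuboid's 26 contacts
# (6 faces, 12 edges, 8 corners) is touching the screen, given a
# unit "down" vector derived from IMU orientation. This is a
# nearest-direction heuristic, not the paper's ML classifier.
import itertools
import math

def contact_classes():
    """Map each of the 26 contacts to a unit vector from the cuboid center."""
    classes = {}
    for v in itertools.product((-1, 0, 1), repeat=3):
        nonzero = sum(1 for c in v if c)
        if nonzero == 0:
            continue  # skip the center; 27 - 1 = 26 contacts remain
        kind = {1: "face", 2: "edge", 3: "corner"}[nonzero]
        norm = math.sqrt(nonzero)  # components are -1/0/1, so |v| = sqrt(nonzero)
        classes[(kind, v)] = tuple(c / norm for c in v)
    return classes

CLASSES = contact_classes()  # 26 labeled unit directions

def classify_contact(down):
    """Return the contact whose direction best matches the sensed down vector."""
    return max(CLASSES, key=lambda k: sum(a * b for a, b in zip(CLASSES[k], down)))
```

For example, `classify_contact((0, 0, -1))` reports a face contact, while `classify_contact((1, 1, 1))` reports the corner along that diagonal; a real device would first convert IMU orientation (e.g. a quaternion) into this down vector and would need training data to handle sensor noise, which is where the machine learning comes in.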

Peer-Reviewed Paper:

Peer-Reviewed Demo:


Collaborators: Jean-Baptiste Beau, Géry Casiez, Daniel Vogel