THESIS
2019
xi, 154 pages : illustrations ; 30 cm
Abstract
Wearable computers have seen a recent resurgence in interest and popularity, with
smartglasses and smartwatches poised to change the way we play and work. Today,
consumer applications of these wearable computers focus on fitness tracking, message
notifications, gaming, and entertainment. Under several key constraints of wearable
computers, such as miniature touch interfaces, small screen real estate, user mobility,
limited computational resources, and limited battery life, existing input techniques
designed for desktop computers and smartphones are poorly suited to mobile scenarios.
Cumbersome and difficult interaction with wearable computers has become a hurdle to
achieving the wide adoption that today's smartphones enjoy. Therefore, there is an
unmet demand for interaction techniques designed specifically for wearable computers.
In this thesis, we present several embodied interaction techniques that enhance object
manipulation and text entry under these constraints. The techniques are devised to
leverage advantageous features of the human body and of human experience, such as the
dexterity of the fingertip, the lexicographical order ingrained in our memory,
proprioception, and the opposable thumb. We thoroughly consider the key constraints of
wearable computers and explore different input modalities and whether users accept them,
as follows. Our first study proposes a pointing technique for manipulating digital
objects on computing-resource-constrained augmented reality smartglasses, which improves
user operations by 46%. In the next study, we investigate two text entry interfaces for
an AR headset, recognizing that the small screen real estate should be reserved for user
interaction with digital objects overlaid on the physical surroundings. We propose a
1-line invisible layout that occupies only 13.14% of the screen real estate, at the edge
region, which is 62.80% smaller than the default keyboard layout.
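As a rough check of how these two figures relate, here is a back-of-the-envelope Python
sketch; the implied footprint of the default layout is our inference, not a number
reported in the thesis.

    # Hedged sanity check: if the 1-line layout occupies 13.14% of the screen and is
    # 62.80% smaller than the default keyboard, the default layout's footprint follows.
    one_line_footprint = 13.14       # percent of screen real estate (reported)
    relative_reduction = 0.6280      # "62.80% smaller" (reported)
    default_footprint = one_line_footprint / (1.0 - relative_reduction)
    print(f"Implied default keyboard footprint: {default_footprint:.1f}% of the screen")
    # -> roughly 35.3% of the screen (inferred, not stated in the abstract)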
Next, we contribute a force-assisted text entry technique for the constrained-size
touchscreen of smartwatches. By augmenting the touchscreen with force input, we propose
a trimetric-optimized ambiguous keyboard as small as a US one-cent coin (19.5 mm2) and
explore a force-based disambiguation technique on smartwatches.
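To make the notion of an ambiguous keyboard concrete, here is a minimal Python sketch of
dictionary-based disambiguation; the three-key letter grouping and the toy word list are
illustrative assumptions, not the layout or lexicon used in the thesis.

    # Illustrative only: each key of a tiny ambiguous keyboard covers a group of
    # letters, and a dictionary resolves a sequence of taps into candidate words.
    KEYS = {0: "abcdefghi", 1: "jklmnopqr", 2: "stuvwxyz"}     # assumed 3-key grouping
    LEXICON = ["hi", "is", "it", "we", "you", "work", "play"]  # toy word list

    def key_of(letter):
        # Which ambiguous key holds this letter?
        return next(k for k, letters in KEYS.items() if letter in letters)

    def candidates(tap_sequence):
        # Dictionary words whose letters fall, in order, on the tapped keys.
        return [w for w in LEXICON
                if len(w) == len(tap_sequence)
                and all(key_of(c) == k for c, k in zip(w, tap_sequence))]

    print(candidates([0, 2]))   # -> ['is', 'it']: two words collide on the same taps

The sketch only shows where the ambiguity arises; in the thesis, as we read it, the
force applied on each tap serves as the additional signal that selects among such
colliding candidates.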
Finally, we design a one-handed thumb-to-finger text entry interface and implement a
quadmetric-optimized 12-keypad layout on a prototype glove. The glove enables users to
accomplish text entry through unnoticeable thumb movements within the finger space of
one hand, while the other hand remains free for mobile scenarios such as carrying a
briefcase, shaking hands, or handling other daily tasks.
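The abstract does not spell out how the 12 keys map onto the hand; one plausible reading,
sketched below purely as an assumption, is that each of the three phalanx segments of the
four fingers forms one thumb-reachable key (4 x 3 = 12).

    # Assumed mapping for illustration only: 4 fingers x 3 segments = 12 keys,
    # each addressed by a (finger, segment) pair sensed on the glove.
    FINGERS = ["index", "middle", "ring", "little"]
    SEGMENTS = ["proximal", "middle", "distal"]
    KEYPAD = {(f, s): key_id
              for key_id, (f, s) in enumerate((f, s) for f in FINGERS for s in SEGMENTS)}

    def key_pressed(finger, segment):
        # Map a sensed thumb touch on a finger segment to one of the 12 keypad keys.
        return KEYPAD[(finger, segment)]

    print(len(KEYPAD))                      # -> 12
    print(key_pressed("middle", "distal"))  # -> 5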