Active authentication is emerging as a promising way to continuously and
unobtrusively authenticate smartphone users post-login. Although research in
this area has shown that behavioral traits, such as touchscreen gestures and
device movements, can be used to distinguish a legitimate user from an
attacker, fundamental questions about these traits remain unanswered.
These include: how, and to what extent, do posture and movement impact
behavioral traits; what is the impact of human variability (anthropometric
properties, age, gender, and health conditions) on behavioral traits; to what
extent can these traits be spoofed using posture and movement observations; and
how can we strengthen these traits against spoofing attacks. In this project,
an interdisciplinary team of investigators from the Computer Science,
Biomedical Sciences, Physical Therapy, and Art and Media Technologies
departments at NYIT
will leverage capabilities in 3D motion capture, behavioral biometric
authentication research, and motor control research to address these questions.
This project is funded by the National Science Foundation.

Common smartphone authentication mechanisms such as PINs, graphical passwords,
and fingerprint scans offer limited security. They are relatively easy to guess
or spoof, and are ineffective when the smartphone is captured after the user
has logged in. Multi-modal active authentication addresses these challenges by
frequently and unobtrusively authenticating the user via behavioral biometric
signals, such as touchscreen interaction, hand movements, gait, voice, and
phone location. However, these techniques raise significant privacy and
security concerns because the behavioral signals used for authentication
represent personally identifiable data and often expose private information
such as user activity, health, and location. Because smartphones can be easily
lost or stolen, it is paramount to protect all sensitive behavioral information
collected and processed on these devices. One approach for securing behavioral
data is to perform off-device authentication via privacy-preserving protocols.
However, our experiments show that the energy required to execute these
protocols, implemented with state-of-the-art techniques, is unsustainably
high and quickly depletes the smartphone’s battery. This
research advances the state of the art of privacy-preserving active
authentication by devising new techniques that significantly reduce the energy
cost of cryptographic authentication protocols on smartphones. Further, this
research takes into account signals that indicate that the user has lost
possession of the smartphone, in order to trigger user authentication only when
necessary. The focus of this project is in sharp contrast with existing
techniques and protocols, which have been largely agnostic to energy
consumption patterns and to the user’s possession of the smartphone
post-authentication. The outcome of this project is a suite of new
cryptographic techniques and possession-aware protocols that enable secure
energy-efficient active authentication of smartphone users. These cryptographic
techniques advance the state of the art of privacy-preserving active
authentication by re-shaping individual protocol components to account for
the complex energy tradeoffs and network heterogeneity integral to modern
smartphones. Finally, this project will focus on novel techniques to securely
offload computation related to active authentication from the smartphone to a
(possibly untrusted) cloud, further reducing the energy footprint of
authentication. The proposed research will thus make privacy-preserving active
authentication practical on smartphones, from both an energy and performance
perspective. This project is funded by the National Science Foundation.
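As a minimal illustration of possession-aware triggering, the sketch below flags re-authentication only after a motion event that may indicate the phone changed hands. The `PossessionMonitor` class, its spike threshold, and the spike heuristic itself are our own illustrative assumptions, not the project's actual protocol:

```python
import math

class PossessionMonitor:
    # Hypothetical sketch of possession-aware triggering (not the project's
    # actual protocol): run the costly authentication step only after an
    # acceleration spike suggests the phone may have changed hands.
    SPIKE_THRESHOLD = 25.0  # m/s^2, illustrative; handheld use stays near gravity (~9.8)

    def __init__(self):
        self.needs_auth = False

    def on_accel_sample(self, ax, ay, az):
        # Flag re-authentication when acceleration magnitude spikes well
        # above gravity (e.g., the phone is grabbed, dropped, or set down).
        if math.sqrt(ax * ax + ay * ay + az * az) > self.SPIKE_THRESHOLD:
            self.needs_auth = True
        return self.needs_auth

    def authenticated(self):
        # Clear the flag once the user has successfully re-authenticated.
        self.needs_auth = False
```

Between spikes the monitor stays quiet, so no authentication protocol (and none of its energy cost) is incurred while possession is undisturbed.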

Hand Movement, Orientation, and Grasp (HMOG) is a set of behavioral features to
continuously authenticate smartphone users. HMOG features unobtrusively capture
subtle micro-movement and orientation dynamics resulting from how a user
grasps, holds, and taps on the smartphone. In this project, we evaluated
authentication and biometric key generation (BKG) performance of HMOG features
on data collected from 100 subjects typing on a virtual keyboard under two
conditions: sitting and walking. We achieved authentication
EERs as low as 7.16% (walking) and 10.05% (sitting) when we combined HMOG, tap,
and keystroke features. We performed experiments to investigate why HMOG
features perform well during walking. Our results suggest that this is due to
the ability of HMOG features to capture distinctive body movements caused by
walking, in addition to the hand-movement dynamics from taps. With BKG, we
achieved EERs of 15.1% using HMOG combined with taps. In comparison, BKG using
tap, key hold, and swipe features had EERs between 25.7% and 34.2%. We also
analyzed the energy consumption of HMOG feature extraction and computation. Our
analysis shows that HMOG features extracted at a 16 Hz sensor sampling rate
incurred a minor energy overhead of 7.9% without sacrificing authentication
accuracy.
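The equal error rate (EER) figures above denote the operating point at which the false accept rate equals the false reject rate. A minimal sketch of how an EER can be computed from genuine-user and impostor similarity scores (the scores in the usage example are made up for illustration, not from the study's data):

```python
import numpy as np

def equal_error_rate(genuine_scores, impostor_scores):
    # Sweep decision thresholds over all observed scores; a higher score
    # means "more likely the genuine user". The EER is read off where the
    # false accept rate (FAR) and false reject rate (FRR) cross.
    genuine = np.asarray(genuine_scores, dtype=float)
    impostor = np.asarray(impostor_scores, dtype=float)
    best_gap, eer = float("inf"), 1.0
    for t in np.sort(np.concatenate([genuine, impostor])):
        far = np.mean(impostor >= t)  # impostor samples wrongly accepted
        frr = np.mean(genuine < t)    # genuine samples wrongly rejected
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2.0
    return eer
```

With perfectly separated score distributions the EER is 0; the more the genuine and impostor distributions overlap, the closer it moves toward values like those reported above.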