Modeling Mobile Interface Tappability Using Crowdsourcing and Deep Learning

Type: Article

Publication Date: 2019-04-29

Citations: 46

DOI: https://doi.org/10.1145/3290605.3300305


Abstract

Tapping is an immensely important gesture in mobile touchscreen interfaces, yet people still frequently must learn through trial and error which elements are tappable. Predicting human behavior for this everyday gesture can help mobile app designers understand an important aspect of their apps' usability without having to run a user study. In this paper, we present an approach for modeling the tappability of mobile interfaces at scale. We conducted a large-scale data collection of interface tappability over a rich set of mobile apps using crowdsourcing, and computationally investigated a variety of signifiers that people use to distinguish tappable from not-tappable elements. Based on this dataset, we developed and trained a deep neural network that predicts how likely a user is to perceive an interface element as tappable. Using the trained tappability model, we developed TapShoe, a tool that automatically diagnoses mismatches between the tappability of each element as perceived by a human user (predicted by our model) and the intended or actual tappable state specified by the developer or designer. Our model matched human perception in identifying tappable UI elements with reasonable accuracy: a mean precision of 90.2% and recall of 87.0%. The tappability model and TapShoe were well received by designers in an informal evaluation with 7 professional interaction designers.
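The reported accuracy figures are the standard binary-classification metrics for the "tappable" class. As a quick reference, here is a minimal sketch of how precision and recall are computed from confusion-matrix counts; the counts below are hypothetical, chosen only to illustrate the arithmetic, and are not the paper's data:

```python
def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Precision and recall for a binary 'tappable' classifier.

    tp: elements correctly predicted tappable
    fp: elements predicted tappable that users perceived as not tappable
    fn: tappable elements the model missed
    """
    precision = tp / (tp + fp)  # of everything flagged tappable, how much was right
    recall = tp / (tp + fn)     # of everything truly tappable, how much was found
    return precision, recall

# Hypothetical counts for illustration only:
p, r = precision_recall(tp=902, fp=98, fn=135)
print(f"precision={p:.1%} recall={r:.1%}")
```

Precision penalizes false alarms (elements that look tappable but are not), while recall penalizes misses, which is why the paper reports both rather than a single accuracy number.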

Locations

  • arXiv (Cornell University)

Similar Works

  • Modeling Mobile Interface Tappability Using Crowdsourcing and Deep Learning (2019). Amanda Swearngin, Yang Li
  • Predicting and Explaining Mobile UI Tappability with Vision Modeling and Saliency Analysis (2022). Eldon Schoop, Xin Zhou, Gang Li, Zhourong Chen, Bjoern Hartmann, Yang Li
  • TapNet: The Design, Training, Implementation, and Applications of a Multi-Task Learning CNN for Off-Screen Mobile Input (2021). Michael Xuelin Huang, Yang Li, Nazneen Nazneen, Alexander T. Chao, Shumin Zhai
  • Never-ending Learning of User Interfaces (2023). Jason Wu, Rebecca Krosnick, Eldon Schoop, Amanda Swearngin, Jeffrey P. Bigham, Jeffrey Nichols
  • Tappy Plugin for Figma: Predicting Tap Success Rates of User-Interface Elements under Development for Smartphones (2024). Shota Yamanaka, Hiroki Usuba, Junichi Sato, Naomi Sasaya, Fumiya Yamashita, Shuji Yamaguchi
  • MotorEase: Automated Detection of Motor Impairment Accessibility Issues in Mobile App UIs (2024). Arun Krishna Vajjala, S M Hasan Mansur, Justin Jose, Kevin Moran
  • TapType: Ten-finger text entry on everyday surfaces via Bayesian inference (2022). Paul Streli, Jiaxi Jiang, Andreas Fender, Manuel Meier, Hugo Romat, Christian Holz
  • Predicting Human Performance in Vertical Menu Selection Using Deep Learning (2018). Yang Li, Samy Bengio, Gilles Bailly
  • Deep Learning for Enhanced Scratch Input (2021). Aman Bhargava, Alice X. Zhou, Adam Carnaffan, Stephen Mann
  • Predicting the usability of mobile applications using AI tools: the rise of large user interface models, opportunities, and challenges (2024). Abdallah Namoun, Ahmed Alrehaili, Zaib Un Nisa, Hani Almoamari, Ali Tufail
  • MUD: Towards a Large-Scale and Noise-Filtered UI Dataset for Modern Style UI Modeling (2024). Sidong Feng, Suyu Ma, Han Wang, David S. Kong, Chunyang Chen
  • Single-tap Latency Reduction with Single- or Double-tap Prediction (2023). N. Nishida, Kaori Ikematsu, Junichi Sato, Shota Yamanaka, Kota Tsubouchi

Works That Cite This (26)

  • StateLens (2019). Anhong Guo, Junhan Kong, Michael L. Rivera, Frank F. Xu, Jeffrey P. Bigham
  • Video2Action: Reducing Human Interactions in Action Annotation of App Tutorial Videos (2023). Sidong Feng, Chunyang Chen, Zhenchang Xing
  • Interactive Flexible Style Transfer for Vector Graphics (2023). Jeremy Warner, Kyu Won Kim, Bjoern Hartmann
  • Screen2Words: Automatic Mobile UI Summarization with Multimodal Learning (2021). Bryan Wang, Gang Li, Xin Zhou, Zhourong Chen, Tovi Grossman, Yang Li
  • UIClip: A Data-driven Model for Assessing User Interface Design (2024). Jason Wu, Yi-Hao Peng, Amanda Li, Amanda Swearngin, Jeffrey P. Bigham, Jeffrey Nichols
  • Artificial Intelligence for Human Computer Interaction: A Modern Approach (2021). Forrest Huang, Eldon Schoop, David Ha, Jeffrey Nichols, John Canny
  • Modeling Mobile Interface Tappability Using Crowdsourcing and Deep Learning (2021). Amanda Swearngin, Yang Li
  • ActionBert: Leveraging User Actions for Semantic Understanding of User Interfaces (2021). Zecheng He, Srinivas Sunkara, Xiaoxue Zang, Ying Xu, Lijuan Liu, Nevan Wichers, Gabriel Schubiner, Ruby Lee, Jindong Chen
  • EvIcon: Designing High‐Usability Icon with Human‐in‐the‐loop Exploration and IconCLIP (2023). I‐Chao Shen, Fu‐Yin Cherng, Takeo Igarashi, Wen‐Chieh Lin, Bing‐Yu Chen