Ru Wang / 王儒

PhD student at UW-Madison

ru [dot] wang [at] wisc [dot] edu

Computer Sciences
University of Wisconsin-Madison


I’m a third-year PhD student at the University of Wisconsin–Madison, currently working with Prof. Yuhang Zhao. My research interests include Human-Computer Interaction, Accessibility, Mental Health, and AR/VR Technology. I’m interested in designing real-time, intention-aware interventions to improve people’s productivity and mental well-being.

Before Madison, I received my MS from UCSD, where I worked with Prof. Nadir Weibel and Prof. Xinyu Zhang. Prior to UCSD, I received my BS from Shanghai Jiao Tong University.


GazePrompt: Enhancing Low Vision People’s Reading Experience with Gaze-Aware Augmentations

Ru Wang, Zach Potter, Yun Ho, Daniel Killough, Linda Zeng, Sanbrita Mondal, Yuhang Zhao

Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems (CHI ‘24)

Practices and Barriers of Cooking Training for Blind and Low Vision People (Poster)

Ru Wang, Nihan Zhou, Tam Nguyen, Sanbrita Mondal, Bilge Mutlu, Yuhang Zhao

The 25th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS ‘23)

Understanding How Low Vision People Read using Eye Tracking

Ru Wang, Linda Zeng, Xinyong Zhang, Sanbrita Mondal, Yuhang Zhao

Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ‘23)

Characterizing Barriers and Technology Needs in the Kitchen for Blind and Low Vision People

Ru Wang, Nihan Zhou, Tam Nguyen, Sanbrita Mondal, Bilge Mutlu, Yuhang Zhao

arXiv 2023

Peer Attention Enhances Student Learning

Songlin Xu, Dongyin Hu, Ru Wang, Xinyu Zhang

PNAS (Under Review)

“I Expected to Like VR Better”: Evaluating Video Conferencing and Desktop Virtual Platforms for Remote Classroom Interactions

Matin Yarmand, Ru Wang, Haowei Li, Nadir Weibel

ISLS 2024 (Under Review)

ARTEMIS: A Collaborative Mixed-Reality Environment for Immersive Surgical Telementoring

Danilo Gasques, Janet Johnson, Tommy Sharkey, Yuanyuan Feng, Ru Wang, Zhuoqun Robin Xu, Enrique Zavala, Yifei Zhang, Wanze Xie, Xinming Zhang, Konrad Davis, Michael Yip, Nadir Weibel

Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ‘21)

Approximate Random Dropout for DNN training acceleration in GPGPU

Zhuoran Song, Ru Wang, Dongyu Ru, Zhenghao Peng, Hongru Huang, Hai Zhao, Xiaoyao Liang, Li Jiang

2019 Design, Automation & Test in Europe Conference & Exhibition (DATE 2019)


Designing AR intervention to support people with OCD

Advised by Prof. Yuhang Zhao, UW-Madison

We seek to understand the core strategies of effective OCD therapies (e.g., ERP and ACT) and design real-time AR interventions that support the mental health of people with OCD outside of therapy sessions.

Gaze‐Aware Visual Augmentations to Enhance Low Vision People's Reading Experience

Advised by Prof. Yuhang Zhao, UW-Madison

We improve the gaze data collection process to make eye tracking accessible to low vision users, and use eye tracking to identify low vision people’s unique gaze patterns during reading. Based on these findings, we design gaze-aware visual augmentations that enhance low vision users’ reading experience. Our system facilitates line switching/following and difficult word recognition.

AR Systems to Facilitate Safe Cooking for People with Visual Impairment

Advised by Prof. Yuhang Zhao, UW-Madison

We conducted a contextual inquiry study to observe how people with visual impairments (PVI) cook in their own kitchens, and interviewed rehabilitation professionals about kitchen-related training. Combining the two studies, we seek to understand PVI’s unique challenges and needs in the kitchen and identify design considerations for future assistive technology.

CogTeach: Real-Time Cognitive Feedback for Online Instruction

Advised by Prof. Xinyu Zhang, UCSD

We propose CogTeach, an online instruction platform that collects students’ eye gaze and facial expressions to infer their cognitive states (confusion and engagement). The system then visualizes the students’ cognitive states and fixations on the instructor’s side in real time to help improve instruction and decision making.

ARTEMIS (Augmented Reality Technology-Enabled reMote Integrated Surgery)

Advised by Prof. Nadir Weibel, UCSD

ARTEMIS is a collaborative system for surgical telementoring. The surgical field is recreated for a remote expert in VR, and the remote expert’s annotations and avatar are displayed in the novice’s field of view in real time using AR. ARTEMIS supports remote surgical mentoring of novices through synchronous point, draw, and look affordances and asynchronous video clips.


Winner of the CHI 2021 Student Volunteer (SV) happi design contest!