Publications

Watch Your Mouth: Silent Speech Recognition with Depth Sensing (Honorable Mention Award)

Xue Wang, Zixiong Su, Jun Rekimoto, Yang Zhang (CHI 2024)

[Video] [DOI] [PDF] [Github]

WheelPose: Data Synthesis Techniques to Improve Pose Estimation Performance on Wheelchair Users

William Huang, Sam Ghahremani, Siyou Pei, Yang Zhang (CHI 2024)

[Video] [DOI] [PDF] [Github]

UI Mobility Control in XR: Switching UI Positionings between Static, Dynamic, and Self Entities

Siyou Pei, David Kim, Alex Olwal, Yang Zhang, Ruofei Du (CHI 2024)

[Video] [DOI] [PDF]

TextureSight: Texture Detection for Routine Activity Awareness with Wearable Laser Speckle Imaging

Xue Wang, Yang Zhang (IMWUT 2024)

[Video] [DOI] [PDF]

CubeSense++: Smart Environment Sensing with Interaction-Powered Corner Reflector Mechanisms

Xiaoying Yang, Jacob Sayono, Yang Zhang (UIST 2023)

[Video] [DOI] [PDF]

Headar: Sensing Head Gestures for Confirmation Dialogs on Smartwatches with Wearable Millimeter-Wave Radar

Xiaoying Yang, Xue Wang, Gaofeng Dong, Zihan Yan, Mani Srivastava, Eiji Hayashi, Yang Zhang (IMWUT 2023)

[Video] [DOI] [PDF]

Embodied Exploration: Facilitating Remote Accessibility Assessment for Wheelchair Users with Virtual Reality

Siyou Pei, Alexander Chen, Chen Chen, Mingzhe "Franklin" Li, Megan Fozzard, Hao-Yun Chi, Nadir Weibel, Patrick Carrington, Yang Zhang (ASSETS 2023)

[Video] [DOI] [PDF]

Interaction Harvesting: A Design Probe of User-Powered Widgets

John Mamish, Amy Guo, Thomas Cohen, Julian Richey, Yang Zhang, Josiah Hester (IMWUT 2023)

[DOI] [PDF]

E3D: Harvesting Energy from Everyday Kinetic Interactions Using 3D Printed Attachment Mechanisms

Abul Al Arabi, Xue Wang, Yang Zhang, Jeeeun Kim (IMWUT 2023)

[DOI] [PDF]

Bring Environments to People – A Case Study of Virtual Tours in Accessibility Assessment for People with Limited Mobility

Hao-Yun Chi, Jingzhen 'Mina' Sha, Yang Zhang (W4A 2023)

[Slides] [DOI] [PDF]

LaserShoes: Low-Cost Ground Surface Detection Using Laser Speckle Imaging

Zihan Yan, Yuxiaotong Lin, Guanyun Wang, Yu Cai, Peng Cao, Haipeng Mi, Yang Zhang (CHI 2023)

[Video] [Slides] [DOI] [PDF]

ForceSight: Non-Contact Force Sensing with Laser Speckle Imaging

Siyou Pei, Pradyumna Chari, Xue Wang, Xiaoying Yang, Achuta Kadambi, Yang Zhang (UIST 2022)

[Video] [Slides] [DOI] [PDF]

Freedom to Choose: Understanding Input Modality Preferences of People with Upper-body Motor Impairments for Activities of Daily Living

Mingzhe Li, Xieyang Liu, Yang Zhang, Patrick Carrington (ASSETS 2022)

[DOI] [PDF]

MiniKers: Interaction-Powered Smart Environment Automation

Xiaoying Yang, Jacob Sayono, Jess Xu, Jiahao "Nick" Li, Josiah Hester, Yang Zhang (IMWUT 2022)

[Slides] [DOI] [PDF] [Github]

SkinProfiler: Low-Cost 3D Scanner for Skin Health Monitoring with Mobile Devices

Zhiying Li, Tejas Viswanath, Zihan Yan, Yang Zhang (MobiSys Workshop 2022)

[DOI] [PDF]

Hand Interfaces: Using Hands to Imitate Objects in AR/VR for Expressive Interactions (Honorable Mention Award)

Siyou Pei, Alexander Chen, Jaewook Lee, Yang Zhang (CHI 2022)

[Video] [DOI] [PDF] [Github]

EmoGlass: an End-to-End AI-Enabled Wearable Platform for Enhancing Self-Awareness of Emotional Health

Zihan Yan, Yufei Wu, Yang Zhang, Anthony Chen (CHI 2022)

[DOI] [PDF]

FaceBit: Smart Face Masks Platform

Alexander Curtiss, Blaine Rothrock, Abu Bakar, Nivedita Arora, Jason Huang, Zachary Englhardt, Aaron-Patrick Empedrado, Chixiang Wang, Saad Ahmed, Yang Zhang, Nabil Alshurafa, Josiah Hester (IMWUT 2021)

[DOI] [PDF]

Nod to Auth: Fluent AR/VR Authentication with User Head-Neck Modeling

Xue Wang, Yang Zhang (CHI 2021 LBW)

[DOI] [PDF]

CubeSense: Wireless, Battery-Free Interactivity through Low-Cost Corner Reflector Mechanisms

Xiaoying Yang, Yang Zhang (CHI 2021 LBW)

[DOI] [PDF]

Duco: Autonomous Large-Scale Direct-Circuit-Writing (DCW) on Vertical Everyday Surfaces Using A Scalable Hanging Plotter

Tingyu Cheng, Bu Li, Yang Zhang, Yunzhi Li, Charles Ramey, Eui Min Jung, Yepu Cui, Sai Ganesh Swaminathan, Youngwook Do, Manos Tentzeris, Gregory D. Abowd, HyunJoo Oh (IMWUT 2021)

[Video] [DOI] [PDF]

Vibrosight++: City-Scale Sensing Using Existing Retroreflective Signs and Markers

Yang Zhang, Sven Mayer, Jesse T. Gonzalez, Chris Harrison (CHI 2021)

[Video] [DOI] [PDF]

Welcome to HiLab

We create sensing technologies for next-generation computing devices to perceive users and their environments. We envision a future where physical AI systems will actively adapt to our daily tasks in a practical, inclusive, and sustainable manner. To achieve this vision, we 1) advance sensing techniques that recognize events and user activities, 2) empower emerging computing devices with efficient and fluid interactions, and 3) engineer smart systems that harness user interaction as a source of power.

We are actively looking for students at all levels. If you are interested in working with us, please first read the FAQ section. If you want to join the lab as a PhD student, please apply to the PhD program at UCLA.

Lab news

April 18 2024
Yokohama, Japan
Yang accepted the invitation to serve as Subcommittee Chair for CHI 2025.
March 20 2024
Honolulu
Xue and William will present their first papers at CHI.
Oct 29 2023
San Francisco
HiLab attending UIST with one paper presentation and a few mobile demos.
Jul 21 2023
Los Angeles
HiLab and SsysArch successfully organized the Los Angeles Computing Circle (LACC) 2023 for local high school students.
Oct 30 2022
Bend
All hands attending UIST. The ForceSight demo received a Best Demo Honorable Mention.
Aug 5 2022
Bend
Xiaoying's first full paper is accepted at ACM IMWUT, and Siyou's second full paper is accepted at ACM UIST 2022.
Jul 28 2022
Los Angeles
Celebrating HiLab's first birthday.
Apr 18 2022
Los Angeles
Alexander Chen is invited to present his research at the UCLA Undergraduate Research Week.
Mar 27 2022
New Orleans
Hand Interfaces receives a Best Paper Honorable Mention Award at CHI 2022. Congrats to Siyou and the team!
Jan 2 2022
Los Angeles
FaceBit is featured on NSF research news.
Oct 1 2021
Los Angeles
Kicked off the first UCLA HCI meeting with Anthony's lab.
September 21 2021
Los Angeles
OptoSense (a collaboration with GaTech) received the IMWUT 2020 Distinguished Paper Award.
September 21 2021
Los Angeles
Vibrosight++ is a finalist in Fast Company's 2021 Innovation by Design Awards (Experimental category).
September 9 2021
Los Angeles
Siyou's first first-author paper was submitted to CHI 2022.
April 26 2021
Los Angeles
Xiaoying receives the Graduate Dean's Scholar Award.
February 18 2021
Yokohama, Japan
Xue's work on fluent AR/VR authentication and Xiaoying's work on reflector-based IoT interactive sensing will be presented at CHI LBW 2021.
January 22 2021
Delft, Netherlands
Riku's work on Millimeter-Wave Interactive Sensing will be presented at CHIIoT.
January 9 2021
Pittsburgh
Three summer projects submitted! It has been a joyful journey thanks to all the diligent and talented interns.
December 31 2020
Pittsburgh
Goodbye 2020. You will not be missed. Looking forward to the new adventure in 2021!
November 4 2020
Tokyo, Japan
Congratulations to Riku for being awarded the Funai Overseas Scholarship.
August 11 2020
Pittsburgh
The HiLab logo is finalized. It was co-designed by Liang He; the shape of the "H" was inspired by the appearance of Royce Hall.
July 8 2020
Pittsburgh
Summer projects in full swing.
May 5 2020
Pittsburgh
Lab website domain registered (https://hilab.dev). Beta version is up.

Current Team

Yang Zhang

Lab Director

William Huang

PhD Student

Xue Wang

PhD Student

Xiaoying Yang

PhD Student

Siyou Pei

PhD Student

Richard Lin

Visiting Researcher

Alec Hartman

Research Intern

Krish Patel

Research Intern

Cameron Fish

Research Intern

Muhan Zhang

Research Intern

Jacob Sayono

Research Intern

Alexander Chen

Research Intern

Alumni

Zixiong Su

Visiting Researcher (Next stop: Visiting Researcher at Meta Reality Labs)

Ronak Kaoshik

Research Intern

Jess Xu

Research Intern

Khushbu Pahwa

Research Intern (Next stop: PhD student at RICE)

Hao-Yun Chi

Visiting Researcher

Zihan Yan

Research Intern (Next stop: PhD student at UIUC)

Zhiying "Steven" Li

Research Intern (Next stop: PhD student at MSU)

Jaewook Lee

Research Intern (Next stop: PhD student at UW)

Riku Arakawa

Research Intern (Next stop: PhD student at CMU)

Tejas Viswanath

Research Intern

Chengshuo Xia

Visiting Researcher

Frequently asked questions

Can I join the lab as a PhD student?

Yes, HiLab is taking new students! You should apply to the ECE department and give Yang a heads-up. In general, we look for students interested in Human-Computer Interaction research who have technical backgrounds. Since the graduate application process at UCLA is highly competitive, we encourage you to reach out and discuss your application with Yang beforehand. Reaching out and working with us is always a great way to initiate collaborations -- if you are interested in our research, we most likely work in the same research field, and the chances are good that we will be your future collaborators, colleagues, thesis committee members, or recommendation letter writers. So we would love to know about you!

Other opportunities to work with the team?

Working with students is always exciting to us, and we are constantly looking for students who show strong self-motivation and can stay persistent through the ups and downs of tackling research problems. A technical background in EE/CS is a plus but not a must. That said, many of our active projects require a certain level of experience in programming, embedded system development, and circuit design. The best way to reach out is to send us your CV, along with ideas for one or a few projects you want to work on, focusing on 1) why it is an important problem, 2) how previous research has addressed it, and 3) what new contributions this project could introduce. This process gives us a sense of what excites you, which helps with the matchmaking.

I want to learn more about Human-Computer Interaction (HCI). How can I get started?

HCI is a method, and it is also a discipline. The ultimate question HCI researchers aim to answer is how we can make computing technologies better for humans - users in application contexts spanning AR/VR, IoT, wearables, haptics, digital health, accessibility, and beyond. At HiLab, HCI means 1) we study human-related signals, 2) we focus on technologies that have user-facing applications, and 3) we build systems with users in the loop and evaluate these systems with real user populations. If you want to learn more about HCI, see Scott Klemmer's excellent course.

What venues does the lab publish at?

We primarily publish at ACM conferences and journals on Human-Computer Interaction. Below is a list of venues where we have published.

Can we meet?

Sure! Though we enjoy meeting new students, our time is limited and primarily devoted to conducting research. If you want to schedule a meeting with a team member, please first send an email with your name, institution, the purpose of the meeting, potential time slots, and any other information you want to share. Lab members may have tight schedules, so please excuse us if your email does not receive a timely response.

What is the lab culture?

Be nice and be kind. That's what we expect all lab members to be. UCLA ECE has a wide spectrum of research focuses, from AI/ML and quantum computing to millimeter-wave and photon detection systems, which requires each member of the community to set personal interests aside when evaluating each other's research contributions. A great exemplar set of lab values around Ethics, Diversity, and Community can be found on Jennifer Mankoff's Make4all lab webpage. We also practice the reasonable person principle and follow UCLA's Standards of Ethical Conduct in the lab.

Contact us

HiLab is part of the ECE department in UCLA's Samueli School of Engineering. This website is actively being developed, so stay tuned! More information on the lab space and resources is coming. To contact the lab, please email yangzhang@ucla.edu.

Human-Centered Computing & Intelligent Sensing Lab

420 Westwood Plaza
Los Angeles, CA 90095

Email: yangzhang@ucla.edu