Overview

The Computational Health and Interaction (CHAI) lab is led by Prof. Alex Mariakakis in the Department of Computer Science at the University of Toronto. We focus on using ubiquitous and emerging technologies like smartphones, wearables, and virtual reality to address problems related to people’s health and wellbeing. We develop new sensing technologies for measuring physiological, behavioral, and contextual health indicators, and we also take the time to understand the implications of these technologies in people’s hands. We proactively engage relevant stakeholders like clinicians, global health organizations, and industry to ensure that our designs are sensitive to their needs. Our group has expertise in multiple subareas of computer science, particularly machine learning, signal processing, computer vision, and human-computer interaction. We therefore publish our work in various international venues such as IMWUT, CHI, and PerCom.

Examples of Ongoing Projects

Our work can be loosely split into three categories: active sensing, passive sensing, and design. The projects listed below represent just a subset of our group's ongoing work. To learn more about these projects or any others, please reach out to the project leads or to Prof. Alex Mariakakis.

Active Sensing

These projects involve assessments that require people to perform a particular task so that something can be learned about their health.

Virtual Reality for Cognitive Exams
Project Lead(s): Andrii Lenyshyn

Goal: To use virtual reality to improve upon traditional paper-based cognitive assessments like the Flanker test.

Earbud-Based Cardiac Assessment
Project Lead(s): Ken Christofferson

Goal: To assess heart health using audio transduced in the ear canal.

Hyperspectral Retinal Imaging with Smartphones
Project Lead(s): Dhruv Verma

Goal: To transform the smartphone into a hyperspectral ophthalmoscope for screening cases of retinal hypoxia.

Passive Sensing

These projects involve the use of sensors to detect digital health biomarkers without requiring direct involvement from the people being monitored.

Respiratory Assessments Through Speech Analysis
Project Lead(s): Sejal Bhalla, Salaar Liaqat

Goal: To identify symptoms related to chronic obstructive pulmonary disease (COPD) in speech extracted from continuously recorded smartwatch audio.

Design

These projects involve studying people's experiences as they interact with health-related technologies.

Context-Aware Alarms
Project Lead(s): Ian Ruffolo

Goal: To reduce alarm fatigue in hospitals by combining physiological sensor data with electronic health records to create smarter alarms.

Pre-Consultation Chatbots
Project Lead(s): Brenna Li

Goal: To leverage chatbots as a way of gathering background information before face-to-face interactions between doctors and patients.

Supporting Menstrual Health Tracking
Project Lead(s): Blue Lin

Goal: To create accurate, inclusive, and accessible menstrual trackers through passive physiological monitoring and user-centered design.