Conversational Access to User
Interfaces
People with visual impairments use screen readers to navigate graphical interfaces on devices such as laptops and mobile phones. But screen readers are inherently readers: they provide a one-way, linear stream of information. What if screen readers were intelligent? What if they could present information efficiently and intentionally? I investigate voice-based access to screens using design research and prototyping methodologies.
Supervised By:
Dr. Shiri Azenkot, Dr. Qian Yang
Perceptions and New Wave of
Voice Assistants
Personified voice assistants, such as Alexa and Siri, cultivate interesting power (im)balances between themselves and people with disabilities. To some, these agents may contribute to confidence and self-efficacy; to others, relying on a black-box system may be disheartening. I study perceptions of voice assistants and bridge these mental models to newer designs of voice assistants that "empower" users with disabilities.
Supervised By:
Dr. Shiri Azenkot
Microaggressive Experiences of People with Disabilities
Social media is a place where marginalized communities are susceptible to subtle forms of discrimination and harm, perpetuating experiences of feeling unwelcome. Through this study, we aim to understand how people with disabilities experience microaggressions (e.g., ableism, patronization) on social media platforms, and how such disability-specific microaggressions shape their social media experience.
Supervised By:
Dr. Aditya Vashistha, Dr. Megh Marathe
Self-Presentation in Social
Virtual Reality
(In brainstorming phase)
Accessible Mid-Air Haptics Design
Mid-air haptic interfaces enable rich 3D interactions but are inherently inaccessible to many users with disabilities. We designed an interactive simulation of a contactless elevator control panel with mid-air touch feedback and accessibility considerations. Despite being fully contactless, the controls are tactile and closely match the mental model of ordinary elevator buttons. Published at CHI '21 Interactivity.
Collaborators:
Tanay Singhal, University of Waterloo
Hands-Free Virtual Reality
People with neuromotor impairments can have difficulty operating virtual reality controllers, especially if they use a wheelchair, have limited fine motor ability, or have muscle spasticity. To this end, we explored the use of head orientation to onboard the Oculus and navigate menus in virtual reality.
Supervised By:
Dr. Steven Feiner, Columbia University