PhD Dissertation Research

Designing an Enabling Computer Interface for Physical Rehabilitation Purposes

On this page you can learn about my award-winning PhD dissertation (Northeastern University).

My research consisted of two main components:

  1. A User Experience Survey (UXS) targeting computer users with various levels of physical disability. This survey sought to quantify the impact of physical disability on computer use in terms of positive affect, negative affect, competence, control, and accessibility.

  2. The design and development of an Adaptive Interface Design (AID) that promotes Accessibility, Usability, and Inclusion. I applied agile methods to achieve this goal, executing iterative hardware and software development phases and usability research cycles. The hardware comprised an eye-tracker and a data-glove, and the software consisted of LabVIEW and UDP communication on the back-end, combined with a simple video game built in Unity.

For the virtual environment test bed, I started with Unity's beautiful demo island and expanded it with several challenges, such as a coconut-throwing game, a bowling game, a puzzle game, and a helicopter ride. All of the interactions were triggered using the AID. By combining the input of an eye-tracking device and a self-made data-glove, users are able to explore and manipulate their virtual surroundings through natural eye fixations and intuitive hand gestures.
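To give a rough idea of how the two input streams work together, here is a minimal sketch of the kind of fusion logic involved. It is written in Python for readability rather than the LabVIEW/Unity code used in the actual system, and the thresholds, field names, and object lookup below are illustrative assumptions only.

```python
# Illustrative sketch (not the dissertation code): combining a gaze fixation
# with a data-glove "grab" gesture to trigger an interaction on the object
# the user is looking at. All names and thresholds are assumptions.

from dataclasses import dataclass

@dataclass
class GloveSample:
    finger_flex: float     # 0.0 (open hand) .. 1.0 (closed fist)
    thumb_pressure: float  # 0.0 .. 1.0

@dataclass
class GazeSample:
    x: float            # normalized screen coordinates, 0..1
    y: float
    fixation_ms: float  # how long the gaze has dwelled near this point

FIXATION_THRESHOLD_MS = 300.0  # dwell time that counts as a fixation
GRAB_FLEX_THRESHOLD = 0.7      # flex level interpreted as a "grab" gesture

def object_under_gaze(gaze: GazeSample):
    """Placeholder for a ray-cast / hit-test into the virtual scene."""
    return {"name": "coconut", "x": gaze.x, "y": gaze.y}  # illustrative only

def update(gaze: GazeSample, glove: GloveSample):
    """Trigger an interaction only when a fixation and a grab gesture co-occur."""
    if gaze.fixation_ms < FIXATION_THRESHOLD_MS:
        return None  # user is still scanning, not fixating
    target = object_under_gaze(gaze)
    if target and glove.finger_flex > GRAB_FLEX_THRESHOLD:
        return ("grab", target["name"])  # e.g., pick up the coconut
    return None

print(update(GazeSample(0.5, 0.5, 420.0), GloveSample(0.85, 0.2)))
```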

The remainder of this page includes several captures, schematics, and demonstration videos.

Picture of me receiving the ‘Excellence in Research’ Award from Stephen W. Director and President Joseph E. Aoun at the 2012 Research, Innovation, and Scholarship Expo (RISE:2012).

Labeled system overview: display, eye tracker, and data glove.

Data glove schematic explaining sensor placement and operations.

System overview including the data-glove, DAQ, eye-tracker, monitor, and speakers.

Diagram showing how eye fixations automatically bring objects of interest into center view.

Overview of game mechanics in the virtual environment: throwing, puzzle, bowling.

Northeastern University

Research Innovation & Scholarship Expo (RISE)

Demo Video

Here you can watch a brief demonstration video for my project. In this video I discuss the interface design that I developed and its implications for human-computer interaction, especially for those who experience physical disability. I apologize for the low quality and rough edit: the time limit for these demonstration videos was five minutes, and the video was put together in a bit of a rush to finalize it in time for the research expo.

Data Glove

UDP Demonstration Video

This video demonstrates the wireless (UDP) connection I built to enable communication between the data-glove and a remote computer. In this demo, I designed a screen on my iPad to reflect the data-glove signals (flexing of the fingers and pressure applied to the thumb) in real time. The underlying idea is that a remote physician could review a patient's performance without having to be physically present (e.g., in combination with a video call).
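For readers curious about the mechanics, the sketch below shows the kind of fire-and-forget UDP exchange this demo relies on. It is written in Python for readability; the glove side of the actual system ran in LabVIEW, and the address, port, packet layout, and field names here are assumptions rather than the real protocol.

```python
# Minimal sketch of a UDP link for streaming glove readings to a remote
# display. The packet format and addresses are illustrative assumptions.

import json
import socket

REMOTE_ADDR = ("192.168.1.50", 9000)  # hypothetical address of the remote display app

def send_glove_sample(sock, finger_flex, thumb_pressure):
    """Encode one glove reading and push it over UDP (no handshake, fire-and-forget)."""
    packet = json.dumps({"flex": finger_flex, "pressure": thumb_pressure}).encode()
    sock.sendto(packet, REMOTE_ADDR)

def receive_loop(port=9000):
    """Remote side: listen for packets and update the on-screen gauges."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _ = sock.recvfrom(1024)
        sample = json.loads(data)
        print(f"flex={sample['flex']:.2f}  pressure={sample['pressure']:.2f}")

if __name__ == "__main__":
    # Sender side only; run receive_loop() on the remote machine instead.
    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_glove_sample(sender, 0.42, 0.10)
```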

Data Glove

Object Control

This video illustrates how the data-glove can be used to control the movement of a virtual helicopter. In this scenario, the helicopter's path is predetermined, but the user controls its speed intuitively by altering the amount of flex in the fingers (more flex means higher speed). I added this feature to the environment to provide an additional game element (during the helicopter ride the player visually scans the island for collectible tokens) and to demonstrate the interface's applicability to other control modalities.

In this case the user is not manipulating their virtual avatar but rather an on-screen object (in third-person view), once again using intuitive hand gestures and eye movements.
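The speed control itself boils down to a simple mapping from overall finger flex to forward speed along the predetermined path. The sketch below illustrates the idea in Python; the sensor ranges, maximum speed, and function names are illustrative assumptions, not taken from the actual implementation.

```python
# Illustrative flex-to-speed mapping: the helicopter follows a fixed path,
# and the glove's overall finger flex only scales how fast it advances.

RAW_FLEX_MIN = 0.05  # assumed sensor reading with the hand fully open
RAW_FLEX_MAX = 0.95  # assumed sensor reading with the hand fully closed
MAX_SPEED = 12.0     # assumed maximum helicopter speed (units per second)

def flex_to_speed(raw_flex: float) -> float:
    """Normalize the raw flex reading and map it linearly to speed (more flex = faster)."""
    normalized = (raw_flex - RAW_FLEX_MIN) / (RAW_FLEX_MAX - RAW_FLEX_MIN)
    normalized = max(0.0, min(1.0, normalized))  # clamp to [0, 1]
    return normalized * MAX_SPEED

def advance_along_path(position: float, raw_flex: float, dt: float) -> float:
    """Move the helicopter along its predetermined path by speed * dt."""
    return position + flex_to_speed(raw_flex) * dt

pos = 0.0
for reading in (0.1, 0.5, 0.9):  # hand gradually closing
    pos = advance_along_path(pos, reading, dt=0.016)
    print(f"flex={reading:.2f} -> speed={flex_to_speed(reading):.2f}, pos={pos:.3f}")
```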
