
MIT XR Hackathon 2019
Our project, “Accessible Locomotion WebXR,” won 3 prizes at the MIT XR Hackathon 2019.
My role
Designer, tinkerer, team coffee deliverer, and maker of smiles.
What was the process like?
A hackathon-paced project over the course of 4 days, with a strongly user-centered design process and constant communication with team members on site.
Who is our user?
Any non-body-abled user with an interest in XR experiences.
Outcome
Our project was one of the most awarded at the MIT XR Hackathon, winning 3 different prizes: for accessibility, for best use of the Vive Focus, and for Wayfair’s sponsorship challenge.
Being selected as one of the participants in the 2019 MIT XR Hackathon, called Reality Virtually Hack, was exciting enough, but once I got there and met my future team I was thrilled. Teams were formed on the spot and ideas were floating wildly in the air. My team was Roland Dubois, a multitalented WebXR creator and designer; Jan Kalfus, a German architect; and Selena De Leon, a polymath physicist and humanist. I felt overwhelmed with joy and humbled to be on such a talented team.
Why Accessible Locomotion?
Do you believe that XR is a medium that is currently enjoyable by everyone?
The XR industry has been accelerating over the past 5 years, adding degrees of freedom, higher resolution, better graphics, and more processing power, but sometimes failing to meet the accessibility needs of users who are non-body-abled, non-sighted, or hard of hearing.
Most headsets, XR applications, and games are designed exclusively for full-body-abled users; we wanted to design a locomotion system specifically with everyone else in mind.

Our goal was to establish a new binary-control standard in XR, so non-body-abled users can access and enjoy XR experiences using already existing inputs, such as sip-and-puff.
Why WebXR?
WebXR is in constant evolution. Many would not see WebXR as a dominant medium, but our team firmly believed that standardized WebXR protocols will play a massive role in the years to come, especially after the arrival of 5G.
WebXR targets a wide array of XR devices (Oculus, Vive, mobile, Magic Leap, etc.). The spec has been around since 2017 and has seen widespread community adoption (releases of A-Frame and three.js use or integrate well with WebXR). New social applications like Mozilla Hubs keep the medium moving forward.
We wanted to create a tool that was as democratic as possible: no entry barrier, open to everyone. WebXR suited our needs for openness, accessibility, and inclusiveness.

What we built
We created an A-Frame custom component that can transform simple inputs into a locomotion and selection system. We also developed a demo in A-Frame that shows how a single input from the user's mouth (sip-and-puff) can navigate a virtual experience.
Our project derives a binary input from the mechanics of sip-and-puff: a technology created in the 1960s for individuals who are unable to use their limbs and therefore use their mouths to “sip” or “puff” air into a device (usually a straw or tube) linked to hardware or software that lets them carry out day-to-day functions.
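For illustration, here is a minimal sketch (not our actual hackathon code) of how a single binary signal can be wrapped as an A-Frame custom component. It uses a spacebar press as a stand-in for the sip-and-puff signal and simply re-emits it as short/long events for other components to consume; the event names and threshold are assumptions.

```js
// Minimal sketch, not the actual hackathon component: wrap one binary signal
// (spacebar standing in for sip-and-puff) as an A-Frame component.
AFRAME.registerComponent('binary-input', {
  schema: {
    longPressMs: { type: 'number', default: 600 } // threshold for a "long" signal (assumed value)
  },
  init: function () {
    this.pressStart = 0;
    window.addEventListener('keydown', (e) => {
      if (e.code === 'Space' && !this.pressStart) this.pressStart = performance.now();
    });
    window.addEventListener('keyup', (e) => {
      if (e.code !== 'Space' || !this.pressStart) return;
      const duration = performance.now() - this.pressStart;
      this.pressStart = 0;
      // Re-emit the raw signal as a semantic event; other components decide what it means.
      this.el.emit(duration >= this.data.longPressMs ? 'binary-long' : 'binary-short');
    });
  }
});
```

Attached to the scene (e.g. `<a-scene binary-input>`), this gives downstream components a single, device-agnostic pair of events.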
We used JavaScript and A-Frame to create the VR experience, as well as Maya for reducing poly count and editing models, Rhino for building free models to scale, and Google Drive for notes and the slide presentation. All models are integrated into A-Frame.

Input and Output
We opted for gaze-based input: an interaction mechanism in XR where a reticle/cursor is fixed in front of the user's eyes at roughly arm's length and moves with the head's rotation. Gaze-based interaction is the default and the fallback for mobile XR when no other input device is available.
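In A-Frame, such a gaze reticle is typically just a cursor entity parented to the camera, so it stays fixed in view and follows head rotation. A small illustrative snippet (the selector and timing values are assumptions, not our exact scene):

```js
// Illustrative only: attach a gaze cursor to the active camera so it follows head rotation.
const cameraEl = document.querySelector('[camera]');  // assumes a default A-Frame camera entity
const reticle = document.createElement('a-cursor');
reticle.setAttribute('fuse', 'true');                 // enable dwell-based ("fuse") selection
reticle.setAttribute('fuse-timeout', '1500');         // dwell time in ms (example value)
cameraEl.appendChild(reticle);
```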
Our binary-control component had to enable both exploration and interaction in virtual worlds, so we distinguished two different interaction paradigms and split them into separate input modes (see the sketch after the list):
- Locomotion mode
- Interaction mode
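As a rough sketch of how both modes can hang off a single binary signal (assuming the hypothetical binary-short/binary-long events from the component above and a camera rig entity with id rig, neither of which is from our actual code):

```js
// Rough sketch, assuming the hypothetical binary-input events above:
// a long signal toggles the mode, a short signal acts within the current mode.
AFRAME.registerComponent('binary-modes', {
  init: function () {
    this.mode = 'locomotion';
    this.rig = document.querySelector('#rig');        // assumed camera rig entity
    this.camera = document.querySelector('[camera]');
    this.el.addEventListener('binary-long', () => {
      this.mode = this.mode === 'locomotion' ? 'interaction' : 'locomotion';
    });
    this.el.addEventListener('binary-short', () => {
      if (this.mode === 'locomotion') {
        // Locomotion mode: take a small step along the gaze direction.
        const dir = new THREE.Vector3();
        this.camera.object3D.getWorldDirection(dir);            // camera's +Z axis in world space
        this.rig.object3D.position.addScaledVector(dir, -0.5);  // cameras look down -Z
      } else {
        // Interaction mode: "click" whatever the gaze cursor is currently over
        // (peeks at the cursor component's tracked intersection).
        const cursorEl = document.querySelector('a-cursor');
        const target = cursorEl && cursorEl.components.cursor.intersectedEl;
        if (target) target.emit('click');
      }
    });
  }
});
```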

Demo
Once we had our binary component ready, we had to find a usability example and showcase how it worked.
I personally raised the point that wheelchair accessibility issues are more common in older buildings, which were not built to modern building-code guidelines, and that we should focus on a real, working-class apartment.
So, as a use-case demo, we built a middle-class apartment environment designed by our architect Jan and created an experience that quadriplegic users can navigate and explore with a sip-and-puff: they can inspect furniture that may not be the size or height they want in their living space, or that might restrict their wheelchair navigation. The demo also serves as an empathetic insight into how people in wheelchairs experience an environment, and how it feels to live with limited embodied agency and interaction in real life.
Outcome
Among 433 hackers in 110 teams at the world’s largest, most diverse XR hackathon (January 17–21, 2019, at the MIT Media Lab, Boston), our project won 3 awards:
- Wayfair Way-more
- Best Application For Accessibility
- Best Use of an HTC Vive Focus
We created a live demo that you can experience (without a sip-and-puff): Try the live demo.
We showcased our prototype at the Hackathon Expo and were happy to see that, thanks to their simplicity, the binary input controls generally took users of any age group only a few minutes to get used to.
Next Steps
We partnered with Thomas Logan and the EqualEntry network and are reaching out to quadriplegic testers. Component usability testing and fine-tuning lie ahead. Very exciting!