Headshot of Johann Wentzel.

Johann Wentzel

I'm a PhD candidate in the UWaterloo HCI Lab, supervised by Daniel Vogel. My work focuses on designing, implementing, and understanding interaction techniques that facilitate more inclusive computing.

My research process combines qualitative methods and co-design with prototyping new interfaces and interaction techniques, then evaluates them through controlled user studies. My PhD explores improving VR’s accessibility for people with motor disabilities by using more familiar input devices (game controllers, mice, etc.) to supplement or replace VR hand controllers.

Recently, I’ve been exploring how machine learning models of gaze tracking can aid the diagnosis of eye-movement conditions, how users can interact with AI for VR environment design, and how inferring user context can make remapping 3D motion in VR easier for both users and developers.

During my graduate studies, I have interned with Meta Reality Labs (CTRL-Labs), Microsoft Research (Ability Team), Autodesk Research (UI Research Group), and New York University (Future Reality Lab).


Timeline

Timeline of studies and career. Information also available in CV.

Publications

A Valve Index controller raycasting toward a white sphere.

A Comparison of Virtual Reality Menu Archetypes: Raycasting, Direct Input, and Marking Menus, TVCG 2024

Johann Wentzel, Matthew Lakier, Jeremy Hartmann, Falah Shazib, Géry Casiez, Daniel Vogel

PDF (Preprint) - IEEE - Blog


A person using an AR headset; emojis appear above their head.

More Than Input: Using the Gaze-Psychology Link for More Accessible Augmented Reality, CHI 2024 Workshop "Designing Inclusive Future Augmented Realities"

Alessandra Luz, Johann Wentzel

PDF



SwitchSpace: Understanding Context-Aware Peeking Between VR and Desktop Interfaces, CHI 2024

Johann Wentzel, Fraser Anderson, George Fitzmaurice, Tovi Grossman, Daniel Vogel

PDF - ACM - Blog - Talk (YouTube) - Slides


A controller raycasting to a point on the ground, for VR teleportation.

Bring-Your-Own Input: Context-Aware Multi-Modal Input for More Accessible Virtual Reality, CHI 2023 Doctoral Consortium

Johann Wentzel

PDF - ACM - Poster


A space scene with a bright sun, showing lens flare artifacts.

Volumetric and User-Centric Rendering Techniques for Lens Flare and Film Grain in Virtual Reality Environments, CVMP 2022 Extended Abstracts

Johann Wentzel, Lesley Istead

PDF


A person's feet interacting with the buttons on the Xbox Adaptive Controller, with several button switches surrounding it.

Understanding How People with Limited Mobility Use Multi-Modal Input, CHI 2022

Johann Wentzel, Sasa Junuzovic, James Devine, John Porter, Martez Mott

PDF - Blog - ACM - Talk (YouTube)



Same Place, Different Space: Designing for Differing Physical Spaces in Social Virtual Reality, CHI 2021 Social VR Workshop

Johann Wentzel, Daekun Kim, Jeremy Hartmann

PDF - Blog


The letter B, written in a fleshy font.

Font Your Friends and Loved Ones: On the Utility of Ugly Interfaces, alt.chi 2021

Josh Urban Davis, Johann Wentzel

PDF - Blog - ACM - Demo - YouTube


A man wearing a VR headset gestures; a virtual controller appears farther from his hand.

Improving Virtual Reality Ergonomics through Reach-Bounded Non-Linear Input Amplification, CHI 2020

Johann Wentzel, Greg d'Eon, Daniel Vogel

PDF - Blog - ACM - YouTube

Best Paper Honourable Mention


A robot holding a circuit board while a man solders it.

Shared Presence and Collaboration using a Co-Located Humanoid Robot, HAI 2015

Johann Wentzel, Daniel J. Rea, James E. Young, Ehud Sharlin

PDF - ACM - YouTube