Virtual Sign

My Role
Development, UI/UX Design

Team
-
Duration
1 month

Recognition
-
Virtual Sign is a virtual-reality (VR) application designed to teach American Sign Language (ASL) through real-time hand tracking, creating an immersive and interactive learning experience.

I developed and designed this app using Unity, C#, and Figma.






Overview

Problem

American Sign Language (ASL) is a visual language that uses handshapes, facial expressions, and body movements to convey meaning.

However, traditional methods of learning ASL, such as books and videos, lack the immersive and interactive experience needed to master the language.



Outcome

A virtual-reality (VR) application designed to teach ASL in an interactive and engaging manner. It provides real-time feedback through hand detection, making the learning process rewarding and effective.


Final Design








Research

Timeline

This project was broken down into four weekly sprints, from discovery to final build. Based on the pain points I identified, I chose an American Sign Language (ASL) learning platform as my topic and virtual reality (VR) as the platform type.




Competitive Analysis

Broadly speaking, there are three types of ASL learning platforms: web/mobile apps, books, and videos. I focused specifically on digital platforms to enable an effective comparison and inform further development of the system.

Limitations of current solutions include:

  • Offer only limited learning through one-way interaction
  • Disrupt users' continuous engagement with hand signs because of fixed learning speeds or methods





Journey Mapping

I conducted a journey-mapping exercise around a specific scenario to identify opportunities.









Design Process

Ideation

By comparing the as-is and to-be states, I identified the project's three core systems: a VR environment, gamification, and gesture recognition.

These three essentials address the limitations users face with current solutions, enabling more effective learning through hands-free interaction, an interactive learning experience, and real-time feedback on their hand signs.





Information Architecture

The app's sitemap features a streamlined design, ensuring easy navigation with consistent VR hand interactions and environments.

Visual cues guide users through the interface, allowing them to effortlessly explore content aligned with their learning objectives and quickly grasp the overall navigation structure.





Initial Sketch

I used Saara Kamppari-Miller's VR paper-prototyping method to visualize the idea, sketching the environment as well as the interactions so that I could later build them thoroughly into the VR environment.





Development

Unity’s OpenXR ecosystem includes the XR Hands package, which defines an API for accessing hand-tracking data from devices that support this feature.

While Unity provides built-in hand gestures like pinch, poke, grab/grip, and point for creating actions within a scene, I decided to create custom gestures that portray fingerspelling in American Sign Language (ASL).

A custom hand pose combines finger shapes, hand shape, and hand orientation. By carefully setting each component to reflect the real hand gesture for each letter of the ASL alphabet, the sign can be recreated virtually in the Unity-built VR platform.
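To give a sense of what the hand-tracking data looks like underneath the gesture components, the sketch below reads joint poses from the XR Hands low-level API and approximates one pose condition (index-finger curl). This is an illustrative approximation only: the class name, the curl heuristic, and the 120° threshold are my own hypothetical choices, not the actual implementation of the package's Static Hand Gesture component, which expresses such conditions declaratively.

```csharp
// Sketch: approximating a static-pose check with the XR Hands joint API.
// Requires the com.unity.xr.hands package; runs inside a Unity scene.
using UnityEngine;
using UnityEngine.XR.Hands;

public class AslLetterChecker : MonoBehaviour // hypothetical helper
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        if (m_Subsystem == null)
        {
            var subsystems = new System.Collections.Generic.List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;
            m_Subsystem = subsystems[0];
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        // Approximate the index finger's curl as the angle between its
        // proximal and distal segments (0 degrees = finger straight).
        if (TryGetCurl(hand, XRHandJointID.IndexProximal,
                       XRHandJointID.IndexIntermediate,
                       XRHandJointID.IndexTip, out float curl)
            && curl > 120f) // hypothetical threshold, tuned per letter
        {
            Debug.Log("Index finger fully curled");
        }
    }

    static bool TryGetCurl(XRHand hand, XRHandJointID a, XRHandJointID b,
                           XRHandJointID c, out float angleDegrees)
    {
        angleDegrees = 0f;
        if (hand.GetJoint(a).TryGetPose(out var pa)
            && hand.GetJoint(b).TryGetPose(out var pb)
            && hand.GetJoint(c).TryGetPose(out var pc))
        {
            angleDegrees = Vector3.Angle(pb.position - pa.position,
                                         pc.position - pb.position);
            return true;
        }
        return false;
    }
}
```

A full letter check would combine several such conditions (one per finger) with a hand-orientation test, which is essentially what the custom hand-pose assets described above encode.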





Gesture Accuracy Testing

I tested all 26 letters of the ASL alphabet with 8 users and found a 65% recognition accuracy rate, with misrecognitions occurring mostly in signs distinguished by subtle finger differences.








Solution

Menu




Game











Outcome

Results

Through this project, I learned the advantages of using VR as a platform for learning ASL, such as gamification for greater immersion and the ability to provide silent feedback and guidance through visual cues and haptics, aligning with the visual nature of sign language.

However, I encountered a few limitations stemming from Unity's OpenXR plugin.

  • Because I used the 'Static Hand Gesture' component for hand detection, the system can only capture static gestures, not dynamic ones. Since many signs involve motion, they cannot be accurately represented or detected by a system designed only for static hand poses.
  • Another limitation is difficulty capturing both hands at the same time. Both hands are often used to form a single sign, yet the program cannot combine them into one gesture.

While these limitations matter for full ASL communication, I discovered VR's potential to free users' hands and provide multisensory feedback to learners with hearing impairments.




Key Takeaways

With limited time and knowledge, I successfully built a working prototype of the application, as I had initially planned in the brainstorming phase.

As this was my first extended-reality (XR) project in which I participated fully, from development to design, I gained a deep understanding of UI/UX design for XR as well as XR development. I now understand the developer's perspective and feel confident communicating and collaborating with developers.

    ©2025