Virtual Sign
Team
-
Recognition
-
I developed and designed this app using Unity, C#, and Figma.
Problem
Traditional methods of learning ASL, such as books and videos, lack the immersive and interactive experience needed to master the language.
Outcome
Final Design
Timeline
Competitive Analysis
Limitations of current solutions include:
- One-way interaction that offers only a limited learning experience
- Interruptions that keep users from continuously practicing hand signs, caused by mismatched learning speeds or methods
Journey Mapping
Ideation
These three essentials address the limitations users have faced with current solutions, enabling more effective learning through hands-free interaction, an interactive learning experience, and real-time feedback on their hand signs.
Information Architecture
Visual cues guide users through the interface, allowing them to effortlessly explore content aligned with their learning objectives and quickly grasp the overall navigation structure.
Initial Sketch
Development
While Unity provides built-in hand gestures like pinch, poke, grab/grip, and point for creating actions within a scene, I decided to create custom gestures that portray fingerspelling in American Sign Language (ASL).
A custom hand pose combines finger shape, hand shape, and hand orientation. By configuring each component to reflect the real ASL hand gesture for each letter of the alphabet, the sign can be virtually recreated in the Unity-built VR platform.
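The matching logic behind such a custom pose can be sketched with Unity's XR Hands package. This is a minimal illustration, not the project's actual code: it assumes the `com.unity.xr.hands` package is installed, and the letter check and threshold angles are hypothetical guesses, not calibrated ASL values.

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

// Illustrative sketch: approximating finger shape as a "curl" angle and
// testing a static fingerspelling pose against hand-tuned thresholds.
public static class FingerspellMatcher
{
    // Rough curl measure for one finger: the angle between its proximal
    // and distal segments (near 0 = straight, larger = more curled).
    static float FingerCurl(XRHand hand, XRHandJointID proximal,
                            XRHandJointID intermediate, XRHandJointID tip)
    {
        if (hand.GetJoint(proximal).TryGetPose(out Pose p) &&
            hand.GetJoint(intermediate).TryGetPose(out Pose i) &&
            hand.GetJoint(tip).TryGetPose(out Pose t))
        {
            return Vector3.Angle(i.position - p.position,
                                 t.position - i.position);
        }
        return -1f; // joint not tracked this frame
    }

    // Hypothetical check for the letter 'A': index finger curled into the
    // fist, thumb held straight alongside it. Thresholds are placeholders.
    public static bool LooksLikeA(XRHand hand)
    {
        float index = FingerCurl(hand, XRHandJointID.IndexProximal,
                                 XRHandJointID.IndexIntermediate,
                                 XRHandJointID.IndexTip);
        float thumb = FingerCurl(hand, XRHandJointID.ThumbMetacarpal,
                                 XRHandJointID.ThumbProximal,
                                 XRHandJointID.ThumbTip);
        return index > 60f && thumb >= 0f && thumb < 25f;
    }
}
```

In practice, each ASL letter would get its own set of per-finger curl ranges plus an orientation condition on the wrist or palm, mirroring the finger-shape, hand-shape, and orientation components described above.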
Gesture Accuracy Testing
Menu
Game
Results
However, I encountered a few limitations with Unity's OpenXR Plugin.
- Because I used the 'Static Hand Gesture' component for hand detection, the app can capture only static gestures, not dynamic ones. Since some signs involve motion, they cannot be accurately represented or detected by a system designed only for static hand poses.
- Another limitation is difficulty capturing both hands at the same time. Many signs are formed with both hands, yet the program cannot combine the two into a single gesture.
While these limitations constrain full ASL communication, I discovered VR's potential to free users' hands and provide multisensory feedback to learners with hearing impairments.
Key Takeaways
As this is my first Extended Reality (XR) project in which I participated fully, from design to development, I gained a deep understanding of UI/UX design for XR as well as XR development. I now understand the developer's perspective and feel confident communicating and collaborating with developers.