Enhancing Urban Wildlife Awareness through Location-Based AR Exploration
I led concept development, interaction design, and prototyping to create an educational scavenger hunt that blends storytelling with exploration.
AR Development, UX/UI Design
8 weeks
The prototype featured a mobile AR scavenger hunt with 3D animal encounters, designed to spark curiosity and support place-based learning.
Final Design
Scavenger Map
Start your adventure with a map that guides you to hidden animal spots around Central Park. Each location is part of a playful scavenger hunt—walk, explore, and unlock new creatures along the way.
Wildlife AR Lens
Use your camera to bring animals to life right where you’re standing. The AR lens lets you see 3D creatures in their natural environment, complete with fun facts and sounds that make each moment feel alive.
Scrapbook
Keep track of all the animals you’ve found in your own personal scrapbook. You can revisit each one, learn more about their history, and complete your collection as you go.
Industrialization of NY leads to wildlife loss
While Central Park is home to many wild animals in New York, continuous urban development and human activity negatively impact its wildlife through habitat loss, disease, pollution, and direct harm.
Measurements using the Shannon-Wiener Diversity Index show that higher population density is associated with lower species diversity.
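For context, the Shannon-Wiener index is computed as H' = -Σ pᵢ ln pᵢ, where pᵢ is the proportion of individuals belonging to species i; lower values mean a community is dominated by fewer species. A minimal sketch, using illustrative species counts (not survey data from this project):

```python
import math

def shannon_wiener(counts):
    """Shannon-Wiener diversity index H' = -sum(p_i * ln p_i),
    where p_i is the proportion of individuals in species i."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Illustrative counts for two hypothetical survey plots:
# an even community scores higher than one dominated by a single species.
diverse_plot = [10, 10, 10, 10]   # four species, evenly distributed
skewed_plot  = [37, 1, 1, 1]      # one dominant species

print(shannon_wiener(diverse_plot))  # ln(4) ≈ 1.386, the maximum for 4 species
print(shannon_wiener(skewed_plot))   # noticeably lower
```

An evenly distributed community reaches the maximum H' = ln(S) for S species, which is why urban plots dominated by a few tolerant species score lower on the index.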
Augmented Reality revives the forgotten and rarely seen animals of Central Park
As urbanization reshaped Central Park, many native species quietly disappeared from view. Wild Peek uses AR to reintroduce these rarely seen animals—bridging the gap between past ecosystems and present-day awareness.
Through immersive sound, animation, and location-based interaction, AR transforms passive observation into memorable learning. By enhancing engagement, improving visualization, and supporting diverse learning styles, it offers a playful yet powerful tool for ecological education.
From park visit to wildlife discovery
A location-based experience designed to guide users from discovery to interaction—blending physical movement with digital exploration. Multiple user touchpoints were considered, from posters and banners promoting the app to the mobile app experience itself.
Mapping the experience across realities
This stage focused on defining both the mobile interface and the AR environment. While wireframes shaped the app’s onboarding and navigation flow, the 8th Wall prototype brought the AR interactions to life—allowing users to see, hear, and move around the animals in real-world context.
Building on Central Park’s voice
The early visual direction was guided by the Central Park Conservancy’s branding system—leveraging its color palette, typefaces, and signage style to create a sense of familiarity and place. This foundation helped anchor the AR experience in the park’s real-world identity, building trust and coherence from the first interaction.
Testing ideas on real ground
To validate the concept beyond screens, I visited Central Park to observe how people moved through the space and interacted with their surroundings. I conducted lightweight concept testing with early visuals and prototypes—gathering feedback on clarity, engagement, and physical context. This helped surface practical needs like clearer onboarding, sound cues for attention, and location marker visibility in a dynamic outdoor environment.
Rethinking the experience through iteration
Initial feedback and real-world testing made it clear: the design needed to shift to better suit younger users and create more meaningful engagement. I narrowed the target age group, introduced a more playful tone, and restructured the experience to go beyond visual novelty—adding context, storytelling, and educational layers.
The before-and-after comparisons reflect key pivots in tone, clarity, and interaction design.
A playful conservation experience designed for curious young explorers
Wild Peek blends storytelling, spatial interaction, and AR technology to create an engaging journey through Central Park’s forgotten wildlife. Rather than presenting facts passively, the app invites kids to walk, search, and unlock animals—turning each discovery into a moment of wonder and learning. By combining movement, sound, and context, Wild Peek builds emotional connections with nature while fostering early awareness of urban biodiversity loss.
Where It Landed
This project was created as part of an academic AR brief, and while it hasn’t launched publicly, it laid the groundwork for designing location-based educational tools through immersive technology. If given more time, I would:
• Conduct testing with children and families to better understand usability and engagement in real-world settings
• Explore WebAR compatibility and GPS-based logic to trigger context-aware content more accurately
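The GPS-based trigger mentioned above could be sketched as a simple geofence check: compute the great-circle (haversine) distance from the user's position to each animal spot and unlock content only within a radius. The coordinates, spot names, and radius below are illustrative placeholders, not values from the project:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearby_spots(user_lat, user_lon, spots, radius_m=30):
    """Return the animal spots within radius_m of the user's position."""
    return [s for s in spots
            if haversine_m(user_lat, user_lon, s["lat"], s["lon"]) <= radius_m]

# Hypothetical animal spots near Central Park (placeholder coordinates).
spots = [
    {"name": "Beaver", "lat": 40.7740, "lon": -73.9708},
    {"name": "Bobcat", "lat": 40.7794, "lon": -73.9632},
]
# A user standing a few meters from the first spot unlocks only that one.
print(nearby_spots(40.7741, -73.9707, spots))
```

In a WebAR build, the user position would come from the device's geolocation watcher rather than hard-coded values, with the radius tuned to typical consumer GPS accuracy (often 5–20 m outdoors).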
Ultimately, Wild Peek showed me how AR can spark curiosity and learning in outdoor spaces—especially for younger audiences—by making the invisible past visible again.
What did I learn from this project?
Test Early, Even If It’s Rough
One of the biggest lessons from Wild Peek was how valuable early testing is—even if the prototype isn’t polished. Getting feedback early helped me clarify the direction of both interaction and UI design, and surface unexpected issues before they became blockers. It reminded me that iteration doesn’t need to wait for perfection.
Designing for AR Means Thinking Beyond the Screen
Unlike non-AR apps, this project required me to consider things I usually take for granted—like environmental context, sound design, and permission prompts. I had to think through how users would grant location or camera access, how animals should sound when they appear, and even what safety or warning messages might be needed. It was a challenge, but also what made designing for AR so engaging.