
(Master’s Capstone)
Social Sonar for Ray-Ban Meta Glasses
Prototyping a spatial audio interface on Meta’s AI glasses to guide in-person meetups while keeping you off your phone and in the moment.
February 2025 – August 2025
CONTEXT
Do you remember your last journey to meet a friend in a public space?
Obstacles along that social wayfinding journey likely created unnecessary stress and confusion.
To address these issues, my team and I partnered with Meta Reality Labs to explore how spatial audio — technology that places virtual sounds around your head in the real world — could create navigation technology that keeps people off their screens and present in the moment.
THE SOLUTION
Social Sonar is a fully functional prototype for Meta’s AI glasses that uses 3D sounds and voice interactions to guide you seamlessly to your friends.
A 20-second POV of Social Sonar in action.
Social Sonar in a nutshell.
This demo shows how spatial audio plays from a friend’s direction, and how Social Sonar changes in tone and tempo to reassure you that you’re on the right track.
Clear checkpoints from Meta AI along the way.
Not just abstract sound effects!
Social Sonar checks your location and automatically communicates key updates with Meta AI voice notifications as you get closer.
Hands-free. Screen-free.
Keep your phone in your pocket.
By reducing cognitive load and moderating the meet-up process, Social Sonar helps you stay present to find your people.
MY ROLE
I owned the entire sound, voice, and interaction design of Social Sonar.
I defined Social Sonar’s experience with bespoke sound assets and voice interactions.
On a team of four, I was the sole technical product designer!
I translated rich research insights into screen-free wearable features.
I ensured Social Sonar addressed both user needs and Meta’s existing research principles.
I collaborated with the team’s design engineer and researchers for rapid evaluation and iteration.
I evaluated Social Sonar’s effectiveness in real-world scenarios via an aggressive RITE sprint.
IMPACT
Handed off to Reality Labs, Social Sonar now sits in Meta’s portfolio for future development on its AI glasses.
I directly contributed to Meta Reality Labs’ body of research.
My design uncovered two new behavioral principles about how people interact with spatial audio interfaces, informing Meta's future work.
My design outperformed existing navigation tools with its unique strengths!
In a final evaluative study, participants preferred it for its moderated communication, delight, and reduced screen time.
I communicated Social Sonar’s efficacy by running moderated demonstrations with Meta executives and lead researchers.
To further promote stakeholder buy-in, I defended my team’s overall design through playtests at Meta Reality Labs’ Redmond office.
I moderated playtests of our final design with interaction designers on-site at Reality Labs!
Understanding the problem
GENERATIVE RESEARCH
I led two generative research studies to understand how people currently navigate public spaces to find their friends.
I watched pairs of friends meet up, start to finish, via contextual inquiry.
From initial communication, through travel, and finishing at the in-person meeting, I observed five pairs of friends completing a full social wayfinding journey — a fair representation of the status quo. (n=10)
I analyzed the “last leg” of social wayfinding journeys, observing people meet their friends for a soccer match.
I used passive observation and guerilla interviews (Billy on the Street-style) to study the unique behaviors that emerge when people are moments away from meeting their friends, compared to the rest of the journey. (n=20)
THE DRIVING INSIGHT
A sense of control and reassurance is core to social wayfinding — shaped by the communication between friends and the information and tools used.
This manifests in a number of ways:
1.
People’s sense of control depends on digital information aligning with their physical environment.
Social wayfinding relies on comparing screen-based information with the physical environment. When this process fails, people’s sense of control erodes further.
2.
Friends rely on existing social norms for reassurance.
I observed the friend who arrived first would stay put and “wait,” while the other would come to them and “seek.” This prior learned behavior enhanced reassurance and increased the likelihood of a smooth meetup.
3.
Emotional reliability is more important than exact precision.
Managing expectations through communication and predictability matters far more than an optimized travel time paired with poor communication.
After wrapping up generative research, we ran internal prototyping tests to understand our physical constraints before establishing design principles.
HOW DOES THIS INFORM MY DESIGN?
Social Sonar must use screens sparingly.
To reduce cognitive load, Social Sonar should use interaction touchpoints on the Ray-Ban Meta glasses as much as possible.
Social Sonar should use the “waiter-seeker” dynamic for reassurance.
By designing in tandem with social norms like the “waiter-seeker” model, Social Sonar can help users understand their friend’s progress, thereby increasing their confidence.
Social Sonar must communicate critical wayfinding information.
Specifically, Social Sonar must communicate the distance and direction to a friend, as well as dynamic feedback to guide users through their journey.
Social Sonar’s scope: the “last leg” for outdoor public spaces.
By focusing on the “last leg” of the journey for outdoor areas, Social Sonar can provide a more pointed, effective solution than a generalist approach to all situations and settings.
The full design, step-by-step
VOICE COMMANDS
Kick off Social Sonar with a voice command when you're ready to begin.
I capitalized on existing Ray-Ban hardware for seamless hands-free input.
With Ray-Ban Meta’s native voice commands to start the experience, Social Sonar stays screen-free while still letting users act and receive information seamlessly.
Voice commands are integrated with the “waiter-seeker” dynamic.
Social Sonar’s hero flow begins when one friend arrives at their destination and says, “Hey Meta, I’ve arrived,” creating the structure to guide and moderate both users.
In this clip, Paulina uses a voice command to start Social Sonar, establish her position, and set herself as the “waiter.”
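For illustration, here is a minimal sketch of the role assignment behind that command, roughly as it could look in our web prototype. The names and event shape are hypothetical — the recognized utterance is assumed to arrive as a plain event rather than through Meta’s actual voice API.

```typescript
// Hypothetical sketch: the first friend to say "Hey Meta, I've arrived"
// becomes the "waiter"; the other friend becomes the "seeker."

type Role = "waiter" | "seeker";

interface Session {
  roles: Map<string, Role>; // userId -> role
  started: boolean;
}

function handleArrivedCommand(session: Session, speakerId: string, otherId: string): void {
  if (session.started) return; // the first "I've arrived" wins; ignore repeats

  // The friend who declares arrival stays put; the other navigates to them.
  session.roles.set(speakerId, "waiter");
  session.roles.set(otherId, "seeker");
  session.started = true;
}
```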
PHASE NOTIFICATIONS
Social Sonar plays phase notifications as you get closer to your friend, serving as checkpoints with key details.
These segment the journey into manageable phases and moderate social updates and expectations.
Phase notifications trigger at 250, 80, and 30 meters, providing positive feedback on progress while mirroring the text message updates friends typically share as they approach.
My design showcases Meta AI’s ability for contextual, adaptive updates.
Here, I include detailed information about a friend’s location, relative to their surroundings (“they will be on your left, near the cafe”), imagining Meta AI’s future inclusion of nearby geotagged data.
I combined sound effects and spoken feedback for a clear design.
Phase notifications start with a sound effect to grab attention, but I followed with an explicit voice cue to deliver important information without confusion.
Phase notifications play as you get closer to your friend, with different updates during the journey.
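As a rough sketch (not the production logic), the checkpoint behavior can be thought of as a distance check against the 250, 80, and 30 meter thresholds on every location update, with the attention chime and spoken update stubbed out as placeholders:

```typescript
// Sketch of distance-based phase notifications. Distances come from a
// standard haversine calculation on GPS coordinates; each threshold fires once.

const PHASE_THRESHOLDS_M = [250, 80, 30];

interface LatLng { lat: number; lng: number; }

function distanceMeters(a: LatLng, b: LatLng): number {
  const R = 6371000; // Earth radius in meters
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLng = toRad(b.lng - a.lng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

const firedPhases = new Set<number>();

// Called on every location update: sound effect first to grab attention,
// then an explicit spoken checkpoint so the information lands without confusion.
function checkPhases(seeker: LatLng, waiter: LatLng): void {
  const d = distanceMeters(seeker, waiter);
  for (const threshold of PHASE_THRESHOLDS_M) {
    if (d <= threshold && !firedPhases.has(threshold)) {
      firedPhases.add(threshold);
      playChime();
      speakPhaseUpdate(threshold, d);
    }
  }
}

// Placeholder hooks; a real prototype would route these to audio playback
// and text-to-speech respectively.
function playChime(): void {
  console.log("phase chime");
}

function speakPhaseUpdate(thresholdM: number, distanceM: number): void {
  console.log(`Checkpoint: within ${thresholdM} m (currently ${Math.round(distanceM)} m away)`);
}
```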
BEACONS
Marco Polo, but without the hiding. Social Sonar plays repeating spatial audio “beacons” from your friend’s direction as the crow flies, guiding you to them.
Beacons make spatial audio the hero of Social Sonar!
With beacons as the main navigation feature, I ensured critical information about a friend’s location was conveyed through spatial audio that repeated on a steady, reassuring cadence.
I mirrored patterns from everyday interactions for intuitive sound design.
The beacon’s mallet “pings” increase in tempo as you get closer to your friend, much like a car parking assist system — reinforcing real-world patterns to make Social Sonar naturally adoptable.
I added a layer of musical pitch to help correct confusion.
I used basic music theory to assist spatialization! The beacon chord’s musical tone sounds happier as the user turns toward their friend’s direction.
I provide immediate feedback for users with a contrasting “lock” feature.
As a final layer of positive feedback, I added a “lock” sound that plays immediately when you face a friend’s direction, removing the need to wait for the beacon to repeat.
A visual representation of my beacon design, broken into three pieces.
Revisit the demo to see them in action!
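To make the three layers concrete, here is an illustrative sketch of how they could be wired together: ping interval mapped to distance, chord quality mapped to heading error, and an immediate lock cue the moment the user faces their friend. The thresholds and curves are placeholders, not the tuned values from the prototype.

```typescript
// Illustrative sketch of the three beacon layers: tempo, pitch, and lock.

interface BeaconState {
  distanceM: number;        // straight-line ("as the crow flies") distance to the friend
  headingErrorDeg: number;  // angle between where the user faces and the friend's bearing
  locked: boolean;          // whether the user is currently facing the friend
}

// Tempo layer: the closer the friend, the shorter the gap between pings,
// echoing a car parking-assist system.
function pingIntervalMs(distanceM: number): number {
  const clamped = Math.min(Math.max(distanceM, 5), 250);
  return 400 + (clamped / 250) * 1600; // ~2 s when far, ~0.4 s when close
}

// Pitch layer: facing the friend sounds "happier" (e.g. a major vs. minor chord).
function chordQuality(headingErrorDeg: number): "major" | "minor" {
  return Math.abs(headingErrorDeg) <= 30 ? "major" : "minor";
}

// Lock layer: immediate confirmation when the user turns toward the friend,
// without waiting for the next beacon repetition.
function updateBeacon(state: BeaconState): number {
  const facing = Math.abs(state.headingErrorDeg) <= 15;
  if (facing && !state.locked) {
    console.log("lock cue"); // plays instantly
  }
  state.locked = facing;

  console.log(`next ${chordQuality(state.headingErrorDeg)} ping`);
  return pingIntervalMs(state.distanceM); // when the scheduler should play the next ping
}
```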
GESTURE
Once you and your friend are within sight, Social Sonar ends with a hand wave gesture from both parties.
I combined Meta’s hand pose detection with the social norm of waving hello.
Inspired by the Xbox Kinect initializing gesture, I ended Social Sonar with a wave to ground the experience in familiar behavior while still achieving users’ goals without screens.
A video MVP clip showing hand waves to say hello and end Social Sonar!
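A minimal sketch of that session-ending handshake, assuming hand-pose detection surfaces as a simple “wave” event per user (this is not Meta’s detection API):

```typescript
// Hypothetical sketch: Social Sonar ends only once both friends have waved.

const waved = new Set<string>(); // userIds who have waved so far

function onWaveDetected(userId: string, participants: [string, string]): void {
  waved.add(userId);
  const bothWaved = participants.every((id) => waved.has(id));
  if (bothWaved) {
    endSocialSonar(); // stop beacons and notifications for both users
  }
}

function endSocialSonar(): void {
  console.log("Social Sonar session ended");
}
```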
Strength through iteration
TIMELINE
My design of Social Sonar wasn’t just hypothetical. I built and rigorously tested it!
I collaborated with my design engineer to create a live mobile web prototype of Social Sonar. Then, via an aggressive RITE sprint schedule, I leveraged weekly user tests to iterate on features and fidelity. This rigorous blend of prototyping, coding, sound design, and research transformed an abstract design into a vetted solution.
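For context on how a mobile web prototype can render a beacon spatially, here is one plausible approach using the Web Audio API’s HRTF panner — a sketch under assumptions about how heading data is supplied, not our exact implementation:

```typescript
// Places a beacon sound at the friend's bearing relative to the user's heading.
// On mobile, the AudioContext typically needs to be created after a user gesture.

const ctx = new AudioContext();

function createBeaconPanner(): PannerNode {
  const panner = ctx.createPanner();
  panner.panningModel = "HRTF";      // binaural rendering for open-ear speakers
  panner.distanceModel = "inverse";
  panner.connect(ctx.destination);
  return panner;
}

// bearingDeg: friend's bearing (degrees clockwise from north);
// headingDeg: user's current heading, assumed to come from device sensors.
function placeBeacon(panner: PannerNode, bearingDeg: number, headingDeg: number): void {
  const relative = ((bearingDeg - headingDeg) * Math.PI) / 180;
  const r = 5; // virtual source distance in meters; loudness handled elsewhere
  panner.positionX.value = Math.sin(relative) * r;
  panner.positionY.value = 0;
  panner.positionZ.value = -Math.cos(relative) * r; // -Z is "in front" in Web Audio
}
```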
EXPLORATIONS
I experimented with edge-case interactions to showcase Social Sonar’s breadth and proactively address stakeholder questions.
During design iterations, I explored Social Sonar’s full potential with a variety of use cases to support my design’s future development. Answering these hypotheticals gave initial leads on how Social Sonar could expand beyond its hero flow.
“What if I want more immediate feedback from the beacons?”
I explored stripping down my sound design for more effective feedback. Our experiment revealed a crucial balance between clarity and comfort: while a simpler beacon was easier to understand, it was less pleasant to use (a bit too similar to Jaws).
“What if it’s too loud to hear the open-ear speakers?”
I experimented with increasing the sound’s audibility and resonance in loud environments. Though users clearly noticed changes to the sounds during A/B testing, they needed explicit messaging alongside the change to understand its meaning.
“What if I want to seek my friend at the same time?”
I tested Social Sonar with two participants simultaneously searching for each other, breaking the “waiter-seeker” model. While this scenario was much quicker than our hero flow, the breakdowns in “dual-seeker” were much more difficult to resolve.

FUTURE WORK
Nighttime use
Social Sonar can be intimidating to use at night! To resolve this, we would design more layers of reassurance, along with safety measures.
Elevation
To expand into more built environments, we need to communicate whether your friend is above or below you.
Integrate phone calls
Social Sonar should allow users to communicate directly with their friend. We could possibly spatialize a phone conversation!
Putting Social Sonar to the test
THE UPSHOT
For a summative evaluation of Social Sonar, my team and I conducted a final study to assess whether new users could successfully find one another using our prototype.
METHODOLOGY
We recruited pairs of young adults with no experience in spatial audio.
This is the target audience for the Ray-Ban Metas, based on Meta’s current marketing campaign.
(4 pairs, n=8)
We started with a screen-based tutorial to simulate the out-of-box experience and quickly bring participants up-to-speed on spatial audio.
We deemed these screens necessary to prepare novice users for their first-time use of Social Sonar in a succinct and clear way.
Participants were then given our prototype to find their friend waiting in a public park (~500 meters apart).
My team and I moderated sessions via think-aloud protocol, but otherwise remained uninvolved.
We designed the waiting experience to limit how often beacons repeated.
Our research showed friends who “wait” often send updates and then return to their phones until further correspondence is needed. Thus, we only played the waiter’s beacons twice until the next update or interaction.
WHAT WORKED WELL?
Pacing.
By pacing interactions with phase notifications, Social Sonar was valued by users for its predictability and clarity.
Multiple layers of feedback.
Not redundant, but a prudent and effective strategy! Users valued Social Sonar’s multiple mediums for understanding information when they were unsure of a sound’s direction alone.
Task completion!
All user tests were completed successfully, with the seeking participant using Social Sonar alone to find their waiting friend.
WHAT DIDN’T WORK AS WELL?
Visualizing the journey.
Even though users found the act of navigation to be much more natural, they noted it was difficult to anticipate their exact path compared to using screen-based tools.
The waiter’s experience.
We limited the repetition of information in the waiter’s experience a bit too much. Waiting friends need more feedback that Social Sonar is active, even in moments of downtime.
“It’s probably not going to be best if they’re [beacons] constantly playing, but definitely there was too much of a gap…I didn’t hear enough of them to feel confident.” – W2
Parting thoughts
DIRECT IMPACT
I directly contributed to Meta Reality Labs’ body of research, informing their future spatial audio technology.
At the start of our project, Meta’s audio team shared their design principles and learnings about how people interact with spatial audio interfaces. Here’s how I expanded on those principles with my design:
1.
Spatial audio tempo can communicate proximity
Previous Meta designs have communicated how close something is via volume (something gets louder as I get closer to it). With Social Sonar, I demonstrated that an increasing tempo is an effective way to communicate that your target is getting closer.
2.
Voice interfaces do not always need to be “call-and-response”
Users expect to verbally respond when interacting with voice interfaces. However, my phase notifications showed that users do not need to respond to a voice interface if the prompt is immediately followed by an instruction or task.
FUTURE IMPACT
Social Sonar currently sits in Meta’s portfolio for future development on the Ray-Bans and other AI smart glasses.
While our stakeholder could not provide a concrete timeline for Social Sonar’s release, here are the key areas where its impact will likely materialize first for future Meta wearable technologies:
Better head-tracking
Along with other hardware-related improvements on Meta’s AI glasses.
More solutions for social wayfinding
Addressing this problem area with new tools, with or without spatial audio.
Spatial audio for utility
More real-world applications of spatial audio, not just for entertainment.
FINAL REFLECTION
This non-visual project challenged me to rethink design and interaction from the ground up — proof that strong design is not always tied to a screen.
There’s a common notion that product design starts and ends with Figma, but that wasn’t true for Social Sonar. What proved far more valuable was my ability to move quickly, experiment, and jump across a wide range of technical skills (including Figma, when appropriate).
Social Sonar demonstrates that if I can design comprehensively without screens, I can adapt, acclimate, and design effectively under any constraint.
Full video MVP. Reminder to please wear headphones for the best spatial audio experience!