Problem / Challenge
Fragrance discovery is broken. Retail environments overwhelm shoppers with options and induce sensory fatigue. Mobile apps reduce scent to keywords and star ratings. Immersive experiences exist but remain disconnected from the fragrance storytelling journey. There is no place where scent, story, and self come together.
My Role
I was one of four team members in an 8-week Extended Reality course. I designed the UI screens for the visionOS interface, architected the user flow across all six XR experience stages, and implemented the interface frames in SwiftUI. My teammate Tina Jiang led the 3D environment design and spatial development.
Process
XR design requires thinking beyond screens. Before designing a single frame, I mapped the full experiential arc that any XR product must move a user through. Each stage of the ScentSync Cycle maps directly to this framework.
The ScentSync Cycle / XR Experience Arc
Anticipation
Before the headset goes on
Transition
Crossing into XR
Onboarding
Learning to exist here
Doing
The core experience
Completing
The payoff moment
Reflection
The afterglow
01
Anticipation
Users are primed through emotional fragrance storytelling before the experience begins. The buildup of curiosity and expectation before the headset goes on is the first design surface.
XR Arc Mapping
Mapped each of the six XR stages to a concrete moment in the ScentSync experience, from the emotional priming before entry through the multisensory memory that persists after the headset comes off.
SwiftUI Implementation
Implemented the UI frames in SwiftUI for visionOS. Testing required physical Vision Pro hardware, which slowed the iteration cycle and made upfront design thinking more valuable than rapid prototyping.
Key Design Decisions
1. Designing for a Medium Without Conventions
visionOS has no scroll bars, no obvious buttons, no familiar affordances. Users do not know what is interactive or how to interact with it. My solution was to embed instructional language directly into the UI as a design element. Each fragrance collection card displays the name and a line of instructional text: "Tap to preview." This matched the user's mental model while establishing the interaction pattern for the entire experience without a separate onboarding screen.
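The pattern above can be sketched in SwiftUI. This is a minimal, illustrative reconstruction, not the shipped code: the view name `FragranceCard` and its layout are assumptions, while the instructional line ("Tap to preview") embedded as a secondary text element is the design decision described.

```swift
import SwiftUI

// Illustrative sketch: a collection card that carries its own
// interaction instruction as part of the visual design, so no
// separate onboarding screen is needed.
struct FragranceCard: View {
    let name: String

    var body: some View {
        VStack(spacing: 8) {
            Text(name)
                .font(.title3)
            Text("Tap to preview")      // instructional text as a design element
                .font(.caption)
                .foregroundStyle(.secondary)
        }
        .padding(24)
        .glassBackgroundEffect()        // standard visionOS card material
        .hoverEffect()                  // highlights when the user's gaze lands on it
    }
}
```

Because the instruction lives on every card, the pattern teaches itself at the exact moment the user needs it.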
2. Emotion First, Technology Second
The six-stage XR framework was the foundation for every UI decision. The fragrance selection screen, the home panel layout, the Explore More collection cards, the Replay and Stop controls — each was designed in service of the emotional journey, not the technical constraints of the platform. Scent is emotional technology. The UI had to feel that way.
3. Standard visionOS Interaction Patterns
For selection and confirmation, I used the standard visionOS pattern: eye tracking to focus, pinch to confirm. This was a deliberate choice to reduce cognitive load. In a new medium, unfamiliar interactions add friction. Leaning into platform conventions meant users could focus on the experience rather than figuring out the controls.
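In SwiftUI on visionOS, this standard pattern comes essentially for free: a plain `Button` fires on look-and-pinch, and a hover effect provides gaze-focus feedback. The sketch below is illustrative; `SelectableItem` and its callback are hypothetical names, not the project's actual components.

```swift
import SwiftUI

// Minimal sketch of the selection pattern: gaze to focus
// (hover effect), pinch to confirm (the button's action).
struct SelectableItem: View {
    let title: String
    let onSelect: () -> Void

    var body: some View {
        Button(action: onSelect) {      // pinch while gazing triggers the action
            Text(title)
        }
        .buttonStyle(.borderedProminent)
        .hoverEffect(.highlight)        // system gaze feedback, no custom code
    }
}
```

Relying on the system-provided button and hover behavior is exactly the low-friction choice described: the platform handles discoverability of focus and confirmation, so the UI adds nothing for users to learn.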
Final Designs
Outcomes & Metrics
Presented to Matt Stern, Enterprise and Interactive Technology Lead at Apple. Stern praised the UI design and the novelty of the concept. His feedback focused on interaction discoverability within the 3D environment: users could not easily tell how to interact with spatial elements like the floating bubbles. Was it a pinch? A tap? A gesture? This feedback confirmed a real tension in XR design — the gap between an element being visually interactive and being behaviorally legible.
4
Team members across 3 disciplines
8wk
Concept to functional XR demo
✓
Apple engineer review
Learnings & Reflection
Designing for XR forced me to think about experience architecture before interface design. On a phone or a website, the screen is the product. In XR, the screen is one layer inside a world the user physically inhabits. The hardest constraint was the feedback loop: you cannot prototype a spatial experience in Figma. Testing requires hardware, which slows iteration dramatically. Getting the emotional arc right on paper before touching SwiftUI saved significant time.
The Matt Stern feedback also taught me something important: discoverability in XR is not a visual problem, it is a behavioral one. You can see a bubble. You do not know what to do with it. That distinction will shape how I design spatial interactions going forward.