Dolby XP: tvOS Live Event Streaming
A tvOS experience optimized for Apple TV remote interactions, designed to make live event viewing seamless. Launched with Macklemore’s SXSW 2023 performance, it supported a live watch party at the Dolby House.
Overview
This project gave me the chance to design for an interaction paradigm entirely different from traditional touchscreens: the Apple TV Remote. Because all user input is indirect, every interaction had to be intentionally modeled and pressure-tested. This pushed me to expand my design practice into a new interaction language and to collaborate more closely than ever with engineering. Ultimately, it reinforced something I carry into every project today: technical constraints are opportunities for creative invention, not blockers.
The Design Challenge
Unlike mobile or web, where users can tap precisely, tvOS relies on focus states, directional movement, and gestural navigation. This meant designing an interface that maintained clarity while the user was watching a live event and supported core behaviors, like exploring menus or opening the side panel, without pulling attention away from the main content. Every interaction needed to feel predictable, stable, and effortless, especially given the time-sensitive nature of live streaming.
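To make the difference concrete, here is a minimal SwiftUI sketch of focus-driven selection on tvOS. It is illustrative only, not project code, and the view and item names are placeholders: the remote never taps a coordinate, it moves system focus between focusable controls, and the interface reacts to whichever control currently holds focus.

```swift
import SwiftUI

// Illustrative only: a row of focusable controls on tvOS.
// The Siri Remote moves focus left/right between them, and the UI
// responds to whichever control currently holds focus.
struct FocusRow: View {
    @FocusState private var focusedIndex: Int?

    var body: some View {
        HStack(spacing: 48) {
            ForEach(0..<3) { index in
                Button("Item \(index + 1)") {
                    // The remote's click/select triggers the button action.
                }
                .focused($focusedIndex, equals: index)
                // Emphasize the focused control, as tvOS users expect.
                .scaleEffect(focusedIndex == index ? 1.1 : 1.0)
                .animation(.easeOut(duration: 0.15), value: focusedIndex)
            }
        }
    }
}
```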
Context + My Role
I partnered with another designer to define the interaction model for a new live-streaming feature on tvOS. Our goal was to create a seamless viewing experience that felt natural with the Apple Remote and consistent with Apple’s platform patterns. I collaborated daily with developers and product stakeholders to align on feasibility, refine motion and focus behavior, and prototype layouts that made navigation feel intuitive and delightfully simple.
Ideation
We began with a live sketching session to align quickly and explore widely. Working broad-to-narrow over several rounds, we mapped out what needed to stay on screen, what should live in the side panel, and which interactions could remain hidden until invoked. This collaborative sketching functioned as both ideation and knowledge-sharing, ensuring that we were operating from the same mental model before moving into higher-fidelity design.
Prototyping
Because tvOS interactions are difficult to explain through static wireframes, we built lightweight prototypes that used arrow keys and spacebar input to simulate remote behavior. These prototypes became essential for clarifying intent with developers and identifying gaps between design logic and platform constraints. They also helped us refine edge cases, such as how focus shifts when users switch zones or when the side panel expands and collapses.
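The prototypes themselves were throwaway, but the idea behind them reduces to a small input-translation layer. The sketch below is illustrative rather than the actual prototype code, and the type names and key strings are assumptions: arrow keys stand in for directional swipes on the Siri Remote, the spacebar stands in for the select click, and the same focus logic can then be exercised at a desk before it ever touches an Apple TV.

```swift
// Illustrative sketch of the prototype's input layer (not production code):
// keyboard input is translated into remote-style events so focus behavior
// can be pressure-tested without an Apple TV.

enum Direction { case up, down, left, right }

enum RemoteEvent {
    case move(Direction)   // stands in for a directional swipe
    case select            // spacebar stands in for the remote's click
}

func remoteEvent(forKey key: String) -> RemoteEvent? {
    switch key {
    case "up":    return .move(.up)
    case "down":  return .move(.down)
    case "left":  return .move(.left)
    case "right": return .move(.right)
    case "space": return .select
    default:      return nil   // all other keys are ignored
    }
}
```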

Solution
Our final design centered on a “three-zone” model that balanced user intuition with technical feasibility:
Zone 1: Familiar video-player controls, including Closed Captioning and language options, aligned with existing user mental models.
Zone 2: A right-hand side panel housing Dolby-specific live-event features such as chat, Q&A, polls, and links. Users open it by tapping or swiping right.
Zone 3: Contextual content and nested menu items within the panel. Once the panel is open, users navigate between Zones 2 and 3 vertically.
Because tvOS only allows left-right and up-down focus transitions, we intentionally prevented users from jumping directly from Zone 3 back to Zone 1, avoiding unpredictable leaps and preserving a logical spatial flow, as sketched below.
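In code terms, these navigation rules reduce to a small transition table. The sketch below is illustrative and assumes hypothetical Zone and Direction types rather than the shipped implementation; its one load-bearing detail is the deliberately missing Zone 3 to Zone 1 transition.

```swift
// Illustrative sketch of the three-zone focus rules, not the shipped code.
// Zone 1 = player controls, Zone 2 = side panel, Zone 3 = nested panel content.

enum Zone { case playerControls, sidePanel, panelContent }
enum Direction { case up, down, left, right }

func nextZone(from zone: Zone, moving direction: Direction) -> Zone {
    switch (zone, direction) {
    case (.playerControls, .right): return .sidePanel      // open / enter the panel
    case (.sidePanel, .left):       return .playerControls // back to player controls
    case (.sidePanel, .down):       return .panelContent   // into nested panel items
    case (.panelContent, .up):      return .sidePanel      // back up within the panel
    // Deliberately no (.panelContent, .left) case: Zone 3 never jumps
    // straight back to Zone 1, so focus always moves one zone at a time.
    default:                        return zone            // ignore illogical moves
    }
}
```

Modeling navigation as an explicit transition table also made the rules easy to pressure-test with the keyboard-driven prototypes described above.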
Impact
This work strengthened cross-functional collaboration across design, product, and engineering, and helped define tvOS interaction patterns for live and time-sensitive content at Dolby. The Live Event Page debuted publicly with Macklemore’s SXSW 2023 performance, which was streamed across platforms and showcased during a watch party at the Dolby House. The event served as a real-time stress test in front of industry guests. Not only did the interaction model perform reliably under live conditions, it also informed broader standards for future Dolby tvOS experiences.
Reflection
Designing for tvOS fundamentally shifted how I think about interaction design. Working within the constraints of indirect input pushed me to build richer mental models of user behavior, consider spatial patterns more intentionally, and communicate more precisely with engineering partners. This project strengthened my belief that constraints are creative catalysts, and that great interaction design emerges from continuously translating between user needs, platform logic, and technical reality. It also deepened my appreciation for cross-functional collaboration: the most impactful decisions came from moments where design, product, and engineering worked as a single problem-solving unit.