Rina Kim

Product Designer — UX/UI Design · Prototyping · Systems Thinking

Master's in Human-Computer Interaction from Carnegie Mellon University. Previously at the BMW Group Technology Office. Specializing in Automotive HMI, Interface Design, and Rapid Prototyping, bridging research and production.

00 · BMW Group

Adaptive Generative UI

2025 Jun – 2026 Jan  ·  UX Engineer Intern

Overview

Developed a context-aware UI framework that uses generative models to synthesize interface components in real time. By analyzing driver telemetry, cabin state, and environmental context, the system proactively prioritizes information, minimizing cognitive load while enhancing vehicle interaction.

Key Contributions
  • Designed and implemented a modular HMI architecture that decouples UI layout from underlying data streams, enabling seamless adaptation to varying driver contexts.
  • Developed a React-based orchestration layer that synchronizes generative model outputs with high-fidelity vehicle displays, ensuring sub-50ms latency for interface transitions.
  • Prototyped an Adaptive HMI concept for automotive production, translating experimental generative design into a safety-critical dashboard implementation.
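The decoupling described in the first contribution can be sketched in TypeScript. This is an illustrative sketch, not the production code: layout logic reads only a typed context snapshot, never the underlying data streams, so the same function can be driven by live telemetry, simulator replays, or usability-test scripts. All names here (`DriverContext`, `selectModules`, the module ids) are hypothetical.

```typescript
// Hypothetical sketch: data streams publish a typed DriverContext snapshot,
// and the layout layer derives visible modules from that snapshot alone.
type DriverContext = {
  speedKph: number;
  precipitation: "none" | "rain" | "snow";
  incomingCall: boolean;
};

type Module = "navigation" | "dangerZoneOverlay" | "callCard" | "media";

// Pure function: context in, ordered module list out. Because it knows
// nothing about the CAN/telemetry plumbing, it adapts to any driver
// context the streams can describe.
function selectModules(ctx: DriverContext): Module[] {
  const modules: Module[] = ["navigation"];
  if (ctx.precipitation === "snow") modules.unshift("dangerZoneOverlay");
  if (ctx.incomingCall) modules.push("callCard");
  if (ctx.speedKph < 5 && !ctx.incomingCall) modules.push("media");
  return modules;
}
```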
Technologies
React · Three.js · Generative AI · Automotive UI
BMW Interface
Layers diagram
Pipeline diagram
Skills Applied
UX Research
Interpreted biometric and behavioral signals through scenario-based testing to surface interaction principles for high-stakes, low-attention driving contexts
Interaction Design
Designed a generative UI system with context-gated surfacing and glance-budget constraints, prototyped through adaptive driving modes in Figma
Systems Thinking
Modeled multi-agent orchestration across sensor, context, and intent layers to map real-time inputs into legible, prioritized UI decisions
Cross-functional
Translated raw sensor outputs and engineering constraints into shared design requirements, aligning research, product, and engineering on what the car should show and when
Research Approach

Grounded in two complementary methods: physiological measurement to capture what drivers cannot self-report, and structured usability testing to observe how they interact with dynamic interfaces.

Physiological Data Collection

Integrated a multi-sensor pipeline — PSI, PPG, heart rate, and CRM measurements — to capture driver state data that feeds directly into the UI's context gating logic. Design decisions were grounded in measurable cognitive and physiological signals rather than self-reported preference alone.

Biometric data capture · CSV export pipeline · CRM measurements · PPG + HR monitoring

Usability Testing Framework

Designed a structured testing framework to evaluate how drivers interact with dynamically generated interfaces under varying scenario conditions. Testing ran Oct–Nov, with findings directly informing iteration on the isochrone GenUI concept and real-time layout logic.

Scenario-based task testing · Observational sessions · Iterative prototype refinement · Data collection
Key Design Decisions

Four principles shaped every design decision across the project.

Surface, don’t bury
Information the driver needs appears without navigation. The interface assembles itself around context — snowy roads, an incoming call, a known route — rather than waiting for the driver to request it.

Minimum viable glance
Every screen is evaluated against a glance budget. Safety-critical data is chunked into scannable modules readable in under two seconds without sustained visual attention.

State over preference
Driver state data — fatigue signals, cognitive load, speed — gates what is shown and how. Comfort controls auto-adjust. Map overlays appear and collapse based on current capacity, not just settings.

Context continuity
The system maintains awareness across modes — navigation, comfort, media, vehicle health — orchestrating agents so decisions in one domain inform layout choices in another.
Interaction Model

Voice and sensor input become generators of new interface components. The driver does not navigate — the interface responds.

01

Driver Input

Voice · Sensor · Telemetry

02

Intent Parsing

NLP · Context classification

03

UI Generation

Component synthesis · Agent selection

04

Layout Orchestration

Priority scoring · Glance-budget check

05

Rendered Interface

Sub-50ms · Auto-dismissed when resolved

Example

Driver says "Show me roads to avoid" during a snowstorm → system generates a contextual map overlay with icy patch markers, congestion warnings, and a black ice alert — assembled on demand, dismissed automatically when conditions clear.
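The five-stage flow can be sketched end to end using this snowstorm example. Stage names match the diagram above; everything else — the types, the keyword matcher standing in for the NLP layer, the per-component glance costs, and the 2-second budget — is assumed for illustration.

```typescript
type Intent = { kind: "avoid_roads" | "unknown"; confidence: number };
type Component = { id: string; glanceCostMs: number; priority: number };

// 02 — Intent parsing: a toy keyword classifier in place of the real NLP layer.
function parseIntent(utterance: string): Intent {
  return /avoid|roads?/i.test(utterance)
    ? { kind: "avoid_roads", confidence: 0.9 }
    : { kind: "unknown", confidence: 0.2 };
}

// 03 — UI generation: synthesize candidate components for the parsed intent.
function generate(intent: Intent, snowing: boolean): Component[] {
  if (intent.kind !== "avoid_roads") return [];
  const parts: Component[] = [
    { id: "icyPatchMarkers", glanceCostMs: 600, priority: 2 },
    { id: "congestionWarnings", glanceCostMs: 700, priority: 1 },
  ];
  if (snowing) parts.push({ id: "blackIceAlert", glanceCostMs: 500, priority: 3 });
  return parts;
}

// 04 — Layout orchestration: priority scoring plus a glance-budget check.
// Components are admitted in priority order until the budget is spent.
function orchestrate(parts: Component[], budgetMs = 2000): Component[] {
  const ranked = [...parts].sort((a, b) => b.priority - a.priority);
  const out: Component[] = [];
  let spent = 0;
  for (const p of ranked) {
    if (spent + p.glanceCostMs <= budgetMs) {
      out.push(p);
      spent += p.glanceCostMs;
    }
  }
  return out;
}
```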

High-Fidelity Screens

Designed in Figma, validated against one primary scenario — Kennedy Expressway, Chicago, 32°F. High information density, genuine safety stakes.

Passenger Comfort Panel

Identity, dual-zone temp, seat controls, tire pressure, and range — one glanceable view. No multi-step navigation.

Danger Zone Map Overlay

Icy patch markers, congestion flags, and black ice alert. Auto-surfaces on weather trigger. Dismissed when conditions clear.

Snow Readiness Module

Tire pressure, brake wear, road focus mode, heated steering — assembled proactively. Surfaces without being asked.

Climate Ring Display

Dual-zone temp as ambient rings in the instrument cluster. Communicates state without pulling sustained attention from the road.

Design System Architecture

A five-domain agent architecture where each agent owns a discrete area of the driving experience. A central orchestrator determines priority and layout based on live context signals.

Navigation
Route and map layer — routine learning, live traffic, ETA, and proactive POI surfacing based on calendar and habitual patterns.
Comfort
Climate, seat controls, and cabin environment — auto-adjusts based on passenger profiles and ambient conditions.
Weather
Live precipitation radar, road condition prediction, ice and hydroplaning risk, fog detection — feeds directly into map overlay and alert design.
Media
Playback, audio zone management, and context-aware suggestions — attenuates automatically when safety-relevant alerts surface.
Vehicle Health
Continuous monitoring of tire pressure, oil level, brake fluid and pad wear. Triggers automatic video recording on impact detection. Feeds snow readiness and proactive maintenance surfaces.
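One way to picture the orchestrator over these five agents: each domain proposes a surface with a priority, and the orchestrator resolves them against live context. The attenuation rule (media ducks while a safety-relevant proposal is active) comes from the Media description above; the specific types, names, and numbers are assumptions for illustration.

```typescript
type Domain = "navigation" | "comfort" | "weather" | "media" | "vehicleHealth";
type Proposal = {
  domain: Domain;
  surface: string;
  priority: number;
  safetyRelevant: boolean;
};

// Central orchestrator: rank proposals by priority, and attenuate media
// whenever any safety-relevant surface is active.
function resolveLayout(proposals: Proposal[]): {
  surfaces: string[];
  mediaAttenuated: boolean;
} {
  const safetyActive = proposals.some((p) => p.safetyRelevant);
  const visible = proposals
    .filter((p) => !(safetyActive && p.domain === "media"))
    .sort((a, b) => b.priority - a.priority);
  return { surfaces: visible.map((p) => p.surface), mediaAttenuated: safetyActive };
}
```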
Reachability Mode · Passenger Mode · Road Focus Mode · Entertainment Mode · Eco Mode
Proactive Agent Pipeline
Response Accuracy Graph
01 · CMU SmaSH Lab
Deep Sea Diorama

Proactive Agent

2024 Sep – 2025 Aug  ·  Research Assistant

Overview

Researched & built a semantic classification pipeline for adaptive multimodal systems that isolates relevant user intent from environmental noise and linguistic variations, achieving >90% response accuracy.

Key Contributions
  • NLP pipeline development
  • User intent prediction model
  • Real-time response generation
Impact
  • Robust Semantic Orchestration: Engineered a classification pipeline that bridges the gap between raw speech-to-text and actionable intent, handling the nuance of informal speech.
  • Multimodal Scalability: Created a modular framework for adaptive systems that can be integrated into various hardware environments, from smart homes to automotive HMIs.
  • Real-time Decision Logic: Developed the logic for "Agent Responds vs. Agent Ignores," a critical component for the next generation of "always-on" ambient computing.
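The "Agent Responds vs. Agent Ignores" gate can be sketched as a confidence-thresholded check on the classifier output. The intent labels and the 0.8 threshold here are illustrative assumptions, not the lab's actual values.

```typescript
type Classification = {
  intent: "command" | "chatter" | "background_noise";
  confidence: number;
};

// Only act on a confidently classified, directed command; ignore ambient
// conversation and noise so the always-on agent stays unobtrusive.
function shouldRespond(c: Classification, threshold = 0.8): boolean {
  return c.intent === "command" && c.confidence >= threshold;
}
```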
Performance Metric: >90% Accuracy
02 · CMU & Surefront

AI Trend Forecasting

2024 Jan – 2024 Aug  ·  UX Researcher & Developer

Overview

Built a B2B tool designed to bridge the gap between high-level data analytics and creative execution. It leverages real-time data and AI to empower fashion professionals — from designers to merchandisers — to make informed, proactive decisions.

Key Features
  • Dynamic Data Integration: Built to process and visualize real-time growth metrics, search volumes, and month-over-month performance curves.
  • Agentic AI Implementation: The Personal Fashion Assistant democratizes data — instead of navigating three layers of menus, a user can simply ask: "What footwear is trending in Paris right now?"
Impact
  • 75% Reduction in Research Latency: Consolidated real-time APIs into a single HMI, reducing validation time from 4 hours of manual cross-referencing to <60 seconds.
  • 90% Decrease in Interaction Cost: Replaced a 12-click multi-step filtering process with a single natural language input, significantly lowering cognitive load for non-technical stakeholders.
  • Trend Velocity Accuracy: Month-over-month performance curves provide a 42.5% more granular view of trend momentum vs. traditional static quarterly reports.
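The single-input replacement for the 12-click filter flow amounts to mapping a natural-language query onto the same structured filter object the old UI built click by click. The parser below is a toy keyword matcher standing in for the actual AI layer; the field names and vocabulary are hypothetical.

```typescript
type TrendFilter = {
  category?: string;
  region?: string;
  window: "realtime" | "quarterly";
};

// Map a free-form query to a structured filter. A real system would use the
// NLP layer; this sketch uses regex matching over an assumed vocabulary.
function parseQuery(q: string): TrendFilter {
  const filter: TrendFilter = { window: "realtime" };
  const category = q.match(/footwear|outerwear|denim/i);
  if (category) filter.category = category[0].toLowerCase();
  const region = q.match(/in ([A-Z][a-z]+)/);
  if (region) filter.region = region[1];
  return filter;
}
```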
Research Latency: −75%
Interaction Cost: −90%
AI Trend Forecasting Dashboard
Surefront Interviews
Emma's Tree
Tree System
Temperature Change
03 · Biomimicry Hardware

Emma's Tree

2023 Jun – 2024 Jun  ·  Designer & Developer

Overview

Built this tree for a med student who wanted greenery without maintenance stress. Using biomimicry, the tree reacts to environmental changes just like a real organism — a functional, stress-free monitor designed to never die.

Key Highlights
  • The "Bloom" Effect: Temperature Sensitive Filament enables color transformation on touch or warmth.
  • Moisture Sensing: Moisture Sensor signals watering cues organically.
  • Visual Feedback: LED Indicators communicate environmental state at a glance.
  • Growth & Structure: 3D Pen for organic branching forms.
  • Sustainability: Solar-powered Battery Bank.
Design Thinking
  • Designed using biomimicry to solve "care fatigue" — a synthetic plant that lives and reacts right along with Emma. Temperature-sensitive filaments and environmental sensors make it a responsive sensory anchor that mimics the energy of a real plant.
  • Natural materials — real moss, a ceramic pot — keep the technology grounded rather than clinical, creating a restorative natural presence that is durable and emotionally meaningful.
Temp Filament · Moisture Sensor · LED Feedback · 3D Pen · Solar Power
Filter

Selected Projects

ResponsiveTale
Interactive · XR
Adaptive storytelling interface reacting to reader behavior
Interface Design · Multimodal Systems · Physical Computing

Pepper's Ghost
Spatial · Illusion
Holographic display using classic stage illusion technique
Spatial Computing · Rapid Prototyping · Tangible Environments

FlexVR
XR · Wearable
Flexible VR interface that adapts to body movement
Spatial Computing · Interface Design · Physical Computing · Rapid Prototyping

Emma's Jellyfish
Interactive · Bio
Bioluminescent jellyfish environment responding to gesture
Physical Computing · Rapid Prototyping · Interface Design · Multimodal Systems

LeARn
AR · Education
Augmented reality learning environment for spatial comprehension
Spatial Computing · Interface Design · Rapid Prototyping

Stop Motion 02
Physical · Animation
Stop motion study with extended material and texture exploration
Rapid Prototyping · Tangible Environments

CMU Popup
Installation
Pop-up exhibition experience designed for CMU campus
Rapid Prototyping · Tangible Environments

Portal Reef
XR · Environment
Immersive underwater portal experience in mixed reality
Spatial Computing · Rapid Prototyping · Physical Computing · Interface Design

Stop Motion
Physical · Animation
Frame-by-frame physical animation exploring material storytelling
Rapid Prototyping · Tangible Environments

Library VR
VR · Environment
Immersive virtual library space designed for focused study
Spatial Computing

RH Cloud
Responsive · Atmospheric
Volumetric cloud environment exploring presence and scale
Spatial Computing · Rapid Prototyping · Tangible Environments

Music Box Room
Virtual Reality · Acoustic
Second iteration with updated lighting and material studies
Spatial Computing

Piano Room
Virtual Reality · Acoustic
Intimate virtual music room built around spatial audio
Spatial Computing

Study Hall
3D Model · Architecture
Collaborative virtual study hall with adaptive ambient zones
Spatial Computing

Trees 01
Augmented Reality · Nature
Forest density study exploring depth and spatial perception
Spatial Computing

Flowers
Augmented Reality · Nature
Botanical virtual space with reactive flora and ambient sound
Spatial Computing

Forest Fragrance
Multimodal · Sensory
Multimodal sensory experience exploring fragrance interaction
Spatial Computing · Multimodal Systems

Forest
Virtual Reality · Environment
Full immersive forest environment with layered ambient depth
Spatial Computing