Meta operates products used by billions of people globally, with major communication surfaces spanning WhatsApp, Messenger, Instagram, and Meta AI. The company is exploring whether a purpose-built mobile device experience could better serve accessibility-first users and deepen engagement across its ecosystem.
You are the PM responsible for defining a phone experience for deaf and hard-of-hearing users. Today, mainstream smartphones offer accessibility features, but they are fragmented: often buried in settings, inconsistent across apps, and not designed around deaf users' primary communication needs. Accessibility interview research points to three recurring pain points: real-time communication breaks down in voice-first situations; environmental awareness suffers when important sounds are missed; and many users must stitch together multiple third-party apps for captions, transcription, and sign-language-friendly messaging.
Meta leadership does not want a niche hardware science project. They want a product concept that could launch within 12 months by combining existing smartphone capabilities with Meta surfaces such as WhatsApp, Messenger, Instagram, and, where relevant, Ray-Ban Meta integrations. The goal is a coherent, accessibility-first phone experience rather than a generic phone with a few add-on features.