These Transcribing Eyeglasses Put Subtitles on the World

TranscribeGlass has launched smart eyeglasses that display real-time subtitles of conversations directly onto the lens, designed primarily for deaf and hard-of-hearing users. The lightweight glasses cost $377 plus a $20 monthly subscription and represent a focused approach to smart eyewear that prioritizes accessibility over flashy features.
How it works: The 36-gram glasses pair with a companion iOS app that captures and processes audio on the phone; the frames then project the resulting subtitles onto a small display in the user’s field of vision.
- A waveguide projector beams text at 640 x 480 resolution onto the lens; the frames themselves contain no cameras, microphones, or speakers.
- Users can adjust subtitle positioning within a 30-degree field of view and control how many lines of text appear at once (a rough sketch of that line-buffering idea follows this list).
- The battery lasts approximately eight hours between charges.
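The article doesn’t detail how the app paces text, but the line-count setting suggests a rolling buffer of wrapped lines. Below is a minimal Python sketch of that idea; the SubtitleBuffer class, the 32-character line width, and the default line count are assumptions for illustration, not TranscribeGlass specifics.

```python
# Minimal sketch (not TranscribeGlass's published code) of a rolling
# subtitle buffer: wrap each incoming transcript fragment to a fixed
# line width and keep only the N most recent lines on screen.
import textwrap
from collections import deque


class SubtitleBuffer:
    def __init__(self, max_lines: int = 3, width: int = 32):
        # width is an assumed characters-per-line budget for a small
        # 640 x 480 waveguide display, not a published spec.
        self.width = width
        self.lines = deque(maxlen=max_lines)  # oldest line drops off automatically

    def append(self, fragment: str) -> None:
        """Wrap a newly transcribed fragment and push it onto the display."""
        for line in textwrap.wrap(fragment, self.width):
            self.lines.append(line)

    def render(self) -> str:
        """Join the currently visible lines, ready to hand to the projector."""
        return "\n".join(self.lines)


buf = SubtitleBuffer(max_lines=2)
buf.append("Speaker 1: Are you coming to the ten o'clock standup meeting?")
print(buf.render())  # only the two most recent wrapped lines remain visible
```

A fixed-size buffer like this also makes the pacing complaint in the next section concrete: the faster new fragments arrive, the sooner old lines scroll away.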
What you should know: Transcription accuracy impressed during testing in noisy environments, with the system correctly identifying different speakers and maintaining grammatical precision.
- At a bustling San Francisco coworking space, the glasses successfully transcribed conversations with proper speaker labels.
- The system transcribes so quickly that text sometimes appears faster than it can comfortably be read.
- The waveguide display creates a slight shimmer on the lens that’s visible to onlookers.
Features in development: TranscribeGlass is testing real-time language translation and emotion detection capabilities that could enhance conversational understanding.
- The translation feature enables bilingual conversations: Hindi speech appears as English text on the wearer’s lens, while English responses are shown as Hindi text on a companion phone for the other speaker.
- An experimental emotion-tracking system analyzes tone of voice to display tags like “[Awkwardness]” or “[Amused]” alongside transcribed words (a display-side sketch follows this list).
- Future updates may include American Sign Language syntax translation, though founder Madhav Lavakare acknowledges potential accuracy concerns.
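The article says nothing about how the emotion tags are generated, so the sketch below covers only what the lens would show: a bracketed tone label appended to a transcribed segment. Segment, render_segment, and the field names are hypothetical; only the tag wording comes from the article’s examples.

```python
# Hypothetical sketch of the display side of the experimental emotion
# feature: attach a bracketed tone tag (produced by some upstream
# classifier, assumed here) to a finalized transcript segment.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Segment:
    speaker: str
    text: str
    emotion: Optional[str] = None  # e.g. "Awkwardness", "Amused"


def render_segment(seg: Segment) -> str:
    """Format one segment as the lens would show it, tag included."""
    tag = f" [{seg.emotion}]" if seg.emotion else ""
    return f"{seg.speaker}: {seg.text}{tag}"


print(render_segment(Segment("Speaker 2", "Well, this is fine.", "Awkwardness")))
# -> Speaker 2: Well, this is fine. [Awkwardness]
```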
The big picture: Unlike competitors offering multiple features like navigation and AI chatbots, TranscribeGlass deliberately focuses on solving one specific accessibility challenge.
- “All these smart glasses exist, but no one’s found a great use case for them,” says Lavakare, a 24-year-old Yale senior who founded the company to help a hard-of-hearing friend.
- The company currently serves a few hundred customers and positions itself against flashier competitors like Even Realities and XRAI.
What they’re saying: Lavakare emphasizes the social isolation that comes from missing conversations as his primary motivation.
- “We think we’ve really found a use case that’s just insanely valuable to the end user,” he explained.
- On the prospect of sign language translation: “Sign language grammar is actually very different than English grammar. That’s why this is still experimental.”
- “I was pretty obsessed with Google Glass when it came out,” Lavakare admits, acknowledging his early “Glasshole” status.