
These Transcribing Eyeglasses Put Subtitles on the World

by Adrian Russell


I knew the AI on these smart glasses worked pretty well once it told me that someone else in the conversation was being the socially awkward one.

TranscribeGlass are smart eyeglasses that aim to do exactly what it says on the tin: transcribe spoken conversations and project subtitles onto the glass in front of your eyes. They’re meant for the Deaf and, primarily, the hard-of-hearing community who struggle to read lips or pick out a conversation in a loud room.

Most face computers are graceless and heavy, but these glasses are light, at just 36 grams. TranscribeGlass keeps the weight down by offloading most of the computing to a companion app (iOS only for now). There are no cameras, microphones, or speakers in the frames, just a small waveguide projector in the rim of one eye that beams a 640 x 480 pixel image onto the glass. That is just enough resolution for text to be legible when it is projected directly into your vision, subtitling the conversations picked up by the mic in your phone.

In the app, subtitles can be moved around in the wearer’s vision, anywhere within a 30-degree field of view. You can also adjust how many lines of text come in at a time, dialing it up to a wall of text or down to one word at a time. The battery in the glasses should last around eight hours between charges. The frames cost $377, and there’s an additional $20-per-month subscription fee to access the transcription service.

Subtitles are the only feature currently available in the glasses, but Madhav Lavakare, the 24-year-old founder of TranscribeGlass, has others lined up. In the testing phase are a setting that translates languages in real time and one that analyzes the tone of voice of the person talking.

Glass Dismissed

As Lavakare told me (and The New Yorker in April), he came up with the idea for this product because he wanted to help a hard-of-hearing friend engage in conversations that were not happening with his needs in mind. Lavakare, who is a senior at Yale University, figured glasses were the way to go. If he could just get them right. And, you know, make them look cooler than some other glasses out there.

“I was pretty obsessed with Google Glass when it came out,” Lavakare says.

“Oh,” I say. “So you were a Glasshole?”

“I was, I was!” he says with a laugh. “And then I was like, why are people calling me that?”

While we are talking, the words pop up onto the screen of the glasses I’m wearing. They show up in a Matrix-y green font that patters out across my vision. It does a pretty good job of transcribing the conversation, though it does split the word “Glasshole” into “Glass Hole,” which is honestly funnier.

Though Lavakare’s smart glasses are much more normal-glasses-adjacent than Google Glass ever was, they still can’t really help but look like smart glasses. The screen has a slight shimmer where the waveguide sits on the glass; it’s just visible to onlookers and clearly noticeable to me when I’m wearing them.


