top | item 45978025


jhugo | 3 months ago

I have the G1 glasses and unfortunately the microphones are terrible, so the live translation feature barely works. Even if you sit in a quiet room and try to make conditions perfect, the accuracy of transcription is very low. If you try to use it out on the street it rarely gets even a single word correct.


chrischen | 3 months ago

This is the sad reality of most of these AI products: they stack poor feature implementations onto the hardware. It seems like if they just picked one of these features and did it well, the glasses would be useful.

Meta has a model just for isolating speech in noisy environments (the “live captioning feature”) and it seems that’s also the main feature of the Aircaps glasses. Translation is a relatively solved problem. The issue is isolating the conversation.

I’ve found Meta is pretty good about not overpromising on features, and as a result, even though they probably have the best hardware and software stack of any glasses, what you can do with the Rayban displays is extremely limited.

Xss3 | 3 months ago

Is it even possible to translate in real time? In many languages and sentences, the meaning, and hence the translation, can change completely because of one additional word at the very end. Any accurate translation would need to either wait for the end of the sentence or correct itself after the fact.
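This is roughly how streaming translation systems handle it: rather than waiting for the full sentence, they re-translate the growing prefix after every new word and let the display revise earlier output. A minimal sketch of that "re-translation" loop, using a hypothetical toy translator (a stand-in for a real MT model) and the classic German example where a clause-final "nicht" negates the whole sentence:

```python
def translate_prefix(words):
    # Hypothetical stand-in for a real MT model. German places "nicht"
    # late in the clause, so the correct English flips on the final word.
    if "nicht" in words:
        return "I do not like apples"
    if words == ["Ich"]:
        return "I"
    if words == ["Ich", "mag"]:
        return "I like"
    return "I like apples"

def stream_translate(source_words):
    """Re-translate the whole prefix after each incoming word and collect
    every hypothesis, so a display layer can overwrite the previous one."""
    return [translate_prefix(source_words[:i])
            for i in range(1, len(source_words) + 1)]

hyps = stream_translate("Ich mag Äpfel nicht".split())
# Intermediate hypotheses say "I like apples"; the final word "nicht"
# forces a revision to "I do not like apples".
```

The trade-off the comment describes is visible here: low latency requires emitting hypotheses that may later be retracted, while a stable, never-revised output requires waiting for the sentence to end.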