top | item 16308522

Intel made smart glasses that look normal

440 points | dphnx | 8 years ago | theverge.com

257 comments


nogridbag|8 years ago

Despite all the negativity around Google Glasses' camera, that was actually the best feature. You could really capture some great moments directly from your eye's perspective at the wink of an eye. My mother recently passed and out of all the photos and videos, my favorite was a 10 second video of me handing her flowers shot from Google Glasses. It looks like she's staring right back into my soul.

I have a three month old daughter now and I find myself fumbling about with my phone trying to take photos of her. Just last night I dropped my Pixel phone while trying to capture a photo of her. Phone is fine, but she wasn't too happy with the loud noise of my phone hitting the wood floor :) I kind of miss Google Glasses simply for the camera feature.

Instead of a minimal heads up display, I would much rather have a minimal wearable camera without all the extra functionality Google Glasses offered. Google Clips seems to be an alternative hands-free camera with different pros and cons (+I can be in the photo. -Can't capture the same type of photos from my eye viewpoint).

mjlangiii|8 years ago

Just my two cents; I'm a father of 4 and I'm okay with not capturing those moments on film. And I mean that in two senses. One, I'm fine using my very imperfect memory to recall special moments, and two, because of the logistical problems with grabbing the camera I'm not willing to risk missing out on being fully in the moment. The occasional photograph from an event or time period seems sufficient to conjure up the feelings from that time.

Regarding an always on camera - I'll second other comments in warning there are just so many concerns with abuse, I don't see how to get around those.

pavel_lishin|8 years ago

Yup, another father here: I'd love an always-on camera that I can tell to save the last 30 seconds of playback. There are a lot of moments I'd love to have on video or camera that are there-and-gone - even if I had my phone in my pocket, by the time I pulled it out, it would be too late.
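The "save the last 30 seconds" idea is essentially a ring buffer keyed by timestamp. A minimal sketch in Python (class and variable names, and the 1 fps simulation, are made up for illustration; a real camera would buffer encoded video, not strings):

```python
from collections import deque
import time

class RollingCapture:
    """Keep only the last `window` seconds of frames; save() snapshots them."""
    def __init__(self, window=30.0):
        self.window = window
        self.frames = deque()  # (timestamp, frame) pairs, oldest first

    def push(self, frame, now=None):
        now = time.monotonic() if now is None else now
        self.frames.append((now, frame))
        # Evict anything that has aged out of the window.
        while self.frames and now - self.frames[0][0] >= self.window:
            self.frames.popleft()

    def save(self):
        """Return a copy of the buffered clip (the last <= `window` seconds)."""
        return [f for _, f in self.frames]

cap = RollingCapture(window=30.0)
for t in range(100):               # simulate 100 seconds of 1 fps video
    cap.push(f"frame{t}", now=float(t))
clip = cap.save()                  # only the final 30 seconds survive
```

The device records continuously but only ever holds the trailing window, so "tell it to save" just snapshots the buffer after the moment has already happened.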

thaumaturgy|8 years ago

That's all well and good, and I don't think many people are disputing the value of having more and better memories of loved ones.

The trouble is in resolving that against the kind of culture that has fostered things like the "creepshots" subreddit.

Smartphones (and GoPros) have already shown the value of being able to film and share more of our lives, but they've also shown the downsides of having our graceless moments broadcast to the world, or the many events that are now impossible to enjoy in a sea of smartphones held aloft, or, if you're an attractive woman, far more of you put online for other people to gawk at.

Until those abuses are resolved somehow, people are going to resist having a little camera attached to everyone's face.

ocdtrekkie|8 years ago

Given Google refused to allow you to store pictures local-only, Glass' camera was... its least-used feature for me. I loved notifications without having to get out my phone (or look at my wrist). Google's absolute refusal to make photo sync optional on Glass was honestly the beginning of the end for my trust in El Goog.

stcredzero|8 years ago

Despite all the negativity around Google Glasses' camera, that was actually the best feature

The negativity around Google Glasses amounted to a mismanaged launch. iPhones were introduced to a public already familiar 1st-hand with cell phones, camera phones, and digital cameras. The public was already passingly familiar with smart phones. Google Glasses were introduced with maximum hype to as many early adopters as possible, with no thought as to how it could backfire, and how to navigate those pitfalls. In retrospect, is it any wonder that there were so many inadvisable actions, all adding up to a societal backlash?

Google Glasses should have been rolled out to far fewer people, and in a form factor almost indistinguishable from ordinary eyeglasses.

azinman2|8 years ago

Snap’s glasses?

xPhobophobia|8 years ago

Except Google Clips is like the camera in the movie "The Circle" and is a rabid privacy violation. The discontent with Google Glasses was likely because it was ahead of its time.

starsinspace|8 years ago

I don't think anyone cares if you use such a glasses camera in your private space, with people who know you and are fine with it. The problem is when it becomes normal and everyone has them on all the time, in public spaces too. Then nobody can opt out anymore.

SimbaOnSteroids|8 years ago

The trick is getting a camera powerful enough to get acceptable quality pictures but small enough, and blended with the surroundings enough that you don't notice it.

magic_beans|8 years ago

It's ok to not have pictures of every single minute of your daughter's development.

It's probably better for you -- and for her -- not to have so many photos.

bluntfang|8 years ago

this is how we turn into simple rick.

craigsmansion|8 years ago

> You could really capture some great moments directly

That should be, "you could really capture some great moments directly on film..."

You already captured these moments; you were there; they are ingrained in your memory.

> My mother recently passed and out of all the photos [..] staring right back into my soul."

First of all, my sincere condolences.

Again, you were there. That moment, that specific wonderful moment you witnessed, nice as it is to have something to remember it by, it feels very personal, not something you could share unless you were there, and you were you.

Back in the days of photo-lab development, there existed magic moments caught on camera as well, but it was the luck of the draw. Since they were so rare, there was no fear of missing out, but any such moment caught on camera was all the more special for it.

Maybe this fear of missing out on your own life, pushing one to commit everything to camera, comes at the cost of actually missing out. Life is made better by witnessing these magic moments, but I'm unsure the drive to capture them all enhances the experience, witnessed and remembered in that particular fashion by no one but yourself.

y0ghur7_xxx|8 years ago

This is actually something I would like to wear. It's like normal glasses, I am not recording videos of the people around me, and it could show me relevant information when needed without looking at my phone. I can think of a few use cases this could cover, in a very elegant way.

The nice thing is, that they are completely "invisible" for other people around me.

jclardy|8 years ago

I do think they are cool and could be useful, but not to the point of making someone that doesn't wear glasses want to wear them.

My main problem is what does it give me over wearing a smart watch? The only thing I can think of is being slightly more discreet and the "gesture" not being rude to activate when you are in company. That is something I found out quickly when I started wearing an Apple Watch - even if you are just checking a text message it appears you are checking the time and want to leave. Since then I have greatly cut down on notifications going to my watch, and have even fewer that even ping on my phone.

To me, having a camera on it is what would make it compelling, but at the same time make it creepy. Say being able show driving directions overlaid on the actual road, versus some floating text. Or you sit down at a desk with just a keyboard and mouse and your "displays" are only shown in your field of view - and you can customize, move and resize them as you wish.

nogridbag|8 years ago

I'm curious if the position of the "Vaunt display" will feel similar to Google Glass.

From the article:

>> It projects a rectangle of red text and icons down in the lower right of your visual field. But when I wasn’t glancing down in that direction, the display wasn’t there. My first thought was that the frames were misaligned.

The HUD in Google Glass was also outside of your normal field of view and for certain things this was a poor experience. For example, using Google maps integration I felt like I was taking my eyes off the road and felt safer simply using my smartphone mounted to my windshield.

bsenftner|8 years ago

These are going to own the security industry's human surveillance staff. Imagine being a security guard for some corporate or university campus, these are a must.

nradov|8 years ago

I think there's definitely a market for endurance athletes. Runners and cyclists would like to see time, speed, distance, cadence, power output, navigation, etc without having to look down at a wristwatch or bike computer. Those people are accustomed to spending a lot on sports equipment. There are existing products like the Everysight Raptor and Garmin Varia View but they're bulky or goofy looking or obstruct vision, so Intel has plenty of space to offer a better alternative.

bagacrap|8 years ago

I think cyclists would need it more than runners, but cyclists are too small of a market to be worth the R&D cost. That's why current offerings are bad. Cyclist-oriented tech has to piggyback on more popular tech, the way bike lights got a LOT better very quickly after smartphones started pushing li-ion tech.

QasimK|8 years ago

I agree in general that it would be useful in these situations.

As someone who wears glasses, I find it impossible to run with them because of the movement or them just falling off. Are people who wear glasses able to run with them - am I doing something wrong?

maltalex|8 years ago

Good for Intel.

This might just be the start of a new, big, market at a time when they desperately need to diversify their sources of revenue. Seeing them achieve that through in-house innovation as opposed to copying (or buying) competing products would be great.

erikj|8 years ago

I'm disappointed that "programming for Vaunt will involve JavaScript". We're stuck with this horribly designed language on the web because browsers don't run anything else (natively), and there is a strong trend in the field to replace it, either with compilation of saner languages to JS or with WebAssembly. We don't need to infect another nascent market with this atrocity, let it die.

seangrogg|8 years ago

The entire existing JS ecosystem can be leveraged for Vaunt, to include TypeScript/Flow/whatever else. They are target agnostic.

SquareWheel|8 years ago

WebAssembly has no intention of replacing JavaScript. Nor should it.

megy|8 years ago

Sure, but it is easy to use, easy to learn, and everyone has a compiler.

xab9|8 years ago

Just get over it. Either JS will kill every other language or webassembly will save us. Place your bets.

xamuel|8 years ago

ES5 was a trainwreck, but ES2015+ is actually a very good language, if you take the time to study it in depth.

gnicholas|8 years ago

Interesting that they chose red as the display color. One of the co-creators of Google Glass said that they initially all thought red would be good, but after trying different colors the consensus was that it was terrible. This was because there wasn't enough contrast with the background environment.

It sounds like Intel's tech is fundamentally different—they paint your retina with a laser—and this may make the background issue irrelevant. And it was certainly part of the safety pitch, which was that this is a very low-powered laser. If it had blue or green in it they couldn't make this claim.

fermienrico|8 years ago

>> Google Glass said that they initially all thought red would be good, but after trying different colors the consensus was that it was terrible

They should have looked at a larger sample size. Red has been used in rangefinder viewfinders for over 30 years. The Leica M240 has red and white as options. Red is by far the best from all standpoints: contrast, legibility and versatility.

Red dot sights are used in firearms and they are also by far superior to any other color.

I am doubtful of Google engineers and whether they put enough thought into exploring colors by using a larger sample size or cross checking different industries.

xab9|8 years ago

I need your clothes, boots and your motorcycle.

athenot|8 years ago

I really like how much effort they put in to make things as natural and unintrusive as possible.

The article mentioned there's no interaction yet; I wonder if they could track eyeball movement and use eye blinks for clicking. (Someone else mentioned a ring as an input device, which is also an excellent idea.)

marcus_holmes|8 years ago

I want these, and then I want an app on them that will tell me if the person standing directly in front of me is on my LinkedIn/Facebook, and if so, what their name is and what they do.

tomtoise|8 years ago

That'd be quite the technological feat considering they don't currently have cameras built in.

Nition|8 years ago

Unfortunately, as the other reply says, there's no camera to see who's there. But as someone who's bad with names, I generally agree; in fact, I don't even need the Internet connectivity. Just let me write a note on someone - "John Smith. Met at Steve's BBQ" - and have it show up when the camera sees them again.

jotm|8 years ago

I don't see the point of these overpriced smartwatches and glasses that can barely do one thing (display notifications), but I want my future full AR/VR glasses, so I'm glad the early adopters pay for it.

Right now, what I'd want is a very simple, slim, long-lasting watch that would notify me of emails, messages, phone calls and notifications based on their origin (business account, work account, personal, random). It doesn't even need a display; different vibration modes and LEDs would do fine.

ocdtrekkie|8 years ago

Fitbit Ionic has a five day battery life, give or take. But a smartwatch pales in comparison to how convenient Glass was to use: I used to be able to read and reply to texts while doing anything, including driving, safely. And not having to interrupt a conversation when I noticed the incoming call or message was unimportant was fantastic.

jgrahamc|8 years ago

I didn't until I bought an Apple Watch and now I wear it every day and find it a very useful accessory.

rthomas6|8 years ago

Pebble watch?

lev99|8 years ago

Did anyone get tech specs on the display?

How many characters can it display at once? One line or multiple lines? What's the resolution? Are images supported?

I don't think a monochrome device will sell, but the display is (to me) by far the most interesting development here.

mojomark|8 years ago

It's in the article: it's a monochrome Virtual Retinal Display (VRD), powered by a monochrome VCSEL light source and a MEMS scanner. The resolution of the image is 400×100 pixels.

JoeAltmaier|8 years ago

Surveillance will become ubiquitous. There's no plausible scenario where it doesn't, right? Can't legislate it away once the sensors/cameras become unnoticeably small.

So we'll need new social norms to control what people share about what they learn. E.g. we already don't mention what we hear from behind bathroom doors in polite company. We'll need rules so that people can continue to operate as humans in this new paradigm.

pi-squared|8 years ago

Yeah, right - "no one will use it for every tweet notification, it will only provide contextual information" - it will be used for whatever people want to use it and we know what people want to use it for (porn).

mynameisvlad|8 years ago

It's a monochrome display. Pretty hard to use it for porn.

deadbunny|8 years ago

Monochrome tiny porn?

Now you mention it, probably...

bencollier49|8 years ago

My immediate thought was "burn-in".

sumnulu|8 years ago

Project the inverse image of every other notification; problem solved. EyeSaver (TM)

bsaul|8 years ago

Oh yeah, that swipe gesture looks so natural...

Now in addition to people talking out loud alone on the streets, people checking their notifications on their watch and phones at dinner, we'll have people looking at your teeth doing weird eye and head motions when you talk to them...

lj3|8 years ago

I can't help but think wearable computers aren't going to become viable until we further refine brain computer interfaces.

dvfjsdhgfv|8 years ago

Paradoxically, the biggest selling point for me is that they're not produced by Google.

maltalex|8 years ago

There's nothing paradoxical about that.

Google products come with visible and invisible strings; this, at least for now, doesn't.

epmaybe|8 years ago

I watched the Verge video this morning where they tried to explain how the hologram projector works, but I am still confused. How can it ensure a sharp image shining directly through your own intraocular lens (not the glasses themselves) to the retina? If you have a "longer" or "shorter" eyeball the light ray's focus point will not be on the retina itself.

defterGoose|8 years ago

Laser light is "coherent" both spatially and temporally. Spatial coherence allows the beam to be "collimated", which basically just means that it looks like a cylinder and not a cone. That's why you can shine a laser at the moon and see the spot; most of the energy from the laser makes it to the same place, bounces back to your eyes, and you get a bright spot. Shining a flashlight doesn't work because the light spreads out too much. Technically some of the light still gets to the moon and back to your eyes, it's just below the threshold of your eye's ability to distinguish differences in brightness.

For the glasses, the laser is probably being scanned using a MEMS mirror (like how a DLP TV works, sorta), and modulated in brightness periodically to create the pixels. Since there's only one "point" of contact between your lens and the beam, the lens doesn't distort the beam like it would an image. That's where the idea of focus comes in. If you were looking at a photograph with your eye, there would be many sources and colors of light. Since the lens refracts incoming light based on direction, position, and color, your lens' job is to make sure the "pixels" of the photograph stay spatially organized with respect to each other. That's what being in focus means. And since the laser has only one color and one direction, all that light stays together and makes a nice dot on your retina. The only thing left to do is make a correction to the overall distortion pattern your lens introduces, which is similar for pretty much everyone. Same reason you need to add barrel distortion before sending video to an HMD. I think that's what they were showing with that "warping" red image of the glasses' display.
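The scanning-and-modulation step described above amounts to serializing a monochrome framebuffer into a stream of per-pixel brightness values in sweep order. A toy sketch, assuming a boustrophedon (back-and-forth) sweep; the function name, the sweep pattern, and the tiny 4×2 frame are illustrative assumptions, not anything from Intel's actual driver:

```python
def raster_stream(framebuffer):
    """Yield (x, y, brightness) in scan order, as a MEMS mirror might sweep.

    Even rows sweep left-to-right, odd rows right-to-left, so the mirror
    never has to fly back across the frame between rows.
    """
    for y, row in enumerate(framebuffer):
        cols = range(len(row)) if y % 2 == 0 else range(len(row) - 1, -1, -1)
        for x in cols:
            yield x, y, row[x]

# A tiny 4x2 monochrome frame standing in for the article's 400x100 display.
frame = [
    [0, 255, 0, 128],
    [64, 0, 32, 0],
]
stream = list(raster_stream(frame))
```

In the real device each yielded brightness would become a momentary modulation of the laser while the mirror points at that spot; the "display" exists only as this time sequence of dots on the retina.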

epmaybe|8 years ago

Reading the article, I see that they are using a VCSEL (vertical-cavity surface-emitting laser). Does that answer the question?

erikpukinskis|8 years ago

Special things start to happen with retinal projection + bufferless, clockless, stochastic rendering of signed distance fields.

erikpukinskis|8 years ago

Key:

bufferless - Don't wait for an image, just stream pixels as fast as you find them

clockless - No world "ticks", the world is an append-only log of "percepts"¹ which can be projected onto any time.

stochastic - Don't wait for certainty about a pixel, just push out the most probable ones first

signed distance field - The aforementioned "percepts" don't have well-defined boundaries like a polygon; instead, "fields" centered on a point describe how light moves around them. Any two fields can be trivially summed, so you can ignore most of a scene when searching for a specific pixel near a small number of local fields.

Together they allow you to supply the eyes with nearly zero-latency data with arbitrarily low computing power.

¹ As an aside, there is evidence humans don't see a "now" tick either, we perceive "fields out of time" directly, log them, and interpolate their relationship to "now" thereafter, such that we feel that we are "seeing" something which our eyes have already stopped reporting about. Thus SDFs and clockless rendering are a natural fit and map well to human perception.
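For readers unfamiliar with signed distance fields, here is a minimal sphere-tracing sketch of the idea the comment above builds on. Two caveats: combining fields is shown via min (the standard union rule for SDFs; the comment says "summed"), and all names and the two-sphere scene are illustrative assumptions, not the commenter's system:

```python
import math

def sphere(center, radius):
    """Signed distance field for a sphere: negative inside, positive outside."""
    cx, cy, cz = center
    def sdf(x, y, z):
        return math.sqrt((x - cx)**2 + (y - cy)**2 + (z - cz)**2) - radius
    return sdf

def union(a, b):
    """Combine two fields: the union's distance is the min of the two."""
    return lambda x, y, z: min(a(x, y, z), b(x, y, z))

def sphere_trace(sdf, origin, direction, max_steps=64, eps=1e-4):
    """March a ray: step forward by the field value until a surface is hit.

    The field value is a lower bound on the distance to the nearest
    surface, so stepping by it can never overshoot."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    t = 0.0
    for _ in range(max_steps):
        d = sdf(ox + t*dx, oy + t*dy, oz + t*dz)
        if d < eps:
            return t          # hit: distance along the ray
        t += d                # safe step
    return None               # miss

scene = union(sphere((0, 0, 5), 1.0), sphere((3, 0, 5), 1.0))
hit = sphere_trace(scene, (0, 0, 0), (0, 0, 1))   # ray straight down +z
```

Each pixel is an independent ray query like this, which is what makes the bufferless/stochastic scheme plausible: you can evaluate any pixel, in any order, at any time, without rendering a whole frame first.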

asah|8 years ago

I would be excited if these can display charts, graphs, pictures and video -- stuff that's too detailed for a smart watch and Alexa can't speak out loud.

I was just learning how to make a "Julia Child" omelette and would've loved to have her technique (<5 secs!) on repeat while I perform the maneuver.

jpalomaki|8 years ago

Gestures might be needed for some cases, but there are many situations when you could just use the smartphone as the control device. Often I do have one hand available, but the problem using a smartphone is that it requires me to stare at the screen. For example walking in the city, driving the car.

tritium|8 years ago

  Q. Hey, this won't just try to show me 
     more Twitter bullshit, will it?

  A. No, no, no! Heh heh! It will show you
     Yelp bullshit. Much better, yes?

  Q. Ah... so the advertising will finally 
     be the kind we all yearn for?

  A. Yessssss!
Wow. Thanks guys.

Tade0|8 years ago

> “You can ignore people more efficiently that way.”

They should really have a chat with my SO. She can always tell if someone is paying attention.

I think squinting is a pretty natural gesture that could be used for controlling this device. That is - if it's feasible to make it so.

manmal|8 years ago

I'm not sure I would want these. Shining a laser (even though it will be safe and well tested) right into my eyes seems a bit... risky. I presume the laser will be controlled by its own microcontroller, well tested and "unhackable"... but still, if we can learn anything from Spectre, it's that inconspicuous system parts can open a security hole. Burning holes into a retina is quickly done given the right amount of power.

There's also the thing that blue light accelerates retina cell death... I'd rather wait for some long-term studies before putting these on.

driverdan|8 years ago

Just because you don't understand something doesn't mean it's unsafe. From the description it sounds like the laser is incapable of a high enough output to damage your eyes.

lev99|8 years ago

I imagine there could be hardware limits on the laser.

sdrothrock|8 years ago

> Using a Vaunt display is unlike anything else I’ve tried. It projects a rectangle of red text and icons down in the lower right of your visual field.

Interesting choice of location -- I would have thought it would be better to put it somewhere above the normal field of view, as most people tend to look up when thinking and trying to recall something. The kind of information smart glasses offer seems like it could be more naturally accessed that way.

myrandomcomment|8 years ago

So the tech geek in me wants to love these things. It is so sci-fi future-y. I just cannot see using them.

1. I paid to have my eyes fixed so I do not need glasses. 2. I am not sure what value any of this brings. I have not seen the killer app.

That being said, there is an irrational part of me that wants to hold out for the in-eyeball version of this. The real killer app, now that I think about it, is the brain interface where you can think about the information you want and have it come up. Until then... well.

anotherevan|8 years ago

I think this looks awesome. The only problem is I have two pairs of glasses - one for reading/computer and one for everything else - so I'll need two of them.

If I was them, I would add a bone conduction speaker in the stem so it can give you audible information as well.

I also like the ring input device another commenter mentioned.[1]

[1] https://news.ycombinator.com/item?id=16308730

earenndil|8 years ago

> If I was them, I would add a bone conduction speaker in the stem so it can give you audible information as well.

It would probably be too heavy, part of their 'non-intrusiveness' goal is to make it be very light.

vadimberman|8 years ago

Wearables are one area where not having feature bloat means a MUCH better product.

Battery life, normal temperature, reliability vs. a gimmick that 1 in 1,000 users will want.

sebringj|8 years ago

I can picture using this with a voice interface w/ a full-screen transparent overlay toggle for navigating the world. Eye-swiping seems too difficult. Couple that with sensing where your hands are and you could manipulate 3D interfaces with gestures. People talking to themselves on a Jawbone was weird enough. Now they'll go full-on schizo: everyone a talking mime with their own personal Minority Report / Terminator UI.

rland|8 years ago

I can't help but think of how much more productive a factory worker or repairman would be with these glasses, if it were possible to display instructions, dimensions, parts of a reference manual, etc...

The first successful business in the "glasses with HUD" space will be one that targets businesses which manufacture complex things using human beings.

trisimix|8 years ago

If you can display it on that tiny-ass screen you could probably just program a machine to do it. Sounds like a business that wouldn't last long.

dlokshin|8 years ago

Interesting marketing choice that all of the images are of the actual glasses, and none of the images are of what your eyes see.

hbosch|8 years ago

From the description it sounds incredibly difficult to photograph or represent the UI in an accurate way.

walterbell|8 years ago

> Intel intends to attract investors who can contribute to the business with strong sales channels, industry or design expertise, rather than financial backers.

Luxottica-Essilor? They own global eyeglass distribution, both offline and online. They can take the Android market. Apple’s glasses are supposedly a couple of years away.

2muchcoffeeman|8 years ago

These are only normal if you enjoy keeping up with the latest fashions.

I don’t like huge thick rimmed glasses and I never will.

adrianmonk|8 years ago

Whether it fits your personal style has no bearing at all on whether it's normal. I don't like v-neck shirts, so personally I never wear them, but they're definitely normal.

eutropia|8 years ago

As someone who wears glasses that look like those every day -- I think they look great! Big glasses like these help frame my overly-large head and face features, making me look better :)

kazinator|8 years ago

Knowing Intel, the project will be canned in a year or two; back to x86 chips and related peripherals.

aurizon|8 years ago

Intel is a dyed-in-the-wool monopolist, which means the high cost will bar widespread acceptance.

adultSwim|8 years ago

When can I actually buy a pair? How much will they cost?

This is the kind of product I would be really interested in using. I was sad when Google Glass died after pushback in the Bay Area.

pavlov|8 years ago

Basically the Pebble of AR.

Since Intel doesn’t productize themselves, I wonder who Intel will find to build and sell it. Traditional PC companies don’t seem like a great fit, but I have a feeling the launch partners will be companies like Asus anyway.

bravo22|8 years ago

"holographic grading" should really be "holographic grating".

senectus1|8 years ago

give me a small ring on my finger to control it and I'll buy a pair.

ddalex|8 years ago

Funny you should say that - when I was working @Intel I wrote a proposal for a Bluetooth Low Energy ring that would provide a simple input device for headless devices, with one rotary input and one press input, sufficient to scroll and click on items.

Another proposal I had was a bracelet that would sense capacitance changes in the hand when fingers touch each other; this way you could have a 12-key "keyboard" on the phalanges of the non-opposable fingers (3 phalanges x 4 fingers), touchable by the thumb.

Such technologies would require minimal power and provide good interaction with any headset, but at the time Intel was not interested in the research needed to build a prototype.
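The 12-position phalange "keyboard" described above maps naturally onto a telephone keypad (12 keys: 1-9, *, 0, #). A hypothetical sketch; the ordering of fingers and phalanges, and the keypad assignment, are my assumptions for illustration, not the actual proposal:

```python
# 3 phalanges on each of 4 non-thumb fingers = 12 touch targets for the thumb.
FINGERS = ["index", "middle", "ring", "pinky"]
PHALANGES = ["proximal", "middle", "distal"]

# One plausible assignment, laid out telephone-keypad style.
KEYPAD = ["1", "2", "3", "4", "5", "6", "7", "8", "9", "*", "0", "#"]

def key_for(finger, phalanx):
    """Map a (finger, phalanx) capacitance event to its keypad symbol."""
    i = FINGERS.index(finger) * len(PHALANGES) + PHALANGES.index(phalanx)
    return KEYPAD[i]
```

So a thumb tap on the proximal phalanx of the index finger would read as "1", and one on the distal phalanx of the pinky as "#": enough for T9-style text entry with no visible hardware beyond the bracelet.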

ourmandave|8 years ago

It better be water resistant, what with all the wiseguys kissing it.

clueless123|8 years ago

IMHO, smart glasses will soon be as ubiquitous as cellphones. We humans keep increasing our communication and multitasking, and this item covers both.

moogly|8 years ago

Restaurant reviews, recipes, shopping lists. Yeah, I'll wait for something more compelling I guess.

nradov|8 years ago

Well that's the point. Intel is releasing a developer preview in the hopes that third parties will create something more compelling.

adultSwim|8 years ago

For me, real life ad-block is the killer app for AR.

floatboth|8 years ago

This will be the most useful for high school kids cheating at tests :D

TYPE_FASTER|8 years ago

It's interesting that this is on the front page at the same time as the article about Apple. I would love a product with Apple quality that would provide real value AR. Hope they make it happen.

navium|8 years ago

Hope this spectre doesn't meltdown.

ekblom|8 years ago

In a few years, we will all be wearing those VR-goggles that Wade Watts is using in Ready Player One. Can't wait!

sethammons|8 years ago

I'm more interested in the augmented reality glasses in the Daemon and Freedom books by Daniel Suarez. Those could be game changers.

nvus|8 years ago

Good news after bad news: meltdown and spectre.

alvil|8 years ago

[deleted]

crispytx|8 years ago

"uses retinal projection to put a display in your eyeball..."

That sounds awful! Like some shit from Black Mirror! Jesus, Fuck!

crispytx|8 years ago

I'd like to clarify, tech is great and all, but I don't want to become a fucking cyborg. What the hell is wrong with these people?

xedarius|8 years ago

They're waiting for you Gordon ... in the test chamber.

Uhhrrr|8 years ago

This will lead to all glasses wearers being treated with suspicion, until such time as cameras are also readily available in shirt buttons and everything else. At that point recording will become unavoidable and therefore normalized.

ryanpetrich|8 years ago

The device under discussion does not have a camera.

rocky1138|8 years ago

> recording will become unavoidable and therefore normalized

This is already the case.