skedaddle | 3 years ago

What you described is how a camera sees the world, and how we see it in pictures. Part of what seems unconvincing to me about these concepts is that the demo UIs are superimposed over basically isometric, wide-depth-of-field pictures and video.

So by "plane of projection" I mean the apparent distance of the virtual image of the UI. If your eyes are focused "through" the display on something a few yards away rather than something relatively close, the light from the UI needs to reach your eye (possibly via glasses, etc.) as though its rays originate from roughly that same distance, arriving nearly parallel in the case of far objects.
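
To put rough numbers on that (back-of-the-envelope arithmetic of my own, with distances made up for illustration): light from a point at distance d arrives with vergence 1/d diopters, and the eye has to accommodate to match it.

    # Back-of-the-envelope sketch (my own illustrative numbers, not a spec):
    # light from a point at distance d arrives with vergence 1/d diopters.
    def vergence_diopters(distance_m: float) -> float:
        return 1.0 / distance_m

    scene = vergence_diopters(5.0)  # object ~5 m away: 0.2 D, rays near-parallel
    ui = vergence_diopters(0.3)     # UI imaged ~0.3 m away: ~3.3 D
    # To appear pinned to the far object, the UI's light must also arrive
    # at ~0.2 D; a ~3 D mismatch means one or the other is out of focus.
    print(f"scene {scene:.1f} D, UI {ui:.1f} D, mismatch {ui - scene:.1f} D")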

Quake-style consoles and other HUDs work in video games because in reality the entire scene comes from the plane of the display, some inches in front of you. If you could actually focus at the distance a game object appears to be, say 20 yards away, instead of on the screen in front of you, the HUD would fall out of focus and effectively disappear.
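
A rough way to quantify that (my own arithmetic, using the standard defocus approximation): the angular size of the blur disc is about the pupil aperture times the accommodation mismatch in diopters.

    # Rough defocus estimate via the standard thin-lens approximation
    # (all numbers here are my own assumptions): blur disc angular
    # diameter ~ pupil aperture (m) x focus mismatch (diopters).
    import math

    pupil_m = 0.003                # ~3 mm pupil
    screen = 1.0 / 0.6             # monitor ~0.6 m away: ~1.7 D
    game_object = 1.0 / 18.0       # "20 yards" away: ~0.06 D
    blur_rad = pupil_m * abs(screen - game_object)
    print(f"HUD blur ~ {math.degrees(blur_rad):.2f} degrees")  # ~0.28 deg
    # That is comparable to the angular size of HUD glyphs themselves, so
    # the HUD smears into illegibility if you accommodate "into" the scene.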

In VR optics I believe the virtual screen sits roughly 6 feet in front of you. It is a compromise and still causes eye strain, but it is perceptually workable. The issues for transparent AR seem much more complex.
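
The arithmetic behind that compromise looks something like this (assuming a ~2 m focal plane to match the ~6 ft figure; illustrative, not any vendor's spec): the optics fix accommodation at one distance while stereo rendering drives convergence to another.

    # Vergence-accommodation mismatch for a fixed ~2 m (~6 ft) focal plane,
    # per the figure above; actual headsets vary and this is illustrative.
    focal_plane = 1.0 / 2.0                 # optics focus: 0.5 D, fixed
    for rendered_m in (0.3, 0.5, 1.0, 2.0, 10.0):
        convergence = 1.0 / rendered_m      # where the eyes converge
        print(f"object at {rendered_m:>4} m: conflict "
              f"{abs(convergence - focal_plane):.2f} D")
    # Content near the 2 m plane is comfortable; close-up objects push
    # past ~1 D of conflict, the region usually cited for strain.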

Many of the far-out concepts mocked up for AR actually seem very achievable right now, or yesterday, if the display works the way a phone's, a laptop's, or VR goggles' does: by re-projecting camera input from a plane a few (possibly virtual) inches or yards away from your eye. The value added, though, is pretty niche, because people have mobile phones anyway. But if the iPhone hadn't happened, a little display in front of the eye, a whole visor, or a pop-up wrist computer might have been possible to sell. It sounds kind of silly now, but that's what the expectation was in the 80s-90s. The idea of putting a computer on your head still seemed cool then.
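
For the "works like a phone" version, the compositing itself is the easy part (a minimal sketch with hypothetical shapes: camera_frame is HxWx3, ui_rgba is HxWx4):

    # Minimal video-passthrough compositing sketch (hypothetical names;
    # real pipelines add lens correction, reprojection/time-warp, etc.).
    import numpy as np

    def composite(camera_frame: np.ndarray, ui_rgba: np.ndarray) -> np.ndarray:
        """Alpha-blend a UI layer over the camera feed on one display plane."""
        alpha = ui_rgba[..., 3:4].astype(np.float32) / 255.0
        blended = (camera_frame.astype(np.float32) * (1.0 - alpha)
                   + ui_rgba[..., :3].astype(np.float32) * alpha)
        return blended.astype(np.uint8)

    # Scene and UI both end up on a single plane in front of the eye, so
    # there is no focus conflict -- the HUD-on-a-monitor case, not true
    # optical see-through.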
