I was expecting this to be another phone holder; much more interesting that you hacked together some hardware. Does this feature a low-persistence display? Also, you mention "cheap tracking for next big update": is this just going to be an improvement over your current tracking, or full 6 DOF tracking? I don't think I've seen any hobbyist 6 DOF tracking for VR yet.
Oh, one note: in your README, the part about Jonas convincing Chinese factories to sell you parts at premium prices should be changed. You probably meant he got really good prices, but premium pricing means basically the opposite.
> I don't think I've seen any hobbyist 6 DOF tracking for VR yet.
Webcam based hobbyist 6dof headtracking for use with desktop screens has been around for many years (freetrack, ftnoir/opentrack and so on), but the quality is rather dreadful. Still, people who don't mind glacial latency from very heavy smoothing have been very happy with those solutions.
But VR requires so much headmounted technology that tradeoffs between cost/weight and quality shift a lot. For desktop tracking, adding head mounted sensors to the existing single camera 6dof tracking solutions would at least double the amount of hardware involved. But when your baseline is a full VR headset, those sensors are an almost negligible extension. Gyro sensors and/or an "inside out" camera could easily add a lot of precision/speed (effectively the same metric, with filtering) to the rotary axes of single cam 6dof. Last time I looked at opentrack it already supported some sort of fusion between stationary camera and Android gyros. This would be a good starting point for a hobbyist VR rig (not room scale).
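The camera+gyro fusion described above can be sketched as a complementary filter: integrate the fast but drifting gyro, and continuously pull the result toward the slow but drift-free camera estimate. This is a minimal illustration only; the function name, the single-axis simplification, and the 0.98 blend factor are my own assumptions, not anything opentrack actually ships.

```python
def fuse_yaw(cam_yaw, gyro_rate, prev_yaw, dt, alpha=0.98):
    """One complementary-filter step for a single rotary axis.

    cam_yaw:   absolute yaw from the stationary camera (deg), noisy/laggy
    gyro_rate: angular rate from the head-mounted gyro (deg/s), fast but drifts
    prev_yaw:  previous fused estimate (deg)
    dt:        timestep (s)
    alpha:     how much we trust the gyro short-term (0..1)
    """
    gyro_yaw = prev_yaw + gyro_rate * dt          # high-rate dead reckoning
    return alpha * gyro_yaw + (1.0 - alpha) * cam_yaw

# Usage: with a stationary camera reading of 10 deg and a quiet gyro,
# the estimate converges to the camera value without gyro drift sticking.
yaw = 0.0
for _ in range(500):
    yaw = fuse_yaw(10.0, 0.0, yaw, 0.01)
```

The appeal for a hobbyist rig is that this runs in a few lines per axis at gyro rate, while the camera only needs to keep the long-term average honest.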
“I know this request is a little bit out of the ordinary, but would it be possible for you to charge us premium prices for this item? We’d like to pay more. No? Please, we’ll only buy it if you charge us more. Ok, thanks!”
About young children using VR: there was a warning of potential risks for children using VR two or three years ago:
https://uploadvr.com/study-vr-children/
It’s not a big study, nor does it come to any concrete results, so I was wondering if you knew of more data on the subject, or of any real-world feedback on the matter.
Motion-to-photon latency isn't mentioned. It's basically the most important characteristic of a good VR set. That's why all smartphone-based VR solutions suck and make you sick.
> Motion-to-photon latency [...] most important [...] sick.
Yes and no. And the "no" seems underappreciated.
I normally run my Vive and Lenovo WMR at 30 fps on an old laptop with Intel integrated graphics. So why hasn't it made people sick? Camera passthrough AR helps. Likely the "comfort mode"-like tunnel-vision effect of not doing barrel or chromatic aberration correction. Perhaps not doing predictive tracking, so lag but no judder. Maybe "visible out the corner of your eye" framing. Maybe something else.
Most VR reporting starts from an assumption of games. Games, games, always games. So "you are there" immersion, with no avoidable visible artifacts, no AR, etc. So 90 fps, constant latency, high GPU and HMD bandwidth demands. But if you don't care about games, if you just want a desktop replacement/alternative... the design constraint space looks very different.
What in a smartphone-based VR solution would give it inherently slow motion-to-photon latency? The communication between the CPU and the motion sensors? The 3D rendering?
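For a rough sense of where the time goes, here is a back-of-envelope budget. Every number below is illustrative, not a measurement of any particular phone; the point is that several pipeline stages each contribute, and on a phone you control almost none of them.

```python
# Illustrative (not measured) contributions to motion-to-photon
# latency on a phone-based HMD, in milliseconds.
budget_ms = {
    "IMU sample + OS event delivery": 2.0,
    "sensor fusion / game loop wait": 8.0,   # pose consumed once per frame
    "render (one 60 Hz frame)": 16.7,
    "compose + display scanout": 16.7,
    "panel response": 5.0,
}
total = sum(budget_ms.values())
print(f"~{total:.1f} ms motion-to-photon")
```

Dedicated headsets attack each line separately (higher refresh, late-latching the pose just before scanout, timewarp, low-persistence panels), which is why they get under ~20 ms while naive phone pipelines sit around 50 ms or worse.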
Well to be fair, he doesn't own the IP he creates at Oculus, Facebook does. It is entirely possible he'd like to open source it, personally (though I have no idea if this is the case).
We're all impostors, so what? I sometimes feel that this basic insecurity in IT really creates strife and abrasiveness between people instead of mutual respect. I find it better to embrace your own imperfections and in this way find your own strengths, which, sadly, are too often shadowed by the need to keep up appearances. There is no perfect IT person. We all suck. All tech sucks. People in general suck at things people do. This is life, and I am ok with it: I am content with who and where I am, and I am ok with other people having different goals, different life experiences and different achievements. My friend works at NASA; I've tried a lot of psychedelics; somebody has been backpacking around the world with 5$ in their pocket - and we're all deserving to be allowed (by ourselves) to be happy.
That having been said, these kids are really cool and I wish them the best of luck!
Without knowing anything about the quality, I can say this is pretty amazing. I mean, putting together a team which builds hardware and software for what they want to have.
Quick question for any experts reading this - do Oculus or VIVE use any sort of dead reckoning/movement prediction in their tracking? Also does anyone have any documentation on the APIs for this, or information on how the devices keep track of their latency and calibration information?
There are fundamental limits on latency, especially with spread-spectrum transmission when all of this goes wireless. As accurate as the tracking and pointing are for controllers, I feel like some additional extrapolation is happening. It would be great to have an open source library for this so we can give hand-built rigs the best tracking that's mathematically possible.
>Quick question for any experts reading this - do Oculus or VIVE use any sort of dead reckoning/movement prediction in their tracking? Also does anyone have any documentation on the APIs for this, or information on how the devices keep track of their latency and calibration information?
Absolutely. The Vive combines IMU-based dead reckoning with Lighthouse sensors to provide tracking. The dead reckoning is super important for maintaining tracking during sensor occlusion. The API it interfaces with is SteamVR, which is mostly open source, so you can even see how they’re doing it. The new-generation Vive Pro will combine this with stereo-camera CV-based inside-out tracking for even better precision.
Yes, it fuses both relative and absolute measurements (each with its own drawbacks), in what's usually called sensor fusion. It's very well explained here: http://doc-ok.org/?p=1478
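The relative/absolute split can be shown with a deliberately simplified 1D constant-gain filter: dead-reckon from the IMU every step, and whenever an absolute (Lighthouse-style) fix arrives, blend toward it. Real trackers use a proper Kalman filter; the function below and its 0.3 gain are invented purely for illustration.

```python
def step(pos, vel, accel, dt, absolute=None, gain=0.3):
    """One fusion step in 1D.

    Relative path: integrate IMU acceleration (fast, but drifts).
    Absolute path: when a Lighthouse-style fix is available, pull the
    drifting estimate partway toward it (slow/sparse, but drift-free).
    """
    vel += accel * dt
    pos += vel * dt
    if absolute is not None:
        pos += gain * (absolute - pos)   # correct accumulated drift
    return pos, vel

# Usage: dead reckoning alone moves the estimate; an absolute fix
# then reins in whatever error has accumulated.
pos, vel = step(0.0, 1.0, 0.0, 0.1)               # IMU-only step
pos2, _ = step(0.0, 1.0, 0.0, 0.1, absolute=0.0)  # step with a fix
```

This is also why occlusion is survivable: the relative path keeps producing poses on its own, and the absolute path merely cancels drift whenever it reappears.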
I know the Vive does some kind of motion prediction for its controllers, at least in the case they lose tracking: if you quickly move a controller out of view of the lighthouses (kind of hard to do if you have the lighthouses set up well; I had to hide the controller under my shirt) then the system will show the controller continuing to move in the direction it was moving for a short bit.
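The exact predictor Valve uses isn't publicly documented, but a damped constant-velocity extrapolation reproduces the "keeps moving for a short bit" behavior described above. All names and the decay constant here are hypothetical, chosen only to make the idea concrete.

```python
import math

def predict(last_pos, last_vel, t_since_loss, decay=5.0):
    """Extrapolate a lost controller's position.

    Integrates an exponentially damped constant-velocity model, so the
    controller coasts along its last known velocity and smoothly comes
    to rest at last_pos + last_vel/decay instead of flying off forever.
    """
    k = math.exp(-decay * t_since_loss)
    return last_pos + last_vel * (1.0 - k) / decay
```

At the instant tracking is lost the prediction equals the last measured pose, and as time passes it asymptotically stops, which matches the short glide you see when hiding a controller from the lighthouses.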
Can't wait for the day when we truly have modular VR.
It's going to take a few years, and I know Oculus has the right idea with their ecosystem, but it sort of bums me out that the Vive didn't end up being the hackers' headset.
Today it feels like the Vive was built out of spite and HTC got lucky that Valve went to them first.
I think the less cynical answer is that HTC often has good ideas for physical devices but then often fails to follow through and iterate well. They're also just struggling as a company in general.
But damn, I 1000% agree with being bummed about it not being the hackers' headset. I preordered the Vive because of a VR video of a guy programming the environment he was in at the moment:
https://www.youtube.com/watch?v=db-7J5OaSag
There's OSVR, which is about as modular as you can get - there are plugins to interface with e.g. SteamVR, and it's compatible with VRPN for peripherals. (Very much a dev kit though, if you're looking for something you can just plug in and use the OSVR headsets are absolutely not that.)
I have to wonder if this team could have gotten more mileage out of working with OSVR, since a lot of the work to connect it with existing VR apps is already done. But there's certainly value in doing it all yourself!
Imagine this with a 4k panel. Depending on the lenses, it could be the highest resolution HMD currently available.
Panelook seems down, but even if only 4K@30 5.5" panels are currently available/affordable... well, no gaming, but I use the Vive and WMR at 30 fps as a desktop alternative.
My immediate thought went to seeing if a VR headset could be created with a 200+ degree field of view similar to a Pimax or StarVR using two of these displays!
Hi, how did you guess? My friends and I were completely in love with SAO, and we decided to build a virtual world to go to after (or instead of) school. But we ended up building a VR headset.
Because there is no software platform for driving it? There should be a serial USB / XML interchange format for sending the device capabilities to the driving PC. Something like the EDID that HDMI displays report, but for VR headsets.
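Something in that spirit might look like the following sketch, which parses a hypothetical capability descriptor the headset could send over serial/USB. Every element and attribute name below is made up for illustration; no such standard exists.

```python
import xml.etree.ElementTree as ET

# Hypothetical EDID-style capability descriptor for an HMD.
# All tags/attributes are invented for this example.
descriptor = """\
<hmd vendor="diy" model="homebrew-hmd">
  <display width="2560" height="1440" refresh_hz="60" panels="2"/>
  <lens ipd_mm="63" distortion="barrel" k1="0.22" k2="0.24"/>
  <tracking dof="3" imu_rate_hz="1000"/>
</hmd>"""

root = ET.fromstring(descriptor)
display = root.find("display")
print(display.get("width"), display.get("refresh_hz"))  # prints "2560 60"
```

The host PC could read this once at plug-in and configure its render resolution, distortion correction, and tracking pipeline accordingly, instead of every engine hard-coding per-headset parameters.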
Cool project by teenage kids; shame that it's useless in practice since there'd be no software support from major engines. So as a next project you'd need to make an engine plugin, for UE4 for example.
All the "useful" stuff you're thinking of very likely started out much less polished / refined than this project. It seems to have struck a chord with this audience, and it will only get better from here.
rsbartram | 8 years ago:
Focusing this type of educational attention on children at an early age is critical for future personal and professional development.
This is what allows consumers to have a build-it-yourself VR headset for $100.
I have covered a few early STEAM programs in the Los Angeles area.
https://latechnews.org/raymond-ealy-founder-steamcoders/
https://latechnews.org/stem3-academy-open-house-november-4/
52-6F-62 | 8 years ago:
This looks like a really fun project.
If you're part of it, great work!
bufferoverflow | 8 years ago:
Carmack has given long lectures about that issue.
mncharity | 8 years ago:
But yes, their focus seems to be Unity and games.
DiThi | 8 years ago:
Well, GearVR and Daydream suck, but at least they don't make you sick if you stay seated and avoid apps with artificial movement.
dTal | 8 years ago:
If only he followed his own advice.
JepZ | 8 years ago:
Cool kids :-)
DiThi | 8 years ago:
The headset is not wireless and the controllers have 1-2 ms of latency. They compress the controller data a lot in a very smart way. More info: https://hackaday.com/2016/12/12/cnlohr-reverses-vive-valve-e...
mmanfrin | 8 years ago:
I would love to use a VR IDE some day.
sitkack | 8 years ago:
Do you edit text, code, email and surf inside the helmet?
Do you have any issues with keyboard and mouse?
Do you sit in a desk chair?