item 7474680

Forge – mobile 3-D capture [video]

82 points | acjohnson55 | 12 years ago | aboundlabs.com | reply

35 comments

[+] ihnorton|12 years ago|reply
At the end of the video, one camera is from PrimeSense, and the other appears to be a SoftKinetic (or Senz3D); is that correct? Does anyone have experience comparing those cameras for accuracy/fov/framerate/etc.?
[+] nobbis|12 years ago|reply
Correct. It's a SoftKinetic DS325.

Its FOV is wider than the PrimeSense sensor's, and it's capable of a higher frame rate, but it's a little less accurate and its range is currently shorter.

The two technologies (structured light vs time of flight) have different strengths/weaknesses, but I believe ToF is the future.

[+] hatuman|12 years ago|reply
Seems like the tip of a very big iceberg. I sent this on to a friend who creates 3D construction and real estate models. I think he needs to diversify before too long...
[+] nobbis|12 years ago|reply
I agree. Also, Forge's underlying streaming 3D technology has lots of applications - mobile capture is just the first.
[+] diafygi|12 years ago|reply
Great demo! A few questions:

1) What previously existing libraries are you using (if any)?

2) Since the video capture (assuming it's from the tablet's camera) is detached from the depth capture, how do you coordinate the two to make the grid layout in the video? Structure.io seems to require attaching the depth device to the video device (so the perspectives are relatively fixed).

[+] nobbis|12 years ago|reply
1) The core reconstruction technology is written from scratch (it's been a full-time project for over 18 months).

Some open source projects it uses: Eigen, OpenNL, OpenMesh, OpenCTM, GLM, Protobuf, Redis, Gluster, Node.

2) Video capture is actually from the RGB-D sensor, so it's registered with the depth. I believe the Structure sensor requires that the camera is fixed to the mobile device so that its IMU can be used for tracking. No such requirement here.
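Depth-to-color registration as described above is standard pinhole-camera geometry: back-project a depth pixel to a 3D point, transform it by the extrinsics between the two cameras, and reproject into the color image. A minimal sketch follows; the intrinsics, rotation, and baseline values are illustrative placeholders, not Forge's or the DS325's actual calibration.

```python
import numpy as np

# Hypothetical calibration values; real sensors ship factory-calibrated ones.
K_depth = np.array([[285.0,   0.0, 160.0],
                    [  0.0, 285.0, 120.0],
                    [  0.0,   0.0,   1.0]])   # depth camera intrinsics
K_color = np.array([[290.0,   0.0, 160.0],
                    [  0.0, 290.0, 120.0],
                    [  0.0,   0.0,   1.0]])   # color camera intrinsics
R = np.eye(3)                                 # rotation depth -> color frame
t = np.array([0.025, 0.0, 0.0])               # assumed 25 mm baseline

def register_depth_pixel(u, v, z):
    """Back-project depth pixel (u, v) with depth z (meters) to a 3D point,
    then reproject it into the color image, returning (u', v') there."""
    p_depth = z * np.linalg.inv(K_depth) @ np.array([u, v, 1.0])
    p_color = R @ p_depth + t
    uvw = K_color @ p_color
    return uvw[:2] / uvw[2]

# Pixel at the depth image center, 1 m away, lands slightly to the right
# in the color image because of the horizontal baseline.
print(register_depth_pixel(160, 120, 1.0))  # -> [167.25 120.  ]
```

With a combined RGB-D module the vendor typically does this mapping on-device, which is why the color stream arrives already registered with the depth.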

[+] bayesianhorse|12 years ago|reply
Will it be free? Will the models be free? It would be very neat when used with Blender...
[+] nobbis|12 years ago|reply
I'd really like for light usage to be free, but I'm undecided as to the best monetization strategy.

One option is for model creators to be paid when users download their public models.

[+] rmc|12 years ago|reply
I did not know you could buy off-the-shelf 3D cameras. Where can I buy one?
[+] yogrish|12 years ago|reply
This seems to be in line with Project Tango of Google. https://www.google.com/atap/projecttango/
[+] nobbis|12 years ago|reply
Project Tango has a 320x180 depth sensor running at 5 fps, i.e. ~290k depth measurements per second. Compare this with off-the-shelf depth cameras (e.g. the DS325) that generate 320x240 at 60 fps, i.e. ~4.6M measurements per second.

The reason for this is that mobile processors aren't fast enough to process more information. So the Tango prototype has to have, in addition to its depth camera, a special motion-tracking camera with a fish-eye lens and two dedicated processors in order to track robustly.

Even then, with less depth information, the quality of any Tango reconstruction will be far inferior. Maybe in 5-10 years, mobile processors will approach what desktop GPUs are capable of today.

In any case, it remains to be seen whether Google can persuade cellphone manufacturers to include two special cameras and two extra processors in their future devices.
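The throughput figures quoted above are simple arithmetic (resolution times frame rate), and can be checked directly:

```python
def measurements_per_second(width, height, fps):
    """Depth pixels produced per second at a given resolution and frame rate."""
    return width * height * fps

# Project Tango prototype: 320x180 depth at 5 fps
tango = measurements_per_second(320, 180, 5)     # 288,000  (~290k)

# SoftKinetic DS325: 320x240 depth at 60 fps
ds325 = measurements_per_second(320, 240, 60)    # 4,608,000 (~4.6M)

print(tango, ds325, ds325 // tango)  # 288000 4608000 16
```

So the off-the-shelf camera delivers roughly 16x the depth data the Tango prototype can afford to process.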

[+] therobot24|12 years ago|reply
Nice! I'd be interested in the method they use to put everything together. My best guess is some basic structure-from-motion weighted by the depth sensor... or maybe it's simpler than that...
[+] nobbis|12 years ago|reply
Author here. It uses color info as well as depth for tracking; otherwise, it'd fail if you pointed the camera at featureless geometry, e.g. walls or floors.
[+] bromagosa|12 years ago|reply
I tried the URL that's shown at the end of the video and it didn't work :(

Any live demos?

[+] nobbis|12 years ago|reply
No, sorry. I'm moving coast to coast next week, but hope to start beta testing late April.

Enter your email on the website and I'll let you know before beta testing starts.

[+] jc_dntn|12 years ago|reply
this is going to totally change the way we send dick pics.