top | item 41551872


bounceimaging | 1 year ago

Hi! I'm the CEO of Bounce Imaging. Not sure exactly how to post here, but excited to see ourselves on here after someone sent me the link. Happy to show some sample video as requested: https://youtube.com/shorts/PlmG9HdzltU?feature=share (I posted this quickly on my own YouTube, as I'll need my colleagues at work to put up a better one on our official page.)

As noted, our near-IR cameras (prior generations, the Recce360 Mini etc.) are indeed much easier to interpret in flight than thermal, just because it is hard to get your head around panoramic thermal as easily. If you want to see full 360° video of that, just download our app (Bounce Viewer), and in the History section scroll down to the demo videos; you can pan around as the camera is thrown in the air or shooting around on the back of a dog. Note that the stabilization is along multiple axes!

Jonas' setup was indeed pretty cool, but this had actually been tried many times before that, including cool designs by the Brits, the US Navy, and others from decades ago, and a cool conceptual design by Franziska Faro (spelling, sorry!). All of them, however, suffered from the challenge of doing real-time, low-latency processing and automatic stabilization in flight without melting the camera through too much processing. The way we cracked it, first for our visual cameras and now for thermal, is a stitching method that is 200x-2000x more efficient and noise-insensitive because it is not based on SURF/SIFT feature detection (if you're into the nerdy side of things).
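(The actual pipeline isn't described here, but to illustrate why skipping feature detection saves so much compute: with a rigid, factory-calibrated camera array, the mapping from each sensor pixel to a panorama pixel is fixed, so it can be precomputed once. A hypothetical minimal sketch, assuming toy sizes and ideal cameras with no lens distortion; all names and numbers below are made up for illustration:)

```python
import numpy as np

# Toy dimensions: four cameras, each covering a 90-degree slice of a
# 360-degree panorama. A real device would bake measured lens
# distortion and extrinsics into the table instead.
PANO_H, PANO_W = 64, 256   # panorama size
CAM_H, CAM_W = 64, 64      # per-sensor size
N_CAMS = 4

def build_lookup_table():
    """Precompute (camera index, row, col) for every panorama pixel."""
    cam_idx = np.repeat(np.arange(N_CAMS), PANO_W // N_CAMS)
    cam_idx = np.broadcast_to(cam_idx, (PANO_H, PANO_W))
    rows = np.broadcast_to(np.arange(PANO_H)[:, None], (PANO_H, PANO_W))
    cols = np.broadcast_to(np.tile(np.arange(CAM_W), N_CAMS),
                           (PANO_H, PANO_W))
    return cam_idx, rows, cols

LUT = build_lookup_table()  # computed once, reused for every frame

def stitch(frames):
    """Stitch one synchronized frame set (N_CAMS, CAM_H, CAM_W) into a
    panorama: a single array gather, no per-frame feature matching."""
    cam_idx, rows, cols = LUT
    return frames[cam_idx, rows, cols]

# Usage: four flat-colored "sensor" frames become a striped panorama.
frames = np.stack([np.full((CAM_H, CAM_W), i * 60, dtype=np.uint8)
                   for i in range(N_CAMS)])
pano = stitch(frames)
```

Because the per-pixel gather is data-independent, it is also insensitive to image noise, whereas SURF/SIFT matching degrades on low-texture or noisy (e.g. thermal) imagery.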


tim-fan | 1 year ago

Cool!

Have you looked at producing 3D reconstructions over the thrown trajectory? And/or something like a gaussian splat-based representation for viewing the whole trajectory at once?