bounceimaging | 1 year ago
As noted, our near-IR cameras (prior generations such as the Recce360 Mini) are indeed much easier to interpret in flight than thermal, simply because panoramic thermal imagery is harder to get your head around. If you want to see full 360° video of that, download our app (Bounce Viewer), scroll down to the demo videos in the History section, and you can pan around as the camera is thrown in the air or riding on the back of a dog. Note that the stabilization is along multiple axes!
Jonas's setup was indeed pretty cool, but this had actually been tried many times before that, including cool designs by the Brits, the US Navy, and others from decades ago, and a cool conceptual design by Franziska Faro (spelling, sorry!). All of them, however, ran into the same challenge: doing real-time, low-latency processing with automatic stabilization in flight without melting the camera through too much computation. The way we cracked it, first for our visual cameras and now for thermal, is a stitching method that is 200x-2000x more efficient and noise-insensitive because it is not based on SURF/SIFT feature detection (if you're into the nerdy side of things).
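(Editor's note: the comment doesn't say what the feature-free method actually is, so this is only an illustration of the general idea, not Bounce's algorithm. One classic example of frame alignment that skips feature detection entirely is phase correlation, which recovers the translation between two frames from the cross-power spectrum of their FFTs; because the spectrum is whitened, it is largely insensitive to noise and illumination changes.)

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the circular (dy, dx) translation taking frame b to frame a.

    No keypoints, no descriptors: just two FFTs, a normalized
    cross-power spectrum, and an argmax over the inverse FFT.
    """
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    R = A * np.conj(B)
    R /= np.abs(R) + 1e-12          # whitening -> robust to noise/brightness
    corr = np.fft.ifft2(R).real     # impulse at the true shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # map wrapped indices into a signed range
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)

# Toy check: shift a random "frame" by (5, -3) and recover the offset.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, (5, -3), axis=(0, 1))
print(phase_correlation_shift(shifted, img))  # → (5, -3)
```

A feature-based pipeline (SURF/SIFT detection, descriptor matching, RANSAC) does far more work per frame pair; a spectral method like this is a handful of FFTs, which is the kind of gap that makes real-time in-flight stitching plausible on a small thermal payload.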
tim-fan | 1 year ago
Have you looked at producing 3D reconstructions over the thrown trajectory? And/or something like a Gaussian-splat-based representation for viewing the whole trajectory at once?