Hi Stuart! I'm an engineer at Suitable Technologies (makers of the Beam). Sorry to see the Beam didn't make it into the video, but it was really inspiring to see this talk. Thanks!
I'm very happy to hear that this tech is doing something useful for you. We want to make it better. (I was the founder of Anybots, and know a lot of other people in the telepresence and drone worlds.) It's great that you're telling your story, because it's much more motivational for people working on the tech than some of the current popular use cases, like managers walking the cubicle farm while traveling.
Hi Stuart, thanks for sharing this and your experience. It's very inspiring to us, as we are currently developing another "extensible self" tool [0] that will let you "remote control" real people (we call them agents) who can show you around, even in places where drones aren't allowed or there isn't someone who can set one up for you.
Can we send you an email and ask you a few questions?
Moreover, can you share a bit about how the streaming feed is set up?
Edit: sorry, missed your question about streaming! All of the hard work on the streaming for the event was done by Andrew Nesbitt.[0] He's definitely the guy to talk to about this.
The part where you talked about using FPV goggles to control a drone was particularly interesting, and admittedly, hit me right in the feels.
What's the current state of that technology? Last I knew/read, the latency between what the operator sees and what's actually happening was still high enough to make it disorienting. Is there anything you've seen / are working on that mitigates that? Is it even as much of an issue when your range of motion is constrained to your head?
My friend Henry Evans at Robots for Humanity[0] has done some testing with the Oculus Rift. He has a much more limited range of motion than me, and he was able to fly a Parrot drone fairly successfully. But I think once you start moving at higher speeds, latency is still a problem, although I've not had a chance to test it.
At the moment I'm still trying to find people in the UK who can help me set up a similar sort of system; so far I've been using the Parrot AR Drone[1] coupled with NodeCopter[2] and the excellent ardrone-webflight plugin[3]. FPV is my 100%, oh-my-god-I'm-flying-from-a-wheelchair, ultimate dream setup, and I think I'm right at the beginning of that journey.
escapologybb | 11 years ago
hanskuder | 11 years ago
tlb | 11 years ago
lyricalpolymath | 11 years ago
[0]: http://eyevel.com
escapologybb | 11 years ago
[0]: https://twitter.com/teabass
tobz | 11 years ago
escapologybb | 11 years ago
[0]: http://r4h.org/
[1]: http://ardrone2.parrot.com/
[2]: http://nodecopter.com/
[3]: http://eschnou.github.io/ardrone-webflight/