Live immersive video using off-the-shelf parts available today


Since this totally awesome 360-degree pan-able video came out about OpenROV a few days ago, there has been a lot of talk about how cool immersive video can be in the context of telerobotic exploration.

One potential reality that gets me really excited is the idea of having a bunch of school kids (or anybody, really, but for whatever reason I picture school kids enjoying this the most) wearing Oculus Rifts and getting a feed from a 360-degree view on an ROV so that they can look around at whatever area interests them as the vehicle flies along an underwater canyon or reef somewhere far away. It would be like an episode of The Magic School Bus where all the kids on the bus get to see some distant land they'd never otherwise be able to visit on a field trip, all the while the crazy teacher, Mrs. Frizzle, describes what they're seeing and answers questions. If someone sees something cool, everyone can look that way, and the pilot... errr... let's say "bus driver" can even turn the ROV around to go investigate further.

You know, we're really not that far from having that technology now.

It just so happens that we did a little bit of experimentation relating to this sort of thing a few weeks ago using three of our Genius-KYE F100 Wideangle webcams (which have a 120-degree field of view) and a set of three monitors I have in my room at home (because I'm a big nerd).

Basically, using some webcam monitoring software, I linked one camera to each of my three monitors and arranged the three webcams into an outward-facing equilateral triangle. Because each camera has a 120-degree viewing angle, the triangle of three cameras created an aggregate 360-degree field of view across the three monitors.
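A quick sanity check on the geometry above: three cameras at headings 0, 120, and 240 degrees, each with a 120-degree field of view, should tile the full circle with no gaps. Here's a minimal sketch that verifies this numerically (the `coverage` function and the heading values are just illustrative, not part of any real rig software):

```python
# Sketch: check that three 120-degree cameras mounted at headings
# 0, 120, and 240 degrees together cover a full 360-degree circle.

def coverage(headings_deg, fov_deg):
    """Return the set of whole degrees (0-359) seen by at least one camera."""
    seen = set()
    half = fov_deg / 2
    for h in headings_deg:
        for d in range(360):
            # shortest angular distance from the camera heading to direction d
            diff = abs((d - h + 180) % 360 - 180)
            if diff <= half:
                seen.add(d)
    return seen

print(len(coverage([0, 120, 240], 120)))  # 360 -> no gaps
print(len(coverage([0, 120, 240], 100)))  # < 360 -> narrower lenses leave gaps
```

The second call shows why the 120-degree lenses matter: with anything narrower, the same triangle arrangement leaves uncovered wedges between cameras.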

I didn't get as far as arranging my monitors into an inward-facing equilateral triangle to match the perspective of the cameras (either my monitors are too small or my head is too big for that to be comfortable), but I did try showing the full 360-degree view across a screen array spanning roughly 180 degrees. It turns out that's not as comfortable or intuitive to watch as reducing the displayed field of view to a matching 180 degrees. In other words, it works much better when each monitor maps to the same general field of view as the video it's showing.

But then I realized that with a large room, three projectors, and a few bed sheets, it wouldn't be too hard to make my own video cave!

Right now, I'm not sure if it would be worth the money to buy the additional two projectors we'd need to try this out, but it is tempting...

The real key technology will be live image (or rather, video) stitching, so that you see the right field of view to keep things intuitive while actively changing where you're looking based on interest. This is the kind of thing that would be needed for a headset display like the Oculus Rift, or for viewing a wide-angle (but not 360-degree) perspective on one or several monitors.
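Once the camera feeds are stitched into a single 360-degree panorama, the "look where you want" part reduces to extracting a viewer-controlled window from it, wrapping around the seam where the panorama's edges meet. A minimal sketch of that viewport step (the `viewport` function and the synthetic one-row panorama are hypothetical stand-ins for real stitched frames):

```python
import numpy as np

# Sketch: pull a viewer-controlled window out of a 360-degree panorama.
# The panorama is a synthetic array standing in for stitched camera
# frames; yaw_deg is the direction the viewer is looking.

def viewport(panorama, yaw_deg, fov_deg=120):
    """Extract the horizontal slice of a 360 panorama centered on yaw_deg."""
    h, w = panorama.shape[:2]
    px_per_deg = w / 360.0
    width = int(round(fov_deg * px_per_deg))
    start = int(round((yaw_deg - fov_deg / 2) * px_per_deg)) % w
    cols = np.arange(start, start + width) % w  # wrap past the seam
    return panorama[:, cols]

pano = np.arange(360).reshape(1, 360)          # 1 pixel per degree of azimuth
view = viewport(pano, yaw_deg=0, fov_deg=120)  # window straddles the seam
print(view.shape)               # (1, 120)
print(view[0, 0], view[0, -1])  # 300 59
```

Looking at yaw 0 pulls columns from both ends of the panorama, which is exactly the seam-crossing case a naive slice would get wrong.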

Anyway, I think that as technology allows us to digitally view the world in more immersive ways (e.g. circular or spherical fields of view, 3D, HD, etc.), we'll want to find ways for those technologies to help us achieve true telepresence, where we feel that we are physically in a different place. There is always a debate about telerobotic exploration versus manned exploration, and the most common argument for manned trips is that there's an emotional element that can't be replicated by looking at a screen. Call me biased and naive, but in my opinion technology has the capability to bring us data as well as feeling. We may not be there yet, but the potential is immense.

Video Warp suggestion

This is the direction I was thinking and posted about in the other thread. When I have a chance I'll post more info, but a DIY'er could take this further and build a projector and fiberglass dome pretty easily and cheaply. The hard part (at least I think it'll be hard) will be sending a live feed from 3 cameras up a single twisted pair of wires. The other approach is to use a 180-degree fisheye lens rather than multiple cameras. Could probably accomplish that pretty easily. Great topic and something I've spent a fair bit of time working on for clients.
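The fisheye route trades the multi-camera stitching problem for a warping problem: the circular fisheye frame has to be unwrapped into a rectangular panorama before it's comfortable to view. Here's a rough sketch of a polar unwrap, assuming an upward-pointing lens with an equidistant projection (radius in the image proportional to angle off the lens axis); the function name and output resolution are made up for illustration:

```python
import numpy as np

def unwrap_fisheye(img, out_w=360, out_h=90):
    """Polar unwrap of a circular fisheye frame into a panorama strip.

    Assumes an upward-pointing lens with an equidistant projection,
    i.e. radius in the image is proportional to the angle off the axis.
    """
    h, w = img.shape[:2]
    cx, cy, r_max = w / 2.0, h / 2.0, min(w, h) / 2.0
    az = np.linspace(0, 2 * np.pi, out_w, endpoint=False)  # azimuth angles
    r = np.linspace(0, r_max, out_h, endpoint=False)       # elevation -> radius
    rr, aa = np.meshgrid(r, az, indexing="ij")
    # nearest-neighbor lookup back into the fisheye frame
    x = np.clip((cx + rr * np.cos(aa)).astype(int), 0, w - 1)
    y = np.clip((cy + rr * np.sin(aa)).astype(int), 0, h - 1)
    return img[y, x]

fisheye = np.random.randint(0, 255, (200, 200), dtype=np.uint8)
pano = unwrap_fisheye(fisheye)
print(pano.shape)  # (90, 360)
```

A real lens would need a measured projection model rather than the ideal equidistant one assumed here, but the lookup-table structure stays the same, which is what makes this cheap enough to run live.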


We actually started looking at using the Oculus Rift headset to do just that. It's on the list of about 100 things that we are all trying to do. But the idea of the big screens would be more appealing and less sickening (head motion and visual delay, etc.) than the Rift headset.


Hi guys, I fly FPV with a pair of Fat Shark goggles with a gyro built in.

This then moves the onboard camera on my plane or multirotor accordingly, so I can look around while I'm flying without having to worry about fiddling with knobs on the controls or having multiple cameras.


Would two more cameras (one up, one down) complete the picture? Or would there be large blind spots?


For looking up and down, the most likely place for a blind spot would be between a camera looking horizontally and one tilted 90 degrees upward or downward. I just tested that with my cameras, and it seems that there is in fact a blind spot, but it's quite small. I moved my finger upward directly in front of the horizontal camera until it went above the frame, and it took another 10 degrees or so of continued upward movement until it appeared in the bottom of the frame of the upward-looking camera.
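That ~10-degree gap is consistent with simple geometry if you remember the 120-degree spec is the cameras' horizontal field of view; the vertical FOV of a typical 4:3 webcam is considerably narrower. A back-of-the-envelope sketch (the ~80-degree vertical FOV below is an assumption chosen to match the observation, not a measured value):

```python
# Sketch: estimate the angular gap between a horizontal camera and one
# tilted straight up. The 120-degree spec is horizontal FOV; the vertical
# FOV assumed here (80 degrees) is a guess, not a measurement.

def blind_spot_deg(vfov_horiz_cam, vfov_up_cam, tilt=90):
    """Gap between the top edge of the horizontal camera's frame and the
    bottom edge of the upward camera's frame; zero if the frames overlap."""
    gap = tilt - vfov_horiz_cam / 2 - vfov_up_cam / 2
    return max(0.0, gap)

print(blind_spot_deg(80, 80))    # 10.0 -> matches the observed ~10 degrees
print(blind_spot_deg(120, 120))  # 0.0  -> no gap if the FOVs really were 120
```

So the blind spot isn't a flaw in the arrangement so much as the vertical FOV being narrower than the advertised horizontal number; slightly wider lenses (or a small inward tilt) would close it.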