Working in Public


The best part about the OpenROV project is getting to collaborate with so many amazing people all over the world. We've been getting a tremendous amount of input, feedback, and development from the community. In an effort to harness that energy effectively and efficiently, we've been implementing a number of new practices to get more people involved and establish a more collaborative workflow.

One of the latest experiments was an open development call. We hosted a Google+ On Air Hangout with some of the main developers. In addition to being a live video chat with the group, it was viewable by anyone online with a passing interest in watching or listening. It was also immediately archived to YouTube, so anyone who couldn't make the call could catch up on what they missed.

Here's a video of our first development call:


This was great. The last part, where you were playing with the camera and showing the components, gave me a much better understanding of the OpenROV. The team discussion was excellent and is helping me get up to speed on this project. I look forward to more of these meetings and hope to contribute once I get my hands on the parts needed to build one and start tinkering with it. Oh my, another big learning curve :) Great to see all the talented people helping out.


Hi there,

I am currently just watching how everything is progressing, because I don't have the time to actually contribute or build an ROV myself. But after watching your Hangout, I thought I might add something to the discussion...

I got myself a Beaglebone to play with and did some testing of its video processing capabilities, and found that you get the best results with gstreamer. I haven't tested mjpg-streamer (which Dominik mentioned) so far, but I understand that it is solely a Motion JPEG capture server. That could be done with gstreamer with ease as well.
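To give an idea of what I mean, a gst-launch pipeline that does roughly what mjpg-streamer does (pull MJPEG frames from a UVC camera and serve them over TCP) might look like the sketch below. Element names are from GStreamer 1.x, and the device path and resolution are assumptions, so treat it as a starting point rather than a tested recipe:

```shell
# Sketch: MJPEG capture-and-serve pipeline (assumes GStreamer 1.x,
# a UVC camera at /dev/video0, and MJPEG output support in the camera).
PIPELINE="v4l2src device=/dev/video0 \
  ! image/jpeg,width=640,height=480,framerate=30/1 \
  ! multipartmux boundary=frame \
  ! tcpserversink host=0.0.0.0 port=8080"

# Print the full command; run it on a machine with gstreamer installed.
echo "gst-launch-1.0 $PIPELINE"
```

The multipart muxing is what makes the stream digestible as a sequence of JPEG frames, similar to what mjpg-streamer's HTTP output does.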

The benefits of gstreamer over the other technologies are (in my eyes):

- There are packages available for Ubuntu and Angstrom, no compiling or setup issues

- It comes with a bunch of other codecs, so you could switch to Theora or another codec that is better suited to recording, for example (MJPG carries no timing information; it's just frames as they come). The catch is that I haven't been able to get Ogg Theora live streaming running in a browser so far, but it should be doable...

- There are libraries that you can link against, so you can write your own C/C++ (or whatever) code. I'm not sure mjpg-streamer offers that.

- Most important: you can use the camera's hardware compression. Here's a link to someone who built a webcam with a Beaglebone that streams Full HD h.264 video at 3.1 Mbps, and the Beaglebone "doesn't even sweat" (as he put it). Click here

Admittedly, if you use hardware compression you can no longer access the raw images for analysis, but maybe some people just want excellent image quality for recording and don't care about ROVs chasing themselves :-) With gstreamer you have the choice.
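To make that choice concrete: in the software path you can use gstreamer's tee element to split the raw frames, sending one branch to a JPEG encoder for streaming and another to an analysis sink. Something like the following sketch, with GStreamer 1.x element names and a hypothetical appsink name, untested on the Beaglebone:

```shell
# Sketch: split raw camera frames with tee -- one branch JPEG-encoded
# for streaming, one left raw for image analysis via appsink.
# Assumes GStreamer 1.x and a camera at /dev/video0 (both assumptions).
PIPELINE="v4l2src device=/dev/video0 ! videoconvert ! tee name=t \
  t. ! queue ! jpegenc ! multipartmux ! tcpserversink port=8080 \
  t. ! queue ! appsink name=analysis"

echo "gst-launch-1.0 $PIPELINE"
```

The queue elements after the tee keep the two branches from stalling each other, which is the usual idiom when splitting a live source.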

Well, just my two cents of input. Maybe you wanna look into that. Keep up the great work! If you should need help with gstreamer, PM me.


We really need an app for the iPhone or Android, because I have ideas and stuff. It would be cool to have an app!


Thanks for the comment, Mike. When I can break a few cycles free I will try out gstreamer. I particularly like: 1) support for h.264 hardware compression (I have a Logitech C920); 2) packages available for standard BeagleBone distributions such as Angstrom. This lets me keep my base system in sync with new BeagleBone releases and not have to build from source.
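For the record, the C920 exposes its hardware-encoded H.264 stream directly over UVC, so a capture pipeline could be as simple as the sketch below: request H.264 caps straight from v4l2src and mux to a file. Element names assume GStreamer 1.x, and I haven't verified this on the BeagleBone yet:

```shell
# Sketch: pull hardware-encoded H.264 straight from a Logitech C920
# and mux it into a Matroska file. Assumes GStreamer 1.x and that the
# camera sits at /dev/video0 (both assumptions).
PIPELINE="v4l2src device=/dev/video0 \
  ! video/x-h264,width=1920,height=1080,framerate=30/1 \
  ! h264parse ! matroskamux ! filesink location=capture.mkv"

echo "gst-launch-1.0 $PIPELINE"
```

Because the camera does the encoding, the Beaglebone's CPU only has to parse and mux, which is what makes Full HD feasible on such a small board.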