Raspberry Pi as a complete OpenROV


I have a very basic question for which I cannot seem to find an answer. I was able to get the OpenROV Cockpit & GUI up and running on an RPi 3. It’s all very cool and I love the promising-looking features. But I cannot get the camera going, and I cannot work out how to connect my own IMUs and sensors to the GUI to get the whole thing working. So my questions really are:

(a) How and what do I change in OpenROV to get an image streaming from the RPi camera to the OpenROV image window (running on the same RPi)?
(b) How and what do I change in OpenROV to get and collect data from my RPi I2C-connected IMU into the OpenROV instrumentation widgets?

Is there some document that points me to that? A system architecture specification? Sample code? Something?

Thanks for your help, patience and understanding


Here are some bits of info from others that I’ve found in the forums or Developer Chat that will let you infer some of the architecture. It may be old info, so beware! I haven’t seen any architecture diagrams or a way to get the RPi PiCam to display in Cockpit.

OpenROV Community Hacks - How to create a software plugin for OpenROV Cockpit

We include Cloud9 IDE on the ROV image that we distribute.

OpenROV Developer Chat -

spiderkeys - 05/09/2017
The samd21 is connected via UART.
I2C sensors are connected to the samd.
The only thing on the RPi’s I2C is a temperature sensor and an EEPROM
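Since the samd21 hangs off a UART, the Pi-side code just reads and writes a serial byte stream. As a rough sketch only (the device path, baud rate, `key:value;` line format, and the telemetry keys are all assumptions for illustration, not the actual OpenROV wire protocol — check the firmware source for that):

```python
# Sketch of reading telemetry from a UART-connected MCU.
# ASSUMPTIONS: /dev/serial0 and 115200 baud are placeholders, and the
# "key:value;key:value;" line format plus the example keys are
# hypothetical -- the real OpenROV firmware defines the actual protocol.

def parse_telemetry(line: str) -> dict:
    """Parse a 'key:value;' delimited telemetry line into a dict."""
    fields = {}
    for chunk in line.strip().split(";"):
        if ":" in chunk:
            key, _, value = chunk.partition(":")
            fields[key.strip()] = value.strip()
    return fields

def read_loop(port="/dev/serial0", baud=115200):
    """Read telemetry lines from the MCU (needs pyserial and hardware)."""
    import serial  # pip install pyserial
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            line = ser.readline().decode("ascii", errors="replace")
            if line:
                print(parse_telemetry(line))

if __name__ == "__main__":
    # Simulated line, no hardware needed:
    print(parse_telemetry("hdg:123.4;depth:1.2;"))
```

On the Cockpit side you would feed the parsed dict into whatever plugin publishes telemetry to the widgets.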

You can look at the code for the other I2C sensors we use, such as the BNO055, to figure out how to interface them with the MCU, but you could also directly use the Pi’s I2C interface if you find the right pins and enable it.
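If you do go the direct route, talking to a BNO055 on the Pi's own I2C bus is straightforward. A minimal sketch, assuming the sensor is at its default address 0x28 on bus 1 (the usual Pi default), with register addresses and scaling taken from the BNO055 datasheet; requires `pip install smbus2` and I2C enabled via raspi-config:

```python
# Sketch of reading a BNO055 IMU over the Pi's I2C bus with smbus2.
# Addresses and the 16-LSB-per-degree euler scaling are from the
# BNO055 datasheet; bus number 1 is the usual Raspberry Pi default.
import struct

BNO055_ADDR = 0x28          # default BNO055 I2C address
REG_CHIP_ID = 0x00          # reads back 0xA0 on a BNO055
REG_EUL_HEADING_LSB = 0x1A  # start of heading/roll/pitch data

def euler_from_raw(raw: bytes) -> tuple:
    """Convert 6 bytes of little-endian signed euler data to degrees.
    BNO055 euler output is scaled at 16 LSB per degree."""
    heading, roll, pitch = struct.unpack("<hhh", raw)
    return (heading / 16.0, roll / 16.0, pitch / 16.0)

def read_euler(bus_num=1):
    """Read heading/roll/pitch from the sensor (requires hardware)."""
    from smbus2 import SMBus  # pip install smbus2
    with SMBus(bus_num) as bus:
        assert bus.read_byte_data(BNO055_ADDR, REG_CHIP_ID) == 0xA0
        raw = bytes(bus.read_i2c_block_data(BNO055_ADDR,
                                            REG_EUL_HEADING_LSB, 6))
        return euler_from_raw(raw)

if __name__ == "__main__":
    # Simulated sensor bytes, no hardware needed:
    print(euler_from_raw(struct.pack("<hhh", 1440, -160, 80)))
```

Getting those readings into the Cockpit widgets is then a matter of writing a Cockpit plugin that publishes them, per the plugin how-to linked above.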

Core board/platform files for the SAMD21J18A set up for use on the OpenROV controller board

SAMD21J18A 11-18-16 firmware project for Trident.
Makefile project that contains the entire firmware stack.

Brian_Grau OpenROV Engineer
Feb 20
This driver topic can be found on the camera product thread.
NEW PRODUCT - Pro Camera-HD Upgrade

spiderkeys - 05/11/2017
@tb so on the BBB everything was originally node.js, except for the video stream, which used mjpgstreamer, a C program. When we came out with the camera upgrade, we wrote a new camera streaming program from scratch called geomuxpp, which interfaced with our new camera directly. That C++ program sent video over zeromq to the node server internally, which then got it up to the browser. The only other C++ program is the one that blinks the LED pattern on the BBB. Additionally, I made a fork of mjpgstreamer with a new C++ plugin for transporting video over raw websockets using the uWebSocket library, so that video could be served directly to the browser from mjpgstreamer rather than going through node, for performance reasons.
On Trident, things are more heavily C++ and we are starting to support native clients in addition to the web client, but the node server is still there, coexisting with the C++ programs that handle video and the MCU interface.
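The geomuxpp-to-node hop described above is essentially a local IPC channel carrying encoded frames. A minimal sketch of that idea, using a stdlib socket pair with a made-up 4-byte length prefix as a stand-in for zeromq's framing (the real pipeline uses zeromq, C++, and node; this is just to show the pattern):

```python
# Sketch of shipping encoded video frames between two local endpoints,
# in the spirit of the geomuxpp -> node zeromq hop. Stand-in: a plain
# socket pair with a 4-byte little-endian length prefix instead of
# zeromq's own framing, so the example is self-contained.
import socket
import struct

def send_frame(sock, frame: bytes):
    """Send one encoded frame, length-prefixed so the reader can reframe."""
    sock.sendall(struct.pack("<I", len(frame)) + frame)

def recv_exact(sock, n: int) -> bytes:
    """Read exactly n bytes from a stream socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed mid-frame")
        buf += chunk
    return buf

def recv_frame(sock) -> bytes:
    """Read one length-prefixed frame back out of the byte stream."""
    (length,) = struct.unpack("<I", recv_exact(sock, 4))
    return recv_exact(sock, length)

if __name__ == "__main__":
    producer, consumer = socket.socketpair()
    send_frame(producer, b"\xff\xd8 fake jpeg data \xff\xd9")
    print(recv_frame(consumer))
```

The length prefix matters because a stream socket gives you bytes, not messages; zeromq handles that reframing for you, which is one reason it was used here.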

9-00-15 As of 30.0.x, the system used MJPEG streaming for all video. That includes the Logitech 920/930. As long as the attached USB camera supports the Linux UVC driver, the system will automatically configure it to send out data as MJPEG.
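For context, MJPEG streaming to a browser is usually just an HTTP response of type multipart/x-mixed-replace, with each JPEG sent as its own part; that is the wire format mjpgstreamer's HTTP output uses. A sketch of building those bytes (the boundary string is an arbitrary choice; it only has to match the header):

```python
# Sketch of the MJPEG-over-HTTP wire format: each JPEG frame becomes
# one part of a multipart/x-mixed-replace response. The boundary string
# is arbitrary -- it just has to match the Content-Type header.
BOUNDARY = "frame"

def response_header() -> bytes:
    """HTTP response header announcing a multipart MJPEG stream."""
    return (
        "HTTP/1.1 200 OK\r\n"
        f"Content-Type: multipart/x-mixed-replace; boundary={BOUNDARY}\r\n"
        "\r\n"
    ).encode("ascii")

def wrap_frame(jpeg: bytes) -> bytes:
    """Wrap one JPEG frame as a multipart part."""
    head = (
        f"--{BOUNDARY}\r\n"
        "Content-Type: image/jpeg\r\n"
        f"Content-Length: {len(jpeg)}\r\n"
        "\r\n"
    ).encode("ascii")
    return head + jpeg + b"\r\n"

if __name__ == "__main__":
    print(wrap_frame(b"\xff\xd8...\xff\xd9")[:40])
```

A server writes the header once, then `wrap_frame(...)` for every new JPEG on the same connection; the browser replaces the displayed image each time a part arrives, which is what makes it look like video.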

I am actively working on native h.264 support. That will open up some new features that everyone will enjoy. The major hurdle to h.264 is the need to convert the h.264 video to a live-streaming MP4 format for low-latency viewing in Cockpit.


This is great! Thanks very much. I found a few of these myself, but not a bunch of the others. I hope to post my findings.