Conversion to autonomous for a high school project


#1

A few of us, along with our high school kids, want to convert an OpenROV to autonomous operation. The kids are doing all the building and programming. I wanted to ask folks here for help.

Here is what we have... We got an OpenROV kit and have assembled it. After some work it is waterproof and can be controlled from a computer running the cockpit program in a browser. We also get the camera feed. We have been able to test it in a swimming pool. This was the basic state we wanted to reach before starting the autonomous conversion.

In the next step we would like to run OpenCV on the BeagleBone Black (BBB) and use our own programs to identify an object of a known color (say, bright red) and have the ROV move toward it. We have OpenCV installed on the BBB. We also took a program that has worked with OpenCV in the past and tried to run it, but it doesn't find the camera. Has anyone tried something like this? For reference, the program we installed on the BBB originally ran on a Raspberry Pi and was used successfully in a FIRST robotics competition.
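
For reference, here is roughly what we are hoping to do on the OpenCV side. This is only a sketch: it assumes the Python OpenCV bindings are installed, that the camera enumerates as /dev/video0, and the HSV thresholds for "bright red" are guesses we would still need to tune in the pool.

```python
# Rough sketch of the colour-tracking step - assumes the Python OpenCV bindings
# and that the ROV camera shows up as /dev/video0 (index 0). Thresholds are guesses.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # fails if another process is holding /dev/video0
if not cap.isOpened():
    raise RuntimeError("camera not found - is something else holding /dev/video0?")

ok, frame = cap.read()
if not ok:
    raise RuntimeError("could not read a frame from the camera")

hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Bright red wraps around the hue axis, so threshold two ranges and combine them.
mask = cv2.bitwise_or(
    cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255])),
    cv2.inRange(hsv, np.array([170, 120, 70]), np.array([180, 255, 255])),
)

# Note: findContours returns an extra value on OpenCV 3.x; adjust for your version.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    target = max(contours, key=cv2.contourArea)
    m = cv2.moments(target)
    if m["m00"] > 0:
        cx = int(m["m10"] / m["m00"])
        offset = cx - frame.shape[1] // 2  # how far off-centre the red blob is
        print("target x-offset from centre (px):", offset)
else:
    print("no red object in view")

cap.release()
```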

Assuming we solve the problem above, the next challenge will be controlling the ROV's propulsion system. How do we tap into the ROV's propulsion control from the BeagleBone? Any pointers that narrow down the documentation and files we need to look at would help.

Thanks,

Rajesh


#2

Rajesh, I am only coming up to speed on the internals myself (I've been building my own ROVs from various bits), but I would expect you will need to modify the OpenCV piece to play nicely with node.js. I am sure someone has done this; I would be surprised if they haven't. The hardest part, in my opinion, is building the helm software. I've done this with ROS on other projects, but it's non-trivial. You may want to modify the ROV internals to use an APM2 ArduPilot ATmega-based board. MOOS is another framework you can use for mission planning. One issue you will run into with AUVs is localization and dead reckoning.

The propulsion is based off the controller's ATmega board - Arduino code, glued together via node.js. I am just delving into the software now, so I'll keep this in mind and let you know when I have a proper plan for it.


#3

Thanks for the pointers, Jim. Please let me know if you have any successes; I will keep you posted on mine as well. I haven't had time to work on it since I posted.


#4

Hey Rajesh.

If you have not already, be sure to kill the mjpeg streamer that is bound to the video0 device before running OpenCV.
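
If it helps, here is a quick way to check what is holding the camera and stop it. I believe the stock cockpit image runs mjpg_streamer, but the process name may differ on yours, so confirm it before killing anything.

```python
# Find and stop whatever is holding /dev/video0 - "mjpg_streamer" is an assumption;
# verify the actual process name on your image first.
import subprocess

ps = subprocess.check_output(["ps", "aux"]).decode()
for line in ps.splitlines():
    if "mjpg" in line or "video0" in line:
        print(line)  # confirm the streamer's name before killing anything

# Once confirmed, stop it so OpenCV can open the camera.
subprocess.call(["pkill", "-f", "mjpg_streamer"])
```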

-Brian


#5

I haven't had a chance to try it, but I was looking for something exactly like that. I probably won't get a chance to try it until next weekend, but I will report back.

Thanks a bunch,

Rajesh


#6

Hi Rajesh

I was looking at implementing an autonomous element a short while back - I haven't gotten around to investigating fully or experimenting yet - but from a quick hunt through the code I found the following:

The Node.js cockpit interface takes the controller input and builds a command string from it. This is then sent over the BeagleBone's serial port to the Arduino on the cape, which interprets the command and gives the motors their instructions - the commands are in the /src/lib/OpenROVController.js file.
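
As a rough illustration of the send side, something like the sketch below should work from your own program using pyserial. The serial port, baud rate, and the "go(...)" command string are placeholders - pull the real ones out of OpenROVController.js and the Arduino firmware.

```python
# Sketch of sending a motor command over serial from your own program (pyserial).
# Port, baud rate, and command syntax below are placeholders, not the real protocol.
import serial

PORT = "/dev/ttyO1"   # placeholder - use whatever serial device the cockpit code opens
BAUD = 115200         # placeholder - match the rate used in OpenROVController.js

ser = serial.Serial(PORT, BAUD, timeout=1)
ser.write("go(100,100,100);".encode())  # made-up example of a thruster command
ser.close()
```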

The Arduino sends back information from the sensors so the heads-up display can be drawn (the receive code is in the same file) - so an autonomous mode would need to read the sensor input and build the command strings to steer the ROV in the direction you want it to head, I guess.
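
Putting it together, an autonomous loop would read a line of telemetry, decide, and write a command back. Again, the telemetry parsing and command strings below are placeholders until you check the real formats in that file.

```python
# Very crude shape of an autonomous loop: read telemetry, decide, send a command.
# Port, baud rate, telemetry format, and command strings are all placeholders.
import serial

def steer_toward(offset_px, ser):
    # Turn toward the target based on its horizontal offset in the camera image.
    if offset_px > 40:        # target to the right
        ser.write(b"go(120,100,80);")
    elif offset_px < -40:     # target to the left
        ser.write(b"go(80,100,120);")
    else:                     # roughly centred - go straight
        ser.write(b"go(100,100,100);")

ser = serial.Serial("/dev/ttyO1", 115200, timeout=1)  # placeholder port/baud
telemetry = ser.readline().strip()  # one line from the Arduino, e.g. heading/depth
print("telemetry:", telemetry)      # parse per the real format in OpenROVController.js
steer_toward(0, ser)                # offset would come from the OpenCV tracking step
ser.close()
```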

Hope that's useful.


#7

Very useful. Thanks.

Now I know where to look for the interface with the Arduino.

Rajesh