Downward Camera Analysis + Line Detection

In this lesson, you will learn how to analyze the output of the downward-facing camera and use it in your line-following code.

Integrating OpenCV and ROS images

You will now learn how to use ROS images with OpenCV. The key package you will be using to do so is cv_bridge, which converts between ROS Image messages and OpenCV (NumPy) arrays.
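The two conversion calls you will rely on are imgmsg_to_cv2 (ROS Image message to OpenCV/NumPy array) and cv2_to_imgmsg (array back to a publishable message). As a minimal sketch of the pattern (the wrapper function names here are just for illustration, not part of any library):

from cv_bridge import CvBridge

bridge = CvBridge()

def ros_to_cv(msg):
    # sensor_msgs/Image -> OpenCV (NumPy) array.
    # "8UC1" = 8-bit, single channel, i.e. the grayscale downward camera.
    return bridge.imgmsg_to_cv2(msg, "8UC1")

def cv_to_ros(img, encoding="8UC1"):
    # OpenCV array -> sensor_msgs/Image, ready to hand to a rospy Publisher.
    # Use "bgr8" if you have converted the image to color for annotation.
    return bridge.cv2_to_imgmsg(img, encoding)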

The following code is a skeleton node that you will need to complete. Your finished program should subscribe to the downward camera and publish a processed image that you can see and verify in rqt_image_view.

Copy-paste the following code into a Python file and work from there. You will write this program as a team, on the drone.

import cv2
from cv_bridge import CvBridge, CvBridgeError
import rospy
from sensor_msgs.msg import Image

"""
Starter code for an image processing node. Subscribes to the downward camera
and publishes a processed OpenCV image (i.e. colorized, overlaid with the detected line).
This node is not meant to be the full node that you will use in your final line tracking
solution, but it will teach you key concepts that you will need for line following
"""

class ImageToCV:
    def __init__(self):
        # A subscriber to the topic '/aero_downward_camera/image'. self.image_cb is called when a message is received
        self.camera_sub = rospy.Subscriber("/aero_downward_camera/image", Image, self.image_cb)

        # A publisher which will publish the processed (annotated) image to the topic '/image_to_cv/processed'
        self.image_pub = rospy.Publisher("/image_to_cv/processed", Image, queue_size=1)
        self.bridge = CvBridge()

    def image_cb(self, msg):
        """
        Callback function which is called when a new message of type Image is received by self.camera_sub.

            Args:
                - msg : ROS Image message in 8UC1 (8-bit, single-channel grayscale) format

            Returns: NA
        """
        try:
            cv_image = self.bridge.imgmsg_to_cv2(msg, "8UC1")
            self.process(cv_image)
        except CvBridgeError as e:
            print(e)

    def process(self, img):
        """
        Takes in an image and finds the line in it, if one exists

            Args:
                - img : OpenCV Image to process

            Publishes:
                - annotated image to the /image_to_cv/processed topic
        """
        """TODO-START: FILL IN CODE HERE"""
        # Get a colored copy of the image. This will be used solely to provide an annotated version
        # of the image for debugging purposes (for you to see in rqt_image_view). See slides for help here

        # Threshold the image to isolate the line and use it for your linear regression algorithm
        # (tip: check the size of the detected region to make sure you are seeing the LED line and not just a small blip of noise)

        # "Draw" a line if you have found one on the image

        # Publish your newly annotated, color image
        # NOTE: If you do not detect a line, or if you have decided to ignore detected lines of very small
        #       size, be sure to still publish the image. Also be sure to have some way to handle the camera
        #       seeing multiple "lines" at one time


        """TODO-END"""

if __name__ == "__main__":
    rospy.init_node("image_to_cv")
    a = ImageToCV()

    rospy.spin()
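
To make the TODO comments above more concrete, here is one possible sketch of a process() implementation: threshold the grayscale image, check that enough pixels were found to rule out noise, fit a line with cv2.fitLine, draw it on a colorized copy, and publish the result. The threshold value and size cutoff are placeholders you will need to tune against your own footage, it does not handle multiple detected lines, and it is a starting point rather than the solution you are expected to hand in. It assumes the self.bridge and self.image_pub members from the skeleton and requires import numpy as np at the top of your file.

    def process(self, img):
        # Colorize a copy of the grayscale image so the annotation can be drawn in color
        annotated = cv2.cvtColor(img, cv2.COLOR_GRAY2BGR)

        # Isolate bright pixels (the LED strip). 200 is a placeholder threshold; tune it.
        _, mask = cv2.threshold(img, 200, 255, cv2.THRESH_BINARY)

        # Only fit a line if enough pixels passed the threshold; otherwise treat the
        # detection as a small blip of noise and skip the annotation entirely.
        ys, xs = np.nonzero(mask)
        if len(xs) > 100:  # minimum-size cutoff; also a placeholder to tune
            points = np.column_stack((xs, ys)).astype(np.float32)
            vx, vy, x0, y0 = cv2.fitLine(points, cv2.DIST_L2, 0, 0.01, 0.01).flatten()
            # Extend the fitted line well past the image borders and draw it in red
            p1 = (int(x0 - 1000 * vx), int(y0 - 1000 * vy))
            p2 = (int(x0 + 1000 * vx), int(y0 + 1000 * vy))
            cv2.line(annotated, p1, p2, (0, 0, 255), 2)

        # Publish the image whether or not a line was found
        self.image_pub.publish(self.bridge.cv2_to_imgmsg(annotated, "bgr8"))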

Visualization

When you have finished writing the program, make it executable (you only need to do this once) with the usual command (chmod +x filename.py). Then do the following to launch your code:

Terminal 1:

roscore

Terminal 2 (starts optical flow):

sudo -E bwsi-uav/catkin_ws/src/aero-optical-flow/build/aero-optical-flow

Terminal 3 (run your code):

python filename.py

Terminal 4 (visualize):

rqt_image_view

Select the /image_to_cv/processed topic. If all goes well, you should see an annotated image, with a contour or a line drawn over where your detected line is. (For reference: your camera feed should look a lot like the images you worked with in the Contour Analysis section of the Downward.ipynb lab.)

Intro to rosbags

rosbag is a ROS tool that allows you to record the output of any number of ROS topics while the drone is in operation. This tool is useful for code development because it lets you test your processing code without having to fly the drone every time. We will be using it to record footage of the LED strip you will have to follow in this week’s challenge.

Syntax of rosbag record

rosbag record [options] /<topicname> [/<moretopics>...]

  • Usually, you should use -O <name>.bag for [options]. The -O (capital O) flag tells rosbag that the following argument (formatted <name>.bag) is the name we want for our bag file. Otherwise, a default name based on the current date and time will be used.

  • For this lab, the topic you should record (/<topicname>) is /aero_downward_camera/image

Recording a rosbag

SSH into the drone.

Terminal 1:

roscore

Terminal 2 (starts optical flow):

sudo -E bwsi-uav/catkin_ws/src/aero-optical-flow/build/aero-optical-flow

Terminal 3:

cd ~/rosbags

(if the rosbags directory doesn’t exist, run mkdir ~/rosbags and then cd ~/rosbags)

Now, you will record a bag file. Make sure to use rqt_image_view on the drone to see what you are recording as you record it. Keep in mind that you will be recording a lot of data in your bag file, even if you are recording only one topic. We will learn later how to record compressed bags, but for now, just remember to keep your bag recordings relatively short, around 30 seconds maximum to avoid filling up your storage.

Use this command for today’s lab:

rosbag record -O downward.bag /aero_downward_camera/image

When you are finished recording, press Ctrl-C in the terminal where the recording is running to stop it.
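
(Optional) Before transferring the bag, you can sanity-check what you recorded, such as its duration, size, and message count, with rosbag info:

rosbag info downward.bag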

Getting your bag off the drone for analysis

After recording a bag, especially a long one, your drone's storage might be fairly full. If you want, check disk usage with sudo df -h /. We will now transfer the bag file from the drone to your laptop and then delete it from the drone to free up space.

On your laptop:

cd ~/Desktop (or another directory where you want the bag file to go)
scp uav@<teamname>.beaver.works:~/rosbags/downward.bag ~/<file/path/on_your_laptop>/.

On the drone:

cd ~/rosbags
rm downward.bag

Your bag should now be on your laptop and deleted off the drone.

Playing your bag file

On your laptop:

Terminal 1:

roscore

Terminal 2:

In the folder where you saved your bag file, “play” it back:

rosbag play <filename>.bag

Terminal 3:

rqt_image_view

Select the /aero_downward_camera/image topic in rqt_image_view; you should see the recorded camera feed play back. If you also run your node from earlier (python filename.py) in another terminal while the bag plays, you can select /image_to_cv/processed and, if you have completed the code above correctly, see the same annotated image you saw on the drone.