Getting Started with the dbw_mkz ROS Package

Kareem Emad
Feb 14, 2019

In this article, I will give a short introduction to get you started with the Dataspeed vehicle in Gazebo. This article will cover:

  • how to start the dbw_node in Gazebo.
  • how to read and write commands to and from the vehicle: basically, how to tell your vehicle to move forward or how to listen to its sensors (a camera, for example).
  • how to add or modify existing sensors on the vehicle.

First, to get started, if you haven't already, head over to http://wiki.ros.org/dbw_mkz and install the dbw_mkz package with the command:

sudo apt-get install ros-<distro>-dbw-mkz

where <distro> is the ROS version installed (kinetic, melodic, or indigo). Stick with kinetic if possible, as it is widely supported; Melodic is the newest version and works with Ubuntu 18.04.
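If you are not sure which distribution you have, you can check it from a sourced terminal:

echo $ROS_DISTRO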

Now that you have the package installed, it's time to get some visualization going to make sure things are working. Start a terminal and issue the following command:

roslaunch dbw_mkz_gazebo lane_keep_demo.launch

This command will start the lane keep example from the dbw_mkz_gazebo package; you should see Gazebo starting up together with RViz.

[Screenshot: Gazebo running the lane keep example]

In this example, the vehicle is supposed to use the camera to detect lane lines and keep itself on the road.

Let's try listening to some of the sensor data published by the vehicle. First, we need to know what kind of topics the vehicle is publishing; list all the topics with:

rostopic list

This will give a long list of topic names. We are mainly interested in the camera sensor, so we are looking for the "/vehicle/front_camera/image_rect_color" topic. To get some information about the topic:

rostopic info /vehicle/front_camera/image_rect_color

The data published on this topic has type sensor_msgs/Image. Let's start a node and listen to the output of this camera.
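Before writing any code, you can quickly sanity-check that frames are actually arriving, for example by measuring the publish rate:

rostopic hz /vehicle/front_camera/image_rect_color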

Start a new package and add sensor_msgs, cv_bridge, rospy, std_msgs, and dbw_mkz_msgs as dependencies. Then add a new file with a .py extension in the src directory of your package folder; this will be the node we use to listen to and publish data. Copy the node code below into that file.
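If you are creating the package from scratch, here is a minimal sketch, assuming a standard catkin workspace at ~/catkin_ws (the package name my_dbw_test is a placeholder):

cd ~/catkin_ws/src
catkin_create_pkg my_dbw_test sensor_msgs cv_bridge rospy std_msgs dbw_mkz_msgs
cd ~/catkin_ws && catkin_make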

#!/usr/bin/env python
from __future__ import print_function
import sys
import rospy
import cv2
from std_msgs.msg import String
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError
import numpy as np

class test_dbw:
    def __init__(self):
        self.bridge = CvBridge()
        self.image_sub = rospy.Subscriber("/vehicle/front_camera/image_rect_color", Image, self.callback)

    def callback(self, data):
        try:
            # convert the ROS Image message to an OpenCV image
            cv_image = self.bridge.imgmsg_to_cv2(data, "bgra8")
            print(cv_image.shape)
            cv2.imshow("RGB window", cv_image)
            cv2.waitKey(3)
        except CvBridgeError as e:
            print("image conversion failed!")
            print(e)

def main(args):
    # init_node must be called before creating any subscribers
    rospy.init_node('test_dbw', anonymous=True)
    ic = test_dbw()
    try:
        rospy.spin()
    except KeyboardInterrupt:
        print("Shutting down")
    cv2.destroyAllWindows()

if __name__ == '__main__':
    main(sys.argv)

Make sure you give execute permission to your node, then run it while the lane keep demo is still running in another terminal.
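For example, with the placeholder package name my_dbw_test from above and a node file called camera_listener.py:

roscd my_dbw_test/src
chmod +x camera_listener.py
rosrun my_dbw_test camera_listener.py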

This code should open a separate window showing you the view from the front camera.

Notice, in the code, that all we had to do was subscribe to the /vehicle/front_camera/image_rect_color topic and convert the image to OpenCV format using CvBridge, and that's it.

Now that we have seen how to listen to sensor data, let's try playing around with this vehicle. If you look again at the topics list, you will find that there are separate topics for throttle, steering, and braking commands. Let's start with the throttle topic:

rostopic info /vehicle/throttle_cmd

You will find that there is another node publishing on that topic, which is controlling the vehicle to implement the lane keep demo. We need to stop it so that the vehicle listens only to us.

Simple solution: kill that node :)

rosnode kill /vehicle/twist_control
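You can verify that it worked by inspecting the command topic again; the node should no longer appear in the publishers list:

rostopic info /vehicle/throttle_cmd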

Now the vehicle is ready to listen to us. Let's add some code to move it forward.

#!/usr/bin/env python
import numpy as np
import sys
import rospy
from std_msgs.msg import String
from sensor_msgs.msg import Image
from dbw_mkz_msgs.msg import ThrottleReport
from dbw_mkz_msgs.msg import SteeringReport
from dbw_mkz_msgs.msg import ThrottleCmd
from dbw_mkz_msgs.msg import BrakeCmd
from cv_bridge import CvBridge, CvBridgeError
import cv2

class test_dbw:
    def __init__(self):
        print("initializing")
        self.bridge = CvBridge()
        # create the publisher once here, not on every callback
        self.pub_throttle_cmd = rospy.Publisher('/vehicle/throttle_cmd', ThrottleCmd, queue_size=1)
        self.throttle_sub = rospy.Subscriber("/vehicle/throttle_report", ThrottleReport, self.callback_thro)

    def callback_thro(self, data):
        print("received data from throttle report")
        print("publishing")
        throttle_command_object = ThrottleCmd()
        throttle_command_object.pedal_cmd = 0.2851
        throttle_command_object.pedal_cmd_type = 2
        throttle_command_object.enable = True
        throttle_command_object.ignore = False
        throttle_command_object.clear = False
        throttle_command_object.count = 0
        self.pub_throttle_cmd.publish(throttle_command_object)

def main(args):
    rospy.init_node('test_dbw2', anonymous=True)
    ic = test_dbw()
    try:
        rospy.spin()
    except KeyboardInterrupt:
        print("Shutting down")
    cv2.destroyAllWindows()

if __name__ == '__main__':
    main(sys.argv)

Here, we are listening to the throttle report and publishing back a throttle command in the format the topic expects. Note that pedal_cmd does not set a target speed; it sets the throttle pedal value (with pedal_cmd_type set to 2 it is interpreted as a percentage), so it controls how hard the car accelerates. Start the node and your vehicle should start moving.
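Steering and braking follow the same pattern on their own topics. Here is a minimal sketch for braking; the field names mirror the ThrottleCmd example above, but verify them for BrakeCmd with rosmsg show dbw_mkz_msgs/BrakeCmd before relying on this:

#!/usr/bin/env python
import rospy
from dbw_mkz_msgs.msg import BrakeCmd

rospy.init_node('test_brake', anonymous=True)
pub = rospy.Publisher('/vehicle/brake_cmd', BrakeCmd, queue_size=1)
rate = rospy.Rate(50)  # keep a steady stream of commands
while not rospy.is_shutdown():
    cmd = BrakeCmd()
    cmd.pedal_cmd = 0.3     # brake pedal value
    cmd.pedal_cmd_type = 2  # assumed to mean percentage, as with ThrottleCmd
    cmd.enable = True
    pub.publish(cmd)
    rate.sleep()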

We have successfully subscribed to and published data to the dbw_mkz vehicle. But you may notice that the camera position is not ideal for you or your application; let's change that.

cd into dbw_mkz_gazebo:

roscd dbw_mkz_gazebo/

Under the urdf folder, there is a configuration file that looks like the following:

<?xml version="1.0"?>
<robot name="mkz" xmlns:xacro="http://www.ros.org/wiki/xacro" >
<!-- Necessary includes -->
<xacro:include filename="$(find dbw_mkz_description)/urdf/vehicle_structure.urdf.xacro" />
<xacro:include filename="$(find dbw_mkz_description)/urdf/vehicle_gazebo.urdf.xacro" />
<!-- Bolt-on sensors go here -->
<xacro:include filename="$(find dbw_mkz_description)/urdf/vehicle_sensors.urdf.xacro" />
<xacro:dbw_mkz_camera name="front_camera" parent_link="base_footprint" x="2.6" y="0" z="1.8" roll="0" pitch="0.6" yaw="0" />
<!--
<xacro:dbw_mkz_gps name="perfect_gps" parent_link="base_footprint" x="0.6" y="0" z="1.45" rate="50.0" ref_lat="45.0" ref_lon="-81.0" />
-->
<xacro:include filename="$(find velodyne_description)/urdf/VLP-16.urdf.xacro"/>
<VLP-16 parent="base_footprint" name="velodyne" topic="velodyne_points" hz="10" samples="440">
<origin xyz="1.1 0 1.49" rpy="0 0 0" />
</VLP-16>
</robot>

What matters for us is the front_camera part; you will find the x, y, z, roll, pitch, and yaw parameters ready to be tweaked. Change them and rerun the simulation until you reach the optimal position and angle for your camera.
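For example, to mount the camera a bit higher and point it straight ahead, you might change the line as follows (the values here are purely illustrative):

<xacro:dbw_mkz_camera name="front_camera" parent_link="base_footprint" x="2.6" y="0" z="2.2" roll="0" pitch="0" yaw="0" />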

Finally, what if we need to add other sensors to the dbw_mkz vehicle? For example, what if we want a depth camera instead of a regular camera?

For this, we need to cd into the dbw_mkz_description package. Under the urdf folder you will find a file called vehicle_sensors.urdf.xacro; that is the file we are looking for. Now let's add the depth camera to the camera macro, as follows:

<?xml version="1.0"?>
<robot name="mkz_sensors" xmlns:xacro="http://www.ros.org/wiki/xacro" >
<xacro:macro name="dbw_mkz_camera" params="name parent_link x y z roll pitch yaw" >
<link name="${name}" >
<visual>
<geometry>
<box size="0.1 0.1 0.1" />
</geometry>
<material name="white">
<color rgba="1 1 1 1" />
</material>
</visual>
</link>
<link name="${name}_optical" />
<joint name="${name}_optical" type="fixed" >
<parent link="${name}_optical" />
<child link="${name}" />
<origin xyz="0 0 0" rpy="-1.57079632679 0 -1.57079632679" />
</joint>
<joint name="${name}_joint" type="fixed" >
<parent link="${parent_link}" />
<child link="${name}_optical" />
<origin xyz="${x} ${y} ${z}" rpy="${roll} ${pitch} ${yaw}" />
</joint>
<gazebo reference="${name}_optical" >
<material>Gazebo/White</material>
<sensor type="depth" name="${name}_camera">
<update_rate>30.0</update_rate>
<camera name="cam">
<horizontal_fov>1.3962634</horizontal_fov>
<image>
<width>800</width>
<height>800</height>
<format>R8G8B8</format>
</image>
<clip>
<near>0.02</near>
<far>300</far>
</clip>
<noise>
<type>gaussian</type>
<!-- Noise is sampled independently per pixel on each frame.
That pixel's noise value is added to each of its color
channels, which at that point lie in the range [0,1]. -->
<mean>0.0</mean>
<stddev>0.007</stddev>
</noise>
</camera>
<plugin name="camera_controller" filename="libgazebo_ros_openni_kinect.so">
<baseline>0.2</baseline>
<alwaysOn>true</alwaysOn>
<updateRate>1.0</updateRate>
<cameraName>${name}</cameraName>
<imageTopicName>/camera/rgb/image_raw</imageTopicName>
<cameraInfoTopicName>/camera/depth/camera_info</cameraInfoTopicName>
<depthImageTopicName>/camera/depth/image_raw</depthImageTopicName>
<depthImageInfoTopicName>/camera/depth/camera_info</depthImageInfoTopicName>
<pointCloudTopicName>/camera/depth/points</pointCloudTopicName>
<frameName>${name}</frameName>
<pointCloudCutoff>0.5</pointCloudCutoff>
<distortionK1>0.00000001</distortionK1>
<distortionK2>0.00000001</distortionK2>
<distortionK3>0.00000001</distortionK3>
<distortionT1>0.00000001</distortionT1>
<distortionT2>0.00000001</distortionT2>
<CxPrime>0</CxPrime>
<Cx>0</Cx>
<Cy>0</Cy>
<focalLength>0</focalLength>
<hackBaseline>0</hackBaseline>
</plugin>
</sensor>
</gazebo>
</xacro:macro>
<xacro:macro name="dbw_mkz_gps" params="name parent_link x y z rate ref_lat ref_lon" >
<link name="${name}" >
<visual>
<origin xyz="0 0 0" rpy="0 0 0" />
<geometry>
<cylinder radius="0.1" length="0.05" />
</geometry>
<material name="white" >
<color rgba="1 1 1 1" />
</material>
</visual>
<collision>
<origin xyz="0 0 0" rpy="0 0 0" />
<geometry>
<cylinder radius="0.1" length="0.05" />
</geometry>
</collision>
<inertial>
<origin xyz="0 0 0" rpy="0 0 0"/>
<mass value="1.0"/>
<inertia ixx="0.037227" ixy="0" ixz="0" iyy="0.037227" iyz="0" izz="0.063878"/>
</inertial>
</link>
<joint name="${name}" type="fixed" >
<origin xyz="${x} ${y} ${z}" rpy="0 0 0" />
<parent link="${parent_link}" />
<child link="${name}" />
</joint>
<gazebo>
<plugin name="gps_plugin" filename="libdbw_mkz_gazebo_gps.so" >
<link_name>${name}</link_name>
<update_rate>${rate}</update_rate>
<ref_lat>${ref_lat}</ref_lat>
<ref_lon>${ref_lon}</ref_lon>
<ant_offset_x>${x}</ant_offset_x>
<ant_offset_y>${y}</ant_offset_y>
<ant_offset_z>${z}</ant_offset_z>
</plugin>
</gazebo>
</xacro:macro>


</robot>

Notice the new topics that have been added to support depth and RGB data from your camera. Restart your simulation to see these topics in effect.
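For a quick look at the output, you can view one of the new image topics (assuming the image_view package is installed):

rosrun image_view image_view image:=/camera/rgb/image_raw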

And that was everything for today. I hope you found the content useful. Leave a comment if you need any help.

Thank you
