Building a monocular simulation environment with Gazebo + ROS: a QR-code ceiling + Kobuki + camera (2)
Part 2. Building the experiment platform: Kobuki robot + camera
With the environment ready, the next step is the experiment platform. Kobuki has a ready-made model and an existing Gazebo simulation, and the TurtleBot Gazebo platform has been adapted before, so:
Approach:
Adapt the earlier TurtleBot Gazebo 2D-laser SLAM model directly. Step 1: add a box module to act as the camera body. Step 2: give that module real camera functionality by attaching the camera plugin. Step 3: debug, and verify that the published pose information is correct.
1. The launch file: environment configuration + robot platform configuration + the camera-to-depth node (TurtleBot uses a fake 2D laser node).
2. The environment is already done and the fake laser can be ignored; what matters is the robot platform configuration, i.e. the sensors.
slam_world.launch --->
===> robot configuration: the combination of sensors
<include file="$(find turtlebot_gazebo)/launch/includes/$(arg base).launch.xml">
kobuki.launch.xml --->
===> the xacro file
<arg name="urdf_file" default="$(find xacro)/xacro.py '$(find turtlebot_description)/robots/$(arg base)_$(arg stacks)_$(arg 3d_sensor).urdf.xacro'" />
kobuki_nostack_camera.urdf.xacro (a new file)
===> <sensor_camera parent="base_link"/>  Note the coordinate-frame dependency on the parent link.
<robot name="turtlebot" xmlns:xacro="http://ros.org/wiki/xacro">
  <xacro:include filename="$(find turtlebot_description)/urdf/turtlebot_library.urdf.xacro" />
  <kobuki/>
  <!--<stack_hexagons parent="base_link"/>-->
  <sensor_rplidar parent="base_link"/>
  <sensor_camera parent="base_link"/>
</robot>
turtlebot_library.urdf.xacro: the xacro file that pulls in every module.
<robot xmlns:xacro="http://ros.org/wiki/xacro">
<!-- General -->
<xacro:include filename="$(find turtlebot_description)/urdf/common_properties.urdf.xacro"/>
<xacro:include filename="$(find turtlebot_description)/urdf/turtlebot_properties.urdf.xacro"/>
<!-- Bases -->
<xacro:include filename="$(find create_description)/urdf/create.urdf.xacro"/>
<xacro:include filename="$(find kobuki_description)/urdf/kobuki.urdf.xacro" />
<!-- Stacks -->
<xacro:include filename="$(find turtlebot_description)/urdf/stacks/circles.urdf.xacro"/>
<xacro:include filename="$(find turtlebot_description)/urdf/stacks/hexagons.urdf.xacro"/>
<!-- 3D Sensors -->
<xacro:include filename="$(find turtlebot_description)/urdf/sensors/kinect.urdf.xacro"/>
<xacro:include filename="$(find turtlebot_description)/urdf/sensors/asus_xtion_pro.urdf.xacro"/>
<xacro:include filename="$(find turtlebot_description)/urdf/sensors/asus_xtion_pro_offset.urdf.xacro"/>
<xacro:include filename="$(find turtlebot_description)/urdf/sensors/rplidar.urdf.xacro"/>
<xacro:include filename="$(find turtlebot_description)/urdf/sensors/camera.urdf.xacro"/>
</robot>
camera.urdf.xacro: the sensor's configuration.
It carries two pieces of information: first, the physical properties of the module that acts as the camera, so that Gazebo can display it; second, the camera plugin that gives the module the ability to capture images of the environment.
Including the upper-level files also makes the coordinate axes clear.
<?xml version="1.0"?>
<!-- script_version=1.1 -->
<robot name="sensor_camera" xmlns:xacro="http://ros.org/wiki/xacro">
<xacro:include filename="$(find turtlebot_description)/urdf/turtlebot_gazebo.urdf.xacro"/>
<xacro:include filename="$(find turtlebot_description)/urdf/turtlebot_properties.urdf.xacro"/>
<!-- camera -->
<xacro:macro name="sensor_camera" params="parent">
<joint name="camera" type="fixed">
<!-- rpy="0 -1.5708 0" pitches the camera's +x (optical) axis straight up, toward the QR-code ceiling -->
<origin xyz="0.15 0.0 0.35" rpy="0 -1.5708 0.0" />
<parent link="${parent}" />
<child link="base_camera_link" />
</joint>
<link name="base_camera_link">
<visual>
<geometry>
<box size="0.05 0.05 0.05" />
</geometry>
<material name="Red" />
</visual>
<inertial>
<mass value="0.000001" />
<origin xyz="0 0 0" />
<inertia ixx="0.0001" ixy="0.0" ixz="0.0"
iyy="0.0001" iyz="0.0"
izz="0.0001" />
</inertial>
</link>
<!-- Set up camera gazebo details -->
<camera_image />
</xacro:macro>
</robot>
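The fixed joint above mounts the camera with rpy = (0, -1.5708, 0), i.e. a pitch of -90 degrees. A Gazebo camera looks along its link's +x axis, so this rotation points the optical axis straight up at the QR-code ceiling. A quick sanity check of that rotation (plain Python, not part of the setup itself):

```python
import math

def rot_y(pitch):
    """Rotation matrix for a pitch (rotation about the y axis)."""
    c, s = math.cos(pitch), math.sin(pitch)
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]

# rpy = (0, -1.5708, 0) from the joint above: pitch of -90 degrees.
R = rot_y(-math.pi / 2)

# The camera's optical axis is the link's +x axis; its direction after
# the rotation is the first column of R.
optical_axis = [row[0] for row in R]
print([round(v, 6) for v in optical_axis])  # [0.0, 0.0, 1.0] -> straight up
```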
turtlebot_gazebo.urdf.xacro ==> the sensor plugin information; the plugin parameters are modified here.
<?xml version="1.0"?>
<robot name="turtlebot_gazebo" xmlns:xacro="http://ros.org/wiki/xacro">
<!-- Microsoft Kinect / ASUS Xtion PRO Live for simulation -->
<xacro:macro name="turtlebot_sim_3dsensor">
<gazebo reference="camera_link">
<sensor type="depth" name="camera">
<always_on>true</always_on>
<update_rate>20.0</update_rate>
<camera>
<horizontal_fov>${60.0*M_PI/180.0}</horizontal_fov>
<image>
<format>R8G8B8</format>
<width>640</width>
<height>480</height>
</image>
<clip>
<near>0.05</near>
<far>8.0</far>
</clip>
</camera>
<plugin name="kinect_camera_controller" filename="libgazebo_ros_openni_kinect.so">
<cameraName>camera</cameraName>
<alwaysOn>true</alwaysOn>
<updateRate>10</updateRate>
<imageTopicName>rgb/image_raw</imageTopicName>
<depthImageTopicName>depth/image_raw</depthImageTopicName>
<pointCloudTopicName>depth/points</pointCloudTopicName>
<cameraInfoTopicName>rgb/camera_info</cameraInfoTopicName>
<depthImageCameraInfoTopicName>depth/camera_info</depthImageCameraInfoTopicName>
<frameName>camera_depth_optical_frame</frameName>
<baseline>0.1</baseline>
<distortion_k1>0.0</distortion_k1>
<distortion_k2>0.0</distortion_k2>
<distortion_k3>0.0</distortion_k3>
<distortion_t1>0.0</distortion_t1>
<distortion_t2>0.0</distortion_t2>
<pointCloudCutoff>0.4</pointCloudCutoff>
</plugin>
</sensor>
</gazebo>
</xacro:macro>
<!-- RPLidar LIDAR for simulation -->
<xacro:macro name="rplidar_laser">
<gazebo reference="base_laser_link">
<sensor type="ray" name="laser">
<pose>0 0 0 0 0 0</pose>
<visualize>false</visualize>
<update_rate>5.5</update_rate>
<ray>
<scan>
<horizontal>
<samples>360</samples>
<resolution>1</resolution>
<min_angle>-3.1415926</min_angle>
<max_angle>3.1415926</max_angle>
</horizontal>
</scan>
<range>
<min>0.10</min>
<max>6.0</max>
<resolution>0.01</resolution>
</range>
<noise>
<type>gaussian</type>
<mean>0.0</mean>
<stddev>0.01</stddev>
</noise>
</ray>
<plugin name="rplidar_node" filename="libgazebo_ros_laser.so">
<topicName>/scan</topicName>
<frameName>base_laser_link</frameName>
</plugin>
</sensor>
</gazebo>
</xacro:macro>
<!-- camera -->
<xacro:macro name="camera_image">
<gazebo reference="base_camera_link">
<sensor type="camera" name="camera1">
<update_rate>30.0</update_rate>
<camera name="head">
<horizontal_fov>1.3962634</horizontal_fov>
<image>
<width>640</width>
<height>480</height>
<format>R8G8B8</format>
</image>
<clip>
<near>0.02</near>
<far>300</far>
</clip>
<noise>
<type>gaussian</type>
<!-- Noise is sampled independently per pixel on each frame.
That pixel's noise value is added to each of its color
channels, which at that point lie in the range [0,1]. -->
<mean>0.0</mean>
<stddev>0.007</stddev>
</noise>
</camera>
<plugin name="camera_controller" filename="libgazebo_ros_camera.so">
<alwaysOn>true</alwaysOn>
<updateRate>30.0</updateRate>
<cameraName>camera1</cameraName>
<imageTopicName>/image_raw</imageTopicName>
<cameraInfoTopicName>/camera_info</cameraInfoTopicName>
<frameName>base_camera_link</frameName>
<hackBaseline>0.07</hackBaseline>
<distortionK1>0.0</distortionK1>
<distortionK2>0.0</distortionK2>
<distortionK3>0.0</distortionK3>
<distortionT1>0.0</distortionT1>
<distortionT2>0.0</distortionT2>
</plugin>
</sensor>
</gazebo>
</xacro:macro>
</robot>
Environment variables in ~/.bashrc:
export TURTLEBOT_BASE=kobuki
export TURTLEBOT_STACKS=nostack
#export TURTLEBOT_3D_SENSOR=rplidar
export TURTLEBOT_3D_SENSOR=camera
======
Run and check the results:
Camera intrinsics in the Gazebo simulation: fx and fy can be computed from the field of view and the image size.
Example, 640x480 with horizontal_fov = 1.3962634 rad (80 deg): cx = 320, cy = 240; computing each axis naively from an 80-deg FOV gives fx = 381.3612 and fy = 286.0209.
In fact Gazebo derives a single focal length from the horizontal FOV alone (square pixels), so fx = fy = 381.3612.
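The pinhole relation behind those numbers is f = (W/2) / tan(hfov/2). A minimal sketch (plain Python; the values come from the camera plugin above):

```python
import math

def focal_from_fov(pixels, fov_rad):
    """Pinhole focal length in pixels from image extent and field of view."""
    return pixels / (2.0 * math.tan(fov_rad / 2.0))

w, h, hfov = 640, 480, 1.3962634  # 80 degrees, as in the plugin above
cx, cy = w / 2.0, h / 2.0         # principal point: 320, 240

fx = focal_from_fov(w, hfov)      # ~381.36
# Gazebo uses square pixels and derives everything from the horizontal
# FOV, so fy = fx. Computing 240 / tan(hfov/2) = ~286.02 would wrongly
# assume the vertical FOV is also 80 degrees.
fy = fx
print(round(fx, 2), round(fy, 2))
```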
References:
sdf http://sdformat.org/spec?ver=1.5&elem=visual#visual_material
gazebo http://www.gazebosim.org/tutorials?cat=build_world