OliMcy/i2ros_autonomous_driving

TUM Introduction to ROS: Autonomous Driving 🚘

This is the repository for the autonomous driving group project of Team 13 in the lecture Introduction to ROS, summer semester 2023.

📋 Contents

💾 Result Presentation

📺 Video: Route 1

📺 Video: Route 2

|         | global planner | local planner | controller |
|---------|----------------|---------------|------------|
| Route 1 | waypoint-global-planner | teb_local_planner | Ackermann controller |
| Route 2 | move_base built-in global_planner | move_base built-in base_local_planner | PID controller |

📈 Rosgraph

  • Route 1

  • Route 2

☑️ Tasks

  • Successfully working perception pipeline
  • Successfully working path planning
  • Successfully working trajectory planning
  • Successfully avoiding other cars
  • Successfully stopping/driving at street lights
  • Time to complete the mission: 212s
  • Solving the problem without using semantic camera: YOLOv5
  • New msg: perception_msgs/Boundingboxes.msg, perception_msgs/Boundingbox.msg, perception_msgs/Trafficstate.msg
  • New Service: planning/srv/PlanGoal.srv
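The file names of the custom interfaces above come from the repo; the sketch below shows one plausible layout for them. All field names and types are assumptions, in the style of common detection messages.

```
# perception_msgs/Boundingbox.msg  (fields are assumptions)
string class_name
float64 probability
int64 xmin
int64 ymin
int64 xmax
int64 ymax

# perception_msgs/Boundingboxes.msg  (fields are assumptions)
Header header
Boundingbox[] bounding_boxes

# perception_msgs/Trafficstate.msg  (fields are assumptions)
string state        # e.g. "red" / "green"

# planning/srv/PlanGoal.srv  (fields are assumptions)
geometry_msgs/PoseStamped goal
---
bool success
```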

⚠️ Prerequisites

Please git clone this repo according to the first step in Getting Started, then run the install script:

chmod +x requirements.sh
./requirements.sh

  • Or run the following commands to install the required packages manually.

Perception:

sudo apt install ros-noetic-octomap-rviz-plugins ros-noetic-octomap-server
pip install -r Autonomous_Driving_ws/src/perception/yolov5/src/yolov5/requirements.txt

Control:

sudo apt install ros-noetic-pid ros-noetic-robot-localization ros-noetic-smach-ros

Planning:

sudo apt-get install ros-noetic-navigation
sudo apt-get install ros-noetic-teb-local-planner
sudo apt-get install ros-noetic-ackermann-msgs

🔰 Getting Started

  1. Use the following command to clone the repository:
git clone [email protected]:i2ros_g13/i2ros_g13_autonomous_driving.git --depth 1
  2. Build it with catkin build.
  3. Download the Unity Environment: https://syncandshare.lrz.de/getlink/fiEg9ocZ6Pc5iuEa4QqN1b/
  4. Unzip the Unity file and copy the files to .../devel/lib/simulation/
  5. Run the following commands to launch:
source devel/setup.bash
roslaunch simulation yolov5_simulation.launch

The car will start driving along the generated global and local paths, correctly following the traffic rules.

📕 Modules description

Route 1

Perception

  • perception_pkg: includes a node trafficlights_detect_node, which extracts the traffic-light region from the semantic image and then recognizes the color of the traffic light in the RGB image. It gives the controller the state of the traffic light, to stop the car or let it move again. Additionally, a red or green bounding box is drawn and output by this node.
  • depth_image_proc: a package to convert the depth image data to 3D point cloud data.
  • OctoMap: a package that builds the occupancy grid and map from 3D point cloud data.
  • yolov5: a package that performs traffic light detection using YOLOv5. It includes the ROS node trafficlights_detect_node_yolov5, which runs inference with the trained YOLOv5 model and publishes the detection results.
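The color-recognition step in trafficlights_detect_node can be illustrated with a minimal sketch: given the mean channel intensities of the cropped traffic-light region, decide whether the light is red or green. The function name and the threshold are assumptions, not the repo's actual implementation.

```python
def classify_light(mean_r, mean_g, mean_b, margin=20):
    """Return 'red', 'green', or 'unknown' from the mean R/G/B
    intensities of the cropped traffic-light region (sketch only;
    the margin value is an assumption)."""
    if mean_r > mean_g + margin and mean_r > mean_b + margin:
        return "red"
    if mean_g > mean_r + margin and mean_g > mean_b + margin:
        return "green"
    return "unknown"

print(classify_light(200, 60, 50))   # a bright-red crop -> red
print(classify_light(40, 180, 60))   # a bright-green crop -> green
```

In the real node the crop would come from the semantic image mask applied to the RGB frame; this sketch only shows the final decision rule.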

Planning

  • planning (Route 1): node waypoint_sending_server sends waypoints to the waypoint global planner; node global_path_planning_client considers the traffic lights and the car position and decides when and which waypoints are sent.
  • auto2dnav: configuration of the move_base package
  • move_base: primary package used for planning and navigation tasks
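The decision logic in global_path_planning_client could be sketched as follows: release the next waypoint only when the car is close enough to the current goal and the traffic light is not red. The function name, the tolerance, and the state strings are assumptions for illustration.

```python
import math

def should_send_next(car_xy, goal_xy, light_state, reach_tol=2.0):
    """Sketch of the waypoint-release check (names and the
    reach tolerance are assumptions): True when the current goal
    is nearly reached and the traffic light is not red."""
    dist = math.hypot(goal_xy[0] - car_xy[0], goal_xy[1] - car_xy[1])
    return dist < reach_tol and light_state != "red"
```

In the actual pipeline, the car position would come from localization and the light state from the perception topic; this only models the gating condition.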

Control

  • auto2dnav: node cmd_vel_to_ackermann_drive converts the original desired linear and angular velocity into a desired linear velocity and steering angle
  • controller_pkg (Route 1): node controller_node calculates the final command values
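The conversion performed by cmd_vel_to_ackermann_drive follows the standard bicycle-model relation: for a turning radius r = v/ω, the steering angle is atan(wheelbase/r). A minimal sketch (the wheelbase value is an assumption):

```python
import math

def cmd_vel_to_ackermann(v, omega, wheelbase=1.0):
    """Convert a Twist-style (linear v, angular omega) command into
    (linear velocity, steering angle) for an Ackermann drive.
    Sketch only; the default wheelbase is an assumption."""
    if v == 0.0 or omega == 0.0:
        return v, 0.0          # driving straight (or stopped)
    radius = v / omega          # turning radius from v = omega * r
    steering = math.atan(wheelbase / radius)
    return v, steering
```

For example, v = 1.0 m/s and omega = 1.0 rad/s with a 1.0 m wheelbase gives a steering angle of atan(1) = π/4.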
