# Software Engineering for Autonomous Vehicles
- [done] Refactor nodes into composable containers for better performance
- Add diagnostics and real-time status publishers
- [done] Set up automatic rosbag recording for every launch
- [done] Create analysis notebooks in `docs/notebooks/`
- [done] Integrate PlotJuggler or RQT for real-time visualization
- Add GitHub Actions for `colcon build` and `colcon test` (see the command sketch after this list)
- Enforce formatting with `ament_lint`, `black`, `flake8`
- [done] Create video demos and setup tutorials
- Optimize the variables in the Nav2 YAML configuration
- Add a video of waypoint navigation
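The open CI and lint items above map to a handful of standard commands; a minimal local sketch of what such a job would run (paths are illustrative):

```bash
# Build and test the workspace, then report test results
colcon build --symlink-install
colcon test
colcon test-result --verbose

# Formatting/lint checks on the Python sources
black --check src/
flake8 src/
```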
This section provides a curated collection of diagrams, screenshots, and media that illustrate the internal workings and real-world implementation of the SEAV robot. These visual aids complement the technical documentation by offering a clearer understanding of the robot's architecture, behavior in simulation, sensor integrations, and testing procedures.
The following visuals explain the core architecture of our robotic system, how different ROS 2 nodes interact, and how various components such as sensors, control, and planning modules are layered in our implementation.
This section includes visuals that depict how the SEAV robot performs mapping and localization using various sensor fusion techniques, including IMU and odometry. It also shows how SLAM performance improves with better sensor calibration.
Here we showcase hardware-level integration, RViz-based sensor visualization, and the use of RQT for real-time introspection of the ROS graph. These tools played a vital role during debugging, development, and field validation.
Below are real-world snapshots of the robot used in lab and field environments, showing the hardware in action under various conditions.
These short videos highlight specific capabilities of the SEAV robot, such as low-level motor control from ROS and fully autonomous point-to-point navigation.
🎬 Real Car Control via `cmd_vel` → Arduino
This clip demonstrates how high-level velocity commands (`cmd_vel`) from ROS 2 are translated into serial messages that control the motors via an Arduino interface. It validates end-to-end motor control from the ROS navigation stack.
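For a quick end-to-end check without the full navigation stack, a constant velocity command can also be published from the CLI (a manual test, assuming the `/cmd_vel` topic and `Twist` message type used above):

```bash
# Drive forward slowly at 10 Hz; stop with Ctrl+C
ros2 topic pub /cmd_vel geometry_msgs/msg/Twist \
  "{linear: {x: 0.2, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}" -r 10
```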
🎬 Side-by-Side: Terminal, RViz, Real Toy Car
This shows live robot behavior, terminal outputs, and RViz in sync while running the Nav2 stack on the real robot.
🎬 Autonomous Navigation of Real Toy Car
The robot receives a start and goal position and autonomously computes and follows a global path using SLAM, costmaps, and Nav2 planners. This video confirms real-world applicability of the navigation pipeline.
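A goal can also be sent from the command line rather than RViz; a minimal sketch, assuming the default Nav2 `NavigateToPose` action interface and an illustrative goal pose in the `map` frame:

```bash
ros2 action send_goal /navigate_to_pose nav2_msgs/action/NavigateToPose \
  "{pose: {header: {frame_id: map}, pose: {position: {x: 1.0, y: 0.5, z: 0.0}, orientation: {w: 1.0}}}}"
```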
To validate the runtime behavior of the SEAV robot, a rosbag was recorded during a live navigation session. The bag includes TF transforms, odometry, LIDAR scans, AMCL poses, and diagnostics, among other topics. This data is essential for offline analysis, visualization, and debugging.
- Recording Folder: `docs/rosbag/rosbag2_2025_03_30-14_47_38/`
- Files: a single SQLite3 `.db3` file and a `metadata.yaml`
- Total Duration: ~546 seconds (≈9 minutes)
- Message Count: 53,909 messages
- ROS 2 Version: Tested with a ROS 2 Humble-based setup
- Storage Format: `sqlite3`
- Start Time (Unix nanoseconds): `1743338858570917969`
Below is a summary of key topics and their message counts:
| Topic Name | Msg Type | Messages |
|---|---|---|
| `/tf` | `tf2_msgs/msg/TFMessage` | 6345 |
| `/odom` | `nav_msgs/msg/Odometry` | 5423 |
| `/imu` | `sensor_msgs/msg/Imu` | 5458 |
| `/cmd_vel` | `geometry_msgs/msg/Twist` | 261 |
| `/map` | `nav_msgs/msg/OccupancyGrid` | 47 |
| `/initialpose` | `geometry_msgs/msg/PoseWithCovarianceStamped` | 2 |
| `/diagnostics` | `diagnostic_msgs/msg/DiagnosticArray` | 299 |
| `/rosout` | `rcl_interfaces/msg/Log` | 14,021 |
| `/distance` | `std_msgs/msg/Float32` | 537 |
| `/map_metadata` | `nav_msgs/msg/MapMetaData` | 46 |
| `/tf_static` | `tf2_msgs/msg/TFMessage` | 5424 |
This data will be replayed using `ros2 bag play` for RViz visualization, algorithm tuning, and mapping evaluation.
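For example, from the repository root the recording can be inspected and replayed (assuming the recording folder listed above):

```bash
# Show topics, types, and message counts
ros2 bag info docs/rosbag/rosbag2_2025_03_30-14_47_38/

# Replay the session; --clock publishes bag time for nodes using use_sim_time
ros2 bag play docs/rosbag/rosbag2_2025_03_30-14_47_38/ --clock
```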
Collects and publishes data from IMU, distance sensor, and odometry.
- Nodes: `imu_serial_publisher`, `distance_publisher`, `odom_publisher`
- Publishes: `/imu`, `/distance`, `/odom`
- Subscribes: `/imu`, `/distance`
- Dependencies: `rclpy`, `robot_localization`, `tf2_ros`
- Launch File: `ekf.launch.py`
- Status: ✅ Built
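To rebuild and bring up just this package, something like the following could be used (a sketch assuming the package name `seav_test_sensors`, as in the bring-up scripts below):

```bash
cd /home/pi/seav
colcon build --packages-select seav_test_sensors
source install/setup.bash
ros2 launch seav_test_sensors ekf.launch.py
```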
Driver package for RPLIDAR A1.
- Publishes: `/parameter_events`, `/rosout`
- Launch File: `rplidar_a1_launch.py`
- Status: ✅ Built
Configurations and launch files for ROS 2 Nav2 stack.
- Nodes: `robot_state_publisher`, `map_server`, `amcl`, etc.
- Publishes: `/tf`, `/map`, `/amcl_pose`, `/cmd_vel`
- Subscribes: `/scan`, `/goal_pose`, `/map`
- Launch Files: `nav2.launch.py`, `bringup.launch.py`, etc.
- Status: ✅ Built
Forwards velocity commands from `/cmd_vel` to the Arduino.
- Publishes/Subscribes: `/cmd_vel`, `/rosout`
- Status: ✅ Built
Publishes static TFs between `base_link`, `imu`, and `lidar`.
- Node: `static_transform_publisher`
- Launch File: `static_tf_publisher.launch.py`
- Status: ✅ Built
Contains robot URDF/Xacro, meshes, and worlds for Gazebo simulation.
- URDF: Modular robot description using `xacro` files
- Mesh: `lidar.dae`
- Launch File: `bot.launch.py`
- Worlds: `empty.world`
- Status: ✅ Built
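To sanity-check the description outside of a full bring-up, the Xacro can be expanded and validated before launching (a sketch; the package name `seav_description` comes from the workspace layout below, and the Xacro file path is illustrative):

```bash
# Expand the Xacro to plain URDF and validate it (file path is illustrative)
xacro src/seav_description/urdf/robot.urdf.xacro > /tmp/seav.urdf
check_urdf /tmp/seav.urdf

# Launch the robot model and Gazebo world from this package
ros2 launch seav_description bot.launch.py
```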
Provides simulation environments and maps for testing and SLAM.
- Launch Files: `autonomous_navigation.launch.py`, `house_sim.launch.py`, `house_slam.launch.py`, `outdoor.launch.py`
- Maps: `house_map.pgm`, `house_map.yaml`
- Worlds: `house.world`, `bookstore.world`
- Configs: `ekf.yaml`, `nav2_params.yaml`
- Status: ✅ Built
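A typical mapping session in simulation could then look like this (a sketch assuming the package name `seav_simulation` from the workspace layout below; the output map name is illustrative):

```bash
# Start the house world with SLAM running
ros2 launch seav_simulation house_slam.launch.py

# In a second terminal, save the finished map
ros2 run nav2_map_server map_saver_cli -f ~/maps/house_map
```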
Contains nodes used for object recognition and distance estimation of detected objects.
- Nodes: `yolov8_node`
- Publishes: `/object_distance`, `/camera/image_raw`
- Dependencies: `rclpy`, `ultralytics`, `cv_bridge`, `sensor_msgs`
- Status: ✅ Built
This guide walks through bringing up your ROS 2-based robot in three structured phases: sensors, actuators, and navigation.
- Ensure your robot is connected and powered.
- Make sure you have access to the correct USB ports (e.g., `/dev/ttyUSB0`, `/dev/ttyUSB1`).
- Confirm that `install/setup.bash` exists and is sourced.
- Grant necessary permissions for devices: `sudo chmod 666 /dev/ttyUSB*`
```bash
#!/bin/bash
SESSION=ros_dev
WORKSPACE="/home/pi/seav"

# Launch robot state publisher
tmux new-session -d -s $SESSION -n rsp \
  "source $WORKSPACE/install/setup.bash && ros2 launch seav_description rsp.launch.py"

# Launch RPLIDAR
tmux new-window -t $SESSION -n rplidar \
  "source $WORKSPACE/install/setup.bash && ros2 launch rplidar_ros rplidar_a1_launch.py"

# Launch sensor publishers
tmux new-window -t $SESSION -n distance_pub \
  "source $WORKSPACE/install/setup.bash && ros2 run seav_test_sensors distance_publisher"
tmux new-window -t $SESSION -n imu_pub \
  "source $WORKSPACE/install/setup.bash && ros2 run seav_test_sensors imu_serial_publisher"
tmux new-window -t $SESSION -n odom_pub \
  "source $WORKSPACE/install/setup.bash && ros2 run seav_test_sensors odom_publisher"

tmux attach -t $SESSION
```
```bash
source /home/pi/seav/install/setup.bash
ros2 topic list
ros2 topic echo /scan
ros2 topic echo /imu
ros2 topic echo /odom
ros2 run tf2_tools view_frames
```
In RViz2:
- Fixed Frame: `base_link` or `base_footprint`
- Add Displays: `TF`, `LaserScan`, `IMU`, `Odometry`, `RobotModel`
```bash
#!/bin/bash
SESSION=ros_dev
WORKSPACE="/home/pi/seav"

# Launch motor controller
tmux new-session -A -t $SESSION -n motor_ctrl \
  "source $WORKSPACE/install/setup.bash && ros2 run my_robot_controller serial_motor_controller"

# Launch SLAM Toolbox
tmux new-window -t $SESSION -n slam_toolbox \
  "source $WORKSPACE/install/setup.bash && ros2 run slam_toolbox async_slam_toolbox_node \
--ros-args --params-file $WORKSPACE/src/seav_slam/mapper_params_online_async.yaml"

# (Optional) Save the map after mapping
# ros2 run nav2_map_server map_saver_cli -f /home/pi/k11_room_map

tmux attach -t $SESSION
```
```bash
ros2 topic echo /map
```
- In RViz2:
  - Add: `/map`, `/scan`, `/odom`
  - Confirm the map grows as the robot moves
```bash
#!/bin/bash
SESSION=ros_dev
WORKSPACE="/home/pi/seav"
MAP_FILE="$WORKSPACE/src/my_nav2_pkg/config/maze_map_save.yaml"

# Launch EKF node
tmux new-session -A -t $SESSION -n ekf \
  "source $WORKSPACE/install/setup.bash && ros2 launch my_robot_description ekf_launch.py"

# Launch localization using saved map
tmux new-window -t $SESSION -n nav2_local \
  "source $WORKSPACE/install/setup.bash && ros2 launch my_nav2_pkg localization_launch.py map:=$MAP_FILE"

# Launch full navigation stack
tmux new-window -t $SESSION -n nav2_nav \
  "source $WORKSPACE/install/setup.bash && ros2 launch my_nav2_pkg navigation_launch.py"

# Run program for object recognition
tmux new-window -t $SESSION -n yolo \
  "source $WORKSPACE/install/setup.bash && ros2 run yolov8_ros yolov8_node -- device 0"

tmux attach -t $SESSION
```
```bash
ros2 topic echo /odom
ros2 topic echo /odom_filtered
ros2 topic echo /object_distance
ros2 topic echo /camera/image_raw
```
- Confirm that the following nodes are active:
  - `/amcl`
  - `/controller_server`
  - `/planner_server`
- In RViz2:
  - Add: `2D Pose Estimate`, `2D Goal Pose`
  - Set goals and observe robot behavior
```bash
chmod +x bringup_phase1.sh bringup_phase2.sh bringup_phase3.sh

# Phase 1: Sensors + TF
./bringup_phase1.sh

# After validating TF and sensor topics
./bringup_phase2.sh

# After successful SLAM mapping
./bringup_phase3.sh
```
- Switch windows: `Ctrl + B`, then press `w`
- Kill session: `Ctrl + B`, then type `:kill-session`
| Phase | Functionality |
|---|---|
| Phase 1 | Robot model + Sensors + TF |
| Phase 2 | Actuators + SLAM mapping |
| Phase 3 | EKF + Localization + Navigation goals |
```
ros2_ws/
├── src/
│   ├── my_nav2_pkg/
│   ├── my_robot_controller/
│   ├── rplidar_ros/
│   ├── seav_description/
│   ├── seav_simulation/
│   ├── seav_slam/
│   ├── seav_test_sensors/
│   ├── static_tf_publisher/
│   ├── scripts/
│   └── yolov8_ros/
├── docs/
│   ├── images/        ← TF tree, architecture, circuit diagrams, etc.
│   ├── notebooks/
│   ├── ppt/
│   └── ros2_workspace_documentation.xlsx
├── README.md
├── build/
├── install/
└── log/
```
---
## 🐞 Troubleshooting & Debugging Tips
Use this section to quickly diagnose and fix common bring-up issues in each phase, using topics, TFs, RViz, and ROS 2 logs.
---
### 🔧 General Debugging Tips
| Check | Command |
|---------------------------|--------------------------------------|
| List all topics | `ros2 topic list` |
| Inspect a topic | `ros2 topic echo /topic_name` |
| List active nodes | `ros2 node list` |
| View TF frame tree | `ros2 run tf2_tools view_frames` |
| Visualize in RViz | `ros2 run rviz2 rviz2` |
| View running processes | `htop` or `top` |
| Check connected USB ports | `ls /dev/ttyUSB*` |
---
### 🛠️ Phase 1 Issues (Sensors + TF)
| Problem | Likely Cause | Fix |
|--------------------------|-------------------------------------------|-------------------------------------------------------------|
| Robot not visible in RViz| `robot_state_publisher` or URDF error | Check `rsp.launch.py` and verify URDF validity |
| Wheels not visible | Missing `joint_state_publisher` or TFs | Run: `ros2 run joint_state_publisher joint_state_publisher` |
| No TF frames in RViz | TFs not being published | Ensure `robot_state_publisher` is running |
| IMU or Lidar not publishing | Serial port or USB not detected | Run `ls /dev/ttyUSB*` and update the correct port |
| Permission denied on USB | Insufficient access to `/dev/ttyUSB*` | Run: `sudo chmod 666 /dev/ttyUSB*` |
---
### 🛠️ Phase 2 Issues (Actuators + SLAM)
| Problem | Likely Cause | Fix |
|--------------------|---------------------------------------------|----------------------------------------------------------|
| No map data `/map` | SLAM node not running or missing input | Check `async_slam_toolbox_node` logs |
| Map not updating | Wrong RViz frame or missing `/scan` | Set RViz fixed frame to `map`, confirm laser topic |
| Robot not moving | Motor controller misconfigured or not running | Check serial port and controller node output |
---
### 🛠️ Phase 3 Issues (EKF + Navigation)
| Problem | Likely Cause | Fix |
|--------------------------|----------------------------------------|------------------------------------------------------------|
| Navigation goal not executing | AMCL not initialized | Use `2D Pose Estimate` in RViz |
| No path planned | Map not loaded or pose TFs broken | Verify `map_server` and `robot_pose` TF tree |
| Odometry unstable | EKF config issue or noisy IMU input | Tune EKF parameters or check `/imu` and `/odom` topics |
| Nav2 nodes crash | YAML or parameter error | Check `localization_launch.py` and `navigation_launch.py` |
---
### 📄 Debugging Through Logs
ROS 2 nodes automatically save logs. These are critical for understanding crashes, parameter errors, and startup problems.
#### 🔍 Find logs for a node:
```bash
ros2 node info /your_node_name
```

Log files are stored under `~/.ros/log/latest_log/`; each node creates a subfolder with its PID. Inside, you'll find:

- `stdout.log` — standard output
- `stderr.log` — error messages

You can inspect them with:

```bash
cat ~/.ros/log/latest_log/<node_folder>/stderr.log
```
- Switch to the node's tmux window to see logs as they stream.
- To scroll in tmux:
  - Press `Ctrl + B`, then `[`
  - Use arrow keys to scroll
  - Press `q` to exit scroll mode
```bash
export RCUTILS_LOGGING_SEVERITY_THRESHOLD=INFO
export RCUTILS_CONSOLE_OUTPUT_FORMAT="[{severity}] [{name}]: {message}"
```
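Log verbosity can also be raised per node at launch time instead of globally, for example (using one of the sensor nodes from the bring-up scripts):

```bash
ros2 run seav_test_sensors imu_serial_publisher --ros-args --log-level debug
```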
- ROS 2 Distro: Humble
- Tools: `colcon`, `xacro`, `rviz2`, `robot_localization`, `nav2_bringup`

- ROS 2 Humble
- SLAM Toolbox (Async & Localization Modes)
- Navigation2 (Nav2)
- `ros2_control` and `ros2_controllers`
- URDF/Xacro (using XML 1.0)
- `robot_localization` (EKF-based sensor fusion)
- Joint State Publisher + GUI
- RViz2 (Visualization tool)
- `tmux` (Process manager for bring-up scripts)
- `colcon` (ROS 2 build system)
- Joshnewans Articubot git repo and YouTube channel
- https://github.com/mad-lab-fau/imucal (IMU calibration based on the Ferraris method)