
28th May, 2025

10 min read

A ros2 journey


For our capstone project we were tasked with programming 3 sets of TurtleBot3s; you can find the manual here.
There are two models; we were working with the TurtleBot3 Burger. You can spot the lidar on top of the robot: it spins and makes a noise when the robot is turned on.

Our professor wanted us to use ROS Jazzy and was sure the TurtleBot supported it, so we started with Ubuntu 24.04 (Noble) and installed ROS on it. After that you can install Python and use the ros2 package, or write a C/C++ project.

The class requirements (rescue bot)

  • Robot moving autonomously, avoiding obstacles and able to maneuver itself through cones
  • Detection of object of interest using the camera, aka the object to rescue
  • Broadcast status to other turtlebots and act on it (swarm algorithm)
As an experiment, I decided to implement this in Rust. My motivation was to:
  • Statically link dependencies into one binary which can be run on the bot's computer seamlessly
  • Leverage Rust's fearless concurrency to achieve fast reaction times from sensors while preventing data races
  • Keep a smaller memory footprint compared to Python/Go
Using the following crates

r2r = "0.9" # async rust bindings for ROS2
tokio = { version = "1", features = ["full"] } # async
serde_json = "1.0"
serde = { version = "1.0", features = ["derive"] }
futures = "0.3.31"
usls = { version = "0.0.20", default-features = false } # yolov8 inference
nokhwa = { version = "0.10.7", features = ["input-native", "output-threaded"] } # webcam capture library
image = "0.25.6" # imgae processing
anyhow = "1.0.97" 
ort = "2.0.0-rc.9" # onnix runtime
rand = "0.9.0"
async_cell = "0.2.2"

ROS2 nodes and topics

A ROS2 node allows us to send/receive information on these "topics", which are initialized when the turtlebot is brought to life by running the initialization script

ros2 launch turtlebot3_bringup robot.launch.py

To check if the bot is initialized correctly, run

ros2 topic list

And you should see the /cmd_vel velocity topic and the /scan lidar topic.

We are able to construct a node given a name using the following function


use std::sync::{Arc, Mutex};

pub type Node = Arc<Mutex<r2r::Node>>;

// generate a node with a given name; the namespace is set to turtlemove
pub fn generate_node(name: &str) -> r2r::Result<Node> {
    let name_space = "turtlemove";
    let ctx = r2r::Context::create()?;
    let node = r2r::Node::create(ctx, name, name_space)?;
    let mutex = Mutex::new(node);
    let arc = Arc::new(mutex);

    Ok(arc)
}

We then construct two nodes:

  • one for navigation, to send Twist messages to /cmd_vel, subscribe to /odom odometry data, and provide it to the navigation process
  • one for lidar, to subscribe to the /scan topic, process the lidar data, and hand it to the nav process so it can react to it

let lidar_node = generate_node("lidar_node")?;

// Launch the lidar communication channel
// async cell is used to create a callback from a future 
let cell_lidar = AsyncCell::shared();
let weak_lidar = cell_lidar.take_weak();

let lidar_cl = Arc::clone(&lidar_node);
// lidar process
tokio::spawn(async move {
    let mut lidar_node_sub = {
        let mut lock = lidar_cl.lock().expect("failed to lock lidar node");
        // subscribe to lidar node
        let qos = QosProfile::default().best_effort();

        lock.subscribe("/scan", qos)
            .expect("Subscribing to lidar should work")
    };

    lidar::lidar_scan(&mut lidar_node_sub, cell_lidar).await;
});

The lidar_scan function processes the LaserScan data and converts it into meaningful directions the nav module can interpret.
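
I haven't shown the lidar module itself; below is a minimal sketch of roughly how lidar_scan and lidar_data fit together, assuming the AsyncCell carries the raw LaserScan and the conversion happens in lidar_data. The Direction field names come from the nav code further down, but the angular sectors and the 0.3 m threshold are illustrative assumptions, not the real values.

use std::sync::Arc;

use async_cell::sync::AsyncCell;
use futures::{Stream, StreamExt};
use r2r::sensor_msgs::msg::LaserScan;

// Obstacle flags derived from one lidar sweep
pub struct Direction {
    pub north: bool,
    pub north_west: bool,
    pub north_east: bool,
}

// Convert raw ranges into obstacle flags, assuming the burger's 360-sample
// sweep with index 0 pointing straight ahead (sectors/threshold are guesses)
pub fn lidar_data(scan: &LaserScan) -> Direction {
    let close = |i: usize| {
        scan.ranges
            .get(i)
            .map(|r| *r > 0.05 && *r < 0.3)
            .unwrap_or(false)
    };

    Direction {
        north: (0..15).chain(345..360).any(close),
        north_west: (15..60).any(close),
        north_east: (300..345).any(close),
    }
}

// Forward every scan from the /scan subscription into the AsyncCell so the
// nav process can always await the freshest value
pub async fn lidar_scan(
    sub: &mut (impl Stream<Item = LaserScan> + Unpin),
    cell: Arc<AsyncCell<LaserScan>>,
) {
    while let Some(scan) = sub.next().await {
        cell.set(scan);
    }
}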


let nav_node = generate_node("nav_node")?;

let (yolo_tx, yolo_rx) = mpsc::channel(1000);

let config_cl = config.clone();
// camera process + yolo detect
// having a non-async process in between async processes is fine, and I confirmed it in the rust discord with devs; if it doesn't block it should be fine, it just isn't recognized by tokio's runtime executor and included in its runtime
std::thread::spawn(move || {
    let cam = camera::cam_plus_yolo_detect(yolo_tx, config_cl);
    println!("{:?}", cam)
});
// There isn't an async way to do video capture yet, maybe with io_uring and ring buffers?
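
Roughly, cam_plus_yolo_detect looks like the sketch below. The nokhwa setup follows the crate's README, but run_yolo and the Detection struct are stand-ins for the usls/ort YOLOv8 call, so treat this as an outline rather than the real camera module.

use anyhow::Result;
use nokhwa::pixel_format::RgbFormat;
use nokhwa::utils::{CameraIndex, RequestedFormat, RequestedFormatType};
use nokhwa::Camera;
use tokio::sync::mpsc::Sender;

use crate::ModelConfig; // the project's model/config struct (module path assumed)

// Hypothetical detection summary sent to the nav process
#[derive(Debug)]
pub struct Detection {
    pub label: String,
    pub confidence: f32,
}

// Stand-in for the usls/ort YOLOv8 inference; the real version loads the
// onnx model described by ModelConfig and returns boxes for the charm
fn run_yolo(_frame: &image::RgbImage, _config: &ModelConfig) -> Result<Option<Detection>> {
    Ok(None)
}

// Grab frames on a plain OS thread, run detection, and push hits onto the
// tokio channel with blocking_send (we are not inside the async runtime here)
pub fn cam_plus_yolo_detect(tx: Sender<Detection>, config: ModelConfig) -> Result<()> {
    let mut camera = Camera::new(
        CameraIndex::Index(0),
        RequestedFormat::new::<RgbFormat>(RequestedFormatType::AbsoluteHighestResolution),
    )?;
    camera.open_stream()?;

    loop {
        // decode the raw frame into an RGB image buffer
        let frame = camera.frame()?.decode_image::<RgbFormat>()?;

        if let Some(detection) = run_yolo(&frame, &config)? {
            // stop if the nav side has hung up
            if tx.blocking_send(detection).is_err() {
                return Ok(());
            }
        }
    }
}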

And finally we have the navigation process; the last operation, which blocks the application, is spinning both nodes

// Odometer process
tokio::spawn(async move {
    let mut odom_node_sub = {
        let mut lock = cl.lock().expect("Locking the nav node should work");
        let qos = QosProfile::default();
        lock.subscribe("/odom", qos)
            .expect("Subscribing to odom should work")
    };

    odom::listen(&mut odom_node_sub, cell_odom).await;
});

// navigation process
tokio::spawn(async move {
    // this is what the bot is doing at any point in time
    let start_sequence = Sequence::RandomMovement;

    let x = nav::move_process(
        start_sequence,
        cl,
        weak_lidar,
        yolo_rx,
        weak_odom,
        config_cl,
    )
    .await;

    println!("{:?}", x);
});

// spin the nodes
loop {
    if let Ok(mut nav_handle) = nav_node.lock() {
        nav_handle.spin_once(node_spin_dur);
    }

    if let Ok(mut lidar_handle) = lidar_node.lock() {
        lidar_handle.spin_once(node_spin_dur);
    }
}

Navigation process

We set up all our AsyncCells and channel receivers so we are able to receive sensor data. We also set up a UdpSocket to receive another turtlebot's odometer information


// main navigation logic
pub async fn move_process(
    // sequence to start the nav move from
    starting_seq: Sequence,
    nav_node: crate::Node,
    // lidar scan async cell
    lidar_rx: TakeWeak,
    // yolo object recv from camera
    mut yolo_rx: Receiver,
    // odometer recv
    odom_rx: TakeWeak,
    config: ModelConfig,
) -> Result<()> {
    // publish twist messages
    let publisher = TwistPublisher::new(nav_node.clone());
    // take a random distance step forward
    let distance_step = Uniform::new(400, 500).expect("Failed to create distance step");
    // listening for swarm data
    let socket = UdpSocket::bind("0.0.0.0:8000").await?;
    // connect a single bot
    socket.connect(&config.addr[0]).await?;
    let socket_arc = Arc::new(socket);
    let s = socket_arc.clone();
    let sequence_mut = Arc::new(Mutex::new(starting_seq));
    // response buffer from a turtlebot navigation process
    let mut buf = [0; 1024];
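
The TwistPublisher itself isn't shown in these snippets; below is a minimal sketch of what it can look like on top of r2r's create_publisher. The send helper and its signature are invented for illustration, not the project's actual API.

use std::sync::Arc;

use r2r::geometry_msgs::msg::Twist;
use r2r::QosProfile;

// Thin, cloneable wrapper around an r2r publisher on /cmd_vel
#[derive(Clone)]
pub struct TwistPublisher {
    publisher: Arc<r2r::Publisher<Twist>>,
}

impl TwistPublisher {
    pub fn new(node: crate::Node) -> Self {
        let publisher = node
            .lock()
            .expect("locking the nav node should work")
            .create_publisher::<Twist>("/cmd_vel", QosProfile::default())
            .expect("creating the /cmd_vel publisher should work");

        Self {
            publisher: Arc::new(publisher),
        }
    }

    // publish a single velocity command: linear.x in m/s, angular.z in rad/s
    pub fn send(&self, linear_x: f64, angular_z: f64) -> r2r::Result<()> {
        let mut msg = Twist::default();
        msg.linear.x = linear_x;
        msg.angular.z = angular_z;
        self.publisher.publish(&msg)
    }
}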

Now inside our control loop we do two things

  • Try to receive data on the UdpSocket from another turtlebot; if we do receive something, we lock the sequence variable's mutex and change it to Swarming (sketched right after the snippet below)
  • Match over the Sequence state machine to check what the robot is doing at any time and provide the logic for its actions
loop {
    let socket_arc = Arc::clone(&socket_arc);
    let seq_cl = Arc::clone(&sequence_mut);
    
    tokio::spawn(async move {
        if let Ok(x) = socket_arc.try_recv(&mut buf) {
            ...
        }
    });

    let mut sequence = sequence_mut.lock().await;
    ..
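
The elided receive branch does roughly what the first bullet describes; a simplified sketch of it (the exact handling of OdomData is not shown in the post) could be:

// inside the spawned task: parse the other bot's odometry and switch to Swarming
if let Ok(n) = socket_arc.try_recv(&mut buf) {
    if let Ok(odom) = serde_json::from_slice::<OdomData>(&buf[..n]) {
        let mut seq = seq_cl.lock().await;
        *seq = Sequence::Swarming(odom);
    }
}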

The Sequence state machine is as follows

/// What the bot is doing at any point in time
pub enum Sequence {
    // Start randomly moving in x, y direction
    RandomMovement,
    // If charm is located, start moving towards it
    TrackingToCharm,
    // If swarm instruction is sent track to the swarm
    Swarming(OdomData),
    // Stop
    Stop,
}

Now for our RandomMovement clause, our logic looks like this

match *sequence {
    Sequence::RandomMovement => {
        // move randomly
        tokio::spawn(nav_move(
            distance_step.sample(&mut rand::rng()) as f64,
            0.2,
            publisher.clone(),
        ));
        
        tokio::select! {
            lidar = &lidar_rx => {
                if let Some(scan) = lidar {
                    let direction = lidar::lidar_data(&scan);
                    
                    if direction.north {
                        nav_move(10.0, -0.2, publisher.clone()).await;
                        rotate(0.1, publisher.clone(), None).await;
                    }

                    if direction.north_west {
                        rotate(2.0, publisher.clone(), None).await;
                    }

                    if direction.north_east {
                        rotate(-2.0, publisher.clone(), None).await;
                    }
                }
            }
            // check yolo receiver
            yolo = yolo_rx.recv() => {
                if yolo.is_some() {
                    *sequence = Sequence::TrackingToCharm;
                    nav_stop(publisher.clone());
                }
            }
        }
    }
    ...    

In the loop, whenever we hit the RandomMovement clause, we first move a random distance step forward, and then we have a tokio select with two branches

  • The lidar channel receiver: as we receive directions from the lidar channel, we immediately act on them by rotating the bot away from the detected direction (the bot is probably still in forward motion while this is happening)
  • If we detect an item of interest and receive something on the yolo camera channel, then we change the Sequence and stop the robot

This allows the robot to stop itself while it's moving randomly and then track itself to the charm, the item of interest.
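
For reference, nav_move can be sketched on top of the TwistPublisher helper from earlier. The millimetre distance unit and the timer-based stop are assumptions; the real implementation may well use odometry instead, and rotate would look similar but drive angular.z.

// drive at `speed` m/s long enough to (roughly) cover `distance`, then stop;
// distance is assumed to be in millimetres
pub async fn nav_move(distance: f64, speed: f64, publisher: TwistPublisher) {
    let duration_s = (distance / 1000.0) / speed.abs().max(0.01);

    let _ = publisher.send(speed, 0.0);
    tokio::time::sleep(std::time::Duration::from_secs_f64(duration_s)).await;
    let _ = publisher.send(0.0, 0.0);
}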

Shows me SSHed into the turtlebot's Raspberry Pi and running the binary of the turtlebot3-rust project; the robot starts moving straight and avoids obstacles

The robot stops because, after avoiding the wall, the camera sees the cone and sends info on the channel; the tokio branch completes, the robot stops and starts tracking to the charm.

Compiling the binary

The TurtleBot3's computer is a Raspberry Pi. We can run the bringup script and cargo r to run the simulation, or cross compile the binary if the Raspberry Pi is not strong enough. This is what we did, as the Raspberry Pis we were using were Pi 3s with around 4gb of RAM.

To cross compile you can use xargo, but the problem is the sourcing of ROS: you need to source ROS before compiling.

  • With xargo or normal cargo you would have to install ROS on the system you are running cargo/xargo on, so it can only be a system that ROS supports
  • With a Docker image tuned for ROS, we can install Rust inside it, cross compile the binary in the Docker image, and copy the resulting binary to the host; we can then scp the binary to the bot or do this whole process in GitHub Actions as Continuous Deployment

WIN

The Docker image works for me on a Mac M1; this supports development on virtually any machine that can run Docker and Rust and doesn't limit you to ROS-supported devices!


Links: