The Evolution of a Scootbot [Pic Heavy]

Hey All,

I started my journey about a year before I found Josh’s channel. It was a rough journey: I had some Python experience but had never heard of ROS.

It was mid-Covid and we were all stuck in the house. I bought my son a robot kit that he started but never finished. In an effort to encourage him (he was 17, btw!) I started buying parts myself and having fun. It was supposed to be a joint effort, but teens are teens, and the Xbox was more fun to him.

I coded it in pure Python with basic movement and OpenCV camera stuff. It was fun, but the form factor was a space limitation and it was never going to take a payload.

I needed something BIGGER!

Introducing Scootbot V2! I found a cheap mobility scooter on eBay with the idea that if it can take a human’s weight it can do something more useful than just Scoot about the house.

I have a fairly large garden, and a robot helper would certainly be a fun project.

It was painted and had lights with a relay, and joy control, still using generic hand-coded Python. The Ackermann steering system killed this project. Not only did it have the turning circle of a bus, but the torque required to turn those wheels on grass and mud was crazy.

I knew at this point diff drive would be the way forward and found a second-hand electric wheelchair.

Scootbot V3 was born!

I started learning ROS 2, and it was tough. Never did I find anything step-by-step that truly guided you through the process from URDF to Nav2 the way Josh has done, so hats off to him, and if you are just getting started, be grateful! His work is by far the best newbie material I have found in almost 18 months.

Before I found this channel I made some friends on Reddit and we started a Slack group. One of them is a professional robotics engineer who has helped and guided me so much; none of this would have been possible without his kind and patient guidance.

I got the URDF up and the models created, the lidar working, and Nav2 running.

What was not working well was odometry. I had no encoders or IMU at the time, so everything was running off lidar odometry.

I bought the IMU and encoders from Phidgets and got to work. The IMU was quite easy to set up, and after some time working out how to mount the encoders, I decided to mount them on the drive shaft of the geared motor.

This was a huge mistake, as the drive shaft was spinning so fast that it was well above the supported RPM of the (actually quite high-end) encoders. This caused patches of missing and inaccurate data.
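
To put rough numbers on why that mounting point was doomed: the drive shaft turns gear-ratio times faster than the wheel, so even modest speeds push it into the thousands of RPM. Here is a quick back-of-the-envelope check in Python, with made-up numbers rather than my actual specs:

```python
import math

# Back-of-the-envelope check of shaft speeds. All numbers are made up for
# the example - the real wheel size, top speed and gear ratio depend on
# the actual wheelchair hardware.
wheel_diameter_m = 0.30   # hypothetical wheel diameter
top_speed_mps = 1.5       # hypothetical top speed
gear_ratio = 32.0         # hypothetical reduction (motor turns per wheel turn)

wheel_rpm = top_speed_mps / (math.pi * wheel_diameter_m) * 60.0
motor_rpm = wheel_rpm * gear_ratio

print(f"Driven (wheel) shaft: {wheel_rpm:.0f} RPM")   # roughly 95 RPM here
print(f"Drive (motor) shaft:  {motor_rpm:.0f} RPM")   # roughly 3,050 RPM here
```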

This took a long time to diagnose, and it was the first time I considered rage quitting.

It was obvious I needed to take the reading from the driven shaft, but there was very limited space and only one accessible end of the driven axle. A motor designed for this would have a protruding second shaft to attach an encoder to.

In the end and after various design iterations there was JUST enough room to squeeze it in here:

Video Link

I was so happy! It worked! No more lost data! Surely everything was going to work like a dream now!!!

Nope, there were still issues, two actually. When I drove the bot around my property trying to map my uniform rectangular house, by the time it had scanned all four sides with the lidar, the map looked like a trapezium!

I was beyond frustrated now, so I talked to my robotics friend, who helped me identify two root-cause issues. Firstly, the motor control code I had created was just plain Python, and I was not using ros2_control; I had heard of it, but it seemed too hard to understand. The code was sending motor signals as integers in the range -100 to +100, NOT as a velocity in m/s. The second issue was that the code had no closed-loop control, as discussed in the videos. The two motors were not the exact same spec, so if you sent +50 to each motor, over time the robot would drift to one side.
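
For anyone in the same boat, here is a rough sketch of the closed-loop idea that was missing. The helpers read_wheel_velocity() and send_motor_command() are hypothetical stand-ins for the real encoder and Sabertooth code, and the gains are untuned placeholders:

```python
import time

KP, KI = 8.0, 2.0   # placeholder PI gains - real values need tuning on the robot

def drive_wheel_at(target_mps, read_wheel_velocity, send_motor_command,
                   dt=0.02, duration=5.0):
    """Hold one wheel at target_mps using a simple PI feedback loop.

    read_wheel_velocity() should return the measured wheel speed in m/s
    (from the encoder); send_motor_command() takes the -100..+100 value
    the motor driver expects.
    """
    integral = 0.0
    for _ in range(int(duration / dt)):
        error = target_mps - read_wheel_velocity()
        integral += error * dt
        command = KP * error + KI * integral              # correction, not a fixed number
        send_motor_command(max(-100, min(100, command)))  # clamp to driver range
        time.sleep(dt)
```

Running a loop like this independently on each wheel is what stops the mismatch between the two motors from dragging the robot off to one side.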

Now, the guy I mentioned is also building his own robot, and it turns out he had the same Sabertooth controller I was using. He said, and I quote (I loved this, it was so nice of him!): “I am building my robot with the hardware as close to the Scootbot spec as possible so we can build and develop together!”

By this point I knew we needed a hardware interface for ros2_control, and that with PID and velocity commands things would work. But there was no way I was building a C++ interface on my own.

For the last couple of months I have been waiting for this guy to “catch up” with his physical build, and I mothballed my project in the meantime.

And then: Josh to the rescue! After digesting the ros2_control videos a few times and looking for various other non-existent solutions, I came across a module for the Sabertooth motor driver I am using: a hardware module that turns your driver into a pseudo-servo.

https://www.dimensionengineering.com/products/kangaroo

After reading the docs I could see that you can just send basic serial commands to get and set the velocity of the motor. After tuning the Roo with the encoder settings etc., that’s all there is to it.
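
From Python that boils down to something like the sketch below, using pyserial. The port, baud rate and exact command strings are assumptions on my part, so double-check them against the Simplified Serial section of the Kangaroo manual before relying on this:

```python
import serial

# Illustrative only: the port, baud rate and command syntax below are
# placeholders and need checking against the Kangaroo manual.
kangaroo = serial.Serial("/dev/ttyUSB0", 9600, timeout=0.1)

def send(cmd: str) -> None:
    kangaroo.write((cmd + "\r\n").encode("ascii"))

send("1,start")   # wake up channel 1 (e.g. left motor)
send("2,start")   # wake up channel 2 (e.g. right motor)

send("1,s500")    # command a speed of 500 units/s on channel 1
send("2,s500")    # same on channel 2

send("1,gets")    # ask channel 1 for its measured speed
print("Channel 1 replied:", kangaroo.readline().decode("ascii").strip())
```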

Then I realised, after trying to digest and learn from the diffdrive_arduino code, that this is all the hardware interface is doing on the ROS side. So my Roo will basically become the “Arduino” in Josh’s solution.

So, ladies and gentlemen. It is with great pleasure I announce scootbot is back!

(Hopefully, maybe if this plan works!)


This is a fantastic project! It’s great seeing how you’ve pushed through despite the issues, the kind that are unfortunately all too common in robotics projects. That’s why I’m glad to see people documenting their work, so that hopefully someone else with similar issues finds a solution! The Kangaroo board looks like it has great potential as a replacement for the Arduino controller code.

P.S. I hope you don’t mind, I went in and rejigged your picture links so they actually display, Google Drive is a bit of a pain to cooperate with.

Hello,
I’m trying to see the pictures, but all links seem dead.
I’ve recently started learning ROS for use with the exact same hardware: an old mobility scooter.
Looking forward to seeing your project proceed.

Greetings from Groningen in The Netherlands