Part A
Part B
Blog post for part A
Blog post for part B
This is the discussion topic for the video and blog post linked above. Please keep all replies relevant to the content, otherwise create a new topic.
Hello.
Which IP addresses are used? Is the first line that of the robot and the second line that of the dev machine?
Unfortunately, the npm server is not running as expected for me.
Do I have to rebuild the root directory when there are changes, or just run npm install and then npm start in the example folder?
Are you referring to where I set the IP addresses in the web page? The IP address for the ImageStream should be whichever machine is running web_video_server, and the one for RealGamepad/SimGamepad should be whichever is running rosbridge. In my example the former was the Pi and the latter was the dev machine. Make sure you do it for both of the pages (sim and real).
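A quick way to check which machine is actually serving each piece, independent of the web page, is a plain TCP connection test. This is a sketch assuming the default ports (rosbridge websocket listens on 9090 by default, web_video_server on 8080); the IP addresses in the comments are placeholders for your own machines:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder IPs - substitute your own machines:
# port_open("192.168.1.10", 9090)  # is rosbridge up on the dev machine?
# port_open("192.168.1.20", 8080)  # is web_video_server up on the Pi?
```

If the check fails for the IP you typed into the page, the problem is the address or the server, not the React code.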
In what way?
Yeah, good question, I get very confused by it all. In theory when you npm start it should auto pick up changes. I think in this case, because there are actually two levels (the library and the example page), it doesn't seem to work properly. And to make things confusing, the "conceptual top level" (the example page) code is nested inside the lower level (the library).
So if you are only tweaking something at the example level you should not need to rebuild, but if you tweak the library code I think you need to rebuild it and then rerun the example/page/server bit.
This is an area I am very weak in, so I probably did it all wrong, and I also wrote it a couple of years ago.
I am (slowly) working on a Foxglove extension to totally replace this, which will be a far better approach!
Hi there,
I am seeing some strange behaviour. While using a gamepad, my robot turns right when I steer to the left, and it turns left when I steer to the right.
Where is my mistake here?
Thanks
Hey!
The simplest answer is probably: in the teleop_twist_joy params, try just flipping the sign of the angular Z value.
You can also step through and echo /joy and your cmd_vel topic to check what the values are at each stage.
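In teleop_twist_joy terms, "flipping the sign" just means negating the angular scale in the params file. A sketch (the parameter layout follows the ROS2 teleop_twist_joy package; the axis index and magnitude here are placeholders, so check them against your own config):

```yaml
teleop_twist_joy_node:
  ros__parameters:
    axis_angular:
      yaw: 0        # placeholder - use your stick's actual axis index
    scale_angular:
      yaw: -0.5     # sign flipped from 0.5 to reverse left/right turning
```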
Thanks for the fast answer.
Meanwhile I have found the problem: I had to swap the motor cables on the controller.
Thank you Josh for your great video.
My robot turns left with the joystick, but in RViz it turns right. I think you explained this in your tutorial, but I couldn't find the explanation. Where should I fix it?
If anyone knows this issue and how to fix it, please let me know.
Regards,
John
Fixed with a hardware connection: the motor wiring to the Arduino.
camera.launch.py and launch_robot.launch.py interrupt each other (in ros2_control.xacro).
rplidar.launch.py and launch_robot.launch.py are OK.
The issue: while using rviz2 -d ~~… I cannot use the camera.
Hey John, sorry I'm a little confused by your replies - can you please clarify what the current problems are?
If the camera is not showing up in RViz, make sure you are using an Image display (not a Camera display, as it can be trickier to get going) and check with the dropdown that the topic is actually being published. There was an issue I didn't explain clearly at this point: the simulated camera and real camera were set up to publish on slightly different topics.
If you can explain the issue more clearly I'm happy to try and help
There is a clash between the camera and the Arduino.
The lidar and the Arduino (robot) look fine at the moment.
In ros2_control.xacro (the forum stripped some of the tags; this is the hardware block, with the parameter tags restored to match the diffdrive_arduino layout):

    <hardware>
        <plugin>diffdrive_arduino/DiffDriveArduino</plugin>
        <param name="left_wheel_name">left_wheel_joint</param>
        <param name="right_wheel_name">right_wheel_joint</param>
        <param name="loop_rate">30</param>
        <param name="device">/dev/ttyUSB1</param>
        <param name="baud_rate">57600</param>
        <param name="timeout">1000</param>
        <param name="enc_counts_per_rev">3436</param>
    </hardware>

(USB0 is the lidar, USB1 is the robot. I also tested with /dev/serial/by-path/platform-fd500000.pcie-pci-0000:01:00.0-usb-0:1.3.1:1.0-port0 instead of /dev/ttyUSB1, but got the same issue.)
On the Pi:
mate_bot@pi:~$ ls /dev/ttyUSB*
/dev/ttyUSB0 /dev/ttyUSB1
mate_bot@pi:~$ ls /dev/serial/by-path
platform-fd500000.pcie-pci-0000:01:00.0-usb-0:1.3.1:1.0-port0
platform-fd500000.pcie-pci-0000:01:00.0-usb-0:1.3.3:1.0-port0
In RViz2, after pressing the reset button, the robot and lidar are working, but the camera shows a warning: "Camera info: No camera info received on [/camera/camera_info]. Topic may not exist."
If I only run the camera launch file, rqt_image_view works.
Please let me know how to fix the clash between the camera and the robot (Arduino). I searched Google but couldn't find a solution.
Sorry for the interruption, and for my poor English.
A camera issue with the web server - if anyone has an idea, please let me know.
http://localhost:3000/realgamepaddemo
There is no video stream.
I checked the topic list on the dev machine. There is a /image_raw, as below.
dev-pc@Dev-PC:~/dev_ws$ ros2 topic list
/camera_info
/diff_cont/cmd_vel_unstamped
/image_raw
/image_raw/compressed
/image_raw/compressedDepth
/image_raw/theora
/joy
/joy/set_feedback
/parameter_events
/rosout
Tested with rqt_image_view, which works on the dev machine:
dev-pc@Dev-PC:~/dev_ws$ ros2 run rqt_image_view rqt_image_view
The Pi-side web_video_server is working well, I think:
mate_bot@pi:~/robot_ws$ ros2 run web_video_server web_video_server
[INFO] [1679537318.652438037] [web_video_server]: Waiting For connections on 0.0.0.0:8080
Hey John, ok let's unpack a few of these things.
Unless you have a camera that works over serial, it shouldn't be possible for these to clash.
Ah, I think this might be a problem that I've been meaning to look into. In RViz there are two ways to look at camera data - an "Image" display and a "Camera" display. The Camera display expects there to be not just an image topic, but a CameraInfo topic to go along with it. When we simulate with Gazebo it creates both topics, but I think the camera driver I'm using is NOT creating the CameraInfo topic. I think this might be because it doesn't have calibration data, but I'm not 100% sure.
Everything should work fine regardless, as all the other tutorials rely only on the image data, not the camera info. And in RViz you can just change it to an "Image" display.
I saw you've deleted this one, but to answer the question anyway in case someone else finds this: if you are running it with the "real controller" it won't show the buttons until it has seen the gamepad. Usually that happens the first time you press a button after loading the page.
At a guess, I'd say you're not running the web video server, which also needs to be running - it converts the image topics to a web-friendly format. Here is the repo for it; I do show that in the "Part B" video.
As a final note, I am still working on my plugin for Foxglove that makes that whole janky web demo completely redundant, so stay tuned for that
Hope that helps, please reply if you have further troubles.
For RViz2, a git clone issue:
git clone -b port_plugin-tutorials https://github.com/ros-visualization/visualization_tutorials
Cloning into 'visualization_tutorials'…
fatal: Remote branch port-plugin-tutorials not found in upstream origin
Thank you very much for your wonderful lecture.
I am not an IT person, so sometimes things that are very easy for you are very difficult for me. Your 20-minute lecture takes me more than a week of learning - hence so many stupid questions and issues.
Hey Josh,
I use a Gstreamer node that I found on GitHub: https://github.com/clydemcqueen/gscam2 (a ROS2 camera driver for GStreamer-based video streams, supports intra-process communication).
It's a ROS2 port, publishes the camera info, and worked fine for me. The Jetson Nano has some issues with MIPI cameras, which is why I went this way.
Teleop 14 Part A. Thank you for this excellent tutorial, which brings together essential info on understanding teleoperation. The joy_tester package is very helpful for determining the button and axis numbering in a ROS2 system - this is not explained as clearly anywhere else - and I am pleased that it can now be installed as a binary on an Ubuntu 22.04 machine: ros-humble-joy-tester.
Hey Josh,
I am trying to subscribe to the Joy message from the joy_linux node in a custom node. Basically I want to convert the Joy message to a Float64 for speed (linear) and a Float64 for wheel angle, for a robot behaving like an Ackermann-steered vehicle. I have written the pub/sub nodes for these two messages but am getting an error in the code. Also, how can I control an Ackermann-steered robot? There is no built-in support for Ackermann steering in ROS2 - no ros2_control setup, and no Gazebo plugin except the Toyota Prius example, which doesn't work. I am new to ROS, so I need some advice on these issues.
If you can help me with this, it would be great. Using ROS2 Humble, Ubuntu 22.04.
THE CODE:
#!/usr/bin/env python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Joy
from std_msgs.msg import Float64


def rescale(x, oldmin, oldmax, newmin, newmax):
    r = (x - oldmin) / (oldmax - oldmin)
    return newmin + r * (newmax - newmin)


class Joystick2speed(Node):
    def __init__(self):
        super().__init__('joystick2speed')
        self.pub = self.create_publisher(Float64, 'speedcmd_metersec', 1)
        # create_subscription takes the message type FIRST, then the topic name
        self.sub = self.create_subscription(Joy, '/joy', self.joy_callback, 1)
        self.buffer = ['X'] * 100
        self.buffer_index = 0
        self.get_logger().info("Speed node has been initialized")

    def joy_callback(self, data):
        # Joy has no .y attribute; the stick positions are in data.axes
        # (axes[1] is typically the left stick's vertical axis)
        joystick_y = -data.axes[1]
        deadmanzone_min = -0.2
        deadmanzone_max = 0.2
        max_fwd_speed_per_second = 2.0
        max_bkwd_speed_per_sec = 2.0
        if deadmanzone_min < joystick_y < deadmanzone_max:
            velocity = 0.0
        elif joystick_y > 0:
            velocity = rescale(joystick_y, deadmanzone_max, 1., 0., max_fwd_speed_per_second)
        else:
            velocity = rescale(joystick_y, -1., deadmanzone_min, -max_bkwd_speed_per_sec, 0.)
        # Message fields must be set by keyword; Float64(velocity) raises an error
        msg_out = Float64(data=float(velocity))
        print(velocity)
        self.pub.publish(msg_out)


def main(args=None):
    rclpy.init(args=args)
    joystick2speed = Joystick2speed()
    rclpy.spin(joystick2speed)
    rclpy.shutdown()


if __name__ == '__main__':
    main()
Error: Traceback (most recent call last):
  File "/home/computing/podcar_2/src/pod2_bringup/scripts/joystcik2wheelAngle.py", line 34, in <module>
    main()
  File "/home/computing/podcar_2/src/pod2_bringup/scripts/joystcik2wheelAngle.py", line 30, in main
    joystick2wheel = Joystick_2_wheel()
  File "/home/computing/podcar_2/src/pod2_bringup/scripts/joystcik2wheelAngle.py", line 16, in __init__
    self.pub = self.create_publisher('wheelAngleCmd', Float64, 10)
  File "/opt/ros/humble/local/lib/python3.10/dist-packages/rclpy/node.py", line 1273, in create_publisher
    final_topic = self.resolve_topic_name(topic)
  File "/opt/ros/humble/local/lib/python3.10/dist-packages/rclpy/node.py", line 1226, in resolve_topic_name
    return _rclpy.rclpy_resolve_name(self.handle, topic, only_expand, False)
TypeError: rclpy_resolve_name(): incompatible function arguments. The following argument types are supported:
    1. (arg0: rclpy::Node, arg1: str, arg2: bool, arg3: bool) -> str
Invoked with: <rclpy._rclpy_pybind11.Node object at 0x7ffb31f413b0>, <class 'std_msgs.msg._float64.Float64'>, False, False
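As an aside, the rescale/deadzone mapping used in the node above can be sanity-checked in isolation with plain Python (no ROS needed), using the same deadzone edge (0.2) and max speed (2 m/s) as in the posted code:

```python
def rescale(x, oldmin, oldmax, newmin, newmax):
    """Linearly map x from [oldmin, oldmax] to [newmin, newmax]."""
    r = (x - oldmin) / (oldmax - oldmin)
    return newmin + r * (newmax - newmin)

# Forward stick: map (deadzone edge .. full deflection) -> (0 .. 2 m/s)
print(rescale(1.0, 0.2, 1.0, 0.0, 2.0))  # 2.0 at full deflection
print(rescale(0.2, 0.2, 1.0, 0.0, 2.0))  # 0.0 right at the deadzone edge
print(rescale(0.6, 0.2, 1.0, 0.0, 2.0))  # 1.0 halfway through the range
```

This confirms the mapping itself is fine; the errors in the posted script are in the ROS API calls (argument order and message construction), not the math.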
Hey Josh,
I've destroyed my whole system trying to make "control your robot from your phone" work. I'm not able to install anything with apt any more - it gives me a strange, irreversible dependency error (for example, navigation needs cereslib, which can't be installed).
Now, before I try again on a fresh system, I need to know: with your method (node.js, camera stream, gamepad controller), is it possible to stream RViz too? I would like to have the camera, the controller, and RViz all working.
PS: my setup doesn't use the internet, only a local network, and I'm trying to run the whole system on the Raspberry Pi, including navigation and mapping. It is very slow, but I think the Pi 5 will help with this in future. Following your fantastic tutorial on my PC (simulated robot), everything works except the remote control from the web, because I'm afraid that trying that tutorial will destroy the PC's system too, like it did the Raspberry's.
Thank you
Currently I can run ros2 run rosbridge_server rosbridge_websocket successfully in the ros-ui-react/example workspace. Do I also need to git clone https://github.com/RobotWebTools/rosbridge_suite into my dev_ws workspace, or not? I ask because I tried to use a joystick: the launch runs on the real robot, but the robot does not move (I have adjusted joystick.launch.py).
Hey Josh! Thank you so much for the lecture series.
I am trying to run the articubot_one repo on my own system and hardware. I've made all the proper motor connections and have gotten all the launch scripts and nodes to work.
I've run into a weird error, and I will try my best to describe it, so bear with me.
When I try to operate the robot via teleop commands, the robot only acts on a command once every few seconds. It moves, then stops after its set timeout period (which according to my code is 2 seconds), i.e. if it doesn't receive the next command within 2 seconds of the previous one, the motor stops moving. I call this behaviour random because sometimes the motor receives commands 2 seconds apart, sometimes 5 seconds apart, sometimes even 30 seconds apart. It does not move in those intervals.
Can you suggest what else I can do? If anyone has any idea what could be wrong, help would be greatly appreciated.
Thanks in advance
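For anyone hitting similar symptoms: the timeout behaviour described above amounts to a watchdog like the following (a plain-Python sketch of the concept, not the actual driver code). If commands are being sent continuously but only occasionally get through before the watchdog trips, it is worth echoing cmd_vel on the robot itself to see where the commands are being delayed in transit:

```python
class CommandWatchdog:
    """Sketch of a command timeout: stop the motors if no velocity
    command has arrived within `timeout` seconds. Hypothetical
    illustration, not taken from the diffdrive_arduino source."""

    def __init__(self, timeout: float = 2.0):
        self.timeout = timeout
        self.last_cmd_time = None  # no command seen yet

    def on_command(self, now: float) -> None:
        """Record the arrival time of a velocity command."""
        self.last_cmd_time = now

    def should_stop(self, now: float) -> bool:
        """True if the motors should stop (no recent command)."""
        if self.last_cmd_time is None:
            return True
        return (now - self.last_cmd_time) > self.timeout
```

With a steady stream of commands arriving faster than the timeout, should_stop stays False; the stop-start behaviour described above means the arrivals themselves are sporadic.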