Hi Annin Robotics Community,
I'm excited to share that I've recently developed and released a ROS 2 driver for the AR4 robot arm. It uses the ros2_control framework for robot control, MoveIt 2 for motion planning, and RViz 2 for visualization and user interaction. It's available at https://github.com/ycheng517/ar4_ros_driver and will be actively developed and maintained. Check it out if you'd like to tap into the power of ROS 2 with the AR4; contributions are welcome!
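For anyone who wants a feel for how the ros2_control side is driven, here is a minimal sketch (not taken from the repo) that sends a joint trajectory through the standard FollowJointTrajectory action interface. The controller name `joint_trajectory_controller` and the joint names `joint_1` through `joint_6` are assumptions; check the driver's controller configuration for the actual names. The usual workflow is to plan through MoveIt 2 in RViz rather than commanding the controller directly.

```python
# Minimal sketch (not from the driver repo): command the arm through the
# standard ros2_control FollowJointTrajectory action interface.
# The controller name and joint names below are assumptions; check the
# driver's controller configuration for the real ones.
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from control_msgs.action import FollowJointTrajectory
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint
from builtin_interfaces.msg import Duration


class Ar4TrajectoryDemo(Node):
    def __init__(self):
        super().__init__('ar4_trajectory_demo')
        # Assumed action name; adjust to the controller the driver spawns.
        self._client = ActionClient(
            self, FollowJointTrajectory,
            '/joint_trajectory_controller/follow_joint_trajectory')

    def send_goal(self):
        goal = FollowJointTrajectory.Goal()
        goal.trajectory = JointTrajectory()
        goal.trajectory.joint_names = [f'joint_{i}' for i in range(1, 7)]
        point = JointTrajectoryPoint()
        point.positions = [0.0, -0.3, 0.3, 0.0, 0.5, 0.0]  # radians, example values only
        point.time_from_start = Duration(sec=3)
        goal.trajectory.points = [point]
        self._client.wait_for_server()
        return self._client.send_goal_async(goal)


def main():
    rclpy.init()
    node = Ar4TrajectoryDemo()
    future = node.send_goal()
    rclpy.spin_until_future_complete(node, future)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

Going through the action interface directly like this bypasses MoveIt's collision checking, so it's best kept to small, known-safe moves.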
That looks awesome! I'm not familiar with ROS/ROS 2; would this run on a lightweight client like a Pi?
Also, does your driver control the nano/gripper?
Yes, it should work on the Raspberry Pi; you'll have to install Ubuntu for Raspberry Pi. I'm not supporting the nano/gripper yet, but I'll be working on that in the medium term.
I wanted to mention that Yifei Cheng has added files for the MK2 to this repo. I have not had time to test them yet, but I was hoping someone in the community would be able to try this out on the MK2 and give feedback. I also wanted to thank Yifei for taking the time to put this together and contributing to the project. This is a great development. Thank you.
So I've had a chance to get ROS 2 installed and try out the ar4_ros_driver repo on my AR4-MK2.
Good news! The initial calibration went without a hitch, and some route planning/execution tests also worked as expected.
If there's anything specific you'd like me to test further, please let me know; otherwise I think we can say that the ROS driver works for the AR4-MK2!
Here's a brief video showing it in action.
Update: AR4 servo gripper control has been added!
@ycheng517 I’ll be testing this out this weekend! Get ready for some pull requests with OpenCV/Camera functionality in the coming weeks too 🙂
Great work! Thanks a lot. My AR4 is now ROS 2 controlled. Calibration works, and path planning and execution using MoveIt work well. I'm stuck on the hand-eye validation. The Intel camera is detected and I see the video in MoveIt. Hand-eye calibration is OK: I took 6 samples and the ArUco tag is detected. When I try to run the hand-eye validation procedure, the ArUco tag is detected but planning fails with the following message: "unable to sample any valid states for goal tree". Any idea why?
@ycheng517 You rock! I didn't catch that part in the instructions, sorry about that. The results are much better now. There are still a few things I'm trying to understand:
1) In validation mode, the robot generally finds the marker successfully and moves accordingly, but sometimes it targets the center of the marker and sometimes it targets the borders (and not always the same side).
2) Results are not as good when the marker is placed at a lower position. For example, calibration was performed at Z = approx. 25 cm; during validation, the marker is placed at Z = approx. 10 cm, and the robot finds it but goes too low and hits the marker.
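If it helps narrow down height offsets like that, here is a small sketch (not part of the driver) that prints the camera-to-base transform published after hand-eye calibration, so you can check whether the calibrated extrinsics stay sensible as the arm moves. The frame names `base_link` and `camera_color_optical_frame` are assumptions; substitute whatever frames your URDF and camera driver actually publish.

```python
# Debugging sketch (assumed frame names): print the transform from the
# camera optical frame to the robot base frame, as published after
# hand-eye calibration. A bad translation here could explain the arm
# aiming too low when the marker sits below the calibration height.
import rclpy
from rclpy.node import Node
from rclpy.time import Time
from tf2_ros import Buffer, TransformListener


class HandEyeCheck(Node):
    def __init__(self):
        super().__init__('hand_eye_check')
        self.buffer = Buffer()
        self.listener = TransformListener(self.buffer, self)
        self.timer = self.create_timer(1.0, self.report)

    def report(self):
        try:
            # Assumed frame names; adjust to your URDF / camera driver.
            tf = self.buffer.lookup_transform(
                'base_link', 'camera_color_optical_frame', Time())
        except Exception as exc:  # tf2 raises several lookup exceptions
            self.get_logger().warn(f'TF not available yet: {exc}')
            return
        t = tf.transform.translation
        self.get_logger().info(
            f'camera in base frame: x={t.x:.3f} y={t.y:.3f} z={t.z:.3f}')


def main():
    rclpy.init()
    rclpy.spin(HandEyeCheck())


if __name__ == '__main__':
    main()
```

If the reported camera position drifts noticeably between arm poses, the hand-eye samples probably didn't cover enough orientation variation.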
For posterity: the calibration routine has been updated, and the validation routine should now work regardless of the camera angle.
When using the driver I often have issues with the path tolerance being violated, which halts the robot's movement and causes problems; this happens especially for joints 3 and 4. Do you run into this problem? When the controllers.yaml file in ar_hardware_interface is updated, it seems the new values are not used, and it continues to use a tolerance of 0.200.
@evan-wassmann Hey, I'm not sure why that's not being updated, but another way of tackling the problem is to reduce the max_velocity of those joints in ar_moveit_config/config/joint_limits.yaml.
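One way to confirm which tolerance values the running controller actually loaded is to query its parameters at runtime; ros2_control's joint_trajectory_controller exposes the path tolerance as `constraints.<joint>.trajectory`. Here is a small sketch, assuming the controller node is named `joint_trajectory_controller` and the joints are `joint_3` and `joint_4`; adjust both to match your setup. The same check can be done from the command line with `ros2 param get`.

```python
# Sketch (assumed node/joint names): query the running controller for the
# path-tolerance parameters it actually loaded, to verify that edits to
# controllers.yaml are being picked up. Roughly equivalent to:
#   ros2 param get /joint_trajectory_controller constraints.joint_3.trajectory
import rclpy
from rclpy.node import Node
from rcl_interfaces.srv import GetParameters


def main():
    rclpy.init()
    node = Node('tolerance_check')
    # Assumed controller node name; list nodes with `ros2 node list` to confirm.
    client = node.create_client(
        GetParameters, '/joint_trajectory_controller/get_parameters')
    if not client.wait_for_service(timeout_sec=5.0):
        node.get_logger().error('controller parameter service not available')
        rclpy.shutdown()
        return
    request = GetParameters.Request()
    request.names = [
        'constraints.joint_3.trajectory',
        'constraints.joint_4.trajectory',
    ]
    future = client.call_async(request)
    rclpy.spin_until_future_complete(node, future)
    for name, value in zip(request.names, future.result().values):
        node.get_logger().info(f'{name}: {value.double_value}')
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

If the values printed here still show 0.200 after editing controllers.yaml, the file being edited is probably not the one the launch files actually load.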
I have tested this with the MK3; it looks very promising and works. We are still working with @ycheng517 on calibration details, as my joints are a little off.