Hi there!
I am happy to share my first test with AR2, ROS, Kinect, and TensorFlow. I use a pretrained Faster R-CNN (https://github.com/tensorflow/models/tree/master/research/object_detection) and the depth map from the Kinect to find the XYZ coordinates of an apple. It works, but I think a simple OpenCV pipeline should be more efficient than a deep learning vision algorithm. For the next step I will finish the ROS integration (ros_control), try an OpenCV pipeline, and try the DOPE algorithm from Nvidia. It can run in real time with a single RGB cam and returns the 6D pose estimation of an object (https://github.com/NVlabs/Deep_Object_Pose)!!! Thanks for making this project available to the open source community! AR2 is really awesome!!
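The depth-lookup step described above can be sketched roughly like this. This is a minimal illustration, not the commenter's actual code: `bbox_to_xyz` is a hypothetical helper, the intrinsics `FX, FY, CX, CY` are placeholder values (use your own Kinect calibration), and it assumes the depth image is registered to the RGB frame and expressed in metres.

```python
# Back-project the centre of a detection bounding box to camera-frame XYZ
# using the pinhole camera model. All names and constants here are
# illustrative assumptions, not values from the original project.

FX, FY = 525.0, 525.0    # assumed focal lengths in pixels
CX, CY = 319.5, 239.5    # assumed principal point

def bbox_to_xyz(depth, bbox):
    """depth: 2-D array (rows x cols) of depths in metres, aligned with RGB.
    bbox: (xmin, ymin, xmax, ymax) in pixel coordinates from the detector.
    Returns (x, y, z) in the camera frame, in metres."""
    xmin, ymin, xmax, ymax = bbox
    u = int((xmin + xmax) / 2)   # pixel column of the bbox centre
    v = int((ymin + ymax) / 2)   # pixel row of the bbox centre
    z = float(depth[v][u])       # depth at the centre pixel
    x = (u - CX) * z / FX        # pinhole back-projection
    y = (v - CY) * z / FY
    return x, y, z
```

In practice you would sample a small patch around the centre (or the median over the box) instead of one pixel, since Kinect depth maps have holes and noise at object edges.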
Hi, has anyone here successfully made AR2/AR3 work with ROS? If yes, can you share it on GitHub or YouTube?
Please, can you share the code with me, too? I am struggling to connect AR2 to ROS and I am facing multiple errors I don't know how to solve. Your help will be very useful.
My email is sergiofernandezleon22@gmail.com
Thank you for sharing your code. I am interested in this project also.
excellent
Hello Nicolas,
This is a very good project you are building.
I would like to build an AR3 arm and would have liked to ask you some questions directly about it; would it be possible to discuss it in private?
Here is my email:
samuelmahmutovic@yahoo.com
Thanks in advance
Very good 😀 If you make a tutorial for this robot with ROS, I would appreciate it if you could send it to me. Thanks. celesazk@gmail.com
Hi! Thank you for your interest. Concerning the tutorial and the code, I have not finished yet; I need more time to write the tutorial and finish the ros_control integration. I would prefer to release clean code. But if you give me an email address I can send you the code and the first version of the tutorial. Your feedback will be very useful to me.
Hi Chris, thank you for sharing your project!!
Actually it is really simple thanks to you. The demo from MoveIt already has a controller. If you set a random goal and click the Plan and Execute buttons, MoveIt will publish all the joint positions on a topic called "/movegroup/fake_trajectorycontroller".
I just wrote a simple Python subscriber to receive data from the topic and send commands to the Arduino.
In one terminal I launch the demo.launch file from the ar2_moveit_config package.
In another terminal I launch my script:
-load all the global variables you defined in your source file
-open the serial connection
-calibrate all motors with the calRobotAll() function
-move to the home position (the same one you can see in MoveIt)
Then I have a simple loop: I read the joint values from the fake trajectory topic, convert them from radians to degrees, and send them to the Arduino through your MoveNew() function.
It is important to update the joint limits in the URDF file and modify the negangllim and posinglimit global variables so that the real AR2 position matches the MoveIt AR2 position.
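The subscriber loop described above could look roughly like this. This is a hedged sketch, not the commenter's script: the topic name is copied from the comment, `JointState` is an assumed message type, and the serial port, baud rate, and the comma-separated command format are placeholder assumptions (the real script goes through the AR2 MoveNew() function instead). The radian-to-degree conversion is plain Python; the ROS wiring only runs inside a ROS 1 environment.

```python
import math

def radians_to_degrees(positions):
    """Convert a sequence of joint angles from radians to degrees."""
    return [math.degrees(p) for p in positions]

if __name__ == "__main__":
    # ROS 1 wiring: requires rospy and pyserial in a sourced ROS workspace.
    import rospy
    import serial
    from sensor_msgs.msg import JointState

    ser = serial.Serial("/dev/ttyACM0", 115200)  # assumed Arduino port

    def on_joint_states(msg):
        degrees = radians_to_degrees(msg.position)
        # The real script would call the AR2 MoveNew() helper here; this
        # just illustrates sending one comma-separated line over serial.
        ser.write((",".join("%.2f" % d for d in degrees) + "\n").encode())

    rospy.init_node("ar2_moveit_bridge")
    rospy.Subscriber("/movegroup/fake_trajectorycontroller",
                     JointState, on_joint_states)
    rospy.spin()  # process incoming joint states until shutdown
```

Keeping the conversion in a small pure function makes it easy to test without a ROS master running.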
I can make a tutorial and share it with my code!
It’s really easy to run if you have ROS installed.
That's amazing. How are you controlling the robot's step/dir outputs through ROS? You say you haven't finished the ROS control yet?
thanks for sharing.
Impressive