
Cartesian control #178

Open
vopsi99 opened this issue May 23, 2024 · 9 comments
Assignees
Labels
enhancement New feature or request

Comments

vopsi99 commented May 23, 2024

Hello, I tried using this repository, but my knowledge of ROS is rather limited. Does this stack have a Cartesian control implementation?

It may be a basic question, but I have been stuck for quite some time and could really use some help.

mhubii (Member) commented May 23, 2024

No worries @vopsi99, and thank you for raising this issue. In principle, yes; in practice, users currently have to put in some effort to get this right. I can point you to some examples and try to improve the demos.

What are you trying to achieve:

  • Move to an absolute pose?
  • Specify an end-effector velocity?

vopsi99 (Author) commented May 23, 2024

I managed to move the end-effector incrementally, but it is really unstable, so I need to move to an absolute pose at a certain velocity. I assume this kind of motion would be best for later integrating a stereo camera to position the robot relative to some markers in the workspace.

mhubii (Member) commented May 23, 2024

vopsi99 (Author) commented May 23, 2024

So I only need to send my desired pose to this demo from another node? This is the demo I modified to send incremental values for x, y, z position and orientation, but the Y orientation, for example, always went crazy even with really small movement increments.

mhubii (Member) commented May 23, 2024

Okay, I'll have to double-check this demo, as it was community-contributed. But note that this demo sends an absolute pose.

You will have to make sure you:

  • Get the initial pose
  • Send a small (like really small) increment on top of the initial pose

@mhubii mhubii added the enhancement New feature or request label May 23, 2024
@mhubii mhubii self-assigned this May 23, 2024
mhubii (Member) commented May 23, 2024

I think this demo needs an update; it should perhaps let users command a velocity rather than absolute poses.

vopsi99 (Author) commented May 23, 2024

Yes, I realised that a really small increment (0.001) makes the movement smoother. Positioning on x, y, and z seems fine, but trying to change the orientation breaks everything.

After sending the increment on Y for a while, the robot tends to jump to a position where I get the "inverse kinematics failed" error.

I think velocity control would be really useful for everyone.

mhubii (Member) commented May 23, 2024

Thank you for the feedback. Let's add support for this, then.

BoWangFromMars (Contributor) commented
Hello, I am back. I am the contributor of the "pose control" demo. The initial pose in this demo is close to singular, so it only supports movement along the z axis; if you try to change the orientation, the inverse kinematics is computed incorrectly.
The solution is to set a non-singular initial pose. For example, I set the initial joint position with
ros2 topic pub --once /forward_position_controller/commands std_msgs/msg/Float64MultiArray "{data: [2.175727, 0.928842, -0.975961, -1.3158, 0.3883, 1.2622, 0.01274]}"
and used the following code to change the orientation. Please take it as reference code (interpolation of quaternions, rotating the end effector around its x-axis, y-axis, or z-axis):

#include "rclcpp/rclcpp.hpp"
#include "cmath"
#include <iostream>
#include <string>
#include "geometry_msgs/msg/pose.hpp"

#include "tf2/transform_datatypes.h"
#include "tf2_geometry_msgs/tf2_geometry_msgs.hpp"
#include "tf2/convert.h"
#include "tf2/LinearMath/Quaternion.h"
#include "tf2/LinearMath/Matrix3x3.h"

using std::placeholders::_1;

class CartesianPosePublisherNode:public rclcpp::Node
{
  private:
    rclcpp::Publisher<geometry_msgs::msg::Pose>::SharedPtr cartesian_pose_publisher_;
    rclcpp::Subscription<geometry_msgs::msg::Pose>::SharedPtr cartesian_pose_subscriber_;

    geometry_msgs::msg::Pose initial_cartesian_pose_; // robot starts from this pose
    bool is_init_; 

    double amplitude_; // rad
    double frequency_; // Hz
    double sampling_time_; // sampling time for sending position command
    double phase_; // initial phase
    geometry_msgs::msg::Pose previous_cmd_pose_;

    geometry_msgs::msg::Pose target_pose; // target pose (orientation as a quaternion)
    geometry_msgs::msg::Pose cartesian_pose_command; // Cartesian pose command published to the controller topic
    double t = 0.0;
        
  private:
    /**
     * @function: callback function for Cartesian Pose Subscriber
     * @param msg Cartesian Pose of the robot
    */
    void topic_callback(const geometry_msgs::msg::Pose& msg)
    {            
      if(!is_init_) 
      {
        initial_cartesian_pose_ = msg;
        cartesian_pose_command = initial_cartesian_pose_;
        target_pose = initial_cartesian_pose_;

        rotating(target_pose, initial_cartesian_pose_, 'y', 10.0);

        std::cout << "target pose calculated by function rotating: " << std::endl;
        std::cout << "target_pose.orientation.x " << target_pose.orientation.x << std::endl;
        std::cout << "target_pose.orientation.y " << target_pose.orientation.y << std::endl;
        std::cout << "target_pose.orientation.z " << target_pose.orientation.z << std::endl;
        std::cout << "target_pose.orientation.w " << target_pose.orientation.w << std::endl;

        /* Set the "initial joints" of the KUKA iiwa robot to "2.175727, 0.928842, -0.975961, -1.3158, 0.3883, 1.2622, 0.01274" (in rad);
        the quaternions below correspond to target poses of the end effector, expressed in the "Base" frame of the robot.
        The target position of the end effector is the same as the initial position, which can be computed from the "initial joints".
        */

        // rotate the initial end-effector pose around its x-axis by 5 degrees
        /*
        target_pose.orientation.w = 0.1350;
        target_pose.orientation.x = -0.5842;
        target_pose.orientation.y = 0.7808;
        target_pose.orientation.z = -0.1756;
        */

        // rotate the initial end-effector pose around its y-axis by 5 degrees
        /*
        target_pose.orientation.w = 0.0749;
        target_pose.orientation.x = -0.5828;
        target_pose.orientation.y = 0.7917;
        target_pose.orientation.z = -0.1670;
        */

        // rotate the initial end-effector pose around its z-axis by 5 degrees
        /*
        target_pose.orientation.w = 0.1154;
        target_pose.orientation.x = -0.5546;
        target_pose.orientation.y = 0.8127;
        target_pose.orientation.z = -0.1365;
        */

        /*
        std::cout << "initial_pose:" << std::endl;
        std::cout << "position x: " << initial_cartesian_pose_.position.x;
        std::cout << " y: " << initial_cartesian_pose_.position.y;
        std::cout << " z: " << initial_cartesian_pose_.position.z;
        std::cout << std::endl;
        std::cout << "quaternion w: " << initial_cartesian_pose_.orientation.w;
        std::cout << " x: " << initial_cartesian_pose_.orientation.x;
        std::cout << " y: " << initial_cartesian_pose_.orientation.y;
        std::cout << " z: " << initial_cartesian_pose_.orientation.z << std::endl;
        */

        is_init_ = true;
      }
      else
      {        
        if(t < 1)
        {
          /* Interpolation of quaternions, which follows the equation "q = [(1-t)*q_1 + t*q_2] / |(1-t)*q_1 + t*q_2|",
          where q_1 is the initial quaternion (initial pose) and q_2 is the target quaternion (target pose).
          */
          cartesian_pose_command.orientation.w = (1-t)*initial_cartesian_pose_.orientation.w + t*target_pose.orientation.w;
          cartesian_pose_command.orientation.x = (1-t)*initial_cartesian_pose_.orientation.x + t*target_pose.orientation.x;
          cartesian_pose_command.orientation.y = (1-t)*initial_cartesian_pose_.orientation.y + t*target_pose.orientation.y;
          cartesian_pose_command.orientation.z = (1-t)*initial_cartesian_pose_.orientation.z + t*target_pose.orientation.z;
          double norm =  std::pow(cartesian_pose_command.orientation.w, 2) + std::pow(cartesian_pose_command.orientation.x, 2) + std::pow(cartesian_pose_command.orientation.y, 2) + std::pow(cartesian_pose_command.orientation.z, 2);
          norm = sqrt(norm);
          cartesian_pose_command.orientation.w = cartesian_pose_command.orientation.w / norm;
          cartesian_pose_command.orientation.x = cartesian_pose_command.orientation.x / norm;
          cartesian_pose_command.orientation.y = cartesian_pose_command.orientation.y / norm;
          cartesian_pose_command.orientation.z = cartesian_pose_command.orientation.z / norm;
          t = t + 0.001*2; // here, the time step is 0.001*2, so the total steps are 1.0/(0.001*2)=500. The control frequency is 100 Hz, so execution time is 5 seconds
          //std::cout << "Cartesian Pose Command: " << std::endl;
          //std::cout << cartesian_pose_command.orientation.w << " " << cartesian_pose_command.orientation.x << " " << cartesian_pose_command.orientation.y << " " << cartesian_pose_command.orientation.z << std::endl;
          cartesian_pose_publisher_->publish(cartesian_pose_command);
        }

        //cartesian_pose_publisher_->publish(cartesian_pose_command);

        previous_cmd_pose_ = cartesian_pose_command;
      }

      return;
    }

  public:
    /**
     * @function: rotate the current end-effector pose to the desired end-effector pose
     * @param target_pose output parameter, receives the target pose
     * @param current_cartesian_pose current end-effector pose, whose orientation is expressed as a quaternion
     * @param axis axis of the end-effector frame to rotate around: 'x', 'y' or 'z'
     * @param rotation_angle_deg rotation angle, whose unit is [degree], not [rad]
    */
    void rotating(geometry_msgs::msg::Pose& target_pose, geometry_msgs::msg::Pose current_cartesian_pose, char axis, double rotation_angle_deg)
    {
      // extract the current quaternion from the variable "current_cartesian_pose"
      tf2::Quaternion q_current(
        current_cartesian_pose.orientation.x,
        current_cartesian_pose.orientation.y,
        current_cartesian_pose.orientation.z,
        current_cartesian_pose.orientation.w
      );

      // convert 'degree' to 'rad'
      double rotation_angle_rad = rotation_angle_deg * M_PI / 180.0;

      // create the incremental rotation quaternion about the chosen axis
      tf2::Quaternion q_rotation;
      if(axis == 'x')
      {
        q_rotation.setRPY(rotation_angle_rad, 0, 0);
      }
      else if(axis == 'y')
      {
        q_rotation.setRPY(0, rotation_angle_rad, 0);
      }
      else if(axis == 'z')
      {
        q_rotation.setRPY(0, 0, rotation_angle_rad);
      }
      else
      {
        std::cout << "wrong input, please input 'x', 'y' or 'z' for specifying the axis" << std::endl;
      }

      // right-multiplying by q_rotation rotates about the end effector's own (body) axis
      tf2::Quaternion q_target = q_current * q_rotation;

      target_pose.orientation.x = q_target.x();
      target_pose.orientation.y = q_target.y();
      target_pose.orientation.z = q_target.z();
      target_pose.orientation.w = q_target.w();

      return;
    }

  public:
    CartesianPosePublisherNode() : Node("cartesian_pose_publisher_node")
    {
      is_init_ = false;
      amplitude_ = 0.05;
      frequency_ = 0.5;
      sampling_time_ = 0.01;
      phase_ = 0.0;

      // the following ROS topics are for the Gazebo simulation
      cartesian_pose_publisher_ = this->create_publisher<geometry_msgs::msg::Pose>(
          "/lbr/command/cartesian_pose", 10);
      cartesian_pose_subscriber_ = this->create_subscription<geometry_msgs::msg::Pose>(
          "/lbr/state/cartesian_pose", 10,
          std::bind(&CartesianPosePublisherNode::topic_callback, this, _1));
    }
};

int main(int argc, char** argv)
{
  rclcpp::init(argc, argv);

  rclcpp::spin(std::make_shared<CartesianPosePublisherNode>());

  rclcpp::shutdown();
  return 0;
}

Another suggestion: when you are not sure whether a pose is singular, run it in Gazebo or another simulation environment first instead of on the real robot, and print the inverse kinematics solution in the "pose control" node. Compare it with the current joint position; if the two values differ a lot, the solution is wrong. Hope this is helpful, and please contact me if you have any other problems. Thank you.
