Commit 7fa4c0f

Fix apps/calibration location

fspindle committed Oct 3, 2023
1 parent b85bb42 commit 7fa4c0f

Showing 5 changed files with 28 additions and 28 deletions.
14 changes: 7 additions & 7 deletions doc/tutorial/calibration/tutorial-calibration-extrinsic.dox
@@ -31,10 +31,10 @@ The calibration process described in this tutorial consists in 3 steps:
to estimate the \f$^e{\bf M}_c\f$ transformation.

Note that all the material (source code) described in this tutorial is part of ViSP source code
-(in `tutorial/calibration` folder) and could be downloaded using the following command:
+(in `apps/calibration` folder) and could be downloaded using the following command:

\verbatim
-$ svn export https://github.com/lagadic/visp.git/trunk/tutorial/calibration
+$ svn export https://github.com/lagadic/visp.git/trunk/apps/calibration
\endverbatim

\section calib_ext_recommendation Recommendations
@@ -102,7 +102,7 @@ If you want the parameters with distortion, you need to achieve a calibration as

As an example, in ViSP source code you will find a dataset corresponding to data acquired with a real robot:
\verbatim
-$ cd $VISP_WS/visp-build/tutorial/calibration
+$ cd $VISP_WS/visp-build/apps/calibration
$ ls *.{xml,png,yaml}
camera.xml image-3.png image-6.png pose_fPe_1.yaml pose_fPe_4.yaml pose_fPe_7.yaml
image-1.png image-4.png image-7.png pose_fPe_2.yaml pose_fPe_5.yaml pose_fPe_8.yaml
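The dataset listing above follows a simple naming convention: each `image-<i>.png` goes with the end-effector pose `pose_fPe_<i>.yaml` recorded at the same index. A minimal sketch of that pairing (an illustrative helper, not part of ViSP):

```python
# Illustrative helper (not from ViSP): pair each acquired image with the
# end-effector pose file recorded at the same index, following the
# image-<i>.png / pose_fPe_<i>.yaml naming scheme shown above.
def pair_calibration_data(indices):
    return [(f"image-{i}.png", f"pose_fPe_{i}.yaml") for i in indices]

# The dataset above contains 8 image/pose pairs:
pairs = pair_calibration_data(range(1, 9))
print(pairs[0])  # → ('image-1.png', 'pose_fPe_1.yaml')
```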
@@ -286,10 +286,10 @@ vpRealSense2 class usage.
<b>Step 1: Acquire robot poses and images</b>

Connect the Realsense D435 camera to the computer, put the chessboard in the camera field of view, enter in
-`tutorial/calibration` folder and run `visp-acquire-franka-calib-data` binary to acquire the images and the
+`apps/calibration` folder and run `visp-acquire-franka-calib-data` binary to acquire the images and the
corresponding robot end-effector positions:

-$ cd tutorial/calibration
+$ cd apps/calibration
$ ./visp-acquire-franka-calib-data

By default the robot controller IP is `192.168.1.1`. If your Franka has an other IP (let say 10.0.0.2) use
@@ -392,11 +392,11 @@ vpRealSense2 class usage.
<b>Step 1: Acquire robot poses and images</b>

Connect the Realsense camera to the computer, put the chessboard in the camera field of view, enter in
-`tutorial/calibration` folder and run `visp-acquire-universal-robots-calib-data` binary to acquire the images and
+`apps/calibration` folder and run `visp-acquire-universal-robots-calib-data` binary to acquire the images and
the corresponding robot end-effector positions:

\verbatim
-$ cd tutorial/calibration
+$ cd apps/calibration
$ ./visp-acquire-universal-robots-calib-data
\endverbatim

12 changes: 6 additions & 6 deletions doc/tutorial/visual-servo/tutorial-franka-ibvs.dox
@@ -16,7 +16,7 @@ An example of image-based visual servoing using Panda robot equipped with a Real

- Attach your Realsense camera to the robot end-effector
- Put an Apriltag in the camera field of view
-- If not already done, follow \ref tutorial-calibration-extrinsic to estimate \f$^e{\bf M}_c\f$ the homogeneous transformation between robot end-effector and camera frame. We suppose here that the file is located in `tutorial/calibration/eMc.yaml`.
+- If not already done, follow \ref tutorial-calibration-extrinsic to estimate \f$^e{\bf M}_c\f$ the homogeneous transformation between robot end-effector and camera frame. We suppose here that the file is located in `apps/calibration/eMc.yaml`.

Now enter in `example/servo-franka folder` and run `servoFrankaIBVS` binary using `--eMc` to locate the file containing the \f$^e{\bf M}_c\f$ transformation. Other options are available. Using `--help` show them:

@@ -26,7 +26,7 @@ Now enter in `example/servo-franka folder` and run `servoFrankaIBVS` binary usin

Run the binary activating the plot and using a constant gain:

-$ ./servoFrankaIBVS --eMc ../../tutorial/calibration/eMc.yaml --plot
+$ ./servoFrankaIBVS --eMc ../../apps/calibration/eMc.yaml --plot

Use the left mouse click to enable the robot controller, and the right click to quit the binary.

@@ -40,15 +40,15 @@ At this point the behaviour that you should observe is the following:

You can also activate an adaptive gain that will make the convergence faster:

-$ ./servoFrankaIBVS --eMc ../../tutorial/calibration/eMc.yaml --plot --adaptive_gain
+$ ./servoFrankaIBVS --eMc ../../apps/calibration/eMc.yaml --plot --adaptive_gain

You can also start the robot with a zero velocity at the beginning introducing task sequencing option:

-$ ./servoFrankaIBVS --eMc ../../tutorial/calibration/eMc.yaml --plot --task_sequencing
+$ ./servoFrankaIBVS --eMc ../../apps/calibration/eMc.yaml --plot --task_sequencing

And finally you can activate the adaptive gain and task sequencing:

-$ ./servoFrankaIBVS --eMc ../../tutorial/calibration/eMc.yaml --plot --adaptive_gain --task_sequencing
+$ ./servoFrankaIBVS --eMc ../../apps/calibration/eMc.yaml --plot --adaptive_gain --task_sequencing

To learn more about adaptive gain and task sequencing see \ref tutorial-boost-vs.
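The adaptive gain mentioned above decreases the control gain as the error grows, which speeds up convergence near the goal while keeping large motions smooth. A minimal sketch of such an exponential gain profile, in the style of ViSP's vpAdaptiveGain (the parameter values below are illustrative assumptions, not ViSP's defaults):

```python
import math

# Sketch of an adaptive-gain profile for visual servoing, in the style of
# ViSP's vpAdaptiveGain: the gain is lambda_0 at zero error and tends to
# lambda_inf for large error; lambda_0p controls the slope at zero.
# The parameter values here are illustrative assumptions.
def adaptive_gain(x, lambda_0=4.0, lambda_inf=0.4, lambda_0p=30.0):
    return (lambda_0 - lambda_inf) * math.exp(-lambda_0p * x / (lambda_0 - lambda_inf)) + lambda_inf

print(round(adaptive_gain(0.0), 2))  # → 4.0  (full gain at zero error)
```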

@@ -61,6 +61,6 @@ from a real Franka robot, like in the next video, we recommend to make a tour on
<p align="center"><iframe width="560" height="315" src="https://www.youtube.com/embed/02Bx093Fuak" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>
\endhtmlonly

-You can also follow \ref tutorial-ibvs that will give some hints on image-based visual servoing in simulation with a free flying camera.
+You can also follow \ref tutorial-ibvs that will give some hints on image-based visual servoing in simulation with a free flying camera.

*/
10 changes: 5 additions & 5 deletions doc/tutorial/visual-servo/tutorial-franka-pbvs.dox
@@ -258,7 +258,7 @@ An example of position-based visual servoing using Panda robot equipped with a R

- Attach your Realsense camera to the robot end-effector
- Put an Apriltag in the camera field of view
-- If not already done, follow \ref tutorial-calibration-extrinsic to estimate \f$^e{\bf M}_c\f$ the homogeneous transformation between robot end-effector and camera frame. We suppose here that the file is located in `tutorial/calibration/eMc.yaml`.
+- If not already done, follow \ref tutorial-calibration-extrinsic to estimate \f$^e{\bf M}_c\f$ the homogeneous transformation between robot end-effector and camera frame. We suppose here that the file is located in `apps/calibration/eMc.yaml`.

Now enter in `example/servo-franka folder` and run `servoFrankaPBVS` binary using `--eMc` to locate the file containing the \f$^e{\bf M}_c\f$ transformation. Other options are available. Using `--help` show them:

@@ -268,7 +268,7 @@ Now enter in `example/servo-franka folder` and run `servoFrankaPBVS` binary usin

Run the binary activating the plot and using a constant gain:

-$ ./servoFrankaPBVS --eMc ../../tutorial/calibration/eMc.yaml --plot
+$ ./servoFrankaPBVS --eMc ../../apps/calibration/eMc.yaml --plot

\note If you encounter the following error message:
\verbatim
@@ -287,15 +287,15 @@ Now you should see new window that shows the image from the camera like in the n

You can also activate an adaptive gain that will make the convergence faster:

-$ ./servoFrankaPBVS --eMc ../../tutorial/calibration/eMc.yaml --plot --adaptive_gain
+$ ./servoFrankaPBVS --eMc ../../apps/calibration/eMc.yaml --plot --adaptive_gain

You can also start the robot with a zero velocity at the beginning introducing task sequencing option:

-$ ./servoFrankaPBVS --eMc ../../tutorial/calibration/eMc.yaml --plot --task_sequencing
+$ ./servoFrankaPBVS --eMc ../../apps/calibration/eMc.yaml --plot --task_sequencing

And finally you can activate the adaptive gain and task sequencing:

-$ ./servoFrankaPBVS --eMc ../../tutorial/calibration/eMc.yaml --plot --adaptive_gain --task_sequencing
+$ ./servoFrankaPBVS --eMc ../../apps/calibration/eMc.yaml --plot --adaptive_gain --task_sequencing

To learn more about adaptive gain and task sequencing see \ref tutorial-boost-vs.

10 changes: 5 additions & 5 deletions doc/tutorial/visual-servo/tutorial-universal-robot-ibvs.dox
@@ -128,7 +128,7 @@ An example of image-based visual servoing using a robot from Universal Robots eq

- Attach your Realsense camera to the robot end-effector. To this end, we provide a CAD model of a support that could be 3D printed. The FreeCAD model is available [here](https://github.com/lagadic/visp/tree/master/example/servo-universal-robots).
- Put an Apriltag in the camera field of view
-- If not already done, follow \ref tutorial-calibration-extrinsic to estimate \f$^e{\bf M}_c\f$, the homogeneous transformation between robot end-effector and camera frame. We suppose here that the file is located in `tutorial/calibration/ur_eMc.yaml`.
+- If not already done, follow \ref tutorial-calibration-extrinsic to estimate \f$^e{\bf M}_c\f$, the homogeneous transformation between robot end-effector and camera frame. We suppose here that the file is located in `apps/calibration/ur_eMc.yaml`.

Now enter in `example/servo-universal-robots folder` and run `servoUniversalRobotsIBVS` binary using `--eMc` to locate the file containing the \f$^e{\bf M}_c\f$ transformation. Other options are available. Using `--help` show them:

@@ -144,7 +144,7 @@ $ ./servoUniversalRobotsIBVS --help
Run the binary activating the plot and using a constant gain:

\verbatim
-$ ./servoUniversalRobotsIBVS --eMc ../../tutorial/calibration/ur_eMc.yaml --plot
+$ ./servoUniversalRobotsIBVS --eMc ../../apps/calibration/ur_eMc.yaml --plot
\endverbatim

Use the left mouse click to enable the robot controller, and the right click to quit the binary.
@@ -160,19 +160,19 @@ At this point the behaviour that you should observe is the following:
You can also activate an adaptive gain that will make the convergence faster:

\verbatim
-$ ./servoUniversalRobotsIBVS --eMc ../../tutorial/calibration/ur_eMc.yaml --plot --adaptive_gain
+$ ./servoUniversalRobotsIBVS --eMc ../../apps/calibration/ur_eMc.yaml --plot --adaptive_gain
\endverbatim

You can also start the robot with a zero velocity at the beginning introducing task sequencing option:

\verbatim
-$ ./servoUniversalRobotsIBVS --eMc ../../tutorial/calibration/ur_eMc.yaml --plot --task_sequencing
+$ ./servoUniversalRobotsIBVS --eMc ../../apps/calibration/ur_eMc.yaml --plot --task_sequencing
\endverbatim

And finally you can activate the adaptive gain and task sequencing:

\verbatim
-$ ./servoUniversalRobotsIBVS --eMc ../../tutorial/calibration/ur_eMc.yaml --plot --adaptive_gain --task_sequencing
+$ ./servoUniversalRobotsIBVS --eMc ../../apps/calibration/ur_eMc.yaml --plot --adaptive_gain --task_sequencing
\endverbatim

\section ur_ibvs_next Next tutorial
10 changes: 5 additions & 5 deletions doc/tutorial/visual-servo/tutorial-universal-robot-pbvs.dox
@@ -11,7 +11,7 @@ An example of position-based visual servoing using a robot from Universal Robots

- Attach your Realsense camera to the robot end-effector. To this end, we provide a CAD model of a support that could be 3D printed. The FreeCAD model is available [here](https://github.com/lagadic/visp/tree/master/example/servo-universal-robots).
- Put an Apriltag in the camera field of view
-- If not already done, follow \ref tutorial-calibration-extrinsic to estimate \f$^e{\bf M}_c\f$, the homogeneous transformation between robot end-effector and camera frame. We suppose here that the file is located in `tutorial/calibration/eMc.yaml`.
+- If not already done, follow \ref tutorial-calibration-extrinsic to estimate \f$^e{\bf M}_c\f$, the homogeneous transformation between robot end-effector and camera frame. We suppose here that the file is located in `apps/calibration/eMc.yaml`.

Now enter in `example/servo-universal-robots folder` and run `servoUniversalRobotsPBVS` binary using `--eMc` to locate the file containing the \f$^e{\bf M}_c\f$ transformation. Other options are available. Using `--help` show them:

@@ -27,7 +27,7 @@ $ ./servoUniversalRobotsPBVS --help
Run the binary activating the plot and using a constant gain:

\verbatim
-$ ./servoUniversalRobotsPBVS --eMc ../../tutorial/calibration/ur_eMc.yaml --plot
+$ ./servoUniversalRobotsPBVS --eMc ../../apps/calibration/ur_eMc.yaml --plot
\endverbatim

Use the left mouse click to enable the robot controller, and the right click to quit the binary.
@@ -43,19 +43,19 @@ At this point the behaviour that you should observe is the following:
You can also activate an adaptive gain that will make the convergence faster:

\verbatim
-$ ./servoUniversalRobotsPBVS --eMc ../../tutorial/calibration/ur_eMc.yaml --plot --adaptive_gain
+$ ./servoUniversalRobotsPBVS --eMc ../../apps/calibration/ur_eMc.yaml --plot --adaptive_gain
\endverbatim

You can also start the robot with a zero velocity at the beginning introducing task sequencing option:

\verbatim
-$ ./servoUniversalRobotsPBVS --eMc ../../tutorial/calibration/ur_eMc.yaml --plot --task_sequencing
+$ ./servoUniversalRobotsPBVS --eMc ../../apps/calibration/ur_eMc.yaml --plot --task_sequencing
\endverbatim

And finally you can activate the adaptive gain and task sequencing:

\verbatim
-$ ./servoUniversalRobotsPBVS --eMc ../../tutorial/calibration/ur_eMc.yaml --plot --adaptive_gain --task_sequencing
+$ ./servoUniversalRobotsPBVS --eMc ../../apps/calibration/ur_eMc.yaml --plot --adaptive_gain --task_sequencing
\endverbatim

\section ur_pbvs_next Next tutorial
