diff --git a/docs/genindex.html b/docs/genindex.html index e95647c..c249984 100644 --- a/docs/genindex.html +++ b/docs/genindex.html @@ -76,704 +76,8 @@

diff --git a/docs/index.html b/docs/index.html index 65bc38e..1f973da 100644 --- a/docs/index.html +++ b/docs/index.html @@ -98,31 +98,25 @@

Welcome to the Robotics Course Repo
2. Tutorials
diff --git a/docs/notebooks/1a-configurations.html b/docs/notebooks/1a-configurations.html new file mode 100644 index 0000000..208970a --- /dev/null +++ b/docs/notebooks/1a-configurations.html @@ -0,0 +1,485 @@ + + + + + + + 2.1.1. Configurations — Robotics Course documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

2.1.1. Configurations

+

A configuration is essentially a set of (coordinate) frames, where each frame can represent a shape, joint, inertia, etc. This tutorial introduces the basics of creating & loading configurations, the joint and frame state, computing features, and handling the view window.

+
+
[1]:
+
+
+
from robotic import ry
+import numpy as np
+import time
+print(ry.compiled())
+
+
+
+
+
+
+
+
+compile time: Sep 23 2023 12:43:33
+
+
+
+

2.1.1.1. Adding frames to a configuration

+

The starting point is to create a Configuration:

+
+
[2]:
+
+
+
C = ry.Config()
+C.view()
+
+
+
+
+
[2]:
+
+
+
+
+0
+
+
+

This shows an empty configuration. Tip: Make the view window appear “Always On Top” (right click on the window bar)

+

A configuration is essentially a tree (or forest) of frames. You usually add models from files, but let's do it manually here.

+
+
[3]:
+
+
+
C.clear()
+f = C.addFrame(name='first')
+f.setShape(type=ry.ST.marker, size=[.4])
+f.setPosition([0.,0.,.5])
+f.setQuaternion([1., .3, .0, .0]) #is normalized internally
+print('frame name:', f.name, 'pos:', f.getPosition(), 'quat:', f.getQuaternion())
+C.view()
+
+
+
+
+
+
+
+
+frame name: first pos: [0.  0.  0.5] quat: [0.95782629 0.28734789 0.         0.        ]
+
+
+
+
[3]:
+
+
+
+
+0
+
+
+

Let’s add a second frame, but with first as parent and with a hinge joint!

+
+
[4]:
+
+
+
f = C.addFrame(name='second', parent='first')
+f.setJoint(ry.JT.hingeX)
+f.setShape(type=ry.ST.marker, size=[.4])
+f.setColor([1,0,0])
+print('frame name:', f.name, 'pos:', f.getPosition(), 'quat:', f.getQuaternion())
+C.view()
+
+
+
+
+
+
+
+
+frame name: second pos: [0.  0.  0.5] quat: [-0.95782629 -0.28734789 -0.         -0.        ]
+
+
+
+
[4]:
+
+
+
+
+0
+
+
+

Since we now have a configuration with a joint, we can articulate it:

+
+
[5]:
+
+
+
q = C.getJointState()
+q[0] = q[0] + .1
+C.setJointState(q)
+print('joint state:', q)
+C.view()
+
+
+
+
+
+
+
+
+joint state: [0.1]
+
+
+
+
[5]:
+
+
+
+
+0
+
+
+

Here are more example shapes to add, followed by a little animation of the hinge joint:

+
+
[ ]:
+
+
+
C.addFrame('ball', 'second') .setShape(ry.ST.sphere, [.1]) .setColor([1.,.5,.0]) .setRelativePosition([-.3,.0,.2])
+C.addFrame('box', 'second') .setShape(ry.ST.ssBox, [.3,.2,.1,.02]) .setColor([.5,1.,.0]) .setRelativePosition([.0,.0,.2])
+C.addFrame('capsule', 'second') .setShape(ry.ST.capsule, [.3, .05]) .setColor([.0,1.,.5]) .setRelativePosition([.3,.0,.2])
+for t in range(100):
+    C.setJointState([np.cos(.1*t)])
+    C.view()
+    time.sleep(.1)
+
+
+
+
+
+

2.1.1.2. Loading existing configurations

+
+
[ ]:
+
+
+
C.clear()
+C.addFile(ry.raiPath('panda/panda.g'))
+C.view()
+
+
+
+

Let's add a second panda, but prefix all its frame names and move it to the side:

+
+
[ ]:
+
+
+
C.addFile(ry.raiPath('panda/panda.g'), 'r_')
+base_r = C.getFrame('r_panda_base')
+base_r.setPosition([.0, .5, .0])
+C.view()
+
+
+
+

We can get the joint state of the full configuration:

+
+
[ ]:
+
+
+
print(C.getJointState())
+print('joints:', C.getJointNames())
+print('frames:', C.getFrameNames())
+
+
+
+

Let’s animate:

+
+
[ ]:
+
+
+
q0 = C.getJointState()
+for t in range(20):
+    q = q0 + .1*np.random.randn(q0.shape[0])
+    C.setJointState(q)
+    C.view()
+    time.sleep(.2)
+
+
+
+
+
+

2.1.1.3. Features: computing geometric properties

+

For every frame we can query its pose:

+
+
[ ]:
+
+
+
f = C.getFrame('r_gripper')
+print('gripper pos:', f.getPosition())
+print('gripper quat:', f.getQuaternion())
+print('gripper rot:', f.getRotationMatrix())
+
+
+
+

The above provides basic forward kinematics: after setJointState you can query the pose of any configuration frame. However, there is a more general way to query features, i.e. properties of the configuration in a differentiable manner. You might not use this often, but it is important to understand, as these differentiable features are the foundation of how optimization problems are formulated, which you will need a lot.

+

Here are some example features to evaluate:

+
+
[ ]:
+
+
+
[y,J] = C.eval(ry.FS.position, ['gripper'])
+print('position of gripper:', y, '\nJacobian:', J)
+
+
+
+
+
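To make the "differentiable" aspect concrete, here is a small sketch (not part of the original tutorial) that compares the returned Jacobian against finite differences of the position feature; it assumes the Jacobian columns follow the same DOF ordering as the joint state:

q = C.getJointState()
y, J = C.eval(ry.FS.position, ['gripper'])
eps = 1e-6
J_fd = np.zeros_like(J)
for i in range(q.shape[0]):
    q_eps = q.copy()
    q_eps[i] += eps                      # perturb a single DOF
    C.setJointState(q_eps)
    y_eps, _ = C.eval(ry.FS.position, ['gripper'])
    J_fd[:, i] = (y_eps - y) / eps       # finite-difference column
C.setJointState(q)                       # restore the original state
print('max |J - J_fd|:', np.abs(J - J_fd).max())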
[ ]:
+
+
+
# negative(!) distance between two convex shapes (or origin of marker)
+C.eval(ry.FS.negDistance, ['panda_coll7', 'r_panda_coll7'])
+
+
+
+
+
[ ]:
+
+
+
# the x-axis of the given frame in world coordinates
+C.eval(ry.FS.vectorX, ['gripper'])
+
+
+
+
+
[ ]:
+
+
+

+
+
+
+
+
+

2.1.1.4. Joint and Frame State

+

A configuration is a tree of n frames. Every frame has a pose (position & quaternion), which is represented as a 7D vector (x,y,z, qw,qx,qy,qz). The frame state is the \(n\times 7\) matrix, where the i-th row is the pose of the i-th frame.

+

A configuration also defines joints, which means that the relative transformation from a parent to a child frame is parameterized by degrees of freedom (DOFs). If the configuration has in total n DOFs, the joint state is an n-dimensional vector.

+

Setting the joint state implies computing all relative transformations, and then forward chaining all transformations to compute all frame poses. So setting the joint state also sets the frame state.

+
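As a quick illustration (a sketch added here, not in the original tutorial), you can verify that changing the joint state indeed changes the stored frame poses:

q_tmp = C.getJointState()
X_before = C.getFrameState()
C.setJointState(q_tmp + .1)   # perturb all DOFs a bit
X_after = C.getFrameState()
print('frame poses changed:', not np.allclose(X_before, X_after))
C.setJointState(q_tmp)        # restore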

Setting the frame state allows you to set frame poses that are inconsistent/impossible w.r.t. the joints! Setting the frame state implies computing all relative transformations from the frame poses, and then assigning the joint state to the projection onto the actual DOFs.

+
+
[ ]:
+
+
+
C.setJointState(q0)
+C.view()
+
+
+
+

The frame state is a \(n\times 7\) matrix, which contains for all of \(n\) frames the 7D pose. A pose is stored as [p_x, p_y, p_z, q_w, q_x, q_y, q_z], with position p and quaternion q.

+
+
[ ]:
+
+
+
X0 = C.getFrameState()
+print('frame state: ', X0)
+
+
+
+

Let’s do a very questionable thing: adding .1 to all numbers in the frame matrix!

+
+
[ ]:
+
+
+
X = X0 + .1
+C.setFrameState(X)
+C.view()
+
+
+
+

That totally broke the original design of the robot! Setting global frame states overwrites the relative transformations between frames.

+

(Also, the rows of X have non-normalized quaternions! These are normalized when setting the frame state.)

+
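A related note (carried over from an earlier version of this tutorial): reading out the projected joint state and setting it again gives you back a configuration that is at least consistent with the joints, even though the other relative transformations remain overwritten:

C.setJointState(C.getJointState())  # project onto the DOFs, then recompute all frame poses
C.view()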

Let’s reset:

+
+
[ ]:
+
+
+
C.setFrameState(X0)
+C.view()
+
+
+
+
+
+

2.1.1.5. View interaction and releasing objects

+

You can close and re-open the view window

+
+
[ ]:
+
+
+
C.view_close()
+
+
+
+
+
[ ]:
+
+
+
# things are still there
+C.view(pause=False, message='this is a message')
+
+
+
+

For user interaction it is often useful to wait for a keypress (making view a blocking call):

+

keypressed = C.view(True, 'press some key!')
print('pressed key:', keypressed, chr(keypressed))

+

Get a screenshot:

+
+
[ ]:
+
+
+
img = C.view_getScreenshot()
+print(type(img), img.shape)
+
+
+
+

And release everything, including closing the view

+
+
[ ]:
+
+
+
del C
+
+
+
+
+
[ ]:
+
+
+

+
+
+
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/docs/source/notebooks/core1-config-and-frames.ipynb b/docs/notebooks/1a-configurations.ipynb similarity index 50% rename from docs/source/notebooks/core1-config-and-frames.ipynb rename to docs/notebooks/1a-configurations.ipynb index 789480d..dc0bce1 100644 --- a/docs/source/notebooks/core1-config-and-frames.ipynb +++ b/docs/notebooks/1a-configurations.ipynb @@ -4,34 +4,56 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Configuration & Frames\n", + "# Configurations\n", "\n", - "more detailed docs: https://marctoussaint.github.io/robotics-course/" + "A configuration is essentially a set of (coordinate) frames, where each frame can represent a shape, joint, inertia, etc. This tutorial introduces to basics of creating & loading configurations, the joint and frame state, computing features, and handling the view window." ] }, { "cell_type": "code", - "execution_count": null, + "execution_count": 1, "metadata": {}, - "outputs": [], + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "compile time: Sep 23 2023 12:43:33\n" + ] + } + ], "source": [ - "from robotic import ry" + "from robotic import ry\n", + "import numpy as np\n", + "import time\n", + "print(ry.compiled())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Setting up a basic Config\n", + "## Adding frames to a configuration\n", "\n", - "The starting point is to create a `Configuration`." + "The starting point is to create a `Configuration`:" ] }, { "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "C = ry.Config()\n", "C.view()" @@ -43,18 +65,39 @@ "source": [ "This shows an empty configuration. Tip: Make the view window appear \"Always On Top\" (right click on the window bar)\n", "\n", - "You can add things (objects, scene models, robots) to a configuration." + "A configuration is essentially a tree (or forrest) of frames. You usually add models from files, but let's do it manually here." ] }, { "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "frame name: first pos: [0. 0. 0.5] quat: [0.95782629 0.28734789 0. 0. ]\n" + ] + }, + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "C.clear()\n", - "C.addFile('../rai-robotModels/pr2/pr2.g')\n", - "C.addFile('../rai-robotModels/objects/kitchen.g')\n", + "f = C.addFrame(name='first')\n", + "f.setShape(type=ry.ST.marker, size=[.4])\n", + "f.setPosition([0.,0.,.5])\n", + "f.setQuaternion([1., .3, .0, .0]) #is normalized internally\n", + "print('frame name:', f.name, 'pos:', f.getPosition(), 'quat:', f.getQuaternion())\n", "C.view()" ] }, @@ -62,19 +105,38 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "You need to call C.view() to update the view" + "Let's add a second frame, but with first as parent and with a hinge joint!" 
] }, { "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "ball = C.addFrame(name=\"ball\")\n", - "ball.setShape(ry.ST.sphere, [.1])\n", - "ball.setPosition([.8,.8,1.5])\n", - "ball.setColor([1,1,0])\n", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "frame name: second pos: [0. 0. 0.5] quat: [-0.95782629 -0.28734789 -0. -0. ]\n" + ] + }, + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 4, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "f = C.addFrame(name='second', parent='first')\n", + "f.setJoint(ry.JT.hingeX)\n", + "f.setShape(type=ry.ST.marker, size=[.4])\n", + "f.setColor([1,0,0])\n", + "print('frame name:', f.name, 'pos:', f.getPosition(), 'quat:', f.getQuaternion())\n", "C.view()" ] }, @@ -82,19 +144,37 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "One can also add convex meshes (just passing the vertex array), or use sphere-swept convex meshes (ssBox, capsule, sphere, etc)" + "Since we now have a configuration with a joint, we can articulate it:" ] }, { "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "joint state: [0.1]\n" + ] + }, + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ - "C.addFrame(name=\"hand\", parent=\"pr2L\") \\\n", - " .setShape(ry.ST.ssBox, size=[.2,.2,.1,.02]) \\\n", - " .setRelativePosition([0,0,-.1]) \\\n", - " .setColor([1,1,0])\n", + "q = C.getJointState()\n", + "q[0] = q[0] + .1\n", + "C.setJointState(q)\n", + "print('joint state:', q)\n", "C.view()" ] }, @@ -102,7 +182,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "In this last example, the new object has another frame (pr2L) as *parent*. This means that it is permanently attached to this parent. pos and quat/rot are interpreted relative to the parent." 
+ "Other examples to add:" ] }, { @@ -111,26 +191,20 @@ "metadata": {}, "outputs": [], "source": [ - "f = C.frame(\"hand\")\n", - "print(\"position:\", f.getPosition())\n", - "print(\"orientation:\", f.getQuaternion())" + "C.addFrame('ball', 'second') .setShape(ry.ST.sphere, [.1]) .setColor([1.,.5,.0]) .setRelativePosition([-.3,.0,.2])\n", + "C.addFrame('box', 'second') .setShape(ry.ST.ssBox, [.3,.2,.1,.02]) .setColor([.5,1.,.0]) .setRelativePosition([.0,.0,.2])\n", + "C.addFrame('capsule', 'second') .setShape(ry.ST.capsule, [.3, .05]) .setColor([.0,1.,.5]) .setRelativePosition([.3,.0,.2])\n", + "for t in range(100):\n", + " C.setJointState([np.cos(.1*t)])\n", + " C.view()\n", + " time.sleep(.1)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "TODO (below): getters and setters for frames" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "frameC = C.frame('C')\n", - "print('pos:', frameC.getPosition(), 'quat:', frameC.getQuaternion())" + "## Loading existing configurations" ] }, { @@ -139,20 +213,16 @@ "metadata": {}, "outputs": [], "source": [ - "q[0] = q[0] + .5\n", - "C.setJointState(q)\n", - "print('pos:', frameC.getPosition(), 'quat:', frameC.getQuaternion())" + "C.clear()\n", + "C.addFile(ry.raiPath('panda/panda.g'))\n", + "C.view()" ] }, { - "cell_type": "code", - "execution_count": null, + "cell_type": "markdown", "metadata": {}, - "outputs": [], "source": [ - "[y,J] = C.eval(ry.FS.position, ['C'])\n", - "print('position of C:', y, '\\nJacobian:', J)\n", - "type(J)" + "Let's add a second panda, but prefix all frame names, and move it to the side" ] }, { @@ -161,30 +231,17 @@ "metadata": {}, "outputs": [], "source": [ - "#only the z-position relative to target 0.5:\n", - "C.eval(ry.FS.position, ['C'], [[0,0,1]], [0,0,0.5]) #the scaling is a 1x3 matrix" + "C.addFile(ry.raiPath('panda/panda.g'), 'r_')\n", + "base_r = C.getFrame('r_panda_base')\n", + "base_r.setPosition([.0, .5, .0])\n", + "C.view()" ] }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [] - }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Joint and Frame State\n", - "\n", - "A configuration is a tree of n frames. Every frame has a pose (position & quaternion), which is represented as a 7D vector (x,y,z, qw,qx,qy,qz). The frame state is the $n\\times 7$ matrix, where the i-th row is the pose of the i-th frame.\n", - "\n", - "A configuration also defines joints, which means that the relative transfromation from a parent to a child frame is parameterized by degrees-of-freedoms (DOFs). If the configuration has in total n DOFs, the joint state is a n-dimensional vector.\n", - "\n", - "Setting the joint state implies computing all relative transformations, and then forward chaining all transformations to compute all frame poses. So setting the joint state also sets the frame state.\n", - " \n", - "Setting the frame state allows you to set frame poses that are inconsistent/impossible w.r.t. the joints! 
Setting the frame state implies computing all relative transformations from the frame poses, and then assigning the joint state to the *projection* onto the actual DOFs" + "We can get the joint state of the full configuration:" ] }, { @@ -193,16 +250,16 @@ "metadata": {}, "outputs": [], "source": [ - "q = C.getJointState()\n", - "print('joint names: ', C.getJointNames() )\n", - "print('joint state: ', q)" + "print(C.getJointState())\n", + "print('joints:', C.getJointNames())\n", + "print('frames:', C.getFrameNames())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Let's move the configuration by adding to the joint configuration" + "Let's animate:" ] }, { @@ -211,16 +268,20 @@ "metadata": {}, "outputs": [], "source": [ - "q[2] = q[2] + 1.\n", - "C.setJointState(q)\n", - "C.view()" + "q0 = C.getJointState()\n", + "for t in range(20):\n", + " q = q0 + .1*np.random.randn(q0.shape[0])\n", + " C.setJointState(q)\n", + " C.view()\n", + " time.sleep(.2)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "The *frame state* is a $n\\times 7$ matrix, which contains for all of $n$ frames the 7D pose. A pose is stored as [p_x, p_y, p_z, q_w, q_x, q_y, q_z], with position p and quaternion q." + "## Features: computing geometric properties\n", + "For every frame we can query its pose:" ] }, { @@ -229,15 +290,19 @@ "metadata": {}, "outputs": [], "source": [ - "X0 = C.getFrameState()\n", - "print('frame state: ', X0)" + "f = C.getFrame('r_gripper')\n", + "print('gripper pos:', f.getPosition())\n", + "print('gripper quat:', f.getQuaternion())\n", + "print('gripper rot:', f.getRotationMatrix())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "Let's do a questionable thing: adding .1 to all numbers in the frame matrix!" + "The above provides basic forward kinematics: After `setJointState` you can query the pose of any configuration frame. However, there is a more general way to query *features*, i.e. properties of the configuration in a differentiable manner. You might not use this often; but it is important to understand as these differentiable features are the foundation of how optimization problems are formulated, which you'll need a lot.\n", + "\n", + "Here are some example features to evaluate:" ] }, { @@ -246,18 +311,8 @@ "metadata": {}, "outputs": [], "source": [ - "X = X0 + .1\n", - "C.setFrameState(X)\n", - "C.view()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The rows of X have non-normalized quaternions! These are normalized when setting the frame state.\n", - "\n", - "Also, the frame poses are now *inconsistent* to the joint constraints! We can read out the projected joint state, set the joint state, and get a consistent state again:" + "[y,J] = C.eval(ry.FS.position, ['gripper'])\n", + "print('position of gripper:', y, '\\nJacobian:', J)" ] }, { @@ -266,15 +321,18 @@ "metadata": {}, "outputs": [], "source": [ - "C.setJointState( C.getJointState() )\n", - "C.view()" + "# negative(!) distance between two convex shapes (or origin of marker)\n", + "C.eval(ry.FS.negDistance, ['panda_coll7', 'r_panda_coll7'])" ] }, { - "cell_type": "markdown", + "cell_type": "code", + "execution_count": null, "metadata": {}, + "outputs": [], "source": [ - "Now all *joint* transformations are consistent: just hingeX transformations or alike. However, all the other relative transformations of links and shapes are still messed up from setting their frame pose. 
Let's bring the configuration back into the state before the harsh *setFrame*" + "# the x-axis of the given frame in world coordinates\n", + "C.eval(ry.FS.vectorX, ['gripper'])" ] }, { @@ -282,20 +340,21 @@ "execution_count": null, "metadata": {}, "outputs": [], - "source": [ - "C.setFrameState(X0)\n", - "C.view()" - ] + "source": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ - " ## Selecting joints\n", + "## Joint and Frame State\n", "\n", - "Often one would like to choose which joints are actually active, that is, which joints are referred to in q. This allows one to sub-select joints and work only with projections of the full configuration state. This changes the joint state dimensionality, including ordering of entries in q.\n", + "A configuration is a tree of n frames. Every frame has a pose (position & quaternion), which is represented as a 7D vector (x,y,z, qw,qx,qy,qz). The frame state is the $n\\times 7$ matrix, where the i-th row is the pose of the i-th frame.\n", + "\n", + "A configuration also defines joints, which means that the relative transfromation from a parent to a child frame is parameterized by degrees-of-freedoms (DOFs). If the configuration has in total n DOFs, the joint state is a n-dimensional vector.\n", "\n", - "The frame state is not affected by such a selection of active joints." + "Setting the joint state implies computing all relative transformations, and then forward chaining all transformations to compute all frame poses. So setting the joint state also sets the frame state.\n", + " \n", + "Setting the frame state allows you to set frame poses that are inconsistent/impossible w.r.t. the joints! Setting the frame state implies computing all relative transformations from the frame poses, and then assigning the joint state to the *projection* onto the actual DOFs" ] }, { @@ -304,21 +363,15 @@ "metadata": {}, "outputs": [], "source": [ - "C.selectJointsByTag([\"armL\",\"base\"])\n", - "q = C.getJointState()\n", - "print('joint state: ', q)\n", - "print('joint names: ', C.getJointNames() )" + "C.setJointState(q0)\n", + "C.view()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Features & Jacobians\n", - "\n", - "A core part of rai defines features over configurations. A feature is a differentiable mapping from a configuration (or set of configurations) to a vector. Starndard features are \"position-of-endeffector-X\" or \"distance/penetration-between-convex-shapes-A-and-B\", etc. But there are many, many more features defined in rai, like error of Newton-Euler-equations for an object, total energy of the system, etc. Defining differentiable features is the core of many functionalities in the rai code.\n", - "\n", - "Let's define a basic feature over C: the 3D (world coordinate) position of pr2L (left hand)" + "The *frame state* is a $n\\times 7$ matrix, which contains for all of $n$ frames the 7D pose. A pose is stored as [p_x, p_y, p_z, q_w, q_x, q_y, q_z], with position p and quaternion q." ] }, { @@ -327,14 +380,15 @@ "metadata": {}, "outputs": [], "source": [ - "F = C.feature(ry.FS.position, [\"pr2L\"])" + "X0 = C.getFrameState()\n", + "print('frame state: ', X0)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We can now evaluate the feature, and also get the Jacobian:" + "Let's do a very questionable thing: adding .1 to all numbers in the frame matrix!" 
] }, { @@ -343,29 +397,20 @@ "metadata": {}, "outputs": [], "source": [ - "print(F.description(C))\n", - "\n", - "[y,J] = F.eval(C)\n", - "print('hand position:', y)\n", - "print('Jacobian:', J)\n", - "print('Jacobian shape:', J.shape)" + "X = X0 + .1\n", + "C.setFrameState(X)\n", + "C.view()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We can linearly transform features by setting 'scale' and 'target':" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "F2 = C.feature(ry.FS.distance, [\"hand\", \"ball\"])\n", - "print(F2.description(C))" + "That totally broke the original design of the robot! Setting global frame states overwrites the relative transformations between frames.\n", + "\n", + "(Also, the rows of X have non-normalized quaternions! These are normalized when setting the frame state.)\n", + "\n", + "Let's reset:" ] }, { @@ -374,21 +419,22 @@ "metadata": {}, "outputs": [], "source": [ - "F2.eval(C)" + "C.setFrameState(X0)\n", + "C.view()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "## Camera views (needs more testing)" + "## View interaction and releasing objects" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "We can also add a frame, attached to the head, which has no shape associated to it, but create a view is associated with that frame:" + "You can close and re-open the view window" ] }, { @@ -397,10 +443,7 @@ "metadata": {}, "outputs": [], "source": [ - "C.addFrame(name='camera', parent='head_tilt_link', args='Q: focalLength:.3')\n", - "V = C.cameraView()\n", - "IV = V.imageViewer()\n", - "V.addSensor(name='camera', frameAttached='camera', width=600, height=400)" + "C.view_close()" ] }, { @@ -409,32 +452,30 @@ "metadata": {}, "outputs": [], "source": [ - "[image,depth] = V.computeImageAndDepth()" + "# things are still there\n", + "C.view(pause=False, message='this is a message')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "When we move the robot, that view moves with it:" + "For user interaction it is often useful to wait for a keypress (making `view` a blocking call):" ] }, { - "cell_type": "code", - "execution_count": null, + "cell_type": "markdown", "metadata": {}, - "outputs": [], "source": [ - "C.setJointState(q=[0.5], joints=['head_pan_joint'])\n", - "V.updateConfig(C)\n", - "V.computeImageAndDepth()" + "keypressed = C.view(True, 'press some key!')\n", + "print('pressed key:', keypressed, chr(keypressed))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "To close a view (or destroy a handle to a computational module), we reassign it to zero. We can also remove a frame from the configuration." + "Get a screenshot:" ] }, { @@ -443,9 +484,15 @@ "metadata": {}, "outputs": [], "source": [ - "IV = 0\n", - "V = 0\n", - "C.delFrame('camera')" + "img = C.view_getScreenshot()\n", + "print(type(img), img.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And release everything, including closing the view" ] }, { @@ -454,7 +501,7 @@ "metadata": {}, "outputs": [], "source": [ - "C=0" + "del C" ] }, { diff --git a/docs/notebooks/1b-botop.html b/docs/notebooks/1b-botop.html new file mode 100644 index 0000000..9a599f1 --- /dev/null +++ b/docs/notebooks/1b-botop.html @@ -0,0 +1,454 @@ + + + + + + + 2.1.2. Robot Operation (BotOp) interface — Robotics Course documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+
    +
  • + + +
  • +
  • +
+
+
+
+
+ +
+

2.1.2. Robot Operation (BotOp) interface

+

This tutorial describes the basics of interacting with a real or simulated robot. The BotOp (=robot operation) interface is very narrow: the move methods set/overwrite a spline reference for the robot (compliance around the reference can also be set), the gripper methods operate the grippers, and the getImage.. methods grab images or point clouds from the camera.

+

This interface is different from a more generic physical simulation interface. If you're interested in the latter (e.g. to implement a gym environment), look at the Simulation tutorial. The simulation used here is a real-time threaded process that mimics the specific BotOp interface, to make it swappable with a real robot.

+

The simulation can be run in several different modes: purely kinematic (no physics for objects), a physics simulator with physics for objects but a still kinematic robot, or a physics simulator with PD motors for the robot.

+
+
[1]:
+
+
+
from robotic import ry
+import numpy as np
+import time
+
+
+
+

ry has global parameters that can be defined in rai.cfg or with the following calls. The simulation behaves very differently depending on botsim/engine [physx or kinematic] and physx/multibody.

+
+
[2]:
+
+
+
ry.params_add({'botsim/verbose': 2., 'physx/motorKp': 10000., 'physx/motorKd': 1000.})
+ry.params_add({'botsim/engine': 'physx'}) #makes a big difference!
+ry.params_add({'physx/multibody': True}) #makes a big difference!
+ry.params_print()
+
+
+
+
+
+
+
+
+-- ry.cpp:operator():99(0) python,
+message: "Hello, the local 'rai.cfg' was loaded",
+botsim/verbose: 2,
+physx/motorKp: 10000,
+physx/motorKd: 1000,
+botsim/engine: physx,
+physx/multibody: 1
+
+
+
+
[3]:
+
+
+
C = ry.Config()
+C.addFile(ry.raiPath('../rai-robotModels/scenarios/pandaSingle.g'))
+C.view(False, 'this is your workspace data structure C -- NOT THE SIMULTATION')
+
+
+
+
+
[3]:
+
+
+
+
+0
+
+
+

We open a robot interface in simulation (False). True would directly open communication to one or two pandas (depending on how many are defined in C). The botsim/verbose parameter above leads to the explicit verbosity when creating the simulator interface.

+
+
[4]:
+
+
+
bot = ry.BotOp(C, False)
+#note that in sim, when physx multibody is activated, arms are going down! free floating...
+
+
+
+
+
+
+
+
+-- kin_physx.cpp:PhysXInterface:768(0) starting PhysX engine ...
+-- kin_physx.cpp:addGround:238(0) ... done starting PhysX engine
+-- kin_physx.cpp:addGround:239(0) creating Configuration within PhysX ...
+-- kin_physx.cpp:addLink:254(0) adding link 'world' as static with 1 shapes
+ table
+-- kin_physx.cpp:addMultiBody:466(0) adding multibody with base 'l_panda_base' with the following links ...
+-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_base' as kinematic with 1 shapes
+ l_panda_link0_0
+-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint1' as dynamic with 1 shapes
+ l_panda_link1_0
+-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint2' as dynamic with 2 shapes
+ l_panda_link2_0 bellybutton
+-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint3' as dynamic with 1 shapes
+ l_panda_link3_0
+-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint4' as dynamic with 1 shapes
+ l_panda_link4_0
+-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint5' as dynamic with 1 shapes
+ l_panda_link5_0
+-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint6' as dynamic with 1 shapes
+ l_panda_link6_0
+-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint7' as dynamic with 2 shapes
+ l_panda_link7_0 l_panda_hand_0
+-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_finger_joint1' as dynamic with 1 shapes
+ l_panda_leftfinger_0
+-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_finger_joint2' as dynamic with 1 shapes
+ l_panda_rightfinger_0
+-- kin_physx.cpp:addMultiBody:592(0) ... done with multibody with base 'l_panda_base'
+-- kin_physx.cpp:PhysXInterface:805(0) ... done creating Configuration within PhysX
+
+
+

We define two reference poses, q0 (= home) and q1 (= 2nd joint bent), so that we can move back and forth between them:

+
+
[5]:
+
+
+
qHome = bot.get_qHome()
+q0 = qHome.copy()
+q1 = q0.copy()
+q1[1] = q1[1] + .2
+print(q0, q1)
+
+
+
+
+
+
+
+
+[ 0.  -0.5  0.  -2.   0.   2.  -0.5] [ 0.  -0.3  0.  -2.   0.   2.  -0.5]
+
+
+

The moveTo method is the simplest way to move the robot from the current state to a target. It internally creates a spline to the target with optimal timing and follows it. The call is non-blocking. Also, your workspace config C is not automatically kept in sync with the real/simulated robot. If you want to wait until the motion is finished, you need to manually check the remaining time of the reference spline (getTimeToEnd below), and meanwhile keep C sync'ed.

+
+
[6]:
+
+
+
bot.moveTo(q1)
+
+while bot.getTimeToEnd()>0:
+    bot.sync(C, .1)
+
+
+
+

The internal spline reference can be appended to: as moveTo is non-blocking, you can queue several moves like this:

+
+
[7]:
+
+
+
print('timeToEnd:', bot.getTimeToEnd())
+bot.moveTo(q0)
+print('timeToEnd:', bot.getTimeToEnd())
+bot.moveTo(q1)
+print('timeToEnd:', bot.getTimeToEnd())
+bot.moveTo(q0)
+
+while bot.getTimeToEnd()>0:
+    bot.sync(C, .1)
+
+
+
+
+
+
+
+
+timeToEnd: -0.073524058876411
+timeToEnd: 1.0957189420649807
+timeToEnd: 2.191437884129961
+
+
+

Spline execution becomes reactive when we smoothly overwrite the spline reference at high frequency. Let's create a randomly moving target object and track it.

+
+
[8]:
+
+
+
#this reference frame only appears in your workspace C - not the simulation!
+target = C.addFrame('target', 'table')
+target.setShape(ry.ST.marker, [.1])
+target.setRelativePosition([0., .3, .3])
+pos = target.getPosition()
+cen = pos.copy()
+C.view()
+
+
+
+
+
[8]:
+
+
+
+
+0
+
+
+
+
[9]:
+
+
+
# you'll learn about KOMO later - this defines a basic Inverse Kinematics method
+def IK(C, pos):
+    q0 = C.getJointState()
+    komo = ry.KOMO(C, 1, 1, 0, False) #one phase one time slice problem, with 'delta_t=1', order=0
+    komo.addObjective([], ry.FS.jointState, [], ry.OT.sos, [1e-1], q0) #cost: close to 'current state'
+    komo.addObjective([], ry.FS.jointState, [], ry.OT.sos, [1e-1], qHome) #cost: close to qHome
+    komo.addObjective([], ry.FS.positionDiff, ['l_gripper', 'target'], ry.OT.eq, [1e1]) #constraint: gripper position
+
+    ret = ry.NLP_Solver(komo.nlp(), verbose=0) .solve()
+
+    return [komo.getPath()[0], ret]
+
+
+
+

The following just 'sets' the workspace C to the IK solution - no motion is sent to the real/simulated robot:

+
+
[10]:
+
+
+
for t in range(20):
+    time.sleep(.1)
+    pos = cen + .98 * (pos-cen) + 0.02 * np.random.randn(3)
+    target.setPosition(pos)
+
+    q_target, ret = IK(C, pos)
+    print(ret)
+    C.setJointState(q_target)
+    C.view()
+
+
+
+
+
+
+
+
+{ time: 0.005057, evals: 6, done: 1, feasible: 1, sos: 0.00789171, f: 0, ineq: 0, eq: 0.00138392 }
+{ time: 0.001435, evals: 4, done: 1, feasible: 1, sos: 0.0055078, f: 0, ineq: 0, eq: 0.00038073 }
+{ time: 0.001453, evals: 3, done: 1, feasible: 1, sos: 0.00485786, f: 0, ineq: 0, eq: 0.0031066 }
+{ time: 0.000382, evals: 3, done: 1, feasible: 1, sos: 0.00552496, f: 0, ineq: 0, eq: 0.00285066 }
+{ time: 0.000871, evals: 4, done: 1, feasible: 1, sos: 0.00481289, f: 0, ineq: 0, eq: 0.000176548 }
+{ time: 0.000181, evals: 3, done: 1, feasible: 1, sos: 0.00416384, f: 0, ineq: 0, eq: 0.00404598 }
+{ time: 0.001811, evals: 3, done: 1, feasible: 1, sos: 0.00394648, f: 0, ineq: 0, eq: 0.00118522 }
+{ time: 0.000637, evals: 3, done: 1, feasible: 1, sos: 0.00410849, f: 0, ineq: 0, eq: 0.00429565 }
+{ time: 0.001591, evals: 4, done: 1, feasible: 1, sos: 0.00454469, f: 0, ineq: 0, eq: 0.000211266 }
+{ time: 0.000784, evals: 3, done: 1, feasible: 1, sos: 0.00470588, f: 0, ineq: 0, eq: 0.00357698 }
+{ time: 0.000942, evals: 3, done: 1, feasible: 1, sos: 0.00478467, f: 0, ineq: 0, eq: 0.00148874 }
+{ time: 0.001108, evals: 3, done: 1, feasible: 1, sos: 0.00457452, f: 0, ineq: 0, eq: 0.00162474 }
+{ time: 0.001451, evals: 4, done: 1, feasible: 1, sos: 0.005894, f: 0, ineq: 0, eq: 0.000439926 }
+{ time: 0.002882, evals: 3, done: 1, feasible: 1, sos: 0.00540869, f: 0, ineq: 0, eq: 0.0027726 }
+{ time: 0.001624, evals: 4, done: 1, feasible: 1, sos: 0.00720508, f: 0, ineq: 0, eq: 0.00069492 }
+{ time: 0.001267, evals: 4, done: 1, feasible: 1, sos: 0.0075089, f: 0, ineq: 0, eq: 0.000505518 }
+{ time: 0.002128, evals: 3, done: 1, feasible: 1, sos: 0.0078162, f: 0, ineq: 0, eq: 0.0019024 }
+{ time: 0.001489, evals: 3, done: 1, feasible: 1, sos: 0.00831903, f: 0, ineq: 0, eq: 0.00241169 }
+{ time: 0.005757, evals: 4, done: 1, feasible: 1, sos: 0.0106921, f: 0, ineq: 0, eq: 0.000467631 }
+{ time: 0.001208, evals: 4, done: 1, feasible: 1, sos: 0.0115667, f: 0, ineq: 0, eq: 0.000478951 }
+
+
+

We now generate reactive motion by smoothly overwriting the spline reference. Increasing the time cost makes it more aggressive (it penalizes the total duration of the estimated cubic spline).

+
+
[11]:
+
+
+
for t in range(100):
+    bot.sync(C, .1) #keep the workspace C sync'ed to real/sim, and idle .1 sec
+    pos = cen + .98 * (pos-cen) + 0.02 * np.random.randn(3)
+    target.setPosition(pos)
+
+    q_target, ret = IK(C, pos)
+    bot.moveTo(q_target, timeCost=5., overwrite=True)
+
+
+
+

Good practice is to always allow the user to abort motion execution. In this example, pressing the key 'q' breaks the loop and calls home() (which is the same as moveTo(qHome, 1., True)):

+
+
[ ]:
+
+
+
for t in range(5):
+    bot.moveTo(q1)
+    bot.wait(C) #same as 'loop sync til keypressed or endOfTime', but also raises user window
+    if bot.getKeyPressed()==ord('q'):
+        break;
+
+    bot.moveTo(q0)
+    bot.wait(C)
+    if bot.getKeyPressed()==ord('q'):
+        break;
+
+bot.home(C)
+
+
+
+

Gripper movements also do not block:

+
+
[ ]:
+
+
+
bot.gripperMove(ry._left, width=.02)
+
+while not bot.gripperDone(ry._left):
+    bot.sync(C, .1)
+
+bot.gripperMove(ry._left, width=.075)
+
+while not bot.gripperDone(ry._left):
+    bot.sync(C, .1)
+
+
+
+

Always close the bot/sim properly:

+
+
[ ]:
+
+
+
del bot
+del C
+
+
+
+

As a side note, we can always check which global config parameters have been queried by the code so far. That gives an idea of which global parameters exist:

+
+
[ ]:
+
+
+
ry.params_print()
+
+
+
+
+
[ ]:
+
+
+

+
+
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/docs/notebooks/1b-botop.ipynb b/docs/notebooks/1b-botop.ipynb new file mode 100644 index 0000000..a0a1da5 --- /dev/null +++ b/docs/notebooks/1b-botop.ipynb @@ -0,0 +1,483 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "e193f01d", + "metadata": {}, + "source": [ + "# Robot Operation (BotOp) interface\n", + "\n", + "This describes basics to interact with a real or simulated robot. The BotOp (=robot operation) interface is very narrow. The move methods set/overwrite a spline reference for the robot. (Also compliance around the reference can be set.) The gripper methods operate grippers. The getImage.. methods grab images or point clouds from the camera.\n", + "\n", + "This interface different to a more *generic physical simulation* interface. If you're interested in the latter (e.g. to implement a gym environment) look at the `Simulation` tutorial. The simulation used here is a real-time threaded process that mimics the specific BotOp interface -- to make it swappable with a real robot.\n", + "\n", + "The simulation can be run in many different modes: pure kinematic (no physics for objects), a physics simulator with physics for objects but still kinematic robot, a physic simulator with PD motors for the robot." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "id": "31a434d3", + "metadata": {}, + "outputs": [], + "source": [ + "from robotic import ry\n", + "import numpy as np\n", + "import time" + ] + }, + { + "cell_type": "markdown", + "id": "74bb70b4", + "metadata": {}, + "source": [ + "ry has global parameters, that can be defined in `rai.cfg` or with the following calls.\n", + "The simulation behaves very differently depending on `botim/engine` [physx or kinematic] and `multibody`" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "bb4031b4", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "-- ry.cpp:operator():99(0) python,\n", + "message: \"Hello, the local 'rai.cfg' was loaded\",\n", + "botsim/verbose: 2,\n", + "physx/motorKp: 10000,\n", + "physx/motorKd: 1000,\n", + "botsim/engine: physx,\n", + "physx/multibody: 1\n" + ] + } + ], + "source": [ + "ry.params_add({'botsim/verbose': 2., 'physx/motorKp': 10000., 'physx/motorKd': 1000.})\n", + "ry.params_add({'botsim/engine': 'physx'}) #makes a big difference!\n", + "ry.params_add({'physx/multibody': True}) #makes a big difference!\n", + "ry.params_print()" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "d1bff41b", + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "C = ry.Config()\n", + "C.addFile(ry.raiPath('../rai-robotModels/scenarios/pandaSingle.g'))\n", + "C.view(False, 'this is your workspace data structure C -- NOT THE SIMULTATION')" + ] + }, + { + "cell_type": "markdown", + "id": "2333c4b1", + "metadata": {}, + "source": [ + "We open a robot interface in simulation (False). True would directly open communication to one or two pandas (depending no how many are defined in C). The `botsim/verbose` above leads to the explicit verbosity when creating the simulator interface." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "6832eb10", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "-- kin_physx.cpp:PhysXInterface:768(0) starting PhysX engine ...\n", + "-- kin_physx.cpp:addGround:238(0) ... done starting PhysX engine\n", + "-- kin_physx.cpp:addGround:239(0) creating Configuration within PhysX ...\n", + "-- kin_physx.cpp:addLink:254(0) adding link 'world' as static with 1 shapes\n", + " table\n", + "-- kin_physx.cpp:addMultiBody:466(0) adding multibody with base 'l_panda_base' with the following links ...\n", + "-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_base' as kinematic with 1 shapes\n", + " l_panda_link0_0\n", + "-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint1' as dynamic with 1 shapes\n", + " l_panda_link1_0\n", + "-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint2' as dynamic with 2 shapes\n", + " l_panda_link2_0 bellybutton\n", + "-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint3' as dynamic with 1 shapes\n", + " l_panda_link3_0\n", + "-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint4' as dynamic with 1 shapes\n", + " l_panda_link4_0\n", + "-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint5' as dynamic with 1 shapes\n", + " l_panda_link5_0\n", + "-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint6' as dynamic with 1 shapes\n", + " l_panda_link6_0\n", + "-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_joint7' as dynamic with 2 shapes\n", + " l_panda_link7_0 l_panda_hand_0\n", + "-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_finger_joint1' as dynamic with 1 shapes\n", + " l_panda_leftfinger_0\n", + "-- kin_physx.cpp:addMultiBody:486(0) adding multibody link 'l_panda_finger_joint2' as dynamic with 1 shapes\n", + " l_panda_rightfinger_0\n", + "-- kin_physx.cpp:addMultiBody:592(0) ... done with multibody with base 'l_panda_base'\n", + "-- kin_physx.cpp:PhysXInterface:805(0) ... done creating Configuration within PhysX\n" + ] + } + ], + "source": [ + "bot = ry.BotOp(C, False)\n", + "#note that in sim, when physx multibody is activated, arms are going down! free floating..." + ] + }, + { + "cell_type": "markdown", + "id": "c4ad9bb7", + "metadata": {}, + "source": [ + "We define 2 reference poses, q0=home and q1=(2nd joint bend), so that we can move back and forth between them" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "afe800f7", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[ 0. -0.5 0. -2. 0. 2. -0.5] [ 0. -0.3 0. -2. 0. 2. -0.5]\n" + ] + } + ], + "source": [ + "qHome = bot.get_qHome()\n", + "q0 = qHome.copy()\n", + "q1 = q0.copy()\n", + "q1[1] = q1[1] + .2\n", + "print(q0, q1)" + ] + }, + { + "cell_type": "markdown", + "id": "86f72e9b", + "metadata": {}, + "source": [ + "The `moveTo` is the simplest way to move the robot from current to target. It internally creates a spline to the target with optimal timing and follows it. The call is *non-blocking*. Also, your workspace config C is not kept in sync with the real/sim. If you want to wait till the motion is finished, you need to do manually checking the /time_til_end_of_reference_spline/, and meanwhile staying sync'ed." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 6, + "id": "443856f3", + "metadata": {}, + "outputs": [], + "source": [ + "bot.moveTo(q1)\n", + "\n", + "while bot.getTimeToEnd()>0:\n", + " bot.sync(C, .1)" + ] + }, + { + "cell_type": "markdown", + "id": "fa41eca8", + "metadata": {}, + "source": [ + "The internal spline reference can be appended: As `moveTo` is non-blocking, you can append several moves like this:" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "id": "182b64dd", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "timeToEnd: -0.073524058876411\n", + "timeToEnd: 1.0957189420649807\n", + "timeToEnd: 2.191437884129961\n" + ] + } + ], + "source": [ + "print('timeToEnd:', bot.getTimeToEnd())\n", + "bot.moveTo(q0)\n", + "print('timeToEnd:', bot.getTimeToEnd())\n", + "bot.moveTo(q1)\n", + "print('timeToEnd:', bot.getTimeToEnd())\n", + "bot.moveTo(q0)\n", + "\n", + "while bot.getTimeToEnd()>0:\n", + " bot.sync(C, .1)" + ] + }, + { + "cell_type": "markdown", + "id": "abc71ed9", + "metadata": {}, + "source": [ + "Setting splines becomes reactive, when we can smoothly overwrite the spline reference with high frequency. Let's create a randomly moving target object and track it." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "id": "c3dbe900", + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "#this reference frame only appears in your workspace C - not the simulation!\n", + "target = C.addFrame('target', 'table')\n", + "target.setShape(ry.ST.marker, [.1])\n", + "target.setRelativePosition([0., .3, .3])\n", + "pos = target.getPosition()\n", + "cen = pos.copy()\n", + "C.view()" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "id": "b2a168d1", + "metadata": {}, + "outputs": [], + "source": [ + "# you'll learn about KOMO later - this defines a basic Inverse Kinematics method\n", + "def IK(C, pos):\n", + " q0 = C.getJointState()\n", + " komo = ry.KOMO(C, 1, 1, 0, False) #one phase one time slice problem, with 'delta_t=1', order=0\n", + " komo.addObjective([], ry.FS.jointState, [], ry.OT.sos, [1e-1], q0) #cost: close to 'current state'\n", + " komo.addObjective([], ry.FS.jointState, [], ry.OT.sos, [1e-1], qHome) #cost: close to qHome\n", + " komo.addObjective([], ry.FS.positionDiff, ['l_gripper', 'target'], ry.OT.eq, [1e1]) #constraint: gripper position\n", + " \n", + " ret = ry.NLP_Solver(komo.nlp(), verbose=0) .solve()\n", + " \n", + " return [komo.getPath()[0], ret]" + ] + }, + { + "cell_type": "markdown", + "id": "76e13a25", + "metadata": {}, + "source": [ + "The following is just 'setting' the workspace C to the IK solution - no motion send to the real/robot:" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "id": "4998d869", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{ time: 0.005057, evals: 6, done: 1, feasible: 1, sos: 0.00789171, f: 0, ineq: 0, eq: 0.00138392 }\n", + "{ time: 0.001435, evals: 4, done: 1, feasible: 1, sos: 0.0055078, f: 0, ineq: 0, eq: 0.00038073 }\n", + "{ time: 0.001453, evals: 3, done: 1, feasible: 1, sos: 0.00485786, f: 0, ineq: 0, eq: 0.0031066 }\n", + "{ time: 0.000382, evals: 3, done: 1, feasible: 1, sos: 0.00552496, f: 0, ineq: 0, eq: 0.00285066 }\n", + "{ time: 0.000871, evals: 4, done: 1, feasible: 1, sos: 0.00481289, f: 0, ineq: 0, eq: 0.000176548 }\n", + "{ 
time: 0.000181, evals: 3, done: 1, feasible: 1, sos: 0.00416384, f: 0, ineq: 0, eq: 0.00404598 }\n", + "{ time: 0.001811, evals: 3, done: 1, feasible: 1, sos: 0.00394648, f: 0, ineq: 0, eq: 0.00118522 }\n", + "{ time: 0.000637, evals: 3, done: 1, feasible: 1, sos: 0.00410849, f: 0, ineq: 0, eq: 0.00429565 }\n", + "{ time: 0.001591, evals: 4, done: 1, feasible: 1, sos: 0.00454469, f: 0, ineq: 0, eq: 0.000211266 }\n", + "{ time: 0.000784, evals: 3, done: 1, feasible: 1, sos: 0.00470588, f: 0, ineq: 0, eq: 0.00357698 }\n", + "{ time: 0.000942, evals: 3, done: 1, feasible: 1, sos: 0.00478467, f: 0, ineq: 0, eq: 0.00148874 }\n", + "{ time: 0.001108, evals: 3, done: 1, feasible: 1, sos: 0.00457452, f: 0, ineq: 0, eq: 0.00162474 }\n", + "{ time: 0.001451, evals: 4, done: 1, feasible: 1, sos: 0.005894, f: 0, ineq: 0, eq: 0.000439926 }\n", + "{ time: 0.002882, evals: 3, done: 1, feasible: 1, sos: 0.00540869, f: 0, ineq: 0, eq: 0.0027726 }\n", + "{ time: 0.001624, evals: 4, done: 1, feasible: 1, sos: 0.00720508, f: 0, ineq: 0, eq: 0.00069492 }\n", + "{ time: 0.001267, evals: 4, done: 1, feasible: 1, sos: 0.0075089, f: 0, ineq: 0, eq: 0.000505518 }\n", + "{ time: 0.002128, evals: 3, done: 1, feasible: 1, sos: 0.0078162, f: 0, ineq: 0, eq: 0.0019024 }\n", + "{ time: 0.001489, evals: 3, done: 1, feasible: 1, sos: 0.00831903, f: 0, ineq: 0, eq: 0.00241169 }\n", + "{ time: 0.005757, evals: 4, done: 1, feasible: 1, sos: 0.0106921, f: 0, ineq: 0, eq: 0.000467631 }\n", + "{ time: 0.001208, evals: 4, done: 1, feasible: 1, sos: 0.0115667, f: 0, ineq: 0, eq: 0.000478951 }\n" + ] + } + ], + "source": [ + "for t in range(20):\n", + " time.sleep(.1)\n", + " pos = cen + .98 * (pos-cen) + 0.02 * np.random.randn(3)\n", + " target.setPosition(pos)\n", + " \n", + " q_target, ret = IK(C, pos)\n", + " print(ret)\n", + " C.setJointState(q_target)\n", + " C.view()" + ] + }, + { + "cell_type": "markdown", + "id": "c0d79cae", + "metadata": {}, + "source": [ + "We now generate reative motion by smoothly overwriting the spline reference. Increasing time cost makes it more agressive (penalized total duration of estimated cubic spline)." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "id": "c5af1933", + "metadata": {}, + "outputs": [], + "source": [ + "for t in range(100):\n", + " bot.sync(C, .1) #keep the workspace C sync'ed to real/sim, and idle .1 sec\n", + " pos = cen + .98 * (pos-cen) + 0.02 * np.random.randn(3)\n", + " target.setPosition(pos)\n", + " \n", + " q_target, ret = IK(C, pos)\n", + " bot.moveTo(q_target, timeCost=5., overwrite=True)" + ] + }, + { + "cell_type": "markdown", + "id": "35e22aa1", + "metadata": {}, + "source": [ + "Good practise is to always allow a user aborting motion execution. 
In this example, key 'q' will break the loop and call a home() (which is the same as moveTo(qHome, 1., True)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "09468d0c", + "metadata": {}, + "outputs": [], + "source": [ + "for t in range(5):\n", + " bot.moveTo(q1)\n", + " bot.wait(C) #same as 'loop sync til keypressed or endOfTime', but also raises user window\n", + " if bot.getKeyPressed()==ord('q'):\n", + " break;\n", + " \n", + " bot.moveTo(q0)\n", + " bot.wait(C)\n", + " if bot.getKeyPressed()==ord('q'):\n", + " break;\n", + "\n", + "bot.home(C)" + ] + }, + { + "cell_type": "markdown", + "id": "3776fd7a", + "metadata": {}, + "source": [ + "gripper movements also do not block:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "9b62c7c5", + "metadata": {}, + "outputs": [], + "source": [ + "bot.gripperMove(ry._left, width=.02)\n", + "\n", + "while not bot.gripperDone(ry._left):\n", + " bot.sync(C, .1)\n", + "\n", + "bot.gripperMove(ry._left, width=.075)\n", + "\n", + "while not bot.gripperDone(ry._left):\n", + " bot.sync(C, .1)" + ] + }, + { + "cell_type": "markdown", + "id": "2dfbfcea", + "metadata": {}, + "source": [ + "Always close the bot/sim properly:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2ea154ed", + "metadata": {}, + "outputs": [], + "source": [ + "del bot\n", + "del C" + ] + }, + { + "cell_type": "markdown", + "id": "ac37869b", + "metadata": {}, + "source": [ + "As a side note, we can always check which global config parameters have been queried by the code so far. That gives an idea of which global parameters exist:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "24342be8", + "metadata": {}, + "outputs": [], + "source": [ + "ry.params_print()" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6c9473f1", + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.10" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/docs/notebooks/1c-komo.html b/docs/notebooks/1c-komo.html new file mode 100644 index 0000000..6180101 --- /dev/null +++ b/docs/notebooks/1c-komo.html @@ -0,0 +1,652 @@ + + + + + + + 2.1.3. KOMO: Motion Optimization — Robotics Course documentation + + + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

2.1.3. KOMO: Motion Optimization

+

KOMO is a framework for designing motion by formulating optimization problems. Inverse kinematics (IK) is the special case of optimizing only over a single configuration rather than a path. Formulating KOMO problems is key to realizing motion in rai.

+

This tutorial shows the basics of how IK, rough waypoint optimization, and fine path optimization can be formulated as a non-linear mathematical program (NLP) using KOMO. Essentially, addObjective allows adding costs or constraints over any Feature to the NLP (the same features that can be evaluated with 'C.eval').

+
+

2.1.3.1. Minimal IK example

+
+
[1]:
+
+
+
from robotic import ry
+import numpy as np
+import time
+
+
+
+
+
[2]:
+
+
+
C = ry.Config()
+C.addFile(ry.raiPath('scenarios/pandaSingle.g'))
+C.view()
+
+
+
+
+
[2]:
+
+
+
+
+0
+
+
+
+
[3]:
+
+
+
C.addFrame('box') \
+    .setPosition([-.25,.1,1.]) \
+    .setShape(ry.ST.ssBox, size=[.06,.06,.06,.005]) \
+    .setColor([1,.5,0]) \
+    .setContact(True)
+C.view()
+
+
+
+
+
[3]:
+
+
+
+
+0
+
+
+

The following defines an optimization problem over a single configuration. The KOMO object essentially contains (1) copies of the configuration(s) over which we optimize, and (2) the list of objectives (=costs & constraints) that define the optimization problem.

+

The constructor declares over how many configurations (single, waypoints, path..) we optimize. The addObjective methods add costs or constraints:

+
+
[4]:
+
+
+
qHome = C.getJointState()
+komo = ry.KOMO(C, 1, 1, 0, False)
+komo.addObjective(times=[], feature=ry.FS.jointState, frames=[], type=ry.OT.sos, scale=[1e-1], target=qHome);
+komo.addObjective([], ry.FS.positionDiff, ['l_gripper', 'box'], ry.OT.eq, [1e1]);
+
+
+
+

We explain the KOMO constructor arguments later. (The above defines an IK problem.)

+

The addObjective method has the signature:

* times: the time intervals (subset of configurations in a path) over which this feature is active (irrelevant for IK)
* feature: the feature symbol (see advanced Feature tutorial)
* frames: the frames for which the feature is computed, given as a list of frame names
* type: whether this is a sum-of-squares (sos) cost, or an eq or ineq constraint
* scale: the matrix(!) by which the feature is multiplied
* target: the offset which is subtracted from the feature (before scaling); see the sketch below

+

Please see more formal details <here - link to script!>

+

Given this definition of an optimization problem, we can call a generic NLP solver:

+
+
[5]:
+
+
+
ret = ry.NLP_Solver(komo.nlp(), verbose=4) .solve()
+print(ret)
+
+
+
+
+
+
+
+
+{ time: 0.000823, evals: 6, done: 1, feasible: 1, sos: 0.00414146, f: 0, ineq: 0, eq: 0.00188382 }
+====nlp==== method:AugmentedLagrangian bounded: yes
+==nlp== it:0 evals:0 mu:1 nu:1 muLB:0.1
+----newton---- initial point f(x):16.0447 alpha:1 beta:1
+--newton-- it:   1  |Delta|:        0.2  alpha:          1  evals:   2  f(y):    6.55808  ACCEPT
+--newton-- it:   2  |Delta|:        0.2  alpha:          1  evals:   3  f(y):   0.686083  ACCEPT
+--newton-- it:   3  |Delta|:   0.144223  alpha:          1  evals:   4  f(y):  0.0170221  ACCEPT
+--newton-- it:   4  |Delta|:  0.0221449  alpha:          1  evals:   5  f(y): 0.00418093  ACCEPT
+--newton-- stopping: 'absMax(Delta)<options.stopTolerance'
+==nlp== it:   0  evals:   5  A(x): 0.00418093  f: 0.00414937  g:          0  h: 0.00951471  |x-x'|:   0.373024      stop:DeltaConverge
+==nlp== it:   1  evals:   5  A(x): 0.00437027  mu:5
+--newton-- it:   5  |Delta|: 0.00240133  alpha:          1  evals:   6  f(y): 0.00413537  ACCEPT
+--newton-- stopping: 'absMax(Delta)<options.stopTolerance'
+==nlp== it:   1  evals:   6  A(x): 0.00413537  f: 0.00414146  g:          0  h: 0.00188382  |x-x'|: 0.00240133      stop:DeltaConverge
+==nlp== StoppingCriterion Delta<0.01
+----newton---- final f(x):0.00413537
+
+
+

The KOMO view displays the optimized configuration(s) stored by KOMO. (For paths, this is an overlay of many configurations. For IK, just one.)

+
+
[6]:
+
+
+
komo.view(False, "IK solution")
+
+
+
+
+
[6]:
+
+
+
+
+0
+
+
+

We can get the sequence of joint state vectors for the optimized configuration(s) with getPath. Since this is only an IK problem, the sequence contains only the joint state vector for the single optimized configuration:

+
+
[7]:
+
+
+
q = komo.getPath()
+print(type(q), len(q))
+
+
+
+
+
+
+
+
+<class 'numpy.ndarray'> 1
+
+
+

We’re done with KOMO and can destroy it. Then set the optimal joint state in C and view it:

+
+
[8]:
+
+
+
del komo #also closes komo view
+C.setJointState(q[0])
+C.view()
+
+
+
+
+
[8]:
+
+
+
+
+0
+
+
+
+
+

2.1.3.2. Example for more constraints: box grasping IK

+

The key to designing motions is adding clever constraints. Here is an example of more realistic box grasping:

+
+
[9]:
+
+
+
komo = ry.KOMO(C, 1,1,0, True)
+komo.addObjective([], ry.FS.jointState, [], ry.OT.sos, [1e-1], qHome)
+komo.addObjective([], ry.FS.accumulatedCollisions, [], ry.OT.eq)
+komo.addObjective([], ry.FS.jointLimits, [], ry.OT.ineq)
+komo.addObjective([], ry.FS.positionDiff, ['l_gripper', 'box'], ry.OT.eq, [1e1])
+komo.addObjective([], ry.FS.scalarProductXX, ['l_gripper', 'box'], ry.OT.eq, [1e1], [0])
+komo.addObjective([], ry.FS.scalarProductXZ, ['l_gripper', 'box'], ry.OT.eq, [1e1], [0])
+komo.addObjective([], ry.FS.distance, ['l_palm', 'box'], ry.OT.ineq, [1e1])
+
+
+
+

The two scalarProduct features state that the gripper x-axis (which is the axis connecting the fingers) should be orthogonal to the object’s x- and z-axes. That implies that the fingers normally oppose the object’s y-planes.

+

Note that the grasp could also oppose the object’s x- or z-planes – see below.

+
+
[10]:
+
+
+
ret = ry.NLP_Solver(komo.nlp(), verbose=0 ) .solve()
+print(ret)
+if ret.feasible:
+    print('-- Always check feasibility flag of NLP solver return')
+else:
+    print('-- THIS IS INFEASIBLE!')
+
+
+
+
+
+
+
+
+{ time: 0.001353, evals: 4, done: 1, feasible: 1, sos: 0.00552548, f: 0, ineq: 0, eq: 0.00124449 }
+-- Always check feasibility flag of NLP solver return
+
+
+
+
[11]:
+
+
+
q = komo.getPath()
+C.setJointState(q[0])
+C.view(False, "IK solution")
+
+
+
+
+
[11]:
+
+
+
+
+0
+
+
+

Reusing the KOMO instance is fine if some aspect of the configuration changes and you want to re-solve the same problem:

+
+
[12]:
+
+
+
box = C.getFrame('box')
+box.setPosition([-.25,.1,1.])
+p0 = box.getPosition()
+
+
+
+
+
[13]:
+
+
+
for t in range(10):
+    box.setPosition(p0 + .2 * np.random.randn(3))
+    komo.updateRootObjects(C) #only works for root object (the 'box' is one)
+    ret = ry.NLP_Solver(komo.nlp(), verbose=0 ) .solve()
+    print(ret)
+    q = komo.getPath()
+    C.setJointState(q[0])
+    C.view(False, 'IK solution - ' + ('*** INFEASIBLE ***' if not ret.feasible else 'feasible'))
+    time.sleep(1.)
+
+
+
+
+
+
+
+
+{ time: 0.001635, evals: 4, done: 1, feasible: 1, sos: 0.00352417, f: 0, ineq: 0, eq: 0.00216173 }
+{ time: 0.006638, evals: 10, done: 1, feasible: 1, sos: 0.0207398, f: 0, ineq: 0, eq: 0.00141503 }
+{ time: 0.006179, evals: 10, done: 1, feasible: 1, sos: 0.00694048, f: 0, ineq: 0, eq: 0.00205847 }
+{ time: 0.004424, evals: 8, done: 1, feasible: 1, sos: 0.0152197, f: 0, ineq: 0, eq: 0.000563407 }
+{ time: 0.008229, evals: 9, done: 1, feasible: 1, sos: 0.0104266, f: 0, ineq: 0, eq: 0.00210566 }
+{ time: 0.008263, evals: 11, done: 1, feasible: 1, sos: 0.0165911, f: 0, ineq: 0, eq: 0.00175033 }
+{ time: 0.009852, evals: 13, done: 1, feasible: 1, sos: 0.019162, f: 0, ineq: 0, eq: 0.0395863 }
+{ time: 0.005472, evals: 6, done: 1, feasible: 1, sos: 0.0100858, f: 0, ineq: 0, eq: 0.00146084 }
+{ time: 0.00514, evals: 6, done: 1, feasible: 1, sos: 0.00327878, f: 0, ineq: 0, eq: 0.000313999 }
+{ time: 0.007166, evals: 8, done: 1, feasible: 1, sos: 0.030142, f: 0, ineq: 0, eq: 0.00111338 }
+{ time: 0.249048, evals: 722, done: 1, feasible: 1, sos: 0.0802649, f: 0, ineq: 0, eq: 0.102041 }
+{ time: 0.051558, evals: 61, done: 1, feasible: 0, sos: 0.0779214, f: 0, ineq: 0, eq: 5.93279 }
+{ time: 0.018069, evals: 30, done: 1, feasible: 1, sos: 0.0117585, f: 0, ineq: 0, eq: 0.000335161 }
+{ time: 0.008438, evals: 14, done: 1, feasible: 1, sos: 0.0114653, f: 0, ineq: 0, eq: 0.000106237 }
+{ time: 0.005415, evals: 8, done: 1, feasible: 1, sos: 0.0221542, f: 0, ineq: 0, eq: 0.000198337 }
+{ time: 0.078182, evals: 94, done: 1, feasible: 1, sos: 0.0645244, f: 0, ineq: 0, eq: 0.116509 }
+{ time: 0.017363, evals: 28, done: 1, feasible: 1, sos: 0.0795235, f: 0, ineq: 0, eq: 0.000762834 }
+{ time: 0.127013, evals: 216, done: 1, feasible: 1, sos: 0.114508, f: 0, ineq: 0, eq: 0.481591 }
+{ time: 0.021431, evals: 61, done: 1, feasible: 1, sos: 0.0178241, f: 0, ineq: 0, eq: 0.000106574 }
+{ time: 0.018806, evals: 28, done: 1, feasible: 1, sos: 0.0130349, f: 0, ineq: 0, eq: 0.00024334 }
+
+
+

So the solver finds feasible grasps and exploits the null space of the constraints (grasps from different directions, but always opposing the y-planes).
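As a quick sanity check on the last (feasible) solution, the two orthogonality features can be evaluated directly on C; both values should be close to zero. A hedged sketch (assuming C.eval returns a (value, Jacobian) pair):

# evaluate the grasp-orientation features on the current configuration (sketch)
yXX, _ = C.eval(ry.FS.scalarProductXX, ['l_gripper', 'box'])
yXZ, _ = C.eval(ry.FS.scalarProductXZ, ['l_gripper', 'box'])
print('scalarProductXX:', yXX, '  scalarProductXZ:', yXZ)  # both ~0 when the last solve was feasible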

+

To do this properly, we should actually test all three possible grasps – so let’s define 3 IK problems, solve each, and pick the best:

+
+
[14]:
+
+
+
del komo
+komo = []
+for k in range(3):
+    komo.append(ry.KOMO(C, 1,1,0, True))
+    komo[k].addObjective([], ry.FS.jointState, [], ry.OT.sos, [1e-1], qHome)
+    komo[k].addObjective([], ry.FS.accumulatedCollisions, [], ry.OT.eq)
+    komo[k].addObjective([], ry.FS.jointLimits, [], ry.OT.ineq)
+    komo[k].addObjective([], ry.FS.positionDiff, ['l_gripper', 'box'], ry.OT.eq, [1e1])
+    komo[k].addObjective([], ry.FS.distance, ['l_palm', 'box'], ry.OT.ineq, [1e1])
+
+komo[0].addObjective([], ry.FS.scalarProductXY, ['l_gripper', 'box'], ry.OT.eq, [1e1], [0])
+komo[0].addObjective([], ry.FS.scalarProductXZ, ['l_gripper', 'box'], ry.OT.eq, [1e1], [0])
+
+komo[1].addObjective([], ry.FS.scalarProductXX, ['l_gripper', 'box'], ry.OT.eq, [1e1], [0])
+komo[1].addObjective([], ry.FS.scalarProductXZ, ['l_gripper', 'box'], ry.OT.eq, [1e1], [0])
+
+komo[2].addObjective([], ry.FS.scalarProductXX, ['l_gripper', 'box'], ry.OT.eq, [1e1], [0])
+komo[2].addObjective([], ry.FS.scalarProductXY, ['l_gripper', 'box'], ry.OT.eq, [1e1], [0])
+
+
+
+
+
[15]:
+
+
+
for t in range(10):
+    box.setPosition(p0 + .2 * np.random.randn(3))
+    box.setQuaternion(np.random.randn(4)) #random orientation
+
+    score = []
+    for k in range(3):
+        komo[k].updateRootObjects(C)
+        ret = ry.NLP_Solver(komo[k].nlp(), verbose=0 ) .solve()
+        score.append( 100.*(ret.eq+ret.ineq) + ret.sos )
+
+    k = np.argmin(score)
+    C.setJointState(komo[k].getPath()[0])
+    C.view(False, f'IK solution {k} - ' + ('*** INFEASIBLE ***' if not ret.feasible else 'feasible'))
+    time.sleep(1.)
+
+
+
+
+
[16]:
+
+
+
del komo
+del C
+
+
+
+
+
+

2.1.3.3. Waypoints example

+

Motion design can often be done by computing waypoints, i.e. a coarse (not finely resolved) sequence of poses. The BotOp interface can then spline-interpolate between them during execution.

+

Let’s define a configuration where the desired gripper waypoints are pre-defined as marker frames. (That’s a common pattern: Simplify defining constraints by adding helper reference frames in the configuration.)

+
+
[17]:
+
+
+
C = ry.Config()
+C.addFile(ry.raiPath('scenarios/pandaSingle.g'))
+C.addFrame('way1'). setShape(ry.ST.marker, [.1]) .setPosition([.4, .2, 1.])
+C.addFrame('way2'). setShape(ry.ST.marker, [.1]) .setPosition([.4, .2, 1.4])
+C.addFrame('way3'). setShape(ry.ST.marker, [.1]) .setPosition([-.4, .2, 1.])
+C.addFrame('way4'). setShape(ry.ST.marker, [.1]) .setPosition([-.4, .2, 1.4])
+C.view()
+
+
+
+
+
[17]:
+
+
+
+
+0
+
+
+
+
[18]:
+
+
+
komo = ry.KOMO(C, 4, 1, 1, False)
+komo.addControlObjective([], 0, 1e-1)
+komo.addControlObjective([], 1, 1e0)
+komo.addObjective([1], ry.FS.positionDiff, ['l_gripper', 'way1'], ry.OT.eq, [1e1])
+komo.addObjective([2], ry.FS.positionDiff, ['l_gripper', 'way2'], ry.OT.eq, [1e1])
+komo.addObjective([3], ry.FS.positionDiff, ['l_gripper', 'way3'], ry.OT.eq, [1e1])
+komo.addObjective([4], ry.FS.positionDiff, ['l_gripper', 'way4'], ry.OT.eq, [1e1])
+
+ret = ry.NLP_Solver(komo.nlp(), verbose=0 ) .solve()
+print(ret)
+q = komo.getPath()
+print(q)
+
+for t in range(4):
+    C.setJointState(q[t])
+    C.view(False, f'waypoint {t}')
+    time.sleep(1)
+
+
+
+
+
+
+
+
+{ time: 0.003266, evals: 10, done: 1, feasible: 1, sos: 2.39494, f: 0, ineq: 0, eq: 0.000305648 }
+[[-0.35382383 -0.05464486 -0.41770589 -2.0833263  -0.05951381  2.17630166
+  -0.49971152]
+ [-0.29434065 -0.3758991  -0.40467003 -1.73241762 -0.02310692  2.33740027
+  -0.49945799]
+ [ 0.44111823 -0.06340917  0.31723911 -2.1024485   0.12222711  2.20326576
+  -0.49926996]
+ [ 0.43343189 -0.36084256  0.2768184  -1.74048984  0.12188603  2.34753359
+  -0.49916996]]
+
+
+

The KOMO constructor has arguments:
* config: the configuration, which is copied once (for IK) or many times (for waypoints/paths) to be the optimization variable
* phases: the number P of phases (which essentially defines the real-valued interval [0,P] over which objectives can be formulated)
* stepsPerPhase: the step discretization per phase -> in total we have phases*stepsPerPhase configurations, which form the path and over which we optimize
* k_order: the “Markov-order”, i.e., the maximal tuple of configurations over which we formulate features (e.g. to take finite differences)

+

In our waypoint case: We have 4 phases, one for each waypoint. We don’t sub-sample the motion between waypoints, which is why we have stepsPerPhase=1. We formulate this as a first-order (k_order=1) problem: some features take the finite difference between consecutive configurations (namely, to penalize velocities).
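The phases*stepsPerPhase count can be checked against the two setups used in this tutorial (a trivial sketch; the 40 matches the path size printed in the next section):

phases, stepsPerPhase = 4, 1
print('optimized configurations (waypoint case):', phases * stepsPerPhase)  # -> 4
phases, stepsPerPhase = 4, 10
print('optimized configurations (path case):', phases * stepsPerPhase)      # -> 40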

+

The addControlObjective is /almost/ the same as adding a FS.jointState objective: It penalizes distances in joint space. It has three arguments:
* times: (as for addObjective) the phase-interval in which this objective holds; [] means all times
* order: do we penalize the jointState directly (order=0: penalizing the sqr distance to qHome), velocities (order=1: penalizing sqr distances between consecutive configurations), or accelerations (order=2: penalizing accelerations across 3 configurations)?
* scale: as usual, but modulated by a factor “sqrt(delta t)” that ensures that total control costs are approximately independent of the choice of stepsPerPhase

+

In our waypoint case: We add control costs for both: homing (order 0, ensuring we stay close to the home pose) and velocities (order 1, penalizing movement between waypoints).

+

And the addObjective method now makes use of the times argument: specifying [1] means that this objective only holds in the interval [1,1], i.e. at phase-time 1 only.
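To illustrate the waypoint idea from the beginning of this section – execute a coarse sequence of poses by interpolating between them – here is a minimal sketch that linearly densifies the 4 optimized waypoints in q and replays them (plain linear interpolation stands in for the spline interpolation BotOp would use):

# linearly interpolate between consecutive waypoints to obtain a denser joint-space path (sketch)
dense = []
for i in range(len(q) - 1):
    for s in np.linspace(0., 1., 10, endpoint=False):
        dense.append((1. - s) * q[i] + s * q[i + 1])
dense.append(q[-1])

for qt in dense:
    C.setJointState(qt)
    C.view(False, 'interpolated waypoint path')
    time.sleep(.05)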

+
+
+

2.1.3.4. Path example

+

Let’s do almost the same, but for a fine path. First with order=1, leading to a zig-zag path, then with order=2, leading to a smooth path.

+
+
[19]:
+
+
+
# Note, the stepsPerPhase=10 is the only difference to above
+C.setJointState(qHome)
+komo = ry.KOMO(C, 4, 10, 1, False)
+komo.addControlObjective([], 0, 1e-1) # what happens if you change weighting to 1e0? why?
+komo.addControlObjective([], 1, 1e0)
+komo.addObjective([1], ry.FS.positionDiff, ['l_gripper', 'way1'], ry.OT.eq, [1e1])
+komo.addObjective([2], ry.FS.positionDiff, ['l_gripper', 'way2'], ry.OT.eq, [1e1])
+komo.addObjective([3], ry.FS.positionDiff, ['l_gripper', 'way3'], ry.OT.eq, [1e1])
+komo.addObjective([4], ry.FS.positionDiff, ['l_gripper', 'way4'], ry.OT.eq, [1e1])
+
+ret = ry.NLP_Solver(komo.nlp(), verbose=0 ) .solve()
+print(ret)
+q = komo.getPath()
+print('size of path:', q.shape)
+
+for t in range(q.shape[0]):
+    C.setJointState(q[t])
+    C.view(False, f'waypoint {t}')
+    time.sleep(.1)
+
+
+
+
+
+
+
+
+{ time: 0.017709, evals: 11, done: 1, feasible: 1, sos: 2.51987, f: 0, ineq: 0, eq: 0.00176075 }
+size of path: (40, 7)
+
+
+
+
[20]:
+
+
+
# only differences: the k_order=2, control objective order 2, constrain final jointState velocity to zero
+C.setJointState(qHome)
+komo = ry.KOMO(C, 4, 10, 2, False)
+komo.addControlObjective([], 0, 1e-1) # what happens if you change weighting to 1e0? why?
+komo.addControlObjective([], 2, 1e0)
+komo.addObjective([1], ry.FS.positionDiff, ['l_gripper', 'way1'], ry.OT.eq, [1e1])
+komo.addObjective([2], ry.FS.positionDiff, ['l_gripper', 'way2'], ry.OT.eq, [1e1])
+komo.addObjective([3], ry.FS.positionDiff, ['l_gripper', 'way3'], ry.OT.eq, [1e1])
+komo.addObjective([4], ry.FS.positionDiff, ['l_gripper', 'way4'], ry.OT.eq, [1e1])
+komo.addObjective([4], ry.FS.jointState, [], ry.OT.eq, [1e1], [], order=1)
+
+ret = ry.NLP_Solver(komo.nlp(), verbose=0 ) .solve()
+print(ret)
+q = komo.getPath()
+print('size of path:', q.shape)
+
+for t in range(q.shape[0]):
+    C.setJointState(q[t])
+    C.view(False, f'waypoint {t}')
+    time.sleep(.1)
+
+
+
+
+
+
+
+
+{ time: 0.042404, evals: 25, done: 1, feasible: 1, sos: 16.5162, f: 0, ineq: 0, eq: 0.000765251 }
+size of path: (40, 7)
+
+
+

Notice the new last objective! Without it, the final velocity would not be zero. The last objective constrains the order=1 derivative (i.e. the velocity!) of the jointState feature to be zero.
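A quick numeric check of that claim on the optimized path: the last finite difference along q approximates the final joint velocity and should be close to zero (a simple sketch using only numpy):

# the difference between the last two configurations approximates the final velocity (sketch)
final_step = q[-1] - q[-2]
print('max |q[-1] - q[-2]|:', np.max(np.abs(final_step)))  # ~0 thanks to the order=1 objective at time [4]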

+

Let’s plot the trajectory:

+
+
[21]:
+
+
+
import matplotlib.pyplot as plt
+plt.plot(q)
+plt.show()
+
+
+
+
+
+
+
+../_images/notebooks_1c-komo_36_0.png +
+
+
+
[22]:
+
+
+
del C
+
+
+
+
+
[ ]:
+
+
+

+
+
+
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file
xwbtsK14MPwf3QgzCXlelX0EVsUQSRubCcgsj1CCGgjoykQ0pXJqwkuruQ7OxCanDwhs9h8HplMPH5YC7zTYYUnw/m4mIY8/KWbVLXYjEkOjsRb2lFvCUdOFpakOrrm/F4xWKBtaFB1nakw4e5spI1TnTT1FAYoQMHZCh5882sIcHWNWvg2bsXnkf2MpQsIarfj1BTE4Kvv4HQW29lfTk0uFxypt7dD8H1wAOLev6OxYJBZJnRYrHJ2pTOLiR7epDs7ZFT/vb0QPX7b/wkRqMcmlxUBFNhoVxm2s7PW3RTh6uhcPp8e9J9dHrlufemz/86nafNPh+sjY2wNjbA1tAAa2MjLFVVi+4caelS/X4E39iHwC9/ifA772TNHcNQsrglBwYQfOMNhPbtQ/jd94BUKnOfqbQU7ocegnv3Q3DcdRf78M0Sg8gKo4XDSPb2IjHxQZ1Zyw9rdWTkutMEX83gcMDg8cDodk9Zu2F0ezJro8cNg8sNxWKBYjQARlN6bYSSXnD1WghokQi0cFiup26Hp+9PDg4g1dN7U0HL4HLBWl8vA0f66rPW+noY3e7b+dESzUpqbAzBN95A8NXXpoeS1avheWQv3Hv3wlpTo2MpV7b4pUsIvv4Ggm+8gdiZM1n3Wevr4dqzG+49e9gZ+TYxiFAWkUrJociDQ0gNDSE1OCjXV28PD0+/4NoiYfB6M01Olqn9Y9Jrg8fDPxq0qKTGxhDatw+BV19D+MiR7FDS2ChDyQc+ICe54+/uvBGpFKInTyJ48CBC+w/I67dMUBTYN22Ce88euHc/xFqrOcQgQrdEqCpUvx9aMAg1EIQWDEANBKEGA9Cy1kFogQDUYFDOr6KqEKoKaCqEqkGoKUDV5L6J+9J/hA0OBwxO5+Ta6cjapzgcMDqdUBwOmAoKYPaxMy4tfVmh5J13spoAzBUVcO3aBfeDu9gEMEdSY2MIv/WWnOPjzTezhmDDbIbznrvh3r0H7ocehKmwUL+CLmMMIkREi5Q6Po7gvn0IvPYaIkfeyZos0eB0wnn//XA9uAuunTthys3Vr6BLiBAC8dZWGTyamhA9edX8Hjk5cO54AO5du+DcsYNfbBYAgwgR0RKghcMIHzmC4IEDCDUdyr5sQLrZwPXgg3A/uAuWVavYhDOFGgojevyYvMrywYNI9vZm3W9tbIRr5064du2CfeOGZT2d+mLEIEJEtMQITUPs7FkZSg42IX7hQtb95rIyOLZtg+Ouu+DYehfMFRUrKpiooRCix48j8t57CL/7HmLnzmX1u1GsVjjvvhuuXTvh2rkTZp9Px9ISgwgR0RKX7OtD6OBBBA8elE04V83+ayoqyoQS+5YtsK5atazmClKDQUTSwSMyETyu6kxvrqiA89574dq1E86774bBbteptHQ1BhEiomVEi0QQOXZMXvn52DFEz5zJmkQNkLMD2++6S4aTu+6CbXUjFLNZpxLPjkil0pMStiB6+gwi776L2Pnz04NHZSUc27bCuW0bHFu38urWixiDCBHRMqbFYoi+fxqRY+/JYHLyFEQsln2Q2QxLVSWstXWw1NXCWrcK1rpaWGpqYLDZdCm3EAKpgQE5A3JLC+KtrYi1tCLR3j7j9Z7MVZUydEwEjxV8EbmlhkGEiGgFEYkEYufPT9aaHD8OLRic+WBFgbmsTIaT2joZTmprYczJgcHpgtHtgmK3z7r/iRACIhqFOj4uF79fXrV8ZBTx9jZ5CYbW1uyhtFOLZbfDWl8P2+rVcGzdCse2rTAXF8/2R0GLBIMIEdEKJjQNyd4+JC61I95+aXLd3n5zl4QwGGBwuWBwOWF0uadsu2BwuQEhJgPHlOAxU63GNEYjLNXVsDbUy8suNDTAWl8Pc3n5surjstIxiBAR0TRCCKijo4i3tWeFlMSVTjlRYSh0+7Mrm80w5nhhysmB0ZsDY24OLFVVMnA0NMBSWwsDJ21b9mbz+c0rfxERrRCKosiLX+bnw7l927T7M80rwRC0cAhaSC5qML0dDkENBgFFgTEnB0avV65zcmTwyMmB4nCsqGHFdPsYRIiICIAMKopDXnYBKNK7OLRCsEGOiIiIdMMgQkRERLphECEiIiLdMIgQERGRbhhEiIiISDcMIkRERKQbBhEiIiLSDYMIERER6YZBhIiIiHTDIEJERES6YRAhIiIi3TCIEBERkW4YRIiIiEg3i/rqu0IIAEAgENC5JERERHSzJj63Jz7Hr2dRB5FgMAgAqKio0LkkRERENFvBYBBer/e6xyjiZuKKTjRNQ29vL9xuNxRFmdPnDgQCqKioQFdXFzwez5w+92KyEs5zJZwjwPNcbniey8dKOEdgducphEAwGITP54PBcP1eIIu6RsRgMKC8vHxeX8Pj8SzrX5wJK+E8V8I5AjzP5YbnuXyshHMEbv48b1QTMoGdVYmIiEg3DCJERESkmxUbRKxWK7785S/DarXqXZR5tRLOcyWcI8DzXG54nsvHSjhHYP7Oc1F3ViUiIqLlbcXWiBAREZH+GESIiIhINwwiREREpBsGESIiItLNigwi3/nOd1BdXQ2bzYbt27fj3Xff1btIc+orX/kKFEXJWlavXq13sW7boUOH8Pjjj8Pn80FRFPzkJz/Jul8IgT//8z9HaWkp7HY79uzZg9bWVn0KextudJ6f/vSnp72/jzzyiD6FvUXPPvsstm7dCrfbjaKiInzkIx9Bc3Nz1jGxWAxPPfUU8vPz4XK58PGPfxwDAwM6lfjW3Mx57tq1a9r7+Xu/93s6lfjW/MM//AM2bNiQmejqnnvuwSuvvJK5fzm8l8CNz3M5vJdX+9rXvgZFUfDFL34xs2+u388VF0R++MMf4ktf+hK+/OUv48SJE9i4cSP27t2LwcFBvYs2p9atW4e+vr7M8tZbb+ldpNsWDoexceNGfOc735nx/q9//ev41re+he9+97s4evQonE4n9u7di1gstsAlvT03Ok8AeOSRR7Le33//939fwBLevqamJjz11FN455138PrrryOZTOLhhx9GOBzOHPOHf/iH+PnPf44XXngBTU1N6O3txcc+9jEdSz17N3OeAPCZz3wm6/38+te/rlOJb015eTm+9rWv4fjx4zh27BgeeughPPHEEzh37hyA5fFeAjc+T2Dpv5dTvffee/je976HDRs2ZO2f8/dTrDDbtm0TTz31VOa2qqrC5/OJZ599VsdSza0vf/nLYuPGjXoXY14BEC+99FLmtqZpoqSkRHzjG9/I7BsfHxdWq1X8+7//uw4lnBtXn6cQQjz55JPiiSee0KU882VwcFAAEE1NTUII+d6ZzWbxwgsvZI65cOGCACCOHDmiVzFv29XnKYQQO3fuFF/4whf0K9Q8yc3NFd///veX7Xs5YeI8hVhe72UwGBT19fXi9ddfzzqv+Xg/V1SNSCKRwPHjx7Fnz57MPoPBgD179uDIkSM6lmzutba2wufzoba2Fr/xG7+Bzs5OvYs0rzo6OtDf35/13nq9Xmzfvn3ZvbcAcPDgQRQVFaGxsRGf+9znMDIyoneRbovf7wcA5OXlAQCOHz+OZ
DKZ9X6uXr0alZWVS/r9vPo8J/zrv/4rCgoKsH79ejzzzDOIRCJ6FG9OqKqKH/zgBwiHw7jnnnuW7Xt59XlOWC7v5VNPPYUPfvCDWe8bMD//Nhf1Re/m2vDwMFRVRXFxcdb+4uJiXLx4UadSzb3t27fjueeeQ2NjI/r6+vDVr34VDzzwAM6ePQu326138eZFf38/AMz43k7ct1w88sgj+NjHPoaamhq0t7fjT//0T/Hoo4/iyJEjMBqNehdv1jRNwxe/+EXcd999WL9+PQD5flosFuTk5GQdu5Tfz5nOEwD+w3/4D6iqqoLP58Pp06fxJ3/yJ2hubsaLL76oY2ln78yZM7jnnnsQi8Xgcrnw0ksvYe3atTh16tSyei+vdZ7A8nkvf/CDH+DEiRN47733pt03H/82V1QQWSkeffTRzPaGDRuwfft2VFVV4fnnn8fv/u7v6lgymgu//uu/ntm+4447sGHDBtTV1eHgwYPYvXu3jiW7NU899RTOnj27LPoxXc+1zvOzn/1sZvuOO+5AaWkpdu/ejfb2dtTV1S10MW9ZY2MjTp06Bb/fjx/96Ed48skn0dTUpHex5ty1znPt2rXL4r3s6urCF77wBbz++uuw2WwL8porqmmmoKAARqNxWu/egYEBlJSU6FSq+ZeTk4OGhga0tbXpXZR5M/H+rbT3FgBqa2tRUFCwJN/fz3/+83j55Zdx4MABlJeXZ/aXlJQgkUhgfHw86/il+n5e6zxnsn37dgBYcu+nxWLBqlWrsGXLFjz77LPYuHEj/u7v/m7ZvZfXOs+ZLMX38vjx4xgcHMSdd94Jk8kEk8mEpqYmfOtb34LJZEJxcfGcv58rKohYLBZs2bIF+/bty+zTNA379u3LauNbbkKhENrb21FaWqp3UeZNTU0NSkpKst7bQCCAo0ePLuv3FgC6u7sxMjKypN5fIQQ+//nP46WXXsL+/ftRU1OTdf+WLVtgNpuz3s/m5mZ0dnYuqffzRuc5k1OnTgHAkno/Z6JpGuLx+LJ5L69l4jxnshTfy927d+PMmTM4depUZrnrrrvwG7/xG5ntOX8/b79v7dLygx/8QFitVvHcc8+J8+fPi89+9rMiJydH9Pf36120OfNHf/RH4uDBg6Kjo0McPnxY7NmzRxQUFIjBwUG9i3ZbgsGgOHnypDh58qQAIP7mb/5GnDx5Uly5ckUIIcTXvvY1kZOTI37605+K06dPiyeeeELU1NSIaDSqc8ln53rnGQwGxX/5L/9FHDlyRHR0dIg33nhD3HnnnaK+vl7EYjG9i37TPve5zwmv1ysOHjwo+vr6MkskEskc83u/93uisrJS7N+/Xxw7dkzcc8894p577tGx1LN3o/Nsa2sT//2//3dx7Ngx0dHRIX7605+K2tpasWPHDp1LPjtPP/20aGpqEh0dHeL06dPi6aefFoqiiF/96ldCiOXxXgpx/fNcLu/lTK4eDTTX7+eKCyJCCPHtb39bVFZWCovFIrZt2ybeeecdvYs0pz71qU+J0tJSYbFYRFlZmfjUpz4l2tra9C7WbTtw4IAAMG158sknhRByCO9/+2//TRQXFwur1Sp2794tmpub9S30LbjeeUYiEfHwww+LwsJCYTabRVVVlfjMZz6z5IL0TOcHQPzTP/1T5phoNCr+83/+zyI3N1c4HA7x0Y9+VPT19elX6Ftwo/Ps7OwUO3bsEHl5ecJqtYpVq1aJP/7jPxZ+v1/fgs/S7/zO74iqqiphsVhEYWGh2L17dyaECLE83kshrn+ey+W9nMnVQWSu309FCCFurS6FiIiI6PasqD4iREREtLgwiBAREZFuGESIiIhINwwiREREpBsGESIiItINgwgRERHphkGEiIiIdMMgQkRERLphECEiIiLdMIgQERGRbhhEiIiISDcMIkRERKSb/x+2OJ/3V3U2MgAAAABJRU5ErkJggg==", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "import matplotlib.pyplot as plt\n", + "plt.plot(q)\n", + "plt.show()" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "id": "08aacd1a", + "metadata": {}, + "outputs": [], + "source": [ + "del C" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c209e80b", + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.10" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/docs/notebooks/botop1-motion.html b/docs/notebooks/botop1-motion.html deleted file mode 100644 index 5f118b7..0000000 --- a/docs/notebooks/botop1-motion.html +++ /dev/null @@ -1,256 +0,0 @@ - - - - - - - 2.3.1. Interacting with a Sim/Real: Setting spline references — Robotics Course documentation - - - - - - - - - - - - - - - - - - - - - -
2.3.1. Interacting with a Sim/Real: Setting spline references

  • BotOp is a generic abstraction for interacting with a simulated or real robot
  • The BotOp class has only a few methods, which form a deliberately narrow bottleneck between user code and the robot
  • The move methods set/overwrite a spline reference for the robot (compliance around the reference can also be set)
  • The gripper methods operate the grippers
  • The getImage.. methods grab images or point clouds from the camera
  • The simulation can be run in several modes: purely kinematic (no physics for objects), a physics simulator with physics for objects but a still kinematic robot, or a physics simulator with PD motors driving the robot; a minimal usage sketch follows this list
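Taken together, the cells below boil down to roughly the following minimal script (a sketch built only from calls that appear in this tutorial; the scenario path is the one loaded below and may differ on your installation):

    from robotic import ry

    # minimal BotOp loop, assuming the pandaSingle scenario shipped with the package
    C = ry.Config()
    C.addFile(ry.raiPath('../rai-robotModels/scenarios/pandaSingle.g'))

    bot = ry.BotOp(C, useRealRobot=False)   # False -> talk to the simulation
    bot.home(C)                             # send the robot to its home reference

    q = bot.get_q()                         # current joint vector
    q[1] = q[1] - .1                        # perturb one joint
    bot.moveTo(q, 2)                        # set a new spline reference (~2 s)

    while bot.getTimeToEnd() > 0:           # poll until the reference is reached
        bot.sync(C, .1)                     # keep the workspace C in sync with the robot

    del bot                                 # delete bot before C, as in the cleanup cell below
    del C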
-
[ ]:
-
-
-
from robotic import ry
-import numpy as np
-import time
-
-
-
-
-
[ ]:
-
-
-
ry.params_add({'botsim/verbose': 1., 'physx/motorKp': 10000., 'physx/motorKd': 1000.})
-ry.params_print()
-
-
-
-
-
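A commented variant of the same parameter call, for reference; the assumption that parameters should be added before constructing BotOp follows from how they are used in this notebook and is not stated explicitly here:

    # assumption: add global parameters *before* creating ry.BotOp, since the
    # simulation reads them when it is constructed
    ry.params_add({'botsim/verbose': 1.,      # verbosity of the BotOp simulation
                   'physx/motorKp': 10000.,   # PD position gain of the simulated motors
                   'physx/motorKd': 1000.})   # PD velocity gain of the simulated motors
    ry.params_print()                         # print the effective parameter set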
[ ]:
-
-
-
C = ry.Config()
-C.addFile(ry.raiPath('../rai-robotModels/scenarios/pandaSingle.g'))
-C.view(False, 'this is your workspace data structure C -- NOT THE SIMULATION')
-
-
-
-
-
[ ]:
-
-
-
bot = ry.BotOp(C, False)
-# note that in sim the arm immediately sags/falls -- it is free floating until we send it a reference (the next cell sends it home)
-
-
-
-
-
[ ]:
-
-
-
# we need to control it somehow, e.g. to home
-bot.home(C)
-
-
-
-
-
[ ]:
-
-
-
qHome = bot.get_q()
-q = bot.get_q()
-print(q)
-q[1] = q[1] - .1
-print(q)
-
-
-
-
-
[ ]:
-
-
-
bot.moveTo(q, 2)
-
-while bot.getTimeToEnd()>0:
-    bot.sync(C, .1)
-
-
-
-
-
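The getTimeToEnd()/sync() polling loop above is the standard way to wait for a spline reference to finish; wrapped into a small helper (a sketch, not part of the BotOp API), the pattern becomes reusable:

    def wait_until_done(bot, C, dt=.1):
        # block until the currently set spline reference has been reached,
        # keeping the workspace configuration C in sync while waiting
        while bot.getTimeToEnd() > 0:
            bot.sync(C, dt)

    # usage, e.g. after bot.moveTo(q, 2):
    # wait_until_done(bot, C)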
[ ]:
-
-
-
bot.home(C)
-
-
-
-
-
[ ]:
-
-
-
bot.gripperClose(ry._left)
-
-
-
-
-
[ ]:
-
-
-
bot.gripperOpen(ry._left)
-
-
-
-
-
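The two gripper cells can be combined into one sequence; the fixed sync loop below is only a rough stand-in for waiting until the gripper has finished, since the cells here do not show a dedicated wait call:

    bot.gripperClose(ry._left)      # close the left gripper
    for _ in range(20):             # ~2 seconds of syncing while it moves (rough guess)
        bot.sync(C, .1)
    bot.gripperOpen(ry._left)       # open it again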
[ ]:
-
-
-
bot.sync(C, .0)
-
-
-
-
-
[ ]:
-
-
-
del bot
-del C
-
-
-
-
-
[ ]:
-
-
-

-
-
-
-
- - - - \ No newline at end of file diff --git a/docs/notebooks/botop1-motion.ipynb b/docs/notebooks/botop1-motion.ipynb deleted file mode 100644 index af21910..0000000 --- a/docs/notebooks/botop1-motion.ipynb +++ /dev/null @@ -1,182 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "id": "e193f01d", - "metadata": {}, - "source": [ - "# Interacting with a Sim/Real: Setting spline references\n", - "* BotOp is a generic abstraction for interacting with a simulated or real robot\n", - "* The BotOp class only has a few methods, which should be a clear bottleneck between user code and the robot\n", - "* The move methods set/overwrite a spline reference for the robot. (Also compliance around the reference can be set.)\n", - "* The gripper methods operate grippers\n", - "* The getImage.. methods grap images or point clouds from the camera\n", - "* The simulation can be run in many different modes: pure kinematic (no physics for objects), a physics simulator with physics for objects but still kinematic robot, a physic simulator with PD motors for the robot." - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "31a434d3", - "metadata": {}, - "outputs": [], - "source": [ - "from robotic import ry\n", - "import numpy as np\n", - "import time" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "bb4031b4", - "metadata": {}, - "outputs": [], - "source": [ - "ry.params_add({'botsim/verbose': 1., 'physx/motorKp': 10000., 'physx/motorKd': 1000.})\n", - "ry.params_print()" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "d1bff41b", - "metadata": {}, - "outputs": [], - "source": [ - "C = ry.Config()\n", - "C.addFile(ry.raiPath('../rai-robotModels/scenarios/pandaSingle.g'))\n", - "C.view(False, 'this is your workspace data structure C -- NOT THE SIMULTATION')" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "6832eb10", - "metadata": {}, - "outputs": [], - "source": [ - "bot = ry.BotOp(C, False)\n", - "#note that in sim, arms are going down! free floating..." - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "ee98b4f8", - "metadata": {}, - "outputs": [], - "source": [ - "# we need to control it somehow, e.g. 
to home\n", - "bot.home(C)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "afe800f7", - "metadata": {}, - "outputs": [], - "source": [ - "qHome = bot.get_q()\n", - "q = bot.get_q()\n", - "print(q)\n", - "q[1] = q[1] - .1\n", - "print(q)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "443856f3", - "metadata": {}, - "outputs": [], - "source": [ - "bot.moveTo(q, 2)\n", - "\n", - "while bot.getTimeToEnd()>0:\n", - " bot.sync(C, .1)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "77bee28c", - "metadata": {}, - "outputs": [], - "source": [ - "bot.home(C)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "9b62c7c5", - "metadata": {}, - "outputs": [], - "source": [ - "bot.gripperClose(ry._left)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "1189fd71", - "metadata": {}, - "outputs": [], - "source": [ - "bot.gripperOpen(ry._left)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "66c6a195", - "metadata": {}, - "outputs": [], - "source": [ - "bot.sync(C, .0)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "2ea154ed", - "metadata": {}, - "outputs": [], - "source": [ - "del bot\n", - "del C" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "24342be8", - "metadata": {}, - "outputs": [], - "source": [] - } - ], - "metadata": { - "kernelspec": { - "display_name": "Python 3 (ipykernel)", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.8.10" - } - }, - "nbformat": 4, - "nbformat_minor": 5 -} diff --git a/docs/notebooks/botop2-realRobotStarters.html b/docs/notebooks/botop2-realRobotStarters.html index ed7686f..1b78875 100644 --- a/docs/notebooks/botop2-realRobotStarters.html +++ b/docs/notebooks/botop2-realRobotStarters.html @@ -4,7 +4,7 @@ - 2.3.2. Starting with a real robot — Robotics Course documentation + 2.2.7. Starting with a real robot — Robotics Course documentation @@ -23,8 +23,8 @@ - - + + @@ -50,19 +50,25 @@
  • 1. Getting Started
  • 2. Tutorials
      -
    • 2.1. Basics: Configurations, Features, etc
    • -
    • 2.2. KOMO: Optimization Problems to design motion
    • -
    • 2.3. BotOp: Interface to Sim/Real
        -
      • 2.3.1. Interacting with a Sim/Real: Setting spline references
      • -
      • 2.3.2. Starting with a real robot
      • 3. Lecture Script
      • @@ -85,7 +91,7 @@
        • - +
        @@ -95,7 +101,7 @@
        -

        2.3.2. Starting with a real robot

        +

        2.2.7. Starting with a real robot

        This starts exactly like botop1-motion, only constructing the interface with BotOp(C, useRealRobot=True) instead of passing False.
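        A minimal sketch of that switch -- identical setup, but with useRealRobot=True; it assumes the real-robot drivers are installed and the robot is reachable:

            from robotic import ry

            C = ry.Config()
            C.addFile(ry.raiPath('../rai-robotModels/scenarios/pandaSingle.g'))

            bot = ry.BotOp(C, useRealRobot=True)   # the only change w.r.t. botop1-motion
            bot.home(C)                            # CAUTION: this moves the real robot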

        [ ]:
        @@ -182,7 +188,7 @@ 

        2.3.2. Starting with a real robot

        -

        2.3.2.1. Advanced: Compliance & Force/Torque feedback

        +

        2.2.7.1. Advanced: Compliance & Force/Torque feedback

        [ ]:
         
        @@ -278,8 +284,8 @@

        2.3.2.1. Advanced: Compliance & Forc


        diff --git a/docs/notebooks/botop3-vision-toBeMerged.html b/docs/notebooks/botop3-vision-toBeMerged.html index e17e7c4..2394e30 100644 --- a/docs/notebooks/botop3-vision-toBeMerged.html +++ b/docs/notebooks/botop3-vision-toBeMerged.html @@ -22,8 +22,8 @@ - - + + @@ -370,8 +370,8 @@