+```
+The name and the uid are attributes of the returned Joint object.
+
+```python
+reachy.r_arm.r_shoulder_pitch.name
+>>> 'r_shoulder_pitch'
+reachy.r_arm.r_shoulder_pitch.uid
+>>> 8
+```
+
+Joints in Reachy are abstract elements that do not correspond to a single physical part: a joint is driven by several motors of an actuator. The only things you can do at the joint level are reading the **present_position** and sending a **goal_position**.
+
+#### present_position
+
+You can get the present position of each joint with this attribute.
+
+```python
+reachy.r_arm.r_shoulder_pitch.present_position
+>>> 22.4
+```
+
+> present_position is returned in **degrees**.
+
+#### goal_position
+
+The *goal_position* attribute of a joint can be used to set a new target position and make the joint move. However, we recommend using the [**goto_joints() method**]({{< ref "developing-with-reachy-2/basics/3-basic-arm-control#goto_joints" >}}) to move the motors, as it provides better control over the joints' trajectories.
+
+Using goal_position will make the motor move **as fast as it can**, so be careful when using it.
+```python
+reachy.r_arm.r_elbow_pitch.goal_position = -90
+```
+
+> goal_position must be written in **degrees**.
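+
+Since writing goal_position moves the motor at full speed, a common workaround when you cannot use `goto_joints()` is to step toward the target yourself. Below is a minimal interpolation sketch: the `step_toward()` helper is our own illustration, not part of the SDK.
+
+```python
+def step_toward(current, target, max_step=2.0):
+    """Return the next intermediate goal, at most max_step degrees away."""
+    delta = target - current
+    if abs(delta) <= max_step:
+        return target
+    return current + max_step if delta > 0 else current - max_step
+
+position = 0.0
+while position != -90.0:
+    position = step_toward(position, -90.0)
+    # On a real robot you would write each intermediate goal at a fixed rate:
+    # reachy.r_arm.r_elbow_pitch.goal_position = position
+    # time.sleep(0.02)  # ~50 Hz
+```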
+
+## Arm movement methods
+### goto_joints()
+
+The **`goto_joints()`** method takes a seven-element-long list, with the angles in this order:
+- r_arm.shoulder.pitch
+- r_arm.shoulder.roll
+- r_arm.elbow.yaw
+- r_arm.elbow.pitch
+- r_arm.wrist.roll
+- r_arm.wrist.pitch
+- r_arm.wrist.yaw
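+
+Since the order of the seven angles matters, one way to avoid ordering mistakes is to build the list from a dictionary keyed by joint name. This is just a convenience sketch: the `JOINT_ORDER` constant and the `as_joint_list()` helper are our own, not part of the SDK.
+
+```python
+JOINT_ORDER = [
+    'shoulder.pitch', 'shoulder.roll',
+    'elbow.yaw', 'elbow.pitch',
+    'wrist.roll', 'wrist.pitch', 'wrist.yaw',
+]
+
+def as_joint_list(angles):
+    """Build the seven-element list expected by goto_joints(), defaulting to 0."""
+    return [angles.get(name, 0) for name in JOINT_ORDER]
+
+as_joint_list({'elbow.pitch': -90})
+# [0, 0, 0, -90, 0, 0, 0]
+```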
+
+Let's see an example of how to use it.
+
+You will use the `goto_joints()` method to place the right arm in a right-angled position. First, make sure that Reachy's right arm is placed over a clear table and that there are no obstacles in the way of its movement.
+
+The setup should look like this:
+
+{{< img-center "images/sdk/first-moves/base_pos.jpg" 500x "" >}}
+
+Let's define a list with **reachy.r_arm.elbow.pitch** at -90 degrees to set a right-angled position for the right arm:
+
+```python
+right_angled_pose = [0, 0, 0, -90, 0, 0, 0]
+```
+
+Then set the right arm motors to stiff mode and send the `goto_joints()` command to the right arm:
+
+```python
+reachy.r_arm.turn_on() # don't forget to turn the arm on
+
+reachy.r_arm.goto_joints(right_angled_pose)
+```
+
+
+The result should look like this:
+
+
+ {{< video "videos/sdk/goto.mp4" "80%" >}}
+
+
+Don't forget to put the right arm's joints back into compliant mode. Place your hand below the right arm's gripper to prevent the arm from falling hard onto the table.
+
+```python
+reachy.r_arm.turn_off()
+```
+
+> To find out whether you have to send positive or negative angles, read the next section on arm kinematics.
+
+### goto_matrix()
+
+The **`goto_matrix()`** method takes a 4x4 matrix expressing the target pose of Reachy 2's end effector in Reachy 2's coordinate system.
+
+> Read the next section on [Use arm kinematics]({{< ref "developing-with-reachy-2/basics/4-use-arm-kinematics" >}}) to better understand the use of the `goto_matrix()` method.
+
+## Gripper control
+
+### open()
+
+To **open a gripper**, call the **`open()`** method on the wanted gripper.
+It will open it entirely:
+```python
+reachy.r_arm.gripper.open()
+```
+
+### close()
+
+To **close a gripper**, call the **`close()`** method on the wanted gripper.
+It will close the gripper with an appropriate value: entirely closed if there is no object to grasp, or at a suitable opening if an object has been detected in the gripper during the closing:
+```python
+reachy.r_arm.gripper.close()
+```
+
+### opening
+
+The opening value corresponds to a **percentage of opening**, which means:
+- 0 is closed
+- 100 is open
+
+You can read the opening of the gripper through the opening attribute:
+```python
+reachy.r_arm.gripper.opening
+>>> 20 # almost closed
+```
+
+You can also control the opening of the gripper, using the **`set_opening()`** method.
+
+Send your custom opening value, still between 0 and 100, to the gripper with:
+```python
+reachy.r_arm.gripper.set_opening(50) # half-opened
+```
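+
+`set_opening()` expects a percentage, so it can be convenient to clamp computed values into the valid range first. The `clamped_opening()` helper below is our own sketch, not an SDK function.
+
+```python
+def clamped_opening(value):
+    """Clamp a requested gripper opening to the valid 0-100 range."""
+    return max(0, min(100, value))
+
+clamped_opening(150)  # 100: cannot open more than fully open
+clamped_opening(-10)  # 0: cannot close more than fully closed
+# reachy.r_arm.gripper.set_opening(clamped_opening(value))  # on a real robot
+```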
+
+> Note that a smart gripper control prevents the gripper from reaching the requested opening position if an object has been detected while closing the gripper.
+
+
+## Read arm position
+
+### get_joints_position()
+
+You can retrieve the position of each **arm joint** using the **`get_joints_position()`** method.
+
+This method returns a seven-element-long list, with the angles in this order:
+- r_arm.shoulder.pitch
+- r_arm.shoulder.roll
+- r_arm.elbow.yaw
+- r_arm.elbow.pitch
+- r_arm.wrist.roll
+- r_arm.wrist.pitch
+- r_arm.wrist.yaw
+
+> Angles are returned in **degrees** by default.
+
+```python
+reachy.r_arm.get_joints_position()
+>>> [7, 10, 4, -50, 4, 5, 7]
+
+# r_arm.shoulder.pitch=7,
+# r_arm.shoulder.roll=10,
+# r_arm.elbow.yaw=4,
+# r_arm.elbow.pitch=-50,
+# r_arm.wrist.roll=4,
+# r_arm.wrist.pitch=5,
+# r_arm.wrist.yaw=7,
+```
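+
+To make the returned list easier to read, you can pair the values with the joint names, since the order is fixed. This is a plain-Python sketch; the `JOINT_ORDER` list is our own shorthand for the order documented above.
+
+```python
+JOINT_ORDER = [
+    'shoulder.pitch', 'shoulder.roll',
+    'elbow.yaw', 'elbow.pitch',
+    'wrist.roll', 'wrist.pitch', 'wrist.yaw',
+]
+
+positions = [7, 10, 4, -50, 4, 5, 7]  # e.g. the list returned by get_joints_position()
+named = dict(zip(JOINT_ORDER, positions))
+named['elbow.pitch']
+# -50
+```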
+
+
+### End effector position
+
+You can get the end effector position in Reachy 2's coordinate system using forward kinematics.
+
+Call:
+```python
+reachy.l_arm.forward_kinematics()
+```
+to get the position of the left gripper in cartesian space.
+
+> Read the next section on [Use arm kinematics]({{< ref "developing-with-reachy-2/basics/4-use-arm-kinematics" >}}) to better understand the use of the `forward_kinematics()` method.
diff --git a/content/sdk/first-moves/kinematics.md b/content/developing-with-reachy-2/basics/4-use-arm-kinematics.md
similarity index 93%
rename from content/sdk/first-moves/kinematics.md
rename to content/developing-with-reachy-2/basics/4-use-arm-kinematics.md
index d3bb5015..9e5c4383 100644
--- a/content/sdk/first-moves/kinematics.md
+++ b/content/developing-with-reachy-2/basics/4-use-arm-kinematics.md
@@ -1,331 +1,332 @@
----
-title: "4. Use arms kinematics"
-description: "Presentation of Reachy's forward and inverse kinematics."
-lead: ""
-date: 2023-07-25T17:39:00+02:00
-lastmod: 2023-07-25T17:39:00+02:00
-draft: false
-type: docs
-images: []
-toc: true
-weight: "100"
----
-
-> Note : Make sure you checked the [safety page]({{< ref "sdk/getting-started/safety" >}}) before controlling the arm.
-
-## Arm coordinate system
-
-### Joint coordinates
-
-If you remember the [`goto_joint()` function]({{< ref "sdk/first-moves/arm#goto_joints" >}}), to generate a trajectory for the arm, you need to pass a list of joints with the requested position as argument.
-
-For example, to place the right arm in a right angled position, we defined the following list:
-
-```python
-right_angled_position = [0, 0, 0, -90, 0, 0, 0]
-```
-
-and then call the function with is:
-
-```python
-reachy.r_arm.goto_joints(right_angled_position)
-```
-
-In this basic arm control, we used what is called **joint coordinates** to move Reachy. This means that we controlled each joint separately.
-
-Controlling a robot in joint coordinates can be hard and is often far from what we actually do as humans. When we want to grasp an object in front of us, we think of where we should put our hand, not how to flex each individual muscle to reach this position. This approach relies on the cartesian coordinates: the 3D position and orientation in space, this is where the **kinematic model** comes into play.
-
-### Kinematic model
-
-The **kinematic model** describes the motion of a robot in mathematical form without considering the forces and torque affecting it. It only focuses on the geometric relationship between elements.
-
-We have defined the whole kinematic model of the arm. This means the translation and rotation required to go from one joint to the next one. On a right arm equipped with a gripper this actually look like this:
-
-|Motor|Translation|Rotation|
-|-----|-----------|--------|
-|r_arm.shoulder.pitch|(0, -0.019, 0)|(0, 1, 0)
-|r_arm.shoulder.roll|(0, 0, 0)|(1, 0, 0)
-|r_arm.elbow.yaw|(0, 0, -0.280)|(0, 0, 1)
-|r_arm.elbow.pitch|(0, 0, 0)|(0, 1, 0)
-|r_arm.wrist.roll|(0, 0, -0.120)|(0, 0, 1)
-|r_arm.wrist.pitch|(0, 0, 0)|(0, 1, 0)
-|r_arm.wrist.yaw|(0, 0, 0)|(1, 0, 0)
-|r_gripper|(0, ??, ??)|(0, 0, 0)
-
-
-To use and understand the kinematic model, you need to know how Reachy coordinate system is defined (from Reachy's perspective), see below:
-
-{{< img-center "images/sdk/first-moves/arm_axis.png" 400x "" >}}
-
-* the X axis corresponds to the forward arrow,
-* the Y axis corresponds to the right to left arrow,
-* the Z axis corresponds to the up arrow.
-
-The origin of this coordinate system is located in the upper part of the robot trunk, inside Reachy.
- Basically, if you imagine a segment going from the left shoulder to the right shoulder of the robot, the origin is the middle of this segment, which corresponds to behind the center of Pollen's logo on Reachy's torso.
-
-{{< img-center "images/sdk/first-moves/reachy_frame.jpg" 400x "" >}}
-
-> Units in this coordinate system are **meters**. So the point (0.3, -0.2, 0) is 30cm in front of the origin, 20cm to the right and at the same height.
-
-
-### End effector location
-
-We consider the end-effector to be in a virtual joint located in the gripper and referred as *'right_tip'* or *'left_tip'* in the [urdf file](https://github.com/pollen-robotics/reachy_kinematics/blob/master/reachy.URDF), as shown below.
-
-{{< img-center "images/sdk/first-moves/eef.png" 400x "" >}}
-
-The red dot corresponds to the *'right_tip'*.
-
-You can see the right and left end-effectors animated below.
-
-
- {{< video "videos/sdk/eef.mp4" "80%" >}}
-
-
-### Switching between joint and cartesian coordinates
-
-Forward and inverse kinematics are a way to go from one coordinates system to the other:
-
-* **forward kinematics: joint coordinates –> cartesian coordinates**,
-* **inverse kinematics: cartesian coordinates –> joint coordinates**.
-
-## Forward kinematics
-
-Using the kinematic model defined above, we can compute the 3D position and orientation of the right or left end-effector with the **`forward_kinematics()`** method.
-
-### forward_kinematics()
-
-Each arm has a **`forward_kinematics()`** method. To use it, you first need to connect to your Reachy.
-
-```python
-from reachy_sdk import ReachySDK
-
-reachy = ReachySDK(host='192.168.0.42') # Replace with the actual IP
-
-reachy.r_arm.forward_kinematics()
->>> array([[ 0.04622308, -0.03799621, -0.99820825, 0.31144822],
- [ 0.10976691, 0.99341829, -0.03273101, -0.19427524],
- [ 0.99288199, -0.1080573 , 0.05008958, -0.4255104 ],
- [ 0. , 0. , 0. , 1. ]])
-```
-
-The method returns a 4x4 matrix indicating the position and orientation of the end effector in Reachy 2's coordinate system.
-
-> By specifying no argument, it will give the current 3D position and orientation of the end effector.
-
-You can compute the forward kinematics of the arm for other joints positions, by giving as an argument a seven-element-long list, as for the `goto_joints()`method. The arm will not move, but you can get the target position and orientation of the arm in this configuration.
-
-For example, for the right arm right angled position:
-```python
-reachy.r_arm.forward_kinematics([0, 0, 0, -90, 0, 0, 0])
->>> array([[ 0.04622308, -0.03799621, -0.99820825, 0.31144822],
- [ 0.10976691, 0.99341829, -0.03273101, -0.19427524],
- [ 0.99288199, -0.1080573 , 0.05008958, -0.4255104 ],
- [ 0. , 0. , 0. , 1. ]])
-```
-
-### Understand the result
-The 4x4 matrix returned by the **`forward_kinematics()`** method is what is often called a **pose**. It actually encodes both the 3D translation (as a 3D vector) and the 3D rotation (as a 3x3 matrix) into one single representation.
-
-$$\begin{bmatrix}
-R_{11} & R_{12} & R_{13} & T_x\\\\\\
-R_{21} & R_{22} & R_{23} & T_y\\\\\\
-R_{31} & R_{32} & R_{33} & T_z\\\\\\
-0 & 0 & 0 & 1
-\end{bmatrix}$$
-
-The instruction
-
-```python
-reachy.r_arm.forward_kinematics()
-```
-
-returns the current pose of the right end-effector, based on the present position of every joint in the right arm.
-
-You can also compute the pose for a given joints position, to do that just pass the list of position as argument of forward_kinematics. Be careful to respect the order of the position you give and to give all the joints in the arm kinematic chain (i.e. from *shoulder_pitch* to *wrist_roll*).
-
-For example, we can compute the forward kinematics for the right-angle position we defined earlier.
-
-```python
-reachy.r_arm.forward_kinematics(right_angle_position)
->>> array([[ 0. , 0. , -1. , 0.3675],
- [ 0. , 1. , 0. , -0.202 ],
- [ 1. , 0. , 0. , -0.28 ],
- [ 0. , 0. , 0. , 1. ]])
-```
-
-With this result, we can tell that when the right arm is in the right angle position, the right end-effector is 37cm in front of the origin, 20cm to the left and 28cm below the origin.
-
-As of the rotation matrix, the identity matrix corresponds to the zero position of the robot which is when the hand is facing toward the bottom.
-
-Here we obtained the rotation matrix
-
-$$\begin{bmatrix}
-0 & 0 & -1\\\\\\
-0 & 1 & 0 \\\\\\
-1 & 0 & 0
-\end{bmatrix}$$
-
-We can use scipy to understand what this matrix represents.
-
-```python
-from scipy.spatial.transform import Rotation as R
-import numpy as np
-
-R.from_matrix([
- [0, 0, -1],
- [0, 1, 0],
- [1, 0, 0],
-]).as_euler('xyz', degrees=True)
->>> array([ 0. , -89.99999879, 0. ])
-```
-So scipy tells us that a rotation of -90° along the y axis has been made to get this matrix, which is coherent with the result because having the hand facing forward corresponds to this rotation according to Reachy's xyz axis that we saw above.
-
-## Inverse kinematics
-
-The inverse kinematics is the exact opposite of the forward kinematics. From a 4x4 pose in Reachy 2 coordinate system, it gives you a list of joints positions to reach this target.
-
-Knowing where you arm is located in the 3D space can be useful but most of the time what you want is to move the arm in cartesian coordinates. You want to have the possibility to say: “move your hand to [x, y, z] with a 90° rotation around the Y axis”. This is what **`goto_matrix()`**
-
-### inverse_kinematics()
-
-Each arm has an **`inverse_kinematics()`** method. To use it, you first need to connect to your Reachy.
-You need to specify as an argument a target pose in Reachy coordinate system.
-
-Let's for example ask for the inverse kinematics of the current pose, using the forward kinematics.
-
-```python
-from reachy_sdk import ReachySDK
-
-reachy = ReachySDK(host='192.168.0.42') # Replace with the actual IP
-
-reachy.r_arm.inverse_kinematics(reachy.r_arm.forward_kinematics())
->>> [0, 0, 0, -90, 0, 0, 0] ??
-```
-
-The method returns a seven-element-long list indicating the position of each arm joint, in the usual order:
-- r_arm.shoulder.pitch
-- r_arm.shoulder.roll
-- r_arm.elbow.yaw
-- r_arm.elbow.pitch
-- r_arm.wrist.roll
-- r_arm.wrist.pitch
-- r_arm.wrist.yaw
-
-Contrary to the forward kinematics which has a unique answer (giving all joints values will always put the end effector at the same target position), inverse kinematics can have an infinite number of answers (for a target position of the end effector, several combinations of joints angles are possible).
-
-#### Using a q0 value
-The inverse kinematics returns one solution, but you may want to custom the position from which the computation is done to get another result.
-To do so, specify a **q0** value when calling the `inverse_kinematics()` method. The **`q0`** argument must be a seven-element-long list as well:
-```python
-reachy.r_arm.inverse_kinematics(
- reachy.r_arm.forward_kinematics(),
- q0=[0, 0, 0, 0, 0, 0, 0])
->>> [0, 0, 0, -90, 0, 0, 0] ??
-```
-
-
-### Example: square movement with goto_matrix()
-
-#### Defining the poses
-
-To make this more concrete, let's first try with a simple example. We will make the right hand draw a square in 3D space. To draw it, we will define the four corners of a square and Reachy's right hand will go to each of them.
-
-The virtual corner is represented below.
-
-{{< img-center "images/sdk/first-moves/square_setup.jpg" 400x "" >}}
-
-For our starting corner A, let's imagine a point in front of the robot, on its right and below its base. With Reachy coordinate system, we can define such a point with the following coordinates:
-
-$$A = \begin{pmatrix}0.3 & -0.4 & -0.3\end{pmatrix}$$
-
-The coordinates of B should match A except the z component wich should be higher. Hence
-
-$$B = \begin{pmatrix}0.3 & -0.4 & 0.0\end{pmatrix}$$
-
-For the corner C, we want a point on the same z level as B in the inner space of Reachy and in the same plane as A and B so we only need to change the y component of B. We can take for example
-
-$$C = \begin{pmatrix}0.3 & -0.1 & 0.0\end{pmatrix}$$
-
-And to complete our corners we can deduce D from A and C. D coordinates should match C except its z component which must the same as A. Hence
-
-$$D = \begin{pmatrix}0.3 & -0.1 & -0.3\end{pmatrix}$$
-
-> **Remember that you always have to provide poses to the inverse kinematics that are actually reachable by the robot.** If you're not sure whether the 3D point that you defined is reachable by Reachy, you can move the arm with your hand in compliant mode, ask the forward kinematics and check the 3D translation component of the returned pose.
-
-But having the 3D position is not enough to design a pose. You also need to provide the 3D orientation via a rotation matrix. The rotation matrix is often the tricky part when building a target pose matrix.
-
-Keep in mind that the identity rotation matrix corresponds to the zero position of the robot which is when the hand is facing toward the bottom. So if we want the hand facing forward when drawing our virtual square, we need to rotate it from -90° around the y axis, as we saw in the forward kinematics part.
-
-We know from before which rotation matrix corresponds to this rotation, but we can use scipy again to generate the rotation matrix for given rotations.
-
-```python
-print(np.around(R.from_euler('y', np.deg2rad(-90)).as_matrix(), 3))
->>> [[ 0. -0. -1.]
- [ 0. 1. -0.]
- [ 1. 0. 0.]]
-```
-
-We got the rotation matrix that we expected!
-
-As mentionned, building the pose matrix can be hard, so don't hesitate to use scipy to build your rotation matrix. You can also move the arm with your hand where you want it to be and use the forward kinematics to get an approximation of the target pose matrix you would give to the inverse kinematics.
-
-Here, having the rotation matrix and the 3D positions for our points A and B, we can build both target pose matrices.
-
-```python
-A = np.array([
- [0, 0, -1, 0.3],
- [0, 1, 0, -0.4],
- [1, 0, 0, -0.3],
- [0, 0, 0, 1],
-])
-
-B = np.array([
- [0, 0, -1, 0.3],
- [0, 1, 0, -0.4],
- [1, 0, 0, 0.0],
- [0, 0, 0, 1],
-])
-
-C = np.array([
- [0, 0, -1, 0.3],
- [0, 1, 0, -0.1],
- [1, 0, 0, 0.0],
- [0, 0, 0, 1],
-])
-
-D = np.array([
- [0, 0, -1, 0.3],
- [0, 1, 0, -0.1],
- [1, 0, 0, -0.3],
- [0, 0, 0, 1],
-])
-```
-
-#### Sending the movements commands
-
-As before, we use the **`goto_matrix()`** to send moving instructions to the arm.
-
-
-```python
-import time
-# put the joints in stiff mode
-reachy.r_arm.turn_on()
-
-# use the goto_matrix() method
-reachy.r_arm.goto_matrix(A)
-reachy.r_arm.goto_matrix(B)
-reachy.r_arm.goto_matrix(C)
-reachy.r_arm.goto_matrix(D)
-
-# put the joints back to compliant mode
-# use turn_off_smoothly to prevent the arm from falling hard
-reachy.r_arm.turn_off()
-```
-
-The result should look like this:
-
-
- {{< video "videos/sdk/goto_ik.mp4" "80%" >}}
+---
+title: "4. Use arm kinematics"
+description: "Harness arm kinematics to create movements using the Python SDK"
+lead: "Harness arm kinematics to create movements"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ developing-with-reachy-2:
+ parent: "SDK basics"
+weight: 230
+toc: true
+---
+
+## Arm coordinate system
+
+### Joint coordinates
+
+If you remember the [`goto_joints()` method]({{< ref "developing-with-reachy-2/basics/3-basic-arm-control#goto_joints" >}}), to generate a trajectory for the arm, you need to pass a list of joint positions as argument.
+
+For example, to place the right arm in a right angled position, we defined the following list:
+
+```python
+right_angled_position = [0, 0, 0, -90, 0, 0, 0]
+```
+
+and then call the method with it:
+
+```python
+reachy.r_arm.goto_joints(right_angled_position)
+```
+
+In this basic arm control, we used what is called **joint coordinates** to move Reachy. This means that we controlled each joint separately.
+
+Controlling a robot in joint coordinates can be hard and is often far from what we actually do as humans. When we want to grasp an object in front of us, we think of where we should put our hand, not how to flex each individual muscle to reach this position. This approach relies on cartesian coordinates: the 3D position and orientation in space. This is where the **kinematic model** comes into play.
+
+### Kinematic model
+
+The **kinematic model** describes the motion of a robot in mathematical form without considering the forces and torque affecting it. It only focuses on the geometric relationship between elements.
+
+We have defined the whole kinematic model of the arm, that is, the translation and rotation required to go from one joint to the next. On a right arm equipped with a gripper, it looks like this:
+
+|Motor|Translation|Rotation|
+|-----|-----------|--------|
+|r_arm.shoulder.pitch|(0, -0.019, 0)|(0, 1, 0)
+|r_arm.shoulder.roll|(0, 0, 0)|(1, 0, 0)
+|r_arm.elbow.yaw|(0, 0, -0.280)|(0, 0, 1)
+|r_arm.elbow.pitch|(0, 0, 0)|(0, 1, 0)
+|r_arm.wrist.roll|(0, 0, -0.120)|(0, 0, 1)
+|r_arm.wrist.pitch|(0, 0, 0)|(0, 1, 0)
+|r_arm.wrist.yaw|(0, 0, 0)|(1, 0, 0)
+|r_gripper|(0, ??, ??)|(0, 0, 0)
+
+
+To use and understand the kinematic model, you need to know how Reachy's coordinate system is defined (from Reachy's perspective), see below:
+
+{{< img-center "images/sdk/first-moves/arm_axis.png" 400x "" >}}
+
+* the X axis corresponds to the forward arrow,
+* the Y axis corresponds to the right to left arrow,
+* the Z axis corresponds to the up arrow.
+
+The origin of this coordinate system is located in the upper part of the robot's trunk, inside Reachy. If you imagine a segment going from the robot's left shoulder to its right shoulder, the origin is the middle of this segment, which is behind the center of Pollen's logo on Reachy's torso.
+
+{{< img-center "images/sdk/first-moves/reachy_frame.jpg" 400x "" >}}
+
+> Units in this coordinate system are **meters**. So the point (0.3, -0.2, 0) is 30cm in front of the origin, 20cm to the right and at the same height.
+
+
+### End effector location
+
+We consider the end-effector to be in a virtual joint located in the gripper and referred as *'right_tip'* or *'left_tip'* in the [urdf file](https://github.com/pollen-robotics/reachy_kinematics/blob/master/reachy.URDF), as shown below.
+
+{{< img-center "images/sdk/first-moves/eef.png" 400x "" >}}
+
+The red dot corresponds to the *'right_tip'*.
+
+You can see the right and left end-effectors animated below.
+
+
+ {{< video "videos/sdk/eef.mp4" "80%" >}}
+
+
+### Switching between joint and cartesian coordinates
+
+Forward and inverse kinematics are a way to go from one coordinate system to the other:
+
+* **forward kinematics: joint coordinates –> cartesian coordinates**,
+* **inverse kinematics: cartesian coordinates –> joint coordinates**.
+
+## Forward kinematics
+
+Using the kinematic model defined above, we can compute the 3D position and orientation of the right or left end-effector with the **`forward_kinematics()`** method.
+
+### forward_kinematics()
+
+Each arm has a **`forward_kinematics()`** method. To use it, you first need to connect to your Reachy.
+
+```python
+from reachy_sdk import ReachySDK
+
+reachy = ReachySDK(host='192.168.0.42') # Replace with the actual IP
+
+reachy.r_arm.forward_kinematics()
+>>> array([[ 0.04622308, -0.03799621, -0.99820825, 0.31144822],
+ [ 0.10976691, 0.99341829, -0.03273101, -0.19427524],
+ [ 0.99288199, -0.1080573 , 0.05008958, -0.4255104 ],
+ [ 0. , 0. , 0. , 1. ]])
+```
+
+The method returns a 4x4 matrix indicating the position and orientation of the end effector in Reachy 2's coordinate system.
+
+> By specifying no argument, it will give the current 3D position and orientation of the end effector.
+
+You can compute the forward kinematics of the arm for other joint positions by giving as argument a seven-element-long list, as for the `goto_joints()` method. The arm will not move, but you get the position and orientation the end effector would have in this configuration.
+
+For example, for the right arm right angled position:
+```python
+reachy.r_arm.forward_kinematics([0, 0, 0, -90, 0, 0, 0])
+>>> array([[ 0.04622308, -0.03799621, -0.99820825, 0.31144822],
+ [ 0.10976691, 0.99341829, -0.03273101, -0.19427524],
+ [ 0.99288199, -0.1080573 , 0.05008958, -0.4255104 ],
+ [ 0. , 0. , 0. , 1. ]])
+```
+
+### Understand the result
+The 4x4 matrix returned by the **`forward_kinematics()`** method is what is often called a **pose**. It actually encodes both the 3D translation (as a 3D vector) and the 3D rotation (as a 3x3 matrix) into one single representation.
+
+$$\begin{bmatrix}
+R_{11} & R_{12} & R_{13} & T_x\\\\\\
+R_{21} & R_{22} & R_{23} & T_y\\\\\\
+R_{31} & R_{32} & R_{33} & T_z\\\\\\
+0 & 0 & 0 & 1
+\end{bmatrix}$$
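+
+Building such a pose from a rotation matrix and a translation vector is mechanical, so it can help to see it done explicitly. The `make_pose()` helper below is our own numpy sketch, not an SDK function.
+
+```python
+import numpy as np
+
+def make_pose(rotation, translation):
+    """Assemble a 4x4 pose from a 3x3 rotation matrix and a 3D translation."""
+    pose = np.eye(4)
+    pose[:3, :3] = rotation
+    pose[:3, 3] = translation
+    return pose
+
+# Hand facing forward (-90° around y), 30cm in front, 20cm to the right, 30cm down
+R_forward = np.array([
+    [0, 0, -1],
+    [0, 1, 0],
+    [1, 0, 0],
+])
+pose = make_pose(R_forward, [0.3, -0.2, -0.3])
+```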
+
+The instruction
+
+```python
+reachy.r_arm.forward_kinematics()
+```
+
+returns the current pose of the right end-effector, based on the present position of every joint in the right arm.
+
+You can also compute the pose for a given joints position: just pass the list of positions as argument to forward_kinematics. Be careful to respect the order of the positions you give and to include all the joints of the arm kinematic chain (i.e. from *shoulder.pitch* to *wrist.yaw*).
+
+For example, we can compute the forward kinematics for the right-angled position we defined earlier.
+
+```python
+reachy.r_arm.forward_kinematics(right_angled_position)
+>>> array([[ 0. , 0. , -1. , 0.3675],
+ [ 0. , 1. , 0. , -0.202 ],
+ [ 1. , 0. , 0. , -0.28 ],
+ [ 0. , 0. , 0. , 1. ]])
+```
+
+With this result, we can tell that when the right arm is in the right-angled position, the right end-effector is 37cm in front of the origin, 20cm to the right, and 28cm below the origin.
+
+As for the rotation matrix, the identity matrix corresponds to the zero position of the robot, which is when the hand is facing toward the bottom.
+
+Here we obtained the rotation matrix
+
+$$\begin{bmatrix}
+0 & 0 & -1\\\\\\
+0 & 1 & 0 \\\\\\
+1 & 0 & 0
+\end{bmatrix}$$
+
+We can use scipy to understand what this matrix represents.
+
+```python
+from scipy.spatial.transform import Rotation as R
+import numpy as np
+
+R.from_matrix([
+ [0, 0, -1],
+ [0, 1, 0],
+ [1, 0, 0],
+]).as_euler('xyz', degrees=True)
+>>> array([ 0. , -89.99999879, 0. ])
+```
+So scipy tells us that this matrix corresponds to a rotation of -90° around the y axis, which is coherent: having the hand facing forward corresponds to this rotation according to Reachy's xyz axes that we saw above.
+
+## Inverse kinematics
+
+The inverse kinematics is the exact opposite of the forward kinematics. From a 4x4 pose in Reachy 2's coordinate system, it gives you a list of joint positions to reach this target.
+
+Knowing where your arm is located in 3D space can be useful, but most of the time what you want is to move the arm in cartesian coordinates. You want to be able to say: “move your hand to [x, y, z] with a 90° rotation around the Y axis”. This is what **`goto_matrix()`** allows you to do.
+
+### inverse_kinematics()
+
+Each arm has an **`inverse_kinematics()`** method. To use it, you first need to connect to your Reachy.
+You need to specify as an argument a target pose in Reachy coordinate system.
+
+Let's for example ask for the inverse kinematics of the current pose, using the forward kinematics.
+
+```python
+from reachy_sdk import ReachySDK
+
+reachy = ReachySDK(host='192.168.0.42') # Replace with the actual IP
+
+reachy.r_arm.inverse_kinematics(reachy.r_arm.forward_kinematics())
+>>> [0, 0, 0, -90, 0, 0, 0] ??
+```
+
+The method returns a seven-element-long list indicating the position of each arm joint, in the usual order:
+- r_arm.shoulder.pitch
+- r_arm.shoulder.roll
+- r_arm.elbow.yaw
+- r_arm.elbow.pitch
+- r_arm.wrist.roll
+- r_arm.wrist.pitch
+- r_arm.wrist.yaw
+
+Contrary to the forward kinematics, which has a unique answer (a given set of joint values always puts the end effector at the same position), the inverse kinematics can have an infinite number of answers (for a target position of the end effector, several combinations of joint angles are possible).
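+
+A planar 2-link arm is enough to see this multiplicity: the "elbow-up" and "elbow-down" configurations reach the same point with different joint angles. This is a toy model of our own, not Reachy's actual kinematic chain.
+
+```python
+import math
+
+def fk_2link(theta1, theta2, l1=1.0, l2=1.0):
+    """Forward kinematics of a planar 2-link arm (angles in radians)."""
+    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
+    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
+    return x, y
+
+# Two distinct joint configurations...
+elbow_up = fk_2link(math.radians(30), math.radians(60))
+elbow_down = fk_2link(math.radians(90), math.radians(-60))
+# ...put the end effector at the same (x, y) position
+```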
+
+#### Using a q0 value
+The inverse kinematics returns one solution, but you may want to customize the position from which the computation is done to get another result.
+To do so, specify a **q0** value when calling the `inverse_kinematics()` method. The **`q0`** argument must be a seven-element-long list as well:
+```python
+reachy.r_arm.inverse_kinematics(
+ reachy.r_arm.forward_kinematics(),
+ q0=[0, 0, 0, 0, 0, 0, 0])
+>>> [0, 0, 0, -90, 0, 0, 0] ??
+```
+
+
+### Example: square movement with goto_matrix()
+
+#### Defining the poses
+
+To make this more concrete, let's first try with a simple example. We will make the right hand draw a square in 3D space. To draw it, we will define the four corners of a square and Reachy's right hand will go to each of them.
+
+The virtual square is represented below.
+
+{{< img-center "images/sdk/first-moves/square_setup.jpg" 400x "" >}}
+
+For our starting corner A, let's imagine a point in front of the robot, to its right and below its base. In Reachy's coordinate system, we can define such a point with the following coordinates:
+
+$$A = \begin{pmatrix}0.3 & -0.4 & -0.3\end{pmatrix}$$
+
+The coordinates of B should match A except the z component, which should be higher. Hence
+
+$$B = \begin{pmatrix}0.3 & -0.4 & 0.0\end{pmatrix}$$
+
+For the corner C, we want a point on the same z level as B, in the inner space of Reachy and in the same plane as A and B, so we only need to change the y component of B. We can take for example
+
+$$C = \begin{pmatrix}0.3 & -0.1 & 0.0\end{pmatrix}$$
+
+And to complete our corners, we can deduce D from A and C. D's coordinates should match C except its z component, which must be the same as A's. Hence
+
+$$D = \begin{pmatrix}0.3 & -0.1 & -0.3\end{pmatrix}$$
+
+> **Remember that you always have to provide poses to the inverse kinematics that are actually reachable by the robot.** If you're not sure whether the 3D point that you defined is reachable by Reachy, you can move the arm with your hand in compliant mode, ask the forward kinematics and check the 3D translation component of the returned pose.
+
+But having the 3D position is not enough to design a pose. You also need to provide the 3D orientation via a rotation matrix. The rotation matrix is often the tricky part when building a target pose matrix.
+
+Keep in mind that the identity rotation matrix corresponds to the zero position of the robot which is when the hand is facing toward the bottom. So if we want the hand facing forward when drawing our virtual square, we need to rotate it from -90° around the y axis, as we saw in the forward kinematics part.
+
+We know from before which rotation matrix corresponds to this rotation, but we can use scipy again to generate the rotation matrix for given rotations.
+
+```python
+print(np.around(R.from_euler('y', np.deg2rad(-90)).as_matrix(), 3))
+>>> [[ 0. -0. -1.]
+ [ 0. 1. -0.]
+ [ 1. 0. 0.]]
+```
+
+We got the rotation matrix that we expected!
+
+As mentioned, building the pose matrix can be hard, so don't hesitate to use scipy to build your rotation matrix. You can also move the arm by hand to where you want it to be and use the forward kinematics to get an approximation of the target pose matrix you would give to the inverse kinematics.
+
+Here, having the rotation matrix and the 3D positions of our four corners, we can build the target pose matrices:
+
+```python
+import numpy as np
+
+A = np.array([
+ [0, 0, -1, 0.3],
+ [0, 1, 0, -0.4],
+ [1, 0, 0, -0.3],
+ [0, 0, 0, 1],
+])
+
+B = np.array([
+ [0, 0, -1, 0.3],
+ [0, 1, 0, -0.4],
+ [1, 0, 0, 0.0],
+ [0, 0, 0, 1],
+])
+
+C = np.array([
+ [0, 0, -1, 0.3],
+ [0, 1, 0, -0.1],
+ [1, 0, 0, 0.0],
+ [0, 0, 0, 1],
+])
+
+D = np.array([
+ [0, 0, -1, 0.3],
+ [0, 1, 0, -0.1],
+ [1, 0, 0, -0.3],
+ [0, 0, 0, 1],
+])
+```
+
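The four matrices above follow the same pattern, with the rotation in the top-left 3×3 block and the position in the last column, so they can also be assembled programmatically. A minimal sketch (the `make_pose` helper is ours, not an SDK function):

```python
import numpy as np

def make_pose(rotation, position):
    """Assemble a 4x4 homogeneous pose from a 3x3 rotation and an xyz position."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = position
    return pose

rot = np.array([
    [0, 0, -1],
    [0, 1, 0],
    [1, 0, 0],
])  # hand facing forward: -90 degrees around y

A = make_pose(rot, [0.3, -0.4, -0.3])
```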
+#### Sending the movement commands
+
+As before, we use the **`goto_matrix()`** method to send movement instructions to the arm.
+
+
+```python
+# put the joints in stiff mode
+reachy.r_arm.turn_on()
+
+# use the goto_matrix() method; the commands are stacked and played in order
+reachy.r_arm.goto_matrix(A)
+reachy.r_arm.goto_matrix(B)
+reachy.r_arm.goto_matrix(C)
+reachy.r_arm.goto_matrix(D)
+
+# put the joints back to compliant mode
+# use turn_off_smoothly() to prevent the arm from falling hard
+reachy.r_arm.turn_off_smoothly()
+```
+
+The result should look like this:
+
+
+ {{< video "videos/sdk/goto_ik.mp4" "80%" >}}
\ No newline at end of file
diff --git a/content/sdk/first-moves/head.md b/content/developing-with-reachy-2/basics/5-control-head.md
similarity index 87%
rename from content/sdk/first-moves/head.md
rename to content/developing-with-reachy-2/basics/5-control-head.md
index 5d3558d5..b83720a0 100644
--- a/content/sdk/first-moves/head.md
+++ b/content/developing-with-reachy-2/basics/5-control-head.md
@@ -1,173 +1,176 @@
----
-title: "5. Control the head"
-description: "Control the head"
-lead: ""
-date: 2023-07-25T17:38:48+02:00
-lastmod: 2023-07-25T17:38:48+02:00
-draft: false
-type: docs
-images: []
-toc: true
-weight: "110"
----
-
-## Head presentation
-
-Reachy 2's head is mounted on an Orbita3D actuator, referred to as the **neck** actuator, giving 3 degrees of freedom to control the head orientation.
-> Note : the antennas are not motorized for the moment
-
-
- {{< video "videos/sdk/orbita.mp4" "80%" >}}
-
-
-The complete head's specifications are given [here]({{< ref "advanced/specifications/head-specs" >}}).
-
-Before starting to control it, connect to your Reachy and turn it on. As in the other pages:
-
-```python
-from reachy2_sdk import ReachySDK
-
-reachy = ReachySDK(host='192.168.0.42') # Replace with the actual IP
-
-reachy.head
->>>
-
-reachy.head.turn_on() # we turn on only the head
-```
-
-You could of course turn on the whole robot by calling `reachy.turn_on()` directly.
-
-There are several ways to control the head movements:
-- using the `look_at()`, `rotate_to()` and `orient()` methods, called directly at the **head** level. These methods works as [move commands described previously]({{< ref "sdk/first-moves/moves" >}}).
-- controlling the joints goal positions, namely **reachy.head.neck.roll**, **reachy.head.neck.pitch** and **reachy.head.neck.yaw**.
-
-## Head moves methods
-
-### look_at()
-
-You can use the `look_at()` function to make the head look at a specific point in space. This point must be given in Reachy 2's coordinate system in **meters**. The coordinate system is the one we have seen previously:
-
-* the X axis corresponds to the foward arrow,
-* the Y axis corresponds to the right to left arrow,
-* the Z axis corresponds to the up arrow.
-
-The origin of this coordinate system is located in the upper part of the robot trunk.
-
-{{< img-center "images/sdk/first-moves/reachy_frame.jpg" 400x "" >}}
-
-If you want Reachy to look forward you can send it the following.
-
-```python
-reachy.head.turn_on() # Don't forget to put the hand in stiff mode
-reachy.head.look_at(x=0.5, y=0, z=0.2, duration=1.0)
-```
-
-You can use multiple *look_at* to chain head movements, or even chain them with the `rotate_to()` and `orient()` functions described below. As seen in the [Understand moves in Reachy 2 section]({{< ref "sdk/first-moves/moves" >}}), the commands on the head will be stacked.
-
-
- {{< video "videos/sdk/look.mp4" "80%" >}}
-
-
-Here is the code to reproduce this.
-
-```python
-import time
-
-look_right = reachy.head.look_at(x=0.5, y=-0.5, z=0.1, duration=1.0)
-look_down = reachy.head.look_at(x=0.5, y=0, z=-0.4, duration=1.0)
-look_left = reachy.head.look_at(x=0.5, y=0.3, z=-0.3, duration=1.0)
-look_front = reachy.head.look_at(x=0.5, y=0, z=0, duration=1.0)
-```
-
-The best way to understand how to use the *look_at* is to play with it. Picture a position you would like Reachy's head to be in, guess a point which could match for the *look_at* and check if you got it right!
-
-Another cool thing is that we can combine Reachy's kinematics with the *look_at* so that Reachy's head follows its hand!
-
-
- {{< video "videos/sdk/look_at_hand.mp4" "80%" >}}
-
-
-```python
-reachy.turn_on('head')
-
-x, y, z = reachy.r_arm.forward_kinematics()[:3, -1]
-reachy.head.look_at(x=x, y=y, z=z, duration=1.0)
-
-time.sleep(0.5)
-
-while True:
- x, y, z = reachy.r_arm.forward_kinematics()[:3, -1]
- reachy.head.look_at(x=x, y=y, z=z, duration=0.1)
-```
-
-What the code says is that we compute the [forward kinematics of Reachy's right arm]({{< ref "sdk/first-moves/kinematics#forward-kinematics" >}}), and the x, y, z of Reachy's right end-effector in the Reachy's coordinates system will be the coordinates of the point used by the *look_at*.
-
-### rotate_to()
-
-The `rotate_to()` function is another way to control the head. You directly control the joint of the neck, giving the roll, pitch and yaw angles in degrees. The rotation is made in the order: roll, pitch, yaw, in the Orbita3D coordinate system.
-
-{{< img-center "images/sdk/first-moves/orbita_rpy.png" 400x "" >}}
-
-To make the robot looks a little down:
-```python
-reachy.head.turn_on() # Don't forget to put the hand in stiff mode
-reachy.head.rotate_to(roll=0, pitch=-10, yaw=0, duration=1.0)
-```
-
-### orient()
-
-The last method to control the head is the `orient()` method. You can control the head with a quaternion.
-
-You can use [pyquaternion library](https://kieranwynn.github.io/pyquaternion/) to create suitable quaternion for this method.
-
-```python
-from pyquaternion import Quaternion
-
-q = Quaternion(axis=[1, 0, 0], angle=3.14159265)
-reachy.head.turn_on()
-reachy.head.orient(q)
-```
-
-## Joint's goal_position
-
-
-## Read head position
-
-You can read the head orientation in two different ways:
-
-- using the `get_orientation()` method, which returns a quaternion
-- using the `get_joints_positions()` method, which the neck's roll, pitch and yaw present_position.
-
-### get_orientation()
-
-```python
-q = reachy.head.get_orientation()
-print(q)
->>> ??
-```
-
-### get_joints_positions()
-
-In case you feel more comfortable using roll, pitch, yaw angles rather than working with quaternions, you can retrieve those values from the **neck joints**.
-
-```python
-reachy.head.rotate_to(20, 30, -10)
-time.sleep(2)
-reachy.head.get_joints_positions()
->>> [20, 30, -10] # roll=20, pitch=30, yaw=-10
-```
-
-Be careful that contrary to the quaternion that offers a unique representation of a rotation, it is not the case of the euler angles. Several angles combination can lead to the same orientation in space. For example:
-
-```python
-reachy.head.rotate_to(70, -100, 80) # roll=70, pitch=-100, yaw=80
-time.sleep(2)
-reachy.head.get_joints_positions()
->>> [-110, -80, -100] # roll=-110, pitch=-80, yaw=-100
-```
-
-The values are different, nevertheless it is the same final orientation. You can convince yourself doing:
-```python
-reachy.head.rotate_to(-110, -80, -100)
-```
+---
+title: "5. Control the head"
+description: "First head movements using the Python SDK"
+lead: "First head movements"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ developing-with-reachy-2:
+ parent: "SDK basics"
+weight: 240
+toc: true
+---
+
+
+## Head presentation
+
+Reachy 2's head is mounted on an Orbita3D actuator, referred to as the **neck** actuator, giving 3 degrees of freedom to control the head orientation.
+> Note: the antennas are not motorized for now.
+
+
+ {{< video "videos/sdk/orbita.mp4" "80%" >}}
+
+
+
+Before starting to control it, connect to your Reachy and turn it on. As in the other pages:
+
+```python
+from reachy2_sdk import ReachySDK
+
+reachy = ReachySDK(host='192.168.0.42') # Replace with the actual IP
+
+reachy.head
+>>>
+
+reachy.head.turn_on() # we turn on only the head
+```
+
+You could of course turn on the whole robot by calling `reachy.turn_on()` directly.
+
+There are several ways to control the head movements:
+- using the `look_at()`, `rotate_to()` and `orient()` methods, called directly at the **head** level. These methods work like the [move commands described previously]({{< ref "developing-with-reachy-2/basics/2-understand-moves" >}}).
+- controlling the joints goal positions, namely **reachy.head.neck.roll**, **reachy.head.neck.pitch** and **reachy.head.neck.yaw**.
+
+## Head moves methods
+
+### look_at()
+
+You can use the `look_at()` function to make the head look at a specific point in space. This point must be given in Reachy 2's coordinate system in **meters**. The coordinate system is the one we have seen previously:
+
+* the X axis corresponds to the forward arrow,
+* the Y axis corresponds to the right to left arrow,
+* the Z axis corresponds to the up arrow.
+
+The origin of this coordinate system is located in the upper part of the robot trunk.
+
+{{< img-center "images/sdk/first-moves/reachy_frame.jpg" 400x "" >}}
+
+If you want Reachy to look forward you can send it the following.
+
+```python
+reachy.head.turn_on() # Don't forget to put the head in stiff mode
+reachy.head.look_at(x=0.5, y=0, z=0.2, duration=1.0)
+```
+
+You can use multiple *look_at* calls to chain head movements, or even chain them with the `rotate_to()` and `orient()` functions described below. As seen in the [Understand moves in Reachy 2 section]({{< ref "developing-with-reachy-2/basics/2-understand-moves" >}}), the commands on the head are stacked.
+
+
+ {{< video "videos/sdk/look.mp4" "80%" >}}
+
+
+Here is the code to reproduce this.
+
+```python
+import time
+
+look_right = reachy.head.look_at(x=0.5, y=-0.5, z=0.1, duration=1.0)
+look_down = reachy.head.look_at(x=0.5, y=0, z=-0.4, duration=1.0)
+look_left = reachy.head.look_at(x=0.5, y=0.3, z=-0.3, duration=1.0)
+look_front = reachy.head.look_at(x=0.5, y=0, z=0, duration=1.0)
+```
+
+The best way to understand how to use the *look_at* is to play with it. Picture a position you would like Reachy's head to be in, guess a point which could match for the *look_at* and check if you got it right!
+
+Another cool thing is that we can combine Reachy's kinematics with the *look_at* so that Reachy's head follows its hand!
+
+
+ {{< video "videos/sdk/look_at_hand.mp4" "80%" >}}
+
+
+```python
+reachy.head.turn_on()
+
+x, y, z = reachy.r_arm.forward_kinematics()[:3, -1]
+reachy.head.look_at(x=x, y=y, z=z, duration=1.0)
+
+time.sleep(0.5)
+
+while True:
+ x, y, z = reachy.r_arm.forward_kinematics()[:3, -1]
+ reachy.head.look_at(x=x, y=y, z=z, duration=0.1)
+```
+
+This code computes the [forward kinematics of Reachy's right arm]({{< ref "developing-with-reachy-2/basics/5-control-head#forward-kinematics" >}}) and uses the x, y, z position of Reachy's right end-effector, expressed in Reachy's coordinate system, as the target point of the *look_at*.
+
+### rotate_to()
+
+The `rotate_to()` function is another way to control the head. You directly control the joint of the neck, giving the roll, pitch and yaw angles in degrees. The rotation is made in the order: roll, pitch, yaw, in the Orbita3D coordinate system.
+
+{{< img-center "images/sdk/first-moves/orbita_rpy.png" 400x "" >}}
+
+To make the robot look a little down:
+```python
+reachy.head.turn_on() # Don't forget to put the head in stiff mode
+reachy.head.rotate_to(roll=0, pitch=-10, yaw=0, duration=1.0)
+```
+
+### orient()
+
+The last method to control the head is `orient()`, which takes a quaternion.
+
+You can use the [pyquaternion library](https://kieranwynn.github.io/pyquaternion/) to create a suitable quaternion for this method.
+
+```python
+from pyquaternion import Quaternion
+
+q = Quaternion(axis=[1, 0, 0], angle=3.14159265)
+reachy.head.turn_on()
+reachy.head.orient(q)
+```
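For intuition, the quaternion built by `Quaternion(axis=..., angle=...)` is simply `(cos(θ/2), sin(θ/2)·axis)`. A stdlib-only sketch of that construction (illustrative, not an SDK API):

```python
import math

def axis_angle_quaternion(axis, angle):
    """Return (w, x, y, z) for a rotation of `angle` radians around a unit `axis`."""
    half = angle / 2
    s = math.sin(half)
    return (math.cos(half), axis[0] * s, axis[1] * s, axis[2] * s)

# 180 degrees around the x axis, as in the example above
q = axis_angle_quaternion([1, 0, 0], math.pi)
```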
+
+## Joint's goal_position
+
+
+## Read head position
+
+You can read the head orientation in two different ways:
+
+- using the `get_orientation()` method, which returns a quaternion
+- using the `get_joints_positions()` method, which returns the neck's roll, pitch and yaw present positions
+
+### get_orientation()
+
+```python
+q = reachy.head.get_orientation()
+print(q)
+>>> ??
+```
+
+### get_joints_positions()
+
+In case you feel more comfortable using roll, pitch, yaw angles rather than working with quaternions, you can retrieve those values from the **neck joints**.
+
+```python
+reachy.head.rotate_to(20, 30, -10)
+time.sleep(2)
+reachy.head.get_joints_positions()
+>>> [20, 30, -10] # roll=20, pitch=30, yaw=-10
+```
+
+Be careful: unlike quaternions, which give a unique representation of a rotation, Euler angles are not unique. Several angle combinations can lead to the same orientation in space. For example:
+
+```python
+reachy.head.rotate_to(70, -100, 80) # roll=70, pitch=-100, yaw=80
+time.sleep(2)
+reachy.head.get_joints_positions()
+>>> [-110, -80, -100] # roll=-110, pitch=-80, yaw=-100
+```
+
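You can check this kind of equivalence numerically by building both rotation matrices yourself. A sketch assuming an extrinsic x-y-z (roll, pitch, yaw) convention — an assumption to verify against your robot:

```python
import numpy as np

def rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rpy_matrix(roll, pitch, yaw):
    """Rotation matrix for extrinsic x-y-z Euler angles given in degrees."""
    r, p, y = np.deg2rad([roll, pitch, yaw])
    return rz(y) @ ry(p) @ rx(r)

m1 = rpy_matrix(70, -100, 80)
m2 = rpy_matrix(-110, -80, -100)
print(np.allclose(m1, m2))  # both triplets give the same rotation matrix
```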
+The values are different; nevertheless, the orientation is the same. You can convince yourself by running:
+```python
+reachy.head.rotate_to(-110, -80, -100)
+```
The head won't move.
\ No newline at end of file
diff --git a/content/sdk/first-moves/cameras.md b/content/developing-with-reachy-2/basics/6-get-images-from-cameras.md
similarity index 90%
rename from content/sdk/first-moves/cameras.md
rename to content/developing-with-reachy-2/basics/6-get-images-from-cameras.md
index 5de47ef7..85ae7030 100644
--- a/content/sdk/first-moves/cameras.md
+++ b/content/developing-with-reachy-2/basics/6-get-images-from-cameras.md
@@ -1,140 +1,144 @@
----
-title: "6. Get images from cameras"
-description: "How to get the images from Reachy 2's cameras."
-lead: ""
-date: 2023-07-25T17:38:34+02:00
-lastmod: 2023-07-25T17:38:34+02:00
-draft: false
-type: docs
-images: []
-toc: true
-weight: "120"
----
-This section assumes that you went through the [Hello World]({{< ref "sdk/getting-started/hello-world" >}}) so that you know how to connect to the robot.
-
-Reachy 2 has 2 types of camera:
-- the **teleop** cameras, with a right and left cameras, located in Reachy 2's head and used for the teleoperation
-- the **SR** camera, which is a depth camera, located in Reachy 2's torso and mainly useful for manipulation tasks
-
-Each camera can be accessed separately through *reachy.cameras*. They both have a right and left view, with the left and right sides considered from Reachy point of view. To be able to specify the view you want to get a frame from, you will need to import CameraView:
-
-```python
-from reachy2_sdk.media.camera import CameraView
-```
-
-## Enable teleop cameras for the SDK
-
-### SR camera
-The SR camera is unplugged by default.
-If you want to use it, plug the SR camera on the robot's computer remaining USB port (2).
-
-{{< img-center "images/sdk/first-moves/plugged-sr.png" 400x "" >}}
-
-> Make sure to unplug it if you want to use the teleoperation.
-
-### Teleop cameras
-The teleop cameras are shared between the teleop service and the SDK server, and can only be used by one at the same time.
-In order to be able to use the teleop cameras with the SDK:
-1. Go to the dashboard
-2. Stop [webrtc service in the services tab]({{< ref "/dashboard/content/services" >}})
-
-{{< img-center "images/sdk/first-moves/stop-webrtc-service.png" 600x "" >}}
-
-## Get images
-
-First, connect to your Reachy.
-
-```python
-from reachy_sdk import ReachySDK
-
-reachy = ReachySDK(host='192.168.0.42') # Replace with the actual IP
-
-reachy.cameras
->>> ??
-```
-
-The list of initialized cameras should contain both the teleop and SR camera.
-
-For each camera, namely the teleop and the SR ones, you must call the `capture()`function each time you want to get an image. This captures an image from both view of the given camera at the same time. You can then access one of the image with the `get_frame()` method.
-
-### Teleop camera
-
-To get both views of the robot teleop cameras:
-```python
-from reachy2_sdk import ReachySDK
-from reachy2_sdk.media.camera import CameraView
-
-reachy = ReachySDK(host='192.168.0.42')
-
-reachy.cameras.teleop.capture()
-l_frame = reachy.cameras.teleop.get_frame(CameraView.LEFT)
-r_frame = reachy.cameras.teleop.get_frame(CameraView.RIGHT)
-```
-
-Let's display the captured frame with opencv:
-```python
-import cv2
-
-cv2.imshow("left", l_frame)
-cv2.imshow("right", r_frame)
-cv.waitKey(0)
-cv.destroyAllWindows()
-```
-
-### SR camera
-The SR camera works exactly the same as the teleop camera, but you have more elements captured. In fact, it a RGBD camera, so you have both access to the RGB images and depth information.
-
-#### RGB images
-Getting RGB images from the SR camera looks the same as from the teleop one: after having called `capture()`, use `get_frame()` specifying the CameraView you want to get.
-```python
-from reachy_sdk import ReachySDK
-from reachy2_sdk.media.camera import CameraView
-
-reachy = ReachySDK(host='192.168.0.42')
-
-reachy.cameras.SR.capture()
-l_frame = reachy.cameras.SR.get_frame(CameraView.LEFT)
-r_frame = reachy.cameras.SR.get_frame(CameraView.RIGHT)
-```
-
-Let's display them with opencv:
-```python
-import cv2
-
-cv2.imshow("left", l_frame)
-cv2.imshow("right", r_frame)
-cv.waitKey(0)
-cv.destroyAllWindows()
-```
-
-#### Depth information
-
-The SR camera is a depth camera, you can then diplay a left or right **depth frame** using `get_depth_frame()`, but also the **depthmap** and the **disparity**.
-
-You first have to capture all, then you can read the frame and get the information you want:
-```python
-from reachy_sdk import ReachySDK
-from reachy2_sdk.media.camera import CameraView
-
-reachy = ReachySDK(host='192.168.0.42')
-
-reachy.cameras.SR.capture()
-l_depth_frame = reachy.cameras.SR.get_depth_frame(CameraView.LEFT)
-r_depth_frame = reachy.cameras.SR.get_depth_frame(CameraView.RIGHT)
-depth = reachy.cameras.SR.get_depthmap()
-disparity = reachy.cameras.SR.get_disparity()
-```
-
-Let's display them with opencv:
-```python
-import cv2
-
-cv2.imshow("sr_depthNode_left", l_depth_frame)
-cv2.imshow("sr_depthNode_right", r_depth_frame)
-cv2.imshow("depth", depth)
-cv2.imshow("disparity", disparity)
-cv.waitKey(0)
-cv.destroyAllWindows()
-```
-
+---
+title: "6. Get images from cameras"
+description: "Images acquisition using the Python SDK"
+lead: "Images acquisition "
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ developing-with-reachy-2:
+ parent: "SDK basics"
+weight: 250
+toc: true
+---
+
+This section assumes that you went through the [Hello World]({{< ref "developing-with-reachy-2/getting-started-sdk/connect-reachy2" >}}) so that you know how to connect to the robot.
+
+Reachy 2 has two types of cameras:
+- the **teleop** cameras: a right and a left camera located in Reachy 2's head, used for teleoperation
+- the **SR** camera: a depth camera located in Reachy 2's torso, mainly useful for manipulation tasks
+
+Each camera can be accessed separately through *reachy.cameras*. Both have a right and a left view, the sides being considered from Reachy's point of view. To specify the view you want to get a frame from, import CameraView:
+
+```python
+from reachy2_sdk.media.camera import CameraView
+```
+
+## Enable teleop cameras for the SDK
+
+### SR camera
+The SR camera is unplugged by default.
+If you want to use it, plug it into the remaining USB port (2) on the robot's computer.
+
+{{< img-center "images/sdk/first-moves/plugged-sr.png" 400x "" >}}
+
+> Make sure to unplug it if you want to use the teleoperation.
+
+### Teleop cameras
+The teleop cameras are shared between the teleop service and the SDK server, and can only be used by one of them at a time.
+To use the teleop cameras with the SDK:
+1. Go to the dashboard
+2. Stop the webrtc service in the services tab
+
+{{< img-center "images/sdk/first-moves/stop-webrtc-service.png" 600x "" >}}
+
+## Get images
+
+First, connect to your Reachy.
+
+```python
+from reachy2_sdk import ReachySDK
+
+reachy = ReachySDK(host='192.168.0.42') # Replace with the actual IP
+
+reachy.cameras
+>>> ??
+```
+
+The list of initialized cameras should contain both the teleop and SR camera.
+
+For each camera, namely the teleop and the SR one, you must call the `capture()` function each time you want to get an image. This captures an image from both views of the given camera at the same time. You can then access one of the images with the `get_frame()` method.
+
+### Teleop camera
+
+To get both views of the robot teleop cameras:
+```python
+from reachy2_sdk import ReachySDK
+from reachy2_sdk.media.camera import CameraView
+
+reachy = ReachySDK(host='192.168.0.42')
+
+reachy.cameras.teleop.capture()
+l_frame = reachy.cameras.teleop.get_frame(CameraView.LEFT)
+r_frame = reachy.cameras.teleop.get_frame(CameraView.RIGHT)
+```
+
+Let's display the captured frames with opencv:
+```python
+import cv2
+
+cv2.imshow("left", l_frame)
+cv2.imshow("right", r_frame)
+cv2.waitKey(0)
+cv2.destroyAllWindows()
+```
+
+### SR camera
+The SR camera works exactly like the teleop camera, but more elements are captured. It is an RGBD camera, so you have access to both RGB images and depth information.
+
+#### RGB images
+Getting RGB images from the SR camera works the same as for the teleop one: after calling `capture()`, use `get_frame()` with the CameraView you want.
+```python
+from reachy2_sdk import ReachySDK
+from reachy2_sdk.media.camera import CameraView
+
+reachy = ReachySDK(host='192.168.0.42')
+
+reachy.cameras.SR.capture()
+l_frame = reachy.cameras.SR.get_frame(CameraView.LEFT)
+r_frame = reachy.cameras.SR.get_frame(CameraView.RIGHT)
+```
+
+Let's display them with opencv:
+```python
+import cv2
+
+cv2.imshow("left", l_frame)
+cv2.imshow("right", r_frame)
+cv2.waitKey(0)
+cv2.destroyAllWindows()
+```
+
+#### Depth information
+
+The SR camera is a depth camera: you can display a left or right **depth frame** using `get_depth_frame()`, as well as the **depthmap** and the **disparity**.
+
+First call `capture()`, then read the frames and get the information you want:
+```python
+from reachy2_sdk import ReachySDK
+from reachy2_sdk.media.camera import CameraView
+
+reachy = ReachySDK(host='192.168.0.42')
+
+reachy.cameras.SR.capture()
+l_depth_frame = reachy.cameras.SR.get_depth_frame(CameraView.LEFT)
+r_depth_frame = reachy.cameras.SR.get_depth_frame(CameraView.RIGHT)
+depth = reachy.cameras.SR.get_depthmap()
+disparity = reachy.cameras.SR.get_disparity()
+```
+
+Let's display them with opencv:
+```python
+import cv2
+
+cv2.imshow("sr_depthNode_left", l_depth_frame)
+cv2.imshow("sr_depthNode_right", r_depth_frame)
+cv2.imshow("depth", depth)
+cv2.imshow("disparity", disparity)
+cv2.waitKey(0)
+cv2.destroyAllWindows()
+```
+
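Note that depth maps are often 16-bit, so `cv2.imshow` may render them nearly black. Normalizing to 8-bit first usually helps — a sketch using a synthetic array as a stand-in for the one returned by `get_depthmap()`:

```python
import numpy as np

# Stand-in for the array returned by reachy.cameras.SR.get_depthmap()
depth = (np.random.rand(120, 160) * 4000).astype(np.uint16)

# Scale to the full 0-255 range so cv2.imshow renders visible contrast
depth_8bit = (depth.astype(np.float32) / max(int(depth.max()), 1) * 255).astype(np.uint8)
# cv2.imshow("depth", depth_8bit)
```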
> Note that when you call `capture()` on the SR camera, both RGB images and depth information are captured at the same time.
\ No newline at end of file
diff --git a/content/sdk/first-moves/record.md b/content/developing-with-reachy-2/basics/7-record-replay-trajectories.md
similarity index 94%
rename from content/sdk/first-moves/record.md
rename to content/developing-with-reachy-2/basics/7-record-replay-trajectories.md
index 07ea8249..a6eb4572 100644
--- a/content/sdk/first-moves/record.md
+++ b/content/developing-with-reachy-2/basics/7-record-replay-trajectories.md
@@ -1,120 +1,124 @@
----
-title: "7. Record and replay trajectories"
-description: ""
-lead: ""
-date: 2023-07-25T17:39:07+02:00
-lastmod: 2023-07-25T17:39:07+02:00
-draft: false
-type: docs
-images: []
-toc: true
-weight: "130"
----
-
-You can easily record joint trajectories directly on Reachy, store and replay them later. This page will show you how to implement such mechanisms.
-
-All examples given below will show trajectories record on each of the robot joints. The position of each motor will be stored at a predefined frequency (typically 100Hz). Similarly, the replay will set new target position using the same frequency. Those basics examples does not perform any kind of filtering or modification of the data.
-
-In the following examples, we will assume that you are already connected to your robot and know how to control individual motors.
-
-## Recording a trajectory
-
-To record a trajectory, we will simply get the current position of individual motors at a predefiend frequency. We will first define a list of motors that we want to record. In this example, we will only record the joints from the right arm, but you can similarly record a single motor, or all motors of the robot at once.
-
-```python
-# assuming we run something like this before:
-# reachy = ReachySDK(host='192.168.0.42')
-
-recorded_joints = [
- reachy.r_arm.r_shoulder_pitch,
- reachy.r_arm.r_shoulder_roll,
- reachy.r_arm.r_arm_yaw,
- reachy.r_arm.r_elbow_pitch,
- reachy.r_arm.r_forearm_yaw,
- reachy.r_arm.r_wrist_pitch,
- reachy.r_arm.r_wrist_roll,
-]
-```
-
-Now let's define our frequency and record duration:
-
-```python
-sampling_frequency = 100 # in Hz
-record_duration = 5 # in sec.
-```
-
-Our record loop can then be defined as such:
-
-```python
-import time
-
-trajectories = []
-
-start = time.time()
-while (time.time() - start) < record_duration:
- # We here get the present position for all of recorded joints
- current_point = [joint.present_position for joint in recorded_joints]
- # Add this point to the already recorded trajectories
- trajectories.append(current_point)
-
- time.sleep(1 / sampling_frequency)
-```
-If you want to record a demonstration on the robot, first make sure the robot is compliant. Then, put it in the starting position. Run the code, and start moving the robot. After 5 seconds, the loop will stop and the movements you have made on Reachy will be recorded.
-
-Depending on your uses, you can define another duration. You can also choose not to use a specify duration but maybe use start and stop event to record. In such case, the easy way is probably to run the loop within a thread or an asynchronous fonction, so it can run in background.
-
-## Visualise your recordings
-
-The trajectories you recorded can be converted to numpy array for more complex processings:
-
-```python
-import numpy as np
-
-traj_array = np.array(trajectories)
-```
-
-If you are familiar with matplotlib, you can also plot it via:
-
-```python
-from matplotlib import pyplot as plt
-
-plt.figure()
-plt.plot(trajectories)
-```
-
-## Replay a recorded trajectory
-
-Replaying the recorded trajectory basically uses the same loop but set the goal position instead of reading the present position.
-
-But before actually replaying the trajectory, there are a few key points that you should take care of:
-
-- First, make sure the joints you are going to move are stiff.
-- Then, if the arm is not in the same position than the one you use as a start position of your recording, the beginning of the replay will be really brutal. It will try to go to the starting position as fast as possible.
-
-To avoid that, you can use the goto function to first go to the first point of your trajectories:
-
-```python
-from reachy_sdk.trajectory import goto
-
-# Set all used joint stiff
-for joint in recorded_joints:
- joint.compliant = False
-
-# Create a dict associating a joint to its first recorded position
-first_point = dict(zip(recorded_joints, trajectories[0]))
-
-# Goes to the start of the trajectory in 3s
-goto(first_point, duration=3.0)
-```
-
-Now that we are in position, we can actually play the trajectory. To do that, we simply loop over our recordings and set the goal position of each joints at the same frequency:
-
-```python
-import time
-
-for joints_positions in trajectories:
- for joint, pos in zip(recorded_joints, joints_positions):
- joint.goal_position = pos
-
- time.sleep(1 / sampling_frequency)
+---
+title: "7. Record and replay trajectories"
+description: "Record and replay trajectories using the Python SDK"
+lead: "Record and replay trajectories"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ developing-with-reachy-2:
+ parent: "SDK basics"
+weight: 260
+toc: true
+---
+
+
+You can easily record joint trajectories directly on Reachy, store and replay them later. This page will show you how to implement such mechanisms.
+
+All the examples below record trajectories on each of the robot's joints. The position of each motor is stored at a predefined frequency (typically 100Hz). Similarly, the replay sets new target positions at the same frequency. These basic examples do not perform any filtering or modification of the data.
+
+In the following examples, we will assume that you are already connected to your robot and know how to control individual motors.
+
+## Recording a trajectory
+
+To record a trajectory, we will simply get the current position of individual motors at a predefined frequency. We first define the list of motors we want to record. In this example, we only record the joints of the right arm, but you can similarly record a single motor, or all motors of the robot at once.
+
+```python
+# assuming we run something like this before:
+# reachy = ReachySDK(host='192.168.0.42')
+
+recorded_joints = [
+ reachy.r_arm.r_shoulder_pitch,
+ reachy.r_arm.r_shoulder_roll,
+ reachy.r_arm.r_arm_yaw,
+ reachy.r_arm.r_elbow_pitch,
+ reachy.r_arm.r_forearm_yaw,
+ reachy.r_arm.r_wrist_pitch,
+ reachy.r_arm.r_wrist_roll,
+]
+```
+
+Now let's define our frequency and record duration:
+
+```python
+sampling_frequency = 100 # in Hz
+record_duration = 5 # in sec.
+```
+
+Our record loop can then be defined as such:
+
+```python
+import time
+
+trajectories = []
+
+start = time.time()
+while (time.time() - start) < record_duration:
+ # We here get the present position for all of recorded joints
+ current_point = [joint.present_position for joint in recorded_joints]
+ # Add this point to the already recorded trajectories
+ trajectories.append(current_point)
+
+ time.sleep(1 / sampling_frequency)
+```
+If you want to record a demonstration on the robot, first make sure the robot is compliant. Then, put it in the starting position. Run the code, and start moving the robot. After 5 seconds, the loop will stop and the movements you have made on Reachy will be recorded.
+
+Depending on your use case, you can define another duration. You can also choose not to use a fixed duration, but instead use start and stop events to delimit the recording. In that case, the easiest way is probably to run the loop within a thread or an asynchronous function, so it can run in the background.
+
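Note that `time.sleep(1 / sampling_frequency)` ignores the time spent reading positions, so long recordings slowly drift behind the intended rate. A drift-compensated variant schedules each sample against the start time (sketch with a dummy read — replace it with your joint reads):

```python
import time

sampling_frequency = 100  # Hz
record_duration = 0.5     # seconds (short, for the example)

samples = []
start = time.time()
tick = 0
while time.time() - start < record_duration:
    # Replace this with: [joint.present_position for joint in recorded_joints]
    samples.append(time.time() - start)
    tick += 1
    # Sleep until the next scheduled tick instead of a fixed 1/f
    next_tick = start + tick / sampling_frequency
    time.sleep(max(0.0, next_tick - time.time()))
```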
+## Visualise your recordings
+
+The trajectories you recorded can be converted to a numpy array for more complex processing:
+
+```python
+import numpy as np
+
+traj_array = np.array(trajectories)
+```
+
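If you want to store a recording for later replay, numpy can round-trip the array to disk (the file name here is just an example):

```python
import os
import tempfile

import numpy as np

trajectories = [[0.0, 10.0, -5.0], [1.0, 11.0, -4.5]]  # stand-in for a real recording
traj_array = np.array(trajectories)

path = os.path.join(tempfile.gettempdir(), 'right_arm_traj.npy')
np.save(path, traj_array)
loaded = np.load(path)
```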
+If you are familiar with matplotlib, you can also plot it via:
+
+```python
+from matplotlib import pyplot as plt
+
+plt.figure()
+plt.plot(trajectories)
+```
+
+## Replay a recorded trajectory
+
+Replaying the recorded trajectory basically uses the same loop, but sets the goal position instead of reading the present position.
+
+But before actually replaying the trajectory, there are a few key points that you should take care of:
+
+- First, make sure the joints you are going to move are stiff.
+- Then, if the arm is not in the same position as the one used at the start of your recording, the beginning of the replay will be very abrupt: the arm will try to reach the starting position as fast as possible.
+
+To avoid this, you can use the goto function to first reach the first point of your trajectory:
+
+```python
+from reachy_sdk.trajectory import goto
+
+# Set all used joints stiff
+for joint in recorded_joints:
+ joint.compliant = False
+
+# Create a dict associating a joint to its first recorded position
+first_point = dict(zip(recorded_joints, trajectories[0]))
+
+# Go to the start of the trajectory in 3s
+goto(first_point, duration=3.0)
+```
+
+Now that we are in position, we can actually play the trajectory. To do that, we simply loop over our recordings and set the goal position of each joint at the same frequency:
+
+```python
+import time
+
+for joints_positions in trajectories:
+ for joint, pos in zip(recorded_joints, joints_positions):
+ joint.goal_position = pos
+
+ time.sleep(1 / sampling_frequency)
```
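
If you want to replay faster or slower than recorded, you can scale the waiting period between points (a minimal sketch under the same assumptions; `replay` is a hypothetical helper, not part of the SDK):

```python
import time

def replay(trajectories, joints, sampling_frequency=100, speed=1.0):
    """Replay recorded points on the given joints.

    speed > 1 plays the trajectory faster than recorded, speed < 1
    slower. Illustrative helper; assumes joints are already stiff
    and near the first recorded point.
    """
    period = 1 / (sampling_frequency * speed)
    for joints_positions in trajectories:
        for joint, pos in zip(joints, joints_positions):
            joint.goal_position = pos
        time.sleep(period)
```

Keep the speed factor moderate: playing a demonstration much faster than it was recorded can produce jerky, potentially unsafe motions.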
\ No newline at end of file
diff --git a/content/sdk/first-moves/mobile-base.md b/content/developing-with-reachy-2/basics/8-use-mobile-base.md
similarity index 90%
rename from content/sdk/first-moves/mobile-base.md
rename to content/developing-with-reachy-2/basics/8-use-mobile-base.md
index 7b600d06..d4f0223f 100644
--- a/content/sdk/first-moves/mobile-base.md
+++ b/content/developing-with-reachy-2/basics/8-use-mobile-base.md
@@ -1,16 +1,20 @@
---
title: "8. Use the mobile base"
-description: "How to use the mobile base."
-lead: ""
-date: 2023-07-25T17:38:34+02:00
-lastmod: 2023-07-25T17:38:34+02:00
+description: "First mobile base movements using the Python SDK"
+lead: "First mobile base movements"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
draft: false
-type: docs
images: []
+type: docs
+menu:
+ developing-with-reachy-2:
+ parent: "SDK basics"
+weight: 270
toc: true
-weight: "140"
---
+
## What is accessible on the mobile base
The following elements are accessible with *reachy.mobile_base*:
* mobile base version,
@@ -42,7 +46,7 @@ The initial position of the odom frame matches the position of the robot when th
## Moving the mobile base
### Using the goto method
-The `goto()` method expects a goal position in the [odom frame]({{< ref "/sdk/first-moves/mobile-base#odom-frame" >}}), composed of 3 elements: x in meters, y in meters and theta in degrees.
+The `goto()` method expects a goal position in the [odom frame]({{< ref "/developing-with-reachy-2/basics/8-use-mobile-base#odom-frame" >}}), composed of 3 elements: x in meters, y in meters and theta in degrees.
:warning: The most important thing to get used to, is the fact that the odom frame is world-fixed and that the position of the robot is always updated as long as the HAL is running (the HAL is automatically started during the robot boot-up). So by default, **if you ask for a ```goto(0, 0, 0)``` the robot will try to comeback to the position it was at boot-up.**
@@ -66,7 +70,7 @@ reachy_mobile.mobile_base.goto(x=0.0, y=0.0, theta=0.0)
We recommend taking the time to play around with this concept.
-> Note the **goto() method of the mobile base does not work like [moves methods explained previously]({{< ref "/sdk/first-moves/moves">}})**
+> Note the **goto() method of the mobile base does not work like [moves methods explained previously]({{< ref "/developing-with-reachy-2/basics/8-use-mobile-base">}})**
By default, the robot will always try to reach the goal position, meaning that even if the robot did reach its position and you push it, it will try to come back to the goal position again.
diff --git a/content/developing-with-reachy-2/basics/_index.md b/content/developing-with-reachy-2/basics/_index.md
new file mode 100644
index 00000000..a88285ce
--- /dev/null
+++ b/content/developing-with-reachy-2/basics/_index.md
@@ -0,0 +1,13 @@
+---
+title: "SDK basics"
+description: "Discover the basics of the Python SDK"
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+type: docs
+menu:
+ developing-with-reachy-2:
+weight: 20
+---
diff --git a/content/developing-with-reachy-2/getting-started-sdk/_index.md b/content/developing-with-reachy-2/getting-started-sdk/_index.md
new file mode 100644
index 00000000..dbfbe699
--- /dev/null
+++ b/content/developing-with-reachy-2/getting-started-sdk/_index.md
@@ -0,0 +1,13 @@
+---
+title: "Getting started with the SDK"
+description: "First steps with Reachy 2 SDK"
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+type: docs
+menu:
+ developing-with-reachy-2:
+weight: 10
+---
diff --git a/content/sdk/getting-started/connect.md b/content/developing-with-reachy-2/getting-started-sdk/connect-reachy2.md
similarity index 68%
rename from content/sdk/getting-started/connect.md
rename to content/developing-with-reachy-2/getting-started-sdk/connect-reachy2.md
index e1b777b6..843d112c 100644
--- a/content/sdk/getting-started/connect.md
+++ b/content/developing-with-reachy-2/getting-started-sdk/connect-reachy2.md
@@ -1,33 +1,38 @@
----
-title: "Connect to Reachy 2"
-description: ""
-date: 2023-07-25T18:49:56+02:00
-lastmod: 2023-07-25T18:49:56+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "30"
----
-
-The last required step before being able to use your Reachy 2 is to find its IP address.
-
-> Note: if you haven't connected Reachy to a network yet, please first follow the instructions ???
-
-## Using the LCD screen
-
-If you haven't unplugged it, the LCD screen connected in Reachy's back should be diplaying its IP address.
-
-{{< img-center "images/sdk/getting-started/lcd-display.png" 400x "" >}}
-
-If the LCD screen is not working or is unplugged, check out the page [Find my IP section]({{< ref "help/system/find-my-ip" >}}) to learn other ways to get the IP address.
-
-
-You can check that everything is working as expected by running the following Python code:
-
-```python
-from reachy_sdk import ReachySDK
-
-# Replace with the actual IP you've found.
-reachy = ReachySDK(host='the.reachy.ip.found.')
-```
+---
+title: "Connect to Reachy 2"
+description: "Establish a connection to the robot with the Python SDK"
+lead: "Establish a connection to the robot with the Python SDK"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ developing-with-reachy-2:
+ parent: "Getting started with the SDK"
+weight: 110
+toc: true
+---
+
+
+The last required step before being able to use your Reachy 2 is to find its IP address.
+
+> Note: if you haven't connected Reachy to a network yet, please first follow the instructions ???
+
+## Using the LCD screen
+
+If you haven't unplugged it, the LCD screen connected to Reachy's back should be displaying its IP address.
+
+{{< img-center "images/sdk/getting-started/lcd-display.png" 400x "" >}}
+
+If the LCD screen is not working or is unplugged, check out the Find my IP section to learn other ways to get the IP address.
+
+
+You can check that everything is working as expected by running the following Python code:
+
+```python
+from reachy_sdk import ReachySDK
+
+# Replace with the actual IP you've found.
+reachy = ReachySDK(host='the.reachy.ip.found.')
+```
diff --git a/content/sdk/getting-started/install.md b/content/developing-with-reachy-2/getting-started-sdk/installation.md
similarity index 81%
rename from content/sdk/getting-started/install.md
rename to content/developing-with-reachy-2/getting-started-sdk/installation.md
index def5ac61..c63948d1 100644
--- a/content/sdk/getting-started/install.md
+++ b/content/developing-with-reachy-2/getting-started-sdk/installation.md
@@ -1,44 +1,48 @@
----
-title: "Installation"
-description: "How to install the Python SDK, either from PyPi or directly from sources."
-lead: ""
-date: 2023-07-25T18:50:10+02:00
-lastmod: 2023-07-25T18:50:10+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "20"
----
-
-## How to install the Python SDK
-
-The Python SDK is a pure Python library. The installation should thus be rather straightforward.
-
-It supports Python >= 3.10 (older versions will not work because of typing syntax). It works on Windows/Mac/Linux.
-
-We recommend to use [virtual environment](https://docs.python.org/3/tutorial/venv.html) for your development. They make the installation simple and avoid compatibility issues. They also come with their [pip](https://pip.pypa.io/en/stable/) command.
-
-### From PyPi
-
-```bash
-pip install reachy2-sdk
-```
-
-### From the source
-
-```bash
-git clone https://github.com/pollen-robotics/reachy2-sdk
-cd reachy2-sdk
-pip install -e reachy2-sdk
-```
-
-## Dependencies
-
-The SDK relies on a few third-party Python packages, such as:
-
-* [numpy](https://numpy.org) - mostly for trajectory computation
-* [opencv](https://opencv.org) - for camera frame access
-* [grpc](https://grpc.io) - to connect to the robot
-
-They will be **installed automatically** when you install the SDK.
+---
+title: "Installation"
+description: "Install the Python SDK for Reachy 2"
+lead: "Install the Python SDK for Reachy 2"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ developing-with-reachy-2:
+ parent: "Getting started with the SDK"
+weight: 100
+toc: true
+---
+
+
+## How to install the Python SDK
+
+The Python SDK is a pure Python library. The installation should thus be rather straightforward.
+
+It supports Python >= 3.10 (older versions will not work because of typing syntax). It works on Windows/Mac/Linux.
+
+We recommend using a [virtual environment](https://docs.python.org/3/tutorial/venv.html) for your development. Virtual environments make the installation simple, avoid compatibility issues, and come with their own [pip](https://pip.pypa.io/en/stable/) command.
+
+### From PyPi
+
+```bash
+pip install reachy2-sdk
+```
+
+### From the source
+
+```bash
+git clone https://github.com/pollen-robotics/reachy2-sdk
+cd reachy2-sdk
+pip install -e reachy2-sdk
+```
+
+## Dependencies
+
+The SDK relies on a few third-party Python packages, such as:
+
+* [numpy](https://numpy.org) - mostly for trajectory computation
+* [opencv](https://opencv.org) - for camera frame access
+* [grpc](https://grpc.io) - to connect to the robot
+
+They will be **installed automatically** when you install the SDK.
diff --git a/content/developing-with-reachy-2/getting-started-sdk/visualize-fake-robot.md b/content/developing-with-reachy-2/getting-started-sdk/visualize-fake-robot.md
new file mode 100644
index 00000000..cedaf280
--- /dev/null
+++ b/content/developing-with-reachy-2/getting-started-sdk/visualize-fake-robot.md
@@ -0,0 +1,17 @@
+---
+title: "Visualize with fake robot"
+description: "Use a fake robot mode to test your moves before using the real robot"
+lead: "Use a fake robot mode to test your moves before using the real robot"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ developing-with-reachy-2:
+ parent: "Getting started with the SDK"
+weight: 120
+toc: true
+---
+
+Ohlala it's safe now
\ No newline at end of file
diff --git a/content/developing-with-reachy-2/simulation/_index.md b/content/developing-with-reachy-2/simulation/_index.md
new file mode 100644
index 00000000..8ee8dddf
--- /dev/null
+++ b/content/developing-with-reachy-2/simulation/_index.md
@@ -0,0 +1,13 @@
+---
+title: "Simulation"
+description: "Use gazebo to simulate the robot"
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+type: docs
+menu:
+ developing-with-reachy-2:
+weight: 40
+---
diff --git a/content/docs/simulation/simulation-installation.md b/content/developing-with-reachy-2/simulation/simulation-installation.md
similarity index 84%
rename from content/docs/simulation/simulation-installation.md
rename to content/developing-with-reachy-2/simulation/simulation-installation.md
index 67252890..8f4f3b45 100644
--- a/content/docs/simulation/simulation-installation.md
+++ b/content/developing-with-reachy-2/simulation/simulation-installation.md
@@ -1,51 +1,56 @@
----
-title: "Simulation installation"
-description: "Simulation installation process."
-lead: "How to install a simulated Reachy 2 on your computer."
-date: 2023-08-09T14:45:14+02:00
-lastmod: 2023-08-09T14:45:14+02:00
-draft: false
-images: []
-toc: true
-weight: "70"
----
-If you want to try movements on the robot without using the real robot, you can install a simulated Reachy 2 on your computer, and run it the same way the real robot is run. The easiest way is using a docker image. We will thus assume that you already have docker installed and setup.
-
-Clone the sources of our docker, and pull the sources:
-```python
-git clone git@github.com:pollen-robotics/docker_reachy2_core.git
-cd docker_reachy2_core
-./pull_sources.sh beta
-```
-
-Then download the configuration files:
-```python
-git clone git@github.com:pollen-robotics/reachy_config_example.git
-cp -r reachy_config_example/.reachy_config ~/
-```
-
-In your docker_reachy2_core folder, compose a container with:
-```python
-docker compose -f dev.yaml up -d core
-```
-> This can take a few minutes to compose.
-
-Build:
-```python
-cbuilds
-```
-
-
-In a first terminal, launch the robot server:
-```python
-# terminal 1
-docker exec -it core bash
-ros2 launch reachy_bringup reachy.launch.py fake:=true start_sdk_server:=true start_rviz:=true
-```
-Keep this terminal open, and in a second terminal:
-```python
-# terminal 2
-docker exec -it core bash
-python3 ../dev/reachy2-sdk/src/example/test_goto.py
-```
-> If you have the Python SDK installed on your computer, you can launch the example outside the container.
+---
+title: "Simulation installation"
+description: "How to install a simulation of Reachy 2"
+lead: ""
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ developing-with-reachy-2:
+ parent: "Simulation"
+weight: 400
+toc: true
+---
+
+If you want to try movements without using the real robot, you can install a simulated Reachy 2 on your computer and run it the same way the real robot is run. The easiest way is to use a docker image, so we will assume that you already have docker installed and set up.
+
+Clone our docker repository, and pull the sources:
+```bash
+git clone git@github.com:pollen-robotics/docker_reachy2_core.git
+cd docker_reachy2_core
+./pull_sources.sh beta
+```
+
+Then download the configuration files:
+```bash
+git clone git@github.com:pollen-robotics/reachy_config_example.git
+cp -r reachy_config_example/.reachy_config ~/
+```
+
+In your docker_reachy2_core folder, compose a container with:
+```bash
+docker compose -f dev.yaml up -d core
+```
+> This can take a few minutes to compose.
+
+Build:
+```bash
+cbuilds
+```
+
+
+In a first terminal, launch the robot server:
+```bash
+# terminal 1
+docker exec -it core bash
+ros2 launch reachy_bringup reachy.launch.py fake:=true start_sdk_server:=true start_rviz:=true
+```
+Keep this terminal open, and in a second terminal:
+```bash
+# terminal 2
+docker exec -it core bash
+python3 ../dev/reachy2-sdk/src/example/test_goto.py
+```
+> If you have the Python SDK installed on your computer, you can launch the example outside the container.
diff --git a/content/docs/_index.md b/content/docs/_index.md
deleted file mode 100644
index 8d6f4a8b..00000000
--- a/content/docs/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title : "Docs"
-description: "Learn how to install and get started with Reachy."
-lead: ""
-date: 2020-10-06T08:48:23+00:00
-lastmod: 2020-10-06T08:48:23+00:00
-draft: false
-images: []
----
diff --git a/content/docs/advanced/_index.md b/content/docs/advanced/_index.md
deleted file mode 100644
index f5ab8a7f..00000000
--- a/content/docs/advanced/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title : "Advanced usage"
-description: "Advanced usage with Reachy 2."
-lead: ""
-date: 2023-07-25T14:29:14+02:00
-lastmod: 2023-07-25T14:29:14+02:00
-draft: false
-images: []
----
diff --git a/content/docs/advanced/access-computer.md b/content/docs/advanced/access-computer.md
deleted file mode 100644
index 832d335b..00000000
--- a/content/docs/advanced/access-computer.md
+++ /dev/null
@@ -1,61 +0,0 @@
----
-title: "Access Reachy 2 computer"
-description: "How to connect to the robot's embedded computer"
-lead: "How to connect to the robot's embedded computer"
-date: 2023-08-09T14:43:24+02:00
-lastmod: 2023-08-09T14:43:24+02:00
-draft: false
-images: []
-toc: true
-weight: "90"
----
-There are several ways to connect to your robot.
-
-## SSH connection
-Using the robot's IP address (check [Find Reachy 2's IP]({{< ref "help/system/find-my-ip" >}}) if you don't know it), you can directly connect via ssh to Reachy 2's computer:
-
-```python
-ssh bedrock@
-```
-
-> For example, with robot's IP being 192.168.1.42:
-> ```python
-> ssh bedrock@192.168.1.42
-> ```
-
-{{< alert icon="👉" text="Password: root" >}}
-
-## Hard-wired connection
-
-Use the appropriate cable and connect your computer directly to Reachy 2's computer. The cable has to be plugged in port (b) of Reachy 2's hardware interface.
-
-{{< img-center "images/docs/advanced/serial-connection.png" 500x "Serial connection port" >}}
-
-We use `tio`for the serial connection. If you haven't installed it yet on your computer:
-`apt install tio`
-
-{{< alert icon="👉" text="Make sure dialout is in your groups, otherwise add it to your groups. To check it:
>>> groups
If it doesn't appear in the list, add it with:
>>> sudo usermod -aG dialout $USER
Then reboot your computer for the new group to be effective." >}}
-
-Once connected, open a terminal on your computer and run:
-```python
-tio /dev/ttyUSB0
-```
-*Note that depending on the elements you connected to the robot, the port could be something else than ttyUSB0. Check other available serial ports with `ls /dev/ttyUSB*`*
-
-{{< img-center "images/docs/advanced/tio-terminal.png" 500x "Tio connection port" >}}
-
-{{< alert icon="👉" text="Login: bedrock
Password: root" >}}
-
-You are then connected to Reachy 2 computer!
-
-## Avahi connection
-
-Find the serial number of your robot on its back, connect your computer on the same network as your robot, open a terminal and type:
-```bash
-ping .local
-```
-
->For example, if the serial number is reachy2-beta1:
->```bash
->ping reachy2-beta1.local
->```
\ No newline at end of file
diff --git a/content/docs/getting-started/_index.md b/content/docs/getting-started/_index.md
deleted file mode 100644
index 31e6155a..00000000
--- a/content/docs/getting-started/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: "Getting Started"
-description: "On first start."
-lead: ""
-date: 2023-07-25T14:15:43+02:00
-lastmod: 2023-07-25T14:15:43+02:00
-draft: false
-images: []
----
diff --git a/content/docs/getting-started/dashboard.md b/content/docs/getting-started/dashboard.md
deleted file mode 100644
index 356d60d6..00000000
--- a/content/docs/getting-started/dashboard.md
+++ /dev/null
@@ -1,37 +0,0 @@
----
-title: "Connect to the dashboard"
-description: ""
-lead: ""
-date: 2023-08-09T14:43:48+02:00
-lastmod: 2023-08-09T14:43:48+02:00
-draft: false
-images: []
-toc: true
-weight: "50"
----
-The dashboard is here to give you an overview of the robot's state (what services are running, is there an error on a motor,...) and give you the possibility to access quickly some features (changing a robot's part compliance for example).
-
-This tool has been thought to help you **start easier with the robot** and **facilitate quick debugging**.
-
-## 1. Find Reachy 2's IP address
-
-The LCD screen connected in Reachy's back should be diplaying its IP address.
-
-{{< img-center "images/vr/getting-started/lcd-display.png" 400x "" >}}
-
-If the LCD screen is not working or is unplugged, check out the page [Find my IP section]({{< ref "help/system/find-my-ip" >}}) to learn other ways to get the IP address.
-
-> Note the LCD screen will not work if you plug it after having turned on the computer.
-
-## 2. Connect from the navigator
-
-From your computer, on the same network, open a navigator and go to:
-**`http://:8000/`**
-
-> For example, if the screen indicates `192.168.1.42`, connect to `http://192.168.1.42:8000/`
-
-You should arrive on a services page:
-
-{{< img-center "images/docs/getting-started/dashboard.png" 600x "dashboard" >}}
-
-Usage of the dashboard is detailed in the next sections.
diff --git a/content/docs/getting-started/hello-world.md b/content/docs/getting-started/hello-world.md
deleted file mode 100644
index cad309e2..00000000
--- a/content/docs/getting-started/hello-world.md
+++ /dev/null
@@ -1,62 +0,0 @@
----
-title: "Hello World"
-description: "First robot use."
-lead: "Is everything working fine?"
-date: 2023-08-09T14:44:11+02:00
-lastmod: 2023-08-09T14:44:11+02:00
-draft: false
-images: []
-toc: true
-weight: "60"
----
-## 1. Check services are running
-
-All elements of the robots should have started automatically.
-Check this is the case on [the dashboard]({{< ref "/docs/getting-started/dashboard" >}})
-
-2 services must be launched:
-- **reachy2-core**
-- **reachy2-webrtc**
-
-Click on **Logs** for both services to check they have correctly started.
-You should see content appearing under the services tab:
-
-{{< img-center "images/docs/getting-started/dashboard-services.png" 600x "services" >}}
-
-> If you see any error, click on **Restart** to restart them.
-
-## 2. Try sending commands
-
-(Temporary checkup)
-
-To check everything is working fine, you can use the examples of the Python SDK.
-
-### Clone reachy2-sdk
-
-Clone reachy2-sdk repository from Github on your computer.
-
-### Try the jupyter notebook examples
-
-Then go to `reachy2-sdk/src/examples/`, and try the two first jupyter notebooks:
-- 1_getting_started
-- 2_moves_introduction
-
-Check you manage to connect, to get data from the robot and to make it move.
-
-> We do not test the cameras from the Python SDK so far, because they can be accessed only by one service at the same time, and the webrtc service is already running by default.
-
-
diff --git a/content/docs/simulation/_index.md b/content/docs/simulation/_index.md
deleted file mode 100644
index f600f2fc..00000000
--- a/content/docs/simulation/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: "Simluation installation"
-description: "Simulation installation process."
-lead: ""
-date: 2023-07-25T15:04:00+02:00
-lastmod: 2023-07-25T15:04:00+02:00
-draft: false
-images: []
----
diff --git a/content/docs/update/_index.md b/content/docs/update/_index.md
deleted file mode 100644
index 18e80c39..00000000
--- a/content/docs/update/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: "Update Reachy"
-description: "Learn how to update Reachy's software."
-lead: ""
-date: 2023-07-25T15:19:04+02:00
-lastmod: 2023-07-25T15:19:04+02:00
-draft: false
-images: []
----
diff --git a/content/getting-started/_index.md b/content/getting-started/_index.md
new file mode 100644
index 00000000..d577c879
--- /dev/null
+++ b/content/getting-started/_index.md
@@ -0,0 +1,10 @@
+---
+title: "Getting started"
+description: "Assemble and start using your Reachy 2."
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+type: docs
+---
diff --git a/content/dashboard/_index.md b/content/getting-started/dashboard/_index.md
similarity index 54%
rename from content/dashboard/_index.md
rename to content/getting-started/dashboard/_index.md
index a992677b..78ba1f00 100644
--- a/content/dashboard/_index.md
+++ b/content/getting-started/dashboard/_index.md
@@ -1,10 +1,13 @@
----
-title: "Dashboard"
-description: "Use the dashboard to check Reachy's status, debug Reachy's issues and start applications."
-lead: ""
-date: 2023-07-25T15:34:02+02:00
-lastmod: 2023-07-25T15:34:02+02:00
-draft: false
-images: []
-type: docs
----
+---
+title: "Dashboard"
+description: "Overview of the dashboard content, an insightful tool."
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+type: docs
+menu:
+ getting-started:
+weight: 40
+---
diff --git a/content/getting-started/dashboard/discover.md b/content/getting-started/dashboard/discover.md
new file mode 100644
index 00000000..e5f81ed9
--- /dev/null
+++ b/content/getting-started/dashboard/discover.md
@@ -0,0 +1,65 @@
+---
+title: "Discover the dashboard"
+description: "Understand the dashboard features"
+lead: ""
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ getting-started:
+ parent: "Dashboard"
+weight: 400
+toc: true
+---
+
+
+## 1. Find Reachy 2's IP address
+
+Once the robot is connected to the network, it has an IP address. The LCD screen connected to Reachy's back should be displaying it.
+
+{{< img-center "images/vr/getting-started/lcd-display.png" 400x "" >}}
+
+If the LCD screen is not working or is unplugged, check out the Find my IP section to learn other ways to get the IP address.
+> Note the LCD screen will not work if you plug it after having turned on the computer.
+
+## 2. Connect from the navigator
+
+From your computer, on the same network, open a navigator and go to:
+**`http://:8000/`**
+
+> For example, if the screen indicates `192.168.1.42`, connect to `http://192.168.1.42:8000/`
+
+You should arrive on a services page:
+
+{{< img-center "images/docs/getting-started/dashboard.png" 600x "dashboard" >}}
+
+
+This tool is designed to help you **get started more easily with the robot** and to **facilitate quick debugging**.
+
+The dashboard gives you an overview of the robot's state, as well as quick access to some features (changing the compliance of a robot's part, for example).
+
+## Features Overview
+
+What does the dashboard provide?
+
+* **Access the services** - **Services page**
+Stop or restart the robot's services, see robot logs *(coming soon)*.
+
+* **Manage network connection** - **Network page**
+Choose a wifi network to connect the robot to.
+
+* **Update robot software** - **Updates page**
+Get the last software versions of the robot, and choose the services you want to update.
+
+* **Visualize robot state** - **Visualization tools page**
+Get RViz visualization or even display live data from ROS topics with Foxglove.
+
+* **Send robot commands** - **Reachy control page**
+*Coming soon*
+
+
+On each page, the **serial number** of your robot is also displayed.
+
+More information is available for each page in the content section.
\ No newline at end of file
diff --git a/content/getting-started/safety-first/_index.md b/content/getting-started/safety-first/_index.md
new file mode 100644
index 00000000..f5fb33bb
--- /dev/null
+++ b/content/getting-started/safety-first/_index.md
@@ -0,0 +1,13 @@
+---
+title: "Safety first"
+description: "Mandatory safety guidelines before using the robot."
+date: 2023-07-25T15:34:19+02:00
+lastmod: 2023-07-25T15:34:19+02:00
+draft: false
+images: []
+type: docs
+menu:
+ getting-started:
+weight: 10
+toc: true
+---
diff --git a/content/docs/getting-started/safety.md b/content/getting-started/safety-first/safety-guidelines.md
similarity index 92%
rename from content/docs/getting-started/safety.md
rename to content/getting-started/safety-first/safety-guidelines.md
index e0ddd3ac..42035a02 100644
--- a/content/docs/getting-started/safety.md
+++ b/content/getting-started/safety-first/safety-guidelines.md
@@ -1,15 +1,20 @@
---
title: "Safety guidelines"
-description: "Use Reachy 2 properly."
-lead: "Compulsory reading before using the robot"
-date: 2023-08-09T14:44:05+02:00
-lastmod: 2023-08-09T14:44:05+02:00
+description: "Safety guidelines as mandatory reading"
+lead: "Everything you must know before using Reachy 2 for a safe experience with the robot"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
draft: false
images: []
+type: docs
+menu:
+ getting-started:
+ parent: "Safety first"
+weight: 100
toc: true
-weight: "20"
---
+
{{< warning icon="👉🏾" text="Reachy 2 is much more powerful than the previous version. To avoid any accident, please follow carefully the safety guidelines!" >}}
> There is currently **no automatic collision security** on the robot: it won't stop if hitting anything or anyone, even itself. Remain constantly watchful when using it.
@@ -20,7 +25,7 @@ weight: "20"
Users must be in **full possession of their physical and mental powers at all times** when using the robot. Reachy 2 must never be used by someone having consumed substances that could affect their reactions, such as medication, drugs or alcohol.
-Users must **keep attention focused** on the robot at any time, especially if they are near the robot workspace, and imperatively if they are in its workspace or if they are responsible for the [emergency stop button]({{< ref "/sdk/getting-started/safety#emergency-stop-button" >}}).
+Users must **keep attention focused** on the robot at any time, especially if they are near the robot workspace, and imperatively if they are in its workspace or if they are responsible for the emergency stop button.
### Qualified users
@@ -93,7 +98,7 @@ Never make any hardware intervention on the robot, such as screwing on unscrewin
### Robot toppling risk
-The following section ["...and don't harm Reachy 2!"]({{< ref "/sdk/getting-started/safety#and-dont-harm-reachy-2" >}}) mainly describes risks of robot toppling or collision. This may damage the robot, but also harm anyone near to the robot.
+The following section ["...and don't harm Reachy 2!"]({{< ref "/getting-started/safety-first/safety-guidelines#and-dont-harm-reachy-2" >}}) mainly describes risks of robot toppling or collision. This may damage the robot, but also harm anyone near to the robot.
**All events of the following section can lead to users injuries**, so read them as users safety guidelines as well.
## ...and don't harm Reachy 2!
diff --git a/content/getting-started/setup-reachy2/_index.md b/content/getting-started/setup-reachy2/_index.md
new file mode 100644
index 00000000..2f4c4583
--- /dev/null
+++ b/content/getting-started/setup-reachy2/_index.md
@@ -0,0 +1,13 @@
+---
+title: "Setup Reachy 2"
+description: "From assembling to your first robot move."
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+type: docs
+menu:
+ getting-started:
+weight: 20
+---
diff --git a/content/docs/getting-started/unpack.md b/content/getting-started/setup-reachy2/assemble-and-plug.md
similarity index 90%
rename from content/docs/getting-started/unpack.md
rename to content/getting-started/setup-reachy2/assemble-and-plug.md
index 512d7bba..e5db8d99 100644
--- a/content/docs/getting-started/unpack.md
+++ b/content/getting-started/setup-reachy2/assemble-and-plug.md
@@ -1,70 +1,75 @@
----
-title: "Unpack Reachy 2"
-description: "Unpack your robot to start with Reachy 2."
-lead: "How to unpack and start with Reachy 2"
-date: 2023-08-09T14:43:24+02:00
-lastmod: 2023-08-09T14:43:24+02:00
-draft: false
-images: []
-toc: true
-weight: "10"
----
-
-## Your robot is nearly already assembled!
-
-{{< alert icon="👉" text="The robot weigh is around 50kg (110lb). You will need to be at least 3 to carry the robot out of the box. Wear suitable personal protective equipment (e.g. safety shoes) when unpacking the robot. When lifting Reachy 2, pay attention to lift correctly using with your legs, to avoid back injury. Be also aware of your fingers position on the robot." >}}
-
-Unpack your robot, you just have a few things to plug to finish assembling it.
-Check your box contains, in addition to the robot, the following elements:
-- a battery charger
-- a calibration board
-- wifi antennas
-- a mini USB cable
-- an emergency stop button
-- robot antennas screws
-
-{{< alert icon="👉" text="Make a visual check of all the robot to check nothing seems damage after the travel, especially the cables. In case of doubt on any element, please contact us. Do not use the robot if something is damaged." >}}
-
-## Adjust robot size
-
-Reachy 2 is mounted on its mobile base with a tripod for stability.
-
-This tripod is **adjustable in height**: choose a suitable height before starting using the robot.
-
-{{< alert icon="👉" text="Be careful, Reachy 2 is heavy. Ask for help to adjust robot size." >}}
-
-To do so, maintain the mobile base and ask for someone to maintain the robot's torso. Then unscrew all pods (do not unscrew them too much!), and raise the robot's torso.
-
-> The button on the cranks are here to modify their positions without unscrewing them.
-
-## Assemble the last elements
-
-### Screw the robot antennas
-
-Stick the antennas in the support on the head, then use the provided screws to fix them, preferably from the back hole. There is one screw by antenna.
-
-{{< img-center "images/docs/getting-started/antennas.jpg" 400x "Antennas support" >}}
-
-
-### Plug the WiFi antennas
-
-Plug the wifi antennas on the robot computer, on port 1 and 3.
-
-{{< img-center "images/docs/getting-started/plug-antennas.jpg" 300x "Wifi antennas emplacement" >}}
-
-Make sure to place them correctly so that the robot's arms cannot touch them:
-
-{{< img-center "images/docs/getting-started/wifi-antennas.png" 400x "Wifi antennas position" >}}
-
-### Plug the emergency stop button
-
-To plug the emergency state button, you will find a little black connector on the mobile base. Simply connect the button cable to it.
-
-{{< img-center "images/docs/getting-started/bau-connection.png" 300x "Connect the emergency stop button" >}}
-
-### Connect the battery
-
-First make sure the emergency stop button is pressed.
-Then plug the yellow connector of the mobile base so the robot will be powered when the emergency stop button will be unpressed.
-
-{{< img-center "images/docs/getting-started/yellow-connector.png" 300x "Power your robot" >}}
+---
+title: "Assemble & Plug Reachy 2"
+description: "Steps to assemble Reachy 2"
+lead: "Follow these steps to finish assembling your robot"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ getting-started:
+ parent: "Setup Reachy 2"
+weight: 200
+toc: true
+---
+
+
+## Your robot is nearly already assembled!
+
+{{< alert icon="👉" text="The robot weighs around 50kg (110lb). You will need at least 3 people to carry the robot out of the box. Wear suitable personal protective equipment (e.g. safety shoes) when unpacking the robot. When lifting Reachy 2, make sure to lift correctly using your legs, to avoid back injury. Also be aware of the position of your fingers on the robot." >}}
+
+Unpack your robot: only a few elements remain to be plugged in to finish assembling it.
+Check that your box contains, in addition to the robot, the following elements:
+- a battery charger
+- a calibration board
+- wifi antennas
+- a mini USB cable
+- an emergency stop button
+- robot antennas screws
+
+{{< alert icon="👉" text="Make a visual check of the whole robot to ensure nothing was damaged during transport, especially the cables. In case of doubt about any element, please contact us. Do not use the robot if something is damaged." >}}
+
+## Adjust robot size
+
+Reachy 2 is mounted on its mobile base with a tripod for stability.
+
+This tripod is **adjustable in height**: choose a suitable height before starting to use the robot.
+
+{{< alert icon="👉" text="Be careful, Reachy 2 is heavy. Ask for help to adjust the robot's height." >}}
+
+To do so, hold the mobile base and ask someone else to hold the robot's torso. Then unscrew all the pods (do not unscrew them too much!), and raise the robot's torso.
+
+> The buttons on the cranks are there to modify their positions without unscrewing them.
+
+## Assemble the last elements
+
+### Screw the robot antennas
+
+Stick the antennas in the support on the head, then use the provided screws to fix them, preferably through the back hole. There is one screw per antenna.
+
+{{< img-center "images/docs/getting-started/antennas.jpg" 400x "Antennas support" >}}
+
+
+### Plug the WiFi antennas
+
+Plug the WiFi antennas into the robot's computer, on ports 1 and 3.
+
+{{< img-center "images/docs/getting-started/plug-antennas.jpg" 300x "WiFi antennas location" >}}
+
+Make sure to place them correctly so that the robot's arms cannot touch them:
+
+{{< img-center "images/docs/getting-started/wifi-antennas.png" 400x "Wifi antennas position" >}}
+
+### Plug the emergency stop button
+
+To plug the emergency stop button, locate the little black connector on the mobile base. Simply connect the button cable to it.
+
+{{< img-center "images/docs/getting-started/bau-connection.png" 300x "Connect the emergency stop button" >}}
+
+### Connect the battery
+
+First, make sure the emergency stop button is pressed.
+Then plug in the yellow connector of the mobile base, so that the robot will be powered once the emergency stop button is released.
+
+{{< img-center "images/docs/getting-started/yellow-connector.png" 300x "Power your robot" >}}
diff --git a/content/docs/getting-started/network.md b/content/getting-started/setup-reachy2/connect-reachy2.md
similarity index 72%
rename from content/docs/getting-started/network.md
rename to content/getting-started/setup-reachy2/connect-reachy2.md
index 91146fa8..99d63712 100644
--- a/content/docs/getting-started/network.md
+++ b/content/getting-started/setup-reachy2/connect-reachy2.md
@@ -1,36 +1,41 @@
----
-title: "Connect your robot to the network"
-description: "How to connect your robot to the network."
-lead: "How to connect Reachy 2 to the network"
-date: 2023-08-09T14:44:05+02:00
-lastmod: 2023-08-09T14:44:05+02:00
-draft: false
-images: []
-toc: true
-weight: "40"
----
-> On the **first connection, connect Reachy 2 to your network using an ethernet cable**. You will then be able to choose another network using the dashboard.
-
-## Hard-wired connection
-
-Use an **ethernet cable** to connect your robot to the network.
-
-Ethernet plugs are available at position (c) of the robot's computer interface.
-Reachy 2's computer is configured to use DHCP. It should thus be directly accessible on your network.
-
-{{< img-center "images/docs/network/hardware-interface.png" 400x "hardware-interface" >}}
-
-To easily find the IP address of the robot, read the little LCD screen plugged in the back of the robot. Wait for the IP address to appear, it may take a few minutes.
-
-{{< img-center "images/vr/getting-started/lcd-display.png" 200x "lcd-display" >}}
-
-> Every 10 seconds, the screen switches between WiFi and Ethernet information.
-
-
-## WiFi
-
-After your first connection with an ethernet connection, simply use the **dashboard** to connect Reachy to WiFi.
-
-
-> **If you cannot use an ethernet connection for your first connection:**
-> {{< my-button link="/docs/getting-started/wifi/" label="How to connect to WiFi without using the dashboard?" >}}
\ No newline at end of file
+---
+title: "Connect Reachy 2"
+description: "Connect your robot to the network"
+lead: "Follow these steps to make your first connection to Reachy 2"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ getting-started:
+ parent: "Setup Reachy 2"
+weight: 220
+toc: true
+---
+
+> On the **first connection, connect Reachy 2 to your network using an ethernet cable**. You will then be able to choose another network using the dashboard.
+
+## Hard-wired connection
+
+Use an **ethernet cable** to connect your robot to the network.
+
+Ethernet plugs are available at position (c) of the robot's computer interface.
+Reachy 2's computer is configured to use DHCP. It should thus be directly accessible on your network.
+
+{{< img-center "images/docs/network/hardware-interface.png" 400x "hardware-interface" >}}
+
+To easily find the robot's IP address, read the little LCD screen plugged into the back of the robot. Wait for the IP address to appear; it may take a few minutes.
+
+{{< img-center "images/vr/getting-started/lcd-display.png" 200x "lcd-display" >}}
+
+> Every 10 seconds, the screen switches between WiFi and Ethernet information.
+
+
+## WiFi
+
+After your first connection via ethernet, simply use the **dashboard** to connect Reachy to WiFi.
+
+
+> **If you cannot use an ethernet connection for your first connection:**
+> {{< my-button link="/getting-started/setup-reachy2/connect-wifi/" label="How to connect to WiFi without using the dashboard?" >}}
+
diff --git a/content/docs/getting-started/wifi.md b/content/getting-started/setup-reachy2/connect-wifi.md
similarity index 92%
rename from content/docs/getting-started/wifi.md
rename to content/getting-started/setup-reachy2/connect-wifi.md
index ac906043..e3a26896 100644
--- a/content/docs/getting-started/wifi.md
+++ b/content/getting-started/setup-reachy2/connect-wifi.md
@@ -1,47 +1,48 @@
----
-title: "Connect your robot to the WiFi"
-description: "How to connect your robot to the WiFi without using the dashboard."
-lead: "How to connect Reachy 2 to WiFi without the dashboard"
-date: 2023-08-09T14:43:31+02:00
-lastmod: 2023-08-09T14:43:31+02:00
-draft: false
-images: []
-toc: true
----
-## WiFi
-
-On your first connection to a network, the simpliest is to connect your robot with an ethernet cable.
-
-If you cannot do this:
-
-Use the appropriate cable and connect your computer directly to Reachy 2's computer. The cable has to be plugged in port (b) of Reachy 2's hardware interface.
-
-{{< img-center "images/docs/getting-started/serial-connection.png" 400x "Serial connection port" >}}
-
-We use `tio`for the serial connection. If you haven't installed it yet on your computer:
-`apt install tio`
-
-{{< alert icon="👉" text="Make sure dialout is in your groups, otherwise add it to your groups. To check it:
>>> groups
If it doesn't appear in the list, add it with:
>>> sudo usermod -aG dialout $USER
Then reboot your computer for the new group to be effective." >}}
-
-Then, in a terminal on your computer, get access to the robot with:
-
-```python
-tio /dev/ttyUSB0
-```
-
-> Note that the connection could be on another USB port. Check all ports with `ls /dev/ttyUSB*`
-
-{{< img-center "images/docs/getting-started/tio-terminal.png" 400x "tio connection terminal" >}}
-
-{{< alert icon="👉" text="Login: bedrock
Password: root" >}}
-
-
-Manually connect the robot to a WiFi with:
-```bash
-nmcli device wifi connect password
-```
-
-> For example, with the wifi *POLLEN-WIFI*, with password *superstrongpassword*:
-> `nmcli device wifi connect POLLEN-WIFI password superstrongpassword`
-
-{{< my-button link="/docs/getting-started/network/" label="< Back to network connection" >}}
\ No newline at end of file
+---
+title: "Connect your robot to the WiFi"
+description: "How to connect your robot to the WiFi without using the dashboard."
+lead: "How to connect Reachy 2 to WiFi without the dashboard"
+date: 2023-08-09T14:43:31+02:00
+lastmod: 2023-08-09T14:43:31+02:00
+draft: false
+images: []
+toc: true
+hidden: true
+---
+## WiFi
+
+On your first connection to a network, the simplest way is to connect your robot with an ethernet cable.
+
+If you cannot do this:
+
+Use the appropriate cable to connect your computer directly to Reachy 2's computer. The cable has to be plugged into port (b) of Reachy 2's hardware interface.
+
+{{< img-center "images/docs/getting-started/serial-connection.png" 400x "Serial connection port" >}}
+
+We use `tio` for the serial connection. If you haven't installed it yet on your computer:
+`apt install tio`
+
+{{< alert icon="👉" text="Make sure dialout is in your groups, otherwise add it to your groups. To check it:
>>> groups
If it doesn't appear in the list, add it with:
>>> sudo usermod -aG dialout $USER
Then reboot your computer for the new group to be effective." >}}
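The group check described in the alert above can be scripted. A minimal, read-only sketch (it only reports what to do, it does not modify your groups):

```shell
# Check whether the current user belongs to the dialout group,
# which is required to open the serial port used by tio.
if id -nG | grep -qw dialout; then
    echo "dialout: OK"
else
    echo "dialout: missing -- run 'sudo usermod -aG dialout \$USER' and reboot"
fi
```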
+
+Then, in a terminal on your computer, get access to the robot with:
+
+```bash
+tio /dev/ttyUSB0
+```
+
+> Note that the connection could be on another USB port. Check all ports with `ls /dev/ttyUSB*`
+
+{{< img-center "images/docs/getting-started/tio-terminal.png" 400x "tio connection terminal" >}}
+
+{{< alert icon="👉" text="Login: bedrock
Password: root" >}}
+
+
+Manually connect the robot to a WiFi with:
+```bash
+nmcli device wifi connect <SSID> password <password>
+```
+
+> For example, with the wifi *POLLEN-WIFI*, with password *superstrongpassword*:
+> `nmcli device wifi connect POLLEN-WIFI password superstrongpassword`
+
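After running the command, you can check that the WiFi connection is active. A quick sketch (assuming NetworkManager's `nmcli`, as above; it exits early if `nmcli` is not installed):

```shell
# Skip gracefully on machines without NetworkManager
if ! command -v nmcli >/dev/null 2>&1; then
    echo "nmcli not available"
    exit 0
fi
# Overall device and connection state
nmcli device status
# Active connections only: the new WiFi network should be listed here
nmcli -t -f NAME,TYPE,DEVICE connection show --active
```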
+{{< my-button link="/getting-started/setup-reachy2/connect-reachy2/" label="< Back to network connection" >}}
\ No newline at end of file
diff --git a/content/docs/getting-started/turn-on.md b/content/getting-started/setup-reachy2/start-reachy2.md
similarity index 76%
rename from content/docs/getting-started/turn-on.md
rename to content/getting-started/setup-reachy2/start-reachy2.md
index 53f19968..9d4ed837 100644
--- a/content/docs/getting-started/turn-on.md
+++ b/content/getting-started/setup-reachy2/start-reachy2.md
@@ -1,48 +1,51 @@
----
-title: "Start your robot"
-description: "How to switch on your robot."
-lead: "How to switch on your robot."
-date: 2023-08-09T14:44:11+02:00
-lastmod: 2023-08-09T14:44:11+02:00
-draft: false
-images: []
-toc: true
-weight: "30"
----
-
-
-## Power up your robot
-To start your robot:
-1. Press the mobile base button (next to the mobile base's LCD screen). The mobile base's screen should turn on, indicating the current state of the battery (remaining battery percentage, current flow, etc).
-
-2. **Automatic calibration process**
-
-Put the robot in a environment with no obstacle, and make sure its arms or grippers are not touching the tripod.
-
-{{< alert icon="👉" text="The robot is going to slightly move during the calibration. Do not touch the robot during the calibration, and make sure the arms will not meet any obstacle during their movements." >}}
-
-Press and turn clockwise the emergency stop button to raise it. The automatic calibration process will start.
-
-> Do not move the robot until the services running on the computer are ready for use.
-
-
- {{< video "videos/docs/getting-started/calibration-process.mp4" "40%" >}}
-
- Automatic calibration process
-
-
-3. Turn on the robot's computer:
-- plug the green connector to the computer. The computer should automatically turn on.
-
-- if the computer was already plugged, use the (a) button to turn it on.
-{{< img-center "images/docs/getting-started/a-button.png" 400x "drawing" >}}
-
-> We advise to unplug the computer after each use for power saving, because the USB ports are still consuming current when the computer is off.
-
-
-
- {{< video "videos/docs/getting-started/turn-on-reachy.mp4" "40%" >}}
-
- Full turn on process in video
-
+---
+title: "Start Reachy 2"
+description: "Turn on the robot"
+lead: "Follow these steps to start the robot"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ getting-started:
+ parent: "Setup Reachy 2"
+weight: 210
+toc: true
+---
+
+
+## Power up your robot
+To start your robot:
+1. Press the mobile base button (next to the mobile base's LCD screen). The mobile base's screen should turn on, indicating the current state of the battery (remaining battery percentage, current flow, etc).
+
+2. **Automatic calibration process**
+
+Put the robot in an environment with no obstacles, and make sure its arms and grippers are not touching the tripod.
+
+{{< alert icon="👉" text="The robot is going to move slightly during the calibration. Do not touch the robot during the calibration, and make sure the arms will not meet any obstacle during their movements." >}}
+
+Press and turn the emergency stop button clockwise to release it. The automatic calibration process will then start.
+
+> Do not move the robot until the services running on the computer are ready for use.
+
+
+ {{< video "videos/docs/getting-started/calibration-process.mp4" "40%" >}}
+
+ Automatic calibration process
+
+
+3. Turn on the robot's computer:
+- plug the green connector into the computer. The computer should automatically turn on.
+
+- if the computer was already plugged in, use the (a) button to turn it on.
+{{< img-center "images/docs/getting-started/a-button.png" 400x "drawing" >}}
+
+> We advise unplugging the computer after each use to save power, because the USB ports still draw current when the computer is off.
+
+
+
+ {{< video "videos/docs/getting-started/turn-on-reachy.mp4" "40%" >}}
+
+ Full turn on process in video
+
diff --git a/content/docs/getting-started/turn-off.md b/content/getting-started/setup-reachy2/stop-reachy2.md
similarity index 89%
rename from content/docs/getting-started/turn-off.md
rename to content/getting-started/setup-reachy2/stop-reachy2.md
index 80dda0d3..61d24bc2 100644
--- a/content/docs/getting-started/turn-off.md
+++ b/content/getting-started/setup-reachy2/stop-reachy2.md
@@ -1,41 +1,46 @@
----
-title: "Stop your robot"
-description: "How to switch off your robot."
-lead: "How to switch off your robot."
-date: 2023-08-09T14:44:11+02:00
-lastmod: 2023-08-09T14:44:11+02:00
-draft: false
-images: []
-toc: true
-weight: "70"
----
-## Power off your robot
-
-Power off your robot in the exact opposite order you turned it on!
-
-To stop your robot:
-
-1. From the dashboard, click on the Power Off button in the footer.
-2. Press the emergency stop button.
-3. Press the mobile base button.
-4. Wait for the led of the computer to turn off, then unplug the green port from the robot's computer.
-
-
-## Understanding the power buttons and battery life good practices
-The mobile base uses the 24V battery to power the wheels directly. DC-DC converters are used to generate 5V (emergency button power, USB HUB power and relay logic) and 12V for the upper body.
-The 5V converter draws almost 100mA when idle and is necessary for the emergency button logic.
-
-The emergency button and the mobile base button both need to be ON to turn on the power relay that shares the 24V with the rest of the robot. However, the mobile base button is the only one that, when turned OFF, shuts down the 5V converter.
-
-
-> :warning: **WARNING!** :warning: When turning off the robot, always turn off the mobile base button to minimize the idle current consumption. If you turn off the robot with the emergency button and didn't press the mobile base button before storing your robot, the battery will deplate faster.
-
-:bulb: Even with the mobile base button OFF, the battery screen will be powered (low consumption at around ~1mA). If you plan to store the robot for more than a month, we recommend unplugging one of the wires of the battery (like when you received the robot).
-
-
-| Configuration | Storage time before depleting a full battery |
-| :-----------------------------------------: | :------------------------------------------: |
-| Mobile base button ON, emergency button OFF | A few days |
-| Mobile base button OFF | A few months |
-| Unplugging the battery | A few years |
-
+---
+title: "Stop Reachy 2"
+description: "Turn off the robot"
+lead: "At the end, stop your robot"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ getting-started:
+ parent: "Setup Reachy 2"
+weight: 230
+toc: true
+---
+
+## Power off your robot
+
+Power off your robot in the exact opposite order you turned it on!
+
+To stop your robot:
+
+1. From the dashboard, click on the Power Off button in the footer.
+2. Press the emergency stop button.
+3. Press the mobile base button.
+4. Wait for the led of the computer to turn off, then unplug the green port from the robot's computer.
+
+
+## Understanding the power buttons and battery life good practices
+The mobile base uses the 24V battery to power the wheels directly. DC-DC converters are used to generate 5V (emergency button power, USB HUB power and relay logic) and 12V for the upper body.
+The 5V converter draws almost 100mA when idle and is necessary for the emergency button logic.
+
+The emergency button and the mobile base button both need to be ON to turn on the power relay that shares the 24V with the rest of the robot. However, the mobile base button is the only one that, when turned OFF, shuts down the 5V converter.
+
+
+> :warning: **WARNING!** :warning: When turning off the robot, always turn off the mobile base button to minimize the idle current consumption. If you turn off the robot with the emergency button and didn't press the mobile base button before storing your robot, the battery will deplete faster.
+
+:bulb: Even with the mobile base button OFF, the battery screen will be powered (low consumption at around ~1mA). If you plan to store the robot for more than a month, we recommend unplugging one of the wires of the battery (like when you received the robot).
+
+
+| Configuration | Storage time before depleting a full battery |
+| :-----------------------------------------: | :------------------------------------------: |
+| Mobile base button ON, emergency button OFF | A few days |
+| Mobile base button OFF | A few months |
+| Unplugging the battery | A few years |
+
diff --git a/content/getting-started/update-reachy2/_index.md b/content/getting-started/update-reachy2/_index.md
new file mode 100644
index 00000000..89793449
--- /dev/null
+++ b/content/getting-started/update-reachy2/_index.md
@@ -0,0 +1,13 @@
+---
+title: "Update Reachy 2"
+description: "Update your robot to get the latest features."
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+type: docs
+menu:
+ getting-started:
+weight: 30
+---
diff --git a/content/docs/update/update.md b/content/getting-started/update-reachy2/update-software.md
similarity index 81%
rename from content/docs/update/update.md
rename to content/getting-started/update-reachy2/update-software.md
index 658a7b61..d2ec734d 100644
--- a/content/docs/update/update.md
+++ b/content/getting-started/update-reachy2/update-software.md
@@ -1,56 +1,60 @@
----
-title : "Update Reachy 2 software"
-description: "Update Reachy"
-lead: "Get latest version of Reachy software"
-date: 2023-07-25T15:15:22+02:00
-lastmod: 2023-07-25T15:15:22+02:00
-draft: false
-images: []
-toc: true
-weight: "80"
----
-
-## Use the dashboard
-
-The update of the robot can be entirely done with the dashboard.
-
-From the **Updates** tab, check if updates are available:
-
-{{< img-center "images/docs/update/dashboard-update-page.png" 600x "" >}}
-
-> Only advanced update management is working so far
-
-## Advanced update management
-
-From the dashboard Update page, click on **Advanced udpate management**:
-
-{{< img-center "images/docs/update/update.png" 600x "PLUM update" >}}
-
-> You can directly access the advanced update dashboard from **`http://:5000/`**
-
-### Fetch updates
-
-Click **Fetch Updates** to check if there is any available update on one of the robot's services.
-Once this is done, you can browse between the 5 services to see if a more recent version is available.
-
-> For example, an update is available for reachy2-dashboard here:
-{{< img-center "images/docs/update/update-available.png" 600x "PLUM update" >}}
-
-### Install update
-
-Select the version you want to download for the upgrade, and click on **Pull Container**.
-Wait for the message "*service.name* Pulled" to appear in the window.
-
-{{< img-center "images/docs/update/pull-container.png" 600x "PLUM update" >}}
-
-When this is done, click on **Generate**.
-Wait for the confirmation message to appear.
-
-{{< img-center "images/docs/update/generate.png" 600x "PLUM Update" >}}
-
-### Activate the update
-
-Finish the update installation by clicking on:
-1. **Enable**, to activate by default the updated service
-2. **Stop**, to stop the current outdated service running
-3. **Start**, to launch the updated service
\ No newline at end of file
+---
+title: "Update Reachy 2 software"
+description: "Steps to update Reachy 2 software to have the latest features"
+lead: "Get the latest features by updating your robot software"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ getting-started:
+ parent: "Update Reachy 2"
+weight: 300
+toc: true
+---
+
+## Use the dashboard
+
+The robot update can be done entirely from the dashboard.
+
+From the **Updates** tab, check if updates are available:
+
+{{< img-center "images/docs/update/dashboard-update-page.png" 600x "" >}}
+
+> Only advanced update management is working so far
+
+## Advanced update management
+
+From the dashboard Update page, click on **Advanced update management**:
+
+{{< img-center "images/docs/update/update.png" 600x "PLUM update" >}}
+
+> You can directly access the advanced update dashboard at **`http://<robot_ip>:5000/`**
+
+### Fetch updates
+
+Click **Fetch Updates** to check if there is any available update on one of the robot's services.
+Once this is done, you can browse the 5 services to see if a more recent version is available.
+
+> For example, an update is available for reachy2-dashboard here:
+{{< img-center "images/docs/update/update-available.png" 600x "PLUM update" >}}
+
+### Install update
+
+Select the version you want to download for the upgrade, and click on **Pull Container**.
+Wait for the message "*service.name* Pulled" to appear in the window.
+
+{{< img-center "images/docs/update/pull-container.png" 600x "PLUM update" >}}
+
+When this is done, click on **Generate**.
+Wait for the confirmation message to appear.
+
+{{< img-center "images/docs/update/generate.png" 600x "PLUM Update" >}}
+
+### Activate the update
+
+Finish the update installation by clicking on:
+1. **Enable**, to make the updated service active by default
+2. **Stop**, to stop the outdated service currently running
+3. **Start**, to launch the updated service
diff --git a/content/hardware-guide/_index.md b/content/hardware-guide/_index.md
new file mode 100644
index 00000000..c126b4ff
--- /dev/null
+++ b/content/hardware-guide/_index.md
@@ -0,0 +1,10 @@
+---
+title: "Hardware guide"
+description: "Find hardware specifications and CAD files"
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+type: docs
+---
diff --git a/content/hardware-guide/makers/_index.md b/content/hardware-guide/makers/_index.md
new file mode 100644
index 00000000..66c29962
--- /dev/null
+++ b/content/hardware-guide/makers/_index.md
@@ -0,0 +1,13 @@
+---
+title: "Makers"
+description: "Find elements to make or customize your robot"
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+menu:
+ hardware-guide:
+weight: 20
+type: docs
+---
diff --git a/content/hardware-guide/makers/cad-files.md b/content/hardware-guide/makers/cad-files.md
new file mode 100644
index 00000000..96f49c1d
--- /dev/null
+++ b/content/hardware-guide/makers/cad-files.md
@@ -0,0 +1,17 @@
+---
+title: "Find CAD Files"
+description: "Find CAD files to create, repair or customize your robot"
+lead: ""
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ hardware-guide:
+ parent: "Makers"
+weight: 200
+toc: true
+---
+
+This is how you assemble your robot
\ No newline at end of file
diff --git a/content/hardware-guide/specifications/_index.md b/content/hardware-guide/specifications/_index.md
new file mode 100644
index 00000000..d749f208
--- /dev/null
+++ b/content/hardware-guide/specifications/_index.md
@@ -0,0 +1,13 @@
+---
+title: "Specifications"
+description: "Find Reachy 2 hardware specifications"
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+menu:
+ hardware-guide:
+weight: 10
+type: docs
+---
diff --git a/content/hardware-guide/specifications/audio.md b/content/hardware-guide/specifications/audio.md
new file mode 100644
index 00000000..daa7b7ff
--- /dev/null
+++ b/content/hardware-guide/specifications/audio.md
@@ -0,0 +1,17 @@
+---
+title: "Audio specifications"
+description: "Audio system specifications of Reachy 2"
+lead: ""
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ hardware-guide:
+ parent: "Specifications"
+weight: 120
+toc: true
+---
+
+This is how you assemble your robot
\ No newline at end of file
diff --git a/content/hardware-guide/specifications/general.md b/content/hardware-guide/specifications/general.md
new file mode 100644
index 00000000..a7d5f26a
--- /dev/null
+++ b/content/hardware-guide/specifications/general.md
@@ -0,0 +1,17 @@
+---
+title: "General specifications"
+description: "General specifications of Reachy 2"
+lead: ""
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ hardware-guide:
+ parent: "Specifications"
+weight: 100
+toc: true
+---
+
+This is how you assemble your robot
\ No newline at end of file
diff --git a/content/hardware-guide/specifications/grippers.md b/content/hardware-guide/specifications/grippers.md
new file mode 100644
index 00000000..a9372c36
--- /dev/null
+++ b/content/hardware-guide/specifications/grippers.md
@@ -0,0 +1,17 @@
+---
+title: "Grippers specifications"
+description: "Grippers specifications of Reachy 2"
+lead: ""
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ hardware-guide:
+ parent: "Specifications"
+weight: 140
+toc: true
+---
+
+This is how you assemble your robot
\ No newline at end of file
diff --git a/content/hardware-guide/specifications/mobile-base.md b/content/hardware-guide/specifications/mobile-base.md
new file mode 100644
index 00000000..5101df32
--- /dev/null
+++ b/content/hardware-guide/specifications/mobile-base.md
@@ -0,0 +1,17 @@
+---
+title: "Mobile base specifications"
+description: "Mobile base specifications of Reachy 2"
+lead: ""
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ hardware-guide:
+ parent: "Specifications"
+weight: 150
+toc: true
+---
+
+This is how you assemble your robot
\ No newline at end of file
diff --git a/content/hardware-guide/specifications/motors-actuators.md b/content/hardware-guide/specifications/motors-actuators.md
new file mode 100644
index 00000000..073a8610
--- /dev/null
+++ b/content/hardware-guide/specifications/motors-actuators.md
@@ -0,0 +1,17 @@
+---
+title: "Motors & Actuators specifications"
+description: "Motors & Actuators specifications of Reachy 2"
+lead: ""
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ hardware-guide:
+ parent: "Specifications"
+weight: 130
+toc: true
+---
+
+This is how you assemble your robot
\ No newline at end of file
diff --git a/content/hardware-guide/specifications/vision.md b/content/hardware-guide/specifications/vision.md
new file mode 100644
index 00000000..40e23feb
--- /dev/null
+++ b/content/hardware-guide/specifications/vision.md
@@ -0,0 +1,17 @@
+---
+title: "Vision specifications"
+description: "Vision system specifications of Reachy 2"
+lead: ""
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ hardware-guide:
+ parent: "Specifications"
+weight: 110
+toc: true
+---
+
+This is how you assemble your robot
\ No newline at end of file
diff --git a/content/help/contact-support/_index.md b/content/help/contact-support/_index.md
new file mode 100644
index 00000000..77fb27ef
--- /dev/null
+++ b/content/help/contact-support/_index.md
@@ -0,0 +1,13 @@
+---
+title: "Contact support"
+description: "Links to contact Pollen support team and get assistance"
+date: 2023-08-21T16:18:47+02:00
+lastmod: 2023-08-21T16:18:47+02:00
+draft: false
+images: []
+weight: 100
+type: docs
+menu:
+ help:
+weight: 30
+---
diff --git a/content/help/help/support.md b/content/help/contact-support/need-somebody.md
similarity index 68%
rename from content/help/help/support.md
rename to content/help/contact-support/need-somebody.md
index 79fdccca..1d005988 100644
--- a/content/help/help/support.md
+++ b/content/help/contact-support/need-somebody.md
@@ -1,24 +1,27 @@
----
-title : "Support"
-description: "Get support for your Reachy robot."
-lead: "Can't solve your problem? Ask for help!"
-date: 2023-07-26T08:45:50+02:00
-lastmod: 2023-07-26T08:45:50+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "40"
----
-
-## Forum
-
-Join **[our forum](https://forum.pollen-robotics.com/)** if you have any questions or simply want to take a look at others topics!
-
-{{< alert icon="👉" text="Any questions relative to your development with Reachy?Go to Pollen Community" >}}
-
-
-## Pollen Robotics support
-
-For any specific questions concerning your robot or if you meet problems with the product, please contact us at [support@pollen-robotics.com](mailto:support@pollen-robotics.com).
-
+---
+title : "I need somebody"
+description: "Get assistance from the community or Pollen support team"
+lead: "Get assistance from the community or Pollen support team"
+date: 2023-07-26T08:44:51+02:00
+lastmod: 2023-07-26T08:44:51+02:00
+draft: false
+images: []
+type: docs
+menu:
+ help:
+ parent: "Contact support"
+toc: true
+weight: 300
+---
+
+## Forum
+
+Join **[our forum](https://forum.pollen-robotics.com/)** if you have any questions or simply want to take a look at other topics!
+
+{{< alert icon="👉" text="Any questions about your development with Reachy? Go to the Pollen Community" >}}
+
+
+## Pollen Robotics support
+
+For any specific questions concerning your robot or if you meet problems with the product, please contact us at [support@pollen-robotics.com](mailto:support@pollen-robotics.com).
+
diff --git a/content/help/help/_index.md b/content/help/debug/_index.md
similarity index 87%
rename from content/help/help/_index.md
rename to content/help/debug/_index.md
index 43720ad7..2ce19de8 100644
--- a/content/help/help/_index.md
+++ b/content/help/debug/_index.md
@@ -1,10 +1,13 @@
----
-title: "Debug"
-description: "Learn how you can debug Reachy."
-date: 2023-08-21T16:18:47+02:00
-lastmod: 2023-08-21T16:18:47+02:00
-draft: false
-images: []
-weight: 100
-type: docs
----
+---
+title: "Debug"
+description: "Learn how you can debug Reachy."
+date: 2023-08-21T16:18:47+02:00
+lastmod: 2023-08-21T16:18:47+02:00
+draft: false
+images: []
+weight: 100
+type: docs
+menu:
+ help:
+weight: 10
+---
diff --git a/content/help/debug/debug-checklist.md b/content/help/debug/debug-checklist.md
new file mode 100644
index 00000000..860e34e6
--- /dev/null
+++ b/content/help/debug/debug-checklist.md
@@ -0,0 +1,30 @@
+---
+title : "Quick debug checklist"
+description: "Debug"
+lead: "Easy steps to debug yourself common issues"
+date: 2023-07-26T08:44:51+02:00
+lastmod: 2023-07-26T08:44:51+02:00
+draft: false
+images: []
+type: docs
+menu:
+ help:
+ parent: "Debug"
+toc: true
+weight: 100
+---
+
+
+## Fast recovering
+
+The simplest way to recover from an error (for example an arm that no longer responds) is to **power cycle the motors and restart the services**.
+
+It is a fast recovery procedure that resolves about 80% of unexpected behaviors.
+
+To do so:
+1. Suspend your current use of the robot
+2. Press the emergency stop button
+3. Make sure to put the arms and head in a suitable position before restarting the motors
+4. Press and turn clockwise the emergency stop button to raise it
+5. Go to the dashboard and click on *Restart* for `reachy2-core` then `webrtc`
+
diff --git a/content/help/faq/_index.md b/content/help/faq/_index.md
new file mode 100644
index 00000000..aba79044
--- /dev/null
+++ b/content/help/faq/_index.md
@@ -0,0 +1,13 @@
+---
+title: "FAQ"
+description: "Frequently asked questions"
+date: 2023-08-21T16:18:47+02:00
+lastmod: 2023-08-21T16:18:47+02:00
+draft: false
+images: []
+weight: 100
+type: docs
+menu:
+ help:
+weight: 20
+---
diff --git a/content/docs/advanced/calibrate-cameras.md b/content/help/faq/robot-faq.md
similarity index 51%
rename from content/docs/advanced/calibrate-cameras.md
rename to content/help/faq/robot-faq.md
index c6fe36ed..5004ec56 100644
--- a/content/docs/advanced/calibrate-cameras.md
+++ b/content/help/faq/robot-faq.md
@@ -1,114 +1,206 @@
----
-title: "Calibrate teleop cameras"
-description: "How to calibrate the stereovision for the teleop cameras"
-lead: "How to calibrate stereovision on the teleop cameras"
-date: 2023-08-09T14:43:31+02:00
-lastmod: 2023-08-09T14:43:31+02:00
-draft: false
-images: []
-toc: true
-weight: "100"
----
-
-{{< alert icon="👉" text="This calibration is for stereovision only. It will only work if the images are clear.If you want to modify the focus of the cameras because the images are blurred, this requires a hardware intervention on the lenses, which is not covered by the following explanations." >}}
-
-## Repositories installation
-
-The calibration process relies in 2 Pollen Robotics repositories.
-The simpliest way is to clone both of these repositories on your computer:
-
-- Pollen's `multical` fork. [**Clone the repo**](https://github.com/pollen-robotics/multical), then:
-```bash
-cd multical
-pip install -e .
-```
-
-- `pollen-vision` repo. [**Clone the repo**](https://github.com/pollen-robotics/pollen-vision/tree/develop), then:
-```bash
-cd pollen-vision
-pip install -e .[depthai_wrapper]
-```
-
-> We recommand to use virtual environments.
-
-## 1. Charuco calibration board
-
-
-
-Go to `pollen-vision/pollen_vision/pollen_vision/camera_wrappers/depthai/calibration`.
-
-If you don't have one, generate a charuco board with the following command:
-
-```console
-$ python3 generate_board.py
-```
-
-Print it on a A4 paper and place it on a flat surface (we use a wooden board).
-
-> You should have received a calibration board with the robot, with the relevant information written behind.
-
-Mesure as accurately as possible the size of the squares and the size of the markers and edit the `example_boards/pollen_charuco.yaml` file in the previously cloned `multical` repo to report the values you measured (must be in meters).
-
-## 2. Get some images
-
-Connect the teleop cameras to your computer. You simply have to disconnect the *teleop cameras* USB connector from the robot's computer and plug it to your computer instead.
-
-If it is your first calibration, you must add the udev rules with:
-```bash
-echo 'SUBSYSTEM=="usb", ATTRS{idVendor}=="03e7", MODE="0666"' | sudo tee /etc/udev/rules.d/80-movidius.rules
-sudo udevadm control --reload-rules && sudo udevadm trigger
-```
-
-Then, still in `pollen-vision/pollen_vision/pollen_vision/camera_wrappers/depthai/calibration`, run:
-```console
-$ python3 acquire.py --config CONFIG_IMX296
-```
-
-Press `return` to save a pair of images in `./calib_images/` (by default, use `--imagesPath` to change this).
-
-Try to cover a maximum of the field of view, with the board in a variety of orientations. If the coverage is good, about 30 images is sufficient.
-Also, make sure that most of the board is visible by all the cameras for all the saved images pairs.
-
-Below is an example of good coverage:
-{{< img-center "images/docs/advanced/mosaic.png" 500x "Good coverage images" >}}
-
-## 3. Run multical
-
-```console
-$ cd <...>/multical
-$ multical calibrate --image_path --boards example_boards/pollen_charuco.yaml --isFisheye True
-```
-
-(For some reason, --image_path must be an absolute path, relative paths don't work)
-
-It will write a `calibration.json` file in ``.
-
-## 4. Flash the calibration to the EEPROM
-
-Back in `pollen-vision/pollen_vision/pollen_vision/camera_wrappers/depthai/calibration`.
-
-Run:
-```console
-$ python3 flash.py --config CONFIG_IMX296 --calib_json_file
-```
-
-A backup file with the current calibration settings stored on the device will be produced in case you need to revert back.
-
-If needed, run:
-```console
-$ python3 restore_calibration_backup.py --calib_file CALIBRATION_BACKUP_<...>.json
-```
-
-## 5. Check the calibration
-
-Run:
-```console
-$ python3 check_epilines.py --config CONFIG_IMX296
-```
-And show the aruco board to the cameras.
-
-An `AVG SLOPE SCORE` below `0.1%` is OK.
-
-Ideally it could be under `0.05%`.
-
-The lower, the better.
+---
+title : "Robot"
+description: "Robot FAQ"
+lead: "Frequently asked questions on the robot"
+date: 2023-07-26T08:44:51+02:00
+lastmod: 2023-07-26T08:44:51+02:00
+draft: false
+images: []
+type: docs
+menu:
+ help:
+ parent: "FAQ"
+toc: true
+weight: 200
+---
+## WiFi
+
+For your first connection to a network, the simplest option is to connect your robot with an Ethernet cable.
+
+If you cannot do this:
+
+Use the appropriate cable and connect your computer directly to Reachy 2's computer. The cable has to be plugged into port (b) of Reachy 2's hardware interface.
+
+{{< img-center "images/docs/getting-started/serial-connection.png" 400x "Serial connection port" >}}
+
+We use `tio` for the serial connection. If you haven't installed it yet on your computer:
+`sudo apt install tio`
+
+{{< alert icon="👉" text="Make sure dialout is in your groups, otherwise add it to your groups. To check it:
>>> groups
If it doesn't appear in the list, add it with:
>>> sudo usermod -aG dialout $USER
Then reboot your computer for the new group to be effective." >}}
+
+Then, in a terminal on your computer, get access to the robot with:
+
+```bash
+tio /dev/ttyUSB0
+```
+
+> Note that the connection could be on another USB port. Check all ports with `ls /dev/ttyUSB*`
+
+{{< img-center "images/docs/getting-started/tio-terminal.png" 400x "tio connection terminal" >}}
+
+{{< alert icon="👉" text="Login: bedrock
Password: root" >}}
+
+
+Manually connect the robot to a WiFi network with:
+```bash
+nmcli device wifi connect <ssid> password <password>
+```
+
+> For example, with the wifi *POLLEN-WIFI*, with password *superstrongpassword*:
+> `nmcli device wifi connect POLLEN-WIFI password superstrongpassword`
+
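To confirm that the connection succeeded, you can check the state of the network devices. A minimal sketch, assuming NetworkManager (`nmcli`) is installed; the fallback message just keeps it usable on machines without it:

```shell
# List network devices with their connection state (assumes NetworkManager)
nmcli -t -f DEVICE,STATE device 2>/dev/null || echo "NetworkManager not available"
```

A device listed as `connected` confirms the WiFi setup worked.
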
+There are several ways to connect to your robot.
+
+## SSH connection
+Using the robot's IP address (check Find Reachy 2's IP if you don't know it), you can directly connect via ssh to Reachy 2's computer:
+
+```bash
+ssh bedrock@<robot_ip>
+```
+
+> For example, with robot's IP being 192.168.1.42:
+> ```bash
+> ssh bedrock@192.168.1.42
+> ```
+
+{{< alert icon="👉" text="Password: root" >}}
+
+## Hard-wired connection
+
+Use the appropriate cable and connect your computer directly to Reachy 2's computer. The cable has to be plugged into port (b) of Reachy 2's hardware interface.
+
+{{< img-center "images/docs/advanced/serial-connection.png" 500x "Serial connection port" >}}
+
+We use `tio` for the serial connection. If you haven't installed it yet on your computer:
+`sudo apt install tio`
+
+{{< alert icon="👉" text="Make sure dialout is in your groups, otherwise add it to your groups. To check it:
>>> groups
If it doesn't appear in the list, add it with:
>>> sudo usermod -aG dialout $USER
Then reboot your computer for the new group to be effective." >}}
+
+Once connected, open a terminal on your computer and run:
+```bash
+tio /dev/ttyUSB0
+```
+*Note that depending on the elements you connected to the robot, the port could be something other than ttyUSB0. Check other available serial ports with `ls /dev/ttyUSB*`.*
+
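The port check above can be sketched as a small loop over the candidate device files; the fallback message is only a hypothetical aid for when no adapter is plugged in:

```shell
# Look for USB serial adapters; the glob stays literal when nothing matches
for port in /dev/ttyUSB*; do
  if [ -e "$port" ]; then
    echo "found serial port: $port"
  else
    echo "no USB serial adapter detected"
  fi
done
```
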
+{{< img-center "images/docs/advanced/tio-terminal.png" 500x "Tio connection port" >}}
+
+{{< alert icon="👉" text="Login: bedrock
Password: root" >}}
+
+You are then connected to Reachy 2's computer!
+
+## Avahi connection
+
+Find the serial number of your robot on its back, connect your computer to the same network as your robot, open a terminal and type:
+```bash
+ping <serial_number>.local
+```
+
+>For example, if the serial number is reachy2-beta1:
+>```bash
+>ping reachy2-beta1.local
+>```
+
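The lookup above can be scripted; `reachy2-beta1` is a hypothetical serial number, replace it with the one printed on your robot's back. The fallback message keeps the sketch usable when the robot is off-network:

```shell
# Build the mDNS hostname from the serial number and try one ping
SERIAL="reachy2-beta1"          # hypothetical serial number
HOST="${SERIAL}.local"
echo "pinging ${HOST}"
ping -c 1 "$HOST" 2>/dev/null || echo "robot not reachable on this network"
```
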
+
+
+{{< alert icon="👉" text="This calibration is for stereovision only. It will only work if the images are clear. If you want to modify the focus of the cameras because the images are blurred, this requires a hardware intervention on the lenses, which is not covered by the following explanations." >}}
+
+## Repositories installation
+
+The calibration process relies on two Pollen Robotics repositories.
+The simplest way is to clone both of them on your computer:
+
+- Pollen's `multical` fork. [**Clone the repo**](https://github.com/pollen-robotics/multical), then:
+```bash
+cd multical
+pip install -e .
+```
+
+- `pollen-vision` repo. [**Clone the repo**](https://github.com/pollen-robotics/pollen-vision/tree/develop), then:
+```bash
+cd pollen-vision
+pip install -e .[depthai_wrapper]
+```
+
+> We recommend using virtual environments.
+
+## 1. Charuco calibration board
+
+
+
+Go to `pollen-vision/pollen_vision/pollen_vision/camera_wrappers/depthai/calibration`.
+
+If you don't have one, generate a charuco board with the following command:
+
+```console
+$ python3 generate_board.py
+```
+
+Print it on A4 paper and place it on a flat surface (we use a wooden board).
+
+> You should have received a calibration board with the robot, with the relevant information written behind.
+
+Measure as accurately as possible the size of the squares and the size of the markers, then edit the `example_boards/pollen_charuco.yaml` file in the previously cloned `multical` repo to report the values you measured (in meters).
+
+## 2. Get some images
+
+Connect the teleop cameras to your computer: simply disconnect the *teleop cameras* USB connector from the robot's computer and plug it into your computer instead.
+
+If it is your first calibration, you must add the udev rules with:
+```bash
+echo 'SUBSYSTEM=="usb", ATTRS{idVendor}=="03e7", MODE="0666"' | sudo tee /etc/udev/rules.d/80-movidius.rules
+sudo udevadm control --reload-rules && sudo udevadm trigger
+```
+
+Then, still in `pollen-vision/pollen_vision/pollen_vision/camera_wrappers/depthai/calibration`, run:
+```console
+$ python3 acquire.py --config CONFIG_IMX296
+```
+
+Press `return` to save a pair of images in `./calib_images/` (the default; use `--imagesPath` to change it).
+
+Try to cover as much of the field of view as possible, with the board in a variety of orientations. With good coverage, about 30 images are sufficient.
+Also, make sure that most of the board is visible to all the cameras in every saved image pair.
+
+Below is an example of good coverage:
+{{< img-center "images/docs/advanced/mosaic.png" 500x "Good coverage images" >}}
+
+## 3. Run multical
+
+```console
+$ cd <...>/multical
+$ multical calibrate --image_path <absolute/path/to/calib_images> --boards example_boards/pollen_charuco.yaml --isFisheye True
+```
+
+(Note that `--image_path` must be an absolute path; relative paths don't work.)
+
+It will write a `calibration.json` file in the image directory.
+
+## 4. Flash the calibration to the EEPROM
+
+Go back to `pollen-vision/pollen_vision/pollen_vision/camera_wrappers/depthai/calibration`.
+
+Run:
+```console
+$ python3 flash.py --config CONFIG_IMX296 --calib_json_file <path/to/calibration.json>
+```
+
+A backup file with the current calibration settings stored on the device will be produced in case you need to revert back.
+
+If needed, run:
+```console
+$ python3 restore_calibration_backup.py --calib_file CALIBRATION_BACKUP_<...>.json
+```
+
+## 5. Check the calibration
+
+Run:
+```console
+$ python3 check_epilines.py --config CONFIG_IMX296
+```
+Then show the charuco board to the cameras.
+
+An `AVG SLOPE SCORE` below `0.1%` is OK.
+
+Ideally it could be under `0.05%`.
+
+The lower, the better.
diff --git a/content/help/faq/sdk-faq.md b/content/help/faq/sdk-faq.md
new file mode 100644
index 00000000..0c38b61a
--- /dev/null
+++ b/content/help/faq/sdk-faq.md
@@ -0,0 +1,25 @@
+---
+title : "Python SDK"
+description: "Python SDK FAQ"
+lead: "Frequently asked questions on the Python SDK for Reachy 2"
+date: 2023-07-26T08:44:51+02:00
+lastmod: 2023-07-26T08:44:51+02:00
+draft: false
+images: []
+type: docs
+menu:
+ help:
+ parent: "FAQ"
+toc: true
+weight: 210
+---
+
+### With the Python SDK
+
+When you use the cameras through the Python SDK, they are managed by the reachy2-core service.
+
+Check all logs of the service with:
+
+```bash
+journalctl -b -u reachy2-core
+```
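A couple of variations on this command are often handy. These are standard `journalctl` flags; the fallback message only keeps the sketch runnable on machines without a systemd journal:

```shell
# Show only error-level messages from the current boot
journalctl -b -u reachy2-core -p err --no-pager 2>/dev/null \
  || echo "no systemd journal available here"

# Follow new log lines live (Ctrl-C to stop):
#   journalctl -f -u reachy2-core
```
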
diff --git a/content/help/help/debug.md b/content/help/faq/teleoperation-faq.md
similarity index 50%
rename from content/help/help/debug.md
rename to content/help/faq/teleoperation-faq.md
index 1c951d6b..784a1725 100644
--- a/content/help/help/debug.md
+++ b/content/help/faq/teleoperation-faq.md
@@ -1,51 +1,33 @@
----
-title : "Debug"
-description: "Debug"
-lead: "Use journalctl on the services to look for errors"
-date: 2023-07-26T08:44:51+02:00
-lastmod: 2023-07-26T08:44:51+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "20"
----
-
-## Problem with the motors
-
-The motors are managed by the reachy2-core service.
-Check all logs of the service with:
-
-```bash
-journalctl -b -u reachy2-core
-```
-
-## Problem with the cameras or sound
-
-### With teleoperation application
-
-During teleoperation, the cameras and sound are managed by the webrtc service.
-This service is automatically launched when you start Reachy 2 computer.
-
-> If you have switched between the Python SDK and the teleoperation application without robot rebooting, first make sure:
->- that any running client to the sdk has been disconnected
->- that the speaker has been plugged back
->- that the webrtc services has been restarted
-
-Check all logs of the service with:
-
-```bash
-journalctl -b -u webrtc
-```
-
-### With the Python SDK
-
-If you are using the cameras with the Python SDK, the cameras are then managed by the reachy2-core service.
-
-> First make sure you have enabled correctly the [cameras for the SDK]({{< ref "sdk/first-moves/cameras#enable-teleop-cameras-for-the-sdk">}})
-
-Check all logs of the service with:
-
-```bash
-journalctl -b -u reachy2-core
-```
+---
+title : "Teleoperation issues"
+description: "VR teleoperation application FAQ"
+lead: "Frequently asked questions on VR teleoperation application"
+date: 2023-07-26T08:44:51+02:00
+lastmod: 2023-07-26T08:44:51+02:00
+draft: false
+images: []
+type: docs
+menu:
+ help:
+ parent: "FAQ"
+toc: true
+weight: 230
+---
+
+## Problem with the cameras or sound
+
+### With teleoperation application
+
+During teleoperation, the cameras and sound are managed by the webrtc service.
+This service is automatically launched when you start Reachy 2 computer.
+
+> If you have switched between the Python SDK and the teleoperation application without rebooting the robot, first make sure:
+>- that any running SDK client has been disconnected
+>- that the speaker has been plugged back in
+>- that the webrtc service has been restarted
+
+Check all logs of the service with:
+
+```bash
+journalctl -b -u webrtc
+```
diff --git a/content/help/help/recovering.md b/content/help/help/recovering.md
index 5357d940..2caa4792 100644
--- a/content/help/help/recovering.md
+++ b/content/help/help/recovering.md
@@ -22,5 +22,5 @@ To do so:
2. Press the emergency stop button
3. Make sure to put the arms and head in a suitable position before restarting the motors
4. Press and turn clockwise the emergency stop button to raise it
-5. [Go to the dashboard]({{< ref "/dashboard/introduction/connection" >}}) and click on *Restart* for `reachy2-core` then `webrtc`
+5. Go to the dashboard and click on *Restart* for `reachy2-core` then `webrtc`
diff --git a/content/help/help/torso.md b/content/help/help/torso.md
deleted file mode 100644
index e6192312..00000000
--- a/content/help/help/torso.md
+++ /dev/null
@@ -1,20 +0,0 @@
----
-title : "Hardware intervention"
-description: "Hardware debug in the robot's torso"
-lead: "Open the torso to access hardware inside Reachy 2"
-date: 2023-07-26T08:44:51+02:00
-lastmod: 2023-07-26T08:44:51+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "30"
----
-
-## Remove the torso case
-
-Simply unscrew the 4 little screws maintaining the case in place. There are 2 on each side of the robot:
-
-{{< img-center "images/help/torso/torso-case.png" 200x "torso-case" >}}
-
-> The case is slightly flexible, you will have to flex it a little to set it back in place.
\ No newline at end of file
diff --git a/content/help/safety/VR-use.md b/content/help/safety/VR-use.md
deleted file mode 100644
index d29551ee..00000000
--- a/content/help/safety/VR-use.md
+++ /dev/null
@@ -1,18 +0,0 @@
----
-title: "Use VR teleoperation"
-description: "Guidelines to use the VR teleoperation app safely, for you, the robot and the surrounding people"
-date: 2023-07-26T08:45:40+02:00
-lastmod: 2023-07-26T08:45:40+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "70"
----
-
-## Watch all guidelines in video!
-Watch this quick video to have an overview of the main guidelines to use teleoperation:
-
-{{< youtube bK7th6zY8Rg >}}
-
-
diff --git a/content/help/safety/_index.md b/content/help/safety/_index.md
deleted file mode 100644
index b1e7d65c..00000000
--- a/content/help/safety/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: "Safety"
-description: "Use the robot safely in any situation"
-date: 2023-07-26T08:45:24+02:00
-lastmod: 2023-07-26T08:45:24+02:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/help/safety/correct-use.md b/content/help/safety/correct-use.md
deleted file mode 100644
index 6de1c11f..00000000
--- a/content/help/safety/correct-use.md
+++ /dev/null
@@ -1,147 +0,0 @@
----
-title: "Use Reachy 2 properly"
-description: "Guidelines to use Reachy 2 safely, for you and the robot"
-date: 2023-07-26T08:45:34+02:00
-lastmod: 2023-07-26T08:45:34+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "60"
----
-
-{{< warning icon="👉🏾" text="Reachy 2 is much more powerful than the previous version. To avoid any accident, please follow carefully the safety guidelines!" >}}
-
-> There is currently **no automatic collision security** on the robot: it won't stop if hitting anything or anyone, even itself. Remain constantly watchful when using it.
-
-## Users
-
-### Attention and reaction
-
-Users must be in **full possession of their physical and mental powers at all times** when using the robot. Reachy 2 must never be used by someone having consumed substances that could affect their reactions, such as medication, drugs or alcohol.
-
-Users must **keep attention focused** on the robot at any time, especially if they are near the robot workspace, and imperatively if they are in its workspace or if they are responsible for the [emergency stop button]({{< ref "/sdk/getting-started/safety#emergency-stop-button" >}}).
-
-### Qualified users
-
-The robot must not be used if no qualified user is present.
-
-People using the robot or interacting with it must all be aware of the risks and be explicitly informed of the robot capabilities, limitations and restrictions. They must all be able to act with the appropriate behavior using the robot.
-
-{{< alert icon="👉" text="No one should use the robot without knowing the safety guidelines." >}}
-
-## Emergency stop button
-
-The robot is delivered with an emergency stop button.
-
-Pressing the emergency stop button will **immediately power off all motors**, from the arms to the mobile base wheels. Nevertheless it won't power off the computer, which means you won't lose anything running on the computer.
-
-> If you feel like you are losing control of the robot's movements or notice an unexpected behavior at anytime, **never hesitate to press the emergency stop button**.
-
-Someone must be holding the emergency stop button at any time when using the robot, being ready to press the button if needed, and keep its attention focused on the robot.
-
-{{< alert icon="👉" text="Objects may fall out of the grippers when pressing the emergency stop button. Make sure they cannot cause injuries." >}}
-
-## Don't harm yourself...
-
-Reachy 2 is a powerful robot that may hurt you if it is misused.
-
-If you do not respect the safety guidelines, you expose yourself to the following risks:
-- pinching
-- crushing
-- punches
-- electrical hazard
-
-### Alertness
-
-People interacting with the robot or present near its workspace must always look at the robot.
-
-If the robot is being teleoperated with its mobile base, people in the surroundings must be informed of the robot presence, and the operator must never make the robot pop by surprise near a person or come close to a person, making the person reachable by the arms.
-
-### Appropriate position
-
-Do not expose yourself to dangerous punches!
-
-People must never place their head, or any other body parts, in between or underneath segments of the robot when the robot is in use. Their head should never be reachable by the robots' arms if the robot is in use.
-
-If people are near the workspace of the robot, they must always stay in a position that allow them to quickly retract or recoil.
-
-> When the robot is in use, no one should enter or stay in the robot workspace.
-
-### Free space for retracting
-
-If people are standing near the robot workspace, make sure they have **sufficient space to retract or recoil**, and that this space is free of obstacles.
-
-People must never be blocked between the robot and a wall or furniture.
-
-### Objects manipulation with Reachy 2
-
-Be careful with the objects you manipulate with the robot. Sharp and pointed object manipulation is dangerous, do not get close to the robot if it manipulates such objects.
-
-For all manipulation tasks, users are responsible for assessing the hazards and risks relative to the objects they manipulated with the robot.
-
-### Manipulate the robot
-
-When the robot is in use, never manipulate robot parts at the same time.
-
-Users must be careful if putting their fingers in the actuators or between robot parts to avoid pinching or crushing.
-:warning: They must never put their fingers in the actuators or between robot parts if the robot is in use.
-
-### Hardware intervention
-
-Never make any hardware intervention on the robot, such as screwing on unscrewing something, if it is powered on.
-
-### Robot toppling risk
-
-The following section ["...and don't harm Reachy 2!"]({{< ref "/sdk/getting-started/safety#and-dont-harm-reachy-2" >}}) mainly describes risks of robot toppling or collision. This may damage the robot, but also harm anyone near to the robot.
-**All events of the following section can lead to users injuries**, so read them as users safety guidelines as well.
-
-## ...and don't harm Reachy 2!
-
-There are a few things you need to know to make sure that your Reachy doesn't get damaged when using it.
-
-### Carrying heavy objects
-
-Be careful of the position of the arms when lifting heavy objects with the robot.
-Avoid carrying the object to far from the robot torso, mainly to avoid risk of front toppling.
-
-Do not try to lift objects over 3kg (6.6lb).
-
-### Pulling/pushing
-
-Do not try to pull or push elements that are too heavy or oppositing too much!
-
-This may result in a robot toppling.
-
-### Obstacles
-
-Be aware of obstacles!
-
-When you are sending movements instructions to Reachy, be careful to obstacles the robot can meet. The robot will try to reach the positions you asked for as hard as it can, whether or not there is something on its way.
-
-Because of the force of the robot, and depending on the weigh or fragility of the object, two things may occur:
-- make the object fall and/or break it
-- make Reachy 2 tumble
-
-### Self-collision
-
-When you are moving both arms simultaneously, there are no safety measures implemented to prevent them from hitting each other.
-Nothing will neither prevent Reachy's arms from hitting its chest if you ask them to.
-
-If situations like these happen, do not hesitate to turn off the motors so that Reachy's motors will stop trying to reach a position they can't get to.
-
-### Mobile base
-
-#### Surface
-
-The mobile robot is made to be used on **flat surfaces**.
-Never use the robot on slopes, this may result in a robot toppling.
-
-#### Speed and movements
-
-Speed and commands are limited when using the Python SDK, nevertheless you can still generate behaviors that may be dangerous. Do not ask for high speeds and strong stops, or suddent changes of directions.
-Provoking oscillations of the robot may lead to a robot toppling.
-
-### Anti-collision LIDAR safety
-
-:warning: The anti-collision LIDAR safety has been deactivated.
diff --git a/content/help/system/_index.md b/content/help/system/_index.md
deleted file mode 100644
index 2deb5429..00000000
--- a/content/help/system/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: "System"
-description: "Resolve most common difficulties and problems with the robot"
-date: 2023-07-26T08:46:02+02:00
-lastmod: 2023-07-26T08:46:02+02:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/help/system/find-my-ip.md b/content/help/system/find-my-ip.md
deleted file mode 100644
index 475b8928..00000000
--- a/content/help/system/find-my-ip.md
+++ /dev/null
@@ -1,70 +0,0 @@
----
-title: "Find Reachy 2's IP"
-description: "How to find Reachy 2 IP address"
-lead: "How to find your robot's IP address."
-date: 2023-07-26T08:46:47+02:00
-lastmod: 2023-07-26T08:46:47+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "50"
----
-
-Here are 4 different options to find out the IP address of your robot.
-> Make sure your robot has already been connected to a network before trying to get its IP address.
-
-
-## LCD display screen
-
-If you haven't unplugged it, the LCD screen connected in Reachy's back should be diplaying its IP address.
-
-{{< img-center "images/help/system/lcd-display.png" 400x "LCD display for IP" >}}
-
-## Hard-wired connection
-
-Use the appropriate cable and connect your computer directly to Reachy 2's computer. The cable has to be plugged in port (b) of Reachy 2's hardware interface.
-
-{{< img-center "images/help/system/serial-connection.png" 400x "serial connection port" >}}
-
-We use `tio`for the serial connection. If you haven't installed it yet on your computer:
-`apt install tio`
-
-{{< alert icon="👉" text="Make sure dialout is in your groups, otherwise add it to your groups. To check it:
>>> groups
If it doesn't appear in the list, add it with:
>>> sudo usermod -aG dialout $USER
" >}}
-
-Once connected, open a terminal on your computer and run:
-```python
-tio /dev/ttyUSB0
-```
-*Note that depending on the elements you connected to the robot, the port could be something else than ttyUSB0. Check other available serial ports with `ls /dev/ttyUSB*`*
-
-{{< img-center "images/help/system/tio-terminal.png" 400x "tio connection terminal" >}}
-
-{{< alert icon="👉" text="Login: bedrock
Password: root" >}}
-
-You should then be connected to Reachy's computer via serial port.
-You can find the IP address with:
-```python
-ifconfig
-```
-
-{{< img "images/help/system/ifconfig.png" 400x "Square">}}
-> In our case, Reachy 2's IP is *"192.168.86.56"*.
-
-## Using the serial number
-
-Find the serial number of your robot on its back, connect your computer on the same network as your robot, open a terminal and type:
-```bash
-ping .local
-```
-
->For example, if the serial number is reachy2-beta1:
->```bash
->ping reachy2-beta1.local
->```
-
-## Using a smartphone
-
-The **[Fing app](https://www.fing.com/products/fing-app)** let you scan IPs directly from your smartphone.
-
-To use it, install the app on your smartphone and connect your smartphone on the **same network** as the robot, then run an analysis of the network to find out the IPs connected. Reachy 2 must be one of them!
diff --git a/content/sdk/_index.md b/content/sdk/_index.md
deleted file mode 100644
index 21954d9e..00000000
--- a/content/sdk/_index.md
+++ /dev/null
@@ -1,10 +0,0 @@
----
-title: "Docs"
-description: "Discover the Python SDK for Reachy and its mobile base."
-lead: ""
-date: 2022-01-25T14:40:56+01:00
-lastmod: 2022-01-25T14:40:56+01:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/sdk/advanced/_index.md b/content/sdk/advanced/_index.md
deleted file mode 100644
index 59631e7c..00000000
--- a/content/sdk/advanced/_index.md
+++ /dev/null
@@ -1,10 +0,0 @@
----
-title : "Advanced Python SDK features"
-description: "Use Reachy 2 Python SDK, advanced features."
-lead: ""
-date: 2023-07-25T17:37:16+02:00
-lastmod: 2023-07-25T17:37:16+02:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/sdk/advanced/mobile-base.md b/content/sdk/advanced/mobile-base.md
deleted file mode 100644
index 912b896c..00000000
--- a/content/sdk/advanced/mobile-base.md
+++ /dev/null
@@ -1,70 +0,0 @@
----
-title: "Mobile base drive and control modes"
-description: "Drive modes and control modes description for the mobile base."
-date: 2023-07-26T08:32:17+02:00
-lastmod: 2023-07-26T08:32:17+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "160"
----
-## Drive modes
-### Overview
-The drive mode impacts the way the mobile base accepts commands. We could say it's the current state of the mobile base.
-
-In most cases, there is no need to think about these modes or to handle them in your code. Below are the most common use cases.
-* If you want to use the set_speed method to spam speed commands (e.g. pilot the robot with a controller), the mode has to be manually changed to 'cmd_vel':
- ```python
- reachy_mobile.mobile_base.drive_mode = 'cmd_vel'
- ```
-* If you want to push the robot easily, this will set the wheels in a compliant state:
- ```python
- reachy_mobile.mobile_base.drive_mode = 'free_wheel'
- ```
-* On the contrary, if you want the robot to apply a passive resistance to movement, use:
- ```python
- reachy_mobile.mobile_base.drive_mode = 'brake'
- ```
-
-You can use this [Jupyter Notebook](https://github.com/pollen-robotics/mobile-base-sdk/blob/main/mobile_base_sdk/examples/notebooks/drive-modes.ipynb) to explore the drive modes with your mobile base.
-
-### Detailed behaviour
-This section is only useful if you intend to interact directly with the Hardware Abstraction Layer (HAL).
-
-Six drive modes are available for the mobile base:
-* **cmd_vel**: in this mode, speed instructions can be spammed to the wheels controllers. This mode is used for the *set_speed* method.
-* **brake**: in this mode, the wheels will be stiff.
-* **free_wheel**: in this mode, the wheels will be as compliant as possible.
-* **emergency_stop**: in this mode, the wheels will stop receiving mobility commands. Switching to this mode will also stop the mobile base HAL code. This is a safety mode.
-* **speed**: another mode to send speed instructions, but less frequently than with the cmd_vel mode. This mode is actually not used at this level (python SDK level), but is implemented at the ROS level, in case one might need it.
-* **goto**: this mode is used for the *goto* method.
-
-*note: the 'speed' and 'goto' modes can't be changed by hand. The drive mode is handled automagically when requesting a set_speed or a goto.*
-
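The automatic mode handling mentioned in the note above can be sketched as a small dispatch: given the kind of command requested, pick the drive mode to apply. This is a conceptual illustration only, not the actual SDK or HAL code; the function and dictionary names are made up, while the mode names come from the list above.

```python
# Conceptual sketch only: how the drive mode could be picked automatically
# from the kind of command requested. The dispatch function and dictionaries
# are hypothetical; the mode names come from the list above.

AUTOMATIC_MODES = {
    "set_speed": "cmd_vel",  # spammed speed commands use cmd_vel
    "goto": "goto",          # goto requests use the goto mode
}

# Modes a user can set by hand ('speed' and 'goto' cannot be).
USER_MODES = {"cmd_vel", "brake", "free_wheel"}


def mode_for_command(command, requested_mode=None):
    """Return the drive mode to apply for a given command."""
    if command in AUTOMATIC_MODES:
        # Handled automatically, whatever the current mode is.
        return AUTOMATIC_MODES[command]
    if requested_mode in USER_MODES:
        # Manual mode changes are only allowed for user-settable modes.
        return requested_mode
    raise ValueError(f"unsupported command/mode: {command}/{requested_mode}")
```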
-The code for the HAL can be found [here](https://github.com/pollen-robotics/zuuu_hal).
-
-## Control modes
-### Overview
-The control mode dictates the low level control strategy used by the mobile base's HAL.
-
-Two control modes are possible:
-* ***open_loop*** (default mode): in this mode, the wheels are compliant and the control is smoother.
- ```python
- reachy_mobile.mobile_base.control_mode = 'open_loop'
- ```
-
-* ***pid***: in this mode, the wheels are stiff and the control is more precise.
- ```python
- reachy_mobile.mobile_base.control_mode = 'pid'
- ```
-:bulb: We recommend that you run the following [Jupyter Notebook](https://github.com/pollen-robotics/mobile-base-sdk/blob/main/mobile_base_sdk/examples/notebooks/control-modes.ipynb) to get a feel of what the control mode does.
-
-### Detailed behaviour
-Regardless of how the mobile base is piloted (goto, set_speed, controller), the HAL always ends up calculating a goal rotational speed for each wheel.
-The control mode only changes the used strategy to reach that rotational speed.
-* In the open_loop mode, a simple affine model was identified to match a PWM to a goal rotational speed. The VESC controllers then apply the PWM directly to the motors of the wheels, without any other low level control. The measures can be found [here](https://github.com/pollen-robotics/zuuu_hal/tree/main/measures). While the model is simple, it does account for the static friction and the experimental data shows a good fit when the mobile base is on a flat surface.
-
-{{< img-center "images/sdk/mobile-base/affine_pwm_model.png" 400x "" >}}
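To make the open-loop strategy above concrete, here is a minimal sketch of such an affine speed-to-PWM model with a static-friction offset. The coefficients are invented for illustration only; the real values were identified experimentally from the measures linked above.

```python
import math

# Hypothetical coefficients for illustration only; the real values were
# identified experimentally on the robot's wheels.
K = 0.02   # slope: PWM duty cycle per rad/s of goal speed
C = 0.05   # offset compensating static friction

def pwm_from_speed(goal_speed):
    """Affine open-loop model: map a goal wheel speed (rad/s) to a PWM duty cycle."""
    if goal_speed == 0.0:
        return 0.0  # no command, no PWM
    # The static-friction offset always acts in the direction of motion.
    return math.copysign(C + K * abs(goal_speed), goal_speed)
```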
-
-* In the pid mode, the HAL gives the goal rotational speeds directly to the VESC controllers of each wheel. The VESC will use a PID controller to control the speeds.
\ No newline at end of file
diff --git a/content/sdk/first-moves/_index.md b/content/sdk/first-moves/_index.md
deleted file mode 100644
index 095d2ec0..00000000
--- a/content/sdk/first-moves/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: "First Moves"
-description: "Basic steps to get started with the SDK. Learn how to control each part of Reachy."
-date: 2023-07-25T17:38:17+02:00
-lastmod: 2023-07-25T17:38:17+02:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/sdk/getting-started/_index.md b/content/sdk/getting-started/_index.md
deleted file mode 100644
index 98076fbf..00000000
--- a/content/sdk/getting-started/_index.md
+++ /dev/null
@@ -1,10 +0,0 @@
----
-title : "Getting Started"
-description: "Getting Started with the SDK."
-lead: ""
-date: 2023-07-25T18:49:17+02:00
-lastmod: 2023-07-25T18:49:17+02:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/sdk/getting-started/hello-world.md b/content/sdk/getting-started/hello-world.md
deleted file mode 100644
index b754521e..00000000
--- a/content/sdk/getting-started/hello-world.md
+++ /dev/null
@@ -1,109 +0,0 @@
----
-title: "Hello World"
-description: "First SDK connection with your Reachy"
-date: 2023-07-25T18:50:04+02:00
-lastmod: 2023-07-25T18:50:04+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "40"
----
-
-## Enable cameras for the SDK
-
-### SR camera
-The SR camera is unplugged by default.
-If you want to use it, plug the SR camera into the remaining USB port (2) on the robot's computer.
-
-{{< img-center "images/sdk/getting-started/plugged-sr.png" 400x "" >}}
-
-> Make sure to unplug it if you want to use the teleoperation.
-
-### Teleop cameras
-The teleop cameras are shared between the teleop service and the SDK server, and can only be used by one of them at a time.
-In order to be able to use the teleop cameras with the SDK:
-1. Go to the dashboard
-2. Stop [webrtc service in the services tab]({{< ref "/dashboard/content/services" >}})
-
-{{< img-center "images/sdk/first-moves/stop-webrtc-service.png" 600x "" >}}
-
-## Connect to your robot
-
-Now you should be able to connect to your Reachy 2 and check that everything is OK. As mentioned in the previous section, to connect to your robot, you simply need to run the following code:
-
-```python
-from reachy2_sdk import ReachySDK
-
-# Replace with the actual IP you've found.
-reachy = ReachySDK(host='the.reachy.ip.found.')
-```
-
-Before diving into the next chapters that will guide you in the depth of what you can do with the Reachy SDK, here is a quick preview.
-
-## Getting joints state
-
-To make sure everything is working fine, let's check the position of its joints. We won't go into details here as we will detail everything later.
-
-To get the state of a joint, you can access the *joints* attribute that contains all joints and iterate over its content:
-
-```python
-for name, joint in reachy.joints.items():
- print(f'Joint "{name}" is at pos {joint.present_position} degree.')
-```
-
-This will show something like:
-```python
-Joint "r_arm.shoulder.pitch" is at pos -3.6 degree.
-Joint "r_arm.shoulder.roll" is at pos 1.5 degree.
-Joint "r_arm.elbow.yaw" is at pos -3.1 degree.
-Joint "r_arm.elbow.pitch" is at pos 2.0 degree.
-Joint "r_arm.wrist.roll" is at pos -54.4 degree.
-Joint "r_arm.wrist.pitch" is at pos -0.9 degree.
-Joint "r_arm.wrist.yaw" is at pos -20.7 degree.
-Joint "l_arm.shoulder.pitch" is at pos 43.0 degree.
-Joint "l_arm.shoulder.roll" is at pos 0.8 degree.
-Joint "l_arm.elbow.yaw" is at pos 0.5 degree.
-Joint "l_arm.elbow.pitch" is at pos 1.2 degree.
-Joint "l_arm.wrist.roll" is at pos 0.1 degree.
-Joint "l_arm.wrist.pitch" is at pos 0.1 degree.
-Joint "l_arm.wrist.yaw" is at pos 1.1 degree.
-Joint "head.neck.roll" is at pos 4.5 degree.
-Joint "head.neck.pitch" is at pos -0.7 degree.
-Joint "head.neck.yaw" is at pos -1.9 degree.
-```
-
-Note that we have accessed the attribute *present_position* to get the joint's actual position. You can access the position of a specific joint by using its full name (meaning the part it is attached to plus its name). For instance, to get the position of the left shoulder pitch:
-
-```python
->>> print(reachy.l_arm.shoulder.pitch.present_position)
--3.6
-```
-
-You can also get a summary of the joint state by doing:
-```python
->>> print(reachy.l_arm.shoulder.pitch)
-
-```
-
-If you did not run anything else, your robot should be compliant (meaning you can freely move it). You can try to move it and re-run the code above. You should see that without doing anything specific, the positions are automatically updated.
-
-## Seeing through Reachy 2's cameras
-
-Assuming you are still connected (otherwise, simply reconnect), we will now display what Reachy sees in an [OpenCV window](https://opencv.org).
-
-```python
-import cv2
-
-# Adjust the import path to your SDK version if needed.
-from reachy2_sdk.media.camera import CameraView
-
-while reachy.cameras.teleop.capture():
-    l_frame = reachy.cameras.teleop.get_frame(CameraView.LEFT)
-    r_frame = reachy.cameras.teleop.get_frame(CameraView.RIGHT)
-    cv2.imshow("left", l_frame)
-    cv2.imshow("right", r_frame)
-    cv2.waitKey(1)
-```
-
-You should now see what Reachy sees!
-
-To stop the code, press Ctrl-C.
-
diff --git a/content/sdk/getting-started/overview.md b/content/sdk/getting-started/overview.md
deleted file mode 100644
index c6b323c8..00000000
--- a/content/sdk/getting-started/overview.md
+++ /dev/null
@@ -1,60 +0,0 @@
----
-title: "SDK Overview"
-description: "Understand the structure of the SDK."
-lead: ""
-date: 2023-07-25T18:49:56+02:00
-lastmod: 2023-07-25T18:49:56+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "50"
----
-
-## Understand the SDK structure
-
-### Reachy
-
-### Parts
-
-
-### Actuators
-
-Reachy's arm offers 7 degrees of freedom. It also gives access to one joint for the gripper.
-The **arm** is divided as follows:
-- **shoulder**, composed of 2 joints (pitch and roll)
-- **elbow**, composed of 2 joints (yaw and pitch)
-- **wrist**, composed of 3 joints (roll, pitch and yaw)
-
-We refer to the shoulder, elbow and wrist as **actuators**.
-For some actions, such as changing the compliance, this is the lowest level of control you will have.
-
-### Joints
-
-Each degree of freedom of Reachy is referred to as a **joint**. Joints are the lowest level of control you can have.
-The Orbita2D (used as shoulders and elbows in Reachy 2) offers the control of 2 joints, while the Orbita3D (used as wrists and neck) offers the control of 3 joints.
-A joint is an angle whose position you can control in order to make movements with Reachy. For each joint, you can read a present position and write a goal position. Those positions are given in degrees.
-
-```python
-reachy.r_arm.elbow.pitch.present_position
->>> 0.0
-
-reachy.r_arm.elbow.pitch.goal_position = -90
-```
-
-### Cameras
-
-You have 2 different camera types on Reachy:
-- the **teleop** cameras, which are the cameras in Reachy's head. They are mobile cameras that move with the head, provide stereovision, and are used for teleoperation.
-- the **SR** cameras, which are the short-range cameras on Reachy's torso. They are fixed cameras, with an accessible depth map, mainly useful for manipulation tasks.
-
-You can access those cameras doing:
-
-```python
-reachy.cameras.teleop
->>>
-
-reachy.cameras.SR
->>>
-```
-{{< alert icon="👉" text="The teleop cameras are shared between the Teleoperation service and the SDK server, and can only be used by one of them at once. Make sure you enabled the access to the teleop cameras for the SDK server before trying to use them through the Python SDK." >}}
diff --git a/content/sdk/getting-started/safety.md b/content/sdk/getting-started/safety.md
deleted file mode 100644
index 11a53019..00000000
--- a/content/sdk/getting-started/safety.md
+++ /dev/null
@@ -1,148 +0,0 @@
----
-title: "Safety first"
-description: "What you need to be aware of as a user to prevent Reachy 2 from getting damaged and you from getting hurt."
-lead: "Before showing you how to control each part of the robot, let's talk a bit about safety, both for you and the robot."
-date: 2023-07-25T18:50:18+02:00
-lastmod: 2023-07-25T18:50:18+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "60"
----
-
-{{< warning icon="👉🏾" text="Reachy 2 is much more powerful than the previous version. To avoid any accident, please follow carefully the safety guidelines!" >}}
-
-> There is currently **no automatic collision security** on the robot: it won't stop if hitting anything or anyone, even itself. Remain constantly watchful when using it.
-
-## Users
-
-### Attention and reaction
-
-Users must be in **full possession of their physical and mental powers at all times** when using the robot. Reachy 2 must never be used by someone having consumed substances that could affect their reactions, such as medication, drugs or alcohol.
-
-Users must **keep attention focused** on the robot at any time, especially if they are near the robot workspace, and imperatively if they are in its workspace or if they are responsible for the [emergency stop button]({{< ref "/sdk/getting-started/safety#emergency-stop-button" >}}).
-
-### Qualified users
-
-The robot must not be used if no qualified user is present.
-
-People using the robot or interacting with it must all be aware of the risks and be explicitly informed of the robot capabilities, limitations and restrictions. They must all be able to act with the appropriate behavior using the robot.
-
-{{< alert icon="👉" text="No one should use the robot without knowing the safety guidelines." >}}
-
-## Emergency stop button
-
-The robot is delivered with an emergency stop button.
-
-Pressing the emergency stop button will **immediately power off all motors**, from the arms to the mobile base wheels. Nevertheless it won't power off the computer, which means you won't lose anything running on the computer.
-
-> If you feel like you are losing control of the robot's movements or notice an unexpected behavior at anytime, **never hesitate to press the emergency stop button**.
-
-Someone must be holding the emergency stop button at all times when the robot is in use, ready to press it if needed, and must keep their attention focused on the robot.
-
-{{< alert icon="👉" text="Objects may fall out of the grippers when pressing the emergency stop button. Make sure they cannot cause injuries." >}}
-
-## Don't harm yourself...
-
-Reachy 2 is a powerful robot that may hurt you if it is misused.
-
-If you do not respect the safety guidelines, you expose yourself to the following risks:
-- pinching
-- crushing
-- punches
-- electrical hazard
-
-### Alertness
-
-People interacting with the robot or present near its workspace must always look at the robot.
-
-If the robot is being teleoperated with its mobile base, people in the surroundings must be informed of the robot's presence, and the operator must never make the robot appear by surprise near a person, nor bring it so close that the person is within reach of the arms.
-
-### Appropriate position
-
-Do not expose yourself to dangerous punches!
-
-People must never place their head, or any other body part, in between or underneath segments of the robot when the robot is in use. Their head should never be reachable by the robot's arms when the robot is in use.
-
-If people are near the workspace of the robot, they must always stay in a position that allows them to quickly retract or recoil.
-
-> When the robot is in use, no one should enter or stay in the robot workspace.
-
-### Free space for retracting
-
-If people are standing near the robot workspace, make sure they have **sufficient space to retract or recoil**, and that this space is free of obstacles.
-
-People must never be blocked between the robot and a wall or furniture.
-
-### Objects manipulation with Reachy 2
-
-Be careful with the objects you manipulate with the robot. Manipulating sharp or pointed objects is dangerous: do not get close to the robot while it is handling such objects.
-
-For all manipulation tasks, users are responsible for assessing the hazards and risks relative to the objects they manipulate with the robot.
-
-### Manipulate the robot
-
-When the robot is in use, never handle its parts by hand.
-
-Users must be careful when putting their fingers in the actuators or between robot parts, to avoid pinching or crushing.
-:warning: They must never put their fingers in the actuators or between robot parts while the robot is in use.
-
-### Hardware intervention
-
-Never make any hardware intervention on the robot, such as screwing or unscrewing something, while it is powered on.
-
-### Robot toppling risk
-
-The following section ["...and don't harm Reachy 2!"]({{< ref "/sdk/getting-started/safety#and-dont-harm-reachy-2" >}}) mainly describes risks of the robot toppling or colliding. This may damage the robot, but also harm anyone near it.
-**All events in the following section can lead to user injuries**, so read them as user safety guidelines as well.
-
-## ...and don't harm Reachy 2!
-
-There are a few things you need to know to make sure that your Reachy doesn't get damaged when using it.
-
-### Carrying heavy objects
-
-Be careful of the position of the arms when lifting heavy objects with the robot.
-Avoid carrying the object too far from the robot's torso, mainly to avoid the risk of front toppling.
-
-Do not try to lift objects over 3kg (6.6lb).
-
-### Pulling/pushing
-
-Do not try to pull or push elements that are too heavy or that offer too much resistance!
-
-This may cause the robot to topple.
-
-### Obstacles
-
-Be aware of obstacles!
-
-When you are sending movement instructions to Reachy, be careful about obstacles the robot may meet. The robot will try as hard as it can to reach the positions you asked for, whether or not there is something in its way.
-
-Because of the force of the robot, and depending on the weight or fragility of the object, two things may occur:
-- make the object fall and/or break it
-- make Reachy 2 tumble
-
-### Self-collision
-
-When you are moving both arms simultaneously, there are no safety measures implemented to prevent them from hitting each other.
-Nor will anything prevent Reachy's arms from hitting its chest if you ask them to.
-
-If situations like these happen, do not hesitate to turn off the motors so that Reachy's motors will stop trying to reach a position they can't get to.
-
-### Mobile base
-
-#### Surface
-
-The mobile robot is made to be used on **flat surfaces**.
-Never use the robot on slopes: this may cause it to topple.
-
-#### Speed and movements
-
-Speed and commands are limited when using the Python SDK; nevertheless, you can still generate behaviors that may be dangerous. Do not ask for high speeds, hard stops, or sudden changes of direction.
-Provoking oscillations of the robot may cause it to topple.
-
-### Anti-collision LIDAR safety
-
-:warning: The anti-collision LIDAR safety has been deactivated.
diff --git a/content/sdk/introduction/_index.md b/content/sdk/introduction/_index.md
deleted file mode 100644
index 98076fbf..00000000
--- a/content/sdk/introduction/_index.md
+++ /dev/null
@@ -1,10 +0,0 @@
----
-title : "Getting Started"
-description: "Getting Started with the SDK."
-lead: ""
-date: 2023-07-25T18:49:17+02:00
-lastmod: 2023-07-25T18:49:17+02:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/sdk/introduction/introduction.md b/content/sdk/introduction/introduction.md
deleted file mode 100644
index 5c64b639..00000000
--- a/content/sdk/introduction/introduction.md
+++ /dev/null
@@ -1,68 +0,0 @@
----
-title: "Introduction"
-description: "Quick overview of the Python SDK and of the other options available to control the robot."
-date: 2023-07-25T18:50:18+02:00
-lastmod: 2023-07-25T18:50:18+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "10"
----
-
-## The SDK in a nutshell
-
-The [Python SDK](https://github.com/pollen-robotics/reachy2-sdk) lets you easily control and program a Reachy robot. It is used to read information (e.g. camera images or joint positions) and send commands to make the robot move.
-
-It is designed to:
-
-* let you start controlling your robot in a few lines of code,
-* allow you to focus on your application rather than on hardware synchronisation issues,
-* facilitate fast prototyping and iteration.
-
-Connecting to your robot and getting the up-to-date position of all joints is as simple as:
-```python
-from reachy2_sdk import ReachySDK
-
-reachy = ReachySDK(host='192.168.0.42') # Replace with the actual IP
-
-for name, joint in reachy.joints.items():
- print(f'Joint "{name}" position is {joint.present_position} degree.')
-```
-
-You can use it directly on Reachy's computer or work remotely from another computer, as long as you are connected to the same network. The SDK works on Windows/Mac/Linux and requires Python >= 3.10. It is entirely open-source and released under an [Apache 2.0 License](https://github.com/pollen-robotics/reachy-sdk/blob/main/LICENSE).
-
-## Is this the right option for me?
-
-The Python SDK is only one way to control Reachy. There are other options that have different pros and cons.
-
-To know if the SDK is the right option, the TL;DR here would be something like:
-
-* You want to **focus on creating an application or behavior on Reachy**.
-* You **don't want to dig into the details** of how it can be controlled, or run very time-constrained code (e.g. needing more than 100Hz control).
-* You have **basic knowledge of Python** (no advanced knowledge is required).
-* You do not already have a large code base running on ROS2.
-
-## The other options
-
-### Unity VR App
-
-If you are interested in teleoperation and want to control Reachy via VR controllers, you can directly use our Unity VR App. More information on the [dedicated section]({{< ref "VR/introduction/introduction" >}}).
-
-### ROS2 Humble packages
-
-Reachy runs on [ROS2 Humble](https://docs.ros.org/en/humble/index.html). ROS, the Robot Operating System, offers a huge variety of compatible algorithms and hardware drivers. Yet, if you are not familiar with ROS, getting started can be a bit overwhelming.
-
-The embedded NUC computer comes with ROS2 and Reachy specific packages already installed and running. They provide full access to Reachy (lower-level than the SDK). You can:
-- get the *joint states* and *forward position controllers*
-- use *Rviz*
-- subscribe to various sensor topics (camera, force sensor, etc.)
-- access client for IK/FK
-
-For more information, please refer to the [dedicated section]({{< ref "advanced/software/ros2-level" >}}).
-
-### Custom gRPC client
-
-If you want to use a language other than Python, for instance to integrate Reachy's control within an existing code base, you can write your own [gRPC](https://grpc.io) client. Our API is available [here](https://github.com/pollen-robotics/reachy-sdk-api).
-
-The API is used both by the Python SDK and the VR App.
diff --git a/content/sdk/mobile-base/_index.md b/content/sdk/mobile-base/_index.md
deleted file mode 100644
index 40e53140..00000000
--- a/content/sdk/mobile-base/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: "Mobile Base"
-description: "Learn how to use Reachy's mobile base."
-date: 2023-07-25T18:49:24+02:00
-lastmod: 2023-07-25T18:49:24+02:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/sdk/mobile-base/safety.md b/content/sdk/mobile-base/safety.md
deleted file mode 100644
index faa56964..00000000
--- a/content/sdk/mobile-base/safety.md
+++ /dev/null
@@ -1,45 +0,0 @@
----
-title: "Anti-collision safety"
-description: "LIDAR based anti-collision behaviour for the mobile base."
-date: 2023-07-26T08:37:55+02:00
-lastmod: 2023-07-26T08:37:55+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "170"
----
-## Overview
-The basic idea is that the LIDAR is used to detect surrounding obstacles and reduce or nullify speed commands that would create a collision with the mobile base.
-
-
- {{< video "videos/sdk/lidar_safety_human.mp4" "80%" >}}
-
-
-The safety is active regardless of how you command the mobile base (teleop, controller, goto and set_speed).
-
-:warning: The safety only works with obstacles that can be seen by the LIDAR. Small obstacles that are below the LIDAR won't be seen. Similarly, the LIDAR will see the legs of a table, but not the table top.
-
-
-## Detailed behaviour
-
- {{< video "videos/sdk/lidar_safety_360.mp4" "80%" >}}
-
-
-- If an obstacle is present inside of the critical distance boundary, then the speed of the mobile base is reduced in all directions, and nullified in the direction that would cause a collision. Rotations are slowed down but are still allowed.
-- Otherwise, if an obstacle is present inside of the safety distance boundary, then the speed of the mobile base is reduced only in the directions that would eventually cause a collision. Rotations are unchanged.
-- Obstacles that are further away than the safety distance do not trigger the safety in any way.
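Conceptually, the cases above boil down to scaling the commanded speed by a factor that depends on the closest obstacle distance. The sketch below is a simplified, direction-agnostic illustration with made-up boundary distances; the HAL's actual per-direction logic lives in `lidar_safety.py`, and its real parameters differ.

```python
def speed_scale(obstacle_distance, critical_distance=0.3, safety_distance=0.7):
    """Illustrative speed scaling factor from the closest obstacle distance (m).

    Distances are hypothetical; the real HAL parameters differ.
    """
    if obstacle_distance <= critical_distance:
        # Inside the critical boundary: the colliding direction is nullified.
        return 0.0
    if obstacle_distance <= safety_distance:
        # Between the two boundaries: linearly reduced speed.
        return (obstacle_distance - critical_distance) / (safety_distance - critical_distance)
    # Beyond the safety distance: command unchanged.
    return 1.0
```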
-
-
-:bulb: Reachy's design allows the LIDAR to see close to 360° around it, but not entirely, because of the metal bar: this creates a small blind spot. Even if a collision would be very unlikely (you'd have to e.g. drive backwards onto a perfectly aligned pole), any speed command that could create an unseen collision is slowed down.
-
-:warning: Do not obstruct the LIDAR by placing an object on top of the mobile base, as it will be considered an obstacle.
-
-:warning: If the LIDAR disconnects during usage or if its controller crashes, then the mobile base will stop and will reject commands.
-
-## Advanced tuning
-
-The mobile base's Hardware Abstraction Layer runs with the anti-collision behaviour active by default. Currently, disabling/enabling the safety is the only configuration you can make using the SDK. If you need to fine tune the behaviour, you'll have to interact with the world of ROS and change the [HAL parameters](https://github.com/pollen-robotics/zuuu_hal/blob/main/config/params.yaml) (you'll have to recompile the package for the changes to take effect).
-
-The code can be accessed [here.](https://github.com/pollen-robotics/zuuu_hal/blob/main/zuuu_hal/lidar_safety.py)
-
diff --git a/content/teleoperation/_index.md b/content/teleoperation/_index.md
new file mode 100644
index 00000000..48652df5
--- /dev/null
+++ b/content/teleoperation/_index.md
@@ -0,0 +1,10 @@
+---
+title: "Teleoperation"
+description: "Use the VR teleoperation app."
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+type: docs
+---
diff --git a/content/teleoperation/compatibility-specs/_index.md b/content/teleoperation/compatibility-specs/_index.md
new file mode 100644
index 00000000..34653ffc
--- /dev/null
+++ b/content/teleoperation/compatibility-specs/_index.md
@@ -0,0 +1,13 @@
+---
+title: "Compatibility & Specifications"
+description: "Find compatible VR headsets and get minimal computer specifications for teleoperation"
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+type: docs
+menu:
+ teleoperation:
+weight: 10
+---
diff --git a/content/teleoperation/compatibility-specs/compatible-devices.md b/content/teleoperation/compatibility-specs/compatible-devices.md
new file mode 100644
index 00000000..58c7daae
--- /dev/null
+++ b/content/teleoperation/compatibility-specs/compatible-devices.md
@@ -0,0 +1,47 @@
+---
+title: "Compatible devices"
+description: "Find compatible VR headsets and get minimal computer specifications for teleoperation"
+lead: "Compatible VR headsets and minimal computer specifications for VR teleoperation"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ teleoperation:
+ parent: "Compatibility & Specifications"
+weight: 100
+toc: true
+---
+
+# Compatible VR headsets
+
+So far, the VR teleoperation application has been tested with the following devices:
+* **Meta Quest 2** (with Oculus Link)
+* **Meta Quest 3** (with Oculus Link)
+
+> No native application for Meta Quest headsets has been released at the moment.
+
+
+The application should also support any device compatible with Unity 2022.3 including but not limited to the following devices:
+* **Valve Index**
+* **HTC Vive**
+* **Oculus Rift**
+
+
+Please refer to [Unity documentation](https://docs.unity3d.com/2020.3/Documentation/Manual/VROverview.html) for more information about the compatibility.
+
+# PC requirements
+
+
+The application is built on Unity 2022.3 LTS for which the requirements can be found [here](https://docs.unity3d.com/2020.3/Documentation/Manual/system-requirements.html).
+
+
+In order to use the desktop version of the teleoperation application, your PC needs to support Virtual Reality. We recommend a computer running Windows, powerful enough and equipped with a dedicated graphics card.
+
+The computer minimum requirements are the following:
+* **Operating System:** Windows 10 (or Windows 7 SP1)
+* **Processor:** Intel Core i5-4590/AMD FX 8350 equivalent or better
+* **Memory:** 8GB RAM
+* **Graphics card:** NVIDIA GeForce GTX 970, AMD Radeon R9 290 equivalent or better
+* **Network:** Broadband Internet connection. It is highly recommended for your PC to be **hard-wired** into your router using an **ethernet cable**.
diff --git a/content/teleoperation/compatibility-specs/latency-network.md b/content/teleoperation/compatibility-specs/latency-network.md
new file mode 100644
index 00000000..9b202d36
--- /dev/null
+++ b/content/teleoperation/compatibility-specs/latency-network.md
@@ -0,0 +1,17 @@
+---
+title: "Latency and network insights"
+description: "Get expected latency"
+lead: ""
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ teleoperation:
+ parent: "Compatibility & Specifications"
+weight: 110
+toc: true
+---
+
+This is how you assemble your robot
\ No newline at end of file
diff --git a/content/teleoperation/getting-started-teleoperation/_index.md b/content/teleoperation/getting-started-teleoperation/_index.md
new file mode 100644
index 00000000..ef722308
--- /dev/null
+++ b/content/teleoperation/getting-started-teleoperation/_index.md
@@ -0,0 +1,13 @@
+---
+title: "Getting started with teleoperation"
+description: "Find out how to start with the VR application for teleoperation"
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+type: docs
+menu:
+ teleoperation:
+weight: 20
+---
diff --git a/content/vr/use-teleop/best-practice.md b/content/teleoperation/getting-started-teleoperation/best-practice.md
similarity index 94%
rename from content/vr/use-teleop/best-practice.md
rename to content/teleoperation/getting-started-teleoperation/best-practice.md
index 226d7b6a..7a412e4b 100644
--- a/content/vr/use-teleop/best-practice.md
+++ b/content/teleoperation/getting-started-teleoperation/best-practice.md
@@ -1,143 +1,147 @@
----
-title: "🚨 Best practice"
-description: "Simple guidelines to follow for a good usage of the VR teleoperation app"
-lead: "Towards a good usage of the VR teleoperation app"
-date: 2023-07-26T09:01:27+02:00
-lastmod: 2023-07-26T09:01:27+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "80"
----
-
-{{< warning icon="👉🏾" text="This page contains really important information about the use of the teleoperation app. Please make sure you read it carefully before teleoperating Reachy." >}}
-
-Using teleoperation application has nothing complicated, but you need to respect a few guidelines to avoid damaging the robot when using it. This page goes through the main elements you need to keep in mind while teleoperating Reachy. The guidelines are not exhaustive, but should give you a good start on how to safely use the application.
-
-## Ideal use of teleoperation
-
-The ideal position to start teleoperation may depend on the surrounding of Reachy. Nevertheless, if the robot environment is compatible with it, we advise to start with the elbows at 90 degrees, lightly away from the torso.
-
-{{< img "images/vr/use-teleop/idealPosFaceReduced.jpg" 300x "ideal position face">}}
-{{< img "images/vr/use-teleop/idealPosProfReduced.jpg" 300x "ideal position side">}}
-
-
-
-Here is a video of movements and positions that are suitable for teleoperation:
-
-
-
-{{< video "videos/vr/use-teleop/ChestOk.mp4" "80%" >}}
-
-
-Follow all the elements described in the next sections to teleoperate Reachy in the best conditions!
-
-## All guidelines in video
-Watch this quick video to have an overview of the main guidelines to use teleoperation:
-
-{{< youtube bK7th6zY8Rg >}}
-
-
-The next sections go deeper into each guideline presented in the video and the risks of not following them.
-
-## Keep the right position
-The mapping between your position and the robot is made when holding (A) to start teleoperation. The position and rotation of your headset at this moment are used to calibrate the system. If you move (i.e. change either your body position or orientation), the controllers positions will still be calculated in this coordinate system, and Reachy movements won't look like like yours anymore. For these reasons, you must:
-
-- Not move your feet when teleoperating Reachy: they must stay static on the floor.
-
-{{< video "videos/vr/use-teleop/FeetOk.mp4" "40%" >}}
-
-{{< video "videos/vr/use-teleop/FeetNotOk.mp4" "40%" >}}
-
-
-- Not rotate your torso.
-In fact, Reachy's torso won't move, only the arms will try to reach the positions, and this may lead to collision between the Reachy's arms and torso.
-
-{{< video "videos/vr/use-teleop/ChestOk.mp4" "40%" >}}
-
-{{< video "videos/vr/use-teleop/ChestNotOk.mp4" "40%" >}}
-
-## Avoid movements discontinuities
-Reachy doesn't have the exact same degrees of freedom as you have, neither the same range for each joints. When a position cannot be reached, either because of the position or the orientation, the inverse kinematics gives the closest arm configuration found. The closest configuration found for the next position may be:
-
-- the same as the previous one, so the arm won't move and you have the impression Reachy is not following your movements anymore
-- quite different from the previous one, which will lead to sudden changes of the arm position
-
-All this contribute to give movements that seem incontrollable, due to discontinuities in the arm's inverse kinematics.
-
-**To avoid this situation:**
-
-- Avoid using extreme joints orientations while teleoperating Reachy
-- Avoid unusual arm positions, there are probably above Reachy's joints limits
-
-{{< video "videos/vr/use-teleop/MovementsOk.mp4" "40%" >}}
-
-{{< video "videos/vr/use-teleop/MovementsNotOk.mp4" "40%" >}}
-
-- The most limiting joint is the elbow: avoid working to close to your chest, the elbow will be at the limit of its range of motion
-
-{{< video "videos/vr/use-teleop/TorsoArmOk.mp4" "40%" >}}
-
-{{< video "videos/vr/use-teleop/TorsoArmNotOk.mp4" "40%" >}}
-
-- If the robot seems to stop following your movements, do not continue to move in this direction, you have already reached its workspace limit. Go back to a position you know can be reached.
-
-
-## Avoid damaging motors
-Reachy's arms have been designed to manipulate objects at a table level and nearby.
-Some positions away from this nominal area can require a lot of effort from the motors to be maintained, and cause them to overheat fast. Moreover, manipulating objects requires more effort from the motors.
-
-**To avoid damaging motors:**
-
-- Avoid doing movements above your head
-- Avoid keeping your arms straight ahead horizontally to the floor, where the shoulders motors have to carry all the weight of the arms in a static position
-
-{{< video "videos/vr/use-teleop/AboveHeadOk.mp4" "40%" >}}
-
-{{< video "videos/vr/use-teleop/AboveHeadNotOk.mp4" "40%" >}}
-
-- Do not let the motors in stiff mode when you are in the menu if you are not going to teleoperate the robot soon
-- Do not try to lift objects that are above Reachy's capabilities. If you try to lift an object and see that Reachy's arm can follow your movement or if you head some crackling noise coming from the motors, it probably means that the object is too heavy for Reachy's arm.
-
-{{< video "videos/vr/use-teleop/WeightOk.mp4" "40%" >}}
-
-{{< video "videos/vr/use-teleop/WeightNotOk.mp4" "40%" >}}
-
-## Avoid damaging 3D parts
-Hitting Reachy's arms on objects can break 3D parts of the robot. It may happen even if the arms crash into something at moderate speed.
-
-**To avoid damaging 3D parts:**
-- Check the environment surrounding the robot before starting the teleoperation. Make sure you have enough space around the robot and that there is no object to be hit by the robot (this may also save your object from being broken...)
-
-{{< video "videos/vr/use-teleop/CheckSpaceRobotOk.mp4" "40%" >}}
-
-{{< video "videos/vr/use-teleop/CheckSpaceRobotNotOk.mp4" "40%" >}}
-
-- Stop teleoperation close to the position which will be reached when the motors will be compliant, so that the arms won't fall from high.
-
-{{< video "videos/vr/use-teleop/StopArmOk.mp4" "40%" >}}
-
-{{< video "videos/vr/use-teleop/StopArmNotOk.mp4" "40%" >}}
-
-## Use teleop safely
-- Check the environment around you before starting teleoperation.
-
-{{< video "videos/vr/use-teleop/CheckSpaceOk.mp4" "40%" >}}
-
-{{< video "videos/vr/use-teleop/CheckSpaceNotOk.mp4" "40%" >}}
-
-- Stop teleoperation before removing your headset! You must be back in the menu before dropping the controllers and removing your headset, because Reachy will continue following your movements until you stop it.
-
-{{< video "videos/vr/use-teleop/RemoveHeadsetOk.mp4" "40%" >}}
-
-{{< video "videos/vr/use-teleop/RemoveHeadsetNotOk.mp4" "40%" >}}
-
-
-## Familiarize yourself with the robot
-- Before teleoperating the actual robot, familiarize yourself with its movements, its workspace and its joints limits. The virtual robot in the mirror scene is a good opportunity for that.
-- Stay near the robot for your first trials: listen to the motors sounds, be aware of your workspace and field of view in a environment you know, try to manipulate light objects.
-- Explore your own workspace with small and quite slow movements to see how the robot reacts and better understand the relation between your movements and its.
-
-
+---
+title: "Best practice"
+description: "Safety guidelines and other best practices for safe teleoperation"
+lead: "Safety guidelines and other best practices for safe teleoperation"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ teleoperation:
+ parent: "Getting started with teleoperation"
+weight: 220
+toc: true
+---
+
+
+{{< warning icon="👉🏾" text="This page contains really important information about the use of the teleoperation app. Please make sure you read it carefully before teleoperating Reachy." >}}
+
+Using the teleoperation application is not complicated, but you need to follow a few guidelines to avoid damaging the robot. This page goes through the main elements to keep in mind while teleoperating Reachy. The guidelines are not exhaustive, but they should give you a good start on how to use the application safely.
+
+## Ideal use of teleoperation
+
+The ideal position to start teleoperation may depend on the surroundings of Reachy. Nevertheless, if the robot's environment allows it, we advise starting with the elbows at 90 degrees, slightly away from the torso.
+
+{{< img "images/vr/use-teleop/idealPosFaceReduced.jpg" 300x "ideal position face">}}
+{{< img "images/vr/use-teleop/idealPosProfReduced.jpg" 300x "ideal position side">}}
+
+
+
+Here is a video of movements and positions that are suitable for teleoperation:
+
+
+
+{{< video "videos/vr/use-teleop/ChestOk.mp4" "80%" >}}
+
+
+Follow all the elements described in the next sections to teleoperate Reachy in the best conditions!
+
+## All guidelines in video
+Watch this quick video to have an overview of the main guidelines to use teleoperation:
+
+{{< youtube bK7th6zY8Rg >}}
+
+
+The next sections go deeper into each guideline presented in the video and the risks of not following them.
+
+## Keep the right position
+The mapping between your position and the robot's is made when holding (A) to start teleoperation. The position and rotation of your headset at that moment are used to calibrate the system. If you move (i.e. change either your body position or orientation), the controllers' positions will still be computed in the original coordinate system, and Reachy's movements won't look like yours anymore. For these reasons, you must:
+
+- Not move your feet when teleoperating Reachy: they must stay static on the floor.
+
+{{< video "videos/vr/use-teleop/FeetOk.mp4" "40%" >}}
+
+{{< video "videos/vr/use-teleop/FeetNotOk.mp4" "40%" >}}
+
+
+- Not rotate your torso.
+Reachy's torso won't move; only the arms will try to reach the positions, which may lead to collisions between Reachy's arms and torso.
+
+{{< video "videos/vr/use-teleop/ChestOk.mp4" "40%" >}}
+
+{{< video "videos/vr/use-teleop/ChestNotOk.mp4" "40%" >}}
+
+## Avoid movements discontinuities
+Reachy doesn't have exactly the same degrees of freedom as you, nor the same range for each joint. When a position cannot be reached, whether because of the position or the orientation, the inverse kinematics returns the closest arm configuration found. The closest configuration found for the next position may be:
+
+- the same as the previous one, so the arm won't move and you get the impression Reachy is no longer following your movements
+- quite different from the previous one, which will lead to sudden changes in the arm's position
+
+All this contributes to movements that seem uncontrollable, due to discontinuities in the arm's inverse kinematics.
+
+**To avoid this situation:**
+
+- Avoid extreme joint orientations while teleoperating Reachy
+- Avoid unusual arm positions: they are probably beyond Reachy's joint limits
+
+{{< video "videos/vr/use-teleop/MovementsOk.mp4" "40%" >}}
+
+{{< video "videos/vr/use-teleop/MovementsNotOk.mp4" "40%" >}}
+
+- The most limiting joint is the elbow: avoid working too close to your chest, where the elbow will be at the limit of its range of motion
+
+{{< video "videos/vr/use-teleop/TorsoArmOk.mp4" "40%" >}}
+
+{{< video "videos/vr/use-teleop/TorsoArmNotOk.mp4" "40%" >}}
+
+- If the robot seems to stop following your movements, do not keep moving in that direction: you have already reached its workspace limit. Go back to a position you know can be reached.
+
+
+## Avoid damaging motors
+Reachy's arms have been designed to manipulate objects at table level and nearby.
+Some positions away from this nominal area may require a lot of effort from the motors to maintain, causing them to overheat quickly. Moreover, manipulating objects requires additional effort from the motors.
+
+**To avoid damaging motors:**
+
+- Avoid making movements above your head
+- Avoid keeping your arms straight ahead, horizontal to the floor, where the shoulder motors have to carry all the weight of the arms in a static position
+
+{{< video "videos/vr/use-teleop/AboveHeadOk.mp4" "40%" >}}
+
+{{< video "videos/vr/use-teleop/AboveHeadNotOk.mp4" "40%" >}}
+
+- Do not leave the motors in stiff mode while you are in the menu if you are not going to teleoperate the robot soon
+- Do not try to lift objects that are beyond Reachy's capabilities. If you try to lift an object and see that Reachy's arm cannot follow your movement, or if you hear crackling noises coming from the motors, the object is probably too heavy for Reachy's arm.
+
+{{< video "videos/vr/use-teleop/WeightOk.mp4" "40%" >}}
+
+{{< video "videos/vr/use-teleop/WeightNotOk.mp4" "40%" >}}
+
+## Avoid damaging 3D parts
+Hitting Reachy's arms against objects can break the robot's 3D-printed parts. This may happen even if the arms hit something at moderate speed.
+
+**To avoid damaging 3D parts:**
+- Check the environment surrounding the robot before starting teleoperation. Make sure you have enough space around the robot and that there is no object the robot could hit (this may also save your object from being broken...)
+
+{{< video "videos/vr/use-teleop/CheckSpaceRobotOk.mp4" "40%" >}}
+
+{{< video "videos/vr/use-teleop/CheckSpaceRobotNotOk.mp4" "40%" >}}
+
+- Stop teleoperation close to the position the arms will reach once the motors become compliant, so that the arms won't fall from a height.
+
+{{< video "videos/vr/use-teleop/StopArmOk.mp4" "40%" >}}
+
+{{< video "videos/vr/use-teleop/StopArmNotOk.mp4" "40%" >}}
+
+## Use teleop safely
+- Check the environment around you before starting teleoperation.
+
+{{< video "videos/vr/use-teleop/CheckSpaceOk.mp4" "40%" >}}
+
+{{< video "videos/vr/use-teleop/CheckSpaceNotOk.mp4" "40%" >}}
+
+- Stop teleoperation before removing your headset! You must be back in the menu before dropping the controllers and removing your headset, because Reachy will continue following your movements until you stop it.
+
+{{< video "videos/vr/use-teleop/RemoveHeadsetOk.mp4" "40%" >}}
+
+{{< video "videos/vr/use-teleop/RemoveHeadsetNotOk.mp4" "40%" >}}
+
+
+## Familiarize yourself with the robot
+- Before teleoperating the actual robot, familiarize yourself with its movements, its workspace and its joint limits. The virtual robot in the mirror scene is a good opportunity for that.
+- Stay near the robot for your first trials: listen to the motor sounds, be aware of your workspace and field of view in an environment you know, and try to manipulate light objects.
+- Explore your own workspace with small and fairly slow movements to see how the robot reacts and better understand the relation between your movements and its own.
+
+
{{< alert icon="💡" text="You may feel like being in a video game at some point, but never forget that your movements are reproduced in real life!" >}}
\ No newline at end of file
diff --git a/content/vr/getting-started/connect.md b/content/teleoperation/getting-started-teleoperation/connect-reachy2.md
similarity index 79%
rename from content/vr/getting-started/connect.md
rename to content/teleoperation/getting-started-teleoperation/connect-reachy2.md
index 684c3126..780a5590 100644
--- a/content/vr/getting-started/connect.md
+++ b/content/teleoperation/getting-started-teleoperation/connect-reachy2.md
@@ -1,50 +1,53 @@
----
-title: "Connect to Reachy 2"
-description: ""
-lead: "How to launch the app and connect to the robot"
-date: 2023-08-21T16:00:11+02:00
-lastmod: 2023-08-21T16:00:11+02:00
-type: docs
-draft: false
-images: []
-toc: true
-weight: "60"
----
-
-## Launch the app
-
-Once everything is installed, you can launch the application.
-First connect your headset to your computer and make sure it is ready for use.
-
-> Meta Quest headsets must be used with the link.
-
-Then run the *Reachy2Teleoperation.exe* file from the previously unzipped folder to start the application.
-
-## Find Reachy IP address
-
-The LCD screen connected in Reachy's back should be diplaying its IP address.
-
-{{< img-center "images/vr/getting-started/lcd-display.png" 400x "" >}}
-
-If the LCD screen is not working or is unplugged, check out the page [Find my IP section]({{< ref "help/system/find-my-ip" >}}) to learn other ways to get the IP address.
-
-## Connect to the robot
-
-Create a new robot entry in the menu with the IP address you previously found.
-
-> Note that you must select the input fields with your VR beam and fill them in using your computer keyboard.
-
-Once the robot is created, select it and click on "**Connect**".
-You should then arrive in the *transition room* of the application.
-
-A message to allow the network access to the app may pop up, in this case **allow access**:
-
-{{< img-center "images/vr/getting-started/allow-access.png" 400x "" >}}
-
-Make sure the connection is fine by checking the information displayed at the top of the mirror.
-You must see:
-- a green text telling you "Connected to Reachy"
-- the view of the robot displayed in miniature
-- a good network connection indication
-
-{{< img-center "images/vr/getting-started/mirror-scene.png" 600x "" >}}
+---
+title: "Connect to Reachy 2"
+description: "Establish connection with the robot from the VR teleoperation application"
+lead: "Establish connection with the robot from the VR teleoperation application"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ teleoperation:
+ parent: "Getting started with teleoperation"
+weight: 210
+toc: true
+---
+
+## Launch the app
+
+Once everything is installed, you can launch the application.
+First connect your headset to your computer and make sure it is ready for use.
+
+> Meta Quest headsets must be used with Meta Quest Link.
+
+Then run the *Reachy2Teleoperation.exe* file from the previously unzipped folder to start the application.
+
+## Find Reachy IP address
+
+The LCD screen connected on Reachy's back should be displaying its IP address.
+
+{{< img-center "images/vr/getting-started/lcd-display.png" 400x "" >}}
+
+If the LCD screen is not working or is unplugged, check out the Find my IP section to learn other ways to get the IP address.
+
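Before creating the robot entry in the app, you can optionally check from your computer that the robot answers at this IP address. Below is a minimal sketch in Python; the port number is an assumption (gRPC servers commonly listen on 50051), so adapt it to your setup:

```python
import socket

def is_reachable(host: str, port: int = 50051, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical IP address read from the LCD screen:
# is_reachable("192.168.1.42")
```

If this returns False, double-check that your computer and Reachy are on the same network before troubleshooting the app itself.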
+## Connect to the robot
+
+Create a new robot entry in the menu with the IP address you previously found.
+
+> Note that you must select the input fields with your VR beam and fill them in using your computer keyboard.
+
+Once the robot is created, select it and click on "**Connect**".
+You should then arrive in the *transition room* of the application.
+
+A message to allow the network access to the app may pop up, in this case **allow access**:
+
+{{< img-center "images/vr/getting-started/allow-access.png" 400x "" >}}
+
+Make sure the connection is fine by checking the information displayed at the top of the mirror.
+You must see:
+- a green text telling you "Connected to Reachy"
+- the view of the robot displayed in miniature
+- a good network connection indication
+
+{{< img-center "images/vr/getting-started/mirror-scene.png" 600x "" >}}
diff --git a/content/vr/getting-started/installation.md b/content/teleoperation/getting-started-teleoperation/installation.md
similarity index 89%
rename from content/vr/getting-started/installation.md
rename to content/teleoperation/getting-started-teleoperation/installation.md
index 96147698..9bf2360e 100644
--- a/content/vr/getting-started/installation.md
+++ b/content/teleoperation/getting-started-teleoperation/installation.md
@@ -1,69 +1,73 @@
----
-title: "Installation"
-description: "How to install the VR teleoperation application"
-lead: "How to install the VR teleoperation application on your computer"
-date: 2023-08-21T16:00:11+02:00
-lastmod: 2023-08-21T16:00:11+02:00
-type: docs
-draft: false
-images: []
-toc: true
-weight: "40"
----
-
-> Reachy 2 is already fully compatible with the teleoperation application. You have nothing to install on the robot.
-
-## On the Windows computer
-
-### 1. Check VR device installation
-
-Make sure that your VR device is properly installed and running (please refer to your device documentation).
-
-### 2. Download application
-
-Download the zip archive we sent you, and unzip it.
-
-{{< alert icon="👉" text="No standalone application is available yet for Reachy 2 teleoperation." >}}
-
-### 3. Install GStreamer
-
-The project relies on GStreamer.
-
-- Please install the **[Windows Runtime](https://gstreamer.freedesktop.org/data/pkg/windows/1.24.0/msvc/gstreamer-1.0-msvc-x86_64-1.24.0.msi)**.
-
-- Choose the **complete installation**:
-{{< img-center "images/vr/getting-started/complete-installation.png" 400x "" >}}
-
-- Add `C:\gstreamer\1.0\msvc_x86_64\bin` to your PATH environment variable.
-To do so:
- - Access the **Edit the system environment variables** control panel:
- {{< img-center "images/vr/getting-started/control-panel.png" 400x "" >}}
-
- - Open **Environment Variables...**:
- {{< img-center "images/vr/getting-started/environment-variables.png" 400x "" >}}
-
- - Select the **Path** variable and click **Edit...**:
- {{< img-center "images/vr/getting-started/user-variables.png" 400x "" >}}
-
- - Click **New** and **add `C:\gstreamer\1.0\msvc_x86_64\bin`** to the list:
- {{< img-center "images/vr/getting-started/new-variable.png" 400x "" >}}
-
- - Also check you have GSTREAMER_1_0_ROOT_MSVC_X86_64 in your System variables, value being `C:\gstreamer\1.0/msvc_x86_64\`:
- {{< img-center "images/vr/getting-started/system-variables.png" 400x "" >}}
-
-- **Reboot** your computer after the installation.
-
-### 4. Configure the firewall
-
-{{< img-center "images/vr/getting-started/firewall.png" 400x "" >}}
-
-{{< img-center "images/vr/getting-started/allow-app.png" 400x "" >}}
-
-### 5. Choose your headset refresh rate (optional)
-
-If you have a Meta Quest headet, we advise you to set the refresh rate at 120 Hz.
-
-To do so, use the desktop Meta app (the one appearing on your computer when your headset is connected on your computer and the link activated).
-In the devices tab, select your headset, and modify the graphics preferences in the advanced section.
-
-{{< img-center "images/vr/getting-started/refresh-rate.png" 400x "" >}}
+---
+title: "Installation"
+description: "Download and install the latest VR teleoperation application"
+lead: "Download and install the latest VR teleoperation application"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ teleoperation:
+ parent: "Getting started with teleoperation"
+weight: 200
+toc: true
+---
+
+
+> Reachy 2 is already fully compatible with the teleoperation application. You have nothing to install on the robot.
+
+## On the Windows computer
+
+### 1. Check VR device installation
+
+Make sure that your VR device is properly installed and running (please refer to your device documentation).
+
+### 2. Download application
+
+Download the zip archive we sent you, and unzip it.
+
+{{< alert icon="👉" text="No standalone application is available yet for Reachy 2 teleoperation." >}}
+
+### 3. Install GStreamer
+
+The project relies on GStreamer.
+
+- Please install the **[Windows Runtime](https://gstreamer.freedesktop.org/data/pkg/windows/1.24.0/msvc/gstreamer-1.0-msvc-x86_64-1.24.0.msi)**.
+
+- Choose the **complete installation**:
+{{< img-center "images/vr/getting-started/complete-installation.png" 400x "" >}}
+
+- Add `C:\gstreamer\1.0\msvc_x86_64\bin` to your PATH environment variable.
+To do so:
+ - Access the **Edit the system environment variables** control panel:
+ {{< img-center "images/vr/getting-started/control-panel.png" 400x "" >}}
+
+ - Open **Environment Variables...**:
+ {{< img-center "images/vr/getting-started/environment-variables.png" 400x "" >}}
+
+ - Select the **Path** variable and click **Edit...**:
+ {{< img-center "images/vr/getting-started/user-variables.png" 400x "" >}}
+
+ - Click **New** and **add `C:\gstreamer\1.0\msvc_x86_64\bin`** to the list:
+ {{< img-center "images/vr/getting-started/new-variable.png" 400x "" >}}
+
+ - Also check you have GSTREAMER_1_0_ROOT_MSVC_X86_64 in your System variables, with value `C:\gstreamer\1.0\msvc_x86_64\`:
+ {{< img-center "images/vr/getting-started/system-variables.png" 400x "" >}}
+
+- **Reboot** your computer after the installation.
+
+### 4. Configure the firewall
+
+{{< img-center "images/vr/getting-started/firewall.png" 400x "" >}}
+
+{{< img-center "images/vr/getting-started/allow-app.png" 400x "" >}}
+
+### 5. Choose your headset refresh rate (optional)
+
+If you have a Meta Quest headset, we advise you to set the refresh rate to 120 Hz.
+
+To do so, use the desktop Meta app (the one that appears on your computer when your headset is connected to your computer and the link is activated).
+In the Devices tab, select your headset, and modify the graphics preferences in the Advanced section.
+
+{{< img-center "images/vr/getting-started/refresh-rate.png" 400x "" >}}
diff --git a/content/teleoperation/using-application/_index.md b/content/teleoperation/using-application/_index.md
new file mode 100644
index 00000000..8e45abc2
--- /dev/null
+++ b/content/teleoperation/using-application/_index.md
@@ -0,0 +1,13 @@
+---
+title: "Using Reachy2Teleoperation application"
+description: "Find out how to use Reachy2Teleoperation VR application for teleoperation"
+lead: ""
+date: 2023-07-25T15:34:02+02:00
+lastmod: 2023-07-25T15:34:02+02:00
+draft: false
+images: []
+type: docs
+menu:
+ teleoperation:
+weight: 30
+---
diff --git a/content/teleoperation/using-application/control-mobile-base.md b/content/teleoperation/using-application/control-mobile-base.md
new file mode 100644
index 00000000..4cb59533
--- /dev/null
+++ b/content/teleoperation/using-application/control-mobile-base.md
@@ -0,0 +1,17 @@
+---
+title: "Control the mobile base"
+description: "Use the mobile base in teleoperation"
+lead: "Use the mobile base in teleoperation"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ teleoperation:
+ parent: "Using Reachy2Teleoperation application"
+weight: 320
+toc: true
+---
+
+This page explains how to control the mobile base during teleoperation.
\ No newline at end of file
diff --git a/content/vr/use-teleop/commands.md b/content/teleoperation/using-application/controllers-inputs.md
similarity index 85%
rename from content/vr/use-teleop/commands.md
rename to content/teleoperation/using-application/controllers-inputs.md
index 963a53dc..b0dc6776 100644
--- a/content/vr/use-teleop/commands.md
+++ b/content/teleoperation/using-application/controllers-inputs.md
@@ -1,47 +1,50 @@
----
-title: "Commands"
-description: "Controller inputs mapping for VR teleoperation"
-date: 2023-07-26T09:01:36+02:00
-lastmod: 2023-07-26T09:01:36+02:00
-draft: false
-images: []
-type: docs
-toc: true
-hidden: true
-weight: "90"
----
-
-> A reminder of the controller inputs mapping is available in the **help** section of the *transition room* in the VR teleoperation application:
-{{< img "images/vr/use-teleop/help-panel.png" 600x "Help panel in VR transition room">}}
-
-
-## Meta Quest
-
-### Standard inputs
-
-{{< img "images/vr/use-teleop/meta-quest-mapping.png" 600x "Meta Quest controller mapping">}}
-
-|Input|Feature description |
-|----|--------------------|
-|**A**|**At robot teleoperation start:** Start robot teleoperation|
-| |**During teleoperation:** Return to menu|
-|**B**|**During teleoperation:** Mobile base boost|
-|**X**|**When leaving teleoperation (A pressed):** Lock robot position|
-|**Left Thumbstick**|**During teleoperation:** Control mobile base translation|
-|**Right Thumbstick**|**During teleoperation:** Control mobile base rotation|
-|**Left Index Trigger**|**In menu:** Select button|
-| |**During teleoperation:** Control left gripper|
-|**Right Index Trigger**|**In menu:** Select button|
-| |**During teleoperation:** Control right gripper|
-|**Left Controller position / orientation**|**During teleoperation:** Reachy's left arm end effector position / orientation|
-|**Right Controller position / orientation**|**During teleoperation:** Reachy's right arm end effector position / orientation|
-|**Headset orientation**|**During teleoperation:** Reachy's head orientation|
-
-### Emergency stop combination
-
-{{< img "images/vr/use-teleop/meta-quest-emergency-stop.png" 600x "Meta Quest controller emergency stop">}}
-
-|Input|Feature description |
-|----|--------------------|
-|**A + right index trigger + right middle finger trigger**|**During teleoperation:** Emergency stop|
-|**X + left index trigger + left middle finger trigger**|**During teleoperation:** Emergency stop|
+---
+title: "Controllers inputs"
+description: "Mapping between controllers and teleoperation features"
+lead: "Mapping between controllers and teleoperation features"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ teleoperation:
+ parent: "Using Reachy2Teleoperation application"
+weight: 330
+toc: true
+---
+
+> A reminder of the controller inputs mapping is available in the **help** section of the *transition room* in the VR teleoperation application:
+{{< img "images/vr/use-teleop/help-panel.png" 600x "Help panel in VR transition room">}}
+
+
+## Meta Quest
+
+### Standard inputs
+
+{{< img "images/vr/use-teleop/meta-quest-mapping.png" 600x "Meta Quest controller mapping">}}
+
+|Input|Feature description |
+|----|--------------------|
+|**A**|**At robot teleoperation start:** Start robot teleoperation|
+| |**During teleoperation:** Return to menu|
+|**B**|**During teleoperation:** Mobile base boost|
+|**X**|**When leaving teleoperation (A pressed):** Lock robot position|
+|**Left Thumbstick**|**During teleoperation:** Control mobile base translation|
+|**Right Thumbstick**|**During teleoperation:** Control mobile base rotation|
+|**Left Index Trigger**|**In menu:** Select button|
+| |**During teleoperation:** Control left gripper|
+|**Right Index Trigger**|**In menu:** Select button|
+| |**During teleoperation:** Control right gripper|
+|**Left Controller position / orientation**|**During teleoperation:** Reachy's left arm end effector position / orientation|
+|**Right Controller position / orientation**|**During teleoperation:** Reachy's right arm end effector position / orientation|
+|**Headset orientation**|**During teleoperation:** Reachy's head orientation|
+
+### Emergency stop combination
+
+{{< img "images/vr/use-teleop/meta-quest-emergency-stop.png" 600x "Meta Quest controller emergency stop">}}
+
+|Input|Feature description |
+|----|--------------------|
+|**A + right index trigger + right middle finger trigger**|**During teleoperation:** Emergency stop|
+|**X + left index trigger + left middle finger trigger**|**During teleoperation:** Emergency stop|
diff --git a/content/vr/getting-started/setup.md b/content/teleoperation/using-application/customize-teleop-session.md
similarity index 86%
rename from content/vr/getting-started/setup.md
rename to content/teleoperation/using-application/customize-teleop-session.md
index 22bb1b95..b5b5def3 100644
--- a/content/vr/getting-started/setup.md
+++ b/content/teleoperation/using-application/customize-teleop-session.md
@@ -1,60 +1,64 @@
----
-title: "Setup your teleop session"
-description: "Setup audio and motion sickness options before starting teleoperation"
-lead: "How to setup the parameters before starting teleoperation"
-date: 2023-08-21T16:00:11+02:00
-lastmod: 2023-08-21T16:00:11+02:00
-type: docs
-draft: false
-images: []
-toc: true
-weight: "70"
----
-
-## Audio and microphone setup
-
-To have a better experience within the VR, configure the audio of your headset from your **headset settings**.
-
-You should be able to speak through the robot and hear from it when you are in the *transition room*.
-Check both audio input and output are on, and set them to a correct value.
-
-> On the Meta Quest 2 headset, we use the following parameters:
-> - **audio input**: 100%
-> - **audio ouput**: 65%
-
-
-## Motion sickness options
-
-Once in the *transition room*, you have options you can configure to help you avoid motion sickness.
-
-On the left of the mirror, open the **Settings** tab, and configure the motion sickness options before starting teleoperating the robot.
-
-{{< img-center "images/vr/getting-started/vr-settings.png" 600x "" >}}
-
-
-### Reticle
-Display a reticle to give a fixed point in the field of view. By default, the reticle only appears when the mobile base is moving.
-
-*Option:*
-- **Always display reticle**: always display the reticle, even if the mobile base is not moving.
-
-{{< img-center "images/vr/getting-started/reticle.png" 300x "" >}}
-
-
-### Navigation effects
-
-- **No effect**
-- **Tunneling**: when moving the mobile base, a black tunneling will appear in your peripheral vision and reduce your field of view
-- **Reduced screen**: when moving the mobile base, the size of the image will be reduced to let you see an artificial horizon behind it.
-
-*Option:*
-- **Activate effect on demande only**: during teleoperation, press one of the joysticks to activate/deactivate the occurence of the selected effect.
-
- If used with *tunneling*, deactivate the effect will disable the tunneling when moving the mobile base, activate it will let it appear automatically.
-
- If used with *reduced screen*, activate or deactivate the effect will let you manually reduce the size of the image.
-
-|Tunneling effect|Reduced screen effect |
-|----|--------------------|
-|{{< img-center "images/vr/getting-started/tunneling.png" 300x "" >}}|{{< img-center "images/vr/getting-started/reduced-screen.png" 300x "" >}}
+---
+title: "Customize your teleop session"
+description: "Customize motion sickness effects and other user preferences"
+lead: "Customize motion sickness effects and other user preferences"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ teleoperation:
+ parent: "Using Reachy2Teleoperation application"
+weight: 340
+toc: true
+---
+
+
+## Audio and microphone setup
+
+For a better experience in VR, configure your headset's audio from your **headset settings**.
+
+You should be able to speak through the robot and hear from it when you are in the *transition room*.
+Check that both audio input and output are enabled, and set them to appropriate levels.
+
+> On the Meta Quest 2 headset, we use the following parameters:
+> - **audio input**: 100%
+> - **audio output**: 65%
+
+
+## Motion sickness options
+
+Once in the *transition room*, you have options you can configure to help you avoid motion sickness.
+
+On the left of the mirror, open the **Settings** tab and configure the motion sickness options before you start teleoperating the robot.
+
+{{< img-center "images/vr/getting-started/vr-settings.png" 600x "" >}}
+
+
+### Reticle
+Display a reticle to give a fixed point in the field of view. By default, the reticle only appears when the mobile base is moving.
+
+*Option:*
+- **Always display reticle**: always display the reticle, even if the mobile base is not moving.
+
+{{< img-center "images/vr/getting-started/reticle.png" 300x "" >}}
+
+
+### Navigation effects
+
+- **No effect**
+- **Tunneling**: when moving the mobile base, a black tunnel effect appears in your peripheral vision, reducing your field of view
+- **Reduced screen**: when moving the mobile base, the size of the image will be reduced to let you see an artificial horizon behind it.
+
+*Option:*
+- **Activate effect on demand only**: during teleoperation, press one of the joysticks to activate/deactivate the selected effect.
+
+ If used with *tunneling*, deactivating the effect disables the tunneling when the mobile base moves, while activating it lets the tunneling appear automatically.
+
+ If used with *reduced screen*, activating or deactivating the effect lets you manually reduce the size of the image.
+
+|Tunneling effect|Reduced screen effect |
+|----|--------------------|
+|{{< img-center "images/vr/getting-started/tunneling.png" 300x "" >}}|{{< img-center "images/vr/getting-started/reduced-screen.png" 300x "" >}}
|
\ No newline at end of file
diff --git a/content/vr/use-teleop/emergency-stop.md b/content/teleoperation/using-application/emergency-stop.md
similarity index 80%
rename from content/vr/use-teleop/emergency-stop.md
rename to content/teleoperation/using-application/emergency-stop.md
index a5ea04aa..4dddebcf 100644
--- a/content/vr/use-teleop/emergency-stop.md
+++ b/content/teleoperation/using-application/emergency-stop.md
@@ -1,16 +1,20 @@
---
title: "Emergency stop"
-description: "Stop immediately sending commands to the robot"
-lead: "Stop immediately sending commands to the robot"
-date: 2023-07-26T09:01:49+02:00
-lastmod: 2023-07-26T09:01:49+02:00
+description: "Quickly stop robot movements in teleoperation"
+lead: "Quickly stop robot movements in teleoperation"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
draft: false
images: []
type: docs
+menu:
+ teleoperation:
+ parent: "Using Reachy2Teleoperation application"
+weight: 310
toc: true
-weight: "140"
---
+
In case you feel like something unexpected is happening with the robot while you are teleoperating it in VR, you can stop immediately teleoperation rather than using the standard exit menu, which requires to wait for a few seconds.
> The VR application emergency stop does not replace the physical emergency stop button of the robot.
diff --git a/content/vr/use-teleop/start.md b/content/teleoperation/using-application/step-by-step.md
similarity index 95%
rename from content/vr/use-teleop/start.md
rename to content/teleoperation/using-application/step-by-step.md
index 53540bf5..0e833fb3 100644
--- a/content/vr/use-teleop/start.md
+++ b/content/teleoperation/using-application/step-by-step.md
@@ -1,156 +1,160 @@
----
-title: "Teleoperate Reachy"
-description: "Start and stop Reachy teleoperation using the VR app"
-lead: "How to use the VR teleoperation application"
-date: 2023-07-26T09:01:49+02:00
-lastmod: 2023-07-26T09:01:49+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "100"
----
-
-{{< warning icon="👉🏾" text="Before starting teleoperating Reachy, please make sure you read the Best Practice" >}}
-
-## In brief
-
-> The button names used below are for the Meta Quest headsets. Please refer to the [Controllers input page]({{< ref "/vr/use-teleop/commands">}}) to get the corresponding inputs for your device.
-
-### Start teleoperating Reachy
-
-1. Make sure the robot is turned on, connected to the network and that all the robot's services are running before launching the teleoperation application.
-
-2. Select the robot you want to teleoperate (or create a new one), and click on "Connect".
-
-3. Once in the mirror room, you can configure various settings. Take time to tune the motion sickness effects you want to use in the settings menu. When you are ready to start, press "Ready", then hold (A).
-
-4. **Look straight ahead, with your body in the same orientation as your head while pressing A** to start the teleoperation. *The initial head position is used to determine the coordinate system giving your VR controllers position.*
-
-{{< alert icon="👉" text="Warning: you must not move your body anymore after this step. The position of your VR controllers to master the robot arms are calculated depending on the position you had while pressing A." >}}
-
-{{< warning icon="🚨" text="Important: even if Reachy is bio-inspired, it cannot reproduce exactly all your movements. There are positions that cannot be reached by the robot. Please avoid unusual movements and do not persist in trying to reach a position if you see that the robot is stuck before it." >}}
-
-5. (NEW) You first have the control of the head and the mobile base, but **not of the arms**. Take a few seconds to check the robot surroundings and go to an appropriate place before starting the full teleoperation. When the environment is safe, **press A** to get the full control. You can also go back to the mirror room pressing the related button with your laser beam.
-
-6. Come back any time to mirror room by **holding A**. Teleoperation of the robot is automatically paused if the headset is removed.
-
-{{< alert icon="👉" text="Please stop teleoperation before removing your headset (go back to mirror room or quit the app). If you do not, Reachy will continue following your controllers and headset orientation during a few seconds, and this can cause damages to the robot." >}}
-
-### Stop teleoperation
-
-1. Come back to the **mirror room** to pause the teleoperation by **holding A** at any time during teleoperation.
-
-2. Leave the app by clicking "**Quit**" icons in the mirror room and connection menu.
-
-The motors are automatically turned into compliant mode when quitting the mirror room. Please make sure the arms are close enough to the lowest position they can reach when coming back to the menu to avoid them falling or hitting something.
-
-
-## Step-by-step starting
-1. Make sure that your VR equipement is up and running. Please refer to your device documentation.
-
-2. Make sure the robot is turned on, connected to the network and that all the robot services are running. *By default, if you haven't modified anything, all services should be automatically launched on start of the **full/starter kit** robots.*
-
-3. Launch the application *TeleoperateReachy.exe* file if you are using a VR device connected to a Windows computer. For Oculus Quest users, start the app from within the headset if you have installed the *.apk*.
-
-4. Equip yourself with your headset, make sure you can see both controllers and that the scene around you is moving correctly in accordance with your head movements.
-
-5. Choose the robot you want to connect to: you can select a robot with its IP address, or add a new one to the list of available robots.
-
-{{< img "images/vr/use-teleop/choose-robot.png" 600x "Change robot to connect">}}
-{{< img "images/vr/use-teleop/select-robot.png" 600x "Select robot to connect">}}
-
-6. Press *Connect* to initiate the communication with the robot.
-
-{{< img "images/vr/use-teleop/connect.png" 600x "Connect to a robot">}}
-
-7. You should be now in the *transition room*, and see yourself controlling a virtual reachy. The actual robot is not in control at that time but the live camera stream is displayed at the top right of the mirror. The info, help and settings menus are available here (they are documented in the next section). Please get familiar with the robot controls and features (emotion, grasping lock).
-
-{{< img "images/vr/use-teleop/mirror.png" 600x "Mirror scene">}}
-
-8. When you are ready, **face the mirror completely** and click on "Ready". The position of the actual robot appears in a semi-transparent green color. This may be useful when you've left the robot in a certain position that you would like to keep when entering the teleoperation. Hold (A) to start the teleoperation.
-
-{{< img "images/vr/use-teleop/mirror-ready.png" 600x "Start teleoperation">}}
-
-9. (NEW) You first have the control of the head and the mobile base, but **not of the arms**. Take a few seconds to check the robot surroundings and go to an appropriate place before starting the full teleoperation. When the environment is safe, **press A** to get the full control. You can also go back to the mirror room pressing the related button with your laser beam.
-
-10. A 3 seconds timer appears while you enter the teleoperation. The motors speeds are reduced during this time to avoid sudden movements of the robot. Full speed is reached at the end of this countdown.
-
-{{< img "images/vr/use-teleop/timer-start.png" 600x "Validate position before starting">}}
-
-{{< alert icon="👉" text="Warning: you don't want to move your torso and body anymore after this step. Only your head and arms. The position of your VR controllers to master the robot arms are calculated depending on the position you had while pressing A." >}}
-
-11. Come back any time to menu by **pressing A**. Teleoperation of the robot is automatically paused if the headset is removed.
-
-
-## Use Reachy's emotions
-*Use of the antennas emotion is not available on Reachy 2.*
-
-## Application features
-
-### Connection page
-
-{{% expand "> Add a new robot" %}}
-Click on the robot to select to open the panel of all saved robots:
-{{< img "images/vr/use-teleop/choose-robot.png" 600x "Change robot to connect">}}
-Then click on "Add new robot +" at the bottom right of the page:
-{{< img "images/vr/use-teleop/add-robot-button.png" 600x "Add robot button">}}
-Enter a robot name and the IP address of the robot (if the headset is connected on a computer, use the computer keyboard), and save your robot card:
-*The IP address is mandatory. If no name is given to the new robot, it will be called @Reachy by default*
-{{< img "images/vr/use-teleop/add-robot-card.png" 600x "Add robot panel">}}
-{{% /expand %}}
-
-{{% expand "> Modify an existing robot"%}}
-Click on the robot to select to open the panel of all saved robots:
-{{< img "images/vr/use-teleop/choose-robot.png" 600x "Change robot to connect">}}
-Then click on the pencil icon of the robot you want to modify:
-{{< img "images/vr/use-teleop/modify-robot-button.png" 600x "Modify robot button">}}
-Modify the info on the robot card and save the card:
-{{< img "images/vr/use-teleop/modify-robot-panel.png" 600x "Modify robot panel">}}
-{{% /expand %}}
-
-{{% expand "> Delete a saved robot"%}}
-Click on the robot to select to open the panel of all saved robots:
-{{< img "images/vr/use-teleop/choose-robot.png" 600x "Change robot to connect">}}
-Then click on the bin icon of the robot you want to delete:
-{{< img "images/vr/use-teleop/delete-robot-button.png" 600x "Delete robot button">}}
-Validate the deletion:
-{{< img "images/vr/use-teleop/delete-robot-panel.png" 600x "Delete robot panel">}}
-{{% /expand %}}
-
-
-### Mirror scene
-
-{{% expand "> Check robot status"%}}
-Open the info menu in the mirror room:
-{{< img "images/vr/use-teleop/mirror-info.png" 600x "Info menu">}}
-The connection and services status, and motor temperature are reported here.
-{{% /expand %}}
-
-{{% expand "> Controller mapping"%}}
-Open the help menu in the mirror room:
-{{< img "images/vr/use-teleop/mirror-help.png" 600x "Info menu">}}
-The mapping of the controller buttons to the robot actions are displayed here.
-{{% /expand %}}
-
-{{% expand "> Settings menu"%}}
-Open the settings menu in the mirror room:
-{{< img "images/vr/use-teleop/mirror-settings.png" 600x "Settings menu">}}
-Here you can set your size to improve the mapping between your movements and reachy's motion. Individual parts of the robot can be deactivated in the case you don't need the mobile base, a specific arm, etc.
-Motion sickness options are available in this panel: choose to display a reticle or not, and select a navigation effect that fit to your robot use.
-You can also modify the grasping mode there: with full control you decide at each time the opening of the gripper, while the grasping lock option enables you to close the gripper with on trigger press and open it with another one. Grasping lock option can be turned on/off as well in the emotion menu.
-{{% /expand %}}
-
-{{% expand "> Reset position"%}}
-While facing the mirror, your body should be aligned with Reachy's body. This is mandatory to have a consistent control. If this is not the case after having pressed "Ready", face the mirror and click on "Reset position".
-{{< img "images/vr/use-teleop/reset_position.png" 600x "Reset position">}}
-The "Reset position" button is placed at the bottom of the mirror, under the A loader.
-{{% /expand %}}
-
-### Teleoperation exit
-
-{{% expand "> Exit and lock position"%}}
-While press (A) to exit the teleoperation, you may hold (X) to activate the position lock. A lock is displayed when doing so.
-{{< img "images/vr/use-teleop/exit-lock.png" 600x "Exit and lock">}}
-The robot will stayed locked while you'll be back in the mirror room. This can be useful to keep a certain position while you need to take a break, change position or remove the headset. The position of the robot will be displayed by the semi-transparent green robot when you will restart the teleoperation.
-{{% /expand %}}
+---
+title: "Step-by-step starting"
+description: "Start and stop teleoperation"
+lead: "Start and stop teleoperation"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ teleoperation:
+ parent: "Using Reachy2Teleoperation application"
+weight: 300
+toc: true
+---
+
+
+{{< warning icon="👉🏾" text="Before you start teleoperating Reachy, please make sure you have read the Best Practices" >}}
+
+## In brief
+
+> The button names used below are for the Meta Quest headsets. Please refer to the Controllers input page to get the corresponding inputs for your device.
+
+### Start teleoperating Reachy
+
+1. Make sure the robot is turned on, connected to the network and that all the robot's services are running before launching the teleoperation application.
+
+2. Select the robot you want to teleoperate (or create a new one), and click on "Connect".
+
+3. Once in the mirror room, you can configure various settings. Take time to tune the motion sickness effects you want to use in the settings menu. When you are ready to start, press "Ready", then hold (A).
+
+4. **Look straight ahead, with your body in the same orientation as your head while pressing A** to start the teleoperation. *The initial head position is used to determine the coordinate system giving your VR controllers' positions.*
+
+{{< alert icon="👉" text="Warning: you must not move your body anymore after this step. The positions of your VR controllers used to control the robot arms are calculated relative to the position you had while pressing A." >}}
+
+{{< warning icon="🚨" text="Important: even if Reachy is bio-inspired, it cannot reproduce all your movements exactly. Some positions cannot be reached by the robot. Please avoid unusual movements, and do not persist in trying to reach a position if you see that the robot is stuck before reaching it." >}}
+
+5. (NEW) You first have control of the head and the mobile base, but **not of the arms**. Take a few seconds to check the robot's surroundings and move to an appropriate place before starting the full teleoperation. When the environment is safe, **press A** to get full control. You can also go back to the mirror room by pressing the related button with your laser beam.
+
+6. Come back to the mirror room at any time by **holding A**. Teleoperation of the robot is automatically paused if the headset is removed.
+
+{{< alert icon="👉" text="Please stop teleoperation before removing your headset (go back to the mirror room or quit the app). If you do not, Reachy will continue following your controllers and headset orientation for a few seconds, and this can cause damage to the robot." >}}
+
+### Stop teleoperation
+
+1. Come back to the **mirror room** at any time during teleoperation by **holding A** to pause the teleoperation.
+
+2. Leave the app by clicking the "**Quit**" icons in the mirror room and connection menu.
+
+The motors are automatically set to compliant mode when quitting the mirror room. Please make sure the arms are close to the lowest position they can reach when coming back to the menu, to avoid them falling or hitting something.
+
+
+## Step-by-step starting
+1. Make sure that your VR equipment is up and running. Please refer to your device documentation.
+
+2. Make sure the robot is turned on, connected to the network and that all the robot services are running. *By default, if you haven't modified anything, all services should be automatically launched on start of the **full/starter kit** robots.*
+
+3. Launch the *TeleoperateReachy.exe* application if you are using a VR device connected to a Windows computer. For Oculus Quest users, start the app from within the headset if you have installed the *.apk*.
+
+4. Put on your headset, and make sure you can see both controllers and that the scene around you moves correctly in accordance with your head movements.
+
+5. Choose the robot you want to connect to: you can select a robot with its IP address, or add a new one to the list of available robots.
+
+{{< img "images/vr/use-teleop/choose-robot.png" 600x "Change robot to connect">}}
+{{< img "images/vr/use-teleop/select-robot.png" 600x "Select robot to connect">}}
+
+6. Press *Connect* to initiate the communication with the robot.
+
+{{< img "images/vr/use-teleop/connect.png" 600x "Connect to a robot">}}
+
+7. You should now be in the *transition room*, where you see yourself controlling a virtual Reachy. The actual robot is not being controlled at this point, but the live camera stream is displayed at the top right of the mirror. The info, help and settings menus are available here (they are documented in the next section). Please get familiar with the robot controls and features (emotion, grasping lock).
+
+{{< img "images/vr/use-teleop/mirror.png" 600x "Mirror scene">}}
+
+8. When you are ready, **face the mirror completely** and click on "Ready". The position of the actual robot appears in a semi-transparent green color. This may be useful when you've left the robot in a certain position that you would like to keep when entering the teleoperation. Hold (A) to start the teleoperation.
+
+{{< img "images/vr/use-teleop/mirror-ready.png" 600x "Start teleoperation">}}
+
+9. (NEW) You first have control of the head and the mobile base, but **not of the arms**. Take a few seconds to check the robot's surroundings and move to an appropriate place before starting the full teleoperation. When the environment is safe, **press A** to get full control. You can also go back to the mirror room by pressing the related button with your laser beam.
+
+10. A 3-second timer appears when you enter the teleoperation. The motor speeds are reduced during this time to avoid sudden movements of the robot. Full speed is reached at the end of this countdown.
+
+{{< img "images/vr/use-teleop/timer-start.png" 600x "Validate position before starting">}}
+
+{{< alert icon="👉" text="Warning: you should not move your torso and body anymore after this step, only your head and arms. The positions of your VR controllers used to control the robot arms are calculated relative to the position you had while pressing A." >}}
+
+11. Come back to the menu at any time by **pressing A**. Teleoperation of the robot is automatically paused if the headset is removed.
+
+
+## Use Reachy's emotions
+*Use of the antennas emotion is not available on Reachy 2.*
+
+## Application features
+
+### Connection page
+
+{{% expand "> Add a new robot" %}}
+Click on the robot selector to open the panel of all saved robots:
+{{< img "images/vr/use-teleop/choose-robot.png" 600x "Change robot to connect">}}
+Then click on "Add new robot +" at the bottom right of the page:
+{{< img "images/vr/use-teleop/add-robot-button.png" 600x "Add robot button">}}
+Enter a robot name and the IP address of the robot (if the headset is connected to a computer, use the computer keyboard), and save your robot card:
+*The IP address is mandatory. If no name is given to the new robot, it will be called @Reachy by default.*
+{{< img "images/vr/use-teleop/add-robot-card.png" 600x "Add robot panel">}}
+{{% /expand %}}
+
+{{% expand "> Modify an existing robot"%}}
+Click on the robot selector to open the panel of all saved robots:
+{{< img "images/vr/use-teleop/choose-robot.png" 600x "Change robot to connect">}}
+Then click on the pencil icon of the robot you want to modify:
+{{< img "images/vr/use-teleop/modify-robot-button.png" 600x "Modify robot button">}}
+Modify the info on the robot card and save the card:
+{{< img "images/vr/use-teleop/modify-robot-panel.png" 600x "Modify robot panel">}}
+{{% /expand %}}
+
+{{% expand "> Delete a saved robot"%}}
+Click on the robot selector to open the panel of all saved robots:
+{{< img "images/vr/use-teleop/choose-robot.png" 600x "Change robot to connect">}}
+Then click on the bin icon of the robot you want to delete:
+{{< img "images/vr/use-teleop/delete-robot-button.png" 600x "Delete robot button">}}
+Validate the deletion:
+{{< img "images/vr/use-teleop/delete-robot-panel.png" 600x "Delete robot panel">}}
+{{% /expand %}}
+
+
+### Mirror scene
+
+{{% expand "> Check robot status"%}}
+Open the info menu in the mirror room:
+{{< img "images/vr/use-teleop/mirror-info.png" 600x "Info menu">}}
+The connection and service statuses, as well as the motor temperatures, are reported here.
+{{% /expand %}}
+
+{{% expand "> Controller mapping"%}}
+Open the help menu in the mirror room:
+{{< img "images/vr/use-teleop/mirror-help.png" 600x "Info menu">}}
+The mapping of the controller buttons to the robot actions is displayed here.
+{{% /expand %}}
+
+{{% expand "> Settings menu"%}}
+Open the settings menu in the mirror room:
+{{< img "images/vr/use-teleop/mirror-settings.png" 600x "Settings menu">}}
+Here you can set your size to improve the mapping between your movements and Reachy's motion. Individual parts of the robot can be deactivated in case you don't need the mobile base, a specific arm, etc.
+Motion sickness options are available in this panel: choose whether to display a reticle, and select a navigation effect that fits your robot use.
+You can also modify the grasping mode there: with full control you decide the opening of the gripper at all times, while the grasping lock option enables you to close the gripper with one trigger press and open it with another. The grasping lock option can also be turned on/off in the emotion menu.
+{{% /expand %}}
+
+{{% expand "> Reset position"%}}
+While facing the mirror, your body should be aligned with Reachy's body. This is mandatory for consistent control. If this is not the case after pressing "Ready", face the mirror and click on "Reset position".
+{{< img "images/vr/use-teleop/reset_position.png" 600x "Reset position">}}
+The "Reset position" button is placed at the bottom of the mirror, under the A loader.
+{{% /expand %}}
+
+### Teleoperation exit
+
+{{% expand "> Exit and lock position"%}}
+While pressing (A) to exit the teleoperation, you may hold (X) to activate the position lock. A lock is displayed when doing so.
+{{< img "images/vr/use-teleop/exit-lock.png" 600x "Exit and lock">}}
+The robot will stay locked while you are back in the mirror room. This can be useful to keep a certain position while you take a break, change position or remove the headset. The position of the robot will be displayed by the semi-transparent green robot when you restart the teleoperation.
+{{% /expand %}}
diff --git a/content/vr/use-teleop/messages.md b/content/teleoperation/using-application/teleoperation-messages.md
similarity index 76%
rename from content/vr/use-teleop/messages.md
rename to content/teleoperation/using-application/teleoperation-messages.md
index 7f13fee4..64d628f0 100644
--- a/content/vr/use-teleop/messages.md
+++ b/content/teleoperation/using-application/teleoperation-messages.md
@@ -1,25 +1,29 @@
----
-title: "Teleoperation messages"
-description: "Understand warning and error messages in the VR teleoperation app"
-lead: "Understand warning and error messages"
-date: 2023-07-26T09:01:43+02:00
-lastmod: 2023-07-26T09:01:43+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "130"
----
-
-During Reachy teleoperation, several messages can show up in front of view.
-
-**Warning messages**
-
-Some messages are just **warnings**, signaling you the quality of teleoperation may be altered or the current state of the robot may evolve into future errors (motors heating up or low battery). These messages are displayed on a **dark grey background**.
-When possible, please consider acting to prevent these warnings from becoming errors.
-
-**Error messages**
-
-Other messages may signal **errors**, which will lead to a fast dysfunction of the teleoperation. These messages are to take into account quickly, as you may not be controlling the robot properly anymore when they appear. These messages are displayed on a **red background**.
-
-{{< warning icon="👉🏾" text="When error messages appear, stop teleoperation and act appropriately depending on the error type. " >}}
+---
+title: "Teleoperation messages"
+description: "Understand displayed information during teleoperation sessions"
+lead: "Understand displayed information during teleoperation sessions"
+date: 2023-07-26T08:05:23+02:00
+lastmod: 2023-07-26T08:05:23+02:00
+draft: false
+images: []
+type: docs
+menu:
+ teleoperation:
+ parent: "Using Reachy2Teleoperation application"
+weight: 350
+toc: true
+---
+
+
+During Reachy teleoperation, several messages can show up in your field of view.
+
+**Warning messages**
+
+Some messages are just **warnings**, signaling that the quality of teleoperation may be degraded, or that the current state of the robot may lead to future errors (motors heating up or low battery). These messages are displayed on a **dark grey background**.
+When possible, please act to prevent these warnings from becoming errors.
+
+**Error messages**
+
+Other messages signal **errors**, which will quickly disrupt the teleoperation. Take these messages into account promptly, as you may no longer be controlling the robot properly when they appear. These messages are displayed on a **red background**.
+
+{{< warning icon="👉🏾" text="When error messages appear, stop teleoperation and act appropriately depending on the error type. " >}}
diff --git a/content/vr/_index.md b/content/vr/_index.md
deleted file mode 100644
index 04a27924..00000000
--- a/content/vr/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title : "VR app compatibility"
-description: "Use systend services with Reachy."
-date: 2023-07-26T08:58:44+02:00
-lastmod: 2023-07-26T08:58:44+02:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/vr/compatibility/_index.md b/content/vr/compatibility/_index.md
deleted file mode 100644
index 3cdab028..00000000
--- a/content/vr/compatibility/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: "Compatibility"
-description: "Get compatible VR headsets and minimal PC requirements for the teleoperation app to run"
-date: 2023-07-26T08:59:05+02:00
-lastmod: 2023-07-26T08:59:05+02:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/vr/compatibility/headsets.md b/content/vr/compatibility/headsets.md
deleted file mode 100644
index 47b8b1bd..00000000
--- a/content/vr/compatibility/headsets.md
+++ /dev/null
@@ -1,26 +0,0 @@
----
-title: "Headsets"
-description: "Get the list of compatible VR headset to use the VR teleoperation app"
-date: 2023-07-26T08:59:13+02:00
-lastmod: 2023-07-26T08:59:13+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "20"
----
-
-So far, the VR teleoperation application has been tested with the following devices:
-* **Meta Quest 2** (with Oculus Link)
-* **Meta Quest 3** (with Oculus Link)
-
-> No native application for Meta Quest headsets has been released at the moment.
-
-
-The application should also support any device compatible with Unity 2022.3 including but not limited to the following devices:
-* **Valve Index**
-* **HTC Vive**
-* **Oculus Rift**
-
-
-Please refer to [Unity documentation](https://docs.unity3d.com/2020.3/Documentation/Manual/VROverview.html) for more information about the compatibility.
\ No newline at end of file
diff --git a/content/vr/compatibility/pc-requirements.md b/content/vr/compatibility/pc-requirements.md
deleted file mode 100644
index 81585951..00000000
--- a/content/vr/compatibility/pc-requirements.md
+++ /dev/null
@@ -1,23 +0,0 @@
----
-title: "PC Requirements"
-description: "Get minimal VR requirements for the teleoperation app to run"
-date: 2023-07-26T08:59:23+02:00
-lastmod: 2023-07-26T08:59:23+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "30"
----
-
-The application is built on Unity 2022.3 LTS for which the requirements can be found [here](https://docs.unity3d.com/2020.3/Documentation/Manual/system-requirements.html).
-
-
-To use the desktop version of the teleoperation application, your PC needs to support Virtual Reality. We recommend a Windows computer that is powerful enough and equipped with a dedicated graphics card.
-
-The computer minimum requirements are the following:
-* **Operating System:** Windows 10 (or Windows 7 SP1)
-* **Processor:** Intel Core i5-4590/AMD FX 8350 equivalent or better
-* **Memory:** 8GB RAM
-* **Graphics card:** NVIDIA GeForce GTX 970, AMD Radeon R9 290 equivalent or better
-* **Network:** Broadband Internet connection. It is highly recommended for your PC to be **hard-wired** into your router using an **ethernet cable**.
diff --git a/content/vr/getting-started/_index.md b/content/vr/getting-started/_index.md
deleted file mode 100644
index fb773cad..00000000
--- a/content/vr/getting-started/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: "VR Installation"
-description: ""
-date: 2023-08-21T15:59:42+02:00
-lastmod: 2023-08-21T15:59:42+02:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/vr/getting-started/check-robot.md b/content/vr/getting-started/check-robot.md
deleted file mode 100644
index 5aafba33..00000000
--- a/content/vr/getting-started/check-robot.md
+++ /dev/null
@@ -1,34 +0,0 @@
----
-title: "Check robot is ready"
-description: ""
-lead: "Prepare your robot for teleoperation"
-date: 2023-08-21T16:00:11+02:00
-lastmod: 2023-08-21T16:00:11+02:00
-type: docs
-draft: false
-images: []
-toc: true
-weight: "50"
----
-
-## Little checks before start
-
-> When starting the robot, the services required for teleoperation are **automatically launched**.
-
-### SR camera must be unplugged
-
-Make sure the **SR camera is unplugged**: it can prevent the teleop cameras service from launching properly.
-
-{{< img-center "images/vr/getting-started/unplugged-sr.png" 400x "" >}}
-
-### Have you done anything since the last boot?
-
-In the following cases:
-- you have just unplugged the SR camera, without rebooting the robot
-- you have used the Python SDK during your session with the robot
-
-Then disconnect all running clients (if you used the Python SDK), and **restart the webrtc service** from the dashboard.
-
-{{< img-center "images/vr/getting-started/restart-webrtc.png" 600x "" >}}
-
-> A reboot of the robot will also work.
\ No newline at end of file
diff --git a/content/vr/installation.md b/content/vr/installation.md
deleted file mode 100644
index e86cbc46..00000000
--- a/content/vr/installation.md
+++ /dev/null
@@ -1,34 +0,0 @@
----
-title: "What needs to be installed"
-description: "How to install the VR teleoperation application"
-lead: "How to install the VR teleoperation application"
-date: 2023-07-26T09:00:02+02:00
-lastmod: 2023-07-26T09:00:02+02:00
-draft: false
-images: []
-type: docs
-toc: true
----
-
-{{< alert icon="👉" text="Reachy 2021/2023 is already fully compatible with the teleoperation application. You have nothing to install on the robot." >}}
-
-{{< alert icon="⬇️" text=" Download the latest version of the app">}}
-
-## On the Oculus Quest 2
-
-There are two options for this device: use it **natively on the headset** or run it on your computer using an **Oculus link**. If you want to use the Oculus Link, please refer to the *On Windows computer* section.
-To use it natively, choose one of the following options to install it.
-
-### From the Quest Store
-
-Contact us on our [discord channel](https://discord.com/channels/519098054377340948/991321051835404409) to be added to the list of the beta testers.
-
-### Using the apk
-
-[Download the apk from our github repo](https://github.com/pollen-robotics/ReachyTeleoperation/releases), and install it to your device with your favorite tool (with the [meta quest developer hub](https://developer.oculus.com/meta-quest-developer-hub/) for instance).
-
-## On the Windows computer
-
-Make sure that your VR device is properly installed and running (please refer to your device documentation).
-
-[Download the zip archive from our github repo](https://github.com/pollen-robotics/ReachyTeleoperation/releases), and unzip it. Simply launch the *TeleopReachy.exe* file to start the application.
diff --git a/content/vr/introduction/_index.md b/content/vr/introduction/_index.md
deleted file mode 100644
index fbaa98fe..00000000
--- a/content/vr/introduction/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: "VR Introduction"
-description: "Get quickly introduced to VR teleoperation"
-date: 2023-07-26T14:25:40+02:00
-lastmod: 2023-07-26T14:25:40+02:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/vr/introduction/introduction.md b/content/vr/introduction/introduction.md
deleted file mode 100644
index ede050ac..00000000
--- a/content/vr/introduction/introduction.md
+++ /dev/null
@@ -1,20 +0,0 @@
----
-title: "Introduction"
-description: "What is VR teleoperation application?"
-lead: "What is VR teleoperation application?"
-date: 2023-07-26T09:00:11+02:00
-lastmod: 2023-07-26T09:00:11+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "10"
----
-
-The Virtual Reality (VR) teleoperation application enables you to **control the robot remotely** with a VR device.
-
-By connecting to your robot, the teleoperation application gives you the ability to **move Reachy's arm** with the tracking of the VR controllers, to **rotate Reachy's head** following your own head movements and to **see through Reachy's cameras**.
-
-You can also **manipulate objects** remotely by controlling Reachy's grippers with your controllers' triggers.
-
-{{< youtube vVIBlbS2zJs >}}
\ No newline at end of file
diff --git a/content/vr/problem/_index.md b/content/vr/problem/_index.md
deleted file mode 100644
index 0eb9f4fa..00000000
--- a/content/vr/problem/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: "Problem"
-description: "Resolve problems you get using VR teleoperation application"
-date: 2023-07-26T09:00:36+02:00
-lastmod: 2023-07-26T09:00:36+02:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/vr/problem/debug.md b/content/vr/problem/debug.md
deleted file mode 100644
index f3be68a1..00000000
--- a/content/vr/problem/debug.md
+++ /dev/null
@@ -1,70 +0,0 @@
----
-title: "Debug"
-description: "Meeting a problem with teleoperation? Find out what can cause this and how to resolve the situation by yourself"
-date: 2023-07-26T09:00:50+02:00
-lastmod: 2023-07-26T09:00:50+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "150"
----
-
-## Check the info on the app!
-Connect to the robot to get more information on the connection status and the status of the robot. Open the "info" menu on the left of the mirror.
-
-{{< img "images/vr/problem/mirror-info.png" 600x "Mirror Info">}}
-
-### Connection status
-The connection status gives you information about the communication with the robot. The existing connection statuses are the following:
-* **Connected to a remote Reachy** *(green)*: everything seems to be working fine
-* **Trying to connect** *(blue)*: the app is looking for the connection with the robot
-* **Robot connection failed** *(orange)*: you are connected to a remote robot, but either the camera feed or the data stream failed. Teleoperation is not possible
-* **Unable to connect to remote server** *(red)*: no robot or service is detected after trying to connect
-
-### Network connection quality
-The status of the network connection can also help you:
-* **Good network connection** indicates the application manages to get fast responses from the robot
-* **Unable to reach robot** indicates the application cannot get any answer from the given IP address on the network
-
-### Services availability
-
-You can also check which services are available:
-* **Camera**: camera service from the cameras. ***Mandatory for teleoperation***
-* **Audio**: service to get sound from the robot to the operator
-* **Microphone**: service to send sound of the operator through the robot
-* **Motors**: joint services for sending and receiving data from the robot's motors. ***Mandatory for teleoperation***
-
-## The app doesn't connect to the robot
-
-If you are not connected to the robot, the reason can be one of the following:
-* you are not connected to the right IP address
-* the robot is not connected to the network
-* the services are not working on the robot (either not launched or crashed)
-* your computer is not connected to the network
-* the connection is not stable enough for the app to stay connected to the robot
-
-## Reachy never becomes ready
-
-First of all, check that the application managed to connect to the robot.
-The connection status with the robot is indicated at the top of the mirror.
-Camera view (top right) is not available if the connection failed.
-
-|Connected to the robot|Unable to connect to the robot|
-|----------------------|------------------------------|
-|{{< img "images/vr/problem/connected.png" 300x "Connected to Robot">}}| {{< img "images/vr/problem/notconnected.png" 300x "Not connected to Robot">}}|
-
-
-## The robot doesn't move properly
-**Reachy movements are shifted from my real movements**
-Your head was probably not correctly aligned with your body when you fixed your position, or you moved since the validation step.
-Come back to the mirror and validate your choices again to set a new position.
-
-**Reachy movements are jerky**
-The **connection is not fast enough** between the robot and your computer, or another program may be degrading the reactivity.
-A warning message may also be displayed during teleoperation indicating that the network is either unstable or slow.
-
-**The movements of the robot seem not correlated anymore with mine**
-If a motor is overheating, it may have **stopped working**, which can lead to movements that look very different from yours. In reality, the arm is still trying to follow your movements, but the unmoving joints make the configuration of the arm hard to understand.
-In most cases, an **error message** should be displayed during teleoperation, telling you that at least 1 motor is in critical error.
-Nevertheless, it may happen that no error message is displayed, if the motor stopped working before it had time to send the information to the teleoperation app: in that case, you received a warning message earlier during teleoperation telling you that at least 1 motor was heating up. Check the **temperature of the motors** in the **Info panel** of the transition room.
diff --git a/content/vr/problem/support-vr.md b/content/vr/problem/support-vr.md
deleted file mode 100644
index f9d64296..00000000
--- a/content/vr/problem/support-vr.md
+++ /dev/null
@@ -1,22 +0,0 @@
----
-title: "Support VR"
-description: "Get support from the community or Pollen Robotics if you meet problems with the VR teleoperation app"
-date: 2023-07-26T09:00:44+02:00
-lastmod: 2023-07-26T09:00:44+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "160"
----
-
-## Discord
-
-Join **[our Discord](https://discord.gg/vnYD6GAqJR)** if you have any questions, maybe someone has already asked the same question or other people could benefit from the answer!
-
-{{< alert icon="👉" text="Any questions relative to your development with Reachy? Join the Pollen Community on Discord" >}}
-
-
-## Pollen Robotics support
-
-For any specific questions concerning your robot or if you meet problems with the product, please contact us at [support@pollen-robotics.com](mailto:support@pollen-robotics.com).
diff --git a/content/vr/use-teleop/_index.md b/content/vr/use-teleop/_index.md
deleted file mode 100644
index b596cec3..00000000
--- a/content/vr/use-teleop/_index.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: "Use Teleop"
-description: "How to use the VR application to teleoperate Reachy correctly"
-date: 2023-07-26T09:01:14+02:00
-lastmod: 2023-07-26T09:01:14+02:00
-draft: false
-images: []
-type: docs
----
diff --git a/content/vr/use-teleop/mobile-base.md b/content/vr/use-teleop/mobile-base.md
deleted file mode 100644
index 0f849ea7..00000000
--- a/content/vr/use-teleop/mobile-base.md
+++ /dev/null
@@ -1,39 +0,0 @@
----
-title: "Control the mobile base"
-description: "Use the mobile base in the VR teleoperation application"
-lead: "How to use the mobile base in the VR teleoperation application"
-date: 2023-07-26T09:01:49+02:00
-lastmod: 2023-07-26T09:01:49+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "110"
----
-
-## Control the mobile base
-Use the **thumbstick/trackpad** to control the mobile base!
-The **left controller controls the translation** of the mobile base, while the **right one controls the rotation**.
-
-**Is there any security to prevent collision with objects?**
-
-**Yes!** If you are too close to a wall or object, the LIDAR anti-collision safety prevents the mobile base from getting closer to the obstacle. The mobile base will therefore not move in this direction, but you can still move in other directions. You will get a warning message when the anti-collision safety is triggered.
-[More information on the anti-collision safety](https://docs.pollen-robotics.com/sdk/mobile-base/safety/)
-
-Nevertheless, this safety applies to the mobile base only and won't prevent the robot's arms from colliding with external objects, so stay alert while teleoperating the robot.
-
-*Please note that very small objects won't be detected by the LIDAR sensor.*
-
-**What is the forward direction of Reachy?**
-
-The forward direction is aligned with the **forward direction of the mobile base**, meaning that giving a forward instruction will always make the robot physically move forward, no matter which direction you are looking in.
-
-Check the actual direction of your commands using the **indicator** at the bottom: the white arrow shows the direction command relative to your current head orientation. If your head is correctly aligned with the mobile base forward direction, this arrow will point forward when you give a forward command with your left controller.
-{{< img "images/vr/use-teleop/straight_forward.png" 600x "Forward direction looking straight">}}
-{{< img "images/vr/use-teleop/head_on_side_forward.png" 600x "Forward direction looking on the left">}}
-
-In the above images, the same forward command is sent from the left controller.
-In the first image, the user is looking straight ahead (the black arrow is located in the target view), so the white mobility arrow points forward.
-In the second image, the user is looking to the left (the target view is to the left of the black arrow), so the forward direction points to the right, as that is the direction aligned with the mobile base forward direction.
-
-*Note that these images are only examples; mobility is not available on the virtual Reachy.*
diff --git a/content/vr/use-teleop/motion-sickness.md b/content/vr/use-teleop/motion-sickness.md
deleted file mode 100644
index 0d81f762..00000000
--- a/content/vr/use-teleop/motion-sickness.md
+++ /dev/null
@@ -1,50 +0,0 @@
----
-title: "Motion sickness options"
-description: "Add visual effects to reduce motion sickness while using the teleoperation application"
-lead: "Add visual effects to reduce motion sickness while using the teleoperation application"
-date: 2023-07-26T09:01:49+02:00
-lastmod: 2023-07-26T09:01:49+02:00
-draft: false
-images: []
-type: docs
-toc: true
-weight: "120"
----
-
-Motion sickness may occur when using the VR teleoperation application.
-
-We added a few options you can activate to reduce motion sickness, either giving you a fixed point in your field of view or reducing peripheral movement when moving with the robot.
-
-You have access to these options in the ***transition room***.
-
-On the left of the mirror, open the **Settings** tab and configure the motion sickness options before starting to teleoperate the robot.
-
-{{< img-center "images/vr/getting-started/vr-settings.png" 600x "" >}}
-
-
-### Reticle
-Display a reticle to give a fixed point in the field of view. By default, the reticle only appears when the mobile base is moving.
-
-*Option:*
-- **Always display reticle**: always display the reticle, even if the mobile base is not moving.
-
-{{< img-center "images/vr/getting-started/reticle.png" 300x "" >}}
-
-
-### Navigation effects
-
-- **No effect**
-- **Tunneling**: when moving the mobile base, a black tunnel effect appears in your peripheral vision and reduces your field of view
-- **Reduced screen**: when moving the mobile base, the size of the image will be reduced to let you see an artificial horizon behind it.
-
-*Option:*
-- **Activate effect on demand only**: during teleoperation, press one of the joysticks to activate/deactivate the occurrence of the selected effect.
-
- If used with *tunneling*, deactivating the effect disables the tunneling when moving the mobile base, while activating it lets it appear automatically.
-
- If used with *reduced screen*, activating or deactivating the effect lets you manually reduce the size of the image.
-
-|Tunneling effect|Reduced screen effect |
-|----|--------------------|
-|{{< img-center "images/vr/getting-started/tunneling.png" 300x "" >}}|{{< img-center "images/vr/getting-started/reduced-screen.png" 300x "" >}}
-|
\ No newline at end of file
diff --git a/layouts/index.html b/layouts/index.html
index 99e43e5b..384c0008 100644
--- a/layouts/index.html
+++ b/layouts/index.html
@@ -4,8 +4,10 @@
{{ .Title }}
-
+
{{ .Params.lead | safeHTML }}
+
Getting started
+
Safety guidelines
@@ -18,31 +20,72 @@
{{ .Title }}
-
-
-
Just got your Reachy kit? Check how to install your robot and start having fun with it.
+
+
-
-
Easy control of a Reachy robot in Python: read sensor information (eg. camera, joint position, force) and send actuator commands.
+
+
+
+
-
-
Control the robot remotely using Reachy's VR teleoperation application.
+
+
+
+
-
-
Use Reachy's dashboard to easily debug and do basic control.
+
+
+
+
-
-
Can't solve your problem? Ask for help!
+
Help
diff --git a/resources/_gen/images/logo-pollen_huce698a7cb671e7aa50983aae17cf1693_91849_1270x740_resize_box_2.png b/resources/_gen/images/logo-pollen_huce698a7cb671e7aa50983aae17cf1693_91849_1270x740_resize_box_2.png
new file mode 100644
index 00000000..c88e78e4
Binary files /dev/null and b/resources/_gen/images/logo-pollen_huce698a7cb671e7aa50983aae17cf1693_91849_1270x740_resize_box_2.png differ
diff --git a/resources/_gen/images/logo-pollen_huce698a7cb671e7aa50983aae17cf1693_91849_3f6b16ffca8f553b211127d347977ca0.png b/resources/_gen/images/logo-pollen_huce698a7cb671e7aa50983aae17cf1693_91849_3f6b16ffca8f553b211127d347977ca0.png
new file mode 100644
index 00000000..b15dfd0b
Binary files /dev/null and b/resources/_gen/images/logo-pollen_huce698a7cb671e7aa50983aae17cf1693_91849_3f6b16ffca8f553b211127d347977ca0.png differ