
Docs update #571

Merged 3 commits on Nov 21, 2024
1 change: 1 addition & 0 deletions docs/acknowledgement.md
@@ -9,5 +9,6 @@
- [Andrew Kondrich](http://www.andrewkondrich.com/), [Jonathan Booher](https://web.stanford.edu/~jaustinb/) (domain randomization)
- [Albert Tung](https://www.linkedin.com/in/albert-tung3/) (demonstration collection)
- [Divyansh Jha](https://github.com/divyanshj16), [Fei Xia](http://fxia.me/) (robosuite v1.3 renderers)
+- [Zhenyu Jiang](https://zhenyujiang.me/), [Yuqi Xie](https://xieleo5.github.io/), [You Liang Tan](https://youliangtan.github.io/) (robosuite v1.5)

We wholeheartedly welcome the community to contribute to our project through issues and pull requests. New contributors will be added to the list above.
2 changes: 0 additions & 2 deletions docs/algorithms/demonstrations.md
@@ -45,8 +45,6 @@ We have included an example script that illustrates how demonstrations can be lo

We have included some sample demonstrations for each task at `models/assets/demonstrations`.

-Our sister project [RoboTurk](http://roboturk.stanford.edu) has also collected several human demonstration datasets across different tasks and humans, including pilot datasets of more than a thousand demonstrations for two tasks in our suite via crowdsourcing. You can find detailed information about the RoboTurk datasets [here](roboturk).
-

## Structure of collected demonstrations

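One quick way to see this structure for a given file is to walk it with `h5py` and print every group and dataset. This is a minimal sketch, assuming the demonstrations are stored as HDF5 (as robosuite's collection scripts produce); the file path below is hypothetical:

```python
import h5py

def print_tree(path):
    """Print every group and dataset in an HDF5 demonstration file."""
    with h5py.File(path, "r") as f:
        def visit(name, obj):
            shape = getattr(obj, "shape", None)  # datasets have shapes, groups do not
            print(name if shape is None else f"{name}  shape={shape}")
        f.visititems(visit)

print_tree("models/assets/demonstrations/lift.hdf5")  # hypothetical path
```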
36 changes: 0 additions & 36 deletions docs/algorithms/roboturk.md

This file was deleted.

1 change: 0 additions & 1 deletion docs/index.rst
@@ -67,7 +67,6 @@ Welcome to robosuite's documentation!
   algorithms/benchmarking
   algorithms/demonstrations
   algorithms/sim2real
-   algorithms/roboturk

.. toctree::
   :maxdepth: 1
14 changes: 7 additions & 7 deletions docs/modules/environments.md
@@ -10,17 +10,17 @@ Environments are created by calling `robosuite.make` with the name of the task a

```python
import robosuite
-from robosuite.controllers import load_part_controller_config
+from robosuite.controllers import load_composite_controller_config

-# load default controller parameters for Operational Space Control (OSC)
-controller_config = load_part_controller_config(default_controller="OSC_POSE")
+# BASIC controller: arms controlled using OSC, mobile base (if present) using JOINT_VELOCITY, other parts controlled using JOINT_POSITION
+controller_config = load_composite_controller_config(controller="BASIC")

# create an environment to visualize on-screen
env = robosuite.make(
    "TwoArmLift",
    robots=["Sawyer", "Panda"],            # load a Sawyer robot and a Panda robot
    gripper_types="default",               # use default grippers per robot arm
-   controller_configs=controller_config,  # each arm is controlled using OSC
+   controller_configs=controller_config,  # arms controlled via OSC, other parts via JOINT_POSITION/JOINT_VELOCITY
    env_configuration="opposed",           # (two-arm envs only) arms face each other
    has_renderer=True,                     # on-screen rendering
    render_camera="frontview",             # visualize the "frontview" camera
@@ -36,7 +36,7 @@ env = robosuite.make(
"TwoArmLift",
robots=["Sawyer", "Panda"], # load a Sawyer robot and a Panda robot
gripper_types="default", # use default grippers per robot arm
controller_configs=controller_config, # each arm is controlled using OSC
controller_configs=controller_config, # arms controlled via OSC, other parts via JOINT_POSITION/JOINT_VELOCITY
env_configuration="opposed", # (two-arm envs only) arms face each other
has_renderer=False, # no on-screen rendering
has_offscreen_renderer=False, # no off-screen rendering
@@ -52,7 +52,7 @@ env = robosuite.make(
"TwoArmLift",
robots=["Sawyer", "Panda"], # load a Sawyer robot and a Panda robot
gripper_types="default", # use default grippers per robot arm
controller_configs=controller_config, # each arm is controlled using OSC
controller_configs=controller_config, # arms controlled via OSC, other parts via JOINT_POSITION/JOINT_VELOCITY
env_configuration="opposed", # (two-arm envs only) arms face each other
has_renderer=False, # no on-screen rendering
has_offscreen_renderer=True, # off-screen rendering needed for image obs
@@ -73,7 +73,7 @@ We provide a few additional details on a few keyword arguments below to highligh

- `robots` : this argument can be used to easily instantiate tasks with different robot arms. For example, we could change the task to use two "Jaco" robots by passing `robots=["Jaco", "Jaco"]`. Once the environment is initialized, these robots (as captured by the [Robot](../simulation/robot.html#robot) class) can be accessed via the `robots` array attribute within the environment, i.e. `env.robots[i]` for the `i`-th robot arm in the environment.
- `gripper_types` : this argument can be used to easily swap out different grippers for each robot arm. For example, suppose we want to swap the default grippers for the arms in the example above. We could just pass `gripper_types=["PandaGripper", "RethinkGripper"]` to achieve this. Note that a single type can also be used to automatically broadcast the same gripper type across all arms.
-- `controller_configs` : this argument can be used to easily replace the action space for each robot arm. For example, if we would like to control the arm using joint velocities instead of OSC, we could use `load_controller_config(default_controller="JOINT_VELOCITY")` in the example above. Similar to `gripper_types` this value can either be per-arm specific or a single configuration to broadcast to all robot arms.
+- `controller_configs` : this argument can be used to easily replace the action space for each robot. For example, if we would like to control the robot using IK instead of OSC, we could use `load_composite_controller_config(controller="WHOLE_BODY_IK")` in the example above (see the first sketch after this list).
- `env_configuration` : this argument is mainly used for two-arm tasks to easily configure how the robots are oriented with respect to one another. For example, in the `TwoArmLift` environment, we could pass `env_configuration="parallel"` so that the robot arms are located next to each other instead of opposite each other.
- `placement_initializer` : this argument is optional, but can be used to specify a custom `ObjectPositionSampler` to override the default start state distribution for Mujoco objects. Samplers are responsible for sampling a set of valid, non-colliding placements for all of the objects in the scene at the start of each episode (e.g. when `env.reset()` is called); see the second sketch below.

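To make the `controller_configs` swap concrete, here is a minimal sketch that loads the whole-body IK controller and runs a short random-action loop. It assumes the standard `env.action_spec` / `env.step` interface from robosuite's quickstart; the robot choice and episode length are illustrative only.

```python
import numpy as np
import robosuite
from robosuite.controllers import load_composite_controller_config

# swap in whole-body IK instead of the BASIC controller shown above
controller_config = load_composite_controller_config(controller="WHOLE_BODY_IK")

env = robosuite.make(
    "TwoArmLift",
    robots=["Jaco", "Jaco"],               # same task, different robots
    gripper_types="default",
    controller_configs=controller_config,  # single config broadcast to both robots
    env_configuration="parallel",          # arms side by side
    has_renderer=False,
    has_offscreen_renderer=False,
    use_camera_obs=False,
)

obs = env.reset()
low, high = env.action_spec                # per-dimension action bounds
for _ in range(100):
    action = np.random.uniform(low, high)  # random exploratory action
    obs, reward, done, info = env.step(action)
env.close()
```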
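Similarly, a sketch of overriding the start-state distribution through `placement_initializer`. The `UniformRandomSampler` import path and argument names follow robosuite's placement-sampler utilities, but treat them as assumptions to verify against your installed version:

```python
import robosuite
from robosuite.utils.placement_samplers import UniformRandomSampler

# place objects uniformly in a 20cm x 20cm square around the table center,
# with a random rotation about the z-axis (rotation=None)
sampler = UniformRandomSampler(
    name="ObjectSampler",
    x_range=[-0.1, 0.1],
    y_range=[-0.1, 0.1],
    rotation=None,
    ensure_object_boundary_in_range=False,
    ensure_valid_placement=True,
    z_offset=0.01,
)

env = robosuite.make(
    "Lift",
    robots="Panda",
    placement_initializer=sampler,  # overrides the default sampler
    has_renderer=False,
    has_offscreen_renderer=False,
    use_camera_obs=False,
)
obs = env.reset()  # object placements are re-sampled on every reset
```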
4 changes: 2 additions & 2 deletions docs/modules/robots.md
@@ -2,11 +2,11 @@

![robot_overview_diagram](../images/robot_module.png)

-**Robots** are a key component in **robosuite**, and serve as the embodiment of a given agent as well as the central interaction point within an environment and key interface to MuJoCo for the robot-related state and control. **robosuite** captures this level of abstraction with the [Robot](../simulation/robot)-based classes, with support for both single-armed and bimanual variations. In turn, the Robot class is centrally defined by a [RobotModel](../modeling/robot_model), [RobotBaseModel](../modeling/robot_model.html#base-model), and [Controller(s)](../simulation/controller). Subclasses of the `RobotModel` class may also include additional models as well; for example, the [ManipulatorModel](../modeling/robot_model.html#manipulator-model) class also includes [GripperModel(s)](../modeling/robot_model.html#gripper-model) (with no gripper being represented by a dummy class).
+**Robots** are a key component in **robosuite**: they serve as the embodiment of a given agent, the central interaction point within an environment, and the key interface to MuJoCo for robot-related state and control. **robosuite** captures this level of abstraction with the [Robot](../simulation/robot)-based classes, with support for both fixed-base and mobile-base (legged and wheeled) variations. In turn, the Robot class is centrally defined by a [RobotModel](../modeling/robot_model), [RobotBaseModel](../modeling/robot_model.html#base-model), and [Controller(s)](../simulation/controller). Subclasses of the `RobotModel` class may also include additional models; for example, the [ManipulatorModel](../modeling/robot_model.html#manipulator-model) class also includes [GripperModel(s)](../modeling/robot_model.html#gripper-model) (with no gripper being represented by a dummy class).

The high-level features of **robosuite**'s robots are described as follows:

-* **Diverse and Realistic Models**: **robosuite** provides models for 8 commercially-available manipulator robots (including the bimanual Baxter robot), 7 grippers (including the popular Robotiq 140 / 85 models), and 6 controllers, with model properties either taken directly from the company website or raw spec sheets.
+* **Diverse and Realistic Models**: **robosuite** provides models for 20 commercially-available robots (including the humanoid GR1 robot), 15 grippers (including the Inspire dexterous hand model), and 6 controllers, with model properties taken directly from either the company website or raw spec sheets.

* **Modularized Support**: Robots are designed to be plug-n-play -- any combinations of robots, models, and controllers can be used, assuming the given environment is intended for the desired robot configuration. Because each robot is assigned a unique ID number, multiple instances of identical robots can be instantiated within the simulation without error.

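As a small illustration of this plug-n-play design, the following sketch builds the same two-arm task with swapped grippers and inspects the resulting Robot instances via the `env.robots` array described earlier; attribute access beyond `env.robots` is plain Python, so the sketch makes no further API assumptions.

```python
import robosuite

# same task as above, but with the default grippers swapped across arms
env = robosuite.make(
    "TwoArmLift",
    robots=["Sawyer", "Panda"],
    gripper_types=["PandaGripper", "RethinkGripper"],  # per-arm gripper swap
    has_renderer=False,
    has_offscreen_renderer=False,
    use_camera_obs=False,
)

# each entry of env.robots is a Robot instance with a unique ID
for i, robot in enumerate(env.robots):
    print(f"robot {i}: {type(robot).__name__}")
```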