Cleans up the how-to documentation (#325)
# Description

This MR reviews the how-to documentation and ensures the guides are consistent with
one another.

## Type of change

- This change requires a documentation update

## Checklist

- [x] I have run the [`pre-commit` checks](https://pre-commit.com/) with
`./orbit.sh --format`
- [x] I have made corresponding changes to the documentation
- [x] My changes generate no new warnings
- [ ] I have added tests that prove my fix is effective or that my
feature works
- [ ] I have updated the changelog and the corresponding version in the
extension's `config/extension.toml` file
- [x] I have added my name to the `CONTRIBUTORS.md` or my name already
exists there
Mayankm96 authored Dec 20, 2023
1 parent cf7a65f commit bff476a
Showing 8 changed files with 198 additions and 167 deletions.
85 changes: 48 additions & 37 deletions docs/source/how-to/draw_markers.rst
@@ -1,64 +1,75 @@
Creating Visualization Markers
==============================

.. currentmodule:: omni.isaac.orbit

Visualization markers are useful to debug the state of the environment. They can be used to visualize
the frames, commands, and other information in the simulation.

While Isaac Sim provides its own :mod:`omni.isaac.debug_draw` extension, it is limited to rendering only
points, lines and splines. For cases where you need to render more complex shapes, you can use the
:class:`markers.VisualizationMarkers` class.

This guide is accompanied by a sample script ``markers.py`` in the ``orbit/source/standalone/demos``
directory.

.. dropdown:: Code for markers.py
   :icon: code

   .. literalinclude:: ../../../source/standalone/demos/markers.py
      :language: python
      :emphasize-lines: 49-97, 113-145
      :linenos:



Configuring the markers
-----------------------

The :class:`~markers.VisualizationMarkersCfg` class provides a simple interface to configure
different types of markers. It takes in the following parameters:

- :attr:`~markers.VisualizationMarkersCfg.prim_path`: The corresponding prim path for the marker class.
- :attr:`~markers.VisualizationMarkersCfg.markers`: A dictionary specifying the different marker prototypes
  handled by the class. The key is the name of the marker prototype and the value is its spawn configuration.

.. note::

   In case the marker prototype specifies a configuration with physics properties, these are removed.
   This is because the markers are not meant to be simulated.

Here we show all the different types of markers that can be configured. These range from simple shapes like
cones and spheres to more complex geometries such as a frame or arrows. The marker prototypes can also be
configured from USD files.
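As an illustrative sketch, a configuration with two prototypes might look like the following. This is a hedged example, not the demo's exact configuration: the spawn configuration classes (``sim_utils.SphereCfg``, ``sim_utils.UsdFileCfg``) follow the Orbit spawner API, and the prim path, sizes, and prototype names are made up.

```python
# Hypothetical configuration fragment; requires Isaac Sim / Orbit to run.
import omni.isaac.orbit.sim as sim_utils
from omni.isaac.orbit.markers import VisualizationMarkers, VisualizationMarkersCfg

marker_cfg = VisualizationMarkersCfg(
    prim_path="/Visuals/myMarkers",  # illustrative prim path
    markers={
        # a simple shape prototype with a visual material
        "sphere": sim_utils.SphereCfg(
            radius=0.1,
            visual_material=sim_utils.PreviewSurfaceCfg(diffuse_color=(1.0, 0.0, 0.0)),
        ),
        # a prototype loaded from a USD file (path elided here)
        "mesh": sim_utils.UsdFileCfg(usd_path="...", scale=(0.5, 0.5, 0.5)),
    },
)
my_visualizer = VisualizationMarkers(marker_cfg)
```

The keys of the ``markers`` dictionary (here ``"sphere"`` and ``"mesh"``) are the prototype names that are later referred to when drawing.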

.. literalinclude:: ../../../source/standalone/demos/markers.py
   :language: python
   :lines: 49-97
   :dedent:


Drawing the markers
-------------------

To draw the markers, we call the :meth:`~markers.VisualizationMarkers.visualize` method. This method takes
as arguments the pose of the markers and the corresponding marker prototypes to draw.

.. literalinclude:: ../../../source/standalone/demos/markers.py
   :language: python
   :lines: 142-148
   :dedent:
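The poses handed to the visualizer are translations plus quaternions. A minimal NumPy sketch of laying markers out on a grid and spinning them about the z-axis is shown below; the grid spacing, marker count, and function names here are illustrative, not taken from ``markers.py``.

```python
import numpy as np

def grid_translations(num_markers: int, spacing: float = 0.5) -> np.ndarray:
    """Place markers on a square XY grid at z = 0."""
    side = int(np.ceil(np.sqrt(num_markers)))
    xs, ys = np.meshgrid(np.arange(side), np.arange(side), indexing="ij")
    grid = np.stack([xs.ravel(), ys.ravel(), np.zeros(side * side)], axis=-1)
    return spacing * grid[:num_markers]

def yaw_quaternion(angle: float) -> np.ndarray:
    """Unit quaternion (w, x, y, z) for a rotation about the z-axis."""
    return np.array([np.cos(angle / 2), 0.0, 0.0, np.sin(angle / 2)])

translations = grid_translations(6)
orientations = np.stack([yaw_quaternion(0.1 * i) for i in range(6)])
# In the demo script, arrays like these would be handed to the visualizer:
#   my_visualizer.visualize(translations, orientations, marker_indices=...)
```

Incrementing the yaw angle every simulation step produces the rotating effect seen in the demo.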


Executing the Script
--------------------

To run the accompanying script, execute the following command:

.. code-block:: bash

   ./orbit.sh -p source/standalone/demos/markers.py

The simulation should start, and you can observe the different types of markers arranged in a grid pattern.
The markers will rotate around their respective axes. Additionally, every few rotations, they will
roll forward on the grid.

To stop the simulation, close the window, or use ``Ctrl+C`` in the terminal.
77 changes: 56 additions & 21 deletions docs/source/how-to/save_camera_output.rst
@@ -6,44 +6,79 @@ Saving rendered images and 3D re-projection

.. currentmodule:: omni.isaac.orbit

This how-to demonstrates an efficient way of saving rendered images and projecting depth images into 3D space.
It is accompanied by the ``run_usd_camera.py`` script in the ``orbit/source/standalone/tutorials/04_sensors``
directory.

.. dropdown:: Code for run_usd_camera.py
   :icon: code

   .. literalinclude:: ../../../source/standalone/tutorials/04_sensors/run_usd_camera.py
      :language: python
      :emphasize-lines: 137-139, 172-196, 200-204, 214-232
      :linenos:

Saving using Replicator Basic Writer
------------------------------------

To save camera outputs, we use the basic writer class from Omniverse Replicator. This class allows us to
save the images in a NumPy format. For more information on the basic writer, please check the
`documentation <https://docs.omniverse.nvidia.com/extensions/latest/ext_replicator/writer_examples.html>`_.

.. literalinclude:: ../../../source/standalone/tutorials/04_sensors/run_usd_camera.py
   :language: python
   :lines: 137-139
   :dedent:

While stepping the simulator, the images can be saved to the defined folder. Since the BasicWriter only supports
saving data using NumPy format, we first need to convert the PyTorch tensors to NumPy arrays before packing
them in a dictionary.

.. literalinclude:: ../../../source/standalone/tutorials/04_sensors/run_usd_camera.py
   :language: python
   :lines: 172-192
   :dedent:

After this step, we can save the images using the BasicWriter.

.. literalinclude:: ../../../source/standalone/tutorials/04_sensors/run_usd_camera.py
   :language: python
   :lines: 193-196
   :dedent:
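The conversion step can be sketched as a small helper that mirrors what the script does per annotator. NumPy arrays stand in for the camera's PyTorch tensors so the sketch stays dependency-free; the function name and dictionary layout are illustrative, not the script's exact structure.

```python
import numpy as np

def pack_for_writer(camera_output: dict) -> dict:
    """Convert per-annotator outputs into a {name: ndarray} dictionary for the writer."""
    rep_output = {}
    for name, data in camera_output.items():
        # In the actual script `data` is a torch.Tensor and the conversion is
        # `data.cpu().numpy()`; np.asarray keeps this sketch dependency-free.
        rep_output[name] = np.asarray(data)
    return rep_output

# Stand-in for one camera frame: an RGB image and a depth map.
frame = {
    "rgb": np.zeros((480, 640, 3), dtype=np.uint8),
    "distance_to_image_plane": np.ones((480, 640), dtype=np.float32),
}
packed = pack_for_writer(frame)
```

Moving tensors to host memory before packing is the key step, since the writer serializes NumPy arrays, not GPU tensors.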


Projection into 3D Space
------------------------

We include utilities to project the depth image into 3D space. The re-projection operations are done using
PyTorch, which allows us to use the GPU for faster computation.

.. literalinclude:: ../../../source/standalone/tutorials/04_sensors/run_usd_camera.py
   :language: python
   :lines: 200-204
   :dedent:

The resulting point cloud can be visualized using the :mod:`omni.isaac.debug_draw` extension from Isaac Sim.
This makes it easy to inspect the re-projected points in the viewport.

.. literalinclude:: ../../../source/standalone/tutorials/04_sensors/run_usd_camera.py
   :language: python
   :lines: 214-232
   :dedent:
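The math behind the re-projection is standard pinhole unprojection. A NumPy sketch is given below, assuming a z-depth image (``distance_to_image_plane``) and a simple intrinsic matrix ``K``; Orbit's own utility used in the script operates on batched PyTorch tensors instead.

```python
import numpy as np

def unproject_depth(depth: np.ndarray, intrinsics: np.ndarray) -> np.ndarray:
    """Lift a (H, W) depth image to an (H*W, 3) point cloud in the camera frame."""
    height, width = depth.shape
    u, v = np.meshgrid(np.arange(width), np.arange(height))
    # Homogeneous pixel coordinates, one row per pixel.
    pixels = np.stack([u.ravel(), v.ravel(), np.ones(height * width)], axis=-1)
    # Rays through each pixel: K^-1 @ [u, v, 1]^T; each ray has unit z-component.
    rays = pixels @ np.linalg.inv(intrinsics).T
    # Scale each ray by its z-depth to recover the 3D point.
    return rays * depth.ravel()[:, None]

# Example: a flat plane 2 m away, seen by a 4x4 camera with focal length 2 px.
K = np.array([[2.0, 0.0, 2.0], [0.0, 2.0, 2.0], [0.0, 0.0, 1.0]])
points = unproject_depth(np.full((4, 4), 2.0), K)
```

Because the depth here is the distance to the image plane, every recovered point has z equal to the depth value; a distance-to-camera depth image would need an extra normalization of the rays.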


Executing the script
--------------------

To run the accompanying script, execute the following command:

.. code-block:: bash

   ./orbit.sh -p source/standalone/tutorials/04_sensors/run_usd_camera.py --save --draw

The simulation should start, and you can observe different objects falling down. An output folder will be created
in the ``orbit/source/standalone/tutorials/04_sensors`` directory, where the images will be saved. Additionally,
you should see the point cloud in the 3D space drawn on the viewport.

To stop the simulation, close the window, press the ``STOP`` button in the UI, or use ``Ctrl+C`` in the terminal.
13 changes: 8 additions & 5 deletions docs/source/how-to/wrap_rl_env.rst
@@ -1,13 +1,16 @@
.. _how-to-env-wrappers:

Wrapping environments
=====================

.. currentmodule:: omni.isaac.orbit

Environment wrappers are a way to modify the behavior of an environment without modifying the environment itself.
This can be used to apply functions to modify observations or rewards, record videos, enforce time limits, etc.
A detailed description of the API is available in the :class:`gymnasium.Wrapper` class.

At present, all RL environments inheriting from the :class:`~envs.RLTaskEnv` class
are compatible with :class:`gymnasium.Wrapper`, since the base class implements the :class:`gymnasium.Env` interface.
In order to wrap an environment, you need to first initialize the base environment. After that, you can
wrap it with as many wrappers as you want by calling ``env = wrapper(env, *args, **kwargs)`` repeatedly.
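The wrapping pattern itself is plain composition. A dependency-free sketch follows, with a toy environment standing in for an Orbit task (creating a real ``RLTaskEnv`` needs the simulator, and the class names here are invented for illustration):

```python
class ToyEnv:
    """Minimal stand-in for a gymnasium.Env-style environment."""
    def step(self, action):
        # obs, reward, terminated, truncated, info
        return action, 0.0, False, False, {}

class ScaleObservation:
    """Wrapper that multiplies observations by a constant, forwarding everything else."""
    def __init__(self, env, scale):
        self.env, self.scale = env, scale
    def step(self, action):
        obs, reward, terminated, truncated, info = self.env.step(action)
        return self.scale * obs, reward, terminated, truncated, info
    def __getattr__(self, name):
        # Fall through to the wrapped environment, as gymnasium.Wrapper does.
        return getattr(self.env, name)

# Wrappers compose by repeated application: env = wrapper(env, *args, **kwargs)
env = ToyEnv()
env = ScaleObservation(env, scale=2.0)
env = ScaleObservation(env, scale=5.0)
obs, *_ = env.step(1.0)  # observation passes through both wrappers: 1.0 * 2 * 5
```

Each wrapper only overrides the methods it cares about and delegates the rest, which is why wrappers from different libraries can be stacked freely.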
@@ -68,7 +71,7 @@ To use the wrapper, you need to first install ``ffmpeg``. On Ubuntu, you can ins

The viewport camera used for rendering is the default camera in the scene called ``"/OmniverseKit_Persp"``.
The camera's pose and image resolution can be configured through the
:class:`~envs.ViewerCfg` class.


.. dropdown:: Default parameters of the ViewerCfg class:
@@ -125,7 +128,7 @@ Every learning framework has its own API for interacting with environments. For
`Stable-Baselines3`_ library uses the `gym.Env <https://gymnasium.farama.org/api/env/>`_
interface to interact with environments. However, libraries like `RL-Games`_ or `RSL-RL`_
use their own APIs for interfacing with learning environments. Since there is no one-size-fits-all
solution, we do not base the :class:`~envs.RLTaskEnv` class on any particular learning framework's
environment definition. Instead, we implement wrappers to make it compatible with the learning
framework's environment definition.
