OpenRAVE Tutorial

Note: this tutorial is under construction.

This is a step-by-step "How to Use OpenRAVE" tutorial, intended as a guide to learning from scratch how to program simulations in this open-source simulator. If you already have some knowledge of or experience with this software, we recommend the OpenRAVE Just Use It webpage instead.

What is OpenRAVE?

The Open Robotics Automation Virtual Environment (OpenRAVE) is an open-source, cross-platform software architecture targeted at real-world autonomous robot applications; it provides seamless integration of 3D simulation, visualization, planning, scripting and control.

A plugin architecture allows users to easily write custom controllers or extend functionality. With OpenRAVE plugins, any planning algorithm, robot controller, or sensing subsystem can be distributed and dynamically loaded at run-time, which frees developers from struggling with monolithic code-bases. Users of OpenRAVE can concentrate on the development of planning and scripting aspects of a problem without having to explicitly manage the details of robot kinematics and dynamics, collision detection, world updates, and robot control. Because OpenRAVE is focused on autonomous motion planning and high-level scripting rather than low-level control and message protocols, it can be used in conjunction with other popular robotics packages such as ROS.

OpenRAVE supports a powerful scripting environment based on Python, which makes it simple to control and monitor the demo and the environment state. Interfaces for Octave and Matlab are also supported.
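As a first taste of that scripting environment, here is a minimal Python sketch (assuming a standard openravepy installation and the 'data/lab1.env.xml' scene that ships with OpenRAVE) that loads a scene, attaches the viewer and prints the robot's joint values:

from openravepy import *   # openravepy scripts conventionally use a wildcard import

env = Environment()              # create an empty OpenRAVE environment
env.SetViewer('qtcoin')          # attach the default Qt-Coin3D viewer
env.Load('data/lab1.env.xml')    # a simple scene shipped with OpenRAVE
robot = env.GetRobots()[0]       # get the first robot in the scene
print(robot.GetName())
print(robot.GetDOFValues())      # current joint values of the robot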

OpenRAVE was founded by Rosen Diankov at the Quality of Life Technology Center in the Carnegie Mellon University Robotics Institute. It was inspired by the RAVE simulator that James Kuffner started developing in 1995 and has used for his experiments ever since. The OpenRAVE project was started in 2006 and is a complete rewrite of RAVE. The main goal of OpenRAVE is to create a planning architecture that gives robotics researchers an easy, open-source interface for controlling their robots both in simulation and in the real world, without having to worry about the small details.

Foreword

First of all, this is an unofficial tutorial, written at VisLab to make this tool easier to use.

Second: you can find the official web page of Rosen Diankov's OpenRAVE here. Although you will find better first-hand information on the official web page, if you are a newcomer we recommend starting with this tutorial, which gradually introduces most of the modules you will be using within this simulator.

Third: although OpenRAVE is cross-platform, this tutorial is Linux-oriented, so if you are a Windows user you may not find it that useful.

The rest is about to be LEARNED... Have some fun!!

How to Install OpenRAVE

In this section we describe how to easily install the OpenRAVE VisLab Edition. We call it the VisLab Edition because some plug-ins have been modified for internal purposes and some models have been added to the original version of OpenRAVE. If you want to install the current original version of OpenRAVE, please see the official OpenRAVE site. To make installation easy, we have written an installation script that automates the process. Just download the script, make it executable, and run it:

wget http://mediawiki.isr.ist.utl.pt/images/1/1a/Install_openRAVE.tar.gz
tar -xzvf Install_openRAVE.tar.gz
rm Install_openRAVE.tar.gz
chmod u+x Install_openRAVE.sh
./Install_openRAVE.sh

Once you have finished with the installation, you should restart your PC, and then continue with this tutorial.

Testing OpenRAVE

To see if OpenRAVE was correctly installed, first check that all the plug-ins are present by entering the following in a terminal:

openrave --listplugins

At least 14 plug-ins should be listed. If so, next check that the examples were included in the installation by entering the following in a terminal:

openrave.py --listexamples

These examples are not crucial for the tutorial, but they will be useful later for explaining some aspects of coding and creating OpenRAVE demos.

OpenRAVE Default Viewer

Before running any demo, it is important that you learn how to interact with the Coin3D GUI window, because virtual environments and 3D models are shown in this window and, most of the time, you will find yourself wanting to change the view perspective, among other things.

In the following figure, we show the default OpenRAVE viewer, which is Qt-Coin3D. We have highlighted five principal controls that you will use most frequently.

  1. View menu. In this menu you will find the simulated camera parameters (which generate the actual view of the world), the Geometry submenu, where you can switch between the render and collision geometries of the objects, the frame rate of communication between the viewer and the OpenRAVE core, and the world reference frame (world axes). A small Python sketch for reading and setting the camera pose is given after this list.
  2. Options menu. Here you will find, as you would expect, options for the simulation, such as turning on the physics engine and the collision checkers, recording the simulation, etc.
  3. Arrow icon button. By default, this button is activated when the window is launched. The arrow indicates that your mouse pointer is in the “View Mode”: you can rotate the view, pan and tilt, zoom in and out, etc., but you cannot interact with the objects.
  4. Hand icon button. When you press this button, you switch to the “Interaction Mode”.
  5. Eye icon button. This is a very helpful tool: when you press it, the whole scene is automatically centered in the window. So, if you start with an uncomfortable view, just click the eye and you will see the whole scene.
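Much of what the View menu offers can also be done programmatically. As a rough sketch (assuming the 'data/lab1.env.xml' scene and the qtcoin viewer), the simulated camera pose mentioned in item 1 can be read and modified from Python:

from openravepy import *
import numpy

env = Environment()
env.SetViewer('qtcoin')                      # the Qt-Coin3D viewer described above
env.Load('data/lab1.env.xml')                # any scene will do

viewer = env.GetViewer()
T = viewer.GetCameraTransform()              # current 4x4 camera pose in world coordinates
T[0:3, 3] += numpy.array([0.0, 0.0, 0.5])    # raise the camera half a metre along world Z
viewer.SetCamera(T)                          # apply the new viewpoint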

OK, let's open our first experimental 3D world which, in OpenRAVE, we call an environment. The next instruction will open the Baltazar experimental environment. Try changing the views, try to move the robot and the objects, and finally see what happens if you activate the physics engine with the gravity pointing along -Z. Notice that you can toggle between view mode and interaction mode by pressing [Esc]. To move objects, just drag them; to move joints, click on a link while holding [Ctrl]. Now, to launch the environment, if you have installed the OpenRAVE VisLab Edition, just type the following in a terminal:

openrave Vislab/OR4_Envs/Baltazar_at_Vislab.env.xml

You should see a window similar to the next figure.

In the remainder of this section, we will explore the power of OpenRAVE by running some demos.

OpenRAVE Demos

Let's begin by testing the ODE physics engine. Just type in a terminal:

openrave.py --example testphysics

This is an example of how the physics engine works when activated. You will see that the robot does not remain in a fixed position and orientation. Why is that? Let's run another example to answer this question. In a terminal, just type:

openrave.py --example testphysics_controller

This second demo is the one that actually demonstrates the physics engine together with the motor controller of a PUMA. In the previous demo, no controller was defined, so there was nothing like a force maintaining (or at least trying to maintain) the position and orientation of the robot's links. The physics engine also simulates gravity; because of that, if you do not declare the mass of the bodies in the environment, every time you activate the gravity simulation you will see those objects floating away in the scene.
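The setting these two demos use can be reproduced in a few lines of Python. The following sketch (assuming the ODE plug-in and the 'data/lab1.env.xml' scene; the time step is illustrative) attaches the ODE physics engine, points gravity along -Z and starts the simulation:

from openravepy import *
import numpy

env = Environment()
env.SetViewer('qtcoin')
env.Load('data/lab1.env.xml')                        # any scene with free bodies will do

physics = RaveCreatePhysicsEngine(env, 'ode')        # the ODE physics plug-in
env.SetPhysicsEngine(physics)
physics.SetGravity(numpy.array([0.0, 0.0, -9.81]))   # gravity pointing along -Z
env.StartSimulation(timestep=0.001)                  # step the simulation continuously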

A third example, which demonstrates a robot grasping objects, is the following:

openrave.py --example hanoi

So far we have seen examples of a robot moving and interacting with objects, but OpenRAVE also gives us the possibility of simulating cameras and other sensors. Let's see an example with cameras:

openrave.py --example calibrationviews
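The camera simulation can also be driven from Python. As a sketch (assuming the 'data/testwamcamera.env.xml' scene used by the sensor examples, and that the first sensor in it is a camera), the following powers the camera on and reads one image:

from openravepy import *
import time

env = Environment()
env.SetViewer('qtcoin')                    # camera images are rendered through the viewer
env.Load('data/testwamcamera.env.xml')     # a scene that contains simulated sensors

sensor = env.GetSensors()[0]               # assume the first sensor is the camera
sensor.Configure(Sensor.ConfigureCommand.PowerOn)        # start acquiring data
sensor.Configure(Sensor.ConfigureCommand.RenderDataOn)   # draw the camera view in the GUI
time.sleep(1.0)                            # give the sensor time to produce a frame

data = sensor.GetSensorData()              # for a camera this holds the image
print(data.imagedata.shape)                # height x width x 3 array of pixels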

So, has this whetted your appetite? Yes? Well, now let's learn how to create some simulations!

When experimenting with simulations in OpenRAVE, we mainly go through the following stages:

  1. Geometric Modelling.
  2. Knowledge base generation.
  3. Scripting.
  4. Interfacing Simulations.

These will be discussed in the next sections...

Geometric Modelling

Here is where you will learn how to create your environment (scenario) and how to create the robots and the objects. Please follow the tutorial at the following link: Geometric Modelling in OpenRAVE.
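Besides writing XML files, simple bodies can also be created directly from Python. As a minimal sketch (the box dimensions and the name 'mybox' are arbitrary), this builds a box-shaped kinematic body and adds it to the environment:

from openravepy import *
import numpy

env = Environment()
env.SetViewer('qtcoin')

with env:                                   # lock the environment while editing it
    body = RaveCreateKinBody(env, '')       # create an empty kinematic body
    body.SetName('mybox')
    # one box per row: x, y, z of the centre followed by the half-extents (metres)
    body.InitFromBoxes(numpy.array([[0.0, 0.0, 0.1, 0.2, 0.3, 0.1]]), True)
    env.AddKinBody(body)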


Knowledge Base Generation

...This section is not available yet. When Rosen Diankov created OpenRAVE, he designed its structure so that some modules (plug-ins) automatically generate databases in off-line mode. Some examples of these databases are the inverse kinematics solvers, the grasping sets, the reachability and inverse reachability computations, detectability, and others. More about knowledge base generation will be discussed in another section.
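As an early taste of that off-line generation, the inverse kinematics database can already be produced from Python with the openravepy databases module. A sketch, assuming the 'data/lab1.env.xml' scene (the first run generates and compiles the analytic solver, which can take several minutes):

from openravepy import *

env = Environment()
env.Load('data/lab1.env.xml')            # scene containing a manipulator arm
robot = env.GetRobots()[0]

ikmodel = databases.inversekinematics.InverseKinematicsModel(
    robot, iktype=IkParameterization.Type.Transform6D)
if not ikmodel.load():                   # reuse the cached database if it exists
    ikmodel.autogenerate()               # otherwise generate the IK solver off-line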

Scripting

...This section is not available yet. Of course, if we want to run a particular simulation, we have to code it. This can be done in Python, in Matlab or Octave scripts, or ultimately in C++.
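As a flavour of what such a script looks like in Python, here is a sketch along the lines of the minimal examples in the official documentation (it assumes the 'data/lab1.env.xml' scene; the goal joint values are just an illustrative, reachable configuration for that robot):

from openravepy import *

env = Environment()
env.SetViewer('qtcoin')
env.Load('data/lab1.env.xml')
robot = env.GetRobots()[0]

manipprob = interfaces.BaseManipulation(robot)   # interface to the BaseManipulation plug-in
# plan and execute a collision-free motion of the arm to the goal joint values
manipprob.MoveManipulator(goal=[-0.75, 1.24, -0.064, 2.33, -1.16, -1.548, 1.19])
robot.WaitForController(0)                       # block until the trajectory has finished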


We present some Demos here:

Mobile Robot. In this demo, we have reproduced the ISR 7th floor and mounted the Baltazar robot on a Segway, so that it can translate through the corridors of the environment.


Planning & Manipulation.


Grasping.


Mobile Manipulation.


Human Imitation.

Interfacing Simulations

...This section is not available yet. We also want to try our experiments on real robots, and to do so we can interface our simulation environment with the robots. This is done with ROS.