RAPOSA-NG

From ISRWiki
[[Image:Raposang.JPG|300px|thumb|RAPOSA-NG at the signature of a Protocol between MAI, IST and AIP]]
[[Image:RaposaNG description.png|600px]]


[[Image:RAPOSA LOGO wiki.jpg|300px|thumb|RAPOSA logo, by Filipe Jesus]]
== Description ==


IdMind designed a new version of [[RAPOSA]] for commercial purposes, named RAPOSA-NG (Next Generation). ISR acquired a unit of this new version for research purposes, targeting depth perception, user-friendly visual control, 3D mapping, localization, and augmented reality.

Following the success of RAPOSA, the IdMind company improved it in several ways. Notably, the rigid chassis of RAPOSA, which eventually ended up plastically deformed by frequent shocks, was replaced by a semi-flexible structure, capable of absorbing inelastic shocks while being significantly lighter than the original RAPOSA.


ISR acquired a barebones version of this robot, called RAPOSA-NG, and equipped it with a different set of sensors, following lessons learnt from previous research with RAPOSA. In particular, it is equipped with:
* a stereo camera unit (PointGrey Bumblebee2) on a pan-and-tilt motorized mounting;
* a Laser-Range Finder (LRF) sensor on a tilt-and-roll motorized mounting;
* a pan-tilt-and-zoom (PTZ) IP camera;
* an Inertial Measurement Unit (IMU).
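
As a toy sketch of how a pan-and-tilt stereo mount like the one above can be slaved to an operator's head tracker, the tracked head angles can simply be clamped to the mount's mechanical range and forwarded as servo setpoints. The names and angle limits below are illustrative assumptions, not RAPOSA-NG's actual software:

```python
import math

# Assumed mechanical limits of the pan-and-tilt mount (radians)
PAN_LIMITS = (-math.pi / 2, math.pi / 2)
TILT_LIMITS = (-math.pi / 4, math.pi / 4)

def clamp(value, lo, hi):
    """Keep a commanded angle inside the mount's mechanical range."""
    return max(lo, min(hi, value))

def head_to_mount(yaw, pitch):
    """Map head-tracker yaw/pitch (rad) to pan/tilt servo setpoints."""
    pan = clamp(yaw, *PAN_LIMITS)
    tilt = clamp(pitch, *TILT_LIMITS)
    return pan, tilt
```

In practice the setpoints would also be rate-limited so the servos track fast head motions smoothly.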


== Specifications ==
This equipment was chosen not only to better fit our research interests, but also to target the RoboCup Robot Rescue competition.

The stereo camera is primarily used together with a Head-Mounted Display (HMD) worn by the operator: the stereo images are displayed on the HMD, providing depth perception to the operator, while the attitude of the stereo camera is controlled by the head tracker built into the HMD.

The LRF is used in one of two modes: 2D or 3D mapping. In 2D mapping we assume that the environment is made of vertical walls. However, since we cannot assume horizontal ground, the tilt-and-roll motorized mounting automatically compensates for the robot attitude: an internal IMU measures the attitude of the robot body and drives the mounting servos so that the LRF scanning plane remains horizontal.

The IP camera is used for detailed inspection: its GUI allows the operator to orient the camera towards a target area and zoom in on a small part of the environment. This is particularly relevant for remote inspection tasks in Urban Search and Rescue (USAR). The IMU is used both to provide the remote operator with a reading of the robot attitude, and for automatic localization and mapping of the robot.
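
The tilt-and-roll compensation described above amounts to counter-rotating the mount by the body attitude. The following is an illustrative reconstruction only; the axis conventions and rotation ordering are assumptions, not ISR's actual controller:

```python
import math

def rot_x(a):
    """Rotation matrix about the x (roll) axis."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    """Rotation matrix about the y (tilt/pitch) axis."""
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def level_mount(roll, pitch):
    """Mount angles that cancel the body attitude, assuming the body
    attitude is R = Ry(pitch) Rx(roll) and the mount applies
    Rx(mount_roll) Ry(mount_tilt) on top of it."""
    return -roll, -pitch

# Check: for an arbitrary body attitude, the scan-plane normal ends up
# vertical again after the mount counter-rotation.
roll, pitch = math.radians(20.0), math.radians(-35.0)
body = matmul(rot_y(pitch), rot_x(roll))
m_roll, m_tilt = level_mount(roll, pitch)
mount = matmul(rot_x(m_roll), rot_y(m_tilt))
normal = apply(matmul(body, mount), [0.0, 0.0, 1.0])
# normal is (numerically) [0, 0, 1]: the LRF scanning plane is horizontal
```

With these conventions the cancellation is exact, since the mount rotation is the inverse of the body rotation; with other servo axis orderings the required angles differ, but the idea is the same.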


* Length: 84 cm;
* Width: 47 cm;
* Height: 18 cm;
* Wheel radius: 9.5 cm;
* Weight (without extra hardware equipped): 17 kg;
* Weight (with extra hardware equipped): 22 kg.


Further info can be found in the book chapter [http://link.springer.com/chapter/10.1007/978-3-319-05431-5_12 Two Faces of Human–Robot Interaction: Field and Service Robots] (Rodrigo Ventura), in ''New Trends in Medical and Service Robots'', Mechanisms and Machine Science, Volume 20, pp. 177–192, Springer, 2014.

And check out our [https://www.facebook.com/socrob.rescue Facebook page]!


== Team ==

* Rodrigo Ventura (coordinator)
* Filipe Jesus
* João Mendes
* João O'Neill


== Videos ==

<html><iframe width="560" height="315" src="//www.youtube.com/embed/edXl8UNH-UE" frameborder="0" allowfullscreen></iframe></html>

<html><iframe width="560" height="315" src="//www.youtube.com/embed/XXXmA3iVKL0" frameborder="0" allowfullscreen></iframe></html>


== Components ==

=== Internal Devices ===

From the original design:

* 2 Maxon 150 W DC motors (for the differential drive);
* Firgelli Automation DC motor (to tilt the front body);
* 2 electronic boards:
** Motors board (motor control and encoders);
** Relay board (to manage and redirect power supplies).

Internal computer:
 
* Commell LV-679 Mini-ITX motherboard with:
** Intel Core 2 Duo processor, 1.6 GHz, 4 MB cache, 800 MHz FSB;
** 2x Kingston 1 GB 667 MHz DDR2 RAM;
** Intel PRO/Wireless LAN 2100 3B Mini PCI adapter;
** Kingston SSDNow V Series 64 GB SSD.
 
=== External Devices ===
 
Sensors:
 
* Point Grey Bumblebee2 (FireWire stereo camera system);
* MicroStrain 3DM-GX2 (IMU with triaxial accelerometer, triaxial gyro, triaxial magnetometer, temperature sensors, and an on-board processor).
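
As an illustration of how an IMU of this kind recovers static attitude, roll and pitch can be read off the accelerometer's gravity vector when the robot is at rest. The body-frame convention below (x forward, y left, z up) is an assumption, not the 3DM-GX2's documented convention:

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Static roll/pitch (rad) from accelerometer readings (m/s^2),
    assuming the sensor measures only gravity (robot at rest) and an
    x-forward, y-left, z-up body frame."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# A level robot reads gravity along +z only, giving zero roll and pitch.
```

Yaw cannot be recovered from gravity alone; that is what the magnetometer and gyro are for.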
 
 
=== Power ===
 
* 2x 4S2P Li-Po batteries, 4800 mAh, 14.8 V.
 
 
=== Extras ===
 
* 2x Graupner ULTRAMAT chargers (for Li-Po battery charging);
* 6+ spare 4S2P Li-Po batteries, 4800 mAh, 14.8 V (for replacement);
* Sony VAIO laptop PC with Intel Core i5;
* Logitech Rumblepad 2 gamepad (to control RAPOSA-NG);
* External power supply.
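
Gamepad teleoperation of a differential-drive robot like RAPOSA-NG typically mixes the stick axes into left/right wheel speeds. The following is an illustrative mixer under that assumption, not the actual RAPOSA-NG teleoperation code:

```python
def arcade_to_tank(forward, turn, max_speed=1.0):
    """Map gamepad axes (forward/turn, each in [-1, 1]) to left/right
    wheel speeds for a differential drive."""
    left = forward + turn
    right = forward - turn
    # Normalize so neither wheel command exceeds max_speed
    scale = max(1.0, abs(left), abs(right))
    return max_speed * left / scale, max_speed * right / scale
```

Full forward stick with full turn saturates one wheel and stops the other, producing a pivot around the slower wheel.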
 
 
=== To be installed ===
 
* Microsoft Kinect for Xbox 360 (RGB camera + depth sensor);
* A Pan & Tilt motor.
 
== Power ==
 
[[Image:RAPOSA_Battery.JPG|thumb|Battery used by RAPOSA-NG]]
 
RAPOSA-NG uses two 4S2P Li-Po batteries (4800 mAh, 14.8 V), one for the motors and one for the electronics. Two LEDs near the lock indicate how much energy is left for the electronics (left LED) and for the motors (right LED).
 
It is also possible to use an external power supply for the electronics.
 
 
=== Switch on/off ===
 
 
The motors and the electronics are switched on/off independently.
 
To switch on/off the electronics, place the key into the lock and turn it. 
 
If the external power supply is connected and on, the electronics are immediately on. Remove the external power supply to switch off the electronics.  
 
To switch on/off the motors, simply press the red button at the bottom-left corner of the top panel. If its light is on, the motors are on.
 
To switch the computer on/off, first switch on the electronics, then press and hold the gray button near the lock until the green LED lights up.
 
 
=== Batteries ===
 
Main page: [[RAPOSA-NG Batteries]]
 
 
RAPOSA-NG uses Lithium-ion polymer (Li-Po) batteries of 4800 mAh and 14.8 V.
 
Each battery is composed of eight cells of 2400 mAh and 3.7 V each, in a 4S2P arrangement (two packs of four series-connected cells in parallel): the four series cells add up to 14.8 V, and the two parallel packs add up to 4800 mAh.
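
As a quick sanity check on the series/parallel arithmetic (cell values from the text, names ours): voltages add across series cells, capacities add across parallel packs.

```python
CELL_V = 3.7     # nominal Li-Po cell voltage (V)
CELL_MAH = 2400  # cell capacity (mAh)

def pack_specs(series, parallel):
    """Nominal voltage and capacity of an sSpP Li-Po pack."""
    return series * CELL_V, parallel * CELL_MAH

volts, mah = pack_specs(4, 2)
# 4S2P: 14.8 V and 4800 mAh, matching the pack label
```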

Latest revision as of 15:48, 18 November 2014
