Main characteristics of UWSim


Configurable environment

The 3D scene can be easily configured with third-party modeling software such as Blender, 3D Studio Max, etc. The basic scene can be freely modeled, including materials and textures. The resulting scene can be loaded into the simulator as long as it is exported to any of the formats that OSG (OpenSceneGraph) can read (.ive, .3ds, .wrl, etc.).

In addition to the basic 3D structure, additional elements can be dynamically added, modified and removed from the main program. OSG represents the virtual scenario with a scene graph, where the nodes can be easily accessed and managed. This includes not only geometry nodes, but also cameras, light sources, etc.

Configurable nodes that create the underwater visualization effects are automatically added to the geometry that the user defines. These include the ocean surface, underwater visibility, water color, silt particles, god rays, etc.

The complete scene can be described by the user with an XML file or an xacro file (a macro language for creating XML files). To learn more about this, visit Configuring and creating scenes.
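As a rough sketch of what such a description looks like, the fragment below outlines a scene with an ocean state, a camera, a vehicle and a static object. Element names and file paths here are illustrative only; the authoritative schema is documented on the Configuring and creating scenes page and may differ between UWSim versions.

```xml
<?xml version="1.0"?>
<!-- Illustrative sketch only: element names and paths are assumptions;
     consult "Configuring and creating scenes" for the actual schema. -->
<UWSimScene>
  <oceanState>
    <windx>0.04</windx>
    <windy>0.04</windy>
  </oceanState>
  <camera>
    <freeMotion>1</freeMotion>
    <position><x>-5</x><y>0</y><z>-3</z></position>
  </camera>
  <vehicle>
    <name>girona500</name>
    <file>g500ARM5.urdf</file>            <!-- hypothetical URDF path -->
    <position><x>0</x><y>0</y><z>-2</z></position>
  </vehicle>
  <object>
    <name>terrain</name>
    <file>terrain.ive</file>              <!-- exported from Blender/3DS Max -->
  </object>
</UWSimScene>
```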

Multiple robots support

The software design includes abstract classes that can be specialized to add support for different vehicles and underwater manipulators. The default vehicle object is composed of a 3D model (created by the user in third-party modeling software) that can be positioned in the scene with 6 degrees of freedom (DOF). The robots are described with an XML file following the URDF format, which can include kinematic, dynamic and visual information. As an example, a vehicle + arm configuration is provided by default: the girona500 vehicle with the ARM5E manipulator. Different robots can be loaded and managed simultaneously in the scene.
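To make the URDF part concrete, a minimal robot description of this kind could look like the fragment below. The link and joint names, mesh filenames and joint limits are invented for illustration; only the URDF element structure (robot, link, joint, limit) is standard.

```xml
<?xml version="1.0"?>
<!-- Minimal URDF sketch: names, mesh files and limits are illustrative. -->
<robot name="example_auv">
  <link name="base_link">
    <visual>
      <geometry>
        <mesh filename="vehicle_hull.osg"/>  <!-- user-modeled 3D file -->
      </geometry>
    </visual>
  </link>
  <link name="arm_shoulder">
    <visual>
      <geometry>
        <mesh filename="arm_shoulder.osg"/>
      </geometry>
    </visual>
  </link>
  <!-- A revolute joint connecting the vehicle body to the first arm link -->
  <joint name="shoulder_joint" type="revolute">
    <parent link="base_link"/>
    <child link="arm_shoulder"/>
    <axis xyz="0 0 1"/>
    <limit lower="-1.57" upper="1.57" effort="10" velocity="1.0"/>
  </joint>
</robot>
```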

Simulated sensors

The current version of UWSim (1.4) offers twelve sensors for vehicles, in addition to the default position/velocity sensors available for vehicles and manipulator joints. The default position and velocity sensors for vehicles provide the 6DOF pose (x, y, z, roll, pitch, yaw) in the scene; in the case of manipulators, the position and velocity of each joint are provided. Besides these, the following sensors are available:

  • Camera: Provides virtual images that can be used for developing vision algorithms.
  • Range Camera: Depth image of the camera.
  • Range sensor: Measures the distance to the nearest obstacle along pre-defined directions. More info here.
  • Object picker: Fakes object grasping when the object is closer than a pre-defined distance.
  • Pressure: Provides a pressure measurement.
  • DVL: Estimates the linear speed at which the vehicle is travelling.
  • IMU: Estimates the vehicle orientation with respect to the world frame.
  • GPS: Provides the vehicle position with respect to the world; it only works when the vehicle is near the surface.
  • Multibeam: Simulates an array of range sensors, providing distances to nearest obstacles in a plane at constant angle increments. More info here.
  • Force: Estimates the force and torque applied to a vehicle part. More info here.
  • Structured light projector: Projects a laser or regular light on the scene. More info here.
  • Dredge: Dredges mud from buried objects. More info here.
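As a small example of how such sensor readings are typically used, a pressure measurement can be converted to depth with the standard hydrostatic relation. The function name, units and constants below are assumptions for the sketch, not part of the UWSim API.

```python
# Convert an absolute pressure reading to depth using the hydrostatic
# relation P = P_atm + rho * g * depth. Names and units are illustrative,
# not UWSim API.

RHO_SEAWATER = 1025.0   # kg/m^3, typical seawater density
G = 9.81                # m/s^2
P_ATM = 101325.0        # Pa, atmospheric pressure at the surface

def depth_from_pressure(pressure_pa: float) -> float:
    """Estimate depth (m, positive downwards) from absolute pressure (Pa)."""
    return (pressure_pa - P_ATM) / (RHO_SEAWATER * G)

# Roughly one extra atmosphere of pressure corresponds to ~10 m of depth.
print(round(depth_from_pressure(P_ATM + RHO_SEAWATER * G * 10.0), 3))
```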

Contact physics

Physics simulation is also supported, although in a simple manner at the moment. This is done by integrating the physics engine Bullet with OSG through osgBullet, which allows simulating contacts and forces and automatically updating the scene accordingly. The collision shapes of the different bodies can be automatically generated from the 3D models. In order to have water physics, underwater vehicle dynamics must be used (already available in the underwater simulation stack). This package provides dynamic simulation of underwater vehicles; interaction with the simulated world can be created by applying force sensor readings as external forces on the vehicle [1]. We are currently working on an integrated version of the physics.
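To give an idea of what "underwater vehicle dynamics plus external forces" means, the sketch below integrates a 1-D vertical model with gravity, buoyancy, quadratic drag and an external contact force term. All parameter values, names and the model itself are illustrative assumptions; this is not the model used by the underwater simulation stack.

```python
# 1-D sketch of underwater vertical dynamics: gravity, buoyancy and
# quadratic drag, plus an external force term (as a force sensor might
# report during contact). Illustration only, not the stack's model.

def step(z, w, f_ext, dt=0.01,
         mass=50.0, volume=0.05, rho=1025.0, g=9.81, cd=40.0):
    """Advance depth z (m, down positive) and vertical speed w (m/s)."""
    weight = mass * g
    buoyancy = rho * g * volume          # Archimedes, fully submerged
    drag = -cd * w * abs(w)              # quadratic drag opposing motion
    a = (weight - buoyancy + drag + f_ext) / mass
    return z + w * dt, w + a * dt

# This vehicle is slightly positively buoyant, so with no external force
# it slowly rises while drag bounds its speed.
z, w = 5.0, 0.0
for _ in range(1000):                    # 10 s of simulated time
    z, w = step(z, w, f_ext=0.0)
```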

Network interfaces

All the robot sensors and actuators can be interfaced with external software through the network. For this, we have integrated the simulator with the Robot Operating System (ROS), which provides many facilities for communications and distributed computation. Through the network interfaces, it is possible to access or update any vehicle position or velocity, to move arm joints, or to access simulated sensors such as the images generated by virtual cameras. However, the core software is independent of any middleware, so it is possible to interface the simulator with other middlewares; take ROSInterface.h as an example.
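The middleware decoupling described above can be pictured as an abstract interface that the simulator core talks to, with one concrete implementation per middleware. The Python class and method names below are invented for the sketch; the real counterpart is the C++ ROSInterface.h class.

```python
# Sketch of the middleware-decoupling idea: the core depends only on an
# abstract interface; each middleware (ROS or otherwise) supplies a
# concrete subclass. All names here are invented for illustration.

from abc import ABC, abstractmethod

class SimulatorInterface(ABC):
    """Bridge between the simulator core and an external middleware."""

    @abstractmethod
    def publish_sensor(self, name: str, data) -> None:
        """Push sensor data (e.g. a camera image) to the outside world."""

    @abstractmethod
    def read_command(self, name: str):
        """Fetch the latest actuator command (e.g. a joint velocity)."""

class InMemoryInterface(SimulatorInterface):
    """Trivial stand-in middleware: a pair of dictionaries."""

    def __init__(self):
        self.sensors, self.commands = {}, {}

    def publish_sensor(self, name, data):
        self.sensors[name] = data

    def read_command(self, name):
        return self.commands.get(name, 0.0)

# The core only ever sees SimulatorInterface, so supporting another
# middleware means writing one new subclass.
iface = InMemoryInterface()
iface.publish_sensor("camera1", [[0, 255], [255, 0]])
iface.commands["joint1"] = 0.5
```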

Widgets

Support for customizable widgets is also included. Widgets are small windows that can be placed inside the main display in order to show specific data to the user. An abstract interface for the creation of custom widgets is included; it allows creating specialized classes for displaying useful data depending on the specific application. For instance, the virtual camera view is displayed on a widget whose position and size can be modified by the user during execution. Another widget simply connects to an external camera and displays its image on top of the simulation. This allows, for instance, supervising an underwater mission where live video is available and the environment has been previously modeled.
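The widget abstraction can be sketched as a base class with a render hook and a concrete camera widget, mirroring the description above. The classes and methods below are invented for illustration; UWSim's actual widget interface is a C++ class.

```python
# Sketch of the widget idea: an abstract base with position/size and a
# render hook, plus a camera widget that shows the latest image. Names
# are invented; the real interface lives in UWSim's C++ code.

from abc import ABC, abstractmethod

class Widget(ABC):
    """A small window overlaid on the main display."""

    def __init__(self, x: int, y: int, width: int, height: int):
        self.x, self.y, self.width, self.height = x, y, width, height

    def move(self, x: int, y: int) -> None:
        """Reposition the widget at run time, as the user can in UWSim."""
        self.x, self.y = x, y

    @abstractmethod
    def render(self) -> str:
        """Produce this widget's content for the current frame."""

class CameraWidget(Widget):
    """Displays the latest image from a virtual or external camera."""

    def __init__(self, x, y, width, height, camera_name: str):
        super().__init__(x, y, width, height)
        self.camera_name = camera_name
        self.last_image = None

    def update(self, image) -> None:
        self.last_image = image

    def render(self) -> str:
        status = "no image" if self.last_image is None else "image"
        return f"{self.camera_name}: {status}"

w = CameraWidget(10, 10, 320, 240, "camera1")
w.move(50, 50)                       # user drags the widget
w.update([[0] * 320] * 240)          # a new frame arrives
```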