News

UWSim: Enhanced multibeam

Some time ago we realized the multibeam had some important bugs and features that could be improved, so we decided to fix them and update the multibeam code to achieve a better simulation! Among the bugs fixed are readings higher than the range limit and huge underwater particle noise. As some of you noticed, multibeam sensors were publishing range values higher than the maximum range depending on the angle of the beam; this has been fixed in the latest version. Regarding the huge underwater particle noise, for an unknown reason narrow cameras receive many more underwater particles from osgOcean, so we decided to exclude them from multibeam cameras using an osg mask. If noise is wanted, it can be added after getting the correct reading from the sensor, using whatever noise distribution we prefer.

Furthermore, we realized multibeam readings were incorrect when the field of view was large, becoming noticeable from about 160º. As you may know, the multibeam uses depth cameras to simulate range values, so to solve this we now use more than one camera per multibeam when the field of view is higher than 120º. This has no effect on the output and removes the fisheye distortion the multibeam suffered when the field of view was increased beyond 160º.

Finally, we added debug visibility for the beams, which can be configured in the multibeam block of the configuration files. This debug feature draws a line for each beam, similar to the range sensor, so we can see where the multibeam is looking. As these lines are tagged as Augmented Reality, they will not be visible in virtual cameras. For simplicity and efficiency, debug lines are drawn up to the maximum range; unlike the sonar debug line, they do not show whether a beam is actually colliding with an object.
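
As a rough sketch of what this could look like in the scene file, assuming the multibeam reuses a <visible>-style tag like the range sensor's (the tag name here is an assumption, so check the wiki for the exact syntax):

  <multibeamSensor>
    <name>multibeam</name>
    <relativeTo>part0</relativeTo>
    <!-- position, orientation and angle tags as in the usual multibeam block -->
    <range>10</range>
    <visible>1</visible> <!-- assumed tag: draw one debug line per beam up to max range -->
  </multibeamSensor>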

You can see the enhanced multibeam working in this video.

UWSim: Scene building through xacro.

As some people commented, maintaining XML scenes was quite complicated due to the need to copy and paste large blocks of XML, mostly unedited, from file to file. This caused big consistency problems; for instance, the "same" simulated sensor could end up different in each scene. To solve this we added xacro processing to the build and created a few suggested macros to help with the issue. If you don't know what xacro is, you can check this. This feature runs at build time, so only source-based installations get it automatically, although deb users can build their xacro files manually using "xacro.py". We suggest using four main files for the macro libraries, namely:

  • Common: Basic generic macros for UWSimScenes.
  • DeviceLibrary: intended for sensors and interfaces; some are added as examples.
  • ObjectLibrary: commonly used objects such as the cirs terrain or the blackbox should be available here, so there is no need to remember the 3D mesh file, offsets or physics configuration to add them to a scene.
  • VehicleLibrary: library for vehicles (uses DeviceLibrary to add sensors). This way we can easily maintain different configurations for vehicles and add the whole vehicle to a scene instead of copying it.

The scene xacro file (.xml.xacro) then calls these macros to build the scene. This makes it possible to keep a single library of sensors, objects and vehicles and simply call the corresponding macro in each scene that uses them. An example, "xacroExample.xml.xacro", has also been added: it generates the 424-line "cirs.xml" scene from just 35 lines!
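
For illustration, here is a minimal sketch of what a library macro and its call from a scene file could look like (the file, macro and parameter names are invented for this example; see the wiki for the actual libraries and syntax):

  <!-- sensorLibrary.xacro (hypothetical library file): defines a reusable multibeam macro -->
  <UWSimScene xmlns:xacro="http://www.ros.org/wiki/xacro">
    <xacro:macro name="basic_multibeam" params="name parent">
      <multibeamSensor>
        <name>${name}</name>
        <relativeTo>${parent}</relativeTo>
        <initAngle>-60</initAngle>
        <finalAngle>60</finalAngle>
        <angleIncr>0.1</angleIncr>
        <range>10</range>
        <!-- position and orientation tags omitted for brevity -->
      </multibeamSensor>
    </xacro:macro>
  </UWSimScene>

  <!-- scene.xml.xacro: include the library and call the macro wherever the sensor is needed -->
  <UWSimScene xmlns:xacro="http://www.ros.org/wiki/xacro">
    <xacro:include filename="sensorLibrary.xacro"/>
    <xacro:basic_multibeam name="multibeam" parent="part0"/>
  </UWSimScene>

Processing the .xml.xacro file expands each macro call into the full XML block, so the generated scene is identical to a hand-written one.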
 
Finally, some lines have been added to the CMakeLists so that ".xml.xacro" files present in data/scenes are automatically built on catkin_make. If any of these files changes, it is rebuilt on the next catkin_make, so there is no need to call "rosrun xacro..." by hand. The only drawback is that changes to the libraries still require a manual rebuild.

We have added examples and more detailed information on how to do it in our wiki.
 
I hope this example helps you create your own macro library and put an end to the inconsistency problems!

UWSim: Initial dredging support.

Today we are presenting a new feature recently developed in UWSim: dredging. Every object may now be "buried" in sand to a percentage of its height, configurable via XML. Furthermore, buried objects may be "unearthed" using a dredge tool. Dredge tools are attached to vehicles, configured via XML by setting their position, and create an animation of the dredged particles using osgParticle. As an example we can see it in action unearthing a buried amphora: the dredge tool is placed in the gripper, and we can see the desired path (green) and the performed path (red).

To configure a buried object, only an attribute in the object tag is needed, stating the buried percentage:

  <object buried="0.6">

and the dredge tool may be created using a DredgeTool device inside vehicles, for instance:

  <DredgeTool>
    <name>dredgeGripper</name>
    <target> end_effector</target>
    <offsetp>
      <x>0</x>
      <y>0</y>
      <z>0</z>
    </offsetp>
    <offsetr>
      <x>0</x>
      <y>0</y>
      <z>0</z>
    </offsetr>
  </DredgeTool>

Although this feature already works in the latest git version, some fixes and enhancements are coming. As usual, any bug report or comment can be submitted through the GitHub project.

ROSPointCloudLoader interface available in UWSim.

We recently added a new ROS interface to help users working with point clouds (PCD) in UWSim: the ROSPointCloudLoader. This interface listens for point clouds on a topic and visualizes them in the main 3D window. The code takes all the required parameters, such as the reference frame, from the point cloud itself, so the only parameter is the topic name. Besides this, an attribute can be set to keep previous PCDs instead of deleting the last one when a new one is received. This can be useful in laser stripe reconstruction, to watch the PCD being built. This is an example of how to include it in your scene XML file:

  <ROSPointCloudLoader delLastPCD="false">
    <topic>/laser_stripe/points2</topic>
  </ROSPointCloudLoader>

This example sets up a PCD viewer for laser stripe reconstructions. In the following video you can see it running on both a stereo reconstruction and a laser stripe reconstruction. As can be seen in the video, point clouds are only visible in the main window, so they will not cause problems in virtual cameras.

We hope this feature helps with debugging when working with point clouds!

Gaussian random noise available for cameras.

As a consequence of the previous shader updates, we are happy to announce that random noise can now be added to UWSim's simulated cameras. In the past we ran quite a few tests adding noise to cameras, but doing so required generating a huge amount of random numbers, which decreased the simulator framerate too much to be practical.

This feature has now been implemented in shaders, so only a few random numbers per camera are needed each frame, as seeds for the shader's random noise generation. The noise is additive Gaussian, applied independently to each channel (RGB). A new configuration tag, <std>, has been added to virtual cameras so the standard deviation of the Gaussian distribution added to the camera output can be configured via XML as a normalized value (the default is 0.005).
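
For reference, here is a sketch of where the new tag would go in a virtual camera block (the surrounding tags follow the usual virtualCamera layout; the values are only illustrative):

  <virtualCamera>
    <name>bowtech</name>
    <relativeTo>part0</relativeTo>
    <resw>640</resw>
    <resh>480</resh>
    <position>
      <x>0</x>
      <y>0</y>
      <z>0</z>
    </position>
    <orientation>
      <r>0</r>
      <p>0</p>
      <y>0</y>
    </orientation>
    <std>0.02</std> <!-- standard deviation of the additive Gaussian noise; default 0.005 -->
  </virtualCamera>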

Although this noise may not be very noticeable to the naked eye, it really affects computer vision software. Here we can see a comparison of snapshots for different standard deviations.

Major shader management changes

We have completely updated the shader management of UWSim. Now shaders are created for each camera and main view, instead of for each object. This allows us to easily maintain shaders and use different shader effects on each camera.

The osgOcean shaders are now used directly, so the copies of them that lived in data/shaders have been deleted; we only keep the default_scene shader, our custom object shader with added functionality such as the structured light projector. This shader has been slightly optimized, so a small performance increase may be noticeable in some scenes.

Furthermore, this update also fixed some osgOcean underwater effects implemented through shaders, since their uniforms were not being correctly passed to the virtual camera shaders.

Finally, depending on the texture and object colors, some objects were rendered with a wrong appearance; this should be fixed now. It was particularly noticeable on the ship of the shipwreck scene. See below the before (left) and after (right) images.

Besides this, an ARMask (Augmented Reality Mask) has been added to osgOcean so that every object using this mask is visible only in the main view. The mask has been applied to frames, frame labels, multibeam and range sensor debug lines, trails, etc. This means that virtual cameras now see only the raw scene, without added information.

As this update required "major" modifications, bugs may appear, so please report any issue in the git repository as usual.

UWSim updates: TF publisher, trajectory trails, bullet 2.82

UWSim trunk has received some updates that are worth describing briefly. The TF publisher offers a simple interface to publish, via ROS TF, the frames of every object, vehicle, vehicle part and sensor present in the scene, in a tree structure rooted at the virtual world. Publishing objects is optional and the rootName can be configured; vehicle parts use a prefix with the vehicle's name to avoid collisions between TF identifiers. To add it, the following lines should be placed in the rosInterfaces block of the XML config file.

  <WorldToROSTF>
    <rootName> world </rootName>
    <enableObjects> 1 </enableObjects>
    <rate>10</rate>
  </WorldToROSTF>

Trajectory trails have been completely re-implemented so they can represent the movement of objects as well as of vehicle parts. Additionally, trajectory trails can be hidden or displayed by pressing the "t" key. Instead of adding a config line inside the vehicle XML block, trajectory trails are now configured using showTrajectory blocks inside simParams, as the following example shows:

  <showTrajectory>
    <target>girona500</target>
    <color>
      <r>1.0</r>
      <g>0.0</g>
      <b>0.0</b>
    </color>
    <lineStyle> 1 </lineStyle>
    <timeWindow> 30 </timeWindow>
  </showTrajectory>

Only the target field is required to add a trajectory trail; it sets the object that will draw a trajectory while moving. The other tags let us choose the color, the lineStyle (where 1 is a continuous line and 2, 3 and 4 are different kinds of dashed lines) and the timeWindow, an option to limit the trail to the last X seconds.

The last (but not least!) update is the switch of the Bullet physics solver to Dantzig and the upgrade to Bullet 2.82. These changes lead to better contact physics, as shown in our tests, and thus to better manipulation simulation.

 

Force Sensor

Force sensors are now available in UWSim. A force sensor measures the external forces acting on a part of a vehicle. It can be added to any visual part of a vehicle and will measure, in Newtons, the collision forces that part is experiencing.

This sensor uses the Bullet physics engine to create two identical collision shapes: one that collides with external objects and a "ghost" one that traverses them. At the end of each physics iteration, measuring the difference between these two collision shapes allows us to calculate the force suffered by the object. Physical properties of the vehicle part, such as its weight, should be correctly provided in the URDF to obtain realistic results. For instance, this sensor has been used to measure collision forces on the girona500 vehicle and send this information to the underwater vehicle dynamics module, which uses it as an input for external forces; as a result the vehicle can no longer traverse objects, as we can see in the embedded video. The XML code needed to add one is:

  <ForceSensor>
    <name>ForceG500</name>
    <target>base_link</target>
    <offsetp>
      <x>-0.2</x>
      <y>0.75</y>
      <z>0</z>
    </offsetp>
    <offsetr>
      <x>-1.57</x>
      <y>0</y>
      <z>3.14</z>
    </offsetr>
  </ForceSensor>

The example shows that only a target and an offset from the part's center are needed to add a force sensor. The measured force can be published in ROS by adding the following lines inside the rosInterfaces block of your XML:

  <ForceSensorROS>
    <name>ForceG500</name>
    <topic>g500/ForceSensor</topic>
    <rate>100</rate>
  </ForceSensorROS>

Structured Light Projector

The UWSim structured light projector displays user-configured textures over the scene along the -Z axis of its local frame (like an osg camera). This projector is able to simulate arbitrary lasers and vehicle lights casting realistic shadows. Here we can see an example of use:

  <structuredLightProjector>
    <name>laser_projector</name>
    <relativeTo>sls</relativeTo>
    <fov>21</fov>
    <image_name>data/textures/laser_lines.png</image_name>
    <laser>1</laser>
    <position>
      <x>1</x>
      <y>0</y>
      <z>0</z>
    </position>
    <orientation>
      <r>0</r>
      <p>3.14</p>
      <y>1.57</y>
    </orientation>
  </structuredLightProjector>

As usual, the previous XML code adds a structured light projector at the given position and orientation with respect to sls. The fov tag configures the field of view of the projector (also known as the angle of view). The image_name tag provides the image to project; RGBA images are allowed, but the alpha channel is used as a binary value to decide whether or not to project the texture on each pixel. Finally, the laser tag decides whether the projection should be treated as a laser; the main differences are that laser light does not attenuate with distance and does not mix with the objects' color.


Multibeam sensor

This new sensor, available for UWSim vehicles, measures multiple distances to the nearest obstacles along the -Z axis of its local frame (like an osg camera), varying the rotation around X from an initial angle to a final angle in configured angle increments. This sensor is useful to create many virtual range sensors at once in an efficient manner, as it uses the z-buffer for its internal calculations.

As underwater floating particles were causing huge noise in this kind of sensor, multibeam sensors are not disturbed by them unless the "underwaterParticles" attribute is set to true in the XML multibeam configuration. This is an example of how to include it in your scene XML file:

  <multibeamSensor underwaterParticles="false">
    <name>multibeam</name>
    <relativeTo>part0</relativeTo>
    <position>
      <x>-0.2</x>
      <y> 0.1 </y>
      <z> 0 </z>
    </position>
    <orientation>
      <r>3.14</r>
      <p>0</p>
      <y>3.14 </y>
    </orientation>
    <initAngle>-60</initAngle>
    <finalAngle>60</finalAngle>
    <angleIncr>0.1</angleIncr>
    <range>10</range>
  </multibeamSensor>

The example above adds a multibeam sensor at the given position and orientation with respect to part0, with a maximum range of 10 m, measuring from -60º to 60º in increments of 0.1º. The "underwaterParticles" attribute turns floating particle noise on or off and is set to false by default. The measured distances can be published in ROS by adding the following lines inside the rosInterfaces block of your XML:

  <multibeamSensorToLaserScan>
    <name>multibeam</name>
    <topic>g500/multibeam</topic>
  </multibeamSensorToLaserScan>
