News


Posted by Javier Pérez

Today we are happy to announce that UWSim version 1.4 is available in deb packages (and on GitHub) for the ROS Indigo and Jade distributions. We have already announced most of the new features of this version as they came, but here is a list in case you missed something:

Besides these main updates, many bug fixes and small enhancements have been added to the simulator. Some that may be useful are:

  • Improved trajectories: added the option to show only the last X seconds, or to delete a trajectory by pressing a key.
  • Light rate: new parameter to increase or reduce the scene illumination.
  • ARMask: augmented reality elements (trajectory trails, frames, ...) are no longer visible in cameras and sensors.

Furthermore, we have updated and extended the information available in our wiki with new first-steps tutorials showing the new (and old) features. If you think a specific topic needs a tutorial or more information, let us know.

 

Posted by Javier Pérez

Although we already had some old work in this direction (thanks Mario!), we couldn't merge it into the main branch until the last few weeks. Today we are presenting interactive marker support in UWSim. To do so, we have updated the visualization_osg package: it is now available for ROS Indigo and Jade, and will be required by UWSim v1.4. This package allows creating markers and interactive markers in OpenSceneGraph. Although some things are still missing, such as menus, the main features are working and available in the GitHub code and Debian packages.
 
Using this package we have added interactive marker capabilities to UWSim, which means you can create and move your own markers through the scene and receive feedback on the server side. Furthermore, we have created a SpawnMarker service which allows creating, modifying and deleting geometry through ROS service calls. As usual, we added two interface examples to show how to use this new feature and provide basic code.
 
The first example, followMarker, creates an interactive marker that a vehicle follows by copying its pose; for this reason the example requires the vehicle's dataNavigator topic. The second example uses the SpawnMarker service to create, modify or delete a mesh object at the desired position. More information about how to use these features is available in the UWSim wiki.
 
We are sure this update will improve the usability of UWSim and can be used to create much more interesting examples (scene configuration, maybe?). We encourage you to share your ideas in the discussion group and contribute. Finally, here is a short video of the interactive markers in action.

 

Posted by Javier Pérez

Some time ago we realized the multibeam sensor had some important bugs and features that could be improved, so we decided to fix them and update the multibeam code to achieve a better simulation. Among the bugs fixed are publishing values higher than the range limit and huge underwater particle noise. As some of you noticed, multibeam sensors were publishing range values higher than the maximum range, depending on the angle of the beam; this has been fixed in the latest version. As for the underwater particle noise: for an unknown reason, narrow cameras pick up far more underwater particles from osgOcean, so we decided to exclude them from multibeam cameras using an OSG mask. If noise is desired, it can be added after getting the clean reading from the sensor, using any noise distribution we want.
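If you do want noisy readings, a simple post-processing step on the published ranges is enough. Here is a minimal sketch (not part of UWSim itself; the function name and defaults are our own) that adds zero-mean Gaussian noise while still respecting the range limit:

```python
import numpy as np

def add_range_noise(ranges, max_range, std=0.01, seed=None):
    """Add zero-mean Gaussian noise to clean multibeam range readings.

    Readings at (or beyond) max_range are left at the limit, mirroring
    the idea that out-of-range beams report the range limit exactly.
    """
    rng = np.random.default_rng(seed)
    ranges = np.asarray(ranges, dtype=float)
    noisy = ranges + rng.normal(0.0, std, size=ranges.shape)
    # Never publish values outside the sensor's valid range.
    noisy = np.clip(noisy, 0.0, max_range)
    noisy[ranges >= max_range] = max_range
    return noisy
```

Any other distribution (uniform, speckle, ...) can be dropped in the same way, since the simulator now delivers clean values.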

Furthermore, we realized multibeam readings were incorrect when the field of view was large, becoming noticeable from about 160º. As you may know, the multibeam uses depth cameras to simulate range values, so to solve this we now use more than one camera per multibeam when the field of view exceeds 120º. This has no effect on the output and removes the fisheye deformation the multibeam suffered when the field of view grew beyond 160º.
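The idea of capping each depth camera at 120º can be sketched in a couple of lines (this only illustrates the reasoning; the actual split UWSim performs internally may differ):

```python
import math

def split_fov(total_fov_deg, max_cam_fov=120.0):
    """Split a wide multibeam field of view into narrower depth cameras.

    Returns (number_of_cameras, per_camera_fov_deg). Each camera stays
    at or below max_cam_fov, avoiding the fisheye distortion of very
    wide depth cameras.
    """
    n = max(1, math.ceil(total_fov_deg / max_cam_fov))
    return n, total_fov_deg / n
```

For example, a 160º multibeam would use two 80º cameras, while anything at 120º or below keeps a single camera.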

Finally, we added debug visualization for the beams, which can be configured through the multibeam block of the configuration files. This debug feature draws a line for each beam, similar to the range sensor, so we can see where the multibeam is looking. As these lines are tagged as augmented reality, they will not be visible in virtual cameras. For simplicity and efficiency, debug lines are drawn up to the maximum range; unlike the sonar debug line, they will not show whether a beam is colliding with an object.

You can see the multibeam working in this video.

Posted by Javier Pérez

As some people were commenting, maintaining XML scenes was quite complicated due to the need to copy and paste large blocks of XML, mostly unedited, from file to file. This caused serious consistency problems; for instance, the "same" simulated sensor could end up different in each scene. To solve this we added xacro processing to the build step, and a few suggested macros have been created to help with this issue. If you don't know what xacro is, you can check this. This feature runs at build time, so only source-based installations get it automatically, although deb users can build their xacro files manually using "xacro.py". We suggest using four main files for macro libraries, namely:

  • Common: basic generic macros for UWSim scenes.
  • DeviceLibrary: intended for sensors and interfaces, with some added as examples.
  • ObjectLibrary: commonly used objects such as the CIRS terrain or the black box should be available here, so there is no need to remember 3D mesh files, offsets or physics configuration to add them to a scene.
  • VehicleLibrary: library for vehicles (uses DeviceLibrary to add sensors). This way we can easily maintain different vehicle configurations and add the whole vehicle to a scene instead of copying it.

Then, the scene xacro file (.xml.xacro) calls these macros to build the scene. This allows having a single library for sensors, objects and vehicles and just calling the macro in each scene where it is used. An example has also been added, "xacroExample.xml.xacro", which recreates the 424-line "cirs.xml" scene in just 35 lines!
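As an illustrative sketch of the idea (the macro and file names below are hypothetical, not the ones shipped with UWSim), a scene xacro file could look like this:

```xml
<?xml version="1.0"?>
<!-- myScene.xml.xacro: macro and file names are hypothetical examples -->
<UWSimScene xmlns:xacro="http://www.ros.org/wiki/xacro">
  <xacro:include filename="vehicleLibrary.xacro"/>
  <xacro:include filename="objectLibrary.xacro"/>
  <!-- One macro call replaces a long copied <vehicle> block -->
  <xacro:g500ARM5 name="girona500" position="0 0 2"/>
  <!-- Commonly used objects come from the object library -->
  <xacro:blackbox position="1 0 4.5"/>
</UWSimScene>
```

Deb users without the automatic build can expand such a file manually, e.g. with `rosrun xacro xacro.py myScene.xml.xacro > myScene.xml`.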
 
Finally, some lines have been added to CMakeLists so that ".xml.xacro" files present in data/scenes are automatically built on catkin_make. Also, if any of these files changes, it is rebuilt on the next catkin_make, so there is no need to call "rosrun xacro ...". The only drawback is that changes to the libraries require a manual rebuild.

We have added examples and more detailed information on how to do it in our wiki.
 
I hope this example helps you create your own macro library and put an end to the inconsistency problems!

Posted by Javier Pérez

Today we are presenting a new feature recently developed in UWSim: dredging. Now every object may be "buried" in sand up to a percentage of its height, configurable via XML. Furthermore, buried objects may be "unearthed" using a dredge tool. These dredge tools are created on vehicles, configured via XML by setting their position, and create an animation of the dredged particles using osgParticle. As an example, we can see it in action unearthing a buried amphora: the dredge tool is placed on the gripper, and we can see the desired path (green) and the performed path (red).

To configure a buried object, just one attribute on the object is needed, stating the buried percentage:

  <object buried="0.6">

and the dredge tool may be created using a DredgeTool device inside vehicles, for instance:

  <DredgeTool>
    <name>dredgeGripper</name>
    <target>end_effector</target>
    <offsetp>
      <x>0</x>
      <y>0</y>
      <z>0</z>
    </offsetp>
    <offsetr>
      <x>0</x>
      <y>0</y>
      <z>0</z>
    </offsetr>
  </DredgeTool>

Although this feature is already working in the latest git version, some fixes and enhancements are coming. As usual, any bug report or comment can be submitted through the GitHub project.

Posted by Javier Pérez

We recently added a new ROS interface to help users work with point clouds (PCD) in UWSim: the ROSPointCloudLoader. This interface listens for point clouds on a topic and visualizes them in the 3D main window. The code takes all the required parameters, such as the reference frame, from the point cloud itself, so the only mandatory parameter is the topic name. Besides this parameter, an attribute can be configured to disable deleting the last PCD when a new one is received. This can be useful in laser stripe reconstruction, to watch the PCD being built step by step. This is an example of how to include it in your scene XML file:

  <ROSPointCloudLoader delLastPCD="false">
    <topic>/laser_stripe/points2</topic>
  </ROSPointCloudLoader>

This example shows a PCD viewer for laser stripe reconstructions. In the following video you can see it running on both a stereo reconstruction and a laser stripe reconstruction. As can be seen in the video, point clouds are only visible in the main window, so they will not cause problems in virtual cameras.

Hope this feature helps with debugging when working with point clouds!

Posted by Javier Pérez

As a consequence of the previous shader updates, we are happy to announce that random noise can now be added to UWSim's simulated cameras. In the past we ran quite a few tests adding noise to cameras, but it required generating a huge amount of random numbers, which reduced the simulator framerate too much to be usable.

This feature has now been implemented in shaders, so only a few random numbers per camera are required each frame, as seeds for the shader's random noise generation. The noise follows an additive Gaussian distribution for each channel (RGB). A new configuration tag, <std>, has been added to VirtualCameras, so the standard deviation of the Gaussian distribution added to the camera output can be configured via XML as a fraction of the intensity range (the default value is 0.005).
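For instance, the noise level could be set like this inside a camera block (the tags around <std> are illustrative; check the wiki for the exact virtualCamera configuration of your version):

```xml
<virtualCamera>
  <name>bowCamera</name>
  <relativeTo>base_link</relativeTo>
  <resw>640</resw>
  <resh>480</resh>
  <!-- Standard deviation of the additive Gaussian noise,
       as a fraction of the intensity range (default 0.005) -->
  <std>0.02</std>
</virtualCamera>
```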

Although this value may not be very noticeable to the naked eye, it really affects computer vision software. Here we can find a comparison of snapshots at different standard deviations.
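To get an intuition for what the shader does, here is a CPU-side sketch in NumPy (UWSim does this on the GPU; this is only an approximation of the effect, with our own function name):

```python
import numpy as np

def add_camera_noise(img, std=0.005, seed=None):
    """CPU sketch of the shader's per-channel additive Gaussian noise.

    `img` holds RGB values in [0, 1]; `std` plays the role of the <std>
    tag, expressed as a fraction of the intensity range.
    """
    rng = np.random.default_rng(seed)
    noisy = img + rng.normal(0.0, std, size=img.shape)
    # Keep the result inside the valid intensity range.
    return np.clip(noisy, 0.0, 1.0)
```

Even at the default std of 0.005 the perturbation is enough to disturb feature detectors and matchers, which is exactly why it is useful for testing vision pipelines.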

Posted by Javier Pérez

We have completely updated the shader management of UWSim. Now shaders are created for each camera and main view, instead of for each object. This allows us to easily maintain shaders and use different shader effects on each camera.

Now the osgOcean shaders are used directly, so the copied versions of them in data/shaders have been deleted, keeping only the default_scene shader, which is our custom object shader with added functionality such as the structured light projector. This shader has been slightly optimized, so a small performance increase may be noticeable in some scenes.

Furthermore, this update also fixed some osgOcean underwater effects done through shaders, as uniforms were not being correctly passed to the virtual camera shaders.

Finally, depending on the texture and object colors, some objects showed a wrong appearance; this should be fixed now. It was particularly noticeable on the ship of the shipwreck scene. See below the before (left) and after (right) images.

Besides this, an ARMask (augmented reality mask) has been added to osgOcean in such a way that every object using this mask is visible only in the main view. This mask has been applied to frames, frame labels, multibeam and range sensor debug lines, trails, etc. This means that virtual cameras now see only the raw scene, without added information.

As this update required "major" modifications, bugs may appear, so please report any issue in the git repository as usual.

MERBOTS meeting

30 Apr 2015

This MERBOTS meeting was held at CIRS (UdG) on 28-30/04/2015, during the IFAC Workshop on Navigation, Guidance and Control of Underwater Vehicles (NGCUV'2015). Some of the attendees were (left to right): M. Carreras (UdG), P.J. Sanz (IP, UJI), P. Ridao (IP, UdG), G. Oliver (IP, UIB), L. Magí (UdG).

Summer School on Mobile Manipulators
UJI, July 13-14, 2015

The Interactive and Robotics Systems Laboratory (IRS-Lab) is organizing an international Summer School on Mobile Manipulators, funded by the Spanish Ministry, that will be held on July 13-14, 2015 in Castellón, Spain, at UJI (Universitat Jaume I). It is worth highlighting the unusual focus, in a single event, on so many different dimensions of mobile manipulators (ground, aerial and underwater), and the potential synergies among them.

All the info is available at: http://www.irs.uji.es/summerschool2015 
