Introducing the Modular OpenRobots Simulation Engine

Welcome to the official documentation for the MORSE project.

Quick start

For any questions about the use of MORSE, or if you have any issue with the simulator, you can send an email to the morse-users@laas.fr mailing list. You can subscribe to the mailing list here.

You can report bugs to our bug-tracker.

What is MORSE?

  • A versatile simulator for generic mobile robots (single or multi-robot simulation),
  • Realistic and dynamic environments, with other interacting agents (humans) or objects,
  • Don’t reinvent the wheel: critical components are reused from other open-source projects (Blender for 3D rendering and the UI, Bullet for physics simulation, dedicated robotic middlewares for communications and robot hardware support),
  • Seamless workflow: since the simulator relies on Blender for both modeling and the real-time 3D engine, creating and modifying a simulated scene is straightforward,
  • Entirely scriptable in Python (see the example script in the MORSE Workflow section below),
  • Adaptable to various levels of simulation realism (for instance, we may want to simulate exteroceptive sensors such as cameras in some cases, and directly access a higher-level representation of the world, such as labeled artifacts, in others),
  • Currently compatible with the YARP and LAAS OpenRobots robotics frameworks,
  • Fully open source, BSD-compatible.

MORSE is partially funded by the Fondation RTRA within the ROSACE project framework.


The MORSE Workflow

How to build a complete simulation scenario: from creating a custom robot with predefined sensors and actuators to assembling the full scene, including other robots or humans.
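
To give a concrete feel for this workflow, here is a minimal scene script. It assumes the Python Builder API (morse.builder) with its predefined ATRV, Pose and MotionVW components; exact class and method names may differ between releases.

  from morse.builder import *

  # A predefined robot platform from the components library
  robot = ATRV()

  # Attach a pose sensor, slightly above the robot's origin
  pose = Pose()
  pose.translate(z=0.75)
  robot.append(pose)

  # Attach a speed (v, omega) motion controller
  motion = MotionVW()
  robot.append(motion)

  # Export both components through the simple socket protocol
  pose.add_stream('socket')
  motion.add_stream('socket')

  # Pick one of the environments bundled with MORSE
  env = Environment('indoors-1/indoor-1')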

Components library


MORSE offers an extensive set of predefined sensors and controllers that cover common simulation needs in robotics reasonably well. It also offers several complete robots.

The following page lists all currently available components and their properties:

MORSE also provides a mechanism to alter input or output data (such as adding noise to a GPS position) through so-called modifiers:
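
For instance, here is a sketch of adding noise to a GPS sensor from a Builder script; the alter() call and its 'Noise' argument are assumptions, so check the modifiers documentation for the exact names and parameters.

  from morse.builder import *

  robot = ATRV()

  gps = GPS()
  robot.append(gps)

  # Attach the 'Noise' modifier so that the exported GPS position
  # is perturbed (method and modifier names are assumptions)
  gps.alter('Noise')

  gps.add_stream('socket')
  env = Environment('indoors-1/indoor-1')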

To learn how to add new components (sensors, robots...), please refer to the developer documentation.

Supported middlewares

MORSE relies on middlewares to integrate into your robotic architecture.

We currently support YARP, pocolibs and a simple text-based socket protocol. More middlewares are expected in future versions (partial ROS support is already available in the development trunk).

Detailed information:
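
As an illustration of the text-based socket protocol, here is a minimal Python client. It assumes that MORSE streams one newline-terminated JSON record per simulation step, and that the sensor stream is served on port 60000; the actual port assignment depends on your scene, so check the middleware documentation.

  import json
  import socket

  # The port is an assumption: MORSE assigns one port per data
  # stream; check the simulator output for the actual number.
  HOST, PORT = "localhost", 60000

  with socket.create_connection((HOST, PORT)) as conn:
      stream = conn.makefile()
      # Read and print ten records from the sensor stream
      for _ in range(10):
          print(json.loads(stream.readline()))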

Tutorials

Intermediate

These tutorials provide more in-depth explanations of how to set up simulations with specific requirements.

Contributing to MORSE

As an open-source project driven by the research community, your contributions are very welcome!

Check the Developers documentation.

Media

Screenshots

_images/outdoor_example.jpg

An ATRV in an outdoor scenario.

_images/indoors_sick.jpg

Real-time simulation of a SICK laser range finder in an indoor environment.

_images/hri.jpg

Simulation of human-robot interaction: the robot tracks the posture of the human.

_images/morse_interface.jpg

The MORSE interface (crude Blender :-))

Videos are also available on the Blender for Robotics Vimeo group.

On the road-map

The first release of MORSE contains only a subset of the final simulator specification.

Amongst the planned features for future MORSE releases:

  • Full compatibility with the ROS robotics framework (other robotics frameworks are planned as well; let us know if you want to contribute in this area),
  • Support for point cloud sensors (stereo vision, Velodyne, Kinect, ...),
  • Complete support for Willow Garage’s PR-2 robot, along with all its sensors,
  • Development of the user interface,
  • Scalability (both in terms of simulation capacity and ease of deployment),
  • Multi-node simulations (several Blender nodes can be started on several computers and automatically synchronized, which should allow simulations of tens of robots in the same scene),
  • A dedicated supervision node to observe the simulation, display logs and metrics, start/stop robots, and dynamically alter the scene (e.g. moving an obstacle in front of a robot).