Human-Robot interaction tutorial

This tutorial shows how to build a simulation with a robot and a human that is interactively controlled by the simulation user.

In this simple scenario, the robot is ordered to follow the human.

Pre-requisites

You should be comfortable with the Builder API, which we use to create the scene, and with the pymorse Python bindings, which we use to interact with the running simulation.

Initial scene

We will use the Builder API to create our scene.

Create a new scene hri.py and add these lines:

from morse.builder import *

# Import the human model.
human = Human()

# Use the standard environment 'sandbox.blend'. Any other environment would work as well.
env = Environment('sandbox')

Launch MORSE with this script (morse run hri.py). You can move the human using the arrow keys.

Note

If you are running MORSE on a Unix machine, you can make the first line of your script: #! /usr/bin/env morseexec.

You can then make your script executable with chmod +x hri.py. You can now quickly start your simulation simply by running ./hri.py.
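For example, the top of hri.py would then look like this (the rest of the script is unchanged):

#! /usr/bin/env morseexec

from morse.builder import *

# Import the human model.
human = Human()

# Use the standard environment 'sandbox.blend'.
env = Environment('sandbox')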

Where is my human?

Exporting the position

As a first step, we would like to export the position of the human in the world. To do so, we need the Pose sensor.

Appending a pose sensor to the human is easy:

from morse.builder import *

human = Human()

# Import the pose sensor and attach it to the human.
pose = Pose()
human.append(pose)

# [...]

In this tutorial, we will use sockets to stream the pose out of MORSE:

from morse.builder import *

human = Human()

pose = Pose()
human.append(pose)

# Set the pose sensor to use the socket interface to communicate
# with modules outside of MORSE.
pose.add_stream('socket')

# [...]

You can now re-run the simulation, as usual. The human pose is now exported.

Reading the position outside of MORSE

We can retrieve the human’s pose from a regular Python script, using the pymorse bindings:

from pymorse import Morse

def printer(data):
    print("Pose=" + str(data))

with Morse() as morse:

    # The pose sensor is available as 'morse.human.pose' because
    # the human is named 'human' and the pose sensor 'pose' in our
    # Builder script
    morse.human.pose.subscribe(printer)

    # Listen to pose updates for 10 sec
    morse.sleep(10)

You can run this script from any terminal, on the same machine as MORSE (or on a remote one, just replace Morse() with Morse(<hostname or ip>)).
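For instance, a remote variant of the script above could look like the sketch below; the address is only a placeholder for the machine actually running MORSE:

from pymorse import Morse

def printer(data):
    print("Pose=" + str(data))

# '192.168.0.10' is a placeholder: put the hostname or IP of the
# machine running the MORSE simulation here.
with Morse("192.168.0.10") as morse:

    morse.human.pose.subscribe(printer)

    # Listen to pose updates for 10 sec
    morse.sleep(10)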

In both cases, the script prints the human avatar’s pose on the terminal for 10 seconds. Try to move the human with the keyboard within the simulator. The output should look something like this:

Pose={'x': 0.16082972288131714, 'y': 0.00014015310443937778, 'z': 0.047640468925237656, 'pitch': -2.1290716745170357e-08, 'roll': 1.0065883238041806e-08, 'timestamp': 1444319642.4115114, 'yaw': 0.0001225958694703877}
Pose={'x': 0.16082972288131714, 'y': 0.00014015310443937778, 'z': 0.047640468925237656, 'pitch': -2.1494560797918894e-08, 'roll': 1.0039565623287672e-08, 'timestamp': 1444319642.4276326, 'yaw': 0.0001225958694703877}
Pose={'x': 0.16082972288131714, 'y': 0.00014015310443937778, 'z': 0.047640468925237656, 'pitch': -2.1901566782389637e-08, 'roll': 1.0047403797841525e-08, 'timestamp': 1444319642.444707, 'yaw': 0.0001225958694703877}
Pose={'x': 0.16082972288131714, 'y': 0.00014015310443937778, 'z': 0.047640468925237656, 'pitch': -1.7940088525847386e-08, 'roll': 1.0114515447412487e-08, 'timestamp': 1444319642.4619052, 'yaw': 0.0001225958694703877}
...
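The data handed to the callback is a plain dictionary with the keys shown above, so you can also pick out individual fields. Here is a variant of the printer callback (a sketch, using the same keys):

def printer(data):
    # Each update carries 'x', 'y', 'z', 'roll', 'pitch', 'yaw' and 'timestamp'.
    print("The human is at (%.2f, %.2f), heading %.2f rad" %
          (data['x'], data['y'], data['yaw']))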

Moving the human

As noted earlier, you can move the human avatar with the arrow keys. However, it is also useful to program the motion of the simulated human. Indeed, like any other robot in MORSE, the human avatar can be externally controlled (for instance, to move along a predefined path).

Getting the human to follow a path

To get the human to follow a path, we first need to add a waypoint actuator, as we did for the pose sensor:

from morse.builder import *

human = Human()

pose = Pose()
human.append(pose)
pose.add_stream('socket')

motion = Waypoint()
motion.properties(ControlType="Position")
human.append(motion)
motion.add_stream('socket')

env = Environment('sandbox')

You can now re-run the simulation. With the updated pymorse script below, you can send the human a new waypoint every time you press Enter.

from pymorse import Morse

with Morse() as morse:

    pose = morse.human.pose
    motion = morse.human.motion

    x = 2

    while True:
        input("The human is currently at: %s. Press Enter..." % pose.get())

        x = -x
        y = 0

        print("Moving to %s..." % ([x, y],))
        motion.publish({'x': x, 'y': y, 'z': 0, 'tolerance': 0.3, 'speed': 1})
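As mentioned earlier, the same mechanism lets you move the human along a predefined path without any keyboard input. The sketch below (the waypoint coordinates and polling interval are arbitrary examples) publishes each waypoint in turn, then polls the pose sensor until the human is within the requested tolerance before sending the next one:

import math

from pymorse import Morse

# An arbitrary example path: a list of (x, y) waypoints.
path = [(2, 0), (2, 2), (-2, 2), (-2, 0)]

with Morse() as morse:

    pose = morse.human.pose
    motion = morse.human.motion

    for x, y in path:
        print("Heading to (%s, %s)..." % (x, y))
        motion.publish({'x': x, 'y': y, 'z': 0, 'tolerance': 0.3, 'speed': 1})

        # Poll the pose sensor until the human is close enough to the waypoint.
        while True:
            p = pose.get()
            if math.hypot(p['x'] - x, p['y'] - y) < 0.3:
                break
            morse.sleep(0.5)

The distance threshold in the loop is kept equal to the tolerance passed to the actuator, so the script moves on roughly when the waypoint is reached.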

Note

The remainder of this tutorial has not yet been updated for the new human avatar.

When you move the mouse, you displace the yellow IK target attached to the head; this lets you control the direction the human is looking.

Picking objects

Our human can pick up and drop objects. Let’s add a new object (a cornflakes box, from the kitchen objects library) on one of the tables. Exit the simulation (Esc), and re-open your script.

Add the following lines:

from morse.builder import *

human = Human()

# Import, configure and place a static object from 'kitchen_objects.blend'.
cornflakes = PassiveObject("props/kitchen_objects", "Cornflakes")
cornflakes.setgraspable()
cornflakes.properties(Label = "My cornflakes")
cornflakes.translate(-7, 3, 1.1)

env = Environment('indoors-1/indoor-1')

You can learn more about passive objects in the passive objects documentation.

Restart the simulation (morse run hri.py or ./hri.py), and press the X key to switch to manipulation mode. You can control the hand with the mouse while holding the Middle Mouse Button. Press the Left Mouse Button with the crosshairs over an object to pick it up, and press the Right Mouse Button to drop the object.

Check the human component documentation for more details on what can be done with the human component.