
Below you can find descriptions of two competition tasks. Each team can participate in only one task. If you don't have access to a robot (i.e., you are not a student subscribed to the Intelligent Robotics module), you can bid for access to one of our robots (several TurtleBots and one iRobot (RWI) B21 robot). Access to the B21 robot will be given to the team with the best application. Sign up here, deadline 6th November: https://goo.gl/forms/zATxYG9OBVb9LXVE3

The competition date is to be confirmed as soon as possible. For any questions, write to us at robot-club(put here the usual)cs.bham.ac.uk

Credit goes to RoCKIn@Home, ERL-SR and RoboCup@Home, as these tasks are simplifications of tasks from those competitions.

Task 1: Getting to know my home

 

Important notice: We intentionally propose a complicated task motivated by the story given below. We do not expect teams to address all the challenges of the task. For this purpose, we present different stages in order to guide teams in their focus. Moreover, teams can choose their interests and bypass some necessary parts by using a helper (which can be a QR code, typed-in text, or anything reasonable that the team discusses with the referees) or by using existing ROS nodes. The described stages are only suggestions, so teams are free to make any reasonable additions or changes they wish.

 

Summary:  A robot is introduced into a new home where it is supposed to operate. Before it can be given to a new user - an elderly person - it must learn the specifics about its new home.

Part A: A trained operator can assist the robot in building its metric map, i.e., the layout of the flat, the size of the rooms, the free space available to pass obstacles, etc., and its semantic map, i.e., the relations between rooms (the kitchen connects to the living room, which connects to the hallway), the relationships between pieces of furniture (the dining table is in the kitchen), the properties of the furniture (position, shape, etc.), and the relationships between objects (a cup is on the coffee table) and their properties as well.

Part B: After the maps are built in cooperation with the operator, the robot must autonomously demonstrate its learnt knowledge. First, it must navigate close to a piece of furniture or an object, making sure it can replan when an obstacle limits where the robot can stand next to the furniture/object. Second, it must answer questions such as "Where is the vase?" (on the dining table) or "Where is a pillow?" (on the sofa and on the bed).
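
To make the semantic map concrete, here is a minimal sketch (in Python) of one possible representation and of answering a "Where is ...?" question by a simple lookup. The room, furniture and object names, poses and sizes are illustrative only; teams are free to use any other representation or an existing knowledge-base ROS node instead.

# Minimal sketch of one possible semantic-map representation (illustrative only).
# Rooms, furniture and objects are stored with "in"/"on" relations so that simple
# questions such as "Where is the vase?" can be answered by a lookup.
semantic_map = {
    "rooms": {
        "kitchen":     {"connects_to": ["living_room"]},
        "living_room": {"connects_to": ["kitchen", "hallway"]},
        "hallway":     {"connects_to": ["living_room"]},
    },
    "furniture": {
        "dining_table": {"in": "kitchen",     "pose": (2.1, 0.4, 0.0), "size": (1.6, 0.9, 0.75)},
        "coffee_table": {"in": "living_room", "pose": (5.0, 1.2, 0.0), "size": (0.9, 0.6, 0.45)},
        "sofa":         {"in": "living_room", "pose": (5.5, 2.0, 0.0), "size": (2.0, 0.9, 0.80)},
    },
    "objects": {
        "vase":   {"on": "dining_table"},
        "cup":    {"on": "coffee_table"},
        "pillow": {"on": "sofa"},
    },
}

def where_is(obj_name):
    """Answer 'Where is X?' by following the on/in relations."""
    obj = semantic_map["objects"].get(obj_name)
    if obj is None:
        return "I do not know about %s" % obj_name
    furniture = obj["on"]
    room = semantic_map["furniture"][furniture]["in"]
    return "The %s is on the %s in the %s" % (obj_name, furniture, room)

print(where_is("vase"))    # The vase is on the dining_table in the kitchen
print(where_is("pillow"))  # The pillow is on the sofa in the living_room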

 

Stages: We recommend splitting the task into the following stages of increasing difficulty, minimising the use of helpers in each step. Moreover, the team's focus should be on the integrated system rather than on several disconnected functionalities.

 

Stage 1:

  1. An operator guides the robot via a joystick

  2. The robot makes a metric map with the help of an operator

  3. An operator navigates the robot to a new piece of furniture/object

  4. The robot uses a helper (a QR code in this case) to learn the object type, properties and world position

  5. Using this knowledge, the robot automatically updates its metric map (for example, a table is poorly visible in laser data, so the metric map should be edited so that the robot knows it cannot drive between the table legs visible in the laser scan)

  6. Using this knowledge, the robot automatically updates its semantic map

  7. After learning, the robot receives a navigation command via a helper (for example typed-in text or a QR code). The robot must navigate to a region next to the piece of furniture or object specified in the command. This cannot be a single point, as that point might be obstructed (for example by a chair standing next to a table), so the robot must plan where to go in order to fulfill the task, and it must indicate if the task cannot be fulfilled (the furniture/region is fully obstructed). The robot should give up if it is unable to reach the goal within a certain threshold (for example one minute); a minimal goal-sending sketch is given after this list

  8. The robot needs to stand at an orientation towards the furniture/object that maximises the view of the furniture/object from the robot's sensors
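
As a starting point for steps 7 and 8, the sketch below sends a single navigation goal to the standard ROS navigation stack (move_base) and gives up after a threshold. It assumes move_base is running with the robot's metric map; the hard-coded goal coordinates are illustrative, and choosing between several candidate poses around the furniture (and the view-maximising orientation of step 8) is left out.

#!/usr/bin/env python
import rospy
import actionlib
from actionlib_msgs.msg import GoalStatus
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def go_near(x, y, orientation=(0.0, 0.0, 0.0, 1.0), timeout=60.0):
    """Send one candidate goal pose (map frame) and give up after 'timeout' seconds."""
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    (goal.target_pose.pose.orientation.x,
     goal.target_pose.pose.orientation.y,
     goal.target_pose.pose.orientation.z,
     goal.target_pose.pose.orientation.w) = orientation

    client.send_goal(goal)
    if not client.wait_for_result(rospy.Duration(timeout)):
        # Threshold exceeded (e.g. one minute): cancel the goal and report failure.
        client.cancel_goal()
        rospy.logwarn("Could not reach the goal within %.0f s, giving up", timeout)
        return False
    return client.get_state() == GoalStatus.SUCCEEDED

if __name__ == "__main__":
    rospy.init_node("go_near_furniture")
    # Illustrative goal taken from the learnt map, e.g. a pose next to the dining table.
    go_near(2.1, 0.4)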

 
 

Stage 2:

  1. The QR codes used to learn the properties of a piece of furniture/an object (1.4) no longer contain the world position; the team needs to extract the world position by observing the size and orientation of the QR code (see the sketch after this list)

  2. If the team has a pan-tilt unit (PTU), it can extend (1.8) by tilting the camera mounted on top of the PTU to gain a better view of the furniture

  3. Instead of an operator navigating the robot to a piece of furniture/object (1.3), the robot automatically explores its environment and searches for QR codes
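
For item 1, here is a rough sketch of recovering the distance to a QR code from its apparent size, using the pinhole camera model. The marker side length (0.10 m) and focal length (525 px) are assumed values; the focal length would normally come from the camera calibration (camera_info), and turning the distance and marker orientation into a world position via the robot's pose is omitted.

def distance_to_marker(marker_side_px, marker_side_m=0.10, focal_length_px=525.0):
    """Pinhole camera model: apparent size shrinks linearly with distance."""
    return focal_length_px * marker_side_m / marker_side_px

# Example: a 10 cm marker that appears 50 px wide is roughly 1.05 m from the camera.
print(distance_to_marker(50.0))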

 

Stage 3:

  1. The QR code on a piece of furniture/an object (1.4) contains only a name; the properties (e.g., size) must be extracted using a 3D camera (see the sketch after this list)

  2. Extend (2.2) so that the robot navigates to the position that gives the best view of an object

  3. Speech recognition and understanding can be added to replace any helper mentioned before

  4. Replace the joystick in (1.1) with the ability for the robot to autonomously follow a person
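
For item 1, here is a rough sketch of estimating an object's size from 3D camera data: once the points belonging to the object have been segmented out (for example by cropping around the detected QR code, which is assumed to have been done already), the size is simply the extent of their bounding box.

import numpy as np

def object_size(points_xyz):
    """points_xyz: (N, 3) array of points on the object, in metres."""
    mins = points_xyz.min(axis=0)
    maxs = points_xyz.max(axis=0)
    return maxs - mins  # extents of the axis-aligned bounding box

# Example with a few made-up points standing in for a segmented table top.
pts = np.array([[0.0, 0.0, 0.70], [1.6, 0.0, 0.70], [1.6, 0.9, 0.75], [0.0, 0.9, 0.75]])
print(object_size(pts))  # -> [1.6  0.9  0.05]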

 
 
 

Task 2: Navigating and interacting with people

 

Important notice: We intentionally propose a complicated task motivated by the story given below. We do not expect teams to address all the challenges of the task. For this purpose, we present different stages in order to guide teams in their focus. Moreover, teams can choose their interests and bypass some necessary parts by using a helper (which can be a QR code, typed-in text, or anything reasonable that the team discusses with the referees) or by using existing ROS nodes. The described stages are only suggestions, so teams are free to make any reasonable additions or changes they wish.

 

Summary: An autonomous robot operates in a flat where it assists an elderly person - Granny Annie. She gives the robot tasks that require it to navigate smoothly in the environment. The robot has a map of the static obstacles, but sometimes Granny Annie leaves something lying around. The robot needs to detect such an unknown obstacle and can decide to push it slightly aside if it is safe to do so. Sometimes this means that a certain path is impossible for the robot to navigate and it must take another route. If there is no other route, the robot can ask for the object to be removed and wait in the meantime. Moreover, Granny Annie has many friends coming to visit her who are not yet fully familiar with robots, so they often happen to stand in the robot's way. The robot must detect humans and gently ask them to move to one side. The robot must differentiate between object and human obstacles.

 

Moreover, one of Annie's friends is sick and Annie wants to visit her. She wants the robot to go with her and help her carry a basket. However, anything outside the department is unknown to the robot, as it does not have a map of it, so it must strictly follow Annie. The robot can give her instructions about what to do so that it is able to follow her. On the way to the friend's place there are many other people walking, and the robot must not get distracted and lose Granny Annie. When they reach the friend's house, the robot waits outside, as there are stairs which the robot cannot overcome. When Annie returns, the robot must ensure that she is the same person it previously followed.

 

However, Granny Annie is feeling dizzy because she forgot to take her medicine in the morning, and she cannot remember where she lives. She can command the robot to "Take me home", in which case the robot should use the previously observed path to guide her home, or "Take me to my doctor", in which case the robot can obtain a map to determine how to get there. In both cases, the robot must adjust its speed to Granny's.

 
 

Stages: We recommend splitting the task into the following stages of increasing difficulty, minimising the use of helpers in each step. Moreover, the team's focus should be on the integrated system rather than on several disconnected functionalities.

 

Stage 1:

  1. The robot makes a metric map with the help of an operator

  2. The robot autonomously navigates to a sequence of waypoints and must be able to replan if it observes that its intended route is not possible (for example, if a door between rooms is closed); see the sketch after this list

  3. If a waypoint cannot be reached because of an obstacle, the robot must signal its knowledge that there is an obstacle. In this stage, we do not differentiate between objects and humans. The robot should ask for the obstacle to be removed and then wait (if there is no other way of reaching the waypoint)

  4. Following stage - Granny Annie has a helper - a QR code - on her back. The robot must keep following her while building its map (i.e., addressing the SLAM problem). Moreover, there are no other people around

  5. Guiding stage - Granny gives a command by using a helper (typed in text or a QR code)

  6. Guiding stage - if the command is “Take me home”, the robot uses its previously built map to safely navigate back
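
For items 2 and 3, here is a rough sketch of visiting a sequence of waypoints with retries. It assumes a go_to(x, y) helper like the goal-sending sketch in Task 1 that returns True on success; move_base already replans around obstacles it can see, and when a waypoint stays unreachable the robot announces the problem, waits, and retries. The waypoint coordinates are illustrative.

import rospy

def visit(waypoints, go_to, retries=3, wait_s=10.0):
    """Visit each (x, y) waypoint; on failure ask for the obstacle to be removed and retry."""
    for x, y in waypoints:
        for _attempt in range(retries):
            if go_to(x, y):
                break
            # Signal that something blocks the way and wait before trying again.
            rospy.logwarn("Waypoint (%.1f, %.1f) blocked, please clear the way", x, y)
            rospy.sleep(wait_s)
        else:
            rospy.logerr("Giving up on waypoint (%.1f, %.1f)", x, y)

# Usage (illustrative map-frame coordinates, go_near from the Task 1 sketch):
# visit([(1.0, 0.5), (3.2, 1.0), (4.5, 2.5)], go_to=go_near)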

 

Stage 2:

  1. Improve (1.3) by differentiating between object obstacles and human obstacles (see the sketch after this list)

  2. Gently push objects to one side (but not humans)

  3. Improve (1.4) by avoiding the helper, for example by learning other aspects of Granny Annie (such as her clothes)

  4. Build HRI (Human-Robot Interaction) in order to give a human complete guidance on how to interact with the robot (assume the person to be followed has never met the robot before, i.e., Granny Annie might lend the robot to a friend)

  5. Following stage - If someone crosses between the robot and Granny, the robot should continue following Granny

  6. Guiding stage - ensure that the person coming down the stairs to be guided is Granny

  7. Guiding stage - if the given command is "Take me to my doctor", download a map (from the prepared website or a USB stick) and plan a path in that map
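
For item 1, here is a rough sketch of telling human obstacles apart from object obstacles using OpenCV's built-in HOG pedestrian detector on the robot's camera image. This is only one possible approach; a leg detector on the laser scan or a trained neural detector would do equally well.

import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def obstacle_is_person(image_bgr):
    """Return True if a pedestrian is detected in the camera image of the obstacle."""
    rects, _weights = hog.detectMultiScale(image_bgr, winStride=(8, 8))
    return len(rects) > 0

# Usage: if obstacle_is_person(frame) holds, ask the person to step aside;
# otherwise consider gently pushing the object or replanning around it.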

 

Stage 3:

 

  1. Guiding stage - use face recognition to make sure that you have guided Granny to her doctor; learning data will be provided (see the sketch after this list)

  2. Improve (1.5) with speech recognition and understanding

  3. If your robot loses Granny during the Following or Guiding stage, actively search for her

  4. Guiding stage - if you detect that Granny is not following you and is taking a different path, gently block her and remind her to follow you
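
For item 1 (and item 6 of Stage 2), here is a rough sketch of verifying a known person's identity with OpenCV's LBPH face recogniser (requires the opencv-contrib package). It assumes the provided learning data comes as grayscale face crops with integer labels; the label for Granny and the acceptance threshold are assumed values, and detecting and cropping the face in the live camera image is omitted.

import cv2
import numpy as np

def train_recogniser(face_images, labels):
    """face_images: list of same-sized grayscale face crops; labels: list of ints."""
    recogniser = cv2.face.LBPHFaceRecognizer_create()
    recogniser.train(face_images, np.array(labels))
    return recogniser

def is_granny(recogniser, face_gray, granny_label=0, max_distance=60.0):
    """Accept the identity only if the predicted label matches and the distance is small."""
    label, distance = recogniser.predict(face_gray)
    return label == granny_label and distance < max_distance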