
Proposed API for GPSR

Vision

  1. count(target: str) -> int

    • target (str): The target to identify and count. This function must support many different targets.
    • Returns the number of matching targets found in the environment. The following targets are required:
      • Gestures and postures: sitting_person, standing_person, lying_person, waving_person, person_raising_left_arm, person_raising_right_arm, person_pointing_left, person_pointing_right
      • Clothing, as any combination of:
        • color_list = ["blue", "yellow", "black", "white", "red", "orange", "gray"]
        • clothe_list = ["t shirt", "shirt", "blouse", "sweater", "coat", "jacket"]
  2. locate(target: str) -> point(x, y, z)

    • target (str): The target to identify and locate. This function must support many different targets.
    • Returns the coordinates of the target in the environment. The following targets are required:
      • Gestures and postures: sitting_person, standing_person, lying_person, waving_person, person_raising_left_arm, person_raising_right_arm, person_pointing_left, person_pointing_right
      • Clothing, as any combination of:
        • color_list = ["blue", "yellow", "black", "white", "red", "orange", "gray"]
        • clothe_list = ["t shirt", "shirt", "blouse", "sweater", "coat", "jacket"]
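Since the clothing targets are the combinations of color_list and clothe_list, the full clothing vocabulary can be enumerated programmatically. A minimal sketch, assuming targets are formed as "<color> <garment>" strings (the exact naming convention is an assumption, not part of the proposal):

```python
from itertools import product

color_list = ["blue", "yellow", "black", "white", "red", "orange", "gray"]
clothe_list = ["t shirt", "shirt", "blouse", "sweater", "coat", "jacket"]

# Hypothetical naming convention: "<color> <garment>", e.g. "red jacket".
clothing_targets = [f"{color} {garment}" for color, garment in product(color_list, clothe_list)]

print(len(clothing_targets))  # 7 colors x 6 garments = 42 targets
```

Enumerating the vocabulary up front makes it easy to validate a `target` string before dispatching it to the detector.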
  3. identify(target_name: str) -> point(x, y, z)

    • target_name (str): The name of a person previously saved with save_person.
    • Returns the coordinates of that person in the environment.

  4. save_person(new_name: str) -> void

    • new_name (str): The name under which to save the person's face for later identification.

  5. Object detection model (to allow pick and place): must detect all RoboCup objects. Some examples are:
{
 'orange juice', 'red wine', 'milk', 'iced tea', 'cola',
 'tropical juice', 'juice pack', 'apple', 'pear', 'lemon',
 'peach', 'banana', 'strawberry', 'orange', 'plum', 'cheezit',
 'cornflakes', 'pringles', 'tuna', 'sugar', 'strawberry jello',
 'tomato soup', 'mustard', 'chocolate jello', 'spam', 'coffee grounds',
 'plate', 'fork', 'spoon', 'cup', 'knife', 'bowl', 'rubiks cube',
 'soccer ball', 'dice', 'tennis ball', 'baseball', 'cleanser', 'sponge'
}
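The four vision calls above can be collected into a single interface. A minimal sketch of that interface, with placeholder bodies; the `Point` dataclass and the `Optional` return for targets that are not found are assumptions, since the proposal only specifies `point(x, y, z)`:

```python
from dataclasses import dataclass
from typing import Optional


# Hypothetical return type; the proposal only says point(x, y, z).
@dataclass
class Point:
    x: float
    y: float
    z: float


class VisionAPI:
    """Sketch of the proposed vision interface; bodies are placeholders."""

    def count(self, target: str) -> int:
        """Return the number of `target` instances found in the environment."""
        raise NotImplementedError

    def locate(self, target: str) -> Optional[Point]:
        """Return the coordinates of `target`, or None if not found (assumption)."""
        raise NotImplementedError

    def identify(self, target_name: str) -> Optional[Point]:
        """Locate a person previously saved with save_person."""
        raise NotImplementedError

    def save_person(self, new_name: str) -> None:
        """Save the face currently in view under `new_name`."""
        raise NotImplementedError
```

Keeping the four calls behind one interface lets the task planner depend on the signatures alone while the detection models evolve independently.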

HRI

  1. interpret_commands(interpreted_instruction: str) -> CommandList

    • interpreted_instruction (str): The instruction heard by the robot, as spoken by the user.
    • Returns a CommandList: the list of commands for the robot to execute. The possible commands are specified in commands.md.
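A toy sketch of what interpret_commands could look like. The `Command` dataclass and the keyword rules below are assumptions for illustration only; the real command set lives in commands.md, and a real implementation would parse the full GPSR grammar rather than match keywords:

```python
from dataclasses import dataclass


# Hypothetical command representation; the real commands are defined in commands.md.
@dataclass
class Command:
    action: str
    argument: str


CommandList = list[Command]


def interpret_commands(interpreted_instruction: str) -> CommandList:
    """Toy keyword-based interpreter; rules here are illustrative, not from commands.md."""
    commands: CommandList = []
    instruction = interpreted_instruction.lower()
    if "go to" in instruction:
        commands.append(Command("go", instruction.split("go to", 1)[1].strip()))
    if "count" in instruction:
        commands.append(Command("count", instruction.split("count", 1)[1].strip()))
    return commands
```

Returning a structured CommandList rather than raw text keeps the executor decoupled from how the instruction was understood.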
