Transformations waypoints

Manos Tsardoulias edited this page Jan 26, 2021 · 3 revisions

The streamsim tf library tracks all moving and non-moving objects in the environment, and offers methods for querying which entities affect which sensors. Its RPC services are:

redis::RPCService <streamsim.tf.declare>

  • Used internally by Streamsim to declare devices to tf. Do not use!

redis::RPCService <streamsim.tf.get_declarations>

  • Gets all declared devices. These include all environmental devices, the devices mounted on robots, and the environmental actors.
  • Input data: {}

redis::RPCService <streamsim.tf.get_tf>

  • Used to get the absolute pose of a device, using its name.
  • Example input: {'name': 'humidity_X'}
  • Example response: {'theta': 1.5707963267948966, 'x': 3.0, 'y': 2.0}
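The returned pose is a 2D position plus an orientation in radians. As a minimal sketch of consuming such a response (the pose values are taken from the example above; the helper `project_forward` is hypothetical, not part of streamsim), one can project a point ahead of the device along its heading:

```python
import math

# Example absolute pose, as returned by streamsim.tf.get_tf
pose = {'theta': 1.5707963267948966, 'x': 3.0, 'y': 2.0}

def project_forward(pose, dist):
    """Point `dist` meters ahead of the device along its heading."""
    return (pose['x'] + dist * math.cos(pose['theta']),
            pose['y'] + dist * math.sin(pose['theta']))

px, py = project_forward(pose, 1.0)
# theta = pi/2, so the point lies 1 m along +y: approximately (3.0, 3.0)
```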

redis::RPCService <streamsim.tf.get_affections>

  • Used to get the devices or actors that are in the proximity of a sensor and affect it.
  • Example input: {'name': 'humidity_X'}
  • Example output: {'hum_X': {'type': 'humidifier', 'info': {'humidity': 50}, 'distance': 1.4142135623730951, 'range': 5.0}, 'water_12': {'type': 'water', 'info': {'humidity': 100}, 'distance': 0.0, 'range': 5.0}}
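The output maps each affecting source to its type, payload, Euclidean distance from the sensor, and effective range. A plausible sketch of the underlying check (this mirrors the shape of the response above but is an assumption, not the actual streamsim implementation; all names here are illustrative):

```python
import math

def affections(sensor_pose, sources):
    """Return the sources whose range covers the sensor, together with
    their distance, in the same shape as a get_affections response."""
    out = {}
    for name, src in sources.items():
        d = math.hypot(src['x'] - sensor_pose['x'],
                       src['y'] - sensor_pose['y'])
        if d <= src['range']:  # source is close enough to affect the sensor
            out[name] = {'type': src['type'], 'info': src['info'],
                         'distance': d, 'range': src['range']}
    return out

sensor = {'x': 3.0, 'y': 2.0}
sources = {
    'hum_X': {'type': 'humidifier', 'x': 4.0, 'y': 3.0,
              'range': 5.0, 'info': {'humidity': 50}},
    'water_12': {'type': 'water', 'x': 3.0, 'y': 2.0,
                 'range': 5.0, 'info': {'humidity': 100}},
    'far_source': {'type': 'humidifier', 'x': 20.0, 'y': 20.0,
                   'range': 5.0, 'info': {'humidity': 30}},
}
result = affections(sensor, sources)
# hum_X at distance sqrt(2), water_12 at 0.0; far_source is out of range
```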

redis::RPCService <streamsim.tf.simulated_detection>

  • Used to simulate detections, bypassing the realistic operation of the sensors.
  • Example input: {'name': 'microphone_X', 'type': 'sound'}
  • Detection types for microphone: sound, language, emotion, speech2text
  • Detection types for camera: face, qr, barcode, gender, age, motion, color, emotion
  • Example output: {'result': True, 'info': None, 'frm': {'human_0': {'type': 'human', 'info': {'motion': 0, 'sound': 1, 'language': 'EN', 'speech': 'Hey there', 'emotion': 'angry'}, 'distance': 2.5495097567963922, 'range': 4.0}, 'sound_source_4': {'type': 'sound_source', 'info': {'language': 'EN', 'speech': 'Hey there', 'emotion': 'happy'}, 'distance': 0.5, 'range': 5.0}}}
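The `frm` field lists the nearby sources that contributed to the detection, each with its info payload, distance, and range. A hedged sketch of how a client might post-process such a response (the `detect` helper is hypothetical; the data is taken from the example output above):

```python
def detect(frm, detection_type):
    """Collect, per source in a simulated_detection 'frm' field, the value
    of the requested detection type; the detection succeeds if any source
    carries that attribute. A consumption sketch, not streamsim's logic."""
    hits = {name: src['info'][detection_type]
            for name, src in frm.items()
            if detection_type in src['info']}
    return {'result': bool(hits), 'hits': hits}

frm = {
    'human_0': {'type': 'human', 'distance': 2.5495097567963922, 'range': 4.0,
                'info': {'motion': 0, 'sound': 1, 'language': 'EN',
                         'speech': 'Hey there', 'emotion': 'angry'}},
    'sound_source_4': {'type': 'sound_source', 'distance': 0.5, 'range': 5.0,
                       'info': {'language': 'EN', 'speech': 'Hey there',
                                'emotion': 'happy'}},
}
out = detect(frm, 'sound')
# only human_0 carries a 'sound' attribute, so result is True
```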