The lowest layer of the control system consists of dynamically loaded modules called UObjects, which bind hardware or software components - actuators and sensors on the one hand, voice synthesis or face recognition algorithms on the other. Components exposing a UObject interface can be accessed from the urbiscript programming language, which is part of the Urbi software.
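
For illustration only, the sketch below shows how a compiled UObject plugin might be loaded and instantiated from urbiscript; the module file name and the constructor argument are assumptions, as the actual values depend on the build and the module.

    // Load a UObject plugin into the running engine and create an instance.
    loadModule("UCamera");             // module file name assumed
    var Global.cam = UCamera.new(0);   // constructor argument (camera index) assumed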

COMMUNICATION MODULES

Communication with the hardware level is achieved by means of modules that communicate through serial ports.

  • UAria - controls the mobile platform via the ARIA protocol. It gives FLASH full compatibility with MobileRobots products and offers support for popular laser scanners,
  • UDynamixel - transfers data using the Dynamixel protocol, which controls the actuators driving the arms, hands, and head (see the sketch after this list),
  • USerial - transfers data via a serial port,
  • URfid - reads and writes RFID tags using a simple USB reader.
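
The sketch below illustrates how one of these modules might be driven from urbiscript; the method names (open, setPosition) and their arguments are assumptions, not the documented API.

    // Sketch only: load the Dynamixel driver and move a single servo.
    loadModule("UDynamixel");
    var Global.dyn = UDynamixel.new();
    dyn.open("/dev/ttyUSB0", 1000000);   // serial device and baud rate (assumed)
    dyn.setPosition(1, 512);             // servo with id 1 to mid position (assumed)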

VIDEO MODULES

The next group of modules provides image processing capabilities for RGB and RGB-D data. Images from a camera can be accessed and processed by modules that wrap OpenCV library functions; a short usage sketch follows the list.

  • UCamera - image capture functions and camera settings,
  • UColorDetector - color detection in HSV space,
  • UFacET - (Facial Expression Tracker) a library of image processing procedures for detecting and parameterising face components (e.g. eyes, eyebrows, lips, forehead wrinkles),
  • UImageDisplay - basic module for displaying an image in a simple window,
  • UImageTool - basic image processing such as blurring, thresholding, morphological operations, etc.,
  • UKinect (OpenNI+NITE 1.x), UKinect (OpenNI+NITE 2.x) or UKinect (Kinect SDK) - extract RGB-D data from a Kinect sensor. The OpenNI-based modules measure the distance to objects present in the image, detect a human silhouette, and provide information on the position of particular parts of the human body; they also give access to simple gesture recognition algorithms. The third module, based on the Kinect SDK, provides the same functions as the OpenNI variants while adding features such as 2D and 3D face tracking, microphone array support, speech recognition, and detection of voice direction,
  • UMoveDetector - movement detection,
  • UObjectDetector - detects objects, e.g. human faces or certain body parts, using a Haar classifier.
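
As an illustration, the sketch below combines the camera and colour detector; the slot and method names (image, detect, visible, x, y) are assumptions, not the modules' documented API.

    // Sketch only: grab camera frames and report the position of a colour blob.
    loadModule("UCamera");
    loadModule("UColorDetector");
    var Global.cam = UCamera.new(0);          // camera index assumed
    var Global.color = UColorDetector.new();
    color.detect(0, 10, 50, 255, 50, 255);    // assumed H/S/V range for "red"
    every (100ms)
    {
      color.image = cam.image;                // assumed image hand-off between modules
      if (color.visible)
        echo("blob at (" + color.x.asString() + ", " + color.y.asString() + ")");
    },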

AUDITORY MODULES

  • UMP3 - PCM (wav) to MP3 converter.
  • UPlayer - simple module for playing pre-recorded .wav files. It enables the robot to play back various sounds and sentences recorded by external text-to-speech software (see the sketch after this list),
  • UPlayerNext - advanced module for playing and recording audio (WAV/AIFF/MP3/MP2/MP1/OGG/URL streams). It enables the robot to play back sounds from files or broadcast streams, record from microphones, and apply many effects (equalizer, phaser, autowah, echo, distortion, chorus, dynamic amplification, BPM counter and beat trigger, ...),
  • URecog - uses the Microsoft Speech Platform to recognize speech recorded with an external microphone,
  • USpeech - uses the Microsoft Speech Platform for real-time speech synthesis.
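
The sketch below shows how the playback and synthesis modules might be called from urbiscript; the method names (play, say) and the file path are assumptions.

    // Sketch only: play a pre-recorded sound, then synthesise a sentence.
    loadModule("UPlayer");
    loadModule("USpeech");
    var Global.player = UPlayer.new();
    var Global.speech = USpeech.new();
    player.play("sounds/hello.wav");    // hypothetical file path
    speech.say("Hello, I am FLASH.");   // hypothetical method name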

NETWORK MODULES

  • UBrowser - implements the functions of a web browser and an RSS reader. The module provides a wide variety of functions needed for extracting particular information from the Internet, such as a weather forecast or news,
  • UFacebook - module to handle Facebook's social networking services. Using this module, the robot is able to post new messages, upload photos (e.g. from its camera), retrieve posts, and perform a variety of other tasks that your FB app might require. With this module you can use our official "FLASH Connect" Facebook application or create your own app,
  • UGCalendar + Contacts - connects the robot with your Google Calendar and contacts,
  • UMail (POCO), UMail (CURL) - serve as e-mail clients with the ability to check and read mail and send messages with various types of attachments, e.g. an image from the robot's camera or a voice message recorded by a Kinect sensor (see the sketch after this list),
  • UTextTool - implements text utility functions such as encoding conversion, removing HTML markup, etc.,
  • UPhilipsHue - controls your Philips Hue home lighting system.
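
As an example of a network module, the sketch below sends a short e-mail; every method name and argument shown (setServer, login, send) is an assumption made for illustration, not the module's documented interface.

    // Sketch only: send an e-mail with an image attachment.
    loadModule("UMail");
    var Global.mail = UMail.new();
    mail.setServer("smtp.example.com", 587);    // hypothetical account setup
    mail.login("flash@example.com", "secret");  // hypothetical credentials
    mail.send("user@example.com", "Hello from FLASH",
              "Greetings from the robot!", "snapshot.jpg");  // hypothetical attachment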

APPRAISAL MODULES

Information gathered by the robot (from websites, e-mails, or via the auditory modules) can be affectively assessed to extract its emotional meaning with the following modules (a brief usage sketch follows the list):

  • UANEW - utilizes the ANEW (Affective Norms for English Words) project. It can be used to evaluate a word or a set of words in terms of the feelings they are associated with,
  • USentiWordNet - based on SentiWordNet, a project similar to ANEW. It is a lexical resource for opinion mining that assigns sentiment ratings to groups of semantic synonyms (synsets),
  • UWordNet - an interface to WordNet, a large lexical database of English in which nouns, verbs, adjectives and adverbs are grouped into synsets, each expressing a distinct concept. It can be used as a synonym dictionary and to find the base form of a word.
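
The sketch below scores a single word with the ANEW-based module; the method names (valence, arousal) are assumptions about the ANEW rating dimensions, not the module's documented API.

    // Sketch only: look up the affective ratings of a word.
    loadModule("UANEW");
    var Global.anew = UANEW.new();
    echo("valence of 'happy': " + anew.valence("happy").asString());
    echo("arousal of 'happy': " + anew.arousal("happy").asString());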

MACHINE LEARNING MODULES

  • UEigenfaces - user recognition algorithm,
  • UKNearest - data training algorithms (e.g. color learning).

REMOTE CONTROL MODULES

  • UJoystick - module to handle gamepads, joysticks, etc.,
  • UKeyboard - module for capturing pressed keys (see the sketch after this list).
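
The sketch below reacts to key presses using urbiscript's event syntax; the event name (keyPressed) is an assumption.

    // Sketch only: print every key reported by the keyboard module.
    loadModule("UKeyboard");
    var Global.keyboard = UKeyboard.new();
    at (keyboard.keyPressed?(var key))
      echo("key pressed: " + key.asString());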

EMOTIONAL MODULES

  • UWasabi - implementation of the WASABI emotional system proposed by Becker-Asano,
  • UPAD - implementation of a dynamic PAD-based (Pleasure-Arousal-Dominance) model of emotion (see the sketch after this list).
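
The sketch below feeds a stimulus into the emotional system and reads back its state; the method and slot names (impulse, pleasure, arousal, dominance) are assumptions made for illustration.

    // Sketch only: apply a positive stimulus and inspect the PAD state.
    loadModule("UWasabi");
    var Global.emo = UWasabi.new();
    emo.impulse(0.5);   // hypothetical positive appraisal
    echo("P A D = " + emo.pleasure.asString() + " " + emo.arousal.asString() + " " + emo.dominance.asString());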

EMYS and FLASH are Open Source and distributed according to the GPL v2.0 © Rev. 0.9.1, 15.05.2017
