URBI control system

Before starting FLASH, you should configure its control system. The main configuration file is called _CONFIG_.u and is located in the ubi_scripts directory (Fig. 1). For more information, see Control system configuration in the Software section.


edit config

Fig. 1 Configuration file (_CONFIG_.u) location.

In the default configuration, three components are configured and ready to use: Emys (head) (movement of all of the robot's joints), Audio (speech synthesis, playing and recording audio files) and Video (capturing video from the head camera and Kinect device) (Fig. 2). You should also add Arms (movement of arms), Hands (movement of hands) and Platform (if you want to use the ARIA module integrated with URBI). Do NOT add Platform if you want to use the external navigation software (ServerDemo and MobileEyes) provided by Adept MobileRobots.


config flash

Fig. 2 View of the robot structure configuration.

To start FLASH, run single.exe or single.bat (edit it if needed) (Fig. 3) from the modules directory; this will launch the Urbi engine and load the FLASH configuration file along with all the necessary functions.



Fig. 3 Runtime .bat file.

After loading all files, the robot's motors (head and arms) will start to hold torque. You should see the "ROBOT READY :-)" message (Fig. 4). By default, a basic set of FLASH competencies is loaded, which can be called via the Urbi console.


 robot ready

Fig. 4 URBI engine view.

Run GOSTAI Console (or another telnet console), set the appropriate port and connect to the already running engine. Now you can load your commands into the engine (Fig. 5). Call a simple behavior command to bring FLASH's head into the vertical position.



Fig. 5 Loading an example command into the URBI engine.

You can also check FLASH's video and audio systems. For this purpose, load a few simple commands into the URBI engine (see Fig. 6, 7).



Fig. 6 Simple video commands.



Fig. 7 Simple audio commands.

Example competencies can be found in Urbiscript examples in the Tutorials section.


Programming new gestures

FLASH's control API includes ready-to-use, preprogrammed safe-position gestures. You can find them here (LINK). When programming a new gesture, you should make sure that the new arm position will not cause a collision. It is highly recommended to program only a fraction of a new gesture at a time and to use slow movements (long execution time, time > 3 s). Moreover, you should always use the MovePosition(time, q1, q2, q3, q4, q5) function, which uses trajectory generators. If you write to a joint value slot (val) directly, without a trajectory generator, the motor will reach the position as fast as it can. Avoid such situations, as they may damage the arms or their drive mechanism.
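As a sketch of the difference (the joint slot path robot.body.arm.q1.val below is an assumption used for illustration; check the robot API for the actual slot names):

// Safe: MovePosition uses the trajectory generator and interpolates over 3 s.
robot.body.arm.MovePosition(3s, 0, 0, 0, 0, 0);
// Unsafe: writing a joint's val slot directly (hypothetical slot path) makes
// the motor jump to the target as fast as it can - this may damage the arm.
// robot.body.arm.q1.val = 70;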

Always start a new, not previously tested gesture from the arms' zero position:

robot.body.arm.MovePosition(3s, 0, 0, 0, 0, 0);

If you want the left arm to reach the following position:

q1 =  70 deg.
q2 =  75 deg.
q3 =  60 deg.
q4 =  100 deg.
q5 =  0 deg.

in 3 s, use this function with the 'left' arm localizer:

robot.body.arm[left].MovePosition(3s, 70, 75, 60, 100, 0);

If you are not sure that this is a safe position, start with smaller steps like this:

robot.body.arm[left].MovePosition(3s, 20, 20, 20, 20, 0);


robot.body.arm[left].MovePosition(3s, 40, 40, 40, 40, 0);


robot.body.arm[left].MovePosition(3s, 70, 75, 60, 100, 0);



  • If you are not sure at what position the arms are, return to zero and start again.
  • Execution of a new gesture should be tested from different starting positions (by preceding it with various other gestures) to see if it is always safe for the robot.
  • The finished gesture should never cause the robot to hit himself no matter what the previous position was.

If you are still not 100% sure about your arm movement, follow these rules:

  • The operator should observe robot movement with their hand on the main power switch, and should be able to turn off the robot immediately if the programmed move is dangerous for the robot.
  • When testing new gestures, joint maximum speed and torque should be decreased to minimize damage if the programmed movement is invalid. It can be done using the following commands:
Dyn2.SetTorqueLimit(joint_id, 300); // decrease joint torque limit to 300 (corresponds to 3 A of joint motor current)
Dyn2.SetMovingSpeed(joint_id, 150); // decrease joint speed to 150 internal motor units (150 × 0.28 (arm joint ratio) = 42 deg/s)

Both commands change parameters stored in RAM, so they affect the joint movement until the driver of the corresponding joint is restarted (e.g. after a power cycle). When finished, restore the defaults (see the Arms_MaxSpeed variable in _CONFIG_.u):

Dyn2.SetTorqueLimit(joint_id, 700); // restore joint torque limit to 700 (corresponds to 7 A of joint motor current)
Dyn2.SetMovingSpeed(joint_id, 357); // restore joint speed to 357 internal motor units (357 × 0.28 (arm joint ratio) ≈ 100 deg/s)
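The safe-testing workflow above can be combined into one sequence (joint_id 1 below is a placeholder; substitute the actual IDs of the joints you are testing):

// Lower the limits before testing a new gesture (placeholder joint ID 1).
Dyn2.SetTorqueLimit(1, 300);
Dyn2.SetMovingSpeed(1, 150);
// Test the gesture in small steps, starting from the zero position.
robot.body.arm[left].MovePosition(3s, 0, 0, 0, 0, 0);
robot.body.arm[left].MovePosition(3s, 20, 20, 20, 20, 0);
// Restore the defaults when finished (see Arms_MaxSpeed in _CONFIG_.u).
Dyn2.SetTorqueLimit(1, 700);
Dyn2.SetMovingSpeed(1, 357);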


Mobile platform control

FLASH's mobile platform is operated in a client-server environment. The platform controller handles low-level functions such as platform speed and heading, sensor reading acquisition, etc. To complete the client-server architecture, FLASH requires PC software running on an on-board computer connected to the platform controller via a serial port. It provides the high-level, intelligent robot control, including obstacle avoidance, path planning, localization, navigation, and so on.

FLASH utilises PC software produced by Adept MobileRobots LLC. The complete navigation software consists of:

  • ARIA (LINK, DOCU) - Advanced Robotics Interface for Applications, a C++-based open-source development environment that provides a robust client-side interface to a variety of intelligent robotics systems, including your robot's microcontroller and accessory systems.
  • ARNetworking (distributed with ARIA) - provides the critical layer for TCP/IP-based communication with the robot over the network.
  • ARNL, BaseARNL (LINK) - a set of software packages built on top of ARIA for intelligent navigation and localization. It allows a program to keep track of where the robot is and to drive it to a given destination.
  • Mapper3 (LINK) - an application designed as a helper for ARNL to convert ARNL's laser scan log files (.2d) to map files (.map), edit map data and add goals, forbidden lines, home points, and other map objects.
  • MobileEyes (LINK) - a graphical application for remotely monitoring and controlling the robot (using wireless networking or other communications link). It uses the ArNetworking protocol (implementation included with ARIA) to connect to a server program on the robot's onboard computer.
Attention: ARIA and ARNetworking are provided as open-source software under the GNU General Public License. The URBI-based ARIA implementation is integrated into FLASH's control system.
Attention: The full navigation software (ARNL, BaseARNL, Mapper3, MobileEyes) is solely owned and copyrighted or licensed for use and distribution ONLY by Adept MobileRobots LLC and can be obtained directly from the producer (LINK). FLASH Robotics only presents how to use this software with its robots (FLASH); it DOES NOT DISTRIBUTE any Adept software. ARNL users are authorized to develop and operate custom software for personal research and educational use only. Duplication, distribution, reverse-engineering or commercial application of MobileRobots software and hardware without a license or the express written consent of Adept MobileRobots LLC is explicitly forbidden.
URBI-based platform control (without navigation, platform driving only)

To control the mobile platform from the URBI level, you must first add the "Platform" component in the _CONFIG_.u file (described above). It integrates the ARIA (LINK, DOCU) library with the existing robot control system. Using the robot API (platform part) (LINK) you can control platform movement (set/get longitudinal and rotational speed, get odometry data, access laser range sensor data and battery level). Using the raw UAria (LINK) module you can set/get more movement parameters (acceleration, deceleration, max speed, max distance, etc.).

Unfortunately, the ARIA library server thread causes micro-locks of the URBI engine. To avoid them, load the UAria module in remote mode; in this mode, a separate thread is created to handle the library functions within the same engine instance. FLASH's control system allows applying remote mode to any UObject, but some of them (UAria, UGCalendar, UJoystick, UMail) already have .bat files named rem_UObjectName.bat (Fig. 9, 10) which run the remote version of the given module.
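As an illustration, a rem_UObjectName.bat file typically boils down to a single urbi-launch call; the module name, host and port below are assumptions, so compare with the rem_UAria.bat shipped with your installation:

rem Sketch: load the UAria UObject in remote mode, attaching to the running engine.
urbi-launch --remote UAria -- --host localhost --port 54000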

Fig. 9 Definitions of remote modules.


run remote

Fig. 10 Definition of rem_UAria.bat (loads UAria in remote mode).


In order to use the remote mode, run common.bat instead of single.bat. If you have your own set of remote UObjects, edit common.bat (Fig. 11).



Fig. 11 common.bat file.


After running the common.bat file you will see two URBI windows (main URBI engine instance and an additional window with UAria) (see Fig. 12).


robot ready nav

Fig. 12 URBI engine view.


Run GOSTAI Console (or another telnet console), set the appropriate port and connect to the already running engine. Now you can test platform commands (Fig. 13).


platform test

Fig. 13 Simple mobile platform command (set rotational speed to 10 deg/s).


Basic ADEPT platform control software (without navigation, only driving)

You can also use your own platform control software (separately from URBI). Using the examples provided by Adept MobileRobots, you can implement software tailored to your application. Below we show how to run the ready-to-use Adept demos (demo.exe, serverDemo.exe).

Go to the ARIA Mobile Robots folder (C:\Program Files\MobileRobots\Aria\bin) and run demo.exe. By default, the ARIA demo program connects with the robot through the COM1 serial port and with the attached laser rangefinder accessory through the COM3 serial port. To change those connection options to comply with the FLASH configuration, use this command:

demo.exe -connectLaser -laserType urg2.0 -laserPortType serial -laserPort COM3 -laserStartingBaud 115200

The ARIA demo displays a menu of robot operation options (Fig. 14). The default mode of operation is teleop. In teleop mode, you drive the robot manually, using the arrow keys on your keyboard or a joystick connected to the on-board PC’s USB port. 

  • ↑ - forward
  • ↓ - reverse
  • ← - turn left
  • → - turn right
  • space - stop



Fig. 14 demo.exe view.


With FLASH, only a few modes are available:

  • position 'p' - displays the coordinates of the robot's position relative to its starting location,
  • teleop 't' - drive and steer the robot via the keyboard or a joystick; avoids collisions,
  • unguarded 'u' - same as teleop, except no collision avoidance,
  • laser 'l' - displays the closest and furthest readings from the laser range finder.


Another way to control robot movement is to use the network demo provided with the ARIA software. Go to the ARIA Mobile Robots folder (C:\Program Files\MobileRobots\Aria\bin) and run serverDemo.exe. By default, it connects with the robot through the COM1 serial port and with the attached laser rangefinder accessory through the COM3 serial port. To change those connection options to comply with the FLASH configuration, use this command:

serverDemo.exe -connectLaser -laserType urg2.0 -laserPortType serial -laserPort COM3 -laserStartingBaud 115200

When connected with the ARNetworking serverDemo client, FLASH is ready to be controlled via a network connection. Run MobileEyes software (C:\Program Files\MobileRobots\MobileEyes\bin\MobileEyes.exe) and from its startup dialog specify the IP address or hostname of the computer that is running serverDemo.exe. The hostname localhost or IP address usually works if serverDemo is running on the same PC as the one you launch MobileEyes on.



Fig. 15 MobileEyes software view.



Advanced ADEPT platform control software (full navigation)

It is also possible to use full navigation software distributed by Adept Mobile Robots (separately from URBI). Similarly to the serverDemo example, run arnlServer.exe from C:\Program Files\MobileRobots\ARNL\bin folder. In order to comply with the FLASH configuration use this command:

arnlServer.exe -connectLaser -laserType urg2.0 -laserPortType serial -laserPort COM3 -laserStartingBaud 115200

If you have a ready map, use this command:

arnlServer.exe -connectLaser -laserType urg2.0 -laserPortType serial -laserPort COM3 -laserStartingBaud 115200 -map mapName.map

Run the MobileEyes software (C:\Program Files\MobileRobots\MobileEyes\bin\MobileEyes.exe) and from its startup dialog specify the IP address or hostname of the computer that is running arnlServer.exe. The hostname localhost usually works if arnlServer is running on the same PC as the one you launch MobileEyes on.

If the map is not loaded, choose 'Tools', then the 'Robot Configuration' menu. Choose the Files section. Set the Map parameter to your map file (use the [...] button, or enter the name or path of the map file) (Fig. 16).



Fig. 16 Loading new map.


Before ARNL can navigate automatically, the robot must be correctly localized, so you must provide the software with the initial position of the robot. Choose 'Tools', then 'Robot Tools' and 'Localize to Point' menu option. Find the location in the map that corresponds to where the robot actually is in your environment. Click at this spot and hold the mouse button. Drag the mouse in the direction that the robot is facing, then release the mouse. Please note that visualisation of laser data (dots) should be aligned with the obstacles contained within the previously loaded map. To finish, click Done (see Fig. 17).



Fig. 17 Localization procedure.


After that, your robot is ready for path planning and point-to-point movement. To set a new goal position use the Send Robot tool (see Fig. 18).



Fig. 18 Setting new goal destination.



Creating a new map

You can also create a new map file with the ADEPT MobileRobots software, in one of two ways. The first is to run the arnlServer example program; the second is to use the sickLogger program instead of arnlServer. Below we describe only the first one; for more details read the Mapper.txt file (you can find it in C:\Program Files\MobileRobots\ARNL\docs).

Connect to arnlServer with MobileEyes, and select 'Map Creation' from the Tools menu, then 'Start Mapping'. Provide a name for the scan log. Then use MobileEyes to drive the robot around the environment. When done, stop mapping by choosing 'Stop Mapping' from the 'Map Creation' submenu of the 'Tools' menu (see Fig. 19).


start map

Fig. 19 Start new map creation.


Before you start mapping it is a good idea to develop an exploration strategy for driving your robot around. The exploration strategy depends on your environment. For tips on making a good map read carefully Mapper.txt file (you can find it in C:\Program Files\MobileRobots\ARNL\docs). Drive the robot around the environment for a while. When done, stop mapping by choosing 'Stop Mapping' from the 'Map Creation' submenu of the 'Tools' menu.

After that, the map file must be processed from laser scan log (.2d) file in the Mapper3 application (you can find it in C:\Program Files\MobileRobots\Mapper3\bin). The map also needs to be completed by adding logical objects like goals, forbidden lines (boundaries), etc. To load the scan log file into Mapper3, first copy the .2d file from the robot (or run Mapper3 on the on-board computer). Once you load the scan, Mapper3 will begin processing the .2d file and will show the progress as it is tracing the scan log and creating the map (see Fig. 20).


make map

Fig. 20 Laser scan log processing.


During all this processing, an animated moving robot icon is shown on the right side of the status bar at the bottom of the Mapper3 window. When the software finishes processing the map, the 'Finish' button becomes available. Click it to save the new map.

More tips and information about navigation can be found in the ARNL developer reference manual and on the support wiki.



Turning OFF balancing platform (for test purposes)

For test purposes, the robot can be used while placed on a stable base, without balancing. To turn balancing off, use the mode selection switch located on the platform controller. Putting the switch in the middle position turns balancing off; the upper and lower positions turn balancing on (using the IMU or odometry, respectively) (Fig. 21).

off balance

Fig. 21 Turning OFF the balancing.

Balancing can be turned OFF while the robot's power is on; a second operator should assist by holding the back handle of the robot. Once balancing is OFF, it should NEVER be turned back ON while the robot is operating. To turn balancing on again, first turn off the robot, move the mode switch to the desired position and then turn the robot on (following the start-up procedure).



EMYS and FLASH are Open Source and distributed according to the GPL v2.0 © Rev. 0.9.2, 23.11.2017

FLASH Documentation