Animals have an amazing ability to quickly and efficiently extract useful information from their environment. Humans, for example, can react to visual stimuli with a mean reaction time of 180 to 200 milliseconds (Shelton and Kumar, 2010).
To explore how this could be done, a robotic analogue of the human visual system has been developed at Plymouth to assess algorithms and to mimic observed behaviours.
The Owl robot is a stereo camera host designed for the exploration of verging-camera stereopsis. It has five degrees of freedom: neck rotation, plus independent pan and tilt for each eye.
Local processing is handled by a Raspberry Pi dual-camera compute board located at the base of the robot. An additional PCB provides an interface between the Pi, the servo control lines, and an audio codec.
The cameras are Pi HD cameras set to deliver stereo pairs at VGA resolution, streamed over RTP at 30 fps.
A pair of MKS DS65K high-speed digital servos move the eyes, driven at a 333 Hz update rate. The normal Pulse Width Modulation (PWM) drive has a 3 ms period, with a pulse width between 850 µs and 2150 µs (i.e. a range of 1300 PWM steps). The centre position is set at approximately 1500 µs, and the servos have a dead band of one microsecond. The PWM range allows 160° of rotation, so one PWM step corresponds to approximately 0.123°.
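The pulse-width-to-angle mapping described above can be sketched as follows. The constants are taken directly from the text (850-2150 µs pulse range, ~1500 µs centre, 160° of travel); treating the centre pulse as 0° is an assumption for illustration.

```python
# Pulse/angle mapping for the eye servos, using the figures quoted above.
PULSE_MIN_US = 850      # minimum drive pulse width (us)
PULSE_MAX_US = 2150     # maximum drive pulse width (us)
PULSE_CENTRE_US = 1500  # approximate centre position (us)
RANGE_DEG = 160.0       # total rotation over the full pulse range

# Degrees per 1 us PWM step: 160 / 1300, roughly 0.123 degrees.
STEP_DEG = RANGE_DEG / (PULSE_MAX_US - PULSE_MIN_US)

def pulse_to_angle(pulse_us: int) -> float:
    """Convert a servo pulse width (us) to an angle in degrees,
    taking the ~1500 us centre position as 0 degrees."""
    return (pulse_us - PULSE_CENTRE_US) * STEP_DEG

def angle_to_pulse(angle_deg: float) -> int:
    """Inverse mapping, clamped to the legal pulse range."""
    pulse = round(PULSE_CENTRE_US + angle_deg / STEP_DEG)
    return max(PULSE_MIN_US, min(PULSE_MAX_US, pulse))
```

Clamping in `angle_to_pulse` mirrors the software limits mentioned later: a commanded angle outside the mechanical range saturates rather than over-driving the servo.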
The Pi compute board was chosen because it offers dual DMA camera inputs, which facilitate high-speed streaming of camera images to the internet. The processor is the BCM2835 (the same as that in the Raspberry Pi B+).
The eyes of the robot are OV5647 camera modules, each capable of 2592 x 1944 pixels in a 4:3 aspect ratio. The cameras have been set to a lower 640 x 480 pixel resolution, as higher resolutions are not required and the lower resolution reduces image processing time.
The two video streams are combined into a single 1280 x 480 stream, which is sent to a host computer in MJPEG format over an RTP interface via USB. The streams are not synchronised. The cameras can pan and tilt independently via the four high-speed MKS DS65K servos, each capable of a no-load angular velocity of 0.203 s/60° at 4.8 V (source: www.mksservosusa.com). This equates to approximately 300°/s.
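On the host side, each combined 1280 x 480 frame must be split back into a stereo pair before processing. A minimal sketch with NumPy is shown below; the assumption that the left half of the frame carries the left eye's image is illustrative, and in the real system the frame would come from decoding the MJPEG/RTP stream (e.g. with OpenCV) rather than being synthesised.

```python
import numpy as np

def split_stereo(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split a side-by-side 1280 x 480 stereo frame into
    left and right 640 x 480 images (assumed left-eye-first)."""
    _, width = frame.shape[:2]
    half = width // 2
    return frame[:, :half], frame[:, half:]

# Stand-in for one decoded frame from the combined stream.
frame = np.zeros((480, 1280, 3), dtype=np.uint8)
left, right = split_stereo(frame)
```

Because the two camera streams are not synchronised, the left and right halves of a frame may have been captured at slightly different times, which any stereo matching stage has to tolerate.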
Observational data show that human eye saccades average 160°/s (Abrams et al., 1989) but can reach a maximum angular velocity of approximately 900°/s (Wilson et al., 1993); thus the robotic analogue cannot capture the full speed of the human visual system. However, 300°/s is satisfactory for the experiments planned.
The neck of the robot can rotate about one axis via a Corona DS558HV servo, offering a range of 160° and a top no-load angular velocity of 300°/s (source: hobbyking.com). This velocity will be lower in practice due to the mass of the Owl head, but is still satisfactory for target tracking. The servos are controlled by the Pi compute module; however, the PWM drive values are generated by the host computer.
The Pi compute module runs a Python script that opens a TCP/IP socket over the USB connection. The script waits for a 24-byte packet containing five 4-digit decimal integers as a space-separated ASCII string; these are the new servo positions instructed by the host computer. Software limits are applied to these new positions so that the servos cannot be driven to a state where the camera ribbon cables would be damaged.
The host computer runs the main software, programmed in C++ with the OpenCV 3.1 library. This setup allows faster computation than using the on-board Pi compute module alone. The cameras were calibrated using OpenCV functions to correct for intrinsic distortions.
GitHub resources for the Plymouth Owl project.
Dr Ian Howard
Associate Professor Computational Neuroscience