Sunday, February 22, 2015

The Inner Voice of Unmanned System Control

Introduction
The innovative technology from Think-A-Move (TAM) called SPEAR comprises an in-ear microphone that captures speech inside the ear canal, which suppresses background noise while capturing voice commands. SPEAR is designed to work alongside a joystick-operated control unit and a portable laptop, and it is currently undergoing field testing.
    Traditionally, Unmanned Ground Vehicles (UGV) are tele-operated through a joystick-based Operator Control Unit (OCU), which tends to be big and bulky, requiring the operator to be confined to a military vehicle or some other stationary position. Moreover, joystick control of the robot requires the operator to employ both of his hands, leaving him defenseless in case of an attack (Chhatpar, Blanco, Czerniak, Hoffman, Juneja, Pruthi, Liu, Karlsen, & Brown, 2009). Traditional ground station control systems require the operator to be secured when operating in the hostile environments typically encountered in combat scenarios. Furthermore, the operator's mobility and situational awareness are degraded; in combat situations, that degradation could prove fatal. Augmenting traditional joystick-actuated control units with SPEAR increases the operator's mobility and situational awareness.


How it works
    The system design provides 30 dB of passive noise reduction, enabling it to work well in high-noise environments where the performance of traditional speech systems using external microphones degrades significantly; it also utilizes a proprietary speech recognition engine to process the detected signal, along with additional algorithms that maintain a high level of accuracy in high-noise environments. The performance of traditional speech recognition systems, which utilize an external microphone, degrades significantly as ambient noise levels increase, because of the resulting decrease in signal-to-noise ratio. Given the loud noise levels typically experienced in a military operational environment, this significantly limits the practical application of traditional speech recognition systems (Chhatpar, et al., 2009).
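    To put that figure in context, decibels map to power ratios logarithmically, so 30 dB of passive attenuation is a thousandfold cut in ambient noise power, and every decibel of noise removed at the ear is a decibel of signal-to-noise ratio recovered. A quick back-of-the-envelope sketch (the 5 dB ambient figure is an arbitrary illustration, not a measured SPEAR value):

```python
# Back-of-the-envelope: what 30 dB of passive noise reduction buys.
# Decibels relate power ratios logarithmically: dB = 10 * log10(P1 / P2).

def db_to_power_ratio(db: float) -> float:
    """Convert a decibel value to a linear power ratio."""
    return 10 ** (db / 10)

attenuation_db = 30  # SPEAR's quoted passive noise reduction
print(f"{attenuation_db} dB cuts ambient noise power by "
      f"{db_to_power_ratio(attenuation_db):.0f}x")   # -> 1000x

# Each dB of noise removed is a dB of SNR gained at the microphone, so a
# command heard at 5 dB SNR outside the ear canal would sit near 35 dB SNR
# inside it (idealized; real attenuation varies with frequency and fit).
ambient_snr_db = 5
print(f"{ambient_snr_db} dB ambient SNR -> "
      f"~{ambient_snr_db + attenuation_db} dB in-ear")
```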
    Once SPEAR receives voice commands, those commands are processed by a portable notebook that the operator can carry in a backpack. A wired connection carries the signal from the TAM earpiece and ends in a standard 3.5mm audio jack that can be plugged into a computer soundcard (Brown, Blanco, Czerniak, Hoffman, Hoffman, Juneja, Ngia, Pruthi, & Liu, 2010). The commands are processed and relayed to the UGV, which can execute voice commands from SPEAR and commands from the traditional joystick-based Operator Control Unit at the same time.
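    Since the earpiece presents itself to the notebook as an ordinary soundcard input, the capture side of the pipeline is conceptually simple. The sketch below shows that loop under stated assumptions: it uses the sounddevice library for capture, and recognize_command is a stub standing in for TAM's proprietary recognition engine, which is not publicly documented:

```python
# Minimal sketch of the notebook side of the pipeline: record a short clip
# from the default soundcard input (where the earpiece's 3.5mm jack plugs in)
# and hand it to a recognizer.
import sounddevice as sd

SAMPLE_RATE = 16_000   # a common rate for speech recognition
CLIP_SECONDS = 2.0

def recognize_command(audio) -> str:
    """Stub for the proprietary TAM speech engine."""
    return "take picture"  # canned result, for illustration only

def capture_clip():
    """Record one clip from the default input device."""
    frames = int(SAMPLE_RATE * CLIP_SECONDS)
    clip = sd.rec(frames, samplerate=SAMPLE_RATE, channels=1)
    sd.wait()  # block until the recording completes
    return clip

if __name__ == "__main__":
    command = recognize_command(capture_clip())
    print("Recognized:", command)  # this string would be relayed to the UGV
```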


Challenges
    Research conducted in 2010 demonstrated how important it is to tailor speech commands to the target audience. Before training, fewer than 10% of the commands the Soldiers thought should be used matched the commands that were actually programmed into the speech-control system. Even after training and using many of the commands during a simulation task, only 34% of the Soldiers remembered the commands that the system designers had programmed. Commands that were initially intuitive ("Take picture" and "Label alpha") were correctly used by 72% and 83% of the Soldiers, respectively, after training. Conversely, less intuitive phrases such as "Activate exclusion zone" were not remembered by any of the Soldiers, even after training (Redden, Carstens, & Pettitt, 2010).
    Another area of concern was differentiating between casual speech and speech commands. Many of the commands, like "take picture," could plausibly come up in conversation with other soldiers. Having the UGV execute unintended commands during missions could be inconvenient or detrimental depending on the situation. For instance, mistakenly triggering the command "cut the wire" while explaining the details of the mission to a superior could have dire consequences during an explosive ordnance disposal operation.


Recommended Improvements
    Improvements to the command database are already in progress. Another recommended improvement is the use of a "begin command" keyword and an "end command" keyword; an uncommon word that would not appear in everyday speech would be most suitable, so that commands are not activated unintentionally. Feedback from the SPEAR software could also request confirmation for sensitive commands, as in the "cut the wire" example above.
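    A minimal sketch of that gating logic, assuming a hypothetical begin keyword ("osprey"), end keyword ("execute"), and a confirmation callback for sensitive commands; none of these names come from the SPEAR papers:

```python
BEGIN_WORD = "osprey"   # an uncommon word, unlikely in casual speech
END_WORD = "execute"
SENSITIVE = {"cut the wire"}

def gate_commands(words):
    """Yield only phrases spoken between the begin and end keywords."""
    buffer, armed = [], False
    for word in words:
        if word == BEGIN_WORD:
            armed, buffer = True, []
        elif word == END_WORD and armed:
            yield " ".join(buffer)
            armed = False
        elif armed:
            buffer.append(word)

def dispatch(command, confirm):
    """Require explicit confirmation before sending sensitive commands."""
    if command in SENSITIVE and not confirm(command):
        return "aborted"
    return f"sent to UGV: {command}"

def spoken_confirm(cmd: str) -> bool:
    print(f"UGV: confirm '{cmd}'?")  # stands in for a spoken read-back
    return True

# "cut the wire" in casual conversation is ignored; only the keyword-framed
# instance reaches the UGV, and only after confirmation.
speech = "we may cut the wire later osprey cut the wire execute".split()
for cmd in gate_commands(speech):
    print(dispatch(cmd, confirm=spoken_confirm))
```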
    The inclusion of additional user feedback channels such as the Tactile Situational Awareness System, head tracking, gesture recognition sensors, and see-through augmented reality displays like Google Glass or Microsoft HoloLens would free up a soldier's hands for personal defense and increase the soldier's situational awareness. Utilizing a see-through display, the soldier would be able to see through his own eyes and through the camera of the UGV at the same time. Object recognition software could highlight and/or magnify objects of concern seen by the UGV or the soldier and render them on the see-through glasses. For instance, a database of objects commonly used to hide Improvised Explosive Devices (IED) could be checked against the background to identify likely threats at the discretion of the operator. The operator could command the UGV camera to track his head movements during intricate operations like defusing an IED. Tactile feedback could alert the operator when the UGV is approaching obstacles or dangerous rollover angles. Gesture recognition would allow the operator to use hand signals to control the UGV when the tactical situation dictates silence. Each of these technologies is still in its infancy, and complementary integration strategies for each control channel should be considered during their continued development.
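    The tactile channel in particular reduces to simple threshold logic. A small sketch, with made-up thresholds rather than values from any fielded system:

```python
# Hypothetical tactile-alert thresholds; real values would come from the
# UGV's rollover envelope and sensor suite.
MAX_SAFE_TILT_DEG = 25.0
MIN_OBSTACLE_M = 0.5

def tactile_alerts(roll_deg, pitch_deg, obstacle_m):
    """Return the haptic warnings to fire for the current UGV state."""
    alerts = []
    if max(abs(roll_deg), abs(pitch_deg)) > MAX_SAFE_TILT_DEG:
        alerts.append("rollover warning")   # one vibration pattern
    if obstacle_m < MIN_OBSTACLE_M:
        alerts.append("obstacle warning")   # a distinct pattern
    return alerts

print(tactile_alerts(roll_deg=28.0, pitch_deg=3.0, obstacle_m=1.2))
# -> ['rollover warning']
```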


References
Brown, J., Blanco, C., Czerniak, J., Hoffman, B., Hoffman, O., Juneja, A., Ngia, L., Pruthi, T., & Liu, D. (2010). Soldier experiments and assessments using SPEAR speech control system for UGVs. Proceedings of SPIE, 7664(1). doi:10.1117/12.852507

Chhatpar, S. R., Blanco, C., Czerniak, J., Hoffman, O., Juneja, A., Pruthi, T., Liu, D., Karlsen, R., & Brown, J. (2009). Field experiments using SPEAR: A speech control system for UGVs. Proceedings of SPIE, 7332(1). doi:10.1117/12.818854

Glass. (n.d.). Retrieved February 22, 2015 from http://www.google.com/glass/start/

Microsoft HoloLens. (n.d.). Retrieved February 22, 2015 from http://www.microsoft.com/microsoft-hololens/en-us

Redden, E. S., Carstens, C. B., & Pettitt, R. A. (2010). Intuitive speech-based robotic control. Aberdeen Proving Ground, MD: Army Research Laboratory, Human Research and Engineering Directorate.

Sunday, February 8, 2015

NASA’s Global Hawk Pacific Data Delivery

NASA's Global Hawk Pacific (GloPac) mission was the first mission to use the Global Hawk aircraft to study trace gases, aerosols, and the dynamics of the upper troposphere and lower stratosphere (Dunbar), carrying ten different exteroceptive instrument packages.

Instrument Packages
Airborne Compact Atmospheric Mapper – Two spectrometers to measure how sunlight is absorbed and scattered throughout the atmosphere, and two high-definition cameras to identify cloud types and features on Earth's surface.

High Definition Video System – Forward-looking, time-lapsed imagery to identify cloud types and provide situational awareness for the plane, allowing the mission team to change altitude and course to investigate interesting atmospheric phenomena.

Microwave Temperature Profiler – This radiometer detects naturally occurring microwave emissions from oxygen molecules in the atmosphere, a measurement that is translated into a picture of the temperature field above, at, and below the flight path of the plane.

Focused Cavity Aerosol Spectrometer and Nucleation-Mode Aerosol Size Spectrometer – These spectrometers measure the size distribution and abundance of aerosols. Aerosols play an important but incompletely understood role in climate change and atmospheric dynamics.

Ultra-High Sensitivity Aerosol Spectrometer – This spectrometer looks down a column of air from the plane to Earth's surface and measures the properties of light in the atmosphere to determine the concentration and size of aerosol particles.

Unmanned Aerial System Hygrometer – This advanced form of hygrometer uses a continuous beam of laser light and two mirrors to sense the amount of water vapor present in the air. Water vapor is a potent greenhouse gas.

UAS Ozone – NOAA Unmanned Aerial System Ozone Instrument – This instrument directly samples the ozone in the atmosphere, taking in air from outside the aircraft and passing it between a lamp that emits ultraviolet (UV) radiation and a UV detector.
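The physics behind that lamp-and-detector arrangement is the Beer-Lambert law: ozone absorbs strongly near the lamp's 254 nm wavelength, so the drop in intensity across the sample cell gives the ozone concentration directly. A sketch of the conversion, with an illustrative cell length and intensity readings rather than NOAA's actual instrument parameters:

```python
import math

SIGMA_O3 = 1.15e-17     # O3 absorption cross-section near 254 nm, cm^2/molecule
CELL_LENGTH_CM = 30.0   # hypothetical absorption-cell length

def ozone_number_density(i_sample, i_reference):
    """O3 molecules per cm^3 from sample vs. ozone-free reference intensity."""
    # Beer-Lambert: I = I0 * exp(-sigma * N * L)  =>  N = ln(I0/I) / (sigma*L)
    return math.log(i_reference / i_sample) / (SIGMA_O3 * CELL_LENGTH_CM)

print(f"{ozone_number_density(i_sample=0.97, i_reference=1.00):.2e} molecules/cm^3")
```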

Unmanned Aerial System Chromatograph for Atmospheric Trace Species – This instrument collects air samples and uses two gas chromatographs to separate out different molecules and detect the presence and amount of greenhouse gases, ozone-depleting gases, carbon monoxide, and hydrogen.

Cloud Physics Lidar – This instrument pulses laser light into the atmosphere and observes the reflections – a process known as light detection and ranging, or LIDAR – to reveal the structure and brightness of clouds and aerosols.
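The "ranging" half of lidar is time-of-flight arithmetic: the pulse travels to the cloud layer and back at the speed of light, so the distance is half the round-trip time multiplied by c. For example:

```python
C_M_PER_S = 299_792_458.0  # speed of light

def range_from_echo(round_trip_s: float) -> float:
    """Distance to the reflecting layer from pulse round-trip time."""
    return C_M_PER_S * round_trip_s / 2

# An echo arriving 80 microseconds after the pulse leaves the aircraft:
print(f"{range_from_echo(80e-6) / 1000:.1f} km")  # -> ~12.0 km below the plane
```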

Meteorological Measurement System – This package of instruments measures atmospheric temperature, pressure, air turbulence and the direction and speed of the winds (both horizontally and vertically) immediately around the plane. (Dunbar)

Global Hawk's Flight Characteristics
Global Hawk's maximum endurance of 42 hours, on-station endurance of 24 hours, range of 3,000 NM, and maximum altitude of 65,000 feet, which is above significant weather occurrences (Ivancic and Sullivan), enable it to monitor the development of weather patterns over the Pacific Ocean. Electrical sensors are powered by a 28 VDC, 186.3 A (5.2 kW) engine-driven generator and a 115 VAC, 3-phase, 400 Hz, 71.8 A/phase (8.3 kVA) hydraulically driven generator (NASA Armstrong…). The electricity produced by these two generators powers all of the electrical sensors on the Global Hawk.
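The parenthetical power ratings follow directly from the voltage and current figures, which makes for a quick sanity check on the fact sheet's numbers:

```python
# DC generator: 28 VDC at 186.3 A
print(f"{28.0 * 186.3 / 1000:.1f} kW")        # -> 5.2 kW

# AC generator: 115 VAC at 71.8 A per phase (3-phase, 400 Hz)
print(f"{115.0 * 71.8 / 1000:.1f} kVA/phase")  # -> 8.3 kVA
```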

Data Delivery
The Global Hawk mission used a 2 Mbps bidirectional Ku-band satellite link. The system was capable of 50 Mbps, but the cost of operating at such rates was prohibitive (Ivancic and Sullivan). To deliver data to ground stations, the Global Hawk used a method called store and forward. The aircraft network uses a standard Ethernet TCP/IP LAN with airworthy switches using 10/100T ports. The Link Module system acts as a router between the aircraft and Global Hawk Operations Center networks; it also acts as an on-board file server and database as well as a wide-band router (Sorenson). The data collected from the instrument packages is stored internally within the aircraft and transferred to a ground station whenever Ku-band signal coverage supports the transfer, so scientists can evaluate the data as close to real time as possible. Data is backed up onboard in case of signal failure or corruption.
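A minimal sketch of the store-and-forward pattern described above, assuming hypothetical record/forward interfaces (the Link Module's file-server internals are not detailed in the public guide): every sample is committed to onboard storage first, and nothing is discarded until the ground station acknowledges it.

```python
from collections import deque

class StoreAndForward:
    """Toy model of the onboard buffer-then-relay behavior."""

    def __init__(self):
        self.backlog = deque()  # stands in for the onboard file server

    def record(self, sample: bytes):
        self.backlog.append(sample)  # always store before forwarding

    def forward(self, link_up: bool, send) -> int:
        """Drain the backlog while the Ku-band link holds; return count sent."""
        sent = 0
        while link_up and self.backlog:
            if not send(self.backlog[0]):  # send() returns True on ground ACK
                break                      # link dropped mid-transfer; retry later
            self.backlog.popleft()         # acknowledged, safe to discard
            sent += 1
        return sent

buf = StoreAndForward()
buf.record(b"instrument-sample-001")
print(buf.forward(link_up=True, send=lambda s: True))  # -> 1

# At the mission's 2 Mbps rate, draining a 1 GB backlog after a coverage gap
# would take roughly 8e9 bits / 2e6 bps ~= 67 minutes.
```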

Recommended Improvements
The data relay operation of the GloPac Global Hawk is heavily reliant on satellite coverage to transfer data to ground stations throughout its flight. The Global Hawk actually loses satellite signal coverage around 75 degrees north latitude and during satellite handoff. Because of the store and forward protocol, the Global Hawk is able to retain data and transfer it once satellite communications are restored. However, if the Global Hawk were lost during this time, the data might not be recoverable.

One possible solution is the use of smaller high-endurance Unmanned Aerial Systems (UAS) that can act as network nodes between the Global Hawk and ground stations. These small UASs could also act as data system backups by utilizing the same store and forward protocols that the Global Hawk uses. If one of the UASs were lost, the remaining UASs would still be able to retain and transfer valuable data. Utilizing UAS platforms would decrease or eliminate reliance on satellite network coverage, thereby reducing the high cost of operational data transfer.

References:
Dunbar, Brian. (August 2013). GloPac 2010. Retrieved February 7, 2015 from http://www.nasa.gov/externalflash/Glopac/

Ivancic, W. D., & Sullivan, D. V. (January 2011). Delivery of Unmanned Aerial Vehicle Data.

Sorenson, C. (November 2008). Global Hawk Payload Network Communications Guide.

NASA Armstrong Fact Sheet: Global Hawk high-altitude, long-endurance science aircraft. (February 28, 2014). Retrieved February 7, 2015 from http://www.nasa.gov/centers/armstrong/news/FactSheets/FS-098-DFRC.html


Sunday, February 1, 2015

Unmanned System Sensor Integration and Placement

Sensor placement is a critical design decision driven by the objectives an unmanned system will be tasked to perform.
Blade Nano QX

First person view (FPV) racing is intense. A small remotely piloted unmanned aeronautical system that can zip through obstacles at 100 mph, the Blade Nano QX is capable, durable, and affordable.
The heart of the Blade Nano QX is a multisensor 4-in-1 module that receives radio signals from a handheld transmitter. It provides electronic speed control, attitude gyros, and control mixing, which together provide a stable flying platform for the user. When the user makes a control input, the mixing control references the UAS's current attitude via the internal gyros and then adjusts the control outputs needed to accomplish the desired input. The mixing control also limits control inputs from the user: in stability flight mode, it caps user inputs at pitch attitudes and bank angles beyond which the UAS could enter an unusual attitude and depart controlled flight. The limiting feature can be switched into agility mode so the user can fly aerobatic flight profiles and achieve high-gain control response, in contrast to stability mode.
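In effect, stability mode is an attitude clamp sitting between the stick and the mixer. A sketch of that limiting behavior, with a 45-degree envelope chosen purely for illustration (Blade does not publish the actual limit):

```python
def limit_attitude(commanded_deg: float, agility_mode: bool,
                   envelope_deg: float = 45.0) -> float:
    """Clamp a commanded pitch or bank angle while in stability mode."""
    if agility_mode:
        return commanded_deg  # full authority for aerobatic profiles
    return max(-envelope_deg, min(envelope_deg, commanded_deg))

print(limit_attitude(70.0, agility_mode=False))  # -> 45.0 (clamped)
print(limit_attitude(70.0, agility_mode=True))   # -> 70.0 (unrestricted)
```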
The eyes of the Blade Nano QX are the Spektrum™ ultra micro FPV camera. The FPV camera is the main sensor that enables the user to fly in FPV. This camera is capable of 720p HD video without transmission lag to the user's headset via the SpiroNET circular polarized antenna system. The Fat Shark Teleporter V4 5.8GHz headset with digital head tracking adjusts the camera in reference to the user's head movement via a gimbal. The panning of the FPV camera allows the user to scan ahead of the Blade Nano QX for obstacles along the desired flight path. The camera is placed inside the nose of the UAS, along the centerline of the airframe, which makes it possible for the user to fly at such a high rate of speed without smashing into obstacles: centering the camera lets the user estimate obstacle clearances and fly the best path.
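Head tracking of this kind amounts to mapping headset yaw and pitch onto the gimbal's travel and clamping at its mechanical stops. A sketch with hypothetical travel limits (the Teleporter V4's actual gimbal range is not quoted here):

```python
PAN_LIMIT_DEG = 60.0    # hypothetical gimbal travel limits
TILT_LIMIT_DEG = 30.0

def head_to_gimbal(head_yaw_deg: float, head_pitch_deg: float):
    """Map headset angles to clamped gimbal pan/tilt commands."""
    clamp = lambda v, lim: max(-lim, min(lim, v))
    return (clamp(head_yaw_deg, PAN_LIMIT_DEG),
            clamp(head_pitch_deg, TILT_LIMIT_DEG))

print(head_to_gimbal(75.0, -10.0))  # -> (60.0, -10.0): pan hits its stop
```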
The Blade 4-channel 2.4GHz transmitter enables the user to remotely control the Blade Nano QX. An LED light on the transmitter indicates when the signal between the transmitter and the receiver is lost, and the Blade Nano QX also has a low-battery sensor/indicator on the receiver. All of these sensors, integrated in the Blade Nano QX, enable the user to safely and effectively fly racing circuits at outrageous speeds through obstacles from a first person perspective. But it's the camera placement that makes it possible.

DJI Phantom 2 Vision+
"The most important thing, of course, if you are flying to shoot, is to see what your composition is," -filmmaker Philip Bloom (Hansen)
When shooting aerial video and stills below 400 ft above ground level (AGL), the most important thing is stability. Not only is the Phantom 2 Vision+ stable, it is also one of the easiest remotely piloted UASs available. The Phantom 2 is so easy to use that everyone, not just the hardcore hobbyist, can use it to capture stunning 1080p videos and 14-megapixel still photographs.
The Phantom 2 Vision+ uses an f/2.8 lens paired with a 14-megapixel 1/2.3-inch CMOS sensor that can capture Adobe DNG raw and JPEG images, and video at up to 1080p at 30fps and 720p at 60fps. The user can also control ISO, exposure compensation, and white balance, and choose from a 140-, 120-, or 90-degree field of view (Top-notch). Videos and photographs are stored on a 4GB removable SD card, and the camera is stabilized with a 3-axis gimbal. The placement of the camera underneath the UAS on the gimbal enables the Phantom 2 Vision+ to create professional videos and pictures, since this location keeps the frame of the UAS out of the shot.
The key piece that makes the Vision+ easy to fly for beginners is the built-in DJI Naza-M flight control system. It's made up of an inertial sensor, barometric altimeter, a compass, GPS, LED flight indicators, and a controller that gets them all to work together (Top-notch). The user can set up to 16 GPS waypoints via the Vision app, and the Phantom 2 Vision+ can be tracked on a smartphone via GPS and a digital map, which also displays no-fly areas like airports. If the Phantom 2 Vision+ loses the signal from the handheld transmitter, it will continue a mission if operating in mission mode with GPS waypoints. If the transmitter signal is lost while the Phantom 2 Vision+ is being operated manually, it will trigger 'Return-to-Home', meaning it will automatically fly back to its takeoff point and land safely (Phantom 2 Vision+).
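That lost-link behavior reduces to a small decision rule; here it is as an illustrative sketch (the mode names mirror the manual's description, but the function itself is not DJI firmware):

```python
def on_transmitter_loss(mode: str, waypoints_remaining: int) -> str:
    """Failsafe choice when the control link drops."""
    if mode == "mission" and waypoints_remaining > 0:
        return "continue waypoint mission"   # autonomous plan is still valid
    return "return to home and land"         # manual flight: RTH failsafe

print(on_transmitter_loss("mission", waypoints_remaining=5))
print(on_transmitter_loss("manual", waypoints_remaining=0))
```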
The Phantom 2 Vision+ is not the most expensive system, nor does it have all the top-of-the-line features on the market. However, it is an extremely easy-to-use platform, and that usability enables new users to fly and create stunning high-quality videos and pictures.
References: 
Hansen, Eric. The Best Drones. (January 8, 2015). Retrieved January 31, 2015 from http://thewirecutter.com/reviews/best-drones/

Nano QX FPV RTF with SAFE® Technology. (n.d.). Retrieved January 31, 2015 from http://www.horizonhobby.com/nano-qx-fpv-rtf-with-safe-technology-blh7200
Nano QX Quad-Copter Manual. (January 7, 2013). Retrieved January 31, 2015 from www.horizonhobby.com/pdf/BLH7600-Manual_EN.pdf

Phantom 2 Vision+. (n.d.). Retrieved January 31, 2015 from http://www.dji.com/product/phantom-2-vision-plus/feature 
Phantom 2 Vision+ User Manual (EN) v1.8. (January 30, 2015). Retrieved January 31, 2015 from http://www.dji.com/product/phantom-2-vision-plus/download

Top-notch eye in the sky. (October 14, 2014). Retrieved January 31, 2015 from http://www.cnet.com/products/dji-phantom-2-vision-plus/