CHIPS Articles: Human Presence Detection

Human Presence Detection
The Navy has tri-service responsibility for EOD-related science and technology development.
By Ann Dakis - October-December 2011
The ability of man-portable robots to effectively address life-threatening hazards like improvised explosive devices (IEDs) on the battlefield has led to widespread user acceptance and fielding in explosive ordnance disposal (EOD) missions. The Navy has tri-service responsibility for EOD-related science and technology development, and the Space and Naval Warfare Systems Center Pacific supports the naval EOD technology division of the Naval Sea Systems Command in the execution of this tasking.

"Today's warfighter considers the robot an asset, since it saves lives, but at the same time, the current operator control unit is perceived as a liability," said Bart Everett, SSC Pacific's technical director for robotics. "From a command and control perspective, the need to teleoperate these systems severely limits their applicability in missions other than EOD."

Teleoperating a robot is extremely fatiguing, and control equipment is too heavy and cumbersome for extended dismounted operations. In addition, the operator becomes fully immersed in directing such a vehicle at the expense of his or her situational awareness, which can be extremely dangerous under battlefield conditions.

Range and line-of-sight restrictions of radio links further complicate the problem, and when communications are lost, the mission is effectively over, and the asset must be manually retrieved.

For these reasons, SSC Pacific's unmanned systems branch is heavily focused on making a robot a more intelligent and effective asset, and the operator control unit less of a liability. According to Everett, "The ultimate goal is to eliminate the need for a robot-specific controller altogether."

Smarter robots and a reduced control burden will expand the use of unmanned systems across a much broader spectrum of military operations than just EOD. The branch has already made significant progress toward these goals in the past few years.

The Autonomous Robotic Mapping System (ARMS), for example, can automatically explore an unknown or hostile environment while building a highly accurate and detailed map. A scanning laser rangefinder measures distance to all surrounding objects within a 360-degree field of view, and stereo cameras assist with three-dimensional rendering. No human guidance is necessary other than initial high-level direction telling the robot where to search.
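The basic idea behind laser-based mapping can be illustrated with a minimal sketch. This is not ARMS itself, only a toy occupancy-grid update under simplified assumptions: the robot's pose is known, and one range reading arrives per degree of bearing.

```python
import math

def mark_scan(grid, pose, ranges, cell_size=0.5):
    """Mark the returns of one 360-degree laser scan into a 2D occupancy
    grid, stored as a dict mapping (col, row) cells to 'occupied'.
    `pose` is the robot's (x, y) in meters; `ranges[i]` is the distance
    measured at bearing i degrees. Names and parameters are illustrative."""
    x0, y0 = pose
    for bearing_deg, dist in enumerate(ranges):
        theta = math.radians(bearing_deg)
        hit_x = x0 + dist * math.cos(theta)  # world coordinates of the return
        hit_y = y0 + dist * math.sin(theta)
        cell = (int(hit_x // cell_size), int(hit_y // cell_size))
        grid[cell] = "occupied"
    return grid

# A uniform 2-meter scan from the origin marks a ring of occupied cells.
grid = mark_scan({}, (0.0, 0.0), [2.0] * 360)
```

A real mapper would also trace the free space along each beam and correct the pose as the robot moves, which is where the hard problems lie.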

"Current efforts include optimizing and testing these autonomous exploration and mapping behaviors in urban environments with multiple buildings and varying terrain," said Estrellina Pacis Rius, project manager of the urban environment exploration (UrbEE) project. "It is projected that future conflicts will increasingly occur in urban settings, so we are evaluating realistic use of these robots to support dismounted troops operating in such areas."

For example, urban settings pose challenges for line-of-sight communications and GPS-dependent navigation. If radio communication with the warfighter is lost, UrbEE-developed behaviors enable a robot to complete its search-and-map mission and return to the starting point to upload the results.

Another UrbEE capability is adaptive position estimation, which allows the robot to maintain an accurate estimate of its position without GPS.
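One ingredient of GPS-free positioning is dead reckoning from onboard odometry. The sketch below is a generic illustration of that idea, not UrbEE's actual algorithm; the function name and inputs (encoder distance, gyro turn) are assumptions for the example.

```python
import math

def dead_reckon(pose, distance_m, turn_rad):
    """Odometry-only pose update. `pose` is (x, y, heading); the robot
    reports how far it drove and how much it turned since the last
    update. Errors accumulate over time, which is why a fielded system
    must correct the estimate against landmarks or a map."""
    x, y, heading = pose
    heading += turn_rad
    x += distance_m * math.cos(heading)
    y += distance_m * math.sin(heading)
    return (x, y, heading)

# Drive 1 m east, turn 90 degrees left, drive 1 m north.
pose = dead_reckon((0.0, 0.0, 0.0), 1.0, 0.0)
pose = dead_reckon(pose, 1.0, math.pi / 2)
```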

Having a freshly generated floor plan of a previously unknown structure is a huge advantage, but if warfighters then have to enter the space, it is very important for them to know of any hazards. The robot must detect objects of tactical significance and annotate them on the map.

"When you ask warfighters for a prioritized list of what they want to know about… the No. 1 answer is always human presence," Everett said. "From a detection standpoint, humans have two obvious characteristics that can be exploited, in that we move around and we give off heat."

Inexpensive passive-infrared (PIR) motion sensors, or pyroelectric sensors, like those commonly used for home lighting control, exploit both of these features. To detect a human being, such a sensor must be sensitive to the infrared energy radiated at human body temperature. Human skin, at about 93 degrees Fahrenheit, radiates infrared energy with a peak wavelength between 9 and 10 micrometers, so the sensors are typically sensitive in the 8-to-12-micrometer range.
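Those wavelength figures follow directly from Wien's displacement law, which relates a surface's temperature to its peak emission wavelength. A quick check (the constant and conversion are standard physics, not anything specific to these sensors):

```python
# Wien's displacement law: lambda_peak = b / T,
# with b ~ 2898 micrometer-kelvins.
WIEN_B_UM_K = 2897.77

def peak_wavelength_um(temp_fahrenheit):
    """Peak thermal emission wavelength, in micrometers, for a surface
    at the given temperature in degrees Fahrenheit."""
    temp_kelvin = (temp_fahrenheit - 32.0) * 5.0 / 9.0 + 273.15
    return WIEN_B_UM_K / temp_kelvin

# Skin at ~93 F (~307 K) peaks near 9.4 micrometers, comfortably
# inside the 8-to-12-micrometer band PIR sensors are built for.
print(round(peak_wavelength_um(93.0), 1))  # -> 9.4
```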

Fused sensor solutions, such as color and thermal imagery, are used to detect and track humans. In Figure 1, the image on the left shows a thermal image overlaid directly on a color image. Regions that are likely to correspond to human skin or a thermal signature are highlighted in the fused image on the right.

The first robot to successfully demonstrate such a static motion detection capability was ROBART I, which was Everett's 1981 thesis project at the Naval Postgraduate School. ROBART I used a combination of infrared, optical, acoustical and vibration sensors. This research prototype laid the framework for subsequent robotic security efforts at SSC Pacific, and motion detection from stationary vehicles is now a common and mature technology.

"The fundamental problem is fairly obvious," Everett said. "If the robot is standing still, anything that moves could potentially be a human. But once the robot itself starts to move, everything its sensors 'see' appears in motion, and so this simplistic algorithm becomes ineffective."

This challenge is further complicated by the fact that the very nature of mobility introduces constantly changing variables that alter the physical relationships between a moving platform and its surroundings.

"To address these issues, we employ two-stage sensor fusion," Everett said. "The first stage uses a scanning laser to detect changes in range data, while the second stage processes thermal imagery to verify any potential human presence."

These complementary sensors have non-overlapping strengths and weaknesses, allowing the system to detect an intruder from a moving platform more reliably while minimizing the number of false and nuisance alarms. Figure 2 illustrates the power of the new algorithms to reject false positives in human presence detection.

After the robot builds a map of the area of interest, it can then detect anomalies in the environment and mark their locations on the map with an icon indicating a potential human presence. An operator can then click on the icon to view more detailed information and confirm whether a human is present. An example is shown in Figure 3.

On the battlefield, however, a robot must also be able to detect people who are not moving, and may in fact be hiding or otherwise occluded. Rius, who also oversees the human presence detection project, has been leading a team in developing such a capability since 2008, both for tactical purposes and for safe operation near pedestrians.

Collaborative work with Sarnoff Corp. (now SRI International) has been ongoing for the past three years to develop a compact, fused visual and thermal stereo camera payload optimized for detecting occluded individuals. The same payload can be used to follow a person's movements.

SSC Pacific scientists and engineers continue to advance robotic technology and artificial intelligence.

According to lead project engineer Greg Kogut, "There is increasing demand from dismounted Navy and Marine Corps warfighters for a leader-follower behavior for small- to medium-sized robotic vehicles. This scenario involves a robot following a particular human like a well-trained dog would do, while avoiding other people who might get in the way. Our human presence detection projects allow us to demonstrate meaningful progress towards such a capability."
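Once a person can be detected and tracked, the control side of leader-following reduces, in its simplest form, to holding a standoff distance and bearing. The sketch below is a generic proportional controller, not the project's software; the gains and the 2-meter standoff are made-up values for illustration.

```python
def follow_command(target_range_m, target_bearing_rad,
                   desired_range_m=2.0, k_lin=0.8, k_ang=1.5):
    """Proportional leader-follower sketch. Given the tracked person's
    range and bearing (radians, positive = left of the robot's nose),
    return (linear velocity m/s, angular velocity rad/s): speed up when
    too far, slow or stop when at the standoff, and turn to face them."""
    linear = k_lin * (target_range_m - desired_range_m)
    angular = k_ang * target_bearing_rad
    return linear, angular

# Person 4 m ahead and slightly to the left: advance and turn left.
v, w = follow_command(4.0, 0.2)
```

A fielded behavior would layer obstacle avoidance and person re-identification on top, which is where the human presence detection work feeds in.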

Follow SPAWAR on Facebook: www.facebook.com/spaceandnavalwarfaresystemscommand.

Ann Dakis is a staff writer in the public affairs office of SSC Pacific.

Figure 1. Fused sensor solutions, such as color and thermal imagery, are used to detect and track humans. The image on the left shows a thermal image overlaid directly on a color image. Regions that are likely to correspond to human skin or thermal signature are highlighted in the fused image on the right.

Figure 2. An example of urban test data illustrating rejected false positives (blue boxes) and accepted positives (white).

Figure 3. After a robot builds a map of the area of interest, it can then detect anomalies in the environment and mark the locations on a map with an icon indicating a potential human presence. An operator can then click on the icon to view more detailed information to confirm whether a human is present.

CHIPS is an official U.S. Navy website sponsored by the Department of the Navy (DON) Chief Information Officer, the Department of Defense Enterprise Software Initiative (ESI) and the DON's ESI Software Product Manager Team at Space and Naval Warfare Systems Center Pacific.

Online ISSN 2154-1779; Print ISSN 1047-9988