Flight Trials of a Rotorcraft Unmanned Aerial Vehicle Landing Autonomously at Unprepared Sites

Colin Theodore, Dale Rowley
San Jose State Foundation, Ames Research Center, CA

Adnan Ansar, Larry Matthies, Steve Goldberg
Jet Propulsion Laboratory, CA

David Hubbard
Brigham Young University, Provo, UT

Matthew Whalley
Aeroflightdynamics Directorate (AMRDEC), Ames Research Center, CA

ABSTRACT

This paper describes the flight testing and evaluation of technologies for the autonomous landing of a Yamaha RMAX helicopter at non-cooperative sites without the aid of GPS. The Yamaha RMAX used for these flight trials has been modified to include stereo cameras, a scanning laser, and an avionics payload. Machine vision stereo range mapping is used to generate an accurate terrain representation, and a safe landing area determination algorithm is used to select the safest landing point within the terrain. A machine vision self-localization system is used to measure the helicopter position during the descent and landing when GPS is unavailable. The software and hardware architecture of the flight system is presented and each system component is described. Results and lessons learned from the flight evaluation and optimization of the individual components are reported, as well as the overall system performance with respect to a set of objective metrics for the autonomous landing flight trials.

Presented at the American Helicopter Society 62nd Annual Forum, Phoenix, AZ, May 9-11, 2006. Copyright © 2006 by the American Helicopter Society, Inc. All rights reserved.

INTRODUCTION

With the intent of the US armed forces to field an increasing number of Unmanned Aerial Vehicles (UAVs), particularly Rotorcraft UAVs (RUAVs), there is a clear need for an autonomous landing capability at remote, unprepared (and possibly cluttered) sites. Such a capability would also need to include a self-localization component, since GPS signals may be intermittent or unavailable during the landing task due to electronic jamming or occlusion of GPS satellites. In addition to the difficulty of finding a suitable landing point and navigating to it without GPS, all terrain sensing, information processing and decision-making functions must be performed on board without the need for operator interaction. Typical operational scenarios that would benefit from such a capability include: perch-and-stare surveillance operations; ground loiter to conserve fuel and other system resources; precision supply delivery and Forward Arming and Refueling Point (FARP) operations; remote recovery of the RUAV; and forced landing contingencies resulting from on-board failures or loss of communication.

One method of landing RUAVs autonomously is to specifically prepare the landing site with instrumentation or markings. A landing system in current operational use is the UAV Common Automatic Recovery System (UCARS) developed by Sierra Nevada Corporation [1]. Derivatives of this system include the Tactical Automatic Landing System (TALS), used by the US Army's Shadow UAV, and UCARS-V2, which has seen success with Northrop Grumman's RQ-8A Fire Scout for ground and, more recently, shipboard landings. These systems use a ground-based millimeter-wave radar tracking system with a transponder mounted on the vehicle. They can provide automatic landing only at cooperative sites, since ground equipment is required and must be set up before the landing. Another approach is to mark the landing point with a pattern and have the onboard system track the pattern with sensors and algorithms. Researchers at the Jet Propulsion Laboratory (JPL) [2, 3] have demonstrated this capability by marking the landing point with a painted "H" target and using machine vision to identify and track the target for landing. For this demonstration, a combination of vision, Inertial Measurement Unit (IMU) and GPS measurements was used to navigate the helicopter to the landing pad.

When considering the task of landing at non-instrumented sites, various methods have been demonstrated for terrain sensing, including passive stereo ranging and structure from motion, laser scanning, and millimeter-wave radar scanning. For example, researchers at JPL [4] have demonstrated landing at a non-cooperative site using a vision-based approach with an on-board landing site selection algorithm. As another example, researchers at UC Berkeley have demonstrated a multiple-view motion estimation approach for building a terrain representation and selecting a suitable landing point [5]. This paper describes an integrated approach to the autonomous landing of RUAVs that uses: machine vision technologies for terrain sensing and self-localization without GPS; an advanced model-following flight control system with accurate flight-identified dynamic models; sensor integration with smooth switching between GPS and vision-based waypoint control; and an on-board mission manager for decision-making and coordination of the individual system components. This work was performed as part of the US Army's Precision Autonomous Landing Adaptive Control Experiment (PALACE) Army Technology Objective (ATO) that was carried out by the US Army Aeroflightdynamics Directorate (AFDD) at Ames Research Center. Supporting efforts were provided by the Army/NASA Autonomous Rotorcraft Project (ARP), Brigham Young University (BYU), and the Mobility and Robotics Group at JPL. Table 1 lists the quantitative metrics and the target performance objectives for the PALACE program. The first three metrics are the target constraint values on the landing site selection algorithm and are a function of the geometry of the Yamaha RMAX helicopter. The following metric specifies the landing accuracy and accounts for the amount of drift in vehicle position during the vision-based descent. The next requirement specifies that the feature-tracking algorithms should run with a processing time of less than 100 msec to produce a position estimate at 10 Hz.
The final two objectives specify that the Safe Landing Area Determination (SLAD) algorithm should run in under 5 seconds with an accuracy of greater than 98% in choosing a safe landing site. Initial work on the PALACE program independently demonstrated each of the core machine vision technologies [6], and validated their utility in both simulation and flight. This was followed by the construction of an integrated simulation environment [7] for the development of the landing procedures and integration of the machine vision algorithms, vehicle dynamics, and control laws. The performance of the individual machine vision algorithms was evaluated using the simulation. The simulation also provided a level of risk reduction when transitioning the PALACE landing technologies to flight trials.

Table 1. PALACE program performance metrics and objective values.

  Quantitative Metric            Project Objective
  Landing Site Size              < 7.0 m
  Landing Surface Slope          < 15 deg
  Landing Surface Roughness      < 10 cm
  Landing Accuracy               < 1.25 m
  Feature-Tracking Cycle Time    < 100 msec
  SLAD Calculation Time          < 5 sec
  SLAD Success Rate              > 98%

This paper describes the flight development, testing and validation of the autonomous landing technologies on a Yamaha RMAX RUAV. The in-flight performance of the individual technologies is presented, as well as the performance of the overall landing procedure. The first section of this paper presents the flight software and hardware architecture and describes each of the individual components. This includes the on-board mission manager element that coordinates each component of the system and provides all of the intelligence and decision-making capabilities to enable autonomous landings. The next section presents results from the flight trials on the Yamaha RMAX, including stereo range mapping, landing site selection, and self-localization without GPS. The final section presents concluding remarks and discusses the ability of the PALACE system to meet the set of objectives listed in Table 1.

PALACE SYSTEM ARCHITECTURE

Figure 1 shows a schematic of the hardware and software architecture used for the PALACE flight trials. Each element of this architecture is described in the following sections.

ARP RMAX Hardware and Sensors

The flight trials and demonstrations for the PALACE program were performed using a modified Yamaha RMAX that is part of the Autonomous Rotorcraft Project (ARP) [8]. The Yamaha RMAX is a small-scale helicopter with a rotor diameter of 3.12 m and an empty mass of 66 kg. The maximum payload is 28 kg.

Figure 1. PALACE flight hardware and software architecture. [Diagram: a ground station hosting the PALACE landing point selection interface communicates with the aircraft (RMAX) over 802.11g wireless telemetry, exchanging the operator-selected landing point, camera images, and landing point selection results. On board, the PALACE mission manager (cycling at 10 Hz) on the experimentation computer coordinates the landing site selection system and the monocular position estimation system, and sends the pseudo-GPS position, CLAW mode changes, and commanded waypoints to the CLAW block on the flight control computer, receiving the helicopter position (GPS) and attitudes in return. The flight control computer reads IMU measurements and GPS location from the sensors and issues actuator commands; the left and right B&W cameras and the SICK scanning laser feed the vision systems.]

Figure 2. ARP RMAX research aircraft.

The RMAX operated by ARP (Figure 2) has been modified to include an avionics payload and various sensors. The avionics payload includes: a navigation and flight control computer, an experimentation computer, an IMU, a GPS receiver, and radio communications equipment. The experimentation computer is a CompactPCI with a Pentium III running at 700 MHz and hosts the PALACE software and machine-vision processing.

Figure 3. ARP RMAX avionics payload and stub wing with left cameras.

A pair of monochrome 640x480 resolution stereo cameras is mounted on either end of a vibration-isolated articulated stub wing with a stereo baseline of 1.1 m. Figure 3 shows the camera arrangement on the left end of the stub wing with monochrome and color cameras (the PALACE system uses only the monochrome cameras for stereo ranging). Finally, a SICK scanning laser is

mounted under the nose to provide accurate distance measurements to the ground.

Figure 4. Flowchart of MPE processing. [Diagram: the left camera image from the monochrome digital camera feeds the machine vision feature tracker, which outputs a feature pixel location; together with the laser range to the center of the camera image from the SICK scanning laser, and the measured helicopter attitudes from the digital IMU, this drives the monocular position estimation algorithm, which outputs the pseudo-GPS position.]

ARP Flight Control Laws (CLAW)

The CLAW block of the flight architecture (see Figure 1) contains the inner-loop and outer-loop control laws, as well as the mode-switching elements. The inner loop provides the primary attitude stabilization and attitude control of the helicopter. The outer loop includes a waypoint navigation controller, which is made up of a path smoother and a path follower. When a list of waypoints is received by the CLAW block (from the operator or from the PALACE mission manager), it is first passed to the path smoother, which produces a larger list of waypoints describing a smooth path between the original waypoints. The path follower then takes the desired smooth path and provides a steady stream of commands to the inner loop to guide the helicopter along the path.

The CLAW block provides a number of different modes for localization during waypoint navigation. The default is to use the signal returned by the on-board GPS receiver for navigation. A second mode allows an external position to be input into the CLAW block, which is then used for navigation. In the PALACE architecture, this external position is the 'pseudo-GPS' signal from the machine-vision position-estimation system. A third mode uses inertial navigation to provide a position estimate based on the attitude and acceleration measurements. Each of these navigation modes is selectable by sending a request to the CLAW block, which produces a smooth, transient-free transition to the selected mode.

Monocular Position Estimation (MPE) System

The MPE system provides a helicopter position that is independent of GPS and is used for navigation during the landing when GPS is assumed to be unavailable.
The MPE system takes the left camera images, the laser range to the center of the camera image, and information about the pose of the camera to measure the position of the helicopter relative to a fixed point on the ground that is being tracked over time.

Figure 5. Initialization of MPE system.

Figure 6. Determination of pseudo-GPS position.

Figure 4 shows a schematic of the sensor data processing used to produce the position estimate. The machine-vision feature-tracking algorithm was developed by JPL [9] and is able to lock onto surface features and accurately track them from frame to frame. The MPE system runs in real time at 10 Hz and produces a pseudo-GPS position estimate. The rate and format of the pseudo-GPS signal are the same as those of the signal from the GPS receiver, so no changes to the control system were required to integrate the machine-vision position estimate.

Figures 5 and 6 show how the MPE system calculates the position estimate. The MPE system is initialized (Figure 5) with the location of the intended landing point (or any arbitrary ground point). The location of this ground reference point is estimated by adding the vision-estimated offset (MPE offset) to the current helicopter location. The machine-vision feature-tracker then locks onto this reference point within the camera images and tracks it over time. In following software cycles, the MPE system estimates the helicopter position relative to the reference ground point being tracked (Figure 6). This offset is then added to the position of the fixed landing point to generate the pseudo-GPS signal. The pseudo-GPS signal is then sent to the control laws, which use it for navigation.

Landing Site Selection System

The landing site selection system used for the PALACE program consists of two operations. The first operation involves creating a stereo range map to represent the surface terrain profile. The stereo ranging algorithm utilizes images from a pair of stereo cameras along with pose information for the cameras. The stereo range map is created by determining pixel disparities between features in the left and right camera images (Figure 7). The second operation in the landing site selection process utilizes a Safe Landing Area Determination (SLAD) algorithm. This algorithm applies a set of landing point constraints to the range map to find all safe landing regions, and then to choose the optimum landing point. Three constraints on the landing point are applied to ensure the surface slope is below a given tolerance, any surface obstructions or roughness are below a given size, and the range to the nearest obstacle is greater than a given distance. For the ARP RMAX the slope limit is set to a maximum of 15 degrees, and the safe distance constraint (open-area diameter) is set to a minimum of 7.0 m.
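As a geometric sketch of this calculation (illustrative only: the pinhole intrinsics, the frame conventions, and the use of the laser range to scale the tracked-feature ray are assumptions, not the flight implementation of [9]), the pseudo-GPS position can be recovered from the tracked pixel, the laser range, and the IMU-derived camera attitude:

```python
import numpy as np

def pixel_to_unit_ray(u, v, fx, fy, cx, cy):
    # Back-project a pixel to a unit ray in the camera frame (pinhole model).
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return ray / np.linalg.norm(ray)

def mpe_pseudo_gps(ref_point_ned, u, v, laser_range, R_ned_from_cam, intrinsics):
    """Pseudo-GPS helicopter position from a tracked ground reference point.

    ref_point_ned   -- known NED location of the tracked reference point
    (u, v)          -- pixel location of the tracked feature
    laser_range     -- laser range used to scale the feature ray (an
                       approximation when the feature is off-center)
    R_ned_from_cam  -- camera-to-NED rotation built from the measured
                       helicopter attitudes and camera mount angles
    intrinsics      -- (fx, fy, cx, cy) pinhole camera parameters
    """
    ray_cam = pixel_to_unit_ray(u, v, *intrinsics)
    # Offset from the camera to the tracked ground point, rotated into NED.
    offset_ned = R_ned_from_cam @ (laser_range * ray_cam)
    # Helicopter position = reference point minus the camera-to-point offset.
    return ref_point_ned - offset_ned
```

For a camera looking straight down at the reference point from 30 m, this returns a position 30 m above the point, as expected.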

The roughness constraint defines the maximum allowable size of ground features or obstructions for landing. Any terrain roughness or surface objects physically larger than the roughness constraint value will be recognized as obstacles by the SLAD algorithm and the surrounding terrain will be rejected as a safe landing area. Since the resolution of stereo ranging varies with the physical distance to the ground surface, the value of the roughness constraint must be set dynamically based on the height above ground level (AGL). For example, from 30 m AGL, the system is only capable of consistently resolving objects that are greater than about 2 m in height. Therefore, the landing point selection system has to be run at different altitudes during the descent with successively tighter roughness constraint values. The descent profile used for PALACE is shown in Figure 8, starting with an initial landing point being chosen from 30 m AGL. The landing point is then verified and refined at 24 m, 18 m, 12 m, and 6 m AGL with successively tighter roughness constraint values as the altitude decreases.

Figure 7. Stereo images and resulting range map.

Figure 8. Landing point selection descent profile:
  30 m  -- Initial SLAD landing point selection
  24 m  -- Refine landing point selection
  18 m  -- Refine landing point selection
  12 m  -- Verify selected landing point
  6 m   -- Verify selected landing point
  Ground

Figure 9 shows an example of the SLAD operator interface. This display shows results from the safe landing site selection system, in this case with two obstacles in the camera field of view. The yellow region shows the portion of the camera image where no range data are available due to non-overlapping left-right image features or the close proximity of the edge of the image. The points in red violate one or more of the landing point selection constraints and are determined to be unsafe for landing. For this particular case, the points in the red region are either too close to the obstacles, or too close to the inner edge of the yellow region, which is treated as an obstacle since no terrain range information is available beyond this border. The green region indicates the points that meet all of the constraints and are considered safe for landing. Finally, the black '+' indicates the point that best meets the constraints, and the circle shows the diameter of the safe-distance constraint. The information in Figure 9 is presented to the operator each time a new landing point is selected and allows the operator to verify the landing point. The operator also has the option of selecting an alternative landing point if necessary.

PALACE Mission Manager

The PALACE mission manager software is the heart of the system and unifies the various elements. It also provides the decision-making and coordination capabilities required to fly complete PALACE landings. Figure 10 shows the steps in the PALACE landing procedure as the helicopter descends from 30 m to the ground. The landing procedure starts with the arrival of the helicopter at the landing zone, which has been specified

by the operator as a GPS map coordinate.

Figure 9. SLAD operator interface showing results of landing point selection.

Figure 10. PALACE landing sequence during the descent from 30 m to the ground:
  GPS-based navigation
    30 m   -- GPS navigation to landing zone; start landing procedure at 30 m AGL; run SLAD to determine safe landing point; begin tracking landing point; move to bring landing point to center of image
    24 m   -- Descend to 24 m AGL using GPS for navigation; re-run SLAD to refine landing point; begin tracking updated landing point
    18 m   -- Descend to 18 m AGL using GPS for navigation; re-run SLAD to refine landing point; begin tracking updated landing point
  Machine-vision-based navigation
    12 m   -- Descend to 12 m AGL using GPS for navigation; switch to MPE pseudo-GPS for positioning; re-run SLAD to refine landing point; begin tracking updated landing point
    6 m    -- Descend to 6 m AGL using MPE for navigation; verify landing point by checking roughness
    2 m    -- Descend to 2 m AGL using MPE for navigation; initiate final landing sequence; switch to inertial navigation for positioning
  Inertial navigation
    Ground -- Descend to ground using inertial navigation; weight-on-wheels switches for vehicle shutdown

The cameras on the RMAX are set to a pitch angle of -60 degrees so that the helicopter will descend to the final landing point at this angle. The descent angle, or glideslope, is set the same as the camera angle so that the landing point remains nominally in the center of the camera image as the helicopter descends. The helicopter moves under GPS waypoint navigation to view the nominal landing point location specified by the

operator from an altitude of 30 m AGL. At this point, the stereo ranging and SLAD algorithms are run to select an initial landing point that best meets the set of slope, roughness and safe distance constraints. The machine vision feature-tracker estimates the location of the landing point and starts tracking this point. The PALACE mission manager then calculates and sends a waypoint to move the helicopter laterally so that the selected landing point is repositioned in the center of the camera image. Following this, a waypoint is set at an altitude of 24 m AGL along the glideslope and the helicopter is commanded to descend to this altitude. At 24 m, the stereo ranging and SLAD algorithms are run again to refine the landing point location with a tighter roughness constraint to reject regions with smaller obstacles that could not be detected from the higher altitude. The feature-tracker is then re-initialized with the updated landing point and a waypoint is set to bring this point to the center of the camera image. Another waypoint is set at an altitude of 18 m along the glideslope and the vehicle descends further. The same procedure is repeated at 18 m AGL to bring the helicopter down to 12 m AGL. At 12 m, the CLAW block is instructed to switch from GPS to the pseudo-GPS position estimate returned by the MPE system for navigation. The descent and landing from this point are flown entirely without using GPS. The SLAD algorithm is run again to refine the landing point and the helicopter descends to 6 m AGL. At 6 m AGL, the stereo ranging and SLAD algorithms are run for the final time to verify that the final landing zone is obstacle-free (within the final roughness constraint value) and safe for landing. Once the final landing point has been verified, the helicopter descends to 2 m AGL where the command to initiate the final landing script is given. The helicopter begins the final descent using MPE for navigation.
When MPE goes out of range (too low for reliable tracking), CLAW switches to inertial navigation for the final portion of the landing. The total time in inertial navigation is generally less than about 10 seconds. Weight-on-wheels switches are triggered when the vehicle touches the ground to complete the landing. For the flight trials, a high-speed flash drive is connected to the experimentation computer to record all of the data related to the PALACE mission manager and the machine-vision algorithms. These data included: the mission manager log, commands, actions and internal data; the MPE camera images, laser data, state data and results at 10 Hz; the stereo images, state data and stereo range maps; and the SLAD inputs, output maps and landing point selection results. The collection of these flight data allowed for post-flight verification of the system, and for regression testing to better characterize and optimize the flight performance.
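The altitude-gated logic of this landing sequence can be sketched as a small table-driven state machine. This is a hypothetical structure for illustration; the action and mode names are not the flight software's identifiers:

```python
# Hypothetical sketch of the altitude-gated PALACE landing sequence (Figure 10).
SEQUENCE = [
    # (gate altitude in m AGL, actions at this step, navigation mode)
    (30, ["run_slad", "track_point", "center_point_in_image"], "gps"),
    (24, ["run_slad", "track_point", "center_point_in_image"], "gps"),
    (18, ["run_slad", "track_point", "center_point_in_image"], "gps"),
    (12, ["switch_to_mpe", "run_slad", "track_point"], "mpe"),
    (6,  ["verify_roughness"], "mpe"),
    (2,  ["initiate_final_landing", "switch_to_inertial"], "inertial"),
]

def step_for_altitude(alt_agl_m):
    """Return the (gate, actions, nav mode) for the current altitude band."""
    for gate, actions, nav in SEQUENCE:
        if alt_agl_m >= gate:
            return gate, actions, nav
    # Below 2 m: final descent on inertial navigation until weight-on-wheels.
    return 0, ["descend_to_touchdown", "weight_on_wheels_shutdown"], "inertial"
```

The table form makes the navigation-mode handoffs (GPS to pseudo-GPS at 12 m, pseudo-GPS to inertial at 2 m) explicit and easy to regression-test against logged flight data.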

FLIGHT TEST RESULTS

Among the most critical technologies for the automated landing task are the machine vision algorithms and the integration of these algorithms with the RMAX hardware and control laws. Component flight tests were first performed to validate and optimize the in-flight performance of the various machine vision elements. Following this, flight trials of the complete system were performed to validate the mission manager functionality and decision-making capabilities, the integration of the machine vision elements into the complete landing procedure, and the smooth transitions between control modes. This ultimately led to complete landings on various surfaces and obstacle fields. Upwards of 25 flights to date have included test points for the development and evaluation of the PALACE components and system.

Stereo Ranging Evaluations

The generation of an accurate range map from a pair of stereo camera images is the key element in the selection of a safe landing point. If the range map does not represent the ground terrain and obstacles with sufficient accuracy, then the SLAD algorithm may not select the best, or even a safe, landing point, and may also reject a potentially safe landing point. There are two key performance measures associated with stereo ranging. The first is the amount of uncertainty (or noise) in ranging flat ground, which gives an indication of stereo ranging resolution. The resolution places a lower limit on the size of obstacles that can be resolved from the ranging noise and is used as the basis for setting the roughness constraint values for the SLAD algorithm. The second key performance measure is the accuracy with which obstacle sizes are represented in the range map. This is evaluated by comparing measured obstacle heights to actual obstacle heights.
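The altitude dependence of stereo resolution follows directly from the depth-from-disparity relation. A minimal sketch, assuming the paper's 1.1 m baseline; the focal length in pixels and the disparity-error figure are placeholder assumptions, not values from the flight system:

```python
# Depth from stereo disparity, Z = f*B/d, with the 1.1 m baseline from the
# paper; the focal length (in pixels) is a placeholder assumption.
def depth_from_disparity(disparity_px, f_px=800.0, baseline_m=1.1):
    return f_px * baseline_m / disparity_px

# Range uncertainty per pixel of disparity error, dZ ~ e * Z^2 / (f*B):
# it grows with the square of altitude, which is why the resolvable
# obstacle size (and hence the SLAD roughness constraint) must grow with AGL.
def range_uncertainty(z_m, disp_err_px=0.5, f_px=800.0, baseline_m=1.1):
    return disp_err_px * z_m ** 2 / (f_px * baseline_m)
```

Under these assumptions the uncertainty at 30 m AGL is 25 times that at 6 m AGL, consistent with the roughly order-of-magnitude tightening of the roughness constraint reported below.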
Effect of Surface Texture and Patterns

The stereo ranging algorithm is based on the assumption that a feature in one camera image can be uniquely matched with the same feature in the second camera image. This assumption clearly breaks down in regions where pixels are indistinguishable and features cannot be uniquely identified in the second image. In such cases, "holes" are produced in the range map where no range information can be deduced. Stereo ranging performance therefore varies with surface texture and with any patterns that may be present on the landing surface.

Figure 11. Effect of surface texture on stereo ranging performance.

The effect of surface texture is illustrated in Figure 11, which shows the left camera images and resultant range maps for grass, concrete and asphalt surfaces. These results were obtained at 30 m with the cameras set at -60 degrees. The three range maps clearly show differences between the three nominally flat surfaces. To quantify the differences in the stereo ranging results, a least-squares method is used to fit a plane to the range-map point cloud. The RMS or average deviation of the point cloud from the plane gives an indication of the amount of noise or uncertainty in the stereo range map. A second metric is the peak-to-peak or maximum variation in range measurements after accounting for overall surface slope. Table 2 lists the average deviation and maximum variation metrics for the three cases shown in Figure 11. For each surface, the maximum variation for the flat surface is between 1.0 m and 1.5 m, with asphalt being slightly better than the grass and concrete surfaces. Taking asphalt as an example, these numbers indicate that it would be difficult to distinguish obstacles smaller than 1 m in height from the noise or uncertainty in the range map. It is important to mention that the results in Figure 11 and Table 2 for concrete and asphalt represent the best that can be obtained for these surfaces with special attention to the directionality or patterns on the surfaces. Figure 12 shows camera images and range maps for the same patch of asphalt from three different headings, with the surface patterns or streaks aligned mostly vertically, mostly diagonally and mostly horizontally. Table 3 lists the metrics for these three cases. The results clearly show

Table 2. Stereo ranging performance measures for flat surfaces (30 m AGL, -60 degree camera angle). Surface Texture

Grass Concrete ** Asphalt **

Average Deviation (m) 0.18 0.21 0.17

Maximum Variation (m) 1.18 1.50 1.07

** Values vary with surface directionality Table 3. Stereo ranging performance measures for asphalt with different directional ‘streaks’ (30 m AGL, 60 degree camera angle). Surface Orientation

Horizontal Patterns Diagonal Patterns Vertical Patterns

Average Deviation (m) 0.26 0.41 0.17

Maximum Variation (m) 2.97 2.37 1.07
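The plane-fit noise metrics used in Tables 2 and 3 can be reproduced from a range-map point cloud with a straightforward least-squares plane fit. This is a sketch of the method as described in the text; the paper does not give the exact implementation:

```python
import numpy as np

def plane_fit_metrics(points):
    """Fit z = a*x + b*y + c to an Nx3 point cloud by least squares and
    return (average deviation, peak-to-peak variation) of the residuals,
    i.e. the noise metrics after accounting for overall surface slope."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coef, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    resid = points[:, 2] - A @ coef
    return np.mean(np.abs(resid)), resid.max() - resid.min()
```

A perfectly flat sloped surface yields zero for both metrics; ranging noise and unresolved obstacles inflate them, which is what distinguishes the surfaces in the tables.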

that stereo ranging is sensitive to any directionality in the surface texture. With horizontal and diagonal patterns on the surface, the stereo ranging algorithm is not able to accurately determine the disparity values, which results in large variations in ranging distance. Similar results were seen for concrete surfaces, which have directional patterns due to surface lines. These results indicate that careful attention to helicopter heading is required to ensure that good stereo ranging resolution is achieved. Recent results with improvements in camera calibration have shown a reduced dependence of stereo ranging results on surface texture directionality. Testing is currently underway to quantify this reduced sensitivity to directionality with the new camera calibration models.

Figure 12. Effect of surface patterns on stereo range performance.

Effect of Altitude

Altitude is an important parameter in stereo ranging since resolution is directly related to the height above the surface, with greater resolution closer to the ground. Since the stereo ranging resolution varies with altitude, the SLAD roughness constraint must also vary with altitude. Table 4 lists the roughness constraint values versus height for asphalt (with surface patterns aligned vertically) and grass surfaces. These values are derived from the maximum variation metrics for these surfaces at each particular height and represent the size of obstacles that are clearly distinguishable from the noise in the range maps. These roughness constraint values therefore place an upper limit on the size of obstacles that will cause a surface to be rejected by the SLAD algorithm. Obstacles smaller than these values may or may not be rejected. The values indicate that from 6 m AGL over asphalt, the system will reliably reject landing surfaces with obstacles 15 cm tall and taller.

Obstacle Height Accuracy

The accuracy with which obstacle sizes can be estimated using stereo ranging was evaluated by flying over obstacles of known dimensions from different altitudes and comparing the estimated obstacle height from stereo ranging with the actual obstacle height. Figure 13 shows

Table 4. Roughness constraint values as a function of altitude for asphalt and grass.

  Height AGL (m)   Roughness Constraint (m)
                   Asphalt      Grass
  30               1.8          2.3
  24               1.3          1.5
  18               0.75         1.0
  12               0.40         0.65
  6                0.15         0.25
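A scheduling function for the constraint values in Table 4 might look like the following. Linear interpolation between the tested altitudes is an assumption for illustration; the flight system re-runs SLAD only at the discrete altitudes of the descent profile in Figure 8:

```python
import numpy as np

# Roughness constraint vs. altitude from Table 4 (m AGL -> constraint in m).
ALTITUDES_M = [6.0, 12.0, 18.0, 24.0, 30.0]
ROUGHNESS_M = {
    "asphalt": [0.15, 0.40, 0.75, 1.3, 1.8],
    "grass":   [0.25, 0.65, 1.0, 1.5, 2.3],
}

def roughness_constraint(alt_agl_m, surface="asphalt"):
    """Interpolated SLAD roughness constraint for the current altitude."""
    return float(np.interp(alt_agl_m, ALTITUDES_M, ROUGHNESS_M[surface]))
```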

Table 5. Measured obstacle height versus actual obstacle height from different altitudes over asphalt (measured height in m, with percentage error).

  Obstacle      Ranging Altitude (AGL)
  Height (m)    6 m          12 m         18 m         24 m         30 m
  0.2           0.22 (10%)   0.20 (0%)    --           --           --
  0.6           --           0.63 (5%)    0.62 (3%)    --           --
  0.9           --           --           0.97 (7%)    0.80 (11%)   1.10 (22%)
  1.5           --           --           --           1.45 (3%)    1.70 (13%)

an example of such a test with the RMAX flying over boxes of known heights from different altitudes.

Figure 13. Flight test to determine accuracy of obstacle height estimation with stereo ranging.

Table 5 compares the measured and actual obstacle height results for one particular flight test, shown in Figure 13. The left column of the table is the actual obstacle height and the values across the top are the ranging altitudes. The data within the table are the obstacle height values measured from the stereo range maps, along with the percentage error. For this test, obstacles were measured to within 10% of their actual heights for altitudes up to 18 m AGL, and within about 20% of their actual heights for altitudes above 18 m AGL.

SLAD Flight Evaluations

The SLAD algorithm takes a range map and combines it with a set of landing point constraints to first calculate a safe landing map and then choose the optimum landing point. For the PALACE program, the range maps were generated using stereo ranging, but the SLAD algorithm can be used with range maps from any sensor, active as well as passive. The constraint values used for the PALACE flight trials are based on the geometry and performance limits of the Yamaha RMAX. The landing site slope constraint value was set to 15 degrees, although all landings performed with the RMAX were on slopes below 5 degrees. The safe distance constraint is set to 7 m to ensure the algorithm selects a landing point with an open-area diameter of at least 7 m. This leaves a margin of 2 m from the rotor tip to the nearest obstacle, which is required to account for drift associated with the machine-vision tracking and inertial navigation systems during the final descent and landing. The success or failure of the SLAD algorithm is based on its performance in choosing a valid landing site. A success occurs when the algorithm correctly identifies a safe landing point when one exists, or when it correctly identifies that no safe landing site exists when none does. There are two failure modes for the SLAD algorithm. The first is a false negative, when the algorithm returns that there is no safe landing point when one actually exists. The second is a false positive, when the algorithm returns a landing point that violates one or more of the constraints.
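The constraint checks and best-point selection can be sketched as follows. This is a minimal, illustrative version over a gridded elevation map; the square window, the plane-fit roughness measure, and the brute-force safest-point search are assumptions, not the flight implementation:

```python
import numpy as np

def slad(elev, cell, slope_max_deg=15.0, rough_max=0.25, clear_diam_m=7.0):
    """Sketch of the SLAD constraint checks over an elevation grid.

    elev: 2D array of terrain heights (m); cell: grid spacing (m).
    Returns the (row, col) of the chosen landing cell, or None when no
    safe landing point exists (itself a valid SLAD outcome).
    """
    h, w = elev.shape
    half = int(round(clear_diam_m / (2.0 * cell)))   # clearance half-width
    ok = np.zeros((h, w), dtype=bool)                # border stays unsafe,
                                                     # like the no-data edge
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    A = np.c_[(xs * cell).ravel(), (ys * cell).ravel(), np.ones(xs.size)]
    for i in range(half, h - half):
        for j in range(half, w - half):
            patch = elev[i - half:i + half + 1, j - half:j + half + 1].ravel()
            coef, *_ = np.linalg.lstsq(A, patch, rcond=None)   # local plane fit
            slope_deg = np.degrees(np.arctan(np.hypot(coef[0], coef[1])))
            roughness = np.max(np.abs(patch - A @ coef))
            ok[i, j] = slope_deg <= slope_max_deg and roughness <= rough_max
    if not ok.any():
        return None
    # Choose the safe cell farthest from any unsafe or unknown cell.
    ui, uj = np.nonzero(~ok)
    best, best_d = None, -1.0
    for i, j in zip(*np.nonzero(ok)):
        d = np.min(np.hypot(ui - i, uj - j)) * cell
        if d > best_d:
            best, best_d = (i, j), d
    return best
```

Treating the grid border as unsafe mirrors the flight system's handling of the no-range-data region at the image edge, which is treated as an obstacle.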

Figure 14. Landing site selection results from 30 m AGL over a gravel surface. Box heights are 1.6 m.

The performance of the SLAD algorithm was evaluated in-flight for a number of different landing scenarios, with variations in surface texture (asphalt, grass, concrete and gravel), obstacle field (size, density and spacing), height above the terrain, and SLAD constraint values. With an accurate representation of the surface terrain in the form of a 3D range map, the SLAD algorithm was able to correctly identify which portions of the field of view were safe for landing and which were not. For cases where the stereo range map contains more noise than expected, the SLAD algorithm is further constrained by the additional apparent surface roughness and can reject areas that are actually safe for landing. The result is a conservative system in which safe landing areas can be deemed unsafe because of the additional noise.

Figures 14 and 15 show the SLAD results from 30 m AGL and 24 m AGL for one particular landing to a gravel surface. For this case, the landing zone contained a golf cart (1.8 m tall) and two boxes (1.6 m tall). The roughness constraint values for gravel were set to be the same as those for grass (see Table 4), since the stereo ranging performance for gravel is similar to that obtained for grass. Figure 14 shows the landing point selection results from 30 m AGL, at the beginning of the landing descent, with a SLAD roughness constraint value of 2.3 m. The golf cart on the left side of the image is recognized as an obstacle even though it is only 1.8 m tall. The regions neighboring the golf cart are colored red, indicating that they are unsafe for landing. The boxes (1.6 m tall) are not recognized as obstacles from this altitude, as indicated by the large patch of green that encloses the box at the center of the image. The vicinity of the box is clearly not a safe place to land, but its height of 1.6 m is below the roughness constraint value of 2.3 m, so it is not rejected by the algorithm.

Figure 15. Landing site selection results from 24 m AGL over a gravel surface.

Figure 16. Landing site selection results from 12 m AGL over an asphalt surface. The box height is 40 cm and the roughness constraint is set to 45 cm.

Figure 15 shows the SLAD results during the same descent, but this time from a height of 24 m AGL. For this case, the roughness constraint is set to 1.5 m, and the box near the center of the image is recognized as an obstacle and rejected as a safe point to land. The previous landing point (shown with the blue circle) is no longer safe since it is too close to the box. The new landing point (shown with the black circle) is within the safe region and maximizes the distance from the box and the edge of the window. These examples show how the SLAD algorithm refines the location of the landing point during the descent from 30 m through 24 m and 18 m AGL, where smaller obstacles are detected at lower altitudes and the landing point location is adjusted. From heights of 12 m and 6 m AGL, the SLAD algorithm no longer attempts to refine the landing point location, but simply determines whether the landing point is still safe. This involves checking whether the portion of the landing site visible from these heights is obstacle-free to within the value of the roughness constraint. If the roughness constraint is not violated, the landing point is validated and the descent continues. If the roughness constraint is violated, the area is deemed unsafe and the landing is aborted.

Figure 17. Landing site selection results from 12 m AGL over an asphalt surface. The box height is 40 cm and the roughness constraint is set to 40 cm.

Figures 16 and 17 show examples of the landing site selection results for the same scenario with different roughness constraint values. For these cases, the helicopter is at a height of 12 m AGL and the landing zone contains a box that is 40 cm tall. For the case in Figure 16, the roughness constraint is set to 45 cm and the landing point is validated as safe, since the box is below the roughness constraint and is not seen as an obstacle by the SLAD algorithm. For the case in Figure 17, the roughness constraint is reduced to 40 cm and the box is recognized as an obstacle, which leads to the previously selected landing point being rejected.

Monocular Position Estimation (MPE) Evaluations

The monocular feature-tracker and MPE algorithms take left camera images, the laser range to the center of the camera images, and information about the pose of the camera to estimate the position of the helicopter relative to a fixed point on the ground. This allows the MPE system to provide a self-localization capability that is used for navigation, without GPS, during the final portion of the descent from 12 m AGL to the ground.

The performance of the MPE system was evaluated in-flight for a number of different conditions, including variations in atmospheric conditions (wind speed, wind direction and level of turbulence), lighting conditions (overcast and full sun), surface texture (grass, concrete, asphalt and gravel), and height above the ground. The performance metrics evaluated during flight were the amount of tracking drift, the processing time required for each cycle of tracking and position estimation, and the ability of the helicopter to hold position and maintain the correct descent path while navigating to the ground. The tracking drift associated with MPE arises because the template of the feature being tracked is updated with each cycle of the MPE system. Updating the feature template at each step has the advantage of accounting for changes in the camera position during the descent, but is prone to drift, since the original feature used to initialize the tracker is not kept in memory.
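This drift mechanism can be reproduced with a toy tracker. The sketch below is hypothetical, not the flight code: it tracks a single feature by sum-of-squared-differences (SSD) template matching and refreshes the template from each new match. Any small matching error committed at one step is baked into the next template, which is exactly how slow drift accumulates when the original feature is discarded.

```python
import numpy as np

def track_feature(frames, init_pos, patch=9, search=5):
    """Track one feature by SSD template matching.

    The template is refreshed from the latest match each cycle, which
    adapts to camera motion and appearance change, but lets matching
    errors accumulate as drift since the original feature is discarded.
    """
    half = patch // 2
    r, c = init_pos
    template = frames[0][r - half:r + half + 1,
                         c - half:c + half + 1].astype(float)
    path = [(r, c)]
    for frame in frames[1:]:
        best, best_ssd = (r, c), np.inf
        for dr in range(-search, search + 1):
            for dc in range(-search, search + 1):
                rr, cc = r + dr, c + dc
                cand = frame[rr - half:rr + half + 1,
                             cc - half:cc + half + 1].astype(float)
                if cand.shape != template.shape:
                    continue  # candidate window ran off the image
                ssd = np.sum((cand - template) ** 2)
                if ssd < best_ssd:
                    best_ssd, best = ssd, (rr, cc)
        r, c = best
        # Refresh the template from the new match (source of drift).
        template = frame[r - half:r + half + 1,
                         c - half:c + half + 1].astype(float)
        path.append((r, c))
    return path
```

Keeping the initial template instead would eliminate the drift but make the tracker fail as the feature's appearance changes during the descent, which is the trade-off noted above.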

Effect of Latency

Flight tests of the MPE system in separate hover, climb and descent tests revealed that MPE is robust to different surfaces, with similar performance seen over grass, concrete, asphalt and gravel. MPE is also robust to lighting conditions (overcast and full sun) and atmospheric conditions (wind speed, wind direction and level of turbulence), with similar amounts of tracking drift seen in each case. The total tracking drift observed was small, limited to 15-20 cm per minute.

As shown in Figure 4, the inputs to the MPE system are the left camera images, the laser range to the center of the camera images, and the measured helicopter attitudes from the IMU. Since each input is used to estimate the helicopter position, time synchronization is critical. If the latency between the various MPE inputs is too large, a coupling between the attitude control loop (inner-loop) and the position control loop (outer-loop) is introduced that compromises the overall stability of the helicopter.

Figure 18 shows the Easting, Northing and height time histories during a climb from 12 m AGL to 30 m AGL using the vision-based self-localization system. The total amount of drift for this case after about 5 minutes without GPS was less than 1 m. This amount of drift is well within the 2 m margin included in the SLAD safe distance constraint to account for tracking drift, particularly since the amount of time to navigate from 12 m AGL to the ground is only on the order of one to two minutes.

As an example, if the helicopter were to oscillate in-flight in the pitch direction (with no fore-aft motion), the pitch oscillation would be seen as changes in: the measured values of pitch attitude, the length of the laser slant range, and the vertical pixel location of the feature being tracked. If these signals are precisely time synchronized, then the pitch oscillation seen in each of the three MPE inputs will be in-phase, and the MPE pseudo-GPS position estimate will be steady with no oscillation. This is equivalent to the case with GPS, where the measured position is independent of the helicopter attitudes and the inner and outer loops remain uncoupled.
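The in-phase cancellation can be illustrated numerically with an idealized one-dimensional model (an assumption for illustration: the helicopter is fixed directly over the feature and only pitches; the 43 msec and 3 rad/sec values come from the flight measurements discussed in this section). A helicopter pitching sinusoidally over a fixed nadir feature should report zero horizontal motion, but delaying only the attitude channel produces a spurious position oscillation at the pitch frequency.

```python
import numpy as np

dt = 0.001
t = np.arange(0.0, 10.0, dt)
amp = np.radians(2.0)              # 2-degree pitch oscillation (illustrative)
theta = amp * np.sin(3.0 * t)      # pitch at ~3 rad/sec (inner-loop crossover)
h = 12.0                           # height AGL (m)

def position_estimate(attitude_delay_s):
    """Horizontal offset to a fixed nadir feature, fused from three inputs:
    camera bearing, laser slant range, and (possibly delayed) attitude."""
    delay = int(round(attitude_delay_s / dt))
    cam_angle = -theta                  # feature bearing in the camera frame
    slant = h / np.cos(theta)           # laser range to the image center
    att = np.roll(theta, delay)         # attitude arrives `delay` samples late
    x = slant * np.sin(att + cam_angle)
    return x[delay:]                    # drop the wrap-around transient

x_synced = position_estimate(0.0)       # perfectly synchronized inputs
x_delayed = position_estimate(0.043)    # 43 msec attitude latency
print(np.max(np.abs(x_synced)), np.max(np.abs(x_delayed)))
```

With synchronized inputs the estimate is identically zero; with the 43 msec attitude delay it oscillates with several centimeters of amplitude, and that amplitude scales with height, consistent with the altitude sensitivity reported below.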

Figure 18. Time histories comparing actual position (GPS) and estimated position (MPE) for a climb from 12 m AGL to 30 m AGL.

However, this is not the case if the MPE input signals are not time synchronized. In that case, the estimated position oscillates at the same frequency as the pitch oscillation, which strongly couples the inner-loop attitude control and the outer-loop position control and can ultimately lead to attitude instability.

The set-up used on the ARP RMAX for the PALACE flight trials has the cameras and laser connected directly to the experimentation computer. Camera images are taken every 33 msec (30 Hz) and laser data are taken every 27 msec (37 Hz). The timestamp from each of these measurements is used to synchronize the camera images and laser data. The attitudes, on the other hand, are measured by the IMU, which is connected directly to the flight control computer. The attitudes are then passed to the experimentation computer over a TCP/IP socket every 20 msec (50 Hz) for use in the MPE calculations.

In order to quantify the inherent latencies between the various MPE inputs, a pitch frequency sweep was performed with the helicopter on the ground (with the engine turned off) by manually raising and lowering the tail. The camera images, laser data and attitudes were recorded within the mission manager just prior to the MPE calculations. Using system identification [10], it was found that the camera images and laser range measurements were synchronized to within a couple of milliseconds. However, the attitude measurements were delayed by 43 msec from the camera images and laser data. This latency in the attitude measurements is due to processing and filtering on the IMU, processing within the flight control computer, and transmission time between the flight control and experimentation computers.

Figure 19 shows the power spectrum results for the MPE position estimate for hover at altitudes of 6 m, 12 m, and 18 m AGL. These results were generated without compensating for the 43 msec latency in the attitude measurements.
As expected, the magnitude decreases above the position loop crossover frequency of about 0.8 rad/sec. However, there is a spike in the power spectrum curves at about 3 rad/sec, which corresponds to the crossover frequency of the inner attitude loop. The peak also increases in magnitude as the altitude increases. This effect is seen in-flight as an oscillation in pitch and roll that increases in magnitude and decreases in damping as the altitude increases. The sensitivity to altitude is due to the physical distance per pixel, which increases linearly with altitude and translates into a linear increase in cross-coupling gain between the inner and outer control loops. A flight test with the same configuration at 24 m AGL resulted in an unstable pitch oscillation.

Figure 19. Power spectrum for MPE position estimate at 6 m, 12 m, and 18 m AGL without compensating for the 43 msec latency in the attitude measurements.
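One way to implement the timestamp-based alignment described above in software is to buffer the fast camera/laser samples and pair each late-arriving attitude sample with the buffered data closest to the time the attitude was actually measured. The sketch below is an assumption-laden illustration, not the flight software: the class and method names are invented, and it matches nearest timestamps rather than applying the fixed delay used in the actual system.

```python
import bisect

class LatencyAligner:
    """Pair each late-arriving attitude sample with the buffered
    camera/laser sample closest to when the attitude was measured.
    (Hypothetical sketch; assumes at least one sample is buffered.)"""

    def __init__(self, attitude_latency=0.043):
        self.latency = attitude_latency
        self.samples = []  # sorted list of (timestamp, payload)

    def push(self, stamp, payload):
        # Insert keeping the buffer sorted by timestamp.
        bisect.insort(self.samples, (stamp, payload))

    def match(self, attitude_arrival_stamp):
        # The attitude was measured `latency` seconds before it arrived.
        target = attitude_arrival_stamp - self.latency
        i = bisect.bisect_left(self.samples, (target,))
        neighbors = [self.samples[j] for j in (i - 1, i)
                     if 0 <= j < len(self.samples)]
        return min(neighbors, key=lambda s: abs(s[0] - target))
```

For a 30 Hz camera and a 43 msec attitude latency, each attitude sample pairs with an image taken one to two frames earlier, which is the effect the fixed 43 msec delay achieves.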

Figure 20. Power spectrum of the MPE position estimate at 12 m with and without the attitude latency correction.

In order to time synchronize the three inputs to the MPE system, the camera images and laser data were delayed by 43 msec to be in phase with the attitude data. Figure 20 shows the effect on the MPE position power spectrum at 12 m AGL with and without the attitude latency compensation. The addition of this compensation has two effects on the power spectrum. The first is that the peak at 3 rad/sec is reduced, and no oscillations in pitch and roll were seen in-flight. This indicates that by carefully synchronizing the various MPE inputs, the MPE pseudo-GPS position estimate is no longer a function of attitude, and the inner and outer loops are again decoupled. With the latency correction included, stable hover was achieved up to 30 m AGL (shown in Figure 18).

The second effect of compensating for the attitude latency is that the magnitude (Figure 20) increases around the position loop crossover frequency, which translates into greater lateral and fore-aft motion of the helicopter while holding a hover position. This results from the 43 msec of latency that is moved from the inner attitude loop to the outer position loop, which decreases the amount of damping in the position loop. This is illustrated in Figure 18, where the fore-aft and lateral motions increase in amplitude as the altitude increases. The period of between 6 and 8 seconds corresponds to the position hold loop crossover frequency of about 0.8 rad/sec. The total amount of additional time delay added to the outer loop by MPE feedback, when compared with GPS feedback, is around 100 msec. This includes the MPE processing time of 50 to 60 msec and the delay introduced by the attitude latency compensation of 43 msec. The dependence on altitude results from the increase in physical distance per pixel of resolution as the altitude increases. Even though this fore-aft and lateral motion is larger with MPE than with GPS, the system is still stable both in attitude and position to at least 30 m AGL.
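The reduction in position-loop damping can be bounded with a back-of-the-envelope phase calculation: a pure time delay tau costs omega times tau radians of phase at frequency omega. The arithmetic below is illustrative only (it is not a computation from the paper), using the crossover frequency and delay values quoted above.

```python
import numpy as np

omega_c = 0.8  # outer position-loop crossover frequency (rad/sec)
tau = 0.100    # extra delay of MPE feedback vs. GPS feedback (sec):
               # ~50-60 msec processing plus the 43 msec latency compensation

# Phase lost to a pure delay e^(-tau*s) evaluated at s = j*omega_c.
phase_loss_rad = omega_c * tau
phase_loss_deg = np.degrees(phase_loss_rad)
print(f"Phase margin lost at crossover: {phase_loss_deg:.1f} deg")  # about 4.6 deg
```

A few degrees of lost phase margin at 0.8 rad/sec is modest, consistent with the observation that the hover remains stable but with visibly larger fore-aft and lateral excursions.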

Based on the results of the PALACE flight trials on a modified Yamaha RMAX, the following conclusions can be drawn:

1. Precise autonomous landings at unprepared sites are possible without ground-based instrumentation or markings, and without GPS. This has been demonstrated in-flight with a total of 17 successful landings to date on various surfaces and obstacle fields.

2. A successful autonomous landing system requires careful attention to system integration, solid dynamics and control performance, and close attention to the dynamic interaction between the various components. This is particularly important when closing the loop on a vision sensor, such as the MPE system, where stable flight could be achieved only after carefully synchronizing all of the position estimate inputs.

3. Passive stereo ranging alone may not be sufficient for terrain sensing due to its sensitivity to terrain texture, directionality, and brightness. Successful autonomous landings were possible only by inspecting the ground surface to choose a heading that gives the best possible stereo ranging. Since an accurate stereo range map is the key input to the landing site selection algorithm, improvements in stereo ranging accuracy will make the system more robust to different surfaces and obstacles.

4. The performance of the landing system must be optimized based on the available processing power, and there is a trade-off between the desired performance and the available processing power.
CONCLUDING REMARKS

This paper provided an overview of the PALACE program and the methods of integrating machine vision technologies with realistic vehicle dynamics and control laws for autonomous landings of RUAVs. Results from flight trials to evaluate and verify the performance of the machine-vision components were reported. A comparison between the quantitative objectives of the PALACE program (listed in Table 1) and the values measured in flight is shown in Table 6. The vision-based self-localization system is able to meet the requirements on drift (landing accuracy) and processing time, with performance that is robust to different surfaces, lighting conditions, and atmospheric conditions. The SLAD algorithm was able to meet the requirements on landing site size and processing time, but was unable to meet the landing surface roughness and SLAD success rate requirements due to limitations in the stereo ranging resolution. The trade-off between the maximum allowable surface roughness and the resulting SLAD success rate dictates that, with the current stereo ranging resolution, both objectives cannot be met simultaneously. For the results in this paper, the roughness constraint at each altitude is set in order to achieve a 95% SLAD success rate, which produces a reasonable balance between success rate and resolution.

Table 6. PALACE program quantitative metrics and flight measured values.

Quantitative Metric         Project Objective   Measured Values
Landing Site Size           < 7.0 m             < 7.0 m
Landing Surface Slope       < 15 deg            < 15 deg
Landing Surface Roughness   < 10 cm             < 15 cm
Landing Accuracy            < 1.25 m            < 1.00 m
Feature-Tracking Time       < 100 msec          < 60 msec
SLAD Calculation Time       < 5 sec             < 3 sec
SLAD Success Rate           > 98%               > 95%

It should be mentioned that, subsequent to the data collection and writing of this paper, significant improvements in resolution and accuracy of stereo ranging have been achieved with camera models from a new JPL camera calibration algorithm. This improvement in stereo ranging performance translates into roughness constraint values that are reduced by 50 to 60% from those shown in Table 4 and decreases the sensitivity of the stereo ranging algorithm to surface lines and patterns. This improvement in stereo ranging performance also improves the SLAD algorithm performance to the extent that both the landing surface roughness and SLAD success rate objectives (Table 6) can be achieved.

REFERENCES

[1] Sierra Nevada Corporation, "UAV Common Automatic Recovery System (UCARS)," http://www.sncorp.com/uav1.html.

[2] Saripalli, S., Montgomery, J. F., and Sukhatme, G. S., "Visually-Guided Landing of an Unmanned Aerial Vehicle," IEEE Transactions on Robotics and Automation, vol. 12, no. 3, pp. 371-381, June 2003.

[3] Saripalli, S., and Sukhatme, G. S., "Landing on a Moving Target using an Autonomous Helicopter," Proceedings of the International Conference on Field and Service Robotics, July 2003.

[4] Garcia-Padro, P. J., Sukhatme, G. S., and Montgomery, J. F., "Towards Vision-based Safe Landing for an Autonomous Helicopter," Robotics and Autonomous Systems, vol. 38, no. 1, pp. 19-29, 2001.

[5] Shakernia, O., Vidal, R., Sharp, C. S., Ma, Y., and Shastry, S. S., "Multiple View Motion Estimation and Control for Landing an Unmanned Aerial Vehicle," Proceedings of the International Conference on Field and Service Robotics, May 2002.

[6] Hintze, J., Christian, D., Theodore, C., Tischler, M., McLain, T., and Montgomery, J., "Simulated Autonomous Landing of a Rotorcraft Unmanned Aerial Vehicle in a Non-cooperative Environment," Proceedings of the American Helicopter Society 60th Annual Forum, Baltimore, MD, June 2004.

[7] Theodore, C., Shelden, S., Rowley, D., McLain, T., Dai, W., and Takahashi, M., "Full Mission Simulation of a Rotorcraft Unmanned Aerial Vehicle for Landing in a Non-Cooperative Environment," Proceedings of the American Helicopter Society 61st Annual Forum, Grapevine, TX, June 2005.

[8] Whalley, M., Takahashi, M., Schulein, G., Freed, M., Christian, D., Patterson-Hine, A., and Harris, R., "The Army/NASA Autonomous Rotorcraft Project," Proceedings of the American Helicopter Society 59th Annual Forum, Phoenix, AZ, May 2003.

[9] Johnson, A., Klump, A., Collier, D., and Wolf, A., "LIDAR-Based Hazard Avoidance for Safe Landing on Mars," AAS/AIAA Space Flight Mechanics Meeting, Santa Barbara, CA, Feb 2001.

[10] Tischler, M., and Cauffman, M., "Frequency-Response Method for Rotorcraft System Identification: Flight Applications to BO-105 Coupled Rotor/Fuselage Dynamics," Journal of the American Helicopter Society, vol. 37, no. 3, July 1992.