Journal of Applied Science and Engineering, Vol. 18, No. 4, pp. 331-338 (2015)

DOI: 10.6180/jase.2015.18.4.03

A Novel Vision Based Protocol for Controlling Flapping Wing Vehicles in Indoor Surveillance Mission

Sankarasrinivasan Seshadri1, Balasubramanian Esakki1*, Lung-Jieh Yang2, Udayagiri Chandrasekhar1 and Sarasu Packiriswamy1

1Centre for Autonomous System Research, Vel Tech University, Chennai, India
2Department of Mechanical and Electromechanical Engineering, Tamkang University, Tamsui, Taiwan 251, R.O.C.

*Corresponding author. E-mail: [email protected]
This paper is an extension of the authors' technical abstract presented at the 1st International Conference on Biomimetics and Ornithopters (ICBAO-2015), held by Tamkang University, Tamsui, Taiwan, June 28-30, 2015.

Abstract

This paper presents a novel protocol for vision based flight control of flapping wing micro aerial vehicles (FWMAVs). The crux of the work is to use real time video signals from an on-board camera to control FWMAVs without any electronic sensor modules or a manual remote control system. A unique control interface is framed for indoor surveillance tasks, wherein the user commands the vehicle by providing a set of vision information to activate certain flight movements. The control interface is programmed in MATLAB, which offers user friendly graphical options to perform vision controlled tasks. An initial simulation is carried out to control the FWMAVs by interfacing MATLAB and Proteus software. The simulation tests include controlling the flapping frequency and directivity of the FWMAV as well as validating the vision based systems. In addition, real time experimentation is performed to control the FWMAV through color based signalling. The obtained results accentuate the usefulness of the developed protocol in controlling FWMAVs with a reprogrammable control interface and a light-weight camera, with a consequential positive impact on vehicle performance.

Key Words: Color Detection, FWMAVs, GUI, Transmitter Circuitry, Vision System

1. Introduction

Though the application spectrum for FWMAVs in surveillance missions is substantial, the development of suitable control protocols involves several challenges due to unstable flight characteristics and restrictions on payload. Although several control algorithms based on feedback from on-board sensors and inertial measurement units have already been reported [1-4], they entail extensive electronic modules that add significantly to the payload. Vision based navigation, by contrast, can be an effective choice for controlling FWMAVs autonomously, as it requires only a light-weight camera system. de Croon et al. [5] categorized vision based flight control into three main approaches. The first controls the attitude of MAVs with the aid of external cameras [6,7] through triangulation; this approach is limited in unknown environments without fixed cameras. The second estimates states such as 3D position, pitch, yaw and roll using a prefabricated [8] or learned 3D environment [9,10]; performing this task requires a computationally expensive system. The third employs the optical flow method [11,12], in which optical flow signals are mapped over divergent viewing directions and the corresponding steering commands are sent to the MAV. Many vision based algorithms have been developed for fixed wing and rotary wing vehicles [13-17]. Nevertheless, the control of FWMAVs through image based algorithms is still at a nascent stage [18]. de Croon et al. [5,19] performed height control of FWMAVs using two cameras attached to the system. Baek and Fearing [20] demonstrated altitude control of an ornithopter with the help of an external camera. Zufferey et al. [21,22] accomplished autonomous flight of insect-scale flyers using optic flow sensors. Hsiao et al. [23] utilized stereo-vision technology to acquire the spatial position and attitude of a FWMAV. Nelson et al. [24] developed an accurate path following MAV system using a vector field path control law; the tracking error for straight and orbital paths is found to be less than three times the wing span. Bills et al. [25] proposed a vision controlled MAV system that adopts single image perspective cue algorithms to build 3D models and navigate indoor environments. To overcome the limitations of the existing protocols, this study demonstrates a novel framework to control FWMAVs through system software modules integrated with image processing and guidance algorithms.

2. Control Interface and Transmitter System

In the present work, a dedicated graphical interface is developed for controlling the flapping frequency and directionality in manual mode. The right panel of the interface (Figure 1) is used to perform navigation based on color information by activating the autonomous mode. A display panel is also provided to view real time pictures while performing indoor surveillance tasks. To ensure communication between the ground control station and the FWMAV, an efficient hardware interface is also developed (Figure 2). The simulation system consists of an Arduino Uno, virtual COM ports and two BLDC motors. When a control signal is sent from the interface, the RXD port confirms reception and directs the microcontroller to take further action. In the manual mode of the simulation, when the user slides the throttle as shown in Figure 1, the output voltage across port 11 is varied, which changes the flapping speed. Ports 12 and 13 provide a pathway to control the flight direction, which is achieved through polarity reversal. Similarly, in the vision control mode, the same control sequences are executed based on the image information calibrated from the FWMAV. The simulation tests are carried out successfully, confirming the performance of the system in achieving the desired flapping frequency, directivity and vision based control.
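As a concrete illustration of this command path, the following MATLAB sketch sends one throttle command and one direction command over the serial link to the virtual COM port. The one-byte opcode format, port name and baud rate are assumptions made for illustration only; the paper does not publish its actual packet format.

```matlab
% Hypothetical manual-mode command path (opcodes 'T' and 'D' and the
% serial settings are assumptions; the paper's packet format is unpublished).
s = serialport("COM4", 9600);              % virtual COM port exposed by Proteus

throttle = 0.85;                           % GUI slider position, 0..1
pwmByte  = uint8(round(throttle * 255));   % duty-cycle command for port 11
write(s, [uint8('T'), pwmByte], "uint8");  % 'T' = set flapping speed

turnLeft = uint8(1);                       % 1/0 selects polarity on ports 12/13
write(s, [uint8('D'), turnLeft], "uint8"); % 'D' = set flight direction

clear s                                    % release the serial port
```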

Figure 1. Graphical user interface with manual and autonomous modes.


Figure 2. MATLAB simulation.

In the real time circuitry, the wired connections to the BLDC motors of the FWMAV are replaced with the Arduino integrated wireless transmitter system. The hardware unit shown in Figure 3 consists of the control interface installed on the PC and the Arduino controlled MAV transmitter system. The output ports of the Arduino are connected to the MAV transmitter circuit through six terminal cables, of which one pair is used for flapping frequency control and the rest for changing the direction of flight. To achieve a desired flapping frequency, a specific pulse width modulated (PWM) voltage signal is sent from the Arduino to the MAV transmitter system.
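A minimal sketch of this hardware-side actuation is given below, assuming the MATLAB Support Package for Arduino Hardware is used to drive the pins named in the text (the authors' own firmware is not published); the duty-cycle value is taken from Table 4.

```matlab
% Sketch of the actuation path, assuming the MATLAB Support Package for
% Arduino Hardware; pin names follow the port numbers given in the text.
a = arduino("COM3", "Uno");

% Flapping frequency: PWM duty on pin 11 feeds the transmitter's throttle
% channel (e.g. duty 226/255 corresponds to ~20 Hz flapping, cf. Table 4).
writePWMDutyCycle(a, "D11", 226/255);

% Flight direction: polarity reversal across pins 12 and 13.
writeDigitalPin(a, "D12", 1);
writeDigitalPin(a, "D13", 0);
% Swapping the two logic levels reverses the steering polarity.
```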

3. Vision-based Control Mode

The control of the FWMAV is effectively deployed by designing a vision based control assembly as shown in Figure 4. In the vision control mode, the control interface sends a set of flight signals based on the images acquired from the FWMAV. The captured images are subsequently processed by the image algorithms and the color of the target is identified; for each predefined target color, a corresponding actuation of the FWMAV is performed. In this study, blue is used to actuate the flapping motors and orange is used to control the direction of the vehicle. To achieve rapid color detection for enhanced navigation, the color detection algorithm is optimized in terms of detection time and sturdiness against noise interference. Typically, color detection tasks are carried out using the hue color model due to its accuracy and simplicity [26]. Other color models such as RGB, normalised RGB, HSV, YUV, YCbCr, YIQ, CIELAB and CIELUV are also explored to improve the efficiency of color detection [27], following the algorithm schematically depicted in Figure 5. The color detection time comprises the time to convert from the actual color model to the processing color model (Tc), the time to select a specific color element for thresholding (Tp), and the time to carry out the thresholding process (Tt).
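The following MATLAB sketch illustrates one pass of this detection pipeline using the YCbCr ("Y based") model ultimately selected in this section; the threshold value and the blob-area test are illustrative assumptions, not the authors' tuned parameters.

```matlab
% One pass of the color-detection pipeline of Figure 5 (illustrative;
% the threshold of 170 and the 2% area test are assumptions).
frame = imresize(imread("frame.jpg"), [240 320]);  % 320 x 240, as in the text

tic;
ycc  = rgb2ycbcr(frame);      % Tc: convert to the processing color model
cb   = ycc(:,:,2);            % Tp: pick the Cb channel (responds strongly to blue)
mask = cb > 170;              % Tt: threshold the selected channel
detectionTime = toc;          % total Tc + Tp + Tt for this frame

% Declare a detection only if the blob is large enough to be a target.
if nnz(mask) > 0.02 * numel(mask)
    % blue target present -> issue the flapping-activation command here
end
```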

Figure 3. Arduino controlled MAV transmitter system.


Figure 4. Block diagram of the vision based FWMAV control.

Figure 5. Color recognition algorithm.

The user can define any color, and the corresponding activation signal depends on the flying environment where the experiments are performed. Several experiments are conducted using real time images, standard database images and synthetic images to optimize the time for color model conversion (Table 1) and thresholding. The image and video frames are resized to 320 × 240 for convenience in processing. It can be inferred from the results (Table 2) that the Y based and RGB based color models perform most efficiently. However, the RGB color model is not used in real time because of its high sensitivity to illumination and noise [28], so this study deploys only Y based color models. Successive experiments are carried out to test the robustness of the color models against various noises, with the results given in Table 3: thresholding is performed on the test image corrupted with various possible noises, and the root mean square error is evaluated. It is evident from these results (Tables 2 and 3) that the Y based color models perform optimally with respect to time and are robust against various image noises. The color detection algorithm using the optimal color model is integrated into the control interface, which aids in enhanced processing of the real time images obtained from the FWMAV.
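A sketch of how one robustness entry of Table 3 could be produced follows: corrupt the test image, rerun the same thresholding, and score the corrupted mask against the clean one. imnoise covers the white/Gaussian cases; the fast-fading and JPEG/JP2K distortions of Table 3 would come from coded copies of the image, which are not reproduced here.

```matlab
% Reproducing one robustness measurement in the spirit of Table 3
% (Gaussian case shown; threshold 170 is the same assumption as above).
clean = imresize(imread("test.jpg"), [240 320]);
noisy = imnoise(clean, "gaussian", 0, 0.01);   % zero-mean Gaussian noise

yccClean  = rgb2ycbcr(clean);
yccNoisy  = rgb2ycbcr(noisy);
maskClean = double(yccClean(:,:,2) > 170);
maskNoisy = double(yccNoisy(:,:,2) > 170);

% Root mean square error between the clean and corrupted binary masks.
rmse = sqrt(mean((maskClean(:) - maskNoisy(:)).^2));
```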

Table 1. Color model conversion time (Tc)

Color model    Conversion time
NRGB           0.0025
HSV            0.0455
YUV            0.0133
YIQ            0.0135
YCbCr          0.0136
CIELUV         0.1213
CIELAB         0.1673

4. Experiments with FWMAVs


The experimental micro aerial vehicle with custom developed transmitter and receiver units is used for proof of concept testing (shown in Figures 6(a) to 6(d)). Experiments are conducted through user signaling with colored objects. In the present study, blue is programmed to activate the throttle, which in turn flaps the wings at the predefined frequency of the vehicle. Results pertaining to controlling the flapping frequency are shown in Table 4. To control the directionality of the vehicle, orange is used, as shown in Figure 7(b). Real time detection of the object by the camera unit mounted on the FWMAV, shown in Figure 7(c), provides the actuation to control the directionality of the vehicle. During the course of the experiments, in the manual mode, as the user slides the flapping frequency control unit, a different PWM signal is generated by the Arduino board.

Table 2. Average thresholding time for various colors

Colors    RGB     NRGB    HSV     YIQ     YUV     YCbCr   CIELAB  CIELUV
Blue      0.031   0.031   0.047   0.037   0.037   0.037   0.090   0.070
Green     0.032   0.031   0.048   0.037   0.037   0.036   0.089   0.072
Yellow    0.030   0.031   0.048   0.036   0.036   0.036   0.092   0.072
Red       0.031   0.031   0.046   0.036   0.036   0.036   0.093   0.074
Orange    0.034   0.032   0.047   0.037   0.037   0.037   0.091   0.074

Table 3. Color model sturdiness towards various noises

Noises        HSV     YIQ     YUV     YCbCr   CIELAB   CIELUV
White noise   0.088   0.088   0.089   0.084   0.1168   0.115
Gaussian      0.025   0.012   0.013   0.013   0.017    0.011
Fast fading   0.044   0.025   0.026   0.026   0.060    0.024
Jp2k          0.035   0.019   0.018   0.018   0.039    0.017
Jpeg          0.046   0.020   0.029   0.029   0.059    0.028

Figure 6. (a) Flapping wing vehicle; (b) Video receiver unit; (c) Arduino based FWMAV transmitter system; (d) Graphical control interface.


The generated PWM signal proportionally affects the input voltage to the BLDC motor used to produce the flapping motion. The experimental results provide an insight into the relation between the PWM signal, the input voltage to the BLDC motor and the flapping frequency of the FWMAV. Table 4 details these measurements from the experimental studies using the developed hardware and software modules, which can be used to control the vehicle in real time.

Table 4. Control of flapping frequency

Arduino Ref PWM (duty cycle, 0-255)   Input voltage (V)   Flapping frequency (Hz)
226                                   3.70                20
211                                   0.74                 4
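Given only the two operating points in Table 4, a rough way to choose a duty cycle for an intermediate flapping frequency is linear interpolation between them, as sketched below; assuming a linear motor response between the two measured points is, of course, only an approximation.

```matlab
% Picking a PWM duty cycle for a target flapping frequency from the two
% measured points in Table 4 (a linear response is an assumption).
dutyRef = [211 226];              % Arduino PWM counts (0..255)
freqRef = [4 20];                 % measured flapping frequencies (Hz)

targetHz = 12;                                % desired flapping frequency
duty = interp1(freqRef, dutyRef, targetHz);   % ~218.5 counts

writePWMDutyCycle(a, "D11", duty/255);        % 'a' as in the earlier sketch
```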

5. Conclusions

This paper presents a novel protocol that uses only vision sensing to control the flapping frequency and directionality of a FWMAV. The control interface incorporates manual and autonomous modes, so that the resulting FWMAVs can perform versatile tasks pertinent to indoor surveillance, path tracking or target seeking. The initial demonstration involves virtually linking the Proteus simulation circuitry with the MATLAB control interface. Experimental testing validates the proposed methodology for initiating flight and controlling directionality through image information. The developed control interface offers a flexible hardware architecture and re-programmable software modules, besides the functional advantages of reduced payload and enhanced endurance.

Acknowledgements

Financial assistance extended to this work by the GITACII program of the Department of Science and Technology (DST, India) and the Indo-Taiwan Joint Project of the Ministry of Science and Technology (MOST-102-2923-E-032-001-MY3, Taiwan) is gratefully acknowledged.

Figure 7. (a) Flapping activation; (b) Controlling directivity; (c) Detection of object.

References

[1] Deng, X., Schenato, L. and Sastry, S. S., "Flapping Flight for Biomimetic Robotic Insects: Part II - Flight Control Design," IEEE Transactions on Robotics, Vol. 22, No. 4, pp. 789-803 (2006). doi: 10.1109/TRO.2006.875483
[2] Khan, Z. A. and Agrawal, S. K., "Design and Optimization of a Biologically Inspired Flapping Mechanism for Flapping Wing Micro Air Vehicles," IEEE International Conference on Robotics and Automation, Rome, Italy, pp. 373-378 (2007). doi: 10.1109/ROBOT.2007.363815
[3] Shigeoka, K. S., "Velocity and Altitude Control of an Ornithopter Micro Aerial Vehicle," Master's thesis, The University of Utah, USA (2007).
[4] Yang, L. J., Balasubramanian, E., Chandrasekhar, U., Hung, K.-H. and Cheng, C.-M., "Practical Flapping Mechanisms for 20 cm-span Micro Air Vehicles," International Journal of Micro Air Vehicles, Vol. 7, No. 2, pp. 181-202 (2015). doi: 10.1260/1756-8293.7.2.181
[5] de Croon, G., de Clerq, K., Ruijsink, R., Remes, B. and de Wagter, C., "Design, Aerodynamics and Vision-based Control of the DelFly," International Journal of Micro Air Vehicles, Vol. 1, No. 2, pp. 71-97 (2009). doi: 10.1260/175682909789498288
[6] Mak, L. C., Whitty, M. and Furukawa, T., "A Localisation System for an Indoor Rotary-wing MAV Using Blade Mounted LEDs," Sensor Review, Vol. 28, No. 2, pp. 125-131 (2008). doi: 10.1108/02602280810856688
[7] Rudol, P., Wzorek, M., Conte, G. and Doherty, P., "Micro Unmanned Aerial Vehicle Visual Servoing for Cooperative Indoor Exploration," Proceedings of the 2008 IEEE Aerospace Conference, Montana, USA, pp. 1-10 (2008). doi: 10.1109/AERO.2008.4526558
[8] Kemp, C., Visual Control of a Quad-Rotor Helicopter, Ph.D. dissertation, Churchill College, University of Cambridge, UK (2006).
[9] Çelik, K., Chung, S.-J. and Somani, A., "Mono-vision Corner SLAM for Indoor Navigation," IEEE International Conference on Electro/Information Technology 2008, Ames, IA, pp. 343-348 (2008). doi: 10.1109/EIT.2008.4554326
[10] Ahrens, S., Vision-based Guidance and Control of a Hovering Vehicle in Unknown Environments, M.Sc. thesis, MIT, USA, June (2008). doi: 10.1109/ROBOT.2009.5152680
[11] Ruffier, F. and Franceschini, N., "Optic Flow Regulation: the Key to Aircraft Automatic Guidance," Robotics and Autonomous Systems, Vol. 50, No. 4, pp. 177-194 (2005). doi: 10.1016/j.robot.2004.09.016
[12] Barrows, G., Neely, C. and Miller, K., "Optic Flow Sensors for MAV Navigation," in Fixed and Flapping Wing Aerodynamics for Micro Air Vehicle Applications, Progress in Astronautics and Aeronautics, T. J. Mueller, Ed., AIAA, Vol. 195, pp. 557-574 (2001). doi: 10.2514/5.9781600866654.0557.0574
[13] Yu, Z.-Y., et al., "3D Vision Based Landing Control of a Small Scale Autonomous Helicopter," International Journal of Advanced Robotic Systems, Vol. 4, No. 1, pp. 51-56 (2007). doi: 10.5772/5710
[14] Todorovic, S. and Nechyba, M. C., "A Vision System for Intelligent Mission Profiles of Micro Air Vehicles," IEEE Transactions on Vehicular Technology, Vol. 53, No. 6, pp. 1713-1725 (2004). doi: 10.1109/TVT.2004.834880
[15] De Wagter, C., et al., "Autonomous Flight of a 20-gram Flapping Wing MAV with a 4-gram Onboard Stereo Vision System," IEEE International Conference on Robotics and Automation (ICRA 2014), Hong Kong, pp. 4982-4987 (2014). doi: 10.1109/ICRA.2014.6907589
[16] Moore, R. J. D., et al., "Autonomous MAV Guidance with a Lightweight Omnidirectional Vision Sensor," IEEE International Conference on Robotics and Automation (ICRA 2014), Hong Kong, pp. 3856-3861 (2014). doi: 10.1109/ICRA.2014.6907418
[17] Shen, S., Michael, N. and Kumar, V., "Autonomous Multi-floor Indoor Navigation with a Computationally Constrained MAV," IEEE International Conference on Robotics and Automation (ICRA 2011), Shanghai, pp. 20-25 (2011). doi: 10.1109/ICRA.2011.5980357
[18] Ruffier, F. and Franceschini, N., "Optic Flow Regulation: the Key to Aircraft Automatic Guidance," Robotics and Autonomous Systems, Vol. 50, No. 4, pp. 177-194 (2005). doi: 10.1016/j.robot.2004.09.016
[19] de Croon, G., de Weerdt, E., de Wagter, C. and Remes, B., "The Appearance Variation Cue for Obstacle Avoidance," IEEE International Conference on Robotics and Biomimetics (ROBIO), Tianjin, pp. 1606-1611 (2010). doi: 10.1109/ROBIO.2010.5723570
[20] Baek, S. and Fearing, R., "Flight Forces and Altitude Regulation of 12 Gram I-Bird," IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), Tokyo, Japan, pp. 454-460 (2010). doi: 10.1109/BIOROB.2010.5626347
[21] Zufferey, J.-C., Klaptocz, A., Beyeler, A., Nicoud, J.-D. and Floreano, D., "A 10-gram Microflyer for Vision-based Indoor Navigation," Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 3267-3272 (2006). doi: 10.1109/IROS.2006.282436
[22] Zufferey, J.-C., Beyeler, A. and Floreano, D., "Optic Flow to Steer and Avoid Collisions in 3D," in Flying Insects and Robots, D. Floreano, J.-C. Zufferey, M. V. Srinivasan and C. Ellington, Eds., Springer, Chapter 6, pp. 73-86 (2009). doi: 10.1007/978-3-540-89393-6_6
[23] Hsiao, F. Y., Hsu, H. K., Chen, C. L., Yang, L. J. and Shen, J. F., "Using Stereo Vision to Acquire the Flight Information of Flapping-wing MAVs," Journal of Applied Science and Engineering, Vol. 15, No. 3, pp. 213-226 (2012). doi: 10.6180/jase.2012.15.3.02
[24] Nelson, D. R., et al., "Vector Field Path Following for Miniature Air Vehicles," IEEE Transactions on Robotics, Vol. 23, No. 3, pp. 519-529 (2007). doi: 10.1109/TRO.2007.898976
[25] Bills, C., Chen, J. and Saxena, A., "Autonomous MAV Flight in Indoor Environments Using Single Image Perspective Cues," IEEE International Conference on Robotics and Automation (ICRA 2011), Shanghai, pp. 5776-5783 (2011). doi: 10.1109/ICRA.2011.5980136
[26] Fairchild, M. D., Color Appearance Models, John Wiley & Sons (2013). doi: 10.1002/9781118653128.ch16
[27] Sankarasrinivasan, S., Balasubramanian, E., Hsiao, F. Y. and Yang, L. J., "Robust Target Tracking Algorithm for MAV Navigation System," IEEE International Conference on Industrial Instrumentation and Control (ICIC 2015), Pune, pp. 269-274 (2015). doi: 10.1109/IIC.2015.7150751
[28] Cheng, H.-D., et al., "Color Image Segmentation: Advances and Prospects," Pattern Recognition, Vol. 34, No. 12, pp. 2259-2281 (2001). doi: 10.1016/S0031-3203(00)00149-7

Manuscript Received: Aug. 24, 2015 Accepted: Sep. 15, 2015