Automatic Control of a ROV for Inspection of Underwater Structures Using a Low Cost Sensing

Vinícius Nizolli Kuhn [email protected] Instituto Federal Sul-Rio-Grandense Coordenadoria de Automação Industrial 96180-000, Camaquã, RS, Brazil

Paulo Lilles Jorge Drews Jr. [email protected] Universidade Federal do Rio Grande Centro de Ciências Computacionais 96201-900, Rio Grande, RS, Brazil

Sebastião C. Pinheiro Gomes [email protected] Universidade Federal do Rio Grande Instituto de Matemática, Estatística e Física 96201-900, Rio Grande, RS, Brazil

This work deals with the implementation of automatic position and orientation control of an underwater vehicle to perform inspection tasks on submerged structures, without using knowledge of a prior dynamic model in the control law and, mainly, using low-cost minimal embedded instrumentation. This instrumentation does not employ expensive components, such as an inertial navigation unit, to determine the position and orientation of the vehicle. Instead, a computer vision system is used as a sensory source to assist the control. An image processing algorithm and a system for integrating the different sensors were developed. Experimental results using the proposed sensing show that the closed-loop control of the vehicle was suitable for conducting inspections.

Keywords: Computer vision, control, ROV, sensors, underwater robotics

Mauro André Barbosa Cunha [email protected] Instituto Federal Sul-Rio-Grandense Grupo de Pesquisa em Automação e Controle 96015-360, Pelotas, RS, Brazil

Silvia Silva da Costa Botelho [email protected] Universidade Federal do Rio Grande Centro de Ciências Computacionais 96201-900, Rio Grande, RS, Brazil

I. Introduction


Nowadays, the ocean plays a fundamental role in the global economy, mainly due to the oil extraction industry. Nevertheless, much of the underwater environment is still unknown to man, either because of its size or because of adverse environmental conditions. In these situations, unmanned underwater vehicles become an important tool because they make it possible, without risking human lives, to perform inspections, collect data, carry out construction work, install underwater structures, etc.

The inspection of underwater installations, such as power cables, telecommunications lines, and pipelines, is performed by trained professionals who, from the surface, control a ROV (Remotely Operated Vehicle) based on images captured by an embedded video camera. This is a tough, slow, tiring, and error-prone task, because it requires experience and constant operator attention to a console in order to maneuver the vehicle. Moreover, the task complexity is increased by the action of ocean currents and by the limitations of the underwater image acquisition process. Therefore, automating part of this process can constitute an important improvement with respect to errors, working time, and costs.

However, designing control laws for unmanned underwater vehicles is a hard task, mainly due to the difficulty of determining a realistic dynamic model. External disturbances, such as the forces exerted by the umbilical cable and the action of ocean currents, add extra difficulties for the control system. Underwater vehicles are also difficult to instrument: accurately determining the position and orientation of the vehicle from inertial sensors is a complicated task and can be expensive (Kuhn et al., 2011).


Research on underwater vehicles is abundant in the literature. The design and development of low-cost underwater vehicles are treated in Tumari (2008) and Calvo et al. (2009). Research on dynamic modeling and simulation of underwater vehicle control can be found in Barros (1994), Tavares (2003), Küçük (2007), Wang et al. (2009), and Wang et al. (2011). Many types of controllers have been exploited to control these vehicles, such as controllers based on neural networks (Li, 2005; Bagheri et al., 2006; Shi et al., 2007; Liang, Gan and Wan, 2010), sliding mode (Guo, Chiu and Huang, 2003; Sebastián and Sotelo, 2005; Akçakaya et al., 2009; Narimani, Nazem and Loueipour, 2009), fuzzy logic (Nagashima et al., 2002; Akkizidis et al., 2003; Aguilar, 2007; Salim, Noordin and Jahari, 2010), sensor fusion (Williams et al., 2001; Singh and Sehgal, 2006; Karras, Loizou and Kyriakopoulos, 2011), multivariable systems (Luque, 2007), variable structure (Gomes et al., 2010), etc. Furthermore, different computer vision methods have been investigated for the automatic identification of cables and pipelines in images captured by ROVs (Asif, Arshad and Yahya, 2006; Wirth, 2007; Buscariollo, 2008; Jordan, Berger and Bustamante, 2011).

Several experimental studies involving underwater vehicles have been developed in Brazil. Most of them address the development of low-cost ROVs (Barros and Soares, 2002; Moraes, 2005; Centeno, 2007; Magalhães, 2007). Other works are intended for specific tasks, such as that of Camerini et al. (2010), where an autonomous vehicle for the inspection of flexible risers in deep waters is presented. Carneiro et al. (2006) show the development of a ROV for inspection in which a video camera and a sonar sensor were used for position control. Avila (2008) and Prado (2009) contributed to the modeling and identification of hydrodynamic parameters. Barros et al. (2011) worked on the development of an AUV (Autonomous Underwater Vehicle) using low-cost sensors and actuators.


At the Federal University of Rio Grande (FURG), a low-cost ROV has been developed and improved since 2006; it is currently in its second generation, entitled ROVFURG-II. The first generation, ROVFURG, was devoted to acquiring technology for the embedded electronics and software that allow control from a joystick. The second generation focused on adding some autonomy, so as to permit the use of the vehicle in the automatic inspection of underwater structures. In practical terms, the operator drives the vehicle up to the structure to be inspected using the joystick, always watching the image coming from the onboard camera. When the operator decides that the image allows a good observation of part of the structure, control is switched to automatic mode and the whole structure may be inspected. The control law tries to maintain a constant distance between the vehicle and the structure while the vehicle follows its inspection path. This work demonstrates the feasibility of this automatic control strategy for supervising a submerged structure using low-cost minimal embedded instrumentation and computer vision, without using knowledge of a prior dynamic model in the control law. It is important to note that, at its current stage, the ROVFURG-II presents some AUV features, but it is still treated as a ROV, since its communication and power are carried through an umbilical cable.


Figure 1. Underwater vehicle: ROVFURG-II.

This vehicle has four thrusters, two assembled horizontally and two vertically, as shown in Fig. 2, which presents simplified orthogonal and perspective views.


Nomenclature

e = sampled error signal
f = focal distance
Z = distance between the target and the camera
k = distance between the lens and the focal point
Ki = integral gain
Kd = derivative gain
Kp = proportional gain
N = number of elements in the forgetting window
u = control action
Width = width of the target

Subscripts

px = relative to the image pixels
s = relative to the surge motion controller
tgt = relative to the target
y = relative to the yaw motion controller
1 = relative to the first camera parameter
2 = relative to the second camera parameter

II. Description of Experimental Apparatus

A. Underwater Vehicle

The vehicle used in this work, presented in Fig. 1, is the ROVFURG-II, developed at FURG. Detailed information about the design, development, and construction of this vehicle can be found in Moraes (2005) and Centeno (2007). The quality of the sensors used in navigation is directly related to their cost, as in the case of very efficient and expensive inertial navigation systems. In the subsequent sections of this article, it will be shown that it is possible to develop automatic control for underwater vehicle inspection tasks even with low-cost sensors. As the vehicle has passive control in two degrees of freedom (roll and pitch), only information about depth, yaw, and the distance between the ROV and the inspected structure is needed. This was the reason for the choice of sensors used in this work. Software was developed to extract the distance between the ROV and the structure from the images of an onboard camera. This information is critical for the control system, and the use of this vision system for tracking purposes represents an interesting innovation introduced in this paper.

Figure 2. Views: (a) frontal, (b) side, (c) top, (d) perspective.

The ROVFURG-II is a prototype of a ROV-type underwater vehicle, controlled by a human operator from a joystick with the help of images captured by an embedded video camera. In this way, the operator can manually control the vehicle trajectory, as well as all the functionality of the video camera. This vehicle was designed to perform only inspection tasks in the underwater environment, so it does not have any type of manipulator to perform work. The ROVFURG-II has slightly positive buoyancy, near neutral, so that a small vertical thrust applied by the thrusters easily moves the prototype. In case of a problem with the vehicle, the slightly positive buoyancy makes it rise slowly to the surface. Figure 3 presents the body reference system commonly used for underwater vehicles (Fossen, 1994). In this figure, there are six DOFs (Degrees Of Freedom): surge, sway, heave, roll, pitch, and yaw.

Figure 3. Reference system for underwater vehicles.

Another important feature of this vehicle concerns its controllability. The ROVFURG-II is underactuated, i.e., it has an insufficient number of actuators to actively control all six DOFs (it does not have side thrusters that would allow a lateral displacement of the vehicle, the sway motion). Moreover, the pitch and roll motions are controlled passively by the distance between the centers of thrust and gravity, which creates the restoring torques that keep the vehicle in a horizontal position (Tavares, 2003). Thus, the two top thrusters control only the heave motion, while the two horizontal thrusters are responsible for the surge and yaw motions. Given these features, the intention is to automatically control the surge, heave, and yaw motions (pitch and roll are controlled passively), leaving only the sway motion uncontrolled because of the absence of lateral actuators.


B. Depth Sensor

A depth sensor was incorporated into the ROVFURG-II to achieve automatic control of the heave motion. It is an integrated pressure sensor, which provides an indirect measure of depth from the hydrostatic pressure exerted by the water column on the sensing element. The sensor used is the MPX4250DP (Freescale Semiconductor Inc., 2009). For calibration purposes, tests were conducted to obtain its response curve. The test setup is sketched in Fig. 4. Note that the sensor is connected to the computer through a data acquisition system developed in Kuhn (2005).

Figure 4. Structure to obtain the response curve of the pressure sensor.

In this setup, a transparent hose was attached to the sensor. The hose has an initial stretch of small diameter that prevents the free flow of water, forming an air pocket between the sensor and the liquid; this pocket keeps water from reaching the component while still allowing the pressure reading. For the remaining length, a hose of greater diameter was used so that water could flow inside. With the setup assembled, the sensor signal was sampled with the hose empty (only enough water to form the air pocket). Next, an amount of water equivalent to a 10 cm liquid column was added to the hose and new readings were captured. This procedure was repeated every ten centimeters up to a height of 4.5 m. The experiment was repeated three times, and its results, together with the ideal (linear) response supplied by the manufacturer, are presented in Fig. 5. Based on these results, one can verify that the experimental readings vary slightly around the ideal response. These errors may be associated with factors such as inaccuracy in measuring the water column height, compression of the air pocket formed between the sensor and the liquid, errors in the analog-to-digital conversion, and errors inherent to the sensor itself. Nevertheless, as the experimental curves were close to the ideal curve, the sensor response was considered linear. Open and closed loop experimental results showed that the pressure sensor always presented an excellent response and linear behavior, independently of the relative velocity between the fluid (water) and the ROV. Its major limitation is depth, which can be at most 30 m.

Figure 5. Response curve of pressure sensor.
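To make the depth computation concrete, the sketch below converts a raw ADC reading into depth in meters using the figures quoted in this paper (10-bit converter with a 5 V reference, 18.8 mV/kPa sensitivity, roughly 9.81 kPa per meter of water column); the surface-offset calibration reading is a hypothetical placeholder, not part of the original system.

```cpp
#include <cstdint>

// Minimal sketch: convert a raw 10-bit ADC reading of the MPX4250DP into
// depth in meters, using the figures quoted in the paper (5 V reference,
// 18.8 mV/kPa sensitivity, ~9.81 kPa per meter of water column). The
// reading taken at the surface (adcAtSurface) is a hypothetical calibration
// value captured before the dive.
double depthFromAdc(std::uint16_t adcCounts, std::uint16_t adcAtSurface) {
    const double voltsPerCount = 5.0 / 1023.0;   // ~4.89 mV per ADC step
    const double voltsPerKPa   = 0.0188;         // sensor sensitivity
    const double kPaPerMeter   = 9.81;           // hydrostatic gradient of water
    const int delta = static_cast<int>(adcCounts) - static_cast<int>(adcAtSurface);
    return delta * voltsPerCount / (voltsPerKPa * kPaPerMeter);  // ~2.65 cm/step
}
```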

C. Rotation Sensor

A second sensor was incorporated into the ROVFURG-II to achieve automatic control of the yaw motion. It is a MEMS (Micro-Electro-Mechanical System) gyroscope, responsible for measuring the angular rate around the vertical axis of the vehicle. The gyroscope used is the ADXRS610 (Analog Devices Inc., 2010). It has an integrated temperature sensor whose measurement is used to correct the offset of the angular rate according to the current temperature; this correction is based on Weinberg (2009). An important feature of this component is that it provides an output voltage proportional to its angular velocity (from -300 °/s to 300 °/s). Thus, the angular position provided to the yaw control law is obtained by numerical integration of the velocity signal; as mentioned before, the temperature correction is therefore essential to reduce the drift errors caused by the integration. In order to calibrate the sensor, the gyroscope was connected to the data acquisition system developed in Kuhn (2005) and some experiments were performed. After starting the data acquisition process, the gyroscope was subjected to the following angular positions: 0°, 90°, 180°, 270°, 360°, 270°, 180°, 90°, 0°, -90°, -180°, -270°, -360°, -270°, -180°, -90°, 0°, 90°, 0°, -90°, and 0°. At the end of this sequence, the sensor was subjected to random vibrations and motions before being repositioned at 0° (the initial orientation). The captured signal was then numerically integrated to verify whether the obtained angular positions corresponded to those imposed in the experiment. The results are presented in Figs. 6-7. It can be seen that the angular position signal has small errors with respect to the right angles, but these tend to increase over time due to the numerical integration and, mainly, to the Angle Random Walk, an inherent feature of the sensor itself (Stockwell, 2003). After one minute of integration, the largest positioning error over three repetitions of this experiment was 3.45°, which is acceptable for inspection purposes but must be taken into consideration in other kinds of application. After the introduction of the temperature-dependent offset correction, the errors due to the integration decreased, allowing operation for about five minutes. Given the objective of validating the control strategy proposed in this work, the sensor was quite useful, but future work intends to investigate the use of redundant information from other sensors to extract reliable yaw-angle information and thereby allow missions of long duration.
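As an illustration of the estimation scheme just described, the sketch below integrates the rate signal after subtracting a temperature-dependent offset. The linear offset model and its coefficients are hypothetical placeholders; the paper only states that the correction follows Weinberg (2009).

```cpp
// Minimal sketch of the yaw-angle estimation: the gyroscope rate reading is
// corrected by a temperature-dependent offset and numerically integrated.
// The linear offset model and its coefficients are hypothetical placeholders.
struct YawEstimator {
    double yawDeg = 0.0;  // integrated yaw angle, in degrees

    // Hypothetical offset model, linear in temperature (from calibration).
    static double offsetDegPerSec(double tempC) {
        const double offsetAt25C = 0.12;    // placeholder value
        const double slopePerC   = -0.004;  // placeholder value
        return offsetAt25C + slopePerC * (tempC - 25.0);
    }

    // Called once per sampling period T (here 0.1 s) with the raw readings.
    void update(double rateDegPerSec, double tempC, double T = 0.1) {
        const double corrected = rateDegPerSec - offsetDegPerSec(tempC);
        yawDeg += corrected * T;  // rectangular (Euler) integration
    }
};
```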

Figure 6. Gyroscope angular velocity signal.

From the pinhole camera model (Szeliski, 2010), one obtains:

$$Width_{tgt} \cdot f = Width_{px} \cdot (Z - k) \quad (1)$$

where Width_tgt is the object width in centimeters, Width_px is the object width in the image in pixels, f is the focal length (previously known from the calibration process), Z is the actual distance from the object to the camera, and k is an unknown constant that defines the distance between the outside of the camera and the optical center. From the focal length value, the object width, and Eq. (1), it is possible to establish the distance from the object to the camera, since the focal length is obtained by camera calibration (Bouguet, 2011). As k is unknown, an experiment was performed to determine the values of the constants and to obtain a more accurate focal length, since these are critical parameters for computing the actual distance from the camera to the object. In this experiment, a pipe-type metallic structure with an external diameter of 3.2 cm was used, presented in Fig. 8. The tests were performed with the ROV submersed, so that the effects of light refraction in water were taken into account.

Figure 8. Pipe-type metallic structure.

Figure 7. Angular position signal (numerical integration).

D. Video Camera

Unlike the previous cases, the information used to control the surge motion does not come from a specific sensor, but from the images captured by the ROVFURG-II embedded video camera. This camera is a Typhoon model (Tritech International Limited, 2009); it has no movement relative to the robot, being fixed to its metal frame and facing forward, i.e., the captured images show what is in front of the vehicle. It has a resolution of 640 x 480 pixels, but in this work the images were processed at a resolution of 320 x 240 pixels, which allows real-time application due to the smaller processing time. The ROV surge motion control is based on the distance from the target structure to the vehicle. In the present system, the vertical and horizontal position of the target can only be estimated while the target is in front of the camera; if another, down-looking camera were attached to the ROV, for example, the heave motion could also be estimated. Computer vision techniques are applied to the images captured by the camera in order to identify patterns in the structure. Once the algorithm has identified the target, the distance from the ROV to it can be estimated based on its size in the captured image (number of pixels) and its actual size (previously known). The present paper considers an environment with good visibility, i.e., low turbidity and good illumination conditions, so that the method can detect the target. For future work, one intends to investigate the method proposed in Schechner and Karpel (2005) to overcome this limitation.

Initially, this structure was placed close to the video camera, perpendicular to it, and its image was captured. Next, the structure was displaced 5 cm away from the camera and the image was captured again. This procedure was performed for the distances 0, 5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 70, 80, 90, and 100 cm. The number of pixels across the structure diameter was obtained for each captured image. The results were, respectively: 90, 61, 50, 41, 34, 30, 26, 23, 21, 19, 18, 16, 15, 13, 12, 11, and 10 pixels. From Eq. (1), one obtains:

$$Z = k_1 + k_2 / Width_{px} \quad (2)$$

where k_1 = k and k_2 = f·Width_tgt. Using the experimental data and Eq. (2), which relates the real distance Z to the width measured in pixels (Width_px), the least squares method was applied to find the values of the constants k_1 and k_2. The obtained values were k_1 = -11.9 cm and k_2 = 1083.29 cm·px (where px is the number of pixels). Taking into account the calibration parameters f = 494.17 px, with a standard deviation of ±158.18 px, and Width_tgt = 3.2 cm, Eq. (1) cannot determine k_1; however, k_2 results in 1581.35 cm·px, with values within the expected range of 1075.17 to 2087.52 cm·px. The difference between the analytically determined k_2 and the experimental one is due to the high radial distortion present in the camera calibration process used to estimate the focal length. Moreover, the estimation of the real target width shows small variations that are difficult to distinguish and can affect the determination of this constant. Figure 9 shows the experimental response curve and the response curve obtained with Eq. (2). It can be observed that the curve of Eq. (2) closely approaches the experimental one, and for this reason it was adopted for the online calculation of the distance from the image captured by the camera.
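Since Eq. (2) is linear in 1/Width_px, the fit reduces to ordinary least squares. The sketch below (in C++, the language of the control software) reruns it on the distance/pixel-width pairs listed above and should reproduce values close to the reported k_1 = -11.9 cm and k_2 = 1083.29 cm·px.

```cpp
#include <cstdio>
#include <vector>

// Minimal sketch: fit Z = k1 + k2 * (1 / Width_px) by ordinary least squares
// to the seventeen distance/pixel-width pairs reported in the text.
int main() {
    const std::vector<double> Zcm = {0, 5, 10, 15, 20, 25, 30, 35, 40,
                                     45, 50, 55, 60, 70, 80, 90, 100};
    const std::vector<double> widthPx = {90, 61, 50, 41, 34, 30, 26, 23, 21,
                                         19, 18, 16, 15, 13, 12, 11, 10};
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    const int n = static_cast<int>(Zcm.size());
    for (int i = 0; i < n; ++i) {
        const double x = 1.0 / widthPx[i];  // regressor: 1 / Width_px
        sx += x; sy += Zcm[i]; sxx += x * x; sxy += x * Zcm[i];
    }
    const double k2 = (n * sxy - sx * sy) / (n * sxx - sx * sx); // slope, cm*px
    const double k1 = (sy - k2 * sx) / n;                        // intercept, cm
    // Online use, as in Eq. (2): Z = k1 + k2 / (measured width in pixels).
    std::printf("k1 = %.2f cm, k2 = %.2f cm*px\n", k1, k2);
    return 0;
}
```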


Figure 9. Video camera response.

Figure 11. The ROVFURG-II embedded electronics.

E. Actuators

The thrusters used in the ROVFURG-II are manufactured by SeaBotix, model BTD150 (Seabotix Inc., 2007). The response curve of the thruster operating together with the power stage is shown in Fig. 10; it was experimentally obtained in Centeno (2007). In this work, the nonlinearity present above 16 V in Fig. 10, which is caused by current limiting in the power stage, was not taken into account.

III. Control Software

A. Overview

The aim was to simultaneously control three movements: depth (heave), yaw, and surge. A control that does not depend on prior knowledge of the vehicle dynamic model was proposed, because some model parameters are difficult to estimate. As will be seen in the experimental results (Section IV), a PID (Proportional, Integral, and Derivative) controller for depth and surge and a PD (Proportional and Derivative) controller for yaw showed good results. The control software was written in C++. Figure 12 presents a simplified block diagram of the control software developed for the ROVFURG-II.

Figure 10. Response curve of the thruster model BTD 150.

F. Embedded Electronics

Figure 11 presents a simplified block diagram of the ROVFURG-II embedded electronics. The microcontroller is responsible for monitoring and executing the commands received through the serial port. Only read and write commands have been implemented; its function is, in fact, very restricted. For this reason, the signal processing is done in the control software installed on a computer at the surface. The microcontroller reads all embedded sensors and sends the values via the serial port to the control software. A write command carries information for the vehicle actuation devices, so upon receiving such a command, the microcontroller drives each thruster with its speed and rotation direction. During automatic control operation, the computer software sends a read command, processes the signals at the surface, and sends a write command containing the new speed values for the vehicle thrusters. This procedure is repeated continuously while a trajectory is being tracked.

Figure 12. Operation of the control software.

Before the automatic control is started, the desired reference trajectory for the vehicle must be defined. This reference must contain the surge, heave, and yaw motions (the DOFs controlled by software) defined at every instant of time during the control. Once the trajectory is defined, the vehicle can be put under automatic control. During this operation, the control software continuously exchanges information with the ROVFURG-II. First, the control application sends a read command to the vehicle, which returns the readings of the depth and rotation sensors. Then, the images captured by the video camera are processed by the block “image processing”, which provides a measure of the distance from the ROV to the target structure. From that moment, the block “signal processing and control laws” analyzes the three measured physical quantities (depth, angular rate, and distance to the structure) and the reference trajectory, generating the signals for the vehicle thrusters, which are sent via a write command. This control procedure is repeated with a fixed sampling time of 100 ms during the trajectory tracking.
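A minimal sketch of this 100 ms supervision cycle is given below, in C++ as in the original software. The I/O and controller helpers (readSensors, writeThrusters, heavePid, and so on) are hypothetical names standing in for the serial read/write commands and the control blocks described here, and the sign convention used to mix surge and yaw on the horizontal thrusters is an assumption.

```cpp
#include <chrono>
#include <thread>

// Hypothetical helpers standing in for the serial protocol, the capture card,
// and the controllers of Section III.B; each declaration is a placeholder.
struct SensorData { double depth, yaw; };
SensorData readSensors();                 // read command: depth + gyro
double readDistanceFromCamera();          // the "image processing" block
void writeThrusters(double topL, double topR, double left, double right);
double heavePid(double error);
double surgePid(double error);
double yawPd(double error);

// Minimal sketch of the supervision loop with a fixed 100 ms sampling time.
void controlLoop(double depthRef, double distRef, double yawRef) {
    using namespace std::chrono;
    const auto period = milliseconds(100);
    auto next = steady_clock::now();
    for (;;) {
        const SensorData s = readSensors();
        const double dist  = readDistanceFromCamera();

        const double uh = heavePid(depthRef - s.depth);  // top thrusters
        const double us = surgePid(distRef - dist);      // horizontal pair
        const double uy = yawPd(yawRef - s.yaw);         // horizontal pair

        // Assumed mixing: common mode drives surge, differential mode yaw.
        writeThrusters(uh, uh, us + uy, us - uy);        // write command

        next += period;
        std::this_thread::sleep_until(next);             // keep the cadence
    }
}
```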

B. Signal Processing and Control Laws

The block “signal processing and control laws” of Fig. 12 is presented in detail in Fig. 13. It can be seen that each of the input physical quantities passes through a LPF (Low Pass Filter). These filters are intended to eliminate spurious high-frequency noise that can occur during normal system operation.

The angular rate information from the gyroscope carries low-amplitude noise due to the sensor itself. An averaging filter was applied after the LPF to attenuate this noise. The filtered signal is then numerically integrated to obtain the angular position. The yaw motion controller implemented was a PD, because it showed good performance and stability in tracking constant references (suitable for inspection tasks that need to keep an object in the center of the camera image without angular deviations). The discretized mathematical expression implemented for this controller is presented in Eq. (4).

$$u_y(t_k) = Kp_y \, e_y(t_k) + Kd_y \, \frac{e_y(t_k) - e_y(t_{k-1})}{T} \quad (4)$$

where u_y(t_k) is the output signal of the yaw motion controller at the current instant, Kp_y and Kd_y are, respectively, the proportional and derivative gains of the yaw motion controller, e_y(t_k) is the yaw angle error sampled at the current instant, e_y(t_{k-1}) is the yaw angle error at the previous sampling instant, and T is the sampling time. The surge motion controller has the same algorithm as the heave motion controller, i.e., a PID with a forgetting window in the integral component. The discretized mathematical expression implemented for this controller is shown in Eq. (5).

$$u_s(t_k) = Kp_s \, e_s(t_k) + Ki_s \, T \sum_{i=k+1-N_s}^{k} e_s(t_i) + Kd_s \, \frac{e_s(t_k) - e_s(t_{k-1})}{T} \quad (5)$$

where u_s(t_k) is the output signal of the surge motion controller at the current instant; Kp_s, Ki_s, and Kd_s are, respectively, the proportional, integral, and derivative gains of the surge motion controller; e_s(t_k) is the distance-to-structure error sampled at the current instant; N_s is the number of elements in the forgetting window of the surge motion controller; the summation covers the last N_s samples of the distance error; e_s(t_{k-1}) is the distance error at the previous sampling instant; and T is the sampling time.

Figure 13. Algorithm with control laws.

The output signal of the depth controller (u_h) goes directly to the top thrusters, because they are responsible only for the heave motion of the vehicle. Although these thrusters also make a roll motion possible, it is not exploited, because roll is already controlled passively by the action of the restoring torques. Unlike the top thrusters, the horizontal thrusters are driven by a combination of the signals from two controllers: surge motion (u_s) and yaw motion (u_y). These signals need to be combined because the horizontal thrusters are responsible for both movements. The heave motion control is performed by a PID controller, in which a forgetting window was used in the integral component (Gomes and Bier, 1998). This window consists of evaluating only the last N elements in the sum of integral-component errors. The discretized mathematical expression implemented for this controller is shown in Eq. (3).

$$u_h(t_k) = Kp_h \, e_h(t_k) + Ki_h \, T \sum_{i=k+1-N_h}^{k} e_h(t_i) + Kd_h \, \frac{e_h(t_k) - e_h(t_{k-1})}{T} \quad (3)$$

where u_h(t_k) is the output signal of the heave motion controller at the current instant; Kp_h, Ki_h, and Kd_h are, respectively, the proportional, integral, and derivative gains of the heave motion controller; e_h(t_k) is the depth error sampled at the current instant; N_h is the number of elements in the forgetting window of the heave motion controller; the summation covers the last N_h samples of the depth error; e_h(t_{k-1}) is the depth error at the previous sampling instant; and T is the sampling time.

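A minimal C++ sketch of the PID law with a forgetting window, Eqs. (3) and (5), is given below; the gains, window length, and sampling time are placeholders to be tuned for each degree of freedom, and the PD controller of Eq. (4) is recovered as the special case Ki = 0.

```cpp
#include <deque>

// Minimal sketch of the discretized PID with a forgetting window: only the
// last N error samples contribute to the integral term, as in Eqs. (3), (5).
class ForgettingWindowPid {
public:
    ForgettingWindowPid(double kp, double ki, double kd, int n, double T)
        : kp_(kp), ki_(ki), kd_(kd), T_(T), n_(n) {}

    // One control step: e is the sampled error e(t_k); returns u(t_k).
    double update(double e) {
        window_.push_back(e);
        if (static_cast<int>(window_.size()) > n_) window_.pop_front();
        double sum = 0.0;
        for (const double ei : window_) sum += ei;  // last N error samples
        const double u = kp_ * e + ki_ * T_ * sum + kd_ * (e - prevE_) / T_;
        prevE_ = e;
        return u;
    }

private:
    double kp_, ki_, kd_, T_;
    int n_;
    double prevE_ = 0.0;
    std::deque<double> window_;
};
// Example with placeholder gains: ForgettingWindowPid heave(2.0, 0.5, 1.0, 50, 0.1);
// The PD of Eq. (4) is the special case ki = 0 (the window is then irrelevant).
```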

C. Image Processing

The OpenCV library was used for image manipulation, because it already has many efficient algorithms written in C/C++ for capturing images and recognizing patterns, allowing image processing in real time. A complete description of the library functions can be found in Bradski and Kaehler (2008). The target structure used in the experiments is the same one used for the calibration of the video camera, i.e., a pipe-type metallic structure. In this way, Eq. (2) can be used to convert the diameter of the structure in the images (in pixels) into the real distance from the vehicle to the structure. It should also be noted that this algorithm was applied to the images captured by the ROVFURG-II camera. In this particular case, the images were taken from underwater inspections performed during the daytime in a controlled environment, with features that reduce the complexity of the structure identification: the water has low turbidity, the natural lighting combined with the shallow depth is sufficient to obtain good visibility in the underwater images, the target structure has not had its contours changed by the natural action of the marine environment, etc. Figure 14 presents the details of the block “image processing” of Fig. 12, i.e., the algorithm used to analyze the images captured by the ROVFURG-II camera. The changes to the image throughout the processing are presented in Fig. 15.


Figure 14. Image processing algorithm.


Figure 15. Image processing states: (a) original image resized, (b) grayscale, (c) edge detection, (d) dilate, (e) parallel lines.

Initially, the camera images are received by a video capture card. Once an image is available, it is resized to 320 x 240 pixels using bilinear interpolation. The resized image is then converted to grayscale, losing its color characteristics, since this work makes no assumption about the color of the object to be tracked, only about its shape. These two initial procedures significantly reduce the computational cost and are essential so that the control software does not compromise system performance. Figure 15(a) shows the resized image and Fig. 15(b) the resized image after grayscale conversion. The next step is to apply an edge detector; in this case, the Canny algorithm (Canny, 1986) was applied to the grayscale images. This operator produces as output an image with the positions of the detected intensity discontinuities (Lima et al., 2008). The result of this processing is shown in Fig. 15(c). A morphological dilation operator is then applied to highlight the edges found by Canny in order to assist the next step, line detection. Figure 15(d) presents the image resulting from this operator. After the highlighting by the dilation operator, the Hough transform (Duda and Hart, 1972) is applied to detect lines. In this process, pixels are converted into lines, and the intersection points between them are found and grouped into line segments (Pedrini and Schwartz, 2007). From this information, the longest line segments that are parallel to each other are searched for (the structure contours) and two parallel lines are drawn along them. These two lines can be seen in Fig. 15(e). The distance between these two parallel lines corresponds to the external diameter of the pipe-type metallic structure; thus, the distance from the structure to the vehicle can be obtained from Eq. (2). In case of failure in the structure identification, the distance measurement is not updated, keeping the last valid value identified. This algorithm does not foresee situations in which the structure is out of the camera's sight: in these cases, the control system must be switched to manual mode. As stated before, the system was designed for a specific situation; for other structure types and other underwater environments, modifications and adjustments to the image processing program are clearly needed. In more complex applications, Kalman filters are commonly used to estimate the structure position (Karras, Loizou and Kyriakopoulos, 2011; Asif, Arshad and Yahya, 2006), as are other image processing methods, such as particle filters (Wirth, 2007) and nonlinear control (Narimani, Nazem and Loueipour, 2009). However, these techniques were not applied here due to their computational cost and the difficulty of adjusting their parameters.
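A compact OpenCV (C++) sketch of this pipeline is shown below. The Canny thresholds, the dilation kernel, and the Hough parameters are illustrative guesses rather than the values used in the paper, and the selection of the two longest parallel segments is left as a hypothetical helper, since that step is not detailed.

```cpp
#include <vector>
#include <opencv2/opencv.hpp>

// Hypothetical helper: pick the two longest, nearly parallel segments (the
// pipe contours) and return the distance between them, in pixels.
double selectParallelPairDistance(const std::vector<cv::Vec4i>& segments);

// Minimal sketch of the pipeline: resize -> grayscale -> Canny -> dilate ->
// probabilistic Hough transform. Parameter values are illustrative guesses.
double pipeDiameterPx(const cv::Mat& frame) {
    cv::Mat resized, gray, edges;
    cv::resize(frame, resized, cv::Size(320, 240), 0, 0, cv::INTER_LINEAR);
    cv::cvtColor(resized, gray, cv::COLOR_BGR2GRAY);
    cv::Canny(gray, edges, 50, 150);              // edge detection
    cv::dilate(edges, edges, cv::Mat());          // highlight the edges
    std::vector<cv::Vec4i> segments;
    cv::HoughLinesP(edges, segments, 1, CV_PI / 180, 40, 30, 10);
    return selectParallelPairDistance(segments);  // pipe diameter in pixels
}
// The result feeds Eq. (2): Z = k1 + k2 / pipeDiameterPx(frame).
```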

IV. Experimental Results

A. Considerations

The primary purpose of the experimental results was to prove the ability of the automatic control to inspect the proposed underwater structure, i.e., to maintain the structure in the center of the image (yaw control), to keep the distance between the ROV and the structure constant (surge control), and to follow depth paths so as to scan the whole submerged structure (depth or heave control). Therefore, in yaw and surge the position reference signals are constant (step response), while in depth (heave) a constant speed was assigned as the reference signal. These experimental tests with the ROVFURG-II were performed during the daytime in an indoor pool 1.4 m deep. This means that the vehicle was not subject to the environmental disturbances represented by waves, ocean currents, and winds; the traction effects of the umbilical cable are thus considerably decreased and do not add extra difficulties to the control system. The target structure is the same one used for the development and adjustment of the image processing algorithm, i.e., a pipe-type metallic structure with a 3.2 cm external diameter; thus, Eq. (2) was used to obtain the distance from this structure to the vehicle. It should be remembered that the nonlinearity in the thruster response curve was not considered (Fig. 10), so that each actuator produces a maximum thrust of 10.67 N when subjected to 19.2 V. The processing time of the information from the sensors and the video camera images is variable; however, it remained lower than the fixed sampling time (100 ms) and did not influence the control system performance. The experiment presented was repeated five times, producing similar results.

B. Reference Trajectory

The choice of a reference trajectory containing the surge, heave, and yaw motions took into account two characteristics of the ROVFURG-II: the non-controllability of the sway motion (it has no side thrusters) and the fixed position of the video camera (facing forward). Therefore, a reference with a vertical motion was defined for tracking the pipe-type metallic structure. The depth reference was a ramp with a slope of 5 cm/s, so that the vehicle travels 50 cm in 10 s and then returns to its initial position at 20 s. This trajectory is repeated continuously, always with a constant speed of 5 cm/s. In order to maintain the target structure at the center of the image during the inspection, the vehicle should not have angular variations around its vertical axis. For this reason, the yaw angle reference signal is constant, i.e., the angular position is assumed to be 0° and the control law acts to keep this angle null. The reference signal for surge, the distance from the ROV to the target structure, follows the same principle as the yaw angle: at the moment the automatic control is activated, the distance signal is captured and, thereafter, used as a constant reference signal.
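The depth set-point is therefore a triangular wave. The sketch below generates it from the figures above (5 cm/s, 50 cm excursion, 20 s period); the downward-positive depth convention is an assumption.

```cpp
#include <cmath>

// Minimal sketch of the triangular depth reference: descend 50 cm at 5 cm/s
// (10 s), return at the same speed (back at t = 20 s), repeat indefinitely.
// Depth is assumed positive downwards.
double depthReference(double tSeconds, double initialDepthM) {
    const double speed = 0.05;                         // 5 cm/s, in m/s
    const double half  = 10.0;                         // seconds per leg
    const double phase = std::fmod(tSeconds, 2.0 * half);
    const double offset = (phase < half) ? speed * phase
                                         : speed * (2.0 * half - phase);
    return initialDepthM + offset;
}
```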


It should be emphasized that, while tracking the reference trajectory, the vehicle may suffer a lateral shift (sway) due to water movement, traction of the umbilical cable, or another type of disturbance. However, this drift cannot be corrected, owing to the absence of lateral actuators on the vehicle.

C. Control System Response

Figures 16-23 show the movements described by the vehicle under the control action. In Figs. 20-23, the instants 'a', 'b', and 'c', highlighted by dashed lines, indicate the moments when manual disturbances were applied to the target structure, bringing it closer to or moving it away from the vehicle. The heave motion described by the vehicle, together with the depth reference signal, can be seen in Fig. 16. The depth error signal and the thrust developed by the top thrusters are shown in Figs. 17-18, respectively.

Figure 19. Yaw angle.

Figure 16. Depth of the vehicle (heave motion).

Figure 20. Distance from the vehicle to the structure.

Figure 17. Depth error of the vehicle.

Figure 21. Distance error (from the vehicle to the structure).

Figure 18. Thrust of the top actuators.

Figure 22. Actuator left thrust.


Based on these results, the difficulty of the controller in keeping the vehicle on the specified trajectory becomes clear, especially at the moments when the movement direction is inverted (10 s, 20 s, 30 s, ...), since at those instants it has to deal with the water flow in the opposite direction. As can be observed in Fig. 17, the position error at these critical points is approximately 8 cm, but remains around 4 cm over the rest of the trajectory. The top thrusters operate with up to 50% of their maximum power at the instants of change in the movement direction, as can be seen in Fig. 18. Note that the motion described by the ROV along the ramp reveals the quantization of the depth signal, i.e., the levels resulting from the signal digitization (Tocci, Widmer and Moss, 2007). This occurs due to the low resolution of the microcontroller's analog-to-digital converter (10 bits) and the sensitivity of the depth sensor (18.8 mV/kPa). Since a 10-bit resolution spans 1023 voltage steps, the minimum variation perceived by the converter is 4.89 mV (5 V/1023); therefore, the smallest detectable depth change is approximately 2.65 cm. This variation can be seen in Fig. 16. Given this limitation, it can be concluded that the results obtained are acceptable. The response curve of the yaw motion, obtained by integrating the angular velocity signal of the gyroscope, is shown in Fig. 19. In this figure, one can note that even after one minute of integration the greatest yaw angle error remains less than 4°, a satisfactory result for underwater inspection, since this drift is not visually perceptible. The thrusts developed by the left and right horizontal thrusters are presented in Figs. 22-23, respectively. The third controlled DOF, surge, is based on the images captured by the video camera. The distance is shown in Fig. 20 and the corresponding error signal can be seen in Fig. 21. The output of the surge motion controller is sent to the horizontal actuators (Figs. 22-23). In order to verify the response of the image processing algorithm, manual disturbances were applied to the target structure: the target was moved away from the robot at the instants 'a' and 'c' and moved closer at the instant 'b'. In Fig. 21, the distance error before the first disturbance and a few seconds after the last oscillates between 2.5 cm and 5 cm, which is a good result for underwater inspection. This error could be reduced if the resolution of the images captured by the video camera were increased to 640 x 480 pixels, but that would increase the computational cost and its practical viability would have to be verified. The control system responded well to the imposed disturbances, taking on average 5 s to reestablish the initial reference position. The system response as a whole was quite satisfactory, but it is probable that a greater refinement of the controller parameters could further improve system performance.

Figure 23. Actuator right thrust.

D. Image Processing Algorithm Response

Figure 24 presents a sequence of images captured by the video camera and processed by the structure identification algorithm throughout the control of the three DOFs (Figs. 16-23). The frames presented are spaced one second apart; the first refers to the instant 0 s and the last to 20 s. In this experiment the structure was not positioned perfectly vertically, so it appears with a degree of inclination in the images.

Figure 24. Images processed by the structure identification algorithm.

At certain times, the algorithm cannot identify the structure because of a blurring effect in the images. When this happens, the width information is not updated, keeping the last valid value identified. This procedure implies a delayed controller response for each consecutive image in which the structure contours cannot be identified. Since the sampling time of the system was set at 100 ms and the automatic control in this experiment ran for 70 s, a total of 700 images were captured and processed. Of this total, the structure identification algorithm failed in only 48 frames, i.e., the success rate was 93.1%, a good figure for the algorithm implemented. Moreover, there were at most five consecutive failures of structure identification, which generates a maximum delay of 500 ms in the surge motion controller response, not enough to compromise the system performance given its slow dynamics. The target structure remains centered in the image for almost the whole inspection; a small shift to the right is perceived, but it was caused by bringing the target structure closer to the vehicle. The distance from the camera to the structure (target size in the image) is preserved throughout the experiment, even with the movements away from and toward the structure used to simulate disturbances.


V. Conclusions

This paper presented results of the position and orientation control of a ROV performing automatic inspection tasks on a submerged structure using low-cost minimal embedded instrumentation, dispensing with the use of an inertial navigation unit. Good performance of the automatic control was achieved thanks to an image processing algorithm developed to extract a position measurement and a system that integrates a camera and various sensors. A pressure sensor was used to measure the vertical position, i.e., the vehicle depth. The angular position around the vertical axis of the ROV, the yaw angle, was controlled from the signal provided by a gyroscope. In addition to the information from the sensors, the images captured by the embedded video camera were processed by an algorithm that provides the distance from the target structure to the vehicle for the surge motion. Experiments were performed to obtain the response curves and the calibration of the sensors and video camera. Graphically, it was observed that the results had small errors in the tracking of the reference trajectory, but in general these were visually imperceptible and are considered satisfactory for inspection operations with the ROVFURG-II. It is, however, quite probable that a greater refinement of the controller parameters could further improve system performance. Future work will involve the control of other degrees of freedom of the ROVFURG-II, using the signals from the sensors and the processing of the images captured by the embedded video camera. To this end, side thrusters will be installed to allow the vehicle to develop the sway motion. Efforts will also be made to test the inclusion of a magnetic compass sensor for measuring the yaw angle and to use this information to solve the problem of error accumulation due to the integration of the gyroscope signal. Regarding the computer vision method, future directions will focus on dealing with water turbidity and distortions, since an improvement in image quality is necessary to use the system in oceanic environments.

References

Aguilar, L. L. P., 2007, “Controle inteligente para a navegação de veículos submarinos semi-autônomos” (In Portuguese), M.S. thesis, Dept. Eng., Polytechnic School of Univ. of São Paulo, São Paulo, Brazil.
Akçakaya, H., Yildiz, H. A., Sağlam, G. and Gürleyen, F., 2009, “Sliding mode control of autonomous underwater vehicle”, Int. Conf. Elect. and Electron. Eng., Bursa, Turkey.
Akkizidis, I. S., Roberts, G. N., Ridao, P. and Batlle, J., 2003, “Designing a fuzzy-like PD controller for an underwater robot”, Control Eng. Practice, Vol.11, pp. 471-480.
Analog Devices Inc., 2010, “ADXRS610: ±300 °/sec Yaw Rate Gyro”, Norwood, MA.
Asif, M., Arshad, M. R. and Yahya, A., 2006, “An active contour for underwater target tracking and navigation”, Proc. Int. Conf. Man-Machine Syst., Langkawi, Malaysia.
Avila, J. P. J., 2008, “Modelagem e identificação de parâmetros hidrodinâmicos de um veículo submarino” (In Portuguese), Ph.D. thesis, Dept. Mech. Eng., Univ. of São Paulo, São Paulo, Brazil.
Bagheri, A., Amanifard, N., Karimi, T., Farahani, M. H. and Besarati, S. M., 2006, “Adaptive neural network control of an underwater remotely operated vehicle (ROV)”, Proc. 10th WSEAS Int. Conf. Comput., Athens, Greece, pp. 614-619.
Barros, E. A. de, 1994, “A Cooperative Control System and Its Application to the Collision Avoidance Guidance of Autonomous Underwater Vehicles”, Ph.D. thesis, Univ. of Tokyo, Tokyo, Japan.
Barros, E. A. de and Soares, F. J. A., 2002, “Desenvolvimento de um Robô Submarino de Baixo Custo” (In Portuguese), 14th Brazilian Congr. of Automatic, Natal, Brazil, pp. 2121-2126.
Barros, E. A. de, Dantas, J. D., Freire, L. O., Zanoni, F. D., Oliveira, L. M. and Vale, R. T. S., 2011, “New Aspects in the Pirajuba AUV Project”, 21st Brazilian Congr. of Mech. Eng., Natal, Brazil.
Bouguet, J.-Y., 2011, “Camera Calibration Toolbox for Matlab”, California Institute of Technology, Pasadena, CA. [Online]. Available: http://www.vision.caltech.edu/bouguetj/calib_doc
Bradski, G. and Kaehler, A., 2008, “Learning OpenCV: Computer Vision with the OpenCV Library”, Ed. O'Reilly Media, New York.
Buscariollo, P. H., 2008, “Sistema de posicionamento dinâmico baseado em visão computacional e laser” (In Portuguese), Ph.D. thesis, Dept. Eng., Polytechnic School of Univ. of São Paulo, São Paulo, Brazil.
Calvo, O., Sousa, A., Bibiloni, J., Curti, H., Acosta, G. and Rozenfeld, A., 2009, “Low-cost autonomous underwater vehicle for underwater acoustic inspections”, J. of Maritime Research, Vol.6, No.2, pp. 37-52.
Camerini, C., Freitas, M., Langer, R. A., Weid, J. P. and Marnet, R., 2010, “Autonomous Underwater Riser Inspection Tool”, Proc. 8th Int. Pipeline Conf., Calgary, Canada, pp. 1-7.
Canny, J., 1986, “A computational approach to edge detection”, IEEE Trans. Pattern Anal. Mach. Intell., Vol.8, No.6, pp. 679-698.
Carneiro, R. F., Leite, A. C., Peixoto, A. J., Goulart, C., Costa, R. R., Lizarralde, F. and Hsu, L., 2006, “Underwater robot for tunnel inspection: design and control”, Proc. 12th Latin-American Congr. on Automatic Control, Salvador, Brazil.
Centeno, M. L., 2007, “ROVFURG-II: Projeto e construção de um veículo subaquático não tripulado de baixo custo” (In Portuguese), M.S. thesis, Dept. Ocean Eng., Univ. Federal of Rio Grande, Rio Grande, Brazil.
Duda, R. O. and Hart, P. E., 1972, “Use of the Hough transformation to detect lines and curves in pictures”, Comm. ACM, Vol.15, No.1, pp. 11-15.
Fossen, T. I., 1994, “Guidance and Control of Ocean Vehicles”, Ed. John Wiley & Sons, Chichester, England.
Freescale Semiconductor Inc., 2009, “Integrated Silicon Pressure Sensor On-Chip Signal Conditioned, Temperature Compensated and Calibrated: MPX4250 Series”, 7th ed., Tempe, AZ.
Gomes, S. C. P. and Bier, C. C., 1998, “Estudo sobre trajetórias de controle para robôs manipuladores” (In Portuguese), 12th Brazilian Congr. of Automatic, Uberlândia, Brazil.
Gomes, S. da S., Zeilmann, A. P., Terres, M. A. de S. M., Soares, L. B. and Gomes, S. C. P., 2010, “Controle de um veículo subaquático baseado em estrutura variável” (In Portuguese), 4th Seminar and Workshop on Ocean Eng., Rio Grande, Brazil.
Guo, J., Chiu, F.-C. and Huang, C.-C., 2003, “Design of a sliding mode fuzzy controller for the guidance and control of an autonomous underwater vehicle”, IEEE J. Ocean. Eng., Vol.30, pp. 2137-2155.
Jordan, M. A., Berger, C. and Bustamante, J. L., 2011, “Design of a Vision-Based Sensor of Position and Rate for the Guidance of Autonomous Underwater Vehicles”, Argentine Institute of Oceanography, Bahía Blanca, Argentina. [Online]. Available: http://www.iadoconicet.gob.ar/archivos/Vision-Based-Sensor.pdf
Karras, G. C., Loizou, S. G. and Kyriakopoulos, K. J., 2011, “Towards semi-autonomous operation of under-actuated underwater vehicles: sensor fusion, on-line identification and visual servo control”, Autonomous Robots, Vol.31, pp. 67-86.
Küçük, K., 2007, “Modeling and Motion Simulation of an Underwater Vehicle”, M.S. thesis, Dept. Mech. Eng., Middle East Technical Univ., Ankara, Turkey.
Kuhn, V. N., 2005, “Desenvolvimento de um sistema microcontrolado para implementação do modo teach em um robô: hardware e software” (In Portuguese), Technology Ind. Automation, Federal Center of Technological Education of Pelotas, Pelotas, Brazil.
Kuhn, V. N., Drews Jr, P. L. J., Gomes, S. C. P., Cunha, M. A. B. and Botelho, S. C., 2011, “Controle automático de um ROV para inspeção de estruturas submersas” (In Portuguese), 10th Brazilian Symposium on Intelligent Automation, São João del-Rei, Brazil.
Li, J. H., 2005, “A neural network adaptive controller design for free-pitch-angle diving behavior of an autonomous underwater vehicle”, Robotics and Autonomous Syst., Vol.52, pp. 132-147.
Liang, X., Gan, Y. and Wan, L., 2010, “Motion controller for autonomous underwater vehicle based on parallel neural network”, Int. J. of Digital Content Technology and its Applicat., Vol.4, No.9, pp. 61-67.
Lima, J. P. S. do M., Silva, D. da, Teichrieb, V. and Kelner, J., 2008, “Reconhecimento de padrões em tempo real utilizando a biblioteca OpenCV” (In Portuguese), 5th Workshop Virtual and Augmented Reality, Bauru, Brazil.
Luque, J. C. C., 2007, “Controle robusto multivariável para um veículo submersível autônomo” (In Portuguese), M.S. thesis, Dept. Mech. Eng., Polytechnic School of Univ. of São Paulo, São Paulo, Brazil.
Magalhães, P. H. V., 2007, “Desenvolvimento de um Submersível Remotamente Operado de Baixo Custo e Caracterização dos Sistemas de Propulsão e Vetorização de Empuxo por Hélice” (In Portuguese), Ph.D. thesis, Dept. Mech. Eng., Univ. Federal of Minas Gerais, Belo Horizonte, Brazil.
Moraes, C. E. M., 2005, “ROVFURG-I: Projeto e construção de um veículo subaquático não tripulado de baixo custo” (In Portuguese), M.S. thesis, Dept. Ocean Eng., Univ. Federal of Rio Grande, Rio Grande, Brazil.
Nagashima, Y., Taguchi, N., Ishimatsu, T. and Mizokami, T., 2002, “Development of a compact autonomous underwater vehicle using variable vector propeller”, Proc. 12th Int. Offshore and Polar Eng. Conf., Kitakyushu, Japan, pp. 290-294.
Narimani, M., Nazem, S. and Loueipour, M., 2009, “Robotics vision-based system for an underwater pipeline and cable tracker”, IEEE OCEANS'09, Bremen, Germany.
Pedrini, H. and Schwartz, W. R., 2007, “Análise de Imagens Digitais: Princípios, Algoritmos e Aplicações” (In Portuguese), Ed. Thomson Learning, São Paulo, Brazil.
Prado, A. de A., 2009, “Metodologia experimental para obtenção dos parâmetros hidrodinâmicos do VSNT JAÚ II, baseado em processamento digital de imagens” (In Portuguese), M.S. thesis, Dept. Eng., Polytechnic School of Univ. of São Paulo, São Paulo, Brazil.
Salim, M. A., Noordin, A. and Jahari, A. N., 2010, “A robust of fuzzy logic and proportional derivative control system for monitoring underwater vehicles”, 2nd Int. Conf. Comput. Research and Develop., Kuala Lumpur, Malaysia.
Schechner, Y. Y. and Karpel, N., 2005, “Recovery of underwater visibility and structure by polarization analysis”, IEEE J. Ocean. Eng., Vol.30, No.3, pp. 570-587.
Seabotix Inc., 2007, “Standard Thruster & 2 wire whip: BTD150 Specifications”, San Diego, CA.
Sebastián, E. and Sotelo, M. A., 2005, “Adaptive fuzzy sliding mode controller for the snorkel underwater vehicle”, Proc. 2nd Int. Conf. Informatics Control, Automation and Robotics, Barcelona, Spain, pp. 255-259.
Shi, Y., Qian, W., Yan, W. and Li, J., 2007, “Adaptive depth control for autonomous underwater vehicles based on feedforward neural networks”, Int. J. of Comput. Sci. & Applicat., Vol.4, pp. 107-118.
Singh, P. and Sehgal, A., 2006, “Computer vision, sonar and sensor fusion for autonomous underwater vehicle (AUV) navigation”, CIENCIA'06, Hyderabad, India.
Stockwell, W., 2003, “Angle Random Walk”, Crossbow Technology Inc., Milpitas, CA. [Online]. Available: http://www.xbow.com/pdf/AngleRandomWalkAppNote.pdf
Szeliski, R., 2010, “Computer Vision: Algorithms and Applications”, Ed. Springer, New York.
Tavares, A. M., 2003, “Um estudo sobre modelagem e o controle de veículos subaquáticos não tripulados” (In Portuguese), M.S. thesis, Dept. Ocean Eng., Univ. Federal of Rio Grande, Rio Grande, Brazil.
Tocci, R. J., Widmer, N. S. and Moss, G. L., 2007, “Sistemas Digitais: Princípios e Aplicações” (In Portuguese), 10th ed., Ed. Pearson Prentice Hall, São Paulo, Brazil.
Tritech International Limited, 2009, “Typhoon: Colour Zoom Camera”, Aberdeen, United Kingdom.
Tumari, M. Z. B. M., 2008, “The development of a low cost underwater vehicle”, Bachelor of Eng., Univ. Tech. Malaysia, Skudai, Malaysia.
Wang, B., Wan, L., Xu, Y.-R. and Qin, Z.-B., 2009, “Modeling and simulation of a mini AUV in spatial motion”, J. Marine Sci. Appl., Vol.8, pp. 7-12.
Wang, S.-X., Sun, X.-J., Wang, Y.-H., Wu, J.-G. and Wang, X.-M., 2011, “Dynamic modeling and motion simulation for a winged hybrid-driven underwater glider”, China Ocean Eng., Vol.25, No.1, pp. 97-112.
Weinberg, H., 2009, “Calibrating iMEMS Gyroscopes”, Analog Devices Inc., Norwood, MA. [Online]. Available: http://www.analog.com/static/imported-files/application_notes/AN1049.pdf
Williams, S. B., Newman, P., Rosenblatt, J., Dissanayake, G. and Durrant-Whyte, H., 2001, “Autonomous underwater navigation and control”, Robotica, Vol.19, No.5, pp. 481-496.
Wirth, S., 2007, “Visual underwater cable/pipeline tracking”, M.S. thesis, Dept. Comput. Sci., Koblenz-Landau Univ., Koblenz, Germany.