Modified Hopfield Neural Network Computational Technique For Real-Time Fusion of Multimode Radar/SAR Imagery

Yuriy V. Shkvarko*, Stewart R. Santos*, José Tuxpan*, Eduardo Espadas
*CINVESTAV del IPN, Unidad Guadalajara, Av. del Bosque, Col. El Bajío, Zapopan, Guadalajara, Jalisco 45019, MEXICO
email: [email protected]

Abstract: We address a new approach to improving the quality of remote sensing (RS) imagery obtained with multimode imaging radar/SAR systems that employ different image formation methods, via collaborative RS image/method fusion. The collaborative considerations involve adaptive adjustment of the user-controllable regularization degrees of freedom in a particular image formation scheme. We develop a Hopfield neural network-adapted computational methodology for performing such data fusion, employing the recently developed descriptive experiment design regularization (DEDR) framework aggregated with the variational analysis (VA) image enhancement approach. The addressed modified maximum entropy neural network (MENN) technique performs the collaborative reconstruction-fusion task in a computationally efficient fashion, ensuring on-line dynamic updates only of the higher quality information from the input multimode image frames. The reported simulations verify that the developed DEDR-VA optimal MENN fusion technique outperforms the recently proposed iterative enhanced radar/SAR imaging methods both in the achievable resolution enhancement and in the convergence rate.

1. Introduction

The increasing capability of co-registered multi-sensor remote sensing (RS) imagery has spurred the development of various system/method fusion techniques for enhancing the RS images acquired with multimode imaging radar/SAR systems, as required for end-user-oriented environmental resource management; see, e.g., the new Bayesian experiment design (BED) framework developed in the companion paper [1] of this series and the references therein. Modern neural network (NN) approaches are well suited for solving optimization problems such as high-resolution enhancement of RS imagery, image restoration, and change detection [2]-[6]. To design the NN-oriented fusion framework, we follow the recently developed descriptive experiment design regularization (DEDR) method [7], [8] aggregated with the variational analysis (VA) image enhancement techniques [1], [9]. In the unified DEDR-VA framework, the multimode RS information fusion implies minimization of an augmented NN energy function that incorporates, in general, two metrics: (i) a maximum entropy (ME) a priori term composed as a "clique" function containing the a priori knowledge about the RS image space structure, and (ii) a composite "penalty" or data agreement term that depends on the closeness of the actually acquired multimode RS image frames to the desired fused (synthetic) RS image.

2. Image Restoration Model

In RS imaging with different sensor systems, the model most often used expresses the degraded image formed by a system as the sum of a noise vector and the linear convolution of the original image vector with the system's spatial response function. The latter is usually referred to as the point-spread function (PSF) of the image formation system [1], [8]. The power spatial spectrum pattern (SSP) of the scattered (radiated) signal field is referred to as the original image of the environment [7]. The noise vector accounts for the power components in the degraded image that correspond to system noise and environmental noise and are statistically independent of the image. Consider now P different degraded images {q^(p); p = 1, …, P} of the same original scene image b obtained with P different systems or methods. System or method fusion can be employed in these cases to improve the quality of the images. In the system fusion problem, we associate P different models of the PSF with the corresponding image formation systems. In the method fusion problem, we assume one given system but apply P different image formation algorithms to form the images {q^(p); p = 1, …, P}, e.g., from the DEDR/BED family developed in the companion paper [1]. In both cases, we have the system of P equations

q^(p) = Φ^(p) b + ν^(p);  p = 1, …, P,    (1)

with P different PSFs {Φ^(p)} and different noise vectors {ν^(p); p = 1, …, P}, respectively. The problem of system or method fusion is considered as a composite inverse problem of restoration of the original image b from the P actually formed degraded images {q^(p)}, given the systems' PSFs {Φ^(p)}. No prior knowledge about the statistics of noise in the data is implied; thus, the ME prior model uncertainty conventional for practical RS problems is assumed.
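To make the multi-observation model (1) concrete, the following Python sketch simulates P degraded observations of a common scene. The blur operators, noise level, and scene size are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

def simulate_observations(b, psf_list, noise_sigma=0.05, seed=0):
    """Form P degraded images q^(p) = Phi^(p) b + nu^(p), as in Eq. (1).

    b        : original scene image (SSP) as a flat K-vector
    psf_list : list of K x K PSF matrices Phi^(p), one per system/method
    """
    rng = np.random.default_rng(seed)
    frames = []
    for phi in psf_list:
        noise = noise_sigma * rng.standard_normal(b.shape)   # nu^(p)
        frames.append(phi @ b + noise)                       # q^(p)
    return frames

# Toy usage with two hypothetical Gaussian blur operators on a 1-D "scene"
K = 64
b0 = np.zeros(K); b0[20] = 1.0; b0[40] = 0.7                 # sparse test scene
blur = lambda w: np.array([[np.exp(-((i - j) / w) ** 2) for j in range(K)]
                           for i in range(K)])
q_list = simulate_observations(b0, [blur(2.0), blur(4.0)])    # P = 2 degraded frames
```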

3. ME Regularization for RS Imaging

It is well known that the PSFs {Φ^(p)} are ill-conditioned for practical RS image formation systems [1]-[8]. Hence, a regularization-based approach is needed when dealing with the problem formulated above. Moreover, the statistical model uncertainties about the image and the noise significantly complicate the restoration problem, making the statistically optimal Bayesian inference techniques inapplicable [3], [7]. That is why we adopt here the ME regularization approach [2], [4], [6], in which case the desired image is to be found as a solution b̂ = arg min_b E(b|λ) to the problem of minimization of the augmented error function

E(b|λ) = −H(b) + (1/2) ∑_{p=1}^{P} λ_p J_p(b) + (1/2) λ_{P+1} J_{P+1}(b),    (2)

where H(b) = −∑_{k=1}^{K} b_k ln b_k is the image entropy, λ = (λ_1, …, λ_P, λ_{P+1})^T is the vector of regularization parameters, {J_p(b) = ||q^(p) − Φ^(p) b||²; p = 1, …, P} compose the set of objective (cost) functions incorporated into the optimization, and J_{P+1}(b) = b^T M b is the Tikhonov VA-inspired stabilizer [6] that controls the metrics properties of the desired image specified by the K×K matrix-form pseudo-differential operator M, the same metrics-inducing matrix as in the companion paper [1] of this series. It is important to note that the ME-regularized solution of problem (2) exists and is guaranteed to be unique because the surfaces of all functions that compose E(b|λ) are convex. However, due to the high-order nonlinearity of the error function, such an ME-regularized solution of the image restoration problem with system fusion requires extremely complex computations with proper collaborative adjustment of all the "degrees of freedom" λ in (2) if this problem is solved using standard gradient descent-based minimization techniques, thus yielding an NP-hard computational problem [2], [5].
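For illustration, the sketch below evaluates the augmented error function (2) for a candidate image. The discrete second-difference choice of M and the sample λ values are assumptions made only for this example.

```python
import numpy as np

def augmented_error(b, q_list, phi_list, lam, M):
    """Evaluate E(b|lambda) of Eq. (2): -H(b) plus the weighted
    data-agreement terms J_p(b) and the Tikhonov stabilizer b^T M b."""
    eps = 1e-12                                     # guard against log(0)
    neg_entropy = np.sum(b * np.log(b + eps))       # -H(b)
    data_terms = sum(lam[p] * np.linalg.norm(q - phi @ b) ** 2
                     for p, (q, phi) in enumerate(zip(q_list, phi_list)))
    stabilizer = lam[-1] * (b @ (M @ b))            # J_{P+1}(b) = b^T M b
    return neg_entropy + 0.5 * data_terms + 0.5 * stabilizer

# Assumed metrics-inducing operator M: a second-difference (Laplacian-like) matrix
K = 64
M = 2 * np.eye(K) - np.eye(K, k=1) - np.eye(K, k=-1)
# e.g. augmented_error(b0, q_list, phi_list, lam=[1.0, 1.0, 0.1], M=M)
```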

4. MENN for Collaborative RS System/Method Fusion

The multistate Hopfield-type dynamic maximum entropy NN (MENN) that we propose to employ to solve the fusion problem at hand is a P-D expansion of the MENN developed in [6], with the K-D state vector x and the K-D output vector z = sgn(Wx + θ), where W and θ represent the matrix of synaptic weights and the vector of the corresponding bias inputs of the MENN, respectively, designed to aggregate all P systems/methods to be fused. The state values {x_k; k = 1, …, K} of the K neurons represent the gray levels of the image in the process of restoration. Each neuron k receives the signals from all other neurons, including itself, and a bias input. The computational structure of the MENN is presented in Figure 1. The energy function of such an NN is expressed as [2]

E = −(1/2) x^T W x − θ^T x = −(1/2) ∑_{k=1}^{K} ∑_{i=1}^{K} W_{ki} x_k x_i − ∑_{k=1}^{K} θ_k x_k.    (3)
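As a small numerical check of (3), the quadratic energy can be evaluated directly. The random symmetric W and θ below are placeholders used only to exercise the formula, not the fusion parameters defined later in (4)-(5).

```python
import numpy as np

def hopfield_energy(x, W, theta):
    """Energy of Eq. (3): E = -0.5 x^T W x - theta^T x."""
    return -0.5 * x @ (W @ x) - theta @ x

# Placeholder parameters just to exercise the function
rng = np.random.default_rng(1)
K = 16
W = rng.standard_normal((K, K)); W = 0.5 * (W + W.T)   # symmetric synaptic weights
theta = rng.standard_normal(K)
x = rng.random(K)
print(hopfield_energy(x, W, theta))
```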

Figure 1. MENN structure (a), and structure of a particular neuron (b).

The idea of solving the RS system/method fusion problem using the MENN is based on the following proposition. If the energy function (3) of the NN represents the objective function of a mathematical minimization problem over a parameter space, then the state of the NN represents the parameters and a stationary point of the network represents a local minimum of the original minimization problem (2). Hence, utilizing the concept of a dynamic NN, we may translate our image enhancement problem with RS system/method fusion into the corresponding problem of minimization of the energy function (3) of the related MENN. Therefore, we define the parameters of the MENN so as to aggregate the corresponding parameters of all P RS systems/methods (1) to be fused, which yields

W_{ki} = −∑_{p=1}^{P} [ λ̂_p ∑_{j=1}^{K} Φ_{jk}^(p) Φ_{ji}^(p) ] − λ_{P+1} M_{ki},    (4)

θ_k = −ln b_k + ∑_{p=1}^{P} [ λ̂_p ∑_{j=1}^{K} Φ_{jk}^(p) q_j^(p) ]    (5)

for all k, i = 1, …, K. Next, to find a minimum of the energy function (3), the states of the network are to be updated as x'' = x' + Δx (the superscripts ' and '' correspond to the state values before and after the network state updating) so as to provide nonpositive energy changes

ΔE ≈ −( ∑_{i=1}^{K} W_{ki} x_i' + θ_k' − 1 ) Δx_k − (1/2) W_{kk} (Δx_k)².    (6)

To guarantee nonpositive values of the energy changes (6) at each updating step, the state update rule ℜ(z) should be as follows: if z_k = 0, then Δx_k = 0; if z_k > 0, then Δx_k = Δ; if z_k < 0, then Δx_k = −Δ, for all k = 1, …, K, where Δ is a prescribed step-size parameter [6]. Next, following the original MENN prototype from [6], we adopt the collaborative balancing method for estimation of the regularization parameters

{ λ̂_p = q̂^{−1} â_p,  â_p = ( ∑_{n=1}^{P} r_n )^{−1} r_p;  p, n = 1, …, P },    (7)

where r_p = trace{(Φ^(p))^{−2}} represents the so-called resolution factor of the corresponding p-th system/method to be fused [6], and the gain factor q̂ is to be found as a solution to the so-called adaptive resolution-to-noise balance equation (Eq. (13) from [6]) for each particular scene. Note that by integrating this MENN algorithm with the different BED-VA-related reconstruction methods presented in the companion paper [1], other feasible modifications of the MENN-based system/method fusion technique can also be devised.
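Putting (4)-(7) together, the following sketch outlines one possible implementation of the MENN fusion iteration. The stopping rule, step size, state initialization, and the externally supplied gain factor q̂ (whose balance equation, Eq. (13) of [6], is not reproduced here) are illustrative assumptions rather than the exact settings of the reported experiments.

```python
import numpy as np

def resolution_balanced_lambdas(phi_list, q_hat):
    """Collaborative balancing of Eq. (7): lam_p = a_p / q_hat,
    a_p = r_p / sum_n r_n, with r_p = trace{(Phi^(p))^-2}."""
    r = np.array([np.trace(np.linalg.matrix_power(np.linalg.inv(phi), 2))
                  for phi in phi_list])
    return (r / r.sum()) / q_hat

def menn_weights(phi_list, lam_hat, lam_P1, M):
    """Synaptic weights of Eq. (4): W = -sum_p lam_p Phi^(p)T Phi^(p) - lam_{P+1} M."""
    W = -lam_P1 * M.astype(float)
    for lam_p, phi in zip(lam_hat, phi_list):
        W -= lam_p * (phi.T @ phi)
    return W

def menn_bias(b, q_list, phi_list, lam_hat, eps=1e-12):
    """Bias inputs of Eq. (5): theta_k = -ln b_k + sum_p lam_p (Phi^(p)T q^(p))_k."""
    theta = -np.log(b + eps)
    for lam_p, q, phi in zip(lam_hat, q_list, phi_list):
        theta += lam_p * (phi.T @ q)
    return theta

def menn_fuse(q_list, phi_list, M, q_hat=1.0, lam_P1=0.1, step=0.01, n_iter=20):
    """Iterative MENN fusion: update states by the sign rule R(z), which
    (approximately) drives the energy of Eq. (3) downward, cf. Eq. (6)."""
    lam_hat = resolution_balanced_lambdas(phi_list, q_hat)
    W = menn_weights(phi_list, lam_hat, lam_P1, M)
    x = np.mean(q_list, axis=0)                       # start from the averaged input frames
    x = np.clip(x, 1e-6, None)                        # keep states positive for -ln b_k
    for _ in range(n_iter):
        theta = menn_bias(x, q_list, phi_list, lam_hat)  # bias recomputed from current state
        z = np.sign(W @ x + theta)                    # neuron outputs z = sgn(Wx + theta)
        x = np.clip(x + step * z, 1e-6, None)         # R(z): dx_k in {-step, 0, +step}
    return x
```

With the toy observations and the assumed operator M from the earlier sketches, a call such as menn_fuse(q_list, [blur(2.0), blur(4.0)], M) returns a fused estimate after the chosen number of sweeps.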

5. Simulation Results

To analyze the performances of the different reconstruction strategies exemplified in [1], [7]-[10] with system/method fusion, we evaluated qualitatively and quantitatively the effectiveness of the image reconstruction/fusion by performing a computer simulation experiment with RS images of 1024×1024 pixel format [10]. To compare the results of fusion of the different BED-VA-related techniques exemplified in the companion paper [1] of this series, we ran the simulation experiment in a format similar to the previous studies of the BED-VA in [1] and the DEDR in [8]. To obtain comparable results, we simulated a fractional SAR imaging experiment adopting a squared triangular shape of the imaging SAR range PSF and a squared Gaussian shape of the corresponding azimuth PSF, the same as in [1], [8] but of different pixel widths. In all fusion experiments reported below, different feasible combinations {p, p′ ∈ P} of two systems/methods from the tested six (P = 6) specified in the companion paper [1] were simulated and compared, where the different indices {p, p′ ∈ P = 6} point at the different tested methods. These six most prominent methods are as specified in Section 3 of the companion paper [1], namely: p = 1 corresponds to the conventional low-resolution matched spatial filtering (MSF) technique [1]; p = 2 corresponds to the anisotropic diffusion (AD) method [2]; p = 3 relates to the robust spatial filtering (RSF) algorithm [7]; p = 4 corresponds to the robust adaptive spatial filtering (RASF) algorithm [8]; p = 5 relates to the prominent adaptive amplitude-phase estimator (APES) [9] modified in [1] for the considered SAR imaging problem; and the last one, p = 6, corresponds to the adaptive BED-VA-optimal iterative technique specified by Eq. (11) from the companion paper [1]. For quantitative evaluation of the image reconstruction/fusion performances, we employed two conventional metrics [1], [8], namely, the Improvement in the Output Signal-to-Noise Ratio (IOSNR)

IOSNR_fused = 10 log₁₀ [ ∑_{k=1}^{K} ( (q_k^(p) + q_k^(p′))/2 − b_k )² / ∑_{k=1}^{K} ( b̂_k^fused − b_k )² ],    (8)

and the Mean Absolute Error (MAE)

MAE_fused = 10 log₁₀ [ (1/K) ∑_{k=1}^{K} | b̂_k^fused − b_k | ],    (9)

where b_k is the k-th element of the discrete lexicographically ordered test high-resolution image b of dimension K = 1024×1024 [10], {q_k^(p), q_k^(p′)} represent the k-th elements of the tested combinations {p, p′ ∈ P = 6} of the images fused in a particular simulation experiment (as specified in the captions of Figure 2 and Table 1), and {b̂_k^fused} represent the corresponding pixel values of the fused image for the particular tested combination {p, p′ ∈ P = 6}. The qualitative simulation results are presented in Figure 2. The corresponding quantitative results of the improvement in the IOSNR and MAE metrics gained due to the NN-performed reconstruction/fusion are reported in Table 1. All these results are indicative of the overall performance improvements in the MENN-reconstructed RS imagery. The reported numerical simulations verify that the developed BED-VA method [1] outperforms the most prominent existing nonparametric high-resolution RS imaging techniques [4]-[7] (both without fusion and in the fused version) in the attainable resolution enhancement, requiring only 7 iterations.
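The metrics (8)-(9) can be computed with a few lines of NumPy. This is a generic sketch following the formulas above, not the exact evaluation harness used for the reported experiments.

```python
import numpy as np

def iosnr_fused(b, q_p, q_pp, b_fused):
    """IOSNR of Eq. (8): error of the averaged input frames relative to the
    error of the fused image, expressed in dB."""
    num = np.sum(((q_p + q_pp) / 2.0 - b) ** 2)
    den = np.sum((b_fused - b) ** 2)
    return 10.0 * np.log10(num / den)

def mae_fused(b, b_fused):
    """MAE of Eq. (9), expressed in the same logarithmic scale as in the text."""
    return 10.0 * np.log10(np.mean(np.abs(b_fused - b)))
```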

Figure 2. Qualitative results of the RS image reconstruction/fusion for the fractional SAR specifications from Table 1. (a) Original test scene [11]; (b) MSF2 degraded scene image; (c) AD2 enhancement (without fusion) [2]; (d) RSF2 enhancement (without fusion) [7]; (e) RASF2 enhancement (without fusion) [8]; (f) APES2 enhancement (without fusion) [9]; (g) MSF1-MSF2 fusion; (h) MSF1-AD2 fusion; (i) MSF1-RSF2 fusion; (j) MSF1-RASF2 fusion; (k) MSF1-APES2 fusion; (l) RSF1-BED-VA2 fusion. Indexes 1 and 2 point at the two corresponding SNRs: SNR1 = 10 dB and SNR2 = 20 dB. The dynamic MENN enhancement results (d)-(f) are reported for 20 performed iterations and the corresponding fusion results for 7 performed iterations.

6. Conclusions

This paper presents a new approach to reconstructive RS imaging based on two paradigms: the BED-VA regularization (developed in the companion paper [1] of this series) combined with MENN computing. Construction of a convex augmented NN energy function with the DEDR-VA-related objective components enables rapid dynamic fusion, ensuring updates only of the higher quality information from the input multimode RS image frames. The simulations verified that the proposed BED-VA-related MENN cooperative fusion approach outperforms the existing low-resolution techniques [2], [5] as well as the recently proposed iterative high-resolution methods [4], [6], [9] both in the achievable resolution enhancement and in the convergence rate (analyzed in [1]). The speeded-up convergence paves the way toward a (near) real-time computational mode of implementing the proposed MENN-based RS imagery enhancement/fusion techniques.

Table 1. IOSNR and MAE metrics achieved in the reconstruction of the RS images with the simulated fusion combinations for the same test fractional SAR parameters [1]: range PSF width (at ½ of the peak value) κr = 15 pixels; azimuth PSF width (at ½ of the peak value) κa = 30 pixels. Indexes 1 and 2 point at the two corresponding SNRs: SNR1 = 10 dB and SNR2 = 20 dB. The fusion was performed employing the MENN technique from Section 4 for the corresponding combinations of the BED-related techniques from [1, Section 3] for the same two tested image SNRs. The dynamic MENN enhancement and fusion results are reported for 20 performed iterations for the DEDR-VA techniques and for 7 performed iterations for the optimal BED-VA method [1].

Fusion combination    IOSNR    MAE
MSF1-MSF2             7.92     20.17
MSF1-AD2              8.39     19.34
MSF1-RSF2             8.73     17.43
MSF1-RASF2            9.17     15.34
MSF1-APES2            9.74     13.16
RSF1-BED-VA2          10.79    12.38

References:

[1] Y. Shkvarko, J. Tuxpan, S. Santos, and D. Castro, "Bayesian dynamic experiment design regularization framework for high-resolution radar/SAR imaging", in Trans. Int. Radar Symp., 2011 (companion paper in this same Transactions).
[2] P. Perona and J. Malik, "Scale-space and edge detection using anisotropic diffusion", IEEE Trans. Pattern Anal. Machine Intell., July 1990, vol. 12, no. 7, pp. 629-639.
[3] S. Haykin, "Neural Networks: A Comprehensive Foundation", New York: Macmillan, 1994.
[4] H. D. Li, M. Kallergi, W. Qian, V. K. Jain, and L. P. Clarke, "Neural network with maximum entropy constraint for nuclear medicine image restoration", Optical Engineering, vol. 34, pp. 1431-1440, 1995.
[5] S. John and M. A. Vorontsov, "Multiframe selective information fusion from robust error theory", IEEE Trans. Image Proc., May 2005, vol. 14, no. 5, pp. 577-584.
[6] Y. V. Shkvarko, Y. S. Shmaliy, R. Jaime-Rivas, and M. Torres-Cisneros, "System fusion in passive sensing using a modified Hopfield network", Journal of the Franklin Institute, 2000, vol. 338, pp. 405-427.
[7] Y. V. Shkvarko, "Estimation of wavefield power distribution in the remotely sensed environment: Bayesian maximum entropy approach", IEEE Trans. Signal Proc., Sep. 2002, vol. 50, no. 9, pp. 2333-2346.
[8] Y. V. Shkvarko, "Unifying experiment design and convex regularization techniques for enhanced imaging with uncertain remote sensing data. Part I: Theory; Part II: Adaptive implementation and performance issues", IEEE Trans. Geoscience and Remote Sensing, Jan. 2010, vol. 48, no. 1, pp. 82-111.
[9] T. Yardibi, J. Li, P. Stoica, M. Xue, and A. B. Baggeroer, "Source localization and sensing: A nonparametric iterative adaptive approach based on weighted least squares", IEEE Trans. Aerospace and Electr. Systems, Jan. 2010, vol. 46, no. 1, pp. 425-444.
[10] Y. V. Shkvarko, J. Tuxpan, and S. R. Santos, "Dynamic Experiment Design Regularization Approach to Adaptive Imaging with Array Radar/SAR Sensor Systems", Sensors, April 2011, no. 5, pp. 4483-4511.
[11] Google Earth system. Coordinates: 19°00'28.98''N, 94°14'00.41''W, Elev. 6597 ft. Imagery date Feb. 2, 2009.