Title: Complex-valued neural networks and their learning algorithms
Authors: Savitha Ramasamy.
Keywords: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2011
Abstract: Recent developments in complex-valued feed-forward neural networks have found a number of applications, such as adaptive array signal processing, medical imaging, and communication engineering. However, these applications demand tight phase approximation along with magnitude approximation, which has not been emphasized in the existing literature. To fill this gap, this thesis addresses the development of novel fully complex-valued feed-forward neural networks and their supervised batch and sequential learning algorithms, with an emphasis on better phase approximation. The classical approach to handling complex-valued signals is to split each complex-valued signal into two real-valued signals, either the real/imaginary components or the magnitude/phase components, and then use existing real-valued neural networks. In such a split complex-valued network, real-valued activation functions and real-valued weights are used to estimate the network parameters. Thus, the gradients used in updating the free parameters of the network do not represent the true complex-valued gradients, resulting in poor approximation of complex-valued functions, especially the phase of the complex-valued signals. This clearly shows the need for developing fully complex-valued neural networks and their learning algorithms.
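The distinction the abstract draws between the split and the fully complex-valued approaches can be sketched as follows. This is a minimal illustrative example, not code from the thesis: the toy signal, weight shapes, and the use of the complex extension of tanh as the fully complex activation are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy complex-valued input signal (assumed data, for illustration only)
z = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# --- Split complex approach ---
# Real and imaginary parts become two real-valued signals fed to a
# real-valued network with real weights and a real activation.
x_split = np.concatenate([z.real, z.imag])    # real vector, shape (8,)
W_real = rng.standard_normal((2, 8))          # real-valued weights
h = np.tanh(W_real @ x_split)                 # real-valued activation
y_split = h[0] + 1j * h[1]                    # recombine; phase is only implicit

# --- Fully complex approach ---
# Complex weights and a fully complex-valued activation act directly
# on C, so magnitude and phase are processed jointly.
w_cplx = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y_full = np.tanh(w_cplx @ z)                  # tanh extended to the complex plane

print(np.abs(y_split), np.angle(y_split))
print(np.abs(y_full), np.angle(y_full))
```

In the split network, gradients are taken with respect to the real parameters of two decoupled real channels, so they are not the true complex-valued gradients of the mapping; in the fully complex network, the derivative of the complex activation couples magnitude and phase, which is the property the thesis exploits for better phase approximation.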
Schools: School of Electrical and Electronic Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item: Restricted Access, 1.49 MB, Adobe PDF

Page view(s): 20; Download(s): 50 (updated on Jun 15, 2024)



Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.