Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/61795
Title: Sparse signal processing and compressed sensing recovery
Authors: Sujit Kumar Sahoo
Keywords: DRNTU::Engineering::Electrical and electronic engineering::Electronic systems::Signal processing
DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2013
Source: Sujit Kumar Sahoo. (2013). Sparse signal processing and compressed sensing recovery. Doctoral thesis, Nanyang Technological University, Singapore.
Abstract: The work presented in this thesis focuses on sparsity in real-world signals, its applications in image processing, and the recovery of sparse signals from Compressed Sensing (CS) measurements. In the field of signal processing, various measures exist to analyze and represent a signal so as to extract a meaningful outcome. Sparse representation is a relatively new measure, and the applications based on it are intuitive and promising. Overcomplete and signal-dependent representations are modern trends in signal processing, which help sparsify the redundant information in the representation domain (dictionary). The goal of signal-dependent representation is therefore to train a dictionary from sample signals. Interestingly, recent dictionary-training algorithms such as $K$-SVD, MOD, and their variants are reminiscent of the well-known $K$-means clustering. The first part of the work analyses such algorithms from the viewpoint of $K$-means. The analysis shows that although $K$-SVD is sequential like $K$-means, it does not simplify to $K$-means, because it destroys the structure in the sparse coefficients. In contrast, MOD can be viewed as a parallel generalization of $K$-means, which simplifies to $K$-means without affecting the sparse coefficients. Keeping stability and memory usage in mind, an alternative to MOD is proposed: a Sequential Generalization of $K$-means (SGK). Through synthetic-data experiments, the performance of SGK is demonstrated to be comparable with those of $K$-SVD and MOD. Using complexity analysis, SGK is shown to be much faster than $K$-SVD, which the experiments also validate. The next part of the work illustrates applications of trained dictionaries in image processing, comparing the usability of SGK and $K$-SVD through image compression and image recovery (inpainting, denoising).
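The alternation the abstract describes, sparse coding followed by a sequential, per-atom dictionary update, can be sketched in NumPy as follows. This is a minimal illustration of an SGK-style update, not code from the thesis; the function names, the use of OMP as the sparse coder, and the unit-norm renormalization of atoms are assumptions.

```python
import numpy as np

def omp_code(D, y, m):
    """Greedy OMP: select up to m atoms of D to approximate y."""
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(m):
        k = int(np.argmax(np.abs(D.T @ residual)))
        if k in support:  # residual is (numerically) zero; nothing left to pick
            break
        support.append(k)
        # Re-fit all selected coefficients jointly by least squares.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

def sgk_update(D, Y, X):
    """SGK-style update: each atom in turn becomes the least-squares fit
    to the signals that use it (coefficients held fixed), then is renormalized."""
    for k in range(D.shape[1]):
        users = np.nonzero(X[k])[0]
        if users.size == 0:
            continue
        # Residual of the users' signals with atom k's contribution removed.
        E = Y[:, users] - D @ X[:, users] + np.outer(D[:, k], X[k, users])
        d = E @ X[k, users]              # least-squares direction for atom k
        D[:, k] = d / np.linalg.norm(d)  # keep atoms unit-norm (assumed)
    return D
```

The K-means analogy is visible here: the coding step plays the role of cluster assignment, and the atom update plays the role of the centroid step. If every code were one-sparse with unit coefficient, `sgk_update` would reduce to taking the (normalized) mean of each atom's assigned signals.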
The obtained results suggest that $K$-SVD can be successfully replaced with SGK, owing to its quicker execution and comparable outcomes. Similarly, the use of SGK can be extended to other applications of sparse representation. The subsequent part of the work proposes a framework to improve image recovery performance using sparse representation of local image blocks. An adaptive blocksize selection procedure for local sparse representation is proposed, which improves the global recovery of the underlying image. Ideally, the adaptive blocksize selection should minimize the Mean Square Error (MSE) of the recovered image. The results obtained using the proposed framework are comparable to those of recently proposed image recovery techniques. The succeeding part of the work addresses the recovery of sparse signals from CS measurements. The objective is to recover high-dimensional sparse signals from a small number of random measurements. Orthogonal Matching Pursuit (OMP) and Basis Pursuit (BP) are two well-known sparse signal recovery algorithms. To recover a $d$-dimensional $m$-sparse signal, BP needs only $N=O\left(m\ln{\frac{d}{m}}\right)$ measurements, which is similar to theoretical $\ell_0$-norm recovery. In contrast, the best known theoretical guarantee for successful signal recovery with high probability shows that OMP needs $N=O\left(m\ln{d}\right)$ measurements, more than BP. However, OMP is known for its swift execution, and it is regarded as the progenitor of greedy pursuit techniques. In this part of the work, an improved theoretical recovery guarantee for OMP is obtained. A new scheme called $\text{OMP}_\alpha$ is introduced for CS recovery, which runs OMP for $m+\lfloor\alpha{m}\rfloor$ iterations, where $\alpha\in[0,1]$.
It is analytically shown that $\text{OMP}_\alpha$ recovers a $d$-dimensional $m$-sparse signal with high probability when $N = O\left(m\ln{\frac{d}{\lfloor\alpha{m}\rfloor+1}}\right)$, which follows a trend similar to that of BP.
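Since $\text{OMP}_\alpha$ is just OMP run for $m+\lfloor\alpha{m}\rfloor$ iterations rather than $m$, it admits a very short sketch. The function name and interface below are illustrative assumptions, not the thesis implementation:

```python
import numpy as np

def omp_alpha(A, y, m, alpha=0.5):
    """OMP run for m + floor(alpha*m) iterations, alpha in [0, 1]."""
    n_iter = m + int(np.floor(alpha * m))
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(n_iter):
        k = int(np.argmax(np.abs(A.T @ residual)))
        if k in support:  # residual is (numerically) zero; stop early
            break
        support.append(k)
        # Re-fit all selected coefficients jointly by least squares.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x
```

With `alpha=0` this is plain OMP; the extra $\lfloor\alpha{m}\rfloor$ iterations give the algorithm room to pick a few spurious atoms (which receive negligible coefficients in the joint least-squares re-fit) while still capturing the true support, which is what the improved measurement bound exploits.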
URI: https://hdl.handle.net/10356/61795
DOI: 10.32657/10356/61795
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: Thesis.pdf | Description: Thesis | Size: 5.56 MB | Format: Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.