Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/154075
Title: Convergence of non-convex non-concave GANs using Sinkhorn divergence
Authors: Adnan, Risman
Saputra, Muchlisin Adi
Fadlil, Junaidillah
Ezerman, Martianus Frederic
Iqbal, Muhamad
Basaruddin, Tjan
Keywords: Science::Physics
Issue Date: 2021
Source: Adnan, R., Saputra, M. A., Fadlil, J., Ezerman, M. F., Iqbal, M. & Basaruddin, T. (2021). Convergence of non-convex non-concave GANs using Sinkhorn divergence. IEEE Access, 9, 67595-67609. https://dx.doi.org/10.1109/ACCESS.2021.3074943
Journal: IEEE Access
Abstract: Sinkhorn divergence is a symmetric normalization of entropic regularized optimal transport. It is smooth and continuous, metrizes weak convergence, and has excellent geometric properties. We use it as an alternative to the minimax objective function in formulating generative adversarial networks. The optimization is defined with Sinkhorn divergence as the objective, under non-convex and non-concave conditions. This work focuses on the optimization's convergence and stability. We propose a first-order sequential stochastic gradient descent ascent (SeqSGDA) algorithm. Under some mild approximations, the learning converges to local minimax points. Using the structural similarity index measure (SSIM), we supply a non-asymptotic analysis of the algorithm's convergence rate. Empirical evidence shows a convergence rate that is inversely proportional to the number of iterations when tested on the tiny colour datasets Cats and CelebA with the deep convolutional generative adversarial network and ResNet neural architectures. The entropy regularization parameter $\varepsilon$ is approximated by the SSIM tolerance $\epsilon$. We determine the iteration complexity to return an $\epsilon$-stationary point to be $\mathcal{O}\left(\kappa \, \log(\epsilon^{-1})\right)$, where $\kappa$ is a value that depends on the Sinkhorn divergence's convexity and the minimax step ratio in the SeqSGDA algorithm.
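The objective described in the abstract is the debiased, symmetric Sinkhorn divergence built from entropic regularized optimal transport. The following is a minimal NumPy sketch of how such a divergence can be computed between two minibatches; the function names, the squared-Euclidean ground cost, and the values of $\varepsilon$ and the iteration count are illustrative assumptions and are not taken from the paper's implementation, which optimizes this objective with SeqSGDA over DCGAN and ResNet architectures.

```python
# Illustrative sketch (not the authors' code) of a Sinkhorn divergence
# between two empirical minibatches, via log-domain Sinkhorn iterations.
import numpy as np
from scipy.special import logsumexp

def sinkhorn_cost(x, y, eps=0.1, n_iters=200):
    """Entropic-regularized OT cost between uniform empirical measures on
    point clouds x (n, d) and y (m, d); eps is the entropy regularization."""
    n, m = x.shape[0], y.shape[0]
    log_a = np.full(n, -np.log(n))                 # uniform weights, log domain
    log_b = np.full(m, -np.log(m))
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)  # squared-Euclidean cost
    f, g = np.zeros(n), np.zeros(m)                # dual potentials
    for _ in range(n_iters):                       # Sinkhorn fixed-point updates
        f = -eps * logsumexp((g[None, :] - C) / eps + log_b[None, :], axis=1)
        g = -eps * logsumexp((f[:, None] - C) / eps + log_a[:, None], axis=0)
    # Optimal transport plan and its transport cost <P, C>
    P = np.exp((f[:, None] + g[None, :] - C) / eps + log_a[:, None] + log_b[None, :])
    return np.sum(P * C)

def sinkhorn_divergence(x, y, eps=0.1, n_iters=200):
    """Symmetric, debiased Sinkhorn divergence:
    S_eps(x, y) = OT_eps(x, y) - 0.5*OT_eps(x, x) - 0.5*OT_eps(y, y)."""
    return (sinkhorn_cost(x, y, eps, n_iters)
            - 0.5 * sinkhorn_cost(x, x, eps, n_iters)
            - 0.5 * sinkhorn_cost(y, y, eps, n_iters))

# Toy usage: divergence between a generator minibatch and a data minibatch.
rng = np.random.default_rng(0)
fake = rng.normal(size=(64, 2))             # stand-in for generator outputs
real = rng.normal(loc=1.0, size=(64, 2))    # stand-in for real data samples
print(sinkhorn_divergence(fake, real, eps=0.1))
```

In a GAN setting this divergence would be evaluated on minibatches of generated and real samples (or their critic features) and minimized in the generator while the critic ascends, which is the sequential descent-ascent structure the SeqSGDA algorithm analyzes; the exact cost normalization used in the paper may differ from this sketch.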
URI: https://hdl.handle.net/10356/154075
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3074943
Rights: © 2021 IEEE. This journal is 100% open access, which means that all content is freely available without charge to users or their institutions. All articles accepted after 12 June 2019 are published under a CC BY 4.0 license, and the author retains copyright. Users are allowed to read, download, copy, distribute, print, search, or link to the full texts of the articles, or use them for any other lawful purpose, as long as proper attribution is given.
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:SPMS Journal Articles

Files in This Item:
File: Convergence_of_Non-Convex_Non-Concave_GANs_Using_Sinkhorn_Divergence.pdf
Size: 3.9 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.