An Information-Theoretic Perspective on Proper Quaternion Variational Autoencoders.
Keywords:
generative learning
properness
quaternion neural networks
second-order circularity
variational autoencoder
Journal
Entropy (Basel, Switzerland)
ISSN: 1099-4300
Abbreviated title: Entropy (Basel)
Country: Switzerland
NLM ID: 101243874
Publication information
Publication date: 03 Jul 2021
History:
received: 30 May 2021
revised: 24 Jun 2021
accepted: 01 Jul 2021
entrez: 6 Aug 2021
pubmed: 7 Aug 2021
medline: 7 Aug 2021
Status: epublish
Abstract
Variational autoencoders are deep generative models that have recently received a great deal of attention due to their ability to model the latent distribution of any kind of input, such as images and audio signals, among others. A novel variational autoencoder in the quaternion domain H, namely the QVAE, has recently been proposed, leveraging the augmented second-order statistics of H-proper signals. In this paper, we analyze the QVAE from an information-theoretic perspective, studying the ability of the H-proper model to approximate improper distributions as well as the built-in H-proper ones, and the loss of entropy due to the improperness of the input signal. We conduct experiments on a substantial set of quaternion signals, for each of which the QVAE shows the ability to model the input distribution while learning the improperness and increasing the entropy of the latent space. The proposed analysis proves that proper QVAEs can be employed with good approximation even when the quaternion input data are improper.
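The H-properness the abstract relies on can be made concrete with a small numerical check. The following sketch (not the paper's code; function names and test signals are illustrative assumptions) estimates the three complementary covariances E[q (q^η)*], η ∈ {i, j, k}, of a quaternion signal, where q^η = -η q η is an involution and * denotes quaternion conjugation. A signal is H-proper when all three vanish, which is the condition the QVAE's built-in model encodes.

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternions stored as (N, 4) arrays [w, x, y, z]."""
    w1, x1, y1, z1 = p.T
    w2, x2, y2, z2 = q.T
    return np.stack([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ], axis=1)

def conj(q):
    """Quaternion conjugate: negate all three imaginary parts."""
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def involution(q, axis):
    """q^eta = -eta q eta: negates the two imaginary parts orthogonal to eta."""
    signs = {"i": [1, 1, -1, -1], "j": [1, -1, 1, -1], "k": [1, -1, -1, 1]}
    return q * np.array(signs[axis], dtype=float)

def complementary_covariances(q):
    """Sample estimates of E[q (q^eta)*] for eta in {i, j, k}."""
    return {ax: qmul(q, conj(involution(q, ax))).mean(axis=0) for ax in "ijk"}

rng = np.random.default_rng(0)
N = 200_000

# H-proper example: four i.i.d. components with equal variance.
proper = rng.standard_normal((N, 4))

# Improper example: all energy confined to the real and i components.
improper = np.zeros((N, 4))
improper[:, :2] = rng.standard_normal((N, 2))

for name, sig in [("proper", proper), ("improper", improper)]:
    cc = complementary_covariances(sig)
    print(name, {ax: round(float(np.linalg.norm(v)), 3) for ax, v in cc.items()})
```

For the proper signal all three complementary-covariance norms are close to zero, while the improper signal exhibits a non-vanishing i-involution covariance (its scalar part approaches 2, the total variance of the two active components). Signals of this second kind are exactly the improper inputs that the paper's analysis shows a proper QVAE can still approximate well.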
Identifiers
pubmed: 34356397
pii: e23070856
doi: 10.3390/e23070856
pmc: PMC8305877
Publication types
Journal Article
Languages
eng
Grants
Agency: Sapienza Università di Roma
ID: RG11916B88E1942F
References
Neural Netw. 2020 Dec;132:321-332
pubmed: 32977277
Entropy (Basel). 2021 Mar 21;23(3):
pubmed: 33801048
IEEE Trans Neural Netw Learn Syst. 2015 Oct;26(10):2422-39
pubmed: 25594982
Entropy (Basel). 2021 Jan 19;23(1):
pubmed: 33477766
Neural Netw. 2021 Jul;139:199-200
pubmed: 33774356
Neural Netw. 2021 Jan;133:132-147
pubmed: 33217682
Entropy (Basel). 2020 Dec 17;22(12):
pubmed: 33348816
Entropy (Basel). 2018 Jan 11;20(1):
pubmed: 33265134
Entropy (Basel). 2020 Mar 29;22(4):
pubmed: 33286164
Entropy (Basel). 2021 May 02;23(5):
pubmed: 34063192
Entropy (Basel). 2019 Jan 17;21(1):
pubmed: 33266795
Entropy (Basel). 2020 Sep 21;22(9):
pubmed: 33286824