DACH: Domain Adaptation Without Domain Information.


Journal

IEEE Transactions on Neural Networks and Learning Systems
ISSN: 2162-2388
Abbreviated title: IEEE Trans Neural Netw Learn Syst
Country: United States
NLM ID: 101616214

Publication information

Publication date:
December 2020
History:
pubmed: January 25, 2020
medline: October 26, 2021
entrez: January 25, 2020
Status: ppublish

Abstract

Domain adaptation has become increasingly important for learning systems in recent years, especially with the growing diversification of data domains in real-world applications, such as genetic data from various sequencing platforms and video feeds from multiple surveillance cameras. Traditional domain adaptation approaches aim to design a transformation for each individual domain so that the transformed data from different domains follow an almost identical distribution. In many applications, however, the data from diverse domains are simply dumped into an archive without clear domain labels. In this article, we discuss the possibility of learning domain adaptations even when the data do not contain domain labels. Our solution is based on our new model, named domain adaptation using cross-domain homomorphism (DACH for short), which identifies the intrinsic homomorphism hidden in the mixed data from all domains. DACH is generally compatible with existing deep learning frameworks, enabling the generation of nonlinear features from the original data domains. Our theoretical analysis not only shows the universality of the homomorphism but also proves the convergence of DACH when significant homomorphism structures over the data domains are preserved. Empirical studies on real-world data sets validate the effectiveness of DACH in merging multiple data domains for joint machine learning tasks and the scalability of our algorithm with respect to domain dimensionality.
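Purely as an illustration of the setting the abstract describes (adapting mixed-domain data when no domain labels are available), a minimal Python (PyTorch) sketch follows. It is not the authors' DACH algorithm, whose details are not given in this record: latent domains are guessed here with k-means, and a simple cluster-mean alignment loss stands in for the cross-domain homomorphism objective; the names SharedEncoder and alignment_loss, and all hyperparameters, are hypothetical.

```python
# Hypothetical sketch only, not the published DACH method: latent domains are
# inferred with k-means and aligned by matching per-cluster feature means.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class SharedEncoder(nn.Module):
    """Nonlinear feature extractor shared by all (unknown) domains."""
    def __init__(self, in_dim: int, feat_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def alignment_loss(features: torch.Tensor, domain_ids: torch.Tensor) -> torch.Tensor:
    """Penalize the spread of per-latent-domain feature means."""
    means = torch.stack([features[domain_ids == d].mean(dim=0)
                         for d in domain_ids.unique()])
    return ((means - means.mean(dim=0)) ** 2).sum()

# Mixed data archived together without domain labels (toy random example).
x = torch.randn(256, 10)
encoder = SharedEncoder(in_dim=10)
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

for step in range(20):
    feats = encoder(x)
    # Guess latent domains from the current features; no true labels are used.
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(feats.detach().cpu().numpy())
    loss = alignment_loss(feats, torch.as_tensor(labels))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In DACH itself, a homomorphism structure across domains is learned rather than assumed, so this sketch should be read only as a framing of the label-free problem, not as the paper's method.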

Identifiers

pubmed: 31976912
doi: 10.1109/TNNLS.2019.2962817

Publication types

Journal Article
Research Support, Non-U.S. Gov't

Languages

eng

Citation subsets

IM

Pagination

5055-5067

Authors

Similar articles

Selecting optimal software code descriptors-The case of Java.
Yegor Bugayenko, Zamira Kholmatova, Artem Kruglov et al.
MeSH terms: Software; Algorithms; Programming Languages

Exploring blood-brain barrier passage using atomic weighted vector and machine learning.
Yoan Martínez-López, Paulina Phoobane, Yanaima Jauriga et al.
MeSH terms: Blood-Brain Barrier; Machine Learning; Humans; Support Vector Machine; Software

Understanding the role of machine learning in predicting progression of osteoarthritis.
Simone Castagno, Benjamin Gompels, Estelle Strangmark et al.
MeSH terms: Humans; Disease Progression; Machine Learning; Osteoarthritis

MeSH classifications