Solving problems with deep neural networks typically relies on
massive amounts of labeled training data to achieve high
performance. While huge volumes of unlabeled data can often be generated and are readily available, the cost of acquiring labels remains high. Transfer learning (TL), and in
particular domain adaptation (DA), has emerged as an effective
solution to overcome the burden of annotation, exploiting the
unlabeled data available from the target domain together with
labeled data or pre-trained models from similar, yet different
source domains. The aim of this book is to provide an overview of
such DA/TL methods applied to computer vision, a field whose
popularity has increased significantly in the last few years. We
set the stage by revisiting the theoretical background and some of
the historical shallow methods before discussing and comparing
different domain adaptation strategies that exploit deep
architectures for visual recognition. We introduce the space of self-training-based methods, which draw inspiration from the related fields of deep semi-supervised and self-supervised learning to address deep domain adaptation. Going beyond the classic domain
adaptation problem, we then explore the rich space of problem
settings that arise when applying domain adaptation in practice, such as partial or open-set DA, where source and target data categories do not fully overlap, continuous DA, where the target data arrives as a stream, and so on. We next consider the least
restrictive setting of domain generalization (DG), as an extreme
case where neither labeled nor unlabeled target data are available
during training. Finally, we close by considering the emerging area
of learning-to-learn and how it can be applied to further improve
existing approaches to cross-domain learning problems such as DA
and DG.
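To make the self-training idea mentioned above concrete, the sketch below (our own illustration, not code from the book) mixes a supervised loss on labeled source batches with a cross-entropy loss on confidently pseudo-labeled target samples; the model, optimizer, confidence threshold, and loss weight are all assumed placeholders.

```python
# Minimal sketch of pseudo-label self-training for unsupervised domain
# adaptation. All names (model, optimizer, threshold) are illustrative
# placeholders, not taken from the book.
import torch
import torch.nn.functional as F

def self_training_step(model, optimizer, source_x, source_y, target_x,
                       threshold=0.9, target_weight=1.0):
    """One update mixing labeled source data with confident pseudo-labeled
    target data (the basic recipe borrowed from semi-supervised learning)."""
    model.train()
    optimizer.zero_grad()

    # Supervised cross-entropy on labeled source samples.
    loss = F.cross_entropy(model(source_x), source_y)

    # Pseudo-label unlabeled target samples; keep only confident predictions.
    with torch.no_grad():
        probs = F.softmax(model(target_x), dim=1)
        confidence, pseudo_labels = probs.max(dim=1)
        mask = confidence >= threshold

    # Add the self-training loss on the confident target subset, if any.
    if mask.any():
        loss = loss + target_weight * F.cross_entropy(
            model(target_x[mask]), pseudo_labels[mask])

    loss.backward()
    optimizer.step()
    return loss.item()
```

In practice the confidence threshold and the weight on the pseudo-labeled loss are tuned per task, and pseudo-labels are refreshed periodically as the model adapts to the target domain.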