Embeddings have undoubtedly been one of the most influential
research areas in Natural Language Processing (NLP). Encoding
information into low-dimensional vector representations, which can be
easily integrated into modern machine learning models, has played a
central role in the development of NLP. Embedding techniques
initially focused on words, but the attention soon started to shift
to other forms: from graph structures, such as knowledge bases, to
other types of textual content, such as sentences and documents.
This book provides a high-level synthesis of the main embedding
techniques in NLP, in the broad sense. The book starts by
explaining conventional word vector space models and word
embeddings (e.g., Word2Vec and GloVe) and then moves to other types
of embeddings, such as those for word senses, sentences and
documents, and graphs. The book also provides an overview of recent
developments in contextualized representations (e.g., ELMo and
BERT) and explains their potential in NLP. Throughout the book, the
reader can find both the essential information needed to understand
each topic from scratch and a broad overview of the most
successful techniques developed in the literature.