Pretrained Transformers for Text Ranking - BERT and Beyond (Paperback)
Series: Synthesis Lectures on Human Language Technologies
The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in response to a query. Although the most common formulation of text ranking is search, instances of the task can also be found in many natural language processing (NLP) applications. This book provides an overview of text ranking with neural network architectures known as transformers, of which BERT (Bidirectional Encoder Representations from Transformers) is the best-known example. The combination of transformers and self-supervised pretraining has been responsible for a paradigm shift in NLP, information retrieval (IR), and beyond.

This book provides a synthesis of existing work as a single point of entry for practitioners who wish to gain a better understanding of how to apply transformers to text ranking problems and researchers who wish to pursue work in this area. It covers a wide range of modern techniques, grouped into two high-level categories: transformer models that perform reranking in multi-stage architectures, and dense retrieval techniques that perform ranking directly. Two themes pervade the book: techniques for handling long documents, beyond the typical sentence-by-sentence processing in NLP, and techniques for addressing the tradeoff between effectiveness (i.e., result quality) and efficiency (e.g., query latency, model and index size).

Although transformer architectures and pretraining techniques are recent innovations, many aspects of how they are applied to text ranking are relatively well understood and represent mature techniques. However, many open research questions remain, and thus, in addition to laying out the foundations of pretrained transformers for text ranking, this book also attempts to prognosticate where the field is heading.
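The retrieve-then-rerank design mentioned above can be sketched in a few lines. The following is a minimal, stdlib-only illustration of a multi-stage architecture, not an implementation from the book: stage one uses a cheap lexical overlap score (a stand-in for BM25) to narrow the corpus to k candidates, and stage two rescores only those candidates with a more expensive scorer (in practice, a transformer cross-encoder such as BERT would score each query-document pair jointly). All function names and the toy corpus are hypothetical.

```python
from collections import Counter
import math

def lexical_score(query, doc):
    # Stage 1: cheap bag-of-words overlap (a stand-in for BM25).
    q = Counter(query.lower().split())
    d = Counter(doc.lower().split())
    return sum(min(q[t], d[t]) for t in q)

def rerank_score(query, doc):
    # Stage 2: a more expensive scorer. Here a toy length-normalised
    # overlap stands in for what would, in practice, be a BERT
    # cross-encoder applied to the (query, doc) pair.
    q_terms = set(query.lower().split())
    d_terms = doc.lower().split()
    overlap = len(q_terms & set(d_terms))
    return overlap / math.sqrt(len(d_terms))

def multi_stage_rank(query, corpus, k=3):
    # Retrieve-then-rerank: stage 1 narrows the corpus to k candidates;
    # stage 2 reorders only those k, keeping the expensive model off
    # the full corpus (the effectiveness/efficiency tradeoff above).
    candidates = sorted(corpus, key=lambda d: lexical_score(query, d),
                        reverse=True)[:k]
    return sorted(candidates, key=lambda d: rerank_score(query, d),
                  reverse=True)

corpus = [
    "bert is a pretrained transformer encoder",
    "text ranking orders documents by relevance to a query",
    "dense retrieval encodes queries and documents into vectors",
    "cooking pasta requires boiling water",
]
print(multi_stage_rank("transformer text ranking", corpus, k=2))
```

Dense retrieval, the book's second category, skips the lexical first stage entirely: queries and documents are encoded into vectors offline, and ranking is performed directly by vector similarity over an index.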