State-of-the-art algorithms and theory in a novel domain of machine learning: prediction when the output has structure.

Machine learning develops intelligent computer systems that are able to generalize from previously seen examples. A new domain of machine learning, in which the prediction must satisfy the additional constraints found in structured data, poses one of machine learning's greatest challenges: learning functional dependencies between arbitrary input and output domains. This volume presents and analyzes the state of the art in machine learning algorithms and theory in this novel field. The contributors discuss applications as diverse as machine translation, document markup, computational biology, and information extraction, providing a timely overview of an exciting field.

Contributors:
Yasemin Altun, Gökhan Bakır, Olivier Bousquet, Sumit Chopra,
Corinna Cortes, Hal Daumé III, Ofer Dekel, Zoubin Ghahramani, Raia
Hadsell, Thomas Hofmann, Fu Jie Huang, Yann LeCun, Tobias Mann,
Daniel Marcu, David McAllester, Mehryar Mohri, William Stafford
Noble, Fernando Pérez-Cruz, Massimiliano Pontil, Marc'Aurelio
Ranzato, Juho Rousu, Craig Saunders, Bernhard Schölkopf, Matthias
W. Seeger, Shai Shalev-Shwartz, John Shawe-Taylor, Yoram Singer,
Alexander J. Smola, Sándor Szedmák, Ben Taskar, Ioannis
Tsochantaridis, S.V.N. Vishwanathan, Jason Weston