Data analysis is changing fast. Driven by a vast range of application domains and by affordable tools, machine learning has become mainstream. Unsupervised data analysis methods, including cluster analysis, factor analysis, and low-dimensional mapping, are continually being refined and have reached new heights of achievement in the incredibly rich data world that we inhabit. Statistical Learning and Data Science is a reference work for this rapidly evolving context of converging methodologies. It gathers contributions from some of the foundational thinkers in the different fields of data analysis on the major theoretical results of the domain. On the methodological front, the volume covers conformal prediction and frameworks for assessing confidence in outputs, together with the attendant risk. It illustrates a wide range of applications, including semantics, credit risk, energy production, genomics, and ecology. The book also traces the origins and evolution of unsupervised data analysis, and presents approaches for time series, symbolic data, and functional data. Over the history of multidimensional data analysis, ever more complex data have become available for processing. Supervised machine learning, semi-supervised approaches, and unsupervised data analysis together provide great capability for addressing the digital data deluge. Exploring the foundations and recent breakthroughs of the field, Statistical Learning and Data Science demonstrates how data analysis can improve personal and collective health and the well-being of our social, business, and physical environments.
The first systematic study of parallelism in computation by two pioneers in the field. Reissue of the 1988 Expanded Edition with a new foreword by Leon Bottou. In 1969, ten years after the discovery of the perceptron, which showed that a machine could be taught to perform certain tasks using examples, Marvin Minsky and Seymour Papert published Perceptrons, their analysis of the computational capabilities of perceptrons for specific tasks. As Leon Bottou writes in his foreword to this edition, "Their rigorous work and brilliant technique does not make the perceptron look very good." Perhaps as a result, research turned away from the perceptron. Then the pendulum swung back, and machine learning became the fastest-growing field in computer science, making Minsky and Papert's insistence on the field's theoretical foundations newly relevant. Perceptrons, the first systematic study of parallelism in computation, marked a historic turn in artificial intelligence, returning to the idea that intelligence might emerge from the activity of networks of neuron-like entities. Minsky and Papert provided mathematical analysis showing the limitations of a class of computing machines that could be considered models of the brain. In a chapter added in 1987, they discuss the state of parallel computers and note a central theoretical challenge: reaching a deeper understanding of how "objects" or "agents" with individuality can emerge in a network. Progress in this area would link connectionism with what the authors have called "society theories of mind."
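The "taught to perform certain tasks using examples" idea can be made concrete with Rosenblatt's perceptron learning rule. A minimal sketch, learning the linearly separable AND function (the code and names are my own illustration, not from the book): on each mistake, nudge the weights toward the misclassified example.

```python
import numpy as np

# Training data for the AND function of two binary inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y_and = np.array([0, 0, 0, 1])

def train(X, y, epochs=20, lr=1.0):
    """Rosenblatt perceptron learning rule: update weights only on errors."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            err = yi - pred          # 0 when correct, +/-1 on a mistake
            w += lr * err * xi       # move the boundary toward the example
            b += lr * err
    return w, b

w, b = train(X, y_and)
preds = [(1 if xi @ w + b > 0 else 0) for xi in X]
# AND is linearly separable, so the rule converges and preds equals y_and.
```

Minsky and Papert's point was about the limits of this class of machines: the same rule can never learn XOR, because no single line separates XOR's positive and negative examples.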