The last lecture course that Nobel Prize winner Richard P. Feynman
gave at Caltech from 1983 to 1986 was not on physics but on
computer science. The first edition of the Feynman Lectures on
Computation was published in 1996 and provided an overview of standard
and not-so-standard topics in computer science given in Feynman's
inimitable style. Although now over 20 years old, most of the
material is still relevant and interesting, and Feynman's unique
philosophy of learning and discovery shines through. For this new
edition, Tony Hey has updated the lectures with an invited chapter
from Professor John Preskill on "Quantum Computing 40 Years Later."
This contribution captures the progress made towards building a
quantum computer since Feynman's original suggestions in 1981. The
last 25 years have also seen the "Moore's Law" roadmap for the IT
industry coming to an end. To reflect this transition, John Shalf,
Senior Scientist at Lawrence Berkeley National Laboratory, has
contributed a chapter on "The Future of Computing Beyond Moore's
Law." The final update for this edition captures Feynman's interest
in Artificial Intelligence and Artificial Neural Networks. Eric
Mjolsness, now a professor of Computer Science at the University of
California, Irvine, was a Teaching Assistant for Feynman's original
lecture course and his research interests are now in the
application of Artificial Intelligence and Machine Learning for
multi-scale science. He has contributed a chapter on "Feynman on
Artificial Intelligence and Machine Learning" that captures the
early discussions with Feynman and also looks towards future
developments. This exciting and important work provides key reading
for students and scholars in the fields of computer science and
computational physics.
This unique collection introduces AI, Machine Learning (ML), and
deep neural network technologies leading to scientific discovery
from the datasets generated both by supercomputer simulation and by
modern experimental facilities. Huge quantities of experimental data
come from many sources — telescopes, satellites, gene sequencers,
accelerators, and electron microscopes, including international
facilities such as the Large Hadron Collider (LHC) at CERN in
Geneva and the ITER Tokamak in France. These sources generate many
petabytes moving to exabytes of data per year. Extracting
scientific insights from these data is a major challenge for
scientists, for whom the latest AI developments will be
essential. This timely handbook benefits professionals, researchers,
academics, and students in all fields of science and engineering as
well as AI, ML, and neural networks. Further, the vision evident in
this book inspires all those who influence or are influenced by
scientific progress.
The principles of quantum mechanics are the basis of everything in the physical world, from atoms to stars, from nuclei to lasers. Quantum paradoxes and the eventful life of Schroedinger's Cat are explained, along with the Many Universes explanation of quantum measurement, in this newly revised edition. Updated throughout, the book also looks ahead to the nanotechnology revolution and describes quantum cryptography, computing and teleportation. Including an account of quantum mechanics and science fiction, this accessible book is geared to the general reader. Anthony Hey teaches at the University of Southampton, UK, and is the co-author of several books, including two with Patrick Walters: The Quantum Universe (Cambridge, 1987) and Einstein's Mirror (Cambridge, 1997). Patrick Walters is a Lecturer in Continuing Education at the University of Wales at Swansea. He co-ordinates the Physical Science Programme in DACE, which includes the Astronomy Programme. His research interests include science education, and he also writes non-technical books on science for the general reader and beginning undergraduates. First Edition Pb (1987): 0-521-31845-9
The Theory of Special Relativity is one of the most profound discoveries of the twentieth century. Einstein's Mirror blends a simple, nonmathematical account of the theory of special relativity and gravitation with a description of the way experiments have triumphantly supported these theories. The authors explore the many applications of relativity in atomic and nuclear physics, which range from satellite navigation systems, particle accelerators and nuclear power to quantum chemistry, antimatter and black holes. The book also features a superb collection of photographs and includes amusing anecdotes and biographies about the early pioneers. In the closing chapter, the authors examine the influence of Einstein's relativity on the development of science fiction. General readers with an interest in science will enjoy and benefit from this fascinating and accessible introduction to one of the most important areas of physics.