Welcome to Loot.co.za!
Calls for a "consilient" or "vertically integrated" approach to the study of human mind and culture have, for the most part, been received by scholars in the humanities with either indifference or hostility. One reason for this is that consilience has often been framed as bringing the study of humanistic issues into line with the study of non-human phenomena, rather than as something to which humanists and scientists contribute equally. The other major reason that consilience has yet to catch on in the humanities is a dearth of compelling examples of the benefits of adopting a consilient approach. Creating Consilience is the product of a workshop that brought together internationally renowned scholars from a variety of fields to address both of these issues. It includes representative pieces from workshop speakers and participants that examine how adopting such a consilient stance -- informed by cognitive science and grounded in evolutionary theory -- would concretely impact specific topics in the humanities, examining each topic in a manner that not only cuts across the humanities-natural science divide, but also across individual humanistic disciplines. By taking seriously the fact that science-humanities integration is a two-way exchange, this volume takes a new approach to bridging the cultures of science and the humanities. The editors and contributors formulate how to develop a new shared framework of consilience beyond mere interdisciplinarity, in a way that both sides can accept.
This book presents a history of the correspondence principle from a new perspective. The author provides a unique exploration of the relation between the practice of theory and conceptual development in physics. In the process, he argues for a new understanding of the history of the old quantum theory and the emergence of quantum mechanics. The analysis looks at how the correspondence principle was disseminated and how the principle was applied as a research tool during the 1920s. It provides new insights into the interaction between theoretical tools and scientific problems and shows that the use of this theoretical tool changed the tool itself in a process of transformation through implementation. This process, the author claims, was responsible for the conceptual development of the correspondence principle. This monograph connects to the vast literature in the history of science, which analyzed theoretical practices as based on tacit knowledge, skills, and calculation techniques. It contributes to the historical understanding of quantum physics and the emergence of quantum mechanics. Studying how physicists used a set of tools to solve problems, the author spells out the "skillful guessing" that went into the making of quantum theoretical arguments and argues that the integration and implementation of technical resources was a central driving force for the conceptual and theoretical transformation in the old quantum theory.
Isaac Newton's Scientific Method examines Newton's argument for universal gravity and his application of it to resolve the problem of deciding between geocentric and heliocentric world systems by measuring masses of the sun and planets. William L. Harper suggests that Newton's inferences from phenomena realize an ideal of empirical success that is richer than prediction. Any theory that can achieve this rich sort of empirical success must not only be able to predict the phenomena it purports to explain, but also have those phenomena accurately measure the parameters which explain them. Harper explores the ways in which Newton's method aims to turn theoretical questions into ones which can be answered empirically by measurement from phenomena, and to establish that propositions inferred from phenomena are provisionally accepted as guides to further research. This methodology, guided by its rich ideal of empirical success, supports a conception of scientific progress that does not require construing it as progress toward Laplace's ideal limit of a final theory of everything, and is not threatened by the classic argument against convergent realism. Newton's method endorses the radical theoretical transformation from his theory to Einstein's. Harper argues that it is strikingly realized in the development and application of testing frameworks for relativistic theories of gravity, and very much at work in cosmology today.
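The blurb's central example, measuring the masses of the Sun and planets from orbital phenomena, can be made concrete. The sketch below is not from Harper's book; it is a minimal illustration using standard textbook values and Newton's form of Kepler's third law, M = 4*pi^2*a^3 / (G*T^2), to "measure" the solar mass from the Earth's orbital radius and period:

```python
import math

# Newton's form of Kepler's third law: the mass of the central body is
# measured by the orbit of a satellite: M = 4 * pi^2 * a^3 / (G * T^2).
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
a_earth = 1.496e11  # Earth's mean orbital radius, m (one astronomical unit)
T_earth = 3.156e7   # Earth's orbital period, s (one year)

def central_mass(a, T):
    """Mass of the central body implied by an orbit of radius a and period T."""
    return 4 * math.pi**2 * a**3 / (G * T**2)

M_sun = central_mass(a_earth, T_earth)
print(f"Solar mass measured from Earth's orbit: {M_sun:.3e} kg")
```

The point Harper stresses is that the orbit does not merely fit the theory: it measures the theory's parameter, so independent orbits (Venus, Mars, comets) yield agreeing measurements of the same solar mass.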
This book provides an accessible and up-to-date overview of the foundational issues of the digital era: both the emerging constructive understandings of it and the still hidden and ignored aspects that could prove dramatically relevant in the future, in the process of a technological humanism. The book raises relevant scientific and ethical questions, bringing together professionals and researchers from different professional and disciplinary fields who share an interest in investigating the operative aspects of the technological, digital and cultural transitions of humans and their capacity to build human societies. The challenges are clear, but there is a lack of an epistemological, anthropological, economic and social agenda that would enable such transitions to be steered towards a technological humanism. This book provides an ideal platform for professionals and scholars, not only providing tools for problem analysis, but also indicating shared directions, needs and objectives for a common goal: the creation of new scenarios instead of the creation of fears and manipulated social imaginaries.
John von Neumann (1903-1957) was undoubtedly one of the scientific geniuses of the 20th century. The main fields to which he contributed include various disciplines of pure and applied mathematics, mathematical and theoretical physics, logic, theoretical computer science, and computer architecture. Von Neumann was also actively involved in politics and science management and he had a major impact on US government decisions during, and especially after, the Second World War. There exist several popular books on his personality and various collections focusing on his achievements in mathematics, computer science, and economics. Strangely enough, to date no detailed appraisal of his seminal contributions to the mathematical foundations of quantum physics has appeared. Von Neumann's theory of measurement and his critique of hidden variables became the touchstone of most debates in the foundations of quantum mechanics. Today, his name also figures most prominently in the mathematically rigorous branches of contemporary quantum mechanics of large systems and quantum field theory. And finally - as one of his last lectures, published in this volume for the first time, shows - he considered the relation of quantum logic and quantum mechanical probability as his most important problem for the second half of the twentieth century. The present volume embraces both historical and systematic analyses of his methodology of mathematical physics, and of the various aspects of his work in the foundations of quantum physics, such as theory of measurement, quantum logic, and quantum mechanical entropy. The volume is rounded off by previously unpublished letters and lectures documenting von Neumann's thinking about quantum theory after his 1932 Mathematical Foundations of Quantum Mechanics. The general part of the Yearbook contains papers emerging from the Institute's annual lecture series and reviews of important publications of philosophy of science and its history.
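The quantum mechanical entropy mentioned here is von Neumann's S = -Tr(rho ln rho). As a minimal illustration (mine, not taken from the volume), the entropy of a single qubit can be computed by hand from the eigenvalues of its 2x2 density matrix, using only the standard library:

```python
import math

def qubit_entropy(rho):
    """von Neumann entropy S = -sum(lam * ln(lam)) of a real symmetric
    2x2 density matrix rho = [[a, b], [b, d]], via its eigenvalues."""
    (a, b), (_, d) = rho
    tr, det = a + d, a * d - b * b
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    eigs = [(tr + disc) / 2, (tr - disc) / 2]
    # Convention: 0 * ln(0) = 0, so zero eigenvalues contribute nothing.
    return -sum(lam * math.log(lam) for lam in eigs if lam > 1e-12)

pure = [[1.0, 0.0], [0.0, 0.0]]    # pure state: entropy 0
mixed = [[0.5, 0.0], [0.0, 0.5]]   # maximally mixed state: entropy ln 2
print(qubit_entropy(pure), qubit_entropy(mixed))
```

A pure state has zero entropy; the maximally mixed qubit has the maximum, ln 2 nats.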
In both science and philosophy, the twentieth century saw a radical breakdown of certainty in the human worldview, as quantum uncertainty and linguistic ambiguity destroyed the comfortable certitudes of the past. As these disciplines form the foundation for a human position in the world, a major epistemological reorganization had to take place. In this book, quantum theorist Stig Stenholm presents Bohr and Wittgenstein, in physics and in philosophy, as central figures representing this revision. Each of them took up the challenge of replacing apparent order and certainty with a provisional understanding based on limited concepts in constant flux. Stenholm concludes that the modern synthesis created by their heirs is far from satisfactory, and the story is so far an unfinished one. The book will appeal to any researcher in either discipline curious about the foundation of modern science, and works to provoke a renewal of discussion, and the eventual emergence of a reformed clarity and understanding.
This book explores the deep meaning-the nature or essence-of the economy and its fundamental components. As a monograph on the philosophy of the economy and economics, it deduces the metaphysical nature of these two, going step by step from more general to more specific realities to finally arrive at the adequate features of the economic sciences and their methods. It builds on a largely Aristotelian approach, but also draws extensively from modern scholarship in the area. Usefully and pertinently, the book covers both general aspects of the economy and particular historically specific features. Among the important topics covered in the book are the meanings of the economy, the nature and role of economic agents, the nature of the macroeconomy, the nature and role of money, and so on. The book concludes with chapters on the nature of economics itself and its methodologies.
In the following pages I have endeavored to show the impact on philosophy of technology and science; more specifically, I have tried to make up for the neglect by the classical philosophers of the historic role of technology and also to suggest what positive effects on philosophy the almost daily advances in the physical sciences might have. Above all, I wanted to remind the ontologist of his debt to the artificer: technology with its recent gigantic achievements has introduced a new ingredient into the world, and so is sure to influence our knowledge of what there is. This book, then, could as well have been called 'Ethnotechnology: An Explanation of Human Behavior by Means of Material Culture', but the picture is a complex one, and there are many more special problems that need to be prominently featured in the discussion. Human culture never goes forward on all fronts at the same time. In our era it is unquestionably not only technology but also the sciences which are making the most rapid progress. Philosophy has not been very successful at keeping up with them. As a consequence there is an 'enormous gulf between scientists and philosophers today, a gulf which is as large as it has ever been.' (1) I can see that with science moving so rapidly, its current lessons for philosophy might well be outmoded tomorrow.
Inside you lies a precise scientific instrument - the ability to observe Nature and recall past experiences. You were born with it and you use it every day. You can be trained to use it more effectively to, for example, compare and discover new species of organisms or new minerals. Our senses do have limitations, and we often use microscopes, telescopes and other tools to aid our observation. However, we benefit from knowing their limitations and the impact they have on our ability to combine our observations and our experience to make decisions. Once these tools replace our direct observation and our experience, we ourselves become disconnected from Nature. Scientific practice turns into well-meant opinions outweighing empirical evidence. This is happening now in the current age of big data and artificial intelligence. The author calls this the Modern Hubris and it is slowly corroding science. To combat the Modern Hubris and to reconnect with Nature, scientists need to change the way they practise observation. To do so may require the scientist to transform themselves. One person who successfully did this was Johann Wolfgang von Goethe. His journey demonstrates how one man attempted to take on the Modern Hubris by transforming his life and how he saw Nature. Following Goethe's transformation teaches us how we can also reconnect ourselves with Nature and Natural science.
Geometry has fascinated philosophers since the days of Thales and Pythagoras. In the 17th and 18th centuries it provided a paradigm of knowledge after which some thinkers tried to pattern their own metaphysical systems. But after the discovery of non-Euclidean geometries in the 19th century, the nature and scope of geometry became a bone of contention. Philosophical concern with geometry increased in the 1920s after Einstein used Riemannian geometry in his theory of gravitation. During the last fifteen or twenty years, renewed interest in the latter theory - prompted by advances in cosmology - has brought geometry once again to the forefront of philosophical discussion. The issues at stake in the current epistemological debate about geometry can only be understood in the light of history, and, in fact, most recent works on the subject include historical material. In this book, I try to give a selective critical survey of modern philosophy of geometry during its seminal period, which can be said to have begun shortly after 1850 with Riemann's generalized conception of space and to achieve some sort of completion at the turn of the century with Hilbert's axiomatics and Poincaré's conventionalism. The philosophy of geometry of Einstein and his contemporaries will be the subject of another book. The book is divided into four chapters. Chapter 1 provides background information about the history of science and philosophy.
This book presents a study of the various feelings of awe and wonder experienced by astronauts during space flight. It summarizes the results of two experimental, interdisciplinary studies that employ methods from neuroscience, psychology, phenomenology and simulation technology, and it argues for a non-reductionist approach to cognitive science.
In Search of a Theory of Everything takes readers on an adventurous journey through space and time on a quest for a unified "theory of everything" by means of a rare and agile interplay between the natural philosophies of influential ancient Greek thinkers and the laws of modern physics. By narrating a history and a philosophy of science, theoretical physicist Demetris Nicolaides logically connects great feats of critical mind and unbridled human imagination in their ambitious quest for the theory that will ultimately explain all the phenomena of nature via a single immutable overarching law. This comparative study of the universe tells the story of physics through philosophy, of the current via the forgotten, in a balanced way. Nicolaides begins each chapter with a relatively easier analysis of nature-one conceived by a major natural philosopher of antiquity-easing readers gradually into the more complex views of modern physics, by intertwining finely the two, the ancient with the new. Those philosophers' rigorous scientific inquiry of the universe includes ideas that resonate with aspects of modern science, puzzles about nature that still baffle, and clever philosophical arguments that are used today to reassess competing principles of modern physics and speculate about open physics problems. In Search of a Theory of Everything is a new kind of sight, a philosophical insight of modern physics that has long been left unexamined.
The incredible achievements of modern scientific theories lead most of us to embrace scientific realism: the view that our best theories offer us at least roughly accurate descriptions of otherwise inaccessible parts of the world like genes, atoms, and the big bang. In Exceeding Our Grasp, Stanford argues that careful attention to the history of scientific investigation invites a challenge to this view that is not well represented in contemporary debates about the nature of the scientific enterprise.
Understanding the emergence of a scientific culture - one in which cognitive values generally are modelled on, or subordinated to, scientific ones - is one of the foremost historical and philosophical problems with which we are now confronted. The significance of the emergence of such scientific values lies above all in their ability to provide the criteria by which we come to appraise cognitive enquiry, and which shape our understanding of what it can achieve.
This book is a tribute to the scientific legacy of GianCarlo Ghirardi, who was one of the most influential scientists in the field of modern foundations of quantum theory. In this appraisal, contributions from friends, collaborators and colleagues reflect the influence of his world of thoughts on theory, experiments and philosophy, while also offering prospects for future research in the foundations of quantum physics. The themes of the contributions revolve around the physical reality of the wave function and its notorious collapse, randomness, relativity and experiments.
Jaakko Hintikka is one of the most creative figures in contemporary philosophy. He has made significant contributions to virtually all areas of the discipline, from epistemology and the philosophy of logic to the history of philosophy and the philosophy of science. Part of the fruitfulness of Hintikka's work is due to its opening important new lines of investigation and new approaches to traditional philosophical problems. This volume gathers together essays from some of Hintikka's colleagues and former students exploring his influence on their work and pursuing some of the insights that we have found in his work. This book includes a comprehensive overview of Hintikka's philosophy by Dan Kolak and John Symons and an annotated bibliography of Hintikka's work.
Energy is at the heart of physics (and of huge importance to society), and yet no book exists specifically to explain it in simple terms. In tracking the history of energy, this book is filled with the thrill of the chase, the mystery of smoke and mirrors, and presents a fascinating human-interest story. Following the history provides a crucial aid to understanding: this book explains the intellectual revolutions required to comprehend energy, revolutions as profound as those stemming from Relativity and Quantum Theory. Texts by Descartes, Leibniz, Bernoulli, d'Alembert, Lagrange, Hamilton, Boltzmann, Clausius, Carnot and others are made accessible, and the engines of Watt and Joule are explained.
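The conceptual payoff of the history this blurb describes is the interconvertibility of mechanical work and heat, which Joule demonstrated with his paddle-wheel experiment. A minimal sketch of the arithmetic (illustrative numbers only, not the dimensions of Joule's actual apparatus):

```python
# Joule's paddle-wheel experiment in outline: the mechanical work done by a
# falling weight ends up as heat in water, raising its temperature.
g = 9.81          # gravitational acceleration, m/s^2
c_water = 4186.0  # specific heat of water, J/(kg*K)

def temperature_rise(m_weight, drop_height, m_water):
    """Temperature rise of m_water kg of water when a weight of m_weight kg
    falls drop_height metres and all of its work is converted to heat."""
    work = m_weight * g * drop_height   # W = m * g * h, in joules
    return work / (c_water * m_water)   # dT = W / (c * m)

print(temperature_rise(1.0, 10.0, 0.1))  # 1 kg falling 10 m into 100 g of water
```

The tiny temperature rises involved (fractions of a kelvin) are part of why the measurement was such a feat, and why the mechanical equivalent of heat was established so late.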
This book develops an original theory of decision-making based on the concept of plausibility. The author advocates plausible reasoning as a general philosophical method and demonstrates how it can be applied to problems in argumentation theory, scientific theory choice, risk management, ethics, law, economics, and epistemology. Human decisions are conditioned by formidable uncertainty. The standard resource for dealing rationally with uncertainty is the mathematical concept of probability. The probability calculus is well-known, but since the numerical demands for applying it cannot usually be met, it is not widely applicable. By contrast, the concept of plausibility is widely applicable, but it is little known. This book relies on a generalized concept of plausibility whose strength is its adaptability. The adaptability is due to a novel form of decision theory that takes plausibilities as inputs. This form of decision theory remains applicable to decisions informed by sharp probabilities and utilities, but it can also be applied to decisions that must be made without them. It can aid in the rationally critical enterprise of discriminating good arguments from bad, and this can foster philosophical progress. A Plea for Plausibility will be of interest to scholars and advanced students working in argumentation theory, philosophy of science, ethics, epistemology, economics, law, and risk management.
In May of 1973 we organized an international research colloquium on foundations of probability, statistics, and statistical theories of science at the University of Western Ontario. During the past four decades there have been striking formal advances in our understanding of logic, semantics and algebraic structure in probabilistic and statistical theories. These advances, which include the development of the relations between semantics and metamathematics, between logics and algebras and the algebraic-geometrical foundations of statistical theories (especially in the sciences), have led to striking new insights into the formal and conceptual structure of probability and statistical theory and their scientific applications in the form of scientific theory. The foundations of statistics are in a state of profound conflict. Fisher's objections to some aspects of Neyman-Pearson statistics have long been well known. More recently the emergence of Bayesian statistics as a radical alternative to standard views has made the conflict especially acute. In recent years the response of many practising statisticians to the conflict has been an eclectic approach to statistical inference. Many good statisticians have developed a kind of wisdom which enables them to know which problems are most appropriately handled by each of the methods available. The search for principles which would explain why each of the methods works where it does and fails where it does offers a fruitful approach to the controversy over foundations.
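The conflict described here can be seen in miniature on a single dataset. The sketch below is my illustration, not material from the colloquium volume: it contrasts a Neyman-Pearson-style one-sided p-value with a Bayesian posterior probability for the same coin-flip data, using only the standard library.

```python
from math import comb

n, heads = 100, 60  # observed data: 60 heads in 100 tosses

# Frequentist: one-sided p-value, P(X >= 60) under the null hypothesis p = 1/2.
p_value = sum(comb(n, k) for k in range(heads, n + 1)) / 2**n

# Bayesian: with a uniform Beta(1, 1) prior, the posterior over the bias p
# is Beta(heads + 1, n - heads + 1).  For integer parameters the Beta CDF
# reduces to a binomial tail sum, so P(p > 1/2 | data) needs no integration.
a, b = heads + 1, n - heads + 1
m = a + b - 1
posterior_gt_half = 1 - sum(comb(m, j) for j in range(a, m + 1)) / 2**m

print(f"p-value = {p_value:.4f}, P(bias > 1/2 | data) = {posterior_gt_half:.4f}")
```

The two numbers answer different questions - the probability of data at least this extreme under the null, versus the probability of a hypothesis given the data - which is precisely why practitioners developed the eclectic wisdom the blurb describes.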
Brian Skyrms presents a fascinating exploration of how fundamental signals are to our world. He uses a variety of tools -- theories of signaling games, information, evolution, and learning -- to investigate how meaning and communication develop. He shows how signaling games themselves evolve, and introduces a new model of learning with invention. The juxtaposition of atomic signals leads to complex signals, as the natural product of a gradual process. Signals operate in networks of senders and receivers at all levels of life. Information is transmitted, but it is also processed in various ways. That is how we think -- signals run around a very complicated signaling network. Signaling is a key ingredient in the evolution of teamwork, in the human but also in the animal world, even in micro-organisms. Communication and co-ordination of action are different aspects of the flow of information, and are both effected by signals.
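The basic model behind this work, the Lewis sender-receiver game with simple reinforcement learning, is easy to simulate. Below is a minimal sketch (my own, not code from the book): nature picks one of two states, the sender chooses a signal, the receiver chooses an act, and both reinforce their choices whenever the act matches the state.

```python
import random

random.seed(0)
S = 2  # states, signals, and acts all range over {0, 1}

# Urn weights: sender[state][signal] and receiver[signal][act], all starting at 1.
sender = [[1.0] * S for _ in range(S)]
receiver = [[1.0] * S for _ in range(S)]

def draw(weights):
    """Pick an index with probability proportional to its weight."""
    return random.choices(range(S), weights=weights)[0]

successes = []
for _ in range(10_000):
    state = random.randrange(S)       # nature chooses a state
    signal = draw(sender[state])      # sender chooses a signal for that state
    act = draw(receiver[signal])      # receiver chooses an act on that signal
    if act == state:                  # a payoff of 1 on a match...
        sender[state][signal] += 1    # ...reinforces both players' choices
        receiver[signal][act] += 1
    successes.append(act == state)

recent = sum(successes[-1000:]) / 1000
print(f"success rate over last 1000 rounds: {recent:.2f}")
```

Under these dynamics the players typically converge on one of the two "signaling systems": conventions under which the initially meaningless signals come to carry perfect information about the state.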
This book provides a comprehensive overview of the nature of explanation as given in both the natural and social sciences. It discusses models of explanation adopted in the natural and social sciences. The author also elaborates upon naturalistic and anti-naturalistic views and other types of explanation in social science, such as functional and purposive explanation. The volume elaborates upon themes like the bridge principle, functional explanation, purposive explanation, teleological explanation, prediction, methodological individualism, methodological collectivism, illocutionary redescription, the principle of action, and dispositional explanations, to understand whether the explanations given in the realm of the social sciences are the same as or different from those given in the field of the natural sciences. This introductory book is a must-read for students and scholars of philosophy of science, logic, science and technology studies, social sciences, and philosophy in general.
This volume traces the origins and evolution of the idea of human extinction, from the ancient Presocratics through contemporary work on "existential risks." Many leading intellectuals agree that the risk of human extinction this century may be higher than at any point in our 300,000-year history as a species. This book provides insight on the key questions that inform this discussion, including when humans began to worry about their own extinction and how the debate has changed over time. It establishes a new theoretical foundation for thinking about the ethics of our extinction, arguing that extinction would be very bad under most circumstances, although the outcome might be, on balance, good. Throughout the book, graphs, tables, and images further illustrate how human choices and attitudes about extinction have evolved in Western history. In its thorough examination of humanity’s past, this book also provides a starting point for understanding our future. Although accessible enough to be read by undergraduates, Human Extinction contains new and thought-provoking research that will benefit even established academic philosophers and historians.
This groundbreaking, open access volume analyses and compares data practices across several fields through the analysis of specific cases of data journeys. It brings together leading scholars in the philosophy, history and social studies of science to achieve two goals: tracking the travel of data across different spaces, times and domains of research practice; and documenting how such journeys affect the use of data as evidence and the knowledge being produced. The volume captures the opportunities, challenges and concerns involved in making data move from the sites in which they are originally produced to sites where they can be integrated with other data, analysed and re-used for a variety of purposes. The in-depth study of data journeys provides the necessary ground to examine disciplinary, geographical and historical differences and similarities in data management, processing and interpretation, thus identifying the key conditions of possibility for the widespread data sharing associated with Big and Open Data. The chapters are ordered in sections that broadly correspond to different stages of the journeys of data, from their generation to the legitimisation of their use for specific purposes. Additionally, the preface to the volume provides a variety of alternative "roadmaps" aimed to serve the different interests and entry points of readers; and the introduction provides a substantive overview of what data journeys can teach about the methods and epistemology of research.
This book examines what seems to be the basic challenge in neuroscience today: understanding how experience generated by the human brain is related to the physical world we live in. The 25 short chapters present the argument and evidence that brains address this problem on a wholly trial and error basis. The goal is to encourage neuroscientists, computer scientists, philosophers, and other interested readers to consider this concept of neural function and its implications, not least of which is the conclusion that brains don't "compute."