People who learn to solve problems ‘on the job’ often have to do it differently from people who learn in theory. Practical knowledge and theoretical knowledge are different in some ways but similar in others - otherwise one or the other would lead to wrong solutions. Mathematics is no exception: people who learn to calculate because they are involved in commerce, for example, frequently have a more practical way of doing mathematics than the way taught at school. This book is about the differences between what the authors call practical knowledge of mathematics - street mathematics - and mathematics learned in school but not in practice. The authors examine the differences between these two ways of solving mathematical problems, discuss their advantages and disadvantages, and consider ways of bringing theory and practice together in mathematics teaching.
The Conceptual Roots of Mathematics is a comprehensive study of the foundation of mathematics. J.R. Lucas, one of the most distinguished Oxford scholars, covers a vast amount of ground in the philosophy of mathematics, showing us that it is actually at the heart of the study of epistemology and metaphysics.
We use addition on a daily basis--yet how many of us stop to truly consider the enormous and remarkable ramifications of this mathematical activity? Summing It Up uses addition as a springboard to present a fascinating and accessible look at numbers and number theory, and how we apply beautiful numerical properties to answer math problems. Mathematicians Avner Ash and Robert Gross explore addition's most basic characteristics as well as the addition of squares and other powers before moving onward to infinite series, modular forms, and issues at the forefront of current mathematical research. Ash and Gross tailor their succinct and engaging investigations for math enthusiasts of all backgrounds. Employing college algebra, the first part of the book examines such questions as, can all positive numbers be written as a sum of four perfect squares? The second section of the book incorporates calculus and examines infinite series--long sums that can only be defined by the concept of limit, as in the example of 1+1/2+1/4+...=? With the help of some group theory and geometry, the third section ties together the first two parts of the book through a discussion of modular forms--the analytic functions on the upper half-plane of the complex numbers that have growth and transformation properties. Ash and Gross show how modular forms are indispensable in modern number theory, for example in the proof of Fermat's Last Theorem. Appropriate for numbers novices as well as college math majors, Summing It Up delves into mathematics that will enlighten anyone fascinated by numbers.
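Both questions mentioned in the blurb can be stated precisely. The first is answered affirmatively by Lagrange's four-square theorem (for example, 7 = 2^2 + 1^2 + 1^2 + 1^2), and the infinite series cited is the geometric series, whose value follows directly from the definition of a limit. A brief sketch of that computation (our addition, not a quotation from the book):

\[
1 + \tfrac{1}{2} + \tfrac{1}{4} + \cdots \;=\; \sum_{n=0}^{\infty} 2^{-n} \;=\; \lim_{N\to\infty} \sum_{n=0}^{N} 2^{-n} \;=\; \lim_{N\to\infty}\bigl(2 - 2^{-N}\bigr) \;=\; 2.
\]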
From Ancient Greek times, music has been seen as a mathematical art, and the relationship between mathematics and music has fascinated generations. This collection of wide-ranging, comprehensive and fully illustrated papers, authored by leading scholars, presents the link between these two subjects in a lucid manner that is suitable for students of both subjects, as well as the general reader with an interest in music. Physical, theoretical, physiological, acoustic, compositional and analytical relationships between mathematics and music are unfolded and explored, with a focus on tuning and temperament, the mathematics of sound, bell-ringing and modern compositional techniques.
This book presents a new approach to the epistemology of mathematics by viewing mathematics as a human activity whose knowledge is intimately linked with practice. Charting an exciting new direction in the philosophy of mathematics, Jose Ferreiros uses the crucial idea of a continuum to provide an account of the development of mathematical knowledge that reflects the actual experience of doing math and makes sense of the perceived objectivity of mathematical results. Describing a historically oriented, agent-based philosophy of mathematics, Ferreiros shows how the mathematical tradition evolved from Euclidean geometry to the real numbers and set-theoretic structures. He argues for the need to take into account a whole web of mathematical and other practices that are learned and linked by agents, and whose interplay acts as a constraint. Ferreiros demonstrates how advanced mathematics, far from being a priori, is based on hypotheses, in contrast to elementary math, which has strong cognitive and practical roots and therefore enjoys certainty. Offering a wealth of philosophical and historical insights, Mathematical Knowledge and the Interplay of Practices challenges us to rethink some of our most basic assumptions about mathematics, its objectivity, and its relationship to culture and science.
This is an open access title available under the terms of a CC BY-NC-ND 4.0 licence. It is free to read at Oxford Scholarship Online and offered as a free PDF download from OUP and selected open access locations. Recently, debates about mathematical structuralism have picked up steam again within the philosophy of mathematics, probing ontological and epistemological issues in novel ways. These debates build on discussions of structuralism which began in the 1960s in the work of philosophers such as Paul Benacerraf and Hilary Putnam; going further than these previous thinkers, however, these new debates also recognize that the motivation for structuralist views should be tied to methodological developments within mathematics. In fact, practically all relevant ideas and methods have roots in the structuralist transformation that modern mathematics underwent in the 19th and early 20th centuries. This edited volume of new essays by top scholars in the philosophy of mathematics explores this previously overlooked 'pre-history' of mathematical structuralism. The contributors explore this historical background along two distinct but interconnected dimensions. First, they reconsider the methodological contributions of major figures in the history of mathematics, such as Dedekind, Hilbert, and Bourbaki, who are responsible for the introduction of new number systems, algebras, and geometries that transformed the landscape of mathematics. Second, they reexamine a range of philosophical reflections by mathematically inclined philosophers, like Russell, Cassirer, and Quine, whose work led to profound conclusions about logical, epistemological, and metaphysical aspects of structuralism. Overall, the essays in this volume show not only that the pre-history of mathematical structuralism is much richer than commonly appreciated, but also that it is crucial to take into account this broader intellectual history for enriching current debates in the philosophy of mathematics. The insights included in this volume will interest scholars and students in the philosophy of mathematics, the philosophy of science, and the history of philosophy.
The Philosophy of Mathematics Today gives a panorama of the best current work in this lively field, through twenty essays specially written for this collection by leading figures. The topics include indeterminacy, logical consequence, mathematical methodology, abstraction, and both Hilbert's and Frege's foundational programmes. The collection will be an important source for research in the philosophy of mathematics for years to come. Contributors Paul Benacerraf, George Boolos, John P. Burgess, Charles S. Chihara, Michael Detlefsen, Michael Dummett, Hartry Field, Kit Fine, Bob Hale, Richard G. Heck, Jnr., Geoffrey Hellman, Penelope Maddy, Karl-Georg Niebergall, Charles D. Parsons, Michael D. Resnik, Matthias Schirn, Stewart Shapiro, Peter Simons, W.W. Tait, Crispin Wright.
Logical monism is the claim that there is a single correct logic, the 'one true logic' of our title. The view has evident appeal, as it reflects assumptions made in ordinary reasoning as well as in mathematics, the sciences, and the law. In all these spheres, we tend to believe that there are determinate facts about the validity of arguments. Despite its evident appeal, however, logical monism must meet two challenges. The first is the challenge from logical pluralism, according to which there is more than one correct logic. The second challenge is to determine which form of logical monism is the correct one. One True Logic is the first monograph to explicitly articulate a version of logical monism and defend it against the first challenge. It provides a critical overview of the monism vs pluralism debate and argues for the former. It also responds to the second challenge by defending a particular monism, based on a highly infinitary logic. It breaks new ground on a number of fronts and unifies disparate discussions in the philosophical and logical literature. In particular, it generalises the Tarski-Sher criterion of logicality, provides a novel defence of this generalisation, offers a clear new argument for the logicality of infinitary logic and replies to recent pluralist arguments.
Even the most enthusiastic of maths students probably at one time wondered when exactly it would all prove useful in ‘real life’. Well, maths reaches so far and wide through our world that, love it or hate it, we’re all doing maths almost every minute of every day. David Darling and Agnijo Banerjee go in search of the perfect labyrinth, journey back to the second century in pursuit of ‘bubble maths’, reveal the weirdest mathematicians in history and transform the bewildering into the beautiful, delighting us once again.
Thinking about Mathematics covers the range of philosophical issues and positions concerning mathematics. The text describes the questions about mathematics that motivated philosophers throughout its history and covers historical figures, such as Plato, Aristotle, Kant, and Mill. It presents the major positions and arguments concerning mathematics throughout the twentieth century, bringing the reader up to the present positions and battle lines.
Paolo Mancosu provides an original investigation of historical and systematic aspects of the notions of abstraction and infinity and their interaction. A familiar way of introducing concepts in mathematics rests on so-called definitions by abstraction. An example of this is Hume's Principle, which introduces the concept of number by stating that two concepts have the same number if and only if the objects falling under each one of them can be put in one-one correspondence. This principle is at the core of neo-logicism. In the first two chapters of the book, Mancosu provides a historical analysis of the mathematical uses and foundational discussion of definitions by abstraction up to Frege, Peano, and Russell. Chapter one shows that abstraction principles were quite widespread in the mathematical practice that preceded Frege's discussion of them, and the second chapter provides the first contextual analysis of Frege's discussion of abstraction principles in section 64 of the Grundlagen. In the second part of the book, Mancosu discusses a novel approach to measuring the size of infinite sets known as the theory of numerosities and shows how this new development leads to deep mathematical, historical, and philosophical problems. The final chapter of the book explores how this theory of numerosities can be exploited to provide surprisingly novel perspectives on neo-logicism.
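For readers who want the principle itself, Hume's Principle is standardly rendered in the neo-logicist literature roughly as follows (a common schematic second-order formulation, not a quotation from Mancosu's book):

\[
\#F = \#G \;\leftrightarrow\; F \approx G,
\]
where $F \approx G$ abbreviates the second-order claim that there is a relation putting the objects falling under $F$ and those falling under $G$ into one-one correspondence.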
In recent years there have been a number of books-both anthologies and monographs-that have focused on the Liar Paradox and, more generally, on the semantic paradoxes, either offering proposed treatments to those paradoxes or critically evaluating ones that occupy logical space. At the same time, there are a number of people who do great work in philosophy, who have various semantic, logical, metaphysical and/or epistemological commitments that suggest that they should say something about the Liar Paradox, yet who have said very little, if anything, about that paradox or about the extant projects involving it. The purpose of this volume is to afford those philosophers the opportunity to address what might be described as reflections on the Liar.
Logic is a field studied mainly by researchers and students of philosophy, mathematics and computing. Inductive logic seeks to determine the extent to which the premisses of an argument entail its conclusion, aiming to provide a theory of how one should reason in the face of uncertainty. It has applications to decision making and artificial intelligence, as well as to how scientists should reason when not in possession of the full facts. In this book, Jon Williamson embarks on a quest to find a general, reasonable, applicable inductive logic (GRAIL), all the while examining why pioneers such as Ludwig Wittgenstein and Rudolf Carnap did not entirely succeed in this task. Along the way he presents a general framework for the field and reaches a new inductive logic, which builds upon recent developments in Bayesian epistemology (a theory about how strongly one should believe the various propositions that one can express). The book explores this logic in detail, discusses some key criticisms, and considers how it might be justified. Is this truly the GRAIL? Although the book presents new research, the material is well suited to being delivered as a series of lectures to students of philosophy, mathematics, or computing, and doubles as an introduction to the field of inductive logic.
In this third installment of his classic 'Foundations' trilogy, Michel Serres takes on the history of geometry and mathematics. Even more broadly, Geometry is about the beginnings of things and about how those beginnings have shaped how we continue to think philosophically and critically. Serres rejects a traditional history of mathematics that unfolds in a linear manner, arguing instead for the need to delve into the past of maths and identify a series of ruptures which can help shed light on how this discipline has developed and how, in turn, the way we think has been shaped and formed. This meticulous and lyrical rendering is the first English translation of this key text in the history of ideas.
In this illuminating collection, Charles Parsons surveys the contributions of philosophers and mathematicians who shaped the philosophy of mathematics over the course of the past century. Parsons begins with a discussion of the Kantian legacy in the work of L. E. J. Brouwer, David Hilbert, and Paul Bernays, shedding light on how Bernays revised his philosophy after his collaboration with Hilbert. He considers Hermann Weyl's idea of a "vicious circle" in the foundations of mathematics, a radical claim that elicited many challenges. Turning to Kurt Goedel, whose incompleteness theorem transformed debate on the foundations of mathematics and brought mathematical logic to maturity, Parsons discusses his essay on Bertrand Russell's mathematical logic--Goedel's first mature philosophical statement and an avowal of his Platonistic view. Philosophy of Mathematics in the Twentieth Century insightfully treats the contributions of figures the author knew personally: W. V. Quine, Hilary Putnam, Hao Wang, and William Tait. Quine's early work on ontology is explored, as is his nominalistic view of predication and his use of the genetic method of explanation in the late work The Roots of Reference. Parsons attempts to tease out Putnam's views on existence and ontology, especially in relation to logic and mathematics. Wang's contributions to subjects ranging from the concept of set, minds, and machines to the interpretation of Goedel are examined, as are Tait's axiomatic conception of mathematics, his minimalist realism, and his thoughts on historical figures.
Towards Non-Being presents an account of the semantics of intentional language - verbs such as 'believes', 'fears', 'seeks', 'imagines'. Graham Priest tackles problems concerning intentional states which are often brushed under the carpet in discussions of intentionality, such as their failure to be closed under deducibility. Priest's account draws on the work of the late Richard Routley (Sylvan), and proceeds in terms of objects that may be either existent or non-existent, at worlds that may be either possible or impossible. Since Russell, non-existent objects have had a bad press in Western philosophy; Priest mounts a full-scale defence. In the process, he offers an account of both fictional and mathematical objects as non-existent. The book will be of central interest to anyone who is concerned with intentionality in the philosophy of mind or philosophy of language, the metaphysics of existence and identity, the philosophy of fiction, the philosophy of mathematics, or cognitive representation in AI. This updated second edition adds ten new chapters to the original eight. These further develop the ideas of the first edition, reply to critics, and explore new areas of relevance. New topics covered include: conceivability, realism/antirealism concerning non-existent objects, self-deception, and the verb 'to be'.
The genesis of the digital idea and why it transformed civilization A few short decades ago, we were informed by the smooth signals of analog television and radio; we communicated using our analog telephones; and we even computed with analog computers. Today our world is digital, built with zeros and ones. Why did this revolution occur? The Discrete Charm of the Machine explains, in an engaging and accessible manner, the varied physical and logical reasons behind this radical transformation. The spark of individual genius shines through this story of innovation: the stored program of Jacquard’s loom; Charles Babbage’s logical branching; Alan Turing’s brilliant abstraction of the discrete machine; Harry Nyquist’s foundation for digital signal processing; Claude Shannon’s breakthrough insights into the meaning of information and bandwidth; and Richard Feynman’s prescient proposals for nanotechnology and quantum computing. Ken Steiglitz follows the progression of these ideas in the building of our digital world, from the internet and artificial intelligence to the edge of the unknown. Are questions like the famous traveling salesman problem truly beyond the reach of ordinary digital computers? Can quantum computers transcend these barriers? Does a mysterious magical power reside in the analog mechanisms of the brain? Steiglitz concludes by confronting the moral and aesthetic questions raised by the development of artificial intelligence and autonomous robots. The Discrete Charm of the Machine examines why our information technology, the lifeblood of our civilization, became digital, and challenges us to think about where its future trajectory may lead.
Sets are central to mathematics and its foundations, but what are they? In this book Luca Incurvati provides a detailed examination of all the major conceptions of set and discusses their virtues and shortcomings, as well as introducing the fundamentals of the alternative set theories with which these conceptions are associated. He shows that the conceptual landscape includes not only the naive and iterative conceptions but also the limitation of size conception, the definite conception, the stratified conception and the graph conception. In addition, he presents a novel, minimalist account of the iterative conception which does not require the existence of a relation of metaphysical dependence between a set and its members. His book will be of interest to researchers and advanced students in logic and the philosophy of mathematics.
Leibniz published the Dissertation on Combinatorial Art in 1666. This book contains the seeds of Leibniz's mature thought, as well as many of the mathematical ideas that he would go on to further develop after the invention of the calculus. It is in the Dissertation, for instance, that we find the project for the construction of a logical calculus clearly expressed for the first time. The idea of encoding terms and propositions by means of numbers, later developed by Kurt Goedel, also appears in this work. In this text, furthermore, Leibniz conceives the possibility of constituting a universal language or universal characteristic, a project that he would pursue for the rest of his life. Mugnai, van Ruler, and Wilson present the first full English translation of the Dissertation, complete with a critical introduction and a comprehensive commentary.
Open a newspaper or turn on the television and it would appear that science and religion are polar opposites - mutually exclusive bedfellows competing for hearts and minds. There is little indication of the rich interaction between religion and science throughout history, much of which continues today. From ancient to modern times, mathematicians have played a key role in this interaction. This is a book on the relationship between mathematics and religious beliefs. It aims to show that, throughout scientific history, mathematics has been used to make sense of the 'big' questions of life, and that religious beliefs sometimes drove mathematicians to do mathematics in order to make sense of the world. Containing contributions from a wide array of scholars in the fields of philosophy, history of science and history of mathematics, this book shows that the intersection between mathematics and theism is rich in both culture and character. Chapters cover a fascinating range of topics including the Sect of the Pythagoreans, Newton's views on the apocalypse, Charles Dodgson's Anglican faith and Goedel's proof of the existence of God.
A comprehensive look at four of the most famous problems in mathematics Tales of Impossibility recounts the intriguing story of the renowned problems of antiquity, four of the most famous and studied questions in the history of mathematics. First posed by the ancient Greeks, these compass and straightedge problems-squaring the circle, trisecting an angle, doubling the cube, and inscribing regular polygons in a circle-have served as ever-present muses for mathematicians for more than two millennia. David Richeson follows the trail of these problems to show that ultimately their proofs-which demonstrated the impossibility of solving them using only a compass and straightedge-depended on and resulted in the growth of mathematics. Richeson investigates how celebrated luminaries, including Euclid, Archimedes, Viete, Descartes, Newton, and Gauss, labored to understand these problems and how many major mathematical discoveries were related to their explorations. Although the problems were based in geometry, their resolutions were not, and had to wait until the nineteenth century, when mathematicians had developed the theory of real and complex numbers, analytic geometry, algebra, and calculus. Pierre Wantzel, a little-known mathematician, and Ferdinand von Lindemann, through his work on pi, finally determined the problems were impossible to solve. Along the way, Richeson provides entertaining anecdotes connected to the problems, such as how the Indiana state legislature passed a bill setting an incorrect value for pi and how Leonardo da Vinci made elegant contributions in his own study of these problems. Taking readers from the classical period to the present, Tales of Impossibility chronicles how four unsolvable problems have captivated mathematical thinking for centuries.
Head hits cause brain damage - but not always. Should we ban sport to protect athletes? Exposure to electromagnetic fields is strongly associated with cancer development - does that mean exposure causes cancer? Should we encourage old fashioned communication instead of mobile phones to reduce cancer rates? According to popular wisdom, the Mediterranean diet keeps you healthy. Is this belief scientifically sound? Should public health bodies encourage consumption of fresh fruit and vegetables? Severe financial constraints on research and public policy, media pressure, and public anxiety make such questions of immense current concern not just to philosophers but to scientists, governments, public bodies, and the general public. In the last decade there has been an explosion of theorizing about causality in philosophy, and also in the sciences. This literature is both fascinating and important, but it is involved and highly technical. This makes it inaccessible to many who would like to use it, philosophers and scientists alike. This book is an introduction to philosophy of causality - one that is highly accessible: to scientists unacquainted with philosophy, to philosophers unacquainted with science, and to anyone else lost in the labyrinth of philosophical theories of causality. It presents key philosophical accounts, concepts and methods, using examples from the sciences to show how to apply philosophical debates to scientific problems.
This volume contains six new and fifteen previously published essays -- plus a new introduction -- by Storrs McCall. Some of the essays were written in collaboration with E. J. Lowe of Durham University. The essays discuss controversial topics in logic, action theory, determinism and indeterminism, and the nature of human choice and decision. Some construct a modern, up-to-date version of Aristotle's bouleusis, practical deliberation. This process of practical deliberation is shown to be indeterministic but highly controlled and the antithesis of chance. Others deal with the concept of branching four-dimensional space-time, explain non-local influences in quantum mechanics, or reconcile God's omniscience with human free will. The eponymous first essay contains the proof of a fact that in 1931 Kurt Goedel had claimed to be unprovable, namely that the set of arithmetic truths forms a consistent system.
Michael G. Titelbaum presents a new Bayesian framework for modeling rational degrees of belief, called the Certainty-Loss Framework. Subjective Bayesianism is epistemologists' standard theory of how individuals should change their degrees of belief over time. But despite the theory's power, it is widely recognized to fail for situations agents face every day-cases in which agents forget information, or in which they assign degrees of belief to self-locating claims. Quitting Certainties argues that these failures stem from a common source: the inability of Conditionalization (Bayesianism's traditional updating rule) to model claims' going from certainty at an earlier time to less-than-certainty later on. It then presents a new Bayesian updating framework that accurately represents rational requirements on agents who undergo certainty loss. Titelbaum develops this new framework from the ground up, assuming little technical background on the part of his reader. He interprets Bayesian theories as formal models of rational requirements, leading him to discuss both the elements that go into a formal model and the general principles that link formal systems to norms. By reinterpreting Bayesian methodology and altering the theory's updating rules, Titelbaum is able to respond to a host of challenges to Bayesianism both old and new. These responses lead in turn to deeper questions about commitment, consistency, and the nature of information. Quitting Certainties presents the first systematic, comprehensive Bayesian framework unifying the treatment of memory loss and context-sensitivity. It develops this framework, motivates it, compares it to alternatives, then applies it to cases in epistemology, decision theory, the theory of identity, and the philosophy of quantum mechanics.
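The updating rule at issue is standard Bayesian Conditionalization, which can be stated in one line; the formulation below is the textbook rule rather than Titelbaum's own framework, and it makes the certainty-loss problem visible:

\[
P_{\text{new}}(A) \;=\; P_{\text{old}}(A \mid E) \;=\; \frac{P_{\text{old}}(A \wedge E)}{P_{\text{old}}(E)},
\]
so if $P_{\text{old}}(A) = 1$ then $P_{\text{old}}(A \mid E) = 1$ for any evidence $E$ with $P_{\text{old}}(E) > 0$: once a claim is certain, Conditionalization can never make it less than certain, which is exactly the limitation the Certainty-Loss Framework is designed to address.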