The central contention of this book is that second-order logic has a central role to play in laying the foundations of mathematics. In order to develop the argument fully, the author presents a detailed development of higher-order logic, including a comprehensive discussion of its semantics. Professor Shapiro demonstrates the prevalence of second-order notions in mathematics as it is practised, and also the extent to which mathematical concepts can be formulated in second-order languages. He shows how first-order languages are insufficient to codify many concepts in contemporary mathematics, and thus that higher-order logic is needed to fully reflect current mathematics. Throughout, the emphasis is on discussing the philosophical and historical issues associated with this subject, and the implications that they have for foundational studies. For the most part, the author assumes little more than a familiarity with logic as might be gained from a beginning graduate course which includes the incompleteness of arithmetic and the Löwenheim-Skolem theorems. All those concerned with the foundations of mathematics will find this a thought-provoking discussion of some of the central issues in this subject.
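A standard illustration of the contrast the blurb describes (offered here as background, not drawn from the book's text): the second-order induction axiom quantifies over all properties at once and pins down the natural numbers up to isomorphism, whereas its first-order surrogate is an axiom schema that, by the Löwenheim-Skolem theorems, admits non-standard models.

```latex
% Second-order induction: one axiom, quantifying over every property P.
\[
  \forall P \bigl[ \bigl( P(0) \wedge \forall n\,(P(n) \rightarrow P(S(n))) \bigr)
    \rightarrow \forall n\, P(n) \bigr]
\]
% The first-order replacement is a schema with one instance per formula
% \varphi(n) of the language, and so cannot rule out non-standard models.
```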
Beginning with a review of formal languages and their syntax and semantics, Logic, Proof and Computation conducts a computer-assisted course in formal reasoning and the relevance of logic to mathematical proof, information processing and philosophy. Topics covered include formal grammars, semantics of formal languages, sequent systems, truth-tables, propositional and first-order logic, identity, proof heuristics, regimentation, set theory, databases, automated deduction, proof by induction, Turing machines, undecidability and a computer illustration of the reasoning underpinning Gödel's incompleteness proof. LPC is designed as a multidisciplinary reader for students in computing, philosophy and mathematics.
Why is 7 such a lucky number and 13 so unlucky? Why does a jury traditionally have `12 good men and true', and why are there 24 hours in the day and 60 seconds in a minute? This fascinating new book explores the world of numbers from pin numbers to book titles, and from the sixfold shape of snowflakes to the way our roads, houses and telephone numbers are designated in fact and fiction. Using the numbers themselves as its starting point it investigates everything from the origins and meaning of counting in early civilizations to numbers in proverbs, myths and nursery rhymes and the ancient `science' of numerology. It also focuses on the quirks of odds and evens, primes, on numbers in popular sports - and much, much more. So whether you've ever wondered why Heinz has 57 varieties, why 999 is the UK's emergency phone number but 911 is used in America, why Coco Chanel chose No. 5 for her iconic perfume, or how the title Catch 22 was chosen, then this is the book for you. Dip in anywhere and you'll find that numbers are not just for adding and measuring but can be hugely entertaining and informative whether you're buying a diamond or choosing dinner from the menu.
In 1940 G. H. Hardy published A Mathematician's Apology, a meditation on mathematics by a leading pure mathematician. Eighty-two years later, An Applied Mathematician's Apology is a meditation and also a personal memoir by a philosophically inclined numerical analyst, one who has found great joy in his work but is puzzled by its relationship to the rest of mathematics.
Since its original appearance in 1997, Numerical Linear Algebra has been a leading textbook in its field, used in universities around the world. It is noted for its 40 lecture-sized short chapters and its clear and inviting style. It is reissued here with a new foreword by James Nagy and a new afterword by Yuji Nakatsukasa about subsequent developments.
Science Without Numbers caused a stir in philosophy on its original publication in 1980, with its bold nominalist approach to the ontology of mathematics and science. Hartry Field argues that we can explain the utility of mathematics without assuming it true. Part of the argument is that good mathematics has a special feature ("conservativeness") that allows it to be applied to "nominalistic" claims (roughly, those neutral to the existence of mathematical entities) in a way that generates nominalistic consequences more easily without generating any new ones. Field goes on to argue that we can axiomatize physical theories using nominalistic claims only, and that in fact this has advantages over the usual axiomatizations that are independent of nominalism. There has been much debate about the book since it first appeared. It is now reissued in a revised edition that contains a substantial new preface giving the author's current views on the original book and the issues that were raised in the subsequent discussion of it.
"Inspiring and informative...deserves to be widely read." -Wall Street Journal "This fun book offers a philosophical take on number systems and revels in the beauty of math." -Science News Because we have ten fingers, grouping by ten seems natural, but twelve would be better for divisibility, and eight is well suited to repeated halving. Grouping by two, as in binary code, has turned out to have its own remarkable advantages. Paul Lockhart presents arithmetic not as rote manipulation of numbers-a practical if mundane branch of knowledge best suited for filling out tax forms-but as a fascinating, sometimes surprising intellectual craft that arises from our desire to add, divide, and multiply important things. Passionate and entertaining, Arithmetic invites us to experience the beauty of mathematics through the eyes of a beguiling teacher. "A nuanced understanding of working with numbers, gently connecting procedures that we once learned by rote with intuitions long since muddled by education... Lockhart presents arithmetic as a pleasurable pastime, and describes it as a craft like knitting." -Jonathon Keats, New Scientist "What are numbers, how did they arise, why did our ancestors invent them, and how did they represent them? They are, after all, one of humankind's most brilliant inventions, arguably having greater impact on our lives than the wheel. Lockhart recounts their fascinating story... A wonderful book." -Keith Devlin, author of Finding Fibonacci
Visual thinking -- visual imagination or perception of diagrams and symbol arrays, and mental operations on them -- is omnipresent in mathematics. Is this visual thinking merely a psychological aid, facilitating grasp of what is gathered by other means? Or does it also have epistemological functions, as a means of discovery, understanding, and even proof? By examining the many kinds of visual representation in mathematics and the diverse ways in which they are used, Marcus Giaquinto argues that visual thinking in mathematics is rarely just a superfluous aid; it usually has epistemological value, often as a means of discovery. Drawing from philosophical work on the nature of concepts and from empirical studies of visual perception, mental imagery, and numerical cognition, Giaquinto explores a major source of our grasp of mathematics, using examples from basic geometry, arithmetic, algebra, and real analysis. He shows how we can discern abstract general truths by means of specific images, how synthetic a priori knowledge is possible, and how visual means can help us grasp abstract structures.
Mary Leng offers a defense of mathematical fictionalism, according to which we have no reason to believe that there are any mathematical objects. Perhaps the most pressing challenge to mathematical fictionalism is the indispensability argument for the truth of our mathematical theories (and therefore for the existence of the mathematical objects posited by those theories). According to this argument, if we have reason to believe anything, we have reason to believe that the claims of our best empirical theories are (at least approximately) true. But since claims whose truth would require the existence of mathematical objects are indispensable in formulating our best empirical theories, it follows that we have good reason to believe in the mathematical objects posited by those mathematical theories used in empirical science, and therefore to believe that the mathematical theories utilized in empirical science are true. Previous responses to the indispensability argument have focussed on arguing that mathematical assumptions can be dispensed with in formulating our empirical theories. Leng, by contrast, offers an account of the role of mathematics in empirical science according to which the successful use of mathematics in formulating our empirical theories need not rely on the truth of the mathematics utilized.
Gottlob Frege's Grundgesetze der Arithmetik, or Basic Laws of Arithmetic, was intended to be his magnum opus, the book in which he would finally establish his logicist philosophy of arithmetic. But because of the disaster of Russell's Paradox, which undermined Frege's proofs, the more mathematical parts of the book have rarely been read. Richard G. Heck, Jr., aims to change that, and establish it as a neglected masterpiece that must be placed at the center of Frege's philosophy. Part I of Reading Frege's Grundgesetze develops an interpretation of the philosophy of logic that informs Grundgesetze, paying especially close attention to the difficult sections of Frege's book in which he discusses his notorious 'Basic Law V' and attempts to secure its status as a law of logic. Part II examines the mathematical basis of Frege's logicism, explaining and exploring Frege's formal arguments. Heck argues that Frege himself knew that his proofs could be reconstructed so as to avoid Russell's Paradox, and presents Frege's arguments in a way that makes them available to a wide audience. He shows, by example, that careful attention to the structure of Frege's arguments, to what he proved, to how he proved it, and even to what he tried to prove but could not, has much to teach us about Frege's philosophy.
Frege's Theorem collects eleven essays by Richard G. Heck, Jr., one of the world's leading authorities on Frege's philosophy. The Theorem is the central contribution of Gottlob Frege's formal work on arithmetic. It tells us that the axioms of arithmetic can be derived, purely logically, from a single principle: the number of these things is the same as the number of those things just in case these can be matched up one-to-one with those. But that principle seems so utterly fundamental to thought about number that it might almost count as a definition of number. If so, Frege's Theorem shows that arithmetic follows, purely logically, from a near definition. As Crispin Wright was the first to make clear, that means that Frege's logicism, long thought dead, might yet be viable.
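The single principle the blurb describes is standardly known as Hume's Principle; in second-order notation (a conventional formulation, not quoted from the book) it reads:

```latex
% Hume's Principle: the number of Fs equals the number of Gs
% just in case the Fs can be matched one-to-one with the Gs.
\[
  \#x\,Fx \;=\; \#x\,Gx \;\longleftrightarrow\; F \approx G
\]
% Here F \approx G abbreviates the second-order-definable claim that
% some relation R maps the Fs one-to-one onto the Gs.
```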
Agenda Relevance is the first volume in the authors' omnibus investigation of
Chemistry, physics and biology are by their nature genuinely difficult. Mathematics, however, is man-made, and therefore not as complicated. Two ideas form the basis for this book: 1) to use ordinary mathematics to describe the simplicity in the structure of mathematics and 2) to develop new branches of mathematics to describe natural sciences.
Logic is a field studied mainly by researchers and students of philosophy, mathematics and computing. Inductive logic seeks to determine the extent to which the premisses of an argument entail its conclusion, aiming to provide a theory of how one should reason in the face of uncertainty. It has applications to decision making and artificial intelligence, as well as to how scientists should reason when not in possession of the full facts. In this book, Jon Williamson embarks on a quest to find a general, reasonable, applicable inductive logic (GRAIL), all the while examining why pioneers such as Ludwig Wittgenstein and Rudolf Carnap did not entirely succeed in this task. Along the way he presents a general framework for the field, and reaches a new inductive logic, which builds upon recent developments in Bayesian epistemology (a theory about how strongly one should believe the various propositions that one can express). The book explores this logic in detail, discusses some key criticisms, and considers how it might be justified. Is this truly the GRAIL? Although the book presents new research, this material is well suited to being delivered as a series of lectures to students of philosophy, mathematics, or computing, and doubles as an introduction to the field of inductive logic.
Contents: Introduction; I. ONTOLOGY; 1. Existence (1987); 2. Nonexistence (1998); 3. Mythical Objects (2002); II. NECESSITY; 4. Modal Logic Kalish-and-Montague Style (1994); 5. Impossible Worlds (1984); 6. An Empire of Thin Air (1988); 7. The Logic of What Might Have Been (1989); III. IDENTITY; 8. The fact that x=y (1987); 9. This Side of Paradox (1993); 10. Identity Facts (2003); 11. Personal Identity: What's the Problem? (1995); IV. PHILOSOPHY OF MATHEMATICS; 12. Wholes, Parts, and Numbers (1997); 13. The Limits of Human Mathematics (2001); V. THEORY OF MEANING AND REFERENCE; 14. On Content (1992); 15. On Designating (1997); 16. A Problem in the Frege-Church Theory of Sense and Denotation (1993); 17. The Very Possibility of Language (2001); 18. Tense and Intension (2003); 19. Pronouns as Variables (2005)
This book is a specialized monograph on interpolation and definability, a notion central in pure logic and with significant meaning and applicability in all areas where logic is applied, especially computer science, artificial intelligence, logic programming, philosophy of science and natural language. Suitable for researchers and graduate students in mathematics, computer science and philosophy, this is the latest in the prestigious world-renowned Oxford Logic Guides, which contains Michael Dummett's Elements of Intuitionism (second edition), J. M. Dunn and G. Hardegree's Algebraic Methods in Philosophical Logic, H. Rott's Change, Choice and Inference: A Study of Belief Revision and Nonmonotonic Reasoning, P. T. Johnstone's Sketches of an Elephant: A Topos Theory Compendium: Volumes 1 and 2, and David J. Pym and Eike Ritter's Reductive Logic and Proof Search: Proof theory, semantics and control.
Quantum gravity is the name given to a theory that unites general relativity - Einstein's theory of gravitation and spacetime - with quantum field theory, our framework for describing non-gravitational forces. The Structural Foundations of Quantum Gravity brings together philosophers and physicists to discuss a range of conceptual issues that surface in the effort to unite these theories, focusing in particular on the ontological nature of the spacetime that results. Although there has been a great deal written about quantum gravity from the perspective of physicists and mathematicians, very little attention has been paid to the philosophical aspects. This volume closes that gap, with essays written by some of the leading researchers in the field. Individual papers defend or attack a structuralist perspective on the fundamental ontologies of our physical theories, which offers the possibility of shedding new light on a number of foundational problems. It is a book that will be of interest not only to physicists and philosophers of physics but to anyone concerned with foundational issues and curious to explore new directions in our understanding of spacetime and quantum physics.
A comprehensive philosophical introduction to set theory. Anyone wishing to work on the logical foundations of mathematics must understand set theory, which lies at its heart. Potter offers a thorough account of cardinal and ordinal arithmetic, and the various axiom candidates. He discusses in detail the project of set-theoretic reduction, which aims to interpret the rest of mathematics in terms of set theory. The key question here is how to deal with the paradoxes that bedevil set theory. Potter offers a strikingly simple version of the most widely accepted response to the paradoxes, which classifies sets by means of a hierarchy of levels. What makes the book unique is that it interweaves a careful presentation of the technical material with a penetrating philosophical critique. Potter does not merely expound the theory dogmatically but at every stage discusses in detail the reasons that can be offered for believing it to be true.
Many philosophers these days consider themselves naturalists, but it's doubtful any two of them intend the same position by the term. In this book, Penelope Maddy describes and practises a particularly austere form of naturalism called 'Second Philosophy'. Without a definitive criterion for what counts as 'science' and what doesn't, Second Philosophy can't be specified directly - 'trust only the methods of science!' or some such thing - so Maddy proceeds instead by illustrating the behaviours of an idealized inquirer she calls the 'Second Philosopher'. This Second Philosopher begins from perceptual common sense and progresses from there to systematic observation, active experimentation, theory formation and testing, working all the while to assess, correct and improve her methods as she goes. Second Philosophy is then the result of the Second Philosopher's investigations. Maddy delineates the Second Philosopher's approach by tracing her reactions to various familiar skeptical and transcendental views (Descartes, Kant, Carnap, late Putnam, van Fraassen), comparing her methods to those of other self-described naturalists (especially Quine), and examining a prominent contemporary debate (between disquotationalists and correspondence theorists in the theory of truth) to extract a properly second-philosophical line of thought. She then undertakes to practise Second Philosophy in her reflections on the ground of logical truth, the methodology, ontology and epistemology of mathematics, and the general prospects for metaphysics naturalized.
Robert Hanna presents a fresh view of the Kantian and analytic traditions that have dominated continental European and Anglo-American philosophy over the last two centuries, and of the relation between them. The rise of analytic philosophy decisively marked the end of the hundred-year dominance of Kant's philosophy in Europe. But Hanna shows that the analytic tradition also emerged from Kant's philosophy in the sense that its members were able to define and legitimate their ideas only by means of an intensive, extended engagement with, and a partial or complete rejection of, the Critical Philosophy. Hanna puts forward a new 'cognitive-semantic' interpretation of transcendental idealism, and a vigorous defence of Kant's theory of analytic and synthetic necessary truth. These will make Kant and the Foundations of Analytic Philosophy compelling reading not just for specialists in the history of philosophy, but for all who are interested in these fundamental philosophical issues.