Roy T. Cook examines the Yablo paradox (a paradoxical, infinite sequence of sentences, each of which entails the falsity of all those later than it in the sequence), with special attention paid to the idea that this paradox provides us with a semantic paradox that involves no circularity. The three main chapters of the book focus, respectively, on three questions that can be (and have been) asked about the Yablo construction. First we have the Characterization Problem, which asks what patterns of sentential reference (circular or not) generate semantic paradoxes. Addressing this problem requires an interesting and fruitful detour through the theory of directed graphs, allowing us to draw interesting connections between philosophical problems and purely mathematical ones. Next is the Circularity Question, which addresses whether or not the Yablo paradox is genuinely non-circular. Answering this question is complicated: although the original formulation of the Yablo paradox is circular, it turns out that it is not circular in any sense that can bear the blame for the paradox. Further, formulations of the paradox using infinitary conjunction provide genuinely non-circular constructions. Finally, Cook turns his attention to the Generalizability Question: can the Yabloesque pattern be used to generate genuinely non-circular variants of other paradoxes, such as epistemic and set-theoretic paradoxes? Cook argues that although there are general constructions, called unwindings, that transform circular constructions into Yablo-like sequences, it turns out that these sorts of constructions are not 'well-behaved' when transferred from semantic puzzles to puzzles of other sorts. He concludes with a short discussion of the connections between the Yablo paradox and the Curry paradox.
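For readers new to the construction, the Yablo sequence can be sketched as follows (a standard textbook rendering, not a quotation from the book): each sentence in the infinite list says that all later sentences are untrue,

```latex
S_n:\quad \text{for all } k > n,\ S_k \text{ is not true} \qquad (n = 1, 2, 3, \dots)
```

No consistent assignment of truth values exists: if some $S_n$ were true, all later sentences would be untrue, yet $S_{n+1}$ would then truly say that all sentences after it are untrue, contradicting its untruth; and if every sentence were untrue, each $S_n$ would truly describe the sentences after it, contradicting its own untruth. Notably, no sentence in the list refers to itself.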
Topology is the mathematical study of the most basic geometrical structure of a space. Mathematical physics uses topological spaces as the formal means for describing physical space and time. This book proposes a completely new mathematical structure for describing geometrical notions such as continuity, connectedness, boundaries of sets, and so on, in order to provide a better mathematical tool for understanding space-time. This is the initial volume in a two-volume set, the first of which develops the mathematical structure and the second of which applies it to classical and relativistic physics. The book begins with a brief historical review of the development of mathematics as it relates to geometry, and an overview of standard topology. The new theory, the Theory of Linear Structures, is presented and compared to standard topology. The Theory of Linear Structures replaces the foundational notion of standard topology, the open set, with the notion of a continuous line. Axioms for the Theory of Linear Structures are laid down, and definitions of other geometrical notions developed in those terms. Various novel geometrical properties, such as a space being intrinsically directed, are defined using these resources. Applications of the theory to discrete spaces (where the standard theory of open sets gets little purchase) are particularly noted. The mathematics is developed up through homotopy theory and compactness, along with ways to represent both affine (straight line) and metrical structure.
The German philosopher and mathematician Gottlob Frege (1848-1925) was the father of analytic philosophy and, to all intents and purposes, the inventor of modern logic. Basic Laws of Arithmetic, originally published in German in two volumes (1893, 1903), is Frege's magnum opus. It was to be the pinnacle of Frege's life's work. It represents the final stage of his logicist project (the idea that arithmetic and analysis are reducible to logic) and contains his mature philosophy of mathematics and logic. The aim of Basic Laws of Arithmetic is to demonstrate the logical nature of mathematical theorems by providing gapless proofs in Frege's formal system using only basic laws of logic, logical inference, and explicit definitions. The work contains a philosophical foreword, an introduction to Frege's logic, a derivation of arithmetic from this logic, a critique of contemporary approaches to the real numbers, and the beginnings of a logicist treatment of real analysis. As is well known, a letter received from Bertrand Russell shortly before the publication of the second volume made Frege realise that his Basic Law V, governing the identity of value-ranges, leads to inconsistency. Frege discusses a revision to Basic Law V, written in response to Russell's letter, in an afterword to volume II. The continuing importance of Basic Laws of Arithmetic lies not only in its bearing on issues in the foundations of mathematics and logic but in its model of philosophical inquiry. Frege's ability to locate the essential questions, his integration of logical and philosophical analysis, and his rigorous approach to criticism and argument in general are vividly in evidence in this, his most ambitious work. Philip Ebert and Marcus Rossberg present the first full English translation of both volumes of Frege's major work, preserving the original formalism and pagination. The edition contains a foreword by Crispin Wright and an extensive appendix providing an introduction to Frege's formal system by Roy T. Cook.
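In modern notation (not Frege's own two-dimensional script), Basic Law V is standardly rendered as:

```latex
\varepsilon F = \varepsilon G \;\leftrightarrow\; \forall x\,(Fx \leftrightarrow Gx)
```

where $\varepsilon F$ is the value-range (extension) of the concept $F$: two concepts have the same value-range just in case exactly the same objects fall under them. Russell's paradox then arises by considering the concept "is the value-range of a concept under which it does not fall".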
Model theory, a major branch of mathematical logic, plays a key role connecting logic and other areas of mathematics such as algebra, geometry, analysis, and combinatorics. Simplicity theory, a subject within model theory, studies a class of mathematical structures called simple. The class includes all stable structures (vector spaces, modules, algebraically closed fields, differentially closed fields, and so on), and also important unstable structures such as the random graph, smoothly approximated structures, pseudo-finite fields, ACFA, and more. Simplicity theory supplies a uniform model-theoretic point of view on such structures in addition to their own mathematical analyses.
Our conception of logical space is the set of distinctions we use to navigate the world. In The Construction of Logical Space Agustin Rayo defends the idea that one's conception of logical space is shaped by one's acceptance or rejection of 'just is'-statements: statements like 'to be composed of water just is to be composed of H2O', or 'for the number of the dinosaurs to be zero just is for there to be no dinosaurs'. The resulting picture is used to articulate a conception of metaphysical possibility that does not depend on a reduction of the modal to the non-modal, and to develop a trivialist philosophy of mathematics, according to which the truths of pure mathematics have trivial truth-conditions.
Michael G. Titelbaum presents a new Bayesian framework for modeling rational degrees of belief, called the Certainty-Loss Framework. Subjective Bayesianism is epistemologists' standard theory of how individuals should change their degrees of belief over time. But despite the theory's power, it is widely recognized to fail for situations agents face every day-cases in which agents forget information, or in which they assign degrees of belief to self-locating claims. Quitting Certainties argues that these failures stem from a common source: the inability of Conditionalization (Bayesianism's traditional updating rule) to model claims' going from certainty at an earlier time to less-than-certainty later on. It then presents a new Bayesian updating framework that accurately represents rational requirements on agents who undergo certainty loss. Titelbaum develops this new framework from the ground up, assuming little technical background on the part of his reader. He interprets Bayesian theories as formal models of rational requirements, leading him to discuss both the elements that go into a formal model and the general principles that link formal systems to norms. By reinterpreting Bayesian methodology and altering the theory's updating rules, Titelbaum is able to respond to a host of challenges to Bayesianism both old and new. These responses lead in turn to deeper questions about commitment, consistency, and the nature of information. Quitting Certainties presents the first systematic, comprehensive Bayesian framework unifying the treatment of memory loss and context-sensitivity. It develops this framework, motivates it, compares it to alternatives, then applies it to cases in epistemology, decision theory, the theory of identity, and the philosophy of quantum mechanics.
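For reference, Conditionalization, the traditional updating rule criticized here, takes the standard form: upon becoming certain of evidence $E$ between times $t$ and $t'$, one's new credence in any hypothesis $H$ is one's old credence in $H$ conditional on $E$,

```latex
P_{t'}(H) \;=\; P_t(H \mid E) \;=\; \frac{P_t(H \wedge E)}{P_t(E)}
```

Since this sets $P_{t'}(E) = 1$, anything once learned remains certain forever, which is precisely why the rule cannot model memory loss or shifting self-locating beliefs.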
This is the first logically precise, computationally implementable, book-length account of rational belief revision. It explains how a rational agent ought to proceed when adopting a new belief, a difficult matter if the new belief contradicts the agent's old beliefs.
In Frege's Conception of Logic Patricia A. Blanchette explores the relationship between Gottlob Frege's understanding of conceptual analysis and his understanding of logic. She argues that the fruitfulness of Frege's conception of logic, and the illuminating differences between that conception and those more modern views that have largely supplanted it, are best understood against the backdrop of a clear account of the role of conceptual analysis in logical investigation. The first part of the book locates the role of conceptual analysis in Frege's logicist project. Blanchette argues that despite a number of difficulties, Frege's use of analysis in the service of logicism is a powerful and coherent tool. As a result of coming to grips with his use of that tool, we can see that there is, despite appearances, no conflict between Frege's intention to demonstrate the grounds of ordinary arithmetic and the fact that the numerals of his derived sentences fail to co-refer with ordinary numerals. In the second part of the book, Blanchette explores the resulting conception of logic itself, and some of the straightforward ways in which Frege's conception differs from its now-familiar descendants. In particular, Blanchette argues that consistency, as Frege understands it, differs significantly from the kind of consistency demonstrable via the construction of models. To appreciate this difference is to appreciate the extent to which Frege was right in his debate with Hilbert over consistency- and independence-proofs in geometry. For similar reasons, modern results such as the completeness of formal systems and the categoricity of theories do not have for Frege the same importance they are commonly taken to have by his post-Tarskian descendants. These differences, together with the coherence of Frege's position, provide reason for caution with respect to the appeal to formal systems and their properties in the treatment of fundamental logical properties and relations.
David Bostock presents a critical appraisal of Bertrand Russell's philosophy from 1900 to 1924-a period that is considered to be the most important in his career. Russell developed his theory of logic from 1900 to 1910, and over those years wrote the famous work Principia Mathematica with A. N. Whitehead. Bostock explores Russell's development of 'logical atomism', which applies this logic to problems in the theory of knowledge and in metaphysics, and was central to his philosophical work from 1910 to 1924. This book is the first to focus on this important period of Russell's development, examining the three key areas of logic and mathematics, knowledge, and metaphysics, and demonstrating the enduring value of his work in these areas.
This book is an entertaining, formula-free account of modern physics from the 19th century to the present. The life of Albert Einstein and his scientific achievements run through the text as a connecting thread. The author explains central concepts and results of modern physics in popular-scientific form from a historical perspective, and the reader learns in an amusing way how modern physics developed. We meet Poincaré, Lorentz and Hilbert, Boltzmann and Bohr, Minkowski, Planck, de Broglie, Hubble and Weyl, Gamow, Hahn and Meitner, Kapitza and Landau, Fermi, and many other famous scientists. What did Eddington have against Chandrasekhar, and what did Einstein have against black holes? Why should space tourists, dream tourists, and space-dream tourists take their holidays not at Loch Ness but on the safe side of a black hole? Why did Pauli rail against Einstein? Is the story about the atomic-bomb formula true? Mushy matter, the Big Bang and the cosmic background radiation, gravitational waves and double pulsars, the cosmological constant and the expansion of the universe are further topics that keep the reader in suspense and allow no mental vacuum to arise.
Includes several classic essays from the first edition, a representative selection of the most influential work of the past twenty years, a substantial introduction, and an extended bibliography. Originally published by Prentice-Hall in 1964.
In our hyper-modern world, we are bombarded with more facts, stats and information than ever before. So, what can we grasp hold of to make sense of it all? Oliver Johnson reveals how mathematical thinking can help us understand the myriad data all around us. From the exponential growth of viruses to social media filter-bubbles; from share-price fluctuations to growth of computing power; from the datafication of our sports pages to quantifying climate change. Not to mention the things much closer to home: ever wondered when the best time is to leave a party? What are the chances of rain ruining your barbecue this weekend? How about which queue is the best to join in the supermarket? Journeying through the three sections of Randomness, Structure, and Information, we meet a host of brilliant minds such as Alan Turing, Enrico Fermi and Claude Shannon, and we learn the tools, tips and tricks to cut through the noise all around us - from the Law of Large Numbers to Entropy to Brownian Motion. Lucid, surprising, and endlessly entertaining, Numbercrunch equips you with a definitive mathematician's toolkit to make sense of your world.
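One of the tools the book mentions, the Law of Large Numbers, is easy to watch in action. The sketch below (illustrative only, not drawn from the book) simulates fair coin flips and shows the proportion of heads settling toward 1/2 as the number of flips grows:

```python
import random

def heads_proportion(n_flips, seed=42):
    """Flip a fair coin n_flips times and return the proportion of heads.

    The Law of Large Numbers says this proportion converges to 0.5
    as n_flips grows; the seed fixes the simulation for reproducibility.
    """
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

for n in (10, 1_000, 100_000):
    print(f"{n:>7} flips: proportion of heads = {heads_proportion(n):.4f}")
```

With only 10 flips the proportion can stray far from 0.5; by 100,000 flips it is reliably within a fraction of a percent.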
Gaisi Takeuti was one of the most brilliant and influential logicians of the 20th century. He was a long-time professor and professor emeritus of mathematics at the University of Illinois at Urbana-Champaign, USA, before he passed away on May 10, 2017, at the age of 91. Takeuti was one of the founders of proof theory, a branch of mathematical logic that originated from Hilbert's program concerning the consistency of mathematics. Building on Gentzen's pioneering work in proof theory in the 1930s, he proposed a conjecture in 1953 concerning the essential nature of formal proofs of higher-order logic, now known as Takeuti's fundamental conjecture, of which he gave a partial positive solution. His arguments on the conjecture and on proof theory in general have had great influence on the later development of mathematical logic, philosophy of mathematics, and applications of mathematical logic to theoretical computer science. Takeuti's work ranged over the whole spectrum of mathematical logic, including set theory, computability theory, Boolean-valued analysis, fuzzy logic, bounded arithmetic, and theoretical computer science. He wrote many monographs and textbooks, both in English and in Japanese, and his monumental monograph Proof Theory, published in 1975, has long been a standard reference in proof theory. He had a wide range of interests covering virtually all areas of mathematics and extending to physics. His publications include many Japanese books for students and general readers about mathematical logic, mathematics in general, and connections between mathematics and physics, as well as many essays for Japanese science magazines. This volume is a collection of papers based on the Symposium on Advances in Mathematical Logic 2018. The symposium was held September 18-20, 2018, at Kobe University, Japan, and was dedicated to the memory of Professor Gaisi Takeuti.
Paolo Mancosu presents a series of innovative studies in the history and the philosophy of logic and mathematics in the first half of the twentieth century. The Adventure of Reason is divided into five main sections: history of logic (from Russell to Tarski); foundational issues (Hilbert's program, constructivity, Wittgenstein, Godel); mathematics and phenomenology (Weyl, Becker, Mahnke); nominalism (Quine, Tarski); semantics (Tarski, Carnap, Neurath). Mancosu exploits extensive untapped archival sources to make available a wealth of new material that deepens in significant ways our understanding of these fascinating areas of modern intellectual history. At the same time, the book is a contribution to recent philosophical debates, in particular on the prospects for a successful nominalist reconstruction of mathematics, the nature of finitist intuition, the viability of alternative definitions of logical consequence, and the extent to which phenomenology can hope to account for the exact sciences.
Simone Weil: philosopher, political activist, mystic - and sister to André, one of the most influential mathematicians of the twentieth century. These two extraordinary siblings formed an obsession for Karen Olsson, who studied mathematics at Harvard, only to turn to writing as a vocation. When Olsson got hold of the 1940 letters between the siblings, she found they shared a curiosity about the inception of creative thought - that flash of insight - that Olsson experienced as both a maths student, and later, novelist. Following this thread of connections, The Weil Conjectures explores the lives of Simone and André, the lore and allure of mathematics, and its significance in Olsson's own life.
A stimulating intellectual history of Ptolemy's philosophy and his conception of a world in which mathematics reigns supreme. The Greco-Roman mathematician Claudius Ptolemy is one of the most significant figures in the history of science. He is remembered today for his astronomy, but his philosophy is almost entirely lost to history. This groundbreaking book is the first to reconstruct Ptolemy's general philosophical system, including his metaphysics, epistemology, and ethics, and to explore its relationship to astronomy, harmonics, element theory, astrology, cosmology, psychology, and theology. In this stimulating intellectual history, Jacqueline Feke uncovers references to a complex and sophisticated philosophical agenda scattered among Ptolemy's technical studies in the physical and mathematical sciences. She shows how he developed a philosophy that was radical and even subversive, appropriating ideas and turning them against the very philosophers from whom he drew influence. Feke reveals how Ptolemy's unique system is at once a critique of prevailing philosophical trends and a conception of the world in which mathematics reigns supreme. A compelling work of scholarship, Ptolemy's Philosophy demonstrates how Ptolemy situated mathematics at the very foundation of all philosophy, theoretical and practical, and advanced the mathematical way of life as the true path to human perfection.
Crispin Wright is widely recognised as one of the most important and influential analytic philosophers of the twentieth and twenty-first centuries. This volume is a collective exploration of the major themes of his work in philosophy of language, philosophical logic, and philosophy of mathematics. It comprises specially written chapters by a group of internationally renowned thinkers, as well as four substantial responses from Wright. In these thematically organized replies, Wright summarizes his life's work and responds to the contributory essays collected in this book. In bringing together such scholarship, the present volume testifies to both the enormous interest in Wright's thought and the continued relevance of Wright's seminal contributions in analytic philosophy for present-day debates.
The interplay between computability and randomness has been an active area of research in recent years, reflected by ample funding in the USA, numerous workshops, and publications on the subject. The complexity and the randomness aspect of a set of natural numbers are closely related. Traditionally, computability theory is concerned with the complexity aspect. However, computability-theoretic tools can also be used to introduce mathematical counterparts for the intuitive notion of randomness of a set. Recent research shows that, conversely, concepts and methods originating from randomness enrich computability theory.
In 1655, the philosopher Thomas Hobbes claimed he had solved the centuries-old problem of "squaring of the circle" (constructing a square equal in area to a given circle). With a scathing rebuttal to Hobbes's claims, the mathematician John Wallis began one of the longest and most intense intellectual disputes of all time. "Squaring the Circle" is a detailed account of this controversy, from the core mathematics to the broader philosophical, political, and religious issues at stake.
To what extent are the subjects of our thoughts and talk real? This is the question of realism. In this book, Justin Clarke-Doane explores arguments for and against moral realism and mathematical realism, how they interact, and what they can tell us about areas of philosophical interest more generally. He argues that, contrary to widespread belief, our mathematical beliefs have no better claim to being self-evident or provable than our moral beliefs. Nor do our mathematical beliefs have a better claim to being empirically justified than our moral beliefs. It is also incorrect that reflection on the genealogy of our moral beliefs establishes a lack of parity between the cases. In general, if one is a moral antirealist on the basis of epistemological considerations, then one ought to be a mathematical antirealist as well. And, yet, Clarke-Doane shows that moral realism and mathematical realism do not stand or fall together - and for a surprising reason. Moral questions, insofar as they are practical, are objective in a sense that mathematical questions are not, and the sense in which they are objective can only be explained by assuming practical antirealism. One upshot of the discussion is that the concepts of realism and objectivity, which are widely identified, are actually in tension. Another is that the objective questions in the neighborhood of factual areas like logic, modality, grounding, and nature are practical questions too. Practical philosophy should, therefore, take center stage.
This book presents a new nominalistic philosophy of mathematics: semantic conventionalism. Its central thesis is that mathematics should be founded on the human ability to create language - and specifically, the ability to institute conventions for the truth conditions of sentences. This philosophical stance leads to an alternative way of practicing mathematics: instead of "building" objects out of sets, a mathematician should introduce new syntactical sentence types, together with their truth conditions, as he or she develops a theory. Semantic conventionalism is justified first through criticism of Cantorian set theory, intuitionism, logicism, and predicativism; then on its own terms; and finally, exemplified by a detailed reconstruction of arithmetic and real analysis. Also included is a simple solution to the liar paradox and the other paradoxes that have traditionally been recognized as semantic. And since it is argued that mathematics is semantics, this solution also applies to Russell's paradox and the other mathematical paradoxes of self-reference. In addition to philosophers who care about the metaphysics and epistemology of mathematics or the paradoxes of self-reference, this book should appeal to mathematicians interested in alternative approaches.
The biological and social sciences often generalize causal conclusions from one context or location to others that may differ in some relevant respects, as is illustrated by inferences from animal models to humans or from a pilot study to a broader population. Inferences like these are known as extrapolations. The question of how and when extrapolation can be legitimate is a fundamental issue for the biological and social sciences that has not received the attention it deserves. In Across the Boundaries, Steel argues that previous accounts of extrapolation are inadequate and proposes a better approach that is able to answer methodological critiques of extrapolation from animal models to humans.
This volume contains the active and passive correspondence of Jules Houël with Joseph-Marie De Tilly, Gaston Darboux, and Victor-Amédée Le Besgue, together with an introduction focusing on the discovery of the impossibility of proving Euclid's parallel postulate and the appearance of the first examples of continuous non-differentiable functions. Jules Houël (1823-1886) occupied a special place in French mathematics during the second half of the 19th century. Through his translations and reviews, he contributed greatly to the reception of the non-Euclidean geometry of Bolyai and Lobachevsky, as well as to the debates on the foundations of analysis. He stood at the centre of a vast international network of correspondence, linked to his role as editor of the Bulletin des sciences mathématiques et astronomiques.
It is a fact of modern scientific thought that there is an enormous variety of logical systems, such as classical logic, intuitionist logic, temporal logic, and Hoare logic, to name but a few, which have originated in the areas of mathematical logic and computer science. In this book the author presents a systematic study of this rich harvest of logics via Tarski's well-known axiomatization of the notion of logical consequence. New and sometimes unorthodox treatments are given of the underlying principles and construction of many-valued logics, the logic of inexactness, effective logics, and modal logics. Throughout, numerous historical and philosophical remarks illuminate both the development of the subject and the motivating influences behind it. Those with a modest acquaintance with modern formal logic will find this to be a readable and not too technical account that demonstrates the current diversity and profusion of logics. In particular, undergraduate and postgraduate students in mathematics, philosophy, computer science, and artificial intelligence will enjoy this introductory survey of the field.