The fundamental theorem of algebra states that any complex polynomial must have a complex root. This book examines three pairs of proofs of the theorem from three different areas of mathematics: abstract algebra, complex analysis and topology. The first proof in each pair is fairly straightforward and depends only on what could be considered elementary mathematics. However, each of these first proofs leads to more general results from which the fundamental theorem can be deduced as a direct consequence. These general results constitute the second proof in each pair. To arrive at each of the proofs, enough of the general theory of each relevant area is developed to understand the proof. In addition to the proofs and techniques themselves, many applications such as the insolvability of the quintic and the transcendence of e and pi are presented. Finally, a series of appendices give six additional proofs including a version of Gauss' original first proof. The book is intended for junior/senior level undergraduate mathematics students or first year graduate students, and would make an ideal "capstone" course in mathematics.
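For reference, the theorem the book revolves around can be stated in one line (standard statement, paraphrased rather than quoted from the book):

```latex
% Fundamental theorem of algebra (standard statement, paraphrased).
\textbf{Theorem.} Every polynomial
$p(z) = a_n z^n + a_{n-1} z^{n-1} + \dots + a_1 z + a_0$
with complex coefficients and degree $n \ge 1$ has a root $z_0 \in \mathbb{C}$,
i.e.\ $p(z_0) = 0$. Applying this repeatedly, $p$ factors completely as
$p(z) = a_n (z - z_1)(z - z_2)\cdots(z - z_n)$.
```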
It is the business of science not to create laws, but to discover them. We do not originate the constitution of our own minds, greatly as it may be in our power to modify their character. And as the laws of the human intellect do not depend upon our will, so the forms of science, of which they constitute the basis, are in all essential regards independent of individual choice. George Boole [10, p. 11] 1.1 Comparison with Traditional Logic The logic of this book is a probability logic built on top of a yes-no or 2-valued logic. It is divided into two parts, part I: BP Logic, and part II: M Logic. 'BP' stands for 'Bayes Postulate'. This postulate says that in the absence of knowledge concerning a probability distribution over a universe or space one should assume a uniform distribution. The M logic of part II does not make use of Bayes postulate or of any other postulates or axioms. It relies exclusively on purely deductive reasoning following from the definition of probabilities. The M logic goes an important step further than the BP logic in that it can distinguish between certain types of information supply sentences which have the same representation in the BP logic as well as in traditional first order logic, although they clearly have different meanings (see example 6.1.2; also comments to the Paris-Rome problem of eqs. (1.8), (1.9) below).
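A rough formal gloss on Bayes' postulate as described above (my formulation, not the author's notation): if nothing is known that distinguishes the n mutually exclusive, exhaustive outcomes of a finite universe, assign each the same probability.

```latex
% Bayes' postulate / principle of indifference, in symbols (my gloss, not the book's notation).
P(x_i) = \frac{1}{n}, \qquad i = 1, \dots, n,
% i.e. a uniform distribution over the n points of the universe.
```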
Term rewriting techniques are applicable to various fields of computer science, including software engineering, programming languages, computer algebra, program verification, automated theorem proving and Boolean algebra. These powerful techniques can be successfully applied in all areas that demand efficient methods for reasoning with equations. One of the major problems encountered is the characterization of classes of rewrite systems that have a desirable property, like confluence or termination. In a system that is both terminating and confluent, every computation leads to a result that is unique, regardless of the order in which the rewrite rules are applied. This volume provides a comprehensive and unified presentation of termination and confluence, as well as related properties. Topics and features: *unified presentation and notation for important advanced topics *comprehensive coverage of conditional term-rewriting systems *state-of-the-art survey of modularity in term rewriting *presentation of unified framework for term and graph rewriting *up-to-date discussion of transformational methods for proving termination of logic programs, including the TALP system This unique book offers a comprehensive and unified view of the subject that is suitable for all computer scientists, program designers, and software engineers who study and use term rewriting techniques. Practitioners, researchers and professionals will find the book an essential and authoritative resource and guide for the latest developments and results in the field.
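The uniqueness claim for terminating and confluent systems can be seen with a toy string rewriter. The sketch below uses a single made-up rule (my illustration, not an example from the book) and reaches the same normal form no matter where the rule is applied.

```python
# Toy string-rewriting sketch (hypothetical rule, illustrative only).
# The rule "ab" -> "ba" is terminating (each step removes one a-before-b
# inversion) and confluent: every word has the unique normal form with
# all b's before all a's, whatever application order is chosen.

RULES = [("ab", "ba")]

def rewrite_once(word, rules):
    """Apply the first rule that matches anywhere in the word; None if no rule applies."""
    for lhs, rhs in rules:
        i = word.find(lhs)
        if i != -1:
            return word[:i] + rhs + word[i + len(lhs):]
    return None

def normal_form(word, rules):
    """Rewrite until no rule applies."""
    while (step := rewrite_once(word, rules)) is not None:
        word = step
    return word

print(normal_form("abab", RULES))  # -> "bbaa"
```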
Alfred Tarski was one of the two giants of the twentieth-century development of logic, along with Kurt Goedel. The four volumes of this collection contain all of Tarski's published papers and abstracts, as well as a comprehensive bibliography. Here will be found many of the works, spanning the period 1921 through 1979, which are the bedrock of contemporary areas of logic, whether in mathematics or philosophy. These areas include the theory of truth in formalized languages, decision methods and undecidable theories, foundations of geometry, set theory, and model theory, algebraic logic, and universal algebra.
This book collects 13 papers that explore Wittgenstein's philosophy throughout the different stages of his career. The author writes from the viewpoint of critical rationalism. The tone of his analysis is friendly and appreciative yet critical. Of these papers, seven are on the background to the philosophy of Wittgenstein. Five papers examine different aspects of it: one on the philosophy of young Wittgenstein, one on his transitional period, and the final three on the philosophy of mature Wittgenstein, chiefly his Philosophical Investigations. The last of these papers, which serves as the concluding chapter, concerns the analytical school of philosophy that grew chiefly under its influence. Wittgenstein's posthumous Philosophical Investigations ignores formal languages while retaining the view of metaphysics as meaningless -- declaring that all languages are metaphysics-free. It was very popular in the middle of the twentieth century. Now it is passe. Wittgenstein had hoped to dissolve all philosophical disputes, yet he generated a new kind of dispute. His claim to have improved the philosophy of life is awkward just because he prevented philosophical discussion from the ability to achieve that: he cut the branch on which he was sitting. This, according to the author, is the most serious critique of Wittgenstein.
The book is intended for students who want to learn how to prove theorems and be better prepared for the rigors required in more advanced mathematics. One of the key components in this textbook is the development of a methodology to lay bare the structure underpinning the construction of a proof, much as diagramming a sentence lays bare its grammatical structure. Diagramming a proof is a way of presenting the relationships between the various parts of a proof. A proof diagram provides a tool for showing students how to write correct mathematical proofs.
This is a thorough and comprehensive treatment of the theory of NP-completeness in the framework of algebraic complexity theory. Coverage includes Valiant's algebraic theory of NP-completeness; interrelations with the classical theory as well as the Blum-Shub-Smale model of computation, questions of structural complexity; fast evaluation of representations of general linear groups; and complexity of immanants.
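For orientation, the contrast at the heart of Valiant's theory (standard definitions, stated here for convenience rather than taken from the book) is between the determinant, which has small arithmetic circuits, and the permanent, whose family is VNP-complete; immanants interpolate between the two.

```latex
% Determinant vs. permanent (standard definitions).
\det(A) = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^{n} a_{i,\sigma(i)},
\qquad
\operatorname{per}(A) = \sum_{\sigma \in S_n} \prod_{i=1}^{n} a_{i,\sigma(i)}.
```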
George Boole (1815-1864) is well known to mathematicians for his research and textbooks on the calculus, but his name has spread world-wide for his innovations in symbolic logic and the development and applications made since his day. The utility of "Boolean algebra" in computing has greatly increased curiosity about the nature and extent of his achievements. His work is most accessible in his two books on logic, "A mathematical analysis of logic" (1847) and "An investigation of the laws of thought" (1854). But at various times he wrote manuscript essays, especially after the publication of the second book; several were intended for a non-technical work, "The Philosophy of logic," which he was not able to complete. This volume contains an edited selection which not only relates them to Boole's publications and the historical context of his time, but also describes their strange history, as family, followers and scholars have tried to confect an edition. The book will appeal to logicians, mathematicians and philosophers, and those interested in the histories of the corresponding subjects; and also students of the early Victorian Britain in which they were written.
The Equation of Knowledge: From Bayes' Rule to a Unified Philosophy of Science introduces readers to the Bayesian approach to science: teasing out the link between probability and knowledge. The author strives to make this book accessible to a very broad audience, suitable for professionals, students, and academics, as well as the enthusiastic amateur scientist/mathematician. This book also shows how Bayesianism sheds new light on nearly all areas of knowledge, from philosophy to mathematics, science and engineering, but also law, politics and everyday decision-making. Bayesian thinking is an important topic for research, which has seen dramatic progress in recent years, and has a significant role to play in the understanding and development of AI and Machine Learning, among many other things. This book seeks to act as a tool for proselytising the benefits and limits of Bayesianism to a wider public. Features Presents the Bayesian approach as a unifying scientific method for a wide range of topics Suitable for a broad audience, including professionals, students, and academics Provides a more accessible, philosophical introduction to the subject than is offered elsewhere
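For readers who want the rule itself in front of them, the standard statement (not a quotation from the book) is:

```latex
% Bayes' rule (standard form).
P(T \mid D) = \frac{P(D \mid T)\, P(T)}{P(D)},
% read: the probability of theory T given data D is the prior P(T), reweighted
% by how well T predicts D and normalized by the overall probability of the data.
```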
This volume presents a unified approach to the mathematical theory of a wide class of non-additive set functions, the so-called null-additive set functions, which also includes classical measure theory. It includes such important set functions as capacities, triangular set functions, some fuzzy measures, submeasures, decomposable measures, possibility measures, distorted probabilities, autocontinuous set functions, etc. The usefulness of the theory is demonstrated by applications in nonlinear differential and difference equations; fractal geometry in the theory of chaos; the approximation of functions in modular spaces by nonlinear singular integral operators; and in the theory of diagonal theorems as a universal method for proving general and fundamental theorems in functional analysis and measure theory. Audience: This book will be of value to researchers and postgraduate students in mathematics, as well as in such diverse fields as knowledge engineering, artificial intelligence, game theory, statistics, economics, sociology and industry.
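For context, one standard formulation of the property that gives the class its name (stated here for convenience, not quoted from the book): a set function m is null-additive when adjoining a set of value zero changes nothing, that is,

```latex
% Null-additivity (one standard formulation).
m(A \cup B) = m(A) \qquad \text{whenever } A \cap B = \varnothing \text{ and } m(B) = 0.
% Classical (sigma-additive) measures satisfy this, as do many of the
% non-additive examples listed in the blurb above.
```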
The purpose of this book is to provide the reader who is interested in applications of fuzzy set theory, in the first place with a text to which he or she can refer for the basic theoretical ideas, concepts and techniques in this field and in the second place with a vast and up-to-date account of the literature. Although there are now many books about fuzzy set theory, and mainly about its applications, e.g. in control theory, there is not really a book available which introduces the elementary theory of fuzzy sets, in what I would like to call "a good degree of generality." To write a book which would treat the entire range of results concerning the basic theoretical concepts in great detail and which would also deal with all possible variants and alternatives of the theory, such as e.g. rough sets and L-fuzzy sets for arbitrary lattices L, with the possibility-probability theories and interpretations, with the foundation of fuzzy set theory via multi-valued logic or via categorical methods and so on, would have been an altogether different project. This book is far more modest in its mathematical content and in its scope.
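As a one-line anchor for readers new to the field (standard definition, not specific to this book): a fuzzy set on a universe X is simply a membership function,

```latex
% Fuzzy set as a membership function (standard definition).
\mu_A : X \to [0,1],
% with \mu_A(x) = 1 meaning full membership, \mu_A(x) = 0 none, and ordinary
% ("crisp") subsets recovered as the functions taking only the values 0 and 1.
```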
The book is devoted to various constructions of sets which are nonmeasurable with respect to invariant (more generally, quasi-invariant) measures. Our starting point is the classical Vitali theorem stating the existence of subsets of the real line which are not measurable in the Lebesgue sense. This theorem stimulated the development of a number of interesting topics in mathematics.
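The construction behind Vitali's theorem can be sketched in a few lines (standard argument, paraphrased rather than quoted from the book):

```latex
% Vitali's construction, in outline (standard argument, paraphrased).
Define $x \sim y \iff x - y \in \mathbb{Q}$ on $[0,1]$ and, using the axiom of
choice, pick one representative from each equivalence class to form a set $V$.
The translates $\{\, V + q : q \in \mathbb{Q} \cap [-1,1] \,\}$ are pairwise
disjoint and satisfy
\[
  [0,1] \subseteq \bigcup_{q} (V + q) \subseteq [-1,2].
\]
If $V$ were Lebesgue measurable, translation invariance and countable additivity
would give a contradiction either way: $\lambda(V) = 0$ makes the union too small
to cover $[0,1]$, while $\lambda(V) > 0$ makes it too large to fit in $[-1,2]$.
Hence $V$ is nonmeasurable.
```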
The present volume of the Handbook of the History of Logic brings together two of the most important developments in 20th century non-classical logic. These are many-valuedness and non-monotonicity. On the one approach, in deference to vagueness, temporal or quantum indeterminacy or reference-failure, sentences that are classically non-bivalent are allowed as inputs and outputs to consequence relations. Many-valued, dialetheic, fuzzy and quantum logics are, among other things, principled attempts to regulate the flow-through of sentences that are neither true nor false. On the second, or non-monotonic, approach, constraints are placed on inputs (and sometimes on outputs) of a classical consequence relation, with a view to producing a notion of consequence that serves in a more realistic way the requirements of real-life inference.
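As a small illustration of the first approach (Kleene's strong three-valued tables, a standard example rather than one taken from the volume): alongside true t and false f, sentences may take an undetermined value u, and the connectives are extended so that

```latex
% Kleene's strong three-valued connectives on the undetermined value u (standard tables).
\neg u = u, \qquad
u \wedge t = u, \quad u \wedge f = f, \quad u \wedge u = u, \qquad
u \vee t = t, \quad u \vee f = u, \quad u \vee u = u.
```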
We welcome Volume 20, Formal Aspects of Context. Context has always been recognised as strongly relevant to models in language, philosophy, logic and artificial intelligence. In recent years theoretical advances in these areas and especially in logic have accelerated the study of context in the international community. An annual conference is held and many researchers have come to realise that many of the old puzzles should be reconsidered with proper attention to context. The volume editors and contributors are from among the most active front-line researchers in the area and the contents show how wide and vigorous this area is. There are strong scientific connections with earlier volumes in the series. I am confident that the appearance of this book in our series will help secure the study of context as an important area of applied logic. D.M. Gabbay INTRODUCTION This book is a result of the First International and Interdisciplinary Conference on Modelling and Using Context, which was organised in Rio de Janeiro in January 1997, and contains a selection of the papers presented there, refereed and revised through a process of anonymous peer review. The treatment of contexts as bona-fide objects of logical formalisation has gained wide acceptance in recent years, following the seminal impetus by McCarthy in his Turing award address.
Despite decades of work in evolutionary algorithms, there remains a lot of uncertainty as to when it is beneficial or detrimental to use recombination or mutation. This book provides a characterization of the roles that recombination and mutation play in evolutionary algorithms. It integrates prior theoretical work and introduces new theoretical techniques for studying evolutionary algorithms. An aggregation algorithm for Markov chains is introduced which is useful for studying not only evolutionary algorithms specifically, but also complex systems in general. Practical consequences of the theory are explored and a novel method for comparing search and optimization algorithms is introduced. A focus on discrete rather than real-valued representations allows the book to bridge multiple communities, including evolutionary biologists and population geneticists.
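For readers who have not seen one, a minimal evolutionary algorithm using both operators looks roughly like the sketch below; it is a generic illustration with made-up parameters and a toy fitness function, not one of the algorithms analysed in the book.

```python
import random

# Minimal evolutionary algorithm on bit strings (generic illustration;
# parameters and the OneMax fitness function are made up, not from the book).

def fitness(bits):
    """OneMax: count of 1-bits, a standard toy objective."""
    return sum(bits)

def mutate(bits, rate=0.05):
    """Flip each bit independently with probability `rate`."""
    return [b ^ 1 if random.random() < rate else b for b in bits]

def recombine(mom, dad):
    """One-point crossover of two parent bit strings."""
    cut = random.randrange(1, len(mom))
    return mom[:cut] + dad[cut:]

def evolve(n_bits=32, pop_size=20, generations=100):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        children = [mutate(recombine(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size)]
        # Elitist survivor selection: keep the best of parents and children.
        pop = sorted(parents + children, key=fitness, reverse=True)[:pop_size]
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print(fitness(best), "of", len(best), "bits set")
```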
Over the last decade and particularly in recent years, the macroscopic porous media theory has made decisive progress concerning the fundamentals of the theory and the development of mathematical models in various fields of engineering and biomechanics. This progress has attracted some attention, and therefore conferences devoted almost exclusively to the macroscopic porous media theory have been organized in order to collect all findings, to present new results, and to discuss new trends. Many important contributions have also been published in national and international journals, which have brought the porous media theory, in some parts, to a close. Therefore, the time seems to be ripe to review the state of the art and to show new trends in the continuum mechanical treatment of saturated and unsaturated capillary and non-capillary porous solids. This book addresses postgraduate students and scientists working in engineering, physics, and mathematics. It provides an outline of modern theory of porous media and shows some trends in theory and in applications.
Mathematics has stood as a bridge between the Humanities and the Sciences since the days of classical antiquity. For Plato, mathematics was evidence of Being in the midst of Becoming, garden variety evidence apparent even to small children and the unphilosophical, and therefore of the highest educational significance. In the great central similes of The Republic it is the touchstone of intelligibility for discourse, and in the Timaeus it provides in an oddly literal sense the framework of nature, insuring the intelligibility of the material world. For Descartes, mathematical ideas had a clarity and distinctness akin to the idea of God, as the fifth of the Meditations makes especially clear. Cartesian mathematicals are constructions as well as objects envisioned by the soul; in the Principles, the work of the physicist who provides a quantified account of the machines of nature hovers between description and constitution. For Kant, mathematics reveals the possibility of universal and necessary knowledge that is neither the logical unpacking of concepts nor the record of perceptual experience. In the Critique of Pure Reason, mathematics is one of the transcendental instruments the human mind uses to apprehend nature, and by apprehending to construct it under the universal and necessary laws of Newtonian mechanics.
Providing the first comprehensive treatment of the subject, this groundbreaking work is solidly founded on a decade of concentrated research, some of which is published here for the first time, as well as practical, "hands-on" classroom experience. The clarity of presentation and abundance of examples and exercises make it suitable as a graduate level text in mathematics, decision making, artificial intelligence, and engineering courses.
From the very beginning of their investigation of human reasoning, philosophers have identified two other forms of reasoning, besides deduction, which we now call abduction and induction. Deduction is now fairly well understood, but abduction and induction have eluded a similar level of understanding. The papers collected here address the relationship between abduction and induction and their possible integration. The approach is sometimes philosophical, sometimes that of pure logic, and some papers adopt the more task-oriented approach of AI. The book will command the attention of philosophers, logicians, AI researchers and computer scientists in general.
Can you really keep your eye on the ball? How is massive data collection changing sports? Sports science courses are growing in popularity. The author's course at Roanoke College is a mix of physics, physiology, mathematics, and statistics. Many students of both genders find it exciting to think about sports. Sports problems are easy to create and state, even for students who do not live sports 24/7. Sports are part of their culture and knowledge base, and the opportunity to be an expert on some area of sports is invigorating. This should be the primary reason for the growth of mathematics of sports courses: the topic provides intrinsic motivation for students to do their best work. From the Author: "The topics covered in Sports Science and Sports Analytics courses vary widely. To use a golfing analogy, writing a book like this is like hitting a drive at a driving range; there are many directions you can go without going out of bounds. At the driving range, I pick out a small target to focus on, and that is what I have done here. I have chosen a sample of topics I find very interesting. Ideally, users of this book will have enough to choose from to suit whichever version of a sports course is being run." "The book is very appealing to teach from as well as to learn from. Students seem to have a growing interest in ways to apply traditionally different areas to solve problems. This, coupled with an enthusiasm for sports, makes Dr. Minton's book appealing to me."-Kevin Hutson, Furman University
Modal logics, originally conceived in philosophy, have recently found many applications in computer science, artificial intelligence, the foundations of mathematics, linguistics and other disciplines. Celebrated for their good computational behaviour, modal logics are used as effective formalisms for talking about time, space, knowledge, beliefs, actions, obligations, provability, etc. However, the nice computational properties can drastically change if we combine some of these formalisms into a many-dimensional system, say, to reason about knowledge bases developing in time or moving objects.
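A hint of what "many-dimensional" means in practice (my illustrative formula, not one from the book): combining an epistemic box K with a temporal "eventually" operator F lets one write statements such as

```latex
% A two-dimensional (epistemic + temporal) formula, purely illustrative.
K\, F\, \varphi
% read: "the agent knows that phi will hold at some future moment", where K is
% the epistemic box and F the temporal "eventually" operator.
```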
I am very happy to have this opportunity to introduce Luca Vigano's book on Labelled Non-Classical Logics. I put forward the methodology of labelled deductive systems to the participants of Logic Colloquium '90 (Labelled Deductive Systems: A Position Paper, in J. Oikkonen and J. Vaananen, editors, Logic Colloquium '90, Volume 2 of Lecture Notes in Logic, pages 66-68, Springer, Berlin, 1993), in an attempt to bring labelling as a recognised and significant component of our logic culture. It was a response to earlier isolated uses of labels by various distinguished authors, as a means to achieve local proof theoretic goals. Labelling was used in many different areas such as resource labelling in relevance logics, prefix tableaux in modal logics, annotated logic programs in logic programming, proof tracing in truth maintenance systems, and various side annotations in higher-order proof theory, arithmetic and analysis. This widespread local use of labels was an indication of an underlying logical pattern, namely the simultaneous side-by-side manipulation of several kinds of logical information. It was clear that there was a need to establish the labelled deductive systems methodology. Modal logic is one major area where labelling can be developed quickly and systematically with a view of demonstrating its power and significant advantage. In modal logic the labels can play a double role.
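As a taste of the methodology (a standard labelled rule for the modal box, not quoted from the book): labels name possible worlds, and relational facts like wRv are manipulated side by side with the formulas, for example

```latex
% A standard labelled elimination rule for the modal box (illustrative).
\frac{w : \Box A \qquad w R v}{v : A}
% If Box A holds at world w and world v is accessible from w, then A holds at v.
```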