This volume consists of papers selected from the presentations at the workshop and mainly covers recent developments in formal languages, automata theory, and algebraic systems related to theoretical computer science and informatics. It covers areas such as automata and grammars, languages and codes, combinatorics on words, cryptosystems, logics and trees, Gröbner bases, minimal clones, zero-divisor graphs, fine convergence of functions, and others.
This careful selection of participant contributions reflects the focus of the 14th International Conference on Operator Theory, held in Timisoara (Romania) in June 1992, centering on the problems of extensions of operators and their connections with interpolation of analytic functions and with the spectral theory of differential operators. Other topics concern operator inequalities, spectral theory in general spaces and operator theory in Krein spaces.
The Philosophy of Mathematics Today gives a panorama of the best current work in this lively field, through twenty essays specially written for this collection by leading figures. The topics include indeterminacy, logical consequence, mathematical methodology, abstraction, and both Hilbert's and Frege's foundational programmes. The collection will be an important source for research in the philosophy of mathematics for years to come. Contributors Paul Benacerraf, George Boolos, John P. Burgess, Charles S. Chihara, Michael Detlefsen, Michael Dummett, Hartry Field, Kit Fine, Bob Hale, Richard G. Heck, Jnr., Geoffrey Hellman, Penelope Maddy, Karl-Georg Niebergall, Charles D. Parsons, Michael D. Resnik, Matthias Schirn, Stewart Shapiro, Peter Simons, W.W. Tait, Crispin Wright.
Residue number systems (RNSs) and arithmetic are useful for several reasons. First, a great deal of computing now takes place in embedded processors, such as those found in mobile devices, for which high speed and low power consumption are critical; the absence of carry propagation facilitates the realization of high-speed, low-power arithmetic. Second, computer chips are now so dense that full testing will no longer be possible, so fault tolerance and the general area of computational integrity have become more important. RNSs are extremely good for applications such as digital signal processing, communications engineering, computer security (cryptography), image processing, speech processing, and transforms, all of which are extremely important in computing today. This book provides an up-to-date account of RNSs and arithmetic. It covers the underlying mathematical concepts of RNSs, the conversion between conventional number systems and RNSs, and the implementation of arithmetic operations; various related applications are also introduced. In addition, numerous detailed examples and analyses of different implementations are provided.
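The carry-free arithmetic described above can be sketched in a few lines. This is a minimal illustration, not code from the book; the moduli, function names, and sample values are illustrative choices:

```python
from math import prod

MODULI = (3, 5, 7)  # pairwise coprime; dynamic range is 3 * 5 * 7 = 105

def to_rns(x):
    """Represent x by its residues modulo each modulus."""
    return tuple(x % m for m in MODULI)

def rns_add(a, b):
    """Addition is channel-wise: no carry ever crosses between residues."""
    return tuple((x + y) % m for x, y, m in zip(a, b, MODULI))

def rns_mul(a, b):
    """Multiplication is likewise carry-free and channel-wise."""
    return tuple((x * y) % m for x, y, m in zip(a, b, MODULI))

def from_rns(r):
    """Convert back to a conventional integer via the Chinese Remainder Theorem."""
    M = prod(MODULI)
    x = 0
    for ri, mi in zip(r, MODULI):
        Mi = M // mi
        x += ri * Mi * pow(Mi, -1, mi)  # modular inverse (Python 3.8+)
    return x % M

# 17 + 23 = 40, computed entirely inside the residue channels
assert from_rns(rns_add(to_rns(17), to_rns(23))) == 40
```

Because each residue channel is small and independent, the additions and multiplications can be performed in parallel hardware with no carry chain, which is the source of the speed and power advantages the book discusses.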
This book explores an important central thread that unifies Russell's thoughts on logic in two works previously considered at odds with each other, the Principles of Mathematics and the later Principia Mathematica. This thread is Russell's doctrine that logic is an absolutely general science and that any calculus for it must embrace wholly unrestricted variables. The heart of Landini's book is a careful analysis of Russell's largely unpublished "substitutional" theory. On Landini's showing, the substitutional theory reveals the unity of Russell's philosophy of logic and offers new avenues for a genuine solution of the paradoxes plaguing Logicism.
This volume focuses on the important mathematical idea of functions that, with the technology of computers and calculators, can be dynamically represented in ways that have not been possible previously. The book's editors contend that as a result of recent technological developments, combined with the integrated knowledge available from research on teaching, instruction, students' thinking, and assessment, curriculum developers, researchers, and teacher educators are faced with an unprecedented opportunity for making dramatic changes. The book presents content considerations that occur when the mathematics of graphs and functions relates to curriculum. It also examines content in a carefully considered integration of research that conveys where the field stands and where it might go. Drawing heavily on their own work, the chapter authors reconceptualize research in their specific areas so that this knowledge is integrated with the other strands. This model for synthesizing research can serve as a paradigm for how research in mathematics education can -- and probably should -- proceed.
Kurt Gödel was the most outstanding logician of the 20th century and a giant in the field. This book is part of a five-volume set that makes available all of Gödel's writings. The first three volumes, already published, consist of the papers and essays of Gödel. The final two volumes of the set deal with Gödel's correspondence with his contemporary mathematicians; this fifth volume consists of material from correspondents H-Z.
This book principally concerns the rapidly growing area of what might be termed "Logical Complexity Theory": the study of bounded arithmetic, propositional proof systems, length of proof, and similar themes, and the relations of these topics to computational complexity theory. Issuing from a two-year international collaboration, the book contains articles concerning the existence of the most general unifier, a special case of Kreisel's conjecture on length-of-proof, propositional logic proof size, a new alternating logtime algorithm for boolean formula evaluation and relation to branching programs, interpretability between fragments of arithmetic, feasible interpretability, provability logic, open induction, Herbrand-type theorems, isomorphism between first and second order bounded arithmetics, forcing techniques in bounded arithmetic, and ordinal arithmetic in IΔ₀. Also included is an extended abstract of J.P. Ressayre's new approach concerning the model completeness of the theory of real closed exponential fields. Additional features of the book include the transcription and translation of a recently discovered 1956 letter from Kurt Gödel to J. von Neumann, asking about a polynomial-time algorithm for the proof in k symbols of predicate calculus formulas (equivalent to the P vs NP question); and an open problem list consisting of seven fundamental and 39 technical questions contributed by many researchers, together with a bibliography of relevant references. This scholarly work will interest mathematical logicians, proof and recursion theorists, and researchers in computational complexity.
Volume II, on formal (ZFC) set theory, incorporates a self-contained "chapter 0" on proof techniques so that it is based on formal logic, in the style of Bourbaki. The emphasis on basic techniques provides a solid foundation in set theory and a thorough context for the presentation of advanced topics (such as absoluteness, relative consistency results, two expositions of Gödel's constructible universe, numerous ways of viewing recursion, and Cohen forcing).
Modern applications of logic, in mathematics, theoretical computer science, and linguistics, require combined systems involving many different logics working together. In this book the author offers a basic methodology for combining - or fibring - systems. This means that many existing complex systems can be broken down into simpler components, hence making them much easier to manipulate.
Kurt Gödel was the most outstanding logician of the 20th century and a giant in the field. This book is part of a five-volume set that makes available all of Gödel's writings. The first three volumes, already published, consist of the papers and essays of Gödel. The final two volumes of the set deal with Gödel's correspondence with his contemporary mathematicians; this fourth volume consists of material from correspondents A-G.
A satisfactory and coherent theory of orthogonal polynomials in several variables, attached to root systems, and depending on two or more parameters, has developed in recent years. This comprehensive account of the subject provides a unified foundation for the theory to which I.G. Macdonald has been a principal contributor. The first four chapters lead up to Chapter 5 which contains all the main results.
This two-volume work bridges the gap between introductory expositions of logic (or set theory) and the research literature. It can be used as a text in an advanced undergraduate or beginning graduate course in mathematics, computer science, or philosophy. The volumes are written in a user-friendly lecture style that makes them equally effective for self-study or class use. Volume I includes formal proof techniques, applications of compactness (including nonstandard analysis), computability and its relation to the completeness phenomenon, and the first presentation of a complete proof of Gödel's second incompleteness theorem since Hilbert and Bernays' Grundlagen.
This book presents several recent advances in natural language semantics and explores the boundaries between syntax and semantics over the last two decades. It is based on some of the most recent theories in logic, such as linear logic and ludics, first created by Jean-Yves Girard, and it also provides some sharp analyses of computational semantical representations, explaining advanced theories in theoretical computer science, such as the lambda-mu and Lambek-Grishin calculi, which were applied by Philippe de Groote and Michael Moortgat. The author also looks at Aarne Ranta's 'proof as meaning' approach, which was first based on Martin-Löf's Type Theory. Meaning, Logic and Ludics surveys the many solutions which have been proposed for the syntax-semantics interface, taking into account the specifications of linguistic signs (continuous or discontinuous) and the fundamental mechanisms developed by linguists and notable Generativists. This pioneering publication also presents ludics (in a chapter co-authored with Myriam Quatrini), a framework which allows us to characterize meaning as an invariant with regard to interaction between processes. It is an excellent book for advanced students and academics alike in the field of computational linguistics.
The book attempts an elementary exposition of the topics connected with many-valued logics. It gives an account of the constructions that are "many-valued" at their origin, i.e. those obtained through the intended introduction of logical values next to truth and falsity. To this aim, the matrix method has been chosen as the prevailing manner of presenting the subject. The inquiry throws light upon the profound problem of the criteria of many-valuedness and its classical characterizations. Besides, the reader can find information concerning the main systems of many-valued logic, related axiomatic constructions, and conceptions inspired by many-valuedness. The examples of various applications to philosophical logic and some practical domains, such as switching theory or computer science, help to see many-valuedness in a wider perspective. Together with a selective bibliography and historical references, this makes the work especially useful as a survey and guide in this field of logic.
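The matrix method mentioned above can be made concrete with a small sketch. The example below uses Łukasiewicz's well-known three-valued matrix (truth values 0, ½, 1, with 1 the sole designated value); the function names and the encoding of values as floats are illustrative choices, not taken from the book:

```python
# Łukasiewicz three-valued logic presented as a logical matrix:
# a set of truth values, a set of designated values, and truth functions.
VALUES = (0.0, 0.5, 1.0)
DESIGNATED = {1.0}

def neg(a):
    """Łukasiewicz negation: 1 - a."""
    return 1.0 - a

def implies(a, b):
    """Łukasiewicz implication: min(1, 1 - a + b)."""
    return min(1.0, 1.0 - a + b)

def is_tautology(formula):
    """A one-variable formula is valid iff it takes a designated value
    under every assignment drawn from the matrix."""
    return all(formula(v) in DESIGNATED for v in VALUES)

# p -> p remains valid in the three-valued matrix...
assert is_tautology(lambda p: implies(p, p))
# ...but excluded middle (p or not-p, with disjunction as max) fails:
# at p = 0.5 the formula takes the undesignated value 0.5.
assert not is_tautology(lambda p: max(p, neg(p)))
```

The same pattern (values, designated subset, truth tables, exhaustive evaluation) is what "the matrix method" amounts to for any finitely many-valued system.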
Clinical investigators initially defined mild cognitive impairment (MCI) as a transitional condition between normal aging and the early stages of Alzheimer's disease (AD). Because the prevalence of AD increases with age and very large numbers of older adults are affected worldwide, these clinicians saw a pressing need to identify AD as early as possible: it is at this very early stage in the disease course that treatments to slow the progress and control symptoms are likely to be most effective. Since the first introduction of MCI, research interest has grown exponentially, and the utility of the concept has been investigated from a variety of perspectives in different populations of interest (e.g., clinical samples, volunteers, population-based screening) in many different countries. Much variability in findings has resulted. Although it has been acknowledged that the differences observed between samples may be 'legitimate variations', there has been no attempt to understand what it is we have learned about MCI (i.e., common features and differences) from each of these perspectives. This book brings together these differing perspectives on MCI from around the world for the first time, and will be an important resource for any clinician, researcher, or student involved in the study, diagnosis, detection, treatment, and rehabilitation of people with MCI.
This best-selling text by John Taylor, now released in its second edition, introduces the study of uncertainties to lower division science students. Assuming no prior knowledge, the author introduces error analysis through the use of familiar examples ranging from carpentry to well-known historic experiments. Pertinent worked examples, simple exercises throughout the text, and numerous chapter-ending problems combine to make the book ideal for use in physics, chemistry and engineering lab courses. This book has been translated into nine languages and has more adoptions than we can count.
The Asian Logic Conference is part of the series of logic conferences inaugurated in Singapore in 1981. It is normally held every three years and rotates among countries in the Asia-Pacific region. The 11th Asian Logic Conference was held at the National University of Singapore, in honour of Professor Chong Chitat on the occasion of his 60th birthday. The conference covered the broad area of logic, including theoretical computer science. It is considered a major event in this field and is regularly sponsored by the Association for Symbolic Logic. This volume contains papers from this meeting.
This is a mathematically-oriented advanced text in modal logic, a discipline conceived in philosophy and having found applications in mathematics, artificial intelligence, linguistics, and computer science. It presents in a systematic and comprehensive way a wide range of classical and novel methods and results and can be used by a specialist as a reference book.
All modern books on Einstein emphasize the genius of his relativity theory and the corresponding corrections and extensions of the ancient space-time concept. However, Einstein's opposition to the use of probability in the laws of nature, and particularly in the laws of quantum mechanics, is criticized and often portrayed as outdated. The author of Einstein Was Right takes a different view and shows that Einstein created a "Trojan horse" ready to unleash forces against the use of probability as a basis for the laws of nature. Einstein warned that the use of probability would, in the final analysis, lead to "spooky" actions and mysterious instantaneous influences at a distance. John Bell pulled Einstein's Trojan horse into the castle of physics. He developed a theory that, together with experimental results of Aspect, Zeilinger, and others, "proves" the existence of quantum non-localities, instantaneous influences. These have indeed the nature of what Einstein labeled as "spooky." The book Einstein Was Right shows that Bell was not aware of the special role that time and space-time play in any rigorous probability theory. As a consequence, his formalism is not general enough to be applied to the Aspect-Zeilinger type of experiments, and his conclusions about the existence of instantaneous influences at a distance are incorrect. This fact suggests a world view that is less optimistic about claims that teleportation and influences at a distance could open new horizons and provide the possibility of quantum computing. On the positive side, however, and as compensation, we are assured that the space-time picture of mankind, developed over millions of years and perfected by Einstein, is still able to cope with the phenomena that nature presents us on the atomic and sub-atomic level, and that the "quantum weirdness" may be explainable and understandable after all.
This book studies the universal constructions and properties in categories of commutative algebras, bringing out the specific properties that make commutative algebra and algebraic geometry work. Two universal constructions are presented and used here for the first time. The author shows that the concepts and constructions arising in commutative algebra and algebraic geometry are not bound so tightly to the absolute universe of rings, but possess a universality that is independent of them and can be interpreted in various categories of discourse. This brings new flexibility to classical commutative algebra and affords the possibility of extending the domain of validity and the application of the vast number of results obtained in classical commutative algebra. This innovative and original work will interest mathematicians in a range of specialities, including algebraists, categoricians, and algebraic geometers.
This penultimate volume contains numerous original, elegant, and surprising results in 1-dimensional cellular automata. Perhaps the most exciting, if not shocking, new result is the discovery that only 82 local rules, out of 256, suffice to predict the time evolution of any of the remaining 174 local rules from an arbitrary initial bit-string configuration. This is contrary to the well-known folklore that 256 local rules are necessary, leading to the new concept of quasi-global equivalence. Another surprising result is the introduction of a simple, yet explicit, infinite bit string called the super string S, which contains all random bit strings of finite length as sub-strings. As an illustration of the mathematical subtlety of this amazing discrete testing signal, the super string S is used to prove mathematically, in a trivial and transparent way, that rule 170 is as chaotic as a coin toss. Yet another unexpected new result, among many others, is the derivation of an explicit basin tree generation formula which provides an analytical relationship between the basin trees of globally-equivalent local rules. This formula allows the symbolic, rather than numerical, generation of the time evolution of any local rule corresponding to any initial bit-string configuration, from one of the 88 globally-equivalent local rules. But perhaps the most provocative idea is the proposal for adopting rule 137, over its three globally-equivalent siblings, including the heretofore better-known rule 110, as the prototypical universal Turing machine.
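For readers unfamiliar with the rule-numbering convention behind "256 local rules", here is a minimal sketch of how an elementary (radius-1) local rule acts on a bit-string configuration. The helper name and the periodic boundary choice are illustrative, not taken from the book; rule 170 is shown because, as a shift-like rule, its one-step effect is easy to verify:

```python
def step(bits, rule):
    """One synchronous update of an elementary cellular automaton.

    Each cell reads its (left, self, right) neighbourhood, forms the
    index 4*l + 2*c + r, and outputs that bit of the 8-bit rule number.
    Periodic boundary conditions are used.
    """
    n = len(bits)
    return [
        (rule >> (4 * bits[(i - 1) % n] + 2 * bits[i] + bits[(i + 1) % n])) & 1
        for i in range(n)
    ]

config = [1, 0, 1, 1, 0, 0, 0, 1]
# Rule 170 (binary 10101010) outputs 1 exactly when the right neighbour
# is 1, so one step is a cyclic left shift of the configuration.
assert step(config, 170) == config[1:] + config[:1]
```

Each of the 256 possible 8-bit rule numbers defines one such local rule; the book's results concern the global, long-time behaviour these simple updates generate.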
This is a first course in propositional modal logic, suitable for mathematicians, computer scientists and philosophers. Emphasis is placed on semantic aspects, in the form of labelled transition structures, rather than on proof theory. The book covers all the basic material - propositional languages, semantics and correspondence results, proof systems and completeness results - as well as some topics not usually covered in a modal logic course. It is written from a mathematical standpoint. To help the reader, the material is covered in short chapters, each concentrating on one topic. These are arranged into five parts, each with a common theme. An important feature of the book is the many exercises, and an extensive set of solutions is provided.
The nature of truth in mathematics is a problem which has exercised the minds of thinkers from at least the time of the ancient Greeks. The great advances in mathematics and philosophy in the twentieth century -- and in particular the proof of Gödel's theorem and the development of the notion of independence in mathematics -- have led to new viewpoints on this question. This book is the result of the interaction of a number of outstanding mathematicians and philosophers -- including Yurii Manin, Vaughan Jones, and Per Martin-Löf -- and their discussions of this problem. It provides an overview of the forefront of current thinking, and is a valuable introduction and reference for researchers in the area.
A lively and engaging look at logic puzzles and their role in mathematics, philosophy, and recreation. Logic puzzles were first introduced to the public by Lewis Carroll in the late nineteenth century and have been popular ever since. Games like Sudoku and Mastermind are fun and engrossing recreational activities, but they also share deep foundations in mathematical logic and are worthy of serious intellectual inquiry. Games for Your Mind explores the history and future of logic puzzles while enabling you to test your skill against a variety of puzzles yourself. In this informative and entertaining book, Jason Rosenhouse begins by introducing readers to logic and logic puzzles and goes on to reveal the rich history of these puzzles. He shows how Carroll's puzzles presented Aristotelian logic as a game for children, yet also informed his scholarly work on logic. He reveals how another pioneer of logic puzzles, Raymond Smullyan, drew on classic puzzles about liars and truthtellers to illustrate Kurt Gödel's theorems and illuminate profound questions in mathematical logic. Rosenhouse then presents a new vision for the future of logic puzzles based on nonclassical logic, which is used today in computer science and automated reasoning to manipulate large and sometimes contradictory sets of data. Featuring a wealth of sample puzzles ranging from simple to extremely challenging, this lively and engaging book brings together many of the most ingenious puzzles ever devised, including the "Hardest Logic Puzzle Ever," metapuzzles, paradoxes, and the logic puzzles in detective stories.