The Equation of Knowledge: From Bayes' Rule to a Unified Philosophy of Science introduces readers to the Bayesian approach to science, teasing out the link between probability and knowledge. The author strives to make this book accessible to a very broad audience, suitable for professionals, students, and academics, as well as the enthusiastic amateur scientist or mathematician. This book also shows how Bayesianism sheds new light on nearly all areas of knowledge, from philosophy to mathematics, science and engineering, but also law, politics and everyday decision-making. Bayesian thinking is an important topic for research, which has seen dramatic progress in recent years, and has a significant role to play in the understanding and development of AI and machine learning, among many other things. This book seeks to act as a tool for proselytising the benefits and limits of Bayesianism to a wider public.
Features:
- Presents the Bayesian approach as a unifying scientific method for a wide range of topics
- Suitable for a broad audience, including professionals, students, and academics
- Provides a more accessible, philosophical introduction to the subject than is offered elsewhere
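The rule the title refers to can be stated in a few lines of code. Below is a minimal sketch (not from the book) of a single Bayesian update in Python; the disease-testing numbers are hypothetical and chosen only to make the point that a positive result need not make a hypothesis likely.

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E).
# Hypothetical numbers: a condition with 1% prevalence, a test with
# 90% sensitivity and a 5% false-positive rate.

def bayes_update(prior, likelihood, false_positive_rate):
    """Posterior probability of the hypothesis after a positive result."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

posterior = bayes_update(prior=0.01, likelihood=0.90, false_positive_rate=0.05)
# Despite the positive result, the posterior is only about 15%.
```

The low posterior despite a "90% accurate" test is the classic base-rate effect that Bayesian reasoning makes explicit.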
This book introduces new models based on R-calculus and theories of belief revision for dealing with large and changing data. It extends R-calculus from first-order logic to propositional logic, description logics, modal logic and logic programming, and from minimal-change semantics to subset minimal change, pseudo-subformula minimal change and deduction-based minimal change (the last two minimal changes are newly defined), and it proves soundness and completeness theorems with respect to the minimal changes in these logics. To make R-calculus computable, an approximate R-calculus is given which uses the finite injury priority method from recursion theory. Moreover, two applications of R-calculus are given, to default theory and to semantic inheritance networks. This book offers a rich blend of theory and practice. It is suitable for students, researchers and practitioners in the field of logic, and it is also very useful for all those who are interested in data, digitization, and the correctness and consistency of information, as well as in modal logics, non-monotonic logics, decidable/undecidable logics, logic programming, description logics, default logics and semantic inheritance networks.
This book has a fundamental relationship to the International Seminar on Fuzzy Set Theory held each September in Linz, Austria. First, this volume is an extended account of the eleventh Seminar of 1989. Second, and more importantly, it is the culmination of the tradition of the preceding ten Seminars. The purpose of the Linz Seminar, since its inception, was and is to foster the development of the mathematical aspects of fuzzy sets. In the earlier years, this was accomplished by bringing together for a week small groups of mathematicians in various fields in an intimate, focused environment which promoted much informal, critical discussion in addition to formal presentations. Beginning with the tenth Seminar, the intimate setting was retained, but each Seminar narrowed in theme; and participation was broadened to include both younger scholars within, and established mathematicians outside, the mathematical mainstream of fuzzy set theory. Most of the material of this book was developed over the years in close association with the Seminar or influenced by what transpired at Linz. For much of the content, the Seminar played a crucial role either in stimulating this material or in providing feedback and the necessary screening of ideas. Thus we may fairly say that the book, and the eleventh Seminar to which it is directly related, are in many respects a culmination of the previous Seminars.
An introductory textbook, Logic for Justice covers, in full detail, the language and semantics of both propositional logic and first-order logic. It motivates the study of those logical systems by drawing on social and political issues. Basically, Logic for Justice frames propositional logic and first-order logic as two theories of the distinction between good arguments and bad arguments. And the book explains why, for the purposes of social justice and political reform, we need theories of that distinction. In addition, Logic for Justice is extremely lucid, thorough, and clear. It explains, and motivates, many different features of the formalism of propositional logic and first-order logic, always connecting those features back to real-world issues.
Key Features:
- Connects the study of logic to real-world social and political issues, drawing in students who might not otherwise be attracted to the subject.
- Offers extremely clear and thorough presentations of technical material, allowing students to learn directly from the book without having to rely on instructor explanations.
- Carefully explains the value of arguing well throughout one's life, with several discussions about how to argue and how arguments, when done with care, can be helpful personally.
- Includes examples that appear throughout the entire book, allowing students to see how the ideas presented in the book build on each other.
- Provides a large and diverse set of problems for each chapter.
- Teaches logic by connecting formal languages to natural languages with which students are already familiar, making it much easier for students to learn how logic works.
This book treats bounded arithmetic and propositional proof complexity from the point of view of computational complexity. The first seven chapters include the necessary logical background for the material and are suitable for a graduate course. Associated with each of many complexity classes are both a two-sorted predicate calculus theory, with induction restricted to concepts in the class, and a propositional proof system. The complexity classes range from AC0 for the weakest theory up to the polynomial hierarchy. Each bounded theorem in a theory translates into a family of (quantified) propositional tautologies with polynomial size proofs in the corresponding proof system. The theory proves the soundness of the associated proof system. The result is a uniform treatment of many systems in the literature, including Buss's theories for the polynomial hierarchy and many disparate systems for complexity classes such as AC0, AC0(m), TC0, NC1, L, NL, NC, and P.
This monograph presents a new model of mathematical structures called weak n-categories. These structures find their motivation in a wide range of fields, from algebraic topology to mathematical physics, algebraic geometry and mathematical logic. While strict n-categories are easily defined in terms of associative and unital composition operations, they are of limited use in applications, which often call for weakened variants of these laws. The author proposes a new approach to this weakening, whose generality arises not from a weakening of such laws but from the very geometric structure of its cells; a geometry dubbed weak globularity. The new model, called weakly globular n-fold categories, is one of the simplest known algebraic structures yielding a model of weak n-categories. The central result is the equivalence of this model to one of the existing models, due to Tamsamani and further studied by Simpson. This theory has intended applications to homotopy theory, mathematical physics and to long-standing open questions in category theory. As the theory is described in elementary terms and the book is largely self-contained, it is accessible to beginning graduate students and to mathematicians from a wide range of disciplines well beyond higher category theory. The new model makes a transparent connection between higher category theory and homotopy theory, rendering it particularly suitable for category theorists and algebraic topologists. Although the results are complex, readers are guided with an intuitive explanation before each concept is introduced, and with diagrams showing the interconnections between the main ideas and results.
Cellular automata are widely-used tools for simulation in physics, ecology, evolution, mathematics and other fields. They are also digital "toy universes" worthy of study in their own right, with a significant and growing body of enthusiastic investigators. This book will present many of the most interesting new developments and applications of cellular automata. There has not been a compilation like this for some time and this field is due for an explosion of research interest in the rather near future.
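To give a concrete sense of the "toy universes" the blurb describes, here is a minimal sketch (not from the book) of an elementary one-dimensional cellular automaton in Python. The rule number encodes the update table in the standard Wolfram convention; Rule 110 is chosen only as a well-known example.

```python
def step(cells, rule=110):
    """One synchronous update of an elementary cellular automaton
    (Wolfram rule numbering) on a ring of 0/1 cells."""
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Evolve a single live cell for a few generations.
row = [0] * 11
row[5] = 1
history = [row]
for _ in range(4):
    row = step(row)
    history.append(row)
```

Each cell reads its three-cell neighbourhood as a number from 0 to 7 and looks up the corresponding bit of the rule number, which is all the machinery an elementary cellular automaton needs.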
Topos Theory is an important branch of mathematical logic of interest to theoretical computer scientists, logicians and philosophers who study the foundations of mathematics, and to those working in differential geometry and continuum physics. This compendium contains material that was previously available only in specialist journals. This is likely to become the standard reference work for all those interested in the subject.
Professor Chandrasekhar's work is an attempt by a distinguished practising scientist to read and comprehend the enormous intellectual achievement of the Principia without recourse to secondary sources. This text has stimulated great interest and debate among the scientific community, illuminating the brilliance of Newton's work under the gaze of Chandrasekhar's rare perception.
Today the notion of the algorithm is familiar not only to mathematicians. It forms a conceptual base for information processing; the existence of a corresponding algorithm makes automatic information processing possible. The theory of algorithms (together with mathematical logic) forms the theoretical basis for modern computer science (see [Sem Us 86]; this article is called "Mathematical Logic in Computer Science and Computing Practice", and in its title mathematical logic is understood in a broad sense including the theory of algorithms). However, not everyone realizes that the word "algorithm" includes a transformed toponym, Khorezm. Algorithms were named after a great scientist of the medieval East, al-Khwarizmi (where al-Khwarizmi means "from Khorezm"). He lived between c. 783 and 850 A.D., and the year 1983 was chosen to celebrate his 1200th birthday. A short biography of al-Khwarizmi compiled in the tenth century starts as follows: "al-Khwarizmi. His name is Muhammad ibn Musa, he is from Khoresm" (cited according to [Bul Rozen Ah 83, p. 8]).
This book is an exploration and defense of the coherence of classical theism's doctrine of divine aseity in the face of the challenge posed by Platonism with respect to abstract objects. A synoptic work in analytic philosophy of religion, the book engages discussions in philosophy of mathematics, philosophy of language, metaphysics, and metaontology. It addresses absolute creationism, non-Platonic realism, fictionalism, neutralism, and alternative logics and semantics, among other topics. The book offers a helpful taxonomy of the wide range of options available to the classical theist for dealing with the challenge of Platonism. It probes in detail the diverse views on the reality of abstract objects and their compatibility with classical theism. It contains a most thorough discussion, rooted in careful exegesis, of the biblical and patristic basis of the doctrine of divine aseity. Finally, it challenges the influential Quinean metaontological theses concerning the way in which we make ontological commitments.
This book explores the research of Professor Hilary Putnam, a Harvard professor as well as a leading philosopher, mathematician and computer scientist. It features the work of distinguished scholars in the field as well as a selection of young academics who have studied topics closely connected to Putnam's work. It includes 12 papers that analyze, develop, and constructively criticize this notable professor's research in mathematical logic, the philosophy of logic and the philosophy of mathematics. In addition, it features a short essay presenting reminiscences and anecdotes about Putnam from his friends and colleagues, and also includes an extensive bibliography of his work in mathematics and logic. The book offers readers a comprehensive review of outstanding contributions in logic and mathematics as well as an engaging dialogue between prominent scholars and researchers. It provides those interested in mathematical logic, the philosophy of logic, and the philosophy of mathematics unique insights into the work of Hilary Putnam.
This book addresses mechanisms for reducing the model heterogeneity induced by the absence of explicit semantics expression in the formal techniques used to specify design models. More precisely, it highlights the advances in handling both implicit and explicit semantics in formal system developments, and discusses different contributions expressing different views and perceptions of implicit and explicit semantics. The book is based on the discussions at the Shonan meeting on this topic held in 2016, and includes contributions from the participants summarising their perspectives on the problem and offering solutions. Divided into five parts covering domain modelling, knowledge-based modelling, proof-based modelling, assurance cases, and refinement-based modelling, it offers inspiration for researchers and practitioners in the fields of formal methods, system and software engineering, domain knowledge modelling, requirement analysis, and the explicit and implicit semantics of modelling languages.
This adaptation of an earlier work by the authors is a graduate text and professional reference on the fundamentals of graph theory. It covers the theory of graphs, its applications to computer networks and the theory of graph algorithms. Also includes exercises and an updated bibliography.
Automata Theory and its Applications is a uniform treatment of the theory of finite state machines on finite and infinite strings and trees. Many books deal with automata on finite strings, but there are very few expositions that prove the fundamental results of automata on infinite strings and trees. These results have important applications to modeling parallel computation and concurrency, the specification and verification of sequential and concurrent programs, databases, operating systems, computational complexity, and decision methods in logic and algebra. Thus, this textbook fills an important gap in the literature by exposing early fundamental results in automata theory and its applications. Beginning with coverage of all standard fundamental results regarding finite automata, the book deals in great detail with Büchi and Rabin automata and their applications to various logical theories such as S1S and S2S, and describes game-theoretic models of concurrent operating and communication systems. The book is self-contained with numerous examples, illustrations, exercises, and is suitable for a two-semester undergraduate course for computer science or mathematics majors, or for a one-semester graduate course/seminar. Since no advanced mathematical background is required, the text is also useful for self-study by computer science professionals who wish to understand the foundations of modern formal approaches to software development, validation, and verification.
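The simplest object the book starts from, a deterministic finite automaton on finite strings, can be sketched in a few lines. The Python below is not from the book; the even-number-of-a's language is a standard textbook example, and the state names are illustrative.

```python
# Transition table for a DFA over {'a', 'b'} that accepts exactly the
# strings containing an even number of 'a's.
EVEN_AS = {
    ('even', 'a'): 'odd',  ('even', 'b'): 'even',
    ('odd', 'a'): 'even',  ('odd', 'b'): 'odd',
}

def accepts(string, transitions=EVEN_AS, start='even', accepting=frozenset({'even'})):
    """Run the DFA over the input string and report acceptance."""
    state = start
    for symbol in string:
        state = transitions[(state, symbol)]
    return state in accepting

accepts("abba")  # True: two 'a's
accepts("ab")    # False: one 'a'
```

Automata on infinite strings, such as the Büchi and Rabin automata the book emphasises, replace this "final state" acceptance test with conditions on the states visited infinitely often.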
This volume presents essays by pioneering thinkers including Tyler Burge, Gregory Chaitin, Daniel Dennett, Barry Mazur, Nicholas Humphrey, John Searle and Ian Stewart. Together they illuminate the Map/Territory Distinction that lies at the foundation of the scientific method, of thought, and of reality itself. It is imperative to distinguish the map from the territory while analyzing any subject, yet we often mistake the map for the territory: meaning for reference, the computational tool for what it computes. Representations are so handy and tempting that we often end up committing the category error of over-marrying the representation with what is represented, so much so that the distinction between the former and the latter is lost. This error, which has its roots in pedagogy, often generates a plethora of paradoxes and confusions which hinder the proper understanding of the subject. What are wave functions? Fields? Forces? Numbers? Sets? Classes? Operators? Functions? Alphabets and sentences? Are they a part of our map (theory/representation)? Or do they actually belong to the territory (reality)? The researcher, like a cartographer, clothes (or creates?) the reality by stitching together multitudes of maps that simultaneously co-exist. A simple apple, for example, can be analyzed from several viewpoints, beginning with evolution and biology, all the way down to its microscopic quantum mechanical components. Is there a reality (or a real apple) out there apart from these maps? How do these various maps interact and intermingle with each other to produce the coherent reality that we interact with? Or do they not? Does our brain use its own internal maps to enable the "physicist/mathematician" in us to construct maps of the external territories in turn? If so, what is the nature of these internal maps? Are there meta-maps? Evolution certainly fences our perception, and thereby our ability to construct maps, revealing to us only those aspects beneficial for our survival.
But the question is, to what extent? Is there a way out of the metaphorical Platonic cave that nature has erected around us? While "the map is not the territory", as Alfred Korzybski remarked, join us on this journey as we inquire into the nature and reality of the maps that try to map the reality out there. The book also includes a foreword by Sir Roger Penrose and an afterword by Dagfinn Follesdal.
This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.
This monograph introduces and explores the notions of a commutator equation and the equationally-defined commutator from the perspective of abstract algebraic logic. An account of the commutator operation associated with equational deductive systems is presented, with an emphasis placed on logical aspects of the commutator for equational systems determined by quasivarieties of algebras. The author discusses the general properties of the equationally-defined commutator, various centralization relations for relative congruences, the additivity and correspondence properties of the equationally-defined commutator and its behavior in finitely generated quasivarieties. Presenting new and original research not yet considered in the mathematical literature, The Equationally-Defined Commutator will be of interest to professional algebraists and logicians, as well as graduate students and other researchers interested in problems of modern algebraic logic.
This monograph offers a critical introduction to current theories of how scientific models represent their target systems. Representation is important because it allows scientists to study a model to discover features of reality. The authors provide a map of the conceptual landscape surrounding the issue of scientific representation, arguing that it consists of multiple intertwined problems. They provide an encyclopaedic overview of existing attempts to answer these questions, and they assess their strengths and weaknesses. The book also presents a comprehensive statement of their alternative proposal, the DEKI account of representation, which they have developed over the last few years. They show how the account works in the case of material as well as non-material models; how it accommodates the use of mathematics in scientific modelling; and how it sheds light on the relation between representation in science and art. The issue of representation has generated a sizeable literature, which has been growing fast in particular over the last decade. This makes it hard for novices to get a handle on the topic because so far there is no book-length introduction that would guide them through the discussion. Likewise, researchers may require a comprehensive review that they can refer to for critical evaluations. This book meets the needs of both groups.
The contributions in this book survey results on combinations of probabilistic and various other classical, temporal and justification logical systems. Formal languages of these logics are extended with probabilistic operators. The aim is to provide a systematic overview and an accessible presentation of mathematical techniques used to obtain results on formalization, completeness, compactness and decidability. The book will be of value to researchers in logic and it can be used as a supplementary text in graduate courses on non-classical logics.
This is an introductory undergraduate textbook in set theory. In mathematics these days, essentially everything is a set. Some knowledge of set theory is a necessary part of the background everyone needs for further study of mathematics. It is also possible to study set theory for its own interest--it is a subject with intriguing results about simple objects. This book starts with material that nobody can do without. There is no end to what can be learned of set theory, but here is a beginning.
The purpose of this book is to present the classical analytic function theory of several variables as a standard subject in a course of mathematics after learning the elementary materials (sets, general topology, algebra, one complex variable). This includes the essential parts of Grauert-Remmert's two volumes, GL227(236) (Theory of Stein spaces) and GL265 (Coherent analytic sheaves), with a lowering of the level for novice graduate students (here, Grauert's direct image theorem is limited to the case of finite maps). The core of the theory is "Oka's Coherence", found and proved by Kiyoshi Oka. It is indispensable, not only in the study of complex analysis and complex geometry, but also in a large area of modern mathematics. In this book, just after an introductory chapter on holomorphic functions (Chap. 1), we prove Oka's First Coherence Theorem for holomorphic functions in Chap. 2. This defines a unique character of the book compared with other books on this subject, in which the notion of coherence appears much later. The present book, consisting of nine chapters, gives complete treatments of the following items: Coherence of sheaves of holomorphic functions (Chap. 2); Oka-Cartan's Fundamental Theorem (Chap. 4); Coherence of ideal sheaves of complex analytic subsets (Chap. 6); Coherence of the normalization sheaves of complex spaces (Chap. 6); Grauert's Finiteness Theorem (Chaps. 7, 8); Oka's Theorem for Riemann domains (Chap. 8). The theories of sheaf cohomology and domains of holomorphy are also presented (Chaps. 3, 5). Chapter 6 deals with the theory of complex analytic subsets. Chapter 8 is devoted to the applications of formerly obtained results, proving Cartan-Serre's Theorem and Kodaira's Embedding Theorem. In Chap. 9, we discuss the historical development of "Coherence". It is difficult to find a book at this level that treats all of the above subjects in a completely self-contained manner.
In the present volume, a number of classical proofs are improved and simplified, so that the contents are easily accessible for beginning graduate students.