This book collects, for the first time in one volume, contributions honoring Professor Raymond Smullyan's work on self-reference. It serves not only as a tribute to one of the great thinkers in logic, but also as a celebration of self-reference in general, to be enjoyed by all lovers of this field. Raymond Smullyan, mathematician, philosopher, musician and inventor of logic puzzles, made a lasting impact on the study of mathematical logic; accordingly, this book spans the many personalities through which Professor Smullyan operated, offering extensions and re-evaluations of his academic work on self-reference, applying self-referential logic to art and nature, and lastly, offering new puzzles designed to communicate otherwise esoteric concepts in mathematical logic, in the manner for which Professor Smullyan was so well known. This book is suitable for students, scholars and logicians who are interested in learning more about Raymond Smullyan's work and life.
'Numbers and Proofs' presents a gentle introduction to the notion of proof, giving the reader an understanding of how to decipher others' proofs as well as construct their own. Useful methods of proof are illustrated in the context of studying problems concerning mainly numbers (real, rational, complex and integer). An indispensable guide for all students of mathematics. Each proof is preceded by a discussion intended to show the reader the kind of thoughts they might have before any attempt at a proof is made. Established proofs, which the student is then in a better position to follow, come afterwards.
This is the first comprehensive treatment of subjective logic and all its operations. The author developed the approach, and in this book he first explains subjective opinions, opinion representation, and decision-making under vagueness and uncertainty, and he then offers a full definition of subjective logic, harmonising the key notations and formalisms, concluding with chapters on trust networks and subjective Bayesian networks, which when combined form general subjective networks. The author shows how real-world situations can be realistically modelled with regard to how situations are perceived, with conclusions that more correctly reflect the ignorance and uncertainties that result from partially uncertain input arguments. The book will help researchers and practitioners to advance, improve and apply subjective logic to build powerful artificial reasoning models and tools for solving real-world problems. A good grounding in discrete mathematics is a prerequisite.
The series is devoted to the publication of high-level monographs on all areas of mathematical logic and its applications. It is addressed to advanced students and research mathematicians, and may also serve as a guide for lectures and for seminars at the graduate level.
This textbook provides a first introduction to mathematical logic which is closely attuned to the applications of logic in computer science. In it the authors emphasize the notion that deduction is a form of computation. While all the traditional subjects of logic (syntax, semantics, completeness, and compactness) are covered thoroughly, much of the book deals with less traditional topics such as resolution theorem proving, logic programming, and non-classical logics, modal and intuitionistic, which are becoming increasingly important in computer science. No previous exposure to logic is assumed, so this will be suitable for upper-level undergraduates or beginning graduate students in computer science or mathematics. From reviews of the first edition: "... must surely rank as one of the most fruitful textbooks introduced into computer science ... We strongly suggest it as a textbook ..." SIGACT News
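The blurb's central idea, that deduction is a form of computation, can be made concrete with a minimal propositional resolution prover. This is a generic sketch, not code from the book; the clause representation (sets of string literals, `~` for negation) is an illustrative choice:

```python
from itertools import combinations

def resolve(c1, c2):
    """Return all resolvents of two clauses (frozensets of literals).
    A leading '~' marks a negated literal."""
    resolvents = []
    for lit in c1:
        comp = lit[1:] if lit.startswith('~') else '~' + lit
        if comp in c2:
            resolvents.append((c1 - {lit}) | (c2 - {comp}))
    return resolvents

def unsatisfiable(clauses):
    """Saturate the clause set under resolution.
    True iff the empty clause is derivable (set is unsatisfiable)."""
    clauses = {frozenset(c) for c in clauses}
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:            # derived the empty clause
                    return True
                new.add(frozenset(r))
        if new <= clauses:           # nothing new: saturated, satisfiable
            return False
        clauses |= new

# {p, p -> q, ~q} is unsatisfiable; {p, q} is satisfiable
print(unsatisfiable([{'p'}, {'~p', 'q'}, {'~q'}]))  # True
print(unsatisfiable([{'p'}, {'q'}]))                # False
```

Running the prover is literally running a deduction: each resolution step is a small computation, and refuting the negated goal proves the original claim.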
The Bachelier Society for Mathematical Finance, founded in 1996, held its 1st World Congress in Paris from June 28 to July 1, 2000, thus coinciding with the centenary of the thesis defence of Louis Bachelier. In his thesis Bachelier introduced Brownian motion as a tool for the analysis of financial markets as well as the exact definition of options, and this is widely considered the keystone for the emergence of mathematical finance as a scientific discipline. The prestigious list of plenary speakers in Paris included two Nobel laureates, Paul Samuelson and Robert Merton. Over 130 further selected talks were given in three parallel sessions, all well attended by the over 500 participants who registered from all continents.
Boolean algebras underlie many central constructions of analysis, logic, probability theory, and cybernetics. This book concentrates on the analytical aspects of their theory and application, which distinguishes it among other sources. Boolean Algebras in Analysis consists of two parts. The first concerns the general theory at the beginner's level. Presenting classical theorems, the book describes the topologies and uniform structures of Boolean algebras, the basics of complete Boolean algebras and their continuous homomorphisms, as well as lifting theory. The first part also includes an introductory chapter covering the elementary theory. The second part deals with the metric theory of Boolean algebras at a graduate level. The covered topics include measure algebras, their subalgebras, and groups of automorphisms. Ample room is allotted to the new classification theorems abstracting the celebrated counterparts by D. Maharam, A. N. Kolmogorov, and V. A. Rokhlin. Boolean Algebras in Analysis is an exceptional definitive source on Boolean algebra as applied to functional analysis and probability. It is intended for all who are interested in new and powerful tools for hard and soft mathematical analysis.
Mathematics of Fuzzy Sets: Logic, Topology and Measure Theory is a major attempt to provide much-needed coherence for the mathematics of fuzzy sets. Much of this book is new material required to standardize this mathematics, making this volume a reference tool with broad appeal as well as a platform for future research. Fourteen chapters are organized into three parts: mathematical logic and foundations (Chapters 1-2), general topology (Chapters 3-10), and measure and probability theory (Chapters 11-14). Chapter 1 deals with non-classical logics and their syntactic and semantic foundations. Chapter 2 details the lattice-theoretic foundations of image and preimage powerset operators. Chapters 3 and 4 lay down the axiomatic and categorical foundations of general topology using lattice-valued mappings as a fundamental tool. Chapter 3 focuses on the fixed-basis case, including a convergence theory demonstrating the utility of the underlying axioms. Chapter 4 focuses on the more general variable-basis case, providing a categorical unification of locales, fixed-basis topological spaces, and variable-basis compactifications. Chapter 5 relates lattice-valued topologies to probabilistic topological spaces and fuzzy neighborhood spaces. Chapter 6 investigates the important role of separation axioms in lattice-valued topology from the perspective of space embedding and mapping extension problems, while Chapter 7 examines separation axioms from the perspective of Stone-Cech compactification and Stone representation theorems. Chapters 8 and 9 introduce the most important concepts and properties of uniformities, including the covering and entourage approaches and the basic theory of precompact or complete [0,1]-valued uniform spaces. Chapter 10 sets out the algebraic, topological, and uniform structures of the fundamentally important fuzzy real line and fuzzy unit interval. Chapter 11 lays the foundations of generalized measure theory and representation by Markov kernels.
Chapter 12 develops the important theory of conditioning operators with applications to measure-free conditioning. Chapter 13 presents elements of pseudo-analysis with applications to the Hamilton-Jacobi equation and optimization problems. Chapter 14 surveys briefly the fundamentals of fuzzy random variables which are [0,1]-valued interpretations of random sets.
Synthesis of Finite State Machines: Logic Optimization is the second in a set of two monographs devoted to the synthesis of Finite State Machines (FSMs). The first volume, Synthesis of Finite State Machines: Functional Optimization, addresses functional optimization, whereas this one addresses logic optimization. The result of functional optimization is a symbolic description of an FSM which represents a sequential function chosen from a collection of permissible candidates. Logic optimization is the body of techniques for converting a symbolic description of an FSM into a hardware implementation. The mapping of a given symbolic representation into a two-valued logic implementation is called state encoding (or state assignment), and it heavily impacts the area, speed, testability and power consumption of the realized circuit. The first part of the book introduces the relevant background, presents results previously scattered in the literature on the computational complexity of encoding problems, and surveys in depth old and new approaches to encoding in logic synthesis. The second part of the book presents two main results about symbolic minimization: a new procedure to find minimal two-level symbolic covers, under face, dominance and disjunctive constraints, and a unified frame to check encodability of encoding constraints and find codes of minimum length that satisfy them. The third part of the book introduces generalized prime implicants (GPIs), which are the counterpart, in symbolic minimization of two-level logic, to prime implicants in two-valued two-level minimization. GPIs enable the design of an exact procedure for two-level symbolic minimization, based on a covering step which is complicated by the need to guarantee encodability of the final cover. A new efficient algorithm to verify encodability of a selected cover is presented. If a cover is not encodable, it is shown how to augment it minimally until an encodable superset of GPIs is determined.
To handle encodability the authors have extended the frame to satisfy encoding constraints presented in the second part. The covering problems generated in the minimization of GPIs tend to be very large. Recently large covering problems have been attacked successfully by representing the covering table with binary decision diagrams (BDD). In the fourth part of the book the authors introduce such techniques and extend them to the case of the implicit minimization of GPIs, where the encodability and augmentation steps are also performed implicitly. Synthesis of Finite State Machines: Logic Optimization will be of interest to researchers and professional engineers who work in the area of computer-aided design of integrated circuits.
In this new text, Steven Givant, the author of several acclaimed books including works co-authored with Paul Halmos and Alfred Tarski, develops three theories of duality for Boolean algebras with operators. Givant addresses the two most recognized dualities (one algebraic and the other topological) and introduces a third duality, best understood as a hybrid of the first two. This text will be of interest to graduate students and researchers in the fields of mathematics, computer science, logic, and philosophy who are interested in exploring special or general classes of Boolean algebras with operators. Readers should be familiar with the basic arithmetic and theory of Boolean algebras, as well as the fundamentals of point-set topology.
Fuzzy logic is a recent revolutionary technology which has brought together researchers from mathematics, engineering, computer science, cognitive and behavioral sciences, etc. The work in fuzzy technology at the Laboratory for International Fuzzy Engineering (LIFE) has been specifically applied to engineering problems. This book reflects the results of the work that has been undertaken at LIFE with chapters treating the following topical areas: Decision Support Systems, Intelligent Plant Operations Support, Fuzzy Modeling and Process Control, System Design, Image Understanding, Behavior Decisions for Mobile Robots, the Fuzzy Computer, and Fuzzy Neuro Systems. The book is a thorough analysis of research which has been implemented in the areas of fuzzy engineering technology. The analysis can be used to improve these specific applications or, perhaps more importantly, to investigate more sophisticated fuzzy control applications.
Lattice-valued Logic aims at establishing the logical foundation for uncertain information processing routinely performed by humans and artificial intelligence systems. This textbook gives, for the first time, a general introduction to lattice-valued logic. It systematically summarizes research from the basic notions up to recent results on lattice implication algebras, lattice-valued logic systems based on lattice implication algebras, as well as the corresponding reasoning theories and methods. The book provides the suitable theoretical logical background of lattice-valued logic systems and supports newly designed intelligent uncertain-information-processing systems and a wide spectrum of intelligent learning tasks.
This superb exposition of a complex subject examines new developments in the theory and practice of computation from a mathematical perspective, with topics ranging from classical computability to complexity, from biocomputing to quantum computing. This book is suitable for researchers and graduate students in mathematics, philosophy, and computer science with a special interest in logic and foundational issues. Most useful to graduate students are the survey papers on computable analysis and biological computing. Logicians and theoretical physicists will also benefit from this book.
Natural duality theory is one of the major growth areas within general algebra. This text provides a short path to the forefront of research in duality theory. It presents a coherent approach to new results in the area, as well as exposing open problems. Unary algebras play a special role throughout the text. Individual unary algebras are relatively simple and easy to work with. But as a class they have a rich and complex entanglement with dualisability. This combination of local simplicity and global complexity ensures that, for the study of natural duality theory, unary algebras are an excellent source of examples and counterexamples. A number of results appear here for the first time. In particular, the text ends with an appendix that provides a new and definitive approach to the concept of the rank of a finite algebra and its relationship with strong dualisability.
Since their inception, fuzzy sets and fuzzy logic became popular. The reason is that the very idea of fuzzy sets and fuzzy logic attacks an old tradition in science, namely bivalent (black-or-white, all-or-none) judgment and reasoning and the thus resulting approach to formation of scientific theories and models of reality. The idea of fuzzy logic, briefly speaking, is just the opposite of this tradition: instead of full truth and falsity, our judgment and reasoning also involve intermediate truth values. Application of this idea to various fields has become known under the term fuzzy approach (or graded truth approach). Both practice (many successful engineering applications) and theory (interesting nontrivial contributions and broad interest of mathematicians, logicians, and engineers) have proven the usefulness of fuzzy approach. One of the most successful areas of fuzzy methods is the application of fuzzy relational modeling. Fuzzy relations represent formal means for modeling of rather nontrivial phenomena (reasoning, decision, control, knowledge extraction, systems analysis and design, etc.) in the presence of a particular kind of indeterminacy called vagueness. Models and methods based on fuzzy relations are often described by logical formulas (or by natural language statements that can be translated into logical formulas). Therefore, in order to approach these models and methods in an appropriate formal way, it is desirable to have a general theory of fuzzy relational systems with basic connections to (formal) language which enables us to describe relationships in these systems.
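The graded-truth idea and fuzzy relations described above can be sketched in a few lines. This is a generic illustration (not from the book): truth degrees live in [0, 1], connectives use the common min/max interpretation, and relations compose by the standard sup-min rule. The example data (`likes`, `teaches`) is hypothetical:

```python
# Min/max (Goedel-style) connectives on truth degrees in [0, 1]
def f_and(a, b): return min(a, b)
def f_or(a, b):  return max(a, b)
def f_not(a):    return 1.0 - a

# A fuzzy relation as a dict mapping pairs to membership degrees
likes = {('ann', 'logic'): 0.9, ('ann', 'algebra'): 0.4,
         ('bob', 'logic'): 0.2}
teaches = {('logic', 'cs101'): 0.8, ('algebra', 'cs101'): 1.0}

def compose(R, S):
    """Sup-min composition: (R;S)(x,z) = max over y of min(R(x,y), S(y,z))."""
    out = {}
    for (x, y), r in R.items():
        for (y2, z), s in S.items():
            if y == y2:
                out[(x, z)] = max(out.get((x, z), 0.0), min(r, s))
    return out

# ann relates to cs101 via logic (min(0.9, 0.8) = 0.8)
# or via algebra (min(0.4, 1.0) = 0.4); the sup keeps 0.8
print(compose(likes, teaches)[('ann', 'cs101')])  # 0.8
```

Sup-min composition is exactly relational composition with "exists" replaced by sup and "and" replaced by min, which is what makes fuzzy relational systems a graded generalization of classical relational calculus.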
Goguen categories extend the relational calculus and its categorical formalization to the fuzzy world. Starting from the fundamental concepts of sets, binary relations and lattices, this book introduces several categorical formulations of an abstract theory of relations such as allegories, Dedekind categories and related structures. It is shown that none of these theories is sufficiently rich to describe basic operations on fuzzy relations. The book then introduces Goguen categories and provides a comprehensive study of these structures including their representation theory, and the definability of norm-based operations. The power of the theory is demonstrated by a comprehensive example. A certain Goguen category is used to specify and to develop a fuzzy controller. Based on its abstract description as well as certain desirable properties and their formal proofs, a verified controller is derived without compromising the sometimes intuitive choice of norm-based operations by fuzzy engineers.
This volume contains English translations of Frege's early writings in logic and philosophy and of relevant reviews by other leading logicians. Professor Bynum has contributed a biographical essay, introduction, and extensive bibliography.
Modern mathematical logic would not exist without the analytical tools first developed by George Boole in The Mathematical Analysis of Logic and The Laws of Thought. The influence of the Boolean school on the development of logic, always recognised but long underestimated, has recently become a major research topic. This collection is the first anthology of works on Boole. It contains two works published in 1865, the year of Boole's death, but never reprinted, as well as several classic studies of recent decades and ten original contributions appearing here for the first time. From the programme of the English Algebraic School to Boole's use of operator methods, from the problem of interpretability to that of psychologism, a full range of issues is covered. The Boole Anthology is indispensable to Boole studies and will remain so for years to come.
This volume is dedicated to Prof. Dag Prawitz and his outstanding contributions to philosophical and mathematical logic. Prawitz's eminent contributions to structural proof theory, or general proof theory, as he calls it, and inference-based meaning theories have been extremely influential in the development of modern proof theory and anti-realistic semantics. In particular, Prawitz is the main author on natural deduction after Gerhard Gentzen, who defined natural deduction in his PhD thesis published in 1934. The book opens with an introductory paper that surveys Prawitz's numerous contributions to proof theory and proof-theoretic semantics and puts his work into a somewhat broader perspective, both historically and systematically. Chapters include either in-depth studies of certain aspects of Dag Prawitz's work or address open research problems that are concerned with core issues in structural proof theory, and range from philosophical essays to papers of a mathematical nature. Investigations into the necessity of thought and the theory of grounds and computational justifications, as well as an examination of Prawitz's conception of the validity of inferences in the light of three "dogmas of proof-theoretic semantics", are included. More formal papers deal with the constructive behaviour of fragments of classical logic and fragments of the modal logic S4, among other topics. In addition, there are chapters about inversion principles, normalization of proofs, and the notion of proof-theoretic harmony, and other areas of a more mathematical persuasion. Dag Prawitz also writes a chapter in which he explains his current views on the epistemic dimension of proofs and addresses the question why some inferences succeed in conferring evidence on their conclusions when applied to premises for which one already possesses evidence.
Blending Approximations with Sine Functions.- Quasi-interpolation in the Absence of Polynomial Reproduction.- Estimating the Condition Number for Multivariate Interpolation Problems.- Wavelets on a Bounded Interval.- Quasi-Kernel Polynomials and Convergence Results for Quasi-Minimal Residual Iterations.- Rate of Approximation of Weighted Derivatives by Linear Combinations of SMD Operators.- Approximation by Multivariate Splines: an Application of Boolean Methods.- Lm, ?, s-Splines in ?d.- Constructive Multivariate Approximation via Sigmoidal Functions with Applications to Neural Networks.- Spline-Wavelets of Minimal Support.- Necessary Conditions for Local Best Chebyshev Approximations by Splines with Free Knots.- C1 Interpolation on Higher-Dimensional Analogs of the 4-Direction Mesh.- Tabulation of Thin Plate Splines on a Very Fine Two-Dimensional Grid.- The L2-Approximation Orders of Principal Shift-Invariant Spaces Generated by a Radial Basis Function.- A Multi-Parameter Method for Nonlinear Least-Squares Approximation.- Analog VLSI Networks.- Converse Theorems for Approximation on Discrete Sets II.- A Dual Method for Smoothing Histograms using Nonnegative C1-Splines.- Segment Approximation By Using Linear Functionals.- Construction of Monotone Extensions to Boundary Function
This book provides an account of those parts of contemporary set theory that are relevant to other areas of pure mathematics. Intended for advanced undergraduates and beginning graduate students, the text is written in an easy-going style, with a minimum of formalism. The book begins with a review of "naive" set theory; it then develops the Zermelo-Fraenkel axioms of the theory, showing how they arise naturally from a rigorous answer to the question, "what is a set?" After discussing the ordinal and cardinal numbers, the book then delves into contemporary set theory, covering such topics as: the Borel hierarchy, stationary sets and regressive functions, and Lebesgue measure. Two chapters present an extension of the Zermelo-Fraenkel theory, discussing the axiom of constructibility and the question of provability in set theory. A final chapter presents an account of an alternative conception of set theory that has proved useful in computer science, the non-well-founded set theory of Peter Aczel. The author is a well-known mathematician and the editor of the "Computers in Mathematics" column in the AMS Notices and of FOCUS, the magazine published by the MAA.
New discoveries about algorithms are leading scientists beyond the Church-Turing Thesis, which governs the "algorithmic universe" and asserts the conventionality of recursive algorithms. A new paradigm for computation, the super-recursive algorithm, offers promising prospects for algorithms of much greater computing power and efficiency. The book:
* Describes the strengthening link between the theory of super-recursive algorithms and actual algorithms close to practical realization
* Examines the theory's basis as a foundation for advancements in computing, information science, and related technologies
* Encompasses and systematizes all main types of mathematical models of algorithms
* Highlights how super-recursive algorithms pave the way for more advanced design, utilization, and maintenance of computers
* Examines and restructures the existing variety of mathematical models of complexity of algorithms and computation, introducing new models
* Includes a comprehensive bibliography and index
Mathematics and logic have been central topics of concern since the dawn of philosophy. Since logic is the study of correct reasoning, it is a fundamental branch of epistemology and a priority in any philosophical system. Philosophers have focused on mathematics as a case study for general philosophical issues and for its role in overall knowledge-gathering. Today, philosophy of mathematics and logic remain central disciplines in contemporary philosophy, as evidenced by the regular appearance of articles on these topics in the best mainstream philosophical journals; in fact, the last decade has seen an explosion of scholarly work in these areas.
An approach to complexity theory which offers a means of analysing algorithms in terms of their tractability. The authors consider the problem in terms of parameterized languages, taking "k-slices" of the language, thus introducing readers to new classes of algorithms which may be analysed more precisely than was previously possible. The book is as self-contained as possible and includes a great deal of background material. As a result, computer scientists, mathematicians, and graduate students interested in the design and analysis of algorithms will find much of interest.
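The parameterized ("k-slice") viewpoint can be illustrated with the textbook example of a fixed-parameter tractable problem: k-Vertex Cover. This sketch is not from the book; it is the standard bounded-search-tree algorithm, whose running time is roughly O(2^k * m), so each fixed-k slice is efficiently solvable even though the unparameterized problem is NP-complete:

```python
def vertex_cover(edges, k):
    """Return a vertex cover of size <= k for the given edge list
    (pairs of vertex names), or None if no such cover exists.
    Bounded search tree of depth k: branch on which endpoint of
    an uncovered edge joins the cover."""
    edges = [e for e in edges if e[0] != e[1]]  # ignore self-loops
    if not edges:
        return set()                 # nothing left to cover
    if k == 0:
        return None                  # an edge remains but budget is spent
    u, v = edges[0]
    # Any cover must contain u or v: try both choices.
    for w in (u, v):
        rest = [e for e in edges if w not in e]
        sub = vertex_cover(rest, k - 1)
        if sub is not None:
            return sub | {w}
    return None

# The path a-b-c-d has a cover of size 2 but none of size 1
print(vertex_cover([('a', 'b'), ('b', 'c'), ('c', 'd')], 2))  # a valid 2-cover
print(vertex_cover([('a', 'b'), ('b', 'c'), ('c', 'd')], 1))  # None
```

The point of the k-slice analysis is visible in the recursion: the exponential blow-up is confined to the parameter k (tree depth), while the dependence on the input size stays polynomial.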