Studies in Logic and the Foundations of Mathematics, Volume 123: Constructivism in Mathematics: An Introduction, Vol. II focuses on various studies in mathematics and logic, including metric spaces, polynomial rings, and Heyting algebras. The publication first takes a look at the topology of metric spaces, algebra, and finite-type arithmetic and theories of operators. Discussions focus on intuitionistic finite-type arithmetic, theories of operators and classes, rings and modules, linear algebra, polynomial rings, fields and local rings, complete separable metric spaces, and located sets. The text then examines proof theory of intuitionistic logic, theory of types and constructive set theory, and choice sequences. The book elaborates on semantical completeness, sheaves, sites, and higher-order logic, and applications of sheaf models. Topics include a derived rule of local continuity, axiom of countable choice, forcing over sites, sheaf models for higher-order logic, and complete Heyting algebras. The publication is a valuable reference for mathematicians and researchers interested in mathematics and logic.
This monograph details several important advances in the area known as the proofs-as-programs paradigm, a set of approaches to developing programs from proofs in constructive logic. It serves the dual purpose of providing a state-of-the-art overview of the field and detailing tools and techniques to stimulate further research. One of the book's central themes is a general, abstract framework for developing new systems of program synthesis by adapting proofs-as-programs to new contexts, which the authors call the Curry-Howard Protocol. This protocol is used to provide two novel applications for industrial-scale, complex software engineering: contractual imperative program synthesis and structured software synthesis. These applications constitute an exemplary justification for the applicability of the protocol to different contexts. The book is intended for graduate students in computer science or mathematics who wish to extend their background in logic and type theory as well as gain experience working with logical frameworks and practical proof systems. In addition, the proofs-as-programs research community, and the wider computational logic, formal methods and software engineering communities, will benefit. The applications given in the book should be of interest to researchers working in the target problem domains.
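As a minimal illustration of the proofs-as-programs (Curry-Howard) idea mentioned in the blurb above, the following Haskell sketch reads two intuitionistic tautologies as types and their constructive proofs as programs; the function names are illustrative only and are not taken from the book.

```haskell
-- Curry-Howard reading: a constructive proof of (A and B) -> (B and A)
-- is exactly the swap function on pairs.
swapProof :: (a, b) -> (b, a)
swapProof (x, y) = (y, x)

-- A proof of A -> ((A -> B) -> B) is function application:
-- given a witness of A and a method turning A into B, produce B.
modusPonens :: a -> (a -> b) -> b
modusPonens x f = f x

main :: IO ()
main = print (swapProof (1 :: Int, "two"), modusPonens (3 :: Int) (+ 1))
```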
This book gives a rigorous yet 'physics-focused' introduction to mathematical logic that is geared towards natural science majors. We present the science major with a robust introduction to logic, focusing on the specific knowledge and skills that will unavoidably be needed in calculus and in natural science topics generally, rather than taking the philosophically oriented, foundations-of-mathematics approach commonly found in mathematical logic textbooks.
This is a two-volume collection presenting the selected works of Herbert Busemann, one of the leading geometers of the twentieth century and one of the main founders of metric geometry, convexity theory and convexity in metric spaces. Busemann also did substantial work on Hilbert's Problem IV, probably the most important contributions to that problem. These collected works include Busemann's most important published articles on these topics. Volume I of the collection features Busemann's papers on the foundations of geodesic spaces and on the metric geometry of Finsler spaces. Volume II includes Busemann's papers on convexity and integral geometry, on Hilbert's Problem IV, and other papers on miscellaneous subjects. Each volume offers biographical documents, material from Busemann's correspondence, and introductory essays on his work written by leading specialists. The volumes are a valuable resource for researchers in synthetic and metric geometry, convexity theory and the foundations of geometry.
Alfred Tarski was one of the two giants of the twentieth-century development of logic, along with Kurt Gödel. The four volumes of this collection contain all of Tarski's published papers and abstracts, as well as a comprehensive bibliography. Here will be found many of the works, spanning the period 1921 through 1979, which are the bedrock of contemporary areas of logic, whether in mathematics or philosophy. These areas include the theory of truth in formalized languages, decision methods and undecidable theories, foundations of geometry, set theory and model theory, algebraic logic, and universal algebra.
Data fusion and information fusion are names that have been primarily attached to military-oriented problems. In military applications, typical data fusion problems are: multisensor, multitarget detection, object identification, tracking, threat assessment, mission assessment and mission planning, among many others. However, it is clear that the basic concepts underlying such fusion procedures can often be used in nonmilitary applications as well. The purpose of this book is twofold: first, to point out present gaps in the way data fusion problems are conceptually treated; second, to address this issue by exhibiting mathematical tools which treat combination of evidence in the presence of uncertainty in a more systematic and comprehensive way. These techniques are based essentially on two novel ideas relating to probability theory: the newly developed fields of random set theory and conditional and relational event algebra. This volume is intended to be both an update on research progress on data fusion and an introduction to potentially powerful new techniques: fuzzy logic, random set theory, and conditional and relational event algebra. Audience: This volume can be used as a reference book for researchers and practitioners in data fusion or expert systems theory, or for graduate students as a text for a research seminar or graduate-level course.
The purpose of the book is to advance the understanding of brain function by defining a general framework for representation based on category theory. The idea is to bring this mathematical formalism into the domain of neural representation of physical spaces, setting the basis for a theory of mental representation able to relate empirical findings and unite them into a sound theoretical corpus. The innovative approach presented in the book provides a horizon of interdisciplinary collaboration that aims to set up a common agenda synthesizing mathematical formalization and empirical procedures in a systemic way. Category theory has been successfully applied to qualitative analysis, mainly in theoretical computer science to deal with programming language semantics. Nevertheless, the potential of category-theoretic tools for quantitative analysis of networks has not been tackled so far. Statistical methods to investigate graph structure typically rely on network parameters. Category theory can be seen as an abstraction of graph theory; thus, new categorical properties can be added to network analysis, and graph-theoretic constructs can accordingly be extended on a more fundamental basis. By generalizing networks using category theory we can address questions and elaborate answers in a more fundamental way without giving up graph-theoretic tools. The vital issue is to establish a new framework for quantitative analysis of networks using the theory of categories, in which computational neuroscientists and network theorists may tackle in more efficient ways the dynamics of brain cognitive networks. The intended audience of the book is researchers who wish to explore the validity of mathematical principles in the understanding of cognitive systems. All the actors in cognitive science - philosophers, engineers, neurobiologists, cognitive psychologists, computer scientists and so on - are likely to discover in its pages new, unforeseen connections through the development of the concepts and formal theories described in the book. Practitioners of both pure and applied mathematics, e.g. network theorists, will be delighted with the mapping of abstract mathematical concepts onto the terra incognita of cognition.
The expression of uncertainty in measurement poses a challenge since it involves physical, mathematical, and philosophical issues. This problem is intensified by the limitations of the probabilistic approach used by the current standard (the GUM Instrumentation Standard). This text presents an alternative approach. It makes full use of the mathematical theory of evidence to express the uncertainty in measurements. Coverage provides an overview of the current standard, then pinpoints and constructively resolves its limitations. Numerous examples throughout help explain the book's unique approach.
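For context on the "mathematical theory of evidence" mentioned above, here is a small, generic Haskell sketch of Dempster's rule of combination over a tiny frame of discernment. It is an assumption-laden illustration of the general technique (hypothetical frame, illustrative masses), not the book's own formulation or notation.

```haskell
import qualified Data.Map.Strict as Map
import qualified Data.Set as Set

type Hyp  = String
type Mass = Map.Map (Set.Set Hyp) Double   -- basic probability assignment on focal sets

-- Dempster's rule of combination: intersect focal elements, multiply masses,
-- and renormalise away the mass that falls on the empty set (the conflict).
combine :: Mass -> Mass -> Mass
combine m1 m2 = Map.map (/ (1 - conflict)) agreed
  where
    products = [ (Set.intersection a b, x * y)
               | (a, x) <- Map.toList m1, (b, y) <- Map.toList m2 ]
    conflict = sum [ p | (s, p) <- products, Set.null s ]
    agreed   = Map.fromListWith (+) [ (s, p) | (s, p) <- products, not (Set.null s) ]

main :: IO ()
main = print (combine m1 m2)
  where
    -- Two illustrative bodies of evidence about a measurement being faulty or ok.
    m1 = Map.fromList [(Set.fromList ["fault"], 0.7), (Set.fromList ["fault", "ok"], 0.3)]
    m2 = Map.fromList [(Set.fromList ["fault"], 0.6), (Set.fromList ["fault", "ok"], 0.4)]
```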
"In case you are considering to adopt this book for courses with over 50 students, please contact ""[email protected]"" for more information. "
The last three chapters of the book provide an introduction to type theory (higher-order logic). It is shown how various mathematical concepts can be formalized in this very expressive formal language. This expressive notation facilitates proofs of the classical incompleteness and undecidability theorems which are very elegant and easy to understand. The discussion of semantics makes clear the important distinction between standard and nonstandard models, which is so important in understanding puzzling phenomena such as the incompleteness theorems and Skolem's Paradox about countable models of set theory. Some of the numerous exercises require giving formal proofs. A computer program called ETPS, which is available from the web, facilitates doing and checking such exercises. Audience: This volume will be of interest to mathematicians, computer scientists, and philosophers in universities, as well as to computer scientists in industry who wish to use higher-order logic for hardware and software specification and verification.
Fuzzy Logic Foundations and Industrial Applications is an organized edited collection of contributed chapters covering basic fuzzy logic theory, fuzzy linear programming, and applications. Special emphasis has been given to coverage of recent research results, and to industrial applications of fuzzy logic. The chapters are new works that have been written exclusively for this book by many of the leading and prominent researchers (such as Ronald Yager, Ellen Hisdal, Etienne Kerre, and others) in this field. The contributions are original and each chapter is self-contained. The authors have been careful to indicate direct links between fuzzy set theory and its industrial applications. Fuzzy Logic Foundations and Industrial Applications is an invaluable work that provides researchers and industrial engineers with up-to-date coverage of new results on fuzzy logic and relates these results to their industrial use.
Fundamentals of Convex Analysis offers an in-depth look at some of the fundamental themes covered within an area of mathematical analysis called convex analysis. In particular, it explores the topics of duality, separation, representation, and resolution. The work is intended for students of economics, management science, engineering, and mathematics who need exposure to the mathematical foundations of matrix games, optimization, and general equilibrium analysis. It is written at the advanced undergraduate to beginning graduate level and the only formal preparation required is some familiarity with set operations and with linear algebra and matrix theory. Fundamentals of Convex Analysis is self-contained in that a brief review of the essentials of these tool areas is provided in Chapter 1. Chapter exercises are also provided. Topics covered include: convex sets and their properties; separation and support theorems; theorems of the alternative; convex cones; dual homogeneous systems; basic solutions and complementary slackness; extreme points and directions; resolution and representation of polyhedra; simplicial topology; and fixed point theorems, among others. A strength of this work is how these topics are developed in a fully integrated fashion.
The series is devoted to the publication of monographs and high-level textbooks in mathematics, mathematical methods and their applications. Apart from covering important areas of current interest, a major aim is to make topics of an interdisciplinary nature accessible to the non-specialist. The works in this series are addressed to advanced students and researchers in mathematics and theoretical physics. In addition, they can serve as a guide for lectures and seminars at the graduate level. The series de Gruyter Studies in Mathematics was founded ca. 35 years ago by the late Professor Heinz Bauer and Professor Peter Gabriel with the aim of establishing a series of monographs and textbooks of high standard, written by scholars with an international reputation, presenting current fields of research in pure and applied mathematics. While the editorial board of the Studies has changed over the years, the aspirations of the Studies are unchanged. In times of rapid growth of mathematical knowledge, carefully written monographs and textbooks written by experts are needed more than ever, not least to pave the way for the next generation of mathematicians. In this sense the editorial board and the publisher of the Studies are devoted to continuing the Studies as a service to the mathematical community. Please submit any book proposals to Niels Jacob. Titles in planning include:
Flavia Smarazzo and Alberto Tesei, Measure Theory: Radon Measures, Young Measures, and Applications to Parabolic Problems (2019)
Elena Cordero and Luigi Rodino, Time-Frequency Analysis of Operators (2019)
Mark M. Meerschaert, Alla Sikorskii, and Mohsen Zayernouri, Stochastic and Computational Models for Fractional Calculus, second edition (2020)
Mariusz Lemanczyk, Ergodic Theory: Spectral Theory, Joinings, and Their Applications (2020)
Marco Abate, Holomorphic Dynamics on Hyperbolic Complex Manifolds (2021)
Miroslava Antic, Joeri Van der Veken, and Luc Vrancken, Differential Geometry of Submanifolds: Submanifolds of Almost Complex Spaces and Almost Product Spaces (2021)
Kai Liu, Ilpo Laine, and Lianzhong Yang, Complex Differential-Difference Equations (2021)
Rajendra Vasant Gurjar, Kayo Masuda, and Masayoshi Miyanishi, Affine Space Fibrations (2022)
The theory of constructive (recursive) models grew out of work of Froehlich, Shepherdson, Mal'tsev, Kuznetsov, Rabin, and Vaught in the 1950s. Within the framework of this theory, algorithmic properties of abstract models are investigated by constructing representations on the set of natural numbers and studying relations between algorithmic and structural properties of these models. This book is a very readable exposition of the modern theory of constructive models and describes methods and approaches developed by representatives of the Siberian school of algebra and logic and some other researchers (in particular, Nerode and his colleagues). The main themes are the existence of recursive models and applications to fields, algebras, and ordered sets (Ershov), the existence of decidable prime models (Goncharov, Harrington), the existence of decidable saturated models (Morley), the existence of decidable homogeneous models (Goncharov and Peretyat'kin), properties of the Ehrenfeucht theories (Millar, Ash, and Reed), the theory of algorithmic dimension and conditions of autostability (Goncharov, Ash, Shore, Khusainov, Ventsov, and others), and the theory of computable classes of models with various properties. Future perspectives of the theory of constructive models are also discussed. Most of the results in the book are presented in monograph form for the first time. The theory of constructive models serves as a basis for recursive mathematics. It is also useful in computer science, in particular, in the study of programming languages, higher-level languages of specification, abstract data types, and problems of synthesis and verification of programs. Therefore, the book will be useful not only for specialists in mathematical logic and the theory of algorithms but also for scientists interested in the mathematical fundamentals of computer science. The authors are eminent specialists in mathematical logic. They have established fundamental results on elementary theories, model theory, the theory of algorithms, field theory, group theory, applied logic, computable numberings, the theory of constructive models, and theoretical computer science.
This book contains leading survey papers on the various aspects of abduction, covering both logical and numerical approaches. Abduction is central to all areas of applied reasoning, including artificial intelligence, philosophy of science, machine learning, data mining and decision theory, as well as logic itself.
Logic and Philosophy of Mathematics in the Early Husserl focuses on the first ten years of Edmund Husserl's work, from the publication of his Philosophy of Arithmetic (1891) to that of his Logical Investigations (1900/01), and aims to precisely locate his early work in the fields of logic, philosophy of logic and philosophy of mathematics. Unlike most phenomenologists, the author refrains from reading Husserl's early work as a more or less immature sketch of claims consolidated only in his later phenomenology, and unlike the majority of historians of logic she emphasizes the systematic strength and the originality of Husserl's logico-mathematical work. The book attempts to reconstruct the discussion between Husserl and those philosophers and mathematicians who contributed to new developments in logic, such as Leibniz, Bolzano, the logical algebraists (especially Boole and Schröder), Frege, and Hilbert and his school. It presents both a comprehensive critical examination of some of the major works produced by Husserl and his antagonists in the last decade of the 19th century and a formal reconstruction of many texts from Husserl's Nachlass that have not yet been the object of systematic scrutiny. This volume will be of particular interest to researchers working in the history, and in the philosophy, of logic and mathematics, and more generally, to analytical philosophers and phenomenologists with a background in standard logic.
This book gives an account of the fundamental results in geometric stability theory, a subject that has grown out of categoricity and classification theory. This approach studies the fine structure of models of stable theories, using the geometry of forking; this often achieves global results relevant to classification theory. Topics range from the Zilber-Cherlin classification of infinite locally finite homogeneous geometries, to regular types, their geometries, and their role in superstable theories. The structure and existence of definable groups is featured prominently, as is work by Hrushovski. The book is unique in the range and depth of material covered and will be invaluable to anyone interested in modern model theory.
Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.
Parameterized complexity theory is a recent branch of computational complexity theory that provides a framework for a refined analysis of hard algorithmic problems. The central notion of the theory, fixed-parameter tractability, has led to the development of various new algorithmic techniques and a whole new theory of intractability. This book is a state-of-the-art introduction to both algorithmic techniques for fixed-parameter tractability and the structural theory of parameterized complexity classes, and it presents detailed proofs of recent advanced results that have not appeared in book form before. Individual chapters are devoted to intractability, to algorithmic techniques for designing fixed-parameter tractable algorithms, and to bounded fixed-parameter tractability and subexponential time complexity. The treatment is comprehensive, and the reader is supported with exercises, notes, a detailed index, and some background on complexity theory and logic. The book will be of interest to computer scientists, mathematicians and graduate students engaged with algorithms and problem complexity.
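To illustrate the kind of fixed-parameter tractable algorithm discussed above, here is a short, generic Haskell sketch of the classic bounded-search-tree algorithm for k-Vertex Cover, whose running time is roughly O(2^k * m) and hence fixed-parameter tractable in the parameter k. It is a standard textbook example of the technique, not code taken from the book.

```haskell
type Edge = (Int, Int)

-- Bounded search tree for k-Vertex Cover: pick any uncovered edge (u, v) and
-- branch on putting u or v into the cover; the recursion depth is at most k.
vertexCover :: Int -> [Edge] -> Maybe [Int]
vertexCover _ [] = Just []                       -- no edges left: the empty cover works
vertexCover k ((u, v) : es)
  | k <= 0    = Nothing                          -- budget exhausted but edges remain
  | otherwise =
      case vertexCover (k - 1) (without u) of    -- branch 1: take u into the cover
        Just c  -> Just (u : c)
        Nothing -> fmap (v :) (vertexCover (k - 1) (without v))  -- branch 2: take v
  where
    without w = [ e | e@(a, b) <- (u, v) : es, a /= w, b /= w ]

main :: IO ()
main = print (vertexCover 2 [(1, 2), (1, 3), (2, 3), (3, 4)])   -- prints Just [1,3]
```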
Our motivation for gathering the material for this book over a period of seven years has been to unify and simplify ideas which appeared in a sizable number of research articles during the past two decades. More specifically, it has been our aim to provide the categorical foundations for extensive work that was published on the epimorphism- and cowellpoweredness problem, predominantly for categories of topological spaces. In doing so we found the categorical notion of closure operators interesting enough to be studied for its own sake, as it unifies and describes other significant mathematical notions and since it leads to a never-ending stream of examples and applications in all areas of mathematics. These are somewhat arbitrarily restricted to topology, algebra and (a small part of) discrete mathematics in this book, although other areas, such as functional analysis, would provide an equally rich and interesting supply of examples. We also had to restrict the themes in our theoretical exposition. In spite of the fact that closure operators generalize the universal closure operations of abelian category theory and of topos- and sheaf theory, we chose to mention these aspects only en passant, in favour of the presentation of new results more closely related to our original intentions. We also needed to refrain from studying topological concepts, such as compactness, in the setting of an arbitrary closure-equipped category, although this topic appears prominently in the published literature involving closure operators.
Fact finding in judicial proceedings is a dynamic process. This collection of papers considers whether computational methods or other formal logical methods developed in disciplines such as artificial intelligence, decision theory, and probability theory can facilitate the study and management of dynamic evidentiary and inferential processes in litigation. The papers gathered here have several epicenters, including (i) the dynamics of judicial proof, (ii) the relationship between artificial intelligence or formal analysis and "common sense," (iii) the logic of factual inference, including (a) the relationship between causality and inference and (b) the relationship between language and factual inference, (iv) the logic of discovery, including the role of abduction and serendipity in the process of investigation and proof of factual matters, and (v) the relationship between decision and inference.
One can distinguish, roughly speaking, two different approaches to the philosophy of mathematics. On the one hand, some philosophers (and some mathematicians) take the nature and the results of mathematicians' activities as given, and go on to ask what philosophical morals one might perhaps find in their story. On the other hand, some philosophers, logicians and mathematicians have tried or are trying to subject the very concepts which mathematicians are using in their work to critical scrutiny. In practice this usually means scrutinizing the logical and linguistic tools mathematicians wield. Such scrutiny can scarcely help relying on philosophical ideas and principles. In other words it can scarcely help being literally a study of language, truth and logic in mathematics, albeit not necessarily in the spirit of A. J. Ayer. As its title indicates, the essays included in the present volume represent the latter approach. In most of them one of the fundamental concepts in the foundations of mathematics and logic is subjected to a scrutiny from a largely novel point of view. Typically, it turns out that the concept in question is in need of a revision or reconsideration or at least can be given a new twist. The results of such a re-examination are not primarily critical, however, but typically open up new constructive possibilities. The consequences of such deconstructions and reconstructions are often quite sweeping, and are explored in the same paper or in others.
A thorough, self-contained and easily accessible treatment of the theory of the polynomial best approximation of functions with respect to maximum norms. The topics include Chebychev theory, Weierstrass theorems, smoothness of functions, and continuation of functions.
One criterion for classifying books is whether they are written for a single purpose or for multiple purposes. This book belongs to the category of multipurpose books, but one of its roles is predominant: it is primarily a textbook. As such, it can be used for a variety of courses at the first-year graduate or upper-division undergraduate level. A common characteristic of these courses is that they cover fundamental systems concepts, major categories of systems problems, and some selected methods for dealing with these problems at a rather general level. A unique feature of the book is that the concepts, problems, and methods are introduced in the context of an architectural formulation of an expert system, referred to as the general systems problem solver or GSPS, whose aim is to provide users of all kinds with computer-based systems knowledge and methodology. The GSPS architecture, which is developed throughout the book, provides a framework that is conducive to a coherent, comprehensive, and pragmatic coverage of systems fundamentals: concepts, problems, and methods. A course that covers systems fundamentals is now offered not only in systems science, information science, or systems engineering programs, but in many programs in other disciplines as well. Although the level of coverage for systems science or engineering students is surely different from that used for students in other disciplines, this book is designed to serve both of these needs.
This undergraduate textbook is intended primarily for a transition course into higher mathematics, although it is written with a broader audience in mind. The heart and soul of this book is problem solving, where each problem is carefully chosen to clarify a concept, demonstrate a technique, or to enthuse. The exercises require relatively extensive arguments, creative approaches, or both, thus providing motivation for the reader. With a unified approach to a diverse collection of topics, this text points out connections, similarities, and differences among subjects whenever possible. This book shows students that mathematics is a vibrant and dynamic human enterprise by including historical perspectives and notes on the giants of mathematics, by mentioning current activity in the mathematical community, and by discussing many famous and less well-known questions that remain open for future mathematicians. Ideally, this text should be used for a two-semester course, where the first course has no prerequisites and the second is a more challenging course for math majors; yet, the flexible structure of the book allows it to be used in a variety of settings, including as a source of various independent-study and research projects.
The analysis and control of complex systems have been the main motivation for the emergence of fuzzy set theory since its inception. It is also a major research field where many applications, especially industrial ones, have made fuzzy logic famous. This unique handbook is devoted to an extensive, organized, and up-to-date presentation of fuzzy systems engineering methods. The book includes detailed material and extensive bibliographies, written by leading experts in the field, on topics such as: use of fuzzy logic in various control systems; fuzzy rule-based modeling and its universal approximation properties; learning and tuning techniques for fuzzy models, using neural networks and genetic algorithms; fuzzy control methods, including issues such as stability analysis and design techniques, as well as the relationship with traditional linear control; and the relation of fuzzy sets to the study of chaotic systems, together with the fuzzy extension of set-valued approaches to systems modeling through the use of differential inclusions. Fuzzy Systems: Modeling and Control is part of The Handbooks of Fuzzy Sets Series. The series provides a complete picture of contemporary fuzzy set theory and its applications. This volume is a key reference for systems engineers and scientists seeking a guide to the vast amount of literature in fuzzy logic modeling and control.