The theory of constructive (recursive) models grew out of the work of Froehlich, Shepherdson, Mal'tsev, Kuznetsov, Rabin, and Vaught in the 1950s. Within the framework of this theory, algorithmic properties of abstract models are investigated by constructing representations on the set of natural numbers and studying relations between algorithmic and structural properties of these models. This book is a very readable exposition of the modern theory of constructive models and describes methods and approaches developed by representatives of the Siberian school of algebra and logic and some other researchers (in particular, Nerode and his colleagues). The main themes are the existence of recursive models and applications to fields, algebras, and ordered sets (Ershov), the existence of decidable prime models (Goncharov, Harrington), the existence of decidable saturated models (Morley), the existence of decidable homogeneous models (Goncharov and Peretyat'kin), properties of the Ehrenfeucht theories (Millar, Ash, and Reed), the theory of algorithmic dimension and conditions of autostability (Goncharov, Ash, Shore, Khusainov, Ventsov, and others), and the theory of computable classes of models with various properties. Future perspectives of the theory of constructive models are also discussed. Most of the results in the book are presented in monograph form for the first time. The theory of constructive models serves as a basis for recursive mathematics. It is also useful in computer science, in particular in the study of programming languages, higher-level specification languages, abstract data types, and problems of synthesis and verification of programs. Therefore, the book will be useful not only for specialists in mathematical logic and the theory of algorithms but also for scientists interested in the mathematical fundamentals of computer science. The authors are eminent specialists in mathematical logic.
They have established fundamental results on elementary theories, model theory, the theory of algorithms, field theory, group theory, applied logic, computable numberings, the theory of constructive models, and theoretical computer science.
This book gives an account of the fundamental results in geometric stability theory, a subject that has grown out of categoricity and classification theory. This approach studies the fine structure of models of stable theories, using the geometry of forking; this often achieves global results relevant to classification theory. Topics range from the Zilber-Cherlin classification of infinite locally finite homogeneous geometries, to regular types, their geometries, and their role in superstable theories. The structure and existence of definable groups is featured prominently, as is work by Hrushovski. The book is unique in the range and depth of material covered and will be invaluable to anyone interested in modern model theory.
Our motivation for gathering the material for this book over a period of seven years has been to unify and simplify ideas which appeared in a sizable number of research articles during the past two decades. More specifically, it has been our aim to provide the categorical foundations for extensive work that was published on the epimorphism- and cowellpoweredness problem, predominantly for categories of topological spaces. In doing so we found the categorical notion of closure operators interesting enough to be studied for its own sake, as it unifies and describes other significant mathematical notions and since it leads to a never-ending stream of examples and applications in all areas of mathematics. These are somewhat arbitrarily restricted to topology, algebra and (a small part of) discrete mathematics in this book, although other areas, such as functional analysis, would provide an equally rich and interesting supply of examples. We also had to restrict the themes in our theoretical exposition. In spite of the fact that closure operators generalize the universal closure operations of abelian category theory and of topos and sheaf theory, we chose to mention these aspects only en passant, in favour of the presentation of new results more closely related to our original intentions. We also needed to refrain from studying topological concepts, such as compactness, in the setting of an arbitrary closure-equipped category, although this topic appears prominently in the published literature involving closure operators.
Anyone involved in the philosophy of science is naturally drawn
into the study of the foundations of probability. Different
interpretations of probability, based on competing philosophical
ideas, lead to different statistical techniques, and frequently to
mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.
Fact finding in judicial proceedings is a dynamic process. This collection of papers considers whether computational methods or other formal logical methods developed in disciplines such as artificial intelligence, decision theory, and probability theory can facilitate the study and management of dynamic evidentiary and inferential processes in litigation. The papers gathered here have several epicenters, including (i) the dynamics of judicial proof, (ii) the relationship between artificial intelligence or formal analysis and "common sense," (iii) the logic of factual inference, including (a) the relationship between causality and inference and (b) the relationship between language and factual inference, (iv) the logic of discovery, including the role of abduction and serendipity in the process of investigation and proof of factual matters, and (v) the relationship between decision and inference.
One can distinguish, roughly speaking, two different approaches to the philosophy of mathematics. On the one hand, some philosophers (and some mathematicians) take the nature and the results of mathematicians' activities as given, and go on to ask what philosophical morals one might perhaps find in their story. On the other hand, some philosophers, logicians and mathematicians have tried or are trying to subject the very concepts which mathematicians are using in their work to critical scrutiny. In practice this usually means scrutinizing the logical and linguistic tools mathematicians wield. Such scrutiny can scarcely help relying on philosophical ideas and principles. In other words it can scarcely help being literally a study of language, truth and logic in mathematics, albeit not necessarily in the spirit of A. J. Ayer. As its title indicates, the essays included in the present volume represent the latter approach. In most of them one of the fundamental concepts in the foundations of mathematics and logic is subjected to a scrutiny from a largely novel point of view. Typically, it turns out that the concept in question is in need of a revision or reconsideration or at least can be given a new twist. The results of such a re-examination are not primarily critical, however, but typically open up new constructive possibilities. The consequences of such deconstructions and reconstructions are often quite sweeping, and are explored in the same paper or in others.
One criterion for classifying books is whether they are written for a single purpose or for multiple purposes. This book belongs to the category of multipurpose books, but one of its roles is predominant: it is primarily a textbook. As such, it can be used for a variety of courses at the first-year graduate or upper-division undergraduate level. A common characteristic of these courses is that they cover fundamental systems concepts, major categories of systems problems, and some selected methods for dealing with these problems at a rather general level. A unique feature of the book is that the concepts, problems, and methods are introduced in the context of an architectural formulation of an expert system, referred to as the general systems problem solver or GSPS, whose aim is to provide users of all kinds with computer-based systems knowledge and methodology. The GSPS architecture, which is developed throughout the book, provides a framework that is conducive to a coherent, comprehensive, and pragmatic coverage of systems fundamentals: concepts, problems, and methods. A course that covers systems fundamentals is now offered not only in systems science, information science, or systems engineering programs, but in many programs in other disciplines as well. Although the level of coverage for systems science or engineering students is surely different from that used for students in other disciplines, this book is designed to serve both of these needs.
The analysis and control of complex systems have been the main motivation for the emergence of fuzzy set theory since its inception. It is also a major research field where many applications, especially industrial ones, have made fuzzy logic famous. This unique handbook is devoted to an extensive, organized, and up-to-date presentation of fuzzy systems engineering methods. The book includes detailed material and extensive bibliographies, written by leading experts in the field, on topics such as: Use of fuzzy logic in various control systems. Fuzzy rule-based modeling and its universal approximation properties. Learning and tuning techniques for fuzzy models, using neural networks and genetic algorithms. Fuzzy control methods, including issues such as stability analysis and design techniques, as well as the relationship with traditional linear control. Fuzzy sets' relation to the study of chaotic systems, and the fuzzy extension of set-valued approaches to systems modeling through the use of differential inclusions. Fuzzy Systems: Modeling and Control is part of The Handbooks of Fuzzy Sets Series. The series provides a complete picture of contemporary fuzzy set theory and its applications. This volume is a key reference for systems engineers and scientists seeking a guide to the vast amount of literature in fuzzy logic modeling and control.
Parameterized complexity theory is a recent branch of computational complexity theory that provides a framework for a refined analysis of hard algorithmic problems. The central notion of the theory, fixed-parameter tractability, has led to the development of various new algorithmic techniques and a whole new theory of intractability. This book is a state-of-the-art introduction to both algorithmic techniques for fixed-parameter tractability and the structural theory of parameterized complexity classes, and it presents detailed proofs of recent advanced results that have not appeared in book form before. Individual chapters are devoted to intractability, algorithmic techniques for designing fixed-parameter tractable algorithms, and bounded fixed-parameter tractability and subexponential time complexity. The treatment is comprehensive, and the reader is supported with exercises, notes, a detailed index, and some background on complexity theory and logic. The book will be of interest to computer scientists, mathematicians and graduate students engaged with algorithms and problem complexity.
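The central notion named here, fixed-parameter tractability, can be illustrated with the textbook bounded-search-tree algorithm for Vertex Cover parameterized by solution size k. This is a generic sketch of a standard technique, not code taken from the book; the function name and sample graph are illustrative.

```python
def vertex_cover(edges, k):
    """Decide whether the graph has a vertex cover of size <= k.

    Classic bounded-search-tree FPT algorithm: pick any uncovered
    edge (u, v); at least one endpoint must be in the cover, so
    branch on both choices.  Runtime is O(2^k * m): exponential
    only in the parameter k, not in the input size.
    """
    if not edges:
        return True          # every edge is covered
    if k == 0:
        return False         # edges remain but budget is exhausted
    u, v = edges[0]
    # Branch 1: put u in the cover; Branch 2: put v in the cover.
    rest_u = [e for e in edges if u not in e]
    rest_v = [e for e in edges if v not in e]
    return vertex_cover(rest_u, k - 1) or vertex_cover(rest_v, k - 1)

# Triangle a-b-c plus a pendant edge c-d: {a, c} is a cover of size 2,
# and no single vertex covers all four edges.
edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]
```

The point of the parameterized view is visible in the bound: for fixed small k the algorithm is fast even on large graphs, although Vertex Cover is NP-complete in general.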
This is a continuation of Vol. 7 of Trends in Logic. It will cover the wealth of recent developments in Lukasiewicz Logic and its algebras (Chang MV-algebras), with particular reference to (de Finetti) coherent evaluation of continuously valued events, (Renyi) conditionals for such events, and related algorithms.
Recent major advances in model theory include connections between model theory and Diophantine and real analytic geometry, permutation groups, and finite algebras. The present book contains lectures on recent results in algebraic model theory, covering topics from the following areas: geometric model theory, the model theory of analytic structures, permutation groups in model theory, the spectra of countable theories, and the structure of finite algebras. Audience: Graduate students in logic and others wishing to keep abreast of current trends in model theory. The lectures contain sufficient introductory material for readers to grasp the recent results presented.
On the history of the book: In the early 1990s several new methods and perspectives in automated deduction emerged. We just mention the superposition calculus, meta-term inference and schematization, deductive decision procedures, and automated model building. It was this last field which brought the authors of this book together. In 1994 they met at the Conference on Automated Deduction (CADE-12) in Nancy and agreed upon the general point of view that semantics and, in particular, the construction of models should play a central role in the field of automated deduction. In the following years the deduction groups of the laboratory LEIBNIZ at IMAG Grenoble and the University of Technology in Vienna organized several bilateral projects promoting this topic. This book emerged as a main result of this cooperation. The authors are aware of the fact that the book does not cover all relevant methods of automated model building (also called model construction or model generation); instead the book focuses on deduction-based symbolic methods for the construction of Herbrand models developed in the last 12 years. Other methods of automated model building, in particular finite model building, are mainly treated in the final chapter; this chapter is less formal and detailed but gives a broader view on the topic and a comparison of different approaches. How to read this book: In the introduction we give an overview of automated deduction in a historical context, taking into account its relationship with the human views on formal and informal proofs.
Decision makers in managerial and public organizations often encounter decision problems under conflict or competition, because they select strategies independently or by mutual agreement and their payoffs are then affected by the strategies of the other decision makers. Their interests do not always coincide and are at times even completely opposed. Competition or partial cooperation among decision makers should be considered as an essential part of the problem when we deal with decision-making problems in organizations which consist of decision makers with conflicting interests. Game theory has been dealing with such problems, and its techniques have been used as powerful analytical tools in the resolution process of such decision problems. The publication of the great work by J. von Neumann and O. Morgenstern in 1944 attracted the attention of many people and laid the foundation of game theory. We can see remarkable advances in the field of game theory for the analysis of economic situations, and a number of books in the field have been published in recent years. The aim of game theory is to specify the behavior of each player so as to optimize the interests of the player. It then recommends a set of solutions as strategies so that the actions chosen by each decision maker (player) lead to an outcome most profitable for himself or herself.
This volume is both a tribute to Ulrich Felgner's research in algebra, logic, and set theory and a strong research contribution to these areas. Felgner's former students, friends and collaborators have contributed sixteen papers to this volume that highlight the unity of these three fields in the spirit of Ulrich Felgner's own research. The interested reader will find excellent original research surveys and papers that span the field from set theory without the axiom of choice via model-theoretic algebra to the mathematics of intonation.
This volume tackles Goedel's two-stage project of first using Husserl's transcendental phenomenology to reconstruct and develop Leibniz' monadology, and then founding classical mathematics on the metaphysics thus obtained. The author analyses the historical and systematic aspects of that project, and then evaluates it, with an emphasis on the second stage. The book is organised around Goedel's use of Leibniz, Husserl and Brouwer. Far from considering past philosophers irrelevant to actual systematic concerns, Goedel embraced the use of historical authors to frame his own philosophical perspective. The philosophies of Leibniz and Husserl define his project, while Brouwer's intuitionism is its principal foil: the close affinities between phenomenology and intuitionism set the bar for Goedel's attempt to go far beyond intuitionism. The four central essays are `Monads and sets', `On the philosophical development of Kurt Goedel', `Goedel and intuitionism', and `Construction and constitution in mathematics'. The first analyses and criticises Goedel's attempt to justify, by an argument from analogy with the monadology, the reflection principle in set theory. It also provides further support for Goedel's idea that the monadology needs to be reconstructed phenomenologically, by showing that the unsupplemented monadology is not able to found mathematics directly. The second studies Goedel's reading of Husserl, its relation to Leibniz' monadology, and its influence on his published writings. The third discusses how on various occasions Brouwer's intuitionism actually inspired Goedel's work, in particular the Dialectica Interpretation. The fourth addresses the question whether classical mathematics admits of the phenomenological foundation that Goedel envisaged, and concludes that it does not. The remaining essays provide further context. The essays collected here were written and published over the last decade.
Notes have been added to record further thoughts, changes of mind, connections between the essays, and updates of references.
Two prisoners are told that they will be brought to a room and seated so that each can see the other. Hats will be placed on their heads; each hat is either red or green. The two prisoners must simultaneously submit a guess of their own hat color, and they both go free if at least one of them guesses correctly. While no communication is allowed once the hats have been placed, they will, however, be allowed to have a strategy session before being brought to the room. Is there a strategy ensuring their release? The answer turns out to be yes, and this is the simplest non-trivial example of a hat problem. This book deals with the question of how successfully one can predict the value of an arbitrary function at one or more points of its domain based on some knowledge of its values at other points. Topics range from hat problems that are accessible to everyone willing to think hard, to some advanced topics in set theory and infinitary combinatorics. For example, there is a method of predicting the value f(a) of a function f mapping the reals to the reals, based only on knowledge of f's values on the open interval (a - 1, a), and for every such function the prediction is incorrect only on a countable set that is nowhere dense. The monograph progresses from topics requiring fewer prerequisites to those requiring more, with most of the text being accessible to any graduate student in mathematics. The broad range of readership includes researchers, postdocs, and graduate students in the fields of set theory, mathematical logic, and combinatorics. The hope is that this book will bring together mathematicians from different areas to think about set theory via a very broad array of coordinated inference problems.
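The winning strategy for the two-prisoner puzzle can be checked exhaustively: one prisoner bets the two hats match, the other bets they differ, so exactly one of them is always right. A minimal Python verification (the function names are ours, not from the book):

```python
from itertools import product

def prisoner_a(other_hat):
    # A bets the hats match: guess the color A sees on B.
    return other_hat

def prisoner_b(other_hat):
    # B bets the hats differ: guess the opposite of A's color.
    return "green" if other_hat == "red" else "red"

# Exhaustively check all four hat assignments.
for a_hat, b_hat in product(["red", "green"], repeat=2):
    a_guess = prisoner_a(b_hat)   # A sees B's hat
    b_guess = prisoner_b(a_hat)   # B sees A's hat
    assert a_guess == a_hat or b_guess == b_hat
```

A is correct exactly when the hats match and B exactly when they differ, so in every assignment precisely one guess is right, which is all the puzzle requires.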
Assessing the degree to which two objects, an object and a query, or two concepts are similar or compatible is a fundamental component of human reasoning and consequently is critical in the development of automated diagnosis, classification, information retrieval and decision systems. The assessment of similarity has played an important role in disciplines as diverse as taxonomy, psychology, and the social sciences. Each discipline has proposed methods for quantifying similarity judgments suitable for its particular applications. This book presents a unified approach to quantifying similarity and compatibility within the framework of fuzzy set theory and examines the primary importance of these concepts in approximate reasoning. Examples of the application of similarity measures in various areas including expert systems, information retrieval, and intelligent database systems are provided.
Henkin-Keisler models emanate from a modification of the Henkin construction introduced by Keisler to motivate the definition of ultraproducts. Keisler modified the Henkin construction at that point at which `new' individual constants are introduced and did so in a way that illuminates a connection between Henkin-Keisler models and ultraproducts. The resulting construction can be viewed both as a specialization of the Henkin construction and as an alternative to the ultraproduct construction. These aspects of the Henkin-Keisler construction are utilized here to present a perspective on ultraproducts and their applications accessible to the reader familiar with Henkin's proof of the completeness of first order logic and naive set theory. This approach culminates in proofs of various forms of the Keisler-Shelah characterizations of elementary equivalence and elementary classes via Henkin-Keisler models. The presentation is self-contained and proofs of more advanced results from set theory are introduced as needed. Audience: Logicians in philosophy, computer science, linguistics and mathematics.
A practical introduction to the development of proofs and certified programs using Coq. An invaluable tool for researchers, students, and engineers interested in formal methods and the development of zero-fault software.
The theory of quasivarieties constitutes an independent direction in algebra and mathematical logic and specializes in a fragment of first-order logic, the so-called universal Horn logic. This treatise uniformly presents the principal directions of the theory from an effective algebraic approach developed by the author himself. A revolutionary exposition, this influential text contains a number of results never before published in book form, featuring in-depth commentary for applications of quasivarieties to graphs, convex geometries, and formal languages. Key features include coverage of the Birkhoff-Mal'tsev problem on the structure of lattices of quasivarieties, helpful exercises, and an extensive list of references.
This book contains the lectures given at the NATO ASI 910820 "Cellular Automata and Cooperative Systems" Meeting which was held at the Centre de Physique des Houches, France, from June 22 to July 2, 1992. This workshop brought together mathematical physicists, theoretical physicists and mathematicians working in fields related to local interacting systems, cellular and probabilistic automata, statistical physics, and complexity theory, as well as applications of these fields. We would like to thank our sponsors and supporters whose interest and help was essential for the success of the meeting: the NATO Scientific Affairs Division, the DRET (Direction des Recherches, Etudes et Techniques), the Ministere des Affaires Etrangeres, and the National Science Foundation. We would also like to thank all the secretaries who helped us during the preparation of the meeting, in particular Maryse Cohen-Solal (CPT, Marseille) and Janice Nowinski (Courant Institute, New York). We are grateful for the fine work of Mrs. Gladys Cavallone in preparing this volume.
This volume presents the results of approximately 15 years of work from researchers around the world on the use of fuzzy set theory to represent imprecision in databases. The maturity of the research in the discipline and the recent developments in commercial/industrial fuzzy databases provided an opportunity to produce this survey. Fuzzy Databases: Principles and Applications is self-contained, providing background material on fuzzy sets and database theory. It is comprehensive, covering all of the major approaches and models of fuzzy databases that have been developed, including coverage of commercial/industrial systems and applications. Background and introductory material are provided in the first two chapters. The major approaches in fuzzy databases comprise the second part of the volume. This includes the use of similarity and proximity measures as the fuzzy techniques used to extend relational data modeling, and the use of possibility theory approaches in the relational model. Coverage includes extensions to the data model, querying approaches, functional dependencies, and other topics including implementation issues, information measures, database security, alternative fuzzy data models, the IFO model, and network data models. A number of object-oriented extensions are also discussed. The use of fuzzy data modeling in geographical information systems (GIS) and the use of rough sets in rough and fuzzy rough relational data models are presented. Major emphasis has been given to applications and commercialization of fuzzy databases. Several specific industrial/commercial products and applications are described. These include approaches to developing fuzzy front-end systems and special-purpose systems incorporating fuzziness.
A partially ordered group is an algebraic object having the structure of a group and the structure of a partially ordered set which are connected in some natural way. These connections were established in the period between the end of the 19th and the beginning of the 20th century. It was realized that ordered algebraic systems occur in various branches of mathematics bound up with its fundamentals. For example, the classification of infinitesimals resulted in the discovery of non-archimedean ordered algebraic systems, the formalization of the notion of real number led to the definition of ordered groups and ordered fields, and the construction of non-archimedean geometries brought about the investigation of non-archimedean ordered groups and fields. The theory of partially ordered groups was developed by R. Dedekind, O. Hoelder, D. Hilbert, B. Neumann, A. I. Mal'cev, P. Hall, and G. Birkhoff. These connections between partial order and group operations allow us to investigate the properties of partially ordered groups. For example, partially ordered groups with the interpolation property were introduced in F. Riesz's fundamental paper [1] as a key to his investigations of partially ordered real vector spaces, and the study of ordered vector spaces with interpolation properties has been continued by many functional analysts since. The deepest and most developed part of the theory of partially ordered groups is the theory of lattice-ordered groups. In the 40s, following the publications of the works by G. Birkhoff, H. Nakano and P.
You may like...
The High School Arithmetic - for Use in…
W. H. Ballard, A. C. McKay, …
Hardcover
R981
Discovery Miles 9 810
Logic from Russell to Church, Volume 5
Dov M. Gabbay, John Woods
Hardcover
R5,271
Discovery Miles 52 710
Emerging Applications of Fuzzy Algebraic…
Chiranjibe Jana, Tapan Senapati, …
Hardcover
R7,752
Discovery Miles 77 520