This book is the final version of a course on algorithmic information theory and the epistemology of mathematics and physics. It discusses Einstein's and Goedel's views on the nature of mathematics in the light of information theory, and sustains the thesis that mathematics is quasi-empirical. There is a foreword by Cris Calude of the University of Auckland, and supplementary material is available at the author's web site. The special feature of this book is that it presents a new "hands-on" didactic approach using LISP and Mathematica software. The reader will be able to derive an understanding of the close relationship between mathematics and physics. "The Limits of Mathematics is a very personal and idiosyncratic account of Greg Chaitin's entire career in developing algorithmic information theory. The combination of the edited transcripts of his three introductory lectures maintains all the energy and content of the oral presentations, while the material on AIT itself gives a full explanation of how to implement Greg's ideas on real computers for those who want to try their hand at furthering the theory." (John Casti, Santa Fe Institute)
Goal Directed Proof Theory presents a uniform and coherent methodology for automated deduction in non-classical logics, the relevance of which to computer science is now widely acknowledged. The methodology is based on goal-directed provability. It is a generalization of the logic programming style of deduction, and it is particularly favourable for proof search. The methodology is applied for the first time in a uniform way to a wide range of non-classical systems, covering intuitionistic, intermediate, modal and substructural logics. The book can also be used as an introduction to these logical systems from a procedural perspective. Readership: Computer scientists, mathematicians and philosophers, and anyone interested in the automation of reasoning based on non-classical logics. The book is suitable for self-study, its only prerequisite being some elementary knowledge of logic and proof theory.
Alfred Tarski was one of the two giants of the twentieth-century development of logic, along with Kurt Goedel. The four volumes of this collection contain all of Tarski's published papers and abstracts, as well as a comprehensive bibliography. Here will be found many of the works, spanning the period 1921 through 1979, which are the bedrock of contemporary areas of logic, whether in mathematics or philosophy. These areas include the theory of truth in formalized languages, decision methods and undecidable theories, foundations of geometry, set theory, and model theory, algebraic logic, and universal algebra.
Logic and Complexity looks at basic logic as it is used in Computer Science, and provides students with a logical approach to Complexity theory. With plenty of exercises, this book presents classical notions of mathematical logic, such as decidability, completeness and incompleteness, as well as new ideas brought by complexity theory such as NP-completeness, randomness and approximations, providing a better understanding of efficient algorithmic solutions to problems. Divided into three parts, it covers: - Model Theory and Recursive Functions - introducing the basic model theory of propositional logic, first-order logic, inductive definitions and second-order logic. Recursive functions, Turing computability and decidability are also examined. - Descriptive Complexity - looking at the relationship between definitions of problems, queries, properties of programs and their computational complexity. - Approximation - explaining how some optimization problems and counting problems can be approximated according to their logical form. Logic is important in Computer Science, particularly for verification problems and database query languages such as SQL. Students and researchers in this field will find this book of great interest.
Since 1990 the German Research Society (Deutsche Forschungsgemeinschaft, DFG) has been funding PhD courses (Graduiertenkollegs) at selected universities in the Federal Republic of Germany. TU Berlin was one of the first universities to join that new DFG funding program. The PhD courses have been funded over a period of 9 years, and the grant for the nine years sums to approximately 5 million DM. Our Graduiertenkolleg on Communication-based Systems has been assigned to the Computer Science Department of TU Berlin, although it is a joint effort of all three universities in Berlin: Technische Universität (TU), Freie Universität (FU), and Humboldt-Universität (HU). The Graduiertenkolleg started its program in October 1991. The professors responsible for the program are: Hartmut Ehrig (TU), Günter Hommel (TU), Stefan Jähnichen (TU), Peter Löhr (FU), Miroslaw Malek (HU), Peter Pepper (TU), Radu Popescu-Zeletin (TU), Herbert Weber (TU), and Adam Wolisz (TU). The Graduiertenkolleg is a PhD program for highly qualified persons in the field of computer science. Twenty scholarships have been granted to fellows of the Graduiertenkolleg for a maximal period of three years. During this time the fellows take part in a selected educational program and work on their PhD theses.
2. The Algorithm . . . 59
3. Convergence Analysis . . . 60
4. Complexity Analysis . . . 63
5. Conclusions . . . 67
References . . . 67

A Simple Proof for a Result of Ollerenshaw on Steiner Trees . . . 68
Xiufeng Du, Ding-Zhu Du, Biao Gao, and Lixue Qi
1. Introduction . . . 68
2. In the Euclidean Plane . . . 69
3. In the Rectilinear Plane . . . 70
4. Discussion . . . 71
References . . . 71

Optimization Algorithms for the Satisfiability (SAT) Problem . . . 72
Jun Gu
1. Introduction . . . 72
2. A Classification of SAT Algorithms . . . 73
3. Preliminaries . . .
4. Complete Algorithms and Incomplete Algorithms . . . 81
5. Optimization: An Iterative Refinement Process . . . 86
6. Local Search Algorithms for SAT . . . 89
7. Global Optimization Algorithms for SAT Problem . . . 106
8. Applications . . . 137
9. Future Work . . . 140
10. Conclusions . . . 141
References . . . 143

Ergodic Convergence in Proximal Point Algorithms with Bregman Functions . . . 155
Osman Güler
1. Introduction . . . 155
2. Convergence for Function Minimization . . . 158
3. Convergence for Arbitrary Maximal Monotone Operators . . .
The book presents in a mathematically clear way the fundamentals of algorithmic information theory and a few selected applications. This second edition presents new and important results obtained in recent years: the characterization of computably enumerable random reals, the construction of an Omega Number for which ZFC cannot determine any digits, and the first successful attempt to compute the exact values of 64 bits of a specific Omega Number. Finally, the book contains a discussion of some interesting philosophical questions related to randomness and mathematical knowledge. "Professor Calude has produced a first-rate exposition of up-to-date work in information and randomness." D.S. Bridges, Canterbury University, co-author, with Errett Bishop, of Constructive Analysis "The second edition of this classic work is highly recommended to anyone interested in algorithmic information and randomness." G.J. Chaitin, IBM Research Division, New York, author of Conversations with a Mathematician "This book is a must for a comprehensive introduction to algorithmic information theory and for anyone interested in its applications in the natural sciences." K. Svozil, Technical University of Vienna, author of Randomness & Undecidability in Physics
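For readers unfamiliar with the Omega Number mentioned above, it is Chaitin's halting probability. The standard definition (not specific to this book) for a prefix-free universal machine U is:

```latex
\Omega_U \;=\; \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|}
```

Here |p| is the length in bits of the program p. Omega is algorithmically random, which is why knowing even finitely many of its bits, such as the 64 bits mentioned in the result above, is a significant achievement.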
This monograph details several important advances in the area known as the proofs-as-programs paradigm, a set of approaches to developing programs from proofs in constructive logic. It serves the dual purpose of providing a state-of-the-art overview of the field and detailing tools and techniques to stimulate further research. One of the book's central themes is a general, abstract framework for developing new systems of program synthesis by adapting proofs-as-programs to new contexts, which the authors call the Curry-Howard Protocol. This protocol is used to provide two novel applications for industrial-scale, complex software engineering: contractual imperative program synthesis and structured software synthesis. These applications constitute an exemplary justification for the applicability of the protocol to different contexts. The book is intended for graduate students in computer science or mathematics who wish to extend their background in logic and type theory as well as gain experience working with logical frameworks and practical proof systems. In addition, the proofs-as-programs research community, and the wider computational logic, formal methods and software engineering communities will benefit. The applications given in the book should be of interest for researchers working in the target problem domains.
As is known, the book "Multivariate Spline Functions and Their Applications" was published by Science Press in 1994. This book is an English edition based on the original book mentioned above, with many changes, including that of the structure of cubic interpolation in n-dimensional spline spaces; more detail on triangulations has also been added. Special cases of multivariate spline functions (such as step functions, polygonal functions, and piecewise polynomials) have been examined mathematically for a long time. I. J. Schoenberg (Contributions to the problem of approximation of equidistant data by analytic functions, Quart. Appl. Math., 4(1946), 45-99; 112-141) and W. Quade & L. Collatz (Zur Interpolationstheorie der reellen periodischen Funktionen, Preuss. Akad. Wiss. (Phys.-Math. Kl.), 30(1938), 383-429) systematically established the theory of spline functions. W. Quade & L. Collatz mainly discussed periodic functions, while I. J. Schoenberg's work was systematic and complete. I. J. Schoenberg outlined three viewpoints for studying univariate splines: Fourier transformations, truncated polynomials, and Taylor expansions. Based on the first two viewpoints, I. J. Schoenberg deduced the B-spline function and its basic properties, especially the basis functions. Based on the latter viewpoint, he represented the spline functions in terms of truncated polynomials. These viewpoints and methods significantly affected the development of spline functions.
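To illustrate the truncated-polynomial viewpoint mentioned above: a univariate spline of degree k with knots t_1 < ... < t_m admits the standard representation (a classical fact, not a formula taken from this book)

```latex
s(x) \;=\; \sum_{i=0}^{k} a_i x^i \;+\; \sum_{j=1}^{m} b_j \,(x - t_j)_+^{k},
\qquad (x - t_j)_+ = \max(x - t_j,\ 0),
```

so that s is a polynomial of degree at most k on each knot interval, with k-1 continuous derivatives across each knot.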
This essential companion volume to Chaitin's highly successful "The Limits of Mathematics", also published by Springer, gives a brilliant historical survey of the work of this century on the foundations of mathematics, in which the author was a major participant. The Unknowable is a very readable and concrete introduction to Chaitin's ideas, and it includes a detailed explanation of the programming language used by Chaitin in both volumes. It will enable computer users to interact with the author's proofs and discover for themselves how they work. The software for The Unknowable can be downloaded from the author's Web site.
Studies in Logic and the Foundations of Mathematics, Volume 123: Constructivism in Mathematics: An Introduction, Vol. II focuses on various studies in mathematics and logic, including metric spaces, polynomial rings, and Heyting algebras. The publication first takes a look at the topology of metric spaces, algebra, and finite-type arithmetic and theories of operators. Discussions focus on intuitionistic finite-type arithmetic, theories of operators and classes, rings and modules, linear algebra, polynomial rings, fields and local rings, complete separable metric spaces, and located sets. The text then examines proof theory of intuitionistic logic, theory of types and constructive set theory, and choice sequences. The book elaborates on semantical completeness, sheaves, sites, and higher-order logic, and applications of sheaf models. Topics include a derived rule of local continuity, axiom of countable choice, forcing over sites, sheaf models for higher-order logic, and complete Heyting algebras. The publication is a valuable reference for mathematicians and researchers interested in mathematics and logic.
This book gives a rigorous yet 'physics-focused' introduction to mathematical logic geared towards natural science majors. We present the science major with a robust introduction to logic, focusing on the specific knowledge and skills that will unavoidably be needed in calculus and in the natural sciences in general (rather than taking the philosophically oriented, foundations-of-mathematics approach commonly found in mathematical logic textbooks).
Unique selling point:
* Industry-standard book for merchants, banks, and consulting firms looking to learn more about PCI DSS compliance.
Core audience:
* Retailers (both physical and electronic), firms that handle credit or debit cards (such as merchant banks and processors), and firms that deliver PCI DSS products and services.
Place in the market:
* Currently there are no PCI DSS 4.0 books.
If you are considering adopting this book for courses with over 50 students, please contact [email protected] for more information.
The last three chapters of the book provide an introduction to type theory (higher-order logic). It is shown how various mathematical concepts can be formalized in this very expressive formal language. This expressive notation facilitates proofs of the classical incompleteness and undecidability theorems which are very elegant and easy to understand. The discussion of semantics makes clear the important distinction between standard and nonstandard models, which is so important in understanding puzzling phenomena such as the incompleteness theorems and Skolem's paradox about countable models of set theory. Some of the numerous exercises require giving formal proofs. A computer program called ETPS, which is available from the web, facilitates doing and checking such exercises. Audience: This volume will be of interest to mathematicians, computer scientists, and philosophers in universities, as well as to computer scientists in industry who wish to use higher-order logic for hardware and software specification and verification.
Data fusion or information fusion are names which have been primarily assigned to military-oriented problems. In military applications, typical data fusion problems are: multisensor, multitarget detection, object identification, tracking, threat assessment, mission assessment and mission planning, among many others. However, it is clear that the basic concepts underlying such fusion procedures can often be used in nonmilitary applications as well. The purpose of this book is twofold: first, to point out present gaps in the way data fusion problems are conceptually treated; second, to address this issue by exhibiting mathematical tools which treat combination of evidence in the presence of uncertainty in a more systematic and comprehensive way. These techniques are based essentially on two novel ideas relating to probability theory: the newly developed fields of random set theory and conditional and relational event algebra. This volume is intended to be both an update on research progress on data fusion and an introduction to potentially powerful new techniques: fuzzy logic, random set theory, and conditional and relational event algebra. Audience: This volume can be used as a reference book for researchers and practitioners in data fusion or expert systems theory, or for graduate students as text for a research seminar or graduate-level course.
The expression of uncertainty in measurement poses a challenge since it involves physical, mathematical, and philosophical issues. This problem is intensified by the limitations of the probabilistic approach used by the current standard (the GUM Instrumentation Standard). This text presents an alternative approach. It makes full use of the mathematical theory of evidence to express the uncertainty in measurements. Coverage provides an overview of the current standard, then pinpoints and constructively resolves its limitations. Numerous examples throughout help explain the book's unique approach.
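As background for the "mathematical theory of evidence" referred to here: in Dempster-Shafer theory, a basic probability assignment m on a frame of discernment Θ induces lower and upper bounds on the support for each proposition A. The standard definitions (not taken from this book) are:

```latex
\mathrm{Bel}(A) \;=\; \sum_{\emptyset \neq B \subseteq A} m(B),
\qquad
\mathrm{Pl}(A) \;=\; \sum_{B \cap A \neq \emptyset} m(B),
\qquad
\mathrm{Bel}(A) \;\le\; \mathrm{Pl}(A).
```

The interval [Bel(A), Pl(A)] is what allows evidence theory to represent partial ignorance that a single probability value cannot.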
Fuzzy Logic Foundations and Industrial Applications is an organized edited collection of contributed chapters covering basic fuzzy logic theory, fuzzy linear programming, and applications. Special emphasis has been given to coverage of recent research results, and to industrial applications of fuzzy logic. The chapters are new works that have been written exclusively for this book by many of the leading and prominent researchers (such as Ronald Yager, Ellen Hisdal, Etienne Kerre, and others) in this field. The contributions are original and each chapter is self-contained. The authors have been careful to indicate direct links between fuzzy set theory and its industrial applications. Fuzzy Logic Foundations and Industrial Applications is an invaluable work that provides researchers and industrial engineers with up-to-date coverage of new results on fuzzy logic and relates these results to their industrial use.
This book contains leading survey papers on the various aspects of Abduction, both logical and numerical approaches. Abduction is central to all areas of applied reasoning, including artificial intelligence, philosophy of science, machine learning, data mining and decision theory, as well as logic itself.
The purpose of the book is to advance the understanding of brain function by defining a general framework for representation based on category theory. The idea is to bring this mathematical formalism into the domain of neural representation of physical spaces, setting the basis for a theory of mental representation able to relate empirical findings and unite them into a sound theoretical corpus. The innovative approach presented in the book provides a horizon of interdisciplinary collaboration that aims to set up a common agenda that synthesizes mathematical formalization and empirical procedures in a systemic way. Category theory has been successfully applied to qualitative analysis, mainly in theoretical computer science to deal with programming language semantics. Nevertheless, the potential of category-theoretic tools for quantitative analysis of networks has not been tackled so far. Statistical methods to investigate graph structure typically rely on network parameters. Category theory can be seen as an abstraction of graph theory. Thus, new categorical properties can be added into network analysis, and graph-theoretic constructs can accordingly be extended on a more fundamental basis. By generalizing networks using category theory we can address questions and elaborate answers in a more fundamental way without forgoing graph-theoretic tools. The vital issue is to establish a new framework for quantitative analysis of networks using the theory of categories, in which computational neuroscientists and network theorists may tackle in more efficient ways the dynamics of brain cognitive networks. The intended audience of the book is researchers who wish to explore the validity of mathematical principles in the understanding of cognitive systems. All the actors in cognitive science (philosophers, engineers, neurobiologists, cognitive psychologists, computer scientists, etc.)
are likely to discover in its pages unforeseen connections through the development of the concepts and formal theories described in the book. Practitioners of both pure and applied mathematics, e.g. network theorists, will be delighted by the mapping of abstract mathematical concepts onto the terra incognita of cognition.
Logic and Philosophy of Mathematics in the Early Husserl focuses on the first ten years of Edmund Husserl's work, from the publication of his Philosophy of Arithmetic (1891) to that of his Logical Investigations (1900/01), and aims to precisely locate his early work in the fields of logic, philosophy of logic and philosophy of mathematics. Unlike most phenomenologists, the author refrains from reading Husserl's early work as a more or less immature sketch of claims consolidated only in his later phenomenology, and unlike the majority of historians of logic she emphasizes the systematic strength and the originality of Husserl's logico-mathematical work. The book attempts to reconstruct the discussion between Husserl and those philosophers and mathematicians who contributed to new developments in logic, such as Leibniz, Bolzano, the logical algebraists (especially Boole and Schröder), Frege, and Hilbert and his school. It presents both a comprehensive critical examination of some of the major works produced by Husserl and his antagonists in the last decade of the 19th century and a formal reconstruction of many texts from Husserl's Nachlass that have not yet been the object of systematic scrutiny. This volume will be of particular interest to researchers working in the history, and in the philosophy, of logic and mathematics, and more generally, to analytical philosophers and phenomenologists with a background in standard logic.
Fundamentals of Convex Analysis offers an in-depth look at some of the fundamental themes covered within an area of mathematical analysis called convex analysis. In particular, it explores the topics of duality, separation, representation, and resolution. The work is intended for students of economics, management science, engineering, and mathematics who need exposure to the mathematical foundations of matrix games, optimization, and general equilibrium analysis. It is written at the advanced undergraduate to beginning graduate level and the only formal preparation required is some familiarity with set operations and with linear algebra and matrix theory. Fundamentals of Convex Analysis is self-contained in that a brief review of the essentials of these tool areas is provided in Chapter 1. Chapter exercises are also provided. Topics covered include: convex sets and their properties; separation and support theorems; theorems of the alternative; convex cones; dual homogeneous systems; basic solutions and complementary slackness; extreme points and directions; resolution and representation of polyhedra; simplicial topology; and fixed point theorems, among others. A strength of this work is how these topics are developed in a fully integrated fashion.
This is a two-volume collection presenting the selected works of Herbert Busemann, one of the leading geometers of the twentieth century and one of the main founders of metric geometry, convexity theory and convexity in metric spaces. Busemann also did substantial work, probably the most important, on Hilbert's Problem IV. These collected works include Busemann's most important published articles on these topics. Volume I of the collection features Busemann's papers on the foundations of geodesic spaces and on the metric geometry of Finsler spaces. Volume II includes Busemann's papers on convexity and integral geometry, on Hilbert's Problem IV, and other papers on miscellaneous subjects. Each volume offers biographical documents, documents from Busemann's correspondence, and introductory essays on his work written by leading specialists. The volumes are a valuable resource for researchers in synthetic and metric geometry, convexity theory and the foundations of geometry.
The theory of constructive (recursive) models grew out of the work of Froehlich, Shepherdson, Mal'tsev, Kuznetsov, Rabin, and Vaught in the 1950s. Within the framework of this theory, algorithmic properties of abstract models are investigated by constructing representations on the set of natural numbers and studying relations between algorithmic and structural properties of these models. This book is a very readable exposition of the modern theory of constructive models and describes methods and approaches developed by representatives of the Siberian school of algebra and logic and some other researchers (in particular, Nerode and his colleagues). The main themes are the existence of recursive models and applications to fields, algebras, and ordered sets (Ershov), the existence of decidable prime models (Goncharov, Harrington), the existence of decidable saturated models (Morley), the existence of decidable homogeneous models (Goncharov and Peretyat'kin), properties of the Ehrenfeucht theories (Millar, Ash, and Reed), the theory of algorithmic dimension and conditions of autostability (Goncharov, Ash, Shore, Khusainov, Ventsov, and others), and the theory of computable classes of models with various properties. Future perspectives of the theory of constructive models are also discussed. Most of the results in the book are presented in monograph form for the first time. The theory of constructive models serves as a basis for recursive mathematics. It is also useful in computer science, in particular, in the study of programming languages, higher-level languages of specification, abstract data types, and problems of synthesis and verification of programs. Therefore, the book will be useful not only for specialists in mathematical logic and the theory of algorithms but also for scientists interested in the mathematical fundamentals of computer science. The authors are eminent specialists in mathematical logic.
They have established fundamental results on elementary theories, model theory, the theory of algorithms, field theory, group theory, applied logic, computable numberings, the theory of constructive models, and theoretical computer science.
This book gives an account of the fundamental results in geometric stability theory, a subject that has grown out of categoricity and classification theory. This approach studies the fine structure of models of stable theories, using the geometry of forking; this often achieves global results relevant to classification theory. Topics range from the Zilber-Cherlin classification of infinite locally finite homogeneous geometries, to regular types, their geometries, and their role in superstable theories. The structure and existence of definable groups is featured prominently, as is work by Hrushovski. The book is unique in the range and depth of material covered and will be invaluable to anyone interested in modern model theory.