This book constitutes the refereed proceedings of the 11th International Conference on Typed Lambda Calculi and Applications, TLCA 2013, held in Eindhoven, The Netherlands, in June 2013 as part of RDP 2013, the 7th Federated Conference on Rewriting, Deduction, and Programming, together with the 24th International Conference on Rewriting Techniques and Applications, RTA 2013, and several related events.
In recent years, there have been several attempts to define a logic for information retrieval (IR). The aim was to provide a rich and uniform representation of information and its semantics with the goal of improving retrieval effectiveness. The basis of a logical model for IR is the assumption that queries and documents can be represented effectively by logical formulae. To retrieve a document, an IR system has to infer the formula representing the query from the formula representing the document. This logical interpretation of query and document emphasizes that relevance in IR is an inference process. The use of logic to build IR models enables one to obtain models that are more general than earlier well-known IR models. Indeed, some logical models are able to represent within a uniform framework various features of IR systems such as hypermedia links, multimedia data, and user's knowledge. Logic also provides a common approach to the integration of IR systems with logical database systems. Finally, logic makes it possible to reason about an IR model and its properties. This latter possibility is becoming increasingly important since conventional evaluation methods, although good indicators of the effectiveness of IR systems, often give results which cannot be predicted, or for that matter satisfactorily explained. However, logic by itself cannot fully model IR. The success or failure of the inference of the query formula from the document formula is not enough to model relevance in IR. It is necessary to take into account the uncertainty inherent in such an inference process. In 1986, Van Rijsbergen proposed the logical uncertainty principle to model relevance as an uncertain inference process. When proposing the principle, Van Rijsbergen was not specific about which logic and which uncertainty theory to use. As a consequence, various logics and uncertainty theories have been proposed and investigated. The choice of an appropriate logic and uncertainty mechanism has been a main research theme in logical IR modeling, leading to a number of logical IR models over the years. Information Retrieval: Uncertainty and Logics contains a collection of exciting papers proposing, developing and implementing logical IR models. This book is appropriate for use as a text for a graduate-level course on Information Retrieval or Database Systems, and as a reference for researchers and practitioners in industry.
also in: The Kluwer International Series on Asian Studies in Computer and Information Science, Volume 2
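As a concrete illustration of the logical uncertainty principle discussed above, relevance can be scored by estimating the probability that the document representation implies the query representation, P(d -> q). The sketch below is a minimal, hypothetical toy rather than an example from the book: documents and queries are modeled as sets of atomic propositions, and the implication probability is crudely estimated by the fraction of query propositions entailed by the document.

# Minimal sketch of relevance as uncertain inference, P(d -> q).
# Hypothetical illustration: documents and queries are sets of atomic
# propositions (terms); the probability that the document formula entails
# the query formula is estimated by term coverage.

def implication_score(document: set[str], query: set[str]) -> float:
    """Estimate P(d -> q) as the fraction of query propositions
    entailed by (present in) the document representation."""
    if not query:
        return 1.0  # an empty query is trivially entailed
    return len(query & document) / len(query)

docs = {
    "d1": {"retrieval", "logic", "uncertainty"},
    "d2": {"fuzzy", "control", "systems"},
}
query = {"logic", "retrieval"}

# Rank documents by the estimated strength of the inference d -> q.
ranking = sorted(docs, key=lambda d: implication_score(docs[d], query), reverse=True)
print(ranking)  # ['d1', 'd2']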
Neural Networks and Fuzzy Systems: Theory and Applications discusses theories that have proven useful in applying neural networks and fuzzy systems to real world problems. The book includes a performance comparison of neural networks and fuzzy systems using data gathered from real systems. Topics covered include the Hopfield network for combinatorial optimization problems, multilayered neural networks for pattern classification and function approximation, fuzzy systems that have the same functions as multilayered networks, and composite systems that have been successfully applied to real world problems. The author also includes representative neural network models such as the Kohonen network and the radial basis function network. New fuzzy systems with learning capabilities are also covered. The advantages and disadvantages of neural networks and fuzzy systems are examined. The performance of these two systems in license plate recognition, a water purification plant, blood cell classification, and other real world problems is compared.
Computer systems that analyze images are critical to a wide variety of applications such as visual inspection systems for various manufacturing processes, remote sensing of the environment from space-borne imaging platforms, and automatic diagnosis from X-rays and other medical imaging sources. Professor Azriel Rosenfeld, the founder of the field of digital image analysis, made fundamental contributions to a wide variety of problems in image processing, pattern recognition and computer vision. Professor Rosenfeld's previous students, postdoctoral scientists, and colleagues illustrate in Foundations of Image Understanding how current research has been influenced by his work as the leading researcher in the area of image analysis for over two decades. Each chapter of Foundations of Image Understanding is written by one of the world's leading experts in his area of specialization, examining digital geometry and topology (early research which laid the foundations for many industrial machine vision systems), edge detection and segmentation (fundamental to systems that analyze complex images of our three-dimensional world), multi-resolution and variable resolution representations for images and maps, parallel algorithms and systems for image analysis, and the importance of human psychophysical studies of vision to the design of computer vision systems. Professor Rosenfeld's chapter briefly discusses topics not covered in the contributed chapters, providing a personal, historical perspective on the development of the field of image understanding. Foundations of Image Understanding is an excellent source of basic material for both graduate students entering the field and established researchers who require a compact source for many of the foundational topics in image analysis.
Logic and Philosophy of Mathematics in the Early Husserl focuses on the first ten years of Edmund Husserl's work, from the publication of his Philosophy of Arithmetic (1891) to that of his Logical Investigations (1900/01), and aims to precisely locate his early work in the fields of logic, philosophy of logic and philosophy of mathematics. Unlike most phenomenologists, the author refrains from reading Husserl's early work as a more or less immature sketch of claims consolidated only in his later phenomenology, and unlike the majority of historians of logic she emphasizes the systematic strength and the originality of Husserl's logico-mathematical work. The book attempts to reconstruct the discussion between Husserl and those philosophers and mathematicians who contributed to new developments in logic, such as Leibniz, Bolzano, the logical algebraists (especially Boole and Schröder), Frege, and Hilbert and his school. It presents both a comprehensive critical examination of some of the major works produced by Husserl and his antagonists in the last decade of the 19th century and a formal reconstruction of many texts from Husserl's Nachlass that have not yet been the object of systematic scrutiny. This volume will be of particular interest to researchers working in the history, and in the philosophy, of logic and mathematics, and more generally, to analytical philosophers and phenomenologists with a background in standard logic.
Call-by-push-value is a programming language paradigm that, surprisingly, breaks down the call-by-value and call-by-name paradigms into simple primitives. This monograph, written for graduate students and researchers, exposes the call-by-push-value structure underlying a remarkable range of semantics, including operational semantics, domains, possible worlds, continuations and games.
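The slogan behind call-by-push-value is that a value simply is, while a computation does: the two are separate syntactic categories, connected by thunk (suspend a computation into a value) and force (run a suspended value). Call-by-value then passes evaluated values, while call-by-name passes thunks. The following Python sketch is a hypothetical encoding of these primitives for intuition only, not the monograph's formal calculus.

# Hypothetical encoding of call-by-push-value primitives: a computation is
# modeled as a zero-argument function; thunking freezes it into a value.

def thunk(computation):
    """Suspend a computation into a value (a frozen computation)."""
    return computation  # the unrun function itself serves as the value

def force(thunked):
    """Resume a suspended computation."""
    return thunked()

def loud_number():
    print("computing...")
    return 42

def cbv_apply(f, arg):
    """Call-by-value: evaluate the argument once, pass the result."""
    return f(arg())

def cbn_apply(f, arg):
    """Call-by-name: pass the thunk; the body forces it at each use."""
    return f(thunk(arg))

cbv_apply(lambda v: v + v, loud_number)                 # prints once, yields 84
cbn_apply(lambda t: force(t) + force(t), loud_number)   # prints twice, yields 84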
The main idea of statistical convergence is to demand convergence only for a majority of the elements of a sequence. This method of convergence has been investigated in many fundamental areas of mathematics, such as measure theory, approximation theory, fuzzy logic theory, and summability theory. In this monograph the authors consider this concept in approximating a function by linear operators, especially when the classical limit fails. The results of the book not only cover classical and statistical approximation theory, but are also applied in fuzzy logic via fuzzy-valued operators. The authors in particular treat the important Korovkin approximation theory of positive linear operators in the statistical and fuzzy senses. They also present various statistical approximation theorems for some specific real- and complex-valued linear operators that are not positive. This is the first monograph on statistical approximation theory and fuzziness. The chapters are self-contained, and several advanced courses can be taught from them. The research findings will be useful in various applications, including applied and computational mathematics, stochastics, engineering, artificial intelligence, vision and machine learning. This monograph is directed to graduate students, researchers, practitioners and professors of all disciplines.
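For orientation, the standard definition behind the monograph's central notion (general background, not a quotation from the book) reads: a sequence statistically converges to L when the indices at which it strays from L form a set of natural density zero.

% Statistical convergence via natural density (standard definition):
\[
  \operatorname{st}\lim_{n\to\infty} x_n = L
  \quad\Longleftrightarrow\quad
  \forall \varepsilon > 0:\;
  \lim_{N\to\infty} \frac{1}{N}\,
  \bigl|\{\, n \le N : |x_n - L| \ge \varepsilon \,\}\bigr| = 0 .
\]
% Example: the sequence equal to 1 at the perfect squares and 0 elsewhere
% statistically converges to 0, although it has no classical limit.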
This monograph studies the logical aspects of domains as used in denotational semantics of programming languages. Frameworks of domain logics are introduced; these serve as foundations for systematic derivations of proof systems from denotational semantics of programming languages. Any proof system so derived is guaranteed to agree with denotational semantics in the sense that the denotation of any program coincides with the set of assertions true of it. The study focuses on two categories for denotational semantics: SFP domains, and the less standard, but important, category of stable domains. The intended readership of this monograph includes researchers and graduate students interested in the relation between semantics of programming languages and formal means of reasoning about programs. A basic knowledge of denotational semantics, mathematical logic, general topology, and category theory is helpful for a full understanding of the material. Part I, SFP Domains. Chapter 1, Introduction: this chapter provides a brief exposition of domain theory, denotational semantics, program logics, and proof systems. It discusses the importance of ideas and results on logic and topology to the understanding of the relation between denotational semantics and program logics. It also describes the motivation for the work presented by this monograph, and how that work fits into a more general program. Finally, it gives a short summary of the results of each chapter. 1.1 Domain Theory: programming languages are languages with which to perform computation.
One of the important areas of contemporary combinatorics is Ramsey theory. Ramsey theory is basically the study of structure preserved under partitions. Its general philosophy is reflected by its interdisciplinary character. The ideas of Ramsey theory are shared by logicians, set theorists and combinatorialists, and have been successfully applied in other branches of mathematics. The whole subject is developing quickly and has some new and unexpected applications in areas as remote as functional analysis and theoretical computer science. This book is a homogeneous collection of research and survey articles by leading specialists. It surveys recent activity in this diverse subject and brings the reader up to the boundary of present knowledge. It covers virtually all main approaches to the subject and suggests various problems for individual research.
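For readers new to the area, the theorem that gives the field its name can be stated as follows (standard background, not a passage from the book):

% Finite Ramsey theorem, graph version:
\[
  \forall k \;\exists R(k)\;\forall n \ge R(k):\;
  \text{every } c : E(K_n) \to \{\text{red},\text{blue}\}
  \text{ admits a monochromatic copy of } K_k .
\]
% Smallest nontrivial instance: R(3) = 6, i.e. any 2-colouring of the edges
% of K_6 contains a monochromatic triangle, while K_5 admits one without.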
This book constitutes the thoroughly refereed post-conference proceedings of the 20th International Workshop on Algebraic Development Techniques, WADT 2010, held in July 2010 in Etelsen, Germany. The 15 revised papers presented were carefully reviewed and selected from 32 presentations. The workshop deals with the following topics: foundations of algebraic specification; other approaches to formal specification, including process calculi and models of concurrent, distributed and mobile computing; specification languages, methods, and environments; semantics of conceptual modeling methods and techniques; model-driven development; graph transformations, term rewriting and proof systems; integration of formal specification techniques; formal testing and quality assurance; and validation and verification.
The concept of infinity is one of the most important, and at the same time, one of the most mysterious concepts of science. Already in antiquity many philosophers and mathematicians pondered over its contradictory nature. In mathematics, the contradictions connected with infinity intensified after the creation, at the end of the 19th century, of the theory of infinite sets and the discovery, soon after, of paradoxes in this theory. At the time, many scientists ignored the paradoxes and used set theory extensively in their work, while others subjected set-theoretic methods in mathematics to harsh criticism. The debate intensified when a group of French mathematicians, who wrote under the pseudonym of Nicolas Bourbaki, tried to erect the whole edifice of mathematics on the single notion of a set. Some mathematicians greeted this attempt enthusiastically while others regarded it as an unnecessary formalization, an attempt to tear mathematics away from the life-giving practical applications that sustain it. These differences notwithstanding, Bourbaki has had a significant influence on the evolution of mathematics in the twentieth century. In this book we try to tell the reader how the idea of the infinite arose and developed in physics and in mathematics, how the theory of infinite sets was constructed, what paradoxes it has led to, what significant efforts have been made to eliminate the resulting contradictions, and what routes scientists are trying to find that would provide a way out of the many difficulties.
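The best known of the paradoxes alluded to above is Russell's, included here for convenience (a standard fact, not a quotation from the book): naive, unrestricted set comprehension permits the formation of a set whose self-membership is contradictory.

% Russell's paradox in naive set theory:
\[
  R = \{\, x : x \notin x \,\}
  \quad\Longrightarrow\quad
  (R \in R \iff R \notin R).
\]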
"Kind of crude, but it works, boy, it works " AZan NeweZZ to Herb Simon, Christmas 1955 In 1954 a computer program produced what appears to be the first computer generated mathematical proof: Written by M. Davis at the Institute of Advanced Studies, USA, it proved a number theoretic theorem in Presburger Arithmetic. Christmas 1955 heralded a computer program which generated the first proofs of some propositions of Principia Mathematica, developed by A. Newell, J. Shaw, and H. Simon at RAND Corporation, USA. In Sweden, H. Prawitz, D. Prawitz, and N. Voghera produced the first general program for the full first order predicate calculus to prove mathematical theorems; their computer proofs were obtained around 1957 and 1958, about the same time that H. Gelernter finished a computer program to prove simple high school geometry theorems. Since the field of computational logic (or automated theorem proving) is emerging from the ivory tower of academic research into real world applications, asserting also a definite place in many university curricula, we feel the time has corne to examine and evaluate its history. The article by Martin Davis in the first of this series of volumes traces the most influential ideas back to the 'prehistory' of early logical thought showing how these ideas influenced the underlying concepts of most early automatic theorem proving programs.
Coalgebraic logic is an important research topic in the areas of concurrency theory, semantics, transition systems and modal logics. It provides a general approach to modeling systems, allowing us to apply important results from coalgebras, universal algebra and category theory in novel ways. Stochastic systems provide important tools for systems modeling, and recent work shows that categorical reasoning may lead to new insights, previously not available in a purely probabilistic setting. This book combines coalgebraic reasoning, stochastic systems and logics. It provides an insight into the principles of coalgebraic logic from a categorical point of view, and applies these systems to interpretations of stochastic coalgebraic logics, which include well-known modal logics and continuous time branching logics. The author introduces stochastic systems together with their probabilistic and categorical foundations and gives a comprehensive discussion of the Giry monad as the underlying categorical construction, presenting many new, hitherto unpublished results. He discusses modal logics, introduces their probabilistic interpretations, and then proceeds to an analysis of Kripke models for coalgebraic logics. The book will be of interest to researchers in theoretical computer science, logic and category theory.
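For intuition, a system in the coalgebraic sense is simply a map from states to observations of states; for stochastic systems the observation is a probability distribution over successor states, and a modality in the style of Larsen and Skou holds at a state when the chance of stepping into the formula's extension meets a threshold. The Python sketch below is a hypothetical toy illustrating this reading, with the Giry monad collapsed to finite distributions; it is not the book's categorical machinery.

# Toy stochastic coalgebra: a transition map step : X -> (finite distribution
# over X), given as dictionaries of probabilities. Hypothetical illustration.

step = {
    "s0": {"s0": 0.2, "s1": 0.8},
    "s1": {"s1": 1.0},
}

def diamond(p: float, phi: set[str]) -> set[str]:
    """States satisfying <p>phi: probability at least p of entering phi."""
    return {
        x for x, dist in step.items()
        if sum(prob for y, prob in dist.items() if y in phi) >= p
    }

# States reaching the region {"s1"} in one step with probability >= 0.5:
print(diamond(0.5, {"s1"}))  # {'s0', 's1'}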
The first edition of this book was published in 1977. The text has been well received and is still used, although it has been out of print for some time. In the intervening three decades, a lot of interesting things have happened to mathematical logic: (i) Model theory has shown that insights acquired in the study of formal languages could be used fruitfully in solving old problems of conventional mathematics. (ii) Mathematics has been and is moving with growing acceleration from the set-theoretic language of structures to the language and intuition of (higher) categories, leaving behind old concerns about infinities: a new view of foundations is now emerging. (iii) Computer science, a no-nonsense child of the abstract computability theory, has been creatively dealing with old challenges and providing new ones, such as the P/NP problem. Planning additional chapters for this second edition, I have decided to focus on model theory, the conspicuous absence of which in the first edition was noted in several reviews, and the theory of computation, including its categorical and quantum aspects. The whole Part IV: Model Theory, is new. I am very grateful to Boris I. Zilber, who kindly agreed to write it. It may be read directly after Chapter II. The contents of the first edition are basically reproduced here as Chapters I-VIII. Section IV.7, on the cardinality of the continuum, is completed by Section IV.7.3, discussing H. Woodin's discovery.
The theory of fuzzy sets became known in Czechoslovakia in the early seventies. Since then, it has been applied in various areas of science, engineering and economics where indeterminate concepts had to be handled. There have been a number of national seminars and conferences devoted to this topic. However, the International Symposium on Fuzzy Approach to Reasoning and Decision-Making, held in 1990, was the first really representative international meeting of this kind organized in Czechoslovakia. The symposium took place in the House of Scientists of the Czechoslovak Academy of Sciences in Bechyně from June 25 to 29, 1990. Its main organizer was the Mining Institute of the Czechoslovak Academy of Sciences in Ostrava, in cooperation with and with the support of several other institutions and organizations. A crucial role in preparing the Symposium was played by the working group for Fuzzy Sets and Systems, which is active within the Society of Czechoslovak Mathematicians and Physicists. The organizing and program committee was headed by Dr. Vilém Novák from the Mining Institute in Ostrava. Its members (in alphabetical order) were Dr. Martin Černý (Prague), Prof. Blahoslav Harman (Liptovský Mikuláš), Ema Hyklová (Prague), Prof. Zdeněk Karpíšek (Brno), Jan Laub (Prague), Dr. Milan Mareš, vice-chairman (Prague), Prof. Radko Mesiar (Bratislava), Dr. Jiří Nekola, vice-chairman (Prague), Daria Nováková (Ostrava), Dr. Jaroslav Ramík (Ostrava), Prof. Dr. Beloslav Riečan (Bratislava), Dr. Jana Talašová (Přerov) and Dr. Miloš Vítek (Pardubice).
Combinatorial Algorithms on Words refers to the collection of manipulations of strings of symbols (words) - not necessarily from a finite alphabet - that exploit the combinatorial properties of the logical/physical input arrangement to achieve efficient computational performance. The model of computation may be any of the established serial paradigms (e.g. RAMs, Turing Machines), or one of the emerging parallel models (e.g. PRAM, WRAM, Systolic Arrays, CCC). This book focuses on some of the accomplishments of recent years in such disparate areas as pattern matching, data compression, free groups, coding theory, parallel and VLSI computation, and symbolic dynamics; these share a common flavor, yet have not been examined together in the past. In addition to being theoretically interesting, these studies have had significant applications. It happens that these works have all too frequently been carried out in isolation, with contributions addressing similar issues scattered throughout a rather diverse body of literature. We felt that it would be advantageous to both current and future researchers to collect this work in a single reference. It should be clear that the book's emphasis is on aspects of combinatorics and complexity rather than logic, foundations, and decidability. In view of the large body of research and the degree of unity already achieved by studies in the theory of automata and formal languages, we have allocated very little space to them.
Approximate reasoning is a key motivation behind fuzzy sets and possibility theory. This volume provides a coherent view of the field and its impact on database research and information retrieval. First, the semantic foundations of approximate reasoning are presented. Special emphasis is given to the representation of fuzzy rules and specialized types of approximate reasoning. Then syntactic aspects of approximate reasoning are surveyed, and the algebraic underpinnings of fuzzy consequence relations are presented and explained. The second part of the book is devoted to inductive and neuro-fuzzy methods for learning fuzzy rules. It also contains new material on the application of possibility theory to data fusion. The last part of the book surveys the growing literature on fuzzy information systems. Each chapter contains extensive bibliographical material. Fuzzy Sets in Approximate Reasoning and Information Systems is a major source of information for research scholars and graduate students in computer science and artificial intelligence who are interested in human information processing.
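The inference pattern at the heart of the fuzzy rules surveyed above is Zadeh's compositional rule of inference (standard background, not quoted from the book): a rule "if X is A then Y is B" is encoded as a fuzzy relation R, and an observation "X is A'" yields the conclusion "Y is B'" by sup-min composition.

% Compositional rule of inference (generalized modus ponens), with the rule
% encoded, for instance, by the Mamdani relation R(x,y) = min(A(x), B(y)):
\[
  B'(y) \;=\; \sup_{x}\, \min\bigl( A'(x),\, R(x, y) \bigr).
\]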
"Kind of Cl'Ude ~ but it UJorks~ boy~ it UJOrksl" Alan Ner. ueH to Herb Simon~ C1rl'istmas 1955 In 1954 a computer program produced what appears to be the first computer generated mathematical proof: Written by M. Davis at the Institute of Advanced Studies, USA, it proved a number theoretic theorem in Presburger Arithmetic. Christmas 1955 heralded a computer program which generated the first proofs of some propositions of Principia Mathematica, developed by A. Newell, J. Shaw, and H. Simon at RAND Corporation, USA. In Sweden, H. Prawitz, D. Prawitz, and N. Voghera produced the first general program for the full first order predicate calculus to prove mathematical theorems; their computer proofs were obtained around 1957 and 1958, about the same time that H. Gelernter finished a computer program to prove simple high school geometry theorems. Since the field of computational logic (or automated theorem proving) is emerging from the ivory tower of academic research into real world applications, asserting also a definite place in many university curricula, we feel the time has come to examine and evaluate its history. The article by Martin Davis in the first of this series of volumes traces the most influential ideas back to the 'prehistory' of early logical thought showing how these ideas influenced the underlying concepts of most early automatic theorem proving programs.
One high-level ability of the human brain is to understand what it has learned. This seems to be the crucial advantage in comparison to the brain activity of other primates. At present we are technologically almost ready to artificially reproduce human brain tissue, but we still do not fully understand the information processing and the related biological mechanisms underlying this ability. Thus an electronic clone of the human brain is still far from being realizable. At the same time, around twenty years after the revival of the connectionist paradigm, we are not yet satisfied with the typical subsymbolic attitude of devices like neural networks: we can make them learn to solve even difficult problems, but without a clear explanation of why a solution works. Indeed, to use these devices widely in a reliable and non-elementary way we need formal and understandable expressions of the learnt functions. These must be susceptible of being tested, manipulated and composed with other similar expressions to build more structured functions as solutions of complex problems via the usual deductive methods of Artificial Intelligence. Many efforts have been steered in this direction in recent years, constructing artificial hybrid systems where a cooperation between the subsymbolic processing of the neural networks merges in various modes with symbolic algorithms. In parallel, neurobiology research keeps supplying more and more detailed explanations of the low-level phenomena responsible for mental processes.
When solving real-life engineering problems, one often encounters linguistic information that is hard to quantify using "classical" mathematical techniques. This linguistic information represents subjective knowledge. Through the assumptions made by the analyst when forming the mathematical model, the linguistic information is often ignored. On the other hand, a wide range of traffic and transportation engineering parameters are characterized by uncertainty, subjectivity, imprecision, and ambiguity. Human operators, dispatchers, drivers, and passengers use this subjective knowledge or linguistic information on a daily basis when making decisions. Decisions about route choice, mode of transportation, most suitable departure time, or dispatching trucks are made by drivers, passengers, or dispatchers. In each case the decision maker is a human. The environment in which a human expert (human controller) makes decisions is most often complex, making it difficult to formulate a suitable mathematical model. Thus, the development of fuzzy logic systems seems justified in such situations. In certain situations we accept linguistic information much more easily than numerical information. In the same vein, we are perfectly capable of accepting approximate numerical values and making decisions based on them. In a great number of cases we use approximate numerical values exclusively. It should be emphasized that the subjective estimates of different traffic parameters differ from dispatcher to dispatcher, driver to driver, and passenger to passenger.
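As a minimal illustration of how such linguistic information can be quantified (a hypothetical toy, not an example from the book), a linguistic term such as "short waiting time" can be modeled as a fuzzy set: a crisp measurement receives a degree of membership between 0 and 1 rather than a yes/no classification.

# Hypothetical toy: the linguistic term "short waiting time" as a fuzzy set.
# The membership shape (fully "short" at 0 minutes, not "short" at all
# beyond 10) is an assumption made purely for illustration.

def short_wait(minutes: float) -> float:
    """Degree to which a waiting time counts as 'short', in [0, 1]."""
    return max(0.0, 1.0 - minutes / 10.0)

for m in (0, 2, 6, 12):
    print(m, short_wait(m))  # 0 -> 1.0, 2 -> 0.8, 6 -> 0.4, 12 -> 0.0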
Convexity of sets in linear spaces, and concavity and convexity of functions, lie at the root of beautiful theoretical results that are at the same time extremely useful in the analysis and solution of optimization problems, including problems with either a single objective or multiple objectives. Not all of these results necessarily rely on convexity and concavity; some of the results can guarantee that each local optimum is also a global optimum, giving these methods broader application to a wider class of problems. Hence, the first part of the book is concerned with several types of generalized convex sets and generalized concave functions. In addition to their applicability to nonconvex optimization, these generalized convex sets and generalized concave functions are used in the book's second part, where decision-making and optimization problems under uncertainty are investigated. Uncertainty in the problem data often cannot be avoided when dealing with practical problems. Errors occur in real-world data for a host of reasons. However, over the last thirty years, the fuzzy set approach has proved to be useful in these situations. It is this approach to optimization under uncertainty that is extensively used and studied in the second part of this book. Typically, the membership functions of fuzzy sets involved in such problems are neither concave nor convex. They are, however, often quasiconcave or concave in some generalized sense. This opens possibilities for application of results on generalized concavity to fuzzy optimization. Despite this obvious relation, work at the interface of these two areas has been limited to date. It is hoped that the combination of ideas and results from the field of generalized concavity on the one hand and fuzzy optimization on the other, outlined and discussed in Generalized Concavity in Fuzzy Optimization and Decision Analysis, will be of interest to both communities. Our aim is to broaden the classes of problems that the combination of these two areas can satisfactorily address and solve.
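For reference, the generalized-concavity notion most relevant to membership functions is quasiconcavity (a standard definition, not quoted from the book): a function is quasiconcave when, along any segment, it never falls below the smaller of its endpoint values.

% Quasiconcavity: f on a convex set C is quasiconcave iff
\[
  f\bigl(\lambda x + (1 - \lambda) y\bigr) \;\ge\; \min\{\, f(x),\, f(y) \,\}
  \qquad \text{for all } x, y \in C,\ \lambda \in [0, 1].
\]
% A triangular fuzzy membership function on the real line is quasiconcave
% but not concave, which is why generalized concavity enters the picture.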
This monograph contains the results of our joint research over the last ten years on the logic of the fixed point operation. The intended audience consists of graduate students and research scientists interested in mathematical treatments of semantics. We assume the reader has a good mathematical background, although we provide some preliminary facts in Chapter 1. Written both for graduate students and research scientists in theoretical computer science and mathematics, the book provides a detailed investigation of the properties of the fixed point or iteration operation. Iteration plays a fundamental role in the theory of computation: for example, in the theory of automata, in formal language theory, in the study of formal power series, in the semantics of flowchart algorithms and programming languages, and in circular data type definitions. It is shown that in all structures that have been used as semantical models, the equational properties of the fixed point operation are captured by the axioms describing iteration theories. These structures include ordered algebras, partial functions, relations, finitary and infinitary regular languages, trees, synchronization trees, 2-categories, and others.
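The fixed point operation in question can be made concrete by Kleene iteration: for a monotone map f on a finite lattice, the least fixed point is the limit of the chain bottom, f(bottom), f(f(bottom)), and so on. The sketch below is a generic toy, not an example from the book; it computes the least fixed point of a monotone set transformer, the construction underlying, for instance, reachability semantics of flowchart algorithms.

# Toy Kleene iteration: least fixed point of a monotone function on the
# finite lattice of subsets, computed by iterating from the empty set.
# The graph and start set are hypothetical data for illustration.

def lfp(f, bottom=frozenset()):
    """Iterate f from bottom until the value stops changing."""
    x = bottom
    while True:
        y = f(x)
        if y == x:
            return x
        x = y

edges = {"a": {"b"}, "b": {"c"}, "c": set(), "d": {"a"}}
start = {"a"}

def step(reached):
    """Monotone transformer: start states plus one-step successors."""
    out = set(start) | set(reached)
    for v in reached:
        out |= edges[v]
    return frozenset(out)

print(sorted(lfp(step)))  # ['a', 'b', 'c']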
In the six years since the first edition of this book was published, the field of Structural Complexity has grown quite a bit. However, we are keeping this volume at the same basic level that it had in the first edition, and the only new result incorporated as an appendix is the closure under complementation of nondeterministic space classes, which in the previous edition was posed as an open problem. This result was already included in our Volume II, but we feel that due to the basic nature of the result, it belongs to this volume. There are of course other important results obtained during these last six years. However, as they belong to new areas opened in the field, they are outside the scope of this fundamental volume. Other changes in this second edition are the update of some Bibliographical Remarks and references, correction of many mistakes and typos, and a renumbering of the definitions and results. Experience has shown us that this new numbering is a lot friendlier, and several readers have confirmed this opinion. For the sake of the reader of Volume II, where all references to Volume I follow the old numbering, we have included here a table indicating the new number corresponding to each of the old ones.
Substructural logics are by now one of the most prominent branches of the research field usually labelled as "nonclassical logics" - and perhaps of logic tout court. Over the last few decades a vast number of research papers and even some books have been devoted to this subject. The aim of the present book is to give a comprehensive account of the "state of the art" of substructural logics, focusing both on their proof theory (especially on sequent calculi and their generalizations) and on their semantics (both algebraic and relational).
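What makes these logics "substructural" is the rejection of one or more of the structural rules of Gentzen's sequent calculus; for orientation, the three classical rules in question are (standard background, not quoted from the book):

% The structural rules whose absence defines substructural logics:
\[
  \frac{\Gamma \vdash C}{\Gamma, A \vdash C}\;(\text{weakening})
  \qquad
  \frac{\Gamma, A, A \vdash C}{\Gamma, A \vdash C}\;(\text{contraction})
  \qquad
  \frac{\Gamma, A, B, \Delta \vdash C}{\Gamma, B, A, \Delta \vdash C}\;(\text{exchange})
\]
% For instance, linear logic drops weakening and contraction, the Lambek
% calculus also drops exchange, and relevance logics drop weakening.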