Domain theory is a rich interdisciplinary area at the intersection of logic, computer science, and mathematics. This volume contains selected papers presented at the International Symposium on Domain Theory which took place in Shanghai in October 1999. Topics of papers range from the encounters between topology and domain theory, sober spaces, Lawson topology, real number computability and continuous functionals to fuzzy modelling, logic programming, and pi-calculi. This book is a valuable reference for researchers and students interested in this rapidly developing area of theoretical computer science.
One criterion for classifying books is whether they are written for a single purpose or for multiple purposes. This book belongs to the category of multipurpose books, but one of its roles is predominant: it is primarily a textbook. As such, it can be used for a variety of courses at the first-year graduate or upper-division undergraduate level. A common characteristic of these courses is that they cover fundamental systems concepts, major categories of systems problems, and some selected methods for dealing with these problems at a rather general level. A unique feature of the book is that the concepts, problems, and methods are introduced in the context of an architectural formulation of an expert system, referred to as the general systems problem solver, or GSPS, whose aim is to provide users of all kinds with computer-based systems knowledge and methodology. The GSPS architecture, which is developed throughout the book, facilitates a framework that is conducive to a coherent, comprehensive, and pragmatic coverage of systems fundamentals: concepts, problems, and methods. A course that covers systems fundamentals is now offered not only in systems science, information science, or systems engineering programs, but in many programs in other disciplines as well. Although the level of coverage for systems science or engineering students is surely different from that used for students in other disciplines, this book is designed to serve both of these needs.
Computational intelligence paradigms have attracted the growing interest of researchers, scientists, engineers and application engineers in a number of everyday applications. These applications are not limited to any particular field and include engineering, business, banking and consumer electronics. Computational intelligence paradigms include artificial intelligence, artificial neural networks, fuzzy systems and evolutionary computing. Artificial neural networks can mimic the biological information processing mechanism in a very limited sense. Evolutionary computing algorithms are used for optimisation applications, and fuzzy logic provides a basis for representing uncertain and imprecise knowledge. Practical Applications of Computational Intelligence Techniques contains twelve chapters providing actual application of these techniques in the real world. Such examples include, but are not limited to, intelligent household appliances, aerial spray models, industrial applications and medical diagnostics and practice. This book will be useful to researchers, practicing engineers/scientists and students, who are interested in developing practical applications in a computational intelligence environment.
For more than 30 years, the author has studied the model-theoretic aspects of the theory of valued fields and multi-valued fields. Many of the key results included in this book were obtained by the author whilst preparing the manuscript. Thus the unique overview of the theory, as developed in the book, has been previously unavailable. The book deals with the theory of valued fields and multi-valued fields. The theory of Prufer rings is discussed from the 'geometric' point of view. The author shows that by introducing the Zariski topology on families of valuation rings, it is possible to distinguish two important subfamilies of Prufer rings that correspond to Boolean and near Boolean families of valuation rings. Also, algebraic and model-theoretic properties of multi-valued fields with near Boolean families of valuation rings satisfying the local-global principle are studied. It is important that this principle is elementary, i.e., it can be expressed in the language of predicate calculus. The most important results obtained in the book include a criterion for the elementarity of an embedding of a multi-valued field and a criterion for the elementary equivalence for multi-valued fields from the class defined by the additional natural elementary conditions (absolute unramification, maximality and almost continuity of local elementary properties). The book concludes with a brief chapter discussing the bibliographic references available on the material presented, and a short history of the major developments within the field.
When solving real-life engineering problems, linguistic information is often encountered that is frequently hard to quantify using "classical" mathematical techniques. This linguistic information represents subjective knowledge. Through the assumptions made by the analyst when forming the mathematical model, the linguistic information is often ignored. On the other hand, a wide range of traffic and transportation engineering parameters are characterized by uncertainty, subjectivity, imprecision, and ambiguity. Human operators, dispatchers, drivers, and passengers use this subjective knowledge or linguistic information on a daily basis when making decisions. Decisions about route choice, mode of transportation, most suitable departure time, or dispatching trucks are made by drivers, passengers, or dispatchers. In each case the decision maker is a human. The environment in which a human expert (human controller) makes decisions is most often complex, making it difficult to formulate a suitable mathematical model. Thus, the development of fuzzy logic systems seems justified in such situations. In certain situations we accept linguistic information much more easily than numerical information. In the same vein, we are perfectly capable of accepting approximate numerical values and making decisions based on them. In a great number of cases we use approximate numerical values exclusively. It should be emphasized that the subjective estimates of different traffic parameters differ from dispatcher to dispatcher, driver to driver, and passenger to passenger.
Fuzzy knowledge and fuzzy systems affect our lives today as systems enter the world of commerce. Fuzzy systems are incorporated in domestic appliances (washing machine, air conditioning, microwave, telephone) and in transport systems (a pilotless helicopter has recently completed a test flight). Future applications are expected to have dramatic implications for the demand for labor, among other things. It was with such thoughts in mind that this first international survey of future applications of fuzzy logic has been undertaken. The results are likely to be predictive for a decade beyond the millennium. The predictive element is combined with a bibliography which serves as a historical anchor as well as being both extensive and extremely useful. Analysis and Evaluation of Fuzzy Systems is thus a milestone in the development of fuzzy logic and applications of three representative subsystems: Fuzzy Control, Fuzzy Pattern Recognition and Fuzzy Communications.
Alonzo Church was undeniably one of the intellectual giants of the twentieth century. These articles are dedicated to his memory and illustrate the tremendous importance his ideas have had in logic, mathematics, computer science and philosophy. Discussions of some of these various contributions have appeared in The Bulletin of Symbolic Logic, and the interested reader is invited to seek details there. Here we just try to give some general sense of the scope, depth, and value of his work. Church is perhaps best known for the theorem, appropriately called "Church's Theorem", that there is no decision procedure for the logical validity of formulas of first-order logic. A decision procedure for that part of logic would have come near to fulfilling Leibniz's dream of a calculus that could be mechanically used to settle logical disputes. It was not to be. It could not be. What Church proved precisely is that there is no lambda-definable function that can in every case provide the right answer, 'yes' or 'no', to the question of whether or not any arbitrarily given formula is valid.
Mathematics is often considered as a body of knowledge that is essentially independent of linguistic formulations, in the sense that, once the content of this knowledge has been grasped, there remains only the problem of professional ability, that of clearly formulating and correctly proving it. However, the question is not so simple, and P. Weingartner's paper (Language and Coding-Dependency of Results in Logic and Mathematics) deals with some results in logic and mathematics which reveal that certain notions are in general not invariant with respect to different choices of language and of coding processes. Five examples are given: 1) The validity of axioms and rules of classical propositional logic depends on the interpretation of sentential variables; 2) The language dependency of verisimilitude; 3) The proof of the weak and strong anti-inductivist theorems in Popper's theory of inductive support is not invariant with respect to limitative criteria put on classical logic; 4) The language dependency of the concept of provability; 5) The language dependency of the existence of ungrounded and paradoxical sentences (in the sense of Kripke). The requirements of logical rigour and consistency are not the only criteria for the acceptance and appreciation of mathematical propositions and theories.
In The Foundations of Quantum Mechanics - Historical Analysis and Open Questions, leading Italian researchers involved in different aspects of the foundations and history of quantum mechanics are brought together in an interdisciplinary debate. The book therefore presents an invaluable overview of the state of Italian work in the field at this moment, and of the open problems that still exist in the foundations of the theory. Audience: Physicists, logicians, mathematicians and epistemologists whose research concerns the historical analysis of quantum mechanics.
This book has a fundamental relationship to the International Seminar on Fuzzy Set Theory held each September in Linz, Austria. First, this volume is an extended account of the eleventh Seminar of 1989. Second, and more importantly, it is the culmination of the tradition of the preceding ten Seminars. The purpose of the Linz Seminar, since its inception, was and is to foster the development of the mathematical aspects of fuzzy sets. In the earlier years, this was accomplished by bringing together for a week small groups of mathematicians in various fields in an intimate, focused environment which promoted much informal, critical discussion in addition to formal presentations. Beginning with the tenth Seminar, the intimate setting was retained, but each Seminar narrowed in theme; and participation was broadened to include both younger scholars within, and established mathematicians outside, the mathematical mainstream of fuzzy set theory. Most of the material of this book was developed over the years in close association with the Seminar or influenced by what transpired at Linz. For much of the content, it played a crucial role in either stimulating this material or in providing feedback and the necessary screening of ideas. Thus we may fairly say that the book, and the eleventh Seminar to which it is directly related, are in many respects a culmination of the previous Seminars.
This volume consists of papers delivered at the conference 'The Lvov-Warsaw School and Contemporary Philosophy', organised in celebration of the hundredth anniversary of Kazimierz Twardowski's first lecture as Professor of Philosophy at Lvov University. This lecture can be regarded as the starting point of the development of analytic philosophy in Poland, which culminated in the Warsaw School of Logic. The conference was held in Lvov (15-17 November) and Warsaw (19-21 November 1995). It was organised jointly by the Ukrainian Academy of Sciences and the Polish Academy of Sciences. The general organisation was undertaken by Professor Andrzej Grzegorczyk (Polish Academy of Sciences) and Professor Marat Vernikov (Ukrainian Academy of Sciences). Professors Jaroslaw Isaievich (Ukrainian Academy of Sciences) and Jan Wolenski (Jagiellonian University) were responsible for the scientific programme. Over 100 philosophers participated in the conference. Papers published in this volume are organised according to the sections of the conference and represent a selection of the papers delivered. The editors would like to express their gratitude to Professor Andrzej Grzegorczyk, spiritus movens of the conference, who, by including the present volume in a programme of publications connected with the hundredth anniversary of the Lvov-Warsaw School, provided financial support for its preparation. Finally, we express our gratitude to Dr Timothy Childers of the Academy of Sciences of the Czech Republic for correcting the English of the papers.
We dedicate this volume to Professor Parimala on the occasion of her 60th birthday. It contains a variety of papers related to the themes of her research. Parimala's first striking result was a counterexample to a quadratic analogue of Serre's conjecture (Bulletin of the American Mathematical Society, 1976). Her influence has continued through her tenure at the Tata Institute of Fundamental Research in Mumbai (1976-2006), and now her time at Emory University in Atlanta (2005-present). A conference was held from 30 December 2008 to 4 January 2009, at the University of Hyderabad, India, to celebrate Parimala's 60th birthday (see the conference's Web site at http://mathstat.uohyd.ernet.in/conf/quadforms2008). The organizing committee consisted of J.-L. Colliot-Thélène, Skip Garibaldi, R. Sujatha, and V. Suresh. The present volume is an outcome of this event. We would like to thank all the participants of the conference, the authors who have contributed to this volume, and the referees who carefully examined the submitted papers. We would also like to thank Springer-Verlag for readily accepting to publish the volume. In addition, the other three editors of the volume would like to place on record their deep appreciation of Skip Garibaldi's untiring efforts toward the final publication.
Adaptive Resonance Theory Microchips describes circuit strategies resulting in efficient and functional adaptive resonance theory (ART) hardware systems. While ART algorithms have been developed in software by their creators, this is the first book that addresses efficient VLSI design of ART systems. All systems described in the book have been designed and fabricated (or are nearing completion) as VLSI microchips in anticipation of the impending proliferation of ART applications to autonomous intelligent systems. To accommodate these systems, the book not only provides circuit design techniques, but also validates them through experimental measurements. The book also includes a tutorial chapter describing four ART architectures (ART1, ARTMAP, Fuzzy-ART and Fuzzy-ARTMAP), with easily understandable MATLAB code examples to implement these four algorithms in software. In addition, an entire chapter is devoted to other potential applications for real-time data clustering and category learning.
The importance of having efficient and effective methods for data mining and knowledge discovery (DM&KD), to which the present book is devoted, grows every day, and numerous such methods have been developed in recent decades. There exists a great variety of different settings for the main problem studied by data mining and knowledge discovery, and it seems that a very popular one is formulated in terms of binary attributes. In this setting, states of nature of the application area under consideration are described by Boolean vectors defined on some attributes, that is, by data points defined in the Boolean space of the attributes. It is postulated that there exists a partition of this space into two classes, which should be inferred as patterns on the attributes when only several data points are known, the so-called positive and negative training examples. The main problem in DM&KD is defined as finding rules for recognizing (classifying) new data points of unknown class, i.e., deciding which of them are positive and which are negative. In other words, to infer the binary value of one more attribute, called the goal or class attribute. To solve this problem, some methods have been suggested which construct a Boolean function separating the two given sets of positive and negative training data points.
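The binary-attribute setting described above can be made concrete with a toy rule learner. The `learn_conjunction` function, its most-specific-conjunction strategy, and the example data below are illustrative assumptions, not the book's actual methods.

```python
# Toy illustration of the DM&KD setting described above: data points are
# Boolean vectors, and we look for a conjunction of attribute conditions
# that separates positive from negative training examples.
# (Hypothetical sketch; the book's methods are more elaborate.)

def learn_conjunction(positives, negatives, n_attrs):
    """Keep only the attribute values shared by every positive example,
    then check that the resulting conjunction rejects all negatives."""
    literals = {i: positives[0][i] for i in range(n_attrs)}
    for p in positives[1:]:
        for i in list(literals):
            if p[i] != literals[i]:
                del literals[i]  # attribute is not invariant over positives

    def classify(x):
        # A point is classified positive iff it satisfies every kept literal.
        return all(x[i] == v for i, v in literals.items())

    consistent = not any(classify(n) for n in negatives)
    return literals, classify, consistent

pos = [(1, 1, 0), (1, 1, 1)]   # positive training examples
neg = [(0, 1, 0), (1, 0, 1)]   # negative training examples
rule, classify, ok = learn_conjunction(pos, neg, 3)
# rule == {0: 1, 1: 1}: "attribute 0 is true and attribute 1 is true"
```

Here the learned Boolean function happens to separate the two training sets exactly; on noisy data a consistency check like `ok` would fail and a more refined pattern search would be needed.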
3. Textbook for a course in expert systems, if an emphasis is placed on Chapters 1 to 3 and on a selection of material from Chapters 4 to 7. There is also the option of using an additional commercially available shell for a programming project. In assigning a programming project, the instructor may use any part of a great variety of books covering many subjects, such as car repair. Instructions for most of the "weekend mechanic" books are close stylistically to expert system rules. Contents: Chapter 1 gives an introduction to the subject matter; it briefly presents basic concepts, history, and some perspectives of expert systems. Then it presents the architecture of an expert system and explains the stages of building an expert system. The concept of uncertainty in expert systems and the necessity of dealing with the phenomenon are then presented. The chapter ends with the description of a taxonomy of expert systems. Chapter 2 focuses on knowledge representation. Four basic ways to represent knowledge in expert systems are presented: first-order logic, production systems, semantic nets, and frames. Chapter 3 contains material about knowledge acquisition. Among machine learning techniques, a method of rule learning from examples is explained in detail. Then problems of rule-base verification are discussed. In particular, both consistency and completeness of the rule base are presented.
As of today, Evolutionary Computing and Fuzzy Set Computing are two mature, well-developed, and highly advanced technologies of information processing. Each of them has its own clearly defined research agenda, specific goals to be achieved, and a well-settled algorithmic environment. Concisely speaking, Evolutionary Computing (EC) is aimed at a coherent population-oriented methodology of structural and parametric optimization of a diversity of systems. In addition to this broad spectrum of optimization applications, this paradigm offers an important ability to cope with realistic goals and design objectives reflected in the form of relevant fitness functions. The GA search (which is often regarded as a dominant domain among other techniques of EC such as evolutionary strategies, genetic programming or evolutionary programming) delivers a great deal of efficiency in helping navigate through large search spaces. The main thrust of fuzzy sets is in representing and managing nonnumeric (linguistic) information. The key notion (whose conceptual as well as algorithmic importance has started to increase in recent years) is that of information granularity. It somewhat concurs with the principle of incompatibility coined by L. A. Zadeh. Fuzzy sets form a vehicle helpful in expressing the granular character of information to be captured. Once quantified via fuzzy sets or fuzzy relations, the domain knowledge can be used efficiently, very often reducing a heavy computational burden when analyzing and optimizing complex systems.
Text Retrieval and Filtering: Analytical Models of Performance is the first book that addresses the problem of analytically computing the performance of retrieval and filtering systems. The book describes means by which retrieval may be studied analytically, allowing one to describe current performance, predict future performance, and to understand why systems perform as they do. The focus is on retrieving and filtering natural language text, with material addressing retrieval performance for the simple case of queries with a single term, the more complex case with multiple terms, both with term independence and term dependence, and for the use of grammatical information to improve performance. Unambiguous statements of the conditions under which one method or system will be more effective than another are developed. Text Retrieval and Filtering: Analytical Models of Performance focuses on the performance of systems that retrieve natural language text, considering full sentences as well as phrases and individual words. The last chapter explicitly addresses how grammatical constructs and methods may be studied in the context of retrieval or filtering system performance. The book builds toward solving this problem, although the material in earlier chapters is as useful to those addressing non-linguistic, statistical concerns as it is to linguists. Those interested in grammatical information should be cautioned to carefully examine earlier chapters, especially Chapters 7 and 8, which discuss purely statistical relationships between terms, before moving on to Chapter 10, which explicitly addresses linguistic issues. Text Retrieval and Filtering: Analytical Models of Performance is suitable as a secondary text for a graduate level course on Information Retrieval or Linguistics, and as a reference for researchers and practitioners in industry.
On the 26th of November 1992 the organizing committee gathered together, at Luigi Salce's invitation, for the first time. The tradition of Italian conferences on abelian groups and modules (Rome 77, Udine 85, Bressanone 90) needed to be kept up by one more meeting. Since that first time it was clear to us that our goal was not so easy. In fact, the main intended topics of abelian groups, modules over commutative rings and non-commutative rings have become so specialized in recent years that it looked really ambitious to fit them into only one meeting. Anyway, since every one of us shared the same mathematical roots, we did want to emphasize a common link. So we elaborated the long symposium schedule: three days of abelian groups and three days of modules over non-commutative rings with a two-day bridge of commutative algebra in between. Many of the most famous names in these fields took part in the meeting. Over 140 participants, both attending and contributing to the 18 Main Lectures and 64 Communications (see list on page xv), provided a really wide audience for an Algebra meeting. Now that the meeting is over, we can say that our initial feeling was right.
Industrial development is essential to improving the standard of living in all countries. People's health and the environment can be affected, directly or indirectly, by routine waste discharges or by accidents. A series of recent major industrial accidents and the effects of pollution have highlighted, once again, the need for better management of routine and accidental risks. Moreover, the existence of natural hazards complicates the situation even more in any given region. In the past, efforts to cope with these risks, if made at all, have been largely on a plant-by-plant basis; some plants are well equipped to manage environmental and health hazards, while others are not. Managing the hazards of modern technological systems has become a key activity in highly industrialised countries. Decision makers are often confronted with complex issues concerning economic and social development, industrialisation and associated infrastructure needs, population and land use planning. Such issues have to be addressed in a way that ensures that public health will not be disrupted or substantially degraded. Due to the increasing complexity of technological systems and the higher geographical density of point hazard sources, new methodologies and a novel approach to these problems are challenging risk managers and regional planners. Risks from these new complex technological systems are inherently different from those addressed by risk managers decades ago.
In this monograph we study two generalizations of standard unification, E-unification and higher-order unification, using an abstract approach originated by Herbrand and developed in the case of standard first-order unification by Martelli and Montanari. The formalism presents the unification computation as a set of non-deterministic transformation rules for converting a set of equations to be unified into an explicit representation of a unifier (if such exists). This provides an abstract and mathematically elegant means of analysing the properties of unification in various settings by providing a clean separation of the logical issues from the specification of procedural information, and amounts to a set of 'inference rules' for unification, hence the title of this book. We derive the set of transformations for general E-unification and higher-order unification from an analysis of the sense in which terms are 'the same' after application of a unifying substitution. In both cases, this results in a simple extension of the set of basic transformations given by Herbrand and by Martelli and Montanari for standard unification, and shows clearly the basic relationships of the fundamental operations necessary in each case, and thus the underlying structure of the most important classes of term unification problems.
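The transformation-rule view of standard first-order unification that the authors generalize can be illustrated with a small Python sketch. The term encoding here (variables as strings, constants and compound terms as tuples) is an assumption made for brevity, not notation from the book.

```python
# Sketch of the transformation-rule view of standard first-order
# unification (Martelli-Montanari style). Encoding assumed here:
# variables are strings; constants and compound terms are tuples
# ('f', arg1, ..., argN), with a constant as a zero-argument tuple ('a',).

def occurs(v, t):
    return t == v if isinstance(t, str) else any(occurs(v, a) for a in t[1:])

def substitute(v, s, t):
    if isinstance(t, str):
        return s if t == v else t
    return (t[0],) + tuple(substitute(v, s, a) for a in t[1:])

def unify(equations):
    """Transform a set of equations into a substitution, or fail (None)."""
    eqs, subst = list(equations), {}
    while eqs:
        s, t = eqs.pop()
        if s == t:                                  # rule: delete trivial pair
            continue
        if isinstance(s, str):                      # rule: variable elimination
            if occurs(s, t):
                return None                         # occurs check fails
            eqs = [(substitute(s, t, a), substitute(s, t, b)) for a, b in eqs]
            subst = {v: substitute(s, t, u) for v, u in subst.items()}
            subst[s] = t
        elif isinstance(t, str):                    # rule: orient (swap sides)
            eqs.append((t, s))
        elif s[0] == t[0] and len(s) == len(t):     # rule: decompose
            eqs.extend(zip(s[1:], t[1:]))
        else:
            return None                             # rule: symbol clash
    return subst

# f(x, g(a)) =? f(g(y), g(y))  yields  {x: g(a), y: a}
mgu = unify([(('f', 'x', ('g', ('a',))), ('f', ('g', 'y'), ('g', 'y')))])
```

Each branch of the loop is one of the non-deterministic rules the paragraph describes; the E-unification and higher-order cases extend this rule set rather than replace it.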
The theory of constructive (recursive) models grew out of the work of Froehlich, Shepherdson, Mal'tsev, Kuznetsov, Rabin, and Vaught in the 1950s. Within the framework of this theory, algorithmic properties of abstract models are investigated by constructing representations on the set of natural numbers and studying relations between algorithmic and structural properties of these models. This book is a very readable exposition of the modern theory of constructive models and describes methods and approaches developed by representatives of the Siberian school of algebra and logic and some other researchers (in particular, Nerode and his colleagues). The main themes are the existence of recursive models and applications to fields, algebras, and ordered sets (Ershov), the existence of decidable prime models (Goncharov, Harrington), the existence of decidable saturated models (Morley), the existence of decidable homogeneous models (Goncharov and Peretyat'kin), properties of the Ehrenfeucht theories (Millar, Ash, and Reed), the theory of algorithmic dimension and conditions of autostability (Goncharov, Ash, Shore, Khusainov, Ventsov, and others), and the theory of computable classes of models with various properties. Future perspectives of the theory of constructive models are also discussed. Most of the results in the book are presented in monograph form for the first time. The theory of constructive models serves as a basis for recursive mathematics. It is also useful in computer science, in particular, in the study of programming languages, higher level languages of specification, abstract data types, and problems of synthesis and verification of programs. Therefore, the book will be useful for not only specialists in mathematical logic and the theory of algorithms but also for scientists interested in the mathematical fundamentals of computer science. The authors are eminent specialists in mathematical logic.
They have established fundamental results on elementary theories, model theory, the theory of algorithms, field theory, group theory, applied logic, computable numberings, the theory of constructive models, and theoretical computer science.
Fuzzy data such as marks, scores, verbal evaluations, imprecise observations, experts' opinions and grey tone pictures, are quite common. In Fuzzy Data Analysis the authors collect their recent results providing the reader with ideas, approaches and methods for processing such data when looking for sub-structures in knowledge bases for an evaluation of functional relationship, e.g. in order to specify diagnostic or control systems. The modelling presented uses ideas from fuzzy set theory and the suggested methods solve problems usually tackled by data analysis if the data are real numbers. Fuzzy Data Analysis is self-contained and is addressed to mathematicians oriented towards applications and to practitioners in any field of application who have some background in mathematics and statistics.
Call-by-push-value is a programming language paradigm that,
surprisingly, breaks down the call-by-value and call-by-name
paradigms into simple primitives. This monograph, written for
graduate students and researchers, exposes the call-by-push-value
structure underlying a remarkable range of semantics, including
operational semantics, domains, possible worlds, continuations and
games.
Since their inception, fuzzy sets and fuzzy logic have become popular. The reason is that the very idea of fuzzy sets and fuzzy logic attacks an old tradition in science, namely bivalent (black-or-white, all-or-none) judgment and reasoning and the resulting approach to the formation of scientific theories and models of reality. The idea of fuzzy logic, briefly speaking, is just the opposite of this tradition: instead of full truth and falsity, our judgment and reasoning also involve intermediate truth values. Application of this idea to various fields has become known under the term fuzzy approach (or graded truth approach). Both practice (many successful engineering applications) and theory (interesting nontrivial contributions and broad interest of mathematicians, logicians, and engineers) have proven the usefulness of the fuzzy approach. One of the most successful areas of fuzzy methods is the application of fuzzy relational modeling. Fuzzy relations represent formal means for modeling rather nontrivial phenomena (reasoning, decision, control, knowledge extraction, systems analysis and design, etc.) in the presence of a particular kind of indeterminacy called vagueness. Models and methods based on fuzzy relations are often described by logical formulas (or by natural language statements that can be translated into logical formulas). Therefore, in order to approach these models and methods in an appropriate formal way, it is desirable to have a general theory of fuzzy relational systems with basic connections to (formal) language which enables us to describe relationships in these systems.
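As a small concrete illustration of the fuzzy-relation machinery such a theory builds on (the operation is standard, but the matrices below are invented for this example), the sup-min composition of two fuzzy relations given as membership matrices can be computed directly:

```python
# Sup-min composition of two fuzzy relations R and S, each given as a
# membership matrix with grades in [0, 1]. Example data are invented.

def sup_min_compose(R, S):
    """(R o S)(x, z) = sup over y of min(R(x, y), S(y, z))."""
    return [[max(min(R[i][k], S[k][j]) for k in range(len(S)))
             for j in range(len(S[0]))]
            for i in range(len(R))]

R = [[0.8, 0.3],
     [0.2, 1.0]]
S = [[0.5, 0.9],
     [0.4, 0.1]]
T = sup_min_compose(R, S)   # [[0.5, 0.8], [0.4, 0.2]]
```

Composition of this kind is the relational analogue of matrix multiplication, with min playing the role of product and max (sup) the role of sum; it underlies the fuzzy relational equations mentioned in models of reasoning and control.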
This monograph covers the recent major advances in various areas of set theory. From the reviews: "One of the classical textbooks and reference books in set theory....The present Third Millennium edition...is a whole new book. In three parts the author offers us what in his view every young set theorist should learn and master....This well-written book promises to influence the next generation of set theorists, much as its predecessor has done." --MATHEMATICAL REVIEWS