![]() |
Welcome to Loot.co.za!
Sign in / Register |Wishlists & Gift Vouchers |Help | Advanced search
|
Your cart is empty |
||
|
Books > Science & Mathematics > Mathematics > Mathematical foundations
This monograph details the use of Siegel's method and classical results on the homotopy groups of spheres and Lie groups to determine some Gottlieb groups of projective spaces, or to give lower bounds for their orders. Making use of the properties of Whitehead products, the authors also determine some Whitehead center groups of projective spaces; these results are new to this monograph.
This book contains the proceedings of the International Symposium on Mathematical Morphology and its Applications to Image and Signal Processing IV, held June 3-5, 1998, in Amsterdam, The Netherlands. The purpose of the work is to provide the image analysis community with a sampling of recent developments in theoretical and practical aspects of mathematical morphology and its applications to image and signal processing. Among the areas covered are: digitization and connectivity, skeletonization, multivariate morphology, morphological segmentation, color image processing, filter design, gray-scale morphology, fuzzy morphology, decomposition of morphological operators, random sets and statistical inference, differential morphology and scale-space, morphological algorithms and applications. Audience: This volume will be of interest to research mathematicians and computer scientists whose work involves mathematical morphology, image and signal processing.
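Mathematical morphology builds most of the operations listed above from two elementary operators, erosion and dilation. As a rough, self-contained illustration of the binary case (a sketch using the standard textbook definitions, not code from the proceedings):

```python
import numpy as np

def binary_erosion(img: np.ndarray, se: np.ndarray) -> np.ndarray:
    """Erosion: a pixel survives only if the structuring element,
    centered there, fits entirely inside the foreground."""
    H, W = img.shape
    h, w = se.shape
    ph, pw = h // 2, w // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), constant_values=0)
    out = np.zeros_like(img)
    for i in range(H):
        for j in range(W):
            window = padded[i:i + h, j:j + w]
            out[i, j] = np.all(window[se == 1] == 1)
    return out

def binary_dilation(img: np.ndarray, se: np.ndarray) -> np.ndarray:
    """Dilation: a pixel is set if the (reflected) structuring element
    hits the foreground anywhere."""
    H, W = img.shape
    h, w = se.shape
    ph, pw = h // 2, w // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), constant_values=0)
    se_reflected = se[::-1, ::-1]
    out = np.zeros_like(img)
    for i in range(H):
        for j in range(W):
            window = padded[i:i + h, j:j + w]
            out[i, j] = np.any(window[se_reflected == 1] == 1)
    return out

img = np.zeros((7, 7), dtype=int)
img[2:5, 2:5] = 1                   # a 3x3 foreground square
se = np.ones((3, 3), dtype=int)     # 3x3 structuring element
print(binary_erosion(img, se))      # shrinks to the single center pixel
print(binary_dilation(img, se))     # grows to a 5x5 square
```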
Many years of practical experience in teaching discrete mathematics form the basis of this textbook. Part I contains problems on such topics as Boolean algebra, k-valued logics, graphs and networks, elements of coding theory, automata theory, algorithms theory, combinatorics, Boolean minimization and logical design. The exercises are preceded by ample theoretical background material. For further study the reader is referred to the extensive bibliography. Part II follows the same structure as Part I, and gives helpful hints and solutions. Audience: This book will be of great value to undergraduate students of discrete mathematics, whereas the more difficult exercises, which comprise about one-third of the material, will also appeal to postgraduates and researchers.
Fundamentals of Fuzzy Sets covers the basic elements of fuzzy set theory. Its four-part organization provides easy referencing of recent as well as older results in the field. The first part discusses the historical emergence of fuzzy sets, and delves into fuzzy set connectives, and the representation and measurement of membership functions. The second part covers fuzzy relations, including orderings, similarity, and relational equations. The third part, devoted to uncertainty modelling, introduces possibility theory, contrasting and relating it with probabilities, and reviews information measures of specificity and fuzziness. The last part concerns fuzzy sets on the real line - computation with fuzzy intervals, metric topology of fuzzy numbers, and the calculus of fuzzy-valued functions. Each chapter is written by one or more recognized specialists and offers a tutorial introduction to the topics, together with an extensive bibliography.
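To make the "fuzzy set connectives" of the first part concrete, here is a minimal sketch of Zadeh's original min/max connectives; the membership values are invented for illustration and do not come from the book:

```python
# Classical fuzzy set connectives (Zadeh's min/max operators).

def fuzzy_union(mu_a: float, mu_b: float) -> float:
    return max(mu_a, mu_b)          # standard fuzzy OR

def fuzzy_intersection(mu_a: float, mu_b: float) -> float:
    return min(mu_a, mu_b)          # standard fuzzy AND

def fuzzy_complement(mu_a: float) -> float:
    return 1.0 - mu_a               # standard fuzzy NOT

# Membership of the value 20 degrees C in two fuzzy sets:
warm, hot = 0.7, 0.2
print(fuzzy_union(warm, hot))          # 0.7
print(fuzzy_intersection(warm, hot))   # 0.2
print(fuzzy_complement(warm))          # 0.3 (up to float rounding)
```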
Fuzzy logic techniques have seen extraordinary growth in various engineering systems. Developments in the engineering sciences have raised new concerns in recent years, owing to high-tech industrial processes with ever-increasing levels of complexity. Advanced Fuzzy Logic Approaches in Engineering Science provides innovative insights into a comprehensive range of soft fuzzy logic techniques applied to various engineering problems, such as fuzzy set theory, adaptive neuro-fuzzy inference systems, and hybrid fuzzy logic-genetic algorithms and belief networks in industrial and engineering settings. The content within this publication also covers particle swarms, fuzzy computing, and rough sets. It is a vital reference source for engineers, research scientists, academicians, and graduate-level students seeking coverage of topics centered on the applications of fuzzy logic in high-tech industrial processes.
This is the first book on cut-elimination in first-order predicate logic from an algorithmic point of view. Instead of just proving the existence of cut-free proofs, it focuses on the algorithmic methods transforming proofs with arbitrary cuts to proofs with only atomic cuts (atomic cut normal forms, so-called ACNFs). The first part investigates traditional reductive methods from the point of view of proof rewriting. Within this general framework, generalizations of Gentzen's and Schütte-Tait's cut-elimination methods are defined and shown terminating with ACNFs of the original proof. Moreover, a complexity-theoretic comparison of Gentzen's and Tait's methods is given. The core of the book centers around the cut-elimination method CERES (cut-elimination by resolution) developed by the authors. CERES is based on the resolution calculus and radically differs from the reductive cut-elimination methods. The book shows that CERES asymptotically outperforms all reductive methods based on Gentzen's cut-reduction rules. It obtains this result by heavy use of subsumption theorems in clause logic. Moreover, several applications of CERES are given (to interpolation, complexity analysis of cut-elimination, generalization of proofs, and to the analysis of real mathematical proofs). Lastly, the book demonstrates that CERES can be extended to nonclassical logics, in particular to finitely-valued logics and to Gödel logic.
The main characteristic of the real-world decision-making problems facing humans today is that they are multidimensional and have multiple objectives, including economic, environmental, social, and technical ones. Hence, it seems natural that the consideration of many objectives in the actual decision-making process requires multiobjective approaches rather than single-objective ones. One of the major systems-analytic multiobjective approaches to decision-making under constraints is multiobjective optimization, a generalization of traditional single-objective optimization. Although multiobjective optimization problems differ from single-objective optimization problems only in the plurality of objective functions, it is significant to realize that multiple objectives are often noncommensurable and conflict with each other in multiobjective optimization problems. With this observation, in multiobjective optimization, the notion of Pareto optimality or efficiency has been introduced instead of the optimality concept for single-objective optimization. However, decisions with Pareto optimality or efficiency are not uniquely determined; the final decision must be selected from among the set of Pareto optimal or efficient solutions. Therefore, the question is, how does one find the preferred point as a compromise or satisficing solution by a rational procedure? This is the starting point of multiobjective optimization. To be more specific, the aim is to determine how one derives a compromise or satisficing solution of a decision maker (DM), one which well represents the DM's subjective judgments, from a Pareto optimal or efficient solution set.
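As a concrete illustration of the Pareto-optimality notion described above, the following sketch filters a set of candidate solutions down to its Pareto front for two minimized objectives; the candidate data are invented, not taken from the book:

```python
from typing import List, Tuple

def dominates(a: Tuple[float, float], b: Tuple[float, float]) -> bool:
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (both objectives are minimized)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Keep exactly the points not dominated by any other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# (cost, environmental impact) for five candidate designs:
candidates = [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0), (2.5, 3.5), (4.0, 4.0)]
print(pareto_front(candidates))   # [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0)]
```

The three surviving points are mutually noncomparable, which is exactly why a decision maker must still select a compromise among them.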
Category theory is a branch of abstract algebra with incredibly diverse applications. This text and reference book is aimed not only at mathematicians, but also at researchers and students of computer science, logic, linguistics, cognitive science, philosophy, and any of the other fields in which the ideas are being applied. Containing clear definitions of the essential concepts, illuminated with numerous accessible examples, and providing full proofs of all important propositions and theorems, this book aims to make the basic ideas, theorems, and methods of category theory understandable to this broad readership.
This second edition of "A Beginner's Guide to Finite Mathematics" takes a distinctly applied approach to finite mathematics at the freshman and sophomore level. Topics are presented sequentially: the book opens with a brief review of sets and numbers, followed by an introduction to data sets, histograms, means and medians. Counting techniques and the Binomial Theorem are covered, which provides the foundation for elementary probability theory; this, in turn, leads to basic statistics. This new edition includes chapters on game theory and financial mathematics. Requiring little mathematical background beyond high school algebra, the text will be especially useful for business and liberal arts majors.
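The path from counting to probability that this blurb describes can be seen in one line of arithmetic: a binomial coefficient counts outcomes, and the Binomial Theorem's terms give their probabilities. A small sketch (the coin-flip numbers are illustrative, not from the book):

```python
from math import comb

n, k, p = 10, 3, 0.5
ways = comb(n, k)                       # C(10, 3) = 120 ways to choose the heads
prob = ways * p**k * (1 - p)**(n - k)   # P(exactly 3 heads in 10 fair flips)
print(ways, prob)                       # 120 0.1171875
```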
* An emphasis on the art of proof.
* An enhanced number theory chapter presents some easily accessible but still unsolved problems, including the Goldbach conjecture and the twin-prime conjecture (see the sketch below).
* The discussion of equivalence relations is revised to present reflexivity, symmetry, and transitivity before equivalence relations are defined.
* The discussion of the RSA cryptosystem in Chapter 10 is expanded.
* Groups are introduced much earlier, since group theory is an incisive example of an axiomatic theory; the coverage formerly in Chapter 11 has been moved up accordingly.
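The Goldbach conjecture mentioned above asserts that every even integer greater than 2 is the sum of two primes. A small sketch that verifies this for small even numbers; such a finite check is, of course, no proof:

```python
def is_prime(n: int) -> bool:
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n**0.5) + 1))

def goldbach_pair(n: int):
    """Return one decomposition n = p + q with p, q prime, or None."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return p, n - p
    return None

for n in range(4, 21, 2):
    print(n, goldbach_pair(n))   # e.g. 4 (2, 2), 6 (3, 3), 8 (3, 5), ...
```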
Approximation Theory, Wavelets and Applications draws together the latest developments in the subject, provides directions for future research, and paves the way for collaborative research. The main topics covered include constructive multivariate approximation, the theory of splines, spline wavelets, polynomial and trigonometric wavelets, interpolation theory, and polynomial and rational approximation. Among the scientific applications are de-noising using wavelets, including the de-noising of speech and images, and signal and digital image processing. In the area of the approximation of functions the main topics include multivariate interpolation, quasi-interpolation, polynomial approximation with weights, knot removal for scattered data, convergence theorems in Padé theory, Lyapunov theory in approximation, Neville elimination as applied to the shape-preserving presentation of curves, interpolating positive linear operators, interpolation from a convex subset of Hilbert space, and interpolation on the triangle and simplex. Wavelet theory is growing extremely rapidly and has applications which will interest readers in the physical, medical, engineering and social sciences.
In 1907 Luitzen Egbertus Jan Brouwer defended his doctoral dissertation on the foundations of mathematics, and with this event the modern version of mathematical intuitionism came into being. Brouwer attacked the main currents of the philosophy of mathematics: the formalists and the Platonists. In turn, both these schools began viewing intuitionism as the most harmful party among all known philosophies of mathematics. That was the origin of the now-90-year-old debate over intuitionism. As both sides have appealed in their arguments to philosophical propositions, the discussions have attracted the attention of philosophers as well. One might ask here what role a philosopher can play in controversies over mathematical intuitionism. Can he reasonably enter into disputes among mathematicians? I believe that these disputes call for intervention by a philosopher. The three best-known arguments for intuitionism, those of Brouwer, Heyting and Dummett, are based on ontological and epistemological claims, or appeal to theses that properly belong to a theory of meaning. Those lines of argument should be investigated in order to find what their assumptions are, whether intuitionistic consequences really follow from those assumptions, and finally, whether the premises are sound and not absurd. The intention of this book is thus to consider seriously the arguments of mathematicians, even if philosophy was not their main field of interest. There is little sense in disputing whether what mathematicians said about the objectivity and reality of mathematical facts belongs to philosophy or not.
This volume contains the proceedings of the conference Logical Foundations of Mathematics, Computer Science, and Physics - Kurt Gödel's Legacy, held in Brno, Czech Republic, on the 90th anniversary of his birth. The wide and continuing importance of Gödel's work in the logical foundations of mathematics, computer science, and physics is confirmed by the broad range of speakers who participated in making this gathering a scientific event.
In Western Civilization, Mathematics and Music have a long and interesting history in common, with several interactions, traditionally associated with the name of Pythagoras but also with a significant number of other mathematicians, like Leibniz, for instance. Mathematical models can be found for almost all levels of musical activity, from composition to sound production by traditional instruments or by digital means. Modern music theory has been incorporating more and more mathematical content during the last decades. This book offers a journey into recent work relating music and mathematics. It contains a large variety of articles, covering the historical aspects, the influence of logic and mathematical thought in composition, the perception and understanding of music, and the computational aspects of musical sound processing. The authors illustrate the rich and deep interactions that exist between Mathematics and Music.
The nationwide research project 'Deduktion', funded by the Deutsche Forschungsgemeinschaft (DFG) for a period of six years, brought together almost all research groups within Germany engaged in the field of automated reasoning. Intensive cooperation and exchange of ideas led to considerable progress both in the theoretical foundations and in the application of deductive knowledge. This three-volume book covers these original contributions moulded into the state of the art of automated deduction. The three volumes are intended to document and advance a development in the field of automated deduction that can now be observed all over the world. Rather than restricting the interest to purely academic research, the focus now is on the investigation of problems derived from realistic applications. In fact industrial applications are already pursued on a trial basis. In consequence the emphasis of the volumes is not on the presentation of the theoretical foundations of logical deduction as such, as in a handbook; rather the books present the concepts and methods now available in automated deduction in a form which can be easily accessed by scientists working in applications outside of the field of deduction. This reflects the strong conviction that automated deduction is on the verge of being fully included in the evolution of technology. Volume I focuses on basic research in deduction and on the knowledge on which modern deductive systems are based. Volume II presents techniques of implementation and details about system building. Volume III deals with applications of deductive techniques mainly, but not exclusively, to mathematics and the verification of software. Each chapter was read by two referees, one an international expert from abroad and the other a knowledgeable participant in the national project. It has been accepted for inclusion on the basis of these review reports. Audience: Researchers and developers in software engineering, formal methods, certification, verification, validation, specification of complex systems and software, expert systems, natural language processing.
This book presents an in-depth and critical reconstruction of Prawitz's epistemic grounding, and discusses it within the broader field of proof-theoretic semantics. The theory of grounds is also provided with a formal framework, through which several relevant results are proved. Investigating Prawitz's theory of grounds, this work answers one of the most fundamental questions in logic: why and how do some inferences have the epistemic power to compel us to accept their conclusion, if we have accepted their premises? Prawitz proposes an innovative description of inferential acts, as applications of constructive operations on grounds for the premises, yielding a ground for the conclusion. The book is divided into three parts. In the first, the author discusses the reasons that have led Prawitz to abandon his previous semantics of valid arguments and proofs. The second part presents Prawitz's grounding as found in his ground-theoretic papers. Finally, in the third part, a formal apparatus is developed, consisting of a class of languages whose terms are equipped with denotation functions associating them to operations and grounds, as well as of a class of systems where important properties of the terms can be proved.
This monograph is the first in Fuzzy Approximation Theory. It contains mostly the author's research work on fuzziness of the last ten years, relies a lot on [10]-[32], and is a natural outgrowth of them. It belongs to the broader area of Fuzzy Mathematics. Chapters are self-contained, and several advanced courses can be taught out of this book. We provide lots of applications, but always within the framework of Fuzzy Mathematics. Background and motivation are given in each chapter. A complete list of references is provided at the end. The topics covered are very diverse. In Chapter 1 we give an extensive basic background on Fuzziness and Fuzzy Real Analysis, as well as a complete description of the book. In the following Chapters 2 and 3 we cover in depth Fuzzy Differentiation and Integration Theory; e.g., we present Fuzzy Taylor Formulae. Chapter 4 follows, on Fuzzy Ostrowski Inequalities. Then in Chapters 5 and 6 we present results on classical algebraic and trigonometric polynomial Fuzzy Approximation.
The papers in this volume represent a selection of updated talks which were presented at an SDS-sponsored International Workshop in Pamporovo, Bulgaria, in September 1990. The aim of the text is to bring the reader up to date on research in set-valued analysis and differential inclusions.
1. Interpolation problems play an important role in both theoretical and applied investigations. This explains the great number of works dedicated to classical and new interpolation problems ([1]-[5], [8], [13]-[16], [26]-[30], [57]). In this book we use a method of operator identities for investigating interpolation problems. Following the method of operator identities, we formulate a general interpolation problem containing the classical interpolation problems (Nevanlinna-Pick, Carathéodory, Schur, Hamburger, Krein) as particular cases. We write down the abstract form of the Potapov inequality. By solving this inequality we give a description of the set of solutions of the general interpolation problem in terms of the linear-fractional transformation. Then we apply the obtained general results to a number of classical and new interpolation problems. Some chapters of the book are dedicated to the application of the interpolation theory results to several other problems (the extension problem, generalized stationary processes, spectral theory, nonlinear integrable equations, functions with operator arguments). 2. Now we shall proceed to a more detailed description of the book contents.
In the beginning of 1983, I came across A. Kaufmann's book "Introduction to the Theory of Fuzzy Sets" (Academic Press, New York, 1975). This was my first acquaintance with fuzzy set theory. Then I tried to introduce a new component (which determines the degree of non-membership) in the definition of these sets and to study the properties of the new objects so defined. I defined the ordinary operations "∩", "∪", "+" and "·" over the new sets, but I began to look more seriously at them in April 1983, when I defined operators analogous to the modal operators of "necessity" and "possibility". The late George Gargov (7 April 1947 - 9 November 1996) is the "godfather" of the sets I introduced: in fact, he invented the name "intuitionistic fuzzy", motivated by the fact that the law of the excluded middle does not hold for them. Presently, intuitionistic fuzzy sets are an object of intensive research by scholars and scientists from over ten countries. This book is the first attempt at a more comprehensive and complete report on the intuitionistic fuzzy set theory and its more relevant applications in a variety of diverse fields. In this sense, it also has a referential character.
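For readers unfamiliar with the construction, the following minimal sketch shows an intuitionistic fuzzy pair (membership mu, non-membership nu, with mu + nu <= 1) together with the standard union, intersection, and modal operators mentioned above; the operator definitions are the standard published ones, while the numbers are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class IFPair:
    mu: float   # degree of membership
    nu: float   # degree of non-membership; the slack 1 - mu - nu
                # is the hesitation margin
    def __post_init__(self):
        assert 0.0 <= self.mu and 0.0 <= self.nu and self.mu + self.nu <= 1.0

def union(a: IFPair, b: IFPair) -> IFPair:
    return IFPair(max(a.mu, b.mu), min(a.nu, b.nu))

def intersection(a: IFPair, b: IFPair) -> IFPair:
    return IFPair(min(a.mu, b.mu), max(a.nu, b.nu))

def necessity(a: IFPair) -> IFPair:      # "necessity" modal operator
    return IFPair(a.mu, 1.0 - a.mu)

def possibility(a: IFPair) -> IFPair:    # "possibility" modal operator
    return IFPair(1.0 - a.nu, a.nu)

a, b = IFPair(0.5, 0.3), IFPair(0.4, 0.4)
print(union(a, b))      # IFPair(mu=0.5, nu=0.3)
print(necessity(a))     # IFPair(mu=0.5, nu=0.5)
print(possibility(a))   # IFPair(mu=0.7, nu=0.3)
```

Note how the hesitation margin makes the excluded middle fail: for a = (0.5, 0.3), the union of a with its complement (0.3, 0.5) is (0.5, 0.3), not the whole universe (1, 0).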
In this two-volume compilation of articles, leading researchers reevaluate the success of Hilbert's axiomatic method, which not only laid the foundations for our understanding of modern mathematics, but also found applications in physics, computer science and elsewhere. The title takes its name from David Hilbert's seminal talk Axiomatisches Denken, given at a meeting of the Swiss Mathematical Society in Zurich in 1917. This marked the beginning of Hilbert's return to his foundational studies, which ultimately resulted in the establishment of proof theory as a new branch in the emerging field of mathematical logic. Hilbert also used the opportunity to bring Paul Bernays back to Goettingen as his main collaborator in foundational studies in the years to come. The contributions are addressed to mathematical and philosophical logicians, but also to philosophers of science as well as physicists and computer scientists with an interest in foundations.
Since the late 1980s, a large number of very user-friendly tools for fuzzy control, fuzzy expert systems, and fuzzy data analysis have emerged. This has changed the character of the area and started the era of 'fuzzy technology'. The next large step in the development occurred in 1992 when, almost independently in Europe, Japan and the USA, the three areas of fuzzy technology, artificial neural nets and genetic algorithms joined forces under the title of 'computational intelligence' or 'soft computing'. The synergies possible between these three areas have been exploited very successfully. Practical Applications of Fuzzy Sets focuses on model and real applications of fuzzy sets, and is structured into four major parts: engineering and natural sciences; medicine; management; and behavioral, cognitive and social sciences. This book will be useful for practitioners of fuzzy technology, scientists and students who are looking for applications of their models and methods, for topics of their theses, and even for venture capitalists who look for attractive possibilities for investments.
This book defines a logical system called the Protocol-theoretic Logic of Epistemic Norms (PLEN), develops PLEN into a formal framework for representing and reasoning about epistemic norms, and shows that PLEN is theoretically interesting and useful with regard to the aims of such a framework. In order to motivate the project, the author defends an account of epistemic norms called epistemic proceduralism. The core of this view is the idea that, in virtue of their indispensable, regulative role in cognitive life, epistemic norms are closely intertwined with procedural rules that restrict epistemic actions, procedures, and processes. The resulting organizing principle of the book is that epistemic norms are protocols for epistemic planning and control. The core of the book is the development of PLEN, which is essentially a novel variant of propositional dynamic logic (PDL) distinguished by more or less elaborate revisions of PDL's syntax and semantics. The syntax encodes the procedural content of epistemic norms by means of the well-known protocol or program constructions of dynamic and epistemic logics. It then provides a novel language of operators on protocols, including a range of unique protocol equivalence relations, syntactic operations on protocols, and various procedural relations among protocols, in addition to the standard dynamic (modal) operators of PDL. The semantics of the system then interprets protocol expressions and expressions embedding protocols over a class of directed multigraph-like structures rather than the standard labeled transition systems or modal frames. The intent of the system is to better represent epistemic dynamics, build a logic of protocols atop it, and then show that the resulting logic of protocols is useful as a logical framework for epistemic norms. The resulting theory of epistemic norms centers on notions of norm equivalence derived from theories of process equivalence familiar from the study of dynamic and modal logics. The canonical account of protocol equivalence in PLEN turns out to possess a number of interesting formal features, including satisfaction of important conditions on hyperintensional equivalence, a matter of recently recognized importance in the logic of norms generally. To show that the system is interesting and useful as a framework for representing and reasoning about epistemic norms, the author applies the logical system to the analysis of epistemic deontic operators and, partly on the basis of this, establishes representation theorems linking protocols to the action-guiding content of epistemic norms. The protocol-theoretic logic of epistemic norms is then shown to almost immediately validate the main principles of epistemic proceduralism.