This is a monograph that details the use of Siegel's method and the classical results on homotopy groups of spheres and Lie groups to determine some Gottlieb groups of projective spaces or to give lower bounds for their orders. Making use of the properties of Whitehead products, the authors also determine some Whitehead center groups of projective spaces; these results are relevant and new within this monograph.
Introduction to Fuzzy Reliability treats fuzzy methodology in hardware reliability and software reliability in a relatively systematic manner. The contents of this book are organized as follows. Chapter 1 places reliability engineering in the scope of a broader area, i.e. system failure engineering. Readers will find that although this book is confined to hardware and software reliability, it may be useful for other aspects of system failure engineering, like maintenance and quality control. Chapter 2 contains the elementary knowledge of fuzzy sets and possibility spaces which are required reading for the rest of this book. This chapter is included for the overall completeness of the book, but a few points (e.g. definition of conditional possibility and existence theorem of possibility space) may be new. Chapter 3 discusses how to calculate probist system reliability when the component reliabilities are represented by fuzzy numbers, and how to analyze fault trees when probabilities of basic events are fuzzy. Chapter 4 presents the basic theory of profust reliability, whereas Chapter 5 analyzes the profust reliability behavior of a number of engineering systems. Chapters 6 and 7 are devoted to probist reliability theory from two different perspectives. Chapter 8 discusses how to model software reliability behavior by using fuzzy methodology. Chapter 9 includes a number of mathematical problems which are raised by applications of fuzzy methodology in hardware and software reliability, but may be important for fuzzy set and possibility theories.
Boolean valued analysis is a technique for studying properties of an arbitrary mathematical object by comparing its representations in two different set-theoretic models whose construction utilises principally distinct Boolean algebras. The use of two models for studying a single object is a characteristic of the so-called non-standard methods of analysis. Application of Boolean valued models to problems of analysis rests ultimately on the procedures of ascending and descending, the two natural functors acting between a new Boolean valued universe and the von Neumann universe. This book demonstrates the main advantages of Boolean valued analysis which provides the tools for transforming, for example, function spaces to subsets of the reals, operators to functionals, and vector-functions to numerical mappings. Boolean valued representations of algebraic systems, Banach spaces, and involutive algebras are examined thoroughly. Audience: This volume is intended for classical analysts seeking new tools, and for model theorists in search of challenging applications of nonstandard models.
In this two-volume compilation of articles, leading researchers reevaluate the success of Hilbert's axiomatic method, which not only laid the foundations for our understanding of modern mathematics, but also found applications in physics, computer science and elsewhere. The title takes its name from David Hilbert's seminal talk Axiomatisches Denken, given at a meeting of the Swiss Mathematical Society in Zurich in 1917. This marked the beginning of Hilbert's return to his foundational studies, which ultimately resulted in the establishment of proof theory as a new branch in the emerging field of mathematical logic. Hilbert also used the opportunity to bring Paul Bernays back to Goettingen as his main collaborator in foundational studies in the years to come. The contributions are addressed to mathematical and philosophical logicians, but also to philosophers of science as well as physicists and computer scientists with an interest in foundations.
This book defines a logical system called the Protocol-theoretic Logic of Epistemic Norms (PLEN), develops PLEN into a formal framework for representing and reasoning about epistemic norms, and shows that PLEN is theoretically interesting and useful with regard to the aims of such a framework. In order to motivate the project, the author defends an account of epistemic norms called epistemic proceduralism. The core of this view is the idea that, in virtue of their indispensable, regulative role in cognitive life, epistemic norms are closely intertwined with procedural rules that restrict epistemic actions, procedures, and processes. The resulting organizing principle of the book is that epistemic norms are protocols for epistemic planning and control. The core of the book is the development of PLEN, which is essentially a novel variant of propositional dynamic logic (PDL) distinguished by more or less elaborate revisions of PDL's syntax and semantics. The syntax encodes the procedural content of epistemic norms by means of the well-known protocol or program constructions of dynamic and epistemic logics. It then provides a novel language of operators on protocols, including a range of unique protocol equivalence relations, syntactic operations on protocols, and various procedural relations among protocols in addition to the standard dynamic (modal) operators of PDL. The semantics of the system then interprets protocol expressions and expressions embedding protocols over a class of directed multigraph-like structures rather than the standard labeled transition systems or modal frames. The intent of the system is to better represent epistemic dynamics, build a logic of protocols atop it, and then show that the resulting logic of protocols is useful as a logical framework for epistemic norms. The resulting theory of epistemic norms centers on notions of norm equivalence derived from theories of process equivalence familiar from the study of dynamic and modal logics. The canonical account of protocol equivalence in PLEN turns out to possess a number of interesting formal features, including satisfaction of important conditions on hyperintensional equivalence, a matter of recently recognized importance in the logic of norms generally. To show that the system is interesting and useful as a framework for representing and reasoning about epistemic norms, the author applies the logical system to the analysis of epistemic deontic operators and, partly on the basis of this, establishes representation theorems linking protocols to the action-guiding content of epistemic norms. The protocol-theoretic logic of epistemic norms is then shown to almost immediately validate the main principles of epistemic proceduralism.
Many years of practical experience in teaching discrete mathematics form the basis of this text book. Part I contains problems on such topics as Boolean algebra, k-valued logics, graphs and networks, elements of coding theory, automata theory, algorithms theory, combinatorics, Boolean minimization and logical design. The exercises are preceded by ample theoretical background material. For further study the reader is referred to the extensive bibliography. Part II follows the same structure as Part I, and gives helpful hints and solutions. Audience: This book will be of great value to undergraduate students of discrete mathematics, whereas the more difficult exercises, which comprise about one-third of the material, will also appeal to postgraduates and researchers.
This book contains the proceedings of the International Symposium on Mathematical Morphology and its Applications to Image and Signal Processing IV, held June 3-5, 1998, in Amsterdam, The Netherlands. The purpose of the work is to provide the image analysis community with a sampling of recent developments in theoretical and practical aspects of mathematical morphology and its applications to image and signal processing. Among the areas covered are: digitization and connectivity, skeletonization, multivariate morphology, morphological segmentation, color image processing, filter design, gray-scale morphology, fuzzy morphology, decomposition of morphological operators, random sets and statistical inference, differential morphology and scale-space, morphological algorithms and applications. Audience: This volume will be of interest to research mathematicians and computer scientists whose work involves mathematical morphology, image and signal processing.
Fundamentals of Fuzzy Sets covers the basic elements of fuzzy set theory. Its four-part organization provides easy referencing of recent as well as older results in the field. The first part discusses the historical emergence of fuzzy sets, and delves into fuzzy set connectives, and the representation and measurement of membership functions. The second part covers fuzzy relations, including orderings, similarity, and relational equations. The third part, devoted to uncertainty modelling, introduces possibility theory, contrasting and relating it with probabilities, and reviews information measures of specificity and fuzziness. The last part concerns fuzzy sets on the real line - computation with fuzzy intervals, metric topology of fuzzy numbers, and the calculus of fuzzy-valued functions. Each chapter is written by one or more recognized specialists and offers a tutorial introduction to the topics, together with an extensive bibliography.
This second edition of "A Beginner's Guide to Finite Mathematics" takes a distinctly applied approach to finite mathematics at the freshman and sophomore level. Topics are presented sequentially: the book opens with a brief review of sets and numbers, followed by an introduction to data sets, histograms, means and medians. Counting techniques and the Binomial Theorem are covered, providing the foundation for elementary probability theory; this, in turn, leads to basic statistics. This new edition includes chapters on game theory and financial mathematics. Requiring little mathematical background beyond high school algebra, the text will be especially useful for business and liberal arts majors.
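As a rough, hedged illustration of the counting-to-probability step this blurb alludes to (the example and numbers are mine, not taken from the book), the following Python sketch uses a binomial coefficient to compute the probability of exactly k heads in n fair coin tosses:

```python
from math import comb  # binomial coefficient C(n, k), available in Python 3.8+

def binomial_probability(n, k, p=0.5):
    # Probability of exactly k successes in n independent trials,
    # each succeeding with probability p: C(n, k) * p^k * (1-p)^(n-k).
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Example: exactly 3 heads in 5 fair tosses = C(5, 3) / 2^5 = 10/32
print(binomial_probability(5, 3))  # 0.3125
```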
This open access book offers a self-contained introduction to the homotopy theory of simplicial and dendroidal sets and spaces. These are essential for the study of categories, operads, and algebraic structure up to coherent homotopy. The dendroidal theory combines the combinatorics of trees with the theory of Quillen model categories. Dendroidal sets are a natural generalization of simplicial sets from the point of view of operads. In this book, the simplicial approach to higher category theory is generalized to a dendroidal approach to higher operad theory. This dendroidal theory of higher operads is carefully developed in this book. The book also provides an original account of the more established simplicial approach to infinity-categories, which is developed in parallel to the dendroidal theory to emphasize the similarities and differences. Simplicial and Dendroidal Homotopy Theory is a complete introduction, carefully written with the beginning researcher in mind and ideally suited for seminars and courses. It can also be used as a standalone introduction to simplicial homotopy theory and to the theory of infinity-categories, or a standalone introduction to the theory of Quillen model categories and Bousfield localization.
This is the first book on cut-elimination in first-order predicate logic from an algorithmic point of view. Instead of just proving the existence of cut-free proofs, it focuses on the algorithmic methods transforming proofs with arbitrary cuts to proofs with only atomic cuts (atomic cut normal forms, so-called ACNFs). The first part investigates traditional reductive methods from the point of view of proof rewriting. Within this general framework, generalizations of Gentzen's and Schütte-Tait's cut-elimination methods are defined and shown to terminate with ACNFs of the original proof. Moreover, a complexity-theoretic comparison of Gentzen's and Tait's methods is given. The core of the book centers around the cut-elimination method CERES (cut elimination by resolution) developed by the authors. CERES is based on the resolution calculus and radically differs from the reductive cut-elimination methods. The book shows that CERES asymptotically outperforms all reductive methods based on Gentzen's cut-reduction rules. It obtains this result by heavy use of subsumption theorems in clause logic. Moreover, several applications of CERES are given (to interpolation, complexity analysis of cut-elimination, generalization of proofs, and to the analysis of real mathematical proofs). Lastly, the book demonstrates that CERES can be extended to nonclassical logics, in particular to finitely-valued logics and to Gödel logic.
The real-world decision-making problems facing humans today are multidimensional and involve multiple objectives, including economic, environmental, social, and technical ones. Hence, it seems natural that the consideration of many objectives in the actual decision-making process requires multiobjective approaches rather than single-objective ones. One of the major systems-analytic multiobjective approaches to decision-making under constraints is multiobjective optimization as a generalization of traditional single-objective optimization. Although multiobjective optimization problems differ from single-objective optimization problems only in the plurality of objective functions, it is significant to realize that multiple objectives are often noncommensurable and conflict with each other in multiobjective optimization problems. With this observation, in multiobjective optimization, the notion of Pareto optimality or efficiency has been introduced instead of the optimality concept for single-objective optimization. However, decisions with Pareto optimality or efficiency are not uniquely determined; the final decision must be selected from among the set of Pareto optimal or efficient solutions. Therefore, the question is, how does one find the preferred point as a compromise or satisficing solution with a rational procedure? This is the starting point of multiobjective optimization. To be more specific, the aim is to determine how one derives a compromise or satisficing solution of a decision maker (DM), which well represents the subjective judgments, from a Pareto optimal or efficient solution set.
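As a small, hedged illustration of the Pareto-optimality idea described in this blurb (the candidate points and objectives below are invented for the example, not taken from the book), the following Python sketch filters a set of decisions scored on two objectives, both to be minimized, down to its non-dominated, i.e. Pareto optimal, subset:

```python
def is_dominated(p, points):
    # p is dominated if some other point is at least as good in every
    # objective (here: smaller or equal) and strictly better in at least one.
    return any(
        all(q[i] <= p[i] for i in range(len(p))) and
        any(q[i] < p[i] for i in range(len(p)))
        for q in points if q != p
    )

def pareto_front(points):
    # Keep only the non-dominated (Pareto optimal) candidates.
    return [p for p in points if not is_dominated(p, points)]

# Hypothetical decisions scored as (cost, environmental impact), both minimized.
candidates = [(4, 1), (3, 3), (1, 5), (2, 2), (5, 5)]
print(pareto_front(candidates))  # [(4, 1), (1, 5), (2, 2)]
```

Several non-dominated points typically remain, which is exactly why, as the blurb notes, a further compromise or satisficing step is needed to select a single preferred solution.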
This book offers an original introduction to the representation theory of algebras, suitable for beginning researchers in algebra. It includes many results and techniques not usually covered in introductory books, some of which appear here for the first time in book form. The exposition employs methods from linear algebra (spectral methods and quadratic forms), as well as categorical and homological methods (module categories, Galois coverings, Hochschild cohomology) to present classical aspects of ring theory under new light. This includes topics such as rings with several objects, the Harada-Sai lemma, chain conditions, and Auslander-Reiten theory. Noteworthy and significant results covered in the book include the Brauer-Thrall conjectures, Drozd's theorem, and criteria to distinguish tame from wild algebras. This text may serve as the basis for a second graduate course in algebra or as an introduction to research in the field of representation theory of algebras. The originality of the exposition and the wealth of topics covered also make it a valuable resource for more established researchers.
This book gives a comprehensive introduction to Universal Algebraic Logic. The three main themes are (i) universal logic and the question of what logic is, (ii) duality theories between the world of logics and the world of algebra, and (iii) Tarskian algebraic logic proper including algebras of relations of various ranks, cylindric algebras, relation algebras, polyadic algebras and other kinds of algebras of logic. One of the strengths of our approach is that it is directly applicable to a wide range of logics including not only propositional logics but also e.g. classical first order logic and other quantifier logics. Following the Tarskian tradition, besides the connections between logic and algebra, related logical connections with geometry and eventually spacetime geometry leading up to relativity are also part of the perspective of the book. Besides Tarskian algebraizations of logics, category theoretical perspectives are also touched upon. This book, apart from being a monograph containing state of the art results in algebraic logic, can be used as the basis for a number of different courses intended for both novices and more experienced students of logic, mathematics, or philosophy. For instance, the first two chapters can be used in their own right as a crash course in Universal Algebra.
This volume contains the proceedings of the conference Logical Foundations of Mathematics, Computer Science, and Physics - Kurt Gödel's Legacy, held in Brno, Czech Republic, on the 90th anniversary of his birth. The wide and continuing importance of Gödel's work in the logical foundations of mathematics, computer science, and physics is confirmed by the broad range of speakers who participated in making this gathering a scientific event.
The articles collected in this volume represent the contributions presented at the IMA workshop on "Dynamics of Algorithms", which took place in November 1997. The workshop was an integral part of the 1997-98 IMA program on "Emerging Applications of Dynamical Systems." The interaction between algorithms and dynamical systems is mutually beneficial, since dynamical methods can be used to study algorithms that are applied repeatedly; convergence and asymptotic rates are indeed dynamical properties. On the other hand, the study of dynamical systems benefits enormously from having efficient algorithms to compute dynamical objects.
Approximation Theory, Wavelets and Applications draws together the latest developments in the subject, provides directions for future research, and paves the way for collaborative research. The main topics covered include constructive multivariate approximation, theory of splines, spline wavelets, polynomial and trigonometric wavelets, interpolation theory, polynomial and rational approximation. Among the scientific applications were de-noising using wavelets, including the de-noising of speech and images, and signal and digital image processing. In the area of the approximation of functions the main topics include multivariate interpolation, quasi-interpolation, polynomial approximation with weights, knot removal for scattered data, convergence theorems in Padé theory, Lyapunov theory in approximation, Neville elimination as applied to shape preserving presentation of curves, interpolating positive linear operators, interpolation from a convex subset of Hilbert space, and interpolation on the triangle and simplex. Wavelet theory is growing extremely rapidly and has applications which will interest readers in the physical, medical, engineering and social sciences.
First published in 2000. Routledge is an imprint of Taylor & Francis, an informa company.
In 1907 Luitzen Egbertus Jan Brouwer defended his doctoral dissertation on the foundations of mathematics, and with this event the modern version of mathematical intuitionism came into being. Brouwer attacked the main currents of the philosophy of mathematics: the formalists and the Platonists. In turn, both these schools began viewing intuitionism as the most harmful party among all known philosophies of mathematics. That was the origin of the now-90-year-old debate over intuitionism. As both sides have appealed in their arguments to philosophical propositions, the discussions have attracted the attention of philosophers as well. One might ask here what role a philosopher can play in controversies over mathematical intuitionism. Can he reasonably enter into disputes among mathematicians? I believe that these disputes call for intervention by a philosopher. The three best-known arguments for intuitionism, those of Brouwer, Heyting and Dummett, are based on ontological and epistemological claims, or appeal to theses that properly belong to a theory of meaning. Those lines of argument should be investigated in order to find what their assumptions are, whether intuitionistic consequences really follow from those assumptions, and finally, whether the premises are sound and not absurd. The intention of this book is thus to consider seriously the arguments of mathematicians, even if philosophy was not their main field of interest. There is little sense in disputing whether what mathematicians said about the objectivity and reality of mathematical facts belongs to philosophy, or not.
In Western Civilization Mathematics and Music have a long and interesting history in common, with several interactions, traditionally associated with the name of Pythagoras but also with a significant number of other mathematicians, like Leibniz, for instance. Mathematical models can be found for almost all levels of musical activities from composition to sound production by traditional instruments or by digital means. Modern music theory has been incorporating more and more mathematical content during the last decades. This book offers a journey into recent work relating music and mathematics. It contains a large variety of articles, covering the historical aspects, the influence of logic and mathematical thought in composition, perception and understanding of music and the computational aspects of musical sound processing. The authors illustrate the rich and deep interactions that exist between Mathematics and Music.
Problem books are popular with instructors and students alike, as well as among general readers. The key feature of this book is the many alternative solutions it offers to single problems. Mathematics educators, secondary mathematics teachers, and university instructors will find the book interesting and useful.
This monograph is the first in Fuzzy Approximation Theory. It contains mostly the author's research work on fuzziness of the last ten years, relies heavily on [10]-[32], and is a natural outgrowth of them. It belongs to the broader area of Fuzzy Mathematics. Chapters are self-contained and several advanced courses can be taught out of this book. We provide lots of applications but always within the framework of Fuzzy Mathematics. Background and motivation are given in each chapter. A complete list of references is provided at the end. The topics covered are very diverse. In Chapter 1 we give an extensive basic background on Fuzziness and Fuzzy Real Analysis, as well as a complete description of the book. In the following Chapters 2, 3 we cover in depth Fuzzy Differentiation and Integration Theory, e.g. we present Fuzzy Taylor Formulae. This is followed by Chapter 4 on Fuzzy Ostrowski Inequalities. Then in Chapters 5, 6 we present results on classical algebraic and trigonometric polynomial Fuzzy Approximation.
This monograph shows that, through a recourse to the concepts and methods of abstract algebraic logic, the algebraic theory of regular varieties and the concept of analyticity in formal logic can profitably interact. By extending the technique of Plonka sums from algebras to logical matrices, the authors investigate the different classes of models for logics of variable inclusion and shed new light on their formal properties. The book opens with the historical origins of logics of variable inclusion and their philosophical motivations. It includes the basics of the algebraic theory of regular varieties and the construction of Plonka sums over semilattice direct systems of algebras. The core of the book is devoted to an abstract definition of logics of left and right variable inclusion, respectively, and the authors study their semantics using the construction of Plonka sums of matrix models. The authors also cover Paraconsistent Weak Kleene logic and survey its abstract algebraic logical properties. This book is of interest to scholars of formal logic.