In mathematical practice, the Baire category method is a tool for establishing the existence of a rich array of generic structures. It is also behind a number of fundamental results, such as the Open Mapping Theorem or the Banach-Steinhaus Boundedness Principle. This volume brings the Baire category method to another level of sophistication via the internal version of the set-theoretic forcing technique. It is the first systematic account of applications of the higher forcing axioms, with the stress on the technique of building forcing notions rather than on the relationships between different forcing axioms or their consistency strengths.
From the reviews: "This is a very interesting book containing material for a comprehensive study of the cyclic homology of algebras, cyclic sets and S1-spaces, Lie algebras and algebraic K-theory, and an introduction to Connes' work and recent results on the Novikov conjecture. The book requires a knowledge of homological algebra and Lie algebra theory as well as basic techniques coming from algebraic topology. The bibliographic comments at the end of each chapter offer good suggestions for further reading and research. The book can be strongly recommended to anybody interested in noncommutative geometry, contemporary algebraic topology and related topics." (European Mathematical Society Newsletter) In this second edition the authors have added a chapter 13 on Mac Lane (co)homology.
Many people start the day with physical exercise, but few seem to be so concerned with exercising the most human of organs: the brain. This book provides you with entertaining and challenging mental exercises for every week of the year. Whether you are a high school student eager to sharpen your brain, or someone older who would like to retain your mental agility, you will find your brain getting sharper and more agile as you solve the puzzles in this book. Read a few puzzles every week, think about them, solve them, and you will see the results. And on the way to a sharper mind, you will enjoy every step.
Nonlinear systems with stationary sets are important because they cover many practical systems in engineering. Previous analysis of this class of systems has been based on the frequency domain. However, few results on robustness analysis and controller design for these systems are easily available. This book presents the analysis, as well as methods based on the global properties of systems with stationary sets, in a unified time-domain and frequency-domain framework. The focus is on multi-input and multi-output systems, in contrast to previous publications, which considered only single-input and single-output systems. The control methods presented in this book will be valuable for research on nonlinear systems with stationary sets.
Science involves descriptions of the world we live in. It also depends on nature exhibiting what we can best describe as a high algorithmic content. The theme running through this collection of papers is that of the interaction between descriptions, in the form of formal theories, and the algorithmic content of what is described, namely of the models of those theories. This appears most explicitly here in a number of valuable, and substantial, contributions to what has until recently been known as 'recursive model theory' - an area in which researchers from the former Soviet Union (in particular Novosibirsk) have been pre-eminent. There are also articles concerned with the computability of aspects of familiar mathematical structures, and - a return to the sort of basic underlying questions considered by Alan Turing in the early days of the subject - an article giving a new perspective on computability in the real world. And, of course, there are also articles concerned with the classical theory of computability, including the first widely available survey of work on quasi-reducibility. The contributors, all internationally recognised experts in their fields, have been associated with the three-year INTAS-RFBR Research Project "Computability and Models" (Project No. 972-139), and most have participated in one or more of the various international workshops (in Novosibirsk, Heidelberg and Almaty) and other research activities of the network.
This book describes a powerful language for multidimensional declarative programming called Lucid. Lucid has evolved considerably in the past ten years. The main catalyst for this metamorphosis was the discovery that Lucid is based on intensional logic, one commonly used in studying natural languages. Intensionality, and more specifically indexicality, has enabled Lucid to implicitly express multidimensional objects that change, a fundamental capability with several consequences which are explored in this book. The author covers a broad range of topics, from foundations to applications, and from implementations to implications. The role of intensional logic in Lucid, as well as its consequences for programming in general, is discussed. The syntax and mathematical semantics of the language are given, and its ability to be used as a formal system for transformation and verification is presented. The use of Lucid in both multidimensional applications programming and software systems construction (such as a parallel programming system and a visual programming system) is described. A novel model of multidimensional computation, eduction, is described along with its serendipitous practical benefits for harnessing parallelism and tolerating faults. As the only volume that reflects the advances over the past decade, this work will be of great interest to researchers and advanced students involved with declarative language systems and programming.
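Lucid programs are not Python, but the flavor of its intensional streams can be hinted at with generators; the `fby` helper below is an invented stand-in for Lucid's followed-by operator, shown only to illustrate the idea of defining a stream by its first value and the rest of itself:

```python
from itertools import islice

def fby(first, rest_fn):
    """Imitate Lucid's 'fby' (followed-by): emit `first`, then the stream rest_fn()."""
    yield first
    yield from rest_fn()

def nats():
    # Lucid would write: n = 0 fby (n + 1)
    return fby(0, lambda: (x + 1 for x in nats()))

print(list(islice(nats(), 6)))  # [0, 1, 2, 3, 4, 5]
```

The recursive self-reference mirrors how Lucid equations define each point of a stream in terms of earlier points, though a real Lucid implementation evaluates by demand (eduction) rather than by Python's eager generators.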
Methods of dimensionality reduction provide a way to understand and visualize the structure of complex data sets. Traditional methods like principal component analysis and classical metric multidimensional scaling suffer from being based on linear models. Until recently, very few methods were able to reduce the data dimensionality in a nonlinear way. However, since the late nineties, many new methods have been developed, and nonlinear dimensionality reduction, also called manifold learning, has become a hot topic. New advances that account for this rapid growth are, e.g., the use of graphs to represent the manifold topology, and the use of new metrics like the geodesic distance. In addition, new optimization schemes, based on kernel techniques and spectral decomposition, have led to spectral embedding, which encompasses many of these methods.
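The linear baseline the blurb mentions, principal component analysis, can be sketched in a few lines via the SVD; the data set and variable names below are synthetic, invented purely for illustration:

```python
import numpy as np

# Minimal PCA sketch: center the data, take the SVD, and project onto
# the top-k right singular vectors (the principal directions).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 3] = 2.0 * X[:, 0]            # introduce linear redundancy on purpose
X[:, 4] = X[:, 1] - X[:, 0]

Xc = X - X.mean(axis=0)            # centering
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
Y = Xc @ Vt[:k].T                  # 2-D embedding of the 5-D data

print(Y.shape)                     # (200, 2)
```

Nonlinear methods such as those surveyed in the book replace this global linear projection with graph-based or kernel-based constructions, but the "embed into fewer coordinates" goal is the same.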
This volume celebrates the work of Petr Hajek on mathematical fuzzy logic and presents how his efforts have influenced prominent logicians who are continuing his work. The book opens with a discussion of Hajek's contribution to mathematical fuzzy logic and with a scientific biography of him, then progresses to two articles with a foundational flavour that demonstrate important aspects of Hajek's production: a paper on the development of fuzzy sets and another on some fuzzy versions of set theory and arithmetic. Articles in the volume also focus on the treatment of vagueness, building connections between Hajek's favorite fuzzy logic and linguistic models of vagueness. Other articles introduce alternative notions of consequence relation, namely the preservation of truth degrees, which is discussed in a general context, and the differential semantics; for the latter, a surprisingly strong standard completeness theorem is proved. Another contribution looks at two principles valid in classical logic and characterizes the three main t-norm logics in terms of these principles. Other articles, with an algebraic flavour, offer a summary of the applications of lattice-ordered groups to many-valued logic and to quantum logic, as well as an investigation of prelinearity in varieties of pointed lattice-ordered algebras that satisfy a weak form of distributivity and have a very weak implication. The last part of the volume contains an article on possibilistic modal logics defined over MTL chains, a topic that Hajek discussed in his celebrated work, Metamathematics of Fuzzy Logic, and another in which the authors, besides offering unexpected premises such as proposing to call Hajek's basic fuzzy logic HL instead of BL, propose a very weak system, called SL, as a candidate for the role of the really basic fuzzy logic. The paper also provides a generalization of the prelinearity axiom, which was investigated by Hajek in the context of fuzzy logic.
There are many proposed aims for scientific inquiry - to explain or predict events, to confirm or falsify hypotheses, or to find hypotheses that cohere with our other beliefs in some logical or probabilistic sense. This book is devoted to a different proposal - that the logical structure of the scientist's method should guarantee eventual arrival at the truth, given the scientist's background assumptions. Interest in this methodological property, called "logical reliability", stems from formal learning theory, which draws its insights not from the theory of probability, but from the theory of computability. Kelly first offers an accessible explanation of formal learning theory, then goes on to develop and explore a systematic framework in which various standard learning-theoretic results can be seen as special cases of simpler and more general considerations. Finally, Kelly clarifies the relationship between the resulting framework and other standard issues in the philosophy of science, such as probability, causation, and relativism. Extensively illustrated with figures by the author, The Logic of Reliable Inquiry assumes only introductory knowledge of basic logic and computability theory. It is a major contribution to the literature and will be essential reading for scientists, statisticians, psychologists, linguists, logicians, and philosophers.
Stephen Cole Kleene was one of the greatest logicians of the twentieth century, and this book is the influential textbook he wrote to teach the subject to the next generation. It was first published in 1952, some twenty years after the publication of Gödel's paper on the incompleteness of arithmetic, which marked, if not the beginning of modern logic, at least a turning point after which "nothing was ever the same." Kleene was an important figure in logic, and lived a long full life of scholarship and teaching. The 1930s was a time of creativity and ferment in the subject, when the notion of "computable" moved from the realm of philosophical speculation to the realm of science. This was accomplished by the work of Kurt Gödel, Alan Turing, and Alonzo Church, who gave three apparently different precise definitions of "computable." When they all turned out to be equivalent, there was a collective realization that this was indeed the "right" notion. Kleene played a key role in this process. One could say that he was "there at the beginning" of modern logic. He showed the equivalence of lambda calculus with Turing machines and with Gödel's recursion equations, and developed the modern machinery of partial recursive functions. This textbook played an invaluable part in educating the logicians of the present.
We live in a world that is not quite "right." The central tenet of statistical inquiry is that Observation = Truth + Error, because even the most careful of scientific investigations have always been bedeviled by uncertainty. Our attempts to measure things are plagued with small errors. Our attempts to understand our world are blocked by blunders. And, unfortunately, in some cases, people have been known to lie. In this long-awaited follow-up to his well-regarded bestseller, The Lady Tasting Tea, David Salsburg opens a door to the amazingly widespread use of statistical methods by looking at historical examples of errors, blunders and lies from areas as diverse as archeology, law, economics, medicine, psychology, sociology, Biblical studies, history, and war-time espionage. In doing so, he shows how, upon closer statistical investigation, errors and blunders often lead to useful information. And how statistical methods have been used to uncover falsified data. Beginning with Edmund Halley's examination of the Transit of Venus and ending with a discussion of how many tanks Rommel had during the Second World War, the author invites the reader to come along on this easily accessible and fascinating journey of how to identify the nature of errors, minimize the effects of blunders, and figure out who the liars are.
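The tenet Observation = Truth + Error can be sketched numerically; the measured quantity, noise level, and sample size below are invented for illustration:

```python
import numpy as np

# Simulate repeated noisy measurements of a fixed quantity.
# Each observation = truth + a random error term.
rng = np.random.default_rng(42)
truth = 9.81                       # the underlying "true" value (invented)
errors = rng.normal(0.0, 0.5, size=1000)
observations = truth + errors

single_error = abs(observations[0] - truth)
mean_error = abs(observations.mean() - truth)
# Averaging many observations typically shrinks the error by about
# a factor of sqrt(n) relative to a single measurement.
print(single_error, mean_error)
```

This is the statistician's defense against small measurement errors; blunders and lies, as the book shows, require different detective work.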
The Handbook of Logic in Artificial Intelligence and Logic Programming is a multi-volume work covering all major areas of the application of logic to artificial intelligence and logic programming. The authors are chosen on an international basis and are leaders in the fields covered. Volume 5 is the last in this well-regarded series. Logic is now widely recognized as one of the foundational disciplines of computing. It has found applications in virtually all aspects of the subject, from software and hardware engineering to programming languages and artificial intelligence. In response to the growing need for an in-depth survey of these applications, the Handbook of Logic in Artificial Intelligence and Logic Programming and its companion, the Handbook of Logic in Computer Science, have been created. The Handbooks are a combination of authoritative exposition, comprehensive survey, and fundamental research exploring the underlying themes in the various areas. Some mathematical background is assumed, and much of the material will be of interest to logicians and mathematicians. Volume 5 focuses particularly on logic programming. This book is intended for theoretical computer scientists.
The book is designed for students studying on their own, without access to lecturers and other reading, along the lines of the internationally renowned course produced by the Open University. There are thus a large number of exercises within the main body of the text designed to help students engage with the subject, many of which have full teaching solutions. In addition, there are a number of exercises without answers so that students studying under the guidance of a tutor may be assessed.
With rapid progress in Internet and digital imaging technology, there are more and more ways to easily create, publish, and distribute images. Considered the first book to focus on the relationship between digital imaging and privacy protection, Visual Cryptography and Secret Image Sharing is a complete introduction to novel security methods and sharing-control mechanisms used to protect against unauthorized data access and secure dissemination of sensitive information. Image data protection and image-based authentication techniques offer efficient solutions for controlling how private data and images are made available only to select people. Essential to the design of systems used to manage images that contain sensitive data, such as medical records, financial transactions, and electronic voting systems, the methods presented in this book are useful to counter traditional encryption techniques, which do not scale well and are less efficient when applied directly to image files. An exploration of the most prominent topics in digital imaging security, this book discusses:
- Potential for sharing multiple secrets
- Visual cryptography schemes based either on the probabilistic reconstruction of the secret image, or on different logical operations for combining shared images
- Inclusion of pictures in the distributed shares
- Contrast enhancement techniques
- Color-image visual cryptography
- Cheating prevention
- Alignment problems for image shares
- Steganography and authentication
In the continually evolving world of secure image sharing, a growing number of people are becoming involved as new applications and business models are developed all the time. This contributed volume gives academicians, researchers, and professionals the insight of well-known experts on key concepts, issues, trends, and technologies in this emerging field.
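The secret-sharing idea behind these schemes can be sketched with a 2-out-of-2 XOR-based variant; this is an invented toy, not one of the book's schemes (classical visual cryptography instead stacks printed transparencies, which acts like an OR on subpixel patterns):

```python
import secrets

# 2-out-of-2 secret sharing on a bit image: share1 is a uniformly random
# mask, share2 is the secret XOR the mask. Either share alone reveals
# nothing; together they reconstruct the secret exactly.
def make_shares(image_bits):
    share1 = [secrets.randbelow(2) for _ in image_bits]   # random mask
    share2 = [b ^ r for b, r in zip(image_bits, share1)]  # masked image
    return share1, share2

def reconstruct(share1, share2):
    return [a ^ b for a, b in zip(share1, share2)]

secret = [1, 0, 1, 1, 0, 0, 1, 0]    # a tiny 8-pixel "image"
s1, s2 = make_shares(secret)
print(reconstruct(s1, s2) == secret)  # True
```

The appeal of true visual cryptography is that reconstruction needs no computation at all, only the human eye looking at overlaid shares.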
During 1996-97 MSRI held a full academic-year program on combinatorics, with special emphasis on its connections to other branches of mathematics, such as algebraic geometry, topology, commutative algebra, representation theory, and convex geometry. The rich combinatorial problems arising from the study of various algebraic structures are the subject of this book, which features work done or presented at the program's seminars. The text contains contributions on matroid bundles, combinatorial representation theory, lattice points in polyhedra, bilinear forms, combinatorial differential topology and geometry, Macdonald polynomials and geometry, enumeration of matchings, the generalized Baues problem, and Littlewood-Richardson semigroups. These expository articles, written by some of the most respected researchers in the field, present the state of the art to graduate students and researchers in combinatorics as well as in algebra, geometry, and topology.
This book casts the theory of periods of algebraic varieties in the natural setting of Madhav Nori's abelian category of mixed motives. It develops Nori's approach to mixed motives from scratch, thereby filling an important gap in the literature, and then explains the connection of mixed motives to periods, including a detailed account of the theory of period numbers in the sense of Kontsevich-Zagier and their structural properties. Period numbers are central to number theory and algebraic geometry, and also play an important role in other fields such as mathematical physics. There are long-standing conjectures about their transcendence properties, best understood in the language of cohomology of algebraic varieties or, more generally, motives. Readers of this book will discover that Nori's unconditional construction of an abelian category of motives (over fields embeddable into the complex numbers) is particularly well suited for this purpose. Notably, Kontsevich's formal period algebra represents a torsor under the motivic Galois group in Nori's sense, and the period conjecture of Kontsevich and Zagier can be recast in this setting. Periods and Nori Motives is highly informative and will appeal to graduate students interested in algebraic geometry and number theory as well as researchers working in related fields. Containing relevant background material on topics such as singular cohomology, algebraic de Rham cohomology, diagram categories and rigid tensor categories, as well as many interesting examples, the overall presentation of this book is self-contained.
The book is devoted to the theory of groups of finite Morley rank. These groups arise in model theory and generalize the concept of algebraic groups over algebraically closed fields. The book contains almost all the known results in the subject. Aiming to attract pure group theorists to the subject and to prepare graduate students to start research in the area, the authors adopted an algebraic and self-evident point of view rather than a model-theoretic one, and developed the theory from scratch. All the necessary model-theoretic and group-theoretic notions are explained at length. The book is full of exercises and examples, and one of its chapters contains a discussion of open problems and a program for further research.
Compactly supported smooth piecewise polynomial functions provide an efficient tool for the approximation of curves and surfaces and other smooth functions of one and several arguments. Since they are locally polynomial, they are easy to evaluate. Since they are smooth, they can be used when smoothness is required, as in the numerical solution of partial differential equations (in the Finite Element method) or the modeling of smooth surfaces (in Computer Aided Geometric Design). Since they are compactly supported, their linear span has the needed flexibility to approximate at all, and the systems to be solved in the construction of approximations are 'banded'. The construction of compactly supported smooth piecewise polynomials becomes ever more difficult as the dimension, s, of their domain G ⊆ R^s, i.e., the number of arguments, increases. In the univariate case, there is only one kind of cell in any useful partition, namely, an interval, and its boundary consists of two separated points, across which polynomial pieces would have to be matched as one constructs a smooth piecewise polynomial function. This can be done easily, with the only limitation that the number of smoothness conditions across such a breakpoint should not exceed the polynomial degree (since that would force the two joining polynomial pieces to coincide). In particular, on any partition, there are (nontrivial) compactly supported piecewise polynomials of degree ≤ k and in C^(k-1), of which the univariate B-spline is the most useful example.
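The univariate B-spline mentioned at the end can be evaluated with the standard Cox-de Boor recursion; the knot sequence and degree below are chosen purely for illustration:

```python
def bspline_basis(i, k, t, knots):
    """Evaluate the i-th B-spline basis function of degree k at t
    via the Cox-de Boor recursion."""
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + k] != knots[i]:
        left = ((t - knots[i]) / (knots[i + k] - knots[i])
                * bspline_basis(i, k - 1, t, knots))
    right = 0.0
    if knots[i + k + 1] != knots[i + 1]:
        right = ((knots[i + k + 1] - t) / (knots[i + k + 1] - knots[i + 1])
                 * bspline_basis(i + 1, k - 1, t, knots))
    return left + right

knots = [0, 1, 2, 3, 4, 5, 6, 7]   # uniform knot sequence (illustrative)
k = 2                              # quadratic pieces, C^1 at each breakpoint
t = 3.5
total = sum(bspline_basis(i, k, t, knots) for i in range(len(knots) - k - 1))
print(round(total, 10))            # 1.0: the basis sums to one where fully supported
```

Each basis function is supported on only k+1 knot intervals, which is exactly why the linear systems mentioned above come out banded.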
Applicable to any problem that can be modeled with a finite number of states, finite state-based models (also called finite state machines or finite state automata) have found wide use in various areas of computer science and engineering. Handbook of Finite State Based Models and Applications provides a complete collection of introductory materials on finite state theories, algorithms, and the latest domain applications. For beginners, the book is a handy reference for quickly looking up model details. For more experienced researchers, it is suitable as a source of in-depth study in this area. The book first introduces the fundamentals of automata theory, including regular expressions, as well as widely used automata, such as transducers, tree automata, quantum automata, and timed automata. It then presents algorithms for the minimization and incremental construction of finite automata and describes Esterel, an automata-based synchronous programming language for embedded system software development. Moving on to applications, the book explores regular path queries on graph-structured data, timed automata in model checking security protocols, pattern matching, compiler design, and XML processing. It also covers other finite state-based modeling approaches and applications, including Petri nets, statecharts, temporal logic, and UML state machine diagrams.
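A deterministic finite automaton of the kind the handbook introduces amounts to a state set, a transition table, and an accepting set; the even-parity automaton below is an invented example, not taken from the book:

```python
# DFA accepting binary strings that contain an even number of 1s.
# Two states ("even", "odd"), transitions on input symbols "0" and "1".
TRANSITIONS = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}
ACCEPTING = {"even"}

def accepts(s, start="even"):
    state = start
    for ch in s:
        state = TRANSITIONS[(state, ch)]  # deterministic step
    return state in ACCEPTING

print(accepts("1100"), accepts("111"))  # True False
```

Transducers, tree automata, and timed automata covered in the book enrich this same skeleton with outputs, tree-shaped inputs, or clock constraints.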
Handbook of Mathematical Induction: Theory and Applications shows how to find and write proofs via mathematical induction. This comprehensive book covers the theory, the structure of the written proof, all standard exercises, and hundreds of application examples from nearly every area of mathematics. In the first part of the book, the author discusses different inductive techniques, including well-ordered sets, basic mathematical induction, strong induction, double induction, infinite descent, downward induction, and several variants. He then introduces ordinals and cardinals, transfinite induction, the axiom of choice, Zorn's lemma, empirical induction, and fallacies and induction. He also explains how to write inductive proofs. The next part contains more than 750 exercises that highlight the levels of difficulty of an inductive proof, the variety of inductive techniques available, and the scope of results provable by mathematical induction. Each self-contained chapter in this section includes the necessary definitions, theory, and notation and covers a range of theorems and problems, from fundamental to very specialized. The final part presents either solutions or hints to the exercises. Slightly longer than what is found in most texts, these solutions provide complete details for every step of the problem-solving process.
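As a small worked instance of the basic mathematical induction the handbook teaches (a standard example, not drawn from the book's exercises):

```latex
\textbf{Claim.} $1 + 2 + \cdots + n = \frac{n(n+1)}{2}$ for every integer $n \ge 1$.

\textbf{Base case.} For $n = 1$: the left side is $1$ and the right side is
$\frac{1 \cdot 2}{2} = 1$.

\textbf{Inductive step.} Assume the claim holds for some $n \ge 1$. Then
\[
  1 + 2 + \cdots + n + (n+1) = \frac{n(n+1)}{2} + (n+1) = \frac{(n+1)(n+2)}{2},
\]
which is the claim for $n+1$. By the principle of mathematical induction,
the formula holds for all $n \ge 1$.
```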
Conditional reasoning is reasoning that involves statements of the sort If A (Antecedent) then C (Consequent). This type of reasoning is ubiquitous; everyone engages in it. Indeed, the ability to do so may be considered a defining human characteristic. Without this ability, human cognition would be greatly impoverished. "What-if" thinking could not occur. There would be no retrospective efforts to understand history by imagining how it could have taken a different course. Decisions that take possible contingencies into account could not be made; there could be no attempts to influence the future by selecting actions on the basis of their expected effects. Despite the commonness and importance of conditional reasoning and the considerable attention it has received from scholars, it remains the subject of much continuing debate. Unsettled questions, both normative and empirical, continue to be asked. What constitutes normative conditional reasoning? How do people engage in it? Does what people do match what would be expected of a rational agent with the abilities and limitations of human beings? If not, how does it deviate and how might people's ability to engage in it be improved? This book reviews the work of prominent psychologists and philosophers on conditional reasoning. It describes empirical research on how people deal with conditional arguments and on how conditional statements are used and interpreted in everyday communication. It examines philosophical and theoretical treatments of the mental processes that support conditional reasoning. Its extensive coverage of the subject makes it an ideal resource for students, teachers, and researchers with a focus on cognition across disciplines.
This book presents a collection of invited articles by distinguished mathematicians on the occasion of the Platinum Jubilee Celebrations of the Indian Statistical Institute, during the year 2007. These articles provide a current perspective on different areas of research, emphasizing the major challenging issues. Given the very significant record of the Institute in research in the areas of Statistics, Probability and Mathematics, distinguished authors have very admirably responded to the invitation. Some of the articles are written keeping students and potential new entrants to an area of mathematics in mind. This volume is thus unique and gives a perspective on several important aspects of mathematics.
The latest volume in this major reference work covers all major areas of the application of logic in theoretical computer science.