Welcome to Loot.co.za!
All students taking laboratory courses within the physical sciences and engineering will benefit from this book, whilst researchers will find it an invaluable reference. This concise, practical guide brings the reader up-to-speed on the proper handling and presentation of scientific data and its inaccuracies. It covers all the vital topics with practical guidelines, computer programs (in Python), and recipes for handling experimental errors and reporting experimental data. In addition to the essentials, it also provides further background material for advanced readers who want to understand how the methods work. Plenty of examples, exercises and solutions are provided to aid and test understanding, whilst useful data, tables and formulas are compiled in a handy section for easy reference.
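The error-handling recipes described above can be illustrated with a short Python sketch (a hypothetical example, not taken from the book) that propagates independent uncertainties through a quotient:

```python
import math

def quotient_with_error(a, da, b, db):
    """Propagate independent uncertainties through q = a / b.

    For independent errors, relative errors add in quadrature:
    (dq/q)^2 = (da/a)^2 + (db/b)^2
    """
    q = a / b
    dq = abs(q) * math.sqrt((da / a) ** 2 + (db / b) ** 2)
    return q, dq

# Example: distance 10.0 +/- 0.1 m travelled in 2.0 +/- 0.05 s
v, dv = quotient_with_error(10.0, 0.1, 2.0, 0.05)
print(f"v = {v:.2f} +/- {dv:.2f} m/s")  # prints "v = 5.00 +/- 0.13 m/s"
```

The quadrature rule assumes the two measurement errors are independent; correlated errors require the full covariance treatment.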
In the last 20 years, the study of operator algebras has developed from a branch of functional analysis to a central field of mathematics with applications and connections with different areas in both pure mathematics (foliations, index theory, K-theory, cyclic homology, affine Kac-Moody algebras, quantum groups, low dimensional topology) and mathematical physics (integrable theories, statistical mechanics, conformal field theories and the string theories of elementary particles). The theory of operator algebras was initiated by von Neumann and Murray as a tool for studying group representations and as a framework for quantum mechanics, and has since kept in touch with its roots in physics as a framework for quantum statistical mechanics and the formalism of algebraic quantum field theory. However, in 1981, the study of operator algebras took a new turn with the introduction by Vaughan Jones of subfactor theory and remarkable connections were found with knot theory, 3-manifolds, quantum groups and integrable systems in statistical mechanics and conformal field theory. The purpose of this book, one of the first in the area, is to look at these combinatorial-algebraic developments from the perspective of operator algebras; to bring the reader to the frontline of research with the minimum of prerequisites from classical theory.
This book introduces new models based on R-calculus and theories of belief revision for dealing with large and changing data. It extends R-calculus from first-order logic to propositional logic, description logics, modal logic and logic programming, and from minimal change semantics to subset minimal change, pseudo-subformula minimal change and deduction-based minimal change (the last two minimal changes are newly defined), and it proves soundness and completeness theorems with respect to the minimal changes in these logics. To make R-calculus computable, an approximate R-calculus is given which uses the finite injury priority method from recursion theory. Moreover, two applications of R-calculus are given, to default theory and to semantic inheritance networks. This book offers a rich blend of theory and practice. It is suitable for students, researchers and practitioners in the field of logic, and it will also be useful to all those interested in data, digitization, and the correctness and consistency of information, as well as in modal logics, non-monotonic logics, decidable/undecidable logics, logic programming, description logics, default logics and semantic inheritance networks.
Algebraic theories, introduced as a concept in the 1960s, have been a fundamental step towards a categorical view of general algebra. Moreover, they have proved very useful in various areas of mathematics and computer science. This carefully developed book gives a systematic introduction to algebra based on algebraic theories that is accessible to both graduate students and researchers. It will facilitate interactions of general algebra, category theory and computer science. A central concept is that of sifted colimits - that is, those commuting with finite products in sets. The authors prove the duality between algebraic categories and algebraic theories and discuss Morita equivalence between algebraic theories. They also pay special attention to one-sorted algebraic theories and the corresponding concrete algebraic categories over sets, and to S-sorted algebraic theories, which are important in program semantics. The final chapter is devoted to finitary localizations of algebraic categories, a recent research area.
This book is dedicated to the work of Alasdair Urquhart. The book starts out with an introduction to and an overview of Urquhart's work, and an autobiographical essay by Urquhart. This introductory section is followed by papers on algebraic logic and lattice theory, papers on the complexity of proofs, and papers on philosophical logic and the history of logic. The final section of the book contains a response to the papers by Urquhart. Alasdair Urquhart has made extremely important contributions to a variety of fields in logic. He produced some of the earliest work on the semantics of relevant logic. He proved the undecidability of the logics R (of relevant implication) and E (of relevant entailment), as well as of some of their close neighbors, and he proved that interpolation fails in some of those systems. Urquhart has done very important work in complexity theory, concerning the complexity of proofs in both classical and some nonclassical logics. In pure algebra, he has produced a representation theorem for lattices and some rather beautiful duality theorems. In addition, he has done important work in the history of logic, especially on Bertrand Russell, including editing Volume four of Russell's Collected Papers.
If you have ever wondered what quaternions are - then look no further, John Vince will show you how simple and useful they are. This second edition has been completely revised and includes extra detail on the invention of quaternions, a complete review of the text and equations, figures in colour, extra worked examples, an expanded index, and a bibliography arranged by chapter. Quaternions for Computer Graphics includes chapters on number sets and algebra, imaginary and complex numbers, the complex plane, rotation transforms, and a comprehensive description of quaternions in the context of rotation. The book will appeal to students of computer graphics, computer science and mathematics, as well as programmers, researchers, academics and professional practitioners interested in learning about quaternions. John Vince explains, in easy-to-understand language and with the aid of useful figures, how quaternions emerged, gave birth to modern vector analysis, disappeared, and reemerged to be adopted by the flight simulation industry and computer graphics. This book will give you the confidence to use quaternions within your everyday mathematics, and to explore more advanced texts.
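As a taste of the rotation machinery such a book covers, here is a minimal Python sketch (an illustrative example, not drawn from the book) that rotates a vector using the classic q p q* sandwich product:

```python
import math

def quat_mul(q, r):
    # Hamilton product of quaternions represented as (w, x, y, z) tuples
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate(v, axis, angle):
    """Rotate vector v about a unit axis by angle (radians) via q p q*."""
    half = angle / 2.0
    s = math.sin(half)
    q = (math.cos(half), axis[0] * s, axis[1] * s, axis[2] * s)
    q_conj = (q[0], -q[1], -q[2], -q[3])
    p = (0.0, v[0], v[1], v[2])  # embed the vector as a pure quaternion
    w, x, y, z = quat_mul(quat_mul(q, p), q_conj)
    return (x, y, z)

# Rotating (1, 0, 0) by 90 degrees about the z-axis gives approximately (0, 1, 0)
print(rotate((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), math.pi / 2))
```

Note the half-angle in the quaternion: the sandwich product applies the rotation twice, which is exactly why unit quaternions double-cover the rotation group.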
This book describes some basic principles that allow developers of computer programs (computer scientists, software engineers, programmers) to clearly think about the artifacts they deal with in their daily work: data types, programming languages, programs written in these languages that compute desired outputs from given inputs, and programs that describe continuously executing systems. The core message is that clear thinking about programs can be expressed in a single universal language, the formal language of logic. Apart from its universal elegance and expressiveness, this "logical" approach to the formal modeling of and reasoning about computer programs has another advantage: due to advances in computational logic (automated theorem proving, satisfiability solving, model checking), nowadays much of this process can be supported by software. This book therefore accompanies its theoretical elaborations with practical demonstrations of various systems and tools that are based on, or make use of, the presented logical underpinnings.
Automata theory lies at the foundation of computer science, and is vital to a theoretical understanding of how computers work and what constitutes formal methods. This treatise gives a rigorous account of the topic and illuminates its real meaning by looking at the subject in a variety of ways. The first part of the book is organised around notions of rationality and recognisability. The second part deals with relations between words realised by finite automata, which not only exemplifies the automata theory but also illustrates the variety of its methods and its fields of application. Many exercises are included, ranging from those that test the reader, to those that are technical results, to those that extend ideas presented in the text. Solutions or answers to many of these are included in the book.
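To illustrate the kind of object such a treatise studies, here is a minimal Python sketch (a hypothetical example, not from the book) of a deterministic finite automaton recognising binary words with an even number of 1s:

```python
def make_dfa(transitions, start, accepting):
    """Build a recogniser for a deterministic finite automaton.

    transitions: dict mapping (state, symbol) -> next state
    """
    def accepts(word):
        state = start
        for symbol in word:
            state = transitions[(state, symbol)]
        return state in accepting
    return accepts

# DFA over {0, 1} accepting words containing an even number of 1s
even_ones = make_dfa(
    transitions={("even", "0"): "even", ("even", "1"): "odd",
                 ("odd", "0"): "odd", ("odd", "1"): "even"},
    start="even",
    accepting={"even"},
)
print(even_ones("1011"))  # three 1s -> False
print(even_ones("11"))    # two 1s -> True
```

The language recognised here is rational (denoted by the regular expression 0*(10*10*)*), a simple instance of the equivalence between recognisability and rationality that the first part of the book develops.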
This book is a concise, self-contained, up-to-date introduction to extremal combinatorics for nonspecialists. There is a strong emphasis on theorems with particularly elegant and informative proofs; these may be called gems of the theory. The author presents a wide spectrum of the most powerful combinatorial tools together with impressive applications in computer science: methods of extremal set theory, the linear algebra method, the probabilistic method, and fragments of Ramsey theory. No special knowledge in combinatorics or computer science is assumed - the text is self-contained and the proofs can be enjoyed by undergraduate students in mathematics and computer science. Over 300 exercises of varying difficulty, and hints to their solution, complete the text. This second edition has been extended with substantial new material, and has been revised and updated throughout. It offers three new chapters on expander graphs and eigenvalues, the polynomial method and error-correcting codes. Most of the remaining chapters also include new material, such as the Kruskal-Katona theorem on shadows, the Lovasz-Stein theorem on coverings, large cliques in dense graphs without induced 4-cycles, a new lower-bound argument for monotone formulas, Dvir's solution of the finite field Kakeya conjecture, Moser's algorithmic version of the Lovasz Local Lemma, Schoning's algorithm for 3-SAT, the Szemeredi-Trotter theorem on the number of point-line incidences, surprising applications of expander graphs in extremal number theory, and some other new results.
Logic programming has emerged over the last five years as one of the most promising new programming paradigms and as a very active research area. The PROLOG experience has shown that relevant problems in areas such as expert systems, deductive databases, knowledge representation, and rapid prototyping can profitably be tackled by logic programming technology. It has also shown that, by means of sophisticated optimization and implementation techniques, the performance of PROLOG systems can compare with that of more traditional programming languages, and it has led to a new class of languages: the concurrent logic languages. Many recent advances in the theory of logic programs are related to extensions of the basic positive logic language and the related semantic problems. The original non-monotonic negation-as-failure rule has been extended in various ways and provided with new declarative characterizations. Other new language constructs are constraints (which lead to a very important extension of the paradigm which allows us to compute on new domains), concurrency, and modules and objects. This book, written by a team of international experts, goes beyond the classical theory to discuss these recent advances for the first time in a systematic form. The work is intended for advanced students of computer science, logic programming and artificial intelligence.
Gaisi Takeuti was one of the most brilliant and influential logicians of the 20th century. He was a long-time professor and professor emeritus of mathematics at the University of Illinois at Urbana-Champaign, USA, before he passed away on May 10, 2017, at the age of 91. Takeuti was one of the founders of Proof Theory, a branch of mathematical logic that originated from Hilbert's program about the consistency of mathematics. Based on Gentzen's pioneering works of proof theory in the 1930s, he proposed a conjecture in 1953 concerning the essential nature of formal proofs of higher-order logic, now known as Takeuti's fundamental conjecture, of which he gave a partial positive solution. His arguments on the conjecture and on proof theory in general have had great influence on the later developments of mathematical logic, philosophy of mathematics, and applications of mathematical logic to theoretical computer science. Takeuti's work ranged over the whole spectrum of mathematical logic, including set theory, computability theory, Boolean valued analysis, fuzzy logic, bounded arithmetic, and theoretical computer science. He wrote many monographs and textbooks both in English and in Japanese, and his monumental monograph Proof Theory, published in 1975, has long been a standard reference on proof theory. He had a wide range of interests covering virtually all areas of mathematics and extending to physics. His publications include many Japanese books for students and general readers about mathematical logic, mathematics in general, and connections between mathematics and physics, as well as many essays for Japanese science magazines. This volume is a collection of papers based on the Symposium on Advances in Mathematical Logic 2018. The symposium was held September 18-20, 2018, at Kobe University, Japan, and was dedicated to the memory of Professor Gaisi Takeuti.
This book provides a hands-on introduction to runtime verification which guides the reader from zero to sufficient practical knowledge required to consider and apply it in industry. It starts with almost no assumptions on the knowledge of the reader and provides exercises throughout the book through which the reader builds their own runtime verification tool. All that is required are basic programming skills and a good working knowledge of the object-oriented paradigm, ideally Java. Drawing from years of the authors' real-world experience, the reader progresses from manually writing runtime verification code to instrumenting monitoring using aspect-oriented programming, after which they explore increasing levels of specification abstraction: automata, regular expressions, and linear time temporal logic. A range of other topics is also explored in the book, including real-time properties, concerns of efficiency and persistence, integration with testing and architectural considerations. The book is written for graduate students specializing in software engineering as well as for industry professionals who need an introduction to the topic of runtime verification. While the book focuses on underlying foundations and practical techniques, it additionally provides for each chapter a reading list in the appendix for the interested reader who would like to deepen their knowledge in a particular area.
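The progression from hand-written monitoring code described above can be illustrated with a minimal Python sketch (a hypothetical example; the book itself targets Java) of a runtime monitor that flags protocol violations in an event stream:

```python
class FileProtocolMonitor:
    """A toy runtime monitor for the property
    'no read or write may occur after close'.
    It observes events one at a time and records violations."""

    def __init__(self):
        self.closed = False
        self.violations = []

    def observe(self, event):
        # Check the property before updating the monitor state
        if self.closed and event in ("read", "write"):
            self.violations.append(f"{event} after close")
        if event == "close":
            self.closed = True

monitor = FileProtocolMonitor()
for event in ["open", "read", "close", "read"]:
    monitor.observe(event)
print(monitor.violations)  # prints "['read after close']"
```

This hand-rolled monitor is exactly the starting point the book describes; the higher abstraction levels (automata, regular expressions, temporal logic) let the same property be stated declaratively and the monitoring code be generated.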
This book creates a conceptual schema that acts as a correlation between Epistemology and Epistemic Logic. It connects both fields and offers a proper theoretical foundation for the contemporary developments of Epistemic Logic regarding the dynamics of information. It builds a bridge between the view of Awareness Justification Internalism, and a dynamic approach to Awareness Logic. The book starts with an introduction to the main topics in Epistemic Logic and Epistemology and reviews the disconnection between the two fields. It analyses three core notions representing the basic structure of the conceptual schema: "Epistemic Awareness", "Knowledge" and "Justification". Next, it presents the Explicit Aware Knowledge (EAK) Schema, using a diagram of three ellipses to illustrate the schema, and a formal model based on a neighbourhood-model structure, that shows one concrete application of the EAK-Schema into a logical structure. The book ends by presenting conclusions and final remarks about the uses and applications of the EAK-Schema. It shows that the most important feature of the schema is that it serves both as a theoretical correlate to the dynamic extensions of Awareness Logic, providing it with a philosophical background, and as an abstract conceptual structure for a re-interpretation of Epistemology.
This book is a collection of contributions honouring Arnon Avron's seminal work on the semantics and proof theory of non-classical logics. It includes presentations of advanced work by some of the most esteemed scholars working on semantic and proof-theoretical aspects of computer science logic. Topics in this book include frameworks for paraconsistent reasoning, foundations of relevance logics, analysis and characterizations of modal logics and fuzzy logics, hypersequent calculi and their properties, non-deterministic semantics, algebraic structures for many-valued logics, and representations of the mechanization of mathematics. Avron's foundational and pioneering contributions have been widely acknowledged and adopted by the scientific community. His research interests are very broad, spanning proof theory, automated reasoning, non-classical logics, foundations of mathematics, and applications of logic in computer science and artificial intelligence. This is clearly reflected by the diversity of topics discussed in the chapters included in this book, all of which directly relate to Avron's past and present works. This book is of interest to computer scientists and scholars of formal logic.
For a brief time in history, it was possible to imagine that a sufficiently advanced intellect could, given sufficient time and resources, in principle understand how to mathematically prove everything that was true. They could discern what math corresponds to physical laws, and use those laws to predict anything that happens before it happens. That time has passed. Goedel's undecidability results (the incompleteness theorems), Turing's proof of non-computable values, the formulation of quantum theory, chaos, and other developments over the past century have shown that there are rigorous arguments limiting what we can prove, compute, and predict. While some connections between these results have come to light, many remain obscure, and the implications are unclear. Are there, for example, real consequences for physics - including quantum mechanics - of undecidability and non-computability? Are there implications for our understanding of the relations between agency, intelligence, mind, and the physical world? This book, based on the winning essays from the annual FQXi competition, contains ten explorations of Undecidability, Uncomputability, and Unpredictability. The contributions abound with connections, implications, and speculations while undertaking rigorous but bold and open-minded investigation of the meaning of these constraints for the physical world, and for us as humans.
Classically Semisimple Rings is a textbook on rings, modules and categories, aimed at advanced undergraduate and beginning graduate students. The book presents the classical theory of semisimple rings from a modern, category-theoretic point of view. Examples from algebra are used to motivate the abstract language of category theory, which then provides a framework for the study of rings and modules, culminating in the Wedderburn-Artin classification of semisimple rings. In the last part of the book, readers are gently introduced to related topics such as tensor products, exchange modules and C*-algebras. As a final flourish, Rickart's theorem on group rings ties a number of these topics together. Each chapter ends with a selection of exercises of varying difficulty, and readers interested in the history of mathematics will find biographical sketches of important figures scattered throughout the text. Assuming previous knowledge in linear and basic abstract algebra, this book can serve as a textbook for a course in algebra, providing students with valuable early exposure to category theory.
Elements of Mathematics takes readers on a fascinating tour that begins in elementary mathematics--but, as John Stillwell shows, this subject is not as elementary or straightforward as one might think. Not all topics that are part of today's elementary mathematics were always considered as such, and great mathematical advances and discoveries had to occur in order for certain subjects to become "elementary." Stillwell examines elementary mathematics from a distinctive twenty-first-century viewpoint and describes not only the beauty and scope of the discipline, but also its limits. From Gaussian integers to propositional logic, Stillwell delves into arithmetic, computation, algebra, geometry, calculus, combinatorics, probability, and logic. He discusses how each area ties into more advanced topics to build mathematics as a whole. Through a rich collection of basic principles, vivid examples, and interesting problems, Stillwell demonstrates that elementary mathematics becomes advanced with the intervention of infinity. Infinity has been observed throughout mathematical history, but the recent development of "reverse mathematics" confirms that infinity is essential for proving well-known theorems, and helps to determine the nature, contours, and borders of elementary mathematics. Elements of Mathematics gives readers, from high school students to professional mathematicians, the highlights of elementary mathematics and glimpses of the parts of math beyond its boundaries.
This book features more than 20 papers that celebrate the work of Hajnal Andreka and Istvan Nemeti. It illustrates an interaction between developing and applying mathematical logic. The papers offer new results as well as surveys in areas influenced by these two outstanding researchers. They also provide details on the after-life of some of their initiatives. Computer science connects the papers in the first part of the book. The second part concentrates on algebraic logic. It features a range of papers that hint at the intricate many-way connections between logic, algebra, and geometry. The third part explores novel applications of logic in relativity theory, philosophy of logic, philosophy of physics and spacetime, and methodology of science. They include such exciting subjects as time travelling in emergent spacetime. The short autobiographies of Hajnal Andreka and Istvan Nemeti at the end of the book describe an adventurous journey from electric engineering and Maxwell's equations to a complex system of computer programs for designing Hungary's electric power system, to exploring and contributing deep results to Tarskian algebraic logic as the deepest core theory of such questions, then on to applications of the results in such exciting new areas as relativity theory in order to rejuvenate logic itself.
This book introduces the notion of an effective Kan fibration, a new mathematical structure which can be used to study simplicial homotopy theory. The main motivation is to make simplicial homotopy theory suitable for homotopy type theory. Effective Kan fibrations are maps of simplicial sets equipped with a structured collection of chosen lifts that satisfy certain non-trivial properties. Here it is revealed that fundamental properties of ordinary Kan fibrations can be extended to explicit constructions on effective Kan fibrations. In particular, a constructive (explicit) proof is given that effective Kan fibrations are stable under push forward, or fibred exponentials. Further, it is shown that effective Kan fibrations are local, or completely determined by their fibres above representables, and the maps which can be equipped with the structure of an effective Kan fibration are precisely the ordinary Kan fibrations. Hence implicitly, both notions still describe the same homotopy theory. These new results solve an open problem in homotopy type theory and provide the first step toward giving a constructive account of Voevodsky's model of univalent type theory in simplicial sets.
This monograph presents a general theory of weakly implicative logics, a family covering a vast number of non-classical logics studied in the literature, concentrating mainly on the abstract study of the relationship between logics and their algebraic semantics. It can also serve as an introduction to (abstract) algebraic logic, both propositional and first-order, with special attention paid to the role of implication, lattice and residuated connectives, and generalized disjunctions. Based on their recent work, the authors develop a powerful uniform framework for the study of non-classical logics. In a self-contained and didactic style, starting from very elementary notions, they build a general theory with a substantial number of abstract results. The theory is then applied to obtain numerous results for prominent families of logics and their algebraic counterparts, in particular for superintuitionistic, modal, substructural, fuzzy, and relevant logics. The book may be of interest to a wide audience, especially students and scholars in the fields of mathematics, philosophy, computer science, or related areas, looking for an introduction to a general theory of non-classical logics and their algebraic semantics.
This book presents a new nominalistic philosophy of mathematics: semantic conventionalism. Its central thesis is that mathematics should be founded on the human ability to create language - and specifically, the ability to institute conventions for the truth conditions of sentences. This philosophical stance leads to an alternative way of practicing mathematics: instead of "building" objects out of sets, a mathematician should introduce new syntactical sentence types, together with their truth conditions, as he or she develops a theory. Semantic conventionalism is justified first through criticism of Cantorian set theory, intuitionism, logicism, and predicativism; then on its own terms; and finally, exemplified by a detailed reconstruction of arithmetic and real analysis. Also included is a simple solution to the liar paradox and the other paradoxes that have traditionally been recognized as semantic. And since it is argued that mathematics is semantics, this solution also applies to Russell's paradox and the other mathematical paradoxes of self-reference. In addition to philosophers who care about the metaphysics and epistemology of mathematics or the paradoxes of self-reference, this book should appeal to mathematicians interested in alternative approaches.
This book outlines a vast array of techniques and methods regarding model categories, without focussing on the intricacies of the proofs. Quillen model categories are a fundamental tool for the understanding of homotopy theory. While many introductions to model categories fall back on the same handful of canonical examples, the present book highlights a large, self-contained collection of other examples which appear throughout the literature. In particular, it collects a highly scattered literature into a single volume. The book is aimed at anyone who uses, or is interested in using, model categories to study homotopy theory. It is written in such a way that it can be used as a reference guide for those who are already experts in the field. However, it can also be used as an introduction to the theory for novices.
This monograph (based very largely upon results original to the Czechoslovak authors) presents an abstract account of the theory of automata for sophisticated readers presumed to be already conversant with the language of category theory. The seven chapters are punctuated at frequent intervals by examples.
This text presents six mini-courses, all devoted to interactions between representation theory of algebras, homological algebra, and the new ever-expanding theory of cluster algebras. The interplay between the topics discussed in this text will continue to grow and this collection of courses stands as a partial testimony to this new development. The courses are useful for any mathematician who would like to learn more about this rapidly developing field; the primary aim is to engage graduate students and young researchers. Prerequisites include knowledge of some noncommutative algebra or homological algebra. Homological algebra has always been considered as one of the main tools in the study of finite-dimensional algebras. The strong relationship with cluster algebras is more recent and has quickly established itself as one of the important highlights of today's mathematical landscape. This connection has been fruitful to both areas-representation theory provides a categorification of cluster algebras, while the study of cluster algebras provides representation theory with new objects of study. The six mini-courses comprising this text were delivered March 7-18, 2016 at a CIMPA (Centre International de Mathematiques Pures et Appliquees) research school held at the Universidad Nacional de Mar del Plata, Argentina. This research school was dedicated to the founder of the Argentinian research group in representation theory, M.I. Platzeck. The courses held were: Advanced homological algebra; Introduction to the representation theory of algebras; Auslander-Reiten theory for algebras of infinite representation type; Cluster algebras arising from surfaces; Cluster tilted algebras; Cluster characters; Introduction to K-theory; and Brauer graph algebras and applications to cluster algebras.
You may like...
Groups, Invariants, Integrals, and…
Maria Ulan, Stanislav Hronek
Hardcover
R3,328
Discovery Miles 33 280
Key to Advanced Arithmetic for Canadian…
Barnard 1817-1876 Smith, Archibald McMurchy
Hardcover
R863
Discovery Miles 8 630
An Elementary Arithmetic [microform]
By a Committee of Teachers Supervised
Hardcover
R807
Discovery Miles 8 070