The book contains eight detailed expositions of the lectures given at the Kaikoura 2000 Workshop on Computability, Complexity, and Computational Algebra. Topics covered include basic models and questions of complexity theory, the Blum-Shub-Smale model of computation, probability theory applied to algorithmics (randomized algorithms), parameterized complexity, Kolmogorov complexity of finite strings, computational group theory, counting problems, and canonical models of ZFC providing a solution to the continuum hypothesis. The text addresses students in computer science or mathematics, and professionals in these areas who seek a complete but gentle introduction to a wide range of techniques, concepts, and research horizons in the area of computational complexity in a broad sense.
This monograph presents a new model of mathematical structures called weak n-categories. These structures find their motivation in a wide range of fields, from algebraic topology to mathematical physics, algebraic geometry and mathematical logic. While strict n-categories are easily defined in terms of associative and unital composition operations, they are of limited use in applications, which often call for weakened variants of these laws. The author proposes a new approach to this weakening, whose generality arises not from a weakening of such laws but from the very geometric structure of its cells; a geometry dubbed weak globularity. The new model, called weakly globular n-fold categories, is one of the simplest known algebraic structures yielding a model of weak n-categories. The central result is the equivalence of this model to one of the existing models, due to Tamsamani and further studied by Simpson. This theory has intended applications to homotopy theory, mathematical physics and to long-standing open questions in category theory. As the theory is described in elementary terms and the book is largely self-contained, it is accessible to beginning graduate students and to mathematicians from a wide range of disciplines well beyond higher category theory. The new model makes a transparent connection between higher category theory and homotopy theory, rendering it particularly suitable for category theorists and algebraic topologists. Although the results are complex, readers are guided with an intuitive explanation before each concept is introduced, and with diagrams showing the interconnections between the main ideas and results.
This volume provides an introduction to the properties of functional differential equations and their applications in diverse fields such as immunology, nuclear power generation, heat transfer, signal processing, medicine and economics. In particular, it deals with problems and methods relating to systems having a memory (hereditary systems). The book contains eight chapters. Chapter 1 explains where functional differential equations come from and what sort of problems arise in applications. Chapter 2 gives a broad introduction to the basic principles involved and deals with systems having discrete and distributed delay. Chapters 3-5 are devoted to stability problems for retarded, neutral and stochastic functional differential equations. Problems of optimal control and estimation are considered in Chapters 6-8. The book is intended for applied mathematicians, engineers, and physicists whose work involves mathematical modeling of hereditary systems. This volume can also be recommended as a supplementary text for graduate students who wish to become better acquainted with the properties and applications of functional differential equations.
This volume contains English translations of Goedel's chapters on logicism and the antinomies and on the calculi of pure logic, as well as outlines for a chapter on metamathematics. It also comprises most of his reading notes. This book is a testimony to Goedel's understanding of the situation of foundational research in mathematics after his great discovery, the incompleteness theorem of 1931. It is also a source for his views on his logical predecessors, from Leibniz, Frege, and Russell to his own times. Goedel's "own book on foundations," as he called it, is essential reading for logicians and philosophers interested in foundations. Furthermore, it opens a new chapter in the life and achievement of one of the icons of 20th century science and philosophy.
This book explores the premise that a physical theory is an interpretation of the analytico-canonical formalism. Throughout the text, the investigation stresses that classical mechanics in its Lagrangian formulation is the formal backbone of theoretical physics. The authors start from a presentation of the analytico-canonical formalism for classical mechanics, and its applications in electromagnetism, Schroedinger's quantum mechanics, and field theories such as general relativity and gauge field theories, up to the Higgs mechanism. The analysis uses the main criterion used by physicists for a theory: to formulate a physical theory, we write down a Lagrangian for it. A physical theory is a particular instance of the Lagrangian functional, so there is already a unified physical theory. One only has to specify the corresponding Lagrangian (or Lagrangian density); the dynamical equations are the associated Euler-Lagrange equations. The theory of Suppes predicates serves as the main tool in the axiomatization, with examples drawn from the usual theories in physics. For applications, a whole plethora of results from logic leads to interesting, and sometimes unexpected, consequences. This volume looks at where our physics happens and which mathematical universe we require for the description of our concrete physical events. It also explores whether we use the constructive universe or whether we need set-theoretically generic spacetimes.
Introduces the GUHA method of mechanizing hypothesis formation as a data mining tool. Presents examples of data mining with enhanced association rules, histograms, contingency tables and action rules. Provides examples of data mining for exception rules and examples of subgroups discovery. Outlines possibilities of GUHA in business intelligence and big data. Overviews related theoretical results and challenges related to mechanizing hypothesis formation.
This book explores the research of Professor Hilary Putnam, a Harvard professor as well as a leading philosopher, mathematician and computer scientist. It features the work of distinguished scholars in the field as well as a selection of young academics who have studied topics closely connected to Putnam's work. It includes 12 papers that analyze, develop, and constructively criticize this notable professor's research in mathematical logic, the philosophy of logic and the philosophy of mathematics. In addition, it features a short essay presenting reminiscences and anecdotes about Putnam from his friends and colleagues, and also includes an extensive bibliography of his work in mathematics and logic. The book offers readers a comprehensive review of outstanding contributions in logic and mathematics as well as an engaging dialogue between prominent scholars and researchers. It provides those interested in mathematical logic, the philosophy of logic, and the philosophy of mathematics unique insights into the work of Hilary Putnam.
Steps forward in mathematics often reverberate in other scientific disciplines, and give rise to innovative conceptual developments or find surprising technological applications. This volume brings to the forefront some of the proponents of the mathematics of the twentieth century, who have put at our disposal new and powerful instruments for investigating the reality around us. The portraits present people who have impressive charisma and wide-ranging cultural interests, who are passionate about defending the importance of their own research, are sensitive to beauty, and attentive to the social and political problems of their times. What we have sought to document is mathematics' central position in the culture of our day. Space has been made not only for the great mathematicians but also for literary texts, including contributions by two apparent interlopers, Robert Musil and Raymond Queneau, for whom mathematical concepts represented a valuable tool for resolving the struggle between 'soul and precision.'
The aim of this book is to give self-contained proofs of all basic results concerning the infinite-valued propositional calculus of Lukasiewicz and its algebras, Chang's MV-algebras. This book is for self-study: with the possible exception of Chapter 9 on advanced topics, the only prerequisite for the reader is some acquaintance with classical propositional logic, and elementary algebra and topology. In this book it is not our aim to give an account of Lukasiewicz's motivations for adding new truth values: readers interested in this topic will find appropriate references in Chapter 10. Also, we shall not explain why Lukasiewicz infinite-valued propositional logic is a basic ingredient of any logical treatment of imprecise notions: Hajek's book in this series on Trends in Logic contains the most authoritative explanations. However, in order to show that MV-algebras stand to infinite-valued logic as boolean algebras stand to two-valued logic, we shall devote Chapter 5 to Ulam's game of Twenty Questions with lies/errors, as a natural context where infinite-valued propositions, connectives and inferences are used. While several other semantics for infinite-valued logic are known in the literature (notably Giles' game-theoretic semantics based on subjective probabilities), still the transition from two-valued to many-valued propositional logic can hardly be modelled by anything simpler than the transformation of the familiar game of Twenty Questions into the Ulam game with lies/errors.
This book examines an abstract mathematical theory, placing special emphasis on results applicable to formal logic. If a theory is especially abstract, it may find a natural home within several of the more familiar branches of mathematics. This is the case with the theory of closure spaces. It might be considered part of topology, lattice theory, universal algebra or, no doubt, one of several other branches of mathematics as well. In our development we have treated it, conceptually and methodologically, as part of topology, partly because we first thought of the basic structure involved (closure space) as a generalization of Frechet's concept of a V-space. V-spaces have been used in some developments of general topology as a generalization of topological space. Indeed, when in the early '50s one of us started thinking about closure spaces, we thought of it as the generalization of a Frechet V-space which comes from not requiring the null set to be closed (as it is in V-spaces). This generalization has an extreme advantage in connection with application to logic, since the most important closure notion in logic, deductive closure, in most cases does not generate a V-space, since the closure of the null set typically consists of the "logical truths" of the logic being examined.
Project Origami: Activities for Exploring Mathematics, Second Edition presents a flexible, discovery-based approach to learning origami-math topics. It helps readers see how origami intersects a variety of mathematical topics, from the more obvious realm of geometry to the fields of algebra, number theory, and combinatorics. With over 100 new pages, this updated and expanded edition now includes 30 activities and offers better solutions and teaching tips for all activities. The book contains detailed plans for 30 hands-on, scalable origami activities. Each activity lists courses in which the activity might fit, includes handouts for classroom use, and provides notes for instructors on solutions, how the handouts can be used, and other pedagogical suggestions. The handouts are also available on the book's CRC Press web page. Reflecting feedback from teachers and students who have used the book, this classroom-tested text provides an easy and entertaining way for teachers to incorporate origami into a range of college and advanced high school math courses. Visit the author's website for more information.
SECTION I In 1972, Donald Davidson and Gilbert Harman wrote in the introduction to the volume Semantics of Natural Language: "The success of linguistics in treating natural languages as formal syntactic systems has aroused the interest of a number of linguists in a parallel or related development of semantics. For the most part quite independently, many philosophers and logicians have recently been applying formal semantic methods to structures increasingly like natural languages. While differences in training, method and vocabulary tend to veil the fact, philosophers and linguists are converging, it seems, on a common set of interrelated problems." Davidson and Harman called for an interdisciplinary dialogue of linguists, philosophers and logicians on the semantics of natural language, and during the last ten years such an enterprise has proved extremely fruitful. Thanks to the cooperative effort in these several fields, the last decade has brought about striking progress in our understanding of the semantics of natural language. This work on semantics has typically paid little attention to psychological aspects of meaning. Thus, psychologists and computer scientists working on artificial intelligence were not invited to join forces in the influential introduction of Semantics of Natural Language. No doubt it was felt that while psychological aspects of language are important in their own right, they are not relevant to our immediate semantic concerns. In the last few years, several linguists and logicians have come to question the fundamental anti-psychological assumptions underlying their theorizing.
This book explores the classical and beautiful character theory of finite groups. It does so by using some rudiments of the language of categories. Originally emerging from two courses offered at Peking University (PKU), primarily for third-year students, it is now better suited for graduate courses, and provides broader coverage than books that focus almost exclusively on groups. The book presents the basic tools, notions and theorems of character theory (including a new treatment of the control of fusion and isometries), and introduces readers to the categorical language at several levels. It includes and proves the major results on characteristic zero representations without any assumptions about the base field. The book includes a dedicated chapter on graded representations and applications of polynomial invariants of finite groups, and its closing chapter addresses the more recent notion of the Drinfeld double of a finite group and the corresponding representation of GL_2(Z).
The purpose of this book is to present the classical analytic function theory of several variables as a standard subject in a course of mathematics after learning the elementary materials (sets, general topology, algebra, one complex variable). This includes the essential parts of Grauert-Remmert's two volumes, GL227(236) (Theory of Stein spaces) and GL265 (Coherent analytic sheaves), with a lowering of the level for novice graduate students (here, Grauert's direct image theorem is limited to the case of finite maps). The core of the theory is "Oka's Coherence", found and proved by Kiyoshi Oka. It is indispensable, not only in the study of complex analysis and complex geometry, but also in a large area of modern mathematics. In this book, just after an introductory chapter on holomorphic functions (Chap. 1), we prove Oka's First Coherence Theorem for holomorphic functions in Chap. 2. This defines a unique character of the book compared with other books on this subject, in which the notion of coherence appears much later. The present book, consisting of nine chapters, gives complete treatments of the following items: Coherence of sheaves of holomorphic functions (Chap. 2); Oka-Cartan's Fundamental Theorem (Chap. 4); Coherence of ideal sheaves of complex analytic subsets (Chap. 6); Coherence of the normalization sheaves of complex spaces (Chap. 6); Grauert's Finiteness Theorem (Chaps. 7, 8); Oka's Theorem for Riemann domains (Chap. 8). The theories of sheaf cohomology and domains of holomorphy are also presented (Chaps. 3, 5). Chapter 6 deals with the theory of complex analytic subsets. Chapter 8 is devoted to the applications of formerly obtained results, proving Cartan-Serre's Theorem and Kodaira's Embedding Theorem. In Chap. 9, we discuss the historical development of "Coherence". It is difficult to find a book at this level that treats all of the above subjects in a completely self-contained manner.
In the present volume, a number of classical proofs are improved and simplified, so that the contents are easily accessible for beginning graduate students.
In order to perform effective analysis of today’s information security systems, numerous components must be taken into consideration. This book presents a well-organized, consistent solution created by the author, which allows for precise multilevel analysis of information security systems and accounts for all of the significant details. Enabling the multilevel modeling of secure systems, the quality of protection modeling language (QoP-ML) approach provides for the abstraction of security systems while maintaining an emphasis on quality protection. This book introduces the basis of the QoP modeling language along with all the advanced analysis modules, syntax, and semantics. It delineates the steps used in cryptographic protocols and introduces a multilevel protocol analysis that expands current understanding. The book: Introduces quality of protection evaluation of IT systems. Covers the financial, economic, and CO2 emission analysis phase. Supplies a multilevel analysis of Cloud-based data centers. Details the structures for advanced communication modeling and energy analysis. Considers security and energy efficiency trade-offs for the protocols of wireless sensor network architectures. Includes case studies that illustrate the QoP analysis process using the QoP-ML. Examines the robust security metrics of cryptographic primitives. Compares and contrasts QoP-ML with the PL/SQL, SecureUML, and UMLsec approaches by means of the SEQUAL framework. The book explains the formal logic for representing the relationships between security mechanisms in a manner that offers the possibility to evaluate security attributes. It presents the architecture and API of tools that ensure automatic analysis, including the automatic quality of protection analysis tool (AQoPA), crypto metrics tool (CMTool), and security mechanisms evaluation tool (SMETool). The book includes a number of examples and case studies that illustrate the QoP analysis process by the QoP-ML.
Every operation defined by QoP-ML is described within parameters of security metrics to help you better evaluate the impact of each operation on your system's security.
1 Introduction.- 2 Pritchard-Salamon systems.- 3 Linear quadratic control and frequency domain inequalities.- 4 H∞-control with state-feedback.- 5 H∞-control with measurement-feedback.- 6 Examples and conclusions.- A Stability theory.- B Differentiability and some convergence results.- C The invariant zeros condition.
Digital forensics plays a crucial role in identifying, analysing, and presenting cyber threats as evidence in a court of law. Artificial intelligence, particularly machine learning and deep learning, enables automation of the digital investigation process. This book provides an in-depth look at the fundamental and advanced methods in digital forensics. It also discusses how machine learning and deep learning algorithms can be used to detect and investigate cybercrimes. This book demonstrates digital forensics and cyber-investigating techniques with real-world applications. It examines hard disk analytics and style architectures, including Master Boot Record and GUID Partition Table as part of the investigative process. It also covers cyberattack analysis in Windows, Linux, and network systems using virtual machines in real-world scenarios. Digital Forensics in the Era of Artificial Intelligence will be helpful for those interested in digital forensics and using machine learning techniques in the investigation of cyberattacks and the detection of evidence in cybercrimes.
The International Biometric Society (IBS) was formed at the First International Biometric Conference at Woods Hole on September 6, 1947. The History of the International Biometric Society presents a deep dive into the voluminous archival records, with primary focus on IBS's first fifty years. It contains numerous photos and extracts from the archival materials, and features many photos of important leaders who served IBS across the decades. Features: Describes events leading up to and at Woods Hole on September 6, 1947 that led to the formation of IBS. Outlines key markers that shaped IBS after the 1947 formation through to the modern day. Describes the regional and national group structure, and the formation of regions and national groups. Describes events surrounding the key scientific journal of IBS, Biometrics, including the transfer of ownership to IBS, content, editors, policies, management, and importance. Describes the other key IBS publications: Biometric Bulletin, Journal of Agricultural, Biological and Environmental Statistics, and regional publications. Provides details of International Biometric Conferences and key early symposia. Describes IBS constitution and by-laws processes, and the evolution of business arrangements. Provides a record of international officers, including regional presidents, national group secretaries, journal editors, and the locations of meetings. Includes a gallery of international Presidents, and a gallery of Secretaries and Treasurers. The History of the International Biometric Society will appeal to anyone interested in the activities of our statistical and biometrical forebears. The focus is on issues and events that engaged the attention of the officers of IBS. Some of these records are riveting, some entertaining, some intriguing, and some colorful. Some of the issues covered were difficult to handle, but even these often resulted in changes that benefited IBS.
This book has a fundamental relationship to the International Seminar on Fuzzy Set Theory held each September in Linz, Austria. First, this volume is an extended account of the eleventh Seminar of 1989. Second, and more importantly, it is the culmination of the tradition of the preceding ten Seminars. The purpose of the Linz Seminar, since its inception, was and is to foster the development of the mathematical aspects of fuzzy sets. In the earlier years, this was accomplished by bringing together for a week small groups of mathematicians in various fields in an intimate, focused environment which promoted much informal, critical discussion in addition to formal presentations. Beginning with the tenth Seminar, the intimate setting was retained, but each Seminar narrowed in theme; and participation was broadened to include both younger scholars within, and established mathematicians outside, the mathematical mainstream of fuzzy set theory. Most of the material of this book was developed over the years in close association with the Seminar or influenced by what transpired at Linz. For much of the content, the Seminar played a crucial role in either stimulating this material or in providing feedback and the necessary screening of ideas. Thus we may fairly say that the book, and the eleventh Seminar to which it is directly related, are in many respects a culmination of the previous Seminars.
This book explains exactly what human knowledge is. The key concepts in this book are structures and algorithms, i.e., what the readers "see" and how they make use of what they see. Thus in comparison with some other books on the philosophy (or methodology) of science, which employ a syntactic approach, the author's approach is model theoretic or structural. Properly understood, it extends the current art and science of mathematical modeling to all fields of knowledge. The link between structure and algorithms is mathematics. But viewing "mathematics" as such a link is not exactly what readers most likely learned in school; thus, the task of this book is to explain what "mathematics" should actually mean. Chapter 1, an introductory essay, presents a general analysis of structures, algorithms and how they are to be linked. Several examples from the natural and social sciences, and from the history of knowledge, are provided in Chapters 2-6. In turn, Chapters 7 and 8 extend the analysis to include language and the mind. Structures are what the readers see. And, as abstract cultural objects, they can almost always be seen in many different ways. But certain structures, such as natural numbers and the basic theory of grammar, seem to have an absolute character. Any theory of knowledge grounded in human culture must explain how this is possible. The author's analysis of this cultural invariance, combining insights from evolutionary theory and neuroscience, is presented in the book's closing chapter. The book will be of interest to researchers, students and those outside academia who seek a deeper understanding of knowledge in our present-day society.