Originally published in 1931. This inquiry investigates and develops John Cook Wilson's view of the province of logic. It bases the study on the posthumous collected papers Statement and Inference. The author seeks to answer questions on the nature of logic using Cook Wilson's thought. The chapters introduce and consider topics from metaphysics to grammar and from psychology to knowledge. Presenting an early conception of logic in the sciences and the work of an important twentieth-century philosopher, this is an engaging work.
This book addresses the argument in the history of the philosophy of science between the positivists and the anti-positivists. The author starts from a point of firm conviction that all science and philosophy must start with the given, but that the range of the given is not definite. He begins with an examination of science from the outside and then the inside, explaining his position on metaphysics, and attempts to formulate the character of operational acts before a general theory of symbolism is explored. The last five chapters constitute a treatise to show that the development from one stage of symbolism to the next is inevitable, and consequently that explanatory science represents the culmination of knowledge.
Originally published in 1973. This book presents a valid mode of reasoning that is different from mathematical probability. This inductive logic is examined in the context of scientific investigation. The author presents his criteria of adequacy for analysing inductive support for hypotheses and discusses each of these criteria in depth. The chapters cover philosophical problems and paradoxes about experimental support, probability and justifiability, ending with a system of logical syntax of induction. Each section begins with a summary of its contents and there is a glossary of technical terms to aid the reader.
Originally published in 1981. This is a book for the final year undergraduate or first year graduate who intends to proceed with serious research in philosophical logic. It will be welcomed by both lecturers and students for its careful consideration of main themes ranging from Gricean accounts of meaning to two dimensional modal logic. The first part of the book is concerned with the nature of the semantic theorist's project, and particularly with the crucial concepts of meaning, truth, and semantic structure. The second and third parts deal with various constructions that are found in natural languages: names, quantifiers, definite descriptions, and modal operators. Throughout, while assuming some familiarity with philosophical logic and elementary formal logic, the text provides a clear exposition. It brings together related ideas, and in some places refines and improves upon existing accounts.
Originally published in 1966. This is a self-instructional course intended for first-year university students who have not had previous acquaintance with Logic. The book deals with "propositional" logic by the truth-table method, briefly introducing axiomatic procedures, and proceeds to the theory of the syllogism, the logic of one-place predicates, and elementary parts of the logic of many-place predicates. Revision material is provided covering the main parts of the course. The course represents from eight to twenty hours' work, depending on the student's speed of work and on whether optional chapters are taken.
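The truth-table method the course begins with can be sketched in a few lines: a propositional formula is a tautology exactly when it evaluates to true under every assignment of truth values to its variables. A hedged illustration (the encoding of formulas as Python functions is ours, not the book's notation):

```python
from itertools import product

def is_tautology(formula, variables):
    """Decide a propositional formula by the truth-table method:
    evaluate it under every assignment of True/False to its variables."""
    return all(formula(dict(zip(variables, values)))
               for values in product([True, False], repeat=len(variables)))

# Contraposition, (p -> q) <-> (not q -> not p), holds in every row of its table.
contraposition = lambda v: ((not v["p"]) or v["q"]) == (v["q"] or (not v["p"]))
print(is_tautology(contraposition, ["p", "q"]))              # True
print(is_tautology(lambda v: v["p"] or v["q"], ["p", "q"]))  # False
```

Exhaustive evaluation is exponential in the number of variables, which is why the book moves on to axiomatic procedures, but for small formulas the table itself is the decision procedure.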
Originally published in 1965. This is a textbook of modern deductive logic, designed for beginners but leading further into the heart of the subject than most other books of the kind. The fields covered are the Propositional Calculus, the more elementary parts of the Predicate Calculus, and Syllogistic Logic treated from a modern point of view. In each of the systems discussed the main emphases are on Decision Procedures and Axiomatisation, and the material is presented with as much formal rigour as is compatible with clarity of exposition. The techniques used are not only described but given a theoretical justification. Proofs of Consistency, Completeness and Independence are set out in detail. The fundamental characteristics of the various systems studied, and their relations to each other, are established by meta-logical proofs, which are used freely in all sections of the book. Exercises are appended to most of the chapters, and answers are provided.
For computer scientists, especially those in the security field, the use of chaos has been limited to the computation of a small collection of famous but unsuitable maps that offer no explanation of why chaos is relevant in the considered contexts. Discrete Dynamical Systems and Chaotic Machines: Theory and Applications shows how to make finite machines, such as computers, neural networks, and wireless sensor networks, work chaotically as defined in a rigorous mathematical framework. Taking into account that these machines must interact in the real world, the authors share their research results on the behaviors of discrete dynamical systems and their use in computer science. Covering both theoretical and practical aspects, the book presents:
- Key mathematical and physical ideas in chaos theory
- Computer science fundamentals, clearly establishing that chaos properties can be satisfied by finite state machines
- Concrete applications of chaotic machines in computer security, including pseudorandom number generators, hash functions, digital watermarking, and steganography
- Concrete applications of chaotic machines in wireless sensor networks, including secure data aggregation and video surveillance
Until the authors' recent research, the practical implementation of the mathematical theory of chaos on finite machines raised several issues. This self-contained book illustrates how chaos theory enables the study of computer security problems, such as steganalysis, that otherwise could not be tackled. It also explains how the theory reinforces existing cryptographically secure tools and schemes.
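The blurb notes that the famous maps alone are unsuitable for security; still, a minimal sketch of one such classic, the logistic map, shows the sensitive dependence on initial conditions that "chaotic" refers to. The map, parameters, and perturbation size here are illustrative only, not the authors' constructions:

```python
def logistic_orbit(x0, r=4.0, n=50):
    """Iterate the logistic map x -> r*x*(1-x), a textbook chaotic system
    (fully chaotic on [0, 1] at r = 4)."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Sensitive dependence: a 1e-9 perturbation of the starting point is
# amplified until the two orbits become completely uncorrelated.
a = logistic_orbit(0.3)
b = logistic_orbit(0.3 + 1e-9)
print(max(abs(x - y) for x, y in zip(a, b)))  # far larger than 1e-9
```

Making a finite-state machine satisfy such chaos properties in a provable sense, rather than merely iterating a real-valued map in floating point, is precisely the gap the book addresses.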
‘Another terrific book by Rob Eastaway’ SIMON SINGH ‘A delightfully accessible guide to how to play with numbers’ HANNAH FRY How many cats are there in the world? What's the chance of winning the lottery twice? And just how long does it take to count to a million? Learn how to tackle tricky maths problems with nothing but the back of an envelope, a pencil and some good old-fashioned brain power. Join Rob Eastaway as he takes an entertaining look at how to figure without a calculator. Packed with amusing anecdotes, quizzes, and handy calculation tips for every situation, Maths on the Back of an Envelope is an invaluable introduction to the art of estimation, and a welcome reminder that sometimes our own brain is the best tool we have to deal with numbers.
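The "how long to count to a million" teaser is exactly the kind of estimate the book practices. A rough back-of-envelope sketch, assuming an average of about two seconds to say each number aloud (our assumption for illustration, not a figure from the book):

```python
# Small numbers are quick to say, six-digit numbers much slower;
# assume they average out to roughly 2 seconds each.
seconds_per_number = 2
seconds = 1_000_000 * seconds_per_number
days = seconds / (60 * 60 * 24)
print(f"about {days:.0f} days of non-stop counting")
```

The point of such estimates is not the exact answer but the order of magnitude: weeks, not hours and not years.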
Researchers and practitioners of cryptography and information security are constantly challenged to respond to new attacks and threats to information systems. Authentication Codes and Combinatorial Designs presents new findings and original work on perfect authentication codes characterized in terms of combinatorial designs, namely strong partially balanced designs (SPBD). Beginning with examples illustrating the concepts of authentication schemes and combinatorial designs, the book considers the probability of successful deceptions followed by schemes involving three and four participants, respectively. From this point, the author constructs the perfect authentication schemes and explores encoding rules for such schemes in some special cases. Using rational normal curves in projective spaces over finite fields, the author constructs a new family of SPBD. He then presents some established combinatorial designs that can be used to construct perfect schemes, such as t-designs, orthogonal arrays of index unity, and designs constructed by finite geometry. The book concludes by studying definitions of perfect secrecy, properties of perfectly secure schemes, and constructions of perfect secrecy schemes with and without authentication. Supplying an appendix of construction schemes for authentication and secrecy schemes, Authentication Codes and Combinatorial Designs points to new applications of combinatorial designs in cryptography.
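The probability of successful deception can be made concrete with a toy sketch. For an impersonation attack with uniformly chosen keys, the attacker's best strategy is to send the message accepted by the largest fraction of keys. The code below, including the example key sets, is an illustration of that standard quantity, not a construction from the book:

```python
from fractions import Fraction

def impersonation_probability(accepts):
    """Impersonation-deception probability for an authentication code
    with a uniformly chosen secret key.

    `accepts[k]` is the set of messages that key k accepts as authentic;
    the attacker sends the message valid under the most keys."""
    keys = list(accepts)
    messages = set().union(*accepts.values())
    return max(Fraction(sum(m in accepts[k] for k in keys), len(keys))
               for m in messages)

# Toy code: 4 keys, each accepting 2 of 4 messages, evenly spread.
toy = {0: {"m0", "m1"}, 1: {"m1", "m2"}, 2: {"m2", "m3"}, 3: {"m3", "m0"}}
print(impersonation_probability(toy))  # 1/2
```

Perfect authentication codes are those meeting the information-theoretic lower bounds on such deception probabilities, which is where the combinatorial designs of the book enter.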
For propositional logic it can be decided whether a formula has a deduction from a finite set of other formulas. This volume begins with a method to decide this for the quantified formulas of those fragments of arithmetic which express the properties of order-plus-successor and of order-plus-addition (Presburger arithmetic). It makes use of an algorithm eliminating quantifiers which, in turn, is also applied to obtain consistency proofs for these fragments.
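Quantifier elimination can be illustrated on a tiny instance from the order fragment: over the integers, the quantified formula ∃x (a < x ∧ x < b) is equivalent to the quantifier-free formula a + 1 < b. A hedged check of that equivalence (the bounded brute-force search is an illustrative stand-in, not a real decision procedure):

```python
def exists_between(a, b, search_radius=100):
    """Brute-force check of  ∃x (a < x and x < b)  over a bounded range
    of integers (any witness must lie strictly between a and b)."""
    return any(a < x < b for x in range(min(a, b) - search_radius,
                                        max(a, b) + search_radius))

def eliminated(a, b):
    """Quantifier-free equivalent produced by elimination: a + 1 < b."""
    return a + 1 < b

# The two formulas agree on all sampled integer pairs.
print(all(exists_between(a, b) == eliminated(a, b)
          for a in range(-5, 6) for b in range(-5, 6)))  # True
```

Iterating such rewritings from the innermost quantifier outward reduces any sentence of the fragment to a variable-free statement whose truth can be computed directly, which is what makes the fragment decidable.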
In this volume, logic starts from the observation that in everyday arguments, as brought forward by, say, a lawyer, statements are transformed linguistically, connecting them in formal ways irrespective of their contents. Understanding such arguments as deductive situations, or "sequents" in the technical terminology, the transformations between them can be expressed as logical rules. The book concludes with the algorithms producing the results of Gentzen's midsequent theorem and Herbrand's theorem for prenex formulas.
The huge number and broad range of the existing and potential applications of fuzzy logic have precipitated a veritable avalanche of books published on the subject. Most, however, focus on particular areas of application. Many do no more than scratch the surface of the theory that holds the power and promise of fuzzy logic. Fuzzy Automata and Languages: Theory and Applications offers the first in-depth treatment of the theory and mathematics of fuzzy automata and fuzzy languages. After introducing background material, the authors study max-min machines and max-product machines, developing their respective algebras and exploring properties such as equivalences, homomorphisms, irreducibility, and minimality. The focus then turns to fuzzy context-free grammars and languages, with special attention to trees, fuzzy dendrolanguage generating systems, and normal forms. A treatment of algebraic fuzzy automata theory follows, along with additional results on fuzzy languages, minimization of fuzzy automata, and recognition of fuzzy languages. Although the book is theoretical in nature, the authors also discuss applications in a variety of fields, including databases, medicine, learning systems, and pattern recognition. Much of the information on fuzzy languages is new and never before presented in book form. Fuzzy Automata and Languages incorporates virtually all of the important material published thus far. It stands alone as a complete reference on the subject and belongs on the shelves of anyone interested in fuzzy mathematics or its applications.
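A single transition of the max-min machines studied in the book can be sketched as max-min composition of a fuzzy state vector with a fuzzy transition matrix: the degree of reaching state j is the best (max) over current states i of the weaker (min) of "being in i" and "moving from i to j". The numbers below are illustrative:

```python
def max_min_step(state, transition):
    """One step of a max-min machine:
    new_state[j] = max over i of min(state[i], transition[i][j])."""
    n = len(transition[0])
    return [max(min(state[i], transition[i][j]) for i in range(len(state)))
            for j in range(n)]

# Fuzzy degrees of currently being in each of two states,
# and a fuzzy transition matrix for one input symbol.
state = [1.0, 0.2]
T = [[0.5, 0.9],
     [0.3, 0.4]]
print(max_min_step(state, T))  # [0.5, 0.9]
```

Max-product machines replace min with ordinary multiplication; much of the book's algebra consists of working out how familiar automata-theoretic notions behave under these two compositions.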
This is the first comprehensive treatment of subjective logic and all its operations. The author developed the approach, and in this book he first explains subjective opinions, opinion representation, and decision-making under vagueness and uncertainty, and he then offers a full definition of subjective logic, harmonising the key notations and formalisms, concluding with chapters on trust networks and subjective Bayesian networks, which when combined form general subjective networks. The author shows how real-world situations can be realistically modelled with regard to how situations are perceived, with conclusions that more correctly reflect the ignorance and uncertainties that result from partially uncertain input arguments. The book will help researchers and practitioners to advance, improve and apply subjective logic to build powerful artificial reasoning models and tools for solving real-world problems. A good grounding in discrete mathematics is a prerequisite.
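In subjective logic, a binomial opinion carries belief, disbelief, and uncertainty masses summing to one, plus a base rate; its projected probability distributes the uncertainty mass according to the base rate, P = b + a*u. A minimal sketch of that representation (the class and field names are ours, for illustration):

```python
from dataclasses import dataclass

@dataclass
class Opinion:
    """A binomial subjective opinion about one proposition.

    Invariant: belief + disbelief + uncertainty == 1."""
    belief: float
    disbelief: float
    uncertainty: float
    base_rate: float = 0.5

    def projected_probability(self):
        # Uncertainty mass is apportioned according to the base rate.
        return self.belief + self.base_rate * self.uncertainty

# "Probably true, but with real uncertainty about the evidence."
op = Opinion(belief=0.6, disbelief=0.1, uncertainty=0.3)
print(op.projected_probability())  # 0.75
```

The explicit uncertainty component is what distinguishes such opinions from plain probabilities, and it is what the book's fusion, trust, and network operators act on.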
In this new text, Steven Givant-the author of several acclaimed books, including works co-authored with Paul Halmos and Alfred Tarski-develops three theories of duality for Boolean algebras with operators. Givant addresses the two most recognized dualities (one algebraic and the other topological) and introduces a third duality, best understood as a hybrid of the first two. This text will be of interest to graduate students and researchers in the fields of mathematics, computer science, logic, and philosophy who are interested in exploring special or general classes of Boolean algebras with operators. Readers should be familiar with the basic arithmetic and theory of Boolean algebras, as well as the fundamentals of point-set topology.
In order to perform effective analysis of today's information security systems, numerous components must be taken into consideration. This book presents a well-organized, consistent solution created by the author, which allows for precise multilevel analysis of information security systems and accounts for all of the significant details. Enabling the multilevel modeling of secure systems, the quality of protection modeling language (QoP-ML) approach provides for the abstraction of security systems while maintaining an emphasis on quality protection. This book introduces the basis of the QoP modeling language along with all the advanced analysis modules, syntax, and semantics. It delineates the steps used in cryptographic protocols and introduces a multilevel protocol analysis that expands current understanding. The book:
- Introduces quality of protection evaluation of IT systems
- Covers the financial, economic, and CO2 emission analysis phase
- Supplies a multilevel analysis of Cloud-based data centers
- Details the structures for advanced communication modeling and energy analysis
- Considers security and energy efficiency trade-offs for the protocols of wireless sensor network architectures
- Includes case studies that illustrate the QoP analysis process using the QoP-ML
- Examines the robust security metrics of cryptographic primitives
- Compares and contrasts QoP-ML with the PL/SQL, SecureUML, and UMLsec approaches by means of the SEQUAL framework
The book explains the formal logic for representing the relationships between security mechanisms in a manner that offers the possibility to evaluate security attributes. It presents the architecture and API of tools that ensure automatic analysis, including the automatic quality of protection analysis tool (AQoPA), crypto metrics tool (CMTool), and security mechanisms evaluation tool (SMETool). Every operation defined by QoP-ML is described within parameters of security metrics to help readers better evaluate the impact of each operation on a system's security.
'Points, questions, stories, and occasional rants introduce the 24 chapters of this engaging volume. With a focus on mathematics and peppered with a scattering of computer science settings, the entries range from lightly humorous to curiously thought-provoking. Each chapter includes sections and sub-sections that illustrate and supplement the point at hand. Most topics are self-contained within each chapter, and a solid high school mathematics background is all that is needed to enjoy the discussions. There certainly is much to enjoy here.' CHOICE
Ever notice how people sometimes use math words inaccurately? Or how sometimes you instinctively know a math statement is false (or not known)? Each chapter of this book makes a point like those above and then illustrates the point by doing some real mathematics through step-by-step mathematical techniques. This book gives readers valuable information about how mathematics and theoretical computer science work, while teaching them some actual mathematics and computer science through examples and exercises. Much of the mathematics could be understood by a bright high school student. The points made can be understood by anyone with an interest in math, from the bright high school student to a Fields Medal winner.
The series is devoted to the publication of high-level monographs on all areas of mathematical logic and its applications. It is addressed to advanced students and research mathematicians, and may also serve as a guide for lectures and for seminars at the graduate level.
Turing's famous 1936 paper introduced a formal definition of a computing machine, a Turing machine. This model led to both the development of actual computers and to computability theory, the study of what machines can and cannot compute. This book presents classical computability theory from Turing and Post to current results and methods, and their use in studying the information content of algebraic structures, models, and their relation to Peano arithmetic. The author presents the subject as an art to be practiced, and an art in the aesthetic sense of inherent beauty which all mathematicians recognize in their subject. Part I gives a thorough development of the foundations of computability, from the definition of Turing machines up to finite injury priority arguments. Key topics include relative computability, and computably enumerable sets, those which can be effectively listed but not necessarily effectively decided, such as the theorems of Peano arithmetic. Part II includes the study of computably open and closed sets of reals and basis and nonbasis theorems for effectively closed sets. Part III covers minimal Turing degrees. Part IV is an introduction to games and their use in proving theorems. Finally, Part V offers a short history of computability theory. The author has honed the content over decades according to feedback from students, lecturers, and researchers around the world. Most chapters include exercises, and the material is carefully structured according to importance and difficulty. The book is suitable for advanced undergraduate and graduate students in computer science and mathematics and researchers engaged with computability and mathematical logic.
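Turing's formal definition can be made concrete with a minimal one-tape simulator. The encoding below, a dict mapping (state, symbol) pairs to (write, move, next-state) triples, is our illustrative convention, not the book's notation:

```python
def run_turing_machine(program, tape, state="q0", accept="halt", steps=1000):
    """Simulate a one-tape Turing machine.

    `program` maps (state, symbol) -> (symbol_to_write, move, next_state),
    with move in {-1, +1}; unwritten cells read as the blank symbol "_"."""
    cells = dict(enumerate(tape))
    pos = 0
    for _ in range(steps):
        if state == accept:
            break
        symbol = cells.get(pos, "_")
        write, move, state = program[(state, symbol)]
        cells[pos] = write
        pos += move
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# A machine that flips every bit, then halts at the first blank.
flip = {
    ("q0", "0"): ("1", +1, "q0"),
    ("q0", "1"): ("0", +1, "q0"),
    ("q0", "_"): ("_", -1, "halt"),
}
print(run_turing_machine(flip, "1011"))  # 0100
```

Everything in computability theory, from computably enumerable sets to priority arguments, ultimately refers back to what tables like `flip` can and cannot do.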
The Asian Logic Conference (ALC) is a major international event in mathematical logic. It features the latest scientific developments in the fields of mathematical logic and its applications, logic in computer science, and philosophical logic. The ALC series also aims to promote mathematical logic in the Asia-Pacific region and to bring logicians together both from within Asia and elsewhere for an exchange of information and ideas. This combined proceedings volume represents works presented or arising from the 14th and 15th ALCs.
If we take mathematical statements to be true, then must we also believe in the existence of invisible mathematical objects, accessible only by the power of thought? Jody Azzouni says we do not, and claims that the way to escape such a commitment is to accept - as an essential part of scientific doctrine - true statements which are 'about' objects which don't exist in any real sense.
Combinatory logic started as a programme in the foundation of mathematics, in a historical context at a time when such endeavours attracted the most gifted among the mathematicians. This small volume arose under quite different circumstances, namely within the context of reworking the mathematical foundations of computer science. I have been very lucky in finding gifted students who agreed to work with me and chose, for their Ph.D. theses, subjects that arose from my own attempts to create a coherent mathematical view of these foundations. The result of this collaborative work is presented here in the hope that it does justice to the individual contributors and that the reader has a chance of judging the work as a whole. E. Engeler, ETH Zurich, April 1994. (Collected in Chapter III, An Algebraization of Algorithmics, in Algorithmic Properties of Structures, Selected Papers of Erwin Engeler, World Scientific Publ. Co., Singapore, 1993, pp. 183-257.)
I. Historical and Philosophical Background (Erwin Engeler): In the fall of 1928 a young American turned up at the Mathematical Institute of Gottingen, a mecca of mathematicians at the time; he was a young man with a dream and his name was H. B. Curry. He felt that he had the tools in hand with which to solve the problem of the foundations of mathematics once and for all. His was an approach that came to be called "formalist" and embodied what later became known as Combinatory Logic.
· Are you more likely to become a professional footballer if your surname is Ball? · How can you be one hundred per cent sure you will win a bet? · Why did so many Pompeiians stay put while Mount Vesuvius was erupting? · How do you prevent a nuclear war? Ever since the dawn of human civilisation, we have been trying to make predictions about what's in store for us. We do this on a personal level, so that we can get on with our lives efficiently (should I hang my laundry out to dry, or will it rain?). But we also have to predict on a much larger scale, often for the good of our broader society (how can we spot economic downturns or prevent terrorist attacks?). For just as long, we have been getting it wrong. From religious oracles to weather forecasters, and from politicians to economists, we are subjected to poor predictions all the time. Our job is to separate the good from the bad. Unfortunately, the foibles of our own biology - the biases that ultimately make us human - can let us down when it comes to making rational inferences about the world around us. And that can have disastrous consequences. How to Expect the Unexpected will teach you how and why predictions go wrong, help you to spot phony forecasts and give you a better chance of getting your own predictions correct.
In distributed, open systems like cyberspace, where the behavior of autonomous agents is uncertain and can affect other agents' welfare, trust management is used to allow agents to determine what to expect about the behavior of other agents. The role of trust management is to maximize trust between the parties and thereby provide a basis for cooperation to develop. Bringing together expertise from technology-oriented sciences, law, philosophy, and social sciences, Managing Trust in Cyberspace addresses fundamental issues underpinning computational trust models and covers trust management processes for dynamic open systems and applications in a tutorial style that aids in understanding. Topics include trust in autonomic and self-organized networks, cloud computing, embedded computing, multi-agent systems, digital rights management, security and quality issues in trusting e-government service delivery, and context-aware e-commerce applications. The book also presents a walk-through of online identity management and examines using trust and argumentation in recommender systems. It concludes with a comprehensive survey of anti-forensics for network security and a review of password security and protection. Researchers and practitioners in fields such as distributed computing, Internet technologies, networked systems, information systems, human computer interaction, human behavior modeling, and intelligent informatics especially benefit from a discussion of future trust management research directions including pervasive and ubiquitous computing, wireless ad-hoc and sensor networks, cloud computing, social networks, e-services, P2P networks, near-field communications (NFC), electronic knowledge management, and nano-communication networks.
Fuzzy theory is an interesting name for a method that has been highly effective in a wide variety of significant, real-world applications. A few examples make this readily apparent. As the result of a faulty design in the method of computer-programmed trading, the biggest stock market crash in history was triggered by a small fraction of a percent change in the interest rate in a Western European country. A fuzzy theory approach would have weighed a number of relevant variables and the ranges of values for each of these variables. Another example, which is rather simple but pervasive, is that of an electronic thermostat that turns on heat or air conditioning at a specific temperature setting. In fact, actual comfort level involves other variables such as humidity and the location of the sun with respect to windows in a home, among others. Because of its great applied significance, fuzzy theory has generated widespread activity internationally. In fact, institutions devoted to research in this area have come into being. As the above examples suggest, Fuzzy Systems Theory is of fundamental importance for the analysis and design of a wide variety of dynamic systems. This clearly manifests the fundamental importance of time considerations in the Fuzzy Systems design approach in dynamic systems. This textbook by Prof. Dr. Jernej Virant provides what is evidently a uniquely significant and comprehensive treatment of this subject on the international scene.
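The thermostat example can be sketched with fuzzy membership functions: rather than a single crisp temperature threshold, "comfortable" is a graded degree combining temperature and humidity. The triangular shapes, ranges, and min-combination below are illustrative assumptions, not values from the textbook:

```python
def triangular(x, lo, peak, hi):
    """Triangular fuzzy membership function: 0 outside (lo, hi),
    rising linearly to 1 at `peak` and falling back to 0."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= peak:
        return (x - lo) / (peak - lo)
    return (hi - x) / (hi - peak)

def comfort(temp_c, humidity_pct):
    """Degree of 'comfortable', taking the weaker (min) of the two criteria."""
    return min(triangular(temp_c, 16, 22, 28),
               triangular(humidity_pct, 20, 45, 70))

print(comfort(21, 50))   # 0.8: warm enough, slightly humid
print(comfort(30, 50))   # 0.0: too hot regardless of humidity
```

A fuzzy controller would then act on such graded degrees, which is exactly the design flexibility the crisp-threshold thermostat lacks.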
Information security has a major gap when cryptography is implemented. Cryptographic algorithms are well defined, key management schemes are well known, but the actual deployment is typically overlooked, ignored, or unknown. Cryptography is everywhere. Application and network architectures are typically well-documented, but the cryptographic architecture is missing. This book provides a guide to discovering, documenting, and validating cryptographic architectures. Each chapter builds on the previous one to present information as a sequential process. This approach not only presents the material in a structured manner but also serves as an ongoing reference guide for future use.