Welcome to Loot.co.za!
The book Advances in Distance Learning in Times of Pandemic is devoted to the issues and challenges faced by universities in the field of distance learning in COVID-19 times. It covers both the theoretical and practical aspects of distance education. It elaborates on issues regarding distance learning, its challenges, its assessment by students and their expectations, the use of tools to improve distance learning, and the functioning of e-learning in the industry 4.0 and society 5.0 eras. The book also devotes considerable space to the issues of Web 3.0 in university e-learning, quality assurance, and knowledge management. Its aim and scope are to draw a holistic picture of ongoing online teaching activities before and during the lockdown period and to present the meaning and future of e-learning from students' points of view, taking into consideration their attitudes and expectations as well as industry 4.0 and society 5.0 aspects. The book presents the approach to distance learning and how it has changed, especially during a pandemic that revolutionized education. It highlights:
• the function of online education and how it changed before and during the pandemic;
• how e-learning is beneficial in promoting digital citizenship;
• the characteristics of distance learning in the era of industry 4.0 and society 5.0;
• how the era of industry 4.0 treats distance learning as a desirable form of education.
The book covers both scientific and educational aspects and can be useful for university-level undergraduate, postgraduate and research-grade courses, and can be referred to by anyone interested in exploring the diverse aspects of distance learning.
This contributed volume explores the ways logical skills have been perceived over the course of history. The authors approach the topic from the lenses of philosophy, anthropology, sociology, and history to examine two opposing perceptions of logic: the first as an innate human ability and the second as a skill that can be learned and mastered. Chapters focus on the social and political dynamics of the use of logic throughout history, utilizing case studies and critical analyses. Specific topics covered include:
• the rise of logical skills;
• problems concerning medieval notions of idiocy and rationality;
• decolonizing natural logic;
• natural logic and the course of time.
Logical Skills: Social-Historical Perspectives will appeal to undergraduate and graduate students, as well as researchers in the fields of history, sociology, philosophy, and logic. Psychology and colonial studies scholars will also find this volume to be of particular interest.
This two-volume work bridges the gap between introductory expositions of logic or set theory on one hand, and the research literature on the other. It can be used as a text in an advanced undergraduate or beginning graduate course in mathematics, computer science, or philosophy. The volumes are written in a user-friendly conversational lecture style that makes them equally effective for self-study or class use. Volume 1 includes formal proof techniques, a section on applications of compactness (including nonstandard analysis), a generous dose of computability and its relation to the incompleteness phenomenon, and the first presentation of a complete proof of Gödel's second incompleteness theorem since Hilbert and Bernays' Grundlagen.
The Annual European Meeting of the Association for Symbolic Logic, also known as the Logic Colloquium, is among the most prestigious annual meetings in the field. The current volume, Logic Colloquium 2007, with contributions from plenary speakers and selected special session speakers, contains both expository and research papers by some of the best logicians in the world. This volume covers many areas of contemporary logic: model theory, proof theory, set theory, and computer science, as well as philosophical logic, including tutorials on cardinal arithmetic, on Pillay's conjecture, and on automatic structures. This volume will be invaluable for experts as well as those interested in an overview of central contemporary themes in mathematical logic.
A more accessible approach than most competitor texts, which move into advanced, research-level topics too quickly for today's students. Part I is comprehensive in providing all necessary mathematical underpinning, particularly for those who need more opportunity to develop their mathematical competence. More confident students may move directly to Part II and dip back into Part I as a reference. Ideal for use as an introductory text for courses in quantum computing. Fully worked examples illustrate the application of mathematical techniques. Exercises throughout develop concepts and enhance understanding. End-of-chapter exercises offer more practice in developing a secure foundation.
Suitable for anyone who enjoys logic puzzles. Could be used as a companion book for a course on mathematical proof, as the puzzles feature the same issues of problem-solving and proof-writing. For anyone who enjoys logical puzzles, anyone interested in legal reasoning, and anyone who loves the game of baseball.
Almost all of the problems studied in this book are motivated by an overriding foundational question: What are the appropriate axioms for mathematics? Through a series of case studies, these axioms are examined to prove particular theorems in core mathematical areas such as algebra, analysis, and topology, focusing on the language of second-order arithmetic, the weakest language rich enough to express and develop the bulk of mathematics. In many cases, if a mathematical theorem is proved from appropriately weak set existence axioms, then the axioms will be logically equivalent to the theorem. Furthermore, only a few specific set existence axioms arise repeatedly in this context, which in turn correspond to classical foundational programs. This is the theme of reverse mathematics, which dominates the first half of the book. The second part focuses on models of these and other subsystems of second-order arithmetic.
How should we think about the meaning of the words that make up our language? How does reference work for these terms, and what is their referent when they are connected to abstract objects rather than to concrete ones? Can logic help to address these questions? This collection of papers aims to unify the questions of syntax and semantics of language, which span the fields of logic, philosophy and ontology of language. The leading motif of the selection is the differentiation between linguistic tokens (material, concrete objects) on the one hand and linguistic types (ideal, abstract objects) on the other. Through a promenade among articles spanning the author's entire career, this book addresses the complex philosophical question of the ontology of language by following the crystalline conceptual tools offered by logic. At the core of Wybraniec-Skardowska's scholarship is the idea that language is an ontological being, characterized in compliance with the logical conception of language proposed by Ajdukiewicz. The application throughout the book of the tools of classical logic and set theory fosters the emergence of a general formal logical theory of the syntax, semantics and pragmatics of language, one which takes into account the token-type duality in the understanding of linguistic expressions. Via a functional approach to language itself, logic appears as ontologically neutral with respect to existential assumptions about the nature of linguistic expressions and their extra-linguistic counterparts. The book is addressed to readers at both the graduate and undergraduate level, as well as to a more general audience interested in getting a firmer grip on the interplay between reality and the language we use to describe and understand it.
Combinatorics and Number Theory of Counting Sequences is an introduction to the theory of finite set partitions and to the enumeration of cycle decompositions of permutations. The presentation prioritizes elementary enumerative proofs. Therefore, parts of the book are designed so that even those high school students and teachers who are interested in combinatorics can have the benefit of them. Still, the book collects vast, up-to-date information for many counting sequences (especially, related to set partitions and permutations), so it is a must-have piece for those mathematicians who do research on enumerative combinatorics. In addition, the book contains number theoretical results on counting sequences of set partitions and permutations, so number theorists who would like to see nice applications of their area of interest in combinatorics will enjoy the book, too. Features The Outlook sections at the end of each chapter guide the reader towards topics not covered in the book, and many of the Outlook items point towards new research problems. An extensive bibliography and tables at the end make the book usable as a standard reference. Citations to results which were scattered in the literature now become easy, because huge parts of the book (especially in parts II and III) appear in book form for the first time.
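The two counting sequences at the heart of the blurb above, partitions of a set into blocks and the total number of such partitions, have elementary recursive descriptions. The sketch below (illustrative, not taken from the book) computes Stirling numbers of the second kind via the standard recurrence and sums them into Bell numbers:

```python
# Stirling number of the second kind S(n, k): the number of ways to
# partition an n-element set into k non-empty blocks, computed with the
# recurrence S(n, k) = k*S(n-1, k) + S(n-1, k-1).
def stirling2(n, k):
    if n == k:
        return 1          # one partition into n singletons (incl. n = k = 0)
    if k == 0 or k > n:
        return 0          # no partition possible
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

# Bell number B(n): the total number of partitions of an n-element set,
# obtained by summing S(n, k) over all possible block counts k.
def bell(n):
    return sum(stirling2(n, k) for k in range(n + 1))
```

For example, a 4-element set can be split into 2 blocks in stirling2(4, 2) = 7 ways, and the Bell sequence begins 1, 1, 2, 5, 15, 52.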
Aggregation is the process of combining several numerical values into a single representative value, and an aggregation function performs this operation. These functions arise wherever aggregating information is important: applied and pure mathematics (probability, statistics, decision theory, functional equations), operations research, computer science, and many applied fields (economics and finance, pattern recognition and image processing, data fusion, etc.). This is a comprehensive, rigorous and self-contained exposition of aggregation functions. Classes of aggregation functions covered include triangular norms and conorms, copulas, means and averages, and those based on nonadditive integrals. The properties of each method, as well as their interpretation and analysis, are studied in depth, together with construction methods and practical identification methods. Special attention is given to the nature of scales on which values to be aggregated are defined (ordinal, interval, ratio, bipolar). It is an ideal introduction for graduate students and a unique resource for researchers.
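The defining idea, a function that maps several input values to one representative value and is monotone in every argument, can be illustrated with a few of the classes named above. This is a minimal sketch on inputs from [0, 1]; the function names are illustrative, not the book's notation:

```python
import math

# Classic aggregation functions on values in [0, 1]: each maps a list of
# inputs to a single representative value and is non-decreasing in every
# argument.

def arithmetic_mean(xs):
    return sum(xs) / len(xs)   # the prototypical averaging function

def minimum(xs):
    return min(xs)             # the strongest triangular norm (t-norm)

def product(xs):
    return math.prod(xs)       # the product t-norm

def maximum(xs):
    return max(xs)             # the weakest triangular conorm (t-conorm)
```

The t-norms satisfy the boundary condition T(x, 1) = x (e.g. product([0.8, 1.0]) returns 0.8), one of the defining properties distinguishing the classes studied in the book.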
Diagrams are widely used in reasoning about problems in physics, mathematics and logic, but have traditionally been considered to be only heuristic tools and not valid elements of mathematical proofs. This book challenges this prejudice against visualisation in the history of logic and mathematics and provides a formal foundation for work on natural reasoning in a visual mode. The author presents Venn diagrams as a formal system of representation equipped with its own syntax and semantics and specifies rules of transformation that make this system sound and complete. The system is then extended to the equivalent of a first-order monadic language. The soundness of these diagrammatic systems refutes the contention that graphical representation is misleading in reasoning. The validity of the transformation rules ensures that the correct application of the rules will not lead to fallacies. The book concludes with a discussion of some fundamental differences between graphical systems and linguistic systems. This groundbreaking work will have important influence on research in logic, philosophy and knowledge representation.
Laws of Form is a seminal work in the foundations of logic, mathematics and philosophy, published by G. Spencer-Brown in 1969. The book provides a new point of view on form and the role of distinction, markedness and the absence of distinction (the unmarked state) in the construction of any universe. A conference was held on August 8-10, 2019 at the Old Library, Liverpool University, 19 Abercromby Square, L69 7ZN, UK to celebrate the 50th anniversary of the publication of Laws of Form and to remember George Spencer-Brown, its author. The book is a collection of papers introducing and extending Laws of Form, written primarily by people who attended the 2019 conference.
The Unprovability of Consistency is concerned with connections between two branches of logic: proof theory and modal logic. Modal logic is the study of the principles that govern the concepts of necessity and possibility; proof theory is, in part, the study of those that govern provability and consistency. In this book, George Boolos looks at the principles of provability from the standpoint of modal logic. In doing so, he provides two perspectives on a debate in modal logic that has persisted for at least thirty years between the followers of C. I. Lewis and W. V. O. Quine. The author employs semantic methods developed by Saul Kripke in his analysis of modal logical systems. The book will be of interest to advanced undergraduate and graduate students in logic, mathematics and philosophy, as well as to specialists in those fields.
This book presents a comprehensive treatment of basic mathematical logic. The author's aim is to make exact the vague, intuitive notions of natural number, preciseness, and correctness, and to invent a method whereby these notions can be communicated to others and stored in the memory. He adopts a symbolic language in which ideas about natural numbers can be stated precisely and meaningfully, and then investigates the properties and limitations of this language. The treatment of mathematical concepts in the main body of the text is rigorous, but a section of 'historical remarks' traces the evolution of the ideas presented in each chapter. Sources of the original accounts of these developments are listed in the bibliography.
In this book, leading experts discuss innovative components of complexity theory and chaos theory in economics. The underlying perspective is that investigations of economic phenomena should view these phenomena not as deterministic, predictable and mechanistic but rather as process dependent, organic and always evolving. The aim is to highlight the exciting potential of this approach in economics and its ability to overcome the limitations of past research and offer important new insights. The book offers a stimulating mix of theory, examples and policy. By casting light on a variety of topics in the field, it will provide an ideal platform for researchers wishing to deepen their understanding and identify areas for further investigation.
This book is designed to be usable as a textbook for an undergraduate course or for an advanced graduate course in coding theory, as well as a reference for researchers in discrete mathematics, engineering and theoretical computer science. This second edition has three parts: an elementary introduction to coding; theory and applications of codes; and algebraic curves. The latter part presents a brief introduction to the theory of algebraic curves and its most important applications to coding theory.
Medical imaging is one of the heaviest funded biomedical engineering research areas. The second edition of Pattern Recognition and Signal Analysis in Medical Imaging brings sharp focus to the development of integrated systems for use in the clinical sector, enabling both imaging and the automatic assessment of the resultant data. Since the first edition, there has been tremendous development of new, powerful technologies for detecting, storing, transmitting, analyzing, and displaying medical images. Computer-aided analytical techniques, coupled with a continuing need to derive more information from medical images, have led to a growing application of digital processing techniques in cancer detection as well as elsewhere in medicine. This book is an essential tool for students and professionals, compiling and explaining proven and cutting-edge methods in pattern recognition for medical imaging.
This book addresses two-person zero-sum finite games in which the payoffs in any situation are expressed with fuzzy numbers. The purpose of this book is to develop a suite of effective and efficient linear programming models and methods for solving matrix games with payoffs in fuzzy numbers. Divided into six chapters, it discusses the concepts of solutions of matrix games with payoffs of intervals, along with their linear programming models and methods. Furthermore, it is directly relevant to the research field of matrix games under uncertain economic management. The book offers a valuable resource for readers involved in theoretical research and practical applications from a range of different fields including game theory, operational research, management science, fuzzy mathematical programming, fuzzy mathematics, industrial engineering, business and social economics.
This book focuses on one of the major challenges of the newly created scientific domain known as data science: turning data into actionable knowledge in order to exploit increasing data volumes and deal with their inherent complexity. Actionable knowledge has been qualitatively and intensively studied in management, business, and the social sciences; in computer science and engineering, its connection to data mining and its evolution, 'Knowledge Discovery and Data Mining' (KDD), has only recently been established. Data mining seeks to extract interesting patterns from data, but, until now, the patterns discovered from data have not always been 'actionable' for decision-makers in Socio-Technical Organizations (STO). With the evolution of the Internet and connectivity, STOs have evolved into the Cyber-Physical and Social Systems (CPSS) that describe our world today. In such complex and dynamic environments, the conventional KDD process is insufficient, and additional processes are required to transform complex data into actionable knowledge. Readers are presented with advanced knowledge concepts and the analytics and information fusion (AIF) processes aimed at delivering actionable knowledge. The authors provide an understanding of the concept of 'relation' and its exploitation, relational calculus, as well as the formalization of specific dimensions of knowledge that achieve semantic growth along the AIF processes. This book serves as an important technical presentation of relational calculus and its application to processing chains in order to generate actionable knowledge. It is ideal for graduate students, researchers, or industry professionals interested in decision science and knowledge engineering.
The central contention of this book is that second-order logic has a central role to play in laying the foundations of mathematics. In order to develop the argument fully, the author presents a detailed development of higher-order logic, including a comprehensive discussion of its semantics. Professor Shapiro demonstrates the prevalence of second-order notions in mathematics as it is practised, and also the extent to which mathematical concepts can be formulated in second-order languages. He shows how first-order languages are insufficient to codify many concepts in contemporary mathematics, and thus that higher-order logic is needed to fully reflect current mathematics. Throughout, the emphasis is on discussing the philosophical and historical issues associated with this subject, and the implications that they have for foundational studies. For the most part, the author assumes little more than a familiarity with logic as might be gained from a beginning graduate course which includes the incompleteness of arithmetic and the Lowenheim-Skolem theorems. All those concerned with the foundations of mathematics will find this a thought-provoking discussion of some of the central issues in this subject.
This is a first course in propositional modal logic, suitable for mathematicians, computer scientists and philosophers. Emphasis is placed on semantic aspects, in the form of labelled transition structures, rather than on proof theory. The book covers all the basic material - propositional languages, semantics and correspondence results, proof systems and completeness results - as well as some topics not usually covered in a modal logic course. It is written from a mathematical standpoint. To help the reader, the material is covered in short chapters, each concentrating on one topic. These are arranged into five parts, each with a common theme. An important feature of the book is the many exercises, and an extensive set of solutions is provided.
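The semantic viewpoint emphasized above, evaluating formulas over a labelled transition structure (a Kripke model), can be sketched in a few lines. The formula encoding and all names below are illustrative assumptions, not the book's notation: formulas are nested tuples such as ('p',), ('not', f), ('and', f, g), ('box', f) and ('dia', f):

```python
# Evaluate a propositional modal formula at a world of a Kripke model.
# `relation` maps each world to its successor worlds; `valuation` maps
# each atomic proposition to the set of worlds where it is true.
def holds(world, formula, relation, valuation):
    op = formula[0]
    if op == 'not':
        return not holds(world, formula[1], relation, valuation)
    if op == 'and':
        return all(holds(world, f, relation, valuation) for f in formula[1:])
    if op == 'box':   # "necessarily": true at every successor world
        return all(holds(w, formula[1], relation, valuation)
                   for w in relation.get(world, ()))
    if op == 'dia':   # "possibly": true at some successor world
        return any(holds(w, formula[1], relation, valuation)
                   for w in relation.get(world, ()))
    return world in valuation.get(op, ())   # atomic proposition

# A two-world model: w1 -> w2, with p true only at w2.
R = {'w1': ['w2'], 'w2': []}
V = {'p': {'w2'}}
```

In this model, holds('w1', ('box', ('p',)), R, V) is True because p holds at w1's only successor, while holds('w2', ('dia', ('p',)), R, V) is False because w2 has no successors at all.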
Computability and Logic has become a classic because of its accessibility to students without a mathematical background and because it covers not simply the staple topics of an intermediate logic course, such as Godel's incompleteness theorems, but also a large number of optional topics, from Turing's theory of computability to Ramsey's theorem. This 2007 fifth edition has been thoroughly revised by John Burgess. Including a selection of exercises, adjusted for this edition, at the end of each chapter, it offers a simpler treatment of the representability of recursive functions, a traditional stumbling block for students on the way to the Godel incompleteness theorems. This updated edition is also accompanied by a website as well as an instructor's manual.
Introduces the GUHA method of mechanizing hypothesis formation as a data mining tool. Presents examples of data mining with enhanced association rules, histograms, contingency tables and action rules. Provides examples of data mining for exception rules and examples of subgroups discovery. Outlines possibilities of GUHA in business intelligence and big data. Overviews related theoretical results and challenges related to mechanizing hypothesis formation.
Soft computing encompasses various computational methodologies, which, unlike conventional algorithms, are tolerant of imprecision, uncertainty, and partial truth. Soft computing technologies offer adaptability as a characteristic feature and thus permit the tracking of a problem through a changing environment. Besides some recent developments in areas like rough sets and probabilistic networks, fuzzy logic, evolutionary algorithms, and artificial neural networks are core ingredients of soft computing, which are all bio-inspired and can easily be combined synergetically. This book presents a well-balanced integration of fuzzy logic, evolutionary computing, and neural information processing. The three constituents are introduced to the reader systematically and brought together in differentiated combinations step by step. The text was developed from courses given by the authors and offers numerous illustrations.
A compilation of papers presented at the 2001 European Summer Meeting of the Association for Symbolic Logic, Logic Colloquium '01 includes surveys and research articles from some of the world's preeminent logicians. Two long articles are based on tutorials given at the meeting and present accessible expositions of research in two active areas of logic, geometric model theory and descriptive set theory of group actions. The remaining articles cover separate research topics in many areas of mathematical logic, including applications in computer science, proof theory, set theory, model theory, computability theory, and aspects of philosophy. This collection will be of interest not only to specialists in mathematical logic, but also to philosophical logicians, historians of logic, computer scientists, formal linguists and mathematicians in the areas of algebra, abstract analysis and topology. A number of the articles are aimed at non-specialists and serve as good introductions for graduate students.