This meticulous critical assessment of the ground-breaking work of philosopher Stanisław Leśniewski focuses exclusively on primary texts and explores the full range of output by one of the master logicians of the Lvov-Warsaw school. The author's nuanced survey eschews secondary commentary, analyzing Leśniewski's core philosophical views and evaluating the formulations that were to have such a profound influence on the evolution of mathematical logic. One of the undisputed leaders of the cohort of brilliant logicians that congregated in Poland in the early twentieth century, Leśniewski was a guide and mentor to a generation of celebrated analytical philosophers (Alfred Tarski was his PhD student). His primary achievement was a system of foundational mathematical logic intended as an alternative to the Principia Mathematica of Alfred North Whitehead and Bertrand Russell. Its three strands, 'protothetic', 'ontology', and 'mereology', are detailed in discrete sections of this volume, alongside a wealth of other chapters grouped to provide the fullest possible coverage of Leśniewski's academic output. With material on his early philosophical views, his contributions to set theory, and his work on nominalism and higher-order quantification, this book offers a uniquely expansive critical commentary on one of analytical philosophy's great pioneers.
This book deals with the problem of finding suitable languages that can represent specific classes of Petri nets, the most studied and widely accepted model for distributed systems. Hence, the contribution of this book amounts to the alphabetization of some classes of distributed systems. The book also suggests the need for a generalization of Turing computability theory. It is important for graduate students and researchers engaged with the concurrent semantics of distributed communicating systems. The author assumes some prior knowledge of formal languages and theoretical computer science.
This collection of papers, published in honour of Hector J. Levesque on the occasion of his 60th birthday, addresses a number of core areas in the field of knowledge representation and reasoning. In a broad sense, the book is about knowledge and belief, tractable reasoning, and reasoning about action and change. More specifically, the book contains contributions to Description Logics, the expressiveness of knowledge representation languages, limited forms of inference, satisfiability (SAT), the logical foundations of BDI architectures, only-knowing, belief revision, planning, causation, the situation calculus, the action language Golog, and cognitive robotics.
Fuzzy Cluster Analysis presents advanced and powerful fuzzy clustering techniques. This thorough and self-contained introduction to fuzzy clustering methods and applications covers classification, image recognition, data analysis and rule generation. Combining theoretical and practical perspectives, each method is analysed in detail and fully illustrated with examples.
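As a concrete illustration of the family of methods the book covers, here is a minimal fuzzy c-means sketch in Python (the classic fuzzy clustering algorithm; the code and toy data are illustrative, not taken from the book). Each point receives a graded membership in every cluster instead of a hard assignment:

import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    # Alternate between updating cluster prototypes and soft memberships.
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=X.shape[0])      # n x c memberships, rows sum to 1
    for _ in range(n_iter):
        W = U ** m                                      # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / dist ** (2 / (m - 1))                 # closer centers earn higher membership
        U /= U.sum(axis=1, keepdims=True)               # renormalize each row
    return centers, U

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
centers, U = fuzzy_c_means(X)
print(centers)    # two cluster prototypes
print(U[0])       # soft membership of the first point in each cluster

The fuzzifier m controls how soft the assignments are; as m approaches 1 the memberships harden toward k-means-style assignments.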
This visionary and engaging book provides a mathematical perspective on the fundamental ideas of numbers, space, life, evolution, the brain and the mind. The author suggests how a development of mathematical concepts in the spirit of category theory may lead to unravelling the mystery of the human mind and the design of universal learning algorithms. The book is divided into two parts, the first of which describes the ideas of great mathematicians and scientists, those who saw sparks of light in the dark sea of the unknown. The second part, Memorandum Ergo, reflects on how mathematics can contribute to the understanding of the mystery of thought. It argues that the core of the human mind is a structurally elaborated object whose understanding requires the creation of a broad mathematical context. Readers will discover the main properties of the expected mathematical objects within this context, called ERGO-SYSTEMS, and will see how these "systems" may serve as prototypes for the design of universal learning computer programs. This is a work of great, poetical insight and is richly illustrated. It is a highly attractive read for all those who welcome a mathematical and scientific way of thinking about the world.
This book presents the entire body of thought of Norbert Wiener (1894-1964), knowledge of which is essential if one wishes to understand and correctly interpret the age in which we live. The focus is in particular on the philosophical and sociological aspects of Wiener's thought, but these aspects are carefully framed within the context of his scientific journey. Important biographical events, including some that were previously unknown, are also highlighted, but while the book has a biographical structure, it is not only a biography. The book is divided into four chronological sections, the first two of which explore Wiener's development as a philosopher and logician and his brilliant interwar career as a mathematician, supported by his philosophical background. The third section considers his research during World War II, which drew upon his previous scientific work and reflections and led to the birth of cybernetics. Finally, the radical post-war shift in Wiener's intellectual path is considered, examining how he came to abandon computer science projects and commenced ceaseless public reflections on the new sciences and technologies of information, their social effects, and the need for responsibility in science.
This monograph proposes a new way of implementing interaction in logic. It also provides an elementary introduction to Constructive Type Theory (CTT). The authors equally emphasize basic ideas and finer technical details. In addition, many worked-out exercises and examples will help readers to better understand the concepts under discussion. One of the chief ideas animating this study is that the dialogical understanding of definitional equality and its execution provide both a simple and a direct way of implementing the CTT approach within a game-theoretical conception of meaning. In addition, the importance of the play level over the strategy level is stressed, binding together the matter of execution with that of equality and the finitary perspective on games constituting meaning. According to this perspective, concepts emerge not only through games of giving and asking for reasons (games involving Why-questions), but also through games that include moves establishing how it is that the reasons brought forward accomplish their explicative task. Thus, immanent reasoning games are dialogical games of Why and How.
In recent years, mathematical logic has developed in many directions, the initial unity of its subject matter giving way to a myriad of seemingly unrelated areas. The articles collected here, which range from historical scholarship to recent research in geometric model theory, squarely address this development. These articles also connect to the diverse work of Väänänen, whose ecumenical approach to logic reflects the unity of the discipline.
This book is a comprehensive, systematic survey of the synthesis problem, and of region theory which underlies its solution, covering the related theory, algorithms, and applications. The authors focus on safe Petri nets and place/transition nets (P/T-nets), treating synthesis as an automated process which, given behavioural specifications or partial specifications of a system to be realized, decides whether the specifications are feasible, and then produces a Petri net realizing them exactly, or if this is not possible produces a Petri net realizing an optimal approximation of the specifications. In Part I the authors introduce elementary net synthesis. In Part II they explain variations of elementary net synthesis and the unified theory of net synthesis. The first three chapters of Part III address the linear algebraic structure of regions, synthesis of P/T-nets from finite initialized transition systems, and the synthesis of unbounded P/T-nets. Finally, the last chapter in Part III and the chapters in Part IV cover more advanced topics and applications: P/T-nets with the step firing rule, extracting concurrency from transition systems, process discovery, supervisory control, and the design of speed-independent circuits. Most chapters conclude with exercises, and the book is a valuable reference for graduate students of computer science and electrical engineering as well as researchers and engineers in this domain.
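To make the object of synthesis concrete, here is a minimal Python sketch of the P/T-net firing rule (an illustrative toy only; the synthesis algorithms in the book construct such nets from behavioural specifications):

# A transition is enabled if every input place holds enough tokens;
# firing consumes tokens from input places and produces tokens in output places.
def enabled(marking, pre):
    return all(marking.get(p, 0) >= w for p, w in pre.items())

def fire(marking, pre, post):
    assert enabled(marking, pre)
    m = dict(marking)
    for p, w in pre.items():
        m[p] -= w
    for p, w in post.items():
        m[p] = m.get(p, 0) + w
    return m

# Toy net: a transition moves one token from 'buffer' to 'consumed'.
m0 = {"buffer": 2, "consumed": 0}
m1 = fire(m0, pre={"buffer": 1}, post={"consumed": 1})
print(m1)    # {'buffer': 1, 'consumed': 1}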
This monograph considers several well-known mathematical theorems and asks the question, "Why prove it again?" while examining alternative proofs. It explores the different rationales mathematicians may have for pursuing and presenting new proofs of previously established results, as well as how they judge whether two proofs of a given result are different. While a number of books have examined alternative proofs of individual theorems, this is the first that presents comparative case studies of other methods for a variety of different theorems. The author begins by laying out the criteria for distinguishing among proofs and enumerates reasons why new proofs have, for so long, played a prominent role in mathematical practice. He then outlines various purposes that alternative proofs may serve. Each chapter that follows provides a detailed case study of alternative proofs for particular theorems, including the Pythagorean Theorem, the Fundamental Theorem of Arithmetic, Desargues' Theorem, the Prime Number Theorem, and the proof of the irreducibility of cyclotomic polynomials. Why Prove It Again? will appeal to a broad range of readers, including historians and philosophers of mathematics, students, and practicing mathematicians. Additionally, teachers will find it to be a useful source of alternative methods of presenting material to their students.
Computability Theory: An Introduction to Recursion Theory provides a concise, comprehensive, and authoritative introduction to contemporary computability theory, techniques, and results. The basic concepts and techniques of computability theory are placed in their historical, philosophical, and logical context. This presentation is characterized by an unusual breadth of coverage and the inclusion of advanced topics not to be found elsewhere in the literature at this level. The text includes both the standard material for a first course in computability and more advanced looks at degree structures, forcing, priority methods, and determinacy. The final chapter explores a variety of computability applications to mathematics and science. Computability Theory is an invaluable text, reference, and guide to the direction of current research in the field. Nowhere else will you find the techniques and results of this beautiful and basic subject brought alive in such an approachable way. Features: frequent historical information presented throughout; more extensive motivation for each of the topics than other texts currently available; connections with topics not included in other textbooks, such as complexity theory.
An ontology is a formal description of concepts and relationships that can exist for a community of human and/or machine agents. The notion of ontologies is crucial for the purpose of enabling knowledge sharing and reuse. The Handbook on Ontologies provides a comprehensive overview of the current status and future prospects of the field of ontologies, considering ontology languages, ontology engineering methods, example ontologies, infrastructures and technologies for ontologies, and how to bring this all into ontology-based infrastructures and applications that are among the best of their kind. The field of ontologies has tremendously developed and grown in the five years since the first edition of the "Handbook on Ontologies." Therefore, its revision includes 21 completely new chapters as well as a major re-working of 15 chapters transferred to this second edition.
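As a toy illustration of "concepts and relationships" (the class names below are invented; real ontologies use the languages and engineering methods surveyed in the handbook, such as OWL):

# A miniature is-a hierarchy with a subsumption check.
is_a = {"Dog": "Mammal", "Cat": "Mammal", "Mammal": "Animal"}

def subsumes(general, specific):
    # Walk up the hierarchy from `specific`; True if we reach `general`.
    while specific is not None:
        if specific == general:
            return True
        specific = is_a.get(specific)
    return False

print(subsumes("Animal", "Dog"))    # True: every Dog is an Animal
print(subsumes("Cat", "Dog"))       # False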
This book offers an introduction to artificial adaptive systems and a general model of the relationships between the data and the algorithms used to analyze them. It subsequently describes artificial neural networks as a subclass of artificial adaptive systems, and reports on the backpropagation algorithm, while also identifying an important connection between supervised and unsupervised artificial neural networks. The book's primary focus is on the auto contractive map, an unsupervised artificial neural network employing a fixed-point method rather than traditional energy minimization. This is a powerful tool for understanding, associating and transforming data, as demonstrated in the numerous examples presented here. A supervised version of the auto contractive map is also introduced as an outstanding method for recognizing digits and defects. In closing, the book walks the readers through the theory and examples of how the auto contractive map can be used in conjunction with another artificial neural network, the "spin-net," as a dynamic form of auto-associative memory.
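The fixed-point idea mentioned above can be shown in miniature. The following is plain fixed-point iteration on a contractive map, not the auto contractive map algorithm itself; it only illustrates the principle of converging to a fixed point rather than minimizing an energy function:

import math

def fixed_point(f, x0, tol=1e-10, max_iter=1000):
    # Repeatedly apply f until the state stops changing.
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# cos is contractive near its fixed point, approximately 0.739085.
print(fixed_point(math.cos, 1.0))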
In his rich and varied career as a mathematician, computer scientist, and educator, Jacob T. Schwartz wrote seminal works in analysis, mathematical economics, programming languages, algorithmics, and computational geometry. In this volume of essays, his friends, students, and collaborators at the Courant Institute of Mathematical Sciences present recent results in some of the fields that Schwartz explored: quantum theory, the theory and practice of programming, program correctness and decision procedures, dexterous manipulation in robotics, motion planning, and genomics. In addition to presenting recent results in these fields, these essays illuminate the astonishingly productive trajectory of a brilliant and original scientist and thinker.
Since the birth of rational homotopy theory, the possibility of extending the Quillen approach - in terms of Lie algebras - to a more general category of spaces, including the non-simply connected case, has been a challenge for the algebraic topologist community. Despite the clear Eckmann-Hilton duality between Quillen and Sullivan treatments, the simplicity in the realization of algebraic structures in the latter contrasts with the complexity required by the Lie algebra version. In this book, the authors develop new tools to address these problems. Working with complete Lie algebras, they construct, in a combinatorial way, a cosimplicial Lie model for the standard simplices. This is a key object, which allows the definition of a new model and realization functors that turn out to be homotopically equivalent to the classical Quillen functors in the simply connected case. With this, the authors open new avenues for solving old problems and posing new questions. This monograph is the winner of the 2020 Ferran Sunyer i Balaguer Prize, a prestigious award for books of expository nature presenting the latest developments in an active area of research in mathematics.
This book offers an inspiring and naive view on language and reasoning. It presents a new approach to ordinary reasoning that follows the author's former work on fuzzy logic. Starting from a pragmatic scientific view on meaning as a quantity, and the common sense reasoning from a primitive notion of inference, which is shared by both laypeople and experts, the book shows how this can evolve, through the addition of more and more suppositions, into various formal and specialized modes of precise, imprecise, and approximate reasoning. The logos are intended here as a synonym for rationality, which is usually shown by the processes of questioning, guessing, telling, and computing. Written in a discursive style and without too many technicalities, the book presents a number of reflections on the study of reasoning, together with a new perspective on fuzzy logic and Zadeh's "computing with words" grounded in both language and reasoning. It also highlights some mathematical developments supporting this view. Lastly, it addresses a series of questions aimed at fostering new discussions and future research into this topic. All in all, this book represents an inspiring read for professors and researchers in computer science, and fuzzy logic in particular, as well as for psychologists, linguists and philosophers.
This open access book examines the many contributions of Paul Lorenzen, an outstanding philosopher of the latter half of the 20th century. It features papers focused on integrating Lorenzen's original approach into the history of logic and mathematics. The papers also explore how practitioners can implement Lorenzen's systematical ideas in today's debates on proof-theoretic semantics, databank management, and stochastics. Coverage details key contributions of Lorenzen to constructive mathematics, Lorenzen's work on lattice-groups and divisibility theory, and modern set theory and Lorenzen's critique of actual infinity. The contributors also look at the main problem of Grundlagenforschung and Lorenzen's consistency proof and Hilbert's larger program. In addition, the papers offer a constructive examination of a Russell-style Ramified Type Theory and a way out of the circularity puzzle within the operative justification of logic and mathematics. Paul Lorenzen's name is associated with the Erlangen School of Methodical Constructivism, whose approach to linguistic philosophy and the philosophy of science shaped philosophical discussions in Germany especially during the 1960s and 1970s. This volume features 10 papers from a meeting that took place at the University of Konstanz.
Weighted finite automata are classical nondeterministic finite automata in which the transitions carry weights. These weights may model, for example, the cost involved when executing a transition, the resources or time needed for this, or the probability or reliability of its successful execution. Weights can also be added to classical automata with infinite state sets like pushdown automata, and this extension constitutes the general concept of weighted automata. Since their introduction in the 1960s they have stimulated research in related areas of theoretical computer science, including formal language theory, algebra, logic, and discrete structures. Moreover, weighted automata and weighted context-free grammars have found application in natural-language processing, speech recognition, and digital image compression. This book covers all the main aspects of weighted automata and formal power series methods, ranging from theory to applications. The contributors are the leading experts in their respective areas, and each chapter presents a detailed survey of the state of the art and pointers to future research. The chapters in Part I cover the foundations of the theory of weighted automata, specifically addressing semirings, power series, and fixed point theory. Part II investigates different concepts of weighted recognizability. Part III examines alternative types of weighted automata and various discrete structures other than words. Finally, Part IV deals with applications of weighted automata, including digital image compression, fuzzy languages, model checking, and natural-language processing. Computer scientists and mathematicians will find this book an excellent survey and reference volume, and it will also be a valuable resource for students exploring this exciting research area.
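A small Python sketch shows the core computation: the weight of a word is obtained by multiplying one matrix per letter between an initial and a final vector. The two-state automaton below is a hypothetical example over the ordinary (plus, times) semiring, not one from the book:

import numpy as np

# One weight matrix per input symbol; entry [i, j] is the weight of
# moving from state i to state j on that symbol.
M = {
    "a": np.array([[0.5, 0.5],
                   [0.0, 1.0]]),
    "b": np.array([[1.0, 0.0],
                   [0.2, 0.8]]),
}
init = np.array([1.0, 0.0])     # start in state 0
final = np.array([0.0, 1.0])    # accept in state 1

def weight(word):
    v = init
    for symbol in word:
        v = v @ M[symbol]
    return v @ final            # sums the weights of all accepting runs

print(weight("ab"))    # 0.4

Replacing (plus, times) by another semiring, for instance (min, plus) for shortest paths, reinterprets the same computation, which is exactly the generality the semiring framework provides.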
Boolean functions are the building blocks of symmetric cryptographic systems. Symmetric cryptographic algorithms are fundamental tools in the design of all types of digital security systems (e.g., communications, financial, and e-commerce systems).
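As a small worked example of how Boolean functions are assessed cryptographically, the Python sketch below computes the nonlinearity of a Boolean function via its Walsh-Hadamard spectrum, a standard criterion; the function and code are illustrative, not taken from the book:

def walsh_spectrum(truth_table):
    # Fast Walsh-Hadamard transform of (-1)^f; length must be a power of two.
    w = [(-1) ** bit for bit in truth_table]
    step = 1
    while step < len(w):
        for i in range(0, len(w), 2 * step):
            for j in range(i, i + step):
                w[j], w[j + step] = w[j] + w[j + step], w[j] - w[j + step]
        step *= 2
    return w

# f(x1, x2, x3) = (x1 AND x2) XOR x3, truth table in lexicographic order.
f = [(x1 & x2) ^ x3 for x1 in (0, 1) for x2 in (0, 1) for x3 in (0, 1)]
spectrum = walsh_spectrum(f)
nonlinearity = (len(f) - max(abs(v) for v in spectrum)) // 2
print(nonlinearity)    # 2: f differs from the nearest affine function on 2 of 8 inputs

Higher nonlinearity makes a component function harder to approximate by affine functions, which is why this measure matters in symmetric cipher design.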
Let's try to play the music and not the background. Ornette Coleman, liner notes of the LP "Free Jazz" [20] When I began to create a course on free jazz, the risk of such an enterprise was immediately apparent: I knew that Cecil Taylor had failed to teach such a matter, and that for other, more academic instructors, the topic was still a sort of outlandish adventure. To be clear, we are not talking about teaching improvisation here, a different, and also problematic, matter; rather, we wish to create a scholarly discourse about free jazz as a cultural achievement, and follow its genealogy from the American jazz tradition through its various outbranchings, such as the European and Japanese jazz conceptions and interpretations. We also wish to discuss some of the underlying mechanisms that are extant in free improvisation, things that could be called technical aspects. Such a discourse bears the flavor of a contradictio in adjecto: teaching the unteachable, the very negation of rules, above all those posited by white jazz theorists, and talking about the making of sounds without aiming at so-called factual results and all those intellectual sedimentations: is this not a suicidal topic? My own endeavors as a free jazz pianist have informed and advanced my conviction that this art has never been theorized in a satisfactory way, not even by Ekkehard Jost in his unequaled, phenomenologically precise pioneering book "Free Jazz" [57].
The book offers a comprehensive survey of intuitionistic fuzzy logics. By reporting on both the author's research and others' findings, it provides readers with a complete overview of the field and highlights key issues and open problems, thus suggesting new research directions. Starting with an introduction to the basic elements of intuitionistic fuzzy propositional calculus, it then provides a guide to the use of intuitionistic fuzzy operators and quantifiers, and lastly presents state-of-the-art applications of intuitionistic fuzzy sets. The book is a valuable reference resource for graduate students and researchers alike.
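A minimal sketch of the basic objects involved (standard min/max connectives on Atanassov-style pairs; the book surveys many more operators than shown here):

# An intuitionistic fuzzy pair (mu, nu) carries a membership degree mu and a
# non-membership degree nu with mu + nu <= 1; the slack is the hesitation margin.
def ifs_not(p):
    mu, nu = p
    return (nu, mu)

def ifs_and(p, q):
    return (min(p[0], q[0]), max(p[1], q[1]))

def ifs_or(p, q):
    return (max(p[0], q[0]), min(p[1], q[1]))

p = (0.6, 0.3)    # fairly true, some evidence against, 0.1 hesitation
q = (0.4, 0.5)
print(ifs_and(p, q))    # (0.4, 0.5)
print(ifs_not(p))       # (0.3, 0.6)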
This book features a unique approach to the teaching of mathematical logic by putting it in the context of the puzzles and paradoxes of common language and rational thought. It serves as a bridge from the author's puzzle books to his technical writing in the fascinating field of mathematical logic. Using the logic of lying and truth-telling, the author introduces the readers to informal reasoning preparing them for the formal study of symbolic logic, from propositional logic to first-order logic, a subject that has many important applications to philosophy, mathematics, and computer science. The book includes a journey through the amazing labyrinths of infinity, which have stirred the imagination of mankind as much, if not more, than any other subject.
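The logic of lying and truth-telling lends itself to tiny brute-force demonstrations. The following Python sketch solves a generic knights-and-knaves puzzle (not one taken from this book): knights always tell the truth, knaves always lie.

from itertools import product

# A says "B is a knave"; B says "A and I are of different kinds".
for a_knight, b_knight in product([True, False], repeat=2):
    a_statement = not b_knight               # content of A's claim
    b_statement = a_knight != b_knight       # content of B's claim
    # Each speaker's truthfulness must match the truth of their statement.
    if a_knight == a_statement and b_knight == b_statement:
        print("A is a", "knight" if a_knight else "knave")
        print("B is a", "knight" if b_knight else "knave")

The only consistent assignment makes A a knave and B a knight, exactly the kind of informal reasoning the book uses as a bridge to symbolic logic.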
Logic networks and automata are facets of digital systems. The transformation of logic network design from skills and art into a scientific discipline was made possible by the development of the underlying mathematical theory called Switching Theory. The fundamentals of this theory come from the attempts towards an algebraic description of the laws of thought presented in the works of George Boole and the works on logic by Augustus De Morgan. As is often the case in engineering, when the importance of a problem and the need for solving it reach certain limits, solutions are sought by many scholars in different parts of the world at about the same time, yet quite independently and often unaware of the work of other scholars. The formulation and rise of Switching Theory is such an example. This book presents a brief account of the development of Switching Theory and highlights some less known facts in its history. Readers will find the book a fresh look into the development of the field, revealing how difficult it has been to arrive at many of the concepts that we now consider obvious. Researchers in the history or philosophy of computing will find this book a valuable source of information that complements the standard presentations of the topic.
You may like...
From Quantum Information to Musical… by Maria Luisa Dalla Chiara, Roberto Giuntini, … (Paperback, R507)
Emerging Applications of Fuzzy Algebraic… by Chiranjibe Jana, Tapan Senapati, … (Hardcover, R8,595)
Logic from Russell to Church, Volume 5 by Dov M. Gabbay, John Woods (Hardcover, R5,472)
An Elementary Arithmetic [microform] by a Committee of Teachers Supervised (Hardcover, R830)
Elementary Lessons in Logic - Deductive… by William Stanley Jevons (Paperback, R560)
Primary Maths for Scotland Textbook 2A… by Craig Lowther, Antoinette Irwin, … (Paperback)