Spline functions entered Approximation Theory as solutions of natural extremal problems. A typical example is the problem of drawing a function curve through n + k given points so that the norm of its k-th derivative is minimal. Isolated facts about the functions now called splines can be found in the papers of L. Euler, A. Lebesgue, G. Birkhoff, J. Favard, and L. Tschakaloff. However, the theory of spline functions has developed over the last 30 years through the efforts of dozens of mathematicians. Recent fundamental results on multivariate polynomial interpolation and multivariate splines have initiated a new wave of theoretical investigations and a variety of applications. The purpose of this book is to introduce the reader to the theory of spline functions. The emphasis is on some new developments, such as general Birkhoff-type interpolation, the extremal properties of splines and their prominent role in the optimal recovery of functions, and multivariate interpolation by polynomials and splines. The material presented is based on lectures given by the authors to students at the University of Sofia and Yerevan University over the last 10 years. Some of the more elementary results are left as exercises, with detailed hints.
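As a hedged illustration of the extremal problem mentioned above (the blurb does not fix the norm; assume the L2 norm of the k-th derivative on an interval [a, b] containing interpolation points t_1 < ... < t_{n+k}), the problem reads
\[
\min_{f}\ \int_a^b \bigl(f^{(k)}(t)\bigr)^2\,dt
\quad\text{subject to}\quad f(t_i)=y_i,\ \ i=1,\dots,n+k,
\]
and under this assumption its solution is the natural interpolating spline of degree 2k-1 with knots at the points t_i.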
Algorithmic Information Theory treats the mathematics of many important areas in digital information processing. It has been written as a read-and-learn book on concrete mathematics, for teachers, students and practitioners in electronic engineering, computer science and mathematics. The presentation is dense, and the examples and exercises are numerous. It is based on lectures on information technology (Data Compaction, Cryptography, Polynomial Coding) for engineers.
Stig Kanger (1924-1988) made important contributions to logic and formal philosophy. Kanger's dissertation Provability in Logic, 1957, contained significant results in proof theory as well as the first fully worked out model-theoretic interpretation of quantified modal logic. It is generally accepted nowadays that Kanger was one of the originators of possible worlds semantics for modal logic. Kanger's most original achievements were in the areas of general proof theory, the semantics of modal and deontic logic, and the logical analysis of the concept of rights. He also contributed to action theory, preference logic, and the theory of measurement. This is the first of two volumes dedicated to the work of Stig Kanger. The present volume is a complete collection of Kanger's philosophical papers. The second volume contains critical essays on Kanger's work, as well as biographical essays on Kanger written by colleagues and friends.
The modern theory of algebras of binary relations, reformulated by Tarski as an abstract, algebraic, equational theory of relation algebras, has considerable mathematical significance, with applications in various fields: e.g., in computer science (databases, specification theory, AI) and in anthropology, economics, physics, and philosophical logic.
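For orientation, a standard presentation (not spelled out in the text) writes a relation algebra as a Boolean algebra with three extra operations,
\[
\mathfrak{A} = (A, +, \cdot, -, 0, 1, ;, \breve{\ }\,, 1'),
\]
where ; is relative (relational) composition, the breve denotes converse, and 1' is the identity element; in the concrete case A is a family of binary relations on a set, and these operations are interpreted as composition of relations, converse, and the identity relation.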
The classic results obtained by Gödel, Tarski, Kleene, and Church in the early thirties are the finest flowers of symbolic logic. They are of fundamental importance to those investigations of the foundations of mathematics via the concept of a formal system that were inaugurated by Frege, and of obvious significance to the mathematical disciplines, such as computability theory, that developed from them.
This book gives an intuitive and hands-on introduction to Topological Data Analysis (TDA). Covering a wide range of topics at levels of sophistication varying from elementary (matrix algebra) to esoteric (the Grothendieck spectral sequence), it offers a mirror of data science aimed at a general mathematical audience. The required algebraic background is developed in detail. The first third of the book reviews several core areas of mathematics, beginning with basic linear algebra and applications to data fitting and web search algorithms, followed by quick primers on algebra and topology. The middle third introduces algebraic topology, along with applications to sensor networks and voter ranking. The last third covers key contemporary tools in TDA: persistent and multiparameter persistent homology. Also included are a user's guide to derived functors and spectral sequences (useful but somewhat technical tools which have recently found applications in TDA), and an appendix illustrating a number of software packages used in the field. Based on a course given as part of a master's degree in statistics, the book is appropriate for graduate students.
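As a brief sketch of the central definition (standard in the field, stated here rather than quoted from the book), persistent homology tracks homology classes along a filtration of spaces
\[
X_1 \subseteq X_2 \subseteq \dots \subseteq X_n,
\qquad
H_k^{i,j} = \operatorname{im}\bigl(H_k(X_i) \longrightarrow H_k(X_j)\bigr), \quad i \le j,
\]
and the resulting birth-death intervals of classes are summarized in persistence diagrams or barcodes; multiparameter persistence replaces the single chain of inclusions by a grid of spaces indexed by several parameters.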
This book deals with the rise of mathematics in the physical sciences, beginning with Galileo and Newton and extending to the present day. The book is divided into two parts. The first part gives a brief history of how mathematics was introduced into physics, despite its "unreasonable effectiveness" as famously pointed out by a distinguished physicist, and the criticisms it received from earlier thinkers. The second part takes a more philosophical approach and is intended to shed some light on that mysterious effectiveness. For this purpose, the author reviews the debate between classical philosophers on the existence of innate ideas that allow us to understand the world, and also the philosophically based arguments for and against the use of mathematics in the physical sciences. In this context, Schopenhauer's conceptions of causality and matter are very pertinent, and their validity is revisited in light of modern physics. The final question addressed is whether the effectiveness of mathematics can be explained by its "existence" in an independent Platonic realm, as Gödel believed. The book is aimed at readers interested in the history and philosophy of physics. It is accessible to those with only a very basic (not professional) knowledge of physics.
This book centers around a dialogue between Roger Penrose and Emanuele Severino about one of the most intriguing topics of our times: the comparison of artificial intelligence and natural intelligence, as well as its extension to the notions of human and machine consciousness. Additional insightful essays by Mauro D'Ariano, Federico Faggin, Ines Testoni, and Giuseppe Vitiello, and an introduction by Fabio Scardigli, complete the book and illuminate different aspects of the debate. Although coming from completely different points of view, all the authors seem to converge on the idea that it is almost impossible to have real "intelligence" without a form of "consciousness". In fact, consciousness, often conceived as an enigmatic "mirror" of reality (but is it really a mirror?), is a phenomenon under intense investigation by science and technology, particularly in recent decades. Where does this phenomenon originate (in humans, and perhaps also in animals)? Is it reproducible on some "device"? Do we have a theory of consciousness today? Will we ever be able to build thinking or conscious machines, as machine learning and cognitive computing seem to promise? These questions and other related issues are discussed in the pages of this work, which provides stimulating reading for both specialists and general readers. The chapter "Hard Problem and Free Will: An Information-Theoretical Approach" is available open access under a Creative Commons Attribution 4.0 International License via link.springer.com.
This volume offers English translations of three early works by Ernst Schroeder (1841-1902), a mathematician and logician whose philosophical ruminations and pathbreaking contributions to algebraic logic attracted the admiration and ire of figures such as Dedekind, Frege, Husserl, and C. S. Peirce. Today he still engages the sympathetic interest of logicians and philosophers. The works translated record Schroeder's journey out of algebra into algebraic logic and document his transformation of George Boole's opaque and unwieldy logical calculus into what we now recognize as Boolean algebra. Readers interested in algebraic logic and abstract algebra can look forward to a tour of the early history of those fields with a guide who was exceptionally thorough, unfailingly honest, and deeply reflective.
Quadratic equations, Pythagoras' theorem, imaginary numbers, and pi - you may remember studying these at school, but did anyone ever explain why? Never fear - bestselling science writer, and your new favourite maths teacher, Michael Brooks, is here to help. In The Maths That Made Us, Brooks reminds us of the wonders of numbers: how they enabled explorers to travel far across the seas and astronomers to map the heavens; how they won wars and halted the HIV epidemic; how they are responsible for the design of your home and almost everything in it, down to the smartphone in your pocket. His clear explanations of the maths that built our world, along with stories about where it came from and how it shaped human history, will engage and delight. From ancient Egyptian priests to the Apollo astronauts, and Babylonian tax collectors to juggling robots, join Brooks and his extraordinarily eccentric cast of characters in discovering how maths made us who we are today.
Games, Norms, and Reasons: Logic at the Crossroads provides an overview of modern logic focusing on its relationships with other disciplines, including new interfaces with rational choice theory, epistemology, game theory and informatics. This book continues a series called "Logic at the Crossroads" whose title reflects a view that the deep insights from the classical phase of mathematical logic can form a harmonious mixture with a new, more ambitious research agenda of understanding and enhancing human reasoning and intelligent interaction. The editors have gathered together articles from active authors in this new area that explore dynamic logical aspects of norms, reasons, preferences and beliefs in human agency, human interaction and groups. The book pays a special tribute to Professor Rohit Parikh, a pioneer in this movement.
Simplicity theory is an extension of stability theory to a wider class of structures, containing, among others, the random graph, pseudo-finite fields, and fields with a generic automorphism. Following Kim's proof of the symmetry of forking, which implies a good behaviour of model-theoretic independence, this area of model theory has been a field of intense study. It has necessitated the development of some important new tools, most notably the model-theoretic treatment of hyperimaginaries (classes modulo type-definable equivalence relations). It thus provides a general notion of independence (and of rank in the supersimple case) applicable to a wide class of algebraic structures. The basic theory of forking independence is developed, and its properties in a simple structure are analyzed. No prior knowledge of stability theory is assumed; in fact, many stability-theoretic results either follow from more general propositions or are developed in side remarks. Audience: This book is intended both as an introduction to simplicity theory accessible to graduate students with some knowledge of model theory, and as a reference work for research in the field.
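For readers new to the terminology, the symmetry result referred to above can be stated as follows (standard formulation, assumed here rather than taken from the blurb): in a simple theory, for all sets A, B, C,
\[
\operatorname{tp}(A/BC)\ \text{does not fork over}\ C
\quad\Longleftrightarrow\quad
\operatorname{tp}(B/AC)\ \text{does not fork over}\ C,
\]
which is exactly the statement that the independence relation given by nonforking is symmetric.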
This monograph looks at causal nets from a philosophical point of view. The author shows that one can build a general philosophical theory of causation on the basis of the causal nets framework, and that this theory can be fruitfully used to shed new light on philosophical issues. Coverage includes both a theoretical and an application-oriented approach to the subject. The author first counters David Hume's challenge about whether causation is something ontologically real. The idea behind this is that good metaphysical concepts should behave analogously to good theoretical concepts in scientific theories. In the process, the author offers support for the theory of causal nets as indeed being a correct theory of causation. Next, the book offers an application-oriented approach to the subject. The author shows how causal nets can be used to investigate philosophical issues related to causation. He does this by means of two exemplary applications: the first consists of an evaluation of Jim Woodward's interventionist theory of causation; the second offers a contribution to the new mechanist debate. Introductory chapters outline all the formal basics required. This helps make the book useful for those who are not familiar with causal nets but are interested in causation or in tools for the investigation of philosophical issues related to causation.
Logic plays a central conceptual role in modern mathematics. However, mathematical logic has grown into one of the most recondite areas of mathematics. As a result, most of modern logic is inaccessible to all but the specialist. This new book is a resource that provides a quick introduction and review of the key topics in logic for the computer scientist, engineer, or mathematician. Handbook of Logic and Proof Techniques for Computer Science presents the elements of modern logic, including many current topics, to the reader having only basic mathematical literacy. Computer scientists will find specific examples and important ideas such as axiomatics, recursion theory, decidability, independence, completeness, consistency, model theory, and P/NP completeness. The book contains definitions, examples, and discussion of all of the key ideas in basic logic, but also makes a special effort to cut through the mathematical formalism, difficult notation, and esoteric terminology that is typical of modern mathematical logic. This handbook delivers cogent and self-contained introductions to critical advanced topics, including:
* Gödel's completeness and incompleteness theorems
* Methods of proof, cardinal and ordinal numbers, the continuum hypothesis, the axiom of choice, model theory, and number systems and their construction
* Extensive treatment of complexity theory and programming applications
* Applications to algorithms in Boolean algebra
* Discussion of set theory and applications of logic
The book is an excellent resource for the working mathematical scientist. The graduate student or professional in computer science and engineering, or the systems scientist who needs to have a quick sketch of a key idea from logic, will find it here in this self-contained, accessible, and easy-to-use reference.
The first edition of this book was published in 1977. The text has been well received and is still used, although it has been out of print for some time. In the intervening three decades, a lot of interesting things have happened to mathematical logic: (i) Model theory has shown that insights acquired in the study of formal languages could be used fruitfully in solving old problems of conventional mathematics. (ii) Mathematics has been and is moving with growing acceleration from the set-theoretic language of structures to the language and intuition of (higher) categories, leaving behind old concerns about infinities: a new view of foundations is now emerging. (iii) Computer science, a no-nonsense child of the abstract computability theory, has been creatively dealing with old challenges and providing new ones, such as the P/NP problem. Planning additional chapters for this second edition, I have decided to focus on model theory, the conspicuous absence of which in the first edition was noted in several reviews, and the theory of computation, including its categorical and quantum aspects. The whole Part IV: Model Theory, is new. I am very grateful to Boris I. Zilber, who kindly agreed to write it. It may be read directly after Chapter II. The contents of the first edition are basically reproduced here as Chapters I-VIII. Section IV.7, on the cardinality of the continuum, is completed by Section IV.7.3, discussing H. Woodin's discovery.
Reverse mathematics studies the complexity of proving mathematical theorems and solving mathematical problems. Typical questions include: Can we prove this result without first proving that one? Can a computer solve this problem? A highly active part of mathematical logic and computability theory, the subject offers beautiful results as well as significant foundational insights. This text provides a modern treatment of reverse mathematics that combines computability-theoretic reductions and proofs in formal arithmetic to measure the complexity of theorems and problems from all areas of mathematics. It includes detailed introductions to techniques from computable mathematics, Weihrauch-style analysis, and other parts of computability that have become integral to research in the field. Topics and features:
* Provides a complete introduction to reverse mathematics, including necessary background from computability theory, second order arithmetic, forcing, induction, and model construction
* Offers a comprehensive treatment of the reverse mathematics of combinatorics, including Ramsey's theorem, Hindman's theorem, and many other results
* Provides central results and methods from the past two decades, appearing in book form for the first time, including preservation techniques and applications of probabilistic arguments
* Includes a large number of exercises of varying levels of difficulty, supplementing each chapter
The text will be accessible to students with a standard first-year course in mathematical logic. It will also be a useful reference for researchers in reverse mathematics, computability theory, proof theory, and related areas. Damir D. Dzhafarov is an Associate Professor of Mathematics at the University of Connecticut, CT, USA. Carl Mummert is a Professor of Computer and Information Technology at Marshall University, WV, USA.
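For orientation (standard background in the subject, not quoted from the book), theorems are most often calibrated against the "big five" subsystems of second order arithmetic, listed here in increasing strength:
\[
\mathsf{RCA}_0 \subsetneq \mathsf{WKL}_0 \subsetneq \mathsf{ACA}_0 \subsetneq \mathsf{ATR}_0 \subsetneq \Pi^1_1\text{-}\mathsf{CA}_0,
\]
with a theorem's reverse-mathematical strength typically identified by the weakest of these systems that proves it.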
This contributed volume collects papers related to the Logic in Question workshop, which has taken place annually at Sorbonne University in Paris since 2011. Each year, the workshop brings together historians, philosophers, mathematicians, linguists, and computer scientists to explore questions related to the nature of logic and how it has developed over the years. As a result, chapter authors provide a thorough, interdisciplinary exploration of topics that have been studied in the workshop. Organized into three sections, the first part of the book focuses on historical questions related to logic, the second explores philosophical questions, and the third section is dedicated to mathematical discussions. Specific topics include:
* logic and analogy
* Chinese logic
* nineteenth century British logic (in particular Boole and Lewis Carroll)
* logical diagrams
* the place and value of logic in Louis Couturat's philosophical thinking
* contributions of logical analysis for mathematics education
* the exceptionality of logic
* the logical expressive power of natural languages
* the unification of mathematics via topos theory
Logic in Question will appeal to pure logicians, historians of logic, philosophers, linguists, and other researchers interested in the history of logic, making this volume a unique and valuable contribution to the field.
After the pioneering works by Robbins (1944, 1945) and Choquet (1955), the notion of a set-valued random variable (called a random closed set in the literature) was systematically introduced by Kendall (1974) and Matheron (1975). It is well known that the theory of set-valued random variables is a natural extension of that of general real-valued random variables or random vectors. However, owing to the topological structure of the space of closed sets and special features of set-theoretic operations (cf. Beer [27]), set-valued random variables have many special properties. This gives new meaning to classical probability theory. As a result of the development of this area over more than 30 years, the theory of set-valued random variables, with its many applications, has become one of the new and active branches of probability theory. In practice, too, we are often faced with random experiments whose outcomes are not numbers but are expressed in inexact linguistic terms.
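As a hedged sketch of the basic definition (Matheron's hit-or-miss setting, standard but not spelled out above), a random closed set on a probability space $(\Omega, \mathcal{A}, P)$ with values in the family $\mathcal{F}(E)$ of closed subsets of a locally compact space $E$ is a map $X : \Omega \to \mathcal{F}(E)$ such that
\[
\{\omega \in \Omega : X(\omega) \cap K \neq \emptyset\} \in \mathcal{A}
\quad\text{for every compact } K \subseteq E,
\]
i.e. the events "X hits K" are measurable; this requirement replaces the ordinary Borel measurability of real-valued random variables.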
This book highlights a number of recent research advances in the field of symplectic and contact geometry and topology, and related areas in low-dimensional topology. This field has experienced significant and exciting growth in the past few decades, and this volume provides an accessible introduction to many active research problems in the area. The papers were written with a broad audience in mind so as to reach a wide range of mathematicians at various levels. Aside from teaching readers about developing research areas, this book will inspire researchers to ask further questions to continue to advance the field. The volume contains both original results and survey articles, presenting the results of collaborative research on a wide range of topics. These projects began at the Research Collaboration Conference for Women in Symplectic and Contact Geometry and Topology (WiSCon) in July 2019 at ICERM, Brown University. Each group of authors included female and nonbinary mathematicians at different career levels in mathematics and with varying areas of expertise. This paved the way for new connections between mathematicians at all career levels, spanning multiple continents, and resulted in the new collaborations and directions that are featured in this work.
Proof theory and category theory were first drawn together by Lambek some 30 years ago but, until now, the most fundamental notions of category theory (as opposed to their embodiments in logic) have not been explained systematically in terms of proof theory. Here it is shown that these notions, in particular the notion of adjunction, can be formulated in such a way as to be characterised by composition elimination. Among the benefits of these composition-free formulations are syntactical and simple model-theoretical, geometrical decision procedures for the commuting of diagrams of arrows. Composition elimination, in the form of Gentzen's cut elimination, is carried over to categories, and techniques inspired by Gentzen are shown to work even better in a purely categorical context than in logic. An acquaintance with the basic ideas of general proof theory is relied on only for the sake of motivation, however, and the treatment of matters related to categories is also in general self-contained. Besides familiar topics, presented in a novel, simple way, the monograph also contains new results. It can be used as an introductory text in categorical proof theory.
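For reference, the standard unit-counit formulation of adjunction (assumed here, not quoted from the book): an adjunction $F \dashv G$ between functors $F : \mathcal{C} \to \mathcal{D}$ and $G : \mathcal{D} \to \mathcal{C}$ consists of natural transformations $\eta : 1_{\mathcal{C}} \Rightarrow GF$ and $\varepsilon : FG \Rightarrow 1_{\mathcal{D}}$ satisfying the triangle identities
\[
(\varepsilon F) \circ (F\eta) = 1_F,
\qquad
(G\varepsilon) \circ (\eta G) = 1_G ,
\]
and it is this purely equational presentation that lends itself to the composition-free, Gentzen-style analysis described above.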
This edited volume includes papers in the fields of fuzzy mathematical analysis and advances in computational mathematics. These fields can provide valuable solutions to complex problems; they have been applied in areas such as high-dimensional data analysis, medical diagnosis, computer vision, handwritten character recognition, pattern recognition, machine intelligence, weather forecasting, network optimization, VLSI design, etc. The volume covers ongoing research in fuzzy and computational mathematical analysis and brings forward its recent applications to important real-world problems in various fields. The book includes selected high-quality papers from the International Conference on Fuzzy Mathematical Analysis and Advances in Computational Mathematics (FMAACM 2020).
Arising from the 1996 Cape Town conference in honour of the mathematician Bernhard Banaschewski, this collection of 30 refereed papers represents current developments in category theory, topology, topos theory, universal algebra, model theory, and diverse ordered and algebraic structures. Banaschewski's influence is reflected here, particularly in the contributions to pointfree topology at the levels of nearness, uniformity, and asymmetry. The unifying theme of the volume is the application of categorical methods. The contributing authors are: D. Baboolar, P. Bankston, R. Betti, D. Bourn, P. Cherenack, D. Dikranjan/H.-P. Künzi, X. Dong/W. Tholen, M. Erné, T.H. Fay, T.H. Fay/S.V. Joubert, D.N. Georgiou/B.K. Papadopoulos, K.A. Hardie/K.H. Kamps/R.W. Kieboom, H. Herrlich/A. Pultr, K.M. Hofmann, S.S. Hong/Y.K. Kim, J. Isbell, R. Jayewardene/O. Wyler, P. Johnstone, R. Lowen/P. Wuyts, E. Lowen-Colebunders/C. Verbeeck, R. Nailana, J. Picado, T. Plewe, J. Reinhold, G. Richter, H. Röhrl, S.-H. Sun, Tozzi/V. Trnková, V. Valov/D. Vuma, and S. Veldsman. Audience: This volume will be of interest to mathematicians whose research involves category theory and its applications to topology, order, and algebra.
The approach to probability theory followed in this book (which differs radically from the usual one, based on a measure-theoretic framework) characterizes probability as a linear operator rather than as a measure, and is based on the concept of coherence, which can be framed in the most general view of conditional probability. It is a flexible and unifying tool suited for handling, e.g., partial probability assessments (not requiring that the set of all possible outcomes be endowed with a previously given algebraic structure, such as a Boolean algebra) and conditional independence, in a way that avoids all the inconsistencies related to logical dependence (so that a theory referring to graphical models more general than those usually considered in Bayesian networks can be derived). Moreover, it is possible to encompass other approaches to uncertain reasoning, such as fuzziness, possibility functions, and default reasoning.
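As an illustrative sketch in de Finetti's style (notation assumed here rather than taken from the text), a coherent prevision P on a family of bounded random quantities acts as a linear operator bounded by the range of its argument,
\[
P(aX + bY) = a\,P(X) + b\,P(Y),
\qquad
\inf X \le P(X) \le \sup X ,
\]
and coherence of a partial assessment means precisely that no finite system of bets placed at the assessed values yields a gain that is strictly negative in every possible outcome.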
In this two-volume compilation of articles, leading researchers reevaluate the success of Hilbert's axiomatic method, which not only laid the foundations for our understanding of modern mathematics, but also found applications in physics, computer science and elsewhere. The title takes its name from David Hilbert's seminal talk Axiomatisches Denken, given at a meeting of the Swiss Mathematical Society in Zurich in 1917. This marked the beginning of Hilbert's return to his foundational studies, which ultimately resulted in the establishment of proof theory as a new branch in the emerging field of mathematical logic. Hilbert also used the opportunity to bring Paul Bernays back to Goettingen as his main collaborator in foundational studies in the years to come. The contributions are addressed to mathematical and philosophical logicians, but also to philosophers of science as well as physicists and computer scientists with an interest in foundations. Chapter 8 is available open access under a Creative Commons Attribution 4.0 International License via link.springer.com.