Medical imaging is one of the most heavily funded areas of biomedical engineering research. The second edition of Pattern Recognition and Signal Analysis in Medical Imaging brings sharp focus to the development of integrated systems for use in the clinical sector, enabling both imaging and the automatic assessment of the resultant data. Since the first edition, there has been tremendous development of new, powerful technologies for detecting, storing, transmitting, analyzing, and displaying medical images. Computer-aided analytical techniques, coupled with a continuing need to derive more information from medical images, have led to a growing application of digital processing techniques in cancer detection as well as elsewhere in medicine. This book is an essential tool for students and professionals, compiling and explaining proven and cutting-edge methods in pattern recognition for medical imaging.
This monograph looks at causal nets from a philosophical point of view. The author shows that the causal nets framework supports a general philosophical theory of causation, one that can fruitfully shed new light on philosophical issues. Coverage includes both a theoretical and an application-oriented approach to the subject. The author first counters David Hume's challenge as to whether causation is something ontologically real. The idea behind this is that good metaphysical concepts should behave analogously to good theoretical concepts in scientific theories. In the process, the author offers support for the theory of causal nets as indeed being a correct theory of causation. Next, the book offers an application-oriented approach to the subject. The author shows that causal nets can be used to investigate philosophical issues related to causation, and does so by means of two exemplary applications. The first is an evaluation of Jim Woodward's interventionist theory of causation. The second is a contribution to the new-mechanist debate. Introductory chapters outline all the formal basics required, making the book useful for readers who are not familiar with causal nets but are interested in causation or in tools for investigating philosophical issues related to causation.
Michael Holzhauser discusses generalizations of well-known network flow and packing problems obtained through additional or modified side constraints. By exploiting the inherent connection between the two problem classes, the author investigates the complexity and approximability of several novel network flow and packing problems and presents combinatorial solution and approximation algorithms.
Formal languages are widely regarded as being above all mathematical objects and as producing a greater level of precision and technical complexity in logical investigations because of this. Yet defining formal languages exclusively in this way offers only a partial and limited explanation of the impact which their use (and the uses of formalisms more generally elsewhere) actually has. In this book, Catarina Dutilh Novaes adopts a much wider conception of formal languages so as to investigate more broadly what exactly is going on when theorists put these tools to use. She looks at the history and philosophy of formal languages and focuses on the cognitive impact of formal languages on human reasoning, drawing on their historical development, psychology, cognitive science and philosophy. Her wide-ranging study will be valuable for both students and researchers in philosophy, logic, psychology and cognitive and computer science.
Filling a gap in the literature, this book takes the reader to the frontiers of equivariant topology, the study of objects with specified symmetries. The discussion is motivated by reference to a list of instructive "toy" examples and calculations in what is a relatively unexplored field. The authors also provide a reading path for the first-time reader less interested in working through sophisticated machinery but still desiring a rigorous understanding of the main concepts. The subject's classical counterparts, ordinary homology and cohomology, dating back to the work of Henri Poincare in topology, are calculational and theoretical tools which are important in many parts of mathematics and theoretical physics, particularly in the study of manifolds. Similarly powerful tools have been lacking, however, in the context of equivariant topology. Aimed at advanced graduate students and researchers in algebraic topology and related fields, the book assumes knowledge of basic algebraic topology and group actions.
Karl Menger, one of the founders of dimension theory, was among the most original mathematicians and thinkers of the twentieth century. He was a member of the Vienna Circle and the founder of its mathematical equivalent, the Viennese Mathematical Colloquium. Both during his early years in Vienna and, after his emigration, in the United States, Karl Menger made significant contributions to a wide variety of mathematical fields, and greatly influenced many of his colleagues. These two volumes contain Menger's major mathematical papers, based on his own selection from his extensive writings. They deal with topics as diverse as topology, geometry, analysis and algebra, and also include material on economics, sociology, logic and philosophy. The Selecta Mathematica is a monument to the diversity and originality of Menger's ideas.
The book offers a comprehensive survey of intuitionistic fuzzy logics. By reporting on both the author's research and others' findings, it provides readers with a complete overview of the field and highlights key issues and open problems, thus suggesting new research directions. Starting with an introduction to the basic elements of intuitionistic fuzzy propositional calculus, it then provides a guide to the use of intuitionistic fuzzy operators and quantifiers, and lastly presents state-of-the-art applications of intuitionistic fuzzy sets. The book is a valuable reference resource for graduate students and researchers alike.
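The basic elements the blurb mentions can be made concrete. The following sketch uses the standard Atanassov-style representation of intuitionistic fuzzy values (a membership degree, a non-membership degree, and the implied hesitation margin) with the usual min/max operators; it is illustrative and not drawn from the book itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IFValue:
    mu: float  # degree of membership
    nu: float  # degree of non-membership

    def __post_init__(self):
        # An intuitionistic fuzzy value requires mu, nu in [0, 1] and mu + nu <= 1.
        assert 0.0 <= self.mu <= 1.0 and 0.0 <= self.nu <= 1.0
        assert self.mu + self.nu <= 1.0

    @property
    def hesitation(self) -> float:
        # The hesitation margin pi = 1 - mu - nu quantifies indeterminacy.
        return 1.0 - self.mu - self.nu

def ifs_and(a: IFValue, b: IFValue) -> IFValue:
    # Min/max conjunction: membership by min, non-membership by max.
    return IFValue(min(a.mu, b.mu), max(a.nu, b.nu))

def ifs_or(a: IFValue, b: IFValue) -> IFValue:
    # Min/max disjunction: membership by max, non-membership by min.
    return IFValue(max(a.mu, b.mu), min(a.nu, b.nu))

def ifs_not(a: IFValue) -> IFValue:
    # Negation swaps membership and non-membership.
    return IFValue(a.nu, a.mu)
```

Note that the min/max operators always preserve the constraint mu + nu <= 1, since the maximal non-membership and minimal membership come from values that each satisfy it.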
This book covers work written by leading scholars from different schools within the research area of paraconsistency. The authors critically investigate how contemporary paraconsistent logics can be used to better understand human reasoning in science and mathematics. Offering a variety of perspectives, they shed new light on the question of whether paraconsistent logics can function as the underlying logics of inconsistent but useful scientific and mathematical theories. The great variety of paraconsistent logics gives rise to various, interrelated questions: what desiderata should a paraconsistent logic satisfy? Is there any prospect of a universal approach to paraconsistent reasoning with axiomatic theories? To what extent is reasoning about sets structurally analogous to reasoning about truth? Furthermore, the authors consider paraconsistent logic's status as either a normative or descriptive discipline (or one which falls in between) and which inconsistent but non-trivial axiomatic theories are well understood by which types of paraconsistent approaches. This volume addresses such questions from different perspectives in order to (i) obtain a representative overview of the state of the art in the philosophical debate on paraconsistency, (ii) come up with fresh ideas for the future of paraconsistency, and most importantly (iii) provide paraconsistent logic with a stronger philosophical foundation, taking into account the developments within the different schools of paraconsistency.
In this book the authors present new results on interpolation for nonmonotonic logics, abstract (function) independence, the Talmudic Kal Vachomer rule, and an equational solution of contrary-to-duty obligations. The chapter on formal construction is the conceptual core of the book, where the authors combine the ideas of several types of nonmonotonic logics and their analysis of 'natural' concepts into a formal logic, a special preferential construction that combines formal clarity with the intuitive advantages of Reiter defaults, defeasible inheritance, theory revision, and epistemic considerations. It is suitable for researchers in the area of computer science and mathematical logic.
Spline Regression Models shows the nuts and bolts of using dummy variables to formulate and estimate various spline regression models. For some researchers this will involve situations where the number and location of the spline knots are known in advance, while others will need to determine the number and location of spline knots as part of the estimation process. Through a number of straightforward examples, the authors show readers how to work with both types of spline-knot situation and offer practical, down-to-earth guidance on estimating splines.
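The known-knot case the blurb describes can be sketched in a few lines. The example below fits a piecewise-linear spline with one knot at a known location using the truncated-power formulation, in which the truncated term acts like an interacted dummy variable switching on an extra slope past the knot; the data, knot location, and coefficients are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
knot = 5.0  # knot location, assumed known in advance
# Simulated data: slope 1 before the knot, slope 3 after, plus noise.
y = 2.0 + 1.0 * x + 2.0 * np.maximum(x - knot, 0.0) + rng.normal(0.0, 0.1, x.size)

# Design matrix: intercept, x, and the truncated term (x - knot)_+ .
# The last column is zero before the knot and (x - knot) after it, so its
# coefficient measures the CHANGE in slope at the knot.
X = np.column_stack([np.ones_like(x), x, np.maximum(x - knot, 0.0)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta)  # roughly [2.0, 1.0, 2.0]: intercept, base slope, slope change
```

Because the truncated column is zero up to the knot, the fitted function is automatically continuous there; determining an unknown knot location would require searching over candidate knots, which is the second situation the book covers.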
This is the first comprehensive treatment of subjective logic and all its operations. The author developed the approach, and in this book he first explains subjective opinions, opinion representation, and decision-making under vagueness and uncertainty, and he then offers a full definition of subjective logic, harmonising the key notations and formalisms, concluding with chapters on trust networks and subjective Bayesian networks, which when combined form general subjective networks. The author shows how real-world situations can be realistically modelled with regard to how situations are perceived, with conclusions that more correctly reflect the ignorance and uncertainties that result from partially uncertain input arguments. The book will help researchers and practitioners to advance, improve and apply subjective logic to build powerful artificial reasoning models and tools for solving real-world problems. A good grounding in discrete mathematics is a prerequisite.
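The subjective opinions mentioned above have a simple binomial form that can be sketched directly. The representation below, a (belief, disbelief, uncertainty, base-rate) tuple with projected probability b + a*u, is the standard one in the subjective-logic literature; the class itself is illustrative and not the book's own code.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Opinion:
    belief: float       # b: evidence supporting the proposition
    disbelief: float    # d: evidence against the proposition
    uncertainty: float  # u: lack of evidence either way
    base_rate: float    # a: prior probability absent any evidence

    def __post_init__(self):
        # A binomial opinion requires b + d + u = 1.
        assert abs(self.belief + self.disbelief + self.uncertainty - 1.0) < 1e-9
        assert 0.0 <= self.base_rate <= 1.0

    def projected_probability(self) -> float:
        # P = b + a * u: the uncertainty mass is apportioned by the base rate.
        return self.belief + self.base_rate * self.uncertainty

# An opinion with substantial uncertainty still projects to a point probability:
w = Opinion(belief=0.4, disbelief=0.2, uncertainty=0.4, base_rate=0.5)
```

The point is the one the blurb makes: conclusions carry their residual ignorance explicitly (the uncertainty mass), instead of collapsing it away as a plain probability would.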
Action theory is the object of growing attention in a variety of scientific disciplines and this is the first volume to offer a synthetic view of the range of approaches possible in the topic. The volume focuses on the nexus of formal action theory with a startlingly diverse set of subjects, which range from logic, linguistics, artificial intelligence and automata theory to jurisprudence, deontology and economics. It covers semantic, mathematical and logical aspects of action, showing how the problem of action breaks the boundaries of traditional branches of logic located in syntactics and semantics and now lies on the borderline between logical pragmatics and praxeology. The chapters here focus on specialized tasks in formal action theory, beginning with a thorough description and formalization of the language of action and moving through material on the differing models of action theory to focus on probabilistic models, the relations of formal action theory to deontic logic and its key applications in algorithmic and programming theory. The coverage thus fills a notable lacuna in the literary corpus and offers solid formal underpinning in cognitive science by approaching the problem of cognition as a composite action of mind.
Starting with a simple formulation accessible to all mathematicians, this second edition is designed to provide a thorough introduction to nonstandard analysis. Nonstandard analysis is now a well-developed, powerful instrument for solving open problems in almost all disciplines of mathematics; it is often used as a 'secret weapon' by those who know the technique. This book illuminates the subject with some of the most striking applications in analysis, topology, functional analysis, probability and stochastic analysis, as well as applications in economics and combinatorial number theory. The first chapter is designed to facilitate the beginner in learning this technique by starting with calculus and basic real analysis. The second chapter provides the reader with the most important tools of nonstandard analysis: the transfer principle, Keisler's internal definition principle, the spill-over principle, and saturation. The remaining chapters of the book study different fields for applications; each begins with a gentle introduction before then exploring solutions to open problems. All chapters within this second edition have been reworked and updated, with several completely new chapters on compactifications and number theory. Nonstandard Analysis for the Working Mathematician will be accessible to both experts and non-experts, and will ultimately provide many new and helpful insights into the enterprise of mathematics.
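The calculus revisited in the first chapter turns on the transfer principle and the standard-part map mentioned above. A standard textbook illustration (not quoted from this book) is the nonstandard definition of the derivative:

```latex
% For a nonzero infinitesimal \varepsilon, the derivative is the standard
% part of the difference quotient:
\[
  f'(x) \;=\; \operatorname{st}\!\left(
    \frac{f(x+\varepsilon) - f(x)}{\varepsilon}
  \right),
  \qquad \varepsilon \approx 0,\ \varepsilon \neq 0 .
\]
% Example: for f(x) = x^2 the quotient is
% ((x+\varepsilon)^2 - x^2)/\varepsilon = 2x + \varepsilon,
% whose standard part is 2x.
\]
```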
This volume showcases the best of recent research in the philosophy of science. A compilation of papers presented at the EPSA 13, it explores a broad distribution of topics such as causation, truthlikeness, scientific representation, gender-specific medicine, laws of nature, science funding and the wisdom of crowds. Papers are organised into headings which form the structure of the book. Readers will find that it covers several major fields within the philosophy of science, from general philosophy of science to the more specific philosophy of physics, philosophy of chemistry, philosophy of the life sciences, philosophy of psychology, and philosophy of the social sciences and humanities, amongst others. This volume provides an excellent overview of the state of the art in the philosophy of science, as practiced in different European countries and beyond. It will appeal to researchers with an interest in the philosophical underpinnings of their own discipline, and to philosophers who wish to explore the latest work on the themes explored.
Few mathematical results capture the imagination like Georg Cantor's groundbreaking work on infinity in the late nineteenth century. This opened the door to an intricate axiomatic theory of sets which was born in the decades that followed. Written for the motivated novice, this book provides an overview of key ideas in set theory, bridging the gap between technical accounts of mathematical foundations and popular accounts of logic. Readers will learn of the formal construction of the classical number systems, from the natural numbers to the real numbers and beyond, and see how set theory has evolved to analyse such deep questions as the status of the continuum hypothesis and the axiom of choice. Remarks and digressions introduce the reader to some of the philosophical aspects of the subject and to adjacent mathematical topics. The rich, annotated bibliography encourages the dedicated reader to delve into what is now a vast literature.
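The formal construction of the number systems that the book walks through begins with the natural numbers, standardly built as von Neumann ordinals (a classical construction, sketched here for orientation rather than quoted from the book):

```latex
\[
  0 = \varnothing, \qquad n + 1 = n \cup \{n\},
\]
% so that, for example,
\[
  1 = \{\varnothing\}, \qquad
  2 = \{\varnothing, \{\varnothing\}\}, \qquad
  n = \{0, 1, \dots, n-1\}.
\]
```

From here the integers, rationals, and reals are obtained as further set-theoretic quotients and completions, which is the bridge the book builds from naive counting to the axiomatic theory.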
This book explores the limits of our knowledge. The author shows how uncertainty and indefiniteness not only define the borders confining our understanding, but how they feed into the process of discovery and help to push back these borders. Starting with physics, the author collects examples from economics, neurophysiology, history, ecology and philosophy. The first part shows how information helps to reduce indefiniteness. Understanding rests on our ability to find the right context, in which we localize a problem as a point in a network of connections. New elements must be combined with the old parts of the existing complex knowledge system, in order to profit maximally from the information. An attempt is made to quantify the value of information by its ability to reduce indefiniteness. The second part explains how to handle indefiniteness with methods from fuzzy logic, decision theory, hermeneutics and semiotics. It is not sufficient that the new element appears in an experiment; one also has to find a theoretical reason for its existence. Indefiniteness becomes an engine of science, which gives rise to new ideas.
This volume covers a wide range of topics that fall under the 'philosophy of quantifiers', a philosophy that spans across multiple areas such as logic, metaphysics, epistemology and even the history of philosophy. It discusses the import of quantifier variance in the model theory of mathematics. It advances an argument for the uniqueness of quantifier meaning in terms of Evert Beth’s notion of implicit definition and clarifies the oldest explicit formulation of quantifier variance: the one proposed by Rudolf Carnap. The volume further examines what it means that a quantifier can have multiple meanings and addresses how existential vagueness can induce vagueness in our modal notions. Finally, the book explores the role played by quantifiers with respect to various kinds of semantic paradoxes, the logicality issue, ontological commitment, and the behavior of quantifiers in intensional contexts.
First published in 1908 as the second edition of a 1900 original, this book explains the content of the fifth and sixth books of Euclid's Elements, which are primarily concerned with ratio and magnitudes. Hill furnishes the text with copious diagrams to illustrate key points of Euclidian reasoning. This book will be of value to anyone with an interest in the history of education.
This monograph on the homotopy theory of topologized diagrams of spaces and spectra gives an expert account of a subject at the foundation of motivic homotopy theory and the theory of topological modular forms in stable homotopy theory. Beginning with an introduction to the homotopy theory of simplicial sets and topos theory, the book covers core topics such as the unstable homotopy theory of simplicial presheaves and sheaves, localized theories, cocycles, descent theory, non-abelian cohomology, stacks, and local stable homotopy theory. A detailed treatment of the formalism of the subject is interwoven with explanations of the motivation, development, and nuances of ideas and results. The coherence of the abstract theory is elucidated through the use of widely applicable tools, such as Barr's theorem on Boolean localization, model structures on the category of simplicial presheaves on a site, and cocycle categories. A wealth of concrete examples convey the vitality and importance of the subject in topology, number theory, algebraic geometry, and algebraic K-theory. Assuming basic knowledge of algebraic geometry and homotopy theory, Local Homotopy Theory will appeal to researchers and advanced graduate students seeking to understand and advance the applications of homotopy theory in multiple areas of mathematics and the mathematical sciences.
This book consolidates and extends the authors' work on the connection between iconicity and abductive inference. It emphasizes a pragmatic, experimental and fallibilist view of knowledge without sacrificing formal rigor. Within this context, the book focuses particularly on scientific knowledge and its prevalent use of mathematics. To find an answer to the question "What kind of experimental activity is the scientific employment of mathematics?" the book addresses the problems involved in formalizing abductive cognition. For this, it implements the concept and method of iconicity, modeling this theoretical framework mathematically through category theory and topoi. Peirce's concept of iconic signs is treated in depth, and it is shown how Peirce's diagrammatic logical notation of Existential Graphs makes use of iconicity and how important features of this iconicity are representable within category theory. Alain Badiou's set-theoretical model of truth procedures and his relational sheaf-based theory of phenomenology are then integrated within the Peircean logical context. Finally, the book opens the path towards a more naturalist interpretation of the abductive models developed in Peirce and Badiou through an analysis of several recent attempts to reformulate quantum mechanics with categorical methods. Overall, the book offers a comprehensive and rigorous overview of past approaches to iconic semiotics and abduction, and it encompasses new extensions of these methods towards an innovative naturalist interpretation of abductive reasoning.
This book illustrates the program of Logical-Informational Dynamics. Rational agents exploit the information available in the world in delicate ways, adopt a wide range of epistemic attitudes, and in that process, constantly change the world itself. Logical-Informational Dynamics is about logical systems putting such activities at center stage, focusing on the events by which we acquire information and change attitudes. Its contributions show many current logics of information and change at work, often in multi-agent settings where social behavior is essential, and often stressing Johan van Benthem's pioneering work in establishing this program. However, this is not a Festschrift, but a rich tapestry for a field with a wealth of strands of its own. The reader will see the state of the art in such topics as information update, belief change, preference, learning over time, and strategic interaction in games. Moreover, no tight boundary has been enforced, and some chapters add more general mathematical or philosophical foundations or links to current trends in computer science. The theme of this book lies at the interface of many disciplines. Logic is the main methodology, but the various chapters cross easily between mathematics, computer science, philosophy, linguistics, cognitive and social sciences, while also ranging from pure theory to empirical work. Accordingly, the authors of this book represent a wide variety of original thinkers from different research communities. And their interconnected themes challenge at the same time how we think of logic, philosophy and computation. Thus, very much in line with van Benthem's work over many decades, the volume shows how all these disciplines form a natural unity in the perspective of dynamic logicians (broadly conceived) exploring their new themes today.
And at the same time, in doing so, it offers a broader conception of logic with a certain grandeur, moving its horizons beyond the traditional study of consequence relations.
In this new text, Steven Givant—the author of several acclaimed books, including works co-authored with Paul Halmos and Alfred Tarski—develops three theories of duality for Boolean algebras with operators. Givant addresses the two most recognized dualities (one algebraic and the other topological) and introduces a third duality, best understood as a hybrid of the first two. This text will be of interest to graduate students and researchers in the fields of mathematics, computer science, logic, and philosophy who are interested in exploring special or general classes of Boolean algebras with operators. Readers should be familiar with the basic arithmetic and theory of Boolean algebras, as well as the fundamentals of point-set topology.
This volume is dedicated to Prof. Dag Prawitz and his outstanding contributions to philosophical and mathematical logic. Prawitz's eminent contributions to structural proof theory, or general proof theory, as he calls it, and inference-based meaning theories have been extremely influential in the development of modern proof theory and anti-realistic semantics. In particular, alongside Gerhard Gentzen, who defined natural deduction in his PhD thesis published in 1934, Prawitz is the principal author on natural deduction. The book opens with an introductory paper that surveys Prawitz's numerous contributions to proof theory and proof-theoretic semantics and puts his work into a somewhat broader perspective, both historically and systematically. Chapters include either in-depth studies of certain aspects of Dag Prawitz's work or address open research problems that are concerned with core issues in structural proof theory and range from philosophical essays to papers of a mathematical nature. Investigations into the necessity of thought and the theory of grounds and computational justifications as well as an examination of Prawitz's conception of the validity of inferences in the light of three "dogmas of proof-theoretic semantics" are included. More formal papers deal with the constructive behaviour of fragments of classical logic and fragments of the modal logic S4 among other topics. In addition, there are chapters about inversion principles, normalization of proofs, and the notion of proof-theoretic harmony and other areas of a more mathematical persuasion. Dag Prawitz also writes a chapter in which he explains his current views on the epistemic dimension of proofs and addresses the question why some inferences succeed in conferring evidence on their conclusions when applied to premises for which one already possesses evidence.
This monograph details the use of Siegel's method and classical results on the homotopy groups of spheres and Lie groups to determine some Gottlieb groups of projective spaces, or to give lower bounds for their orders. Making use of the properties of Whitehead products, the authors also determine some Whitehead center groups of projective spaces; these results are new to this monograph.
This volume is the first systematic and thorough attempt to investigate the relevance and possible applications of mereology to contemporary science. It gathers contributions from leading scholars in the field and covers a wide range of scientific theories and practices such as physics, mathematics, chemistry, biology, computer science and engineering. Throughout the volume, a variety of foundational issues are investigated both from the formal and the empirical point of view. The first section looks at the topic as it applies to physics. The section addresses questions of persistence and composition within quantum and relativistic physics and concludes by scrutinizing the possibility of capturing continuity of motion, as described by our best physical theories, within gunky spacetimes. The second part tackles mathematics and shows how to provide a foundation for point-free geometry of space by switching to fuzzy logic. The relation between mereological sums and set-theoretic suprema is investigated and issues about different mereological perspectives, such as classical and natural mereology, are thoroughly discussed. The third section in the volume looks at natural science. Several questions from biology, medicine and chemistry are investigated. From the perspective of biology, there is an attempt to provide axioms for inferring statements about parthood between two biological entities from statements about their spatial relation. From the perspective of chemistry, it is argued that classical mereological frameworks are not adequate to capture the practices of chemistry in that they consider neither temporal nor modal parameters. The final part introduces computer science and engineering. A new formal mereological framework, in which an indeterminate relation of parthood is taken as a primitive notion, is constructed and then applied to a wide variety of disciplines from robotics to knowledge engineering.
A formal framework for discrete mereotopology and its applications is developed and finally, the importance of mereology for the relatively new science of domain engineering is also discussed.