The book is devoted to various constructions of sets which are nonmeasurable with respect to invariant (more generally, quasi-invariant) measures. Our starting point is the classical Vitali theorem stating the existence of subsets of the real line which are not measurable in the Lebesgue sense. This theorem stimulated the development of several interesting topics in mathematics.
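For readers who want the statement behind this blurb, here is a standard sketch of the Vitali construction (a textbook argument, not quoted from the book itself):

% Standard Vitali construction (sketch; not taken from the book's text).
% On [0,1], declare x ~ y iff x - y is rational, and use the Axiom of Choice
% to pick one representative from each equivalence class, forming V.
\[
  x \sim y \iff x - y \in \mathbb{Q}, \qquad
  V \subset [0,1] \ \text{a set of representatives of } [0,1]/{\sim}.
\]
% For an enumeration (q_n) of the rationals in [-1,1], the translates V + q_n
% are pairwise disjoint and satisfy
\[
  [0,1] \;\subseteq\; \bigcup_{n\in\mathbb{N}} (V + q_n) \;\subseteq\; [-1,2].
\]
% If V were Lebesgue measurable with measure m, translation invariance and
% countable additivity would force 1 <= sum_n m <= 3, which is impossible
% whether m = 0 or m > 0; hence V is not Lebesgue measurable.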
Over the last decade and particularly in recent years, the macroscopic porous media theory has made decisive progress concerning the fundamentals of the theory and the development of mathematical models in various fields of engineering and biomechanics. This progress has attracted some attention, and therefore conferences devoted almost exclusively to the macroscopic porous media theory have been organized in order to collect all findings, to present new results, and to discuss new trends. Many important contributions have also been published in national and international journals, which have brought the porous media theory, in some parts, to a close. Therefore, the time seems to be ripe to review the state of the art and to show new trends in the continuum mechanical treatment of saturated and unsaturated capillary and non-capillary porous solids. This book addresses postgraduate students and scientists working in engineering, physics, and mathematics. It provides an outline of the modern theory of porous media and shows some trends in theory and in applications.
We see numbers on automobile license plates, addresses, weather reports, and, of course, on our smartphones. Yet we look at these numbers for their role as descriptors, not as entities in and of themselves. Each number has its own history of meaning, usage, and connotation in the larger world. The Secret Lives of Numbers takes readers on a journey through integers, considering their numerological assignments as well as their significance beyond mathematics and in the realm of popular culture. Of course we all know that the number 13 carries a certain value of unluckiness with it. The phobia of the number is called triskaidekaphobia; Franklin Delano Roosevelt was known to invite and disinvite guests to parties to avoid having 13 people in attendance; high-rise buildings often skip the 13th floor out of superstition. There are many explanations as to how the number 13 received this negative honor, but from a mathematical point of view, the number 13 is also the smallest prime whose digit reversal (31) is a different prime. It is honored with a place among the Fibonacci numbers and integral Pythagorean triples, as well as many other interesting and lesser-known occurrences. In The Secret Lives of Numbers, popular mathematician Alfred S. Posamentier provides short and engaging mini-biographies of more than 100 numbers, starting with 1 and featuring some especially interesting numbers (like 6,174, a number with most unusual properties) to provide readers with a more comprehensive picture of the lives of numbers both mathematically and socially.
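The reversed-digit property mentioned above is easy to check mechanically. The following Python sketch is our own illustration (function names are not from the book); it verifies that 13 is the smallest prime whose reversal is a different prime:

# Verify the claim about 13: it is prime, and so is its digit reversal, 31.
# Illustrative sketch only; helpers below are ours, not the book's.

def is_prime(n: int) -> bool:
    """Trial-division primality test, sufficient for small integers."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def reverse_digits(n: int) -> int:
    """Return the integer obtained by reversing n's decimal digits."""
    return int(str(n)[::-1])

if __name__ == "__main__":
    print(is_prime(13), is_prime(reverse_digits(13)))  # True True: 13 and 31 are both prime
    # Smallest prime whose reversal is a *different* prime:
    p = 2
    while not (is_prime(p) and is_prime(reverse_digits(p)) and reverse_digits(p) != p):
        p += 1
    print(p)  # 13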
The present volume of the Handbook of the History of Logic brings together two of the most important developments in 20th century non-classical logic: many-valuedness and non-monotonicity. On the first approach, in deference to vagueness, temporal or quantum indeterminacy or reference-failure, sentences that are classically non-bivalent are allowed as inputs and outputs to consequence relations. Many-valued, dialetheic, fuzzy and quantum logics are, among other things, principled attempts to regulate the flow-through of sentences that are neither true nor false. On the second, or non-monotonic, approach, constraints are placed on inputs (and sometimes on outputs) of a classical consequence relation, with a view to producing a notion of consequence that serves in a more realistic way the requirements of real-life inference.
Providing the first comprehensive treatment of the subject, this groundbreaking work is solidly founded on a decade of concentrated research, some of which is published here for the first time, as well as practical, "hands-on" classroom experience. The clarity of presentation and abundance of examples and exercises make it suitable as a graduate level text in mathematics, decision making, artificial intelligence, and engineering courses.
From the very beginning of their investigation of human reasoning, philosophers have identified two other forms of reasoning, besides deduction, which we now call abduction and induction. Deduction is now fairly well understood, but abduction and induction have eluded a similar level of understanding. The papers collected here address the relationship between abduction and induction and their possible integration. The approach is sometimes philosophical, sometimes that of pure logic, and some papers adopt the more task-oriented approach of AI. The book will command the attention of philosophers, logicians, AI researchers and computer scientists in general.
Time is a fascinating subject that has long captured mankind's imagination, from the ancients to modern man, adult and child alike. It has been studied across a wide range of disciplines, from the natural sciences to philosophy and logic. Today, more than thirty years since Prior's work laying out the foundations of temporal logic, and two decades on from Pnueli's seminal work applying temporal logic to the specification and verification of computer programs, temporal logic has a strong and thriving international research community within the broad disciplines of computer science and artificial intelligence. Areas of activity include, but are certainly not restricted to: Pure Temporal Logic, e.g. temporal systems, proof theory, model theory, expressiveness and complexity issues, algebraic properties, application of game theory; Specification and Verification, e.g. of reactive systems, of real-time components, of user interaction, of hardware systems, techniques and tools for verification, execution and prototyping methods; Temporal Databases, e.g. temporal representation, temporal querying, granularity of time, update mechanisms, active temporal databases, hypothetical reasoning; Temporal Aspects in AI, e.g. modelling temporal phenomena, interval temporal calculi, temporal nonmonotonicity, interaction of temporal reasoning with action/knowledge/belief logics, temporal planning; Tense and Aspect in Natural Language, e.g. models, ontologies, temporal quantifiers, connectives, prepositions, processing temporal statements; Temporal Theorem Proving, e.g. translation methods, clausal and non-clausal resolution, tableaux, automata-theoretic approaches, tools and practical systems.
Written by one of the subject's foremost experts, this book focuses on the central developments and modern methods of the advanced theory of abelian groups, while remaining accessible, as an introduction and reference, to the non-specialist. It provides a coherent source for results scattered throughout the research literature, with many new proofs. The presentation highlights major trends that have radically changed the modern character of the subject, in particular, the use of homological methods in the structure theory of various classes of abelian groups, and the use of advanced set-theoretical methods in the study of undecidability problems. The treatment of the latter trend includes Shelah's seminal work on the undecidability in ZFC of Whitehead's Problem, while the treatment of the former trend includes an extensive (but non-exhaustive) study of p-groups, torsion-free groups, mixed groups and important classes of groups arising from ring theory. To prepare the reader to tackle these topics, the book reviews the fundamentals of abelian group theory and provides some background material from category theory, set theory, topology and homological algebra. An abundance of exercises is included to test the reader's comprehension, and to explore noteworthy extensions and related sidelines of the main topics. A list of open problems and questions in each chapter invites the reader to take an active part in the subject's further development.
This is a monograph about logic. Specifically, it presents the mathematical theory of the logic of bunched implications, BI: I consider BI's proof theory, model theory and computation theory. However, the monograph is also about informatics in a sense which I explain. Specifically, it is about mathematical models of resources and logics for reasoning about resources. I begin with an introduction which presents my (background) view of logic from the point of view of informatics, paying particular attention to three logical topics which have arisen from the development of logic within informatics: resources as a basis for semantics; proof-search as a basis for reasoning; and the theory of representation of object-logics in a meta-logic. The ensuing development represents a logical theory which draws upon the mathematical, philosophical and computational aspects of logic. Part I presents the logical theory of propositional BI, together with a computational interpretation. Part II presents a corresponding development for predicate BI. In both parts, I develop proof-, model- and type-theoretic analyses. I also provide semantically-motivated computational perspectives, so beginning a mathematical theory of resources. I have not included any analysis, beyond conjecture, of properties such as decidability, finite models, games or complexity. I prefer to leave these matters to other occasions, perhaps in broader contexts.
I am very happy to have this opportunity to introduce Luca Vigano's book on Labelled Non-Classical Logics. I put forward the methodology of labelled deductive systems to the participants of Logic Colloquium '90 (Labelled Deductive Systems, a Position Paper, in J. Oikkonen and J. Vaananen, editors, Logic Colloquium '90, Volume 2 of Lecture Notes in Logic, pages 66-68, Springer, Berlin, 1993), in an attempt to establish labelling as a recognised and significant component of our logic culture. It was a response to earlier isolated uses of labels by various distinguished authors, as a means to achieve local proof-theoretic goals. Labelling was used in many different areas such as resource labelling in relevance logics, prefix tableaux in modal logics, annotated logic programs in logic programming, proof tracing in truth maintenance systems, and various side annotations in higher-order proof theory, arithmetic and analysis. This widespread local use of labels was an indication of an underlying logical pattern, namely the simultaneous side-by-side manipulation of several kinds of logical information. It was clear that there was a need to establish the labelled deductive systems methodology. Modal logic is one major area where labelling can be developed quickly and systematically with a view to demonstrating its power and significant advantage. In modal logic the labels can play a double role.
Modal logics, originally conceived in philosophy, have recently found many applications in computer science, artificial intelligence, the foundations of mathematics, linguistics and other disciplines. Celebrated for their good computational behaviour, modal logics are used as effective formalisms for talking about time, space, knowledge, beliefs, actions, obligations, provability, etc. However, the nice computational properties can drastically change if we combine some of these formalisms into a many-dimensional system, say, to reason about knowledge bases developing in time or moving objects.
This book is about stochastic Petri nets (SPNs), which have proven to be a popular tool for modelling and performance analysis of complex discrete-event stochastic systems. The focus is on methods for modelling a system as an SPN with general firing times and for studying the long-run behavior of the resulting SPN model using computer simulation. Modelling techniques are illustrated in the context of computer, manufacturing, telecommunication, workflow, and transportation systems. The simulation discussion centers on the theory that underlies estimation procedures such as the regenerative method, the method of batch means, and spectral methods. Tying these topics together are conditions on the building blocks of an SPN under which the net is stable over time and specified estimation procedures are valid. In addition, the book develops techniques for comparing the modelling power of different discrete-event formalisms. These techniques provide a means for making principled choices between alternative modelling frameworks and also can be used to extend stability results and limit theorems from one framework to another. As an overview of fundamental modelling, stability, convergence, and estimation issues for discrete-event systems, this book will be of interest to researchers and graduate students in Applied Mathematics, Operations Research, Applied Probability, and Statistics. This book also will be of interest to practitioners of Industrial, Computer, Transportation, and Electrical Engineering, because it provides an introduction to a powerful set of tools both for modelling and for simulation-based performance analysis. Peter J. Haas is a member of the Research Staff at the IBM Almaden Research Center in San Jose, California. He also teaches Computer Simulation at Stanford University and is an Associate Editor (Simulation Area) for Operations Research.
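The method of batch means named in the blurb can be illustrated in a few lines. The Python sketch below is a generic illustration under simple assumptions (equal-sized batches, a normal-approximation interval), not the SPN-specific estimation procedures developed in the book:

# Minimal batch-means sketch: estimate the long-run mean of a stationary output
# sequence and an approximate 95% half-width from batch averages.
import math
import random

def batch_means(data, num_batches=10):
    """Split `data` into equal batches; return (point estimate, ~95% half-width)."""
    batch_size = len(data) // num_batches
    means = [
        sum(data[i * batch_size:(i + 1) * batch_size]) / batch_size
        for i in range(num_batches)
    ]
    grand_mean = sum(means) / num_batches
    var = sum((m - grand_mean) ** 2 for m in means) / (num_batches - 1)
    half_width = 1.96 * math.sqrt(var / num_batches)  # normal approximation
    return grand_mean, half_width

if __name__ == "__main__":
    # Toy autocorrelated sequence standing in for simulation output (long-run mean 0).
    random.seed(0)
    x, data = 0.0, []
    for _ in range(10_000):
        x = 0.8 * x + random.gauss(0.0, 1.0)
        data.append(x)
    print(batch_means(data))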
Take the mystery out of basic math with the latest edition of BarCharts' best-selling Math Review QuickStudy® guide. With updated content and an additional panel of information, Math Review includes hard-to-remember formulas and properties, along with numerous examples and illustrations to improve understanding. This comprehensive math guide will assist you way beyond your high school and college years.
"Mathematics in Kant's Critical Philosophy" provides a much needed reading (and re-reading) of Kant's theory of the construction of mathematical concepts through a fully contextualized analysis. In this work Lisa Shabel convincingly argues that it is only through an understanding of the relevant eighteenth century mathematics textbooks, and the related mathematical practice, can the material and context necessary for a successful interpretation of Kant's philosophy be provided. This is borne out through sustained readings of Euclid and Woolf in particular, which, when brought together with Kant's work, allows for the elucidation of several key issues and the reinterpretation of many hitherto opaque and long debated passages.
This book covers work written by leading scholars from different schools within the research area of paraconsistency. The authors critically investigate how contemporary paraconsistent logics can be used to better understand human reasoning in science and mathematics. Offering a variety of perspectives, they shed new light on the question of whether paraconsistent logics can function as the underlying logics of inconsistent but useful scientific and mathematical theories. The great variety of paraconsistent logics gives rise to various interrelated questions, such as: what desiderata should a paraconsistent logic satisfy; is there any prospect of a universal approach to paraconsistent reasoning with axiomatic theories; and to what extent is reasoning about sets structurally analogous to reasoning about truth? Furthermore, the authors consider paraconsistent logic's status as either a normative or a descriptive discipline (or one which falls in between), and ask which inconsistent but non-trivial axiomatic theories are well understood by which types of paraconsistent approaches. This volume addresses such questions from different perspectives in order to (i) obtain a representative overview of the state of the art in the philosophical debate on paraconsistency, (ii) come up with fresh ideas for the future of paraconsistency, and most importantly (iii) provide paraconsistent logic with a stronger philosophical foundation, taking into account the developments within the different schools of paraconsistency.
Most papers published in this volume are based on lectures presented at the Chico Conference on Semigroups held on the Chico campus of the California State University on April 10-12, 1986. The conference was sponsored by the California State University, Chico in cooperation with the Engineering Computer Sciences Department of the Pacific Gas and Electric Company. The program included seven 50-minute addresses and seventeen 30-minute lectures. Speakers were invited by the organizing committee consisting of S. M. Goberstein and P. M. Higgins. The purpose of the conference was to bring together some of the leading researchers in the area of semigroup theory for a discussion of major recent developments in the field. The algebraic theory of semigroups is growing so rapidly and new important results are being produced at such a rate that the need for another meeting was well justified. It was hoped that the conference would help to disseminate new results more rapidly among those working in semigroups and related areas and that the exchange of ideas would stimulate research in the subject even further. These hopes were realized beyond all expectations.
The book "Foundational Theories of Classical and Constructive Mathematics" is a book on the classical topic of foundations of mathematics. Its originality resides mainly in its treating at the same time foundations of classical and foundations of constructive mathematics. This confrontation of two kinds of foundations contributes to answering questions such as: Are foundations/foundational theories of classical mathematics of a different nature compared to those of constructive mathematics? Do they play the same role for the resp. mathematics? Are there connections between the two kinds of foundational theories? etc. The confrontation and comparison is often implicit and sometimes explicit. Its great advantage is to extend the traditional discussion of the foundations of mathematics and to render it at the same time more subtle and more differentiated. Another important aspect of the book is that some of its contributions are of a more philosophical, others of a more technical nature. This double face is emphasized, since foundations of mathematics is an eminent topic in the philosophy of mathematics: hence both sides of this discipline ought to be and are being paid due to.
Exploring new variations of classical methods as well as recent approaches appearing in the field, Computational Fluid Dynamics demonstrates the extensive use of numerical techniques and mathematical models in fluid mechanics. It presents various numerical methods, including finite volume, finite difference, finite element, spectral, smoothed particle hydrodynamics (SPH), mixed-element-volume, and free surface flow. Taking a unified point of view, the book first introduces the basis of finite volume, weighted residual, and spectral approaches. The contributors present the SPH method, a novel approach to computational fluid dynamics based on the mesh-free technique, and then improve the method using an arbitrary Lagrangian-Eulerian (ALE) formalism. They also explain how to improve the accuracy of the mesh-free integration procedure, with special emphasis on the finite volume particle method (FVPM). After describing numerical algorithms for compressible computational fluid dynamics, the text discusses the prediction of turbulent complex flows in environmental and engineering problems. The last chapter explores the modeling and numerical simulation of free surface flows, including future behaviors of glaciers. The diverse applications discussed in this book illustrate the importance of numerical methods in fluid mechanics. With research continually evolving in the field, there is no doubt that new techniques and tools will emerge to offer greater accuracy and speed in solving and analyzing even more fluid flow problems.
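Of the methods listed above, the finite difference approach is the easiest to show briefly. The following Python sketch solves a toy 1-D heat equation with an explicit scheme; it is our own illustration, not an example from the book, and all parameter values are arbitrary:

# Toy explicit finite-difference solver for the 1-D heat equation u_t = alpha * u_xx
# on [0, 1] with u = 0 at both ends.
alpha, nx, nt = 1.0, 51, 2000
dx = 1.0 / (nx - 1)
dt = 0.4 * dx * dx / alpha          # respects the explicit stability limit dt <= dx^2 / (2 * alpha)

# Initial condition: a bump in the middle of the rod.
u = [1.0 if 0.4 <= i * dx <= 0.6 else 0.0 for i in range(nx)]

for _ in range(nt):
    u_new = u[:]
    for i in range(1, nx - 1):
        u_new[i] = u[i] + alpha * dt / (dx * dx) * (u[i + 1] - 2.0 * u[i] + u[i - 1])
    u = u_new

print(max(u))  # the bump diffuses and decays toward zero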
Domain theory is a rich interdisciplinary area at the intersection of logic, computer science, and mathematics. This volume contains selected papers presented at the International Symposium on Domain Theory which took place in Shanghai in October 1999. Topics of papers range from the encounters between topology and domain theory, sober spaces, Lawson topology, real number computability and continuous functionals to fuzzy modelling, logic programming, and pi-calculi. This book is a valuable reference for researchers and students interested in this rapidly developing area of theoretical computer science.
Iterative Splitting Methods for Differential Equations explains how to solve evolution equations via novel iterative-based splitting methods that efficiently use computational and memory resources. It focuses on systems of parabolic and hyperbolic equations, including convection-diffusion-reaction equations, heat equations, and wave equations. In the theoretical part of the book, the author discusses the main theorems and results of the stability and consistency analysis for ordinary differential equations. He then presents extensions of the iterative splitting methods to partial differential equations and spatial- and time-dependent differential equations. The practical part of the text applies the methods to benchmark and real-life problems, such as waste disposal, elastic wave propagation, and complex flow phenomena. The book also examines the benefits of equation decomposition. It concludes with a discussion on several useful software packages, including r3t and FIDOS. Covering a wide range of theoretical and practical issues in multiphysics and multiscale problems, this book explores the benefits of using iterative splitting schemes to solve physical problems. It illustrates how iterative operator splitting methods are excellent decomposition methods for obtaining higher-order accuracy.
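As a rough illustration of the operator-splitting idea the book builds on, the Python sketch below applies plain Lie (sequential) splitting to a small linear test system and compares it with the exact solution. It is a generic illustration, not the author's iterative splitting schemes; the matrices A and B are arbitrary choices:

# Lie splitting for du/dt = (A + B) u on a linear test problem, compared with
# the exact matrix exponential.
import numpy as np
from scipy.linalg import expm

A = np.array([[-1.0, 0.5], [0.0, -2.0]])   # a "diffusion-like" part (arbitrary)
B = np.array([[0.0, -0.3], [0.3, 0.0]])    # a "convection-like" part (arbitrary)
u0 = np.array([1.0, 0.0])

T, n_steps = 1.0, 100
dt = T / n_steps

u = u0.copy()
eA, eB = expm(dt * A), expm(dt * B)        # per-step sub-flows
for _ in range(n_steps):
    u = eB @ (eA @ u)                      # advance with A for dt, then with B for dt

u_exact = expm(T * (A + B)) @ u0
print("splitting:", u, "exact:", u_exact, "error:", np.linalg.norm(u - u_exact))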
Problems in decision making and in other areas such as pattern recognition, control, structural engineering etc. involve numerous aspects of uncertainty. Additional vagueness is introduced as models become more complex but not necessarily more meaningful by the added details. During the last two decades one has become more and more aware of the fact that not all this uncertainty is of stochastic (random) character and that, therefore, it cannot be modelled appropriately by probability theory. This becomes the more obvious the more we want to represent formally human knowledge. As far as uncertain data are concerned, we have neither instruments nor reasoning at our disposal as well defined and unquestionable as those used in probability theory. This almost infallible domain is the result of tremendous work by the whole scientific world. But when measures are dubious, bad or no longer possible and when we really have to make use of the richness of human reasoning in its variety, then the theories dealing with the treatment of uncertainty, some quite new and other ones older, provide the required complement, and fill in the gap left in the field of knowledge representation. Nowadays, various theories are widely used: fuzzy sets, belief functions, the convenient associations between probability and fuzziness, etc. We are more and more in need of a wide range of instruments and theories to build models that are more and more adapted to the most complex systems.
It is known that many control processes are characterized by both quantitative and qualitative complexity. The quantitative complexity is usually expressed in a large number of state variables, that is, a high-dimensional mathematical model. The qualitative complexity is usually associated with uncertain behaviour, that is, an approximately known mathematical model. If the above two aspects of complexity are considered separately, the corresponding control problem can be easily solved. On one hand, large scale systems theory has existed for more than 20 years and has proved its capabilities in solving high dimensional control problems on the basis of decomposition, hierarchy, decentralization and multilayers. On the other hand, the fuzzy linguistic approach is almost of the same age and has shown its advantages in solving approximately formulated control problems on the basis of linguistic reasoning and logical inference. However, if both aspects of complexity are considered together, the corresponding control problem becomes non-trivial and does not have an easy solution. Modern control theory and practice have reacted accordingly to the above mentioned new challenges of the day by utilizing the latest achievements in computer technology and artificial intelligence: distributed computation and intelligent operation. In this respect, a new field has emerged in the last decade, called "distributed intelligent control systems". However, the majority of the familiar works in this field are still either on an empirical or on a conceptual level and this is a significant drawback.