In fall 2000, the Notre Dame logic community hosted Greg Hjorth, Rodney G. Downey, Zoé Chatzidakis, and Paola D'Aquino as visiting lecturers. Each of them presented a month-long series of expository lectures at the graduate level. The articles in this volume are refinements of these excellent lectures.
This compilation of papers presented at the 2000 European Summer Meeting of the Association for Symbolic Logic marks the centennial of Hilbert's famous lecture. Held in the same hall at La Sorbonne where Hilbert first presented his famous problems, this meeting carries special significance for the mathematics and logic communities. The presentations include tutorials and research articles from some of the world's preeminent logicians. Three long articles are based on tutorials given at the meeting, and present accessible expositions of developing research in three active areas of logic: model theory, computability, and set theory. The eleven subsequent articles cover separate research topics in all areas of mathematical logic, including aspects of computer science, proof theory, set theory, model theory, computability theory, and philosophy.
Gödel's Incompleteness Theorems are among the most significant results in the foundations of mathematics. These results have a positive consequence: any system of axioms for mathematics that we recognize as correct can be properly extended by adding as a new axiom a formal statement expressing that the original system is consistent. This suggests that our mathematical knowledge is inexhaustible, an essentially philosophical topic to which this book is devoted. Basic material in predicate logic, set theory and recursion theory is presented, leading to a proof of the incompleteness theorems. The inexhaustibility of mathematical knowledge is treated based on the concept of transfinite progressions of theories as conceived by Turing and Feferman. All concepts and results necessary to understand the arguments are introduced as needed, making the presentation self-contained and thorough.
Boolean valued analysis is a technique for studying properties of an arbitrary mathematical object by comparing its representations in two different set-theoretic models whose construction utilises principally distinct Boolean algebras. The use of two models for studying a single object is a characteristic of the so-called non-standard methods of analysis. Application of Boolean valued models to problems of analysis rests ultimately on the procedures of ascending and descending, the two natural functors acting between a new Boolean valued universe and the von Neumann universe. This book demonstrates the main advantages of Boolean valued analysis which provides the tools for transforming, for example, function spaces to subsets of the reals, operators to functionals, and vector-functions to numerical mappings. Boolean valued representations of algebraic systems, Banach spaces, and involutive algebras are examined thoroughly. Audience: This volume is intended for classical analysts seeking new tools, and for model theorists in search of challenging applications of nonstandard models.
'Points, questions, stories, and occasional rants introduce the 24 chapters of this engaging volume. With a focus on mathematics and peppered with a scattering of computer science settings, the entries range from lightly humorous to curiously thought-provoking. Each chapter includes sections and sub-sections that illustrate and supplement the point at hand. Most topics are self-contained within each chapter, and a solid high school mathematics background is all that is needed to enjoy the discussions. There certainly is much to enjoy here.' CHOICE
Ever notice how people sometimes use math words inaccurately? Or how sometimes you instinctively know a math statement is false (or not known)? Each chapter of this book makes a point like those above and then illustrates the point by doing some real mathematics through step-by-step mathematical techniques. This book gives readers valuable information about how mathematics and theoretical computer science work, while teaching them some actual mathematics and computer science through examples and exercises. Much of the mathematics could be understood by a bright high school student. The points made can be understood by anyone with an interest in math, from the bright high school student to a Fields Medal winner.
Quantitative thinking is our inclination to view natural and everyday phenomena through a lens of measurable events, with forecasts, odds, predictions, and likelihood playing a dominant part. The Error of Truth recounts the astonishing and unexpected tale of how quantitative thinking came to be, and its rise to primacy in the nineteenth and early twentieth centuries. Additionally, it considers how seeing the world through a quantitative lens has shaped our perception of the world we live in, and explores the lives of the individuals behind its early establishment. This worldview was unlike anything humankind had before, and it came about because of a momentous human achievement: we had learned how to measure uncertainty. Probability as a science was conceptualised. As a result of probability theory, we now had correlations, reliable predictions, regressions, the bell-shaped curve for studying social phenomena, and the psychometrics of educational testing. Significantly, these developments happened during a relatively short period in world history: roughly, the 130-year period from 1790 to 1920, from about the close of the Napoleonic era, through the Enlightenment and the Industrial Revolutions, to the end of World War I. By then, transportation had advanced rapidly, due to the invention of the steam engine, and literacy rates had increased exponentially. This brief period in time was ready for fresh intellectual activity, and it provided the impetus for these inventions in probability. Quantification is now everywhere in our daily lives, such as in the ubiquitous microchip in smartphones, cars, and appliances; in the Bayesian logic of artificial intelligence; and in applications in business, engineering, medicine, economics, and elsewhere. Probability is the foundation of quantitative thinking. The Error of Truth tells its story: when, why, and how it happened.
This book lays out the theory of Mordell-Weil lattices, a very powerful and influential tool at the crossroads of algebraic geometry and number theory, which offers many fruitful connections to other areas of mathematics. The book presents all the ingredients entering into the theory of Mordell-Weil lattices in detail, notably, relevant portions of lattice theory, elliptic curves, and algebraic surfaces. After defining Mordell-Weil lattices, the authors provide several applications in depth. They start with the classification of rational elliptic surfaces. Then a useful connection with Galois representations is discussed. By developing the notion of excellent families, the authors are able to design many Galois representations with given Galois groups such as the Weyl groups of E6, E7 and E8. They also explain a connection to the classical topic of the 27 lines on a cubic surface. Two chapters deal with elliptic K3 surfaces, a vibrant area of recent research activity which highlights many central properties of Mordell-Weil lattices. Finally, the book turns to the rank problem, one of the key motivations for the introduction of Mordell-Weil lattices. The authors present the state of the art of the rank problem for elliptic curves both over Q and over C(t) and work out applications to the sphere packing problem. Throughout, the book includes many instructive examples illustrating the theory.
‘Another terrific book by Rob Eastaway’ SIMON SINGH ‘A delightfully accessible guide to how to play with numbers’ HANNAH FRY How many cats are there in the world? What's the chance of winning the lottery twice? And just how long does it take to count to a million? Learn how to tackle tricky maths problems with nothing but the back of an envelope, a pencil and some good old-fashioned brain power. Join Rob Eastaway as he takes an entertaining look at how to figure without a calculator. Packed with amusing anecdotes, quizzes, and handy calculation tips for every situation, Maths on the Back of an Envelope is an invaluable introduction to the art of estimation, and a welcome reminder that sometimes our own brain is the best tool we have to deal with numbers.
The proceedings of the Los Angeles Caltech-UCLA 'Cabal Seminar' were originally published in the 1970s and 1980s. Wadge Degrees and Projective Ordinals is the second of a series of four books collecting the seminal papers from the original volumes together with extensive unpublished material, new papers on related topics and discussion of research developments since the publication of the original volumes. Focusing on the subjects of 'Wadge Degrees and Pointclasses' (Part III) and 'Projective Ordinals' (Part IV), each of the two sections is preceded by an introductory survey putting the papers into present context. These four volumes will be a necessary part of the book collection of every set theorist.
Design theory has grown to be a subject of considerable interest in mathematics, not only in itself, but for its connections to other fields such as geometry, group theory, graph theory and coding theory. This textbook, first published in 1985, is intended to be an accessible introduction to the subject for advanced undergraduate and beginning graduate students which should prepare them for research in design theory and its applications. The first four chapters of the book are designed to be the core of any course in the subject, while the remaining chapters can be utilized in more advanced or longer courses. The authors assume some knowledge of linear algebra for the first half of the book, but for the second half, students need further background in algebra.
This book proves some important new theorems in the theory of canonical inner models for large cardinal hypotheses, a topic of central importance in modern set theory. In particular, the author 'completes' the theory of Fine Structure and Iteration Trees (FSIT) by proving a comparison theorem for mouse pairs parallel to the FSIT comparison theorem for pure extender mice, and then using the underlying comparison process to develop a fine structure theory for strategy mice. Great effort has been taken to make the book accessible to non-experts so that it may also serve as an introduction to the higher reaches of inner model theory. It contains a good deal of background material, some of it unpublished folklore, and includes many references to the literature to guide further reading. An introductory essay serves to place the new results in their broader context. This is a landmark work in inner model theory that should be in every set theorist's library.
Computing in Nonlinear Media and Automata Collectives presents an account of new ways to design massively parallel computing devices in advanced mathematical models, such as cellular automata and lattice swarms, from unconventional materials, including chemical solutions, bio-polymers, and excitable media.
Computability theory is a branch of mathematical logic and computer science that has become increasingly relevant in recent years. The field has developed growing connections in diverse areas of mathematics, with applications in topology, group theory, and other subfields. In A Hierarchy of Turing Degrees, Rod Downey and Noam Greenberg introduce a new hierarchy that allows them to classify the combinatorics of constructions from many areas of computability theory, including algorithmic randomness, Turing degrees, effectively closed sets, and effective structure theory. This unifying hierarchy gives rise to new natural definability results for Turing degree classes, demonstrating how dynamic constructions become reflected in definability. Downey and Greenberg present numerous construction techniques involving high-level nonuniform arguments, and their self-contained work is appropriate for graduate students and researchers. Blending traditional and modern research results in computability theory, A Hierarchy of Turing Degrees establishes novel directions in the field.
The articles collected in this volume represent the contributions presented at the IMA workshop on "Dynamics of Algorithms," which took place in November 1997. The workshop was an integral part of the 1997-98 IMA program on "Emerging Applications of Dynamical Systems." The interaction between algorithms and dynamical systems is mutually beneficial: dynamical methods can be used to study algorithms that are applied repeatedly, since convergence and asymptotic rates are dynamical properties. On the other hand, the study of dynamical systems benefits enormously from having efficient algorithms to compute dynamical objects.
In these essays Geoffrey Hellman presents a strong case for a healthy pluralism in mathematics and its logics, supporting peaceful coexistence despite what appear to be contradictions between different systems, and positing different frameworks serving different legitimate purposes. The essays refine and extend Hellman's modal-structuralist account of mathematics, developing a height-potentialist view of higher set theory which recognizes indefinite extendability of models and stages at which sets occur. In the first of three new essays written for this volume, Hellman shows how extendability can be deployed to derive the axiom of Infinity and that of Replacement, improving on earlier accounts; he also shows how extendability leads to attractive, novel resolutions of the set-theoretic paradoxes. Other essays explore advantages and limitations of restrictive systems - nominalist, predicativist, and constructivist. Also included are two essays, with Solomon Feferman, on predicative foundations of arithmetic.
This story of a highly intelligent observer of the turbulent 20th century who was intimately involved as the secretary and bodyguard to Leon Trotsky is based on extensive interviews with the subject, Jean van Heijenoort, and his family, friends, and colleagues. The author has captured the personal drama and the professional life of her protagonist--ranging from the political passion of a young intellectual to the scientific and historic work in the most abstract and yet philosophically important area of logic--in a very readable narrative.
This textbook reviews the foundational topics typically covered in an introduction-to-proofs course, then studies the language of sentential logic as well as the more powerful language of first-order logic and the notion of a formal deduction in first-order logic. In addition, it proves Gödel's Completeness Theorem and discusses incompleteness and the concept of computability.
Automatic sequences are sequences over a finite alphabet generated by a finite-state machine. This book presents a novel viewpoint on automatic sequences, and more generally on combinatorics on words, by introducing a decision method through which many new results in combinatorics and number theory can be automatically proved or disproved with little or no human intervention. This approach to proving theorems is extremely powerful, allowing long and error-prone case-based arguments to be replaced by simple computations. Readers will learn how to phrase their desired results in first-order logic, using free software to automate the computation process. Results that normally require multipage proofs can emerge in milliseconds, allowing users to engage with mathematical questions that would otherwise be difficult to solve. With more than 150 exercises included, this text is an ideal resource for researchers, graduate students, and advanced undergraduates studying combinatorics, sequences, and number theory.
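As a concrete illustration of the book's starting point (an example of mine, not taken from the text), the classic Thue-Morse sequence is 2-automatic: a two-state machine reads the binary digits of n and outputs the n-th term, which equals the parity of 1-bits in n. A minimal Python sketch:

```python
def thue_morse(n: int) -> int:
    """n-th term of the Thue-Morse sequence.

    A two-state automaton reads the binary digits of n, flipping
    state on each 1-bit; the final state is the output symbol.
    """
    state = 0
    for bit in bin(n)[2:]:
        if bit == "1":
            state ^= 1  # transition on input symbol 1
    return state

# First 16 terms: 0 1 1 0 1 0 0 1 1 0 0 1 0 1 1 0
print([thue_morse(n) for n in range(16)])
```

Properties of sequences like this one (e.g. that Thue-Morse contains no three consecutive identical blocks) are exactly the kind of first-order statements the book's decision method can verify mechanically.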
Using a unique pedagogical approach, this text introduces mathematical logic by guiding students in implementing the underlying logical concepts and mathematical proofs via Python programming. This approach, tailored to the unique intuitions and strengths of the ever-growing population of programming-savvy students, brings mathematical logic into the comfort zone of these students and provides clarity that can only be achieved by a deep hands-on understanding and the satisfaction of having created working code. While the approach is unique, the text follows the same set of topics typically covered in a one-semester undergraduate course, including propositional logic and first-order predicate logic, culminating in a proof of Gödel's completeness theorem. A sneak peek at Gödel's incompleteness theorem is also provided. The textbook is accompanied by an extensive collection of programming tasks, code skeletons, and unit tests. Familiarity with proofs and basic proficiency in Python are assumed.
Floating-point arithmetic is ubiquitous in modern computing, as it is the tool of choice to approximate real numbers. Due to its limited range and precision, its use can become quite involved and potentially lead to numerous failures. One way to greatly increase confidence in floating-point software is by computer-assisted verification of its correctness proofs. This book provides a comprehensive view of how to formally specify and verify tricky floating-point algorithms with the Coq proof assistant. It describes the Flocq formalization of floating-point arithmetic and some methods to automate theorem proofs. It then presents the specification and verification of various algorithms, from error-free transformations to a numerical scheme for a partial differential equation. The examples cover not only mathematical algorithms but also C programs as well as issues related to compilation.
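The "error-free transformations" mentioned above have a classic representative, Knuth's TwoSum, which recovers the exact rounding error of a floating-point addition. The sketch below (my illustration in Python's IEEE-754 doubles, not code from the book, whose proofs are carried out in Coq) shows the idea:

```python
def two_sum(a: float, b: float) -> tuple[float, float]:
    """Knuth's TwoSum error-free transformation.

    Returns (s, t) where s = fl(a + b) is the rounded sum and
    t is the rounding error, so that a + b == s + t exactly
    in IEEE-754 arithmetic (barring overflow).
    """
    s = a + b
    a_approx = s - b            # recover the part of s coming from a
    b_approx = s - a_approx     # recover the part of s coming from b
    t = (a - a_approx) + (b - b_approx)  # the two rounding residuals
    return s, t

s, t = two_sum(0.1, 0.2)
print(s, t)  # t captures exactly what rounding lost in 0.1 + 0.2
```

Verifying in a proof assistant that the identity a + b == s + t holds for all inputs, not just on a few test cases, is precisely the kind of guarantee the Flocq formalization provides.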
This volume comprises an imaginative collection of pieces created in tribute to Martin Gardner. Perhaps best known for writing Scientific American's "Mathematical Games" column for years, Gardner used his personal exuberance and fascination with puzzles and magic to entice a wide range of readers into a world of mathematical discovery. This tribute therefore contains pieces as widely varied as Gardner's own interests, ranging from limericks to lengthy treatises, from mathematical journal articles to personal stories. This book makes a charming and unusual addition to any personal library. Selected papers: - The Odyssey of the Figure Eight Puzzle by Stewart Coffin - Block-Packing Jambalaya by Bill Cutler - O'Beirne's Hexiamond by Richard K. Guy - Biblical Ladders by Donald E. Knuth - Three Limericks: On Space, Time and Speed by Tim Rowett.
Is mathematics 'entangled' with its various formalisations? Or are the central concepts of mathematics largely insensitive to formalisation, or 'formalism free'? What is the semantic point of view and how is it implemented in foundational practice? Does a given semantic framework always have an implicit syntax? Inspired by what she calls the 'natural language moves' of Gödel and Tarski, Juliette Kennedy considers what roles the concepts of 'entanglement' and 'formalism freeness' play in a range of logical settings, from computability and set theory to model theory and second-order logic, to logicality, developing an entirely original philosophy of mathematics along the way. The treatment is historically, logically and set-theoretically rich, and topics such as naturalism and foundations receive their due, but now with a new twist.
Convergence of Blockchain, AI and IoT: A Digital Platform discusses the convergence of three powerful technologies that play into the digital revolution and blur the lines between biological, digital, and physical objects. This book covers novel algorithms and solutions for addressing issues in applications, security, authentication, and privacy. It:
- Discusses innovative technological upgrades and significant challenges in the current era
- Gives an overview of clinical scientific research that enables smart diagnosis through artificial intelligence
- Provides insight into how disruptive technology, enabled with self-running devices and protection mechanisms, is involved in augmented reality with blockchain
- Talks about neural science being capable of enhancing deep brain waves to predict an overall improvement in human thoughts and behaviours
- Covers the digital currency mechanism in detail
- Enhances readers' knowledge of smart contracts and ledger mechanisms with artificial intelligence and blockchain
Targeted audiences range from those interested in the technical revolution of blockchain, big data and the Internet of Things, to research scholars and the professional market.
This is the first of two volumes by Professor Cherlin presenting the state of the art in the classification of homogeneous structures in binary languages and related problems in the intersection of model theory and combinatorics. Researchers and graduate students in the area will find in these volumes many far-reaching results and interesting new research directions to pursue. In this volume, Cherlin develops a complete classification of homogeneous ordered graphs and provides a full proof. He then proposes a new family of metrically homogeneous graphs, a weakening of the usual homogeneity condition. A general classification conjecture is presented, together with general structure theory and applications to a general classification conjecture for such graphs. It also includes introductory chapters giving an overview of the results and methods of both volumes, and an appendix surveying recent developments in the area. An extensive accompanying bibliography of related literature, organized by topic, is available online. |