The International Biometric Society (IBS) was formed at the First International Biometric Conference at Woods Hole on September 6, 1947. The History of the International Biometric Society presents a deep dive into the voluminous archival records, with a primary focus on IBS's first fifty years. It contains numerous photos and extracts from the archival materials, and features many photos of the leaders who served IBS across the decades. Features:
- Describes the events leading up to and at Woods Hole on September 6, 1947 that led to the formation of IBS
- Outlines the key markers that shaped IBS from its 1947 formation through to the modern day
- Describes the regional and national group structure, and the formation of regions and national groups
- Describes events surrounding the key scientific journal of IBS, Biometrics, including the transfer of its ownership to IBS, its content, editors, policies, management, and importance
- Describes the other key IBS publications: Biometric Bulletin, the Journal of Agricultural, Biological and Environmental Statistics, and regional publications
- Provides details of the International Biometric Conferences and key early symposia
- Describes IBS constitution and by-laws processes, and the evolution of business arrangements
- Provides a record of international officers, including regional presidents, national group secretaries, journal editors, and the locations of meetings
- Includes a gallery of international Presidents, and a gallery of Secretaries and Treasurers
The History of the International Biometric Society will appeal to anyone interested in the activities of our statistical and biometrical forebears. The focus is on issues and events that engaged the attention of the officers of IBS. Some of these records are riveting, some entertaining, some intriguing, and some colorful. Some of the issues covered were difficult to handle, but even these often resulted in changes that benefited IBS.
Digital forensics plays a crucial role in identifying, analysing, and presenting cyber threats as evidence in a court of law. Artificial intelligence, particularly machine learning and deep learning, enables automation of the digital investigation process. This book provides an in-depth look at fundamental and advanced methods in digital forensics, and discusses how machine learning and deep learning algorithms can be used to detect and investigate cybercrimes. It demonstrates digital forensics and cyber-investigation techniques with real-world applications. It examines hard disk analytics and file system architectures, including the Master Boot Record and the GUID Partition Table, as part of the investigative process. It also covers cyberattack analysis in Windows, Linux, and network systems, using virtual machines in real-world scenarios. Digital Forensics in the Era of Artificial Intelligence will be helpful for those interested in digital forensics, in using machine learning techniques in the investigation of cyberattacks, and in the detection of evidence in cybercrimes.
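As an illustration of the kind of on-disk structure mentioned above, here is a minimal sketch (ours, not from the book) of parsing the four primary partition entries of a Master Boot Record in Python. The field offsets follow the standard MBR layout: the partition table at byte 446, four 16-byte entries, and the 0x55AA boot signature at byte 510.

```python
import struct

SECTOR_SIZE = 512
PARTITION_TABLE_OFFSET = 446
ENTRY_SIZE = 16

def parse_mbr(sector: bytes):
    """Parse the four primary partition entries from a 512-byte MBR sector."""
    if len(sector) != SECTOR_SIZE or sector[510:512] != b"\x55\xaa":
        raise ValueError("not a valid MBR sector (missing 0x55AA signature)")
    partitions = []
    for i in range(4):
        off = PARTITION_TABLE_OFFSET + i * ENTRY_SIZE
        entry = sector[off:off + ENTRY_SIZE]
        boot_flag, ptype = entry[0], entry[4]
        # Start LBA and sector count are little-endian 32-bit integers at offset 8.
        start_lba, num_sectors = struct.unpack_from("<II", entry, 8)
        if ptype != 0x00:  # partition type 0x00 marks an unused entry
            partitions.append({
                "index": i,
                "bootable": boot_flag == 0x80,
                "type": ptype,
                "start_lba": start_lba,
                "sectors": num_sectors,
            })
    return partitions

# Build a synthetic MBR sector with one bootable NTFS-type (0x07) partition.
sector = bytearray(SECTOR_SIZE)
entry = struct.pack("<B3sB3sII", 0x80, b"\x00\x00\x00", 0x07, b"\x00\x00\x00",
                    2048, 1024000)
sector[PARTITION_TABLE_OFFSET:PARTITION_TABLE_OFFSET + ENTRY_SIZE] = entry
sector[510:512] = b"\x55\xaa"
print(parse_mbr(bytes(sector)))
```

A forensic tool would read the first sector of a disk image and apply the same parsing; GPT disks use a protective MBR followed by a separate partition-entry array.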
This book is a comprehensive examination of the conception, perception, performance, and composition of time in music across time and culture. It surveys the literature of time in mathematics, philosophy, psychology, music theory, and somatic studies (medicine and disability studies) and looks ahead through original research in performance, composition, psychology, and education. It is the first monograph solely devoted to the theory of construction of musical time since Kramer in 1988, with new insights, mathematical precision, and an expansive global and historical context. The mathematical methods applied for the construction of musical time are totally new. They relate to category theory (projective limits) and the mathematical theory of gestures. These methods and results extend the music theory of time but also apply to the applied performative understanding of making music. In addition, it is the very first approach to a constructive theory of time, deduced from the recent theory of musical gestures and their categories. Making Musical Time is intended for a wide audience of scholars with interest in music. These include mathematicians, music theorists, (ethno)musicologists, music psychologists / educators / therapists, music performers, philosophers of music, audiologists, and acousticians.
Decision Theory: An Introduction to Dynamic Programming and Sequential Decisions. John Bather, University of Sussex, UK. Mathematical induction, and its use in solving optimization problems, is a topic of great interest with many applications. It enables us to study multistage decision problems by proceeding backwards in time, using a method called dynamic programming. All the techniques needed to solve the various problems are explained, and the author's fluent style will leave the reader with an avid interest in the subject.
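The backward-in-time method described above can be sketched on a toy multistage problem (this example is ours, not the author's): at each level of a number triangle the decision-maker moves left or right, and the optimal total is found by working from the final stage back to the first.

```python
# Backward induction on a toy multistage decision problem: choose left or
# right at each level of a number triangle to maximise the total collected.
triangle = [
    [3],
    [7, 4],
    [2, 4, 6],
    [8, 5, 9, 3],
]

def max_path(tri):
    """Work backwards in time: the value of a state at stage t is its
    immediate reward plus the best value reachable at stage t+1."""
    values = list(tri[-1])  # terminal values are just the last row
    for row in reversed(tri[:-1]):
        # Each state at this stage leads to two successors at the next stage.
        values = [r + max(values[i], values[i + 1]) for i, r in enumerate(row)]
    return values[0]

print(max_path(triangle))  # 3 + 7 + 4 + 9 = 23
```

The same recursion, with expectations over random transitions in place of `max` over two successors, is the general dynamic programming scheme for sequential decisions.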
This monograph provides a modern introduction to the theory of quantales. First coined by C.J. Mulvey in 1986, quantales have since developed into a significant topic at the crossroads of algebra and logic, of notable interest to theoretical computer science. This book recasts the subject within the powerful framework of categorical algebra, showcasing its versatility through applications to C*- and MV-algebras, fuzzy sets and automata. With exercises and historical remarks at the end of each chapter, this self-contained book provides readers with a valuable source of references and hints for future research. This book will appeal to researchers across mathematics and computer science with an interest in category theory, lattice theory, and many-valued logic.
This unique and contemporary text not only offers an introduction to proofs with a view towards algebra and analysis, the standard fare for a transition course, but also presents practical skills for upper-level mathematics coursework and exposes undergraduate students to the context and culture of contemporary mathematics. The authors implement the practice recommended by the Committee on the Undergraduate Program in Mathematics (CUPM) curriculum guide: that a modern mathematics program should include cognitive goals and offer a broad perspective on the discipline. Part I offers:
- An introduction to logic and set theory.
- Proof methods as a vehicle leading to topics useful for analysis, topology, algebra, and probability.
- Many illustrated examples, often drawing on what students already know, that minimize conversation about "doing proofs."
- An appendix that provides an annotated rubric with feedback codes for assessing proof writing.
Part II presents the context and culture aspects of the transition experience, including:
- 21st-century mathematics, including the current mathematical culture, vocations, and careers.
- History and philosophical issues in mathematics.
- Approaching, reading, and learning from journal articles and other primary sources.
- Mathematical writing and typesetting in LaTeX.
Together, these parts provide a complete introduction to modern mathematics, both in content and in practice.
Table of Contents. Part I - Introduction to Proofs: Logic and Sets; Arguments and Proofs; Functions; Properties of the Integers; Counting and Combinatorial Arguments; Relations. Part II - Culture, History, Reading, and Writing: Mathematical Culture, Vocation, and Careers; History and Philosophy of Mathematics; Reading and Researching Mathematics; Writing and Presenting Mathematics. Appendix A: Rubric for Assessing Proofs. Appendix B: Index of Theorems and Definitions from Calculus and Linear Algebra. Bibliography. Index.
Biographies. Danilo R. Diedrichs is an Associate Professor of Mathematics at Wheaton College in Illinois. Raised and educated in Switzerland, he holds a PhD in applied mathematical and computational sciences from the University of Iowa, as well as a master's degree in civil engineering from the Ecole Polytechnique Federale in Lausanne, Switzerland. His research interests are in dynamical systems modeling applied to biology, ecology, and epidemiology. Stephen Lovett is a Professor of Mathematics at Wheaton College in Illinois. He holds a PhD in representation theory from Northeastern University. His other books include Abstract Algebra: Structures and Applications (2015), Differential Geometry of Curves and Surfaces, with Tom Banchoff (2016), and Differential Geometry of Manifolds (2019).
Chaitin, the inventor of algorithmic information theory, presents in this book the strongest possible version of Goedel's incompleteness theorem, using an information theoretic approach based on the size of computer programs. One half of the book is concerned with studying the halting probability of a universal computer if its program is chosen by tossing a coin. The other half is concerned with encoding the halting probability as an algebraic equation in integers, a so-called exponential diophantine equation.
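The halting probability studied in the first half of the book is standardly defined, for a prefix-free universal computer $U$, as the probability that a program whose bits are chosen by fair coin tosses eventually halts:

```latex
\Omega \;=\; \sum_{p \,:\, U(p) \text{ halts}} 2^{-|p|}
```

where $|p|$ is the length of program $p$ in bits. Prefix-freeness (no valid program is an initial segment of another) ensures the sum converges to a value in $(0,1)$; the bits of $\Omega$ are algorithmically random, which is the information-theoretic core of Chaitin's incompleteness results.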
Linear Logic is a branch of proof theory which provides refined tools for the study of the computational aspects of proofs. These tools include a duality-based categorical semantics, an intrinsic graphical representation of proofs, the introduction of well-behaved non-commutative logical connectives, and the concepts of polarity and focalisation. These various aspects are illustrated here through introductory tutorials as well as more specialised contributions, with a particular emphasis on applications to computer science: denotational semantics, lambda-calculus, logic programming and concurrency theory. The volume is rounded-off by two invited contributions on new topics rooted in recent developments of linear logic. The book derives from a summer school that was the climax of the EU Training and Mobility of Researchers project "Linear Logic in Computer Science." It is an excellent introduction to some of the most active research topics in the area.
Recursive Functions and Metamathematics deals with problems of the completeness and decidability of theories, using as its main tool the theory of recursive functions. This theory is first introduced and discussed. Then Gödel's incompleteness theorems are presented, together with generalizations, strengthenings, and decidability theory. The book also considers the historical and philosophical context of these issues and their philosophical and methodological consequences. Recent results and trends have been included, such as undecidable sentences of mathematical content and reverse mathematics. All the main results are presented in detail. The book is self-contained and presupposes only some knowledge of elementary mathematical logic. There is an extensive bibliography. Readership: scholars and advanced students of logic, mathematics, and the philosophy of science.
This book provides an introduction to mathematical logic and the foundations of mathematics. It will help prepare students for advanced study in set theory and mathematical logic as well as other areas of mathematics, such as analysis, topology, and algebra. The presentation of finite state and Turing machines leads to the Halting Problem and Goedel's Incompleteness Theorem, which have broad academic interest, particularly in computer science and philosophy.
In his Master Plan, Cai Chen (1167-1230) created an original divination manual based on the Yijing and keyed it to an intricate series of 81 matrixes with the properties of "magic squares." Previously unrecognized, Cai's work is a milestone in the history of mathematics, and, in introducing it, this book dramatically expands our understanding of the Chinese number theory practiced by the "Image and Number" school within Confucian philosophy. Thinkers of that leaning devised graphic arrays of the binary figures called "trigrams" and "hexagrams" in the Yijing as a way of exploring the relationship between the random draws of divination and the classic's readings. Cai adapted this perspective to his 81-matrix series, which he saw as tracing the recurring temporal cycles of the natural world. The architecture of the matrix series is echoed in the language of his divination texts, which he called "number names," hence the book's title. This book will appeal to those interested in philosophy, the history of science and mathematics, and Chinese intellectual history. The divination text has significant literary as well as philosophical dimensions, and its audience lies both among specialists in these fields and with a general readership interested in recreational mathematics and topics like divination, Taiji, and Fengshui.
In recent years, mathematical logic has developed in many directions, the initial unity of its subject matter giving way to a myriad of seemingly unrelated areas. The articles collected here, which range from historical scholarship to recent research in geometric model theory, squarely address this development. These articles also connect to the diverse work of Väänänen, whose ecumenical approach to logic reflects the unity of the discipline.
Philosophers of science have produced a variety of definitions for the notion of one sentence, theory, or hypothesis being closer to the truth, more verisimilar, or more truthlike than another. The definitions put forward by philosophers presuppose, at least implicitly, that the subject matter with which the compared sentences, theories, or hypotheses are concerned has been specified, and the property of closeness to the truth, verisimilitude, or truthlikeness appearing in such definitions should be understood as closeness to informative truth about that subject matter. This monograph is concerned with a special case of the problem of defining verisimilitude, a case in which this subject matter is of a rather restricted kind. Below, I shall suppose that there is a finite number of interrelated quantities which are used for characterizing the state of some system. Scientists might arrive at different hypotheses concerning the values of such quantities in a variety of ways. There might be various theories that give different predictions (whose informativeness might differ, too) on which combinations of the values of these quantities are possible. Scientists might also have measured all or some of the quantities in question with some accuracy. Finally, they might also have combined these two methods of forming hypotheses on their values by first measuring some of the quantities and then deducing the values of some others from the combination of a theory and the measurement results.
- Discusses in detail a World Formula: the unification of the greatest theories in physics, namely quantum theory and Einstein's general theory of relativity
- Demystifies David Hilbert's World Formula by simplifying the complex mathematics involved in it
- Explains why nobody had previously realized Hilbert's immortal stroke of genius
- As a "Theory of Everything" approach, it automatically provides the most holistic tools for every optimization, decision-making, or solution-finding problem there can possibly be, be it in physics, social science, medicine, socioeconomics and politics, real or artificial intelligence or, rather generally, philosophy
Architecture of Mathematics describes the logical structure of mathematics from its foundations to its real-world applications. It describes the many interweaving relationships between different areas of mathematics and their practical applications, and as such provides unique reading for professional mathematicians and nonmathematicians alike. The book can be an important resource both for the teaching of mathematics and as a means of outlining the research links between different subjects within and beyond the discipline. Features:
- All notions and properties are introduced logically and sequentially, to help the reader gradually build understanding.
- Focusses on illustrative examples that explain the meaning of mathematical objects and their properties.
- Suitable as a supplementary resource for teaching undergraduate mathematics, and as an aid to interdisciplinary research.
Forming the reader's understanding of mathematics as a unified science, the book helps to increase their general mathematical culture.
*An emphasis on the art of proof. *An enhanced number theory chapter presents some easily accessible but still-unsolved problems, including the Goldbach conjecture and the twin-prime conjecture. *The discussion of equivalence relations is revised to present reflexivity, symmetry, and transitivity before equivalence relations are defined. *The discussion of the RSA cryptosystem in Chapter 10 is expanded. *The author introduces groups much earlier: coverage of group theory, formerly in Chapter 11, has been moved up, as groups are an incisive example of an axiomatic theory.
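The Goldbach conjecture mentioned above (every even integer greater than 2 is the sum of two primes) is unsolved but easy to test empirically. A minimal sketch (ours, not the book's):

```python
def is_prime(n: int) -> bool:
    """Trial division: sufficient for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def goldbach_pair(n: int):
    """Return one pair (p, q) of primes with p + q == n, or None."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

# No even number > 2 is known to lack such a decomposition.
assert all(goldbach_pair(n) is not None for n in range(4, 1000, 2))
print(goldbach_pair(100))  # (3, 97)
```

The conjecture has been verified by computer far beyond this range, but a proof remains open, which is exactly what makes it an accessible unsolved problem for a transition course.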
This book is dedicated to the work of Alasdair Urquhart. It starts out with an introduction to and an overview of Urquhart's work, and an autobiographical essay by Urquhart. This introductory section is followed by papers on algebraic logic and lattice theory, papers on the complexity of proofs, and papers on philosophical logic and the history of logic. The final section of the book contains a response to the papers by Urquhart. Alasdair Urquhart has made extremely important contributions to a variety of fields in logic. He produced some of the earliest work on the semantics of relevant logic. He proved the undecidability of the logics R (of relevant implication) and E (of relevant entailment), as well as some of their close neighbors. He proved that interpolation fails in some of those systems. Urquhart has done very important work in complexity theory, in particular on the complexity of proofs in classical and some nonclassical logics. In pure algebra, he has produced a representation theorem for lattices and some rather beautiful duality theorems. In addition, he has done important work in the history of logic, especially on Bertrand Russell, including editing Volume 4 of Russell's Collected Papers.
This monograph considers several well-known mathematical theorems and asks the question, "Why prove it again?" while examining alternative proofs. It explores the different rationales mathematicians may have for pursuing and presenting new proofs of previously established results, as well as how they judge whether two proofs of a given result are different. While a number of books have examined alternative proofs of individual theorems, this is the first that presents comparative case studies of other methods for a variety of different theorems. The author begins by laying out the criteria for distinguishing among proofs and enumerates reasons why new proofs have, for so long, played a prominent role in mathematical practice. He then outlines various purposes that alternative proofs may serve. Each chapter that follows provides a detailed case study of alternative proofs for particular theorems, including the Pythagorean Theorem, the Fundamental Theorem of Arithmetic, Desargues' Theorem, the Prime Number Theorem, and the proof of the irreducibility of cyclotomic polynomials. Why Prove It Again? will appeal to a broad range of readers, including historians and philosophers of mathematics, students, and practicing mathematicians. Additionally, teachers will find it to be a useful source of alternative methods of presenting material to their students.
This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas of pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial model computability is presented with algebraic trees realizing intuitionistic types of models. New homotopy techniques are applied to Martin-Löf types of computations with model categories. Functorial computability, induction, and recursion are examined in view of the above, presenting new computability techniques with monad transformations and projective sets. This informative volume will give readers a completely new feel for models, computability, recursion sets, complexity, and realizability. The book pulls together functorial thoughts, models, computability, sets, recursion, arithmetic hierarchy, filters, and real tree computing areas, presented in a very intuitive manner for university teaching, with exercises for every chapter. It will also prove valuable for faculty in computer science and mathematics.
This open access book is the first ever collection of Karl Popper's writings on deductive logic. Karl R. Popper (1902-1994) was one of the most influential philosophers of the 20th century. His philosophy of science ("falsificationism") and his social and political philosophy ("open society") have been widely discussed way beyond academic philosophy. What is not so well known is that Popper also produced a considerable work on the foundations of deductive logic, most of it published at the end of the 1940s as articles at scattered places. This little-known work deserves to be known better, as it is highly significant for modern proof-theoretic semantics. This collection assembles Popper's published writings on deductive logic in a single volume, together with all reviews of these papers. It also contains a large amount of unpublished material from the Popper Archives, including Popper's correspondence related to deductive logic and manuscripts that were (almost) finished, but did not reach the publication stage. All of these items are critically edited with additional comments by the editors. A general introduction puts Popper's work into the context of current discussions on the foundations of logic. This book should be of interest to logicians, philosophers, and anybody concerned with Popper's work.
This book presents the entire body of thought of Norbert Wiener (1894-1964), knowledge of which is essential if one wishes to understand and correctly interpret the age in which we live. The focus is in particular on the philosophical and sociological aspects of Wiener's thought, but these aspects are carefully framed within the context of his scientific journey. Important biographical events, including some that were previously unknown, are also highlighted, but while the book has a biographical structure, it is not only a biography. The book is divided into four chronological sections, the first two of which explore Wiener's development as a philosopher and logician and his brilliant interwar career as a mathematician, supported by his philosophical background. The third section considers his research during World War II, which drew upon his previous scientific work and reflections and led to the birth of cybernetics. Finally, the radical post-war shift in Wiener's intellectual path is considered, examining how he came to abandon computer science projects and commenced ceaseless public reflections on the new sciences and technologies of information, their social effects, and the need for responsibility in science.
This monograph proposes a new way of implementing interaction in logic. It also provides an elementary introduction to Constructive Type Theory (CTT). The authors equally emphasize basic ideas and finer technical details. In addition, many worked-out exercises and examples will help readers to better understand the concepts under discussion. One of the chief ideas animating this study is that the dialogical understanding of definitional equality and its execution provides both a simple and a direct way of implementing the CTT approach within a game-theoretical conception of meaning. In addition, the importance of the play level over the strategy level is stressed, binding together the matter of execution with that of equality and the finitary perspective on games constituting meaning. According to this perspective, the games through which concepts emerge are not only games of giving and asking for reasons (games involving Why-questions); they are also games that include moves establishing how it is that the reasons brought forward accomplish their explicative task. Thus, immanent reasoning games are dialogical games of Why and How.