The engineering and business problems the world faces today have become more impenetrable and unstructured, making the design of a satisfactory problem-specific algorithm nontrivial. Modeling, Analysis, and Applications in Metaheuristic Computing: Advancements and Trends is a collection of the latest developments, models, and applications within the transdisciplinary fields related to metaheuristic computing. Providing researchers, practitioners, and academicians with insight into a wide range of topics such as genetic algorithms, differential evolution, and ant colony optimization, this book compiles the latest findings, analysis, improvements, and applications of technologies within metaheuristic computing.
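Differential evolution, one of the metaheuristics surveyed in collections like this, can be sketched in a few lines. The following is a minimal illustration (not taken from the book): it minimizes the two-dimensional sphere function f(x) = x1^2 + x2^2, and the population size, scale factor F, and crossover rate CR are arbitrary illustrative choices.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.5, CR=0.9,
                           generations=100, seed=42):
    """Minimal DE/rand/1/bin sketch: mutate, crossover, greedy selection."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct donors, none equal to the target i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # force at least one mutated coordinate
            trial = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                     if (rng.random() < CR or k == j_rand) else pop[i][k]
                     for k in range(dim)]
            f_trial = f(trial)
            if f_trial <= fit[i]:        # greedy one-to-one selection
                pop[i], fit[i] = trial, f_trial
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]

sphere = lambda x: sum(v * v for v in x)
x_best, f_best = differential_evolution(sphere, [(-5, 5), (-5, 5)])
```

Because selection is greedy, the best fitness is monotonically non-increasing; on this smooth unimodal test function the population contracts rapidly toward the origin.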
This book is specially designed to refresh and elevate the level of understanding of the foundational background in probability and distributional theory required to be successful in a graduate-level statistics program. Advanced undergraduate students and introductory graduate students from a variety of quantitative backgrounds will benefit from the transitional bridge that this volume offers, from a more generalized study of undergraduate mathematics and statistics to the career-focused, applied education at the graduate level. In particular, it focuses on growing fields that will be of potential interest to future M.S. and Ph.D. students, as well as advanced undergraduates heading directly into the workplace: data analytics, statistics and biostatistics, and related areas.
The greatly expanded and updated 3rd edition of this textbook offers the reader a comprehensive introduction to the concepts of logic functions and equations and their applications across computer science and engineering. The authors' approach emphasizes a thorough understanding of the fundamental principles as well as numerical and computer-based solution methods. The book provides insight into applications across propositional logic, binary arithmetic, coding, cryptography, complexity, logic design, and artificial intelligence. Major additions for the 3rd edition include: a new chapter about the concepts contributing to the power of XBOOLE; a new chapter introducing the application of the XBOOLE-Monitor XBM 2; many end-of-chapter tasks that help readers consolidate what they have learned; solutions to a large subset of these tasks to confirm learning success; and challenging tasks that require the power of the XBOOLE software for their solution. The XBOOLE-Monitor XBM 2 software is used to solve the exercises; in this way the time-consuming and error-prone manipulation at the bit level is moved to an ordinary PC, more realistic tasks can be solved, and the challenges of thinking about algorithms lead to a higher level of education.
The book contains a detailed treatment of thermodynamic formalism on general compact metrizable spaces. Topological pressure, topological entropy, the variational principle, and equilibrium states are presented in detail. Abstract ergodic theory also receives significant attention. Ergodic theorems, ergodicity, and Kolmogorov-Sinai metric entropy are fully explored. Furthermore, the book gives the reader a rigorous presentation of thermodynamic formalism for distance expanding maps and, in particular, subshifts of finite type over a finite alphabet. It also provides a fairly complete treatment of subshifts of finite type over a countable alphabet. Transfer operators, Gibbs states, and equilibrium states are, in this context, introduced and dealt with, and their relations are explored. All of this is applied to fractal geometry centered around various versions of Bowen's formula in the context of expanding conformal repellors, limit sets of conformal iterated function systems, and conformal graph directed Markov systems. A unique introduction to iteration of rational functions is given, with emphasis on various phenomena caused by rationally indifferent periodic points. Also, a fairly full account of the classical theory of Shub's expanding endomorphisms is given, which has not previously appeared in book form in the English-language mathematical literature.
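For a subshift of finite type, the topological entropy equals the logarithm of the spectral radius of its 0-1 transition matrix, a standard fact related to the material above. A minimal sketch (illustrative, not the book's code) estimating the entropy of the golden-mean shift, whose transition matrix is [[1, 1], [1, 0]], by power iteration:

```python
import math

def entropy_sft(A, iters=200):
    """Topological entropy of a subshift of finite type: log of the
    spectral radius of the transition matrix, via power iteration."""
    n = len(A)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)   # sup-norm normalization
        v = [x / lam for x in w]
    return math.log(lam)

# Golden-mean shift (forbidden word "11"): spectral radius is the golden ratio,
# so the entropy is log((1 + sqrt(5)) / 2).
h = entropy_sft([[1, 1], [1, 0]])
```

The second eigenvalue of this matrix is smaller in modulus, so the iteration converges geometrically and 200 steps are far more than enough.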
This book (hardcover) is part of the TREDITION CLASSICS series, which contains works of classical literature from the past two thousand years. Most of these titles have been out of print and off the bookstore shelves for decades. The book series is intended to preserve this cultural legacy and to promote timeless works of classical literature. Readers of a TREDITION CLASSICS book support the mission to save many amazing works of world literature from oblivion. With this series, tredition intends to make thousands of international literature classics available in printed format again, worldwide.
Science and engineering students depend heavily on concepts of mathematical modeling. In an age where almost everything is done on a computer, author Clive Dym believes that students need to understand and "own" the underlying mathematics that computers are doing on their behalf. His goal for Principles of Mathematical Modeling, Second Edition, is to engage the student reader in developing a foundational understanding of the subject that will serve them well into their careers.
Fuzzy set and logic theory suggests that all natural-language linguistic expressions are imprecise and must be assessed as a matter of degree. In general, however, membership degree is itself an imprecise notion, which requires that Type 2 membership degrees be considered in most applications related to human decision-making schemas. Even if the membership functions are restricted to Type 1, their combinations generate an interval-valued Type 2 membership. This is part of a general result: classical equivalences break down in fuzzy theory. Thus all classical formulas must be reassessed, with an upper and a lower expression generated by this breakdown.
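The breakdown of classical equivalences is easy to see concretely. Under the standard (min, max, 1-x) fuzzy operators the law of the excluded middle fails, and a formula and its classically equivalent normal form can take different values, spanning exactly the kind of interval described above. A minimal sketch (illustrative values, not from the book):

```python
# Standard (min, max, 1-x) fuzzy operators.
AND = min
OR = max
NOT = lambda a: 1 - a

a, b = 0.3, 0.4

# Law of the excluded middle fails: a OR (NOT a) is 0.7, not 1.0.
lem = OR(a, NOT(a))

# Classically, "a AND b" equals its conjunctive normal form
# (a OR b) AND (a OR NOT b) AND (NOT a OR b).  In fuzzy logic the two
# sides differ, yielding a lower and an upper bound, i.e. an interval.
dnf = AND(a, b)                                          # 0.3 (lower bound)
cnf = AND(AND(OR(a, b), OR(a, NOT(b))), OR(NOT(a), b))   # 0.4 (upper bound)
```

Here the single Type 1 value 0.3 is replaced by the interval [0.3, 0.4], which is how interval-valued Type 2 membership arises from Type 1 inputs.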
Gottlob Frege's Grundgesetze der Arithmetik, or Basic Laws of Arithmetic, was intended to be his magnum opus, the book in which he would finally establish his logicist philosophy of arithmetic. But because of the disaster of Russell's Paradox, which undermined Frege's proofs, the more mathematical parts of the book have rarely been read. Richard G. Heck, Jr., aims to change that, and establish it as a neglected masterpiece that must be placed at the center of Frege's philosophy. Part I of Reading Frege's Grundgesetze develops an interpretation of the philosophy of logic that informs Grundgesetze, paying especially close attention to the difficult sections of Frege's book in which he discusses his notorious 'Basic Law V' and attempts to secure its status as a law of logic. Part II examines the mathematical basis of Frege's logicism, explaining and exploring Frege's formal arguments. Heck argues that Frege himself knew that his proofs could be reconstructed so as to avoid Russell's Paradox, and presents Frege's arguments in a way that makes them available to a wide audience. He shows, by example, that careful attention to the structure of Frege's arguments, to what he proved, to how he proved it, and even to what he tried to prove but could not, has much to teach us about Frege's philosophy.
This volume shares and makes accessible new research lines and recent results in several branches of theoretical and mathematical physics, among them Quantum Optics, Coherent States, Integrable Systems, SUSY Quantum Mechanics, and Mathematical Methods in Physics. In addition to a selection of the contributions presented at the "6th International Workshop on New Challenges in Quantum Mechanics: Integrability and Supersymmetry", held in Valladolid, Spain, 27-30 June 2017, several high-quality contributions from other authors are also included. The conference gathered 60 participants from many countries working in different fields of Theoretical Physics, and was dedicated to Prof. Veronique Hussin, an internationally recognized expert in many branches of Mathematical Physics who has been making remarkable contributions to this field since the 1980s. The reader will find interesting reviews on the main topics from internationally recognized experts in each field, as well as other original contributions, all of which deal with recent applications or discoveries in the aforementioned areas.
This book is considered the first extended survey on algorithms and techniques for efficient cohesive subgraph computation. With the rapid development of information technology, huge volumes of graph data are accumulated. The availability of rich graph data not only brings great opportunities for realizing the big value of data to serve key applications, but also brings great challenges in computation. Using a consistent terminology, the book gives an excellent introduction to the models and algorithms for the problem of cohesive subgraph computation. The material is well organized, from introductory content to more advanced topics, and well-designed source code is provided for most algorithms described in the book. This is a timely book for researchers who are interested in this topic and in efficient data structure design for large sparse graph processing. It is also a guide for new researchers getting to know the area of cohesive subgraph computation.
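The k-core, the simplest cohesive subgraph model treated in surveys of this kind, can be computed by repeatedly peeling off a minimum-degree vertex. A minimal sketch (illustrative only, not the book's code):

```python
def core_numbers(adj):
    """Core decomposition by peeling: repeatedly remove a minimum-degree
    vertex; its degree at removal time (capped below by the running
    maximum) is its core number.  adj maps vertex -> set of neighbours."""
    degree = {v: len(ns) for v, ns in adj.items()}
    alive = set(adj)
    core = {}
    k = 0
    while alive:
        v = min(alive, key=degree.get)   # vertex of minimum remaining degree
        k = max(k, degree[v])
        core[v] = k
        alive.remove(v)
        for u in adj[v]:
            if u in alive:
                degree[u] -= 1
    return core

# Triangle {a, b, c} with a pendant vertex d attached to a.
graph = {"a": {"b", "c", "d"}, "b": {"a", "c"},
         "c": {"a", "b"}, "d": {"a"}}
cores = core_numbers(graph)  # d lies in the 1-core only; a, b, c form the 2-core
```

Efficient implementations use bucket queues to make the peeling linear in the number of edges; the quadratic version above is only meant to make the model concrete.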
In order to establish criteria for monitoring and managing problems and for designing decision controls, the management of adaptive and complex systems requires a mathematical description of exact human knowledge. Decision Control, Management, and Support in Adaptive and Complex Systems: Quantitative Models presents an application and demonstration of a new mathematical technique for describing complex systems. This comprehensive collection contains scientific results in the field of contemporary approaches to adaptive decision making that are essential for researchers, scholars, and students alike.
The Keller-Segel model for chemotaxis is a prototype of nonlocal systems describing concentration phenomena in physics and biology. While the two-dimensional theory is by now quite complete, the questions of global-in-time solvability and blowup characterization are largely open in higher dimensions. In this book, global-in-time solutions are constructed under (nearly) optimal assumptions on initial data and rigorous blowup criteria are derived.
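For reference, one common form of the Keller-Segel system, the parabolic-elliptic simplification, is written below; here $u$ is the cell density and $v$ the chemoattractant concentration (this notation is a standard convention, assumed rather than quoted from the book):

```latex
\begin{aligned}
\partial_t u &= \Delta u - \nabla \cdot (u \nabla v), \\
0 &= \Delta v + u, \\
u(x,0) &= u_0(x) \ge 0 .
\end{aligned}
```

The nonlocal character comes from the second equation: $v$ is determined from $u$ by solving a Poisson equation, so the drift term couples every point of the domain to every other.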
These are the proceedings of the IUTAM Symposium on Exploiting Nonlinear Dynamics for Engineering Systems, held in Novi Sad, Serbia, from July 15th to 19th, 2018. The appearance of nonlinear phenomena used to be perceived as dangerous, with a general tendency to avoid or control them. This perception led to intensive research using various approaches and tailor-made tools developed over decades. The Nonlinear Dynamics of today, however, is experiencing a profound paradigm shift, since recent investigations rely on a different strategy that brings the beneficial effects of nonlinear phenomena to the forefront. This strategy has a positive impact on different fields in science and engineering, such as vibration isolation, energy harvesting, and micro/nano-electro-mechanical systems. The ENOLIDES Symposium was therefore devoted to demonstrating the benefits, and unlocking the potential, of exploiting nonlinear dynamical behaviour in these and other emerging fields of science and engineering. These proceedings are useful for researchers in the nonlinear dynamics of mechanical systems and structures, and in Mechanical and Civil Engineering.
This updated revision gives a complete and topical overview of Nonconservative Stability, which is essential for many areas of science and technology, ranging from particle trapping in optical tweezers and the dynamics of subcellular structures to dissipative and radiative instabilities in fluid mechanics, astrophysics, and celestial mechanics. The author presents relevant mathematical concepts as well as rigorous stability results and numerous classical and contemporary examples from non-conservative mechanics and non-Hermitian physics. New coverage includes ponderomotive magnetism, the experimental detection of Ziegler's destabilization phenomenon, and the theory of double-diffusive instabilities in magnetohydrodynamics.
New Edition of a Classic Guide to Statistical Applications in the Biomedical Sciences

In the last decade, there have been significant changes in the way statistics is incorporated into biostatistical, medical, and public health research. Addressing the need for a modernized treatment of these statistical applications, Basic Statistics, Fourth Edition presents relevant, up-to-date coverage of research methodology using careful explanations of basic statistics and how they are used to address practical problems that arise in the medical and public health settings. Through concise and easy-to-follow presentations, readers will learn to interpret and examine data by applying common statistical tools, such as sampling, random assignment, and survival analysis. Continuing the tradition of its predecessor, this new edition outlines a thorough discussion of different kinds of studies and guides readers through the important, related decision-making processes, such as determining what information is needed and planning the collection process. The book equips readers with the knowledge to carry out these practices by explaining the various types of studies that are commonly conducted in the fields of medicine and public health, and how the level of evidence varies depending on the area of research. Data screening and data entry into statistical programs are explained and accompanied by illustrations of statistical analyses and graphs.
Additional features of the Fourth Edition include:
- A new chapter on data collection that outlines the initial steps in planning biomedical and public health studies
- A new chapter on nonparametric statistics that includes a discussion and application of the Sign test, the Wilcoxon Signed Rank test, and the Wilcoxon Rank Sum test and its relationship to the Mann-Whitney U test
- An updated introduction to survival analysis that includes the Kaplan-Meier method for graphing the survival function and a brief introduction to tests for comparing survival functions
- Incorporation of modern statistical software, such as SAS, Stata, SPSS, and Minitab, into the presented discussion of data analysis
- Updated references at the end of each chapter

Basic Statistics, Fourth Edition is an ideal book for courses on biostatistics, medicine, and public health at the upper-undergraduate and graduate levels. It is also appropriate as a reference for researchers and practitioners who would like to refresh their fundamental understanding of statistical techniques.
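The Sign test mentioned among the nonparametric topics reduces paired data to the count of positive differences and compares it against a Binomial(n, 1/2) null distribution. A minimal standard-library sketch (illustrative only, not the book's code); ties are discarded, as is conventional:

```python
from math import comb

def sign_test(before, after):
    """Two-sided sign test for paired samples.

    Drops tied pairs, counts positive differences, and returns the
    two-sided p-value from the Binomial(n, 1/2) null distribution."""
    diffs = [b - a for a, b in zip(before, after) if b != a]
    n = len(diffs)
    pos = sum(d > 0 for d in diffs)
    k = min(pos, n - pos)
    # Two-sided p-value: twice the smaller tail, capped at 1.
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Hypothetical paired measurements in which every subject improved:
p = sign_test([5, 6, 4, 7, 6, 5, 8, 6], [7, 8, 6, 9, 8, 7, 9, 8])
```

With all eight differences positive, the p-value is 2 * (1/2)^8 = 0.0078, matching the exact binomial calculation a textbook would give.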
Starting with the basic linear model, where the design and covariance matrices are of full rank, this book demonstrates how the same statistical ideas can be used to explore the more general linear model with rank-deficient design and/or covariance matrices. The unified treatment presented here provides a clearer understanding of the general linear model from a statistical perspective, thus avoiding the complex matrix-algebraic arguments that are often used in the rank-deficient case. Elegant geometric arguments are used as needed. The book has very broad coverage, from illustrative practical examples in Regression and Analysis of Variance, alongside their implementation using R, to a comprehensive theory of the general linear model, with 181 worked-out examples, 227 exercises with solutions, 152 exercises without solutions (so that they may be used as assignments in a course), and 320 up-to-date references. This is a completely updated new edition of Linear Models: An Integrated Approach.
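In the full-rank case that opens the book, the least-squares estimate has the closed form (X'X)^(-1)X'y; for simple linear regression this reduces to the familiar covariance formulas. A minimal Python sketch (the book's examples use R; this translation and the toy data are illustrative only):

```python
def simple_ols(x, y):
    """Least-squares slope and intercept for y = b0 + b1*x (full-rank case)."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sxy / sxx                # slope: S_xy / S_xx
    b0 = ybar - b1 * xbar         # intercept: line passes through the means
    return b0, b1

# Noiseless toy data on the line y = 1 + 2x, so OLS recovers it exactly.
b0, b1 = simple_ols([0, 1, 2, 3], [1, 3, 5, 7])
```

When the design matrix is rank-deficient, S_xx in the analogous matrix computation is singular and generalized inverses are needed, which is exactly the territory the book's unified treatment covers.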
The main purpose of this book is not only to present recent studies and advances in the field of social science research, but also to stimulate discussion on related practical issues concerning statistics, mathematics, and economics. Accordingly, a broad range of tools and techniques that can be used to solve problems on these topics are presented in detail in this book, which offers an ideal reference work for all researchers interested in effective quantitative and qualitative tools. The content is divided into three major sections. The first, which is titled "Social work", collects papers on problems related to the social sciences, e.g. social cohesion, health, and digital technologies. Papers in the second part, "Education and teaching issues," address qualitative aspects, education, learning, violence, diversity, disability, and ageing, while the book's final part, "Recent trends in qualitative and quantitative models for socio-economic systems and social work", features contributions on both qualitative and quantitative issues. The book is based on a scientific collaboration, in the social sciences, mathematics, statistics, and economics, among experts from the "Pablo de Olavide" University of Seville (Spain), the "University of Defence" of Brno (Czech Republic), the "G. D'Annunzio" University of Chieti-Pescara (Italy) and "Alexandru Ioan Cuza University" of Iasi (Romania). The contributions, which have been selected using a peer-review process, examine a wide variety of topics related to the social sciences in general, while also highlighting new and intriguing empirical research conducted in various countries. Given its scope, the book will appeal, in equal measure, to sociologists, mathematicians, statisticians and philosophers, and more generally to scholars and specialists in related fields.
This book addresses the experimental calibration of best-estimate numerical simulation models. The results of measurements and computations are never exact. Therefore, knowing only the nominal values of experimentally measured or computed quantities is insufficient for applications, particularly since the respective experimental and computed nominal values seldom coincide. In the author's view, the objective of predictive modeling is to extract "best estimate" values for model parameters and predicted results, together with "best estimate" uncertainties for these parameters and results. To achieve this goal, predictive modeling combines imprecisely known experimental and computational data, which calls for reasoning on the basis of incomplete, error-rich, and occasionally discrepant information. The customary methods used for data assimilation combine experimental and computational information by minimizing an a priori, user-chosen, "cost functional" (usually a quadratic functional that represents the weighted errors between measured and computed responses). In contrast to these user-influenced methods, the BERRU (Best Estimate Results with Reduced Uncertainties) Predictive Modeling methodology developed by the author relies on the thermodynamics-based maximum entropy principle to eliminate the need for relying on minimizing user-chosen functionals, thus generalizing the "data adjustment" and/or the "4D-VAR" data assimilation procedures used in the geophysical sciences. The BERRU predictive modeling methodology also provides a "model validation metric" which quantifies the consistency (agreement/disagreement) between measurements and computations. This "model validation metric" (or "consistency indicator") is constructed from parameter covariance matrices, response covariance matrices (measured and computed), and response sensitivities to model parameters. 
Traditional methods for computing response sensitivities are hampered by the "curse of dimensionality," which makes them impractical for applications to large-scale systems that involve many imprecisely known parameters. Reducing the computational effort required for precisely calculating the response sensitivities is paramount, and the comprehensive adjoint sensitivity analysis methodology developed by the author shows great promise in this regard, as shown in this book. After discarding inconsistent data (if any) using the consistency indicator, the BERRU predictive modeling methodology provides best-estimate values for predicted parameters and responses along with best-estimate reduced uncertainties (i.e., smaller predicted standard deviations) for the predicted quantities. Applying the BERRU methodology yields optimal, experimentally validated, "best estimate" predictive modeling tools for designing new technologies and facilities, while also improving on existing ones.
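The flavor of "best estimate with reduced uncertainty" can be seen already in the elementary inverse-variance combination of two independent estimates of the same quantity. The sketch below is this textbook special case only; it is illustrative and is not the BERRU methodology itself, which handles full covariance matrices and sensitivities:

```python
import math

def combine(x1, s1, x2, s2):
    """Inverse-variance weighted best estimate of two independent
    measurements x1 +/- s1 and x2 +/- s2 of the same quantity.
    The combined standard deviation is smaller than either input."""
    w1, w2 = 1 / s1 ** 2, 1 / s2 ** 2
    best = (w1 * x1 + w2 * x2) / (w1 + w2)
    sigma = math.sqrt(1 / (w1 + w2))
    return best, sigma

# A measured value 10.0 +/- 1.0 combined with a computed value 12.0 +/- 1.0
# gives the midpoint with a reduced standard deviation of 1/sqrt(2).
best, sigma = combine(10.0, 1.0, 12.0, 1.0)
```

Note that the two nominal values here disagree by two combined standard deviations; a consistency indicator of the kind described above is precisely what flags whether such a combination is statistically justified.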
This monograph is the first introduction to the theory of bitopological spaces and its applications. In particular, different families of subsets of bitopological spaces are introduced, and various relations between two topologies on one and the same set are analyzed; the theory of dimension of bitopological spaces and the theory of Baire bitopological spaces are constructed, and various classes of mappings of bitopological spaces are studied. Previously known results, as well as the results obtained in this monograph, are applied in analysis, potential theory, general topology, and the theory of ordered topological spaces. Moreover, the current state of knowledge of bitopological space theory has made it possible to introduce and study an algebra of a new type, whose representation leads to a special class of bitopological spaces.
Advanced Topics in Linear Algebra presents, in an engaging style, novel topics linked through the Weyr matrix canonical form, a largely unknown cousin of the Jordan canonical form discovered by Eduard Weyr in 1885. The book also develops much linear algebra unconnected to canonical forms that has not previously appeared in book form. It presents common applications of the Weyr form, including matrix commutativity problems, approximate simultaneous diagonalization, and algebraic geometry, with the latter two having topical connections to phylogenetic invariants in biomathematics and multivariate interpolation. The Weyr form clearly outperforms the Jordan form in many situations, particularly where two or more commuting matrices are involved, due to the block upper triangular form a Weyr matrix forces on any commuting matrix. In this book, the authors develop the Weyr form from scratch and include an algorithm for computing it. The Weyr form is also derived ring-theoretically, in an entirely different way from the classical derivation of the Jordan form. A fascinating duality exists between the two forms that allows one to flip back and forth and exploit the combined powers of each. The book weaves together ideas from various mathematical disciplines, demonstrating dramatically the variety and unity of mathematics. Though the book's main focus is linear algebra, it also draws upon ideas from commutative and noncommutative ring theory, module theory, field theory, topology, and algebraic geometry. Advanced Topics in Linear Algebra offers self-contained accounts of the non-trivial results used from outside linear algebra, and lots of worked examples, thereby making it accessible to graduate students. Indeed, the scope of the book makes it an appealing graduate text, either as a reference or for an appropriately designed one- or two-semester course. A number of the authors' previously unpublished results appear as well.
Classical Mechanics teaches readers how to solve physics problems; in other words, how to put math and physics together to obtain a numerical or algebraic result and then interpret these results physically. These skills are important and will be needed in more advanced science and engineering courses. However, more important than developing problem-solving skills and physical-interpretation skills, the main purpose of this multi-volume series is to survey the basic concepts of classical mechanics and to provide the reader with a solid understanding of the foundational content knowledge of classical mechanics. Classical Mechanics: Conservation laws and rotational motion covers the conservation of energy and the conservation of momentum, which are crucial concepts in any physics course. It also introduces the concepts of center-of-mass and rotational motion.
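As one standard instance of the conservation laws this volume covers, requiring that both momentum and kinetic energy be conserved in a one-dimensional elastic collision determines the outgoing velocities completely. A minimal sketch (the numbers are illustrative, not from the book):

```python
def elastic_collision_1d(m1, v1, m2, v2):
    """Outgoing velocities of a 1-D elastic collision, obtained by solving
    conservation of momentum and conservation of kinetic energy jointly."""
    v1f = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    v2f = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return v1f, v2f

# Equal masses simply exchange velocities.
v1f, v2f = elastic_collision_1d(1.0, 2.0, 1.0, 0.0)   # -> (0.0, 2.0)
```

Checking that m1*v1 + m2*v2 and the total kinetic energy are unchanged before and after is exactly the kind of physical interpretation of an algebraic result the series emphasizes.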
You may like...
- Statistics For Business And Economics, by David Anderson, James Cochran, … (Paperback)
- Ranked Set Sampling - 65 Years Improving…, by Carlos N. Bouza-Herrera, Amer Ibrahim Falah Al-Omari (Paperback)
- Teaching Statistics - A Bag of Tricks, by Andrew Gelman, Deborah Nolan (Hardcover), R3,117
- What If There Were No Significance…, by Lisa L. Harlow, Stanley A. Mulaik, … (Hardcover), R4,529
- Advances in Personality Assessment…, by C.D. Spielberger, J N Butcher, … (Hardcover), R1,978
- African Safari - Into The Great Game…, by Peter & Beverly Pickford (Hardcover)