Welcome to Loot.co.za!
Identifying the sources and measuring the impact of haphazard variations are important in any number of research applications, from clinical trials and genetics to industrial design and psychometric testing. Only in very simple situations can such variations be represented effectively by independent, identically distributed random variables or by random sampling from a hypothetical infinite population.
This text gathers, revises and explains the newly developed Adomian decomposition method along with its modification and some traditional techniques.
This remarkable text by John R. Taylor has been a non-stop best-selling international hit since it was first published forty years ago. However, the two-plus decades since the second edition was released have seen two dramatic developments: the huge rise in popularity of Bayesian statistics, and the continued increase in the power and availability of computers and calculators. In response to the former, Taylor has added a full chapter dedicated to Bayesian thinking, introducing conditional probabilities and Bayes’ theorem. The examples presented in the new third edition are intentionally very simple, designed to give readers a clear understanding of what Bayesian statistics is all about as their first step on a journey to becoming practicing Bayesians. In response to the second development, Taylor has added a number of chapter-ending problems that will encourage readers to learn how to solve problems using computers. While many of these can be solved using programs such as Matlab or Mathematica, almost all of them can be tackled with commonly available spreadsheet programs like Microsoft Excel. These programs provide a convenient way to record and process data and to calculate quantities like standard deviations, correlation coefficients, and normal distributions; they also have the wonderful ability, if students construct their own spreadsheets and avoid the temptation to use built-in functions, to teach the meaning of these concepts.
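The spreadsheet exercise described above can be sketched in a few lines. This is a hypothetical illustration with made-up data, not an exercise from the book: a sample standard deviation built up column by column, the way a student might lay it out in a spreadsheet, rather than by calling a built-in function.

```python
# A hypothetical illustration (made-up data, not from the book):
# a sample standard deviation computed column-by-column, the way a
# student might lay it out in a spreadsheet, instead of calling a
# built-in function.
measurements = [9.8, 10.1, 10.0, 9.7, 10.4]    # five repeated measurements

n = len(measurements)
mean = sum(measurements) / n                   # column 1: the mean
deviations = [x - mean for x in measurements]  # column 2: x - mean
squares = [d ** 2 for d in deviations]         # column 3: (x - mean)^2
variance = sum(squares) / (n - 1)              # sample variance, n - 1 denominator
std_dev = variance ** 0.5

print(mean, std_dev)
```

A built-in such as Excel's STDEV.S returns the same number, but hides the three-column reasoning that the hand-built version makes explicit.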
Features:
- Introduces, defines, and illustrates the concept of "dynamic consistency" as the foundation of modeling
- Can be used as the basis of an upper-level undergraduate course on general procedures for mathematical modeling using differential equations
- Discusses dimensional analysis and continually demonstrates its value for both the construction and analysis of mathematical models
The Ninth Edition of Social Statistics for a Diverse Society continues to emphasize intuition and common sense, while demonstrating the link between the practice of statistics and important social issues. Recognizing that we live in a world characterized by a growing diversity and richness of social differences, best-selling authors Frankfort-Nachmias, Leon-Guerrero, and Davis help you learn key statistical concepts through real research examples related to the dynamic interplay of race, class, gender, and other social variables. The text also helps you develop important skills such as problem-solving (through a rich variety of exercises), use of statistical software (both SPSS and Excel), and interpreting research literature.
Examines classic algorithms, geometric diagrams, and mechanical principles for enhanced visualization of statistical estimation procedures and mathematical concepts in physics, engineering, and computer programming.
"Configural Frequency Analysis" (CFA) provides an up-to-the-minute, comprehensive introduction to its techniques, models, and applications. Written in a formal yet accessible style, it uses actual empirical data examples to illustrate key concepts, and step-by-step program sequences show readers how to employ CFA methods using commercial software packages such as SAS, SPSS, SYSTAT, S-Plus, or programs written specifically to perform CFA.
The crypto wars have raged for half a century. In the 1970s, digital privacy activists prophesied the emergence of an Orwellian State, made possible by computer-mediated mass surveillance. The antidote: digital encryption. The U.S. government warned that encryption would prevent surveillance not only of law-abiding citizens, but also of criminals, terrorists, and foreign spies, ushering in a rival dystopian future. Both parties fought to defend the citizenry from what they believed were the most perilous threats. The government tried to control encryption to preserve its surveillance capabilities; privacy activists armed citizens with cryptographic tools and challenged encryption regulations in the courts. No clear victor has emerged from the crypto wars. Governments have failed to forge a framework to govern the at-times-conflicting civil liberties of privacy and security in the digital age, an age when such liberties have an outsized influence on the citizen-State power balance. Solving this problem is more urgent than ever. Digital privacy will be one of the most important factors in how we architect twenty-first-century societies; its management is paramount to our stewardship of democracy for future generations. We must elevate the quality of debate on cryptography and on how we govern security and privacy in our technology-infused world. Failure to end the crypto wars will result in societies sleepwalking into a future where the citizen-State power balance is determined by a twentieth-century status quo unfit for this century, endangering both our privacy and security. This book provides a history of the crypto wars, in the hope that its chronicling sets a foundation for peace.
An exploration of the key issues in the teaching of mathematics, a key subject in its own right and one that forms an important part of many other disciplines. The volume includes contributions from a wide range of experts in the field and has a broad, international perspective. It is part of a series on effective learning and teaching in higher education. Each volume in the series contains advice, guidance, and expert opinion on teaching in the key subjects in higher education today, and is backed up by the authority of the Institute for Learning and Teaching.
Making Sense of Statistics, Eighth Edition, is the ideal introduction to the concepts of descriptive and inferential statistics for students undertaking their first research project. It presents each statistical concept in a series of short steps, then uses worked examples and exercises to enable students to apply their own learning. It focuses on presenting the "why" as well as the "how" of statistical concepts, rather than computations and formulas, and is therefore suitable for students from all disciplines regardless of mathematical background. Only statistical techniques that are almost universally included in introductory statistics courses, and widely reported in journals, have been included. This conceptual book is useful at all study levels, from undergraduate to doctoral, across disciplines. Once students understand and feel comfortable with the statistics presented here, they should find it easy to master additional statistical concepts.
New to the Eighth Edition:
- Chapters reorganized to allow a better progression of conceptual understanding
- Additional discussions of program evaluation, display of outcomes, and examples
- Clear learning objectives listed at the beginning of each chapter
- Expanded appendices, including a reference to common computational formulas and examples
- An updated glossary of key terms that functions as a useful vocabulary list for a first course in statistics
- Updated online resources, including a basic math review and answers, PowerPoint slides, and a test bank of questions
The downloadable Support Material can be accessed at: www.routledge.com/9781032289649
In the 1960s divorce was increasing around the world, and marriage conciliation services were a necessary development for those who wanted to seek help with their problems. Originally published in 1968, this title aimed to give some account of the widely differing types of marital conciliation service operating in Britain, and in some other parts of the world, at the time. The author, who was based at the National Marriage Guidance Council of Great Britain, first outlines the British services, then presents comparative studies of the services overseas in Australia, New Zealand, Scandinavia and Finland, and the United States and Canada. Today it can be read and enjoyed in its historical context.
Few Americans escape the experience of divorce, either first-hand or through the dissolution of the marriages of friends or relatives. According to the author, mediation offers a good alternative to the strictly adversarial divorce process that was so prevalent before such programs began to emerge. Originally published in 1991, this book was unique at the time in that it not only explores the role of communication in divorce mediation but also presents original research to support its claims. Built on a series of empirical studies, it points readers to a more focused set of recommendations about communication than the typical practitioner's "how-to" books. A simulation exercise is also included, so that readers can apply the concepts described and see the results. The main goal of this text is to provide mediators with a language for understanding their own and their disputants' communication patterns, strategies, and tactics, a gap in most other books on this topic when it was first published.
The methodological needs of environmental studies are unique in the breadth of research questions that can be posed, calling for a textbook that covers a broad swath of approaches to conducting research with potentially many different kinds of evidence. Fully updated to address new developments such as the effects of the internet, recent trends in the use of computers, remote sensing, and large data sets, this new edition of Research Methods for Environmental Studies is written specifically for social science-based research into the environment. This revised edition contains new chapters on coding, focus groups, and an extended treatment of hypothesis testing. The textbook covers the best-practice research methods most used to study the environment and its connections to societal and economic activities and objectives. Over five key parts, Kanazawa introduces quantitative and qualitative approaches, mixed methods, and the special requirements of interdisciplinary research, emphasizing that methodological practice should be tailored to the specific needs of the project. Within these parts, detailed coverage is provided on key topics including the identification of a research project, hypothesis testing, spatial analysis, the case study method, ethnographic approaches, discourse analysis, mixed methods, survey and interview techniques, focus groups, and ethical issues in environmental research. Drawing on a variety of extended and updated examples to encourage problem-based learning and fully addressing the challenges associated with interdisciplinary investigation, this book will be an essential resource for students embarking on courses exploring research methods in environmental studies.
Your Essential Guide to Quantitative Hedge Fund Investing provides a conceptual framework for understanding effective hedge fund investment strategies. The book offers a mathematically rigorous exploration of different topics, framed in an easy-to-digest set of examples and analogies, including stories from some legendary hedge fund investors. Readers will be guided from the historical to the cutting edge, while building a framework of understanding that encompasses it all.
Features:
- Filled with novel examples and analogies from within and beyond the world of finance
- Suitable for practitioners and graduate-level students with a passion for understanding the complexities that lie behind the raw mechanics of quantitative hedge fund investment
- A unique insight from an author with experience of both the practical and academic spheres
This book aims to develop models and modeling techniques that are useful when applied to all complex systems. It adopts both analytic tools and computer simulation. The book is intended for students and researchers with a variety of backgrounds.
Highly recommended by the Journal of Official Statistics, The American Statistician, and other journals, Applied Survey Data Analysis, Second Edition provides an up-to-date overview of state-of-the-art approaches to the analysis of complex sample survey data. Building on the wealth of material on practical approaches to descriptive analysis and regression modeling from the first edition, this second edition expands the topics covered and presents more step-by-step examples of modern approaches to the analysis of survey data using the newest statistical software. Designed for readers working in a wide array of disciplines who use survey data in their work, this book continues to provide a useful framework for integrating more in-depth studies of the theory and methods of survey data analysis. An example-driven guide to the applied statistical analysis and interpretation of survey data, the second edition contains many new examples and practical exercises based on recent versions of real-world survey data sets. Although the authors continue to use Stata for most examples in the text, they also continue to offer SAS, SPSS, SUDAAN, R, WesVar, IVEware, and Mplus software code for replicating the examples on the book's updated website.
Analysis of Failure and Survival Data is an essential textbook for graduate-level students of survival analysis and reliability and a valuable reference for practitioners. It focuses on the many techniques that appear in popular software packages, including plotting product-limit survival curves, hazard plots, and probability plots in the context of censored data. The author integrates S-Plus and Minitab output throughout the text, along with a variety of real data sets so readers can see how the theory and methods are applied. He also incorporates exercises in each chapter that provide valuable problem-solving experience.
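The product-limit survival curves mentioned above can be computed in a few lines. This is a hypothetical sketch with made-up data, not the author's code or an example from the book:

```python
# A hypothetical sketch (made-up data, not the author's code): the
# product-limit (Kaplan-Meier) estimate of a survival curve from
# censored data.  Each observation is (time, event): event=1 is an
# observed failure, event=0 is a right-censored observation.
def product_limit(observations):
    """Return [(t, S(t))] at each distinct failure time."""
    surv, curve = 1.0, []
    for t in sorted({time for time, _ in observations}):
        deaths = sum(1 for time, e in observations if time == t and e == 1)
        at_risk = sum(1 for time, _ in observations if time >= t)
        if deaths:
            surv *= 1 - deaths / at_risk   # product-limit step
            curve.append((t, surv))
    return curve

data = [(2, 1), (3, 0), (4, 1), (4, 1), (7, 0), (9, 1)]   # made-up data
curve = product_limit(data)
print(curve)
```

Censored observations (the 3 and 7 above) never produce a step in the curve, but they do shrink the at-risk count for later failure times, which is exactly what distinguishes this estimate from a naive empirical survival fraction.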
Chaos surrounds us. Seemingly random events -- the flapping of a flag, a storm-driven wave striking the shore, a pinball's path -- often appear to have no order, no rational pattern. Explicating the theory of chaos and the consequences of its principal findings -- that actual, precise rules may govern such apparently random behavior -- has been a major part of the work of Edward N. Lorenz. In "The Essence of Chaos," Lorenz presents to the general reader the features of this "new science," with its far-reaching implications for much of modern life, from weather prediction to philosophy, and he describes its considerable impact on emerging scientific fields. Unlike the phenomena dealt with in relativity theory and quantum mechanics, systems that are now described as "chaotic" can be observed without telescopes or microscopes. They range from the simplest happenings, such as the falling of a leaf, to the most complex processes, like the fluctuations of climate. Each process that qualifies, however, has certain quantifiable characteristics: how it unfolds depends very sensitively upon its present state, so that, even though it is not random, it seems to be. Lorenz uses examples from everyday life, and simple calculations, to show how the essential nature of chaotic systems can be understood. In order to expedite this task, he has constructed a mathematical model of a board sliding down a ski slope as his primary illustrative example. With this model as his base, he explains various chaotic phenomena, including some associated concepts such as strange attractors and bifurcations. As a meteorologist, Lorenz initially became interested in the field of chaos because of its implications for weather forecasting. 
In a chapter ranging through the history of weather prediction and meteorology to a brief picture of our current understanding of climate, he introduces many of the researchers who conceived the experiments and theories, and he describes his own initial encounter with chaos. A further chapter invites readers to make their own chaos. Still other chapters debate the nature of randomness and its relationship to chaotic systems, and describe three related fields of scientific thought: nonlinearity, complexity, and fractality. Appendixes present the first publication of Lorenz's seminal paper "Does the Flap of a Butterfly's Wing in Brazil Set Off a Tornado in Texas?"; the mathematical equations from which the copious illustrations were derived; and a glossary.
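The sensitive dependence on initial conditions that Lorenz describes is easy to demonstrate numerically. This is a hypothetical illustration using the logistic map, not Lorenz's ski-slope model: two trajectories started from nearly identical points stay close at first, then diverge completely.

```python
# A hypothetical illustration (not Lorenz's ski-slope model): the logistic
# map x -> r*x*(1-x) with r = 4 is chaotic, so two trajectories started
# from nearly identical points eventually diverge completely.
def trajectory(x, steps=40, r=4.0):
    out = [x]
    for _ in range(steps):
        x = r * x * (1 - x)   # one step of the map
        out.append(x)
    return out

a = trajectory(0.20000000)
b = trajectory(0.20000001)   # initial states differ by only 1e-8
gap = max(abs(x - y) for x, y in zip(a, b))
print(gap)                   # the gap grows to order 1 within 40 steps
```

Even though every step of the map is a simple, fully deterministic rule, the two orbits become effectively uncorrelated, which is precisely the "not random, but seems to be" behavior the book describes.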
This brilliantly illustrated tale of reason, insanity, love and truth recounts the story of Bertrand Russell's life. Raised by his paternal grandparents, young Russell was never told the whereabouts of his parents. Driven by a desire for knowledge of his own history, he attempted to force the world to yield to his yearnings: for truth, clarity and resolve. As he grew older, and increasingly sophisticated as a philosopher and mathematician, Russell strove to create an objective language with which to describe the world - one free of the biases and slippages of the written word. At the same time, he began courting his first wife, teasing her with riddles and leaning on her during the darker days, when his quest was bogged down by paradoxes, frustrations and the ghosts of his family's secrets. Ultimately, he found considerable success - but his career was stalled when he was outmatched by an intellectual rival: his young, strident, brilliantly original student, Ludwig Wittgenstein. An insightful and complexly layered narrative, Logicomix reveals both Russell's inner struggle and the quest for the foundations of logic. Narration by an older, wiser Russell, as well as asides from the author himself, make sense of the story's heady and powerful ideas. At its heart, Logicomix is a story about the conflict between pure reason and the persistent flaws of reality, a narrative populated by great and august thinkers, young lovers, ghosts and insanity.
This book presents new research in probability theory using ideas from mathematical logic. It is a general study of stochastic processes on adapted probability spaces, employing the concept of similarity of stochastic processes based on the notion of adapted distribution. The authors use ideas from model theory and methods from nonstandard analysis. The construction of spaces with certain richness properties, defined by insights from model theory, becomes easy using nonstandard methods, but remains difficult or impossible without them.
Congruences are ubiquitous in computer science, engineering, mathematics, and related areas. Developing techniques for finding (the number of) solutions of congruences is an important problem. But there are many scenarios in which we are interested in only a subset of the solutions; in other words, there are some restrictions. What do we know about these restricted congruences, their solutions, and applications? This book introduces the tools that are needed when working on restricted congruences and then systematically studies a variety of restricted congruences. Restricted Congruences in Computing defines several types of restricted congruence, obtains explicit formulae for the number of their solutions using a wide range of tools and techniques, and discusses their applications in cryptography, information security, information theory, coding theory, string theory, quantum field theory, parallel computing, artificial intelligence, computational biology, discrete mathematics, number theory, and more. This is the first book devoted to restricted congruences and their applications. It will be of interest to graduate students and researchers across computer science, electrical engineering, and mathematics.
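As a toy illustration of a restricted congruence (not taken from the book): consider x1 + x2 ≡ b (mod n) with the restriction that each xi be coprime to n. A brute-force count of its solutions:

```python
from math import gcd

# A toy illustration (not from the book): count solutions of
#   x1 + x2 = b  (mod n)   with the restriction gcd(xi, n) = 1,
# a classic example of a restricted congruence, by brute force.
def count_restricted(n, b):
    return sum(
        1
        for x1 in range(n)
        for x2 in range(n)
        if gcd(x1, n) == 1 and gcd(x2, n) == 1 and (x1 + x2) % n == b % n
    )

print(count_restricted(10, 4))   # 3 ordered pairs: (1,3), (3,1), (7,7)
print(count_restricted(10, 3))   # 0: two units mod 10 are both odd, so their sum is even
```

Without the coprimality restriction, every b would have exactly n solutions; with it, the count varies with b, and obtaining closed-form formulae for such counts is the kind of problem the book studies.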
- Presents in-depth insights into the fundamentals of big data technologies involved in petroleum streams.
- Builds on earlier works of researchers and inventors, essential source material for students in this area of study.
- Discusses essential processes and methodologies in petroleum streams that will direct researchers toward a practical approach to the field.
- Sheds light on challenges and problems of individual streams and their inter-related issues, while asking the reader to innovate and ideate upon those issues.
- Offers an analysis of the financial aspects and a business perspective on the processes, helping the reader make constructive and practical decisions in the field.
Nearly a century before Mondrian made geometrical red, yellow, and blue lines famous, 19th-century mathematician Oliver Byrne employed the color scheme for his 1847 edition of Euclid's mathematical and geometric treatise Elements. Byrne's idea was to use color to make learning easier and "diffuse permanent knowledge." The result has been described as one of the oddest and most beautiful books of the 19th century. The facsimile of Byrne's vivid publication is now available in a beautiful new edition. A masterwork of art and science, it is as beautiful in the boldness of its red, yellow, and blue figures and diagrams as it is in the mathematical precision of its theories. In the simplicity of forms and colors, the pages anticipate the vigor of De Stijl and Bauhaus design. In making complex information at once accessible and aesthetically engaging, this work is a forerunner to the information graphics that today define much of our data consumption.
You may like...
Calculus, Metric Edition
James Stewart, Saleem Watson, …
Hardcover
Numbers, Hypotheses & Conclusions - A…
Colin Tredoux, Kevin Durrheim
Paperback
Calculus: Early Transcendentals, Metric…
James Stewart, Saleem Watson, …
Hardcover
Statistics for Management and Economics
Gerald Keller, Nicoleta Gaciu
Paperback