Welcome to Loot.co.za!
Showing 1 - 25 of 106 matches in All Departments
Unlike some other reproductions of classic texts: (1) We have not used OCR (Optical Character Recognition), as this leads to poor-quality books with introduced typos. (2) In books containing images such as portraits, maps, and sketches, we have endeavoured to preserve the quality of these images so that they accurately represent the original artefact. Although these old texts may occasionally contain imperfections, we feel they deserve to be made available for future generations to enjoy.
This Bayesian modeling book provides a self-contained entry to computational Bayesian statistics. Focusing on the most standard statistical models and backed up by real datasets and an all-inclusive R (CRAN) package called bayess, the book provides an operational methodology for conducting Bayesian inference, rather than focusing on its theoretical and philosophical justifications. Readers are empowered to participate in the real-life data analysis situations depicted here from the beginning. The stakes are high and the reader determines the outcome. Special attention is paid to the derivation of prior distributions in each case, and specific reference solutions are given for each of the models. Similarly, computational details are worked out to lead the reader towards an effective programming of the methods given in the book. In particular, all R codes are discussed with enough detail to make them readily understandable and expandable, and they work in conjunction with the bayess package. Bayesian Essentials with R can be used as a textbook at both undergraduate and graduate levels, as exemplified by courses given at Universite Paris Dauphine (France), University of Canterbury (New Zealand), and University of British Columbia (Canada). It is particularly useful for students in professional degree programs and for scientists who wish to analyze data the Bayesian way. The text will also enhance introductory courses on Bayesian statistics. Prerequisites for the book are an undergraduate background in probability and statistics, if not in Bayesian statistics. A strength of the text is its noteworthy emphasis on the role of models in statistical analysis. This is the new, fully revised edition of the book Bayesian Core: A Practical Approach to Computational Bayesian Statistics. Jean-Michel Marin is Professor of Statistics at Universite Montpellier 2, France, and Head of the Mathematics and Modelling research unit.
He has written over 40 papers on Bayesian methodology and computing, and has worked closely with population geneticists over the past ten years. Christian Robert is Professor of Statistics at Universite Paris-Dauphine, France. He has written over 150 papers on Bayesian statistics and computational methods and is the author or co-author of seven books on those topics, including The Bayesian Choice (Springer, 2001), winner of the ISBA DeGroot Prize in 2004. He is a Fellow of the Institute of Mathematical Statistics, the Royal Statistical Society, and the American Statistical Association. He has been co-editor of the Journal of the Royal Statistical Society, Series B, and has served on the editorial boards of the Journal of the American Statistical Association, the Annals of Statistics, Statistical Science, and Bayesian Analysis. He is also a recipient of an Erskine Fellowship from the University of Canterbury (NZ) in 2006 and a senior member of the Institut Universitaire de France (2010-2015).
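The operational methodology described above centres on deriving a posterior from a prior and the data. As a generic illustration of that workflow (this is not code from the book or from the bayess package, and the function names are hypothetical), a conjugate Beta-Binomial update can be sketched as:

```python
def beta_binomial_posterior(a_prior, b_prior, successes, failures):
    """Conjugate update: Beta(a, b) prior + Binomial data -> Beta posterior.

    With theta ~ Beta(a, b) and k successes out of n trials, the posterior
    is Beta(a + k, b + (n - k)); no numerical integration is needed.
    """
    return a_prior + successes, b_prior + failures

def beta_mean(a, b):
    # Posterior mean E[theta] = a / (a + b)
    return a / (a + b)

# Example: a flat Beta(1, 1) prior updated with 7 successes and 3 failures
a_post, b_post = beta_binomial_posterior(1, 1, 7, 3)
```

Conjugate families such as this are the standard reference solutions in introductory Bayesian treatments; non-conjugate models require the simulation-based methods (e.g. MCMC) that books like this one develop.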
Insects are the most ecologically important multicellular heterotrophs in terrestrial systems. They play critical roles in ecological food webs, remain devastating agricultural and medical pests, and represent the most diverse group of eukaryotes in terms of species numbers. Their dominant role among terrestrial heterotrophs arises from a number of key physiological traits, and in particular from the developmental and evolutionary plasticity of these traits. Ecological and Environmental Physiology of Insects presents a current and comprehensive overview of how the key physiological traits of insects respond to environmental variation. It forges conceptual links from molecular biology through organismal function to population and community ecology. As with other books in the Series, the emphasis is on the unique physiological characteristics of the insects, but with applications to questions of broad relevance in physiological ecology. As an aid to new researchers on insects, it also includes introductory chapters on the basics and techniques of insect physiological ecology.
This is the first transnational study of British, Norwegian, and Swedish engagement with the Antarctic, from the years before the Great War to the early years of the Cold War. Rather than charting how Europeans unveiled the Antarctic, it uses the history of Antarctic activity as a window into the political and cultural worlds of twentieth-century Britain and Scandinavia. Science was a resource for states attempting to reveal - and control - the Antarctic and its resources. But it was also a source of personal and institutional capital, a means of earning civic status and professional advancement. The book ranges from the politics of whaling management to the changing value of geographical exploration in the academy and the rise of specialized, state-sponsored research, presenting an episodic rather than a linear narrative focused on historically specific networks and strategies. Drawing upon scholarship in critical geopolitics, imperial environmental history, and the cultural history of science, author Peder Roberts argues that despite its splendid geographical isolation, the Antarctic was a field for distinctly local European dreams.
Surveillance is a key notion for understanding power and control in the modern world, but it has been curiously neglected by historians of science and technology. Using the overarching concept of the "surveillance imperative," this collection of essays offers a new window on the evolution of the environmental sciences during and after the Cold War.
This second edition provides 21 new chapters on methods used in laboratories for investigating the physiology and molecular genetics of the pathogen Clostridium difficile. Chapters detail up-to-date experimental techniques for gene editing and transcriptional analysis which are used to investigate the fundamental biology of the organism and its virulence factors. Additional chapters describe the development of potential new treatments, including vaccines, bacteriophage, and faecal transplantation. Written in the highly successful Methods in Molecular Biology series format, chapters include introductions to their respective topics, lists of the necessary materials and reagents, step-by-step, readily reproducible laboratory protocols, and tips on troubleshooting and avoiding known pitfalls. Authoritative and cutting-edge, Clostridium difficile: Methods and Protocols, Second Edition provides a comprehensive catalogue of molecular tools and techniques authored by the researchers who have developed them.
The Edict of Nantes of 1598 is traditionally celebrated as an enlightened act of religious toleration ending the long and bloody conflict of the French religious wars. It is often forgotten, however, that it was preceded by a series of increasingly elaborate royal edicts which sought to pacify the country and to reconcile Protestant and Catholic. This book provides the first comprehensive overview of the process of peacemaking to cover the whole period of the wars throughout the French kingdom. It re-examines the sometimes fraught relationship between the crown and its subjects: the nobility, regional authorities, and urban communities, as well as confessional groups dissatisfied with royal policy. Through a wide-ranging and close analysis of archival sources, it re-evaluates both the role of royal authority and of local agency in the peace process, and provides a new perspective on the political, religious, social and cultural history of the conflict.
This book comprehensively discusses the basic principles and working mechanisms of all kinds of batteries as clean energy storage devices. In addition, it focuses on the synthesis of various electrode materials with 1D architecture via the electrospinning technique. The book gives a clear picture of recent synthetic strategies for nanofibers and nanocomposites in alkali-ion storage applications. Readers will come to understand the formation mechanism of nanofibers and their potential applications in future energy storage systems.
Clostridium difficile, a major nosocomial pathogen shown to be a primary cause of antibiotic-associated disease, has emerged as a highly transmissible and frequently antibiotic-resistant organism, causing a considerable burden on health care systems worldwide. In Clostridium difficile: Methods and Protocols, expert researchers bring together the most recently developed methods for studying the organism, including techniques involving isolation, molecular typing, genomics, genetic manipulation, and the use of animal models. Written in the highly successful Methods in Molecular Biology (TM) series format, chapters include brief introductions to their respective topics, lists of the necessary materials and reagents, step-by-step, readily reproducible laboratory protocols, and notes highlighting tips on troubleshooting and avoiding known pitfalls. Authoritative and cutting-edge, Clostridium difficile: Methods and Protocols serves as an ideal guide for scientists now in a position to gain an in-depth understanding of how this organism is transmitted and how it causes disease.
What does it mean to be human in an age of science, technology, and faith? The ability to ask such a question suggests at least a partial answer, in that however we describe ourselves we bear a major role in determining what we will become. In this book, Philip Hefner reminds us that this inescapable condition is the challenge and opportunity of Homo sapiens as the created co-creator. In four original chapters and an epilogue, Hefner frames the created co-creator as a memoirist with an ambiguous legacy, explores some of the roots of this ambiguity, emphasizes the importance of answering this ambiguity with symbols that can interpret it in wholesome ways, proposes a partial theological framework for co-creating such symbols, and applies this framework to the challenge of using technology like artificial intelligence and robotics to create other co-creators in our own image. Editors Jason P. Roberts and Mladen Turk have compiled eight responses to Hefner's work to honor his scholarly career and answer his call to help co-create a more wholesome future in an age of science, technology, and faith.
Mixture models have been around for over 150 years, and they are found in many branches of statistical modelling, as a versatile and multifaceted tool. They can be applied to a wide range of data: univariate or multivariate, continuous or categorical, cross-sectional, time series, networks, and much more. Mixture analysis is a very active research topic in statistics and machine learning, with new developments in methodology and applications taking place all the time. The Handbook of Mixture Analysis is a very timely publication, presenting a broad overview of the methods and applications of this important field of research. It covers a wide array of topics, including the EM algorithm, Bayesian mixture models, model-based clustering, high-dimensional data, hidden Markov models, and applications in finance, genomics, and astronomy. Features: Provides a comprehensive overview of the methods and applications of mixture modelling and analysis Divided into three parts: Foundations and Methods; Mixture Modelling and Extensions; and Selected Applications Contains many worked examples using real data, together with computational implementation, to illustrate the methods described Includes contributions from the leading researchers in the field The Handbook of Mixture Analysis is targeted at graduate students and young researchers new to the field. It will also be an important reference for anyone working in this field, whether they are developing new methodology, or applying the models to real scientific problems.
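Among the topics the Handbook lists is the EM algorithm for fitting mixtures. As a minimal, self-contained illustration (a sketch only, not code drawn from the Handbook), EM for a two-component one-dimensional Gaussian mixture alternates between computing each point's responsibility under component 1 (E-step) and re-estimating the weights, means, and variances (M-step):

```python
import math

def gauss_pdf(x, mu, var):
    # Density of N(mu, var) evaluated at x
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm_1d(data, iters=50):
    """Fit a two-component 1-D Gaussian mixture by EM (illustrative sketch)."""
    # Crude initialisation from the data range
    mu1, mu2 = min(data), max(data)
    var1 = var2 = (max(data) - min(data)) ** 2 / 4 or 1.0
    pi1 = 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = []
        for x in data:
            p1 = pi1 * gauss_pdf(x, mu1, var1)
            p2 = (1 - pi1) * gauss_pdf(x, mu2, var2)
            r.append(p1 / (p1 + p2))
        # M-step: weighted updates of mixture weight, means, variances
        n1 = sum(r)
        n2 = len(data) - n1
        pi1 = n1 / len(data)
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
        var1 = max(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1, 1e-6)
        var2 = max(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2, 1e-6)
    return pi1, (mu1, var1), (mu2, var2)
```

Each iteration provably does not decrease the observed-data likelihood, which is why EM is the workhorse for maximum-likelihood mixture fitting; Bayesian alternatives covered in the Handbook instead sample the component labels and parameters.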
Since the middle of the last century, the emergence and development of fields as diverse as artificial intelligence, evolutionary science, cognitive linguistics, and neuroscience have led to a greater understanding of the ways in which humans think. One of the major discoveries involves what researchers refer to as conceptual mapping. According to theories of conceptual mapping, human thought is profoundly shaped by the ability to make connections. Simply put, human thinking is metaphorical all the way down. This insight has revolutionized the way in which scientists and philosophers think about the mind/body problem, the formation and function of language, and even the development of scientific progress itself. Until recently however, this research has gone largely unnoticed within Christian theology. But this revolution in understanding human cognition calls for broader and richer engagement with theology and religious studies: How does this new insight into human meaning-making bear on our understanding of religious meaning-making? And how might Christian theology interpret and respond to this new understanding of the development of human thought? This edited volume offers an introduction to conceptual mapping that is accessible to those with no previous knowledge of the field, and demonstrates the substantial resources this interdisciplinary research has for thinking about a variety of theological questions. The book begins with a chapter introducing the reader to the basics of conceptual mapping. The remaining chapters apply these insights to a variety of theological topics including anthropology, sacramental theology, biblical studies, ecumenical theology, and ethics.
There are many aspects of life which require us to distinguish between memories of different events, such as deciding whether you locked the door or only intended to lock the door. Source monitoring, or identifying the source of a particular memory (was the event experienced? related by someone else? or simply imagined?) is a cognitive skill that develops across the life span. In this book, the first to integrate research on children's source monitoring, readers will find an accessible overview of source-monitoring theory and findings from the research programs of leading investigators in this area. The programs of research cut across different methodologies (e.g., nomothetic, individual differences, clinical) and are applied to a wide range of issues in children's lives. Particular emphasis is placed on the effects of source monitoring on eyewitness memory and identification, learning and knowledge, and the development of a theory of mind.
Although topology was recognized by Gauss and Maxwell to play a pivotal role in the formulation of electromagnetic boundary value problems, it is a largely unexploited tool for field computation. The development of algebraic topology since Maxwell provides a framework for linking data structures, algorithms, and computation to topological aspects of three-dimensional electromagnetic boundary value problems. This book attempts to expose the link between Maxwell and a modern approach to algorithms. The first chapters lay out the relevant facts about homology and cohomology, stressing their interpretations in electromagnetism. These topological structures are subsequently tied to variational formulations in electromagnetics, the finite element method, algorithms, and certain aspects of numerical linear algebra. A recurring theme is the formulation of and algorithms for the problem of making branch cuts for computing magnetic scalar potentials and eddy currents. Appendices bridge the gap between the material presented and standard expositions of differential forms, Hodge decompositions, and tools for realizing representatives of homology classes as embedded manifolds.
Following the demise of the USSR in 1991 and the ensuing collapse of communist regimes in Eastern Europe, widespread population movements took place across Central and Eastern Europe. Whole nations disappeared and (re)emerged, and diasporic transnational ties and belonging experienced a revival. This book explores some of the many different facets of diasporic life and migration across Central and Eastern Europe by specifically employing the concept of cosmopolitanism. It examines aspects of migrants' everyday lives and identities, considers some of the difficulties faced by migrant minorities in being accepted and integrated into host societies, and also examines questions of citizenship and diasporic politics.
Presenting a range of substantive applied problems within Bayesian Statistics along with their Bayesian solutions, this book arises from a research program at CIRM in France in the second semester of 2018, which supported Kerrie Mengersen as a visiting Jean-Morlet Chair and Pierre Pudlo as the local Research Professor. The field of Bayesian statistics has exploded over the past thirty years and is now an established field of research in mathematical statistics and computer science, a key component of data science, and an underpinning methodology in many domains of science, business and social science. Moreover, while remaining naturally entwined, the three arms of Bayesian statistics, namely modelling, computation and inference, have grown into independent research fields. While the research arms of Bayesian statistics continue to grow in many directions, they are harnessed when attention turns to solving substantive applied problems. Each such problem set has its own challenges and hence draws from the suite of research a bespoke solution. The book will be useful for both theoretical and applied statisticians, as well as practitioners, to inspect these solutions in the context of the problems, in order to draw further understanding, awareness and inspiration.