Do large cities grow more or less rapidly than small ones? Why should the relationship between city size and population growth vary so much from one period to another? This book studies the process of population growth in a national set of cities, relating its findings to the theoretical concepts of urban geography. To test his ideas, the author studies the growth of cities in England and Wales between 1801 and 1911. His explanations draw strongly on the connection between growth and the adoption of innovations. He develops a model of innovation diffusions in a set of cities and, in support of this model, looks at the way in which three particular innovations - the telephone, building societies and gaslighting - spread amongst English towns in the nineteenth century. This book was first published in 1973.
Probability and Statistical Inference: From Basic Principles to Advanced Models covers aspects of probability, distribution theory, and inference that are fundamental to a proper understanding of data analysis and statistical modelling. It presents these topics in an accessible manner without sacrificing mathematical rigour, bridging the gap between the many excellent introductory books and the more advanced, graduate-level texts. The book introduces and explores techniques that are relevant to modern practitioners, while being respectful to the history of statistical inference. It seeks to provide a thorough grounding in both the theory and application of statistics, with even the more abstract parts placed in the context of a practical setting. Features:
* Complete introduction to mathematical probability, random variables, and distribution theory.
* Concise but broad account of statistical modelling, covering topics such as generalised linear models, survival analysis, time series, and random processes.
* Extensive discussion of the key concepts in classical statistics (point estimation, interval estimation, hypothesis testing) and the main techniques in likelihood-based inference.
* Detailed introduction to Bayesian statistics and associated topics.
* Practical illustration of some of the main computational methods used in modern statistical inference (simulation, bootstrap, MCMC).
This book is for students who have already completed a first course in probability and statistics, and now wish to deepen and broaden their understanding of the subject. It can serve as a foundation for advanced undergraduate or postgraduate courses. Our aim is to challenge and excite the more mathematically able students, while providing explanations of statistical concepts that are more detailed and approachable than those in advanced texts.
This book is also useful for data scientists, researchers, and other applied practitioners who want to understand the theory behind the statistical methods used in their fields.
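One of the computational methods the blurb names, the bootstrap, fits in a few lines. The following is a minimal percentile-bootstrap sketch in Python; the data and function names are illustrative, not taken from the book:

```python
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for a statistic:
    resample the data with replacement, recompute the statistic,
    and read off the empirical alpha/2 and 1 - alpha/2 quantiles."""
    rng = random.Random(seed)
    n = len(data)
    stats = sorted(
        stat([data[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot)
    )
    return stats[int(alpha / 2 * n_boot)], stats[int((1 - alpha / 2) * n_boot) - 1]

def mean(xs):
    return sum(xs) / len(xs)

# Illustrative data: a 95% CI for the mean
sample = [2.1, 2.5, 1.9, 3.0, 2.7, 2.2, 2.8, 2.4]
lo, hi = bootstrap_ci(sample, mean)
```

The same resampling pattern extends to any statistic (median, correlation, regression coefficient) by swapping the `stat` argument.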
Who doesn't love a mystery? Students will have a blast reading the adventures of kid detectives Zara and Mendel and their dog, Digit, and helping them solve intriguing math puzzlers along the way. In this collection of comical mysteries, students meld reading strategies, such as text marking, with essential math skills to tackle real-world problems. A delightful way to practice whole-number computation, fractions, measurement, geometry, algebraic reasoning, and so much more!
This book was first published in 1970.
* A valuable teaching resource, replete with exercises, for any course on the mathematics of gambling
* Suitable for a wide audience of professionals, researchers and students
* Many practical applications for the gambling industry
Bayesian analyses have made important inroads in modern clinical research due, in part, to the incorporation of the traditional tools of noninformative priors as well as the modern innovations of adaptive randomization and predictive power. Presenting an introductory perspective to modern Bayesian procedures, Elementary Bayesian Biostatistics explores Bayesian principles and illustrates their application to healthcare research. Building on the basics of classic biostatistics and algebra, this easy-to-read book provides a clear overview of the subject. It focuses on the history and mathematical foundation of Bayesian procedures, before discussing their implementation in healthcare research from first principles. The author also elaborates on the current controversies between Bayesian and frequentist biostatisticians. The book concludes with recommendations for Bayesians to improve their standing in the clinical trials community. Calculus derivations are relegated to the appendices so as not to overly complicate the main text. As Bayesian methods gain more acceptance in healthcare, it is necessary for clinical scientists to understand Bayesian principles. Applying Bayesian analyses to modern healthcare research issues, this lucid introduction helps readers make the correct choices in the development of clinical research programs.
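The conjugate updating at the heart of many such Bayesian analyses can be made concrete with a tiny sketch. The numbers below are invented for illustration, not drawn from the book:

```python
def posterior_beta(prior_a, prior_b, successes, failures):
    """Beta(a, b) prior + binomial data -> Beta(a + s, b + f) posterior.
    This conjugate update is the simplest building block behind many
    Bayesian analyses of response rates in clinical research."""
    return prior_a + successes, prior_b + failures

# Hypothetical trial: uniform Beta(1, 1) prior, 12 responders out of 20
a, b = posterior_beta(1, 1, 12, 8)
post_mean = a / (a + b)  # posterior mean response rate
```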
This book presents the foundation and validation of the Cosserat Plate Theory, numerical experiments of deformation and vibration, and the unique properties of the Cosserat plates. Our approach incorporates the high accuracy assumptions of the Cosserat plate deformation consistent with the Cosserat Elasticity equilibrium equations, constitutive formulas, strain-displacement and torsion-microrotation relations. The Cosserat Plate Theory is parametric, where the "splitting parameter" minimizes the Cosserat plate energy. The validation of the theory is based on the comparison with the three-dimensional Cosserat Elastostatics and Elastodynamics. The numerical results are obtained using the Finite Element Method (FEM) specifically developed to solve the parametric system of equations. The analysis of deformation of a variety of Cosserat plates shows the stress concentration reduction, higher stiffness of Cosserat plates, and the size-effect related to the microstructure. The analysis of vibration of Cosserat plates predicts size-related properties of the plate vibration, the existence of the additional so-called Cosserat plate resonances, and the dynamic anisotropy, related to the dependency of the resonances on the microelement's shapes and orientations.
Provides an introduction to statistical thinking that will help the public consume results reported in the popular media.
Discover an accessible and easy-to-use guide to calculus fundamentals. In Quick Calculus: A Self-Teaching Guide, 3rd Edition, a team of expert MIT educators delivers a hands-on and practical handbook to essential calculus concepts and terms. The authors explore calculus techniques and applications, showing readers how to immediately implement the concepts discussed within to help solve real-world problems. In the book, readers will find:
* An accessible introduction to the basics of differential and integral calculus
* An interactive self-teaching guide that offers frequent questions and practice problems with solutions
* A format that enables them to monitor their progress and gauge their knowledge
This latest edition provides new sections, rewritten introductions, and worked examples that demonstrate how to apply calculus concepts to problems in physics, health sciences, engineering, statistics, and other core sciences. Quick Calculus: A Self-Teaching Guide, 3rd Edition is an invaluable resource for students and lifelong learners hoping to strengthen their foundations in calculus.
Higher Engineering Mathematics has helped thousands of students to succeed in their exams by developing problem-solving skills. It is supported by over 600 practical engineering examples and applications which relate theory to practice. The extensive and thorough topic coverage makes this a solid text for undergraduate and upper-level vocational courses. Its companion website provides resources for both students and lecturers, including lists of essential formulae, full solutions to all 2,000 further questions contained in the 277 practice exercises, and illustrations and answers to revision tests for adopting course instructors.
Design and Analysis in Educational Research Using jamovi is an integrated approach to learning about research design alongside statistical analysis concepts. Strunk and Mwavita maintain a focus on applied educational research throughout the text, with practical tips and advice on how to do high-quality quantitative research. Based on their successful SPSS version of the book, the authors focus on using jamovi in this version due to its accessibility as open-source software and its ease of use. The book teaches research design (including epistemology, research ethics, forming research questions, quantitative design, sampling methodologies, and design assumptions), introductory statistical concepts (including descriptive statistics, probability theory, and sampling distributions), basic statistical tests (like z and t), and ANOVA designs, including more advanced designs like the factorial ANOVA and mixed ANOVA. This textbook is tailor-made for first-level doctoral courses in research design and analysis. It will also be of interest to graduate students in education and educational research. The book includes Support Material with downloadable data sets, and new case study material from the authors for teaching on race, racism, and Black Lives Matter, available at www.routledge.com/9780367723088.
* 16 accompanying datasets across a wide range of contexts (e.g. academic, corporate, sports, marketing)
* Clear step-by-step instructions on executing the analyses
* Clear guidance on how to interpret results
* Primary instruction in R but added sections for Python coders
* Discussion exercises and data exercises for each of the main chapters
* Final chapter of practice material and datasets ideal for class homework or project work
Utilizes data-driven examples and exercises. Emphasizes the iterative model-building and evaluation process. Surveys an interconnected range of multivariable regression and classification models. Presents fundamental Markov chain Monte Carlo simulation techniques for Bayesian models.
Optimization techniques are at the core of data science, including data analysis and machine learning. An understanding of basic optimization techniques and their fundamental properties provides important grounding for students, researchers, and practitioners in these areas. This text covers the fundamentals of optimization algorithms in a compact, self-contained way, focusing on the techniques most relevant to data science. An introductory chapter demonstrates that many standard problems in data science can be formulated as optimization problems. Next, many fundamental methods in optimization are described and analyzed, including: gradient and accelerated gradient methods for unconstrained optimization of smooth (especially convex) functions; the stochastic gradient method, a workhorse algorithm in machine learning; the coordinate descent approach; several key algorithms for constrained optimization problems; algorithms for minimizing nonsmooth functions arising in data science; foundations of the analysis of nonsmooth functions and optimization duality; and the back-propagation approach, relevant to neural networks.
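The gradient method described above can be illustrated with a toy least-squares problem. This is a plain gradient-descent sketch in Python; the matrix, data, and names are illustrative, not from the book:

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.05, n_iter=500):
    """Plain gradient descent: repeat x <- x - lr * grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = x - lr * grad(x)
    return x

# Toy least-squares problem: minimize ||Ax - b||^2,
# whose gradient is 2 A^T (A x - b); the minimizer here is (1, 1).
A = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 2.0])
x_star = gradient_descent(lambda x: 2 * A.T @ (A @ x - b), [0.0, 0.0])
```

For this smooth convex objective, a small fixed step size is enough for convergence; the stochastic and accelerated variants the book analyzes modify exactly this update rule.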
This ground-breaking volume presents a unique contribution to the development of social and political psychology both in Turkey and globally, providing a complex analysis of intergroup relations in the diverse Turkish context. Turkey is home to a huge variety of social, ethnic and religious groups and hosts the largest number of refugees in the world. This diversity creates a unique opportunity to understand how powerful forces of ethnicity, migration and political ideology shape intergroup processes and intergroup relations. Bringing together novel research findings, the international collection of authors explore everything from disability, age and gender, Kurdish and Armenian relations as "traditional minorities", the recent emergence of a "new minority" of Syrian refugees and Turkey's complex political history. The theories and paradigms considered in the book - social identity, intergroup contact, integrated threat, social representations - are leading approaches in social and political psychology, but the research presented tests these approaches in the context of a very diverse and dynamic non-WEIRD (Western, Educated, Industrialized, Rich and Democratic) society, with the goal of contributing toward the development of a more intercultural and democratic social and political psychology. Bringing together cutting-edge research and providing important insights into the psychological underpinnings of a singular societal situation from a variety of perspectives, this book is essential reading for students studying the psychology, politics and social science of intergroup relations, as well as practitioners interested in conflict resolution.
The number of innovative applications of randomization tests in various fields and recent developments in experimental design, significance testing, computing facilities, and randomization test algorithms have necessitated a new edition of Randomization Tests. Updated, reorganized, and revised, the text emphasizes the irrelevance and implausibility of the random sampling assumption for the typical experiment in three completely rewritten chapters. It also discusses factorial designs and interactions and combines repeated-measures and randomized block designs in one chapter. The authors focus more attention on the practicality of N-of-1 randomization tests and the availability of user-friendly software to perform them. In addition, they provide an overview of free and commercial computer programs for all of the tests presented in the book. Building on the previous editions that have served as standard textbooks for more than twenty-five years, Randomization Tests, Fourth Edition includes a CD-ROM of up-to-date randomization test programs that facilitate application of the tests to experimental data. This CD-ROM enables students to work out problems that have been added to the chapters and helps professors teach the basics of randomization tests and devise tasks for assignments and examinations.
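The core computation of a randomization test is short enough to sketch. A minimal two-sample version in Python, with invented illustrative data (not a program from the book's CD-ROM):

```python
import random

def randomization_test(group_a, group_b, n_perm=5000, seed=1):
    """Two-sample randomization test on the difference of means.
    Under the null hypothesis the group labels are exchangeable, so we
    repeatedly shuffle the pooled data and count how often a random
    relabelling gives a difference at least as extreme as observed."""
    rng = random.Random(seed)
    mean = lambda xs: sum(xs) / len(xs)
    observed = abs(mean(group_a) - mean(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if abs(mean(pooled[:n_a]) - mean(pooled[n_a:])) >= observed:
            extreme += 1
    return (extreme + 1) / (n_perm + 1)  # add-one smoothing keeps p > 0

# Illustrative data: two clearly separated groups
p = randomization_test([5.1, 4.9, 6.2, 5.8], [3.0, 3.4, 2.8, 3.6])
```

Note that no random-sampling assumption is needed: the inference rests only on the random assignment of units to groups, which is the point the book emphasizes.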
In 1990, the National Science Foundation recommended that every college mathematics curriculum should include a second course in linear algebra. In answer to this recommendation, Matrix Theory: From Generalized Inverses to Jordan Form provides the material for a second semester of linear algebra that probes introductory linear algebra concepts while also exploring topics not typically covered in a sophomore-level class. Tailoring the material to advanced undergraduate and beginning graduate students, the authors offer instructors flexibility in choosing topics from the book. The text first focuses on the central problem of linear algebra: solving systems of linear equations. It then discusses LU factorization, derives Sylvester's rank formula, introduces full-rank factorization, and describes generalized inverses. After discussions on norms, QR factorization, and orthogonality, the authors prove the important spectral theorem. They also highlight the primary decomposition theorem, Schur's triangularization theorem, singular value decomposition, and the Jordan canonical form theorem. The book concludes with a chapter on multilinear algebra. With this classroom-tested text students can delve into elementary linear algebra ideas at a deeper level and prepare for further study in matrix theory and abstract algebra.
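The LU factorization discussed above can be sketched compactly. This is a Doolittle-style sketch without pivoting, in Python with NumPy; it is illustrative code, not material from the text:

```python
import numpy as np

def lu_decompose(A):
    """Doolittle LU factorization without pivoting: A = L @ U, with
    L unit lower triangular and U upper triangular. Assumes every
    leading principal minor of A is nonzero (no row exchanges needed)."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float)  # astype copies, so A is left unmodified
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]
            U[i, k:] = U[i, k:] - L[i, k] * U[k, k:]
    return L, U

A = np.array([[4.0, 3.0], [6.0, 3.0]])
L, U = lu_decompose(A)
```

Once A = LU is in hand, a system Ax = b reduces to two triangular solves, which is the computational payoff the chapter builds on.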
Interpreting statistical data as evidence, Statistical Evidence: A Likelihood Paradigm focuses on the law of likelihood, fundamental to solving many of the problems associated with interpreting data in this way. Statistics has long neglected this principle, resulting in a seriously defective methodology. This book redresses the balance, explaining why science has clung to a defective methodology despite its well-known defects. After examining the strengths and weaknesses of the work of Neyman and Pearson and the Fisher paradigm, the author proposes an alternative paradigm which provides, in the law of likelihood, the explicit concept of evidence missing from the other paradigms. At the same time, this new paradigm retains the elements of objective measurement and control of the frequency of misleading results, features which made the old paradigms so important to science. The likelihood paradigm leads to statistical methods that have a compelling rationale and an elegant simplicity, no longer forcing the reader to choose between frequentist and Bayesian statistics.
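The law of likelihood the book builds on can be made concrete with a small binomial example (an illustrative Python sketch, not the author's code): the strength of evidence for one hypothesis over another is measured by the ratio of their likelihoods.

```python
from math import comb

def likelihood_ratio(k, n, p1, p2):
    """Binomial likelihood ratio L(p1) / L(p2) for k successes in n
    trials. By the law of likelihood, a ratio above 1 means the data
    support p1 over p2; the binomial coefficient cancels in the ratio
    but is kept for clarity."""
    lik = lambda p: comb(n, k) * p**k * (1 - p)**(n - k)
    return lik(p1) / lik(p2)

# 7 heads in 10 tosses: modest evidence for p = 0.7 over p = 0.5
lr = likelihood_ratio(7, 10, 0.7, 0.5)
```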
Coding, Shaping, Making combines inspiration from architecture, mathematics, biology, chemistry, physics and computation to look towards the future of architecture, design and art. It presents ongoing experiments in the search for fundamental principles of form and form-making in nature so that we can better inform our own built environment. In the coming decades, matter will become encoded with shape information so that it shapes itself, as happens in biology. Physical objects, shaped by forces as well, will begin to design themselves based on information encoded in matter they are made of. This knowledge will be scaled and trickled up to architecture. Consequently, architecture will begin to design itself and the role of the architect will need redefining. This heavily illustrated book highlights Haresh Lalvani's efforts towards this speculative future through experiments in form and form-making, including his work in developing a new approach to shape-coding, exploring higher-dimensional geometry for designing physical structures and organizing form in higher-dimensional diagrams. Taking an in-depth look at Lalvani's pioneering experiments of mass customization in industrial products in architecture, combined with his idea of a form continuum, this book argues for the need for integration of coding, shaping and making in future technologies into one seamless process. Drawing together decades of research, this book will be a thought-provoking read for architecture professionals and students, especially those interested in the future of the discipline as it relates to mathematics, science, technology and art. It will also interest those in the latter fields for its broader implications.
A First Course in Chaotic Dynamical Systems: Theory and Experiment, Second Edition. The long-anticipated revision of this well-liked textbook offers many new additions. In the twenty-five years since the original version of this book was published, much has happened in dynamical systems. Mandelbrot and Julia sets were barely ten years old when the first edition appeared, and most of the research involving these objects then centered around iterations of quadratic functions. This research has expanded to include all sorts of different types of functions, including higher-degree polynomials, rational maps, exponential and trigonometric functions, and many others. Several new sections in this edition are devoted to these topics. The area of dynamical systems covered in A First Course in Chaotic Dynamical Systems: Theory and Experiment, Second Edition is quite accessible to students and also offers a wide variety of interesting open questions for students at the undergraduate level to pursue. The only prerequisite for students is a one-year calculus course (no differential equations required); students will easily be exposed to many interesting areas of current research. This course can also serve as a bridge between the low-level, often non-rigorous calculus courses and the more demanding higher-level mathematics courses. Features:
* More extensive coverage of fractals, including objects like the Sierpinski carpet and others that appear as Julia sets in the later sections on complex dynamics, as well as an actual chaos "game."
* More detailed coverage of complex dynamical systems like the quadratic family and the exponential maps.
* New sections on other complex dynamical systems like rational maps.
* A number of new and expanded computer experiments for students to perform.
About the Author: Robert L. Devaney is currently professor of mathematics at Boston University. He received his PhD from the University of California at Berkeley under the direction of Stephen Smale.
He taught at Northwestern University and Tufts University before coming to Boston University in 1980. His main area of research is dynamical systems, primarily complex analytic dynamics, but also including more general ideas about chaotic dynamical systems. Lately, he has become intrigued with the incredibly rich topological aspects of dynamics, including such things as indecomposable continua, Sierpinski curves, and Cantor bouquets.
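The quadratic family mentioned above is easy to experiment with directly. A minimal Python sketch of iterating z -> z^2 + c and the standard escape test for the Mandelbrot set (illustrative code, not an exercise from the book):

```python
def orbit(c, z0=0.0, n=100):
    """Iterate the quadratic map z -> z^2 + c, the family at the heart
    of Julia and Mandelbrot set studies; returns the orbit of z0."""
    zs = [z0]
    for _ in range(n):
        zs.append(zs[-1] ** 2 + c)
    return zs

def escapes(c, bound=2.0, n=50):
    """c lies outside the Mandelbrot set if the orbit of 0 escapes
    past |z| = 2; bounded orbits (up to n steps) suggest membership."""
    z = 0.0 + 0.0j
    for _ in range(n):
        z = z * z + c
        if abs(z) > bound:
            return True
    return False

in_set = not escapes(-1.0 + 0.0j)  # c = -1: orbit cycles 0, -1, 0, -1, ...
out_set = escapes(1.0 + 0.0j)      # c = 1: orbit 0, 1, 2, 5, 26, ... escapes
```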
In this popular text for a Numerical Analysis course, the authors introduce several major methods of solving various partial differential equations (PDEs), including elliptic, parabolic, and hyperbolic equations. It covers traditional techniques, including the classic finite difference method and finite element method, as well as state-of-the-art numerical methods. The text uniquely emphasizes both theoretical numerical analysis and practical implementation of the algorithms in MATLAB. This new edition includes a new chapter on the Finite Volume Method; the presentation has been tightened, new exercises and applications are included, and the text now refers to the latest release of MATLAB. Key Selling Points:
* A successful textbook for an undergraduate course on numerical analysis or methods taught in mathematics and computer engineering; this course is taught in every university throughout the world with an engineering department or school.
* Competitive advantage: broader coverage of numerical methods (including finite difference, finite element, meshless, and finite volume methods), with MATLAB source code for the most popular PDEs and detailed explanation of the implementation and theoretical analysis.
* No other existing textbook on the market offers as good a combination of theoretical depth and practical source code.
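The finite difference method the blurb leads with reduces a PDE to a linear system. A minimal 1D Poisson sketch, written in Python rather than the book's MATLAB, with a manufactured solution for checking (illustrative code, not from the text):

```python
import numpy as np

def poisson_1d(f, n):
    """Solve -u'' = f on (0, 1) with u(0) = u(1) = 0 using the classic
    second-order central finite-difference scheme on n interior points."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    # Tridiagonal system: (2, -1, -1) pattern scaled by 1 / h^2
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    return x, np.linalg.solve(A, f(x))

# Manufactured solution: -u'' = pi^2 sin(pi x)  ->  u = sin(pi x)
x, u = poisson_1d(lambda x: np.pi**2 * np.sin(np.pi * x), 50)
```

Comparing the computed `u` with the known exact solution is the standard way to verify the scheme's second-order accuracy, the kind of theory-plus-implementation check the book emphasizes.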
Researchers and students who want a less mathematical alternative to the EQS manual will find exactly what they're looking for in this practical text. Written specifically for those with little to no knowledge of structural equation modeling (SEM) or EQS, the author's goal is to provide a non-mathematical introduction to the basic concepts of SEM by applying these principles to EQS, Version 6.1. The book clearly demonstrates a wide variety of SEM/EQS applications that include confirmatory factor analytic and full latent variable models. Analyses are based on a wide variety of data representing single and multiple-group models; these include data that are normal/non-normal, complete/incomplete, and continuous/categorical. Written in a user-friendly style, the author walks the reader through the varied steps involved in the process of testing SEM models. These include model specification and estimation, assessment of model fit, description of EQS output, and interpretation of findings. Each application is accompanied by the hypothesis being tested, a schematic representation of the model, explanations and interpretations of the related EQS input and output files, tips on how to use the associated pull-down menus and icons, and the data file upon which the application is based. Beginning with an overview of the basic concepts of SEM and the EQS program, the book carefully works through applications starting with relatively simple single-group analyses, through to more advanced applications such as multi-group, latent growth curve, and multilevel modeling. The new edition features:
* Many new applications that include a latent growth curve model, a multilevel model, a second-order model based on categorical data, a missing data multi-group model based on the EM algorithm, and the testing for latent mean differences related to a higher-order model.
* A CD enclosed with the book that includes all application data.
* Vignettes illustrating procedural and/or data management tasks using a Windows interface.
* Description of how to build models both interactively using the BUILD_EQ interface and graphically using the EQS Diagrammer.
City, Region and Regionalism was first published in 1947.
Gauge Integral Structures for Stochastic Calculus and Quantum Electrodynamics: a stand-alone introduction to specific integration problems in the probabilistic theory of stochastic calculus. Picking up where his previous book, A Modern Theory of Random Variation, left off, Gauge Integral Structures for Stochastic Calculus and Quantum Electrodynamics introduces readers to particular problems of integration in the probability-like theory of quantum mechanics. Written as a motivational explanation of the key points of the underlying mathematical theory, and including ample illustrations of the calculus, this book relies heavily on the mathematical theory set out in the author's previous work. That said, this work stands alone and does not require a reading of A Modern Theory of Random Variation in order to be understandable. Gauge Integral Structures for Stochastic Calculus and Quantum Electrodynamics takes a gradual, relaxed, and discursive approach to the subject in a successful attempt to engage the reader by exploring a narrower range of themes and problems. Organized around examples with accompanying introductions and explanations, the book covers topics such as:
* Stochastic calculus, including discussions of random variation, integration and probability, and stochastic processes
* Field theory, including discussions of gauges for product spaces and quantum electrodynamics
* Robust and thorough appendices, examples, illustrations, and introductions for each of the concepts discussed within
* An introduction to basic gauge integral theory (for those unfamiliar with the author's previous book)
The methods employed in this book show, for instance, that it is no longer necessary to resort to unreliable "Black Box" theory in financial calculus; full mathematical rigor can now be combined with clarity and simplicity. Perfect for students and academics with even a passing interest in the application of the gauge integral technique pioneered by R. Henstock and J. Kurzweil, Gauge Integral Structures for Stochastic Calculus and Quantum Electrodynamics is an illuminating and insightful exploration of the complex mathematical topics contained within.
Hall argues that 'London was the chief manufacturing centre of the country in 1861, and without doubt for centuries before that'. This book looks at industries in London over time from 1861. This book was first published in 1962.