Discover how statistical information impacts decisions in today's business world as Anderson/Sweeney/Williams/Camm/Cochran/Fry/Ohlmann's leading ESSENTIALS OF STATISTICS FOR BUSINESS AND ECONOMICS, 9E connects concepts in each chapter to real-world practice. This edition delivers sound statistical methodology, a proven problem-scenario approach and meaningful applications that reflect the latest developments in business and statistics today. More than 350 new and proven real business examples, a wealth of practical cases and meaningful hands-on exercises highlight statistics in action. You gain practice using leading professional statistical software with exercises and appendices that walk you through using JMP (R) Student Edition 14 and Excel (R) 2016. WebAssign's online course management system is available separately to further strengthen this business statistics approach and help you maximize your course success.
Develop the analytical skills that are in high demand in businesses today with Camm/Cochran/Fry/Ohlmann's best-selling BUSINESS ANALYTICS, 4E. You master the full range of analytics as you strengthen descriptive, predictive and prescriptive analytic skills. Real examples and memorable visuals illustrate data and results for each topic. Step-by-step instructions guide you through using Microsoft (R) Excel, Tableau, R, and JMP Pro software to apply even advanced analytics techniques. Practical, relevant problems at all levels of difficulty further help you apply what you've learned. This edition assists you in becoming proficient in topics beyond the traditional quantitative concepts, such as data visualization and data mining, which are increasingly important in today's analytical problem solving. MindTap digital learning resources with an interactive eBook, algorithmic practice problems with solutions and Exploring Analytics visualizations strengthen your understanding of key concepts.
The book unifies quantum theory and the general theory of relativity. This has been an unsolved problem for about 100 years, and one that influences so many fields that it is of clear importance to the scientific community. Examples such as the Higgs field, limits to the classical Dirac, Klein-Gordon and Schroedinger cases, quantized Schwarzschild, Kerr and Kerr-Newman objects, and the photon are considered for illustration. An interesting explanation for the asymmetry of matter and antimatter in the early universe was found while quantizing the Schwarzschild metric.
A variety of different social, natural, and technological systems can be described by the same mathematical framework. This holds from the Internet to food webs and to boards of company directors. In all these situations a graph of the elements of the system and their interconnections displays a universal feature: there are only a few elements with many connections, and many elements with few connections. This book presents the experimental evidence for these 'scale-free networks' and provides students and researchers with a corpus of theoretical results and algorithms to analyse and understand these features. The content and exposition of this book make it a clear textbook for beginners and a reference work for experts.
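The blurb's central observation, a few highly connected hubs alongside many sparsely connected nodes, is commonly illustrated with a preferential-attachment growth model. The sketch below is a minimal, illustrative simulation (the function name and parameters are our own, not taken from the book):

```python
import random
from collections import Counter

def preferential_attachment(n_nodes, m=2, seed=42):
    """Grow a graph where each new node links to m existing nodes,
    chosen with probability proportional to their current degree."""
    rng = random.Random(seed)
    # Start from a small complete core of m+1 nodes.
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # 'stubs' repeats each node once per incident edge, so uniform
    # sampling from it is sampling proportional to degree.
    stubs = [v for e in edges for v in e]
    for new in range(m + 1, n_nodes):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(stubs))
        for t in targets:
            edges.append((new, t))
            stubs.extend([new, t])
    return edges

edges = preferential_attachment(2000)
degree = Counter(v for e in edges for v in e)
# Hubs emerge: a few nodes with many connections, many with few.
print("max degree:", max(degree.values()))
print("min degree:", min(degree.values()))
```

Sampling uniformly from the stub list is what makes well-connected nodes ever more likely to gain new links, producing the heavy-tailed degree distribution characteristic of scale-free networks.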
An introduction to how the mathematical tools from quantum field theory can be applied to economics and finance, providing a wide range of quantum mathematical techniques for designing financial instruments. The ideas of Lagrangians, Hamiltonians, state spaces, operators and Feynman path integrals are demonstrated to be the mathematical underpinning of quantum field theory, and are employed to formulate a comprehensive mathematical theory of asset pricing as well as of interest rates, validated by empirical evidence. Numerical algorithms and simulations are applied to the study of asset pricing models as well as of nonlinear interest rates. A range of economic and financial topics are shown to have quantum mechanical formulations, including options, coupon bonds, nonlinear interest rates, risky bonds and the microeconomic action functional. This is an invaluable resource for experts in quantitative finance and in mathematics who have no specialist knowledge of quantum field theory.
Actuaries have access to a wealth of individual data in pension and insurance portfolios, but rarely use its full potential. This book will pave the way, from methods using aggregate counts to modern developments in survival analysis. Based on the fundamental concept of the hazard rate, Part I shows how and why to build statistical models, based on data at the level of the individual members of a pension scheme or life insurance portfolio. Extensive use is made of the R statistics package. Smooth models, including regression and spline models in one and two dimensions, are covered in depth in Part II. Finally, Part III uses multiple-state models to extend survival models beyond the simple life/death setting, and includes a brief introduction to the modern counting process approach. Practising actuaries will find this book indispensable, and students will find it helpful when preparing for their professional examinations.
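As a toy illustration of the hazard-rate concept the book builds on (the portfolio numbers and function names below are hypothetical, and the book itself works in R rather than Python):

```python
import math

def hazard_estimate(deaths, exposure_years):
    """Occurrence-exposure estimate of a constant hazard rate:
    mu_hat = number of deaths / total person-years of exposure."""
    return deaths / exposure_years

def survival(mu, t):
    """Survival probability over t years under a constant hazard mu:
    S(t) = exp(-mu * t)."""
    return math.exp(-mu * t)

# Hypothetical portfolio: 15 deaths observed over 1,200 person-years.
mu_hat = hazard_estimate(15, 1200.0)
print(f"estimated hazard: {mu_hat:.4f} per year")
print(f"10-year survival: {survival(mu_hat, 10):.4f}")
```

Aggregate-count methods stop at estimates like `mu_hat`; the individual-level models of Parts I and II let the hazard vary with each person's age and covariates instead of being a single constant.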
Chris Albright's VBA FOR MODELERS, 4E, International Edition is an essential tool for helping students learn to use Visual Basic for Applications (VBA) as a means to automate common spreadsheet tasks, as well as to create sophisticated management science applications. VBA is the programming language for Microsoft (R) Office. VBA FOR MODELERS, 4E, International Edition contains two parts. The first part teaches students the essentials of VBA for Excel. The second part illustrates how a number of management science models can be automated with VBA. From a user's standpoint, these applications hide the details of the management science techniques and instead present a simple user interface for inputs and results.
What do we mean by inequality comparisons? If the rich just get richer and the poor get poorer, the answer might seem easy. But what if the income distribution changes in a complicated way? Can we use mathematical or statistical techniques to simplify the comparison problem in a way that has economic meaning? What does it mean to measure inequality? Is it similar to National Income? Or a price index? Is it enough just to work out the Gini coefficient? Measuring Inequality tackles these questions and examines the underlying principles of inequality measurement and its relation to welfare economics, distributional analysis, and information theory. The book covers modern theoretical developments in inequality analysis, as well as showing how the way we think about inequality today has been shaped by classic contributions in economics and related disciplines. Formal results and detailed literature discussion are provided in two appendices. The principal points are illustrated in the main text, using examples from US and UK data, as well as other data sources, and associated web materials provide hands-on learning. Measuring Inequality is designed to appeal to both undergraduate and post-graduate students, and academic economists. Its emphasis on practical application means that it will also be useful to policy analysts and advisors.
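For readers wondering what 'working out the Gini coefficient' involves in practice, a minimal sketch (illustrative only; the income figures are made up):

```python
def gini(incomes):
    """Gini coefficient via the mean absolute difference:
    G = sum_i sum_j |x_i - x_j| / (2 * n^2 * mean)."""
    n = len(incomes)
    mean = sum(incomes) / n
    mad = sum(abs(x - y) for x in incomes for y in incomes)
    return mad / (2 * n * n * mean)

print(gini([10, 10, 10, 10]))  # perfect equality
print(gini([1, 1, 1, 97]))     # highly concentrated income
```

The coefficient is 0 under perfect equality and approaches 1 as income concentrates in a single recipient; the book's point is precisely that no single such index settles every inequality comparison.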
Several recent advances in smoothing and semiparametric regression are presented in this book from a unifying, Bayesian perspective. Simulation-based full Bayesian Markov chain Monte Carlo (MCMC) inference, as well as empirical Bayes procedures closely related to penalized likelihood estimation and mixed models, are considered here. Throughout, the focus is on semiparametric regression and smoothing based on basis expansions of unknown functions and effects in combination with smoothness priors for the basis coefficients. Beginning with a review of basic methods for smoothing and mixed models, longitudinal data, spatial data and event history data are treated in separate chapters. Worked examples from various fields such as forestry, development economics, medicine and marketing are used to illustrate the statistical methods covered in this book. Most of these examples have been analysed using implementations in the Bayesian software BayesX, and some with R code. These, as well as some of the data sets, are made publicly available on the website accompanying this book.
'A manual for the 21st-century citizen... accessible, refreshingly critical, relevant and urgent' - Financial Times 'Fascinating and deeply disturbing' - Yuval Noah Harari, Guardian Books of the Year In this New York Times bestseller, Cathy O'Neil, one of the first champions of algorithmic accountability, sounds an alarm on the mathematical models that pervade modern life -- and threaten to rip apart our social fabric. We live in the age of the algorithm. Increasingly, the decisions that affect our lives - where we go to school, whether we get a loan, how much we pay for insurance - are being made not by humans, but by mathematical models. In theory, this should lead to greater fairness: everyone is judged according to the same rules, and bias is eliminated. And yet, as Cathy O'Neil reveals in this urgent and necessary book, the opposite is true. The models being used today are opaque, unregulated, and incontestable, even when they're wrong. Most troubling, they reinforce discrimination. Tracing the arc of a person's life, O'Neil exposes the black box models that shape our future, both as individuals and as a society. These "weapons of math destruction" score teachers and students, sort CVs, grant or deny loans, evaluate workers, target voters, and monitor our health. O'Neil calls on modellers to take more responsibility for their algorithms and on policy makers to regulate their use. But in the end, it's up to us to become more savvy about the models that govern our lives. This important book empowers us to ask the tough questions, uncover the truth, and demand change.
This monograph presents a fundamental generalization of conventional probability theory. It allows the concept of probability to be applied even in cases where the available information is insufficient to characterize every relevant event by a single number. Mathematically rigorous handling of such probability assessments requires a systematic extension of the canon of concepts and methods; the foundations for this are laid in the present volume. The range of applications of interval probability is considerably broader than that of the conventional concept of probability, for example in medicine, engineering, insurance, and artificial intelligence.
This book provides a comprehensive account of stochastic filtering as a modeling tool in finance and economics. It aims to present this very important tool with a view to making it more popular among researchers in the disciplines of finance and economics. It is not intended to give a complete mathematical treatment of different stochastic filtering approaches, but rather to describe them in simple terms and illustrate their application with real historical data for problems normally encountered in these disciplines. Beyond laying out the steps to be implemented, the steps are demonstrated in the context of different market segments. Although no prior knowledge in this area is required, the reader is expected to have knowledge of probability theory as well as a general mathematical aptitude. Its simple presentation of complex algorithms required to solve modeling problems in increasingly sophisticated financial markets makes this book particularly valuable as a reference for graduate students and researchers interested in the field. Furthermore, it analyses the model estimation results in the context of the market and contrasts these with contemporary research publications. It is also suitable for use as a text for graduate level courses on stochastic modeling.
The rapidly growing field of computational social choice, at the intersection of computer science and economics, deals with the computational aspects of collective decision making. This handbook, written by thirty-six prominent members of the computational social choice community, covers the field comprehensively. Chapters devoted to each of the field's major themes offer detailed introductions. Topics include voting theory (such as the computational complexity of winner determination and manipulation in elections), fair allocation (such as algorithms for dividing divisible and indivisible goods), coalition formation (such as matching and hedonic games), and many more. Graduate students, researchers, and professionals in computer science, economics, mathematics, political science, and philosophy will benefit from this accessible and self-contained book.
How the obsession with quantifying human performance threatens our schools, medical care, businesses, and government Today, organizations of all kinds are ruled by the belief that the path to success is quantifying human performance, publicizing the results, and dividing up the rewards based on the numbers. But in our zeal to instill the evaluation process with scientific rigor, we've gone from measuring performance to fixating on measuring itself. The result is a tyranny of metrics that threatens the quality of our lives and most important institutions. In this timely and powerful book, Jerry Muller uncovers the damage our obsession with metrics is causing, and shows how we can begin to fix the problem. Filled with examples from education, medicine, business and finance, government, the police and military, and philanthropy and foreign aid, this brief and accessible book explains why the seemingly irresistible pressure to quantify performance distorts and distracts, whether by encouraging "gaming the stats" or "teaching to the test." That's because what can and does get measured is not always worth measuring, may not be what we really want to know, and may draw effort away from the things we care about. Along the way, we learn why paying for measured performance doesn't work, why surgical scorecards may increase deaths, and much more. But metrics can be good when used as a complement to, rather than a replacement for, judgment based on personal experience, and Muller also gives examples of when metrics have been beneficial. Complete with a checklist of when and how to use metrics, The Tyranny of Metrics is an essential corrective to a rarely questioned trend that increasingly affects us all.
This book provides a broad, mature, and systematic introduction to current financial econometric models and their applications to modeling and prediction of financial time series data. It utilizes real-world examples and real financial data throughout the book to apply the models and methods described. The author begins with basic characteristics of financial time series data before covering three main topics: the analysis and application of univariate financial time series; the return series of multiple assets; and Bayesian inference in finance methods. Key features of the new edition include additional coverage of modern day topics such as arbitrage, pair trading, realized volatility, and credit risk modeling; a smooth transition from S-Plus to R; and expanded empirical financial data sets. The overall objective of the book is to provide some knowledge of financial time series, introduce some statistical tools useful for analyzing these series, and give readers experience in financial applications of various econometric methods.
The substantially updated third edition of the popular Actuarial Mathematics for Life Contingent Risks is suitable for advanced undergraduate and graduate students of actuarial science, for trainee actuaries preparing for professional actuarial examinations, and for life insurance practitioners who wish to increase or update their technical knowledge. The authors provide intuitive explanations alongside mathematical theory, equipping readers to understand the material in sufficient depth to apply it in real-world situations and to adapt their results in a changing insurance environment. Topics include modern actuarial paradigms, such as multiple state models, cash-flow projection methods and option theory, all of which are required for managing the increasingly complex range of contemporary long-term insurance products. Numerous exam-style questions allow readers to prepare for traditional professional actuarial exams, and extensive use of Excel ensures that readers are ready for modern, Excel-based exams and for the actuarial work environment. The Solutions Manual (ISBN 9781108747615), available for separate purchase, provides detailed solutions to the text's exercises.
The majority of empirical research in economics ignores the potential benefits of nonparametric methods, while the majority of advances in nonparametric theory ignores the problems faced in applied econometrics. This book helps bridge this gap between applied economists and theoretical nonparametric econometricians. It discusses in depth, and in terms that someone with only one year of graduate econometrics can understand, basic to advanced nonparametric methods. The analysis starts with density estimation and motivates the procedures through methods that should be familiar to the reader. It then moves on to kernel regression, estimation with discrete data, and advanced methods such as estimation with panel data and instrumental variables models. The book pays close attention to the issues that arise with programming, computing speed, and application. In each chapter, the methods discussed are applied to actual data, paying attention to presentation of results and potential pitfalls.
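The density estimation the book starts from can be sketched in a few lines; below is a plain Rosenblatt-Parzen estimator with a Gaussian kernel (the sample data and bandwidth are illustrative assumptions, not drawn from the book):

```python
import math

def gaussian_kernel(u):
    """Standard normal density, used as the smoothing kernel."""
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def kde(x, data, bandwidth):
    """Rosenblatt-Parzen kernel density estimator:
    f_hat(x) = (1 / (n * h)) * sum_i K((x - x_i) / h)."""
    n = len(data)
    return sum(gaussian_kernel((x - xi) / bandwidth) for xi in data) / (n * bandwidth)

sample = [1.1, 1.9, 2.0, 2.3, 3.1, 5.8]
# Estimated density is higher near the cluster around 2 than near the outlier.
print(kde(2.0, sample, bandwidth=0.5))
print(kde(5.0, sample, bandwidth=0.5))
```

The bandwidth `h` embodies the bias-variance trade-off that much of the nonparametric literature addresses: a small `h` tracks the sample closely, a large `h` oversmooths.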
Predictive modeling involves the use of data to forecast future events. It relies on capturing relationships between explanatory variables and the predicted variables from past occurrences and exploiting these to predict future outcomes. Forecasting future financial events is a core actuarial skill; actuaries routinely apply predictive-modeling techniques in insurance and other risk-management applications. This book is for actuaries and other financial analysts who are developing their expertise in statistics and wish to become familiar with concrete examples of predictive modeling. The book also addresses the needs of more seasoned practicing analysts who would like an overview of advanced statistical topics that are particularly relevant in actuarial practice. Predictive Modeling Applications in Actuarial Science emphasizes life-long learning by developing tools in an insurance context, providing the relevant actuarial applications, and introducing advanced statistical techniques that can be used by analysts to gain a competitive advantage in situations with complex data.
For one-semester business statistics courses. A focus on using statistical methods to analyse and interpret results to make data-informed business decisions Statistics is essential for all business majors, and Business Statistics: A First Course helps students see the role statistics will play in their own careers by providing examples drawn from all functional areas of business. Guided by the principles set forth by major statistical and business science associations (ASA and DSI), plus the authors' diverse experiences, the 8th Edition, Global Edition, continues to innovate and improve the way this course is taught to all students. With new examples, case scenarios, and problems, the text continues its tradition of focusing on the interpretation of results, evaluation of assumptions, and discussion of next steps that lead to data-informed decision making. The authors feel that this approach, rather than a focus on manual calculations, better serves students in their future careers. This brief offering, created to fit the needs of a one-semester course, is part of the established Berenson/Levine series.
This book provides a comprehensive and unified treatment of finite sample statistics and econometrics, a field that has evolved in the last five decades. Within this framework, this is the first book which discusses the basic analytical tools of finite sample econometrics, and explores their applications to models covered in a first year graduate course in econometrics, including regression functions, dynamic models, forecasting, simultaneous equations models, panel data models, and censored models. Both linear and nonlinear models, as well as models with normal and non-normal errors, are studied.
This definitive textbook provides a solid introduction to discrete and continuous stochastic processes, tackling a complex field in a way that instils a deep understanding of the relevant mathematical principles, and develops an intuitive grasp of the way these principles can be applied to modelling real-world systems. It includes a careful review of elementary probability and detailed coverage of Poisson, Gaussian and Markov processes with richly varied queuing applications. The theory and applications of inference, hypothesis testing, estimation, random walks, large deviations, martingales and investments are developed. Written by one of the world's leading information theorists, evolving over twenty years of graduate classroom teaching and enriched by over 300 exercises, this is an exceptional resource for anyone looking to develop their understanding of stochastic processes.
Analytics is one of a number of terms used to describe a data-driven, more scientific approach to management. Ability in analytics is an essential management skill: knowledge of data and analytics helps the manager to analyze decision situations, prevent problem situations from arising, identify new opportunities, and often enables many millions of dollars to be added to the bottom line for the organization. The objective of this book is to introduce analytics from the perspective of the general manager of a corporation. Rather than examine the details or attempt an encyclopaedic review of the field, this text emphasizes the strategic role that analytics is playing in globally competitive corporations today. The chapters of this book are organized in two main parts. The first part introduces a problem area and presents some basic analytical concepts that have been successfully used to address the problem area. The objective of this material is to provide the student, the manager of the future, with a general understanding of the tools and techniques used by the analyst.
Developed over 20 years of teaching academic courses, the Handbook of Financial Risk Management can be divided into two main parts: risk management in the financial sector; and a discussion of the mathematical and statistical tools used in risk management. This comprehensive text offers readers the chance to develop a sound understanding of financial products and the mathematical models that drive them, exploring in detail where the risks are and how to manage them. Key Features: Written by an author with both theoretical and applied experience Ideal resource for students pursuing a master's degree in finance who want to learn risk management Comprehensive coverage of the key topics in financial risk management Contains 114 exercises, with solutions provided online at www.crcpress.com/9781138501874
The small sample properties of estimators and tests are frequently too complex to be useful or are unknown. Much econometric theory is therefore developed for very large or asymptotic samples where it is assumed that the behaviour of estimators and tests will adequately represent their properties in small samples. Refined asymptotic methods adopt an intermediate position by providing improved approximations to small sample behaviour using asymptotic expansions. Dedicated to the memory of Michael Magdalinos, whose work is a major contribution to this area, this book contains chapters directly concerned with refined asymptotic methods. In addition, there are chapters focusing on new asymptotic results; the exploration through simulation of the small sample behaviour of estimators and tests in panel data models; and improvements in methodology. With contributions from leading econometricians, this collection will be essential reading for researchers and graduate students concerned with the use of asymptotic methods in econometric analysis.
How can organizations ensure that they can get best value for money in their procurement decisions? How can they stimulate innovations from their dedicated suppliers? With contributions from leading academics and professionals, this 2006 handbook offers expert guidance on the fundamental aspects of successful procurement design and management in firms, public administrations, and international institutions. The issues addressed include the management of dynamic procurement; the handling of procurement risk; the architecture of purchasing systems; the structure of incentives in procurement contracts; methods to increase suppliers' participation in procurement contests and e-procurement platforms; how to minimize the risk of collusion and of corruption; pricing and reputation mechanisms in e-procurement platforms; and how procurement can enhance innovation. Inspired by frontier research, it provides practical recommendations to managers, engineers and lawyers engaged in private and public procurement design.