This book presents a unique collection of contributions on modern topics in statistics and econometrics, written by leading experts in the respective disciplines and their intersections. It addresses nonparametric statistics and econometrics, quantiles and expectiles, and advanced methods for complex data, including spatial and compositional data, as well as tools for empirical studies in economics and the social sciences. The book was written in honor of Christine Thomas-Agnan on the occasion of her 65th birthday. Given its scope, it will appeal to researchers and PhD students in statistics and econometrics alike who are interested in the latest developments in their field.
This book offers compact descriptions of forecasting methods that are used above all in business information-processing systems. Practitioners with many years of forecasting experience also show how the individual methods are applied in companies and where the problems in their use lie. The book addresses academia and practice alike. Its spectrum ranges from simple forecasting techniques, through newer approaches from artificial intelligence and time-series analysis, to the forecasting of software reliability and cooperative forecasting in supply networks. The seventh, substantially revised and expanded edition covers new comparisons of forecasting methods, GARCH models for financial-market forecasting, "predictive analytics" as a variant of "business intelligence", and the combination of forecasts with elements of chaos theory.
This well-balanced introduction to enterprise risk management integrates quantitative and qualitative approaches and motivates key mathematical and statistical methods with abundant real-world cases - both successes and failures. Worked examples and end-of-chapter exercises support readers in consolidating what they learn. The mathematical level, which is suitable for graduate and senior undergraduate students in quantitative programs, is pitched to give readers a solid understanding of the concepts and principles involved, without diving too deeply into more complex theory. To reveal the connections between different topics, and their relevance to the real world, the presentation has a coherent narrative flow, from risk governance, through risk identification, risk modelling, and risk mitigation, capped off with holistic topics - regulation, behavioural biases, and crisis management - that influence the whole structure of ERM. The result is a text and reference that is ideal for graduate and senior undergraduate students, risk managers in industry, and anyone preparing for ERM actuarial exams.
IT organization is concerned with the reliable delivery of IT services that support business processes, optimized for time, cost, and quality. Renowned researchers, experienced management consultants, and executives discuss the strategies, instruments, concepts, and organizational approaches for the IT management of tomorrow.
This essential reference for students and scholars in the input-output research and applications community has been fully revised and updated to reflect important developments in the field. Expanded coverage includes construction and application of multiregional and interregional models, including international models and their application to global economic issues such as climate change and international trade; structural decomposition and path analysis; linkages and key sector identification and hypothetical extraction analysis; the connection of national income and product accounts to input-output accounts; supply and use tables for commodity-by-industry accounting and models; social accounting matrices; non-survey estimation techniques; and energy and environmental applications. Input-Output Analysis is an ideal introduction to the subject for advanced undergraduate and graduate students in many scholarly fields, including economics, regional science, regional economics, city, regional and urban planning, environmental planning, public policy analysis and public management.
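At its core, an input-output model determines the gross output x that satisfies total demand x = Ax + f, where A is the matrix of technical coefficients and f is final demand, so that x = (I - A)^{-1} f via the Leontief inverse. A minimal sketch of this calculation (the two-sector coefficients and demand figures below are invented purely for illustration):

```python
import numpy as np

# Hypothetical 2-sector technical-coefficients matrix A:
# A[i, j] = input from sector i needed per unit of sector j's output.
A = np.array([[0.2, 0.3],
              [0.4, 0.1]])

# Final demand for each sector's output (illustrative values).
f = np.array([100.0, 50.0])

# Total output x must satisfy x = A @ x + f, so x = (I - A)^{-1} f.
L = np.linalg.inv(np.eye(2) - A)   # the Leontief inverse
x = L @ f

print(x)  # gross output by sector needed to meet final demand
```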
Formal Models of Domestic Politics offers a unified and accessible approach to canonical and important new models of politics. Intended for political science and economics students who have already taken a course in game theory, this new edition retains the widely appreciated pedagogic approach of the first edition. Coverage has been expanded to include a new chapter on nondemocracy; new material on valence and issue ownership, dynamic veto and legislative bargaining, delegation to leaders by imperfectly informed politicians, and voter competence; and numerous additional exercises. Political economists, comparativists, and Americanists will all find models in the text central to their research interests. This leading graduate textbook assumes no mathematical knowledge beyond basic calculus, with an emphasis placed on clarity of presentation. Political scientists will appreciate the simplification of economic environments to focus on the political logic of models; economists will discover many important models published outside of their discipline; and both instructors and students will value the classroom-tested exercises. This is a vital update to a classic text.
Who decides how official statistics are produced? Do politicians have control or are key decisions left to statisticians in independent statistical agencies? Interviews with statisticians in Australia, Canada, Sweden, the UK and the USA were conducted to get insider perspectives on the nature of decision making in government statistical administration. While the popular adage suggests there are 'lies, damned lies and statistics', this research shows that official statistics in liberal democracies are far from mistruths; they are consistently insulated from direct political interference. Yet, a range of subtle pressures and tensions exist that governments and statisticians must manage. The power over statistics is distributed differently in different countries, and this book explains why. Differences in decision-making powers across countries are the result of shifting pressures politicians and statisticians face to be credible, and the different national contexts that provide distinctive institutional settings for the production of government numbers.
Info-metrics is a framework for modeling, reasoning, and drawing inferences under conditions of noisy and insufficient information. It is an interdisciplinary framework situated at the intersection of information theory, statistical inference, and decision-making under uncertainty. In Advances in Info-Metrics, Min Chen, J. Michael Dunn, Amos Golan, and Aman Ullah bring together a group of thirty experts to expand the study of info-metrics across the sciences and demonstrate how to solve problems using this interdisciplinary framework. Building on the theoretical underpinnings of info-metrics, the volume sheds new light on statistical inference, information, and general problem solving. The book explores the basis of information-theoretic inference and its mathematical and philosophical foundations. It emphasizes the interrelationship between information and inference and includes explanations of model building, theory creation, estimation, prediction, and decision making. Each of the nineteen chapters provides the necessary tools for using the info-metrics framework to solve a problem. The collection covers recent developments in the field, as well as many new cross-disciplinary case studies and examples. Designed to be accessible for researchers, graduate students, and practitioners across disciplines, this book provides a clear, hands-on experience for readers interested in solving problems when presented with incomplete and imperfect information.
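The information-theoretic inference the volume builds on is, in its simplest form, Jaynes-style maximum entropy: among all distributions consistent with the observed (insufficient) information, choose the one with maximal entropy. A hedged sketch of the classic constrained-mean version (the support and target mean are illustrative, not any specific chapter's application):

```python
import numpy as np
from scipy.optimize import minimize

# A toy maximum-entropy problem: recover a probability distribution
# over the support {1,...,6} knowing only that its mean is 4.5
# (insufficient information -- infinitely many distributions fit).
support = np.arange(1, 7)
target_mean = 4.5

def neg_entropy(p):
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},              # probabilities sum to 1
    {"type": "eq", "fun": lambda p: support @ p - target_mean},  # mean constraint
]
p0 = np.full(6, 1 / 6)  # start from the uniform distribution
res = minimize(neg_entropy, p0, constraints=constraints,
               bounds=[(1e-9, 1)] * 6, method="SLSQP")

print(res.x)  # the max-entropy distribution consistent with the constraints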
This book provides in-depth analyses of GDP accounting methods, statistical calibers, and comparative perspectives on Chinese GDP. Beginning with an exploration of international comparisons of GDP, the book introduces the theoretical backgrounds, data sources, and algorithms of the exchange-rate method and the purchasing-power-parity method, and discusses the advantages, disadvantages, and latest developments of the two. It further elaborates on the reasons for the imperfections in Chinese GDP data, including the limitations of current statistical techniques and of the accounting system, as well as the relatively confusing statistics for the service industry, and the authors make suggestions for improvement. Finally, the authors emphasize that evaluation of a country's economic and social development should not be limited to GDP alone, but should focus more on indicators of comprehensive national power, national welfare, and people's livelihood. This book will be of interest to economists, China-watchers, and scholars of geopolitics.
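The difference between the two conversion methods the book discusses is easy to see in miniature: the exchange-rate method divides local-currency GDP by the market exchange rate, while the PPP method divides by a purchasing-power-parity rate. A toy sketch with invented numbers:

```python
# All numbers here are purely illustrative of the two conversion methods.
gdp_local = 120e12        # GDP in local currency units (LCU)
market_rate = 7.0         # LCU per US dollar (market exchange rate)
ppp_rate = 4.0            # LCU per US dollar at purchasing power parity

gdp_usd_exchange = gdp_local / market_rate  # exchange-rate method
gdp_usd_ppp = gdp_local / ppp_rate          # PPP method

# PPP conversion typically yields a larger figure for economies where
# non-traded goods and services are relatively cheap.
print(f"exchange-rate method: ${gdp_usd_exchange / 1e12:.1f} trillion")
print(f"PPP method:           ${gdp_usd_ppp / 1e12:.1f} trillion")
```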
This book presents strategies for analyzing qualitative and mixed methods data with MAXQDA software, and provides guidance on implementing a variety of research methods and approaches, e.g. grounded theory, discourse analysis and qualitative content analysis, using the software. In addition, it explains specific topics, such as transcription, building a coding frame, visualization, analysis of videos, concept maps, group comparisons and the creation of literature reviews. The book is intended for masters and PhD students as well as researchers and practitioners dealing with qualitative data in various disciplines, including the educational and social sciences, psychology, public health, business or economics.
How did Americans come to quantify their society's progress and well-being in units of money? In today's GDP-run world, prices are the standard measure of not only our goods and commodities but our environment, our communities, our nation, even our self-worth. The Pricing of Progress traces the long history of how and why we moderns adopted the monetizing values and valuations of capitalism as an indicator of human prosperity while losing sight of earlier social and moral metrics that did not put a price on everyday life. Eli Cook roots the rise of economic indicators in the emergence of modern capitalism and the contested history of English enclosure, Caribbean slavery, American industrialization, economic thought, and corporate power. He explores how the maximization of market production became the chief objective of American economic and social policy. We see how distinctly capitalist quantification techniques used to manage or invest in railroad corporations, textile factories, real estate holdings, or cotton plantations escaped the confines of the business world and seeped into every nook and cranny of society. As economic elites quantified the nation as a for-profit, capitalized investment, the progress of its inhabitants, free or enslaved, came to be valued according to their moneymaking abilities. Today as in the nineteenth century, political struggles rage over who gets to determine the statistical yardsticks used to gauge the "health" of our economy and nation. The Pricing of Progress helps us grasp the limits and dangers of entrusting economic indicators to measure social welfare and moral goals.
A beautiful, compelling and eye-opening guide to the way we live in Britain today. How much more do we drink than we should? Why do immigrants come here? How have house prices changed in the past decade? What do we spend our money on? Britain by Numbers answers all these questions and more, vividly bringing our nation to life in new and unexpected ways by showing who lives here, where we work, who we marry, what crimes we commit and much else besides. Beautifully designed and illustrated throughout, it takes the reader on a fascinating journey up and down the land, enriching their understanding of a complex - and contradictory - country.
For one-semester business statistics courses. A focus on using statistical methods to analyse and interpret results to make data-informed business decisions. Statistics is essential for all business majors, and Business Statistics: A First Course helps students see the role statistics will play in their own careers by providing examples drawn from all functional areas of business. Guided by the principles set forth by major statistical and business science associations (ASA and DSI), plus the authors' diverse experiences, the 8th Edition, Global Edition, continues to innovate and improve the way this course is taught to all students. With new examples, case scenarios, and problems, the text continues its tradition of focusing on the interpretation of results, evaluation of assumptions, and discussion of next steps that lead to data-informed decision making. The authors feel that this approach, rather than a focus on manual calculations, better serves students in their future careers. This brief offering, created to fit the needs of a one-semester course, is part of the established Berenson/Levine series.
'A manual for the 21st-century citizen... accessible, refreshingly critical, relevant and urgent' - Financial Times 'Fascinating and deeply disturbing' - Yuval Noah Harari, Guardian Books of the Year In this New York Times bestseller, Cathy O'Neil, one of the first champions of algorithmic accountability, sounds an alarm on the mathematical models that pervade modern life -- and threaten to rip apart our social fabric. We live in the age of the algorithm. Increasingly, the decisions that affect our lives - where we go to school, whether we get a loan, how much we pay for insurance - are being made not by humans, but by mathematical models. In theory, this should lead to greater fairness: everyone is judged according to the same rules, and bias is eliminated. And yet, as Cathy O'Neil reveals in this urgent and necessary book, the opposite is true. The models being used today are opaque, unregulated, and incontestable, even when they're wrong. Most troubling, they reinforce discrimination. Tracing the arc of a person's life, O'Neil exposes the black box models that shape our future, both as individuals and as a society. These "weapons of math destruction" score teachers and students, sort CVs, grant or deny loans, evaluate workers, target voters, and monitor our health. O'Neil calls on modellers to take more responsibility for their algorithms and on policy makers to regulate their use. But in the end, it's up to us to become more savvy about the models that govern our lives. This important book empowers us to ask the tough questions, uncover the truth, and demand change.
Business Statistics with Solutions in R covers a wide range of applications of statistics to solving business-related problems. It introduces readers to quantitative tools that are necessary for daily business needs and helps them make evidence-based decisions. The book provides insight into how to summarize data, analyze it, and draw meaningful inferences that can be used to improve decisions. It will enable readers to develop computational skills and problem-solving competence using the open-source language R. Mustapha Abiodun Akinkunmi uses real-life business data for illustrative examples while discussing basic statistical measures, probability, regression analysis, significance testing, correlation, the Poisson distribution, process control for manufacturing, time-series analysis, forecasting techniques, exponential smoothing, univariate and multivariate analysis (including ANOVA and MANOVA), and more, in this valuable reference for policy makers, professionals, academics and individuals interested in the areas of business statistics, applied statistics, statistical computing, finance, management and econometrics.
A unique and comprehensive source of information, this book is the only international publication providing economists, planners, policymakers and business people with worldwide statistics on current performance and trends in the manufacturing sector. The Yearbook is designed to facilitate international comparisons relating to manufacturing activity and industrial development and performance. It provides data which can be used to analyse patterns of growth and related long-term trends, structural change and industrial performance in individual industries. Statistics on employment patterns, wages, consumption and gross output and other key indicators are also presented.
Cluster analysis finds groups in data automatically. Most methods have been heuristic and leave open such central questions as: How many clusters are there? Which method should I use? How should I handle outliers? Classification assigns new observations to groups given previously classified observations, and also has open questions about parameter tuning, robustness and uncertainty assessment. This book frames cluster analysis and classification in terms of statistical models, thus yielding principled estimation, testing and prediction methods, and sound answers to the central questions. It builds the basic ideas in an accessible but rigorous way, with extensive data examples and R code; describes modern approaches to high-dimensional data and networks; and explains such recent advances as Bayesian regularization, non-Gaussian model-based clustering, cluster merging, variable selection, semi-supervised and robust classification, clustering of functional data, text and images, and co-clustering. Written for advanced undergraduates in data science, as well as researchers and practitioners, it assumes basic knowledge of multivariate calculus, linear algebra, probability and statistics.
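The model-based answer to "how many clusters are there?" treats each candidate number of clusters as a statistical model, typically a finite mixture, and picks the one best supported by a criterion such as BIC. A minimal sketch in the same spirit, using scikit-learn's Gaussian mixtures rather than the book's R code, on simulated data:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulated two-cluster data (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, size=(100, 2)),
               rng.normal(4, 1, size=(100, 2))])

# Fit Gaussian mixture models with k = 1..5 components and compare BIC.
bics = {}
for k in range(1, 6):
    gm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(X)
    bics[k] = gm.bic(X)

best_k = min(bics, key=bics.get)   # lower BIC = better-supported model
print(bics, "-> chosen number of clusters:", best_k)
```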
In recent years, interest in rigorous impact evaluation has grown tremendously in policy-making, economics, public health, social sciences and international relations. Evidence-based policy-making has become a recurring theme in public policy, alongside greater demands for accountability in public policies and public spending, and requests for independent and rigorous impact evaluations for policy evidence. Froelich and Sperlich offer a comprehensive and up-to-date approach to quantitative impact evaluation analysis, also known as causal inference or treatment effect analysis, illustrating the main approaches for identification and estimation: experimental studies, randomization inference and randomized control trials (RCTs), matching and propensity score matching and weighting, instrumental variable estimation, difference-in-differences, regression discontinuity designs, quantile treatment effects, and evaluation of dynamic treatments. The book is designed for economics graduate courses but can also serve as a manual for professionals in research institutes, governments, and international organizations, evaluating the impact of a wide range of public policies in health, environment, transport and economic development.
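Among the identification strategies listed, difference-in-differences is perhaps the easiest to sketch: with a treatment group, a control group, and pre/post periods, the treatment effect is the coefficient on the interaction of the group and period indicators. A simulated illustration (the column names and the true effect of 2.0 are invented for the example):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data for a two-period difference-in-differences design.
rng = np.random.default_rng(1)
n = 500
treated = rng.integers(0, 2, n)   # 1 = treatment group
post = rng.integers(0, 2, n)      # 1 = after the intervention
true_effect = 2.0
y = (1.0 + 0.5 * treated + 0.8 * post
     + true_effect * treated * post + rng.normal(0, 1, n))
df = pd.DataFrame({"y": y, "treated": treated, "post": post})

# The coefficient on treated:post is the difference-in-differences
# estimate of the average treatment effect on the treated.
model = smf.ols("y ~ treated * post", data=df).fit()
print(model.params["treated:post"])   # should be close to 2.0
```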
This book pulls together robust practices in Partial Least Squares Structural Equation Modeling (PLS-SEM) from other disciplines and shows how they can be used in the area of Banking and Finance. In terms of empirical analysis techniques, Banking and Finance is a conservative discipline. As such, this book will raise awareness of the potential of PLS-SEM for application in various contexts. PLS-SEM is a non-parametric approach designed to maximize explained variance in latent constructs. Latent constructs are directly unobservable phenomena such as customer service quality and managerial competence. Explained variance refers to the extent we can predict, say, customer service quality, by examining other theoretically related latent constructs such as conduct of staff and communication skills. Examples of latent constructs at the microeconomic level include customer service quality, managerial effectiveness, perception of market leadership, etc.; macroeconomic-level latent constructs would be found in contagion of systemic risk from one financial sector to another, herd behavior among fund managers, risk tolerance in financial markets, etc. Behavioral Finance is bound to provide a wealth of opportunities for applying PLS-SEM. The book is designed to expose robust processes in the application of PLS-SEM, including the use of various software packages and code, such as R. PLS-SEM is already a popular tool in marketing and management information systems used to explain latent constructs. Until now, PLS-SEM has not enjoyed wide acceptance in Banking and Finance. Based on recent research developments, this book represents the first collection of PLS-SEM applications in Banking and Finance. It will serve as a reference book for those researchers keen on adopting PLS-SEM to explain latent constructs in Banking and Finance.
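Full PLS-SEM path models are normally estimated with dedicated software, as the book describes. The core idea, extracting latent components that maximize explained covariance between blocks of observed indicators, can nonetheless be sketched with scikit-learn's PLSRegression as a simplified stand-in; the construct names and simulated indicator data below are hypothetical:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical indicators: 3 items measuring "conduct of staff" /
# "communication skills" (X block) and 2 items measuring
# "customer service quality" (Y block). All data are simulated.
rng = np.random.default_rng(42)
latent = rng.normal(size=(200, 1))                  # one underlying construct
X = latent @ rng.normal(size=(1, 3)) + 0.5 * rng.normal(size=(200, 3))
Y = latent @ rng.normal(size=(1, 2)) + 0.5 * rng.normal(size=(200, 2))

# One latent component: find directions in X and Y with maximal covariance.
pls = PLSRegression(n_components=1).fit(X, Y)

# R^2 of the Y block: the "explained variance" the blurb refers to.
print(pls.score(X, Y))
```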
This textbook addresses postgraduate students in applied mathematics, probability, and statistics, as well as computer scientists, biologists, physicists and economists, who are seeking a rigorous introduction to applied stochastic processes. Pursuing a pedagogic approach, the content follows a path of increasing complexity, from the simplest random sequences to the advanced stochastic processes. Illustrations are provided from many applied fields, together with connections to ergodic theory, information theory, reliability and insurance. The main content is also complemented by a wealth of examples and exercises with solutions.
An introduction to the fundamentals of statistics and their computer-supported application, tailored specifically to economists and social scientists. From the contents: data entry and data modification; frequency distributions and descriptive statistics; exploratory data analysis; cross-tabulations and measures of association; statistical tests; measures of correlation; scatter plots; regression analysis; trend analysis and curve fitting; time-series analysis; factor analysis; cluster analysis; discriminant analysis; exercises.
An introduction to how the mathematical tools of quantum field theory can be applied to economics and finance, providing a wide range of quantum mathematical techniques for designing financial instruments. The ideas of Lagrangians, Hamiltonians, state spaces, operators and Feynman path integrals are demonstrated to be the mathematical underpinning of quantum field theory, and are employed to formulate a comprehensive mathematical theory of asset pricing and of interest rates, validated by empirical evidence. Numerical algorithms and simulations are applied to the study of asset pricing models as well as of nonlinear interest rates. A range of economic and financial topics are shown to have quantum mechanical formulations, including options, coupon bonds, nonlinear interest rates, risky bonds and the microeconomic action functional. This is an invaluable resource for experts in quantitative finance and in mathematics who have no specialist knowledge of quantum field theory.
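In the simplest lognormal special case, the path-integral view of asset pricing reduces to a risk-neutral expectation over price paths, which Monte Carlo simulation approximates directly. The sketch below prices a European call this way; it stands in for, and greatly simplifies, the book's field-theoretic machinery, and all parameter values are invented:

```python
import numpy as np

# Monte Carlo pricing of a European call under geometric Brownian motion,
# the simplest special case of the "sum over paths" view. Illustrative only.
rng = np.random.default_rng(7)
S0, K, r, sigma, T = 100.0, 105.0, 0.02, 0.2, 1.0
n_paths = 200_000

# Terminal price: S_T = S0 * exp((r - sigma^2/2) T + sigma sqrt(T) Z).
Z = rng.standard_normal(n_paths)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)

# Risk-neutral expectation of the discounted payoff.
price = np.exp(-r * T) * np.mean(np.maximum(ST - K, 0.0))
print(round(price, 3))   # close to the Black-Scholes value (about 6.7)
```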
This handbook presents a systematic overview of the approaches to, the diversity of, and the problems involved in interdisciplinary rating methodologies. Historically, the purpose of ratings has been to achieve information transparency regarding a given body's activities, whether in the field of finance, banking, or sports, for example. This book focuses on commonly used rating methods in three important fields: finance, sports, and the social sector. In the world of finance, investment decisions are largely shaped by how positively or negatively economies or financial instruments are rated. Ratings have thus become a basis of trust for investors. Similarly, sports evaluation and funding are largely based on core ratings. From local communities to groups of nations, public investment and funding are also dependent on how these bodies are continuously rated against expected performance targets. As such, ratings need to reflect the consensus of all stakeholders on selected aspects of the work and how to evaluate their success. The public should also have the opportunity to participate in this process. The authors examine current rating approaches from a variety of proposals that are closest to the public consensus, analyzing the rating models and summarizing the methods of their construction. This handbook offers a valuable reference guide for managers, analysts, economists, business informatics specialists, and researchers alike.
You may like...
Quantitative statistical techniques
Swanepoel, Vivier, …
Paperback
R718 (Discovery Miles 7 180)
State Profiles 2022 - The Population and…
Hannah Anderson Krog
Hardcover
R4,858 (Discovery Miles 48 580)
Operations And Supply Chain Management
David Collier, James Evans
Hardcover