This book provides a comprehensive and unified treatment of finite sample statistics and econometrics, a field that has evolved in the last five decades. Within this framework, this is the first book that discusses the basic analytical tools of finite sample econometrics and explores their applications to models covered in a first-year graduate course in econometrics, including regression functions, dynamic models, forecasting, simultaneous equations models, panel data models, and censored models. Both linear and nonlinear models, as well as models with normal and non-normal errors, are studied. Finite sample results are extremely useful for applied researchers doing proper econometric analysis with small or moderately large sample data. Finite sample econometrics also provides the results for very large (asymptotic) samples. This book provides simple and intuitive presentations of difficult concepts, unified and heuristic developments of methods, and applications to various econometric models. It provides a new perspective on teaching and research in econometrics, statistics, and other applied subjects.
Developed over 20 years of teaching academic courses, the Handbook of Financial Risk Management can be divided into two main parts: risk management in the financial sector; and a discussion of the mathematical and statistical tools used in risk management. This comprehensive text offers readers the chance to develop a sound understanding of financial products and the mathematical models that drive them, exploring in detail where the risks are and how to manage them. Key Features:
- Written by an author with both theoretical and applied experience
- Ideal resource for students pursuing a master's degree in finance who want to learn risk management
- Comprehensive coverage of the key topics in financial risk management
- Contains 114 exercises, with solutions provided online at www.crcpress.com/9781138501874
Meaningful use of advanced Bayesian methods requires a good understanding of the fundamentals. This engaging book explains the ideas that underpin the construction and analysis of Bayesian models, with particular focus on computational methods and schemes. The unique features of the text are the extensive discussion of available software packages combined with a brief but complete and mathematically rigorous introduction to Bayesian inference. The text introduces Monte Carlo methods, Markov chain Monte Carlo methods, and Bayesian software, with additional material on model validation and comparison, transdimensional MCMC, and conditionally Gaussian models. The inclusion of problems makes the book suitable as a textbook for a first graduate-level course in Bayesian computation with a focus on Monte Carlo methods. The extensive discussion of Bayesian software - R/R-INLA, OpenBUGS, JAGS, STAN, and BayesX - makes it useful also for researchers and graduate students from beyond statistics.
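The Monte Carlo ideas such a course introduces can be seen in a minimal random-walk Metropolis sampler (a Python sketch for illustration only; it is not code from the book, which works with R-based software such as R/R-INLA, OpenBUGS, JAGS, STAN, and BayesX).

```python
import math
import random

# Minimal random-walk Metropolis sampler targeting a standard normal
# distribution; illustrates the accept/reject logic behind MCMC.
def log_target(x):
    return -0.5 * x * x  # log-density of N(0, 1), up to a constant

def metropolis(n_samples, step=1.0, seed=0):
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal) / target(x)),
        # computed on the log scale for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

draws = metropolis(20000)
mean = sum(draws) / len(draws)  # close to 0, the target mean
```

Real Bayesian software automates proposal tuning, convergence diagnostics, and multivariate targets, but the core mechanism is this simple loop.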
Summarizes the latest developments and techniques in the field and highlights areas such as sample surveys, nonparametric analysis, hypothesis testing, time series analysis, Bayesian inference, and distribution theory for current applications in statistics, economics, medicine, biology, engineering, sociology, psychology, and information technology. Containing more than 800 contemporary references to facilitate further study, the Handbook of Applied Econometrics and Statistical Inference is an in-depth guide for applied statisticians, econometricians, economists, sociologists, psychologists, data analysts, biometricians, medical researchers, and upper-level undergraduate and graduate-level students in these disciplines.
Volume 27 of the International Symposia in Economic Theory and Econometrics series collects a range of unique and diverse chapters, each investigating different spheres of development in emerging markets with a specific focus on significant engines of growth and advancement in the Asia-Pacific economies. Looking at the most sensitive issues behind economic growth in emerging markets, and particularly their long-term prospects, the chapters included in this volume explore the newest fields of research to understand the potential of these markets better. Including chapters from leading scholars worldwide, the volume provides comprehensive coverage of the key topics in fields spanning SMEs, terrorism, manufacturing waste reduction, financial literacy, female empowerment, leadership and corporate management, and the relationship between environmental, social, governance, and firm value. For students, researchers and practitioners, this volume offers a dynamic reference resource on emerging markets across a diverse range of topics.
Demographics is a vital field of study for understanding social and economic change, and it has attracted attention in recent years as concerns have grown over the aging populations of developed nations. Demographic studies help make sense of key aspects of the economy, offering insight into trends in fertility, mortality, immigration, and labor force participation, as well as age-, gender-, and race-specific trends in health and disability.
Originally published in 1987. This collection of original papers deals with various issues of specification in the context of the linear statistical model. The volume honours the early econometric work of Donald Cochrane, late Dean of Economics and Politics at Monash University in Australia. The chapters focus on problems associated with autocorrelation of the error term in the linear regression model and include appraisals of early work on this topic by Cochrane and Orcutt. The book includes an extensive survey of autocorrelation tests; some exact finite-sample tests; and some issues in preliminary test estimation. A wide range of other specification issues is discussed, including the implications of random regressors for Bayesian prediction; modelling with joint conditional probability functions; and results from duality theory. There is a major survey chapter dealing with specification tests for non-nested models, and some of the applications discussed by the contributors deal with the British National Accounts and with Australian financial and housing markets.
This textbook contains and explains essential mathematical formulas within an economic context. A broad range of aids and supportive examples will help readers to understand the formulas and their practical applications. This mathematical formulary is presented in a practice-oriented, clear, and understandable manner, as it is needed for meaningful and relevant application in global business, as well as in the academic setting and economic practice. The topics presented include, but are not limited to: mathematical signs and symbols, logic, arithmetic, algebra, linear algebra, combinatorics, financial mathematics, optimisation of linear models, functions, differential calculus, integral calculus, elasticities, economic functions, and the Peren theorem. Given its scope, the book offers an indispensable reference guide and is a must-read for undergraduate and graduate students, as well as managers, scholars, and lecturers in business, politics, and economics.
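One of the economic functions such a formulary covers is the point price elasticity of demand, (dq/dp)(p/q); a brief Python illustration with a made-up linear demand curve (not an example from the book):

```python
# Point price elasticity of demand for a hypothetical linear demand
# curve q(p) = 100 - 2p. The curve is invented for illustration.
def demand(p):
    return 100.0 - 2.0 * p

def elasticity(p, h=1e-6):
    q = demand(p)
    dq_dp = (demand(p + h) - demand(p - h)) / (2 * h)  # numerical derivative
    return dq_dp * p / q

e = elasticity(25.0)  # q = 50, dq/dp = -2, so elasticity = -1 (unit elastic)
```

At p = 25 demand is unit elastic; at higher prices the same linear curve becomes elastic (|e| > 1), which is why elasticity, not slope, is the economically meaningful quantity.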
This book presents a unique collection of contributions on modern topics in statistics and econometrics, written by leading experts in the respective disciplines and their intersections. It addresses nonparametric statistics and econometrics, quantiles and expectiles, and advanced methods for complex data, including spatial and compositional data, as well as tools for empirical studies in economics and the social sciences. The book was written in honor of Christine Thomas-Agnan on the occasion of her 65th birthday. Given its scope, it will appeal to researchers and PhD students in statistics and econometrics alike who are interested in the latest developments in their field.
This book offers compact descriptions of forecasting methods that are used above all in business information systems. Practitioners with many years of forecasting experience also show how the individual methods are applied in the enterprise and where the problems in their use lie. The book addresses academia and practice alike. Its spectrum ranges from simple forecasting techniques, through newer approaches from artificial intelligence and time series analysis, to the prediction of software reliability and cooperative forecasting in supply networks. The seventh, substantially revised and extended edition takes into account new comparisons of forecasting methods, GARCH models for financial market forecasting, "predictive analytics" as a variant of "business intelligence," and the combination of forecasts with elements of chaos theory.
This volume provides a general framework for a macroeconomic theory of income distribution and wealth distribution and accumulation. The book is divided into two parts. In the first the author surveys the sets of literature on the subject and relates them to each other. In the second part he makes his own contribution by presenting a new model which uses both neo-classical and post-Keynesian analytical tools. The author focuses on the laws which regulate the behavior of individuals and social groups within a given institutional set-up, and in particular those which regulate the accumulation of inter-generational wealth and life-cycle savings of families or dynasties, both in a deterministic and stochastic context. The theoretical issue of savings accumulation is reconsidered, alongside income distribution, and profit determination by concentrating on the historical reasons that are at the basis of "class distinction," as well as "generation distinction," in modern economic analysis.
Statistical Rethinking: A Bayesian Course with Examples in R and Stan builds your knowledge of and confidence in making inferences from data. Reflecting the need for scripting in today's model-based statistics, the book pushes you to perform step-by-step calculations that are usually automated. This unique computational approach ensures that you understand enough of the details to make reasonable choices and interpretations in your own modeling work. The text presents causal inference and generalized linear multilevel models from a simple Bayesian perspective that builds on information theory and maximum entropy. The core material ranges from the basics of regression to advanced multilevel models. It also presents measurement error, missing data, and Gaussian process models for spatial and phylogenetic confounding. The second edition emphasizes the directed acyclic graph (DAG) approach to causal inference, integrating DAGs into many examples. The new edition also contains new material on the design of prior distributions, splines, ordered categorical predictors, social relations models, cross-validation, importance sampling, instrumental variables, and Hamiltonian Monte Carlo. It ends with an entirely new chapter that goes beyond generalized linear modeling, showing how domain-specific scientific models can be built into statistical analyses. Features:
- Integrates working code into the main text
- Illustrates concepts through worked data analysis examples
- Emphasizes understanding assumptions and how assumptions are reflected in code
- Offers more detailed explanations of the mathematics in optional sections
- Presents examples of using the dagitty R package to analyze causal graphs
- Provides the rethinking R package on the author's website and on GitHub
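The book's step-by-step computational style can be illustrated with a grid approximation of a binomial posterior (a Python sketch in the spirit of the book's approach, though the book itself uses R; the data here are invented):

```python
import math

# Grid approximation of the posterior for a binomial proportion p,
# computed "by hand" rather than with an MCMC package.
successes, trials = 6, 9                 # made-up data
grid = [i / 100 for i in range(101)]     # candidate values of p
prior = [1.0] * len(grid)                # flat prior over [0, 1]
likelihood = [
    math.comb(trials, successes) * p**successes * (1 - p)**(trials - successes)
    for p in grid
]
unnorm = [pr * li for pr, li in zip(prior, likelihood)]
posterior = [u / sum(unnorm) for u in unnorm]  # normalize to sum to 1

# With a flat prior the exact posterior is Beta(successes + 1,
# trials - successes + 1), whose mean is (successes + 1) / (trials + 2).
post_mean = sum(p * w for p, w in zip(grid, posterior))
```

Working at this level makes the roles of prior, likelihood, and normalization explicit before handing the same model to Stan or another sampler.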
The IT organization is concerned with the reliable, time-, cost-, and quality-optimized provision of IT services that support business processes. Renowned academics, experienced management consultants, and executives discuss the strategies, instruments, concepts, and organizational approaches for the IT management of tomorrow.
This essential reference for students and scholars in the input-output research and applications community has been fully revised and updated to reflect important developments in the field. Expanded coverage includes construction and application of multiregional and interregional models, including international models and their application to global economic issues such as climate change and international trade; structural decomposition and path analysis; linkages and key sector identification and hypothetical extraction analysis; the connection of national income and product accounts to input-output accounts; supply and use tables for commodity-by-industry accounting and models; social accounting matrices; non-survey estimation techniques; and energy and environmental applications. Input-Output Analysis is an ideal introduction to the subject for advanced undergraduate and graduate students in many scholarly fields, including economics, regional science, regional economics, city, regional and urban planning, environmental planning, public policy analysis and public management.
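The core open Leontief model behind input-output analysis solves x = Ax + d for total output x, given a technical-coefficients matrix A and a final-demand vector d; a minimal sketch with made-up two-sector numbers (not data from the book):

```python
import numpy as np

# Hypothetical two-sector technical-coefficients matrix A: entry
# A[i, j] is the input from sector i needed per unit of sector j's
# output. The numbers are invented for illustration.
A = np.array([[0.2, 0.3],
              [0.4, 0.1]])

# Final demand for each sector's output.
d = np.array([100.0, 50.0])

# Total output x satisfies x = A @ x + d, so x = (I - A)^{-1} d.
L = np.linalg.inv(np.eye(2) - A)  # the Leontief inverse
x = L @ d

# Output multipliers: column sums of the Leontief inverse, giving the
# total output generated economy-wide per unit of final demand.
multipliers = L.sum(axis=0)
```

Multiregional and interregional models extend the same algebra by partitioning A into blocks that record flows between regions as well as between sectors.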
Formal Models of Domestic Politics offers a unified and accessible approach to canonical and important new models of politics. Intended for political science and economics students who have already taken a course in game theory, this new edition retains the widely appreciated pedagogic approach of the first edition. Coverage has been expanded to include a new chapter on nondemocracy; new material on valence and issue ownership, dynamic veto and legislative bargaining, delegation to leaders by imperfectly informed politicians, and voter competence; and numerous additional exercises. Political economists, comparativists, and Americanists will all find models in the text central to their research interests. This leading graduate textbook assumes no mathematical knowledge beyond basic calculus, with an emphasis placed on clarity of presentation. Political scientists will appreciate the simplification of economic environments to focus on the political logic of models; economists will discover many important models published outside of their discipline; and both instructors and students will value the classroom-tested exercises. This is a vital update to a classic text.
Who decides how official statistics are produced? Do politicians have control or are key decisions left to statisticians in independent statistical agencies? Interviews with statisticians in Australia, Canada, Sweden, the UK and the USA were conducted to get insider perspectives on the nature of decision making in government statistical administration. While the popular adage suggests there are 'lies, damned lies and statistics', this research shows that official statistics in liberal democracies are far from mistruths; they are consistently insulated from direct political interference. Yet, a range of subtle pressures and tensions exist that governments and statisticians must manage. The power over statistics is distributed differently in different countries, and this book explains why. Differences in decision-making powers across countries are the result of shifting pressures politicians and statisticians face to be credible, and the different national contexts that provide distinctive institutional settings for the production of government numbers.
Info-metrics is a framework for modeling, reasoning, and drawing inferences under conditions of noisy and insufficient information. It is an interdisciplinary framework situated at the intersection of information theory, statistical inference, and decision-making under uncertainty. In Advances in Info-Metrics, Min Chen, J. Michael Dunn, Amos Golan, and Aman Ullah bring together a group of thirty experts to expand the study of info-metrics across the sciences and demonstrate how to solve problems using this interdisciplinary framework. Building on the theoretical underpinnings of info-metrics, the volume sheds new light on statistical inference, information, and general problem solving. The book explores the basis of information-theoretic inference and its mathematical and philosophical foundations. It emphasizes the interrelationship between information and inference and includes explanations of model building, theory creation, estimation, prediction, and decision making. Each of the nineteen chapters provides the necessary tools for using the info-metrics framework to solve a problem. The collection covers recent developments in the field, as well as many new cross-disciplinary case studies and examples. Designed to be accessible for researchers, graduate students, and practitioners across disciplines, this book provides a clear, hands-on experience for readers interested in solving problems when presented with incomplete and imperfect information.
This book provides in-depth analyses of GDP accounting methods, statistical calibers, and comparative perspectives on Chinese GDP. Beginning with an exploration of international comparisons of GDP, the book introduces the theoretical backgrounds, data sources, and algorithms of the exchange rate method and the purchasing power parity method, and discusses the advantages, disadvantages, and latest developments in the two methods. This book further elaborates on the reasons for the imperfections of the Chinese GDP data, including limitations of current statistical techniques and the accounting system, as well as the relatively confusing statistics for the service industry. The authors then make suggestions for improvement. Finally, the authors emphasize that evaluation of a country's economy and social development should not be limited to GDP alone, but should focus more on indicators of comprehensive national power, national welfare, and the people's livelihood. This book will be of interest to economists, China-watchers, and scholars of geopolitics.
This publication features a broad suite of statistical indicators characterizing the supply-and-use interactions of economic sectors within and across 25 economies of Asia and the Pacific. The indicators include sector- and economy-specific multipliers and linkages, trade orientation and openness, participation in global value chains, patterns of product specialization, and domestic agglomeration, among many others. Supplementing these analyses are special chapters on the economic impacts of the COVID-19 pandemic, the contribution of the digital economy, and the significance of activities related to real estate. All analyses and indicators draw on the Multiregional Input-Output database maintained by the Asian Development Bank.
This book presents strategies for analyzing qualitative and mixed methods data with MAXQDA software, and provides guidance on implementing a variety of research methods and approaches, e.g. grounded theory, discourse analysis and qualitative content analysis, using the software. In addition, it explains specific topics, such as transcription, building a coding frame, visualization, analysis of videos, concept maps, group comparisons and the creation of literature reviews. The book is intended for masters and PhD students as well as researchers and practitioners dealing with qualitative data in various disciplines, including the educational and social sciences, psychology, public health, business or economics.
A beautiful, compelling and eye-opening guide to the way we live in Britain today. How much more do we drink than we should? Why do immigrants come here? How have house prices changed in the past decade? What do we spend our money on? Britain by Numbers answers all these questions and more, vividly bringing our nation to life in new and unexpected ways by showing who lives here, where we work, who we marry, what crimes we commit and much else besides. Beautifully designed and illustrated throughout, it takes the reader on a fascinating journey up and down the land, enriching their understanding of a complex - and contradictory - country.
How did Americans come to quantify their society's progress and well-being in units of money? In today's GDP-run world, prices are the standard measure of not only our goods and commodities but our environment, our communities, our nation, even our self-worth. The Pricing of Progress traces the long history of how and why we moderns adopted the monetizing values and valuations of capitalism as an indicator of human prosperity while losing sight of earlier social and moral metrics that did not put a price on everyday life. Eli Cook roots the rise of economic indicators in the emergence of modern capitalism and the contested history of English enclosure, Caribbean slavery, American industrialization, economic thought, and corporate power. He explores how the maximization of market production became the chief objective of American economic and social policy. We see how distinctly capitalist quantification techniques used to manage or invest in railroad corporations, textile factories, real estate holdings, or cotton plantations escaped the confines of the business world and seeped into every nook and cranny of society. As economic elites quantified the nation as a for-profit, capitalized investment, the progress of its inhabitants, free or enslaved, came to be valued according to their moneymaking abilities. Today as in the nineteenth century, political struggles rage over who gets to determine the statistical yardsticks used to gauge the "health" of our economy and nation. The Pricing of Progress helps us grasp the limits and dangers of entrusting economic indicators to measure social welfare and moral goals.