"A book perfect for this moment" -Katherine M. O'Regan, Former Assistant Secretary, US Department of Housing and Urban Development More than fifty years after the passage of the Fair Housing Act, American cities remain divided along the very same lines that this landmark legislation explicitly outlawed. Keeping Races in Their Places tells the story of these lines-who drew them, why they drew them, where they drew them, and how they continue to circumscribe residents' opportunities to this very day. Weaving together sophisticated statistical analyses of more than a century's worth of data with an engaging, accessible narrative that brings the numbers to life, Keeping Races in Their Places exposes the entrenched effects of redlining on American communities. This one-of-a-kind contribution to the real estate and urban economics literature applies the author's original geographic information systems analyses to historical maps to reveal redlining's causal role in shaping today's cities. Spanning the era from the Great Migration to the Great Recession, Keeping Races in Their Places uncovers the roots of the Black-white wealth gap, the subprime lending crisis, and today's lack of affordable housing in maps created by banks nearly a century ago. Most of all, it offers hope that with the latest scholarly tools we can pinpoint how things went wrong-and what we must do to make them right.
The behaviour of commodity prices never ceases to fascinate economists, financial analysts, industry experts, and policymakers. Unexpected swings in commodity prices used to occur infrequently but have now become a permanent feature of global commodity markets. This book is about modelling commodity price shocks. It is intended to provide insights into the theoretical, conceptual, and empirical modelling of the underlying causes of global commodity price shocks. Three main objectives motivated the writing of this book. First, to provide a variety of modelling frameworks for documenting the frequency and intensity of commodity price shocks. Second, to evaluate existing approaches used for forecasting large movements in future commodity prices. Third, to cover a wide range of global commodities and aspects of commodity markets, including currencies, rare, hard, and lustrous transition metals, agricultural commodities, energy, and health pandemics. Some attempts have already been made towards modelling commodity price shocks. However, most tend to focus narrowly on a subset of commodity markets, such as agricultural commodities and/or energy. In this book, the author moves the needle forward by operationalizing different models, which allow researchers to identify the underlying causes and effects of commodity price shocks. Readers also learn about different commodity price forecasting models. The author presents the topics assuming little prior or specialist knowledge. Thus, the book is accessible to industry analysts, researchers, undergraduate and graduate students in economics and financial economics, academic and professional economists, investors, and financial professionals working in different sectors of the commodity markets. Another advantage of the book's approach is that readers are not only exposed to several innovative modelling techniques to add to their modelling toolbox but also to diverse empirical applications of those techniques.
Contains information for using R software with the examples in the textbook Sampling: Design and Analysis, 3rd edition by Sharon L. Lohr.
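The design-based estimation at the heart of Lohr's text can be sketched with a tiny Horvitz-Thompson example. The companion materials use R; this is a generic Python illustration with hypothetical toy data, not code from the book:

```python
# Horvitz-Thompson estimate of a population total from a probability sample.
# Toy data (hypothetical): sampled values with their inclusion probabilities.
values = [12.0, 7.5, 9.0, 14.0]    # y_i for each sampled unit
incl_prob = [0.1, 0.2, 0.1, 0.25]  # pi_i: probability that unit i was sampled

# Each sampled unit "represents" 1/pi_i units in the population,
# so the estimated total weights each y_i by the inverse of pi_i.
ht_total = sum(y / p for y, p in zip(values, incl_prob))
print(ht_total)  # -> 303.5
```

Because units with low inclusion probabilities stand in for many unsampled units, they receive proportionally larger weights in the estimate.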
"A book perfect for this moment" -Katherine M. O'Regan, Former Assistant Secretary, US Department of Housing and Urban Development More than fifty years after the passage of the Fair Housing Act, American cities remain divided along the very same lines that this landmark legislation explicitly outlawed. Keeping Races in Their Places tells the story of these lines-who drew them, why they drew them, where they drew them, and how they continue to circumscribe residents' opportunities to this very day. Weaving together sophisticated statistical analyses of more than a century's worth of data with an engaging, accessible narrative that brings the numbers to life, Keeping Races in Their Places exposes the entrenched effects of redlining on American communities. This one-of-a-kind contribution to the real estate and urban economics literature applies the author's original geographic information systems analyses to historical maps to reveal redlining's causal role in shaping today's cities. Spanning the era from the Great Migration to the Great Recession, Keeping Races in Their Places uncovers the roots of the Black-white wealth gap, the subprime lending crisis, and today's lack of affordable housing in maps created by banks nearly a century ago. Most of all, it offers hope that with the latest scholarly tools we can pinpoint how things went wrong-and what we must do to make them right.
This book covers diverse themes, including institutions and efficiency, choice and values, law and economics, development and policy, and social and economic measurement. Written in honour of the distinguished economist Satish K. Jain, this compilation of essays should appeal not only to students and researchers of economic theory but also to those interested in the design and evaluation of institutions and policy.
This volume deals with a range of contemporary issues in Indian and other world economies, with a focus on economic theory and policy and their longstanding implications. It analyses and predicts the mechanisms that can come into play to determine the function of institutions and the impact of public policy.
Thoroughly updated throughout, A First Course in Linear Model Theory, Second Edition is an intermediate-level statistics text that fills an important gap by presenting the theory of linear statistical models at a level appropriate for senior undergraduate or first-year graduate students. With an innovative approach, the authors introduce to students the mathematical and statistical concepts and tools that form a foundation for studying the theory and applications of both univariate and multivariate linear models. In addition to adding R functionality, this second edition features three new chapters and several sections on new topics that are extremely relevant to current research in statistical methodology. Revised or expanded topics include linear fixed, random and mixed effects models, generalized linear models, Bayesian and hierarchical linear models, model selection, multiple comparisons, and regularized and robust regression.
New to the Second Edition:
- Coverage of inference for linear models expanded into two chapters.
- Expanded coverage of multiple comparisons, random and mixed effects models, model selection, and missing data.
- A new chapter on generalized linear models (Chapter 12).
- A new section on multivariate linear models in Chapter 13, and expanded coverage of Bayesian linear models and longitudinal models.
- A new section on regularized regression in Chapter 14.
- Detailed data illustrations using R.
The authors' fresh approach, methodical presentation, wealth of examples, use of R, and introduction to topics beyond the classical theory set this book apart from other texts on linear models. It forms a refreshing and invaluable first step in students' study of advanced linear models, generalized linear models, nonlinear models, and dynamic models.
There isn't currently a book on the market that focuses on multiple hypothesis testing. The book can be used in a range of courses across the social and behavioral sciences and the biological sciences, as well as by professional researchers. It includes examples of multiple hypothesis testing in practice in a variety of fields, including sport and crime.
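The corrections such a book covers can be sketched generically. Below, Bonferroni and Benjamini-Hochberg adjustments are applied to hypothetical p-values in Python; the book's own procedures and notation may differ:

```python
# Two standard multiple-testing corrections (generic sketch; p-values are hypothetical).
pvals = [0.001, 0.008, 0.019, 0.041, 0.27]
alpha = 0.05
m = len(pvals)

# Bonferroni: reject H_i if p_i <= alpha / m (controls family-wise error rate).
bonferroni = [p <= alpha / m for p in pvals]

# Benjamini-Hochberg: find the largest rank k with p_(k) <= (k/m) * alpha,
# then reject the k smallest p-values (controls false discovery rate).
order = sorted(range(m), key=lambda i: pvals[i])
k_max = 0
for rank, i in enumerate(order, start=1):
    if pvals[i] <= rank / m * alpha:
        k_max = rank
bh = [False] * m
for rank, i in enumerate(order, start=1):
    if rank <= k_max:
        bh[i] = True

print(bonferroni)  # -> [True, True, False, False, False]
print(bh)          # -> [True, True, True, False, False]
```

On these data, Benjamini-Hochberg rejects one more hypothesis than Bonferroni, illustrating its greater power when some false discoveries are tolerable.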
This book presents selected peer-reviewed contributions from the International Conference on Time Series and Forecasting, ITISE 2018, held in Granada, Spain, on September 19-21, 2018. The first three parts of the book focus on the theory of time series analysis and forecasting, and discuss statistical methods, modern computational intelligence methodologies, econometric models, financial forecasting, and risk analysis. In turn, the last three parts are dedicated to applied topics and include papers on time series analysis in the earth sciences, energy time series forecasting, and time series analysis and prediction in other real-world problems. The book offers readers valuable insights into the different aspects of time series analysis and forecasting, allowing them to benefit both from its sophisticated and powerful theory, and from its practical applications, which address real-world problems in a range of disciplines. The ITISE conference series provides a valuable forum for scientists, engineers, educators and students to discuss the latest advances and implementations in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing computer science, mathematics, statistics and econometrics.
This completely restructured, updated third edition of the volume first published in 1992 provides a general overview of the econometrics of panel data, both from a theoretical and from an applied viewpoint. Since the pioneering papers by Kuh (1959), Mundlak (1961), Hoch (1962), and Balestra and Nerlove (1966), the pooling of cross section and time series data has become an increasingly popular way of quantifying economic relationships. Each series provides information lacking in the other, so a combination of both leads to more accurate and reliable results than would be achievable by one type of series alone.
Part I is concerned with the fundamentals of panel data econometrics, both linear and nonlinear; Part II deals with more advanced topics such as dynamic models, simultaneity and measurement errors, unit roots and cointegration, incomplete panels and selectivity, and duration and count models. This volume also provides insights into the use of panel data in empirical studies. Part III deals with surveys in several major fields of applied economics, such as investment demand, foreign direct investment and international trade, production efficiency, labour supply, and transitions on the labour market. Six new chapters on R&D and innovation, wages, health economics, policy evaluation, growth empirics, and the impact of monetary policy have been included.
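The basic payoff of pooling cross-section and time-series data can be sketched with the within ("fixed effects") estimator. This is a generic Python illustration on hypothetical toy data, not material from the volume:

```python
# Within (fixed-effects) estimator for a toy panel: demean y and x within
# each unit, then run OLS on the demeaned data. This removes unit-specific
# intercepts, which neither a single cross section nor a single time series
# could separate from the slope. Data are hypothetical: y = 2*x + unit effect.
data = {
    "A": [(1.0, 12.0), (2.0, 14.0), (3.0, 16.0)],  # unit effect 10
    "B": [(1.0, 5.0), (2.0, 7.0), (4.0, 11.0)],    # unit effect 3
}

sxy = sxx = 0.0
for obs in data.values():
    xbar = sum(x for x, _ in obs) / len(obs)
    ybar = sum(y for _, y in obs) / len(obs)
    for x, y in obs:
        sxy += (x - xbar) * (y - ybar)
        sxx += (x - xbar) ** 2

beta_within = sxy / sxx  # slope net of unit-specific intercepts
print(beta_within)  # -> 2.0
```

A pooled OLS regression on the raw data would mix the slope with the difference in unit intercepts; the within transformation recovers the common slope exactly here.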
In Econometrics the author has provided a text that bridges the gap between classical econometrics (with an emphasis on linear methods such as OLS, GLS and instrumental variables) and some of the key research areas of the last few years, including sampling problems, nonparametric methods and panel data analysis. Designed for advanced undergraduate and postgraduate students of the subject, Econometrics provides rigorous, yet accessible, coverage of the subject.
New statistical methods and future directions of research in time series: A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, and signal extraction. They then move on to advanced topics, focusing on heteroscedastic models, nonlinear time series models, Bayesian time series analysis, nonparametric time series analysis, and neural networks. Multivariate time series coverage includes presentations on vector ARMA models, cointegration, and multivariate linear systems.
Requiring no previous knowledge of the subject, A Course in Time Series Analysis is an important reference and a highly useful resource for researchers and practitioners in statistics, economics, business, engineering, and environmental analysis.
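A minimal sketch of the kind of model building described above: fitting an AR(1) model, y_t = c + phi*y_{t-1} + e_t, by least squares on lagged pairs. This is a generic Python illustration on hypothetical data; real ARIMA software adds order selection, diagnostics, and outlier handling:

```python
# Fit an AR(1) model by regressing y_t on y_{t-1} (conditional least squares).
# The series below was generated exactly by y_t = 1.0 + 0.5 * y_{t-1}, y_0 = 0.
series = [0.0, 1.0, 1.5, 1.75, 1.875, 1.9375]

pairs = list(zip(series[:-1], series[1:]))  # (y_{t-1}, y_t) pairs
n = len(pairs)
xbar = sum(x for x, _ in pairs) / n
ybar = sum(y for _, y in pairs) / n

# OLS slope and intercept on the lagged pairs.
phi = sum((x - xbar) * (y - ybar) for x, y in pairs) / sum((x - xbar) ** 2 for x, _ in pairs)
c = ybar - phi * xbar
print(phi, c)  # -> 0.5 1.0
```

Because the toy series is noise-free, the estimator recovers the generating parameters exactly; with noisy data the same formulas give the least-squares fit.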
Shedding light on some of the most pressing open questions in the analysis of high frequency data, this volume presents cutting-edge developments in high frequency financial econometrics. Coverage spans a diverse range of topics, including market microstructure, tick-by-tick data, bond and foreign exchange markets, and large dimensional volatility modeling. The volume is of interest to graduate students, researchers, and industry professionals.
1. Material on single asset problems, market timing, unconditional and conditional portfolio problems, and hedged portfolios.
2. Inference via both frequentist and Bayesian paradigms.
3. A comprehensive treatment of overoptimism and overfitting of trading strategies.
4. Advice on backtesting strategies.
5. Dozens of examples and hundreds of exercises for self-study.
Much of our thinking is flawed because it is based on faulty intuition. By using the framework and tools of probability and statistics, we can overcome this to provide solutions to many real-world problems and paradoxes. We show how to do this, and find answers that are frequently very contrary to what we might expect. Along the way, we venture into diverse realms and thought experiments which challenge the way that we see the world. Features: An insightful and engaging discussion of some of the key ideas of probabilistic and statistical thinking Many classic and novel problems, paradoxes, and puzzles An exploration of some of the big questions involving the use of choice and reason in an uncertain world The application of probability, statistics, and Bayesian methods to a wide range of subjects, including economics, finance, law, and medicine Exercises, references, and links for those wishing to cross-reference or to probe further Solutions to exercises at the end of the book This book should serve as an invaluable and fascinating resource for university, college, and high school students who wish to extend their reading, as well as for teachers and lecturers who want to liven up their courses while retaining academic rigour. It will also appeal to anyone who wishes to develop skills with numbers or has an interest in the many statistical and other paradoxes that permeate our lives. Indeed, anyone studying the sciences, social sciences, or humanities on a formal or informal basis will enjoy and benefit from this book.
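A classic example of faulty intuition of the kind this book explores is the Monty Hall problem. A short Monte Carlo simulation (a generic illustration, not code from the book) shows that switching doors wins about two-thirds of the time, contrary to the intuitive answer of one-half:

```python
import random

# Monte Carlo check of the Monty Hall paradox.
rng = random.Random(0)  # seeded for reproducibility

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # Host opens a door that hides a goat and is not the contestant's pick.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(play(switch=True))   # close to 2/3
print(play(switch=False))  # close to 1/3
```

The initial pick is wrong with probability 2/3, and in exactly those cases switching lands on the car, which is why the simulated frequencies settle near 2/3 and 1/3.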
* Furnishes a thorough introduction and detailed information about the linear regression model, including how to understand and interpret its results, test assumptions, and adapt the model when assumptions are not satisfied.
* Uses numerous graphs in R to illustrate the model's results, assumptions, and other features.
* Does not assume a background in calculus or linear algebra; rather, an introductory statistics course and familiarity with elementary algebra are sufficient.
* Provides many examples using real-world datasets relevant to various academic disciplines.
* Fully integrates the R software environment in its numerous examples.
1. This book is applicable to a wide range of quantitative methods courses across the social and behavioral sciences.
2. The book is based solely on Stata for EFA, one of the top statistics packages used in the behavioral and social sciences.
3. Clear step-by-step guidance combined with screenshots shows how to apply EFA to real data.
In January 2005, the German government enacted a substantial reform of the welfare system, the so-called "Hartz IV reform." This book evaluates key characteristics of the reform from a microeconometric perspective. It investigates whether a centralized or decentralized organization of welfare administration is more successful at integrating welfare recipients into employment. Moreover, it analyzes the employment effects of an intensified use of benefit sanctions and evaluates the effectiveness and efficiency of the most frequently assigned Active Labor Market Programs. The analyses focus on immigrants, who are highly over-represented in the German welfare system.
Measurement in Economics: a Handbook aims to serve as a source, reference, and teaching supplement for quantitative empirical economics, inside and outside the laboratory. Covering an extensive range of fields in economics (econometrics, actuarial science, experimental economics, index theory, national accounts, and economic forecasting), it is the first book that takes measurement in economics as its central focus. It shows how different, and sometimes distinct, fields share the same kinds of measurement problems, and how the treatment of these problems in one field can serve as guidance in other fields. This volume provides comprehensive and up-to-date surveys of recent developments in economic measurement, written at a level intended for professional use by economists, econometricians, statisticians and social scientists.
This book deals with the application of wavelet and spectral methods for the analysis of nonlinear and dynamic processes in economics and finance. It reflects some of the latest developments in the area of wavelet methods applied to economics and finance. The topics include business cycle analysis, asset prices, financial econometrics, and forecasting. An introductory paper by James Ramsey, providing a personal retrospective of a decade's research on wavelet analysis, offers an excellent overview over the field.
A guide to economics, statistics and finance that explores the mathematical foundations underlying econometric methods. An Introduction to Econometric Theory offers a text to help in the mastery of the mathematics that underlies econometric methods and includes a detailed study of matrix algebra and distribution theory. Designed to be an accessible resource, the text explains in clear language why things are being done, and how previous material informs a current argument. The style is deliberately informal, with numbered theorems and lemmas avoided. However, very few technical results are quoted without some form of explanation, demonstration or proof. The author, a noted expert in the field, covers a wealth of topics including: simple regression, basic matrix algebra, the general linear model, distribution theory, the normal distribution, properties of least squares, unbiasedness and efficiency, eigenvalues, statistical inference in regression, t and F tests, the partitioned regression, specification analysis, random regressor theory, introduction to asymptotics, and maximum likelihood. Each of the chapters is supplied with a collection of exercises, some of which are straightforward and others more challenging. This important text: Presents a guide for teaching econometric methods to undergraduate and graduate students of economics, statistics or finance. Offers proven classroom-tested material. Contains sets of exercises that accompany each chapter. Includes a companion website that hosts additional materials, a solution manual and lecture slides. Written for undergraduates and graduate students of economics, statistics or finance, An Introduction to Econometric Theory is an essential beginner's guide to the underpinnings of econometrics.
This book covers the econometric methods necessary for a practicing applied economist or data analyst. This requires both an understanding of statistical theory and of how it is used in actual applications. Chapters 1 to 9 present the material concerned with basic statistical theory. Chapters 10 to 13 introduce a number of topics which form the basis of more advanced option modules, such as time series methods in applied econometrics. To get the most out of these topics, companion files include Excel datasets and 4-color figures. The datasets include pull-down menus to graph the data, calculate sample statistics, and estimate regression equations. FEATURES: Integration of econometric methods with statistical foundations. Worked examples of all models considered in the text. Includes Excel datasheets to facilitate estimation and application of models. Features instructor ancillaries for use as a textbook.