Welcome to Loot.co.za!
Methods for Estimation and Inference in Modern Econometrics provides a comprehensive introduction to a wide range of emerging topics, such as generalized empirical likelihood estimation and alternative asymptotics under drifting parameterizations, which have not been discussed in detail outside of highly technical research papers. The book also addresses several problems often arising in the analysis of economic data, including weak identification, model misspecification, and possible nonstationarity. The book's appendix provides a review of some basic concepts and results from linear algebra, probability theory, and statistics that are used throughout the book. Topics covered include:
- Well-established nonparametric and parametric approaches to estimation, and conventional (asymptotic and bootstrap) frameworks for statistical inference
- Estimation of models based on moment restrictions implied by economic theory, including various method-of-moments estimators for unconditional and conditional moment restriction models, and asymptotic theory for correctly specified and misspecified models
- Non-conventional asymptotic tools that lead to improved finite sample inference, such as higher-order asymptotic analysis that allows for more accurate approximations via various asymptotic expansions, and asymptotic approximations based on drifting parameter sequences
Offering a unified approach to studying econometric problems, Methods for Estimation and Inference in Modern Econometrics links most of the existing estimation and inference methods in a general framework to help readers synthesize all aspects of modern econometric theory. Various theoretical exercises and suggested solutions are included to facilitate understanding.
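The moment-restriction estimation mentioned above can be illustrated with a minimal sketch (not drawn from the book): a single unconditional moment restriction E[X] - 1/λ = 0 identifies the rate λ of an exponential distribution, and the method-of-moments estimator solves the sample analogue of that restriction. The distribution and parameter values are invented for illustration.

```python
import random
import statistics

# Hypothetical illustration of estimation from an unconditional moment
# restriction: for X ~ Exponential(lam), E[X] - 1/lam = 0, so the
# method-of-moments estimator solves the sample analogue
# mean(sample) - 1/lam_hat = 0, i.e. lam_hat = 1 / mean(sample).
random.seed(0)
true_lam = 2.0
sample = [random.expovariate(true_lam) for _ in range(100_000)]
lam_hat = 1.0 / statistics.mean(sample)  # should be close to 2.0
```

With 100,000 draws the estimate lands very close to the true rate; in more realistic settings with several moment conditions, a weighting matrix (as in GMM) would combine them.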
List of Illustrations - List of Figures - List of Tables - Glossary - Acknowledgements - Preface - Introduction - Background to the IT Race - The Japanese Challenge - The American Response - The European Response - The British Response - Strategies of European IT Companies in the 80s - Conclusion - Bibliography - Index
Design and Analysis of Time Series Experiments presents the elements of statistical time series analysis while also addressing recent developments in research design and causal modeling. A distinguishing feature of the book is its integration of design and analysis of time series experiments. Drawing examples from criminology, economics, education, pharmacology, public policy, program evaluation, public health, and psychology, Design and Analysis of Time Series Experiments is addressed to researchers and graduate students in a wide range of behavioral, biomedical and social sciences. Readers learn not only how-to skills but also the underlying rationales for the design features and the analytical methods. ARIMA algebra, Box-Jenkins-Tiao models and model-building strategies, forecasting, and Box-Tiao impact models are developed in separate chapters. The presentation of the models and model-building assumes only exposure to an introductory statistics course, with more difficult mathematical material relegated to appendices. Separate chapters cover threats to statistical conclusion validity, internal validity, construct validity, and external validity with an emphasis on how these threats arise in time series experiments. Design structures for controlling the threats are presented and illustrated through examples. The chapters on statistical conclusion validity and internal validity introduce Bayesian methods, counterfactual causality and synthetic control group designs. Building on the authors' earlier work, Design and Analysis of Time Series Experiments includes more recent developments in modeling, and considers design issues in greater detail than any existing work. Additionally, the book appeals to those who want to conduct or interpret time series experiments, as well as to those interested in research designs for causal inference.
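The impact models mentioned above can be sketched with a toy interrupted time series (an invented example, not taken from the book): an AR(1) noise process receives a permanent level shift at a known intervention point, the simplest step-type impact, and the effect is estimated here by the naive pre/post difference in means. A real analysis would first model the ARIMA noise structure.

```python
import random

# Hypothetical interrupted time series: AR(1) noise plus a permanent
# +5.0 level shift at t = 100 (a simple step-type intervention effect).
random.seed(1)
phi, shift, n, t0 = 0.5, 5.0, 200, 100
y, prev = [], 0.0
for t in range(n):
    prev = phi * prev + random.gauss(0, 1)   # AR(1) noise component
    y.append(prev + (shift if t >= t0 else 0.0))

# Naive impact estimate: difference of pre- and post-intervention means.
pre = sum(y[:t0]) / t0
post = sum(y[t0:]) / (n - t0)
impact = post - pre  # should be close to the true shift of 5.0
```

Because the AR(1) noise is autocorrelated, the naive estimate has a larger standard error than i.i.d. formulas suggest, which is precisely why the book's model-based approach matters.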
This book makes indicators more accessible, in terms of what they are, who created them and how they are used. It examines the subjectivity and human frailty behind these quintessentially 'hard' and technical measures of the world. To achieve this goal, The Rise and Rise of Indicators presents the world in terms of a selected set of indicators. The emphasis is upon the origins of the indicators and the motivation behind their creation and evolution. The ideas and assumptions behind the indicators are made transparent to demonstrate how changes to them can dramatically alter the ranking of countries that emerge. They are, after all, human constructs and thus embody human biases. The book concludes by examining the future of indicators and the author sets out some possible trajectories, including the growing emphasis on indicators as important tools in the Sustainable Development Goals that have been set for the world up until 2030. This is a valuable resource for undergraduate and postgraduate students in the areas of economics, sociology, geography, environmental studies, development studies, area studies, business studies, politics and international relations.
Let R^N be the usual vector space of real N-tuples with the usual inner product denoted by (·,·). In this paper P is a nonempty compact polyhedral subset of R^N, f is a continuously differentiable real-valued function defined on R^N, and (P) is the linearly constrained minimization problem stated as: min { f(x) | x ∈ P }. For computing stationary points of problem (P) we propose a method which attempts to operate within the linear-simplex method structure. This method then appears as the same type of method as the convex-simplex method of Zangwill [6]. It is, however, different and has the advantage of being less technical than the Zangwill method. It also has a simple geometrical interpretation which makes it more understandable and more open to other improvements. Also, in the case where f is convex, an implementable line search is proposed, which is not the case in the Zangwill method. Moreover, if f(x) = (c,x) this method will coincide with the simplex method (this is also true of the convex-simplex method); if f(x) = ||x||² it will be almost the same as the algorithm given by Bazaraa, Goode and Rardin [2].
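To make the setting concrete, here is a sketch of a related linearly constrained scheme, the conditional-gradient (Frank-Wolfe) method, on the unit simplex P = {x ≥ 0, Σx_i = 1}. This is not the paper's algorithm, only an illustration of minimizing a smooth f over a polyhedron by repeatedly solving a linear subproblem; the objective f(x) = ||x - c||² and the point c are invented for the example.

```python
# Hypothetical sketch: conditional-gradient (Frank-Wolfe) method for
# min f(x) over the unit simplex. Each step minimizes the linearized
# objective over P (which for the simplex is attained at a vertex e_j),
# then moves toward that vertex with the standard 2/(k+2) step size.
# Here f(x) = ||x - c||^2 with c inside the simplex, so the minimizer is c.
c = [0.2, 0.3, 0.5]
x = [1.0, 0.0, 0.0]                            # start at a vertex of P
for k in range(2000):
    grad = [2 * (xi - ci) for xi, ci in zip(x, c)]
    j = min(range(3), key=lambda i: grad[i])   # LP over simplex -> vertex e_j
    step = 2.0 / (k + 2)
    x = [(1 - step) * xi + step * (1.0 if i == j else 0.0)
         for i, xi in enumerate(x)]

gap = sum((xi - ci) ** 2 for xi, ci in zip(x, c))  # f(x) - f*, shrinks as O(1/k)
```

The paper's method differs in exploiting the simplex-method structure directly; the sketch only shows the shared idea of reducing each iteration to a linear problem over the polyhedron.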
One of the most important features of China's economic emergence has been the role of foreign investment and foreign companies. The importance goes well beyond the USD 1.6 trillion in foreign direct investment that China has received since it started opening its economy. Using the tools of economic impact analysis, the author estimates that around one-third of China's GDP in recent years has been generated by the investments, operations, and supply chains of foreign invested companies. In addition, foreign companies have developed industries, created suppliers and distributors, introduced modern technologies, improved business practices, modernized management training, improved sustainability performance, and helped shape China's legal and regulatory systems. These impacts have helped China become the world's second largest economy, its leading exporter, and one of its leading destinations for inward investment. The book provides a powerful analysis of China's policies toward foreign investment that can inform policy makers around the world, while giving foreign companies tools to demonstrate their contributions to host countries and showing the tremendous power of foreign investment to help transform economies.
This volume gathers peer-reviewed contributions that address a wide range of recent developments in the methodology and applications of data analysis and classification tools in micro and macroeconomic problems. The papers were originally presented at the 29th Conference of the Section on Classification and Data Analysis of the Polish Statistical Association, SKAD 2020, held in Sopot, Poland, September 7-9, 2020. Providing a balance between methodological contributions and empirical papers, the book is divided into five parts focusing on methodology, finance, economics, social issues and applications dealing with COVID-19 data. It is aimed at a wide audience, including researchers at universities and research institutions, graduate and doctoral students, practitioners, data scientists and employees in public statistical institutions.
Looking at a very simple example of an error-in-variables model, I was surprised at the effect that standard dynamic features (in the form of autocorrelation in the variables) could have on the state of identification of the model. It became apparent that identification of error-in-variables models was less of a problem when some dynamic features were present, and that the category of "predetermined variables" was meaningless, since lagged endogenous and truly exogenous variables had very different identification properties. Also, for the models I was considering, both necessary and sufficient conditions for identification could be expressed as simple counting rules, trivial to compute. These results seemed somewhat striking in the context of the traditional econometrics literature, and provided the original motivation for this monograph. The monograph, therefore, attempts to analyze econometric identification of models when the variables are measured with error and when dynamic features are present. In trying to generalize the examples I was considering, although the final results had very simple expressions, the process of formally proving them became cumbersome and lengthy (in particular for the "sufficiency" part of the proofs). Possibly this was also due to a lack of more high-powered analytical tools and/or more elegant derivations, for which I feel an apology could be appropriate. With some minor modifications, this monograph is a Ph.D. dissertation presented to the Department of Economics of the University of Wisconsin, Madison. Thanks are due to Dennis J. Aigner and Arthur S.
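The core error-in-variables problem described above can be shown in a minimal simulation (an invented example, not from the monograph): when the regressor is observed with additive measurement error, the OLS slope is attenuated toward zero by the reliability ratio var(x) / (var(x) + var(noise)).

```python
import random

# Hypothetical illustration of classical errors-in-variables attenuation:
# y = beta * x + e, but we observe x_obs = x + u. OLS of y on x_obs
# converges to beta * var(x) / (var(x) + var(u)), here 1.0 * 1/(1+1) = 0.5.
random.seed(2)
n, beta = 100_000, 1.0
x = [random.gauss(0, 1) for _ in range(n)]
y = [beta * xi + random.gauss(0, 0.5) for xi in x]
x_obs = [xi + random.gauss(0, 1) for xi in x]   # measurement error, variance 1

def slope(xs, ys):
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    return sxy / sxx

b_clean = slope(x, y)       # close to the true beta of 1.0
b_noisy = slope(x_obs, y)   # attenuated toward 0.5
```

The monograph's point is that dynamic features (lags, autocorrelation) change what is identifiable in such models; this static sketch only shows why identification is a problem in the first place.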
Originally published in 1981. Discrete-choice modelling is an area of econometrics where significant advances have been made at the research level. This book presents an overview of these advances, explaining the theory underlying the model, and explores its various applications. It shows how operational choice models can be used, and how they are particularly useful for a better understanding of consumer demand theory. It discusses particular problems connected with the model and its use, and reports on the authors' own empirical research. This is a comprehensive survey of research developments in discrete choice modelling and its applications.
Models of the American economy exist in government, research institutes, universities, and private corporations. Given the proliferation, it is wise to take stock, because these models come from diverse sources and describe different conditions from alternative points of view. They could be saying different things about the economy. The high-level comparative studies in this volume, gathered from several issues of the International Economic Review, with a substantive introduction and the addition of more comparative material, evaluate the performance of eleven models of the American economy: the Wharton Mark III Model; Brookings Model; Hickman-Coen Annual Model; Liu-Hwa Monthly Model; Data Resources, Inc. (DRI) Model; Federal Reserve Bank of St. Louis Model; Michigan Quarterly Econometric (MQEM) Model; Wharton Annual and Industry Model; Anticipation Version of the Wharton Mark III Model; Fair Model; and U.S. Department of Commerce (BEA) Model. Each of the proprietors or builders of these models describes his own system in his own words. These studies come closer than ever before to standardizing model operations for testing purposes. Some of the models are monthly, while others are annual, but the quarterly unit of time is the most frequent. Some are demand oriented; others are supply oriented and focus on the input-output sectors of the economy. Some use only observed, objective data; others use subjective, anticipatory data. Both large and small models are included. In spite of the diversity, the contributors have cooperated to trace the differences between their models to root causes and to report jointly the results of their research. There are also some general papers that look at model performance from outside the CEME group.
Principles of Econometrics, 4th Edition, is an introductory book on economics and finance designed to provide an understanding of why econometrics is necessary, and a working knowledge of basic econometric tools. This latest edition is updated to reflect the current state of economic and financial markets and provides new content on Kernel Density Fitting and Analysis of Treatment Effects. It offers new end-of-chapter questions and problems in each chapter, an updated comprehensive Glossary of Terms, and a summary of Probability and Statistics. The text applies basic econometric tools to modeling, estimation, inference, and forecasting through real-world problems, and critically evaluates the results and conclusions of others who use basic econometric tools. Furthermore, it provides a foundation and understanding for further study of econometrics and more advanced techniques.
" …deals rigorously with many of the problems that have bedevilled the subject up to the present time…" — Stephen Pollock, Econometric Theory "I continued to be pleasantly surprised by the variety and usefulness of its contents " — Isabella Verdinelli, Journal of the American Statistical Association Continuing the success of their first edition, Magnus and Neudecker present an exhaustive and self-contained revised text on matrix theory and matrix differential calculus. Matrix calculus has become an essential tool for quantitative methods in a large number of applications, ranging from social and behavioural sciences to econometrics. While the structure and successful elements of the first edition remain, this revised and updated edition contains many new examples and exercises.
Methods and perspectives to model and measure productivity and efficiency have made a number of important advances in the last decade. Using the standard and innovative formulations of the theory and practice of efficiency and productivity measurement, Robin C. Sickles and Valentin Zelenyuk provide a comprehensive approach to productivity and efficiency analysis, covering its theoretical underpinnings and its empirical implementation, paying particular attention to the implications of neoclassical economic theory. A distinct feature of the book is that it presents a wide array of theoretical and empirical methods utilized by researchers and practitioners who study productivity issues. An accompanying website includes methods, programming codes that can be used with widely available software like MATLAB (R) and R, and test data for many of the productivity and efficiency estimators discussed in the book. It will be valuable to upper-level undergraduates, graduate students, and professionals.
The first number of our earlier series, A Programme for Growth, carried a notice of forthcoming papers. Five were announced but eventually only four were published. The fifth, which was intended to deal with consumption functions, never appeared; now it takes its place as number one in the new series. It is not that ten years ago we had nothing to say on the subject of consumers' behaviour. The crude estimation method that I had used in my original (1954) paper on the linear expenditure system gave interesting and in many respects satisfactory results, some of which were published outside our series, for instance in Stone, Brown and Rowe (1964). With this method the parameter estimates changed very little after the first few iterations. Nevertheless they did change, and with the computing resources then at our disposal we failed to reach convergence. It was mainly for this reason that we decided to wait.
At this point in time, there is no generally accepted methodology for explaining and predicting human behavior given a product choice situation. This is true despite the critical importance of such methodology to marketing, transportation and urban planning. While the social sciences provide numerous theories to be tested and the mathematical and statistical procedures exist in general to do so, at this point, no single unified theory has emerged. It is generally accepted that to explain product choice behavior, products must be described in terms of attributes. Using any one of a number of procedures, it is possible to obtain measurements on the attributes of the products under consideration. However, there is no generally accepted methodology. Given the attribute profiles of two products, in order to explain and predict preference, it is necessary to determine the relative importance of each of the product attributes. Once again, there is no generally accepted methodology. There are two basic approaches: the first, called the attitudinal approach, obtains importance measurements directly from respondents using one of many scaling techniques; the second, termed the inferential method, endeavors to infer importances from product preference and attribute data. Since it is generally felt that respondents are unwilling and/or unable to provide meaningful importance measurements, the inferential method is most widely accepted.
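The inferential method described above can be sketched in its simplest form (attribute names, profiles and weights are all invented): given attribute profiles and observed preference scores, attribute importances are inferred by least squares, here solved directly via the normal equations for two attributes.

```python
# Hypothetical sketch of the inferential method: infer attribute
# importances from preference scores by least squares. Four products
# are described by two made-up attributes; preferences are generated
# from invented true weights (0.7, 0.3), which the fit should recover
# exactly because the data contain no noise.
products = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 0.2)]  # (attr_a, attr_b)
true_w = (0.7, 0.3)
prefs = [true_w[0] * a + true_w[1] * b for a, b in products]

# Normal equations for the 2-attribute least-squares fit w = (X'X)^{-1} X'y.
sxx = sum(a * a for a, _ in products)
syy = sum(b * b for _, b in products)
sxy = sum(a * b for a, b in products)
sa_p = sum(a * p for (a, _), p in zip(products, prefs))
sb_p = sum(b * p for (_, b), p in zip(products, prefs))
det = sxx * syy - sxy * sxy
w1 = (syy * sa_p - sxy * sb_p) / det   # inferred importance of attr_a
w2 = (sxx * sb_p - sxy * sa_p) / det   # inferred importance of attr_b
```

With noisy real preference data the recovered weights would only approximate the true importances, which is where the statistical machinery the book develops comes in.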
This book introduces econometric analysis of cross section, time series and panel data with the application of statistical software. It serves as a basic text for those who wish to learn and apply econometric analysis in empirical research. The level of presentation is as simple as possible to make it useful for undergraduates as well as graduate students. It contains several examples with real data and Stata programmes and interpretation of the results. While discussing the statistical tools needed to understand empirical economic research, the book attempts to provide a balance between theory and applied research. Various concepts and techniques of econometric analysis are supported by carefully developed examples using the statistical software package Stata 15.1; the book assumes that the reader is somewhat familiar with the Stata software. The topics covered in this book are divided into four parts. Part I discusses introductory econometric methods for data analysis that economists and other social scientists use to estimate economic and social relationships, and to test hypotheses about them, using real-world data. The five chapters in this part cover data management issues, the details of linear regression models, and the problems arising from violation of the classical assumptions. Part II discusses some advanced topics used frequently in empirical research with cross section data; its three chapters address some specific problems of regression analysis. Part III deals with time series econometric analysis, covering both univariate and multivariate time series econometric models and their applications, with software programming, in six chapters. Part IV takes up panel data analysis in four chapters, discussing different aspects of fixed effects and random effects models, and extending the treatment to dynamic panel data models, which are most suitable for macroeconomic research.
The book is invaluable for students and researchers of social sciences, business, management, operations research, engineering, and applied mathematics.
Over the past two decades, experimental economics has moved from a fringe activity to become a standard tool for empirical research. With experimental economics now regarded as part of the basic tool-kit for applied economics, this book demonstrates how controlled experiments can be useful in providing evidence relevant to economic research. Professors Jacquemet and L'Haridon take the standard model in applied econometrics as a basis for the methodology of controlled experiments. Methodological discussions are illustrated with standard experimental results. This book provides future experimental practitioners with the means to construct experiments that fit their research question, and newcomers with an understanding of the strengths and weaknesses of controlled experiments. Graduate students and academic researchers working in the field of experimental economics will learn how to undertake, understand and criticise empirical research based on lab experiments, and can refer to specific experiments, results or designs completed with case study applications.
This is the first monograph that discusses in detail the interactions between Fourier analysis and dynamic economic theories, in particular, business cycles. Many economic theories have analyzed cyclical behaviors of economic variables. In this book, the focus is on a couple of trials: (1) the Kaldor theory and (2) the Slutsky effect. The Kaldor theory tries to explain business fluctuations in terms of nonlinear, 2nd-order ordinary differential equations (ODEs). In order to explain periodic behaviors of a solution, the Hopf-bifurcation theorem frequently plays a key role. Slutsky's idea is to look at the periodic movement as an overlapping effect of random shocks. The Slutsky process is a weakly stationary process, the periodic (or almost periodic) behavior of which can be analyzed by the Bochner theorem. The goal of this book is to give a comprehensive and rigorous justification of these ideas. Therefore, the aim is first to give a complete theory that supports the Hopf theorem and to prove the existence of periodic solutions of ODEs; and second to explain the mathematical structure of the Bochner theorem and its relation to periodic (or almost periodic) behaviors of weakly stationary processes. Although these two targets are the principal ones, a large number of results from Fourier analysis must be prepared in order to reach these goals. The basic concepts and results from classical as well as generalized Fourier analysis are provided in a systematic way. Prospective readers are assumed to have sufficient knowledge of real and complex analysis. However, necessary economic concepts are explained in the text, making this book accessible even to readers without a background in economics.
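The Slutsky effect described above is easy to demonstrate in a small simulation (parameters invented): a moving average of pure random shocks produces a smooth, quasi-cyclical series, visible here as strong lag-1 autocorrelation that the raw shocks lack.

```python
import random

# Hypothetical illustration of the Slutsky effect: a 10-term moving
# average of white noise acquires lag-1 autocorrelation of about
# (window - 1) / window = 0.9, while the raw shocks have roughly none.
random.seed(3)
n, window = 20_000, 10
shocks = [random.gauss(0, 1) for _ in range(n)]
smooth = [sum(shocks[t - window:t]) / window for t in range(window, n)]

def lag1_corr(s):
    m = sum(s) / len(s)
    num = sum((s[t] - m) * (s[t - 1] - m) for t in range(1, len(s)))
    den = sum((v - m) ** 2 for v in s)
    return num / den

r_raw = lag1_corr(shocks)   # near 0: white noise is serially uncorrelated
r_ma = lag1_corr(smooth)    # near 0.9: averaging induces smooth quasi-cycles
```

This is exactly the phenomenon the book analyzes rigorously via weakly stationary processes and the Bochner theorem.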