Welcome to Loot.co.za!
A guide to economics, statistics and finance that explores the mathematical foundations underlying econometric methods. An Introduction to Econometric Theory offers a text to help in the mastery of the mathematics that underlies econometric methods, and includes a detailed study of matrix algebra and distribution theory. Designed to be an accessible resource, the text explains in clear language why things are being done and how previous material informs a current argument. The style is deliberately informal, with numbered theorems and lemmas avoided; however, very few technical results are quoted without some form of explanation, demonstration or proof. The author, a noted expert in the field, covers a wealth of topics including: simple regression, basic matrix algebra, the general linear model, distribution theory, the normal distribution, properties of least squares, unbiasedness and efficiency, eigenvalues, statistical inference in regression, t and F tests, the partitioned regression, specification analysis, random regressor theory, introduction to asymptotics and maximum likelihood. Each chapter is supplied with a collection of exercises, some straightforward and others more challenging. This important text:
- Presents a guide for teaching econometric methods to undergraduate and graduate students of economics, statistics or finance
- Offers proven classroom-tested material
- Contains sets of exercises that accompany each chapter
- Includes a companion website that hosts additional materials, a solution manual and lecture slides
Written for undergraduates and graduate students of economics, statistics or finance, An Introduction to Econometric Theory is an essential beginner's guide to the underpinnings of econometrics.
This book investigates the relationship between environmental degradation and income, focusing on carbon dioxide (CO2) emissions from around the world, to explore the possibility of sustainable development under global warming. Although many researchers have tackled this problem by estimating the Environmental Kuznets Curve (EKC), unlike the case of sulfur dioxide emissions, there seems to be little consensus about whether an EKC is formed with regard to CO2 emissions. Thus, the EKC is one of the most controversial issues in the field of environmental economics. This book contributes three points with academic rigor. First, an unbalanced panel dataset containing over 150 countries with the latest CO2 emission data between 1960 and 2010 is constructed. Second, based on this dataset, the CO2 emission-income relationship is analyzed using strict econometric methods such as the dynamic panel model. Third, as it is often pointed out that factors other than income affect CO2 emissions, several variables were added to the estimation model to examine the effects of changes in industrial structure, energy composition, and overseas trade on CO2 emissions.
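The reduced-form EKC regression the blurb alludes to is typically a quadratic in income: an inverted U requires a negative coefficient on squared income, and the implied turning point is where emissions peak. A minimal sketch on simulated data (all numbers here are illustrative assumptions, not the book's dataset):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated country-year observations: log CO2 per capita follows an
# inverted U in log income per capita (peak at log income = 9.375).
log_income = rng.uniform(6.0, 11.0, size=500)
log_co2 = 1.5 * log_income - 0.08 * log_income**2 + rng.normal(0, 0.1, 500)

# Reduced-form EKC regression: log_co2 = b0 + b1*y + b2*y^2 + e.
X = np.column_stack([np.ones_like(log_income), log_income, log_income**2])
b0, b1, b2 = np.linalg.lstsq(X, log_co2, rcond=None)[0]

# An inverted U requires b2 < 0; the turning-point income is exp(-b1/(2*b2)).
print(b2 < 0)
print(round(-b1 / (2 * b2), 2))  # estimated peak, close to 9.38
```

The book's dynamic panel estimators add lagged emissions and country fixed effects, but the turning-point logic is the same.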
The Analytic Hierarchy Process (AHP) has been one of the foremost mathematical methods for decision making with multiple criteria and has been widely studied in the operations research literature as well as applied to solve countless real-world problems. This book is meant to introduce and strengthen the reader's knowledge of the AHP, no matter how familiar they may be with the topic. It provides a concise, yet self-contained, introduction to the AHP that uses a novel and more pedagogical approach. It begins with an introduction to the principles of the AHP, covering the critical points of the method as well as some of its applications. Next, the book explores further aspects of the method, including the derivation of the priority vector, the estimation of inconsistency, and the use of the AHP for group decisions. Each of these is introduced by relaxing initial assumptions. Furthermore, the book covers extensions of the AHP, which are typically neglected in elementary expositions of the method. Such extensions concern different numerical representations of preferences and the interval and fuzzy representations of preferences to account for uncertainty. Throughout the exposition, an eye is kept on the most recent developments of the method.
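The two steps the blurb highlights, deriving the priority vector and estimating inconsistency, can be sketched in a few lines. This uses the geometric-mean method for the priorities and Saaty's consistency ratio; the 3x3 comparison matrix is a made-up example:

```python
import numpy as np

# Hypothetical pairwise comparison matrix: A[i, j] says how strongly
# criterion i is preferred over criterion j on Saaty's 1-9 scale,
# with reciprocal entries below the diagonal.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])
n = A.shape[0]

# Priority vector via the geometric-mean (logarithmic least squares) method.
w = np.prod(A, axis=1) ** (1.0 / n)
w = w / w.sum()  # normalize so priorities sum to 1

# Inconsistency: approximate the principal eigenvalue from A w = lambda w,
# then Saaty's consistency index CI and ratio CR (random index RI = 0.58
# for n = 3); CR < 0.1 is conventionally acceptable.
lambda_max = np.mean((A @ w) / w)
cr = ((lambda_max - n) / (n - 1)) / 0.58

print(np.round(w, 3))  # largest priority goes to criterion 0
print(cr < 0.1)
```

For a perfectly consistent matrix, lambda_max equals n exactly and CR is zero; the eigenvector and geometric-mean priorities then coincide.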
Written for a broad audience, this book offers a comprehensive account of early warning systems for hydro-meteorological disasters such as floods and storms, and for geological disasters such as earthquakes. One major theme is the increasingly important role in early warning systems played by the rapidly evolving fields of space and information technology. The authors, all experts in their respective fields, offer a comprehensive and in-depth insight into the current and future perspectives for early warning systems. The text is aimed at decision-makers in the political arena, scientists, engineers and those responsible for public communication and dissemination of warnings.
This book investigates the existence of stochastic and deterministic convergence of real output per worker and the sources of output (physical capital per worker, human capital per worker, total factor productivity (TFP) and average annual hours worked) in 21 OECD countries over the period 1970-2011. Towards this end, the authors apply a large battery of panel unit root and stationarity tests, all of which are robust to the presence of cross-sectional dependence. The evidence fails to provide clear-cut evidence of convergence dynamics either in real GDP per worker or in the series of the sources of output. Due to some limitations associated with second-generation panel unit root and stationarity tests, the authors further use the more flexible PANIC approach, which provides evidence that real GDP per worker, real physical capital per worker, human capital and average annual hours exhibit some degree of deterministic convergence, whereas TFP series display a high degree of stochastic convergence.
This volume addresses advanced DEA methodology and techniques developed for modeling unique new performance evaluation issues. Many numerical examples, real management cases and verbal descriptions make it very valuable for researchers and practitioners.
This book presents modern developments in time series econometrics that are applied to macroeconomic and financial time series, bridging the gap between methods and realistic applications. It presents the most important approaches to the analysis of time series, which may be stationary or nonstationary. Modelling and forecasting univariate time series is the starting point. For multiple stationary time series, Granger causality tests and vector autoregressive models are presented. As the modelling of nonstationary uni- or multivariate time series is most important for real applied work, unit root and cointegration analysis as well as vector error correction models are a central topic. Tools for analysing nonstationary data are then transferred to the panel framework. Modelling the (multivariate) volatility of financial time series with autoregressive conditional heteroskedastic models is also treated.
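The Granger causality test mentioned above reduces to an F-test: regress y on its own lag, then add the lag of x, and test whether the fit improves. A minimal one-lag sketch on simulated data (coefficients and seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated series in which x Granger-causes y through one lag.
T = 400
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

def rss(X, z):
    """Residual sum of squares of an OLS fit of z on X."""
    beta = np.linalg.lstsq(X, z, rcond=None)[0]
    e = z - X @ beta
    return e @ e

# Restricted model: y_t on constant + y_{t-1}.
# Unrestricted model: add x_{t-1}; F-test the single extra regressor.
Y, Y1, X1 = y[1:], y[:-1], x[:-1]
Xr = np.column_stack([np.ones(T - 1), Y1])
Xu = np.column_stack([np.ones(T - 1), Y1, X1])
F = (rss(Xr, Y) - rss(Xu, Y)) / (rss(Xu, Y) / (T - 1 - 3))

print(F > 4.0)  # a large F rejects "x does not Granger-cause y"
```

In practice the lag length is chosen by an information criterion and the test generalizes to a full VAR; this sketch shows only the core restricted-vs-unrestricted comparison.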
One of the best known statisticians of the 20th century, Frederick Mosteller has inspired numerous statisticians and other scientists by his creative approach to statistics and its applications. This volume collects 40 of his most original and influential papers, capturing the variety and depth of his writings. It is hoped that sharing these writings with a new generation of researchers will inspire them to build upon his insights and efforts.
In recent years, as part of the increasing "informationization" of industry and the economy, enterprises have been accumulating vast amounts of detailed data, such as high-frequency transaction data in financial markets and point-of-sale information on individual items in the retail sector. Similarly, vast amounts of data are now available on business networks based on inter-firm transactions and shareholdings. In the past, these types of information were studied only by economists and management scholars. More recently, however, researchers from other fields, such as physics, mathematics, and information sciences, have become interested in this kind of data and, based on novel empirical approaches to searching for regularities and "laws" akin to those in the natural sciences, have produced intriguing results. This book is the proceedings of the international conference THICCAPFA7, titled "New Approaches to the Analysis of Large-Scale Business and Economic Data," held in Tokyo, March 1-5, 2009. The letters THIC denote the Tokyo Tech (Tokyo Institute of Technology)-Hitotsubashi Interdisciplinary Conference. The conference series, titled APFA (Applications of Physics in Financial Analysis), focuses on the analysis of large-scale economic data. It has traditionally brought physicists and economists together to exchange viewpoints and experience (APFA1 in Dublin 1999, APFA2 in Liège 2000, APFA3 in London 2001, APFA4 in Warsaw 2003, APFA5 in Torino 2006, and APFA6 in Lisbon 2007). The aim of the conference is to establish fundamental analytical techniques and data collection methods, taking into account the results from a variety of academic disciplines.
The book examines applications in two disparate fields linked by the importance of valuing information: public health and space. Researchers in the health field have developed some of the most innovative methodologies for valuing information, used to help determine, for example, the value of diagnostics in informing patient treatment decisions. In the field of space, recent applications of value-of-information methods are critical for informing decisions on investment in satellites that collect data about air quality, fresh water supplies, climate and other natural and environmental resources affecting global health and quality of life.
The Analytic Network Process (ANP), developed by Thomas Saaty in his work on multicriteria decision making, applies network structures with dependence and feedback to complex decision making. This new edition of Decision Making with the Analytic Network Process is a selection of the latest applications of the ANP to economic, social and political decisions, and also to technological design. The ANP is a methodological tool that helps to organize knowledge and thinking, elicit judgments registered both in memory and in feelings, quantify the judgments and derive priorities from them, and finally synthesize these diverse priorities into a single mathematically and logically justifiable overall outcome. In the process of deriving this outcome, the ANP also allows for the representation and synthesis of diverse opinions in the midst of discussion and debate. The book focuses on the application of the ANP in three different areas: economics, the social sciences and the linking of measurement with human values. Economists can use the ANP as an alternative approach for dealing with economic problems to the usual mathematical models on which economics bases its quantitative thinking. For psychologists, sociologists and political scientists, the ANP offers the methodology they have sought for some time to quantify and derive measurements for intangibles. Finally, the book applies the ANP to provide people in the physical and engineering sciences with a quantitative method to link hard measurement to human values. In such a process, one is able to interpret the true meaning of measurements made on a uniform scale using a unit.
- Up-to-date coverage of most micro-econometric topics; first half parametric, second half semi- and non-parametric
- Many empirical examples and tips in applying econometric theories to data
- Essential ideas and steps shown for most estimators and tests; well-suited for both applied and theoretical readers
In macro-econometrics more attention needs to be paid to the relationships among deterministic trends of different variables, or co-trending, especially when economic growth is of concern. The number of relationships, i.e., the co-trending rank, plays an important role in evaluating the veracity of propositions, particularly relating to the Japanese economic growth in view of the structural changes involved within it. This book demonstrates how to determine the co-trending rank from a given set of time series data for different variables. At the same time, the method determines how many of the co-trending relations also represent cointegrations. This enables us to perform statistical inference on the parameters of relations among the deterministic trends. Co-trending is an important contribution to the fields of econometric methods, macroeconomics, and time series analyses.
This book investigates whether the effects of economic integration differ according to the size of countries. The analysis incorporates a classification of the size of countries, reflecting the key economic characteristics of economies, in order to provide an appropriate benchmark for each size group in the empirical analysis of the effects of asymmetric economic integration. The formation or extension of Preferential Trade Areas (PTAs) leads to a reduction in trade costs. This poses a critical secondary question as to the extent to which trade costs differ according to the size of countries. The extent to which membership of PTAs has an asymmetric impact on trade flows according to the size of member countries is analyzed by employing econometric tools and general equilibrium analysis, estimating both the ex-post and ex-ante effects of economic integration by the size of countries, using a data set of 218 countries, 45 of which are European.
The Oxford Handbook of Computational Economics and Finance provides a survey of both the foundations of and recent advances in the frontiers of analysis and action. It is both historically and interdisciplinarily rich and also tightly connected to the rise of digital society. It begins with the conventional view of computational economics, including recent algorithmic development in computing rational expectations, volatility, and general equilibrium. It then moves from traditional computing in economics and finance to recent developments in natural computing, including applications of nature-inspired intelligence, genetic programming, swarm intelligence, and fuzzy logic. Also examined are recent developments of network and agent-based computing in economics. How these approaches are applied is examined in chapters on such subjects as trading robots and automated markets. The last part deals with the epistemology of simulation in its trinity form with the integration of simulation, computation, and dynamics. Distinctive is the focus on natural computationalism and the examination of the implications of intelligent machines for the future of computational economics and finance. Not merely individual robots, but whole integrated systems are extending their "immigration" to the world of Homo sapiens, or symbiogenesis.
This volume uses state of the art models from the frontier of macroeconomics to answer key questions about how the economy functions and how policy should be conducted. The contributions cover a wide range of issues in macroeconomics and macroeconomic policy. They combine high level mathematics with economic analysis, and highlight the need to update our mathematical toolbox in order to understand the increased complexity of the macroeconomic environment. The volume represents hard evidence of high research intensity in many fields of macroeconomics, and warns against interpreting the scope of macroeconomics too narrowly. The mainstream business cycle analysis, based on dynamic stochastic general equilibrium (DSGE) modelling of a particular type, has been criticised for its inability to predict or resolve the recent financial crisis. However, macroeconomic research on financial, information, and learning imperfections had not yet made their way into many of the pre-crisis DSGE models because practical econometric versions of those models were mainly designed to fit data periods that did not include financial crises. A major response to the limitations of those older DSGE models is an active research program to bring big financial shocks and various kinds of financial, learning, and labour market frictions into a new generation of DSGE models for guiding policy. The contributors to this book utilise models and modelling assumptions that go beyond particular modelling conventions. By using alternative yet plausible assumptions, they seek to enrich our knowledge and ability to explain macroeconomic phenomena. They contribute to expanding the frontier of macroeconomic knowledge in ways that will prove useful for macroeconomic policy.
This volume is dedicated to two recent intensive areas of research in the econometrics of panel data, namely nonstationary panels and dynamic panels. It includes a comprehensive survey of the nonstationary panel literature, including panel unit root tests, spurious panel regressions and panel cointegration.
Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of estimation of distribution algorithms (EDAs). EDAs have arisen as one of the most successful experiences in the application of machine learning methods in optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs, but it also fills an important current void in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov network-based EDAs are reviewed in the book. Current research trends and future perspectives in the enhancement and applicability of EDAs are also covered. The contributions included in the book address topics as relevant as the application of probabilistic-based fitness models, the use of belief propagation algorithms in EDAs and the application of Markov network-based EDAs to real-world optimization problems. The book should be of interest to researchers and practitioners from areas such as optimization, evolutionary computation, and machine learning.
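The sample-select-learn loop common to all EDAs can be seen in the simplest member of the family. This sketch uses a univariate model (UMDA) rather than the Markov-network models the book covers, on the toy OneMax problem; population sizes and rates are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)

# UMDA on OneMax: maximize the number of ones in a 20-bit string.
# The "model" is just an independent Bernoulli probability per bit;
# Markov-network EDAs replace this with an undirected graphical model.
n, pop, elite = 20, 100, 30
p = np.full(n, 0.5)  # initial probability model over bits

for _ in range(40):
    X = (rng.random((pop, n)) < p).astype(int)     # sample a population
    best = X[np.argsort(X.sum(axis=1))[-elite:]]   # select the fittest
    p = 0.9 * p + 0.1 * best.mean(axis=0)          # re-estimate the model
    p = p.clip(0.05, 0.95)                         # keep some diversity

print(round(p.mean(), 2))  # model has shifted strongly toward all-ones
```

The interesting step for the book's subject is the model re-estimation: learning an undirected dependency structure from the selected individuals instead of independent marginals.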
This book focuses on general frameworks for modeling heavy-tailed distributions in economics, finance, econometrics, statistics, risk management and insurance. A central theme is that of (non-)robustness, i.e., the fact that the presence of heavy tails can either reinforce or reverse the implications of a number of models in these fields, depending on the degree of heavy-tailedness. These results motivate the development and applications of robust inference approaches under heavy tails, heterogeneity and dependence in observations. Several recently developed robust inference approaches are discussed and illustrated, together with applications.
The book's comprehensive coverage on the application of econometric methods to empirical analysis of economic issues is impressive. It uncovers the missing link between textbooks on economic theory and econometrics and highlights the powerful connection between economic theory and empirical analysis perfectly through examples on rigorous experimental design. The use of data sets for estimation derived with the Monte Carlo method helps facilitate the understanding of the role of hypothesis testing applied to economic models. Topics covered in the book are: consumer behavior, producer behavior, market equilibrium, macroeconomic models, qualitative-response models, panel data analysis and time-series analysis. Key econometric models are introduced, specified, estimated and evaluated. The treatment on methods of estimation in econometrics and the discipline of hypothesis testing makes it a must-have for graduate students of economics and econometrics and aids their understanding on how to estimate economic models and evaluate the results in terms of policy implications.
The interaction between mathematicians and statisticians has proved to be an effective approach to the analysis of insurance and financial problems, in particular from an operative perspective. The Maf2006 conference, held at the University of Salerno in 2006, had precisely this purpose, and the collection published here gathers some of the papers presented at the conference and subsequently revised for this aim. They cover a wide variety of subjects in the insurance and financial fields.
The Analytic Hierarchy Process (AHP) is a prominent and powerful tool for making decisions in situations involving multiple objectives. Models, Methods, Concepts and Applications of the Analytic Hierarchy Process, 2nd Edition applies the AHP in order to solve problems focused on the following three themes: economics, the social sciences, and the linking of measurement with human values. For economists, the AHP offers a substantially different approach to dealing with economic problems through ratio scales. Psychologists and political scientists can use the methodology to quantify and derive measurements for intangibles. Meanwhile, researchers in the physical and engineering sciences can apply the AHP methods to help resolve the conflicts between hard measurement data and human values. Throughout the book, each of these topics is explored utilizing real-life models and examples relevant to problems in today's society. This new edition has been updated and includes five new chapters that include discussions of the following:
- The eigenvector and why it is necessary
- A summary of ongoing research in the Middle East that brings together Israeli and Palestinian scholars to develop concessions from both parties
- A look at the Medicare crisis and how the AHP can be used to understand the problems and help develop ideas to solve them
This book, which was first published in 1980, is concerned with one particular branch of growth theory, namely descriptive growth theory. It is typically assumed in growth theory that both the factors and goods market are perfectly competitive. In particular this implies amongst other things that the reward to each factor is identical in each sector of the economy. In this book the assumption of identical factor rewards is relaxed and the implications of an intersectoral wage differential for economic growth are analysed. There is also some discussion on the short-term and long-run effects of minimum wage legislation on growth. This book will serve as key reading for students of economics.
Taxpayer compliance is a voluntary activity, and the degree to which the tax system works is affected by taxpayers' knowledge that it is their moral and legal responsibility to pay their taxes. Taxpayers also recognize that they face a lottery in which not all taxpayer noncompliance will ever be detected. In the United States most individuals comply with the tax law, yet the tax gap has grown significantly over time for individual taxpayers. The US Internal Revenue Service attempts to ensure that the minority of taxpayers who are noncompliant pay their fair share with a variety of enforcement tools and penalties. The Causes and Consequences of Income Tax Noncompliance provides a comprehensive summary of the empirical evidence concerning taxpayer noncompliance and presents innovative research with new results on the role of IRS audit and enforcements activities on compliance with federal and state income tax collection. Other issues examined include to what degree taxpayers respond to the threat of civil and criminal enforcement and the important role of the media on taxpayer compliance. This book offers researchers, students, and tax administrators insight into the allocation of taxpayer compliance enforcement and service resources, and suggests policies that will prevent further increases in the tax gap. The book's aggregate data analysis methods have practical applications not only to taxpayer compliance but also to other forms of economic behavior, such as welfare fraud.
Connections among different assets, asset classes, portfolios, and the stocks of individual institutions are critical in examining financial markets. Interest in financial markets implies interest in underlying macroeconomic fundamentals. In Financial and Macroeconomic Connectedness, Frank Diebold and Kamil Yilmaz propose a simple framework for defining, measuring, and monitoring connectedness, which is central to finance and macroeconomics. These measures of connectedness are theoretically rigorous yet empirically relevant. The approach to connectedness proposed by the authors is intimately related to the familiar econometric notion of variance decomposition. The full set of variance decompositions from vector autoregressions produces the core of the 'connectedness table.' The connectedness table makes clear how one can begin with the most disaggregated pairwise directional connectedness measures and aggregate them in various ways to obtain total connectedness measures. The authors also show that variance decompositions define weighted, directed networks, so that these proposed connectedness measures are intimately related to key measures of connectedness used in the network literature. After describing their methods in the first part of the book, the authors proceed to characterize daily return and volatility connectedness across major asset (stock, bond, foreign exchange and commodity) markets as well as across financial institutions within the U.S. and across countries since the late 1990s. These specific measures of volatility connectedness show that stock markets played a critical role in spreading the volatility shocks from the U.S. to other countries. Furthermore, while the return connectedness across stock markets increased gradually over time, the volatility connectedness measures were subject to significant jumps during major crisis events. This book examines not only financial connectedness, but also real fundamental connectedness. In particular, the authors show that global business cycle connectedness is economically significant and time-varying, that the U.S. has disproportionately high connectedness to others, and that pairwise country connectedness is inversely related to bilateral trade surpluses.
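The aggregation step from a variance decomposition matrix to a connectedness table is purely mechanical and can be sketched directly. The 3-asset decomposition matrix below is a made-up illustration, not data from the book:

```python
import numpy as np

# Hypothetical forecast-error variance decomposition from a VAR:
# D[i, j] = share of asset i's H-step forecast-error variance due to
# shocks to asset j (rows sum to 1 after normalization).
D = np.array([
    [0.80, 0.15, 0.05],
    [0.10, 0.75, 0.15],
    [0.05, 0.20, 0.75],
])
n = D.shape[0]

# Off-diagonal entries are the pairwise directional connectedness measures.
off_diag = D - np.diag(np.diag(D))

from_others = off_diag.sum(axis=1)  # directional connectedness "from others"
to_others = off_diag.sum(axis=0)    # directional connectedness "to others"
total = off_diag.sum() / n          # total (system-wide) connectedness

print(np.round(from_others, 2))  # row sums of off-diagonal shares
print(np.round(to_others, 2))    # column sums of off-diagonal shares
print(round(total, 4))           # 0.2333 for this matrix
```

Note the bookkeeping identity: the "from" and "to" measures have the same grand sum, since both aggregate the same off-diagonal entries; asymmetry between an asset's "to" and "from" values is what identifies net transmitters of shocks.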
You may like...
Introductory Econometrics - A Modern… | Jeffrey Wooldridge | Hardcover
Financial and Macroeconomic… | Francis X. Diebold, Kamil Yilmaz | Hardcover | R3,567
The Oxford Handbook of the Economics of… | Yann Bramoulle, Andrea Galeotti, … | Hardcover | R5,455
Ranked Set Sampling - 65 Years Improving… | Carlos N. Bouza-Herrera, Amer Ibrahim Falah Al-Omari | Paperback
Applied Econometric Analysis - Emerging… | Brian W Sloboda, Yaya Sissoko | Hardcover | R5,351
Handbook of Experimental Game Theory | C. M. Capra, Rachel T. A. Croson, … | Hardcover | R7,224
Design and Analysis of Time Series… | Richard McCleary, David McDowall, … | Hardcover | R3,286