This book investigates the relationship between environmental degradation and income, focusing on carbon dioxide (CO2) emissions from around the world, to explore the possibility of sustainable development under global warming. Although many researchers have tackled this problem by estimating the Environmental Kuznets Curve (EKC), there is, unlike in the case of sulfur dioxide emissions, little consensus about whether an EKC holds for CO2 emissions; the EKC thus remains one of the most controversial issues in environmental economics. The book makes three rigorous contributions. First, an unbalanced panel dataset covering over 150 countries, with the latest CO2 emission data for 1960 to 2010, is constructed. Second, based on this dataset, the CO2 emission-income relationship is analyzed using rigorous econometric methods such as the dynamic panel model. Third, because factors other than income are often noted to affect CO2 emissions, several variables are added to the estimation model to examine the effects of changes in industrial structure, energy composition, and overseas trade on CO2 emissions.
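To make the EKC idea concrete, here is a minimal sketch (not the book's model): a static fixed-effects regression of log CO2 emissions on log income and its square, run on synthetic data with NumPy. The book itself estimates dynamic panel models on real data; only the inverted-U specification is being illustrated, and all variable names and numbers below are made up.

```python
# Minimal sketch (not the book's model): a static fixed-effects EKC regression
# on synthetic data. The quadratic income term is the only point illustrated.
import numpy as np

rng = np.random.default_rng(0)
n_countries, n_years = 50, 40
country = np.repeat(np.arange(n_countries), n_years)

log_income = rng.normal(8.0, 1.0, size=n_countries * n_years)
# Synthetic inverted-U relationship plus country effects and noise.
alpha = rng.normal(0.0, 0.5, size=n_countries)[country]
log_co2 = alpha + 1.2 * log_income - 0.06 * log_income**2 + rng.normal(0, 0.1, size=log_income.size)

def within_demean(x, groups):
    """Subtract group (country) means -- the fixed-effects 'within' transform."""
    out = x.astype(float).copy()
    for g in np.unique(groups):
        mask = groups == g
        out[mask] -= out[mask].mean()
    return out

y = within_demean(log_co2, country)
X = np.column_stack([within_demean(log_income, country),
                     within_demean(log_income**2, country)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b1, b2 = beta
print(f"income: {b1:.3f}, income^2: {b2:.3f}")
if b2 < 0:
    print(f"inverted U; implied turning point at log income {-b1 / (2 * b2):.2f}")
```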
Originally published in 1939, this book forms the second part of a two-volume series on the mathematics required for the examinations of the Institute of Actuaries, focusing on finite differences, probability and elementary statistics. Miscellaneous examples are included at the end of the text. This book will be of value to anyone with an interest in actuarial science and mathematics.
The Analytic Hierarchy Process (AHP) has been one of the foremost mathematical methods for decision making with multiple criteria and has been widely studied in the operations research literature as well as applied to solve countless real-world problems. This book is meant to introduce and strengthen the readers' knowledge of the AHP, no matter how familiar they may be with the topic. This book provides a concise, yet self-contained, introduction to the AHP that uses a novel and more pedagogical approach. It begins with an introduction to the principles of the AHP, covering the critical points of the method, as well as some of its applications. Next, the book explores further aspects of the method, including the derivation of the priority vector, the estimation of inconsistency, and the use of AHP for group decisions. Each of these is introduced by relaxing initial assumptions. Furthermore, this booklet covers extensions of AHP, which are typically neglected in elementary expositions of the methods. Such extensions concern different numerical representations of preferences and the interval and fuzzy representations of preferences to account for uncertainty. During the whole exposition, an eye is kept on the most recent developments of the method.
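As a rough illustration of the priority-vector derivation and inconsistency estimation described above, the following NumPy sketch computes the principal eigenvector of a pairwise comparison matrix and Saaty's consistency ratio. The 3x3 matrix is invented for the example; this is generic AHP arithmetic, not code from the book.

```python
# Core AHP computation: priority vector as the principal right eigenvector of a
# reciprocal pairwise comparison matrix, plus Saaty's consistency index/ratio.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])          # reciprocal pairwise comparisons (made up)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)               # principal eigenvalue lambda_max
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # normalised priority vector

n = A.shape[0]
lam_max = eigvals[k].real
ci = (lam_max - n) / (n - 1)              # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]       # Saaty's tabulated random index
cr = ci / ri                              # consistency ratio; < 0.1 is usually acceptable

print("priorities:", np.round(w, 3))
print("lambda_max:", round(lam_max, 3), "CI:", round(ci, 3), "CR:", round(cr, 3))
```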
Written for a broad audience, this book offers a comprehensive account of early warning systems for hydro-meteorological disasters such as floods and storms, and for geological disasters such as earthquakes. One major theme is the increasingly important role in early warning systems played by the rapidly evolving fields of space and information technology. The authors, all experts in their respective fields, offer a comprehensive and in-depth insight into the current and future perspectives for early warning systems. The text is aimed at decision-makers in the political arena, scientists, engineers and those responsible for public communication and dissemination of warnings.
This book investigates the existence of stochastic and deterministic convergence of real output per worker and the sources of output (physical capital per worker, human capital per worker, total factor productivity -TFP- and average annual hours worked) in 21 OECD countries over the period 1970-2011. Towards this end, the authors apply a large battery of panel unit root and stationarity tests, all of which are robust to the presence of cross-sectional dependence. The evidence fails to provide clear-cut evidence of convergence dynamics either in real GDP per worker or in the series of the sources of output. Due to some limitations associated with second-generation panel unit root and stationarity tests, the authors further use the more flexible PANIC approach which provides evidence that real GDP per worker, real physical capital per worker, human capital and average annual hours exhibit some degree of deterministic convergence, whereas TFP series display a high degree of stochastic convergence.
This volume addresses advanced DEA methodology and techniques developed for modeling unique new performance evaluation issues. Many numerical examples, real management cases and verbal descriptions make it very valuable for researchers and practitioners.
This book presents modern developments in time series econometrics that are applied to macroeconomic and financial time series, bridging the gap between methods and realistic applications. It presents the most important approaches to the analysis of time series, which may be stationary or nonstationary. Modelling and forecasting univariate time series is the starting point. For multiple stationary time series, Granger causality tests and vector autoregressive models are presented. As the modelling of nonstationary uni- or multivariate time series is most important for real applied work, unit root and cointegration analysis as well as vector error correction models are a central topic. Tools for analysing nonstationary data are then transferred to the panel framework. Modelling the (multivariate) volatility of financial time series with autoregressive conditional heteroskedastic models is also treated.
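To connect the methods named above with practice, here is a short example of a unit root test and a cointegration test on synthetic random walks, using functions from statsmodels. This is generic usage on simulated data, not material from the book.

```python
# ADF unit root test and Engle-Granger cointegration test on synthetic data.
import numpy as np
from statsmodels.tsa.stattools import adfuller, coint

rng = np.random.default_rng(1)
T = 500
x = np.cumsum(rng.normal(size=T))          # a random walk: nonstationary
y = 2.0 * x + rng.normal(size=T)           # cointegrated with x by construction

adf_stat, adf_p, *_ = adfuller(x)
print(f"ADF on x: stat={adf_stat:.2f}, p={adf_p:.3f} (cannot reject a unit root)")

eg_stat, eg_p, _ = coint(y, x)
print(f"Engle-Granger y~x: stat={eg_stat:.2f}, p={eg_p:.3f} (evidence of cointegration)")
```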
One of the best known statisticians of the 20th century, Frederick Mosteller has inspired numerous statisticians and other scientists by his creative approach to statistics and its applications. This volume collects 40 of his most original and influential papers, capturing the variety and depth of his writings. It is hoped that sharing these writings with a new generation of researchers will inspire them to build upon his insights and efforts.
In recent years, as part of the increasing "informationization" of industry and the economy, enterprises have been accumulating vast amounts of detailed data such as high-frequency transaction data in financial markets and point-of-sale information on individual items in the retail sector. Similarly, vast amounts of data are now available on business networks based on inter-firm transactions and shareholdings. In the past, these types of information were studied only by economists and management scholars. More recently, however, researchers from other fields, such as physics, mathematics, and information sciences, have become interested in this kind of data and, based on novel empirical approaches to searching for regularities and "laws" akin to those in the natural sciences, have produced intriguing results. This book is the proceedings of the international conference THIC-APFA7 that was titled "New Approaches to the Analysis of Large-Scale Business and Economic Data," held in Tokyo, March 1-5, 2009. The letters THIC denote the Tokyo Tech (Tokyo Institute of Technology)-Hitotsubashi Interdisciplinary Conference. The conference series, titled APFA (Applications of Physics in Financial Analysis), focuses on the analysis of large-scale economic data. It has traditionally brought physicists and economists together to exchange viewpoints and experience (APFA1 in Dublin 1999, APFA2 in Liège 2000, APFA3 in London 2001, APFA4 in Warsaw 2003, APFA5 in Torino 2006, and APFA6 in Lisbon 2007). The aim of the conference is to establish fundamental analytical techniques and data collection methods, taking into account the results from a variety of academic disciplines.
The book examines applications in two disparate fields linked by the importance of valuing information: public health and space. Researchers in the health field have developed some of the most innovative methodologies for valuing information, used to help determine, for example, the value of diagnostics in informing patient treatment decisions. In the field of space, recent applications of value-of-information methods are critical for informing decisions on investment in satellites that collect data about air quality, fresh water supplies, climate and other natural and environmental resources affecting global health and quality of life.
This volume presents original and up-to-date studies in unobserved components (UC) time series models from both theoretical and methodological perspectives. It also presents empirical studies where the UC time series methodology is adopted. Drawing on the intellectual influence of Andrew Harvey, the work covers three main topics: the theory and methodology for unobserved components time series models; applications of unobserved components time series models; and time series econometrics and estimation and testing. These types of time series models have seen wide application in economics, statistics, finance, climate change, engineering, biostatistics, and sports statistics. The volume provides a key review of relevant research directions for UC time series econometrics and will be of interest to econometricians, time series statisticians, and practitioners (government, central banks, business) in time series analysis and forecasting, as well as to researchers and graduate students in statistics, econometrics, and engineering.
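As a small illustration of the simplest unobserved components specification, the local level model, here is a hand-rolled Kalman filter run on synthetic data. The noise variances are treated as known for brevity; in practice they would be estimated by maximum likelihood. This is a generic sketch, not an example from the volume.

```python
# Local level model: y_t = level_t + eps_t, level_{t+1} = level_t + eta_t,
# filtered with a minimal Kalman recursion (variances assumed known here).
import numpy as np

rng = np.random.default_rng(2)
T, sigma_eps2, sigma_eta2 = 200, 1.0, 0.1
level = np.cumsum(rng.normal(scale=np.sqrt(sigma_eta2), size=T))   # random-walk level
y = level + rng.normal(scale=np.sqrt(sigma_eps2), size=T)          # noisy observations

a, p = 0.0, 1e6                    # diffuse initial state mean and variance
filtered = np.empty(T)
for t in range(T):
    v = y[t] - a                   # one-step-ahead prediction error
    f = p + sigma_eps2             # prediction-error variance
    k = p / f                      # Kalman gain
    a = a + k * v                  # filtered level estimate
    p = p * (1 - k) + sigma_eta2   # updated variance plus next-period transition noise
    filtered[t] = a

print("RMSE of filtered level vs truth:", round(np.sqrt(np.mean((filtered - level) ** 2)), 3))
```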
The Analytic Network Process (ANP), developed by Thomas Saaty in his work on multicriteria decision making, applies network structures with dependence and feedback to complex decision making. This new edition of Decision Making with the Analytic Network Process is a selection of the latest applications of ANP to economic, social and political decisions, and also to technological design. The ANP is a methodological tool that helps to organize knowledge and thinking, elicit judgments registered both in memory and in feelings, quantify those judgments and derive priorities from them, and finally synthesize these diverse priorities into a single, mathematically and logically justifiable overall outcome. In the process of deriving this outcome, the ANP also allows for the representation and synthesis of diverse opinions in the midst of discussion and debate. The book focuses on the application of the ANP in three different areas: economics, the social sciences and the linking of measurement with human values. Economists can use the ANP as an alternative to the usual mathematical models on which economics bases its quantitative thinking when dealing with economic problems. For psychologists, sociologists and political scientists, the ANP offers the methodology they have sought for some time to quantify and derive measurements for intangibles. Finally, the book applies the ANP to provide people in the physical and engineering sciences with a quantitative method to link hard measurement to human values. In such a process, one is able to interpret the true meaning of measurements made on a uniform scale using a unit.
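One mechanical step behind the ANP's synthesis can be illustrated by raising a column-stochastic (weighted) supermatrix to increasing powers until its columns converge to limiting priorities. The small matrix below is invented for illustration and glosses over cluster weighting and judgment elicitation; it is not taken from the book.

```python
# Limiting priorities from a toy 3x3 weighted supermatrix (columns sum to 1).
import numpy as np

W = np.array([[0.0, 0.6, 0.4],
              [0.7, 0.0, 0.6],
              [0.3, 0.4, 0.0]])      # invented column-stochastic supermatrix

P = W.copy()
for _ in range(200):                 # iterate powers until the product stabilises
    P_next = P @ W
    if np.allclose(P_next, P, atol=1e-12):
        break
    P = P_next

print("limiting priorities (any column):", np.round(P[:, 0], 4))
```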
Up-to-date coverage of most micro-econometric topics: the first half parametric, the second half semi- and non-parametric. Many empirical examples and tips on applying econometric theories to data. Essential ideas and steps are shown for most estimators and tests, making the book well suited for both applied and theoretical readers.
In macro-econometrics more attention needs to be paid to the relationships among deterministic trends of different variables, or co-trending, especially when economic growth is of concern. The number of relationships, i.e., the co-trending rank, plays an important role in evaluating the veracity of propositions, particularly relating to the Japanese economic growth in view of the structural changes involved within it. This book demonstrates how to determine the co-trending rank from a given set of time series data for different variables. At the same time, the method determines how many of the co-trending relations also represent cointegrations. This enables us to perform statistical inference on the parameters of relations among the deterministic trends. Co-trending is an important contribution to the fields of econometric methods, macroeconomics, and time series analyses.
A unique and comprehensive source of information, this book is the only international publication providing economists, planners, policy makers and business people with worldwide statistics on current performance and trends in the manufacturing sector. The Yearbook is designed to facilitate international comparisons relating to manufacturing activity and industrial development and performance. It provides data which can be used to analyze patterns of growth and related long-term trends, structural change and industrial performance in individual industries. Statistics on employment patterns, wages, consumption and gross output and other key indicators are also presented. Contents: Introduction; Part I: Summary Tables (1.1 The Manufacturing Sector; 1.2 The Manufacturing Branches); Part II: Country Tables.
This workbook consists of exercises taken from Likelihood-Based Inference in Cointegrated Vector Autoregressive Models by Søren Johansen, together with worked-out solutions.
Quantile regression constitutes an ensemble of statistical techniques intended to estimate and draw inferences about conditional quantile functions. Median regression, as introduced in the 18th century by Boscovich and Laplace, is a special case. In contrast to conventional mean regression, which minimizes sums of squared residuals, median regression minimizes sums of absolute residuals; quantile regression simply replaces the symmetric absolute loss by an asymmetric linear loss. Since its introduction in the 1970s by Koenker and Bassett, quantile regression has been gradually extended to a wide variety of data analytic settings including time series, survival analysis, and longitudinal data. By focusing attention on local slices of the conditional distribution of response variables, it is capable of providing a more complete, more nuanced view of heterogeneous covariate effects. Applications of quantile regression can now be found throughout the sciences, including astrophysics, chemistry, ecology, economics, finance, genomics, medicine, and meteorology. Software for quantile regression is now widely available in all the major statistical computing environments. The objective of this volume is to provide a comprehensive review of recent developments in quantile regression methodology, illustrating its applicability in a wide range of scientific settings. The intended audience of the volume is researchers and graduate students across a diverse set of disciplines.
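The asymmetric linear ("check") loss mentioned above can be shown in a few lines: the sketch below estimates conditional quantile lines on synthetic heteroscedastic data by minimising the check loss numerically with SciPy. It is a generic illustration, not any package's production implementation.

```python
# Quantile regression via direct minimisation of the check (pinball) loss.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 1000
x = rng.uniform(0, 10, n)
y = 1.0 + 0.5 * x + (0.2 + 0.1 * x) * rng.normal(size=n)   # spread grows with x

def check_loss(params, tau):
    a, b = params
    u = y - (a + b * x)                       # residuals from the candidate line
    return np.sum(u * (tau - (u < 0)))        # rho_tau(u) = u * (tau - 1{u<0})

for tau in (0.1, 0.5, 0.9):
    res = minimize(check_loss, x0=[0.0, 0.0], args=(tau,), method="Nelder-Mead")
    a, b = res.x
    print(f"tau={tau}: intercept={a:.2f}, slope={b:.2f}")
```

Because the spread of y grows with x, the estimated slope rises with the quantile level, which is exactly the kind of heterogeneous covariate effect the blurb describes.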
This book investigates whether the effects of economic integration differ according to the size of countries. The analysis incorporates a classification of the size of countries, reflecting the key economic characteristics of economies in order to provide an appropriate benchmark for each size group in the empirical analysis of the effects of asymmetric economic integration. The formation or extension of Preferential Trade Areas (PTAs) leads to a reduction in trade costs. This poses a critical secondary question as to the extent to which trade costs differ according to the size of countries. The extent to which membership of PTAs has an asymmetric impact on trade flows according to the size of member countries is analyzed by employing econometric tools and general equilibrium analysis, estimating both the ex-post and ex-ante effects of economic integration on the size of countries, using a data set of 218 countries, 45 of which are European.
The Oxford Handbook of Computational Economics and Finance provides a survey of both the foundations of and recent advances in the frontiers of analysis and action. It is both historically and interdisciplinarily rich and also tightly connected to the rise of digital society. It begins with the conventional view of computational economics, including recent algorithmic development in computing rational expectations, volatility, and general equilibrium. It then moves from traditional computing in economics and finance to recent developments in natural computing, including applications of nature-inspired intelligence, genetic programming, swarm intelligence, and fuzzy logic. Also examined are recent developments of network and agent-based computing in economics. How these approaches are applied is examined in chapters on such subjects as trading robots and automated markets. The last part deals with the epistemology of simulation in its trinity form with the integration of simulation, computation, and dynamics. Distinctive is the focus on natural computationalism and the examination of the implications of intelligent machines for the future of computational economics and finance. Not merely individual robots, but whole integrated systems are extending their "immigration" to the world of Homo sapiens, or symbiogenesis.
This volume uses state of the art models from the frontier of macroeconomics to answer key questions about how the economy functions and how policy should be conducted. The contributions cover a wide range of issues in macroeconomics and macroeconomic policy. They combine high level mathematics with economic analysis, and highlight the need to update our mathematical toolbox in order to understand the increased complexity of the macroeconomic environment. The volume represents hard evidence of high research intensity in many fields of macroeconomics, and warns against interpreting the scope of macroeconomics too narrowly. The mainstream business cycle analysis, based on dynamic stochastic general equilibrium (DSGE) modelling of a particular type, has been criticised for its inability to predict or resolve the recent financial crisis. However, macroeconomic research on financial, information, and learning imperfections had not yet made its way into many of the pre-crisis DSGE models because practical econometric versions of those models were mainly designed to fit data periods that did not include financial crises. A major response to the limitations of those older DSGE models is an active research program to bring big financial shocks and various kinds of financial, learning, and labour market frictions into a new generation of DSGE models for guiding policy. The contributors to this book utilise models and modelling assumptions that go beyond particular modelling conventions. By using alternative yet plausible assumptions, they seek to enrich our knowledge and ability to explain macroeconomic phenomena. They contribute to expanding the frontier of macroeconomic knowledge in ways that will prove useful for macroeconomic policy.
Discover the Benefits of Risk Parity Investing Despite recent progress in the theoretical analysis and practical applications of risk parity, many important fundamental questions still need to be answered. Risk Parity Fundamentals uses fundamental, quantitative, and historical analysis to address these issues, such as: What are the macroeconomic dimensions of risk in risk parity portfolios? What are the appropriate risk premiums in a risk parity portfolio? What are market environments in which risk parity might thrive or struggle? What is the role of leverage in a risk parity portfolio? An experienced researcher and portfolio manager who coined the term "risk parity," the author provides investors with a practical understanding of the risk parity investment approach. Investors will gain insight into the merit of risk parity as well as the practical and underlying aspects of risk parity investing.
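As a toy illustration of the risk parity idea, the sketch below finds equal-risk-contribution weights for an invented three-asset covariance matrix by minimising the dispersion of risk contributions with SciPy. It is not the author's methodology and ignores leverage, constraints and estimation error, all of which the book discusses.

```python
# Equal-risk-contribution ("risk parity") weights for a toy covariance matrix.
import numpy as np
from scipy.optimize import minimize

cov = np.array([[0.04, 0.006, 0.002],     # invented annualised covariance matrix
                [0.006, 0.01, 0.001],     # (equity-like, bond-like, other)
                [0.002, 0.001, 0.0025]])

def risk_contributions(w):
    port_var = w @ cov @ w
    return w * (cov @ w) / np.sqrt(port_var)   # each asset's contribution to portfolio vol

def objective(w):
    rc = risk_contributions(w)
    return np.sum((rc - rc.mean()) ** 2)       # zero when contributions are equal

n = cov.shape[0]
res = minimize(objective, x0=np.full(n, 1 / n),
               bounds=[(0, 1)] * n,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1}],
               method="SLSQP")
w = res.x
print("weights:", np.round(w, 3))
print("risk contributions:", np.round(risk_contributions(w), 4))
```

The lower-volatility assets receive the larger weights, which is why unlevered risk parity portfolios tilt toward bonds and why the role of leverage, one of the questions listed above, matters in practice.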
This volume is dedicated to two recent intensive areas of research in the econometrics of panel data, namely nonstationary panels and dynamic panels. It includes a comprehensive survey of the nonstationary panel literature, including panel unit root tests, spurious panel regressions and panel cointegration.
The book's comprehensive coverage of the application of econometric methods to the empirical analysis of economic issues is impressive. It bridges the missing link between textbooks on economic theory and econometrics, and highlights the powerful connection between economic theory and empirical analysis through examples of rigorous experimental design. The use of data sets for estimation derived with the Monte Carlo method helps facilitate understanding of the role of hypothesis testing applied to economic models. Topics covered in the book are: consumer behavior, producer behavior, market equilibrium, macroeconomic models, qualitative-response models, panel data analysis and time-series analysis. Key econometric models are introduced, specified, estimated and evaluated. The treatment of methods of estimation in econometrics and the discipline of hypothesis testing makes it a must-have for graduate students of economics and econometrics, and aids their understanding of how to estimate economic models and evaluate the results in terms of policy implications.
Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of estimation of distribution algorithms (EDAs). EDAs have arisen as one of the most successful experiences in the application of machine learning methods to optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs, but it also fills an important current void in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov network-based EDAs are reviewed in the book. Current research trends and future perspectives in the enhancement and applicability of EDAs are also covered. The contributions included in the book address topics as relevant as the application of probabilistic fitness models, the use of belief propagation algorithms in EDAs and the application of Markov network-based EDAs to real-world optimization problems. The book should be of interest to researchers and practitioners from areas such as optimization, evolutionary computation, and machine learning.
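For readers new to EDAs, the following deliberately simplified sketch runs a univariate EDA (UMDA) on the OneMax problem to show the estimate-then-sample loop the book builds on. The Markov network EDAs covered in the book replace the independent Bernoulli model below with an undirected graphical model that captures variable interactions; all parameters here are arbitrary.

```python
# Univariate EDA (UMDA) on OneMax: estimate a probability model from selected
# individuals, sample a new population, repeat.
import numpy as np

rng = np.random.default_rng(4)
n_vars, pop_size, n_select, n_gens = 40, 100, 30, 60

p = np.full(n_vars, 0.5)                         # independent bit probabilities
for gen in range(n_gens):
    pop = (rng.random((pop_size, n_vars)) < p).astype(int)
    fitness = pop.sum(axis=1)                    # OneMax: count of ones
    elite = pop[np.argsort(fitness)[-n_select:]] # truncation selection
    p = elite.mean(axis=0)                       # re-estimate the (univariate) model
    p = np.clip(p, 0.02, 0.98)                   # keep some diversity

best = (rng.random((pop_size, n_vars)) < p).astype(int).sum(axis=1).max()
print("best fitness found:", best, "of", n_vars)
```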
This book focuses on general frameworks for modeling heavy-tailed distributions in economics, finance, econometrics, statistics, risk management and insurance. A central theme is that of (non-)robustness, i.e., the fact that the presence of heavy tails can either reinforce or reverse the implications of a number of models in these fields, depending on the degree of heavy-tailedness. These results motivate the development and application of robust inference approaches under heavy tails, heterogeneity and dependence in observations. Several recently developed robust inference approaches are discussed and illustrated, together with applications.
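A toy simulation of the (non-)robustness theme: under heavy tails the sample mean loses its usual precision advantage over the median. Student-t draws with two degrees of freedom stand in for a heavy-tailed distribution; the numbers are purely illustrative and not taken from the book.

```python
# Compare the sampling variability of the mean and median under light vs heavy tails.
import numpy as np

rng = np.random.default_rng(5)
n, reps = 100, 5000

for label, sampler in [("normal", lambda: rng.normal(size=n)),
                       ("t(2) heavy-tailed", lambda: rng.standard_t(df=2, size=n))]:
    means = np.array([sampler().mean() for _ in range(reps)])
    medians = np.array([np.median(sampler()) for _ in range(reps)])
    print(f"{label:>18}: sd(mean)={means.std():.3f}, sd(median)={medians.std():.3f}")
```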