Volume 39A of Research in the History of Economic Thought and Methodology features a selection of essays presented at the 2019 Conference of the Latin American Society for the History of Economic Thought (ALAHPE), edited by Felipe Almeida and Carlos Eduardo Suprinyak, as well as a new general-research essay by Daniel Kuehn, an archival discovery by Katia Caldari and Luca Fiorito, and a book review by John Hall.
These essays honor Professor Peter C.B. Phillips of Yale University and his many contributions to the field of econometrics. Professor Phillips's research spans many topics in econometrics, including non-stationary time series and panel models; partial identification and weak instruments; Bayesian model evaluation and prediction; financial econometrics; and finite-sample statistical methods and results. The papers in this volume reflect additions to and amplifications of many of Professor Phillips's research contributions. Some of the topics discussed in the volume include panel macro-econometric modeling, efficient estimation and inference in difference-in-difference models, limiting and empirical distributions of IV estimates when some of the instruments are endogenous, the use of stochastic dominance techniques to examine conditional wage distributions of incumbents and newly hired employees, long-horizon predictive tests in financial markets, new developments in information matrix testing, testing for co-integration in Markov switching error correction models, and deviation information criteria for comparing vector autoregressive models.
The Analytic Hierarchy Process (AHP) is a prominent and powerful tool for making decisions in situations involving multiple objectives. Models, Methods, Concepts and Applications of the Analytic Hierarchy Process, 2nd Edition applies the AHP in order to solve problems focused on the following three themes: economics, the social sciences, and the linking of measurement with human values. For economists, the AHP offers a substantially different approach to dealing with economic problems through ratio scales. Psychologists and political scientists can use the methodology to quantify and derive measurements for intangibles. Meanwhile, researchers in the physical and engineering sciences can apply the AHP methods to help resolve the conflicts between hard measurement data and human values. Throughout the book, each of these topics is explored using real-life models and examples relevant to problems in today's society. This new edition has been updated and includes five new chapters that include discussions of the following: the eigenvector and why it is necessary; a summary of ongoing research in the Middle East that brings together Israeli and Palestinian scholars to develop concessions from both parties; and a look at the Medicare crisis and how the AHP can be used to understand the problems and help develop ideas to solve them.
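As a rough illustration of the eigenvector method mentioned above (the comparison values below are invented for this sketch, not taken from the book), priorities can be derived from a pairwise-comparison matrix as its principal eigenvector, with a consistency check on the comparisons:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three criteria on the Saaty
# 1-9 ratio scale: A[i][j] = how many times criterion i outweighs j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priorities are the principal (right) eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency index CI = (lambda_max - n) / (n - 1), divided by Saaty's
# random index (0.58 for n = 3); CR below about 0.1 is considered acceptable.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
```

Here `w` ranks the criteria (the first dominates, as its row of ratios suggests), and `cr` stays well under 0.1 because the hypothetical matrix is nearly consistent.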
This proceedings volume presents new methods and applications in applied economic research with an emphasis on advances in panel data analysis. Featuring papers presented at the 2017 International Conference on Applied Economics (ICOAE) held at Coventry University, this volume provides current research on econometric panel data methodologies as they are applied in microeconomics, macroeconomics, financial economics and agricultural economics. The ICOAE is an annual conference, launched in 2008, designed to bring together economists from different fields of applied economic research in order to share methods and ideas. Applied economics is a rapidly growing field that combines economic theory with econometrics to analyse real-world economic problems, usually ones of economic-policy interest. In addition, there is growing interest in the field in panel data estimation methods, tests and techniques. This volume makes a contribution to applied economic research in this area. Featuring country-specific studies, this book will be of interest to academics, students, researchers, practitioners, and policy makers in applied economics and economic policy.
Stokes discusses--and illustrates with output from actual problems--a number of applied econometric techniques, including OLS specification tests, recursive residual analysis, limited dependent variable models, error component models, and others. His book is clearly written and copiously illustrated with equations, with follow-up analysis to show how models are built and some of their limitations. His B34S(TM) software is available and allows readers to do further research with a large number of datasets distributed with the program. A necessary resource for applied econometrics researchers in economics and finance, and in health, energy, and labor economics. This work illustrates the use of model specification and diagnostic tests applied to a variety of econometric modeling techniques. For each technique discussed, the basic mathematical models are outlined. A sample problem is discussed and estimated using the B34S(TM) Data Analysis System. The output of the program is displayed in the text and discussed. Where appropriate, output from the RATS(TM) software is displayed. Follow-up models are estimated and discussed. The examples selected are taken from a variety of sources and reflect actual applied research. Complete data are given in the text to enable the reader to use these problems with other programs and techniques. It is the author's experience that applied econometric techniques are best learned by running actual problems. Since most users experiment with a limited number of techniques, their experience is limited. This book discusses a broad range of techniques and shows how they are interrelated. The techniques discussed include the following: simple, one-equation OLS and GLS models with continuous variables on the left-hand side, which are tested with recursive residual and BLUS residual techniques. Another class of models includes restrictions on the left-hand side variables. Models studied and illustrated with data include probit, logit, multinomial logit, and ordered probit models. Other techniques discussed and illustrated include two-stage least squares, limited information maximum likelihood, three-stage least squares, iterative three-stage least squares, error component models and Markov probability models, which are illustrated with a model of OPEC production dynamics. ARIMA and transfer function models are shown to be generalizations of the single-equation model, while VAR and VARMA models are shown to be a time series generalization of three-stage least squares and full information maximum likelihood models. VAR models are viewed in the frequency domain for added insight, and extensive nonlinearity tests are developed and applied. More specialized techniques include state space models, optimal control analysis, nonlinear analysis, and the QR approach to computation. An important feature of the book is the emphasis on nonlinear model building. The Hinich nonlinear testing approach is discussed and integrated into the OLS, time series, and nonlinear estimation procedures. The MARS and PISPLINE methods of analysis are illustrated with models that failed linearity tests when estimated with linear methods. The purpose of the monograph is to illustrate the above techniques, using actual research data. To facilitate the calculations, the B34S(TM) Data Analysis System was developed. Sample output for all procedures discussed in the text has been provided so that the availability of the B34S(TM) program is not required in order to benefit from this book. While the book is self-contained, interested readers can obtain the B34S(TM) Data Analysis program and do further research with the datasets discussed in the book, which are supplied with the software.
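The recursive-residual diagnostics mentioned above can be sketched generically (this uses simulated data and NumPy, not output from B34S; the data-generating process is an assumption for illustration): fit OLS on the first t observations, form the standardized one-step-ahead prediction error for observation t+1, and cumulate the results into a CUSUM statistic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data with stable coefficients, so the specification is correct
# and the recursive residuals should behave like white noise.
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)

k = X.shape[1]
w = []
for t in range(k, n):
    # Fit OLS on the first t observations, then compute the standardized
    # one-step-ahead prediction error for observation t.
    b, *_ = np.linalg.lstsq(X[:t], y[:t], rcond=None)
    x_t = X[t]
    scale = np.sqrt(1.0 + x_t @ np.linalg.solve(X[:t].T @ X[:t], x_t))
    w.append((y[t] - x_t @ b) / scale)
w = np.array(w)

# Under a stable, correctly specified model, the cumulative sum of scaled
# recursive residuals wanders around zero (the basis of the CUSUM test).
cusum = np.cumsum(w) / w.std(ddof=1)
```

A structural break in the coefficients would show up as `cusum` drifting systematically away from zero.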
Economic indicators provide invaluable insights into how different economies and financial markets are performing, enabling practitioners to adjust their investment strategies, gain knowledge about markets, and achieve higher returns. However, in order to make the right decisions, you must know how to interpret the relevant indicators. Using Economic Indicators in Analysing Financial Markets provides this important guidance. The first and second parts of the book focus on short-term analysis, explaining exactly what the indicators are, why they are significant, where and when they are published, and how reliable they are. In the third part, author Bernd Krampen highlights medium- and long-term economic trends, showing how some of the previously discussed indicators, along with additional market indicators such as stocks, bond yields and commodities, can be employed as a basis for forecasting both GDP growth and inflation, including the estimation of possible future recessions. In the fourth part, the predominantly good forecasting properties of sentiment indicators are illustrated by examining the real estate market; this is rounded off by an introduction to psychology and behavioural finance, providing further tips and tricks for analysing financial markets. Using Economic Indicators in Analysing Financial Markets is an invaluable resource for investors, strategists, policymakers, students, and private investors worldwide who want to understand the true meaning of the latest economic trends in order to make the best decisions for future profits on financial markets.
The disintegration of Yugoslavia, accompanied by the emergence of new borders, paradigmatically highlights the relevance of borders in processes of societal change, crisis and conflict. This is even more the case if we consider the violent practices that evolved out of the populist discourse of an ethnically homogeneous bounded space during the Yugoslav wars of the 1990s. Exploring the boundaries of Yugoslavia is not just relevant in the context of Balkan area studies; the phenomena sketched here acquire much wider importance and can help us better understand the dynamics of b/ordering societal space that are so characteristic of our present situation.
This book treats the notion of morphisms in spatial analysis, paralleling these concepts in spatial statistics (Part I) and spatial econometrics (Part II). The principal concept is morphism (e.g., isomorphisms, homomorphisms, and allomorphisms), which is defined as a structure preserving the functional linkage between mathematical properties or operations in spatial statistics and spatial econometrics, among other disciplines. The purpose of this book is to present selected conceptions in both domains that are structurally the same, even though their labelling and the notation for their elements may differ. As the approaches presented here are applied to empirical materials in geography and economics, the book will also be of interest to scholars of regional science, quantitative geography and the geospatial sciences. It is a follow-up to the book "Non-standard Spatial Statistics and Spatial Econometrics" by the same authors, which was published by Springer in 2011.
This book offers hands-on statistical tools for business professionals by focusing on the practical application of a single-equation regression. The authors discuss commonly applied econometric procedures, which are useful in building regression models for economic forecasting and supporting business decisions. A significant part of the book is devoted to traps and pitfalls in implementing regression analysis in real-world scenarios. The book consists of nine chapters, the final two of which are fully devoted to case studies. Today's business environment is characterised by a huge amount of economic data. Making successful business decisions under such data-abundant conditions requires objective analytical tools, which can help to identify and quantify multiple relationships between dozens of economic variables. Single-equation regression analysis, which is discussed in this book, is one such tool. The book offers a valuable guide and is relevant in various areas of economic and business analysis, including marketing, financial and operational management.
The proliferation of the internet has often been referred to as the fourth technological revolution. This book explores the diffusion of radical new communication technologies, and the subsequent transformation not only of products, but also of the organisation of production and business methods.
This book presents the works and research findings of physicists, economists, mathematicians, statisticians, and financial engineers who have undertaken data-driven modelling of market dynamics and other empirical studies in the field of Econophysics. During recent decades, the financial market landscape has changed dramatically with the deregulation of markets and the growing complexity of products. The ever-increasing speed and decreasing costs of computational power and networks have led to the emergence of huge databases. The availability of these data should permit the development of models that are better founded empirically, and econophysicists have accordingly been advocating that one should rely primarily on the empirical observations in order to construct models and validate them. The recent turmoil in financial markets and the 2008 crash appear to offer a strong rationale for new models and approaches. The Econophysics community accordingly has an important future role to play in market modelling. The Econophys-Kolkata VIII conference proceedings are devoted to the presentation of many such modelling efforts and address recent developments. A number of leading researchers from across the globe report on their recent work, comment on the latest issues, and review the contemporary literature.
'Experiments in Organizational Economics' highlights the importance of replicating previous economic experiments. Replication enables experimental findings to be subjected to rigorous scrutiny. Despite this obvious advantage, direct replication remains relatively scant in economics. One possible explanation for this situation is that publication outlets favor novel work over tests of robustness. Readers will gain a better understanding of the role that replication plays in economic discovery as well as valuable insights into the robustness of previously reported findings.
Stochastic Averaging and Extremum Seeking treats methods inspired by attempts to understand the seemingly non-mathematical question of bacterial chemotaxis and their application in other environments. The text presents significant generalizations of existing stochastic averaging theory, developed from scratch and necessitated by the need to avoid violating previous theoretical assumptions with algorithms that are otherwise effective in treating these systems. Coverage is given to four main topics. Stochastic averaging theorems are developed for the analysis of continuous-time nonlinear systems with random forcing, removing prior restrictions on nonlinearity growth and on the finiteness of the time interval. The new stochastic averaging theorems are usable not only as approximation tools but also for providing stability guarantees. Stochastic extremum-seeking algorithms are introduced for the optimization of systems without available models. Both gradient- and Newton-based algorithms are presented, offering the user the choice between simplicity of implementation (gradient) and the ability to achieve a known, arbitrary convergence rate (Newton). The design of algorithms for non-cooperative/adversarial games is described, and the analysis of their convergence to Nash equilibria is provided. The algorithms are illustrated on models of economic competition and on problems of the deployment of teams of robotic vehicles. Bacterial locomotion, such as chemotaxis in E. coli, is explored with the aim of identifying two simple feedback laws for climbing nutrient gradients. Stochastic extremum seeking is shown to be a biologically plausible interpretation of chemotaxis. For the same chemotaxis-inspired stochastic feedback laws, the book also provides a detailed analysis of convergence for models of nonholonomic robotic vehicles operating in GPS-denied environments.
The book contains block diagrams and several simulation examples, including examples arising from bacterial locomotion, multi-agent robotic systems, and economic market models. Stochastic Averaging and Extremum Seeking will be informative for control engineers from backgrounds in electrical, mechanical, chemical and aerospace engineering, and for applied mathematicians. Economics researchers, biologists, biophysicists and roboticists will find the application examples instructive.
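As a generic illustration of the extremum-seeking idea (not the book's algorithms; the map, gains, and dither frequency below are arbitrary assumptions), a gradient-based scheme with a sinusoidal dither can climb toward the maximizer of an unknown map using only measured outputs:

```python
import math

# Unknown static map with its maximum at theta* = 2.0; the algorithm only
# observes measured outputs y, never this formula.
def f(theta):
    return 5.0 - (theta - 2.0) ** 2

# Gradient-based extremum seeking with a sinusoidal dither (a stochastic
# dither, as in the book, plays the same role).
a, omega, gain, dt = 0.2, 20.0, 0.5, 0.001
theta_hat = 0.0
for i in range(200_000):
    t = i * dt
    dither = a * math.sin(omega * t)
    y = f(theta_hat + dither)
    # Demodulating y with the dither yields, on average, an estimate of the
    # gradient of f at theta_hat scaled by a**2 / 2, so this update ascends.
    theta_hat += dt * gain * dither * y
# theta_hat has converged near the unknown maximizer 2.0 (up to a small
# dither-induced ripple).
```

The same demodulation idea underlies the Newton-based variant, which additionally estimates the second derivative to normalize the convergence rate.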
This book provides a quantitative framework for the analysis of conflict dynamics and for estimating the economic costs associated with civil wars. The author develops modified Lotka-Volterra equations to model conflict dynamics, to yield realistic representations of battle processes, and to allow us to assess prolonged conflict traps. The economic costs of civil wars are evaluated with the help of two alternative methods: firstly, the author employs a production function to determine how the destruction of human and physical capital stocks undermines economic growth in the medium term; secondly, he develops a synthetic control approach, where the cost is obtained as the divergence of actual economic activity from a hypothetical path in the absence of civil war. The difference between the two approaches gives an indication of the adverse externalities impinging upon the economy in the form of institutional destruction. By using detailed time series data on battle casualties, local socio-economic indicators, and capital stock destruction during the Greek Civil War (1946-1949), a full-scale application of the above framework is presented and discussed.
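A toy version of such Lotka-Volterra-style conflict dynamics (with invented coefficients and functional form, not the author's calibrated model) can be written as a pair of coupled growth-attrition equations and integrated numerically:

```python
import numpy as np

def simulate(x0, y0, rx=0.05, ry=0.04, a=0.002, b=0.003, dt=0.1, steps=2000):
    """Euler integration of a hypothetical two-faction system:
         dx/dt = rx * x - a * x * y   (recruitment minus combat losses)
         dy/dt = ry * y - b * x * y
    Force sizes are clamped at zero, since a faction cannot be negative."""
    x, y = x0, y0
    path = [(x, y)]
    for _ in range(steps):
        dx = rx * x - a * x * y
        dy = ry * y - b * x * y
        x = max(x + dt * dx, 0.0)
        y = max(y + dt * dy, 0.0)
        path.append((x, y))
    return np.array(path)

# Simulate two factions of comparable initial strength.
path = simulate(100.0, 80.0)
```

Depending on the coefficients, trajectories of such systems can decay, oscillate, or lock into the prolonged stalemates that the book interprets as conflict traps.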
Microsimulation modelling involves the application of simulation methods to micro data for the purposes of evaluating the effectiveness and improving the design of public policy. The field has existed for over 50 years, has been applied to many different policy areas, and is used within both government and academia. This handbook brings together leading authors in the field to describe and discuss the main current issues. It provides an overview of current developments across each of the sub-fields of microsimulation modelling, such as tax-benefit, pensions, spatial, health, labour, consumption, transport and land use policy, as well as macro-micro, environmental and demographic issues. It also focuses on modelling different micro units such as households, firms and farms. Each chapter discusses its sub-field under the following headings: the main methodologies of the sub-field; a survey of the literature in the area; a critique of the literature; and proposed future directions for research within the sub-field.
During the subprime crisis (2007) and the global financial crisis of 2008-2009, we observed significant declines, corrections and structural changes in most US and European financial markets. Furthermore, this crisis was rapidly transmitted to both developed and emerging countries and strongly affected the whole economy. This volume presents recent research in linear and nonlinear modelling of economic and financial time series. The discussions of empirical results in its chapters help to improve the understanding of the financial mechanisms inherent in this crisis. They also provide an important overview of the sources of the financial crisis and its main economic and financial consequences. The book provides the audience with a comprehensive understanding of financial and economic dynamics in various respects using modern financial econometric methods. It addresses the empirical techniques needed by economic agents to analyze the dynamics of these markets and illustrates how they can be applied to actual data. It also presents and discusses new research findings and their implications.
Computational Economics: A Perspective from Computational Intelligence provides models of various economic and financial issues while using computational intelligence as a foundation. The scope of this volume comprises finance, economics, management, organizational theory and public policies. It explains the ongoing and novel research in this field, and displays the power of these computational methods in coping with problems that are difficult to handle from traditional perspectives. By encouraging the discussion of different views, this book serves as an introductory and inspiring volume that helps studies in computational economics flourish.
This collection of original articles, 8 years in the making, shines a bright light on recent advances in financial econometrics. From a survey of mathematical and statistical tools for understanding nonlinear Markov processes to an exploration of the time-series evolution of the risk-return tradeoff for stock market investment, noted scholars Yacine Ait-Sahalia and Lars Peter Hansen benchmark the current state of knowledge while contributors build a framework for its growth. Whether in the presence of statistical uncertainty or the proven advantages and limitations of value at risk models, readers will discover that they can set few constraints on the value of this long-awaited volume.
Many economic problems can be formulated as constrained optimizations and equilibration of their solutions. Various mathematical theories have been supplying economists with indispensable machinery for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories. The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research.
The main theme of this volume is credit risk and credit derivatives. Recent developments in financial markets show that appropriate modeling and quantification of credit risk is fundamental in the context of modern complex structured financial products. The reader will find several points of view on credit risk when looked at from the perspective of Econometrics and Financial Mathematics. The volume consists of eleven contributions by both practitioners and theoreticians with expertise in financial markets, in general, and econometrics and mathematical finance in particular. The challenge of modeling defaults and their correlations is addressed, and new results on copula, reduced form and structural models, and the top-down approach are presented. After the so-called subprime crisis that hit global markets in the summer of 2007, the volume is very timely and will be useful to researchers in the area of credit risk.
This second edition sees the light three years after the first one: too short a time to feel seriously concerned to redesign the entire book, but sufficient to be challenged by the prospect of sharpening our investigation on the working of econometric dynamic models and to be inclined to change the title of the new edition by dropping the "Topics in" of the former edition. After considerable soul searching we agreed to include several results related to topics already covered, as well as additional sections devoted to new and sophisticated techniques, which hinge mostly on the latest research work on linear matrix polynomials by the second author. This explains the growth of chapter one and the deeper insight into representation theorems in the last chapter of the book. The role of the second chapter is that of providing a bridge between the mathematical techniques in the backstage and the econometric profiles in the forefront of dynamic modelling. For this purpose, we decided to add a new section where the reader can find the stochastic rationale of vector autoregressive specifications in econometrics. The third (and last) chapter improves on that of the first edition by reaping the fruits of the thorough analytic equipment previously drawn up.
A wide variety of processes occur on multiple scales, either naturally or as a consequence of measurement. This book contains methodology for the analysis of data that arise from such multiscale processes. The book brings together a number of recent developments and makes them accessible to a wider audience. Taking a Bayesian approach allows for full accounting of uncertainty, and also addresses the delicate issue of uncertainty at multiple scales. The Bayesian approach also facilitates the use of knowledge from prior experience or data, and these methods can handle different amounts of prior knowledge at different scales, as often occurs in practice. The book is aimed at statisticians, applied mathematicians, and engineers working on problems dealing with multiscale processes in time and/or space, such as in engineering, finance, and environmetrics. The book will also be of interest to those working on multiscale computation research. The main prerequisites are knowledge of Bayesian statistics and basic Markov chain Monte Carlo methods. A number of real-world examples are thoroughly analyzed in order to demonstrate the methods and to assist the readers in applying these methods to their own work. To further assist readers, the authors are making source code (for R) available for many of the basic methods discussed herein.
This 30th volume of the International Symposia in Economic Theory and Econometrics explores the latest social and financial developments across Asian markets. Chapters cover a range of topics such as the impact of COVID-19-related events in Southeast Asia alongside the determinants of capital structure before and during the pandemic; the influence of new distribution concepts on macro and micro economic levels; as well as the effects of long-term cross-currency basis swaps on government bonds. These peer-reviewed papers touch on a variety of timely, interdisciplinary subjects such as real earnings impact and the effects of public policy. Together, Quantitative Analysis of Social and Financial Market Development is a crucial resource of current, cutting-edge research for any scholar of international finance and economics.
This book offers a practical guide to agent-based economic modeling, adopting a "learning by doing" approach to help the reader master the fundamental tools needed to create and analyze agent-based models. After providing readers with a basic "toolkit" for agent-based modeling, it presents and discusses didactic models of real financial and economic systems in detail. While stressing the main features and advantages of the bottom-up perspective inherent to this approach, the book also highlights the logic and practical steps that characterize the model building procedure. A detailed description of the underlying codes, developed using R and C, is also provided. In addition, each didactic model is accompanied by exercises and applications designed to promote active learning on the part of the reader. Following the same approach, the book also presents several complementary tools required for the analysis and validation of the models, such as sensitivity experiments, calibration exercises, economic network analysis and statistical distribution analysis. By the end of the book, the reader will have gained a deeper understanding of the agent-based methodology and be prepared to use the fundamental techniques required to start developing their own economic models. Accordingly, "Economics with Heterogeneous Interacting Agents" will be of particular interest to graduate and postgraduate students, as well as to academic institutions and lecturers interested in including an overview of the AB approach to economic modeling in their courses.
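In the spirit of the bottom-up approach described above (though the book's own examples use R and C), a minimal agent-based exchange model might look like the sketch below; the agent count, time horizon, and exchange rule are toy assumptions, not a model from the book:

```python
import random

random.seed(42)

# Minimal bottom-up sketch: agents start with equal wealth, meet in random
# pairs, and a coin flip transfers a random share of the poorer agent's
# wealth from loser to winner. Aggregate wealth is conserved by design.
N, T = 500, 20_000
wealth = [100.0] * N

for _ in range(T):
    i, j = random.randrange(N), random.randrange(N)
    if i == j:
        continue
    stake = random.random() * min(wealth[i], wealth[j])
    if random.random() < 0.5:
        wealth[i] += stake
        wealth[j] -= stake
    else:
        wealth[i] -= stake
        wealth[j] += stake

total = sum(wealth)  # stays at N * 100 up to floating-point error
```

Even this tiny model exhibits the emergent inequality typical of heterogeneous-agent economies: after many exchanges, the initially uniform wealth distribution becomes strongly skewed, which can then be examined with the distributional analysis tools the book discusses.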