Building on the success of Abadir and Magnus' Matrix Algebra in the Econometric Exercises Series, Statistics serves as a bridge between elementary and specialized statistics. Professors Abadir, Heijmans, and Magnus freely use matrix algebra to cover intermediate to advanced material. Each chapter contains a general introduction, followed by a series of connected exercises which build up knowledge systematically. The characteristic feature of the book (and indeed the series) is that all exercises are fully solved. The authors present many new proofs of established results, along with new results, often involving shortcuts that rely on statistical conditioning arguments.
This introductory statistics textbook conveys the essential concepts and tools needed to develop and nurture statistical thinking. It presents descriptive, inductive and explorative statistical methods and guides the reader through the process of quantitative data analysis. In the experimental sciences and interdisciplinary research, data analysis has become an integral part of any scientific study. Issues such as judging the credibility of data, analyzing the data, evaluating the reliability of the obtained results and finally drawing the correct and appropriate conclusions from the results are vital. The text is primarily intended for undergraduate students in disciplines like business administration, the social sciences, medicine, politics, macroeconomics, etc. It features a wealth of examples, exercises and solutions with computer code in the statistical programming language R as well as supplementary material that will enable the reader to quickly adapt all methods to their own applications.
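The descriptive methods the blurb highlights can be illustrated with a minimal sketch. The book's examples are in R; this uses Python's standard library instead, and the sample data are invented for illustration:

```python
import statistics

# Invented sample: eight hypothetical monthly sales figures
sales = [12.1, 9.8, 11.4, 10.9, 13.2, 10.5, 11.7, 12.6]

mean = statistics.mean(sales)      # arithmetic mean
median = statistics.median(sales)  # robust measure of central tendency
stdev = statistics.stdev(sales)    # sample standard deviation (n - 1 denominator)

print(f"mean={mean:.2f}, median={median:.2f}, sd={stdev:.2f}")
```

Comparing mean and median is a quick first check for skew, one of the judgments about data credibility the text emphasizes.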
This book explores the US economy from 1960 to 2010 using a more Keynesian, Cowles model approach, which the author argues has substantial advantages over the vector autoregression (VAR) and dynamic stochastic general equilibrium (DSGE) models used almost exclusively today. Heim presents a robust argument in favor of the Cowles model as an answer to the pressing, unresolved methodological question of how to accurately model the macroeconomy so that policymakers can reliably use these models to assist their decision making. Thirty-eight behavioral equations, describing determinants of variables such as consumption, taxes, and government spending, are connected by eighteen identities to construct a comprehensive model of the real US economy that Heim then tests across four different time periods to ensure that results are consistent. This comprehensive demonstration of the value of a long-ignored model provides overwhelming evidence that the more Keynesian (Cowles) structural models outperform VAR and DSGE, and therefore should be the models of choice in future macroeconomic studies.
In contrast to mainstream economics, complexity theory conceives the economy as a complex system of heterogeneous interacting agents characterised by limited information and bounded rationality. Agent Based Models (ABMs) are the analytical and computational tools developed by the proponents of this emerging methodology. Aimed at students and scholars of contemporary economics, this book includes a comprehensive toolkit for agent-based computational economics, now quickly becoming the new way to study evolving economic systems. Leading scholars in the field explain how ABMs can be applied fruitfully to many real-world economic examples and represent a great advancement over mainstream approaches. The essays discuss the methodological bases of agent-based approaches and demonstrate step-by-step how to build, simulate and analyse ABMs and how to validate their outputs empirically using the data. They also present a wide set of applications of these models to key economic topics, including the business cycle, labour markets, and economic growth.
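The build-simulate-analyse cycle described above can be sketched with a deliberately tiny agent-based model: a random wealth-exchange economy among homogeneous agents. All parameters are invented; the book's ABMs are far richer (heterogeneity, bounded rationality, market interaction):

```python
import random

def simulate_exchange(n_agents=100, steps=10_000, seed=42):
    """Minimal ABM sketch: agents repeatedly transfer a small fixed amount
    of wealth to a randomly chosen counterpart. Total wealth is conserved,
    yet an unequal distribution emerges from the interactions alone."""
    rng = random.Random(seed)
    wealth = [1.0] * n_agents          # everyone starts with one unit
    for _ in range(steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i != j and wealth[i] > 0:
            transfer = min(wealth[i], 0.1)   # give at most 0.1 per meeting
            wealth[i] -= transfer
            wealth[j] += transfer
    return wealth

wealth = simulate_exchange()
```

Even this toy model shows the hallmark of the approach: aggregate regularities (here, inequality) emerge from local interactions rather than from a representative agent's optimization.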
This second edition retains the positive features of being clearly written, well organized, and incorporating calculus in the text, while adding expanded coverage of game theory, experimental economics, and behavioural economics. It remains more focused and manageable than similar textbooks, and provides a concise yet comprehensive treatment of the core topics of microeconomics, including theories of the consumer and of the firm, market structure, partial and general equilibrium, and market failures caused by public goods, externalities and asymmetric information. The book includes helpful solved problems in all the substantive chapters, as well as over seventy new mathematical exercises and enhanced versions of the ones in the first edition. The book makes full use of color, with sharp and helpful graphs and illustrations. This mathematically rigorous textbook is meant for students at the intermediate level who have already had an introductory course in microeconomics, and a calculus course.
This book studies the evolution of the middle class in Russia after the fall of the Soviet Union. Using data from the RLMS (Russian Longitudinal Monitoring Survey), the volume covers the period of transition (1991-2008) during which many fundamental economic reforms were implemented. The first part of the book is devoted to a discussion of the concept of middle class and a description of the economic situation in Russia during the transition period. Particular attention is given to variations in the distribution of Russian incomes and the estimated importance of the middle class. The second part of the book focuses on the link between the middle class and income bipolarization. The third and last section of the book uses the semiparametric "mixture model" to discover how many different groups may be derived from the income distribution in Russia, as well as what the main socio-economic and demographic characteristics of those groups are. The mobility of households into and out of the middle class during the transition period is also studied in hopes of determining the factors that contribute to such mobility. Using rigorous empirical methods, this volume sheds light on a relatively unstudied economic group and provides insight for countries which are about to enter a transition period. As such, this book will be of great interest to researchers in economics and inequality as well as professionals and practitioners working with international organizations.
This is the second volume in a two-part series on frontiers in regional research. It identifies methodological advances as well as trends and future developments in regional systems modelling and open science. Building on recent methodological and modelling advances, as well as on extensive policy-analysis experience, top international regional scientists identify and evaluate emerging new conceptual and methodological trends and directions in regional research. Topics such as dynamic interindustry modelling, computable general equilibrium models, exploratory spatial data analysis, geographic information science, spatial econometrics and other advanced methods are the central focus of this book. The volume provides insights into the latest developments in object orientation, open source, and workflow systems, all in support of open science. It will appeal to a wide readership, from regional scientists and economists to geographers, quantitatively oriented regional planners and other related disciplines. It offers a source of relevant information for academic researchers and policy analysts in government, and is also suitable for advanced teaching courses on regional and spatial science, economics and political science.
This book presents recent research on robustness in econometrics. Robust data processing techniques - i.e., techniques that yield results minimally affected by outliers - and their applications to real-life economic and financial situations are the main focus of this book. The book also discusses applications of more traditional statistical techniques to econometric problems. Econometrics is a branch of economics that uses mathematical (especially statistical) methods to analyze economic systems, to forecast economic and financial dynamics, and to develop strategies for achieving desirable economic performance. In day-to-day data, we often encounter outliers that do not reflect long-term economic trends, e.g., unexpected and abrupt fluctuations. As such, it is important to develop robust data processing techniques that can accommodate these fluctuations.
Central Bank Balance Sheet and Real Business Cycles argues that a deeper comprehension of changes to the central bank balance sheet can lead to more effective policymaking. Any transaction the central bank engages in (issuing currency, conducting foreign exchange operations, investing its own funds, intervening to provide emergency liquidity assistance, or carrying out monetary policy operations) influences its balance sheet. Despite this, many central banks throughout the world have largely ignored balance sheet movements and have instead focused on setting interest rates. In this book, Mustapha Abiodun Akinkunmi highlights the challenges and controversies faced by central banks in the past and present when implementing policies, and analyzes the links between these policies, the central bank balance sheet, and the consequences to economies as a whole. He argues that the composition and evolution of the central bank balance sheet provide a valuable basis for understanding the needs of an economy, and are an important tool in developing strategies that would most effectively achieve policy goals. This book is an important resource for anyone interested in monetary policy or whose work is affected by the policies of central banks.
This edited volume, with contributions by area experts, offers discussions on a range of evolving topics in economics and social development. At its center are issues of sustainable development, economic growth, technological change, the economics of climate change, commodity markets, long wave theory, non-linear dynamic models, and boom-bust cycles. This is an excellent reference for academic and professional economists interested in emerging areas of empirical macroeconomics and finance. For policy makers and curious readers alike, it is also an outstanding introduction to the economic thinking of those who seek a holistic and all-encompassing approach in economic theory and policy. Looking into new data and methodology, this book offers fresh approaches in a post-crisis environment. Set in a profound understanding of the diverse currents within the many traditions of economic thought, this book pushes the established frontiers of economic thinking. It is dedicated to a leading scholar in the areas covered in this book, Willi Semmler.
This book provides a quantitative framework for the analysis of conflict dynamics and for estimating the economic costs associated with civil wars. The author develops modified Lotka-Volterra equations to model conflict dynamics, to yield realistic representations of battle processes, and to allow us to assess prolonged conflict traps. The economic costs of civil wars are evaluated with the help of two alternative methods: Firstly, the author employs a production function to determine how the destruction of human and physical capital stocks undermines economic growth in the medium term. Secondly, he develops a synthetic control approach, where the cost is obtained as the divergence of actual economic activity from a hypothetical path in the absence of civil war. The difference between the two approaches gives an indication of the adverse externalities impinging upon the economy in the form of institutional destruction. By using detailed time-series regarding battle casualties, local socio-economic indicators, and capital stock destruction during the Greek Civil War (1946-1949), a full-scale application of the above framework is presented and discussed.
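The Lotka-Volterra-style conflict dynamics described above can be sketched with a simple Euler integration of a Lanchester-type attrition core. The coefficients and initial force sizes below are invented for illustration; the author's modified equations are considerably richer:

```python
def simulate(x0, y0, a=0.05, b=0.04, dt=0.1, steps=500):
    """Euler integration of dx/dt = -a*y, dy/dt = -b*x:
    each side's losses are proportional to the opposing force.
    Force sizes are floored at zero (a side cannot go negative)."""
    x, y = x0, y0
    path = [(x, y)]
    for _ in range(steps):
        # tuple assignment: both updates use the same pre-step values
        x, y = max(x - a * y * dt, 0.0), max(y - b * x * dt, 0.0)
        path.append((x, y))
    return path

path = simulate(100.0, 90.0)
```

Iterating the system forward in this way is also how one can probe for the "conflict traps" the author discusses: parameter regions where neither force collapses and attrition drags on.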
This book presents essential tools for modelling non-linear time series. The first part of the book describes the main standard tools of probability and statistics that directly apply to the time series context to obtain a wide range of modelling possibilities. Functional estimation and bootstrap are discussed, and stationarity is reviewed. The second part describes a number of tools from Gaussian chaos and proposes a tour of linear time series models. It goes on to address nonlinearity from polynomial or chaotic models for which explicit expansions are available, then turns to Markov and non-Markov linear models and discusses Bernoulli shifts time series models. Finally, the volume focuses on the limit theory, starting with the ergodic theorem, which is seen as the first step for statistics of time series. It defines the distributional range to obtain generic tools for limit theory under long- or short-range dependence (LRD/SRD) and explains examples of LRD behaviours. More general techniques (central limit theorems) are described under SRD; mixing and weak dependence are also reviewed. In closing, it describes moment techniques together with their relations to cumulant sums as well as an application to kernel type estimation. The appendix reviews basic probability theory facts and discusses useful laws stemming from the Gaussian laws as well as the basic principles of probability, and is completed by R-scripts used for the figures. Richly illustrated with examples and simulations, the book is recommended for advanced master's courses for mathematicians just entering the field of time series, and statisticians who want more mathematical insights into the background of non-linear time series.
This proceedings volume presents the latest scientific research and trends in experimental economics, with particular focus on neuroeconomics. Derived from the 2016 Computational Methods in Experimental Economics (CMEE) conference held in Szczecin, Poland, this book features research and analysis of novel computational methods in neuroeconomics. Neuroeconomics is an interdisciplinary field that combines neuroscience, psychology and economics to build a comprehensive theory of decision making. At its core, neuroeconomics analyzes the decision-making process not only in terms of external conditions or psychological aspects, but also from the neuronal point of view by examining the cerebral conditions of decision making. The application of IT enhances the possibilities of conducting such analyses. Such studies are now carried out using software that supports interaction among all participants and makes it possible to register their reactions more accurately. This book examines some of these applications and methods. Featuring contributions on both theory and application, this book is of interest to researchers, students, academics and professionals interested in experimental economics, neuroeconomics and behavioral economics.
The book addresses the problem of calculation of d-dimensional integrals (conditional expectations) in filter problems. It develops new methods of deterministic numerical integration, which can be used to speed up and stabilize filter algorithms. With the help of these methods, better estimates and predictions of latent variables are made possible in the fields of economics, engineering and physics. The resulting procedures are tested within four detailed simulation studies.
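Deterministic numerical integration of this kind can be illustrated with Gauss-Hermite quadrature, a standard way to approximate expectations under a Gaussian density. This is a generic sketch of the idea, not the book's specific filter algorithms:

```python
import math
import numpy as np

def gauss_hermite_expectation(f, n=20):
    """Approximate E[f(X)] for X ~ N(0, 1) with n-point Gauss-Hermite
    quadrature. The change of variables x -> sqrt(2)*x maps the
    e^{-x^2} quadrature weight onto the standard normal density."""
    nodes, weights = np.polynomial.hermite.hermgauss(n)
    return float(np.sum(weights * f(math.sqrt(2.0) * nodes)) / math.sqrt(math.pi))

# Example: E[X^2] = 1 for a standard normal variable.
approx = gauss_hermite_expectation(lambda x: x**2)
```

Because the nodes and weights are fixed in advance, such rules avoid the sampling noise of Monte Carlo integration, which is what makes them attractive for speeding up and stabilizing filter recursions.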
The proliferation of the internet has often been referred to as the fourth technological revolution. This book explores the diffusion of radical new communication technologies, and the subsequent transformation not only of products, but also of the organisation of production and business methods.
This book is an extension of the author's first book and serves as a guide and manual on how to specify and compute 2-, 3-, and 4-event Bayesian Belief Networks (BBN). It walks the learner through the steps of fitting and solving fifty BBNs numerically, using mathematical proof. The author wrote this book primarily for inexperienced learners as well as professionals, while maintaining a proof-based academic rigor. The author's first book on this topic, a primer introducing learners to the basic complexities and nuances associated with learning Bayes' theorem and inverse probability for the first time, was meant for non-statisticians unfamiliar with the theorem, as is this book. This new book expands upon that approach and is meant to be a prescriptive guide to building BBNs and to executive decision-making for students and professionals, so that decision-makers can invest their time and start using this inductive reasoning principle in their decision-making processes. It highlights the utility of an algorithm that served as the basis for the first book, and includes fifty 2-, 3-, and 4-event BBNs in numerous variants.
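The 2-event case the book starts from reduces to a single application of Bayes' theorem. A minimal sketch, with invented probabilities for illustration:

```python
def posterior(prior, likelihood, false_positive):
    """P(A | B) via Bayes' theorem, given that event B is observed.

    prior          = P(A)
    likelihood     = P(B | A)
    false_positive = P(B | not A)
    """
    evidence = likelihood * prior + false_positive * (1.0 - prior)
    return likelihood * prior / evidence

# Example: a 1% prior, a test with 95% sensitivity and a 5% false-positive
# rate. The posterior is only about 16% - the classic inverse-probability
# surprise that motivates books like this one.
p = posterior(0.01, 0.95, 0.05)
```

Larger BBNs chain conditional probability tables together, but each node update is this same prior-times-likelihood-over-evidence computation.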
This book grew out of a conference on the state of the art and recent advances in Efficiency and Productivity. Papers were commissioned from leading researchers in the field, and include eight explorations into the analytical foundations of efficiency and productivity analysis. Chapters on modeling advances include reverse directional distance function, a new method for estimating technological production possibilities, a new distance function called a loss distance function, an analysis of productivity and price recovery indices, the relation of technical efficiency measures to productivity measures, the implications for benchmarking and target setting of imposing weight restrictions on DEA models, weight restrictions in a regulatory setting, and the Principle of Least Action. Chapters on empirical applications include a study of innovative firms that use innovation inputs to produce innovation outputs, a study of the impact of potential "coopetition" or cooperation among competitors on the financial performance of European automobile plants, using SFA to estimate the eco-efficiency of dairy farms in Spain, a DEA bankruptcy prediction model, a combined stochastic cost frontier analysis model/mixture hazard model, the evolution of energy intensity in nine Spanish manufacturing industries, and the productivity of US farmers as they age.
This book examines discrete dynamical systems with memory: nonlinear systems that exist extensively in biological organisms and financial and economic organizations, and time-delay systems that can be discretized into memorized dynamical systems (MDS). The book further discusses stability and bifurcations of time-delay dynamical systems that can be investigated through memorized dynamical systems, as well as bifurcations of memorized nonlinear dynamical systems, discretization methods of time-delay systems, and periodic motions to chaos in nonlinear time-delay systems. The book helps readers find analytical solutions of MDS, change traditional perturbation analysis in time-delay systems, detect motion complexity and singularity in MDS, and determine stability, bifurcation, and chaos in any time-delay system.
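A simple example of a discrete dynamical system with memory is the delayed logistic map, where the next state depends on both the current state and a lagged one. The parameters below are invented for illustration; the book treats such memorized systems far more generally:

```python
def delayed_logistic(r=1.5, d=1, x0=0.5, steps=200):
    """Iterate x[t+1] = r * x[t] * (1 - x[t-d]): a logistic map whose
    feedback acts through a delay d, i.e. a system with memory.
    For r = 1.5, d = 1 the orbit spirals into the fixed point 1 - 1/r."""
    xs = [x0] * (d + 1)                      # constant initial history
    for t in range(d, d + steps):
        xs.append(r * xs[t] * (1.0 - xs[t - d]))
    return xs

xs = delayed_logistic()
```

Raising r in this map eventually destabilizes the fixed point and produces oscillations, the same stability-to-bifurcation route the book analyzes for general time-delay systems.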
This book explores the role of national fiscal policies in a selected group of Euro-area countries under the European Economic and Monetary Union (EMU). In particular, the authors characterize the response of output to fiscal consolidations and expansions in the small Euro-area open economies affected by high public and private debt. It is shown that the macroeconomic outcome of fiscal shocks is strongly related to debt levels. The Euro-area countries included in the investigation are Greece, Ireland, Italy, the Netherlands, Spain, and Portugal, over the sample period 1999-2016, i.e., the EMU period. The main econometric tools used in this research are structural vector autoregressive (VAR) models, including panel VAR models. The available literature relating to the subject is also fully reviewed. A further closely investigated topic is the potential spillover effects of German fiscal policies on the selected small Euro-area economies. Moreover, in the perspective of the evolution of the Euro Area towards a full Monetary and Fiscal Union, the authors study the effects of area-wide government spending shocks on aggregate output and other macroeconomic variables during the EMU period. The closing chapter of the book considers evidence on the consequences of austerity policies for European labour markets during recent years.
Structural vector autoregressive (VAR) models are important tools for empirical work in macroeconomics, finance, and related fields. This book not only reviews the many alternative structural VAR approaches discussed in the literature, but also highlights their pros and cons in practice. It provides guidance to empirical researchers as to the most appropriate modeling choices and methods for estimating and evaluating structural VAR models. The book traces the evolution of the structural VAR methodology and contrasts it with other common methodologies, including dynamic stochastic general equilibrium (DSGE) models. It is intended as a bridge between the often quite technical econometric literature on structural VAR modeling and the needs of empirical researchers. The focus is not on providing the most rigorous theoretical arguments, but on enhancing the reader's understanding of the methods in question and their assumptions. Empirical examples are provided for illustration.
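The reduced-form first step underlying every structural VAR can be sketched by fitting a VAR(1) by OLS on simulated data. Everything here (the coefficient matrix, noise scale, sample size) is invented for illustration; structural identification, the book's focus, would impose further restrictions on top of this reduced form:

```python
import numpy as np

rng = np.random.default_rng(0)
A_true = np.array([[0.5, 0.1],
                   [0.0, 0.4]])    # invented stable coefficient matrix

# Simulate y[t] = A_true @ y[t-1] + noise
T = 5000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.standard_normal(2) * 0.1

# OLS: regress y[t] on y[t-1].  A_hat' = (X'X)^{-1} X'Y
X, Y = y[:-1], y[1:]
A_hat = np.linalg.solve(X.T @ X, X.T @ Y).T
```

With a long stable sample, `A_hat` recovers `A_true` closely; the structural step then decomposes the residual covariance into economically interpretable shocks.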
This book presents an empirical investigation into the relationship between companies' short-term response to capital and labor market frictions and performance. Two different kinds of performance measures are considered, namely innovation performance and firm performance. The author focuses on two major topics: first, on the relation between innovation performance and the use of trade credit; second, on the relation between firm performance and the use of temporary employment. The use of in-depth firm-level data and state-of-the-art microeconometric methods lends scientific rigor to this investigation of questions currently confronting many companies in different economies.
This book presents eleven classic papers by the late Professor Suzanne Scotchmer, with introductions by leading economists and legal scholars. It introduces Scotchmer's life and work; analyses her pioneering contributions to the economics of patents and innovation incentives, with a special focus on the modern theory of cumulative innovation; and describes her pioneering work on law and economics, evolutionary game theory, and general equilibrium/club theory. The book also provides a self-contained introduction for students who want to learn more about the various fields that Professor Scotchmer worked in, with a particular focus on patent incentives and cumulative innovation.
Essentials of Applied Econometrics prepares students for a world in which more data surround us every day and in which econometric tools are put to diverse uses. Written for students in economics and for professionals interested in continuing an education in econometrics, this succinct text not only teaches best practices and state-of-the-art techniques, but uses vivid examples and data obtained from a variety of real world sources. The book's emphasis on application uniquely prepares the reader for today's econometric work, which can include analyzing causal relationships or correlations in big data to obtain useful insights.
This new and exciting book offers a fresh approach to quantitative finance and utilises novel features, including stereoscopic images which permit 3D visualisation of complex subjects without the need for additional tools. Offering an integrated approach to the subject, A First Course in Quantitative Finance introduces students to the architecture of complete financial markets before exploring the concepts and models of modern portfolio theory, derivative pricing and fixed income products in both complete and incomplete market settings. Subjects are organised throughout in a way that encourages a gradual and parallel learning process of both the economic concepts and their mathematical descriptions, framed by additional perspectives from classical utility theory, financial economics and behavioural finance. Suitable for postgraduate students studying courses in quantitative finance, financial engineering and financial econometrics as part of an economics, finance, econometrics or mathematics program, this book contains all necessary theoretical and mathematical concepts and numerical methods, as well as the necessary programming code for porting algorithms onto a computer.