This Handbook takes an econometric approach to the foundations of economic performance analysis. The focus is on the measurement of efficiency, productivity, growth and performance. These concepts are commonly measured residually and are difficult to quantify in practice. In real-life applications, efficiency and productivity estimates are often quite sensitive to the models used in the performance assessment and the methodological approaches adopted by the analyst. The Palgrave Handbook of Performance Analysis discusses the two basic techniques of performance measurement - deterministic benchmarking and stochastic benchmarking - in detail, and addresses the statistical techniques that connect them. All chapters include applications and explore topics ranging from the output/input ratio to productivity indexes and national statistics.
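As a rough taste of the deterministic side (a minimal sketch, not an example from the Handbook): with a single input and a single output, each unit can be benchmarked by its output/input ratio relative to the best observed ratio. All data below are hypothetical; real studies use DEA or stochastic frontier models with many inputs and outputs.

```python
# Minimal deterministic benchmarking sketch: efficiency of each
# production unit as its output/input ratio relative to the best
# observed ratio. Hypothetical data.
import numpy as np

inputs = np.array([10.0, 8.0, 12.0, 6.0])     # e.g. labour hours
outputs = np.array([20.0, 18.0, 21.0, 15.0])  # e.g. units produced

productivity = outputs / inputs                 # the output/input ratio
efficiency = productivity / productivity.max()  # 1.0 = on the frontier

for i, (p, e) in enumerate(zip(productivity, efficiency)):
    print(f"unit {i}: productivity {p:.2f}, efficiency {e:.2f}")
```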
Now in its fourth edition, this comprehensive introduction to fundamental panel data methodologies provides insights on what is most essential in the panel literature. A capstone to the forty-year career of a pioneer of panel data analysis, this new edition's primary contribution is its coverage of advances in panel data analysis, a statistical method widely used to analyze two- or higher-dimensional panel data. The topics discussed in early editions have been reorganized and streamlined to comprehensively introduce panel econometric methodologies useful for identifying causal relationships among variables, supported by interdisciplinary examples and case studies. Part of Cambridge's Econometric Society Monographs series, this book has been the leader in the field since its first edition. It is essential reading for researchers, practitioners and graduate students interested in the analysis of microeconomic behavior.
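One of the basic methods any such text covers is the within (fixed-effects) estimator. The sketch below (simulated data, not an example from the book) shows how demeaning within each entity removes time-invariant heterogeneity that is correlated with the regressor:

```python
# Within (fixed-effects) estimator sketch on simulated panel data:
# demean y and x within each entity, then run OLS on the demeaned data.
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_periods, beta = 100, 8, 1.5

alpha = rng.normal(size=n_entities)                # entity fixed effects
x = rng.normal(size=(n_entities, n_periods)) + alpha[:, None]  # x correlated with alpha
y = beta * x + alpha[:, None] + rng.normal(size=x.shape)

x_dm = x - x.mean(axis=1, keepdims=True)           # within transformation
y_dm = y - y.mean(axis=1, keepdims=True)

beta_fe = (x_dm * y_dm).sum() / (x_dm ** 2).sum()  # pooled OLS on demeaned data
print(f"within estimate of beta: {beta_fe:.3f}")   # close to the true 1.5
```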
We live in a time of economic virtualism, whereby our lives are made to conform to the virtual reality of economic thought. Globalization, transnational capitalism, structural adjustment programmes and the decay of welfare are all signs of the growing power of economics, one of the most potent forces of recent decades. In the last thirty years, economics has ceased to be just an academic discipline concerned with the study of economy, and has come to be the only legitimate way to think about all aspects of society and how we order our lives. Economic models are no longer measured against the world they seek to describe, but instead the world is measured against them, found wanting and made to conform.
This book is dedicated to the study of the term structures of the yields of zero-coupon bonds. The methods it describes differ from those usually found in the literature in that the time variable is not the term to maturity but the interest rate duration, or another convenient non-linear transformation of terms. This makes it possible to consider yield curves not only for a limited interval of term values, but also for the entire positive semiaxis of terms. The main focus is the comparative analysis of yield curves and forward curves and the analytical study of their features. Generalizations of yield term structures are studied where the dimension of the state space of the financial market is increased. In cases where the analytical approach is too cumbersome, or impossible, numerical techniques are used. This book will be of interest to financial analysts, financial market researchers, graduate students and PhD students.
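For orientation, the standard objects under study are as follows (these relations are textbook-standard, not specific to this book): if P(t) is the price of a zero-coupon bond with term t, the continuously compounded yield y(t) and the instantaneous forward rate f(t) satisfy

\[ P(t) = e^{-t\,y(t)}, \qquad y(t) = -\frac{\ln P(t)}{t}, \qquad f(t) = -\frac{\partial \ln P(t)}{\partial t} = y(t) + t\,y'(t). \]

The book's change of time variable replaces the term t with duration or another non-linear transformation, but these relations are the starting point.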
High-dimensional probability offers insight into the behavior of random vectors, random matrices, random subspaces, and objects used to quantify uncertainty in high dimensions. Drawing on ideas from probability, analysis, and geometry, it lends itself to applications in mathematics, statistics, theoretical computer science, signal processing, optimization, and more. This is the first book to integrate theory, key tools, and modern applications of high-dimensional probability. Concentration inequalities form the core, and the book covers both classical results such as Hoeffding's and Chernoff's inequalities and modern developments such as the matrix Bernstein inequality. It then introduces the powerful methods based on stochastic processes, including such tools as Slepian's, Sudakov's, and Dudley's inequalities, as well as generic chaining and bounds based on VC dimension. A broad range of illustrations is embedded throughout, including classical and modern results for covariance estimation, clustering, networks, semidefinite programming, coding, dimension reduction, matrix completion, machine learning, compressed sensing, and sparse regression.
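As a taste of the classical results mentioned, Hoeffding's inequality states that for independent random variables X_1, ..., X_n with X_i in [a_i, b_i] almost surely, the sum S_n = X_1 + ... + X_n satisfies, for all t > 0,

\[ \mathbb{P}\bigl(\lvert S_n - \mathbb{E}S_n \rvert \ge t\bigr) \;\le\; 2\exp\!\left(-\frac{2t^2}{\sum_{i=1}^{n}(b_i - a_i)^2}\right). \]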
Connections among different assets, asset classes, portfolios, and the stocks of individual institutions are critical in examining financial markets. Interest in financial markets implies interest in underlying macroeconomic fundamentals. In Financial and Macroeconomic Connectedness, Frank Diebold and Kamil Yilmaz propose a simple framework for defining, measuring, and monitoring connectedness, which is central to finance and macroeconomics. These measures of connectedness are theoretically rigorous yet empirically relevant. The approach to connectedness proposed by the authors is intimately related to the familiar econometric notion of variance decomposition. The full set of variance decompositions from vector autoregressions produces the core of the 'connectedness table.' The connectedness table makes clear how one can begin with the most disaggregated pairwise directional connectedness measures and aggregate them in various ways to obtain total connectedness measures. The authors also show that variance decompositions define weighted, directed networks, so that these proposed connectedness measures are intimately related to key measures of connectedness used in the network literature. After describing their methods in the first part of the book, the authors proceed to characterize daily return and volatility connectedness across major asset (stock, bond, foreign exchange and commodity) markets, as well as across financial institutions within the U.S. and across countries, since the late 1990s. These specific measures of volatility connectedness show that stock markets played a critical role in spreading volatility shocks from the U.S. to other countries. Furthermore, while the return connectedness across stock markets increased gradually over time, the volatility connectedness measures were subject to significant jumps during major crisis events. This book examines not only financial connectedness, but also real fundamental connectedness. In particular, the authors show that global business cycle connectedness is economically significant and time-varying, that the U.S. has disproportionately high connectedness to others, and that pairwise country connectedness is inversely related to bilateral trade surpluses.
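The aggregation step is easy to sketch in code. The toy below uses simulated data; note that statsmodels' fevd() uses a Cholesky-orthogonalized decomposition, whereas Diebold and Yilmaz favor the generalized, order-invariant decomposition, so this is a rough sketch of the table-building logic only.

```python
# Sketch of a Diebold-Yilmaz-style connectedness table: fit a VAR,
# take forecast-error variance decompositions at some horizon, and
# aggregate the off-diagonal (cross-variable) shares.
import numpy as np
from statsmodels.tsa.api import VAR

def connectedness_table(returns, lags=2, horizon=10):
    res = VAR(returns).fit(lags)
    # decomp[i, h, j]: share of i's h-step forecast-error variance due to j
    decomp = res.fevd(horizon).decomp
    table = decomp[:, -1, :]               # pairwise shares at the final horizon
    n = table.shape[0]
    off_diag = table.sum() - np.trace(table)
    total = 100.0 * off_diag / n           # total connectedness index (percent)
    return table, total

rng = np.random.default_rng(0)
returns = rng.normal(size=(500, 4))        # placeholder data; use real returns
table, total = connectedness_table(returns)
print(f"total connectedness: {total:.1f}%")
```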
Drawing on the author's extensive and varied research, this book provides readers with a firm grounding in the concepts and issues across several disciplines including economics, nutrition, psychology and public health, in the hope of improving the design of food policies in the developed and developing world. Using longitudinal (panel) data from India, Bangladesh, Kenya, the Philippines, Vietnam, and Pakistan, and extending the analytical framework used in economics and biomedical sciences to include multi-disciplinary analyses, Alok Bhargava shows how rigorous and thoughtful econometric and statistical analysis can improve our understanding of the effects of socioeconomic, nutritional, and behavioural variables on issues such as cognitive development in children and labour productivity in the developing world. These unique insights, combined with a multi-disciplinary approach, forge the way for a more refined and effective approach to food policy formation going forward. A chapter on the growing obesity epidemic is also included, highlighting the new set of problems facing both developed and developing countries. The book also includes a glossary of technical terms to assist readers coming from a variety of disciplines.
This report is a partial result of the China Quarterly Macroeconomic Model (CQMM), a project developed and maintained by the Center for Macroeconomic Research (CMR) at Xiamen University. The CMR, one of the Key Research Institutes of Humanities and Social Sciences sponsored by the Ministry of Education of China, has been focusing on China's economic forecasting and macroeconomic policy analysis, and it started to develop the CQMM in 2005 for the purposes of short-term forecasting, policy analysis, and simulation. Based on the CQMM, the CMR and its partners hold press conferences to release forecasts for China's major macroeconomic variables. Since July 2006, twenty-six quarterly reports on China's macroeconomic outlook have been presented and thirteen annual reports have been published. This 27th quarterly report was presented at the Forum on China's Macroeconomic Prospects and Press Conference of the CQMM at Xiamen University Malaysia on October 25, 2019. The conference was jointly held by Xiamen University and the Economic Information Daily of Xinhua News Agency.
This introductory overview explores the methods, models and interdisciplinary links of artificial economics, a new way of doing economics in which the interactions of artificial economic agents are computationally simulated to study their individual and group behavior patterns. Conceptually and intuitively, and with simple examples, Mercado addresses the differences between the basic assumptions and methods of artificial economics and those of mainstream economics. He goes on to explore various disciplines from which the concepts and methods of artificial economics originate; for example cognitive science, neuroscience, artificial intelligence, evolutionary science and complexity science. Introductory discussions on several controversial issues are offered, such as the application of the concepts of evolution and complexity in economics and the relationship between artificial intelligence and the philosophies of mind. This is one of the first books to fully address artificial economics, emphasizing its interdisciplinary links and presenting in a balanced way its occasionally controversial aspects.
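A deliberately tiny example of the approach (illustrative only, not taken from the book): simulate "zero-intelligence" buyers and sellers who trade whenever a randomly drawn valuation exceeds a randomly drawn cost, and observe the emergent pattern of transaction prices.

```python
# Minimal agent-based market sketch: zero-intelligence buyers and
# sellers trade at the midpoint whenever a mutually beneficial trade
# exists; the price distribution emerges from the simulation.
import random

random.seed(0)
buyer_values = [random.uniform(0, 1) for _ in range(100)]  # willingness to pay
seller_costs = [random.uniform(0, 1) for _ in range(100)]  # reservation costs

prices = []
for _ in range(10_000):
    v = random.choice(buyer_values)
    c = random.choice(seller_costs)
    if v > c:                        # a mutually beneficial trade exists
        prices.append((v + c) / 2)   # split the surplus at the midpoint

print(f"trades: {len(prices)}, mean price: {sum(prices)/len(prices):.3f}")
```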
This book develops a range of quantitative approaches to the study of electoral systems: game-theoretic, decision-theoretic, statistical, probabilistic, combinatorial, geometric, and optimization-based. All the authors are prominent scholars from these disciplines. Quantitative approaches offer a powerful tool for detecting inconsistencies or poor performance in actual systems. Applications to concrete settings such as the EU, the American Congress, and regional and committee voting are discussed.
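As one concrete instance of the game-theoretic toolkit applied to committee voting (a standard construction, not necessarily one of the chapters' own examples), the normalized Banzhaf power index of a weighted voting game can be computed by enumerating coalitions. Weights and quota below are hypothetical.

```python
# Banzhaf power index by brute-force enumeration: a member's swings
# are the winning coalitions in which it is critical (removing it
# makes the coalition lose); indices are swings normalized to sum to 1.
from itertools import combinations

weights = {"A": 4, "B": 3, "C": 2, "D": 1}
quota = 6

def is_winning(coalition):
    return sum(weights[m] for m in coalition) >= quota

swings = {m: 0 for m in weights}
members = list(weights)
for r in range(len(members) + 1):
    for coalition in combinations(members, r):
        if is_winning(coalition):
            for m in coalition:
                rest = tuple(x for x in coalition if x != m)
                if not is_winning(rest):
                    swings[m] += 1   # m is critical in this coalition

total = sum(swings.values())
for m, s in swings.items():
    print(f"{m}: Banzhaf index {s / total:.3f}")
```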
Applications of queueing network models have multiplied in the last generation, including scheduling of large manufacturing systems, control of patient flow in health systems, load balancing in cloud computing, and matching in ride sharing. These problems are too large and complex for exact solution, but their scale allows approximation. This book is the first comprehensive treatment of fluid scaling, diffusion scaling, and many-server scaling in a single text presented at a level suitable for graduate students. Fluid scaling is used to verify stability, in particular treating max weight policies, and to study optimal control of transient queueing networks. Diffusion scaling is used to control systems in balanced heavy traffic, by solving for optimal scheduling, admission control, and routing in Brownian networks. Many-server scaling is studied in the quality and efficiency driven Halfin-Whitt regime and applied to load balancing in the supermarket model and to bipartite matching in ride-sharing applications.
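The supermarket model is easy to simulate. The rough sketch below (an embedded-jump-chain simulation with illustrative parameters, not code from the book) shows why sampling d = 2 queues instead of d = 1 dramatically shortens queues at high load:

```python
# Rough supermarket-model simulation: arrivals at total rate n*lam join
# the shortest of d randomly sampled queues; each busy server serves at
# rate 1. Events follow the embedded jump chain of the Markov process.
import random

def supermarket(d, n=100, lam=0.9, events=200_000, seed=1):
    rng = random.Random(seed)
    q = [0] * n
    area = 0
    for _ in range(events):
        busy = sum(1 for x in q if x > 0)
        if busy == 0 or rng.random() < n * lam / (n * lam + busy):
            # arrival: join the shortest of d sampled queues
            target = min(rng.sample(range(n), d), key=q.__getitem__)
            q[target] += 1
        else:
            # departure from a uniformly chosen busy server
            q[rng.choice([j for j in range(n) if q[j] > 0])] -= 1
        area += sum(q)
    return area / events / n   # rough average queue length per server

for d in (1, 2):
    print(f"d={d}: mean queue length ~ {supermarket(d):.2f}")
```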
This book examines conventional time series in the context of stationary data prior to a discussion of cointegration, with a focus on multivariate models. The authors provide a detailed and extensive study of impulse responses and forecasting in the stationary and non-stationary context, considering small sample correction, volatility and the impact of different orders of integration. Models with expectations are considered along with alternate methods such as Singular Spectrum Analysis (SSA), the Kalman Filter and Structural Time Series, all in relation to cointegration. Using single-equation methods to develop topics, and as examples of the notion of cointegration, Burke, Hunter, and Canepa provide direction and guidance through the now vast literature facing students and graduate economists.
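A minimal sketch of a single-equation cointegration check in the Engle-Granger spirit, on simulated data sharing a stochastic trend (statsmodels' coint() wraps the residual-based unit-root test):

```python
# Engle-Granger-style cointegration test: y and x share a random-walk
# trend, so a linear combination of them is stationary.
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(0)
trend = np.cumsum(rng.normal(size=500))   # shared random walk
x = trend + rng.normal(size=500)
y = 2.0 * trend + rng.normal(size=500)

tstat, pvalue, _ = coint(y, x)
print(f"Engle-Granger t-stat: {tstat:.2f}, p-value: {pvalue:.3f}")
```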
Coverage has been extended in this edition to include recent topics. The book again presents a unified treatment of econometric theory, with the method of maximum likelihood playing a key role in both estimation and testing. Exercises are included, and the book is suitable as a general text for final-year undergraduate and postgraduate students.
The main objective of this book is to develop a strategy and policy measures to enhance the formalization of the shadow economy in order to improve the competitiveness of the economy and contribute to economic growth; it explores these issues with special reference to Serbia. The size and development of the shadow economy in Serbia and other Central and Eastern European countries are estimated using two different methods (the MIMIC method and the household-tax-compliance method). Micro-estimates are based on a special survey of business entities in Serbia, which for the first time allows us to explore the shadow economy from the perspective of enterprises and entrepreneurs. The authors identify the types of shadow economy at work in business entities, the determinants of shadow economy participation, and the impact of competition from the informal sector on businesses. Readers will learn both about the potential fiscal effects of reducing the shadow economy to the levels observed in more developed countries and the effects that formalization of the shadow economy can have on economic growth.
Decision-theoretic ideas can structure the process of inference together with the decision-making that inference supports. Statistical decision theory is the sub-discipline of statistics which explores and develops this structure. Typically, discussion of decision theory within one discipline does not recognise that other disciplines may have considered the same or similar problems. This text, Volume 9 in the prestigious Kendall's Library of Statistics, provides an overview of the main ideas and concepts of statistical decision theory and sets it within the broader context of decision theory, decision analysis and decision support as they are practised in many disciplines beyond statistics - including artificial intelligence, economics, operational research, philosophy and psychology.
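At the heart of the subject is a single organizing formula (standard, not specific to this volume): given a loss function L(θ, a) and a posterior π(θ | x), the Bayes action minimizes posterior expected loss,

\[ a^{*}(x) = \arg\min_{a \in \mathcal{A}} \int_{\Theta} L(\theta, a)\, \pi(\theta \mid x)\, d\theta . \]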
Computable general equilibrium (CGE) models play an important role in supporting public-policy making on such issues as trade, climate change and taxation. This significantly revised volume, keeping pace with the next-generation standard CGE model, is the only undergraduate-level introduction of its kind. The volume utilizes a graphical approach to explain the economic theory underlying a CGE model, and provides results from simple, small-scale CGE models to illustrate the links between theory and model outcomes. Its eleven hands-on exercises introduce modelling techniques that are applied to real-world economic problems. Students learn how to integrate their separate fields of economic study into a comprehensive, general equilibrium perspective as they develop their skills as producers or consumers of CGE-based analysis.
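A CGE model is far richer, but the underlying fixed-point logic can be shown in a toy exchange economy (hypothetical preferences and endowments; a sketch of the idea, not the book's method): solve for the relative price at which excess demand vanishes.

```python
# Toy general-equilibrium computation: two Cobb-Douglas consumers
# exchange two goods; find the relative price that clears the market
# for good 1 (good 2 clears by Walras' law).
from scipy.optimize import brentq

alpha = [0.3, 0.7]                 # expenditure share on good 1
endow = [(1.0, 2.0), (2.0, 1.0)]   # (good 1, good 2) endowments

def excess_demand_good1(p1, p2=1.0):
    demand = 0.0
    for a, (e1, e2) in zip(alpha, endow):
        income = p1 * e1 + p2 * e2
        demand += a * income / p1  # Cobb-Douglas demand for good 1
    return demand - sum(e1 for e1, _ in endow)

p_star = brentq(excess_demand_good1, 1e-6, 100.0)
print(f"equilibrium relative price p1/p2 = {p_star:.3f}")
```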
This is the perfect (and essential) supplement for all econometrics classes - from a rigorous first undergraduate course, to a first master's, to a PhD course.
Including contributions spanning a variety of theoretical and applied topics in econometrics, this volume of Advances in Econometrics is published in honour of Cheng Hsiao. In the first few chapters of this book, new theoretical panel and time series results are presented, exploring JIVE estimators, HAC, HAR and various sandwich estimators, as well as asymptotic distributions for using information criteria to distinguish between the unit root model and explosive models. Other chapters address topics such as structural breaks or growth empirics; auction models; and semiparametric methods testing for common vs. individual trends. Three chapters provide novel empirical approaches to applied problems, such as estimating the impact of survey mode on responses, or investigating how cross-sectional and spatial dependence of mortgages varies by default rates and geography. In the final chapters, Cheng Hsiao offers a forward-focused discussion of the role of big data in economics. For any researcher of econometrics, this is an unmissable volume of the most current and engaging research in the field.
Nanak Kakwani and Hyun Hwa Son make use of social welfare functions to derive indicators of development relevant to specific social objectives, such as poverty- and inequality-reduction. Arguing that the measurement of development cannot be value-free, the authors assert that if indicators of development are to have policy relevance, they must be assessed on the basis of the social objectives in question. This study develops indicators that are sensitive to both the level and the distribution of individuals' capabilities. The idea of the social welfare function, defined in income space, is extended to the concept of the social well-being function, defined in capability space. Through empirical analysis from selected developing countries, with a particular focus on Brazil, the authors shape techniques appropriate to the analysis of development in different dimensions. The focus of this evidence-based policy analysis is to evaluate alternative policies affecting the capacities of people to enjoy a better life.
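As a generic illustration of a distribution-sensitive indicator (a standard Atkinson-type measure; the authors' own constructions in capability space differ), the equally-distributed-equivalent income penalizes inequality through an aversion parameter:

```python
# Atkinson-type welfare indicator: the equally-distributed-equivalent
# (EDE) income and the implied inequality index, for inequality-
# aversion parameter eps. Incomes below are hypothetical.
import numpy as np

def atkinson(incomes, eps=1.5):
    y = np.asarray(incomes, dtype=float)
    if eps == 1.0:
        ede = np.exp(np.mean(np.log(y)))         # geometric mean
    else:
        ede = np.mean(y ** (1 - eps)) ** (1 / (1 - eps))
    return ede, 1 - ede / y.mean()               # EDE income, Atkinson index

incomes = [100, 200, 300, 1000, 5000]
ede, a = atkinson(incomes)
print(f"EDE income: {ede:.0f}, Atkinson index: {a:.3f}")
```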
This book examines whether continuous-time models in frictionless financial economies can be well approximated by discrete-time models. It specifically looks to answer the question: in what sense and to what extent does the famous Black-Scholes-Merton (BSM) continuous-time model of financial markets idealize more realistic discrete-time models of those markets? While it is well known that the BSM model is an idealization of discrete-time economies where the stock price process is driven by a binomial random walk, it is less known that the BSM model idealizes discrete-time economies whose stock price process is driven by more general random walks. Starting with the basic foundations of discrete-time and continuous-time models, David M. Kreps takes the reader through to this important insight with the goal of lowering the entry barrier for many mainstream financial economists, thus bringing less-technical readers to a better understanding of the connections between BSM and nearby discrete-time economies.
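The flavor of that convergence is easy to demonstrate (a standard Cox-Ross-Rubinstein sketch with illustrative parameters, not code from the book): the binomial price of a European call approaches the BSM price as the number of time steps grows.

```python
# Binomial (CRR) call price converging to the Black-Scholes-Merton price.
import math
from scipy.stats import norm

S, K, T, r, sigma = 100.0, 100.0, 1.0, 0.05, 0.2

def bsm_call(S, K, T, r, sigma):
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm.cdf(d1) - K * math.exp(-r * T) * norm.cdf(d2)

def crr_call(S, K, T, r, sigma, n):
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt)); d = 1 / u
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up-probability
    price = 0.0
    for k in range(n + 1):                 # sum over terminal nodes
        price += math.comb(n, k) * p**k * (1 - p)**(n - k) \
                 * max(S * u**k * d**(n - k) - K, 0.0)
    return math.exp(-r * T) * price

print(f"BSM: {bsm_call(S, K, T, r, sigma):.4f}")
for n in (10, 100, 1000):
    print(f"CRR n={n}: {crr_call(S, K, T, r, sigma, n):.4f}")
```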
This book offers a useful combination of probabilistic and statistical tools for analyzing nonlinear time series. Key features of the book include a study of the extremal behavior of nonlinear time series and a comprehensive list of nonlinear models that address different aspects of nonlinearity. Several inferential methods, including quasi-likelihood methods, sequential Markov chain Monte Carlo methods and particle filters, are also included so as to provide an overall view of the available tools for parameter estimation in nonlinear models. A further chapter brings together all recent advances in integer time series models based on several thinning operations. Readers should have attended a prior course on linear time series, and a good grasp of simulation-based inferential methods is recommended. This book offers a valuable resource for second-year graduate students and researchers in statistics and other scientific areas who need a basic understanding of nonlinear time series.
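For readers new to thinning, a minimal sketch (simulated, with illustrative parameters) of the canonical integer-valued model built from binomial thinning, the INAR(1) process X_t = α∘X_{t-1} + ε_t, where α∘X counts the survivors of X independent Bernoulli(α) trials and ε_t is Poisson noise:

```python
# INAR(1) simulation via binomial thinning with Poisson innovations.
import numpy as np

rng = np.random.default_rng(0)
alpha, lam, T = 0.6, 2.0, 500

x = np.zeros(T, dtype=int)
for t in range(1, T):
    survivors = rng.binomial(x[t - 1], alpha)  # binomial thinning
    x[t] = survivors + rng.poisson(lam)        # innovation

print(f"sample mean: {x.mean():.2f} (theory: {lam / (1 - alpha):.2f})")
```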
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. Since its beginnings in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques with "space" as a core dimension of the analysis. This edited volume addresses the complex issue of Spatial Economics from an applied point of view. It is part of a larger project including another edited volume (Spatial Economics Volume I: Theory) collecting original papers which address Spatial Economics from a theoretical perspective.
Space is a crucial variable in any economic activity. Spatial Economics is the branch of economics that explicitly aims to incorporate the space dimension in the analysis of economic phenomena. Since its beginnings in the last century, Spatial Economics has contributed to the understanding of the economy by developing a wealth of theoretical models as well as econometric techniques with "space" as a core dimension of the analysis. This edited volume addresses the complex issue of Spatial Economics from a theoretical point of view. It is part of a larger project including another edited volume (Spatial Economics Volume II: Applications) collecting original papers which address Spatial Economics from an applied perspective.
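As a small taste of the econometric side these volumes treat (a standard statistic, not drawn from either volume): Moran's I measures spatial autocorrelation of a regional variable given a spatial weight matrix, with positive values indicating clustering and negative values dispersion.

```python
# Moran's I on a hypothetical 4-region contiguity matrix:
# I = (n / S0) * (z' W z) / (z' z), with z the demeaned outcome.
import numpy as np

W = np.array([[0, 1, 1, 0],    # row i lists the neighbours of region i
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
y = np.array([10.0, 20.0, 12.0, 18.0])  # hypothetical regional outcome

z = y - y.mean()
n, s0 = len(y), W.sum()
moran_i = (n / s0) * (z @ W @ z) / (z @ z)
print(f"Moran's I: {moran_i:.3f}")
```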