In 1945, very early in the history of the development of a rigorous analytical theory of probability, Feller (1945) wrote a paper called "The fundamental limit theorems in probability" in which he set out what he considered to be "the two most important limit theorems in the modern theory of probability: the central limit theorem and the recently discovered ... 'Kolmogoroff's celebrated law of the iterated logarithm'". A little later in the article he added to these, via a charming description, the "little brother (of the central limit theorem), the weak law of large numbers", and also the strong law of large numbers, which he considers a close relative of the law of the iterated logarithm. Feller might well have added the beautiful and highly applicable results of renewal theory, which at the time he himself, together with eminent colleagues, was vigorously producing. Feller's introductory remarks include the visionary: "The history of probability shows that our problems must be treated in their greatest generality: only in this way can we hope to discover the most natural tools and to open channels for new progress. This remark leads naturally to that characteristic of our theory which makes it attractive beyond its importance for various applications: a combination of an amazing generality with algebraic precision."
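For readers who want the statements behind the names, here is a compact rendering of the two theorems Feller singles out, under the standard assumptions of i.i.d. random variables $X_1, X_2, \ldots$ with mean $\mu$, finite variance $\sigma^2$, and partial sums $S_n = X_1 + \cdots + X_n$:

```latex
% Central limit theorem: standardized partial sums converge in
% distribution to the standard normal law.
\frac{S_n - n\mu}{\sigma\sqrt{n}} \;\xrightarrow{d}\; N(0,1), \qquad n \to \infty

% Law of the iterated logarithm: the almost-sure envelope of the
% same fluctuations, on the finer \sqrt{2n \log\log n} scale.
\limsup_{n \to \infty} \frac{S_n - n\mu}{\sigma\sqrt{2n \log\log n}} = 1 \quad \text{almost surely}
```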
The goal of this book is to assess the efficacy of India's financial deregulation programme by analyzing the developments in cost efficiency and total factor productivity growth across different ownership types and size classes in the banking sector over the post-deregulation years. The work also gauges the impact of including or excluding a proxy for non-traditional activities on the cost efficiency estimates for Indian banks, and on the ranking of distinct ownership groups. It also investigates the hitherto neglected question of the nature of returns to scale in the Indian banking industry. In addition, the work explores the key bank-specific factors that explain the inter-bank variations in efficiency and productivity growth. Overall, the empirical results of this work allow us to ascertain whether the gradualist approach to reforming the banking system in a developing economy like India has delivered the most significant policy goal: efficiency and productivity gains. The authors believe that the findings of this book could offer useful policy directions and suggestions to other developing economies that have embarked on a deregulation path or are contemplating doing so.
Econophysics of Games and Social Choices:
- Kolkata Paise Restaurant Problem in Some Uniform Learning Strategy Limits
- Cycle Monotonicity in Scheduling Models
- Reinforced Learning in Market Games
- Mechanisms Supporting Cooperation for the Evolutionary Prisoner's Dilemma Games
- Economic Applications of Quantum Information Processing
- Using Many-Body Entanglement for Coordinated Action in Game Theory Problems
- Condensation Phenomena and Pareto Distribution in Disordered Urn Models
- Economic Interactions and the Distribution of Wealth
- Wealth Redistribution in Boltzmann-like Models of Conservative Economies
- Multi-species Models in Econo- and Sociophysics
- The Morphology of Urban Agglomerations for Developing Countries: A Case Study with China
- A Mean-Field Model of Financial Markets: Reproducing Long Tailed Distributions and Volatility Correlations
- Statistical Properties of Fluctuations: A Method to Check Market Behavior
- Modeling Saturation in Industrial Growth
- The Kuznets Curve and the Inequality Process
- Monitoring the Teaching-Learning Process via an Entropy Based Index
- Technology Level in the Industrial Supply Chain: Thermodynamic Concept
- Discussions and Comments in Econophys Kolkata IV

Contributions to Quantitative Economics:
- On Multi-Utility Representation of Equitable Intergenerational Preferences
- Variable Populations and Inequality-Sensitive Ethical Judgments
- A Model of Income Distribution
- Statistical Database of the Indian Economy: Need for New Directions
- Does Parental Education Protect Child Health? Some Evidence from Rural Udaipur
- Food Security and Crop Diversification: Can West Bengal Achieve Both?
- Estimating Equivalence Scales Through Engel Curve Analysis
- Testing for Absolute Convergence: A Panel Data Approach
- Goodwin's Growth Cycles: A Reconsideration
- Human Capital Accumulation, Economic Growth and Educational Subsidy Policy in a Dual Economy
- Arms Trade and Conflict Resolution: A Trade-Theoretic Analysis
- Trade and Wage Inequality with Endogenous Skill Formation
- Dominant Strategy Implementation in Multi-unit Allocation Problems
- Allocation through Reduction on Minimum Cost Spanning Tree Games
- Unmediated and Mediated Communication Equilibria of Battle of the Sexes with Incomplete Information
- A Characterization Result on the Coincidence of the Prenucleolus and the Shapley Value
- The Ordinal Equivalence of the Johnston Index and the Established Notions of Power
- Reflecting on Market Size and Entry under Oligopoly
This restructured, updated Third Edition provides a general overview of the econometrics of panel data, from both theoretical and applied viewpoints. Readers discover how econometric tools are used to study organizational and household behaviors as well as macroeconomic phenomena such as economic growth. The book contains sixteen entirely new chapters; all other chapters have been revised to account for recent developments. With contributions from well-known specialists in the field, this handbook is a standard reference for all those involved in the use of panel data in econometrics.
Pioneered by American economist Paul Samuelson, revealed preference theory is based on the idea that the preferences of consumers are revealed in their purchasing behavior. Researchers in this field have developed complex and sophisticated mathematical models to capture the preferences that are 'revealed' through consumer choice behavior. This study of consumer demand and behavior is closely tied up with econometrics (especially nonparametric econometrics), where testing the validity of different theoretical models is an important aspect of research. The theory of revealed preference has a very long and distinguished tradition in economics, but there was no systematic presentation of the theory until now. This book deals with basic questions in economic theory, such as the relation between theory and data, and studies the situations in which empirical observations are consistent or inconsistent with some of the best known theories in economics.
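One core nonparametric test in this literature checks whether observed choices satisfy the Generalized Axiom of Revealed Preference (GARP): no revealed-preference cycle may contain a strictly cheaper comparison. Below is a minimal Python sketch of such a consistency check; the function name and the two-good price/quantity data are our hypothetical illustrations, not material from the book.

```python
import numpy as np

def satisfies_garp(prices, quantities, eps=1e-9):
    """Check the Generalized Axiom of Revealed Preference (GARP).

    prices, quantities: (T, n) arrays; row t holds the price vector p_t
    and the chosen bundle x_t of observation t.
    """
    cost = prices @ quantities.T       # cost[t, s] = p_t . x_s
    own = np.diag(cost).copy()         # own[t]     = p_t . x_t
    # Direct revealed preference: x_t R x_s if x_s was affordable
    # when x_t was chosen, i.e. p_t . x_s <= p_t . x_t.
    R = cost <= own[:, None] + eps
    T = len(own)
    # Warshall transitive closure of the revealed-preference relation.
    for k in range(T):
        R |= R[:, k][:, None] & R[k, :][None, :]
    # GARP fails if x_t is (indirectly) revealed preferred to x_s while
    # x_t was strictly cheaper than x_s at prices p_s: p_s.x_t < p_s.x_s.
    strict = cost < own[:, None] - eps   # strict[s, t]: p_s.x_t < p_s.x_s
    return not np.any(R & strict.T)

# Hypothetical two-good example with mutually unaffordable bundles.
p = np.array([[1.0, 2.0], [2.0, 1.0]])
x = np.array([[4.0, 1.0], [1.0, 4.0]])
print(satisfies_garp(p, x))   # True: these choices are consistent
```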
This lively book lays out a methodology of confidence distributions and puts them through their paces. Among other merits, they lead to optimal combinations of confidence from different sources of information, and they can make complex models amenable to objective and indeed prior-free analysis for less subjectively inclined statisticians. The generous mixture of theory, illustrations, applications and exercises is suitable for statisticians at all levels of experience, as well as for data-oriented scientists. Some confidence distributions are less dispersed than their competitors. This concept leads to a theory of risk functions and comparisons for distributions of confidence. Neyman-Pearson type theorems leading to optimal confidence are developed and richly illustrated. Exact and optimal confidence distributions are the gold standard for inferred epistemic distributions. Confidence distributions and likelihood functions are intertwined, allowing prior distributions to be made part of the likelihood. Meta-analysis in likelihood terms is developed and taken beyond traditional methods, suiting it in particular to combining information across diverse data sources.
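As a canonical illustration (our example, not necessarily the book's notation): for an i.i.d. normal sample with known standard deviation $\sigma$ and sample mean $\bar{x}$, the confidence distribution for the mean $\mu$ is

```latex
% Confidence distribution for a normal mean with known sigma.
% At the true mu, C(mu) is uniformly distributed on (0,1), so every
% quantile interval of C is an exact confidence interval.
C(\mu) = \Phi\!\left(\frac{\sqrt{n}\,(\mu - \bar{x})}{\sigma}\right)
```

For instance, the interval from $C^{-1}(0.025)$ to $C^{-1}(0.975)$ is an exact 95% confidence interval for $\mu$.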
This book is a practical guide to theory-based empirical analysis in economics that walks the reader through the first steps of moving between economic theory and applied research. The book provides a hands-on introduction to some of the techniques that economists use for econometric estimation and shows how to convert a selection of standard and advanced estimators into MATLAB code. The book first provides a brief introduction to MATLAB and its syntax, before moving into microeconometric applications studied in undergraduate and graduate econometrics courses. Along with standard estimation methods such as the Method of Moments, Maximum Likelihood, and constrained optimisation, the book also includes a series of chapters examining more advanced research methods. These include discrete choice, discrete games, dynamic models on a finite and infinite horizon, and semi- and nonparametric methods. In closing, it discusses more advanced features that can be used to optimise use of MATLAB, including parallel computing. Each chapter is structured around a number of worked examples, designed for the reader to tackle as they move through the book. Each chapter ends with a series of readings, questions, and extensions, designed to help the reader adapt the examples in the book to fit their own research questions.
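The book's examples are in MATLAB; as a language-agnostic illustration of the kind of estimator it covers, here is a minimal Python sketch of maximum likelihood for a normal location-scale model. The data are simulated and all names are our own, not the book's.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(theta, y):
    """Negative log-likelihood of an i.i.d. N(mu, sigma^2) sample."""
    mu, log_sigma = theta            # optimise log(sigma) to keep sigma > 0
    sigma = np.exp(log_sigma)
    return 0.5 * np.sum(np.log(2 * np.pi * sigma**2) + ((y - mu) / sigma) ** 2)

rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.5, size=500)   # simulated sample

res = minimize(neg_log_likelihood, x0=np.array([0.0, 0.0]), args=(y,),
               method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sigma_hat)   # estimates close to the true 2.0 and 1.5
```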
Will history repeat itself, leaving Saudi Arabia to face another financial crisis due to drastic overspending and/or a dramatic drop in oil revenue? If the situation remains on its current trajectory, by 2030 the government debt built up by expenditures persistently outrunning revenues will be too overwhelming for the government to cope with.
The global demographic transition presents marked asymmetries as poor, emerging, and advanced countries are undergoing different stages of transition. Emerging countries are demographically younger than advanced economies. This youth is favorable to growth and generates a demographic dividend. However, the future of emerging economies will bring a decline in the working-age share and a rise in the older population, as is the case in today's developed world. Hence, developing countries must get rich before getting old, while advanced economies must try not to become poorer as they age. Asymmetric Demography and the Global Economy contributes to our understanding of why this demographic transition matters for domestic macroeconomics and global capital movements, and of how it affects asset accumulation, growth potential, the current account, and the economy's international investment position. This collaborative collection approaches these questions from the perspective of "systemically important" emerging countries, i.e., members of the G20, but considers both the national and the global sides of the problem.
This ambitious book looks 'behind the model' to reveal how economists use formal models to generate insights into the economy. Drawing on recent work in the philosophy of science and economic methodology, the book presents a novel framework for understanding the logic of economic modeling. It also reveals the ways in which economic models can mislead rather than illuminate. Importantly, the book goes beyond purely negative critique, proposing a concrete program of methodological reform to better equip economists to detect potential mismatches between their models and the targets of their inquiry. Ranging across economics, philosophy, and social science methods, and drawing on a variety of examples, including the recent financial crisis, Behind the Model will be of interest to anyone who has wondered how economics works - and why it sometimes fails so spectacularly.
Three leading experts have produced a landmark work based on a set of working papers published by the Center for Operations Research and Econometrics (CORE) at the Universite Catholique de Louvain in 1994 under the title 'Repeated Games', which holds almost mythic status among game theorists. Jean-Francois Mertens, Sylvain Sorin and Shmuel Zamir have significantly elevated the clarity and depth of presentation with many results presented at a level of generality that goes far beyond the original papers - many written by the authors themselves. Numerous results are new, and many classic results and examples are not to be found elsewhere. Most remain state of the art in the literature. This book is full of challenging and important problems that are set up as exercises, with detailed hints provided for their solutions. A new bibliography traces the development of the core concepts up to the present day.
Upon the backdrop of impressive progress made by the Indian economy during the last two decades after the large-scale economic reforms in the early 1990s, this book evaluates the performance of the economy on some income and non-income dimensions of development at the national, state and sectoral levels. It examines regional economic growth and inequality in income originating from agriculture, industry and services. In view of the importance of the agricultural sector, despite its declining share in gross domestic product, it evaluates the performance of agricultural production and the impact of agricultural reforms on spatial integration of food grain markets. It studies rural poverty, analyzing the trend in employment, the trickle-down process and the inclusiveness of growth in rural India. It also evaluates the impact of microfinance, as an instrument of financial inclusion, on the socio-economic conditions of rural households. Lastly, it examines the relative performance of fifteen major states of India in terms of education, health and human development. An important feature of the book is that it approaches these issues by rigorously applying advanced econometric methods, focusing primarily on their regional disparities during the post-reform period vis-a-vis the pre-reform period. It offers important results to guide policies for future development.
Many economic theories depend on the presence or absence of a unit root for their validity, and econometric and statistical theory undergoes considerable changes when unit roots are present. Knowledge of unit roots has thus become so important as to call for an extensive, compact, and nontechnical book on the subject. Resting on this motivation, this book introduces the literature on unit roots in a comprehensive manner to both empirical and theoretical researchers in economics and other areas. By providing a clear, complete, and critical discussion of the unit root literature, In Choi covers a wide range of topics, including uniform confidence interval construction, unit root tests allowing structural breaks, mildly explosive processes, exuberance testing, fractionally integrated processes, seasonal unit roots and panel unit root testing. Extensive, up to date, and readily accessible, this book is a comprehensive reference source on unit roots for both students and applied workers.
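As a concrete example of the kind of procedure surveyed in this literature, an augmented Dickey-Fuller unit root test can be run in a few lines. This sketch uses Python's statsmodels on simulated data rather than anything from the book:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(42)
e = rng.normal(size=500)
random_walk = np.cumsum(e)           # unit-root process: y_t = y_{t-1} + e_t
stationary = np.empty(500)           # AR(1) with |phi| < 1: no unit root
stationary[0] = e[0]
for t in range(1, 500):
    stationary[t] = 0.5 * stationary[t - 1] + e[t]

for name, y in [("random walk", random_walk), ("AR(0.5)", stationary)]:
    stat, pvalue, *_ = adfuller(y, regression="c", autolag="AIC")
    # Null hypothesis: the series has a unit root; a small p-value rejects it.
    print(f"{name}: ADF stat = {stat:.2f}, p-value = {pvalue:.3f}")
```

On this simulated data the test typically fails to reject the unit root for the random walk and rejects it for the stationary AR(1) series.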
Many economic and social surveys are designed as panel studies, which provide important data for describing social changes and testing causal relations between social phenomena. This textbook shows how to manage, describe, and model these kinds of data. It presents models for continuous and categorical dependent variables, focusing either on the level of these variables at different points in time or on their change over time. It covers fixed and random effects models, models for change scores, and event history models. All statistical methods are explained in an application-centered style using research examples from scholarly journals, which can be replicated by the reader through data provided on the accompanying website. As all models are compared to each other, the book provides valuable assistance with choosing the right model in applied research. The textbook is directed at master's and doctoral students as well as applied researchers in the social sciences, psychology, business administration and economics. Readers should be familiar with linear regression and have a good understanding of ordinary least squares estimation.
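For instance, the fixed effects (within) estimator covered by such textbooks can be computed by demeaning each variable within units and running OLS on the transformed data. A minimal Python sketch with simulated panel data (all variable names are our hypothetical illustrations):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n_units, n_periods = 100, 5
unit = np.repeat(np.arange(n_units), n_periods)
alpha = rng.normal(size=n_units)[unit]              # unit-specific intercepts
x = alpha + rng.normal(size=n_units * n_periods)    # regressor correlated with alpha
y = 2.0 * x + alpha + rng.normal(size=n_units * n_periods)

df = pd.DataFrame({"unit": unit, "x": x, "y": y})
# Within transformation: subtract unit means to sweep out the fixed effects.
demeaned = df[["x", "y"]] - df.groupby("unit")[["x", "y"]].transform("mean")
beta_fe = (demeaned["x"] @ demeaned["y"]) / (demeaned["x"] @ demeaned["x"])
print(beta_fe)   # close to the true slope 2.0; pooled OLS would be biased here
```

Because the regressor is correlated with the unit intercepts by construction, pooled OLS on the raw data would be inconsistent, which is exactly the situation the fixed effects model addresses.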
The book describes formal models of reasoning that are aimed at capturing the way that economic agents, and decision makers in general, think about their environment and make predictions based on their past experience. The focus is on analogies (case-based reasoning) and general theories (rule-based reasoning), on the interaction between them, and on their interaction with Bayesian reasoning. A unified approach allows one to study the dynamics of inductive reasoning in terms of the mode of reasoning that is used to generate predictions.
This book gives an introduction to R to build up graphing, simulating and computing skills that enable one to see theoretical and statistical models in economics in a unified way. The great advantage of R is that it is free, extremely flexible and extensible. The book addresses the specific needs of economists, and helps them move up the R learning curve. It covers mathematical topics such as graphing the Cobb-Douglas function and using R to study the Solow growth model, in addition to statistical topics, from drawing statistical graphs to doing linear and logistic regression. It uses data that can be downloaded from the internet, and which is also available in different R packages. With some treatment of basic econometrics, the book discusses quantitative economics broadly and simply, looking at models in the light of data. Students of economics or economists keen to learn how to use R will find this book very useful.
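The book's code is in R; as an analogous illustration in Python, here is a minimal sketch of the textbook Solow growth model it mentions, iterating the per-capita capital accumulation rule k_{t+1} = s k_t^a + (1 - d) k_t. The parameter values are hypothetical.

```python
import numpy as np

def solow_path(k0=1.0, s=0.3, alpha=0.33, delta=0.1, periods=100):
    """Iterate per-capita capital: k_{t+1} = s*k_t**alpha + (1-delta)*k_t."""
    k = np.empty(periods)
    k[0] = k0
    for t in range(1, periods):
        k[t] = s * k[t - 1] ** alpha + (1 - delta) * k[t - 1]
    return k

k = solow_path()
# Closed-form steady state: delta*k = s*k**alpha  =>  k* = (s/delta)**(1/(1-alpha)).
k_star = (0.3 / 0.1) ** (1 / (1 - 0.33))
print(k[-1], k_star)   # the simulated path converges to k_star (about 5.1)
```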
The estimation and validation of the Basel II risk parameters PD (default probability), LGD (loss given default), and EAD (exposure at default) is an important problem in banking practice. These parameters are used on the one hand as inputs to credit portfolio models and loan pricing frameworks, and on the other to compute regulatory capital according to the new Basel rules. This book covers the state of the art in designing and validating rating systems and default probability estimations. Furthermore, it presents techniques to estimate LGD and EAD and includes a chapter on stress testing of the Basel II risk parameters. The second edition is extended by three chapters explaining how the Basel II risk parameters can be used for building a framework for risk-adjusted pricing and risk management of loans.
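The three parameters combine in the standard expected loss identity EL = PD x LGD x EAD; a tiny Python illustration with hypothetical loan figures:

```python
# Expected loss for a single exposure: EL = PD * LGD * EAD.
pd_ = 0.02          # probability of default over the horizon (2%)
lgd = 0.45          # loss given default: share of the exposure lost (45%)
ead = 1_000_000.0   # exposure at default, in currency units

expected_loss = pd_ * lgd * ead
print(expected_loss)   # 9000.0: the actuarially expected credit loss
```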
To what extent should anybody who has to make model forecasts generated from detailed data analysis adjust their forecasts based on their own intuition? In this book, Philip Hans Franses, one of Europe's leading econometricians, presents the notion that many publicly available forecasts have experienced an 'expert's touch', and questions whether this type of intervention is useful and whether a lighter adjustment would be more beneficial. Covering an extensive research area, this accessible book brings together current theoretical insights and new empirical results to examine expert adjustment from an econometric perspective. The author's analysis is based on a range of real forecasts and the datasets upon which the forecasters relied. The various motivations behind experts' modifications are considered, and guidelines for creating more useful and reliable adjusted forecasts are suggested. This book will appeal to academics and practitioners with an interest in forecasting methodology.
Analyzing Event Statistics in Corporate Finance provides new alternative methodologies to increase accuracy when performing statistical tests for event studies within corporate finance. In contrast to conventional surveys or literature reviews, Jeng focuses on various methodological defects or deficiencies that lead to inaccurate empirical results, which ultimately produce bad corporate policies. This work discusses the issues of data collection and structure, the recursive smoothing for systematic components in excess returns, the choices of event windows, different time horizons for the events, and the consequences of applications of different methodologies. To improve event studies in corporate finance, and given that changes in parameters of financial time series are common knowledge, a new alternative methodology is developed that extends the conventional analysis to more robust arguments.
The productivity of a business exerts an important influence on its financial performance. A similar influence exists for industries and economies: those with superior productivity performance thrive at the expense of others. Productivity performance helps explain the growth and demise of businesses and the relative prosperity of nations. Productivity Accounting: The Economics of Business Performance offers an in-depth analysis of variation in business performance, providing the reader with an analytical framework within which to account for this variation and its causes and consequences. The primary focus is the individual business, and the principal consequence of business productivity performance is business financial performance. Alternative measures of financial performance are considered, including profit, profitability, cost, unit cost, and return on assets. Combining analytical rigor with empirical illustrations, the analysis draws on wide-ranging literatures, both historical and current, from business and economics, and explains how businesses create value and distribute it.
This book presents the reader with new operators and matrices that arise in the area of matrix calculus. The properties of these mathematical concepts are investigated and linked with zero-one matrices such as the commutation matrix. Elimination and duplication matrices are revisited and partitioned into submatrices. Studying the properties of these submatrices facilitates achieving new results for the original matrices themselves. Different concepts of matrix derivatives are presented and transformation principles linking these concepts are obtained. One of these concepts is used to derive new matrix calculus results, some involving the new operators and others the derivatives of the operators themselves. The last chapter contains applications of matrix calculus, including optimization, differentiation of log-likelihood functions, iterative interpretations of maximum likelihood estimators, and a Lagrangian multiplier test for endogeneity.
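As a small illustration of the zero-one matrices discussed here, the commutation matrix K_{mn} is the permutation matrix satisfying K_{mn} vec(A) = vec(A') for any m x n matrix A, where vec() stacks columns. A minimal Python construction (the function name is our hypothetical choice, not the book's):

```python
import numpy as np

def commutation_matrix(m, n):
    """Build K_{mn}, the permutation matrix with K @ vec(A) = vec(A.T).

    vec() stacks columns, so A[i, j] sits at position i + j*m of vec(A)
    and at position j + i*n of vec(A.T).
    """
    K = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(n):
            K[j + i * n, i + j * m] = 1.0
    return K

A = np.arange(6.0).reshape(2, 3)
K = commutation_matrix(2, 3)
vec = lambda M: M.reshape(-1, order="F")    # column-stacking vec operator
print(np.allclose(K @ vec(A), vec(A.T)))    # True: K commutes vec and transpose
```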
The "Theory of Macrojustice", introduced by S.-C. Kolm, is a stimulating contribution to the debate on the macroeconomic income distribution. The solution called "Equal Labour Income Equalisation" (ELIE) is the result of a three stages construction: collective agreement on the scheme of labour income redistribution, collective agreement on the degree of equalisation to be chosen in that framework, individual freedom to exploit his--her personal productive capicities (the source of labour income and the sole basis for taxation). This book is organised as a discussion around four complementary themes: philosophical aspects of macrojustice, economic analysis of macrojustice, combination of ELIE with other targeted tranfers, econometric evaluations of ELIE.
The recent financial crisis has heightened the need for appropriate methodologies for managing and monitoring complex risks in financial markets. The measurement, management, and regulation of risks in portfolios composed of credits, credit derivatives, or life insurance contracts is difficult because of the nonlinearities of risk models, dependencies between individual risks, and the several thousands of contracts in large portfolios. The granularity principle was introduced in the Basel regulations for credit risk to solve these difficulties in computing capital reserves. In this book, authors Patrick Gagliardini and Christian Gourieroux provide the first comprehensive overview of the granularity theory and illustrate its usefulness for a variety of problems related to risk analysis, statistical estimation, and derivative pricing in finance and insurance. They show how the granularity principle leads to analytical formulas for risk analysis that are simple to implement and accurate even when the portfolio size is large.
The Handbook is written for academics, researchers, practitioners and advanced graduate students. It has been designed to be read both by those new to or starting out in the field of spatial analysis and by those who are already familiar with the field. The chapters have been written in such a way that readers who are new to the field will gain an important overview and insight. At the same time, readers who are already practitioners in the field will gain through the advanced and/or updated tools, new materials and state-of-the-art developments included. This volume provides an accounting of the diversity of current and emergent approaches that is not available elsewhere, despite the many excellent journals and textbooks that exist. Most of the chapters are original; a few are reprints from the Journal of Geographical Systems, Geographical Analysis, The Review of Regional Studies and Letters in Spatial and Resource Sciences. We let our contributors develop, from their particular perspective and insights, their own strategies for mapping the part of the terrain for which they were responsible. As the chapters were submitted, we became the first consumers of the project we had initiated. We gained from the depth, breadth and distinctiveness of our contributors' insights and, in particular, the presence of links between them.
You may like...
- Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover): R2,970 / Discovery Miles 29 700
- Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover): R3,286 / Discovery Miles 32 860
- Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover): R3,567 / Discovery Miles 35 670
- Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover): R4,258 / Discovery Miles 42 580
- Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover): R2,160 / Discovery Miles 21 600
- Handbook of Experimental Game Theory by C. M. Capra, Rachel T. A. Croson, … (Hardcover): R7,224 / Discovery Miles 72 240