Originally published in 1979. This book addresses three questions regarding uncertainty in economic life: how do we define uncertainty and use the concept meaningfully to draw conclusions; how can the level of uncertainty associated with a particular variable of economic interest be measured; and does experience support the view that uncertainty really matters? It develops a theory of the effect of price uncertainty on production and trade, uses a graphical approach to the effects of a mean-preserving spread to derive rules for ordering distributions, and finishes with an econometric analysis of the effect of Brazil's adoption of a crawling peg in reducing real exchange rate uncertainty. This is an important early study of the significance of uncertainty.
This volume investigates the accuracy and dynamic performance of a high-frequency forecast model for the Japanese and United States economies based on the Current Quarter Model (CQM) or High Frequency Model (HFM) developed by the late Professor Emeritus Lawrence R. Klein. It also presents a survey of recent developments in high-frequency forecasts and gives an example application of the CQM model in forecasting Gross Regional Products (GRPs).
This book presents the last quarterly econometric model of the United States economy produced by Professor Lawrence R. Klein and his group at the University of Pennsylvania. It is the final model that Klein and his students left after some 50 years of cumulative effort in modelling the US economy up to around 2000. Widely known as the WEFA Econometric Model Mark 10, it is the culmination of Professor Klein's research spanning more than 70 years, and will interest not only his former students and colleagues but also younger students who have heard much of Klein models yet have not seen the latest one in complete, printed form.
Originally published in 1984. This book examines two important dimensions of efficiency in the foreign exchange market using econometric techniques. It responds to the trend in macroeconomics towards re-examining theories of exchange rate determination following the erratic behaviour of exchange rates in the late 1970s. In particular, the text looks at the relation between spot and forward exchange rates and the term structure of the forward premium, both of which require a joint test of market efficiency and the equilibrium model. The approaches used are the regression of spot rates on lagged forward rates and an explicit time series analysis of the spot and forward rates, using data from Canada, the United Kingdom, the Netherlands, Switzerland and Germany.
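The first of the two approaches mentioned above, regressing the spot rate on the lagged forward rate, can be sketched in a few lines. The following is a minimal illustration with simulated data (none of it from the book's country samples): under market efficiency with risk neutrality, the forward rate equals the expected future spot rate, so the slope in this regression should be close to one.

```python
import numpy as np

# Simulate a log spot exchange rate as a random walk (illustrative only).
rng = np.random.default_rng(1)
T = 2000
s = np.cumsum(rng.normal(0.0, 0.01, T))

# Under efficiency and risk neutrality, f_{t-1} = E[s_t | info at t-1] = s_{t-1}.
f_lag = s[:-1]
s_next = s[1:]

# OLS regression of the spot rate on a constant and the lagged forward rate.
X = np.column_stack([np.ones(T - 1), f_lag])
alpha, beta = np.linalg.lstsq(X, s_next, rcond=None)[0]
```

In real data the joint-hypothesis problem noted in the blurb applies: a slope far from one may reflect a risk premium rather than inefficiency, which is why the book pairs this regression with an explicit time series analysis.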
Originally published in 1976, with a second edition in 1984. This book established itself as the first genuinely introductory text on econometric methods, assuming no formal background on the part of the reader. The second edition maintains this distinctive feature. Fundamental concepts are carefully explained and, where possible, techniques are developed by verbal reasoning rather than formal proof. It provides all the material for a basic course, and is also ideal for a student working alone. Very little knowledge of mathematics and statistics is assumed, and the logic of statistical method is carefully stated. There are numerous exercises, designed to help the student assess individual progress. Methods are described with computer solutions in mind, and the author shows how a variety of different calculations can be performed with relatively simple programs. This new edition also includes much new material: statistical tables are now included and their use carefully explained.
Originally published in 1984. Since the logic underlying economic theory can only be grasped fully by a thorough understanding of the mathematics, this book will be invaluable to economists wishing to understand vast areas of important research. It provides a basic introduction to the fundamental mathematical ideas of topology and calculus, and uses these to present modern singularity theory and recent results on the generic existence of isolated price equilibria in exchange economies.
Herbert Scarf is a highly esteemed distinguished American economist. He is internationally famous for his early epoch-making work on optimal inventory policies and his highly influential study with Andrew Clark on optimal policies for a multi-echelon inventory problem, which initiated the important and flourishing field of supply chain management. Equally, he has gained world recognition for his classic study on the stability of the Walrasian price adjustment processes and his fundamental analysis on the relationship between the core and the set of competitive equilibria (the so-called Edgeworth conjecture). Further achievements include his remarkable sufficient condition for the existence of a core in non-transferable utility games and general exchange economies, his seminal paper with Lloyd Shapley on housing markets, and his pioneering study on increasing returns and models of production in the presence of indivisibilities. All in all, however, the name of Scarf is always remembered as a synonym for the computation of economic equilibria and fixed points. In the early 1960s he invented a path-breaking technique for computing equilibrium prices. This work has generated a major research field in economics termed Applied General Equilibrium Analysis and a corresponding area in operations research known as Simplicial Fixed Point Methods. This book comprises all his research articles and consists of four volumes. This volume collects Herbert Scarf's papers in the area of Operations Research and Management.
Both theoretical and empirical aspects of single- and multi-winner voting procedures are presented in this collection of papers. Starting from a discussion of the underlying principles of democratic representation, the volume includes a description of a great variety of voting procedures. It lists and illustrates their susceptibility to the main voting paradoxes, assesses (under various models of voters' preferences) the probability of paradoxical outcomes, and discusses the relevance of the theoretical results to the choice of voting system.
The productivity of a business exerts an important influence on its financial performance. A similar influence exists for industries and economies: those with superior productivity performance thrive at the expense of others. Productivity performance helps explain the growth and demise of businesses and the relative prosperity of nations. Productivity Accounting: The Economics of Business Performance offers an in-depth analysis of variation in business performance, providing the reader with an analytical framework within which to account for this variation and its causes and consequences. The primary focus is the individual business, and the principal consequence of business productivity performance is business financial performance. Alternative measures of financial performance are considered, including profit, profitability, cost, unit cost, and return on assets. Combining analytical rigor with empirical illustrations, the analysis draws on wide-ranging literatures, both historical and current, from business and economics, and explains how businesses create value and distribute it.
Financial econometrics is one of the greatest on-going success stories of recent decades, as it has become one of the most active areas of research in econometrics. In this book, Michael Clements presents a clear and logical explanation of the key concepts and ideas of forecasts of economic and financial variables. He shows that forecasts of the single most likely outcome of an economic and financial variable are of limited value. Forecasts that provide more information on the expected likely ranges of outcomes are more relevant. This book provides a comprehensive treatment of the evaluation of different types of forecasts and draws out the parallels between the different approaches. It describes the methods of evaluating these more complex forecasts which provide a fuller description of the range of possible future outcomes.
Why should we be interested in macroeconomic survey expectations? This important book offers an in-depth treatment of this question from a point of view not covered in existing works on time-series econometrics and forecasting. Clements presents the nature of survey data, addresses some of the difficulties posed by the way in which survey expectations are elicited and considers the evaluation of point predictions and probability distributions. He outlines how, from a behavioural perspective, surveys offer insight into how economic agents form their expectations.
The interaction between mathematicians, statisticians and econometricians working in actuarial sciences and finance is producing numerous meaningful scientific results. This volume introduces new ideas, in the form of four-page papers, presented at the international conference Mathematical and Statistical Methods for Actuarial Sciences and Finance (MAF), held at Universidad Carlos III de Madrid (Spain), 4th-6th April 2018. The book covers a wide variety of subjects in actuarial science and financial fields, all discussed in the context of the cooperation between the three quantitative approaches. The topics include: actuarial models; analysis of high frequency financial data; behavioural finance; carbon and green finance; credit risk methods and models; dynamic optimization in finance; financial econometrics; forecasting of dynamical actuarial and financial phenomena; fund performance evaluation; insurance portfolio risk analysis; interest rate models; longevity risk; machine learning and soft-computing in finance; management in insurance business; models and methods for financial time series analysis, models for financial derivatives; multivariate techniques for financial markets analysis; optimization in insurance; pricing; probability in actuarial sciences, insurance and finance; real world finance; risk management; solvency analysis; sovereign risk; static and dynamic portfolio selection and management; trading systems. This book is a valuable resource for academics, PhD students, practitioners, professionals and researchers, and is also of interest to other readers with quantitative background knowledge.
The book provides an up-to-date survey of statistical and econometric techniques for the analysis of count data, with a focus on conditional distribution models. The book starts with a presentation of the benchmark Poisson regression model. Alternative models address unobserved heterogeneity, state dependence, selectivity, endogeneity, underreporting, and clustered sampling. Testing and estimation are discussed. Finally, applications in various fields are reviewed.
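The benchmark Poisson regression model that opens the book can be sketched compactly: counts are modelled as Poisson with a log link, log(mu_i) = x_i'beta, and the maximum likelihood estimate is found by Newton-Raphson. The sketch below uses simulated data and illustrative names of my own, not material from the book.

```python
import numpy as np

def fit_poisson(X, y, iters=25):
    """Fit a Poisson regression with log link by Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)            # conditional mean E[y | x]
        grad = X.T @ (y - mu)            # score vector of the log-likelihood
        hess = X.T @ (X * mu[:, None])   # information matrix (negative Hessian)
        beta = beta + np.linalg.solve(hess, grad)
    return beta

# Simulated count data with known coefficients.
rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
true_beta = np.array([0.5, 0.3])
y = rng.poisson(np.exp(X @ true_beta))

beta_hat = fit_poisson(X, y)
```

The alternative models the blurb lists (unobserved heterogeneity, underreporting, and so on) relax the restrictive Poisson assumption that the conditional variance equals the conditional mean.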
This book employs a computable general equilibrium (CGE) model - a widely used economic model which uses actual data to provide economic analysis and policy assessment - and applies it to economic data on Singapore's tourism industry. The authors set out to demonstrate how a novice modeller can acquire the necessary skills and knowledge to successfully apply general equilibrium models to tourism studies. The chapters explain how to build a computable general equilibrium model for tourism, how to conduct simulation and, most importantly, how to analyse modelling results. This applied study acts as a modelling book at both introductory and intermediate levels, specifically targeting students and researchers who are interested in and wish to learn computable general equilibrium modelling. The authors offer insightful analysis of Singapore's tourism industry and provide both students and researchers with a guide on how to apply general equilibrium models to actual economic data and draw accurate conclusions.
This book covers diverse themes, including institutions and efficiency, choice and values, law and economics, development and policy, and social and economic measurement. Written in honour of the distinguished economist Satish K. Jain, this compilation of essays should appeal not only to students and researchers of economic theory but also to those interested in the design and evaluation of institutions and policy.
This volume deals with a range of contemporary issues in Indian and other world economies, with a focus on economic theory and policy and their longstanding implications. It analyses and predicts the mechanisms that can come into play to determine the function of institutions and the impact of public policy.
This book explores new topics in modern research on empirical corporate finance and applied accounting, especially the econometric analysis of microdata. Dubbed "financial microeconometrics" by the author, this concept unites both methodological and applied approaches. The book examines how quantitative methods can be applied in corporate finance and accounting research in order to predict companies getting into financial distress. Presented in a clear and straightforward manner, it also suggests methods for linking corporate governance to financial performance, and discusses what the determinants of accounting disclosures are. Exploring these questions by way of numerous practical examples, this book is intended for researchers, practitioners and students who are not yet familiar with the variety of approaches available for data analysis and microeconometrics. "This book on financial microeconometrics is an excellent starting point for research in corporate finance and accounting. In my view, the text is positioned between a narrative and a scientific treatise. It is based on a vast amount of literature but is not overloaded with formulae. My appreciation of financial microeconometrics has very much increased. The book is well organized and properly written. I enjoyed reading it." (Wolfgang Marty, Senior Investment Strategist, AgaNola AG)
Numerical analysis is the study of computation and its accuracy, stability and often its implementation on a computer. This book focuses on the principles of numerical analysis and is intended to equip those readers who use statistics to craft their own software and to understand the advantages and disadvantages of different numerical methods.
Herbert Scarf is a highly esteemed distinguished American economist. He is internationally famous for his early epoch-making work on optimal inventory policies and his highly influential study with Andrew Clark on optimal policies for a multi-echelon inventory problem, which initiated the important and flourishing field of supply chain management. Equally, he has gained world recognition for his classic study on the stability of the Walrasian price adjustment processes and his fundamental analysis on the relationship between the core and the set of competitive equilibria (the so-called Edgeworth conjecture). Further achievements include his remarkable sufficient condition for the existence of a core in non-transferable utility games and general exchange economies, his seminal paper with Lloyd Shapley on housing markets, and his pioneering study on increasing returns and models of production in the presence of indivisibilities. All in all, however, the name of Scarf is always remembered as a synonym for the computation of economic equilibria and fixed points. In the early 1960s he invented a path-breaking technique for computing equilibrium prices. This work has generated a major research field in economics termed Applied General Equilibrium Analysis and a corresponding area in operations research known as Simplicial Fixed Point Methods. This book comprises all his research articles and consists of four volumes. This volume collects Herbert Scarf's papers in the area of Economics and Game Theory.
Econometric Model Specification reviews and extends the author's papers on consistent model specification testing and semi-nonparametric modeling and inference. This book consists of two parts. The first part discusses consistent tests of functional form of regression and conditional distribution models, including a consistent test of the martingale difference hypothesis for time series regression errors. In the second part, semi-nonparametric modeling and inference for duration and auction models are considered, as well as a general theory of the consistency and asymptotic normality of semi-nonparametric sieve maximum likelihood estimators. Moreover, this volume also contains addendums and appendices that provide detailed proofs and extensions of all the results. It is uniquely self-contained and is a useful source for students and researchers interested in model specification issues.
The objective of this book is the discussion and the practical illustration of techniques used in applied macroeconometrics. There are currently three competing approaches: the LSE (London School of Economics) approach, the VAR approach, and the intertemporal optimization/Real Business Cycle approach. This book discusses and illustrates the empirical research strategy of these three alternative approaches, pairing them with extensive discussions and replications of the relevant empirical work. Common benchmarks are used to evaluate the alternative approaches.
Economic Models for Industrial Organization focuses on the specification and estimation of econometric models for research in industrial organization. In recent decades, empirical work in industrial organization has moved towards dynamic and equilibrium models, involving econometric methods which have features distinct from those used in other areas of applied economics. These lecture notes, aimed for a first or second-year PhD course, motivate and explain these econometric methods, starting from simple models and building to models with the complexity observed in typical research papers. The covered topics include discrete-choice demand analysis, models of dynamic behavior and dynamic games, multiple equilibria in entry games and partial identification, and auction models.
This book presents the methodology and applications of Data Envelopment Analysis (DEA) in measuring productivity, efficiency and effectiveness in financial services firms such as banks, bank branches, stock markets, pension funds, mutual funds, insurance firms and credit unions, as well as in applications such as risk tolerance and corporate failure prediction. Financial service DEA research includes banking; insurance businesses; hedge, pension and mutual funds; and credit unions. Significant business transactions among financial service organizations such as bank mergers and acquisitions and valuation of IPOs have also been the focus of DEA research. The book looks at the range of DEA uses for financial services by presenting prior studies, examining the current capabilities reflected in the most recent research, and projecting future new uses of DEA in finance related applications.
'Overall, the book is highly technical, including full mathematical proofs of the results stated. Potential readers are post-graduate students or researchers in Quantitative Risk Management willing to have a manual with the state-of-the-art on portfolio diversification and risk aggregation with heavy tails, including the fundamental theorems as well as collateral (but most useful) results on majorization and copula theory.' (Quantitative Finance) This book offers a unified approach to the study of crises, large fluctuations, dependence and contagion effects in economics and finance. It covers important topics in statistical modeling and estimation, which combine the notions of copulas and heavy tails - two particularly valuable tools of today's research in economics, finance, econometrics and other fields - in order to provide a new way of thinking about such vital problems as diversification of risk and propagation of crises through financial markets due to contagion phenomena, among others. The aim is to arm today's economists with a toolbox suited for analyzing multivariate data with many outliers and with arbitrary dependence patterns. The methods and topics discussed and used in the book include, in particular, majorization theory, heavy-tailed distributions and copula functions - all applied to study robustness of economic, financial and statistical models, and estimation methods to heavy tails and dependence.
The book is a collection of essays in honour of Clive Granger. The chapters are by some of the world's leading econometricians, all of whom have collaborated with or studied with (or both) Clive Granger. Central themes of Granger's work are reflected in the book with attention to tests for unit roots and cointegration, tests of misspecification, forecasting models and forecast evaluation, non-linear and non-parametric econometric techniques, and overall, a careful blend of practical empirical work and strong theory. The book shows the scope of Granger's research and the range of the profession that has been influenced by his work.