Herbert Scarf is a highly esteemed and distinguished American economist. He is internationally famous for his early epoch-making work on optimal inventory policies and his highly influential study with Andrew Clark on optimal policies for a multi-echelon inventory problem, which initiated the important and flourishing field of supply chain management. Equally, he has gained world recognition for his classic study on the stability of Walrasian price adjustment processes and his fundamental analysis of the relationship between the core and the set of competitive equilibria (the so-called Edgeworth conjecture). Further achievements include his remarkable sufficient condition for the existence of a core in non-transferable utility games and general exchange economies, his seminal paper with Lloyd Shapley on housing markets, and his pioneering study on increasing returns and models of production in the presence of indivisibilities. Above all, however, the name Scarf is remembered as a synonym for the computation of economic equilibria and fixed points: in the early 1960s he invented a path-breaking technique for computing equilibrium prices. This work has generated a major research field in economics termed Applied General Equilibrium Analysis and a corresponding area in operations research known as Simplicial Fixed Point Methods. This book comprises all his research articles and consists of four volumes. This volume collects Herbert Scarf's papers in the area of Production in Indivisibilities and the Theories of Large Firms.
Why should we be interested in macroeconomic survey expectations? This important book offers an in-depth treatment of this question from a point of view not covered in existing works on time-series econometrics and forecasting. Clements presents the nature of survey data, addresses some of the difficulties posed by the way in which survey expectations are elicited and considers the evaluation of point predictions and probability distributions. He outlines how, from a behavioural perspective, surveys offer insight into how economic agents form their expectations.
Originally published in 1976, with a second edition in 1984, this book established itself as the first genuinely introductory text on econometric methods, assuming no formal background on the part of the reader. The second edition maintains this distinctive feature. Fundamental concepts are carefully explained and, where possible, techniques are developed by verbal reasoning rather than formal proof. It provides all the material for a basic course and is also ideal for a student working alone. Very little knowledge of maths and statistics is assumed, and the logic of statistical method is carefully stated. There are numerous exercises, designed to help the student assess individual progress. Methods are described with computer solutions in mind, and the author shows how a variety of different calculations can be performed with relatively simple programs. This new edition also includes much new material: statistical tables are now included and their use carefully explained.
Originally published in 1970, with a second edition in 1989. Empirical Bayes methods use some of the apparatus of the pure Bayes approach, but an actual prior distribution is assumed to generate the data sequence, and this prior can be estimated from the data, producing empirical Bayes estimates or decision rules. In this second edition, details are provided of the derivation and the performance of empirical Bayes rules for a variety of special models. Attention is given to the problem of assessing the goodness of an empirical Bayes estimator for a given set of prior data. Chapters also focus on alternatives to the empirical Bayes approach and on actual applications of empirical Bayes methods.
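As a rough illustration of the idea described above (estimate the prior from the data, then use it to form estimates), here is a minimal beta-binomial sketch in Python. It is not taken from the book; the simulated rates, sample sizes, and the method-of-moments fit are all assumptions made purely for the example.

```python
# Minimal empirical-Bayes sketch (illustrative only, not from the book).
# Many units each record k successes out of n trials; we posit a Beta(a, b)
# prior on the true rates, estimate (a, b) from the observed proportions by
# the method of moments, and shrink each raw proportion toward the prior mean.
import numpy as np

rng = np.random.default_rng(0)
true_rates = rng.beta(4.0, 16.0, size=500)   # unknown in practice
n = 30                                       # trials per unit
k = rng.binomial(n, true_rates)              # observed successes

p_hat = k / n                                # raw (maximum-likelihood) rates
m, v = p_hat.mean(), p_hat.var()

# Method-of-moments fit of Beta(a, b); the binomial sampling noise in p_hat
# is ignored here to keep the sketch short, so the prior variance is overstated.
common = m * (1 - m) / v - 1
a, b = m * common, (1 - m) * common

# Empirical Bayes (posterior mean) estimate: shrink toward the prior mean.
eb_rate = (k + a) / (n + a + b)

print("raw MSE:", np.mean((p_hat - true_rates) ** 2))
print("EB  MSE:", np.mean((eb_rate - true_rates) ** 2))
```

In this toy setup the shrunken estimates typically have lower mean squared error than the raw proportions, which is the practical payoff of the empirical Bayes approach.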
Originally published in 1929. This balanced combination of fieldwork, statistical measurement, and realistic applications shows a synthesis of economics and political science, conceiving of an organic relationship between the two sciences that involves functional analysis, institutional interpretation, and a more workmanlike approach to questions of organization such as the division of labour and the control of industry. The treatise applies the test of fact, through statistical analysis, to economic and political theories, arguing for a quantitative and institutional approach to solving social and industrial problems. It constructs a framework of concepts, combining both economic and political theory, to produce a systematic and original statement in general terms of the principles and methods for statistical fieldwork. The separation into Parts allows selective reading on the methods of statistical measurement; the principles and fallacies of applying these measures to economic and political fields; and the resultant construction of a statistical economics and politics. Basic statistical concepts are described for application, each method of statistical measurement is illustrated with instances relevant to the economic and political theory discussed, and a statistical glossary is included.
Environmental risk directly affects the financial stability of banks, since they bear the financial consequences of the loss of liquidity of the entities to which they lend, as well as of the financial penalties imposed for failure to comply with regulations and for actions harmful to the natural environment. This book explores the impact of environmental risk on the banking sector and analyzes strategies to mitigate this risk, with a special emphasis on the role of modelling. It argues that environmental risk modelling allows banks to estimate the patterns and consequences of environmental risk on their operations, and to take measures within the context of asset and liability management to minimize the likelihood of losses. An important role here is played by the environmental risk modelling methodology, as well as by the software and the mathematical and econometric models used. The book examines banks' responses to macroprudential risk, particularly from the point of view of their adaptation strategies; the mechanisms by which this risk spreads; risk management and modelling; and sustainable business models. It introduces the basic concepts, definitions, and regulations concerning this type of risk, within the context of its influence on the banking industry. Based primarily on a combined quantitative and qualitative approach, the book proposes a new methodology for environmental risk management and modelling in the banking sector. As such, it will appeal to researchers, scholars, and students of environmental economics, finance and banking, sociology, law, and political science.
Informed decision-making rests on three important elements: intuition, trust, and analytics. Intuition is based on experiential learning, and recent research has shown that those who rely on their "gut feelings" may do better than those who do not. Analytics, however, are also important for informing decisions in a data-driven environment. The third element, trust, is critical for knowledge sharing to take place. Together, intuition, analytics, and trust make a powerful combination for decision-making. This book gathers leading researchers who explore the role of these three elements in the process of decision-making.
Originally published in 1985. Mathematical methods and models for understanding the processes of economic dynamics and prediction were refined considerably over the period before this book was written. The field had grown, and many of the techniques involved had become extremely complicated. Areas of particular interest include optimal control, non-linear models, game-theoretic approaches, demand analysis and time-series forecasting. This book presents a critical appraisal of developments and identifies potentially productive new directions for research. It synthesises work from mathematics, statistics and economics and includes a thorough analysis of the relationship between system understanding and predictability.
Originally published in 1960 and 1966. This is an elementary introduction to the sources of economic statistics and their uses in answering economic questions. No mathematical knowledge is assumed, and no mathematical symbols are used. By asking and answering a number of typical questions of applied economics, the book shows what the most useful statistics are, where they are found, and how they are to be interpreted and presented. The reader is introduced to the major British, European and American official sources, to the social accounts, to index numbers and averaging, and to elementary aids to inspection such as moving averages and scatter diagrams.
Global econometric models have a long history. From the early 1970s to the present, as modeling techniques have advanced, different modeling paradigms have emerged and been used to support national and international policy making. One purpose of this volume, based on a conference held in recognition of the seminal impact of Lawrence R. Klein, winner of the Nobel Prize in Economic Sciences, whose pioneering work spawned the field of international econometric modeling, is to survey these developments from today's perspective. A second objective of the volume is to shed light on the wide range of attempts to broaden the scope of modeling on an international scale. Beyond new developments in the traditional areas of trade and financial flows, the volume reviews new approaches to the modeling of linkages between macroeconomic activity and individual economic units, new research on the analysis of trends in income distribution and economic well-being on a global scale, and innovative ideas about modeling the interactions between economic development and the environment. With its expanded treatment of economic linkages, this volume makes an important contribution to the evolving literature on global econometric models.
The rapidly increasing importance of China, India, Indonesia, Japan, South Korea and Taiwan, both in Asia and in the world economy, represents a trend that is set to continue into the 21st century. This book provides an authoritative assessment of the 20th-century performance of these countries, and in particular of the factors contributing to the acceleration of Asian growth in the latter part of the century. The contributors look at Asia within a global perspective, and detailed comparisons are drawn with Australia and the USA. Contributions from leading experts offer a comprehensive review of the procedures necessary to establish valid international comparisons for countries with very different economic histories and levels of development. These include methods of growth performance measurement and techniques of growth accounting. The Asian Economies in the Twentieth Century will be an indispensable new tool for policy analysts, international agencies and academic researchers.
- Up-to-date with cutting-edge topics
- Suitable for professional quants and as a library reference for students of finance and financial mathematics
This book covers diverse themes, including institutions and efficiency, choice and values, law and economics, development and policy, and social and economic measurement. Written in honour of the distinguished economist Satish K. Jain, this compilation of essays should appeal not only to students and researchers of economic theory but also to those interested in the design and evaluation of institutions and policy.
This volume deals with a range of contemporary issues in Indian and other world economies, with a focus on economic theory and policy and their longstanding implications. It analyses and predicts the mechanisms that can come into play to determine the function of institutions and the impact of public policy.
This important three-volume set is a collection of Edgeworth's published writings in the areas of statistics and probability. There is a newly emerging interest in probability theory as a basis for economic thought, and this collection makes the writings of Edgeworth more accessible. A new introduction written by the editor covers biographical details, provides a brief abstract of each of the articles, and explains the basis of their selection.
Factor Analysis and Dimension Reduction in R provides coverage, with worked examples, of a large number of dimension reduction procedures, along with model performance metrics to compare them. Factor analysis in the form of principal components analysis (PCA) or principal factor analysis (PFA) is familiar to most social scientists. What is less familiar is that factor analysis is a subset of the more general statistical family of dimension reduction methods, and the social scientist's toolkit for factor analysis problems can be expanded to include the range of solutions this book presents. In addition to covering FA and PCA with orthogonal and oblique rotation, the book's coverage includes higher-order factor models, bifactor models, models based on binary and ordinal data, models based on mixed data, generalized low-rank models (GLRM), cluster analysis with GLRM, models involving supplemental variables or observations, Bayesian factor analysis, regularized factor analysis, testing for unidimensionality, and prediction with factor scores.

The second half of the book deals with other procedures for dimension reduction. These include kernel PCA, factor analysis with multidimensional scaling, locally linear embedding models, Laplacian eigenmaps, diffusion maps, force-directed methods, t-distributed stochastic neighbor embedding, independent component analysis (ICA), dimensionality reduction via regression (DRR), non-negative matrix factorization (NNMF), Isomap, autoencoders, uniform manifold approximation and projection (UMAP) models, neural network models, and longitudinal factor analysis models. In addition, a special chapter covers metrics for comparing model performance.

Features of this book include:
- Numerous worked examples with replicable R code
- Explicit, comprehensive coverage of data assumptions
- Adaptation of factor methods to binary, ordinal, and categorical data
- Residual and outlier analysis
- Visualization of factor results
- Final chapters that treat the integration of factor analysis with neural network and time series methods

Presented in color, with R code and an introduction to R and RStudio, this book will be suitable for graduate-level courses and optional modules for social scientists, as well as for courses on quantitative methods and multivariate statistics.
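For readers unfamiliar with the dimension-reduction framing, a minimal PCA sketch follows. It is written in Python rather than the R used in the book, and the simulated two-factor data set is an assumption made purely for illustration.

```python
# Minimal PCA sketch (illustrative only; the book's worked examples are in R).
# Center the data, take the SVD, and keep the leading components -- the simplest
# member of the dimension-reduction family described above.
import numpy as np

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))                         # two underlying factors
loadings = rng.normal(size=(2, 10))
X = latent @ loadings + 0.1 * rng.normal(size=(200, 10))   # ten observed variables

Xc = X - X.mean(axis=0)                  # center each column
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2                                    # number of components to retain
scores = U[:, :k] * s[:k]                # component scores (n x k)
explained = s**2 / np.sum(s**2)          # proportion of variance per component

print("variance explained by first two components:", explained[:2].sum())
```

Because the data were generated from two latent factors plus noise, the first two components recover almost all of the variance, which is the kind of diagnostic the book's model-comparison metrics formalize.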
The case-control approach is a powerful method for investigating factors that may explain a particular event. It is extensively used in epidemiology to study disease incidence, one of the best-known examples being Bradford Hill and Doll's investigation of the possible connection between cigarette smoking and lung cancer. More recently, case-control studies have been increasingly used in other fields, including sociology and econometrics. With a particular focus on statistical analysis, this book is ideal for applied and theoretical statisticians wanting an up-to-date introduction to the field. It covers the fundamentals of case-control study design and analysis as well as more recent developments, including two-stage studies, case-only studies and methods for case-control sampling in time. The latter have important applications in large prospective cohorts which require case-control sampling designs to make efficient use of resources. More theoretical background is provided in an appendix for those new to the field.
A Handbook of Statistical Analyses Using SPSS clearly describes how to conduct a range of univariate and multivariate statistical analyses using the latest version of the Statistical Package for the Social Sciences, SPSS 11. Each chapter addresses a different type of analytical procedure applied to one or more data sets, primarily from the social and behavioral sciences areas. Each chapter also contains exercises relating to the data sets introduced, providing readers with a means to develop both their SPSS and statistical skills. Model answers to the exercises are also provided. Readers can download all of the data sets from a companion Web site furnished by the authors.
This book provides the reader with user-friendly applications of the normal distribution. In several variables it is called the multinormal (multivariate normal) distribution, which is often handled using matrices for convenience. The author seeks to make the arguments less abstract and hence starts with the univariate case and moves progressively toward the vector and matrix cases; the approach is a gradual one, going from one scalar variable to a vector variable and then to a matrix variable. The author presents the unified aspect of the normal distribution and also addresses several other issues, including random matrix theory in physics. Other well-known applications are discussed as well, such as Herrnstein and Murray's argument that human intelligence is substantially influenced by both inherited and environmental factors, and that it is a better predictor of many personal outcomes, including financial income, job performance, birth out of wedlock, and involvement in crime, than an individual's parental socioeconomic status or education level.
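As a small illustration of the scalar-to-matrix progression mentioned above (not drawn from the book), the following Python sketch writes the univariate normal density and its matrix-based multivariate counterpart side by side; the example mean vector and covariance matrix are arbitrary assumptions.

```python
# Univariate normal density versus the matrix-based multivariate form.
import numpy as np

def normal_pdf(x, mu, sigma):
    """Univariate normal density (scalar case)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def multinormal_pdf(x, mu, cov):
    """Multivariate normal density, written with matrix operations."""
    d = len(mu)
    diff = x - mu
    quad = diff @ np.linalg.solve(cov, diff)          # (x-mu)' Sigma^{-1} (x-mu)
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
    return np.exp(-0.5 * quad) / norm

mu = np.array([0.0, 1.0])
cov = np.array([[1.0, 0.5],
                [0.5, 2.0]])
print(normal_pdf(0.3, 0.0, 1.0))                        # scalar case
print(multinormal_pdf(np.array([0.2, 1.4]), mu, cov))   # vector case via matrices
```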
Herbert Scarf is a highly esteemed and distinguished American economist. He is internationally famous for his early epoch-making work on optimal inventory policies and his highly influential study with Andrew Clark on optimal policies for a multi-echelon inventory problem, which initiated the important and flourishing field of supply chain management. Equally, he has gained world recognition for his classic study on the stability of Walrasian price adjustment processes and his fundamental analysis of the relationship between the core and the set of competitive equilibria (the so-called Edgeworth conjecture). Further achievements include his remarkable sufficient condition for the existence of a core in non-transferable utility games and general exchange economies, his seminal paper with Lloyd Shapley on housing markets, and his pioneering study on increasing returns and models of production in the presence of indivisibilities. Above all, however, the name Scarf is remembered as a synonym for the computation of economic equilibria and fixed points: in the early 1960s he invented a path-breaking technique for computing equilibrium prices. This work has generated a major research field in economics termed Applied General Equilibrium Analysis and a corresponding area in operations research known as Simplicial Fixed Point Methods. This book comprises all his research articles and consists of four volumes. This volume collects Herbert Scarf's papers in the area of Operations Research and Management.
This book surveys existing econometric models in Japan and offers several econometric models combining Japan, the US and other Asia-Pacific countries. These models have been developed by the author and his group at Nagoya University and other institutions over three decades, and are applied to the following four objectives. First, they construct a world econometric model of industry and trade, and thereby quantitatively assess the impacts of protective US trade policies and Japan's technical progress on Asia-Pacific economies. Second, they use an international input-output table, including China, to analyze the interdependence between Japanese firms with subsidiaries in the US and Asia and other foreign companies. Third, they use a small link model of China, Japan, Korea and the US, and thereby evaluate the macroeconomic effects of the respective fiscal policies. Fourth, they offer a multi-sector econometric model of the interactions among economic activity, energy and the environment in China, and assess the effects of improved energy efficiency and demand shifts in China. This volume comprises papers written by Soshichi Kinoshita (Professor Emeritus, Nagoya University, Nagoya), Jiro Nemoto (Professor of Economics, Nagoya University, Nagoya), Mitsuo Yamada (Professor of Economics, Chukyo University, Nagoya) and Taiyo Ozaki (Professor of Economics, Kyoto Gakuen University, Kyoto).
General Equilibrium Theory has been one of the major intellectual developments in economics during the past half-century. The theory of general equilibrium is centred on an inquiry about human societies which has several of the characteristics of a fundamental scientific question. In an economy, a multitude of agents produce, exchange, and consume a large number of commodities. Their decisions are independent of each other and dictated by self-interest. Attempting to answer the question 'Why is social chaos not the result?' has required an intensive research effort by several generations of leading economists. This important three-volume set gathers together many of the articles that have played an influential role in the history of ideas in the general equilibrium area in the contemporary period.
Both theoretical and empirical aspects of single- and multi-winner voting procedures are presented in this collection of papers. Starting from a discussion of the underlying principles of democratic representation, the volume includes a description of a great variety of voting procedures. It lists and illustrates their susceptibility to the main voting paradoxes, assesses (under various models of voters' preferences) the probability of paradoxical outcomes, and discusses the relevance of the theoretical results to the choice of voting system.