Econometric theory, as presented in textbooks and the econometric
literature generally, is a somewhat disparate collection of
findings. Its essential nature is to be a set of demonstrated
results that grows over time, each logically based on a specific
set of axioms or assumptions, yet at every moment these results
form an incomplete body of knowledge rather than a finished work.
The practice of econometric theory consists of selecting
from, applying, and evaluating this literature, so as to test its
applicability and range. The creation, development, and use of
computer software has led applied economic research into a new age.
This book describes the history of econometric computation from
1950 to the present day, based upon an interactive survey involving
the collaboration of the many econometricians who have designed and
developed this software. It identifies each of the econometric
software packages that are made available to and used by economists
and econometricians worldwide.
A function is convex if its epigraph is convex. This geometrical
structure has very strong implications in terms of continuity and
differentiability. Separation theorems lead to optimality
conditions and duality for convex problems. A function is
quasiconvex if its lower level sets are convex. Here again, the
geometrical structure of the level sets implies some continuity and
differentiability properties for quasiconvex functions. Optimality
conditions and duality can be derived for optimization problems
involving such functions as well. Over a period of about fifty
years, quasiconvex and other generalized convex functions have been
considered in a variety of fields including economics, management
science, engineering, probability and applied sciences, in
accordance with the needs of particular applications. During the
last twenty-five years, research activity in this field has
increased. More recently, the generalized monotonicity of maps has
been studied. It relates to the generalized convexity of functions
as monotonicity relates to convexity. Generalized monotonicity
plays a role in variational inequality problems, complementarity
problems and, more generally, in equilibrium problems.
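These two definitions can be illustrated numerically. The sketch below is a minimal illustration, not taken from the book: it tests midpoint convexity and midpoint quasiconvexity over a sampled grid (a necessary condition only, since just the sampled pairs are checked), using sqrt(|x|) as a standard example of a function that is quasiconvex but not convex.

```python
import math
import itertools

def midpoint_convex(f, xs, tol=1e-9):
    # convexity at midpoints: f((a+b)/2) <= (f(a)+f(b))/2 on sampled pairs
    return all(f((a + b) / 2) <= (f(a) + f(b)) / 2 + tol
               for a, b in itertools.combinations(xs, 2))

def midpoint_quasiconvex(f, xs, tol=1e-9):
    # quasiconvexity at midpoints: f((a+b)/2) <= max(f(a), f(b)),
    # i.e. the lower level sets look convex on the sample
    return all(f((a + b) / 2) <= max(f(a), f(b)) + tol
               for a, b in itertools.combinations(xs, 2))

grid = [i / 10 for i in range(-20, 21)]
g = lambda x: math.sqrt(abs(x))   # quasiconvex, but not convex
h = lambda x: x * x               # convex, hence also quasiconvex
```

Here `midpoint_convex(h, grid)` and `midpoint_quasiconvex(g, grid)` hold, while `midpoint_convex(g, grid)` fails, for instance at the pair (0, 1), where f(0.5) ≈ 0.71 exceeds the chord value 0.5.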
National income estimates date back to the late 17th century, but
only in the half-century since the Second World War have economic
accounts developed in their present form, becoming an indispensable
tool for macroeconomic analysis, projections and policy
formulation. Furthermore, it was in this period that the United
Nations issued several versions of a system of national accounts
(SNA) to make possible economic comparisons on a consistent basis.
The latest version, SNA 1993, published in early 1994, occasioned
this collection of essays and commentaries. The three chief
objectives of the volume are: to enhance understanding of
socioeconomic accounts generally and of SNA 1993 in particular; to
offer a critique of SNA 1993, including constructive suggestions
for future revisions of the system, making it even more useful for
its national and international purposes; and to serve as a
textbook, or book of readings in conjunction with SNA 1993, for
courses in economic accounts.
This is the first textbook designed to teach statistics to students
in aviation courses. All examples and exercises are grounded in an
aviation context, including flight instruction, air traffic
control, airport management, and human factors. Structured in six
parts, the book covers the key foundational topics of
descriptive and inferential statistics, including hypothesis
testing, confidence intervals, z and t tests, correlation,
regression, ANOVA, and chi-square. In addition, this book promotes
both procedural knowledge and conceptual understanding. Detailed,
guided examples are presented from the perspective of conducting a
research study. Each analysis technique is clearly explained,
enabling readers to understand, carry out, and report results
correctly. Students are further supported by a range of pedagogical
features in each chapter, including objectives, a summary, and a
vocabulary check. Digital supplements comprise downloadable data
sets and short video lectures explaining key concepts. Instructors
also have access to PPT slides and an instructor’s manual that
consists of a test bank with multiple choice exams, exercises with
data sets, and solutions. This is the ideal statistics textbook for
aviation courses globally, especially in aviation statistics,
research methods in aviation, human factors, and related areas.
This textbook provides future data analysts with the tools,
methods, and skills needed to answer data-focused, real-life
questions; to carry out data analysis; and to visualize and
interpret results to support better decisions in business,
economics, and public policy. Data wrangling and exploration,
regression analysis, machine learning, and causal analysis are
comprehensively covered, as well as when, why, and how the methods
work, and how they relate to each other. As the most effective way
to communicate data analysis, running case studies play a central
role in this textbook. Each case starts with an industry-relevant
question and answers it by using real-world data and applying the
tools and methods covered in the textbook. Learning is then
consolidated by 360 practice questions and 120 data exercises.
Extensive online resources, including raw and cleaned data and
codes for all analysis in Stata, R, and Python, can be found at
www.gabors-data-analysis.com.
Drawing on the author's extensive and varied research, this book
provides readers with a firm grounding in the concepts and issues
across several disciplines including economics, nutrition,
psychology and public health in the hope of improving the design of
food policies in the developed and developing world. Using
longitudinal (panel) data from India, Bangladesh, Kenya, the
Philippines, Vietnam, and Pakistan and extending the analytical
framework used in economics and biomedical sciences to include
multi-disciplinary analyses, Alok Bhargava shows how rigorous and
thoughtful econometric and statistical analysis can improve our
understanding of the relationships between socioeconomic,
nutritional, and behavioural variables and outcomes such as
cognitive development in children and labour
productivity in the developing world. These unique insights
combined with a multi-disciplinary approach forge the way for a
more refined and effective approach to food policy formation going
forward. A chapter on the growing obesity epidemic is also
included, highlighting the new set of problems facing not only
developed but also developing countries. The book also includes a
glossary of technical terms to assist readers coming from a variety
of disciplines.
The purpose of this volume is to honour a pioneer in the field of
econometrics, A. L. Nagar, on the occasion of his sixtieth
birthday. Fourteen econometricians from six countries on four
continents have contributed to this project. One of us was his
teacher, some of us were his students, many of us were his
colleagues, all of us are his friends. Our volume opens with a
paper by L. R. Klein which discusses the meaning and role of
exogenous variables in structural and vector-autoregressive
econometric models. Several examples from recent macroeconomic
history are presented and the notion of Granger-causality is
discussed. This is followed by two papers dealing with an issue of
considerable relevance to developing countries, such as India; the
measurement of the inequality in the distribution of income. The
paper by C. T. West and H. Theil deals with the problem of
measuring inequality of all components of total income within a
region, rather than just labour income. It applies its results to
the regions of the United States. The second paper in this group,
by N. Kakwani, derives the large-sample distributions of several
popular inequality measures, thus providing a method for drawing
large-sample inferences about the differences in inequality between
regions. The techniques are applied to the regions of Cote
d'Ivoire. The next group of papers is devoted to econometric theory
in the context of the dynamic, simultaneous, linear equations
model. The first, by P. J.
This book describes the classical axiomatic theories of decision
under uncertainty, as well as critiques thereof and alternative
theories. It focuses on the meaning of probability, discussing some
definitions and surveying their scope of applicability. The
behavioral definition of subjective probability serves as a way to
present the classical theories, culminating in Savage's theorem.
The limitations of this result as a definition of probability lead
to two directions - first, similar behavioral definitions of more
general theories, such as non-additive probabilities and multiple
priors, and second, cognitive derivations based on case-based
techniques.
In plying their trade, social scientists often are confronted with
significant phenomena that appear incapable of measurement. Past
practice would suggest that the way to deal with these cases is to
work harder at finding appropriate measures so that standard
quantitative analysis can still be applied. Professor Katzner's
approach, however, is quite different. Rather than concentrating on
the construction of measures, he raises the question of how such
phenomena can be investigated and understood in the absence of
numerical gauges to represent them.
At the time of this volume's publication in 1985, general
equilibrium modelling had become a significant area of applied
economic research. Its focus was to develop techniques to
facilitate economy-wide quantitative assessment of allocative and
distributional impacts of policy changes. UK Tax Policy and Applied
General Equilibrium Analysis was the first book-length treatment of
the development and application of an applied general equilibrium
model of the Walrasian type, constructed to analyse UK taxation and
subsidy policy. As a whole, UK Tax Policy and Applied General
Equilibrium Analysis offers the reader two things. First, it gives
a detailed account of the development of an applied general
equilibrium model of the UK. Second, it provides results of model
experiments which have been designed to inform the policy debate,
not only in the UK but also in other countries. It should thus be
of interest to both researchers and students undertaking research
in the applied general equilibrium area and to policy makers
concerned with tax reform.
Giovanni Castellani Rector of the University of Venice This book
contains the Proceedings of the Conference on "Economic Policy and
Control Theory" which was held at the University of Venice (Italy)
on 27 January-1 February 1985. The goal of the Conference was to
survey the main developments of control theory in economics, by
emphasizing particularly new achievements in the analysis of
dynamic economic models by control methods. The development of
control theory is strictly related to the development of science
and technology in the last forty years. Control theory was indeed
applied mainly in engineering, and only in the sixties did
economists start using control methods for analysing economic
problems, even if some preliminary economic applications of the
calculus of variations, from which control theory was then
developed, date back
to the twenties. Applications of control theory in economics also
had to solve new, complicated problems, like those encountered in
optimal growth models, or like the determination of the appropriate
intertemporal social welfare function, of the policy horizon and
the relative final state of the system, of the appropriate discount
factor. Furthermore, the uncertainty characterizing economic models
had to be taken into account, thus giving rise to the development
of stochastic control theory in economics.
This book arises from the Fourth European Colloquium on
Theoretical and Quantitative Geography, which was held in
Veldhoven, The Netherlands, in September 1985. It contains a
series of papers on spatial choice dynamics and dynamical spatial
systems which were presented at the colloquium, together with a few
other solicited ones. The book is intended primarily as a
state-of-the-art review of mainly European research on these two
fast-growing problem areas. As a consequence of this decision,
the book contains a selection of papers that differ in terms of
focus, level of sophistication and conceptual background.
Evidently, the dissemination of ideas and computer software is a
time-related phenomenon, which in the European context is amplified
by differences in language, the profile of geography and the formal
training of geographers. The book reflects such differences. It
would have been impossible to produce this book without the support
of the various European study groups on theoretical and
quantitative geography. Without their help the meetings from which
this volume originates would not have been held in the first
place. We are also indebted to the Royal Dutch Academy of Science
for partly funding the colloquium, and to SISWO and TNO/PSC for
providing general support in the organisation of the conference.
This book covers recent advances in efficiency evaluations, most
notably Data Envelopment Analysis (DEA) and Stochastic Frontier
Analysis (SFA) methods. It introduces the underlying theories,
shows how to make the relevant calculations and discusses
applications. The aim is to make the reader aware of the pros and
cons of the different methods and to show how to use these methods
in both standard and non-standard cases. Several software packages
have been developed to solve some of the most common DEA and SFA
models. This book relies on R, a free, open source software
environment for statistical computing and graphics. This enables
the reader to solve not only standard problems, but also many other
problem variants. Using R, one can focus on understanding the
context and developing a good model. One is not restricted to
predefined model variants and to a one-size-fits-all approach. To
facilitate the use of R, the authors have developed an R package
called Benchmarking, which implements the main methods within both
DEA and SFA. The book uses mathematical formulations of models and
assumptions, but it de-emphasizes the formal proofs, in part by
placing them in appendices and in part by referring to the original
sources. Moreover, the book emphasizes the usage of the theories
and the interpretations of the mathematical formulations. It
includes a series of small examples, graphical illustrations,
simple extensions and questions to think about. Also, it combines
the formal models with less formal economic and organizational
thinking. Last but not least it discusses some larger applications
with significant practical impacts, including the design of
benchmarking-based regulations of energy companies in different
European countries, and the development of merger control programs
for competition authorities.
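As a rough illustration of the kind of calculation involved (a generic sketch, not the Benchmarking package's own API, and with invented toy data), the code below solves the input-oriented constant-returns-to-scale (CCR) envelopment linear program for each decision-making unit using scipy.

```python
import numpy as np
from scipy.optimize import linprog

def dea_crs_input(X, Y):
    """Input-oriented CRS (CCR) DEA efficiency scores via the envelopment LP.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables z = [theta, lambda_1, ..., lambda_n]; minimize theta
        c = np.r_[1.0, np.zeros(n)]
        # input constraints:  sum_j lam_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # output constraints: -sum_j lam_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(0, None)] * (n + 1))
        scores.append(res.x[0])
    return np.array(scores)

# toy data: three DMUs, one input, one output
X = np.array([[2.0], [4.0], [6.0]])
Y = np.array([[2.0], [3.0], [6.0]])
eff = dea_crs_input(X, Y)   # the middle DMU is dominated, so its score is below 1
```

With a single input and output under CRS, the score reduces to each unit's output-input ratio divided by the best ratio in the sample, which the LP reproduces.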
O. Guvenen, University of Paris IX-Dauphine. The aim of this
publication is to present recent developments in international
commodity market model building and policy analysis. This book is
based mainly on the research presented at the XIIth International
Conference organised by the Applied Econometric Association (AEA),
which was held at the University of Zaragoza in Spain. This
conference would not have been possible without the cooperation of
the Department of Econometrics of the University of Zaragoza and
its Chairman A.A. Grasa. I would like to express my thanks to all
contributors. I am grateful to J.H.P. Paelinck, J.P. Ancot, A.J.
Hughes Hallett and H. Serbat for their constructive contributions
and comments concerning the structure of the book. INTRODUCTION
O. Guvenen. The challenge of increasing complexity and global
interdependence at the world level necessitates new modelling
approaches and policy analysis at the macroeconomic level, and for
commodities. The evolution of economic modelling follows the
evolution of international economic phenomena. In that
interdependent context there is a growing need for forecasting and
simulation tools in the analysis of international primary
commodity markets.
These proceedings, from a conference held at the Federal Reserve
Bank of St. Louis on October 17-18, 1991, attempted to lay out what
we currently know about aggregate economic fluctuations.
Identifying what we know inevitably reveals what we do not know
about such fluctuations as well. From the vantage point of where
the conference's participants view our current understanding to be,
these proceedings can be seen as suggesting an agenda for further
research. The conference was divided into five sections. It began
with the formulation of an empirical definition of the "business
cycle" and a recitation of the stylized facts that must be
explained by any theory that purports to capture the business
cycle's essence. After outlining the historical development and
key features of the current "theories" of business cycles, the
conference evaluated these theories on the basis of their ability
to explain the facts. Included in this evaluation was a discussion
of whether (and how) the competing theories could be distinguished
empirically. The conference then examined the implications for
policy of what is known and not known about business cycles. A
panel discussion closed the conference, highlighting important
unresolved theoretical and empirical issues that should be taken up
in future business cycle research. What Is a Business Cycle? Before
gaining a genuine understanding of business cycles, economists must
agree and be clear about what they mean when they refer to the
cycle.
Max-Min problems are two-step allocation problems in which one side
must make his move knowing that the other side will then learn what
the move is and optimally counter. They are fundamental in
particular to military weapons-selection problems involving large
systems such as Minuteman or Polaris, where the systems in the mix
are so large that they cannot be concealed from an opponent. One
must then expect the opponent to determine on an optimal mixture
of, in the case mentioned above, anti-Minuteman and anti-submarine
effort. The author's first introduction to a problem of Max-Min
type occurred at The RAND Corporation about 1951. One side
allocates anti-missile defenses to various cities. The other side
observes this allocation and then allocates missiles to those
cities. If F(x, y) denotes the total residual value of the cities
after the attack, with x denoting the defender's strategy and y the
attacker's, the problem is then to find max_x min_y F(x, y).
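The two-step structure can be made concrete with a tiny enumeration. This is a hypothetical sketch whose city values and damage rule are invented rather than taken from the book: the defender picks the allocation whose worst-case (attacker-optimal) residual value is largest.

```python
# Toy two-step (max-min) allocation in the spirit of the city-defense story.
values = [10.0, 6.0]   # city values (invented)
units = 2              # defense units the first side can allocate

def residual_value(defense, target):
    # assumed damage rule: the attacked city loses value v/(d+1)
    return sum(values) - values[target] / (defense[target] + 1)

best = None
for d0 in range(units + 1):
    defense = (d0, units - d0)
    # the attacker observes `defense` and hits the most damaging target,
    # so the defender is guaranteed only the minimum over targets
    worst = min(residual_value(defense, t) for t in range(len(values)))
    if best is None or worst > best[1]:
        best = (defense, worst)

# best[0] is the max-min defense allocation, best[1] the guaranteed value
```

Under these invented numbers the even split wins: concentrating both units on one city leaves the other city fully exposed, so the attacker's best response is worse for the defender.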
In this Element and its accompanying second Element, A Practical
Introduction to Regression Discontinuity Designs: Extensions,
Matias Cattaneo, Nicolas Idrobo, and Rocío Titiunik provide an
accessible and practical guide for the analysis and interpretation
of regression discontinuity (RD) designs that encourages the use of
a common set of practices and facilitates the accumulation of
RD-based empirical evidence. In this Element, the authors discuss
the foundations of the canonical Sharp RD design, which has the
following features: (i) the score is continuously distributed and
has only one dimension, (ii) there is only one cutoff, and (iii)
compliance with the treatment assignment is perfect. In the second
Element, the authors discuss practical and conceptual extensions to
this basic RD setup.
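As a rough sketch of what a canonical Sharp RD analysis estimates (not the authors' accompanying software), the code below simulates a score with a cutoff at zero and compares local linear fits on either side of it; the bandwidth and the data-generating process are invented for illustration, and real work would use data-driven bandwidth choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.uniform(-1, 1, n)              # running variable (score), cutoff at 0
tau = 2.0                              # true jump at the cutoff (simulated)
y = 1.0 + 0.8 * x + tau * (x >= 0) + rng.normal(0, 0.3, n)

h = 0.25                               # ad hoc bandwidth around the cutoff
left = (x < 0) & (x >= -h)
right = (x >= 0) & (x <= h)

# local linear fit on each side; the intercept is the fitted value at x = 0
slope_l, int_l = np.polyfit(x[left], y[left], 1)
slope_r, int_r = np.polyfit(x[right], y[right], 1)
rd_estimate = int_r - int_l            # sharp RD treatment-effect estimate
```

The estimate is the difference between the two intercepts at the cutoff, which here recovers the simulated jump of 2 up to sampling noise.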
How could Finance benefit from AI? How can AI techniques provide an
edge? Moving well beyond simply speeding up computation, this book
tackles AI for Finance from a range of perspectives including
business, technology, research, and students. Covering aspects like
algorithms, big data, and machine learning, this book answers these
and many other questions.
Accessible to a general audience with some background in statistics
and computing. Many examples and extended case studies.
Illustrations using R and RStudio. A true blend of statistics and
computer science, not just a grab bag of topics from each.
Macroeconomic Policy in the Canadian Economy investigates
developments in Canada over the last forty years, using recent
advances in the field of applied econometrics. In particular, the
book analyzes the theoretical foundations of public sector
activities and evaluates the several theories of government growth.
Issues of convergence are also investigated as they manifest
themselves in per capita income across Canadian provinces, and as
to how successful government income equalization policies have been
in furthering such convergence. Moreover, the openness of the
Canadian economy is investigated in terms of the importance of
exports for GDP growth and of its participation in an
internationally integrated world capital market. The book also analyzes
monetary policy issues and investigates the role of monetary
aggregates and the effectiveness of monetary policy. Finally, it
addresses the question of whether electoral and partisan cycles
exist in Canada, by incorporating both fiscal and
monetary principles and applying them to the lively world of
Canadian politics.
Empirical Studies In Applied Economics presents nine previously
unpublished analyses in monograph form. In this work, the topics
are presented so that each chapter stands on its own. The emphasis
is on the applications but attention is also given to the
econometric and statistical issues for advanced readers.
Econometric methods include multivariate regression analysis,
limited dependent variable analysis, and other maximum likelihood
techniques. The empirical topics include the measurement of
competition and market power in natural gas transportation markets
and in the pharmaceutical market for chemotherapy drugs. Additional
topics include an empirical analysis of NFL football demand, the
accuracy of an econometric model for mail demand, and the
allocation of police services in rural Alaska. Other chapters
consider the valuation of technology patents and the determination
of patent scope, duration, and reasonable royalty, and the reaction
of financial markets to health scares in the fast-food industry.
Finally, two chapters are devoted to the theory and testing of
synergistic health effects from the combined exposure to asbestos
and cigarette smoking.