Recent economic history suggests that a key element in economic
growth and development for many countries has been an aggressive
export policy and a complementary import policy. Such policies can
be very effective provided that resources are used wisely to
encourage exports from industries that can be competitive in the
international arena. Also, import protection must be used carefully
so that it encourages infant industries instead of providing rents
to industries that are not competitive. Policy makers may use a
variety of methods of analysis in planning trade policy. As
computing power has grown in recent years, increasing attention has
been given to economic models as one of the most powerful aids to
policy making. These models can be used on the one hand to help in
selecting export industries to encourage and infant industries to
protect, and on the other hand to chart the larger effects of trade
policy on the entire economy. While many models have been developed
in recent years, there has not been any analysis of the strengths
and weaknesses of the various types of models. Therefore, this
monograph provides a review and analysis of the models which can be
used to analyze dynamic comparative advantage.
How might one determine if a financial institution is taking risk
in a balanced and productive manner? A powerful tool to address
this question is economic capital, which is a model-based measure
of the amount of equity that an entity must hold to satisfactorily
offset its risk-generating activities. This book, with a particular
focus on the credit-risk dimension, pragmatically explores
real-world economic-capital methodologies and applications. It
begins with the thorny practical issues surrounding the
construction of an (industrial-strength) credit-risk
economic-capital model, defensibly determining its parameters, and
ensuring its efficient implementation. It then broadens its gaze to
examine various critical applications and extensions of economic
capital; these include loan pricing, the computation of loan
impairments, and stress testing. Along the way, typically working
from first principles, various possible modelling choices and
related concepts are examined. The end result is a useful reference
for students and practitioners wishing to learn more about a
centrally important financial-management device.
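The blurb's definition of economic capital (equity held to offset risk, computed from a model of the loss distribution) can be illustrated with a small simulation: a high quantile of simulated portfolio losses minus the expected loss. This is a generic one-factor Gaussian sketch with invented parameter values, not the book's own methodology:

```python
import numpy as np
from statistics import NormalDist

def economic_capital(pd_, lgd, rho, n_obligors, n_scenarios,
                     alpha=0.999, seed=42):
    """Economic capital as the alpha-quantile of simulated portfolio
    losses minus the expected loss; losses are expressed as a
    fraction of total exposure."""
    rng = np.random.default_rng(seed)
    threshold = NormalDist().inv_cdf(pd_)        # PD-implied default threshold
    z = rng.standard_normal((n_scenarios, 1))    # systematic factor
    eps = rng.standard_normal((n_scenarios, n_obligors))  # idiosyncratic shocks
    assets = np.sqrt(rho) * z + np.sqrt(1.0 - rho) * eps
    losses = lgd * (assets < threshold).mean(axis=1)
    return np.quantile(losses, alpha) - losses.mean()

# Hypothetical portfolio: 1% default probability, 45% loss given
# default, 20% asset correlation, 500 obligors.
ec = economic_capital(pd_=0.01, lgd=0.45, rho=0.20,
                      n_obligors=500, n_scenarios=20_000)
```

Because the quantile sits far in the tail while the expected loss is already priced in, the capital figure is dominated by the correlation parameter rho.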
Parallel Algorithms for Linear Models provides a complete and
detailed account of the design, analysis and implementation of
parallel algorithms for solving large-scale linear models. It
investigates and presents efficient, numerically stable algorithms
for computing the least-squares estimators and other quantities of
interest on massively parallel systems. The monograph is in two
parts. The first part consists of four chapters and deals with the
computational aspects for solving linear models that have
applicability in diverse areas. The remaining two chapters form the
second part, which concentrates on numerical and computational
methods for solving various problems associated with seemingly
unrelated regression equations (SURE) and simultaneous equations
models. The practical issues of the parallel algorithms and the
theoretical aspects of the numerical methods will be of interest to
a broad range of researchers working in the areas of numerical and
computational methods in statistics and econometrics, parallel
numerical algorithms, parallel computing and numerical linear
algebra. The aim of this monograph is to promote research in the
interface of econometrics, computational statistics, numerical
linear algebra and parallelism.
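The least-squares estimators the monograph discusses are conventionally computed via a QR factorisation of the design matrix, which is numerically stabler than forming the normal equations. A minimal serial sketch (not one of the monograph's parallel algorithms), using invented data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))            # design matrix
beta_true = np.array([1.5, -2.0, 0.5])
y = X @ beta_true + 0.01 * rng.standard_normal(200)

# Factor X = QR; then R * beta = Q'y is a small triangular system,
# avoiding the ill-conditioned normal equations X'X beta = X'y.
Q, R = np.linalg.qr(X)
beta_hat = np.linalg.solve(R, Q.T @ y)
```

In the parallel setting the same factorisation is typically built up blockwise across processors, but the triangular solve at the end is identical.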
In recent years there has been a growing interest in and concern
for the development of a sound spatial statistical body of theory.
This work has been undertaken by geographers, statisticians,
regional scientists, econometricians, and others (e.g.,
sociologists). It has led to the publication of a number of books,
including Cliff and Ord's Spatial Processes (1981), Bartlett's The
Statistical Analysis of Spatial Pattern (1975), Ripley's Spatial
Statistics (1981), Paelinck and Klaassen's Spatial Econometrics
(1979), Ahuja and Schachter's Pattern Models (1983), and Upton and
Fingleton's Spatial Data Analysis by Example (1985). The first of
these books presents a useful introduction to the topic of spatial
autocorrelation, focusing on autocorrelation indices and their
sampling distributions. The second of these books is quite brief,
but nevertheless furnishes an eloquent introduction to the
relationship between spatial autoregressive and two-dimensional
spectral models. Ripley's book virtually ignores autoregressive and
trend surface modelling, and focuses almost solely on point pattern
analysis. Paelinck and Klaassen's book closely follows an
econometric textbook format, and as a result overlooks much of the
important material necessary for successful spatial data analysis.
It almost exclusively addresses distance and gravity models, with
some treatment of autoregressive modelling. Pattern Models
supplements Cliff and Ord's book, which in combination provide a
good introduction to spatial data analysis. Its basic limitation is
a preoccupation with the geometry of planar patterns, and hence is
very narrow in scope.
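The autocorrelation indices treated by Cliff and Ord include Moran's I, the classic measure of whether nearby observations take similar values. A minimal sketch with an invented five-site example (sites on a line, binary adjacency weights):

```python
import numpy as np

def morans_i(x, W):
    """Moran's I spatial autocorrelation index for observations x
    under spatial weights matrix W."""
    z = x - x.mean()
    return (len(x) / W.sum()) * (z @ W @ z) / (z @ z)

# Five sites on a line, each neighbouring the next.
W = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # smoothly increasing values
i_stat = morans_i(x, W)                    # positive: neighbours are similar
```

Values near +1 indicate clustering of similar values, values near -1 a checkerboard pattern, and values near the small negative expectation -1/(n-1) indicate no spatial structure.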
This book presents the effects of integrating information and
communication technologies (ICT) and economic processes in
macroeconomic dynamics, finance, marketing, industrial policies,
and in government economic strategy. The text explores modeling and
applications in these fields and also describes, in a clear and
accessible manner, the theories that guide the integration among
information technology (IT), telecommunications, and the economy,
while presenting examples of their applications. Current trends
such as artificial intelligence, machine learning, and big data
technologies used in economics are also included. This volume is
suitable for researchers, practitioners, and students working in
economic theory and the computational social sciences.
The first edition of this book has been described as a landmark
book, being the first of its kind in applied econometrics. This
second edition is thoroughly revised and updated and explains how
to use many recent technical developments in time series
econometrics. The main objective of the book is to help applied
economists with a limited background in econometric estimation
theory to understand and apply widely used time series econometric
techniques.
Features content that has been used extensively in a university
setting, allowing the reader to benefit from tried and tested
methods, practices, and knowledge. In contrast to existing books on
the market, it details the specialized packages that have been
developed over the past decade, and focuses on pulling real-time
data directly from free data sources on the internet. It achieves
its goal by providing a large number of examples in hot topics such
as machine learning. Assumes no prior knowledge of R, allowing it
to be useful to a range of people from undergraduates to
professionals. Comprehensive explanations make the reader
proficient in a multitude of advanced methods and provide overviews
of many different resources that will be useful to readers.
'Fascinating . . . timely' Daily Mail 'Refreshingly clear and
engaging' Tim Harford 'Delightful . . . full of unique insights'
Prof Sir David Spiegelhalter There's no getting away from
statistics. We encounter them every day. We are all users of
statistics whether we like it or not. Do missed appointments really
cost the NHS GBP1bn per year? What's the difference between the
mean gender pay gap and the median gender pay gap? How can we work
out if a claim that we use 42 billion single-use plastic straws per
year in the UK is accurate? What did the Vote Leave campaign's
GBP350m bus really mean? How can we tell if the headline 'Public
pensions cost you GBP4,000 a year' is correct? Does snow really
cost the UK economy GBP1bn per day? But how do we distinguish
statistical fact from fiction? What can we do to decide whether a
number, claim or news story is accurate? Without an understanding
of data, we cannot truly understand what is going on in the world
around us. Written by Anthony Reuben, the BBC's first head of
statistics, Statistical is an accessible and empowering guide to
challenging the numbers all around us.
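The mean versus median pay-gap question raised above has a concrete answer: the two measures can differ sharply when a few very high salaries pull the mean upward while leaving the median untouched. A sketch with invented salary figures:

```python
import numpy as np

# Hypothetical annual salaries in thousands; the figures are invented
# purely to show how one outlier separates the two gap measures.
men = np.array([25.0, 30.0, 35.0, 40.0, 250.0])
women = np.array([24.0, 29.0, 33.0, 38.0, 60.0])

# Gap expressed as a percentage of the men's figure, as in official
# UK gender-pay-gap reporting.
mean_gap = (men.mean() - women.mean()) / men.mean() * 100
median_gap = (np.median(men) - np.median(women)) / np.median(men) * 100
```

Here the single 250k salary inflates the mean gap to roughly ten times the median gap, which is why the two headline numbers can tell very different stories about the same workforce.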
Franz Ferschl is seventy. His birth certificate confirms it, but it
is hard to believe. Two of the three editors remember very well the
Golden Age of Operations Research at Bonn when Franz
Ferschl worked together with Wilhelm Krelle, Martin Beckmann and
Horst Albach. The importance of this fruitful cooperation is
reflected by the fact that half of the contributors to this book
were strongly influenced by Franz Ferschl and his colleagues at the
University of Bonn. Clearly, Franz Ferschl left his traces at all
the other places of his professional activities, in Vienna and
Munich. This is demonstrated by the present volume as well. Born in
1929 in the Upper-Austrian Mühlviertel, his scientific education
brought him to Vienna where he studied mathematics. In his early
years he was attracted by Statistics and Operations Research.
During his employment at the Österreichische Bundeskammer für
Gewerbliche Wirtschaft in Vienna he prepared his famous book on
queueing theory and stochastic processes in economics. This work
was accomplished in the scarce time left to him by his duties at the
Bundeskammer, mostly between 6 a.m. and midnight. All those
troubles were, however, soon rewarded by the chair of statistics at
Bonn University. As a real Austrian, the amenities of the Rhineland
could not prevent him from returning to Vienna, where he took the
chair of statistics.
The ultimate goal of economic studies is to predict how the economy
develops, and what will happen if we implement different policies.
To do that, we need a good understanding of what causes what in
economics. Prediction and causality in economics are the main
topics of this book's chapters; they use both traditional and more
innovative techniques, including quantum ideas, to make predictions
about the
world economy (international trade, exchange rates), about a
country's economy (gross domestic product, stock index, inflation
rate), and about individual enterprises, banks, and micro-finance
institutions: their future performance (including the risk of
bankruptcy), their stock prices, and their liquidity. Several
papers study how COVID-19 has influenced the world economy. This
book helps practitioners and researchers to learn more about
prediction and causality in economics, and to further develop
this important research direction.