A timely work which represents a major reappraisal of business cycle theory. It revives, with the help of modern analytical techniques, an old theme of Keynesian macroeconomics, namely that "market psychology" (i.e., volatile expectations) may be a significant cause of economic fluctuations. It is of interest not only to economists, but also to mathematicians and physicists.
This book combines both a comprehensive analytical framework and economic statistics that enable business decision makers to anticipate developing economic trends. The author blends recent and historical economic data with economic theory to provide important benchmarks or rules of thumb that give both economists and noneconomists enhanced understanding of unfolding economic data and their interrelationships. Through the matrix system, a disciplined approach is described for integrating readily available economic data into a comprehensive analysis without complex formulas. The extensive appendix of monthly key economic factors for 1978-1991 makes this an important reference source for economic and financial trend analysis. A new and practical method for economic trend analysis is introduced that provides more advanced knowledge than available from economic newsletters. Schaeffer begins with a general description of the business cycle and the typical behavior and effect of the credit markets, commercial banks, and the Federal Reserve. Next, fourteen key economic factors regularly reported by the business press are described, such as the capacity utilization rate and yield on three-month Treasury bills. Benchmarks for each of these key economic factors are set forth, together with an insightful discussion of the interrelationships indicating economic trends. A detailed discussion of the 1978-1991 American economy, incorporating monthly data from the historical matrix, demonstrates the practical application of the matrix system. Executives, investors, financial officers, and government policymakers will find this book useful in decision making.
This book develops the analysis of time series from its formal beginnings in the 1890s through to Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of time series analysis in use today.
This Festschrift is dedicated to Goetz Trenkler on the occasion of his 65th birthday. As can be seen from the long list of contributions, Goetz has had and still has an enormous range of interests, and colleagues to share these interests with. He is a leading expert in linear models with a particular focus on matrix algebra in its relation to statistics. He has published in almost all major statistics and matrix theory journals. His research activities also include other areas (like nonparametrics, statistics and sports, combination of forecasts and magic squares, just to mention a few). Goetz Trenkler was born in Dresden in 1943. After his school years in East Germany and West Berlin, he obtained a Diploma in Mathematics from the Free University of Berlin (1970), where he also discovered his interest in Mathematical Statistics. In 1973, he completed his Ph.D. with a thesis titled: On a distance-generating function of probability measures. He then moved on to the University of Hannover to become Lecturer and to write a habilitation thesis (submitted 1979) on alternatives to the Ordinary Least Squares estimator in the Linear Regression Model, a topic that would become his predominant field of research in the years to come.
A new approach to explaining the existence of firms and markets, focusing on variability and coordination. It stands in contrast to the emphasis on transaction costs, and on monitoring and incentive structures, which are prominent in most of the modern literature in this field. This approach, called the variability approach, allows us to: show why both the need for communication and the coordination costs increase when the division of labor increases; explain why, while the firm relies on direction, the market does not; rigorously formulate the optimum divisionalization problem; better understand the relationship between technology and organization; show why the 'size' of the firm is limited; and refine the analysis of whether the existence of a sharable input, or the presence of an external effect, leads to the emergence of a firm. The book provides a wealth of insights for students and professionals in economics, business, law and organization.
This book contains an extensive, up-to-date overview of nonlinear time series models and their application to modelling economic relationships. It considers nonlinear models in stationary and nonstationary frameworks, and both parametric and nonparametric models are discussed. The book contains examples of nonlinear models in economic theory and presents the most common nonlinear time series models. Importantly, it shows the reader how to apply these models in practice. For this purpose, the building of various nonlinear models, with the three stages of model building (specification, estimation and evaluation), is discussed in detail and is illustrated by several examples involving both economic and non-economic data. Since estimation of nonlinear time series models is carried out using numerical algorithms, the book contains a chapter on estimating parametric nonlinear models and another on estimating nonparametric ones.
The more generous social welfare system in Europe is one of the most important differences between European and US society. Defenders of the European welfare state argue that it improves social cohesion and prevents crime. On the other hand, the US economy is performing quite well, such that crime rates might come down due to better legal income opportunities. This book takes this trade-off as a point of departure and contributes to a better interdisciplinary understanding of the interactions between crime, economic performance and social exclusion. It evaluates the existing economic and criminological research and provides innovative empirical investigations on the basis of international panel data sets from different levels of regional aggregation. Among other aspects, the results clearly reveal the crime-reducing potential of intact families and the link between crime and the labour market. A special focus is on estimating the consequences of crime, a topic rarely analysed in the literature.
Time Series: Theory and Methods is a systematic account of linear time series models and their application to the modelling and prediction of data collected sequentially in time. The aim is to provide specific techniques for handling data and at the same time to provide a thorough understanding of the mathematical basis for the techniques. Both time and frequency domain methods are discussed, but the book is written in such a way that either approach could be emphasized. The book is intended to be a text for graduate students in statistics, mathematics, engineering, and the natural or social sciences. It contains substantial chapters on multivariate series and state-space models (including applications of the Kalman recursions to missing-value problems) and shorter accounts of special topics including long-range dependence, infinite variance processes and non-linear models. Most of the programs used in the book are available on diskettes for the IBM-PC. These diskettes, with the accompanying manual, ITSM: The Interactive Time Series Modelling Package for the PC, also by Brockwell and Davis, can be purchased from Springer-Verlag.
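To give a flavour of the linear models treated in books like this one, the AR(1) process can be simulated and estimated in a few lines. This is a minimal illustrative sketch (pure Python, not the authors' ITSM package); all function names and parameter values here are assumptions chosen for illustration.

```python
import random

def simulate_ar1(phi, n, sigma=1.0, seed=0):
    """Simulate an AR(1) process: x_t = phi * x_{t-1} + e_t, e_t ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0.0, sigma))
    return x

def fit_ar1(x):
    """Least-squares estimate of phi: regress x_t on x_{t-1} (no intercept)."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

series = simulate_ar1(phi=0.7, n=5000)
phi_hat = fit_ar1(series)  # should land near the true value 0.7
```

With a few thousand observations the least-squares estimate is close to the true autoregressive coefficient, which is the kind of property the book establishes rigorously.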
The issue of unfunded public pension systems has moved to the center of public debate all over the world. Unfortunately, a large part of the discussions have remained on a qualitative level. This book seeks to address this by providing detailed knowledge on modeling pension systems.
Spatial econometrics deals with spatial dependence and spatial heterogeneity, critical aspects of the data used by regional scientists. These characteristics may cause standard econometric techniques to become inappropriate. In this book, I combine several recent research results to construct a comprehensive approach to the incorporation of spatial effects in econometrics. My primary focus is to demonstrate how these spatial effects can be considered as special cases of general frameworks in standard econometrics, and to outline how they necessitate a separate set of methods and techniques, encompassed within the field of spatial econometrics. My viewpoint differs from that taken in the discussion of spatial autocorrelation in spatial statistics - e.g., most recently by Cliff and Ord (1981) and Upton and Fingleton (1985) - in that I am mostly concerned with the relevance of spatial effects on model specification, estimation and other inference, in what I call a model-driven approach, as opposed to a data-driven approach in spatial statistics. I attempt to combine a rigorous econometric perspective with a comprehensive treatment of methodological issues in spatial analysis.
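The spatial dependence described here is conventionally encoded in a spatial weights matrix, whose row-standardized form turns each region's "spatial lag" into an average of its neighbours' values. The sketch below uses a hypothetical 2x2-grid contiguity structure and invented values; it illustrates the general construction, not any specific example from the book.

```python
def row_standardize(W):
    """Row-standardize a binary contiguity matrix so each row sums to 1."""
    out = []
    for row in W:
        s = sum(row)
        out.append([w / s if s else 0.0 for w in row])
    return out

def spatial_lag(W, y):
    """Spatial lag Wy: for each region, the weighted average of its neighbours' y."""
    return [sum(w * yj for w, yj in zip(row, y)) for row in W]

# Hypothetical rook-contiguity matrix for four regions on a 2x2 grid
W = [[0, 1, 1, 0],
     [1, 0, 0, 1],
     [1, 0, 0, 1],
     [0, 1, 1, 0]]
Ws = row_standardize(W)
lag = spatial_lag(Ws, [10.0, 20.0, 30.0, 50.0])  # → [25.0, 30.0, 30.0, 25.0]
```

A spatial-lag regression model then includes this Wy term as a right-hand-side variable, which is precisely what makes ordinary least squares inappropriate and motivates the specialized estimators the book develops.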
Shows the application of some of the developments in the mathematics of optimization, including the concepts of invexity and quasimax to models of economic growth, and to finance and investment. This book introduces a computational package called SCOM, for solving optimal control problems on MATLAB.
This book provides a new source of data and analysis on the role of multinational companies in U.S. international trade over the past two decades. Developed from benchmark surveys of foreign direct investment conducted by the U.S. Government, it contains 96 tables showing MNC-related trade for 1975, 1982, and 1989. The tables and companion analyses cover affiliate-related trade, intrafirm-related trade, bilateral trade with major trading partners, the role of ultimate beneficial owners, commodity (SITC) trade, and trade by affiliate industry groups. The data and analyses will be equally useful to academic researchers and policy analysts in the fields of international business, international trade, and international finance.
Statistical Methods in Econometrics is appropriate for beginning graduate courses in mathematical statistics and econometrics in which the foundations of probability and statistical theory are developed for application to econometric methodology. Because econometrics generally requires the study of several unknown parameters, emphasis is placed on estimation and hypothesis testing involving several parameters. Accordingly, special attention is paid to the multivariate normal distribution and the distribution of quadratic forms. Lagrange multiplier tests are discussed in considerable detail, along with the traditional likelihood ratio and Wald tests. Characteristic functions and their properties are fully exploited. Asymptotic distribution theory, usually given only cursory treatment, is also discussed in detail.
Over the past 25 years, applied econometrics has undergone tremendous changes, with active developments in fields of research such as time series, labor econometrics, financial econometrics and simulation-based methods. Time series analysis has been an active field of research since the seminal work by Box and Jenkins (1976), who introduced a general framework in which time series can be analyzed. In the world of financial econometrics and the application of time series techniques, the ARCH model of Engle (1982) has shifted the focus from the modelling of the process in itself to the modelling of the volatility of the process. In less than 15 years, it has become one of the most successful fields of applied econometric research, with hundreds of published papers. As an alternative to the ARCH modelling of the volatility, Taylor (1986) introduced the stochastic volatility model, whose features are quite similar to the ARCH specification but which involves an unobserved or latent component for the volatility. While being more difficult to estimate than usual GARCH models, stochastic volatility models have found numerous applications in the modelling of volatility, and more particularly in the econometric part of option pricing formulas. Although modelling volatility is one of the best known examples of applied financial econometrics, other topics (factor models, present value relationships, term structure models) were also successfully tackled.
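The ARCH/GARCH idea mentioned above is, at its core, a simple recursion: today's conditional variance is a weighted combination of a constant, yesterday's squared return, and yesterday's variance. The sketch below shows the GARCH(1,1) variance filter with invented parameter values; it is an illustration of the standard recursion, not code from any of the works cited.

```python
def garch11_variance(returns, omega, alpha, beta):
    """GARCH(1,1) conditional variance: h_t = omega + alpha*r_{t-1}^2 + beta*h_{t-1}."""
    # Initialize at the unconditional variance omega / (1 - alpha - beta)
    h = [omega / (1.0 - alpha - beta)]
    for t in range(1, len(returns)):
        h.append(omega + alpha * returns[t - 1] ** 2 + beta * h[t - 1])
    return h

# Hypothetical daily returns; note the large shock at t = 4
rets = [0.0, 0.02, -0.03, 0.01, 0.05, -0.01]
h = garch11_variance(rets, omega=1e-6, alpha=0.08, beta=0.90)
```

The key qualitative behaviour, volatility clustering, falls out of the recursion directly: a large squared return pushes the next period's variance up, and the beta term makes that elevation persist. In the stochastic volatility alternative of Taylor (1986), the variance process is instead driven by its own unobserved shock, which is what makes estimation harder.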
This book is the result of recent developments in several fields. Mathematicians, statisticians, finance theorists, and economists found several interconnections in their research. The emphasis was on common methods, although the applications were also interrelated. The main topic is dynamic stochastic models, in which information arrives and decisions are made sequentially. This gives rise to what finance theorists call option value, and what some economists label quasi-option value. Some papers extend the mathematical theory, some deal with new methods of economic analysis, while some present important applications, to natural resources in particular.
Occupational licensure, including regulation of the professions, dates back to the medieval period. While the guilds that performed this regulatory function have long since vanished, professional regulation continues to this day. For instance, in the United States, 22 per cent of American workers must hold licenses simply to do their jobs. While long-established professions have more settled regulatory paradigms, the case studies in Paradoxes of Professional Regulation explore other professions, taking note of incompetent services and the serious risks they pose to the physical, mental, or emotional health, financial well-being, or legal status of uninformed consumers. Michael J. Trebilcock examines five case studies of the regulation of diverse professions, including alternative medicine, mental health care provision, financial planning, immigration consulting, and legal services. Noting the widely divergent approaches to the regulation of the same professions across different jurisdictions - paradoxes of professional regulation - the book is an attempt to develop a set of regulatory principles for the future. In its comparative approach, Paradoxes of Professional Regulation gets at the heart of the tensions influencing the regulatory landscape, and works toward practical lessons for bringing greater coherence to the way in which professions are regulated.
Nonlinear and nonnormal filters are introduced and developed. Traditional nonlinear filters such as the extended Kalman filter and the Gaussian sum filter give biased filtering estimates, and therefore several nonlinear and nonnormal filters have been derived from the underlying probability density functions. The density-based nonlinear filters introduced in this book utilize numerical integration, Monte-Carlo integration with importance sampling or rejection sampling and the obtained filtering estimates are asymptotically unbiased and efficient. By Monte-Carlo simulation studies, all the nonlinear filters are compared. Finally, as an empirical application, consumption functions based on the rational expectation model are estimated for the nonlinear filters, where US, UK and Japan economies are compared.
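One well-known member of the density-based Monte Carlo filter family described above is the bootstrap particle filter, which propagates a cloud of particles through the state equation and reweights them by the observation density. The sketch below is a generic illustration on an invented nonlinear state-space model; the model, parameters, and function names are assumptions, not the book's specific estimators or empirical setup.

```python
import math
import random

def bootstrap_filter(obs, n_particles=500, seed=1):
    """Minimal bootstrap particle filter for the illustrative nonlinear model
       x_t = 0.5*x_{t-1} + v_t,   y_t = x_t^2 / 2 + w_t,   v_t, w_t ~ N(0, 1).
       Returns the sequence of filtered state means E[x_t | y_1..y_t]."""
    rng = random.Random(seed)
    parts = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in obs:
        # Propagate each particle through the state equation
        parts = [0.5 * x + rng.gauss(0.0, 1.0) for x in parts]
        # Weight by the Gaussian observation likelihood
        w = [math.exp(-0.5 * (y - x * x / 2.0) ** 2) for x in parts]
        total = sum(w) or 1.0
        w = [wi / total for wi in w]
        means.append(sum(wi * x for wi, x in zip(w, parts)))
        # Multinomial resampling to avoid weight degeneracy
        parts = rng.choices(parts, weights=w, k=n_particles)
    return means

filtered = bootstrap_filter([0.5, 1.2, 0.1, 2.0])
```

Unlike the extended Kalman filter, which linearizes the model and can therefore bias the estimates, this approach approximates the filtering density itself, which is the sense in which the density-based filters discussed in the book are asymptotically unbiased.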
If you know a little bit about financial mathematics but don't yet know a lot about programming, then C++ for Financial Mathematics is for you. C++ is an essential skill for many jobs in quantitative finance, but learning it can be a daunting prospect. This book gathers together everything you need to know to price derivatives in C++ without unnecessary complexities or technicalities. It leads the reader step-by-step from programming novice to writing a sophisticated and flexible financial mathematics library. At every step, each new idea is motivated and illustrated with concrete financial examples. As employers understand, there is more to programming than knowing a computer language. As well as covering the core language features of C++, this book teaches the skills needed to write truly high quality software. These include topics such as unit tests, debugging, design patterns and data structures. The book teaches everything you need to know to solve realistic financial problems in C++. It can be used for self-study or as a textbook for an advanced undergraduate or master's level course.
Risk, Uncertainty, and Profit is a groundbreaking work of economic theory, distinguishing between risk, which is by nature measurable and quantifiable, and uncertainty, which can neither be measured nor quantified. The book begins with an analysis of the functions of profit, risk and uncertainty in the economy. Frank H. Knight introduces his work with a discussion of profit and the conflicts about its nature between various economic theorists. As the title implies, the author's chief concern is the interplay between making a profit, incurring risk, and determining if there is uncertainty. Risks are different from uncertainty in that they can be measured and protected against. For example, a location chosen for a factory or farm may have a measured risk of flooding in a given year. Businesses, insurers and investors alike can be made aware of this, and behave according to the quantified risk.
The maximum principle and dynamic programming are the two most commonly used approaches in solving optimal control problems. These approaches have been developed independently. The theme of this book is to unify these two approaches, and to demonstrate that the viscosity solution theory provides the framework to unify them.
The Handbook of U.S. Labor Statistics is recognized as an authoritative resource on the U.S. labor force. It continues and enhances the Bureau of Labor Statistics' (BLS) discontinued publication, Labor Statistics. It allows the user to understand recent developments as well as to compare today's economy with that of the past. This publication includes several tables throughout the book examining the extensive effect that coronavirus (COVID-19) had on the labor market throughout 2020. A chapter titled "The Impact of Coronavirus (COVID-19) on the Labor Force" includes new information on hazard pay, safety measures businesses enforced during the pandemic, vaccine incentives, and compressed work schedules. In addition, there are several other tables within the book exploring its impact on employment, telework, and consumer expenditures. This edition of the Handbook of U.S. Labor Statistics also includes a completely updated chapter on prices and the most current employment projections through 2030. The Handbook is a comprehensive reference providing an abundance of information on a variety of topics. In addition to providing statistics on employment, unemployment, and prices, it includes information on topics such as: earnings; productivity; consumer expenditures; occupational safety and health; union membership; the working poor; recent trends in the labor force; and much more. In addition to over 215 tables that present practical data, the Handbook provides: introductory material for each chapter that contains highlights of salient data and figures that call attention to noteworthy trends in the data; notes and definitions, which contain concise descriptions of the data sources, concepts, definitions, and methodology from which the data are derived; and references to more comprehensive reports which provide additional data and more extensive descriptions of estimation methods, sampling, and reliability measures.
This book overviews the latest ideas and developments in financial econometrics, with an emphasis on how to best use prior knowledge (e.g., the Bayesian way) and how to best use successful data processing techniques from other application areas (e.g., from quantum physics). The book also covers applications to economy-related phenomena ranging from traditionally analyzed phenomena such as manufacturing, the food industry, and taxes, to newer-to-analyze phenomena such as cryptocurrencies, influencer marketing, the COVID-19 pandemic, financial fraud detection, corruption, and the shadow economy. This book will inspire practitioners to learn how to apply state-of-the-art Bayesian, quantum, and related techniques to economic and financial problems, and inspire researchers to further improve the existing techniques and come up with new techniques for studying economic and financial phenomena. The book will also be of interest to students interested in the latest ideas and results.
Financial market volatility plays a crucial role in financial decision making, as volatility forecasts are important input parameters in areas such as option pricing, hedging strategies, portfolio allocation and Value-at-Risk calculations. The fact that financial innovations arrive at an ever-increasing rate has motivated both academic researchers and practitioners, and advances in this field have been considerable. The use of Stochastic Volatility (SV) models is one of the latest developments in this area. Empirical Studies on Volatility in International Stock Markets describes the existing techniques for the measurement and estimation of volatility in international stock markets, with emphasis on the SV model and its empirical application. Eugenie Hol develops various extensions of the SV model, which allow for additional variables in both the mean and the variance equation. In addition, the forecasting performance of SV models is compared not only to that of the well-established GARCH model but also to implied volatility and so-called realised volatility models, which are based on intraday volatility measures.
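The realised volatility measures mentioned above are built directly from intraday data: the realised variance for a day is the sum of squared intraday returns, and realised volatility is its square root. A minimal sketch with invented 5-minute returns (an illustration of the standard construction, not data or code from the book):

```python
import math

def realized_volatility(intraday_returns):
    """Daily realised variance is the sum of squared intraday returns;
       realised volatility is its square root."""
    rv = sum(r * r for r in intraday_returns)
    return math.sqrt(rv)

# Hypothetical 5-minute log returns over one trading day
rets = [0.001, -0.002, 0.0015, -0.0005, 0.003]
vol = realized_volatility(rets)
```

Because it is computed from observed high-frequency returns rather than from a parametric model, realised volatility serves as a relatively model-free benchmark against which SV and GARCH forecasts can be evaluated, which is how it is used in comparisons of the kind described above.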