Welcome to Loot.co.za!
In 1945, very early in the history of the development of a rigorous analytical theory of probability, Feller (1945) wrote a paper called "The fundamental limit theorems in probability" in which he set out what he considered to be "the two most important limit theorems in the modern theory of probability: the central limit theorem and the recently discovered ... 'Kolmogoroff's celebrated law of the iterated logarithm'." A little later in the article he added to these, via a charming description, the "little brother (of the central limit theorem), the weak law of large numbers," and also the strong law of large numbers, which he considers a close relative of the law of the iterated logarithm. Feller might well have added to these also the beautiful and highly applicable results of renewal theory, which at the time he himself, together with eminent colleagues, was vigorously producing. Feller's introductory remarks include the visionary observation: "The history of probability shows that our problems must be treated in their greatest generality: only in this way can we hope to discover the most natural tools and to open channels for new progress. This remark leads naturally to that characteristic of our theory which makes it attractive beyond its importance for various applications: a combination of an amazing generality with algebraic precision."
World-renowned experts in spatial statistics and spatial econometrics present the latest advances in specification and estimation of spatial econometric models. This includes information on the development of tools and software, and various applications. The text introduces new tests and estimators for spatial regression models, including discrete choice and simultaneous equation models. The performance of techniques is demonstrated through simulation results and a wide array of applications related to economic growth, international trade, knowledge externalities, population-employment dynamics, urban crime, land use, and environmental issues. An exciting new text for academics with a theoretical interest in spatial statistics and econometrics, and for practitioners looking for modern and up-to-date techniques.
Gini's mean difference (GMD) was first introduced by Corrado Gini in 1912 as an alternative measure of variability. GMD and the parameters which are derived from it (such as the Gini coefficient or the concentration ratio) have been in use in the area of income distribution for almost a century. In practice, the use of GMD as a measure of variability is justified whenever the investigator is not ready to impose, without questioning, the convenient world of normality. This makes the GMD of critical importance in the complex research of statisticians, economists, econometricians, and policy makers. This book focuses on imitating analyses that are based on variance by replacing variance with the GMD and its variants. In this way, the text showcases how almost everything that can be done with the variance as a measure of variability can be replicated by using Gini. Beyond this, there are marked benefits to utilizing Gini as opposed to other methods. One of the advantages of using Gini methodology is that it provides a unified system that enables the user to learn about various aspects of the underlying distribution. It also provides a systematic method and a unified terminology. Using Gini methodology can reduce the risk of imposing on the model assumptions that are not supported by the data. With these benefits in mind the text uses the covariance-based approach, though applications to other approaches are mentioned as well.
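As a concrete illustration of replacing variance-based analysis with Gini-based analysis, the Python sketch below computes the GMD as the mean absolute difference over all unordered pairs (one common convention; scaling conventions differ across texts) alongside the ordinary sample variance. The income figures are invented for illustration.

```python
from itertools import combinations

def gini_mean_difference(xs):
    # GMD as the mean absolute difference over all unordered pairs
    n = len(xs)
    total = sum(abs(a - b) for a, b in combinations(xs, 2))
    return 2.0 * total / (n * (n - 1))

def variance(xs):
    # ordinary sample variance (denominator n - 1), for comparison
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

incomes = [10, 20, 30, 40, 100]           # hypothetical income data
gmd = gini_mean_difference(incomes)       # 40.0
var = variance(incomes)                   # 1250.0
gini = gmd / (2 * (sum(incomes) / len(incomes)))  # Gini coefficient: 0.5
```

The last line shows the familiar link between the GMD and the Gini coefficient: dividing the GMD by twice the mean yields the concentration ratio.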
This book is an extension of the author's first book and serves as a guide and manual on how to specify and compute 2-, 3-, and 4-Event Bayesian Belief Networks (BBN). It walks the learner through the steps of fitting and solving fifty BBN numerically, using mathematical proof. The author wrote this book primarily for inexperienced learners as well as professionals, while maintaining proof-based academic rigor. The author's first book on this topic, a primer introducing learners to the basic complexities and nuances associated with learning Bayes' theorem and inverse probability for the first time, was meant for non-statisticians unfamiliar with the theorem, as is this book. This new book expands upon that approach and is meant to be a prescriptive guide to building BBN and to executive decision-making for students and professionals, so that decision-makers can invest their time and start using this inductive reasoning principle in their decision-making processes. It highlights the utility of an algorithm that served as the basis for the first book, and includes fifty 2-, 3-, and 4-event BBN of numerous variants.
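For readers new to the topic, a 2-event network reduces to a single application of Bayes' theorem. The sketch below is a generic illustration of that inversion, not the author's algorithm; all the probabilities are hypothetical.

```python
def posterior(prior, lik_given_a, lik_given_not_a):
    # Bayes' theorem for a 2-event network A -> B:
    # P(A|B) = P(B|A)P(A) / (P(B|A)P(A) + P(B|~A)P(~A))
    evidence = lik_given_a * prior + lik_given_not_a * (1 - prior)
    return lik_given_a * prior / evidence

# hypothetical numbers: P(A) = 0.3, P(B|A) = 0.8, P(B|~A) = 0.2
p_a_given_b = posterior(0.3, 0.8, 0.2)   # 0.24 / 0.38, about 0.632
```

Observing B raises the probability of A from the prior 0.3 to roughly 0.63, which is the inverse-probability reasoning the blurb describes.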
Economists can use computer algebra systems to manipulate symbolic models, derive numerical computations, and analyze empirical relationships among variables. Maxima is an open-source multi-platform computer algebra system that rivals proprietary software. Maxima's symbolic and computational capabilities enable economists and financial analysts to develop a deeper understanding of models by allowing them to explore the implications of differences in parameter values, by providing numerical solutions to problems that would otherwise be intractable, and by providing graphical representations that can guide analysis. This book provides a step-by-step tutorial for using this program to examine the economic relationships that form the core of microeconomics in a way that complements traditional modeling techniques. Readers learn how to phrase the relevant analysis and how symbolic expressions, numerical computations, and graphical representations can be used to learn from microeconomic models. In particular, comparative statics analysis is facilitated. Little has been published on Maxima and its applications in economics and finance, and this volume will appeal to advanced undergraduates, graduate-level students studying microeconomics, academic researchers in economics and finance, economists, and financial analysts.
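Maxima itself is not shown here, but the comparative-statics exercise the book facilitates can be sketched numerically. The linear demand-supply model and all parameter values below are hypothetical, and Python stands in for the computer algebra system.

```python
def equilibrium_price(a, b, c, d):
    # market clearing for hypothetical demand Q = a - b*P, supply Q = c + d*P
    return (a - c) / (b + d)

def comparative_static(f, args, i, h=1e-6):
    # central finite-difference derivative of f with respect to argument i
    up, dn = list(args), list(args)
    up[i] += h
    dn[i] -= h
    return (f(*up) - f(*dn)) / (2 * h)

base = (100.0, 2.0, 10.0, 3.0)             # a, b, c, d (invented values)
p_star = equilibrium_price(*base)          # 18.0
dP_da = comparative_static(equilibrium_price, base, 0)  # analytically 1/(b + d) = 0.2
```

The numerical derivative matches the symbolic answer dP*/da = 1/(b + d), which is the kind of cross-check between symbolic and numerical work the book advocates.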
This book is an introductory exposition of different topics that emerged in the literature as unifying themes between two fields of econometrics of time series, namely nonlinearity and nonstationarity. Papers on these topics have exploded over the last two decades, but they are rarely examined together. There is, undoubtedly, a variety of arguments that justify such a separation. But there are also good reasons that motivate their combination. Those reluctant to a combined analysis might argue that nonlinearity and nonstationarity each raise non-trivial problems, so that combining them merely compounds the difficulty. This argument can, however, be balanced by others of an economic nature. A predominant idea, today, is that a nonstationary series exhibits persistent deviations from its long-run components (either deterministic or stochastic trends). These persistent deviations are modelled in various ways: unit root models, fractionally integrated processes, models with shifts in the time trend, etc. However, there are many other behaviors inherent to nonstationary processes that are not reflected in linear models. For instance, economic variables with mixture distributions, or processes that are state-dependent, undergo episodes of changing dynamics. In models with multiple long-run equilibria, moving from one equilibrium to another sometimes implies hysteresis. Also, it is known that certain shocks can change the economic fundamentals, thereby reducing the possibility that an initial position is re-established after a shock (irreversibility).
Anyone who wants to understand stock market cycles and develop a focused, thoughtful, and solidly grounded valuation approach to the stock market must read this book. Bolten explains the causes and patterns of the cycles and identifies the causes of stock price changes. He identifies the sources of risks in the stock market and in individual stocks. Also covered is how the interaction of expected return and risk creates stock market cycles. Bolten talks about the industry sectors most likely to be profitable investments in each stage of the stock market cycles, while identifying the stock market bubble and sinkhole warning signs. The role of the Federal Reserve in each stage of the stock market cycle is also discussed. All the categories of risk are identified and explained while no specific risk is left undiscussed. The underlying causes for long-term stock price trends and cycles are highlighted. The book is useful in many areas including stock analysis, portfolio management, cost of equity capital, financing strategies, business valuations and spotting profit opportunities caused by general economic and specific company changes.
Professionals are constantly searching for competitive solutions to help determine current and future economic tendencies. Econometrics uses statistical methods and real-world data to predict and establish specific trends within business and finance. This analytical method sustains limitless potential, but the necessary research for professionals to understand and implement this approach is lacking. Applied Econometric Analysis: Emerging Research and Opportunities explores the theoretical and practical aspects of detailed econometric theories and applications within economics, political science, public policy, business, and finance. Featuring coverage on a broad range of topics such as cointegration, machine learning, and time series analysis, this book is ideally designed for economists, policymakers, financial analysts, marketers, researchers, academicians, and graduate students seeking research on the various techniques of econometric concepts.
Major transport infrastructures are increasingly in the news as both the engineering and financing possibilities come together. However, these projects have also demonstrated the inadequacy of most existing approaches to forecasting their impacts and to their overall evaluation. This collection of papers, from a conference organized by the Association d'Econometrie Appliquee, offers a state-of-the-art look at forecasting traffic, developing pricing strategies, and estimating impacts, in papers by leading authorities from Europe, North America, and Japan.
This book combines both a comprehensive analytical framework and economic statistics that enable business decision makers to anticipate developing economic trends. The author blends recent and historical economic data with economic theory to provide important benchmarks or rules of thumb that give both economists and noneconomists enhanced understanding of unfolding economic data and their interrelationships. Through the matrix system, a disciplined approach is described for integrating readily available economic data into a comprehensive analysis without complex formulas. The extensive appendix of monthly key economic factors for 1978-1991 makes this an important reference source for economic and financial trend analysis. A new and practical method for economic trend analysis is introduced that provides more advanced knowledge than available from economic newsletters. Schaeffer begins with a general description of the business cycle and the typical behavior and effect of the credit markets, commercial banks, and the Federal Reserve. Next, fourteen key economic factors regularly reported by the business press are described, such as the capacity utilization rate and yield on three-month Treasury bills. Benchmarks for each of these key economic factors are set forth, together with an insightful discussion of the interrelationships indicating economic trends. A detailed discussion of the 1978-1991 American economy, incorporating monthly data from the historical matrix, demonstrates the practical application of the matrix system. Executives, investors, financial officers, and government policymakers will find this book useful in decision making.
A timely work which represents a major reappraisal of business cycle theory. It revives, with the help of modern analytical techniques, an old theme of Keynesian macroeconomics, namely that "market psychology" (i.e., volatile expectations) may be a significant cause of economic fluctuations. It is of interest not only to economists, but also to mathematicians and physicists.
This Festschrift is dedicated to Goetz Trenkler on the occasion of his 65th birthday. As can be seen from the long list of contributions, Goetz has had and still has an enormous range of interests, and colleagues to share these interests with. He is a leading expert in linear models with a particular focus on matrix algebra in its relation to statistics. He has published in almost all major statistics and matrix theory journals. His research activities also include other areas (like nonparametrics, statistics and sports, combination of forecasts and magic squares, just to mention a few). Goetz Trenkler was born in Dresden in 1943. After his school years in East Germany and West-Berlin, he obtained a Diploma in Mathematics from Free University of Berlin (1970), where he also discovered his interest in Mathematical Statistics. In 1973, he completed his Ph.D. with a thesis titled: On a distance-generating function of probability measures. He then moved on to the University of Hannover to become Lecturer and to write a habilitation-thesis (submitted 1979) on alternatives to the Ordinary Least Squares estimator in the Linear Regression Model, a topic that would become his predominant field of research in the years to come.
A new approach to explaining the existence of firms and markets, focusing on variability and coordination. It stands in contrast to the emphasis on transaction costs, and on monitoring and incentive structures, which are prominent in most of the modern literature in this field. This approach, called the variability approach, allows us to: show why both the need for communication and the coordination costs increase when the division of labor increases; explain why, while the firm relies on direction, the market does not; rigorously formulate the optimum divisionalization problem; better understand the relationship between technology and organization; show why the 'size' of the firm is limited; and refine the analysis of whether the existence of a sharable input or the presence of an external effect leads to the emergence of a firm. The book provides a wealth of insights for students and professionals in economics, business, law and organization.
This book contains an extensive up-to-date overview of nonlinear
time series models and their application to modelling economic
relationships. It considers nonlinear models in stationary and
nonstationary frameworks, and both parametric and nonparametric
models are discussed. The book contains examples of nonlinear
models in economic theory and presents the most common nonlinear
time series models. Importantly, it shows the reader how to apply
these models in practice. For this purpose, the three stages of
model building (specification, estimation, and evaluation) are
discussed in detail and illustrated by several examples involving
both economic and non-economic data. Since estimation of nonlinear time series models
is carried out using numerical algorithms, the book contains a
chapter on estimating parametric nonlinear models and another on
estimating nonparametric ones.
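As a minimal illustration of the kind of state-dependent nonlinearity such models capture, the following sketch simulates a two-regime threshold autoregression (a SETAR-type model). The threshold and coefficients are invented for illustration, not taken from the book.

```python
import random

def simulate_setar(n, threshold=1.0, phi_low=0.9, phi_high=0.2, seed=0):
    # two-regime threshold AR(1): the autoregressive coefficient switches
    # when the previous value crosses the threshold (illustrative parameters)
    rng = random.Random(seed)
    y = [0.0]
    for _ in range(n - 1):
        phi = phi_low if y[-1] < threshold else phi_high
        y.append(phi * y[-1] + rng.gauss(0, 1))
    return y

series = simulate_setar(500)
```

Below the threshold the series is highly persistent; above it, shocks die out quickly, which is exactly the regime-dependent dynamics a linear model cannot represent.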
The more generous social welfare system in Europe is one of the most important differences between European and US society. Defenders of the European welfare state argue that it improves social cohesion and prevents crime. On the other hand, the US economy is performing quite well, such that crime rates might come down due to better legal income opportunities. This book takes this trade-off as a point of departure and contributes to a better interdisciplinary understanding of the interactions between crime, economic performance and social exclusion. It evaluates the existing economic and criminological research and provides innovative empirical investigations on the basis of international panel data sets from different levels of regional aggregation. Among other aspects, results clearly reveal the crime-reducing potential of intact families and the link between crime and the labour market. A special focus is on estimating the consequences of crime, a topic rarely analysed in the literature.
Time Series: Theory and Methods is a systematic account of linear time series models and their application to the modelling and prediction of data collected sequentially in time. The aim is to provide specific techniques for handling data and at the same time to provide a thorough understanding of the mathematical basis for these techniques. Both time and frequency domain methods are discussed, but the book is written in such a way that either approach could be emphasized. The book is intended to be a text for graduate students in statistics, mathematics, engineering, and the natural or social sciences. It contains substantial chapters on multivariate series and state-space models (including applications of the Kalman recursions to missing-value problems) and shorter accounts of special topics including long-range dependence, infinite variance processes and non-linear models. Most of the programs used in the book are available on diskettes for the IBM-PC. These diskettes, with the accompanying manual, ITSM: The Interactive Time Series Modelling Package for the PC, also by Brockwell and Davis, can be purchased from Springer-Verlag.
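As a small illustration of the Kalman recursions and the missing-value application mentioned above, the following sketch filters a local level model, skipping the update step when an observation is missing. The model and parameter values are illustrative; this is not the book's ITSM code.

```python
def local_level_filter(ys, q=1.0, r=1.0, m0=0.0, p0=10.0):
    # Kalman recursions for a local level model (illustrative parameters):
    #   state: x_t = x_{t-1} + w_t, w_t ~ N(0, q)
    #   obs:   y_t = x_t + v_t,     v_t ~ N(0, r)
    # a None observation is treated as missing: predict only, skip the update
    m, p, out = m0, p0, []
    for y in ys:
        p = p + q                  # prediction step
        if y is not None:
            k = p / (p + r)        # Kalman gain
            m = m + k * (y - m)    # update state estimate
            p = (1 - k) * p        # update variance
        out.append(m)
    return out

est = local_level_filter([1.0, None, 3.0])
```

During the missing observation the estimate is simply carried forward while its variance grows, which is how the Kalman recursions handle gaps in the data.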
The issue of unfunded public pension systems has moved to the center of public debate all over the world. Unfortunately, much of the discussion has remained at a qualitative level. This book seeks to address this gap by providing detailed knowledge on modeling pension systems.
Spatial econometrics deals with spatial dependence and spatial heterogeneity, critical aspects of the data used by regional scientists. These characteristics may cause standard econometric techniques to become inappropriate. In this book, I combine several recent research results to construct a comprehensive approach to the incorporation of spatial effects in econometrics. My primary focus is to demonstrate how these spatial effects can be considered as special cases of general frameworks in standard econometrics, and to outline how they necessitate a separate set of methods and techniques, encompassed within the field of spatial econometrics. My viewpoint differs from that taken in the discussion of spatial autocorrelation in spatial statistics - e.g., most recently by Cliff and Ord (1981) and Upton and Fingleton (1985) - in that I am mostly concerned with the relevance of spatial effects on model specification, estimation and other inference, in what I call a model-driven approach, as opposed to a data-driven approach in spatial statistics. I attempt to combine a rigorous econometric perspective with a comprehensive treatment of methodological issues in spatial analysis.
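As a small numerical illustration of the spatial dependence discussed here, the sketch below computes Moran's I, the standard spatial autocorrelation statistic associated with Cliff and Ord, for four regions on a line with binary contiguity weights. The weights matrix and data are invented.

```python
def morans_i(values, weights):
    # Moran's I: (n / S0) * sum_ij w_ij (x_i - m)(x_j - m) / sum_i (x_i - m)^2
    n = len(values)
    m = sum(values) / n
    dev = [v - m for v in values]
    s0 = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / s0) * num / den

# four regions on a line, binary contiguity weights (invented example)
W = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
x = [1.0, 2.0, 3.0, 4.0]   # smoothly trending values across neighbors
```

For these trending values Moran's I is positive (1/3), signalling the positive spatial autocorrelation that makes standard i.i.d. assumptions inappropriate.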
Shows the application of some of the developments in the mathematics of optimization, including the concepts of invexity and quasimax, to models of economic growth, and to finance and investment. This book introduces a computational package called SCOM for solving optimal control problems in MATLAB.
Over the past 25 years, applied econometrics has undergone tremendous changes, with active developments in fields of research such as time series, labor econometrics, financial econometrics and simulation based methods. Time series analysis has been an active field of research since the seminal work by Box and Jenkins (1976), who introduced a general framework in which time series can be analyzed. In the world of financial econometrics and the application of time series techniques, the ARCH model of Engle (1982) has shifted the focus from the modelling of the process in itself to the modelling of the volatility of the process. In less than 15 years, it has become one of the most successful fields of applied econometric research with hundreds of published papers. As an alternative to the ARCH modelling of the volatility, Taylor (1986) introduced the stochastic volatility model, whose features are quite similar to the ARCH specification but which involves an unobserved or latent component for the volatility. While being more difficult to estimate than usual GARCH models, stochastic volatility models have found numerous applications in the modelling of volatility and more particularly in the econometric part of option pricing formulas. Although modelling volatility is one of the best known examples of applied financial econometrics, other topics (factor models, present value relationships, term structure models) were also successfully tackled.
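As a minimal illustration of the ARCH idea of modelling the volatility of a process rather than the process itself, the sketch below simulates an ARCH(1) model. The parameters are illustrative, not estimated from any data.

```python
import random

def simulate_arch1(n, omega=0.2, alpha=0.5, seed=42):
    # ARCH(1): r_t = sigma_t * z_t with sigma_t^2 = omega + alpha * r_{t-1}^2
    # (illustrative parameters; unconditional variance is omega/(1-alpha) = 0.4)
    rng = random.Random(seed)
    returns, prev = [], 0.0
    for _ in range(n):
        sigma2 = omega + alpha * prev ** 2
        prev = sigma2 ** 0.5 * rng.gauss(0, 1)
        returns.append(prev)
    return returns

rets = simulate_arch1(1000)
```

A large shock raises next period's conditional variance, producing the volatility clustering that made the ARCH family so successful in financial econometrics.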
Statistical Methods in Econometrics is appropriate for beginning
graduate courses in mathematical statistics and econometrics in
which the foundations of probability and statistical theory are
developed for application to econometric methodology. Because
econometrics generally requires the study of several unknown
parameters, emphasis is placed on estimation and hypothesis testing
involving several parameters. Accordingly, special attention is
paid to the multivariate normal and the distribution of quadratic
forms. Lagrange multiplier tests are discussed in considerable
detail, along with the traditional likelihood ratio and Wald
tests. Characteristic functions and their properties are fully
exploited. Also, asymptotic distribution theory, usually given only
cursory treatment, is discussed in detail.
Nonlinear and nonnormal filters are introduced and developed. Traditional nonlinear filters such as the extended Kalman filter and the Gaussian sum filter give biased filtering estimates, and therefore several nonlinear and nonnormal filters have been derived from the underlying probability density functions. The density-based nonlinear filters introduced in this book utilize numerical integration or Monte-Carlo integration with importance sampling or rejection sampling, and the obtained filtering estimates are asymptotically unbiased and efficient. All the nonlinear filters are compared in Monte-Carlo simulation studies. Finally, as an empirical application, consumption functions based on the rational expectations model are estimated with the nonlinear filters, and the US, UK and Japanese economies are compared.
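A minimal sketch of one density-based filtering step by Monte-Carlo integration with importance sampling: the prior serves as the proposal and each draw is weighted by the measurement likelihood. The toy nonlinear measurement equation and all parameters below are invented for illustration and are not the book's models.

```python
import math
import random

def measurement_likelihood(x, y, obs_var=0.25):
    # likelihood of y given state x under a toy nonlinear measurement:
    # y = x + 0.5*x^2 + v,  v ~ N(0, obs_var)   (invented model)
    resid = y - (x + 0.5 * x * x)
    return math.exp(-resid * resid / (2 * obs_var))

def filtered_mean(y, n_draws=100_000, seed=1):
    # density-based estimate of E[x | y]: draw from the N(0, 1) prior
    # (the proposal) and weight each draw by the measurement likelihood
    rng = random.Random(seed)
    xs = [rng.gauss(0, 1) for _ in range(n_draws)]
    ws = [measurement_likelihood(x, y) for x in xs]
    return sum(x * w for x, w in zip(xs, ws)) / sum(ws)
```

Unlike the extended Kalman filter, no linearization of the measurement equation is needed, and the weighted average converges to the true filtered mean as the number of draws grows.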
You may like...
- Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover, R2,160 / Discovery Miles 21 600)
- Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
- An Introduction to State Space Time… by Jacques J.F. Commandeur, Siem Jan Koopman (Hardcover, R1,872 / Discovery Miles 18 720)
- The Oxford Handbook of the Economics of… by Yann Bramoulle, Andrea Galeotti, … (Hardcover, R5,455 / Discovery Miles 54 550)
- Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover, R4,258 / Discovery Miles 42 580)
- Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover, R2,970 / Discovery Miles 29 700)
- Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover, R3,567 / Discovery Miles 35 670)
- Spatial Analysis Using Big Data… by Yoshiki Yamagata, Hajime Seya (Paperback, R3,021 / Discovery Miles 30 210)
- The Handbook of Historical Economics by Alberto Bisin, Giovanni Federico (Paperback, R2,567 / Discovery Miles 25 670)