Welcome to Loot.co.za!
The availability of financial data recorded at high frequency has inspired a research area which over the last decade has emerged as a major field in econometrics and statistics. The growing popularity of high-frequency econometrics is driven by technological progress in trading systems and the increasing importance of intraday trading, liquidity risk, optimal order placement and high-frequency volatility. This book provides a state-of-the-art overview of the major approaches in high-frequency econometrics, including univariate and multivariate autoregressive conditional mean approaches for different types of high-frequency variables, intensity-based approaches for financial point processes, and dynamic factor models. It discusses implementation details, provides insights into the properties of high-frequency data and institutional settings, and presents applications to volatility and liquidity estimation, order book modelling and market microstructure analysis.
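As a small illustration of the autoregressive conditional mean models mentioned above, the following Python sketch simulates an ACD(1,1) process for trade durations; the parameter values and variable names here are illustrative choices, not taken from the book.

```python
import random

# Minimal simulation of an ACD(1,1) model for trade durations, one of the
# autoregressive conditional mean models the blurb refers to. Parameters
# are illustrative, not from the book.
random.seed(1)
omega, alpha, beta = 0.1, 0.2, 0.7   # unconditional mean = omega / (1 - alpha - beta) = 1.0
psi, x = 1.0, 1.0                    # conditional mean and previous duration
durations = []
for _ in range(50_000):
    psi = omega + alpha * x + beta * psi   # conditional expected duration
    x = psi * random.expovariate(1.0)      # duration = conditional mean * unit-exponential shock
    durations.append(x)
print(sum(durations) / len(durations))     # hovers near the unconditional mean of 1.0
```

The recursion mirrors a GARCH equation, but for the expected waiting time between trades rather than for volatility.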
Features: New chapters on Barrier Options, Lookback Options, Asian Options, Optimal Stopping Theorem, and Stochastic Volatility. Contains over 235 exercises, and 16 problems with complete solutions. Added over 150 graphs and figures, for more than 250 in total, to optimize presentation. 57 R coding examples now integrated into the book for implementation of the methods. Substantially class-tested, so ideal for course use or self-study.
Mathematical Statistics for Economics and Business, Second Edition, provides a comprehensive introduction to the principles of mathematical statistics which underpin statistical analyses in the fields of economics, business, and econometrics. The selection of topics in this textbook is designed to provide students with a conceptual foundation that will facilitate a substantial understanding of statistical applications in these subjects. This new edition has been updated throughout and now also includes a downloadable Student Answer Manual containing detailed solutions to half of the over 300 end-of-chapter problems. After introducing the concepts of probability, random variables, and probability density functions, the author develops the key concepts of mathematical statistics, most notably: expectation, sampling, asymptotics, and the main families of distributions. The latter half of the book is then devoted to the theories of estimation and hypothesis testing with associated examples and problems that indicate their wide applicability in economics and business. Features of the new edition include: a reorganization of topic flow and presentation to facilitate reading and understanding; inclusion of additional topics of relevance to statistics and econometric applications; a more streamlined and simple-to-understand notation for multiple integration and multiple summation over general sets or vector arguments; updated examples; new end-of-chapter problems; a solution manual for students; a comprehensive answer manual for instructors; and a theorem and definition map. This book has evolved from numerous graduate courses in mathematical statistics and econometrics taught by the author, and will be ideal for students beginning graduate study as well as for advanced undergraduates.
The field of econometrics has gone through remarkable changes during the last thirty-five years. Widening its earlier focus on testing macroeconomic theories, it has become a rather comprehensive discipline concerned with the development of statistical methods and their application to the whole spectrum of economic data. This development becomes apparent when looking at the biography of an econometrician whose illustrious research and teaching career started about thirty-five years ago and who will retire very soon after his 65th birthday. This is Gerd Hansen, professor of econometrics at the Christian Albrechts University at Kiel, to whom this volume with contributions from colleagues and students has been dedicated. He has shaped the econometric landscape in and beyond Germany throughout these thirty-five years. At the end of the 1960s he developed one of the first econometric models for the German economy, which adhered closely to the traditions put forth by the Cowles Commission.
The modern system-wide approach to applied demand analysis emphasizes a unity between theory and applications. Its firm foundations in economic theory make it one of the most impressive areas of applied econometrics. This book presents a large number of applications of recent innovations in the area. The database used consists of about 18 annual observations for 10 commodities in 18 OECD countries (more than 3,100 data points). Such a large body of data should provide convincing evidence, one way or the other, about the validity of consumption theory. A PREVIEW OF THE BOOK: The overall importance of the analysis presented in the book can be seen from the following table, which shows the significant contribution of the OECD to the world economy. As can be seen, the 24 member countries accounted for about 50 percent of world GDP in 1975. In this book we present an extensive analysis of the consumption patterns of the OECD countries.
In this book, we synthesize a rich and vast literature on econometric challenges associated with accounting choices and their causal effects. Identification and estimation of endogenous causal effects is particularly challenging, as observable data are rarely directly linked to the causal effect of interest. A common strategy is to employ logically consistent probability assessment via Bayes' theorem to connect observable data to the causal effect of interest. For example, the implications of earnings management as equilibrium reporting behavior are a centerpiece of our explorations. Rather than offering recipes or algorithms, the book surveys our experiences with accounting and econometrics. That is, we focus on why rather than how. The book can be utilized in a variety of venues. On the surface it is geared toward graduate studies, and surely this is where its roots lie. If we're serious about our studies, that is, if we tackle interesting and challenging problems, then there is a natural progression. Our research addresses problems that are not well understood, then incorporates them throughout our curricula as our understanding improves and to improve our understanding (in other words, learning and curriculum development are endogenous). For accounting to be a vibrant academic discipline, we believe it is essential that these issues be confronted in the undergraduate classroom as well as in graduate studies. We hope we've made some progress with examples which will encourage these developments.
Charles de Gaulle begins his Mémoires d'Espoir thus: 'France comes from the depths of the ages. She lives. The centuries call to her. But she remains herself through time. Her borders may change without altering the relief, the climate, the rivers and the seas that mark her indelibly. She is inhabited by peoples who, over the course of history, have been gripped by the most diverse trials, but whom the nature of things, harnessed by politics, ceaselessly moulds into a single nation. This nation has embraced many generations. It currently comprises several. It will bring forth many more. But by the geography of the country that is hers, by the genius of the races that compose her, by the neighbours that surround her, she takes on a constant character that makes the French of every era depend on their fathers and binds them to their descendants. Unless it breaks apart, this human whole, on this territory, within this universe, thus carries a past, a present and a future that are indissoluble. The State, which answers for France, is therefore charged at once with her heritage of yesterday, her interests of today and her hopes of tomorrow.' In the light of this idea of the nation, it is clear that a dialogue between nations is eminently important, and that the Semaine Universitaire Franco-Néerlandaise is an institution to stimulate this dialogue.
The three decades which have followed the publication of Heinz Neudecker's seminal paper `Some Theorems on Matrix Differentiation with Special Reference to Kronecker Products' in the Journal of the American Statistical Association (1969) have witnessed the growing influence of matrix analysis in many scientific disciplines. Amongst these are the disciplines to which Neudecker has contributed directly - namely econometrics, economics, psychometrics and multivariate analysis. This book aims to illustrate how powerful the tools of matrix analysis have become as weapons in the statistician's armoury. The majority of its chapters are concerned primarily with theoretical innovations, but all of them have applications in view, and some of them contain extensive illustrations of the applied techniques. This book will provide research workers and graduate students with a cross-section of innovative work in the fields of matrix methods and multivariate statistical analysis. It should be of interest to students and practitioners in a wide range of subjects which rely upon modern methods of statistical analysis. The contributors to the book are themselves practitioners of a wide range of subjects including econometrics, psychometrics, educational statistics, computation methods and electrical engineering, but they find a common ground in the methods which are represented in the book. It is envisaged that the book will serve as an important work of reference and as a source of inspiration for some years to come.
Louis Phlips. The stabilisation of primary commodity prices, and the related issue of the stabilisation of export earnings of developing countries, have traditionally been studied without reference to the futures markets (that exist or could exist) for these commodities. These futures markets have in turn been studied in isolation. The same is true for the new developments on financial markets. Over the last few years, in particular since the 1985 tin crisis and the October 1987 stock exchange crisis, it has become evident that there are interactions between commodity, futures, and financial markets, and that these interactions are very important; the more so as trade on futures and financial markets has shown a spectacular increase. This volume brings together a number of recent and unpublished papers on these interactions by leading specialists (and their students). A first set of papers examines how the use of futures markets could help stabilise the export earnings of developing countries and how this compares to the rather unsuccessful UNCTAD-type interventions via buffer stocks, pegged prices and cartels. A second set of papers faces the fact, largely ignored in the literature, that commodity prices are determined in foreign currencies, with the result that developing countries suffer from the volatility of the exchange rates of these currencies (even in cases where commodity prices are relatively stable). Financial markets are thus explicitly linked to futures and commodity markets.
New Directions in Computational Economics brings together for the first time a diverse selection of papers, sharing the underlying theme of the application of computing technology as a tool for achieving solutions to realistic problems in computational economics and related areas in the environmental, ecological and energy fields. Part I of the volume addresses experimental and computational issues in auction mechanisms, including a survey of recent results for sealed bid auctions. The second contribution uses neural networks as the basis for estimating bid functions for first price sealed bid auctions. Also presented is the 'smart market' computational mechanism, which better matches bids and offers for natural gas. Part II consists of papers that formulate and solve models of economic systems. Amman and Kendrick's paper deals with control models and the computational difficulties that result from nonconvexities. Using goal programming, Nagurney, Thore and Pan formulate spatial resource allocation models to analyze various policy issues. Thompson and Thrall next present a rigorous mathematical analysis of the relationship between efficiency and profitability. The problem of matching uncertain streams of assets and liabilities is solved using stochastic optimization techniques in the following paper in this section. Finally, Part III applies economic concepts to issues in computer science in addition to using computational techniques to solve economic models.
In 1945, very early in the history of the development of a rigorous analytical theory of probability, Feller (1945) wrote a paper called "The fundamental limit theorems in probability" in which he set out what he considered to be "the two most important limit theorems in the modern theory of probability: the central limit theorem and the recently discovered ... 'Kolmogoroff's celebrated law of the iterated logarithm'." A little later in the article he added to these, via a charming description, the "little brother (of the central limit theorem), the weak law of large numbers," and also the strong law of large numbers, which he considers a close relative of the law of the iterated logarithm. Feller might well have added to these also the beautiful and highly applicable results of renewal theory, which at the time he himself, together with eminent colleagues, was vigorously producing. Feller's introductory remarks include the visionary: "The history of probability shows that our problems must be treated in their greatest generality: only in this way can we hope to discover the most natural tools and to open channels for new progress. This remark leads naturally to that characteristic of our theory which makes it attractive beyond its importance for various applications: a combination of an amazing generality with algebraic precision."
World-renowned experts in spatial statistics and spatial econometrics present the latest advances in specification and estimation of spatial econometric models. This includes information on the development of tools and software, and various applications. The text introduces new tests and estimators for spatial regression models, including discrete choice and simultaneous equation models. The performance of techniques is demonstrated through simulation results and a wide array of applications related to economic growth, international trade, knowledge externalities, population-employment dynamics, urban crime, land use, and environmental issues. An exciting new text for academics with a theoretical interest in spatial statistics and econometrics, and for practitioners looking for modern and up-to-date techniques.
Gini's mean difference (GMD) was first introduced by Corrado Gini in 1912 as an alternative measure of variability. GMD and the parameters which are derived from it (such as the Gini coefficient or the concentration ratio) have been in use in the area of income distribution for almost a century. In practice, the use of GMD as a measure of variability is justified whenever the investigator is not ready to impose, without questioning, the convenient world of normality. This makes the GMD of critical importance in the complex research of statisticians, economists, econometricians, and policy makers. This book focuses on imitating analyses that are based on variance by replacing variance with the GMD and its variants. In this way, the text showcases how almost everything that can be done with the variance as a measure of variability, can be replicated by using Gini. Beyond this, there are marked benefits to utilizing Gini as opposed to other methods. One of the advantages of using Gini methodology is that it provides a unified system that enables the user to learn about various aspects of the underlying distribution. It also provides a systematic method and a unified terminology. Using Gini methodology can reduce the risk of imposing assumptions that are not supported by the data on the model. With these benefits in mind the text uses the covariance-based approach, though applications to other approaches are mentioned as well.
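As a minimal sketch of the central idea, the following Python snippet (illustrative, not taken from the book) computes Gini's mean difference and the Gini coefficient derived from it, using one common pairwise-difference convention:

```python
from itertools import combinations

def gini_mean_difference(xs):
    """Mean absolute difference over all distinct pairs of observations."""
    n = len(xs)
    return sum(abs(a - b) for a, b in combinations(xs, 2)) * 2 / (n * (n - 1))

def gini_coefficient(xs):
    """Concentration ratio derived from the GMD: GMD / (2 * mean)."""
    return gini_mean_difference(xs) / (2 * sum(xs) / len(xs))

incomes = [1, 2, 3, 4, 10]
print(gini_mean_difference(incomes))  # 4.0
print(gini_coefficient(incomes))      # 0.5
```

Unlike the variance, the GMD is built from absolute rather than squared deviations, which is why it remains informative when normality cannot be assumed.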
Elementary Bayesian Statistics is a thorough and easily accessible introduction to the theory and practical application of Bayesian statistics. It presents methods to assist in the collection, summary and presentation of numerical data. Bayesian statistics is becoming an increasingly important and more frequently used method for analysing statistical data. The author defines concepts and methods with a variety of examples and uses a stage-by-stage approach to coach the reader through the applied examples. Also included are a wide range of problems to challenge the reader, and the book makes extensive use of Minitab to apply computational techniques to statistical problems. Issues covered include probability, Bayes's Theorem and categorical states, frequency, the Bernoulli process and Poisson process, estimation, testing hypotheses, and the normal process with known parameters and uncertain parameters. Elementary Bayesian Statistics will be an essential resource for students as a supplementary text in traditional statistics courses. It will also be welcomed by academics, researchers and econometricians wishing to know more about Bayesian statistics.
This book is an extension of the author's first book and serves as a guide and manual on how to specify and compute 2-, 3-, and 4-Event Bayesian Belief Networks (BBN). It walks the learner through the steps of fitting and solving fifty BBN numerically, using mathematical proof. The author wrote this book primarily for inexperienced learners as well as professionals, while maintaining a proof-based academic rigor. The author's first book on this topic, a primer introducing learners to the basic complexities and nuances associated with learning Bayes' theorem and inverse probability for the first time, was meant for non-statisticians unfamiliar with the theorem-as is this book. This new book expands upon that approach and is meant to be a prescriptive guide for building BBN and executive decision-making for students and professionals; intended so that decision-makers can invest their time and start using this inductive reasoning principle in their decision-making processes. It highlights the utility of an algorithm that served as the basis for the first book, and includes fifty 2-, 3-, and 4-event BBN of numerous variants.
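To illustrate the kind of inverse-probability update a BBN node performs, here is a minimal two-hypothesis Bayes' theorem calculation in Python; the function name and the numbers are illustrative, not taken from the book.

```python
def bayes_posterior(priors, likelihoods):
    """Posterior P(H_i | E) from priors P(H_i) and likelihoods P(E | H_i)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    evidence = sum(joint)                    # total probability of the evidence
    return [j / evidence for j in joint]

# Two-event example: P(H) = 0.3, P(E|H) = 0.8, P(E|not H) = 0.2
posterior = bayes_posterior([0.3, 0.7], [0.8, 0.2])
print(posterior)  # roughly [0.632, 0.368]
```

Larger 3- and 4-event networks chain updates of exactly this form, which is why the book can work through fifty of them numerically.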
Economic Phenomena before and after War is the result of the author's search for a scientific explanation of modern wars, by means of economic statistical data, in the statistics of consumption, production and natural growth of population. The theory discussed assumes that a state of war in modern communities is dependent on the general economic equilibrium, which becomes more and more unstable as industrialization progresses. A state of war indicates a turning point in the action of balancing forces; it moves the economic forces in an opposite direction and is therefore a means for stabilizing the general economic equilibrium.
Economists can use computer algebra systems to manipulate symbolic models, derive numerical computations, and analyze empirical relationships among variables. Maxima is an open-source multi-platform computer algebra system that rivals proprietary software. Maxima's symbolic and computational capabilities enable economists and financial analysts to develop a deeper understanding of models by allowing them to explore the implications of differences in parameter values, providing numerical solutions to problems that would be otherwise intractable, and by providing graphical representations that can guide analysis. This book provides a step-by-step tutorial for using this program to examine the economic relationships that form the core of microeconomics in a way that complements traditional modeling techniques. Readers learn how to phrase the relevant analysis and how symbolic expressions, numerical computations, and graphical representations can be used to learn from microeconomic models. In particular, comparative statics analysis is facilitated. Little has been published on Maxima and its applications in economics and finance, and this volume will appeal to advanced undergraduates, graduate-level students studying microeconomics, academic researchers in economics and finance, economists, and financial analysts.
Econometrics, Macroeconomics and Economic Policy presents eighteen papers by Carl Christ focusing on econometric models, their evaluation and history, and the interactions between monetary and fiscal policy. Professor Christ's pioneering contributions to econometrics, monetary and fiscal policies and the government's budget constraint are thoroughly covered in this volume. Other areas addressed include monetary economics, monetary policy, macroeconomic model building, and the role of the economist in economic policy making. The book also features an original new introduction by the author and a detailed bibliography. Econometricians and macroeconomists will welcome this outstanding volume, in which Professor Christ argues firmly for the importance of testing econometric equations and models against new data, as well as for exploring the impact of the policies of central government.
Providing researchers in economics, finance, and statistics with an up-to-date introduction to applying Bayesian techniques to empirical studies, this book covers the full range of the new numerical techniques which have been developed over the last thirty years. Notably, these are: Monte Carlo sampling, antithetic replication, importance sampling, and Gibbs sampling. The author covers both advances in theory and modern approaches to numerical and applied problems, and includes applications drawn from a variety of different fields within economics, while also providing a quick overview of the underlying statistical ideas of Bayesian thought. The result is a book which presents a roadmap of applied economic questions that can now be addressed empirically with Bayesian methods. Consequently, many researchers will find this an accessible survey of this growing topic.
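Of the numerical techniques listed above, importance sampling is the easiest to sketch. The toy example below (entirely illustrative, with an assumed uniform proposal distribution) estimates the second moment of a standard normal by reweighting uniform draws:

```python
import math
import random

# Toy importance-sampling estimate of E[X^2] for X ~ N(0, 1), drawing
# proposals from Uniform(-5, 5). Setup is illustrative; the true value
# is Var(X) = 1.
random.seed(0)

def normal_pdf(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

N = 100_000
proposal_density = 1 / 10                 # Uniform(-5, 5) has density 1/10
total = 0.0
for _ in range(N):
    x = random.uniform(-5.0, 5.0)
    weight = normal_pdf(x) / proposal_density   # importance weight p(x)/q(x)
    total += weight * x * x
print(total / N)                          # close to 1.0
```

The same reweighting idea underlies Bayesian applications, where draws from a convenient proposal are weighted to approximate expectations under an intractable posterior.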
This book is an introductory exposition of different topics that emerged in the literature as unifying themes between two fields of econometrics of time series, namely nonlinearity and nonstationarity. Papers on these topics have exploded over the last two decades, but they are rarely examined together. There is, undoubtedly, a variety of arguments that justify such a separation. But there are also good reasons that motivate their combination. Those reluctant to a combined analysis might argue that nonlinearity and nonstationarity each raise non-trivial problems, so their combination holds little appeal given the plausibly increased difficulties. This argument can, however, be balanced by others of an economic nature. A predominant idea today is that a nonstationary series exhibits persistent deviations from its long-run components (either deterministic or stochastic trends). These persistent deviations are modelled in various ways: unit root models, fractionally integrated processes, models with shifts in the time trend, etc. However, there are many other behaviours inherent to nonstationary processes that are not reflected in linear models. For instance, economic variables with mixture distributions, or processes that are state-dependent, undergo episodes of changing dynamics. In models with multiple long-run equilibria, moving from one equilibrium to another sometimes implies hysteresis. Also, it is known that certain shocks can change the economic fundamentals, thereby reducing the possibility that an initial position is re-established after a shock (irreversibility).
Anyone who wants to understand stock market cycles and develop a focused, thoughtful, and solidly grounded valuation approach to the stock market must read this book. Bolten explains the causes and patterns of the cycles and identifies the causes of stock price changes. He identifies the sources of risk in the stock market and in individual stocks. Also covered is how the interaction of expected return and risk creates stock market cycles. Bolten discusses the industry sectors most likely to be profitable investments in each stage of the stock market cycle, while identifying the warning signs of stock market bubbles and sinkholes. The role of the Federal Reserve in each stage of the stock market cycle is also discussed. All the categories of risk are identified and explained, and no specific risk is left undiscussed. The underlying causes of long-term stock price trends and cycles are highlighted. The book is useful in many areas, including stock analysis, portfolio management, cost of equity capital, financing strategies, business valuations and spotting profit opportunities created by general economic and specific company changes.
This volume collects authoritative contributions on analytical methods and mathematical statistics. The methods presented include resampling techniques; the minimization of divergence; estimation theory and regression, possibly under shape or other constraints or long memory; and iterative approximations when the optimal solution is difficult to achieve. It also investigates probability distributions with respect to their stability, heavy-tailedness, Fisher information and other aspects, both asymptotically and non-asymptotically. The book not only presents the latest mathematical and statistical methods and their extensions, but also offers solutions to real-world problems, including option pricing. The selected, peer-reviewed contributions were originally presented at the workshop on Analytical Methods in Statistics, AMISTAT 2015, held in Prague, Czech Republic, November 10-13, 2015.
Professionals are constantly searching for competitive solutions to help determine current and future economic tendencies. Econometrics uses statistical methods and real-world data to predict and establish specific trends within business and finance. This analytical method holds enormous potential, but the research professionals need in order to understand and implement this approach is lacking. Applied Econometric Analysis: Emerging Research and Opportunities explores the theoretical and practical aspects of detailed econometric theories and applications within economics, political science, public policy, business, and finance. Featuring coverage of a broad range of topics such as cointegration, machine learning, and time series analysis, this book is ideally designed for economists, policymakers, financial analysts, marketers, researchers, academicians, and graduate students seeking research on the various techniques of econometric concepts.
This book combines both a comprehensive analytical framework and economic statistics that enable business decision makers to anticipate developing economic trends. The author blends recent and historical economic data with economic theory to provide important benchmarks or rules of thumb that give both economists and noneconomists enhanced understanding of unfolding economic data and their interrelationships. Through the matrix system, a disciplined approach is described for integrating readily available economic data into a comprehensive analysis without complex formulas. The extensive appendix of monthly key economic factors for 1978-1991 makes this an important reference source for economic and financial trend analysis. A new and practical method for economic trend analysis is introduced that provides more advanced knowledge than available from economic newsletters. Schaeffer begins with a general description of the business cycle and the typical behavior and effect of the credit markets, commercial banks, and the Federal Reserve. Next, fourteen key economic factors regularly reported by the business press are described, such as the capacity utilization rate and yield on three-month Treasury bills. Benchmarks for each of these key economic factors are set forth, together with an insightful discussion of the interrelationships indicating economic trends. A detailed discussion of the 1978-1991 American economy, incorporating monthly data from the historical matrix, demonstrates the practical application of the matrix system. Executives, investors, financial officers, and government policymakers will find this book useful in decision making.