Financial market volatility plays a crucial role in financial decision making, as volatility forecasts are important input parameters in areas such as option pricing, hedging strategies, portfolio allocation and Value-at-Risk calculations. The fact that financial innovations arrive at an ever-increasing rate has motivated both academic researchers and practitioners, and advances in this field have been considerable. The use of Stochastic Volatility (SV) models is one of the latest developments in this area. Empirical Studies on Volatility in International Stock Markets describes the existing techniques for the measurement and estimation of volatility in international stock markets, with emphasis on the SV model and its empirical application. Eugenie Hol develops various extensions of the SV model which allow for additional variables in both the mean and the variance equation. In addition, the forecasting performance of SV models is compared not only to that of the well-established GARCH model but also to implied volatility and so-called realised volatility models, which are based on intraday volatility measures.
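For reference, the baseline discrete-time SV model referred to above is commonly written as follows; this is a standard textbook formulation, and the exact specification, together with the extensions that add variables to the mean and variance equations, is developed in the book itself:

    y_t = \sigma_t \varepsilon_t, \qquad \varepsilon_t \sim \mathrm{NID}(0, 1),
    \ln \sigma_t^2 = \alpha + \phi \ln \sigma_{t-1}^2 + \eta_t, \qquad \eta_t \sim \mathrm{NID}(0, \sigma_\eta^2),

where y_t denotes the (demeaned) return at time t and the log-variance follows a first-order autoregression.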
• Introduces the dynamics, principles and mathematics behind ten macroeconomic models, allowing students to visualise the models and understand the economic intuition behind them.
• Provides a step-by-step guide, and the necessary MATLAB codes, to allow readers to simulate and experiment with the models themselves.
Recent economic history suggests that a key element in economic growth and development for many countries has been an aggressive export policy and a complementary import policy. Such policies can be very effective provided that resources are used wisely to encourage exports from industries that can be competitive in the international arena. Also, import protection must be used carefully so that it encourages infant industries instead of providing rents to industries that are not competitive. Policy makers may use a variety of methods of analysis in planning trade policy. As computing power has grown in recent years, increasing attention has been given to economic models as one of the most powerful aids to policy making. These models can be used on the one hand to help in selecting export industries to encourage and infant industries to protect, and on the other hand to chart the larger effects of trade policy on the entire economy. While many models have been developed in recent years, there has not been any analysis of the strengths and weaknesses of the various types of models. Therefore, this monograph provides a review and analysis of the models which can be used to analyze dynamic comparative advantage.
The rich, multi-faceted and multi-disciplinary field of matching-based market design is an active and important one due to its highly successful applications with economic and sociological impact. Its home is economics, but with intimate connections to algorithm design and operations research. With chapters contributed by over fifty top researchers from all three disciplines, this volume is unique in its breadth and depth, while still being a cohesive and unified picture of the field, suitable for the uninitiated as well as the expert. It explains the dominant ideas from computer science and economics underlying the most important results on market design and introduces the main algorithmic questions and combinatorial structures. Methodologies and applications from both the pre-Internet and post-Internet eras are covered in detail. Key chapters discuss the basic notions of efficiency, fairness and incentives, and the way market design seeks solutions guided by normative criteria borrowed from social choice theory.
Parallel Algorithms for Linear Models provides a complete and detailed account of the design, analysis and implementation of parallel algorithms for solving large-scale linear models. It investigates and presents efficient, numerically stable algorithms for computing the least-squares estimators and other quantities of interest on massively parallel systems. The monograph is in two parts. The first part consists of four chapters and deals with the computational aspects for solving linear models that have applicability in diverse areas. The remaining two chapters form the second part, which concentrates on numerical and computational methods for solving various problems associated with seemingly unrelated regression equations (SURE) and simultaneous equations models. The practical issues of the parallel algorithms and the theoretical aspects of the numerical methods will be of interest to a broad range of researchers working in the areas of numerical and computational methods in statistics and econometrics, parallel numerical algorithms, parallel computing and numerical linear algebra. The aim of this monograph is to promote research in the interface of econometrics, computational statistics, numerical linear algebra and parallelism.
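As a point of reference for the least-squares computations discussed above, the quantity being computed is the familiar estimator below (generic notation, not necessarily that of the monograph):

    \hat{\beta} = \operatorname*{arg\,min}_{\beta} \lVert y - X\beta \rVert_2^2 = (X^{\top} X)^{-1} X^{\top} y,

which numerically stable parallel algorithms typically obtain from an orthogonal (e.g. QR) factorization of X rather than by forming X^{\top} X explicitly.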
In recent years there has been a growing interest in and concern for the development of a sound spatial statistical body of theory. This work has been undertaken by geographers, statisticians, regional scientists, econometricians, and others (e.g., sociologists). It has led to the publication of a number of books, including Cliff and Ord's Spatial Processes (1981), Bartlett's The Statistical Analysis of Spatial Pattern (1975), Ripley's Spatial Statistics (1981), Paelinck and Klaassen's Spatial Econometrics (1979), Ahuja and Schachter's Pattern Models (1983), and Upton and Fingleton's Spatial Data Analysis by Example (1985). The first of these books presents a useful introduction to the topic of spatial autocorrelation, focusing on autocorrelation indices and their sampling distributions. The second of these books is quite brief, but nevertheless furnishes an eloquent introduction to the relationship between spatial autoregressive and two-dimensional spectral models. Ripley's book virtually ignores autoregressive and trend surface modelling, and focuses almost solely on point pattern analysis. Paelinck and Klaassen's book closely follows an econometric textbook format, and as a result overlooks much of the important material necessary for successful spatial data analysis. It almost exclusively addresses distance and gravity models, with some treatment of autoregressive modelling. Pattern Models supplements Cliff and Ord's book; in combination the two provide a good introduction to spatial data analysis. Its basic limitation is a preoccupation with the geometry of planar patterns, and hence it is very narrow in scope.
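For readers unfamiliar with the spatial autocorrelation indices mentioned above, the best-known example is Moran's I, shown here in generic notation purely as an illustration rather than as quoted from any of the cited texts:

    I = \frac{n}{\sum_{i}\sum_{j} w_{ij}} \cdot \frac{\sum_{i}\sum_{j} w_{ij}\,(x_i - \bar{x})(x_j - \bar{x})}{\sum_{i} (x_i - \bar{x})^2},

where the w_{ij} are spatial weights linking locations i and j.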
This book focuses on the issues of rating scheme design and risk aggregation for the risk matrix, a popular risk assessment tool in many fields. Although the risk matrix is usually treated as a qualitative tool, this book conducts the analysis from a quantitative perspective. The content discussed belongs to the scope of risk management and, more specifically, to quick risk assessment. The book is suitable for researchers and practitioners involved in qualitative or quick risk assessment, and it helps readers understand how to design more convincing risk assessment tools and carry out more accurate risk assessments in an uncertain context.
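As an illustration of the kind of rating scheme the book analyses quantitatively, the Python sketch below encodes a generic five-by-five risk matrix; the category labels, scores, and rating thresholds are assumptions chosen for this example, not the design proposed in the book.

    # Generic risk-matrix sketch (illustrative assumptions, not the book's scheme).
    LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}
    IMPACT = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "severe": 5}

    def rate(likelihood: str, impact: str) -> str:
        """Map a (likelihood, impact) pair to a qualitative rating via the score product."""
        score = LIKELIHOOD[likelihood] * IMPACT[impact]
        if score >= 15:
            return "high"
        if score >= 6:
            return "medium"
        return "low"

    print(rate("likely", "moderate"))  # prints "medium" under these illustrative thresholds

One natural design question such a scheme raises, and which a quantitative treatment can address, is how the score bands should be chosen so that the resulting ratings are consistent and informative.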
This book presents the effects of integrating information and communication technologies (ICT) and economic processes in macroeconomic dynamics, finance, marketing, industrial policies, and in government economic strategy. The text explores modeling and applications in these fields and also describes, in a clear and accessible manner, the theories that guide the integration among information technology (IT), telecommunications, and the economy, while presenting examples of their applications. Current trends such as artificial intelligence, machine learning, and big data technologies used in economics are also included. This volume is suitable for researchers, practitioners, and students working in economic theory and the computational social sciences.
The first edition of this book has been described as a landmark book, being the first of its kind in applied econometrics. This second edition is thoroughly revised and updated and explains how to use many recent technical developments in time series econometrics. The main objective of the book is to help applied economists with a limited background in econometric estimation theory to understand and apply widely used time series econometric techniques.
Franz Ferschl is seventy. According to his birth certificate this is true, but it is hard to believe. Two of the three editors remember very well the Golden Age of Operations Research at Bonn, when Franz Ferschl worked together with Wilhelm Krelle, Martin Beckmann and Horst Albach. The importance of this fruitful cooperation is reflected by the fact that half of the contributors to this book were strongly influenced by Franz Ferschl and his colleagues at the University of Bonn. Clearly, Franz Ferschl also left his traces at the other places of his professional activities, in Vienna and Munich. This is demonstrated by the present volume as well. Born in 1929 in the Upper Austrian Mühlviertel, his scientific education brought him to Vienna, where he studied mathematics. In his early years he was attracted by Statistics and Operations Research. During his employment at the Österreichische Bundeskammer für Gewerbliche Wirtschaft in Vienna he prepared his famous book on queueing theory and stochastic processes in economics. This work was achieved during the scarce time left to him by his duties at the Bundeskammer, mostly between 6 a.m. and midnight. All those troubles were, however, soon rewarded by the chair of statistics at Bonn University. A true Austrian, he could not be kept by the amenities of the Rhineland from returning to Vienna, where he took the chair of statistics.
This book addresses the ultimate goal of economic studies: to predict how the economy develops, and what will happen if we implement different policies. To be able to do that, we need a good understanding of what causes what in economics. Prediction and causality in economics are the main topics of this book's chapters. They use both more traditional and more innovative techniques, including quantum ideas, to make predictions about the world economy (international trade, exchange rates), about a country's economy (gross domestic product, stock index, inflation rate), and about individual enterprises, banks, and micro-finance institutions: their future performance (including the risk of bankruptcy), their stock prices, and their liquidity. Several papers study how COVID-19 has influenced the world economy. This book helps practitioners and researchers to learn more about prediction and causality in economics and to further develop this important research direction.
Through analysis of the European Union Emissions Trading Scheme (EU ETS) and the Clean Development Mechanism (CDM), this book demonstrates how to use a variety of econometric techniques to analyze the evolving and expanding carbon markets sphere, techniques that can be extrapolated to the worldwide marketplace. It features stylized facts about carbon markets from an economics perspective, as well as covering key aspects of pricing strategies, risk and portfolio management.
The research and its outcomes presented here focus on spatial sampling of agricultural resources. The authors introduce sampling designs and methods for producing accurate estimates of crop production for harvests across different regions and countries. With the help of real and simulated examples performed with the open-source software R, readers will learn about the different phases of spatial data collection. The agricultural data analyzed in this book help policymakers and market stakeholders to monitor the production of agricultural goods and its effects on the environment and food safety.
"Game Theory for Economists" introduces economists to the game-theoretic approach of modelling economic behaviour and interaction, focusing on concepts and ideas from the vast field of game-theoretic models which find commonly used applications in economics. This careful selection of topics allows the reader to concentrate on the parts of the game which are the most relevant for the economist who does not want to become a specialist. Written at a level appropriate for a student or researcher with a solid microeconomic background, the book should provide the reader with skills necessary to formalize economic games and to make them accessible for game theoretic analysis. It offers a concise introduction to game theory which provides economists with the techniques and results necessary to follow the literature in economic theory; helps the reader formalize economic problems; and, concentrates on equilibrium concepts that are most commonly used in economics.
In the Administration building at Linköping University we have one of Oscar Reutersvärd's "Impossible Figures" in three dimensions. I call it "Perspectives of Science." When viewed from a specific point in space there is order and structure in the 3-dimensional figure. When viewed from other points there is disorder and no structure. If a specific scientific paradigm is used, there is order and structure; otherwise there is disorder and no structure. My perspective in Transportation Science has focused on understanding the mathematical structure and the logic underlying the choice probability models in common use. My book with N. F. Stewart on the Gravity model (Erlander and Stewart 1990) was written in this perspective. The present book stems from the same desire to understand underlying assumptions and structure. It investigates how far a new way of defining Cost-Minimizing Behavior can take us. It turns out that all commonly used choice probability distributions of logit type, that is, log-linear probability functions, follow from cost-minimizing behavior defined in the new way. In addition some new nested models appear.
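The log-linear choice probabilities of logit type referred to above take the familiar form shown below; the notation is generic and serves only to identify the class of models, not to reproduce the book's derivation from cost-minimizing behavior:

    P_j = \frac{e^{-\theta c_j}}{\sum_{k} e^{-\theta c_k}},

so that the probability of choosing alternative j declines log-linearly in its cost c_j.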
From Robin Sickles: As I indicated to you some months ago, Professor William Horrace and I would like Springer to publish a Festschrift in Honor of Peter Schmidt, our professor. Peter's accomplishments are legendary among his students and the profession. I have a bit of that student perspective in my introductory and closing remarks on the website for the conference we had in his honor this last July. I have attached the conference program from which selected papers will come (as well as from students who were unable to attend). You will also find the names of his students (40) on the website. A top twenty economics department could be started up from those 40 students. Papers from some Festschrifts have a thematic link among the papers based on subject material. What I think is unique to this Festschrift is that the theme running through the papers will be Peter's remarkable legacy left to his students: to frame a problem and then analyze and examine it in depth using rigorous techniques, but rarely just for the purpose of showcasing technical refinements per se. I think this would be a book that graduate students would find invaluable in their early research careers and that seasoned scholars would find invaluable in both their own and their students' research.
This work contains up-to-date coverage of the last 20 years' advances in Bayesian inference in econometrics, with an emphasis on dynamic models. It shows how to treat Bayesian inference in nonlinear models by integrating the useful developments in numerical integration techniques based on simulation (such as Markov chain Monte Carlo methods) with the long-available analytical results of Bayesian inference for linear regression models. It thus covers a broad range of rather recent models for economic time series, such as nonlinear models, autoregressive conditional heteroskedastic regressions, and cointegrated vector autoregressive models. It also contains an extensive chapter on unit root inference from the Bayesian viewpoint. Several examples illustrate the methods. This book is intended for econometrics and statistics postgraduates, professors and researchers in economics departments, business schools, statistics departments, or any research centre in the same fields, especially econometricians.
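To give a concrete flavour of the simulation-based integration techniques mentioned above, the Python sketch below runs a random-walk Metropolis sampler for the slope of a simple linear regression with known noise variance; the model, prior, and tuning constants are assumptions made for this illustration and are not taken from the book.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    y = 1.5 * x + rng.normal(scale=0.5, size=200)  # simulated data, true slope 1.5

    def log_post(beta, sigma=0.5, prior_sd=10.0):
        # Log posterior up to a constant: Gaussian likelihood plus N(0, prior_sd^2) prior.
        return -0.5 * np.sum((y - beta * x) ** 2) / sigma**2 - 0.5 * beta**2 / prior_sd**2

    beta, draws = 0.0, []
    for _ in range(5000):
        proposal = beta + rng.normal(scale=0.1)          # random-walk proposal
        if np.log(rng.uniform()) < log_post(proposal) - log_post(beta):
            beta = proposal                              # accept with Metropolis probability
        draws.append(beta)

    print(np.mean(draws[1000:]))  # posterior mean after burn-in, close to 1.5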
In Capital Theory, Equilibrium Analysis and Recursive Utility, Robert Becker and John Boyd have synthesized their previously unpublished work on recursive models. The use of recursive utility emphasizes time-consistent decision making. This permits a unified and systematic account of economic dynamics based on neoclassical growth theory. The book provides extensive coverage of optimal growth (including endogenous growth), dynamic competitive equilibria, nonlinear dynamics, and monotone comparative dynamics. It is addressed to all researchers in economic growth, and will be useful to professional economists and graduate students alike.
This book proposes a new methodology for the selection of one model from among a set of alternative econometric models. Let us recall that a model is an abstract representation of reality which brings out what is relevant to a particular economic issue. An econometric model is also an analytical characterization of the joint probability distribution of some random variables of interest, which yields some information on how the actual economy works. This information will be useful only if it is accurate and precise; that is, the information must be far from ambiguous and close to what we observe in the real world. Thus, model selection should be performed on the basis of statistics which summarize the degree of accuracy and precision of each model. A model is accurate if it predicts correctly; it is precise if it produces tight confidence intervals. A first general approach to model selection includes those procedures based on both characteristics, precision and accuracy. A particularly interesting example of this approach is that of Hildebrand, Laing and Rosenthal (1980). See also Hendry and Richard (1982). A second general approach includes those procedures that use only one of the two dimensions to discriminate among models. In general, most of the tests we are going to examine correspond to this category.
Written for those who need an introduction, Applied Time Series Analysis reviews applications of the popular econometric analysis technique across disciplines. Carefully balancing accessibility with rigor, it spans economics, finance, economic history, climatology, meteorology, and public health. Terence Mills provides a practical, step-by-step approach that emphasizes core theories and results without becoming bogged down by excessive technical details. Including univariate and multivariate techniques, Applied Time Series Analysis provides data sets and program files that support a broad range of multidisciplinary applications, distinguishing this book from others.
The advent of low-cost computation has made many previously intractable econometric models empirically feasible, and computational methods are now recognized as an integral part of the theory. This book provides graduate students and researchers not only with a sound theoretical introduction to the topic, but also allows the reader, through an internet-based interactive computing environment, to move from theory to practice with the different techniques discussed in the book. Among the theoretical issues presented are linear regression analysis and univariate time series modelling, with some interesting extensions such as ARCH models and dimensionality reduction techniques. The electronic version of the book, including all computational possibilities, can be viewed at http://www.xplore-stat.de/ebooks/ebooks.html
A classic treatise that defined the field of applied demand analysis, Consumer Demand in the United States: Prices, Income, and Consumption Behavior is now fully updated and expanded for a new generation. Consumption expenditures by households in the United States account for about 70% of America's GDP. The primary focus in this book is on how households adjust these expenditures in response to changes in price and income. Econometric estimates of price and income elasticities are obtained for an exhaustive array of goods and services using data from surveys conducted by the Bureau of Labor Statistics and aggregate consumption expenditures from the National Income and Product Accounts, providing a better understanding of consumer demand. Practical models for forecasting future price and income elasticities are also demonstrated. Fully revised with over a dozen new chapters and appendices, the book revisits the original Houthakker-Taylor models while examining new material as well, such as the use of quantile regression and the stationarity of consumer preference. It also explores the emerging connection between neuroscience and consumer behavior, integrating the economic literature on demand theory with psychology literature. The most comprehensive treatment of the topic to date, this volume will be an essential resource for any researcher, student or professional economist working on consumer behavior or demand theory, as well as investors and policymakers concerned with the impact of economic fluctuations.
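For reference, the price and income elasticities estimated throughout the book measure proportional responses of demand; in a constant-elasticity (log-log) specification they appear directly as regression coefficients. The notation below is generic and is not necessarily the Houthakker-Taylor specification used in the book:

    \ln q = \alpha + \varepsilon_p \ln p + \varepsilon_y \ln y + u, \qquad \varepsilon_p = \frac{\partial \ln q}{\partial \ln p}, \quad \varepsilon_y = \frac{\partial \ln q}{\partial \ln y},

where q is quantity demanded, p its price, and y household income.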
Reformation of Econometrics is a sequel to The Formation of Econometrics: A Historical Perspective (1993, OUP), which traces the formation of econometric theory during the period 1930-1960. This book provides an account of the advances in the field of econometrics since the 1970s. Based on original research, it focuses on the reformists' movement and the schools of thought and practices that attempted a paradigm shift in econometrics in the 1970s and 1980s. It describes the formation and consolidation of the Cowles Commission (CC) paradigm and traces and analyses the three major methodological attempts to resolve problems involved in model choice and specification of the CC paradigm. These attempts have reoriented the focus of econometric research from internal questions (how to optimally estimate a priori given structural parameters) to external questions (how to choose, design, and specify models). It also examines various modelling issues and problems through two case studies: modelling the Phillips curve and business cycles. The third part of the book delves into the development of three key aspects of model specification in detail: structural parameters, error terms, and model selection and design procedures. The final chapter uses citation analyses to study the impact of the CC paradigm over the span of three and a half decades (1970-2005). The citation statistics show that the impact has remained extensive and relatively strong in spite of certain weakening signs, implying that the reformative attempts have fallen short of causing a paradigm shift.
This volume is in honour of the remarkable career of the Father of Spatial Econometrics, Professor Jean Paelinck, presently of the Tinbergen Institute, Rotterdam. Jean Paelinck is, arguably, the founder of modern spatial econometrics. His impact on the profession through his work in spatial econometrics, regional science, and more conventional economics can be measured in many ways: through the work of his students, his devotion to and activism in facilitating the diffusion of regional science to Poland, the formulation and development of his FLEUR model, his co-founding of the French-speaking Regional Science Association, the voluminous references to his scholarly publications, his many invitations to be a featured speaker at conferences and universities throughout the world, the offices he has held in scholarly and professional associations, Erasmus University Rotterdam and the Netherlands Economic Institute, and the numerous honorary degrees he has been awarded. A series of special sessions in honour of Jean Paelinck were organized at the most prominent regional science meetings around the world. A number of prominent scholars in the field organized and participated in special sessions labelled 'In Honour of Professor Paelinck'. These sessions reflect the truly global reach of the techniques and methods pioneered by him. As an outgrowth of six conferences, final versions of the selected papers are collected in this volume. Prominent ideas contained in each of the selected contributions can be traced explicitly to work by Jean Paelinck.
You may like...
Spatial Analysis Using Big Data… by Yoshiki Yamagata, Hajime Seya (Paperback, R3,021)
Ranked Set Sampling - 65 Years Improving… by Carlos N. Bouza-Herrera, Amer Ibrahim Falah Al-Omari (Paperback)
Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover, R3,286)
Regional Research Frontiers - Vol. 2… by Randall Jackson, Peter Schaeffer (Hardcover, R4,391)
Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover, R4,258)
Macroeconomics and the Real World… by Roger E. Backhouse, Andrea Salanti (Hardcover, R4,296)
Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover, R2,160)
Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover, R2,970)