Nonlinear and nonnormal filters are introduced and developed. Traditional nonlinear filters such as the extended Kalman filter and the Gaussian sum filter give biased filtering estimates, and therefore several nonlinear and nonnormal filters have been derived from the underlying probability density functions. The density-based nonlinear filters introduced in this book rely on numerical integration, Monte Carlo integration with importance sampling, or rejection sampling, and the resulting filtering estimates are asymptotically unbiased and efficient. All the nonlinear filters are compared in Monte Carlo simulation studies. Finally, as an empirical application, consumption functions based on the rational expectations model are estimated with the nonlinear filters, comparing the US, UK and Japanese economies.
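As an illustration of the density-based approach described above, the following is a minimal sketch, not taken from the book, of a sequential importance sampling (particle) filter for a nonlinear, nonnormal state-space model; the helper names (`init_draw`, `transition_draw`, `obs_loglik`) and the toy growth model at the bottom are invented for this example.

```python
import numpy as np

def particle_filter(y, n_particles, init_draw, transition_draw, obs_loglik, rng=None):
    """Sequential importance sampling with resampling for a state-space model.

    y               : array of observations, shape (T,)
    init_draw       : draws particles from the initial state density
    transition_draw : propagates particles through the state transition
    obs_loglik      : log p(y_t | state), evaluated particle-wise
    Returns the filtered state means E[x_t | y_1..y_t].
    """
    rng = rng or np.random.default_rng(0)
    particles = init_draw(n_particles, rng)
    means = np.empty(len(y))
    for t, yt in enumerate(y):
        particles = transition_draw(particles, rng)           # propagate
        logw = obs_loglik(yt, particles)                      # reweight by the observation density
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means[t] = np.sum(w * particles)                      # filtered mean
        idx = rng.choice(n_particles, size=n_particles, p=w)  # multinomial resampling
        particles = particles[idx]
    return means

# Toy nonlinear growth model: x_t = 0.5 x_{t-1} + 25 x_{t-1}/(1+x_{t-1}^2) + noise, y_t = 0.05 x_t^2 + noise
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    T, x = 50, 0.0
    xs, ys = [], []
    for _ in range(T):
        x = 0.5 * x + 25 * x / (1 + x**2) + rng.normal(scale=1.0)
        xs.append(x)
        ys.append(0.05 * x**2 + rng.normal(scale=1.0))
    est = particle_filter(
        np.array(ys), 2000,
        init_draw=lambda n, r: r.normal(size=n),
        transition_draw=lambda p, r: 0.5 * p + 25 * p / (1 + p**2) + r.normal(size=p.shape),
        obs_loglik=lambda yt, p: -0.5 * (yt - 0.05 * p**2) ** 2,
    )
    print(np.corrcoef(xs, est)[0, 1])  # rough check that the filter tracks the state
```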
This unorthodox book derives and tests a simple theory of economic time series using several well-known empirical economic puzzles, from stock market bubbles to the failure of conventional economic theory to explain low levels of inflation and unemployment in the US. Professor Stanley develops a new econometric methodology which demonstrates the explanatory power of the behavioral inertia hypothesis and solves the pretest/specification dilemma. He then applies this to important measures of the world's economies, including GDP, prices and consumer spending. The behavioral inertia hypothesis claims that inertia and randomness (or 'caprice') are the most important factors in representing and forecasting many economic time series. The development of this new model integrates well-known patterns in economic time series data with well-accepted ideas in contemporary philosophy of science. Academic economists will find this book interesting as it presents a unified approach to economic time series, solves a number of important empirical puzzles and introduces a new econometric methodology. Business and financial analysts will also find it useful because it offers a simple, yet powerful, framework in which to study and predict financial market movements.
How might one determine if a financial institution is taking risk in a balanced and productive manner? A powerful tool to address this question is economic capital, which is a model-based measure of the amount of equity that an entity must hold to satisfactorily offset its risk-generating activities. This book, with a particular focus on the credit-risk dimension, pragmatically explores real-world economic-capital methodologies and applications. It begins with the thorny practical issues surrounding the construction of an (industrial-strength) credit-risk economic-capital model, defensibly determining its parameters, and ensuring its efficient implementation. It then broadens its gaze to examine various critical applications and extensions of economic capital; these include loan pricing, the computation of loan impairments, and stress testing. Along the way, typically working from first principles, various possible modelling choices and related concepts are examined. The end result is a useful reference for students and practitioners wishing to learn more about a centrally important financial-management device.
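To make the idea of a model-based economic-capital measure more concrete, here is a minimal sketch, not the book's methodology, of a one-factor Gaussian credit-loss simulation in which economic capital is taken as a high loss quantile minus expected loss; the portfolio parameters and the 99.9% confidence level are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

def economic_capital(default_prob, exposure, lgd, rho, alpha=0.999, n_sims=50_000, seed=0):
    """Economic capital of a credit portfolio under a one-factor Gaussian model.

    Simulates correlated defaults, computes portfolio losses, and returns the
    loss quantile at confidence level alpha minus the expected loss.
    """
    rng = np.random.default_rng(seed)
    default_prob, exposure, lgd = map(np.asarray, (default_prob, exposure, lgd))
    thresholds = norm.ppf(default_prob)                      # default thresholds in asset-value space
    z = rng.standard_normal((n_sims, 1))                     # systematic (common) factor
    eps = rng.standard_normal((n_sims, len(default_prob)))   # idiosyncratic shocks
    assets = np.sqrt(rho) * z + np.sqrt(1 - rho) * eps
    losses = ((assets < thresholds) * exposure * lgd).sum(axis=1)
    return np.quantile(losses, alpha) - losses.mean()

# Hypothetical homogeneous portfolio: 100 obligors, 2% default probability, 45% loss-given-default
n = 100
print(economic_capital(np.full(n, 0.02), np.full(n, 1.0), np.full(n, 0.45), rho=0.15))
```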
Halbert White has made a major contribution to key areas of econometrics including specification analysis, specification testing, encompassing and Cox tests, and model selection. This book presents his most important published work, supplemented with new material setting his work in context. Together with new introductions to each of the chapters, the articles cover work from the early 1980s to 1996 and provide an excellent overview of the breadth of Professor White's work and the evolution of his ideas. Using rigorous mathematical techniques, Halbert White develops many of the central themes in econometrics concerning models, data generating processes and estimation procedures. Throughout the book the unifying vision is that econometric models are only imperfect approximations to the processes generating economic data, and that this has implications for the interpretation of estimates, inference and the selection of econometric models. This unique collection of some of Halbert White's important work, not otherwise readily accessible, will be welcomed by researchers, graduates and academics in econometrics and statistics.
Zvi Griliches has made many seminal contributions to econometrics during the course of a long and distinguished career. His work has focused primarily on the economics of technological change and the econometric problems that arise in trying to study it. This major collection presents Professor Griliches's most important essays and papers on method, applied econometrics and specification problems. It reflects his interests in data-instigated contributions to econometric methodology, developments in and exposition of specification analysis, statistical aggregation, distributed lag models, sample selection bias, measurement error, and other unobservable and variance component models. These methods are applied to important substantive questions such as the estimation of the returns to education, the measurement of quality change, and productivity and economies of scale. Practicing Econometrics provides an essential reference source to the work of one of the most influential econometricians of the late 20th century.
The maximum principle and dynamic programming are the two most commonly used approaches for solving optimal control problems, and they have been developed largely independently of each other. The theme of this book is to unify these two approaches and to demonstrate that viscosity solution theory provides the framework for doing so.
This book overviews the latest ideas and developments in financial econometrics, with an emphasis on how best to use prior knowledge (e.g., in a Bayesian way) and how best to borrow successful data processing techniques from other application areas (e.g., from quantum physics). The book also covers applications to economy-related phenomena ranging from traditionally analyzed phenomena such as manufacturing, the food industry, and taxes, to newer-to-analyze phenomena such as cryptocurrencies, influencer marketing, the COVID-19 pandemic, financial fraud detection, corruption, and the shadow economy. This book will inspire practitioners to learn how to apply state-of-the-art Bayesian, quantum, and related techniques to economic and financial problems, and inspire researchers to further improve the existing techniques and come up with new techniques for studying economic and financial phenomena. The book will also be of interest to students interested in the latest ideas and results.
Financial market volatility plays a crucial role in financial decision making, as volatility forecasts are important input parameters in areas such as option pricing, hedging strategies, portfolio allocation and Value-at-Risk calculations. The fact that financial innovations arrive at an ever-increasing rate has motivated both academic researchers and practitioners, and advances in this field have been considerable. The use of Stochastic Volatility (SV) models is one of the latest developments in this area. Empirical Studies on Volatility in International Stock Markets describes the existing techniques for the measurement and estimation of volatility in international stock markets, with emphasis on the SV model and its empirical application. Eugenie Hol develops various extensions of the SV model, which allow for additional variables in both the mean and the variance equation. In addition, the forecasting performance of SV models is compared not only to that of the well-established GARCH model but also to implied volatility and so-called realised volatility models which are based on intraday volatility measures.
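For readers who want a hands-on feel for the two model classes compared in the study, the following is a minimal sketch, not drawn from the book, that simulates returns from a basic SV model (an AR(1) in log-volatility) and then fits a GARCH(1,1) by Gaussian quasi-maximum likelihood; all parameter values are invented.

```python
import numpy as np
from scipy.optimize import minimize

def simulate_sv(T=2000, mu=-1.0, phi=0.95, sigma_eta=0.2, seed=0):
    """Simulate returns from a basic stochastic-volatility model:
    log h_t follows an AR(1); r_t = exp(log h_t / 2) * eps_t."""
    rng = np.random.default_rng(seed)
    log_h = np.empty(T)
    log_h[0] = mu
    for t in range(1, T):
        log_h[t] = mu + phi * (log_h[t - 1] - mu) + sigma_eta * rng.standard_normal()
    return np.exp(log_h / 2) * rng.standard_normal(T)

def garch11_negloglik(params, r):
    """Negative Gaussian quasi-log-likelihood of a GARCH(1,1)."""
    omega, alpha, beta = params
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(sigma2) + r ** 2 / sigma2)

r = simulate_sv()
res = minimize(garch11_negloglik, x0=[0.05, 0.05, 0.9], args=(r,),
               bounds=[(1e-6, None), (0.0, 1.0), (0.0, 1.0)])
print("fitted GARCH(1,1) (omega, alpha, beta):", res.x)
```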
Recent economic history suggests that a key element in economic growth and development for many countries has been an aggressive export policy and a complementary import policy. Such policies can be very effective provided that resources are used wisely to encourage exports from industries that can be competitive in the international arena. Also, import protection must be used carefully so that it encourages infant industries instead of providing rents to industries that are not competitive. Policy makers may use a variety of methods of analysis in planning trade policy. As computing power has grown in recent years, increasing attention has been given to economic models as one of the most powerful aids to policy making. These models can be used on the one hand to help in selecting export industries to encourage and infant industries to protect, and on the other hand to chart the larger effects of trade policy on the entire economy. While many models have been developed in recent years, there has not been any analysis of the strengths and weaknesses of the various types of models. Therefore, this monograph provides a review and analysis of the models which can be used to analyze dynamic comparative advantage.
Parallel Algorithms for Linear Models provides a complete and detailed account of the design, analysis and implementation of parallel algorithms for solving large-scale linear models. It investigates and presents efficient, numerically stable algorithms for computing the least-squares estimators and other quantities of interest on massively parallel systems. The monograph is in two parts. The first part consists of four chapters and deals with the computational aspects for solving linear models that have applicability in diverse areas. The remaining two chapters form the second part, which concentrates on numerical and computational methods for solving various problems associated with seemingly unrelated regression equations (SURE) and simultaneous equations models. The practical issues of the parallel algorithms and the theoretical aspects of the numerical methods will be of interest to a broad range of researchers working in the areas of numerical and computational methods in statistics and econometrics, parallel numerical algorithms, parallel computing and numerical linear algebra. The aim of this monograph is to promote research in the interface of econometrics, computational statistics, numerical linear algebra and parallelism.
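As a rough illustration of how work on a large linear model can be split across processors, here is a minimal sketch, not from the monograph, that forms the least-squares estimator by accumulating the cross-products X'X and X'y over independent row blocks; note that the monograph emphasizes numerically stable (e.g. QR-based) methods, whereas the normal-equation accumulation below is only meant to show the block structure. The data and chunk count are invented.

```python
import numpy as np

def chunked_least_squares(X, y, n_chunks=8):
    """OLS for y = X b + e by accumulating X'X and X'y over row blocks.

    Each block's cross-products can be computed independently (e.g. on a
    separate processor) and summed, so the reduction parallelizes naturally.
    """
    k = X.shape[1]
    xtx = np.zeros((k, k))
    xty = np.zeros(k)
    for Xb, yb in zip(np.array_split(X, n_chunks), np.array_split(y, n_chunks)):
        xtx += Xb.T @ Xb      # block contribution to X'X
        xty += Xb.T @ yb      # block contribution to X'y
    return np.linalg.solve(xtx, xty)

rng = np.random.default_rng(0)
X = rng.standard_normal((100_000, 5))
beta = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ beta + rng.standard_normal(100_000)
print(chunked_least_squares(X, y))   # should be close to beta
```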
In recent years there has been a growing interest in and concern for the development of a sound spatial statistical body of theory. This work has been undertaken by geographers, statisticians, regional scientists, econometricians, and others (e.g., sociologists). It has led to the publication of a number of books, including Cliff and Ord's Spatial Processes (1981), Bartlett's The Statistical Analysis of Spatial Pattern (1975), Ripley's Spatial Statistics (1981), Paelinck and Klaassen's Spatial Econometrics (1979), Ahuja and Schachter's Pattern Models (1983), and Upton and Fingleton's Spatial Data Analysis by Example (1985). The first of these books presents a useful introduction to the topic of spatial autocorrelation, focusing on autocorrelation indices and their sampling distributions. The second of these books is quite brief, but nevertheless furnishes an eloquent introduction to the relationship between spatial autoregressive and two-dimensional spectral models. Ripley's book virtually ignores autoregressive and trend surface modelling, and focuses almost solely on point pattern analysis. Paelinck and Klaassen's book closely follows an econometric textbook format, and as a result overlooks much of the important material necessary for successful spatial data analysis. It almost exclusively addresses distance and gravity models, with some treatment of autoregressive modelling. Pattern Models supplements Cliff and Ord's book, which in combination provide a good introduction to spatial data analysis. Its basic limitation is a preoccupation with the geometry of planar patterns, and hence it is very narrow in scope.
This book focuses on the issues of rating scheme design and risk aggregation for the risk matrix, a popular risk assessment tool in many fields. Although the risk matrix is usually treated as a qualitative tool, this book conducts the analysis from a quantitative perspective. The content belongs to the scope of risk management and, more specifically, relates to quick risk assessment. This book is suitable for researchers and practitioners working on qualitative or quick risk assessment, and it helps readers understand how to design more convincing risk assessment tools and carry out more accurate risk assessment in an uncertain context.
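As a toy illustration of rating scheme design and aggregation for a risk matrix, the following minimal sketch, not the book's method, scores each risk by ordinal likelihood and impact ratings, bands the score into Low/Medium/High, and aggregates a small risk register by expected loss; all thresholds and figures are invented.

```python
# A hypothetical 5x5 risk matrix: cell score = likelihood x impact (both on a 1..5 scale),
# banded into Low / Medium / High by invented thresholds.
def rate(likelihood, impact):
    score = likelihood * impact
    if score >= 15:
        return "High"
    if score >= 6:
        return "Medium"
    return "Low"

# Aggregate a small risk register by expected loss (probability x consequence),
# a quantitative counterpart to the qualitative matrix rating.
risks = [
    {"name": "supplier default", "likelihood": 4, "impact": 5, "prob": 0.20, "loss": 1.2e6},
    {"name": "data breach",      "likelihood": 2, "impact": 4, "prob": 0.05, "loss": 3.0e6},
    {"name": "FX movement",      "likelihood": 5, "impact": 2, "prob": 0.60, "loss": 2.0e5},
]
for r in risks:
    r["rating"] = rate(r["likelihood"], r["impact"])
    print(r["name"], "->", r["rating"])

print("aggregate expected loss:", sum(r["prob"] * r["loss"] for r in risks))
```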
This book presents the effects of integrating information and communication technologies (ICT) and economic processes in macroeconomic dynamics, finance, marketing, industrial policies, and in government economic strategy. The text explores modeling and applications in these fields and also describes, in a clear and accessible manner, the theories that guide the integration among information technology (IT), telecommunications, and the economy, while presenting examples of their applications. Current trends such as artificial intelligence, machine learning, and big data technologies used in economics are also included. This volume is suitable for researchers, practitioners, and students working in economic theory and the computational social sciences.
The first edition of this book has been described as a landmark book, being the first of its kind in applied econometrics. This second edition is thoroughly revised and updated and explains how to use many recent technical developments in time series econometrics. The main objective of the book is to help applied economists with a limited background in econometric estimation theory to understand and apply widely used time series econometric techniques.
Panel data is a data type increasingly used in research in economics, social sciences, and medicine. Its primary characteristic is that the data variation goes jointly over space (across individuals, firms, countries, etc.) and time (over years, months, etc.). Panel data allow examination of problems that cannot be handled by cross-section data or time-series data. Panel data analysis is a core field in modern econometrics and multivariate statistics, and studies based on such data occupy a growing part of the field in many other disciplines. The book is intended as a text for master's and advanced undergraduate courses. It may also be useful for PhD students writing theses in empirical and applied economics and for readers conducting empirical work on their own. The book attempts to take the reader gradually from simple models and methods in scalar (simple vector) notation to more complex models in matrix notation. A distinctive feature is that more attention is given to unbalanced panel data, the measurement error problem, random coefficient approaches, the interface between panel data and aggregation, and the interface between unbalanced panels and truncated and censored data sets. The 12 chapters are intended to be largely self-contained, although there is also a natural progression. Most of the chapters contain commented examples based on genuine data, mainly taken from panel data applications to economics. Although the book, inter alia through its use of examples, is aimed primarily at students of economics and econometrics, it may also be useful for readers in social sciences, psychology, and medicine, provided they have a sufficient background in statistics, notably basic regression analysis and elementary linear algebra.
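As a small example of the kind of model such a course starts from, here is a minimal sketch, not taken from the book, of the fixed-effects (within) estimator for a balanced panel with a single regressor, computed by demeaning each individual's observations over time; the simulated data and dimensions are invented.

```python
import numpy as np

def within_estimator(y, x, ids):
    """Fixed-effects (within) estimator for y_it = beta * x_it + a_i + u_it.

    Demeans y and x within each individual i, then runs OLS on the deviations,
    which removes the individual effects a_i.
    """
    yd, xd = y.astype(float).copy(), x.astype(float).copy()
    for i in np.unique(ids):
        m = ids == i
        yd[m] -= yd[m].mean()
        xd[m] -= xd[m].mean()
    return (xd @ yd) / (xd @ xd)

rng = np.random.default_rng(0)
N, T, beta = 200, 6, 1.5
ids = np.repeat(np.arange(N), T)
alpha = np.repeat(rng.normal(scale=2.0, size=N), T)   # individual effects
x = 0.5 * alpha + rng.standard_normal(N * T)          # regressor correlated with the effects
y = beta * x + alpha + rng.standard_normal(N * T)
print("within estimate:", within_estimator(y, x, ids))   # close to 1.5; pooled OLS would be biased
```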
An accessible, contemporary introduction to the methods for determining cause and effect in the social sciences. "Causation versus correlation has been the basis of arguments, economic and otherwise, since the beginning of time. Causal Inference: The Mixtape uses legit real-world examples that I found genuinely thought-provoking. It's rare that a book prompts readers to expand their outlook; this one did for me." - Marvin Young (Young MC). Causal inference encompasses the tools that allow social scientists to determine what causes what. In a messy world, causal inference is what helps establish the causes and effects of the actions being studied: for example, the impact (or lack thereof) of increases in the minimum wage on employment, the effects of early childhood education on incarceration later in life, or the influence on economic growth of introducing malaria nets in developing regions. Scott Cunningham introduces students and practitioners to the methods necessary to arrive at meaningful answers to the questions of causation, using a range of modeling techniques and coding instructions for both the R and the Stata programming languages.
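For a flavour of the minimum-wage example mentioned above, here is a minimal difference-in-differences sketch written in Python rather than the book's R or Stata; the data are simulated and the policy effect is invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
treated = rng.integers(0, 2, n)        # 1 = region that raised the minimum wage
post = rng.integers(0, 2, n)           # 1 = observation after the policy change
true_effect = -0.5                     # invented causal effect on the outcome

# Outcome = group effect + time trend + treatment effect on treated*post + noise
y = 2.0 * treated + 1.0 * post + true_effect * treated * post + rng.standard_normal(n)

# Difference-in-differences: (treated post - treated pre) - (control post - control pre)
did = ((y[(treated == 1) & (post == 1)].mean() - y[(treated == 1) & (post == 0)].mean())
       - (y[(treated == 0) & (post == 1)].mean() - y[(treated == 0) & (post == 0)].mean()))
print("DiD estimate of the policy effect:", did)   # should be close to -0.5
```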
Franz Ferschl is seventy. According to his birth certificate this is true, but it is hard to believe. Two of the three editors remember very well the Golden Age of Operations Research at Bonn, when Franz Ferschl worked together with Wilhelm Krelle, Martin Beckmann and Horst Albach. The importance of this fruitful cooperation is reflected by the fact that half of the contributors to this book were strongly influenced by Franz Ferschl and his colleagues at the University of Bonn. Clearly, Franz Ferschl left his traces at all the other places of his professional activities, in Vienna and Munich, as the present volume also demonstrates. Born in 1929 in the Upper Austrian Mühlviertel, his scientific education brought him to Vienna, where he studied mathematics. In his early years he was attracted by statistics and operations research. During his employment at the Österreichische Bundeskammer für Gewerbliche Wirtschaft in Vienna he prepared his famous book on queueing theory and stochastic processes in economics. This work was achieved during the scarce time left by his duties at the Bundeskammer, mostly between 6 a.m. and midnight. All those troubles were, however, soon rewarded by the chair of statistics at Bonn University. As a real Austrian, the amenities of the Rhineland could not prevent him from returning to Vienna, where he took the chair of statistics.
This book addresses the ultimate goal of economic studies: to predict how the economy develops, and what will happen if we implement different policies. To be able to do that, we need a good understanding of what causes what in economics. Prediction and causality in economics are the main topics of this book's chapters; they use both more traditional and more innovative techniques, including quantum ideas, to make predictions about the world economy (international trade, exchange rates), about a country's economy (gross domestic product, stock index, inflation rate), and about individual enterprises, banks, and micro-finance institutions: their future performance (including the risk of bankruptcy), their stock prices, and their liquidity. Several papers study how COVID-19 has influenced the world economy. This book helps practitioners and researchers to learn more about prediction and causality in economics, and to further develop this important research direction.
Through analysis of the European Union Emissions Trading Scheme (EU ETS) and the Clean Development Mechanism (CDM), this book demonstrates how to use a variety of econometric techniques to analyze the evolving and expanding carbon markets sphere, techniques that can be extrapolated to the worldwide marketplace. It features stylized facts about carbon markets from an economics perspective, as well as covering key aspects of pricing strategies, risk and portfolio management.
The research and its outcomes presented here focus on spatial sampling of agricultural resources. The authors introduce sampling designs and methods for producing accurate estimates of crop production for harvests across different regions and countries. With the help of real and simulated examples performed with the open-source software R, readers will learn about the different phases of spatial data collection. The agricultural data analyzed in this book help policymakers and market stakeholders to monitor the production of agricultural goods and its effects on environment and food safety.
"Game Theory for Economists" introduces economists to the game-theoretic approach of modelling economic behaviour and interaction, focusing on concepts and ideas from the vast field of game-theoretic models which find commonly used applications in economics. This careful selection of topics allows the reader to concentrate on the parts of the game which are the most relevant for the economist who does not want to become a specialist. Written at a level appropriate for a student or researcher with a solid microeconomic background, the book should provide the reader with skills necessary to formalize economic games and to make them accessible for game theoretic analysis. It offers a concise introduction to game theory which provides economists with the techniques and results necessary to follow the literature in economic theory; helps the reader formalize economic problems; and, concentrates on equilibrium concepts that are most commonly used in economics.
In the Administration building at Linköping University we have one of Oscar Reutersvärd's "Impossible Figures" in three dimensions. I call it "Perspectives of Science." When viewed from a specific point in space there is order and structure in the 3-dimensional figure. When viewed from other points there is disorder and no structure. If a specific scientific paradigm is used, there is order and structure; otherwise there is disorder and no structure. My perspective in Transportation Science has focused on understanding the mathematical structure and the logic underlying the choice probability models in common use. My book with N. F. Stewart on the Gravity model (Erlander and Stewart 1990) was written in this perspective. The present book stems from the same desire to understand underlying assumptions and structure. It investigates how far a new way of defining Cost-Minimizing Behavior can take us. It turns out that all commonly used choice probability distributions of logit type, that is, log-linear probability functions, follow from cost-minimizing behavior defined in the new way. In addition, some new nested models appear.
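To make the log-linear (logit-type) choice probabilities concrete, here is a minimal sketch, not taken from the book, of the multinomial logit form in which the probability of choosing alternative i is proportional to exp(-theta * c_i) for cost c_i and dispersion parameter theta; the costs and theta values are invented.

```python
import numpy as np

def logit_choice_probabilities(costs, theta):
    """Log-linear (multinomial logit) choice probabilities:
    P_i = exp(-theta * c_i) / sum_j exp(-theta * c_j).

    Subtracting the minimum cost first keeps the exponentials numerically stable.
    """
    c = np.asarray(costs, dtype=float)
    w = np.exp(-theta * (c - c.min()))
    return w / w.sum()

costs = [10.0, 12.0, 15.0]             # invented costs of three alternatives
for theta in (0.1, 1.0, 5.0):          # higher theta = stricter cost minimization
    print(theta, logit_choice_probabilities(costs, theta))
```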
From Robin Sickles: As I indicated to you some months ago, Professor William Horrace and I would like Springer to publish a Festschrift in Honor of Peter Schmidt, our professor. Peter's accomplishments are legendary among his students and the profession. I have a bit of that student perspective in my introductory and closing remarks on the website for the conference we had in his honor this last July. I have attached the conference program from which selected papers will come (as well as from students who were unable to attend). You will also find the names of his students (40) on the website. A top twenty economics department could be started up from those 40 students. Papers from some festschrifts have a thematic link among the papers based on subject material. What I think is unique to this festschrift is that the theme running through the papers will be Peter's remarkable legacy left to his students: to frame a problem and then analyze and examine it in depth using rigorous techniques, but rarely just for the purpose of showcasing technical refinements per se. I think this would be a book that graduate students would find invaluable in their early research careers and seasoned scholars would find invaluable in both their and their students' research.
This work contains up-to-date coverage of the last 20 years' advances in Bayesian inference in econometrics, with an emphasis on dynamic models. It shows how to treat Bayesian inference in nonlinear models by integrating the useful developments of numerical integration techniques based on simulations (such as Markov chain Monte Carlo methods) with the long-available analytical results of Bayesian inference for linear regression models. It thus covers a broad range of rather recent models for economic time series, such as nonlinear models, autoregressive conditional heteroskedastic regressions, and cointegrated vector autoregressive models. It also contains an extensive chapter on unit root inference from the Bayesian viewpoint. Several examples illustrate the methods. This book is intended for econometrics and statistics postgraduates, professors and researchers in economics departments, business schools, statistics departments, and other research centres in the same fields, especially econometricians.
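As a small illustration of the simulation-based Bayesian tools the book integrates, the following is a minimal random-walk Metropolis sketch, not taken from the book, for the slope of a simple linear regression under a flat prior and known error variance; the data, proposal scale, and burn-in length are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated regression data: y = 2.0 * x + e, e ~ N(0, 1)
x = rng.standard_normal(200)
y = 2.0 * x + rng.standard_normal(200)

def log_posterior(beta):
    """Flat prior plus Gaussian likelihood with known unit error variance."""
    resid = y - beta * x
    return -0.5 * np.sum(resid ** 2)

# Random-walk Metropolis sampling of the slope
draws, beta, lp = [], 0.0, log_posterior(0.0)
for _ in range(5000):
    prop = beta + 0.1 * rng.standard_normal()       # random-walk proposal
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:        # Metropolis accept/reject step
        beta, lp = prop, lp_prop
    draws.append(beta)

post = np.array(draws[1000:])                        # drop burn-in draws
print("posterior mean and std of the slope:", post.mean(), post.std())
```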
You may like...
The Handbook of Historical Economics - Alberto Bisin, Giovanni Federico (Paperback, R2,567)
Handbook of Experimental Game Theory - C. M. Capra, Rachel T. A. Croson, … (Hardcover, R7,224)
Design and Analysis of Time Series… - Richard McCleary, David McDowall, … (Hardcover, R3,286)
Macroeconomics and the Real World… - Roger E. Backhouse, Andrea Salanti (Hardcover, R4,479)
Introduction to Computational Economics… - Hans Fehr, Fabian Kindermann (Hardcover, R4,258)
Tools and Techniques for Economic… - Jelena Stanković, Pavlos Delias, … (Hardcover, R5,167)
Pricing Decisions in the Euro Area - How… - Silvia Fabiani, Claire Loupias, … (Hardcover, R2,160)
Agent-Based Modeling and Network… - Akira Namatame, Shu-Heng Chen (Hardcover, R2,970)