This is the second of two volumes containing papers and commentaries presented at the Eleventh World Congress of the Econometric Society, held in Montreal, Canada, in August 2015. These papers provide state-of-the-art guides to the most important recent research in economics. The book includes surveys and interpretations of key developments in economics and econometrics, and discussion of future directions for a wide variety of topics, covering both theory and application. These volumes provide a unique, accessible survey of progress in the discipline, written by leading specialists in their fields. The second volume addresses topics such as big data, macroeconomics, financial markets, and partially identified models.
Terence Mills' best-selling graduate textbook provides detailed coverage of the latest research techniques and findings relating to the empirical analysis of financial markets. In its previous editions it has become required reading for many graduate courses on the econometrics of financial modelling. The third edition, co-authored with Raphael Markellos, contains a wealth of new material reflecting the developments of the last decade. Particular attention is paid to the wide range of nonlinear models that are used to analyse financial data observed at high frequencies and to the long memory characteristics found in financial time series. The central material on unit root processes and the modelling of trends and structural breaks has been substantially expanded into a chapter of its own. There is also an extended discussion of the treatment of volatility, accompanied by a new chapter on nonlinearity and its testing.
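As an illustration of the volatility modelling such a textbook covers, the sketch below simulates a GARCH(1,1) process in Python; the parameter values are assumptions chosen for demonstration, not taken from the text.

```python
import numpy as np

# Minimal GARCH(1,1) simulation:
#   sigma2_t = omega + alpha * eps_{t-1}^2 + beta * sigma2_{t-1}
# Parameter values are illustrative assumptions (alpha + beta < 1 for stationarity).
rng = np.random.default_rng(7)
omega, alpha, beta = 0.05, 0.10, 0.85
n = 5000
sigma2 = np.empty(n)
eps = np.empty(n)
sigma2[0] = omega / (1.0 - alpha - beta)   # unconditional variance
eps[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
for t in range(1, n):
    sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# Squared returns are positively autocorrelated: volatility clustering.
clustering = np.corrcoef(eps[:-1] ** 2, eps[1:] ** 2)[0, 1]
```

Plotting `eps` shows the alternating bursts of high and low volatility characteristic of financial returns observed at high frequency.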
Financial econometrics is a great success story in economics. Econometrics uses data and statistical inference methods, together with structural and descriptive modeling, to address rigorous economic problems. Its development within the world of finance is quite recent and has been paralleled by a fast expansion of financial markets and an increasing variety and complexity of financial products. This has fueled the demand for people with advanced econometrics skills. For professionals and advanced graduate students pursuing greater expertise in econometric modeling, this is a superb guide to the field's frontier. With the goal of providing information that is absolutely up-to-date—essential in today's rapidly evolving financial environment—Gourieroux and Jasiak focus on methods related to foregoing research and those modeling techniques that seem relevant to future advances. They present a balanced synthesis of financial theory and statistical methodology. Recognizing that any model is necessarily a simplified image of reality and that econometric methods must be adapted and applied on a case-by-case basis, the authors employ a wide variety of data sampled at frequencies ranging from intraday to monthly. These data comprise time series representing both the European and North American markets for stocks, bonds, and foreign currencies. Practitioners are encouraged to keep a critical eye and are armed with graphical diagnostics to eradicate misspecification errors. This authoritative, state-of-the-art reference text is ideal for upper-level graduate students, researchers, and professionals seeking to update their skills and gain greater facility in using econometric models. All will benefit from the emphasis on practical aspects of financial modeling and statistical inference. 
Doctoral candidates will appreciate the inclusion of detailed mathematical derivations of the deeper results as well as the more advanced problems concerning high-frequency data and risk control. By establishing a link between practical questions and the answers provided by financial and statistical theory, the book also addresses the needs of applied researchers employed by financial institutions.
This advanced undergraduate/graduate textbook teaches students in finance and economics how to use R to analyse financial data and implement financial models. It demonstrates how to take publicly available data, manipulate it, implement models, and generate the outputs typical of particular analyses. A wide spectrum of timely and practical issues in financial modelling is covered, including return and risk measurement, portfolio management, option pricing and fixed income analysis. This new edition updates and expands upon the existing material, providing updated examples and new chapters on equities, simulation and trading strategies, including machine learning techniques. Select data sets are available online.
Originally published in 1989. ECESIS consists of 51 regional econometric models (one for each state and the District of Columbia) and a multiregional demographic model. Its distinguishing feature is the linking of sophisticated demographic accounts with sophisticated structural econometric models. Examining how strong the interactions are between population dynamics and economic activity, this book determines to what extent the simultaneous economic-demographic interregional model provides improved projection and simulation properties over regional economic and demographic models used independently of one another.
Originally published in 1979. An input/output database is an information system carrying current data on the intermediate consumption of any product or service by all the specified major firms that consume it. This book begins with a survey of how the interrelationships of an economic system can be represented in a two-dimensional model which traces the output of each economic sector to all other sectors. It discusses how the use of such databases to identify major buyers and sellers can illuminate problems of economic policy at the national, regional, and corporate level and aid in analyzing factors affecting the control of inflation, energy use, transportation, and environmental pollution. The book also considers how advances in database technology have brought to the fore such issues as the right to individual privacy, corporate secrecy, the public's right of access to stored data, and the use of such information for national planning in a free-enterprise society.
Statistical physics concepts such as stochastic dynamics, short- and long-range correlations, self-similarity and scaling, permit an understanding of the global behavior of economic systems without first having to work out a detailed microscopic description of the system. This pioneering text explores the use of these concepts in the description of financial systems, the dynamic new specialty of econophysics. The authors illustrate the scaling concepts used in probability theory, critical phenomena, and fully-developed turbulent fluids and apply them to financial time series. They also present a new stochastic model that displays several of the statistical properties observed in empirical data. Physicists will find the application of statistical physics concepts to economic systems fascinating. Economists and other financial professionals will benefit from the book's empirical analysis methods and well-formulated theoretical tools that will allow them to describe systems composed of a huge number of interacting subsystems.
This textbook provides a self-contained presentation of the theory and models of time series analysis. Putting an emphasis on weakly stationary processes and linear dynamic models, it describes the basic concepts, ideas, methods and results in a mathematically well-founded form and includes numerous examples and exercises. The first part presents the theory of weakly stationary processes in time and frequency domain, including prediction and filtering. The second part deals with multivariate AR, ARMA and state space models, which are the most important model classes for stationary processes, and addresses the structure of AR, ARMA and state space systems, Yule-Walker equations, factorization of rational spectral densities and Kalman filtering. Finally, there is a discussion of Granger causality, linear dynamic factor models and (G)ARCH models. The book provides a solid basis for advanced mathematics students and researchers in fields such as data-driven modeling, forecasting and filtering, which are important in statistics, control engineering, financial mathematics, econometrics and signal processing, among other subjects.
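As a small illustration of the Yule-Walker equations mentioned above, this hedged sketch estimates AR(2) coefficients from a simulated series; the true coefficients 0.6 and -0.3 are assumptions for the demonstration, not an example from the book.

```python
import numpy as np

# Simulate an AR(2) process with assumed coefficients.
rng = np.random.default_rng(0)
n = 5000
phi = np.array([0.6, -0.3])
x = np.zeros(n)
for t in range(2, n):
    x[t] = phi[0] * x[t - 1] + phi[1] * x[t - 2] + rng.standard_normal()

def acov(x, k):
    """Sample autocovariance at lag k."""
    xc = x - x.mean()
    return (xc[:len(xc) - k] * xc[k:]).mean()

# Yule-Walker: solve the Toeplitz system R * phi = gamma[1:] for the AR coefficients.
p = 2
gamma = np.array([acov(x, k) for k in range(p + 1)])
R = np.array([[gamma[abs(i - j)] for j in range(p)] for i in range(p)])
phi_hat = np.linalg.solve(R, gamma[1:])
```

With 5000 observations, `phi_hat` recovers the assumed coefficients closely, illustrating the consistency of the Yule-Walker estimator for stationary AR processes.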
This volume presents new methods and applications of longitudinal data estimation methodology in applied economics. Featuring selected papers from the 2020 International Conference on Applied Economics (ICOAE 2020), held virtually due to the coronavirus pandemic, the book examines interdisciplinary topics such as financial economics, international economics, agricultural economics, marketing and management. Country-specific case studies are also featured.
When John Maynard Keynes likened Jan Tinbergen's early work in econometrics to black magic and alchemy, he was expressing a widely held view of a new discipline. However, even after half a century of practical work and theorizing by some of the most accomplished social scientists, Keynes' comments are still repeated today. This book assesses the foundations and development of econometrics and sets out a basis for the reconstruction of the foundations of econometric inference by examining the various interpretations of probability theory which underlie econometrics. Keuzenkamp claims that the probabilistic foundations of econometrics are weak, and although econometric inferences may yield interesting knowledge, claims to be able to falsify or verify economic theories are unwarranted. Methodological falsificationism in econometrics is an illusion. Instead, it is argued, econometrics should locate itself in the tradition of positivism.
This book offers a series of statistical tests to determine if the "crowd out" problem, known to hinder the effectiveness of Keynesian economic stimulus programs, can be overcome by monetary programs. It concludes there are programs that can do this, specifically "accommodative monetary policy." They were not used to any great extent prior to the Quantitative Easing program in 2008, causing the failure of many fiscal stimulus programs through no fault of their own. The book includes exhaustive statistical tests to prove this point. There is also a policy analysis section of the book. It examines how effectively the Federal Reserve's anti-crowd out programs have actually worked, to the extent they were undertaken at all. It finds statistical evidence that using commercial and savings banks instead of investment banks when implementing accommodating monetary policy would have markedly improved their effectiveness. This volume, with its companion volume Why Fiscal Stimulus Programs Fail, Volume 2: Statistical Tests Comparing Monetary Policy to Growth, provides 1000 separate statistical tests on the US economy to prove these assertions.
This book gives a thorough and systematic introduction to the latest research results about fuzzy decision-making method based on prospect theory. It includes eight chapters: Introduction, Intuitionistic fuzzy MADM based on prospect theory, QUALIFLEX based on prospect theory with probabilistic linguistic information, Group PROMETHEE based on prospect theory with hesitant fuzzy linguistic information, Prospect consensus with probabilistic hesitant fuzzy preference information, Improved TODIM based on prospect theory and the improved TODIM with probabilistic hesitant fuzzy information, etc. This book is suitable for the researchers in the fields of fuzzy mathematics, operations research, behavioral science, management science and engineering, etc. It is also useful as a textbook for postgraduate and senior-year undergraduate students of the relevant professional institutions of higher learning.
This is the first of three volumes containing edited versions of papers and a commentary presented at invited symposium sessions of the Ninth World Congress of the Econometric Society, held in London in August 2005. The papers summarise and interpret key developments, and they discuss future directions for a wide variety of topics in economics and econometrics. The papers cover both theory and applications. Written by leading specialists in their fields, these volumes provide a unique survey of progress in the discipline.
This book scientifically tests the assertion that accommodative monetary policy can eliminate the "crowd out" problem, allowing fiscal stimulus programs (such as tax cuts or increased government spending) to stimulate the economy as intended. It also tests whether natural growth in the economy can cure the crowd out problem as well as, or better than, accommodative policy. The book is intended to be the largest-scale scientific test ever performed on this topic. It includes about 800 separate statistical tests on the U.S. economy covering different parts or all of the period 1960-2010. These tests focus on whether accommodative monetary policy, which increases the pool of loanable resources, can offset the crowd out problem as well as natural growth in the economy can. The book, employing the best scientific methods available to economists for this type of problem, concludes that accommodative monetary policy could have worked, but that until the quantitative easing program, Federal Reserve efforts to accommodate fiscal stimulus programs were not large enough to offset more than 23% to 44% of any one year's crowd out problem. That provides the scientific answer as to why accommodative monetary policy didn't accommodate: too little of it was tried. The book also tests whether other increases in loanable funds, occurring because of natural growth in the economy or changes in the savings rate, can also offset crowd out. It concludes they can, and that these changes tend to be several times as effective as accommodative monetary policy. This book's companion volume Why Fiscal Stimulus Programs Fail explores the policy implications of these results.
This book provides a comprehensive and concrete illustration of time series analysis focusing on the state-space model, which has recently attracted increasing attention in a broad range of fields. The major feature of the book lies in its consistent Bayesian treatment regarding whole combinations of batch and sequential solutions for linear Gaussian and general state-space models: MCMC and Kalman/particle filter. The reader is given insight on flexible modeling in modern time series analysis. The main topics of the book deal with the state-space model, covering extensively, from introductory and exploratory methods to the latest advanced topics such as real-time structural change detection. Additionally, a practical exercise using R/Stan based on real data promotes understanding and enhances the reader's analytical capability.
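The Kalman filter at the heart of a linear Gaussian state-space treatment can be sketched for the simplest case, a local-level model; the noise variances below are assumptions for illustration (the book itself works in R/Stan).

```python
import numpy as np

def kalman_filter(y, q=0.1, r=1.0, x0=0.0, p0=10.0):
    """Kalman filter for the local-level model:
       state:       x_t = x_{t-1} + w_t,  w_t ~ N(0, q)
       observation: y_t = x_t + v_t,      v_t ~ N(0, r)
    q, r, x0, p0 are assumed values for this sketch."""
    x, p, out = x0, p0, []
    for obs in y:
        p = p + q                  # predict: state variance grows
        k = p / (p + r)            # Kalman gain
        x = x + k * (obs - x)      # update with the new observation
        p = (1 - k) * p
        out.append(x)
    return np.array(out)

# Simulate a slowly drifting level observed with noise, then filter it.
rng = np.random.default_rng(1)
level = np.cumsum(rng.normal(0.0, 0.1, 200)) + 5.0
y = level + rng.normal(0.0, 1.0, 200)
est = kalman_filter(y)
```

The filtered path `est` tracks the latent level with substantially less noise than the raw observations, which is the essence of the sequential (Kalman) solution the book pairs with batch MCMC.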
This highly accessible and innovative text with supporting web site uses Excel (R) to teach the core concepts of econometrics without advanced mathematics. It enables students to use Monte Carlo simulations in order to understand the data generating process and sampling distribution. Intelligent repetition of concrete examples effectively conveys the properties of the ordinary least squares (OLS) estimator and the nature of heteroskedasticity and autocorrelation. Coverage includes omitted variables, binary response models, basic time series, and simultaneous equations. The authors teach students how to construct their own real-world data sets drawn from the internet, which they can analyze with Excel (R) or with other econometric software. The accompanying web site with text support can be found at www.wabash.edu/econometrics.
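The Monte Carlo approach the book implements in Excel can be sketched equivalently in Python: draw fresh errors repeatedly, re-estimate OLS each time, and inspect the sampling distribution of the slope. All parameter values here are illustrative assumptions, not examples from the text.

```python
import numpy as np

# Data generating process (assumed): y = beta0 + beta1 * x + error.
rng = np.random.default_rng(42)
beta0, beta1, n, reps = 2.0, 0.5, 100, 2000
x = rng.uniform(0, 10, n)          # regressor held fixed across replications

slopes = []
for _ in range(reps):
    y = beta0 + beta1 * x + rng.normal(0, 1, n)      # new error draw each run
    b1 = np.cov(x, y, bias=True)[0, 1] / np.var(x)   # OLS slope estimate
    slopes.append(b1)
slopes = np.array(slopes)
```

A histogram of `slopes` centers on the true value 0.5, conveying the unbiasedness of the OLS estimator exactly as the book's repeated-sampling exercises do.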
This book analyzes the evolution of monetary policy in Rwanda since it was first implemented by the National Bank of Rwanda in 1964, when the bank was established. It contributes to the understanding of monetary policy formulation and implementation at different stages of development of a financial system comprising the financial market (money market and capital market), financial intermediaries such as commercial banks, and financial sector infrastructures such as payment systems and the credit reference bureau. Through a number of case studies, the book presents applied empirical research assessing the key assumptions of a monetary targeting framework, namely the stability of the money multiplier and of money demand, using time series econometrics. Presenting a detailed empirical analysis of the monetary transmission mechanism, one of the most analyzed topics in central banks in advanced economies, this book is a valuable read for central bankers and other researchers of monetary policy, particularly in developing economies.
The contents of this volume comprise the proceedings of the International Symposia in Economic Theory and Econometrics conference held in 1987 at the IC² (Innovation, Creativity, and Capital) Institute at the University of Texas at Austin. The essays present fundamental new research on the analysis of complicated outcomes in relatively simple macroeconomic models. The book covers econometric modelling and time series analysis techniques in five parts. Part I focuses on sunspot equilibria, the study of uncertainty generated by nonstochastic economic models. Part II examines the more traditional examples of deterministic chaos: bubbles, instability, and hyperinflation. Part III contains the most current literature dealing with empirical tests for chaos and strange attractors. Part IV deals with chaos and informational complexity. Part V, Nonlinear Econometric Modelling, includes tests for and applications of nonlinearity.
Composed in honour of the sixty-fifth birthday of Lloyd Shapley, this volume makes accessible the large body of work that has grown out of Shapley's seminal 1953 paper. Each of the twenty essays concerns some aspect of the Shapley value. Three of the chapters are reprints of the 'ancestral' papers: Chapter 2 is Shapley's original 1953 paper defining the value; Chapter 3 is the 1954 paper by Shapley and Shubik applying the value to voting models; and Chapter 19 is Shapley's 1969 paper defining a value for games without transferable utility. The other seventeen chapters were contributed especially for this volume. The first chapter introduces the subject and the other essays in the volume, and contains a brief account of a few of Shapley's other major contributions to game theory. The other chapters cover the reformulations, interpretations and generalizations that have been inspired by the Shapley value, and its applications to the study of coalition formation, to the organization of large markets, to problems of cost allocation, and to the study of games in which utility is not transferable.
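The Shapley value defined in Shapley's 1953 paper is each player's average marginal contribution over all orderings of the players. A minimal sketch, using a hypothetical three-player "glove game" that is not an example from the volume:

```python
from itertools import permutations

def shapley(players, v):
    """Shapley value by enumerating all player orderings:
    each player's payoff is its average marginal contribution v(S+p) - v(S)."""
    value = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = frozenset()
        for p in order:
            value[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: value[p] / len(perms) for p in players}

# Glove game (assumed example): player 1 holds a left glove, players 2 and 3
# right gloves; a coalition is worth 1 only if it can form a matched pair.
def v(s):
    return 1.0 if 1 in s and (2 in s or 3 in s) else 0.0

phi = shapley([1, 2, 3], v)
```

Here the scarce left glove earns player 1 a value of 2/3, while the two interchangeable right-glove holders get 1/6 each; the values sum to the grand coalition's worth of 1, reflecting the efficiency axiom.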
The main objective of this 2002 book is to show that behind the bewildering diversity of historical speculative episodes it is possible to find hidden regularities, thus preparing the way for a unified theory of market speculation. Speculative bubbles require the study of various episodes in order for a comparative perspective to be obtained and the analysis developed in this book follows a few simple but unconventional ideas. Investors are assumed to exhibit the same basic behavior during speculative episodes whether they trade stocks, real estate, or postage stamps. The author demonstrates how some of the basic concepts of dynamical system theory, such as the notions of impulse response, reaction times and frequency analysis, play an instrumental role in describing and predicting speculative behavior. This book will serve as a useful introduction for students of econophysics, and readers with a general interest in economics as seen from the perspective of physics.
This 2005 volume contains the papers presented in honor of the lifelong achievements of Thomas J. Rothenberg on the occasion of his retirement. The authors of the chapters include many of the leading econometricians of our day, and the chapters address topics of current research significance in econometric theory. The chapters cover four themes: identification and efficient estimation in econometrics; asymptotic approximations to the distributions of econometric estimators and tests; inference involving potentially nonstationary time series, such as processes that might have a unit autoregressive root; and nonparametric and semiparametric inference. Several of the chapters provide overviews and treatments of basic conceptual issues, while others advance our understanding of the properties of existing econometric procedures or propose new ones. Specific topics include identification in nonlinear models, inference with weak instruments, tests for nonstationarity in time series and panel data, generalized empirical likelihood estimation, and the bootstrap.
This 2005 volume brings together twelve papers by many of the most prominent applied general equilibrium modelers honoring Herbert Scarf, the father of equilibrium computation in economics. It deals with developments in applied general equilibrium, a field which has broadened greatly since the 1980s. The contributors discuss some traditional as well as some modern topics in the field, including non-convexities in economy-wide models, tax policy, developmental modeling and energy modeling. The book also covers a range of distinct approaches, conceptual issues and computational algorithms, such as calibration and areas of application such as macroeconomics of real business cycles and finance. An introductory chapter written by the editors maps out issues and scenarios for the future evolution of applied general equilibrium.
This book is intended for use in a rigorous introductory Ph.D.-level course in econometrics, or in a field course in econometric theory. It covers the measure-theoretic foundations of probability theory, the multivariate normal distribution with its application to classical linear regression analysis, various laws of large numbers, central limit theorems and related results for independent random variables as well as for stationary time series, with applications to asymptotic inference of M-estimators, and maximum likelihood theory. Some chapters have their own appendices containing the more advanced topics and/or difficult proofs. Moreover, there are three appendices with material that is assumed known. Appendix I contains a comprehensive review of linear algebra, including all the proofs. Appendix II reviews a variety of mathematical topics and concepts that are used throughout the main text, and Appendix III reviews complex analysis. Therefore, this book is uniquely self-contained.
You may like...
- Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover, R3,567)
- Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover, R4,258)
- The Oxford Handbook of the Economics of… by Yann Bramoulle, Andrea Galeotti, … (Hardcover, R5,455)
- Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover, R3,286)
- Applied Econometric Analysis - Emerging… by Brian W Sloboda, Yaya Sissoko (Hardcover, R5,351)
- Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
- Macroeconomics and the Real World… by Roger E. Backhouse, Andrea Salanti (Hardcover, R4,296)
- Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover, R2,160)