Welcome to Loot.co.za!
This book deals with the application of wavelet and spectral methods for the analysis of nonlinear and dynamic processes in economics and finance. It reflects some of the latest developments in the area of wavelet methods applied to economics and finance. The topics include business cycle analysis, asset prices, financial econometrics, and forecasting. An introductory paper by James Ramsey, providing a personal retrospective of a decade's research on wavelet analysis, offers an excellent overview of the field.
A guide for economics, statistics and finance that explores the mathematical foundations underlying econometric methods. An Introduction to Econometric Theory offers a text to help in the mastery of the mathematics that underlies econometric methods and includes a detailed study of matrix algebra and distribution theory. Designed to be an accessible resource, the text explains in clear language why things are being done, and how previous material informs a current argument. The style is deliberately informal, with numbered theorems and lemmas avoided; however, very few technical results are quoted without some form of explanation, demonstration or proof. The author, a noted expert in the field, covers a wealth of topics including: simple regression, basic matrix algebra, the general linear model, distribution theory, the normal distribution, properties of least squares, unbiasedness and efficiency, eigenvalues, statistical inference in regression, t and F tests, the partitioned regression, specification analysis, random regressor theory, introduction to asymptotics and maximum likelihood. Each chapter is supplied with a collection of exercises, some straightforward and others more challenging. This important text: * Presents a guide for teaching econometric methods to undergraduate and graduate students of economics, statistics or finance * Offers proven classroom-tested material * Contains sets of exercises that accompany each chapter * Includes a companion website that hosts additional materials, a solution manual and lecture slides. Written for undergraduates and graduate students of economics, statistics or finance, An Introduction to Econometric Theory is an essential beginner's guide to the underpinnings of econometrics.
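As a rough illustration of the matrix-algebra core of topics like these (a sketch with synthetic data, not code from the book), ordinary least squares estimates, standard errors and t statistics can all be computed directly from the formula beta_hat = (X'X)^(-1) X'y:

```python
import numpy as np

# Hypothetical data: y depends linearly on one regressor plus noise.
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 1.5 + 2.0 * x + rng.normal(scale=0.5, size=n)

# Design matrix with an intercept column.
X = np.column_stack([np.ones(n), x])

# Least-squares estimator: beta_hat = (X'X)^{-1} X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Unbiased error-variance estimate and the usual standard errors.
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))

# t statistics for H0: beta_j = 0.
t_stats = beta_hat / se
print(beta_hat, t_stats)
```

Solving the normal equations with `np.linalg.solve` rather than forming the inverse explicitly is the numerically preferred route; the inverse is kept here only because its diagonal yields the standard errors.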
This book provides the reader with user-friendly applications of the normal distribution. In several variables it is called the multinormal distribution, which is often handled using matrices for convenience. The author seeks to make the arguments less abstract and hence starts with the univariate case and moves progressively toward the vector and matrix cases. The approach used in the book is a gradual one, going from one scalar variable to a vector variable and then to a matrix variable. The author presents the unified aspect of the normal distribution, and also addresses several other issues, including random matrix theory in physics. Other well-known applications are discussed as well, such as Herrnstein and Murray's argument that human intelligence is substantially influenced by both inherited and environmental factors, and that it is a better predictor of many personal dynamics, including financial income, job performance, birth out of wedlock, and involvement in crime, than an individual's parental socioeconomic status or education level; this argument deserves to be mentioned and discussed.
Following the recent publication of the award-winning and much acclaimed "The New Palgrave Dictionary of Economics," second edition, which brings together Nobel Prize winners and the brightest young scholars to survey the discipline, we are pleased to announce "The New Palgrave Economics Collection." Due to demand from the economics community, these books address key subject areas within the field. Each title is comprised of specially selected articles from the Dictionary and covers a fundamental theme within the discipline. All of the articles have been specifically chosen by the editors of the Dictionary, Steven N. Durlauf and Lawrence E. Blume, and are written by leading practitioners in the field. The Collections provide the reader with easy-to-access information on complex and important subject areas, and allow individual scholars and students to have their own personal reference copy.
A careful basic theoretical and econometric analysis of the factors determining the real exchange rates of Canada, the U.K., Japan, France and Germany with respect to the United States is conducted. The resulting conclusion is that real exchange rates are almost entirely determined by real factors relating to growth and technology, such as oil and commodity prices, international allocations of world investment across countries, and underlying terms of trade changes. Unanticipated money supply shocks, calculated in five alternative ways, have virtually no effects. A Blanchard-Quah VAR analysis also indicates that the effects of real shocks predominate over monetary shocks by a wide margin. The implications of these facts for the conduct of monetary policy in countries outside the U.S. are then explored, leading to the conclusion that all countries, to avoid exchange rate overshooting, have tended to automatically follow the same monetary policy as the United States. The history of world monetary policy is reviewed along with the determination of real exchange rates within the Euro Area.
This book investigates why economics makes less visible progress over time than scientific fields with a strong practical component, where interactions with physical technologies play a key role. The thesis of the book is that the main impediment to progress in economics is "false feedback", which it defines as the false result of an empirical study, such as empirical evidence produced by a statistical model that violates some of its assumptions. In contrast to scientific fields that work with physical technologies, false feedback is hard to recognize in economics. Economists thus have difficulties knowing where they stand in their inquiries, and false feedback will regularly lead them in the wrong directions. The book searches for the reasons behind the emergence of false feedback. It thereby contributes to a wider discussion in the field of metascience about the practices of researchers when pursuing their daily business. The book thus offers a case study of metascience for the field of empirical economics. The main strength of the book is the numerous smaller insights it provides throughout. The book delves into deep discussions of various theoretical issues, which it illustrates with many applied examples and a wide array of references, especially to philosophy of science. The book puts flesh on complicated and often abstract subjects, particularly when it comes to controversial topics such as p-hacking. The reader gains an understanding of the main challenges present in empirical economic research and also the possible solutions. The main audience of the book is applied researchers working with data and, in particular, those who have found certain aspects of their research practice problematic.
Applied econometricians are often faced with data that are less than ideal. The data may be observed with gaps, a model may suggest variables that are observed at different frequencies, and sometimes econometric results are very fragile to the inclusion or omission of just a few observations in the sample. Papers in this volume discuss new econometric techniques for addressing these problems.
In this book leading German econometricians in different fields present survey articles of the most important new methods in econometrics. The book gives an overview of the field and it shows progress made in recent years and remaining problems.
There are many problems regarding poverty, inequality and growth in developing countries in Asia and Africa. Policy makers at the national level and at international institutions such as the United Nations, World Bank, International Monetary Fund and others have implemented various policies in order to decrease poverty and inequality. This book provides empirical observations on Asian countries and Africa. Each chapter provides theoretical and empirical analysis on regional case studies with an emphasis on policy implications. The book will be of use to many who wish to assess and improve policies in developing countries and mitigate poverty and inequality, and stimulate growth, by drawing on relevant empirical research and economic theories. Clearly, there have been numerous policy failures and the book aims to provide a basis for improving policies and outcomes based on relevant empirical observations.
Co-integration, equilibrium and equilibrium correction are key concepts in modern applications of econometrics to real world problems. This book provides direction and guidance to the now vast literature facing students and graduate economists. Econometric theory is linked to practical issues such as how to identify equilibrium relationships, how to deal with structural breaks associated with regime changes and what to do when variables are of different orders of integration.
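To make the cointegration idea concrete, here is a minimal sketch (with simulated data, not code from the book) of the first step of an Engle-Granger style analysis: two integrated series share a stochastic trend, so the residual from the equilibrium regression is stationary even though each series wanders:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500

# x is a random walk (integrated of order one).
x = np.cumsum(rng.normal(size=T))

# y inherits x's stochastic trend: y_t = 2*x_t + stationary error,
# so y and x are cointegrated with cointegrating vector (1, -2).
y = 2.0 * x + rng.normal(size=T)

# Engle-Granger first step: estimate the equilibrium relation by OLS.
X = np.column_stack([np.ones(T), x])
beta = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta

# If the series are cointegrated, the residual is stationary: its
# variance stays bounded while the variance of x grows with T.
print(beta[1], resid.var(), x.var())
```

In practice the second step is a unit-root test on `resid` (e.g. an augmented Dickey-Fuller test with Engle-Granger critical values); that step is omitted here to keep the sketch dependency-free.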
* Includes many mathematical examples and problems that let students work directly with both standard and nonstandard models of behaviour, developing problem-solving and critical-thinking skills that are more valuable than memorized content, which is quickly forgotten.
* The applications explored in the text emphasise issues of inequality, social mobility, culture and poverty to demonstrate the impact of behavioral economics in the areas students are most passionate about.
* The text has a standardized structure (6 parts, 3 chapters in each) which provides a clear and consistent roadmap for students taking the course.
Law and economics research has had an enormous impact on the laws of contracts, torts, property, crimes, corporations, and antitrust, as well as public regulation and fundamental rights. The Law and Economics of Patent Damages, Antitrust, and Legal Process examines several areas of important research by a variety of international scholars. It contains technical papers on the appropriate way to estimate damages in patent disputes, as well as methods for evaluating relevant markets and vertically integrated firms when determining the competitive effects of mergers and other actions. There are also papers on the implication of different legal processes, regulations, and liability rules on consumer welfare, which range from the impact of delays in legal decisions in labour cases in France to issues of criminal liability related to the use of artificial intelligence. This volume of Research in Law and Economics is a must-read for researchers and professionals of patent damages, antitrust, labour, and legal process.
For courses in Econometrics. A Clear, Practical Introduction to Econometrics. Using Econometrics: A Practical Guide offers students an innovative introduction to elementary econometrics. Through real-world examples and exercises, the book covers the topic of single-equation linear regression analysis in an easily understandable format. The Seventh Edition is appropriate for all levels: beginner econometric students, regression users seeking a refresher, and experienced practitioners who want a convenient reference. Praised as one of the most important texts of the last 30 years, the book retains the clarity and practicality of previous editions while adding a number of substantial improvements throughout.
This book scientifically tests the assertion that accommodative monetary policy can eliminate the "crowd out" problem, allowing fiscal stimulus programs (such as tax cuts or increased government spending) to stimulate the economy as intended. It also tests whether natural growth in the economy can cure the crowd out problem as well or better. The book is intended to be the largest-scale scientific test ever performed on this topic. It includes about 800 separate statistical tests on the U.S. economy covering different parts or all of the period 1960-2010. These tests focus on whether accommodative monetary policy, which increases the pool of loanable resources, can offset the crowd out problem as well as natural growth in the economy can. The book, employing the best scientific methods available to economists for this type of problem, concludes that accommodative monetary policy could have worked, but that until the quantitative easing program, Federal Reserve efforts to accommodate fiscal stimulus programs were not large enough to offset more than 23% to 44% of any one year's crowd out problem. That provides the science part of the answer as to why accommodative monetary policy didn't accommodate: too little of it was tried. The book also tests whether other increases in loanable funds, occurring because of natural growth in the economy or changes in the savings rate, can also offset crowd out. It concludes they can, and that these changes tend to be several times as effective as accommodative monetary policy. This book's companion volume, Why Fiscal Stimulus Programs Fail, explores the policy implications of these results.
This book has taken form over several years as a result of a number of courses taught at the University of Pennsylvania and at Columbia University and a series of lectures I have given at the International Monetary Fund. Indeed, I began writing down my notes systematically during the academic year 1972-1973 while at the University of California, Los Angeles. The diverse character of the audience, as well as my own conception of what an introductory and often terminal acquaintance with formal econometrics ought to encompass, have determined the style and content of this volume. The selection of topics and the level of discourse give sufficient variety so that the book can serve as the basis for several types of courses. As an example, a relatively elementary one-semester course can be based on Chapters one through five, omitting the appendices to these chapters and a few sections in some of the chapters so indicated. This would acquaint the student with the basic theory of the general linear model, some of the problems often encountered in empirical research, and some proposed solutions. For such a course, I should also recommend a brief excursion into Chapter seven (logit and probit analysis) in view of the increasing availability of data sets for which this type of analysis is more suitable than that based on the general linear model.
Financial econometrics combines mathematical and statistical theory and techniques to understand and solve problems in financial economics. Modeling and forecasting financial time series, such as prices, returns, interest rates, financial ratios, and defaults, are important parts of this field. In Financial Econometrics, you'll be introduced to this growing discipline and the concepts associated with it--from background material on probability theory and statistics to information regarding the properties of specific models and their estimation procedures. With this book as your guide, you'll become familiar with: Autoregressive conditional heteroskedasticity (ARCH) and GARCH modeling Principal components analysis (PCA) and factor analysis Stable processes and ARMA and GARCH models with fat-tailed errors Robust estimation methods Vector autoregressive and cointegrated processes, including advanced estimation methods for cointegrated systems And much more The experienced author team of Svetlozar Rachev, Stefan Mittnik, Frank Fabozzi, Sergio Focardi, and Teo Jasic not only presents you with an abundant amount of information on financial econometrics, but they also walk you through a wide array of examples to solidify your understanding of the issues discussed. Filled with in-depth insights and expert advice, Financial Econometrics provides comprehensive coverage of this discipline and clear explanations of how the models associated with it fit into today's investment management process.
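As an illustration of the GARCH modeling discussed above (a sketch with hypothetical parameters, not code from the book), the following simulates a GARCH(1,1) process and checks for the volatility clustering that makes such models useful for financial returns:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 2000

# Hypothetical GARCH(1,1) parameters (omega, alpha, beta); requiring
# alpha + beta < 1 gives a finite unconditional variance
# omega / (1 - alpha - beta).
omega, alpha, beta = 0.1, 0.1, 0.8

h = np.empty(T)  # conditional variance
r = np.empty(T)  # returns
h[0] = omega / (1 - alpha - beta)
r[0] = np.sqrt(h[0]) * rng.normal()
for t in range(1, T):
    # Recursion: today's variance responds to yesterday's squared
    # return (ARCH term) and yesterday's variance (GARCH term).
    h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    r[t] = np.sqrt(h[t]) * rng.normal()

# Volatility clustering: squared returns are positively autocorrelated
# even though the returns themselves are serially uncorrelated.
sq = r ** 2
acf1 = np.corrcoef(sq[:-1], sq[1:])[0, 1]
print(r.var(), acf1)
```

Estimating such a model from real data is the harder direction, typically done by maximum likelihood; the simulation direction shown here is the standard way to sanity-check an estimator.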
China's reform and opening-up have contributed to its long-term and rapid economic development, resulting in much stronger economic strength and a much better life for its people. Meanwhile, the deepening economic integration between China and the world has resulted in an increasingly complex environment, growing influencing factors and severe challenges for China's economic development. Under the "new normal" of the Chinese economy, accurate analysis of the economic situation is essential to scientific decision-making, to sustainable and healthy economic development, and to building a moderately prosperous society in all respects. By applying statistical and national economic accounting methods, and based on detailed statistics and national economic accounting data, this book presents an in-depth analysis of key economic fields, such as the real estate economy, the automotive industry, the high-tech industry, investment, opening-up, income distribution of residents, economic structure, balance of payments structure and financial operation, since the reform and opening-up, and especially in recent years. It aims to depict the performance and characteristics of these key economic fields and their roles in the development of the national economy, thus providing useful suggestions for economic decision-making, and facilitating the sustainable and healthy development of the economy and the realization of the goal of building a moderately prosperous society in all respects.
Complex dynamics constitute a growing and increasingly important area as they offer a strong potential to explain and formalize natural, physical, financial and economic phenomena. This book pursues the ambitious goal to bring together an extensive body of knowledge regarding complex dynamics from various academic disciplines. Beyond its focus on economics and finance, including for instance the evolution of macroeconomic growth models towards nonlinear structures as well as signal processing applications to stock markets, fundamental parts of the book are devoted to the use of nonlinear dynamics in mathematics, statistics, signal theory and processing. Numerous examples and applications, almost 700 illustrations and numerical simulations based on the use of Matlab make the book an essential reference for researchers and students from many different disciplines who are interested in the nonlinear field. An appendix recapitulates the basic mathematical concepts required to use the book.
This two-volume work aims to present as completely as possible the methods of statistical inference with special reference to their economic applications. It is a well-integrated textbook presenting a wide diversity of models in a coherent and unified framework. The reader will find a description not only of the classical concepts and results of mathematical statistics, but also of concepts and methods recently developed for the specific needs of econometrics. Although the two volumes do not demand a high level of mathematical knowledge, they do draw on linear algebra and probability theory. The breadth of approaches and the extensive coverage of this two-volume work provide for a thorough and entirely self-contained course in modern economics. Volume 1 provides an introduction to general concepts and methods in statistics and econometrics, and goes on to cover estimation and prediction. Volume 2 focuses on testing, confidence regions, model selection, and asymptotic theory.
Originally published in 1971, this is a rigorous analysis of the economic aspects of the efficiency of public enterprises at the time. The author first restates and extends the relevant parts of welfare economics, and then illustrates its application to particular cases, drawing on the work of the National Board for Prices and Incomes, of which he was Deputy Chairman. The analysis is developed stage by stage, with the emphasis on applicability and ease of comprehension, rather than on generality or mathematical elegance. Financial performance, the second-best, the optimal degree of complexity of price structures and problems of optimal quality are first discussed in a static framework. Time is next introduced, leading to a marginal cost concept derived from a multi-period optimizing model. The analysis is then related to urban transport, shipping, gas and coal. This is likely to become a standard work of more general scope than the author's earlier book on electricity supply. It rests, however, on a similar combination of economic theory and high-level experience of the real problems of public enterprises.
In this compelling 1995 book, David Hendry and Mary Morgan bring together the classic papers of the pioneer econometricians. Together, these papers form the foundations of econometric thought. They are essential reading for anyone seeking to understand the aims, method and methodology of econometrics and the development of this statistical approach in economics. However, because they are technically straightforward, the book is also accessible to students and non-specialists. An editorial commentary places the readings in their historical context and indicates the continuing relevance of these early, yet highly sophisticated, works for current econometric analysis. While this book provides a companion volume to Mary Morgan's acclaimed The History of Econometric Ideas, the editors' commentary both adds to that earlier volume and also provides a stand-alone and synthetic account of the development of econometrics.
This book provides a detailed introduction to the theoretical and methodological foundations of production efficiency analysis using benchmarking. Two of the more popular methods of efficiency evaluation are Stochastic Frontier Analysis (SFA) and Data Envelopment Analysis (DEA), both of which are based on the concept of a production possibility set and its frontier. Depending on the assumed objectives of the decision-making unit, a Production, Cost, or Profit Frontier is constructed from observed data on input and output quantities and prices. While SFA uses different maximum likelihood estimation techniques to estimate a parametric frontier, DEA relies on mathematical programming to create a nonparametric frontier. Yet another alternative is the Convex Nonparametric Frontier, which is based on the assumed convexity of the production possibility set and creates a piecewise linear frontier consisting of a number of tangent hyperplanes. Three of the papers in this volume provide a detailed and relatively easy-to-follow exposition of the underlying theory from neoclassical production economics and offer step-by-step instructions on the appropriate model to apply in different contexts and how to implement them. Of particular appeal are the instructions on (i) how to write the codes for different SFA models in Stata, (ii) how to write a VBA Macro for repetitive solution of the DEA problem for each production unit in Excel Solver, and (iii) how to write the codes for the Nonparametric Convex Frontier estimation. The three other papers in the volume are primarily theoretical and will be of interest to PhD students and researchers hoping to make methodological and conceptual contributions to the field of nonparametric efficiency analysis.
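As a toy illustration of the DEA idea (not from the volume): in the special case of one input, one output and constant returns to scale, the CCR efficiency score needs no mathematical programming at all, because the frontier is simply the ray through the best-performing unit, and each score is the unit's output/input ratio relative to the best observed ratio:

```python
import numpy as np

# Hypothetical data: one input and one output for five
# decision-making units (DMUs).
inputs = np.array([2.0, 4.0, 3.0, 5.0, 6.0])
outputs = np.array([1.0, 3.0, 2.0, 4.0, 3.0])

# Under constant returns to scale with a single input and output,
# the CCR efficiency score reduces to each DMU's productivity ratio
# divided by the best ratio in the sample.
ratios = outputs / inputs
efficiency = ratios / ratios.max()
print(efficiency)
```

With multiple inputs and outputs the same logic becomes a linear program per unit (choosing weights most favourable to that unit), which is where solvers such as Excel Solver, as mentioned above, come in.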
Tools to improve decision making in an imperfect world. The publication has been developed and fine-tuned through a decade of classroom experience, and readers will find the author's approach very engaging and accessible. There are nearly 200 examples and exercises to help readers see how effective use of Bayesian statistics enables them to make optimal decisions. MATLAB and R computer programs are integrated throughout the book. An accompanying Web site provides readers with computer code for many examples and datasets.
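As a small example of the kind of Bayesian decision problem such a course treats (the numbers, prior and decision threshold here are hypothetical, not from the book): a conjugate beta-binomial update followed by a Monte Carlo estimate of a posterior probability that drives a decision:

```python
import numpy as np

# Hypothetical decision problem: act only if the success rate theta
# exceeds 10%. Prior: theta ~ Beta(1, 1), i.e. uniform.
# Data: 18 successes observed in 100 trials.
a0, b0 = 1.0, 1.0
successes, trials = 18, 100

# Conjugate update: posterior is Beta(a0 + successes, b0 + failures).
a, b = a0 + successes, b0 + trials - successes

# Posterior mean, and a Monte Carlo estimate of P(theta > 0.10 | data),
# the quantity an expected-loss decision rule would act on.
rng = np.random.default_rng(3)
theta = rng.beta(a, b, size=100_000)
posterior_mean = a / (a + b)
p_above = (theta > 0.10).mean()
print(posterior_mean, p_above)
```

The Monte Carlo step is overkill for a beta posterior (the probability has a closed form), but it is the pattern that generalizes to the non-conjugate models such books build up to.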
This book presents recent research on predictive econometrics and big data. Gathering edited papers presented at the 11th International Conference of the Thailand Econometric Society (TES2018), held in Chiang Mai, Thailand, on January 10-12, 2018, its main focus is on predictive techniques - which directly aim at predicting economic phenomena; and big data techniques - which enable us to handle the enormous amounts of data generated by modern computers in a reasonable time. The book also discusses the applications of more traditional statistical techniques to econometric problems. Econometrics is a branch of economics that employs mathematical (especially statistical) methods to analyze economic systems, to forecast economic and financial dynamics, and to develop strategies for achieving desirable economic performance. It is therefore important to develop data processing techniques that explicitly focus on prediction. The more data we have, the better our predictions will be. As such, these techniques are essential to our ability to process huge amounts of available data.
Predicting foreign exchange rates has presented a long-standing challenge for economists. However, the recent advances in computational techniques, statistical methods, newer datasets on emerging market currencies, etc., offer some hope. While we are still unable to beat a driftless random walk model, there has been serious progress in the field. This book provides an in-depth assessment of the use of novel statistical approaches and machine learning tools in predicting foreign exchange rate movement. First, it offers a historical account of how exchange rate regimes have evolved over time, which is critical to understanding turning points in a historical time series. It then presents an overview of the previous attempts at modeling exchange rates, and how different methods fared during this process. At the core sections of the book, the author examines the time series characteristics of exchange rates and how contemporary statistics and machine learning can be useful in improving predictive power, compared to previous methods used. Exchange rate determination is an active research area, and this book will appeal to graduate-level students of international economics, international finance, open economy macroeconomics, and management. The book is written in a clear, engaging, and straightforward way, and will greatly improve access to this much-needed knowledge in the field.
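The driftless random walk benchmark mentioned above is easy to demonstrate in simulation (a hypothetical sketch, not from the book): when the true log exchange rate really is a driftless random walk, the no-change forecast beats an alternative model, here an ad hoc mean-reverting forecaster invented for the comparison:

```python
import numpy as np

rng = np.random.default_rng(4)
T = 1000

# Simulate a log exchange rate as a driftless random walk: the
# benchmark that forecasting models in this literature try to beat.
s = np.cumsum(rng.normal(scale=0.01, size=T))

# One-step-ahead forecasts: the random walk predicts "no change";
# the hypothetical alternative pulls toward the historical mean.
rw_errors, mr_errors = [], []
for t in range(100, T - 1):
    rw_errors.append(s[t + 1] - s[t])                       # forecast: s[t]
    mr_errors.append(s[t + 1] - (0.9 * s[t] + 0.1 * s[:t].mean()))

rmse_rw = np.sqrt(np.mean(np.square(rw_errors)))
rmse_mr = np.sqrt(np.mean(np.square(mr_errors)))
print(rmse_rw, rmse_mr)
```

The real research question, of course, is the reverse one: whether any model beats the no-change forecast on actual exchange rate data, which is where the statistical and machine learning tools surveyed in the book come in.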
You may like...
* Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover)
* Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
* Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover)
* Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover)
* Financial and Macroeconomic… by Francis X. Diebold, Kamil Yilmaz (Hardcover)
* The Oxford Handbook of the Economics of… by Yann Bramoulle, Andrea Galeotti, … (Hardcover)
* Applied Econometric Analysis - Emerging… by Brian W Sloboda, Yaya Sissoko (Hardcover)