This book contains some of the results from the research project "Demand for Food in the Nordic Countries," which was initiated in 1988 by Professor Olof Bolin of the Agricultural University in Ultuna, Sweden and by Professor Karl Johan Weckman, of the University of Helsinki, Finland. A pilot study was carried out by Bengt Assarsson, which in 1989 led to a successful application for a research grant from the NKJ (The Nordic Contact Body for Agricultural Research) through the national research councils for agricultural research in Denmark, Finland, Norway and Sweden. We are very grateful to Olof Bolin and Karl Johan Weckman, without whom this project would not have come about, and to the national research councils in the Nordic countries for the generous financial support we have received for this project. We have received comments and suggestions from many colleagues, and this has improved our work substantially. At the start of the project a reference group was formed, consisting of Professor Olof Bolin, Professor Anders Klevmarken, Agr. lic. Gert Aage Nielsen, Professor Karl Johan Weckman and Cand. oecon. Per Halvor Vale. Gert Aage Nielsen left the group early in the project for a position in Landbanken, and was replaced by Professor Lars Otto, while Per Halvor Vale soon joined the research staff. The reference group has given us useful suggestions and encouraged us in our work. We are very grateful to them.
Based on economic knowledge and logical reasoning, this book proposes a solution to economic recessions and offers a route for societal change to end capitalism. The author starts with a brief review of the history of economics, and then questions and rejects the trend of recent decades that has seen econometrics replace economic theory. By reviewing the different schools of economic thought and by examining the limitations of existing theories of business cycles and economic growth, the author forms a new theory to explain cyclic economic growth. According to this theory, economic recessions result from innovation scarcity, which in turn results from the flawed design of the patent system. The author suggests a new design for the patent system and envisions that the new design would bring about large economic and societal changes. Under this new patent system, the synergy of the patent and capital markets would ensure that economic recessions could be avoided and that the economy would grow at the highest speed.
The behaviour of commodity prices never ceases to amaze economists, financial analysts, industry experts, and policymakers. Unexpected swings in commodity prices used to occur infrequently but have now become a permanent feature of global commodity markets. This book is about modelling commodity price shocks. It is intended to provide insights into the theoretical, conceptual, and empirical modelling of the underlying causes of global commodity price shocks. Three main objectives motivated the writing of this book. First, to provide a variety of modelling frameworks for documenting the frequency and intensity of commodity price shocks. Second, to evaluate existing approaches used for forecasting large movements in future commodity prices. Third, to cover a wide range of aspects of global commodities including currencies, rare-hard-lustrous transition metals, agricultural commodities, energy, and health pandemics. Some attempts have already been made towards modelling commodity price shocks. However, most tend to narrowly focus on a subset of commodity markets, i.e., the agricultural commodities market and/or the energy market. In this book, the author moves the needle forward by operationalizing different models, which allow researchers to identify the underlying causes and effects of commodity price shocks. Readers also learn about different commodity price forecasting models. The author presents the topics to readers assuming little prior or specialist knowledge. Thus, the book is accessible to industry analysts, researchers, undergraduate and graduate students in economics and financial economics, academic and professional economists, investors, and financial professionals working in different sectors of the commodity markets. Another advantage of the book's approach is that readers are not only exposed to several innovative modelling techniques to add to their modelling toolbox but are also exposed to diverse empirical applications of the techniques presented.
This major collection presents a careful selection of the most important published articles in the field of financial econometrics. Starting with a review of the philosophical background, the collection covers such topics as the random walk hypothesis, long-memory processes, asset pricing, arbitrage pricing theory, variance bounds tests, term structure models, market microstructure, Bayesian methods and other statistical tools. Andrew Lo - one of the world's leading financial economists - has written an authoritative introduction, which offers a comprehensive overview of the subject and complements his selection.
Mathematical Economics is an authoritative collection of the most influential contributions essential to an understanding of this important area of economic science. These seminal papers illustrate the development of the field from its inception in the 19th century up to the present, and exhibit the power of mathematics to lead to new thinking which can illuminate the scientific structures underlying economic arguments. Many of these papers started new fields of economics, influencing deeply the way economists think about their world. They illustrate the extensive range of topics to which mathematics has been applied productively, and show the areas of mathematics which have proved valuable, including functional analysis, linear algebra, algebraic and differential topology, stochastic processes and dynamical systems. They also show the extent to which today's policy analysis rests on yesterday's mathematical economics. Anyone with an interest in economics as a science will find this collection indispensable. The collection is an essential part of any course using mathematical economics.
This book covers the basics of processing and spectral analysis of monovariate discrete-time signals. The approach is practical, the aim being to acquaint the reader with the indications for and drawbacks of the various methods and to highlight possible misuses. The book is rich in original ideas, visualized in new and illuminating ways, and is structured so that parts can be skipped without loss of continuity. Many examples are included, based on synthetic data and real measurements from the fields of physics, biology, medicine, macroeconomics etc., and a complete set of MATLAB exercises requiring no previous experience of programming is provided. Prior advanced mathematical skills are not needed in order to understand the contents: a good command of basic mathematical analysis is sufficient. Where more advanced mathematical tools are necessary, they are included in an Appendix and presented in an easy-to-follow way. With this book, digital signal processing leaves the domain of engineering to address the needs of scientists and scholars in traditionally less quantitative disciplines, now facing increasing amounts of data.
This book investigates why economics makes less visible progress over time than scientific fields with a strong practical component, where interactions with physical technologies play a key role. The thesis of the book is that the main impediment to progress in economics is "false feedback", which it defines as the false result of an empirical study, such as empirical evidence produced by a statistical model that violates some of its assumptions. In contrast to scientific fields that work with physical technologies, false feedback is hard to recognize in economics. Economists thus have difficulties knowing where they stand in their inquiries, and false feedback will regularly lead them in the wrong directions. The book searches for the reasons behind the emergence of false feedback. It thereby contributes to a wider discussion in the field of metascience about the practices of researchers when pursuing their daily business. The book thus offers a case study of metascience for the field of empirical economics. The main strength of the book is the numerous smaller insights it provides throughout. The book delves into deep discussions of various theoretical issues, which it illustrates by many applied examples and a wide array of references, especially to philosophy of science. The book puts flesh on complicated and often abstract subjects, particularly when it comes to controversial topics such as p-hacking. The reader gains an understanding of the main challenges present in empirical economic research and also the possible solutions. The main audience of the book is applied researchers working with data and, in particular, those who have found certain aspects of their research practice problematic.
Upon the backdrop of impressive progress made by the Indian economy during the last two decades after the large-scale economic reforms in the early 1990s, this book evaluates the performance of the economy on some income and non-income dimensions of development at the national, state and sectoral levels. It examines regional economic growth and inequality in income originating from agriculture, industry and services. In view of the importance of the agricultural sector, despite its declining share in gross domestic product, it evaluates the performance of agricultural production and the impact of agricultural reforms on spatial integration of food grain markets. It studies rural poverty, analyzing the trend in employment, the trickle-down process and the inclusiveness of growth in rural India. It also evaluates the impact of microfinance, as an instrument of financial inclusion, on the socio-economic conditions of rural households. Lastly, it examines the relative performance of fifteen major states of India in terms of education, health and human development. An important feature of the book is that it approaches these issues, applying rigorously advanced econometric methods, and focusing primarily on their regional disparities during the post-reform period vis-a-vis the pre-reform period. It offers important results to guide policies for future development.
The book provides an integrated approach to risk sharing, risk spreading and efficient regulation through principal agent models. It emphasizes the role of information asymmetry and risk sharing in contracts as an alternative to transaction cost considerations. It examines how contracting, as an institutional mechanism to conduct transactions, spreads risks while attempting consolidation. It further highlights the shifting emphasis in contracts from Coasian transaction cost saving to risk sharing and shows how it creates difficulties associated with risk spreading, and emphasizes the need for efficient regulation of contracts at various levels. Each of the chapters is structured using a principal agent model, and all chapters incorporate adverse selection (and exogenous randomness) as a result of information asymmetry, as well as moral hazard (and endogenous randomness) due to the self-interest-seeking behavior on the part of the participants.
Game theory has revolutionised our understanding of industrial organisation and the traditional theory of the firm. Despite these advances, industrial economists have tended to rely on a restricted set of tools from game theory, focusing on static and repeated games to analyse firm structure and behaviour. Luca Lambertini, a leading expert on the application of differential game theory to economics, argues that many dynamic phenomena in industrial organisation (such as monopoly, oligopoly, advertising, R&D races) can be better understood and analysed through the use of differential games. After illustrating the basic elements of the theory, Lambertini guides the reader through the main models, spanning from optimal control problems describing the behaviour of a monopolist through to oligopoly games in which firms' strategies include prices, quantities and investments. This approach will be of great value to students and researchers in economics and those interested in advanced applications of game theory.
This book provides the reader with user-friendly applications of the normal distribution. In several variables it is called the multinormal distribution, which is often handled using matrices for convenience. The author seeks to make the arguments less abstract and hence starts with the univariate case and moves progressively toward the vector and matrix cases. The approach used in the book is a gradual one, going from one scalar variable to a vector variable and to a matrix variable. The author presents the unified aspect of the normal distribution, as well as addresses several other issues, including random matrix theory in physics. Other well-known applications, such as Herrnstein and Murray's argument that human intelligence is substantially influenced by both inherited and environmental factors, will be discussed in this book. In their argument, intelligence is a better predictor of many personal dynamics - including financial income, job performance, birth out of wedlock, and involvement in crime - than an individual's parental socioeconomic status or education level, and this argument deserves to be mentioned and discussed.
China's reform and opening-up have contributed to its long-term and rapid economic development, resulting in a much stronger economic strength and much better life for its people. Meanwhile, the deepening economic integration between China and the world has resulted in an increasingly complex environment, growing influencing factors and severe challenges to China's economic development. Under the "new normal" of the Chinese economy, accurate analysis of the economic situation is essential to scientific decision-making, to sustainable and healthy economic development, and to building a moderately prosperous society in all respects. By applying statistical and national economic accounting methods, and based on detailed statistics and national economic accounting data, this book presents an in-depth analysis of the key economic fields, such as real estate economy, automotive industry, high-tech industry, investment, opening-up, income distribution of residents, economic structure, balance of payments structure and financial operation, since the reform and opening-up, especially in recent years. It aims to depict the performance and characteristics of these key economic fields and their roles in the development of national economy, thus providing useful suggestions for economic decision-making, and facilitating the sustainable and healthy development of the economy and the realization of the goal of building a moderately prosperous society in all respects.
Originally published in 1971, this is a rigorous analysis of the economic aspects of the efficiency of public enterprises at the time. The author first restates and extends the relevant parts of welfare economics, and then illustrates its application to particular cases, drawing on the work of the National Board for Prices and Incomes, of which he was Deputy Chairman. The analysis is developed stage by stage, with the emphasis on applicability and ease of comprehension, rather than on generality or mathematical elegance. Financial performance, the second-best, the optimal degree of complexity of price structures and problems of optimal quality are first discussed in a static framework. Time is next introduced, leading to a marginal cost concept derived from a multi-period optimizing model. The analysis is then related to urban transport, shipping, gas and coal. This is likely to become a standard work of more general scope than the author's earlier book on electricity supply. It rests, however, on a similar combination of economic theory and high-level experience of the real problems of public enterprises.
This book presents selected peer-reviewed contributions from the International Conference on Time Series and Forecasting, ITISE 2018, held in Granada, Spain, on September 19-21, 2018. The first three parts of the book focus on the theory of time series analysis and forecasting, and discuss statistical methods, modern computational intelligence methodologies, econometric models, financial forecasting, and risk analysis. In turn, the last three parts are dedicated to applied topics and include papers on time series analysis in the earth sciences, energy time series forecasting, and time series analysis and prediction in other real-world problems. The book offers readers valuable insights into the different aspects of time series analysis and forecasting, allowing them to benefit both from its sophisticated and powerful theory, and from its practical applications, which address real-world problems in a range of disciplines. The ITISE conference series provides a valuable forum for scientists, engineers, educators and students to discuss the latest advances and implementations in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing computer science, mathematics, statistics and econometrics.
This book provides an up-to-date series of advanced chapters on applied financial econometric techniques pertaining to the various fields of commodities finance, mathematics & stochastics, international macroeconomics and financial econometrics. International Financial Markets: Volume I provides a key repository on the current state of knowledge, the latest debates and recent literature on international financial markets. Against the background of the "financialization of commodities" since the 2008 sub-primes crisis, section one contains recent contributions on commodity and financial markets, pushing the frontiers of applied econometrics techniques. The second section is devoted to exchange rate and current account dynamics in an environment characterized by large global imbalances. Part three examines the latest research in the field of meta-analysis in economics and finance. This book will be useful to students and researchers in applied econometrics, and to academics and students seeking convenient access to an unfamiliar area. It will also be of great interest to established researchers seeking a single repository on the current state of knowledge, current debates and relevant literature.
This book provides an up-to-date series of advanced chapters on applied financial econometric techniques pertaining to the various fields of commodities finance, mathematics & stochastics, international macroeconomics and financial econometrics. Financial Mathematics, Volatility and Covariance Modelling: Volume 2 provides a key repository on the current state of knowledge, the latest debates and recent literature on financial mathematics, volatility and covariance modelling. The first section is devoted to mathematical finance, stochastic modelling and control optimization. Chapters explore the recent financial crisis, the increase of uncertainty and volatility, and propose an alternative approach to deal with these issues. The second section covers financial volatility and covariance modelling and explores proposals for dealing with recent developments in financial econometrics. This book will be useful to students and researchers in applied econometrics, and to academics and students seeking convenient access to an unfamiliar area. It will also be of great interest to established researchers seeking a single repository on the current state of knowledge, current debates and relevant literature.
This book primarily addresses the optimality aspects of covariate designs. A covariate model is a combination of ANOVA and regression models. Optimal estimation of the parameters of the model using a suitable choice of designs is of great importance, as such choices allow experimenters to extract maximum information for the unknown model parameters. The main emphasis of this monograph is to start with an assumed covariate model in combination with some standard ANOVA set-ups such as CRD, RBD, BIBD, GDD, BTIBD, BPEBD, cross-over, multi-factor, split-plot and strip-plot designs, treatment control designs, etc. and discuss the nature and availability of optimal covariate designs. In some situations, optimal estimations of both ANOVA and the regression parameters are provided. Global optimality and D-optimality criteria are mainly used in selecting the design. The standard optimality results of both discrete and continuous set-ups have been adapted, and several novel combinatorial techniques have been applied for the construction of optimum designs using Hadamard matrices, the Kronecker product, the Rao-Khatri product, and mixed orthogonal arrays, to name a few.
Volume 36 of Advances in Econometrics recognizes Aman Ullah's significant contributions in many areas of econometrics and celebrates his long productive career. The volume features original papers on the theory and practice of econometrics related to the work of Aman Ullah. Topics include nonparametric/semiparametric econometrics; finite sample econometrics; shrinkage methods; information/entropy econometrics; model specification testing; robust inference; and panel/spatial models. Advances in Econometrics is a research annual whose editorial policy is to publish original research articles that contain enough detail that economists and econometricians who are not experts in the topics will find them accessible and useful in their research.
This volume comprises the classic articles on methods of identification and estimation of simultaneous equations econometric models. It includes path-breaking contributions by Trygve Haavelmo and Tjalling Koopmans, who founded the subject and received Nobel prizes for their work. It presents original articles that developed and analysed the leading methods for estimating the parameters of simultaneous equations systems: instrumental variables, indirect least squares, generalized least squares, two-stage and three-stage least squares, and maximum likelihood. Many of the articles are not readily accessible to readers in any other form.
Predicting foreign exchange rates has presented a long-standing challenge for economists. However, the recent advances in computational techniques, statistical methods, newer datasets on emerging market currencies, etc., offer some hope. While we are still unable to beat a driftless random walk model, there has been serious progress in the field. This book provides an in-depth assessment of the use of novel statistical approaches and machine learning tools in predicting foreign exchange rate movement. First, it offers a historical account of how exchange rate regimes have evolved over time, which is critical to understanding turning points in a historical time series. It then presents an overview of the previous attempts at modeling exchange rates, and how different methods fared during this process. At the core sections of the book, the author examines the time series characteristics of exchange rates and how contemporary statistics and machine learning can be useful in improving predictive power, compared to previous methods used. Exchange rate determination is an active research area, and this book will appeal to graduate-level students of international economics, international finance, open economy macroeconomics, and management. The book is written in a clear, engaging, and straightforward way, and will greatly improve access to this much-needed knowledge in the field.
The recent financial crisis has heightened the need for appropriate methodologies for managing and monitoring complex risks in financial markets. The measurement, management, and regulation of risks in portfolios composed of credits, credit derivatives, or life insurance contracts is difficult because of the nonlinearities of risk models, dependencies between individual risks, and the several thousands of contracts in large portfolios. The granularity principle was introduced in the Basel regulations for credit risk to solve these difficulties in computing capital reserves. In this book, authors Patrick Gagliardini and Christian Gourieroux provide the first comprehensive overview of the granularity theory and illustrate its usefulness for a variety of problems related to risk analysis, statistical estimation, and derivative pricing in finance and insurance. They show how the granularity principle leads to analytical formulas for risk analysis that are simple to implement and accurate even when the portfolio size is large.
- Focuses on the assumptions underlying the algorithms rather than their statistical properties
- Presents cutting-edge analysis of factor models and finite mixture models
- Uses a hands-on approach to examine the assumptions made by the models and when the models fail to estimate accurately
- Utilizes interesting real-world data sets that can be used to analyze important microeconomic problems
- Introduces R programming concepts throughout the book
- Includes appendices that discuss many of the concepts introduced in the book, as well as measures of uncertainty in microeconometrics
The second edition of this widely acclaimed text presents a thoroughly up-to-date intuitive account of recent developments in econometrics. It continues to present the frontiers of research in an accessible form for non-specialist econometricians, advanced undergraduates and graduate students wishing to carry out applied econometric research. This new edition contains substantially revised chapters on cointegration and vector autoregressive (VAR) modelling, reflecting the developments that have been made in these important areas since the first edition. Special attention is given to the Dickey-Pantula approach and the testing for the order of integration of a variable in the presence of a structural break. For VAR models, impulse response analysis is explained and illustrated. There is also a detailed but intuitive explanation of the Johansen method, an increasingly popular technique. The text contains specially constructed and original tables of critical values for a wide range of tests for stationarity and cointegration. These tables are for Dickey-Fuller tests, Dickey-Hasza-Fuller and HEGY seasonal integration tests and the Perron 'additive outlier' integration test.
This is a two-volume collection of major papers which have shaped the development of econometrics. Part I includes articles which together provide an overview of the history of econometrics, Part II addresses the relationship between econometrics and statistics, the articles in Part III constitute early applied studies, and Part IV includes articles concerned with the role and method of econometrics. The work comprises 42 articles, dating from 1921 to 1991, and contributors include E.W. Gilboy, W.C. Mitchell, J.J. Spengler, R. Stone, H.O. Wold and S. Wright.
This book analyzes the relationship between technological innovation and economic development in Japan before World War II. Guan Quan deploys econometric analysis, multivariate statistical analysis and case studies from different industries to shed light on technological innovation in the Japanese context, with particular emphasis on the importance of the patent system. A great number of new inventions and patents in this period led to rapid economic growth in Japan, characterized by the simultaneous development of both traditional and modern industries. These insights help reshape the understanding of Japan's economic development and industrial advancement at an early stage and provide pointers to developing countries as to how human capital, social capabilities and thereby technological innovation can figure in economic growth. The book will appeal to academics of the East Asian economy, development economics and modern economic history, as well as general readers interested in the miracle of the Japanese economy as the first to achieve economic development and modernization among non-Western countries.