Will history repeat itself, leaving Saudi Arabia to face another financial crisis due to drastic overspending and/or a dramatic drop in oil revenue? If the situation remains on its current trajectory, by 2030 government debt, driven by expenditures rising faster than revenues, will be too overwhelming for the government to cope with.
This book is a comprehensive introduction to simulation and modelling techniques and their application in the management of organisations. It is rooted in a thorough understanding of systems theory as applied to organisations and focuses on how this theory applies to econometric models used in organisational management. The econometric models in this book employ linear and dynamic programming, graph theory, queuing theory, game theory, and related methods, and are presented and analysed in various fields of application, such as investment management, stock management, strategic decision making, management of production costs and the lifecycle costs of quality and non-quality products, and production quality management.
In recent years, nonlinearities have gained increasing importance in economic and econometric research, particularly since the financial crisis and the economic downturn after 2007. This book contains theoretical, computational and empirical papers that incorporate nonlinearities into econometric models and apply them to real economic problems. It is intended to serve as an inspiration for researchers to take potential nonlinearities into account. Researchers should be aware of the risk of spuriously applying linear model types to problems that include nonlinear features. Using the correct model type is indispensable in order to avoid biased recommendations for economic policy.
This is a book on deterministic and stochastic growth theory and the computational methods needed to produce numerical solutions. Exogenous and endogenous growth models are thoroughly reviewed. Special attention is paid to the use of these models for fiscal and monetary policy analysis. Modern Business Cycle Theory, New Keynesian Macroeconomics, and the class of Dynamic Stochastic General Equilibrium models can all be considered as special cases of models of economic growth, and they can be analyzed by the theoretical and numerical procedures provided in the textbook. Analytical discussions are presented in full detail. The book is self-contained and designed so that the student advances in the theoretical and the computational issues in parallel. Excel and Matlab files are provided on an accompanying website (see Preface to the Second Edition) to illustrate theoretical results as well as to simulate the effects of economic policy interventions. The structure of these program files is described in "Numerical exercise" sections, where the output of these programs is also interpreted. The second edition corrects a few typographical errors and improves some notation.
The global demographic transition presents marked asymmetries, as poor, emerging, and advanced countries are undergoing different stages of transition. Emerging countries are demographically younger than advanced economies. This youth is favorable to growth and generates a demographic dividend. However, the future of emerging economies will bring a decline in the working-age share and a rise in the older population, as is the case in today's developed world. Hence, developing countries must get rich before getting old, while advanced economies must try not to become poorer as they age. Asymmetric Demography and the Global Economy contributes to our understanding of why this demographic transition matters, showing how domestic macroeconomics and global capital movements affect asset accumulation, growth potential, the current account, and the economy's international investment position. This collaborative collection approaches these questions from the perspective of "systemically important" emerging countries, i.e., members of the G20, but considers both the national and the global sides of the problem.
Increasing concerns regarding the world's natural resources and sustainability continue to be a major issue for global development. As a result, several political initiatives and strategies for green or resource-efficient growth, at both national and international levels, have been proposed. A core element of these initiatives is the promotion of an increase in resource or material productivity. This dissertation examines material productivity developments in the OECD and BRICS countries between 1980 and 2008. By applying the concept of convergence, stemming from economic growth theory, to material productivity, the analysis provides insights into two aspects: material productivity developments in general, as well as potentials for accelerated improvements in material productivity, which may consequently allow a reduction of material use globally. The results of the convergence analysis underline the importance of policy-making with regard to technology and innovation policy, enabling the production of resource-efficient products and services as well as technology transfer and diffusion.
The purpose of this book is to establish a connection between the traditional field of empirical economic research and the emerging area of empirical financial research, and to build a bridge between theoretical developments in these areas and their application in practice. Accordingly, it covers broad topics in the theory and application of both empirical economic and financial research, including analysis of time series and the business cycle; different forecasting methods; new models for volatility, correlation and high-frequency financial data; and new approaches to panel regression, as well as a number of case studies. Most of the contributions reflect the state of the art on their respective subjects. The book offers a valuable reference work for researchers, university instructors, practitioners, government officials and graduate and post-graduate students, as well as an important resource for advanced seminars in empirical economic and financial research.
This peer-reviewed volume is part of an annual series dedicated to the presentation and discussion of state-of-the-art studies in the application of management science to significant managerial decision-making problems. It is hoped that this research annual will significantly aid the dissemination of actual applications of management science in both the public and private sectors. Volume 11 is directed toward the applications of mathematical programming to (1) multi-criteria decision making, (2) supply chain management, (3) performance management, and (4) risk analysis. It can be used in university classes in management science and operations research (in both management and engineering schools), and will serve researchers and practitioners in these fields alike.
The objective of this book is the discussion and the practical illustration of techniques used in applied macroeconometrics. There are currently three competing approaches: the LSE (London School of Economics) approach, the VAR approach, and the intertemporal optimization/Real Business Cycle approach. This book discusses and illustrates the empirical research strategy of these three alternative approaches, pairing them with extensive discussions and replications of the relevant empirical work. Common benchmarks are used to evaluate the alternative approaches.
This textbook provides a coherent introduction to the main concepts and methods of one-parameter statistical inference. Intended for students of Mathematics taking their first course in Statistics, the focus is on Statistics for Mathematicians rather than on Mathematical Statistics. The goal is not to focus on the mathematical/theoretical aspects of the subject, but rather to provide an introduction to the subject tailored to the mindset and tastes of Mathematics students, who are sometimes turned off by the informal nature of Statistics courses. This book can be used as the basis for an elementary semester-long first course on Statistics with a firm sense of direction that does not sacrifice rigor. The deeper goal of the text is to attract the attention of promising Mathematics students.
A Hands-On Approach to Understanding and Using Actuarial Models. Computational Actuarial Science with R provides an introduction to the computational aspects of actuarial science. Using simple R code, the book helps you understand the algorithms involved in actuarial computations. It also covers more advanced topics, such as parallel computing and embedded C/C++ code. After an introduction to the R language, the book is divided into four parts. The first addresses methodology and statistical modeling issues. The second part discusses the computational facets of life insurance, including life contingencies calculations and prospective life tables. Focusing on finance from an actuarial perspective, the next part presents techniques for modeling stock prices, nonlinear time series, yield curves, interest rates, and portfolio optimization. The last part explains how to use R to deal with computational issues of nonlife insurance. Taking a do-it-yourself approach to understanding algorithms, this book demystifies the computational aspects of actuarial science. It shows that even complex computations can usually be done without too much trouble. Datasets used in the text are available in an R package (CASdatasets).
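The life contingencies calculations mentioned above can be sketched briefly. The book itself works in R; this Python sketch only mirrors the basic idea of an expected-present-value calculation, using made-up one-year mortality rates and an illustrative interest rate, none of which come from the book:

```python
# Minimal sketch: expected present value (EPV) of a whole-life annuity-due
# paying 1 per year, from a table of one-year death probabilities q_x.
# All numbers below are hypothetical, for illustration only.

def annuity_due_epv(qx, i):
    """EPV of an annuity-due: sum of v^t * t_p_x over remaining lifetime.

    qx : one-year death probabilities q_x, q_{x+1}, ... (last entry 1.0
         means no one survives past the end of the table)
    i  : flat annual effective interest rate
    """
    v = 1.0 / (1.0 + i)
    epv, tpx = 0.0, 1.0      # t_p_x starts at 1: the time-0 payment is certain
    for t, q in enumerate(qx):
        epv += (v ** t) * tpx
        tpx *= (1.0 - q)     # probability of surviving one more year
    return epv

# Illustrative (made-up) 4-year mortality table at 10% interest:
print(round(annuity_due_epv([0.1, 0.2, 0.3, 1.0], 0.10), 4))
```

The same discounted-survival-probability loop underlies life insurance EPVs as well; only the cash-flow pattern changes.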
Against the backdrop of the impressive progress made by the Indian economy during the two decades following the large-scale economic reforms of the early 1990s, this book evaluates the performance of the economy on some income and non-income dimensions of development at the national, state and sectoral levels. It examines regional economic growth and inequality in income originating from agriculture, industry and services. In view of the importance of the agricultural sector, despite its declining share in gross domestic product, it evaluates the performance of agricultural production and the impact of agricultural reforms on spatial integration of food grain markets. It studies rural poverty, analyzing the trend in employment, the trickle-down process and the inclusiveness of growth in rural India. It also evaluates the impact of microfinance, as an instrument of financial inclusion, on the socio-economic conditions of rural households. Lastly, it examines the relative performance of fifteen major states of India in terms of education, health and human development. An important feature of the book is that it approaches these issues by rigorously applying advanced econometric methods, focusing primarily on regional disparities in the post-reform period vis-a-vis the pre-reform period. It offers important results to guide policies for future development.
This book focuses on general frameworks for modeling heavy-tailed distributions in economics, finance, econometrics, statistics, risk management and insurance. A central theme is that of (non-)robustness, i.e., the fact that the presence of heavy tails can either reinforce or reverse the implications of a number of models in these fields, depending on the degree of heavy-tailedness. These results motivate the development and application of robust inference approaches under heavy tails, heterogeneity and dependence in observations. Several recently developed robust inference approaches are discussed and illustrated, together with applications.
Originally published in 1939, this book forms the second part of a two-volume series on the mathematics required for the examinations of the Institute of Actuaries, focusing on finite differences, probability and elementary statistics. Miscellaneous examples are included at the end of the text. This book will be of value to anyone with an interest in actuarial science and mathematics.
Economic Modeling Using Artificial Intelligence Methods examines the application of artificial intelligence methods to model economic data. Traditionally, economic modeling has been carried out in the linear domain, where the principles of superposition are valid. Applying artificial intelligence to economic modeling allows for flexible multi-order non-linear modeling. In addition, game theory has largely been applied in economic modeling; however, the inherent limitation of game theory when dealing with many-player games encourages the use of multi-agent systems for modeling economic phenomena. The artificial intelligence techniques used to model economic data include multi-layer perceptron neural networks, radial basis functions, support vector machines, rough sets, genetic algorithms, particle swarm optimization, simulated annealing, multi-agent systems, incremental learning, and fuzzy networks. Signal processing techniques are explored to analyze economic data; these include time domain methods, time-frequency domain methods and fractal dimension approaches. Interesting economic problems such as causality versus correlation, simulating the stock market, modeling and controlling inflation, option pricing, modeling economic growth, and portfolio optimization are examined. The relationship between economic dependency and interstate conflict is explored, and knowledge of how economics can be used to foster peace - and vice versa - is investigated. The book deals with the issue of causality in the non-linear domain and applies automatic relevance determination, the evidence framework, the Bayesian approach and Granger causality to understand causality and correlation. Economic Modeling Using Artificial Intelligence Methods makes an important contribution to the area of econometrics and is a valuable source of reference for graduate students, researchers and financial practitioners.
The book is a collection of essays in honour of Clive Granger. The chapters are by some of the world's leading econometricians, all of whom have collaborated with or studied with (or both) Clive Granger. Central themes of Granger's work are reflected in the book with attention to tests for unit roots and cointegration, tests of misspecification, forecasting models and forecast evaluation, non-linear and non-parametric econometric techniques, and overall, a careful blend of practical empirical work and strong theory. The book shows the scope of Granger's research and the range of the profession that has been influenced by his work.
Mathematical Statistics for Economics and Business, Second Edition, provides a comprehensive introduction to the principles of mathematical statistics which underpin statistical analyses in the fields of economics, business, and econometrics. The selection of topics in this textbook is designed to provide students with a conceptual foundation that will facilitate a substantial understanding of statistical applications in these subjects. This new edition has been updated throughout and now also includes a downloadable Student Answer Manual containing detailed solutions to half of the over 300 end-of-chapter problems. After introducing the concepts of probability, random variables, and probability density functions, the author develops the key concepts of mathematical statistics, most notably: expectation, sampling, asymptotics, and the main families of distributions. The latter half of the book is then devoted to the theories of estimation and hypothesis testing with associated examples and problems that indicate their wide applicability in economics and business. Features of the new edition include: a reorganization of topic flow and presentation to facilitate reading and understanding; inclusion of additional topics of relevance to statistics and econometric applications; a more streamlined and simple-to-understand notation for multiple integration and multiple summation over general sets or vector arguments; updated examples; new end-of-chapter problems; a solution manual for students; a comprehensive answer manual for instructors; and a theorem and definition map. This book has evolved from numerous graduate courses in mathematical statistics and econometrics taught by the author, and will be ideal for students beginning graduate study as well as for advanced undergraduates.
This book gives an introduction to R to build up graphing, simulating and computing skills that enable one to see theoretical and statistical models in economics in a unified way. The great advantage of R is that it is free, extremely flexible and extensible. The book addresses the specific needs of economists and helps them move up the R learning curve. It covers mathematical topics such as graphing the Cobb-Douglas function and using R to study the Solow growth model, in addition to statistical topics ranging from drawing statistical graphs to doing linear and logistic regression. It uses data that can be downloaded from the internet and that is also available in different R packages. With some treatment of basic econometrics, the book discusses quantitative economics broadly and simply, looking at models in the light of data. Students of economics or economists keen to learn how to use R will find this book very useful.
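The Solow growth model exercise mentioned above can be illustrated with a short sketch. The book uses R; this Python version, with illustrative parameter values (saving rate, depreciation rate, capital share) chosen for convenience rather than taken from the book, iterates the law of motion for capital per worker and compares it with the closed-form steady state:

```python
# Minimal Solow model sketch (no population or technology growth, for
# simplicity). Parameter values are illustrative, not from the book.

def solow_steady_state(s, delta, alpha):
    """Steady state solves s*k^alpha = delta*k, so k* = (s/delta)^(1/(1-alpha))."""
    return (s / delta) ** (1.0 / (1.0 - alpha))

def simulate(k0, s, delta, alpha, periods):
    """Iterate the law of motion k_{t+1} = s*k_t^alpha + (1 - delta)*k_t."""
    k = k0
    for _ in range(periods):
        k = s * k ** alpha + (1 - delta) * k
    return k

# With s = 0.2, delta = 0.05, alpha = 1/3, the steady state is (0.2/0.05)^1.5 = 8,
# and the simulated path converges there from any positive starting capital:
print(solow_steady_state(0.2, 0.05, 1 / 3))
print(round(simulate(1.0, 0.2, 0.05, 1 / 3, 500), 4))
```

Plotting the simulated path of k against time is the natural graphing exercise to pair with this.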
Who decides how official statistics are produced? Do politicians have control or are key decisions left to statisticians in independent statistical agencies? Interviews with statisticians in Australia, Canada, Sweden, the UK and the USA were conducted to get insider perspectives on the nature of decision making in government statistical administration. While the popular adage suggests there are 'lies, damned lies and statistics', this research shows that official statistics in liberal democracies are far from mistruths; they are consistently insulated from direct political interference. Yet, a range of subtle pressures and tensions exist that governments and statisticians must manage. The power over statistics is distributed differently in different countries, and this book explains why. Differences in decision-making powers across countries are the result of shifting pressures politicians and statisticians face to be credible, and the different national contexts that provide distinctive institutional settings for the production of government numbers.
Analyzing Event Statistics in Corporate Finance provides new alternative methodologies to increase accuracy when performing statistical tests for event studies within corporate finance. In contrast to conventional surveys or literature reviews, Jeng focuses on various methodological defects or deficiencies that lead to inaccurate empirical results, which ultimately produce bad corporate policies. This work discusses the issues of data collection and structure, the recursive smoothing for systematic components in excess returns, the choices of event windows, different time horizons for the events, and the consequences of applications of different methodologies. In providing improvement for event studies in corporate finance, and based on the fact that changes in parameters for financial time series are common knowledge, a new alternative methodology is developed to extend the conventional analysis to more robust arguments.
Discover the Benefits of Risk Parity Investing Despite recent progress in the theoretical analysis and practical applications of risk parity, many important fundamental questions still need to be answered. Risk Parity Fundamentals uses fundamental, quantitative, and historical analysis to address these issues, such as: What are the macroeconomic dimensions of risk in risk parity portfolios? What are the appropriate risk premiums in a risk parity portfolio? What are market environments in which risk parity might thrive or struggle? What is the role of leverage in a risk parity portfolio? An experienced researcher and portfolio manager who coined the term "risk parity," the author provides investors with a practical understanding of the risk parity investment approach. Investors will gain insight into the merit of risk parity as well as the practical and underlying aspects of risk parity investing.
Gini's mean difference (GMD) was first introduced by Corrado Gini in 1912 as an alternative measure of variability. GMD and the parameters which are derived from it (such as the Gini coefficient or the concentration ratio) have been in use in the area of income distribution for almost a century. In practice, the use of GMD as a measure of variability is justified whenever the investigator is not ready to impose, without questioning, the convenient world of normality. This makes the GMD of critical importance in the complex research of statisticians, economists, econometricians, and policy makers. This book focuses on imitating analyses that are based on variance by replacing variance with the GMD and its variants. In this way, the text showcases how almost everything that can be done with the variance as a measure of variability can be replicated by using Gini. Beyond this, there are marked benefits to utilizing Gini as opposed to other methods. One of the advantages of using Gini methodology is that it provides a unified system that enables the user to learn about various aspects of the underlying distribution. It also provides a systematic method and a unified terminology. Using Gini methodology can reduce the risk of imposing assumptions that are not supported by the data on the model. With these benefits in mind the text uses the covariance-based approach, though applications to other approaches are mentioned as well.
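As a rough illustration of replacing the variance with GMD, here is a short Python sketch (not from the book). It uses the pairwise definition with an n(n-1) denominator; note that some texts divide by n^2 instead, which gives slightly different values:

```python
# Sketch: Gini's mean difference (GMD) and the Gini coefficient derived
# from it. The toy income data are made up for illustration.
from itertools import combinations

def gmd(xs):
    """GMD: average of |x_i - x_j| over all ordered pairs i != j.

    combinations() enumerates each unordered pair once, so the sum is
    doubled before dividing by n*(n-1).
    """
    n = len(xs)
    total = sum(abs(a - b) for a, b in combinations(xs, 2))
    return 2.0 * total / (n * (n - 1))

def gini_coefficient(xs):
    """Concentration ratio GMD / (2 * mean), for positive-valued data."""
    mean = sum(xs) / len(xs)
    return gmd(xs) / (2.0 * mean)

incomes = [10, 20, 30, 40]   # toy income data
print(round(gmd(incomes), 4), round(gini_coefficient(incomes), 4))
```

Where a variance-based analysis would report a spread and a coefficient of variation, the Gini methodology substitutes GMD and the concentration ratio, which is the replacement strategy the book develops systematically.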
Delving into the connections between renewable energy and economics at an international level, this book focuses specifically on hydropower and geothermal power production for use in power-intensive industry. It takes readily available government and international statistics to provide insight into how businesses and economists can interpret the factors that influence the growth of power-intensive industries. It also discusses the CarbFix and SulFix projects, which involve injecting hydrogen sulphide (H2S) and carbon dioxide (CO2) back into the reservoir as an emission reduction method. With improved engineering processes, both types of power generation are increasingly subject to economies of scale. These exciting technological developments have great potential to change the way the world works, as the economy continues to rely so heavily on energy to drive production. Green energy is without question going to be a major factor in our future, so studying it at its nascence is particularly exciting. This book is intended for academic researchers and students interested in current economic and environmental hot topics, as well as people interested in the inner workings of a possible new investment opportunity.
This book investigates the existence of stochastic and deterministic convergence of real output per worker and the sources of output (physical capital per worker, human capital per worker, total factor productivity -TFP- and average annual hours worked) in 21 OECD countries over the period 1970-2011. Towards this end, the authors apply a large battery of panel unit root and stationarity tests, all of which are robust to the presence of cross-sectional dependence. The evidence fails to provide clear-cut evidence of convergence dynamics either in real GDP per worker or in the series of the sources of output. Due to some limitations associated with second-generation panel unit root and stationarity tests, the authors further use the more flexible PANIC approach which provides evidence that real GDP per worker, real physical capital per worker, human capital and average annual hours exhibit some degree of deterministic convergence, whereas TFP series display a high degree of stochastic convergence.
A unique and comprehensive source of information, this book is the only international publication providing economists, planners, policy makers and business people with worldwide statistics on current performance and trends in the manufacturing sector. The Yearbook is designed to facilitate international comparisons relating to manufacturing activity and industrial development and performance. It provides data which can be used to analyze patterns of growth and related long-term trends, structural change and industrial performance in individual industries. Statistics on employment patterns, wages, consumption and gross output and other key indicators are also presented. Contents: Introduction; Part I: Summary Tables (1.1 The Manufacturing Sector; 1.2 The Manufacturing Branches); Part II: Country Tables.