First Published in 1970. Econometric model-building, on the other hand, has been largely confined to the advanced industrialised countries. In the few cases where macro-models have been built for underdeveloped countries (e.g. the Narasimham model (112) for India) the underlying assumptions have been largely of the Keynesian type, and thus in the author's opinion unconnected with the theory of economic development. This study is a modest attempt at econometric model-building on the basis of a model of development of an underdeveloped country.
This book constitutes the first serious attempt to explain the basics of econometrics and its applications in the clearest and simplest manner possible. Recognising the fact that a good level of mathematics is no longer a necessary prerequisite for economics/financial economics undergraduate and postgraduate programmes, it introduces this key subdivision of economics to an audience who might otherwise have been deterred by its complex nature.
This two-volume set is a collection of 30 classic papers presenting ideas which have now become standard in the field of Bayesian inference. Topics covered include the central field of statistical inference as well as applications to areas of probability theory, information theory, utility theory and computational theory. It is organized into seven sections: foundations; information theory and prior distributions; robustness and outliers; hierarchical, multivariate and non-parametric models; asymptotics; computations and Monte Carlo methods; and Bayesian econometrics.
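To make the flavour of Bayesian inference concrete, here is a minimal illustrative sketch (not taken from the volume itself) of a conjugate Beta-Binomial posterior update in Python; the prior parameters and data are made up for the example.

```python
# Illustrative Bayesian updating (assumed example, not from the book):
# a Beta(a, b) prior on a success probability combined with Binomial data
# yields a Beta(a + successes, b + failures) posterior.
from scipy import stats

a_prior, b_prior = 2.0, 2.0        # prior pseudo-counts (assumptions for illustration)
successes, failures = 27, 13       # hypothetical observed data

a_post, b_post = a_prior + successes, b_prior + failures
posterior = stats.beta(a_post, b_post)

print(f"posterior mean: {posterior.mean():.3f}")
print(f"95% credible interval: {posterior.interval(0.95)}")
```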
Designed for a one-semester course, Applied Statistics for Business and Economics offers students in business and the social sciences an effective introduction to some of the most basic and powerful techniques available for understanding their world. Numerous interesting and important examples reflect real-life situations, stimulating students to think realistically in tackling these problems. Calculations can be performed using any standard spreadsheet package. To help with the examples, the author offers both actual and hypothetical databases on his website (http://iwu.edu/~bleekley). The text explores ways to describe data and the relationships found in data. It covers basic probability tools, Bayes' theorem, sampling, estimation, and confidence intervals. The text also discusses hypothesis testing for one and two samples, contingency tables, goodness-of-fit, analysis of variance, and population variances. In addition, the author develops the concepts behind the linear relationship between two numeric variables (simple regression) as well as the potentially nonlinear relationships among more than two variables (multiple regression). The final chapter introduces classical time-series analysis and how it applies to business and economics. This text provides a practical understanding of the value of statistics in the real world. After reading the book, students will be able to summarize data in insightful ways using charts, graphs, and summary statistics as well as make inferences from samples, especially about relationships.
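As a hedged illustration of the simple-regression material described above (not the author's own spreadsheet examples), the following Python sketch fits a least-squares line to a small, hypothetical data set.

```python
# Hypothetical simple-regression sketch: fit y = b0 + b1*x by least squares.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # e.g. advertising spend (made-up data)
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])   # e.g. sales (made-up data)

b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)   # slope = cov(x, y) / var(x)
b0 = y.mean() - b1 * x.mean()                          # intercept

print(f"fitted line: y = {b0:.2f} + {b1:.2f} x")
```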
Thoroughly classroom tested, this introductory text covers all the topics that constitute a foundation for basic econometrics, with concise and intuitive explanations of technical material. Important proofs are shown in detail; however, the focus is on developing regression models and understanding the residual
An accessible treatment of Monte Carlo methods, techniques, and applications in the field of finance and economics. Providing readers with an in-depth and comprehensive guide, the Handbook in Monte Carlo Simulation: Applications in Financial Engineering, Risk Management, and Economics presents a timely account of the applications of Monte Carlo methods in financial engineering and economics. Written by an international leading expert in the field, the handbook illustrates the challenges confronting present-day financial practitioners and provides various applications of Monte Carlo techniques to answer these issues. The book is organized into five parts: introduction and motivation; input analysis, modeling, and estimation; random variate and sample path generation; output analysis and variance reduction; and applications ranging from option pricing and risk management to optimization. The Handbook in Monte Carlo Simulation features:
* An introductory section for basic material on stochastic modeling and estimation aimed at readers who may need a summary or review of the essentials
* Carefully crafted examples in order to spot potential pitfalls and drawbacks of each approach
* An accessible treatment of advanced topics such as low-discrepancy sequences, stochastic optimization, dynamic programming, risk measures, and Markov chain Monte Carlo methods
* Numerous pieces of R code used to illustrate fundamental ideas in concrete terms and encourage experimentation
The Handbook in Monte Carlo Simulation: Applications in Financial Engineering, Risk Management, and Economics is a complete reference for practitioners in the fields of finance, business, applied statistics, econometrics, and engineering, as well as a supplement for MBA and graduate-level courses on Monte Carlo methods and simulation.
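For a concrete sense of the option-pricing applications mentioned above, here is a minimal Monte Carlo sketch in Python (the book itself uses R); the Black-Scholes dynamics and all parameter values are assumptions chosen for illustration.

```python
# Illustrative Monte Carlo pricing of a European call under Black-Scholes
# dynamics (parameters are assumptions for this sketch, not from the book).
import numpy as np

rng = np.random.default_rng(42)
s0, k, r, sigma, t = 100.0, 105.0, 0.03, 0.2, 1.0   # spot, strike, rate, vol, maturity
n = 100_000                                          # number of simulated terminal prices

z = rng.standard_normal(n)
st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)  # terminal prices
payoff = np.maximum(st - k, 0.0)
price = np.exp(-r * t) * payoff.mean()
stderr = np.exp(-r * t) * payoff.std(ddof=1) / np.sqrt(n)

print(f"estimated call price: {price:.3f} +/- {1.96 * stderr:.3f}")
```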
A high school student can create deep Q-learning code to control her robot, without any understanding of the meaning of 'deep' or 'Q', or why the code sometimes fails. This book is designed to explain the science behind reinforcement learning and optimal control in a way that is accessible to students with a background in calculus and matrix algebra. A unique focus is algorithm design to obtain the fastest possible speed of convergence for learning algorithms, along with insight into why reinforcement learning sometimes fails. Advanced stochastic process theory is avoided at the start by substituting random exploration with more intuitive deterministic probing for learning. Once these ideas are understood, it is not difficult to master techniques rooted in stochastic control. These topics are covered in the second part of the book, starting with Markov chain theory and ending with a fresh look at actor-critic methods for reinforcement learning.
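As a rough illustration of the tabular Q-learning idea the book builds on (not an example from the book), the following Python sketch learns to walk right along a five-state chain; the environment and parameter values are invented for the example.

```python
# Toy tabular Q-learning on a 5-state chain: actions move left/right,
# reward 1 on reaching the right end.  Purely illustrative.
import numpy as np

n_states, n_actions = 5, 2          # actions: 0 = left, 1 = right
q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.95, 0.1  # step size, discount, exploration rate
rng = np.random.default_rng(0)

for _ in range(2000):               # episodes
    s = 0
    while s != n_states - 1:
        # epsilon-greedy action selection
        a = rng.integers(n_actions) if rng.random() < eps else int(q[s].argmax())
        s_next = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        r = 1.0 if s_next == n_states - 1 else 0.0
        q[s, a] += alpha * (r + gamma * q[s_next].max() - q[s, a])
        s = s_next

# greedy actions for the non-terminal states: expect all 1s (go right)
print(q.argmax(axis=1)[:-1])
```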
Law and economics research has had an enormous impact on the laws of contracts, torts, property, crimes, corporations, and antitrust, as well as public regulation and fundamental rights. The Law and Economics of Patent Damages, Antitrust, and Legal Process examines several areas of important research by a variety of international scholars. It contains technical papers on the appropriate way to estimate damages in patent disputes, as well as methods for evaluating relevant markets and vertically integrated firms when determining the competitive effects of mergers and other actions. There are also papers on the implication of different legal processes, regulations, and liability rules on consumer welfare, which range from the impact of delays in legal decisions in labour cases in France to issues of criminal liability related to the use of artificial intelligence. This volume of Research in Law and Economics is a must-read for researchers and professionals of patent damages, antitrust, labour, and legal process.
Modelling trends and cycles in economic time series has a long history, with the use of linear trends and moving averages forming the basic tool kit of economists until the 1970s. Several developments in econometrics then led to an overhaul of the techniques used to extract trends and cycles from time series. In this second edition, Terence Mills expands on the research in the area of trends and cycles over the last (almost) two decades, to highlight to students and researchers the variety of techniques and the considerations that underpin their choice for modelling trends and cycles.
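A minimal sketch of the classical moving-average toolkit mentioned above, applied to a simulated series (the book's techniques go well beyond this); the series and window length are assumptions for illustration.

```python
# Classical decomposition sketch: a centered moving average estimates the
# trend, and the remainder is treated as the "cycle".  Simulated data only.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(120)
series = 0.05 * t + np.sin(2 * np.pi * t / 24) + 0.2 * rng.standard_normal(120)

window = 12                                            # assumed smoothing window
trend = np.convolve(series, np.ones(window) / window, mode="same")
cycle = series - trend                                 # deviation from the trend

print(f"trend at t=60: {trend[60]:.2f}, cycle at t=60: {cycle[60]:.2f}")
```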
This major volume of essays by Kenneth F. Wallis features 28 articles published over a quarter of a century on the statistical analysis of economic time series, large-scale macroeconometric modelling, and the interface between them. The first part deals with time-series econometrics and includes significant early contributions to the development of the LSE tradition in time-series econometrics, which is the dominant British tradition and has considerable influence worldwide. Later sections discuss theoretical and practical issues in modelling seasonality and forecasting with applications in both large-scale and small-scale models. The final section summarizes the research programme of the ESRC Macroeconomic Modelling Bureau, a unique comparison project among economy-wide macroeconometric models. Professor Wallis has written a detailed introduction to the papers in this volume in which he explains the background to these papers and comments on subsequent developments.
Financial models are an inescapable feature of modern financial markets. Yet over-reliance on these models, and the failure to test them properly, is now widely recognized as one of the main causes of the financial crisis of 2007-2011. Since this crisis, there has been an increase in the amount of scrutiny and testing applied to such models, and validation has become an essential part of model risk management at financial institutions. The book covers all of the major risk areas that a financial institution is exposed to and uses models for, including market risk, interest rate risk, retail credit risk, wholesale credit risk, compliance risk, and investment management. The book discusses current practices and pitfalls that model risk users need to be aware of and identifies areas where validation can be advanced in the future. This provides the first unified framework for validating risk management models.
Volume 39A of Research in the History of Economic Thought and Methodology features a selection of essays presented at the 2019 Conference of the Latin American Society for the History of Economic Thought (ALAHPE), edited by Felipe Almeida and Carlos Eduardo Suprinyak, as well as a new general-research essay by Daniel Kuehn, an archival discovery by Katia Caldari and Luca Fiorito, and a book review by John Hall.
This text disputes the laissez-faire direction of both economic theory and practice that has gained prominence since the mid-1970s. Dissenting voices, the author argues, have been drowned out by a sea of circular arguments and complex mathematical models that ignore real-world conditions and disregard values that can't easily be turned into commodities. Included is an explanation of how some sectors of the economy require a blend of market, regulation and social outlay.
This book develops the theory of productivity measurement using the empirical index number approach. The theory uses multiplicative indices and additive indicators as measurement tools, instead of relying on the usual neo-classical assumptions, such as the existence of a production function characterized by constant returns to scale, optimizing behavior of the economic agents, and perfect foresight. The theory can be applied to all the common levels of aggregation (micro, meso, and macro), and half of the book is devoted to accounting for the links existing between the various levels. Basic insights from National Accounts are thereby used. The final chapter is devoted to the decomposition of productivity change into the contributions of efficiency change, technological change, scale effects, and input or output mix effects. Applications on real-life data demonstrate the empirical feasibility of the theory. The book is directed to a variety of overlapping audiences: statisticians involved in measuring productivity change; economists interested in growth accounting; researchers relating macro-economic productivity change to its industrial sources; enterprise micro-data researchers; and business analysts interested in performance measurement.
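As one hedged illustration of the index-number approach, the sketch below computes Törnqvist output and input quantity indices and takes their ratio as a productivity index; the specific index formula and the data are assumptions made for illustration, not necessarily the book's own choices.

```python
# Illustrative index-number productivity calculation using a Törnqvist
# quantity index.  All quantities and prices below are hypothetical.
import numpy as np

def tornqvist(q0, q1, p0, p1):
    """Törnqvist quantity index between period 0 and period 1."""
    s0 = p0 * q0 / np.sum(p0 * q0)           # value shares in period 0
    s1 = p1 * q1 / np.sum(p1 * q1)           # value shares in period 1
    return np.exp(np.sum(0.5 * (s0 + s1) * np.log(q1 / q0)))

# two outputs and two inputs, hypothetical quantities and prices
out_q0, out_q1 = np.array([100.0, 50.0]), np.array([110.0, 52.0])
out_p0, out_p1 = np.array([10.0, 20.0]), np.array([11.0, 19.0])
in_q0, in_q1 = np.array([80.0, 40.0]), np.array([82.0, 41.0])
in_p0, in_p1 = np.array([5.0, 8.0]), np.array([5.5, 8.2])

output_index = tornqvist(out_q0, out_q1, out_p0, out_p1)
input_index = tornqvist(in_q0, in_q1, in_p0, in_p1)
print(f"productivity index: {output_index / input_index:.4f}")  # >1 means growth
```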
This book provides an up-to-date series of advanced chapters on applied financial econometric techniques pertaining to the various fields of commodities finance, mathematics & stochastics, international macroeconomics and financial econometrics. International Financial Markets: Volume I provides a key repository on the current state of knowledge, the latest debates and recent literature on international financial markets. Against the background of the "financialization of commodities" since the 2008 sub-primes crisis, section one contains recent contributions on commodity and financial markets, pushing the frontiers of applied econometrics techniques. The second section is devoted to exchange rate and current account dynamics in an environment characterized by large global imbalances. Part three examines the latest research in the field of meta-analysis in economics and finance. This book will be useful to students and researchers in applied econometrics, and to academics and students seeking convenient access to an unfamiliar area. It will also be of great interest to established researchers seeking a single repository on the current state of knowledge, current debates and relevant literature.
This monograph addresses the methodological and empirical issues relevant for the development of sustainable agriculture, with a particular focus on Eastern Europe. It relates economic growth to the other dimensions of sustainability by applying integrated methods. The book comprises five chapters dedicated to the theoretical approaches towards sustainable rural development, productivity analysis, structural change analysis and environmental footprint. The book focuses on the transformations of the agricultural sector while taking into account economic, environmental, and social dynamics. The importance of agricultural transformations to the livelihood of the rural population and food security are highlighted. Further, advanced methodologies and frameworks are presented to fathom the underlying trends in different facets of agricultural production. The authors present statistical methods used for the analysis of agricultural sustainability along with applications for agriculture in the European Union. Additionally, they discuss the measures of efficiency, methodological approaches and empirical models. Finally, the book applies econometric and optimization techniques, which are useful for the estimation of the production functions and other representations of technology in the case of the European Union member states. Therefore, the book is a must-read for researchers and students of agricultural and production economics, as well as policy-makers and academia in general.
The Council of the European Union is the institutional heart of EU policy-making. But 'who gets what, when and how' in the Council? What are the dimensions of political conflict, and which countries form coalitions in the intense negotiations to achieve their desired policy outcomes? Focussing on collective decision-making in the Council between 1998 and 2007, this book provides a comprehensive account of these salient issues that lie at the heart of political accountability and legitimacy in the European Union. Based on a novel and unique dataset of estimates of government policy positions, salience and power in influencing deliberations, an explanatory model approximating the Nash-Bargaining solution is employed to predict the policy outcomes on ten policy domains of central importance to this institution. The book's analyses comprise investigations into the determinants of decision-making success, the architecture of the political space and the governments' coalition behavior.
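A hedged sketch of a Nash-Bargaining-style prediction on a 0-100 policy scale is shown below; the utility function, weights and data are illustrative assumptions, not the book's estimated model.

```python
# Illustrative Nash-Bargaining-style prediction of a negotiation outcome.
# Positions, salience and power values are made up for this sketch.
import numpy as np

positions = np.array([0.0, 40.0, 70.0, 100.0])   # governments' policy positions
salience = np.array([0.8, 0.5, 0.9, 0.6])        # how much each government cares
power = np.array([29.0, 10.0, 27.0, 12.0])       # e.g. voting weights
weights = salience * power

grid = np.linspace(0.0, 100.0, 1001)
# utility falls linearly with distance from a government's position;
# the disagreement (status quo) utility is normalized to zero here.
utility = 1.0 - np.abs(grid[:, None] - positions[None, :]) / 100.0
log_nash_product = (weights[None, :] * np.log(np.clip(utility, 1e-9, None))).sum(axis=1)
prediction = grid[log_nash_product.argmax()]      # outcome maximizing the weighted Nash product

print(f"predicted outcome on the 0-100 scale: {prediction:.1f}")
```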
This volume presents new methods and applications of longitudinal data estimation methodology in applied economics. Featuring selected papers from the 2020 International Conference on Applied Economics (ICOAE 2020), held virtually due to the coronavirus pandemic, this book examines interdisciplinary topics such as financial economics, international economics, agricultural economics, marketing and management. Country-specific case studies are also featured.
"Econometric Theory" presents a modern approach to the theory of
econometric estimation and inference, with particular applications
to time series. An ideal reference for practitioners and
researchers, the book is also suited for advanced two-semester
econometrics courses and one-semester regression courses. Based on lectures originally given to graduates at the London School of Economics, the book applies recent developments in asymptotic theory to derive the properties of estimators when the model is only partially specified. Topics covered in depth include the linear regression model, dynamic modeling, simultaneous equations, optimization estimators, hypothesis testing, and the theory of nonstationary time series and cointegration.
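As a minimal illustration of the linear regression model covered in the book (not the book's own notation or examples), the following Python sketch computes the OLS estimator and conventional asymptotic standard errors on simulated data.

```python
# Minimal OLS sketch on simulated data: estimate beta and its standard errors.
import numpy as np

rng = np.random.default_rng(7)
n = 500
x = np.column_stack([np.ones(n), rng.standard_normal(n)])   # intercept + one regressor
beta_true = np.array([1.0, 2.0])
y = x @ beta_true + rng.standard_normal(n)

beta_hat = np.linalg.solve(x.T @ x, x.T @ y)                 # OLS estimator
resid = y - x @ beta_hat
sigma2 = resid @ resid / (n - x.shape[1])                    # residual variance estimate
cov = sigma2 * np.linalg.inv(x.T @ x)                        # estimated covariance matrix
se = np.sqrt(np.diag(cov))

print("estimates:", np.round(beta_hat, 3), "std. errors:", np.round(se, 3))
```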
This book gives a thorough and systematic introduction to the latest research results about fuzzy decision-making methods based on prospect theory. It includes eight chapters: Introduction, Intuitionistic fuzzy MADM based on prospect theory, QUALIFLEX based on prospect theory with probabilistic linguistic information, Group PROMETHEE based on prospect theory with hesitant fuzzy linguistic information, Prospect consensus with probabilistic hesitant fuzzy preference information, Improved TODIM based on prospect theory and the improved TODIM with probabilistic hesitant fuzzy information, etc. This book is suitable for researchers in the fields of fuzzy mathematics, operations research, behavioral science, management science and engineering, etc. It is also useful as a textbook for postgraduate and senior-year undergraduate students at relevant institutions of higher learning.
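For orientation, here is a minimal sketch of the classical prospect-theory value and probability-weighting functions that such methods build on; the parameter values are the well-known Tversky-Kahneman estimates, and the separable weighting used here is a simplification, not the book's fuzzy extensions.

```python
# Classical prospect-theory building blocks (simplified, illustrative only).
import numpy as np

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """S-shaped value function: concave for gains, convex and steeper for losses."""
    x = np.asarray(x, dtype=float)
    v = np.empty_like(x)
    gains = x >= 0
    v[gains] = x[gains] ** alpha
    v[~gains] = -lam * (-x[~gains]) ** beta
    return v

def weight(p, gamma=0.61):
    """Inverse-S probability weighting function (separable, simplified)."""
    p = np.asarray(p, dtype=float)
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# prospect value of a hypothetical gamble: gain 100 with prob 0.3, lose 50 with prob 0.7
outcomes = np.array([100.0, -50.0])
probs = np.array([0.3, 0.7])
print(f"prospect value: {np.sum(weight(probs) * value(outcomes)):.2f}")
```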
This book scientifically tests the assertion that accommodative monetary policy can eliminate the "crowd out" problem, allowing fiscal stimulus programs (such as tax cuts or increased government spending) to stimulate the economy as intended. It also tests whether natural growth in the economy can cure the crowd out problem as well or better. The book is intended to be the largest-scale scientific test ever performed on this topic. It includes about 800 separate statistical tests on the U.S. economy covering different parts or all of the period 1960-2010. These tests focus on whether accommodative monetary policy, which increases the pool of loanable resources, can offset the crowd out problem as well as natural growth in the economy. The book, employing the best scientific methods available to economists for this type of problem, concludes that accommodative monetary policy could have, but that until the quantitative easing program, Federal Reserve efforts to accommodate fiscal stimulus programs were not large enough to offset more than 23% to 44% of any one year's crowd out problem. That provides the science part of the answer as to why accommodative monetary policy didn't accommodate: too little of it was tried. The book also tests whether other increases in loanable funds, occurring because of natural growth in the economy or changes in the savings rate, can also offset crowd out. It concludes they can, and that these changes tend to be several times as effective as accommodative monetary policy. This book's companion volume, Why Fiscal Stimulus Programs Fail, explores the policy implications of these results.
Sociological theories of crime include: theories of strain, which blame crime on personal stressors; theories of social learning, which blame crime on its social rewards and see crime more as an institution in conflict with other institutions than as individual deviance; and theories of control, which look at crime as natural and rewarding, and explore the formation of institutions that control crime. Theorists of corruption generally agree that corruption is an expression of the patron-client relationship, in which a person with access to resources trades resources with kin and members of the community in exchange for loyalty. Some approaches to modeling crime and corruption do not involve an explicit simulation: rule-based systems; Bayesian networks; game-theoretic approaches, often based on rational choice theory; and neoclassical econometrics, a rational-choice-based approach. Simulation-based approaches take into account greater complexities of interacting parts of social phenomena. These include fuzzy cognitive maps and fuzzy rule sets that may incorporate feedback, and agent-based simulation, which can go a step farther by computing new social structures not previously identified in theory. The latter include cognitive agent models, in which agents learn how to perceive their environment and act upon the perceptions of their individual experiences, and reactive agent simulation, which, while less capable than cognitive-agent simulation, is adequate for testing a policy's effects with existing societal structures. For example, NNL is a cognitive agent model based on the REPAST Simphony toolkit.
This is a very useful and timely book, as demand forecasting has become a crucial tool that provides important information for destinations, on the basis of which policies are created and implemented. This is especially important given the complexities arising in the aftermath of the Covid-19 pandemic.
* It looks at novel and recent developments in this field, including judgement and scenario forecasting.
* It offers a comprehensive approach to tourism econometrics, looking at a variety of aspects.
* The authors are experts in this field and of the highest academic calibre.
Ragnar Frisch (1895-1973) received the first Nobel Memorial Prize in Economic Science together with Jan Tinbergen in 1969 for having played an important role in ensuring that mathematical techniques figure prominently in modern economic analysis. Frisch was also a co-founder of the Econometric Society in 1930, the inaugural editor of its journal Econometrica for over 20 years, and a major figure in Norwegian academic life. This collection of essays derived from the centennial symposium which marked Frisch's birth explores his contributions to econometrics and other key fields in the discipline as well as the results of new research. Contributors include eminent scholars from Europe, the United Kingdom and North America who investigate themes in utility measurement, production theory, microeconomic policy, econometric methods, macrodynamics, and macroeconomic planning.
How do groups form, how do institutions come into being, and when do moral norms and practices emerge? This volume explores how game-theoretic approaches can be extended to consider broader questions that cross scales of organization, from individuals to cooperatives to societies. Game theory's strategic formulation of central problems in the analysis of social interactions is used to develop multi-level theories that examine the interplay between individuals and the collectives they form. The concept of cooperation is examined at a higher level than that usually addressed by game theory, especially focusing on the formation of groups and the role of social norms in maintaining their integrity, with positive and negative implications. The authors suggest that conventional analyses need to be broadened to explain how heuristics, like concepts of fairness, arise and become formalized into the ethical principles embraced by a society.