A systematic treatment of dynamic decision making and performance measurement. Modern business environments are dynamic, yet the models used to make decisions and quantify success within them are stuck in the past. In a world where demands, resources, and technology are interconnected and evolving, measures of efficiency need to reflect that environment. In Dynamic Efficiency and Productivity Measurement, Elvira Silva, Spiro E. Stefanou, and Alfons Oude Lansink look at the business process from a dynamic perspective. Their systematic study covers dynamic production environments in which current production decisions affect future production possibilities. By considering practical factors like adjustments over time, this book offers an important lens for contemporary microeconomic analysis. Silva, Stefanou, and Oude Lansink develop the analytical foundations of dynamic production technology in both primal and dual representations, with an emphasis on directional distance functions. They cover concepts measuring the production structure (economies of scale, economies of scope, capacity utilization) and performance (allocative, scale, and technical inefficiency; productivity) in a methodological and comprehensive way. Through a unified approach, Dynamic Efficiency and Productivity Measurement offers a guide to how firms maximize potential in changing environments and an invaluable contribution to applied microeconomics.
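For orientation, a directional distance function measures how far a production plan can be translated along a chosen direction while remaining feasible. The standard static form is sketched below for intuition only; the book's dynamic variant adds investment and adjustment arguments, and the notation here is generic rather than the authors' own:

$$\vec{D}_T(x, y;\, g_x, g_y) = \sup\left\{\beta \ge 0 : (x - \beta g_x,\; y + \beta g_y) \in T\right\},$$

where $T$ is the production possibility set, $x$ inputs, $y$ outputs, and $(g_x, g_y)$ the direction vector; $\vec{D}_T = 0$ indicates a technically efficient plan.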
This book explores the novel uses and potentials of Data Envelopment Analysis (DEA) under big data. These areas are of widespread interest to researchers and practitioners alike. Considering the vast literature on DEA, one could say that DEA has been, and continues to be, a widely used technique in both performance and productivity measurement, having covered a plethora of challenges and debates within the modelling framework.
This book offers a series of statistical tests to determine whether the "crowd out" problem, known to hinder the effectiveness of Keynesian economic stimulus programs, can be overcome by monetary programs. It concludes that there are programs that can do this, specifically "accommodative monetary policy." Such programs were not used to any great extent prior to the Quantitative Easing program in 2008, causing the failure of many fiscal stimulus programs through no fault of their own. The book includes exhaustive statistical tests to prove this point. A policy analysis section examines how effectively the Federal Reserve's anti-crowd-out programs have actually worked, to the extent they were undertaken at all. It finds statistical evidence that using commercial and savings banks instead of investment banks when implementing accommodative monetary policy would have markedly improved its effectiveness. This volume, with its companion volume Why Fiscal Stimulus Programs Fail, Volume 2: Statistical Tests Comparing Monetary Policy to Growth, provides 1000 separate statistical tests on the US economy to prove these assertions.
The University of Oxford has been and continues to be one of the most important global centres for economics. With six chapters on themes in Oxford economics and 24 chapters on the lives and work of Oxford economists, this volume shows how economics became established at the University, how it produced some of the world's best-known economists, including Francis Ysidro Edgeworth, Roy Harrod and David Hendry, and how it remains a global force for the very best in teaching and research in economics. With original contributions from a stellar cast, this volume provides economists - especially those interested in macroeconomics and the history of economic thought - with the first in-depth analysis of Oxford economics.
Since its establishment in the 1950s, the American Economic Association's Committee on Economic Education has sought to promote improved instruction in economics and to facilitate this objective by stimulating research on the teaching of economics. These efforts are most apparent in the sessions on economic education that the Committee organizes at the Association's annual meetings. At these sessions, economists interested in economic education have opportunities to present new ideas on teaching and research and also to report the findings of their research. The record of this activity can be found in the Proceedings of the American Economic Review. The Committee on Economic Education and its members have been actively involved in a variety of other projects. In the early 1960s it organized the National Task Force on Economic Education that spurred the development of economics teaching at the precollege level. This in turn led to the development of a standardized research instrument, a high school test of economic understanding. This was followed later in the 1960s by the preparation of a similar test of understanding college economics. The development of these two instruments greatly facilitated research on the impact of economics instruction, opened the way for application of increasingly sophisticated statistical methods in measuring the impact of economic education, and initiated a steady stream of research papers on a subject that previously had not been explored.
This volume presents new methods and applications in longitudinal data estimation methodology in applied economics. Featuring selected papers from the 2020 International Conference on Applied Economics (ICOAE 2020), held virtually due to the coronavirus pandemic, this book examines interdisciplinary topics such as financial economics, international economics, agricultural economics, marketing and management. Country-specific case studies are also featured.
This book promotes the use of machine learning tools and techniques in econometrics and explains how machine learning can enhance and expand the econometrics toolbox in theory and in practice. Throughout the volume, the authors raise and answer six questions: 1) What are the similarities between existing econometric and machine learning techniques? 2) To what extent can machine learning techniques assist econometric investigation? Specifically, how robust or stable is the prediction from machine learning algorithms given the ever-changing nature of human behavior? 3) Can machine learning techniques assist in testing statistical hypotheses and identifying causal relationships in 'big data'? 4) How can existing econometric techniques be extended by incorporating machine learning concepts? 5) How can new econometric tools and approaches be elaborated based on machine learning techniques? 6) Is it possible to develop machine learning techniques further and make them even more readily applicable in econometrics? As the data structures in economic and financial data become more complex and models become more sophisticated, the book takes a multidisciplinary approach, developing the disciplines of machine learning and econometrics in conjunction rather than in isolation. This volume is a must-read for scholars, researchers, students, policy-makers, and practitioners who are using econometrics in theory or in practice.
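As a concrete taste of question 4 (extending econometric techniques with machine learning concepts), here is a minimal post-Lasso sketch in Python: a Lasso selects regressors, then OLS refits on the selected set. The simulated data, the penalty value alpha=0.1, and the library choice are illustrative assumptions, not taken from the book.

```python
# Post-Lasso sketch: ML-based variable selection followed by an OLS refit.
# All data are simulated; alpha=0.1 is an arbitrary illustrative penalty.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.0, 0.5]            # only the first 3 regressors matter
y = X @ beta + rng.normal(size=n)

selected = np.flatnonzero(Lasso(alpha=0.1).fit(X, y).coef_)   # step 1: selection
post = LinearRegression().fit(X[:, selected], y)              # step 2: OLS refit
print("selected columns:", selected)
print("post-Lasso coefficients:", post.coef_.round(2))
```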
In the theory and practice of econometrics the model, the method and the data are all interdependent links in information recovery, estimation and inference. Seldom, however, are the economic and statistical models correctly specified, the data complete or capable of being replicated, the estimation rules 'optimal' and the inferences free of distortion. Faced with these problems, Maximum Entropy Econometrics provides a new basis for learning from economic and statistical models that may be non-regular in the sense that they are ill-posed or underdetermined and the data are partial or incomplete. By extending the maximum entropy formalisms used in the physical sciences, the authors present a new set of generalized entropy techniques designed to recover information about economic systems. The authors compare the generalized entropy techniques with the performance of the relevant traditional methods of information recovery and clearly demonstrate the theory with applications.
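For readers unfamiliar with the formalism being generalized: the classical maximum entropy problem chooses the distribution that matches observed moments while remaining maximally noncommittal otherwise. A minimal statement, in generic notation rather than the authors' own:

$$\max_{p}\; -\sum_{k} p_k \ln p_k \quad \text{subject to} \quad \sum_{k} p_k f_j(x_k) = \mu_j,\; j = 1, \dots, J, \qquad \sum_{k} p_k = 1,$$

where the $\mu_j$ are observed moments; the generalized entropy methods relax such constraints to accommodate noisy, partial data.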
The Super Bowl is the most watched sporting event in the United States. But what does participating in this event mean for the players, the halftime performers, and the cities that host the games? Is there an economic benefit from being a part of the Super Bowl and, if so, how much? This Palgrave Pivot examines the economic consequences for those who participate in the Super Bowl. The book fills in gaps in the literature by examining the benefits and costs of being involved in the game. Previously, the literature has largely ignored the effect the game has had on the careers of the players, particularly the stars of the game. The economic benefit of being the halftime performer has not been considered in the literature at all. While there have been past studies about the economic impact on the cities that host the game, this book expands on previous research and updates it with new data.
Modelling trends and cycles in economic time series has a long history, with the use of linear trends and moving averages forming the basic tool kit of economists until the 1970s. Several developments in econometrics then led to an overhaul of the techniques used to extract trends and cycles from time series. In this second edition, Terence Mills expands on the research in the area of trends and cycles over the last (almost) two decades, to highlight to students and researchers the variety of techniques and the considerations that underpin their choice for modelling trends and cycles.
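One canonical technique in this literature is the Hodrick-Prescott filter, shown here purely as an illustration of the trend-cycle decompositions the book surveys. It extracts a smooth trend $\tau_t$ from a series $y_t$ by penalizing changes in the trend's growth rate:

$$\min_{\{\tau_t\}} \sum_{t=1}^{T} (y_t - \tau_t)^2 + \lambda \sum_{t=2}^{T-1} \big[(\tau_{t+1} - \tau_t) - (\tau_t - \tau_{t-1})\big]^2,$$

with the cycle taken as $y_t - \tau_t$ and the smoothing parameter $\lambda$ (conventionally 1600 for quarterly data) governing how closely the trend tracks the series.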
This conference proceedings volume presents advanced methods in time series estimation models that are applicable to various areas of applied economic research such as international economics, macroeconomics, microeconomics, financial economics and agricultural economics. Featuring contributions presented at the 2018 International Conference on Applied Economics (ICOAE) held in Warsaw, Poland, this book presents contemporary research using applied econometric methods for analysis as well as country-specific studies with potential implications for economic policy. Applied economics is a rapidly growing field of economics that combines economic theory with econometrics to analyse economic problems of the real world, usually with economic policy interest. ICOAE is an annual conference started in 2008 with the aim of bringing together economists from different fields of applied economic research in order to share methods and ideas. Approximately 150 papers are submitted each year from about 40 countries around the world. The goal of the conference and the enclosed papers is to allow for an exchange of experiences with different applied econometric methods and to promote joint initiatives among well-established economic fields such as finance, agricultural economics, health economics, education economics, international trade theory and management and marketing strategies. Featuring global contributions, this book will be of interest to researchers, academics, professionals and policy makers in the field of applied economics and econometrics.
Arthur Vogt has devoted a great deal of his scientific efforts to both the person and the work of Irving Fisher. This book, written with János Barta, gives an excellent impression of Fisher's great contributions to the theory of the price index on the one hand. On the other hand, it continues Fisher's work on this subject along the lines which several authors have drawn with respect to price index theory since Fisher's death fifty years ago.
This book has taken form over several years as a result of a number of courses taught at the University of Pennsylvania and at Columbia University and a series of lectures I have given at the International Monetary Fund. Indeed, I began writing down my notes systematically during the academic year 1972-1973 while at the University of California, Los Angeles. The diverse character of the audience, as well as my own conception of what an introductory and often terminal acquaintance with formal econometrics ought to encompass, have determined the style and content of this volume. The selection of topics and the level of discourse give sufficient variety so that the book can serve as the basis for several types of courses. As an example, a relatively elementary one-semester course can be based on Chapters one through five, omitting the appendices to these chapters and a few sections in some of the chapters so indicated. This would acquaint the student with the basic theory of the general linear model, some of the problems often encountered in empirical research, and some proposed solutions. For such a course, I should also recommend a brief excursion into Chapter seven (logit and probit analysis) in view of the increasing availability of data sets for which this type of analysis is more suitable than that based on the general linear model.
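For reference, logit and probit both model a binary outcome's conditional probability through a link function applied to a linear index; in generic notation, not necessarily the book's:

$$P(y_i = 1 \mid x_i) = \Lambda(x_i'\beta) = \frac{e^{x_i'\beta}}{1 + e^{x_i'\beta}} \;\;\text{(logit)}, \qquad P(y_i = 1 \mid x_i) = \Phi(x_i'\beta) \;\;\text{(probit)},$$

where $\Phi$ is the standard normal distribution function; both suit data whose dependent variable is categorical rather than continuous.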
Dynamic Programming in Economics is an outgrowth of a course intended for students in the first year of a PhD program and for researchers in macroeconomic dynamics. It can be used by students and researchers in mathematics as well as in economics. The purpose of Dynamic Programming in Economics is twofold: (a) to provide a rigorous, but not too complicated, treatment of optimal growth models in an infinite discrete time horizon, and (b) to train the reader in the use of optimal growth models and hence to help them go further in their research. We are convinced that there is a place for a book which stays somewhere between the "minimum tool kit" and specialized monographs leading to the frontiers of research on optimal growth.
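The central object in such models is the Bellman equation; a canonical one-sector optimal growth statement (a generic illustration, not necessarily the book's notation) is

$$V(k) = \max_{0 \le c \le f(k)} \big\{\, u(c) + \beta\, V\big(f(k) - c\big) \,\big\},$$

where $k$ is the capital stock, $c$ consumption, $u$ the period utility function, $f$ the production technology (gross of undepreciated capital), and $\beta \in (0,1)$ the discount factor; the unknown solved for is the value function $V$.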
The emergence of new firm-level data, including the European Community Innovation Survey (CIS), has led to a surge of studies on innovation and firm behaviour. This book documents progress in four interrelated fields: investigation of the use of new indicators of innovation output; investigation of determinants of innovative behaviour; the role of spillovers, the public knowledge infrastructure and research and development collaboration; and the impact of innovation on firm performance. Written by an international group of contributors, the studies are based on agriculture and the manufacturing and service industries in Europe and Canada and provide new insights into the driving forces behind innovation.
This is the first book to investigate individuals' pessimistic and optimistic prospects for the future and their economic consequences based on sound mathematical foundations. The book focuses on fundamental uncertainty called Knightian uncertainty, where the probability distribution governing uncertainty is unknown, and it provides the reader with methods to formulate how pessimism and optimism act in an economy in a strict and unified way. After presenting decision-theoretic foundations for prudent behaviors under Knightian uncertainty, the book applies these ideas to economic models that include portfolio inertia, indeterminacy of equilibria in the Arrow-Debreu economy and in a stochastic overlapping-generations economy, learning, dynamic asset-pricing models, search, real options, and liquidity preferences. The book then proceeds to characterizations of pessimistic (ε-contaminated) and optimistic (ε-exuberant) behaviors under Knightian uncertainty and people's inherent pessimism (surprise aversion) and optimism (surprise loving). Those characterizations are shown to be useful in understanding several observed behaviors in the global financial crisis and in its aftermath. The book is highly recommended not only to researchers who wish to understand the mechanism of how pessimism and optimism affect economic phenomena, but also to policy makers contemplating effective economic policies whose success delicately hinges upon people's mindsets in the market. Kiyohiko Nishimura is Professor at the National Graduate Institute for Policy Studies (GRIPS) and Professor Emeritus and Distinguished Project Research Fellow of the Center for Advanced Research in Finance at The University of Tokyo. Hiroyuki Ozaki is Professor of Economics at Keio University.
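The ε-contamination construction mentioned above has a compact standard form in the ambiguity literature: the decision maker entertains every mixture of a benchmark prior μ with arbitrary alternative measures. In generic notation (mine, not necessarily the book's):

$$\mathcal{P}_{\varepsilon} = \left\{(1 - \varepsilon)\,\mu + \varepsilon\,\nu \;:\; \nu \in \mathcal{M}\right\},$$

where $\mathcal{M}$ is the set of all probability measures on the state space and $\varepsilon \in (0,1)$ calibrates confidence in $\mu$; pessimism then corresponds to evaluating an act by its minimum expected utility over $\mathcal{P}_{\varepsilon}$.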
This advanced undergraduate/graduate textbook teaches students in finance and economics how to use R to analyse financial data and implement financial models. It demonstrates how to take publicly available data, manipulate it, implement models, and generate the outputs typical of particular analyses. A wide spectrum of timely and practical issues in financial modelling is covered, including return and risk measurement, portfolio management, option pricing and fixed income analysis. This new edition updates and expands upon the existing material, providing updated examples and new chapters on equities, simulation and trading strategies, including machine learning techniques. Select data sets are available online.
This practical guide to EViews is aimed at practitioners and students in business, economics, econometrics, and finance. It uses a step-by-step approach to equip readers with a toolkit that enables them to make the most of this widely used econometric analysis software. Statistical and econometrics concepts are explained visually with examples, problems, and solutions. Developed by economists, the EViews statistical software package is used most commonly for time-series-oriented econometric analysis. It allows users to quickly develop statistical relations from data and then use those relations to forecast future values of the data. The package provides convenient ways to enter or upload data series, create new series from existing ones, display and print series, carry out statistical analyses of relationships among series, and manipulate results and output. This highly hands-on resource includes more than 200 illustrative graphs and tables and tutorials throughout. Abdulkader Aljandali is Senior Lecturer at Coventry University in London. He is currently leading the Stochastic Finance Module taught as part of the Global Financial Trading MSc. His previously published work includes Exchange Rate Volatility in Emerging Markets, Quantitative Analysis, Multivariate Methods & Forecasting with IBM SPSS Statistics and Multivariate Methods and Forecasting with IBM SPSS Statistics. Dr Aljandali is an established member of the British Accounting and Finance Association and the Higher Education Academy. Motasam Tatahi is a specialist in the areas of macroeconomics, financial economics, and financial econometrics at the European Business School, Regent's University London, where he serves as Principal Lecturer and Dissertation Coordinator for the MSc in Global Banking and Finance at The European Business School, London.
Panel Data Econometrics: Theory introduces econometric modelling. Written by experts from diverse disciplines, the volume uses longitudinal datasets to illuminate applications for a variety of fields, such as banking, financial markets, tourism and transportation, auctions, and experimental economics. Contributors emphasize techniques and applications, and they accompany their explanations with case studies, empirical exercises and supplementary code in R. They also address panel data analysis in the context of productivity and efficiency analysis, where some of the most interesting applications and advancements have recently been made.
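For readers new to the area, the canonical object of panel data econometrics is a model indexed by both unit and time; the basic fixed-effects specification (generic notation, not specific to this volume) is

$$y_{it} = \alpha_i + x_{it}'\beta + \varepsilon_{it}, \qquad i = 1, \dots, N,\; t = 1, \dots, T,$$

where $\alpha_i$ absorbs time-invariant heterogeneity of unit $i$ and $\beta$ is identified from within-unit variation over time.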
This title investigates contemporary financial issues in the e-commerce market.
These proceedings highlight research on the latest trends and methods in experimental and behavioral economics. Featuring contributions presented at the 2017 Computational Methods in Experimental Economics (CMEE) conference, held in Lublin, Poland, the volume merges findings from various domains to present deep insights into topics such as game theory, decision theory, cognitive neuroscience and artificial intelligence. The fields of experimental economics and behavioral economics are rapidly evolving. Modern applications of experimental economics require the integration of know-how from disciplines including economics, computer science, psychology and neuroscience. The use of computer technology enhances researchers' ability to generate and analyze large amounts of data, allowing them to use non-standard data-logging methods in experiments, such as cognitive neuroscience methods. Experiments are currently being conducted with software that, on the one hand, provides interaction with the people involved in experiments and, on the other, helps to accurately record their responses. The goal of the CMEE conference and the papers presented here is to provide the scientific community with essential research on, and applications of, computer methods in experimental economics. Combining theories, methods and regional case studies, the book offers a valuable resource for all researchers, scholars and policymakers in the areas of experimental and behavioral economics.
DEA is computational at its core, and this book is one of several that we will look to publish on the computational aspects of DEA. This book by Zhu and Cook deals with the micro aspects of handling and modeling data issues in DEA problems. DEA's use has grown with its capability of dealing with complex service-industry and public-service problems that require modeling both qualitative and quantitative data. This is a handbook treatment dealing with specific data problems, including: (1) imprecise data, (2) inaccurate data, (3) missing data, (4) qualitative data, (5) outliers, (6) undesirable outputs, (7) quality data, (8) statistical analysis, and (9) software and other data aspects of modeling complex DEA problems. In addition, the book demonstrates how to visualize DEA results when the data are more than three-dimensional, and how to identify efficient units quickly and accurately.
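To make the computational core concrete: each decision-making unit's (DMU's) efficiency score comes from a small linear program. Below is a minimal input-oriented CCR sketch with toy data; the data, the function name ccr_input_efficiency, and the use of SciPy are illustrative assumptions, not the book's code.

```python
# Input-oriented CCR DEA: for DMU o, shrink its inputs by theta while staying
# inside the cone spanned by all observed (input, output) bundles.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0], [3.0, 2.0], [6.0, 5.0]])  # rows = DMUs, cols = inputs
Y = np.array([[1.0], [1.0], [1.0]])                  # rows = DMUs, cols = outputs

def ccr_input_efficiency(X, Y, o):
    n, m = X.shape                     # n DMUs, m inputs
    s = Y.shape[1]                     # s outputs
    c = np.r_[1.0, np.zeros(n)]        # minimize theta; vars = [theta, lambda_1..lambda_n]
    A_in = np.c_[-X[o], X.T]           # sum_j lambda_j * x_ij <= theta * x_io
    A_out = np.c_[np.zeros(s), -Y.T]   # sum_j lambda_j * y_rj >= y_ro
    res = linprog(c,
                  A_ub=np.r_[A_in, A_out],
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(0, None)] * (1 + n))
    return res.fun                     # theta* in (0, 1]; 1 means efficient

for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_input_efficiency(X, Y, o):.3f}")
```

The data issues the book addresses (imprecise, missing, or undesirable-output data) enter exactly here, by changing how X and Y are constructed and constrained.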
JEAN-FRANÇOIS MERTENS This book presents a systematic exposition of the use of game theoretic methods in general equilibrium analysis. Clearly the first such use was by Arrow and Debreu, with the "birth" of general equilibrium theory itself, in using Nash's existence theorem (or a generalization) to prove the existence of a competitive equilibrium. But this use appeared possibly to be merely technical, borrowing some tools for proving a theorem. This book stresses the later contributions, where game theoretic concepts were used as such, to explain various aspects of the general equilibrium model. But clearly, each of those later approaches also provides per se a game theoretic proof of the existence of competitive equilibrium. Part A deals with the first such approach: the equality between the set of competitive equilibria of a perfectly competitive (i.e., every trader has negligible market power) economy and the core of the corresponding cooperative game.
Presents recent developments in the probabilistic assessment of systems dependability based on stochastic models, including graph theory, finite state automata and language theory, for both dynamic and hybrid contexts.
Does game theory, the mathematical theory of strategic interaction, provide genuine explanations of human behaviour? Can game theory be used in economic consultancy or other normative contexts? Explaining Games: The Epistemic Programme in Game Theory, the first monograph on the philosophy of game theory, is a bold attempt to combine insights from epistemic logic and the philosophy of science to investigate the applicability of game theory in such fields as economics, philosophy and strategic consultancy. De Bruin proves new mathematical theorems about the beliefs, desires and rationality principles of individual human beings, and he explores in detail the logical form of game theory as it is used in explanatory and normative contexts. He argues that game theory reduces to rational choice theory if used as an explanatory device, and that game theory is nonsensical if used as a normative device. A provocative account of the history of game theory reveals that this is not bad news for all of game theory, though. Two central research programmes in game theory tried to find the ultimate characterisation of strategic interaction between rational agents. Yet, while the Nash Equilibrium Refinement Programme has done badly thanks to such research habits as overmathematisation, model-tinkering and introversion, the Epistemic Programme, De Bruin argues, has been rather successful in achieving this aim.
You may like...
- Applied Econometric Analysis - Emerging… by Brian W Sloboda, Yaya Sissoko (Hardcover, R5,351)
- Agent-Based Modeling and Network… by Akira Namatame, Shu-Heng Chen (Hardcover, R2,970)
- Design and Analysis of Time Series… by Richard McCleary, David McDowall, … (Hardcover, R3,286)
- Introduction to Computational Economics… by Hans Fehr, Fabian Kindermann (Hardcover, R4,258)
- Introductory Econometrics - A Modern… by Jeffrey Wooldridge (Hardcover)
- The Oxford Handbook of the Economics of… by Yann Bramoullé, Andrea Galeotti, … (Hardcover, R5,455)
- Spatial Analysis Using Big Data… by Yoshiki Yamagata, Hajime Seya (Paperback, R3,021)
- Pricing Decisions in the Euro Area - How… by Silvia Fabiani, Claire Loupias, … (Hardcover, R2,160)
- Tools and Techniques for Economic… by Jelena Stanković, Pavlos Delias, … (Hardcover, R5,167)