This book has taken form over several years as a result of a number of courses taught at the University of Pennsylvania and at Columbia University and a series of lectures I have given at the International Monetary Fund. Indeed, I began writing down my notes systematically during the academic year 1972-1973 while at the University of California, Los Angeles. The diverse character of the audience, as well as my own conception of what an introductory and often terminal acquaintance with formal econometrics ought to encompass, has determined the style and content of this volume. The selection of topics and the level of discourse give sufficient variety so that the book can serve as the basis for several types of courses. As an example, a relatively elementary one-semester course can be based on Chapters one through five, omitting the appendices to these chapters and a few sections in some of the chapters so indicated. This would acquaint the student with the basic theory of the general linear model, some of the problems often encountered in empirical research, and some proposed solutions. For such a course, I should also recommend a brief excursion into Chapter seven (logit and probit analysis) in view of the increasing availability of data sets for which this type of analysis is more suitable than that based on the general linear model.
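For readers curious what the logit analysis recommended above looks like in practice, here is a minimal sketch in Python; the data, variable names, and the statsmodels tooling are my own illustrative assumptions, not the book's.

```python
# Hypothetical example: a logit model for a binary outcome, the kind of
# analysis for which the general linear model is less suitable.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
income = rng.normal(50, 10, 500)              # invented regressor
p = 1 / (1 + np.exp(-(-5 + 0.1 * income)))    # true logit probabilities used to simulate
owns_home = rng.binomial(1, p)                # invented binary dependent variable

X = sm.add_constant(income)
fit = sm.Logit(owns_home, X).fit(disp=False)
print(fit.params)                             # intercept and slope estimates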
Dynamic Programming in Economics is an outgrowth of a course intended for first-year PhD students and for researchers in macroeconomic dynamics. It can be used by students and researchers in mathematics as well as in economics. The purpose of Dynamic Programming in Economics is twofold: (a) to provide a rigorous, but not too complicated, treatment of optimal growth models in infinite discrete time horizon, and (b) to train readers in the use of optimal growth models and hence help them go further in their research. We are convinced that there is a place for a book which stays somewhere between the "minimum tool kit" and specialized monographs leading to the frontiers of research on optimal growth.
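As a taste of the subject matter, here is a minimal value-function-iteration sketch for a one-sector optimal growth model in infinite discrete time; the log utility, Cobb-Douglas technology, and parameter values are illustrative assumptions, not the book's more general setting.

```python
# Value function iteration on a grid: V(k) = max_{k'} [ log(k^a - k') + b V(k') ]
import numpy as np

alpha, beta = 0.3, 0.95                       # illustrative parameters
grid = np.linspace(0.05, 0.5, 200)            # capital grid
c = grid[:, None] ** alpha - grid[None, :]    # consumption for each (k, k') pair
util = np.where(c > 0, np.log(np.where(c > 0, c, 1.0)), -np.inf)

V = np.zeros(len(grid))
for _ in range(2000):                         # iterate the Bellman operator to a fixed point
    V_new = (util + beta * V[None, :]).max(axis=1)
    if np.abs(V_new - V).max() < 1e-8:
        V = V_new
        break
    V = V_new

k_next = grid[(util + beta * V[None, :]).argmax(axis=1)]   # optimal policy function
print(V[:3], k_next[:3])
```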
This important new book presents the theoretical, econometric and applied foundations of the economics of innovation and offers a new approach to the measurement of technical change. The author, a leading expert in innovation economics and management, critically reviews current schools of thought and presents his own contribution to measurement techniques. Measurements of technical change have focused on price and quantity, whilst useful theories and reliable indicators of the quality of innovation in new products have been sorely lacking. The author examines the theoretical foundations of the measurement of technical change and extends the analysis to the econometric and empirical perspective on the process of innovation. He outlines the key contributions to innovation research by reviewing the English-language literature and providing a very useful guide to the most important contributions in other languages. For the measurement of the quality of innovation, the techniques used in the author's contribution to the new 'technometrics' are presented, explained in detail, and applied to the most important topical problems in innovation and management. This significant addition to the literature will be invaluable to graduates, scholars and managers working in the area of technical change, technology and innovation management.
The emergence of new firm-level data, including the European Community Innovation Survey (CIS), has led to a surge of studies on innovation and firm behaviour. This book documents progress in four interrelated fields: investigation of the use of new indicators of innovation output; investigation of determinants of innovative behaviour; the role of spillovers, the public knowledge infrastructure and research and development collaboration; and the impact of innovation on firm performance. Written by an international group of contributors, the studies are based on agriculture and the manufacturing and service industries in Europe and Canada and provide new insights into the driving forces behind innovation.
This is the first book to investigate individuals' pessimistic and optimistic prospects for the future and their economic consequences, based on sound mathematical foundations. The book focuses on fundamental uncertainty called Knightian uncertainty, where the probability distribution governing uncertainty is unknown, and it provides the reader with methods to formulate how pessimism and optimism act in an economy in a strict and unified way. After presenting decision-theoretic foundations for prudent behaviors under Knightian uncertainty, the book applies these ideas to economic models that include portfolio inertia, indeterminacy of equilibria in the Arrow-Debreu economy and in a stochastic overlapping-generations economy, learning, dynamic asset-pricing models, search, real options, and liquidity preferences. The book then proceeds to characterizations of pessimistic (ε-contaminated) and optimistic (ε-exuberant) behaviors under Knightian uncertainty and people's inherent pessimism (surprise aversion) and optimism (surprise loving). Those characterizations are shown to be useful in understanding several observed behaviors in the global financial crisis and in its aftermath. The book is highly recommended not only to researchers who wish to understand the mechanism of how pessimism and optimism affect economic phenomena, but also to policy makers contemplating effective economic policies whose success delicately hinges upon people's mindsets in the market. Kiyohiko Nishimura is Professor at the National Graduate Institute for Policy Studies (GRIPS) and Professor Emeritus and Distinguished Project Research Fellow of the Center for Advanced Research in Finance at The University of Tokyo. Hiroyuki Ozaki is Professor of Economics at Keio University.
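In standard notation (mine, for illustration, not necessarily the authors'), ε-contamination replaces a single prior P with the set of its mixtures with arbitrary priors Q, and a pessimistic agent evaluates an act f by the worst case over that set:

```latex
\mathcal{C}_{\varepsilon} = \{\,(1-\varepsilon)P + \varepsilon Q \;:\; Q \in \Delta(S)\,\},
\qquad
V(f) = \min_{\mu \in \mathcal{C}_{\varepsilon}} \int_S u(f)\,d\mu
     = (1-\varepsilon)\int_S u(f)\,dP \;+\; \varepsilon \min_{s \in S} u\big(f(s)\big).
```

The optimistic (ε-exuberant) counterpart replaces the minimum over states with a maximum over the same set of priors.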
This advanced undergraduate/graduate textbook teaches students in finance and economics how to use R to analyse financial data and implement financial models. It demonstrates how to take publicly available data, manipulate it, implement models, and generate outputs typical of particular analyses. A wide spectrum of timely and practical issues in financial modelling is covered, including return and risk measurement, portfolio management, option pricing and fixed income analysis. This new edition updates and expands upon the existing material, providing updated examples and new chapters on equities, simulation and trading strategies, including machine learning techniques. Select data sets are available online.
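The book itself works in R; purely as a rough Python analogue of the opening workflow it describes (the file name and column are invented), loading prices and computing basic return and risk measures might look like this:

```python
# Sketch: load a price series, compute returns, annualized volatility,
# and the cumulative return over the sample.
import numpy as np
import pandas as pd

prices = pd.read_csv("prices.csv", index_col=0, parse_dates=True)  # hypothetical file
ret = prices["close"].pct_change().dropna()       # daily simple returns
ann_vol = ret.std() * np.sqrt(252)                # annualized volatility
cum_ret = (1 + ret).prod() - 1                    # cumulative return over the sample
print(ann_vol, cum_ret)
```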
These proceedings highlight research on the latest trends and methods in experimental and behavioral economics. Featuring contributions presented at the 2017 Computational Methods in Experimental Economics (CMEE) conference, held in Lublin, Poland, the volume merges findings from various domains to present deep insights into topics such as game theory, decision theory, cognitive neuroscience and artificial intelligence. The fields of experimental economics and behavioral economics are rapidly evolving. Modern applications of experimental economics require the integration of know-how from disciplines including economics, computer science, psychology and neuroscience. The use of computer technology enhances researchers' ability to generate and analyze large amounts of data, allowing them to use non-standard methods of data logging for experiments such as cognitive neuronal methods. Experiments are currently being conducted with software that, on the one hand, provides interaction with the people involved in experiments, and on the other helps to accurately record their responses. The goal of the CMEE conference and of the papers presented here is to provide the scientific community with essential research on and applications of computer methods in experimental economics. Combining theories, methods and regional case studies, the book offers a valuable resource for all researchers, scholars and policymakers in the areas of experimental and behavioral economics.
In response to the damage caused by a growth-led global economy, researchers across the world started investigating the association between environmental pollution and its possible determinants using different models and techniques. Most famously, the environmental Kuznets curve hypothesizes an inverted U-shaped association between environmental quality and gross domestic product (GDP). This book explores the latest literature on the environmental Kuznets curve, including developments in the methodology, the impacts of the pandemic, and other recent findings. Researchers have recently broadened the list of drivers of environmental pollution under consideration, which now includes variables such as foreign direct investment, trade expansion, financial development, human activities, population growth, and renewable and nonrenewable energy resources, all of which vary across different countries and times. And in addition to CO2 emissions, other proxies for environmental quality - such as water, land, and ecological footprints - have been used in recent studies. This book also incorporates analysis of the relationship between economic growth and the environment during the COVID-19 crisis, presenting new empirical work on the impact of the pandemic on energy use, the financial sector, trade, and tourism. Collectively, these developments have refined the environmental Kuznets curve hypothesis and broadened the basket of dependent and independent variables which may be incorporated. This book will be invaluable reading for researchers in environmental economics and econometrics.
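The inverted-U hypothesis is usually tested with a quadratic specification in log income; the notation below is a generic textbook form, not that of any particular study in the volume:

```latex
\ln E_{it} = \beta_0 + \beta_1 \ln Y_{it} + \beta_2 (\ln Y_{it})^2 + \varepsilon_{it},
\qquad \beta_1 > 0,\ \beta_2 < 0,
\qquad Y^{*} = \exp\!\left(-\frac{\beta_1}{2\beta_2}\right).
```

With β1 > 0 and β2 < 0, pollution rises with GDP up to the turning point Y* and falls thereafter.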
This practical guide to EViews is aimed at practitioners and students in business, economics, econometrics, and finance. It uses a step-by-step approach to equip readers with a toolkit that enables them to make the most of this widely used econometric analysis software. Statistical and econometrics concepts are explained visually with examples, problems, and solutions. Developed by economists, the EViews statistical software package is used most commonly for time-series oriented econometric analysis. It allows users to quickly develop statistical relations from data and then use those relations to forecast future values of the data. The package provides convenient ways to enter or upload data series, create new series from existing ones, display and print series, carry out statistical analyses of relationships among series, and manipulate results and output. This highly hands-on resource includes more than 200 illustrative graphs and tables and tutorials throughout. Abdulkader Aljandali is Senior Lecturer at Coventry University in London. He is currently leading the Stochastic Finance Module taught as part of the Global Financial Trading MSc. His previously published work includes Exchange Rate Volatility in Emerging Markets, Quantitative Analysis, Multivariate Methods & Forecasting with IBM SPSS Statistics and Multivariate Methods and Forecasting with IBM (R) SPSS (R) Statistics. Dr Aljandali is an established member of the British Accounting and Finance Association and the Higher Education Academy. Motasam Tatahi is a specialist in the areas of Macroeconomics, Financial Economics, and Financial Econometrics at the European Business School, Regent's University London, where he serves as Principal Lecturer and Dissertation Coordinator for the MSc in Global Banking and Finance at The European Business School-London.
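EViews itself is menu-driven and proprietary, so no EViews code is reproduced here; purely as an illustration of the workflow the blurb describes (load a series, create a new one from it, estimate a relation, forecast), a Python sketch with invented file and variable names:

```python
# Sketch of the load / transform / estimate / forecast cycle.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

y = pd.read_csv("gdp.csv", index_col=0, parse_dates=True)["gdp"]  # hypothetical file/column
growth = y.pct_change().dropna()            # create a new series from an existing one
res = ARIMA(growth, order=(1, 0, 0)).fit()  # estimate a simple AR(1) relation
print(res.params)
print(res.forecast(steps=4))                # forecast future values of the data
```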
This valuable text provides a comprehensive introduction to VAR modelling and how it can be applied. In particular, the author focuses on the properties of the Cointegrated VAR model and its implications for macroeconomic inference when data are non-stationary. The text provides a number of insights into the links between statistical econometric modelling and economic theory and gives a thorough treatment of identification of the long-run and short-run structure as well as of the common stochastic trends and the impulse response functions, providing in each case illustrations of applicability. This book presents the main ingredients of the Copenhagen School of Time-Series Econometrics in a transparent and coherent framework. The distinguishing feature of this school is that econometric theory and applications have been developed in close cooperation. The guiding principle is that good econometric work should take econometrics, institutions, and economics seriously. The author uses a single data set throughout most of the book to guide the reader through the econometric theory while also revealing the full implications for the underlying economic model. To ensure full understanding, the book concludes with the introduction of two new data sets that combine readers' understanding of econometric theory and economic models with economic reality.
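As a hedged sketch of the model class at the heart of the book, here is a Johansen trace test for cointegration rank on simulated data (two series driven by one common stochastic trend); the data and settings are illustrative only, not the book's data set.

```python
# Two I(1) series sharing one random-walk trend should have cointegration rank 1.
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(2)
trend = np.cumsum(rng.normal(size=500))        # common random-walk trend
x = trend + rng.normal(size=500)
y = 0.5 * trend + rng.normal(size=500)

res = coint_johansen(np.column_stack([x, y]), det_order=0, k_ar_diff=1)
print(res.lr1)    # trace statistics for rank 0 and rank 1
print(res.cvt)    # 90% / 95% / 99% critical values
```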
Panel Data Econometrics: Theory introduces econometric modelling. Written by experts from diverse disciplines, the volume uses longitudinal datasets to illuminate applications for a variety of fields, such as banking, financial markets, tourism and transportation, auctions, and experimental economics. Contributors emphasize techniques and applications, and they accompany their explanations with case studies, empirical exercises and supplementary code in R. They also address panel data analysis in the context of productivity and efficiency analysis, where some of the most interesting applications and advancements have recently been made.
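The volume's supplementary code is in R; as a rough Python analogue, the within (fixed-effects) estimator that underlies much panel work can be written in a few lines (the firm identifiers and data below are invented):

```python
# Fixed-effects estimation by the within transformation: demean by firm, then OLS.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "firm": np.repeat(np.arange(50), 10),      # 50 firms, 10 years each
    "x": rng.normal(size=500),
})
firm_effect = rng.normal(size=50)[df["firm"]]  # unobserved firm heterogeneity
df["y"] = 2.0 * df["x"] + firm_effect + rng.normal(size=500)

demeaned = df[["x", "y"]] - df.groupby("firm")[["x", "y"]].transform("mean")
beta = (demeaned["x"] @ demeaned["y"]) / (demeaned["x"] @ demeaned["x"])
print(beta)   # close to the true coefficient 2.0
```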
• Introduces the dynamics, principles and mathematics behind ten macroeconomic models, allowing students to visualise the models and understand the economic intuition behind them.
• Provides a step-by-step guide, and the necessary MATLAB codes, to allow readers to simulate and experiment with the models themselves.
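The book's codes are in MATLAB; purely as an illustration of the kind of simulation exercise it describes, here is a Python sketch of a textbook Solow growth model, with convergence checked against the analytical steady state (the parameter values are arbitrary, not the book's calibration):

```python
# Simulate capital per worker k_{t+1} = (s k_t^a + (1-d) k_t) / (1+n).
import numpy as np

alpha, s, delta, n = 0.3, 0.2, 0.05, 0.01   # technology, saving, depreciation, pop. growth
k = 1.0                                      # initial capital per worker
path = []
for _ in range(200):
    path.append(k)
    k = (s * k ** alpha + (1 - delta) * k) / (1 + n)

k_star = (s / (n + delta)) ** (1 / (1 - alpha))   # analytical steady state
print(path[-1], k_star)                           # the simulated path converges to k*
```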
This title investigates contemporary financial issues in the e-commerce market.
Globalization and information and communications technology (ICT) have played a pivotal role in revolutionizing value creation through the development of human capital formation. The constantly changing needs and structure of the labour market are primarily responsible for the conversion of a traditional economy relying fundamentally on the application of physical abilities to a knowledge-based economy relying on ideas, technologies and innovations. In this economy, knowledge has to be created, acquired, developed, transmitted, preserved and utilized for the improvement of individual and social welfare. Comparative Advantage in the Knowledge Economy: A National and Organizational Resource provides a comprehensive and insightful understanding of all the dimensions of a transition from a traditional to a knowledge economy. It attempts to explain how educational achievement, skilled manpower, and investment in knowledge capital and analytics will be key to a nation's comparative advantage in the globalized era. The volume should be of interest to students, researchers and teachers of economics, policy makers and advanced graduate students with an interest in economic analyses and development policy.
DEA is computational at its core, and this book is one of several that we will look to publish on the computational aspects of DEA. This book by Zhu and Cook deals with the micro aspects of handling and modeling data issues in DEA problems. DEA's use has grown with its capability of dealing with complex service-industry and public-service problems that require modeling both qualitative and quantitative data. This is a handbook treatment dealing with specific data problems, including: (1) imprecise data, (2) inaccurate data, (3) missing data, (4) qualitative data, (5) outliers, (6) undesirable outputs, (7) quality data, (8) statistical analysis, and (9) software and other data aspects of modeling complex DEA problems. In addition, the book demonstrates how to visualize DEA results when the data is more than 3-dimensional, and how to identify efficient units quickly and accurately.
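For orientation, here is a minimal input-oriented CCR DEA model, the linear program at the base of the methods the handbook extends, sketched with scipy on a tiny invented data set (the handbook itself treats far richer data issues):

```python
# For DMU o: minimize theta s.t. X @ lam <= theta * x_o, Y @ lam >= y_o, lam >= 0.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 6.0],     # input 1 for three DMUs
              [4.0, 2.0, 3.0]])    # input 2
Y = np.array([[1.0, 1.0, 2.0]])    # one output

def ccr_efficiency(o):
    n = X.shape[1]
    c = np.r_[1.0, np.zeros(n)]                  # minimize theta; variables are (theta, lam)
    A_in = np.c_[-X[:, o], X]                    # X @ lam - theta * x_o <= 0
    A_out = np.c_[np.zeros(Y.shape[0]), -Y]      # -(Y @ lam) <= -y_o
    A = np.vstack([A_in, A_out])
    b = np.r_[np.zeros(X.shape[0]), -Y[:, o]]
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (n + 1))
    return res.fun                               # efficiency score in (0, 1]

print([round(ccr_efficiency(o), 3) for o in range(X.shape[1])])
```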
For courses in Econometrics. A Clear, Practical Introduction to Econometrics Using Econometrics: A Practical Guide offers students an innovative introduction to elementary econometrics. Through real-world examples and exercises, the book covers the topic of single-equation linear regression analysis in an easily understandable format. The Seventh Edition is appropriate for all levels: beginner econometric students, regression users seeking a refresher, and experienced practitioners who want a convenient reference. Praised as one of the most important texts in the last 30 years, the book retains the clarity and practicality of previous editions, with a number of substantial improvements throughout.
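The book's core object, a single-equation linear regression, fits in a few lines of Python; the simulated data and numbers below are illustrative only, not the book's examples:

```python
# OLS for y = b0 + b1 * x + e on simulated data.
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=100)
y = 1.5 + 0.8 * x + rng.normal(scale=0.5, size=100)

X = np.column_stack([np.ones(100), x])            # add an intercept column
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS estimates of (1.5, 0.8)
print(beta_hat)
```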
The Socialist Industrial State (1976) examines the state-socialist system, taking as the central example the Soviet Union - where the goals and values of Marxism-Leninism and the particular institutions, the form of economy and polity, were first adopted and developed. It then considers the historical developments, differences in culture, the level of economic development and the political processes of different state-socialist countries around the globe.
Does game theory, the mathematical theory of strategic interaction, provide genuine explanations of human behaviour? Can game theory be used in economic consultancy or other normative contexts? Explaining Games: The Epistemic Programme in Game Theory, the first monograph on the philosophy of game theory, is a bold attempt to combine insights from epistemic logic and the philosophy of science to investigate the applicability of game theory in such fields as economics, philosophy and strategic consultancy. De Bruin proves new mathematical theorems about the beliefs, desires and rationality principles of individual human beings, and he explores in detail the logical form of game theory as it is used in explanatory and normative contexts. He argues that game theory reduces to rational choice theory if used as an explanatory device, and that game theory is nonsensical if used as a normative device. A provocative account of the history of game theory reveals that this is not bad news for all of game theory, though. Two central research programmes in game theory tried to find the ultimate characterisation of strategic interaction between rational agents. Yet, while the Nash Equilibrium Refinement Programme has done badly thanks to such research habits as overmathematisation, model-tinkering and introversion, the Epistemic Programme, De Bruin argues, has been rather successful in achieving this aim.
Written for those who need an introduction, Applied Time Series Analysis reviews applications of the popular econometric analysis technique across disciplines. Carefully balancing accessibility with rigor, it spans economics, finance, economic history, climatology, meteorology, and public health. Terence Mills provides a practical, step-by-step approach that emphasizes core theories and results without becoming bogged down by excessive technical details. Including univariate and multivariate techniques, Applied Time Series Analysis provides data sets and program files that support a broad range of multidisciplinary applications, distinguishing this book from others.
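In the spirit of the book's univariate examples, a short Python sketch (with invented monthly data) that splits a series into trend, seasonal, and irregular components; the tooling and data are my assumptions, not the book's program files:

```python
# Classical additive decomposition of a monthly series.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

idx = pd.date_range("2000-01", periods=120, freq="MS")
rng = np.random.default_rng(1)
y = pd.Series(0.05 * np.arange(120)                          # linear trend
              + 2 * np.sin(2 * np.pi * np.arange(120) / 12)  # annual cycle
              + rng.normal(0, 0.5, 120), index=idx)          # irregular noise

parts = seasonal_decompose(y, model="additive")
print(parts.trend.dropna().head())
print(parts.seasonal.head())
```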
Digital Asset Valuation and Cyber Risk Measurement: Principles of Cybernomics is a book about the future of risk and the future of value. It examines the indispensable role of economic modeling in the future of digitization, thus providing industry professionals with the tools they need to optimize the management of financial risks associated with this megatrend. The book addresses three problem areas: the valuation of digital assets, measurement of risk exposures of digital valuables, and economic modeling for the management of such risks. Employing a pair of novel cyber risk measurement units, bitmort and hekla, the book covers areas of value, risk, control, and return, each of which is viewed from the perspective of entity (e.g., individual, organization, business), portfolio (e.g., industry sector, nation-state), and global ramifications. Establishing adequate, holistic, and statistically robust data points on the entity, portfolio, and global levels for the development of a cybernomics databank is essential for the resilience of our shared digital future. This book also argues that existing economic value theories no longer apply to the digital era due to the unique characteristics of digital assets. It introduces six laws of digital theory of value, with the aim of adapting economic value theories to the digital and machine era.
JEAN-FRANÇOIS MERTENS This book presents a systematic exposition of the use of game theoretic methods in general equilibrium analysis. Clearly the first such use was by Arrow and Debreu, with the "birth" of general equilibrium theory itself, in using Nash's existence theorem (or a generalization) to prove the existence of a competitive equilibrium. But this use appeared possibly to be merely technical, borrowing some tools for proving a theorem. This book stresses the later contributions, where game theoretic concepts were used as such, to explain various aspects of the general equilibrium model. But clearly, each of those later approaches also provides per se a game theoretic proof of the existence of competitive equilibrium. Part A deals with the first such approach: the equality between the set of competitive equilibria of a perfectly competitive (i.e., every trader has negligible market power) economy and the core of the corresponding cooperative game.
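The equivalence in Part A rests on the standard notion of the core (the notation here is mine): an allocation x is in the core if no coalition S can improve upon it by reallocating only its members' own endowments e:

```latex
x \in \operatorname{Core}(\mathcal{E}) \iff
\nexists\, S \subseteq I,\ (y_i)_{i \in S}:\quad
\sum_{i \in S} y_i = \sum_{i \in S} e_i
\quad\text{and}\quad y_i \succ_i x_i \ \ \forall i \in S.
```

Perfect competition, i.e. negligible individual market power, is what collapses this set down to exactly the competitive equilibria.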
The aim of this publication is to identify and apply suitable methods for analysing and predicting the time series of gold prices, while acquainting the reader with the history and characteristics of the methods and with time series issues in general. Both statistical and econometric methods, and especially artificial intelligence methods, are used in the case studies. The publication presents both traditional and innovative methods at the theoretical level, always accompanied by a case study, i.e. their specific use in practice. Furthermore, a comprehensive comparative analysis of the individual methods is provided. The book is intended for academic staff and students of economics, as well as for scientists and practitioners dealing with time series prediction. From the point of view of practical application, it could provide useful information for speculators and traders on financial markets, especially commodity markets.
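One of the "artificial intelligence" case-study styles the publication describes, sketched with simulated data (a random walk standing in for the gold price, simple lags as features; none of this is the book's actual data or model):

```python
# Predict the next price level from its five most recent lags with a random forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
price = np.cumsum(rng.normal(0, 1, 600)) + 1800     # simulated price level

lags = 5
X = np.column_stack([price[i:-(lags - i)] for i in range(lags)])  # lagged features
y = price[lags:]
X_train, X_test, y_train, y_test = X[:500], X[500:], y[:500], y[500:]

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print(model.score(X_test, y_test))                  # out-of-sample R^2
```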
Equilibrium Problems and Applications develops a unified variational approach to deal with single-valued, set-valued and quasi-equilibrium problems. The authors promote original results in relationship with classical contributions to the field of equilibrium problems. The content is developed in the general setting of topological vector spaces and lies at the interplay between pure and applied nonlinear analysis, mathematical economics, and mathematical physics. This abstract approach is based on tools from various fields, including set-valued analysis, variational and hemivariational inequalities, fixed point theory, and optimization. Applications include models from mathematical economics, Nash equilibrium of non-cooperative games, and Browder variational inclusions. The content is self-contained and the book is mainly addressed to researchers in mathematics, economics and mathematical physics as well as to graduate students in applied nonlinear analysis.
This book develops the theory of productivity measurement using the empirical index number approach. The theory uses multiplicative indices and additive indicators as measurement tools, instead of relying on the usual neo-classical assumptions, such as the existence of a production function characterized by constant returns to scale, optimizing behavior of the economic agents, and perfect foresight. The theory can be applied to all the common levels of aggregation (micro, meso, and macro), and half of the book is devoted to accounting for the links existing between the various levels. Basic insights from National Accounts are thereby used. The final chapter is devoted to the decomposition of productivity change into the contributions of efficiency change, technological change, scale effects, and input or output mix effects. Applications on real-life data demonstrate the empirical feasibility of the theory. The book is directed to a variety of overlapping audiences: statisticians involved in measuring productivity change; economists interested in growth accounting; researchers relating macro-economic productivity change to its industrial sources; enterprise micro-data researchers; and business analysts interested in performance measurement.
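As one concrete instance of the multiplicative tools the book describes (my choice of illustration, not necessarily the book's), the Törnqvist output quantity index between periods 0 and 1, with revenue shares s_i, and productivity change measured as an output index over an input index:

```latex
\ln Q_Y = \sum_i \tfrac{1}{2}\big(s_i^{0} + s_i^{1}\big)\,
          \ln\frac{y_i^{1}}{y_i^{0}},
\qquad
\mathit{PROD} = \frac{Q_Y}{Q_X}.
```

No production function, optimizing behavior, or constant returns to scale is needed to compute such an index, which is the point of the empirical index number approach.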
This book sheds new light on a recently introduced monetary tool: negative interest rate policy (NIRP). It provides in-depth insight into this phenomenon, as conducted by central banks in several economies, for example the Eurozone, Switzerland and Japan, and into its possible impact on systemic risk. Although it was introduced as a temporary policy instrument, it may remain in use for a longer period and by a greater range of central banks than initially expected, so the book explores its effects and implications on the banking sector and financial markets, with a particular focus on potentially adverse consequences. There is a strong accent on the uniqueness of negative policy rates in the context of financial stability concerns. The authors assess whether NIRP has any, or in principle a stronger, impact on systemic risk than conventional monetary policy. The book focuses on presenting and evaluating the initial experiences of NIRP during normal, i.e. pre-COVID, times, rather than in periods in which pre-established macroeconomic relations are rapidly disrupted or, specifically, when the source of the disruption is not purely economic in nature, unlike in a systemic crisis. The authors adopt both theoretical and practical approaches to explore the key issues and outline the policy implications for both monetary and macroprudential authorities with respect to negative interest rate policy. The book will thus provide a useful guide for policymakers, academics, advanced students and researchers of financial economics and international finance.
You may like...
Predictive Intelligence in Biomedical… (Rajshree Srivastava, Nhu Gia Nguyen, …) Hardcover, R3,855
Deep Learning - Research and… (Siddhartha Bhattacharyya, Vaclav Snasel, …) Hardcover, R3,854
Machine Learning Applications in… (Goutam Kumar Bose, Pritam Pain) Hardcover, R5,327
Research Anthology on Artificial Neural… (Information Resources Management Association) Hardcover, R12,932