This practical guide to EViews is aimed at practitioners and students in business, economics, econometrics, and finance. It uses a step-by-step approach to equip readers with a toolkit that enables them to make the most of this widely used econometric analysis software. Statistical and econometric concepts are explained visually with examples, problems, and solutions. Developed by economists, the EViews statistical software package is used most commonly for time-series-oriented econometric analysis. It allows users to quickly develop statistical relations from data and then use those relations to forecast future values of the data. The package provides convenient ways to enter or upload data series, create new series from existing ones, display and print series, carry out statistical analyses of relationships among series, and manipulate results and output. This highly hands-on resource includes more than 200 illustrative graphs and tables, with tutorials throughout.

Abdulkader Aljandali is Senior Lecturer at Coventry University in London. He is currently leading the Stochastic Finance module taught as part of the Global Financial Trading MSc. His previously published work includes Exchange Rate Volatility in Emerging Markets, Quantitative Analysis and IBM SPSS Statistics, and Multivariate Methods and Forecasting with IBM SPSS Statistics. Dr Aljandali is an established member of the British Accounting and Finance Association and the Higher Education Academy. Motasam Tatahi is a specialist in macroeconomics, financial economics, and financial econometrics at the European Business School, Regent's University London, where he serves as Principal Lecturer and Dissertation Coordinator for the MSc in Global Banking and Finance.
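EViews is driven through its own menus and command language, so no EViews code is reproduced here; as a language-neutral illustration of the estimate-then-forecast workflow described above, the following minimal R sketch fits an AR(1) model to a simulated series and projects it forward (the data and model choice are assumptions for illustration).

```r
# Minimal sketch of the estimate-then-forecast workflow described above,
# written in R rather than EViews; the series is simulated for illustration.
set.seed(1)
y <- arima.sim(model = list(ar = 0.7), n = 200)  # stand-in for an uploaded series

fit <- arima(y, order = c(1, 0, 0))   # estimate an AR(1) relation from the data
fc  <- predict(fit, n.ahead = 12)     # forecast the next 12 observations

plot(y, xlim = c(0, 212), main = "Series and 12-step forecast")
lines(201:212, fc$pred, col = "red")                       # point forecasts
lines(201:212, fc$pred + 2 * fc$se, col = "red", lty = 2)  # rough 95% band
lines(201:212, fc$pred - 2 * fc$se, col = "red", lty = 2)
```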
Panel Data Econometrics: Theory introduces econometric modelling. Written by experts from diverse disciplines, the volume uses longitudinal datasets to illuminate applications for a variety of fields, such as banking, financial markets, tourism and transportation, auctions, and experimental economics. Contributors emphasize techniques and applications, and they accompany their explanations with case studies, empirical exercises and supplementary code in R. They also address panel data analysis in the context of productivity and efficiency analysis, where some of the most interesting applications and advancements have recently been made.
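The volume's contributors supply supplementary code in R; as a flavor of a basic panel workflow, here is a minimal fixed-effects (within) regression sketch. The plm package and the simulated firm-year data are illustrative assumptions, not material from the book.

```r
# Hedged sketch of a within (fixed-effects) panel regression in R;
# the plm package and the simulated data are illustrative assumptions.
library(plm)

set.seed(2)
d <- data.frame(firm = rep(1:50, each = 10),
                year = rep(2001:2010, times = 50))
d$x <- rnorm(500)
d$y <- 1 + 0.5 * d$x + rep(rnorm(50), each = 10) + rnorm(500)  # firm effects

fe <- plm(y ~ x, data = d, index = c("firm", "year"), model = "within")
summary(fe)  # within estimate of the slope on x, firm effects swept out
```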
This title investigates contemporary financial issues in the e-commerce market.
DEA is computational at its core, and this book will be one of several we look to publish on the computational aspects of DEA. This book by Zhu and Cook will deal with the micro aspects of handling and modeling data issues in DEA problems. DEA's use has grown with its capability of dealing with the complex service-industry and public-service types of problems that require modeling both qualitative and quantitative data. This will be a handbook treatment dealing with specific data problems, including: (1) imprecise data, (2) inaccurate data, (3) missing data, (4) qualitative data, (5) outliers, (6) undesirable outputs, (7) quality data, (8) statistical analysis, and (9) software and other data aspects of modeling complex DEA problems. In addition, the book will demonstrate how to visualize DEA results when the data is more than 3-dimensional, and how to identify efficient units quickly and accurately.
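As a hedged illustration of the kind of computation DEA rests on (generic, not code from the book), the input-oriented CCR efficiency score of each decision-making unit solves a small linear program; the lpSolve package and the toy data below are assumptions for the sketch.

```r
# Input-oriented CCR DEA via linear programming; lpSolve and the toy
# data are illustrative assumptions, not the book's material.
library(lpSolve)

X <- matrix(c(4, 7, 8, 4, 2,             # input 1 for 5 units
              3, 3, 1, 2, 4), ncol = 2)  # input 2
Y <- matrix(rep(1, 5), ncol = 1)         # single unit output
n <- nrow(X); m <- ncol(X); s <- ncol(Y)

ccr_input <- function(k) {
  # decision variables: (theta, lambda_1, ..., lambda_n)
  obj   <- c(1, rep(0, n))
  A_in  <- cbind(-X[k, ], t(X))          # sum_j lambda_j x_ij <= theta * x_ik
  A_out <- cbind(0, t(Y))                # sum_j lambda_j y_rj >= y_rk
  lp("min", obj, rbind(A_in, A_out),
     c(rep("<=", m), rep(">=", s)),
     c(rep(0, m), Y[k, ]))$solution[1]
}

round(sapply(1:n, ccr_input), 3)         # theta = 1 marks an efficient unit
```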
Does game theory, the mathematical theory of strategic interaction, provide genuine explanations of human behaviour? Can game theory be used in economic consultancy or other normative contexts? Explaining Games: The Epistemic Programme in Game Theory, the first monograph on the philosophy of game theory, is a bold attempt to combine insights from epistemic logic and the philosophy of science to investigate the applicability of game theory in such fields as economics, philosophy and strategic consultancy. De Bruin proves new mathematical theorems about the beliefs, desires and rationality principles of individual human beings, and he explores in detail the logical form of game theory as it is used in explanatory and normative contexts. He argues that game theory reduces to rational choice theory if used as an explanatory device, and that game theory is nonsensical if used as a normative device. A provocative account of the history of game theory reveals that this is not bad news for all of game theory, though. Two central research programmes in game theory tried to find the ultimate characterisation of strategic interaction between rational agents. Yet, while the Nash Equilibrium Refinement Programme has done badly thanks to such research habits as overmathematisation, model-tinkering and introversion, the Epistemic Programme, De Bruin argues, has been rather successful in achieving this aim.
Digital Asset Valuation and Cyber Risk Measurement: Principles of Cybernomics is a book about the future of risk and the future of value. It examines the indispensable role of economic modeling in the future of digitization, thus providing industry professionals with the tools they need to optimize the management of financial risks associated with this megatrend. The book addresses three problem areas: the valuation of digital assets, the measurement of risk exposures of digital valuables, and economic modeling for the management of such risks. Employing a pair of novel cyber risk measurement units, bitmort and hekla, the book covers areas of value, risk, control, and return, each of which is viewed from the perspective of entity (e.g., individual, organization, business), portfolio (e.g., industry sector, nation-state), and global ramifications. Establishing adequate, holistic, and statistically robust data points on the entity, portfolio, and global levels for the development of a cybernomics databank is essential for the resilience of our shared digital future. This book also argues that existing economic value theories no longer apply to the digital era due to the unique characteristics of digital assets. It introduces six laws of the digital theory of value, with the aim of adapting economic value theories to the digital and machine era.
JEAN-FRANÇOIS MERTENS: This book presents a systematic exposition of the use of game theoretic methods in general equilibrium analysis. Clearly the first such use was by Arrow and Debreu, with the "birth" of general equilibrium theory itself, in using Nash's existence theorem (or a generalization) to prove the existence of a competitive equilibrium. But this use appeared possibly to be merely technical, borrowing some tools for proving a theorem. This book stresses the later contributions, where game theoretic concepts were used as such, to explain various aspects of the general equilibrium model. But clearly, each of those later approaches also provides per se a game theoretic proof of the existence of competitive equilibrium. Part A deals with the first such approach: the equality between the set of competitive equilibria of a perfectly competitive (i.e., every trader has negligible market power) economy and the core of the corresponding cooperative game.
The aim of this publication is to identify and apply suitable methods for analysing and predicting the time series of gold prices, while acquainting the reader with the history and characteristics of the methods and with time series issues in general. Both statistical and econometric methods, and especially artificial intelligence methods, are used in the case studies. The publication presents both traditional and innovative methods at the theoretical level, always accompanied by a case study, i.e. their specific use in practice. Furthermore, a comprehensive comparative analysis of the individual methods is provided. The book is intended for academic staff and university students of economics, as well as scientists and practitioners dealing with time series prediction. From the point of view of practical application, it could provide useful information for speculators and traders on financial markets, especially the commodity markets.
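A minimal sketch of the comparative, case-study approach described above, assuming a simulated series in place of actual gold prices: an ARIMA model is scored against a naive random-walk benchmark on a holdout sample (the book's artificial-intelligence methods would be evaluated in the same way).

```r
# Compare two forecasting methods on a holdout sample; the price series
# is simulated, standing in for gold price data.
set.seed(3)
p <- 100 + cumsum(rnorm(300, mean = 0.1))   # stand-in price series
train <- p[1:250]; test <- p[251:300]

fit_arima <- arima(train, order = c(1, 1, 0))        # statistical method
fc_arima  <- predict(fit_arima, n.ahead = 50)$pred
fc_naive  <- rep(tail(train, 1), 50)                 # random-walk benchmark

rmse <- function(f) sqrt(mean((test - f)^2))
c(arima = rmse(fc_arima), naive = rmse(fc_naive))    # lower RMSE wins
```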
Equilibrium Problems and Applications develops a unified variational approach to deal with single-valued, set-valued and quasi-equilibrium problems. The authors promote original results in relationship with classical contributions to the field of equilibrium problems. The content is developed in the general setting of topological vector spaces and lies at the interplay between pure and applied nonlinear analysis, mathematical economics, and mathematical physics. This abstract approach is based on tools from various fields, including set-valued analysis, variational and hemivariational inequalities, fixed point theory, and optimization. Applications include models from mathematical economics, Nash equilibrium of non-cooperative games, and Browder variational inclusions. The content is self-contained, and the book is mainly addressed to researchers in mathematics, economics and mathematical physics, as well as to graduate students in applied nonlinear analysis.
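For orientation, the classical single-valued equilibrium problem that such treatments generalize is standardly stated as follows (a textbook formulation, consistent with but not quoted from this book):

```latex
% Classical (Ky Fan) equilibrium problem:
\text{given a nonempty set } K \text{ and a bifunction } f : K \times K \to \mathbb{R},
\quad \text{find } \bar{x} \in K \text{ such that } f(\bar{x}, y) \ge 0 \ \text{ for all } y \in K.
```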
This book develops the theory of productivity measurement using the empirical index number approach. The theory uses multiplicative indices and additive indicators as measurement tools, instead of relying on the usual neo-classical assumptions, such as the existence of a production function characterized by constant returns to scale, optimizing behavior of the economic agents, and perfect foresight. The theory can be applied to all the common levels of aggregation (micro, meso, and macro), and half of the book is devoted to accounting for the links existing between the various levels. Basic insights from National Accounts are thereby used. The final chapter is devoted to the decomposition of productivity change into the contributions of efficiency change, technological change, scale effects, and input or output mix effects. Applications on real-life data demonstrate the empirical feasibility of the theory. The book is directed to a variety of overlapping audiences: statisticians involved in measuring productivity change; economists interested in growth accounting; researchers relating macro-economic productivity change to its industrial sources; enterprise micro-data researchers; and business analysts interested in performance measurement.
The Regression Discontinuity (RD) design is one of the most popular and credible research designs for program evaluation and causal inference. Volume 38 of Advances in Econometrics collects twelve innovative and thought-provoking contributions to the RD literature, covering a wide range of methodological and practical topics. Some chapters touch on foundational methodological issues such as identification, interpretation, implementation, falsification testing, estimation and inference, while others focus on more recent topics such as identification and interpretation in a discontinuity-in-density framework, empirical structural estimation, comparative RD methods, and extrapolation. These chapters not only give new insights for current methodological and empirical research, but also provide new bases and frameworks for future work in this area. This volume contributes to the rapidly expanding RD literature by bringing together theoretical and applied econometricians, statisticians, and social, behavioural and biomedical scientists, in the hope that these interactions will further spark innovative practical developments in this important and active research area.
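For readers new to the design, here is a minimal sharp-RD sketch on simulated data: local linear fits on either side of the cutoff recover the jump, with a fixed bandwidth assumed purely for illustration (practical work uses data-driven bandwidth selection and robust inference).

```r
# Sharp RD on simulated data: local linear regression within a bandwidth
# around the cutoff; the bandwidth here is an assumed constant.
set.seed(4)
x <- runif(1000, -1, 1)                      # running variable, cutoff at 0
treat <- as.numeric(x >= 0)
y <- 1 + 0.5 * x + 2 * treat + rnorm(1000)   # true discontinuity = 2

h   <- 0.3                                   # illustrative fixed bandwidth
win <- abs(x) <= h
fit <- lm(y ~ x * treat, subset = win)       # separate slopes on each side
coef(fit)["treat"]                           # estimated jump at the cutoff
```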
Belief and Rule Compliance: An Experimental Comparison of Muslim and Non-Muslim Economic Behavior uses modern behavioral science and game theory to examine the behavior of Muslim populations and their compliance with Islamic finance laws and norms. The work identifies behaviors characterized by unexpected complexity and profound divergence, including expectations for sharing, cooperation and entrepreneurship gleaned from studies. Adopting a unique set of recent empirical observations, the work provides a reliable behavioral foundation for practitioners seeking to evaluate, create and market Islamic financial products.
Both parts of Volume 44 of Advances in Econometrics pay tribute to Fabio Canova for his major contributions to economics over the last four decades. Throughout his long and distinguished career, Canova has both achieved a prolific publication record and provided stellar research to the profession. His colleagues, co-authors and PhD students wish to express their deep gratitude to Fabio for his intellectual leadership and guidance, whilst showcasing the extensive advances in knowledge and theory made available by Canova for professionals in the field. Advances in Econometrics publishes original scholarly econometrics papers with the intention of expanding the use of developed and emerging econometric techniques by disseminating ideas on the theory and practice of econometrics throughout the empirical economic, business and social science literature. Annual volume themes, selected by the Series Editors, reflect their interpretation of important new methods and techniques emerging in economics, statistics and the social sciences.
This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.
What is econophysics? What makes an econophysicist? Why are financial economists reluctant to use results from econophysics? Can we overcome disputes concerning hypotheses that are used in financial economics but make no sense to econophysicists? How can we create a profitable dialogue between financial economists and econophysicists? How do we develop a common theoretical framework allowing the creation of more efficient models for the financial industry? This book moves beyond disciplinary frontiers in order to initiate the development of a common theoretical framework that makes sense for both traditionally trained financial economists and econophysicists. Unlike other publications dedicated to econophysics, this book is written by two financial economists, and it situates econophysics in the evolution of financial economics. The major issues that concern the collaboration between the two fields are analyzed in detail. More specifically, this book explains the theoretical and methodological foundations of these two fields in an accessible vocabulary, providing the first extensive analytic comparison between models and results from both fields. The book also identifies the major conceptual gate-keepers that complicate dialogue between the two communities, and it provides elements to overcome them. By combining conceptual, historical, theoretical and formal arguments, our analysis bridges the current dialogue of the deaf between financial economists and econophysicists. This book details the recent results in econophysics that bring it closer to financial economics. In so doing, it identifies what remains to be done for econophysicists to contribute significantly to financial economics. Beyond clarifying the current situation, this book also proposes a generic model compatible with the two fields, defining minimal conditions for common models. Finally, it provides a research agenda for a more fruitful collaboration between econophysicists and financial economists, creating new research opportunities. In this perspective, it lays the foundations for a common theoretical framework and models.
This is the first book that examines the diverse range of experimental methods currently being used in the social sciences, gathering contributions by working economists engaged in experimentation, as well as by a political scientist, psychologists and philosophers of the social sciences. Until the mid-twentieth century, most economists believed that experiments in the economic sciences were impossible. But that's hardly the case today, as evinced by the fact that Vernon Smith, an experimental economist, and Daniel Kahneman, a behavioral economist, won the Nobel Prize in Economics in 2002. However, the current use of experimental methods in economics is more diverse than is usually assumed. As the concept of experimentation underwent considerable abstraction throughout the twentieth century, the areas of the social sciences in which experiments are applied are expanding, creating renewed interest in, and multifaceted debates on, the way experimental methods are used. This book sheds new light on the diversity of experimental methodologies used in the social sciences. The topics covered include historical insights into the evolution of experimental methods; the necessary "performativity" of experiments, i.e., the dynamic interaction with the social contexts in which they are embedded; the application of causal inferences in the social sciences; a comparison of laboratory, field, and natural experiments; and the recent use of randomized controlled trials (RCTs) in development economics. Several chapters also deal with the latest heated debates, such as those concerning the use of the random lottery method in laboratory experiments.
This book examines the macroeconomic and regulatory impact of domestic and international shocks on the South African economy resulting from the 2009 financial crisis. It also assesses the impact of the US economy's eventual recovery from the crisis and the prospect of higher US interest rates in future. Told in three parts, the book explores the associations between economic growth, policy uncertainty, and the key domestic and international transmission channels, as well as the transmission effects of global financial regulatory and domestic macroeconomic uncertainties on subdued and volatile economic recovery, financial channels, lending rate margins, and credit growth. The book concludes by extending its focus to the role of US monetary policy, capital flows and rand/US dollar volatility in the South African economy.
Partial least squares structural equation modeling (PLS-SEM) has become a standard approach for analyzing complex inter-relationships between observed and latent variables. Researchers appreciate the many advantages of PLS-SEM such as the possibility to estimate very complex models and the method's flexibility in terms of data requirements and measurement specification. This practical open access guide provides a step-by-step treatment of the major choices in analyzing PLS path models using R, a free software environment for statistical computing, which runs on Windows, macOS, and UNIX computer platforms. Adopting the R software's SEMinR package, which brings a friendly syntax to creating and estimating structural equation models, each chapter offers a concise overview of relevant topics and metrics, followed by an in-depth description of a case study. Simple instructions give readers the "how-tos" of using SEMinR to obtain solutions and document their results. Rules of thumb in every chapter provide guidance on best practices in the application and interpretation of PLS-SEM.
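As a flavor of the workflow the guide walks through, here is a minimal SEMinR sketch following the package's constructs/relationships/estimate_pls pattern; the two-construct model and the bundled mobi demo data are illustrative choices, not the book's case study.

```r
# Minimal PLS-SEM estimation with SEMinR; the model and demo data are
# illustrative, not the book's worked example.
library(seminr)

measurement <- constructs(
  composite("Image",        multi_items("IMAG", 1:5)),
  composite("Satisfaction", multi_items("CUSA", 1:3))
)
structure <- relationships(
  paths(from = "Image", to = "Satisfaction")
)

model <- estimate_pls(data = mobi,
                      measurement_model = measurement,
                      structural_model  = structure)
summary(model)   # loadings, path coefficient, R-squared, reliability
```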
In many industries tariffs are not strictly proportional to the quantity purchased, i.e., they are nonlinear. Examples of nonlinear tariffs include railroad and electricity schedules and rental rates for durable goods and space. The major justification for nonlinear pricing is the existence of private information on the side of consumers. In the early papers on the subject, private information was captured either by assuming a finite number of types (e.g., Adams and Yellen, 1976) or by a unidimensional continuum of types (Mussa and Rosen, 1978). The economics of unidimensional problems is by now well understood. The unidimensional models, however, do not cover all situations of practical interest. Indeed, the nonlinear tariffs often specify the payment as a function of a variety of characteristics. For example, railroad tariffs specify charges based on weight, volume, and distance of each shipment. Different customers may value each of these characteristics differently, hence the customer's type will not in general be captured by a unidimensional characteristic, and a problem of multidimensional screening arises. In such models the consumer's private information (her type) is captured by an m-dimensional vector, while the good produced by the monopolist has n quality dimensions.
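For reference, the unidimensional (Mussa-Rosen) benchmark from which the multidimensional problem departs can be written in the standard form (not quoted from this book); in the multidimensional case described above, the type becomes an m-vector and quality an n-vector.

```latex
% Unidimensional screening (Mussa--Rosen benchmark): type $\theta$ has
% utility $\theta q - t$; $c(q)$ is the production cost, $f$ the type density.
\max_{q(\cdot),\,t(\cdot)} \int_{\underline{\theta}}^{\overline{\theta}}
  \bigl[\, t(\theta) - c\bigl(q(\theta)\bigr) \bigr] f(\theta)\,\mathrm{d}\theta
\quad \text{s.t.} \quad
\theta q(\theta) - t(\theta) \ \ge\ \theta q(\theta') - t(\theta'), \qquad
\theta q(\theta) - t(\theta) \ \ge\ 0 \quad \text{for all } \theta, \theta'.
```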
This book provides an introduction to the technical background of unit root testing, one of the most heavily researched areas in econometrics over the last twenty years. Starting from an elementary understanding of probability and time series, it develops the key concepts necessary to understand the structure of random walks and Brownian motion, and their role in tests for a unit root. The techniques are illustrated with worked examples, data, and programs available on the book's website, which includes more numerical and theoretical examples.
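A minimal sketch of the central object, assuming simulated data: the Dickey-Fuller regression run by hand on a random walk (a generic illustration, not one of the book's worked examples).

```r
# Dickey-Fuller regression "by hand" on a simulated random walk.
set.seed(5)
y    <- cumsum(rnorm(250))       # random walk: the unit-root null is true
dy   <- diff(y)
ylag <- y[-length(y)]

df_reg <- lm(dy ~ ylag)          # Delta y_t = alpha + rho * y_{t-1} + e_t
summary(df_reg)$coefficients["ylag", ]
# The t-statistic on ylag is the DF statistic; under the unit-root null it
# follows the Dickey-Fuller distribution (5% critical value near -2.86 with
# a constant), not the standard t distribution.
```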
The volume highlights state-of-the-art knowledge (including data analysis) in productivity, inequality and efficiency analysis. It showcases a selection of the best papers from the 9th North American Productivity Workshop. These papers are relevant not only to academia, but also to the public and private sectors in terms of the challenges that firms, financial institutions, governments, and individuals may face when dealing with economic and education-related activities that lead to increases or decreases in productivity. The volume also aims to bring together ideas from different parts of the world about the challenges that local economies and institutions may face when changes in productivity are observed. These contributions focus on theoretical and empirical research in areas including productivity, production theory and efficiency measurement in economics, management science, operations research, public administration, and education. The North American Productivity Workshop (NAPW) brings together academic scholars and practitioners in the field of productivity and efficiency analysis from all over the world, and this proceedings volume is a reflection of this mission. The papers in this volume also address general topics such as education, health, energy, finance, agriculture, transport, utilities, and economic development, among others. The editorial team comprises the 2016 local organizers, program committee members, and celebrated guest conference speakers.
In response to the damage caused by a growth-led global economy, researchers across the world have investigated the association between environmental pollution and its possible determinants using different models and techniques. Most famously, the environmental Kuznets curve hypothesizes an inverted U-shaped association between environmental quality and gross domestic product (GDP). This book explores the latest literature on the environmental Kuznets curve, including developments in the methodology, the impacts of the pandemic, and other recent findings. Researchers have recently broadened the list of drivers of environmental pollution under consideration, which now includes variables such as foreign direct investment, trade expansion, financial development, human activities, population growth, and renewable and nonrenewable energy resources, all of which vary across different countries and times. And in addition to CO2 emissions, other proxies for environmental quality, such as water, land, and ecological footprints, have been used in recent studies. This book also incorporates analysis of the relationship between economic growth and the environment during the COVID-19 crisis, presenting new empirical work on the impact of the pandemic on energy use, the financial sector, trade, and tourism. Collectively, these developments have refined the environmental Kuznets curve hypothesis and broadened the basket of dependent and independent variables which may be incorporated. This book will be invaluable reading for researchers in environmental economics and econometrics.
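As a minimal illustration of the basic EKC regression (simulated cross-section; real studies use panel estimators and the richer drivers listed above), pollution is modeled as quadratic in log income and the turning point is recovered from the fitted coefficients.

```r
# Basic EKC regression on simulated data: an inverted U in log income,
# with the income turning point recovered as exp(-b1 / (2 * b2)).
set.seed(6)
lgdp <- runif(400, 6, 11)                            # log GDP per capita
lco2 <- -4 + 1.6 * lgdp - 0.09 * lgdp^2 + rnorm(400, sd = 0.2)

ekc <- lm(lco2 ~ lgdp + I(lgdp^2))
b   <- coef(ekc)
exp(-b["lgdp"] / (2 * b["I(lgdp^2)"]))               # estimated turning point
```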
The book provides a comprehensive overview of the latest econometric methods for studying the dynamics of macroeconomic and financial time series. It examines alternative methodological approaches and concepts, including quantile spectra and co-spectra, and explores topics such as non-linear and non-stationary behavior, stochastic volatility models, and the econometrics of commodity markets and globalization. Furthermore, it demonstrates the application of recent techniques in various fields: in the frequency domain, in the analysis of persistent dynamics, in the estimation of state space models and new classes of volatility models. The book is divided into two parts: The first part applies econometrics to the field of macroeconomics, discussing trend/cycle decomposition, growth analysis, monetary policy and international trade. The second part applies econometrics to a wide range of topics in financial economics, including price dynamics in equity, commodity and foreign exchange markets and portfolio analysis. The book is essential reading for scholars, students, and practitioners in government and financial institutions interested in applying recent econometric time series methods to financial and economic data.
The Socialist Industrial State (1976) examines the state-socialist system, taking as the central example the Soviet Union - where the goals and values of Marxism-Leninism and the particular institutions, the form of economy and polity, were first adopted and developed. It then considers the historical developments, differences in culture, the level of economic development and the political processes of different state-socialist countries around the globe.
This text covers the basic theory and computation for mathematical modeling in linear programming. It provides a strong background on how to set up mathematical proofs and high-level computation methods, and includes substantial background material and direction. Paris presents an intuitive and novel discussion of what it means to solve a system of equations, a crucial stepping stone for solving any linear program. The discussion of the simplex method for solving linear programs gives an economic interpretation to every step of the simplex algorithm. The text combines in a unique and novel way the microeconomics of production with the structure of linear programming to give students and scholars of economics a clear notion of what it means to formulate a model of economic equilibrium and to compute opportunity cost in the presence of many outputs and inputs.
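A hedged sketch connecting the text's themes to computation: a two-output production LP solved with the lpSolve package (an assumed tool, not the book's own software), where the dual values of the resource constraints are exactly the opportunity costs, or shadow prices, the text discusses.

```r
# Two-output production LP; the constraint duals are the shadow prices
# (opportunity costs) of the resources. lpSolve is an assumed tool here.
library(lpSolve)

obj <- c(3, 5)                       # profit per unit of outputs 1 and 2
con <- rbind(c(1, 2),                # labour used per unit of each output
             c(3, 1))                # capital used per unit of each output
rhs <- c(14, 18)                     # resource endowments

res <- lp("max", obj, con, rep("<=", 2), rhs, compute.sens = TRUE)
res$solution      # optimal output mix
res$duals[1:2]    # shadow prices of labour and capital
```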
You may like...
Silicon and Nano-silicon in… by Hassan Etesami, Abdullah H. Al Saeedi, … (Paperback) R3,480 (Discovery Miles 34 800)
Hazardous and Trace Materials in Soil… by M. Naeem, Tariq Aftab, … (Paperback) R2,941 (Discovery Miles 29 410)
Fungi Associated with Pandanaceae by Stephen R. Whitton, Eric H. C. McKenzie, … (Hardcover) R5,217 (Discovery Miles 52 170)
Molecular Biological Technologies for… by Sonia M. Tiquia-Arashiro (Hardcover) R2,712 (Discovery Miles 27 120)
The Book Of Joy - Lasting Happiness In A… by Dalai Lama, Desmond Tutu (Hardcover) (11)