Presents recent developments in the probabilistic assessment of system dependability based on stochastic models, including graph theory, finite-state automata, and language theory, for both dynamic and hybrid contexts.
Does game theory, the mathematical theory of strategic interaction, provide genuine explanations of human behaviour? Can game theory be used in economic consultancy or other normative contexts? Explaining Games: The Epistemic Programme in Game Theory, the first monograph on the philosophy of game theory, is a bold attempt to combine insights from epistemic logic and the philosophy of science to investigate the applicability of game theory in such fields as economics, philosophy and strategic consultancy. De Bruin proves new mathematical theorems about the beliefs, desires and rationality principles of individual human beings, and he explores in detail the logical form of game theory as it is used in explanatory and normative contexts. He argues that game theory reduces to rational choice theory if used as an explanatory device, and that game theory is nonsensical if used as a normative device. A provocative account of the history of game theory reveals that this is not bad news for all of game theory, though. Two central research programmes in game theory tried to find the ultimate characterisation of strategic interaction between rational agents. Yet, while the Nash Equilibrium Refinement Programme has fared badly thanks to such research habits as overmathematisation, model-tinkering and introversion, the Epistemic Programme, De Bruin argues, has been rather successful in achieving this aim.
Digital Asset Valuation and Cyber Risk Measurement: Principles of Cybernomics is a book about the future of risk and the future of value. It examines the indispensable role of economic modeling in the future of digitization, providing industry professionals with the tools they need to optimize the management of financial risks associated with this megatrend. The book addresses three problem areas: the valuation of digital assets, the measurement of risk exposures of digital valuables, and economic modeling for the management of such risks. Employing a pair of novel cyber risk measurement units, bitmort and hekla, the book covers value, risk, control, and return, each of which is viewed from the perspective of the entity (e.g., individual, organization, business), the portfolio (e.g., industry sector, nation-state), and global ramifications. Establishing adequate, holistic, and statistically robust data points at the entity, portfolio, and global levels for the development of a cybernomics databank is essential for the resilience of our shared digital future. The book also argues that existing economic value theories no longer apply to the digital era due to the unique characteristics of digital assets, and it introduces six laws of a digital theory of value, with the aim of adapting economic value theories to the digital and machine era.
Provides sound knowledge of optimal decision making in statistics and operations research problems. Serves as a quick reference by surveying the research literature on the subject, including commercial value-added research applications in statistics and operations research. Provides sound knowledge of optimisation and statistical techniques in the modelling of real-world problems. Reviews recent developments and contributions in optimal decision-making problems using optimisation and statistical techniques. Provides an understanding of the formulation of decision-making problems and their solution procedures. Describes the latest developments in the modelling of real-world problems and their solution approaches.
"A book perfect for this moment" -Katherine M. O'Regan, Former Assistant Secretary, US Department of Housing and Urban Development More than fifty years after the passage of the Fair Housing Act, American cities remain divided along the very same lines that this landmark legislation explicitly outlawed. Keeping Races in Their Places tells the story of these lines-who drew them, why they drew them, where they drew them, and how they continue to circumscribe residents' opportunities to this very day. Weaving together sophisticated statistical analyses of more than a century's worth of data with an engaging, accessible narrative that brings the numbers to life, Keeping Races in Their Places exposes the entrenched effects of redlining on American communities. This one-of-a-kind contribution to the real estate and urban economics literature applies the author's original geographic information systems analyses to historical maps to reveal redlining's causal role in shaping today's cities. Spanning the era from the Great Migration to the Great Recession, Keeping Races in Their Places uncovers the roots of the Black-white wealth gap, the subprime lending crisis, and today's lack of affordable housing in maps created by banks nearly a century ago. Most of all, it offers hope that with the latest scholarly tools we can pinpoint how things went wrong-and what we must do to make them right.
The generalized method of moments (GMM) has become one of the main statistical tools for the analysis of economic and financial data. Designed for both theoreticians and practitioners, this book provides a comprehensive treatment of GMM estimation and inference. All the main statistical results are discussed intuitively and proved formally, and all the inference techniques are illustrated using empirical examples in macroeconomics and finance. This book is the first to provide an intuitive introduction to the method combined with a unified treatment of GMM statistical theory and a survey of recent important developments in the field.
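For orientation (a standard textbook formulation, not quoted from this catalogue copy): given moment conditions $E[g(x_t,\theta_0)] = 0$ and a positive semi-definite weighting matrix $W_T$, the GMM estimator minimises a quadratic form in the sample moments,

$$\hat{\theta}_T = \arg\min_{\theta \in \Theta} \; \bar{g}_T(\theta)' \, W_T \, \bar{g}_T(\theta), \qquad \bar{g}_T(\theta) = \frac{1}{T} \sum_{t=1}^{T} g(x_t, \theta),$$

with efficiency achieved when $W_T$ consistently estimates the inverse of the long-run variance of the moment conditions.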
Medicine Price Surveys, Analyses and Comparisons establishes guidelines for the study and implementation of pharmaceutical price surveys, analyses, and comparisons. Its contributors evaluate price survey literature, discuss the accessibility and reliability of data sources, and provide a checklist and training kit on conducting price surveys, analyses, and comparisons. Their investigations survey price studies while accounting for the effects of methodologies and explaining regional differences in medicine prices. They also consider policy objectives such as affordable access to medicines and cost-containment as well as options for improving the effectiveness of policies.
The aim of this publication is to identify and apply suitable methods for analysing and predicting the time series of gold prices, while also acquainting the reader with the history and characteristics of these methods and with time series issues in general. Statistical and econometric methods, and especially artificial intelligence methods, are used in the case studies. The publication presents both traditional and innovative methods at the theoretical level, each accompanied by a case study demonstrating its use in practice. Furthermore, a comprehensive comparative analysis of the individual methods is provided. The book is intended for academic staff and students of economics, as well as scientists and practitioners dealing with time series prediction. From the point of view of practical application, it could also provide useful information for speculators and traders on financial markets, especially commodity markets.
This book develops the theory of productivity measurement using the empirical index number approach. The theory uses multiplicative indices and additive indicators as measurement tools, instead of relying on the usual neo-classical assumptions, such as the existence of a production function characterized by constant returns to scale, optimizing behavior of the economic agents, and perfect foresight. The theory can be applied to all the common levels of aggregation (micro, meso, and macro), and half of the book is devoted to accounting for the links existing between the various levels. Basic insights from National Accounts are thereby used. The final chapter is devoted to the decomposition of productivity change into the contributions of efficiency change, technological change, scale effects, and input or output mix effects. Applications on real-life data demonstrate the empirical feasibility of the theory. The book is directed to a variety of overlapping audiences: statisticians involved in measuring productivity change; economists interested in growth accounting; researchers relating macro-economic productivity change to its industrial sources; enterprise micro-data researchers; and business analysts interested in performance measurement.
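As a minimal illustration of the index-number approach (my sketch, not the book's notation): total factor productivity change between a base period 0 and a comparison period 1 can be measured as the ratio of an output quantity index to an input quantity index,

$$\mathrm{TFP}_{0,1} = \frac{Q^{Y}_{0,1}}{Q^{X}_{0,1}},$$

where $Q^{Y}_{0,1}$ and $Q^{X}_{0,1}$ are, for instance, Fisher or Törnqvist indices aggregating the quantities of outputs and inputs; no production function or optimizing behaviour needs to be assumed.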
Belief and Rule Compliance: An Experimental Comparison of Muslim and Non-Muslim Economic Behavior uses modern behavioral science and game theory to examine how Muslim populations comply with Islamic finance laws and norms. The work identifies behaviors of unexpected complexity and profound divergence, including expectations for sharing, cooperation and entrepreneurship gleaned from its studies. Drawing on a unique set of recent empirical observations, the work provides a reliable behavioral foundation for practitioners seeking to evaluate, create and market Islamic financial products.
This book presents a unique collection of contributions on modern topics in statistics and econometrics, written by leading experts in the respective disciplines and their intersections. It addresses nonparametric statistics and econometrics, quantiles and expectiles, and advanced methods for complex data, including spatial and compositional data, as well as tools for empirical studies in economics and the social sciences. The book was written in honor of Christine Thomas-Agnan on the occasion of her 65th birthday. Given its scope, it will appeal to researchers and PhD students in statistics and econometrics alike who are interested in the latest developments in their field.
Both parts of Volume 44 of Advances in Econometrics pay tribute to Fabio Canova for his major contributions to economics over the last four decades. Throughout his long and distinguished career, Canova has combined a prolific publication record with stellar research contributions to the profession. His colleagues, co-authors and PhD students wish to express their deep gratitude to Fabio for his intellectual leadership and guidance, while showcasing the extensive advances in knowledge and theory that Canova has made available to professionals in the field. Advances in Econometrics publishes original scholarly econometrics papers with the intention of expanding the use of developed and emerging econometric techniques by disseminating ideas on the theory and practice of econometrics throughout the empirical economic, business and social science literature. Annual volume themes, selected by the Series Editors, reflect their interpretation of important new methods and techniques emerging in economics, statistics and the social sciences.
The regression discontinuity (RD) design is one of the most popular and credible research designs for program evaluation and causal inference. Volume 38 of Advances in Econometrics collects twelve innovative and thought-provoking contributions to the RD literature, covering a wide range of methodological and practical topics. Some chapters address foundational methodological issues such as identification, interpretation, implementation, falsification testing, estimation and inference, while others focus on more recent topics such as identification and interpretation in a discontinuity-in-density framework, empirical structural estimation, comparative RD methods, and extrapolation. These chapters not only offer new insights for current methodological and empirical research, but also provide new bases and frameworks for future work in this area. This volume contributes to the rapidly expanding RD literature by bringing together theoretical and applied econometricians, statisticians, and social, behavioural and biomedical scientists, in the hope that these interactions will further spark innovative practical developments in this important and active research area.
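For context (the standard formulation, not quoted from the volume): in a sharp RD design with running variable $X_i$, cutoff $c$, and outcome $Y_i$, the design identifies the average treatment effect at the cutoff as

$$\tau_{\mathrm{SRD}} = \lim_{x \downarrow c} E[Y_i \mid X_i = x] \;-\; \lim_{x \uparrow c} E[Y_i \mid X_i = x],$$

estimated in practice by local polynomial regressions fitted separately on each side of the cutoff.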
Equilibrium Problems and Applications develops a unified variational approach to single-valued, set-valued and quasi-equilibrium problems. The authors present original results in relation to classical contributions to the field of equilibrium problems. The content is developed in the general setting of topological vector spaces and lies at the interplay between pure and applied nonlinear analysis, mathematical economics, and mathematical physics. This abstract approach draws on tools from various fields, including set-valued analysis, variational and hemivariational inequalities, fixed point theory, and optimization. Applications include models from mathematical economics, Nash equilibria of non-cooperative games, and Browder variational inclusions. The content is self-contained, and the book is mainly addressed to researchers in mathematics, economics and mathematical physics, as well as to graduate students in applied nonlinear analysis.
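In the classical formulation going back to Ky Fan (stated here for orientation; the notation is mine, not the authors'): given a nonempty set $K$ and a bifunction $f : K \times K \to \mathbb{R}$, the equilibrium problem is to find

$$x^* \in K \quad \text{such that} \quad f(x^*, y) \ge 0 \ \text{ for all } y \in K,$$

a formulation that contains optimization problems, variational inequalities and Nash equilibria as special cases through suitable choices of $f$.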
This textbook introduces readers to practical statistical issues by presenting them within the context of real-life economics and business situations. It presents the subject in a non-threatening manner, with an emphasis on concise, easily understandable explanations. It has been designed to be accessible and student-friendly and, as an added learning feature, provides all the relevant data required to complete the accompanying exercises and computing problems, which are presented at the end of each chapter. It also discusses index numbers and inequality indices in detail, since these are of particular importance to students and are commonly omitted from textbooks. Throughout the text it is assumed that the student has no prior knowledge of statistics. The book is aimed primarily at business and economics undergraduates, providing them with the basic statistical skills necessary for further study of their subject; however, students of other disciplines will also find it relevant.
What is econophysics? What makes an econophysicist? Why are financial economists reluctant to use results from econophysics? Can we overcome disputes concerning hypotheses that are used in financial economics but make no sense to econophysicists? How can we create a profitable dialogue between financial economists and econophysicists? How do we develop a common theoretical framework allowing the creation of more efficient models for the financial industry? This book moves beyond disciplinary frontiers in order to initiate the development of a common theoretical framework that makes sense to both traditionally trained financial economists and econophysicists. Unlike other publications dedicated to econophysics, this book is written by two financial economists, and it situates econophysics in the evolution of financial economics. The major issues concerning collaboration between the two fields are analyzed in detail. More specifically, the book explains the theoretical and methodological foundations of the two fields in an accessible vocabulary, providing the first extensive analytic comparison between models and results from both fields. It also identifies the major conceptual gatekeepers that complicate dialogue between the two communities and offers ways to overcome them. By combining conceptual, historical, theoretical and formal arguments, the analysis bridges the current dialogue of the deaf between financial economists and econophysicists. The book details recent results in econophysics that bring it closer to financial economics, and it identifies what remains to be done for econophysicists to contribute significantly to financial economics. Beyond clarifying the current situation, the book proposes a generic model compatible with the two fields, defining minimal conditions for common models. Finally, it provides a research agenda for a more fruitful collaboration between econophysicists and financial economists, creating new research opportunities and laying the foundations for a common theoretical framework and models.
In many industries, tariffs are not strictly proportional to the quantity purchased, i.e., they are nonlinear. Examples of nonlinear tariffs include railroad and electricity schedules and rental rates for durable goods and space. The major justification for nonlinear pricing is the existence of private information on the side of consumers. In the early papers on the subject, private information was captured either by assuming a finite number of types (e.g., Adams and Yellen, 1976) or by a unidimensional continuum of types (Mussa and Rosen, 1978). The economics of the unidimensional problems is by now well understood. The unidimensional models, however, do not cover all situations of practical interest. Indeed, nonlinear tariffs often specify the payment as a function of a variety of characteristics. For example, railroad tariffs specify charges based on weight, volume, and distance of each shipment. Different customers may value each of these characteristics differently, hence the customer's type will not in general be captured by a unidimensional characteristic, and a problem of multidimensional screening arises. In such models the consumer's private information (her type) is captured by an m-dimensional vector, while the good produced by the monopolist has n quality dimensions.
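A minimal sketch of this screening setup (the notation is mine, following the standard multidimensional screening literature): a consumer of type $\theta \in \Theta \subset \mathbb{R}^m$ who buys quality profile $q \in \mathbb{R}^n$ at payment $t$ enjoys utility $u(q, \theta) - t$, and the monopolist with cost $c(q)$ chooses a nonlinear tariff $t(q)$ to solve

$$\max_{t(\cdot)} \; E_{\theta}\big[\, t(q(\theta)) - c(q(\theta)) \,\big] \quad \text{s.t.} \quad q(\theta) \in \arg\max_{q} \{ u(q,\theta) - t(q) \} \ \text{ and } \ u(q(\theta),\theta) - t(q(\theta)) \ge 0,$$

the two constraints being incentive compatibility and individual rationality.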
This is the first book that examines the diverse range of experimental methods currently being used in the social sciences, gathering contributions by working economists engaged in experimentation, as well as by a political scientist, psychologists and philosophers of the social sciences. Until the mid-twentieth century, most economists believed that experiments in the economic sciences were impossible. That is hardly the case today, as evinced by the fact that Vernon Smith, an experimental economist, and Daniel Kahneman, a behavioral economist, won the Nobel Prize in Economics in 2002. However, the current use of experimental methods in economics is more diverse than is usually assumed. As the concept of experimentation has undergone considerable abstraction since the twentieth century, the areas of the social sciences in which experiments are applied have expanded, creating renewed interest in, and multifaceted debates on, the way experimental methods are used. This book sheds new light on the diversity of experimental methodologies used in the social sciences. The topics covered include historical insights into the evolution of experimental methods; the necessary "performativity" of experiments, i.e., the dynamic interaction with the social contexts in which they are embedded; the application of causal inferences in the social sciences; a comparison of laboratory, field, and natural experiments; and the recent use of randomized controlled trials (RCTs) in development economics. Several chapters also deal with the latest heated debates, such as those concerning the use of the random lottery method in laboratory experiments.
Partial least squares structural equation modeling (PLS-SEM) has become a standard approach for analyzing complex inter-relationships between observed and latent variables. Researchers appreciate the many advantages of PLS-SEM such as the possibility to estimate very complex models and the method's flexibility in terms of data requirements and measurement specification. This practical open access guide provides a step-by-step treatment of the major choices in analyzing PLS path models using R, a free software environment for statistical computing, which runs on Windows, macOS, and UNIX computer platforms. Adopting the R software's SEMinR package, which brings a friendly syntax to creating and estimating structural equation models, each chapter offers a concise overview of relevant topics and metrics, followed by an in-depth description of a case study. Simple instructions give readers the "how-tos" of using SEMinR to obtain solutions and document their results. Rules of thumb in every chapter provide guidance on best practices in the application and interpretation of PLS-SEM.
This book provides an introduction to the technical background of unit root testing, one of the most heavily researched areas in econometrics over the last twenty years. Starting from an elementary understanding of probability and time series, it develops the key concepts necessary to understand the structure of random walks and Brownian motion, and their role in tests for a unit root. The techniques are illustrated with worked examples, data and programs available on the book's website, which includes more numerical and theoretical examples.
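For orientation (a standard formulation, not quoted from the book): a random walk is $y_t = y_{t-1} + \varepsilon_t$, and the simplest Dickey-Fuller unit root test fits

$$\Delta y_t = \rho \, y_{t-1} + \varepsilon_t$$

and tests $H_0\colon \rho = 0$ (a unit root) against $\rho < 0$ (stationarity); under $H_0$ the t-statistic has a non-standard limiting distribution expressed in terms of functionals of Brownian motion rather than the usual normal limit.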
This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.
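As a taste of the register the book works in (a standard definition, not quoted from it): a process $\{y_t\}$ is weakly stationary if $E[y_t] = \mu$ and $\mathrm{Cov}(y_t, y_{t+h}) = \gamma(h)$ for all $t$ and $h$, i.e., its first and second moments do not depend on $t$; much of classical time series analysis builds on this property.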
In response to the damage caused by a growth-led global economy, researchers across the world have investigated the association between environmental pollution and its possible determinants using different models and techniques. Most famously, the environmental Kuznets curve hypothesizes an inverted U-shaped association between environmental degradation and gross domestic product (GDP). This book explores the latest literature on the environmental Kuznets curve, including developments in the methodology, the impacts of the pandemic, and other recent findings. Researchers have recently broadened the list of drivers of environmental pollution under consideration, which now includes variables such as foreign direct investment, trade expansion, financial development, human activities, population growth, and renewable and nonrenewable energy resources, all of which vary across countries and over time. And in addition to CO2 emissions, other proxies for environmental quality, such as water, land, and ecological footprints, have been used in recent studies. The book also analyses the relationship between economic growth and the environment during the COVID-19 crisis, presenting new empirical work on the impact of the pandemic on energy use, the financial sector, trade, and tourism. Collectively, these developments have extended the scope of the environmental Kuznets curve hypothesis and broadened the basket of dependent and independent variables that may be incorporated. This book will be invaluable reading for researchers in environmental economics and econometrics.
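The canonical EKC specification (shown for orientation; the notation is mine, not the book's) regresses a pollution proxy on income and its square,

$$\ln E_{it} = \beta_0 + \beta_1 \ln Y_{it} + \beta_2 (\ln Y_{it})^2 + \varepsilon_{it},$$

with the inverted-U shape corresponding to $\beta_1 > 0$ and $\beta_2 < 0$, and a turning point in per-capita income at $\ln Y^{*} = -\beta_1 / (2\beta_2)$.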
This book examines the macroeconomic and regulatory impact of domestic and international shocks on the South African economy resulting from the 2009 financial crisis. It also assesses the impact of the US economy's eventual recovery from the crisis and the prospect of higher US interest rates in the future. Told in three parts, the book explores the associations between economic growth, policy uncertainty, and the key domestic and international transmission channels, and it traces the effects of global financial regulatory and domestic macroeconomic uncertainties on the subdued and volatile economic recovery, financial channels, lending rate margins, and credit growth. The book concludes by extending its focus to the role of US monetary policy, capital flows, and rand/US dollar volatility in the South African economy.
The methodological needs of environmental studies are unique in the breadth of research questions that can be posed, calling for a textbook that covers a broad swath of approaches to conducting research with potentially many different kinds of evidence. Fully updated to address new developments such as the effects of the internet, recent trends in the use of computers, remote sensing, and large data sets, this new edition of Research Methods for Environmental Studies is written specifically for social science-based research into the environment. This revised edition contains new chapters on coding, focus groups, and an extended treatment of hypothesis testing. The textbook covers the best-practice research methods most used to study the environment and its connections to societal and economic activities and objectives. Over five key parts, Kanazawa introduces quantitative and qualitative approaches, mixed methods, and the special requirements of interdisciplinary research, emphasizing that methodological practice should be tailored to the specific needs of the project. Within these parts, detailed coverage is provided on key topics including the identification of a research project, hypothesis testing, spatial analysis, the case study method, ethnographic approaches, discourse analysis, mixed methods, survey and interview techniques, focus groups, and ethical issues in environmental research. Drawing on a variety of extended and updated examples to encourage problem-based learning and fully addressing the challenges associated with interdisciplinary investigation, this book will be an essential resource for students embarking on courses exploring research methods in environmental studies.
The book provides a comprehensive overview of the latest econometric methods for studying the dynamics of macroeconomic and financial time series. It examines alternative methodological approaches and concepts, including quantile spectra and co-spectra, and explores topics such as non-linear and non-stationary behavior, stochastic volatility models, and the econometrics of commodity markets and globalization. Furthermore, it demonstrates the application of recent techniques in various fields: in the frequency domain, in the analysis of persistent dynamics, and in the estimation of state space models and new classes of volatility models. The book is divided into two parts: the first part applies econometrics to the field of macroeconomics, discussing trend/cycle decomposition, growth analysis, monetary policy and international trade; the second part applies econometrics to a wide range of topics in financial economics, including price dynamics in equity, commodity and foreign exchange markets and portfolio analysis. The book is essential reading for scholars, students, and practitioners in government and financial institutions interested in applying recent econometric time series methods to financial and economic data.
You may like...
Operations And Supply Chain Management, by David Collier and James Evans (Hardcover)
Agent-Based Modeling and Network…, by Akira Namatame and Shu-Heng Chen (Hardcover, R2,970)
Handbook of Field Experiments, Volume 1, by Esther Duflo and Abhijit Banerjee (Hardcover, R3,497)
Design and Analysis of Time Series…, by Richard McCleary, David McDowall, … (Hardcover, R3,286)
Operations and Supply Chain Management, by James Evans and David Collier (Hardcover)
The Oxford Handbook of the Economics of…, by Yann Bramoulle, Andrea Galeotti, … (Hardcover, R5,455)