Equilibrium Problems and Applications develops a unified variational approach to single-valued, set-valued and quasi-equilibrium problems. The authors present original results alongside classical contributions to the field of equilibrium problems. The material is developed in the general setting of topological vector spaces and lies at the intersection of pure and applied nonlinear analysis, mathematical economics, and mathematical physics. This abstract approach draws on tools from various fields, including set-valued analysis, variational and hemivariational inequalities, fixed point theory, and optimization. Applications include models from mathematical economics, Nash equilibria of non-cooperative games, and Browder variational inclusions. The content is self-contained, and the book is addressed mainly to researchers in mathematics, economics and mathematical physics, as well as to graduate students in applied nonlinear analysis.
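For orientation (this is the standard formulation in the literature, not a quotation from the book), the scalar single-valued equilibrium problem around which this theory is built asks: given a nonempty set K and a bifunction f defined on K x K, find

\[
\bar{x} \in K \quad \text{such that} \quad f(\bar{x}, y) \ge 0 \quad \text{for all } y \in K.
\]

Variational inequalities and Nash equilibria are recovered as special cases for particular choices of the bifunction f.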
This comprehensive book is an introduction to multilevel Bayesian models in R using brms and the Stan programming language. Built around a series of fully worked analyses of repeated-measures data, it emphasizes active learning through the progressively more complicated models presented throughout the book. The authors offer an introduction to statistics focused entirely on repeated-measures data, beginning with very simple two-group comparisons and ending with multinomial regression models with many 'random effects'. Across 13 well-structured chapters, readers are provided with all the code necessary to run the analyses and reproduce the plots in the book, as well as useful examples of how to interpret and write up their own analyses. The book provides an accessible introduction for readers in any field, with any level of statistical background. Senior undergraduate students, graduate students, and experienced researchers looking to 'translate' their skills with more traditional models to a Bayesian framework will benefit greatly from the lessons in this text.
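To give a flavour of the workflow such a book teaches (the data frame and variable names below are illustrative, not taken from the book), a varying-intercept, varying-slope model for repeated-measures data can be fitted in R with brms as follows:

# Illustrative sketch only: a multilevel model with by-subject varying
# intercepts and slopes; 'my_data', 'rt', 'condition' and 'subject' are hypothetical names.
library(brms)
fit <- brm(rt ~ condition + (condition | subject),
           data = my_data, family = gaussian(),
           chains = 4, iter = 2000)
summary(fit)

The same formula syntax scales up to the multinomial models with many 'random effects' treated in the later chapters.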
This book develops the theory of productivity measurement using the empirical index number approach. The theory uses multiplicative indices and additive indicators as measurement tools, instead of relying on the usual neo-classical assumptions, such as the existence of a production function characterized by constant returns to scale, optimizing behavior of economic agents, and perfect foresight. The theory can be applied to all the common levels of aggregation (micro, meso, and macro), and half of the book is devoted to accounting for the links between the various levels, drawing on basic insights from the National Accounts. The final chapter is devoted to the decomposition of productivity change into the contributions of efficiency change, technological change, scale effects, and input or output mix effects. Applications to real-life data demonstrate the empirical feasibility of the theory. The book is directed at a variety of overlapping audiences: statisticians involved in measuring productivity change; economists interested in growth accounting; researchers relating macro-economic productivity change to its industrial sources; enterprise micro-data researchers; and business analysts interested in performance measurement.
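In generic notation (the book's own symbols may differ), the empirical index number approach measures productivity change between periods 0 and 1 as the ratio of an output quantity index to an input quantity index,

\[
\mathrm{TFP}^{1,0} \;=\; \frac{Q_{\mathrm{output}}^{1,0}}{Q_{\mathrm{input}}^{1,0}},
\]

and it is this kind of index that the final chapter decomposes into efficiency-change, technological-change, scale, and input/output-mix components.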
This book sheds new light on a recently introduced monetary tool - negative interest rate policy (NIRP). It provides in-depth insight into this policy, adopted by central banks in several economies, for example the Eurozone, Switzerland and Japan, and into its possible impact on systemic risk. Although NIRP was introduced as a temporary policy instrument, it may remain in use for longer, and by a greater range of central banks, than initially expected; the book therefore explores its effects and implications for the banking sector and financial markets, with a particular focus on potentially adverse consequences. There is a strong accent on the uniqueness of negative policy rates in the context of financial stability concerns. The authors assess whether NIRP has any - or in principle a stronger - impact on systemic risk than conventional monetary policy. The book concentrates on presenting and evaluating the initial experiences of NIRP during normal, i.e. pre-COVID, times, rather than during periods in which pre-established macroeconomic relations are rapidly disrupted or in which the source of the disruption is not purely economic in nature, as in a systemic crisis. The authors adopt both theoretical and practical approaches to explore the key issues and outline the policy implications for both monetary and macroprudential authorities with respect to negative interest rate policy. The book will therefore provide a useful guide for policymakers, academics, advanced students and researchers of financial economics and international finance.
This textbook introduces readers to practical statistical issues by presenting them within the context of real-life economics and business situations. It presents the subject in a non-threatening manner, with an emphasis on concise, easily understandable explanations. It has been designed to be accessible and student-friendly and, as an added learning feature, provides all the relevant data required to complete the accompanying exercises and computing problems, which are presented at the end of each chapter. It also discusses index numbers and inequality indices in detail, since these are of particular importance to students and commonly omitted in textbooks. Throughout the text it is assumed that the student has no prior knowledge of statistics. It is aimed primarily at business and economics undergraduates, providing them with the basic statistical skills necessary for further study of their subject. However, students of other disciplines will also find it relevant.
This volume in Advances in Econometrics showcases fresh methodological and empirical research on the econometrics of networks. Comprising theoretical, empirical and policy papers, it brings together a wide range of perspectives to facilitate a dialogue between academics and practitioners for a better understanding of this groundbreaking field and its role in policy discussions. The edited collection includes thirteen chapters which cover topics such as identification of network models, network formation, networks and spatial econometrics, and applications of financial networks. Readers can also learn about network models with different types of interactions, sample selection in social networks, trade networks, stochastic dynamic programming in space, spatial panels, survival and networks, financial contagion, spillover effects, interconnectedness in consumer credit markets, and a financial risk meter. The topics covered, centered on the econometrics of network data and models, make the book a valuable resource for graduate students and researchers in the field. The collection is also useful for industry professionals and data scientists owing to its focus on theoretical and applied work.
Belief and Rule Compliance: An Experimental Comparison of Muslim and Non-Muslim Economic Behavior uses modern behavioral science and game theory to examine the behavior of Muslim populations and their compliance with Islamic finance laws and norms. The work identifies behaviors characterized by unexpected complexity and profound divergence, including expectations for sharing, cooperation and entrepreneurship gleaned from the studies. Drawing on a unique set of recent empirical observations, the work provides a reliable behavioral foundation for practitioners seeking to evaluate, create and market Islamic financial products.
The book offers self-contained chapters on the most important applications and methodologies in finance, which can easily be used for the reader's research or as a reference for courses on empirical finance. Each chapter is reproducible in the sense that the reader can replicate every single figure, table, or number by simply copy-pasting the code the authors provide. A full-fledged introduction to machine learning with tidymodels, based on tidy principles, shows how factor selection and option pricing can benefit from machine learning methods. Chapter 2, on accessing and managing financial data, shows how to retrieve and prepare the most important datasets in the field of financial economics, CRSP and Compustat, and contains detailed explanations of the most important data characteristics. Each chapter provides exercises that are based on established lectures and exercise classes and that are designed to help students dig deeper; the exercises can be used for self-study or as a source of inspiration for teaching exercises.
This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.
What is econophysics? What makes an econophysicist? Why are financial economists reluctant to use results from econophysics? Can we overcome disputes concerning hypotheses that are used in financial economics but make no sense to econophysicists? How can we create a profitable dialogue between financial economists and econophysicists? How do we develop a common theoretical framework allowing the creation of more efficient models for the financial industry? This book moves beyond disciplinary frontiers in order to initiate the development of a common theoretical framework that makes sense to both traditionally trained financial economists and econophysicists. Unlike other publications dedicated to econophysics, this book is written by two financial economists, and it situates econophysics in the evolution of financial economics. The major issues that concern collaboration between the two fields are analyzed in detail. More specifically, the book explains the theoretical and methodological foundations of the two fields in an accessible vocabulary, providing the first extensive analytic comparison between models and results from both fields. It also identifies the major conceptual gatekeepers that complicate dialogue between the two communities and provides elements for overcoming them. By mixing conceptual, historical, theoretical and formal arguments, the analysis bridges the current dialogue of the deaf between financial economists and econophysicists. The book details recent results in econophysics that bring it closer to financial economics, and in so doing identifies what remains to be done for econophysicists to contribute significantly to financial economics. Beyond clarifying the current situation, the book also proposes a generic model compatible with the two fields, defining minimal conditions for common models. Finally, it provides a research agenda for a more fruitful collaboration between econophysicists and financial economists, creating new research opportunities. In this perspective, it lays the foundations for a common theoretical framework and models.
Metrology is the study of measurement science. Although classical economists have emphasized the importance of measurement per se, the majority of economics-based writings on the topic have taken the form of government reports related to the activities of specific national metrology laboratories. This book is the first systematic study of measurement activity at a national metrology laboratory, and the laboratory studied is the U.S. National Institute of Standards and Technology (NIST) within the U.S. Department of Commerce. The primary objective of the book is to emphasize for academic and policy audiences the economic importance of measurement not only as an area of study but also as a tool for sustaining technological advancement as an element of economic growth. Toward this goal, the book offers an overview of the economic benefits and consequences of measurement standards; an argument for public sector support of measurement standards; a historical perspective of the measurement activities at NIST; an empirical analysis of one particular measurement activity at NIST, namely calibration testing; and a roadmap for future research on the economics of metrology.
Essentials of Time Series for Financial Applications serves as an agile reference for upper-level students and practitioners who want a formal, easy-to-follow introduction to the most important time series methods used in financial applications (pricing, asset management, quant strategies, and risk management). Real-life data and examples developed with EViews illustrate the links between the formal apparatus and the applications. The examples either directly exploit the tools that EViews makes available or use programs that implement specific topics or techniques by employing EViews. The book balances a formal framework, with as few proofs as possible, against many examples that support its central ideas. Boxes are used throughout to remind readers of technical aspects and definitions and to present examples in a compact fashion, with full details (workout files) available in an online appendix. The more advanced chapters provide discussion sections that refer to more advanced textbooks or detailed proofs.
Medicine Price Surveys, Analyses and Comparisons establishes guidelines for the study and implementation of pharmaceutical price surveys, analyses, and comparisons. Its contributors evaluate price survey literature, discuss the accessibility and reliability of data sources, and provide a checklist and training kit on conducting price surveys, analyses, and comparisons. Their investigations survey price studies while accounting for the effects of methodologies and explaining regional differences in medicine prices. They also consider policy objectives such as affordable access to medicines and cost-containment as well as options for improving the effectiveness of policies.
This is the first book to examine the diverse range of experimental methods currently being used in the social sciences, gathering contributions by working economists engaged in experimentation, as well as by a political scientist, psychologists and philosophers of the social sciences. Until the mid-twentieth century, most economists believed that experiments in the economic sciences were impossible. That is hardly the case today, as evinced by the fact that Vernon Smith, an experimental economist, and Daniel Kahneman, a behavioral economist, won the Nobel Prize in Economics in 2002. However, the current use of experimental methods in economics is more diverse than is usually assumed. As the concept of experimentation underwent considerable abstraction throughout the twentieth century, the areas of the social sciences in which experiments are applied have expanded, creating renewed interest in, and multifaceted debates on, the way experimental methods are used. This book sheds new light on the diversity of experimental methodologies used in the social sciences. The topics covered include historical insights into the evolution of experimental methods; the necessary "performativity" of experiments, i.e., their dynamic interaction with the social contexts in which they are embedded; the application of causal inference in the social sciences; a comparison of laboratory, field, and natural experiments; and the recent use of randomized controlled trials (RCTs) in development economics. Several chapters also deal with the latest heated debates, such as those concerning the use of the random lottery method in laboratory experiments.
This book examines the macroeconomic and regulatory impact of the domestic and international shocks to the South African economy resulting from the 2009 financial crisis. It also assesses the impact of the US economy's eventual recovery from the crisis and the prospect of higher US interest rates in future. Organized in three parts, the book explores the associations between economic growth, policy uncertainty and the key domestic and international transmission channels, as well as the transmission effects of global financial regulatory and domestic macroeconomic uncertainties on subdued and volatile economic recovery, financial channels, lending rate margins, and credit growth. The book concludes by extending its focus to the role of US monetary policy, capital flows and rand/US dollar volatility in the South African economy.
Partial least squares structural equation modeling (PLS-SEM) has become a standard approach for analyzing complex inter-relationships between observed and latent variables. Researchers appreciate the many advantages of PLS-SEM such as the possibility to estimate very complex models and the method's flexibility in terms of data requirements and measurement specification. This practical open access guide provides a step-by-step treatment of the major choices in analyzing PLS path models using R, a free software environment for statistical computing, which runs on Windows, macOS, and UNIX computer platforms. Adopting the R software's SEMinR package, which brings a friendly syntax to creating and estimating structural equation models, each chapter offers a concise overview of relevant topics and metrics, followed by an in-depth description of a case study. Simple instructions give readers the "how-tos" of using SEMinR to obtain solutions and document their results. Rules of thumb in every chapter provide guidance on best practices in the application and interpretation of PLS-SEM.
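As a taste of the syntax (a minimal sketch using the mobi dataset bundled with the seminr package; the constructs and items shown are illustrative and not necessarily those of the book's case study):

# Minimal PLS-SEM sketch in SEMinR: measurement model, structural model, estimation.
library(seminr)
measurement_model <- constructs(
  composite("Image", multi_items("IMAG", 1:5)),
  composite("Satisfaction", multi_items("CUSA", 1:3))
)
structural_model <- relationships(
  paths(from = "Image", to = "Satisfaction")
)
pls_model <- estimate_pls(data = mobi,
                          measurement_model = measurement_model,
                          structural_model = structural_model)
summary(pls_model)

The summary reports the path coefficients along with reliability and validity metrics of the kind each chapter walks through.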
The second book in a set of ten on quantitative finance for practitioners. It presents the theory needed to better understand applications, supplements previous training in mathematics, and is built from the author's four decades of experience in industry, research, and teaching.
In many industries tariffs are not strictly proportional to the quantity purchased, i.e., they are nonlinear. Examples of nonlinear tariffs include railroad and electricity schedules and rental rates for durable goods and space. The major justification for nonlinear pricing is the existence of private information on the side of consumers. In the early papers on the subject, private information was captured either by assuming a finite number of types (e.g. Adams and Yellen, 1976) or by a unidimensional continuum of types (Mussa and Rosen, 1978). The economics of the unidimensional problems is by now well understood. The unidimensional models, however, do not cover all the situations of practical interest. Indeed, nonlinear tariffs often specify the payment as a function of a variety of characteristics. For example, railroad tariffs specify charges based on weight, volume, and distance of each shipment. Different customers may value each of these characteristics differently, hence the customer's type will not in general be captured by a unidimensional characteristic, and a problem of multidimensional screening arises. In such models the consumer's private information (her type) is captured by an m-dimensional vector, while the good produced by the monopolist has n quality dimensions.
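Schematically, and in generic notation rather than the author's, the monopolist posts a tariff t(q) over bundles of the n quality dimensions; a consumer whose privately known type is the m-dimensional vector \(\theta\) then chooses

\[
q(\theta) \in \arg\max_{q} \; \bigl\{ u(\theta, q) - t(q) \bigr\},
\]

while the monopolist chooses the tariff to maximize expected profit \(\mathbb{E}_{\theta}\bigl[t(q(\theta)) - c(q(\theta))\bigr]\) over the distribution of types.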
This book provides an introduction to the technical background of unit root testing, one of the most heavily researched areas in econometrics over the last twenty years. Starting from an elementary understanding of probability and time series, it develops the key concepts necessary to understand the structure of random walks and Brownian motion, and their role in tests for a unit root. The techniques are illustrated with worked examples, data and programs available on the book's website, which includes more numerical and theoretical examples.
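As a separate, purely illustrative sketch of the basic idea (using the tseries package for R, which is not necessarily what the book's own programs use), a random walk can be simulated and subjected to an augmented Dickey-Fuller test:

# Illustrative only: simulate a pure random walk y_t = y_{t-1} + e_t and test
# for a unit root; the null hypothesis of a unit root should not be rejected here.
library(tseries)
set.seed(1)
y <- cumsum(rnorm(500))
adf.test(y, alternative = "stationary")

Differencing the series once (diff(y)) and re-running the test should, by contrast, reject the unit root null.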
This is the second volume in a ten-volume set designed for publication in 1997. It reprints in book form a selection of the most important and influential articles on probability, econometrics and economic games which cumulatively have had a major impact on the development of modern economics. There are 242 articles, dating from 1936 to 1996. Many of them were originally published in relatively inaccessible journals and may not, therefore, be available in the archives of many university libraries. The volumes are available separately and also as a complete ten-volume set. The contributors include D. Ellsberg, R.M. Hogart, J.B. Kadane, B.O. Koopmans, E.L. Lehman, D.F. Nicholls, H. Rubin, T.J. Sarjent, L.H. Summers and C.R. Wymer. This particular volume deals with paradox and ambiguity.
This is the third volume in a ten-volume set designed for publication in 1997. It reprints in book form a selection of the most important and influential articles on probability, econometrics and economic games which cumulatively have had a major impact on the development of modern economics. There are 242 articles, dating from 1936 to 1996. Many of them were originally published in relatively inaccessible journals and may not, therefore, be available in the archives of many university libraries. The volumes are available separately and also as a complete ten-volume set. The contributors include D. Ellsberg, R.M. Hogart, J.B. Kadane, B.O. Koopmans, E.L. Lehman, D.F. Nicholls, H. Rubin, T.J. Sarjent, L.H. Summers and C.R. Wymer. This particular volume deals with economic games and the functions of bargaining and solutions.
This is the fourth volume in a ten-volume set designed for publication in 1997. It reprints in book form a selection of the most important and influential articles on probability, econometrics and economic games which cumulatively have had a major impact on the development of modern economics. There are 242 articles, dating from 1936 to 1996. Many of them were originally published in relatively inaccessible journals and may not, therefore, be available in the archives of many university libraries. The volumes are available separately and also as a complete ten-volume set. The contributors include D. Ellsberg, R.M. Hogart, J.B. Kadane, B.O. Koopmans, E.L. Lehman, D.F. Nicholls, H. Rubin, T.J. Sarjent, L.H. Summers and C.R. Wymer. This particular volume deals with the dialogues and beliefs that underpin probability concepts.
This is the fifth volume in a ten-volume set designed for publication in 1997. It reprints in book form a selection of the most important and influential articles on probability, econometrics and economic games which cumulatively have had a major impact on the development of modern economics. There are 242 articles, dating from 1936 to 1996. Many of them were originally published in relatively inaccessible journals and may not, therefore, be available in the archives of many university libraries. The volumes are available separately and also as a complete ten-volume set. The contributors include D. Ellsberg, R.M. Hogart, J.B. Kadane, B.O. Koopmans, E.L. Lehman, D.F. Nicholls, H. Rubin, T.J. Sarjent, L.H. Summers and C.R. Wymer. This particular volume deals with the statistical theory that underlies the science of econometrics.
This is the sixth volume in a ten-volume set designed for publication in 1997. It reprints in book form a selection of the most important and influential articles on probability, econometrics and economic games which cumulatively have had a major impact on the development of modern economics. There are 242 articles, dating from 1936 to 1996. Many of them were originally published in relatively inaccessible journals and may not, therefore, be available in the archives of many university libraries. The volumes are available separately and also as a complete ten-volume set. The contributors include D. Ellsberg, R.M. Hogart, J.B. Kadane, B.O. Koopmans, E.L. Lehman, D.F. Nicholls, H. Rubin, T.J. Sarjent, L.H. Summers and C.R. Wymer. This particular volume deals with econometric exploration and diagnosis.
This is the seventh volume in a ten-volume set designed for publication in 1997. It reprints in book form a selection of the most important and influential articles on probability, econometrics and economic games which cumulatively have had a major impact on the development of modern economics. There are 242 articles, dating from 1936 to 1996. Many of them were originally published in relatively inaccessible journals and may not, therefore, be available in the archives of many university libraries. The volumes are available separately and also as a complete ten-volume set. The contributors include D. Ellsberg, R.M. Hogart, J.B. Kadane, B.O. Koopmans, E.L. Lehman, D.F. Nicholls, H. Rubin, T.J. Sarjent, L.H. Summers and C.R. Wymer. This particular volume deals with the probability approach to simultaneous equations.