This title investigates contemporary financial issues in the e-commerce market.
It is well-known that modern stochastic calculus has been exhaustively developed under the usual conditions. Despite such a well-developed theory, there is evidence to suggest that these very convenient technical conditions cannot necessarily be fulfilled in real-world applications. Optional Processes: Theory and Applications seeks to delve into the existing theory, new developments and applications of optional processes on "unusual" probability spaces. The development of stochastic calculus of optional processes marks the beginning of a new and more general form of stochastic analysis. This book aims to provide an accessible, comprehensive and up-to-date exposition of optional processes and their numerous properties. Furthermore, the book presents not only the current theory of optional processes, but also a spectrum of applications to stochastic differential equations, filtering theory and mathematical finance. Features:
- Suitable for graduate students and researchers in mathematical finance, actuarial science, applied mathematics and related areas
- Compiles almost all essential results on the calculus of optional processes in unusual probability spaces
- Contains many advanced analytical results for stochastic differential equations and statistics pertaining to the calculus of optional processes
- Develops new methods in finance based on optional processes, such as a new portfolio theory and a defaultable claim pricing mechanism
This book addresses one of the most important research activities in empirical macroeconomics. It provides a course of advanced but intuitive methods and tools enabling the spatial and temporal disaggregation of basic macroeconomic variables and the assessment of the statistical uncertainty of the outcomes of disaggregation. The empirical analysis focuses mainly on GDP and its growth in the context of Poland. However, all of the methods discussed can be easily applied to other countries. The approach used in the book views spatial and temporal disaggregation as a special case of the estimation of missing observations (a topic in missing-data analysis). The book presents an econometric treatment of models of Seemingly Unrelated Regression Equations (SURE). The main advantage of the SURE specification in tackling this research problem is that it allows for heterogeneity of the parameters describing relations between macroeconomic indicators. The book contains model specifications, descriptions of stochastic assumptions, and the resulting procedures of estimation and testing. The method also addresses uncertainty in the estimates produced. All of the necessary tests and assumptions are presented in detail. The results are designed to serve as a source of invaluable information making regional analyses more convenient and, more importantly, comparable. They will create a solid basis for drawing conclusions and recommendations concerning regional economic policy in Poland, particularly regarding the assessment of the economic situation. This is essential reading for academics, researchers, and economists with regional analysis as their field of expertise, as well as central bankers and policymakers.
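The two-step feasible-GLS logic underlying SURE estimation can be illustrated in a few lines. This is a generic sketch of the standard technique on simulated data, not code from the book; the coefficients and error correlation are invented for the example.

```python
# Feasible GLS for a two-equation SUR system (illustrative sketch):
# estimate each equation by OLS, use the residuals to estimate the
# cross-equation error covariance, then re-estimate the stacked system.
import numpy as np

rng = np.random.default_rng(1)
n = 400
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
# errors correlated across the two equations (the point of SUR)
errs = rng.multivariate_normal([0, 0], [[1.0, 0.7], [0.7, 1.0]], size=n)
y1 = 1.0 + 2.0 * x1 + errs[:, 0]
y2 = -0.5 + 1.5 * x2 + errs[:, 1]

X1 = np.column_stack([np.ones(n), x1])
X2 = np.column_stack([np.ones(n), x2])

# step 1: equation-by-equation OLS, then estimate Sigma from residuals
b1 = np.linalg.lstsq(X1, y1, rcond=None)[0]
b2 = np.linalg.lstsq(X2, y2, rcond=None)[0]
U = np.column_stack([y1 - X1 @ b1, y2 - X2 @ b2])
Sigma = (U.T @ U) / n

# step 2: feasible GLS on the stacked system, Cov(u) = Sigma (x) I_n
X = np.block([[X1, np.zeros_like(X2)], [np.zeros_like(X1), X2]])
y = np.concatenate([y1, y2])
Omega_inv = np.kron(np.linalg.inv(Sigma), np.eye(n))
beta = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)
print(beta)  # close to the true values [1.0, 2.0, -0.5, 1.5]
```

The efficiency gain over equation-by-equation OLS comes precisely from exploiting the off-diagonal elements of Sigma.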
Using data from the World Values Survey, this book sheds light on the link between happiness and the social group to which one belongs. The work is based on a rigorous statistical analysis of differences in the probability of happiness and life satisfaction between the predominant social group and subordinate groups. The cases of India and South Africa receive deep attention in dedicated chapters on caste and race, with other chapters considering issues such as cultural bias, religion, patriarchy, and gender. An additional chapter offers a global perspective. On top of this, the longitudinal nature of the data facilitates an examination of how world happiness evolved between 1994 and 2014. This book will be a valuable reference for advanced students, scholars and policymakers involved in development economics, well-being, development geography, and sociology.
These proceedings highlight research on the latest trends and methods in experimental and behavioral economics. Featuring contributions presented at the 2017 Computational Methods in Experimental Economics (CMEE) conference, held in Lublin, Poland, they merge findings from various domains to present deep insights into topics such as game theory, decision theory, cognitive neuroscience and artificial intelligence. The fields of experimental economics and behavioral economics are rapidly evolving. Modern applications of experimental economics require the integration of know-how from disciplines including economics, computer science, psychology and neuroscience. The use of computer technology enhances researchers' ability to generate and analyze large amounts of data, allowing them to use non-standard methods of data logging for experiments, such as cognitive neuronal methods. Experiments are currently being conducted with software that, on the one hand, provides interaction with the people involved in experiments, and on the other helps to accurately record their responses. The goal of the CMEE conference and the papers presented here is to provide the scientific community with essential research on and applications of computer methods in experimental economics. Combining theories, methods and regional case studies, the book offers a valuable resource for all researchers, scholars and policymakers in the areas of experimental and behavioral economics.
DEA is computational at its core, and this book is one of several we plan to publish on the computational aspects of DEA. This book by Zhu and Cook deals with the micro aspects of handling and modeling data issues in DEA problems. DEA's use has grown with its capability to deal with the complex problems of the service industry and the public service domain that require modeling both qualitative and quantitative data. This handbook treats specific data problems, including: (1) imprecise data, (2) inaccurate data, (3) missing data, (4) qualitative data, (5) outliers, (6) undesirable outputs, (7) quality data, (8) statistical analysis, and (9) software and other data aspects of modeling complex DEA problems. In addition, the book demonstrates how to visualize DEA results when the data are more than 3-dimensional, and how to identify efficient units quickly and accurately.
JEAN-FRANÇOIS MERTENS This book presents a systematic exposition of the use of game theoretic methods in general equilibrium analysis. Clearly the first such use was by Arrow and Debreu, with the "birth" of general equilibrium theory itself, in using Nash's existence theorem (or a generalization) to prove the existence of a competitive equilibrium. But this use appeared possibly to be merely technical, borrowing some tools for proving a theorem. This book stresses the later contributions, where game theoretic concepts were used as such, to explain various aspects of the general equilibrium model. But clearly, each of those later approaches also provides per se a game theoretic proof of the existence of competitive equilibrium. Part A deals with the first such approach: the equality between the set of competitive equilibria of a perfectly competitive (i.e., every trader has negligible market power) economy and the core of the corresponding cooperative game.
Presents recent developments in the probabilistic assessment of system dependability based on stochastic models, including graph theory, finite state automata and language theory, for both dynamic and hybrid contexts.
Does game theory, the mathematical theory of strategic interaction, provide genuine explanations of human behaviour? Can game theory be used in economic consultancy or other normative contexts? Explaining Games: The Epistemic Programme in Game Theory, the first monograph on the philosophy of game theory, is a bold attempt to combine insights from epistemic logic and the philosophy of science to investigate the applicability of game theory in such fields as economics, philosophy and strategic consultancy. De Bruin proves new mathematical theorems about the beliefs, desires and rationality principles of individual human beings, and he explores in detail the logical form of game theory as it is used in explanatory and normative contexts. He argues that game theory reduces to rational choice theory if used as an explanatory device, and that game theory is nonsensical if used as a normative device. A provocative account of the history of game theory reveals that this is not bad news for all of game theory, though. Two central research programmes in game theory tried to find the ultimate characterisation of strategic interaction between rational agents. Yet, while the Nash Equilibrium Refinement Programme has done badly thanks to such research habits as overmathematisation, model-tinkering and introversion, the Epistemic Programme, De Bruin argues, has been rather successful in achieving this aim.
Digital Asset Valuation and Cyber Risk Measurement: Principles of Cybernomics is a book about the future of risk and the future of value. It examines the indispensable role of economic modeling in the future of digitization, thus providing industry professionals with the tools they need to optimize the management of financial risks associated with this megatrend. The book addresses three problem areas: the valuation of digital assets, measurement of risk exposures of digital valuables, and economic modeling for the management of such risks. Employing a pair of novel cyber risk measurement units, bitmort and hekla, the book covers areas of value, risk, control, and return, each of which is viewed from the perspective of entity (e.g., individual, organization, business), portfolio (e.g., industry sector, nation-state), and global ramifications. Establishing adequate, holistic, and statistically robust data points on the entity, portfolio, and global levels for the development of a cybernomics databank is essential for the resilience of our shared digital future. This book also argues that existing economic value theories no longer apply to the digital era due to the unique characteristics of digital assets. It introduces six laws of digital theory of value, with the aim of adapting economic value theories to the digital and machine era.
This book:
- Provides sound knowledge of optimal decision making in statistics and operations research problems.
- Serves as a quick reference by exploring the research literature on the subject, with commercial value-added research applications in statistics and operations research.
- Provides sound knowledge of optimisation and statistical techniques in the modelling of real-world problems.
- Reviews recent developments and contributions in optimal decision-making problems using optimisation and statistical techniques.
- Provides an understanding of formulations of decision-making problems and their solution procedures.
- Describes the latest developments in the modelling of real-world problems and their solution approaches.
Generalized method of moments (GMM) estimation has become one of the main statistical tools for the analysis of economic and financial data. Designed for both theoreticians and practitioners, this book provides a comprehensive treatment of GMM estimation and inference. All the main statistical results are discussed intuitively and proved formally, and all the inference techniques are illustrated using empirical examples in macroeconomics and finance. This book is the first to provide an intuitive introduction to the method combined with a unified treatment of GMM statistical theory and a survey of recent important developments in the field.
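The core GMM idea can be sketched in a few lines: stack sample moment conditions into a vector and choose the parameters that minimize a quadratic form in those conditions. This is a generic illustration on simulated data with an identity weight matrix, not an example from the book.

```python
# GMM sketch: estimate (mu, sigma2) of a sample by minimizing
# g(theta)' W g(theta) with W = I, where the moment conditions are
# g1 = mean(x) - mu  and  g2 = mean((x - mu)^2) - sigma2.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=5000)  # true mu=2.0, sigma2=2.25

def gbar(theta):
    """Vector of sample moment conditions evaluated at theta."""
    mu, sigma2 = theta
    return np.array([x.mean() - mu, ((x - mu) ** 2).mean() - sigma2])

def objective(theta):
    g = gbar(theta)
    return g @ g  # quadratic form with identity weighting

res = minimize(objective, x0=[0.0, 1.0], method="Nelder-Mead")
mu_hat, sigma2_hat = res.x
print(mu_hat, sigma2_hat)  # close to 2.0 and 2.25
```

With more moment conditions than parameters, efficient two-step GMM would replace the identity matrix with the inverse of the estimated covariance of the moment conditions; the just-identified case above keeps the sketch minimal.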
Medicine Price Surveys, Analyses and Comparisons establishes guidelines for the study and implementation of pharmaceutical price surveys, analyses, and comparisons. Its contributors evaluate price survey literature, discuss the accessibility and reliability of data sources, and provide a checklist and training kit on conducting price surveys, analyses, and comparisons. Their investigations survey price studies while accounting for the effects of methodologies and explaining regional differences in medicine prices. They also consider policy objectives such as affordable access to medicines and cost-containment as well as options for improving the effectiveness of policies.
The aim of this publication is to identify and apply suitable methods for analysing and predicting the time series of gold prices, while acquainting the reader with the history and characteristics of the methods and with time series issues in general. Statistical and econometric methods, and especially artificial intelligence methods, are used in the case studies. The publication presents both traditional and innovative methods on the theoretical level, always accompanied by a case study, i.e. their specific use in practice. Furthermore, a comprehensive comparative analysis of the individual methods is provided. The book is intended for academic staff and university students of economics, as well as scientists and practitioners dealing with time series prediction. From the point of view of practical application, it could provide useful information for speculators and traders on financial markets, especially the commodity markets.
This book develops the theory of productivity measurement using the empirical index number approach. The theory uses multiplicative indices and additive indicators as measurement tools, instead of relying on the usual neo-classical assumptions, such as the existence of a production function characterized by constant returns to scale, optimizing behavior of the economic agents, and perfect foresight. The theory can be applied to all the common levels of aggregation (micro, meso, and macro), and half of the book is devoted to accounting for the links existing between the various levels. Basic insights from National Accounts are thereby used. The final chapter is devoted to the decomposition of productivity change into the contributions of efficiency change, technological change, scale effects, and input or output mix effects. Applications on real-life data demonstrate the empirical feasibility of the theory. The book is directed to a variety of overlapping audiences: statisticians involved in measuring productivity change; economists interested in growth accounting; researchers relating macro-economic productivity change to its industrial sources; enterprise micro-data researchers; and business analysts interested in performance measurement.
Belief and Rule Compliance: An Experimental Comparison of Muslim and Non-Muslim Economic Behavior uses modern behavioral science and game theory to examine the behavior of Muslim populations and their compliance with Islamic finance laws and norms. The work identifies behaviors characterized by unexpected complexity and profound divergence, including expectations for sharing, cooperation and entrepreneurship gleaned from studies. Adopting a unique set of recent empirical observations, the work provides a reliable behavioral foundation for practitioners seeking to evaluate, create and market Islamic financial products.
This book presents a unique collection of contributions on modern topics in statistics and econometrics, written by leading experts in the respective disciplines and their intersections. It addresses nonparametric statistics and econometrics, quantiles and expectiles, and advanced methods for complex data, including spatial and compositional data, as well as tools for empirical studies in economics and the social sciences. The book was written in honor of Christine Thomas-Agnan on the occasion of her 65th birthday. Given its scope, it will appeal to researchers and PhD students in statistics and econometrics alike who are interested in the latest developments in their field.
Both parts of Volume 44 of Advances in Econometrics pay tribute to Fabio Canova for his major contributions to economics over the last four decades. Throughout his long and distinguished career, Canova's research has produced both a prolific publication record and stellar contributions to the profession. His colleagues, co-authors and PhD students wish to express their deep gratitude to Fabio for his intellectual leadership and guidance, whilst showcasing the extensive advances in knowledge and theory made available by Canova for professionals in the field. Advances in Econometrics publishes original scholarly econometrics papers with the intention of expanding the use of developed and emerging econometric techniques by disseminating ideas on the theory and practice of econometrics throughout the empirical economic, business and social science literature. Annual volume themes, selected by the Series Editors, reflect their interpretation of important new methods and techniques emerging in economics, statistics and the social sciences.
The Regression Discontinuity (RD) design is one of the most popular and credible research designs for program evaluation and causal inference. Volume 38 of Advances in Econometrics collects twelve innovative and thought-provoking contributions to the RD literature, covering a wide range of methodological and practical topics. Some chapters touch on foundational methodological issues such as identification, interpretation, implementation, falsification testing, estimation and inference, while others focus on more recent and related topics such as identification and interpretation in a discontinuity-in-density framework, empirical structural estimation, comparative RD methods, and extrapolation. These chapters not only give new insights for current methodological and empirical research, but also provide new bases and frameworks for future work in this area. This volume contributes to the rapidly expanding RD literature by bringing together theoretical and applied econometricians, statisticians, and social, behavioural and biomedical scientists, in the hope that these interactions will further spark innovative practical developments in this important and active research area.
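The basic sharp-RD estimator behind this literature is easy to sketch: fit a regression on each side of the cutoff within a bandwidth and take the difference of the two fitted values at the cutoff. This is a generic simulated illustration of the technique, not an example from the volume; the jump size, bandwidth, and noise level are invented.

```python
# Sharp RD sketch: the treatment effect is the jump in E[y | run]
# at the cutoff, estimated from separate local linear fits.
import numpy as np

rng = np.random.default_rng(2)
n = 4000
run = rng.uniform(-1, 1, size=n)           # running variable
treat = (run >= 0).astype(float)           # sharp assignment at cutoff 0
y = 0.5 * run + 2.0 * treat + rng.normal(scale=0.3, size=n)  # true jump 2.0

h = 0.5                                     # bandwidth around the cutoff
left = (run < 0) & (run > -h)
right = (run >= 0) & (run < h)

# linear fit on each side; the fitted value at 0 is the one-sided limit
bl = np.polyfit(run[left], y[left], 1)
br = np.polyfit(run[right], y[right], 1)
tau = np.polyval(br, 0.0) - np.polyval(bl, 0.0)
print(tau)  # close to the true jump of 2.0
```

In applied work this would be refined with kernel weights, data-driven bandwidth selection, and robust bias-corrected inference, which are among the topics the chapters address.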
Equilibrium Problems and Applications develops a unified variational approach to deal with single-valued, set-valued and quasi-equilibrium problems. The authors promote original results in relationship with classical contributions to the field of equilibrium problems. The content evolved in the general setting of topological vector spaces and it lies at the interplay between pure and applied nonlinear analysis, mathematical economics, and mathematical physics. This abstract approach is based on tools from various fields, including set-valued analysis, variational and hemivariational inequalities, fixed point theory, and optimization. Applications include models from mathematical economics, Nash equilibrium of non-cooperative games, and Browder variational inclusions. The content is self-contained and the book is mainly addressed to researchers in mathematics, economics and mathematical physics as well as to graduate students in applied nonlinear analysis.
There isn't currently a book on the market that focuses on multiple hypothesis testing. The book can be used in a range of courses, including in the social and behavioral sciences and the biological sciences, as well as by professional researchers. It includes various examples of the multiple hypothesis testing method in practice in a variety of fields, including sport and crime.
This textbook introduces readers to practical statistical issues by presenting them within the context of real-life economics and business situations. It presents the subject in a non-threatening manner, with an emphasis on concise, easily understandable explanations. It has been designed to be accessible and student-friendly and, as an added learning feature, provides all the relevant data required to complete the accompanying exercises and computing problems, which are presented at the end of each chapter. It also discusses index numbers and inequality indices in detail, since these are of particular importance to students and commonly omitted in textbooks. Throughout the text it is assumed that the student has no prior knowledge of statistics. It is aimed primarily at business and economics undergraduates, providing them with the basic statistical skills necessary for further study of their subject. However, students of other disciplines will also find it relevant.
What is econophysics? What makes an econophysicist? Why are financial economists reluctant to use results from econophysics? Can we overcome disputes concerning hypotheses that are used in financial economics but make no sense to econophysicists? How can we create a profitable dialogue between financial economists and econophysicists? How do we develop a common theoretical framework allowing the creation of more efficient models for the financial industry? This book moves beyond the disciplinary frontiers in order to initiate the development of a common theoretical framework that makes sense for both traditionally trained financial economists and econophysicists. Unlike other publications dedicated to econophysics, this book is written by two financial economists, and it situates econophysics in the evolution of financial economics. The major issues that concern the collaboration between the two fields are analyzed in detail. More specifically, this book explains the theoretical and methodological foundations of these two fields in an accessible vocabulary, providing the first extensive analytic comparison between models and results from both fields. The book also identifies the major conceptual gate-keepers that complicate dialogue between the two communities, while providing elements to overcome them. By mixing conceptual, historical, theoretical and formal arguments, our analysis bridges the current dialogue of the deaf between financial economists and econophysicists. This book details the recent results in econophysics that bring it closer to financial economics. In so doing, it identifies what remains to be done for econophysicists to contribute significantly to financial economics. Beyond the clarification of the current situation, this book also proposes a generic model compatible with the two fields, defining minimal conditions for common models.
Finally, this book provides a research agenda for a more fruitful collaboration between econophysicists and financial economists, creating new research opportunities. In this perspective, it lays the foundations for a common theoretical framework and models.
Biophysical Measurement in Experimental Social Science Research is an ideal primer for the experimental social scientist wishing to update their knowledge and skillset in the area of laboratory-based biophysical measurement. Many behavioral laboratories across the globe have acquired increasingly sophisticated biophysical measurement equipment, sometimes for particular research projects or for financial or institutional reasons. Yet the expertise required to use this technology and integrate the measures it can generate on human subjects into successful social science research endeavors is often scarce and concentrated amongst a small minority of researchers. This book aims to open the door to wider and more productive use of biophysical measurement in laboratory-based experimental social science research. Suitable for doctoral students through to established researchers, the volume presents examples of the successful integration of biophysical measures into analyses of human behavior, discussions of the academic and practical limitations of laboratory-based biophysical measurement, and hands-on guidance about how different biophysical measurement devices are used. A foreword and concluding chapters comprehensively synthesize and compare biophysical measurement options, address academic, ethical and practical matters, and situate the work in its broader historical and scientific context. Research chapters demonstrate the academic potential of biophysical measurement, ranging fully across galvanic skin response, heart rate monitoring, eye tracking and direct neurological measurements. An extended Appendix showcases specific examples of device adoption in experimental social science lab settings.
Modelling Spatial and Spatial-Temporal Data: A Bayesian Approach is aimed at statisticians and quantitative social, economic and public health students and researchers who work with small-area spatial and spatial-temporal data. It assumes a grounding in statistical theory up to the standard linear regression model. The book compares both hierarchical and spatial econometric modelling, providing both a reference and a teaching text with exercises in each chapter. The book provides a fully Bayesian, self-contained, treatment of the underlying statistical theory, with chapters dedicated to substantive applications. The book includes WinBUGS code and R code and all datasets are available online. Part I covers fundamental issues arising when modelling spatial and spatial-temporal data. Part II focuses on modelling cross-sectional spatial data and begins by describing exploratory methods that help guide the modelling process. There are then two theoretical chapters on Bayesian models and a chapter of applications. Two chapters follow on spatial econometric modelling, one describing different models, the other substantive applications. Part III discusses modelling spatial-temporal data, first introducing models for time series data. Exploratory methods for detecting different types of space-time interaction are presented, followed by two chapters on the theory of space-time separable (without space-time interaction) and inseparable (with space-time interaction) models. An applications chapter includes: the evaluation of a policy intervention; analysing the temporal dynamics of crime hotspots; chronic disease surveillance; and testing for evidence of spatial spillovers in the spread of an infectious disease. A final chapter suggests some future directions and challenges. Robert Haining is Emeritus Professor in Human Geography, University of Cambridge, England. 
He is the author of Spatial Data Analysis in the Social and Environmental Sciences (1990) and Spatial Data Analysis: Theory and Practice (2003). He is a Fellow of the RGS-IBG and of the Academy of Social Sciences. Guangquan Li is Senior Lecturer in Statistics in the Department of Mathematics, Physics and Electrical Engineering, Northumbria University, Newcastle, England. His research includes the development and application of Bayesian methods in the social and health sciences. He is a Fellow of the Royal Statistical Society.