This book presents a crisis scenario generator incorporating black swans, black butterflies and worst-case scenarios. The generator is designed for managing assets in a crisis-prone period, offering more reliable values for Value at Risk (VaR), Conditional Value at Risk (CVaR) and Tail Value at Risk (TVaR). Hazardous Forecasts and Crisis Scenario Generator asks how to manage assets when the probability of a crisis increases, enabling readers to adopt a process for using such generators in order to be well prepared for handling crises.
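As a quick illustration of the risk measures named in this blurb, here is a minimal Python sketch on simulated data, not the book's generator; the fat-tailed sample and the 99% level are assumptions chosen for the example.

```python
import numpy as np

def var_cvar(returns, alpha=0.99):
    """Historical VaR and CVaR (expected shortfall) at level alpha.

    Losses are negated returns; VaR is the alpha-quantile of the losses,
    CVaR the average loss in the tail beyond that quantile.
    """
    losses = -np.asarray(returns, dtype=float)
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()
    return var, cvar

# Hypothetical fat-tailed daily returns standing in for generator output.
rng = np.random.default_rng(0)
returns = rng.standard_t(df=3, size=10_000) * 0.01

var99, cvar99 = var_cvar(returns)
print(f"99% VaR:  {var99:.4f}")
print(f"99% CVaR: {cvar99:.4f}")
```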
This original and timely monograph describes a unique self-contained excursion that reveals to the readers the roles of two basic cognitive abilities, i.e. intention recognition and arranging commitments, in the evolution of cooperative behavior. This book analyses intention recognition, an important ability that helps agents predict others' behavior, in its artificial intelligence and evolutionary computational modeling aspects, and proposes a novel intention recognition method. Furthermore, the book presents a new framework for intention-based decision making and illustrates several ways in which an ability to recognize intentions of others can enhance a decision making process. By employing the new intention recognition method and the tools of evolutionary game theory, this book introduces computational models demonstrating that intention recognition promotes the emergence of cooperation within populations of self-regarding agents. Finally, the book describes how commitment provides a pathway to the evolution of cooperative behavior, and how it further empowers intention recognition, thereby leading to a combined improved strategy.
Though the game-theoretic approach has been vastly studied and utilized in relation to the economics of industrial organizations, it has hardly been used to tackle safety management in multi-plant chemical industrial settings. Using Game Theory for Improving Safety within Chemical Industrial Parks presents an in-depth discussion of game-theoretic modeling that may be applied to improve cross-company prevention and safety management in a chemical industrial park. By systematically analyzing game-theoretic models and approaches in relation to managing safety in chemical industrial parks, the book explores the ways game theory can predict the outcome of complex strategic investment decision-making processes involving several adjacent chemical plants. A number of game-theoretic decision models are discussed to provide strategic tools for decision-making situations. Offering clear and straightforward explanations of methodologies, Using Game Theory for Improving Safety within Chemical Industrial Parks provides managers and management teams with approaches to assess situations and to improve strategic safety and prevention arrangements.
Broad and diverse ranges of activities are conducted within and by organized groups of individuals, including political, economic and social activities. These activities have become a subject of intense interest in economics and game theory. Some of the topics investigated in this collection are models of networks of power and privilege, trade networks, co-authorship networks, buyer-seller networks with differentiated products, and networks of medical innovation and the adaptation of new information. Other topics are social norms on punctuality, clubs and the provision of club goods and public goods, research and development and collusive alliances among corporations, and international alliances and trading agreements. While relatively recent, the literature on game theoretic studies of group formation in economics is already vast. This volume provides an introduction to this important literature on game-theoretic treatments of situations with networks, clubs, and coalitions, including some applications.
This book presents a rigorous treatment of the mathematical instruments available for dealing with income distributions, in particular Lorenz curves and related methods. The methods examined allow us to analyze, compare and modify such distributions from an economic and social perspective. Though balanced income distributions are key to peaceful coexistence within and between nations, it is often difficult to identify the right kind of balance needed, because there is an interesting interaction with innovation and economic growth. The issue of justice, as discussed in Thomas Piketty's bestseller "Capital in the Twenty-First Century" or in the important book "The Price of Inequality" by Nobel laureate Joseph Stiglitz, is also touched on. Further, there is a close connection to the issue of democracy in the context of globalization. One highlight of the book is its rigorous treatment of the so-called Atkinson theorem and some extensions, which help to explain under which type of societal utility functions nations tend to operate either in the direction of more balance or less balance. Finally, there are some completely new insights into changing the balance pattern of societies and the kind of coalitions between richer and poorer parts of society to organize political support in democracies in either case. Oxford University's Sir Tony Atkinson, well known for his so-called Atkinson theorem, writes in his foreword to the book: "[The authors] contribute directly to the recent debates that are going on in politics. [...] with this book the foundation of arguments concerning a proper balance in income distribution in the sense of identifying an 'efficient inequality range' has got an additional push from mathematics, which I appreciate very much."
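To make the blurb's central object concrete, here is a minimal sketch (hypothetical incomes, not data or code from the book) that computes Lorenz-curve points and the associated Gini coefficient.

```python
import numpy as np

def lorenz_points(incomes):
    """Cumulative population shares and cumulative income shares,
    i.e. the points of the Lorenz curve for a finite population."""
    x = np.sort(np.asarray(incomes, dtype=float))
    pop_share = np.arange(1, len(x) + 1) / len(x)
    income_share = np.cumsum(x) / x.sum()
    return pop_share, income_share

def gini(incomes):
    """Gini coefficient via the rank formula G = sum((2i-n-1)*x_i)/(n*sum(x))
    for incomes sorted in ascending order (i = 1..n)."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = len(x)
    ranks = np.arange(1, n + 1)
    return np.sum((2 * ranks - n - 1) * x) / (n * x.sum())

incomes = [12_000, 18_000, 25_000, 40_000, 105_000]   # hypothetical incomes
print(lorenz_points(incomes)[1])   # income share of the poorest 20%, 40%, ...
print(round(gini(incomes), 3))     # 0 = perfect equality, 1 = maximal inequality
```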
Everyday decision making, and decision making in complex human-centric systems, is characterized by imperfect decision-relevant information. The main drawback of existing decision theories is their inability to deal with imperfect information and to model vague preferences. A paradigm of non-numerical probabilities in decision making has a long history and arose, for example, in Keynes's analysis of uncertainty. There is a need for further generalization: a move to decision theories with perception-based imperfect information described in natural language (NL). The languages of new decision models for human-centric systems should not be languages based on binary logic, but human-centric computational schemes able to operate on NL-described information. The development of new theories is now possible thanks to the increased computational power of information processing systems, which allows for computations with imperfect information, particularly imprecise and partially true information, which are much more complex than computations over numbers and probabilities. The monograph lays out the foundations of a new decision theory with imperfect decision-relevant information on the environment and a decision maker's behavior. This theory is based on a synthesis of fuzzy set theory, with its perception-based information, and probability theory. The book is self-contained and presents the decision theory with imperfect information systematically, making it suitable for educational use. It will be helpful for teachers and students of universities and colleges, and for managers and specialists from various fields of business and economics, production and the social sphere.
This book presents a mathematically based introduction to the fascinating topic of Fuzzy Sets and Fuzzy Logic. It may be used as a textbook at both undergraduate and graduate levels, and also as a reference guide for mathematicians, scientists or engineers who would like to gain insight into Fuzzy Logic. Fuzzy sets were introduced by Lotfi Zadeh in 1965 and have since been used in many applications. As a consequence, there is a vast literature on the practical applications of fuzzy sets, while the theory has a more modest coverage. The main purpose of the present book is to reduce this gap by providing a theoretical introduction to Fuzzy Sets based on Mathematical Analysis and Approximation Theory. Well-known applications, such as fuzzy control, are also discussed in this book and placed on new ground, a theoretical foundation. Moreover, a few advanced chapters and several new results are included. These comprise, among others, a new systematic and constructive approach to fuzzy inference systems of Mamdani and Takagi-Sugeno types, which investigates their approximation capability by providing new error estimates.
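For readers new to the topic, the sketch below is a deliberately tiny, hypothetical Mamdani-type inference step (one rule, triangular membership functions, centroid defuzzification); it illustrates the general idea only and is not code from the book.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    x = np.asarray(x, dtype=float)
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Single hypothetical rule: IF temperature IS warm THEN fan_speed IS medium.
temperature = 24.0                                        # crisp input (deg C)
activation = float(triangular(temperature, 18, 25, 32))   # degree of "warm"

speed = np.linspace(0, 100, 501)                          # output universe (%)
medium = triangular(speed, 30, 50, 70)                    # consequent fuzzy set

# Mamdani (min) implication clips the consequent at the rule activation,
# then the centroid of the clipped set gives the crisp output.
clipped = np.minimum(medium, activation)
fan_speed = np.sum(speed * clipped) / np.sum(clipped)
print(f"activation of 'warm': {activation:.2f}, fan speed: {fan_speed:.1f}%")
```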
This brief presents a general unifying perspective on the fractional calculus. It brings together results of several recent approaches in generalizing the least action principle and the Euler-Lagrange equations to include fractional derivatives. The dependence of Lagrangians on generalized fractional operators as well as on classical derivatives is considered along with still more general problems in which integer-order integrals are replaced by fractional integrals. General theorems are obtained for several types of variational problems for which recent results developed in the literature can be obtained as special cases. In particular, the authors offer necessary optimality conditions of Euler-Lagrange type for the fundamental and isoperimetric problems, transversality conditions, and Noether symmetry theorems. The existence of solutions is demonstrated under Tonelli type conditions. The results are used to prove the existence of eigenvalues and corresponding orthogonal eigenfunctions of fractional Sturm-Liouville problems. Advanced Methods in the Fractional Calculus of Variations is a self-contained text which will be useful for graduate students wishing to learn about fractional-order systems. The detailed explanations will interest researchers with backgrounds in applied mathematics, control and optimization as well as in certain areas of physics and engineering.
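For orientation, a representative necessary condition of the Euler-Lagrange type mentioned here, written for the simplest fundamental problem with a left Riemann-Liouville fractional derivative of order alpha (a standard form from the literature, not the book's most general statement), looks as follows.

```latex
% Fundamental fractional variational problem and an Euler-Lagrange condition
% (standard Riemann-Liouville form; the book treats more general operators).
\[
  \mathcal{J}[x] = \int_a^b L\bigl(t,\, x(t),\, {}_a D_t^{\alpha} x(t)\bigr)\, dt
  \;\longrightarrow\; \min,
\]
\[
  \frac{\partial L}{\partial x}\bigl(t, x(t), {}_a D_t^{\alpha} x(t)\bigr)
  + {}_t D_b^{\alpha}\,
    \frac{\partial L}{\partial\, {}_a D_t^{\alpha} x}\bigl(t, x(t), {}_a D_t^{\alpha} x(t)\bigr)
  = 0 .
\]
```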
Drawing on a wealth of new archival material, including personal correspondence and diaries, Robert Leonard tells the fascinating story of the creation of game theory by Hungarian Jewish mathematician John von Neumann and Austrian economist Oskar Morgenstern. Game theory first emerged amid discussions of the psychology and mathematics of chess in Germany and fin-de-siècle Austro-Hungary. In the 1930s, on the cusp of anti-Semitism and political upheaval, it was developed by von Neumann into an ambitious theory of social organization. It was shaped still further by its use in combat analysis in World War II and during the Cold War. Interweaving accounts of the period's economics, science, and mathematics, and drawing sensitively on the private lives of von Neumann and Morgenstern, Robert Leonard provides a detailed reconstruction of a complex historical drama.
The likelihood of observing Condorcet's Paradox is known to be very low in elections with a small number of candidates if voters' preferences over the candidates reflect any significant degree of mutual coherence, on any of a number of different measures. This reinforces the intuitive notion that strange election outcomes should become less likely as voters' preferences become more mutually coherent. Similar analysis is used here to show that this notion is valid for most, but not all, other voting paradoxes. The study also focuses on the Condorcet Criterion, which states that the pairwise majority rule winner, if one exists, should be chosen as the election winner. Representations for the Condorcet Efficiency of the most common voting rules are obtained as functions of various measures of the degree of mutual coherence of voters' preferences. An analysis of the resulting Condorcet Efficiency representations yields strong support for using the Borda Rule.
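The classic three-voter cycle makes the two notions in this blurb concrete; the sketch below (a hypothetical profile, not an example taken from the book) looks for a pairwise-majority winner and computes Borda scores.

```python
# A hypothetical three-voter profile; each ballot ranks candidates best first.
ballots = [("A", "B", "C"),
           ("B", "C", "A"),
           ("C", "A", "B")]
candidates = {"A", "B", "C"}

def prefers(ballot, x, y):
    """True if this ballot ranks candidate x above candidate y."""
    return ballot.index(x) < ballot.index(y)

def condorcet_winner(ballots, candidates):
    """The candidate beating every rival in pairwise majority votes, if any."""
    for x in candidates:
        if all(sum(prefers(b, x, y) for b in ballots) > len(ballots) / 2
               for y in candidates - {x}):
            return x
    return None   # majority cycle: Condorcet's Paradox

def borda_scores(ballots, candidates):
    """Borda Rule: with m candidates, rank k (0-based) earns m - 1 - k points."""
    m = len(candidates)
    scores = {c: 0 for c in candidates}
    for b in ballots:
        for rank, c in enumerate(b):
            scores[c] += m - 1 - rank
    return scores

print(condorcet_winner(ballots, candidates))   # None: A beats B, B beats C, C beats A
print(borda_scores(ballots, candidates))       # symmetric profile: every score is 3
```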
This edited volume is an introduction to diverse methods and applications in operations research focused on local populations and community-based organizations that have the potential to improve the lives of individuals and communities in tangible ways. The book's themes include: space, place and community; disadvantaged, underrepresented or underserved populations; international and transnational applications; multimethod, cross-disciplinary and comparative approaches and appropriate technology; and analytics. The book comprises eleven original submissions, a reprint of a 2007 article by Johnson and Smilowitz that introduces community-based operations research (CBOR), and an introductory chapter that provides policy motivation, antecedents to CBOR in OR/MS, a theory of CBOR and a comprehensive review of the chapters. It is hoped that this book will provide a resource for academics and practitioners who seek to develop methods and applications that bridge the divide between traditional OR/MS rooted in mathematical models and newer streams in 'soft OR' that emphasize problem structuring methods, critical approaches to OR/MS and community engagement and capacity-building.
Toward the late 1990s, several research groups independently began developing new, related theories in mathematical finance. These theories did away with the standard stochastic geometric diffusion "Samuelson" market model (also known as the Black-Scholes model because it underlies that most famous theory), instead opting for models that allowed minimax approaches to complement or replace stochastic methods. Among the most fruitful models were those utilizing game-theoretic tools and the so-called interval market model. Over time, these models have slowly but steadily gained influence in the financial community, providing a useful alternative to classical methods. A self-contained monograph, The Interval Market Model in Mathematical Finance: Game-Theoretic Methods assembles some of the most important results, old and new, in this area of research. Written by seven of the most prominent pioneers of the interval market model and game-theoretic finance, the work provides a detailed account of several closely related modeling techniques for an array of problems in mathematical economics. The book is divided into five parts, which successively address:
* probability-free Black-Scholes theory;
* the fair-price interval of an option;
* representation formulas and fast algorithms for option pricing;
* rainbow options;
* the tychastic approach to mathematical finance based upon viability theory.
This book is a welcome addition to the literature, complementing the myriad titles on the market that take a classical approach to mathematical finance. It is a worthwhile resource for researchers in applied mathematics and quantitative finance, and is written in a manner accessible to financially inclined readers with a limited technical background.
Today it appears that we understand more about the universe than about our interconnected socio-economic world. In order to uncover organizational structures and novel features in these systems, we present the first comprehensive complex systems analysis of real-world ownership networks. This effort lies at the interface between the realms of economics and the emerging field loosely referred to as complexity science. The structure of global economic power is reflected in the network of ownership ties of companies and the analysis of such ownership networks has possible implications for market competition and financial stability. Thus this work presents powerful new tools for the study of economic and corporate networks that are only just beginning to attract the attention of scholars.
This book examines the most controversial issues concerning the use of pre-drafted clauses in fine print, which are usually included in consumer contracts and presented to consumers on a take-it-or-leave-it basis. Applying a multi-disciplinary approach that combines consumer psychology and sellers' drafting power within the logic of efficiency and good faith, the book provides a fresh and unconventional analysis of the existing literature, both theoretical and empirical. Starting from the unconscionability doctrine, it criticizes (and in some cases refutes) that doctrine's main conclusions, based on the criteria usually invoked to justify public intervention to protect consumers, specifically those related to law (contract complexity), psychology (consumers' lack of sophistication) and economics (market structure). It also analyzes the effects of different regulations, such as banning vexatious clauses or mandating disclosure, showing that none of them protects consumers, and that they in fact prove harmful when consumers are most vulnerable, that is, whenever sellers can exploit some degree of market power. In closing, the book combines these disparate aspects, arguing that the solution (if any) to the problem of consumer exploitation and market inefficiency associated with contracts of adhesion in these contexts cannot be found in removing or prohibiting hidden clauses, but must instead take into account the effects of these clauses on the contract as a whole.
This monograph provides a unified and comprehensive treatment of an order-theoretic fixed point theory in partially ordered sets and its various useful interactions with topological structures. The material progresses systematically, presenting the preliminaries before moving to more advanced topics. In the treatment of the applications, a wide range of mathematical theories and methods from nonlinear analysis and integration theory is applied; an outline of these is given in an appendix chapter to make the book self-contained. Graduate students and researchers in nonlinear analysis, pure and applied mathematics, game theory and mathematical economics will find this book useful.
The study of M-matrices, their inverses and discrete potential theory is now a well-established part of linear algebra and the theory of Markov chains. The main focus of this monograph is the so-called inverse M-matrix problem, which asks for a characterization of nonnegative matrices whose inverses are M-matrices. We present an answer in terms of discrete potential theory based on the Choquet-Deny Theorem. A distinguished subclass of inverse M-matrices is ultrametric matrices, which are important in applications such as taxonomy. Ultrametricity is revealed to be a relevant concept in linear algebra and discrete potential theory because of its relation with trees in graph theory and mean expected value matrices in probability theory. Remarkable properties of Hadamard functions and products for the class of inverse M-matrices are developed and probabilistic insights are provided throughout the monograph.
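A small numerical check illustrates the statement about ultrametric matrices; the 3x3 example below is hypothetical, and the test only inspects the sign pattern of the inverse, which is the visible part of the M-matrix property.

```python
import numpy as np

# A symmetric ultrametric matrix: nonnegative, U_ij >= min(U_ik, U_kj) for all k,
# and each diagonal entry dominates the off-diagonal entries in its row.
U = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

inv = np.linalg.inv(U)
off_diag = inv[~np.eye(3, dtype=bool)]

print(np.round(inv, 3))
print("off-diagonal entries nonpositive:", bool(np.all(off_diag <= 1e-12)))
print("diagonal entries positive:       ", bool(np.all(np.diag(inv) > 0)))
```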
In the area of dynamic economics, David Cass's work has spawned a number of important lines of research, including the study of dynamic general equilibrium theory, the concept of sunspot equilibria, and general equilibrium theory when markets are incomplete. Based on these contributions, this volume contains new developments in the field, written by Cass's students and co-authors.
Throughout the history of economics, a variety of analytical tools have been borrowed from the so-called exact sciences. As Schoeffler (1955) puts it: "They have taken their mathematics and their deductive techniques from physics, their statistics from genetics and agronomy, their systems of classification from taxonomy and chemistry, their model-construction techniques from astronomy and mechanics, and their methods of analysis of the consequences of actions from engineering". The possibility of similarities of structure in mathematical models of economic and physical systems has been an important factor in the development of neoclassical theory. To treat the state of an economy as an equilibrium, analogous to the equilibrium of a mechanical system, has been a key concept in economics ever since it became a mathematically formalized science. Adopting a Newtonian paradigm, neoclassical economics is often based on three fundamental concepts. Firstly, the representative agent, who is a scale model of the whole society with extraordinary capacities, particularly concerning her capability of information processing and computation. Of course, this is a problematic reduction, as agents are heterogeneous, boundedly rational and limited in their cognitive capabilities. Secondly, it has often confined itself to studying systems in a state of equilibrium. But this concept is not adequate to describe and to support phenomena in perpetual motion.
This book is a peer-reviewed collection of papers presented during the 10th edition of the Artificial Economics conference, addressing a variety of issues related to macroeconomics, industrial organization, networks, management and finance, as well as purely methodological issues. The field of artificial economics covers a broad range of methodologies relying on computer simulations in order to model and study the complexity of economic and social phenomena. The grounding principle of artificial economics is the analysis of aggregate properties of simulated systems populated by interacting adaptive agents that are equipped with heterogeneous individual behavioral rules. These macroscopic properties are neither foreseen nor intended by the artificial agents but are generated collectively by them. They are emerging characteristics of such artificially simulated systems.
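As a toy illustration of the emergence described here (entirely hypothetical and far simpler than any conference contribution), the random-exchange economy below starts from equal wealth, yet repeated pairwise interactions generate a markedly unequal aggregate distribution that no individual agent intends.

```python
import numpy as np

rng = np.random.default_rng(1)
n_agents, n_steps = 500, 200_000
wealth = np.ones(n_agents)                 # identical agents, equal start

for _ in range(n_steps):
    i, j = rng.integers(n_agents, size=2)  # a random pairwise interaction
    if i != j:
        pot = wealth[i] + wealth[j]
        share = rng.random()               # random split of the joint wealth
        wealth[i], wealth[j] = share * pot, (1 - share) * pot

# Aggregate outcome nobody planned: a strongly skewed wealth distribution.
wealth.sort()
top_decile_share = wealth[-n_agents // 10:].sum() / wealth.sum()
print(f"share of total wealth held by the richest 10%: {top_decile_share:.2f}")
```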
Born of a belief that economic insights should not require much mathematical sophistication, this book proposes novel and parsimonious methods to incorporate ignorance and uncertainty into economic modeling, without complex mathematics. Economics has made great strides over the past several decades in modeling agents' decisions when they are incompletely informed, but many economists believe that there are aspects of these models that are less than satisfactory. Among the concerns are that ignorance is not captured well in most models, that agents' presumed cognitive ability is implausible, and that derived optimal behavior is sometimes driven by the fine details of the model rather than the underlying economics. Compte and Postlewaite lay out a tractable way to address these concerns, and to incorporate plausible limitations on agents' sophistication. A central aspect of the proposed methodology is to restrict the strategies assumed available to agents.
In his book "Marktform und Gleichgewicht", published initially in 1934, Heinrich von Stackelberg presented his groundbreaking leadership model of firm competition. In a work of great originality and richness, he described and analyzed a market situation in which the leader firm moves first and the follower firms then move sequentially. This game-theoretic model, now widely known as Stackelberg competition, has had tremendous impact on the theory of the firm and economic analysis in general, and has been applied to study decision-making in various fields of business. As the first translation of von Stackelberg's book into English, this volume makes his classic work available in its original form to an English-speaking audience for the very first time.
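A numerical sketch of the leader-follower logic helps fix ideas; the linear inverse demand P = a - b(q1 + q2) and the common constant marginal cost c below are assumptions chosen purely for illustration, not taken from von Stackelberg's text.

```python
# Illustrative Stackelberg duopoly: inverse demand P = a - b*(q1 + q2),
# identical constant marginal cost c; quantities are chosen sequentially.
a, b, c = 100.0, 1.0, 10.0

def follower_best_response(q1):
    """Follower's profit-maximizing quantity given the leader's q1 (from the FOC)."""
    return max((a - c - b * q1) / (2 * b), 0.0)

def leader_profit(q1):
    """Leader's profit when the follower reacts optimally."""
    q2 = follower_best_response(q1)
    return q1 * (a - b * (q1 + q2) - c)

# Backward induction gives the closed-form leader quantity q1* = (a - c) / (2b).
q1_star = (a - c) / (2 * b)
q2_star = follower_best_response(q1_star)
print(q1_star, q2_star, leader_profit(q1_star))   # 45.0  22.5  1012.5
```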
Energy issues feature frequently in the economic and financial press. Specific examples of topical energy issues come from around the globe and often concern economics and finance. The importance of energy production, consumption and trade raises fundamental economic issues that impact the global economy and financial markets. This volume presents research on energy economics and financial markets related to the themes of supply and demand, environmental impact and renewables, energy derivatives trading, and finance and energy. The contributions by experts in their fields take a global perspective, as well as presenting cases from various countries and continents.
This book presents modern developments in time series econometrics that are applied to macroeconomic and financial time series, bridging the gap between methods and realistic applications. It presents the most important approaches to the analysis of time series, which may be stationary or nonstationary. Modelling and forecasting univariate time series is the starting point. For multiple stationary time series, Granger causality tests and vector autoregressive models are presented. As the modelling of nonstationary uni- or multivariate time series is most important for real applied work, unit root and cointegration analysis as well as vector error correction models are a central topic. Tools for analysing nonstationary data are then transferred to the panel framework. Modelling the (multivariate) volatility of financial time series with autoregressive conditional heteroskedastic models is also treated.
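The univariate starting point mentioned in this blurb can be sketched in a few lines (simulated data and plain least squares, no particular econometrics library assumed): an AR(1) coefficient recovered by regressing the series on its own lag.

```python
import numpy as np

# Simulate y_t = phi * y_{t-1} + e_t and recover phi by ordinary least squares.
rng = np.random.default_rng(42)
phi_true, n = 0.7, 1_000
e = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + e[t]

y_lag, y_now = y[:-1], y[1:]
phi_hat = (y_lag @ y_now) / (y_lag @ y_lag)   # OLS slope without an intercept
print(round(phi_hat, 3))                      # close to 0.7
```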
The utility maximization paradigm forms the basis of many economic, psychological, cognitive and behavioral models. However, numerous examples have revealed the deficiencies of the concept. This book helps to overcome those deficiencies by taking into account the insensitivity threshold of measurement and the context of choice. The second edition has been updated to include the most recent developments and a new chapter on classic and new results for infinite sets.
This title takes an in-depth look at the mathematics of voting and electoral systems, with a focus on simple ballots, complex elections, fairness, approval voting, ties, fair and unfair voting, and manipulation techniques. The exposition opens with a sketch of the mathematics behind the various methods used in conducting elections. The reader is led to a comprehensive picture of the theoretical background of mathematics and elections through an analysis of Condorcet's Principle and Arrow's Theorem on conditions for electoral fairness. Further detailed discussion of related topics includes methods of manipulating the outcome of an election, amendments, and voting on small committees. In recent years, electoral theory has been introduced into lower-level mathematics courses as a way to illustrate the role of mathematics in our everyday life. Few books have studied voting and elections from a more formal mathematical viewpoint. This text will be useful to those who teach lower-level courses or special-topics courses, and aims to inspire students to understand the more advanced mathematics of the topic. The exercises in this text are ideal for upper undergraduate and early graduate students, as well as those with a keen interest in the mathematics behind voting and elections.
You may like...
Networks in the Global World V… | Artem Antonyuk, Nikita Basov | Hardcover | R4,278
Evaluating Voting Systems with… | Mostapha Diss, Vincent Merlin | Hardcover | R4,284
The History and Allure of Interactive… | Mark Kretzschmar, Sara Raffel | Hardcover | R2,970
Operational Research - IO2017, Valenca… | A. Ismael F. Vaz, Joao Paulo Almeida, … | Hardcover | R2,863
Time-Inconsistent Control Theory with… | Tomas Bjoerk, Mariana Khapko, … | Hardcover | R3,553
Game Theory - Breakthroughs in Research… | Information Resources Management Association | Hardcover | R8,677