The mathematical theory of democracy deals with the selection of representatives who make decisions on behalf of the whole society. In this book, the notion of representativeness is operationalized with the index of popularity (the average percentage of the population whose opinion is represented on a number of issues) and the index of universality (the frequency of cases when the opinion of a majority is represented). These indices are applied to evaluate and study the properties of single representatives (e.g. a president) and representative bodies (e.g. a parliament, magistrate, cabinet, jury or coalition). To bridge representative and direct democracy, an election method is proposed that is based not on voting but on indexing candidates with respect to the electorate's political profile. In addition, societal and non-societal applications are considered.
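To make the two indices concrete, here is a minimal Python sketch, assuming opinions are coded as binary positions on each issue; the function name and data layout are illustrative, not taken from the book.

```python
import numpy as np

def popularity_and_universality(opinions, representative):
    """Illustrative computation of the two indices described above.

    opinions:       (n_voters, n_issues) array of 0/1 positions.
    representative: (n_issues,) array of the representative's 0/1 positions.
    """
    # Share of the population the representative agrees with, per issue.
    agreement = (opinions == representative).mean(axis=0)
    popularity = agreement.mean()            # average share represented across issues
    universality = (agreement > 0.5).mean()  # frequency of siding with a majority
    return popularity, universality

# Toy electorate: 5 voters, 4 issues.
opinions = np.array([[1, 0, 1, 1],
                     [1, 1, 0, 1],
                     [0, 1, 1, 0],
                     [1, 1, 1, 0],
                     [0, 0, 1, 1]])
rep = np.array([1, 1, 1, 1])
print(popularity_and_universality(opinions, rep))
```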
Mathematical demography is the centerpiece of quantitative social science. The founding works of this field from Roman times to the late Twentieth Century are collected here, in a new edition of a classic work by David R. Smith and Nathan Keyfitz. Commentaries by Smith and Keyfitz have been brought up to date and extended by Kenneth Wachter and Herve Le Bras, giving a synoptic picture of the leading achievements in formal population studies. Like the original collection, this new edition constitutes an indispensable source for students and scientists alike, and illustrates the deep roots and continuing vitality of mathematical demography.
This book presents an up-to-date review of modeling and optimization approaches for location problems, along with a new bi-level programming methodology which captures the effect of competition of both producers and customers on facility location decisions. While many optimization approaches simplify location problems by assuming decision making in isolation, this monograph focuses on models which take into account the competitive environment in which such decisions are made. This approach opens new modeling, algorithmic and theoretical possibilities, and makes new applications possible. Competition on equal terms as well as competition between a market leader and followers are considered in this study; consequently, bi-level optimization methodology is emphasized and further developed. This book provides insights regarding modeling complexity and algorithmic approaches to discrete competitive location problems. In traditional location modeling, the assignment of customer demands to supply sources is made such that the associated costs fall on the firm and not the customers, though in many real-world situations the cost is incurred by the customers. Moreover, there may be customer competition for the provided services. Thus, a new methodological framework is needed to encompass such considerations in the modeling and solution process. This book offers initial directions for further research and development along these lines. Aimed toward graduate students and researchers in mathematics, computer science, operational research and game theory, this title provides the necessary background on which further research contributions can be based.
Many economic problems can be formulated as constrained optimization problems or as the equilibration of their solutions. Various mathematical theories have supplied economists with indispensable machinery for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories. The series is designed to bring together those mathematicians who are seriously interested in obtaining new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research.
Search games and rendezvous problems have received growing attention in computer science within the past few years. Rendezvous problems emerge naturally, for instance, in optimizing the performance and convergence of mobile robots, which gives the theory a new algorithmic point of view. Furthermore, modern topics such as the spread of gossip or disease in social networks have led to new challenging problems in search and rendezvous. Search Theory: A Game Theoretic Perspective introduces the first integrated approach to search and rendezvous from the perspectives of biologists, computer scientists and mathematicians. This contributed volume covers a wide range of topics, including rendezvous problems and solutions, rendezvous on graphs, search games in biology, mobility in governed social networks, search and security, and more. Most chapters also include case studies or a survey, and a chapter is devoted to future directions of search and rendezvous research. This book is intended as a reference for researchers and practitioners working in computer science, mathematics and biology. Advanced-level students focused on these fields will also find it valuable as a secondary textbook.
This is the first book to comprehensively analyse co-patenting in Japan and the U.S., which directly signifies collaboration between firms and inventors, using the methodology of network science. Network science approaches enable us to analyse the structures of co-patenting networks. In addition, generative models in network science estimate the probability of new connections between nodes, which enables us to discuss the temporal development of networks. On the other hand, regression analyses, which are broadly used in the field of economics, may be effective for determining what attributes are important for firms and inventors that are going to be connected, but such techniques cannot account for the complexity of networks. This book compiles a series of the author's studies on geographical location and co-patenting, previously published as eight academic journal articles. It gives the reader ideas about how patent data can be used to understand how firms and inventors collaborate under the effect of complex networks.
Though the game-theoretic approach has been widely studied and utilized in the economics of industrial organization, it has hardly been used to tackle safety management in multi-plant chemical industrial settings. Using Game Theory for Improving Safety within Chemical Industrial Parks presents an in-depth discussion of game-theoretic modeling which may be applied to improve cross-company prevention and safety management in a chemical industrial park. By systematically analyzing game-theoretic models and approaches in relation to managing safety in chemical industrial parks, the book explores the ways game theory can predict the outcome of complex strategic investment decision-making processes involving several adjacent chemical plants. A number of game-theoretic decision models are discussed to provide strategic tools for decision-making situations. Offering clear and straightforward explanations of methodologies, the book provides managers and management teams with approaches to assess situations and to improve strategic safety and prevention arrangements.
This original and timely monograph describes a unique, self-contained excursion that reveals to readers the roles of two basic cognitive abilities, intention recognition and arranging commitments, in the evolution of cooperative behavior. The book analyses intention recognition, an important ability that helps agents predict others' behavior, in its artificial intelligence and evolutionary computational modeling aspects, and proposes a novel intention recognition method. Furthermore, it presents a new framework for intention-based decision making and illustrates several ways in which the ability to recognize the intentions of others can enhance a decision-making process. By employing the new intention recognition method and the tools of evolutionary game theory, the book introduces computational models demonstrating that intention recognition promotes the emergence of cooperation within populations of self-regarding agents. Finally, the book describes how commitment provides a pathway to the evolution of cooperative behavior, and how it further empowers intention recognition, thereby leading to a combined improved strategy.
This publication links information asymmetries and the decision processes of financial investors through quantitative models. The aim is to analyze empirical observations and synthesize the outputs in order to add new academic insights with practical pertinence. Multivariate scoring models and statistical analyses investigate situations at the market level that enable corporations to lower their capital costs if specific conditions are met. Scenario techniques and further econometric models are applied to research the microeconomic level.
Everyday decision making, and decision making in complex human-centric systems, is characterized by imperfect decision-relevant information. The main drawback of existing decision theories is precisely their inability to deal with imperfect information and to model vague preferences. In fact, the paradigm of non-numerical probabilities in decision making has a long history, arising already in Keynes's analysis of uncertainty. There is a need for further generalization: a move to decision theories with perception-based imperfect information described in natural language (NL). The languages of new decision models for human-centric systems should not be languages based on binary logic, but human-centric computational schemes able to operate on NL-described information. The development of new theories is now possible thanks to the increased computational power of information processing systems, which allows for computations with imperfect information, particularly imprecise and partially true information, that are much more complex than computations over numbers and probabilities. The monograph lays out the foundations of a new decision theory with imperfect decision-relevant information on the environment and a decision maker's behavior. This theory is based on the synthesis of fuzzy set theory, with its perception-based information, and probability theory. The book is self-contained and presents the decision theory with imperfect information systematically, in a form suited to educational use. It will be helpful for teachers and students of universities and colleges, for managers, and for specialists from various fields of business and economics, production and the social sphere.
Tools and methods from complex systems science can have a considerable impact on the way in which the quantitative assessment of economic and financial issues is approached, as discussed in this thesis. First, it is shown that the self-organization of financial markets is a crucial factor in the understanding of their dynamics: using an agent-based approach, it is argued that financial markets' stylized facts appear only in the self-organized state. Secondly, the thesis points out the potential of so-called big data science for financial market modeling, investigating how web-derived data can yield a picture of market activities; it has been found that web query volumes anticipate trade volumes. As a third achievement, the metrics developed here for country competitiveness and product complexity are groundbreaking in comparison to mainstream theories of economic growth and technological development. A key element in assessing the intangible variables determining the success of countries in the present globalized economy is the diversification of countries' productive baskets. The comparison between the level of complexity of a country's productive system and economic indicators such as GDP per capita discloses its hidden growth potential.
This brief presents a general unifying perspective on the fractional calculus. It brings together results of several recent approaches in generalizing the least action principle and the Euler-Lagrange equations to include fractional derivatives. The dependence of Lagrangians on generalized fractional operators as well as on classical derivatives is considered along with still more general problems in which integer-order integrals are replaced by fractional integrals. General theorems are obtained for several types of variational problems for which recent results developed in the literature can be obtained as special cases. In particular, the authors offer necessary optimality conditions of Euler-Lagrange type for the fundamental and isoperimetric problems, transversality conditions, and Noether symmetry theorems. The existence of solutions is demonstrated under Tonelli type conditions. The results are used to prove the existence of eigenvalues and corresponding orthogonal eigenfunctions of fractional Sturm-Liouville problems. Advanced Methods in the Fractional Calculus of Variations is a self-contained text which will be useful for graduate students wishing to learn about fractional-order systems. The detailed explanations will interest researchers with backgrounds in applied mathematics, control and optimization as well as in certain areas of physics and engineering.
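As a hedged illustration of what an optimality condition "of Euler-Lagrange type" looks like in this setting, the following LaTeX states the classical fractional variational problem with a left Riemann-Liouville derivative, a standard special case from the literature rather than the book's most general operators:

```latex
% Fractional problem of the calculus of variations: minimize
%   J[x] = \int_a^b L\bigl(t,\, x(t),\, {}_aD_t^{\alpha} x(t)\bigr)\,dt,
% where {}_aD_t^{\alpha} is the left Riemann-Liouville derivative.
% A necessary optimality condition of Euler-Lagrange type is
\frac{\partial L}{\partial x}
  + {}_tD_b^{\alpha}\,\frac{\partial L}{\partial\, {}_aD_t^{\alpha} x} = 0,
% with {}_tD_b^{\alpha} the right Riemann-Liouville derivative; as
% \alpha \to 1 this reduces to the classical Euler-Lagrange equation.
```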
Today it appears that we understand more about the universe than about our interconnected socio-economic world. In order to uncover organizational structures and novel features in these systems, we present the first comprehensive complex systems analysis of real-world ownership networks. This effort lies at the interface between the realms of economics and the emerging field loosely referred to as complexity science. The structure of global economic power is reflected in the network of ownership ties of companies and the analysis of such ownership networks has possible implications for market competition and financial stability. Thus this work presents powerful new tools for the study of economic and corporate networks that are only just beginning to attract the attention of scholars.
Toward the late 1990s, several research groups independently began developing new, related theories in mathematical finance. These theories did away with the standard stochastic geometric-diffusion "Samuelson" market model (also known as the Black-Scholes model because it is used in that most famous theory), instead opting for models that allow minimax approaches to complement or replace stochastic methods. Among the most fruitful models were those utilizing game-theoretic tools and the so-called interval market model. Over time, these models have slowly but steadily gained influence in the financial community, providing a useful alternative to classical methods. A self-contained monograph, The Interval Market Model in Mathematical Finance: Game-Theoretic Methods assembles some of the most important results, old and new, in this area of research. Written by seven of the most prominent pioneers of the interval market model and game-theoretic finance, the work provides a detailed account of several closely related modeling techniques for an array of problems in mathematical economics. The book is divided into five parts, which successively address: * probability-free Black-Scholes theory; * the fair-price interval of an option; * representation formulas and fast algorithms for option pricing; * rainbow options; * the tychastic approach to mathematical finance based upon viability theory. This book provides a welcome addition to the literature, complementing the myriad titles on the market that take a classical approach to mathematical finance. It is a worthwhile resource for researchers in applied mathematics and quantitative finance, and has also been written in a manner accessible to financially inclined readers with a limited technical background.
This book examines the most controversial issues concerning the use of pre-drafted clauses in fine print, which are usually included in consumer contracts and presented to consumers on a take-it-or-leave-it basis. By applying a multi-disciplinary approach that combines consumer psychology and seller drafting power within the logic of efficiency and good faith, the book provides a fresh and unconventional analysis of the existing literature, both theoretical and empirical. Starting from the unconscionability doctrine, it criticizes (and in some cases refutes) its main conclusions on the basis of criteria usually invoked to justify public intervention to protect consumers, relating specifically to law (contract complexity), psychology (consumers' lack of sophistication) and economics (market structure). It also analyzes the effects of different regulations, such as banning vexatious clauses or mandating disclosure, showing that none of them protects consumers, and that they in fact prove harmful when consumers are most vulnerable, that is, whenever sellers can exploit some degree of market power. In closing, the book brings these disparate aspects together, arguing that the solution (if any) to the problem of consumer exploitation and market inefficiency associated with contracts of adhesion cannot be found in removing or prohibiting hidden clauses, but must instead take into account the effects of these clauses on the contract as a whole.
This monograph provides a unified and comprehensive treatment of order-theoretic fixed point theory in partially ordered sets and its various useful interactions with topological structures. The material progresses systematically, presenting the preliminaries before moving to more advanced topics. In the treatment of the applications, a wide range of mathematical theories and methods from nonlinear analysis and integration theory are applied; an outline of these is given in an appendix chapter to make the book self-contained. Graduate students and researchers in nonlinear analysis, pure and applied mathematics, game theory and mathematical economics will find this book useful.
In the area of dynamic economics, David Cass's work has spawned a number of important lines of research, including the study of dynamic general equilibrium theory, the concept of sunspot equilibria, and general equilibrium theory when markets are incomplete. Based on these contributions, this volume contains new developments in the field, written by Cass's students and co-authors.
Throughout the history of economics, a variety of analytical tools have been borrowed from the so-called exact sciences. As Schoeffler (1955) puts it: "They have taken their mathematics and their deductive techniques from physics, their statistics from genetics and agronomy, their systems of classification from taxonomy and chemistry, their model-construction techniques from astronomy and mechanics, and their methods of analysis of the consequences of actions from engineering". The possibility of similarities of structure in mathematical models of economic and physical systems has been an important factor in the development of neoclassical theory. To treat the state of an economy as an equilibrium, analogous to the equilibrium of a mechanical system, has been a key concept in economics ever since it became a mathematically formalized science. Adopting a Newtonian paradigm, neoclassical economics is often based on three fundamental concepts. Firstly, the representative agent, a scale model of the whole society with extraordinary capacities, particularly concerning her capability of information processing and computation. Of course, this is a problematic reduction, as agents are both heterogeneous and boundedly rational, limited in their cognitive capabilities. Secondly, it has often confined itself to the study of systems in a state of equilibrium. But this concept is not adequate to describe and to support phenomena in perpetual motion.
This book presents modern developments in time series econometrics that are applied to macroeconomic and financial time series, bridging the gap between methods and realistic applications. It presents the most important approaches to the analysis of time series, which may be stationary or nonstationary. Modelling and forecasting univariate time series is the starting point. For multiple stationary time series, Granger causality tests and vector autoregressive models are presented. As the modelling of nonstationary uni- or multivariate time series is most important for real applied work, unit root and cointegration analysis as well as vector error correction models are a central topic. Tools for analysing nonstationary data are then transferred to the panel framework. Modelling the (multivariate) volatility of financial time series with autoregressive conditional heteroskedastic models is also treated.
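As a small illustration of the unit root and cointegration tools mentioned above, the following Python sketch simulates two cointegrated random walks and applies the augmented Dickey-Fuller and Engle-Granger tests via statsmodels; the choice of library and the simulated data are assumptions for illustration, not material from the book.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller, coint

rng = np.random.default_rng(0)

# Simulate a random walk x_t and a series y_t cointegrated with it.
x = np.cumsum(rng.normal(size=500))  # nonstationary: contains a unit root
y = 2.0 * x + rng.normal(size=500)   # y - 2x is stationary

# Augmented Dickey-Fuller test: the null hypothesis is a unit root.
adf_stat, adf_p = adfuller(x)[:2]
print(f"ADF on x: stat={adf_stat:.2f}, p={adf_p:.2f}")  # p typically large here

# Engle-Granger test: the null hypothesis is no cointegration.
eg_stat, eg_p, _ = coint(y, x)
print(f"Engle-Granger y~x: stat={eg_stat:.2f}, p={eg_p:.3f}")  # p typically small here
```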
This book constitutes the refereed proceedings of the 5th International Conference on Decision and Game Theory for Security, GameSec 2014, held in Los Angeles, CA, USA, in November 2014. The 16 revised full papers presented together with 7 short papers were carefully reviewed and selected from numerous submissions. The topics cover multiple facets of cyber security, including: rationality of the adversary, game-theoretic cryptographic techniques, vulnerability discovery and assessment, multi-goal security analysis, secure computation, economics-oriented security, and surveillance for security. These aspects are covered in a multitude of domains, including networked systems, wireless communications, border patrol security, and control systems.
Most financial and investment decisions are based on considerations of possible future changes and require forecasts of the evolution of the financial world. Time series and processes are the natural tools for describing the dynamic behavior of financial data, leading to the required forecasts. This book presents a survey of the empirical properties of financial time series, their description by means of mathematical processes, and some implications for important financial applications used in many areas such as risk evaluation, option pricing and portfolio construction. The statistical tools used to extract information from raw data are introduced. Extensive multiscale empirical statistics provide a solid benchmark of stylized facts (heteroskedasticity, long memory, fat tails, leverage...), against which the various mathematical structures that could capture the observed regularities are assessed. The author introduces a broad range of processes and evaluates them systematically against the benchmark, summarizing the successes and limitations of these models from an empirical point of view. The outcome is that only multiscale ARCH processes with long memory, discrete multiplicative structures and non-normal innovations are able to capture the empirical properties correctly. In particular, only a discrete time series framework makes it possible to capture all the stylized facts in a single process, whereas the stochastic calculus used in the continuum limit is too constraining. The present volume offers various applications and extensions for this class of processes, including high-frequency volatility estimators, market risk evaluation, covariance estimation and multivariate extensions of the processes. The book discusses many practical implications and is addressed to practitioners and quants in the financial industry, as well as to academics, including graduate (Master or PhD level) students. The prerequisites are basic statistics and some elementary financial mathematics.
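Two of the stylized facts mentioned above, volatility clustering and fat tails, can be reproduced even by a plain GARCH(1,1) simulation, which the sketch below uses as a much simpler stand-in for the multiscale ARCH processes the book studies; all parameter values are hypothetical.

```python
import numpy as np

def simulate_garch11(n, omega=1e-5, alpha=0.08, beta=0.90, seed=0):
    """Simulate GARCH(1,1) returns: r_t = sigma_t * z_t, with
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2."""
    rng = np.random.default_rng(seed)
    r = np.empty(n)
    var = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    for t in range(n):
        r[t] = np.sqrt(var) * rng.standard_normal()
        var = omega + alpha * r[t] ** 2 + beta * var
    return r

r = simulate_garch11(10_000)
# Volatility clustering: squared returns are positively autocorrelated...
lag1 = np.corrcoef(r[:-1] ** 2, r[1:] ** 2)[0, 1]
# ...and the return distribution is fat-tailed (positive excess kurtosis).
kurt = ((r - r.mean()) ** 4).mean() / r.var() ** 2 - 3.0
print(f"lag-1 autocorr of r^2: {lag1:.3f}, excess kurtosis: {kurt:.2f}")
```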
The likelihood of observing Condorcet's Paradox is known to be very low for elections with a small number of candidates if voters' preferences on candidates reflect any significant degree of a number of different measures of mutual coherence. This reinforces the intuitive notion that strange election outcomes should become less likely as voters' preferences become more mutually coherent. Similar analysis is used here to show that this notion is valid for most, but not all, other voting paradoxes. This study also focuses on the Condorcet Criterion, which states that the pairwise majority rule winner, if one exists, should be chosen as the election winner. Representations for the Condorcet Efficiency of the most common voting rules are obtained here as a function of various measures of the degree of mutual coherence of voters' preferences. An analysis of the Condorcet Efficiency representations that are obtained yields strong support for using the Borda Rule.
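For readers unfamiliar with the criterion, here is a minimal Python sketch that checks ranked ballots for a pairwise majority (Condorcet) winner; the example profiles are illustrative and not drawn from the book's representations.

```python
from itertools import combinations

def condorcet_winner(ballots):
    """Return the candidate who beats every other in pairwise majority
    comparisons, or None if none exists (Condorcet's Paradox).
    Each ballot lists candidates from most to least preferred."""
    candidates = set(ballots[0])
    wins = {c: 0 for c in candidates}
    for a, b in combinations(candidates, 2):
        a_over_b = sum(ballot.index(a) < ballot.index(b) for ballot in ballots)
        if a_over_b > len(ballots) / 2:
            wins[a] += 1
        elif a_over_b < len(ballots) / 2:
            wins[b] += 1
    for c in candidates:
        if wins[c] == len(candidates) - 1:  # beat all other candidates
            return c
    return None

# Cyclic profile A>B>C, B>C>A, C>A>B: no Condorcet winner exists.
print(condorcet_winner([["A", "B", "C"], ["B", "C", "A"], ["C", "A", "B"]]))  # None
print(condorcet_winner([["A", "B", "C"], ["A", "C", "B"], ["B", "A", "C"]]))  # A
```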
Sociological theories of crime include: theories of strain, which blame crime on personal stressors; theories of social learning, which blame crime on its social rewards and see crime more as an institution in conflict with other institutions than as individual deviance; and theories of control, which look at crime as natural and rewarding, and explore the formation of institutions that control crime. Theorists of corruption generally agree that corruption is an expression of the patron-client relationship, in which a person with access to resources trades resources with kin and members of the community in exchange for loyalty. Some approaches to modeling crime and corruption do not involve an explicit simulation: rule-based systems; Bayesian networks; game-theoretic approaches, often based on rational choice theory; and neoclassical econometrics, a rational-choice-based approach. Simulation-based approaches take into account greater complexities of the interacting parts of social phenomena. These include fuzzy cognitive maps and fuzzy rule sets that may incorporate feedback; and agent-based simulation, which can go a step farther by computing new social structures not previously identified in theory. The latter include cognitive agent models, in which agents learn how to perceive their environment and act upon the perceptions of their individual experiences, and reactive agent simulation, which, while less capable than cognitive-agent simulation, is adequate for testing a policy's effects within existing societal structures. For example, NNL is a cognitive agent model based on the REPAST Simphony toolkit.
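As a toy illustration of the reactive-agent style of simulation described above (not the NNL model, and not built on REPAST), the following Python sketch has agents re-decide each step based on their neighbors' offending rate (social learning) and a global sanction level (control); every parameter is hypothetical.

```python
import numpy as np

# Minimal reactive-agent sketch: each agent re-decides every step from its
# neighborhood's last actions (social reward) and a global sanction level.
rng = np.random.default_rng(1)
n_agents, steps, sanction = 200, 50, 0.3
offending = rng.random(n_agents) < 0.1  # initial offenders

for _ in range(steps):
    # Neighborhood on a ring: the agent plus two agents on each side.
    neigh_rate = np.array([
        offending[np.arange(i - 2, i + 3) % n_agents].mean()
        for i in range(n_agents)
    ])
    # Offend with a probability driven by peers and damped by sanctions.
    p_offend = np.clip(0.05 + 0.8 * neigh_rate - sanction, 0.0, 1.0)
    offending = rng.random(n_agents) < p_offend

print(f"final offending rate: {offending.mean():.2f}")
```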
Market failure at medium intervals is inevitable in a capitalist economy. Such failures may not be seen as serious in the short run, because the market adjusts demand through the hoarding of inventory or the import of required goods and services. The market also adjusts demand in the long run through expansion of the relevant industrial output and the entry of new firms. The crucial variable is price, which also adjusts the commodity and labor markets. The problem arises when there are issues of overproduction, over-utilization of plant capacity, over-liquidation and excess money supply, or changes in demand due to changing tastes and habits of consumers, households and the public. All of these create knife-edge disturbances in the economy. As a consequence, they require adjustment through variables such as employment and population growth, saving propensity, technology, exhaustion of existing inventory, and monetary and fiscal balancing. This volume attempts to appraise the working of a market economy in which short-term disturbances may occur, market efficiency falls, a recessionary cycle emerges and, after certain fundamental measures, the market recovers. Starting with a brief recent history of the crisis and the recession, discussions in this volume turn to how deliberations in macroeconomics yield implications for specific policies, some of which have been tried and others still to be tested. Later in the volume we propose policies necessary for efficient regulation of the economic system, and give a brief assessment of the extent to which global policy coordination has been mulled over in policy circles, even if it is not yet seriously practiced.
"Decision Systems and Non-stochastic Randomness" presents the first mathematical formalization of the statistical regularities of non-stochastic randomness and demonstrates how these regularities extend the standard probability-based model of decision making under uncertainty, allowing for the description of uncertain mass events that do not fit standard stochastic models. The formalism of statistical regularities developed in this book will have a significant influence on decision theory and information theory as well as numerous other disciplines. |