Max-Min problems are two-step allocation problems in which one side must make his move knowing that the other side will then learn what the move is and optimally counter. They are fundamental in particular to military weapons-selection problems involving large systems such as Minuteman or Polaris, where the systems in the mix are so large that they cannot be concealed from an opponent. One must then expect the opponent to determine an optimal mixture of, in the case mentioned above, anti-Minuteman and anti-submarine effort. The author's first introduction to a problem of Max-Min type occurred at The RAND Corporation about 1951. One side allocates anti-missile defenses to various cities. The other side observes this allocation and then allocates missiles to those cities. If F(x, y) denotes the total residual value of the cities after the attack, with x denoting the defender's strategy and y the attacker's, the problem is then to find Max_x Min_y F(x, y).
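The city-defense model just described can be made concrete with a small sketch. Everything below is invented for illustration: the city values, the unit counts, and the simple damage rule (a city falls only if the attacking missiles outnumber its defenses); the book's actual model is continuous and far richer. The code enumerates the defender's allocations x, and for each one lets the attacker pick the worst response y, realizing Max_x Min_y F(x, y) by brute force:

```python
# Toy instance of the Max-Min city-defense problem; all numbers and the
# damage rule are illustrative assumptions, not the author's model.
CITY_VALUES = [10, 6, 4]   # value of each city
DEFENSE_UNITS = 2          # interceptors the defender may place
ATTACK_MISSILES = 2        # missiles the attacker may launch

def allocations(units, cities):
    """Yield every way to split `units` identical units among `cities`."""
    if cities == 1:
        yield (units,)
        return
    for first in range(units + 1):
        for rest in allocations(units - first, cities - 1):
            yield (first,) + rest

def residual_value(x, y):
    """F(x, y): total city value surviving attack y against defense x.
    A city survives unless the missiles aimed at it exceed its defenses."""
    return sum(v for v, d, a in zip(CITY_VALUES, x, y) if a <= d)

n = len(CITY_VALUES)
# Max over defenses x of the min over attacks y: the defender moves first,
# knowing the attacker will observe x and respond optimally.
guaranteed, best_x = max(
    (min(residual_value(x, y) for y in allocations(ATTACK_MISSILES, n)), x)
    for x in allocations(DEFENSE_UNITS, n)
)
print(guaranteed, best_x)
```

With these numbers the defender can guarantee a residual value of 10 no matter how the two missiles are aimed; several defense allocations achieve it, which already hints at why realistic instances need more than enumeration.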
This book constitutes the refereed proceedings of the 18th International Symposium Fundamentals of Computation Theory, FCT 2011, held in Oslo, Norway, in August 2011. The 28 revised full papers presented were carefully reviewed and selected from 78 submissions. FCT 2011 focused on algorithms, formal methods, and emerging fields, such as ad hoc, dynamic and evolving systems; algorithmic game theory; computational biology; foundations of cloud computing and ubiquitous systems; and quantum computation.
The management of financial portfolios or funds constitutes a well-known problem in financial markets, one which normally requires a rigorous analysis in order to select the most profitable assets. This subject is becoming popular among computer scientists, who try to adapt known Intelligent Computation techniques to the market's domain. This book proposes a potential system based on Genetic Algorithms, which aims to manage a financial portfolio by using technical analysis indicators. The results are promising, since the approach clearly outperforms the remaining approaches during the recent market crash.
Changing interest rates constitute one of the major risk sources for banks, insurance companies, and other financial institutions. Modeling the term-structure movements of interest rates is a challenging task. This volume gives an introduction to the mathematics of term-structure models in continuous time. It includes practical aspects for fixed-income markets such as day-count conventions, duration of coupon-paying bonds and yield curve construction; arbitrage theory; short-rate models; the Heath-Jarrow-Morton methodology; consistent term-structure parametrizations; affine diffusion processes and option pricing with Fourier transform; LIBOR market models; and credit risk. The focus is on a mathematically straightforward but rigorous development of the theory. Students, researchers and practitioners will find this volume very useful. Each chapter ends with a set of exercises that provides a source for homework and exam questions. Readers are expected to be familiar with elementary Ito calculus, basic probability theory, and real and complex analysis.
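As one concrete instance of the short-rate models such a volume covers, here is a minimal sketch of the classical Vasicek zero-coupon bond price, using the standard affine formula P(t, T) = A(t, T) exp(-B(t, T) r). The parameter values are invented for illustration; this is not drawn from the book itself:

```python
from math import exp

def vasicek_bond_price(r, tau, a, b, sigma):
    """Zero-coupon bond price P(t, T) in the Vasicek short-rate model
    dr = a*(b - r)*dt + sigma*dW, with tau = T - t the time to maturity.
    Standard affine closed form; parameters here are illustrative only."""
    B = (1.0 - exp(-a * tau)) / a
    lnA = ((B - tau) * (a * a * b - 0.5 * sigma * sigma) / (a * a)
           - (sigma * sigma * B * B) / (4.0 * a))
    return exp(lnA - B * r)

# Sanity checks: the price tends to 1 as maturity shrinks to zero, and
# with zero volatility and r at its long-run mean b it equals exp(-b*tau).
print(vasicek_bond_price(0.03, 1e-9, 0.1, 0.05, 0.01))  # ~1.0
print(vasicek_bond_price(0.05, 2.0, 0.1, 0.05, 0.0))    # exp(-0.1)
```

The two sanity checks are exactly the kind of limiting-case exercises a chapter on short-rate models tends to pose.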
This book is based on the papers presented at the International Conference 'Quality Improvement through Statistical Methods' in Cochin, India during December 28-31, 1996. The Conference was hosted by the Cochin University of Science and Technology, Cochin, India; and sponsored by the Institute for Improvement in Quality and Productivity (IIQP) at the University of Waterloo, Canada, the Statistics in Industry Committee of the International Statistical Institute (ISI) and by the Indian Statistical Institute. There has been an increased interest in Quality Improvement (QI) activities in many organizations during the last several years since the airing of the NBC television program, "If Japan can ... why can't we?" Implementation of QI methods requires statistical thinking and the utilization of statistical tools, thus there has been a renewed interest in statistical methods applicable to industry and technology. This revitalized enthusiasm has created worldwide discussions on Industrial Statistics Research and QI ideas at several international conferences in recent years. The purpose of this conference was to provide a forum for presenting and exchanging ideas in Statistical Methods and for enhancing the transference of such technologies to quality improvement efforts in various sectors. It also provided an opportunity for interaction between industrial practitioners and academia. It was intended that the exchange of experiences and ideas would foster new international collaborations in research and other technology transfers.
Game theory is a rich and active area of research, of which this new volume of the Annals of the International Society of Dynamic Games is yet fresh evidence. Since the second half of the 20th century, the area of dynamic games has managed to attract outstanding mathematicians, who found exciting open questions requiring tools from a wide variety of mathematical disciplines; economists, social and political scientists, who used game theory to model and study competition and cooperative behavior; and engineers, who used games in computer sciences, telecommunications, and other areas. The contents of this volume are primarily based on selected presentations made at the 8th International Symposium of Dynamic Games and Applications, held in Chateau Vaalsbroek, Maastricht, the Netherlands, July 5-8, 1998; this conference took place under the auspices of the International Society of Dynamic Games (ISDG), established in 1990. The conference was cosponsored by the Control Systems Society of the IEEE, IFAC (International Federation of Automatic Control), INRIA (Institut National de Recherche en Informatique et en Automatique), and the University of Maastricht. One of the activities of the ISDG is the publication of the Annals. Every paper that appears in this volume has passed through a stringent reviewing process, as is the case with publications for archival journals.
Inverse limits with set-valued functions are quickly becoming a popular topic of research due to their potential applications in dynamical systems and economics. This brief provides a concise introduction dedicated specifically to such inverse limits. The theory is presented along with detailed examples which form the distinguishing feature of this work. The major differences between the theory of inverse limits with mappings and the theory with set-valued functions are featured prominently in this book in a positive light. The reader is assumed to have taken a senior level course in analysis and a basic course in topology. Advanced undergraduate and graduate students, and researchers working in this area will find this brief useful.
Since the introduction of electrosurgery, the techniques of surgery on the nervous system have passed through further improvements (bipolar coagulation, the microscope), even if the procedure was not substantially modified. Today, the laser represents a new "discipline," as it offers a new way of performing all basic maneuvers (dissection, demolition, hemostasis, vessel sutures). Furthermore, the laser offers the possibility of a special maneuver, namely reduction of the volume of a tumoral mass through vaporization. Its application is not restricted to traditional neurosurgery but extends also to stereotactic and vascular neurosurgery. Laser surgery has also influenced anesthesiologic techniques. At the same time, new instrumentation has been introduced: CUSA ultrasonic aspiration, echotomography, and the Doppler flowmeter. I have had the chance to utilize all these new technologies at the same time and have come to the conclusion that we are facing the dawn of a new methodology, which has already shown its validity and lack of inconveniences, and whose object is to increase the precision of neurological surgery. The technological development is still going on, and some improvements are to be foreseen. The laser scalpel is splitting initial laser surgery into NO TOUCH and TOUCH surgery with the laser. As new instrumentation is developed, a variable and tunable beam will become available. For example, in a few years the Free Electron Laser will further add to the progress in this field.
Groups of people perform acts that are subject to standards of rationality. A committee may sensibly award fellowships, or may irrationally award them in violation of its own policies. A theory of collective rationality defines collective acts that are evaluable for rationality and formulates principles for their evaluation. This book argues that a group's act is evaluable for rationality if it is the product of acts its members fully control. It also argues that such an act is collectively rational if the acts of the group's members are rational. Efficiency is a goal of collective rationality, but not a requirement, except in cases where conditions are ideal for joint action and agents have rationally prepared for joint action. The people engaged in a game of strategy form a group, and the combination of their acts yields a collective act. If their collective act is rational, it constitutes a solution to their game. A theory of collective rationality yields principles concerning solutions to games. One principle requires that a solution constitute an equilibrium among the incentives of the agents in the game. In a cooperative game some agents are coalitions of individuals, and it may be impossible for all agents to pursue all incentives. Because rationality is attainable, the appropriate equilibrium standard for cooperative games requires that agents pursue only incentives that provide sufficient reasons to act. The book's theory of collective rationality supports an attainable equilibrium standard for solutions to cooperative games and shows that its realization follows from individuals' rational acts. By extending the theory of rationality to groups, this book reveals the characteristics that make an act evaluable for rationality and the way rationality's evaluation of an act responds to the type of control its agent exercises over the act. The book's theory of collective rationality contributes to philosophical projects such as contractarian ethics and to practical projects such as the design of social institutions.
This book constitutes the refereed proceedings of the 5th International Symposium on Algorithmic Game Theory, SAGT 2012, held in Barcelona, Spain, in October 2012. The 22 revised full papers presented together with 2 invited lectures were carefully reviewed and selected from 65 submissions. The papers present original research at the intersection of Algorithms and Game Theory and address various current topics such as solution concepts in game theory; efficiency of equilibria and price of anarchy; complexity classes in game theory; computational aspects of equilibria; computational aspects of fixed-point theorems; repeated games; evolution and learning in games; convergence of dynamics; coalitions, coordination and collective action; reputation, recommendation and trust systems; graph-theoretic aspects of social networks; network games; cost-sharing algorithms and analysis; computing with incentives; algorithmic mechanism design; computational social choice; decision theory, and pricing; auction algorithms and analysis; economic aspects of distributed computing; internet economics and computational advertising.
This book provides a game theoretic model of interaction among VoIP telecommunications providers regarding their willingness to enter peering agreements with one another. The author shows that the incentive to peer is generally based on savings from otherwise payable long distance fees. At the same time, termination fees can have a countering and dominant effect, resulting in an environment in which VoIP firms decide against peering. Various scenarios of peering and rules for allocation of the savings are considered. The first part covers the relevant aspects of game theory and network theory, trying to give an overview of the concepts required in the subsequent application. The second part of the book introduces first a model of how the savings from peering can be calculated and then turns to the actual formation of peering relationships between VoIP firms. The conditions under which firms are willing to peer are then described, considering the possible influence of a regulatory body.
Multifractal Financial Markets explores appropriate models for estimating risk and profiting from market swings, allowing readers to develop enhanced portfolio management skills and strategies. Fractals in finance allow us to understand market instability and persistence. When applied to financial markets, these models produce the requisite amount of data necessary for gauging market risk in order to mitigate loss. This brief delves deep into the multifractal market approach to portfolio management through real-world examples and case studies, providing readers with the tools they need to forecast profound shifts in market activity.
In the case of completely integrable systems, periodic solutions are found by inspection. For nonintegrable systems, such as the three-body problem in celestial mechanics, they are found by perturbation theory: there is a small parameter ε in the problem, the mass of the perturbing body for instance, and for ε = 0 the system becomes completely integrable. One then tries to show that its periodic solutions will subsist for ε ≠ 0 small enough. Poincaré also introduced global methods, relying on the topological properties of the flow, and the fact that it preserves the 2-form Σ_{i=1}^n dp_i ∧ dq_i. The most celebrated result he obtained in this direction is his last geometric theorem, which states that an area-preserving map of the annulus which rotates the inner circle and the outer circle in opposite directions must have two fixed points. And now another ancient theme appears: the least action principle. It states that the periodic solutions of a Hamiltonian system are extremals of a suitable integral over closed curves. In other words, the problem is variational. This fact was known to Fermat, and Maupertuis put it in the Hamiltonian formalism. In spite of its great aesthetic appeal, the least action principle has had little impact in Hamiltonian mechanics. There is, of course, one exception, Emmy Noether's theorem, which relates integrals of the motion to symmetries of the equations. But until recently, no periodic solution had ever been found by variational methods.
Game Theoretic Risk Analysis of Security Threats introduces reliability and risk analysis in the face of threats by intelligent agents. More specifically, game-theoretic models are developed for identifying optimal and/or equilibrium defense and attack strategies in systems of varying degrees of complexity. The book covers applications to networks, including problems in both telecommunications and transportation. However, the book's primary focus is to integrate game theory and reliability methodologies into a set of techniques to predict, detect, diminish, and stop intentional attacks at targets that vary in complexity. In this book, Bier and Azaiez highlight work by researchers who combine reliability and risk analysis with game theory methods to create a set of functional tools that can be used to offset intentional, intelligent threats (including the threats of terrorism and war). A comprehensive treatment of such problems must consider two aspects: (1) the structure of the system to be protected; and (2) the adaptive nature of the threat. The book provides a set of tools for applying game theory to reliability problems in the presence of intentional, intelligent threats. These tools will help to address problems of global security and also facilitate more cost-effective defensive investments.
This volume contains twelve of my game-theoretical papers, published in the period 1956-80. It complements my Essays on Ethics, Social Behavior, and Scientific Explanation, Reidel, 1976, and my Rational Behavior and Bargaining Equilibrium in Games and Social Situations, Cambridge University Press, 1977. These twelve papers deal with a wide range of game-theoretical problems. But there is a common intellectual thread going through all of them: they are all parts of an attempt to generalize and combine various game-theoretical solution concepts into a unified solution theory yielding one-point solutions for both cooperative and noncooperative games, and covering even such 'non-classical' games as games with incomplete information. Section A: The first three papers deal with bargaining models. The first one discusses Nash's two-person bargaining solution and shows its equivalence with Zeuthen's bargaining theory. The second considers the rationality postulates underlying the Nash-Zeuthen theory and defends it against Schelling's objections. The third extends the Shapley value to games without transferable utility and proposes a solution concept that is at the same time a generalization of the Shapley value and of the Nash bargaining solution.
Distributed Decision Making and Control is a mathematical treatment of relevant problems in distributed control, decision and multiagent systems. The research reported was prompted by the recent rapid development in large-scale networked and embedded systems and communications. One of the main reasons for the growing complexity in such systems is the dynamics introduced by computation and communication delays. Reliability, predictability, and efficient utilization of processing power and network resources are central issues, and the new theory and design methods presented here are needed to analyze and optimize the complex interactions that arise between controllers, plants and networks. The text also helps to meet requirements arising from industrial practice for a more systematic approach to the design of distributed control structures and corresponding information interfaces. Theory for coordination of many different control units is closely related to economics and game theory, network use being dictated by congestion-based pricing of a given pathway. The text extends existing methods, which represent pricing mechanisms as Lagrange multipliers, to distributed optimization in a dynamic setting. In Distributed Decision Making and Control, the main theme is distributed decision making and control, with contributions to a general theory and methodology for control of complex engineering systems in engineering, economics and logistics. This includes scalable methods and tools for modeling, analysis and control synthesis, as well as reliable implementations using networked embedded systems. Academic researchers and graduate students in control science, system theory, and mathematical economics and logistics will find much to interest them in this collection, first presented orally by the contributors during a sequence of workshops organized in Spring 2010 by the Lund Center for Control of Complex Engineering Systems, a Linnaeus Center at Lund University, Sweden.
Climate change is one of the major environmental concerns of many countries in the world. Negotiations to control potential climate changes have been taking place, from Rio to Kyoto, for the last five years. There is a widespread consciousness that the risk of incurring relevant economic and environmental losses due to climate change is high. Scientific analyses have become more and more precise on the likely impacts of climate change. According to the Second Assessment Report of the Intergovernmental Panel on Climate Change, current trends in greenhouse gases (GHGs) emissions may indeed cause the average global temperature to increase by 1-3.5 °C over the next 100 years. As a result, sea levels are expected to rise by 15 to 95 cm and climate zones to shift towards the poles by 150 to 550 km in mid latitudes. In order to mitigate the adverse effects of climate change, the IPCC report concludes that a stabilization of the atmospheric concentration of carbon dioxide, one of the major GHGs, at 550 parts per million by volume (ppmv) is recommended. This would imply a reduction of global emissions of about 50 per cent with respect to current levels. In this context, countries are negotiating to achieve a world-wide agreement on GHGs emissions control in order to stabilize climate changes. Despite the agreement on targets achieved in Kyoto, many issues still remain unresolved.
The theory of social choice deals with both the processes and results of collective decision making. In this book, we explore some issues in the theory of social choice and mechanism design. We examine the premises of this theory, the axiomatic approach, and the mechanism design approach. The main questions are: what is the collective interest, how is it related to individuals' interests, and how should one design social interactions, laws, and institutions? These questions are not new. Philosophers and social scientists have indeed pondered upon them for years. And, in fact, the organizational structures of many social institutions (courts, parliaments, committees and regulatory boards) often lack a sound theoretical base. This is not surprising, as it is, indeed, difficult to provide for a comprehensive formalization of the activities of such organizations. Nevertheless, there has been a definite trend towards providing clear and unambiguous rules for collective decision making. These very rules constitute the body of social choice theory and its main object. As to the basic problem of social choice: we explain here more precisely what a problem of social choice is, what approaches might be used to tackle it, and what kind of solutions it leads to. We introduce a few basic notions in a preliminary fashion and, in doing so, we stress both motivations and explanations.
The aim of this volume is to make available to a large audience recent material in nonlinear functional analysis that has not been covered in book format before. Here, several topics of current and growing interest are systematically presented, such as fixed point theory, best approximation, the KKM-map principle, and results related to optimization theory, variational inequalities and complementarity problems. Illustrations of suitable applications are given, the links between results in various fields of research are highlighted, and an up-to-date bibliography is included to assist readers in further studies. Audience: This book will be of interest to graduate students, researchers and applied mathematicians working in nonlinear functional analysis, operator theory, approximations and expansions, convex sets and related geometric topics and game theory.
Non-Additive Measure and Integral is the first systematic approach to the subject. Much of the additive theory (convergence theorems, Lebesgue spaces, representation theorems) is generalized, at least for submodular measures, which are characterized by having a subadditive integral. The theory is of interest for applications to economic decision theory (decisions under risk and uncertainty), to statistics (including belief functions and fuzzy measures), to cooperative game theory, artificial intelligence, insurance, etc. Non-Additive Measure and Integral collects the results of scattered and often isolated approaches to non-additive measures and their integrals which originate in pure mathematics, potential theory, statistics, game theory, economic decision theory and other fields of application. It unifies, simplifies and generalizes known results and supplements the theory with new results, thus providing a sound basis for applications and further research in this growing field of increasing interest. It also contains fundamental results of sigma-additive and finitely additive measure and integration theory and sheds new light on the additive theory. Non-Additive Measure and Integral employs distribution functions and quantile functions as basic tools, thus remaining close to the familiar language of probability theory. In addition to serving as an important reference, the book can be used as a mathematics textbook for graduate courses or seminars, containing many exercises to support or supplement the text.
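The non-additive integral at the heart of this theory, the Choquet integral, is easy to sketch in the finite case: sort the integrand's values in decreasing order and weight each layer by the capacity of its upper level set. The capacity and values below are invented for illustration; the layered formula itself is the standard one:

```python
def choquet(values, nu):
    """Discrete Choquet integral of nonnegative `values` (indexed 0..n-1)
    with respect to a capacity `nu`, given as a callable on frozensets
    with nu(frozenset()) == 0. Uses the standard layer-cake formula."""
    idx = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    total, upper = 0.0, frozenset()
    for k, i in enumerate(idx):
        upper = upper | {i}
        next_val = values[idx[k + 1]] if k + 1 < len(idx) else 0.0
        # Each layer of height (values[i] - next_val) is weighted by the
        # capacity of the set where the integrand is at least values[i].
        total += (values[i] - next_val) * nu(upper)
    return total

# Sanity check: for an additive capacity the Choquet integral reduces to
# the ordinary expectation. Weights are illustrative.
weights = {0: 0.5, 1: 0.25, 2: 0.25}
additive = lambda s: sum(weights[i] for i in s)
print(choquet([3.0, 1.0, 2.0], additive))  # 2.25 = 3*0.5 + 1*0.25 + 2*0.25
```

Replacing `additive` with a genuinely non-additive capacity is exactly where the decision-theoretic applications (belief functions, ambiguity aversion) enter.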
In complementarity theory, which is a relatively new domain of applied mathematics, several kinds of mathematical models and problems related to the study of equilibrium are considered from the point of view of physics as well as economics. In this book the authors have combined complementarity theory, equilibrium of economic systems, and efficiency in Pareto's sense. The authors discuss the use of complementarity theory in the study of equilibrium of economic systems and present results they have obtained. In addition, the authors present several new results in complementarity theory and several numerical methods for solving complementarity problems associated with the study of economic equilibrium. The most important notions of Pareto efficiency are also presented.
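The prototypical model here is the linear complementarity problem (LCP): find z ≥ 0 with w = Mz + q ≥ 0 and zᵀw = 0. A minimal sketch of one classical numerical method, projected Gauss-Seidel, is below; the matrix and vector are invented for illustration, and convergence is guaranteed only for suitable M (for example, symmetric positive definite):

```python
def solve_lcp_pgs(M, q, iters=500):
    """Projected Gauss-Seidel for the LCP: find z >= 0 with
    w = M z + q >= 0 and z . w = 0. Each sweep solves row i for z[i]
    assuming the others are fixed, then projects onto z[i] >= 0.
    Converges for symmetric positive definite M; data are illustrative."""
    n = len(q)
    z = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            rest = sum(M[i][j] * z[j] for j in range(n) if j != i)
            z[i] = max(0.0, -(q[i] + rest) / M[i][i])
    return z

M = [[2.0, 1.0], [1.0, 2.0]]   # symmetric positive definite
q = [-5.0, -6.0]
z = solve_lcp_pgs(M, q)
w = [sum(M[i][j] * z[j] for j in range(2)) + q[i] for i in range(2)]
print(z, w)  # z converges to [4/3, 7/3], where w = 0
```

In this instance the unconstrained solution of Mz = -q is already nonnegative, so the complementarity constraint is inactive; changing q so that some component of z hits zero shows the projection doing real work.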
The use of the internet for commerce has spawned a variety of auctions, marketplaces, and exchanges for trading everything from bandwidth to books. Mechanisms for bidding agents, dynamic pricing, and combinatorial bids are being implemented in support of internet-based auctions, giving rise to new versions of optimization and resource allocation models. This volume, a collection of papers from an IMA "Hot Topics" workshop in internet auctions, includes descriptions of real and proposed auctions, complete with mathematical model formulations, theoretical results, solution approaches, and computational studies.
In recent years there has been a growing interest in generalized convex functions and generalized monotone mappings among researchers of applied mathematics and other sciences. This is due to the fact that mathematical models with these functions are more suitable to describe problems of the real world than models using conventional convex and monotone functions. Generalized convexity and monotonicity are now considered an independent branch of applied mathematics with a wide range of applications in mechanics, economics, engineering, finance and many others. The present volume contains 20 full length papers which reflect current theoretical studies of generalized convexity and monotonicity, and numerous applications in optimization, variational inequalities, equilibrium problems etc. All these papers were refereed and carefully selected from invited talks and contributed talks that were presented at the 7th International Symposium on Generalized Convexity/Monotonicity held in Hanoi, Vietnam, August 27-31, 2002. This series of Symposia is organized by the Working Group on Generalized Convexity (WGGC) every 3 years and aims to promote and disseminate research in the field. The WGGC (http://www.genconv.org) consists of more than 300 researchers coming from 36 countries.
This book tries to sort out the different meanings of uncertainty and to discover their foundations. It shows that uncertainty can be represented using various tools and mental guidelines. Coverage also examines alternative ways to deal with risk and risk attitude concepts. Behavior under uncertainty emerges from this book as something to base on inquiry and reflection rather than on mere intuition.
This volume contains a selection consisting of the best papers presented at the FUR XII conference, held at LUISS in Roma, Italy, in June 2006, organized by John Hey and Daniela Di Cagno. The objectives of the FUR (Foundations of Utility and Risk theory) conferences have always been to bring together leading academics from Economics, Psychology, Statistics, Operations Research, Finance, Applied Mathematics, and other disciplines, to address the issues of decision-making from a genuinely multi-disciplinary point of view. This twelfth conference in the series was no exception. The early FUR conferences, like FUR I (organized by Maurice Allais and Ole Hagen) and FUR III (organized by Bertrand Munier), initiated the move away from the excessively rigid and descriptively-inadequate modelling of behaviour under risk and uncertainty that was in vogue in conventional economics at that time. More than twenty years later, things have changed fundamentally, and now innovations arising from the FUR conferences, manifesting themselves in the new behavioural economics, are readily accepted by the profession. Working with new models of ambiguity and bounded rationality, for example, behavioural decision making is no longer considered a sign of mere non-standard intellectual diversification. FUR XII was organised with this new spirit. In the sense that the behavioural concerns initiated by the first FUR conferences are now part of conventional economics, and the design and organisation of FUR XII reflects this integration, FUR XII represents a key turning point in the FUR conference series.