A systematic review of the most current decision models and techniques for disease prevention and treatment. Decision Analytics and Optimization in Disease Prevention and Treatment offers a comprehensive resource of the most current decision models and techniques for disease prevention and treatment. With contributions from leading experts in the field, this important resource presents information on the optimization of chronic disease prevention, infectious disease control and prevention, and disease treatment and treatment technology. Designed to be accessible, each chapter presents one decision problem with the related methodology to showcase the vast applicability of operations research tools and techniques in advancing medical decision making. This vital resource features the most recent and effective approaches to the quickly growing field of healthcare decision analytics, which involves cost-effectiveness analysis, stochastic modeling, and computer simulation. Throughout the book, the contributors discuss clinical applications of modeling and optimization techniques to assist medical decision making within complex environments.
Accessible and authoritative, Decision Analytics and Optimization in Disease Prevention and Treatment:
- Presents summaries of the state-of-the-art research that has successfully utilized both decision analytics and optimization tools within healthcare operations research
- Highlights the optimization of chronic disease prevention, infectious disease control and prevention, and disease treatment and treatment technology
- Includes contributions by well-known experts, from operations researchers to clinical researchers and from data scientists to public health administrators
- Offers clarification of common misunderstandings and misnomers while shedding light on new approaches in this growing area

Designed for use by academics, practitioners, and researchers, Decision Analytics and Optimization in Disease Prevention and Treatment offers a comprehensive resource for accessing the power of decision analytics and optimization tools within healthcare operations research.
From profiles and interviews with the world's leading venture capitalists and high-profile coaches of business founders, A Dozen Lessons distills a set of bedrock methods for approaching business questions and creating value. The veteran business writer Tren Griffin takes the reader through the investment philosophies of VC luminaries such as Bill Gurley of Benchmark Capital, Marc Andreessen and Ben Horowitz of Andreessen Horowitz, and Jenny Lee of GGV Capital to draw out a set of guiding principles that successful businesses follow. With insight and verve, Griffin argues that venture capital is, at a fundamental level, a service business that depends hugely on "human factors." Griffin suggests that, among a number of common features, these investors succeeded because of their sense of hustle, keen judgment, hard work, and good luck. But most of all, they share a deep love of building businesses that goes beyond financial considerations. Griffin reminds us that success is a multidimensional phenomenon requiring talented people, customer traction, productive partnerships, and brand value. These features amplify one another, with incremental success attracting more attention, talent, and investment.
In this volume we present some of the papers delivered at FUR-IV, the Fourth International Conference on Foundations and Applications of Utility, Risk and Decision Theory, held in Budapest in June 1988. The FUR Conferences have provided an appreciated forum every two years since 1982 within which scientists can report recent issues and prospective applications of decision theory, and exchange ideas about controversial questions of this field. Focal points of the presented papers are: expected utility versus alternative utility models, concepts of risk and uncertainty, developments of game theory, and investigations of real decision-making behaviour under uncertainty and/or in risky situations. We hope that this sample of papers will appeal to a wide spectrum of readers who are interested in and familiar with these interesting and exciting issues of decision theory. A wide range of theoretical and practical questions is considered in the papers included in this volume, many of them closely related to economics. In fact, there were two Nobel Laureates in economics among the participants: Herbert A. Simon (1978) and Maurice Allais (1988), who won the prize just after the conference. His paper deals with problems of cardinal utility. After a concise overview of the history and theory of cardinal utility, he gives an estimate of the invariant cardinal utility function for its whole domain of variation.
The present book treats a highly specialized topic, namely effectivity functions, which are a tool for describing the power structure implicit in social choice situations of various kinds. One of the advantages of effectivity functions is that they seem to contain exactly the information which is needed in several problems of implementation, that is, in designing the rules for individual behaviour given that this behaviour at equilibrium should result in a prescribed functional connection between preferences and outcome. We shall be interested both in formal properties of effectivity functions and in applications of them in social choice theory, and among such applications in particular the implementation problem. This choice of emphasis necessarily means that some other topics are treated only superficially or not at all. We do not attempt to cover all contributions to the field; rather we try to put some of the results together in order to get a reasonably coherent theory about the role of the power structure in cooperative implementation. The authors are indebted to many persons for assistance and advice during the work on this book. In particular, we would like to thank Peter Fristrup and Bodil Hansen for critical reading of the manuscript, and Lene Petersen for typesetting in TeX.
Risk communication: the evolution of attempts. Risk communication is at once a very new and a very old field of interest. Risk analysis, as Krimsky and Plough (1988:2) point out, dates back at least to the Babylonians in 3200 BC. Cultures have traditionally utilized a host of mechanisms for anticipating, responding to, and communicating about hazards, as in food avoidance, taboos, stigma of persons and places, myths, migration, etc. Throughout history, trade between places has necessitated labelling of containers to indicate their contents. Seals at sites of the ninth century BC Harappan civilization of South Asia record the owner and/or contents of the containers (Hadden, 1986:3). The Pure Food and Drug Act, the first labelling law with national scope in the United States, was passed in 1906. Common law covering the workplace in a number of countries has traditionally required that employers notify workers about significant dangers that they encounter on the job, an obligation formally extended to chronic hazards in OSHA's Hazard Communication regulation of 1983 in the United States. In this sense, risk communication is probably the oldest form of risk management. However, it is only recently that risk communication has attracted the attention of regulators as an explicit alternative to the by now more common and formal approaches of standard setting, insuring, etc. (Baram, 1982).
Decision making is certainly a very crucial component of many human activities. It is, therefore, not surprising that models of decisions play a very important role not only in decision theory but also in areas such as operations research, management science, social psychology, etc. The basic model of a decision in classical normative decision theory has very little in common with real decision making: it portrays a decision as a clear-cut act of choice, performed by one individual decision maker, in which states of nature, possible actions, results and preferences are well and crisply defined. The only component in which uncertainty is permitted is the occurrence of the different states of nature, for which probabilistic descriptions are allowed. These probabilities are generally assumed to be known numerically, i.e. as single probabilities or as probability distribution functions. Extensions of this basic model can primarily be conceived in three directions: 1. Rather than a single decision maker, there are several decision makers involved. This has led to the areas of game theory, team theory and group decision theory. 2. The preference or utility function is not single valued but rather vector valued. This extension is considered in multiattribute utility theory and in multicriteria analysis. 3.
The motivation for this monograph can be traced to a seminar on Simple Games given by Professor S.H. Tijs of the Catholic University at Nijmegen way back in 1981 or 1982 at the Delhi campus of the Indian Statistical Institute. As an applied statistician and a consultant in quality control, I was naturally interested in Reliability Theory. I was acquainted with topics in reliability like coherent systems, importance of components etc., mainly through Barlow and Proschan's book. At the seminar given by Professor Tijs, I noticed the striking similarity between the concepts in reliability and simple games, and this kindled my interest in simple games. When I started going deep into the literature of simple games, I noticed that a number of concepts as well as results which were well known in game theory were rediscovered much later by researchers in reliability. Though the conceptual equivalence of coherent structures and simple games was noticed quite early, it is not very well known. In fact, the theoretical developments have taken place practically independently of each other, with considerable duplication of research effort. The basic objective of this monograph is to unify some of the concepts and developments in reliability and simple games so as to avoid further duplication.
This book is about the interplay of theory and experimentation on group decision making in economics. The theories that the book subjects to experimental testing mostly come from the theory of games. The decisions investigated in the book mostly concern economic interaction like strict competition, two-person bargaining, and coalition formation. The underlying philosophy of the articles collected in this book is consistent with the opinion of a growing number of economists and psychologists that economic issues cannot be understood fully just by thinking about them. Rather, the interplay between theory and experimentation is critical for the development of economics as an observational science (Smith, 1989). Reports of laboratory experiments in decision making and economics date back more than thirty years (e.g., Allais, 1953; Davidson, Suppes, and Siegel, 1957; Flood, 1958; Friedman, 1963; Kalisch, Milnor, Nash, and Nering, 1954; Lieberman, 1960; Mosteller and Nogee, 1951; Rapoport, Chammah, Dwyer, and Gyr, 1962; Siegel and Fouraker, 1960; Stone, 1958). However, only in the last ten or fifteen years has laboratory experimentation in economics started its steady transformation from an occasional curiosity into a regular means for investigating various economic phenomena and examining the role of economic institutions. Groups of researchers in the USA and abroad have used experimental methods with increasing sophistication to attack economic problems that arise in individual decision making under risk and two-person bargaining.
It is not easy to summarize, even in a volume, the results of a scientific study conducted by circa 30 researchers in four different research institutions, cooperating among themselves and jointly with the International Institute for Applied Systems Analysis, but working part-time, sponsored not only by IIASA's national currency funds but also by several other research grants in Poland. The aims of this cooperative study were defined broadly by its title, Theory, Software and Testing Examples for Decision Support Systems. The focusing theme was the methodology of decision analysis and support related to the principle of reference point optimization (developed by the editors of this volume and also called variously: aspiration-led decision support, quasi-satisfying framework of rationality, DIDAS methodology, etc.). This focusing theme motivated extensive theoretical research, from basic methodological issues of decision analysis, through various results in mathematical programming (in the fields of large scale and stochastic optimization, nondifferentiable optimization, cooperative game theory) motivated and needed because of this theme, through methodological issues related to software development, to issues resulting from testing and applications. We could not include in this volume all papers, theoretical, methodological, applied, software manuals and documentation, written during this cooperative study.
This book presents the content of a year's course in decision processes for third and fourth year students given at the University of Toronto. A principal theme of the book is the relationship between normative and descriptive decision theory. The distinction between the two approaches is not clear to everyone, yet it is of great importance. Normative decision theory addresses itself to the question of how people ought to make decisions in various types of situations, if they wish to be regarded (or to regard themselves) as 'rational'. Descriptive decision theory purports to describe how people actually make decisions in a variety of situations. Normative decision theory is much more formalized than descriptive theory. Especially in its advanced branches, normative theory makes use of mathematical language, modes of discourse, and concepts. For this reason, the definitions of terms encountered in normative decision theory are precise, and its deductions are rigorous. Like the terms and assertions of other branches of mathematics, those of mathematically formalized decision theory need not refer to anything in the 'real', i.e. the observable, world. The terms and assertions can be interpreted in the context of models of real life situations, but the verisimilitude of the models is not important. They are meant to capture only the essentials of a decision situation, which in real life may be obscured by complex details and ambiguities. It is these details and ambiguities, however, that may be crucial in determining the outcomes of the decisions.
Ira Horowitz. Depending upon one's perspective, the need to choose among alternatives can be an unwelcome but unavoidable responsibility, an exciting and challenging opportunity, a run-of-the-mill activity that one performs seemingly "without thinking very much about it," or perhaps something in between. Your most recent selections from a restaurant menu, from a set of jobs or job candidates, or from a rent-or-buy or sell-or-lease option, are cases in point. Oftentimes we are involved in group decision processes, such as the choice of a president, wherein one group member's unwelcome responsibility is another's exciting opportunity. Many of us who voted in the presidential elections of both 1956 and 1984, irrespective of political affiliation, experienced both emotions; others just pulled the lever or punched the card without thinking very much about it. Arriving at either an individual or a group decision can sometimes be a time-consuming, torturous, and traumatic process that results in a long-regretted choice that could have been reached right off the bat. On other occasions, the "just let's get it over with and get out of here" solution to a long-festering problem can yield rewards that are reaped for many years to come. One way or another, however, individuals and organizations somehow manage to get the decision-making job done, even if they don't quite understand, and often question, just how this was accomplished.
This book grew out of the conviction that the preparation and management of large-scale technological projects can be substantially improved. We have witnessed the often unhappy course of societal and political decision making concerning projects such as hazardous chemical installations, novel types of electric power plant or storage sites for solid wastes. This has led us to believe that probabilistic risk analysis, technical reliability analysis and environmental impact analysis are necessary but insufficient for making acceptable, and justifiable, social decisions about such projects. There is more to socio-technical decision making than applying acceptance rules based on negligibly low accident probabilities or on maximum credible accidents. Consideration must also be given to psychological, social and political issues and methods of decision making. Our conviction initially gave rise to an international experts' workshop titled 'Social decision methodology for technological projects' (SDMTP), held in May 1986 at the University of Groningen, the Netherlands, at a time when Cvetkovich spent a sabbatical there. The workshop, aimed at surveying the issues and listing the methods to address them, was the first part of an effort whose second part was directed at the production of this volume. Plans called for the book to deal systematically with the main problems of socio-technical decision making; it was to list a number of useful approaches and methods; and it was to present a number of integrative conclusions and recommendations for both policy makers and methodologists.
There is today a wide range of publications available on the theory of reliability and the technique of Probabilistic Safety Analysis (PSA). To place this work properly in this context, we must recall a basic concept underlying both theory and technique, that of redundancy. Reliability is something which can be designed into a system, by the introduction of redundancy at appropriate points. John von Neumann's historic paper of 1952, 'Probabilistic Logics and the Synthesis of Reliable Organisms from Unreliable Components', has served as inspiration for all subsequent work on systems reliability. This paper sings the praises of redundancy as a means of designing reliability into systems, or, to use von Neumann's words, of minimising error. Redundancy, then, is a fundamental characteristic which a designer seeks to build in by using appropriate structural characteristics of the 'model' or representation which he uses for his work. But any model is established through a process of delimitation and decomposition. Firstly, a 'Universe of Discourse' is delineated; its component elements are then separated out; and moreover, in a probabilistic framework, for each element each possible state is defined and assigned an appropriate measure called probability.
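Von Neumann's point about redundancy can be made concrete with a small sketch; the numbers and function name below are illustrative, not taken from the book. With independent components of reliability r, triple modular redundancy (three copies plus majority voting) yields a system more reliable than any single component whenever r exceeds 0.5.

```python
def tmr_reliability(r):
    """Reliability of a 2-out-of-3 majority-voted system built from
    three independent components, each with reliability r."""
    # The system works if all three components work, or exactly two of three do.
    return r**3 + 3 * r**2 * (1 - r)

single = 0.9
redundant = tmr_reliability(single)
print(round(redundant, 6))  # 0.972 -- better than the single component's 0.9
```

Below r = 0.5 the same formula shows redundancy making things worse, which is why majority voting only pays off with reasonably reliable parts.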
One of the greatest challenges facing those concerned with health and environmental risks is how to carry on a useful public dialogue on these subjects. In a democracy, it is the public that ultimately makes the key decisions on how these risks will be controlled. The stakes are too high for us not to do our very best. The importance of this subject is what led the Task Force on Environmental Cancer and Heart and Lung Disease to establish an Interagency Group on Public Education and Communication. This volume captures the essence of the "Workshop on the Role of Government in Health Risk Communication and Public Education" held in January 1987. It also includes some valuable appendixes with practical guides to risk communication. As such, it is an important building block in the effort to improve our collective ability to carry on this critical public dialogue. Lee M. Thomas, Administrator, U.S. Environmental Protection Agency, and Chairman, The Task Force on Environmental Cancer and Heart and Lung Disease.

Preface: The Task Force on Environmental Cancer and Heart and Lung Disease is an interagency group established by the Clean Air Act Amendments of 1977 (P.L. 95-95). Congress mandated the Task Force to recommend research to determine the relationship between environmental pollutants and human disease and to recommend research aimed at reducing the incidence of environment-related disease. The Task Force's Project Group on Public Education and Communication focuses on education as a means of reducing or preventing disease.
Bernard ROY, Professor, University of Paris-Dauphine, Director of LAMSADE. It is not unusual for a dozen or so loosely related working papers to be published in book form as the natural outgrowth of a scientific gathering. Although many a volume of collected papers has come into print in this way, the homogeneity of the articles included will often be more apparent than real. As the reader will quickly observe, such is not the case with the present volume. As one can judge from its title, it is in fact an outcome of an editorial project by J. Kacprzyk and M. Roubens. They asked contributing authors to submit recent works which would examine, within a non-traditional theoretical framework, preference analysis and preference modelling in a fuzzy context oriented towards decision aid. The articles by J.P. Doignon, B. Monjardet, T. Tanino and Ph. Vincke emphasize the analysis of preference structures, mainly in the presence of incomparability, intransitivity, thresholds and, more generally, inaccurate determination. Considerable attention is devoted to the analysis of efficient and non-dominated (in Pareto's sense of the term) decisions in the four papers presented by S. Ovchinnikov and M.
The completion of this thesis gives me feelings of satisfaction and thankfulness. Satisfaction because its results appear to be worthwhile and relevant, and thankfulness towards so many persons who contributed to the progress of the work. The project "Analysis of multilevel decisions" was granted by the common research pool of Tilburg University and Eindhoven University of Technology (Samenwerkingsorgaan Brabantse Universiteiten). During the 4-year lead time, the Department of Econometrics of Tilburg University provided not only a single room but also a pleasant and inspiring environment, for which I am very grateful. The research itself, particularly the inevitable scientific struggles, was perfectly coached by my promotors, Prof. Dr. P.A. Verheyen and Prof. Dr. J.F. Benders. I cannot give even the slightest description of the unique way in which they managed to do this. In all criticism they succeeded in maintaining a positive, and thus stimulating, working atmosphere. The work also benefited from the suggestions given by Prof. Dr. Th.M.A. Bemelmans, Prof. Dr. J.P.C. Kleijnen, Prof. Dr. P.H.M. Ruys and Prof. Dr. A. Schrijver. Furthermore I am indebted to Dr. Adam Woźniak (Warsaw University of Technology), who made me participate in his multilevel experience and critically commented on an earlier draft of the thesis.
This monograph is intended for an advanced undergraduate or graduate course in engineering and management science, as well as for persons in business, industry, the military or any field who want an introductory and capsule look into the methods of group decision making under multiple criteria. This is a sequel to our previous works entitled "Multiple Objective Decision Making--Methods and Applications" (No. 164 of the Lecture Notes) and "Multiple Attribute Decision Making--Methods and Applications" (No. 186 of the Lecture Notes). Moving from a single decision maker (the consideration of Lecture Notes 164 and 186) to a multiple decision maker setting introduces a great deal of complexity into the analysis. The problem is no longer the selection of the most preferred alternative among the nondominated solutions according to one individual's (single decision maker's) preference structure. The analysis is extended to account for the conflicts among different interest groups who have different objectives, goals, and so forth. Group decision making under multiple criteria includes such diverse and interconnected fields as preference analysis, utility theory, social choice theory, committee decision theory, theory of voting, game theory, expert evaluation analysis, aggregation of qualitative factors, economic equilibrium theory, etc.; these are simplified and systematically classified for beginners. This work is to provide readers with a capsule look into the existing methods, their characteristics, and applicability in the complexity of group decision making.
Much of the work in this volume was supported by the National Science Foundation under Grant SES82-05112 from the Program in History and Philosophy of Science and the Division of Policy Research and Analysis. (Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the author and do not necessarily reflect the views of the National Science Foundation.) Several of these essays were written because of the impetus afforded by speaking invitations. An earlier version of Chapter 3 was presented in Berkeley in January 1983 at a Principal Investigators' Conference sponsored by the National Science Foundation, Division of Policy Research and Analysis, Technology Assessment and Risk Assessment Group. In May 1982, an earlier version of Chapter 5 was presented at the meeting of the Society for Philosophy and Technology, held in conjunction with the American Philosophical Association meeting, Western Division, in Columbus, Ohio. Finally, earlier versions of Chapter 6 were presented in Boston in December 1981 at the Boston Colloquium for the Philosophy of Science, as well as at the University of Delaware in January 1982 and at the Biennial Meeting of the Philosophy of Science Association held in Philadelphia in October 1982. An earlier version of this same chapter was published in Philosophy of Science Association 82, volume 1, ed. T. Nickles, Philosophy of Science Association, East Lansing, Michigan, 1982. A number of people have helped to make this book better than it might have been.
Suppose you had the chance to invest in a venture that succeeds half the time. When you fail you lose your investment; when you succeed you make a profit of $1.60 for every $1.00 you invest. The odds are 8 to 5 in your favor and you should do well; casinos and insurance companies thrive under less favorable conditions. If you can invest as much as you like, as often as you like, using a betting system that guarantees you can't go broke, common sense suggests you will almost certainly make a profit after you make a large number of investments. In response to your request for a hot stock, your astrologer tells you ABC Inc. will triple in a year (she's really a fraud and picked the stock at random). But since such stocks are rare (one in a thousand) you consult an expert and, strangely enough, he confirms the astrologer. From experience you know that the expert diagnoses all stocks, good and bad, correctly 90% of the time. Common sense suggests you have an excellent chance of tripling your money. You are chairman of a committee of three. Decisions are made by majority rule, but if there is no majority your vote as chairman breaks ties. Common sense suggests you will inevitably have more power to determine the outcome than the other members.
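The stock-picking puzzle above is a base-rate problem, and Bayes' rule shows why common sense misleads: even a 90%-accurate expert confirming a one-in-a-thousand event leaves the posterior probability under 1%. A minimal sketch, with function and parameter names that are mine rather than the book's:

```python
def posterior_good(prior, accuracy):
    """P(stock is good | expert says good), by Bayes' rule.

    prior    -- base rate of good stocks (1 in 1000 in the puzzle)
    accuracy -- probability the expert classifies any stock correctly
    """
    true_positive = accuracy * prior                # good stock, correctly confirmed
    false_positive = (1 - accuracy) * (1 - prior)   # bad stock, wrongly confirmed
    return true_positive / (true_positive + false_positive)

p = posterior_good(prior=1 / 1000, accuracy=0.9)
print(round(p, 4))  # 0.0089 -- under 1%, despite the expert's confirmation
```

The false positives from the 999-in-1000 bad stocks swamp the true positives from the rare good ones, which is exactly the trap the blurb is hinting at.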
Optimizing Digital Strategy explores the choices facing organizations in the rapidly changing world of technology-enabled business. From performance marketing through to personalization, on-demand retailing and AI, this book maps out commercial and customer-focused challenges and explains how leaders can get the most out of their digital strategies. Rather than rushing headlong into adopting the latest digital platforms, tools and technologies, the book challenges leaders to step back from the demands for constant investment in new technology and drive better returns from existing assets. Presenting a sustainable model of e-commerce that is appropriate to any individual organization's needs, Optimizing Digital Strategy addresses the repetitive dilemma between even more investment in technology and the need to improve margins and grow revenue. Illustrated by the authors' own digital work for global brands such as The Economist, Sky, O2, Regus, the Financial Times, Lidl and L.K.Bennett, this book shows how to balance the need to remain competitive, fully deliver customer expectations, and put resources behind investments that will deliver the best return.
Getting what you want - even if you are the boss - isn't always easy. Almost every organization, big or small, works among a network of competing interests. Whether it's governments pushing through policies, companies trying to increase profits, or even families deciding where to move house, rarely can decisions be made in isolation from competing interests both within the organization and outside it. In this accessible and straightforward account, Hans de Bruijn and Ernst ten Heuvelhof cast light on multi-stakeholder decision-making. Using plain language, they reveal the nuts and bolts of decision-making within the numerous dilemmas and tensions at work. Drawing on a diverse range of illustrative examples throughout, their perceptive analysis examines how different interests can either support or block change, and the strategies available for managing a variety of stakeholders. The second edition of Management in Networks incorporates a wider spread of international cases, a new chapter giving an overview of different network types, and a new chapter looking at digital governance and the impact of big data on networks. This insightful text is invaluable reading for students of management and organizational studies, plus practitioners - or actors - operating in a range of contexts.
Combinatorial optimization is a multidisciplinary scientific area, lying at the interface of three major scientific domains: mathematics, theoretical computer science and management. The three volumes of the Combinatorial Optimization series aim to cover a wide range of topics in this area. These topics deal with fundamental notions and approaches as well as with several classical applications of combinatorial optimization. Concepts of Combinatorial Optimization is divided into three parts:
- On the complexity of combinatorial optimization problems, presenting basics about worst-case and randomized complexity;
- Classical solution methods, presenting the two best-known methods for solving hard combinatorial optimization problems, Branch-and-Bound and Dynamic Programming;
- Elements from mathematical programming, presenting fundamentals of mathematical-programming-based methods that have been at the heart of Operations Research since the origins of the field.
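As a taste of the dynamic-programming method the volume covers, here is a hedged sketch of the classic 0/1 knapsack recurrence; the example data are illustrative and not taken from the book.

```python
def knapsack(values, weights, capacity):
    """Maximum total value of items fitting within capacity,
    each item used at most once (0/1 knapsack, bottom-up DP)."""
    # dp[c] = best value achievable with total weight at most c
    dp = [0] * (capacity + 1)
    for value, weight in zip(values, weights):
        # Iterate capacities downward so each item is counted at most once.
        for c in range(capacity, weight - 1, -1):
            dp[c] = max(dp[c], dp[c - weight] + value)
    return dp[capacity]

# Three items; the optimum takes the 100- and 120-value items (total weight 50).
print(knapsack([60, 100, 120], [10, 20, 30], 50))  # 220
```

The same table-filling idea, solving small subproblems once and reusing them, is what distinguishes Dynamic Programming from the tree search of Branch-and-Bound.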