This book presents innovative and high-quality research on advanced decision support systems (DSSs). It describes the foundations, methods, methodologies, models, tools, and techniques for designing, developing, implementing, and evaluating advanced DSSs in different fields, including finance, health, emergency management, industry, and pollution control. Decision support systems employ artificial intelligence methods to heuristically address problems that cannot be solved using formal techniques. In this context, technologies such as the Semantic Web, linked data, big data, and machine learning are being applied to provide integrated support for individuals and organizations to make more rational decisions. The book is organized into two parts. The first part covers decision support systems for industry, while the second part presents case studies related to clinical emergency management and pollution control.
Effective decision-making while trading off constraints and conflicting multiple objectives under rapid technological development, massive data generation, and extreme volatility is of paramount importance for organizations seeking to prevail in today's time-based competition. As agility becomes crucial, firms increasingly rely on evidence-based decision-making through intelligent decision support systems driven by computational intelligence and automation to achieve a competitive advantage. Decisions are no longer confined to a specific functional area. Instead, business organizations today derive actionable insight for formulating future courses of action by integrating multiple objectives and perspectives. Multi-objective decision-making therefore plays a critical role in businesses and industries. In this regard, Operations Research (OR) models and their applications enable firms to derive optimal solutions subject to various constraints and/or objectives while considering multiple functional areas of the organization together. Hence, researchers and practitioners have extensively applied OR models to solve organizational issues related to manufacturing, service, supply chain and logistics management, human resource management, finance, and market analysis, among others. Further, OR models driven by AI now provide intelligent decision-support frameworks for achieving sustainable development goals. The present volume provides a unique platform for leading international experts on production systems and business from academia, industry, and government to discuss issues in intelligent manufacturing, operations management, financial management, supply chain management, and Industry 4.0 in the artificial intelligence era. Its general (though not exclusive) scope covers OR models such as optimization and control, combinatorial optimization, queueing theory, resource allocation models, linear and nonlinear programming models, multi-objective and multi-attribute decision models, and statistical quality control, along with AI, Bayesian data analysis, machine learning, and econometrics, and their applications vis-à-vis AI- and data-driven production management, marketing and retail management, financial management, human resource management, operations management, smart manufacturing and Industry 4.0, supply chain and logistics management, digital supply networks, healthcare administration, inventory management, consumer behavior, security analysis and portfolio management, and sustainability. The volume will be of interest to faculty members, students, and scholars of engineering and social science institutions and universities, along with practitioners and policymakers across industries and organizations.
Focuses on the use of simulation techniques to model and evaluate repetitive construction operations. Based on the CYCLONE and MICROCYCLONE software developed by the authors and used at 38 universities nationwide, it uses a variety of examples from all areas of construction to demonstrate the application of simulation to analyze construction operations.
Operational research is a collection of modelling techniques used to structure, analyse, and solve problems related to the design and operation of complex human systems. While many argue that operational research should play a key role in improving healthcare services, staff may be largely unaware of its potential applications. This Element explores operational research's wartime origins and introduces several approaches that operational researchers use to help healthcare organisations: address well-defined decision problems; account for multiple stakeholder perspectives; and describe how system performance may be impacted by changing the configuration or operation of services. The authors draw on examples that illustrate the valuable perspective that operational research brings to improvement initiatives and the challenges of implementing and scaling operational research solutions. They discuss how operational researchers are working to surmount these problems and suggest further research to help operational researchers have greater beneficial impact in healthcare improvement. This title is also available as Open Access on Cambridge Core.
"Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, "provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new sections, in addition to fully-updated examples, tables, figures, and a revised appendix. Intended primarily for practitioners, this book does not require sophisticated mathematical skills or deep understanding of the underlying theory and methods nor does it discuss alternative technologies for reasoning under uncertainty. The theory and methods presented are illustrated through more than 140 examples, and exercises are included for the reader to check his or her level of understanding. The techniques and methods presented for knowledge elicitation, model construction and verification, modeling techniques and tricks, learning models from data, and analyses of models have all been developed and refined on the basis of numerous courses that the authors have held for practitioners worldwide. "
This book introduces the theory and applications of uncertain optimal control, and establishes two types of models including expected value uncertain optimal control and optimistic value uncertain optimal control. These models, which have continuous-time forms and discrete-time forms, make use of dynamic programming. The uncertain optimal control theory relates to equations of optimality, uncertain bang-bang optimal control, optimal control with switched uncertain system, and optimal control for uncertain system with time-delay. Uncertain optimal control has applications in portfolio selection, engineering, and games. The book is a useful resource for researchers, engineers, and students in the fields of mathematics, cybernetics, operations research, industrial engineering, artificial intelligence, economics, and management science.
Many decision problems in Operations Research are defined on temporal networks, that is, workflows of time-consuming tasks whose processing order is constrained by precedence relations. For example, temporal networks are used to model projects, computer applications, digital circuits and production processes. Optimization problems arise in temporal networks when a decision maker wishes to determine a temporal arrangement of the tasks and/or a resource assignment that optimizes some network characteristic (e.g. the time required to complete all tasks). The parameters of these optimization problems (e.g. the task durations) are typically unknown at the time the decision problem arises. This monograph investigates solution techniques for optimization problems in temporal networks that explicitly account for this parameter uncertainty. We study several formulations, each of which requires different information about the uncertain problem parameters.
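To make the deterministic special case concrete: when task durations are known, the minimum time needed to complete all tasks (the makespan) follows from a longest-path pass over the precedence graph in topological order. Below is a minimal Python sketch; the task names and durations are hypothetical, purely for illustration.

```python
# A minimal sketch of the deterministic case: tasks with known durations
# and precedence arcs form a temporal network; the makespan is found by a
# longest-path computation in topological order. Names/durations are made up.
import graphlib  # standard library, Python 3.9+

durations = {"design": 3, "build": 5, "test": 2, "document": 4}
predecessors = {"design": [], "build": ["design"],
                "test": ["build"], "document": ["design"]}

order = graphlib.TopologicalSorter(predecessors).static_order()
finish = {}
for task in order:
    # a task starts once all its predecessors have finished
    start = max((finish[p] for p in predecessors[task]), default=0)
    finish[task] = start + durations[task]

print("makespan:", max(finish.values()))  # 10, via design -> build -> test
```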
Management: The Basics provides an easy, jargon-free introduction to the fundamental principles and practices of modern management.
This book offers an in-depth and comprehensive introduction to the priority methods of intuitionistic preference relations, the consistency and consensus improving procedures for intuitionistic preference relations, the approaches to group decision making based on intuitionistic preference relations, the approaches and models for interactive decision making with intuitionistic fuzzy information, and the extended results in interval-valued intuitionistic fuzzy environments.
This handbook is a compilation of comprehensive reference sources that provide state-of-the-art findings on both theoretical and applied research on sustainable fashion supply chain management. It contains three parts, organized under the headings of "Reviews and Discussions," "Analytical Research," and "Empirical Research," featuring peer-reviewed papers contributed by researchers from Asia, Europe, and the US. This book is the first to focus on sustainable supply chain management in the fashion industry and is therefore a pioneering text on this topic. In the fashion industry, disposable fashion under the fast fashion concept has become a trend. In this trend, fashion supply chains must be highly responsive to market changes and able to produce fashion products in very small quantities to satisfy changing consumer needs. As a result, new styles will appear in the market within a very short time and fashion brands such as Zara can reduce the whole process cycle from conceptual design to a final ready-to-sell "well-produced and packaged" product on the retail sales floor within a few weeks. From the supply chain's perspective, the fast fashion concept helps to match supply and demand and lowers inventory. Moreover, since many fast fashion companies, e.g., Zara, H&M, and Topshop, adopt a local sourcing approach and obtain supply from local manufacturers (to cut lead time), the corresponding carbon footprint is much reduced. Thus, this local sourcing scheme under fast fashion would enhance the level of environmental friendliness compared with the more traditional offshore sourcing. Furthermore, since the fashion supply chain is notorious for generating high volumes of pollutants, involving hazardous materials in the production processes, and producing products by companies with low social responsibility, new management principles and theories, especially those that take into account consumer behaviours and preferences, need to be developed to address many of these issues in order to achieve the goal of sustainable fashion supply chain management. The topics covered include Reverse Logistics of US Carpet Recycling; Green Brand Strategies in the Fashion Industry; Impacts of Social Media on Consumers' Disposals of Apparel; Fashion Supply Chain Network Competition with Eco-labelling; Reverse Logistics as a Sustainable Supply Chain Practice for the Fashion Industry; Apparel Manufacturers' Path to World-class Corporate Social Responsibility; Sustainable Supply Chain Management in the Slow-Fashion Industry; Mass Market Second-hand Clothing Retail Operations in Hong Kong; Constraints and Drivers of Growth in the Ethical Fashion Sector: The case of France; and Effects of Used Garment Collection Programmes in Fast Fashion Brands.
This book addresses the measurement of the effect of information technology (IT) investments on a firm's productivity. Determining a quantifiable impact of a firm's IT has plagued senior executives, researchers, and policy-makers for several years, as evidenced by articles in trade magazines such as Fortune and Businessweek and in academic journals such as Management Science. Simple statistical techniques for measuring IT impact in a firm are fraught with methodological problems, as these techniques do not account for either the causal direction in managerial decision making or the behavioral assumptions about firms. Therefore, such studies have led to results and inferences that are not generalizable. While studies that measure the satisfaction of people who use IT are important, management typically would like to know whether IT has reduced operating costs by streamlining processes or increased revenues by increasing the demand-meeting capability of the firm. This book attempts to determine the cost reduction or output enhancement that may be linked to IT investments through methodological sophistication.
Two-person zero-sum game theory deals with situations that are perfectly competitive: there are exactly two decision makers, for whom there is no possibility of cooperation or compromise. It is the most fundamental part of game theory, and the part most commonly applied. There are diverse applications to military battles, sports, parlor games, economics, and politics. The theory was born in World War II and has by now matured into a significant and tractable body of knowledge about competitive decision making. The advent of modern, powerful computers has enabled the solution of many games that were once beyond computational reach. "Two-Person Zero-Sum Games, 4th Ed." offers an up-to-date introduction to the subject, especially its computational aspects. Any finite game can be solved by the brute-force method of enumerating all possible strategies and then applying linear programming. The trouble is that many interesting games have far too many strategies to enumerate, even with the aid of computers. After introducing ideas, terminology, and the brute-force method in the initial chapters, the rest of the book is devoted to classes of games that can be solved without enumerating every strategy. Numerous examples are given, as well as an extensive set of exercises. Many of the exercises are keyed to sheets of an included Excel workbook that can be freely downloaded from the SpringerExtras website. This new edition can be used as either a reference book or a textbook.
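The brute-force method mentioned above is compact enough to illustrate: enumerate each player's pure strategies as a payoff matrix, then find the row player's optimal mixed strategy by linear programming. The following is a minimal Python sketch using scipy.optimize.linprog; the 2x2 payoff matrix is a made-up example, not one drawn from the book.

```python
# A minimal sketch of solving a zero-sum matrix game by linear programming:
# maximize the game value v subject to A^T x >= v, sum(x) = 1, x >= 0,
# where x is the row player's mixed strategy. The matrix is hypothetical.
import numpy as np
from scipy.optimize import linprog

A = np.array([[3.0, -1.0],    # payoffs to the row player
              [-2.0, 4.0]])
m, n = A.shape

# Decision variables: x_1..x_m (row strategy) and v (game value).
c = np.zeros(m + 1)
c[-1] = -1.0                                 # linprog minimizes, so minimize -v
A_ub = np.hstack([-A.T, np.ones((n, 1))])    # v - (A^T x)_j <= 0 for each column j
b_ub = np.zeros(n)
A_eq = np.append(np.ones(m), 0.0).reshape(1, -1)  # probabilities sum to 1
b_eq = np.array([1.0])
bounds = [(0, None)] * m + [(None, None)]         # v may be negative

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
x, v = res.x[:m], res.x[-1]
print("optimal mixed strategy:", x, "game value:", v)
```

For this matrix the row player mixes the two rows with probabilities 0.6 and 0.4, guaranteeing an expected payoff of 1 no matter what the column player does.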
The efficiency of computational methods and the choice of the most efficient method for solving a specific problem or a specific class of problems have always played an important role in numerical analysis. Optimization of the computerized solution process is now a major problem of applied mathematics, which stimulates the search for new computational methods and ways to implement them. In "Minimax Models in the Theory of Numerical Methods", methods for estimating the efficiency of computational algorithms and questions of their optimality are studied within the framework of a general computation model. The subjects dealt with here are very different from the traditional subjects of computational methods. Close attention is paid to adaptive (sequential) computational algorithms, the process of computation being regarded as a controlled process and the algorithm as a control strategy. This approach allows methods of game theory and other methods of operations research and systems analysis to be widely used for constructing optimal algorithms. The goal underlying the study of the various computation models dealt with in this title is the construction of concrete numerical algorithms admitting programme implementation. The central role belongs to the concept of a sequentially optimal algorithm, which in many cases reflects the characteristics of real-life computational processes more fully than the traditional optimality concepts.
7.1.1 Background. Uncertainty can be considered as the lack of adequate information to make a decision. It is important to quantify uncertainties in mathematical models used for design and optimization of nondeterministic engineering systems. In general, uncertainty can be broadly classified into three types (Bae et al. 2004; Ha-Rok 2004; Klir and Wierman 1998; Oberkampf and Helton 2002; Sentz 2002). The first one is aleatory uncertainty (also referred to as stochastic uncertainty or inherent uncertainty): it results from the fact that a system can behave in random ways. For example, the failure of an engine can be modeled as an aleatory uncertainty because the failure can occur at a random time. One cannot predict exactly when the engine will fail even if a large quantity of failure data is gathered. The second one is epistemic uncertainty (also known as subjective uncertainty or reducible uncertainty): it is the uncertainty of the outcome of some random event due to lack of knowledge or information in any phase or activity of the modeling process. By gaining information about the system or environmental factors, one can reduce the epistemic uncertainty. For example, a lack of experimental data to characterize new materials and processes leads to epistemic uncertainty.
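The distinction can be illustrated with a toy Monte Carlo sketch in Python (all numbers hypothetical): an engine's failure time stays random even when its failure rate is known exactly (aleatory), while an imprecisely known failure rate yields an interval of predictions that gathering more data could narrow (epistemic).

```python
# A minimal sketch (hypothetical numbers) contrasting the two uncertainty
# types. Aleatory: failure time is random even with an exactly known rate.
# Epistemic: the rate itself is only known to lie in an interval.
import random

random.seed(0)

# Aleatory: failure time ~ Exponential(rate); the scatter is irreducible.
rate = 0.1  # known failure rate (per year)
failure_times = [random.expovariate(rate) for _ in range(10_000)]
print("mean failure time, known rate:", sum(failure_times) / len(failure_times))

# Epistemic: the rate is only known to lie in [0.05, 0.2]; propagating the
# interval gives bounds on the mean failure time. More data would shrink it.
rate_low, rate_high = 0.05, 0.2
print("mean failure time bounds:", 1 / rate_high, "to", 1 / rate_low)
```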
Systems Thinking for a Turbulent World will help practitioners in any field of change engage more effectively in transformative innovation. Such innovation addresses the paradigm shift needed to meet the diverse unfolding global challenges facing us today, often summed up as the Anthropocene. Fragmentation of local and global societies is escalating, and this is aggravating vicious cycles. To heal the rifts, we need to reintroduce the human element into our understandings - whether the context is civic or scientific - and strengthen truth-seeking in decision-making. Aided by appropriate concepts and methods, this healing will enable a switch from reaction to anticipation, even in the face of discontinuous change and high uncertainty. The outcome is to privilege the positive human skills for collaborative navigation through uncertainty over the disjointed rationality of mechanism and artificial intelligence, which increasingly alienates us. The reader in search of new ways of thinking will be introduced to concepts new to systems thinking that integrate systems thinking and futures thinking. The concept of anticipatory present moment (APM) serves as a basis for learning the cognitive skills that better enable navigation through turbulent times. A key personal and team practice is participative repatterning, which is the basis for transformative innovation. This practice is aided by new methods of visual facilitation. The reader is guided through the unfolding of the ideas and practices with a narrative based on the metaphor of search portrayed in the tradition of ox herding, found in traditional Far Eastern consciousness practice.
This volume contains contributions from prominent researchers who participated in the 2007 IAENG International Conference on Operations Research. Topics covered include quality management systems, reliability and quality control, engineering experimental design, computer-supported collaborative engineering, human factors and ergonomics, computer-aided manufacturing, manufacturing processes and methods, engineering management and leadership, optimization, transportation network design, stochastic modelling, queueing theory, and industrial applications. The book presents the state of the art of advances in communication systems and electrical engineering and also serves as an excellent reference for researchers and graduate students working in industrial engineering and operations research.
In today's hyper-competitive, global marketplace, a manufacturing company needs a competitive edge if it is to survive and grow. That edge could be anything from superior manufacturing technology to innovative product design; from patent protection to solid, well-established customer relationships. One competitive edge available to all manufacturers, but realized by only a few, is the ability to accurately measure, control, and optimize costs throughout a product's entire life cycle. The lack of a methodology to engineer cost optimization into every product makes attaining and maintaining profitability all the more difficult. Cost Engineering provides a means for a manufacturer to achieve and sustain profitability by designing and manufacturing products to specific cost requirements. It incorporates a variety of proven methodologies including cost estimating, cost control, and cost optimization. Features:
- Describes the components and organization of an effective cost optimization process
- Provides detailed explanations of cost estimating techniques for many of the most common manufacturing processes
- Explains the selection and use of appropriate cost allocation methods
- Presents the fundamentals of cost-based negotiation
- Includes both proper and improper executions of cost engineering principles
The details presented in this book are important to design engineers, manufacturing engineers, buyers, accountants, cost estimators, cost optimization specialists, and their managers, and the book provides CEOs, COOs, general managers, product line managers, and plant managers with guidance on improving and sustaining profitability.
This book presents methods for full-wave computer simulation that can be used in various applications and contexts, e.g. seismic prospecting, earthquake stability, global seismic patterns on Earth and Mars, medicine, traumatology, ultrasound investigation of the human body, ultrasound and laser operations, ultrasonic non-destructive railway testing, modelling aircraft composites, modelling composite material delamination, etc. The key innovation of this approach is the ability to study spatial dynamical wave processes, which is made possible by cutting-edge numerical finite-difference grid-characteristic methods. The book will benefit all students, researchers, practitioners and professors interested in numerical mathematics, computer science, computer simulation, high-performance computer systems, unstructured meshes, interpolation, seismic prospecting, geophysics, medicine, non-destructive testing and composite materials.
Applying Occupational Psychology in the Fire Service: Emotion, Risk and Decision-Making provides readers with an overview of the latest research informing the policies, procedures and practices of those working on the ground in the UK Fire Service. Using best-practice principles and cutting-edge theory, the current text demonstrates how occupational psychology can be applied to fire services around the globe to improve individual, management, and organisational decisions. The authors aim to provide students, trainees, practitioners and fire personnel with a unique insight into a range of topics, including resilience, injury, work related wellbeing, community engagement as well as decision making and operational preparedness. This book represents a call to arms for more robust practices to support the Fire Service, highlighting the psychological factors involved in the firefighter occupation and paving the way towards a better understanding of emotion, risk, safety, and decision-making within the fire context.
The survey process is a highly complex and situationally dependent one, in need of careful management. If poorly designed and administered, surveys can create disappointment and even disaster. Little has been written so far for those responsible for designing and implementing surveys in organizations. These authors have drawn on their extensive consulting experience to develop a concise, pragmatic, seven-step model covering the entire process, from initiation, to final evaluation, to making the results meaningful to the future of the organization. They pay special attention to the political and human sensitivities concerned and show how to overcome the many potential barriers to a successful outcome.
Applications of queueing network models have multiplied in the last generation, including scheduling of large manufacturing systems, control of patient flow in health systems, load balancing in cloud computing, and matching in ride sharing. These problems are too large and complex for exact solution, but their scale allows approximation. This book is the first comprehensive treatment of fluid scaling, diffusion scaling, and many-server scaling in a single text presented at a level suitable for graduate students. Fluid scaling is used to verify stability, in particular treating max weight policies, and to study optimal control of transient queueing networks. Diffusion scaling is used to control systems in balanced heavy traffic, by solving for optimal scheduling, admission control, and routing in Brownian networks. Many-server scaling is studied in the quality and efficiency driven Halfin-Whitt regime and applied to load balancing in the supermarket model and to bipartite matching in ride-sharing applications.
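The supermarket model mentioned above admits a very short simulation sketch: each arriving job samples d queues uniformly at random and joins the shortest, which keeps the longest queue dramatically shorter than joining a single random queue would. The Python below is a rough discrete-event approximation with illustrative parameters, not code from the book.

```python
# A minimal sketch of the supermarket model: N parallel queues, arrivals at
# rate LAM per server, each arrival joins the shortest of D sampled queues.
# Uniformized so each event is an arrival or a service attempt. Parameters
# are illustrative only.
import random

random.seed(1)
N, D, LAM, STEPS = 100, 2, 0.9, 1_000_000  # queues, choices, load, events

queues = [0] * N
for _ in range(STEPS):
    if random.random() < LAM / (LAM + 1.0):   # arrival event
        sampled = random.sample(range(N), D)  # sample d queues at random
        queues[min(sampled, key=queues.__getitem__)] += 1  # join shortest
    else:                                     # service event
        i = random.randrange(N)               # server i completes one job
        if queues[i] > 0:
            queues[i] -= 1

print("longest queue under join-shortest-of-%d:" % D, max(queues))
```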
Taking into account the changes introduced by the SanInsFoG, the author examines whether the legislature's policy objectives associated with imminent illiquidity within the meaning of § 18 InsO have been achieved. In doing so, he addresses legal questions connected with the grounds for opening insolvency proceedings under §§ 17 ff. InsO that have so far received little attention, develops his own framework for examining § 18 InsO, and puts forward a reform proposal to replace § 19 InsO. He analyses the existing incentives for initiating proceedings early, focusing on the statutory restructuring instruments and the competition with the StaRUG, and proposes solutions for individual questions in need of clarification. The author concludes that, de lege lata, there are hardly any suitable incentives to encourage the initiation of proceedings while illiquidity is merely imminent, and that the legislature therefore continues to miss the goal pursued with § 18 InsO. He then presents concrete proposals for developing the current law de lege ferenda.