This research monograph summarizes a line of research that maps certain classical problems of discrete mathematics and operations research - such as the Hamiltonian Cycle and the Travelling Salesman Problems - into convex domains where continuum analysis can be carried out. Arguably, the inherent difficulty of these now-classical problems stems precisely from the discrete nature of the domains in which they are posed. The convexification of domains underpinning these results is achieved by assigning a probabilistic interpretation to key elements of the original deterministic problems. In particular, the approaches summarized here build on a technique that embeds the Hamiltonian Cycle and Travelling Salesman Problems in a structured singularly perturbed Markov decision process. The unifying idea is to interpret subgraphs traced out by deterministic policies (including Hamiltonian cycles, if any) as extreme points of a convex polyhedron in a space filled with randomized policies. This innovative approach has now evolved to the point where there are many results, both theoretical and algorithmic, that exploit the nexus between graph-theoretic structures and both probabilistic and algebraic entities of related Markov chains. The latter include moments of first return times, limiting frequencies of visits to nodes, and the spectra of certain matrices traditionally associated with the analysis of Markov chains. However, these results and algorithms are dispersed over many research papers appearing in journals catering to disparate audiences. As a result, the published manuscripts are often written tersely and use disparate notation, making it difficult for new researchers to build on the many reported advances. Hence the main purpose of this book is to present a concise yet easily accessible synthesis of the majority of the theoretical and algorithmic results obtained so far.
In addition, the book discusses numerous open questions and problems that arise from this body of work and are yet to be fully solved. The approach casts the Hamiltonian Cycle Problem in a mathematical framework that permits analytical concepts and techniques, not hitherto used in this context, to be brought to bear to further clarify both the underlying difficulty of this NP-complete problem and the relative exceptionality of truly difficult instances. Finally, the material is arranged so that the introductory chapters require very little mathematical background and discuss instances of graphs with interesting structures that motivated much of the research on this topic. More difficult results are introduced later and are illustrated with numerous examples.
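The enumeration viewpoint described in the blurb above can be illustrated with a minimal brute-force sketch (an editorial illustration, not the book's method, which works in the convexified space of randomized policies): each candidate tour corresponds to one deterministic policy, and a Hamiltonian cycle is a tour whose every arc exists in the graph.

```python
from itertools import permutations

def hamiltonian_cycle(adj):
    """Brute-force search for a Hamiltonian cycle in a directed graph.

    adj maps each node to the set of its successors.  Each permutation of
    the nodes is one candidate tour; in the book's language, Hamiltonian
    cycles are among the deterministic policies (maps from nodes to single
    outgoing arcs) whose traced subgraph is a single cycle through all nodes.
    """
    nodes = list(adj)
    first = nodes[0]
    for order in permutations(nodes[1:]):
        tour = [first, *order]
        if all(tour[(i + 1) % len(tour)] in adj[tour[i]]
               for i in range(len(tour))):
            return tour + [first]
    return None

# A 4-node graph containing the cycle 0 -> 1 -> 2 -> 3 -> 0.
g = {0: {1}, 1: {2}, 2: {3, 0}, 3: {0}}
print(hamiltonian_cycle(g))  # -> [0, 1, 2, 3, 0]
```

The factorial blow-up of this enumeration is precisely the difficulty that motivates the convexification programme the book describes.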
This book presents advanced case studies that address a range of important issues arising in space engineering. An overview of challenging operational scenarios is presented, with an in-depth exposition of related mathematical modeling, algorithmic and numerical solution aspects. The model development and optimization approaches discussed in the book can also be extended to other application areas. The topics discussed illustrate current research trends and challenges in space engineering, as summarized by the following list:
* Next Generation Gravity Missions
* Continuous-Thrust Trajectories by Evolutionary Neurocontrol
* Nonparametric Importance Sampling for Launcher Stage Fallout
* Dynamic System Control Dispatch
* Optimal Launch Date of Interplanetary Missions
* Optimal Topological Design
* Evidence-Based Robust Optimization
* Interplanetary Trajectory Design by Machine Learning
* Real-Time Optimal Control
* Optimal Finite Thrust Orbital Transfers
* Planning and Scheduling of Multiple Satellite Missions
* Trajectory Performance Analysis
* Ascent Trajectory and Guidance Optimization
* Small Satellite Attitude Determination and Control
* Optimized Packings in Space Engineering
* Time-Optimal Transfers of All-Electric GEO Satellites
Researchers working on space engineering applications will find this work a valuable, practical source of information. Academics, graduate and post-graduate students working in aerospace engineering, applied mathematics, operations research, and optimal control will find useful information regarding model development and solution techniques, in conjunction with real-world applications.
Effective decision-making that trades off constraints and conflicting multiple objectives under rapid technological development, massive data generation, and extreme volatility is of paramount importance to organizations seeking to win today's time-based competition. While agility is crucial, firms have increasingly relied on evidence-based decision-making through intelligent decision support systems driven by computational intelligence and automation to achieve a competitive advantage. Decisions are no longer confined to a specific functional area. Instead, business organizations today derive actionable insight for formulating future courses of action by integrating multiple objectives and perspectives. Therefore, multi-objective decision-making plays a critical role in businesses and industries. In this regard, Operations Research (OR) models and their applications enable firms to derive optimum solutions subject to various constraints and/or objectives while considering multiple functional areas of the organization together. Hence, researchers and practitioners have extensively applied OR models to solve organizational issues related to manufacturing, service, supply chain and logistics management, human resource management, finance, and market analysis, among others. Further, OR models driven by AI provide intelligent decision-support frameworks for achieving sustainable development goals. The present issue provides a unique platform to showcase the contributions of leading international experts on production systems and business from academia, industry, and government, discussing issues in intelligent manufacturing, operations management, financial management, supply chain management, and Industry 4.0 in the Artificial Intelligence era.
The general (but not specific) scope of these proceedings includes OR models such as optimization and control, combinatorial optimization, queueing theory, resource allocation models, linear and nonlinear programming models, multi-objective and multi-attribute decision models, and statistical quality control, along with AI, Bayesian data analysis, machine learning, and econometrics, and their applications vis-à-vis AI- and data-driven production management, marketing and retail management, financial management, human resource management, operations management, smart manufacturing and Industry 4.0, supply chain and logistics management, digital supply networks, healthcare administration, inventory management, consumer behavior, security analysis, portfolio management, and sustainability. The present issue will be of interest to faculty members, students, and scholars of engineering and social science institutions and universities, along with practitioners and policymakers of different industries and organizations.
This text is among the first to reveal the intricacies of an airline's Operations Control Centre, especially the thought processes, information flows, and strategies used to mitigate disruptions.
Focuses on the use of simulation techniques to model and evaluate repetitive construction operations. Based on the CYCLONE and MICROCYCLONE software developed by the authors and used at 38 universities nationwide, it uses a variety of examples from all areas of construction to demonstrate the application of simulation to analyze construction operations.
Operational research is a collection of modelling techniques used to structure, analyse, and solve problems related to the design and operation of complex human systems. While many argue that operational research should play a key role in improving healthcare services, staff may be largely unaware of its potential applications. This Element explores operational research's wartime origins and introduces several approaches that operational researchers use to help healthcare organisations: address well-defined decision problems; account for multiple stakeholder perspectives; and describe how system performance may be impacted by changing the configuration or operation of services. The authors draw on examples that illustrate the valuable perspective that operational research brings to improvement initiatives and the challenges of implementing and scaling operational research solutions. They discuss how operational researchers are working to surmount these problems and suggest further research to help operational researchers have greater beneficial impact in healthcare improvement. This title is also available as Open Access on Cambridge Core.
"Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition" provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new sections, in addition to fully updated examples, tables, figures, and a revised appendix. Intended primarily for practitioners, this book does not require sophisticated mathematical skills or a deep understanding of the underlying theory and methods, nor does it discuss alternative technologies for reasoning under uncertainty. The theory and methods presented are illustrated through more than 140 examples, and exercises are included for the reader to check his or her level of understanding. The techniques and methods presented for knowledge elicitation, model construction and verification, modeling techniques and tricks, learning models from data, and analyses of models have all been developed and refined on the basis of numerous courses that the authors have held for practitioners worldwide.
Many decision problems in Operations Research are defined on temporal networks, that is, workflows of time-consuming tasks whose processing order is constrained by precedence relations. For example, temporal networks are used to model projects, computer applications, digital circuits and production processes. Optimization problems arise in temporal networks when a decision maker wishes to determine a temporal arrangement of the tasks and/or a resource assignment that optimizes some network characteristic (e.g. the time required to complete all tasks). The parameters of these optimization problems (e.g. the task durations) are typically unknown at the time the decision problem arises. This monograph investigates solution techniques for optimization problems in temporal networks that explicitly account for this parameter uncertainty. We study several formulations, each of which requires different information about the uncertain problem parameters.
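As a minimal illustration of the fully known-parameter case described above (an editorial sketch, not drawn from the monograph, which is about the *uncertain*-parameter setting): when task durations are known, the earliest completion time of a temporal network is the longest duration-weighted path through the precedence DAG.

```python
from graphlib import TopologicalSorter  # Python 3.9+

def completion_time(durations, preds):
    """Earliest completion time of a temporal network.

    durations: task -> processing time; preds: task -> set of predecessor
    tasks.  Each task starts as soon as all its predecessors finish, so the
    minimum makespan equals the longest duration-weighted path in the DAG.
    """
    finish = {}
    for task in TopologicalSorter(preds).static_order():
        start = max((finish[p] for p in preds.get(task, ())), default=0.0)
        finish[task] = start + durations[task]
    return max(finish.values())

# Tiny project: B and C follow A; D follows both B and C.
dur = {"A": 2, "B": 3, "C": 5, "D": 1}
pre = {"A": set(), "B": {"A"}, "C": {"A"}, "D": {"B", "C"}}
print(completion_time(dur, pre))  # -> 8  (critical path A -> C -> D)
```

The monograph's subject begins where this sketch stops: when the entries of `dur` are uncertain at decision time, the formulations studied differ in what information about them is assumed available.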
This book introduces the theory and applications of uncertain optimal control, and establishes two types of models including expected value uncertain optimal control and optimistic value uncertain optimal control. These models, which have continuous-time forms and discrete-time forms, make use of dynamic programming. The uncertain optimal control theory relates to equations of optimality, uncertain bang-bang optimal control, optimal control with switched uncertain system, and optimal control for uncertain system with time-delay. Uncertain optimal control has applications in portfolio selection, engineering, and games. The book is a useful resource for researchers, engineers, and students in the fields of mathematics, cybernetics, operations research, industrial engineering, artificial intelligence, economics, and management science.
This book offers an in-depth and comprehensive introduction to the priority methods of intuitionistic preference relations, the consistency and consensus improving procedures for intuitionistic preference relations, the approaches to group decision making based on intuitionistic preference relations, the approaches and models for interactive decision making with intuitionistic fuzzy information, and the extended results in interval-valued intuitionistic fuzzy environments.
This handbook is a compilation of comprehensive reference sources that provide state-of-the-art findings on both theoretical and applied research on sustainable fashion supply chain management. It contains three parts, organized under the headings of "Reviews and Discussions," "Analytical Research," and "Empirical Research," featuring peer-reviewed papers contributed by researchers from Asia, Europe, and the US. This book is the first to focus on sustainable supply chain management in the fashion industry and is therefore a pioneering text on this topic. In the fashion industry, disposable fashion under the fast fashion concept has become a trend. In this trend, fashion supply chains must be highly responsive to market changes and able to produce fashion products in very small quantities to satisfy changing consumer needs. As a result, new styles will appear in the market within a very short time and fashion brands such as Zara can reduce the whole process cycle from conceptual design to a final ready-to-sell "well-produced and packaged" product on the retail sales floor within a few weeks. From the supply chain's perspective, the fast fashion concept helps to match supply and demand and lowers inventory. Moreover, since many fast fashion companies, e.g., Zara, H&M, and Topshop, adopt a local sourcing approach and obtain supply from local manufacturers (to cut lead time), the corresponding carbon footprint is much reduced. Thus, this local sourcing scheme under fast fashion would enhance the level of environmental friendliness compared with the more traditional offshore sourcing. 
Furthermore, since the fashion supply chain is notorious for generating high volumes of pollutants, involving hazardous materials in the production processes, and producing products by companies with low social responsibility, new management principles and theories, especially those that take into account consumer behaviours and preferences, need to be developed to address many of these issues in order to achieve the goal of sustainable fashion supply chain management. The topics covered include Reverse Logistics of US Carpet Recycling; Green Brand Strategies in the Fashion Industry; Impacts of Social Media on Consumers' Disposals of Apparel; Fashion Supply Chain Network Competition with Eco-labelling; Reverse Logistics as a Sustainable Supply Chain Practice for the Fashion Industry; Apparel Manufacturers' Path to World-class Corporate Social Responsibility; Sustainable Supply Chain Management in the Slow-Fashion Industry; Mass Market Second-hand Clothing Retail Operations in Hong Kong; Constraints and Drivers of Growth in the Ethical Fashion Sector: The case of France; and Effects of Used Garment Collection Programmes in Fast Fashion Brands.
Two-person zero-sum game theory deals with situations that are perfectly competitive: there are exactly two decision makers, with no possibility of cooperation or compromise. It is the most fundamental part of game theory, and the part most commonly applied. There are diverse applications to military battles, sports, parlor games, economics, and politics. The theory was born in World War II and has by now matured into a significant and tractable body of knowledge about competitive decision making. The advent of modern, powerful computers has enabled the solution of many games that were once beyond computational reach. "Two-Person Zero-Sum Games, 4th Ed." offers an up-to-date introduction to the subject, especially its computational aspects. Any finite game can be solved by the brute-force method of enumerating all possible strategies and then applying linear programming. The trouble is that many interesting games have far too many strategies to enumerate, even with the aid of computers. After introducing ideas, terminology, and the brute-force method in the initial chapters, the rest of the book is devoted to classes of games that can be solved without enumerating every strategy. Numerous examples are given, as well as an extensive set of exercises. Many of the exercises are keyed to sheets of an included Excel workbook that can be freely downloaded from the SpringerExtras website. This new edition can be used as either a reference book or as a textbook.
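The brute-force idea mentioned in the blurb can be sketched in miniature. The helper below (an editorial illustration, not taken from the book) computes the two players' pure-strategy security levels from the payoff matrix; when they coincide the game has a saddle point, and otherwise one passes to the linear program over mixed strategies.

```python
def security_levels(A):
    """Pure-strategy security levels of a zero-sum matrix game.

    A[i][j] is the payoff to the row player (the maximizer).  The row
    player can guarantee at least maximin; the column player can hold the
    row player to at most minimax.  maximin == minimax means a saddle
    point exists in pure strategies; otherwise the game's value lies
    strictly between them and is found by linear programming over
    mixed strategies.
    """
    maximin = max(min(row) for row in A)        # row player's guarantee
    minimax = min(max(col) for col in zip(*A))  # column player's guarantee
    return maximin, minimax

# Matching pennies: no saddle point, so mixed strategies are needed.
print(security_levels([[1, -1], [-1, 1]]))  # -> (-1, 1)
# A game with a saddle point of value 2.
print(security_levels([[2, 3], [1, 0]]))    # -> (2, 2)
```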
This book addresses the measurement of the effect of information technology (IT) investments on a firm's productivity. Determining a quantifiable impact of a firm's IT has plagued senior executives, researchers, and policy-makers for several years, as evidenced by articles in trade magazines such as Fortune and Businessweek and in academic journals such as Management Science. Simple statistical techniques for measuring IT impact in a firm are fraught with methodological problems, as these techniques do not account for either the causal direction in managerial decision making or the behavioral assumptions about firms. Therefore, such studies have led to results and inferences that are not generalizable. While studies that measure the satisfaction of people who use IT are important, management typically would like to know whether IT has reduced operation costs by streamlining processes or increased revenues by increasing the demand-meeting capability of the firm. This book attempts to determine cost-reduction or output-enhancement that may be linked to IT investments through methodological sophistication.
The efficiency of computational methods and the choice of the most efficient methods for solving a specific problem or class of problems have always played an important role in numerical analysis. Optimization of the computerized solution process is now a major problem of applied mathematics, which stimulates the search for new computational methods and ways to implement them. In "Minimax Models in the Theory of Numerical Methods", methods for estimating the efficiency of computational algorithms and problems of their optimality are studied within the framework of a general computation model. The subjects dealt with here are very different from the traditional subjects of computational methods. Close attention is paid to adaptive (sequential) computational algorithms, the process of computation being regarded as a controlled process and the algorithm as a control strategy. This approach allows methods of game theory and other methods of operations research and systems analysis to be widely used for constructing optimal algorithms. The goal underlying the study of the various computation models dealt with in this title is the construction of concrete numerical algorithms admitting programme implementation. The central role belongs to the concept of a sequentially optimal algorithm, which in many cases reflects the characteristics of real-life computational processes more fully than the traditional optimality concepts.
7.1.1 Background Uncertainty can be considered the lack of adequate information to make a decision. It is important to quantify uncertainties in mathematical models used for the design and optimization of nondeterministic engineering systems. In general, uncertainty can be broadly classified into three types (Bae et al. 2004; Ha-Rok 2004; Klir and Wierman 1998; Oberkampf and Helton 2002; Sentz 2002). The first is aleatory uncertainty (also referred to as stochastic uncertainty or inherent uncertainty): it results from the fact that a system can behave in random ways. For example, the failure of an engine can be modeled as an aleatory uncertainty because the failure can occur at a random time. One cannot predict exactly when the engine will fail even if a large quantity of failure data is gathered. The second is epistemic uncertainty (also known as subjective uncertainty or reducible uncertainty): it is the uncertainty of the outcome of some random event due to lack of knowledge or information in any phase or activity of the modeling process. By gaining information about the system or environmental factors, one can reduce the epistemic uncertainty. For example, a lack of experimental data to characterize new materials and processes leads to epistemic uncertainty.
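The engine-failure example above can be sketched numerically (an illustrative toy, not from the book): individual failure times scatter no matter how much data is collected, which is the aleatory part, while the analyst's estimate of the underlying failure rate sharpens with sample size, which is the epistemic part.

```python
import random
import statistics

TRUE_RATE = 0.1  # failures per hour; in practice unknown to the analyst

def estimate_rate(n, seed=0):
    """Estimate the failure rate from n simulated failure times.

    The scatter of individual failure times is aleatory: it never goes
    away, however much data is gathered.  The error in the rate
    *estimate* is epistemic: it shrinks as n grows.
    """
    rng = random.Random(seed)
    times = [rng.expovariate(TRUE_RATE) for _ in range(n)]
    return 1.0 / statistics.mean(times)

for n in (10, 1000, 100000):
    print(n, round(estimate_rate(n), 4))  # estimates approach 0.1 as n grows
```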
This volume contains contributions from prominent researchers who participated in the 2007 IAENG International Conference on Operations Research. Topics covered include quality management systems, reliability and quality control, engineering experimental design, computer supported collaborative engineering, human factors and ergonomics, computer aided manufacturing, manufacturing processes and methods, engineering management and leadership, optimization, transportation network design, stochastic modelling, queueing theory, and industrial applications. The book presents the state of the art of advances in communication systems and electrical engineering and also serves as an excellent reference work for researchers and graduate students working in industrial engineering and operations research.
This book begins by introducing the topic of knowledge in literature, including its scientific foundations. Owing to the ever-increasing number of scientific publications, literature reviews are becoming more and more essential for staying up to date. Literature Reviews describes an innovative system for creating systematic literature reviews through reviewing, analyzing, and synthesizing scientific and technological literature. It then discusses systematic literature reviews, content analysis, and literature synthesis separately, before presenting a methodology that combines them in one process. It showcases computational tools to aid in this technique and offers examples of the method in action. Finally, the book takes a new look at future developments in the subject. This book is of interest to graduate students, as well as researchers and academics, helping them deepen insights and improve the skills needed to conduct thorough literature reviews.
This book is based on a number of systems concepts, of which the following are emphasized here:
* The interacting systems of society and the environment are dynamic and evolutionary
* Evolution of these systems carries them through stages of differential stability and instability, continuity and discontinuity
* Associated with evolution and instability is structural change that is essentially irreversible
* The present is a stage of world transformation that may not have been equaled for decades or even centuries
* Policies and decisions must match the times, in the present case the stage of world transformation
The time 11:59:59 PM, approximately, on December 31, 2000 has an important symbolic meaning. It marks the end of a minute, the end of an hour, the end of a day, the end of a year, the end of a decade, the end of a century, and the end of a millennium. The time and date provide a convenient yardstick against which we can evaluate the evolution of our thinking and the adequacy of our assumptions, mental models, paradigms, and policies. Will the beginning turn out to be appropriately different from the end? We hope that this book is helpful in such evaluation. This is a new-paradigm book, which both presents and advances the new way of thinking about the systems of science, technology, society, economics, politics, and the environment, and actively calls for the replacement of the worn-out cognitive/sociotechnical paradigm.
This book presents methods for full-wave computer simulation that can be used in various applications and contexts, e.g. seismic prospecting, earthquake stability, global seismic patterns on Earth and Mars, medicine, traumatology, ultrasound investigation of the human body, ultrasound and laser operations, ultrasonic non-destructive railway testing, modelling aircraft composites, modelling composite material delamination, etc. The key innovation of this approach is the ability to study spatial dynamical wave processes, which is made possible by cutting-edge numerical finite-difference grid-characteristic methods. The book will benefit all students, researchers, practitioners and professors interested in numerical mathematics, computer science, computer simulation, high-performance computer systems, unstructured meshes, interpolation, seismic prospecting, geophysics, medicine, non-destructive testing and composite materials.
Fund Custody and Administration provides an overall perspective of investment funds without limiting its analysis to specific fund structures, as other books do. Since governance and oversight of investment funds are now major regulatory requirements, administrators and custodians must place greater emphasis on the custody and safekeeping of fund assets, on the independent and robust valuation of the assets, and on collateral management. By focusing on both the asset transactions made by the investment manager for the portfolio and on the transactions in the shares or units of the fund itself, it gives readers insights about the essential elements of investment fund management and administration, regardless of their geographical backgrounds.
Applying Occupational Psychology in the Fire Service: Emotion, Risk and Decision-Making provides readers with an overview of the latest research informing the policies, procedures and practices of those working on the ground in the UK Fire Service. Using best-practice principles and cutting-edge theory, the current text demonstrates how occupational psychology can be applied to fire services around the globe to improve individual, management, and organisational decisions. The authors aim to provide students, trainees, practitioners and fire personnel with a unique insight into a range of topics, including resilience, injury, work related wellbeing, community engagement as well as decision making and operational preparedness. This book represents a call to arms for more robust practices to support the Fire Service, highlighting the psychological factors involved in the firefighter occupation and paving the way towards a better understanding of emotion, risk, safety, and decision-making within the fire context.
The survey process is a highly complex and situationally dependent one, in need of careful management. If poorly designed and administered, surveys can create disappointment and even disaster. Little has been written so far for those responsible for designing and implementing surveys in organizations. These authors have drawn on their extensive consulting experience to develop a concise, pragmatic, seven-step model covering the entire process, from initiation, to final evaluation, to making the results meaningful to the future of the organization. They pay special attention to the political and human sensitivities concerned and show how to overcome the many potential barriers to a successful outcome.
Applications of queueing network models have multiplied in the last generation, including scheduling of large manufacturing systems, control of patient flow in health systems, load balancing in cloud computing, and matching in ride sharing. These problems are too large and complex for exact solution, but their scale allows approximation. This book is the first comprehensive treatment of fluid scaling, diffusion scaling, and many-server scaling in a single text presented at a level suitable for graduate students. Fluid scaling is used to verify stability, in particular treating max weight policies, and to study optimal control of transient queueing networks. Diffusion scaling is used to control systems in balanced heavy traffic, by solving for optimal scheduling, admission control, and routing in Brownian networks. Many-server scaling is studied in the quality and efficiency driven Halfin-Whitt regime and applied to load balancing in the supermarket model and to bipartite matching in ride-sharing applications.
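The stability and heavy-traffic notions mentioned above can be illustrated on the simplest case, the M/M/1 queue (a standard textbook sketch, not an excerpt from this book): stability requires traffic intensity rho = lambda/mu < 1, which is the kind of condition fluid-scaling arguments verify for general networks, and the mean queue length rho/(1-rho) blows up as rho approaches 1, the balanced-heavy-traffic regime that diffusion scaling approximates.

```python
def mm1_stats(lam, mu):
    """Steady-state metrics of an M/M/1 queue.

    lam is the Poisson arrival rate and mu the exponential service rate.
    Stability requires traffic intensity rho = lam/mu < 1; otherwise the
    queue grows without bound.  In the stable case the mean number in
    system is L = rho / (1 - rho), which diverges as rho -> 1.
    """
    rho = lam / mu
    if rho >= 1:
        return rho, float("inf")  # unstable: queue length diverges
    return rho, rho / (1 - rho)

for lam in (0.5, 0.9, 0.99):
    rho, L = mm1_stats(lam, mu=1.0)
    print(f"rho={rho:.2f}  mean number in system={L:.1f}")
```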