The purpose of this four-volume series is to make available to college teachers and students samples of important and realistic applications of mathematics that can be covered in undergraduate programs. The goal is to provide illustrations of how modern mathematics is actually employed to solve relevant contemporary problems. Although these independent chapters were prepared primarily for teachers in the general mathematical sciences, they should prove valuable to students, teachers, and research scientists in many of the fields of application as well. Prerequisites for each chapter and suggestions for the teacher are provided. Several of these chapters have been tested in a variety of classroom settings, and all have undergone extensive peer review and revision. Illustrations and exercises are included in most chapters. Some units can be covered in one class, whereas others provide sufficient material for a few weeks of class time. Volume 1 contains 23 chapters and deals with differential equations and, in the last four chapters, problems leading to partial differential equations. Applications are taken from medicine, biology, traffic systems and several other fields. The 14 chapters in Volume 2 are devoted mostly to problems arising in political science, but they also address questions appearing in sociology and ecology. Topics covered include voting systems, weighted voting, proportional representation, coalitional values, and committees. The 14 chapters in Volume 3 emphasize discrete mathematical methods such as those which arise in graph theory, combinatorics, and networks.
This book contains a selection of papers presented at the conference on High Performance Software for Nonlinear Optimization (HPSNO97), which was held in Ischia, Italy, in June 1997. The rapid progress of computer technologies, including new parallel architectures, has stimulated a large amount of research devoted to building software environments and defining algorithms able to fully exploit this new computational power. In some sense, numerical analysis has to conform itself to the new tools. The impact of parallel computing in nonlinear optimization, which had a slow start at the beginning, now seems to increase at a fast rate, and it is reasonable to expect an even greater acceleration in the future. As with the first HPSNO conference, the goal of the HPSNO97 conference was to supply a broad overview of the more recent developments and trends in nonlinear optimization, emphasizing the algorithmic and high performance software aspects. Bringing together new computational methodologies with theoretical advances and new computer technologies is an exciting challenge that involves all scientists willing to develop high performance numerical software. This book contains several important contributions from different and complementary standpoints. Obviously, the articles in the book do not cover all the areas of the conference topic or all the most recent developments, because of the large number of new theoretical and computational ideas of the last few years.
After the IUTAM Symposium on Optimization in Structural Design held in Warsaw in 1973, it was clear to me that the time had come for organizing into a consistent body of thought the enormous quantity of results obtained in this domain, studied from so many different points of view, with so many different methods, and at so many levels of practical applicability. My colleague and friend Gianantonio Sacchi from Milan and I met with Professor Prager in Savognin in July 1974, where I submitted to them my first ideas for a treatise on structural optimization: it should cover the whole domain from basic theory to practical applications, and deal with various materials, various types of structures, various functions required of the structures, and various types of cost. Obviously, this was to be a team effort, to total three or four volumes, to be written in a balanced manner as textbooks and handbooks. Nothing similar existed at that time, and, indeed, nothing has been published to date. Professor Prager was immediately in favor of such a project. He agreed to write a first part on optimality criteria with me and to help me in the general organization of the series. Since Professor Sacchi was willing to write the text on variational methods, it remained to find authors for parts on the mathematical programming approach to structural optimization (and, more generally, on numerical methods) and on practical optimal design procedures in metal and concrete.
During the last three decades, breakthroughs in computer technology have made a tremendous impact on optimization. In particular, parallel computing has made it possible to solve larger and computationally more difficult problems. This volume contains mainly lecture notes from a Nordic Summer School held at the Linköping Institute of Technology, Sweden, in August 1995. In order to make the book more complete, a few authors were invited to contribute chapters that were not part of the course on this first occasion. The purpose of this Nordic course in advanced studies was threefold. One goal was to introduce the students to the new achievements in a new and very active field, bring them close to world-leading researchers, and strengthen their competence in an area with an internationally explosive rate of growth. A second goal was to strengthen the bonds between students from different Nordic countries, and to encourage collaboration and joint research ventures across the borders. In this respect, the course built further on the achievements of the "Nordic Network in Mathematical Programming," which has been running during the last three years with the support of the Nordic Council for Advanced Studies (NorFA). The final goal was to produce literature on the particular subject, which would be available both to the participating students and to the students of the "next generation".
From the contents (partial):
... 2. The Algorithm; 3. Convergence Analysis; 4. Complexity Analysis; 5. Conclusions; References.
A Simple Proof for a Result of Ollerenshaw on Steiner Trees (Xiufeng Du, Ding-Zhu Du, Biao Gao, and Lixue Qi): Introduction; In the Euclidean Plane; In the Rectilinear Plane; Discussion; References.
Optimization Algorithms for the Satisfiability (SAT) Problem (Jun Gu): Introduction; A Classification of SAT Algorithms; Preliminaries; Complete Algorithms and Incomplete Algorithms; Optimization: An Iterative Refinement Process; Local Search Algorithms for SAT; Global Optimization Algorithms for the SAT Problem; Applications; Future Work; Conclusions; References.
Ergodic Convergence in Proximal Point Algorithms with Bregman Functions (Osman Güler): Introduction; Convergence for Function Minimization; Convergence for Arbitrary Maximal Monotone Operators; References.
Adding and Deleting Constraints in the Logarithmic Barrier Method for LP (D. den Hertog, C. Roos, and T. Terlaky): Introduction; The Logarithmic Barrier Method; The Effects of Shifting, Adding and Deleting Constraints; The Build-Up and Down Algorithm; Complexity Analysis; References.
A Projection Method for Solving Infinite Systems of Linear Inequalities (Hui Hu): Introduction; The Projection Method; Convergence Rate; Infinite Systems of Convex Inequalities; Application; References.
In 1984, N. Karmarkar published a seminal paper on algorithmic linear programming. During the subsequent decade, it stimulated a huge outpouring of new algorithmic results by researchers world-wide in many areas of mathematical programming and numerical computation. This book gives an overview of the resulting, dramatic reorganization that has occurred in one of these areas: algorithmic differentiable optimization and equation-solving, or, more simply, algorithmic differentiable programming. The book is aimed at readers familiar with advanced calculus, numerical analysis, in particular numerical linear algebra, the theory and algorithms of linear and nonlinear programming, and the fundamentals of computer science, in particular, computer programming and the basic models of computation and complexity theory. J.L. Nazareth is a Professor in the Department of Pure and Applied Mathematics at Washington State University. He is the author of two books previously published by Springer-Verlag, DLP and Extensions: An Optimization Model and Decision Support System (2001) and The Newton-Cauchy Framework: A Unified Approach to Unconstrained Nonlinear Minimization (1994).
Over the past decade the financial and business environments have undergone significant changes. During the same period several advances have been made within the field of financial engineering, involving both the methodological tools and the application areas. This comprehensive edited volume discusses the most recent advances within the field of financial engineering, focusing not only on the description of the existing areas in financial engineering research, but also on the new methodologies that have been developed for modeling and addressing financial engineering problems. The book is divided into four major parts, each covering different aspects of financial engineering and modeling, such as portfolio management and trading, risk management, applications of operations research methods, and credit rating models. The Handbook of Financial Engineering is intended for financial engineers, researchers, applied mathematicians, and graduate students interested in real-world applications in financial engineering.
A function is convex if its epigraph is convex. This geometrical structure has very strong implications in terms of continuity and differentiability. Separation theorems lead to optimality conditions and duality for convex problems. A function is quasiconvex if its lower level sets are convex. Here again, the geometrical structure of the level sets implies some continuity and differentiability properties for quasiconvex functions. Optimality conditions and duality can be derived for optimization problems involving such functions as well. Over a period of about fifty years, quasiconvex and other generalized convex functions have been considered in a variety of fields including economics, management science, engineering, probability and applied sciences, in accordance with the needs of particular applications. During the last twenty-five years, an increase in research activities in this field has been witnessed. More recently, generalized monotonicity of maps has been studied. It relates to generalized convexity of functions as monotonicity relates to convexity. Generalized monotonicity plays a role in variational inequality problems, complementarity problems and, more generally, in equilibrium problems.
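For orientation, the standard textbook formulations of the notions named above (general definitions given here for reference, not excerpts from this volume; the map F and the inner-product notation are generic):

```latex
% Epigraph and convexity:
\[
  \operatorname{epi} f = \{(x,t) : f(x) \le t\}, \qquad
  f \text{ is convex} \iff \operatorname{epi} f \text{ is a convex set}.
\]
% Quasiconvexity via lower level sets (equivalently, via the max characterization):
\[
  f \text{ is quasiconvex} \iff
  \{x : f(x) \le \alpha\} \text{ is convex for every } \alpha
  \iff f(\lambda x + (1-\lambda)y) \le \max\{f(x), f(y)\}
  \quad \forall\, x, y,\ \lambda \in [0,1].
\]
% Monotone maps and one common generalized-monotone variant:
\[
  F \text{ monotone:}\ \langle F(x)-F(y),\, x-y\rangle \ge 0; \qquad
  F \text{ pseudomonotone:}\ \langle F(y),\, x-y\rangle \ge 0
  \;\Longrightarrow\; \langle F(x),\, x-y\rangle \ge 0 .
\]
```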
A logic view of 0-1 integer programming problems, providing new insights into the structure of problems that can lead the researcher to more effective solution techniques depending on the problem class. Operations research techniques are integrated into a logic programming environment. The first monographic treatment that begins to unify these two methodological approaches. Logic-based methods for modelling and solving combinatorial problems have recently started to play a significant role in both theory and practice. The application of logic to combinatorial problems has a dual aspect. On one hand, constraint logic programming allows one to declaratively model combinatorial problems over an appropriate constraint domain, the problems then being solved by a corresponding constraint solver. Besides being a high-level declarative interface to the constraint solver, the logic programming language allows one also to implement those subproblems that cannot be naturally expressed with constraints. On the other hand, logic-based methods can be used as a constraint solving technique within a constraint solver for combinatorial problems modelled as 0-1 integer programs.
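As a small, generic illustration of this logic view (a standard correspondence, not an excerpt from the monograph): a propositional clause and a 0-1 linear inequality express the same constraint, which is what allows logic-based methods to act as a solving technique for 0-1 integer programs.

```latex
% A clause over Boolean variables and its equivalent 0-1 linear constraint:
\[
  x \lor \lnot y \lor z
  \quad\Longleftrightarrow\quad
  x + (1 - y) + z \;\ge\; 1,
  \qquad x, y, z \in \{0,1\}.
\]
```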
One has to make everything as simple as possible but never more simple. (Albert Einstein) Discovery consists of seeing what everybody has seen and thinking what nobody has thought. (Albert Szent-Györgyi) The primary goal of this book is to provide an introduction to the theory of Interior Point Methods (IPMs) in Mathematical Programming. At the same time, we try to present a quick overview of the impact of extensions of IPMs on smooth nonlinear optimization and to demonstrate the potential of IPMs for solving difficult practical problems. The Simplex Method has dominated the theory and practice of mathematical programming since 1947, when Dantzig discovered it. In the fifties and sixties several attempts were made to develop alternative solution methods. At that time the principal basis of interior point methods was also developed, for example in the work of Frisch (1955), Carroll (1961), Huard (1967), Fiacco and McCormick (1968) and Dikin (1967). In 1972 Klee and Minty made explicit that in the worst case some variants of the simplex method may require an exponential amount of work to solve Linear Programming (LP) problems. This was at the time when complexity theory became a topic of great interest. People started to classify mathematical programming problems as efficiently (in polynomial time) solvable and as difficult (NP-hard) problems. For a while it remained open whether LP was solvable in polynomial time or not. The breakthrough resolution of this problem was obtained by Khachijan (1979).
The International Union of Theoretical and Applied Mechanics (IUTAM) initiated and sponsored an International Symposium on Optimization of Mechanical Systems held in 1995 in Stuttgart, Germany. The Symposium was intended to bring together scientists working in different fields of optimization to exchange ideas and to discuss new trends, with special emphasis on multibody systems. A Scientific Committee was appointed by the Bureau of IUTAM with the following members: S. Arimoto (Japan), F.L. Chernousko (Russia), M. Geradin (Belgium), E.J. Haug (U.S.A.), C.A.M. Soares (Portugal), N. Olhoff (Denmark), W.O. Schiehlen (Germany, Chairman), K. Schittkowski (Germany), R.S. Sharp (U.K.), W. Stadler (U.S.A.), and H.-B. Zhao (China). This committee selected the participants to be invited and the papers to be presented at the Symposium. As a result of this procedure, 90 active scientific participants from 20 countries followed the invitation, and 49 papers were presented in lecture and poster sessions.
In 1995 the first volume of the Handbook of Global Optimization, edited by R. Horst and P.M. Pardalos, was published. This second volume of the Handbook of Global Optimization is comprised of chapters dealing with modern approaches to global optimization, including different types of heuristics. Together (available as a set, ISBN 1-4020-0742-6), the two volumes of the handbook cover a complete and broad spectrum of approaches for dealing with global optimization problems. The goal of the editors is to provide a true handbook that does not focus on particular applications of the heuristics and algorithms, but rather describes the state of the art for the different methodologies. Topics covered in the handbook include various metaheuristics, such as simulated annealing, genetic algorithms, neural networks, tabu search, shake-and-bake methods, and deformation methods. In addition, the book contains chapters on new exact stochastic and deterministic approaches to continuous and mixed-integer global optimization, such as stochastic adaptive search, two-phase methods, branch-and-bound methods with new relaxation and branching strategies, algorithms based on local optimization, and dynamical search. Finally, the book contains chapters on experimental analysis of algorithms and software, test problems, and applications. Audience: graduate students in engineering and operations research, academic researchers, and practitioners, who can tailor the general approaches described in the handbook to their specific needs and applications.
The NATO Advanced Study Institute on "Algorithms for continuous optimization: the state of the art" was held September 5-18, 1993, at Il Ciocco, Barga, Italy. It was attended by 75 students (among them many well known specialists in optimization) from the following countries: Belgium, Brazil, Canada, China, Czech Republic, France, Germany, Greece, Hungary, Italy, Poland, Portugal, Romania, Spain, Turkey, UK, USA, Venezuela. The lectures were given by 17 well known specialists in the field, from Brazil, China, Germany, Italy, Portugal, Russia, Sweden, UK, USA. Solving continuous optimization problems is a fundamental task in computational mathematics, with applications in areas of engineering, economics, chemistry, biology and so on. Most real problems are nonlinear and can be of quite large size. Developing efficient algorithms for continuous optimization has been an important field of research in the last 30 years, with much additional impetus provided in the last decade by the availability of very fast and parallel computers. Techniques, like the simplex method, that were already considered fully developed thirty years ago have been thoroughly revised and enormously improved. The aim of this ASI was to present the state of the art in this field. While not all important aspects could be covered in the fifty hours of lectures (for instance, multiobjective optimization had to be skipped), we believe that the most important topics were presented, many of them by scientists who greatly contributed to their development.
Researchers develop simulation models that emulate real-world situations. While these simulation models are simpler than the real situation, they are still quite complex and time consuming to develop. It is at this point that metamodeling can be used to help build a simulation study based on a complex model. A metamodel is a simpler, analytical model, auxiliary to the simulation model, which is used to better understand the more complex model, to test hypotheses about it, and to provide a framework for improving the simulation study. The use of metamodels allows the researcher to work with a set of mathematical functions and analytical techniques to test simulations without the costly running and re-running of complex computer programs. In addition, metamodels have other advantages, and as a result they are being used in a variety of ways: model simplification, optimization, model interpretation, generalization to other models of similar systems, efficient sensitivity analysis, and the use of the metamodel's mathematical functions to answer questions about different variables within a simulation study.
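A minimal sketch of the idea, under stated assumptions: the expensive_simulation function below is an invented stand-in for a costly simulation run, and the quadratic response-surface form is one common metamodel choice, not a prescription taken from this book.

```python
# Hypothetical sketch: fit a quadratic regression metamodel to a handful of
# simulation runs, then query the cheap surrogate instead of re-running the simulator.
import numpy as np

rng = np.random.default_rng(0)

def expensive_simulation(x1, x2):
    # Stand-in for a costly simulation run (assumed form, for illustration only).
    return 3.0 + 2.0 * x1 - 1.5 * x2 + 0.8 * x1 * x2 + rng.normal(0, 0.05)

# Design points: a small grid of input settings at which the simulator is run once each.
levels = np.linspace(-1.0, 1.0, 5)
X = np.array([(a, b) for a in levels for b in levels])
y = np.array([expensive_simulation(a, b) for a, b in X])

# Quadratic response-surface metamodel:
# y ~ b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1^2 + b5*x2^2
basis = np.column_stack([
    np.ones(len(X)), X[:, 0], X[:, 1],
    X[:, 0] * X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2,
])
coef, *_ = np.linalg.lstsq(basis, y, rcond=None)

def metamodel(x1, x2):
    # Cheap analytical approximation of the simulator, usable for what-if questions.
    return coef @ np.array([1.0, x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

# Crude sensitivity check: the fitted first-order coefficients indicate how strongly
# each input drives the response near the design center.
print("fitted coefficients:", np.round(coef, 2))
print("metamodel at (0.5, -0.5):", round(metamodel(0.5, -0.5), 3))
```

Once fitted, the surrogate can be evaluated thousands of times for optimization or sensitivity analysis at negligible cost, which is exactly the trade the paragraph above describes.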
I am very happy to have this opportunity to present the work of Boris Mirkin, a distinguished Russian scholar in the areas of data analysis and decision making methodologies. The monograph is devoted entirely to clustering, a discipline dispersed through many theoretical and application areas, from mathematical statistics and combinatorial optimization to biology, sociology and organizational structures. It compiles an immense amount of research done to date, including many original Russian developments never presented to the international community before (for instance, cluster-by-cluster versions of the K-Means method in Chapter 4 or uniform partitioning in Chapter 5). The author's approach, approximation clustering, allows him both to systematize a great part of the discipline and to develop many innovative methods in the framework of optimization problems. The optimization methods considered are proved to be meaningful in the contexts of data analysis and clustering. The material presented in this book is quite interesting and stimulating with respect to both of its paradigms, clustering and optimization. On the other hand, it has a substantial application appeal. The book will be useful both to specialists and students in the fields of data analysis and clustering as well as in biology, psychology, economics, marketing research, artificial intelligence, and other scientific disciplines. Panos Pardalos, Series Editor.
Most books on inventory theory use the item approach to determine stock levels, ignoring the impact of unit cost, echelon location, and hardware indenture. Optimal Inventory Modeling of Systems is the first book to take the system approach to inventory modeling. The result has been dramatic reductions in the resources required to operate many systems - fleets of aircraft, ships, telecommunications networks, electric utilities, and the space station. Although only four chapters and appendices are totally new in this edition, extensive revisions have been made in all chapters, adding numerous worked-out examples. Many new applications have been added, including commercial airlines, experience gained during Desert Storm, and adoption of the Windows interface as a standard for personal computer models.
Over the past several years, cooperative control and optimization has unquestionably been established as one of the most important areas of research in the military sciences. Even so, cooperative control and optimization transcends the military in its scope, having become quite relevant to a broad class of systems with many exciting commercial applications. One reason for all the excitement is that research has been so incredibly diverse, spanning many scientific and engineering disciplines. This latest volume in the Cooperative Systems book series clearly illustrates this trend towards diversity and creative thought. And no wonder: cooperative systems are among the hardest systems control science has endeavored to study, hence creative approaches to modeling, analysis, and synthesis are a must. The definition of cooperation itself is a slippery issue. As you will see in this and previous volumes, cooperation has been cast into many different roles and therefore has assumed many diverse meanings. Perhaps the most we can say to unite these disparate concepts is that cooperation (1) requires more than one entity, (2) requires that the entities have some dynamic behavior that influences the decision space, (3) requires that the entities share at least one common objective, and (4) requires that the entities are able to share information about themselves and their environment. Optimization and control have long been active fields of research in engineering.
This collection of papers is dedicated to the memory of Gaetano Fichera, a great mathematician and also a good friend to the editors. Regrettably it took an unusual amount of time to bring this collection out. This was primarily due to the fact that the main editor, P. D. Panagiotopoulos, who had collected all of the materials for this volume, died unexpectedly during the period when we were editing the manuscript. The other two editors, in appreciation of Panagiotopoulos' contribution to this field, believe it is therefore fitting that this collection be dedicated to his memory also. The theme of the collection is centered around the seminal research of G. Fichera on the Signorini problem. Variants on this idea enter in different ways. For example, by bringing in friction the problem is no longer self-adjoint and the minimization formulation is not valid. A large portion of this collection is devoted to survey papers concerning hemivariational methods, with a main point of application to nonsmooth mechanics. Hemivariational inequalities, which are a generalization of variational inequalities, were pioneered by Panagiotopoulos. There are many applications of this theory to the study of nonconvex energy functionals occurring in many branches of mechanics. An area of concentration concerns contact problems, in particular quasistatic and dynamic contact problems with friction and damage. Nonsmooth optimization methods, which may be divided into the main groups of subgradient methods and bundle methods, are also discussed in this collection.
Global optimization has emerged as one of the most exciting new areas of mathematical programming. It has attracted wide attention from many fields in the past few years, due to the success of new algorithms for addressing previously intractable problems from diverse areas such as computational chemistry and biology, biomedicine, structural optimization, computer science, operations research, economics, and engineering design and control. This book contains refereed invited papers presented at the 4th International Conference on Frontiers in Global Optimization held at Santorini, Greece, during June 8-12, 2003. Santorini is one of the few sites in Greece with a wild beauty created by the explosion of a volcano that lies in the middle of the island's gulf. The mystic landscape, with its numerous local extrema, was an inspiring location, particularly for researchers working on global optimization. The three previous conferences, on "Recent Advances in Global Optimization," "State-of-the-Art in Global Optimization," and "Optimization in Computational Chemistry and Molecular Biology: Local and Global Approaches," took place at Princeton University in 1991, 1995, and 1999, respectively. The papers in this volume focus on deterministic methods for global optimization, stochastic methods for global optimization, distributed computing methods in global optimization, and applications of global optimization in several branches of applied science and engineering, computer science, computational chemistry, structural biology, and bioinformatics.
This volume contains refereed papers based on the lectures presented at the XIV International Conference on Mathematical Programming held at Matrahaza, Hungary, between 27 and 31 March 1999. The conference was organized by the Laboratory of Operations Research and Decision Systems at the Computer and Automation Institute, Hungarian Academy of Sciences. The editors hope this volume will contribute to the theory and applications of mathematical programming. As is the tradition of these events, the main purpose of the conference was to review and discuss recent advances and promising research trends concerning theory, algorithms and applications in different fields of Optimization Theory and related areas such as Convex Analysis, Complementarity Systems and Variational Inequalities. The conference is traditionally held in the Matra Mountains and housed by the resort house of the Hungarian Academy of Sciences. This was the 14th event in a long-lasting series of conferences started in 1973. The organizers wish to express their thanks to the authors for their contributions to this volume, and to the anonymous referees for their valuable comments. Special thanks are directed to our sponsors: the Hungarian Academy of Sciences, the National Committee for Technological Development, the Hungarian National Science Foundation, and, last but not least, the Hungarian Operational Research Society. We would like to thank John Martindale from Kluwer Academic Publishers for helping us produce this volume, Eva Nora Nagy for corrections and proof-reading, and Peter Dombi for his excellent work on typesetting and editing the manuscript.
In the quest to understand and model the healthy or sick human body, researchers and medical doctors are utilizing more and more quantitative tools and techniques. This trend is pushing the envelope of a new field we call Biomedical Computing, an exciting frontier among signal processing, pattern recognition, optimization, nonlinear dynamics, computer science, and biology, chemistry and medicine. A conference on Biocomputing was held during February 25-27, 2001 at the University of Florida. The conference was sponsored by the Center for Applied Optimization, the Computational Neuroengineering Center, the Biomedical Engineering Program (through a Whitaker Foundation grant), the Brain Institute, the School of Engineering, and the University of Florida Research & Graduate Programs. The conference provided a forum for researchers to discuss and present new directions in Biocomputing. The well-attended three-day event was highlighted by the presence of top researchers in the field who presented their work in Biocomputing. This volume contains a selective collection of refereed papers based on talks presented at this conference. You will find seminal contributions in genomics, global optimization, computational neuroscience, fMRI, brain dynamics, epileptic seizure prediction and cancer diagnostics. We would like to take the opportunity to thank the sponsors, the authors of the papers, the anonymous referees, and Kluwer Academic Publishers for making the conference successful and the publication of this volume possible. Panos M. Pardalos and Jose C.
Optimization problems abound in most fields of science, engineering, and technology. In many of these problems it is necessary to compute the global optimum (or a good approximation) of a multivariable function. The variables that define the function to be optimized can be continuous and/or discrete and, in addition, often must satisfy certain constraints. Global optimization problems belong to the complexity class of NP-hard problems. Such problems are very difficult to solve. Traditional descent optimization algorithms based on local information are not adequate for solving these problems. In most cases of practical interest the number of local optima increases, on average, exponentially with the size of the problem (number of variables). Furthermore, most of the traditional approaches fail to escape from a local optimum in order to continue the search for the global solution. Global optimization has received a lot of attention in the past ten years, due to the success of new algorithms for solving large classes of problems from diverse areas such as engineering design and control, computational chemistry and biology, structural optimization, computer science, operations research, and economics. This book contains refereed invited papers presented at the conference on "State of the Art in Global Optimization: Computational Methods and Applications" held at Princeton University, April 28-30, 1995. The conference presented current research on global optimization and related applications in science and engineering. The papers included in this book cover a wide spectrum of approaches for solving global optimization problems and applications.
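To make the point about descent methods getting trapped concrete, here is a small, self-contained sketch (the test function and the naive descent routine are illustrative assumptions, not material from the book): a single local descent stalls in whichever basin it starts from, while a crude multistart strategy samples several basins and keeps the best local optimum it finds.

```python
# Illustration of local trapping vs. a simple multistart remedy (not from the book).
import numpy as np

def f(x):
    # A one-dimensional test function with many local minima; its global minimum
    # lies near x = -0.5 (assumed example, chosen only for illustration).
    return 0.1 * x ** 2 + np.sin(3.0 * x)

def local_descent(x, step=1e-2, iters=2000):
    # Naive descent: move to a neighbouring point whenever it improves f.
    for _ in range(iters):
        for cand in (x - step, x + step):
            if f(cand) < f(x):
                x = cand
                break
    return x

rng = np.random.default_rng(0)
starts = rng.uniform(-6, 6, size=20)            # multistart: many random initial points
minima = [local_descent(s) for s in starts]      # each run ends in *some* local minimum
best = min(minima, key=f)

print("single start from x=5 ends at", round(local_descent(5.0), 3))
print("best of 20 starts ends at    ", round(best, 3), "with f =", round(float(f(best)), 3))
```

The single descent from x = 5 stops at a nearby shallow minimum, whereas the best of twenty restarts typically lands in a much deeper basin, which is the behaviour the paragraph above attributes to the exponential growth of local optima.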
…fractional programs, of which the objective is given by the ratio of a convex function to a concave function that is positive over a convex domain. As observed by Sniedovich (Ref. [102, 103]), most of the properties of fractional programs can be found in other programs, provided the objective function can be written as a particular composition of functions. He called this new field C-programming, standing for composite concave programming. In his seminal book on dynamic programming (Ref. [104]), Sniedovich shows how the study of such compositions can help in tackling non-separable dynamic programs that would otherwise defy solution. Barros and Frenk (Ref. [9]) developed a cutting plane algorithm capable of optimizing C-programs. More recently, this algorithm has been used by Carrizosa and Plastria to solve a global optimization problem in facility location (Ref. [16]). The distinction between global optimization problems (Ref. [54]) and generalized convex problems can sometimes be hard to establish. That is exactly the reason why so much effort has been put into finding an exhaustive classification of the different weak forms of convexity, establishing a new definition just to satisfy some desirable property in the most general way possible. This book does not aim at all the subtleties of the different generalizations of convexity, but concentrates on the most general of them all, quasiconvex programming. Chapter 5 shows clearly where the real difficulties appear.
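For concreteness, the single-ratio fractional program referred to above can be written as follows, together with Dinkelbach's classical parametric reformulation; this is a standard textbook device shown only for orientation, not a reproduction of the book's C-programming treatment.

```latex
% Single-ratio fractional program: f convex, g concave with g(x) > 0 on a convex set X.
\[
  \min_{x \in X} \; \frac{f(x)}{g(x)}, \qquad g(x) > 0 \ \text{for all } x \in X .
\]
% Dinkelbach's parametric reformulation: the optimal ratio \lambda^* is characterized by
\[
  F(\lambda) \;=\; \min_{x \in X} \bigl( f(x) - \lambda\, g(x) \bigr),
  \qquad F(\lambda^*) = 0 .
\]
```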
The purpose of this book is to develop a framework for analyzing strategic rationality, a notion central to contemporary game theory, which is the formal study of the interaction of rational agents, and which has proved extremely fruitful in economics, political theory, and business management. The author argues that a logical paradox (known since antiquity as "the Liar paradox") lies at the root of a number of persistent puzzles in game theory, in particular those concerning rational agents who seek to establish some kind of reputation. Building on the work of Parsons, Burge, Gaifman, and Barwise and Etchemendy, Robert Koons constructs a context-sensitive solution to the whole family of Liar-like paradoxes, including, for the first time, a detailed account of how the interpretation of paradoxical statements is fixed by context. This analysis provides a new understanding of how the rational agent model can account for the emergence of rules, practices, and institutions.
Meta-heuristics have developed dramatically since their inception in the early 1980s. They have had widespread success in attacking a variety of practical and difficult combinatorial optimization problems. These families of approaches include, but are not limited to, greedy randomized adaptive search procedures, genetic algorithms, problem-space search, neural networks, simulated annealing, tabu search, threshold algorithms, and their hybrids. They incorporate concepts based on biological evolution, intelligent problem solving, mathematical and physical sciences, nervous systems, and statistical mechanics. Since the 1980s, a great deal of effort has been invested in the field of combinatorial optimization theory, in which heuristic algorithms have become an important area of research and applications. This volume is drawn from the first conference on Meta-Heuristics and contains 41 papers on the state of the art in heuristic theory and applications. The book treats the following meta-heuristics and applications: Genetic Algorithms, Simulated Annealing, Tabu Search, Networks & Graphs, Scheduling and Control, TSP, and Vehicle Routing Problems. It represents research from the fields of Operations Research, Management Science, Artificial Intelligence and Computer Science.
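A minimal, self-contained sketch of one meta-heuristic named above, simulated annealing, applied to a small random TSP instance; the city coordinates, move rule, and cooling schedule are illustrative assumptions, not material drawn from the volume.

```python
# Simulated annealing for a toy TSP (illustrative sketch only).
import math
import random

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(15)]

def tour_length(tour):
    # Total length of the closed tour visiting the cities in the given order.
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

tour = list(range(len(cities)))
best, best_len = tour[:], tour_length(tour)
temperature = 1.0

for step in range(20000):
    # Propose a neighbour by reversing a random segment (a 2-opt style move).
    i, j = sorted(random.sample(range(len(tour)), 2))
    candidate = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
    delta = tour_length(candidate) - tour_length(tour)
    # Always accept improvements; accept uphill moves with a temperature-dependent probability.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        tour = candidate
        if tour_length(tour) < best_len:
            best, best_len = tour[:], tour_length(tour)
    temperature *= 0.9995  # geometric cooling schedule

print("best tour length found:", round(best_len, 3))
```

The occasional acceptance of worsening moves at high temperature is exactly what distinguishes this family of methods from the pure descent heuristics they generalize.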