This book constitutes the refereed proceedings of the International Workshop on Engineering Stochastic Local Search Algorithms 2007, held in Brussels, Belgium, September 6-8, 2007. The 12 revised full papers presented together with 9 short papers were carefully reviewed and selected from more than 50 submissions. The topics include methodological developments, behavior of SLS algorithms, search space analysis, algorithm performance, tuning procedures, AI/OR techniques, and dynamic behavior.
This volume contains the Proceedings of the Twelfth French-German-Spanish Conference on Optimization held at the University of Avignon in 2004. We refer to this conference by using the acronym FGS-2004. During the period September 20-24, 2004, about 180 scientists from around the world met at Avignon (France) to discuss recent developments in optimization and related fields. The main topics discussed during this meeting were the following: 1. smooth and nonsmooth continuous optimization problems, 2. numerical methods for mathematical programming, 3. optimal control and calculus of variations, 4. differential inclusions and set-valued analysis, 5. stochastic optimization, 6. multicriteria optimization, 7. game theory and equilibrium concepts, 8. optimization models in finance and mathematical economics, 9. optimization techniques for industrial applications. The Scientific Committee of the conference consisted of F. Bonnans (Rocquencourt, France), J.-B. Hiriart-Urruty (Toulouse, France), F. Jarre (Düsseldorf, Germany), M.A. Lopez (Alicante, Spain), J.E. Martinez-Legaz (Barcelona, Spain), H. Maurer (Münster, Germany), S. Pickenhain (Cottbus, Germany), A. Seeger (Avignon, France), and M. Thera (Limoges, France). The conference FGS-2004 is the 12th of the series of French-German meetings which started in Oberwolfach in 1980 and was continued in Confolant (1981), Luminy (1984), Irsee (1986), Varetz (1988), Lambrecht (1991), Dijon (1994), Trier (1996), Namur (1998), Montpellier (2000), and Cottbus (2002).
Combinatorial optimization and in particular the great variety of fascinating problems that belong to this area have attracted many researchers for more than half a century. Due to the practical relevance of solving hard real-world problems, much research effort has been devoted to the development of heuristic methods aimed at finding good approximate solutions in a reasonable computation time. Some solution paradigms that are not specific to one particular problem have been deeply studied in the past, and the term metaheuristic is now common for such optimization heuristics. Several metaheuristics - simulated annealing, genetic and evolutionary algorithms, tabu search, ant colony optimization, scatter search, iterated local search, and greedy randomized adaptive search procedures being some of them - have found their own research communities, and specialized conferences devoted to such techniques have been organized. Plenty of classical hard problems, such as the quadratic assignment problem, the traveling salesman problem, problems in vehicle routing, scheduling, and timetabling, have been tackled successfully with metaheuristic approaches. Several thereof are currently considered state-of-the-art methods for solving such problems. However, for many years the main focus of research was on the application of single metaheuristics to given problems. A tendency to compare different metaheuristics against each other could be observed, and sometimes this competition led to thinking in stereotypes in the research communities.
Critical regimes of two-phase flows with a polydisperse solid phase form the basis of such widespread industrial processes as separation of various powdery materials and mineral dressing. It is impossible to describe such complicated flows analytically. Therefore, this study concentrates on invariants experimentally revealed and theoretically grounded for such flows. This approach can be compared with the situation in gases, where in order to determine principal parameters of their state, one does not need to measure the kinetic energy and velocity of each molecule and find its contribution to the temperature and pressure. These parameters are determined in a simple way for the system on the whole. A novel conception of two-phase flows allowing the formulation of their statistical parameters is physically substantiated. On the basis of the invariants and these parameters, a comprehensive method of estimating and predicting mass transfer in such flows is developed. It is noteworthy that the presented results are mostly phenomenological. Such an approach can be successfully extended to the separation of liquids, gases and isotopes. The book is intended for students and specialists engaged in chemical technology, mineral dressing, ceramics, microelectronics, pharmacology, power generation, thermal engineering and other fields in which flows carrying solid particles are used in the technological process.
This book constitutes the refereed proceedings of the 6th European Conference on Evolutionary Computation in Combinatorial Optimization, EvoCOP 2006, held in Budapest, Hungary in April 2006. The 24 revised full papers presented were carefully reviewed and selected from 77 submissions. The papers include coverage of evolutionary algorithms as well as various other metaheuristics, like scatter search, tabu search, and memetic algorithms.
Problems demanding globally optimal solutions are ubiquitous, yet many are intractable when they involve constrained functions having many local optima and interacting, mixed-type variables. The differential evolution (DE) algorithm is a practical approach to global numerical optimization which is easy to understand, simple to implement, reliable, and fast. Packed with illustrations, computer code, new insights, and practical advice, this volume explores DE in both principle and practice. It is a valuable resource for professionals needing a proven optimizer and for students wanting an evolutionary perspective on global numerical optimization. A companion CD includes DE-based optimization software in several programming languages.
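To make the flavour of the differential evolution algorithm described above concrete, here is a minimal sketch of the classic DE/rand/1/bin scheme. It is a generic textbook-style illustration, not code from the book or its companion CD; the parameter names (pop_size, F, CR) and the sphere test function are illustrative choices.

```python
import random

def differential_evolution(objective, bounds, pop_size=30, F=0.8, CR=0.9, generations=200):
    """Minimal DE/rand/1/bin sketch: scaled-difference mutation, binomial
    crossover, then greedy one-to-one selection."""
    dim = len(bounds)
    # Random initial population inside the box constraints.
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fitness = [objective(x) for x in pop]

    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals, all different from i.
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            # Mutation: v = x_a + F * (x_b - x_c), clipped to the bounds.
            v = [min(max(pop[a][d] + F * (pop[b][d] - pop[c][d]), bounds[d][0]), bounds[d][1])
                 for d in range(dim)]
            # Binomial crossover: take at least one component from v.
            j_rand = random.randrange(dim)
            trial = [v[d] if (random.random() < CR or d == j_rand) else pop[i][d]
                     for d in range(dim)]
            # Greedy selection: keep the trial vector only if it is no worse.
            f_trial = objective(trial)
            if f_trial <= fitness[i]:
                pop[i], fitness[i] = trial, f_trial

    best = min(range(pop_size), key=lambda i: fitness[i])
    return pop[best], fitness[best]

if __name__ == "__main__":
    # Illustrative test: the 5-dimensional sphere function, minimum 0 at the origin.
    sphere = lambda x: sum(xi * xi for xi in x)
    x_best, f_best = differential_evolution(sphere, [(-5.0, 5.0)] * 5)
    print(x_best, f_best)
```

The three control parameters shown (population size, scale factor F, crossover rate CR) are the ones a DE user typically tunes; the greedy selection step is what makes the population's best objective value monotonically non-increasing.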
With its fourth edition, the ANTS series of workshops has changed its name. The original "ANTS - From Ant Colonies to Artificial Ants: International Workshop on Ant Algorithms" has become "ANTS - International Workshop on Ant Colony Optimization and Swarm Intelligence." This change is mainly due to the following reasons. First, the term "ant algorithms" was slower in spreading in the research community than the term "swarm intelligence," while at the same time research in so-called swarm robotics was the subject of increasing activity: it was therefore an obvious choice to substitute the term ant algorithms with the more accepted and used term swarm intelligence. Second, although swarm intelligence research has undoubtedly produced a number of interesting and promising research directions, we think it is fair to say that its most successful strand is the one known as "ant colony optimization." Ant colony optimization, first introduced in the early 1990s as a novel tool for the approximate solution of discrete optimization problems, has recently seen an explosion in the number of its applications, both to academic and real-world problems, and is currently being extended to the realm of continuous optimization (a few papers on this subject being published in these proceedings). It is therefore a reasonable choice to have the term ant colony optimization as part of the workshop name.
This book presents basic optimization principles and gradient-based algorithms to a general audience in a brief and easy-to-read form, without neglecting rigor. The work should enable professionals to apply optimization theory and algorithms to their own particular practical fields of interest, be it engineering, physics, chemistry, or business economics. Most importantly, for the first time in a relatively brief and introductory work, due attention is paid to the difficulties - such as noise, discontinuities, expense of function evaluations, and the existence of multiple minima - that often unnecessarily inhibit the use of gradient-based methods. In a separate chapter on new gradient-based methods developed by the author and his coworkers, it is shown how these difficulties may be overcome without losing the desirable features of classical gradient-based methods.
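As a point of reference for the kind of method the book builds on, here is a minimal gradient-descent sketch with a simple backtracking step-size rule. This is a generic illustration under assumed names, not the author's new gradient-based methods; the quadratic test function is an arbitrary example.

```python
def gradient_descent(f, grad, x0, step=1.0, shrink=0.5, tol=1e-8, max_iter=500):
    """Steepest descent with backtracking: shrink the step until f decreases."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:   # stop when the gradient is small
            break
        t = step
        # Backtracking line search along the negative gradient direction.
        while f([xi - t * gi for xi, gi in zip(x, g)]) > f(x) and t > 1e-12:
            t *= shrink
        x = [xi - t * gi for xi, gi in zip(x, g)]
    return x

if __name__ == "__main__":
    # Illustrative quadratic: f(x, y) = (x - 1)^2 + 10 * (y + 2)^2, minimum at (1, -2).
    f = lambda v: (v[0] - 1) ** 2 + 10 * (v[1] + 2) ** 2
    grad = lambda v: [2 * (v[0] - 1), 20 * (v[1] + 2)]
    print(gradient_descent(f, grad, [0.0, 0.0]))
```

Noise, discontinuities, or multiple minima of the kind the book discusses break exactly the assumptions this simple scheme relies on (a smooth descent direction and a meaningful local decrease test).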
Optimization is the art, science and mathematics of finding the "best" member of a finite or infinite set of possible choices, based on some objective measure of the merit of each choice in the set. Three key facets of the subject are: - the construction of optimization models that capture the range of available choices within a feasible set and the measure-of-merit of any particular choice in a feasible set relative to its competitors; - the invention and implementation of efficient algorithms for solving optimization models; - a mathematical principle of duality that relates optimization models to one another in a fundamental way. Duality cuts across the entire field of optimization and is useful, in particular, for identifying optimality conditions, i.e., criteria that a given member of a feasible set must satisfy in order to be an optimal solution. This booklet provides a gentle introduction to the above topics and will be of interest to college students taking an introductory course in optimization, high school students beginning their studies in mathematics and science, the general reader looking for an overall sense of the field of optimization, and specialists in optimization interested in developing new ways of teaching the subject to their students. John Lawrence Nazareth is Professor Emeritus in the Department of Mathematics at Washington State University and Affiliate Professor in the Department of Applied Mathematics at the University of Washington. He is the author of two recent books also published by Springer-Verlag which explore the above topics in more depth, Differentiable Optimization and Equation Solving (2003) and DLP and Extensions: An Optimization Model and Decision Support System (2001).
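As a standard illustration of the duality principle mentioned above (textbook linear programming, not material drawn from the booklet itself), a model and its dual come in pairs:

```latex
% Standard-form linear program (P) and its dual (D). Weak duality gives
% b^T y <= c^T x for every feasible pair (x, y); equality certifies that
% both are optimal, which is one way duality yields optimality conditions.
\text{(P)}\quad \min_{x}\ c^{\top}x
\quad \text{s.t.}\ Ax = b,\ x \ge 0,
\qquad\qquad
\text{(D)}\quad \max_{y}\ b^{\top}y
\quad \text{s.t.}\ A^{\top}y \le c .
```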
Evolutionary Computation (EC) involves the study of problem solving and optimization techniques inspired by principles of natural evolution and genetics. EC has been able to draw the attention of an increasing number of researchers and practitioners in several fields. Evolutionary algorithms have in particular been shown to be effective for difficult combinatorial optimization problems appearing in various industrial, economic, and scientific domains. This volume contains the proceedings of EvoCOP 2004, the 4th European Conference on Evolutionary Computation in Combinatorial Optimization. It was held in Coimbra, Portugal, on April 5-7, 2004, jointly with EuroGP 2004, the 7th European Conference on Genetic Programming, and EvoWorkshops 2004, which consisted of the following six individual workshops: EvoBIO, the 2nd European Workshop on Evolutionary Bioinformatics; EvoCOMNET, the 1st European Workshop on Evolutionary Computation in Communications, Networks, and Connected Systems; EvoHOT, the 1st European Workshop on Hardware Optimisation; EvoIASP, the 6th European Workshop on Evolutionary Computation in Image Analysis and Signal Processing; EvoMUSART, the 2nd European Workshop on Evolutionary Music and Art; and EvoSTOC, the 1st European Workshop on Evolutionary Algorithms in Stochastic and Dynamic Environments.
This volume contains a selection of papers referring to lectures presented at the symposium "Operations Research 2004" (OR 2004) held at Tilburg University, September 1-3, 2004. This international conference took place under the auspices of the German Operations Research Society (GOR) and the Dutch Operations Research Society (NGB). The symposium had about 500 participants from countries all over the world. It attracted academics and practitioners working in various fields of Operations Research and provided them with the most recent advances in Operations Research and related areas in Economics, Mathematics, and Computer Science. The program consisted of 4 plenary and 19 semi-plenary talks and more than 300 contributed presentations selected by the program committee to be presented in 20 sections.
A large number of real-life optimisation problems can only be realistically modelled with several, often conflicting, objectives. This fact requires us to abandon the concept of "optimal solution" in favour of vector optimization notions dealing with "efficient solution" and "efficient set". To solve these challenging multiobjective problems, the metaheuristics community has put forward a number of techniques commonly referred to as multiobjective metaheuristics (MOMH). By its very nature, the field of MOMH covers a large research area both in terms of the types of problems solved and the techniques used to solve these problems. Its theoretical interest and practical applicability have attracted a large number of researchers and generated numerous papers, books and special issues. Moreover, several conferences and workshops have been organised, often specialising in specific sub-areas such as multiobjective evolutionary optimisation. The main purpose of this volume is to provide an overview of the current state-of-the-art in the research field of MOMH. This overview is necessarily non-exhaustive, and contains both methodological and problem-oriented contributions, and applications of both population-based and neighbourhood-based heuristics. This volume originated from the workshop on multiobjective metaheuristics that was organised at the Carré des Sciences in Paris on November 4-5, 2002. This meeting was a joint effort of two working groups: EU/ME and PM2O.
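For readers unfamiliar with the vector-optimization vocabulary used above, the standard textbook definition of an efficient solution (not a formulation specific to this volume) can be stated as follows:

```latex
% A feasible solution x* is (Pareto-)efficient for the vector minimisation
% problem  min_{x in X} (f_1(x), ..., f_k(x))  if no other feasible point
% improves one objective without worsening another:
x^{*} \in X \ \text{is efficient} \iff
\nexists\, x \in X:\;
f_i(x) \le f_i(x^{*})\ \ \forall i \in \{1,\dots,k\}
\ \text{and}\
f_j(x) < f_j(x^{*})\ \text{for some } j .
% The efficient set collects all such x*; its image in objective space
% is the Pareto front that MOMH techniques try to approximate.
```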
This book constitutes the thoroughly refereed post-proceedings of the First International Workshop on Global Constraint Optimization and Constraint Satisfaction, COCOS 2002, held in Valbonne-Sophia Antipolis, France in October 2002. The 15 revised full papers presented together with 2 invited papers were carefully selected during two rounds of reviewing and improvement. The papers address current issues in global optimization, mathematical programming, and constraint programming; they are grouped in topical sections on optimization, constraint satisfaction, and benchmarking.
The main aim of this book is to present several results related to functions of unitary operators on complex Hilbert spaces obtained by the author in a sequence of recent research papers. The fundamental tools to obtain these results are provided by some new Riemann-Stieltjes integral inequalities of continuous integrands on the complex unit circle and integrators of bounded variation. Features: all the results presented are completely proved, and the original references where they were first obtained are mentioned; the book is intended for use both by researchers in various fields of Linear Operator Theory and Mathematical Inequalities and by postgraduate students and scientists applying inequalities in their specific areas; and it gives new emphasis to mathematical inequalities, approximation theory and numerical analysis in a simple, friendly and well-digested manner. About the Author: Silvestru Sever Dragomir is Professor and Chair of Mathematical Inequalities at the College of Engineering & Science, Victoria University, Melbourne, Australia. He is the author of many research papers and several books on Mathematical Inequalities and their Applications. He also chairs the international Research Group in Mathematical Inequalities and Applications (RGMIA). For details, see https://rgmia.org/index.php.
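As general background (standard spectral theory, not a statement of the book's specific results), the Riemann-Stieltjes connection arises because a function of a unitary operator is an integral over the unit circle against its spectral family:

```latex
% Spectral representation of a unitary operator U on a complex Hilbert space:
% {E_t} for t in [0, 2*pi] is its spectral family (projection-valued), and for a
% continuous function f on the unit circle the operator f(U) is obtained as a
% Riemann-Stieltjes integral with an integrator of bounded variation.
U = \int_{0}^{2\pi} e^{it}\, dE_t ,
\qquad
f(U) = \int_{0}^{2\pi} f\!\left(e^{it}\right) dE_t .
```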
This book constitutes the joint refereed proceedings of the 6th International Workshop on Approximation Algorithms for Optimization Problems, APPROX 2003 and of the 7th International Workshop on Randomization and Approximation Techniques in Computer Science, RANDOM 2003, held in Princeton, NJ, USA in August 2003. The 33 revised full papers presented were carefully reviewed and selected from 74 submissions. Among the issues addressed are design and analysis of randomized and approximation algorithms, online algorithms, complexity theory, combinatorial structures, error-correcting codes, pseudorandomness, derandomization, network algorithms, random walks, Markov chains, probabilistic proof systems, computational learning, randomness in cryptography, and various applications.
This tutorial contains written versions of seven lectures on Computational Combinatorial Optimization given by leading members of the optimization community. The lectures introduce modern combinatorial optimization techniques, with an emphasis on branch-and-cut algorithms and Lagrangian relaxation approaches. Polyhedral combinatorics, as the mathematical backbone of successful algorithms, is covered from many perspectives; in particular, polyhedral projection and lifting techniques and the importance of modeling are extensively discussed. Applications to prominent combinatorial optimization problems, e.g., in production and transport planning, are treated in many places; in particular, the book contains a state-of-the-art account of the most successful techniques for solving the traveling salesman problem to optimality.
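As a generic illustration of the Lagrangian relaxation idea mentioned above (textbook form, not material taken from the lectures), moving the complicating constraints of a problem into the objective yields a bound:

```latex
% Original problem: minimise c^T x over the "easy" set x in X subject to the
% "complicating" constraints Ax <= b. Dualising the latter with multipliers
% lambda >= 0 gives the Lagrangian relaxation
L(\lambda) \;=\; \min_{x \in X}\; c^{\top}x + \lambda^{\top}(Ax - b),
\qquad \lambda \ge 0 ,
% and every L(lambda) is a lower bound on the optimal value; the Lagrangian
% dual  max_{lambda >= 0} L(lambda)  gives the best such bound, which can be
% combined with cutting planes inside a branch-and-cut framework.
```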
This book constitutes the refereed proceedings of the 5th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems, APPROX 2002, held in Rome, Italy in September 2002. The 20 revised full papers presented were carefully reviewed and selected from 54 submissions. Among the topics addressed are design and analysis of approximation algorithms, inapproximability results, online problems, randomization techniques, average-case analysis, approximation classes, scheduling problems, routing and flow problems, coloring and partitioning, cuts and connectivity, packing and covering, geometric problems, network design, and applications to game theory and other fields.
A famous saying (due to Herriot) defines culture as "what remains when everything is forgotten." One could paraphrase this definition in stating that generalized convexity is what remains when convexity has been dropped. Of course, one expects that some convexity features remain. For functions, convexity of epigraphs (what is above the graph) is a simple but strong assumption. It leads to beautiful properties and to a field in itself called convex analysis. In several models, convexity is not present and introducing genuine convexity assumptions would not be realistic. A simple extension of the notion of convexity consists in requiring that the sublevel sets of the functions are convex (recall that a sublevel set of a function is the portion of the source space on which the function takes values below a certain level). Its first use is usually attributed to de Finetti, in 1949. This property defines the class of quasiconvex functions, which is much larger than the class of convex functions: a nondecreasing or nonincreasing one-variable function is quasiconvex, as well as any one-variable function which is nonincreasing on some interval (-∞, a] or (-∞, a) and nondecreasing on its complement. Many other classes of generalized convex functions have been introduced, often for the needs of various applications: algorithms, economics, engineering, management science, multicriteria optimization, optimal control, statistics. Thus, they play an important role in several applied sciences. A monotone mapping F from a Hilbert space to itself is a mapping for which the angle between F(x) - F(y) and x - y is acute for any x, y. It is well-known that the gradient of a differentiable convex function is monotone. The class of monotone mappings (and the class of multivalued monotone operators) has remarkable properties. This class has been generalized in various directions, with applications to partial differential equations, variational inequalities, complementarity problems and more generally, equilibrium problems. The classes of generalized monotone mappings are more or less related to the classes of generalized convex functions via differentiation or subdifferentiation procedures. They are also linked via several other means.
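In symbols, the two central notions of the paragraph above read as follows (standard definitions, included only as a gloss, not as a quotation from the book):

```latex
% Quasiconvexity via convex sublevel sets: f defined on a convex set C is
% quasiconvex iff every sublevel set {x in C : f(x) <= alpha} is convex,
% which is equivalent to
f\big(\lambda x + (1-\lambda) y\big) \;\le\; \max\{ f(x),\, f(y) \}
\quad \text{for all } x, y \in C,\ \lambda \in [0,1].
% Monotonicity of a mapping F from a Hilbert space H to itself
% (the "non-obtuse angle" condition between F(x) - F(y) and x - y):
\langle F(x) - F(y),\, x - y \rangle \;\ge\; 0
\quad \text{for all } x, y \in H.
```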
This book constitutes the refereed proceedings of the First International Conference on Evolutionary Multi-Criterion Optimization, EMO 2001, held in Zurich, Switzerland in March 2001. The 45 revised full papers presented were carefully reviewed and selected from a total of 87 submissions. Also included are two tutorial surveys and two invited papers. The book is organized in topical sections on algorithm improvements, performance assessment and comparison, constraint handling and problem decomposition, uncertainty and noise, hybrid and alternative methods, scheduling, and applications of multi-objective optimization in a variety of fields.
This volume provides an up-to-date overview of major advances, emerging trends, and projected industrial applications in the field of multidisciplinary optimization. It concentrates on the current status of the field, exposing commonalities as well as innovative, promising, and speculative methods. The book offers a view of today's multidisciplinary optimization environment through a balanced theoretical and practical treatment. The contributors are the foremost authorities in each area of specialisation.
The book deals with linear time-invariant delay-differential equations with commensurate point delays in a control-theoretic context. The aim is to show that, with a suitable algebraic setting, a behavioral theory for dynamical systems described by such equations can be developed. The central object is an operator algebra which turns out to be an elementary divisor domain and thus provides the main tool for investigating the corresponding matrix equations. The book also reports the results obtained so far for delay-differential systems with noncommensurate delays. Moreover, whenever possible it points out similarities and differences to the behavioral theory of multidimensional systems, which is based on a great deal of algebraic structure itself. The presentation is introductory and self-contained, and it should also be accessible to readers with no background in delay-differential equations or behavioral systems theory. The text should interest researchers and graduate students.
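As a concrete, generic instance of the equations in question (not a formulation quoted from the book), a linear time-invariant delay-differential equation with commensurate point delays has all delays equal to integer multiples of one basic delay τ:

```latex
% Linear time-invariant delay-differential equation with commensurate point
% delays: every delay is an integer multiple of a fixed tau > 0.
\dot{x}(t) \;=\; \sum_{k=0}^{N} A_k\, x(t - k\tau) \;+\; \sum_{k=0}^{N} B_k\, u(t - k\tau),
% where A_k and B_k are constant matrices. In an algebraic/behavioral setting the
% shift-by-tau operator is treated alongside d/dt as a generator of an operator
% algebra acting on trajectories.
```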
This book gathers papers presented at the 13th International Conference on Mesh Methods for Boundary-Value Problems and Applications, which was held in Kazan, Russia, in October 2020. The papers address the following topics: the theory of mesh methods for boundary-value problems in mathematical physics; non-linear mathematical models in mechanics and physics; algorithms for solving variational inequalities; computing science; and educational systems. Given its scope, the book is chiefly intended for students in the fields of mathematical modeling, science, and engineering. However, it will also benefit scientists and graduate students interested in these fields.
This volume contains a collection of papers based on lectures and presentations delivered at the International Conference on Constructive Nonsmooth Analysis (CNSA) held in St. Petersburg (Russia) from June 18-23, 2012. This conference was organized to mark the 50th anniversary of the birth of nonsmooth analysis and nondifferentiable optimization and was dedicated to J.-J. Moreau and the late B.N. Pshenichnyi, A.M. Rubinov, and N.Z. Shor, whose contributions to NSA and NDO remain invaluable. The first four chapters of the book are devoted to the theory of nonsmooth analysis. Chapters 5-8 contain new results in nonsmooth mechanics and calculus of variations. Chapters 9-13 are related to nondifferentiable optimization, and the volume concludes with four chapters of interesting and important historical material, including tributes to three giants of nonsmooth analysis, convexity, and optimization: Alexandr Alexandrov, Leonid Kantorovich, and Alex Rubinov. The last chapter provides an overview and important snapshots of the 50-year history of convex analysis and optimization.
Describes how evolutionary algorithms (EAs) can be used to identify, model, and minimize day-to-day problems that arise for researchers in optimization and mobile networking. Mobile ad hoc networks (MANETs), vehicular networks (VANETs), sensor networks (SNs), and hybrid networks each require a designer's keen sense and knowledge of evolutionary algorithms in order to help with the common issues that plague professionals involved in optimization and mobile networking. This book introduces readers to both mobile ad hoc networks and evolutionary algorithms, presenting basic concepts as well as detailed descriptions of each. It demonstrates how metaheuristics and evolutionary algorithms (EAs) can be used to help provide low-cost operations in the optimization process, allowing designers to put some "intelligence" or sophistication into the design. It also offers efficient and accurate information on dissemination algorithms, topology management, and mobility models to address challenges in the field. "Evolutionary Algorithms for Mobile Ad Hoc Networks" instructs on how to identify, model, and optimize solutions to problems that arise in daily research; presents complete and up-to-date surveys on topics like network and mobility simulators; provides sample problems along with solutions/descriptions used to solve each, with performance comparisons; and covers current, relevant issues in mobile networks, like energy use, broadcasting performance, device mobility, and more. "Evolutionary Algorithms for Mobile Ad Hoc Networks" is an ideal book for researchers and students involved in mobile networks, optimization, advanced search techniques, and multi-objective optimization.
In this edition, the scope and character of the monograph did not change with respect to the first edition. Taking into account the rapid development of the field, we have, however, considerably enlarged its contents. Chapter 4 includes two additional sections, 4.4 and 4.6, on theory and algorithms of D.C. Programming. Chapter 7, on Decomposition Algorithms in Nonconvex Optimization, is completely new. Besides this, we added several exercises and corrected errors and misprints in the first edition. We are grateful for valuable suggestions and comments that we received from several colleagues. R. Horst, P.M. Pardalos and N.V. Thoai, March 2000. Preface to the First Edition: Many recent advances in science, economics and engineering rely on numerical techniques for computing globally optimal solutions to corresponding optimization problems. Global optimization problems are extraordinarily diverse and they include economic modeling, fixed charges, finance, networks and transportation, databases and chip design, image processing, nuclear and mechanical design, chemical engineering design and control, molecular biology, and environmental engineering. Due to the existence of multiple local optima that differ from the global solution, all these problems cannot be solved by classical nonlinear programming techniques. During the past three decades, however, many new theoretical, algorithmic, and computational contributions have helped to solve globally multiextreme problems arising from important practical applications.
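For orientation, the d.c. (difference-of-convex) programs addressed by the new Sections 4.4 and 4.6 mentioned above have the following standard form (a generic statement, not a result of the book):

```latex
% A d.c. programming problem: g and h are convex on R^n, so f = g - h is
% generally nonconvex and may possess many local minima that differ from
% the global one.
\min_{x \in \mathbb{R}^{n}} \; f(x) = g(x) - h(x),
\qquad g,\, h \ \text{convex},
% e.g. f(x) = \|x\|^{2} - \|x\|_{1} is d.c. with g(x) = \|x\|^{2}
% and h(x) = \|x\|_{1}.
```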