"One has to make everything as simple as possible, but never more simple." (Albert Einstein) "Discovery consists of seeing what everybody has seen and thinking what nobody has thought." (Albert Szent-Györgyi) The primary goal of this book is to provide an introduction to the theory of Interior Point Methods (IPMs) in Mathematical Programming. At the same time, we try to present a quick overview of the impact of extensions of IPMs on smooth nonlinear optimization and to demonstrate the potential of IPMs for solving difficult practical problems. The Simplex Method has dominated the theory and practice of mathematical programming since 1947, when Dantzig discovered it. In the fifties and sixties several attempts were made to develop alternative solution methods. At that time the principal basis of interior point methods was also developed, for example in the work of Frisch (1955), Carroll (1961), Huard (1967), Fiacco and McCormick (1968) and Dikin (1967). In 1972 Klee and Minty made explicit that in the worst case some variants of the simplex method may require an exponential amount of work to solve Linear Programming (LP) problems. This was at the time when complexity theory became a topic of great interest. People started to classify mathematical programming problems as efficiently (in polynomial time) solvable and as difficult (NP-hard) problems. For a while it remained open whether LP was solvable in polynomial time or not. The breakthrough resolution of this problem was obtained by Khachiyan (1979).
The International Union of Theoretical and Applied Mechanics (IUTAM) initiated and sponsored an International Symposium on Optimization of Mechanical Systems held in 1995 in Stuttgart, Germany. The Symposium was intended to bring together scientists working in different fields of optimization to exchange ideas and to discuss new trends with special emphasis on multibody systems. A Scientific Committee was appointed by the Bureau of IUTAM with the following members: S. Arimoto (Japan), F.L. Chernousko (Russia), M. Geradin (Belgium), E.J. Haug (U.S.A.), C.A.M. Soares (Portugal), N. Olhoff (Denmark), W.O. Schiehlen (Germany, Chairman), K. Schittkowski (Germany), R.S. Sharp (U.K.), W. Stadler (U.S.A.), and H.-B. Zhao (China). This committee selected the participants to be invited and the papers to be presented at the Symposium. As a result of this procedure, 90 active scientific participants from 20 countries followed the invitation, and 49 papers were presented in lecture and poster sessions.
In 1995 the Handbook of Global Optimization (first volume), edited by R. Horst and P.M. Pardalos, was published. This second volume of the Handbook of Global Optimization comprises chapters dealing with modern approaches to global optimization, including different types of heuristics. Together (available as a set, set ISBN 1-4020-0742-6), the two volumes of the handbook cover a complete and broad spectrum of approaches for dealing with global optimization problems. The goal of the editors is to provide a true handbook that does not focus on particular applications of the heuristics and algorithms, but rather describes the state of the art for the different methodologies. Topics covered in the handbook include various metaheuristics, such as simulated annealing, genetic algorithms, neural networks, tabu search, shake-and-bake methods, and deformation methods. In addition, the book contains chapters on new exact stochastic and deterministic approaches to continuous and mixed-integer global optimization, such as stochastic adaptive search, two-phase methods, branch-and-bound methods with new relaxation and branching strategies, algorithms based on local optimization, and dynamical search. Finally, the book contains chapters on experimental analysis of algorithms and software, test problems, and applications. Audience: Graduate students in engineering and operations research, academic researchers, as well as practitioners, who can tailor the general approaches described in the handbook to their specific needs and applications.
The NATO Advanced Study Institute on "Algorithms for continuous optimization: the state of the art" was held September 5-18, 1993, at Il Ciocco, Barga, Italy. It was attended by 75 students (among them many well known specialists in optimization) from the following countries: Belgium, Brazil, Canada, China, Czech Republic, France, Germany, Greece, Hungary, Italy, Poland, Portugal, Romania, Spain, Turkey, UK, USA, Venezuela. The lectures were given by 17 well known specialists in the field, from Brazil, China, Germany, Italy, Portugal, Russia, Sweden, UK, USA. Solving continuous optimization problems is a fundamental task in computational mathematics for applications in areas of engineering, economics, chemistry, biology and so on. Most real problems are nonlinear and can be of quite large size. Developing efficient algorithms for continuous optimization has been an important field of research in the last 30 years, with much additional impetus provided in the last decade by the availability of very fast and parallel computers. Techniques, like the simplex method, that were already considered fully developed thirty years ago have been thoroughly revised and enormously improved. The aim of this ASI was to present the state of the art in this field. While not all important aspects could be covered in the fifty hours of lectures (for instance multiobjective optimization had to be skipped), we believe that the most important topics were presented, many of them by scientists who greatly contributed to their development.
Researchers develop simulation models that emulate real-world situations. While these simulation models are simpler than the real situation, they are still quite complex and time-consuming to develop. It is at this point that metamodeling can be used to help build a simulation study based on a complex model. A metamodel is a simpler, analytical model, auxiliary to the simulation model, which is used to better understand the more complex model, to test hypotheses about it, and to provide a framework for improving the simulation study. The use of metamodels allows the researcher to work with a set of mathematical functions and analytical techniques to test simulations without the costly running and re-running of complex computer programs. In addition, metamodels have other advantages, and as a result they are being used in a variety of ways: model simplification, optimization, model interpretation, generalization to other models of similar systems, efficient sensitivity analysis, and the use of the metamodel's mathematical functions to answer questions about different variables within a simulation study.
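The workflow described above can be sketched in a few lines of Python: run the expensive simulation at a handful of design points, then fit a cheap analytical metamodel to the results. The simulation function, design points, and quadratic form below are invented for illustration, not taken from the book.

```python
import numpy as np

# Hypothetical "expensive" simulation: the response of a system to one
# design variable x (this function is invented for the illustration).
def simulation(x):
    return np.sin(x) + 0.1 * x

# Run the simulation at a small number of design points...
x_train = np.linspace(0.0, 2.0, 8)
y_train = simulation(x_train)

# ...and fit a cheap quadratic metamodel (polynomial regression) to the results.
coeffs = np.polyfit(x_train, y_train, deg=2)
metamodel = np.poly1d(coeffs)

# The metamodel now answers "what if" questions without re-running the simulation.
x_new = 1.3
print(f"simulation: {simulation(x_new):.3f}  metamodel: {metamodel(x_new):.3f}")
```

In a real study the metamodel would also be validated at held-out design points before being trusted for sensitivity analysis or optimization.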
I am very happy to have this opportunity to present the work of Boris Mirkin, a distinguished Russian scholar in the areas of data analysis and decision making methodologies. The monograph is devoted entirely to clustering, a discipline dispersed through many theoretical and application areas, from mathematical statistics and combinatorial optimization to biology, sociology and organizational structures. It compiles an immense amount of research done to date, including many original Russian developments never presented to the international community before (for instance, cluster-by-cluster versions of the K-Means method in Chapter 4 or uniform partitioning in Chapter 5). The author's approach, approximation clustering, allows him both to systematize a great part of the discipline and to develop many innovative methods in the framework of optimization problems. The optimization methods considered are proved to be meaningful in the contexts of data analysis and clustering. The material presented in this book is quite interesting and stimulating in both of its paradigms, clustering and optimization. On the other hand, it has a substantial application appeal. The book will be useful both to specialists and students in the fields of data analysis and clustering as well as in biology, psychology, economics, marketing research, artificial intelligence, and other scientific disciplines. Panos Pardalos, Series Editor.
Most books on inventory theory use the item approach to determine stock levels, ignoring the impact of unit cost, echelon location, and hardware indenture. Optimal Inventory Modeling of Systems is the first book to take the system approach to inventory modeling. The result has been dramatic reductions in the resources required to operate many systems: fleets of aircraft, ships, telecommunications networks, electric utilities, and the space station. Although only four chapters and appendices are totally new in this edition, extensive revisions have been made in all chapters, adding numerous worked-out examples. Many new applications have been added, including commercial airlines, experience gained during Desert Storm, and adoption of the Windows interface as a standard for personal computer models.
Over the past several years, cooperative control and optimization has unquestionably been established as one of the most important areas of research in the military sciences. Even so, cooperative control and optimization transcends the military in its scope, having become quite relevant to a broad class of systems with many exciting commercial applications. One reason for all the excitement is that research has been so incredibly diverse, spanning many scientific and engineering disciplines. This latest volume in the Cooperative Systems book series clearly illustrates this trend towards diversity and creative thought. And no wonder: cooperative systems are among the hardest systems control science has endeavored to study, hence creative approaches to modeling, analysis, and synthesis are a must. The definition of cooperation itself is a slippery issue. As you will see in this and previous volumes, cooperation has been cast into many different roles and therefore has assumed many diverse meanings. Perhaps the most we can say which unites these disparate concepts is that cooperation (1) requires more than one entity, (2) the entities must have some dynamic behavior that influences the decision space, (3) the entities share at least one common objective, and (4) entities are able to share information about themselves and their environment. Optimization and control have long been active fields of research in engineering.
This collection of papers is dedicated to the memory of Gaetano Fichera, a great mathematician and also a good friend to the editors. Regrettably it took an unusual amount of time to bring this collection out. This was primarily due to the fact that the main editor, P.D. Panagiotopoulos, who had collected all of the materials for this volume, died unexpectedly during the period when we were editing the manuscript. The other two editors, in appreciation of Panagiotopoulos' contribution to this field, believe it is therefore fitting that this collection be dedicated to his memory also. The theme of the collection is centered around the seminal research of G. Fichera on the Signorini problem. Variants on this idea enter in different ways. For example, by bringing in friction the problem is no longer self-adjoint and the minimization formulation is not valid. A large portion of this collection is devoted to survey papers concerning hemivariational methods, with a main point of their application to nonsmooth mechanics. Hemivariational inequalities, which are a generalization of variational inequalities, were pioneered by Panagiotopoulos. There are many applications of this theory to the study of nonconvex energy functionals occurring in many branches of mechanics. An area of concentration concerns contact problems, in particular quasistatic and dynamic contact problems with friction and damage. Nonsmooth optimization methods, which may be divided into the main groups of subgradient methods and bundle methods, are also discussed in this collection.
Global Optimization has emerged as one of the most exciting new areas of mathematical programming. Global optimization has attracted wide attention from many fields in the past few years, due to the success of new algorithms for addressing previously intractable problems from diverse areas such as computational chemistry and biology, biomedicine, structural optimization, computer sciences, operations research, economics, and engineering design and control. This book contains refereed invited papers presented at the 4th international conference on Frontiers in Global Optimization held at Santorini, Greece, during June 8-12, 2003. Santorini is one of the few sites in Greece with a wild beauty created by the explosion of a volcano, which lies in the middle of the gulf of the island. The mystic landscape, with its numerous multi-extrema, was an inspiring location, particularly for researchers working on global optimization. The three previous conferences on "Recent Advances in Global Optimization," "State-of-the-Art in Global Optimization," and "Optimization in Computational Chemistry and Molecular Biology: Local and Global Approaches" took place at Princeton University in 1991, 1995, and 1999, respectively. The papers in this volume focus on deterministic methods for global optimization, stochastic methods for global optimization, distributed computing methods in global optimization, and applications of global optimization in several branches of applied science and engineering, computer science, computational chemistry, structural biology, and bio-informatics.
This volume contains refereed papers based on the lectures presented at the XIV International Conference on Mathematical Programming held at Matrahaza, Hungary, between 27-31 March 1999. This conference was organized by the Laboratory of Operations Research and Decision Systems at the Computer and Automation Institute, Hungarian Academy of Sciences. The editors hope this volume will contribute to the theory and applications of mathematical programming. As a tradition of these events, the main purpose of the conference was to review and discuss recent advances and promising research trends concerning theory, algorithms and applications in different fields of Optimization Theory and related areas such as Convex Analysis, Complementarity Systems and Variational Inequalities. The conference is traditionally held in the Matra Mountains, and housed by the resort house of the Hungarian Academy of Sciences. This was the 14th event of the long-lasting series of conferences started in 1973. The organizers wish to express their thanks to the authors for their contributions in this volume, and to the anonymous referees for their valuable comments. Special thanks are directed to our sponsors: the Hungarian Academy of Sciences, the National Committee for Technological Development, the Hungarian National Science Foundation, and last but not least, the Hungarian Operational Research Society. We would like to thank John Martindale from Kluwer Academic Publishers for helping us produce this volume, Eva Nora Nagy for corrections and proof-reading, and Peter Dombi for his excellent work on typesetting and editing the manuscript.
In the quest to understand and model the healthy or sick human body, researchers and medical doctors are utilizing more and more quantitative tools and techniques. This trend is pushing the envelope of a new field we call Biomedical Computing, as an exciting frontier among signal processing, pattern recognition, optimization, nonlinear dynamics, computer science and biology, chemistry and medicine. A conference on Biocomputing was held during February 25-27, 2001 at the University of Florida. The conference was sponsored by the Center for Applied Optimization, the Computational Neuroengineering Center, the Biomedical Engineering Program (through a Whitaker Foundation grant), the Brain Institute, the School of Engineering, and the University of Florida Research & Graduate Programs. The conference provided a forum for researchers to discuss and present new directions in Biocomputing. The well-attended three-day event was highlighted by the presence of top researchers in the field who presented their work in Biocomputing. This volume contains a selective collection of refereed papers based on talks presented at this conference. You will find seminal contributions in genomics, global optimization, computational neuroscience, fMRI, brain dynamics, epileptic seizure prediction and cancer diagnostics. We would like to take the opportunity to thank the sponsors, the authors of the papers, the anonymous referees, and Kluwer Academic Publishers for making the conference successful and the publication of this volume possible. Panos M. Pardalos and Jose C.
Optimization problems abound in most fields of science, engineering, and technology. In many of these problems it is necessary to compute the global optimum (or a good approximation) of a multivariable function. The variables that define the function to be optimized can be continuous and/or discrete and, in addition, many times satisfy certain constraints. Global optimization problems belong to the complexity class of NP-hard problems. Such problems are very difficult to solve. Traditional descent optimization algorithms based on local information are not adequate for solving these problems. In most cases of practical interest the number of local optima increases, on the average, exponentially with the size of the problem (number of variables). Furthermore, most of the traditional approaches fail to escape from a local optimum in order to continue the search for the global solution. Global optimization has received a lot of attention in the past ten years, due to the success of new algorithms for solving large classes of problems from diverse areas such as engineering design and control, computational chemistry and biology, structural optimization, computer science, operations research, and economics. This book contains refereed invited papers presented at the conference on "State of the Art in Global Optimization: Computational Methods and Applications" held at Princeton University, April 28-30, 1995. The conference presented current research on global optimization and related applications in science and engineering. The papers included in this book cover a wide spectrum of approaches for solving global optimization problems and applications.
grams of which the objective is given by the ratio of a convex function by a concave function that is positive over a convex domain. As observed by Sniedovich (Ref. [102, 103]), most of the properties of fractional programs could be found in other programs, given that the objective function could be written as a particular composition of functions. He called this new field C-programming, standing for composite concave programming. In his seminal book on dynamic programming (Ref. [104]), Sniedovich shows how the study of such compositions can help in tackling non-separable dynamic programs that would otherwise defy solution. Barros and Frenk (Ref. [9]) developed a cutting plane algorithm capable of optimizing C-programs. More recently, this algorithm has been used by Carrizosa and Plastria to solve a global optimization problem in facility location (Ref. [16]). The distinction between global optimization problems (Ref. [54]) and generalized convex problems can sometimes be hard to establish. That is exactly the reason why so much effort has been placed into finding an exhaustive classification of the different weak forms of convexity, establishing a new definition just to satisfy some desirable property in the most general way possible. This book does not aim at all the subtleties of the different generalizations of convexity, but concentrates on the most general of them all, quasiconvex programming. Chapter 5 shows clearly where the real difficulties appear.
The purpose of this book is to develop a framework for analyzing strategic rationality, a notion central to contemporary game theory, which is the formal study of the interaction of rational agents, and which has proved extremely fruitful in economics, political theory, and business management. The author argues that a logical paradox (known since antiquity as "the Liar paradox") lies at the root of a number of persistent puzzles in game theory, in particular those concerning rational agents who seek to establish some kind of reputation. Building on the work of Parsons, Burge, Gaifman, and Barwise and Etchemendy, Robert Koons constructs a context-sensitive solution to the whole family of Liar-like paradoxes, including, for the first time, a detailed account of how the interpretation of paradoxical statements is fixed by context. This analysis provides a new understanding of how the rational agent model can account for the emergence of rules, practices, and institutions.
Meta-heuristics have developed dramatically since their inception in the early 1980s. They have had widespread success in attacking a variety of practical and difficult combinatorial optimization problems. These families of approaches include, but are not limited to, greedy random adaptive search procedures, genetic algorithms, problem-space search, neural networks, simulated annealing, tabu search, threshold algorithms, and their hybrids. They incorporate concepts based on biological evolution, intelligent problem solving, mathematical and physical sciences, nervous systems, and statistical mechanics. Since the 1980s, a great deal of effort has been invested in the field of combinatorial optimization theory, in which heuristic algorithms have become an important area of research and applications. This volume is drawn from the first conference on Meta-Heuristics and contains 41 papers on the state of the art in heuristic theory and applications. The book treats the following meta-heuristics and applications: Genetic Algorithms, Simulated Annealing, Tabu Search, Networks & Graphs, Scheduling and Control, TSP, and Vehicle Routing Problems. It represents research from the fields of Operations Research, Management Science, Artificial Intelligence and Computer Science.
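As a concrete taste of one of these meta-heuristics, here is a minimal simulated-annealing sketch for a toy travelling-salesman instance. The city coordinates, cooling schedule, and 2-opt neighbourhood move are invented for the example, not drawn from the book.

```python
import math
import random

random.seed(0)

# Toy TSP instance: coordinates of six cities (illustrative data).
cities = [(0, 0), (1, 5), (2, 2), (5, 1), (6, 4), (3, 6)]

def tour_length(tour):
    # Total length of the closed tour visiting the cities in the given order.
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

# Simulated annealing: accept worse neighbours with probability exp(-delta/T),
# cooling the temperature T geometrically so the search gradually turns greedy.
tour = list(range(len(cities)))
best = tour[:]
T = 10.0
while T > 1e-3:
    i, j = sorted(random.sample(range(len(cities)), 2))
    neighbour = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt reversal
    delta = tour_length(neighbour) - tour_length(tour)
    if delta < 0 or random.random() < math.exp(-delta / T):
        tour = neighbour
        if tour_length(tour) < tour_length(best):
            best = tour[:]
    T *= 0.995

print(best, round(tour_length(best), 2))
```

The same accept-worse-moves-with-decreasing-probability skeleton carries over to scheduling, vehicle routing, and the other applications listed above; only the solution encoding and neighbourhood move change.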
In the last few decades, multiscale algorithms have become a dominant trend in large-scale scientific computation. Researchers have successfully applied these methods to a wide range of simulation and optimization problems. This book gives a general overview of multiscale algorithms; applications to general combinatorial optimization problems such as graph partitioning and the traveling salesman problem; and VLSI CAD applications, including circuit partitioning, placement, and VLSI routing. Additional chapters discuss optimization in reconfigurable computing, convergence in multilevel optimization, and model problems with PDE constraints. Audience: Written at the graduate level, the book is intended for engineers and mathematical and computational scientists studying large-scale optimization in electronic design automation.
This book deals with the aspects of modeling and solving real-world optimization problems in a unique combination. It treats systematically the major modeling languages and modeling systems used to solve mathematical optimization problems. The book is an offspring of the 71st Meeting of the GOR (Gesellschaft für Operations Research) Working Group Mathematical Optimization in Real Life, which was held under the title Modeling Languages in Mathematical Optimization during April 23-25, 2003 in the German Physics Society Conference Building in Bad Honnef, Germany. The modeling language providers AIMMS (Johannes Bisschop, Paragon Decision Technology B.V., Haarlem, The Netherlands), AMPL (Bob Fourer, Northwestern Univ.; David M. Gay, AMPL Optimization LLC, NJ), GAMS (Alexander Meeraus, GAMS Development Corporation, Washington D.C.), Mosel (Bob Daniel, Dash Optimization, Blisworth, UK), MPL (Bjarni Kristjansson, Maximal Software, Arlington, VA), NOP-2 (Hermann Schichl, Vienna University, Austria), PCOMP (Klaus Schittkowski, Bayreuth University, Germany), and OPL (Sofiane Oussedik, ILOG Inc., Paris, France) gave deep insight into their motivations and the conceptual design features of their software, highlighted their advantages, but also critically discussed their limits. The participants benefited greatly from this symposium, which gave a useful overview and orientation on today's modeling languages in optimization. Roughly speaking, a modeling language serves the need to pass data and a mathematical model description to a solver in the same way that people, especially mathematicians, describe those problems to each other.
This volume contains the edited texts of the lectures presented at the Workshop on High Performance Algorithms and Software for Nonlinear Optimization held in Erice, Sicily, at the "G. Stampacchia" School of Mathematics of the "E. Majorana" Centre for Scientific Culture, June 30 - July 8, 2001. In the first year of the new century, the aim of the Workshop was to assess the past and to discuss the future of Nonlinear Optimization, and to highlight recent achievements and promising research trends in this field. An emphasis was requested on algorithmic and high performance software developments and on new computational experiences, as well as on theoretical advances. We believe that this goal was basically achieved. The Workshop was attended by 71 people from 22 countries. Although not all topics were covered, the presentations gave indeed a wide overview of the field, from different and complementary standpoints. Besides the lectures, several formal and informal discussions took place. We wish to express our appreciation for the active contribution of all the participants in the meeting. The 18 papers included in this volume represent a significant selection of the most recent developments in nonlinear programming theory and practice. They show that there are plenty of exciting ideas, implementation issues and new applications which produce a very fast evolution in the field.
Optimization from Human Genes to Cutting Edge Technologies. The challenges faced by industry today are so complex that they can only be solved through the help and participation of optimization experts. For example, many industries in e-commerce, finance, medicine, and engineering face several computational challenges due to the massive data sets that arise in their applications. Some of the challenges include extended memory algorithms and data structures, new programming environments, software systems, cryptographic protocols, storage devices, data compression, mathematical and statistical methods for knowledge mining, and information visualization. With advances in computer and information systems technologies, and many interdisciplinary efforts, many of the "data avalanche challenges" are beginning to be addressed. Optimization is the most crucial component in these efforts. Nowadays, the main task of optimization is to investigate the cutting edge frontiers of these technologies and systems and find the best solutions for their realization. Optimization principles are evident in nature (the perfect optimizer) and appeared early in human history. Did you ever watch how a spider catches a fly or a mosquito? Usually a spider hides at the edge of its net. When a fly or a mosquito hits the net, the spider will pick up each line in the net to choose the tense line. Some biologists explain that this line gives the shortest path from the spider to its prey.
Researchers working with nonlinear programming often claim "the world is nonlinear", indicating that real applications require nonlinear modeling. The same is true for other areas such as multi-objective programming (there are always several goals in a real application), stochastic programming (all data is uncertain and therefore stochastic models should be used), and so forth. In this spirit we claim: The world is multilevel. In many decision processes there is a hierarchy of decision makers, and decisions are made at different levels in this hierarchy. One way to handle such hierarchies is to focus on one level and include other levels' behaviors as assumptions. Multilevel programming is the research area that focuses on the whole hierarchy structure. In terms of modeling, the constraint domain associated with a multilevel programming problem is implicitly determined by a series of optimization problems which must be solved in a predetermined sequence. If only two levels are considered, we have one leader (associated with the upper level) and one follower (associated with the lower level).
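The leader-follower structure can be made concrete with a toy bilevel program solved by brute force: for each candidate leader decision x, the follower's problem is solved first, and the leader then optimizes over the resulting best responses. All functions and grids below are invented for illustration.

```python
# Toy bilevel (leader-follower) problem, solved by brute force on a grid:
#   leader:   minimize  F(x, y*(x)) = (x - 3)**2 + y*(x)**2  over x
#   follower: y*(x) = argmin_y  f(x, y) = (y - x)**2
# In this toy case the follower's best response is simply y*(x) = x, so the
# leader effectively minimizes (x - 3)**2 + x**2, whose optimum is x = 1.5.

def follower_best_response(x):
    # Inner (lower-level) problem, solved here by grid search.
    ys = [i / 10 for i in range(-50, 51)]
    return min(ys, key=lambda y: (y - x) ** 2)

def leader_objective(x):
    # The leader's objective depends on the follower's optimal reaction.
    y = follower_best_response(x)
    return (x - 3) ** 2 + y ** 2

xs = [i / 10 for i in range(-50, 51)]
x_star = min(xs, key=leader_objective)
print(x_star, follower_best_response(x_star))  # optimum near x = 1.5, y = 1.5
```

The nesting is the essential point: the leader's constraint domain is defined implicitly through the follower's optimization, exactly as the paragraph above describes; real multilevel solvers replace the grid search with optimality conditions or specialized algorithms.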
The intensive use of automatic data acquisition systems and the use of cloud computing for process monitoring have led to an increased occurrence of industrial processes that utilize statistical process control and capability analysis. These analyses are performed almost exclusively with multivariate methodologies. The aim of this Brief is to present the most important MSQC (Multivariate Statistical Quality Control) techniques developed in the R language. The book is divided into two parts. The first part contains the basic R elements, an introduction to statistical procedures, and the main aspects related to Statistical Quality Control (SQC). The second part covers the construction of multivariate control charts and the calculation of Multivariate Capability Indices.
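As a taste of what such charts compute, here is a minimal sketch of Hotelling's T² statistic, the workhorse of multivariate control charting, written in Python for illustration (the book itself works in R). The data are simulated and the distribution parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative in-control data: 50 observations of 3 correlated quality
# variables (mean vector and covariance are invented for this sketch).
X = rng.multivariate_normal(mean=[10.0, 5.0, 2.0],
                            cov=[[1.0, 0.3, 0.1],
                                 [0.3, 1.0, 0.2],
                                 [0.1, 0.2, 1.0]],
                            size=50)

xbar = X.mean(axis=0)                          # estimated centre line
S_inv = np.linalg.inv(np.cov(X, rowvar=False)) # inverse sample covariance

# Hotelling's T^2 for each observation: the multivariate analogue of the
# squared standardized distance of a point from the centre line.
d = X - xbar
t2 = np.einsum('ij,jk,ik->i', d, S_inv, d)

# A real Phase I chart would compare t2 against a Beta/F-based upper
# control limit; here we simply report the largest statistic.
print(f"max T^2 = {t2.max():.2f} over {len(t2)} points")
```

Plotting t2 against its control limit, point by point, gives the multivariate control chart the second part of the Brief constructs.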
Data uncertainty is a concept closely related to most real-life applications that involve data collection and interpretation. Examples can be found in data acquired with biomedical instruments or other experimental techniques. Integration of robust optimization into existing data mining techniques aims to create new algorithms resilient to error and noise. This work encapsulates all the latest applications of robust optimization in data mining. This brief contains an overview of the rapidly growing field of robust data mining research and presents the best-known machine learning algorithms, their robust counterpart formulations, and algorithms for attacking these problems. This brief will appeal to theoreticians and data miners working in this field.
This book arose out of an invited feature article on visualization and optimization that appeared in the ORSA Journal on Computing in 1994. That article briefly surveyed the current state of the art in visualization as it applied to optimization. In writing the feature article, it became clear that there was much more to say. Apparently others agreed, and thus this book was born. The book is targeted primarily towards the optimization community rather than the visualization community. Although optimization and visualization both seek to help people understand complex problems, practitioners in one field are generally unaware of work in the other field. Given the common goals of the respective fields, it seemed fruitful to consider how each can contribute to the other. One might argue that this book should not be focused specifically on optimization but on decision making in general. Perhaps, but it seems that there is sufficient material to create a book targeted specifically to optimization. Certainly many of the ideas presented in the book are applicable to other areas, including computer simulation, decision theory and stochastic modeling. Another book could discuss the use of visualization in these areas.
Linear Programming (LP) is perhaps the most frequently used optimization technique. One of the reasons for its wide use is that very powerful solution algorithms exist for linear optimization. Computer programs based on either the simplex or interior point methods are capable of solving very large-scale problems with high reliability and within reasonable time. Model builders are aware of this and often try to formulate real-life problems within this framework to ensure they can be solved efficiently. It is also true that many real-life optimization problems can be formulated as truly linear models, and many others can be well approximated by linearization. The two main methods for solving LP problems are the variants of the simplex method and the interior point methods (IPMs). It turns out that both variants have their role in solving different problems. It has been recognized that, since the introduction of the IPMs, the efficiency of simplex-based solvers has increased by two orders of magnitude. This increased efficiency can be attributed to the following: (1) theoretical developments in the underlying algorithms, (2) inclusion of results from computer science, (3) use of the principles of software engineering, and (4) taking into account the state of the art in computer technology.
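A tiny worked example of the kind of LP these solvers handle, assuming SciPy is available: its `linprog` routine uses the HiGHS solvers, which include both simplex and interior point implementations. The particular objective and constraints are invented for illustration.

```python
from scipy.optimize import linprog

# A small illustrative LP:
#   maximize   3x + 2y
#   subject to  x +  y <= 4
#               x + 3y <= 6
#               x, y >= 0
# linprog minimizes, so the objective coefficients are negated.
res = linprog(c=[-3, -2],
              A_ub=[[1, 1], [1, 3]],
              b_ub=[4, 6],
              bounds=[(0, None), (0, None)],
              method="highs")  # HiGHS picks between simplex and interior point

print(res.x, -res.fun)  # optimum at x = 4, y = 0, objective value 12
```

For a problem this size either method is instantaneous; the simplex-versus-IPM distinction the paragraph describes only begins to matter on large-scale instances.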