Linear programming represents one of the major applications of mathematics to business, industry, and economics. It provides a methodology for optimizing an output, given that it is a linear function of a number of inputs. George Dantzig is widely regarded as the founder of the subject with his invention of the simplex algorithm in the 1940s. This second volume is intended to add to the theory of the items discussed in the first volume. It also includes additional advanced topics such as variants of the simplex method, interior point methods (early and current methods), GUB, decomposition, integer programming, and game theory. Graduate students in the fields of operations research, industrial engineering, and applied mathematics will find this volume of particular interest.
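To fix ideas, the optimization problem these volumes study can be stated in a generic standard form; the notation below is a common textbook convention, not taken from any particular book on this page.

```latex
% A linear program in standard inequality form: the data are a cost vector c,
% a constraint matrix A, and a right-hand side b; x is the vector of inputs.
\begin{align*}
\text{maximize}\quad   & c^{\top} x \\
\text{subject to}\quad & A x \le b, \\
                       & x \ge 0 .
\end{align*}
```

The simplex algorithm solves such problems by moving between vertices of the feasible polyhedron $\{x : Ax \le b,\ x \ge 0\}$, while interior point methods traverse the interior of that polyhedron.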
This book offers a comprehensive treatment of the exercises and case studies, as well as summaries of the chapters, of the book "Linear Optimization and Extensions" by Manfred Padberg. It covers the areas of linear programming and the optimization of linear functions over polyhedra in finite dimensional Euclidean vector spaces. The main topics treated in the book are: simplex algorithms and their derivatives, including the duality theory of linear programming; polyhedral theory, pointwise and linear descriptions of polyhedra, double description algorithms, Gaussian elimination with and without division, and the complexity of simplex steps; projective algorithms, the geometry of projective algorithms, and Newtonian barrier methods; ellipsoid algorithms in perfect and in finite precision arithmetic, and the equivalence of linear optimization and polyhedral separation; and the foundations of mixed-integer programming and combinatorial optimization.
Originally published in 1987. This collection of original papers deals with various issues of specification in the context of the linear statistical model. The volume honours the early econometric work of Donald Cochrane, late Dean of Economics and Politics at Monash University in Australia. The chapters focus on problems associated with autocorrelation of the error term in the linear regression model and include appraisals of early work on this topic by Cochrane and Orcutt. The book includes an extensive survey of autocorrelation tests; some exact finite-sample tests; and some issues in preliminary test estimation. A wide range of other specification issues is discussed, including the implications of random regressors for Bayesian prediction; modelling with joint conditional probability functions; and results from duality theory. There is a major survey chapter dealing with specification tests for non-nested models, and some of the applications discussed by the contributors deal with the British National Accounts and with Australian financial and housing markets.
This book introduces several topics related to linear model theory: multivariate linear models, discriminant analysis, principal components, factor analysis, time series in both the frequency and time domains, and spatial data analysis. The second edition adds new material on nonparametric regression, response surface maximization, and longitudinal models. The book provides a unified approach to these disparate subjects and serves as a self-contained companion volume to the author's Plane Answers to Complex Questions: The Theory of Linear Models. Ronald Christensen is Professor of Statistics at the University of New Mexico. He is well known for his work on the theory and application of linear models having linear structure. He is the author of numerous technical articles and several books, and he is a Fellow of the American Statistical Association and the Institute of Mathematical Statistics. Also available: Christensen, Ronald. Plane Answers to Complex Questions: The Theory of Linear Models, Second Edition (1996). New York: Springer-Verlag New York, Inc. Christensen, Ronald. Log-Linear Models and Logistic Regression, Second Edition (1997). New York: Springer-Verlag New York, Inc.
In this book, the author considers separable programming and, in particular, one of its important cases - convex separable programming. Some general results are presented, and techniques of approximating the separable problem by linear programming and dynamic programming are considered. Convex separable programs subject to inequality/equality constraint(s) and bounds on variables are also studied, and iterative algorithms of polynomial complexity are proposed. As an application, these algorithms are used in the implementation of stochastic quasigradient methods for some separable stochastic programs. Numerical approximation with respect to the l1 and l∞ norms, as a convex separable nonsmooth unconstrained minimization problem, is considered as well. Audience: Advanced undergraduate and graduate students, mathematical programming/operations research specialists.
The articles in this proceedings volume reflect the current trends in the theory of approximation, optimization and mathematical economics, and include numerous applications. The book will be of interest to researchers and graduate students involved in functional analysis, approximation theory, mathematical programming and optimization, game theory, mathematical finance and economics.
The 9th Belgian-French-German Conference on Optimization was held in Namur (Belgium) on September 7-11, 1998. This volume is a collection of papers presented at this Conference. Originally, this Conference was a French-German Conference, but this year, in accordance with the organizers' wishes, a third country, Belgium, joined the founding members of the Conference. Hence the name: Belgian-French-German Conference on Optimization. Since the very beginning, the purpose of these Conferences has been to bring together researchers working in the area of Optimization and particularly to encourage young researchers to present their work. Most of the participants come from the organizing countries. However, the general tendency is to invite outside researchers to attend the meeting. So this year, among the 101 participants at this Conference, twenty researchers came from other countries. The general theme of the Conference is everything that concerns the area of Optimization, without specification of particular topics. Theoretical aspects of Optimization, in addition to applications and algorithms of Optimization, are therefore developed. However, and this point was very important for the organizers, the Conference must retain its convivial character: no more than two parallel sessions are organized, which allows useful contacts between researchers to be promoted. The editors express their sincere thanks to all those who took part in this Conference. Their invaluable discussions have made this volume possible.
Complementarity theory is a new domain in applied mathematics and is concerned with the study of complementarity problems. These problems represent a wide class of mathematical models related to optimization, game theory, economic engineering, mechanics, fluid mechanics, stochastic optimal control etc. The book is dedicated to the study of nonlinear complementarity problems by topological methods. Audience: Mathematicians, engineers, economists, specialists working in operations research and anybody interested in applied mathematics or in mathematical modeling.
This monograph presents a collection of results, observations, and examples related to dynamical systems described by linear and nonlinear ordinary differential and difference equations. In particular, dynamical systems that are susceptible to analysis by the Liapunov approach are considered. The naive observation that certain "diagonal-type" Liapunov functions are ubiquitous in the literature attracted the attention of the authors and led to some natural questions. Why does this happen so often? What are the special virtues of these functions in this context? Do they occur so frequently merely because they belong to the simplest class of Liapunov functions and are thus more convenient, or are there any more specific reasons? This monograph constitutes the authors' synthesis of the work on this subject that has been jointly developed by them, among others, producing and compiling results, properties, and examples for many years, aiming to answer these questions and also to formalize some of the folklore or "culture" that has grown around diagonal stability and diagonal-type Liapunov functions. A natural answer to these questions would be that the use of diagonal-type Liapunov functions is frequent because of their simplicity within the class of all possible Liapunov functions. This monograph shows that, although this obvious interpretation is often adequate, there are many instances in which the Liapunov approach is best taken advantage of using diagonal-type Liapunov functions. In fact, they yield necessary and sufficient stability conditions for some classes of nonlinear dynamical systems.
For a long time the techniques of solving linear optimization (LP) problems improved only marginally. Fifteen years ago, however, a revolutionary discovery changed everything. A new 'golden age' for optimization started, which is continuing up to the current time. What is the cause of the excitement? The techniques of linear programming previously formed an isolated body of knowledge. Then suddenly a tunnel was built linking it with a rich and promising land, part of which was already cultivated, part of which was completely unexplored. These revolutionary new techniques are now applied to solve conic linear problems. This makes it possible to model and solve large classes of essentially nonlinear optimization problems as efficiently as LP problems. This volume gives an overview of the latest developments of such 'High Performance Optimization Techniques'. The first part is a thorough treatment of interior point methods for semidefinite programming problems. The second part reviews today's most exciting research topics and results in the area of convex optimization. Audience: This volume is for graduate students and researchers who are interested in modern optimization techniques.
Mathematical elegance is a constant theme in this treatment of linear programming and matrix games. Condensed tableaux, minimal in size and notation, are employed for the simplex algorithm. In the context of these tableaux the beautiful termination theorem of R.G. Bland is proven more simply than heretofore, and the important duality theorem becomes almost obvious. Examples and extensive discussions throughout the book provide insight into definitions, theorems, and applications. There is considerable informal discussion on how best to play matrix games. The book is designed for a one-semester undergraduate course. Readers will need a degree of mathematical sophistication and general tools such as sets, functions, and summation notation. No single college course is a prerequisite, but most students will do better with some prior college mathematics. This thorough introduction to linear programming and game theory will impart a deep understanding of the material and also increase the student's mathematical maturity.
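The link between matrix games and linear programming that such a course exploits can be made concrete in a few lines of code. The following is a minimal sketch, not drawn from the book: it assumes SciPy and NumPy are available and uses an invented 2x2 payoff matrix (matching pennies) to compute the row player's optimal mixed strategy by solving an LP.

```python
# Sketch: optimal mixed strategy for the row player of a zero-sum matrix game,
# obtained by solving a linear program with SciPy's HiGHS-based solver.
import numpy as np
from scipy.optimize import linprog

# Hypothetical payoff matrix for the row player (rows = row player's pure strategies).
A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])   # matching pennies
m, n = A.shape

# Decision variables: x_1..x_m (mixed strategy) and v (game value).
# Maximize v  <=>  minimize -v.
c = np.concatenate([np.zeros(m), [-1.0]])

# For every column j of A:  v - sum_i x_i A[i, j] <= 0.
A_ub = np.hstack([-A.T, np.ones((n, 1))])
b_ub = np.zeros(n)

# Probabilities sum to one.
A_eq = np.concatenate([np.ones(m), [0.0]]).reshape(1, -1)
b_eq = np.array([1.0])

bounds = [(0, None)] * m + [(None, None)]   # x >= 0, v free

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")
print("optimal mixed strategy:", res.x[:m])   # [0.5, 0.5]
print("value of the game:", res.x[m])         # 0.0
```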
This book focuses largely on constrained optimization. It begins with a substantial treatment of linear programming and proceeds to convex analysis, network flows, integer programming, quadratic programming, and convex optimization. Along the way, dynamic programming and the linear complementarity problem are touched on as well. The book is intended as a first introduction to the topic. Specific examples and concrete algorithms precede more abstract topics. Nevertheless, the topics covered are developed in some depth, a large number of numerical examples are worked out in detail, and many recent results are included, most notably interior-point methods. The exercises at the end of each chapter both illustrate the theory and, in some cases, extend it. Optimization is not merely an intellectual exercise: its purpose is to solve practical problems on a computer. Accordingly, the book comes with software that implements the major algorithms studied. At this point, software for the following four algorithms is available: the two-phase simplex method, the primal-dual simplex method, the path-following interior-point method, and the homogeneous self-dual method.
The first comprehensive account of the theory of mass transportation problems and its applications. In Volume I, the authors systematically develop the theory with emphasis on the Monge-Kantorovich mass transportation and the Kantorovich-Rubinstein mass transshipment problems. They then discuss a variety of different approaches towards solving these problems and exploit the rich interrelations to several mathematical sciences - from functional analysis to probability theory and mathematical economics. The second volume is devoted to applications of the above problems to topics in applied probability, the theory of moments and distributions with given marginals, queuing theory, risk theory, and the theory of probability metrics and its applications to various fields, among them general limit theorems for Gaussian and non-Gaussian limiting laws, stochastic differential equations and algorithms, and rounding problems. Useful to graduates and researchers in theoretical and applied probability, operations research, computer science, and mathematical economics, the prerequisites for this book are graduate level probability theory and real and functional analysis.
The 5th edition of this classic textbook covers the central concepts of practical optimization techniques, with an emphasis on methods that are both state-of-the-art and popular. One major insight is the connection between the purely analytical character of an optimization problem and the behavior of algorithms used to solve that problem. End-of-chapter exercises are provided for all chapters. The material is organized into three separate parts. Part I offers a self-contained introduction to linear programming. The presentation in this part is fairly conventional, covering the main elements of the underlying theory of linear programming, many of the most effective numerical algorithms, and many of its important special applications. Part II, which is independent of Part I, covers the theory of unconstrained optimization, including both derivations of the appropriate optimality conditions and an introduction to basic algorithms. This part of the book explores the general properties of algorithms and defines various notions of convergence. In turn, Part III extends the concepts developed in the second part to constrained optimization problems. Except for a few isolated sections, this part is also independent of Part I. As such, Parts II and III can easily be used without reading Part I and, in fact, the book has been used in this way at many universities. New to this edition are popular topics in data science and machine learning, such as the Markov Decision Process, Farkas' lemma, convergence speed analysis, duality theories and applications, various first-order methods, stochastic gradient method, mirror-descent method, Frank-Wolfe method, ALM/ADMM method, interior trust-region method for non-convex optimization, distributionally robust optimization, online linear programming, semidefinite programming for sensor-network localization, and infeasibility detection for nonlinear optimization.
Encompassing all the major topics students will encounter in courses on the subject, the authors teach both the underlying mathematical foundations and how these ideas are implemented in practice. They illustrate all the concepts with both worked examples and plenty of exercises, and, in addition, provide software so that students can try out numerical methods and so hone their skills in interpreting the results. As a result, this is an ideal textbook for all those coming to the subject for the first time. Authors' note: A problem recently found with the software is due to a bug in Formula One, the third-party commercial software package that was used for the development of the interface. It occurs when the date, currency, etc. format is set to a non-United States version. Please try setting your computer's date/currency option to the United States option. The new version of Formula One, when ready, will be posted on the WWW.
Linear Programming provides an in-depth look at simplex-based as well as the more recent interior point techniques for solving linear programming problems. Starting with a review of the mathematical underpinnings of these approaches, the text provides details of the primal and dual simplex methods with the primal-dual, composite, and steepest edge simplex algorithms. This is then followed by a discussion of interior point techniques, including projective and affine potential reduction, primal and dual affine scaling, and path following algorithms. Also covered is the theory and solution of the linear complementarity problem using both the complementary pivot algorithm and interior point routines. A feature of the book is its early and extensive development and use of duality theory. Audience: The book is written for students in the areas of mathematics, economics, engineering and management science, and professionals who need a sound foundation in the important and dynamic discipline of linear programming.
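As a rough illustration of the two algorithm families the book compares, the sketch below (my own example, not material from the book) solves one small invented LP twice with SciPy's HiGHS backends: once with a dual simplex method and once with an interior point method.

```python
# Assumes SciPy >= 1.6 with the HiGHS solvers; the LP data are invented.
from scipy.optimize import linprog

c = [-3.0, -5.0]                    # maximize 3x + 5y  ==  minimize -3x - 5y
A_ub = [[1.0, 0.0],                 # x       <= 4
        [0.0, 2.0],                 # 2y      <= 12
        [3.0, 2.0]]                 # 3x + 2y <= 18
b_ub = [4.0, 12.0, 18.0]

for method in ("highs-ds", "highs-ipm"):    # dual simplex vs. interior point
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2, method=method)
    print(f"{method}: x = {res.x}, objective = {-res.fun}")
# Both methods report the same optimum, x = (2, 6) with objective value 36.
```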
A comprehensive, up-to-date text on linear programming. Covers all practical modeling, mathematical, geometrical, algorithmic, and computational aspects. Surveys recent developments in the field, including the Ellipsoid method. Includes extensive examples and exercises. Designed for advanced undergraduates or graduates majoring in engineering, mathematics, or business administration.
In Linear Programming: A Modern Integrated Analysis, both boundary (simplex) and interior point methods are derived from the complementary slackness theorem and, unlike most books, the duality theorem is derived from Farkas's Lemma, which is proved as a convex separation theorem. The tedium of the simplex method is thus avoided. A new and inductive proof of Kantorovich's Theorem is offered, related to the convergence of Newton's method. Of the boundary methods, the book presents the (revised) primal and the dual simplex methods. An extensive discussion is given of the primal, dual and primal-dual affine scaling methods. In addition, the proof of the convergence under degeneracy, a bounded variable variant, and a super-linearly convergent variant of the primal affine scaling method are covered in one chapter. Polynomial barrier or path-following homotopy methods, and the projective transformation method are also covered in the interior point chapter. Besides the popular sparse Cholesky factorization and the conjugate gradient method, new methods are presented in a separate chapter on implementation. These methods use LQ factorization and iterative techniques.
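For readers unfamiliar with the starting point of that development, one common statement of Farkas's Lemma and the primal-dual pair it underlies can be written as follows (generic textbook notation, not necessarily the book's):

```latex
% Farkas's Lemma (one common form): for A in R^{m x n} and b in R^m,
% exactly one of the two systems below has a solution.
\begin{align*}
\text{(I)}\quad  & \exists\, x \ge 0 \ \text{such that}\ Ax = b, \\
\text{(II)}\quad & \exists\, y \ \text{such that}\ A^{\top}y \ge 0 \ \text{and}\ b^{\top}y < 0 .
\end{align*}
% Applied to the primal-dual pair
\begin{align*}
\text{(P)}\quad \min\ c^{\top}x \ \ \text{s.t.}\ Ax = b,\ x \ge 0,
\qquad
\text{(D)}\quad \max\ b^{\top}y \ \ \text{s.t.}\ A^{\top}y \le c,
\end{align*}
% the lemma yields the strong duality theorem: if either problem has an optimal
% solution, then both do, and their optimal values coincide.
```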
The effectiveness of the algorithms of linear programming in solving problems is largely dependent upon the particular applications from which these problems arise. A first course in linear programming should not only allow one to solve many different types of problems in many different contexts but should also provide deeper insight into the fields in which linear programming finds its utility. To this end, the emphasis throughout Linear Programming and Its Applications is on the acquisition of linear programming skills via the algorithmic solution of small-scale problems, both in the general sense and in the specific applications where these problems naturally occur. The first part of the book deals with methods for solving general linear programming problems and discusses the theory of duality which connects these problems. The second part of the book deals with linear programming in different applications, including the fields of game theory and graph theory as well as the more traditional transportation and assignment problems. The book is versatile; inasmuch as Linear Programming and Its Applications is intended to be used as a first course in linear programming, it is suitable for students in such varying fields as mathematics, computer science, engineering, actuarial science, and economics.
This collection of 188 nonlinear programming test examples is a supplement to the test problem collection published by Hock and Schittkowski [2]. As in the former case, the intention is to present an extensive set of nonlinear programming problems that were used by other authors in the past to develop, test or compare optimization algorithms. There is no distinction between an "easy" or "difficult" test problem, since any related classification must depend on the underlying algorithm and test design. For instance, a nonlinear least squares problem may be solved easily by a special purpose code within a few iterations, but the same problem can be unsolvable for a general nonlinear programming code due to ill-conditioning. Thus one should consider both collections as a possible offer from which to choose suitable problems for a specific test frame. One difference between the new collection and the former one published by Hock and Schittkowski [2] is the attempt to present some more realistic or "real world" problems. Moreover, a couple of nonlinear least squares test problems were collected which can be used, e.g., to test data fitting algorithms. The presentation of the test problems is somewhat simplified, and numerical solutions are computed only by one nonlinear programming code, the sequential quadratic programming algorithm NLPQL of Schittkowski [3]. But both test problem collections are implemented in the same way in the form of special FORTRAN subroutines, so that the same test programs can be used.
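The point about special-purpose versus general-purpose codes can be illustrated with a small, purely hypothetical data-fitting problem (not one from the collection). The sketch below assumes SciPy and poses the same exponential fit once for a dedicated least-squares solver and once for a general unconstrained minimizer.

```python
# Sketch: the same synthetic exponential fit solved by a special-purpose
# least-squares code and by a general-purpose NLP code.
import numpy as np
from scipy.optimize import least_squares, minimize

t = np.linspace(0.0, 4.0, 20)
y = 2.5 * np.exp(-1.3 * t) + 0.05 * np.cos(7 * t)   # synthetic "measurements"

def residuals(p):
    a, b = p
    return a * np.exp(b * t) - y

x0 = np.array([1.0, -1.0])

# Special-purpose code: Gauss-Newton / Levenberg-Marquardt style treatment.
fit = least_squares(residuals, x0)

# General nonlinear programming code applied to the summed squares.
gen = minimize(lambda p: np.sum(residuals(p) ** 2), x0, method="BFGS")

print("least_squares:", fit.x)   # both should land near (2.5, -1.3)
print("general NLP  :", gen.x)
```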
From the foreword: "This volume contains most of the 113 papers presented during the Eighth International Conference on Analysis and Optimization of Systems organized by the Institut National de Recherche en Informatique et en Automatique. Papers were presented by speakers coming from 21 different countries. These papers deal with both theoretical and practical aspects of Analysis and Optimization of Systems. Most of the topics of System Theory have been covered and five invited speakers of international reputation have presented the new trends of the field."
Uniquely blends mathematical theory and algorithm design for understanding and modeling real-world problems. Optimization modeling and algorithms are key components of problem-solving across various fields of research, from operations research and mathematics to computer science and engineering. Addressing the importance of the algorithm design process, "Deterministic Operations Research" focuses on the design of solution methods for both continuous and discrete linear optimization problems. The result is a clear-cut resource for understanding three cornerstones of deterministic operations research: modeling real-world problems as linear optimization problems; designing the necessary algorithms to solve these problems; and using mathematical theory to justify algorithmic development. Treating real-world examples as mathematical problems, the author begins with an introduction to operations research and optimization modeling that includes applications from sports scheduling and the airline industry. Subsequent chapters discuss algorithm design for continuous linear optimization problems, covering topics such as convexity, Farkas' Lemma, and the study of polyhedra before culminating in a discussion of the Simplex Method. The book also addresses linear programming duality theory and its use in algorithm design, as well as the Dual Simplex Method, Dantzig-Wolfe decomposition, and a primal-dual interior point algorithm. The final chapters present network optimization and integer programming problems, highlighting various specialized topics including label-correcting algorithms for the shortest path problem, preprocessing and probing in integer programming, lifting of valid inequalities, and branch and cut algorithms. Concepts and approaches are introduced by outlining examples that demonstrate and motivate theoretical concepts. The accessible presentation of advanced ideas makes core aspects easy to understand and encourages readers to understand how to think about the problem, not just what to think. Relevant historical summaries can be found throughout the book, and each chapter is designed as the continuation of the "story" of how to both model and solve optimization problems by using the specific problems - linear and integer programs - as guides. The book's various examples are accompanied by the appropriate models and calculations, and a related Web site features these models along with Maple(TM) and MATLAB(R) content for the discussed calculations. Thoroughly class-tested to ensure a straightforward, hands-on approach, "Deterministic Operations Research" is an excellent book for operations research and linear optimization courses at the upper-undergraduate and graduate levels. It also serves as an insightful reference for individuals working in the fields of mathematics, engineering, computer science, and operations research who use and design algorithms to solve problems in their everyday work.
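As a flavor of one of the specialized topics listed above, here is a minimal, generic sketch of a label-correcting shortest path method (a Bellman-Ford-style candidate-list scheme) on an invented graph; it is illustrative only and is not code from the book or its companion site.

```python
# Sketch: generic label-correcting shortest path method (assumes no negative cycles).
from collections import deque

def label_correcting(nodes, arcs, source):
    """arcs: iterable of (tail, head, cost); returns shortest-path labels from source."""
    dist = {v: float("inf") for v in nodes}
    dist[source] = 0.0
    out = {v: [] for v in nodes}
    for u, v, c in arcs:
        out[u].append((v, c))
    queue = deque([source])                 # candidate list of nodes to re-examine
    in_queue = {v: False for v in nodes}
    in_queue[source] = True
    while queue:
        u = queue.popleft()
        in_queue[u] = False
        for v, c in out[u]:
            if dist[u] + c < dist[v]:       # correct the label on node v
                dist[v] = dist[u] + c
                if not in_queue[v]:
                    queue.append(v)
                    in_queue[v] = True
    return dist

nodes = ["s", "a", "b", "t"]
arcs = [("s", "a", 4), ("s", "b", 1), ("b", "a", 2), ("a", "t", 1), ("b", "t", 6)]
print(label_correcting(nodes, arcs, "s"))   # {'s': 0.0, 'a': 3.0, 'b': 1.0, 't': 4.0}
```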