In light of recent alarming environmental trends combined with the increasing commercial viability of fuel cells, the time is propitious for a book focusing on the systematic aspects of fuel cell plant technology. This multidisciplinary text covers the main types of fuel cells, R&D issues, plant design and construction, and economic factors, providing industrial and academic researchers working in electrical systems design, electrochemistry, and engineering with a unique and comprehensive resource.
This book is an introduction to health care as a complex adaptive system, a system that feeds back on itself. The first section introduces systems and complexity theory from a science, historical, epistemological, and technical perspective, describing the principles and mathematics. Subsequent sections build on the health applications of systems science theory, from human physiology to medical decision making, population health and health services research. The aim of the book is to introduce and expand on important population health issues from a systems and complexity perspective, highlight current research developments and their implications for health care delivery, consider their ethical implications, and to suggest directions for and potential pitfalls in the future.
Optimization is of central concern to a number of disciplines. Operations Research and Decision Theory are often considered to be identical with optimization. But also in other areas such as engineering design, regional policy, logistics and many others, the search for optimal solutions is one of the prime goals. The methods and models which have been used over the last decades in these areas have primarily been "hard" or "crisp," i.e. the solutions were considered to be either feasible or unfeasible, either above a certain aspiration level or below. This dichotomous structure of methods very often forced the modeller to approximate real problem situations of the more-or-less type by yes-or-no-type models, the solutions of which might turn out not to be the solutions to the real problems. This is particularly true if the problem under consideration includes vaguely defined relationships, human evaluations, uncertainty due to inconsistent or incomplete evidence, if natural language has to be modelled or if state variables can only be described approximately. Until recently, everything which was not known with certainty, i.e. which was not known to be either true or false or which was not known to either happen with certainty or to be impossible to occur, was modelled by means of probabilities. This holds in particular for uncertainties concerning the occurrence of events.
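The contrast drawn above between crisp (yes-or-no) and fuzzy (more-or-less) feasibility can be sketched in a few lines. This is an illustrative example, not taken from the book; the constraint, limit, and tolerance values are invented:

```python
# Illustrative sketch: a crisp constraint admits only 0/1 feasibility,
# while a fuzzy constraint grades it with a membership value in [0, 1].

def crisp_feasible(x, limit=10.0):
    """Classic yes-or-no feasibility: x <= limit or not."""
    return 1.0 if x <= limit else 0.0

def fuzzy_feasible(x, limit=10.0, tolerance=2.0):
    """Graded feasibility for 'x should be more or less below limit':
    fully feasible below the limit, decreasing linearly over the
    tolerance band, infeasible beyond it."""
    if x <= limit:
        return 1.0
    if x >= limit + tolerance:
        return 0.0
    return 1.0 - (x - limit) / tolerance

# x = 10.5 is infeasible in the crisp model but 75% acceptable
# in the fuzzy one
print(crisp_feasible(10.5), fuzzy_feasible(10.5))  # 0.0 0.75
```

The membership function replaces the dichotomous feasible/unfeasible split with a degree of satisfaction, which is exactly the modeling freedom the crisp methods lack.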
The idea of optimization runs through most parts of control theory. The simplest optimal controls are preplanned (programmed) ones. The problem of constructing optimal preplanned controls has been extensively worked out in the literature (see, e.g., the Pontrjagin maximum principle giving necessary conditions of preplanned control optimality). However, the concept of optimality itself has a restrictive character: it is limited by what one means under optimality in each separate case. The internal contradictoriness of preplanned control optimality ("the better is the enemy of the good") means that the practical significance of optimal preplanned controls proves to be not great: such controls are usually sensitive to unregistered disturbances (including the round-off errors which are inevitable when computer devices are used for forming controls), as there is an effect of disturbance accumulation in the control process which renders such controls of little use on large time intervals. This gap is mainly provoked by oversimplified settings of optimization problems. The outstanding result of control theory established at the end of the first half of the twentieth century is that controls in feedback form ensure the weak sensitivity of closed loop systems with respect to "small" unregistered internal and external disturbances acting in them (here we do not need to discuss performance indexes, since the considered phenomenon is of general nature). But by far not all optimal preplanned controls can be represented in a feedback form.
A predictive control algorithm uses a model of the controlled system to predict the system behavior for various input scenarios and determines the most appropriate inputs accordingly. Predictive controllers are suitable for a wide range of systems; their advantages are especially evident when dealing with relatively complex systems, such as nonlinear, constrained, hybrid, or multivariable systems. However, designing a predictive control strategy for a complex system is generally a difficult task, because all relevant dynamical phenomena have to be considered. Establishing a suitable model of the system is an essential part of predictive control design. Classic modeling and identification approaches based on linear-systems theory are generally inappropriate for complex systems; hence, models that are able to appropriately capture complex dynamical properties have to be employed in a predictive control algorithm. This book first introduces some modeling frameworks, which can encompass the most frequently encountered complex dynamical phenomena and are practically applicable in the proposed predictive control approaches. Furthermore, unsupervised learning methods that can be used for complex-system identification are treated. Finally, several useful predictive control algorithms for complex systems are proposed and their particular advantages and drawbacks are discussed. The presented modeling, identification and control approaches are complemented by illustrative examples. The book is aimed at researchers and postgraduate students interested in modeling, identification and control, as well as at control engineers needing practically usable advanced control methods for complex systems.
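The predict-then-select loop described above can be sketched in a deliberately tiny form. The scalar model, horizon, and candidate inputs below are invented for illustration; real predictive controllers solve a proper constrained optimization rather than enumerating scenarios:

```python
# A minimal receding-horizon (predictive) control sketch: a model predicts
# the state trajectory for each candidate input sequence, and the first
# input of the best-scoring sequence is applied.

import itertools

def predict(x, inputs, a=0.9, b=0.5):
    """Simple scalar model x[k+1] = a*x[k] + b*u[k]; returns the trajectory."""
    traj = []
    for u in inputs:
        x = a * x + b * u
        traj.append(x)
    return traj

def mpc_step(x, target=0.0, horizon=3, candidates=(-1.0, 0.0, 1.0)):
    """Enumerate input scenarios over the horizon, score the predicted
    deviation from the target, and return the first input of the best one."""
    best_u, best_cost = None, float("inf")
    for seq in itertools.product(candidates, repeat=horizon):
        cost = sum((xk - target) ** 2 for xk in predict(x, seq))
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

# Closed loop: the optimization is repeated at every step (receding horizon)
x = 4.0
for _ in range(10):
    u = mpc_step(x)
    x = 0.9 * x + 0.5 * u
print(round(x, 3))  # state driven toward the target 0.0
```

Only the first input of the winning sequence is applied before re-planning, which is what makes the scheme a feedback law rather than a preplanned control.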
Model-based fuzzy control uses a given conventional or fuzzy open-loop model of the plant under control in order to derive the set of fuzzy if-then rules constituting the corresponding fuzzy controller. Furthermore, of central interest are the consequent stability, performance, and robustness analysis of the resulting closed loop system involving a conventional model and a fuzzy controller, or a fuzzy model and a fuzzy controller. The major objective of model-based fuzzy control is to use the full available range of existing linear and nonlinear design techniques to obtain fuzzy controllers which have better stability, performance, and robustness properties than the corresponding non-fuzzy controllers designed by the use of these same techniques.
The problem of controlling or stabilizing a system of differential equations in the presence of random disturbances is intuitively appealing and has been a motivating force behind a wide variety of results grouped loosely together under the heading of "Stochastic Control." This book is concerned with a special instance of this general problem, the "Adaptive LQ Regulator," which is a stochastic control problem of partially observed type that can, in certain cases, be solved explicitly. We first describe this problem, as it is the focal point for the entire book, and then describe the contents of the book. The problem revolves around an uncertain linear system

    dx/dt = A_theta x + B_theta u,   x(0) = x_theta in R^n,

where theta in {1, ..., N} is a random variable representing this uncertainty and (A_j, B_j, C_j) and x_j are the coefficient matrices and initial state, respectively, of a linear control system, for each j = 1, ..., N. A common assumption is that the mechanism causing this uncertainty is additive noise, and that consequently the "controller" has access only to the observation process y(.) where y = C_theta x + xi.
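The partially observed setup described above can be sketched as a simulation. Everything below (the two candidate scalar models, the noise level, the naive output feedback standing in for the LQ law) is illustrative, not from the book:

```python
# Sketch of the setup: a random index theta picks the true system among
# N candidate models (A_j, B_j, C_j) with initial state x_j, and the
# controller sees only the noisy output y = C_theta * x + xi.

import random

# N = 2 candidate scalar models, each a tuple (A_j, B_j, C_j, x_j)
models = [(0.8, 1.0, 1.0, 2.0), (1.1, 0.5, 1.0, -1.0)]

random.seed(0)
theta = random.randrange(len(models))   # the uncertainty; hidden in practice
A, B, C, x = models[theta]

def observe(x, noise_std=0.1):
    """Observation process y = C*x + xi with additive Gaussian noise."""
    return C * x + random.gauss(0.0, noise_std)

# The controller only ever handles y(.), never x or theta directly
trajectory = []
for _ in range(5):
    y = observe(x)
    u = -0.5 * y        # naive output feedback, a stand-in for the LQ law
    x = A * x + B * u
    trajectory.append(round(y, 3))
print(trajectory)
```

The adaptive problem the book solves is precisely to do better than such a fixed feedback: to infer theta from the observation process while regulating the state.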
Prediction of the behavior of dynamical systems, and the analysis and modeling of their structure, are vitally important problems in engineering, economy and science today. Examples of such systems can be seen in the world around us and, of course, in almost every scientific discipline, including such "exotic" domains as the earth's atmosphere, turbulent fluids, economies (exchange rates and stock markets), population growth, physics (control of plasma), information flow in social networks and its dynamics, chemistry and complex networks. To understand such dynamics and to use them in research or industrial applications, it is important to create models of them. For this purpose there is a rich spectrum of methods, from classical ones like ARMA models or the Box-Jenkins method to modern ones like evolutionary computation, neural networks, fuzzy logic, fractal geometry, deterministic chaos and more. This proceedings volume is a collection of the accepted papers of the Nostradamus conference held in Ostrava, Czech Republic. It also comprises outstanding keynote speeches by distinguished guest speakers: Guanrong Chen (Hong Kong), Miguel A. F. Sanjuan (Spain), Gennady Leonov and Nikolay Kuznetsov (Russia), and Petr Skoda (Czech Republic). The main aim of the conference is to give students, academics and researchers a periodic opportunity to exchange their ideas and novel methods, and to establish a forum for the presentation and discussion of recent trends in the application of various predictive methods.
It is in the area of Systems Diagnosis, Supervision and Control that Knowledge-Based Techniques have had their most significant impact in recent years. In this volume, Spyros Tzafestas has ably put together the current state of the art of the application of Artificial Intelligence concepts to problems of Systems Diagnosis. All the authors in this edited work are distinguished, internationally recognized experts on various aspects of Artificial Intelligence and its applications, and the coverage of the field that they provide is both readable and authoritative. The sixteen chapters break down in a natural way into three broad categories: (a) introduction to the applications of Expert Systems in Engineering; (b) knowledge-based systems architectures, models and techniques for fault diagnosis, supervision and real-time control; and finally (c) applications and case studies in three specific areas, namely Manufacturing, Chemical Processes and Communications Networks. The final chapter provides a comprehensive survey of the field with an extensive bibliography. The mix of original scientific articles, tutorial and survey papers makes this collection a very timely and valuable addition to the literature in this important field. MADAN G. SINGH, Professor of Information Engineering at U.M.I.S.T.
Recently, the subject of nonlinear control systems analysis has grown rapidly and this book provides a simple and self-contained presentation of their stability and feedback stabilization which enables the reader to learn and understand major techniques used in mathematical control theory. In particular: the important techniques of proving global stability properties are presented closely linked with corresponding methods of nonlinear feedback stabilization; a general framework of methods for proving stability is given, thus allowing the study of a wide class of nonlinear systems, including finite-dimensional systems described by ordinary differential equations, discrete-time systems, systems with delays and sampled-data systems; approaches to the proof of classical global stability properties are extended to non-classical global stability properties such as non-uniform-in-time stability and input-to-output stability; and new tools for stability analysis and control design of a wide class of nonlinear systems are introduced. The presentational emphasis of Stability and Stabilization of Nonlinear Systems is theoretical but the theory's importance for concrete control problems is highlighted with a chapter specifically dedicated to applications and with numerous illustrative examples. Researchers working on nonlinear control theory will find this monograph of interest while graduate students of systems and control can also gain much insight and assistance from the methods and proofs detailed in this book.
The Engineering of Complex Real-Time Computer Control Systems brings together in one place important contributions and up-to-date research results in this area. It serves as an excellent reference, providing insight into some of the most important research issues in the field.
This book is a venture in the worlds of modeling and of metamodeling. At this point, I will not reveal to readers what constitutes metamodeling. Suffice it to say that the pitfalls and shortcomings of modeling can be cured only if we resort to a higher level of inquiry called metainquiry and metadesign. We reach this level by the process of abstraction. The book contains five chapters from my previous work, Applied General Systems Theory (Harper and Row, London and New York, First Edition 1974, Second Edition 1978). More than ten years after its publication, this material still appears relevant to the main thrust of system design. This book is dedicated to all those who are involved in changing the world for the better. In a way we all are involved in system design: from the city manager who struggles with the problems of mass transportation or the consolidation of a city and its suburbs to the social worker who tries to provide benefits to the urban poor. It includes the engineer who designs the shuttle rockets. It involves the politician engaged in drafting a bill to recycle containers, or one to prevent pesticide contamination of our food. The politician might even need system design to chart his or her own re-election campaign.
What are the relations between the shape of a system of cities and that of a fish school? Which events must happen in a cell in order for it to become part of one of the fingers of our hands? How should we interpret the shape of a sand dune? This collective book, written for the non-specialist, addresses these questions and, more generally, the fundamental issue of the emergence of forms and patterns in physical and living systems. It is a single book gathering the different aspects of morphogenesis and the approaches to shape and pattern formation developed in different disciplines. Relying on the seminal works of D'Arcy Thompson, Alan Turing and Rene Thom, it confronts major examples like plant growth and shape, intra-cellular organization, the evolution of living forms, and motifs generated by crystals. The book is essential for understanding the universal principles at work in the shapes and patterns surrounding us, and also for avoiding spurious analogies.
The International Symposium on Generalized Functions and Their Applications was organized by the Department of Mathematics, Banaras Hindu University, and held December 23-26, 1991, on the occasion of the Platinum Jubilee Celebration of the university. More than a hundred mathematicians from ten countries participated in the deliberations of the symposium. Thirty lectures were delivered on a variety of topics within the area. The contributions to the proceedings of the symposium are, with a few exceptions, expanded versions of the lectures delivered by the invited speakers. The survey papers by Komatsu and Hoskins and Sousa Pinto provide an up-to-date account of the theory of hyperfunctions, ultradistributions and microfunctions, and the nonstandard theory of new generalized functions, respectively; those by Stankovic and Kanwal deal with structures and asymptotics. Choquet-Bruhat's work studies generalized functions on manifolds and gives applications to shocks and discrete models. The other contributions relate to contemporary problems and achievements in theory and applications, especially in the theory of partial differential equations, differential geometry, mechanics, mathematical physics, and systems science. The proceedings give a very clear impression of the present state of the art in this field and contain many challenges, ideas, and open problems. The volume will be helpful to a broad spectrum of readers, from graduate students to mathematical researchers.
Developments in electronic hardware, particularly microprocessors and solid-state cameras, have resulted in a vast explosion in the range and variety of applications to which intelligent processing may be applied to yield cost-effective automation. Typical examples include automated visual inspection and repetitive assembly. The technology required is recent and specialized, and is thus not widely known. VISION AND INFORMATION PROCESSING FOR AUTOMATION has arisen from a short course given by the authors to introduce potential users to the technology. Its content is a development and extension of material presented in the course. The objective of the book is to introduce readers to modern concepts and techniques basic to intelligent automation, and explain how these are applied to practical problems. Its emphasis is on machine vision. Intelligent instrumentation is concerned with processing information, and an appreciation of the nature of information is essential in configuring instrumentation to handle it efficiently. An understanding of the fundamental principles of efficient computation and of the way in which machines make decisions is vital for the same reasons. Selection of appropriate sensing (e.g., camera type and configuration), of illumination, of hardware for processing (microchip or parallel processor?) to give most effective information flow, and of the most appropriate processing algorithms is critical in obtaining an optimal solution. Analysis of performance, to demonstrate that requirements have been met, and to identify the causes if they have not, is also important. All of these topics are covered in this volume.
Stochastic Averaging and Extremum Seeking treats methods inspired by attempts to understand the seemingly non-mathematical question of bacterial chemotaxis and their application in other environments. The text presents significant generalizations on existing stochastic averaging theory developed from scratch and necessitated by the need to avoid violation of previous theoretical assumptions by algorithms which are otherwise effective in treating these systems. Coverage is given to four main topics. Stochastic averaging theorems are developed for the analysis of continuous-time nonlinear systems with random forcing, removing prior restrictions on nonlinearity growth and on the finiteness of the time interval. The new stochastic averaging theorems are usable not only as approximation tools but also for providing stability guarantees. Stochastic extremum-seeking algorithms are introduced for optimization of systems without available models. Both gradient- and Newton-based algorithms are presented, offering the user the choice between the simplicity of implementation (gradient) and the ability to achieve a known, arbitrary convergence rate (Newton). The design of algorithms for non-cooperative/adversarial games is described. The analysis of their convergence to Nash equilibria is provided. The algorithms are illustrated on models of economic competition and on problems of the deployment of teams of robotic vehicles. Bacterial locomotion, such as chemotaxis in E. coli, is explored with the aim of identifying two simple feedback laws for climbing nutrient gradients. Stochastic extremum seeking is shown to be a biologically-plausible interpretation for chemotaxis. For the same chemotaxis-inspired stochastic feedback laws, the book also provides a detailed analysis of convergence for models of nonholonomic robotic vehicles operating in GPS-denied environments. 
The book contains block diagrams and several simulation examples, including examples arising from bacterial locomotion, multi-agent robotic systems, and economic market models. Stochastic Averaging and Extremum Seeking will be informative for control engineers from backgrounds in electrical, mechanical, chemical and aerospace engineering and to applied mathematicians. Economics researchers, biologists, biophysicists and roboticists will find the applications examples instructive.
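The gradient-based extremum-seeking idea mentioned above (optimizing a system without an available model, using only measurements) can be shown in a toy form. A deterministic sinusoidal probe is used here for brevity, whereas the book's algorithms use stochastic excitation; the map J and all parameter values are invented for illustration:

```python
# Toy gradient-based extremum seeking: the algorithm has no model of J and
# sees only its measured values, yet climbs toward the maximum by
# perturbing its estimate and demodulating the response.

import math

def J(theta):
    """Map unknown to the algorithm; maximum value 5 at theta = 3."""
    return 5.0 - (theta - 3.0) ** 2

theta_hat = 0.0                           # parameter estimate
a, k, omega, dt = 0.2, 1.0, 50.0, 0.001   # probe amplitude, gain, frequency, step

for step in range(50000):
    t = step * dt
    probe = a * math.sin(omega * t)
    y = J(theta_hat + probe)              # only measurements of J are available
    # demodulation: on average, sin(omega*t) * y is proportional to dJ/dtheta
    theta_hat += dt * k * math.sin(omega * t) * y

print(round(theta_hat, 1))  # settles near the optimizer theta = 3
```

Averaging theory is exactly what justifies this loop: the averaged dynamics of theta_hat follow (k*a/2) * dJ/dtheta, a gradient ascent, even though no gradient is ever computed explicitly.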
The University of Genoa - Ohio State University Joint Conference on New Trends in Systems Theory was held at the Badia di S. Andrea in Genoa on July 9-11, 1990. This Proceedings volume contains articles based on two of the three Plenary talks and most of the shorter presentations. The papers are arranged by author, and no attempt has been made to organize them by topic. We would like to thank the members of the Scientific Committee and of the Program Committee, the speakers and authors, and everyone who attended the conference. Approximately 120 researchers and students from all over the world visited Genoa for the meeting, representing a wide spectrum of areas in pure and applied control and systems theory. The success of the conference depended on their high level of scientific and engineering expertise, not to mention their enthusiasm. The Conference on New Trends in Systems Theory would not have been possible without the help of a great many institutions and people. We would like to thank the University of Genoa, particularly Professor Enrico Beltrametti, and the Ohio State University's Columbian Quincentenary Committee led by Professor Christian Zacher, for encouragement and financial assistance. The University of Genoa Mathematics Department and Communication, Computer and System Sciences Department supplied assistance and technical help. The staff of the Consorzio Genova Ricerche, particularly Ms. Piera Ponta and Ms. Camilla Marconi, worked diligently over many months and especially during the conference itself to ensure a smooth and enjoyable meeting.
Since the beginning of the sixties, control theorists have developed a large body of knowledge concerning complex or large-scale systems theory. Using the state space approach, their purpose was to extend methods to cope with the increasingly sophisticated automation needs of man-made systems. Despite several remarkable contributions, and some successful applications, it can be stated that this theory has not yet become an engineering tool. On the other hand, the emergence of cheap and reliable microprocessors has profoundly transformed industrial instrumentation and control systems. Process control equipment is organized in multilevel distributed structures, closely related to the concepts introduced by complex systems control theory. This similarity should favor a fruitful intersection for practical applications. However, a gap still exists between the literature on control theory and the world of technological achievements. Of the many books on complex systems, few have given attention to the technological aspects of a practical control problem. The present book is an attempt to fill this gap. To do this, it consistently reflects the viewpoints that theory and technology are two indivisible facets of the same problem, and that on-line implementation for real-time applications is the ultimate goal of a control study.
In 1978, when the book Living Systems was published, it contained the prediction that the sciences that were concerned with the biological and social sciences would, in the future, be stated as rigorously as the "hard sciences" that study such nonliving phenomena as temperature, distance, and the interaction of chemical elements. Principles of Quantitative Living Systems Science, the first of a planned series of three books, begins an attempt to fulfill that prediction. The view that living things are similar to other parts of the physical world, differing only in their complexity, was explicitly stated in the early years of the twentieth century by the biologist Ludwig von Bertalanffy. His ideas could not be published until the end of the war in Europe in the 1940s. Von Bertalanffy was strongly opposed to vitalism, the theory current among biologists at the time that life could only be explained by recourse to a "vital principle" or God. He considered living things to be a part of the natural order, "systems" like atoms and molecules and planetary systems. Systems were described as being made up of a number of interrelated and interdependent parts, but because of the interrelations, the total system became more than the sum of those parts. These ideas led to the development of systems movements, in both Europe and the United States, that included not only biologists but scientists in other fields as well. Systems societies were formed on both continents.
The 21st century is now almost upon us and, whilst this represents a somewhat artificial boundary, it provides an opportunity for reflection upon the changes, and the accelerating pace of change, in our social, economic, and natural environments. These changes and their effects are profound, not least in terms of access to information and communication technologies, at once global in effect and manifest locally. These changes and their consequent demands are reflected in the theme of this volume: Synergy Matters, proceedings from the 6th UK Systems Society International Conference.
Flexible Neuro-Fuzzy Systems is the first professional text about this new class of powerful, flexible fuzzy systems. The author incorporates various flexibility parameters into the construction of neuro-fuzzy systems. This approach dramatically improves their performance, allowing the systems to represent the pattern encoded in data very closely. Flexible Neuro-Fuzzy Systems is the only book that proposes a flexible approach to fuzzy modeling, filling a gap in the existing literature. It introduces new fuzzy systems which outperform previous approaches to system modeling and classification, and has the following features: it provides a framework for the unification, construction and development of neuro-fuzzy systems; presents complete algorithms in a systematic and structured fashion, facilitating understanding and implementation; covers not only advanced topics but also the fundamentals of fuzzy sets; includes problems and exercises following each chapter; illustrates the results on a wide variety of simulations; and provides tools for possible applications in business and economics, medicine and bioengineering, automatic control, robotics and civil engineering.
Twenty five years ago, in 1964, The Operational Research Society's first International Conference (held at Gonville and Caius College, Cambridge) took as its theme "Operational Research and the Social Sciences." The Conference sessions were organised around topics such as: Organisations and Control; Social Effects of Policies; Conflict Resolution; The Systems Concept; Models, Decisions and Operational Research. An examination of the published proceedings (J.R.Lawrence ed., 1966, Operational Research and the Social Sciences, Tavistock, London) reveals a distinct contrast between the types of contribution made by the representatives of the two academic communities involved. Nevertheless, the Conference served to break down some barriers, largely of ignorance about the objects, methods and findings of each concern. In the ensuing twenty five years, although debate has continued about the relationship between OR and the social sciences, mutual understanding has proved more difficult to achieve than many must have hoped for in 1964.
Human-in-the-Loop Simulations is a compilation of articles from experts in the design, development, and use of human-in-the-loop simulations. The first section of the handbook consists of papers on fundamental concepts in human-in-the-loop simulations, such as object-oriented simulation development, interface design and development, and performance measurement. The second section includes papers from researchers who utilized HITL simulations to inform models of cognitive processes to include decision making and metacognition. The last section describes human-in-the-loop processes for complex simulation models in trade space exploration and epidemiological analyses. Human-in-the-Loop Simulations is a useful tool for multiple audiences, including graduate students and researchers in engineering and computer science.
This new edition of the well-established text Scheduling - Theory, Algorithms, and Systems provides up-to-date coverage of important theoretical models in the scheduling literature as well as significant scheduling problems that occur in the real world. It again includes supplementary material in the form of slide-shows from industry and movies that show implementations of scheduling systems. As in the previous edition, the book consists of three parts. The first part focuses on deterministic scheduling and the related combinatorial problems. The second part covers probabilistic scheduling models; in this part it is assumed that processing times and other problem data are random and not known in advance. The third part deals with scheduling in practice; it covers heuristics that are popular with practitioners and discusses system design and implementation issues. All three parts of this new edition have been revamped and streamlined, and the references have been brought completely up to date. Theoreticians and practitioners alike will find this book of interest. Graduate students in operations management, operations research, industrial engineering, and computer science will find the book an accessible and invaluable resource. Scheduling - Theory, Algorithms, and Systems will serve as an essential reference for professionals working on scheduling problems in manufacturing, services, and other environments. Review of the third edition: "This well-established text covers both the theory and practice of scheduling. The book begins with motivating examples and the penultimate chapter discusses some commercial scheduling systems and examples of their implementations." (Mathematical Reviews, 2009)
When analyzing systems with a large number of parameters, the dimension of the original system may present insurmountable difficulties for the analysis. It may then be convenient to reformulate the original system in terms of substantially fewer aggregated variables, or macrovariables. In other words, an original system with an n-dimensional vector of states is reformulated as a system with a vector of dimension much less than n. The aggregated variables are either readily defined and processed, or the aggregated system may be considered as an approximate model for the original system. In the latter case, the operation of the original system can be exhaustively analyzed within the framework of the aggregated model, and one faces the problems of defining the rules for introducing macrovariables, specifying loss of information and accuracy, recovering original variables from aggregates, etc. We consider also in detail the so-called iterative aggregation approach. It constructs an iterative process, at every step of which a macroproblem is solved that is simpler than the original problem because of its lower dimension. Aggregation weights are then updated, and the procedure passes to the next step. Macrovariables are commonly used in coordinating problems of hierarchical optimization.
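The aggregation/disaggregation step at the heart of the approach described above can be sketched as follows. This is an illustrative example only, not the book's algorithm: the groups and within-group shares are invented, and a full iterative aggregation scheme would re-estimate the shares at every step:

```python
# Illustrative sketch of aggregation: an n-dimensional state is compressed
# into macrovariables by summing within fixed groups, and recovered
# approximately by assuming within-group proportions (the aggregation
# weights). Recovery is exact only when the assumed shares match reality.

def aggregate(x, groups):
    """Macrovariable = sum of the original variables in each group."""
    return [sum(x[i] for i in g) for g in groups]

def disaggregate(X, groups, shares):
    """Recover original variables from macrovariables via assumed shares."""
    x = [0.0] * sum(len(g) for g in groups)
    for X_g, g, s in zip(X, groups, shares):
        for i, w in zip(g, s):
            x[i] = X_g * w
    return x

x = [4.0, 6.0, 1.0, 3.0]              # original 4-dimensional state
groups = [[0, 1], [2, 3]]             # two macrovariables
X = aggregate(x, groups)              # -> [10.0, 4.0]
shares = [[0.4, 0.6], [0.25, 0.75]]   # assumed within-group proportions
print(X, disaggregate(X, groups, shares))
```

Here the recovery happens to be exact because the assumed shares match the true proportions; in general the mismatch is precisely the "loss of information and accuracy" the text mentions, and iterative aggregation updates the shares between macroproblem solves to reduce it.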