The emergence of flow control as an attractive new field owes much to breakthroughs in MEMS (micro-electromechanical systems) and related technologies. Instrumenting fluid flows on extremely short length and time scales requires practical control algorithms with provable performance guarantees. Dedicated to this problem, Flow Control by Feedback brings together controller design and fluid mechanics expertise in an exposition of the latest research results. Featuring: exhaustive treatment of flow control core areas, including stabilization and mixing control techniques; self-contained introductory sections on Navier-Stokes equations, linear and nonlinear control, and sensors and MEMS, to make this cross-disciplinary subject accessible; and a comprehensive survey of currently available feedback algorithms for flow control. In response to the intense interest in flow control, this volume will be an essential addition to the library of researchers and graduate students in control theory, fluid mechanics, mathematics and physics. The content structure is ideal for instruction on flow control modules or as supplementary reading in courses on fluid dynamics and infinite-dimensional systems.
The aim of this book is to furnish the reader with a rigorous and detailed exposition of the concepts of control parametrization and the time scaling transformation. It presents computational solution techniques for a special class of constrained optimal control problems as well as applications to some practical examples. The book may be considered an extension of the 1991 monograph A Unified Computational Approach to Optimal Control Problems, by K.L. Teo, C.J. Goh, and K.H. Wong. This publication discusses the development of new theory and computational methods for solving various optimal control problems numerically and in a unified fashion. To keep the book accessible and uniform, it includes those results developed by the authors, their students, and their past and present collaborators. A brief review of methods not covered in this exposition is also included. Knowledge gained from this book may inspire the advancement of new techniques to solve complex problems that arise in the future. This book is intended as a reference for researchers in mathematics, engineering, and other sciences, as well as for graduate students and practitioners who apply optimal control methods in their work. It may also serve as reading material for a graduate-level seminar or as a text for a course in optimal control.
Organized around health and human development, environment and sustainability, and communities and social change, this volume includes agent-based modeling, system dynamics, and network analysis, with introductory framing essays for each section.
This book presents applications of Newton-like and other similar methods to solve abstract functional equations involving fractional derivatives. It focuses on Banach space-valued functions of a real domain - studied for the first time in the literature. Various issues related to the modeling and analysis of fractional order systems continue to grow in popularity, and the book provides a deeper and more formal analysis of selected issues that are relevant to many areas - including decision-making, complex processes, systems modeling and control - and deeply embedded in the fields of engineering, computer science, physics, economics, and the social and life sciences. The book offers a valuable resource for researchers and graduate students, and can also be used as a textbook for seminars on the above-mentioned subjects. All chapters are self-contained and can be read independently. Further, each chapter includes an extensive list of references.
Introduction to Intelligent Simulation of Complex Discrete Systems and Processes: RAO Language focuses on a unique approach to the modeling and simulation of complex systems. The volume considers features of complex systems and processes, their mathematical description, and their modeling. The theoretical foundations of the RAO (Resource-Action-Operation) language, as well as its syntax and use, are given. Examples of simulation models of different complexity levels, related to different fields, are also presented. The RAO intelligent modeling system introduced and described here is unique because: (1) it makes simulation modeling universal across the classes of systems and processes modeled; (2) its models are simple to modify; and (3) it can model complex control systems together with the controlled object (including simulation modeling for on-line control). The RAO tool lets users work in a language very similar to their professional language and frees them from intermediary, supplementary descriptions of the system modeled. In fifteen chapters this volume provides an overview of general modeling trends, thereby guiding the research community in its modeling methods; intelligent simulation modeling is introduced to address complex systems and processes.
The book you hold in your hands is the outcome of the "ISCS 2013: Interdisciplinary Symposium on Complex Systems", held at the historical capital of Bohemia as a continuation of our series of symposia in the science of complex systems. Prague, one of the most beautiful European cities, has its own genius loci. Here, a great number of important discoveries were made and many important scientists spent fruitful and creative years, leaving unforgettable traces. Perhaps the most significant period was the time of Rudolf II, a great supporter of the arts and sciences who attracted a great number of prominent minds to Prague. This trend would continue. Tycho Brahe, Niels Henrik Abel, Johannes Kepler, Bernard Bolzano, Augustin Cauchy, Christian Doppler, Ernst Mach, Albert Einstein and many others followed, developing or expanding fundamental mathematical and physical theories. Thus at the beginning of the 17th century, Kepler formulated here the first two of his three laws of planetary motion on the basis of Tycho Brahe's observations. In the 19th century, nowhere-differentiable continuous functions (of a fractal character) were constructed here by Bolzano, along with a treatise on infinite sets titled "Paradoxes of Infinity" (1851); Weierstrass would later publish a similar function in 1872. In 1842, Doppler, then a professor of mathematics at the Technical University of Prague, first lectured here about the physical effect that would later bear his name. And the epoch-making physicist Albert Einstein, while a chaired professor of theoretical physics at the German University of Prague, arrived at the decisive steps of his later-completed theory of general relativity during the years 1911-1912. In Prague, many famous philosophers and writers also accomplished their works; for instance, the playwright Karel Čapek coined the word "robot" in Prague ("robot" comes from the Czech word "robota", which means "forced labor").
The book outlines selected projects conducted under the supervision of the author. Moreover, it discusses significant relations between Interactive Granular Computing (IGrC) and numerous dynamically developing scientific domains worldwide, along with features characteristic of the author's approach to IGrC. The results presented are a continuation and elaboration of various aspects of Wisdom Technology, initiated and developed in cooperation with Professor Andrzej Skowron. Based on the empirical findings from these projects, the author explores the following areas: (a) understanding the causes of the theory and practice gap problem (TPGP) in complex systems engineering (CSE); (b) generalizing computing models of complex adaptive systems (CAS) (in particular, natural computing models) by constructing an interactive granular computing (IGrC) model of networks of interrelated, interacting complex granules (c-granules), belonging to a single agent and/or to a group of agents; (c) developing methodologies based on the IGrC model to minimize the negative consequences of the TPGP. The book introduces approaches to the above issues using the proposed IGrC model. In particular, the IGrC model refers to the key mechanisms used to control the processes related to the implementation of CSE projects. One of the main aims was to develop a mechanism of IGrC control over computations that model a project's implementation processes, to maximize the chances of its success while minimizing the emerging risks. In this regard, IGrC control is usually performed by means of properly selected project principles, enforced among project participants. These principles are examples of c-granules, expressed by complex vague concepts (themselves represented by c-granules). The c-granules evolve over time (in particular, the meanings of the concepts are also subject to change).
This methodology is illustrated using project principles applied by the author during the implementation of the POLTAX, AlgoTradix, Merix, and Excavio projects outlined in the book.
Codes, Curves, and Signals: Common Threads in Communications is a collection of seventeen contributions from leading researchers in communications. The book provides a representative cross-section of cutting edge contemporary research in the fields of algebraic curves and the associated decoding algorithms, the use of signal processing techniques in coding theory, and the application of information-theoretic methods in communications and signal processing. The book is organized into three parts: Curves and Codes, Codes and Signals, and Signals and Information. Codes, Curves, and Signals: Common Threads in Communications is a tribute to the broad and profound influence of Richard E. Blahut on the fields of algebraic coding, information theory, and digital signal processing. All the contributors have individually and collectively dedicated their work to R. E. Blahut. Codes, Curves, and Signals: Common Threads in Communications is an excellent reference for researchers and professionals.
Our world is composed of systems within systems: the machines we build, the information we share, the organizations we form, and the elements of nature that surround us. Therefore, nearly every field of study and practice embodies behaviors stemming from system dynamics. Yet the study of systems has remained somewhat fragmented along lines of philosophy, methodology, and intention. Many methodologies for analyzing complex systems extend far beyond the traditional framework of deductive evaluation and may thus appear mysterious to the uninitiated. This book seeks to dispel the mysteries of systems analysis by holistically explaining these philosophies, methodologies, and intentions in the context of understanding how all types of systems in our world form and how these systems break. The presentation is made at the level of conceptual understanding, with plenty of figures but no mathematical formulas, for beginning students and interested readers new to studying systems. Through the conceptual understanding provided, students gain a powerful capability to see the hidden behaviors and unexplained consequences in the world around us.
Creative Space summarizes and integrates the various up-to-date approaches of computational intelligence to knowledge and technology creation including the specific novel feature of utilizing the creative abilities of the human mind, such as tacit knowledge, emotions and instincts, and intuition. It analyzes several important approaches of this new paradigm such as the Shinayakana Systems Approach, the organizational knowledge creation theory, in particular SECI Spiral, and the Rational Theory of Intuition - resulting in the concept of Creative Space. This monograph presents and analyzes in detail this new concept together with its ontology - the list and meanings of the analyzed nodes of this space and of the character of transitions linking these nodes.
This proceedings volume contains carefully selected papers presented during the 17th IFIP Conference on System Modelling and Optimization. Optimization theory and practice, optimal control, system modelling, stochastic optimization, and technical and non-technical applications of the existing theory are among the areas most often addressed in the included papers. The main directions are treated, in addition to several survey papers based on invited presentations by leading specialists in the respective fields. The publication provides the state of the art in the area of system theory and optimization and points out several new areas (e.g. fuzzy sets, neural nets) where classical optimization topics intersect with computer science methodology.
This three-volume work presents a coherent description of the theoretical and practical aspects of coloured Petri nets (CP-nets). The second volume contains a detailed presentation of the analysis methods for CP-nets. They allow the modeller to investigate dynamic properties of CP-nets.
This book presents the findings of a comparative study of three European metropolitan regions: Vienna, Barcelona and Stockholm. The heart of the work consists of empirical studies carefully designed and developed to identify the main actors and mechanisms supporting technological innovation in each of the metropolitan regions. The authors have also highlighted the similarities and differences across regions and countries, investigating how these came to be and discussing the possible implications. The introductory and concluding chapters were written by Manfred M. Fischer who, assisted by Attila Varga, was also responsible for Chapter 2 on the Metropolitan Region of Vienna. Javier Revilla Diez contributed Chapter 3 on the Barcelona Metropolitan Region. Folke Snickars provided Chapter 4, which examines the Metropolitan Region of Stockholm. All authors have reviewed and commented on the whole contents, so the volume represents a collective endeavour that has been rendered as homogeneous as possible. A particular effort has been made to ensure that the study is based on a common conceptual framework.
This book develops applications of novel generalizations of fuzzy information measures in the field of pattern recognition, medical diagnosis, multi-criteria and multi-attribute decision making and suitability in linguistic variables. The focus of this presentation lies on introducing consistently strong and efficient generalizations of information and information-theoretic divergence measures in fuzzy and intuitionistic fuzzy environment covering different practical examples. The target audience comprises primarily researchers and practitioners in the involved fields but the book may also be beneficial for graduate students.
In the context of life sciences, we are constantly confronted with information that possesses precise semantic values and appears essentially immersed in a specific evolutionary trend. In such a framework, Nature appears, in Monod's words, as a tinkerer characterized by the presence of precise principles of self-organization. However, while Monod was obliged to incorporate his brilliant intuitions into the framework of first-order cybernetics and a theory of information with an exclusively syntactic character such as that defined by Shannon, research advances in recent decades have led not only to the definition of a second-order cybernetics but also to an exploration of the boundaries of semantic information. As H. Atlan states, on a biological level "the function self-organizes together with its meaning". Hence the need to refer to a conceptual theory of complexity and to a theory of self-organization characterized in an intentional sense. There is also a need to introduce, at the genetic level, a distinction between coder and ruler as well as the opportunity to define a real software space for natural evolution. The recourse to non-standard model theory, the opening to a new general semantics, and the innovative definition of the relationship between coder and ruler can be considered, today, among the most powerful theoretical tools at our disposal in order to correctly define the contours of that new conceptual revolution increasingly referred to as metabiology. This book focuses on identifying and investigating the role played by these particular theoretical tools in the development of this new scientific paradigm. Nature "speaks" by means of mathematical forms: we can observe these forms, but they are, at the same time, inside us as they populate our organs of cognition. In this context, the volume highlights how metabiology appears primarily to refer to the growth itself of our instruments of participatory knowledge of the world.
Modeling and Simulation: Theory and Practice provides a comprehensive review of both methodologies and applications of simulation and modeling. The methodology section includes such topics as the philosophy of simulation, inverse problems in simulation, simulation model compilers, treatment of ill-defined systems, and a survey of simulation languages. The application section covers a wide range of topics, including applications to environmental management, biology and medicine, neural networks, collaborative visualization and intelligent interfaces. The book consists of 13 invited chapters written by former colleagues and students of Professor Karplus. Also included are several short 'reminiscences' describing Professor Karplus' impact on the professional careers of former colleagues and students who worked closely with him over the years.
Experimental Econophysics describes the method of controlled human experiments, developed by physicists to study problems in economics and finance such as stylized facts, fluctuation phenomena, herd behavior, contrarian behavior, hedge behavior, cooperation, business cycles, partial information, risk management, and stock prediction. Experimental econophysics and empirical econophysics are the two branches of the field of econophysics. The latter has been extensively discussed in existing books, while the former has seldom been touched on; in this book, the author focuses on the branch of experimental econophysics. Empirical econophysics is based on the analysis of data from real markets using statistical tools borrowed from traditional statistical physics. In contrast, inspired by the role of controlled experiments and system modelling (for computer simulations and/or analytical theory) in developing modern physics, experimental econophysics relies on controlled human experiments in the laboratory (producing data for analysis) together with agent-based modelling (for computer simulations and/or analytical theory), with the aim of revealing general cause-effect relationships between specific parameters and emergent properties of real economic/financial markets. This book covers the basic concepts, experimental methods, modelling approaches, and latest progress in the field of experimental econophysics.
The underlying technologies enabling the realization of recent advances in areas like mobile and enterprise computing are artificial intelligence (AI), modeling and simulation, and software engineering. A disciplined, multifaceted, and unified approach to modeling and simulation is now essential in new frontiers, such as Simulation Based Acquisition. This volume is an edited survey of international scientists, academicians, and professionals who present their latest research findings in the various fields of AI; collaborative/distributed computing; and modeling, simulation, and their integration. Whereas some of these areas continue to seek answers to basic fundamental scientific inquiries, new questions have emerged only recently due to advances in computing infrastructures, technologies, and tools. The book's principal goal is to provide a unifying forum for developing postmodern, AI-based modeling and simulation environments and their utilization in both traditional and modern application domains.
Features and topics:
* Blends comprehensive, advanced modeling and simulation theories and methodologies in a presentation founded on formal, system-theoretic and AI-based approaches
* Uses detailed, real-world examples to illustrate key concepts in systems theory, modeling, simulation, object orientation, and intelligent systems
* Addresses a broad range of critical topics in the areas of modeling frameworks, distributed and high-performance object-oriented simulation approaches, as well as robotics, learning, multi-scale and multi-resolution models, and multi-agent systems
* Includes new results pertaining to intelligent and agent-based modeling, the relationship between AI-based reasoning and Discrete-Event System Specification, and large-scale distributed modeling and simulation frameworks
* Provides cross-disciplinary insight into how computer science, computer engineering, and systems engineering can collectively provide a rich set of theories and methods enabling contemporary modeling and simulation
This state-of-the-art survey on collaborative/distributed modeling and simulation computing environments is an essential resource for the latest developments and tools in the field for all computer scientists, systems engineers, and software engineers. Professionals, practitioners, and graduate students will find this reference invaluable to their work involving computer simulation, distributed modeling, discrete-event systems, AI, and software engineering.
1.1 Introduction. This book is written in two major parts. The first part includes the introductory chapters, consisting of Chapters 1 through 6. In part two, Chapters 7-26, we present the applications. This book continues our research into simulating fuzzy systems. We started by investigating the simulation of discrete event fuzzy systems ([7],[13],[14]). These systems can usually be described as queuing networks. Items (transactions) arrive at various points in the system and go into a queue waiting for service. The service stations, each preceded by a queue, are connected, forming a network of queues and service, until the transaction finally exits the system. Examples considered included machine shops, emergency rooms, project networks, bus routes, etc. Analysis of all of these systems depends on parameters like arrival rates and service rates. These parameters are usually estimated from historical data. These estimators are generally point estimators. The point estimators are put into the model to compute system descriptors like the mean time an item spends in the system, or the expected number of transactions leaving the system per unit time. We argued that these point estimators contain uncertainty not shown in the calculations. Our estimators of these parameters become fuzzy numbers, constructed by placing a set of confidence intervals one on top of another. Using fuzzy number parameters in the model makes it into a fuzzy system. The system descriptors we want (time in system, number leaving per unit time) will then be fuzzy numbers.
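The stacking construction described above can be sketched in a few lines: each (1 - alpha) confidence interval for an estimated parameter becomes the alpha-cut of a fuzzy number, so wider intervals sit at lower membership levels. This is a minimal illustration, not the authors' code; the sample data and confidence levels are invented.

```python
import math

# Hypothetical sample of service counts per hour (invented data).
data = [4, 6, 5, 7, 5, 6, 4, 5, 6, 5]
n = len(data)
mean = sum(data) / n
s = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))

# Two-sided standard-normal critical values for a few confidence levels.
z = {0.99: 2.576, 0.95: 1.960, 0.90: 1.645, 0.80: 1.282, 0.50: 0.674}

# Stack the intervals: the (1 - alpha) confidence interval becomes the
# alpha-cut of the fuzzy estimator, so wider intervals sit lower.
fuzzy_estimator = {}
for level, zval in sorted(z.items()):
    half = zval * s / math.sqrt(n)
    alpha = 1.0 - level          # membership height of this cut
    fuzzy_estimator[round(alpha, 2)] = (mean - half, mean + half)

for alpha, (lo, hi) in sorted(fuzzy_estimator.items()):
    print(f"alpha={alpha:.2f}: [{lo:.3f}, {hi:.3f}]")
```

Feeding such a fuzzy parameter through the queuing model (rather than the single point estimate 5.3) is what turns the model's descriptors, such as mean time in system, into fuzzy numbers as well.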
The biennial conferences of the Society for Underwater Technology have achieved an excellent reputation for the quality of their presentations, which cover topics of the most acute current interest, as well as those at the forefront of review and development. The 1994 conference on Subsea Control and Data Acquisition was no exception, covering subjects at the cutting edge of modern technology. It is a matter of increasing concern that products are becoming overspecified, resulting in excessive costs and longer development schedules while not conferring an equivalent benefit in the reliability of the finished product. Subsea Control and Data Acquisition is vital reading for all subsea control system designers, manufacturers and operators, equipment consultants, application engineers, academics in the subsea engineering field, and all subsea engineers.
The theory of finite fields, whose origins can be traced back to the works of Gauss and Galois, has played a part in various branches of mathematics. In recent years there has been a resurgence of interest in finite fields, partly due to important applications in coding theory and cryptography. Applications of Finite Fields introduces some of these recent developments, focusing attention on specific advances in the theory and applications of finite fields. While the topics selected are treated in some depth, Applications of Finite Fields does not attempt to be encyclopedic. Among the topics studied are different methods of representing the elements of a finite field (including normal bases and optimal normal bases), algorithms for factoring polynomials over finite fields, methods for constructing irreducible polynomials, the discrete logarithm problem and its implications for cryptography, the use of elliptic curves in constructing public key cryptosystems, and the uses of algebraic geometry in constructing good error-correcting codes. This book developed from a seminar held at the University of Waterloo, whose purpose was to bridge the knowledge of participants whose expertise and interests ranged from the purely theoretical to the applied. As a result, this book will be of interest to a wide range of students, researchers and practitioners in the disciplines of computer science, engineering and mathematics. Applications of Finite Fields is an excellent reference and may be used as a text for a course on the subject.
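For readers new to the area, the asymmetry behind the discrete logarithm problem mentioned above can be seen with toy parameters; the prime, generator, and exponent below are invented for illustration and are deliberately tiny, whereas real cryptography uses primes hundreds of digits long.

```python
# Toy discrete logarithm in the multiplicative group of GF(19).
# Parameters are illustrative only; real systems use enormous primes.
p, g = 19, 2              # 2 generates the nonzero residues mod 19
x_secret = 13             # the "private" exponent
y = pow(g, x_secret, p)   # easy direction: fast modular exponentiation

# Hard direction (in general): recover x from (g, y, p) by search.
# Brute force is instant here but infeasible at cryptographic sizes.
x = next(k for k in range(1, p) if pow(g, k, p) == y)
print(f"g^x = {y} (mod {p}), recovered x = {x}")
```

Computing y from x takes polynomial time, while no general polynomial-time algorithm is known for the reverse direction; that gap is what the cryptosystems discussed in the book exploit.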
Driven by demands for increased productivity, flexibility, and competitiveness, modern civilization has increasingly created high-performance discrete event dynamic systems (DEDSs). These systems exhibit concurrent, sequential, and competitive activities among their components. They are often complex and large in scale, necessarily flexible, and thus highly capital-intensive. Examples of such systems include manufacturing systems, communication networks, traffic and logistic systems, and military command and control systems. Modeling and performance evaluation play a vital role in the design and operation of such high-performance DEDSs and have thus received widespread attention from researchers over the past two decades. One methodology resulting from this effort is based on timed Petri nets and related graphical and mathematical tools. The popularity that Petri nets have been gaining in the modeling of DEDSs is due to their powerful ability to represent concurrency and synchronization, properties of DEDSs that cannot easily be expressed in traditional formalisms developed for the analysis of 'classical' systems with sequential behaviors. This book systematically introduces the theory and applications of timed Petri nets, presenting many practical applications alongside theoretical developments, together with the latest research results and industrial applications of timed Petri nets. Timed Petri Nets: Theory and Application is intended for researchers and practitioners in the area of Discrete Event Dynamic Systems.
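To give a flavor of the formalism, the following toy sketch simulates a two-transition timed Petri net for a single machine that seizes a raw part and releases it after a processing delay. The net structure, delays, and place names are invented for illustration and are not taken from the book.

```python
# Toy timed Petri net: places hold tokens; a transition consumes its input
# tokens when enabled and deposits its output tokens after a fixed delay.
# The net below (one machine cycling through 3 raw parts) is invented.
places = {"raw": 3, "machine_free": 1, "busy": 0, "done": 0}

# name: (input places, output places, firing delay)
transitions = {
    "start":  ({"raw": 1, "machine_free": 1}, {"busy": 1}, 0.0),
    "finish": ({"busy": 1}, {"done": 1, "machine_free": 1}, 2.0),
}

def enabled(name):
    ins, _, _ = transitions[name]
    return all(places[p] >= k for p, k in ins.items())

clock = 0.0
pending = []  # scheduled token deposits: (completion time, output places)

while True:
    fired = False
    for name in transitions:
        if enabled(name):
            ins, outs, delay = transitions[name]
            for p, k in ins.items():    # consume input tokens now
                places[p] -= k
            pending.append((clock + delay, outs))
            fired = True
    if not fired:
        if not pending:
            break                       # dead marking: simulation over
        pending.sort(key=lambda e: e[0])
        clock, outs = pending.pop(0)    # advance to next completion
        for p, k in outs.items():       # deposit delayed output tokens
            places[p] += k

print(f"all parts done at t={clock}: {places}")
```

Note how the single token in machine_free expresses mutual exclusion, and the token flow expresses synchronization: exactly the concurrency properties that the blurb says are awkward to capture in sequential formalisms.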
Intelligent technical systems, which combine mechanical, electrical and software engineering with control engineering and advanced mathematics, go far beyond the state of the art in mechatronics and open up fascinating perspectives. Among these systems are so-called self-optimizing systems, which are able to adapt their behavior autonomously and flexibly to changing operating conditions. Self-optimizing systems create high value, for example in terms of energy and resource efficiency as well as reliability. The Collaborative Research Center 614 "Self-optimizing Concepts and Structures in Mechanical Engineering" pursued the long-term aim of opening up the active paradigm of self-optimization for mechanical engineering and enabling others to develop self-optimizing systems. This book is directed to researchers and practitioners alike. It provides a design methodology for the development of self-optimizing systems, consisting of a reference process, methods, and tools. The reference process is divided into two phases: the domain-spanning conceptual design, and the domain-specific design and development. For the conceptual design a holistic approach is provided. Domain-specific methods and tools developed especially for the design and development of self-optimizing systems are described and illustrated by application examples. This book will enable the reader to identify the potential for self-optimization and to develop self-optimizing systems independently.
This book highlights current research into virtual tutoring software and presents a case study of the design and application of a social tutor for children with autism. Best practice guidelines for developing software-based educational interventions are discussed, with a major emphasis on facilitating the generalisation of skills to contexts outside of the software itself, and on maintaining these skills over time. Further, the book presents the software solution Thinking Head Whiteboard, which provides a framework for families and educators to create unique educational activities utilising virtual character technology and customised to match learners' needs and interests. In turn, the book describes the development and evaluation of a social tutor incorporating multiple life-like virtual humans, leading to an exploration of the lessons learned and recommendations for the future development of related technologies.