This book presents state-of-the-art solution methods and applications of stochastic optimal control. It is a collection of extended papers discussed at the traditional Liverpool workshop on controlled stochastic processes, with participants from both the East and the West. New problems are formulated, and progress on ongoing research is reported. Topics covered in this book include theoretical results and numerical methods for Markov and semi-Markov decision processes, optimal stopping of Markov processes, stochastic games, problems with partial information, optimal filtering, robust control, Q-learning, and self-organizing algorithms. Real-life case studies and applications, e.g., queueing systems, forest management, control of water resources, marketing science, and healthcare, are presented. Scientific researchers and postgraduate students interested in stochastic optimal control, as well as practitioners, will find this book appealing and a valuable reference.
This book provides a coherent framework for understanding the essence of complex systems and the nature of digital transformations, analyzes challenges of and patterns in innovative development, and shares a wealth of insights and best practices, resulting in the most extensive coverage of the topic available. In particular, the book's cutting-edge contributions, prepared by scientists, engineers, and field experts, focus on the design, implementation, and evaluation of practical interventions that promote the innovative and sustainable development of complex systems. In addition to sharing a rich collection of cases from around the world, they provide a broad interdisciplinary analysis of collaboration mechanisms, theories and approaches to support and accelerate the development of complex systems.
This book addresses the recently discovered advantages of amorphous forms of medicines/pharmaceutical products, which have focused a significant part of industry-related efforts on the glass forming ability (GFA) and the dependence of the glass temperature Tg on pressure. [Figure 1: the pressure evolution of the glass temperature in glycerol; the solid curve shows the parameterization of experimental data via the novel, modified Simon-Glatzel-type equation given in the figure.]
Only a few books stand as landmarks in social and scientific upheaval. Norbert Wiener's classic is one in that small company. Founder of the science of cybernetics--the study of the relationship between computers and the human nervous system--Wiener was widely misunderstood as one who advocated the automation of human life. As this book reveals, his vision was much more complex and interesting. He hoped that machines would release people from relentless and repetitive drudgery in order to achieve more creative pursuits. At the same time he realized the dangers of dehumanization and displacement. His book examines the implications of cybernetics for education, law, language, science, and technology, as he anticipates the enormous impact--in effect, a third industrial revolution--that the computer has had on our lives.
This book presents a differential geometric method for designing nonlinear observers for multiple types of nonlinear systems, including single and multiple outputs, fully and partially observable systems, and regular and singular dynamical systems. It is an exposition of achievements in nonlinear observer normal forms. The book begins by discussing linear systems, introducing the concept of observability and observer design, and then explains the difficulty of those problems for nonlinear systems. After providing foundational information on the differential geometric method, the text shows how to use the method to address observer design problems. It presents methods for a variety of systems. The authors employ worked examples to illustrate the ideas presented. Observer Design for Nonlinear Dynamical Systems will be of interest to researchers, graduate students, and industrial professionals working with control of mechanical and dynamical systems.
This book presents the best selected research papers from Innovation in Sustainable Energy and Technology India (ISET 2020), organized by Energy Institute Bangalore (a unit of RGIPT, an Institute of National Importance), India, during 3-4 December 2020. The book covers various topics of sustainable energy and technologies, which include renewable energy (solar photovoltaic, solar thermal and CSP, biomass, wind energy, micro hydro power, hydrogen energy, geothermal energy, energy materials, energy storage, hybrid energy), smart energy systems (electric vehicles, cybersecurity, charging infrastructures, IoT & AI, waste management, PHEV (CNG/EV)), and mobility (smart grids, IoT & AI, energy-efficient buildings, smart agriculture).
This book presents the latest research on applications of artificial intelligence and the Internet of Things in renewable energy systems. Advanced renewable energy systems must necessarily involve the latest technologies, such as artificial intelligence and the Internet of Things, to develop low-cost, smart and efficient solutions. Intelligence allows the system to optimize the power, thereby making it a power-efficient system, whereas the Internet of Things frees the system from wiring and adds flexibility in operation. As a result, intelligent and IoT paradigms are finding increasing applications in the study of renewable energy systems. This book presents advanced applications of artificial intelligence and the Internet of Things in renewable energy systems development. It covers such topics as solar energy systems and electric vehicles. In all these areas, applications of artificial intelligence methods such as artificial neural networks, genetic algorithms, fuzzy logic, and combinations of the above, called hybrid systems, are included. The book is intended for a wide audience ranging from the undergraduate level up to the research academic and industrial communities engaged in the study and performance prediction of renewable energy systems.
This book illustrates how modern mathematical wavelet transform techniques offer fresh insights into the complex behavior of neural systems at different levels: from the microscopic dynamics of individual cells to the macroscopic behavior of large neural networks. It also demonstrates how and where wavelet-based mathematical tools can provide an advantage over classical approaches used in neuroscience. The authors describe both single-neuron and population neural recordings. This 2nd edition discusses novel areas and significant advances resulting from experimental techniques and computational approaches developed since 2015, and includes three new topics: detection of fEPSPs in multielectrode LFP recordings; analysis of visual sensory processing in the brain and BCI for human attention control; and analysis and real-time classification of motor-related EEG patterns. The book is a valuable resource for neurophysiologists and physicists familiar with nonlinear dynamical systems and data processing, as well as for graduate students specializing in these and related areas.
This book deals with the autoregressive method for digital processing of random oscillations. The method is based on a one-to-one transformation of the numeric factors of the Yule series model to the characteristics of a linear elastic system. This parametric approach made it possible to develop a formal processing procedure that goes from the experimental data to estimates of the logarithmic decrement and natural frequency of random oscillations. A straightforward mathematical description of the procedure makes it possible to optimize the discretization of oscillation realizations, providing efficient estimates. The derived analytical expressions for confidence intervals of the estimates enable a priori evaluation of their accuracy. Experimental validation of the method is also provided. Statistical applications for the analysis of mechanical systems arise from the fact that the loads experienced by machinery and various structures often cannot be described by deterministic vibration theory. Therefore, a sufficient description of real oscillatory processes (vibrations) calls for the use of random functions. In engineering practice, linear vibration theory (modeling phenomena by common linear differential equations) is generally used. This theory's fundamental concepts, such as natural frequency, oscillation decrement, and resonance, are credited for its wide use in different technical tasks. In technical applications two types of research task exist: direct and inverse. The former determines the stochastic characteristics of the system output X(t) resulting from a random input process E(t) when the object model is considered known. The direct task makes it possible to evaluate the effect of an operational environment on the designed object and to predict its operation under various loads. The inverse task is aimed at evaluating the object model from the known processes E(t) and X(t), i.e. finding the model (equation) factors. This task is usually met in tests of prototypes to identify (or verify) the model experimentally. To characterize random processes, the notion of a "shaping dynamic system" is commonly used. This concept allows the observed process to be considered as the output of a hypothetical system whose input is stationary Gauss-distributed ("white") noise. Therefore, the process may be exhaustively described in terms of the parameters of that system. In the case of random oscillations, the "shaping system" is an elastic system described by the common second-order differential equation X″(t) + 2hX′(t) + ω₀²X(t) = E(t), where ω₀ = 2π/T₀ is the natural frequency, T₀ is the oscillation period, and h is the damping factor. As a result, the process X(t) can be characterized in terms of the system parameters - the natural frequency and the logarithmic oscillation decrement δ = hT₀ - as well as the process variance. Evaluation of these parameters is carried out by processing experimental data based on frequency- or time-domain representations of the oscillations. It must be noted that the concept of evaluating these parameters did not change much during the last century. For instance, when the spectral density is used, evaluation of the decrement values is linked with bandwidth measurements at the half-power points of the observed oscillations. For a time-domain representation, evaluation of the decrement requires measuring covariance values delayed by a time interval divisible by T₀.
Both estimation procedures are derived from a continuous description of the research phenomena, so the accuracy of the estimates is linked directly to the adequacy of the discrete representation of the random oscillations. This approach is similar to the concept of transforming differential equations into difference equations, with derivatives approximated by corresponding finite differences. The resulting discrete model, being an approximation, features a methodical error which can be decreased but never eliminated. To render such a representation more accurate, it is imperative to decrease the discretization interval and to increase the realization size, which raises the requirements for computing power. The spectral density and covariance function estimates comprise a non-parametric (non-formal) approach. In principle, any non-formal approach is a kind of art, i.e. the results depend on the performer's skills. Due to the interference of subjective factors in spectral or covariance estimates of random signals, the accuracy of results cannot be properly determined or justified. To avoid the abovementioned difficulties, the application of linear time-series models with well-developed procedures for parameter estimation is more advantageous. A method for the analysis of random oscillations using a parametric model that corresponds discretely (with no approximation error) to a linear elastic system is developed and presented in this book. As a result, a one-to-one transformation of the model's numerical factors to the logarithmic decrement and natural frequency of random oscillations is established. This made it possible to develop a formal processing procedure from experimental data to obtain the estimates of δ and ω₀. The proposed approach allows researchers to replace traditional subjective techniques with a formal processing procedure providing efficient estimates with analytically defined statistical uncertainties.
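A minimal Python sketch of the idea (not the book's own procedure, and with all parameter values assumed for illustration): a random oscillation is simulated as a Yule (AR(2)) series whose factors correspond to an underdamped elastic system, the factors are re-estimated from the realization by least squares, and they are then mapped back to the damping factor, natural frequency, and logarithmic decrement.

```python
import numpy as np

# Hypothetical "true" characteristics of the elastic shaping system (assumed values).
f0 = 5.0                      # natural frequency, Hz
omega0 = 2.0 * np.pi * f0     # rad/s
delta = 0.1                   # logarithmic decrement
h = delta * f0                # damping factor, since delta = h * T0 and T0 = 1/f0
dt = 0.01                     # discretization interval, s
n = 20000                     # realization size

# Exact AR(2) (Yule model) factors corresponding to the underdamped oscillator.
omega1 = np.sqrt(omega0**2 - h**2)
a1 = 2.0 * np.exp(-h * dt) * np.cos(omega1 * dt)
a2 = -np.exp(-2.0 * h * dt)

# Simulate a random oscillation driven by discrete white noise.
rng = np.random.default_rng(0)
e = rng.standard_normal(n)
x = np.zeros(n)
for k in range(2, n):
    x[k] = a1 * x[k - 1] + a2 * x[k - 2] + e[k]

# Inverse task: estimate the Yule model factors from the realization by least squares.
Phi = np.column_stack([x[1:-1], x[:-2]])
a1_hat, a2_hat = np.linalg.lstsq(Phi, x[2:], rcond=None)[0]

# One-to-one transformation of the model factors back to the system characteristics.
h_hat = -np.log(-a2_hat) / (2.0 * dt)
omega1_hat = np.arccos(a1_hat / (2.0 * np.sqrt(-a2_hat))) / dt
omega0_hat = np.sqrt(omega1_hat**2 + h_hat**2)
delta_hat = h_hat * (2.0 * np.pi / omega0_hat)     # delta = h * T0

print(f"natural frequency estimate: {omega0_hat / (2.0 * np.pi):.3f} Hz (true {f0})")
print(f"log decrement estimate:     {delta_hat:.4f} (true {delta})")
```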
The research presented in this book shows how combining deep neural networks with a special class of fuzzy logical rules and multi-criteria decision tools can make deep neural networks more interpretable - and even, in many cases, more efficient. Fuzzy logic together with multi-criteria decision-making provides a very powerful means of modeling human thinking. Based on their common theoretical basis, we propose a consistent framework for modeling human thinking using the tools of all three fields - fuzzy logic, multi-criteria decision-making, and deep learning - to help reduce the black-box nature of neural models, a challenge that is of vital importance to the whole research community.
RDF-based knowledge graphs require additional formalisms to be fully context-aware, and these formalisms are presented in this book. The book also provides a collection of provenance techniques and state-of-the-art metadata-enhanced, provenance-aware, knowledge graph-based representations across multiple application domains, in order to demonstrate how to combine graph-based data models and provenance representations. This is important for making statements authoritative, verifiable, and reproducible, such as in biomedical, pharmaceutical, and cybersecurity applications, where the data source and generator can be just as important as the data itself. Capturing provenance is critical to ensure sound experimental results and rigorously designed research studies for patient and drug safety, pathology reports, and medical evidence generation. Similarly, provenance is needed for cyberthreat intelligence dashboards and attack maps that aggregate and/or fuse heterogeneous data from disparate data sources to differentiate between unimportant online events and dangerous cyberattacks, which is demonstrated in this book. Without provenance, data reliability and trustworthiness might be limited, causing issues with data reuse, trust, reproducibility, and accountability. This book primarily targets researchers who utilize knowledge graphs in their methods and approaches (this includes researchers from a variety of domains, such as cybersecurity, eHealth, data science, Semantic Web, etc.). It collects core facts about the state of the art in provenance approaches and techniques, complemented by a critical review of existing approaches, and provides new research directions that combine data science and knowledge graphs, for an increasingly important research topic.
This book presents the latest advances in computational intelligence and data analytics for sustainable future smart cities. It focuses on computational intelligence and data analytics to bring together the smart city and sustainable city endeavors. It also discusses new models, practical solutions and technological advances related to the development and the transformation of cities through machine intelligence and big data models and techniques. This book is helpful for students and researchers as well as practitioners.
This book is useful to engineers, researchers, entrepreneurs, and students in different branches of production, engineering, and systems sciences. The polytopic roadmaps are guidelines inspired by the development stages of cognitive-intelligent systems and are expected to become powerful instruments releasing an abundance of new capabilities and structures for the implementation of complex engineering systems. The 4D approach developed in previous monographs and correlated with Industry 4.0 and the Fourth Industrial Revolution is continued here toward higher-dimensional approaches correlated with polytopic operations, equipment, technologies, industries, and societies. The methodology emphasizes the role of doubling, iteration, dimensionality, and cyclicality around the center, of periodic tables, and of conservative and exploratory strategies. Partitions, permutations, classifications, and complexification, as polytopic chemistry, are the elementary operations analyzed. Multi-scale transfer, cyclic operations, conveyors, and assembly lines are the practical examples of operations and equipment. Polytopic flow sheets, online analytical processing, polytopic engineering designs, and reality-inspired engineering are presented. Innovative concepts such as Industry 5.0, polytopic industry, Society 5.0, polytopic society, cyber-physical social systems, the industrial Internet, and digital twins are discussed. The general polytopic roadmaps (GPTR) are proposed as universal guidelines and as common methodologies to synthesize the systemic thinking and capabilities needed to implement projects of growing complexity.
The subject of the book is the modeling, parameter estimation, and other aspects of the identification of nonlinear dynamic systems. The treatment is restricted to the input-output modeling approach. Because of the widespread usage of digital computers, discrete-time methods are preferred. Time-domain parameter estimation methods are dealt with in detail; frequency-domain and power spectrum procedures are described briefly. The theory is presented from the engineering point of view, and a large number of examples and case studies on the modeling and identification of real processes illustrate the methods. Almost all processes are nonlinear if they are considered not merely in a small vicinity of the working point. To exploit industrial equipment as much as possible, mathematical models are needed which describe the global nonlinear behavior of the process. If the process is unknown, or if the describing equations are too complex, the structure and the parameters can be determined experimentally, which is the task of identification. The book is divided into seven chapters dealing with the following topics: 1. Nonlinear dynamic process models; 2. Test signals for identification; 3. Parameter estimation methods; 4. Nonlinearity test methods; 5. Structure identification; 6. Model validity tests; 7. Case studies on identification of real processes. Chapter 1 summarizes the different model descriptions of nonlinear dynamical systems.
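As a toy illustration of input-output parameter estimation (not an example taken from the book), the following Python sketch simulates a simple nonlinear (Hammerstein-type) discrete-time process and recovers its parameters by least squares; the model structure, the test signal, and all numeric values are assumptions made for the example.

```python
import numpy as np

# Simulate input-output data from an assumed nonlinear (Hammerstein-type) process.
rng = np.random.default_rng(1)
N = 500
u = rng.uniform(-1.0, 1.0, N)                 # test signal: uniformly distributed white noise
a, b1, b2 = 0.8, 1.0, 0.5                     # "true" parameters (assumed for the example)
y = np.zeros(N)
for k in range(1, N):
    y[k] = a * y[k - 1] + b1 * u[k - 1] + b2 * u[k - 1]**2 + 0.01 * rng.standard_normal()

# The model y(k) = a*y(k-1) + b1*u(k-1) + b2*u(k-1)^2 is linear in its parameters,
# so they can be estimated by ordinary least squares on the stacked regressors.
Phi = np.column_stack([y[:-1], u[:-1], u[:-1]**2])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print("estimated [a, b1, b2]:", np.round(theta, 3))
```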
This book focuses on distributed and economic Model Predictive Control (MPC) with applications in different fields. MPC is one of the most successful advanced control methodologies due to the simplicity of the basic idea (measure the current state, predict and optimize the future behavior of the plant to determine an input signal, and repeat this procedure ad infinitum) and its capability to deal with constrained nonlinear multi-input multi-output systems. While the basic idea is simple, the rigorous analysis of the MPC closed loop can be quite involved. Here, distributed means that either the computation is distributed to meet real-time requirements for (very) large-scale systems or that distributed agents act autonomously while being coupled via the constraints and/or the control objective. In the latter case, communication is necessary to maintain feasibility or to recover system-wide optimal performance. The term economic refers to general control tasks and, thus, goes beyond the typically predominant control objective of set-point stabilization. Here, recently developed concepts like (strict) dissipativity of optimal control problems or turnpike properties play a crucial role. The book collects research and survey articles on recent ideas and it provides perspectives on current trends in nonlinear model predictive control. Indeed, the book is the outcome of a series of six workshops funded by the German Research Foundation (DFG) involving early-stage career scientists from different countries and from leading European industry stakeholders.
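To make the receding-horizon idea concrete, here is a minimal, unconstrained Python sketch of the loop described above (measure the current state, optimize the predicted behavior over a finite horizon, apply the first input, repeat); the plant, horizon, and weights are illustrative assumptions, and real distributed or economic MPC adds constraints, couplings among agents, and more general objectives.

```python
import numpy as np

# Hypothetical discrete-time plant (a double integrator), horizon and weights (all assumed).
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.005],
              [0.1]])
N = 20                                   # prediction horizon
Q = np.diag([1.0, 0.1])                  # state weight
R = np.array([[0.01]])                   # input weight

def mpc_input(x0):
    """Stack the predictions X = F x0 + G U and minimize the quadratic cost over U."""
    n, m = A.shape[0], B.shape[1]
    F = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(N)])
    G = np.zeros((N * n, N * m))
    for k in range(N):
        for j in range(k + 1):
            G[k * n:(k + 1) * n, j * m:(j + 1) * m] = np.linalg.matrix_power(A, k - j) @ B
    Qbar = np.kron(np.eye(N), Q)
    Rbar = np.kron(np.eye(N), R)
    H = G.T @ Qbar @ G + Rbar
    g = G.T @ Qbar @ (F @ x0)
    U = np.linalg.solve(H, -g)           # unconstrained minimizer of the finite-horizon cost
    return U[:m]                         # receding horizon: apply only the first input

# Closed loop: measure the state, optimize, apply the first input, and repeat.
x = np.array([1.0, 0.0])
for t in range(50):
    u = mpc_input(x)
    x = A @ x + B @ u
print("state after 50 steps:", np.round(x, 4))
```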
The aim of this book is to furnish the reader with a rigorous and detailed exposition of the concepts of control parametrization and time-scaling transformation. It presents computational solution techniques for a special class of constrained optimal control problems as well as applications to some practical examples. The book may be considered an extension of the 1991 monograph A Unified Computational Approach to Optimal Control Problems by K.L. Teo, C.J. Goh, and K.H. Wong. This publication discusses the development of new theory and computational methods for solving various optimal control problems numerically and in a unified fashion. To keep the book accessible and uniform, it includes those results developed by the authors, their students, and their past and present collaborators. A brief review of methods that are not covered in this exposition is also included. Knowledge gained from this book may inspire the advancement of new techniques to solve complex problems that arise in the future. This book is intended as a reference for researchers in mathematics, engineering, and other sciences, and for graduate students and practitioners who apply optimal control methods in their work. It may be appropriate reading material for a graduate-level seminar or as a text for a course in optimal control.
This book includes original, peer-reviewed research articles from the 2nd International Conference on Cybernetics, Cognition and Machine Learning Applications (ICCCMLA 2020), held in August 2020 in Goa, India. It covers the latest research trends and developments in the areas of data science, artificial intelligence, neural networks, cognitive science and machine learning applications, cyber-physical systems, and cybernetics.
This book is a collection of selected papers presented at the International Conference on Mathematical Analysis and Computing (ICMAC 2019) held at Sri Sivasubramaniya Nadar College of Engineering, Chennai, India, from 23-24 December 2019. Having found its applications in game theory, economics, and operations research, mathematical analysis plays an important role in analyzing models of physical systems and provides a sound logical base for problems stated in a qualitative manner. This book aims at disseminating recent advances in areas of mathematical analysis, soft computing, approximation and optimization through original research articles and expository survey papers. This book will be of value to research scholars, professors, and industrialists working in these areas.
'It can be used as a supplementary material for teaching thermodynamics and statistical physics at an undergraduate or postgraduate level and can be a great read for undergraduate and postgraduate students of Sciences and Engineering.' - Contemporary Physics. In this unique book, the reader is invited to experience the joy of appreciating something which has eluded understanding for many years - entropy and the Second Law of Thermodynamics. The book has a two-pronged message: first, that the Second Law is not infinitely incomprehensible as commonly stated in most textbooks on thermodynamics, but can, in fact, be comprehended through sheer common sense; and second, that entropy is not a mysterious quantity that has resisted understanding but a simple, familiar and easily comprehensible concept. Written in an accessible style, the book guides the reader through an abundance of dice games and examples from everyday life. The author paves the way for readers to discover for themselves what entropy is, how it changes, and, most importantly, why it always changes in one direction in a spontaneous process. In this new edition, seven simulated games are included so that the reader can actually experiment with the games described in the book. These simulated games are meant to enhance the readers' understanding and sense of joy upon discovering the Second Law of Thermodynamics. All errors in the previous edition have been corrected, and a whole new section (7.7) has been added in which the meaning of entropy is explained in simple language.
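In the spirit of the dice games described above (an illustration, not one of the book's seven simulated games), a short Python sketch shows why the change goes in one direction: starting from an ordered configuration and re-rolling one randomly chosen die at a time, the Shannon entropy of the face distribution drifts toward its maximum.

```python
import numpy as np

# Start from a perfectly "ordered" configuration and re-roll one random die per step;
# the entropy of the face distribution drifts toward its maximum, ln(6) ≈ 1.79.
rng = np.random.default_rng(0)
n_dice, n_steps = 1000, 20000
faces = np.ones(n_dice, dtype=int)               # all dice initially show face 1

def face_entropy(faces):
    counts = np.bincount(faces, minlength=7)[1:]
    p = counts[counts > 0] / faces.size
    return float(-(p * np.log(p)).sum())

history = []
for step in range(1, n_steps + 1):
    faces[rng.integers(n_dice)] = rng.integers(1, 7)   # re-roll a single die
    if step % 5000 == 0:
        history.append(round(face_entropy(faces), 3))
print("entropy after every 5000 re-rolls:", history)
```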
How to build and maintain strong data organizations--the Dummies way Data Governance For Dummies offers an accessible first step for decision makers into understanding how data governance works and how to apply it to an organization in a way that improves results and doesn't disrupt. Prep your organization to handle the data explosion (if you know, you know) and learn how to manage this valuable asset. Take full control of your organization's data with all the info and how-tos you need. This book walks you through making accurate data readily available and maintaining it in a secure environment. It serves as your step-by-step guide to extracting every ounce of value from your data. Identify the impact and value of data in your business Design governance programs that fit your organization Discover and adopt tools that measure performance and need Address data needs and build a more data-centric business culture This is the perfect handbook for professionals in the world of data analysis and business intelligence, plus the people who interact with data on a daily basis. And, as always, Dummies explains things in terms anyone can understand, making it easy to learn everything you need to know.
The past few years have seen growing attention to and rapid developments in event-triggered sampled-data systems, in which the effect of event-triggered sensor measurements and controller updates is explored in controller analysis and design. This book offers the first systematic treatment of event-triggered sampled-data control system design using active disturbance rejection control (ADRC), an effective approach that is popular in both theoretical research and industrial applications. Extensive application examples with numerous illustrations are included to show how event-triggered ADRC with theoretical performance guarantees can be implemented in engineering systems and how the performance can actually be achieved. For theoretical researchers and graduate students, the presented results provide new directions in research on event-triggered sampled-data systems; for control practitioners, the book offers an effective approach to achieving satisfactory performance with limited sampling rates.
This book constitutes the refereed post-conference proceedings of the 4th International Conference on Intelligence Science, ICIS 2020, held in Durgapur, India, in February 2021 (originally November 2020). The 23 full papers and 4 short papers presented were carefully reviewed and selected from 42 submissions. One extended abstract is also included. They deal with key issues in brain cognition; uncertain theory; machine learning; data intelligence; language cognition; vision cognition; perceptual intelligence; intelligent robot; and medical artificial intelligence.
This book covers some selected problems of the descriptor integer and fractional order positive continuous-time and discrete-time systems. The book consists of 3 chapters, 4 appendices and the list of references. Chapter 1 is devoted to descriptor integer order continuous-time and discrete-time linear systems. In Chapter 2, descriptor fractional order continuous-time and discrete-time linear systems are considered. Chapter 3 is devoted to the stability of descriptor continuous-time and discrete-time systems of integer and fractional orders. In Appendix A, extensions of the Cayley-Hamilton theorem for descriptor linear systems are given. Some methods for computation of the Drazin inverse are presented in Appendix B. In Appendix C, some basic definitions and theorems on Laplace transforms and Z-transforms are given. Some properties of the nilpotent matrices are given in Appendix D.
This book highlights the prevention of possible accidents and crashes of aircraft by analyzing the many factors that affect such events. It includes the theoretical study of known ideas and concepts, as well as a set of new methods and mathematical models. It contains factual information used to investigate famous disasters and aviation accidents. The book proposes methods and models that can form the basis of guidance material for decision-making by the flight crew and experts in air traffic control. Some of the contents presented in this book are also useful in the design and operation of data transmission systems of aircraft. The book is intended for engineering and technical specialists engaged in the development, manufacturing, and operation of onboard radio electronic systems of aircraft and ground-based radio engineering support for flights, as well as graduate students and senior students of radio engineering specialties. It is also useful to researchers and managers whose activities are related to air traffic control.
This book provides a novel approach to the diagnosis of complex technical systems that are widely used in various industries: transportation, energy, metallurgy, metalworking, fuels, mining, chemicals, paper, etc. Effective diagnostic systems are necessary for the early detection of errors in mechatronic systems, for the organization of maintenance, and for the assessment of the performed service quality. Unfortunately, the practical use of AI in the diagnosis of mechatronic systems is still quite limited, and the inability to build effective diagnostic systems leads to significant economic losses and dangers. The main aim of this book is to contribute to knowledge within the topic of diagnostics of mechatronic systems by analyzing the reliability characteristics of the elements, by using methods, models and algorithms for diagnostics, and by studying examples of model diagnostic systems using AI methods based on neural networks, fuzzy inference systems and genetic algorithms.